WorldWideScience

Sample records for assessment ioa analysis

  1. Independent Orbiter Assessment (IOA): Assessment of the main propulsion subsystem FMEA/CIL, volume 4

    Science.gov (United States)

    Slaughter, B. C.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Main Propulsion System (MPS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to available data from the Rockwell Downey/NASA JSC FMEA/CIL review. Volume 4 contains the IOA analysis worksheets and the NASA FMEA to IOA worksheet cross reference and recommendations.

  2. Independent Orbiter Assessment (IOA): Assessment of the main propulsion subsystem FMEA/CIL, volume 2

    Science.gov (United States)

    Holden, K. A.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Main Propulsion System (MPS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to available data from the Rockwell Downey/NASA JSC FMEA/CIL review. Volume 2 continues the presentation of IOA worksheets for MPS hardware items.

  3. Independent Orbiter Assessment (IOA): Assessment of the main propulsion subsystem FMEA/CIL, volume 3

    Science.gov (United States)

    Holden, K. A.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Main Propulsion System (MPS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to available data from the Rockwell Downey/NASA JSC FMEA/CIL review. Volume 3 continues the presentation of IOA worksheets and includes the potential critical items list.

  4. Independent Orbiter Assessment (IOA): Assessment of the backup flight system FMEA/CIL

    Science.gov (United States)

    Prust, E. E.; Ewell, J. J., Jr.; Hinsdale, L. W.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Backup Flight System (BFS) hardware, generating draft failure modes and Potential Critical Items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the proposed NASA Post 51-L FMEA/CIL baseline. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter BFS hardware. The IOA product for the BFS analysis consisted of 29 failure mode worksheets that resulted in 21 Potential Critical Items (PCI) being identified. This product was originally compared with the proposed NASA BFS baseline and subsequently compared with the applicable Data Processing System (DPS), Electrical Power Distribution and Control (EPD and C), and Displays and Controls NASA CIL items. The comparisons determined if there were any results which had been found by the IOA but were not in the NASA baseline. The original assessment determined there were numerous failure modes and potential critical items in the IOA analysis that were not contained in the NASA BFS baseline. Conversely, the NASA baseline contained three FMEAs (IMU, ADTA, and Air Data Probe) for CIL items that were not identified in the IOA product.

  5. Independent Orbiter Assessment (IOA): Assessment of the data processing system FMEA/CIL

    Science.gov (United States)

    Lowery, H. J.; Haufler, W. A.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Data Processing System (DPS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison are documented for the Orbiter DPS hardware.

  6. Independent Orbiter Assessment (IOA): FMEA/CIL assessment

    Science.gov (United States)

    Hinsdale, L. W.; Swain, L. J.; Barnes, J. E.

    1988-01-01

    The McDonnell Douglas Astronautics Company (MDAC) was selected to perform an Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL). Direction was given by the Orbiter and GFE Projects Office to perform the hardware analysis and assessment using the instructions and ground rules defined in NSTS 22206. The IOA analysis featured a top-down approach to determine hardware failure modes, criticality, and potential critical items. To preserve independence, the analysis was accomplished without reliance upon the results contained within the NASA and Prime Contractor FMEA/CIL documentation. The assessment process compared the independently derived failure modes and criticality assignments to the proposed NASA post 51-L FMEA/CIL documentation. When possible, assessment issues were discussed and resolved with the NASA subsystem managers. Unresolved issues were elevated to the Orbiter and GFE Projects Office manager, Configuration Control Board (CCB), or Program Requirements Control Board (PRCB) for further resolution. The most important Orbiter assessment finding was the previously unknown stuck autopilot push-button criticality 1/1 failure mode. The worst case effect could cause loss of crew/vehicle when the microwave landing system is not active. The NASA and Prime Contractor Post 51-L FMEA/CIL documentation assessed by the IOA is considered technically accurate and complete. All CIL issues were resolved. No FMEA issues remain that have safety implications. Consideration should be given, however, to upgrading NSTS 22206 with definitive ground rules which more clearly spell out the limits of redundancy.

  7. Independent Orbiter Assessment (IOA): Assessment of the orbital maneuvering system FMEA/CIL, volume 1

    Science.gov (United States)

    Prust, Chet D.; Haufler, W. A.; Marino, A. J.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Orbital Maneuvering System (OMS) hardware and Electrical Power Distribution and Control (EPD and C), generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the proposed Post 51-L NASA FMEA/CIL baseline. This report documents the results of that comparison for the Orbiter OMS hardware. The IOA analysis defined the OMS as being comprised of the following subsystems: helium pressurization, propellant storage and distribution, Orbital Maneuvering Engine, and EPD and C. The IOA product for the OMS analysis consisted of 284 hardware and 667 EPD and C failure mode worksheets that resulted in 160 hardware and 216 EPD and C potential critical items (PCIs) being identified. A comparison was made of the IOA product to the NASA FMEA/CIL baseline which consisted of 101 hardware and 142 EPD and C CIL items.

  8. Independent Orbiter Assessment (IOA): Assessment of the atmospheric revitalization pressure control subsystem FMEA/CIL

    Science.gov (United States)

    Saiidi, M. J.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Atmospheric Revitalization Pressure Control Subsystem (ARPCS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL proposed Post 51-L updates based upon the CCB/PRCB presentations and an informal criticality summary listing. A discussion of each discrepancy from the comparison is provided through additional analysis as required. These discrepancies were flagged as issues, and recommendations were made based on the FMEA data available at the time. This report documents the results of that comparison for the Orbiter ARPCS hardware.

  9. Independent Orbiter Assessment (IOA): Assessment of the orbiter main propulsion system FMEA/CIL, volume 1

    Science.gov (United States)

    Slaughter, B. C.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Main Propulsion System (MPS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to available data from the Rockwell Downey/NASA JSC FMEA/CIL review. The Orbiter MPS is composed of the Propellant Management Subsystem (PMS) consisting of the liquid oxygen (LO2) and liquid hydrogen (LH2) subsystems and the helium subsystem. The PMS is a system of manifolds, distribution lines, and valves by which the liquid propellants pass from the External Tank to the Space Shuttle Main Engine (SSME). The helium subsystem consists of a series of helium supply tanks and their associated regulators, control valves, and distribution lines. Volume 1 contains the MPS description, assessment results, ground rules and assumptions, and some of the IOA worksheets.

  10. Independent Orbiter Assessment (IOA): Assessment of the guidance, navigation, and control subsystem FMEA/CIL

    Science.gov (United States)

    Trahan, W. H.; Odonnell, R. A.; Pietz, K. C.; Drapela, L. J.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Guidance, Navigation, and Control System (GNC) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison for the Orbiter GNC hardware are documented. The IOA product for the GNC analysis consisted of 141 failure mode worksheets that resulted in 24 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 148 FMEAs and 36 CIL items. This comparison produced agreement on all but 56 FMEAs, which caused no differences in CIL items.
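
    The comparison tallies quoted in these assessments (worksheets, potential critical items, baseline FMEAs and CIL items, and the count of disagreements) can be pictured as simple set arithmetic over failure-mode identifiers. The sketch below is purely hypothetical: the identifiers, counts and data layout are invented for illustration and do not come from the actual IOA worksheets.

```python
# Hypothetical sketch of tallying an IOA-vs-NASA FMEA/CIL comparison by failure-mode ID.
# All identifiers and sets below are invented for illustration.

ioa_worksheets = {"GNC-101", "GNC-102", "GNC-205", "GNC-301"}   # IOA failure mode worksheets
nasa_baseline  = {"GNC-101", "GNC-205", "GNC-310", "GNC-412"}   # NASA FMEA baseline entries
ioa_pcis       = {"GNC-102"}                                    # IOA potential critical items
nasa_cil       = {"GNC-412"}                                    # NASA CIL items

agreements = ioa_worksheets & nasa_baseline   # failure modes derived by both analyses
ioa_only   = ioa_worksheets - nasa_baseline   # candidate issues raised by the IOA
nasa_only  = nasa_baseline - ioa_worksheets   # baseline entries the IOA did not derive
cil_issues = ioa_pcis - nasa_cil              # PCIs without a corresponding CIL entry

print(f"agreement on {len(agreements)} FMEAs, "
      f"{len(ioa_only) + len(nasa_only)} FMEA discrepancies, "
      f"{len(cil_issues)} potential CIL issues")
```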

  11. Independent Orbiter Assessment (IOA): Assessment of the body flap subsystem FMEA/CIL

    Science.gov (United States)

    Wilson, R. E.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Body Flap (BF) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter BF hardware. The IOA product for the BF analysis consisted of 43 failure mode worksheets that resulted in 19 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 34 FMEAs and 15 CIL items. This comparison produced agreement on all CIL items. Based on the Pre 51-L baseline, all non-CIL FMEAs were also in agreement.

  12. Independent Orbiter Assessment (IOA): Assessment of instrumental subsystem FMEA/CIL

    Science.gov (United States)

    Gardner, J. R.; Addis, A. W.

    1988-01-01

    The McDonnell Douglas Astronautics Company (MDAC) was selected in June 1986 to perform an Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL). The IOA effort first completed an analysis of the Instrumentation hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison for the Orbiter Instrumentation hardware are documented. The IOA product for Instrumentation analysis consisted of 107 failure mode worksheets that resulted in 22 critical items being identified. Comparison was made to the Pre 51-L NASA baseline with 14 Post 51-L FMEAs added, which consisted of 96 FMEAs and 18 CIL items. This comparison produced agreement on all but 25 FMEAs which caused differences in 5 CIL items.

  13. Independent Orbiter Assessment (IOA): Assessment of the remote manipulator system FMEA/CIL

    Science.gov (United States)

    Tangorra, F.; Grasmeder, R. F.; Montgomery, A. D.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Remote Manipulator System (RMS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison for the Orbiter RMS hardware are documented. The IOA product for the RMS analysis consisted of 604 failure mode worksheets that resulted in 458 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 45 FMEAs and 321 CIL items. This comparison produced agreement on all but 154 FMEAs which caused differences in 137 CIL items.

  14. Independent Orbiter Assessment (IOA): Assessment of the elevon actuator subsystem FMEA/CIL

    Science.gov (United States)

    Wilson, R. E.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Elevon Subsystem hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter Elevon hardware. The IOA product for the Elevon analysis consisted of 25 failure mode worksheets that resulted in 17 potential critical items being identified. Comparison was made to the NASA FMEA/CIL, which consisted of 23 FMEAs and 13 CIL items. This comparison produced agreement on all CIL items. Based on the Pre 51-L baseline, all non-CIL FMEAs were also in agreement.

  15. Independent Orbiter Assessment (IOA): Assessment of the ascent thrust vector control actuator subsystem FMEA/CIL

    Science.gov (United States)

    Wilson, R. E.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Ascent Thrust Vector Control Actuator (ATVC) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter ATVC hardware. The IOA product for the ATVC actuator analysis consisted of 25 failure mode worksheets that resulted in 16 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 21 FMEAs and 13 CIL items. This comparison produced agreement on all CIL items. Based on the Pre 51-L baseline, all non-CIL FMEAs were also in agreement.

  16. Independent Orbiter Assessment (IOA): Assessment of the rudder/speed brake subsystem FMEA/CIL

    Science.gov (United States)

    Wilson, R. E.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Rudder/Speed Brake (RSB) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with the proposed Post 51-L CIL updates included. A resolution of each discrepancy from the comparison was provided through additional analysis as required. This report documents the results of that comparison for the Orbiter RSB hardware. The IOA product for the RSB analysis consisted of 38 failure mode worksheets that resulted in 27 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 34 FMEAs and 18 CIL items. This comparison produced agreement on all CIL items. Based on the Pre 51-L baseline, all non-CIL FMEAs were also in agreement.

  17. Independent Orbiter Assessment (IOA): Assessment of the electrical power generation/power reactant storage and distribution subsystem FMEA/CIL

    Science.gov (United States)

    Ames, B. E.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Electrical Power Generation/Power Reactant Storage and Distribution (EPG/PRSD) subsystem hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baselines with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison are documented for the Orbiter EPG/PRSD hardware. The comparison produced agreement on all but 27 FMEAs and 9 CIL items. The discrepancy between the number of IOA findings and NASA FMEAs can be partially explained by the different approaches used by IOA and NASA to group failure modes together to form one FMEA. Also, several IOA items represented inner tank components and ground operations failure modes which were not in the NASA baseline.

  18. Independent Orbiter Assessment (IOA): Assessment of the EPD and C/remote manipulator system FMEA/CIL

    Science.gov (United States)

    Robinson, W. W.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Electrical Power Distribution and Control (EPD and C)/Remote Manipulator System (RMS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA analysis of the EPD and C/RMS hardware initially generated 345 failure mode worksheets and identified 117 Potential Critical Items (PCIs) before starting the assessment process. These analysis results were compared to the proposed NASA Post 51-L baseline of 132 FMEAs and 66 CIL items.

  19. Independent Orbiter Assessment (IOA): Assessment of the landing/deceleration (LDG/DEC) subsystem FMEA/CIL

    Science.gov (United States)

    Odonnell, R. A.; Weissinger, D.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Landing/Deceleration (LDG/DEC) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter LDG/DEC hardware. The IOA product for the LDG/DEC analysis consisted of 259 failure mode worksheets that resulted in 124 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 267 FMEAs and 120 CIL items. This comparison produced agreement on all but 75 FMEAs which caused differences in 51 CIL items.

  20. Independent Orbiter Assessment (IOA): Assessment of the electrical power generation/fuel cell powerplant subsystem FMEA/CIL

    Science.gov (United States)

    Brown, K. L.; Bertsch, P. J.

    1987-01-01

    Results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Electrical Power Generation/Fuel Cell Powerplant (EPG/FCP) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the proposed Post 51-L NASA FMEA/CIL baseline. A resolution of each discrepancy from the comparison was provided through additional analysis as required. This report documents the results of that comparison for the Orbiter EPG/FCP hardware.

  1. Independent Orbiter Assessment (IOA): Assessment of the Electrical Power Distribution and Control/Electrical Power Generation (EPD and C/EPG) FMEA/CIL

    Science.gov (United States)

    Mccants, C. N.; Bearrow, M.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Electrical Power Distribution and Control/Electrical Power Generation (EPD and C/EPG) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison was provided through additional analysis as required. The results of that comparison are documented for the Orbiter EPD and C/EPG hardware. The IOA product for the EPD and C/EPG analysis consisted of 263 failure mode worksheets that resulted in 42 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 211 FMEAs and 47 CIL items.

  2. Independent Orbiter Assessment (IOA): FMEA/CIL instructions and ground rules

    Science.gov (United States)

    Traves, S. T.

    1986-01-01

    The McDonnell Douglas Astronautics Company was selected to conduct an independent assessment of the Orbiter Failure Mode and Effects Analysis/Critical Items List (FMEA/CIL). Part of this effort involved an examination of the FMEA/CIL preparation instructions and ground rules. Assessment objectives were to identify omissions and ambiguities in the ground rules that may impede the identification of shuttle orbiter safety and mission critical items, and to ensure that ground rules allow these items to receive proper management visibility for risk assessment. Assessment objectives were followed during the performance of the assessment without being influenced by external considerations such as effects on budget, schedule, and documentation growth. Assessment personnel were employed who had a strong reliability background but no previous space shuttle FMEA/CIL experience to ensure an independent assessment would be achieved. The following observations were made: (1) not all essential items are in the CIL for management visibility; (2) ground rules omit FMEA/CIL coverage of items that perform critical functions; (3) essential items excluded from the CIL do not receive design justification; and (4) FMEAs/CILs are not updated in a timely manner. In addition to the above issues, a number of other issues were identified that correct FMEA/CIL preparation instruction omissions and clarify ambiguities. The assessment was successful in that many of the issues have significant safety implications.

  3. Sex Determination Using the Inion-Opistocranium-Asterion (IOA) Triangle in Nigerians’ Skulls

    Directory of Open Access Journals (Sweden)

    C. N. Orish

    2014-01-01

    Background. Determination of sex is an important concern to forensic anthropologists, as it is critical for individual identification. This study investigated the existence of sexual dimorphism in the dimensions and the area of the IOA triangle. Methods. A total of 100 adult dry skulls (78 males; 22 females) from departments of anatomy in Nigerian universities were used for this study. An automatic digital calliper was used for the measurements. Coefficient of variation, correlation, linear regression, percentiles, and sexual dimorphism ratio were computed from the IOA triangle measurements. The IOA triangle area was compared between sexes. Results. The male parameters were significantly (P<0.05) higher than the female parameters. The left opistocranium-asterion length was 71.09±0.56 and 61.68±3.35 mm and the right opistocranium-asterion length was 69.73±0.49 and 60.92±2.10 mm for males and females, respectively. Total IOA triangle areas of 1938.88 mm2 and 1305.68 mm2 were calculated for males and females, respectively. The left IOA indices were 46.42% and 37.40% in males and females, respectively, while the right IOA indices for males and females were 47.19% and 38.87%, respectively. Conclusion. The anthropometry of the inion-opistocranium-asterion (IOA) triangle can be a guide in gender determination of unknown individuals.
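
    One plausible way to reproduce a triangle area from its three measured side lengths is Heron's formula; the paper does not state how its areas were computed, and the side lengths below are illustrative values rather than the study's raw measurements.

```python
import math

def heron_area(a: float, b: float, c: float) -> float:
    """Area of a triangle from its three side lengths (Heron's formula)."""
    s = (a + b + c) / 2.0  # semi-perimeter
    return math.sqrt(s * (s - a) * (s - b) * (s - c))

# Illustrative side lengths in mm (inion-opistocranium, opistocranium-asterion,
# inion-asterion); these are not the study's measurements.
area_mm2 = heron_area(65.0, 71.1, 60.0)
print(f"IOA triangle area = {area_mm2:.1f} mm^2")
```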

  4. Water resources and environmental input-output analysis and its key study issues: a review

    Science.gov (United States)

    YANG, Z.; Xu, X.

    2013-12-01

    inland water resources IOA. Recent domestic studies related to the environmental input-output table, pollution discharge analysis and environmental impact assessment have taken the leading position. Pollution discharge analysis, mainly aimed at CO2 discharge, has been regarded as a new hotspot of environmental IOA. Environmental impact assessment has been an important direction of domestic environmental IOA in recent years. Key study issues including the Domestic Technology Assumption (DTA) and Sectoral Aggregation (SA) have been mentioned notably. It was pointed out that multi-region input-output analysis (MIOA) may be helpful in addressing the DTA. Because few studies use effective analysis tools to quantify the bias introduced by SA, and exploration of the appropriate degree of sectoral aggregation is scarce, research dedicated to exploring and solving these two key issues is urgently needed. Based on the state of the research, several points of outlook are proposed at the end.
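
    As background for the environmental IOA discussed above, sectoral emissions are typically tied to final demand through the Leontief inverse, x = (I - A)^-1 f, combined with a vector of direct emission intensities. The sketch below uses an invented three-sector economy; the coefficients, final demand and CO2 intensities are illustrative only and come from none of the cited studies.

```python
import numpy as np

# Illustrative three-sector environmental input-output sketch (all numbers invented).
A = np.array([[0.10, 0.20, 0.05],    # technical coefficients: input from sector i
              [0.15, 0.10, 0.10],    # required per unit of output of sector j
              [0.05, 0.05, 0.20]])
f = np.array([100.0, 200.0, 150.0])  # final demand by sector
e = np.array([0.8, 0.3, 1.2])        # direct CO2 emissions per unit of output

L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse (I - A)^-1
x = L @ f                            # total output required to meet final demand
sector_emissions = e * x             # emissions occurring in each sector
print("total output by sector:", x.round(1))
print("CO2 by sector:", sector_emissions.round(1), "| total:", sector_emissions.sum().round(1))
```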

  5. Feline infectious peritonitis in Romania (Peritonita infecţioasă felină în România)

    Directory of Open Access Journals (Sweden)

    HORHOGEA Cristina

    2016-06-01

    The study was conducted in Romania during 2007-2011 on 58 cats of different breeds and ages (1.7 months–13 years) with clinical signs of feline infectious peritonitis (wet form in 52 cases and dry form in 6 cases). Coronaviral RNA was identified by RT-PCR, using the p205/p211 primer pair, in 32 (61.53%) of the ascites fluid samples and in the 2 pleural fluid samples. Feline infectious peritonitis was diagnosed in 24 domestic short hair cats, 2 Russian Blue, 2 Burmese, 2 Persian, 2 Siamese and 2 Chartreux. Of the 34 tested animals, 52.94% were females and 46.06% males. Among domestic short hair cats, the category with the largest number of individuals, 75% of males and 50% of females were positive. Regarding age, 70.58% were at least 2 years old and 29.42% younger than 2 years old. This study is the first of its kind in Romania and presents some epidemiological and clinical aspects of feline infectious peritonitis in Moldavia.

  6. Analysis and assessment

    International Nuclear Information System (INIS)

    Grahn, D.

    1975-01-01

    The ultimate objective is to predict potential health costs to man accruing from the effluents or by-products of any energy system or mix of systems, but the establishment of reliable prediction equations first requires a baseline analysis of those preexisting and essentially uncontrolled factors known to have significant influence on patterns of mortality. These factors are the cultural, social, economic, and demographic traits of a defined local or regional population. Thus, the immediate objective is the rigorous statistical definition of consistent relationships that may exist among the above traits and between them and selected causes of death, especially those causes that may have interpretive value for the detection of environmental pollutants

  7. Urban energy consumption: Different insights from energy flow analysis, input–output analysis and ecological network analysis

    International Nuclear Information System (INIS)

    Chen, Shaoqing; Chen, Bin

    2015-01-01

    Highlights: • Urban energy consumption was assessed from three different perspectives. • A new concept called controlled energy was developed from network analysis. • Embodied energy and controlled energy consumption of Beijing were compared. • The integration of all three perspectives will elucidate sustainable energy use.

    Abstract: Energy consumption has always been a central issue for sustainable urban assessment and planning. Different forms of energy analysis can provide various insights for energy policy making. This paper brought together three approaches for energy consumption accounting, i.e., energy flow analysis (EFA), input–output analysis (IOA) and ecological network analysis (ENA), and compared their different perspectives and the policy implications for urban energy use. Beijing was used to exemplify the different energy analysis processes, and the 42 economic sectors of the city were aggregated into seven components. It was determined that EFA quantifies both the primary and final energy consumption of the urban components by tracking the different types of fuel used by the urban economy. IOA accounts for the embodied energy consumption (direct and indirect) used to produce goods and services in the city, whereas the control analysis of ENA quantifies the specific embodied energy that is regulated by the activities within the city’s boundary. The network control analysis can also be applied to determining which economic sectors drive the energy consumption and to what extent these sectors are dependent on each other for energy. So-called “controlled energy” is a new concept that adds to the analysis of urban energy consumption, indicating the adjustable energy consumed by sectors. The integration of insights from all three accounting perspectives furthers our understanding of sustainable energy use in cities.
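
    The embodied (direct plus indirect) energy accounting attributed to IOA above can be illustrated with energy intensities propagated through the Leontief inverse. The sketch below assumes a toy three-sector economy; the coefficients, fuel intensities and final demand are invented and are not the Beijing data.

```python
import numpy as np

# Direct vs. embodied (direct + indirect) energy intensities in a toy economy.
A = np.array([[0.05, 0.25, 0.10],
              [0.20, 0.05, 0.15],
              [0.10, 0.10, 0.05]])            # inter-sector technical coefficients
direct_intensity = np.array([2.0, 0.5, 5.0])  # MJ of fuel burned per unit of output

# Total (embodied) intensity: energy required along the whole supply chain.
embodied_intensity = direct_intensity @ np.linalg.inv(np.eye(3) - A)

final_demand = np.array([50.0, 80.0, 30.0])
print("embodied intensities (MJ/unit):", embodied_intensity.round(2))
print("energy embodied in final demand (MJ):",
      (embodied_intensity * final_demand).sum().round(1))
```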

  8. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydro geology, meteorology, geochemistry, etc. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for treatment of uncertainties in the safety assessment modeling due to the variability of data and some current approaches used to deal with this problem. (author)

  9. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

    The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis.
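
    The article names change-point analysis among the four approaches but does not prescribe an algorithm; the sketch below is an assumed, minimal illustration of single change-point detection by minimising within-segment squared error on a toy series.

```python
# Minimal single change-point sketch: pick the split that minimises the summed
# within-segment squared error (an assumed method; the article does not specify one).
def best_change_point(series):
    def sse(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs)
    costs = {k: sse(series[:k]) + sse(series[k:]) for k in range(1, len(series))}
    return min(costs, key=costs.get)

data = [2.1, 1.9, 2.0, 2.2, 5.1, 4.8, 5.0, 5.2]   # toy series with a shift in the middle
print("change point at index:", best_change_point(data))   # -> 4
```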

  10. Assessing the Utility of a Demand Assessment for Functional Analysis

    Science.gov (United States)

    Roscoe, Eileen M.; Rooker, Griffin W.; Pence, Sacha T.; Longworth, Lynlea J.

    2009-01-01

    We evaluated the utility of an assessment for identifying tasks for the functional analysis demand condition with 4 individuals who had been diagnosed with autism. During the demand assessment, a therapist presented a variety of tasks, and observers measured problem behavior and compliance to identify demands associated with low levels of…

  11. Uncertainty analysis in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydro geology, meteorology, geochemistry, etc. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for treatment of uncertainties in the safety assessment modeling due to the variability of data and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  12. Interobserver variability in target volume delineation of hepatocellular carcinoma : An analysis of the working group "Stereotactic Radiotherapy" of the German Society for Radiation Oncology (DEGRO).

    Science.gov (United States)

    Gkika, E; Tanadini-Lang, S; Kirste, S; Holzner, P A; Neeff, H P; Rischke, H C; Reese, T; Lohaus, F; Duma, M N; Dieckmann, K; Semrau, R; Stockinger, M; Imhoff, D; Kremers, N; Häfner, M F; Andratschke, N; Nestle, U; Grosu, A L; Guckenberger, M; Brunner, T B

    2017-10-01

    Definition of gross tumor volume (GTV) in hepatocellular carcinoma (HCC) requires dedicated imaging in multiple contrast medium phases. The aim of this study was to evaluate the interobserver agreement (IOA) in gross tumor delineation of HCC in a multicenter panel. The analysis was performed within the "Stereotactic Radiotherapy" working group of the German Society for Radiation Oncology (DEGRO). The GTVs of three anonymized HCC cases were delineated by 16 physicians from nine centers using multiphasic CT scans. In the first case the tumor was well defined. The second patient had multifocal HCC (one conglomerate and one peripheral tumor) and was previously treated with transarterial chemoembolization (TACE). The peripheral lesion was adjacent to the previous TACE site. The last patient had an extensive HCC with a portal vein thrombosis (PVT) and an inhomogeneous liver parenchyma due to cirrhosis. The IOA was evaluated according to Landis and Koch. The IOA for the first case was excellent (kappa: 0.85); for the second case it was moderate (kappa: 0.48) for the peripheral tumor and substantial (kappa: 0.73) for the conglomerate. In the case of the peripheral tumor the inconsistency is most likely explained by the necrotic tumor cavity after TACE caudal to the viable tumor. In the last case the IOA was fair, with a kappa of 0.34, with significant heterogeneity concerning the borders of the tumor and the PVT. The IOA was very good among the cases where the tumor was well defined. In complex cases, where the tumor did not show the typical characteristics, or in cases with Lipiodol (Guerbet, Paris, France) deposits, the IOA was compromised.
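
    The agreement figures above are kappa values read against the Landis and Koch scale. The sketch below is a simplified, assumed illustration: a pairwise Cohen's kappa on invented voxel-wise inclusion labels for two observers, with the verbal Landis-Koch interpretation; it is not the study's multi-observer delineation analysis.

```python
def cohen_kappa(labels_a, labels_b):
    """Pairwise Cohen's kappa for two raters' categorical labels."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    categories = set(labels_a) | set(labels_b)
    expected = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

def landis_koch(kappa):
    """Verbal strength-of-agreement bands after Landis and Koch (1977)."""
    if kappa < 0:
        return "poor"
    bands = [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
             (0.80, "substantial"), (1.01, "almost perfect")]
    return next(label for upper, label in bands if kappa < upper)

# Invented voxel-wise inclusion labels (1 = voxel inside the delineated GTV):
a = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
b = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
k = cohen_kappa(a, b)
print(f"kappa = {k:.2f} ({landis_koch(k)})")   # -> kappa = 0.58 (moderate)
```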

  13. Interobserver variability in target volume delineation of hepatocellular carcinoma. An analysis of the working group ''Stereotactic Radiotherapy'' of the German Society for Radiation Oncology (DEGRO)

    Energy Technology Data Exchange (ETDEWEB)

    Gkika, E.; Kirste, S. [Medical Center - University of Freiburg, Department of Radiation Oncology, Freiburg im Breisgau (Germany); Tanadini-Lang, S.; Andratschke, N.; Guckenberger, M. [University Hospital Zuerich, Department of Radiation Oncology, Zurich (Switzerland); Holzner, P.A.; Neeff, H.P. [Medical Center - University of Freiburg, Department of Visceral Surgery, Freiburg (Germany); Rischke, H.C. [Medical Center - University of Freiburg, Department of Radiation Oncology, Freiburg im Breisgau (Germany); Medical Center - University of Freiburg, Department of Nuclear Medicine, Freiburg (Germany); Reese, T. [University Hospital Halle-Wittenberg, Department of Radiation Oncology, Halle-Wittenberg (Germany); Lohaus, F. [Technische Universitaet Dresden, Department of Radiation Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Dresden (Germany); German Cancer Research Center (DKFZ), Heidelberg (Germany); Technische Universitaet Dresden, OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Dresden (Germany); National Center for Tumor Diseases (NCT), Dresden (Germany); German Cancer Consortium (DKTK), Dresden (Germany); Duma, M.N. [Helmholtz Zentrum Munich, Institute of Innovative Radiotherapy, Department of Radiation Sciences, Munich (Germany); TU Munich, Department of Radiation Oncology, Klinikum Rechts der Isar, Munich (Germany); Dieckmann, K. [Medical University Vienna, Department of Radiation Oncology, General Hospital Vienna, Vienna (Austria); Semrau, R. [University Hospital of Cologne, Department of Radiation Oncology, Cologne (Germany); Stockinger, M. [University Hospital Mainz, Department of Radiation Oncology, Mainz (Germany); Imhoff, D. [University Hospital Frankfurt, Department of Radiation Oncology, Frankfurt (Germany); Saphir Radiosurgery Center, Frankfurt (Germany); Kremers, N. [Medical Center - University of Freiburg, Department of Radiology, Freiburg (Germany); Haefner, M.F. [University Hospital Heidelberg, Department of Radiation Oncology, Heidelberg (Germany); Nestle, U.; Grosu, A.L.; Brunner, T.B. [Medical Center - University of Freiburg, Department of Radiation Oncology, Freiburg im Breisgau (Germany); University of Freiburg, Faculty of Medicine, Freiburg (Germany); German Cancer Consortium (DKTK), Freiburg (Germany); German Cancer Research Center (DKFZ), Heidelberg (Germany)

    2017-10-15

    Definition of gross tumor volume (GTV) in hepatocellular carcinoma (HCC) requires dedicated imaging in multiple contrast medium phases. The aim of this study was to evaluate the interobserver agreement (IOA) in gross tumor delineation of HCC in a multicenter panel. The analysis was performed within the "Stereotactic Radiotherapy" working group of the German Society for Radiation Oncology (DEGRO). The GTVs of three anonymized HCC cases were delineated by 16 physicians from nine centers using multiphasic CT scans. In the first case the tumor was well defined. The second patient had multifocal HCC (one conglomerate and one peripheral tumor) and was previously treated with transarterial chemoembolization (TACE). The peripheral lesion was adjacent to the previous TACE site. The last patient had an extensive HCC with a portal vein thrombosis (PVT) and an inhomogeneous liver parenchyma due to cirrhosis. The IOA was evaluated according to Landis and Koch. The IOA for the first case was excellent (kappa: 0.85); for the second case it was moderate (kappa: 0.48) for the peripheral tumor and substantial (kappa: 0.73) for the conglomerate. In the case of the peripheral tumor the inconsistency is most likely explained by the necrotic tumor cavity after TACE caudal to the viable tumor. In the last case the IOA was fair, with a kappa of 0.34, with significant heterogeneity concerning the borders of the tumor and the PVT. The IOA was very good among the cases where the tumor was well defined. In complex cases, where the tumor did not show the typical characteristics, or in cases with Lipiodol (Guerbet, Paris, France) deposits, the IOA was compromised. (orig.)

  14. Assessing Analysis and Reasoning in Bioethics

    Science.gov (United States)

    Pearce, Roger S.

    2008-01-01

    Developing critical thinking is a perceived weakness in current education. Analysis and reasoning are core skills in bioethics making bioethics a useful vehicle to address this weakness. Assessment is widely considered to be the most influential factor on learning (Brown and Glasner, 1999) and this piece describes how analysis and reasoning in…

  15. Economic impact assessment in pest risk analysis

    NARCIS (Netherlands)

    Soliman, T.A.A.; Mourits, M.C.M.; Oude Lansink, A.G.J.M.; Werf, van der W.

    2010-01-01

    According to international treaties, phytosanitary measures against introduction and spread of invasive plant pests must be justified by a science-based pest risk analysis (PRA). Part of the PRA consists of an assessment of potential economic consequences. This paper evaluates the main available

  16. Sensitivity analysis in life cycle assessment

    NARCIS (Netherlands)

    Groen, E.A.; Heijungs, R.; Bokkers, E.A.M.; Boer, de I.J.M.

    2014-01-01

    Life cycle assessments require many input parameters and many of these parameters are uncertain; therefore, a sensitivity analysis is an essential part of the final interpretation. The aim of this study is to compare seven sensitivity methods applied to three types of case studies. Two

  17. Towards the internet of agents: an analysis of the internet of things from the intelligence and autonomy perspective

    Directory of Open Access Journals (Sweden)

    Pablo Antonio Pico Valencia

    2018-01-01

    Recently, the scientific community has shown special interest in integrating agent-oriented technology with Internet of Things (IoT) platforms. From this, a novel approach named the Internet of Agents (IoA) arises as an alternative for adding an intelligence and autonomy component to IoT devices and networks. This paper presents an analysis of the main benefits derived from the use of the IoA approach, from a practical point of view regarding the needs that humans have in their daily life and work, which can be met by IoT networks modeled as IoA infrastructures. Twenty-four case studies of the IoA approach in different domains (smart industry, smart city, and smart health and wellbeing) are presented in order to define the scope of these proposals in terms of intelligence and autonomy, in contrast to their corresponding generic IoT applications.

  18. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
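
    As a rough picture of the screening step described above, the sketch below compares assumed scenario frequencies against a likelihood "level of concern" threshold. The scenario names, frequency estimates and the threshold value are invented for illustration and are not taken from the PNNL analysis.

```python
# Hypothetical screening of hazard scenarios against a likelihood "level of concern".
# Scenario names, frequencies and the threshold are illustrative assumptions only.
LEVEL_OF_CONCERN = 1e-6   # assumed threshold, events per year

scenarios = {
    "pump seal fire":        {"frequency_per_yr": 3e-4, "serious": False},
    "column overpressure":   {"frequency_per_yr": 2e-5, "serious": True},
    "hydrogen deflagration": {"frequency_per_yr": 4e-7, "serious": True},
}

for name, s in scenarios.items():
    if not s["serious"]:
        verdict = "screened out (not a serious accident)"
    elif s["frequency_per_yr"] >= LEVEL_OF_CONCERN:
        verdict = "evaluate additional controls"
    else:
        verdict = "below level of concern"
    print(f"{name}: {s['frequency_per_yr']:.0e}/yr -> {verdict}")
```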

  19. Safety analysis and risk assessment handbook

    International Nuclear Information System (INIS)

    Peterson, V.L.; Colwell, R.G.; Dickey, R.L.

    1997-01-01

    This Safety Analysis and Risk Assessment Handbook (SARAH) provides guidance to the safety analyst at the Rocky Flats Environmental Technology Site (RFETS) in the preparation of safety analyses and risk assessments. Although the older guidance (the Rocky Flats Risk Assessment Guide) continues to be used for updating the Final Safety Analysis Reports developed in the mid-1980s, this new guidance is used with all new authorization basis documents. With the mission change at RFETS came the need to establish new authorization basis documents for its facilities, whose functions had changed. The methodology and databases for performing the evaluations that support the new authorization basis documents had to be standardized, to avoid the use of different approaches and/or databases for similar accidents in different facilities. This handbook presents this new standardized approach. The handbook begins with a discussion of the requirements of the different types of authorization basis documents and how to choose the one appropriate for the facility to be evaluated. It then walks the analyst through the process of identifying all the potential hazards in the facility, classifying them, and choosing the ones that need to be analyzed further. It then discusses the methods for evaluating accident initiation and progression and covers the basic steps in a safety analysis, including consequence and frequency binning and risk ranking. The handbook lays out standardized approaches for determining the source terms of the various accidents (including airborne release fractions, leakpath factors, etc.), the atmospheric dispersion factors appropriate for Rocky Flats, and the methods for radiological and chemical consequence assessments. The radiological assessments use a radiological "template", a spreadsheet that incorporates the standard values of parameters, whereas the chemical assessments use the standard codes ARCHIE and ALOHA.
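
    For context on the source-term step mentioned above, facility safety analyses of this kind commonly use a five-factor formulation, ST = MAR x DR x ARF x RF x LPF. The sketch below assumes that formulation; the numerical values are purely illustrative and are not Rocky Flats parameters.

```python
def source_term(mar_g, damage_ratio, arf, respirable_fraction, leakpath_factor):
    """Five-factor airborne source term: ST = MAR * DR * ARF * RF * LPF (respirable grams)."""
    return mar_g * damage_ratio * arf * respirable_fraction * leakpath_factor

# Illustrative values only (not actual facility parameters):
st = source_term(
    mar_g=1000.0,             # material at risk in the facility segment
    damage_ratio=0.1,         # fraction of the MAR affected by the accident
    arf=1e-3,                 # airborne release fraction
    respirable_fraction=0.5,  # fraction of the airborne material that is respirable
    leakpath_factor=0.1,      # attenuation along the release path to the environment
)
print(f"respirable source term: {st:.3g} g")
```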

  20. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    Comparative analysis of the methodologies was performed based on the discussions in the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo, on March 2005. Through the workshop and succeeding considerations, it was clarified that the proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situations of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from perceived threat and implementation circumstances inherent to the larger programs. Deeper recognition of these differences among communities would help to advance the essential discussion and move it toward harmonization. (author)

  1. Interobserver agreement in the assessment of pulmonary infiltrates on chest radiography in community-acquired pneumonia; Detektion pneumonischer Infiltrate bei ambulant erworbener Pneumonie: Uebereinstimmung in der Befundung der Roentgen-Thoraxaufnahme

    Energy Technology Data Exchange (ETDEWEB)

    Pauls, S.; Billich, C.; Boll, D.; Aschoff, A.J. [Diagnostische und Interventionelle Radiologie, Universitaetskliniken Ulm (Germany); Krueger, S. [Medizinische Klinik I, Universitaetskliniken RWTH Aachen (Germany); Richter, K.; Marre, R.; Gonschior, S. [Mikrobiologie und Hygiene, Universitaetskliniken Ulm (Germany); Muche, R. [Inst. fuer Biometrie, Univ. Ulm (Germany); Welte, T. [Abt. fuer Pneumologie, Medizinische Hochschule Hannover (Germany); Schumann, C. [Medizinische Klinik II, Universitaetskliniken Ulm (Germany); Suttorp, N. [Abt. Innere Medizin, Charite Universitaetsmedizin Berlin (Germany)

    2007-11-15

    Purpose: To assess interobserver agreement (IOA) in the diagnosis of pulmonary infiltrates on chest X-rays for patients with community-acquired pneumonia (CAP). Materials and methods: From 7/2002 to 12/2005, 806 adults with CAP were included in the multicenter study 'CAPNETZ' (7 hospitals). Inclusion criteria were clinical signs of pneumonia and pulmonary opacification on chest X-rays. Each X-ray was reevaluated by two radiologists from the university hospital in consensus reading against the interpreter at the referring hospital in regard to: presence of infiltrate (yes/no/equivocal), transparency (≤/> 50%), localization, and pattern of infiltrates (alveolar/interstitial). The following parameters were documented: digital or film radiography, hospitalization, fever, findings of auscultation, microbiological findings. Results: The overall IOA concerning the detection of infiltrates was 77.7% (n = 626; CI 0.75 - 0.81); the infiltrates were not verified in 16.4% (n = 132) by the referring radiologist, with equivocal findings in 5.9% (n = 48). The IOA of the different clinical centers varied between 63.2% (n = 38, CI 0.48 - 0.78) and 92.3% (n = 65, CI 0.86 - 0.99). The IOA for the diagnosis of infiltrates was significantly higher for inpatients with 82.6% (n = 546; CI 0.80 - 0.85) than for outpatients with 55.2% (n = 80; CI 0.47 - 0.63), p < 0.0001. The IOA of infiltrates with a transparency ≤ 50% was 95.1% (n = 215; CI 0.92 - 0.98) versus 80.4% (n = 403; CI 0.77 - 0.84) for infiltrates with a transparency > 50% (p < 0.0001). In patients with positive auscultation, the IOA was higher (p = 0.034). Chest X-rays of patients with antibiotic therapy or an alveolar infiltrate showed more equivocal findings compared to patients without these features. Conclusion: There is considerable interobserver variability in the diagnosis of pulmonary infiltrates on chest radiographs. The IOA is higher in more opaque infiltrates, positive auscultation and inpatients. (orig.)
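
    The reported overall agreement can be reproduced from the stated counts (626 concordant readings out of 806 radiographs). The sketch below uses a simple normal-approximation confidence interval, which is an assumption, since the report does not state how its intervals were computed.

```python
import math

def agreement_with_ci(agree: int, total: int, z: float = 1.96):
    """Agreement proportion with a normal-approximation 95% confidence interval."""
    p = agree / total
    se = math.sqrt(p * (1 - p) / total)
    return p, (p - z * se, p + z * se)

# Reported study counts: 626 concordant readings out of 806 chest X-rays.
p, (low, high) = agreement_with_ci(626, 806)
print(f"overall IOA = {p:.1%}, 95% CI ({low:.2f} - {high:.2f})")   # -> 77.7%, (0.75 - 0.81)
```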

  2. Interobserver agreement in the assessment of pulmonary infiltrates on chest radiography in community-acquired pneumonia

    International Nuclear Information System (INIS)

    Pauls, S.; Billich, C.; Boll, D.; Aschoff, A.J.; Krueger, S.; Richter, K.; Marre, R.; Gonschior, S.; Muche, R.; Welte, T.; Schumann, C.; Suttorp, N.

    2007-01-01

    Purpose: To assess interobserver agreement (IOA) in the diagnosis of pulmonary infiltrates on chest X-rays for patients with community-acquired pneumonia (CAP). Materials and methods: From 7/2002 to 12/2005, 806 adults with CAP were included in the multicenter study "CAPNETZ" (7 hospitals). Inclusion criteria were clinical signs of pneumonia and pulmonary opacification on chest X-rays. Each X-ray was reevaluated by two radiologists from the university hospital in consensus reading against the interpreter at the referring hospital in regard to: presence of infiltrate (yes/no/equivocal), transparency (≤/> 50%), localization, and pattern of infiltrates (alveolar/interstitial). The following parameters were documented: digital or film radiography, hospitalization, fever, findings of auscultation, microbiological findings. Results: The overall IOA concerning the detection of infiltrates was 77.7% (n = 626; CI 0.75 - 0.81); the infiltrates were not verified in 16.4% (n = 132) by the referring radiologist, with equivocal findings in 5.9% (n = 48). The IOA of the different clinical centers varied between 63.2% (n = 38, CI 0.48 - 0.78) and 92.3% (n = 65, CI 0.86 - 0.99). The IOA for the diagnosis of infiltrates was significantly higher for inpatients with 82.6% (n = 546; CI 0.80 - 0.85) than for outpatients with 55.2% (n = 80; CI 0.47 - 0.63), p < 0.0001. The IOA of infiltrates with a transparency ≤ 50% was 95.1% (n = 215; CI 0.92 - 0.98) versus 80.4% (n = 403; CI 0.77 - 0.84) for infiltrates with a transparency > 50% (p < 0.0001). In patients with positive auscultation, the IOA was higher (p = 0.034). Chest X-rays of patients with antibiotic therapy or an alveolar infiltrate showed more equivocal findings compared to patients without these features. Conclusion: There is considerable interobserver variability in the diagnosis of pulmonary infiltrates on chest radiographs. The IOA is higher in more opaque infiltrates, positive auscultation and inpatients. (orig.)

  3. Uncertainty on faecal analysis on dose assessment

    Energy Technology Data Exchange (ETDEWEB)

    Juliao, Ligia M.Q.C.; Melo, Dunstana R.; Sousa, Wanderson de O.; Santos, Maristela S.; Fernandes, Paulo Cesar P. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/n. Via 9, Recreio, CEP 22780-160, Rio de Janeiro, RJ (Brazil)

    2007-07-01

    Monitoring programmes for internal dose assessment may need to combine bioassay techniques, e.g. urine and faecal analysis, especially in workplaces where compounds of different solubilities are handled and also in cases of accidental intakes. Faecal analysis may provide important data for the assessment of committed effective dose due to exposure to insoluble compounds, since the activity excreted in urine may not be detectable unless a very sensitive measurement system is available. This paper discusses the variability of the daily faecal excretion based on data from just one daily collection and on collection during three consecutive days, with samples analysed individually and samples analysed as a pool. The results suggest that a single day's collection is not appropriate for dose assessment, since the 24 h uranium excretion may vary by a factor of 40. On the basis of this analysis, the recommendation should be faecal collection during three consecutive days, with the samples analysed as a pool, which is more economical and faster. (authors)

  4. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over the last years forces waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency and timetables, and type of containers.
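    As a rough illustration of the core indicator set named in the abstract (effective collection distance, time and fuel consumption), the sketch below expresses each per tonne of waste collected for a single circuit. The function name and the numbers are hypothetical, not values from the Oporto data.

```python
# Illustrative sketch only: per-tonne performance indicators for one
# collection circuit. All field names and figures are hypothetical.
def circuit_indicators(distance_km, time_h, fuel_l, waste_t):
    return {
        "distance_km_per_tonne": distance_km / waste_t,
        "time_h_per_tonne": time_h / waste_t,
        "fuel_l_per_tonne": fuel_l / waste_t,
    }

print(circuit_indicators(distance_km=42.5, time_h=6.2, fuel_l=38.0, waste_t=7.4))
```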

  5. Expert Judgement Assessment & SCENT Ontological Analysis

    Directory of Open Access Journals (Sweden)

    NICHERSU Iulian

    2018-05-01

    Full Text Available This study aims to provide insights into the starting point of the Horizon 2020 EC-funded project SCENT (Smart Toolbox for Engaging Citizens into a People-Centric Observation Web) Citizen Observatory (CO) in terms of existing infrastructure, existing monitoring systems and some discussion of the existing legal and administrative framework related to flood monitoring and management in the area of the Danube Delta. The methodology used in this approach is based on expert judgement and ontological analysis, using the information collected from the identified end-users of the SCENT toolbox. In this type of analysis, the stages of flood monitoring and management in which the experts are involved are detailed through an Expert Judgement Assessment analysis. The latter is complemented by a set of Key Performance Indicators that the stakeholders have assessed and/or proposed for the evaluation of the SCENT demonstrations, for the impact of the project and finally for SCENT toolbox performance and usefulness. The second part of the study presents an analysis that attempts to map the interactions between different organizations and components of the existing monitoring systems in the Danube Delta case study. Expert Judgement (EJ) allows information to be gained from specialists in a specific field through a consultation process with one or more experts who have experience in similar and complementary topics. Expert judgment, expert estimates, or expert opinion are all terms that refer to the contents of the problem; estimates, outcomes, predictions, uncertainties, and their corresponding assumptions and conditions are all examples of expert judgment. Expert judgement is affected by the process used to gather it. The ontological analysis, on the other hand, completes this study by organizing and presenting the connections behind the flood management and land use systems in the three phases of a flood event.

  6. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    Parzyck, D.C.

    1980-01-01

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated to evaluate impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impacts. It is planned that an integrated assessment team examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environmental control costs/benefits, institutional structures, and facility siting. This examination would assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined.

  7. System Analysis and Risk Assessment (SARA) system

    International Nuclear Information System (INIS)

    Krantz, E.A.; Russell, K.D.; Stewart, H.D.; Van Siclen, V.S.

    1986-01-01

    Utilization of Probabilistic Risk Assessment (PRA) related information in the day-to-day operation of plant systems has, in the past, been impracticable due to the size of the computers needed to run PRA codes. This paper discusses a microcomputer-based database system which can greatly enhance the capability of operators or regulators to incorporate PRA methodologies into their routine decision making. This system is called the System Analysis and Risk Assessment (SARA) system. SARA was developed by EG and G Idaho, Inc. at the Idaho National Engineering Laboratory to facilitate the study of frequency and consequence analyses of accident sequences from a large number of light water reactors (LWRs) in this country. This information is being amassed by several studies sponsored by the United States Nuclear Regulatory Commission (USNRC). To meet the needs of portability and accessibility, and to perform the variety of calculations necessary, it was felt that a microcomputer-based system would be most suitable.

  8. Assessment of right atrial function analysis

    International Nuclear Information System (INIS)

    Shohgase, Takashi; Miyamoto, Atsushi; Kanamori, Katsushi; Kobayashi, Takeshi; Yasuda, Hisakazu

    1988-01-01

    To assess the potential utility of right atrial function analysis in cardiac disease, reservoir function, pump function, and right atrial peak emptying rate (RAPER) were compared in 10 normal subjects, 32 patients with coronary artery disease, and 4 patients with primary pulmonary hypertension. Right atrial volume curves were obtained using a cardiac radionuclide method with Kr-81m. In normal subjects, the reservoir function index was 0.41 ± 0.05 and the pump function index was 0.25 ± 0.05. Both groups of patients had decreased reservoir function and increased pump function. Pump function tended to decrease with an increase in right ventricular end-diastolic pressure. RAPER correlated well with the right ventricular peak filling rate, probably reflecting right ventricular diastolic function. Analysis of right atrial function seemed to be of value in evaluating the factors regulating right ventricular contraction and diastolic function, and cardiac output. (Namekawa, K)

  9. Reliability analysis and assessment of structural systems

    International Nuclear Information System (INIS)

    Yao, J.T.P.; Anderson, C.A.

    1977-01-01

    The study of structural reliability deals with the probability of having satisfactory performance of the structure under consideration within any specific time period. To pursue this study, it is necessary to apply available knowledge and methodology in structural analysis (including dynamics) and design, behavior of materials and structures, experimental mechanics, and the theory of probability and statistics. In addition, various severe loading phenomena such as strong motion earthquakes and wind storms are important considerations. For three decades now, much work has been done on reliability analysis of structures, and during this past decade, certain so-called 'Level I' reliability-based design codes have been proposed and are in various stages of implementation. These contributions will be critically reviewed and summarized in this paper. Because of the undesirable consequences resulting from the failure of nuclear structures, it is important and desirable to consider the structural reliability in the analysis and design of these structures. Moreover, after these nuclear structures are constructed, it is desirable for engineers to be able to assess the structural reliability periodically as well as immediately following the occurrence of severe loading conditions such as a strong-motion earthquake. During this past decade, increasing use has been made of techniques of system identification in structural engineering. On the basis of non-destructive test results, various methods have been developed to obtain an adequate mathematical model (such as the equations of motion with more realistic parameters) to represent the structural system

  10. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    Energy Technology Data Exchange (ETDEWEB)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, 'Nuclear Safety Management', Subpart B, 'Safety Basis Requirements'. Consistent with DOE-STD-3009-94, Change Notice 2, 'Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses' (STD-3009), and DOE-STD-3011-2002, 'Guidance for Preparation of Basis for Interim Operation (BIO) Documents' (STD-3011), the Hanford SARAH describes a methodology for performing a safety analysis leading to the development of a Documented Safety Analysis (DSA) and the derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, 'Integration of Environment, Safety, and Health into Facility Disposition Activities' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  11. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Full Text Available Automatic detection of voice pathologies enables non-invasive, low-cost and objective assessment of the presence of disorders, as well as accelerating and improving the process of diagnosis and the clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, vocal cord paralysis) using the a, i and u vowels, spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds. The results show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings together the most commonly used approaches to speech signal processing and leads to a comparison of the machine learning methods for determining the health status of the patient.
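    The following sketch illustrates, under stated assumptions, the kind of dimensionality reduction the abstract describes: a 28-dimensional acoustic feature vector reduced with linear PCA and kernel PCA (scikit-learn) before any classification step. The random data merely stand in for real voice features.

```python
# Hedged sketch: reduce a 28-dimensional acoustic feature matrix with
# linear PCA and kernel PCA. The random matrix stands in for real features.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 28))          # 100 hypothetical samples, 28 features

pca = PCA(n_components=5).fit(X)
X_lin = pca.transform(X)                # linear principal components

kpca = KernelPCA(n_components=5, kernel="rbf").fit(X)
X_ker = kpca.transform(X)               # nonlinear (kernel) components

print(X_lin.shape, X_ker.shape, pca.explained_variance_ratio_.round(2))
```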

  12. Uncertainty Assessments in Fast Neutron Activation Analysis

    International Nuclear Information System (INIS)

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on statistical reproducibility.
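    A common way to budget the breadth of parameters contributing to a measurement uncertainty is to combine independent components in quadrature. The sketch below is illustrative only; the component names and magnitudes are assumptions, not values from the FNAA project.

```python
# Sketch only: combining independent relative uncertainty components in
# quadrature. Component names and magnitudes are assumed for illustration.
import math

components_percent = {
    "counting statistics": 1.8,
    "neutron flux monitoring": 1.0,
    "geometry / positioning": 0.7,
    "calibration standard": 0.5,
}
combined = math.sqrt(sum(u**2 for u in components_percent.values()))
print(f"combined standard uncertainty ~ {combined:.1f} %")
```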

  13. Global sensitivity analysis in wind energy assessment

    Science.gov (United States)

    Tsvetkova, O.; Ouarda, T. B.

    2012-12-01

    Wind energy is one of the most promising renewable energy sources. Nevertheless, it is not yet a common source of energy, although there is enough wind potential to supply the world's energy demand. One of the most prominent obstacles to employing wind energy is the uncertainty associated with wind energy assessment. Global sensitivity analysis (SA) studies how the variation of input parameters in an abstract model affects the variation of the variable of interest, or output variable. It also provides ways to calculate explicit measures of the importance of input variables (first-order and total effect sensitivity indices) with regard to their influence on the variation of the output variable. Two methods of determining the above-mentioned indices were applied and compared: the brute force method and the best practice estimation procedure. In this study, a methodology for conducting global SA of wind energy assessment at a planning stage is proposed. Three sampling strategies which are part of the SA procedure were compared: sampling based on Sobol' sequences (SBSS), Latin hypercube sampling (LHS) and pseudo-random sampling (PRS). A case study of Masdar City, a showcase of sustainable living in the UAE, is used to exemplify application of the proposed methodology. Sources of uncertainty in wind energy assessment are very diverse. In the case study the following were identified as uncertain input parameters: the Weibull shape parameter, the Weibull scale parameter, availability of a wind turbine, lifetime of a turbine, air density, electrical losses, blade losses, and ineffective time losses. Ineffective time losses are defined as losses during the time when the actual wind speed is lower than the cut-in speed or higher than the cut-out speed. The output variable in the case study is the lifetime energy production. The most influential factors for lifetime energy production are identified with the ranking of the total effect sensitivity indices. The results of the present
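    As a hedged sketch of the first-order sensitivity indices mentioned above, the code below estimates them by brute-force Monte Carlo with a pick-freeze column swap. The toy model and parameter ranges are invented and are not the Masdar City case study.

```python
# Hedged sketch: brute-force Monte Carlo estimate of first-order Sobol'
# sensitivity indices for a toy wind-energy-like model (invented model).
import numpy as np

def model(x):
    # toy output driven by a Weibull-like scale, shape and a loss fraction
    scale, shape, losses = x[:, 0], x[:, 1], x[:, 2]
    return scale**3 / shape * (1.0 - losses)

rng = np.random.default_rng(1)
N, d = 100_000, 3
A = rng.uniform([6, 1.5, 0.0], [10, 2.5, 0.2], size=(N, d))
B = rng.uniform([6, 1.5, 0.0], [10, 2.5, 0.2], size=(N, d))

fA, fB = model(A), model(B)
var_total = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                     # "pick-freeze" column swap
    first_order = np.mean(fB * (model(ABi) - fA)) / var_total
    print(f"S_{i} ~ {first_order:.2f}")
```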

  14. Contracting Data Analysis: Assessment of Government-Wide Trends

    Science.gov (United States)

    2017-03-01

    CONTRACTING DATA ANALYSIS: Assessment of Government-wide Trends. Report to Congressional Addressees, March 2017 (GAO-17-244SP). What GAO Found: GAO's analysis of government-wide contracting data found that while defense obligations to buy products and services

  15. Technical Overview of Ecological Risk Assessment - Analysis Phase: Exposure Characterization

    Science.gov (United States)

    Exposure Characterization is the second major component of the analysis phase of a risk assessment. For a pesticide risk assessment, the exposure characterization describes the potential or actual contact of a pesticide with a plant, animal, or media.

  16. Quality Assessment of Urinary Stone Analysis

    DEFF Research Database (Denmark)

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel

    2016-01-01

    After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for the treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction......, fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference...... spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis....

  17. 5. Basin assessment and watershed analysis

    Science.gov (United States)

    Leslie M. Reid; Robert R. Ziemer

    1994-01-01

    Abstract - Basin assessment is an important component of the President's Forest Plan, yet it has received little attention. Basin assessments are intended both to guide watershed analyses by specifying types of issues and interactions that need to be understood, and, eventually, to integrate the results of watershed analyses occurring within a river basin....

  18. Surface water quality assessment using factor analysis

    African Journals Online (AJOL)

    2006-01-16

    Jan 16, 2006 ... Surface water, groundwater quality assessment and environ- .... Urbanisation influences the water cycle through changes in flow and water ..... tion of aquatic life, CCME water quality Index 1, 0. User`s ... Water, Air Soil Pollut.

  19. Regional analysis and environmental impact assessment

    International Nuclear Information System (INIS)

    Parzyck, D.C.; Brocksen, R.W.; Emanuel, W.R.

    1976-01-01

    This paper presents a number of techniques that can be used to assess environmental impacts on a regional scale. Regional methodologies have been developed which examine impacts upon aquatic and terrestrial biota in regions through consideration of changes in land use, land cover, air quality, water resource use, and water quality. Techniques used to assess long-range atmospheric transport, water resources, effects on sensitive forest and animal species, and impacts on man are presented in this paper, along with an optimization approach which serves to integrate the analytical techniques in an overall assessment framework. A brief review of the research approach and certain modeling techniques used within one regional studies program is provided. While it is not an all inclusive report on regional analyses, it does present an illustration of the types of analyses that can be performed on a regional scale

  20. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment tends to cover potential candidate software and subsequently c...

  1. Data Analysis and Next Generation Assessments

    Science.gov (United States)

    Pon, Kathy

    2013-01-01

    For the last decade, much of the work of California school administrators has been shaped by the accountability of the No Child Left Behind Act. Now as they stand at the precipice of Common Core Standards and next generation assessments, it is important to reflect on the proficiency educators have attained in using data to improve instruction and…

  2. Comparative analysis of selected hydromorphological assessment methods

    Czech Academy of Sciences Publication Activity Database

    Šípek, Václav; Matoušková, M.; Dvořák, M.

    2010-01-01

    Roč. 169, 1-4 (2010), s. 309-319 ISSN 0167-6369 Institutional support: RVO:67985874 Keywords : Hydromorphology * Ecohydromorphological river habitat assessment: EcoRivHab * Rapid Bioassessment Protocol * LAWA Field and Overview Survey * Libechovka River * Bilina River * Czech Republic Subject RIV: DA - Hydrology ; Limnology Impact factor: 1.436, year: 2010

  3. Assessment and uncertainty analysis of groundwater risk.

    Science.gov (United States)

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-extraction of groundwater, problems such as groundwater depression funnels, land subsidence and salt water intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-exploited regions, it is necessary to conduct an assessment of groundwater risk. In this paper, the risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet decision makers' demand for risk information and overcome the limitation of previous risk assessment results expressed in the form of deterministic point estimates, which ignore the uncertainty of the assessment.
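    A minimal sketch of the aggregation step described above (normalized factor values combined with weights into a risk value); the indicator names, bounds and weights are hypothetical, not those of the Tianjin study.

```python
# Illustrative only: min-max normalize each indicator within its own bounds,
# then combine with weights into a single risk value in [0, 1].
def risk_value(values, bounds, weights):
    """values[i] normalized within bounds[i] = (lo, hi); weights sum to 1."""
    norm = [(v - lo) / (hi - lo) for v, (lo, hi) in zip(values, bounds)]
    return sum(n * w for n, w in zip(norm, weights))

# hypothetical indicators: depletion rate, subsidence (mm/yr), salinity (g/L)
print(risk_value(values=[0.8, 12.0, 3.5],
                 bounds=[(0, 1), (0, 50), (0, 10)],
                 weights=[0.5, 0.3, 0.2]))
```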

  4. Assessing Visualization: An analysis of Chilean teachers’ guidelines

    DEFF Research Database (Denmark)

    Andrade-Molina, Melissa; Díaz, Leonora

    2018-01-01

    , this importance seems to fade when it comes to assessing students while learning school mathematics and geometry. We conducted an analysis of the official guidelines for the assessment of school mathematics in Chile. The analysis of two of those guides is considered here. The results revealed that these guidelines...... do not help teachers when assessing visualization in schools; rather, their focus is embedded in a tradition of training that leads to a reduction of space....

  5. Heart sounds analysis using probability assessment

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel

    2017-01-01

    Roč. 38, č. 8 (2017), s. 1685-1700 ISSN 0967-3334 R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : heart sounds * FFT * machine learning * signal averaging * probability assessment Subject RIV: FS - Medical Facilities ; Equipment OBOR OECD: Medical engineering Impact factor: 2.058, year: 2016

  6. Assessment and Planning Using Portfolio Analysis

    Science.gov (United States)

    Roberts, Laura B.

    2010-01-01

    Portfolio analysis is a simple yet powerful management tool. Programs and activities are placed on a grid with mission along one axis and financial return on the other. The four boxes of the grid (low mission, low return; high mission, low return; high return, low mission; high return, high mission) help managers identify which programs might be…

  7. Material Analysis for a Fire Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alexander; Nemer, Martin B.

    2014-08-01

    This report consolidates technical information on several materials and material classes for a fire assessment. The materials include three polymeric materials, wood, and hydraulic oil. The polymers are polystyrene, polyurethane, and melamine-formaldehyde foams. Samples of two of the specific materials were tested for their behavior in a fire-like environment. Test data and the methods used to test the materials are presented. Much of the remaining data are taken from a literature survey. This report serves as a reference source of properties necessary to predict the behavior of these materials in a fire.

  8. Structural reliability analysis and seismic risk assessment

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for the safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability in the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of a stationary Gaussian process with zero mean and a Kanai-Tajimi spectrum. All possible seismic hazards at a site, represented by a hazard curve, are also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and earthquake ground acceleration are presented, and a fragility curve for PRA studies is also constructed.
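    The sketch below illustrates, with invented numbers, how a fragility curve can be convolved with a seismic hazard curve to yield a limit-state probability; it uses a simple lognormal fragility form rather than the paper's full stochastic-response model.

```python
# Hedged illustration (not the paper's actual model): lognormal fragility
# convolved with a toy annual hazard curve to give a limit-state probability.
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

A_m, beta = 0.9, 0.35                      # median capacity (g), log-std (assumed)
a = np.linspace(0.01, 2.0, 400)            # peak ground acceleration grid (g)
fragility = norm.cdf(np.log(a / A_m) / beta)

hazard = 1e-3 * (a / 0.1) ** -2.0          # toy annual exceedance curve H(a)
dH_da = np.gradient(hazard, a)             # derivative of the exceedance curve

p_limit_state = trapezoid(fragility * (-dH_da), a)
print(f"annual limit-state probability ~ {p_limit_state:.2e}")
```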

  9. Environmental risk assessment in GMO analysis.

    Science.gov (United States)

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage in the different countries is related to the different positions concerning the labelling of GMO products: based on the principle of substantial equivalence, or rather based on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm for human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity.

  10. Heart sounds analysis using probability assessment.

    Science.gov (United States)

    Plesinger, F; Viscor, I; Halamek, J; Jurco, J; Jurak, P

    2017-07-31

    This paper describes a method for automated discrimination of heart sound recordings according to the Physionet Challenge 2016. The goal was to decide whether a recording refers to normal or abnormal heart sounds, or whether it is not possible to decide (i.e. 'unsure' recordings). Heart sounds S1 and S2 are detected using amplitude envelopes in the band 15-90 Hz. The averaged shape of the S1/S2 pair is computed from amplitude envelopes in five different bands (15-90 Hz; 55-150 Hz; 100-250 Hz; 200-450 Hz; 400-800 Hz). A total of 53 features are extracted from the data. The largest group of features is extracted from the statistical properties of the averaged shapes; other features are extracted from the symmetry of the averaged shapes, and the last group of features is independent of S1 and S2 detection. Generated features are processed using logical rules and probability assessment, a prototype of a new machine-learning method. The method was trained using 3155 records and tested on 1277 hidden records. It resulted in a training score of 0.903 (sensitivity 0.869, specificity 0.937) and a testing score of 0.841 (sensitivity 0.770, specificity 0.913). The revised method led to a test score of 0.853 in the follow-up phase of the challenge. The presented solution achieved 7th place out of 48 competing entries in the Physionet Challenge 2016 (official phase). In addition, the PROBAfind software for probability assessment was introduced.
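    As an illustration of the first processing step described above (amplitude envelopes in the 15-90 Hz band), the sketch below band-passes a synthetic signal and takes its Hilbert-transform envelope with SciPy; the signal is a toy stand-in for a phonocardiogram, not the challenge data.

```python
# Sketch under assumptions: amplitude envelope of a synthetic "heart sound"
# signal in the 15-90 Hz band (bandpass filter + Hilbert envelope).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 2000                                          # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)  # toy bursts

b, a = butter(4, [15, 90], btype="bandpass", fs=fs)  # 15-90 Hz band
band = filtfilt(b, a, x)
envelope = np.abs(hilbert(band))                     # amplitude envelope

print(envelope.max(), envelope.mean())
```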

  11. Proteome analysis in the assessment of ageing.

    Science.gov (United States)

    Nkuipou-Kenfack, Esther; Koeck, Thomas; Mischak, Harald; Pich, Andreas; Schanstra, Joost P; Zürbig, Petra; Schumacher, Björn

    2014-11-01

    Based on demographic trends, the societies in many developed countries are facing an increasing number and proportion of people over the age of 65. The rise in elderly populations, along with improved health care, will be concomitant with an increased prevalence of ageing-associated chronic conditions like cardiovascular, renal, and respiratory diseases, arthritis, dementia, and diabetes mellitus. This is expected to pose unprecedented challenges both for individuals and for societies and their health care systems. An ultimate goal of ageing research is therefore the understanding of physiological ageing and the achievement of 'healthy' ageing by decreasing age-related pathologies. However, on a molecular level, ageing is a complex multi-mechanistic process whose contributing factors may vary individually, partly overlap with pathological alterations, and are often poorly understood. Proteome analysis potentially allows modelling of these multifactorial processes. This review summarises recent proteomic research on age-related changes identified in animal models and human studies. We combined this information with pathway analysis to identify molecular mechanisms associated with ageing. We identified some molecular pathways that are affected in most or even all organs and others that are organ-specific. However, appropriately powered studies are needed to confirm these findings based on in silico evaluation.

  12. Gender analysis of participatory needs assessment of Emeroke ...

    African Journals Online (AJOL)

    Gender analysis of participatory needs assessment of Emeroke community of ... 50%, 26% and 24% of the total households (THHs) were food insecured/core poor, ... farming technologies, inputs, credit and extension services which was worse ...

  13. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  14. PIE Nacelle Flow Analysis and TCA Inlet Flow Quality Assessment

    Science.gov (United States)

    Shieh, C. F.; Arslan, Alan; Sundaran, P.; Kim, Suk; Won, Mark J.

    1999-01-01

    This presentation includes three topics: (1) Analysis of isolated boattail drag; (2) Computation of Technology Concept Airplane (TCA)-installed nacelle effects on aerodynamic performance; and (3) Assessment of TCA inlet flow quality.

  15. Non-human biota dose assessment. Sensitivity analysis and knowledge quality assessment

    International Nuclear Information System (INIS)

    Smith, K.; Robinson, C.; Jackson, D.; La Cruz, I. de; Zinger, I.; Avila, R.

    2010-10-01

    This report provides a summary of a programme of work, commissioned within the BIOPROTA collaborative forum, to assess the quantitative and qualitative elements of uncertainty associated with biota dose assessment of potential impacts of long-term releases from geological disposal facilities (GDF). Quantitative and qualitative aspects of uncertainty were determined through sensitivity and knowledge quality assessments, respectively. Both assessments focused on default assessment parameters within the ERICA assessment approach. The sensitivity analysis was conducted within the EIKOS sensitivity analysis software tool and was run in both generic and test case modes. The knowledge quality assessment involved development of a questionnaire around the ERICA assessment approach, which was distributed to a range of experts in the fields of non-human biota dose assessment and radioactive waste disposal assessments. Combined, these assessments enabled critical model features and parameters that are both sensitive (i.e. have a large influence on model output) and of low knowledge quality to be identified for each of the three test cases. The output of this project is intended to provide information on those parameters that may need to be considered in more detail for prospective site-specific biota dose assessments for GDFs. Such information should help users to enhance the quality of their assessments and build greater confidence in the results. (orig.)

  16. Failure analysis and success analysis: roles in plant aging assessments

    International Nuclear Information System (INIS)

    Johnson, A.B. Jr.

    1985-06-01

    Component aging investigations are an important element in NRC's Nuclear Plant Aging Research (NPAR) strategy. Potential sources of components include plants in decommissioning and commercial plants, both for in situ tests and for examination of equipment removed from service. Nuclear utilities currently have voluntary programs addressing aspects of equipment reliability, such as root cause analysis for safety-related equipment that malfunctions, and trending analysis to follow the course of both successful and abnormal equipment performance. Properly coordinated, the NPAR and utility programs offer an important approach to establishing the database necessary for life extension of nuclear electrical generating plants.

  17. Fast Computation and Assessment Methods in Power System Analysis

    Science.gov (United States)

    Nagata, Masaki

    Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become increasingly important, as more efficient use of power networks is eagerly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing and the application of intelligent systems are briefly surveyed from the viewpoint of their application to online dynamic security assessment.

  18. Pathway analysis concepts for radiological impact assessment

    International Nuclear Information System (INIS)

    Moroney, J.R.

    1992-06-01

    The concepts underlying exposure pathways analysis are outlined with reference to the features of the two broad types of radionuclide transport models now in use - dynamic and steady-state - and the methods for constructing and developing them. By way of illustration, representative radiation doses are estimated for the four main exposure pathways likely to be involved in the land application of effluent water from Retention Pond 2 of Ranger Uranium Mines. These include: external irradiation by 226Ra and natural uranium (U-nat) in soil, ingestion of 226Ra and U-nat in food, inhalation of 222Rn daughter products from 226Ra in soil, and inhalation of 226Ra and U-nat in airborne dust resuspended from soil. Consideration has been given to local residents pursuing a traditional lifestyle on conclusion of the land application program. Because of the possible importance of the contribution from resuspended dust, currently available data are explored in refining the methodology for this pathway and developing a more appropriate model for it. 37 refs., 9 tabs., 6 figs
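    The bookkeeping behind a pathways analysis of this kind can be summarised as a sum over pathways of intake (or exposure) multiplied by a dose coefficient. The sketch below uses placeholder pathway names and coefficients, not values from the Ranger assessment.

```python
# Minimal sketch: total annual dose as a sum over exposure pathways of
# (annual intake or exposure) x (dose coefficient). All numbers are placeholders.
pathways = {
    # pathway: (annual intake or exposure, dose coefficient)
    "ingestion_Ra226_Bq": (50.0, 2.8e-7),            # Sv per Bq ingested (assumed)
    "inhalation_Rn_daughters_mJ_h_m3": (1.0, 1.4e-3),  # Sv per mJ h m^-3 (assumed)
    "external_gamma_hours": (2000.0, 1.0e-7),        # Sv per hour of exposure (assumed)
}
dose_Sv = sum(amount * coeff for amount, coeff in pathways.values())
print(f"total annual dose ~ {dose_Sv * 1000:.2f} mSv")
```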

  19. Accuracy Assessment and Analysis for GPT2

    Directory of Open Access Journals (Sweden)

    YAO Yibin

    2015-07-01

    Full Text Available GPT (Global Pressure and Temperature) is a global empirical model usually used to provide temperature and pressure for the determination of tropospheric delay. GPT has some weaknesses, which have been addressed by a new empirical model named GPT2. GPT2 not only improves the accuracy of temperature and pressure, but also provides specific humidity, water vapor pressure, mapping function coefficients and other tropospheric parameters, yet no accuracy analysis of GPT2 had been made until now. In this paper, high-precision meteorological data from ECMWF and NOAA were used to test and analyze the accuracy of the temperature, pressure and water vapor pressure given by GPT2. Testing results show that the mean bias of temperature is -0.59℃ and the average RMS is 3.82℃; the absolute values of the average bias of pressure and water vapor pressure are less than 1 mb; GPT2 pressure has an average RMS of 7 mb, and water vapor pressure no more than 3 mb. Accuracy differs with latitude, and all parameters show obvious seasonality. In conclusion, the GPT2 model has high accuracy and stability on a global scale.
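    The bias and RMS statistics quoted above follow directly from differences between model output and reference data. A minimal sketch, with placeholder arrays rather than ECMWF/NOAA data:

```python
# Minimal sketch: bias and RMS of model values against reference values.
import numpy as np

def bias_rms(model_values, reference_values):
    diff = np.asarray(model_values) - np.asarray(reference_values)
    return diff.mean(), np.sqrt(np.mean(diff**2))

gpt2_temp = np.array([14.2, 15.1, 13.8, 16.0])      # hypothetical model output (deg C)
ref_temp = np.array([14.9, 15.6, 14.1, 17.2])       # hypothetical reference values
bias, rms = bias_rms(gpt2_temp, ref_temp)
print(f"bias = {bias:.2f} degC, RMS = {rms:.2f} degC")
```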

  20. TRACE Assessment for BWR ATWS Analysis

    International Nuclear Information System (INIS)

    Cheng, L.Y.; Diamond, D.; Cuadra, Arantxa; Raitses, Gilad; Aronson, Arnold

    2010-01-01

    A TRACE/PARCS input model has been developed in order to be able to analyze anticipated transients without scram (ATWS) in a boiling water reactor. The model is based on one developed previously for the Browns Ferry reactor for doing loss-of-coolant accident analysis. This model was updated by adding the control systems needed for ATWS and a core model using PARCS. The control systems were based on models previously developed for the TRAC-B code. The PARCS model is based on information (e.g., exposure and moderator density (void) history distributions) obtained from General Electric Hitachi and cross sections for GE14 fuel obtained from an independent source. The model is able to calculate an ATWS, initiated by the closure of main steam isolation valves, with recirculation pump trip, water level control, injection of borated water from the standby liquid control system and actuation of the automatic depressurization system. The model is not considered complete and recommendations are made on how it should be improved.

  1. Tutorial: Assessment and Analysis of Polysyllables in Young Children

    Science.gov (United States)

    Masso, Sarah; McLeod, Sharynne; Baker, Elise

    2018-01-01

    Purpose: Polysyllables, words of 3 or more syllables, represent almost 30% of words used in American English. The purpose of this tutorial is to support speech-language pathologists' (SLPs') assessment and analysis of polysyllables, extending the focus of published assessment tools that focus on sampling and analyzing children's segmental accuracy…

  2. Content Analysis of Assessment Data in Marketing Education

    Science.gov (United States)

    Vowles, Nicole; Hutto, Alexandra; Miller, Peter Max M.

    2017-01-01

    This study analyzes a sample of students' writing to assess their understanding of marketing concepts in the context of a Principles of Marketing course. Content analysis of pre- and post-essays was used to assess student knowledge of marketing concepts. The data were collected in Principles of Marketing classes and highlight that many students…

  3. MOJECT: MOTION ANALYSIS TO SUPPORT ASSESSMENT OF SURGICAL SKILLS

    NARCIS (Netherlands)

    Uineken, Ruben; Groot Jebbink, Erik; Halfwerk, F.R.; Bulten, Anne; Knoben, Peter; Roux, Moritz; Wicik, Ola; Groenier, Marleen

    2018-01-01

    Assessment of surgical skills is usually performed through direct observation by experts. This is subjective, expensive and requires assessor training. Motion analysis can support objective and cost-effective assessment. The aim of the current study is to design a low-cost, unobtrusive system for

  4. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, social, and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome.

  5. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  6. The role of sensitivity analysis in assessing uncertainty

    International Nuclear Information System (INIS)

    Crick, M.J.; Hill, M.D.

    1987-01-01

    Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then go on to review approaches to sensitivity analysis within the context of assessing uncertainty, and to outline the types of test available for identifying sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice.

  7. Computer assessment of interview data using latent semantic analysis.

    Science.gov (United States)

    Dam, Gregory; Kaufmann, Stefan

    2008-02-01

    Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed.
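    The sketch below conveys the general LSA idea (not the authors' instrument): project short explanations into a low-dimensional semantic space via TF-IDF and truncated SVD, then score each student response by cosine similarity to an expert explanation. The texts are invented.

```python
# Hedged sketch of the general LSA approach: TF-IDF + truncated SVD, then
# cosine similarity of each student response to an expert explanation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

expert = "seasons are caused by the tilt of earths axis as it orbits the sun"
students = [
    "the tilt of the axis changes how direct the sunlight is during the orbit",
    "seasons happen because the earth is closer to the sun in summer",   # misconception
]

tfidf = TfidfVectorizer().fit_transform([expert] + students)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

scores = cosine_similarity(lsa[:1], lsa[1:])[0]
for text, score in zip(students, scores):
    print(f"{score:+.2f}  {text}")
```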

  8. Task Analysis Assessment on Intrastate Bus Traffic Controllers

    Science.gov (United States)

    Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad

    2016-11-01

    Public transportation serves as a means of social mobility and caters to the daily needs of society, enabling passengers to travel from one place to another. This is true for a country like Malaysia, where international trade has been growing significantly over the past few decades. The task analysis assessment was conducted with consideration of a cognitive ergonomics view of problems related to human factors. Conducting research on task analysis of bus traffic controllers allowed a better understanding of the nature of the work and the overall monitoring activities of the bus services. This paper serves to study the task analysis assessment of intrastate bus traffic controllers, and the objectives of this study include conducting a task analysis assessment of the bus traffic controllers. The task analysis assessment for the bus traffic controllers was developed via Hierarchical Task Analysis (HTA). There are a total of five subsidiary tasks at level one, and only two were able to be further broken down at level two. Development of the HTA allowed a better understanding of the work and could further ease the evaluation of the tasks conducted by the bus traffic controllers. Thus, human error could be reduced, improving the safety of all passengers and the overall efficiency of the system. In addition, it could assist in improving the operation of the bus traffic controllers by modelling or synthesizing the existing tasks if necessary.

  9. HVAC fault tree analysis for WIPP integrated risk assessment

    International Nuclear Information System (INIS)

    Kirby, P.; Iacovino, J.

    1990-01-01

    In order to evaluate the public health risk from operation of the Waste Isolation Pilot Plant (WIPP) due to potential radioactive releases, a probabilistic risk assessment of waste handling operations was conducted. One major aspect of this risk assessment involved fault tree analysis of the plant heating, ventilation, and air conditioning (HVAC) systems, which comprise the final barrier between waste handling operations and the environment. 1 refs., 1 tab
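    The quantification behind a fault tree of this kind reduces to combining basic-event probabilities through AND and OR gates. A hedged sketch with invented event probabilities (not values from the WIPP HVAC analysis):

```python
# Minimal sketch: OR and AND gates over independent basic-event probabilities.
from functools import reduce

def or_gate(probs):
    """P(at least one event) for independent events."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(probs):
    """P(all events) for independent events."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

fan_failure = or_gate([1e-3, 5e-4])          # e.g. motor fault OR belt failure
filter_bypass = and_gate([2e-3, 1e-2])       # damper fault AND interlock fault
top_event = or_gate([fan_failure, filter_bypass])
print(f"top event probability ~ {top_event:.2e}")
```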

  10. Risk analysis within environmental impact assessment of proposed construction activity

    Energy Technology Data Exchange (ETDEWEB)

    Zeleňáková, Martina; Zvijáková, Lenka

    2017-01-15

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  11. Risk analysis within environmental impact assessment of proposed construction activity

    International Nuclear Information System (INIS)

    Zeleňáková, Martina; Zvijáková, Lenka

    2017-01-01

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  12. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found profound usage in the design of components and in assessing the fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for the analysis of fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
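    A minimal sketch of a Monte Carlo fracture-probability estimate, using a simple load-versus-toughness limit state in place of the full R6 assessment curve; all distribution parameters are illustrative assumptions.

```python
# Hedged sketch: Monte Carlo estimate of a fracture probability with a simple
# applied-stress-intensity vs. toughness limit state (not the R6 procedure).
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

defect_depth = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)   # mm (assumed)
toughness = rng.normal(loc=80.0, scale=10.0, size=n)                # MPa*sqrt(m) (assumed)
stress = rng.normal(loc=250.0, scale=30.0, size=n)                  # MPa (assumed)

# crude stress-intensity factor for a shallow surface flaw (illustrative only)
K_applied = stress * np.sqrt(np.pi * defect_depth / 1000.0) * 1.12

p_fracture = np.mean(K_applied > toughness)
print(f"estimated fracture probability ~ {p_fracture:.2e}")
```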

  13. No-Reference Video Quality Assessment using MPEG Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2013-01-01

    We present a method for No-Reference (NR) Video Quality Assessment (VQA) for decoded video without access to the bitstream. This is achieved by extracting and pooling features from a NR image quality assessment method used frame by frame. We also present methods to identify the video coding...... and estimate the video coding parameters for MPEG-2 and H.264/AVC which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods...

  14. Cable Hot Shorts and Circuit Analysis in Fire Risk Assessment

    International Nuclear Information System (INIS)

    LaChance, Jeffrey; Nowlen, Steven P.; Wyant, Frank

    1999-01-01

    Under existing methods of probabilistic risk assessment (PRA), the analysis of fire-induced circuit faults has typically been conducted on a simplistic basis. In particular, those hot-short methodologies that have been applied remain controversial in regards to the scope of the assessments, the underlying methods, and the assumptions employed. To address weaknesses in fire PRA methodologies, the USNRC has initiated a fire risk analysis research program that includes a task for improving the tools for performing circuit analysis. The objective of this task is to obtain a better understanding of the mechanisms linking fire-induced cable damage to potentially risk-significant failure modes of power, control, and instrumentation cables. This paper discusses the current status of the circuit analysis task

  15. Safety assessment for deep underground disposal vault-pathways analysis

    International Nuclear Information System (INIS)

    Lyon, R.B.; Rosinger, E.L.J.

    1980-01-01

    The concept verification phase of the Canadian programme for the disposal of nuclear fuel waste encompasses a period of about three years before the start of site selection. During this time, the methodology for Environmental and Safety Assessment studies is being developed by focusing on a model site. Pathways analysis is an important component of these studies. It involves the prediction of the rate at which radionuclides might be released from a disposal vault and travel through the geosphere and biosphere to reach man. The pathways analysis studies cover three major topics: geosphere pathways analysis, biosphere pathways analysis and potentially-disruptive-phenomena analysis. Geosphere pathways analysis includes a total systems analysis, using the computer program GARD2, vault analysis, which considers container failure and waste leaching, hydrogeological modelling and geochemical modelling. Biosphere pathways analysis incorporates a compartmental modelling approach using the computer program RAMM, and a food chain analysis using the computer program FOOD II. Potentially-disruptive-phenomena analysis involves the estimation of the probability and consequences of events such as earthquakes which might reduce the effectiveness of the barriers preventing the release of radionuclides. The current stage of development of the required methodology and data is discussed in each of the three areas and preliminary results are presented. (author)

  16. FENCH-analysis of electricity generation greenhouse gas emissions from solar and wind power in Germany

    International Nuclear Information System (INIS)

    Hartmann, D.

    1997-01-01

    The assessment of energy supply systems with regard to their influence on climate change requires not only the quantification of the direct emissions caused by operating a power plant. It also has to take into account indirect emissions resulting, for example, from the construction and dismantling of the plant. Processes such as manufacturing the materials for building the plant, transporting components, and constructing and maintaining the power plant are included. A tool to determine and assess these energy and mass flows is Life Cycle Analysis (LCA), which allows the assessment of environmental impacts related to a product or service. In this paper a FENCH (Full Energy Chain) analysis, based on an LCA of electricity production from wind and solar power plants under operating conditions typical of applications in Germany, is presented. The FENCH analysis rests on two methods, Process Chain Analysis (PCA) and Input-Output Analysis (IOA), which are illustrated using the example of electricity generation from a wind power plant. Cumulative (direct and indirect) greenhouse gas (GHG) emissions are calculated for electricity production from wind and solar power plants, and the results are compared with electricity production from a coal-fired power plant. Finally, 1 kWh of electricity from renewable energy has to be compared with 1 kWh from a fossil energy carrier, because the benefits of 1 kWh of electricity from different types of power plant are not the same: electricity from wind depends on meteorological conditions, whereas a fossil-fired power plant can follow consumer demand nearly all the time. The GHG emissions are presented with this difference in benefit taken into account. (author)
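
    A minimal numerical illustration of the IOA step is given below: a toy three-sector technical coefficient matrix and assumed direct emission intensities are combined through the Leontief inverse to obtain cumulative (direct plus indirect) GHG emissions for one unit of final demand. The table values are invented for illustration only.

```python
# Illustrative Input-Output Analysis (IOA) sketch: cumulative (direct + indirect)
# GHG emissions per unit of electricity demand via the Leontief inverse.
# The 3-sector table and emission intensities are invented.
import numpy as np

# Technical coefficient matrix A (inter-industry requirements per unit output)
A = np.array([
    [0.10, 0.05, 0.02],   # e.g., energy sector inputs
    [0.20, 0.15, 0.10],   # e.g., manufacturing inputs
    [0.05, 0.10, 0.05],   # e.g., services inputs
])
# Direct GHG intensity of each sector [kg CO2-eq per unit output] (assumed)
e_direct = np.array([0.80, 0.30, 0.05])

# Final demand vector: one unit of output from the energy sector
y = np.array([1.0, 0.0, 0.0])

# Leontief inverse gives total (direct + indirect) output requirements
L = np.linalg.inv(np.eye(3) - A)
total_output = L @ y

cumulative_ghg = e_direct @ total_output
print(f"Cumulative GHG emissions: {cumulative_ghg:.3f} kg CO2-eq per unit demand")
```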

  17. Strategic Choice Analysis by Expert Panels for Migration Impact Assessment

    NARCIS (Netherlands)

    Kourtit, K.; Nijkamp, P.

    2011-01-01

    International migration is a source of policy and research interest in many countries. This paper provides a review of experiences and findings from migration impact assessment worldwide. Various findings are briefly summarised in the context of a systematic migration SWOT analysis for five distinct

  18. Methods for global sensitivity analysis in life cycle assessment

    NARCIS (Netherlands)

    Groen, Evelyne A.; Bokkers, Eddy; Heijungs, Reinout; Boer, de Imke J.M.

    2017-01-01

    Purpose: Input parameters required to quantify environmental impact in life cycle assessment (LCA) can be uncertain due to e.g. temporal variability or unknowns about the true value of emission factors. Uncertainty of environmental impact can be analysed by means of a global sensitivity analysis to

  19. Assessing Group Interaction with Social Language Network Analysis

    Science.gov (United States)

    Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.

    In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.

  20. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Wendelberger, J.R.; McVittie, T.I.

    1995-01-01

    Data management and statistical analysis for environmental assessment are important issues on the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. The database/GIS system that was developed is described; it provides efficient data storage as well as visualization tools that may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities

  1. Applying Multi-Criteria Analysis Methods for Fire Risk Assessment

    Directory of Open Access Journals (Sweden)

    Pushkina Julia

    2015-11-01

    The aim of this paper is to demonstrate the application of multi-criteria analysis methods for optimising the fire risk identification and assessment process. The object of this research is fire risk and risk assessment; the subject is the application of the analytic hierarchy process for modelling and assessing the influence of various fire risk factors. The results of the research can be used by insurance companies to perform detailed assessments of the fire risks of a property and to calculate a risk surcharge on the insurance premium; by state supervisory institutions to determine whether the condition of a property complies with regulatory requirements; and by real estate owners and investors to take actions that reduce fire risk and minimise possible losses.
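
    The sketch below illustrates the analytic hierarchy process step referred to in the abstract: priority weights for a few hypothetical fire risk factors are derived from the principal eigenvector of a pairwise comparison matrix, together with Saaty's consistency ratio. The factor names and judgments are assumptions.

```python
# Rough sketch of the analytic hierarchy process (AHP) weighting step:
# priority weights from the principal eigenvector of a pairwise comparison
# matrix, plus Saaty's consistency ratio.  Comparison values are hypothetical.
import numpy as np

factors = ["ignition sources", "flammable load", "detection/suppression"]
# Pairwise comparisons on Saaty's 1-9 scale (assumed judgments)
P = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority weights

n = P.shape[0]
CI = (eigvals[k].real - n) / (n - 1)      # consistency index
RI = 0.58                                 # Saaty's random index for n = 3
CR = CI / RI                              # consistency ratio (< 0.10 is acceptable)

for f, wi in zip(factors, w):
    print(f"{f}: weight {wi:.3f}")
print(f"Consistency ratio: {CR:.3f}")
```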

  2. Improving assessment of personality disorder traits through social network analysis.

    Science.gov (United States)

    Clifton, Allan; Turkheimer, Eric; Oltmanns, Thomas F

    2007-10-01

    When assessing personality disorder traits, not all judges make equally valid judgments of all targets. The present study uses social network analysis to investigate factors associated with reliability and validity in peer assessment. Participants were groups of military recruits (N=809) who acted as both targets and judges in a round-robin design. Participants completed self- and informant versions of the Multisource Assessment of Personality Pathology. Social network matrices were constructed based on reported acquaintance, and cohesive subgroups were identified. Judges who shared a mutual subgroup were more reliable and had higher self-peer agreement than those who did not. Partitioning networks into two subgroups achieved more consistent improvements than multiple subgroups. We discuss implications for multiple informant assessments.

  3. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in the mediolateral (Fx), anteroposterior (Fy) and vertical (Fz) directions. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of these functions and sets visual markers defining the individual points of interest. The positions of these markers can easily be adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated, and analysing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t) and Fz(t) functions, the durations of the braking and propulsive phases, the duration of the double-support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantages of this method are the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials, and comparability of the variables obtained in different research measurements.
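
    The fragment below mirrors, in outline, the kind of GRF processing described: for one stance phase it locates the extremes of Fz(t) and Fy(t), splits the braking and propulsive phases by the sign of the anteroposterior force, and integrates the corresponding impulses. The synthetic force curves only stand in for real force-plate data.

```python
# Minimal sketch of GRF processing for one stance phase; the curves below are
# synthetic placeholders, not measured force-plate data.
import numpy as np

fs = 1000.0                                 # sampling rate [Hz]
t = np.arange(0.0, 0.6, 1.0 / fs)           # ~0.6 s stance phase

# Synthetic GRF components (body weight ~ 700 N)
Fz = 700.0 * (1.0 + 0.25 * np.sin(2 * np.pi * t / 0.6)) * np.sin(np.pi * t / 0.6)
Fy = -150.0 * np.sin(2 * np.pi * t / 0.6)   # braking (negative) then propulsion

# Extremes of the measured functions
fz_peak = Fz.max()
fy_min, fy_max = Fy.min(), Fy.max()

# Braking vs. propulsive phase from the sign of Fy
braking = Fy < 0
braking_time = braking.sum() / fs
propulsive_time = (~braking).sum() / fs

# Impulses of force (simple rectangle-rule integration)
braking_impulse = Fy[braking].sum() / fs
propulsive_impulse = Fy[~braking].sum() / fs

print(f"Peak Fz: {fz_peak:.0f} N, Fy extremes: {fy_min:.0f}/{fy_max:.0f} N")
print(f"Braking {braking_time:.3f} s (impulse {braking_impulse:.1f} N*s), "
      f"propulsive {propulsive_time:.3f} s (impulse {propulsive_impulse:.1f} N*s)")
```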

  4. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) in a probabilistic safety assessment (PSA) includes identifying human actions from a safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion of the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  5. Analysis and assessment of water treatment plant reliability

    Directory of Open Access Journals (Sweden)

    Szpak Dawid

    2017-03-01

    The subject of this publication is the analysis and assessment of the reliability of a surface water treatment plant (WTP). In the study the one-parameter method of reliability assessment was used. Based on the flow sheet obtained from the water company, a reliability scheme of the analysed WTP was prepared. On the basis of the daily WTP work reports, the availability index Kg was determined for the individual elements included in the WTP. Then, based on the developed reliability scheme showing the interrelationships between elements, the availability index Kg for the whole WTP was determined. The obtained value of the availability index Kg was compared with the criterion values.
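
    A compact sketch of the one-parameter approach is shown below: an availability index Kg is computed for each element from assumed uptime and downtime, and the element values are combined through a series/parallel reliability scheme. The element names and operating times are hypothetical.

```python
# Simple sketch of an availability index Kg per element (uptime / total time),
# combined through a reliability scheme: series for must-all-work stages,
# parallel for redundant units.  Element names and times are invented.
def kg(uptime_h, downtime_h):
    """Availability index of a single element."""
    return uptime_h / (uptime_h + downtime_h)

def series(*k):
    """All elements must work: product of availabilities."""
    out = 1.0
    for ki in k:
        out *= ki
    return out

def parallel(*k):
    """Redundant elements: at least one must work."""
    out = 1.0
    for ki in k:
        out *= (1.0 - ki)
    return 1.0 - out

# Hypothetical WTP elements
k_intake = kg(8600, 160)
k_coag_1 = kg(8500, 260)
k_coag_2 = kg(8550, 210)          # redundant coagulation line
k_filter = kg(8700, 60)
k_disinfect = kg(8650, 110)

k_wtp = series(k_intake, parallel(k_coag_1, k_coag_2), k_filter, k_disinfect)
print(f"Availability index Kg of the whole WTP: {k_wtp:.4f}")
```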

  6. Secondary Analysis of Large-Scale Assessment Data: An Alternative to Variable-Centred Analysis

    Science.gov (United States)

    Chow, Kui Foon; Kennedy, Kerry John

    2014-01-01

    International large-scale assessments are now part of the educational landscape in many countries and often feed into major policy decisions. Yet, such assessments also provide data sets for secondary analysis that can address key issues of concern to educators and policymakers alike. Traditionally, such secondary analyses have been based on a…

  7. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    Morgenroth, M.; Donnelly, C.R.; Westermann, G.D.; Huang, J.H.S.; Lam, T.M.

    1999-01-01

    The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in order to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision-making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but moves it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
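
    The sketch below shows the core capacity-demand calculation in its simplest form: with capacity and demand treated as independent normal variables, the probability of failure follows from the reliability index. The parameter values are illustrative only.

```python
# Hedged sketch of a capacity-demand reliability calculation: with capacity C
# and demand D as independent normal random variables, the probability of
# failure is P(C < D).  Parameter values are illustrative.
from math import sqrt
from scipy.stats import norm

mu_C, sd_C = 420.0, 35.0     # e.g., component strength [MPa] (assumed)
mu_D, sd_D = 300.0, 35.0     # e.g., acting stress [MPa] (assumed)

beta = (mu_C - mu_D) / sqrt(sd_C**2 + sd_D**2)   # reliability index
p_failure = norm.cdf(-beta)

print(f"Reliability index beta = {beta:.2f}, P(failure) = {p_failure:.2e}")
```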

  8. Assessment of surface water quality using hierarchical cluster analysis

    Directory of Open Access Journals (Sweden)

    Dheeraj Kumar Dabgerwal

    2016-02-01

    This study was carried out to assess the physicochemical quality of the river Varuna in Varanasi, India. Water samples were collected from 10 sites during January-June 2015. Pearson correlation analysis was used to assess the direction and strength of the relationships between physicochemical parameters. Hierarchical cluster analysis was also performed to determine the sources of pollution in the river Varuna. The results showed quite high values of DO, nitrate, BOD, COD and total alkalinity, above the BIS permissible limits. The correlation analysis identified pH, electrical conductivity, total alkalinity and nitrate as key water parameters that influence the concentrations of the other water parameters. Cluster analysis identified three major clusters among the 10 sampling sites, according to similarity in water quality. This study illustrates the usefulness of correlation and cluster analysis for obtaining better information about river water quality. International Journal of Environment Vol. 5 (1) 2016, pp. 32-44
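
    The snippet below sketches the two analyses mentioned, Pearson correlation between parameters and hierarchical (Ward) clustering of sampling sites, on a placeholder data matrix; it is not the study's dataset.

```python
# Illustrative sketch: Pearson correlation between water-quality parameters and
# hierarchical (Ward) clustering of sampling sites.  The data matrix is random
# and only stands in for measured parameters at 10 sites.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
params = ["pH", "EC", "DO", "BOD", "COD", "alkalinity", "nitrate"]
X = rng.normal(size=(10, len(params)))        # 10 sites x 7 parameters (placeholder)

# Pearson correlation between two parameters (e.g., pH vs. EC)
r, p = pearsonr(X[:, 0], X[:, 1])
print(f"Pearson r(pH, EC) = {r:.2f} (p = {p:.3f})")

# Hierarchical clustering of sites on standardized parameters
Z = (X - X.mean(axis=0)) / X.std(axis=0)
link = linkage(Z, method="ward")
clusters = fcluster(link, t=3, criterion="maxclust")
print("Site cluster memberships:", clusters)
```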

  9. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) a description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) a comparison of the analysis results for the different DOBs between the two methods, and (3) a performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response of much more deeply embedded structures than would normally be allowed by standard practice

  10. Discriminant analysis in Polish manufacturing sector performance assessment

    Directory of Open Access Journals (Sweden)

    Józef Dziechciarz

    2004-01-01

    This is a presentation of the preliminary results of a larger project on determining the attractiveness of manufacturing branches. Results of the performance assessment of Polish manufacturing branches in 2000 (section D „Manufacturing”, based on NACE – Nomenclatures des Activites de Communite Europeene) are shown. In the research, the classical (Fisher's) linear discriminant analysis technique was used to analyse the profit-generation ability of firms belonging to a given production branch. For estimation, data describing the group level were used; for cross-validation, the class data.
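
    A small sketch of a Fisher-type linear discriminant classification is given below, separating simulated "profitable" and "unprofitable" firms on two hypothetical financial ratios; the data are not the NACE section D figures used in the paper.

```python
# Sketch of a Fisher-type linear discriminant classification of firms into
# profitable vs. unprofitable groups from two simulated financial ratios.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)
n = 200
# Two hypothetical financial ratios per firm
profitable = rng.normal(loc=[0.8, 1.2], scale=0.3, size=(n, 2))
unprofitable = rng.normal(loc=[0.2, 0.7], scale=0.3, size=(n, 2))

X = np.vstack([profitable, unprofitable])
y = np.array([1] * n + [0] * n)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

print(f"Discriminant coefficients: {lda.coef_.ravel()}")
print(f"Resubstitution classification accuracy: {lda.score(X, y):.2f}")
```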

  11. No-Reference Video Quality Assessment using Codec Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2015-01-01

    A no-reference video quality assessment (VQA) method is presented for videos distorted by H.264/AVC and MPEG-2. The assessment is performed without access to the bit-stream. Instead we analyze and estimate coefficients based on decoded pixels. The approach involves distinguishing between the two types of videos, estimating the level of quantization used in the I-frames, and exploiting this information to assess the video quality. In order to do this for H.264/AVC, the distribution of the DCT-coefficients after intra-prediction and deblocking are modeled. To obtain VQA features for H.264/AVC, we propose a novel estimation method of the quantization in H.264/AVC videos without bitstream access, which can also be used for Peak Signal-to-Noise Ratio (PSNR) estimation. The results from the MPEG-2 and H.264/AVC analysis are mapped to a perceptual measure of video quality by Support Vector Regression...

  12. HSE assessment of explosion risk analysis in offshore safety cases

    Energy Technology Data Exchange (ETDEWEB)

    Brighton, P.W.M.; Fearnley, P.J.; Brearley, I.G. [Health and Safety Executive, Bootle (United Kingdom). Offshore Safety Div.

    1995-12-31

    In the past two years HSE has assessed around 250 Safety Cases for offshore oil and gas installations, building up a unique overview of the current state of the art on fire and explosion risk assessment. This paper reviews the explosion risk methods employed, focusing on the aspects causing most difficulty for assessment and acceptance of Safety Cases. Prediction of overpressures in offshore explosions has been intensively researched in recent years but the justification of the means of prevention, control and mitigation of explosions often depends on much additional analysis of the frequency and damage potential of explosions. This involves a number of factors, the five usually considered being: leak sizes; gas dispersion; ignition probabilities; the frequency distribution of explosion strength; and the prediction of explosion damage. Sources of major uncertainty in these factors and their implications for practical risk management decisions are discussed. (author)

  13. Safety analysis, risk assessment, and risk acceptance criteria

    International Nuclear Information System (INIS)

    Jamali, K.

    1997-01-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude

  14. Assessment report on NRP sub-theme 'Risk Analysis'

    International Nuclear Information System (INIS)

    Biesiot, W.; Hendrickx, L.; Olsthoorn, A.A.

    1995-01-01

    An overview and assessment are presented of the three research projects carried out under NRP funding that concern risk-related topics: (1) The risks of nonlinear climate changes, (2) Socio-economic and policy aspects of changes in incidence and intensity of extreme (weather) events, and (3) Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy strategies. 1 tab., 6 refs

  15. Public transport risk assessment through fault tree analysis

    Directory of Open Access Journals (Sweden)

    Z. Yaghoubpour

    2016-04-01

    This study focused on public transport risk assessment in District One of Tehran through fault tree analysis involving the three criteria of the Haddon matrix: human, vehicle and road. It examined the factors contributing to the occurrence of road accidents at several urban black spots within District One. Relying on road safety checklists and a survey of experts, the study aimed to help urban managers assess risks in public transport and prevent road accidents. Finally, the risk identification and assessment of public transport in District One yielded several results that answer the research questions. The hypothesis analysis suggested that safety issues in public transport are a concern for urban managers. The key reactive measures are the investigation of accidents, identification of causes and correction of black spots. In addition to high costs, however, reactive measures give rise to multiple operational problems such as traffic navigation and guaranteeing user safety in every operation. The case study highlighted the same fact. Macro-level management in the metropolis of Tehran is critical. Urban road casualties and losses can be curtailed by preventive measures such as the continuous assessment of road safety.
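
    The sketch below shows the basic fault tree arithmetic behind such an assessment: basic events for the human, vehicle and road criteria are combined through OR/AND gates to a top event. The gate layout and probabilities are invented for illustration.

```python
# Minimal fault-tree sketch: a top event "collision at black spot" driven by
# human, vehicle and road basic events.  Gate layout and probabilities are
# invented, not taken from the study.
def p_or(*p):
    """OR gate for independent events: 1 - prod(1 - p_i)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def p_and(*p):
    """AND gate for independent events: prod(p_i)."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

# Hypothetical annual basic-event probabilities
p_driver_error = 0.03
p_brake_failure = 0.004
p_poor_lighting = 0.02
p_wet_surface = 0.10

p_vehicle = p_or(p_brake_failure)                      # vehicle branch
p_road = p_and(p_poor_lighting, p_wet_surface)         # road branch
p_top = p_or(p_driver_error, p_vehicle, p_road)        # top event

print(f"P(top event) = {p_top:.4f}")
```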

  16. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties at the PSA Level 1-Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e., the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. It should be mentioned that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and the isotope masses transferred to the containment, computed using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  17. Analysis of risk assessment methods for goods trucking

    Directory of Open Access Journals (Sweden)

    Yunyazova A.O.

    2018-04-01

    The article considers risk assessment models that can be applied to cargo transportation to forecast possible damage, in the form of financial and material costs, with the aim of reducing the probability of its occurrence. A risk analysis using the "Criterion. Event. Rule" method is presented. This method is based on collecting information by various means, assigning an assessment to the identified risks, ranking them and formulating an analysis report. It can be carried out as a fully manual, mechanical method of collecting information and performing calculations, or it can be automated from data collection to the delivery of finished results (although in that case some nuances that could significantly influence the outcome of the analysis may be ignored). The expert method is of particular importance, since it relies directly on human experience; in this case the human factor plays a special role. The collection of information and the assessments assigned to risk groups depend on the extent to which the experts agree on the issue: the smaller the fluctuations in the values of the experts' estimates, the more accurate and optimal the results will be.

  18. Assessment of non-linear analysis finite element program (NONSAP) for inelastic analysis

    International Nuclear Information System (INIS)

    Chang, T.Y.; Prachuktam, S.; Reich, M.

    1976-11-01

    An assessment of a nonlinear structural analysis finite element program called NONSAP is given with respect to its inelastic analysis capability for pressure vessels and components. The assessment was made from a review of its theoretical basis and from benchmark problem runs. It was found that NONSAP has only limited capability for inelastic analysis. However, the program is written flexibly enough that it can easily be extended or modified to suit the user's needs. Moreover, some of the numerical difficulties in using NONSAP are pointed out.

  19. Modular risk analysis for assessing multiple waste sites

    International Nuclear Information System (INIS)

    Whelan, G.; Buck, J.W.; Nazarali, A.

    1994-06-01

    Human-health impacts, especially to the surrounding public, are extremely difficult to assess at installations that contain multiple waste sites and a variety of mixed-waste constituents (e.g., organic, inorganic, and radioactive). These assessments must address different constituents, multiple waste sites, multiple release patterns, different transport pathways (i.e., groundwater, surface water, air, and overland soil), different receptor types and locations, various times of interest, population distributions, land-use patterns, baseline assessments, a variety of exposure scenarios, etc. Although the process is complex, two of the most important difficulties to overcome are associated with (1) establishing an approach that allows for modifying the source term, transport, or exposure component as an individual module without having to re-evaluate the entire installation-wide assessment (i.e., all modules simultaneously), and (2) displaying and communicating the results in an understandable and usable manner to interested parties. An integrated, physics-based, compartmentalized approach, which is coupled to a Geographical Information System (GIS), captures the regional health impacts associated with multiple waste sites (e.g., hundreds to thousands of waste sites) at locations within and surrounding the installation. Utilizing a modular/GIS-based approach overcomes difficulties in (1) analyzing a wide variety of scenarios for multiple waste sites, and (2) communicating results from a complex human-health-impact analysis by capturing the essence of the assessment in a relatively elegant manner, so the meaning of the results can be quickly conveyed to all who review them

  20. Developing a tool for assessing competency in root cause analysis.

    Science.gov (United States)

    Gupta, Priyanka; Varkey, Prathibha

    2009-01-01

    Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in the same. A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination that was conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. They had a didactic session on RCA, followed by process mapping of the information flow at the project clinic, after which fellows conducted an actual RCA using the Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA regarding a scenario describing an adverse medication event and provided possible solutions to prevent such errors in the future. All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability for the global competency rating and checklist scoring were 0.96 and 0.85, respectively. Internal consistency (Cronbach's alpha) was 0.76. Six of eight of the fellows found the difficulty level of the test to be optimal. Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.
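
    The snippet below illustrates the two reliability statistics reported: Cohen's kappa for a pair of hypothetical raters and Cronbach's alpha over a small made-up score matrix; it is not the study's data.

```python
# Sketch of the two reliability statistics: interrater agreement (Cohen's kappa
# for a pair of raters) and internal consistency (Cronbach's alpha).
# The rating data are made up.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Checklist ratings (0/1/2 per item) from two hypothetical faculty raters
rater_a = np.array([2, 1, 2, 0, 1, 2, 2, 1, 0, 2])
rater_b = np.array([2, 1, 1, 0, 1, 2, 2, 2, 0, 2])
print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")

def cronbach_alpha(items):
    """items: 2-D array, rows = examinees, columns = checklist items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

scores = np.array([[2, 1, 2, 1], [1, 1, 1, 0], [2, 2, 2, 2],
                   [0, 1, 0, 1], [2, 2, 1, 2], [1, 0, 1, 1]])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```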

  1. Flood Risk Assessment Based On Security Deficit Analysis

    Science.gov (United States)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered as acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed-object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determine the acceptability of the risk and the

  2. A hybrid input–output multi-objective model to assess economic–energy–environment trade-offs in Brazil

    International Nuclear Information System (INIS)

    Carvalho, Ariovaldo Lopes de; Antunes, Carlos Henggeler; Freire, Fausto; Henriques, Carla Oliveira

    2015-01-01

    A multi-objective linear programming (MOLP) model based on a hybrid Input–Output (IO) framework is presented. This model aims at assessing the trade-offs between economic, energy, environmental (E3) and social objectives in the Brazilian economic system. This combination of multi-objective models with Input–Output Analysis (IOA) plays a supplementary role in understanding the interactions between the economic and energy systems, and the corresponding impacts on the environment, offering a consistent framework for assessing the effects of distinct policies on these systems. Firstly, the System of National Accounts (SNA) is reorganized to include the National Energy Balance, creating a hybrid IO framework that is extended to assess Greenhouse Gas (GHG) emissions and the employment level. The objective functions considered are the maximization of GDP (gross domestic product) and employment levels, as well as the minimization of energy consumption and GHG emissions. An interactive method enabling a progressive and selective search of non-dominated solutions with distinct characteristics and underlying trade-offs is utilized. Illustrative results indicate that the maximization of GDP and the employment levels lead to an increase of both energy consumption and GHG emissions, while the minimization of either GHG emissions or energy consumption cause negative impacts on GDP and employment. - Highlights: • A hybrid Input–Output multi-objective model is applied to the Brazilian economy. • Objective functions are GDP, employment level, energy consumption and GHG emissions. • Interactive search process identifies trade-offs between the competing objectives. • Positive correlations between GDP growth and employment. • Positive correlations between energy consumption and GHG emissions
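
    As a rough illustration of how such multi-objective trade-offs can be scalarised, the toy model below maximises weighted GDP and employment against weighted energy use and GHG emissions for two aggregate sectors with invented coefficients; the actual model is built on the Brazilian hybrid IO tables and uses an interactive search rather than a fixed weighted sum.

```python
# Toy weighted-sum scalarization of an economy-energy-environment trade-off:
# choose sectoral output levels x to balance GDP and employment (maximize)
# against energy use and GHG emissions (minimize).  All coefficients invented.
import numpy as np
from scipy.optimize import linprog

# Per-unit-output coefficients for two aggregate sectors (assumed)
gdp = np.array([0.9, 0.6])
jobs = np.array([5.0, 12.0])
energy = np.array([3.0, 1.0])
ghg = np.array([2.5, 0.8])

# Weights expressing one possible preference among the four objectives
w_gdp, w_jobs, w_energy, w_ghg = 1.0, 0.05, 0.2, 0.3

# linprog minimizes, so negate the objectives we want to maximize
c = -(w_gdp * gdp + w_jobs * jobs) + (w_energy * energy + w_ghg * ghg)

# Resource constraint (e.g., total primary energy cap) and output bounds
A_ub = [energy]
b_ub = [40.0]
bounds = [(0.0, 20.0), (0.0, 20.0)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
x = res.x
print(f"Sector outputs: {x}")
print(f"GDP={gdp @ x:.1f}, jobs={jobs @ x:.1f}, "
      f"energy={energy @ x:.1f}, GHG={ghg @ x:.1f}")
```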

  3. Interrater reliability of videotaped observational gait-analysis assessments.

    Science.gov (United States)

    Eastlack, M E; Arvidson, J; Snyder-Mackler, L; Danoff, J V; McGarvey, C L

    1991-06-01

    The purpose of this study was to determine the interrater reliability of videotaped observational gait-analysis (VOGA) assessments. Fifty-four licensed physical therapists with varying amounts of clinical experience served as raters. Three patients with rheumatoid arthritis who demonstrated an abnormal gait pattern served as subjects for the videotape. The raters analyzed each patient's most severely involved knee during the four subphases of stance for the kinematic variables of knee flexion and genu valgum. Raters were asked to determine whether these variables were inadequate, normal, or excessive. The temporospatial variables analyzed throughout the entire gait cycle were cadence, step length, stride length, stance time, and step width. Generalized kappa coefficients ranged from .11 to .52. Intraclass correlation coefficients (2,1) and (3,1) were slightly higher. Our results indicate that physical therapists' VOGA assessments are only slightly to moderately reliable and that improved interrater reliability of the assessments of physical therapists utilizing this technique is needed. Our data suggest that there is a need for greater standardization of gait-analysis training.

  4. Model analysis: Representing and assessing the dynamics of student learning

    Directory of Open Access Journals (Sweden)

    Lei Bao

    2006-02-01

    Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and the Force-Motion Concept Evaluation provide instructors with tools to probe their students’ conceptual knowledge of physics. However, many existing quantitative analysis methods focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of standardized multiple-choice tests for assessing students’ alternative knowledge. In addition, the context dependence issue, which suggests that a student may apply the correct knowledge in some situations and revert to using alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present a model analysis, which applies qualitative research to establish a quantitative representation framework. With this method, students’ alternative knowledge and the probabilities that students use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions that can generate much richer information than what is available from score-based analysis.

  5. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  6. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published

  7. Social and ethical analysis in health technology assessment.

    Science.gov (United States)

    Tantivess, Sripen

    2014-05-01

    This paper presents a review of the domestic and international literature on the assessment of the social and ethical implications of health technologies. It gives an overview of the key concepts, principles, and approaches that should be taken into account when conducting a social and ethical analysis within health technology assessment (HTA). Although there is growing consensus among healthcare experts that the social and ethical ramifications of a given technology should be examined before its adoption, the demand for this kind of analysis among policy-makers around the world, including in Thailand, has so far been lacking. Currently decision-makers mainly base technology adoption decisions using evidence on clinical effectiveness, value for money, and budget impact, while social and ethical aspects have been neglected. Despite the recognized importance of considering equity, justice, and social issues when making decisions regarding health resource allocation, the absence of internationally-accepted principles and methodologies, among other factors, hinders research in these areas. Given that developing internationally agreed standards takes time, it has been recommended that priority be given to defining processes that are justifiable, transparent, and contestable. A discussion of the current situation in Thailand concerning social and ethical analysis of health technologies is also presented.

  8. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed
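
    A compact sketch of the two steps is shown below: Monte Carlo propagation of assumed input distributions through a stand-in risk expression (uncertainty analysis), followed by Spearman rank correlations as a simple sensitivity measure; it is not a PRA model.

```python
# Uncertainty analysis by Monte Carlo propagation, then a simple sensitivity
# measure (Spearman rank correlation of each input with the output).
# The "risk model" below is a stand-in expression, not a PRA model.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 10_000

# Uncertain inputs (assumed distributions)
lam_init = rng.lognormal(np.log(1e-3), 0.5, n)   # initiating-event frequency [1/yr]
p_sys_a = rng.beta(2, 200, n)                    # system A failure probability
p_sys_b = rng.beta(2, 50, n)                     # system B failure probability

# Stand-in risk model: core damage frequency if both systems fail
cdf = lam_init * p_sys_a * p_sys_b

print(f"Mean CDF  : {cdf.mean():.2e} /yr")
print(f"95th perc.: {np.percentile(cdf, 95):.2e} /yr")

for name, x in [("initiator freq.", lam_init),
                ("system A", p_sys_a),
                ("system B", p_sys_b)]:
    rho, _ = spearmanr(x, cdf)
    print(f"Spearman rho({name}) = {rho:.2f}")
```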

  9. Environmental policy assessment and the usefulness of meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Button, K. [Loughborough Univ. of Technology (United Kingdom); Nijkamp, P. [Dept. of Spatial Economics. Fac. of Economics and Econometrics. Vrije Univ., Amsterdam (Netherlands)

    1995-12-31

    Incorporating environmental considerations fully into decision-making is a complex process. To develop improved assessment techniques and procedures it is important to understand these complexities. This paper sets out to explore the underlying nature of the issues involved and to develop a systematic schema which makes transparent the types of consideration which need to be incorporated in the assessment process and the forms of tradeoffs which are inevitable. It then proceeds to the more practical issue of looking at where better use could be made of the information that we already have. Environmental assessment is not new and a body of information and experience already exists. An important question is whether we are using the body of knowledge fully. The paper explores the areas in which meta-analysis could provide new insights in the assessment process by extracting more information from previous work but, in doing this, it also highlights areas where further primary research would yield the greatest return. 4 figs., 21 refs.

  10. Biological dosimetry: chromosomal aberration analysis for dose assessment

    International Nuclear Information System (INIS)

    1986-01-01

    In view of the growing importance of chromosomal aberration analysis as a biological dosimeter, the present report provides a concise summary of the scientific background of the subject and a comprehensive source of information at the technical level. After a review of the basic principles of radiation dosimetry and radiation biology, basic information on the biology of lymphocytes, the structure of chromosomes and the classification of chromosomal aberrations is presented. This is followed by a presentation of techniques for collecting blood, storing, transporting, culturing, making chromosomal preparations and scoring aberrations. The physical and statistical parameters involved in dose assessment are discussed and examples of actual dose assessments taken from the scientific literature are given
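
    The snippet below illustrates the basic dose assessment step: inverting a linear-quadratic dicentric yield curve Y = c + aD + bD^2 for the absorbed dose. The coefficient values are typical of published low-LET curves but are assumptions here, not laboratory calibration data.

```python
# Illustrative dose estimate from a dicentric yield using a linear-quadratic
# calibration curve Y = c + a*D + b*D^2.  Coefficients are assumed, not
# laboratory calibration data.
from math import sqrt

c = 0.001    # background dicentrics per cell (assumed)
a = 0.03     # dicentrics per cell per Gy (assumed)
b = 0.06     # dicentrics per cell per Gy^2 (assumed)

def dose_from_yield(y):
    """Invert Y = c + a*D + b*D^2 for the absorbed dose D [Gy]."""
    disc = a * a + 4.0 * b * (y - c)
    return (-a + sqrt(disc)) / (2.0 * b)

# Example: 25 dicentrics observed in 500 scored cells
observed_yield = 25 / 500
print(f"Estimated whole-body dose: {dose_from_yield(observed_yield):.2f} Gy")
```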

  11. Conformity Assessment in Nuclear Material and Environmental Sample Analysis

    International Nuclear Information System (INIS)

    Aregbe, Y.; Jakopic, R.; Richter, S.; Venchiarutti, C.

    2015-01-01

    Safeguards conclusions are based to a large extent on the comparison of measurement results between operator and safeguards laboratories. Measurement results must state traceability and uncertainties to be comparable. Recent workshops held at the IAEA and in the frame of the European Safeguards Research and Development Association (ESARDA) reviewed different approaches for Nuclear Material Balance Evaluation (MBE). Among those, the "bottom-up" approach requires assessment of the measurement systems and capabilities of operator and safeguards laboratories. Therefore, inter-laboratory comparisons (ILCs) with independent reference values, provided for decades by JRC-IRMM, CEA/CETAMA and US DOE, are instrumental in shedding light on the current state of practice in measurements of nuclear material and environmental swipe samples. Participating laboratories are requested to report measurement results with associated uncertainties, and have the possibility to benchmark those results against independent and traceable reference values. The measurement capability of both the IAEA Network of Analytical Laboratories (NWAL) and the nuclear operators' analytical services participating in ILCs can be assessed against the independent reference values as well as against internationally agreed quality goals, in compliance with ISO 13528:2005. The quality goals for nuclear material analysis are the relative combined standard uncertainties listed in the ITV2010. Concerning environmental swipe sample analysis, the IAEA has defined measurement quality goals applied in conformity assessment. The paper reports examples from relevant inter-laboratory comparisons, looking at laboratory performance according to the purpose of the measurement and the possible use of the result in line with the IUPAC International Harmonized Protocol. Tendencies of laboratories to either overestimate and/or underestimate uncertainties are discussed using straightforward graphical tools to evaluate

  12. A Monte Carlo based spent fuel analysis safeguards strategy assessment

    International Nuclear Information System (INIS)

    Fensin, Michael L.; Tobin, Stephen J.; Swinhoe, Martyn T.; Menlove, Howard O.; Sandoval, Nathan P.

    2009-01-01

    assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling time dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield an adequate spent fuel analysis strategy knowledge that will help the down-select process for other reactor types

  13. A Monte Carlo Based Spent Fuel Analysis Safeguards Strategy Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael L.; Tobin, Stephen J.; Swinhoe, Martyn T.; Menlove, Howard O.; Sandoval, Nathan P. [Los Alamos National Laboratory, E540, Los Alamos, NM 87545 (United States)

    2009-06-15

    the generalized assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling time dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield an adequate spent fuel analysis strategy knowledge that will help the down-select process for other reactor types. (authors)

  14. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test, ANOVA means-comparison tests and the LSD post-hoc multiple-comparison test are discussed. (author)
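
    The sketch below applies the listed tests to simulated survey scores from three hypothetical departments, simply to show how they fit together; it is not the survey data discussed in the paper.

```python
# Small sketch of the statistical tests listed above, applied to simulated
# survey scores from three hypothetical departments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
dept_a = rng.normal(3.8, 0.6, 40)      # Likert-style safety-culture scores
dept_b = rng.normal(3.5, 0.6, 35)
dept_c = rng.normal(3.9, 0.6, 30)

# Kolmogorov-Smirnov test of one group against a fitted normal distribution
ks_stat, ks_p = stats.kstest(dept_a, "norm", args=(dept_a.mean(), dept_a.std(ddof=1)))
print(f"K-S: D = {ks_stat:.3f}, p = {ks_p:.3f}")

# Student's t-test comparing two departments
t_stat, t_p = stats.ttest_ind(dept_a, dept_b)
print(f"t-test A vs B: t = {t_stat:.2f}, p = {t_p:.3f}")

# One-way ANOVA across all three departments
f_stat, f_p = stats.f_oneway(dept_a, dept_b, dept_c)
print(f"ANOVA: F = {f_stat:.2f}, p = {f_p:.3f}")
```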

  15. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimate. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to lack of high quality data referred to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e

  16. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    International Nuclear Information System (INIS)

    Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter

    2013-01-01

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimate. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to lack of high quality data referred to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
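
    The fragment below sketches the sales/stock/lifespan logic that underlies such IO-based e-waste estimates: units sold in a given year are discarded in later years according to a lifespan distribution (Weibull here). The sales figures and Weibull parameters are invented.

```python
# Bare-bones sketch of the sales/lifespan logic behind IO-based e-waste
# estimates: each sales cohort is discarded over time following a Weibull
# lifespan distribution.  All figures are invented.
import numpy as np
from scipy.stats import weibull_min

years = np.arange(2000, 2016)
sales = np.linspace(1.0, 2.5, len(years)) * 1e6   # units put on market per year

shape, scale = 2.2, 9.0                           # assumed lifespan distribution [yr]

waste = np.zeros(len(years))
for i, y0 in enumerate(years):
    ages = years - y0                             # age of the cohort in each year
    # Probability of discard during each later year = CDF(age+1) - CDF(age)
    discard = np.where(ages >= 0,
                       weibull_min.cdf(ages + 1, shape, scale=scale)
                       - weibull_min.cdf(ages, shape, scale=scale), 0.0)
    waste += sales[i] * discard

for y, w in zip(years[-3:], waste[-3:]):
    print(f"{y}: about {w/1e6:.2f} million units discarded")
```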

  17. Analysis of dependent failures in risk assessment and reliability evaluation

    International Nuclear Information System (INIS)

    Fleming, K.N.; Mosleh, A.; Kelley, A.P. Jr. (Gas-Cooled Reactors Associates, La Jolla, CA)

    1983-01-01

    The ability to estimate the risk of potential reactor accidents is largely determined by the ability to analyze statistically dependent multiple failures. The importance of dependent failures has been indicated in recent probabilistic risk assessment (PRA) studies as well as in reports of reactor operating experiences. This article highlights the importance of several different types of dependent failures from the perspective of the risk and reliability analyst and provides references to the methods and data available for their analysis. In addition to describing the current state of the art, some recent advances, pitfalls, misconceptions, and limitations of some approaches to dependent failure analysis are addressed. A summary is included of the discourse on this subject, which is presented in the Institute of Electrical and Electronics Engineers/American Nuclear Society PRA Procedures Guide
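
    The article surveys methods and data for dependent-failure analysis rather than prescribing a single model; as a hedged illustration of one classical parametric treatment, the beta-factor model, the sketch below estimates the failure probability of a redundant pair using invented rates that are not drawn from the article.

        # Minimal beta-factor sketch for a redundant pair: a fraction beta of each
        # component's failure rate is assumed to fail both trains simultaneously.
        # lambda_total and beta below are illustrative values, not data from the article.
        import math

        lambda_total = 1.0e-3   # total failure rate per hour of one component
        beta = 0.1              # assumed common cause fraction
        mission_time = 24.0     # hours

        lam_indep = (1.0 - beta) * lambda_total   # independent part
        lam_ccf = beta * lambda_total             # common cause part

        p_indep = 1.0 - math.exp(-lam_indep * mission_time)   # one train fails independently
        p_ccf = 1.0 - math.exp(-lam_ccf * mission_time)       # both trains fail together

        # 1-out-of-2 system (sketch): fails if both trains fail independently
        # or a common cause event occurs; the two contributions are treated as independent.
        p_system = p_indep ** 2 + p_ccf - p_indep ** 2 * p_ccf
        print(f"P(system failure over {mission_time} h) ~ {p_system:.2e}")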

  18. Health Monitoring System Technology Assessments: Cost Benefits Analysis

    Science.gov (United States)

    Kent, Renee M.; Murphy, Dennis A.

    2000-01-01

    The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefit and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structural health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the cost of maintaining and supporting these structures with and without health monitoring capability.

  19. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

    The purpose of this work is research and modification of the reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of the formal assessments that corresponds more closely to the subjective expert estimates (MOS). In reviewing the formal reference objective methods for image quality assessment, we used the results of other authors, who offer comparative analyses of the most effective algorithms. Based on these investigations we chose two of the most successful algorithms, PQS and MSSSIM, for which further analysis was carried out in MATLAB 7.8 (R2009a). The publication focuses on features of the algorithms that are of great importance in practical implementation but are insufficiently covered in the publications of other authors. In the implemented modification of the PQS algorithm, the Kirsch boundary detector was replaced by the Canny boundary detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the PQS objective image quality assessment is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification. In the specialized literature on formal image quality evaluation methods, this type of modification has not been mentioned. The method described in the publication can be applied to various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.
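
    The modification described above replaces the Kirsch boundary detector inside PQS with the Canny detector. The sketch below reproduces only that edge-detection step with scikit-image on a monochrome test image, plus a single-scale SSIM value for comparison; it is not the authors' MATLAB PQS/MSSSIM pipeline, and the noise level and sigma are arbitrary assumptions.

        # Sketch of the edge-detection step only, using scikit-image's Canny detector
        # on a monochrome test image; this is not the authors' MATLAB PQS pipeline.
        import numpy as np
        from skimage import data, feature, util
        from skimage.metrics import structural_similarity

        rng = np.random.default_rng(0)
        reference = util.img_as_float(data.camera())                 # monochrome test image
        degraded = np.clip(reference + rng.normal(0, 0.05, reference.shape), 0, 1)

        edges_ref = feature.canny(reference, sigma=1.5)              # boundary map of reference
        edges_deg = feature.canny(degraded, sigma=1.5)               # boundary map of degraded image

        # Crude structural-difference indicator: fraction of edge pixels that disagree
        disagreement = np.logical_xor(edges_ref, edges_deg).mean()
        ssim = structural_similarity(reference, degraded, data_range=1.0)  # single-scale SSIM
        print(f"edge disagreement: {disagreement:.3f}, SSIM: {ssim:.3f}")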

  20. Supporting analysis and assessments quality metrics: Utility market sector

    Energy Technology Data Exchange (ETDEWEB)

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)

    1996-10-01

    In FY96, NREL was asked to coordinate all analysis tasks so that in FY97 these tasks will be part of an integrated analysis agenda that will begin to define a 5-15 year R&D roadmap and portfolio for the DOE Hydrogen Program. The purpose of the Supporting Analysis and Assessments task at NREL is to provide this coordination and conduct specific analysis tasks. One of these tasks is to prepare the Quality Metrics (QM) for the Program as part of the overall QM effort at DOE/EERE. The Hydrogen Program is one of 39 program planning units conducting QM, a process begun in FY94 to assess the benefits and costs of DOE/EERE programs. The purpose of QM is to inform decision making during the budget formulation process by describing the expected outcomes of programs during the budget request process. QM is expected to establish a first step toward merit-based budget formulation and allow DOE/EERE to get the "most bang for its (R&D) buck." In FY96, NREL coordinated a QM team that prepared a preliminary QM for the utility market sector. In the electricity supply sector, the QM analysis shows hydrogen fuel cells capturing 5% (or 22 GW) of the total market of 390 GW of new capacity additions through 2020. Hydrogen consumption in the utility sector increases from 0.009 Quads in 2005 to 0.4 Quads in 2020. Hydrogen fuel cells are projected to displace over 0.6 Quads of primary energy in 2020. In future work, NREL will assess the market for decentralized, on-site generation; develop cost credits for distributed generation benefits (such as deferral of transmission and distribution investments and uninterruptible power service), for by-products such as heat and potable water, and for environmental benefits (reduction of criteria air pollutants and greenhouse gas emissions); compete different fuel cell technologies against each other for market share; and begin to address economic benefits, especially employment.

  1. Threat evaluation for impact assessment in situation analysis systems

    Science.gov (United States)

    Roy, Jean; Paradis, Stephane; Allouche, Mohamad

    2002-07-01

    Situation analysis is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of situation awareness, for the decision maker. Data fusion is a key enabler to meeting the demanding requirements of military situation analysis support systems. According to the data fusion model maintained by the Joint Directors of Laboratories' Data Fusion Group, impact assessment estimates the effects on situations of planned or estimated/predicted actions by the participants, including interactions between action plans of multiple players. In this framework, the appraisal of actual or potential threats is a necessary capability for impact assessment. This paper reviews and discusses in detail the fundamental concepts of threat analysis. In particular, threat analysis generally attempts to compute some threat value, for the individual tracks, that estimates the degree of severity with which engagement events will potentially occur. Presenting relevant tracks to the decision maker in some threat list, sorted from the most threatening to the least, is clearly in-line with the cognitive demands associated with threat evaluation. A key parameter in many threat value evaluation techniques is the Closest Point of Approach (CPA). Along this line of thought, threatening tracks are often prioritized based upon which ones will reach their CPA first. Hence, the Time-to-CPA (TCPA), i.e., the time it will take for a track to reach its CPA, is also a key factor. Unfortunately, a typical assumption for the computation of the CPA/TCPA parameters is that the track velocity will remain constant. When a track is maneuvering, the CPA/TCPA values will change accordingly. These changes will in turn impact the threat value computations and, ultimately, the resulting threat list. This is clearly undesirable from a command decision-making perspective. In this regard, the paper briefly discusses threat value stabilization.
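
    A minimal sketch of the CPA/TCPA computation discussed above, under the same constant relative velocity assumption the paper identifies as problematic for maneuvering tracks; the positions, velocities and units are invented for illustration.

        # Constant-velocity CPA/TCPA sketch for one track relative to ownship.
        # Positions in km, velocities in km/min; numbers are illustrative only.
        import numpy as np

        def cpa_tcpa(rel_pos, rel_vel):
            """Closest Point of Approach distance and time, assuming constant velocity."""
            rel_pos = np.asarray(rel_pos, float)
            rel_vel = np.asarray(rel_vel, float)
            v2 = rel_vel @ rel_vel
            if v2 == 0.0:                                # track not moving relative to us
                return np.linalg.norm(rel_pos), 0.0
            tcpa = max(0.0, -(rel_pos @ rel_vel) / v2)   # time until closest approach
            cpa = np.linalg.norm(rel_pos + rel_vel * tcpa)
            return cpa, tcpa

        cpa, tcpa = cpa_tcpa(rel_pos=[40.0, -10.0], rel_vel=[-2.0, 0.2])
        print(f"CPA = {cpa:.1f} km in TCPA = {tcpa:.1f} min")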

  2. A Big Data Analysis Approach for Rail Failure Risk Assessment.

    Science.gov (United States)

    Jamshidi, Ali; Faghih-Roohi, Shahrzad; Hajizadeh, Siamak; Núñez, Alfredo; Babuska, Robert; Dollevoet, Rolf; Li, Zili; De Schutter, Bart

    2017-08-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could have a considerable impact not only on train delays and maintenance costs, but also on the safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats that are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use these measurements to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  3. Utility analysis and calibration of QOL assessment in disease management.

    Science.gov (United States)

    Liu, Mo

    2018-05-02

    In clinical trials, the assessment of health-related quality of life (QOL) (or a patient-reported outcome [PRO] measure) has become very popular, especially for clinical studies conducted to evaluate the clinical benefits for patients with chronic, severe, and/or life-threatening diseases. Health-related QOL information and PRO measures are useful in disease management for achieving best clinical practice. In this article, we focus on health-related QOL assessment. The concept, design, and analysis of health-related QOL in clinical trials are reviewed. Validation of the use of a health-related QOL instrument in terms of key performance characteristics such as accuracy, reliability, sensitivity, and responsiveness for assuring the quality, integrity, and validity of collected QOL data is discussed. The concept of utility analysis and calibration (e.g., with respect to life events) for achieving the optimization of disease management is proposed. The change in QOL can be translated into different life events for effective disease management. These translations can help evaluate the treatment effect by displaying the change in QOL more directly.

  4. Multiple criteria decision analysis for health technology assessment.

    Science.gov (United States)

    Thokala, Praveen; Duenas, Alejandra

    2012-12-01

    Multicriteria decision analysis (MCDA) has been suggested by some researchers as a method to capture the benefits beyond quality adjusted life-years in a transparent and consistent manner. The objectives of this article were to analyze the possible application of MCDA approaches in health technology assessment and to describe their relative advantages and disadvantages. This article begins with an introduction to the most common types of MCDA models and a critical review of state-of-the-art methods for incorporating multiple criteria in health technology assessment. An overview of MCDA is provided and is compared against the current UK National Institute for Health and Clinical Excellence health technology appraisal process. A generic MCDA modeling approach is described, and the different MCDA modeling approaches are applied to a hypothetical case study. A comparison of the different MCDA approaches is provided, and the generic issues that need consideration before the application of MCDA in health technology assessment are examined. There are general practical issues that might arise from using an MCDA approach, and it is suggested that appropriate care be taken to ensure the success of MCDA techniques in the appraisal process. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
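
    As a hedged illustration of one of the MCDA model families reviewed above (a simple additive, weighted-sum model), the sketch below scores three hypothetical technologies against three hypothetical criteria; the criteria, weights and scores are assumptions, and this is not the appraisal process of any particular agency.

        # Generic additive MCDA sketch: normalise criterion scores, apply weights,
        # rank alternatives. Criteria, weights and scores are hypothetical.
        import numpy as np

        alternatives = ["Technology A", "Technology B", "Technology C"]
        criteria = ["health gain", "cost (lower=better)", "equity impact"]
        weights = np.array([0.5, 0.3, 0.2])                  # must sum to 1

        raw = np.array([[0.70, 40000, 0.6],
                        [0.55, 25000, 0.8],
                        [0.62, 30000, 0.5]], dtype=float)

        benefit = np.array([True, False, True])              # is a higher raw score better?
        lo, hi = raw.min(axis=0), raw.max(axis=0)
        norm = np.where(benefit, (raw - lo) / (hi - lo), (hi - raw) / (hi - lo))

        scores = norm @ weights                               # weighted-sum value of each alternative
        for name, s in sorted(zip(alternatives, scores), key=lambda x: -x[1]):
            print(f"{name}: {s:.3f}")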

  5. Multi-dimensional flood vulnerability assessment using data envelopment analysis

    Science.gov (United States)

    Zahid, Zalina; Saharizan, Nurul Syuhada; Hamzah, Paezah; Hussin, Siti Aida Sheikh; Khairi, Siti Shaliza Mohd

    2017-11-01

    Malaysia has been greatly impacted by floods during monsoon seasons. Even though flood-prone areas are well identified, assessment of the vulnerability to this disaster is lacking. Assessment of flood vulnerability, defined as the potential for loss when a disaster occurs, is addressed in this paper. The focus is on the development of a flood vulnerability measurement for 11 states in Peninsular Malaysia using the non-parametric approach of Data Envelopment Analysis. Scores for three dimensions of flood vulnerability (Population Vulnerability, Social Vulnerability and Biophysical Vulnerability) were calculated using secondary data on selected input and output variables across an 11-year period from 2004 to 2014. The results showed that Johor and Pahang were the most vulnerable to flood in terms of Population Vulnerability, followed by Kelantan, the most vulnerable in terms of Social Vulnerability, while Kedah, Pahang and Terengganu were the most vulnerable in terms of Biophysical Vulnerability among the eleven states. The results also showed the states of Johor, Pahang and Kelantan to be the most vulnerable across the three dimensions. Flood vulnerability assessment is important as it provides invaluable information that will allow the authorities to identify and develop plans for flood mitigation and to reduce flood vulnerability in the affected regions.
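
    A minimal sketch of the DEA building block used in studies like this one: an input-oriented CCR envelopment model solved as a linear program with scipy. The data, and the input-oriented single-output formulation itself, are illustrative assumptions rather than the exact multi-dimensional specification of the paper.

        # Minimal input-oriented CCR DEA sketch (envelopment form) solved with linprog.
        # X: inputs (m x n DMUs), Y: outputs (s x n). Data are hypothetical regions.
        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[3.0, 5.0, 4.0, 6.0],      # e.g. exposure-related input 1
                      [2.0, 4.0, 3.0, 5.0]])     # e.g. exposure-related input 2
        Y = np.array([[10.0, 12.0, 11.0, 9.0]])  # e.g. a single loss/output indicator
        m, n = X.shape
        s = Y.shape[0]

        def efficiency(o):
            """CCR efficiency score (theta) of DMU o; decision vars are [theta, lambdas]."""
            c = np.r_[1.0, np.zeros(n)]                       # minimise theta
            A_in = np.hstack([-X[:, [o]], X])                 # sum(lambda*x) <= theta*x_o
            A_out = np.hstack([np.zeros((s, 1)), -Y])         # sum(lambda*y) >= y_o
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.r_[np.zeros(m), -Y[:, o]]
            bounds = [(None, None)] + [(0, None)] * n         # theta free, lambdas >= 0
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
            return res.x[0]

        for o in range(n):
            print(f"DMU {o}: efficiency = {efficiency(o):.3f}")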

  6. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, intended to formalize the use of ''engineering judgment'' or ''expert opinion.'' All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however

  7. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2016-06-01

    Technical analysis (TA) and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.

  8. Analysis of complete logical structures in system reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.; Clarotti, C.A.; Contini, S.; Spizzichino, F.

    1980-01-01

    The application field of the fault-tree techniques has been explored in order to assess whether the AND-OR structures covered all possible actual binary systems. This resulted in the identification of various situations requiring the complete AND-OR-NOT structures for their analysis. We do not use the term non-coherent for such cases, since the monotonicity or not of a structure function is not a characteristic of a system, but of the particular top event being examined. The report presents different examples of complete fault-trees, which can be examined according to different degrees of approximation. In fact, the exact analysis for the determination of the smallest irredundant bases is very time consuming and actually necessary only in some particular cases (multi-state systems, incidental situations). Therefore, together with the exact procedure, the report shows two different methods of logical analysis that permit the reduction of complete fault-trees to AND-OR structures. Moreover, it discusses the problems concerning the evaluation of the probability distribution of the time to first top event occurrence, once the hypothesis of structure function monotonicity is removed
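
    For small trees, the exact top-event probability of a complete AND-OR-NOT structure can be obtained by brute-force enumeration of basic-event states, as in the illustrative (invented) non-monotone example below; this is a didactic sketch, not the reduction procedures proposed in the report.

        # Exact top-event probability of a small AND-OR-NOT structure function by
        # enumerating all basic-event states. Example tree and probabilities are invented.
        from itertools import product

        # Top occurs if (A AND B) OR (C AND NOT A): non-monotone in A.
        def top(a, b, c):
            return (a and b) or (c and not a)

        p = {"A": 0.05, "B": 0.10, "C": 0.02}   # basic event probabilities

        prob_top = 0.0
        for a, b, c in product([0, 1], repeat=3):
            state_prob = 1.0
            for name, x in zip("ABC", (a, b, c)):
                state_prob *= p[name] if x else (1.0 - p[name])
            if top(a, b, c):
                prob_top += state_prob

        print(f"P(top event) = {prob_top:.5f}")   # exact, no rare-event approximation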

  9. Using miscue analysis to assess comprehension in deaf college readers.

    Science.gov (United States)

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading difficulties and for planning instruction. To our knowledge, miscue analysis has not been used with older, college-age deaf students who might also be having difficulty decoding and understanding text at the word level. The goal of this study was to determine whether such an analysis would be helpful in identifying the source of college students' reading comprehension difficulties. After analyzing the miscues of 10 college-age readers and the results of other comprehension-related tasks, we concluded that comprehension of basic grade school-level passages depended on the ability to recognize and comprehend key words and phrases in these texts. We also concluded that these diagnostic procedures provided useful information about the reading abilities and strategies of each reader that had implications for designing more effective interventions.

  10. A comparison of integrated safety analysis and probabilistic risk assessment

    International Nuclear Information System (INIS)

    Damon, Dennis R.; Mattern, Kevin S.

    2013-01-01

    The U.S. Nuclear Regulatory Commission conducted a comparison of two standard tools for risk-informing the regulatory process, namely, the Probabilistic Risk Assessment (PRA) and the Integrated Safety Analysis (ISA). PRA is a calculation of risk metrics, such as Large Early Release Frequency (LERF), and has been used to assess the safety of all commercial power reactors. ISA is an analysis required for fuel cycle facilities (FCFs) licensed to possess potentially critical quantities of special nuclear material. A PRA is usually more detailed and uses more refined models and data than an ISA, in order to obtain reasonable quantitative estimates of risk. PRA is considered fully quantitative, while most ISAs are typically only partially quantitative. The extension of PRA methodology to augment or supplant ISAs in FCFs has long been considered. However, fuel cycle facilities have a wide variety of possible accident consequences, rather than a few surrogates like LERF or core damage as used for reactors. It has been noted that a fuel cycle PRA could be used to better focus attention on the most risk-significant structures, systems, components, and operator actions. ISA and PRA both identify accident sequences; however, their treatment is quite different. ISAs identify accidents that lead to high or intermediate consequences, as defined in 10 Code of Federal Regulations (CFR) 70, and develop a set of Items Relied on For Safety (IROFS) to assure adherence to performance criteria. PRAs identify potential accident scenarios and estimate their frequency and consequences to obtain risk metrics. It is acceptable for ISAs to provide bounding evaluations of accident consequences and likelihoods in order to establish acceptable safety; but PRA applications usually require a reasonable quantitative estimate, and often obtain metrics of uncertainty. This paper provides the background, features, and methodology associated with the PRA and ISA. The differences between the

  11. Healthcare efficiency assessment using DEA analysis in the Slovak Republic.

    Science.gov (United States)

    Stefko, Robert; Gavurova, Beata; Kocisova, Kristina

    2018-03-09

    Regional disparity is becoming an increasingly important growth constraint. Policy makers need quantitative knowledge to design effective and targeted policies. In this paper, the regional efficiency of healthcare facilities in Slovakia is measured (2008-2015) using data envelopment analysis (DEA). DEA is the dominant approach to assessing the efficiency of the healthcare system, as well as of other economic areas. In this study, the window approach is introduced as an extension of the basic DEA models to evaluate healthcare technical efficiency in individual regions and to quantify the basic regional disparities and discrepancies. The window DEA method was chosen since it leads to increased discrimination in the results, especially when applied to small samples, and it enables year-by-year comparisons of the results. Two stable inputs (number of beds, number of medical staff), three variable inputs (number of all medical equipment, number of magnetic resonance (MR) devices, number of computed tomography (CT) devices) and two stable outputs (use of beds, average nursing time) were chosen as production variables in an output-oriented 4-year window DEA model for the assessment of technical efficiency in 8 regions. The database was made available from the National Health Information Center and the Slovak Statistical Office, as well as from the online databases Slovstat and DataCube. The aim of the paper is to quantify the impact of non-standard Data Envelopment Analysis (DEA) variables, such as the use of medical technologies (MR, CT), on the results of the assessment of the efficiency of healthcare facilities and their adequacy in the evaluation of the monitored processes. The results of the analysis have shown that there is an indirect dependence between the values of the variables over time and the results of the estimated efficiency in all regions. The regions that had low values of the variables over time achieved a high degree of efficiency and vice versa. Interesting

  12. Cytogenetic analysis for radiation dose assessment. A manual

    International Nuclear Information System (INIS)

    2001-01-01

    Chromosome aberration analysis is recognized as a valuable dose assessment method which fills a gap in dosimetric technology, particularly when there are difficulties in interpreting the data, in cases where there is reason to believe that persons not wearing dosimeters have been exposed to radiation, in cases of claims for compensation for radiation injuries that are not supported by unequivocal dosimetric evidence, or in cases of exposure over an individual's working lifetime. The IAEA has maintained a long standing involvement in biological dosimetry commencing in 1978. This has been via a sequence of Co-ordinated Research Programmes (CRPs), the running of Regional Training Courses, the sponsorship of individual training fellowships and the provision of necessary equipment to laboratories in developing Member States. The CRP on the 'Use of Chromosome Aberration Analysis in Radiation Protection' was initiated by IAEA in 1982. It ended with the publication of the IAEA Technical Report Series No. 260, titled 'Biological Dosimetry: Chromosomal Aberration Analysis for Dose Assessment' in 1986. The overall objective of the CRP (1998-2000) on 'Radiation Dosimetry through Biological Indicators' is to review and standardize the available methods and amend the above mentioned previous IAEA publication with current techniques on cytogenetic bioindicators which may be of practical use in biological dosimetry worldwide. An additional objective is to identify promising cytogenetic techniques to provide Member States with up to date and generally agreed advice regarding the best focus for research and suggestions for the most suitable techniques for near future practice in biodosimetry. This activity is in accordance with the International Basic Safety Standards (BSS) published in 1996. To pursue this task the IAEA has conducted a Research Co-ordination Meeting (Budapest, Hungary, June 1998) with the participation of senior scientists of 24 biodosimetry laboratories to discuss

  13. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by Lunar Co. for assessment of body composition analysis by DXA.

  14. HANFORD SAFETY ANALYSIS and RISK ASSESSMENT HANDBOOK (SARAH)

    International Nuclear Information System (INIS)

    EVANS, C.B.

    2004-01-01

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management''. Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S and M), for decommissioning activities, where source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard

  15. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work; in particular, to understand the extent to which uncertainties propagate throughout the process, from flood inundation studies to risk analysis, and how much a proper flood risk analysis can vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation), and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, both in numerical and in cartographic expression. In order to consider the total uncertainty and understand which factors contribute most to the final uncertainty, we used the Polynomial Chaos Theory (PCT) method. It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process. PCT allows for the development of a probabilistic model of the system in a deterministic setting. This is done by using random variables and polynomials to handle the effects of uncertainty. Method application results have a better robustness than traditional analysis
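
    Of the propagation methods named above, Monte Carlo is the simplest to sketch. The toy example below propagates invented rainfall, hydrologic and damage-model uncertainties to a damage estimate; the distributions and the damage relation are assumptions for illustration, and PCT itself is not reproduced.

        # Toy Monte Carlo uncertainty propagation: uncertain rainfall, runoff and
        # damage-model parameters propagated to a flood-damage estimate.
        # All distributions and the damage relation are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        rain = rng.gamma(shape=6.0, scale=15.0, size=n)        # design rainfall [mm]
        runoff_coef = rng.uniform(0.4, 0.7, size=n)            # hydrologic uncertainty
        damage_per_mm = rng.normal(2.0e4, 4.0e3, size=n)       # euro per mm effective depth

        effective_depth = runoff_coef * rain                   # crude effective depth [mm]
        damage = np.clip(damage_per_mm, 0, None) * effective_depth

        print(f"mean damage  : {damage.mean():.3e} EUR")
        print(f"5%-95% range : {np.percentile(damage, 5):.3e} - {np.percentile(damage, 95):.3e} EUR")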

  16. Hybrid vehicle assessment. Phase 1: Petroleum savings analysis

    Science.gov (United States)

    Levin, R.; Liddle, S.; Deshpande, G.; Trummel, M.; Vivian, H. C.

    1984-01-01

    The results of a comprehensive analysis of near-term electric hybrid vehicles are presented, with emphasis on their potential to save significant amounts of petroleum on a national scale in the 1990s. Performance requirements and expected annual usage patterns of these vehicles are first modeled. The projected U.S. fleet composition is estimated, and conceptual hybrid vehicle designs are conceived and analyzed for petroleum use when driven in the expected annual patterns. These petroleum consumption estimates are then compared to similar estimates for projected 1990 conventional vehicles having the same performance and driven in the same patterns. Results are presented in the form of three utility functions and comparisons of several conceptual designs are made. The Hybrid Vehicle (HV) design and assessment techniques are discussed and a general method is explained for selecting the optimum energy management strategy for any vehicle-mission-battery combination. Conclusions and recommendations are presented, and development recommendations are identified.

  17. Time-dependent reliability analysis and condition assessment of structures

    International Nuclear Information System (INIS)

    Ellingwood, B.R.

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process

  18. Confirmatory Factor Analysis of the Procrastination Assessment Scale for Students

    Directory of Open Access Journals (Sweden)

    Ronald D. Yockey

    2015-10-01

    The relative fit of one- and two-factor models of the Procrastination Assessment Scale for Students (PASS) was investigated using confirmatory factor analysis on an ethnically diverse sample of 345 participants. The results indicated that although the two-factor model provided better fit to the data than the one-factor model, neither model provided optimal fit. However, a two-factor model which accounted for common item theme pairs used by Solomon and Rothblum in the creation of the scale provided good fit to the data. In addition, a significant difference by ethnicity was also found on the fear of failure subscale of the PASS, with Whites having significantly lower scores than Asian Americans or Latino/as. Implications of the results are discussed and recommendations made for future work with the scale.

  19. Assessing green waste route by using Network Analysis

    Science.gov (United States)

    Hasmantika, I. H.; Maryono, M.

    2018-02-01

    Green waste, such as waste from parks, needs proper treatment. One of the main problems of green waste management is how to design optimum collection. This research aims to determine the optimum green waste collection by determining the optimum route among parks. The optimum route was assessed using the network analysis method, and region five of Semarang city, comprising 20 parks, was chosen as the case study. To enhance green waste recycling, three treatment scenarios are proposed. Scenario 1 uses one integrated treatment facility as a terminal for enhancing green waste recycling, Scenario 2 uses two sites, and Scenario 3 uses three sites. According to the assessment, the route length for Scenario 1 is 36.126 km and the estimated collection time is 46 minutes. In Scenario 2, the route length is 36.471 km with a travel time of 47 minutes. The route length for Scenario 3 is 46.934 km and the collection time is 60 minutes.
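
    The study relies on a GIS network analysis; as a stand-in illustration of route construction, the sketch below builds a greedy nearest-neighbour collection tour over an invented road-distance matrix for a depot and four parks. The matrix, node names and tour heuristic are assumptions, not the authors' workflow.

        # Greedy nearest-neighbour collection tour over a road-distance matrix.
        # This is a simple stand-in for the GIS network analysis used in the paper;
        # the 5-node distance matrix (km) below is invented.
        import numpy as np

        dist = np.array([[0.0, 2.1, 4.0, 3.3, 5.2],
                         [2.1, 0.0, 2.8, 4.1, 3.7],
                         [4.0, 2.8, 0.0, 1.9, 2.5],
                         [3.3, 4.1, 1.9, 0.0, 2.2],
                         [5.2, 3.7, 2.5, 2.2, 0.0]])
        names = ["depot", "park1", "park2", "park3", "park4"]

        def nearest_neighbour_tour(dist, start=0):
            unvisited = set(range(len(dist))) - {start}
            tour, here, total = [start], start, 0.0
            while unvisited:
                nxt = min(unvisited, key=lambda j: dist[here, j])   # closest unvisited park
                total += dist[here, nxt]
                tour.append(nxt)
                here = nxt
                unvisited.remove(nxt)
            total += dist[here, start]            # return to the depot/treatment facility
            tour.append(start)
            return tour, total

        tour, length = nearest_neighbour_tour(dist)
        print(" -> ".join(names[i] for i in tour), f"({length:.1f} km)")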

  20. Efficiency Assessment of Inbound Tourist Service Using Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Han-Shen Chen

    2018-06-01

    The successful and sustainable development of inbound tourism necessitates a long-term commitment, balancing tourism supply and tourist demand. This study conducted a performance appraisal of tourism service quality in Taiwan with Data Envelopment Analysis (DEA), employing input and output constructs to assess service quality efficiency. The empirical results of the estimation of technical efficiency (TE) revealed that (1) the domestic tourism market is competitive but tourism services still need enhancement; (2) Mainland Chinese tourists had the highest score among all foreign tourists, followed by Hong Kong and Macau tourists, tourists from other countries, Japanese tourists, and South Korean tourists; and (3) South Korean tourists had higher travel expenditure than others but felt less satisfaction with travel services, which can be regarded as inefficient. Tourists from other countries had lower travel expenditure but higher satisfaction levels, which was considered efficient based on the input and output index. The findings could contribute to bridging the gap between research and practice in assessing the efficiency of inbound tourist services. Tourism practitioners should be aware of tourists' needs and interests, as these could be key fundamentals for improving tourists' satisfaction with Taiwan's service offerings.

  1. Cyber threat impact assessment and analysis for space vehicle architectures

    Science.gov (United States)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner that causes physical damage to the space platforms. Similar attacks on Earth-borne cyber physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits. These have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools such as STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment. The paper walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.
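
    The abstract mentions custom Python/SimPy models; the fragment below is a minimal, self-contained SimPy sketch of a telemetry process that degrades after an injected attack event. It is a toy with invented timings and labels, not the authors' bus-level model.

        # Minimal SimPy sketch: a bus-level telemetry process that degrades after an
        # injected attack event. Illustrative toy only, not the authors' model.
        import simpy

        def telemetry(env, log, attack_time):
            healthy = True
            while True:
                log.append((env.now, "nominal" if healthy else "corrupted"))
                yield env.timeout(10)                 # telemetry frame every 10 time units
                if env.now >= attack_time:
                    healthy = False                   # attack corrupts attitude telemetry

        log = []
        env = simpy.Environment()
        env.process(telemetry(env, log, attack_time=35))   # inject attack vector at t = 35
        env.run(until=80)
        print(log)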

  2. Analysis of environmental impact assessment (EIA) system in Turkey.

    Science.gov (United States)

    Coşkun, Aynur Aydın; Turker, Ozhan

    2011-04-01

    The Environmental Impact Assessment (EIA) system, which embodies the "prevention principle" of environmental law, is an important tool for environmental protection. This tool has particular importance for Turkey since it is a developing country, and it entered Turkish law in 1983 with the Environmental Law. The EIA Regulation, which sets out the principles of application, became effective in 1993. Because Turkey is a candidate for the European Union (EU), the EIA Regulation has been changed as part of the EU compliance procedure, and its latest version became valid in 2008. This study examines the EIA system in Turkey in order to assess the efficiency of the procedure and indicate its level of success. In the introduction, the general EIA concept, its importance, and some key notions are presented. Following that, the legislation that constitutes the EIA system is analyzed, starting from the 1982 Turkish Constitution. Then, the rules of the legislation are explained following the basic steps of the EIA procedure. In order to shed light upon its application, the final EIA decisions issued to date, their results, and their distribution across industries are assessed. In the final part of the study, a SWOT analysis is made to describe the weaknesses, strengths, opportunities, and threats of the EIA system in Turkey.

  3. Integrating multicriteria evaluation and stakeholders analysis for assessing hydropower projects

    International Nuclear Information System (INIS)

    Rosso, M.; Bottero, M.; Pomarico, S.; La Ferlita, S.; Comino, E.

    2014-01-01

    The use of hydroelectric potential and the protection of the river ecosystem are two contrasting aspects that arise in the management of the same resource, generating conflicts between different stakeholders. The purpose of the paper is to develop a multi-level decision-making tool, able to support energy planning, with specific reference to the construction of hydropower plants in mountain areas. Starting from a real-world problem concerning the basin of the Sesia Valley (Italy), an evaluation framework based on the combined use of Multicriteria Evaluation and Stakeholders Analysis is proposed in the study. The results of the work show that the methodology is able to support participatory decisions through a traceable and transparent multi-stakeholder assessment process, to highlight the important elements of the decision problem and to support the definition of future design guidelines. - Highlights: • The paper concerns a multi-level decision-making tool able to support energy planning. • The evaluation framework is based on the use of AHP and Stakeholders Analysis. • Hydropower projects in the Sesia Valley (Italy) are evaluated and ranked in the study. • Environmental, economic, technical and sociopolitical criteria have been considered. • 42 stakeholder groups have been included in the evaluation

  4. Lake Michigan Wind Assessment Analysis, 2012 and 2013

    Directory of Open Access Journals (Sweden)

    Charles R Standridge

    2017-03-01

    A study was conducted to address the wind energy potential over Lake Michigan to support a commercial wind farm. Lake Michigan is an inland sea in the upper mid-western United States. A laser wind sensor mounted on a floating platform was located at the mid-lake plateau in 2012 and about 10.5 kilometers from the eastern shoreline near Muskegon, Michigan, in 2013. Range gate heights for the laser wind sensor were centered at 75, 90, 105, 125, 150, and 175 meters. Wind speed and direction were measured once each second and aggregated into 10-minute averages. The two-sample t-test and the paired-t method were used to perform the analysis. Average wind speed stopped increasing between 105 m and 150 m, depending on location. Thus, the collected data are inconsistent with the idea that average wind speed increases with height. This result implies that measuring wind speed at wind turbine hub height is essential, as opposed to using the wind energy power law to project the wind speed from lower heights. Average speed at the mid-lake plateau is no more than 10% greater than at the location near Muskegon. Thus, it may be possible to harvest much of the available wind energy at a lower height and closer to the shoreline than previously thought. At both locations, the predominant wind direction is from the south-southwest. The ability of the laser wind sensor to measure wind speed appears to be affected by a lack of particulate matter at greater heights. Keywords: wind assessment, Lake Michigan, LIDAR wind sensor, statistical analysis.
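
    Two ingredients named above, the wind power law and the two-sample t-test, can be sketched as follows with synthetic 10-minute mean speeds (the buoy data are not reproduced); the 1/7 shear exponent is a conventional assumption, not a measured value from the study.

        # Sketch of the two ingredients named in the abstract: the power-law vertical
        # extrapolation of wind speed and a two-sample (Welch) t-test between heights.
        # The 10-minute average speeds below are synthetic, not the measured data.
        import numpy as np
        from scipy import stats

        def power_law(u_ref, z_ref, z, alpha=1.0 / 7.0):
            """Extrapolate mean wind speed from z_ref to z with the wind power law."""
            return u_ref * (z / z_ref) ** alpha

        print(f"7.5 m/s at 105 m -> {power_law(7.5, 105, 150):.2f} m/s predicted at 150 m")

        rng = np.random.default_rng(0)
        u_105 = rng.normal(7.5, 1.2, size=500)     # synthetic 10-min means at 105 m
        u_150 = rng.normal(7.55, 1.2, size=500)    # synthetic 10-min means at 150 m

        t_stat, p_val = stats.ttest_ind(u_105, u_150, equal_var=False)
        print(f"t = {t_stat:.2f}, p = {p_val:.3f}  (is there a significant increase with height?)")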

  5. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on a count of inhabitants or on inhabitant density applied over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units where rural occupation predominates. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are far from ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and following three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on the application of the traditional approach or the dasymetric
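
    A minimal sketch of the areal-weighting step of dasymetric mapping described above: a census unit's population is redistributed to its buildings in proportion to footprint area. The numbers are invented; weighting by built area times the number of floors would follow the same pattern.

        # Dasymetric redistribution sketch: a census unit's population is split over its
        # buildings in proportion to building footprint area. Numbers are invented.
        import numpy as np

        census_population = 1200                      # inhabitants of one census unit (BGRI)
        building_area = np.array([250.0, 400.0, 150.0, 600.0])   # footprint areas [m2]

        weights = building_area / building_area.sum()
        building_population = census_population * weights

        for i, pop in enumerate(building_population):
            print(f"building {i}: ~{pop:.0f} inhabitants")
        print("check total:", building_population.sum())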

  6. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric
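
    As a hedged illustration of a rising- versus falling-limb comparison, the sketch below computes a simple hysteresis indicator for one synthetic storm event; both the index form and the data are illustrative assumptions, not necessarily the specific metrics used in the study.

        # Simple illustrative hysteresis indicator for one storm event: compare mean
        # suspended sediment concentration (SSC) on the rising vs falling limb of the
        # hydrograph. Positive values suggest clockwise (sediment-leading) hysteresis.
        # The synthetic event below and the index form are illustrative only.
        import numpy as np

        q = np.array([1.0, 2.0, 4.0, 6.5, 8.0, 7.0, 5.0, 3.0, 1.5])        # discharge [m3/s]
        ssc = np.array([20., 60., 140., 180., 150., 90., 55., 35., 25.])   # SSC [mg/L]

        peak = int(np.argmax(q))
        rising_mean = ssc[: peak + 1].mean()
        falling_mean = ssc[peak:].mean()
        hysteresis_index = rising_mean / falling_mean - 1.0

        print(f"rising-limb mean SSC : {rising_mean:.1f} mg/L")
        print(f"falling-limb mean SSC: {falling_mean:.1f} mg/L")
        print(f"simple hysteresis index: {hysteresis_index:+.2f}")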

  7. Imminent Cardiac Risk Assessment via Optical Intravascular Biochemical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, D.; Wetzel, L; Wetzel, M; Lodder, R

    2009-01-01

    Heart disease is by far the biggest killer in the United States, and type II diabetes, which affects 8% of the U.S. population, is on the rise. In many cases, the acute coronary syndrome and/or sudden cardiac death occurs without warning. Atherosclerosis has known behavioral, genetic and dietary risk factors. However, our laboratory studies with animal models and human post-mortem tissue using FT-IR microspectroscopy reveal the chemical microstructure within arteries and in the arterial walls themselves. These include spectra obtained from the aortas of ApoE-/- knockout mice on sucrose and normal diets showing lipid deposition in the former case. Also pre-aneurysm chemical images of knockout mouse aorta walls, and spectra of plaque excised from a living human patient, are shown for comparison. In keeping with the theme of the SPEC 2008 conference, Spectroscopic Diagnosis of Disease, this paper describes the background and potential value of a new catheter-based system to provide in vivo biochemical analysis of plaque in human coronary arteries. We report the following: (1) results of FT-IR microspectroscopy on animal models of vascular disease to illustrate the localized chemical distinctions between pathological and normal tissue, (2) current diagnostic techniques used for risk assessment of patients with potential unstable coronary syndromes, and (3) the advantages and limitations of each of these techniques, illustrated with patient care histories related in the first person by the physician coauthors. Note that the physician comments clarify the contribution of each diagnostic technique to imminent cardiac risk assessment in a clinical setting, leading to an appreciation of what localized intravascular chemical analysis can contribute as an add-on diagnostic tool. The quality of medical imaging has improved dramatically since the turn of the century. Among clinical non-invasive diagnostic tools, laboratory tests of body fluids, EKG, and physical examination are

  8. Modern psychometrics for assessing achievement goal orientation: a Rasch analysis.

    Science.gov (United States)

    Muis, Krista R; Winne, Philip H; Edwards, Ordene V

    2009-09-01

    A program of research is needed that assesses the psychometric properties of instruments designed to quantify students' achievement goal orientations to clarify inconsistencies across previous studies and to provide a stronger basis for future research. We conducted traditional psychometric and modern Rasch-model analyses of the Achievement Goals Questionnaire (AGQ, Elliot & McGregor, 2001) and the Patterns of Adaptive Learning Scale (PALS, Midgley et al., 2000) to provide an in-depth analysis of the two most popular instruments in educational psychology. For Study 1, 217 undergraduate students enrolled in educational psychology courses participated. Thirty-four were male and 181 were female (two did not respond). Participants completed the AGQ in the context of their educational psychology class. For Study 2, 126 undergraduate students enrolled in educational psychology courses participated. Thirty were male and 95 were female (one did not respond). Participants completed the PALS in the context of their educational psychology class. Traditional psychometric assessments of the AGQ and PALS replicated previous studies. For both, reliability estimates ranged from good to very good for raw subscale scores and fit for the models of goal orientations were good. Based on traditional psychometrics, the AGQ and PALS are valid and reliable indicators of achievement goals. Rasch analyses revealed that estimates of reliability for items were very good but respondent ability estimates varied from poor to good for both the AGQ and PALS. These findings indicate that items validly and reliably reflect a group's aggregate goal orientation, but using either instrument to characterize an individual's goal orientation is hazardous.

  9. Credibility assessment in child sexual abuse investigations: A descriptive analysis.

    Science.gov (United States)

    Melkman, Eran P; Hershkowitz, Irit; Zur, Ronit

    2017-05-01

    A major challenge in cases of child sexual abuse (CSA) is determining the credibility of children's reports. Consequently cases may be misclassified as false or deemed 'no judgment possible'. Based on a large national sample of reports of CSA made in Israel in 2014, the study examines child and event characteristics contributing to the probability that reports of abuse would be judged credible. National data files of all children aged 3-14, who were referred for investigation following suspected victimization of sexual abuse, and had disclosed sexual abuse, were analyzed. Cases were classified as either 'credible' or 'no judgment possible'. The probability of reaching a 'credible' judgment was examined in relation to characteristics of the child (age, gender, cognitive delay, marital status of the parents,) and of the abusive event (abuse severity, frequency, perpetrator-victim relationship, perpetrator's use of grooming, and perpetrator's use of coercion), controlling for investigator's identity at the cluster level of the analysis. Of 1563 cases analyzed, 57.9% were assessed as credible. The most powerful predictors of a credible judgment were older age and absence of a cognitive delay. Reports of children to married parents, who experienced a single abusive event that involved perpetrator's use of grooming, were also more likely to be judged as credible. Rates of credible judgments found are lower than expected suggesting under-identification of truthful reports of CSA. In particular, those cases of severe and multiple abuse involving younger and cognitively delayed children are the ones with the lowest chances of being assessed as credible. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Assessment of RANS CFD modelling for pressurised thermal shock analysis

    International Nuclear Information System (INIS)

    Sander M Willemsen; Ed MJ Komen; Sander Willemsen

    2005-01-01

    Full text of publication follows: The most severe Pressurised Thermal Shock (PTS) scenario is a cold water Emergency Core Coolant (ECC) injection into the cold leg during a LOCA. The injected ECC water mixes with the hot fluid present in the cold leg and flows towards the downcomer where further mixing takes place. When the cold mixture comes into contact with the Reactor Pressure Vessel (RPV) wall, it may lead to large temperature gradients and consequently to high stresses in the RPV wall. Knowledge of these thermal loads is important for RPV remnant life assessments. The existing thermal-hydraulic system codes currently applied for this purpose are based on one-dimensional approximations and can, therefore, not predict the complex three-dimensional flows occurring during ECC injection. Computational Fluid Dynamics (CFD) can be applied to predict these phenomena, with the ultimate benefit of improved remnant RPV life assessment. The present paper presents an assessment of various Reynolds Averaged Navier Stokes (RANS) CFD approaches for modeling the complex mixing phenomena occurring during ECC injection. This assessment has been performed by comparing the numerical results obtained using advanced turbulence models available in the CFX 5.6 CFD code in combination with a hybrid meshing strategy with experimental results of the Upper Plenum Test Facility (UPTF). The UPTF was a full-scale 'simulation' of the primary system of the four loop 1300 MWe Siemens/KWU Pressurised Water Reactor at Grafenrheinfeld. The test vessel upper plenum internals, downcomer and primary coolant piping were replicas of the reference plant, while other components, such as core, coolant pump and steam generators were replaced by simulators. From the extensive test programme, a single-phase fluid-fluid mixing experiment in the cold leg and downcomer was selected. Prediction of the mixing and stratification is assessed by comparison with the measured temperature profiles at several locations

  11. BURD, Bayesian estimation in data analysis of Probabilistic Safety Assessment

    International Nuclear Information System (INIS)

    Jang, Seung-cheol; Park, Jin-Kyun

    2008-01-01

    1 - Description of program or function: BURD (Bayesian Update for Reliability Data) is a simple code that can be used to obtain a Bayesian estimate easily in the data analysis of PSA (Probabilistic Safety Assessment). According to Bayes' theorem, the code basically facilitates calculation of the posterior distribution given the prior and the likelihood (evidence) distributions. The distinctive features of the program, BURD, are the following: - The input consists of the prior and likelihood functions that can be chosen from the built-in statistical distributions. - The available prior distributions are uniform, Jeffreys noninformative, beta, gamma, and log-normal, which are the most frequently used in performing PSA. - For the likelihood function, the user can choose from four statistical distributions, e.g., beta, gamma, binomial and Poisson. - A simultaneous graphic display of the prior and posterior distributions facilitates an intuitive interpretation of the results. - Export facilities for the graphic display screen and text-type outputs are available. - Three options for treating zero-evidence data are provided. - Automatic setup of an integral calculus section for a Bayesian updating. 2 - Methods: The posterior distribution is estimated in accordance with Bayes' theorem, given the prior and the likelihood (evidence) distributions. 3 - Restrictions on the complexity of the problem: The accuracy of the results depends on the calculational error of the statistical function library in MS Excel
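
    A minimal sketch of the kind of conjugate update BURD automates, here a gamma prior with Poisson evidence for a failure rate; the prior, the evidence and the use of scipy (in place of the code's MS Excel functions) are assumptions for illustration only.

        # Conjugate gamma-Poisson Bayesian update of a failure rate, one of the
        # prior/likelihood combinations BURD supports; the prior and evidence below
        # are hypothetical, and scipy stands in for the code's MS Excel functions.
        from scipy import stats

        # Prior: Gamma(shape=a0, rate=b0) on the failure rate lambda [per hour]
        a0, b0 = 1.5, 40000.0            # e.g. a generic-industry prior (invented)
        # Evidence: n observed failures over T component-hours
        n, T = 2, 150000.0

        a_post, b_post = a0 + n, b0 + T  # conjugate posterior is Gamma(a_post, rate=b_post)

        post_mean = a_post / b_post
        lo, hi = stats.gamma.ppf([0.05, 0.95], a_post, scale=1.0 / b_post)
        print(f"posterior mean lambda = {post_mean:.2e} /h")
        print(f"90% credible interval = [{lo:.2e}, {hi:.2e}] /h")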

  12. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach to model parameter and model uncertainties affecting the results of risk indicators used in the applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to represent uncertainty in PRA models, a new approach based on the Dempster-Shafer theory has been proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step aims to model input parameter uncertainties by belief and plausibility functions according to the data of the PRA model. The second step involves the propagation of parameter uncertainties through the risk model to determine the uncertainties associated with the output risk indicators. The model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that have significant uncertainty contributions to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified PRA model application. (author)
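
    As a toy illustration of the first two steps (belief/plausibility representation and propagation), the sketch below pushes Dempster-Shafer focal intervals through a hypothetical two-event risk model and reads off lower and upper probabilities that a risk threshold is exceeded. The intervals, masses, model and threshold are invented and do not come from the thesis.

```python
# Toy propagation of a Dempster-Shafer structure (focal intervals with masses)
# through a simple, monotone risk model; illustrative only.
import itertools

# Focal elements for two uncertain basic-event probabilities: (low, high, mass)
p1 = [(1e-4, 5e-4, 0.6), (5e-4, 2e-3, 0.4)]
p2 = [(1e-3, 3e-3, 0.7), (3e-3, 1e-2, 0.3)]

def top_event(a, b):
    # hypothetical risk model: two events in an OR gate (rare-event approximation)
    return a + b

threshold = 4e-3
belief = plausibility = 0.0
for (lo1, hi1, m1), (lo2, hi2, m2) in itertools.product(p1, p2):
    m = m1 * m2
    y_lo, y_hi = top_event(lo1, lo2), top_event(hi1, hi2)  # model is monotone
    if y_lo > threshold:      # whole focal element exceeds the threshold
        belief += m
    if y_hi > threshold:      # focal element can exceed the threshold
        plausibility += m

print(f"Bel(risk > {threshold:g}) = {belief:.2f}")
print(f"Pl(risk > {threshold:g})  = {plausibility:.2f}")
```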

  13. Hybrid vehicle assessment. Phase I. Petroleum savings analysis

    Energy Technology Data Exchange (ETDEWEB)

    Levin, R.; Liddle, S.; Deshpande, G.; Trummel, M.; Vivian, H.

    1984-03-01

    This report presents the results of a comprehensive analysis of near-term electric-hybrid vehicles. Its purpose was to estimate their potential to save significant amounts of petroleum on a national scale in the 1990s. Performance requirements and expected annual usage patterns of these vehicles were first modeled. The projected US fleet composition was estimated, and conceptual hybrid vehicle designs were conceived and analyzed for petroleum use when driven in the expected annual patterns. These petroleum consumption estimates were then compared to similar estimates for projected 1990 conventional vehicles having the same performance and driven in the same patterns. Results are presented in the form of three utility functions and comparisons of several conceptual designs are made. The Hybrid Vehicle (HV) design and assessment techniques are discussed and a general method is explained for selecting the optimum energy management strategy for any vehicle-mission-battery combination. A discussion of lessons learned during the construction and test of the General Electric Hybrid Test Vehicle is also presented. Conclusions and recommendations are presented, and development recommendations are identified.

  14. The radiological assessment system for consequence analysis - RASCAL

    International Nuclear Information System (INIS)

    Sjoreen, A.L.; Ramsdell, J.V.; Athey, G.F.

    1996-01-01

    The Radiological Assessment System for Consequence Analysis, Version 2.1 (RASCAL 2.1) has been developed for use during a response to radiological emergencies. The model estimates doses for comparison with U.S. Environmental Protection Agency (EPA) Protective Action Guides (PAGs) and thresholds for acute health effects. RASCAL was designed to be used by U.S. Nuclear Regulatory Commission (NRC) personnel who report to the site of a nuclear accident to conduct an independent evaluation of dose and consequence projections and personnel who conduct training and drills on emergency responses. It allows consideration of the dominant aspects of the source term, transport, dose, and consequences. RASCAL consists of three computational tools: ST-DOSE, FM-DOSE, and DECAY. ST-DOSE computes source term, atmospheric transport, and dose to man from accidental airborne releases of radionuclides. The source-term calculations are appropriate for accidents at U.S. power reactors. FM-DOSE computes doses from environmental concentrations of radionuclides in the air and on the ground. DECAY computes radiological decay and daughter in-growth. RASCAL 2.1 is a DOS application that can be run under Windows 3.1 and 95. RASCAL has been the starting point for other accident consequence models, notably INTERRAS, an international version of RASCAL, and HASCAL, an expansion of RASCAL that will model radiological, biological, and chemical accidents
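
    The daughter in-growth computed by the DECAY tool is, for a two-member chain, the classical Bateman solution. The sketch below is a minimal stand-in for that calculation, with an approximate parent/daughter pair and an assumed initial inventory rather than RASCAL's own nuclide library.

```python
# Two-member Bateman solution for parent decay with daughter in-growth.
# Nuclide choice (approximate half-lives) and initial atoms are illustrative.
import numpy as np

def bateman_two_member(n1_0, n2_0, lam1, lam2, t):
    """Atoms of parent (N1) and daughter (N2) remaining after time t."""
    n1 = n1_0 * np.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
          + n2_0 * np.exp(-lam2 * t))
    return n1, n2

# Example: Te-132 (half-life ~3.2 d) decaying to I-132 (half-life ~2.3 h)
lam_parent = np.log(2) / (3.2 * 24 * 3600)    # 1/s
lam_daughter = np.log(2) / (2.3 * 3600)       # 1/s

t = np.linspace(0, 5 * 24 * 3600, 6)          # 0 to 5 days
n1, n2 = bateman_two_member(1e15, 0.0, lam_parent, lam_daughter, t)
for ti, a1, a2 in zip(t / 3600, n1 * lam_parent, n2 * lam_daughter):
    print(f"t = {ti:6.1f} h  parent activity = {a1:.3e} Bq  daughter = {a2:.3e} Bq")
```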

  15. Life Cycle Assessment and Cost Analysis of Water and ...

    Science.gov (United States)

    changes in drinking and wastewater infrastructure need to incorporate a holistic view of the water service sustainability tradeoffs and potential benefits when considering shifts towards new treatment technology, decentralized systems, energy recovery and reuse of treated wastewater. The main goal of this study is to determine the influence of scale on the energy and cost performance of different transitional membrane bioreactors (MBR) in decentralized wastewater treatment (WWT) systems by performing a life cycle assessment (LCA) and cost analysis. LCA is a tool used to quantify sustainability-related metrics from a systems perspective. The study calculates the environmental and cost profiles of both aerobic MBRs (AeMBR) and anaerobic MBRs (AnMBR), which not only recover energy from waste, but also produce recycled water that can displace potable water for uses such as irrigation and toilet flushing. MBRs represent an intriguing technology to provide decentralized WWT services while maximizing resource recovery. A number of scenarios for these WWT technologies are investigated for different scale systems serving various population density and land area combinations to explore the ideal application potentials. MBR systems are examined from 0.05 million gallons per day (MGD) to 10 MGD and serve land use types from high density urban (100,000 people per square mile) to semi-rural single family (2,000 people per square mile). The LCA and cost model was built with ex

  16. Neutron activation analysis method - international ring test for proficiency assessment

    International Nuclear Information System (INIS)

    Barbos, D.; Bucsa, A. F.

    2016-01-01

    The main objective of this test is to assess the quality control of analytical procedures for soils and plants, which is of utmost importance to produce reliable and reproducible analytical data. For this purpose, first-, second-, and third-line quality control measures are taken in analytical laboratories. For first-line control, certified reference materials (CRMs) are preferred. However, the number and matrix variation in CRMs for environmental analytical research is still very limited. For second-line control, internal reference samples are often used, but again here the values for many element and parameter concentrations are questionable since almost no check against CRMs is possible. For third-line control, participation in laboratory-evaluating exchange programs is recommended. This article contains the results achieved by our neutron activation analysis laboratory after an irradiation experiment on soil and vegetation samples in the TRIGA reactor. All the samples were irradiated in the same location of the reactor under roughly similar conditions. (authors)

  17. Landslides geotechnical analysis. Qualitative assessment by valuation factors

    Science.gov (United States)

    Cuanalo Oscar, Sc D.; Oliva Aldo, Sc D.; Polanco Gabriel, M. E.

    2012-04-01

    In general, a landslide can cause a disaster when a number of factors are combined, such as an extreme event related to a geological phenomenon, vulnerable elements exposed in a specific geographic area, and the probability of loss and damage evaluated in terms of lives and economic assets, in a certain period of time. This paper presents the qualitative evaluation of slope stability through Valuation Factors, obtained from the characterization of the conditioning and triggering factors that influence instability: for the former, morphology and topography, geology, soil mechanics, hydrogeology and vegetation; for the latter, rain, earthquakes, erosion and scour, and human activity; and, ultimately, the dependent factors of the stability analysis and their influence ranges, which greatly facilitate the selection of the construction processes best suited to improve the behavior of a slope or hillside. The Valuation Factors are a set of parameters for assessing the influence of the conditioning and triggering factors on the stability of slopes and hillsides. The characteristics of each factor must be properly categorized to capture its effect on behavior; a way to do this is by assigning a weighted value range indicating its effect on the stability of a slope. It is proposed to use Valuation Factors with weighted values between 0 and 1 (arbitrarily selected, but guided by common sense and logic), where the former corresponds to no or minimal effect on stability and the latter to the greatest impact on it. Intermediate effects are evaluated with intermediate values.
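
    A minimal sketch of how such weighted Valuation Factors could be combined is shown below; the factor names, scores and equal weights are hypothetical and are not the calibration proposed by the authors.

```python
# Hypothetical combination of Valuation Factors: each conditioning or triggering
# factor is scored in [0, 1] (0 = negligible influence on instability,
# 1 = strong influence) and combined with assumed weights.
factors = {           # score in [0, 1]
    "morphology_topography": 0.6,
    "geology":               0.7,
    "soil_mechanics":        0.5,
    "hydrogeology":          0.4,
    "vegetation":            0.2,
    "rainfall":              0.8,
    "seismicity":            0.3,
    "erosion_scour":         0.4,
    "human_activity":        0.6,
}
weights = {name: 1.0 / len(factors) for name in factors}   # equal weights assumed

susceptibility = sum(weights[n] * s for n, s in factors.items())
print(f"qualitative susceptibility index = {susceptibility:.2f}")  # 0 (stable) .. 1 (critical)
```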

  18. Direct Survival Analysis: a new stock assessment method

    Directory of Open Access Journals (Sweden)

    Eduardo Ferrandis

    2007-03-01

    Full Text Available In this work, a new stock assessment method, Direct Survival Analysis, is proposed and described. The parameter estimation of the Weibull survival model proposed by Ferrandis (2007) is obtained using trawl survey data. This estimation is used to establish a baseline survival function, which is in turn used to estimate the specific survival functions in the different cohorts considered through an adaptation of the separable model of the fishing mortality rates introduced by Pope and Shepherd (1982). It is thus possible to test hypotheses on the evolution of survival during the period studied and to identify trends in recruitment. A link is established between the preceding analysis of trawl survey data and the commercial catch-at-age data that are generally obtained to evaluate the population using analytical models. The estimated baseline survival, with the proposed versions of the stock and catch equations and the adaptation of the Separable Model, may be applied to commercial catch-at-age data. This makes it possible to estimate the survival corresponding to the landing data, the initial size of the cohort and finally, an effective age of first capture, in order to complete the parameter model estimation and consequently the estimation of the whole survival and mortality, along with the reference parameters that are useful for management purposes. Alternatively, this estimation of an effective age of first capture may be obtained by adapting the demographic structure of trawl survey data to that of the commercial fleet through suitable selectivity models of the commercial gears. The complete model provides the evaluation of the stock at any age. The coherence (and hence the mutual "calibration") between the two kinds of information may be analysed and compared with results obtained by other methods, such as virtual population analysis (VPA), in order to improve the diagnosis of the state of exploitation of the population. The model may be
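
    The baseline Weibull survival function at the core of the method is straightforward to evaluate. The sketch below shows the survival and total mortality (hazard) curves for assumed shape and scale parameters; it is a stand-in, not an estimator fitted to trawl-survey data.

```python
# Weibull baseline survival S(t) and the corresponding total mortality rate.
# Parameter values are hypothetical placeholders.
import numpy as np

def weibull_survival(t, scale, shape):
    """S(t) = exp(-(t/scale)**shape): probability of surviving to age t."""
    return np.exp(-(t / scale) ** shape)

def weibull_hazard(t, scale, shape):
    """Instantaneous total mortality Z(t) = (shape/scale) * (t/scale)**(shape-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

scale, shape = 3.5, 1.2          # assumed baseline parameters (years, dimensionless)
for age in np.arange(0.5, 6.0, 0.5):
    print(f"age {age:3.1f} y  S = {weibull_survival(age, scale, shape):.3f}"
          f"  Z = {weibull_hazard(age, scale, shape):.3f} 1/y")
```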

  19. The impact of inter-organizational alignment (IOA) on implementation outcomes: evaluating unique and shared organizational influences in education sector mental health.

    Science.gov (United States)

    Lyon, Aaron R; Whitaker, Kelly; Locke, Jill; Cook, Clayton R; King, Kevin M; Duong, Mylien; Davis, Chayna; Weist, Mark D; Ehrhart, Mark G; Aarons, Gregory A

    2018-02-07

    Integrated healthcare delivered by work groups in nontraditional service settings is increasingly common, yet contemporary implementation frameworks typically assume a single organization-or organizational unit-within which system-level processes influence service quality and implementation success. Recent implementation frameworks predict that inter-organizational alignment (i.e., similarity in values, characteristics, activities related to implementation across organizations) may facilitate the implementation of evidence-based practices (EBP), but few studies have evaluated this premise. This study's aims examine the impact of overlapping organizational contexts by evaluating the implementation contexts of externally employed mental health clinicians working in schools-the most common integrated service delivery setting for children and adolescents. Aim 1 is to estimate the effects of unique intra-organizational implementation contexts and combined inter-organizational alignment on implementation outcomes. Aim 2 is to examine the underlying mechanisms through which inter-organizational alignment facilitates or hinders EBP implementation. This study will conduct sequential, exploratory mixed-methods research to evaluate the intra- and inter-organizational implementation contexts of schools and the external community-based organizations that most often employ school-based mental health clinicians, as they relate to mental health EBP implementation. Aim 1 will involve quantitative surveys with school-based, externally-employed mental health clinicians, their supervisors, and proximal school-employed staff (total n = 120 participants) to estimate the effects of each organization's general and implementation-specific organizational factors (e.g., climate, leadership) on implementation outcomes (fidelity, acceptability, appropriateness) and assess the moderating role of the degree of clinician embeddedness in the school setting. Aim 2 will explore the mechanisms

  20. Analysis of existing risk assessments, and list of suggestions

    CERN Document Server

    Heimsch, Laura

    2016-01-01

    The scope of this project was to analyse risk assessments made at CERN and to extract crucial information about the different methodologies used, the profiles of the people who make the risk assessments, and whether a risk matrix was used and an acceptable level of risk was defined. The second step of the project was to trigger discussion inside HSE about risk assessment by suggesting a risk matrix and a risk assessment template.

  1. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enables the development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  2. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Andersson, Magnus; Sattari-Far, Iradj; Weilin Zang (Inspecta Technology AB, Stockholm (Sweden))

    2009-06-15

    The main purpose of this work is to investigate the significance of the residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures such as the ASME XI code and the R6-method is believed to be conservative for ductile materials. This is because of the general approach not to account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses in general is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here. This is done by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied here by using the finite element method (FEM). (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe. (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe. (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed. A new

  3. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    International Nuclear Information System (INIS)

    Dillstroem, Peter; Andersson, Magnus; Sattari-Far, Iradj; Weilin Zang

    2009-06-01

    The main purpose of this work is to investigate the significance of the residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures such as the ASME XI code and the R6-method is believed to be conservative for ductile materials. This is because of the general approach not to account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses in general is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here. This is done by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied here by using the finite element method (FEM). (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe. (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe. (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed. A new

  4. Image quality assessment based on multiscale geometric analysis.

    Science.gov (United States)

    Gao, Xinbo; Lu, Wen; Tao, Dacheng; Li, Xuelong

    2009-07-01

    Reduced-reference (RR) image quality assessment (IQA) has been recognized as an effective and efficient way to predict the visual quality of distorted images. The current standard is the wavelet-domain natural image statistics model (WNISM), which applies the Kullback-Leibler divergence between the marginal distributions of wavelet coefficients of the reference and distorted images to measure the image distortion. However, WNISM fails to consider the statistical correlations of wavelet coefficients in different subbands and the visual response characteristics of the mammalian cortical simple cells. In addition, wavelet transforms are optimal greedy approximations to extract singularity structures, so they fail to explicitly extract the image geometric information, e.g., lines and curves. Finally, wavelet coefficients are dense for smooth image edge contours. In this paper, to target the aforementioned problems in IQA, we develop a novel framework for IQA to mimic the human visual system (HVS) by incorporating the merits from multiscale geometric analysis (MGA), contrast sensitivity function (CSF), and Weber's law of just-noticeable difference (JND). In the proposed framework, MGA is utilized to decompose images and then extract features to mimic the multichannel structure of HVS. Additionally, MGA offers a series of transforms including wavelet, curvelet, bandelet, contourlet, wavelet-based contourlet transform (WBCT), and hybrid wavelets and directional filter banks (HWD), and different transforms capture different types of image geometric information. CSF is applied to weight coefficients obtained by MGA to simulate the appearance of images to observers by taking into account many of the nonlinearities inherent in HVS. JND is finally introduced to produce a noticeable variation in sensory experience. Thorough empirical studies are carried out upon the LIVE database against subjective mean opinion score (MOS) and demonstrate that 1) the proposed framework has
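
    The reduced-reference comparison step common to WNISM and the proposed framework reduces to estimating a divergence between coefficient distributions. The sketch below estimates the Kullback-Leibler divergence from histograms of reference and distorted coefficients; the Laplacian samples merely stand in for the subband coefficients an MGA transform would produce.

```python
# Histogram-based Kullback-Leibler divergence between transform-coefficient
# distributions of a reference and a distorted image. Coefficients are
# synthetic stand-ins for illustration.
import numpy as np

def kl_divergence(p_samples, q_samples, bins=64, eps=1e-12):
    """Estimate D_KL(P || Q) from samples over a common binning."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi), density=True)
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
ref_coeffs = rng.laplace(0.0, 1.0, 50_000)    # reference subband coefficients
dist_coeffs = rng.laplace(0.0, 1.6, 50_000)   # distorted image: heavier spread

print(f"D_KL(reference || distorted) = {kl_divergence(ref_coeffs, dist_coeffs):.3f}")
```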

  5. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (laboratory or in situ surveys), improving the measurement methods or evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e

  6. Sampling and Analysis for Assessment of Body Burdens

    International Nuclear Information System (INIS)

    Harley, J.H.

    1964-01-01

    A review of sampling criteria and techniques and of sample processing methods for indirect assessment of body burdens is presented. The text is limited to the more recent developments in the field of bioassay and to the nuclides which cannot be readily determined in the body directly. A selected bibliography is included. The planning of a bioassay programme should emphasize the detection of high or unusual exposures and the concentrated study of these cases when detected. This procedure gives the maximum amount of data for the dosimetry of individuals at risk and also adds to our scientific background for an understanding of internal emitters. Only a minimum of effort should be spent on sampling individuals having had negligible exposure. The chemical separation procedures required for bioassay also fall into two categories. The first is the rapid method, possibly of low accuracy, used for detection. The second is the more accurate method required for study of the individual after detection of the exposure. Excretion, whether exponential or a power function, drops off rapidly. It is necessary to locate the exposure in time before any evaluation can be made, even before deciding if the exposure is significant. One approach is frequent sampling and analysis by a quick screening technique. More commonly, samples are collected at longer intervals and an arbitrary level of re-sampling is set to assist in the detection of real exposures. It is probable that too much bioassay effort has gone into measurements on individuals at low risk and not enough on those at higher risk. The development of bioassay procedures for overcoming this problem has begun, and this paper emphasizes this facet of sampling and sample processing. (author) [fr

  7. Assessment of academic departments efficiency using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Salah R. Agha

    2011-07-01

    Full Text Available Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores while public service activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights on the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient
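
    For readers unfamiliar with DEA, the envelopment form solved for each department is a small linear programme. The sketch below solves the input-oriented, variable-returns-to-scale (BCC) model with scipy for a handful of hypothetical departments; the input/output data are invented, not the IUG figures.

```python
# Input-oriented BCC (variable returns to scale) DEA via scipy.optimize.linprog.
# Data are hypothetical department inputs/outputs.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (departments); columns = inputs / outputs
X = np.array([[120.,  900.], [ 80.,  600.], [150., 1100.], [ 95.,  700.]])  # inputs
Y = np.array([[ 60.,    5.], [ 55.,    4.], [ 70.,    6.], [ 40.,    2.]])  # outputs
n, m = X.shape
_, s = Y.shape

def bcc_efficiency(o):
    # variables z = [theta, lambda_1 .. lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o], X.T]
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[b_in, b_out]
    # VRS convexity constraint: sum_j lambda_j = 1
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {bcc_efficiency(o):.3f}")
```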

  8. Risk Assessment of Healthcare Waste by Preliminary Hazard Analysis Method

    Directory of Open Access Journals (Sweden)

    Pouran Morovati

    2017-09-01

    Full Text Available Introduction and purpose: Improper management of healthcare waste (HCW) can pose considerable risks to human health and the environment and cause serious problems in developing countries such as Iran. In this study, we sought to determine the hazards of HCW in the public hospitals affiliated to Abadan School of Medicine using the preliminary hazard analysis (PHA) method. Methods: In this descriptive and analytic study, health risk assessment of HCW in government hospitals affiliated to Abadan School of Medicine (4 public hospitals) was carried out by using PHA in the summer of 2016. Results: We noted the high risk of sharps and infectious wastes. Considering the dual risk of injury and disease transmission, sharps were classified in the very high-risk group, and pharmaceutical, chemical and radioactive wastes were classified in the medium-risk group. Sharps posed the highest risk, while pharmaceutical and chemical wastes had the lowest risk. Among the various stages of waste management, the waste treatment stage was the most hazardous in all the studied hospitals. Conclusion: To diminish the risks associated with healthcare waste management in the studied hospitals, adequate training of healthcare workers and care providers, provision of suitable personal protective and transportation equipment, and supervision of the environmental health manager of hospitals should be considered by the authorities.

  9. Analysis of Management Behavior Assessments and Affect on Productivity

    National Research Council Canada - National Science Library

    Shipley, Jr, Steven E

    2005-01-01

    ... (Virtual Military Health Institute, 2003). The need exists for constructing a reliable behavior assessment instrument that captures data operationalized into correlational relationships between hospital management and employee beliefs...

  10. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Science.gov (United States)

    Khoshaim, Heba Bakr; Rashid, Saima

    2016-01-01

    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic…

  11. Assessing the validity of discourse analysis: transdisciplinary convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  12. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    Science.gov (United States)

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  13. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  14. Watershed assessment-watershed analysis: What are the limits and what must be considered

    Science.gov (United States)

    Robert R. Ziemer

    2000-01-01

    Watershed assessment or watershed analysis describes processes and interactions that influence ecosystems and resources in a watershed. Objectives and methods differ because issues and opportunities differ.

  15. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies, Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially

  16. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA are in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by process-knowledge-based linkage structures, can be generated and also processed in connection with the knowledge on types of problems, areas of analysis and procedures to deal with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge and the correspondingly processed complexity of models are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP) [German original:] The study sets itself the task of analysing the theoretical and methodological structure of Technology Assessment (TA). It is based on surveys carried out on Technology Assessment studies, which provided essential preconditions for

  17. No-Reference Video Quality Assessment by HEVC Codec Analysis

    DEFF Research Database (Denmark)

    Huang, Xin; Søgaard, Jacob; Forchhammer, Søren

    2015-01-01

    This paper proposes a No-Reference (NR) Video Quality Assessment (VQA) method for videos subject to the distortion given by High Efficiency Video Coding (HEVC). The proposed assessment can be performed either as a Bitstream-Based (BB) method or as a Pixel-Based (PB) method. It extracts or estimates ... the transform coefficients, estimates the distortion, and assesses the video quality. The proposed scheme generates VQA features based on Intra coded frames, and then maps features using an Elastic Net to predict subjective video quality. A set of HEVC coded 4K UHD sequences are tested. Results show ... that the quality scores computed by the proposed method are highly correlated with the subjective assessment ...

  18. Use of quantitative uncertainty analysis for human health risk assessment

    International Nuclear Information System (INIS)

    Duncan, F.L.W.; Gordon, J.W.; Kelly, M.

    1994-01-01

    Current human health risk assessment methods for environmental risks typically use point estimates of risk accompanied by qualitative discussions of uncertainty. Alternatively, Monte Carlo simulations may be used with distributions for input parameters to estimate the resulting risk distribution and descriptive risk percentiles. These two techniques are applied for the ingestion of 1,1-dichloroethene in ground water. The results indicate that Monte Carlo simulations provide significantly more information for risk assessment and risk management than do point estimates
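
    A minimal sketch of the Monte Carlo alternative described above is given below, using the generic chronic-daily-intake form of the ingestion risk equation. The parameter distributions and the slope factor are placeholders chosen for illustration, not the values used in the study.

```python
# Monte Carlo propagation of input distributions through a generic ground-water
# ingestion risk equation. All distributions and the slope factor are assumed.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

conc   = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)    # mg/L in ground water
intake = rng.triangular(1.0, 2.0, 3.0, size=n)                  # L/day drinking water
ef, ed = 350.0, 30.0                                             # days/yr, years of exposure
bw     = rng.normal(70.0, 10.0, size=n).clip(40.0, 110.0)        # kg body weight
at     = 70.0 * 365.0                                            # days, averaging time
slope_factor = 0.6                                               # (mg/kg-day)^-1, placeholder

cdi  = conc * intake * ef * ed / (bw * at)                       # chronic daily intake
risk = cdi * slope_factor                                        # lifetime excess cancer risk

print(f"median risk          = {np.median(risk):.2e}")
print(f"95th percentile risk = {np.percentile(risk, 95):.2e}")
```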

  19. Assessment of Transport Projects: Risk Analysis and Decision Support

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2008-01-01

    functions. New research proved that specifically two impacts stood out in transport project assessment, namely travel time savings and construction costs. The final concern of this study has been the fitting of distributions, e.g. by the use of data from major databases developed, in which Optimism Bias ... choosing probability distributions and performing real-term data fits. The perspective of this Ph.D. study presents a newer and better understanding of assigning risks within the assessment of transport projects.

  20. Chip based single cell analysis for nanotoxicity assessment.

    Science.gov (United States)

    Shah, Pratikkumar; Kaushik, Ajeet; Zhu, Xuena; Zhang, Chengxiao; Li, Chen-Zhong

    2014-05-07

    Nanomaterials, because of their tunable properties and performances, have been utilized extensively in everyday life related consumable products and technology. On exposure, beyond the physiological range, nanomaterials cause health risks via affecting the function of organisms, genomic systems, and even the central nervous system. Thus, new analytical approaches for nanotoxicity assessment to verify the feasibility of nanomaterials for future use are in demand. The conventional analytical techniques, such as spectrophotometric assay-based techniques, usually require a lengthy and time-consuming process and often produce false positives, and often cannot be implemented at a single cell level measurement for studying cell behavior without interference from its surrounding environment. Hence, there is a demand for a precise, accurate, sensitive assessment for toxicity using single cells. Recently, due to the advantages of automation of fluids and minimization of human errors, the integration of a cell-on-a-chip (CoC) with a microfluidic system is in practice for nanotoxicity assessments. This review explains nanotoxicity and its assessment approaches with advantages/limitations and new approaches to overcome the confines of traditional techniques. Recent advances in nanotoxicity assessment using a CoC integrated with a microfluidic system are also discussed in this review, which may be of use for nanotoxicity assessment and diagnostics.

  1. Radiological assessment. A textbook on environmental dose analysis

    International Nuclear Information System (INIS)

    Till, J.E.; Meyer, H.R.

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides

  2. Radiological assessment. A textbook on environmental dose analysis

    Energy Technology Data Exchange (ETDEWEB)

    Till, J.E.; Meyer, H.R. (eds.)

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  3. Assessing Anti-American Sentiment Through Social Media Analysis

    Science.gov (United States)

    2016-12-01

    ... military shows of force in and around Japan. Subject terms: anti-Americanism, Twitter, social media analysis, drone strike, Pakistan, UAV. The thesis hypothesizes that social media analysis, specifically analysis of messages sent through the Twitter network, can be used to gauge the overall ...

  4. Environmental Impact Assessment for Socio-Economic Analysis of Chemicals

    DEFF Research Database (Denmark)

    Calow, Peter; Biddinger, G; Hennes, C

    This report describes the requirements for, and illustrates the application of, a methodology for a socio-economic analysis (SEA), especially as it might be adopted in the framework of REACH.

  5. ASSESSMENT OF REGIONAL EFFICIENCY IN CROATIA USING DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Danijela Rabar

    2013-02-01

    Full Text Available In this paper, the regional efficiency of Croatian counties is measured over a three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. The analysis is carried out using models with the assumption of variable returns to scale. DEA identifies efficient counties as benchmark members and inefficient counties that are analyzed in detail to determine the sources and the amounts of their inefficiency in each source. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing the improvements necessary to achieve efficiency are given. The analysis reveals great disparities among counties. In order to alleviate naturally, historically and politically conditioned unequal county positions over which economic policy makers do not have total control, a categorical approach is introduced as an extension to the basic DEA models. This approach, combined with window analysis, changes the relations among efficiency scores in favor of continental counties.

  6. Analysis of truncation limit in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, Marko

    2005-01-01

    A truncation limit defines the boundaries of what is considered in the probabilistic safety assessment and what is neglected. The truncation limit that is the focus here is the limit on the size of the minimal cut set contribution at which to cut off. A new method was developed, which defines the truncation limit in probabilistic safety assessment. The method specifies truncation limits more stringently than existing documents dealing with truncation criteria in probabilistic safety assessment do. The results of this paper indicate that the truncation limits for more complex probabilistic safety assessments, which consist of a larger number of basic events, should be more severe than presently recommended in existing documents if more accuracy is desired. The truncation limits defined by the new method reduce the relative errors of importance measures and produce more accurate results for probabilistic safety assessment applications. The reduced relative errors of importance measures can prevent situations where the acceptability of a change to equipment under investigation according to RG 1.174 would be shifted from the region where changes can be accepted to the region where changes cannot be accepted, if the results are calculated with a smaller truncation limit
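
    The effect of a truncation limit is easy to demonstrate on a toy cut-set list: contributions below the limit are dropped, and the relative error of the top-event estimate grows as the limit is raised. The cut sets and basic-event probabilities below are invented.

```python
# Rare-event quantification of a minimal-cut-set list with and without a
# truncation limit; cut sets and probabilities are illustrative only.
cut_sets = [
    (1.0e-3, 2.0e-3),             # each tuple: basic-event probabilities in one MCS
    (5.0e-4, 1.0e-3, 2.0e-2),
    (1.0e-2, 3.0e-3),
    (1.0e-4, 1.0e-4),
    (2.0e-3, 5.0e-4, 1.0e-3),
]

def top_event_frequency(cut_sets, truncation_limit=0.0):
    total, kept = 0.0, 0
    for mcs in cut_sets:
        p = 1.0
        for basic_event in mcs:
            p *= basic_event
        if p >= truncation_limit:   # discard contributions below the limit
            total += p
            kept += 1
    return total, kept

exact, _ = top_event_frequency(cut_sets)
for limit in (0.0, 1e-8, 1e-6, 1e-5):
    est, kept = top_event_frequency(cut_sets, limit)
    rel_err = (exact - est) / exact
    print(f"limit {limit:8.0e}: kept {kept} cut sets, "
          f"estimate {est:.3e}, relative error {rel_err:.1%}")
```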

  7. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded with the specification of the exercise. Besides the results required for the exercise, further additional uncertainty and sensitivity analyses were performed and the details of these are also included. (author)

  8. What influences the choice of assessment methods in health technology assessments? Statistical analysis of international health technology assessments from 1989 to 2002.

    Science.gov (United States)

    Draborg, Eva; Andersen, Christian Kronborg

    2006-01-01

    Health technology assessment (HTA) has been used as input in decision making worldwide for more than 25 years. However, no uniform definition of HTA or agreement on assessment methods exists, leaving open the question of what influences the choice of assessment methods in HTAs. The objective of this study is to analyze statistically a possible relationship between the methods of assessment used in practical HTAs, the type of assessed technology, the type of assessors, and the year of publication. A sample of 433 HTAs published by eleven leading institutions or agencies in nine countries was reviewed and analyzed by multiple logistic regression. The study shows that outsourcing of HTA reports to external partners is associated with a higher likelihood of using assessment methods such as meta-analysis, surveys, economic evaluations, and randomized controlled trials, and with a lower likelihood of using assessment methods such as literature reviews and "other methods". The year of publication was statistically related to the inclusion of economic evaluations, with a decreasing likelihood over the period. The type of assessed technology was related to the use of economic evaluations (with a decreasing likelihood), surveys, and "other methods" (with a decreasing likelihood) when pharmaceuticals were the assessed type of technology. During the period from 1989 to 2002, no major developments in the assessment methods used in practical HTAs were shown statistically in a sample of 433 HTAs worldwide.
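
    The kind of multiple logistic regression used in the study can be sketched as below, modelling whether an HTA includes an economic evaluation as a function of outsourcing, publication year and technology type. The data are simulated with assumed effect directions matching the abstract, not the 433 reviewed HTAs.

```python
# Multiple logistic regression on simulated HTA characteristics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 433
outsourced = rng.integers(0, 2, n)                 # 1 = assessment done externally
year = rng.integers(1989, 2003, n) - 1989          # years since 1989
pharma = rng.integers(0, 2, n)                     # 1 = pharmaceutical technology

# simulated outcome with assumed effect directions (positive for outsourcing,
# negative for year and pharmaceuticals)
logit = -0.2 + 0.9 * outsourced - 0.08 * year - 0.5 * pharma
econ_eval = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([outsourced, year, pharma]))
model = sm.Logit(econ_eval, X).fit(disp=False)
print(model.summary(xname=["const", "outsourced", "year", "pharma"]))
```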

  9. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-05-09

    ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OA-2013-0320; FRL-9810-5] "Technical Guidance for Assessing Environmental Justice in Regulatory Analysis." The purpose of this guidance is to provide EPA analysts with technical ...

  10. The role of sensitivity analysis in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.; Knochenhauer, M.

    1987-01-01

    The paper describes several items suitable for close examination by means of sensitivity analysis when performing a Level 1 PSA. Sensitivity analyses are performed with respect to: (1) boundary conditions, (2) operator actions, and (3) treatment of common cause failures (CCFs). The items of main interest are identified continuously in the course of performing a PSA, as well as by scrutinising the final results. The practical aspects of sensitivity analysis are illustrated by several applications from a recent PSA study (ASEA-ATOM BWR 75). It is concluded that sensitivity analysis leads to insights important for analysts, reviewers and decision makers. (orig./HP)

  11. The predictive value of aptitude assessment in laparoscopic surgery : a meta-analysis

    NARCIS (Netherlands)

    Kramp, Kelvin H.; van Det, Marc J.; Hoff, Christiaan; Veeger, Nic J. G. M.; ten Cate Hoedemaker, Henk O.; Pierie, Jean-Pierre E. N.

    Context: Current methods of assessing candidates for medical specialties that involve laparoscopic skills suffer from a lack of instruments to assess the ability to work in a minimally invasive surgery environment. Objectives: A meta-analysis was conducted to investigate whether aptitude assessment can

  12. Review of assessment methods discount rate in investment analysis

    Directory of Open Access Journals (Sweden)

    Yamaletdinova Guzel Hamidullovna

    2011-08-01

    Full Text Available The article examines the current methods of calculating the discount rate in investment analysis and business valuation, and analyzes the key problems of applying the various techniques in the context of the Russian economy.
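
    Two of the standard discount-rate constructions such reviews typically cover are the CAPM cost of equity and the weighted average cost of capital (WACC); a brief sketch with invented rates is given below.

```python
# CAPM cost of equity and WACC with illustrative placeholder rates.
def capm_cost_of_equity(risk_free, beta, market_return, country_premium=0.0):
    """r_e = r_f + beta * (r_m - r_f) + country risk premium."""
    return risk_free + beta * (market_return - risk_free) + country_premium

def wacc(cost_equity, cost_debt, equity_value, debt_value, tax_rate):
    """Weighted average cost of capital with a tax shield on debt."""
    total = equity_value + debt_value
    return (equity_value / total) * cost_equity + \
           (debt_value / total) * cost_debt * (1.0 - tax_rate)

r_e = capm_cost_of_equity(risk_free=0.07, beta=1.2, market_return=0.13,
                          country_premium=0.02)
rate = wacc(cost_equity=r_e, cost_debt=0.09,
            equity_value=60.0, debt_value=40.0, tax_rate=0.20)
print(f"cost of equity (CAPM) = {r_e:.2%},  WACC = {rate:.2%}")
```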

  13. Sensitivity analysis of the RESRAD, a dose assessment code

    International Nuclear Information System (INIS)

    Yu, C.; Cheng, J.J.; Zielen, A.J.

    1991-01-01

    The RESRAD code is a pathway analysis code that is designed to calculate radiation doses and derive soil cleanup criteria for the US Department of Energy's environmental restoration and waste management program. The RESRAD code uses various pathway and consumption-rate parameters such as soil properties and food ingestion rates in performing such calculations and derivations. As with any predictive model, the accuracy of the predictions depends on the accuracy of the input parameters. This paper summarizes the results of a sensitivity analysis of RESRAD input parameters. Three methods were used to perform the sensitivity analysis: (1) the Gradient Enhanced Software System (GRESS) sensitivity analysis software package developed at Oak Ridge National Laboratory; (2) direct perturbation of input parameters; and (3) the built-in graphics package that shows parameter sensitivities while the RESRAD code is operational
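
    The second of the three methods, direct perturbation, can be illustrated with a deliberately simplified dose model: each input is nudged by a fixed fraction and a normalized sensitivity coefficient is reported. The model and parameter values below are stand-ins, not the RESRAD equations.

```python
# Direct-perturbation sensitivity analysis of a simplified ingestion-pathway
# dose model; the model and parameters are hypothetical.
def dose_model(params):
    # hypothetical dose ~ soil concentration * ingestion rate * dose factor,
    # attenuated by a leaching-type removal term
    return (params["soil_conc"] * params["ingestion_rate"] *
            params["dose_factor"] / (1.0 + params["leach_rate"] * params["time"]))

base = {"soil_conc": 100.0, "ingestion_rate": 0.1,
        "dose_factor": 5.0e-4, "leach_rate": 0.02, "time": 30.0}
d0 = dose_model(base)

delta = 0.01                              # 1 % perturbation
for name in ("soil_conc", "ingestion_rate", "dose_factor", "leach_rate"):
    perturbed = dict(base)
    perturbed[name] *= (1.0 + delta)
    sensitivity = (dose_model(perturbed) - d0) / d0 / delta   # (dD/dp)*(p/D)
    print(f"{name:15s} normalized sensitivity = {sensitivity:+.3f}")
```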

  14. FINANCIAL ANALYSIS FUNDAMENT FOR ASSESSMENT THE COMPANY'S VALUE

    Directory of Open Access Journals (Sweden)

    Goran Karanovic

    2010-06-01

    Full Text Available The lack of capital market development means that the valuation of companies in small markets, such as the Croatian market, is carried out primarily through the analysis of financial statements. The lack of market development is evident from unrealistic and unobjective corporate values, a result of the small volume of securities trading in financial markets. Primary financial analysis is the basic method for estimating company value and represents the foundation for an objective determination of the cash flow components that will be discounted. Through the analysis, investors try to answer questions such as: the status of assets, liabilities and capital, the dynamics of the business enterprise, the level of solvency and liquidity, the utilization of fixed assets, the contribution of fixed assets to total income, company profitability rates and investment in the company. Investors use financial analysis only as a basis and as a tool to predict the potential for creating new business value.

  15. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  16. Assessment and analysis of wind energy generation and power ...

    African Journals Online (AJOL)

    ... a statistical analysis of wind characteristics and the extrapolation of Weibull parameters are presented. The wind speed probability density function (PDF) can be adjusted using the expression given in [28, 30, 31] ...

  17. The role of risk assessment and safety analysis in integrated safety assessments

    International Nuclear Information System (INIS)

    Niall, R.; Hunt, M.; Wierman, T.E.

    1990-01-01

    To ensure that the design and operation of both nuclear and non-nuclear hazardous facilities is acceptable, and meets all societal safety expectations, a rigorous deterministic and probabilistic assessment is necessary. An approach is introduced, founded on the concept of an "Integrated Safety Assessment." It merges the commonly performed safety and risk analyses and uses them in concert to provide decision makers with the necessary depth of understanding to achieve "adequacy." 3 refs., 1 fig.

  18. Analysis of the most widely used Building Environmental Assessment methods

    International Nuclear Information System (INIS)

    Gu, Zhenhong; Wennersten, R.; Assefa, G.

    2006-01-01

    Building Environmental Assessment (BEA) is a term used for several methods for environmental assessment of the building environment. Generally, Life Cycle Assessment (LCA) is an important foundation and part of the BEA method, but current BEA methods form more comprehensive tools than LCA. Indicators and weight assignments are the two most important factors characterizing BEA. From the comparison of the three most widely used BEA methods, EcoHomes (BREEAM for residential buildings), LEED-NC and GBTool, it can be seen that BEA methods are shifting from ecological, indicator-based scientific systems to more integrated systems covering ecological, social and economic categories. Being relatively new methods, current BEA systems are far from perfect and are under continuous development. The further development of BEA methods will focus more on non-ecological indicators and how to promote implementation. Most BEA methods are developed based on regional regulations and LCA methods, but they do not attempt to replace these regulations. On the contrary, they try to extend implementation by incentive programmes. There are several ways to enhance BEA in the future: expand the studied scope from design levels to whole life-cycle levels of constructions, enhance international cooperation, accelerate legislation and standardize and develop user-oriented assessment systems

  19. Design and analysis for thematic map accuracy assessment: Fundamental principles

    Science.gov (United States)

    Stephen V. Stehman; Raymond L. Czaplewski

    1998-01-01

    Land-cover maps are used in numerous natural resource applications to describe the spatial distribution and pattern of land-cover, to estimate areal extent of various cover classes, or as input into habitat suitability models, land-cover change analyses, hydrological models, and risk analyses. Accuracy assessment quantifies data quality so that map users may evaluate...

  20. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  1. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  2. A big data analysis approach for rail failure risk assessment

    NARCIS (Netherlands)

    Jamshidi, A.; Faghih Roohi, S.; Hajizadeh, S.; Nunez Vicencio, Alfredo; Babuska, R.; Dollevoet, R.P.B.J.; Li, Z.; De Schutter, B.H.K.

    2017-01-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by

  3. Meta-Analysis of Psychological Assessment as a Therapeutic Intervention

    Science.gov (United States)

    Poston, John M.; Hanson, William E.

    2010-01-01

    This study entails the use of meta-analytic techniques to calculate and analyze 18 independent and 52 nonindependent effect sizes across 17 published studies of psychological assessment as a therapeutic intervention. In this sample of studies, which involves 1,496 participants, a significant overall Cohen's d effect size of 0.423 (95% CI [0.321,…
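
    As a reminder of the effect-size arithmetic underlying such a meta-analysis, a minimal sketch of Cohen's d with a pooled standard deviation and an approximate 95% confidence interval is given below; the study values are hypothetical and not taken from the article:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Cohen's d with pooled SD, plus an approximate 95% CI."""
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    se = math.sqrt((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# hypothetical study: assessment-as-intervention group vs. comparison group
d, ci = cohens_d(mean_t=5.2, mean_c=4.6, sd_t=1.4, sd_c=1.5, n_t=40, n_c=42)
print(f"d = {d:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```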

  4. Analysis of Online Quizzes as a Teaching and Assessment Tool

    Science.gov (United States)

    Salas-Morera, Lorenzo; Arauzo-Azofra, Antonio; García-Hernández, Laura

    2012-01-01

    This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the quizzes' effectiveness on student performance when used, not only as an…

  5. Individuals' stress assessment using human-smartphone interaction analysis

    DEFF Research Database (Denmark)

    Ciman, Matteo; Wac, Katarzyna

    2018-01-01

    costs and reducing user acceptance, or they use some of privacy-related information. This paper presents an approach for stress assessment that leverages data extracted from smartphone sensors, and that is not invasive concerning privacy. Two different approaches are presented. One, based on smartphone...

  6. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  7. Comparative analysis of different modalities of assessment of lower ...

    African Journals Online (AJOL)

    LEAD was assessed in all the patients using: 1. History of intermittent claudication using the Edinburgh Claudication Questionnaire. 2. Palpation of pedal pulses for diminished or absent dorsalis pedis and/or posterior tibial artery pulsations. 3. Ankle Brachial Index <0.9 in either leg, using hand-held Doppler ultrasonography

  8. Application of Automated Facial Expression Analysis and Qualitative Analysis to Assess Consumer Perception and Acceptability of Beverages and Water

    OpenAIRE

    Crist, Courtney Alissa

    2016-01-01

    Sensory and consumer sciences aim to understand the influences of product acceptability and purchase decisions. The food industry measures product acceptability through hedonic testing but often does not assess implicit or qualitative response. Incorporation of qualitative research and automated facial expression analysis (AFEA) may supplement hedonic acceptability testing to provide product insights. The purpose of this research was to assess the application of AFEA and qualitative analysis ...

  9. Indoor air - assessment: Methods of analysis for environmental carcinogens

    International Nuclear Information System (INIS)

    Peterson, M.R.; Naugle, D.F.; Berry, M.A.

    1990-06-01

    The monograph describes, in a general way, published sampling procedures and analytical approaches for known and suspected carcinogens. The primary focus is upon carcinogens found in indoor air, although the methods described are applicable to other media or environments. In cases where there are no published methods for a particular pollutant in indoor air, methods developed for the workplace and for ambient air are included since they should be adaptable to indoor air. Known and suspected carcinogens have been grouped into six categories for the purposes of this and related work. The categories are radon, asbestos, organic compounds, inorganic species, particles, and non-ionizing radiation. Some methods of assessing exposure that are not specific to any particular pollutant category are covered in a separate section. The report is the fifth in a series of EPA/Environmental Criteria and Assessment Office Monographs

  10. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars

    2015-01-01

    Background: Flexible bronchoscopy should be performed with a correct posture and a straight scope to optimize bronchoscopy performance and at the same time minimize the risk of work-related injuries and endoscope damage. Objectives: We aimed to test whether an automatic motion analysis system could … intermediates and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. The Microsoft Kinect system was used to automatically measure the total deviation of the scope from a perfectly straight, vertical line. Results: The low-cost motion analysis system could measure … with the performance on the simulator (virtual-reality simulator score; p …) … analysis system could discriminate between different levels of experience. Automatic feedback on correct movements during self-directed training on simulators might help new bronchoscopists learn how to handle…

  11. Technical quality assessment of an optoelectronic system for movement analysis

    International Nuclear Information System (INIS)

    Di Marco, R; Patanè, F; Cappa, P; Rossi, S (Department of Mechanical and Aerospace Engineering, Sapienza University of Rome, Italy)

    2015-01-01

    Optoelectronic Systems (OS) are largely used in gait analysis to evaluate the motor performance of healthy subjects and patients. The accuracy of marker trajectory reconstruction depends on several aspects: the number of cameras, the dimension and position of the calibration volume, and the chosen calibration procedure. In this paper we propose a methodology to evaluate the effects of the mentioned sources of error on the reconstruction of marker trajectories. The novel contribution of the present work consists in the dimension of the tested calibration volumes, which is comparable with the ones normally used in gait analysis; in addition, to simulate trajectories during clinical gait analysis, we provide non-default paths for markers as inputs. Several calibration procedures are implemented and the same trial is processed with each calibration file, also considering different camera configurations. The RMSEs between the measured trajectories and the optimal ones are calculated for each comparison. To investigate significant differences between the computed indices, an ANOVA analysis is implemented. The RMSE is sensitive to variations of the considered calibration volume and the camera configurations, and is always below 43 mm

  12. Safety analysis and risk assessment of the National Ignition Facility

    International Nuclear Information System (INIS)

    Brereton, S.; McLouth, L.; Odell, B.

    1996-01-01

    The National Ignition Facility (NIF) is a proposed U.S. Department of Energy inertial confinement laser fusion facility. The candidate sites for locating the NIF are: Los Alamos National Laboratory, Sandia National Laboratory, the Nevada Test Site, and Lawrence Livermore National Laboratory (LLNL), the preferred site. The NIF will operate by focusing 192 laser beams onto a tiny deuterium-tritium target located at the center of a spherical target chamber. The NIF mission is to achieve inertial confinement fusion (ICF) ignition, access physical conditions in matter of interest to nuclear weapons physics, provide an above ground simulation capability for nuclear weapons effects testing, and contribute to the development of inertial fusion for electrical power production. The NIF has been classified as a radiological, low hazard facility on the basis of a preliminary hazards analysis and according to the DOE methodology for facility classification. This requires that a safety analysis be prepared under DOE Order 5481.1B, Safety Analysis and Review System. A draft Preliminary Safety Analysis Report (PSAR) has been written, and this will be finalized later in 1996. This paper summarizes the safety issues associated with the operation of the NIF and the methodology used to study them. It provides a summary of the methodology, an overview of the hazards, estimates maximum routine and accidental exposures for the preferred site of LLNL, and concludes that the risks from NIF operations are low

  13. Advanced multivariate analysis to assess remediation of hydrocarbons in soils.

    Science.gov (United States)

    Lin, Deborah S; Taylor, Peter; Tibbett, Mark

    2014-10-01

    Accurate monitoring of degradation levels in soils is essential in order to understand and achieve complete degradation of petroleum hydrocarbons in contaminated soils. We aimed to develop the use of multivariate methods for the monitoring of biodegradation of diesel in soils and to determine if diesel contaminated soils could be remediated to a chemical composition similar to that of an uncontaminated soil. An incubation experiment was set up with three contrasting soil types. Each soil was exposed to diesel at varying stages of degradation and then analysed for key hydrocarbons throughout 161 days of incubation. Hydrocarbon distributions were analysed by Principal Coordinate Analysis and similar samples grouped by cluster analysis. Variation and differences between samples were determined using permutational multivariate analysis of variance. It was found that all soils followed trajectories approaching the chemical composition of the unpolluted soil. Some contaminated soils were no longer significantly different to that of uncontaminated soil after 161 days of incubation. The use of cluster analysis allows the assignment of a percentage chemical similarity of a diesel contaminated soil to an uncontaminated soil sample. This will aid in the monitoring of hydrocarbon contaminated sites and the establishment of potential endpoints for successful remediation.

  14. Assessment of major nuclear technologies with decision and risk analysis

    International Nuclear Information System (INIS)

    Winterfeldt, D. von

    1995-01-01

    Selecting technologies for major nuclear programs involves several complexities, including multiple stakeholders, multiple conflicting objectives, uncertainties, and risk. In addition, the programmatic risks related to the schedule, cost, and performance of these technologies often become major issues in the selection process. This paper describes a decision analysis approach for addressing these complexities in a logical manner

  15. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin Hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated within a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by parameter uncertainty. The most important parameters turned out to be different for each pathway of exposure, as could be expected. However, the overall most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
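
    A minimal sketch of the Latin Hypercube sampling plus regression approach described above follows, assuming SciPy's qmc module is available; the consequence model and parameter ranges are stand-ins, not the Nordic accident consequence model:

```python
# Sketch of the Latin Hypercube + regression approach described above,
# applied to a stand-in consequence model (the real model is not reproduced here).
import numpy as np
from scipy.stats import qmc

names = ["wet_deposition", "shielding_factor", "ingestion_rate"]
lower = np.array([1e-5, 0.1, 0.5])
upper = np.array([1e-3, 1.0, 2.0])

sampler = qmc.LatinHypercube(d=3, seed=0)
X = qmc.scale(sampler.random(n=200), lower, upper)   # 200 LHS samples

def consequence(x):
    wet, shield, ingest = x
    return 1e4 * wet * shield + 50.0 * ingest        # hypothetical dose response

y = np.apply_along_axis(consequence, 1, X)

# standardized regression coefficients as a sensitivity measure
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(ys)), Xs]), ys, rcond=None)
for name, b in zip(names, coef[1:]):
    print(f"{name:18s} SRC = {b:+.2f}")
```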

  16. Assessing ground compaction via time lapse surface wave analysis

    Czech Academy of Sciences Publication Activity Database

    Dal Moro, Giancarlo; Al-Arifi, N.; Moustafa, S.S.R.

    2016-01-01

    Roč. 13, č. 3 (2016), s. 249-256 ISSN 1214-9705 Institutional support: RVO:67985891 Keywords: Full velocity spectrum (FVS) analysis * ground compaction * phase velocities * Rayleigh waves * seismic data inversion * surface wave dispersion * surface waves Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.699, year: 2016

  17. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)


    Copulas are applied to overcome the restriction of traditional bivariate frequency analysis … frequency analysis methods cannot describe the random variable properties that … In order to overcome the limitation of multivariate distributions, a copula is a … The Mann-Kendall (M-K) test is a non-parametric statistical test which is used …
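
    The Mann-Kendall test mentioned in the abstract can be sketched in a few lines; the version below uses the basic statistic without tie or autocorrelation corrections, and the annual-maximum series is hypothetical:

```python
# Minimal Mann-Kendall trend test (no tie/autocorrelation correction),
# matching the non-parametric test mentioned in the abstract.
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided normal p-value
    return s, z, p

annual_maxima = [312, 298, 355, 340, 410, 385, 402, 430, 395, 460]  # hypothetical
print(mann_kendall(annual_maxima))
```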

  18. [Modular risk analysis for assessing multiple waste sites]: Proceedings

    International Nuclear Information System (INIS)

    Whelan, G.

    1994-01-01

    This document contains proceedings from the Integrated Planning Workshop from Strategic Planning to Baselining and Other Objectives. Topics discussed include: stakeholder involvement; regulations; future site use planning; site integration and baseline methods; risk analysis in decision making; land uses; and economics in decision making. Individual records have been processed separately for the database

  19. A Preliminary Analysis of a Behavioral Classrooms Needs Assessment

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McCray, Cynthia; Lamkins, Carol; Taubman, Mitchell; McEachin, John; Cihon, Joseph H.

    2016-01-01

    Today many special education classrooms implement procedures based upon the principles of Applied Behavior Analysis (ABA) to establish educationally relevant skills and decrease aberrant behaviors. However, it is difficult for school staff and consultants to evaluate the implementation of various components of ABA and general classroom set up. In…

  20. Assessing air quality in Aksaray with time series analysis

    Science.gov (United States)

    Kadilar, Gamze Özel; Kadilar, Cem

    2017-04-01

    Sulphur dioxide (SO2) is a major air pollutant caused by the dominant usage of diesel, petrol and other fuels by vehicles and industries. One of the most air-polluted cities in Turkey is Aksaray. Hence, in this study, the level of SO2 in Aksaray is analyzed based on the database monitored at an air quality monitoring station of Turkey. A Seasonal Autoregressive Integrated Moving Average (SARIMA) approach is used to forecast the level of the SO2 air quality parameter. The results indicate that the seasonal ARIMA model provides reliable and satisfactory predictions for the air quality parameters and is expected to be an alternative tool for practical assessment and justification.
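
    A seasonal ARIMA fit of the kind described can be sketched as follows, assuming the statsmodels package; the synthetic monthly series and the (p,d,q)(P,D,Q,s) orders are placeholders, not the model identified in the study:

```python
# Sketch of a seasonal ARIMA fit/forecast for a monthly SO2 series,
# assuming statsmodels is available; the orders below are placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
idx = pd.date_range("2010-01", periods=84, freq="MS")
so2 = pd.Series(30 + 15 * np.sin(2 * np.pi * idx.month / 12) + rng.normal(0, 3, 84),
                index=idx)                                  # synthetic monthly SO2 (ug/m3)

model = SARIMAX(so2, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
result = model.fit(disp=False)
print(result.forecast(steps=12))                            # 12-month-ahead forecast
```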

  1. Common-Cause Failure Analysis in Event Assessment

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group
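
    As a simple illustration of why the treatment matters in event assessment, the toy beta-factor calculation below shows how the conditional failure probability of the redundant train changes once one component is observed failed; this is only the simplest special case, not the paper's full treatment, and all numbers are hypothetical:

```python
# Toy beta-factor illustration of the event-assessment issue discussed above:
# once one component of a two-train group is observed failed, the chance that
# the redundant train also fails is dominated by the common-cause fraction.
# (This is only the simplest special case, not the paper's full treatment.)

def conditional_redundant_failure(q_total, beta):
    """P(train B fails | train A observed failed) under a simple beta-factor model."""
    # with prob. beta the observed failure was common cause -> B is failed too;
    # otherwise B still fails independently with prob. (1 - beta) * q_total
    return beta + (1 - beta) * (1 - beta) * q_total

q = 1e-3        # total failure probability of one train (hypothetical)
for beta in (0.0, 0.02, 0.05, 0.1):
    p = conditional_redundant_failure(q, beta)
    print(f"beta={beta:4.2f}  P(B fails | A failed) = {p:.2e}")
```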

  2. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  3. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  4. Analysis and Pollution Assessment of Heavy Metal in Soil, Perlis

    International Nuclear Information System (INIS)

    Siti Norbaya Mat Ripin; Siti Norbaya Mat Ripin; Sharizal Hasan; Mohd Lias Kamal; NorShahrizan Mohd Hashim

    2014-01-01

    The concentrations of 5 heavy metals (Cu, Cr, Ni, Cd, Pb) were studied in soils around Perlis to assess the distribution of heavy metal contamination due to industrialization, urbanization and agricultural activities. Soil samples were collected at a depth of 0-15 cm at eighteen stations around Perlis. The soil samples (2 mm) were obtained in duplicate and subjected to hot block digestion, and the total metal concentration was determined via ICP-MS. Overall concentrations of Cu, Cr, Ni, Cd and Pb in the soil samples ranged from 0.38-240.59, 0.642-3.921, 0.689-2.398, 0-0.63 and 0.39-27.47 mg/kg, respectively. The concentrations of heavy metals in the soil display the following decreasing trend: Cu > Pb > Cr > Ni > Cd. It was found that the levels of heavy metals in soil near the centralized Chuping industrial area reached their maximum values compared with other locations in Perlis. The pollution index revealed that only 11 % of samples for Cu and 6 % for Cd were classed as heavily contaminated. Meanwhile, 6 % of all samples were moderately contaminated with Cu and Pb, and the other elements showed low contamination. The combined results of heavy metal concentrations and heavy metal assessment indicate that industrial activities and traffic emissions represent the most important sources for Cu, Cd and Pb, whereas Cr and Ni derive mainly from natural sources. Increasing anthropogenic influences on the environment, especially pollution loadings, have caused negative changes in natural ecosystems and decreased biodiversity. (author)

  5. An Analysis Report of 2014 CALA Self-Assessment Survey

    Directory of Open Access Journals (Sweden)

    Jian Anna Xiong

    2016-06-01

    On the occasion of CALA's 40th anniversary in 2014, the 2013 Board of Directors appointed a Self-Assessment Task Force to conduct an assessment survey with special focuses on members' awareness of CALA's organizational structure and policies, its services to members, the extent of participation in events sponsored by CALA, and the level of satisfaction with CALA leadership. Although only one-fifth of the active members responded to the survey, the answers and feedback have identified areas for organizational improvement and have shown how active members view the current state of CALA. Some essential findings from the survey include: (1) the growth of overseas membership as a demographic trend, (2) a need to recruit student members, (3) a high percentage of CALA members not aware of CALA's Mission/Vision/Goal, (4) conflicting data on CALA's leadership, (5) discovery of low ratings (10-30% of respondents) on eleven out of twelve rating questions, and (6) strong support for CALA as a representative organization of Chinese American librarians in North America. The findings of the survey will serve as a valuable reference for future strategic planning and for carrying out CALA's long term goals.

  6. Influence analysis to assess sensitivity of the dropout process

    OpenAIRE

    Molenberghs, Geert; Verbeke, Geert; Thijs, Herbert; Lesaffre, Emmanuel; Kenward, Michael

    2001-01-01

    Diggle and Kenward (Appl. Statist. 43 (1994) 49) proposed a selection model for continuous longitudinal data subject to possible non-random dropout. It has provoked a large debate about the role for such models. The original enthusiasm was followed by skepticism about the strong but untestable assumption upon which this type of models invariably rests. Since then, the view has emerged that these models should ideally be made part of a sensitivity analysis. One of their examples is a set of da...

  7. Assessment of the SFC database for analysis and modeling

    Science.gov (United States)

    Centeno, Martha A.

    1994-01-01

    SFC is one of the four clusters that make up the Integrated Work Control System (IWCS), which will integrate the shuttle processing databases at Kennedy Space Center (KSC). The IWCS framework will enable communication among the four clusters and add new data collection protocols. The Shop Floor Control (SFC) module has been operational for two and a half years; however, at this stage, automatic links to the other 3 modules have not been implemented yet, except for a partial link to IOS (CASPR). SFC revolves around a DB/2 database with PFORMS acting as the database management system (DBMS). PFORMS is an off-the-shelf DB/2 application that provides a set of data entry screens and query forms. The main dynamic entity in the SFC and IOS database is a task; thus, the physical storage location and update privileges are driven by the status of the WAD. As we explored the SFC values, we realized that there was much to do before actually engaging in continuous analysis of the SFC data. Half way into this effort, it was realized that full scale analysis would have to be a future third phase of this effort. So, we concentrated on getting to know the contents of the database, and in establishing an initial set of tools to start the continuous analysis process. Specifically, we set out to: (1) provide specific procedures for statistical models, so as to enhance the TP-OAO office analysis and modeling capabilities; (2) design a data exchange interface; (3) prototype the interface to provide inputs to SCRAM; and (4) design a modeling database. These objectives were set with the expectation that, if met, they would provide former TP-OAO engineers with tools that would help them demonstrate the importance of process-based analyses. The latter, in return, will help them obtain the cooperation of various organizations in charting out their individual processes.

  8. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    Science.gov (United States)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work in which the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
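
    The monofractal building block of the technique named above, ordinary detrended fluctuation analysis (DFA), can be sketched as follows on a synthetic signal; this is not the authors' multifractal implementation:

```python
# Sketch of ordinary detrended fluctuation analysis (DFA), the monofractal
# building block of the MFDFA technique named above, on a synthetic signal.
import numpy as np

def dfa_exponent(x, scales):
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                     # profile (integrated signal)
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for k in range(n_seg):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluct.append(np.mean(rms))
    # scaling exponent alpha from the log-log slope of F(s) vs s
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(2)
noise = rng.normal(size=4096)                       # white noise: alpha ~ 0.5
print(dfa_exponent(noise, scales=[16, 32, 64, 128, 256]))
```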

  9. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and "expert" data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) development of a groundwater quality baseline prior to remediation startup, (2) targeting of constituents for removal from the RCRA GWPS, (3) targeting of constituents for removal from the UIC permit, (4) targeting of constituents for reduced, (5) targeting of monitoring wells not producing representative samples, (6) reduction in statistical evaluation, and (7) identification of contamination from other facilities

  10. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youn Myoung; Jung, Jong Tae; Kang, Chul Hyung (and others)

    2008-04-15

    A modeling study and analysis has been carried out to provide technical support for the safety and performance assessment of the low- and intermediate-level waste (LILW) repository; this support is needed in part for the radiological environmental impact report, which is essential for the construction and operating licenses of the LILW repository. The study covered the essential areas of technical support for the safety and performance assessment of the LILW repository and its licensing: gas generation and migration in and around the repository, risk analysis and environmental impact during transportation of LILW, biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure, as well as regional and global groundwater modeling and analysis.

  11. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Jung, Jong Tae; Kang, Chul Hyung

    2008-04-01

    A modeling study and analysis has been carried out to provide technical support for the safety and performance assessment of the low- and intermediate-level waste (LILW) repository; this support is needed in part for the radiological environmental impact report, which is essential for the construction and operating licenses of the LILW repository. The study covered the essential areas of technical support for the safety and performance assessment of the LILW repository and its licensing: gas generation and migration in and around the repository, risk analysis and environmental impact during transportation of LILW, biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure, as well as regional and global groundwater modeling and analysis.

  12. Fuel assembly assessment from CVD image analysis: A feasibility study

    International Nuclear Information System (INIS)

    Lindsay, C.S.; Lindblad, T.

    1997-05-01

    The Swedish Nuclear Inspectorate commissioned a feasibility study of automatic assessment of fuel assemblies from images obtained with the digital Cerenkov viewing device currently in development. The goal is to assist the IAEA inspectors in evaluating the fuel since they typically have only a few seconds to inspect an assembly. We report results here in two main areas: Investigation of basic image processing and recognition techniques needed to enhance the images and find the assembly in the image; Study of the properties of the distributions of light from the assemblies to determine whether they provide unique signatures for different burn-up and cooling times for real fuel or indicate presence of non-fuel. 8 refs, 27 figs

  13. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics computing codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanics tools have been compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanics codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue in LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated in order to identify the reasons for differences in the results of different codes. The comparison of the tools is used for further improvements of the codes applied by the partners.

  14. Wind resource assessment and siting analysis in Italy

    International Nuclear Information System (INIS)

    Ricci, A.; Mizzoni, G.; Rossi, E.

    1992-01-01

    The wind power industry has recently matured; consequently, many wind energy projects have been planned in a number of countries. Many of them are already realized and running. As such, there is a direct need to identify a sizeable number of wind power plant sites. Choosing the right sites to match specific Wind Energy Conversion Systems (WECS) is also needed to harness this clean energy from the points of view of industrial viability and project financing. As a prerequisite to installing a wind turbine at a particular site, it is necessary to know the theoretically available wind energy at the site, as well as the practicability of matching the design to the characteristics of the WECS. In this paper, the ENEA (Italian National Agency for New Technology, Energy and Environment) wind siting and resource assessment activities currently ongoing in different regions of Italy are described, along with the present status and future prospects of the wind power industry

  15. Sensitivity, uncertainty, and importance analysis of a risk assessment

    International Nuclear Information System (INIS)

    Andsten, R.S.; Vaurio, J.K.

    1992-01-01

    In this paper a number of supplementary studies and applications associated with probabilistic safety assessment (PSA) are described, including sensitivity and importance evaluations of failures, errors, systems, and groups of components. The main purpose is to illustrate the usefulness of a PSA for making decisions about safety improvements, training, allowed outage times, and test intervals. A useful measure of uncertainty importance is presented, and it points out areas needing development, such as reactor vessel aging phenomena, for reducing overall uncertainty. A time-dependent core damage frequency is also presented, illustrating the impact of testing scenarios and intervals. The methods and applications presented are based on the Level 1 PSA carried out for the internal initiating events of the Loviisa 1 nuclear power station. Steam generator leakages and associated operator actions are major contributors to the current core-damage frequency estimate of 2 x 10^-4 per year. The results are used to improve the plant and procedures and to guide future improvements.

  16. Promises and pitfalls in environmentally extended input–output analysis for China: A survey of the literature

    International Nuclear Information System (INIS)

    Hawkins, Jacob; Ma, Chunbo; Schilizzi, Steven; Zhang, Fan

    2015-01-01

    As the world's largest developing economy, China plays a key role in global climate change and other environmental impacts of international concern. Environmentally extended input–output analysis (EE-IOA) is an important and insightful tool seeing widespread use in studying large-scale environmental impacts in China: calculating and analyzing greenhouse gas emissions, carbon and water footprints, pollution, and embodied energy. This paper surveys the published articles regarding EE-IOA for China in peer-reviewed journals and provides a comprehensive and quantitative overview of the body of literature, examining the research impact, environmental issues addressed, and data utilized. The paper further includes a discussion of the shortcomings in official Chinese data and of the potential means to move beyond its inherent limitations. - Highlights: • Articles published in 2012–2013 more than doubled those published between 1995 and 2011. • CO2 and energy are the most common topics, frequently associated with trade. • Data from the National Bureau of Statistics is widely used but seen as flawed. • Climate change, water supply, and food security drive the future of the literature

  17. Quantitative assessment of early diabetic retinopathy using fractal analysis.

    Science.gov (United States)

    Cheung, Ning; Donaghue, Kim C; Liew, Gerald; Rogers, Sophie L; Wang, Jie Jin; Lim, Shueh-Wen; Jenkins, Alicia J; Hsu, Wynne; Li Lee, Mong; Wong, Tien Y

    2009-01-01

    Fractal analysis can quantify the geometric complexity of the retinal vascular branching pattern and may therefore offer a new method to quantify early diabetic microvascular damage. In this study, we examined the relationship between retinal fractal dimension and retinopathy in young individuals with type 1 diabetes. We conducted a cross-sectional study of 729 patients with type 1 diabetes (aged 12-20 years) who had seven-field stereoscopic retinal photographs taken of both eyes. From these photographs, retinopathy was graded according to the modified Airlie House classification, and fractal dimension was quantified using a computer-based program following a standardized protocol. In this study, 137 patients (18.8%) had diabetic retinopathy signs; of these, 105 had mild retinopathy. Median (interquartile range) retinal fractal dimension was 1.46214 (1.45023-1.47217). After adjustment for age, sex, diabetes duration, A1C, blood pressure, and total cholesterol, increasing retinal vascular fractal dimension was significantly associated with increasing odds of retinopathy (odds ratio 3.92 [95% CI 2.02-7.61] for fourth versus first quartile of fractal dimension). In multivariate analysis, each 0.01 increase in retinal vascular fractal dimension was associated with a nearly 40% increased odds of retinopathy (1.37 [1.21-1.56]). This association remained after additional adjustment for retinal vascular caliber. Greater retinal fractal dimension, representing increased geometric complexity of the retinal vasculature, is independently associated with early diabetic retinopathy signs in type 1 diabetes. Fractal analysis of fundus photographs may allow quantitative measurement of early diabetic microvascular damage.
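
    The study used a standardized computer-based program on retinal photographs; as a generic illustration of how a fractal dimension can be estimated from a binary (vessel) image, a box-counting sketch on a synthetic pattern is shown below (the details are assumptions, not the clinical pipeline):

```python
# Generic box-counting estimate of fractal dimension for a binary image
# (a stand-in for the standardized retinal-vessel program used in the study).
import numpy as np

def box_counting_dimension(img, box_sizes):
    counts = []
    for b in box_sizes:
        h, w = img.shape
        # trim so the image tiles exactly into b x b boxes
        trimmed = img[:h - h % b, :w - w % b]
        boxes = trimmed.reshape(trimmed.shape[0] // b, b, trimmed.shape[1] // b, b)
        occupied = boxes.any(axis=(1, 3)).sum()      # boxes containing any vessel pixel
        counts.append(occupied)
    # dimension = -slope of log(count) vs log(box size)
    slope = np.polyfit(np.log(box_sizes), np.log(counts), 1)[0]
    return -slope

img = np.zeros((512, 512), dtype=bool)
d = np.arange(512)
img[d, d] = True                                     # a straight line: expected dimension ~1
print(box_counting_dimension(img, box_sizes=[2, 4, 8, 16, 32]))
```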

  18. Assessment of the Prony's method for BWR stability analysis

    International Nuclear Information System (INIS)

    Ortiz-Villafuerte, Javier; Castillo-Duran, Rogelio; Palacios-Hernandez, Javier C.

    2011-01-01

    Highlights: → This paper describes a method to determine the degree of stability of a BWR. → A performance comparison between Prony's method and common AR techniques is presented. → Benchmark data and actual BWR transient data are used for comparison. → DR and f results are presented and discussed. → Prony's method is shown to be a robust technique for BWR stability. - Abstract: It is known that Boiling Water Reactors are susceptible to power oscillations in regions of high power and low coolant flow in the power-flow operational map. It is possible to fall into one of these instability regions during reactor startup, since both power and coolant flow are being increased but not proportionally. Another possibility for falling into those areas is a trip of the recirculation pumps. Stability monitoring in such cases can be difficult, because the amount or quality of power signal data required for calculation of the key stability parameters may not be enough to provide reliable results in an adequate time range. In this work, Prony's method is presented as a complementary alternative for determining the degree of stability of a BWR from time series data. This analysis method can provide information about decay ratio and oscillation frequency from power signals obtained during transient events. However, so far few applications to actual Boiling Water Reactor operation have been reported to establish the scope of using such analysis for actual transient events. This work first presents a comparison of decay ratio and oscillation frequency results obtained by Prony's method with those obtained by the participants of the Forsmark 1 and 2 Boiling Water Reactor Stability Benchmark using diverse techniques. Then, a comparison of decay ratio and oscillation frequency results is performed for four real BWR transient event data sets, using Prony's method and two other techniques based on autoregressive modeling. The four
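
    A minimal sketch of a Prony fit for decay ratio (DR) and oscillation frequency is given below, demonstrated on a noiseless synthetic decaying oscillation rather than plant data; order 2 suffices for this single-mode example, whereas noisy multi-mode signals require higher orders and careful pole selection:

```python
# Minimal Prony fit used to estimate decay ratio (DR) and oscillation frequency
# from a sampled signal; demonstrated on a synthetic decaying oscillation.
import numpy as np

def prony_dr_freq(x, dt, order=2):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # linear-prediction step: x[k] = a1*x[k-1] + ... + ap*x[k-p]
    A = np.column_stack([x[order - 1 - i:n - 1 - i] for i in range(order)])
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    roots = np.roots(np.r_[1.0, -a])              # discrete-time poles z_k
    s = np.log(roots) / dt                        # continuous-time poles
    osc = s[np.abs(s.imag) > 1e-6]                # keep oscillatory poles only
    dom = osc[np.argmax(osc.real)]                # least-damped oscillatory pole
    freq = abs(dom.imag) / (2 * np.pi)
    dr = np.exp(2 * np.pi * dom.real / abs(dom.imag))   # decay ratio
    return dr, freq

dt = 0.08
t = np.arange(0, 40, dt)
sig = np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.5 * t)    # expected DR ~ exp(-0.2) ~ 0.82
print(prony_dr_freq(sig, dt))
```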

  19. Assessing Canadian Bank Branch Operating Efficiency Using Data Envelopment Analysis

    Science.gov (United States)

    Yang, Zijiang

    2009-10-01

    In today's economy and society, performance analyses in the service industries attract more and more attention. This paper presents an evaluation of 240 branches of one large Canadian bank in the Greater Toronto Area using Data Envelopment Analysis (DEA). Special emphasis was placed on how to present the DEA results to management so as to provide more guidance on what to manage and how to accomplish the changes. Finally, the potential management uses of the DEA results are presented. All the findings are discussed in the context of the Canadian banking market.
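
    An input-oriented CCR efficiency score of the kind DEA produces can be sketched with a small linear program, assuming SciPy is available; the branch data below are hypothetical, not the study's data set:

```python
# Sketch of an input-oriented CCR DEA efficiency score, solved with scipy's
# linprog; all branch data are hypothetical.
import numpy as np
from scipy.optimize import linprog

# rows = branches, columns = inputs (staff, operating cost) / outputs (transactions, loans)
X = np.array([[20.0, 300.0], [25.0, 400.0], [18.0, 250.0], [30.0, 500.0]])
Y = np.array([[500.0, 40.0], [480.0, 35.0], [510.0, 45.0], [600.0, 50.0]])
n, m = X.shape
_, r = Y.shape

def ccr_efficiency(o):
    # variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.column_stack([-X[o], X.T])
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.column_stack([np.zeros(r), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(n):
    print(f"branch {o}: efficiency = {ccr_efficiency(o):.3f}")
```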

  20. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    International Nuclear Information System (INIS)

    Poern, K.; Aakerlund, O.

    1985-10-01

    The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models, such as those concerning geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer-assisted analysis of function, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, like the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these, the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. Applying the other basic method, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
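
    The second approach described above (replacing the model by an approximating response surface that is then sampled) can be sketched as follows; the "expensive" model, its inputs and the quadratic surface are stand-ins chosen for illustration, not a repository performance-assessment code:

```python
# Sketch of the response-surface approach described above: fit a surface to a
# small number of model runs, then sample the surface instead of the model.
import numpy as np

rng = np.random.default_rng(3)

def expensive_model(k_d, flow):
    return 1.0 / (1.0 + k_d) * flow ** 1.5          # hypothetical release measure

# a small design of model runs
kd = rng.uniform(1.0, 10.0, 30)
flow = rng.uniform(0.1, 1.0, 30)
y = expensive_model(kd, flow)

# quadratic response surface fitted by least squares
def design(kd, flow):
    return np.column_stack([np.ones_like(kd), kd, flow, kd**2, flow**2, kd * flow])

beta, *_ = np.linalg.lstsq(design(kd, flow), y, rcond=None)

# cheap Monte Carlo on the surface to get an output distribution
kd_mc = rng.uniform(1.0, 10.0, 100_000)
flow_mc = rng.uniform(0.1, 1.0, 100_000)
y_mc = design(kd_mc, flow_mc) @ beta
print("surrogate mean:", y_mc.mean(), " 95th percentile:", np.percentile(y_mc, 95))
```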

  1. Assessment of Prospective Physician Characteristics by SWOT Analysis.

    Science.gov (United States)

    Thira, Woratanarat; Patarawan, Woratanarat

    2012-01-01

    Thailand is one of the developing countries encountering a medical workforce shortage. From the national registry in 2006, there were 33 166 physicians: 41.5% worked in the government sector, 21.6% worked in the private sector, and the remainder worked in non-medical fields. There are no current data to confirm the effectiveness of the national policy to increase physician production. We present our findings from a strengths, weaknesses, opportunities, and threats (SWOT) analysis in medical students and the potential impact on national workforce planning. We introduced SWOT analysis to 568 medical students during the 2008-2010 academic years, with the objective of becoming "a good physician in the future". Pertinent issues were grouped into 4 categories: not wanting to be a doctor, having inadequate medical professional skills, not wanting to work in rural or community areas, and planning to pursue training in specialties with high salary/low workload/low risk of lawsuit. The percentages of medical students who described themselves as "do not want to be a doctor" and "do not want to work in rural or community areas" increased from 7.07% and 25.00% in 2008 to 12.56% and 29.65% in 2010, respectively. Further intervention should be considered in order to change medical students' attitudes toward the profession and their impact on the Thai health system.

  2. Efficiency analysis and assessment of interlocking PVC sheet piling walls

    International Nuclear Information System (INIS)

    Emam, A.A.

    2005-01-01

    The use of PVC sheet piling in marine environments offers a number of unique advantages, including weight savings, corrosion resistance and environmentally safe material. In this study, one of the widely used classical methods as well as a finite element analysis are used to analyze such sheet piling walls. The analysis focuses on the effect of some important parameters on the wall's global behavior, bending moments, stresses and deflections. The parameters include wall cross-section, wall height, embedment depth, number and spacing of anchor rods, and type of soil and loading conditions. Furthermore, the effect of the shape of the wall cross-section and the location of the interlocking joints has been studied by using plane frame and arch-like models. Results indicate that finite element modeling is an effective tool for numerical approximation of soil-structure interaction problems. The required theoretical embedment depth is nearly 30 % of the clear wall height. Also, the modulus of subgrade reaction has a minor effect on both cantilever walls and single-anchor sheet-pile walls. Finally, under lateral (horizontal) action, deep sections tend to behave like an arch under radial loading, which might increase normal stresses at some critical sections

  3. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large-scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process-related risk faced by KOC and included root-cause, management-system-related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network, with the scope divided into three major categories: (1) integrity management, (2) operations, and (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost-benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess bottlenecks, perform surge and erosion analysis, and identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad, three-pronged approach to their overall risk assessment, KOC achieved a thorough, root-cause-analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  4. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    International Nuclear Information System (INIS)

    Walsh, B.; Fisher, C.; Zigler, G.; Clark, R.A.

    1990-01-01

    Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 87 (707FSAR), and a Plant Safety Analysis Report. Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85) documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort

  5. The physical and mathematical model of dynamic economic analysis and assessment for NPP

    International Nuclear Information System (INIS)

    Xu Jiming

    1992-01-01

    A physical and mathematical model for the dynamic economic analysis of nuclear power plants was established, based on the internationally used sub-items and accounts of investment and the constant-money levelized-cost model, and combined with the economic analysis methods currently used in China. The model can be used for economic analysis not only of nuclear power plants but also of coal-fired power plants, and can satisfy the demands of economic analysis and assessment for both nuclear and conventional power plants
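
    A generic constant-money levelized unit cost calculation of the kind the abstract refers to can be sketched as follows; the paper's own account and sub-item structure is not reproduced, and all cash-flow figures are hypothetical:

```python
# Generic constant-money levelized unit cost calculation of the kind the
# abstract refers to; all cash flows and the discount rate are hypothetical.

def levelized_cost(capital_by_year, om_by_year, fuel_by_year, energy_by_year, rate):
    """Levelized cost = discounted total cost / discounted energy."""
    disc = lambda series: sum(v / (1 + rate) ** t for t, v in enumerate(series))
    total_cost = disc(capital_by_year) + disc(om_by_year) + disc(fuel_by_year)
    return total_cost / disc(energy_by_year)

# hypothetical 5-year construction + 3 operating years, constant money
capital = [400, 600, 600, 400, 200, 0, 0, 0]        # M$ per year
om      = [0, 0, 0, 0, 0, 60, 60, 60]
fuel    = [0, 0, 0, 0, 0, 40, 40, 40]
energy  = [0, 0, 0, 0, 0, 7.0, 7.5, 7.5]            # TWh per year
print(f"levelized cost = {levelized_cost(capital, om, fuel, energy, rate=0.05):.1f} $/MWh")
```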

  6. Use of Risk Analysis Frameworks in Urban Flood Assessments

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten; Madsen, Henrik

    … with better decision support tools. Some of the developments are risk frameworks that encompass economic and/or ethical evaluation of climate change adaptation options and improved risk management. This line of development is based on a societal evaluation of maximizing the outcome for society … in extreme precipitation has been observed, corresponding to an increase of design levels of at least 30 %. Analysis of climate change model output has given clear evidence that further increases in extreme precipitation must be expected in the future due to anthropogenic emissions of greenhouse gases … and planned urban drainage solutions are shared between very different stakeholders, and current practices are leading to personal bankruptcy for those bearing the highest costs. Therefore solutions must be developed that are understandable and can be communicated between different stakeholders…

  7. In-field analysis and assessment of nuclear material

    International Nuclear Information System (INIS)

    Morgado, R.E.; Myers, W.S.; Olivares, J.A.; Phillips, J.R.; York, R.L.

    1996-01-01

    Los Alamos National Laboratory has actively developed and implemented a number of instruments to monitor, detect, and analyze nuclear materials in the field. Many of these technologies, developed under existing US Department of Energy programs, can also be used to effectively interdict nuclear materials smuggled across or within national borders. In particular, two instruments are suitable for immediate implementation: the NAVI-2, a hand-held gamma-ray and neutron system for the detection and rapid identification of radioactive materials, and the portable mass spectrometer for the rapid analysis of minute quantities of radioactive materials. Both instruments provide not only critical information about the characteristics of the nuclear material for law-enforcement agencies and national authorities but also supply health and safety information for personnel handling the suspect materials

  8. Ebola Virus Training: A Needs Assessment and Gap Analysis.

    Science.gov (United States)

    Yeskey, Kevin; Hughes, Joseph; Galluzzo, Betsy; Jaitly, Nina; Remington, James; Weinstock, Deborah; Lee Pearson, Joy; Rosen, Jonathan D

    In response to the 2014 Ebola virus disease outbreak, the Worker Training Program embarked on an assessment of existing training for those at risk for exposure to the virus. Searches of the recent peer-reviewed literature were conducted for descriptions of relevant training. Federal guidance issued during 2015 was also reviewed. Four stakeholder meetings were conducted with representatives from health care, academia, private industry, and public health to discuss issues associated with ongoing training. Our results revealed few articles about training that provided sufficient detail to serve as models. Training programs struggled to adjust to frequently updated federal guidance. Stakeholders commented that most healthcare training focused solely on infection control, and there was an absence of employee health-related training for non-healthcare providers. Challenges to ongoing training included funding and organizational complacency. Best practices were noted where management and employees planned training cooperatively and where infection control, employee health, and hospital emergency managers worked together on the development of protective guidance. We conclude that sustainable training for infectious disease outbreaks requires annual funding, full support from organizational management, input from all stakeholders, and integration of infection control, emergency management, and employee health when implementing guidance and training.

  9. POISSON, Analysis Solution of Poisson Problems in Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1986-01-01

    1 - Description of program or function: Purpose of program: Analytic treatment of two-stage Poisson problem in Probabilistic Risk Assessment. Input: estimated a-priori mean failure rate and error factor of system considered (for calculation of stage-1 prior), number of failures and operating times for similar systems (for calculation of stage-2 prior). Output: a-posteriori probability distributions on linear and logarithmic time scale (on specified time grid) and expectation values of failure rate and error factors are calculated for: - stage-1 a-priori distribution, - stage-1 a-posteriori distribution, - stage-2 a-priori distribution, - stage-2 a-posteriori distribution. 2 - Method of solution: Bayesian approach with conjugate stage-1 prior, improved with experience from similar systems to yield stage-2 prior, and likelihood function from experience with system under study (documentation see below under 10.). 3 - Restrictions on the complexity of the problem: Up to 100 similar systems (including the system considered), arbitrary number of problems (failure types) with same grid
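
    The two-stage updating described above can be illustrated with conjugate gamma-Poisson arithmetic. The sketch below is not the POISSON code itself; the lognormal-to-gamma moment matching and all numerical values are assumptions chosen only to show the mechanics.

```python
# A minimal sketch of two-stage conjugate gamma-Poisson updating, in the spirit of
# the program described above. This is not the POISSON code: the lognormal-to-gamma
# moment matching and all numerical values are illustrative assumptions.
import numpy as np
from scipy import stats

def gamma_from_median_and_error_factor(median, error_factor):
    """Match a lognormal prior (given median and 95th/50th percentile error factor)
    to a gamma distribution by equating mean and variance."""
    sigma = np.log(error_factor) / 1.645        # lognormal shape parameter
    mean = median * np.exp(0.5 * sigma**2)
    var = mean**2 * (np.exp(sigma**2) - 1.0)
    alpha = mean**2 / var                       # gamma shape
    beta = mean / var                           # gamma rate (per unit time)
    return alpha, beta

# Stage 1: prior from an estimated failure rate of 1e-3 per hour, error factor 10
alpha, beta = gamma_from_median_and_error_factor(1e-3, 10.0)

# Stage 2: update with pooled experience from similar systems (4 failures in 20,000 h)
alpha, beta = alpha + 4, beta + 2.0e4

# Posterior: update with the system under study (1 failure in 5,000 h)
alpha, beta = alpha + 1, beta + 5.0e3

posterior = stats.gamma(a=alpha, scale=1.0 / beta)
print("posterior mean failure rate (/h):", posterior.mean())
print("90% credible interval (/h):", posterior.ppf([0.05, 0.95]))
```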

  10. Tiger Team Assessments seventeen through thirty-five: A summary and analysis

    International Nuclear Information System (INIS)

    1992-12-01

    This report provides a summary and analysis of the Department of Energy's (DOE's) 19 Tiger Team Assessments that were conducted from October 1990 to July 1992. The sites are listed in the box below, along with their respective program offices and assessment completion dates. This analysis relied solely on the information contained in the Tiger Team Assessment Reports. The findings and concerns documented by the Tiger Teams provide a database of information about the then-current ES&H programs and practices. Program Secretarial Officers (PSOs) and field managers may use this information, along with other sources (such as the Corrective Action Plans, Progress Assessments, and Self-Assessments), to address the ES&H deficiencies found, prioritize and plan appropriate corrective actions, measure progress toward solving the problems, strengthen and transfer knowledge about areas where site performance exemplified the ES&H mindset, and so forth. Further analyses may be suggested by the analysis presented in this report.

  11. Preview of the Mission Assurance Analysis Protocol (MAAP): Assessing Risk and Opportunity in Complex Environments

    National Research Council Canada - National Science Library

    Alberts, Christopher; Dorofee, Audrey; Marino, Lisa

    2008-01-01

    .... A MAAP assessment provides a systematic, in-depth analysis of the potential for success in distributed, complex, and uncertain environments and can be applied across the life cycle and throughout the supply chain...

  12. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  13. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    Science.gov (United States)

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, namely in human health where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.

  14. Army Enlisted Personnel Competency Assessment Program Phase 1. Volume 1: Needs Analysis

    National Research Council Canada - National Science Library

    Knapp, Deirdre

    2004-01-01

    .... The PerformM21 program has two mutually supporting tracks. The first is a needs analysis that will result in design recommendations and identification of issues related to implementation of a competency assessment program...

  15. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    Science.gov (United States)

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  16. Evaluation of auto-assessment method for C-D analysis based on support vector machine

    International Nuclear Information System (INIS)

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Kamihira, Hiroaki; Kishimoto, Tomonari; Goto, Hiroya

    2010-01-01

    Contrast-Detail (C-D) analysis is one of the visual quality assessment methods in medical imaging, and many auto-assessment methods for C-D analysis have been developed in recent years. However, for the auto-assessment method for C-D analysis, the effects of nonlinear image processing are not clear. So, we have made an auto-assessment method for C-D analysis using a support vector machine (SVM), and have evaluated its performance for the images processed with a noise reduction method. The feature indexes used in the SVM were the normalized cross correlation (NCC) coefficient on each signal between the noise-free and noised image, the contrast to noise ratio (CNR) on each signal, the radius of each signal, and the Student's t-test statistic for the mean difference between the signal and background pixel values. The results showed that the auto-assessment method for C-D analysis by using Student's t-test statistic agreed well with the visual assessment for the non-processed images, but disagreed for the images processed with the noise reduction method. Our results also showed that the auto-assessment method for C-D analysis by the SVM made of NCC and CNR agreed well with the visual assessment for the non-processed and noise-reduced images. Therefore, the auto-assessment method for C-D analysis by the SVM will be expected to have the robustness for the non-linear image processing. (author)
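
    As a concrete illustration of the approach described above, the following sketch trains an SVM on the four feature types named in the abstract (NCC, CNR, signal radius, t-statistic). The data are synthetic and the scikit-learn pipeline is an assumption; it only shows the shape of such a classifier, not the authors' implementation.

```python
# Illustrative sketch of SVM-based C-D scoring. Feature values are synthetic; the
# feature names (NCC, CNR, radius, t-statistic) follow the abstract, but the
# pipeline details are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# One row per signal: [NCC, CNR, radius_mm, t_statistic]; label 1 = "detectable"
X = np.vstack([
    rng.normal([0.9, 3.0, 4.0, 6.0], 0.3, size=(50, 4)),   # clearly visible signals
    rng.normal([0.3, 0.8, 1.0, 1.2], 0.3, size=(50, 4)),   # near-threshold signals
])
y = np.array([1] * 50 + [0] * 50)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)

# Predicted detectability for a new contrast-detail combination
print(clf.predict([[0.6, 1.5, 2.0, 2.5]]))
```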

  17. Cost Analysis of Water Transport for Climate Change Impact Assessment

    Science.gov (United States)

    Szaleniec, V.; Buytaert, W.

    2012-04-01

    It is expected that climate change will have a strong impact on water resources worldwide. Many studies exist that couple the output of global climate models with hydrological models to assess the impact of climate change on physical water availability. However, the water resources topology of many regions, and especially that of cities, can be very complex. Changes in physical water availability therefore do not translate easily into impacts on water resources for cities. This is especially the case for cities with a complex water supply topology, for instance because of geographical barriers, strong gradients in precipitation patterns, or competing water uses. In this study we explore the use of cost maps to enable the inclusion of water supply topologies in climate change impact studies. We use the city of Lima as a case study. Lima is the second largest desert city in the world. Although Peru as a whole has no water shortage, extreme gradients exist. Most of the economic activities, including those of the city of Lima, are located in the coastal desert. This region is geographically disconnected from the wet Amazon basin because of the Andes mountain range. Hence, water supply is precarious, provided by a complex combination of high mountain ecosystems including wetlands and glaciers, as well as groundwater aquifers depending on recharge from the mountains. We investigate the feasibility and costs of different water abstraction scenarios and the impact of climate change using cost functions for different resources. The option of building inter-basin tunnels across the Andes is compared to the costs of desalinating seawater from the Pacific Ocean under different climate change scenarios and population growth scenarios. This approach yields recommendations for the most cost-effective options for the future.

  18. Assessment of the TRINO reactor pressure vessel integrity: theoretical analysis and NDE

    Energy Technology Data Exchange (ETDEWEB)

    Milella, P P; Pini, A [ENEA, Rome (Italy)]

    1988-12-31

    This document presents the method used for the capability assessment of the Trino reactor pressure vessel. The vessel integrity assessment is divided into the following parts: transients evaluation and selection, fluence estimate for the projected end of life of the vessel, characterization of unirradiated and irradiated materials, thermal and stress analysis, fracture mechanics analysis and eventually fracture input to Non Destructive Examination (NDE). For each part, results are provided. (TEC).

  19. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Full Text Available The hydrological frequency analysis (HFA) is the foundation for the hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series to the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most of hydrological extreme data obtained in practical application, the size of the samples is usually small, for example, in China about 40~50 years. Generally, samples with small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of design value is constructed; based on the sampling distribution, the uncertainty of quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimation of a design value but also quantitative evaluation on uncertainties of the estimation.
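
    A minimal sketch of the resampling loop described above is given below. The use of a GEV distribution, the 45-year synthetic record, and the 100-year return period are illustrative assumptions, not details taken from the paper.

```python
# Bootstrap estimate of the sampling uncertainty of a design quantile: resample the
# annual-maximum series, refit a distribution, and collect the design value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=1000, scale=300, size=45,
                                     random_state=rng)   # ~45 years of record

return_period = 100
p = 1.0 - 1.0 / return_period
n_boot = 2000
q_boot = np.empty(n_boot)

for i in range(n_boot):
    sample = rng.choice(annual_maxima, size=annual_maxima.size, replace=True)
    c, loc, scale = stats.genextreme.fit(sample)
    q_boot[i] = stats.genextreme.ppf(p, c, loc=loc, scale=scale)

print("design value (median of bootstrap):", np.median(q_boot))
print("90% sampling interval:", np.percentile(q_boot, [5, 95]))
```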

  20. Analysis of Hydrological Sensitivity for Flood Risk Assessment

    Directory of Open Access Journals (Sweden)

    Sanjay Kumar Sharma

    2018-02-01

    Full Text Available In order for the Indian government to maximize Integrated Water Resource Management (IWRM), the Brahmaputra River has played an important role in the undertaking of the Pilot Basin Study (PBS) due to the Brahmaputra River’s annual regional flooding. The selected Kulsi River—a part of Brahmaputra sub-basin—experienced severe floods in 2007 and 2008. In this study, the Rainfall-Runoff-Inundation (RRI) hydrological model was used to simulate the recent historical flood in order to understand and improve the integrated flood risk management plan. The ultimate objective was to evaluate the sensitivity of hydrologic simulation using different Digital Elevation Model (DEM) resources, coupled with DEM smoothing techniques, with a particular focus on the comparison of river discharge and flood inundation extent. As a result, the sensitivity analysis showed that, among the input parameters, the RRI model is highly sensitive to Manning’s roughness coefficient values for flood plains, followed by the source of the DEM, and then soil depth. After optimizing its parameters, the simulated inundation extent showed that the smoothing filter was more influential than its simulated discharge at the outlet. Finally, the calibrated and validated RRI model simulations agreed well with the observed discharge and the Moderate Resolution Imaging Spectroradiometer (MODIS)-detected flood extents.

  1. Decay assessment through thermographic analysis in architectural and archaeological heritage

    Science.gov (United States)

    Gomez-Heras, Miguel; Martinez-Perez, Laura; Fort, Rafael; Alvarez de Buergo, Monica

    2010-05-01

    Any exposed stone-built structure is subject to thermal variations due to daily, seasonal and secular environmental temperature changes. Surface temperature is a function of air temperature (due to convective heat transfer) and of infrared radiation received through insolation. While convective heat transfer homogenizes surface temperature, stone response to insolation is much more complex and the temporal and spatial temperature differences across structures are enhanced. Surface temperature in stone-built structures will be affected by orientation, sunlight inclination and the complex patterns of light and shadows generated by the often intricate morphology of historical artefacts and structures. Surface temperature will also be affected by different material properties, such as albedo, thermal conductivity, transparency and absorbance to infrared radiation of minerals and rocks. Moisture and the occurrence of salts will also be a factor affecting surface temperatures. Surface temperatures may also be affected by physical disruptions of rocks due to differences in thermal inertia generated by cracks and other discontinuities. Thermography is a non-invasive, non-destructive technique that measures temperature variations on the surface of a material. With this technique, surface temperature rates of change and their spatial variations can be analysed. This analysis may be used not only to evaluate the incidence of thermal decay as a factor that generates or enhances stone decay, but also to detect and evaluate other factors that affect the state of conservation of architectural and archaeological heritage, for example moisture, salts or mechanical disruptions.

  2. Assessment of homogeneity of regions for regional flood frequency analysis

    Science.gov (United States)

    Lee, Jeong Eun; Kim, Nam Won

    2016-04-01

    This paper analyzed the effect of rainfall on hydrological similarity, which is an important step for regional flood frequency analysis (RFFA). For the RFFA, the storage function method (SFM) using a spatial extension technique was applied to the 22 sub-catchments that are partitioned from the Chungju dam watershed in the Republic of Korea. We used the SFM to generate the annual maximum floods for the 22 sub-catchments using annual maximum storm events (1986~2010) as input data. Then the quantiles of rainfall and flood were estimated using the annual maximum series for the 22 sub-catchments. Finally, spatial variations in terms of the two quantiles were analyzed. As a result, there was a significant correlation between the spatial variations of the two quantiles. This result demonstrates that the spatial variation of rainfall is an important factor to explain the homogeneity of regions when applying RFFA. Acknowledgements: This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  3. Job analysis and student assessment tool: perfusion education clinical preceptor.

    Science.gov (United States)

    Riley, Jeffrey B

    2007-09-01

    The perfusion education system centers on the cardiac surgery operating room and the perfusionist teacher who serves as a preceptor for the perfusion student. One method to improve the quality of perfusion education is to create a valid method for perfusion students to give feedback to clinical teachers. The preceptor job analysis consisted of a literature review and interviews with preceptors to list their critical tasks, critical incidents, and cognitive and behavioral competencies. Behaviorally anchored rating traits associated with the preceptors' tasks were identified. Students voted to validate the instrument items. The perfusion instructor rating instrument with a 0-4, "very weak" to "very strong" Likert rating scale was used. The five preceptor traits for student evaluation of clinical instruction (SECI) are as follows: The clinical instructor (1) encourages self-learning, (2) encourages clinical reasoning, (3) meets the student's learning needs, (4) gives continuous feedback, and (5) represents a good role model. Scores from 430 student-preceptor relationships for 28 students rotating at 24 affiliate institutions with 134 clinical instructors were evaluated. The mean overall good preceptor average (GPA) was 3.45 +/- 0.76 and was skewed to the left, ranging from 0.0 to 4.0 (median = 3.8). Only 21 of the SECI relationships earned a low GPA. The SECI is a method to provide valid information to improve the quality of a perfusion education program.

  4. X-ray quality assessment by MTF analysis

    International Nuclear Information System (INIS)

    Gais, P.; Burger, G.; Drexler, G.; Rappl, M.; Bunde, E.; Lissner, J.; Schaetzl, M.

    1985-01-01

    In a previous study a lucite phantom with several physical elements embedded, such as lead gratings with varying line widths, etc., was exposed at 200 X-ray installations in Bavaria, Federal Republic of Germany, by a conventional diagnostic standard technique. One of the parameters investigated was the local resolution achieved, determined visually by examination of the grating images. The same radiographs have now been used in a retrospective comparative study, based upon a quantitative analysis of TV images of the films. The films were positioned on a commercial illuminator screen and looked at by a TV camera through a simple magnifying optical system. The video signals were digitised, resulting in a pixel distance of 50 μm. In a first approach the edges of the broad frames of the lead gratings were adjusted vertically and the normalised sum of all 256 TV scanning lines taken as the edge function f(x). This was differentiated and suitably Fourier-transformed, delivering the modulation transfer function (MTF). The MTF can be analysed in several ways to describe quantitatively the maximum local resolution achievable. Correlation of some frequency measurements with the visually determined line resolution is generally good. (author)
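
    The processing chain described above (edge profile, derivative, Fourier transform) can be sketched in a few lines. The synthetic tanh edge and the 10% MTF criterion below are assumptions for illustration; only the 50 μm pixel distance is taken from the abstract.

```python
# Edge spread function -> line spread function -> MTF, as outlined in the abstract.
import numpy as np

pixel_pitch_mm = 0.05                       # 50 micrometre sampling distance

# Synthetic edge spread function f(x): a blurred step, one value per pixel column
x = np.arange(256) * pixel_pitch_mm
esf = 0.5 * (1.0 + np.tanh((x - x.mean()) / 0.1))

lsf = np.gradient(esf, pixel_pitch_mm)      # line spread function f'(x)
lsf /= lsf.sum()                            # normalize so that MTF(0) = 1

mtf = np.abs(np.fft.rfft(lsf))
freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # cycles per mm

# Report the spatial frequency where the MTF falls to 10% (a common resolution limit)
limit = freqs[np.argmax(mtf < 0.1)]
print(f"10% MTF frequency: {limit:.1f} lp/mm")
```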

  5. Reporting of covariate selection and balance assessment in propensity score analysis is suboptimal: A systematic review

    NARCIS (Netherlands)

    Ali, M. Sanni; Groenwold, Rolf H.H.; Belitser, S.; Pestman, Wiebe R.; Hoes, Arno W.; Roes, Kit C.B.; Boer, Anthonius De; Klungel, Olaf H.

    2015-01-01

    Objectives To assess the current practice of propensity score (PS) analysis in the medical literature, particularly the assessment and reporting of balance on confounders. Study Design and Setting A PubMed search identified studies using PS methods from December 2011 through May 2012. For each

  6. Peer Assessment in the Digital Age: A Meta-Analysis Comparing Peer and Teacher Ratings

    Science.gov (United States)

    Li, Hongli; Xiong, Yao; Zang, Xiaojiao; Kornhaber, Mindy L.; Lyu, Youngsun; Chung, Kyung Sun; Suen, Hoi K.

    2016-01-01

    Given the wide use of peer assessment, especially in higher education, the relative accuracy of peer ratings compared to teacher ratings is a major concern for both educators and researchers. This concern has grown with the increase of peer assessment in digital platforms. In this meta-analysis, using a variance-known hierarchical linear modelling…

  7. Cost analysis of breast cancer diagnostic assessment programs.

    Science.gov (United States)

    Honein-AbouHaidar, G N; Hoch, J S; Dobrow, M J; Stuart-McEwan, T; McCready, D R; Gagliardi, A R

    2017-10-01

    Diagnostic assessment programs (daps) appear to improve the diagnosis of cancer, but evidence of their cost-effectiveness is lacking. Given that no earlier study used secondary financial data to estimate the cost of diagnostic tests in the province of Ontario, we explored how to use secondary financial data to retrieve the cost of key diagnostic test services in daps, and we tested the reliability of that cost-retrieving method with hospital-reported costs in preparation for future cost-effectiveness studies. We powered our sample at an alpha of 0.05, a power of 80%, and a margin of error of ±5%, and randomly selected a sample of eligible patients referred to a dap for suspected breast cancer during 1 January-31 December 2012. Confirmatory diagnostic tests received by each patient were identified in medical records. Canadian Classification of Health Intervention procedure codes were used to search the secondary financial data Web portal at the Ontario Case Costing Initiative for an estimate of the direct, indirect, and total costs of each test. The hospital-reported cost of each test received was obtained from the host-hospital's finance department. Descriptive statistics were used to calculate the cost of individual or group confirmatory diagnostic tests, and the Wilcoxon signed-rank test or the paired t-test was used to compare the Ontario Case Costing Initiative and hospital-reported costs. For the 191 identified patients with suspected breast cancer, the estimated total cost of $72,195.50 was not significantly different from the hospital-reported total cost of $72,035.52 ( p = 0.24). Costs differed significantly when multiple tests to confirm the diagnosis were completed during one patient visit and when confirmatory tests reported in hospital data and in medical records were discrepant. The additional estimated cost for non-salaried physicians delivering diagnostic services was $28,387.50. It was feasible to use secondary financial data to retrieve the cost

  8. A new tool for risk analysis and assessment in petrochemical plants

    Directory of Open Access Journals (Sweden)

    El-Arkam Mechhoud

    2016-09-01

    Full Text Available The aim of our work was the implementation of a new automated tool dedicated to risk analysis and assessment in petrochemical plants, based on a combination of two analysis methods: HAZOP (HAZard and OPerability) and FMEA (Failure Mode and Effect Analysis). Assessment of accident scenarios is also considered. The principal advantage of the two analysis methods is to speed up hazard identification and risk assessment and forecast the nature and impact of such accidents. Plant parameters are analyzed under a graphical interface to facilitate the exploitation of our developed approach. This automated analysis brings out the different deviations of the operating parameters of any system in the plant. Possible causes of these deviations, their consequences and preventive actions are identified. The result is risk minimization and dependability enhancement of the considered system.
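
    The FMEA half of such a tool is often reduced to ranking parameter deviations by a risk priority number, as sketched below; the scales and example deviations are generic assumptions, not values from the paper.

```python
# Rank HAZOP-style parameter deviations by an FMEA risk priority number
# (severity x occurrence x detection), all on illustrative 1-10 scales.
failure_modes = [
    # (deviation, severity, occurrence, detection)
    ("Reactor pressure HIGH",           9, 3, 4),
    ("Feed flow LOW",                   6, 5, 3),
    ("Cooling water temperature HIGH",  7, 4, 6),
]

ranked = sorted(
    ((dev, s * o * d) for dev, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for deviation, rpn in ranked:
    print(f"{deviation:35s} RPN = {rpn}")
```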

  9. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each HRA method; History and prospects of the use of the methods). (P.A.)

  10. A critical analysis of the impact assessment of environmental tritium

    International Nuclear Information System (INIS)

    Jain, Narendra; Bhatia, Arvind

    2013-01-01

    Tritium, a radionuclide of hydrogen, has a long half-life and is rapidly dispersed, but before becoming globally distributed, it represents a significant radiobiological risk to the local population exposed. It is produced naturally in the upper atmosphere by the interaction of cosmic rays with nitrogen and hydrogen. The tritons in the upper atmosphere are oxidized to tritiated water (HTO) and mix with the hydrosphere generally through the movement of air masses and precipitation. Terrestrially, tritium may be formed by the action of neutrons on lithium. There is apprehension, reflected in the recent controversy concerning the health and environmental impact of tritium, that it may end up as a worldwide contaminant in the final analysis. From many varied reports from different laboratories, it appears that projected levels for fusion reactors may also produce deleterious and detectable effects. The degree of concern over the tritium problem is evidenced by a rapid increase in publications on the health implications of environmental tritium. The present issues of controversy will be intensified as fusion reactor technology approaches the doorstep of the public and the possible health detriment from its radioactive emissions arouses concern. The current project has been planned keeping some such points in view. It reviews the work on the behavior of tritium in its various forms in the environment with an emphasis on the release from various sources, its world inventories at present levels and its transfer into the various compartments of ecosystems. Besides this, its metabolism in biosystems and the possible implications of low doses of tritium in present and future generations have also been discussed. (author)

  11. Unmanned Aerial Vehicle (UAV) data analysis for fertilization dose assessment

    Science.gov (United States)

    Kavvadias, Antonis; Psomiadis, Emmanouil; Chanioti, Maroulio; Tsitouras, Alexandros; Toulios, Leonidas; Dercas, Nicholas

    2017-10-01

    The growth rate monitoring of crops throughout their biological cycle is very important as it contributes to the achievement of a uniformly optimum production, proper harvest planning, and reliable yield estimation. Fertilizer application often dramatically increases crop yields, but it is necessary to find out the ideal amount that has to be applied in the field. Remote sensing collects spatially dense information that may contribute to, or provide feedback about, fertilization management decisions. The goal is to accurately predict the amount of fertilizer needed so as to attain an ideal crop yield without excessive use of fertilizers, which causes financial loss and negative environmental impacts. The comparison of the reflectance values at different wavelengths, utilizing suitable vegetation indices, is commonly used to determine plant vigor and growth. Unmanned Aerial Vehicles (UAVs) have several advantages: they can be deployed quickly and repeatedly, they are flexible regarding flying height and timing of missions, and they can obtain very high-resolution imagery. In an experimental crop field in Eleftherio, Larissa, Greece, different doses of pre-plant and in-season fertilization were applied in 27 plots. A total of 102 aerial photos were taken in two flights using an Unmanned Aerial Vehicle, based on the scheduled fertilization. A correlation of the experimental fertilization with the change of vegetation index values and with the increase of the vegetation cover rate during those days was made. The results of the analysis provide useful information regarding the vigor and crop growth rate performance of the various doses of fertilization.
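
    The index comparison mentioned above typically reduces to simple band arithmetic. The sketch below computes the Excess Green index (usable with an RGB camera) and NDVI (which assumes a near-infrared band is available); band availability and the placeholder reflectance values are assumptions, since the abstract does not specify the sensor bands.

```python
# Two common vegetation indices computed per pixel from placeholder reflectance arrays.
import numpy as np

red   = np.array([[0.10, 0.12], [0.20, 0.22]])
green = np.array([[0.25, 0.28], [0.18, 0.17]])
blue  = np.array([[0.08, 0.09], [0.10, 0.11]])
nir   = np.array([[0.45, 0.50], [0.30, 0.28]])   # only if a NIR band is available

exg  = 2.0 * green - red - blue                  # Excess Green, usable with RGB cameras
ndvi = (nir - red) / (nir + red + 1e-9)          # NDVI, requires a near-infrared band

print("ExG:\n", exg)      # higher values indicate denser, more vigorous vegetation
print("NDVI:\n", ndvi)
```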

  12. FEBEX II Project Post-mortem analysis EDZ assessment

    International Nuclear Information System (INIS)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-01-01

    variations in the propagation velocities of acoustic waves. A cylindrical block of granite 38.8 cm in diameter and 40 cm high has been analysed along 2D transversal sections in six radial directions. Different inverse tomographic strategies have been used to analyse the measured data, which showed no evidence of the existence of an EDZ in the FEBEX gallery. However, a preferential direction in the wave propagation similar to the maximum compression direction of the stress tensor has appeared. As for in situ investigations, the hydraulic connectivity of the drift has been assessed at eleven locations in the heated area, including granite matrix and lamprophyre dykes, and at six locations in undisturbed zones. In the granite matrix area, a pulse test using pressurised air with stepwise pressure increases was conducted to determine gas entry pressure. In the fractured area, a gas constant flow rate injection test was conducted. Only two locations with higher permeability were detected; one in a natural fracture in the lamprophyre dyke and the other in the interface between lamprophyre and granite. Where numerical investigations are concerned, several analyses of the FEBEX in situ experiment were carried out to determine whether the generation of a potential EDZ in the surrounding rock was possible or not. Stresses have been calculated by a 1D fully coupled thermo-hydromechanical model and by 2D and 3D thermo-mechanical models. Results compared with the available data on compressive strength of the Grimsel granite show that in the worst case studied, the state of stresses induced by the excavation and the heating phases remains far below the critical curve. (Author)

  13. FEBEX II Project Post-mortem analysis EDZ assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-07-01

    variations in the propagation velocities of acoustic waves. A cylindrical block of granite 38.8 cm in diameter and 40 cm high has been analysed along 2D transversal sections in six radial directions. Different inverse tomographic strategies have been used to analyse the measured data, which showed no evidence of the existence of an EDZ in the FEBEX gallery. However, a preferential direction in the wave propagation similar to the maximum compression direction of the stress tensor has appeared. As for in situ investigations, the hydraulic connectivity of the drift has been assessed at eleven locations in the heated area, including granite matrix and lamprophyre dykes, and at six locations in undisturbed zones. In the granite matrix area, a pulse test using pressurised air with stepwise pressure increases was conducted to determine gas entry pressure. In the fractured area, a gas constant flow rate injection test was conducted. Only two locations with higher permeability were detected; one in a natural fracture in the lamprophyre dyke and the other in the interface between lamprophyre and granite. Where numerical investigations are concerned, several analyses of the FEBEX in situ experiment were carried out to determine whether the generation of a potential EDZ in the surrounding rock was possible or not. Stresses have been calculated by a 1D fully coupled thermo-hydromechanical model and by 2D and 3D thermo-mechanical models. Results compared with the available data on compressive strength of the Grimsel granite show that in the worst case studied, the state of stresses induced by the excavation and the heating phases remains far below the critical curve. (Author)

  14. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    Science.gov (United States)

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  15. Stockholm Safety Conference. Analysis of the sessions on radiological protection, licensing and risk assessment

    International Nuclear Information System (INIS)

    Gea, A.

    1981-01-01

    A summary of the sessions on radiological protection, licensing and risk assessment at the safety conference in Stockholm is presented. The new points of view on nuclear safety, probabilistic analysis, component failure probability and accident analysis are considered. Conclusions applicable in many cases to developing countries are included. (author)

  16. Using Citation Analysis Methods to Assess the Influence of Science, Technology, Engineering, and Mathematics Education Evaluations

    Science.gov (United States)

    Greenseid, Lija O.; Lawrenz, Frances

    2011-01-01

    This study explores the use of citation analysis methods to assess the influence of program evaluations conducted within the area of science, technology, engineering, and mathematics (STEM) education. Citation analysis is widely used within scientific research communities to measure the relative influence of scientific research enterprises and/or…

  17. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  18. Assessing the Social Acceptability of the Functional Analysis of Problem Behavior

    Science.gov (United States)

    Langthorne, Paul; McGill, Peter

    2011-01-01

    Although the clinical utility of the functional analysis is well established, its social acceptability has received minimal attention. The current study assessed the social acceptability of functional analysis procedures among 10 parents and 3 teachers of children who had recently received functional analyses. Participants completed a 9-item…

  19. Seeking missing pieces in science concept assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch analysis

    Science.gov (United States)

    Ding, Lin

    2014-02-01

    Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses. Differing from typical concept inventories that focus only on one topic of a subject area, BEMA covers a broad range of topics in the electromagnetism domain. In spite of this fact, prior studies exclusively used a single aggregate score to represent individual students' overall understanding of E&M without explicating the construct of this assessment. Additionally, BEMA has been used to compare traditional physics courses with a reformed course entitled Matter and Interactions (M&I). While prior findings were in favor of M&I, no empirical evidence was sought to rule out possible differential functioning of BEMA that may have inadvertently advantaged M&I students. In this study, we used Rasch analysis to seek two missing pieces regarding the construct and differential functioning of BEMA. Results suggest that although BEMA items generally can function together to measure the same construct of application and analysis of E&M concepts, several items may need further revision. Additionally, items that demonstrate differential functioning for the two courses are detected. Issues such as item contextual features and student familiarity with question settings may underlie these findings. This study highlights often overlooked threats in science concept assessments and provides an exemplar for using evidence-based reasoning to make valid inferences and arguments.

  20. Seeking missing pieces in science concept assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch analysis

    Directory of Open Access Journals (Sweden)

    Lin Ding

    2014-02-01

    Full Text Available Discipline-based science concept assessments are powerful tools to measure learners’ disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses. Differing from typical concept inventories that focus only on one topic of a subject area, BEMA covers a broad range of topics in the electromagnetism domain. In spite of this fact, prior studies exclusively used a single aggregate score to represent individual students’ overall understanding of E&M without explicating the construct of this assessment. Additionally, BEMA has been used to compare traditional physics courses with a reformed course entitled Matter and Interactions (M&I). While prior findings were in favor of M&I, no empirical evidence was sought to rule out possible differential functioning of BEMA that may have inadvertently advantaged M&I students. In this study, we used Rasch analysis to seek two missing pieces regarding the construct and differential functioning of BEMA. Results suggest that although BEMA items generally can function together to measure the same construct of application and analysis of E&M concepts, several items may need further revision. Additionally, items that demonstrate differential functioning for the two courses are detected. Issues such as item contextual features and student familiarity with question settings may underlie these findings. This study highlights often overlooked threats in science concept assessments and provides an exemplar for using evidence-based reasoning to make valid inferences and arguments.
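
    For readers unfamiliar with the technique, the dichotomous Rasch model underlying such an analysis expresses the probability of a correct response to item i by person n through a person ability parameter and an item difficulty parameter; this is the standard textbook form, included here only as background, not as a result of the study.

$$
P(X_{ni}=1 \mid \theta_n, \delta_i) \;=\; \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}
$$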

  1. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  2. Scenario Analysis for the Safety Assessment of Nuclear Waste Repositories: A Critical Review.

    Science.gov (United States)

    Tosoni, Edoardo; Salo, Ahti; Zio, Enrico

    2018-04-01

    A major challenge in scenario analysis for the safety assessment of nuclear waste repositories pertains to the comprehensiveness of the set of scenarios selected for assessing the safety of the repository. Motivated by this challenge, we discuss the aspects of scenario analysis relevant to comprehensiveness. Specifically, we note that (1) it is necessary to make it clear why scenarios usually focus on a restricted set of features, events, and processes; (2) there is not yet consensus on the interpretation of comprehensiveness for guiding the generation of scenarios; and (3) there is a need for sound approaches to the treatment of epistemic uncertainties. © 2017 Society for Risk Analysis.

  3. Risk assessment model for nuclear accident emergency protection countermeasure based on fuzzy matter-element analysis

    International Nuclear Information System (INIS)

    Xin Jing; Tang Huaqing; Zhang Yinghua; Zhang Limin

    2009-01-01

    A risk assessment model for nuclear accident emergency protection countermeasures based on fuzzy matter-element analysis and the Euclidean approach degree is proposed in this paper. The weights of the assessed indexes are determined by information entropy together with expert scoring, which not only makes full use of the inherent information of the indexes but also effectively reduces subjective assumptions in the course of the assessment. The applied result shows that it is reasonable to adopt the model for risk assessment of nuclear accident emergency protection countermeasures, and that it can serve as an effective analytical method and decision-making basis for choosing the optimum protection countermeasure. (authors)
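
    The entropy-weighting step mentioned above can be sketched as follows; the decision matrix and the specific normalization convention are assumptions chosen for illustration, not the paper's data.

```python
# Information-entropy weighting of assessment indexes: indexes whose values vary
# more across the alternatives (lower entropy) receive larger weights.
import numpy as np

# Rows: candidate protection countermeasures; columns: assessment indexes
X = np.array([[0.8, 0.6, 0.9],
              [0.5, 0.9, 0.4],
              [0.7, 0.7, 0.6]])

P = X / X.sum(axis=0)                              # column-normalized proportions
k = 1.0 / np.log(X.shape[0])
entropy = -k * np.sum(P * np.log(P + 1e-12), axis=0)
weights = (1.0 - entropy) / np.sum(1.0 - entropy)  # lower entropy -> higher weight

print("index weights:", weights)
```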

  4. AADL Fault Modeling and Analysis Within an ARP4761 Safety Assessment

    Science.gov (United States)

    2014-10-01

    The report covers mapping to the OpenFTA format file, mapping to a generic XML format, and AADL and FTA mapping rules. It references the ARP4761 analyses: Preliminary System Safety Assessment (PSSA), System Safety Assessment (SSA), Common Cause Analysis (CCA), Fault Tree Analysis (FTA), Failure Modes and Effects Analysis (FMEA), Failure Modes and Effects Summary, Markov Analysis (MA), and Dependence Diagrams (DDs), also referred to as Reliability Block Diagrams (RBDs).

  5. DEFINING THE RELEVANT OUTCOME MEASURES IN MEDICAL DEVICE ASSESSMENTS: AN ANALYSIS OF THE DEFINITION PROCESS IN HEALTH TECHNOLOGY ASSESSMENT.

    Science.gov (United States)

    Jacobs, Esther; Antoine, Sunya-Lee; Prediger, Barbara; Neugebauer, Edmund; Eikermann, Michaela

    2017-01-01

    Defining relevant outcome measures for clinical trials on medical devices (MD) is complex, as there is a large variety of potentially relevant outcomes. The chosen outcomes vary widely across clinical trials, making the assessment in evidence syntheses very challenging. The objective is to provide an overview of the current common procedures of health technology assessment (HTA) institutions in defining outcome measures in MD trials. In 2012-14, the Web pages of 126 institutions involved in HTA were searched for methodological manuals written in English or German that describe methods for the predefinition process of outcome measures. Additionally, the institutions were contacted by email. Relevant information was extracted. All process steps were performed independently by two reviewers. Twenty-four manuals and ten responses from the email request were included in the analysis. Overall, 88.5 percent of the institutions describe the type of outcomes that should be considered in detail and 84.6 percent agree that the main focus should be on patient-relevant outcomes. Specifically related to MD, information could be obtained in 26 percent of the included manuals and email responses. Eleven percent of the institutions report a particular consideration of MD-related outcomes. This detailed analysis on common procedures of HTA institutions in the context of defining relevant outcome measures for the assessment of MD shows that standardized procedures for MD from the perspective of HTA institutions are not widespread. This leads to the question of whether a homogeneous approach should be implemented in the field of HTA on MD.

  6. Validity as a social imperative for assessment in health professions education: a concept analysis.

    Science.gov (United States)

    Marceau, Mélanie; Gallagher, Frances; Young, Meredith; St-Onge, Christina

    2018-06-01

    Assessment can have far-reaching consequences for future health care professionals and for society. Thus, it is essential to establish the quality of assessment. Few modern approaches to validity are well situated to ensure the quality of complex assessment approaches, such as authentic and programmatic assessments. Here, we explore and delineate the concept of validity as a social imperative in the context of assessment in health professions education (HPE) as a potential framework for examining the quality of complex and programmatic assessment approaches. We conducted a concept analysis using Rodgers' evolutionary method to describe the concept of validity as a social imperative in the context of assessment in HPE. Supported by an academic librarian, we developed and executed a search strategy across several databases for literature published between 1995 and 2016. From a total of 321 citations, we identified 67 articles that met our inclusion criteria. Two team members analysed the texts using a specified approach to qualitative data analysis. Consensus was achieved through full team discussions. Attributes that characterise the concept were: (i) demonstration of the use of evidence considered credible by society to document the quality of assessment; (ii) validation embedded through the assessment process and score interpretation; (iii) documented validity evidence supporting the interpretation of the combination of assessment findings, and (iv) demonstration of a justified use of a variety of evidence (quantitative and qualitative) to document the quality of assessment strategies. The emerging concept of validity as a social imperative highlights some areas of focus in traditional validation frameworks, whereas some characteristics appear unique to HPE and move beyond traditional frameworks. The study reflects the importance of embedding consideration for society and societal concerns throughout the assessment and validation process, and may represent a

  7. Computer-assisted liver graft steatosis assessment via learning-based texture analysis.

    Science.gov (United States)

    Moccia, Sara; Mattos, Leonardo S; Patrini, Ilaria; Ruperti, Michela; Poté, Nicolas; Dondero, Federica; Cauchy, François; Sepulveda, Ailton; Soubrane, Olivier; De Momi, Elena; Diaspro, Alberto; Cesaretti, Manuela

    2018-05-23

    Fast and accurate graft hepatic steatosis (HS) assessment is of primary importance for lowering liver dysfunction risks after transplantation. Histopathological analysis of biopsied liver is the gold standard for assessing HS, despite being invasive and time consuming. Due to the short time availability between liver procurement and transplantation, surgeons perform HS assessment through clinical evaluation (medical history, blood tests) and liver texture visual analysis. Despite visual analysis being recognized as challenging in the clinical literature, few efforts have been invested to develop computer-assisted solutions for HS assessment. The objective of this paper is to investigate the automatic analysis of liver texture with machine learning algorithms to automate the HS assessment process and offer support for the surgeon decision process. Forty RGB images of forty different donors were analyzed. The images were captured with an RGB smartphone camera in the operating room (OR). Twenty images refer to livers that were accepted and 20 to discarded livers. Fifteen randomly selected liver patches were extracted from each image. Patch size was [Formula: see text]. This way, a balanced dataset of 600 patches was obtained. Intensity-based features (INT), histogram of local binary pattern ([Formula: see text]), and gray-level co-occurrence matrix ([Formula: see text]) were investigated. Blood-sample features (Blo) were included in the analysis, too. Supervised and semisupervised learning approaches were investigated for feature classification. The leave-one-patient-out cross-validation was performed to estimate the classification performance. With the best-performing feature set ([Formula: see text]) and semisupervised learning, the achieved classification sensitivity, specificity, and accuracy were 95, 81, and 88%, respectively. This research represents the first attempt to use machine learning and automatic texture analysis of RGB images from ubiquitous smartphone
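
    To make the texture step concrete, the sketch below extracts an LBP histogram and two GLCM statistics with scikit-image (function names as in recent scikit-image releases); the synthetic patch and the specific parameter choices are assumptions, and the abstract's own feature set also includes intensity and blood-sample features.

```python
# Texture features of the kind described above: local binary pattern histogram and
# grey-level co-occurrence statistics for a single image patch.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(100, 100), dtype=np.uint8)   # stand-in liver patch

# Histogram of uniform local binary patterns (radius 1, 8 neighbours)
lbp = local_binary_pattern(patch, P=8, R=1, method="uniform")
lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

# Grey-level co-occurrence matrix and two derived statistics
glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
contrast = graycoprops(glcm, "contrast")[0, 0]
homogeneity = graycoprops(glcm, "homogeneity")[0, 0]

features = np.concatenate([lbp_hist, [contrast, homogeneity]])
print(features.shape)   # feature vector that could feed an SVM-style classifier
```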

  8. Femoral anteversion assessment: Comparison of physical examination, gait analysis, and EOS biplanar radiography.

    Science.gov (United States)

    Westberry, David E; Wack, Linda I; Davis, Roy B; Hardin, James W

    2018-05-01

    Multiple measurement methods are available to assess transverse plane alignment of the lower extremity. This study was performed to determine the extent of correlation between femoral anteversion assessment using simultaneous biplanar radiographs and three-dimensional modeling (EOS imaging), clinical hip rotation by physical examination, and dynamic hip rotation assessed by gait analysis. Seventy-seven patients with cerebral palsy (GMFCS Level I and II) and 33 neurologically typical children with torsional abnormalities completed a comprehensive gait analysis with same day biplanar anterior-posterior and lateral radiographs and three-dimensional transverse plane assessment of femoral anteversion. Correlations were determined between physical exam of hip rotation, EOS imaging of femoral anteversion, and transverse plane hip kinematics for this retrospective review study. Linear regression analysis revealed a weak relationship between physical examination measures of hip rotation and biplanar radiographic assessment of femoral anteversion. Similarly, poor correlation was found between clinical evaluation of femoral anteversion and motion assessment of dynamic hip rotation. Correlations were better in neurologically typical children with torsional abnormalities compared to children with gait dysfunction secondary to cerebral palsy. Dynamic hip rotation cannot be predicted by physical examination measures of hip range of motion or from three-dimensional assessment of femoral anteversion derived from biplanar radiographs. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    Science.gov (United States)

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

    Water quality assessment is crucial for assessment of marine eutrophication, prediction of harmful algal blooms, and environment protection. Previous studies have developed many numeric modeling methods and data driven approaches for water quality assessment. The cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have always been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment with hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and North Yellow Sea of China, and apply the clustering results to evaluate water quality. To evaluate the validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, it is the first attempt to apply Mahalanobis distance for coastal water quality assessment.
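
    A minimal sketch of hierarchical clustering under a Mahalanobis metric, as described above, is given below; the synthetic station-by-variable matrix and the choice of average linkage and three clusters are assumptions made only for illustration.

```python
# Hierarchical clustering with a Mahalanobis distance, which accounts for
# correlations between water quality variables.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))          # 30 monitoring stations, 5 variables
X[:, 1] += 0.8 * X[:, 0]              # introduce correlation between two variables

VI = np.linalg.inv(np.cov(X, rowvar=False))         # inverse covariance matrix
D = pdist(X, metric="mahalanobis", VI=VI)           # pairwise Mahalanobis distances

Z = linkage(D, method="average")                    # hierarchical clustering
labels = fcluster(Z, t=3, criterion="maxclust")     # cut into three quality classes
print(labels)
```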

  10. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada)]; Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)]

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS) obtained from the PSHA are characterized by high-frequency content, which differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
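
    As background (not a result of the paper), in the lognormal fragility model commonly used for such calculations the HCLPF capacity follows from the median ground-motion capacity and the logarithmic standard deviations representing randomness and uncertainty:

$$
\mathrm{HCLPF} \;=\; A_m \, e^{-1.65\,(\beta_R + \beta_U)}
$$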

  11. Interconnectivity among Assessments from Rating Agencies: Using Cluster and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jaroslav Krejčíř

    2014-09-01

    Full Text Available The aim of this paper is to determine whether there is a dependency among the assessments of leading rating agencies. Rating agencies are an important part of the global economy. Great attention has been paid to the activities of rating agencies since 2007, when there was a financial crisis; credit rating agencies were identified as one of the main causes of this crisis. This paper is focused on the existence of mutual interconnectivity among assessments from three leading rating agencies. The method used for this determination is based on cluster analysis followed by correlation analysis and a test of independence. Credit rating assessments of Greece and Spain were chosen for the determination of this mutual interconnectivity due to the fact that these countries are the most discussed euro-area countries. A significant dependence among the assessments from the different rating agencies has been demonstrated.

  12. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    International Nuclear Information System (INIS)

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequency calculated for the accident sequences, defined through event tree (ET) modeling. This is done in order to lend credit to the ET model quantification, to calculate frequency distributions for the occurrence of events and, consequently, to assess whether sequences have been correctly selected from the probability standpoint, and finally to verify the fulfillment of the safety conditions. Uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance parameter technique, respectively. (author)
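
    The Monte Carlo step described above can be sketched as follows; the lognormal parameters, the three-event sequence, and the use of a rank correlation as an importance measure are illustrative assumptions, not values from the study.

```python
# Monte Carlo propagation of basic-event uncertainties to an accident-sequence
# frequency, with a crude rank-correlation importance measure.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 100_000

def lognormal(median, error_factor, size):
    """Lognormal samples defined through a median and a 95th/50th error factor."""
    sigma = np.log(error_factor) / 1.645
    return rng.lognormal(mean=np.log(median), sigma=sigma, size=size)

# Initiating-event frequency (/yr) and two conditional failure probabilities
ie  = lognormal(1e-2, 10.0, n)
p_a = lognormal(1e-3,  5.0, n)
p_b = lognormal(5e-2,  3.0, n)

seq_freq = ie * p_a * p_b                     # sequence frequency samples (/yr)

print("mean:", seq_freq.mean())
print("5th-95th percentile:", np.percentile(seq_freq, [5, 95]))

for name, x in [("IE", ie), ("A", p_a), ("B", p_b)]:
    rho, _ = spearmanr(x, seq_freq)           # higher rho -> larger contribution
    print(name, "Spearman rho:", round(rho, 2))
```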

  13. A Flood Risk Assessment of Quang Nam, Vietnam Using Spatial Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chinh Luu

    2018-04-01

    Full Text Available Vietnam is highly vulnerable to flood and storm impacts. Holistic flood risk assessment maps that adequately consider flood risk factors of hazard, exposure, and vulnerability are not available. These are vital for flood risk preparedness and disaster mitigation measures at the local scale. Unfortunately, there is a lack of knowledge about spatial multicriteria decision analysis and flood risk analysis more broadly in Vietnam. In response to this need, we identify and quantify flood risk components in Quang Nam province through spatial multicriteria decision analysis. The study presents a new approach to local flood risk assessment mapping, which combines historical flood marks with exposure and vulnerability data. The flood risk map output could assist and empower decision-makers in undertaking flood risk management activities in the province. Our study demonstrates a methodology to build flood risk assessment maps using flood mark, exposure and vulnerability data, which could be applied in other provinces in Vietnam.

  14. Ecological risk assessment of hydropower dam construction based on ecological network analysis

    OpenAIRE

    Chen, Shaoqing; Fath, Brian D.; Chen, Bin

    2010-01-01

    Dam construction is regarded as one of the major factors contributing to significant modifications of river ecosystems, and the ecological risk (ER) assessment of dam construction has received growing attention in recent years. In the present study, we explored the potential ecological risk caused by a dam project based on the general principles of ecological risk assessment. Ecological network analysis was proposed as a usable analytic method for the implementation of ecological risk asse...

  15. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been done on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm based on an auditory model that simulates the human auditory system. The auditory model is based on spectro-temporal modulation analysis of the spectrogram, which has been proven to be ...

  16. System Analysis and Risk Assessment system (SARA) Version 4.0

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1992-01-01

    This NUREG is the tutorial for the System Analysis and Risk Assessment System (SARA) Version 4.0, a microcomputer-based system used to analyze the safety issues of a family [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons are provided that walk the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis

  17. Analysis of environmental impact assessment for large-scale X-ray medical equipments

    International Nuclear Information System (INIS)

    Fu Jin; Pei Chengkai

    2011-01-01

    Based on an Environmental Impact Assessment (EIA) project, this paper elaborates the basic elements of the EIA analysis for the sales project of large-scale X-ray medical equipment, and provides the procedure for analyzing the environmental impact and the dose estimation method under normal and accident conditions. The key points of the EIA for the sales project of large-scale X-ray medical equipment include the determination of the pollution factors and management limit values according to the project's actual situation, and the utilization of various assessment and prediction methods, such as analogy, actual measurement and calculation, to analyze, monitor, calculate and predict the pollution under normal and accident conditions. (authors)

  18. Cognitive human reliability analysis for an assessment of the safety significance of complex transients

    International Nuclear Information System (INIS)

    Amico, P.J.; Hsu, C.J.; Youngblood, R.W.; Fitzpatrick, R.G.

    1989-01-01

    This paper reports that as part of a probabilistic assessment of the safety significance of complex transients at certain PWR power plants, it was necessary to perform a cognitive human reliability analysis. To increase the confidence in the results, it was desirable to make use of actual observations of operator response which were available for the assessment. An approach was developed which incorporated these observations into the human cognitive reliability (HCR) modeling approach. The results obtained provided additional insights over what would have been found using other approaches. These insights were supported by the observations, and it is suggested that this approach be considered for use in future probabilistic safety assessments

  19. Endogenous allergens and compositional analysis in the allergenicity assessment of genetically modified plants.

    Science.gov (United States)

    Fernandez, A; Mills, E N C; Lovik, M; Spoek, A; Germini, A; Mikalsen, A; Wal, J M

    2013-12-01

    Allergenicity assessment of genetically modified (GM) plants is one of the key pillars in the safety assessment process of these products. As part of this evaluation, one of the concerns is to assess that unintended effects (e.g. over-expression of endogenous allergens) relevant for the food safety have not occurred due to the genetic modification. Novel technologies are now available and could be used as complementary and/or alternative methods to those based on human sera for the assessment of endogenous allergenicity. In view of these developments and as a step forward in the allergenicity assessment of GM plants, it is recommended that known endogenous allergens are included in the compositional analysis as additional parameters to be measured. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    7.1.7 Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials. Khara D. Grieger (1), Igor Linkov (2), Steffen Foss Hansen (1), Anders Baun (1). (1) Technical University of Denmark, Kgs. Lyngby, Denmark; (2) Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA. Email: kdg@env.dtu.dk. Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming... ...and approaches which have been developed or proposed by large organizations or regulatory bodies for NM. These frameworks and approaches were evaluated and assessed based on a select number of criteria which have been previously proposed as important parameters for inclusion in successful risk assessment...

  1. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    2003-01-01

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological), and normal and accident modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis, and safety assessment documentation. After reviewing the facility's functions and the national and international requirements, the criteria for assessing the safety level are set. As a result of the second stage, the actual parameters of the facility necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of its results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. A qualitative hazard and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure using the method of 'criticality analysis', so that an estimate of the risk is given for each event. The events whose risk falls on the boundary of acceptability, or is unacceptable, are carried forward to the next steps of the analysis. As a result, the lists of scenarios for PSA and possible design scenarios are established. PSA logical modeling and quantitative calculations of the accident sequences are presented

  2. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Simon, S.L.; Thiessen, K.M.

    1996-01-01

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  3. Degradation Assessment and Fault Diagnosis for Roller Bearing Based on AR Model and Fuzzy Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Lingli Jiang

    2011-01-01

    Full Text Available This paper proposes a new approach combining the autoregressive (AR) model and fuzzy cluster analysis for bearing fault diagnosis and degradation assessment. The AR model is an effective approach to extracting fault features, but it is generally applied to stationary signals, whereas the fault vibration signals of a roller bearing are non-stationary and non-Gaussian. To address this problem, the set of parameters of the AR model is estimated based on higher-order cumulants. The AR parameters are then taken as the feature vectors, and fuzzy cluster analysis is applied to perform classification and pattern recognition. Experimental results show that the proposed method can be used to identify various types and severities of bearing faults. This study is significant for non-stationary and non-Gaussian signal analysis, fault diagnosis, and degradation assessment.
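
    The pipeline described above (AR coefficients as feature vectors, then fuzzy clustering) can be sketched as follows. Note the assumptions: the signals are synthetic, ordinary least squares stands in for the paper's cumulant-based AR estimator, and the small fuzzy c-means routine is a generic textbook version rather than the authors' implementation.

      # Sketch: AR coefficients as features, grouped with a minimal fuzzy c-means.
      import numpy as np

      rng = np.random.default_rng(0)

      def ar_features(x, order=6):
          """Least-squares AR(p) coefficients of a 1-D signal."""
          X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
          y = x[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coeffs

      def fuzzy_cmeans(F, c=2, m=2.0, iters=100):
          """Minimal fuzzy c-means on a feature matrix F (n_samples x n_features)."""
          U = rng.dirichlet(np.ones(c), size=len(F))          # membership matrix
          for _ in range(iters):
              W = U ** m
              centers = (W.T @ F) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(F[:, None, :] - centers[None, :, :], axis=2) + 1e-12
              U = 1.0 / d ** (2.0 / (m - 1.0))
              U /= U.sum(axis=1, keepdims=True)
          return U

      # Two synthetic signal classes with different dynamics ("healthy" vs "faulty").
      signals = [np.convolve(rng.standard_normal(2048), np.ones(k) / k, mode="same")
                 for k in (3, 3, 3, 9, 9, 9)]
      F = np.array([ar_features(s) for s in signals])

      U = fuzzy_cmeans(F)
      print("memberships:\n", np.round(U, 2))
      print("hard labels:", U.argmax(axis=1))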

  4. Scenario analysis in environmental impact assessment: Improving explorations of the future

    International Nuclear Information System (INIS)

    Duinker, Peter N.; Greig, Lorne A.

    2007-01-01

    Scenarios and scenario analysis have become popular approaches in organizational planning and participatory exercises in pursuit of sustainable development. However, they are little used, at least in any formal way, in environmental impact assessment (EIA). This is puzzling because EIA is a process specifically dedicated to exploring options for more-sustainable (i.e., less environmentally damaging) futures. In this paper, we review the state of the art associated with scenarios and scenario analysis, and describe two areas where scenario analysis could be particularly helpful in EIA: (a) in defining future developments for cumulative effects assessment; and (b) in considering the influence of contextual change - e.g. climate change - on impact forecasts for specific projects. We conclude by encouraging EIA practitioners to learn about the promise of scenario-based analysis and implement scenario-based methods so that EIA can become more effective in fostering sustainable development

  5. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.

  6. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: the correct association of distributions with parameters, the importance and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships among system variables and the system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)

  7. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    Webb, G.A.M.; Fry, F.A.

    1983-12-01

    After radioactive sea debris had been found on beaches near the BNFL Sellafield plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  8. Comprehensive assessment of firm financial performance using financial ratios and linguistic analysis of annual reports

    OpenAIRE

    Renáta Myšková; Petr Hájek

    2017-01-01

    Indicators of financial performance, especially financial ratio analysis, have become important financial decision-support information used by firm management and other stakeholders to assess financial stability and growth potential. However, additional information may be hidden in management communication. The article deals with the analysis of the annual reports of U.S. firms from both points of view, a financial one based on a set of financial ratios, and a linguistic one based on the anal...

  9. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients

  10. Using principal component analysis and annual seasonal trend analysis to assess karst rocky desertification in southwestern China.

    Science.gov (United States)

    Zhang, Zhiming; Ouyang, Zhiyun; Xiao, Yi; Xiao, Yang; Xu, Weihua

    2017-06-01

    Increasing exploitation of karst resources is causing severe environmental degradation because of the fragility and vulnerability of karst areas. By integrating principal component analysis (PCA) with annual seasonal trend analysis (ASTA), this study assessed karst rocky desertification (KRD) within a spatial context. We first produced fractional vegetation cover (FVC) data from a moderate-resolution imaging spectroradiometer normalized difference vegetation index using a dimidiate pixel model. Then, we generated three main components of the annual FVC data using PCA. Subsequently, we generated the slope image of the annual seasonal trends of FVC using median trend analysis. Finally, we combined the three PCA components and annual seasonal trends of FVC with the incidence of KRD for each type of carbonate rock to classify KRD into one of four categories based on K-means cluster analysis: high, moderate, low, and none. The results of accuracy assessments indicated that this combination approach produced greater accuracy and more reasonable KRD mapping than the average FVC based on the vegetation coverage standard. The KRD map for 2010 indicated that the total area of KRD was 78.76 × 10³ km², which constitutes about 4.06% of the eight southwest provinces of China. The largest KRD areas were found in Yunnan province. The combined PCA and ASTA approach was demonstrated to be an easily implemented, robust, and flexible method for the mapping and assessment of KRD, which can be used to enhance regional KRD management schemes or to address assessment of other environmental issues.
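
    The core of the mapping chain (PCA of the annual FVC stack, a median-based trend slope, and a four-class K-means) can be sketched as below; the raster array is random stand-in data rather than the MODIS-derived FVC series, and the feature choices are simplified relative to the paper.

      # Sketch: PCA of annual FVC layers plus a per-pixel median (Theil-Sen) trend
      # slope, clustered into four classes with K-means.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      years, rows, cols = 11, 60, 80
      fvc = rng.random((years, rows, cols))            # annual FVC stack in [0, 1]
      X = fvc.reshape(years, -1).T                     # pixels x years

      # Three leading principal components of the annual FVC series.
      pcs = PCA(n_components=3).fit_transform(X)

      # Median of all pairwise annual slopes for each pixel.
      pairs = [(i, j) for i in range(years) for j in range(i + 1, years)]
      slopes = np.median(
          np.stack([(X[:, j] - X[:, i]) / (j - i) for i, j in pairs]), axis=0)

      features = np.column_stack([pcs, slopes])
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
      krd_map = labels.reshape(rows, cols)             # four KRD intensity classes
      print("pixels per class:", np.bincount(labels))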

  11. A Qualitative Analysis of Narrative Preclerkship Assessment Data to Evaluate Teamwork Skills.

    Science.gov (United States)

    Dolan, Brigid M; O'Brien, Celia Laird; Cameron, Kenzie A; Green, Marianne M

    2018-04-16

    Construct: Students entering the health professions require competency in teamwork. Although many teamwork curricula and assessments exist, studies have not demonstrated robust longitudinal assessment of preclerkship students' teamwork skills and attitudes. Assessment portfolios may serve to fill this gap, but it is unknown how narrative comments within portfolios describe student teamwork behaviors. We performed a qualitative analysis of narrative data in 15 assessment portfolios. Student portfolios were randomly selected from 3 groups stratified by quantitative ratings of teamwork performance gathered from small-group and clinical preceptor assessment forms. Narrative data included peer and faculty feedback from these same forms. Data were coded for teamwork-related behaviors using a constant comparative approach combined with an identification of the valence of the coded statements as either "positive observation" or "suggestion for improvement." Eight codes related to teamwork emerged: attitude and demeanor, information facilitation, leadership, preparation and dependability, professionalism, team orientation, values team member contributions, and nonspecific teamwork comments. The frequency of codes and valence varied across the 3 performance groups, with students in the low-performing group receiving more suggestions for improvement across all teamwork codes. Narrative data from assessment portfolios included specific descriptions of teamwork behavior, with important contributions provided by both faculty and peers. A variety of teamwork domains were represented. Such feedback as collected in an assessment portfolio can be used for longitudinal assessment of preclerkship student teamwork skills and attitudes.

  12. Predicting child maltreatment: A meta-analysis of the predictive validity of risk assessment instruments.

    Science.gov (United States)

    van der Put, Claudia E; Assink, Mark; Boekhout van Solinge, Noëlle F

    2017-11-01

    Risk assessment is crucial in preventing child maltreatment since it can identify high-risk cases in need of child protection intervention. Despite widespread use of risk assessment instruments in child welfare, it is unknown how well these instruments predict maltreatment and what instrument characteristics are associated with higher levels of predictive validity. Therefore, a multilevel meta-analysis was conducted to examine the predictive accuracy of (characteristics of) risk assessment instruments. A literature search yielded 30 independent studies (N=87,329) examining the predictive validity of 27 different risk assessment instruments. From these studies, 67 effect sizes could be extracted. Overall, a medium significant effect was found (AUC=0.681), indicating a moderate predictive accuracy. Moderator analyses revealed that onset of maltreatment can be better predicted than recurrence of maltreatment, which is a promising finding for early detection and prevention of child maltreatment. In addition, actuarial instruments were found to outperform clinical instruments. To bring risk and needs assessment in child welfare to a higher level, actuarial instruments should be further developed and strengthened by distinguishing risk assessment from needs assessment and by integrating risk assessment with case management. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Small group learning: effect on item analysis and accuracy of self-assessment of medical students.

    Science.gov (United States)

    Biswas, Shubho Subrata; Jain, Vaishali; Agrawal, Vandana; Bindra, Maninder

    2015-01-01

    Small group sessions are regarded as a more active and student-centered approach to learning. Item analysis provides objective evidence of whether such sessions improve comprehension and make the topic easier for students, in addition to assessing the relative benefit of the sessions to good versus poor performers. Self-assessment makes students aware of their deficiencies. Small group sessions can also help students develop the ability to self-assess. This study was carried out to assess the effect of small group sessions on item analysis and students' self-assessment. A total of 21 female and 29 male first year medical students participated in a small group session on topics covered by didactic lectures two weeks earlier. It was preceded and followed by two multiple choice question (MCQ) tests, in which students were asked to self-assess their likely score. The MCQs used had been item-analyzed in a previous group and were chosen with matching difficulty and discrimination indices for the pre- and post-tests. The small group session improved the marks of both genders equally, but female performance was better. The session made the items easier, increasing the difficulty index significantly, but there was no significant change in the discrimination index. There was overestimation in the self-assessment of both genders, but male overestimation was greater. The session improved the self-assessment of students in terms of expected marks and expectation of passing. The small group session improved the ability of students to self-assess their knowledge and increased the difficulty index of the items, reflecting the students' better performance.
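
    For reference, the two item statistics discussed above are simple to compute; the sketch below uses a simulated 0/1 score matrix and the common textbook definitions (proportion correct for difficulty, upper-minus-lower 27% groups for discrimination), which may differ in detail from the indices used in the study.

      # Sketch: classical item analysis on a binary score matrix (students x items).
      import numpy as np

      rng = np.random.default_rng(7)
      scores = (rng.random((50, 20)) < rng.uniform(0.3, 0.9, size=20)).astype(int)

      # Difficulty index: proportion of students answering each item correctly
      # (a higher value means an easier item).
      difficulty = scores.mean(axis=0)

      # Discrimination index: difference in item facility between the top and
      # bottom 27% of students ranked by total score.
      totals = scores.sum(axis=1)
      order = np.argsort(totals)
      k = int(round(0.27 * len(scores)))
      low, high = order[:k], order[-k:]
      discrimination = scores[high].mean(axis=0) - scores[low].mean(axis=0)

      for i, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
          print(f"item {i:2d}: difficulty = {p:.2f}  discrimination = {d:+.2f}")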

  14. 7 CFR 2.71 - Director, Office of Risk Assessment and Cost-Benefit Analysis.

    Science.gov (United States)

    2010-01-01

    ... Chief Economist § 2.71 Director, Office of Risk Assessment and Cost-Benefit Analysis. (a) Delegations. Pursuant to § 2.29(a)(2), the following delegations of authority are by the Chief Economist to the Director... reserved to the Chief Economist: Review all proposed decisions having substantial economic policy...

  15. A Comparison of Functional Behavioral Assessment and Functional Analysis Methodology among Students with Mild Disabilities

    Science.gov (United States)

    Lewis, Timothy J.; Mitchell, Barbara S.; Harvey, Kristin; Green, Ambra; McKenzie, Jennifer

    2015-01-01

    Functional behavioral assessment (FBA) and functional analyses (FA) are grounded in the applied behavior analysis principle that posits problem behavior is functionally related to the environment in which it occurs and is maintained by either providing access to reinforcing outcomes or allowing the individual to avoid or escape that which they…

  16. Testing-Context Analysis: Assessment Is Just Another Part of Language Curriculum Development

    Science.gov (United States)

    Brown, James Dean

    2008-01-01

    In keeping with the theme of the International Language Testing Association/Language Testing Research Colloquium Conference in 2008, "Focusing on the Core: Justifying the Use of Language Assessments to Stakeholders," I define "stakeholder-friendly tests," "defensible testing," and "testing-context analysis."…

  17. Safety assessment of research reactors and preparation of the safety analysis report

    International Nuclear Information System (INIS)

    1994-01-01

    This Safety Guide presents guidelines, approved by international consensus, for the preparation, review and assessment of safety documentation for research reactors such as the Safety Analysis Report. While the Guide is most applicable to research reactors in the design and construction stage, it is also recommended for use during relicensing or reassessment of existing reactors

  18. Speed of sound reflects Young's modulus as assessed by microstructural finite element analysis

    NARCIS (Netherlands)

    Bergh, van den J.P.W.; Lenthe, van G.H.; Hermus, A.R.M.M.; Corstens, F.H.M.; Smals, A.G.H.; Huiskes, H.W.J.

    2000-01-01

    We analyzed the ability of the quantitative ultrasound (QUS) parameter, speed of sound (SOS), and bone mineral density (BMD), as measured by dual-energy X-ray absorptiometry (DXA), to predict Young's modulus, as assessed by microstructural finite element analysis (μFEA) from microcomputed tomography.

  19. Artificial neural network analysis to assess hypernasality in patients treated for oral or oropharyngeal cancer

    NARCIS (Netherlands)

    de Bruijn, Marieke; ten Bosch, Louis; Kuik, Dirk J.; Langendijk, Johannes A.; Leemans, C. Rene; Verdonck-de Leeuw, Irma

    2011-01-01

    Objective. Investigation of applicability of neural network feature analysis of nasalance in speech to assess hypernasality in speech of patients treated for oral or oropharyngeal cancer. Patients and methods. Speech recordings of 51 patients and of 18 control speakers were evaluated regarding

  20. Computational Psycholinguistic Analysis and Its Application in Psychological Assessment of College Students

    Science.gov (United States)

    Kucera, Dalibor; Havigerová, Jana M.

    2015-01-01

    The paper deals with the issue of computational psycholinguistic analysis (CPA) and its experimental application in basic psychological and pedagogical assessment. CPA is a new method which may potentially provide interesting, psychologically relevant information about the author of a particular text, regardless of the text's factual (semantic)…

  1. Understory vegetation data quality assessment for the Interior West Forest and Inventory Analysis program

    Science.gov (United States)

    Paul L. Patterson; Renee A. O' Brien

    2011-01-01

    The Interior West Forest Inventory and Analysis (IW-FIA) program of the USDA Forest Service collects field data on understory vegetation structure that have broad applications. In IW-FIA one aspect of quality assurance is assessed based on the repeatability of field measurements. The understory vegetation protocol consists of two suites of measurements; (1) the...

  2. Explore the Usefulness of Person-Fit Analysis on Large-Scale Assessment

    Science.gov (United States)

    Cui, Ying; Mousavi, Amin

    2015-01-01

    The current study applied the person-fit statistic, l_z, to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l_z, were removed. The…

  3. Assessment of Smolt Condition for Travel Time Analysis, 1993-1994 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M; Beeman, John W; VanderKooi, Scott P [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1999-02-01

    The assessment of smolt condition for travel time analysis (ASCTTA) project provided information on the level of smoltification in Columbia River hatchery and wild salmonid stocks to the Fish Passage Center (FPC), for the primary purpose of in-river management of flows.

  4. Global ejection fraction and phase analysis assessed by radionuclide angiography during exercise and after isoproterenol infusion

    International Nuclear Information System (INIS)

    Righetti, A.; Ratib, O.; Merier, G.; Widmann, T.; Donath, A.

    1983-01-01

    Radionuclide angiography obtained during and following isoproterenol infusion is a new approach for detecting latent myocardial ischemia. It is very sensitive and could be considered as an alternative to conventional exercise radionuclide angiography. The data presented show that phase analysis assessment of regional systolic wall motion is a better indicator than global ejection fraction for quantifying left ventricular dysfunction.

  5. Error Ratio Analysis: Alternate Mathematics Assessment for General and Special Educators.

    Science.gov (United States)

    Miller, James H.; Carr, Sonya C.

    1997-01-01

    Eighty-seven elementary students in grades four, five, and six, were administered a 30-item multiplication instrument to assess performance in computation across grade levels. An interpretation of student performance using error ratio analysis is provided and the use of this method with groups of students for instructional decision making is…

  6. Intelligent trainee behavior assessment system for medical training employing video analysis

    NARCIS (Netherlands)

    Han, Jungong; With, de P.H.N.; Merién, A.E.R.; Oei, S.G.

    2012-01-01

    This paper addresses the problem of assessing a trainee’s performance during a simulated delivery training by employing automatic analysis of a video camera signal. We aim at providing objective statistics reflecting the trainee’s behavior, so that the instructor is able to give valuable suggestions

  7. Applying HAZOP analysis in assessing remote handling compatibility of ITER port plugs

    NARCIS (Netherlands)

    Duisings, L. P. M.; van Til, S.; Magielsen, A. J.; Ronden, D. M. S.; Elzendoorn, B. S. Q.; Heemskerk, C. J. M.

    2013-01-01

    This paper describes the application of a Hazard and Operability Analysis (HAZOP) methodology in assessing the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. As part of the ECHUL consortium, the remote handling team at the DIFFER Institute is

  8. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  9. Assessment of balance in propensity score analysis in the medical literature: A systematic review

    NARCIS (Netherlands)

    Ali, M. Sanni; Groenwold, Rolf H.H.; Belitser, Svetlana V.; Pestman, Wiebe R.; Hoes, Arno W.; Roes, Kit C.B.; Boer, Ade; Klungel, Olaf H.

    2013-01-01

    Background: Assessing balance on co-variate distributions between treatment groups with a given propensity score (PS) is a crucial step in PS analysis. Several methodological papers comparing different balance measures have been published in the last decade. However, the current practice on

  10. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    of islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA the fault related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5...

  11. Local Assessment: Using Genre Analysis to Validate Directed Self-Placement

    Science.gov (United States)

    Gere, Anne Ruggles; Aull, Laura; Escudero, Moises Damian Perales; Lancaster, Zak; Lei, Elizabeth Vander

    2013-01-01

    Grounded in the principle that writing assessment should be locally developed and controlled, this article describes a study that contextualizes and validates the decisions that students make in the modified Directed Self-Placement (DSP) process used at the University of Michigan. The authors present results of a detailed text analysis of…

  12. Children, Mathematics, and Videotape: Using Multimodal Analysis to Bring Bodies into Early Childhood Assessment Interviews

    Science.gov (United States)

    Parks, Amy Noelle; Schmeichel, Mardi

    2014-01-01

    Despite the increased use of video for data collection, most research using assessment interviews in early childhood education relies solely upon the analysis of linguistic data, ignoring children's bodies. This trend is particularly troubling in studies of marginalized children because transcripts limited to language can make it difficult to…

  13. An introductory guide to uncertainty analysis in environmental and health risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites
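
    As a concrete illustration of the Monte Carlo approach recommended above, the sketch below propagates subjective parameter distributions through a simple multiplicative exposure model and reports a confidence interval on the risk estimate; the model form, the distributions, and every numerical value are invented for illustration and are not taken from any actual assessment.

      # Sketch: Monte Carlo propagation of parameter uncertainty through a simple
      # multiplicative exposure model, risk = C x IR x ED x 365 x SF / (BW x AT).
      import numpy as np

      rng = np.random.default_rng(3)
      n = 50_000

      C  = rng.lognormal(np.log(2.0), 0.6, n)     # contaminant concentration, mg/L
      IR = rng.triangular(1.0, 2.0, 3.0, n)       # water intake rate, L/day
      ED = rng.uniform(9, 30, n)                  # exposure duration, years
      SF = rng.lognormal(np.log(0.05), 0.4, n)    # slope factor, per (mg/kg-day)
      BW, AT = 70.0, 70 * 365.0                   # body weight (kg), averaging time (days)

      risk = C * IR * ED * 365.0 * SF / (BW * AT)

      lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
      print(f"median risk {med:.2e}, 95% interval [{lo:.2e}, {hi:.2e}]")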

  14. The Organisational Culture Assessment Inventory: A Metaphorical Analysis in Educational Settings.

    Science.gov (United States)

    Steinhoff, Carl R.; Owens, Robert G.

    1989-01-01

    A preliminary analysis of the Organizational Culture Assessment Inventory yielded four root metaphors descriptive of the type of culture likely to be found in public schools: the family, the machine, the circus, and the "little shop of horrors." (10 references) (SI)

  15. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OA-2013-0320; FRL-9830-1] Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... Environmental Protection Agency (EPA) issued for public comment a document entitled, ``Technical Guidance for...

  16. Combining a building simulation with energy systems analysis to assess the benefits of natural ventilation

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan; Østergaard, Poul Alberg; Remmen, Arne

    2013-01-01

    a thermal air flow simulation program - into the energy systems analysis model. Descriptions of the energy systems in two geographical locations, i.e. Mexico and Denmark, are set up as inputs. Then, the assessment is done by calculating the energy impacts as well as environmental benefits in the energy...

  17. Assessment of bone formation capacity using in vivo transplantation assays: procedure and tissue analysis

    DEFF Research Database (Denmark)

    Abdallah, Basem; Ditzel, Nicholas; Kassem, Moustapha

    2008-01-01

    In vivo assessment of bone formation (osteogenesis) potential by isolated cells is an important method for analysis of cells and factors controlling bone formation. Currently, cell implantation mixed with hydroxyapatite/tricalcium phosphate in an open system (subcutaneous implantation) in immun...

  18. MONTHLY VARIATION IN SPERM MOTILITY IN COMMON CARP ASSESSED USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Sperm motility variables from the milt of the common carp Cyprinus carpio were assessed using a computer-assisted sperm analysis (CASA) system across several months (March-August 1992) known to encompass the natural spawning period. Two-year-old pond-raised males obtained each mo...

  19. Designing student peer assessment in higher education: Analysis of written and oral peer feedback

    NARCIS (Netherlands)

    van den Berg, I.; Admiraal, W.; Pilot, A.

    2006-01-01

    Relating it to design features, the present article describes the nature of written and oral peer feedback as it occurred in seven writing courses, each with a different PA design. Results indicate that

  20. Assessing model fit in latent class analysis when asymptotics do not hold

    NARCIS (Netherlands)

    van Kollenburg, Geert H.; Mulder, Joris; Vermunt, Jeroen K.

    2015-01-01

    The application of latent class (LC) analysis involves evaluating the LC model using goodness-of-fit statistics. To assess the misfit of a specified model, say with the Pearson chi-squared statistic, a p-value can be obtained using an asymptotic reference distribution. However, asymptotic p-values

  1. Taxometric Analysis of the Antisocial Features Scale of the Personality Assessment Inventory in Federal Prison Inmates

    Science.gov (United States)

    Walters, Glenn D.; Diamond, Pamela M.; Magaletta, Philip R.; Geyer, Matthew D.; Duncan, Scott A.

    2007-01-01

    The Antisocial Features (ANT) scale of the Personality Assessment Inventory (PAI) was subjected to taxometric analysis in a group of 2,135 federal prison inmates. Scores on the three ANT subscales--Antisocial Behaviors (ANT-A), Egocentricity (ANT-E), and Stimulus Seeking (ANT-S)--served as indicators in this study and were evaluated using the…

  2. Reliability of ^1^H NMR analysis for assessment of lipid oxidation at frying temperatures

    Science.gov (United States)

    The reliability of a method using ¹H NMR analysis for assessment of oil oxidation at a frying temperature was examined. During heating and frying at 180 °C, changes of soybean oil signals in the ¹H NMR spectrum including olefinic (5.16-5.30 ppm), bisallylic (2.70-2.88 ppm), and allylic (1.94-2.1...

  3. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE) FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

    Full Text Available Geographic Information Systems brought greater possibilities to the representation and interpretation of the landscape as well as to integrated analysis. However, this approach does not dispense with technical and methodological substantiation for achieving the computational universe. This work is grounded in ecodynamics and the empirical analysis of natural and anthropogenic environmental fragility, and it aims to propose and present an integrated paradigm of Multi-criteria Analysis and a Fuzzy Logic Model of Environmental Fragility, taking as a case study the Basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed for a reduction in the subjectivism influencing the decision criteria, whose factors might have their cartographic expression, respecting the complex integrated landscape.

  4. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    Directory of Open Access Journals (Sweden)

    Szpak Dawid

    2015-09-01

    Full Text Available The paper presents the main types of incidental surface water contamination and the security method against incidental contamination of water sources. An analysis and assessment of the protection of the collective water supply system (CWSS) against incidental contamination was conducted using Failure Mode and Effects Analysis (FMEA). The FMEA method allows the analysis of a product or process, the identification of weak points, and the implementation of corrections and new solutions for eliminating the sources of undesirable events. The developed methodology is shown in an application case. It was found that the risk of water contamination in the water-pipe network of the analyzed CWSS caused by incidental contamination of the water source is at a controlled level.
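
    As one way to picture the prioritization step that FMEA typically feeds, the sketch below ranks hypothetical contamination events by a Risk Priority Number (RPN = severity x occurrence x detection); the events and scores are invented, and the RPN convention is a common FMEA practice rather than something stated in the abstract.

      # Sketch: ranking failure modes by Risk Priority Number (RPN = S x O x D).
      from dataclasses import dataclass

      @dataclass
      class FailureMode:
          event: str
          severity: int      # 1 (negligible) .. 10 (catastrophic)
          occurrence: int    # 1 (rare) .. 10 (frequent)
          detection: int     # 1 (easily detected) .. 10 (nearly undetectable)

          @property
          def rpn(self) -> int:
              return self.severity * self.occurrence * self.detection

      modes = [
          FailureMode("chemical spill upstream of the intake", 8, 3, 6),
          FailureMode("microbial contamination of a well", 7, 4, 4),
          FailureMode("pipe break with backflow intrusion", 6, 5, 3),
      ]

      for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
          print(f"RPN = {fm.rpn:3d}  {fm.event}")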

  5. The assessment report of QA program through the analysis of quality trend in 1994

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-04-01

    Effectiveness and adequacy of the KAERI Quality Assurance Program are assessed through an analysis of the quality trend. As a result of the assessment, the Quality Assurance System for each project has reached the stage of stabilization, and, in particular, significant improvement has been made in conformance to QA procedures, in the control of QA records and documents, and in instilling a quality mindset for the job. However, some problems discovered in this trend analysis, i.e., the efficiency of quality training and the economy of the design verification system, require preventive actions and the consideration of appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the best quality system suitable for our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author).

  6. The assessment report of QA program through the analysis of quality trend in 1994

    International Nuclear Information System (INIS)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo

    1995-04-01

    Effectiveness and adequacy of the KAERI Quality Assurance Program are assessed through an analysis of the quality trend. As a result of the assessment, the Quality Assurance System for each project has reached the stage of stabilization, and, in particular, significant improvement has been made in conformance to QA procedures, in the control of QA records and documents, and in instilling a quality mindset for the job. However, some problems discovered in this trend analysis, i.e., the efficiency of quality training and the economy of the design verification system, require preventive actions and the consideration of appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the best quality system suitable for our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author)

  7. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    International Nuclear Information System (INIS)

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J.

    2003-01-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking, a structured explicit decision analysis, a heuristic approach of importance of the order of criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to

  8. From text to codings: intercoder reliability assessment in qualitative content analysis.

    Science.gov (United States)

    Burla, Laila; Knierim, Birte; Barth, Jurgen; Liewald, Katharina; Duetz, Margreet; Abel, Thomas

    2008-01-01

    High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature is short of standardized ICR procedures in qualitative content analysis. This paper illustrates how ICR assessment can be used to improve codings in qualitative content analysis. Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. A resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme. Low agreement rates, for instance, indicated that respective codes were defined too broadly and would need clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase quality of codings. This approach facilitates good practice in coding and enhances credibility of analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
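
    Since the kappa statistic is central to the procedure, a minimal computation is sketched below for two coders assigning categorical codes to the same segments; the code assignments are invented, and real studies would typically rely on an established statistics package rather than a hand-rolled calculation.

      # Sketch: Cohen's kappa for two coders coding the same text segments.
      from collections import Counter

      coder_a = ["pain", "coping", "pain", "work", "coping", "pain", "work", "pain"]
      coder_b = ["pain", "coping", "work", "work", "coping", "pain", "work", "coping"]

      n = len(coder_a)
      observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

      # Expected agreement under independence, from the coders' marginal frequencies.
      freq_a, freq_b = Counter(coder_a), Counter(coder_b)
      expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n**2

      kappa = (observed - expected) / (1 - expected)
      print(f"observed = {observed:.2f}  expected = {expected:.2f}  kappa = {kappa:.2f}")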

  9. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    Science.gov (United States)

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study for basic surgical skill assessment on a dataset that contained video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features-approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform, for surgical skills assessment. We report average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
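
    To illustrate the "entropy-based" regularity features mentioned above, here is a small approximate entropy (ApEn) implementation applied to a synthetic motion signal; it follows the standard textbook definition and is not the authors' feature extraction code.

      # Sketch: approximate entropy (ApEn) of a 1-D signal as a regularity feature.
      import numpy as np

      def approximate_entropy(x, m=2, r_factor=0.2):
          """ApEn(m, r) with tolerance r = r_factor * std(x)."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()

          def phi(m):
              n = len(x) - m + 1
              emb = np.array([x[i:i + m] for i in range(n)])        # embedded vectors
              # Chebyshev distance between every pair of embedded vectors.
              d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
              c = (d <= r).mean(axis=1)                             # match fractions
              return np.log(c).mean()

          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(5)
      regular = np.sin(np.linspace(0, 20 * np.pi, 600))
      noisy = regular + 0.5 * rng.standard_normal(600)
      print("ApEn regular:", round(approximate_entropy(regular), 3))
      print("ApEn noisy  :", round(approximate_entropy(noisy), 3))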

  10. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2012-11-08

    This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a world widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for quality assessment of toxic herbal medicine. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. The usage of ratio analysis in the assessing of the operation of commercial entities

    Directory of Open Access Journals (Sweden)

    Ljušić Milan

    2017-01-01

    Full Text Available This paper covers both the financial analysis and the analysis of net working capital that are most commonly used in assessing the operations of economic entities. It is of significant importance to analyze the operations of each corporate entity from the financial aspect (assessment of liquidity and creditworthiness, economic phenomena, reputation and inner strength). The analysis of the operational assessment of the economic entity is of significant importance for the management, as it is taken as one of the necessary parameters in decision-making. The results thus gained can greatly affect and guide managers towards future steps in the decision-making process. On the other hand, through insight into the financial analysis, the attitude of both investors and creditors towards the business entity is determined, and therefore they can become interested in various forms of cooperation, including joint ventures. The aim of this paper is to determine, through financial analysis, the status, operation and financial situation of the company Sog Line L.T.D. Belgrade, as well as to highlight the important business segments and possibly propose measures for the improvement of the operation of the entity.

  12. Peak Pc Prediction in Conjunction Analysis: Conjunction Assessment Risk Analysis. Pc Behavior Prediction Models

    Science.gov (United States)

    Vallejo, J.J.; Hejduk, M.D.; Stamey, J. D.

    2015-01-01

    Satellite conjunction risk is typically evaluated through the probability of collision (Pc), which considers both the conjunction geometry and the uncertainties in both state estimates. Conjunction events are initially discovered through Joint Space Operations Center (JSpOC) screenings, usually seven days before the Time of Closest Approach (TCA). However, JSpOC continues to track the objects and issue conjunction updates. Changes in the state estimates and the reduced propagation time cause Pc to change as the event develops. These changes are a combination of potentially predictable development and unpredictable changes in the state estimate covariance. An operationally useful datum is the peak Pc: if it can reasonably be inferred that the peak Pc value has passed, then risk assessment can be conducted against this peak value, and if this value is below the remediation level, then event intensity can be relaxed. Can the peak Pc location be reasonably predicted?
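
    The record does not give the Pc computation itself; purely as a hedged illustration of the underlying quantity, a simple Monte Carlo estimate of collision probability in the 2-D encounter plane might look like the following. The miss vector, covariance and combined hard-body radius are invented numbers, and this is not the operational method referenced above.

    ```python
    import numpy as np

    def monte_carlo_pc(miss_vector, cov, hard_body_radius, n=1_000_000, seed=1):
        """Rough 2-D encounter-plane estimate of collision probability:
        sample the relative position from a Gaussian centred on the nominal
        miss vector and count samples inside the combined hard-body radius."""
        rng = np.random.default_rng(seed)
        samples = rng.multivariate_normal(miss_vector, cov, size=n)
        hits = np.linalg.norm(samples, axis=1) < hard_body_radius
        return hits.mean()

    # Assumed numbers for illustration only (metres)
    pc = monte_carlo_pc(miss_vector=[300.0, 150.0],
                        cov=[[200.0**2, 0.0], [0.0, 120.0**2]],
                        hard_body_radius=20.0)
    print(f"Estimated Pc ~ {pc:.2e}")
    ```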

  13. Integrating Life-cycle Assessment into Transport Cost-benefit Analysis

    DEFF Research Database (Denmark)

    Manzo, Stefano; Salling, Kim Bang

    2016-01-01

    Traditional transport Cost-Benefit Analysis (CBA) commonly ignores the indirect environmental impacts of an infrastructure project deriving from the overall life-cycle of the different project components. Such indirect impacts are, however, of key importance for assessing the long-term sustainability of a transport infrastructure project. In the present study we suggest overcoming this limitation by combining a conventional life-cycle assessment approach with standard transport cost-benefit analysis. The suggested methodology is tested upon a case study project related to the construction of a new fixed link across the Roskilde fjord in Frederikssund (Denmark). The results are then compared with those from a standard CBA framework. The analysis shows that indirect environmental impacts represent a relevant share of the estimated costs of the project, clearly affecting the final project evaluation...

  14. Life cycle based dynamic assessment coupled with multiple criteria decision analysis

    DEFF Research Database (Denmark)

    Sohn, Joshua; Kalbar, Pradip; Birkved, Morten

    2017-01-01

    This work looks at coupling Life cycle assessment (LCA) with a dynamic inventory and multiple criteria decision analysis (MCDA) to improve the validity and reliability of single score results for complex systems. This is done using the case study of a representative Danish single family home over the service life of the building. This case study uses both the established and the coupled MCDA assessment methods to quantify and assess the balance of impacts between the production of mineral wool insulation versus the production of space heat. The use of the TOPSIS method for calculating single scores ... not matter which impact assessment is applied. However, for the scenarios where other impact categories vary inversely or independently from the climate change impact indicator, such as with renewable energy production, there is a need for a more unconventional method, such as the TOPSIS method...
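
    As a minimal sketch of the TOPSIS single-score idea referred to above (assuming an invented decision matrix and weights, not the study's inventory results):

    ```python
    import numpy as np

    def topsis(decision_matrix, weights, benefit):
        """Minimal TOPSIS single-score ranking.
        decision_matrix: alternatives x criteria; weights sum to 1;
        benefit[j] is True if larger is better for criterion j."""
        X = np.asarray(decision_matrix, dtype=float)
        w = np.asarray(weights, dtype=float)
        # Vector normalisation, then weighting
        V = w * X / np.linalg.norm(X, axis=0)
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti_ideal, axis=1)
        return d_minus / (d_plus + d_minus)   # closeness coefficient in [0, 1]

    # Hypothetical design alternatives scored on climate change, resource use and
    # cost (all "lower is better" here, so benefit=False for each criterion).
    scores = topsis([[4.2, 1.1, 30.0],
                     [3.8, 1.4, 28.0],
                     [5.0, 0.9, 35.0]],
                    weights=[0.5, 0.3, 0.2],
                    benefit=[False, False, False])
    print(scores)
    ```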

  15. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    International Nuclear Information System (INIS)

    Brignon, Jean-Marc

    2011-01-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and, more recently, new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). To the extent that REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key for understanding how a nanomaterial 'socially' performs in comparison with its alternatives. 'Industrial economics' methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCAs are increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a 'pragmatic regulatory impact analysis' that uses a cost/benefit framework but remains open to disciplines other than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is 'pragmatic' in the sense that it is driven by the purpose of assessing 'what happens' with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality to handle the novelty of

  16. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    Science.gov (United States)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and, more recently, new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). To the extent that REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key for understanding how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCAs are increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis" that uses a cost/benefit framework but remains open to disciplines other than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose of assessing "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality to handle the novelty of

  17. Evaluation of SOVAT: an OLAP-GIS decision support system for community health assessment data analysis.

    Science.gov (United States)

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2008-06-09

    Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable management and analysis of spatial data, but have limitations in performing analysis of numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for data analysis for CHA. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurement included: task completion time, success in answering the tasks, and system satisfaction. Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed significantly from SPSS-GIS (alpha = .01) for satisfaction and time (p < .01). The results support OLAP-GIS decision support systems as a valuable tool for CHA data analysis.

  18. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Directory of Open Access Journals (Sweden)

    Parmanto Bambang

    2008-06-01

    Full Text Available Abstract. Background: Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable management and analysis of spatial data, but have limitations in performing analysis of numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods: SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for data analysis for CHA. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurement included: task completion time, success in answering the tasks, and system satisfaction. Results: Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed significantly from SPSS-GIS (α = .01) for satisfaction and time (p < .01). Conclusion: Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than with the current IT (SPSS-GIS).

  19. Performance assessment of air quality monitoring networks using principal component analysis and cluster analysis

    International Nuclear Information System (INIS)

    Lu, Wei-Zhen; He, Hong-Di; Dong, Li-yun

    2011-01-01

    This study aims to evaluate the performance of two statistical methods, principal component analysis and cluster analysis, for the management of the air quality monitoring network of Hong Kong and the reduction of associated expenses. The specific objectives include: (i) to identify city areas with similar air pollution behavior; and (ii) to locate emission sources. The statistical methods were applied to the mass concentrations of sulphur dioxide (SO2), respirable suspended particulates (RSP) and nitrogen dioxide (NO2), collected in the monitoring network of Hong Kong from January 2001 to December 2007. The results demonstrate that, for each pollutant, the monitoring stations are grouped into different classes based on their air pollution behaviors. Monitoring stations located in nearby areas are characterized by the same specific air pollution characteristics, which suggests that the air quality monitoring system could be managed more effectively. The redundant equipment should be transferred to other monitoring stations, allowing further enlargement of the monitored area. Additionally, the existence of different air pollution behaviors in the monitoring network is explained by the variability of wind directions across the region. The results imply that the air quality problem in Hong Kong is not only a local problem arising mainly from street-level pollution, but also a regional problem originating from the Pearl River Delta region. (author)
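
    As a purely illustrative sketch of grouping stations by pollution behavior (the station-by-pollutant matrix below is randomly generated, not the Hong Kong monitoring data):

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical long-term means: rows = monitoring stations, columns = pollutants
    data = pd.DataFrame(
        np.random.default_rng(2).random((10, 3)) * [60, 80, 90],
        columns=["SO2", "RSP", "NO2"],
        index=[f"station_{i}" for i in range(10)],
    )

    X = StandardScaler().fit_transform(data)
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)

    # Stations with similar scores on the leading components behave similarly,
    # which is the basis for grouping them and spotting redundant sites.
    print(pca.explained_variance_ratio_)
    print(pd.DataFrame(scores, index=data.index, columns=["PC1", "PC2"]))
    ```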

  20. The advanced scenario analysis for performance assessment of geological disposal. Pt. 3. Main document

    International Nuclear Information System (INIS)

    Ohkubo, Hiroo

    2004-02-01

    In the 'H12 Project to Establish Technical Basis for HLW Disposal in Japan', an approach based on an international consensus was adopted to develop the scenarios to be considered in performance assessment. The adequacy of the approach was, in general terms, appreciated through peer review. However, it was also suggested that there are issues related to improving the transparency and traceability of the procedure. Therefore, in the current financial year, a scenario development methodology was first constructed taking into account the requirements identified last year. Furthermore, a practical work-frame was developed to support the activities related to scenario development. This work-frame was applied to an example scenario to check its applicability and identify issues for further research. Secondly, a scenario analysis method for perturbation scenarios has been studied. First, a survey of the perturbation scenarios discussed in different countries was carried out and their assessment was examined. In particular, for Japan, technical information was classified in order to assess three scenarios: seismic activity, faulting and igneous activity. Then, on the basis of the assumed occurrence pattern and influence pattern for each perturbation scenario, the variant types that should be considered in this analysis were identified, and the concept of treatment, modeling data and requirements were clarified. As a result of this research, a future direction for advanced scenario analysis for performance assessment has been indicated, and associated issues to be discussed have been clarified. (author)

  1. Use of risk assessment methods for security design and analysis of nuclear and radioactive facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Andrade, Marcos C.; Jordao, Elizabete

    2011-01-01

    The objective of this work is to evaluate the applicability of risk assessment methods for analyzing the physical protection of nuclear and radioactive facilities. One of the important processes for physical protection in nuclear and radioactive facilities is the identification of areas containing nuclear materials, structures, systems or components to be protected from sabotage, which could directly or indirectly lead to unacceptable radiological consequences. A survey of the international guidelines and recommendations about vital area identification, design basis threat (DBT), and the security of nuclear and radioactive facilities was carried out. The traditional methods used for quantitative risk assessment, like FMEA (Failure Mode and Effect Analysis), Event and Decision Trees, Fault and Success Trees, Vulnerability Assessment, Monte Carlo Simulation, Probabilistic Safety Assessment, Scenario Analysis, and Game Theory, among others, are highlighted. The applicability of such techniques to security issues, their pros and cons, and the general resources needed to implement them, such as data or support software, are analyzed. Finally, an approach to security design and analysis is presented, beginning with a qualitative and preliminary examination to determine the range of possible scenarios, outcomes, and the systems to be included in the analyses, and proceeding to a progressive use of more quantitative techniques. (author)

  2. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    Science.gov (United States)

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly, with large effect sizes (d > 0.8), for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), and the total bite-size-to-travel ratio. The algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.

  3. Urban water metabolism efficiency assessment: integrated analysis of available and virtual water.

    Science.gov (United States)

    Huang, Chu-Long; Vause, Jonathan; Ma, Hwong-Wen; Yu, Chang-Ping

    2013-05-01

    Resolving the complex environmental problems of water pollution and shortage which occur during urbanization requires the systematic assessment of urban water metabolism efficiency (WME). While previous research has tended to focus on either available or virtual water metabolism, here we argue that the systematic problems arising during urbanization require an integrated assessment of available and virtual WME, using an indicator system based on material flow analysis (MFA) results. Future research should focus on the following areas: 1) analysis of available and virtual water flow patterns and processes through urban districts in different urbanization phases in years with varying amounts of rainfall, and their environmental effects; 2) based on the optimization of social, economic and environmental benefits, establishment of an indicator system for urban WME assessment using MFA results; 3) integrated assessment of available and virtual WME in districts with different urbanization levels, to facilitate study of the interactions between the natural and social water cycles; 4) analysis of mechanisms driving differences in WME between districts with different urbanization levels, and the selection of dominant social and economic driving indicators, especially those impacting water resource consumption. Combinations of these driving indicators could then be used to design efficient water resource metabolism solutions, and integrated management policies for reduced water consumption. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Transmission risk assessment of invasive fluke Fascioloides magna using GIS-modelling and multicriteria analysis methods

    Directory of Open Access Journals (Sweden)

    Juhásová L.

    2017-06-01

    Full Text Available The combination of multicriteria analysis (MCA), particularly the analytic hierarchy process (AHP), and a geographic information system (GIS) was applied for transmission risk assessment of Fascioloides magna (Trematoda; Fasciolidae) in south-western Slovakia. Based on the details of the F. magna life cycle, the following risk factors (RF) of parasite transmission were determined: intermediate (RFIH) and final hosts (RFFH) (biological factors), and annual precipitation (RFAP), land use (RFLU), flooded area (RFFA), and annual mean air temperature (RFAT) (environmental factors). Two types of risk analyses were modelled: (1) a potential risk analysis focused on the determination of the potential risk of parasite transmission into novel territories (data on F. magna occurrence were excluded); (2) an actual risk analysis that also considered the summary data on F. magna occurrence in the model region (the risk factor parasite occurrence, RFPO, was included in the analysis). The results of the potential risk analysis provided a novel distribution pattern and revealed a new geographical area as a potential risk zone of F. magna occurrence. Although the actual risk analysis revealed all four risk zones of F. magna transmission (acceptable, moderate, undesirable and unacceptable), its outputs were significantly affected by the data on parasite occurrence, which reduced the informative value of the actual transmission risk assessment.
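
    A hedged sketch of the weighted-overlay idea behind such GIS/AHP risk mapping follows; the rasters and weights below are invented for illustration and are not the study's values.

    ```python
    import numpy as np

    # Hypothetical 0-1 normalised risk-factor rasters (rows x cols grid):
    # intermediate hosts, final hosts, precipitation, land use, flooding, temperature.
    rng = np.random.default_rng(3)
    shape = (50, 50)
    factors = {name: rng.random(shape)
               for name in ["RF_IH", "RF_FH", "RF_AP", "RF_LU", "RF_FA", "RF_AT"]}

    # Assumed AHP-derived weights (sum to 1); illustrative only.
    weights = {"RF_IH": 0.30, "RF_FH": 0.25, "RF_AP": 0.15,
               "RF_LU": 0.15, "RF_FA": 0.10, "RF_AT": 0.05}

    risk = sum(weights[k] * factors[k] for k in factors)

    # Reclassify into four zones: acceptable .. unacceptable
    zones = np.digitize(risk, bins=[0.25, 0.5, 0.75])
    print(np.bincount(zones.ravel()))
    ```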

  5. Bibliometric analysis of global environmental assessment research in a 20-year period

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wei, E-mail: weili@bnu.edu.cn; Zhao, Yang

    2015-01-15

    Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field.
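
    Since the record above relies on the h-index as a quality measure, here is a minimal, generic sketch of how that index is computed (the citation counts are invented):

    ```python
    def h_index(citations):
        """h-index: the largest h such that h papers have at least h citations each."""
        cites = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(cites, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for one country's EA publications
    print(h_index([25, 17, 12, 9, 8, 5, 3, 1]))  # -> 5
    ```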

  6. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  7. Bibliometric analysis of global environmental assessment research in a 20-year period

    International Nuclear Information System (INIS)

    Li, Wei; Zhao, Yang

    2015-01-01

    Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field

  8. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Verbruggen, Aviel; Laes, Erik

    2015-01-01

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5

  9. A confirmative clinimetric analysis of the 36-item Family Assessment Device.

    Science.gov (United States)

    Timmerby, Nina; Cosci, Fiammetta; Watson, Maggie; Csillag, Claudio; Schmitt, Florence; Steck, Barbara; Bech, Per; Thastum, Mikael

    2018-02-07

    The Family Assessment Device (FAD) is a 60-item questionnaire widely used to evaluate self-reported family functioning. However, the factor structure as well as the number of items has been questioned. A shorter and more user-friendly version of the original FAD-scale, the 36-item FAD, has therefore previously been proposed, based on findings in a nonclinical population of adults. We aimed in this study to evaluate the brief 36-item version of the FAD in a clinical population. Data from a European multinational study, examining factors associated with levels of family functioning in adult cancer patients' families, were used. Both healthy and ill parents completed the 60-item version FAD. The psychometric analyses conducted were Principal Component Analysis and Mokken-analysis. A total of 564 participants were included. Based on the psychometric analysis we confirmed that the 36-item version of the FAD has robust psychometric properties and can be used in clinical populations. The present analysis confirmed that the 36-item version of the FAD (18 items assessing 'well-being' and 18 items assessing 'dysfunctional' family function) is a brief scale where the summed total score is a valid measure of the dimensions of family functioning. This shorter version of the FAD is, in accordance with the concept of 'measurement-based care', an easy to use scale that could be considered when the aim is to evaluate self-reported family functioning.

  10. Bioimpedance harmonic analysis as a tool to simultaneously assess circulation and nervous control.

    Science.gov (United States)

    Mudraya, I S; Revenko, S V; Nesterov, A V; Gavrilov, I Yu; Kirpatovsky, V I

    2011-07-01

    Multicycle harmonic (Fourier) analysis of bioimpedance was employed to simultaneously assess circulation and neural activity in visceral (rat urinary bladder) and somatic (human finger) organs. The informative value of the first cardiac harmonic of the bladder impedance as an index of bladder circulation is demonstrated. The individual reactions of normal and obstructive bladders in response to infusion cystometry were recorded. The potential of multicycle harmonic analysis of bioimpedance to assess sympathetic and parasympathetic neural control of the urinary bladder is discussed. In the human finger, bioimpedance harmonic analysis revealed three periodic components at the rates of the heartbeat, respiration and the Mayer wave (0.1 Hz), which were observed under normal conditions and during blood flow arrest in the hand. The revealed spectral peaks were explained by the changes in systemic blood pressure and in regional vascular tone resulting from neural vasomotor control. During normal respiration and circulation, two side cardiac peaks were revealed in the bioimpedance amplitude spectrum, whose amplitude reflected the depth of amplitude respiratory modulation of the cardiac output. During normal breathing, the peaks corresponding to the second and third cardiac harmonics were split, reflecting frequency respiratory modulation of the heart rate. Multicycle harmonic analysis of bioimpedance is thus a potent novel tool to examine the interaction between the respiratory and cardiovascular systems and to simultaneously assess regional circulation and neural influences in visceral and somatic organs.
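
    As an illustrative sketch only (using a synthetic signal, not the recordings described above), spectral peaks of the kind discussed, cardiac, respiratory and Mayer-wave components, can be located with a plain FFT; the sampling rate and component frequencies are assumptions.

    ```python
    import numpy as np

    fs = 100.0                          # sampling rate, Hz (assumed)
    t = np.arange(0, 60, 1 / fs)

    # Synthetic bioimpedance: cardiac (~1.2 Hz), respiratory (~0.25 Hz) and
    # Mayer-wave (~0.1 Hz) components plus noise, standing in for a real record.
    signal = (0.5 * np.sin(2 * np.pi * 1.2 * t)
              + 0.3 * np.sin(2 * np.pi * 0.25 * t)
              + 0.2 * np.sin(2 * np.pi * 0.1 * t)
              + 0.05 * np.random.randn(t.size))

    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

    # Report the three largest low-frequency peaks (below 3 Hz)
    low = freqs < 3.0
    top = np.argsort(spectrum[low])[-3:]
    print(sorted(freqs[low][top]))
    ```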

  11. TEM validation of immunohistochemical staining prior to assessment of tumour angiogenesis by computerised image analysis

    International Nuclear Information System (INIS)

    Killingsworth, M.C.

    2002-01-01

    Full text: Counts of microvessel density (MVD) within solid tumours have been shown to be an independent predictor of outcome, with higher counts generally associated with a worse prognosis. These assessments are commonly performed on immunoperoxidase stained (IPX) sections, with antibodies to CD34, CD31 and Factor VIII-related antigen routinely used as vascular markers. Tumour vascular density is thought to reflect the demand the growing neoplasm is placing on its feeding blood supply. Vascular density also appears to be associated with spread of invasive cells to distant sites. The present study of tumour angiogenesis in prostate cancer specimens aims to assess new vessel growth in addition to MVD counts. The hypothesis is that an assessment which takes into account vascular migration and proliferation, as well as the number of patent vessels present, may have improved predictive power over assessments based on MVD counts alone. We are employing anti-CD34 stained IPX sections which are digitally photographed and assessed by a computerised image analysis system. Our aim is to develop parameters whereby tumour angiogenesis may be assessed at the light microscopic level and then correlated with existing histological methods of tumour assessment such as Gleason grading. In order to use IPX stained sections for angiogenic assessment, validation and understanding of the anti-CD34 immunostaining pattern was necessary. This involved the following steps: i) Morphological assessment of angiogenic changes present in tumour blood vessels. Morphological changes in endothelial cells and pericytes indicative of angiogenic activation are generally below the level of resolution available with light microscopy. TEM examination revealed endothelial cell budding, pericyte retraction, basement membrane duplication and endothelial sprout formation in capillaries and venules surrounding tumour glands. This information assisted with the development of parameters by which IPX sections

  12. Reliability and accuracy of a video analysis protocol to assess core ability.

    Science.gov (United States)

    McDonald, Dawn A; Delgadillo, James Q; Fredericson, Michael; McConnell, Jennifer; Hodgins, Melissa; Besier, Thor F

    2011-03-01

    To develop and test a method to measure core ability in healthy athletes with 2-dimensional video analysis software (SiliconCOACH). Specific objectives were to: (1) develop a standardized exercise battery with progressions of increasing difficulty to evaluate areas of core ability in elite athletes; (2) develop an objective and quantitative grading rubric with the use of video analysis software; (3) assess the test-retest reliability of the exercise battery; (4) assess the interrater and intrarater reliability of the video analysis system; and (5) assess the accuracy of the assessment. Test-retest repeatability and accuracy. Testing was conducted in the Stanford Human Performance Laboratory, Stanford University, Stanford, CA. Nine female gymnasts currently training with the Stanford Varsity Women's Gymnastics Team participated in testing. Participants completed a test battery composed of planks, side planks, and leg bridges of increasing difficulty. Subjects completed two 20-minute testing sessions within a 4- to 10-day period. Two-dimensional sagittal-plane video was captured simultaneously with 3-dimensional motion capture. The main outcome measures were pelvic displacement and time that elapsed until failure occurred, as measured with SiliconCOACH video analysis software. Test-retest and interrater and intrarater reliability of the video analysis measures was assessed. Accuracy as compared with 3-dimensional motion capture also was assessed. Levels reached during the side planks and leg bridges had an excellent test-retest correlation (r² = 0.84, r² = 0.95). Pelvis displacements measured by examiner 1 and examiner 2 had an excellent correlation (r² = 0.86, intraclass correlation coefficient = 0.92). Pelvis displacements measured by examiner 1 during independent grading sessions had an excellent correlation (r² = 0.92). Pelvis displacements from the plank and from a set of combined plank and side plank exercises both had an excellent correlation with 3-dimensional motion capture.

  13. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2015-03-01

    Full Text Available The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using the Ishikawa diagram. The risk priority numbers were calculated. The uncertainty of the tensile test measurement was determined for three printing orientations. The results suggest that the material used to fabricate the tensile specimens shows clear anisotropy of the properties in relation to the printing direction.
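
    The FMEA step mentioned above reduces to multiplying severity, occurrence and detection scores into a risk priority number (RPN); a generic sketch with invented failure modes and scores (not the study's actual ratings) follows:

    ```python
    # Hypothetical FMEA entries for an additive manufacturing process: each failure
    # mode gets severity (S), occurrence (O) and detection (D) scores on a 1-10 scale.
    failure_modes = [
        {"mode": "layer delamination",      "S": 8, "O": 4, "D": 5},
        {"mode": "anisotropic shrinkage",   "S": 6, "O": 6, "D": 4},
        {"mode": "incomplete resin curing", "S": 7, "O": 3, "D": 6},
    ]

    for fm in failure_modes:
        fm["RPN"] = fm["S"] * fm["O"] * fm["D"]   # risk priority number

    # Rank by RPN to see which discrepancy drives the innovation risk most
    for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
        print(f'{fm["mode"]}: RPN = {fm["RPN"]}')
    ```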

  14. RELAP5/MOD2 Overview and Developmental Assessment Results from TMI-1 Plant Transient Analysis

    International Nuclear Information System (INIS)

    Lin, J. C.; Tsai, C. C.; Ransom, V. H.; Johnsen, G. W.

    2013-01-01

    RELAP5/MOD2 is a new version of the RELAP5 thermal-hydraulic computer code containing improved modeling features that provide a generic capability for pressurized water reactor transient simulation. The objective of this paper is to provide code users with an overview of the code and to report developmental assessment results obtained from a Three Mile Island Unit One plant transient analysis. The assessment shows that the injection of highly sub-cooled water into a high-pressure primary coolant system does not cause unphysical results or pose a problem for RELAP5/MOD2. (author)

  15. Contribution to Risk Analysis of a Standard Brewery: Application of a Hygiene Assessment System Survey

    OpenAIRE

    Raposo, António; Salazar, Jairo; Pérez, Esteban; Sanjuán, Esther; Carrascosa, Conrado; Saavedra, Pedro; Millán, Rafael

    2013-01-01

    "Beer is a food product with a high consumption in Gran Canaria and the brewery industry is also present in this island. In order to carry out this study, it was designed a survey to assist in the assessment of risks from the facilities and infrastructures of the brewery, the raw materials used in the beer production and the HACCP (Hazard Analysis and Critical Control Points) plan. An initial assessment of various aspects of the industry has been conducted at the beginning of hygienic-sani...

  16. Analysis and Assessment of Parameters Shaping Methane Hazard in Longwall Areas

    Directory of Open Access Journals (Sweden)

    Eugeniusz Krause

    2013-01-01

    Full Text Available Increasing coal production concentration and mining in coal seams of high methane content contribute to growing methane emissions into longwall areas. In this paper, an analysis of survey data concerning the assessment of parameters that influence the level of methane hazard in mining areas is presented. The survey was conducted with experts on ventilation and methane hazard in coal mines. The parameters which influence methane hazard in longwall areas were assigned specific weights (numerical values). The summary shows which of the assessed parameters have a strong, or weak, influence on methane hazard in longwall areas close to coal seams of high methane content.

  17. Comparative Analysis of Fuzzy Set Defuzzification Methods in the Context of Ecological Risk Assessment

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2017-12-01

    Full Text Available Fuzzy inference systems are widely used in various areas of human activity. Their most widespread use lies in the field of fuzzy control of technical devices of different kinds. Another direction of using fuzzy inference systems is the modelling and assessment of different kinds of risks under insufficient or missing objective initial data. Fuzzy inference concludes with the procedure of defuzzification of the resulting fuzzy sets. A large number of techniques for implementing the defuzzification procedure are available nowadays. The paper presents a comparative analysis of some widespread methods of fuzzy set defuzzification, and proposes the most appropriate methods in the context of ecological risk assessment.
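
    For illustration of the defuzzification step compared in the paper, here is a generic sketch; the triangular output set and the three methods shown are common textbook choices, not necessarily the exact sets or method list the authors analyse.

    ```python
    import numpy as np

    def defuzzify(x, mu, method="centroid"):
        """Defuzzify a discretised fuzzy set: x = universe, mu = membership grades."""
        x, mu = np.asarray(x, float), np.asarray(mu, float)
        if method == "centroid":                     # centre of gravity
            return np.sum(x * mu) / np.sum(mu)
        if method == "bisector":                     # splits the area in half
            cum = np.cumsum(mu)
            return x[np.searchsorted(cum, cum[-1] / 2.0)]
        if method == "mom":                          # mean of maxima
            return x[mu == mu.max()].mean()
        raise ValueError(method)

    # Triangular output fuzzy set for "risk level" on a 0-10 scale (illustrative)
    x = np.linspace(0, 10, 201)
    mu = np.clip(1 - np.abs(x - 6.0) / 2.0, 0, 1)
    for m in ("centroid", "bisector", "mom"):
        print(m, round(defuzzify(x, mu, m), 2))
    ```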

  18. The predictive value of aptitude assessment in laparoscopic surgery: a meta-analysis.

    Science.gov (United States)

    Kramp, Kelvin H; van Det, Marc J; Hoff, Christiaan; Veeger, Nic J G M; ten Cate Hoedemaker, Henk O; Pierie, Jean-Pierre E N

    2016-04-01

    Current methods of assessing candidates for medical specialties that involve laparoscopic skills suffer from a lack of instruments to assess the ability to work in a minimally invasive surgery environment. A meta-analysis was conducted to investigate whether aptitude assessment can be used to predict variability in the acquisition and performance of laparoscopic skills. PubMed, PsycINFO and Google Scholar were searched to November 2014 for published and unpublished studies reporting the measurement of a form of aptitude for laparoscopic skills. The quality of studies was assessed with QUADAS-2. Summary correlations were calculated using a random-effects model. Thirty-four studies were found to be eligible for inclusion; six of these studies used an operating room performance measurement. Laparoscopic skills correlated significantly with visual-spatial ability (r = 0.32, 95% confidence interval [CI] 0.25-0.39; p < 0.001), perceptual ability (r = 0.31, 95% CI 0.22-0.39; p < 0.001), psychomotor ability (r = 0.26, 95% CI 0.10-0.40; p = 0.003) and simulator-based assessment of aptitude (r = 0.64, 95% CI 0.52-0.73; p < 0.001). Three-dimensional dynamic visual-spatial ability showed a significantly higher correlation than intrinsic static visual-spatial ability (p = 0.024). In general, aptitude assessments are associated with laparoscopic skill level. Simulator-based assessment of aptitude appears to have the potential to represent a job sample and to enable the assessment of all forms of aptitude for laparoscopic surgery at once. A laparoscopy aptitude test can be a valuable additional tool in the assessment of candidates for medical specialties that require laparoscopic skills. © 2016 John Wiley & Sons Ltd.
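
    The summary correlations above come from a random-effects model; a hedged, generic sketch of such a pooling (DerSimonian-Laird on Fisher-z transformed correlations, with invented study values) is:

    ```python
    import numpy as np

    def random_effects_correlation(r, n):
        """DerSimonian-Laird random-effects summary of correlations via Fisher's z.
        r: per-study correlations, n: per-study sample sizes. Returns the summary
        correlation and a 95% confidence interval (both back on the r scale)."""
        r, n = np.asarray(r, float), np.asarray(n, float)
        z = np.arctanh(r)                 # Fisher z transform
        v = 1.0 / (n - 3.0)               # within-study variance of z
        w = 1.0 / v
        z_fixed = np.sum(w * z) / np.sum(w)
        Q = np.sum(w * (z - z_fixed) ** 2)
        df = len(r) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - df) / c)     # between-study variance
        w_star = 1.0 / (v + tau2)
        z_re = np.sum(w_star * z) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        ci = np.tanh([z_re - 1.96 * se, z_re + 1.96 * se])
        return np.tanh(z_re), ci

    # Hypothetical study-level correlations between visual-spatial ability
    # and laparoscopic performance
    print(random_effects_correlation([0.25, 0.40, 0.30, 0.20], [40, 55, 32, 60]))
    ```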

  19. Dependence assessment in human reliability analysis based on D numbers and AHP

    International Nuclear Information System (INIS)

    Zhou, Xinyi; Deng, Xinyang; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    Highlights: • D numbers and AHP are combined to implement dependence assessment in HRA. • A new tool, called D numbers, is used to deal with the uncertainty in HRA. • The proposed method can well address the fuzziness and subjectivity in linguistic assessment. • The proposed method is well applicable in dependence assessment which inherently has a linguistic assessment process. - Abstract: Since human errors always cause heavy loss, especially in nuclear engineering, human reliability analysis (HRA) has attracted more and more attention. Dependence assessment plays a vital role in HRA, measuring the dependence degree of human errors. Much research has been done, but there is still room for improvement. In this paper, a dependence assessment model based on D numbers and the analytic hierarchy process (AHP) is proposed. Firstly, the factors used to measure the dependence level of two human operations are identified. Besides, in terms of the suggested dependence levels, the anchor points for each factor are determined and quantified. Secondly, D numbers and AHP are adopted in the model. Experts evaluate the dependence level of the human operations for each factor. Then, the evaluation results are presented as D numbers and fused by the D numbers combination rule, which yields the dependence probability of the human operations for each factor. The weights of the factors can be determined by AHP. Thirdly, based on the dependence probability for each factor and its corresponding weight, the dependence probability of two human operations and its confidence can be obtained. The proposed method can well address the fuzziness and subjectivity in linguistic assessment. The proposed method is well applicable to assess the dependence degree of human errors in HRA, which inherently has a linguistic assessment process.
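
    The AHP part of such a model boils down to deriving factor weights from a pairwise-comparison matrix; a generic sketch (the judgement matrix below is invented, not the paper's) is:

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority vector of an AHP pairwise-comparison matrix (principal
        eigenvector), plus the consistency ratio as a quick sanity check."""
        A = np.asarray(pairwise, float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)          # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # tabulated random index
        return w, ci / ri

    # Hypothetical pairwise judgements for three dependence-influencing factors
    A = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    weights, cr = ahp_weights(A)
    print(weights, cr)
    ```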

  20. Dependence assessment in human reliability analysis based on D numbers and AHP

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Xinyi; Deng, Xinyang [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Deng, Yong, E-mail: ydeng@swu.edu.cn [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu, Sichuan 610054 (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville, TN 37235 (United States)

    2017-03-15

    Highlights: • D numbers and AHP are combined to implement dependence assessment in HRA. • A new tool, called D numbers, is used to deal with the uncertainty in HRA. • The proposed method can well address the fuzziness and subjectivity in linguistic assessment. • The proposed method is well applicable in dependence assessment which inherently has a linguistic assessment process. - Abstract: Since human errors always cause heavy loss, especially in nuclear engineering, human reliability analysis (HRA) has attracted more and more attention. Dependence assessment plays a vital role in HRA, measuring the dependence degree of human errors. Much research has been done, but there is still room for improvement. In this paper, a dependence assessment model based on D numbers and the analytic hierarchy process (AHP) is proposed. Firstly, the factors used to measure the dependence level of two human operations are identified. Besides, in terms of the suggested dependence levels, the anchor points for each factor are determined and quantified. Secondly, D numbers and AHP are adopted in the model. Experts evaluate the dependence level of the human operations for each factor. Then, the evaluation results are presented as D numbers and fused by the D numbers combination rule, which yields the dependence probability of the human operations for each factor. The weights of the factors can be determined by AHP. Thirdly, based on the dependence probability for each factor and its corresponding weight, the dependence probability of two human operations and its confidence can be obtained. The proposed method can well address the fuzziness and subjectivity in linguistic assessment. The proposed method is well applicable to assess the dependence degree of human errors in HRA, which inherently has a linguistic assessment process.

  1. Cluster analysis and quality assessment of logged water at an irrigation project, eastern Saudi Arabia.

    Science.gov (United States)

    Hussain, Mahbub; Ahmed, Syed Munaf; Abderrahman, Walid

    2008-01-01

    A multivariate statistical technique, cluster analysis, was used to assess the logged surface water quality at an irrigation project at Al-Fadhley, Eastern Province, Saudi Arabia. The principal idea behind using the technique was to utilize all available hydrochemical variables in the quality assessment including trace elements and other ions which are not considered in conventional techniques for water quality assessments like Stiff and Piper diagrams. Furthermore, the area belongs to an irrigation project where water contamination associated with the use of fertilizers, insecticides and pesticides is expected. This quality assessment study was carried out on a total of 34 surface/logged water samples. To gain a greater insight in terms of the seasonal variation of water quality, 17 samples were collected from both summer and winter seasons. The collected samples were analyzed for a total of 23 water quality parameters including pH, TDS, conductivity, alkalinity, sulfate, chloride, bicarbonate, nitrate, phosphate, bromide, fluoride, calcium, magnesium, sodium, potassium, arsenic, boron, copper, cobalt, iron, lithium, manganese, molybdenum, nickel, selenium, mercury and zinc. Cluster analysis in both Q and R modes was used. Q-mode analysis resulted in three distinct water types for both the summer and winter seasons. Q-mode analysis also showed the spatial as well as temporal variation in water quality. R-mode cluster analysis led to the conclusion that there are two major sources of contamination for the surface/shallow groundwater in the area: fertilizers, micronutrients, pesticides, and insecticides used in agricultural activities, and non-point natural sources.

  2. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis

    Directory of Open Access Journals (Sweden)

    Ágatha Nogueira Previdelli

    2016-09-01

    Full Text Available The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of the Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, the hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in the data-driven analysis, fruits and whole grains were not present in any pattern. High intakes of sodium, fats and sugars were observed in the hypothesis-driven analysis, with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components; in agreement, the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations.

  3. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    DEFF Research Database (Denmark)

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case design of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in the sustainability metrics and to define design/retrofit targets for process improvements. Economic analysis is performed to evaluate the profitability of the process. Also, simultaneously with the sustainability analysis, the life cycle impact on the environment associated with bioethanol production is assessed. Finally, candidate alternative designs are generated and compared with the base case design in terms of LCA, economics, waste, energy usage and environmental impact...

  4. Procedures for conducting common cause failure analysis in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1992-05-01

    The principal objective of this report is to supplement the procedure developed in Mosleh et al. (1988, 1989) by providing more explicit guidance for a practical approach to common cause failures (CCF) analysis. The detailed CCF analysis following that procedure would be very labour intensive and time consuming. This document identifies a number of options for performing the more labour intensive parts of the analysis in an attempt to achieve a balance between the need for detail, the purpose of the analysis and the resources available. The document is intended to be compatible with the Agency's Procedures for Conducting Probabilistic Safety Assessments for Nuclear Power Plants (IAEA, 1992), but can be regarded as a stand-alone report to be used in conjunction with NUREG/CR-4780 (Mosleh et al., 1988, 1989) to provide additional detail, and discussion of key technical issues

  5. Radiation Safety Analysis In The NFEC For Assessing Possible Implementation Of The ICRP-60 Standard

    International Nuclear Information System (INIS)

    Yowono, I.

    1998-01-01

    Radiation safety analysis of the 3 facilities in the nuclear fuel element center (NFEC), for assessing the possible implementation of the ICRP-60 standard, has been carried out. The analysis covered the radiation dose received by workers, the dose rate in the working area, the surface contamination level, the air contamination level and the level of radioactive gas release to the environment. The analysis was based on the BATAN regulation and the ICRP-60 standard. The results showed that the highest radiation dose received was only around 15% of the value set in the ICRP-60 standard and only 6% of the value set in the BATAN regulation. Thus the ICRP-60 could be implemented as a radiation safety standard without changing the laboratory design.

  6. Rad waste disposal safety analysis / Integrated safety assessment of a waste repository

    International Nuclear Information System (INIS)

    Jeong, Jongtae; Choi, Jongwon; Kang, Chulhyung

    2012-04-01

    We developed CYPRUS+ and adopted the PID and RES methods for the development of scenarios. A safety performance assessment program was developed using GoldSim for the safety assessment of the disposal system for spent fuels and wastes resulting from pyroprocessing. A biosphere model was developed and verified in cooperation with JAEA. The capability to evaluate post-closure performance and safety was added to the previously developed program. Nuclide migration and release to the biosphere, considering site characteristics, was evaluated using deterministic and probabilistic approaches. Operational safety assessment for drop, fire, and earthquake events was also statistically evaluated considering well-established input parameter distributions. A conservative assessment showed that the dose rate is below the limit value for a low- and intermediate-level repository. The gas generation mechanism within the engineered barrier was defined and its influence on safety was evaluated. We performed a probabilistic safety assessment by obtaining the probability distribution functions of important input variables and also carried out a sensitivity analysis. The maximum annual dose rate was shown to be below the safety limit value of 10 mSv/yr. The structure and elements of a safety case were developed to increase the reliability of the safety assessment methodology for a deep geological repository. Finally, a milestone plan for safety case development and an implementation strategy for each safety case element were also proposed.

  7. An approach to prospective consequential life cycle assessment and net energy analysis of distributed electricity generation

    International Nuclear Information System (INIS)

    Jones, Christopher; Gilbert, Paul; Raugei, Marco; Mander, Sarah; Leccisi, Enrica

    2017-01-01

    Increasing distributed renewable electricity generation is one of a number of technology pathways available to policy makers to meet environmental and other sustainability goals. Determining the efficacy of such a pathway for a national electricity system implies evaluating whole system change in future scenarios. Life cycle assessment (LCA) and net energy analysis (NEA) are two methodologies suitable for prospective and consequential analysis of energy performance and associated impacts. This paper discusses the benefits and limitations of prospective and consequential LCA and NEA analysis of distributed generation. It concludes that a combined LCA and NEA approach is a valuable tool for decision makers if a number of recommendations are addressed. Static and dynamic temporal allocation are both needed for a fair comparison of distributed renewables with thermal power stations to account for their different impact profiles over time. The trade-offs between comprehensiveness and uncertainty in consequential analysis should be acknowledged, with system boundary expansion and system simulation models limited to those clearly justified by the research goal. The results of this approach are explorative, rather than for accounting purposes; this interpretive remit, and the assumptions in scenarios and system models on which results are contingent, must be clear to end users. - Highlights: • A common LCA and NEA framework for prospective, consequential analysis is discussed. • Approach to combined LCA and NEA of distributed generation scenarios is proposed. • Static and dynamic temporal allocation needed to assess distributed generation uptake.

  8. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether or not the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance, with a 10-fold cross-validation, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed in every cluster. The presence of training and test set compounds in all clusters identified by generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within the clusters, sustain a proper assignment of compounds to the training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
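
    The clustering check described in the abstract can be illustrated with a short script: K-means clusters are built on the activity and descriptor variables, and each cluster is inspected for the presence of both training and test compounds. This is a minimal sketch with synthetic placeholder data, not the authors' code.

```python
# Minimal sketch (not the authors' code) of checking whether a random
# training/test split is "proper" in the sense of the abstract: K-means
# clusters built on activity + descriptors should each contain compounds
# from both sets. Feature values and the split here are placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # observed activity + 4 descriptors (placeholder)
is_training = rng.random(100) < 0.7      # hypothetical random assignment

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for k in range(3):
    in_cluster = labels == k
    n_train = int(np.sum(in_cluster & is_training))
    n_test = int(np.sum(in_cluster & ~is_training))
    print(f"cluster {k}: {n_train} training, {n_test} test compounds")
# A "proper" assignment shows both counts > 0 in every cluster.
```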

  9. The importance of accurate anatomic assessment for the volumetric analysis of the amygdala

    Directory of Open Access Journals (Sweden)

    L. Bonilha

    2005-03-01

    Full Text Available There is a wide range of values reported in volumetric studies of the amygdala. The use of thick, single-plane magnetic resonance imaging (MRI) slices may prevent the correct visualization of anatomic landmarks and yield imprecise results. To assess whether there is a difference between volumetric analysis of the amygdala performed with single-plane 3-mm MRI slices and multiplanar analysis of 1-mm MRI slices, we studied healthy subjects and patients with temporal lobe epilepsy. We performed manual delineation of the amygdala on T1-weighted inversion recovery, 3-mm coronal slices and on three-dimensional volumetric T1-weighted images with 1-mm slice thickness. The data were compared using a dependent t-test. There was a significant difference between the volumes obtained by the coronal plane-based measurements and those obtained by three-dimensional analysis (P < 0.001). An incorrect estimate of the amygdala volume may preclude a correct analysis of the biological effects of alterations in amygdala volume. Three-dimensional analysis is preferred because it is based on more extensive anatomical assessment, and its results are similar to those obtained in post-mortem studies.
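
    The statistical comparison named in the abstract is a dependent (paired) t-test between the two volume estimates for the same subjects; a minimal sketch with synthetic placeholder volumes is shown below.

```python
# A minimal sketch of the dependent (paired) t-test comparison described in the
# abstract: amygdala volumes from 3-mm single-plane tracing vs. 1-mm
# three-dimensional tracing for the same subjects. Volumes below are synthetic
# placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
vol_3mm = rng.normal(loc=1700, scale=150, size=30)              # mm^3, hypothetical
vol_1mm_3d = vol_3mm + rng.normal(loc=120, scale=60, size=30)   # hypothetical offset

t_stat, p_value = stats.ttest_rel(vol_3mm, vol_1mm_3d)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```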

  10. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2002-01-01

    This paper introduces the background and development activities of the domestic standardization of the procedure and method for Human Reliability Analysis (HRA), intended to avoid, as far as possible, the intervention of the HRA analyst's subjectivity in Probabilistic Safety Assessment (PSA), together with a review of the HRA results for domestic nuclear power plants under design studied by the Korea Atomic Energy Research Institute. We identify the HRA methods used in PSAs for domestic NPPs and discuss the subjectivity of the HRA analyst shown in performing an HRA. We also introduce the PSA guidelines published in the USA and review the HRA results based on them. We propose the framework of a standard procedure and method for HRA to be developed.

  11. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation and a passive RHRS through in-reactor HXs. These features defined the main trends in the experimental investigations and in the verification efforts for the computer codes applied. The paper briefly reviews the experimental investigations of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. An assessment of the applicability of RELAP5/mod3 to accident analysis in integral reactors is presented.

  12. ASSESSMENT OF PLASTIC FLOWS AND STOCKS IN SERBIA USING MATERIAL FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Goran Vujić

    2010-01-01

    Full Text Available Material flow analysis (MFA) was used to assess the amounts of plastic material flows and stocks that are annually produced, consumed, imported, exported, collected, recycled, and disposed of in landfills in Serbia. The analysis revealed that approximately 269,000 tons of plastic materials are directly disposed of in uncontrolled landfills in Serbia without any pretreatment, and that significant amounts of these materials have already accumulated in the landfills. The substantial amounts of landfilled plastics represent not only a loss of valuable resources but also a serious threat to the environment and human health; if the trend of direct plastic landfilling continues, Serbia will face grave consequences.
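
    The bookkeeping behind an MFA of this kind is a mass balance per process (stock change = inputs minus outputs); the sketch below illustrates it with hypothetical flows, except that the landfill flow echoes the figure quoted in the abstract.

```python
# A minimal sketch of the material flow analysis (MFA) bookkeeping behind the
# abstract: the change in plastic stock equals inputs minus outputs for each
# process. Except for the landfill figure quoted in the abstract, the flow
# values are hypothetical placeholders, not the Serbian data.
def stock_change(inputs: dict[str, float], outputs: dict[str, float]) -> float:
    """Mass balance for one process: d(stock)/dt = sum(inputs) - sum(outputs)."""
    return sum(inputs.values()) - sum(outputs.values())

# Hypothetical yearly flows for a "consumption" process, in tonnes
inflows = {"domestic production": 120_000.0, "imports": 200_000.0}
outflows = {"exports": 20_000.0, "recycling": 15_000.0, "landfill": 269_000.0}

print(f"annual stock accumulation: {stock_change(inflows, outflows):,.0f} t")
```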

  13. A systems engineering cost analysis capability for use in assessing nuclear waste management system cost performance

    International Nuclear Information System (INIS)

    Shay, M.R.

    1990-04-01

    The System Engineering Cost Analysis (SECA) capability has been developed by the System Integration Branch of the US Department of Energy's Office of Civilian Radioactive Waste Management for use in assessing the cost performance of alternative waste management system configurations. The SECA capability is designed to provide rapid cost estimates of the waste management system for a given operational scenario and to permit aggregate or detailed cost comparisons for alternative waste system configurations. This capability may be used as an integral part of the System Integration Modeling System (SIMS) or, with appropriate input defining a scenario, as a separate cost analysis model

  14. Analysis of Parameters Assessment on Laminated Rubber-Metal Spring for Structural Vibration

    International Nuclear Information System (INIS)

    Salim, M.A.; Putra, A.; Mansor, M.R.; Musthafah, M.T.; Akop, M.Z.; Abdullah, M.A.

    2016-01-01

    This paper presents an analysis of parameter assessment on a laminated rubber-metal spring (LR-MS) for a vibrating structure. Three parameters were selected for the assessment: mass, Young's modulus and radius. Natural rubber material was used to develop the LR-MS model. Three analyses based on the selected parameters were then conducted on the LR-MS performance: natural frequency, location of the internal resonance frequency and transmissibility at internal resonance. The results of the analyses were plotted as frequency-domain graphs. The transmissibility of the laminated rubber-metal spring (LR-MS) changes as the parameter values are changed. This behaviour was compared with theory from the open literature, and it is concluded that these parameters can affect the trends of the LR-MS transmissibility. (paper)

  15. Approach to proliferation risk assessment based on multiple objective analysis framework

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    An approach to the assessment of proliferation risk using methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  16. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    ... are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge on microbial water quality, and related health risks, makes it difficult to implement microbial risk analysis as a part of the basis for decision making. The main aim of this PhD thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes ... The results in this thesis have brought microbial risk assessments one step closer to more uniform and repeatable risk analysis by using actual and relevant measured data and hydrodynamic water quality models to estimate the risk from flooding caused ..., but also when wading through a flooded area.
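
    The dose-response step of a quantitative microbial risk assessment such as the one described above can be sketched as follows; the exponential model and the parameter value used are illustrative assumptions, not thesis results.

```python
# A minimal sketch of the dose-response step of a quantitative microbial risk
# assessment (QMRA). The exponential model and the parameter value used here
# are illustrative assumptions, not results from the thesis.
import numpy as np

def infection_probability(dose: np.ndarray, r: float) -> np.ndarray:
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - np.exp(-r * np.asarray(dose, dtype=float))

# Hypothetical ingested doses (organisms) for people wading through floodwater
doses = np.array([1.0, 10.0, 100.0])
print(infection_probability(doses, r=0.01))
```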

  17. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.
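
    The AHP component of such dependence assessment methods, deriving priority weights for dependence-influencing factors from a pairwise comparison matrix, can be sketched as below. The comparison values are hypothetical, and the Dempster-Shafer (evidential) layer of the published method is not reproduced.

```python
# A minimal sketch of the analytic hierarchy process (AHP) step used by such
# dependence assessment methods: priority weights for dependence-influencing
# factors are taken from the principal eigenvector of a pairwise comparison
# matrix. The comparison values are hypothetical; the Dempster-Shafer
# (evidential) layer of the published method is not reproduced here.
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> tuple[np.ndarray, float]:
    """Return priority weights and the consistency ratio (CR) for a
    reciprocal pairwise comparison matrix."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    # Saaty's random consistency index for n = 1..5
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / ri if ri > 0 else 0.0
    return w, cr

# Hypothetical comparison of three factors (e.g. time gap, stress, similarity)
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
weights, cr = ahp_weights(A)
print(weights, f"CR = {cr:.3f}")
```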

  18. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    International Nuclear Information System (INIS)

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions, and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and of the assumptions used for the analysis.

  19. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic...... pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were...... indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics....

  20. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  1. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  2. Approach to proliferation risk assessment based on multiple objective analysis framework

    International Nuclear Information System (INIS)

    Andrianov, A.; Kuptsov, I.

    2013-01-01

    An approach to the assessment of proliferation risk using methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  3. The role of damage analysis in the assessment of service-exposed components

    International Nuclear Information System (INIS)

    Bendick, W.; Muesch, H.; Weber, H.

    1987-01-01

    Components in power stations are subjected to service conditions under which creep processes take place, limiting the component's lifetime through material exhaustion. To ensure safe and economic plant operation it is necessary to obtain information about the grade of exhaustion of individual components as well as of the whole plant. A comprehensive lifetime assessment requires complete knowledge of the service parameters, the component's deformation behavior, and the change in material properties caused by long-term exposure to high service temperatures. A basis for evaluation is given by: 1) determination of material exhaustion by calculation, 2) investigation of the material properties, and 3) damage analysis. The purpose of this report is to show the role which damage analysis can play in the assessment of service-exposed components. As an example, the test results of a damaged pipe bend are discussed. (orig./MM)

  4. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    International Nuclear Information System (INIS)

    Harry, T; Manger, R; Cervino, L; Pawlicki, T

    2016-01-01

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Mode and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Mode and Effects Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes provides further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from that work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA, as described by the Veterans Affairs, provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth details, but at the cost of elevated effort.
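
    The scoring difference highlighted in the results can be sketched as follows: a TG-100-style FMEA risk priority number against a VA-style HFMEA hazard score. The example scores and the cutoffs mentioned in the comments are illustrative assumptions, not values from the study.

```python
# A minimal sketch contrasting the two scoring schemes the abstract compares:
# TG-100-style FMEA uses a risk priority number (severity x occurrence x
# detectability), while the VA HFMEA uses a hazard score (severity x
# probability) plus decision-tree questions. The example scores and the
# thresholds in the comments are illustrative assumptions, not study values.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int        # commonly 1-10 in FMEA
    occurrence: int
    detectability: int   # FMEA only
    hfmea_severity: int  # commonly 1-4 in HFMEA
    hfmea_probability: int

def fmea_rpn(fm: FailureMode) -> int:
    return fm.severity * fm.occurrence * fm.detectability

def hfmea_hazard_score(fm: FailureMode) -> int:
    return fm.hfmea_severity * fm.hfmea_probability

fm = FailureMode("wrong isocenter shift", severity=9, occurrence=3,
                 detectability=5, hfmea_severity=4, hfmea_probability=2)
print("FMEA RPN:", fmea_rpn(fm), "| HFMEA hazard score:", hfmea_hazard_score(fm))
# Each scheme then flags "critical" modes against its own cutoff
# (e.g. a high RPN, or a hazard score of roughly 8 or more in HFMEA).
```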

  5. Sustainability Assessment of Electricity Generation Technologies in Egypt Using Multi-Criteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Mostafa Shaaban

    2018-05-01

    Full Text Available Future electricity planning necessitates a thorough multi-faceted analysis of the available technologies in order to secure the energy supply for coming generations. To cope with worldwide concerns over sustainable development and to meet the growing demand for electricity, we assess potential future technologies in Egypt by covering their technical, economic, environmental and social aspects. In this study we fill the gap of a lacking sustainability assessment of energy systems in Egypt, where most studies focus mainly on the economic and technical aspects of planning the future installation of power plants. Furthermore, we include stakeholder preferences for the indicators in the energy sector in our assessment. Moreover, we perform a sensitivity analysis through single-dimension assessment scenarios of the technologies as well as a sustainable scenario with equal preferences across all dimensions of sustainability. We employ two multi-criteria decision analysis (MCDA) methodologies: the analytical hierarchy process for weighting the assessment criteria, and the weighted sum method for generating a general integrated sustainability index for each technology. The study investigates seven technologies: coal, natural gas, wind, concentrated solar power, photovoltaics, biomass and nuclear. The results reveal a perfect match between the ranking of the technologies by the stakeholders and the sustainable scenario, showing the highest ranking for natural gas and the lowest for nuclear and coal. There is a strong potential for renewable energy technologies to enter the electricity market in Egypt, where they achieve the second ranking after natural gas. The Monte-Carlo approach gives photovoltaics a higher ranking over concentrated solar power as compared to the sample data ranking. The study concludes the importance of a multi-dimensional evaluation of the technologies while considering the preferences of the stakeholders in
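
    The weighted sum aggregation named in the abstract can be sketched as below, with criterion weights taken as given (for instance from an AHP elicitation) and all indicator scores as placeholders.

```python
# A minimal sketch of the two MCDA steps named in the abstract: criterion
# weights (here taken as given, e.g. from an AHP elicitation) combined with
# min-max normalised indicator scores through the weighted sum method to give
# one sustainability index per technology. All numbers are placeholders.
import numpy as np

technologies = ["natural gas", "wind", "PV", "nuclear"]
criteria = ["technical", "economic", "environmental", "social"]

# Hypothetical raw indicator scores (rows: technologies, columns: criteria),
# already oriented so that higher is better.
scores = np.array([[0.8, 0.9, 0.4, 0.7],
                   [0.6, 0.6, 0.9, 0.8],
                   [0.5, 0.5, 0.9, 0.8],
                   [0.7, 0.4, 0.6, 0.3]])
weights = np.array([0.25, 0.25, 0.25, 0.25])  # "sustainable scenario": equal weights

norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
index = norm @ weights

for tech, value in sorted(zip(technologies, index), key=lambda t: -t[1]):
    print(f"{tech:12s} {value:.2f}")
```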

  6. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Harry, T [Oregon State University, Corvallis, OR (United States); University of California, San Diego, La Jolla, CA (United States); Manger, R; Cervino, L; Pawlicki, T [University of California, San Diego, La Jolla, CA (United States)

    2016-06-15

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Mode and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Mode and Effects Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes provides further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from that work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA, as described by the Veterans Affairs, provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth details, but at the cost of elevated effort.

  7. Application of data analysis techniques to nuclear reactor systems code accuracy assessment

    International Nuclear Information System (INIS)

    Kunz, R.F.; Kasmala, G.F.; Murray, C.J.; Mahaffy, J.H.

    2000-01-01

    An automated code assessment program (ACAP) has been developed by the authors to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. This software was developed under subcontract to the United States Nuclear Regulatory Commission for use in its NRS code consolidation efforts. In this paper, background on the topic of NRS accuracy and uncertainty assessment is provided which motivates the development of and defines basic software requirements for ACAP. A survey of data analysis techniques was performed, focusing on the applicability of methods in the construction of NRS code-data comparison measures. The results of this review process, which further defined the scope, user interface and process for using ACAP are also summarized. A description of the software package and several sample applications to NRS data sets are provided. Its functionality and ability to provide objective accuracy assessment figures are demonstrated. (author)
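
    The kind of quantitative code-to-data comparison measures such a tool is built around can be sketched as below; the actual ACAP figure-of-merit definitions are not reproduced here, and the data are synthetic.

```python
# A minimal sketch of quantitative code-to-data comparison measures of the kind
# an accuracy assessment tool like ACAP is built around (the actual ACAP figure
# of merit definitions are not reproduced here). Data are synthetic placeholders.
import numpy as np

def comparison_measures(code: np.ndarray, experiment: np.ndarray) -> dict[str, float]:
    """Simple accuracy measures between code predictions and measurements."""
    err = code - experiment
    return {
        "mean error": float(np.mean(err)),
        "rms error": float(np.sqrt(np.mean(err**2))),
        "correlation": float(np.corrcoef(code, experiment)[0, 1]),
    }

rng = np.random.default_rng(2)
measured = np.sin(np.linspace(0, 3, 50))            # hypothetical transient
predicted = measured + rng.normal(0, 0.05, 50)      # hypothetical code output
print(comparison_measures(predicted, measured))
```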

  8. Tolerability of risk, safety assessment principles and their implications for probabilistic safety analysis

    International Nuclear Information System (INIS)

    Ewing, D.J.F.; Campbell, J.F.

    1994-01-01

    This paper gives a regulatory view of probabilistic safety assessment as seen by the Nuclear Installations Inspectorate (NII) and in the light of the general regulatory risk aims set out in the Health and Safety Executive's (HSE) The tolerability of risk from nuclear power stations (TOR) and in Safety assessment principles for nuclear plants (SAPs), prepared by NII on behalf of the HSE. Both of these publications were revised and republished in 1992. This paper describes the SAPs, together with the historical background, the motivation for review, the effects of the Sizewell and Hinkley Point C public inquiries, changes since the original versions, comparison with international standards and use in assessment. For new plant, probabilistic safety analysis (PSA) is seen as an essential tool in balancing the safety of the design and in demonstrating compliance with TOR and the SAPs. (Author)

  9. Psychometric Analysis of the Work/Life Balance Self-Assessment Scale.

    Science.gov (United States)

    Smeltzer, Suzanne C; Cantrell, Mary Ann; Sharts-Hopko, Nancy C; Heverly, Mary Ann; Jenkinson, Amanda; Nthenge, Serah

    2016-01-01

    This study investigated the psychometric properties of the Work/Life Balance Self-Assessment scale among nurse faculty involved in doctoral education. A national random sample of 554 respondents completed the Work/Life Balance Self-Assessment scale, which addresses 3 factors: work interference with personal life (WIPL), personal life interference with work (PLIW), and work/personal life enhancement (WPLE). A principal components analysis with varimax rotation revealed 3 internally consistent aspects of work-life balance, explaining 40.5% of the variance. The Cronbach's alpha coefficients for reliability of the scale were .88 for the total scale and for the subscales, .93 (WIPL), .85 (PLIW), and .69 (WPLE). The Work/Life Balance Self-Assessment scale appears to be a reliable and valid instrument to examine work-life balance among nurse faculty.
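
    The reliability coefficient reported above, Cronbach's alpha, can be computed from an item-response matrix as in the following minimal sketch with synthetic responses.

```python
# A minimal sketch of the Cronbach's alpha reliability coefficient reported in
# the abstract, computed from an item-response matrix. Responses are synthetic.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=0.7, size=(200, 6))  # 6 hypothetical items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```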

  10. Analysis of the possibilities of using EEG in assessing pilots’ psychophysical condition

    Directory of Open Access Journals (Sweden)

    Marta GALANT

    2017-06-01

    Full Text Available An excessive load on an operator’s cognitive system can cause deterioration in perceptual abilities, decreased reaction time and increased probability of making an incorrect decision, which in turn can lead to a dangerous situation. Researching the cognitive load of an operator can therefore contribute to safer transportation. While there are many methods used in the study of cognitive load, they can be classified as either subjective assessments or objective assessments. This paper presents an analysis of the possibilities of using electroencephalography in assessing the psychophysical condition of the pilot. The investigation was conducted in the Simulation Research Laboratory in the Institute of Combustion Engines and Transport at Poznan University of Technology.

  11. Conference Analysis Report of Assessments on Defect and Damage for a High Temperature Structure

    International Nuclear Information System (INIS)

    Lee, Hyeong Yeon

    2008-11-01

    This report presents an analysis of the state-of-the-art research trends on creep-fatigue damage, defect assessment of high temperature structures, and the development of heat resistant materials and their behavior at high temperature, based on the papers presented at two international conferences: ASME PVP 2008, held in Chicago in July 2008, and CF-5 (5th International Conference on Creep, Fatigue and Creep-Fatigue), held in Kalpakkam, India in September 2008.

  12. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes

  13. ELEMENTAL ANALYSIS OF RESPIRABLE TIRE PARTICLES AND ASSESSMENT OF CARDIO-PULMONARY TOXICITY IN RATS

    Science.gov (United States)

    Elemental Analysis of Respirable Tire Particles and Assessment of Cardio-pulmonary Toxicity in Rats. R.R. Gottipolu, E. Landa, J.K. McGee, M.C. Schladweiler, J.G. Wallenborn, A.D. Ledbetter, J.E. Richards and U.P. Kodavanti. NHEER...

  14. Conference Analysis Report of Assessments on Defect and Damage for a High Temperature Structure

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeong Yeon

    2008-11-15

    This report presents an analysis of the state-of-the-art research trends on creep-fatigue damage, defect assessment of high temperature structures, and the development of heat resistant materials and their behavior at high temperature, based on the papers presented at two international conferences: ASME PVP 2008, held in Chicago in July 2008, and CF-5 (5th International Conference on Creep, Fatigue and Creep-Fatigue), held in Kalpakkam, India in September 2008.

  15. USE OF MULTIPARAMETER ANALYSIS OF LABORATORY BIOMARKERS TO ASSESS RHEUMATOID ARTHRITIS ACTIVITY

    Directory of Open Access Journals (Sweden)

    A. A. Novikov

    2015-01-01

    Full Text Available The key component in the management of patients with rheumatoid arthritis (RA) is regular control of RA activity. The quantitative assessment of a patient's status allows the development of standardized indications for anti-rheumatic therapy. Objective: to identify laboratory biomarkers able to reflect RA activity. Subjects and methods: Fifty-eight patients with RA and 30 age- and sex-matched healthy donors were examined. The patients were divided into high/moderate and mild disease activity groups according to DAS28. The serum concentrations of 30 biomarkers were measured using immunonephelometric assay, enzyme immunoassay, and xMAP technology. Results and discussion: Multivariate analysis identified the factors most related to high/moderate RA activity according to DAS28, namely fibroblast growth factor-2, monocyte chemoattractant protein-1, interleukins (IL) 1α, 6, and 15, and tumor necrosis factor-α, and allowed a prognostic model for RA activity assessment to be created. ROC analysis has shown that this model has excellent diagnostic efficiency in differentiating high/moderate versus low RA activity. Conclusion: Creating a subjective-assessment-independent immunological multiparameter index of greater diagnostic accuracy than the laboratory parameters routinely used in clinical practice may be a qualitatively new step in assessing and monitoring RA activity.
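
    The modelling and ROC steps described in the abstract can be sketched with a generic multivariate classifier; the data and the choice of logistic regression here are illustrative assumptions, not the study's model.

```python
# A minimal sketch of the modelling step described in the abstract: a
# multivariate classifier built from serum biomarker concentrations and
# evaluated by ROC analysis for separating high/moderate from low DAS28
# activity. Data, biomarker count and the logistic model are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
X = rng.normal(size=(58, 6))             # 6 hypothetical biomarker concentrations
y = (X[:, :3].sum(axis=1) + rng.normal(scale=1.0, size=58)) > 0  # activity class

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"apparent ROC AUC = {auc:.2f}")   # real studies would cross-validate
```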

  16. EFL LEARNERS REPAIR SEQUENCE TYPES ANALYSIS AS PEER-ASSESSMENT IN ORAL PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Novia Trisanti

    2017-04-01

    Full Text Available There are certain concerns that an EFL teacher needs to observe when assessing students' oral performance, such as the number of words the learners utter, the grammatical errors they make, and the hesitations and expressions they produce. This paper attempts to give an overview of the results of qualitative research showing the impact of repair sequence type analysis on those elements, used as peer- and self-assessment to enhance students' speaking ability. The subjects were tertiary-level learners of the English Department, State University of Semarang, Indonesia, in 2012. Concerning the repair types, there are four repair sequences as reviewed by Buckwalter (2001): Self-Initiated Self-Repair (SISR), Self-Initiated Other-Repair (SIOR), Other-Initiated Self-Repair (OISR), and Other-Initiated Other-Repair (OIOR). Using these repair sequence types, the students investigated the repair sequences of their peers while they performed in class conversation. The modified peer-assessment guideline proposed by Brown (2004) was used in identifying, categorizing and classifying the types of repair sequences in their peers' oral performance. Peer-assessment can be a valuable additional means to improve students' speaking, since it is one of the motives that drive peer-evaluation, along with peer-verification and peer- and self-enhancement. The analysis results were then interpreted to see whether there were significant findings related to the students' oral performance enhancement.

  17. Realistic Noise Assessment and Strain Analysis of Iranian Permanent GPS Stations

    Science.gov (United States)

    Razeghi, S. M.; Amiri Simkooei, A. A.; Sharifi, M. A.

    2012-04-01

    To assess the noise characteristics of the Iranian Permanent GPS Stations (IPGS), the northwestern part of this network, namely the Azerbaijan Continuous GPS Stations (ACGS), was selected. For a realistic noise assessment it is required to model all deterministic signals of the GPS time series by means of least squares harmonic estimation (LS-HE) and derive all periodic behavior of the series. After taking all deterministic signals into account, least squares variance component estimation (LS-VCE) is used to obtain a realistic noise model (white noise plus flicker noise) of the ACGS. For this purpose, simultaneous GPS time series are required, to which a multivariate noise assessment is applied. Having determined a realistic noise model, a realistic strain analysis of the network is obtained, relying on finite element methods. The finite elements are considered to be a new functional model, and the new stochastic model is given based on the multivariate noise assessment using LS-VCE. The deformation rates of the components, along with their full covariance matrices, are input to the strain analysis. Further, the results are also provided using a pure white noise model. The normalized strains for these two models show that the strain parameters derived from a realistic noise model are less significant than those derived from the white noise model. This could be due either to the short time span of the time series used or to the intrinsic behavior of the strain parameters in the ACGS. Longer time series are required to further elaborate this issue.
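
    The LS-HE idea of removing deterministic signals before noise assessment can be sketched as an ordinary least squares fit of offset, trend and annual/semiannual harmonics; the series below is synthetic and the LS-VCE white-plus-flicker noise step is not reproduced.

```python
# A minimal sketch of the least-squares harmonic estimation (LS-HE) idea used
# in the abstract: fit an offset, trend and annual/semiannual harmonics to a
# daily GPS coordinate series before studying the residual noise. The series
# here is synthetic; the LS-VCE white-plus-flicker noise step is not shown.
import numpy as np

t = np.arange(0, 5 * 365.25) / 365.25           # time in years (hypothetical span)
rng = np.random.default_rng(5)
series = 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 1.0, t.size)

# Design matrix: offset, linear rate, annual and semiannual sine/cosine terms
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                     np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
coeffs, *_ = np.linalg.lstsq(A, series, rcond=None)
residuals = series - A @ coeffs
print("estimated rate (units/yr):", round(coeffs[1], 3),
      "| residual std:", round(residuals.std(), 3))
```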

  18. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from performed analyses as well. The human factor is the human interaction with the designed equipment and with the working environment, taking into account human capabilities and limits. Within the qualitative methods for analysis of the human factor, we consider concepts and structural methods for classifying the information connected with the human factor. Emphasis is given to the HPES method for human factor analysis in NPPs. Methods for quantitative assessment of human reliability are also considered. These methods allow assigning probabilities to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)
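
    One of the quantification methods named above, HEART, scales a nominal human error probability by assessed error-producing conditions; a minimal sketch of that formula with hypothetical inputs follows.

```python
# A minimal sketch of one of the quantification methods named in the abstract
# (HEART): a nominal human error probability is scaled by error-producing
# conditions (EPCs) weighted by the assessor's proportion of affect (APOA).
# The numbers below are illustrative, not values from the Kozloduy analyses.
def heart_hep(nominal_hep: float, epcs: list[tuple[float, float]]) -> float:
    """epcs: list of (max multiplier, assessed proportion of affect in [0, 1])."""
    hep = nominal_hep
    for multiplier, apoa in epcs:
        hep *= (multiplier - 1.0) * apoa + 1.0
    return min(hep, 1.0)

# Hypothetical task: nominal HEP 0.003, two EPCs (time shortage, unfamiliarity)
print(heart_hep(0.003, [(11.0, 0.4), (17.0, 0.1)]))
```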

  19. Scenario analysis for the postclosure assessment of the Canadian concept for nuclear fuel waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Goodwin, B W; Stephens, M E; Davison, C C; Johnson, L H; Zach, R

    1994-12-01

    AECL Research has developed and evaluated a concept for disposal of Canada's nuclear fuel waste involving deep underground disposal of the waste in intrusive igneous rock of the Canadian Shield. The postclosure assessment of this concept focusses on the effects on human health and the environment due to potential contaminant releases into the biosphere after the disposal vault is closed. Both radiotoxic and chemically toxic contaminants are considered. One of the steps in the postclosure assessment process is scenario analysis. Scenario analysis identifies factors that could affect the performance of the disposal system and groups these factors into scenarios that require detailed quantitative evaluation. This report documents a systematic procedure for scenario analysis that was developed for the postclosure assessment and then applied to the study of a hypothetical disposal system. The application leads to a comprehensive list of factors and a set of scenarios that require further quantitative study. The application also identifies a number of other factors and potential scenarios that would not contribute significantly to environmental and safety impacts for the hypothetical disposal system. (author). 46 refs., 3 tabs., 3 figs., 2 appendices.

  20. Scenario analysis for the postclosure assessment of the Canadian concept for nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    Goodwin, B.W.; Stephens, M.E.; Davison, C.C.; Johnson, L.H.; Zach, R.

    1994-12-01

    AECL Research has developed and evaluated a concept for disposal of Canada's nuclear fuel waste involving deep underground disposal of the waste in intrusive igneous rock of the Canadian Shield. The postclosure assessment of this concept focusses on the effects on human health and the environment due to potential contaminant releases into the biosphere after the disposal vault is closed. Both radiotoxic and chemically toxic contaminants are considered. One of the steps in the postclosure assessment process is scenario analysis. Scenario analysis identifies factors that could affect the performance of the disposal system and groups these factors into scenarios that require detailed quantitative evaluation. This report documents a systematic procedure for scenario analysis that was developed for the postclosure assessment and then applied to the study of a hypothetical disposal system. The application leads to a comprehensive list of factors and a set of scenarios that require further quantitative study. The application also identifies a number of other factors and potential scenarios that would not contribute significantly to environmental and safety impacts for the hypothetical disposal system. (author). 46 refs., 3 tabs., 3 figs., 2 appendices

  1. Component fragility analysis methodology for seismic risk assessment projects. Proven PSA safety document processing and assessment procedures

    International Nuclear Information System (INIS)

    Kolar, Ladislav

    2013-03-01

    The seismic risk assessment task should be structured as follows: (i) define all reactor unit building structures, components and equipment involved in the creation of an initiating event (IE) induced by a seismic event or contributing to the reliability of the reactor unit response to an IE; (ii) construct and estimate the fragility curves for the building and component groups under (i); (iii) determine the HCLPF for each group of buildings, components or equipment; (iv) determine the nuclear source's seismic resistance (SME) as the minimum HCLPF over the groups of equipment in the risk-dominant scenarios; (v) define the group of components, equipment and building structures that limits the risk to the SME value; (vi) based on the fragility levels, identify component groups for which a more detailed fragility analysis is needed; and (vii) recommend groups of equipment or building structures that should be taken into account with respect to the seismic risk, i.e. such groups of equipment or building structures as exhibit a low seismic resistance (HCLPF) and, at the same time, contribute significantly to the reactor unit's seismic risk (are present in the dominant risk scenarios). (P.A.)
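
    The HCLPF screening described above is conventionally based on a lognormal fragility model; the sketch below shows the standard HCLPF formula and a fragility evaluation with hypothetical capacity parameters. It is a sketch of the common convention, not of this report's specific procedure.

```python
# A minimal sketch of the standard lognormal seismic fragility model behind
# HCLPF screening: HCLPF = Am * exp(-1.645 * (beta_R + beta_U)), with Am the
# median ground-motion capacity and beta_R, beta_U the randomness and
# uncertainty log-standard deviations. Input values are hypothetical.
import math

def hclpf(a_median: float, beta_r: float, beta_u: float) -> float:
    """High Confidence of Low Probability of Failure capacity (same units as a_median)."""
    return a_median * math.exp(-1.645 * (beta_r + beta_u))

def fragility(a: float, a_median: float, beta_c: float) -> float:
    """Composite lognormal fragility: P(failure | peak ground acceleration a)."""
    z = math.log(a / a_median) / beta_c
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical component: median capacity 0.9 g, beta_R = 0.25, beta_U = 0.35
print("HCLPF =", round(hclpf(0.9, 0.25, 0.35), 3), "g")
print("P(fail | 0.3 g) =", round(fragility(0.3, 0.9, 0.43), 4))
```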

  2. Assessing the influence of Environmental Impact Assessments on science and policy: an analysis of the Three Gorges Project.

    Science.gov (United States)

    Tullos, Desiree

    2009-07-01

    The need to understand and minimize negative environmental outcomes associated with large dams has both contributed to and benefited from the introduction and subsequent improvements in the Environmental Impact Assessment (EIA) process. However, several limitations in the EIA process remain, including those associated with the uncertainty and significance of impact projections. These limitations are directly related to the feedback between science and policy, with information gaps in scientific understanding discovered through the EIA process contributing valuable recommendations on critical focus areas for prioritizing and funding research within the fields of ecological conservation and river engineering. This paper presents an analysis of the EIA process for the Three Gorges Project (TGP) in China as a case study for evaluating this feedback between the EIA and science and policy. For one of the best-studied public development projects in the world, this paper presents an investigation into whether patterns exist between the scientific interest (via number of publications) in environmental impacts and (a) the identification of impacts as uncertain or priority by the EIA, (b) decisions or political events associated with the dam, and (c) impact type. This analysis includes the compilation of literature on TGP, characterization of ecosystem interactions and responses to TGP through a hierarchy of impacts, coding of EIA impacts as "uncertain" impacts that require additional study and "priority" impacts that have particularly high significance, mapping of an event chronology to relate policies, institutional changes, and decisions about TGP as "events" that could influence the focus and intensity of scientific investigation, and analysis of the number of publications by impact type and order within the impact hierarchy. From these analyses, it appears that the availability and consistency of scientific information limit the accuracy of environmental impact

  3. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Hongbin; Wang Chunyan; Qi Yao [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); University of Chinese Academy of Sciences, Beijing 100039 (China); Song Fengrui, E-mail: songfr@ciac.jl.cn [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); Liu Zhiqiang; Liu Shuying [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China)

    2012-11-08

    Highlights: ► DART MS combined with PCA and HCA was used to rapidly identify markers of Radix Aconiti. ► The DART MS behavior of six aconitine-type alkaloids was investigated. ► Chemical markers were recognized between the qualified and unqualified samples. ► DART MS was shown to be an effective tool for quality control of Radix Aconiti Preparata. - Abstract: This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a world widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in

  4. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry

    International Nuclear Information System (INIS)

    Zhu Hongbin; Wang Chunyan; Qi Yao; Song Fengrui; Liu Zhiqiang; Liu Shuying

    2012-01-01

    Highlights: ► DART MS combined with PCA and HCA was used to rapidly identify markers of Radix Aconiti. ► The DART MS behavior of six aconitine-type alkaloids was investigated. ► Chemical markers were recognized between the qualified and unqualified samples. ► DART MS was shown to be an effective tool for quality control of Radix Aconiti Preparata. - Abstract: This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a world widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for quality
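
    The chemometric step named in both records, PCA followed by hierarchical cluster analysis on a samples-by-m/z intensity matrix, can be sketched as below with synthetic spectra.

```python
# A minimal sketch of the chemometric step named in the abstract: principal
# component analysis plus hierarchical cluster analysis applied to a matrix of
# DART MS intensities (samples x m/z bins). The spectra here are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)
raw = np.vstack([rng.normal(0.0, 1.0, (10, 50)),    # e.g. well-processed samples
                 rng.normal(1.5, 1.0, (10, 50))])   # e.g. raw/under-processed samples

scores = PCA(n_components=2).fit_transform(raw)     # PCA score plot coordinates
clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print("cluster labels:", clusters)
```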

  5. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    Science.gov (United States)

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
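
    The agreement statistics used in the abstract, Kendall's tau between rankings and Kendall's W across reviewers, can be computed as in the following minimal sketch with synthetic rankings.

```python
# A minimal sketch of the agreement statistics used in the abstract: Kendall's
# tau between the quantitative image-analysis ranking and one reviewer, and
# Kendall's W across several reviewers. Rankings below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
quantitative = np.arange(1, 21)                          # image-analysis ranking of 20 eyes
reviewer = quantitative + rng.integers(-2, 3, size=20)   # hypothetical reviewer ranking

tau, _ = stats.kendalltau(quantitative, reviewer)

def kendalls_w(rankings: np.ndarray) -> float:
    """rankings: rows = raters, columns = items (each row a rank vector)."""
    m, n = rankings.shape
    totals = rankings.sum(axis=0)
    s = ((totals - totals.mean()) ** 2).sum()
    return 12.0 * s / (m**2 * (n**3 - n))

ranks = np.vstack([stats.rankdata(quantitative),
                   stats.rankdata(reviewer),
                   stats.rankdata(quantitative + rng.integers(-1, 2, size=20))])
print(f"tau = {tau:.2f}, W = {kendalls_w(ranks):.2f}")
```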

  6. Seeking Missing Pieces in Science Concept Assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch Analysis

    Science.gov (United States)

    Ding, Lin

    2014-01-01

    Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses.…

  7. A Comparative Analysis on Assessment of Land Carrying Capacity with Ecological Footprint Analysis and Index System Method.

    Directory of Open Access Journals (Sweden)

    Yao Qian

    Full Text Available Land carrying capacity (LCC) indicates whether local land resources are effectively used to support economic activities and/or the human population. LCC is commonly evaluated with two approaches, namely ecological footprint analysis (EFA) and the index system method (ISM). EFA is helpful for investigating the effects of different land categories, whereas ISM can be used to evaluate the contributions of social, environmental, and economic factors. Here we compared the two LCC-evaluation approaches with data collected from Xiamen City, a typical region of rapid economic growth and urbanization in China. The results show that LCC assessments with EFA and ISM not only complement each other but are also mutually supportive. Both assessments suggest that decreases in arable land and increasingly high energy consumption have major negative effects on LCC and threaten sustainable development for Xiamen City. It is important for local policy makers, planners and designers to reduce ecological deficits by controlling fossil energy consumption, protecting arable land and forest land from conversion into other land types, and slowing down the speed of urbanization, and to promote sustainability by controlling rural-to-urban immigration, increasing the hazard-free treatment rate of household garbage, and raising energy consumption per unit industrial added value. Although EFA seems more appropriate for estimating LCC for a resource-output or self-sufficient region and ISM is more suitable for a resource-input region, both approaches should be employed when performing LCC assessments anywhere in the world.
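
    The EFA side of the comparison can be sketched as a per-capita footprint calculation compared against biocapacity; all values below are illustrative placeholders, not Xiamen City data.

```python
# A minimal sketch of the ecological footprint analysis (EFA) side of the
# comparison: per-capita footprint of each land category is consumption divided
# by yield, scaled by an equivalence factor, and the sum is compared with the
# available biocapacity to give an ecological surplus or deficit. All numbers
# are illustrative placeholders, not Xiamen City data.
def footprint(consumption_t: float, yield_t_per_ha: float, eq_factor: float,
              population: int) -> float:
    """Per-capita footprint (global hectares) for one consumption category."""
    return consumption_t / yield_t_per_ha * eq_factor / population

population = 4_000_000                      # hypothetical
categories = {                              # (consumption t/yr, yield t/ha, eq. factor)
    "cropland": (1.2e6, 2.7, 2.5),
    "fossil energy land": (9.0e6, 71.0, 1.3),
}
total_footprint = sum(footprint(c, y, f, population) for c, y, f in categories.values())
biocapacity_per_capita = 0.4                # gha/person, hypothetical
print(f"footprint = {total_footprint:.2f} gha/person, "
      f"deficit = {total_footprint - biocapacity_per_capita:.2f} gha/person")
```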

  8. An Analysis of the Most Adopted Rating Systems for Assessing the Environmental Impact of Buildings

    Directory of Open Access Journals (Sweden)

    Elena Bernardi

    2017-07-01

Full Text Available Rating systems for assessing the environmental impact of buildings are technical instruments for evaluating the environmental impact of buildings and construction projects. In some cases, these rating systems can also cover urban-scale projects, community projects, and infrastructure. These schemes are designed to assist project management in making projects more sustainable by providing frameworks with precise criteria for assessing the various aspects of a building’s environmental impact. Given the growing interest in sustainable development worldwide, many rating systems for assessing the environmental impact of buildings have been established in recent years, each with its own peculiarities and fields of applicability. The present work is motivated by an interest in emphasizing such differences in order to better understand these rating systems and extract the main implications for building design. It also attempts to summarize in a user-friendly form the vast and fragmented assortment of information available today. The analysis focuses on the six main rating systems: the Building Research Establishment Environmental Assessment Methodology (BREEAM), the Comprehensive Assessment System for Built Environment Efficiency (CASBEE), the Deutsche Gesellschaft für Nachhaltiges Bauen (DGNB), the Haute Qualité Environnementale (HQE™), the Leadership in Energy and Environmental Design (LEED), and the Sustainable Building Tool (SBTool).

  9. Annual Performance Assessment and Composite Analysis Review for the ICDF Landfill FY 2008

    International Nuclear Information System (INIS)

    Koslow, Karen; Rood, Arthur

    2009-01-01

    This report addresses low-level waste disposal operations at the Idaho Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Disposal Facility (ICDF) landfill from the start of operations in Fiscal Year 2003 through Fiscal Year 2008. The ICDF was authorized in the Operable Unit 3-13 Record of Decision for disposal of waste from the Idaho National Laboratory Site CERCLA environmental restoration activities. The ICDF has been operating since 2003 in compliance with the CERCLA requirements and the waste acceptance criteria developed in the CERCLA process. In developing the Operable Unit 3-13 Record of Decision, U.S. Department of Energy Order (DOE) 435.1, 'Radioactive Waste Management', was identified as a 'to be considered' requirement for the ICDF. The annual review requirement under DOE Order 435.1 was determined to be an administrative requirement and, therefore, annual reviews were not prepared on an annual basis. However, the landfill has been operating for 5 years and, since the waste forms and inventories disposed of have changed from what was originally envisioned for the ICDF landfill, the ICDF project team has decided that this annual review is necessary to document the changes and provide a basis for any updates in analyses that may be necessary to continue to meet the substantive requirements of DOE Order 435.1. For facilities regulated under DOE Order 435.1-1, U.S. DOE Manual 435.1-1, 'Radioactive Waste Management', IV.P.(4)(c) stipulates that annual summaries of low-level waste disposal operations shall be prepared with respect to the conclusions and recommendations of the performance assessment and composite analysis. Important factors considered in this review include facility operations, waste receipts, and results from monitoring and research and development programs. There have been no significant changes in operations at the landfill in respect to the disposal geometry, the verification of waste characteristics, and the

  10. Annual Performance Assessment and Composite Analysis Review for the ICDF Landfill FY 2008

    Energy Technology Data Exchange (ETDEWEB)

    Karen Koslow

    2009-08-31

    This report addresses low-level waste disposal operations at the Idaho Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Disposal Facility (ICDF) landfill from the start of operations in Fiscal Year 2003 through Fiscal Year 2008. The ICDF was authorized in the Operable Unit 3-13 Record of Decision for disposal of waste from the Idaho National Laboratory Site CERCLA environmental restoration activities. The ICDF has been operating since 2003 in compliance with the CERCLA requirements and the waste acceptance criteria developed in the CERCLA process. In developing the Operable Unit 3-13 Record of Decision, U.S. Department of Energy Order (DOE) 435.1, 'Radioactive Waste Management', was identified as a 'to be considered' requirement for the ICDF. The annual review requirement under DOE Order 435.1 was determined to be an administrative requirement and, therefore, annual reviews were not prepared on an annual basis. However, the landfill has been operating for 5 years and, since the waste forms and inventories disposed of have changed from what was originally envisioned for the ICDF landfill, the ICDF project team has decided that this annual review is necessary to document the changes and provide a basis for any updates in analyses that may be necessary to continue to meet the substantive requirements of DOE Order 435.1. For facilities regulated under DOE Order 435.1-1, U.S. DOE Manual 435.1-1, 'Radioactive Waste Management', IV.P.(4)(c) stipulates that annual summaries of low-level waste disposal operations shall be prepared with respect to the conclusions and recommendations of the performance assessment and composite analysis. Important factors considered in this review include facility operations, waste receipts, and results from monitoring and research and development programs. There have been no significant changes in operations at the landfill in respect to the disposal geometry, the verification of

  11. Assessing the Goodness of Fit of Phylogenetic Comparative Methods: A Meta-Analysis and Simulation Study.

    Directory of Open Access Journals (Sweden)

    Dwueng-Chwuan Jhwueng

Full Text Available Phylogenetic comparative methods (PCMs) have been applied widely in analyzing data from related species, but their fit to data is rarely assessed. Can one determine whether any particular comparative method is typically more appropriate than others by examining comparative data sets? I conducted a meta-analysis of 122 phylogenetic data sets found by searching all papers in JEB, Blackwell Synergy and JSTOR published in 2002-2005 for the purpose of assessing the fit of PCMs. The number of species in these data sets ranged from 9 to 117. I used the Akaike information criterion to compare PCMs, and then fit PCMs to bivariate data sets through REML analysis. Correlation estimates between two traits and bootstrapped confidence intervals of correlations from each model were also compared. For phylogenies of fewer than one hundred taxa, the Independent Contrast method and the independent, non-phylogenetic models provide the best fit. For bivariate analysis, correlations from different PCMs are qualitatively similar, so that actual correlations from real data seem to be robust to the PCM chosen for the analysis. Therefore, researchers might apply the PCM they believe best describes the evolutionary mechanisms underlying their data.
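
    The record's model comparison relies on the Akaike information criterion. The following sketch shows only the AIC bookkeeping, with invented maximized log-likelihoods and parameter counts standing in for fitted PCMs; it does not fit any phylogenetic model.

        import numpy as np

        def aic(log_likelihood, n_params):
            """Akaike information criterion: AIC = 2k - 2 ln L."""
            return 2 * n_params - 2 * log_likelihood

        # hypothetical maximized log-likelihoods and parameter counts for three candidate models
        models = {
            "independent (non-phylogenetic)": (-120.4, 2),
            "independent contrasts (Brownian motion)": (-118.9, 2),
            "Ornstein-Uhlenbeck": (-118.5, 3),
        }

        aics = {name: aic(ll, k) for name, (ll, k) in models.items()}
        best = min(aics.values())
        weights = {name: np.exp(-0.5 * (a - best)) for name, a in aics.items()}
        total = sum(weights.values())
        for name, a in sorted(aics.items(), key=lambda kv: kv[1]):
            print(f"{name}: AIC = {a:.1f}, delta = {a - best:.1f}, Akaike weight = {weights[name] / total:.2f}")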

  12. Development of a computer tool to support scenario analysis for safety assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Kawamura, Makoto; Wakasugi, Keiichiro; Okubo, Hiroo; Takase, Hiroyasu

    2007-02-01

In the 'H12 Project to Establishing Technical Basis for HLW Disposal in Japan', a systematic approach based on an international consensus was adopted to develop the scenarios to be considered in performance assessment. The adequacy of the approach was, in general terms, appreciated through domestic and international peer review. However, it was also suggested that there were issues related to improving the transparency and traceability of the procedure. To achieve this, improvement of the scenario analysis method has been studied. In this study, based on an improved method for the treatment of FEP interactions, a computer tool to support scenario analysis by performance assessment specialists has been developed. The anticipated effects of this tool are to improve the efficiency of complex and time-consuming scenario analysis work and to reduce the possibility of human errors in this work. The tool also makes it possible to describe interactions among a vast number of FEPs and the related information as an interaction matrix, and to analyse those interactions from a variety of perspectives. (author)
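
    The record mentions representing FEP interactions as an interaction matrix. Purely as an illustration of that data structure (the FEP names, links and helper below are invented and are not taken from the actual tool), a minimal sketch might look like this:

        import numpy as np

        # hypothetical FEP list; a real FEP catalogue is far larger and carries richer metadata
        feps = ["groundwater flow", "canister corrosion", "buffer erosion", "glass dissolution"]

        # interaction[i, j] = 1 means FEP i is judged to influence FEP j
        interaction = np.zeros((len(feps), len(feps)), dtype=int)
        interaction[0, 1] = 1   # groundwater flow -> canister corrosion
        interaction[0, 3] = 1   # groundwater flow -> glass dissolution
        interaction[1, 2] = 1   # canister corrosion -> buffer erosion (illustrative only)

        def influences(i):
            """Return the FEPs directly influenced by FEP i."""
            return [feps[j] for j in np.flatnonzero(interaction[i])]

        print("groundwater flow influences:", influences(0))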

  13. Development of residual stress analysis procedure for fitness-for-service assessment of welded structure

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Jin, Tae Eun; Dong, P.; Prager, M.

    2003-01-01

In this study, a state-of-the-art review of existing residual stress analysis techniques and representative solutions is presented in order to develop a residual stress analysis procedure for Fitness-for-Service (FFS) assessment of welded structures. Critical issues associated with existing residual stress solutions and their treatment in performing FFS are discussed. It should be recognized that detailed residual stress evolution is an extremely complicated phenomenon that typically involves material-specific thermomechanical/metallurgical response, welding process physics, and structural interactions within the component being welded. As a result, computational procedures can vary significantly, from highly complicated numerical techniques intended only to elucidate a small part of the process physics to cost-effective procedures deemed adequate for capturing some of the important features of a final residual stress distribution. A residual stress analysis procedure for FFS purposes belongs to the latter category. With this in mind, both residual stress analysis techniques and their adequacy for FFS are assessed based on literature data and on analyses performed in this investigation.

  14. Diagnostic value of voice acoustic analysis in assessment of occupational voice pathologies in teachers.

    Science.gov (United States)

    Niebudek-Bogusz, Ewa; Fiszer, Marta; Kotylo, Piotr; Sliwinska-Kowalska, Mariola

    2006-01-01

It has been shown that teachers are at risk of developing occupational dysphonia, which accounts for over 25% of all occupational diseases diagnosed in Poland. The most frequently used method of diagnosing voice diseases is videostroboscopy. However, to facilitate objective evaluation of voice efficiency as well as medical certification of occupational voice disorders, it is crucial to implement quantitative methods of voice assessment, particularly voice acoustic analysis. The aim of the study was to assess the results of acoustic analysis in 66 female teachers (aged 40-64 years), including 35 subjects with occupational voice pathologies (e.g., vocal nodules) and 31 subjects with functional dysphonia. The acoustic analysis was performed using the IRIS software, before and after a 30-minute vocal loading test. All participants were also subjected to laryngological and videostroboscopic examinations. After the vocal effort, the acoustic parameters displayed statistically significant abnormalities, mostly lowered fundamental frequency (Fo) and incorrect values of shimmer and noise-to-harmonic ratio. To conclude, quantitative voice acoustic analysis using the IRIS software seems to be an effective complement to voice examinations and is particularly helpful in diagnosing occupational dysphonia.

  15. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without. The diagnostic performance of the quantitative analyses (ER 0.89 and global RE 0.80) was similar to that of the qualitative analysis. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  16. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Z.; Xiao, T.; Li, D.

    2016-07-01

    Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the relevant internal logic between these two factors. The model considers the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. The model studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis on the mutual logical relationship of product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, it verifies the availability of this evaluation and analysis model using a Chinese SUV manufacturer as an example. (Author)

  17. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

Full Text Available Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. The method first quantifies the significance of the random variables and their parameters. By comparing their degrees of importance, minor factors can be neglected, which simplifies the subsequent global uncertainty analysis. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results inevitably contain errors and uncertainties that lead to inaccurate prediction and analysis, the risk at the design stage of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a jacket platform are analyzed and compared with different design standards. The calculation results sufficiently demonstrate the rationality and safety of the new risk assessment method.
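
    As a simple illustration of sampling-based global sensitivity screening (not the MCEVD-based workflow of this paper), the sketch below ranks invented load variables of a toy stress response by the magnitude of their Spearman rank correlation with the output; all distributions and the response function are assumptions.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n = 5000

        # hypothetical environmental variables (units arbitrary)
        wave_height = rng.gumbel(5.0, 1.2, n)
        wind_speed = rng.gumbel(25.0, 4.0, n)
        current_speed = rng.normal(1.0, 0.2, n)
        corrosion_loss = rng.normal(0.05, 0.01, n)

        def stress(h, w, c, d):
            """Toy response: structural stress as a nonlinear function of loads and corrosion."""
            return 1.8 * h ** 1.5 + 0.04 * w ** 2 + 12.0 * c + 300.0 * d

        y = stress(wave_height, wind_speed, current_speed, corrosion_loss)
        inputs = {"wave height": wave_height, "wind speed": wind_speed,
                  "current speed": current_speed, "corrosion loss": corrosion_loss}

        ranking = []
        for name, x in inputs.items():
            rho, _ = spearmanr(x, y)
            ranking.append((abs(rho), name))
        for rho, name in sorted(ranking, reverse=True):
            print(f"{name}: |Spearman rho| = {rho:.2f}")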

  18. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    Science.gov (United States)

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction e-Health is an innovative way to make health services more effective and efficient and application is increasing worldwide. e-Health represents a substantial ICT investment and its failure usually results in substantial losses in time, money (including opportunity costs) and effort. Therefore it is important to assess e-health readiness prior to implementation. Several frameworks have been published on e-health readiness assessment, under various circumstances and geographical regions of the world. However, their utility for the developing world is unknown. Methods A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine if any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion Eight types of e-health readiness were identified and no paper directly addressed all of these. The frameworks were based upon varying assumptions and perspectives. There was no underlying unifying theory underpinning the frameworks. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  19. The development of integrated diabetes care in the Netherlands: a multiplayer self-assessment analysis.

    Science.gov (United States)

    Zonneveld, Nick; Vat, Lidewij E; Vlek, Hans; Minkman, Mirella M N

    2017-03-21

    Since recent years Dutch diabetes care has increasingly focused on improving the quality of care by introducing the concept of care groups (in Dutch: 'zorggroepen'), care pathways and improving cooperation with involved care professionals and patients. This study examined how participating actors in care groups assess the development of their diabetes services and the differences and similarities between different stakeholder groups. A self-evaluation study was performed within 36 diabetes care groups in the Netherlands. A web-based self-assessment instrument, based on the Development Model for Integrated Care (DMIC), was used to collect data among stakeholders of each care group. The DMIC defines nine clusters of integrated care and four phases of development. Statistical analysis was used to analyze the data. Respondents indicated that the diabetes care groups work together in well-organized multidisciplinary teams and there is clarity about one another's expertise, roles and tasks. The care groups can still develop on elements related to the management and monitoring of performance, quality of care and patient-centeredness. The results show differences (p < 0.01) between three stakeholders groups in how they assess their integrated care services; (1) core players, (2) managers/directors/coordinators and (3) players at a distance. Managers, directors and coordinators assessed more implemented integrated care activities than the other two stakeholder groups. This stakeholder group also placed their care groups in a further phase of development. Players at a distance assessed significantly less present elements and assessed their care group as less developed. The results show a significant difference between stakeholder groups in the assessment of diabetes care practices. This reflects that the professional disciplines and the roles of stakeholders influence the way they asses the development of their integrated care setting, or that certain stakeholder groups

  20. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance

  1. Nonlinear analysis techniques for use in the assessment of high-level waste tank structures

    International Nuclear Information System (INIS)

    Moore, C.J.; Julyk, L.J.; Fox, G.L.; Dyrness, A.D.

    1991-01-01

    Reinforced concrete in combination with a steel liner has had a wide application to structures containing hazardous material. The buried double-shell waste storage tanks at the US Department of Energy's Hanford Site use this construction method. The generation and potential ignition of combustible gases within the primary tank is postulated to develop beyond-design-basis internal pressure and possible impact loading. The scope of this paper includes the illustration of analysis techniques for the assessment of these beyond-design-basis loadings. The analysis techniques include the coupling of the gas dynamics with the structural response, the treatment of reinforced concrete in regimes of inelastic behavior, and the treatment of geometric nonlinearities. The techniques and software tools presented provide a powerful nonlinear analysis capability for storage tanks

  2. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  3. Whole-Lesion Histogram Analysis of Apparent Diffusion Coefficient for the Assessment of Cervical Cancer.

    Science.gov (United States)

    Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-01-01

The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values in cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers prospectively underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm². Whole-lesion histogram analysis of ADC values was performed. A paired-sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study, including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis, of cervical cancers were significantly lower than those of normal cervical tissues. Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
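
    A minimal sketch of the whole-lesion histogram parameters named above, assuming the lesion's ADC values are already available as a one-dimensional array. The simulated values below are invented, and the mode is taken from a 50-bin histogram rather than from any particular vendor implementation.

        import numpy as np
        from scipy.stats import skew, kurtosis

        def adc_histogram_features(adc_values):
            """Whole-lesion ADC histogram parameters (values assumed in 10^-3 mm^2/s)."""
            v = np.asarray(adc_values, dtype=float)
            counts, edges = np.histogram(v, bins=50)
            mode_value = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
            features = {"ADCmean": v.mean(), "ADCmin": v.min(), "mode": mode_value,
                        "skewness": skew(v), "kurtosis": kurtosis(v)}
            for p in range(10, 100, 10):              # ADC10% ... ADC90%
                features[f"ADC{p}%"] = np.percentile(v, p)
            return features

        rng = np.random.default_rng(1)
        lesion_adc = rng.normal(0.95, 0.15, 2000)     # hypothetical tumour voxels
        for name, value in adc_histogram_features(lesion_adc).items():
            print(f"{name}: {value:.3f}")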

  4. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index, LSI) approaches is presented. Factors related to the occurrence of landslides—such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA)—were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis using an independent (validation) dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at regional scale.
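
    The ROC-based validation step can be sketched with scikit-learn, assuming susceptibility index values at an independent set of landslide and non-landslide points; the data below are simulated and do not correspond to the study area.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(2)

        # hypothetical validation set: LSI scores at landslide (1) and non-landslide (0) locations
        lsi_landslide = rng.normal(0.65, 0.15, 120)
        lsi_stable = rng.normal(0.45, 0.15, 380)
        scores = np.concatenate([lsi_landslide, lsi_stable])
        labels = np.concatenate([np.ones(lsi_landslide.size), np.zeros(lsi_stable.size)])

        auc = roc_auc_score(labels, scores)
        fpr, tpr, thresholds = roc_curve(labels, scores)
        print(f"prediction ability (area under the ROC curve): {auc:.2f}")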

  5. An introduction to Item Response Theory and Rasch Analysis of the Eating Assessment Tool (EAT-10).

    Science.gov (United States)

    Kean, Jacob; Brodke, Darrel S; Biber, Joshua; Gross, Paul

    2018-03-01

    Item response theory has its origins in educational measurement and is now commonly applied in health-related measurement of latent traits, such as function and symptoms. This application is due in large part to gains in the precision of measurement attributable to item response theory and corresponding decreases in response burden, study costs, and study duration. The purpose of this paper is twofold: introduce basic concepts of item response theory and demonstrate this analytic approach in a worked example, a Rasch model (1PL) analysis of the Eating Assessment Tool (EAT-10), a commonly used measure for oropharyngeal dysphagia. The results of the analysis were largely concordant with previous studies of the EAT-10 and illustrate for brain impairment clinicians and researchers how IRT analysis can yield greater precision of measurement.
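
    For orientation, the Rasch (1PL) model gives the probability of endorsing an item as P(X = 1 | theta, b) = 1 / (1 + exp(-(theta - b))). The sketch below uses the dichotomous form with invented item difficulties; the EAT-10 itself is scored polytomously (0-4 per item), so this is a deliberate simplification rather than the analysis used in the paper.

        import numpy as np

        def rasch_prob(theta, b):
            """Probability of endorsing an item under the dichotomous Rasch (1PL) model."""
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        # hypothetical item difficulties (logits) for ten items and a few person abilities
        item_difficulty = np.linspace(-2.0, 2.0, 10)
        for theta in (-1.0, 0.0, 1.5):
            expected_score = rasch_prob(theta, item_difficulty).sum()
            print(f"theta = {theta:+.1f}: expected raw score = {expected_score:.1f} / 10")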

  6. Nonlinear analysis techniques for use in the assessment of high-level waste storage tank structures

    International Nuclear Information System (INIS)

    Moore, C.J.; Julyk, L.J.; Fox, G.L.; Dyrness, A.D.

    1991-09-01

    Reinforced concrete in combination with a steel liner has had a wide application to structures containing hazardous material. The buried double-shell waste storage tanks at the US Department of Energy's Hanford Site use this construction method. The generation and potential ignition of combustible gases within the primary tank is postulated to develop beyond-design-basis internal pressure and possible impact loading. The scope of this paper includes the illustration of analysis techniques for the assessment of these beyond-design-basis loadings. The analysis techniques include the coupling of the gas dynamics with the structural response, the treatment of reinforced concrete in regimes of inelastic behavior, and the treatment of geometric nonlinearities. The techniques and software tools presented provide a powerful nonlinear analysis capability for storage tanks. 10 refs., 13 figs., 1 tab

  7. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    Science.gov (United States)

    Rachid, G; El Fadel, M

    2013-08-15

This paper presents a SWOT analysis of SEA systems in the Middle East and North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA system through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weaknesses embedded in the legal and administrative frameworks and poor integration with the decision making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Investigation of modern methods of probalistic sensitivity analysis of final repository performance assessment models (MOSEL)

    International Nuclear Information System (INIS)

    Spiessl, Sabine; Becker, Dirk-Alexander

    2017-06-01

    Sensitivity analysis is a mathematical means for analysing the sensitivities of a computational model to variations of its input parameters. Thus, it is a tool for managing parameter uncertainties. It is often performed probabilistically as global sensitivity analysis, running the model a large number of times with different parameter value combinations. Going along with the increase of computer capabilities, global sensitivity analysis has been a field of mathematical research for some decades. In the field of final repository modelling, probabilistic analysis is regarded a key element of a modern safety case. An appropriate uncertainty and sensitivity analysis can help identify parameters that need further dedicated research to reduce the overall uncertainty, generally leads to better system understanding and can thus contribute to building confidence in the models. The purpose of the project described here was to systematically investigate different numerical and graphical techniques of sensitivity analysis with typical repository models, which produce a distinctly right-skewed and tailed output distribution and can exhibit a highly nonlinear, non-monotonic or even non-continuous behaviour. For the investigations presented here, three test models were defined that describe generic, but typical repository systems. A number of numerical and graphical sensitivity analysis methods were selected for investigation and, in part, modified or adapted. Different sampling methods were applied to produce various parameter samples of different sizes and many individual runs with the test models were performed. The results were evaluated with the different methods of sensitivity analysis. On this basis the methods were compared and assessed. This report gives an overview of the background and the applied methods. The results obtained for three typical test models are presented and explained; conclusions in view of practical applications are drawn. At the end, a recommendation

  9. Investigation of modern methods of probalistic sensitivity analysis of final repository performance assessment models (MOSEL)

    Energy Technology Data Exchange (ETDEWEB)

    Spiessl, Sabine; Becker, Dirk-Alexander

    2017-06-15

    Sensitivity analysis is a mathematical means for analysing the sensitivities of a computational model to variations of its input parameters. Thus, it is a tool for managing parameter uncertainties. It is often performed probabilistically as global sensitivity analysis, running the model a large number of times with different parameter value combinations. Going along with the increase of computer capabilities, global sensitivity analysis has been a field of mathematical research for some decades. In the field of final repository modelling, probabilistic analysis is regarded a key element of a modern safety case. An appropriate uncertainty and sensitivity analysis can help identify parameters that need further dedicated research to reduce the overall uncertainty, generally leads to better system understanding and can thus contribute to building confidence in the models. The purpose of the project described here was to systematically investigate different numerical and graphical techniques of sensitivity analysis with typical repository models, which produce a distinctly right-skewed and tailed output distribution and can exhibit a highly nonlinear, non-monotonic or even non-continuous behaviour. For the investigations presented here, three test models were defined that describe generic, but typical repository systems. A number of numerical and graphical sensitivity analysis methods were selected for investigation and, in part, modified or adapted. Different sampling methods were applied to produce various parameter samples of different sizes and many individual runs with the test models were performed. The results were evaluated with the different methods of sensitivity analysis. On this basis the methods were compared and assessed. This report gives an overview of the background and the applied methods. The results obtained for three typical test models are presented and explained; conclusions in view of practical applications are drawn. At the end, a recommendation

  10. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    Science.gov (United States)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

The traditional spectral analysis method, which assumes a linear system, introduces errors when calculating the fatigue damage of details in liquid cargo tanks, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for assessing the fatigue damage of details in a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method can take the nonlinear relationship into consideration, and the fatigue damage is then calculated from the PSD of the stress. Taking an independent tank in an LNG carrier as an example, the accuracy of the improved spectral analysis method is shown to be much better than that of the traditional spectral analysis method by comparing the calculated damage with results from the time-domain method. The proposed spectral analysis method is more accurate in calculating the fatigue damage of details in ship liquid cargo tanks.
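
    The core idea of building a stress process from sinusoidal components and working with its power spectral density can be illustrated as follows. The amplitudes, frequencies and phases are invented, and Welch's method from scipy stands in for the paper's Fourier-series bookkeeping; the sketch does not reproduce the nonlinear stress-acceleration relationship that motivates the improved method.

        import numpy as np
        from scipy.signal import welch

        fs = 20.0                       # sampling frequency, Hz
        t = np.arange(0, 600, 1 / fs)   # 10 minutes of simulated response

        # hypothetical stress responses induced by three sinusoidal wave components (amplitude MPa, frequency Hz)
        components = [(12.0, 0.08), (7.5, 0.12), (3.0, 0.25)]
        phases = (0.0, 1.0, 2.0)
        stress = sum(a * np.sin(2 * np.pi * f * t + phi) for (a, f), phi in zip(components, phases))

        f_psd, psd = welch(stress, fs=fs, nperseg=4096)
        m0 = np.sum(psd) * (f_psd[1] - f_psd[0])   # zeroth spectral moment of the stress process
        print(f"RMS stress recovered from the PSD: {np.sqrt(m0):.2f} MPa")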

  11. Human performance analysis in the frame of probabilistic safety assessment of research reactors

    International Nuclear Information System (INIS)

    Farcasiu, Mita; Nitoi, Mirela; Apostol, Minodora; Turcu, I.; Florescu, Gh.

    2005-01-01

Full text: The analysis of operating experience has identified the importance of human performance for the reliability and safety of research reactors. In Probabilistic Safety Assessment (PSA) of nuclear facilities, human performance analysis (HPA) is used to estimate the contribution of human error to the failure of system components or functions. HPA is a qualitative and quantitative analysis of human actions identified for error-likely or accident-prone situations. Qualitative analysis is used to identify all man-machine interfaces that can lead to an accident, the types of human interactions which may mitigate or exacerbate the accident, the types of human errors, and the performance shaping factors. Quantitative analysis is used to develop estimates of human error probability as effects of human performance on reliability and safety. The goal of this paper is to carry out an HPA within the PSA framework for research reactors. Human error probabilities estimated as results of the human action analysis can be included in system event trees and/or system fault trees. The sensitivity analyses performed determine how sensitive human performance is to systematic variations in both the level of dependence between human actions and the operator stress level. The necessary information was obtained from the operating experience of the TRIGA research reactor at INR Pitesti. The required data were obtained from generic databases. (authors)

  12. Taylor Dispersion Analysis as a promising tool for assessment of peptide-peptide interactions.

    Science.gov (United States)

    Høgstedt, Ulrich B; Schwach, Grégoire; van de Weert, Marco; Østergaard, Jesper

    2016-10-10

Protein-protein and peptide-peptide (self-)interactions are of key importance in understanding the physicochemical behavior of proteins and peptides in solution. However, due to the small size of peptide molecules, characterization of these interactions is more challenging than for proteins. In this work, we show that protein-protein and peptide-peptide interactions can advantageously be investigated by measurement of the diffusion coefficient using Taylor Dispersion Analysis. Through comparison to Dynamic Light Scattering it was shown that Taylor Dispersion Analysis is well suited for the characterization of protein-protein interactions of solutions of α-lactalbumin and human serum albumin. The peptide-peptide interactions of three selected peptides were then investigated in a concentration range spanning from 0.5 mg/ml up to 80 mg/ml using Taylor Dispersion Analysis. The determination of peptide-peptide interactions indicated that multibody interactions significantly affect the interactions at concentration levels above 25 mg/ml for the two charged peptides. Relative viscosity measurements, performed using the capillary-based setup applied for Taylor Dispersion Analysis, showed that the viscosity of the peptide solutions increased with concentration. Our results indicate that a viscosity difference between run buffer and sample in Taylor Dispersion Analysis may result in overestimation of the measured diffusion coefficient. Thus, Taylor Dispersion Analysis provides a practical, but as yet primarily qualitative, approach to assessment of the colloidal stability of both peptide and protein formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
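
    For context, the diffusion coefficient in Taylor Dispersion Analysis is commonly obtained from the width of the taylorgram via D = r_c^2 * t_R / (24 * sigma_t^2), with the hydrodynamic radius then following from the Stokes-Einstein relation. The sketch below assumes that standard relation and uses invented capillary and peak parameters; it is not taken from the instrument or data of this study.

        import math

        def taylor_diffusion_coefficient(r_capillary, t_residence, sigma_t):
            """D from a single taylorgram peak: D = r^2 * tR / (24 * sigma_t^2), SI units."""
            return r_capillary ** 2 * t_residence / (24.0 * sigma_t ** 2)

        def hydrodynamic_radius(diff_coeff, temperature=298.15, viscosity=0.89e-3):
            """Stokes-Einstein radius; viscosity in Pa*s (default roughly water at 25 C)."""
            k_b = 1.380649e-23
            return k_b * temperature / (6.0 * math.pi * viscosity * diff_coeff)

        # hypothetical taylorgram: 37.5 um capillary radius, peak at 300 s, temporal peak width sigma_t = 12 s
        d = taylor_diffusion_coefficient(37.5e-6, 300.0, 12.0)
        print(f"D = {d:.2e} m^2/s, Rh = {hydrodynamic_radius(d) * 1e9:.2f} nm")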

  13. A New Solution Assessment Approach and Its Application to Space Geodesy Data Analysis

    Science.gov (United States)

    Hu, X.; Huang, C.; Liao, X.

    2001-12-01

The statistics of the residuals are used in this paper to perform a quality assessment of the solutions from space geodesy data analysis. With stochastic estimation and relatively arbitrary empirical parameters being employed to absorb unmodelled errors, it has long been noticed that different estimate combinations or analysis strategies may achieve the same level of fitting yet result in significantly different solutions. Based on the postulate that no conceivable signals should remain in the residuals, solutions with the same level of root mean square error (RMS) and variance-covariance may be differentiated in the sense that, for reasonable solutions, the residuals are virtually identical to noise. While it is possible to develop complex noise models, the Gaussian white noise model simplifies the interpretation of the solution and implies that the unmodelled errors have been smoothed out. Statistical moments of the residuals as well as the Pearson chi-square statistic are computed in this paper to measure the discrepancies between the residuals and Gaussian white noise. Applying the approach to both satellite laser ranging (SLR) and global positioning system (GPS) data analysis, we evaluate different parameter estimate combinations and/or different strategies that would hardly be discriminated by the level of fitting alone. Unlike most solution assessment methods, broadly termed external comparison, no information independent of the analyzed data is required. This makes immediate solution assessment possible and easy to carry out. While external comparison is the best and most convincing quality assessment of a solution, the statistics of the residuals provide important information on the solutions and, in some cases as discussed in this paper, can be supported by external comparison.
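
    A minimal sketch of the residual checks described above, using simulated normalized residuals: the first print reports sample skewness and excess kurtosis, and the remainder carries out a Pearson chi-square goodness-of-fit test against the fitted Gaussian with equiprobable bins. The bin count and test details are our assumptions, not necessarily the authors' exact procedure.

        import numpy as np
        from scipy.stats import skew, kurtosis, chisquare, norm

        rng = np.random.default_rng(3)
        residuals = rng.normal(0.0, 1.0, 2000)   # hypothetical post-fit residuals (normalized)

        print(f"skewness = {skew(residuals):+.3f}, excess kurtosis = {kurtosis(residuals):+.3f}")

        # Pearson chi-square test against the fitted Gaussian, using k equiprobable bins
        k = 20
        edges = norm.ppf(np.linspace(0.0, 1.0, k + 1), loc=residuals.mean(), scale=residuals.std(ddof=1))
        edges[0], edges[-1] = residuals.min() - 1.0, residuals.max() + 1.0   # keep bin edges finite
        observed, _ = np.histogram(residuals, bins=edges)
        expected = np.full(k, residuals.size / k)
        stat, p = chisquare(observed, expected, ddof=2)   # two distribution parameters were estimated
        print(f"Pearson chi-square = {stat:.1f}, p = {p:.3f}")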

  14. Using generalizability analysis to estimate parameters for anatomy assessments: A multi-institutional study.

    Science.gov (United States)

    Byram, Jessica N; Seifert, Mark F; Brooks, William S; Fraser-Cotlin, Laura; Thorp, Laura E; Williams, James M; Wilson, Adam B

    2017-03-01

With integrated curricula and multidisciplinary assessments becoming more prevalent in medical education, there is a continued need for educational research to explore the advantages, consequences, and challenges of integration practices. This retrospective analysis investigated the number of items needed to reliably assess anatomical knowledge in the context of gross anatomy and histology. A generalizability analysis was conducted on gross anatomy and histology written and practical examination items that were administered in a discipline-based format at Indiana University School of Medicine and in an integrated fashion at the University of Alabama School of Medicine and Rush University Medical College. Examination items were analyzed using a partially nested design s×(i:o) in which items were nested within occasions (i:o) and crossed with students (s). A reliability standard of 0.80 was used to determine the minimum number of items needed across examinations (occasions) to make reliable and informed decisions about students' competence in anatomical knowledge. Decision study plots are presented to demonstrate how the number of items per examination influences the reliability of each administered assessment. Using the example of a curriculum that assesses gross anatomy knowledge over five summative written and practical examinations, the results of the decision study estimated that 30 and 25 items would be needed on each written and practical examination to reach a reliability of 0.80, respectively. This study is particularly relevant to educators who may question whether the amount of anatomy content assessed in multidisciplinary evaluations is sufficient for making judgments about the anatomical aptitude of students. Anat Sci Educ 10: 109-119. © 2016 American Association of Anatomists.
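
    The decision-study logic can be sketched with the usual generalizability coefficient for an s x (i:o) design, Ep^2 = var_s / (var_s + var_so/n_o + var_si:o/(n_i*n_o)). The variance components below are invented for illustration (they are not the study's estimates); they merely show how reliability grows as items are added per examination.

        def g_coefficient(var_s, var_so, var_sio, n_items, n_occasions):
            """Generalizability (Ep^2) for an s x (i:o) design with given variance components."""
            relative_error = var_so / n_occasions + var_sio / (n_items * n_occasions)
            return var_s / (var_s + relative_error)

        # hypothetical variance components: student, student x occasion, student x item:occasion
        var_s, var_so, var_sio = 0.040, 0.010, 0.900

        for n_items in (10, 20, 25, 30, 40):
            g = g_coefficient(var_s, var_so, var_sio, n_items, n_occasions=5)
            print(f"{n_items} items x 5 occasions: Ep^2 = {g:.2f}")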

  15. Analysis on evaluation ability of nonlinear safety assessment model of coal mines based on artificial neural network

    Institute of Scientific and Technical Information of China (English)

    SHI Shi-liang; LIU Hai-bo; LIU Ai-hua

    2004-01-01

Based on an integrated analysis of the strengths and shortcomings of the various methods used in coal mine safety assessment, and taking into account the nonlinear features of mine safety sub-systems, this paper establishes a neural network assessment model of mine safety, analyzes the ability of artificial neural networks to evaluate the mine safety state, and lays the theoretical foundation for using artificial neural networks in the systematic optimization of mine safety assessment and for obtaining reasonably accurate safety assessment results.

  16. Contextualized analysis of a needs assessment using the Theoretical Domains Framework: a case example in endocrinology.

    Science.gov (United States)

    Lazure, Patrice; Bartel, Robert C; Biller, Beverly M K; Molitch, Mark E; Rosenthal, Stephen M; Ross, Judith L; Bernsten, Brock D; Hayes, Sean M

    2014-07-24

    The Theoretical Domains Framework (TDF) is a set of 14 domains of behavior change that provide a framework for the critical issues and factors influencing optimal knowledge translation. Considering that a previous study has identified optimal knowledge translation techniques for each TDF domain, it was hypothesized that the TDF could be used to contextualize and interpret findings from a behavioral and educational needs assessment. To illustrate this hypothesis, findings and recommendations drawn from a 2012 national behavioral and educational needs assessment conducted with healthcare providers who treat and manage Growth and Growth Hormone Disorders, will be discussed using the TDF. This needs assessment utilized a mixed-methods research approach that included a combination of: [a] data sources (Endocrinologists (n:120), Pediatric Endocrinologists (n:53), Pediatricians (n:52)), [b] data collection methods (focus groups, interviews, online survey), [c] analysis methodologies (qualitative - analyzed through thematic analysis, quantitative - analyzed using frequencies, cross-tabulations, and gap analysis). Triangulation was used to generate trustworthy findings on the clinical practice gaps of endocrinologists, pediatric endocrinologists, and general pediatricians in their provision of care to adult patients with adult growth hormone deficiency or acromegaly, or children/teenagers with pediatric growth disorders. The identified gaps were then broken into key underlying determinants, categorized according to the TDF domains, and linked to optimal behavioral change techniques. The needs assessment identified 13 gaps, each with one or more underlying determinant(s). Overall, these determinants were mapped to 9 of the 14 TDF domains. The Beliefs about Consequences domain was identified as a contributing determinant to 7 of the 13 challenges. Five of the gaps could be related to the Skills domain, while three were linked to the Knowledge domain. The TDF categorization of

  17. Contextualized analysis of a needs assessment using the Theoretical Domains Framework: a case example in endocrinology

    Science.gov (United States)

    2014-01-01

    Background The Theoretical Domains Framework (TDF) is a set of 14 domains of behavior change that provide a framework for the critical issues and factors influencing optimal knowledge translation. Considering that a previous study has identified optimal knowledge translation techniques for each TDF domain, it was hypothesized that the TDF could be used to contextualize and interpret findings from a behavioral and educational needs assessment. To illustrate this hypothesis, findings and recommendations drawn from a 2012 national behavioral and educational needs assessment conducted with healthcare providers who treat and manage Growth and Growth Hormone Disorders, will be discussed using the TDF. Methods This needs assessment utilized a mixed-methods research approach that included a combination of: [a] data sources (Endocrinologists (n:120), Pediatric Endocrinologists (n:53), Pediatricians (n:52)), [b] data collection methods (focus groups, interviews, online survey), [c] analysis methodologies (qualitative - analyzed through thematic analysis, quantitative - analyzed using frequencies, cross-tabulations, and gap analysis). Triangulation was used to generate trustworthy findings on the clinical practice gaps of endocrinologists, pediatric endocrinologists, and general pediatricians in their provision of care to adult patients with adult growth hormone deficiency or acromegaly, or children/teenagers with pediatric growth disorders. The identified gaps were then broken into key underlying determinants, categorized according to the TDF domains, and linked to optimal behavioral change techniques. Results The needs assessment identified 13 gaps, each with one or more underlying determinant(s). Overall, these determinants were mapped to 9 of the 14 TDF domains. The Beliefs about Consequences domain was identified as a contributing determinant to 7 of the 13 challenges. Five of the gaps could be related to the Skills domain, while three were linked to the Knowledge domain

  18. A general framework for the regression analysis of pooled biomarker assessments.

    Science.gov (United States)

    Liu, Yan; McMahan, Christopher; Gallagher, Colin

    2017-07-10

As a cost-efficient data collection mechanism, the process of assaying pooled biospecimens is becoming increasingly common in epidemiological research; for example, pooling has been proposed for the purpose of evaluating the diagnostic efficacy of biological markers (biomarkers). To this end, several authors have proposed techniques that allow for the analysis of continuous pooled biomarker assessments. Regrettably, most of these techniques proceed under restrictive assumptions, are unable to account for the effects of measurement error, and fail to control for confounding variables. These limitations are understandably attributable to the complex structure that is inherent to measurements taken on pooled specimens. Consequently, in order to provide practitioners with the tools necessary to accurately and efficiently analyze pooled biomarker assessments, herein, a general Monte Carlo maximum likelihood-based procedure is presented. The proposed approach allows for the regression analysis of pooled data under practically all parametric models and can be used to directly account for the effects of measurement error. Through simulation, it is shown that the proposed approach can accurately and efficiently estimate all unknown parameters and is more computationally efficient than existing techniques. This new methodology is further illustrated using monocyte chemotactic protein-1 data collected by the Collaborative Perinatal Project in an effort to assess the relationship between this chemokine and the risk of miscarriage. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach

    Science.gov (United States)

    Xiao, T.

    2012-12-01

One of the most important components in urban land cover mapping is mapping accuracy assessment. Many statistical models have been developed to help design simple schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment. Understanding cost and sample size is crucial for implementing efficient and effective field data collection. Few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High-resolution land cover data layers of Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used in this study to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
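
    One way to couple sample-size design with a cost estimate is the standard binomial sample-size formula n = z^2 * p * (1 - p) / d^2 combined with per-sample cost rates. The target accuracy, half-width and dollar rates below are invented placeholders, not figures from the study.

        import math
        from scipy.stats import norm

        def required_samples(expected_accuracy, half_width, confidence=0.95):
            """Binomial sample size for estimating map accuracy within +/- half_width."""
            z = norm.ppf(0.5 + confidence / 2.0)
            return math.ceil(z ** 2 * expected_accuracy * (1.0 - expected_accuracy) / half_width ** 2)

        def sampling_cost(n, transport=4.0, field=6.5, lab=2.0):
            """Total cost with hypothetical per-sample transportation, field and laboratory rates (USD)."""
            return n * (transport + field + lab)

        n = required_samples(expected_accuracy=0.85, half_width=0.05)
        print(f"samples needed: {n}, estimated cost: ${sampling_cost(n):,.0f}")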

  20. Analysis and assessment of environmental impacts of small hydro power plant in Slovakia

    Science.gov (United States)

    Zeleňáková, M.; Fijko, R.; Remeňáková, I.

    2017-10-01

    Environmental impact assessment (EIA) is an important process that, prior to approval of the investment plan, can provide a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to apply a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions, namely, small hydro power plant, using matrix of impacts. This analysis method is intended not only to increase the clarity and precision of the evaluation process, but also to align it with the requirements of the environmental impact assessment system. This modification should improve the reliability of the environmental impact assessment, and could moreover also be applied to other infrastructure projects. Comparison of alternatives and designation of the optimal variant are implemented based on selected criteria that objectively describe the characteristic lines of the planned alternatives of activity and their impact on the environment. The use of proper EIA procedures can help the decision-makers to formulate proper activities based on qualified decisions. The designed project in Spišské Bystré, Slovakia is used as a case study to clarify and exemplify the methodology and techniques.

  1. Is It Working? Distractor Analysis Results from the Test Of Astronomy STandards (TOAST) Assessment Instrument

    Science.gov (United States)

    Slater, Stephanie

    2009-05-01

The Test Of Astronomy STandards (TOAST) assessment instrument is a multiple-choice survey tightly aligned to the consensus learning goals stated by the American Astronomical Society - Chair's Conference on ASTRO 101, the American Association for the Advancement of Science's Project 2061 Benchmarks, and the National Research Council's National Science Education Standards. Researchers from the Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team at the University of Wyoming's Science and Math Teaching Center (UWYO SMTC) have been conducting a question-by-question distractor analysis procedure to determine the sensitivity and effectiveness of each item. In brief, the frequency of each possible answer choice, known as a foil or distractor on a multiple-choice test, is determined and compared to the existing literature on the teaching and learning of astronomy. In addition to having statistical difficulty and discrimination values, a well-functioning assessment item will show students selecting distractors in the relative proportions in which we expect them to respond based on known misconceptions and reasoning difficulties. Our distractor analysis suggests that all items are functioning as expected. These results add weight to the validity of the Test Of Astronomy STandards (TOAST) assessment instrument, which is designed to help instructors and researchers measure the impact of course-length duration instructional strategies for undergraduate science survey courses with learning goals tightly aligned to the consensus goals of the astronomy education community.
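
    A distractor analysis of this kind reduces, per item, to tabulating how often each answer choice is selected. The sketch below does this for one invented item (choices, responses and the keyed answer are all hypothetical) using pandas.

        import pandas as pd

        # hypothetical responses to one multiple-choice item; "B" is keyed correct, "C" a common misconception
        responses = pd.Series(list("BBCBACBBCDBCBBACBBCB"))

        foil_proportions = responses.value_counts(normalize=True).sort_index()
        print(foil_proportions.round(2))                       # proportion selecting each foil
        print("item difficulty (p-value):", round((responses == "B").mean(), 2))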

  2. Discrete wavelet transform analysis of surface electromyography for the fatigue assessment of neck and shoulder muscles.

    Science.gov (United States)

    Chowdhury, Suman Kanti; Nimbarte, Ashish D; Jaridi, Majid; Creese, Robert C

    2013-10-01

Assessment of neuromuscular fatigue is essential for early detection and prevention of risks associated with work-related musculoskeletal disorders. In recent years, discrete wavelet transform (DWT) of surface electromyography (SEMG) has been used to evaluate muscle fatigue, especially during dynamic contractions when the SEMG signal is non-stationary. However, its application to the assessment of work-related neck and shoulder muscle fatigue is not well established. Therefore, the purpose of this study was to establish DWT analysis as a suitable method to conduct quantitative assessment of neck and shoulder muscle fatigue under dynamic repetitive conditions. Ten human participants performed 40 min of fatiguing repetitive arm and neck exertions while SEMG data from the upper trapezius and sternocleidomastoid muscles were recorded. Ten of the most commonly used wavelet functions were used to conduct the DWT analysis. Spectral changes estimated using the power of wavelet coefficients in the 12-23 Hz frequency band showed the highest sensitivity to fatigue induced by the dynamic repetitive exertions. Although most of the wavelet functions tested in this study reasonably demonstrated the expected power trend with fatigue development and recovery, the overall performance of the "Rbio3.1" wavelet in terms of power estimation and statistical significance was better than the remaining nine wavelets. Copyright © 2013 Elsevier Ltd. All rights reserved.
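
    The band-power computation can be sketched with PyWavelets (our choice of library), which provides the reverse biorthogonal 'rbio3.1' wavelet named above. The SEMG trace below is synthetic, and the sampling rate of 1500 Hz is an assumption chosen so that the level-6 detail coefficients cover roughly the 12-23 Hz band.

        import numpy as np
        import pywt

        fs = 1500                                  # Hz; with 6 levels, detail D6 spans about 12-23 Hz
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(4)

        # hypothetical SEMG-like signal: broadband noise plus a low-frequency component near 18 Hz
        semg = rng.normal(0.0, 1.0, t.size) + 0.8 * np.sin(2 * np.pi * 18.0 * t)

        coeffs = pywt.wavedec(semg, "rbio3.1", level=6)   # [cA6, cD6, cD5, ..., cD1]
        d6 = coeffs[1]
        band_power = np.mean(d6 ** 2)
        print(f"mean power of D6 (~12-23 Hz) coefficients: {band_power:.3f}")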

  3. Laser cutting of various materials: Kerf width size analysis and life cycle assessment of cutting process

    Science.gov (United States)

    Yilbas, Bekir Sami; Shaukat, Mian Mobeen; Ashraf, Farhan

    2017-08-01

    Laser cutting of various materials, including Ti-6Al-4V alloy, steel 304, Inconel 625, and alumina, is carried out to assess the kerf width variation along the cut section. A life cycle assessment is carried out to determine the environmental impact of laser cutting in terms of the material wasted during the cutting process. The kerf width is formulated and predicted using a lump parameter analysis and is measured in the experiments. The influence of laser output power and laser cutting speed on the kerf width variation is examined using scanning electron and optical microscopy. In the experiments, high-pressure nitrogen assist gas is used to prevent oxidation reactions in the cut section. The kerf width predicted from the lump parameter analysis agrees well with the experimental data. The kerf width variation increases with increasing laser output power; this behavior reverses with increasing laser cutting speed. The life cycle assessment reveals that material selection is critical from an environmental protection point of view: Inconel 625 contributes the most to environmental damage, although recycling the laser cutting waste reduces this contribution.
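
    The abstract does not reproduce the paper's lump parameter formulation, so the sketch below uses a generic lumped energy balance in which the absorbed laser power melts the material removed through the kerf, with conduction and assist-gas losses folded into an assumed effective coupling factor; the Ti-6Al-4V property values are nominal and purely indicative.

      def kerf_width(P, v, thickness, eta=0.15,
                     rho=4430.0, cp=560.0, Tm=1933.0, T0=300.0, Lm=2.9e5):
          """Kerf width (m) from the balance eta*P = rho*w*thickness*v*(cp*(Tm - T0) + Lm).
          eta is an assumed effective coupling factor, not a measured absorptivity."""
          specific_energy = cp * (Tm - T0) + Lm              # J/kg to heat and melt the material
          return eta * P / (rho * thickness * v * specific_energy)

      P, t = 1500.0, 3.0e-3                                  # W laser power, m sheet thickness (assumed)
      for v in (0.03, 0.06, 0.12):                           # m/s cutting speeds
          print(f"v = {v * 1000:5.0f} mm/s -> kerf width ~ {kerf_width(P, v, t) * 1e6:6.1f} um")
      # Kerf widens in proportion to laser power and narrows with cutting speed,
      # in line with the trends reported above.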

  4. Patient assessment within the context of healthcare delivery packages: A comparative analysis.

    Science.gov (United States)

    Rossen, Camilla Blach; Buus, Niels; Stenager, Egon; Stenager, Elsebeth

    2016-01-01

    Due to an increased focus on productivity and cost-effectiveness, many countries across the world have implemented a variety of tools for standardizing diagnostics and treatment. In Denmark, healthcare delivery packages are increasingly used for the assessment of patients. A package is a tool for creating coordination, continuity and efficient pathways; each step is pre-booked, and the package has a well-defined content within a predefined category of diseases. The aim of this study was to investigate how assessment processes take place within the context of healthcare delivery packages. The study used a constructivist Grounded Theory approach. Ethnographic fieldwork was carried out in three specialized units that all used assessment packages: a mental health unit and two multiple sclerosis clinics in Southern Denmark. Several types of data were sampled through theoretical sampling. Participant observation was conducted for a total of 126 h. Formal and informal interviews were conducted with 12 healthcare professionals and 13 patients. Furthermore, audio recordings were made of 9 final consultations between physicians and patients, 193 min of recorded consultations in all. Lastly, the medical records of 13 patients and written information about the packages were collected. The comparative, abductive analysis focused on the process of assessment and the work done by all the actors involved. In this paper, we emphasize the work of healthcare professionals. We constructed five interrelated categories: 1. "Standardized assessing"; 2. "Flexibility", which has two sub-categories, 2.1. "Diagnostic options" and 2.2. "Time and organization"; and, finally, 3. "Resisting the frames". The process of assessment required all participants to perform the predefined work in the specified way at the specified time. Multidisciplinary teamwork was essential for the success of the process. The local organization of the packages influenced the assessment process, most notably the pre

  5. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    Science.gov (United States)

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-03

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk posed by ingestion of a pathogen. It applies mathematical models combined with a careful exploitation of data sets, represented by distributions and - in the case of two-dimensional Monte Carlo simulations - their hyperparameters. This research aims to highlight the background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, although it is essential to ensure reliable and realistic simulations and interpretations. As a case study, we consider the occurrence of listeriosis from smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo simulation and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant contributors to the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported figure for the same period and population. Variability in the final risk estimate is determined by the variability in (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability in the initial contamination level of L. monocytogenes appears as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of variability in the initial contamination level disappears. Uncertainty determinants of the final risk indicated the need of gathering more information on the reference growth rate and the minimum
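
    A minimal sketch of a two-dimensional (nested uncertainty/variability) Monte Carlo risk simulation with a Spearman-based sensitivity ranking; the growth model, dose-response parameter and all distributions are simplified placeholders, not the values of the Belgian smoked-fish assessment.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(42)
      n_unc, n_var = 50, 2000                        # outer (uncertainty) x inner (variability) loops

      temp    = rng.normal(7, 3, (n_unc, n_var)).clip(0, 15)             # fridge temperature (degC)
      mu_ref  = rng.normal(0.30, 0.05, (n_unc, n_var)).clip(0.05, None)  # reference growth rate (log10 CFU/h, placeholder)
      portion = rng.lognormal(3.3, 0.5, (n_unc, n_var))                  # portion size (g)
      logC0   = rng.normal(0.5, 1.0, (n_unc, n_var))                     # initial contamination (log10 CFU/g)
      r_dr    = 10 ** rng.normal(-12, 0.5, (n_unc, 1))                   # uncertain dose-response parameter

      mu    = mu_ref * ((temp + 1.2) / 31.2) ** 2     # toy secondary growth model (Tmin = -1.2 degC assumed)
      logC  = np.minimum(logC0 + mu * 72, 8.0)        # growth over 72 h of storage, capped at 8 log10 CFU/g
      dose  = 10 ** logC * portion                    # CFU ingested per serving
      risks = 1 - np.exp(-r_dr * dose)                # exponential dose-response, broadcast over the outer loop

      print(f"mean risk per contaminated serving: {risks.mean():.2e}")
      for name, v in [("fridge temperature", temp), ("reference growth rate", mu_ref),
                      ("portion size", portion), ("initial contamination", logC0)]:
          rho, _ = spearmanr(v.ravel(), risks.ravel())   # rank-based sensitivity to variability
          print(f"Spearman rho({name}, risk) = {rho:+.2f}")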

  6. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    International Nuclear Information System (INIS)

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-01-01

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs

  7. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu [Louisiana State University, Baton Rouge, LA (United States); Sattler, Meredith, E-mail: msattler@lsu.edu [School of Architecture, Louisiana State University, Baton Rouge, LA (United States); Friedland, Carol J., E-mail: friedland@lsu.edu [Bert S. Turner Department of Construction Management, Louisiana State University, Baton Rouge, LA (United States)

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.

  8. Applying HAZOP analysis in assessing remote handling compatibility of ITER port plugs

    International Nuclear Information System (INIS)

    Duisings, L.P.M.; Til, S. van; Magielsen, A.J.; Ronden, D.M.S.; Elzendoorn, B.S.Q.; Heemskerk, C.J.M.

    2013-01-01

    Highlights: ► We applied HAZOP analysis to assess the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. ► We identified several weak points in the general upper port plug maintenance concept. ► We made clear recommendations on redesign in port plug design, operational sequence and Hot Cell equipment. ► The use of a HAZOP approach for the ECH UL port can also be applied to ITER port plugs in general. -- Abstract: This paper describes the application of a Hazard and Operability Analysis (HAZOP) methodology in assessing the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. As part of the ECHUL consortium, the remote handling team at the DIFFER Institute is developing maintenance tools and procedures for critical components of the ECH Upper launcher (UL). Based on NRG's experience with nuclear risk analysis and Hot Cell procedures, early versions of these tool concepts and maintenance procedures were subjected to a HAZOP analysis. The analysis identified several weak points in the general upper port plug maintenance concept and led to clear recommendations on redesigns in port plug design, the operational sequence and ITER Hot Cell equipment. The paper describes the HAZOP methodology and illustrates its application with specific procedures: the Steering Mirror Assembly (SMA) replacement and the exchange of the Mid Shield Optics (MSO) in the ECH UPL. A selection of recommended changes to the launcher design associated with the accessibility, maintainability and manageability of replaceable components are presented
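
    A minimal sketch of HAZOP-style deviation generation for a single remote handling step, crossing standard guide words with process parameters; the step, parameters and guide words shown are illustrative, not the ITER worksheets themselves.

      from itertools import product

      guide_words = ["no", "more", "less", "reverse", "part of", "other than", "late"]
      parameters  = ["gripper force", "alignment", "lifting speed", "tool position"]
      step = "Steering Mirror Assembly removal"

      worksheet = []
      for word, param in product(guide_words, parameters):
          worksheet.append({"step": step,
                            "deviation": f"{word.upper()} {param}",
                            "causes": None, "consequences": None, "safeguards": None})

      print(f"{len(worksheet)} deviations to review for step: {step}")
      for row in worksheet[:5]:
          print("  -", row["deviation"])
      # Each generated deviation is then reviewed by the team to record credible causes,
      # consequences for the port plug, and required safeguards or redesigns.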

  9. Assessing eco-efficiency: A metafrontier directional distance function approach using life cycle analysis

    International Nuclear Information System (INIS)

    Beltrán-Esteve, Mercedes; Reig-Martínez, Ernest; Estruch-Guitart, Vicent

    2017-01-01

    Sustainability analysis requires a joint assessment of the environmental, social and economic aspects of production processes. Here we propose the use of Life Cycle Analysis (LCA), a metafrontier (MF) directional distance function (DDF) approach, and Data Envelopment Analysis (DEA) to assess technological and managerial differences in eco-efficiency between production systems. We use LCA to compute six environmental and health impacts associated with the production processes of nearly 200 Spanish citrus farms belonging to organic and conventional farming systems. DEA is then employed to obtain a joint economic-environmental score for each farm, which we refer to as its eco-efficiency. The DDF allows us to determine farms' global eco-efficiency scores, as well as eco-efficiency scores with respect to specific environmental impacts. Furthermore, the use of an MF helps us to disentangle technological and managerial eco-inefficiencies by comparing the eco-efficiency of both farming systems with regard to a common benchmark. Our core results suggest that the shift from conventional to organic farming technology would allow a potential reduction in environmental impacts of 80% without any decline in economic performance. In contrast, as regards farmers' managerial capacities, both systems display quite similar mean scores.
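
    A minimal sketch of a DEA directional distance function under constant returns to scale, solved with scipy.optimize.linprog; the tiny data set (one input, one desirable output, one environmental impact per farm) and the direction vector proportional to each farm's own levels are illustrative assumptions, not the citrus-farm data.

      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[2.0], [3.0], [4.0], [5.0]])      # inputs, one row per farm
      Y = np.array([[10.0], [14.0], [15.0], [16.0]])  # desirable output (economic value)
      B = np.array([[4.0], [5.0], [9.0], [12.0]])     # undesirable output (LCA impact)

      def ddf_score(k, X, Y, B, gy=1.0, gb=1.0):
          """max beta s.t. sum(lam*Y) >= Y[k]*(1 + beta*gy), sum(lam*B) <= B[k]*(1 - beta*gb),
          sum(lam*X) <= X[k], lam >= 0 (direction proportional to the unit's own levels)."""
          n = X.shape[0]
          c = np.zeros(n + 1); c[-1] = -1.0                       # maximise beta
          A_ub, b_ub = [], []
          for r in range(Y.shape[1]):                             # -sum(lam*y) + beta*gy*y_k <= -y_k
              A_ub.append(np.append(-Y[:, r], gy * Y[k, r])); b_ub.append(-Y[k, r])
          for j in range(B.shape[1]):                             # sum(lam*b) + beta*gb*b_k <= b_k
              A_ub.append(np.append(B[:, j], gb * B[k, j])); b_ub.append(B[k, j])
          for i in range(X.shape[1]):                             # sum(lam*x) <= x_k
              A_ub.append(np.append(X[:, i], 0.0)); b_ub.append(X[k, i])
          res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                        bounds=[(0, None)] * (n + 1), method="highs")
          return res.x[-1]

      for k in range(X.shape[0]):
          print(f"farm {k}: beta = {ddf_score(k, X, Y, B):.3f}  (0 = on the eco-efficient frontier)")

    Computing the scores against the pooled (metafrontier) sample and against each system's own sample separately is what allows the technological and managerial components of eco-inefficiency to be separated.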

  10. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation, we present an introduction to two alternative but complementary approaches to data analysis and interpretation: (i) exploratory data analysis (EDA), with its emphasis on visualization and graphics-based approaches, and (ii) Bayesian statistical methodology, which provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
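
    A minimal sketch of a Bayesian comparison of one compositional analyte between a GM line and its conventional counterpart, using the standard noninformative-prior result that the posterior of each mean is a scaled t distribution; the analyte values are synthetic placeholders.

      import numpy as np

      rng = np.random.default_rng(1)
      gm   = rng.normal(36.5, 1.2, 20)        # e.g. protein (% dry weight), GM line (synthetic)
      conv = rng.normal(36.0, 1.2, 20)        # conventional counterpart (synthetic)

      def posterior_mean_draws(x, n_draws=100_000):
          """Posterior draws of the mean under a Jeffreys prior:
          mu | data ~ mean(x) + s/sqrt(n) * t_{n-1}."""
          n, m, s = len(x), x.mean(), x.std(ddof=1)
          return m + s / np.sqrt(n) * rng.standard_t(n - 1, n_draws)

      diff = posterior_mean_draws(gm) - posterior_mean_draws(conv)
      lo, hi = np.quantile(diff, [0.025, 0.975])
      print(f"posterior mean difference: {diff.mean():+.2f}")
      print(f"95% credible interval:     ({lo:+.2f}, {hi:+.2f})")
      print(f"P(|difference| > 1 unit):  {np.mean(np.abs(diff) > 1.0):.3f}")

    Statements of this kind speak directly to quantities of interest, rather than to a p-value from a pair-wise significance test.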

  11. Assessing eco-efficiency: A metafrontier directional distance function approach using life cycle analysis

    Energy Technology Data Exchange (ETDEWEB)

    Beltrán-Esteve, Mercedes, E-mail: mercedes.beltran@uv.es [Department of Applied Economics II, University of Valencia (Spain); Reig-Martínez, Ernest [Department of Applied Economics II, University of Valencia, Ivie (Spain); Estruch-Guitart, Vicent [Department of Economy and Social Sciences, Polytechnic University of Valencia (Spain)

    2017-03-15

    Sustainability analysis requires a joint assessment of the environmental, social and economic aspects of production processes. Here we propose the use of Life Cycle Analysis (LCA), a metafrontier (MF) directional distance function (DDF) approach, and Data Envelopment Analysis (DEA) to assess technological and managerial differences in eco-efficiency between production systems. We use LCA to compute six environmental and health impacts associated with the production processes of nearly 200 Spanish citrus farms belonging to organic and conventional farming systems. DEA is then employed to obtain a joint economic-environmental score for each farm, which we refer to as its eco-efficiency. The DDF allows us to determine farms' global eco-efficiency scores, as well as eco-efficiency scores with respect to specific environmental impacts. Furthermore, the use of an MF helps us to disentangle technological and managerial eco-inefficiencies by comparing the eco-efficiency of both farming systems with regard to a common benchmark. Our core results suggest that the shift from conventional to organic farming technology would allow a potential reduction in environmental impacts of 80% without any decline in economic performance. In contrast, as regards farmers' managerial capacities, both systems display quite similar mean scores.

  12. Texture analysis for the assessment of structural changes in parotid glands induced by radiotherapy

    International Nuclear Information System (INIS)

    Scalco, Elisa; Fiorino, Claudio; Cattaneo, Giovanni Mauro; Sanguineti, Giuseppe; Rizzo, Giovanna

    2013-01-01

    Background and purpose: During radiotherapy (RT) for head-and-neck cancer, parotid glands undergo significant anatomic, functional and structural changes which could represent pre-clinical signs of an increased risk of xerostomia. Texture analysis is proposed to assess structural changes of the parotids induced by RT, and to investigate whether early variations of textural parameters (such as mean intensity and fractal dimension) can predict parotid shrinkage at the end of treatment. Material and methods: Textural parameters and volumes of 42 parotids from 21 patients treated with intensity-modulated RT for nasopharyngeal cancer were extracted from CT images. To identify which parameters changed during RT, a Wilcoxon signed-rank test was performed on the textural indices between the first and second RT weeks and between the first and last RT weeks. Discriminant analysis was applied to the variations of these parameters over the first two weeks of RT to assess their power in predicting parotid shrinkage at the end of RT. Results: A significant decrease in mean intensity (1.7 HU and 3.8 HU after the second and last weeks, respectively) and fractal dimension (0.016 and 0.021) was found. Discriminant analysis, based on volume and fractal dimension, was able to predict the final parotid shrinkage (accuracy of 71.4%). Conclusion: Textural features could be used in combination with volume to characterize structural modifications of the parotid glands and to predict parotid shrinkage at the end of RT.
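
    A minimal sketch of the two statistical steps described above: a Wilcoxon signed-rank test on the change of a textural index between RT weeks, and a discriminant analysis predicting final shrinkage from early changes in volume and fractal dimension; all numbers are synthetic placeholders.

      import numpy as np
      from scipy.stats import wilcoxon
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(3)
      n = 42                                               # parotid glands

      fd_week1 = rng.normal(2.60, 0.05, n)                 # fractal dimension, week 1 (synthetic)
      fd_week2 = fd_week1 - rng.normal(0.016, 0.01, n)     # small early decrease
      stat, p = wilcoxon(fd_week1, fd_week2)               # paired test between weeks
      print(f"Wilcoxon signed-rank: statistic = {stat:.1f}, p = {p:.4f}")

      # Early (weeks 1-2) changes used as predictors of final shrinkage (toy labels).
      d_volume = rng.normal(-0.05, 0.04, n)
      d_fd     = fd_week2 - fd_week1
      shrunk   = (d_volume + 0.5 * d_fd + rng.normal(0, 0.03, n)) < -0.05

      features = np.column_stack([d_volume, d_fd])
      lda = LinearDiscriminantAnalysis().fit(features, shrunk)
      print(f"discriminant analysis training accuracy = {lda.score(features, shrunk):.1%}")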

  13. Dependability Assessment by Static Analysis of Software Important to Nuclear Power Plant Safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab, Chatou (France)

    2014-08-15

    We describe practical experience with the safety assessment of safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricite de France (EDF) is investigating the use of methods and tools for source code semantic analysis in order to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Today, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on the software of one of our shutdown systems is significantly improved. In the first part, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software and why these characteristics are well suited to the new analysis tools. In the last part, we present an overview of the results and the limitations of the tools.

  14. A hierarchical clustering scheme approach to assessment of IP-network traffic using detrended fluctuation analysis

    Science.gov (United States)

    Takuma, Takehisa; Masugi, Masao

    2009-03-01

    This paper presents an approach to the assessment of IP-network traffic in terms of the time variation of self-similarity. To obtain a comprehensive view when analyzing the degree of long-range dependence (LRD) of IP-network traffic, we use a hierarchical clustering scheme, which provides a way to classify high-dimensional data with a tree-like structure. In the LRD-based analysis, we employ detrended fluctuation analysis (DFA), which is applicable to the analysis of long-range power-law correlations, or LRD, in non-stationary time-series signals. Based on sequential measurements of IP-network traffic at two locations, this paper derives values of the LRD-related parameter α that reflect the degree of LRD of the measured data. In the hierarchical clustering scheme, we use three parameters for each measured data set: the α value, the average throughput, and the proportion of network traffic that exceeds 80% of the network bandwidth. We visually confirm that the traffic data can be classified in accordance with the network traffic properties, showing that the combined depiction of LRD and other factors provides an effective assessment of network conditions at different times.
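
    A minimal sketch of detrended fluctuation analysis to estimate the scaling exponent α, followed by hierarchical clustering of per-measurement feature vectors; the traffic traces and the bandwidth-exceedance feature are synthetic placeholders.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
          """Estimate the DFA scaling exponent of a 1-D series."""
          y = np.cumsum(x - np.mean(x))                     # integrated profile
          F = []
          for n in scales:
              n_seg = len(y) // n
              segs = y[: n_seg * n].reshape(n_seg, n)
              t = np.arange(n)
              rms = []
              for seg in segs:                              # detrend each segment linearly
                  coeffs = np.polyfit(t, seg, 1)
                  rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
              F.append(np.mean(rms))
          alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
          return alpha

      rng = np.random.default_rng(7)
      features = []
      for _ in range(12):                                   # 12 measured traffic traces (toy)
          trace = rng.standard_normal(4096).cumsum() * 0.01 + rng.standard_normal(4096)
          throughput = trace + 5.0                          # toy throughput series
          features.append([dfa_alpha(trace),
                           throughput.mean(),
                           np.mean(throughput > 0.8 * throughput.max())])

      Z = linkage(np.array(features), method="ward")
      labels = fcluster(Z, t=3, criterion="maxclust")       # classify traces into 3 groups
      print("cluster label per trace:", labels)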

  15. Current activities and future trends in reliability analysis and probabilistic safety assessment in Hungary

    International Nuclear Information System (INIS)

    Hollo, E.; Toth, J.

    1986-01-01

    In Hungary, reliability analysis (RA) and probabilistic safety assessment (PSA) of nuclear power plants were initiated 3 years ago. First, computer codes for automatic fault tree analysis (CAT, PREP) and numerical evaluation (REMO, KITT1, KITT2) were adapted. Two main case studies - a detailed availability/reliability calculation of the diesel sets and an analysis of safety systems influencing event sequences induced by a large LOCA - were performed. Input failure data were taken from publications, and a need for a failure and reliability data bank was identified. Current and future activities involve: setting up a national data bank for WWER-440 units; a full-scope level-I PSA of the PAKS NPP in Hungary; and operational safety assessment of particular problems at the PAKS NPP. In the present article, the state of RA and PSA activities in Hungary and the main objectives of the ongoing work are described. The need for international cooperation (for unified data collection on WWER-440 units) and for IAEA support (within Interregional Program INT/9/063) is emphasized. (author)

  16. A sensitivity analysis of a radiological assessment model for Arctic waters

    DEFF Research Database (Denmark)

    Nielsen, S.P.

    1998-01-01

    A model based on compartment analysis has been developed to simulate the dispersion of radionuclides in Arctic waters for an assessment of doses to man. The model predicts concentrations of radionuclides in the marine environment and doses to man from a range of exposure pathways. A parameter sensitivity analysis has identified components of the model that are potentially important contributors to the predictive accuracy of doses to individuals of critical groups as well as to the world population. The components investigated include features associated with water transport and mixing, particle scavenging, water-sediment interaction, biological uptake, ice transport and fish migration. Two independent evaluations of the release of radioactivity from dumped nuclear waste in the Kara Sea have been used as source terms for the dose calculations.
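
    A minimal sketch of a compartment model of this kind: a linear system dC/dt = K·C + S with first-order transfer rates between water boxes, a sediment sink and radioactive decay; the three-box layout, rate constants and source term are illustrative, not the Kara Sea model.

      import numpy as np
      from scipy.integrate import solve_ivp

      decay = np.log(2) / 30.0                  # 1/y, e.g. a Cs-137-like half-life of ~30 y
      rates = {("kara", "arctic"): 0.5, ("arctic", "kara"): 0.05,
               ("arctic", "atlantic"): 0.2, ("kara", "sediment"): 0.1}
      boxes = ["kara", "arctic", "atlantic"]

      K = np.zeros((3, 3))
      for (src, dst), rate in rates.items():
          i = boxes.index(src)
          K[i, i] -= rate                       # loss from source box (incl. to sediment)
          if dst in boxes:
              K[boxes.index(dst), i] += rate    # gain in receiving box
      K -= decay * np.eye(3)                    # radioactive decay in every box

      S = np.array([1.0, 0.0, 0.0])             # TBq/y release into the Kara Sea box (assumed)

      sol = solve_ivp(lambda t, C: K @ C + S, (0, 100), y0=[0.0, 0.0, 0.0],
                      t_eval=np.linspace(0, 100, 6))
      for t, row in zip(sol.t, sol.y.T):
          print(f"t = {t:5.1f} y  inventories (TBq): " +
                ", ".join(f"{b} {v:7.3f}" for b, v in zip(boxes, row)))
      # Dividing each inventory by the box water volume gives concentrations, which feed
      # the exposure-pathway and dose calculations mentioned above.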

  17. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    Directory of Open Access Journals (Sweden)

    Claudia R. Binder

    2013-03-01

    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. The approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry method for dermal exposure and the Button personal inhalable aerosol sampler for inhalation exposure, with the tracer uranine as a pesticide surrogate. The case study was a greenhouse rose farm on the Bogota Plateau in Colombia. The approach was applied to estimate the exposure to pesticides such as mancozeb, carbendazim, propamocarb hydrochloride, fosetyl, carboxin, thiram, dimethomorph and mandipropamide. We found dermal absorption estimates close to the AOEL reference values for the pesticides carbendazim, mancozeb, thiram and mandipropamide during the study period. In addition, high values of dermal exposure were found on the forearms, hands, chest and legs of study participants, indicating weaknesses in the overlapping areas of the parts of the personal protective equipment. These results show how the material flow analysis methodology can be applied in the field of human exposure for early recognition of the dispersion of pesticides, and they support the development of measures to improve operational safety during pesticide management. Furthermore, the model makes it possible to identify the status quo of the health risk faced by workers in the study area.
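
    A minimal sketch of the transfer-coefficient step of a material flow analysis of operator exposure: handled mass times a transfer coefficient per body region, then comparison of the absorbed dose with a reference value; all coefficients, the absorption fraction and the AOEL shown are illustrative placeholders.

      applied_g_per_day = 120.0                 # g of active ingredient handled per operator per day (assumed)

      transfer_coeff = {                        # fraction of handled mass reaching each body region (assumed)
          "hands": 2.0e-4, "forearms": 1.2e-4, "chest": 0.8e-4, "legs": 1.0e-4,
      }
      dermal_absorption = 0.02                  # fraction of deposited mass absorbed (assumed)
      body_weight = 60.0                        # kg
      aoel_mg_per_kg = 0.02                     # illustrative reference value, not a regulatory figure

      deposited_mg = {r: applied_g_per_day * 1000 * tc for r, tc in transfer_coeff.items()}
      systemic_dose = sum(deposited_mg.values()) * dermal_absorption / body_weight

      for region, mg in deposited_mg.items():
          print(f"{region:9s} deposited: {mg:6.2f} mg/day")
      print(f"systemic dose: {systemic_dose:.4f} mg/kg bw/day "
            f"({systemic_dose / aoel_mg_per_kg:.0%} of the assumed AOEL)")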

  18. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    Science.gov (United States)

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment of the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud and, for all other cases, provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. The UAV-derived and OBIA-processed oblique images are also shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we demonstrate the possibility of aggregating the multi-perspective damage information at the building level.
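
    A minimal sketch of aggregating point-cloud and OBIA evidence into a per-building damage score; the indicator names, weights and thresholds are illustrative, not the paper's scoring scheme.

      buildings = {
          "B01": {"collapsed_in_pointcloud": False,
                  "facade_indicators": {"cracks": 0.4, "spalling": 0.2, "inclined_elements": 0.1},
                  "roof_indicators":   {"holes": 0.3, "debris": 0.2}},
          "B02": {"collapsed_in_pointcloud": True,
                  "facade_indicators": {}, "roof_indicators": {}},
      }

      def building_score(b):
          """Return a 0-1 damage score; total collapse detected in the 3-D point cloud dominates."""
          if b["collapsed_in_pointcloud"]:
              return 1.0
          indicators = {**b["facade_indicators"], **b["roof_indicators"]}
          if not indicators:
              return 0.0
          return min(1.0, 0.6 * max(indicators.values())
                          + 0.4 * sum(indicators.values()) / len(indicators))

      for bid, b in buildings.items():
          print(f"{bid}: damage score = {building_score(b):.2f}")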

  19. Negotiating NORM cleanup and land use limits: Practical use of dose assessment and cost benefit analysis

    International Nuclear Information System (INIS)

    Blanchard, A.D.H.

    1997-01-01

    Oil companies are presently faced with complex and costly environmental decisions, especially concerning NORM cleanup and disposal. Strict cleanup limits and disposal restrictions are established, in theory, to protect public health and the environment. While public health risk is directly measured in terms of dose (mrem/yr), most NORM regulations adopt soil concentration limits to ensure that future public health is maintained. These derived soil limits create the potential for unnecessary burden on operators without additional health benefit to society. Operators may use a dose assessment to show direct compliance with dose limits, negotiating less restrictive cleanup levels and land use limits. This paper discusses why a dose assessment is useful to oilfield operators, NORM exposure scenarios and pathways, assessment advantages, variables and recommendations, and one recent dose assessment application. Finally, a cost-benefit analysis tool for regulatory negotiations is presented, allowing comparison of oilfield NORM health benefit costs with those of other industries. One use of this tool, which resulted in savings of approximately $100,000, is discussed.

  20. Risk assessment of environmentally influenced airway diseases based on time-series analysis.

    Science.gov (United States)

    Herbarth, O

    1995-09-01

    Threshold values are of prime importance in providing a sound basis for public health decisions. A key issue is determining threshold or maximum exposure values for pollutants and assessing their potential health risks. Environmental epidemiology could be instrumental in assessing these levels, especially since the assessment of ambient exposures involves relatively low concentrations of pollutants. This paper presents a statistical method that allows the determination of threshold values as well as the assessment of the associated risk, using a retrospective, longitudinal study design with a prospective follow-up. Morbidity data were analyzed using the Fourier method, a time-series analysis that is based on the assumption of a high temporal resolution of the data. This method eliminates time-dependent responses such as temporal inhomogeneity and pseudocorrelation. The frequency of calls for respiratory distress conditions to the regional Mobile Medical Emergency Service (MMES) in the city of Leipzig was investigated. The entire population of Leipzig served as the pool for data collection. In addition to the collection of morbidity data, air pollution measurements were taken every 30 min for the entire study period, using sulfur dioxide as the regional indicator variable. This approach allowed the calculation of a dose-response curve for respiratory diseases and air pollution indices in children and adults. Significantly higher morbidities were observed above a 24-hr mean value of 0.6 mg SO2/m3 air for children and 0.8 mg SO2/m3 for adults. (ABSTRACT TRUNCATED AT 250 WORDS)
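
    A minimal sketch of a Fourier-based screening of a daily morbidity series: dominant periodic components (weekly, seasonal) are identified from the spectrum and removed before the series is related to the SO2 indicator, reducing pseudocorrelation; the call-count series below is a synthetic placeholder, not the Leipzig MMES data.

      import numpy as np

      rng = np.random.default_rng(11)
      days = np.arange(3 * 365)
      calls = (20 + 6 * np.sin(2 * np.pi * days / 365)       # seasonal component
                  + 2 * np.sin(2 * np.pi * days / 7)         # weekly component
                  + rng.poisson(3, days.size))               # irregular part

      spectrum = np.fft.rfft(calls - calls.mean())
      freqs = np.fft.rfftfreq(days.size, d=1.0)              # cycles per day
      top = np.argsort(np.abs(spectrum))[-3:]                # three strongest components
      print("dominant periods (days):", np.round(1 / freqs[top], 1))

      keep = np.ones(spectrum.size, dtype=bool)
      keep[top] = False                                      # drop the dominant periodicities
      residual = np.fft.irfft(np.where(keep, spectrum, 0), n=days.size) + calls.mean()
      print(f"residual std {residual.std():.2f} vs raw std {calls.std():.2f}")

    The detrended residual is what would then be compared against the 24-hr mean SO2 values when estimating the dose-response relationship and the apparent morbidity thresholds.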