WorldWideScience

Sample records for mortem analysis framework

  1. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. However, they pose certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, a mission-critical system for the Large Hadron Collider (LHC). We used a plugin-based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in mitigating the risks of executing third-party code: assurance that even a badly written plugin does not perturb the work of the overall application; plugin execution control that allows plugin misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; facilitation of diagnostics in case of plugin failure; and testing of the plugins be...
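
    The abstract describes the architectural concern but includes no code; the following is a minimal, hypothetical Python sketch of the kind of plugin execution control it mentions, in which every third-party plugin implements a common interface and the engine contains failures so a badly written plugin cannot perturb the rest of the application. The class and plugin names are invented for illustration and are not taken from the CERN PMA system.

```python
# Minimal sketch (not the actual PMA code) of plugin isolation: each
# plugin implements a common interface and the engine contains failures.
import traceback


class AnalysisPlugin:
    """Common interface every analysis plugin implements (hypothetical)."""
    name = "base"

    def analyse(self, event_data: dict) -> dict:
        raise NotImplementedError


class MagnetCurrentCheck(AnalysisPlugin):
    name = "magnet-current-check"

    def analyse(self, event_data):
        currents = event_data.get("currents", [])
        return {"max_current": max(currents) if currents else None}


class BrokenPlugin(AnalysisPlugin):
    name = "broken-plugin"

    def analyse(self, event_data):
        raise RuntimeError("badly written plugin")


def run_plugins(plugins, event_data):
    """Run each plugin, containing failures so one misbehaving plugin does
    not stop the others; real execution control (time budgets, forced
    termination) would additionally require process-level isolation."""
    results = {}
    for plugin in plugins:
        try:
            results[plugin.name] = {"ok": True, "result": plugin.analyse(event_data)}
        except Exception:
            results[plugin.name] = {"ok": False, "error": traceback.format_exc()}
    return results


if __name__ == "__main__":
    data = {"currents": [11.8, 11.9, 12.1]}
    for name, outcome in run_plugins([MagnetCurrentCheck(), BrokenPlugin()], data).items():
        print(name, "->", "ok" if outcome["ok"] else "failed")
```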

  2. [Post-mortem microbiology analysis].

    Science.gov (United States)

    Fernández-Rodríguez, Amparo; Alberola, Juan; Cohen, Marta Cecilia

    2013-12-01

    Post-mortem microbiology is useful in both clinical and forensic autopsies, and allows a suspected infection to be confirmed. Indeed, it is routinely applied to donor studies in the clinical setting, as well as in sudden and unexpected death in the forensic field. Implementation of specific sampling techniques in autopsy can minimize the possibility of contamination, making interpretation of the results easier. Specific interpretation criteria for post-mortem cultures, the use of molecular diagnosis, and their fusion with molecular biology and histopathology have led to post-mortem microbiology playing a major role in autopsy. Multidisciplinary work involving microbiologists, pathologists, and forensic physicians will help to improve the achievements of post-mortem microbiology, prevent infectious diseases, and contribute to a healthier population. Crown Copyright © 2012. Published by Elsevier España. All rights reserved.

  3. Blast furnace hearth lining: post mortem analysis

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Bruno Vidal de; Vernilli Junior, Fernando, E-mail: bva@usp.br [Universidade de Sao Paulo (USP), Lorena, SP (Brazil). Escola de Engenharia; Neves; Elton Silva; Silva, Sidiney Nascimento [Companhia Siderugica Nacional (CSN), Rio de Janeiro, RJ (Brazil)

    2017-05-15

    The main refractory lining of the blast furnace hearth is composed of carbon blocks that operate in continuous contact with hot gases, liquid slag and hot metal, at temperatures above 1550 deg C, 24 hours a day. To fully understand the wear mechanism acting on this refractory system, a post-mortem study was performed during the last partial repair of this furnace. Samples were collected from different parts of the hearth lining and characterized using the following techniques: bulk density and apparent porosity, X-ray fluorescence, X-ray diffraction, and scanning electron microscopy with energy-dispersive X-ray spectroscopy. The results showed that the carbon blocks located on the side opposite the blast furnace tap hole kept their main physicochemical characteristics even after the production of 20×10⁶ tons of hot metal. However, the carbon blocks around the tap hole showed infiltration by hot metal and slag and severe deposition of zinc and sulfur over their carbon flakes. The presence of these elements is undesirable because it reduces the physicochemical stability of this refractory system. The deposition found in the carbon refractory is associated with impurities present in both the coke and the sinter feed used in this blast furnace in the last few years. (author)

  4. Blast furnace hearth lining: post mortem analysis

    International Nuclear Information System (INIS)

    Almeida, Bruno Vidal de; Vernilli Junior, Fernando

    2017-01-01

    The main refractory lining of the blast furnace hearth is composed of carbon blocks that operate in continuous contact with hot gases, liquid slag and hot metal, at temperatures above 1550 deg C, 24 hours a day. To fully understand the wear mechanism acting on this refractory system, a post-mortem study was performed during the last partial repair of this furnace. Samples were collected from different parts of the hearth lining and characterized using the following techniques: bulk density and apparent porosity, X-ray fluorescence, X-ray diffraction, and scanning electron microscopy with energy-dispersive X-ray spectroscopy. The results showed that the carbon blocks located on the side opposite the blast furnace tap hole kept their main physicochemical characteristics even after the production of 20×10⁶ tons of hot metal. However, the carbon blocks around the tap hole showed infiltration by hot metal and slag and severe deposition of zinc and sulfur over their carbon flakes. The presence of these elements is undesirable because it reduces the physicochemical stability of this refractory system. The deposition found in the carbon refractory is associated with impurities present in both the coke and the sinter feed used in this blast furnace in the last few years. (author)

  5. Deuterium inventory in Tore Supra: reconciling particle balance and post-mortem analysis

    International Nuclear Information System (INIS)

    Tsitrone, E.; Brosset, C.; Pegourie, B.; Gauthier, E.; Bouvet, J.; Bucalossi, J.; Carpentier, S.; Corre, Y.; Delchambre, E.; Dittmar, T.; Douai, D.; Ekedahl, A.; Ghendrih, Ph.; Grisolia, C.; Grosman, A.; Gunn, J.; Hong, S.H.; Desgranges, L.; Escarguel, A.; Jacob, W.

    2009-01-01

    Fuel retention, a crucial issue for next step devices, is assessed in present-day tokamaks using two methods: particle balance performed during shots and post-mortem analysis carried out during shutdowns between experimental campaigns. Post-mortem analysis generally gives lower estimates of fuel retention than integrated particle balance. In order to understand the discrepancy between these two methods, a dedicated experimental campaign has been performed in Tore Supra to load the vessel walls with deuterium (D) and monitor the trapped D inventory through particle balance. The campaign was followed by an extensive post-mortem analysis phase of the Tore Supra limiter. This paper presents the status of the analysis phase, including the assessment of the D content in the castellated tile structure of the limiter. Indeed, using combined surface analysis techniques, it was possible to derive the relative contributions of different zones of interest on the limiter (erosion, thick deposits, thin deposits), showing that the post-mortem inventory is mainly due to codeposition (90% of the total), in particular due to gap deposits. However, deuterium was also evidenced deep in the material in erosion zones (10% of the total). At the present stage of the analysis, 50% of the inventory deduced from particle balance has been found through post-mortem analysis, significant progress with respect to previous studies (a factor of 8-10 discrepancy). This shows that post-mortem analysis can be consistent with particle balance provided specific procedures are implemented (dedicated campaign followed by extensive post-mortem analysis). Both techniques are needed for a reliable assessment of fuel retention in tokamaks, giving complementary information on how much and where fuel is retained in the vessel walls.

  6. Histopathological features of post-mortem pituitaries: A retrospective analysis

    Directory of Open Access Journals (Sweden)

    Francisco José Tortosa Vallecillos

    Objective: As a result of the use of neuroimaging techniques, silent pituitary lesions are diagnosed more and more frequently; however, there are few published post-mortem studies on this gland. Incidence data on pituitary lesions are rare and in Portugal they are outdated or even non-existent. The aim of this study is to determine the prevalence of normal patterns and incidental post-mortem pituitary pathology at Centro Hospitalar Lisboa Norte, analyzing the associations with clinical data and assessing the clinical relevance of the findings. Method: We retrospectively and histologically reviewed 167 pituitaries from a consecutive series of autopsies performed at the Department of Pathology of this centre between 2012 and 2014; in all cases medical records were reviewed. The morphological patterns observed were classified into three major groups: (1) normal histological patterns and variants; (2) infectious-inflammatory pathology, metabolic and vascular disorders; (3) incidental primary proliferations and proliferations secondary to systemic diseases. Results: The subjects included in this study were of all age groups (from 1 day to 91 years old); 71 were female and 96 male. Fifty-seven of these glands did not show any alteration; 51 showed colloid cysts arising from the Rathke cleft; 44 presented hyperplasia of the adenohypophysis; and 20 adenomas were identified in 19 glands (immunohistochemically, eight PRL-producing and five ACTH-producing tumors), ten of which were associated with obesity, 11 with hypertension and six with diabetes mellitus. There were two cases with metastasis. Conclusion: Subclinical pathology in our country is similar to that seen in other parts of the world, but at older ages.

  7. FEBEX II Project Post-mortem analysis EDZ assessment

    International Nuclear Information System (INIS)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-01-01

    Within the framework of the FEBEX II project, a multidisciplinary team studied the mechanisms of creation of the potential damaged zone around the test drift. The research program includes laboratory and in situ investigations as well as numerical modelling of the observed phenomena. Where laboratory investigations are concerned, the 14C-PMMA technique was applied to study the spatial distribution of porosity in samples taken from the test drift wall. In addition, complementary microscopy and scanning electron microscopy (SEM) studies were performed to investigate qualitatively the pore apertures and the minerals in porous regions. The results obtained with the PMMA method did not show any clear zone of increased porosity adjacent to the tunnel wall. The total porosity of the samples varied between 0.6 and 1.2%. Samples from the unplugged region did not differ from samples from the plugged region. A clear increase in porosity to depths of 10-15 mm from the tunnel wall was detected in lamprophyre samples. According to the SEM/EDX analyses, the excavation-disturbed zone in the granite matrix extended to depths of 1-3 mm from the wall surface. A few quartz grains were crushed and some micro-fractures were found. Gas permeability tests were carried out on two hollow cylinder samples, each about 1 m long, cored from the granite wall perpendicular to the drift axis. The first sample was cored in the service area far from the heated zone and the second at the level of the heater. The tests were performed at constant gas pressure by setting up a steady-state radial flow through a 1 cm wide section isolated by means of four mini-packers. The profile of gas permeability along the core length was established. The results obtained for both samples showed permeabilities ranging between 3.5×10⁻¹⁸ and 8.4×10⁻¹⁹ m², indicating the absence of marked damage. Acoustic investigations have been carried out with the objective of quantifying the ...

  8. FEBEX II Project Post-mortem analysis EDZ assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-07-01

    Within the framework of the FEBEX II project, a multidisciplinary team studied the mechanisms of creation of the potential damaged zone around the test drift. The research program includes laboratory and in situ investigations as well as numerical modelling of the observed phenomena. Where laboratory investigations are concerned, the 14C-PMMA technique was applied to study the spatial distribution of porosity in samples taken from the test drift wall. In addition, complementary microscopy and scanning electron microscopy (SEM) studies were performed to investigate qualitatively the pore apertures and the minerals in porous regions. The results obtained with the PMMA method did not show any clear zone of increased porosity adjacent to the tunnel wall. The total porosity of the samples varied between 0.6 and 1.2%. Samples from the unplugged region did not differ from samples from the plugged region. A clear increase in porosity to depths of 10-15 mm from the tunnel wall was detected in lamprophyre samples. According to the SEM/EDX analyses, the excavation-disturbed zone in the granite matrix extended to depths of 1-3 mm from the wall surface. A few quartz grains were crushed and some micro-fractures were found. Gas permeability tests were carried out on two hollow cylinder samples, each about 1 m long, cored from the granite wall perpendicular to the drift axis. The first sample was cored in the service area far from the heated zone and the second at the level of the heater. The tests were performed at constant gas pressure by setting up a steady-state radial flow through a 1 cm wide section isolated by means of four mini-packers. The profile of gas permeability along the core length was established. The results obtained for both samples showed permeabilities ranging between 3.5×10⁻¹⁸ and 8.4×10⁻¹⁹ m², indicating the absence of marked damage. Acoustic investigations have been carried out with the objective of quantifying the ...

  9. FEBEX Project Post-mortem Analysis: Corrosion Study

    International Nuclear Information System (INIS)

    Madina, V.; Azkarate, I.

    2004-01-01

    The partial dismantling of the FEBEX in situ test was carried out during the summer of 2002, following 5 years of continuous heating. The operation included the demolition of the concrete plug and the removal of the section of the test corresponding to the first heater. A large number of samples of all types of materials were taken during the dismantling for subsequent analysis. Part of the samples collected were devoted to the analysis of the corrosion processes that occurred during the first operational phase of the test. These samples comprised corrosion coupons of different metals installed for that purpose, sensors retrieved during the dismantling that were found severely corroded, and bentonite in contact with those sensors. In addition, a corrosion study was performed on the extracted heater and on one section of the liner surrounding it. All the analyses were carried out by Fundacion INASMET (Spain). This report describes in detail the studies carried out on the different materials, the results obtained, and the conclusions drawn. (Author)

  10. FEBEX Project Post-mortem Analysis: Corrosion Study

    Energy Technology Data Exchange (ETDEWEB)

    Madina, V.; Azkarate, I.

    2004-07-01

    The partial dismantling of the FEBEX in situ test was carried out during the summer of 2002, following 5 years of continuous heating. The operation included the demolition of the concrete plug and the removal of the section of the test corresponding to the first heater. A large number of samples of all types of materials were taken during the dismantling for subsequent analysis. Part of the samples collected were devoted to the analysis of the corrosion processes that occurred during the first operational phase of the test. These samples comprised corrosion coupons of different metals installed for that purpose, sensors retrieved during the dismantling that were found severely corroded, and bentonite in contact with those sensors. In addition, a corrosion study was performed on the extracted heater and on one section of the liner surrounding it. All the analyses were carried out by Fundacion INASMET (Spain). This report describes in detail the studies carried out on the different materials, the results obtained, and the conclusions drawn. (Author)

  11. Microstructural analysis of geopolymer developed from wood fly ash, post-mortem doloma refractory and metakaolin

    International Nuclear Information System (INIS)

    Moura, Jailes de Santana; Mafra, Marcio Paulo de Araujo; Rabelo, Adriano Alves; Fagury, Renata Lilian Ribeiro Portugal; Fagury Neto, Elias

    2016-01-01

    Geopolymers are one of the most widely discussed topics in materials science in recent times due to their vast potential as an alternative binder material to cement. This work aimed to evaluate the microstructure of geopolymers developed from wood fly ash, post-mortem doloma refractory and metakaolin. A preliminary study has been completed and achieved significant compressive strength results: the best geopolymer paste formulation reached approximately 25 MPa. Microstructural analysis of the geopolymer paste by scanning electron microscopy allowed the homogeneity and the distribution of the components to be verified, and provided evidence of unreacted raw materials, of whether crystalline phases were present, and of the porosity and density of the structure. (author)

  12. Fuel retention in JET ITER-Like Wall from post-mortem analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heinola, K., E-mail: kalle.heinola@ccfe.ac.uk [Association EURATOM-TEKES, University of Helsinki, PO Box 64, 00560 Helsinki (Finland); EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Widdowson, A. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Likonen, J. [Association EURATOM-TEKES, VTT, PO Box 1000, 02044 VTT, Espoo (Finland); Alves, E. [Instituto Superior Tecnico, Instituto de Plasmas e Fusao Nuclear, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Baron-Wiechec, A. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Barradas, N. [Instituto Superior Tecnico, Instituto de Plasmas e Fusao Nuclear, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Brezinsek, S. [Forschungszentrum Julich GmbH, EURATOM Association, D-52425 Julich (Germany); Catarino, N. [Instituto Superior Tecnico, Instituto de Plasmas e Fusao Nuclear, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Coad, P. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Koivuranta, S. [Association EURATOM-TEKES, VTT, PO Box 1000, 02044 VTT, Espoo (Finland); Matthews, G.F. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Mayer, M. [Max-Planck Institut fur Plasmaphysik, EURATOM Association, D-85748 Garching (Germany); Petersson, P. [Royal Institute of Technology, Association EURATOM-VR, SE-10044 Stockholm (Sweden)

    2015-08-15

    Selected Ion Beam Analysis techniques applicable for detecting deuterium and heavier impurities have been used in the post-mortem analyses of tiles removed after the first JET ITER-Like Wall (JET-ILW) campaign. Over half of the retained fuel was measured in the divertor region. The highest figures for fuel retention were obtained from regions with the thickest deposited layers, i.e. in the inner divertor on top of tile 1 and on the High Field Gap Closure tile, which resides deep in the plasma scrape-off layer. The least retention was found in the high-erosion regions of the main chamber, i.e. in the mid-plane of the Inner Wall Guard Limiter. The fuel retention values found typically varied with deposited layer thickness. The reported retention values support the decrease in fuel retention observed in the gas balance experiments of JET-ILW.

  13. Cochlear neuropathy in human presbycusis: Confocal analysis of hidden hearing loss in post-mortem tissue.

    Science.gov (United States)

    Viana, Lucas M; O'Malley, Jennifer T; Burgess, Barbara J; Jones, Dianne D; Oliveira, Carlos A C P; Santos, Felipe; Merchant, Saumil N; Liberman, Leslie D; Liberman, M Charles

    2015-09-01

    Recent animal work has suggested that cochlear synapses are more vulnerable than hair cells in both noise-induced and age-related hearing loss. This synaptopathy is invisible in conventional histopathological analysis, because cochlear nerve cell bodies in the spiral ganglion survive for years, and synaptic analysis requires special immunostaining or serial-section electron microscopy. Here, we show that the same quadruple-immunostaining protocols that allow synaptic counts, hair cell counts, neuronal counts and differentiation of afferent and efferent fibers in mouse can be applied to human temporal bones, when harvested within 9 h post-mortem and prepared as dissected whole mounts of the sensory epithelium and osseous spiral lamina. Quantitative analysis of five "normal" ears, aged 54-89 yrs, without any history of otologic disease, suggests that cochlear synaptopathy and the degeneration of cochlear nerve peripheral axons, despite a near-normal hair cell population, may be an important component of human presbycusis. Although primary cochlear nerve degeneration is not expected to affect audiometric thresholds, it may be key to problems with hearing in noise that are characteristic of declining hearing abilities in the aging ear. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Joint analysis of three-dimensional anatomical and functional data considering the cerebral post mortem imaging in rodents

    International Nuclear Information System (INIS)

    Dubois, Albertine

    2008-01-01

    The recent development of dedicated small-animal anatomical (MRI) and functional (micro-PET) scanners has opened up the possibility of performing repeated functional in vivo studies in the same animal, such as the longitudinal follow-up of cerebral glucose metabolism. However, these systems still suffer from technical limitations, including limited sensitivity and reduced spatial resolution. Hence, autoradiography and histological studies remain the reference and widely used techniques for biological studies in small animals. The major disadvantage of these post-mortem imaging techniques is that they require brain tissue sectioning, entailing the production of large numbers (up to several hundred) of serial sections and the inherent loss of three-dimensional (3D) spatial consistency. The first step towards improving the analysis of this post-mortem information was the development of reliable, automated procedures for the 3D reconstruction of whole brains from their sections. We first developed an optimized acquisition procedure for large amounts of post-mortem data (2D sections and block-face photographs). We then proposed different strategies for the 3D reconstruction of the corresponding volumes. We also addressed the problem of co-registering histological sections with autoradiographic sections and with block-face photographs (the photographic volume being intrinsically spatially consistent). These developments were essential for the 3D reconstruction, but also enabled the evaluation of different methods of functional data analysis, from the most straightforward (manual delineation of regions of interest) to the most automated (Statistical Parametric Mapping-like approaches for group analysis). Two biological applications were carried out: visual stimulation in rats and cerebral metabolism in a transgenic mouse model of Alzheimer's disease. One perspective of this work is to match reconstructed post-mortem data with in vivo images of the same animal. (author)

  15. Microstructural analysis of geopolymer developed from wood fly ash, post-mortem doloma refractory and metakaolin; Analise microestrutural de geopolimero desenvolvido a partir de cinza de olaria, tijolo refratario dolomitico post-mortem e metacaulim

    Energy Technology Data Exchange (ETDEWEB)

    Moura, Jailes de Santana; Mafra, Marcio Paulo de Araujo; Rabelo, Adriano Alves; Fagury, Renata Lilian Ribeiro Portugal; Fagury Neto, Elias, E-mail: jailesmoura@hotmail.com, E-mail: fagury@unifesspa.edu.br [Universidade Federal do Sul e Sudeste do Para (UNIFESSPA), PA (Brazil). Faculdade de Engenharia de Materiais

    2016-07-01

    Geopolymers are one of the most widely discussed topics in materials science in recent times due to their vast potential as an alternative binder material to cement. This work aimed to evaluate the microstructure of geopolymers developed from wood fly ash, post-mortem doloma refractory and metakaolin. A preliminary study has been completed and achieved significant compressive strength results: the best geopolymer paste formulation reached approximately 25 MPa. Microstructural analysis of the geopolymer paste by scanning electron microscopy allowed the homogeneity and the distribution of the components to be verified, and provided evidence of unreacted raw materials, of whether crystalline phases were present, and of the porosity and density of the structure. (author)

  16. Partitioning the proteome: phase separation for targeted analysis of membrane proteins in human post-mortem brain.

    Directory of Open Access Journals (Sweden)

    Jane A English

    Neuroproteomics is a powerful platform for targeted and hypothesis-driven research, providing comprehensive insights into cellular and sub-cellular disease states, gene × environment effects, and cellular responses to medication in human, animal, and cell culture models. Analysis of sub-proteomes is becoming increasingly important in clinical proteomics, enriching for otherwise undetectable proteins that are possible markers for disease. Membrane proteins are one such sub-proteome class that merits in-depth targeted analysis, particularly in psychiatric disorders. As membrane proteins are notoriously difficult to analyse using traditional proteomics methods, we evaluate a paradigm to enrich for and study membrane proteins from human post-mortem brain tissue. This is the first study to extensively characterise the integral trans-membrane spanning proteins present in human brain. Using Triton X-114 phase separation and LC-MS/MS analysis, we enriched for and identified 494 membrane proteins, 194 of which contained trans-membrane helices, ranging from 1 to 21 helices per protein. Isolated proteins included glutamate receptors, G proteins, voltage-gated and calcium channels, synaptic proteins, and myelin proteins, all of which warrant quantitative proteomic investigation in psychiatric and neurological disorders. Overall, our sub-proteome analysis reduced sample complexity and enriched for integral membrane proteins by 2.3 fold, thus allowing for more manageable, reproducible, and targeted proteomics in case vs. control biomarker studies. This study provides a valuable reference for future neuroproteomic investigations of membrane proteins, and validates the use of Triton X-114 detergent phase extraction on human post-mortem brain.

  17. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may deter new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
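
    For reference only, the sketch below shows the kind of bare PROOF-Lite boilerplate that a framework like PAF is described as hiding from the analyst. It assumes a ROOT installation with PyROOT and PROOF support; the tree name ("Events"), input file ("events.root") and selector ("MySelector.C+") are placeholders, not part of PAF.

```python
# Hypothetical sketch of plain PROOF-Lite usage from PyROOT; a framework
# such as PAF is described as wrapping this kind of boilerplate.
import ROOT

# Start a PROOF-Lite session using the cores of the local machine.
proof = ROOT.TProof.Open("lite://")

# Build the chain of slimmed/skimmed ROOT files to analyse.
chain = ROOT.TChain("Events")      # placeholder tree name
chain.Add("events.root")           # placeholder input file

# Attach the chain to the PROOF session and run a TSelector over it;
# "MySelector.C+" is a placeholder selector compiled on the fly.
chain.SetProof()
chain.Process("MySelector.C+")
```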

  18. Analysis framework for GLORIA

    Science.gov (United States)

    Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian

    2012-05-01

    GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes, and/or by analyzing data that other users have acquired with GLORIA or taken from other free-access databases, such as the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for data analysis in the International Linear Collider (ILC) project. HEP experiments have to deal with enormous amounts of data and distributed data analysis is a must, so the Marlin framework concept seemed to be well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and any additional output created is also added to that collection. The advantage of such a modular approach is that it keeps things as simple as possible. Every single step of the full analysis chain, going e.g. from raw images to light curves, can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
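
    A minimal Python sketch of the processor/collection idea described above: each processor reads the shared data collection and appends its own output, so a chain going from raw images to light curves can be run step by step. The class names, the pedestal constant and the tiny synthetic image are invented for illustration and are not taken from Luiza or Marlin.

```python
# Minimal sketch of the Marlin/Luiza processing idea: each processor reads
# the shared collection and adds its own output to it.
class Processor:
    def process(self, data: dict) -> None:
        raise NotImplementedError


class PedestalSubtraction(Processor):
    """Subtract a (here fixed, illustrative) pedestal from the raw image."""
    def process(self, data):
        pedestal = 100.0
        data["calibrated"] = [[px - pedestal for px in row]
                              for row in data["raw_image"]]


class BrightnessExtraction(Processor):
    """Append the mean calibrated brightness to a light-curve collection."""
    def process(self, data):
        pixels = [px for row in data["calibrated"] for px in row]
        data.setdefault("light_curve", []).append(sum(pixels) / len(pixels))


def run_chain(processors, data):
    """Run the processors in order on the shared data collection."""
    for processor in processors:
        processor.process(data)
    return data


if __name__ == "__main__":
    event = {"raw_image": [[102.0, 104.0], [101.0, 103.0]]}
    run_chain([PedestalSubtraction(), BrightnessExtraction()], event)
    print(event["light_curve"])   # -> [2.5]
```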

  19. Deuterium Inventory in Tore Supra (DITS): 2nd post-mortem analysis campaign and fuel retention in the gaps

    International Nuclear Information System (INIS)

    Dittmar, T.; Tsitrone, E.; Pegourie, B.; Cadez, I.; Pelicon, P.; Gauthier, E.; Languille, P.; Likonen, J.; Litnovsky, A.; Markelj, S.; Martin, C.; Mayer, M.; Pascal, J.-Y.; Pardanaud, C.; Philipps, V.; Roth, J.; Roubin, P.; Vavpetic, P.

    2011-01-01

    A dedicated study on fuel retention has been launched in Tore Supra, which includes a D wall-loading campaign and the dismantling of the main limiter (Deuterium Inventory in Tore Supra, DITS project). This paper presents new results from a second post-mortem analysis campaign on 40 tiles, with special emphasis on the D retention in the gaps. SIMS analysis reveals that only 1/3 of the thickness of the deposits in the plasma-shadowed zones is due to the DITS wall-loading campaign. As pre-DITS deposits contain less D than DITS deposits, the contribution of DITS to the D inventory is about 30-50%. The new estimate for the total amount of D retained in the Tore Supra limiter is 1.7×10²⁴ atoms, close to the previous estimate, with the gap surfaces contributing about 33%. NRA measurements show a stepped decrease of D along the gap, with strong asymmetries between different gap orientations.

  20. Video tracking and post-mortem analysis of dust particles from all tungsten ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Endstrasser, N., E-mail: Nikolaus.Endstrasser@ipp.mpg.de [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Brochard, F. [Institut Jean Lamour, Nancy-Universite, Bvd. des Aiguillettes, F-54506 Vandoeuvre (France); Rohde, V., E-mail: Volker.Rohde@ipp.mpg.de [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Balden, M. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Lunt, T.; Bardin, S.; Briancon, J.-L. [Institut Jean Lamour, Nancy-Universite, Bvd. des Aiguillettes, F-54506 Vandoeuvre (France); Neu, R. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2011-08-01

    2D dust particle trajectories are extracted from fast framing camera videos of ASDEX Upgrade (AUG) by a new time- and resource-efficient code and classified into stationary hot spots, single-frame events and real dust particle fly-bys. Using hybrid global and local intensity thresholding and linear trajectory extrapolation, individual particles could be tracked for up to 80 ms. Even under challenging conditions, such as high particle density and strong vacuum vessel illumination, all particles detected for more than 50 frames were tracked correctly. During the 2009 campaign, dust was trapped on five silicon wafer dust collectors strategically positioned within the vacuum vessel of the full-tungsten AUG. Characterisation of the outer morphology and determination of the elemental composition of 5×10⁴ particles were performed via automated SEM-EDX analysis. A dust classification scheme based on these parameters was defined with the goal of linking the particles to their most probable production sites.
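
    A schematic Python/NumPy sketch of the two ingredients named above: intensity thresholding to detect bright blobs frame by frame, and linear extrapolation of each trajectory to decide which detection in the next frame belongs to it. The threshold, the matching distance and the synthetic frames are illustrative assumptions, not parameters of the actual AUG tracking code.

```python
# Schematic sketch (not the AUG code): detect bright pixels by global
# thresholding, then link detections across frames by predicting each
# track's next position with linear extrapolation.
import numpy as np


def detect(frame, threshold=0.5):
    """Return the centroid of pixels above a global intensity threshold
    (one blob per frame here, for brevity)."""
    ys, xs = np.nonzero(frame > threshold)
    return [] if len(xs) == 0 else [np.array([xs.mean(), ys.mean()])]


def link(tracks, detections, max_dist=3.0):
    """Assign each detection to the track whose extrapolated position is closest."""
    for det in detections:
        best, best_d = None, max_dist
        for track in tracks:
            # Linear extrapolation from the last two points of the track.
            pred = 2 * track[-1] - track[-2] if len(track) > 1 else track[-1]
            d = np.linalg.norm(det - pred)
            if d < best_d:
                best, best_d = track, d
        if best is not None:
            best.append(det)        # continue an existing trajectory
        else:
            tracks.append([det])    # start a new trajectory
    return tracks


if __name__ == "__main__":
    # Synthetic particle moving one pixel per frame along x.
    tracks = []
    for t in range(5):
        frame = np.zeros((16, 16))
        frame[8, 3 + t] = 1.0
        link(tracks, detect(frame))
    print([len(tr) for tr in tracks])   # -> [5]
```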

  1. Potentially Treatable Disorder Diagnosed Post Mortem by Exome Analysis in a Boy with Respiratory Distress

    Directory of Open Access Journals (Sweden)

    Valentina Imperatore

    2016-02-01

    We highlight the importance of exome sequencing in solving a clinical case of a child who died at 14 months after a series of respiratory crises. He was the half-brother of a girl diagnosed at 7 years with the early-onset seizure variant of Rett syndrome due to a CDKL5 mutation. We performed a test for CDKL5 in the boy, which came back negative. Driven by the mother’s compelling need for a diagnosis, we moved forward with whole exome sequencing analysis. Surprisingly, two missense mutations in compound heterozygosity were identified in the RAPSN gene, which encodes a receptor-associated protein with a key role in clustering and anchoring nicotinic acetylcholine receptors at synaptic sites. This gene is responsible for a congenital form of myasthenic syndrome, a disease potentially treatable with cholinesterase inhibitors. Therefore, an earlier diagnosis in this boy would have led to better clinical management and prognosis. Our study supports the key role of exome sequencing in achieving a definite diagnosis in severe perinatal diseases, an essential step especially when a specific therapy is available.

  2. An audit of the contribution to post-mortem examination diagnosis of individual analyte results obtained from biochemical analysis of the vitreous.

    Science.gov (United States)

    Mitchell, Rebecca; Charlwood, Cheryl; Thomas, Sunethra Devika; Bellis, Maria; Langlois, Neil E I

    2013-12-01

    Biochemical analysis of the vitreous humor from the eye is an accepted accessory test for the post-mortem investigation of cause of death. Modern biochemical analyzers allow testing of a range of analytes from a sample. However, it is not clear which analytes should be requested in order to prevent unnecessary testing (and expense). The mean and standard deviation of the values obtained from analysis of the vitreous humor for sodium, potassium, chloride, osmolality, glucose, ketones (β-hydroxybutyrate), creatinine, urea, calcium, lactate, and ammonia were calculated, from which the contribution of each analyte was reviewed in the context of post-mortem findings and final cause of death. For sodium, 32 cases were regarded as high (more than one standard deviation above the mean), of which 9 contributed to the post-mortem diagnosis [drowning (4), heat-related death (2), diabetic hyperglycemia (2), and dehydration (1)], but 25 low values (more than one standard deviation below the mean) made no contribution. For chloride, 29 high values contributed to 4 cases (3 drowning and 1 heat-related), but these were all previously identified by a high sodium level. There were 29 high and 35 low potassium values, none of which contributed to determining the final cause of death. Of 22 high values of creatinine, 12 contributed to a diagnosis of renal failure. From 32 high values of urea, 18 contributed to 16 cases of renal failure (2 associated with diabetic hyperglycemia), 1 heat-related death, and one case with dehydration. Osmolality contributed to 12 cases (5 heat-related, 4 diabetes, 2 renal failure, and 1 dehydration) from 36 high values. There was no contribution from 32 high values and 19 low values of calcium, and there was no contribution from 4 high and 2 low values of ammonia. There were 11 high values of glucose, which contributed to the diagnosis of 6 cases of diabetic hyperglycemia, and 21 high ketone levels contributed to 8 cases: 4 diabetic ketosis, 3 hypothermia, 3 ...

  3. Reflexiones Acerca del Papel de la Mujer en la Reproducción Artificial Post Mortem (Analysis of the Role of Women in Posthumous Reproduction)

    Directory of Open Access Journals (Sweden)

    Alma María Rodríguez Guitián

    2017-03-01

    This contribution analyses the role of women in posthumous (post-mortem) assisted reproduction. First, it studies whether a woman's right to have children should be subject to limits and, if so, to which ones. Second, it explores whether posthumous reproduction extends to couples of women, married or not. Finally, it focuses on the relevance of the gestational mother's will in deciding whether the deceased person is to be determined as the father or mother of the child. DOWNLOAD THIS PAPER FROM SSRN: https://ssrn.com/abstract=2921870

  4. Post-Mortem Analysis after High-Power Operation of the TD24_R05 Tested in Xbox_1

    CERN Document Server

    Degiovanni, Alberto; Mouriz Irazabal, Nerea; Aicheler, Markus

    2016-01-01

    The CLIC prototype structure TD24_R05 was high-power tested in Xbox_1 in 2013. This report summarizes all examinations conducted after the high-power test, including bead-pull measurements, structure cutting, metrology and SEM observations, and then gives a synthesis of the various results. The structure progressively developed a hot cell during operation, and detuning was observed after the test was complete. The post-mortem examination clearly showed a developed standing-wave pattern, which was explained by the physical deformation of one of the coupler irises. However, an elevated breakdown count in the suspected hot cell could not be confirmed through SEM imaging, nor could any particular feature be detected that would explain the observed longitudinal breakdown distribution.

  5. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solution ...

  6. Luiza: Analysis Framework for GLORIA

    Directory of Open Access Journals (Sweden)

    Aleksander Filip Żarnecki

    2013-01-01

    The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in the new High Energy Physics (HEP) project, the International Linear Collider (ILC). The HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited for the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and the additional output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step by step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.

  7. Clinical and post mortem analysis of combat neck injury used to inform a novel coverage of armour tool.

    Science.gov (United States)

    Breeze, J; Fryer, R; Hare, J; Delaney, R; Hunt, N C; Lewis, E A; Clasper, J C

    2015-04-01

    There is a requirement in the Ministry of Defence for an objective method of comparing the area of coverage of different body armour designs for future applications. Existing comparisons derived from surface wound mapping are limited in that they can only demonstrate the skin entry wound location. The Coverage of Armour Tool (COAT) is a novel three-dimensional model capable of comparing the coverage provided by body armour designs, but limited information exists as to which anatomical structures require inclusion. The aim of this study was to assess the utility of COAT in the assessment of neck protection, using clinically relevant injury data. Hospital notes and post-mortem records of all UK soldiers injured by an explosive fragment to the neck in Iraq and Afghanistan between 1 January 2006 and 31 December 2012 were analysed to determine which anatomical structures were responsible for death or functional disability at one year post injury. Using COAT, a comparison of three ballistic neck collar designs was undertaken with reference to the percentage of these anatomical structures left exposed. 13/81 (16%) survivors demonstrated complications at one year, most commonly upper limb weakness from brachial plexus injury or a weak voice from laryngeal trauma. In 14/94 (15%) soldiers the neck wound was believed to have been the sole cause of death, primarily from carotid artery damage, spinal cord transection or rupture of the larynx. COAT objectively demonstrated that, despite the larger OSPREY collar having almost double the surface area of the two-piece prototype collar, the percentage area of vulnerable cervical structures left exposed was only reduced from 16.3% to 14.4%. COAT demonstrated its ability to objectively quantify the potential effectiveness of different body armour designs in providing coverage of vulnerable anatomical structures from different shot-line orientations. To improve its utility, it is recommended that COAT be further developed to enable weapon ...

  8. Influence of operational condition on lithium plating for commercial lithium-ion batteries – Electrochemical experiments and post-mortem-analysis

    International Nuclear Information System (INIS)

    Ecker, Madeleine; Shafiei Sabet, Pouyan; Sauer, Dirk Uwe

    2017-01-01

    Highlights: •Investigation of lithium plating to support reliable system integration. •Influence of operational conditions at low temperature on lithium plating. •Comparison of different lithium-ion battery technologies. •Large differences in low-temperature behaviour between technologies. •Post-mortem analysis reveals inhomogeneous deposition of metallic lithium. -- Abstract: The lifetime and safety of lithium-ion batteries are key requirements for the successful market introduction of electro-mobility. Charging at low temperature and fast charging in particular, both known to provoke lithium plating, are important issues for automotive engineers. Lithium plating, which leads both to ageing and to safety risks, is known to play a crucial role in the system design of the application. To gain knowledge of the different factors influencing lithium plating, low-temperature ageing tests are performed in this work. Commercial lithium-ion batteries of various types are tested under various operational conditions such as temperature, current, state of charge, charging strategy and state of health. To analyse the ageing behaviour, capacity fade and resistance increase are tracked over lifetime. The results of this large experimental survey on lithium plating provide support for the design of operating strategies for implementation in battery management systems. To further investigate the underlying degradation mechanisms, differential voltage curves and impedance spectra are analysed, and a post-mortem analysis of anode degradation is performed for a selected technology. The results confirm the deposition of metallic lithium or lithium compounds in the porous structure and suggest a strongly inhomogeneous deposition over the electrode thickness, with a dense deposition layer close to the separator for the considered cell. It is shown that this inhomogeneous deposition can even lead to loss of active material. The plurality of the investigated technologies ...

  9. Comparative analysis of bones, mites, soil chemistry, nematodes and soil micro-eukaryotes from a suspected homicide to estimate the post-mortem interval.

    Science.gov (United States)

    Szelecz, Ildikó; Lösch, Sandra; Seppey, Christophe V W; Lara, Enrique; Singer, David; Sorge, Franziska; Tschui, Joelle; Perotti, M Alejandra; Mitchell, Edward A D

    2018-01-08

    Criminal investigations of suspected murder cases require estimating the post-mortem interval (PMI, or time after death), which is challenging for long PMIs. Here we present the case of human remains found in a Swiss forest. We used a multidisciplinary approach involving the analysis of bones and of soil samples collected beneath the remains of the head, upper and lower body, and "control" samples taken a few meters away. We analysed soil chemical characteristics, mites and nematodes (by microscopy) and micro-eukaryotes (by Illumina high-throughput sequencing). The PMI estimate based on hair ¹⁴C data via bomb-peak radiocarbon dating gave a time range of 1 to 3 years before the discovery of the remains. Cluster analyses for soil chemical constituents, nematodes, mites and micro-eukaryotes revealed two clusters: (1) head and upper body and (2) lower body and controls. From the mite evidence, we conclude that the body was probably brought to the site after death. However, the chemical analyses, nematode community analyses and the analyses of micro-eukaryotes indicate that decomposition took place at least partly on site. This study illustrates the usefulness of combining several lines of evidence in the study of homicide cases to better calibrate PMI inference tools.

  10. Framework for SEM contour analysis

    Science.gov (United States)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as the Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and are presented in the following pages. Firstly, we present a new filter used to reduce noise in SEM images, followed by an efficient topography identifier, and finally we describe the use of a topological skeleton as a measurement tool that can extend CD measurements to all kinds of patterns.
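
    The sketch below is a generic Python illustration, using scikit-image, of the three processing steps named above (noise filtering, topography identification, and a topological skeleton from which a width-type CD can be read). The Gaussian/Otsu choices, the area-over-skeleton-length CD estimate and the synthetic test pattern are assumptions made for illustration, not the authors' algorithms.

```python
# Generic sketch of the steps named in the abstract: denoise the image,
# segment the pattern, skeletonize it, and derive a simple CD-like width
# estimate (pattern area divided by skeleton length).
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.morphology import skeletonize


def cd_from_image(image):
    smoothed = gaussian(image, sigma=1.0)          # noise filtering
    mask = smoothed > threshold_otsu(smoothed)     # topography identification
    skeleton = skeletonize(mask)                   # topological skeleton
    # Mean line width: pattern area divided by centre-line (skeleton) length.
    return mask.sum() / max(skeleton.sum(), 1)


if __name__ == "__main__":
    # Synthetic "line" pattern, 6 pixels wide, with additive noise.
    rng = np.random.default_rng(0)
    img = 0.1 * rng.standard_normal((64, 64))
    img[:, 29:35] += 1.0
    print(f"estimated CD ~ {cd_from_image(img):.1f} px")
```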

  11. Analysis of death in major trauma: value of prompt post mortem computed tomography (pmCT) in comparison to office hour autopsy.

    Science.gov (United States)

    Schmitt-Sody, Markus; Kurz, Stefanie; Reiser, Maximilian; Kanz, Karl Georg; Kirchhoff, Chlodwig; Peschel, Oliver; Kirchhoff, Sonja

    2016-03-29

    To analyze the diagnostic accuracy of prompt post-mortem computed tomography (pmCT) in determining causes of death in patients who died during trauma room management, and to compare the results to gold-standard autopsy performed during office hours. Multiply injured patients who died during trauma room care were enrolled. PmCT was performed immediately, followed by autopsy during office hours. PmCT and autopsy were analyzed primarily regarding the ability of pmCT to find causes of death and secondarily to define exact causes of death including accurate anatomic localizations. For the secondary analysis, the data were divided into group I, with equal results of pmCT and autopsy; group II, with autopsy providing superior results; and group III, with pmCT providing superior information contributing to, but not principally causing, death. Seventeen multiple trauma patients were enrolled. Since multiple trauma patients were enrolled, more injuries than patients are reported. Eight patients sustained deadly head injuries (47.1%), 11 chest injuries (64.7%), 4 injuries of the skeletal system (23.5%), and one patient drowned (5.8%). Primary analysis revealed that in 16/17 patients (94.1%) pmCT found causes of death in accordance with autopsy. Secondary analysis revealed good agreement of autopsy and pmCT in 9/17 cases (group I). In seven cases autopsy provided superior results (group II), whereas in 1 case pmCT found more information (group III). The presented work studied the diagnostic value of pmCT in defining causes of death in comparison to standard autopsy. Primary analysis revealed that in 94.1% of cases pmCT was able to define causes of death, even if only indirect signs were present. Secondary analysis showed that pmCT and autopsy gave equal results regarding causes of death in 52.9%. PmCT is useful in traumatic death, allowing for an immediate identification of causes of death and providing detailed information on bony lesions, brain injuries and gas formations. It is advisable to conduct pmCT especially in cases without consent to ...

  12. Design and Analysis of Web Application Frameworks

    DEFF Research Database (Denmark)

    Schwarz, Mathias Romme

    Numerous web application frameworks have been developed in recent years. These frameworks enable programmers to reuse common components and to avoid typical pitfalls in web application development. Although such frameworks help the programmer to avoid many common errors, we find that errors remain possible, such as state manipulation vulnerabilities. The hypothesis of this dissertation is that we can design frameworks and static analyses that aid the programmer to avoid such errors. First, we present the JWIG web application framework for writing secure and maintainable web applications. We discuss how this framework solves some of the common errors through an API that is designed to be safe by default. Second, we present a novel technique for checking HTML validity for output that is generated by web applications. Through string analysis, we approximate the output of web applications as context-free grammars. We model ...

  13. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

    The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques. A 6-month retrospective study compared femoral total post-mortem tryptase levels between (1) aspiration of the femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut-down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference in femoral total post-mortem tryptase levels (paired t-test) between the two sampling methods. The clinical significance of this finding, and what factors may contribute to it, are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and the method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  14. Post-mortem analysis of suicide victims shows ABCB1 haplotype 1236T-2677T-3435T as a candidate predisposing factor behind adverse drug reactions in females.

    Science.gov (United States)

    Rahikainen, Anna-Liina; Palo, Jukka U; Haukka, Jari; Sajantila, Antti

    2018-04-01

    Genetic variation in the efflux transporter P-glycoprotein (P-gp) has recently been associated with completed violent suicides and also with violent suicide attempts. As depression is known to be a risk factor for suicide and many antidepressants are P-gp substrates, it has been speculated that an inadequate antidepressant treatment response or adverse side effects could be involved. The aim of this study was to investigate whether there is an association between the P-gp-encoding ABCB1 gene and completed suicides in citalopram users. The effects of sex and of the suicide method used (violent vs. non-violent) were also evaluated. All cases included in the study population, 349 completed suicide victims and 284 controls, were shown to be positive for the antidepressant citalopram in a post-mortem toxicological drug screen. ABCB1 1236C>T, 2677G>T/A and 3435C>T polymorphisms were determined by TaqMan genotyping assays. Haplotypes were constructed from genotype data using the PHASE software. The association between the manner of death and the ABCB1 haplotype was tested with logistic regression analysis. No statistically significant differences were observed in the ABCB1 allele or genotype frequencies between the suicide and control groups. However, the ABCB1 1236T-2677T-3435T haplotype was associated with completed suicides of female citalopram users (odds ratio: 2.23; 95% confidence interval: 1.22-4.07; P=0.009). After stratification by the method used for suicide, the association emerged in fatal intoxications (odds ratio: 2.51; 95% confidence interval: 1.29-4.87; P=0.007). In the other groups, no statistically significant associations were observed. Our results suggest that female citalopram users with ABCB1 1236T-2677T-3435T are more vulnerable to adverse effects of the drug, as this haplotype was enriched in non-violent suicides of female citalopram users. Even though the biological mechanism behind this observation is unknown, the results provide another example of the importance ...

  15. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness, which are varied for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
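
    As a self-contained illustration of the response surface idea described above, the Python sketch below samples two input variables, evaluates a purely notional component stress model in place of an expensive FEA run, fits a quadratic response surface by least squares, and reuses the cheap surface for a Monte Carlo reliability estimate. The stress formula, the variable ranges and the 300 MPa limit are invented for the example and are not PRODAF or NESSUS values.

```python
# Illustration of the response-surface method: fit a cheap quadratic
# surrogate to a few "expensive" model runs, then reuse it for Monte Carlo.
import numpy as np


def component_stress(load, thickness):
    """Stand-in for an expensive FEA run (notional stress in MPa)."""
    return 0.28e-6 * load / thickness**2


rng = np.random.default_rng(1)

# 1) Sample the design variables and run the "expensive" model a few times.
load = rng.uniform(8.0e4, 1.2e5, 50)        # N, assumed range
thick = rng.uniform(8.0e-3, 12.0e-3, 50)    # m, assumed range
stress = component_stress(load, thick)

# 2) Fit a quadratic response surface by least squares (scaled inputs).
ls, ts = load / 1.0e5, thick / 1.0e-2
X = np.column_stack([np.ones_like(ls), ls, ts, ls**2, ts**2, ls * ts])
coeff, *_ = np.linalg.lstsq(X, stress, rcond=None)

# 3) Reuse the cheap surface for a large Monte Carlo reliability estimate.
n = 200_000
l = rng.normal(1.0e5, 5.0e3, n) / 1.0e5
t = rng.normal(1.0e-2, 5.0e-4, n) / 1.0e-2
Xmc = np.column_stack([np.ones(n), l, t, l**2, t**2, l * t])
stress_hat = Xmc @ coeff
print("P(stress > 300 MPa) ~", np.mean(stress_hat > 300.0))
```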

  16. X-framework: Space system failure analysis framework

    Science.gov (United States)

    Newman, John Steven

    Space program and space systems failures result in financial losses in the multi-hundred million dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned include the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and in some rare cases, reports that treat not only the what, but also the why. In addition there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event beginning with the proximate cause, extending to the directly related work or operational processes and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management. The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of

  17. Initial Multidisciplinary Design and Analysis Framework

    Science.gov (United States)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; hide

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  18. Multimodal imaging and in vivo/post mortem co-registration in rodents and non human primates

    International Nuclear Information System (INIS)

    Delzescaux, T.

    2006-01-01

    Within the framework of neuro-degenerative disease studies, animal models still remain essential for the improvement of our understanding of underlying pathological mechanisms and for the discovery and development of potential novel therapeutic approaches. The pre-clinical research especially requires the use of non-human primates models because of the similarities between their brain and Human's, whereas fundamental investigations in many areas of biology and medicine more widely involve the use of rodent models.The recent developments of in vivo imaging systems dedicated to small animals (μ-CT, μ-MRI and μ-PET) have made possible the study of brain anatomic alterations as well as the longitudinal follow-up of metabolism and neurotransmission impairments, which can be involved in neuro-degenerative diseases. In particular, μ-PET is becoming increasingly relevant to assess the efficiency of a potential candidate in the field of drug discovery and development and disease diagnosis. However, until today a few laboratories are equipped with them. Moreover, their limited spatial resolution and the lack of specific biological markers are still major limitations. As a consequence, the scientific community still needs comparative anatomical and/or functional analyses, in particular for studies concerning rodent brain. Hence, post mortem biological imaging remains the powerful, reference and predominantly technology used for small animal imaging and for the validation of in vivo imaging systems. Generally, anatomical and complementary functional information are, respectively, provided by histological staining and autoradiography of corresponding brain sections. The large variety of histological dyes (cresyl violet for Nissl bodies Congo red for amyloid plaques) and radioactive compounds ([ 14 C]Deoxyglucose for cerebral glucose metabolism, [ 14 C]leucine for cerebral protein synthesis [ 14 C]iodoantipyrine for cerebral blood flow), as well as the microscopic range of

  19. Multimodal imaging and in vivo/post mortem co-registration in rodents and non human primates

    Energy Technology Data Exchange (ETDEWEB)

    Delzescaux, T. [Service Hospitalier Frederic Joliot, Isotopic Imaging, 91 - Orsay (France)

    2006-07-01

    Within the framework of neuro-degenerative disease studies, animal models still remain essential for the improvement of our understanding of underlying pathological mechanisms and for the discovery and development of potential novel therapeutic approaches. The pre-clinical research especially requires the use of non-human primates models because of the similarities between their brain and Human's, whereas fundamental investigations in many areas of biology and medicine more widely involve the use of rodent models.The recent developments of in vivo imaging systems dedicated to small animals ({mu}-CT, {mu}-MRI and {mu}-PET) have made possible the study of brain anatomic alterations as well as the longitudinal follow-up of metabolism and neurotransmission impairments, which can be involved in neuro-degenerative diseases. In particular, {mu}-PET is becoming increasingly relevant to assess the efficiency of a potential candidate in the field of drug discovery and development and disease diagnosis. However, until today a few laboratories are equipped with them. Moreover, their limited spatial resolution and the lack of specific biological markers are still major limitations. As a consequence, the scientific community still needs comparative anatomical and/or functional analyses, in particular for studies concerning rodent brain. Hence, post mortem biological imaging remains the powerful, reference and predominantly technology used for small animal imaging and for the validation of in vivo imaging systems. Generally, anatomical and complementary functional information are, respectively, provided by histological staining and autoradiography of corresponding brain sections. The large variety of histological dyes (cresyl violet for Nissl bodies Congo red for amyloid plaques) and radioactive compounds ([{sup 14}C]Deoxyglucose for cerebral glucose metabolism, [{sup 14}C]leucine for cerebral protein synthesis [{sup 14}C]iodoantipyrine for cerebral blood flow), as well as

  20. Structural Analysis in a Conceptual Design Framework

    Science.gov (United States)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  1. Talking Cure Models: A Framework of Analysis

    Directory of Open Access Journals (Sweden)

    Christopher Marx

    2017-09-01

    Full Text Available Psychotherapy is commonly described as a “talking cure,” a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what “talking” means and what “cure” implies in the respective context. Accordingly, a clarification of the basic theoretical structure of “talking cure models,” i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of “talking cure models”: (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of “talking cure models.” To demonstrate the applicability and utility of the framework, five distinct “talking cure models” which spell out the details of curative “talking” processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more

  2. Post-mortem analysis on LiFePO4|Graphite cells describing the evolution & composition of covering layer on anode and their impact on cell performance

    Science.gov (United States)

    Lewerenz, Meinert; Warnecke, Alexander; Sauer, Dirk Uwe

    2017-11-01

    During cyclic aging of lithium-ion batteries, a μm-thick covering layer forms on top of the anode, facing the separator. In this work several post-mortem analyses of cyclically aged cylindrical LFP|Graphite cells are evaluated to give a detailed characterization of the covering layer and to identify possible causes for its evolution. Analyses of the layer by different methods show that it consists to a high percentage of plated active lithium, deposited Fe and products of the solid electrolyte interphase (SEI). The deposition is located mainly in the center of the cell, symmetrical to the coating direction. The origin of these depositions is assumed to lie in locally overcharged particles, Fe deposition or an inhomogeneous distribution of capacity density. As a secondary effect, the deposition on one side increases the thickness locally; thereafter a pressure-induced overcharging of the back side of the anode occurs due to charge agglomeration. Finally, a compact and dense covering layer in a late state of aging leads to deactivation of the covered parts of the anode and cathode due to suppressed lithium-ion conductivity. This leads to an increasing slope of capacity fade and an increase in internal resistance.

  3. The Measurand Framework: Scaling Exploratory Data Analysis

    Science.gov (United States)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's time varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed with a need to support legacy, in-house processing but with an eye towards big-data processing with Apache Spark and other modern big data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.
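
    The sketch below illustrates, with assumed names rather than QuakeFinder's actual API, two of the framework traits mentioned: modular data transformation stages that record simple provenance, and automatic re-computation of downstream data when an upstream stage is updated.

```python
# A minimal sketch (assumed names, not the Measurand framework itself) of
# modular pipeline stages with provenance and upstream-triggered recompute.
import numpy as np

class Stage:
    def __init__(self, name, func, upstream=None):
        self.name, self.func, self.upstream = name, func, upstream
        self.version_seen = None      # upstream version this output was built from
        self.version = 0              # our own version, bumped on recompute
        self.output = None
        self.provenance = {}

    def set_input(self, data):        # only used for source stages
        self._data, self.version = data, self.version + 1

    def compute(self):
        if self.upstream is None:
            self.output = self.func(self._data)
        else:
            up = self.upstream
            up.compute()
            if self.version_seen != up.version:   # upstream changed -> recompute
                self.output = self.func(up.output)
                self.version_seen = up.version
                self.version += 1
                self.provenance = {"from": up.name, "upstream_version": up.version}
        return self.output

raw = Stage("raw_magnetometer", lambda x: np.asarray(x, dtype=float))
detrended = Stage("detrend", lambda x: x - x.mean(), upstream=raw)

raw.set_input([1.0, 2.0, 4.0, 7.0])
print(detrended.compute(), detrended.provenance)
raw.set_input([2.0, 2.0, 2.0, 8.0])               # upstream update triggers recompute
print(detrended.compute(), detrended.provenance)
```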

  4. Digital Trade Infrastructures: A Framework for Analysis

    Directory of Open Access Journals (Sweden)

    Boriana Boriana

    2018-04-01

    Full Text Available In global supply chains, information about transactions resides in fragmented pockets within business and government systems. The lack of reliable, accurate and complete information makes it hard to detect risks (such as safety, security, compliance and commercial risks) and at the same time makes international trade inefficient. The introduction of digital infrastructures that transcend organizational and system domains is driven by the prospect of reducing the fragmentation of information, thereby enabling improved security and efficiency in the trading process. This article develops a digital trade infrastructure framework through an empirically grounded analysis of four digital infrastructures in the trade domain, using the conceptual lens of digital infrastructure.

  5. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among process, safety systems, and operator actions. As independent TH codes do not have models of operator actions and full safety systems, they cannot literally simulate the integrated and dynamic interactions of process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulation becomes a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with stochastic models in the dynamic event tree (DET) framework, which provides the flexibility to model the integrated response because all the accident elements are in the same model. The advantages of this framework also include: realistic modeling of dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis)
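
    A purely illustrative dynamic-event-tree toy, with an invented process model and made-up branch probabilities (not the authors' framework): a scalar "temperature" is advanced in time, the tree branches on stochastic success or failure of a cooling system at fixed demand points, and sequence probabilities are accumulated along each branch.

```python
# Toy dynamic event tree: deterministic process advance between branch points,
# stochastic branching at each demand, probability bookkeeping per sequence.

LIMIT = 120.0          # assumed damage limit for the toy temperature variable

def advance(temp, dt, cooling_on):
    """Toy process model: heats up unless cooling works."""
    rate = -5.0 if cooling_on else 15.0
    return temp + rate * dt

def expand(t, temp, prob, demands, results):
    """Recursively expand branches at each remaining demand time."""
    if not demands:
        results.append((prob, temp))
        return
    (t_demand, p_success), *rest = demands
    temp = advance(temp, t_demand - t, cooling_on=False)   # heat-up before demand
    # branch 1: cooling succeeds at the demand
    ok_temp = advance(temp, 1.0, cooling_on=True)
    expand(t_demand + 1.0, ok_temp, prob * p_success, rest, results)
    # branch 2: cooling fails at the demand
    bad_temp = advance(temp, 1.0, cooling_on=False)
    expand(t_demand + 1.0, bad_temp, prob * (1.0 - p_success), rest, results)

results = []
demands = [(2.0, 0.95), (5.0, 0.90)]       # (demand time, success probability)
expand(0.0, temp=60.0, prob=1.0, demands=demands, results=results)

p_damage = sum(p for p, temp in results if temp > LIMIT)
print("end states (probability, final temperature):", results)
print("P(temperature exceeds limit):", p_damage)
```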

  6. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018. Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, Keio University. Final report, covering 26 May 2015 to 25 Nov 2016. The work aims to analyze malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program

  7. Evaluation and Policy Analysis: A Communicative Framework

    Directory of Open Access Journals (Sweden)

    Cynthia Wallat

    1997-07-01

    Full Text Available A major challenge for the next generation of students of human development is to help shape the paradigms by which we analyze and evaluate public policies for children and families. Advocates of building research and policy connections point to health care and stress experiences across home, school, and community as critical policy issues that expand the scope of contexts and outcomes studied. At a minimum, development researchers and practitioners will need to be well versed in available methods of inquiry; they will need to be "methodologically multilingual" when conducting evaluation and policy analysis, producing reports, and reporting their interpretations to consumer and policy audiences. This article suggests how traditional approaches to policy inquiry can be reconsidered in light of these research inquiry and communicative skills needed by all policy researchers. A fifteen year review of both policy and discourse processes research is presented to suggest ways to conduct policy studies within a communicative framework.

  8. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Project ... overview of the process and the structure of the Logical Framework Matrix or Logframe, derivable from it, ..... System Approach to Managing The Project.

  9. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing a careful risk analysis has to be carried out. This is comprised of identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.
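
    As a small illustration of one of the data products mentioned (a slope-derived hazard map), the sketch below thresholds slopes computed from a synthetic digital terrain model; the grid spacing and the 15-degree slope limit are assumptions, and this is not LandSAfe code.

```python
# Slope-based hazard mask from a synthetic DTM: compute elevation gradients,
# convert to slope angle, and flag cells steeper than an assumed lander limit.
import numpy as np

cell_size = 10.0                                   # metres per DTM cell (assumed)
x, y = np.meshgrid(np.linspace(0, 500, 51), np.linspace(0, 500, 51))
dtm = 80.0 * np.exp(-((x - 250)**2 + (y - 250)**2) / 2e4)   # synthetic terrain bump

dz_dy, dz_dx = np.gradient(dtm, cell_size)         # elevation gradients per axis
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

hazard = slope_deg > 15.0                          # True where too steep to land
print("fraction of area flagged as hazardous:", hazard.mean())
```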

  10. Using Framework Analysis in nursing research: a worked example.

    Science.gov (United States)

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  11. Post-mortem CT-coronary angiography

    DEFF Research Database (Denmark)

    Pøhlsgaard, Camilla; Leth, Peter Mygind

    2007-01-01

    post-mortem coronary angiography and computerized tomography. We describe how to prepare and inject the contrast medium, and how to establish a CT protocol that optimizes spatial resolution, low-contrast resolution and noise level. Testing of the method on 6 hearts showed that the lumen...

  12. CLARA: CLAS12 Reconstruction and Analysis Framework

    Energy Technology Data Exchange (ETDEWEB)

    Gyurjyan, Vardan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Matta, Sebastian Mancilla [Santa Maria U., Valparaiso, Chile; Oyarzun, Ricardo [Santa Maria U., Valparaiso, Chile

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework offers solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
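
    A toy flow-based-programming sketch (not the CLARA API): independent "services" are chained on a data stream as Python generators, each consuming events from the previous stage, mirroring the data-centric stream-processing style the abstract describes.

```python
# Each "service" is a generator over an event stream; chaining generators gives
# a simple flow-based pipeline. All names and values here are illustrative.
def source(events):
    for e in events:
        yield e

def calibrate(stream):
    for e in stream:
        yield {**e, "energy": e["raw"] * 0.5}      # pretend calibration step

def select(stream, threshold):
    for e in stream:
        if e["energy"] > threshold:
            yield e

events = [{"id": i, "raw": r} for i, r in enumerate([1.0, 5.0, 9.0, 2.0])]
pipeline = select(calibrate(source(events)), threshold=2.0)
for e in pipeline:
    print(e)
```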

  13. Compressive rib fracture: peri-mortem and post-mortem trauma patterns in a pig model.

    Science.gov (United States)

    Kieser, Jules A; Weller, Sarah; Swain, Michael V; Neil Waddell, J; Das, Raj

    2013-07-01

    Despite numerous studies on high impact fractures of ribs, little is known about compressive rib injuries. We studied rib fractures from a biomechanical and morphological perspective using 15, 5th ribs of domestic pigs Sus scrofa, divided into two groups, desiccated (representing post-mortem trauma) and fresh ribs with intact periosteum (representing peri-mortem trauma). Ribs were axially compressed and subjected to four-point bending in an Instron 3339 fitted with custom jigs. Morphoscopic analysis of resultant fractures consisted of standard optical methods, micro-CT (μCT) and scanning electron microscopy (SEM). During axial compression, fresh ribs had slightly higher strength because of energy absorption capabilities of their soft and fluidic components. In flexure tests, dry ribs showed typical elastic-brittle behaviour with long linear load-extension curves, followed by relatively short non-linear elastic (hyperelastic) behaviour and brittle fracture. Fresh ribs showed initial linear-elastic behaviour, followed by strain softening, visco-plastic responses. During the course of loading, dry bone showed minimal observable damage prior to the onset of unstable fracture. In contrast, fresh bone showed buckling-like damage features on the compressive surface and cracking parallel to the axis of the bone. Morphologically, all dry ribs fractured precipitously, whereas all but one of the fresh ribs showed incomplete fracture. The mode of fracture, however, was remarkably similar for both groups, with butterfly fractures predominating (7/15, 46.6% dry and wet). Our study highlights the fact that under controlled loading, despite seemingly similar butterfly fracture morphology, fresh ribs (representing perimortem trauma) show a non-catastrophic response. While extensive strain softening observed for the fresh bone does show some additional micro-cracking damage, it appears that the periosteum may play a key role in imparting the observed pseudo-ductility to the ribs

  14. Reviviendo la consulta post-mortem.

    OpenAIRE

    Armando Cortés

    2009-01-01

    These days the “Post-mortem consultation centre of the Hospital Universitario del Valle” is being inaugurated, a more appropriate designation for the autopsy, “seeing for oneself”, or any of its synonyms: necropsy, post-mortem examination, necroscopy, or thanatopsy; all of them not accepted and conditioned by cultural, social or religious factors. These terms have acquired a clearly negative connotation in the medical environment and among the general public. Perhaps the best term is “post-mortem consultation”...

  15. Analysis of legal narratives: a conceptual framework

    NARCIS (Netherlands)

    Sileno, G.; Boer, A.; van Engers, T.; Schäfer, B.

    2012-01-01

    This article presents a conceptual framework intended to describe and to abstract cases or scenarios of compliance and non-compliance. These scenarios are collected in order to be animated in an agent-based platform for purposes of design and validation of both new regulations and new

  16. Linux Incident Response Volatile Data Analysis Framework

    Science.gov (United States)

    McFadden, Matthew

    2013-01-01

    Cyber incident response is an emphasized subject area in cybersecurity in information technology with increased need for the protection of data. Due to ongoing threats, cybersecurity imposes many challenges and requires new investigative response techniques. In this study a Linux Incident Response Framework is designed for collecting volatile data…

  17. Fatty kidney diagnosed by post-mortem computed tomography

    DEFF Research Database (Denmark)

    Leth, P. M.

    2016-01-01

    Subnuclear vacuolization of the renal tubular epithelium is indicative of diabetic and alcoholic ketoacidosis and has also been proposed as a postmortem marker for hypothermia. We present for the first time a fatal case of ketoacidosis in combination with exposure, where a suspicion of these diagnoses was raised by a marked radiolucency of the kidneys at post-mortem computed tomography (PMCT). © 2015 Elsevier Ltd.

  18. A multilevel evolutionary framework for sustainability analysis

    Directory of Open Access Journals (Sweden)

    Timothy M. Waring

    2015-06-01

    Full Text Available Sustainability theory can help achieve desirable social-ecological states by generalizing lessons across contexts and improving the design of sustainability interventions. To accomplish these goals, we argue that theory in sustainability science must (1) explain the emergence and persistence of social-ecological states, (2) account for endogenous cultural change, (3) incorporate cooperation dynamics, and (4) address the complexities of multilevel social-ecological interactions. We suggest that cultural evolutionary theory broadly, and cultural multilevel selection in particular, can improve on these fronts. We outline a multilevel evolutionary framework for describing social-ecological change and detail how multilevel cooperative dynamics can determine outcomes in environmental dilemmas. We show how this framework complements existing sustainability frameworks with a description of the emergence and persistence of sustainable institutions and behavior, a means to generalize causal patterns across social-ecological contexts, and a heuristic for designing and evaluating effective sustainability interventions. We support these assertions with case examples from developed and developing countries in which we track cooperative change at multiple levels of social organization as they impact social-ecological outcomes. Finally, we make suggestions for further theoretical development, empirical testing, and application.

  19. Fatal Chromobacterium violaceum septicaemia in northern Laos, a modified oxidase test and post-mortem forensic family G6PD analysis

    Directory of Open Access Journals (Sweden)

    Mayxay Mayfong

    2009-07-01

    Full Text Available Abstract Background Chromobacterium violaceum is a Gram-negative facultative anaerobic bacillus, found in soil and stagnant water, that usually has a violet-pigmented appearance on agar culture. It is rarely described as a human pathogen, mostly from tropical and subtropical areas. Case presentation A 53-year-old farmer died with Chromobacterium violaceum septicemia in Laos. A modified oxidase method was used to demonstrate that this violaceous organism was oxidase positive. Forensic analysis of the glucose-6-phosphate dehydrogenase genotypes of his family suggests that the deceased patient did not have this possible predisposing condition. Conclusion C. violaceum infection should be included in the differential diagnosis in patients presenting with community-acquired septicaemia in tropical and subtropical areas. The apparently neglected but simple modified oxidase test may be useful in the oxidase assessment of other violet-pigmented organisms or of those growing on violet coloured agar.

  20. Fatal Chromobacterium violaceum septicaemia in northern Laos, a modified oxidase test and post-mortem forensic family G6PD analysis.

    Science.gov (United States)

    Slesak, Günther; Douangdala, Phouvieng; Inthalad, Saythong; Silisouk, Joy; Vongsouvath, Manivanh; Sengduangphachanh, Amphonesavanh; Moore, Catrin E; Mayxay, Mayfong; Matsuoka, Hiroyuki; Newton, Paul N

    2009-07-29

    Chromobacterium violaceum is a Gram-negative facultative anaerobic bacillus, found in soil and stagnant water, that usually has a violet-pigmented appearance on agar culture. It is rarely described as a human pathogen, mostly from tropical and subtropical areas. A 53-year-old farmer died with Chromobacterium violaceum septicemia in Laos. A modified oxidase method was used to demonstrate that this violaceous organism was oxidase positive. Forensic analysis of the glucose-6-phosphate dehydrogenase genotypes of his family suggests that the deceased patient did not have this possible predisposing condition. C. violaceum infection should be included in the differential diagnosis in patients presenting with community-acquired septicaemia in tropical and subtropical areas. The apparently neglected but simple modified oxidase test may be useful in the oxidase assessment of other violet-pigmented organisms or of those growing on violet coloured agar.

  1. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  2. A Framework for Analysis of Case Studies of Reading Lessons

    Science.gov (United States)

    Carlisle, Joanne F.; Kelcey, Ben; Rosaen, Cheryl; Phelps, Geoffrey; Vereb, Anita

    2013-01-01

    This paper focuses on the development and study of a framework to provide direction and guidance for practicing teachers in using a web-based case studies program for professional development in early reading; the program is called Case Studies Reading Lessons (CSRL). The framework directs and guides teachers' analysis of reading instruction by…

  3. The influence of cycling temperature and cycling rate on the phase specific degradation of a positive electrode in lithium ion batteries: A post mortem analysis

    Science.gov (United States)

    Darma, Mariyam Susana Dewi; Lang, Michael; Kleiner, Karin; Mereacre, Liuda; Liebau, Verena; Fauth, Francois; Bergfeldt, Thomas; Ehrenberg, Helmut

    2016-09-01

    The influence of cycling temperature and cycling rate on the cycling stability of the positive electrode (cathode) of commercial batteries is investigated. The cathode is a mixture of LiMn2O4 (LMO), LiNi0.5Co0.2Mn0.3O2 (NCM) and LiNi0.8Co0.15Al0.05O2 (NCA). It is found that increasing the cycling temperature from 25 °C to 40 °C is detrimental to the long-term cycling stability of the cathode. In contrast, improved cycling stability is observed for the cathodes cycled at a higher charge/discharge rate (2C/3C instead of 1C/2C). Microstructure analysis by X-ray powder diffraction reveals significant capacity fading and increased overvoltage for NCM and NCA in all the fatigued cathodes. After a high number of cycles (above 1500), NCM becomes partially inactive. In contrast to NCM and NCA, LMO shows good cycling stability at 25 °C. A pronounced degradation of LMO is only observed for the fatigued cathodes cycled at 40 °C. The large capacity losses of NCM and NCA most likely arise because the blended cathodes were cycled up to 4.12 V vs. the graphite anode during the cycle-life test (corresponding to 4.16 V vs. Li+/Li), which is beyond the stability limit of the layered oxides (below 4.05 V vs. Li+/Li).

  4. Isolation of primary microglia from the human post-mortem brain: effects of ante- and post-mortem variables.

    Science.gov (United States)

    Mizee, Mark R; Miedema, Suzanne S M; van der Poel, Marlijn; Adelia; Schuurman, Karianne G; van Strien, Miriam E; Melief, Jeroen; Smolders, Joost; Hendrickx, Debbie A; Heutinck, Kirstin M; Hamann, Jörg; Huitinga, Inge

    2017-02-17

    Microglia are key players in the central nervous system in health and disease. Much pioneering research on microglia function has been carried out in vivo with the use of genetic animal models. However, to fully understand the role of microglia in neurological and psychiatric disorders, it is crucial to study primary human microglia from brain donors. We have developed a rapid procedure for the isolation of pure human microglia from autopsy tissue using density gradient centrifugation followed by CD11b-specific cell selection. The protocol can be completed in 4 h, with an average yield of 450,000 and 145,000 viable cells per gram of white and grey matter tissue respectively. This method allows for the immediate phenotyping of microglia in relation to brain donor clinical variables, and shows the microglia population to be distinguishable from autologous choroid plexus macrophages. This protocol has been applied to samples from over 100 brain donors from the Netherlands Brain Bank, providing a robust dataset to analyze the effects of age, post-mortem delay, brain acidity, and neurological diagnosis on microglia yield and phenotype. Our data show that cerebrospinal fluid pH is positively correlated to microglial cell yield, but donor age and post-mortem delay do not negatively affect viable microglia yield. Analysis of CD45 and CD11b expression showed that changes in microglia phenotype can be attributed to a neurological diagnosis, and are not influenced by variation in ante- and post-mortem parameters. Cryogenic storage of primary microglia was shown to be possible, albeit with variable levels of recovery and effects on phenotype and RNA quality. Microglial gene expression substantially changed due to culture, including the loss of the microglia-specific markers, showing the importance of immediate microglia phenotyping. We conclude that primary microglia can be isolated effectively and rapidly from human post-mortem brain tissue, allowing for the study of the

  5. Risk and train control : a framework for analysis

    Science.gov (United States)

    2001-01-01

    This report develops and demonstrates a framework for examining the effects of various train control strategies on some of the major risks of railroad operations. Analysis of a hypothetical 1200-mile corridor identified the main factors that increase r...

  6. An analysis of a national strategic framework to promote tourism ...

    African Journals Online (AJOL)

    An analysis of a national strategic framework to promote tourism, leisure, sport and ... is to highlight the extent to which selected macro policy components namely, ... tourism growth, tourism safety and security, environmental management and ...

  7. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  8. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  9. Investigations into the analysis of the rate of decay of the compound action potentials recorded from the rat sciatic nerve after death: significance for the prediction of the post-mortem period.

    Science.gov (United States)

    Nokes, L D; Daniel, D; Flint, T; Barasi, S

    1991-01-01

    A number of papers have reported investigations of electrical stimulation of muscle groups in order to determine the post-mortem period. To the authors' knowledge, no techniques have been described that analyse the compound action potentials (CAP) of various nerve fibre groups after death. This paper reports the monitoring of both the amplitude and latency changes of the CAP recorded from a stimulated rat sciatic nerve after death. Initial results suggest that the method may be useful in determining the early post-mortem period within 1 or 2 h after death. It may also be of use in measuring nerve conduction delay in various pathological conditions that can affect the neural network, for example diabetes.
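
    The abstract does not specify a decay model, but a simple way to turn such measurements into a post-mortem interval estimate is to fit a parametric decay curve to CAP amplitude versus time and invert it, as in the hedged sketch below; the data values and the exponential form are assumptions for illustration only.

```python
# Fit an exponential decay to CAP amplitude versus time after death, then
# invert the fitted curve to estimate the post-mortem interval (PMI) of a new
# observation. All values are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t_min = np.array([10, 20, 30, 45, 60, 90, 120], dtype=float)   # minutes post mortem
amp_mV = np.array([4.8, 3.9, 3.1, 2.3, 1.7, 1.0, 0.6])          # CAP amplitude (mV)

def decay(t, a0, tau):
    return a0 * np.exp(-t / tau)

(a0, tau), _ = curve_fit(decay, t_min, amp_mV, p0=(5.0, 60.0))

observed = 2.0                                                   # new CAP amplitude (mV)
pmi_estimate = -tau * np.log(observed / a0)                      # invert the fitted curve
print(f"a0 = {a0:.2f} mV, tau = {tau:.1f} min, estimated PMI ~ {pmi_estimate:.0f} min")
```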

  10. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists in the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator, which can be the logical AND (•), the OR (⊗), or the XOR, i.e. exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
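
    The sketch below is a toy encoding of the idea, not the paper's algebraic formalism: each task carries an input-join operator (AND, OR or XOR), and "logical termination" is checked by propagating completed tasks to a fixed point and testing whether the end task completes.

```python
# Toy workflow graph: task -> (join operator over incoming arcs, predecessors).
# Logical termination is approximated by a fixed-point reachability check.
workflow = {
    "start":   ("AND", []),
    "prepare": ("AND", ["start"]),
    "review":  ("AND", ["start"]),
    "merge":   ("AND", ["prepare", "review"]),   # AND-join: needs both branches
    "publish": ("XOR", ["merge"]),
    "end":     ("OR",  ["publish"]),
}

def terminates(workflow, start="start", end="end"):
    done = set()
    changed = True
    while changed:
        changed = False
        for task, (op, preds) in workflow.items():
            if task in done:
                continue
            if task == start:
                ready = True
            elif op == "AND":
                ready = preds and all(p in done for p in preds)
            else:                     # OR / XOR joins fire on any completed input
                ready = any(p in done for p in preds)
            if ready:
                done.add(task)
                changed = True
    return end in done

print("workflow terminates logically:", terminates(workflow))
```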

  11. SPATIAL ANALYSIS FRAMEWORK FOR MANGROVE FORESTS RESTORATION

    Directory of Open Access Journals (Sweden)

    Arimatéa de Carvalho Ximenes

    2016-09-01

    Full Text Available Mangroves are coastal ecosystems in transition between sea and land, found worldwide in tropical and subtropical regions. However, anthropogenic pressure in coastal areas has led to the conversion of many mangrove areas to other uses. Due to the increased awareness of the importance of mangroves worldwide, restoration methods are being studied. Our aim is to develop a framework for selecting suitable sites for red mangrove planting using Geographic Information Systems (GIS). The methodology is based on abiotic factors that influence the zonation (distribution and growth) of Rhizophora mangle. A total suitable area of 6.12 hectares was found, where 15,300 propagules could be planted.

  12. Post mortem examination report concerning Nadim Nuwwara

    DEFF Research Database (Denmark)

    Leth, Peter Mygind

    2014-01-01

    Post-mortem examination report concerning Nadim Nuwwara, 17 years old, who was killed May 15, 2014 in Beitunia near Ramallah, Palestine. The examination was performed by an international team consisting of Dr. Saber Al-Aloul, director of the Medico Legal Institute at Quds University, Dr. Marc A. Krouse, Deputy Chief Medical Examiner, Office of Chief Medical Examiner, Fort Worth, Texas, USA, Dr. Chen Kugel, Chief Forensic Pathologist, Abu Kabir Institute of Forensic Medicine, Tel Aviv, Dr. Ricardo Pablo Nachman, forensic expert at Abu Kabir Institute of Forensic Medicine, Tel Aviv, and Dr. Peter

  13. Utility of Post-Mortem Genetic Testing in Cases of Sudden Arrhythmic Death Syndrome

    NARCIS (Netherlands)

    Lahrouchi, Najim; Raju, Hariharan; Lodder, Elisabeth M.; Papatheodorou, Efstathios; Ware, James S.; Papadakis, Michael; Tadros, Rafik; Cole, Della; Skinner, Jonathan R.; Crawford, Jackie; Love, Donald R.; Pua, Chee J.; Soh, Bee Y.; Bhalshankar, Jaydutt D.; Govind, Risha; Tfelt-Hansen, Jacob; Winkel, Bo G.; van der Werf, Christian; Wijeyeratne, Yanushi D.; Mellor, Greg; Till, Jan; Cohen, Marta C.; Tome-Esteban, Maria; Sharma, Sanjay; Wilde, Arthur A. M.; Cook, Stuart A.; Bezzina, Connie R.; Sheppard, Mary N.; Behr, Elijah R.

    2017-01-01

    Sudden arrhythmic death syndrome (SADS) describes a sudden death with negative autopsy and toxicological analysis. Cardiac genetic disease is a likely etiology. This study investigated the clinical utility and combined yield of post-mortem genetic testing (molecular autopsy) in cases of SADS and

  14. Utility of Post-Mortem Genetic Testing in Cases of Sudden Arrhythmic Death Syndrome

    DEFF Research Database (Denmark)

    Lahrouchi, Najim; Raju, Hariharan; Lodder, Elisabeth M

    2017-01-01

    BACKGROUND: Sudden arrhythmic death syndrome (SADS) describes a sudden death with negative autopsy and toxicological analysis. Cardiac genetic disease is a likely etiology. OBJECTIVES: This study investigated the clinical utility and combined yield of post-mortem genetic testing (molecular autopsy...

  15. Framework for the analysis of crystallization operations

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; Abdul Samad, Noor Asma Fazli Bin; Gernaey, Krist

    Crystallization is often applied in the production of salts and/or active pharmaceutical ingredients (API), and the crystallization step is an essential part of the manufacturing process for many chemicals-based products. In recent years the monitoring and analysis of crystallization operations has...

  16. An anomaly analysis framework for database systems

    NARCIS (Netherlands)

    Vavilis, S.; Egner, A.I.; Petkovic, M.; Zannone, N.

    2015-01-01

    Anomaly detection systems are usually employed to monitor database activities in order to detect security incidents. These systems raise an alert when anomalous activities are detected. The raised alerts have to be analyzed to timely respond to the security incidents. Their analysis, however, is

  17. Establishing a framework for comparative analysis of genome sequences

    Energy Technology Data Exchange (ETDEWEB)

    Bansal, A.K.

    1995-06-01

    This paper describes a framework and a high-level language toolkit for comparative analysis of genome sequence alignments. The framework integrates information derived from a multiple sequence alignment and a phylogenetic tree (hypothetical tree of evolution) to derive new properties about sequences. Multiple sequence alignments are treated as an abstract data type. Abstract operations are described to manipulate a multiple sequence alignment and to derive mutation-related information from a phylogenetic tree by superimposing parsimonious analysis. The framework has been applied to protein alignments to derive constrained columns (in a multiple sequence alignment) that exhibit evolutionary pressure to preserve a common property in a column despite mutation. A Prolog toolkit based on the framework has been implemented and demonstrated on alignments containing 3000 sequences and 3904 columns.
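
    The paper's toolkit is written in Prolog; the Python sketch below only illustrates the notion of a "constrained" column, using a crude hydrophobicity classification (an assumption, not the paper's property set) to flag columns that mutate while preserving a shared property.

```python
# Flag alignment columns whose residues differ (mutation happened) but all fall
# in the same hydrophobicity class (a shared property is preserved).
HYDROPHOBIC = set("AVLIMFWC")   # crude residue classification (assumption)

alignment = [          # toy alignment, one string per sequence
    "MKV-LSA",
    "MRV-ITA",
    "MKI-LSG",
]

def constrained_columns(alignment):
    cols = []
    for i, column in enumerate(zip(*alignment)):
        residues = [r for r in column if r != "-"]          # ignore gaps
        if not residues:
            continue
        classes = {r in HYDROPHOBIC for r in residues}
        varied = len(set(residues)) > 1                     # mutation happened
        if varied and len(classes) == 1:                    # ...but class conserved
            cols.append(i)
    return cols

print("constrained columns:", constrained_columns(alignment))
```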

  18. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    OpenAIRE

    Schmidt, Ralph; Bostelmann, Jonas; Cornet, Yves; Heipke, Christian; Philippe, Christian; Poncelet, Nadia; de Rosa, Diego; Vandeloise, Yannick

    2012-01-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing a careful risk analysis has to be carried out. This is comprised of identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk assoc...

  19. Strategy analysis frameworks for strategy orientation and focus

    OpenAIRE

    Isoherranen, V. (Ville)

    2012-01-01

    Abstract The primary research target of this dissertation is to develop new strategy analysis frameworks, focusing on analysing changes in strategic position as a function of variations in life cycle s-curve/time/typology/market share/orientation. Research is constructive and qualitative by nature, with case study methodology being the adopted approach. The research work is carried out as a compilation dissertation containing four (4) journal articles. The theoretical framework of thi...

  20. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), a kind of artificial intelligence (AI) technology, was successfully introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea behind IRCMA is that historical records of RCM analysis on similar items can be referenced and reused for the RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising similar cases when conducting RCM analysis. Based on the previous theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research focuses on the definition, basic principles and framework of IRCMA, and discusses the critical techniques involved. Finally, the IRCMAS prototype is presented based on a case study
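
    A minimal sketch of the CBR retrieval step (with invented feature names, not the IRCMAS implementation): a new item is described by a few attributes and the most similar historical RCM case is retrieved for reuse or revision.

```python
# Case-based retrieval: score historical RCM cases by attribute overlap with a
# new item and return the best match. All case data here are made up.
def similarity(a, b):
    """Fraction of matching attribute values between two cases."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

case_base = [
    {"item": "lube oil pump", "failure_mode": "bearing wear",
     "consequence": "operational", "task": "scheduled vibration monitoring"},
    {"item": "cooling fan", "failure_mode": "belt failure",
     "consequence": "hidden", "task": "scheduled failure-finding test"},
]

new_item = {"item": "feed water pump", "failure_mode": "bearing wear",
            "consequence": "operational"}

best = max(case_base, key=lambda case: similarity(new_item, case))
print("retrieved case:", best["item"], "->", best["task"])
```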

  1. Overview of the NRC/EPRI common cause analysis framework

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Worledge, D.H.; Mosleh, A.; Fleming, K.; Parry, G.W.; Paula, H.

    1988-01-01

    This paper presents an overview of a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures whose causes are not explicitly included in the logic model as basic events. The emphasis here is on providing guidelines for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework comprises four major stages: (1) Logic Model Development, (2) Identification of Common Cause Component Groups, (3) Common Cause Modeling and Data Analysis, and (4) Quantification and Interpretation of Results. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. 25 references

  2. Delayed Post Mortem Predation in Lightning Strike Carcasses ...

    African Journals Online (AJOL)

    Campbell Murn

    An adult giraffe was struck dead by lightning on a game farm outside Phalaborwa, South Africa, in March 2014. Interestingly, delayed post-mortem predation occurred on the carcass, which according to the farm owners was an atypical phenomenon for the region. Delayed post-mortem scavenging on lightning strike ...

  3. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  4. In vitro studies of ante-mortem proliferation kinetics

    International Nuclear Information System (INIS)

    McBride, W.H.; Withers, H.R.

    1986-01-01

    Using K562 human erythroblastoid cells, it was concluded that dose fractionation has no discrepant effect on the ante-mortem proliferation kinetics of doomed cells as opposed to clonogenic cell survival and that effects on ante-mortem proliferation kinetics cannot be solely responsible for the differences in fractionation response between early and late responding tissues. (UK)

  5. Post-mortem CT evaluation of atlanto-occipital dissociation.

    Science.gov (United States)

    Madadin, Mohammed; Samaranayake, Ravindra Priyalal; O'Donnell, Chris; Cordner, Stephen

    2017-02-01

    Atlanto-occipital dissociation injury is an important injury in forensic pathology practice. Radiological diagnosis of atlanto-occipital dissociation clinically is assessed by direct measurement of occipito-vertebral skeletal relationships. Different measurements may be used to diagnose atlanto-occipital dissociation, including the basion-dens interval (BDI) and basion-axial interval (BAI). It is not known whether the normal ante-mortem measurements of BDI and BAI described in the literature are applicable to post-mortem CT images of the occipito-cervical junction (OCJ) or whether these measurements could be affected by early post-mortem changes. This study aims to compare post-mortem BDI and BAI measurements with ante-mortem values. Post-mortem CT scans of the cervical spines of 100 deceased adults were reviewed, and the BDI and BAI were measured. Different parameters were recorded in each case. The results from this study suggest that there are no effects of post-mortem changes on the measurement of BAI as relied upon clinically. There appear to be some effects of fully established rigor mortis on BDI measurement, shortening it. This may have consequences for the post mortem diagnosis of atlanto-occipital dissociation. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  6. ANALYSIS FRAMEWORKS OF THE COLLABORATIVE INNOVATION PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2014-12-01

    Full Text Available Time management is one of the resources by which improved innovation performance can be achieved. This perspective of resource management and process efficiency, which works by shortening the incubation time of ideas, selecting profitable innovations and turning them into added value, relates to absolute time, a time specific to human existence. In this article I will try to show that high performance in inter-organizational innovation can be achieved mainly by manipulating the context and manipulating knowledge, outside the arbitrary concept of “time”. The article presents the results of research suggesting a sequential model for analyzing and evaluating performance through a rational and refined process of selecting performance indicators, aiming to provide the shortest and most relevant list of criteria.

  7. Development of comprehensive and versatile framework for reactor analysis, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Numata, Kazuyuki; Jin, Tomoyuki

    2014-01-01

    Highlights: • We have developed a neutronics code system for reactor analysis. • The new code system covers all five phases of the core design procedures. • All the functionalities are integrated and validated in the same framework. • The framework supports continuous improvement and extension. • We report results of validation and practical applications. - Abstract: A comprehensive and versatile reactor analysis code system, MARBLE, has been developed. MARBLE is designed as a software development framework for reactor analysis, which offers reusable and extendible functions and data models based on physical concepts, rather than a reactor analysis code system. From a viewpoint of the code system, it provides a set of functionalities utilized in a detailed reactor analysis scheme for fast criticality assemblies and power reactors, and nuclear data related uncertainty quantification such as cross-section adjustment. MARBLE includes five sub-systems named ECRIPSE, BIBLO, SCHEME, UNCERTAINTY and ORPHEUS, which are constructed of the shared functions and data models in the framework. By using these sub-systems, MARBLE covers all phases required in fast reactor core design prediction and improvement procedures, i.e. integral experiment database management, nuclear data processing, fast criticality assembly analysis, uncertainty quantification, and power reactor analysis. In the present paper, these functionalities are summarized and system validation results are described

  8. Post-mortem cardiac diffusion tensor imaging: detection of myocardial infarction and remodeling of myofiber architecture.

    Science.gov (United States)

    Winklhofer, Sebastian; Stoeck, Christian T; Berger, Nicole; Thali, Michael; Manka, Robert; Kozerke, Sebastian; Alkadhi, Hatem; Stolzmann, Paul

    2014-11-01

    To investigate the accuracy of post-mortem diffusion tensor imaging (DTI) for the detection of myocardial infarction (MI) and to demonstrate the feasibility of helix angle (HA) calculation to study remodelling of myofibre architecture. Cardiac DTI was performed in 26 deceased subjects prior to autopsy for medicolegal reasons. Fractional anisotropy (FA) and mean diffusivity (MD) were determined. Accuracy was calculated on per-segment (AHA classification), per-territory, and per-patient basis, with pathology as reference standard. HAs were calculated and compared between healthy segments and those with MI. Autopsy demonstrated MI in 61/440 segments (13.9 %) in 12/26 deceased subjects. Healthy myocardial segments had significantly higher FA (p < 0.01) and lower MD (p < 0.001) compared to segments with MI. Multivariate logistic regression demonstrated that FA (p < 0.10) and MD (p = 0.01) with the covariate post-mortem time (p < 0.01) predicted MI with an accuracy of 0.73. Analysis of HA distribution demonstrated remodelling of myofibre architecture, with significant differences between healthy segments and segments with chronic (p < 0.001) but not with acute MI (p > 0.05). Post-mortem cardiac DTI enables differentiation between healthy and infarcted myocardial segments by means of FA and MD. HA assessment allows for the demonstration of remodelling of myofibre architecture following chronic MI. • DTI enables post-mortem detection of myocardial infarction with good accuracy. • A decrease in right-handed helical fibre indicates myofibre remodelling following chronic myocardial infarction. • DTI allows for ruling out myocardial infarction by means of FA. • Post-mortem DTI may represent a valuable screening tool in forensic investigations.

  9. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international R3B collaboration has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. This contribution reports first results of data analysis from the detector prototype test in November 2012 and presents a comparison of the tracker performance with experimental data.

  10. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...
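
    The control/signal-region logic that HistFitter formalizes can be illustrated with a toy calculation. The sketch below is not HistFitter's API; it is a minimal stand-alone Python example of extrapolating a background estimate from a control region into a signal region with an assumed transfer factor and comparing hypotheses with a Poisson likelihood. All counts and the transfer factor are invented.

```python
# Toy illustration of the control/signal-region pattern that HistFitter formalizes.
# This is NOT HistFitter's API; it is a minimal stand-alone sketch using a Poisson
# likelihood, with all numbers invented for demonstration.
from scipy.stats import poisson

n_cr = 240          # observed events in a background-dominated control region (hypothetical)
n_sr = 35           # observed events in the signal region (hypothetical)
tau = 0.10          # assumed transfer factor: expected SR background = tau * CR background

b_sr = tau * n_cr   # background-only estimate extrapolated into the signal region

def log_likelihood(signal):
    """Poisson log-likelihood of the SR observation for a given signal yield."""
    return poisson.logpmf(n_sr, b_sr + signal)

ll_b = log_likelihood(0.0)                                    # background-only hypothesis
ll_sb = log_likelihood(max(n_sr - b_sr, 0.0))                 # crude best-fit signal yield
print(f"expected background in SR: {b_sr:.1f}")
print(f"-2 ln Lambda (background-only vs best fit): {-2 * (ll_b - ll_sb):.2f}")
```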

  11. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with CA of 23 articles from the EBSCO database covering the last 20 years (1996–2016). The findings showed that the proposed framework can help researchers better apply CA, and suggest that the use of the method in PM research should be expanded in both quantity and quality. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  12. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business opportunities, a generic modelling framework is proposed to handle this task. This framework outlines a set of building blocks which are necessary for carrying out the economic analysis of various BS applications. Further, special focus is given on describing how to use the rainflow cycle counting algorithm for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...
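
    The rainflow cycle counting step highlighted above can be sketched in a few lines. The following is a simplified three-point rainflow implementation (full cycles plus residual half cycles), not the authors' model; the state-of-charge trace is an invented example, and a production implementation would follow ASTM E1049 more closely.

```python
# Minimal sketch of three-point rainflow cycle counting for battery cycle-life estimation.
# Simplified: every matched pair is counted as a full cycle and the residue as half cycles.

def turning_points(series):
    """Reduce a signal (e.g. a state-of-charge trace) to its local extrema."""
    pts = [series[0]]
    for x in series[1:]:
        if x == pts[-1]:
            continue
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x                     # still moving in the same direction
        else:
            pts.append(x)
    return pts

def rainflow_cycles(series):
    """Return (range, count) pairs; count is 1.0 for full and 0.5 for half cycles."""
    stack, cycles = [], []
    for point in turning_points(series):
        stack.append(point)
        while len(stack) >= 3 and abs(stack[-1] - stack[-2]) >= abs(stack[-2] - stack[-3]):
            cycles.append((abs(stack[-2] - stack[-3]), 1.0))   # inner pair closes a cycle
            del stack[-3:-1]                                   # drop the two closing points
    cycles += [(abs(b - a), 0.5) for a, b in zip(stack, stack[1:]) if a != b]
    return cycles

# Hypothetical state-of-charge trace (fractions of rated capacity)
soc = [0.9, 0.4, 0.7, 0.2, 0.8, 0.5, 0.9]
for rng, cnt in rainflow_cycles(soc):
    print(f"depth-of-discharge {rng:.2f}  count {cnt}")
```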

  13. [Legal aspects of post-mortem radiology in the Netherlands].

    Science.gov (United States)

    Venderink, W; Dute, J C J

    2016-01-01

    In the Netherlands, the application of post-mortem radiology (virtual autopsy) is on the rise. Contrary to conventional autopsy, with post-mortem radiology the body remains intact. There is uncertainty concerning the legal admissibility of post-mortem radiology, since the Dutch Corpse Disposal Act does not contain any specific regulations for this technique. Autopsy and post-mortem radiology differ significantly from a technical aspect, but these differences do not have far-reaching legal consequences from a legal perspective. Even though the body remains intact during post-mortem radiology, the bodily integrity of a deceased person is breached if it would be applied without previously obtained consent. This permission can only be obtained after the relatives are fully informed about the proposed activity. In this respect, it is not relevant which technique is used, be it post-mortem radiology or autopsy. Therefore, the other legal conditions for post-mortem radiology are essentially identical to those for autopsy.

  14. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  15. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider...

  16. A comparative analysis of protected area planning and management frameworks

    Science.gov (United States)

    Per Nilsen; Grant Tayler

    1997-01-01

    A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...

  17. Agricultural Value Chains in Developing Countries; a Framework for Analysis

    NARCIS (Netherlands)

    Trienekens, J.H.

    2011-01-01

    The paper presents a framework for developing country value chain analysis made up of three components. The first consists of identifying major constraints for value chain upgrading: market access restrictions, weak infrastructures, lacking resources and institutional voids. In the second component

  18. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  19. Generic Formal Framework for Compositional Analysis of Hierarchical Scheduling Systems

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; Hyun Kim, Jin; Thi Xuan Phan, Linh

    We present a compositional framework for the specification and analysis of hierarchical scheduling systems (HSS). Firstly we provide a generic formal model, which can be used to describe any type of scheduling system. The concept of Job automata is introduced in order to model job instantiation...

  20. Transactional Analysis: Conceptualizing a Framework for Illuminating Human Experience

    Directory of Open Access Journals (Sweden)

    Trevor Thomas Stewart PhD

    2011-09-01

    Full Text Available Myriad methods exist for analyzing qualitative data. It is, however, imperative for qualitative researchers to employ data analysis tools that are congruent with the theoretical frameworks underpinning their inquiries. In this paper, I have constructed a framework for analyzing data that could be useful for researchers interested in focusing on the transactional nature of language as they engage in Social Science research. Transactional Analysis (TA is an inductive approach to data analysis that transcends constant comparative methods of exploring data. Drawing on elements of narrative and thematic analysis, TA uses the theories of Bakhtin and Rosenblatt to attend to the dynamic processes researchers identify as they generate themes in their data and seek to understand how their participants' worldviews are being shaped. This paper highlights the processes researchers can utilize to study the mutual shaping that occurs as participants read and enter into dialogue with the world around them.

  1. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    The Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities providing the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment is provided by the facility side, because meta-data formats, analysis functions and the overall data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework for developing data reduction and analysis software. The framework is composed of the class library for data reduction and analysis operators, network-distributed data processing modules and data containers. The class library is wrapped by a Python interface created with SWIG. All classes of the framework can be called from Python, and Manyo-lib cooperates with the data acquisition and data-visualization components through the MLF-platform, a unified user interface in MLF, which runs on Python. Raw data in event format obtained by the data acquisition systems are converted into histogram-format data in Manyo-lib with high performance, and data reduction and analysis are performed with user-application software developed on top of Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data-analysis software for each instrument are being developed by the instrument teams.
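
    Manyo-lib's own C++/Python classes are not reproduced here, but the event-to-histogram reduction step it performs can be illustrated generically. The sketch below uses plain NumPy with invented field names and binning; it only shows the idea of turning event-mode data into a histogram-style data container with counting errors.

```python
# Generic illustration of the event-to-histogram reduction step described above.
# This does NOT use Manyo-lib; field names and binning are invented for the example.
import numpy as np

# Hypothetical event-mode data: one time-of-flight value (microseconds) per neutron event
rng = np.random.default_rng(0)
tof_events = rng.uniform(1_000, 20_000, size=100_000)

# Reduce events to a histogram "data container": counts per time-of-flight bin
bin_edges = np.linspace(1_000, 20_000, 501)
counts, _ = np.histogram(tof_events, bins=bin_edges)
errors = np.sqrt(counts)                      # Poisson counting errors

histogram = {"axis": "time-of-flight (us)",   # stand-in for a framework data container
             "edges": bin_edges, "counts": counts, "errors": errors}
print(histogram["counts"][:5], histogram["errors"][:5])
```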

  2. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Full Text Available Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants’ enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors’ aim is to reflect on what makes a MOOC successful and to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data of a specific MOOC and refined, within a qualitative interpretivist methodology. The data were collected from the ‘As alterações climáticas nos média escolares - Clima@EduMedia’ course, which was developed by the project Clima@EduMedia and was submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a MOOC success factors framework the authors are attempting to help fill a literature gap concerning the criteria for considering a specific MOOC successful. Findings: This work’s major finding is a literature-based and empirically-refined MOOC success factors analysis framework. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work’s relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By proposing a framework of factors that make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future

  3. Combinatorial-topological framework for the analysis of global dynamics

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.

  4. Combinatorial-topological framework for the analysis of global dynamics.

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.
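
    The grid-and-graph idea behind this framework can be sketched for a one-dimensional example map. The code below samples box-to-box transitions on a rectangular grid and takes strongly connected components of the resulting digraph as candidate recurrent (Morse) sets; the logistic-type map, parameter value and sampling resolution are illustrative choices only, and a rigorous implementation would use interval arithmetic to obtain a true outer approximation.

```python
# Rough sketch of the grid/graph idea: cover phase space by rectangular boxes, estimate
# box-to-box transitions of the map by sampling, and take strongly connected components
# of the transition digraph as candidate recurrent (Morse) sets.
import numpy as np
import networkx as nx

f = lambda x: 3.83 * x * (1.0 - x)         # example map on [0, 1]
n_boxes = 200
edges = np.linspace(0.0, 1.0, n_boxes + 1)

G = nx.DiGraph()
samples = 20                                # sample points per box to approximate its image
for i in range(n_boxes):
    xs = np.linspace(edges[i], edges[i + 1], samples)
    images = f(xs)
    targets = np.clip(np.searchsorted(edges, images, side="right") - 1, 0, n_boxes - 1)
    for j in np.unique(targets):
        G.add_edge(i, int(j))

# Non-trivial strongly connected components approximate the recurrent dynamics
morse_sets = [c for c in nx.strongly_connected_components(G)
              if len(c) > 1 or G.has_edge(next(iter(c)), next(iter(c)))]
print(f"{len(morse_sets)} candidate recurrent set(s) on a {n_boxes}-box grid")
```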

  5. Framework for Interactive Parallel Dataset Analysis on the Grid

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, David A.; Ananthan, Balamurali; /Tech-X Corp.; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of a desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and construct professional-quality visualizations of the results.

  6. Analysis of Worldwide Regulatory Framework for On-Line Maintenance

    International Nuclear Information System (INIS)

    Ahn, Sang Kyu; Oh, Kyu Myung; Lee, Chang Ju

    2010-01-01

    With the increasing economic pressures being faced and the potential for shortening outage times under the conditions of deregulated electricity markets worldwide, licensees are motivated to perform an increasing amount of on-line maintenance (OLM). OLM means planned maintenance of nuclear reactor facilities, including structures, systems, and components (SSCs), during power operation. A similar situation is emerging in Korea, so a regulatory framework for OLM needs to be established. A few years ago, foreign countries' practices related to OLM were surveyed by the Working Group on Inspection Practices (WGIP) of OECD/NEA/CNRA. The survey results and additional new information on each country's status, which are analyzed in this paper, will be helpful in establishing our own regulatory framework for OLM. From the analysis, some key points to be addressed in establishing a regulatory framework for OLM are suggested

  7. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continues to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development for such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  8. Second generation CO2 FEP analysis: Cassifcarbon sequestration scenario identification framework

    NARCIS (Netherlands)

    Yavuz, F.T.; Tilburg, T. van; Pagnier, H.

    2008-01-01

    A novel scenario analysis framework has been created, called Carbon Sequestration Scenario Identification Framework (CASSIF). This framework addresses containment performance defined by the three major categories: well, fault and seal integrity. The relevant factors that influence the integrity are

  9. Environmental risk analysis for nanomaterials: Review and evaluation of frameworks

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    2012-01-01

    to occupational settings with minor environmental considerations, and most have not been thoroughly tested on a wide range of NM. Care should also be taken when selecting the most appropriate risk analysis strategy for a given risk context. Given this, we recommend a multi-faceted approach to assess the environmental risks of NM, as well as increased applications and testing of the proposed frameworks for different NM.

  10. Uses and social meanings of post-mortem photography in Colombia

    OpenAIRE

    Ana María Henao Albarracín

    2013-01-01

    This research is intended to understand the social uses and meanings of post-mortem or funeral photography between the late nineteenth and mid-twentieth century in Colombia. The article seeks to contribute to the analysis of the relationship between photography and society, and more particularly, between photography and a social representation of death, identifying the conventions and rules of this photographic practice that determine aesthetic behaviors around death.

  11. Uses and social meanings of post-mortem photography in Colombia

    Directory of Open Access Journals (Sweden)

    Ana María Henao Albarracín

    2013-06-01

    Full Text Available This research is intended to understand the social uses and meanings of post-mortem or funeral photography between the late nineteenth and mid-twentieth century in Colombia. The article seeks to contribute to the analysis of the relationship between photography and society, and more particularly, between photography and a social representation of death, identifying the conventions and rules of this photographic practice that determine aesthetic behaviors around death.

  12. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    Full Text Available “Smart city” is a concept that has been the subject of increasing attention in urban planning and governance during recent years. The first step in creating Smart Cities is to understand the concept. However, a brief review of the literature shows that the concept of Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define Smart City. To this aim, an extensive literature review was done. Then, a keyword analysis of the literature was carried out against the main research questions (why, what, who, when, where, how) and based on the three main domains involved in the policy decision-making process and Smart City plan development: Academic, Industrial and Governmental. This resulted in a conceptual framework for Smart City. The result clarifies the definition of Smart City, while providing a framework to define each of its sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.
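
    The keyword-analysis step can be illustrated with a toy tally of candidate terms against the guiding questions. The terms, questions and mini-corpus below are invented placeholders; the paper's actual keyword set and coding scheme are not reproduced.

```python
# Tiny illustration of a keyword analysis: tally how often candidate terms appear in a
# set of abstracts, grouped by guiding question. All terms and texts are placeholders.
from collections import Counter

question_terms = {
    "what": ["ICT", "infrastructure", "services"],
    "why": ["sustainability", "quality of life", "efficiency"],
    "who": ["government", "citizens", "industry"],
}

corpus = [
    "smart city ICT infrastructure improves efficiency and quality of life",
    "government and citizens co-create services for sustainability",
]

counts = {q: Counter() for q in question_terms}
for doc in corpus:
    text = doc.lower()
    for question, terms in question_terms.items():
        for term in terms:
            if term.lower() in text:
                counts[question][term] += 1

for question, counter in counts.items():
    print(question, dict(counter))
```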

  13. A new kernel discriminant analysis framework for electronic nose recognition

    International Nuclear Information System (INIS)

    Zhang, Lei; Tian, Feng-Chun

    2014-01-01

    Graphical abstract: - Highlights: • This paper proposes a new discriminant analysis framework for feature extraction and recognition. • The principle of the proposed NDA is derived mathematically. • The NDA framework is coupled with kernel PCA for classification. • The proposed KNDA is compared with state of the art e-Nose recognition methods. • The proposed KNDA shows the best performance in e-Nose experiments. - Abstract: Electronic nose (e-Nose) technology based on metal oxide semiconductor gas sensor array is widely studied for detection of gas components. This paper proposes a new discriminant analysis framework (NDA) for dimension reduction and e-Nose recognition. In a NDA, the between-class and the within-class Laplacian scatter matrix are designed from sample to sample, respectively, to characterize the between-class separability and the within-class compactness by seeking for discriminant matrix to simultaneously maximize the between-class Laplacian scatter and minimize the within-class Laplacian scatter. In terms of the linear separability in high dimensional kernel mapping space and the dimension reduction of principal component analysis (PCA), an effective kernel PCA plus NDA method (KNDA) is proposed for rapid detection of gas mixture components by an e-Nose. The NDA framework is derived in this paper as well as the specific implementations of the proposed KNDA method in training and recognition process. The KNDA is examined on the e-Nose datasets of six kinds of gas components, and compared with state of the art e-Nose classification methods. Experimental results demonstrate that the proposed KNDA method shows the best performance with average recognition rate and total recognition rate as 94.14% and 95.06% which leads to a promising feature extraction and multi-class recognition in e-Nose
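
    A rough sketch of the "kernel PCA plus discriminant analysis" pipeline is given below. scikit-learn's LinearDiscriminantAnalysis is used only as a stand-in for the paper's NDA criterion (which is built from between-class and within-class Laplacian scatter matrices), and the data are random placeholders for e-Nose sensor-array features.

```python
# Schematic of a kernel-PCA-plus-discriminant pipeline. LDA stands in for the paper's
# NDA step; the data are random placeholders, so the score is near chance level.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 16))              # 16 gas-sensor features per sample (placeholder)
y = rng.integers(0, 6, size=300)            # 6 gas classes, as in the paper's dataset

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.1).fit(X_tr)
lda = LinearDiscriminantAnalysis().fit(kpca.transform(X_tr), y_tr)

accuracy = lda.score(kpca.transform(X_te), y_te)
print(f"hold-out recognition rate (random data, so near chance): {accuracy:.2f}")
```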

  14. Interactive Safety Analysis Framework of Autonomous Intelligent Vehicles

    Directory of Open Access Journals (Sweden)

    Cui You Xiang

    2016-01-01

    Full Text Available More than 100,000 people were killed and around 2.6 million injured in road accidents in the People’s Republic of China (PRC), that is four to eight times the rate of developed countries, equivalent to a mortality of 6.2 per 10 thousand vehicles, the highest rate in the world. There are more than 1,700 fatalities and 840,000 injuries yearly due to vehicle crashes off public highways. In this paper, we propose an interactive safety situation and threat analysis framework built on driver behaviour and vehicle dynamics risk analysis based on ISO26262…

  15. Flexible Human Behavior Analysis Framework for Video Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Weilun Lao

    2010-01-01

    Full Text Available We study a flexible framework for semantic analysis of human motion from surveillance video. Successful trajectory estimation and human-body modeling facilitate the semantic analysis of human activities in video sequences. Although human motion is widely investigated, we have extended such research in three aspects. By adding a second camera, not only is more reliable behavior analysis possible, but it also enables mapping the ongoing scene events onto a 3D setting to facilitate further semantic analysis. The second contribution is the introduction of a 3D reconstruction scheme for scene understanding. Thirdly, we perform a fast scheme to detect different body parts and generate a fitting skeleton model, without using the explicit assumption of upright body posture. The extension of multiple-view fusion improves the event-based semantic analysis by 15%–30%. Our proposed framework proves its effectiveness as it achieves a near real-time performance (13–15 and 6–8 frames/second for monocular and two-view video sequences, respectively).

  16. Diagnosis of drowning using post-mortem computed tomography - state of the art.

    Science.gov (United States)

    Raux, C; Saval, F; Rouge, D; Telmon, N; Dedouit, F

    Recent studies using post-mortem computed tomography (PMCT) have suggested this imaging modality is of value in the positive diagnosis of drowning. We summarize the data from the literature regarding the diagnostic value of CT in cases of drowning. We performed an all-language search of literature published from 1999 to 2013 with the key words "post-mortem CT scan", "drowning and CT scan", "near-drowning diagnosis", and "drowning diagnosis". Only 11 articles, whose data enabled complementary statistical analysis, were included. The presence of fluid and sediment in paranasal sinuses appear to be the determinants of the diagnosis of drowning. The presence of fluid in the sinuses had a sensitivity of 100%, and of 90% in the trachea and main bronchi. The results were completed by the high specificity of the presence of sediment in the paranasal sinuses, upper airways and stomach, which was 100% for all three. Haemodilution was present in cases of drowning.

  17. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2011-01-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performances and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization, in order to achieve the performance requirements of the experiment. In the central tracker a first track fit is performed using a conformal map transformation based on a helix assumption, then the track is used as input for a Kalman Filter (package genfit), using GEANE as track follower. The track is then correlated to the pid detectors (e.g. Cerenkov detectors, EM Calorimeter or Muon Chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further implemented packages in PandaRoot are: the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. The contribution will report about the status of PandaRoot and show some example results for analysis of physics benchmark channels.
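
    The Bayesian combination of particle-identification information mentioned above amounts to multiplying per-detector likelihoods with prior abundances and normalizing. The sketch below illustrates this with invented numbers; it is not PandaRoot code.

```python
# Minimal sketch of Bayesian particle identification: per-detector likelihoods for each
# particle hypothesis are multiplied and combined with priors. All numbers are invented.
import numpy as np

hypotheses = ["e", "mu", "pi", "K", "p"]
priors = np.array([0.05, 0.05, 0.70, 0.15, 0.05])      # assumed production abundances

# Hypothetical per-detector likelihoods for one track (rows: detectors, cols: hypotheses)
likelihoods = np.array([
    [0.02, 0.03, 0.60, 0.30, 0.05],   # Cherenkov detector
    [0.01, 0.02, 0.55, 0.35, 0.07],   # EM calorimeter
    [0.05, 0.10, 0.50, 0.25, 0.10],   # muon chambers
])

posterior = priors * likelihoods.prod(axis=0)
posterior /= posterior.sum()

for name, p in zip(hypotheses, posterior):
    print(f"P({name} | all detectors) = {p:.3f}")
```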

  18. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  19. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  20. An ovine in vivo framework for tracheobronchial stent analysis.

    Science.gov (United States)

    McGrath, Donnacha J; Thiebes, Anja Lena; Cornelissen, Christian G; O'Shea, Mary B; O'Brien, Barry; Jockenhoevel, Stefan; Bruzzi, Mark; McHugh, Peter E

    2017-10-01

    Tracheobronchial stents are most commonly used to restore patency to airways stenosed by tumour growth. Currently all tracheobronchial stents are associated with complications such as stent migration, granulation tissue formation, mucous plugging and stent strut fracture. The present work develops a computational framework to evaluate tracheobronchial stent designs in vivo. Pressurised computed tomography is used to create a biomechanical lung model which takes into account the in vivo stress state, global lung deformation and local loading from pressure variation. Stent interaction with the airway is then evaluated for a number of loading conditions including normal breathing, coughing and ventilation. Results of the analysis indicate that three of the major complications associated with tracheobronchial stents can potentially be analysed with this framework, which can be readily applied to the human case. Airway deformation caused by lung motion is shown to have a significant effect on stent mechanical performance, including implications for stent migration, granulation formation and stent fracture.

  1. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities make space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
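
    The entity-linking idea can be sketched with a toy co-occurrence graph built through a key-value grouping, loosely mimicking the distributed paradigm described above. The posts and entity names are invented, and ConceptSearch itself is not reproduced.

```python
# Toy sketch of storyline building: entities that co-occur in the same post are linked,
# and a storyline is a chain through the resulting links. The key-value grouping loosely
# mimics the distributed paradigm; posts and entities are invented examples.
from collections import defaultdict
from itertools import combinations

posts = [
    {"id": 1, "entities": ["OrgA", "CityX"]},
    {"id": 2, "entities": ["OrgA", "PersonB"]},
    {"id": 3, "entities": ["PersonB", "CityX", "EventZ"]},
]

# "Map" step: emit (entity-pair, post-id) key-value pairs; "reduce" step: group by pair
pair_support = defaultdict(list)
for post in posts:
    for a, b in combinations(sorted(post["entities"]), 2):
        pair_support[(a, b)].append(post["id"])

# Print the supported links; chaining them together yields candidate storylines
for (a, b), ids in sorted(pair_support.items()):
    print(f"{a} -- {b}  (supported by posts {ids})")
```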

  2. Quantification of ante-mortem hypoxic ischemic brain injury by post-mortem cerebral magnetic resonance imaging in neonatal encephalopathy.

    Science.gov (United States)

    Montaldo, Paolo; Chaban, Badr; Lally, Peter J; Sebire, Neil J; Taylor, Andrew M; Thayyil, Sudhin

    2015-11-01

    Post-mortem (PM) magnetic resonance imaging (MRI) is increasingly used as an alternative to conventional autopsy in babies dying from neonatal encephalopathy. However, the confounding effect of post-mortem changes on the detection of ante-mortem ischemic injury is unclear. We examined whether quantitative MR measurements can accurately distinguish ante-mortem ischemic brain injury from artifacts using post-mortem MRI. We compared PM brain MRI (1.5 T Siemens, Avanto) in 7 infants who died with neonatal encephalopathy (NE) of presumed hypoxic-ischemic origin with 7 newborn infants who had sudden unexplained neonatal death (SUND controls) without evidence of hypoxic-ischemic brain injury at autopsy. We measured apparent diffusion coefficients (ADCs), T1-weighted signal intensity ratios (SIRs) compared to vitreous humor and T2 relaxation times from 19 predefined brain areas typically involved in neonatal encephalopathy. There were no differences in mean ADC values, SIRs on T1-weighted images or T2 relaxation times in any of the 19 predefined brain areas between NE and SUND infants. All MRI images showed loss of cortical gray/white matter differentiation, loss of the normal high signal intensity (SI) in the posterior limb of the internal capsule on T1-weighted images, and high white matter SI on T2-weighted images. Normal post-mortem changes may be easily mistaken for ante-mortem ischemic injury, and current PM MRI quantitative assessment cannot reliably distinguish these. These findings may have important implications for appropriate interpretation of PM imaging findings, especially in medico-legal practice. Copyright © 2015 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.

  3. Exploring intellectual capital through social network analysis: a conceptual framework

    Directory of Open Access Journals (Sweden)

    Ivana Tichá

    2011-01-01

    Full Text Available The purpose of this paper is to develop a framework to assess intellectual capital. Intellectual capital is a key element in an organization’s future earning potential. Theoretical and empirical studies show that it is the unique combination of the different elements of intellectual capital and tangible investments that determines an enterprise's competitive advantage. Intellectual capital has been defined as the combination of an organization's human, organizational and relational resources and activities. It includes the knowledge, skills, experience and abilities of the employees, its R&D activities, organizational routines, procedures, systems, databases and its Intellectual Property Rights, as well as all the resources linked to its external relationships, such as with its customers, suppliers, R&D partners, etc. This paper focuses on relational capital and attempts to suggest a conceptual framework to assess this part of intellectual capital by applying a social network analysis (SNA) approach. The SNA approach allows for mapping and measuring of relationships and flows between people, groups, organizations, computers, URLs, and other connected information/knowledge entities. The conceptual framework is developed for the assessment of collaborative networks in the Czech higher education sector as the representation of its relational capital. It also builds on previous work aiming at a proposed methodology to guide efforts to report intellectual capital at the Czech public universities.
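
    As a minimal illustration of the social network analysis step, the sketch below builds a small collaboration graph and computes degree and betweenness centrality as simple indicators of relational capital. The institutions and ties are invented examples, not data from the Czech higher education sector.

```python
# Small illustration of the SNA step: represent collaboration ties as a graph and
# compute simple centrality measures. Institutions and ties are invented examples.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Univ A", "Univ B"), ("Univ A", "Univ C"),
    ("Univ B", "Univ C"), ("Univ C", "Industry X"),
    ("Univ D", "Industry X"),
])

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

for node in G.nodes:
    print(f"{node:12s} degree={degree[node]:.2f}  betweenness={betweenness[node]:.2f}")
```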

  4. A Framework for Security Analysis of Mobile Wireless Networks

    DEFF Research Database (Denmark)

    Nanz, Sebastian; Hankin, Chris

    2006-01-01

    We present a framework for specification and security analysis of communication protocols for mobile wireless networks. This setting introduces new challenges which are not being addressed by classical protocol analysis techniques. The main complication stems from the fact that the actions of intermediate nodes and their connectivity can no longer be abstracted into a single unstructured adversarial environment as they form an inherent part of the system's security. In order to model this scenario faithfully, we present a broadcast calculus which makes a clear distinction between the protocol processes and the network's connectivity graph, which may change independently from protocol actions. We identify a property characterising an important aspect of security in this setting and express it using behavioural equivalences of the calculus. We complement this approach with a control flow analysis...

  5. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    Science.gov (United States)

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
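
    The modified Beer-Lambert law reconstruction that ICNNA incorporates can be written generically as a small linear solve per channel. The sketch below is not taken from ICNNA's MATLAB code; the extinction coefficients, source-detector distance, differential pathlength factors and measured optical-density changes are placeholder numbers.

```python
# Generic sketch of a modified Beer-Lambert law reconstruction: convert optical-density
# changes at two wavelengths into oxy-/deoxy-haemoglobin concentration changes.
# All numerical values below are placeholders, not calibrated constants.
import numpy as np

# extinction coefficients [HbO2, HbR] per wavelength (placeholder units: 1/(mM*cm))
eps = np.array([[1.4866, 3.8437],             # 760 nm
                [2.5264, 1.7986]])            # 850 nm
d = 3.0                                       # source-detector separation in cm (assumed)
dpf = np.array([6.0, 6.0])                    # assumed differential pathlength factors

def mbll(delta_od):
    """Convert optical-density changes at two wavelengths to [dHbO2, dHbR] in mM."""
    effective_path = (eps.T * d * dpf).T      # scale each wavelength row by its pathlength
    return np.linalg.solve(effective_path, delta_od)

delta_od = np.array([0.012, 0.018])           # hypothetical measured OD changes
d_hbo, d_hbr = mbll(delta_od)
print(f"dHbO2 = {d_hbo:.4f} mM, dHbR = {d_hbr:.4f} mM")
```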

  6. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
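
    The landscape-based cumulative-effects modelling step can be sketched as a multiple linear regression that is then queried under alternative land-use scenarios. The predictors, response and data below are synthetic placeholders, not the study's variables.

```python
# Sketch of landscape-based cumulative-effects modelling: fit a multiple linear
# regression of a biotic condition score on land-use stressor metrics, then score a
# hypothetical future scenario. All variables and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n_sites = 60
X = np.column_stack([
    rng.uniform(0, 40, n_sites),   # % mined area in upstream catchment (placeholder)
    rng.uniform(0, 30, n_sites),   # % residential land cover (placeholder)
])
# Synthetic biotic index that declines with both stressors, plus noise
y = 80 - 0.9 * X[:, 0] - 0.6 * X[:, 1] + rng.normal(0, 5, n_sites)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)

# Scenario analysis: predicted change if mining expands from 10% to 25% at a site
baseline, future = np.array([[10.0, 5.0]]), np.array([[25.0, 5.0]])
print("predicted change in condition score:", model.predict(future)[0] - model.predict(baseline)[0])
```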

  7. A framework for sensitivity analysis of decision trees.

    Science.gov (United States)

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
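
    The flavour of the proposed sensitivity analysis can be sketched on a two-strategy toy tree: compute each strategy's expected value, then re-evaluate it after shifting probability mass toward the worst outcome. The tree, payoffs and perturbation size are invented, and the paper's full robust-optimization machinery is not reproduced.

```python
# Toy sketch of probability sensitivity analysis on a decision tree: expected values of
# each strategy are recomputed under a pessimistic perturbation of the probabilities.
# The tree, payoffs and perturbation size are invented examples.

strategies = {
    "invest": [(0.6, 100.0), (0.4, -40.0)],   # (probability, payoff) branches
    "hold":   [(0.9, 20.0),  (0.1, -5.0)],
}

def expected_value(branches):
    return sum(p * v for p, v in branches)

def perturb(branches, delta):
    """Shift probability mass 'delta' from the best-payoff branch to the worst one."""
    worst = min(range(len(branches)), key=lambda i: branches[i][1])
    best = max(range(len(branches)), key=lambda i: branches[i][1])
    new = [list(b) for b in branches]
    shift = min(delta, new[best][0])
    new[best][0] -= shift
    new[worst][0] += shift
    return [tuple(b) for b in new]

for name, branches in strategies.items():
    base = expected_value(branches)
    pessimistic = expected_value(perturb(branches, delta=0.1))
    print(f"{name:8s} EV={base:6.1f}  EV under pessimistic perturbation={pessimistic:6.1f}")
```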

  8. DNA methylation results depend on DNA integrity – role of post mortem interval

    Directory of Open Access Journals (Sweden)

    Mathias eRhein

    2015-05-01

    Full Text Available Major questions of neurological and psychiatric mechanisms involve brain function at a molecular level and cannot be easily addressed due to limitations in access to tissue samples. Post mortem studies are able to partly bridge the gap between brain tissue research retrieved from animal trials and the information derived from peripheral analysis (e.g. measurements in blood cells) in patients. Here, we wanted to know how fast DNA degradation progresses under controlled conditions in order to define thresholds for tissue quality to be used in respective trials. Our focus was on the applicability of partly degraded samples for bisulfite sequencing and the determination of simple means to define cut-off values. After opening the brain cavity, we kept two consecutive pig skulls at ambient temperature (19-21°C) and removed cortex tissue up to a post mortem interval (PMI) of 120h. We calculated the percentage of degradation on DNA gel electrophoresis of brain DNA to estimate quality and relate this estimation spectrum to the quality of human post-mortem control samples. Functional DNA quality was investigated by bisulfite sequencing of two functionally relevant genes for either the serotonin receptor 5 (SLC6A4) or aldehyde dehydrogenase 2 (ALDH2). Testing our approach in a heterogeneous collective of human blood and brain samples, we demonstrate integrity of measurement quality below the threshold of 72h PMI. While sequencing technically worked for all timepoints irrespective of conceivable DNA degradation, there is a good correlation between variance of methylation and degradation levels documented in the gel (R2=0.4311, p=0.0392) for advancing post mortem intervals (PMI). This otherwise elusive phenomenon is an important prerequisite for the interpretation and evaluation of samples prior to in-depth processing, via an affordable and easy assay to establish identical sample quality and thereby comparable methylation measurements.

  9. A Case Study in Support of Multiple Post Mortem Assessments (Invited Paper

    Directory of Open Access Journals (Sweden)

    Jill Pable

    2015-02-01

    Full Text Available Creative projects in various fields are often subjected to after-the-fact 'post-mortem' assessments to better understand their successes and failures. Names for these include project retrospectives or post occupancy evaluations (POEs) depending on their field of origin. This case study from the architecture field will show the utility of engaging in multiple rounds of post-mortem activities in order to assess the solution from multiple stakeholder perspectives and in doing so, more fully recognize its strengths and weaknesses. The design of a homeless shelter bedroom was subjected to two POE analyses: a 'demand side' focused study that analyzed user accommodation, and a 'supply side' study that addressed issues including budget and funding. The two POEs yielded both corroborative and contrasting findings that sometimes worked at cross purposes. Three evaluation tactics emerged that could be extended to other fields' post mortem assessment activities: 1) conduct two or more POEs; 2) vary the POE criteria so that one is a deep and focused 'demand side' user analysis and the other addresses 'supply side' operational and installation issues; and 3) conduct the POEs over a broad time period.

  10. Post-mortem cardiac diffusion tensor imaging: detection of myocardial infarction and remodeling of myofiber architecture

    Energy Technology Data Exchange (ETDEWEB)

    Winklhofer, Sebastian; Berger, Nicole; Stolzmann, Paul [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland); University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); Stoeck, Christian T.; Kozerke, Sebastian [Institute for Biomedical Engineering University and ETH Zurich, Zurich (Switzerland); Thali, Michael [University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); Manka, Robert [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland); Institute for Biomedical Engineering University and ETH Zurich, Zurich (Switzerland); University Hospital Zurich, Clinic for Cardiology, Zurich (Switzerland); Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland)

    2014-11-15

    To investigate the accuracy of post-mortem diffusion tensor imaging (DTI) for the detection of myocardial infarction (MI) and to demonstrate the feasibility of helix angle (HA) calculation to study remodelling of myofibre architecture. Cardiac DTI was performed in 26 deceased subjects prior to autopsy for medicolegal reasons. Fractional anisotropy (FA) and mean diffusivity (MD) were determined. Accuracy was calculated on per-segment (AHA classification), per-territory, and per-patient basis, with pathology as reference standard. HAs were calculated and compared between healthy segments and those with MI. Autopsy demonstrated MI in 61/440 segments (13.9 %) in 12/26 deceased subjects. Healthy myocardial segments had significantly higher FA (p < 0.01) and lower MD (p < 0.001) compared to segments with MI. Multivariate logistic regression demonstrated that FA (p < 0.10) and MD (p = 0.01) with the covariate post-mortem time (p < 0.01) predicted MI with an accuracy of 0.73. Analysis of HA distribution demonstrated remodelling of myofibre architecture, with significant differences between healthy segments and segments with chronic (p < 0.001) but not with acute MI (p > 0.05). Post-mortem cardiac DTI enables differentiation between healthy and infarcted myocardial segments by means of FA and MD. HA assessment allows for the demonstration of remodelling of myofibre architecture following chronic MI. (orig.)

  11. Quality of coroner's post-mortems in a UK hospital.

    Science.gov (United States)

    Al Mahdy, Husayn

    2014-01-01

    The aim of this paper was, principally, to examine the quality of coroner's post-mortem reports on adult medical patients admitted to an English hospital, and to compare results with Royal College of Pathologists guidelines. Hospital clinical notes of adult medical patients who died in 2011 and were referred to the coroner's office to determine the cause of death were scrutinised. Their clinical care was also reviewed. There needs to be a comprehensive approach to coroner's post-mortems, such as routinely taking histological and microbiological specimens. Acute adult medical patient care needs to improve. Steps should be taken to ensure that comprehensive coroner's post-mortems are performed throughout the UK, including routine examination of histological and microbiological specimens. Additionally, closer collaboration between clinicians and pathologists needs to occur to improve emergency adult medical patient clinical care. The study highlights inadequacies in coroner's pathology services.

  12. Axial osteitis of the proximal sesamoid bones and desmitis of the intersesamoidean ligament in the hindlimb of Friesian horses: review of 12 cases (2002-2012) and post-mortem analysis of the bone-ligament interface.

    Science.gov (United States)

    Brommer, Harold; Voermans, Margreet; Veraa, Stefanie; van den Belt, Antoon J M; van der Toorn, Annette; Ploeg, Margreet; Gröne, Andrea; Back, Willem

    2014-11-19

    Axial osteitis of the proximal sesamoid bones and desmitis of the intersesamoidean ligament has been described in Friesian horses as well as in other breeds. The objectives of this study were to review the outcome of clinical cases of this disease in Friesian horses and analyse the pathology of the bone-ligament interface. Case records of Friesian horses diagnosed with axial osteitis of the proximal sesamoid bones and desmitis of the intersesamoidean ligament in the period 2002-2012 were retrospectively evaluated. Post-mortem examination was performed on horses that were euthanized (n = 3) and included macroscopic necropsy (n = 3), high-field (9.4 Tesla) magnetic resonance imaging (n = 1) and histopathology (n = 2). Twelve horses were included, aged 6.8 ± 2.7 years. The hindlimb was involved in all cases. Lameness was acute in onset and severe, with a mean duration of 1.9 ± 1.0 months. Three horses were euthanized after diagnosis; 9 horses underwent treatment. Two horses (22%) became sound for light riding purposes, 2 horses (22%) became pasture sound (comfortable at pasture, but not suitable for riding), 5 horses (56%) remained lame. In addition to bone resorption at the proximo-axial margin of the proximal sesamoid bones, magnetic resonance imaging and histopathology showed osteoporosis of the peripheral compact bone and spongious bone of the proximal sesamoid bones and chronic inflammation of the intersesamoidean ligament. Axial osteitis of the proximal sesamoid bones and desmitis of the intersesamoidean ligament in the hindlimb of Friesian horses carries a poor prognosis. Pathological characterization (inflammation, proximo-axial bone resorption and remodelling of the peripheral compact bone and spongious bone of the proximal sesamoid bones) may help in unravelling the aetiology of this disease.

  13. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Full Text Available Using Hidden Markov Models (HMMs as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
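
    As a minimal illustration of the "one HMM per vocalization type" scheme described above, the following Python sketch trains a Gaussian HMM per call type on generic spectral features and classifies a new recording by maximum log-likelihood; the libraries (hmmlearn, librosa), the MFCC features and all parameter values are assumptions for the example, not the authors' configuration.

```python
# Hedged sketch: "one HMM per call type" classification with generic spectral
# features. hmmlearn/librosa and all parameters are illustrative assumptions.
import numpy as np
import librosa
from hmmlearn import hmm

def spectral_features(wav_path, sr=16000, n_mfcc=13):
    """Frame-level spectral features (MFCCs) for one vocalization."""
    y, sr = librosa.load(wav_path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T  # (frames, n_mfcc)

def train_models(training_set, n_states=5):
    """Fit one Gaussian HMM per call type on its concatenated examples."""
    models = {}
    for call_type, paths in training_set.items():
        feats = [spectral_features(p) for p in paths]
        lengths = [f.shape[0] for f in feats]
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(np.vstack(feats), lengths)
        models[call_type] = m
    return models

def classify(models, wav_path):
    """Assign the call type whose HMM gives the highest log-likelihood."""
    feats = spectral_features(wav_path)
    return max(models, key=lambda c: models[c].score(feats))
```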

  14. FIND--a unified framework for neural data analysis.

    Science.gov (United States)

    Meier, Ralph; Egert, Ulrich; Aertsen, Ad; Nawrot, Martin P

    2008-10-01

    The complexity of neurophysiology data has increased tremendously over the last years, especially due to the widespread availability of multi-channel recording techniques. With adequate computing power the current limit for computational neuroscience is the effort and time it takes for scientists to translate their ideas into working code. Advanced analysis methods are complex and often lack reproducibility on the basis of published descriptions. To overcome this limitation we develop FIND (Finding Information in Neural Data) as a platform-independent, open source framework for the analysis of neuronal activity data based on Matlab (Mathworks). Here, we outline the structure of the FIND framework and describe its functionality, our measures of quality control, and the policies for developers and users. Within FIND we have developed a unified data import from various proprietary formats, simplifying standardized interfacing with tools for analysis and simulation. The toolbox FIND covers a steadily increasing number of tools. These analysis tools address various types of neural activity data, including discrete series of spike events, continuous time series and imaging data. Additionally, the toolbox provides solutions for the simulation of parallel stochastic point processes to model multi-channel spiking activity. We illustrate two examples of complex analyses with FIND tools: First, we present a time-resolved characterization of the spiking irregularity in an in vivo extracellular recording from a mushroom-body extrinsic neuron in the honeybee during odor stimulation. Second, we describe layer specific input dynamics in the rat primary visual cortex in vivo in response to visual flash stimulation on the basis of multi-channel spiking activity.
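
    FIND itself is a Matlab toolbox; purely as an illustration of one of the analyses it mentions (time-resolved spiking irregularity), the following Python sketch computes the coefficient of variation of inter-spike intervals in a sliding window. The window and step sizes are arbitrary assumptions, not values from the toolbox.

```python
# Illustrative sketch (not FIND itself): time-resolved spiking irregularity as
# the coefficient of variation (CV) of inter-spike intervals in a sliding window.
import numpy as np

def sliding_cv(spike_times, window=1.0, step=0.1):
    """Return (window centres, CV of ISIs) over a sliding time window (seconds)."""
    spike_times = np.sort(np.asarray(spike_times))
    t_start, t_end = spike_times[0], spike_times[-1]
    centres, cvs = [], []
    t = t_start
    while t + window <= t_end:
        mask = (spike_times >= t) & (spike_times < t + window)
        isis = np.diff(spike_times[mask])
        if isis.size >= 2 and isis.mean() > 0:
            centres.append(t + window / 2)
            cvs.append(isis.std() / isis.mean())
        t += step
    return np.array(centres), np.array(cvs)
```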

  15. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Science.gov (United States)

    Convertino, Matteo; Valverde, L James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of
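
    As a hedged illustration of the Pareto-frontier step mentioned above, the sketch below keeps the non-dominated restoration portfolios when each is scored by a cost and an aggregated MCDA-style benefit; the two-criterion reduction and all names are simplifying assumptions, not the authors' full model.

```python
# Hedged sketch of the Pareto-frontier step only: keep portfolios that are not
# dominated in (cost, aggregated benefit). The two criteria are a simplification.
from dataclasses import dataclass
from typing import List

@dataclass
class Portfolio:
    name: str
    cost: float      # e.g. restoration budget
    benefit: float   # aggregated MCDA score (higher is better)

def pareto_frontier(portfolios: List[Portfolio]) -> List[Portfolio]:
    """A portfolio is dominated if another has <= cost and >= benefit (one strict)."""
    frontier = []
    for p in portfolios:
        dominated = any(
            (q.cost <= p.cost and q.benefit >= p.benefit) and
            (q.cost < p.cost or q.benefit > p.benefit)
            for q in portfolios
        )
        if not dominated:
            frontier.append(p)
    return sorted(frontier, key=lambda p: p.cost)
```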

  16. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Directory of Open Access Journals (Sweden)

    Matteo Convertino

    Full Text Available Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the

  17. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management

    Science.gov (United States)

    Convertino, Matteo; Valverde, L. James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of

  18. Post-mortem cytogenomic investigations in patients with congenital malformations.

    Science.gov (United States)

    Dias, Alexandre Torchio; Zanardo, Évelin Aline; Dutra, Roberta Lelis; Piazzon, Flavia Balbo; Novo-Filho, Gil Monteiro; Montenegro, Marilia Moreira; Nascimento, Amom Mendes; Rocha, Mariana; Madia, Fabricia Andreia Rosa; Costa, Thais Virgínia Moura Machado; Milani, Cintia; Schultz, Regina; Gonçalves, Fernanda Toledo; Fridman, Cintia; Yamamoto, Guilherme Lopes; Bertola, Débora Romeo; Kim, Chong Ae; Kulikowski, Leslie Domenici

    2016-08-01

    Congenital anomalies are the second highest cause of infant deaths, and, in most cases, diagnosis is a challenge. In this study, we characterize patterns of DNA copy number aberrations in different samples of post-mortem tissues from patients with congenital malformations. Twenty-eight patients undergoing autopsy were cytogenomically evaluated using several methods, specifically, Multiplex Ligation-dependent Probe Amplification (MLPA), microsatellite marker analysis with a MiniFiler kit, FISH, a cytogenomic array technique and bidirectional Sanger sequencing, which were performed on samples of different tissues (brain, heart, liver, skin and diaphragm) preserved in RNAlater, in formaldehyde or by paraffin-embedding. The results identified 13 patients with pathogenic copy number variations (CNVs). Of these, eight presented aneuploidies involving chromosomes 13, 18, 21, X and Y (two presented inter- and intra-tissue mosaicism). In addition, other abnormalities were found, including duplication of the TYMS gene (18p11.32); deletion of the CHL1 gene (3p26.3); deletion of the HIC1 gene (17p13.3); and deletion of the TOM1L2 gene (17p11.2). One patient had a pathogenic missense mutation of g.8535C>G (c.746C>G) in exon 7 of the FGFR3 gene consistent with Thanatophoric Dysplasia type I. Cytogenomic techniques were reliable for the analysis of autopsy material and allowed the identification of inter- and intra-tissue mosaicism and a better understanding of the pathogenesis of congenital malformations. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    Science.gov (United States)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  20. 9 CFR 381.71 - Condemnation on ante mortem inspection.

    Science.gov (United States)

    2010-01-01

    ... dressed, nor shall they be conveyed into any department of the official establishment where poultry... AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS Ante Mortem Inspection § 381.71...

  1. Drowning - post-mortem imaging findings by computed tomography

    International Nuclear Information System (INIS)

    Christe, Andreas; Aghayev, Emin; Jackowski, Christian; Thali, Michael J.; Vock, Peter

    2008-01-01

    The aim of this study was to identify the classic autopsy signs of drowning in post-mortem multislice computed tomography (MSCT). Therefore, the post-mortem pre-autopsy MSCT findings of ten drowning cases were correlated with autopsy and statistically compared with the post-mortem MSCT of 20 non-drowning cases. Fluid in the airways was present in all drowning cases. Central aspiration in either the trachea or the main bronchi was usually observed. Consecutive bronchospasm caused emphysema aquosum. Sixty percent of drowning cases showed a mosaic pattern of the lung parenchyma due to regions of hypo- and hyperperfused lung areas of aspiration. The resorption of fresh water in the lung resulted in hypodensity of the blood, representing haemodilution and possible heart failure. Swallowed water distended the stomach and duodenum, and inflow of water filled the paranasal sinuses (100%). All the typical findings of drowning, except Paltauf's spots, were detected using post-mortem MSCT, and a good correlation of MSCT and autopsy was found. The advantage of MSCT was the direct detection of bronchospasm, haemodilution and water in the paranasal sinuses, which is rather complicated or impossible at classical autopsy. (orig.)

  2. Post-mortem examination and sampling of African flamingos ...

    African Journals Online (AJOL)

    Recent largely unexplained deaths in African flamingos have prompted the need for standard, reproducible methods for the post-mortem examination of these birds, for the taking of samples and for the recording of findings. Here we describe suitable techniques and present three distinct protocols for field-based ...

  3. Academic Libraries and Quality: An Analysis and Evaluation Framework

    Science.gov (United States)

    Atkinson, Jeremy

    2017-01-01

    The paper proposes and describes a framework for academic library quality to be used by new and more experienced library practitioners and by others involved in considering the quality of academic libraries' services and provision. The framework consists of eight themes and a number of questions to examine within each theme. The framework was…

  4. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: the problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  5. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation modeling in the Redundancy Analysis framework based on so-called Extended Redundancy Analysis (ERA) has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we propose a simulation study of small samples. Moreover, we propose an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.
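
    A hedged sketch of the model structure described above, in notation of our own choosing (not the authors'): composites are linear combinations of the exogenous block, and the endogenous block is regressed on the composites and, in GRA, also directly on covariates.

```latex
% Hedged sketch of the structure described in the abstract (notation mine).
% X: exogenous block, Y: endogenous block, Z: covariates, F: composites.
\begin{align*}
  F &= X W
      && \text{composites as linear combinations of exogenous variables}\\
  Y &= F A^{\top} + Z B^{\top} + E
      && \text{endogenous variables on composites and (in GRA) covariates}
\end{align*}
```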

  6. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Full Text Available Abstract Background A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method An equal number of cardiac cycles was extracted from heart sounds with different heart rates using information from envelopes of autocorrelation functions without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using auto-correlation of envelope signals, feature extraction using discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise up to 0.3 s duration. Conclusion The proposed method showed promising results and high noise robustness to a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients and then further evaluating the method on this new set.
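
    As an illustration of the first two stages described above (envelope detection and cycle-length estimation from the autocorrelation of the envelope), the following Python sketch uses scipy/numpy; the sampling-rate handling and the plausible heart-rate band are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the first two stages: envelope detection and cycle-length estimation
# from the autocorrelation of the envelope. Parameters are illustrative.
import numpy as np
from scipy.signal import hilbert

def cycle_length_seconds(pcg, fs, min_bpm=40, max_bpm=180):
    """Estimate cardiac cycle length from a heart-sound signal without labelling FHS."""
    envelope = np.abs(hilbert(pcg))              # analytic-signal envelope
    envelope = envelope - envelope.mean()
    ac = np.correlate(envelope, envelope, mode="full")[len(envelope) - 1:]
    lo = int(fs * 60.0 / max_bpm)                # shortest plausible cycle (samples)
    hi = int(fs * 60.0 / min_bpm)                # longest plausible cycle (samples)
    lag = lo + np.argmax(ac[lo:hi])              # dominant periodicity in that band
    return lag / fs
```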

  7. Post-mortem radiology-a new sub-speciality?

    International Nuclear Information System (INIS)

    O'Donnell, C.; Woodford, N.

    2008-01-01

    Computed tomography (CT) and magnetic resonance imaging (MRI) examinations of deceased individuals are increasingly being utilized in the field of forensic pathology. However, there are differences in the interpretation of post-mortem and clinical imaging. Radiologists with only occasional experience in post-mortem imaging are at risk of misinterpreting the findings if they rely solely on clinical experience. Radiological specialists working in a co-operative environment with pathologists are pivotal in the understanding of post-mortem CT and MRI, and their appropriate integration into the autopsy. This has spawned a novel subspecialty called post-mortem radiology or necro-radiology (radiology of the deceased). In the future it is likely that whole-body CT will be incorporated into the routine forensic autopsy due to its ability to accurately detect and localise abnormalities commonly seen in forensic practice, such as haematoma, abnormal gas collections, fractures, and metallic foreign bodies. In the next 5-10 years most forensic institutes will seek regular access to such CT facilities or install machines into their own mortuaries. MRI is technically more problematic in the deceased but the improved tissue contrast over CT means that it is also very useful for investigation of pathology in the cranial, thoracic, and abdominal cavities, as well as the detection of haematoma in soft tissue. In order for radiologists to be an integral part of this important development in forensic investigation, radiological organizations must recognize the subspecialty of post-mortem radiology and provide a forum for radiologists to advance scientific knowledge in the field.

  8. Evolutionary squeaky wheel optimization: a new framework for analysis.

    Science.gov (United States)

    Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K

    2011-01-01

    Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration, SWO performs a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from a relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct other poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used in local search (such as simulated annealing) which only move through complete assignments. Generally, the exact details of ESWO will depend on various heuristics; so we focus our approach on a case of ESWO that we call ESWO-II and that has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary probability distribution over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
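
    The kind of computation described above can be illustrated with a small sketch: the stationary distribution of a Markov chain over search states, obtained from the left eigenvector of the transition matrix. The 3-state matrix below is a toy placeholder, not the ESWO-II chain analysed in the paper.

```python
# Sketch of the analysis step: stationary distribution of a Markov chain over
# search states. The transition matrix is a toy placeholder.
import numpy as np

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, idx])
    return pi / pi.sum()

P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])
print(stationary_distribution(P))   # probability mass per (toy) state
```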

  9. Short Run Profit Maximization in a Convex Analysis Framework

    Directory of Open Access Journals (Sweden)

    Ilko Vrankic

    2017-03-01

    Full Text Available In this article we analyse the short run profit maximization problem in a convex analysis framework. The goal is to apply the results of convex analysis, which suit the structure of microeconomic phenomena, deductively to the well-known short run profit maximization problem. In the primal optimization model the technology in the short run is represented by the short run production function, and the normalized profit function, which expresses profit in output units, is derived. In this approach the choice variable is the labour quantity. Alternatively, technology is represented by the real variable cost function, where costs are expressed in labour units, and the normalized profit function is derived, this time expressing profit in labour units. The choice variable in this approach is the quantity of production. The emphasis in these two perspectives of the primal approach is given to the first order necessary conditions of both models, which are the consequence of enveloping the closed convex set describing technology with its tangents. The dual model starts from the normalized profit function and recovers the production function, and alternatively the real variable cost function. In the first perspective of the dual approach the choice variable is the real wage, and in the second it is the real product price expressed in labour units. It is shown that the change of variables into parameters and parameters into variables leads to both optimization models, which give the same system of labour demand and product supply functions and their inverses. By deductively applying the results of convex analysis, the comparative statics results are derived describing the firm's behaviour in the short run.
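
    A hedged sketch of the primal short run problem in its output-normalized form, in our own notation (f is the short run production function, w the real wage); the first order condition expresses the tangency of the technology set with an isoprofit line, as discussed above.

```latex
% Hedged sketch of the primal short run problem (notation mine, not the authors').
\begin{align*}
  \pi(w) &= \max_{L \ge 0} \; \{ f(L) - wL \}
      && \text{normalized profit, expressed in output units}\\
  f'(L^{*}) &= w
      && \text{first order condition: tangency of } f \text{ with an isoprofit line}
\end{align*}
```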

  10. SIDEKICK: Genomic data driven analysis and decision-making framework

    Directory of Open Access Journals (Sweden)

    Yoon Kihoon

    2010-12-01

    Full Text Available Abstract Background Scientists striving to unlock mysteries within complex biological systems face myriad barriers in effectively integrating available information to enhance their understanding. While experimental techniques and available data sources are rapidly evolving, useful information is dispersed across a variety of sources, and sources of the same information often do not use the same format or nomenclature. To harness these expanding resources, scientists need tools that bridge nomenclature differences and allow them to integrate, organize, and evaluate the quality of information without extensive computation. Results Sidekick, a genomic data driven analysis and decision making framework, is a web-based tool that provides a user-friendly intuitive solution to the problem of information inaccessibility. Sidekick enables scientists without training in computation and data management to pursue answers to research questions like "What are the mechanisms for disease X" or "Does the set of genes associated with disease X also influence other diseases." Sidekick enables the process of combining heterogeneous data, finding and maintaining the most up-to-date data, evaluating data sources, quantifying confidence in results based on evidence, and managing the multi-step research tasks needed to answer these questions. We demonstrate Sidekick's effectiveness by showing how to accomplish a complex published analysis in a fraction of the original time with no computational effort using Sidekick. Conclusions Sidekick is an easy-to-use web-based tool that organizes and facilitates complex genomic research, allowing scientists to explore genomic relationships and formulate hypotheses without computational effort. Possible analysis steps include gene list discovery, gene-pair list discovery, various enrichments for both types of lists, and convenient list manipulation. Further, Sidekick's ability to characterize pairs of genes offers new ways to

  11. Economic impacts of climate change in Australia: framework and analysis

    International Nuclear Information System (INIS)

    Ford, Melanie

    2007-01-01

    Full text: There is growing interest in understanding the potential impacts of climate change in Australia, and especially the economic impacts of 'inaction'. In this study, a preliminary analysis of the possible economic impacts of future climate change in Australia is undertaken using ABARE's general equilibrium model of the global economy, GTEM. In order to understand the potential economy-wide economic impacts, the broad climatic trends that Australia is likely to experience over the next several decades are canvassed and the potential economic and non-economic impacts on key risk areas, such as water resources, agriculture and forests, health, industry and human settlements and the ecosystems, are identified. A more detailed analysis of the economic impacts of climate change are undertaken by developing two case studies. In the first case study, the economic impact of climate change and reduced water availability on the agricultural sector is assessed in the Murray-Darling Basin. In the second case study, the sectoral economic impacts on the Australian resources sector of a projected decline in global economic activity due to climate change is analysed. The key areas of required development to more fully understand the economy-wide and sectoral impacts of climate change are also discussed including issues associated with estimating both non-market and market impacts. Finally, an analytical framework for undertaking integrated assessment of climate change impacts domestically and globally is developed

  12. A Framework for Analysis of Music Similarity Measures

    DEFF Research Database (Denmark)

    Jensen, Jesper Højvang; Christensen, Mads G.; Jensen, Søren Holdt

    2007-01-01

    To analyze specific properties of music similarity measures that the commonly used genre classification evaluation procedure does not reveal, we introduce a MIDI based test framework for music similarity measures. We introduce the framework by example and thus outline an experiment to analyze the...

  13. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  14. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, only one framework fits Rogers' theory on diffusion of innovation. The framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though the elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  15. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, only one framework fits Rogers' theory on diffusion of innovation. The framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though the elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  16. Towards an intelligent framework for multimodal affective data analysis.

    Science.gov (United States)

    Poria, Soujanya; Cambria, Erik; Hussain, Amir; Huang, Guang-Bin

    2015-03-01

    An increasingly large amount of multimodal content is posted on social media websites such as YouTube and Facebook every day. In order to cope with the growth of such multimodal data, there is an urgent need to develop an intelligent multi-modal analysis framework that can effectively extract information from multiple modalities. In this paper, we propose a novel multimodal information extraction agent, which infers and aggregates the semantic and affective information associated with user-generated multimodal data in contexts such as e-learning, e-health, automatic video content tagging and human-computer interaction. In particular, the developed intelligent agent adopts an ensemble feature extraction approach by exploiting the joint use of tri-modal (text, audio and video) features to enhance the multimodal information extraction process. In preliminary experiments using the eNTERFACE dataset, our proposed multi-modal system is shown to achieve an accuracy of 87.95%, outperforming the best state-of-the-art system by more than 10%, or in relative terms, a 56% reduction in error rate. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Layers of protection analysis in the framework of possibility theory.

    Science.gov (United States)

    Ouazraoui, N; Nait-Said, R; Bourareche, M; Sellami, I

    2013-11-15

    An important issue faced by risk analysts is how to deal with uncertainties associated with accident scenarios. In industry, one often uses single values derived from historical data or literature to estimate event probabilities or frequencies. However, both the dynamic environments of systems and the need to consider rare component failures may make this kind of data unrealistic. In this paper, uncertainty encountered in Layers Of Protection Analysis (LOPA) is considered in the framework of possibility theory. Data provided by reliability databases and/or expert judgments are represented by fuzzy quantities (possibilities). The fuzzy outcome frequency is calculated by extended multiplication using the α-cuts method. The fuzzy outcome is compared to a scenario risk tolerance criterion, and the required reduction is obtained by resolving a possibilistic decision-making problem under a necessity constraint. In order to validate the proposed model, a case study concerning the protection layers of an operational heater is carried out. Copyright © 2013 Elsevier B.V. All rights reserved.
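
    The extended multiplication by α-cuts can be sketched as interval arithmetic performed level by level: the fuzzy outcome frequency is the fuzzy initiating-event frequency multiplied by the fuzzy probabilities of failure on demand (PFDs) of the protection layers. The triangular shapes and all numbers below are placeholders, not data from the case study.

```python
# Sketch of the alpha-cut multiplication step in a fuzzy LOPA. Triangular fuzzy
# numbers and all values are placeholders for illustration only.
import numpy as np

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def fuzzy_outcome_frequency(initiating_event, pfds, alphas=np.linspace(0, 1, 11)):
    """Multiply the fuzzy initiating frequency by fuzzy PFDs, alpha level by level."""
    cuts = []
    for alpha in alphas:
        lo, hi = alpha_cut(initiating_event, alpha)
        for pfd in pfds:
            p_lo, p_hi = alpha_cut(pfd, alpha)
            lo, hi = lo * p_lo, hi * p_hi   # all quantities are non-negative
        cuts.append((alpha, lo, hi))
    return cuts

# placeholder initiating-event frequency (per year) and two protection-layer PFDs
print(fuzzy_outcome_frequency((0.05, 0.1, 0.2), [(0.01, 0.05, 0.1), (0.001, 0.01, 0.05)]))
```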

  18. Politics of energy and the NEP: a framework and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Toner, G B

    1984-01-01

    This dissertation examines the nature and evolution of Canadian energy politics, with the focus on the 1973-1983 period and on the oil and gas aspects of energy. The conceptual basis for undertaking the analysis is the development and application of an integrated framework for the study of energy politics in Canada. The introduction of the National Energy Program (NEP) by the federal Liberal government in October 1980 marked a significant conjuncture in the development of Canadian energy politics. The NEP was intended to be a signal of a revitalized central government as well as a bargaining stance in the ongoing price and revenue sharing negotiations. Thus, the NEP must be understood as first and foremost a political act. This research suggests that energy politics must be understood as the outcome of conflict and consensus within the government-industry and intergovernmental relationships of power, over the ability to influence and control energy developments. To attempt to explain energy politics as essentially the outcome of interaction between government and industry with intergovernmental relations simply reflecting intra-industry competition, or conversely, to explain energy politics as merely the toing and froing of competing governments, is to present a fundamentally flawed portrayal of Canadian energy politics. That is, the dynamic force driving energy politics in Canada is a three-sided set of competitive relations between governments and the industry.

  19. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
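
    A hedged sketch of the Monte Carlo aggregation described above: sample squall-line parameters from assumed distributions, map each synthetic event to a maximum coastal amplitude (here a crude placeholder standing in for the hydrodynamic model), and count annualized exceedances to form a hazard curve. All distributions, the Poisson rate and the amplitude function are illustrative assumptions.

```python
# Sketch of the Monte Carlo aggregation for a meteotsunami hazard curve.
# Distributions, rate and the amplitude function are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
rate_per_year = 5.0          # assumed Poisson rate of qualifying pressure disturbances
years_simulated = 10_000

def max_amplitude(pressure_hpa, speed_ms):
    """Placeholder response: amplitude grows with pressure jump, peaks near long-wave speed."""
    resonance = np.exp(-((speed_ms - 30.0) / 10.0) ** 2)   # crude Proudman-resonance proxy
    return 0.05 * pressure_hpa * (1.0 + 4.0 * resonance)

n_events = rng.poisson(rate_per_year * years_simulated)
pressure = rng.lognormal(mean=0.5, sigma=0.5, size=n_events)   # hPa pressure jumps
speed = rng.normal(loc=25.0, scale=8.0, size=n_events)         # m/s propagation speed
amps = max_amplitude(pressure, speed)

thresholds = np.linspace(0.05, 1.0, 20)
for a in thresholds:
    rate = (amps > a).sum() / years_simulated                  # annualized exceedance rate
    print(f"amplitude > {a:.2f} m: {rate:.4f} events/year")
```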

  20. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health-related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact, with existing or reducing budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current system, while providing a solid path for development into the future. (author)

  1. The toxicological significance of post-mortem drug concentrations in bile.

    Science.gov (United States)

    Ferner, Robin E; Aronson, Jeffrey K

    2018-01-01

    Some authors have proposed that post-mortem drug concentrations in bile are useful in estimating concentrations in blood. Both The International Association of Forensic Toxicologists (TIAFT) and the US Federal Aviation Administration recommend that samples of bile should be obtained in some circumstances. Furthermore, standard toxicological texts compare blood and bile concentrations, implying that concentrations in bile are of forensic value. To review the evidence on simultaneous measurements of blood and bile drug concentrations reported in the medical literature. We made a systematic search of EMBASE 1980-2016 using the search terms ("bile/" OR "exp drug bile level/concentration/") AND "drug blood level/concentration/", PubMed 1975-2017 for ("bile[tw]" OR "biliary[tw]") AND ("concentration[tw]" OR "concentrations[tw]" OR "level[tw]" OR "levels[tw]") AND "post-mortem[tw]" and also MEDLINE 1990-2016 for information on drugs whose biliary concentrations were mentioned in standard textbooks. The search was limited to human studies without language restrictions. We also examined recent reviews, indexes of relevant journals and citations in Web of Science and Google Scholar. We calculated the bile:blood concentration ratio. The searches together yielded 1031 titles with abstracts. We scanned titles and abstracts for relevance and retrieved 230, of which 161 were considered further. We excluded 49 papers because: the paper reported only one case (30 references); the data referred only to a metabolite (1); the work was published before 1980 (3); the information concerned only samples taken during life (10); or the paper referred to a toxin or unusual recreational drug (5). The remaining 112 papers provided data for analysis, with at least two observations for each of 58 drugs. Bile:blood concentration ratios: Median bile:blood concentration ratios varied from 0.18 (range 0.058-0.32) for dextromoramide to 520 (range 0.62-43,000) for buprenorphine. Median bile

  2. Alternative Frameworks for Improving Government Organizational Performance: A Comparative Analysis

    National Research Council Canada - National Science Library

    Simon, Cary

    1997-01-01

    .... Six major frameworks emerging in the U.S. since 1980, applicable to the public sector, and designed to enhance organizational change toward improved performance are reviewed and analyzed: Total Quality; 'Excellence...

  3. A Conceptual Framework over Contextual Analysis of Concept Learning within Human-Machine Interplays

    DEFF Research Database (Denmark)

    Badie, Farshad

    2016-01-01

    This research provides a contextual description concerning existential and structural analysis of ‘Relations’ between human beings and machines. Subsequently, it will focus on conceptual and epistemological analysis of (i) my own semantics-based framework [for human meaning construction] and of (ii) a well-structured machine concept learning framework. Accordingly, I will, semantically and epistemologically, focus on linking those two frameworks for logical analysis of concept learning in the context of human-machine interrelationships. It will be demonstrated that the proposed framework provides

  4. Profiling of RNA degradation for estimation of post mortem [corrected] interval.

    Directory of Open Access Journals (Sweden)

    Fernanda Sampaio-Silva

    Full Text Available An estimation of the post mortem interval (PMI) is frequently touted as the Holy Grail of forensic pathology. During the first hours after death, PMI estimation is dependent on the rate of physically observable modifications including algor, rigor and livor mortis. However, these assessment methods are still largely unreliable and inaccurate. Alternatively, RNA has been put forward as a valuable tool in forensic pathology, namely to identify body fluids, estimate the age of biological stains and to study the mechanism of death. Nevertheless, the attempts to find a correlation between RNA degradation and PMI have been unsuccessful. The aim of this study was to characterize the RNA degradation in different post mortem tissues in order to develop a mathematical model that can be used as a coadjuvant method for a more accurate PMI determination. For this purpose, we performed an eleven-hour kinetic analysis of total extracted RNA from murine visceral and muscle tissues. The degradation profile of total RNA and the expression levels of several reference genes were analyzed by quantitative real-time PCR. A quantitative analysis of normalized transcript levels on the former tissues allowed the identification of four quadriceps muscle genes (Actb, Gapdh, Ppia and Srp72) that were found to significantly correlate with PMI. These results allowed us to develop a mathematical model with predictive value for estimation of the PMI (confidence interval of ±51 minutes at 95%) that can become an important complementary tool for traditional methods.
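
    As an illustration of the kind of predictive model described above, the sketch below fits an ordinary least-squares regression of PMI on the normalized transcript levels of the four quadriceps genes; the data are random placeholders, not the published measurements or coefficients.

```python
# Sketch of an OLS regression of PMI on normalized transcript levels of the four
# quadriceps genes. All training data below are random placeholders.
import numpy as np

genes = ["Actb", "Gapdh", "Ppia", "Srp72"]
rng = np.random.default_rng(1)

# placeholder training data: rows = animals/time points, columns = normalized levels
X = rng.normal(size=(30, len(genes)))
pmi_minutes = rng.uniform(0, 660, size=30)        # 0-11 h, as in the kinetic series

X1 = np.column_stack([np.ones(len(X)), X])        # add an intercept column
coef, *_ = np.linalg.lstsq(X1, pmi_minutes, rcond=None)

def predict_pmi(levels):
    """Predicted PMI (minutes) from the four normalized transcript levels."""
    return float(np.dot(np.concatenate(([1.0], levels)), coef))

print(predict_pmi(X[0]))
```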

  5. Diagnostic accuracy of post-mortem CT with targeted coronary angiography versus autopsy for coroner-requested post-mortem investigations: a prospective, masked, comparison study.

    Science.gov (United States)

    Rutty, Guy N; Morgan, Bruno; Robinson, Claire; Raj, Vimal; Pakkal, Mini; Amoroso, Jasmin; Visser, Theresa; Saunders, Sarah; Biggs, Mike; Hollingbury, Frances; McGregor, Angus; West, Kevin; Richards, Cathy; Brown, Laurence; Harrison, Rebecca; Hew, Roger

    2017-07-08

    England and Wales have one of the highest frequencies of autopsy in the world. Implementation of post-mortem CT (PMCT), enhanced with targeted coronary angiography (PMCTA), in adults to avoid invasive autopsy would have cultural, religious, and potential economic benefits. We aimed to assess the diagnostic accuracy of PMCTA as a first-line technique in post-mortem investigations. In this single-centre (Leicester, UK), prospective, controlled study, we selected cases of natural and non-suspicious unnatural death referred to Her Majesty's (HM) Coroners. We excluded cases younger than 18 years, known to have had a transmittable disease, or who weighed more than 125 kg. Each case was assessed by PMCTA, followed by autopsy. Pathologists were masked to the PMCTA findings, unless a potential risk was shown. The primary endpoint was the accuracy of the cause of death diagnosis from PMCTA against a gold standard of autopsy findings, modified by PMCTA findings only if additional substantially incontrovertible findings were identified. Between Jan 20, 2010, and Sept 13, 2012, we selected 241 cases, for which PMCTA was successful in 204 (85%). Seven cases were excluded from the analysis because of procedural unmasking or no autopsy data, as were 24 cases with a clear diagnosis of traumatic death before investigation; 210 cases were included. In 40 (19%) cases, predictable toxicology or histology testing accessible by PMCT informed the result. PMCTA provided a cause of death in 193 (92%) cases. A major discrepancy with the gold standard was noted in 12 (6%) cases identified by PMCTA, and in nine (5%) cases identified by autopsy (because of specific findings on PMCTA). The frequency of autopsy and PMCTA discrepancies were not significantly different (p=0·65 for major discrepancies and p=0·21 for minor discrepancies). Cause of death given by PMCTA did not overlook clinically significant trauma, occupational lung disease, or reportable disease, and did not significantly affect

  6. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
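
    One standard data-driven route to the Koopman spectral properties mentioned above is dynamic mode decomposition (DMD); the sketch below applies exact DMD to a toy damped oscillator. The toy system, rank handling and interpretation are illustrative assumptions, not the paper's specific model forms.

```python
# Sketch: estimating Koopman spectral properties from data with exact DMD.
# The toy damped oscillator stands in for an arbitrary observed time series.
import numpy as np

# toy trajectory of a 2-D damped oscillator sampled at interval dt
dt, n = 0.05, 400
A_true = np.array([[0.99, -0.05], [0.05, 0.99]])
X = np.empty((2, n)); X[:, 0] = [1.0, 0.0]
for k in range(1, n):
    X[:, k] = A_true @ X[:, k - 1]

# exact DMD on snapshot pairs (X0 -> X1)
X0, X1 = X[:, :-1], X[:, 1:]
U, s, Vh = np.linalg.svd(X0, full_matrices=False)
A_tilde = U.conj().T @ X1 @ Vh.conj().T @ np.diag(1.0 / s)   # reduced linear operator
eigvals, eigvecs = np.linalg.eig(A_tilde)

print("discrete-time Koopman eigenvalues:", eigvals)
print("continuous-time growth rates / frequencies:", np.log(eigvals) / dt)
```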

  7. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced...

  8. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes ontology-based attack definitions and a distributed processing scheme with an exchange of messages between agents. The role of traffic anomaly detection is presented, and it is then discussed how some specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is described how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  9. Diagnosis of drowning using post-mortem computed tomography – state of the art

    Directory of Open Access Journals (Sweden)

    Catherine Raux

    2014-12-01

    Full Text Available Aim of the study: Recent studies using post-mortem computed tomography (PMCT) have suggested this imaging modality is of value in the positive diagnosis of drowning. We summarize the data from the literature regarding the diagnostic value of CT in cases of drowning. Material and methods: We performed an all-language search of literature published from 1999 to 2013 with the key words “post-mortem CT scan”, “drowning and CT scan”, “near-drowning diagnosis”, and “drowning diagnosis”. Results: Only 11 articles, whose data enabled complementary statistical analysis, were included. The presence of fluid and sediment in the paranasal sinuses appears to be determinant for the diagnosis of drowning. The presence of fluid in the sinuses had a sensitivity of 100%, and of 90% in the trachea and main bronchi. The results were completed by the high specificity of the presence of sediment in the paranasal sinuses, upper airways and stomach, which was 100% for all three. Haemodilution was present in cases of drowning (p < 0.001). The values made it possible to formulate a decision algorithm for the diagnosis of drowning.

  10. Burned bodies: post-mortem computed tomography, an essential tool for modern forensic medicine.

    Science.gov (United States)

    Coty, J-B; Nedelcu, C; Yahya, S; Dupont, V; Rougé-Maillart, C; Verschoore, M; Ridereau Zins, C; Aubé, C

    2018-06-07

    Currently, post-mortem computed tomography (PMCT) has become an accessible and contemporary tool for forensic investigations. In the case of burn victims, it presents a specific semiology that requires careful interpretation to differentiate normal post-mortem changes from heat-related changes. The aim of this pictorial essay is to give the radiologist the keys to establishing complete and focused reports in cases of PMCT of burn victims. Thus, the radiologist must discern any contextual divergences from the forensic history, and must be able to report all the relevant elements to answer the forensic pathologist's questions: Are there tomographic features that could help to identify the victim? Is there evidence of remains of biological fluids in liquid form available for toxicological analysis and DNA sampling? Is there another obvious cause of death than heat-related lesions, especially metallic foreign bodies of ballistic origin? Finally, what are the characteristic burn-related injuries seen on the corpse that should be sought during the autopsy? • CT is highly useful to find features permitting the identification of a severely burned body. • PMCT is a major asset in gunshot injuries to depict ballistic foreign bodies in burned cadavers. • CT is able to recognise accessible blood for tests versus heat clot (air-crescent sign). • Heat-related fractures are easily differentiated from traumatic fractures. • Epidural collections with a subdural appearance are typical heat-related head lesions.

  11. Hierarchical Scheduling Framework Based on Compositional Analysis Using Uppaal

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; David, Alexandre; Kim, Jin Hyun

    2014-01-01

    This paper introduces a reconfigurable compositional scheduling framework, in which the hierarchical structure, the scheduling policies, the concrete task behavior and the shared resources can all be reconfigured. The behavior of each periodic preemptive task is given as a list of timed actions, ...

  12. Toward Solving the Problem of Problem Solving: An Analysis Framework

    Science.gov (United States)

    Roesler, Rebecca A.

    2016-01-01

    Teaching is replete with problem solving. Problem solving as a skill, however, is seldom addressed directly within music teacher education curricula, and research in music education has not examined problem solving systematically. A framework detailing problem-solving component skills would provide a needed foundation. I observed problem solving…

  13. Mediation Analysis in a Latent Growth Curve Modeling Framework

    Science.gov (United States)

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  14. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    Science.gov (United States)

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. The past approach was either formal or functional. Both of them did not pay much attention to cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  15. Comparative Analysis of Language Minorities: A Sociopolitical Framework.

    Science.gov (United States)

    Anderson, A. B.

    1990-01-01

    Synthesizes theoretical typologies in the fields of ethnic relations, ethnonationalism, and sociolinguistics into a sociopolitical framework for analyzing various types of ethnolinguistic minority situations. Particular reference is made to minority situations in Europe, North America, and developing countries. (35 references) (Author/CB)

  16. Design and Analysis of a Service Migration Framework

    DEFF Research Database (Denmark)

    Saeed, Aamir; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2013-01-01

    on another device. For such a need, an architecture is proposed to design and develop applications that migrate from one device to another and resume its operation. A simple application was constructed based on the proposed framework. Experiments were carried out to demonstrate its applicability...

  17. Differential Nuclear and Mitochondrial DNA Preservation in Post-Mortem Teeth with Implications for Forensic and Ancient DNA Studies

    Science.gov (United States)

    Higgins, Denice; Rohrlach, Adam B.; Kaidonis, John; Townsend, Grant; Austin, Jeremy J.

    2015-01-01

    Major advances in genetic analysis of skeletal remains have been made over the last decade, primarily due to improvements in post-DNA-extraction techniques. Despite this, a key challenge for DNA analysis of skeletal remains is the limited yield of DNA recovered from these poorly preserved samples. Enhanced DNA recovery by improved sampling and extraction techniques would allow further advancements. However, little is known about the post-mortem kinetics of DNA degradation and whether the rate of degradation varies between nuclear and mitochondrial DNA or across different skeletal tissues. This knowledge, along with information regarding ante-mortem DNA distribution within skeletal elements, would inform sampling protocols facilitating development of improved extraction processes. Here we present a combined genetic and histological examination of DNA content and rates of DNA degradation in the different tooth tissues of 150 human molars over short-medium post-mortem intervals. DNA was extracted from coronal dentine, root dentine, cementum and pulp of 114 teeth via a silica column method and the remaining 36 teeth were examined histologically. Real time quantification assays based on two nuclear DNA fragments (67 bp and 156 bp) and one mitochondrial DNA fragment (77 bp) showed nuclear and mitochondrial DNA degraded exponentially, but at different rates, depending on post-mortem interval and soil temperature. In contrast to previous studies, we identified differential survival of nuclear and mtDNA in different tooth tissues. Furthermore, histological examination showed pulp and dentine were rapidly affected by loss of structural integrity, and pulp was completely destroyed in a relatively short time period. Conversely, cementum showed little structural change over the same time period. Finally, we confirm that targeted sampling of cementum from teeth buried for up to 16 months can provide a reliable source of nuclear DNA for STR-based genotyping using standard
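
    Since the abstract reports exponential DNA decay with post-mortem interval, the sketch below shows the corresponding log-linear fit; the copy numbers and time points are entirely hypothetical and only illustrate the regression involved, not the study's data.

        import numpy as np

        # Hypothetical qPCR quantities (target copies per mg tissue) at several PMIs (days)
        pmi_days = np.array([7, 30, 90, 180, 360, 480], dtype=float)
        copies = np.array([9800, 7100, 3900, 1500, 420, 160], dtype=float)

        # Model: N(t) = N0 * exp(-k * t)  ->  log N is linear in t
        slope, intercept = np.polyfit(pmi_days, np.log(copies), 1)
        k, N0 = -slope, np.exp(intercept)
        print(f"decay rate k = {k:.4f} per day, N0 = {N0:.0f} copies/mg")
        print(f"half-life = {np.log(2) / k:.0f} days")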

  18. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants

  19. Post-mortem MRI of the foetal spine and spinal cord

    International Nuclear Information System (INIS)

    Widjaja, E.; Whitby, E.H.; Cohen, M.; Paley, M.N.J.; Griffiths, P.D.

    2006-01-01

    Aims: To compare the findings of post-mortem magnetic resonance imaging (MRI) of the foetal spine with autopsy with a view to using post-mortem MRI as an alternative or adjunct to autopsy, particularly in foetal and neonatal cases. Materials and Methods: The brains and spines of 41 foetuses, with a gestational age range of 14-41 weeks, underwent post-mortem MRI before autopsy. Post-mortem MRI of the brain consisted of T2-weighted sequences in three orthogonal planes and MRI of the spine consisted of T2-weighted sequence in the sagittal and axial planes in all cases and coronal planes in selected cases. Results: Thirty of 41 (78%) foetal spines were found to be normal at autopsy and on post-mortem MRI. Eleven of 41 (22%) foetal spines were abnormal: eight foetuses had myelomeningocoeles and Chiari 2 deformities, one foetus had limited dorsal myeloschisis, one foetus had caudal regression syndrome, and one had diastematomyelia. The post-mortem MRI findings concurred with the autopsy findings in 10/11 of the abnormal cases, the disagreement being the case of diastematomyelia that was shown on post-mortem MRI but was not diagnosed at autopsy. Conclusions: In this series, post-mortem MRI findings agreed with the autopsy findings in 40/41(98%) cases and in one case the post-mortem MRI demonstrated an abnormality not demonstrated at autopsy

  20. Preliminary study of post mortem identification using lip prints.

    Science.gov (United States)

    Utsuno, Hajime; Kanoh, Takashi; Tadokoro, Osamu; Inoue, Katsuhiro

    2005-05-10

    Identification using lip prints was first performed in the 1950s and was the subject of much research in the 1960s and 70s, leading to the acceptance of this technique as evidence in the criminal justice system. Previous research has focused on identifying lip print types or on methods of obtaining hidden lip prints left at the crime scene. The present study aimed to clarify characteristics of lip prints from cadavers with various causes of death (including drowning and hanging) and to determine the effects of fixation on post mortem changes in lip impressions.

  1. Making post-mortem implantable cardioverter defibrillator explantation safe

    DEFF Research Database (Denmark)

    Räder, Sune B E W; Zeijlemaker, Volkert; Pehrson, Steen

    2009-01-01

    AIMS: The aim of this study is to investigate whether protection with rubber or plastic gloves during post-mortem explantation of an implantable cardioverter defibrillator (ICD) offers enough protection for the explanting operator during a worst-case scenario (i.e. ICD shock). METHODS AND RESULTS: ... that the resting voltage over the operating person would not exceed 50 V. CONCLUSION: The use of intact medical gloves made of latex, neoprene, or plastic eliminates the potential electrical risk during explantation of an ICD. Two gloves on each hand offer sufficient protection. We will recommend the use...

  2. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
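
    As a hedged illustration of the AHP step only (the criteria and pairwise judgments below are invented, not taken from the paper), priority weights for framework requirements follow from the principal eigenvector of a pairwise comparison matrix:

        import numpy as np

        # Hypothetical pairwise comparison of three MDO-framework criteria
        # (e.g. integration of analysis codes, GUI usability, DBMS support)
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        i = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, i].real)
        w /= w.sum()                               # AHP priority vector

        n = A.shape[0]
        ci = (eigvals[i].real - n) / (n - 1)       # consistency index
        print("weights:", w, "CI:", ci)

    In the paper these AHP weights feed into a QFD matrix that maps requirements to framework solutions; that mapping is not reproduced here.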

  3. A Statistical Framework for the Functional Analysis of Metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.

    2008-10-01

    Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. They present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
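
    A much-simplified sketch of the kind of estimate involved is shown below: under Lander-Waterman-style assumptions, reads hit a gene family roughly in proportion to its abundance times its length, so relative frequencies follow from length normalization, with Poisson counting error as a crude reliability measure. The numbers are hypothetical and this is not the authors' full statistical framework.

        import numpy as np

        # Hypothetical read counts assigned to gene families and their average lengths (bp)
        counts = np.array([120, 45, 300, 15], dtype=float)
        lengths = np.array([900, 450, 1500, 600], dtype=float)

        # Length-normalized abundance (reads assumed to land uniformly along genes,
        # as in Lander-Waterman style models of shotgun sequencing)
        rate = counts / lengths
        freq = rate / rate.sum()

        # Poisson standard error of each count as a rough reliability indicator;
        # families with a large relative error could be flagged as unreliable
        rel_err = np.sqrt(counts) / counts
        print("estimated frequencies:", freq)
        print("relative errors:", rel_err)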

  4. Towards an oral healthcare framework and policy analysis for Swaziland

    OpenAIRE

    Mndzebele, Samuel

    2010-01-01

    Background and Rationale: A synopsis by the researcher suggested that caries was becoming a public health problem among the youth, hence there was a need for deeper investigations which would lead to possible oral health interventions. Purpose: The purpose of the study was to assess dental care practices and experiences among teenagers in the Northern region of Swaziland. Based on the outcomes and views from health professionals; develop a framework for oral healthcare delivery and ...

  5. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

    UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and the subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in the standard analysis pipeline operation. This functionality also makes it possible to use them for remote operation over the network. A framework for running UniSampo and Shaman analyses using the standard WWW protocol has been developed. A WWW server receives requests from the client WWW browser and runs the analysis software via a set of CGI scripts. Authentication, input data transfer, and output and display of the final analysis results are all carried out using standard WWW mechanisms. This WWW framework can be utilized, for example, by organizations that have radioactivity surveillance stations over a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)
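
    A minimal sketch of the pattern described (an HTTP front end that stores an uploaded spectrum and invokes the analysis software non-interactively) is shown below; the program name, flags and paths are hypothetical, since the actual CGI scripts and UniSampo/Shaman invocation are not detailed in the abstract.

        import subprocess
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class AnalysisHandler(BaseHTTPRequestHandler):
            def do_POST(self):
                # Receive the uploaded spectrum (authentication omitted for brevity)
                length = int(self.headers.get("Content-Length", 0))
                with open("/tmp/uploaded.spe", "wb") as f:
                    f.write(self.rfile.read(length))
                # Run the analysis in batch/non-interactive mode, as in pipeline operation
                # ("analyse_spectrum" is a placeholder, not the real executable name)
                result = subprocess.run(["analyse_spectrum", "--batch", "/tmp/uploaded.spe"],
                                        capture_output=True, text=True)
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(result.stdout.encode())

        if __name__ == "__main__":
            HTTPServer(("", 8080), AnalysisHandler).serve_forever()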

  6. Globalization and health: a framework for analysis and action.

    Science.gov (United States)

    Woodward, D.; Drager, N.; Beaglehole, R.; Lipson, D.

    2001-01-01

    Globalization is a key challenge to public health, especially in developing countries, but the linkages between globalization and health are complex. Although a growing amount of literature has appeared on the subject, it is piecemeal, and suffers from a lack of an agreed framework for assessing the direct and indirect health effects of different aspects of globalization. This paper presents a conceptual framework for the linkages between economic globalization and health, with the intention that it will serve as a basis for synthesizing existing relevant literature, identifying gaps in knowledge, and ultimately developing national and international policies more favourable to health. The framework encompasses both the indirect effects on health, operating through the national economy, household economies and health-related sectors such as water, sanitation and education, as well as more direct effects on population-level and individual risk factors for health and on the health care system. Proposed also is a set of broad objectives for a programme of action to optimize the health effects of economic globalization. The paper concludes by identifying priorities for research corresponding with the five linkages identified as critical to the effects of globalization on health. PMID:11584737

  7. Post mortem analysis of fatigue mechanisms in LiNi0.8Co0.15Al0.05O2 - LiNi0.5Co0.2Mn0.3O2 - LiMn2O4/graphite lithium ion batteries

    Science.gov (United States)

    Lang, Michael; Darma, Mariyam Susana Dewi; Kleiner, Karin; Riekehr, Lars; Mereacre, Liuda; Ávila Pérez, Marta; Liebau, Verena; Ehrenberg, Helmut

    2016-09-01

    The fatigue of commercial lithium ion batteries after long-term cycling at two different temperatures and cycling rates is investigated. The cells are opened after cycling and post-mortem analyses are conducted. Two main contributions to the capacity loss of the batteries are revealed. The loss of active lithium leads to a relative shift between the anode and cathode potentials. Growth of the solid electrolyte interface (SEI) on the anode is determined, as well as the formation of lithium fluoride species as an electrolyte decomposition product. Those effects are reinforced by increasing cycling rates from 1C/2C (charge/discharge) to 2C/3C as well as by increasing cycling temperatures from 25 °C to 40 °C. The other contribution to the capacity loss originates from fatigue of the blended cathodes consisting of LiNi0.5Co0.2Mn0.3O2 (NCM), LiNi0.8Co0.15Al0.05O2 (NCA) and LiMn2O4 (LMO). Phase-specific capacity losses and fatigue mechanisms are identified. The layered oxides tend to form microcracks and reveal changes of the surface structure, leading to a worsening of the lithium kinetics. The cathode exhibits a loss of manganese at 40 °C cycling temperature. Cycling at 40 °C instead of 25 °C has the major impact on the cathode capacity loss, while cycling at 2C/3C rates barely influences it.

  8. Influence of Post-Mortem Sperm Recovery Method and Extender on Unstored and Refrigerated Rooster Sperm Variables.

    Science.gov (United States)

    Villaverde-Morcillo, S; Esteso, M C; Castaño, C; Santiago-Moreno, J

    2016-02-01

    Many post-mortem sperm collection techniques have been described for mammalian species, but their use in birds is scarce. This paper compares the efficacy of two post-mortem sperm retrieval techniques - the flushing and float-out methods - in the collection of rooster sperm, in conjunction with the use of two extenders, i.e., L&R-84 medium and Lake 7.1 medium. To determine whether the protective effects of these extenders against refrigeration are different for post-mortem and ejaculated sperm, pooled ejaculated samples (procured via the massage technique) were also diluted in the above extenders. Post-mortem and ejaculated sperm variables were assessed immediately at room temperature (0 h), and after refrigeration at 5°C for 24 and 48 h. The flushing method retrieved more sperm than the float-out method (596.5 ± 75.4 million sperm vs 341.0 ± 87.6 million sperm; p < 0.05); indeed, the number retrieved by the former method was similar to that obtained by massage-induced ejaculation (630.3 ± 78.2 million sperm). For sperm collected by all methods, the L&R-84 medium provided an advantage in terms of sperm motility variables at 0 h. In the refrigerated sperm samples, however, the Lake 7.1 medium was associated with higher percentages of viable sperm, and had a greater protective effect (p < 0.05) with respect to most motility variables. In conclusion, the flushing method is recommended for collecting sperm from dead birds. If this sperm needs to be refrigerated at 5°C until analysis, Lake 7.1 medium is recommended as an extender. © 2015 Blackwell Verlag GmbH.

  9. Changes of microbial spoilage, lipid-protein oxidation and physicochemical properties during post mortem refrigerated storage of goat meat.

    Science.gov (United States)

    Sabow, Azad Behnan; Sazili, Awis Qurni; Aghwan, Zeiad Amjad; Zulkifli, Idrus; Goh, Yong Meng; Ab Kadir, Mohd Zainal Abidin; Nakyinsige, Khadijah; Kaka, Ubedullah; Adeyemi, Kazeem Dauda

    2016-06-01

    The effect of post mortem refrigerated storage on microbial spoilage, lipid-protein oxidation and physicochemical traits of goat meat was examined. Seven Boer bucks were slaughtered, eviscerated and aged for 24 h. The Longissimus lumborum (LL) and Semitendinosus (ST) muscles were excised and subjected to 13 days post mortem refrigerated storage. The pH, lipid and protein oxidation, tenderness, color and drip loss were determined in LL while microbiological analysis was performed on ST. Bacterial counts generally increased with increasing aging time and the limit for fresh meat was reached at day 14 post mortem. Significant differences were observed in malondialdehyde (MDA) content at day 7 of storage. The thiol concentration significantly reduced as aging time increased. The band intensities of myosin heavy chain (MHC) and troponin-T significantly decreased as storage progressed, while actin remained relatively stable. After 14 days of aging, tenderness showed significant improvement while muscle pH and drip loss reduced with increase in storage time. Samples aged for 14 days had higher lightness (P < 0.05) … goat meat. © 2016 Japanese Society of Animal Science.

  10. Usefulness of post mortem computed tomography versus conventional forensic autopsy of road accident victims (drivers and passengers).

    Science.gov (United States)

    Moskała, Artur; Woźniak, Krzysztof; Kluza, Piotr; Romaszko, Karol; Lopatin, Oleksiy

    2017-01-01

    Aim of the study: Deaths of in-vehicle victims (drivers and passengers) of road accidents represent a significant group of issues addressed by forensic medicine. Expressing opinions in this regard involves first of all the determination of the cause of death and the forensic pathologist's participation in the process of road accident reconstruction through defining the mechanism of bodily harm. The scope of the opinion as well as its accuracy and degree of detail largely depend on the scope of forensic autopsy. In this context, techniques that broaden the capabilities of standard autopsy are of particular importance. This paper compares the results of post mortem computed tomography (PMCT) of road accident victims (drivers and passengers) against the results of standard examination in order to determine the scope to which PMCT significantly enhances autopsy capabilities. Material and methods: The analysis covers 118 in-vehicle victims (drivers and passengers) examined from 2012 to 2014. In each case, post-mortem examination was preceded by PMCT examination using Somatom Emotion 16 (Siemens AG, Germany). Results: The results are presented in a tabular form. Conclusions: In most road accident victims (drivers and passengers), post mortem computed tomography significantly increases the results' degree of detail, particularly with regard to injuries of bones and gas collections.

  11. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  12. Integrating Poverty and Environmental Concerns into Value-Chain Analysis: A Strategic Framework and Practical Guide

    DEFF Research Database (Denmark)

    Riisgaard, Lone; Bolwig, Simon; Ponte, Stefano

    2010-01-01

    This article aims to guide the design and implementation of action-research projects in value-chain analysis by presenting a strategic framework focused on small producers and trading and processing firms in developing countries. Its stepwise approach – building on the conceptual framework set out...... purpose of increasing the rewards and/or reducing the risks....

  13. Ethical considerations in forensic genetics research on tissue samples collected post-mortem in Cape Town, South Africa.

    Science.gov (United States)

    Heathfield, Laura J; Maistry, Sairita; Martin, Lorna J; Ramesar, Raj; de Vries, Jantina

    2017-11-29

    The use of tissue collected at a forensic post-mortem for forensic genetics research purposes remains of ethical concern as the process involves obtaining informed consent from grieving family members. Two forensic genetics research studies using tissue collected from a forensic post-mortem were recently initiated at our institution and were the first of their kind to be conducted in Cape Town, South Africa. This article discusses some of the ethical challenges that were encountered in these research projects. Among these challenges was the adaptation of research workflows to fit in with an exceptionally busy service delivery that is operating with limited resources. Whilst seeking guidance from the literature regarding research on deceased populations, it was noted that next of kin of decedents are not formally recognised as a vulnerable group in the existing ethical and legal frameworks in South Africa. The authors recommend that research in the forensic mortuary setting is approached using guidance for vulnerable groups, and the benefit to risk standard needs to be strongly justified. Lastly, when planning forensic genetics research, consideration must be given to the potential of uncovering incidental findings, funding to validate these findings and the feedback of results to family members; the latter of which is recommended to occur through a genetic counsellor. It is hoped that these experiences will contribute towards a formal framework for conducting forensic genetic research in medico-legal mortuaries in South Africa.

  14. Method for modeling post-mortem biometric 3D fingerprints

    Science.gov (United States)

    Rajeev, Srijith; Shreyas, Kamath K. M.; Agaian, Sos S.

    2016-05-01

    Despite the advancements of fingerprint recognition in the 2-D and 3-D domains, authenticating deformed/post-mortem fingerprints continues to be an important challenge. Prior cleansing and reconditioning of the deceased finger is required before acquisition of the fingerprint. The victim's finger needs to be handled precisely and carefully to record the fingerprint impression. This process may damage the structure of the finger, which subsequently leads to higher false rejection rates. This paper proposes a non-invasive method to perform 3-D deformed/post-mortem finger modeling, which produces a 2-D rolled equivalent fingerprint for automated verification. The presented novel modeling method involves masking, filtering, and unrolling. Computer simulations were conducted on finger models with different depth variations obtained from Flashscan3D LLC. Results illustrate that the modeling scheme provides a viable 2-D fingerprint of deformed models for automated verification. The quality and adaptability of the obtained unrolled 2-D fingerprints were analyzed using NIST fingerprint software. Eventually, the presented method could be extended to other biometric traits such as the palm, foot, and tongue for security and administrative applications.
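
    As a hedged illustration of the unrolling idea only (a simplified cylindrical unwrapping, not the authors' masking/filtering/unrolling pipeline), a 3-D finger surface point cloud can be mapped to a 2-D rolled-equivalent image as follows:

        import numpy as np

        def unroll_cylindrical(points, width=512, height=512):
            """Map a 3-D finger surface point cloud (x, y, z, intensity) to a 2-D image
            by unwrapping the angular coordinate around the finger axis (taken as z)."""
            x, y, z, val = points.T
            theta = np.arctan2(y - y.mean(), x - x.mean())              # angle around axis
            u = ((theta + np.pi) / (2 * np.pi) * (width - 1)).astype(int)
            v = ((z - z.min()) / (np.ptp(z) + 1e-9) * (height - 1)).astype(int)
            img = np.zeros((height, width))
            img[v, u] = val                                             # nearest-point splat
            return img

        # Usage with a synthetic half-cylinder "finger" carrying a fake ridge pattern
        t = np.random.rand(20000) * np.pi
        z = np.random.rand(20000) * 30.0
        pts = np.column_stack([np.cos(t), np.sin(t), z, np.sin(12 * t)])
        image = unroll_cylindrical(pts)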

  15. Barriers to renewable energy penetration. A framework for analysis

    DEFF Research Database (Denmark)

    Painuly, Jyoti P.

    2001-01-01

    Renewable energy has the potential to play an important role in providing energy with sustainability to the vast populations in developing countries who as yet have no access to clean energy. Although economically viable fur several applications, renewable energy has not been able to realise its...... potential due to several barriers to its penetration. A framework has been developed in this paper to identify the barriers to renewable energy penetration acid to suggest measures to overcome them. (C) 2001 Elsevier Science Ltd. All rights reserved....

  16. iterClust: a statistical framework for iterative clustering analysis.

    Science.gov (United States)

    Ding, Hongxu; Wang, Wanxin; Califano, Andrea

    2018-03-22

    In a scenario where populations A, B1 and B2 (subpopulations of B) exist, pronounced differences between A and B may mask subtle differences between B1 and B2. Here we present iterClust, an iterative clustering framework, which can separate more pronounced differences (e.g. A and B) in starting iterations, followed by relatively subtle differences (e.g. B1 and B2), providing a comprehensive clustering trajectory. iterClust is implemented as a Bioconductor R package. andrea.califano@columbia.edu, hd2326@columbia.edu. Supplementary information is available at Bioinformatics online.
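
    iterClust itself is an R/Bioconductor package; the sketch below only illustrates the iterative idea (split on pronounced differences first, then re-cluster within each supported cluster) in Python and is not its API.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        def iterative_clusters(X, depth=2, min_size=20, seed=0):
            """Recursively split the data; returns the clustering trajectory."""
            labels = np.zeros(len(X), dtype=int)
            trajectory = [labels.copy()]
            groups = [np.arange(len(X))]
            for _ in range(depth):
                new_groups, next_label = [], 0
                for idx in groups:
                    split_ok = False
                    if len(idx) >= min_size:
                        km = KMeans(n_clusters=2, n_init=10, random_state=seed).fit(X[idx])
                        # keep the split only if it is supported by the data
                        split_ok = silhouette_score(X[idx], km.labels_) > 0.25
                    if split_ok:
                        for k in (0, 1):
                            sub = idx[km.labels_ == k]
                            labels[sub] = next_label
                            next_label += 1
                            new_groups.append(sub)
                    else:
                        labels[idx] = next_label
                        next_label += 1
                        new_groups.append(idx)
                groups = new_groups
                trajectory.append(labels.copy())
            return trajectory

        # A-vs-B structure with B1/B2 substructure, as in the abstract's scenario
        X = np.vstack([np.random.randn(100, 5) + c for c in (0, 4, 6)])
        coarse, fine = iterative_clusters(X, depth=2)[1:]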

  17. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  18. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  19. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    Science.gov (United States)

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the framework cemented on prepared abutments and laser welding techniques (P < .05). Laser welding and framework cemented on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.

  20. Framework for applying probabilistic safety analysis in nuclear regulation

    International Nuclear Information System (INIS)

    Dimitrijevic, V.B.

    1997-01-01

    The traditional regulatory framework has served well to assure the protection of public health and safety. It has been recognized, however, that in a few circumstances this deterministic framework has led to extensive expenditure on matters that have little to do with the safe and reliable operation of the plant. The development of plant-specific PSAs has offered a new and powerful analytical tool for evaluating the safety of the plant. Using PSA insights as an aid to decision making in the regulatory process is now known as 'risk-based' or 'risk-informed' regulation. Numerous activities in the U.S. nuclear industry are focusing on applying this new approach to modify regulatory requirements. In addition, other approaches to regulation are in the developmental phase and are being evaluated. One is based on performance monitoring and results and is known as performance-based regulation. The other, called the blended approach, combines traditional deterministic principles with PSA insights and performance results. (author)

  1. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a Python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
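
    A minimal numpy illustration of the FOSM machinery underlying such analyses is given below; this is generic linear-Bayes algebra, not pyEMU's API, and all matrices are toy values.

        import numpy as np

        def fosm_posterior(J, C_p, C_o):
            """First-order, second-moment posterior parameter covariance:
            C_post = C_p - C_p J^T (J C_p J^T + C_o)^-1 J C_p,
            where J is the Jacobian of outputs w.r.t. parameters, C_p the prior
            parameter covariance and C_o the observation noise covariance."""
            S = J @ C_p @ J.T + C_o
            gain = C_p @ J.T @ np.linalg.inv(S)
            return C_p - gain @ J @ C_p

        # Toy example: 3 parameters, 5 observations (values are illustrative only)
        rng = np.random.default_rng(0)
        J = rng.normal(size=(5, 3))
        C_p = np.diag([1.0, 0.5, 2.0])          # prior parameter uncertainty
        C_o = 0.1 * np.eye(5)                   # observation noise
        C_post = fosm_posterior(J, C_p, C_o)

        # "Data worth" flavour: how much a forecast variance drops once data are used
        s_f = np.array([0.2, -1.0, 0.4])        # sensitivity of a forecast to parameters
        print(s_f @ C_p @ s_f, s_f @ C_post @ s_f)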

  2. A Demonstrative Analysis of News Articles Using Fairclough’s Critical Discourse Analysis Framework

    Directory of Open Access Journals (Sweden)

    Roy Randy Y. Briones

    2017-07-01

    Full Text Available This paper attempts to demonstrate Norman Fairclough’s Critical Discourse Analysis (CDA framework by conducting internal and external level analyses on two online news articles that report on the Moro Islamic Liberation Front’s (MILF submission of its findings on the “Mamasapano Incident” that happened in the Philippines in 2015. In performing analyses using this framework, the social context and background for these texts, as well as the relationship between the internal discourse features and the external social practices and structures in which the texts were produced are thoroughly examined. As a result, it can be noted that from the texts’ internal discourse features, the news articles portray ideological and social distinctions among social actors such as the Philippine Senate, the SAF troopers, the MILF, the MILF fighters, and the civilians. Moreover, from the viewpoint of the texts as being external social practices, the texts maintain institutional identities as news reports, but they also reveal some evaluative stance as exemplified by the adjectival phrases that the writers employed. Having both the internal and external features examined, it can be said that the way these texts were written seems to portray power relations that exist between the Philippine government and the MILF. Key words: Critical Discourse Analysis, discourse analysis, news articles, social practices, social structures, power relations

  3. Benchmarking of Modern Data Analysis Tools for a 2nd generation Transient Data Analysis Framework

    CERN Document Server

    Goncalves, Nuno

    2016-01-01

    During the past year of operating the Large Hadron Collider (LHC), the amount of transient accelerator data to be persisted and analysed has been steadily growing. Since the startup of the LHC in 2006, the weekly data storage requirements exceeded what the system was initially designed to accommodate in a full year of operation. Moreover, it is predicted that the data acquisition rates will continue to increase in the future, due to foreseen improvements in the infrastructure within the scope of the High Luminosity LHC project. Despite the efforts for improving and optimizing the current data storage infrastructures (CERN Accelerator Logging Service and Post Mortem database), some limitations still persist and require a different approach in order to scale up and provide efficient services for future machine upgrades. This project aims to explore one of the novel solutions proposed to solve the problem of working with large datasets. The configuration is composed of Spark for ...
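
    A minimal PySpark sketch of the kind of workload such a benchmark would exercise is shown below; the file paths and column names are hypothetical, not the actual logging schema.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("transient-data-benchmark").getOrCreate()

        # Hypothetical snapshot of transient data: (device, timestamp, value) in Parquet
        df = spark.read.parquet("hdfs:///accelerator/transient/2016/*.parquet")

        # Typical analysis-style query: per-device statistics over a time window
        stats = (df.filter(F.col("timestamp").between("2016-05-01", "2016-05-08"))
                   .groupBy("device")
                   .agg(F.count("*").alias("samples"),
                        F.avg("value").alias("mean"),
                        F.max("value").alias("peak")))
        stats.write.mode("overwrite").parquet("hdfs:///accelerator/benchmarks/device_stats")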

  4. Histological transformations of the dental pulp as possible indicator of post mortem interval: a pilot study.

    Science.gov (United States)

    Carrasco, Patricio A; Brizuela, Claudia I; Rodriguez, Ismael A; Muñoz, Samuel; Godoy, Marianela E; Inostroza, Carolina

    2017-10-01

    The correct estimation of the post mortem interval (PMI) can be crucial to the success of a forensic investigation. Diverse methods have been used to estimate PMI, considering physical changes that occur after death, such as algor mortis and livor mortis, among others. Degradation of dental pulp after death is a complex process that has not yet been studied thoroughly. It has been described that pulp RNA degradation could be an indicator of PMI; however, such studies have been limited to 6 days. The tooth is the hardest organ of the human body, and the dental pulp is confined within it. The pulp morphology is defined as a lax connective tissue with great sensory innervation, abundant microcirculation and an abundance of different cell types. The aim of this study is to describe the potential use of post mortem pulp alterations to estimate PMI, using a new methodology that allows pulp tissue to be obtained for histomorphological analysis. The current study will identify potential histological indicators in dental pulp tissue to estimate PMI in time intervals of 24 h, 1 month, 3 months and 6 months. This study used 26 teeth from individuals with known PMIs of 24 h, 1 month, 3 months or 6 months. All samples were manipulated with the new methodology (Carrasco, P. and Inostroza C., inventors; Universidad de los Andes, assignee. Forensic identification, post mortem interval estimation and cause of death determination by recovery of dental tissue. United States patent US 61/826,558 23.05.2013) to extract pulp tissue without the destruction of the tooth. The dental pulp tissues obtained were fixed in formalin for the subsequent generation of histological sections, stained with Hematoxylin Eosin and Masson's Trichrome. All sections were observed under an optical microscope using magnifications of 10× and 40×. The microscopic analysis of the samples showed a progressive transformation of the cellular components and fibers of dental pulp along the PMI. These results allowed creating a

  5. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks including the potential cascading effects that may result due to these interdependencies. This paper first describes infrastructure interdependencies as well as presenting a formalization of interdependency types. Next the paper describes a modeling and simulation framework called CIMS© and the work that is being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  6. Usefulness of post-mortem ophthalmological endoscopy during forensic autopsy: a case report.

    Science.gov (United States)

    Tsujinaka, Masatake; Akaza, Kayoko; Nagai, Atsushi; Nakamura, Isao; Bunai, Yasuo

    2005-01-01

    Post-mortem intraocular findings in two autopsy cases with traumatic intracranial haemorrhage were obtained using an ophthalmological endoscope. The endoscopy results clearly revealed the presence of intraocular haemorrhages and papilledema caused by intracranial haemorrhage. Post-mortem ophthalmological endoscopy offers several benefits. First, post-mortem intraocular findings can be directly observed in corpses with post-mortem clouding of the cornea. Secondly, the endoscopy only requires a 0.9 mm incision in the sclera and does not require the removal of the eye from the corpse, a procedure that should be avoided for ethical and cosmetic reasons. Thus, post-mortem ophthalmological endoscopy is a useful method for obtaining intraocular findings in autopsies.

  7. An in-depth analysis of theoretical frameworks for the study of care coordination

    Directory of Open Access Journals (Sweden)

    Sabine Van Houdt

    2013-06-01

    Full Text Available Introduction: Complex chronic conditions often require long-term care from various healthcare professionals. Thus, maintaining quality care requires care coordination. Concepts for the study of care coordination require clarification to develop, study and evaluate coordination strategies. In 2007, the Agency for Healthcare Research and Quality defined care coordination and proposed five theoretical frameworks for exploring care coordination. This study aimed to update current theoretical frameworks and clarify key concepts related to care coordination. Methods: We performed a literature review to update existing theoretical frameworks. An in-depth analysis of these theoretical frameworks was conducted to formulate key concepts related to care coordination. Results: Our literature review found seven previously unidentified theoretical frameworks for studying care coordination. The in-depth analysis identified fourteen key concepts that the theoretical frameworks addressed. These were 'external factors', 'structure', 'task characteristics', 'cultural factors', 'knowledge and technology', 'need for coordination', 'administrative operational processes', 'exchange of information', 'goals', 'roles', 'quality of relationship', 'patient outcome', 'team outcome', and '(inter)organizational outcome'. Conclusion: These 14 interrelated key concepts provide a base to develop or choose a framework for studying care coordination. The relational coordination theory and the multi-level framework are interesting as these are the most comprehensive.

  8. A novel joint analysis framework improves identification of differentially expressed genes in cross disease transcriptomic analysis

    Directory of Open Access Journals (Sweden)

    Wenyi Qin

    2018-02-01

    Full Text Available Abstract Motivation: Detecting differentially expressed (DE) genes between disease and normal control groups is one of the most common analyses of genome-wide transcriptomic data. Since most studies have relatively few samples, researchers have used meta-analysis to group different datasets for the same disease. Even then, in many cases the statistical power is still not enough. Taking into account the fact that many diseases share the same disease genes, it is desirable to design a statistical framework that can identify diseases' common and specific DE genes simultaneously to improve the identification power. Results: We developed a novel empirical Bayes based mixture model to identify DE genes in a specific study by leveraging the shared information across multiple disease expression data sets. The effectiveness of joint analysis was demonstrated through comprehensive simulation studies and two real data applications. The simulation results showed that our method consistently outperformed single data set analysis and two other meta-analysis methods in identification power. In real data analysis, overall our method demonstrated better identification power in detecting DE genes and prioritized more disease-related genes and disease-related pathways than single data set analysis. Over 150% more disease-related genes are identified by our method in application to Huntington's disease. We expect that our method would provide researchers a new way of utilizing available data sets from different diseases when the sample size of the focused disease is limited.
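
    As a conceptual illustration only (a toy Gaussian model over per-study z-scores, not the authors' empirical Bayes mixture model), borrowing strength across two diseases can be framed as a posterior over "DE in neither / only one / both" configurations:

        import numpy as np
        from scipy.stats import norm

        def joint_de_posterior(z1, z2, tau=2.0, priors=(0.7, 0.1, 0.1, 0.1)):
            """Posterior probability that a gene is DE in disease 1, borrowing
            strength from disease 2.  Null z-scores ~ N(0, 1); DE z-scores are
            modelled as N(0, 1 + tau^2).  Priors are over the four configurations
            (neither, only disease 1, only disease 2, both)."""
            f0 = lambda z: norm.pdf(z, scale=1.0)
            f1 = lambda z: norm.pdf(z, scale=np.sqrt(1.0 + tau ** 2))
            lik = np.array([f0(z1) * f0(z2),     # DE in neither
                            f1(z1) * f0(z2),     # DE only in disease 1
                            f0(z1) * f1(z2),     # DE only in disease 2
                            f1(z1) * f1(z2)])    # DE in both
            post = np.asarray(priors)[:, None] * lik
            post /= post.sum(axis=0)
            return post[1] + post[3]             # P(DE in disease 1 | z1, z2)

        z1 = np.array([0.3, 2.8, 3.1])
        z2 = np.array([0.1, 0.2, 3.4])
        print(joint_de_posterior(z1, z2))        # the gene DE in both gets the highest score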

  9. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    Science.gov (United States)

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as well

  10. Post mortem analysis of a JET quartz microbalance system

    Energy Technology Data Exchange (ETDEWEB)

    Esser, H.G. [Association EURATOM-Forschungszentrum Juelich, IPP, D-52425, Juelich (Germany)]. E-mail: h.g.esser@fz-juelich.de; Philipps, V. [Association EURATOM-Forschungszentrum Juelich, IPP, D-52425, Juelich (Germany); Wienhold, P. [Association EURATOM-Forschungszentrum Juelich, IPP, D-52425, Juelich (Germany); Sugiyama, K. [Department of Nuclear Engineering, Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Kreter, A. [Association EURATOM-Forschungszentrum Juelich, IPP, D-52425, Juelich (Germany); Coad, J.P. [UKAEA/EURATOM Fusion Association, Culham Science Centre, Abingdon (United Kingdom); Tanabe, T. [Department of Nuclear Engineering, Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan)

    2007-06-15

    In the year 2001, a quartz microbalance system (QMB) was installed in the remote area of the inner JET divertor to investigate in situ material erosion and redeposition processes. When removed in 2004, the system was found to be coated all over with carbon deposits. The deposit on the quartz oscillator and the outer and inner housing was analysed by various methods, such as SIMS (secondary ion mass spectroscopy), stylus depth profilometry, EPMA (electron probe microanalysis), TIPT (tritium imaging plate technique) and colorimetry, and compared to the frequency change of the quartz. The layer thickness was determined to be 1.85 ± 0.1 μm on average over an area of 0.95 cm², which has to be related to the equivalent of 1.77 x 10⁻⁴ g measured from the frequency change of 23 640 Hz. This corresponds to a carbon areal density of 9.3 x 10¹⁸ C atoms/cm². Significant deposition was also found on the surfaces inside the QMB housing, which can only be understood if reflection and low sticking are assumed for a high fraction of particles.
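
    The reported areal density follows directly from the deposited mass, the molar mass of carbon and the quartz area; a quick check (assuming a pure carbon deposit):

        AVOGADRO = 6.022e23      # atoms/mol
        M_CARBON = 12.011        # g/mol

        mass_g = 1.77e-4         # deposit mass inferred from the 23 640 Hz frequency shift
        area_cm2 = 0.95          # analysed quartz area

        atoms_per_cm2 = mass_g / M_CARBON * AVOGADRO / area_cm2
        print(f"{atoms_per_cm2:.2e} C atoms/cm^2")   # ~9.3e18, matching the reported value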

  11. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators...... are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  12. The Macroeconomic Framework of Support Analysis for Sustainable Businesses Development

    Directory of Open Access Journals (Sweden)

    Constantin Mitrut

    2015-08-01

    Full Text Available The state of satisfaction of an economy results from the quality of the economic products it produces and consumes, in agreement with assuring environmental protection, as a source of producing present and future economic goods, and with intensive use of human capital as a source of innovation growth. Knowledge transfer happens in a sustainable economy, whose principles are the rational use of resources, the limiting of waste, and protection, so that future generations also have access to resources. The present research is based on a multifactorial linear regression model which outlines the direct correlation between the dependent variable, welfare, and two independent variables: wealth concentration, measured by the Gini coefficient, and the level of GDP, for the year 2012. The aim of this research is to identify the correlation between the indicator of life satisfaction (the welfare function) at the EU level in 2012 and the assurance of a macroeconomic framework for sustainable business development.
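
    A hedged sketch of such a regression is given below; the country-level values are entirely invented and serve only to show the model form (a welfare indicator regressed on the Gini coefficient and GDP):

        import numpy as np

        # Hypothetical 2012 data: Gini coefficient, GDP per capita (thousand EUR)
        # and a life-satisfaction score (illustrative values only)
        gini = np.array([0.25, 0.28, 0.31, 0.34, 0.27, 0.36, 0.30, 0.33])
        gdp = np.array([38.0, 41.0, 27.0, 22.0, 45.0, 18.0, 30.0, 25.0])
        sat = np.array([7.6, 7.8, 6.9, 6.3, 8.0, 5.9, 7.1, 6.5])

        X = np.column_stack([np.ones_like(gini), gini, gdp])
        beta, *_ = np.linalg.lstsq(X, sat, rcond=None)
        print("intercept, Gini coefficient, GDP coefficient:", beta)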

  13. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  14. Framework for Financial Ratio Analysis of Audited Federal Financial Reports

    Science.gov (United States)

    1999-12-01

    Studies conducted during this period examined the statistical validity of the ratio method in financial analysis; McDonald and Morris conducted a study on the ... franchising operations, allowing them to lower costs and share administrative support services with other agencies [Ref. 60: sec. 402-403]. The GMRA also ... McDonald, Bill and Morris, Michael H., "The Statistical Validity of the Ratio Method in Financial Analysis: An ..."

  15. Note sulla concezione del post mortem presso gli Ittiti

    DEFF Research Database (Denmark)

    Vigo, Matteo; Bellucci, Benedetta

    The object of analysis of this paper is the conception of the post-mortem sphere in Hittite civilization. The study of themes concerning the conception of the "afterlife" in the Hittite world, in both a physical and a metaphysical sense, is of great importance in the Hittitological literature and has therefore been treated extensively. In the present contribution we have chosen to examine in depth only certain aspects of the conception of the netherworld, understood as the metaphysical dimension in which the body of the deceased lies, rests or passes after death. The themes examined are therefore: 1) the eschatological meaning of death; 2) the definition of the otherworldly status of the deceased (i.e. the different treatment of the deceased, for instance from the cultic point of view, depending on their social condition); 3) the delineation of the characters and functions of the netherworld deities; 4) the ...

  16. Post-mortem virtual estimation of free abdominal blood volume

    International Nuclear Information System (INIS)

    Ampanozi, Garyfalia; Hatch, Gary M.; Ruder, Thomas D.; Flach, Patricia M.; Germerott, Tanja; Thali, Michael J.; Ebert, Lars C.

    2012-01-01

    Purpose: The purpose of this retrospective study was to examine the reliability of virtually estimated abdominal blood volume using segmentation from postmortem computed tomography (PMCT) data. Materials and methods: Twenty-one cases with free abdominal blood were investigated by PMCT and autopsy. The volume of the blood was estimated using a manual segmentation technique (Amira, Visage Imaging, Germany) and the results were compared to autopsy data. Six of 21 cases had undergone additional post-mortem computed tomographic angiography (PMCTA). Results: The virtually estimated abdominal blood volumes did not differ significantly from those measured at autopsy. Additional PMCTA did not bias data significantly. Conclusion: Virtual estimation of abdominal blood volume is a reliable technique. The virtual blood volume estimation is a useful tool to deliver additional information in cases where autopsy is not performed or in cases where a postmortem angiography is performed
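
    The segmentation-based estimate essentially reduces to counting labelled voxels and multiplying by the voxel volume; a minimal sketch (hypothetical mask and voxel spacing, not the Amira workflow used in the study):

        import numpy as np

        def blood_volume_ml(mask, voxel_spacing_mm):
            """Volume of a binary segmentation mask.

            mask: 3-D boolean array (True where a voxel was labelled as blood).
            voxel_spacing_mm: (dz, dy, dx) spacing of the PMCT volume in millimetres."""
            voxel_volume_mm3 = float(np.prod(voxel_spacing_mm))
            return mask.sum() * voxel_volume_mm3 / 1000.0   # mm^3 -> millilitres

        # Hypothetical example: 1.5 mm slices with 0.7 x 0.7 mm in-plane resolution
        mask = np.zeros((200, 512, 512), dtype=bool)
        mask[80:120, 200:300, 200:300] = True
        print(blood_volume_ml(mask, (1.5, 0.7, 0.7)), "ml")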

  17. The Elusive Universal Post-Mortem Interval Formula

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad Alexander [ORNL

    2011-01-01

    The following manuscript details our initial attempt at developing universal post-mortem interval formulas describing human decomposition. These formulas are empirically derived from data collected over the last 20 years from the University of Tennessee's Anthropology Research Facility, in Knoxville, Tennessee, USA. Two formulas were developed (surface decomposition and burial decomposition) based on temperature, moisture, and the partial pressure of oxygen, as being three of the four primary drivers for human decomposition. It is hoped that worldwide application of these formulas to environments and situations not readily studied in Tennessee will result in interdisciplinary cooperation between scientists and law enforcement personnel that will allow for future refinements of these models leading to increased accuracy.

  18. Role of forensic odontologist in post mortem person identification

    Directory of Open Access Journals (Sweden)

    Jahagirdar B Pramod

    2012-01-01

    Full Text Available The natural teeth are the most durable organs in the bodies of vertebrates, and humankind's understanding of their own past and evolution relies heavily upon remnant dental evidence found as fossils. The use of features unique to the human dentition as an aid to personal identification is widely accepted within the forensic field. Comparative dental identifications play a major role in identifying the victims of violence, disaster or other mass tragedies. The comparison of ante-mortem and postmortem dental records to determine human identity has long been established. Indeed, it is still a major identification method in criminal investigations, mass disasters, grossly decomposed or traumatized bodies, and in other situations where visual identification is neither possible nor desirable. This article has comprehensively described some of the methods, and additional factors aiding in postmortem person identification.

  19. Post-mortem toxicology in young sudden cardiac death victims

    DEFF Research Database (Denmark)

    Bjune, Thea; Risgaard, Bjarke; Kruckow, Line

    2017-01-01

    Aims: Several drugs increase the risk of ventricular fibrillation and sudden cardiac death (SCD). We aimed to investigate in detail the toxicological findings of all young SCD throughout Denmark. Methods and results: Deaths in persons aged 1-49 years were included over a 10-year period. Death certificates and autopsy reports were retrieved and read to identify cases of sudden death and establish cause of death. All medico-legal autopsied SCD were included and toxicological reports collected. Positive toxicology was defined as the presence of any substance (licit and/or illicit). All toxicological findings had previously been evaluated not to have caused the death (i.e. lethal concentrations were excluded). We identified 620 medico-legal autopsied cases of SCD, of which 77% (n = 477) were toxicologically investigated post-mortem, and 57% (n = 270) had a positive toxicology profile. Sudden cardiac...

  20. Post-mortem whole-body magnetic resonance imaging of human fetuses: a comparison of 3-T vs. 1.5-T MR imaging with classical autopsy

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Xin; Bevilacqua, Elisa; Cos Sanchez, Teresa; Jani, Jacques C. [University Hospital Brugmann, Universite Libre de Bruxelles, Department of Obstetrics and Gynecology, Fetal Medicine Unit, Brussels (Belgium); Cannie, Mieke M. [University Hospital Brugmann, Universite Libre de Bruxelles, Department of Radiology, Brussels (Belgium); Vrije Universiteit Brussel, Department of Radiology, UZ Brussel, Brussels (Belgium); Arthurs, Owen J.; Sebire, Neil J. [Great Ormond Street Hospital for Children NHS Foundation Trust, London (United Kingdom); UCL Institute of Child Health, London (United Kingdom); Segers, Valerie; Fourneau, Catherine [University Hospital Brugmann, Universite Libre de Bruxelles, Department of Fetopathology, Brussels (Belgium)

    2017-08-15

    To prospectively compare diagnostic accuracy of fetal post-mortem whole-body MRI at 3-T vs. 1.5-T. Between 2012 and 2015, post-mortem MRI at 1.5-T and 3-T was performed in fetuses after miscarriage/stillbirth or termination. Clinical MRI diagnoses were assessed using a confidence diagnostic score and compared with classical autopsy to derive a diagnostic error score. The relation of diagnostic error for each organ group with gestational age was calculated and 1.5-T with 3-T was compared with accuracy analysis. 135 fetuses at 12-41 weeks underwent post-mortem MRI (followed by conventional autopsy in 92 fetuses). For all organ groups except the brain, and for both modalities, the diagnostic error decreased with gestation (P < 0.0001). 3-T MRI diagnostic error was significantly lower than that of 1.5-T for all anatomic structures and organ groups, except the orbits and brain. This difference was maintained for fetuses <20 weeks gestation. Moreover, 3-T was associated with fewer non-diagnostic scans and greater concordance with classical autopsy than 1.5-T MRI, especially for the thorax, heart and abdomen in fetuses <20 weeks. Post-mortem fetal 3-T MRI improves confidence scores and overall accuracy compared with 1.5-T, mainly for the thorax, heart and abdomen of fetuses <20 weeks of gestation. (orig.)

  1. Quantification of maceration changes using post mortem MRI in fetuses

    International Nuclear Information System (INIS)

    Montaldo, P.; Addison, S.; Oliveira, V.; Lally, P. J.; Taylor, A. M.; Sebire, N. J.; Thayyil, S.; Arthurs, O. J.

    2016-01-01

    Post mortem imaging is playing an increasingly important role in perinatal autopsy, and correct interpretation of imaging changes is paramount. This is particularly important following intra-uterine fetal death, where there may be fetal maceration. The aim of this study was to investigate whether any changes seen on a whole body fetal post mortem magnetic resonance imaging (PMMR) correspond to maceration at conventional autopsy. We performed pre-autopsy PMMR in 75 fetuses using a 1.5 Tesla Siemens Avanto MR scanner (Erlangen, Germany). PMMR images were reported blinded to the clinical history and autopsy data using a numerical severity scale (0 = no maceration changes to 2 = severe maceration changes) for 6 different visceral organs (total 12). The degree of maceration at autopsy was categorized according to severity on a numerical scale (1 = no maceration to 4 = severe maceration). We also generated quantitative maps to measure the liver and lung T2. The mean PMMR maceration score correlated well with the autopsy maceration score (R² = 0.93). A PMMR score of ≥4.5 had a sensitivity of 91% and a specificity of 64% for detecting moderate or severe maceration at autopsy. Liver and lung T2 were increased in fetuses with maceration scores of 3–4 in comparison to those with 1–2 (liver p = 0.03, lung p = 0.02). There was a good correlation between PMMR maceration score and the extent of maceration seen at conventional autopsy. This score may be useful in interpretation of fetal PMMR
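
    As an editorial illustration of how such threshold statistics are computed (the paired scores below are invented, not the study data), a small Python sketch:

```python
import numpy as np

def sens_spec(pmmr_scores, autopsy_grades, score_threshold=4.5, grade_threshold=3):
    """Sensitivity/specificity of a PMMR maceration score for detecting
    moderate-to-severe maceration (autopsy grade >= grade_threshold)."""
    scores = np.asarray(pmmr_scores, dtype=float)
    grades = np.asarray(autopsy_grades, dtype=int)
    positive = grades >= grade_threshold          # reference standard (autopsy)
    predicted = scores >= score_threshold         # imaging call
    tp = np.sum(predicted & positive)
    fn = np.sum(~predicted & positive)
    tn = np.sum(~predicted & ~positive)
    fp = np.sum(predicted & ~positive)
    return tp / (tp + fn), tn / (tn + fp)

# Made-up paired observations (PMMR score 0-12, autopsy grade 1-4)
sens, spec = sens_spec([2, 5, 7, 3, 9, 4, 6, 1], [1, 3, 4, 2, 4, 2, 3, 1])
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```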

  2. Essentials of forensic post-mortem MR imaging in adults

    Science.gov (United States)

    Ruder, T D; Thali, M J; Hatch, G M

    2014-01-01

    Post-mortem MR (PMMR) imaging is a powerful diagnostic tool with a wide scope in forensic radiology. In the past 20 years, PMMR has been used as both an adjunct and an alternative to autopsy. The role of PMMR in forensic death investigations largely depends on the rules and habits of local jurisdictions, availability of experts, financial resources, and individual case circumstances. PMMR images are affected by post-mortem changes, including position-dependent sedimentation, variable body temperature and decomposition. Investigators must be familiar with the appearance of normal findings on PMMR to distinguish them from disease or injury. Coronal whole-body images provide a comprehensive overview. Notably, short tau inversion–recovery (STIR) images enable investigators to screen for pathological fluid accumulation, to which we refer as “forensic sentinel sign”. If scan time is short, subsequent PMMR imaging may be focussed on regions with a positive forensic sentinel sign. PMMR offers excellent anatomical detail and is especially useful to visualize pathologies of the brain, heart, subcutaneous fat tissue and abdominal organs. PMMR may also be used to document skeletal injury. Cardiovascular imaging is a core area of PMMR imaging and growing evidence indicates that PMMR is able to detect ischaemic injury at an earlier stage than traditional autopsy and routine histology. The aim of this review is to present an overview of normal findings on forensic PMMR, provide general advice on the application of PMMR and summarise the current literature on PMMR imaging of the head and neck, cardiovascular system, abdomen and musculoskeletal system. PMID:24191122

  3. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    Science.gov (United States)

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  4. Needs Analysis and Course Design; A Framework for Designing Exam Courses

    Directory of Open Access Journals (Sweden)

    Reza Eshtehardi

    2017-09-01

    Full Text Available This paper introduces a framework for designing exam courses and highlights the importance of needs analysis in designing exam courses. The main objectives of this paper are to highlight the key role of needs analysis in designing exam courses, to offer a framework for designing exam courses, to show the language needs of different students for the IELTS (International English Language Testing System) exam, to offer an analysis of those needs and to explain how they will be taken into account for the design of the course. First, I will concentrate on some distinguishing features in exam classes, which make them different from general English classes. Secondly, I will introduce a framework for needs analysis and diagnostic testing and highlight the importance of needs analysis for the design of syllabus and language courses. Thirdly, I will describe significant features of syllabus design, course assessment, and evaluation procedures.

  5. Conceptual Framework for Gentrification Analysis of Iskandar Malaysia

    Directory of Open Access Journals (Sweden)

    Rabiyatul Adawiyah Abd Khalil

    2015-05-01

    Full Text Available Gentrification is generally defined as the transformation of a working-class population living in the central city into a middle-upper-class society. It has both positive and negative consequences. Gentrification causes the loss of affordable homes; however, it is also beneficial because it rejuvenates the tax base and stimulates mixed income. The question arises whether the characteristics of gentrification in developing countries are the same as, or differ from, those in developed countries. Because of this growth in research, a review of the body of literature related to the mutation of gentrification, i.e. the types of gentrification and their characteristics, is believed necessary. This will serve as a basis for a conceptual framework to analyze what is happening in Iskandar Malaysia (IM). As a globalized urbanization area, IM offers a particularly interesting case as there are already signs of gentrification due to its rapid urbanization. In the residential market, house prices in IM show a rapid and continuous increase. Many foreigners are attracted to the new residential areas in IM, which are promoted as exclusive while promising a quality lifestyle. The locals meanwhile face difficulties in owning a home because of the upward spiral of house prices. In certain areas, local low-income people are displaced by middle- and upper-income groups. The identification of such characteristics and the associated attributes, which is the second phase of the study, will determine to what extent IM is in the process of gentrification. The paper finally concludes that the signs of gentrification in IM are similar to those in other developing countries.

  6. Architecture of collaborating frameworks simulation, visualisation, user interface and analysis

    CERN Document Server

    Pfeier, A; Ferrero-Merlino, B; Giannitrapani, R; Longo, F; Nieminen, P; Pia, M G; Santin, G

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  7. Strategic Port Graph Rewriting: An Interactive Modelling and Analysis Framework

    Directory of Open Access Journals (Sweden)

    Maribel Fernández

    2014-07-01

    Full Text Available We present strategic portgraph rewriting as a basis for the implementation of visual modelling and analysis tools. The goal is to facilitate the specification, analysis and simulation of complex systems, using port graphs. A system is represented by an initial graph and a collection of graph rewriting rules, together with a user-defined strategy to control the application of rules. The strategy language includes constructs to deal with graph traversal and management of rewriting positions in the graph. We give a small-step operational semantics for the language, and describe its implementation in the graph transformation and visualisation tool PORGY.

  8. The social impacts of dams: A new framework for scholarly analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kirchherr, Julian, E-mail: julian.kirchherr@sant.ox.ac.uk; Charles, Katrina J., E-mail: katrina.charles@ouce.ox.ac.uk

    2016-09-15

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  9. The social impacts of dams: A new framework for scholarly analysis

    International Nuclear Information System (INIS)

    Kirchherr, Julian; Charles, Katrina J.

    2016-01-01

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  10. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  11. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    Science.gov (United States)

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.
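
    As a hedged illustration only: one generic way to obtain a feedthrough frequency response from measured signals is an H1 spectral estimate. The sketch below uses scipy.signal and synthetic data; it is not the specific decomposition defined by the cited framework, and all signal names are assumptions.

```python
import numpy as np
from scipy import signal

def bdft_frequency_response(accel, control_input, fs, nperseg=1024):
    """H1 estimate of a feedthrough transfer function from measured platform
    acceleration to measured control-device deflection:
        H(f) = S_au(f) / S_aa(f)
    A generic spectral estimate, not the cited framework's decomposition."""
    f, s_au = signal.csd(accel, control_input, fs=fs, nperseg=nperseg)
    _, s_aa = signal.welch(accel, fs=fs, nperseg=nperseg)
    return f, s_au / s_aa

# Synthetic data: a low-pass "limb" response plus measurement noise (assumed)
fs, n = 100.0, 20000
rng = np.random.default_rng(5)
accel = rng.normal(size=n)
b, a = signal.butter(2, 5.0, fs=fs)          # 2nd-order low-pass at 5 Hz
control = signal.lfilter(b, a, accel) + 0.1 * rng.normal(size=n)
f, H = bdft_frequency_response(accel, control, fs)
print(f"gain at {f[10]:.2f} Hz is roughly {np.abs(H[10]):.2f}")
```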

  12. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
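
    A minimal sketch of the idea behind multimodel analysis, assuming ordinary least squares and a small set of hypothetical control variables (this is not the authors' software, only an illustration of estimating a focal coefficient across all control-variable subsets):

```python
import itertools
import numpy as np

def modeling_distribution(y, focal, controls):
    """Estimate the focal coefficient under every combination of controls.

    y        : (n,) outcome
    focal    : (n,) variable of interest
    controls : dict name -> (n,) candidate control variables
    Returns a list of (included controls, focal-coefficient estimate).
    """
    names = list(controls)
    results = []
    for k in range(len(names) + 1):
        for subset in itertools.combinations(names, k):
            cols = [np.ones_like(y), focal] + [controls[n] for n in subset]
            X = np.column_stack(cols)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            results.append((subset, beta[1]))  # coefficient on the focal variable
    return results

# Toy data (assumed, not from the paper)
rng = np.random.default_rng(0)
n = 200
z1, z2 = rng.normal(size=n), rng.normal(size=n)
x = 0.5 * z1 + rng.normal(size=n)
y = 1.0 * x + 0.8 * z1 - 0.3 * z2 + rng.normal(size=n)
estimates = [b for _, b in modeling_distribution(y, x, {"z1": z1, "z2": z2})]
print(f"focal coefficient across {len(estimates)} models: "
      f"min={min(estimates):.2f}, max={max(estimates):.2f}")
```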

  13. Value Chain Analysis: A Framework for Management of Distance Education.

    Science.gov (United States)

    Woudstra, Andrew; Powell, Richard

    1989-01-01

    Discussion of the benefits of value chain analysis in the management of distance education organizations focuses on an example at Athabasca University. The effects of policies and decisions on the organization and its value system are considered, cost drivers for activities are described, and a future-oriented perspective is emphasized. (14…

  14. Muon g-2 Reconstruction and Analysis Framework for the Muon Anomalous Precession Frequency

    Energy Technology Data Exchange (ETDEWEB)

    Khaw, Kim Siang [Washington U., Seattle

    2017-10-21

    The Muon g-2 experiment at Fermilab, with the aim to measure the muon anomalous magnetic moment to an unprecedented level of 140 ppb, has started beam and detector commissioning in Summer 2017. To deal with incoming data projected to be around tens of petabytes, a robust data reconstruction and analysis chain based on Fermilab's art event-processing framework is developed. Herein, I report the current status of the framework, together with its novel features such as multi-threaded algorithms for online data quality monitor (DQM) and fast-turnaround operation (nearline). Performance of the framework during the commissioning run is also discussed.

  15. Social Entrepreneurship: Framework for feasibility analysis of social business concepts

    OpenAIRE

    Groth, Ida Eikvåg; Magnussen, Line

    2011-01-01

    ABSTRACT. PURPOSE: With the increased interest in social entrepreneurship demonstrated within business schools and academic environments, the adaption of existing academic entrepreneurial constructs for social entrepreneurship applications becomes relevant. The purpose of this thesis is to develop additional tools to the traditional feasibility analysis. The tools will be specifically directed at technology-based concepts, due to the increased employment of technology-based products to solve soci...

  16. Network analysis: An innovative framework for understanding eating disorder psychopathology.

    Science.gov (United States)

    Smith, Kathryn E; Crosby, Ross D; Wonderlich, Stephen A; Forbush, Kelsie T; Mason, Tyler B; Moessner, Markus

    2018-03-01

    Network theory and analysis is an emerging approach in psychopathology research that has received increasing attention across fields of study. In contrast to medical models or latent variable approaches, network theory suggests that psychiatric syndromes result from systems of causal and reciprocal symptom relationships. Despite the promise of this approach to elucidate key mechanisms contributing to the development and maintenance of eating disorders (EDs), thus far, few applications of network analysis have been tested in ED samples. We first present an overview of network theory, review the existing findings in the ED literature, and discuss the limitations of this literature to date. In particular, the reliance on cross-sectional designs, use of single-item self-reports of symptoms, and instability of results have raised concern about the inferences that can be made from network analyses. We outline several areas to address in future ED network analytic research, which include the use of prospective designs and adoption of multimodal assessment methods. Doing so will provide a clearer understanding of whether network analysis can enhance our current understanding of ED psychopathology and inform clinical interventions. © 2018 Wiley Periodicals, Inc.

  17. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas; Magdy, Ahmed; Zhan, Peng; Chen, Guoning; Gopalakrishnan, Ganesh; Hoteit, Ibrahim; Hansen, Charles D.; Hadwiger, Markus

    2014-01-01

    We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.
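
    As a rough, assumption-laden stand-in for the kind of eddy detection such a tool builds on (real detectors are considerably more careful), one can threshold the sea surface height anomaly and label connected regions:

```python
import numpy as np
from scipy import ndimage

def candidate_eddies(ssh, threshold=0.1, min_cells=20):
    """Very crude eddy-candidate extraction from one ensemble member's
    sea surface height field (2D array, metres): threshold the anomaly
    and keep connected regions above a minimum size. Illustrative only."""
    anomaly = ssh - np.nanmean(ssh)
    labels, n = ndimage.label(np.abs(anomaly) > threshold)
    idx = list(range(1, n + 1))
    sizes = ndimage.sum(np.ones_like(labels), labels, index=idx)
    centers = ndimage.center_of_mass(np.abs(anomaly), labels, idx)
    return [c for c, s in zip(centers, sizes) if s >= min_cells]

# Synthetic SSH field with one Gaussian bump standing in for an eddy
yy, xx = np.mgrid[0:100, 0:100]
ssh = 0.3 * np.exp(-(((xx - 60) ** 2 + (yy - 40) ** 2) / 50.0))
print(candidate_eddies(ssh))
```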

  18. Ovis: A Framework for Visual Analysis of Ocean Forecast Ensembles.

    Science.gov (United States)

    Höllt, Thomas; Magdy, Ahmed; Zhan, Peng; Chen, Guoning; Gopalakrishnan, Ganesh; Hoteit, Ibrahim; Hansen, Charles D; Hadwiger, Markus

    2014-08-01

    We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea.

  19. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas

    2014-08-01

    We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.

  20. VisRseq: R-based visual framework for analysis of sequencing data

    OpenAIRE

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven JM

    2015-01-01

    Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for ...

  1. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
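
    For illustration, a simple Latin Hypercube sampler of the kind alluded to can be written in a few lines of Python; the parameter names and ranges below are hypothetical:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    """Latin hypercube sample: every parameter range is split into n_samples
    equal strata and each sample draws from a different, shuffled stratum."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    u = np.empty((n_samples, d))
    for j in range(d):
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        u[:, j] = rng.permutation(strata)      # shuffle strata per dimension
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

# Hypothetical uncertain inputs: log10 permeability and porosity ranges
samples = latin_hypercube(10, [(-14.0, -11.0), (0.05, 0.30)], seed=42)
print(samples.round(3))
```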

  2. Accuracy of an efficient framework for structural analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert D.; Fedorov, Vladimir

    2016-01-01

    This paper presents a novel framework for the structural design and analysis of wind turbine blades and establishes its accuracy. The framework is based on a beam model composed of two parts—a 2D finite element-based cross-section analysis tool and a 3D beam finite element model. The cross-section analysis tool is able to capture the effects stemming from material anisotropy and inhomogeneity for sections of arbitrary geometry. The proposed framework is very efficient and therefore ideally suited for integration within wind turbine aeroelastic design and analysis tools. A number of benchmark examples are presented comparing the results from the proposed beam model to 3D shell and solid finite element models. The examples considered include a square prismatic beam, an entire wind turbine rotor blade and a detailed wind turbine blade cross section. Phenomena at both the blade length scale...

  3. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  4. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  5. A threat analysis framework as applied to critical infrastructures in the Energy Sector.

    Energy Technology Data Exchange (ETDEWEB)

    Michalski, John T.; Duggan, David Patrick

    2007-09-01

    The need to protect national critical infrastructure has led to the development of a threat analysis framework. The threat analysis framework can be used to identify the elements required to quantify threats against critical infrastructure assets and provide a means of distributing actionable threat information to critical infrastructure entities for the protection of infrastructure assets. This document identifies and describes five key elements needed to perform a comprehensive analysis of threat: the identification of an adversary, the development of generic threat profiles, the identification of generic attack paths, the discovery of adversary intent, and the identification of mitigation strategies.

  6. Application of contrast media in post-mortem imaging (CT and MRI).

    Science.gov (United States)

    Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice

    2015-09-01

    The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider by using different media.

  7. The accident analysis in the framework of emergency provisions

    International Nuclear Information System (INIS)

    Tietze, A.

    1981-03-01

    The first part of the report describes the demands on and bases of a reactor emergency plan and outlines the technical characteristics of a nuclear power plant with light-water moderated pressurized-water reactor with special regard to reactor safety. In the second part the failure and risk potentials of a pressurized-water plant are described and discussed. The third part is dedicated to a representation of the analytical method in a stricter sense, according to the current state of technology. Finally the current degree of effectiveness of the reactor accident analysis method is critically discussed and perspectives of future development are pointed out. (orig.) [de

  8. Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  9. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

    In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze the data directly inside the application where the data is entered, a tool that allows the scientists to easily work with their data, supports and motivates the execution of further analysis of their data, which…

  10. A Comparative Analysis of PISA Scientific Literacy Framework in Finnish and Thai Science Curricula

    Science.gov (United States)

    Sothayapetch, Pavinee; Lavonen, Jari; Juuti, Kalle

    2013-01-01

    A curriculum is a master plan that regulates teaching and learning. This paper compares Finnish and Thai primary school level science curricula to the PISA 2006 Scientific Literacy Framework. Curriculum comparison was made following the procedure of deductive content analysis. In the analysis, there were four main categories adopted from PISA…

  11. A Framework for RFID Survivability Requirement Analysis and Specification

    Science.gov (United States)

    Zuo, Yanjun; Pimple, Malvika; Lande, Suhas

    Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID is used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable, and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if utilized, can assist the system in resisting devastating attacks and recovering quickly from damages. This paper proposes techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, the survivability requirement is an important first step in survivability specification, compliance formulation, and proof verification.

  12. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
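
    A compact sketch of the underlying idea, with toy data and a stand-in classifier (not the authors' implementation): estimate the reliability of a CAD decision as the classifier's accuracy on the k known cases nearest the query in feature space.

```python
import numpy as np

def local_reliability(query, train_features, train_labels, train_predictions, k=25):
    """Case-specific reliability of a CAD decision, estimated as the accuracy
    of the classifier on the k known cases closest to the query in feature
    space (a simplified version of the neighbourhood idea in the paper)."""
    d = np.linalg.norm(train_features - query, axis=1)
    nearest = np.argsort(d)[:k]
    return np.mean(train_labels[nearest] == train_predictions[nearest])

# Toy setup: 2 morphological features, binary mass/normal labels, and stored
# predictions from some already-trained classifier (all assumed).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=500) > 0).astype(int)
y_hat = (X[:, 0] > 0).astype(int)  # stand-in CAD output
print(f"reliability near an easy case: {local_reliability(np.array([2.0, 0.0]), X, y, y_hat):.2f}")
print(f"reliability near a hard case:  {local_reliability(np.array([0.0, 0.0]), X, y, y_hat):.2f}")
```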

  13. A Generalized Framework for Non-Stationary Extreme Value Analysis

    Science.gov (United States)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables including precipitation, temperature, snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisal of the time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) Toolbox by Cheng et al. 2014, we present an improved version, i.e. NEVA2.0. The upgraded version herein builds upon a newly-developed hybrid evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time-covariate (e.g., CO2 emissions, large scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g. antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity - Duration - Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA
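
    NEVA itself is Bayesian and MCMC-based; as a deliberately simplified stand-in, the sketch below fits a GEV whose location parameter varies linearly with a covariate by maximum likelihood using scipy (all data synthetic, all names assumed):

```python
import numpy as np
from scipy import optimize, stats

def fit_nonstationary_gev(x, covariate):
    """Maximum-likelihood fit of a GEV whose location varies linearly with a
    covariate (mu = mu0 + mu1 * covariate); shape and scale held constant.
    A simplified stand-in for the Bayesian machinery in NEVA."""
    def nll(params):
        mu0, mu1, log_sigma, shape = params
        loc = mu0 + mu1 * covariate
        return -np.sum(stats.genextreme.logpdf(x, shape, loc=loc,
                                               scale=np.exp(log_sigma)))
    start = np.array([np.mean(x), 0.0, np.log(np.std(x)), 0.1])
    res = optimize.minimize(nll, start, method="Nelder-Mead")
    return res.x  # mu0, mu1, log(scale), shape

# Synthetic annual maxima with an upward trend in the location parameter
rng = np.random.default_rng(3)
years = np.arange(60)
maxima = stats.genextreme.rvs(0.1, loc=20 + 0.05 * years, scale=3,
                              size=60, random_state=rng)
print(fit_nonstationary_gev(maxima, years))
```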

  14. Big data analysis framework for healthcare and social sectors in Korea.

    Science.gov (United States)

    Song, Tae-Min; Ryu, Seewon

    2015-01-01

    We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached.

  15. A statistical framework for differential network analysis from microarray data

    Directory of Open Access Journals (Sweden)

    Datta Somnath

    2010-02-01

    Full Text Available Abstract. Background: It has long been well known that genes do not act alone; rather, groups of genes act in concert during a biological process. Consequently, the expression levels of genes are dependent on each other. Experimental techniques to detect such interacting pairs of genes have been in place for quite some time. With the advent of microarray technology, newer computational techniques to detect such interaction or association between gene expressions are being proposed, which lead to an association network. While most microarray analyses look for genes that are differentially expressed, it is of potentially greater significance to identify how entire association network structures change between two or more biological settings, say normal versus diseased cell types. Results: We provide a recipe for conducting a differential analysis of networks constructed from microarray data under two experimental settings. At the core of our approach lies a connectivity score that represents the strength of genetic association or interaction between two genes. We use this score to propose formal statistical tests for each of the following queries: (i) whether the overall modular structures of the two networks are different, (ii) whether the connectivity of a particular set of "interesting genes" has changed between the two networks, and (iii) whether the connectivity of a given single gene has changed between the two networks. A number of examples of this score are provided. We carried out our method on two types of simulated data: Gaussian networks and networks based on differential equations. We show that, for appropriate choices of the connectivity scores and tuning parameters, our method works well on simulated data. We also analyze a real data set involving normal versus heavy mice and identify an interesting set of genes that may play key roles in obesity. Conclusions: Examining changes in network structure can provide valuable information about the
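
    As an illustrative sketch only (the paper's connectivity score and tests are more general), a correlation-based connectivity score and a permutation test for a change in overall network structure might look like this in Python:

```python
import numpy as np

def connectivity(expr):
    """Connectivity score: absolute Pearson correlation between every pair of
    genes (rows = samples, columns = genes). One of many possible scores."""
    return np.abs(np.corrcoef(expr, rowvar=False))

def connectivity_change(expr_a, expr_b):
    """Mean absolute difference in pairwise connectivity between conditions."""
    diff = connectivity(expr_a) - connectivity(expr_b)
    iu = np.triu_indices_from(diff, k=1)
    return np.mean(np.abs(diff[iu]))

def permutation_pvalue(expr_a, expr_b, n_perm=500, seed=0):
    """Permutation test: reshuffle sample labels to ask whether the observed
    change in network connectivity is larger than expected by chance."""
    rng = np.random.default_rng(seed)
    observed = connectivity_change(expr_a, expr_b)
    pooled = np.vstack([expr_a, expr_b])
    n_a = expr_a.shape[0]
    null = []
    for _ in range(n_perm):
        idx = rng.permutation(pooled.shape[0])
        null.append(connectivity_change(pooled[idx[:n_a]], pooled[idx[n_a:]]))
    return observed, np.mean(np.array(null) >= observed)

# Toy expression matrices: 30 samples x 10 genes per condition (simulated)
rng = np.random.default_rng(7)
a = rng.normal(size=(30, 10))
b = rng.normal(size=(30, 10))
b[:, 1] = b[:, 0] + 0.2 * rng.normal(size=30)   # induce one strong edge in B
print(permutation_pvalue(a, b))
```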

  16. An intersectionality-based policy analysis framework: critical reflections on a methodology for advancing equity.

    Science.gov (United States)

    Hankivsky, Olena; Grace, Daniel; Hunting, Gemma; Giesbrecht, Melissa; Fridkin, Alycia; Rudrum, Sarah; Ferlatte, Olivier; Clark, Natalie

    2014-12-10

    In the field of health, numerous frameworks have emerged that advance understandings of the differential impacts of health policies to produce inclusive and socially just health outcomes. In this paper, we present the development of an important contribution to these efforts - an Intersectionality-Based Policy Analysis (IBPA) Framework. Developed over the course of two years in consultation with key stakeholders and drawing on best and promising practices of other equity-informed approaches, this participatory and iterative IBPA Framework provides guidance and direction for researchers, civil society, public health professionals and policy actors seeking to address the challenges of health inequities across diverse populations. Importantly, we present the application of the IBPA Framework in seven priority health-related policy case studies. The analysis of each case study is focused on explaining how IBPA: 1) provides an innovative structure for critical policy analysis; 2) captures the different dimensions of policy contexts including history, politics, everyday lived experiences, diverse knowledges and intersecting social locations; and 3) generates transformative insights, knowledge, policy solutions and actions that cannot be gleaned from other equity-focused policy frameworks. The aim of this paper is to inspire a range of policy actors to recognize the potential of IBPA to foreground the complex contexts of health and social problems, and ultimately to transform how policy analysis is undertaken.

  17. SAFE: A Sentiment Analysis Framework for E-Learning

    Directory of Open Access Journals (Sweden)

    Francesco Colace

    2014-12-01

    Full Text Available The spread of social networks allows users to share opinions on different aspects of life, and millions of messages appear on the web daily. This textual information can be a rich source of data for opinion mining and sentiment analysis: the computational study of opinions, sentiments and emotions expressed in text. Its main aim is the identification of agreement or disagreement statements that deal with positive or negative feelings in comments or reviews. In this paper, we investigate the adoption, in the field of e-learning, of a probabilistic approach based on Latent Dirichlet Allocation (LDA) as a sentiment grabber. By this approach, for a set of documents belonging to the same knowledge domain, a graph, the Mixed Graph of Terms, can be automatically extracted. The paper shows how this graph contains a set of weighted word pairs, which are discriminative for sentiment classification. In this way, the system can detect the feelings of students on some topics and the teacher can better tune his/her teaching approach. In fact, the proposed method has been tested on datasets coming from e-learning platforms. A preliminary experimental campaign shows how the proposed approach is effective and satisfactory.
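
    A minimal sketch of the LDA step such an approach builds on, using scikit-learn and invented learner comments (the Mixed Graph of Terms construction itself is not reproduced here):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# A handful of made-up learner comments (stand-ins for e-learning forum posts)
posts = [
    "the video lectures on recursion were clear and really helpful",
    "recursion assignment instructions were confusing and frustrating",
    "great course, the quizzes helped me understand sorting algorithms",
    "too much workload, deadlines are stressful and badly spaced",
    "loved the interactive examples, sorting finally makes sense",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(posts)

# Fit a small LDA model; each topic is a distribution over words, and the
# highest-weight terms per topic could seed a graph of weighted word pairs.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```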

  18. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researcher's attempts at fully understanding it. Time series "Fluorescent in situ Hybridization" (FISH) imaging has increased our ability to observe genome structure, but due to cell type and experimental variability this data is often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection, in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    Science.gov (United States)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  20. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
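
    As a generic illustration of the pattern described (not the PAPIRUS code itself), farming independent model evaluations out to worker processes and computing a crude sensitivity measure might look like this; the model and parameter names are assumptions:

```python
import multiprocessing as mp
import numpy as np

def run_model(params):
    """Stand-in for a single run of an engineering simulation code.
    In a real framework this would launch the external solver."""
    gap_conductance, power = params
    return power / (1.0 + 0.05 * gap_conductance)   # toy response

def main():
    rng = np.random.default_rng(0)
    samples = np.column_stack([rng.uniform(2.0, 10.0, 64),    # gap conductance
                               rng.uniform(15.0, 25.0, 64)])  # rod power
    # Farm the independent runs out to worker processes
    with mp.Pool(processes=4) as pool:
        outputs = np.array(pool.map(run_model, samples.tolist()))
    # Crude global sensitivity: correlation of each input with the output
    for name, col in zip(["gap conductance", "rod power"], samples.T):
        r = np.corrcoef(col, outputs)[0, 1]
        print(f"{name:16s} corr with response: {r:+.2f}")

if __name__ == "__main__":
    main()
```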

  1. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  2. Framework for analysis of solar energy systems in the built environment from an exergy perspective

    OpenAIRE

    Torio, H.; Schmidt, D.

    2010-01-01

    Exergy analysis is a more powerful tool than mere energy analysis for showing the improvement potential of energy systems. Direct use of solar radiation instead of degrading other high quality energy resources found in nature is advantageous. Yet, due to physical inconsistencies present in the exergy analysis framework for assessing direct-solar systems commonly found in literature, high exergy losses arise in the conversion process of solar radiation in direct-solar systems. However, these l...

  3. Dutch guideline for clinical foetal-neonatal and paediatric post-mortem radiology, including a review of literature.

    Science.gov (United States)

    Sonnemans, L J P; Vester, M E M; Kolsteren, E E M; Erwich, J J H M; Nikkels, P G J; Kint, P A M; van Rijn, R R; Klein, W M

    2018-06-01

    Clinical post-mortem radiology is a relatively new field of expertise and not common practice in most hospitals yet. With the declining numbers of autopsies and increasing demand for quality control of clinical care, post-mortem radiology can offer a solution, or at least be complementary. A working group consisting of radiologists, pathologists and other clinical medical specialists reviewed and evaluated the literature on the diagnostic value of post-mortem conventional radiography (CR), ultrasonography, computed tomography (PMCT), magnetic resonance imaging (PMMRI), and minimally invasive autopsy (MIA). Evidence tables were built and subsequently a Dutch national evidence-based guideline for post-mortem radiology was developed. We present this evaluation of the radiological modalities in a clinical post-mortem setting, including MIA, as well as the recently published Dutch guidelines for post-mortem radiology in foetuses, neonates, and children. In general, for post-mortem radiology modalities, PMMRI is the modality of choice in foetuses, neonates, and infants, whereas PMCT is advised in older children. There is a limited role for post-mortem CR and ultrasonography. In most cases, conventional autopsy will remain the diagnostic method of choice. Based on a literature review and clinical expertise, an evidence-based guideline was developed for post-mortem radiology of foetal, neonatal, and paediatric patients. What is Known: • Post-mortem investigations serve as a quality check for the provided health care and are important for reliable epidemiological registration. • Post-mortem radiology, sometimes combined with minimally invasive techniques, is considered as an adjunct or alternative to autopsy. What is New: • We present the Dutch guidelines for post-mortem radiology in foetuses, neonates and children. • Autopsy remains the reference standard, however minimal invasive autopsy with a skeletal survey, post-mortem computed tomography, or post-mortem

  4. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework supporting the multi-scale structural simulation approach is also presented. The program architecture allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realise the proposed concept. The simulation results show that the software framework can improve the speedup of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing multi-scale structural analysis simulations.

  5. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports

  6. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.

    Science.gov (United States)

    Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate

    2014-11-23

    Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases-Web of Science, Scopus and Google Scholar-with the facility for citation searching. Limitations of English language and year of publication 2006-June 2013 were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution. When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. Prevailing wisdom

  7. 9 CFR 354.122 - Condemnation on ante-mortem inspection.

    Science.gov (United States)

    2010-01-01

    ... AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY..., on ante-mortem inspection, are condemned shall not be dressed, nor shall they be conveyed into any...

  8. Clarke's Isolation and identification of drugs in pharmaceuticals, body fluids, and post-mortem material

    National Research Council Canada - National Science Library

    Clarke, E. G. C; Moffat, A. C; Jackson, J. V

    1986-01-01

    This book is intended for scientists faced with the difficult problem of identifying an unknown drug in a pharmaceutical product, in a sample of tissue or body fluid from a living patient, or in post-mortem material...

  9. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  10. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    Science.gov (United States)

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
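
    As an illustration of the interpolation step described above (a sketch only: the sample locations and concentration values are synthetic, and Gaussian-process regression is used here as a kriging-style stand-in rather than the authors' along-track kriging code), the snippet below fits scattered measurements and predicts a gridded field with an accompanying uncertainty map.

# Sketch: kriging-style interpolation of scattered water-quality samples onto a grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(60, 2))                                  # synthetic sample locations (km)
chl = 5 + np.sin(xy[:, 0]) + 0.5 * xy[:, 1] + rng.normal(0, 0.2, 60)   # synthetic concentrations

gp = GaussianProcessRegressor(kernel=1.0 * RBF(length_scale=2.0) + WhiteKernel(0.05),
                              normalize_y=True)
gp.fit(xy, chl)

gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
mean, std = gp.predict(grid, return_std=True)                          # std acts as an uncertainty map
print("predicted range:", mean.min(), mean.max(), "max std:", std.max())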

  11. A Framework for Collaborative Networked Learning in Higher Education: Design & Analysis

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2014-06-01

    This paper presents a comprehensive framework for building collaborative learning networks within higher educational institutions. The framework focuses on systems design and implementation issues in addition to a complete set of evaluation and analysis tools. The objective of this project is to improve the standards of higher education in Jordan through the implementation of transparent, collaborative, innovative, and modern quality educational programs. The framework highlights the major steps required to plan, design, and implement collaborative learning systems. Several issues are discussed, such as unification of courses and programs of study, use of an appropriate learning management system, software development using the Agile methodology, infrastructure design, access issues, proprietary data storage, and social network analysis (SNA) techniques.

  12. Post-Mortem Projections: Medieval Mystical Resurrection and the Return of Tupac Shakur

    OpenAIRE

    Spencer-Hall, Alicia

    2012-01-01

    Medieval hagiographies abound with tales of post-mortem visits and miracles by saints. The saint was a powerful religious individual both in life and in death, a conduit of divine grace and lightning rod for Christian fervour. With her post-mortem presence, the presumptive boundary between living and dead, spirit and flesh, is rent apart: showing the reality of the hereafter and shattering the fantasies of the mortal world. The phenomenon of a glorified individual returning to a worshipful co...

  13. Post-Mortem Projections: Medieval Mystical Resurrection and the Return of Tupac Shakur

    OpenAIRE

    Spencer-Hall, A.

    2012-01-01

    Medieval hagiographies abound with tales of post-mortem visits and miracles by saints. The saint was a powerful religious individual both in life and in death, a conduit of divine grace and lightning rod for Christian fervour. With her post-mortem presence, the presumptive boundary between living and dead, spirit and flesh, is rent apart: showing the reality of the hereafter and shattering the fantasies of the mortal world. The phenomenon of a glorified individual returning to ...

  14. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    This paper shows how a systematic approach to software testing using static code analysis can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, with the Qt platform also currently in use. One of the most important metrics is the so-called software complexity. Applying the complexity calculation using both the McCabe and Halstead methods to the BCI framework, which covers two important types of BCI, SSVEP and P300, we found two classes in the framework that are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects: MPC is less than 20, average complexity is around 5, and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
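
    To make the complexity metric named in this record concrete, here is a rough sketch that approximates McCabe's cyclomatic complexity (1 plus the number of decision points) for Python functions via the standard library ast module; the record's framework is C++/Qt, so this only illustrates the metric, not the tool actually used there.

# Rough cyclomatic-complexity estimate: 1 + number of decision points found in the AST.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))
    return 1 + decisions

sample = """
def classify(x):
    if x is None:
        return "missing"
    for v in x:
        if v < 0 and v != -1:
            return "invalid"
    return "ok"
"""
print(cyclomatic_complexity(sample))  # flag functions above a chosen threshold, e.g. 10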

  15. Teaching and Learning Numerical Analysis and Optimization: A Didactic Framework and Applications of Inquiry-Based Learning

    Science.gov (United States)

    Lappas, Pantelis Z.; Kritikos, Manolis N.

    2018-01-01

    The main objective of this paper is to propose a didactic framework for teaching Applied Mathematics in higher education. After describing the structure of the framework, several applications of inquiry-based learning in teaching numerical analysis and optimization are provided to illustrate the potential of the proposed framework. The framework…

  16. Sustainability principles in strategic environmental assessment: A framework for analysis and examples from Italian urban planning

    Energy Technology Data Exchange (ETDEWEB)

    Lamorgese, Lydia, E-mail: lydial@tin.it; Geneletti, Davide, E-mail: davide.geneletti@unitn.it

    2013-09-15

    This paper presents a framework for analysing the degree of consideration of sustainability principles in Strategic environmental assessment (SEA), and demonstrates its application to a sample of SEA of Italian urban plans. The framework is based on Gibson's (2006) sustainability principles, which are linked to a number of guidance criteria and eventually to review questions, resulting from an extensive literature review. A total of 71 questions are included in the framework, which gives particular emphasis to key concepts, such as intragenerational and intergenerational equity. The framework was applied to review the Environmental Report of the urban plans of 15 major Italian cities. The results of this review show that, even if sustainability is commonly considered as a pivotal concept, there is still work to be done in order to effectively integrate sustainability principles into SEA. In particular, most of the attention is given to mitigation and compensation measures, rather than to actual attempts to propose more sustainable planning decisions in the first place. Concerning the proposed framework of analysis, further research is required to clarify equity concerns and particularly to identify suitable indicators for operationalizing the concepts of intra/inter-generational equity in decision-making. -- Highlights: ► A framework was developed in order to evaluate planning against sustainability criteria. ► The framework was applied to analyse how sustainable principles are addressed in 15 Italian SEA reports. ► Over 85% of the reports addressed, to some extent, at least 40% of the framework questions. ► Criteria explicitly linked to intra and inter-generational equity are rarely addressed.

  17. The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare

    Science.gov (United States)

    1998-05-26

    USAWC Strategy Research Project by Donald A. Gagliano, M.D. Title: The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare.

  18. Post-mortem imaging compared with autopsy in trauma victims--A systematic review.

    Science.gov (United States)

    Jalalzadeh, Hamid; Giannakopoulos, Georgios F; Berger, Ferco H; Fronczek, Judith; van de Goot, Frank R W; Reijnders, Udo J; Zuidema, Wietse P

    2015-12-01

    Post-mortem imaging or virtual autopsy is a rapidly advancing field of post-mortem investigations of trauma victims. In this review we evaluate the feasibility of complementation or replacement of conventional autopsy by post-mortem imaging in trauma victims. A systematic review was performed in compliance with the PRISMA guidelines. MEDLINE, Embase and Cochrane databases were systematically searched for studies published between January 2008 and January 2014, in which post-mortem imaging was compared to conventional autopsy in trauma victims. Studies were included when two or more trauma victims were investigated. Twenty-six studies were included, with a total number of 563 trauma victims. Post-mortem computer tomography (PMCT) was performed in 22 studies, post-mortem magnetic resonance imaging (PMMRI) in five studies and conventional radiography in two studies. PMCT and PMMRI both demonstrate moderate to high-grade injuries and cause of death accurately. PMCT is more sensitive than conventional autopsy or PMMRI in detecting skeletal injuries. For detecting minor organ and soft tissue injuries, autopsy remains superior to imaging. Aortic injuries are missed frequently by PMCT and PMMRI and form their main limitation. PMCT should be considered as an essential supplement to conventional autopsy in trauma victims since it detects many additional injuries. Despite some major limitations, PMCT could be used as an alternative for conventional autopsy in situations where conventional autopsy is rejected or unavailable. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Forensic radiology: The role of cross-sectional imaging in virtual post-mortem examinations

    International Nuclear Information System (INIS)

    Higginbotham-Jones, Joshua; Ward, Anthony

    2014-01-01

    Aim: The aim of this review is to assess the benefits and limitations of using Multi Slice Computed Tomography and Magnetic Resonance as non-invasive post-mortem imaging methods. Method: The author utilised SciVerse (Science Direct), Scopus, PubMed and Discover to search for relevant articles. The following search terms were used: virtopsy, minimally invasive post-mortem imaging, autopsy, Multi Slice Computed Tomography, Magnetic Resonance. Articles which discussed the use of non-invasive imaging techniques for post-mortem examinations were included in the review. Any articles published before 2003 were excluded with a few exceptions. Findings: The decline in use of the conventional post-mortem method has led to the need for an alternative method of investigation which increases both sensitivity and specificity, and also is more acceptable to the family of the deceased. Discussion/conclusion: There are numerous factors affecting the usability of these non-invasive post-mortem options including cost and availability. With the price of non-invasive post-mortem examinations often rising above £1000, it is considered to be less economically viable than the conventional method. Therefore, further research into this method and its implementation in hospitals has been delayed

  20. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change assessed was based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.
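
    A minimal sketch of the indicator roll-up described above follows (the indicator names, values, weights, and directions of "goodness" are invented for illustration; the record does not specify the actual SEA indicator suite).

# Sketch: min-max normalise landscape indicators per year and roll them up
# into a single weighted wetland-sustainability index.
import numpy as np

years = [1985, 1995, 2005, 2011]
indicators = np.array([          # rows = years; columns = hypothetical indicators:
    [1200.0, 4.1,  8.0],         # wetland area (ha), mean patch size (ha), impervious cover (%)
    [1100.0, 3.6, 14.0],
    [ 950.0, 2.9, 22.0],
    [ 870.0, 2.5, 29.0],
])
higher_is_better = np.array([True, True, False])
weights = np.array([0.4, 0.3, 0.3])                           # assumed relative importance

lo, hi = indicators.min(0), indicators.max(0)
norm = (indicators - lo) / (hi - lo)                          # min-max normalise to [0, 1]
norm[:, ~higher_is_better] = 1.0 - norm[:, ~higher_is_better] # flip "lower is better" columns

index = norm @ weights
for y, v in zip(years, index):
    print(y, round(float(v), 2))   # a declining index signals decreasing wetland sustainability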

  1. Understanding Universities in Ontario, Canada: An Industry Analysis Using Porter's Five Forces Framework

    Science.gov (United States)

    Pringle, James; Huisman, Jeroen

    2011-01-01

    In analyses of higher education systems, many models and frameworks are based on governance, steering, or coordination models. Although much can be gained by such analyses, we argue that the language used in the present-day policy documents (knowledge economy, competitive position, etc.) calls for an analysis of higher education as an industry. In…

  2. Using a Strategic Planning Tool as a Framework for Case Analysis

    Science.gov (United States)

    Lai, Christine A.; Rivera, Julio C., Jr.

    2006-01-01

    In this article, the authors describe how they use a strategic planning tool known as SWOT as a framework for case analysis, using it to analyze the strengths, weaknesses, opportunities, and threats of a public works project intended to enhance regional economic development in Tempe, Arizona. Students consider the project in light of a variety of…

  3. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe; Dalcin, Lisandro; Collier, Nathan; Calo, Victor M.

    2014-01-01

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation
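
    For context, the Cahn-Hilliard phase-field model named in this record is commonly written in the following standard form (generic textbook notation, not PetIGA-specific symbols: M is a mobility, \lambda an interface parameter, and f a double-well free-energy density):

\frac{\partial c}{\partial t} = \nabla \cdot \left( M \, \nabla \mu \right),
\qquad
\mu = f'(c) - \lambda \, \Delta c,
\qquad
f(c) = \tfrac{1}{4}\left(c^{2} - 1\right)^{2},

so that eliminating the chemical potential \mu gives the fourth-order equation
\frac{\partial c}{\partial t} = \nabla \cdot \left( M \, \nabla \left( f'(c) - \lambda \, \Delta c \right) \right).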

  4. The SAFE FOODS Risk Analysis Framework suitable for GMOs? A case study

    NARCIS (Netherlands)

    Kuiper, H.A.; Davies, H.V.

    2010-01-01

    This paper describes the current EU regulatory framework for risk analysis of genetically modified (GM) crop cultivation and market introduction of derived food/feed. Furthermore the risk assessment strategies for GM crops and derived food/feed as designed by the European Food Safety Authority

  5. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  6. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has

  7. Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework

    Science.gov (United States)

    2011-03-01

    Thesis by Jeffrey B. Scott: Automated Analysis of ARM Binaries Using the Low-Level Virtual Machine Compiler Framework. ABACAS offers a level of flexibility in software development that would be very useful later in the software engineering life cycle.

  8. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework, including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system, while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment.

  9. Is survival improved by the use of NIV and PEG in amyotrophic lateral sclerosis (ALS)? A post-mortem study of 80 ALS patients

    OpenAIRE

    Burkhardt, Christian; Neuwirth, Christoph; Sommacal, Andreas; Andersen, Peter M.; Weber, Markus

    2017-01-01

    Background: Non-invasive ventilation (NIV) and percutaneous gastrostomy (PEG) are guideline-recommended interventions for symptom management in amyotrophic lateral sclerosis (ALS). Their effect on survival is controversial and the impact on causes of death is unknown. Objective: To investigate the effect of NIV and PEG on survival and causes of death in ALS patients. Methods: Eighty deceased ALS patients underwent a complete post mortem analysis for causes of death between 2003 and 2015. Fort...

  10. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
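
    The inlier/outlier segmentation described above can be sketched as follows (a simplified illustration rather than the authors' implementation: the per-window features are random stand-ins, affinities come from a Gaussian kernel on pairwise distances, and the dominant eigenvector of the affinity matrix is used to separate the background cluster from candidate "highlight" windows).

# Sketch: flag feature windows weakly connected to the dominant (background) cluster as outliers.
import numpy as np

rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(95, 8))   # "usual" audio windows (synthetic features)
events = rng.normal(4.0, 1.0, size=(5, 8))        # sparse "interesting" windows
features = np.vstack([background, events])

d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
affinity = np.exp(-d2 / (2.0 * np.median(d2)))                      # Gaussian-kernel affinity matrix

eigvals, eigvecs = np.linalg.eigh(affinity)
v = np.abs(eigvecs[:, -1])          # dominant eigenvector: large entries ~ background cluster
score = 1.0 - v / v.max()           # crude outlier-confidence score in [0, 1]
print("suspected event windows:", np.argsort(score)[-5:])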

  11. A framework for the economic analysis of data collection methods for vital statistics.

    Science.gov (United States)

    Jimenez-Soto, Eliana; Hodge, Andrew; Nguyen, Kim-Huong; Dettrick, Zoe; Lopez, Alan D

    2014-01-01

    Over recent years there has been a strong movement towards the improvement of vital statistics and other types of health data that inform evidence-based policies. Collecting such data is not cost free. To date there is no systematic framework to guide investment decisions on methods of data collection for vital statistics or health information in general. We developed a framework to systematically assess the comparative costs and outcomes/benefits of the various data collection methods (DCMs) for vital statistics. The proposed framework is four-pronged and utilises two major economic approaches to systematically assess the available data collection methods: cost-effectiveness analysis and efficiency analysis. We built a stylised example of a hypothetical low-income country to perform a simulation exercise in order to illustrate an application of the framework. Using simulated data, the results from the stylised example show that the rankings of the data collection methods are not affected by the use of either cost-effectiveness or efficiency analysis. However, the rankings are affected by how quantities are measured. There have been several calls for global improvements in collecting usable data, including vital statistics, from health information systems to inform public health policies. Ours is the first study that proposes a systematic framework to assist countries in undertaking an economic evaluation of DCMs. Despite numerous challenges, we demonstrate that a systematic assessment of outputs and costs of DCMs is not only necessary, but also feasible. The proposed framework is general enough to be easily extended to other areas of health information.

  12. Feather retention force in broilers ante-, peri-, and post-mortem as influenced by electrical and carbon dioxide stunning.

    Science.gov (United States)

    Buhr, R J; Cason, J A; Rowland, G N

    1997-11-01

    Stunning and slaughter trials were conducted to evaluate the influence of stunning method (electrical 50 V alternating current, CO2 gas: 0 to 40% for 90 s or 40 to 60% for 30 s) on feather retention force (FRF) in commercial broilers. Feathers from the pectoral, sternal, and femoral feather tracts were sampled with a force gauge before stunning (ante-mortem) and contralaterally either after stunning (peri-mortem from 0.5 to 4 min) or after stunning and bleeding (post-mortem from 2 to 6 min). Prior to stunning, ante-mortem FRF values varied among assigned stunning methods only for the pectoral (7%) feather tract. After stunning, peri-mortem FRF values were higher only for the sternal tract (11% for 40 to 60% CO2 for 30 s); whereas after stunning and bleeding, post-mortem FRF values were lower than ante- or peri-mortem only for the sternal tract (10% lower for 40 to 60% CO2 for 30 s). Peri- and post-mortem FRF values did not differ among stunning methods for the pectoral and femoral feather tracts. Small changes in FRF values occurred from ante-mortem to peri-mortem (-1 to +12%), and from ante-mortem to post-mortem (-2 to +8%) across stunning methods. A significant increase was determined for only the pectoral tract (7%) from ante- to peri-mortem across stunning methods. Electrically stunned broilers that were not bled gained weight in excess of the 36 feathers removed (0.16%), apparently due to body surface water pickup during the brine-stunning process, whereas CO2-stunned broilers lost weight due to excretion of cloacal contents (-0.31 to -0.98%). The change in body weight among stunning methods was significant (P defeathering efficiency may not differ after scalding.

  13. Breast density quantification with cone-beam CT: a post-mortem study

    International Nuclear Information System (INIS)

    Johnson, Travis; Ding, Huanjun; Le, Huy Q; Ducote, Justin L; Molloi, Sabee

    2013-01-01

    Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The per cent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson's r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. (paper)
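
    As a sketch of the fuzzy c-means step used for the density segmentation above (generic FCM on one-dimensional voxel intensities; the intensity values, number of clusters, and fuzzifier are illustrative and no CT calibration is modelled):

# Minimal fuzzy c-means on 1-D voxel intensities: two clusters (adipose-like vs fibroglandular-like).
import numpy as np

def fuzzy_cmeans(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(n_clusters), size=x.size)     # membership matrix, shape (N, C)
    for _ in range(n_iter):
        w = u ** m
        centers = (w * x[:, None]).sum(0) / w.sum(0)        # fuzzily weighted cluster centers
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / dist ** (2.0 / (m - 1.0))
        u /= u.sum(1, keepdims=True)                        # renormalise memberships per voxel
    return centers, u

rng = np.random.default_rng(1)
voxels = np.concatenate([rng.normal(-100, 20, 7000),        # adipose-like intensities (illustrative)
                         rng.normal(40, 20, 3000)])         # fibroglandular-like intensities
centers, u = fuzzy_cmeans(voxels)
dense = int(np.argmax(centers))                             # cluster with the higher center
fgv_percent = 100.0 * (u[:, dense] > 0.5).mean()
print("estimated %FGV:", round(fgv_percent, 1))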

  14. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we had proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  15. An Integrated Strategy Framework (ISF) for Combining Porter's 5-Forces, Diamond, PESTEL, and SWOT Analysis

    OpenAIRE

    Anton, Roman

    2015-01-01

    INTRODUCTION Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy framework (ISF) combines all major concepts. PURPOSE Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy fr...

  16. A Stochastic Hybrid Systems framework for analysis of Markov reward models

    International Nuclear Information System (INIS)

    Dhople, S.V.; DeVille, L.; Domínguez-García, A.D.

    2014-01-01

    In this paper, we propose a framework to analyze Markov reward models, which are commonly used in system performability analysis. The framework builds on a set of analytical tools developed for a class of stochastic processes referred to as Stochastic Hybrid Systems (SHS). The state space of an SHS is comprised of: (i) a discrete state that describes the possible configurations/modes that a system can adopt, which includes the nominal (non-faulty) operational mode, but also those operational modes that arise due to component faults, and (ii) a continuous state that describes the reward. Discrete state transitions are stochastic, and governed by transition rates that are (in general) a function of time and the value of the continuous state. The evolution of the continuous state is described by a stochastic differential equation and reward measures are defined as functions of the continuous state. Additionally, each transition is associated with a reset map that defines the mapping between the pre- and post-transition values of the discrete and continuous states; these mappings enable the definition of impulses and losses in the reward. The proposed SHS-based framework unifies the analysis of a variety of previously studied reward models. We illustrate the application of the framework to performability analysis via analytical and numerical examples
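
    A minimal simulation of the kind of model described above (illustrative only: a two-mode system whose discrete state switches at assumed constant rates, a continuous reward that accrues at a mode-dependent rate, and a reset map that applies an impulse loss on the failure transition):

# Sketch: simulate a Markov reward model as a simple stochastic hybrid system.
# Discrete state: 0 = nominal, 1 = degraded. Continuous state: accumulated reward.
import numpy as np

RATES = {0: 0.1, 1: 0.5}                 # transition rates out of each mode (assumed, per hour)
REWARD_RATE = {0: 1.0, 1: 0.3}           # reward accrual rate in each mode
IMPULSE = {(0, 1): -2.0, (1, 0): 0.0}    # reset map: impulse applied on each transition

def simulate(t_end=1000.0, seed=0):
    rng = np.random.default_rng(seed)
    t, mode, reward = 0.0, 0, 0.0
    while t < t_end:
        dwell = min(rng.exponential(1.0 / RATES[mode]), t_end - t)  # sojourn in current mode
        reward += REWARD_RATE[mode] * dwell                         # continuous reward accrual
        t += dwell
        if t < t_end:
            new_mode = 1 - mode
            reward += IMPULSE[(mode, new_mode)]                     # impulse from the reset map
            mode = new_mode
    return reward

samples = [simulate(seed=s) for s in range(200)]
print("mean accumulated reward over 200 runs:", round(float(np.mean(samples)), 1))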

  17. Stomach: ultrasonography evaluation and post mortem inspection in adult horses

    Directory of Open Access Journals (Sweden)

    Cristiano Chaves Pessoa da Veiga

    2014-06-01

    ABSTRACT. Veiga C.C.P., Cascon C.M., Souza B.G., Braga L.S.M., Souza V.C., Ferreira A.M.R. & Leite J.S. [Stomach: ultrasonography evaluation and post mortem inspection in adult horses.] Avaliação ultrassonográfica e anatomopatológica macroscópica do estômago de equinos destinados ao abate comercial. Revista Brasileira de Medicina Veterinária, 36(2):125-130, 2014. Instituto de Veterinária, Universidade Federal Rural do Rio de Janeiro, BR 465, km 7, Seropédica, 23890-000, RJ, Brasil. E-mail: radiovet@ufrrj.br. Equine gastric ulcer syndrome (EGUS) includes all symptomatic or asymptomatic cases of erosions, ulcers, gastritis, gastric emptying disorders, duodenitis, duodenal ulcers and complications of these disorders. It occupies a prominent place in equine clinical practice and can lead to the death of the animal. Ultrasonography of the stomach is indicated when animals show clinical signs of gastric disease. The aim of this study was to describe the sonographic evaluation and macroscopic pathological findings of the stomach of adult horses intended for commercial slaughter. For this purpose, 39 horses intended for commercial slaughter were evaluated. Transabdominal sonographic evaluation of the stomach via the left side of the abdomen was performed before slaughter. After slaughter, the stomachs were collected, evaluated and photographed. The study concluded that ultrasonography identified the stomach in all animals evaluated, but did not allow a careful evaluation of the entire length of the viscus, especially the aglandular region and the pleated border (margo plicatus). All animals evaluated had gastric mucosal injury of varying degrees. The region of the stomach most affected by injuries was the glandular region, although the most severe lesions were found along the pleated border adjacent to the aglandular region.

  18. O líquido cefalorraqueano no post-mortem [The cerebrospinal fluid in the post-mortem period]

    Directory of Open Access Journals (Sweden)

    A. Spina-França

    1969-12-01

    The CSF of 45 cadavers was studied, and the results were considered as a function of the time elapsed between the moment of death and CSF collection (TOC). According to this criterion, the cases were grouped as follows: 1) those with a TOC of up to 4 hours; 2) those with a TOC of 4 to 8 hours; 3) those with a TOC of 8 hours or more. With increasing TOC, the presence of red blood cells in cadaveric CSF becomes more frequent and more pronounced. Admixture of blood with the CSF impairs the assessment of post-mortem changes in other CSF components, as was demonstrated for chloride, glucose and total protein concentrations, for the protein profile and for transaminase activity. Therefore, to evaluate the post-mortem changes in CSF composition, only cases with fewer than 1000 red blood cells/mm³ should be considered. A normal leukocyte count was proportionally more common in samples from cadavers whose TOC was 8 hours or more. Pleocytosis was observed more frequently than a normal leukocyte count, and was most commonly slight or mild. Counts above 50 leukocytes/mm³ were generally observed in cases of patients who died in the course of acute infectious processes. Chloride and glucose concentrations in the CSF tend to fall post mortem, and the decreases were, on average, more marked the longer the TOC. Hypoglycorrhachia was, on average, more pronounced in cases with more intense pleocytosis. The urea concentration tends to rise early, and no significantly different means were found as a function of TOC. TGO (AST) activity tends to rise post mortem, and this rise was, on average, clearer from the group of cases with a TOC of 4 to 8 hours onward. There is also a tendency for TGP (ALT) activity to increase; this was less pronounced than that of TGO and, on average, was clearer

  19. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    Science.gov (United States)

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction: e-Health is an innovative way to make health services more effective and efficient, and its application is increasing worldwide. e-Health represents a substantial ICT investment and its failure usually results in substantial losses in time, money (including opportunity costs) and effort. Therefore it is important to assess e-health readiness prior to implementation. Several frameworks have been published on e-health readiness assessment, under various circumstances and in various geographical regions of the world. However, their utility for the developing world is unknown. Methods: A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine if any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion: Eight types of e-health readiness were identified, and no paper directly addressed all of these. The frameworks were based upon varying assumptions and perspectives. There was no underlying unifying theory underpinning the frameworks. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  20. PageRank, HITS and a unified framework for link analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to same ranking which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
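
    For reference, both ranking schemes compared in this record can be computed by short power iterations; the sketch below runs textbook PageRank and HITS on a small assumed link graph (it does not implement the paper's intermediate normalized variants).

# Textbook PageRank and HITS by power iteration on a tiny directed link graph.
import numpy as np

A = np.array([[0, 1, 1, 0],     # adjacency: A[i, j] = 1 if page i links to page j
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def pagerank(A, d=0.85, n_iter=100):
    n = A.shape[0]
    out = A.sum(1, keepdims=True)
    P = np.where(out > 0, A / np.where(out == 0, 1, out), 1.0 / n)  # row-stochastic; dangling -> uniform
    r = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        r = (1 - d) / n + d * (r @ P)
    return r / r.sum()

def hits(A, n_iter=100):
    h = np.ones(A.shape[0])
    a = np.ones(A.shape[0])
    for _ in range(n_iter):
        a = A.T @ h; a /= np.linalg.norm(a)   # authority score: pointed to by good hubs
        h = A @ a;  h /= np.linalg.norm(h)    # hub score: points to good authorities
    return a, h

print("PageRank:", np.round(pagerank(A), 3))
auth, hub = hits(A)
print("HITS authority:", np.round(auth, 3), "hub:", np.round(hub, 3))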

  1. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). This work critically analyses frameworks and approaches which have been developed or proposed by large organizations or regulatory bodies for NM. These frameworks and approaches were evaluated and assessed based on a select number of criteria which have been previously proposed as important parameters for inclusion in successful risk assessment. (Authors affiliated with the Technical University of Denmark, Kgs. Lyngby, Denmark, and the Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA.)

  2. Using the framework method for the analysis of qualitative data in multi-disciplinary health research.

    Science.gov (United States)

    Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi

    2013-09-18

    The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.

  3. A unified framework for risk and vulnerability analysis covering both safety and security

    International Nuclear Information System (INIS)

    Aven, Terje

    2007-01-01

    Recently, we have seen several attempts to establish adequate risk and vulnerability analysis tools and related management frameworks dealing not only with accidental events but also with security problems. These attempts have been based on different analysis approaches and alternative building blocks. In this paper, we discuss some of these and show how a unified framework for such analyses and management tasks can be developed. The framework is based on the use of probability as a measure of uncertainty, as seen through the eyes of the assessor, and defines risk as the combination of possible consequences and related uncertainties. Risk and vulnerability characterizations are introduced, incorporating ideas both from the vulnerability analysis literature and from the risk classification scheme introduced by Renn and Klinke

  4. Vital analysis: field validation of a framework for annotating biological signals of first responders in action.

    Science.gov (United States)

    Gomes, P; Lopes, B; Coimbra, M

    2012-01-01

    First responders are professionals who are exposed to extreme stress and fatigue over extended periods of time. That is why it is necessary to research and develop technological solutions based on wearable sensors that can continuously monitor the health of these professionals in action, namely their stress and fatigue levels. In this paper we present the Vital Analysis smartphone-based framework, integrated into the broader Vital Responder project, which allows the annotation and contextualization of the signals collected during real action. After a contextual study, we implemented and deployed this framework in a firefighter team of five members, from which we collected over 3300 hours of annotations during 174 days, covering 382 different events. Results are analysed and discussed, validating the framework as a useful and usable tool for annotating biological signals of first responders in action.

  5. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance in novel situations have made cognitive reliability an urgent and essential aspect of system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving which are difficult to recover from under the task constraints imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other, become the focal points of the proposed framework, which is illustrated in an analysis of a simulated emergency

  6. Post-mortem radiography of the lungs: Experiments to compare various methods of examination and descriptions of their usefulness in actual practice

    International Nuclear Information System (INIS)

    Pankow, W.

    1986-01-01

    Described is the post-mortem examination of the isolated lung using radiologic-morphologic and histologic methods. Comparisons are made regarding the practical value of the conservation techniques chosen: the method of Markaria and Heitzmann, which is based on fixation in alcohol and air drying, and the nitrogen freezing method developed by Rau and colleagues. Both methods ensure adequate visualization of the pulmonary ultrastructure by X-rays, even though this observation should be qualified by the fact that pulmonary tissue fixed in alcohol tends to shrink and that intra-alveolar edema is thus artificially reduced. Either of the methods under investigation permits angiographic and bronchographic examinations to be carried out without difficulty. In macroscopic evaluations better results are obtained for lungs fixed in alcohol. Freeze-dried samples offer advantages in the histological assessment of the pulmonary ultrastructure. Post-mortem radiography of the lungs is particularly valuable in the analysis of pathological changes in the pulmonary structure. (MBC) [de

  7. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
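
    As an illustration of the multi-level, module-chain design described above, the following Python sketch shows how raw traces could be reduced to condensed parameters by configurable modules. It is purely schematic: GELATIO itself is written in C++ on top of ROOT, and none of the class or parameter names below come from its actual interface.

        # Illustrative sketch (not the GELATIO API): a chain of analysis modules that
        # turns raw detector traces into condensed parameters, configured by a dict
        # standing in for a human-readable initialization file.
        import numpy as np

        class Module:
            """Base class: each module reads one analysis level and writes the next."""
            def run(self, event):
                raise NotImplementedError

        class BaselineSubtraction(Module):
            def __init__(self, n_baseline):
                self.n_baseline = n_baseline          # samples used to estimate the baseline
            def run(self, event):
                trace = event["raw_trace"]
                event["trace"] = trace - trace[: self.n_baseline].mean()
                return event

        class EnergyEstimator(Module):
            def run(self, event):
                event["energy"] = float(event["trace"].max())   # condensed parameter
                return event

        config = {"BaselineSubtraction": {"n_baseline": 100}}    # stand-in for an ini file
        chain = [BaselineSubtraction(**config["BaselineSubtraction"]), EnergyEstimator()]

        event = {"raw_trace": np.random.normal(0.0, 1.0, 1000) + np.linspace(0.0, 5.0, 1000)}
        for module in chain:
            event = module.run(event)
        print(event["energy"])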

  8. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    Science.gov (United States)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  9. Late stillbirth post mortem examination in New Zealand: Maternal decision-making.

    Science.gov (United States)

    Cronin, Robin S; Li, Minglan; Wise, Michelle; Bradford, Billie; Culling, Vicki; Zuccollo, Jane; Thompson, John M D; Mitchell, Edwin A; McCowan, Lesley M E

    2018-03-05

    For parents who experience stillbirth, knowing the cause of their baby's death is important. A post mortem examination is the gold standard investigation, but little is known about what may influence parents' decisions to accept or decline. We aimed to identify factors influencing maternal decision-making about post mortem examination after late stillbirth. In the New Zealand Multicentre Stillbirth Study, 169 women with singleton pregnancies, no known abnormality at recruitment, and late stillbirth (≥28 weeks' gestation), from seven health regions, were interviewed within six weeks of birth. The purpose of this paper was to explore factors related to post mortem examination decision-making and the reasons for declining. We asked women if they would make the same decision again. The maternal decision to decline a post mortem (70/169, 41.4%) was more common among women of Māori (adjusted odds ratio (aOR) 4.99, 95% confidence interval (CI) 1.70-14.64) and Pacific (aOR 3.94, 95% CI 1.47-10.54) ethnicity compared to European, and with parity of two or more (aOR 2.95, 95% CI 1.14-7.62) compared to primiparous women. The main reason for declining was that women 'did not want baby to be cut'. Ten percent (7/70) of those who declined said they would not make this decision again. No woman who consented regretted her decision. The ethnic differences observed in women's post mortem decision-making should be further explored in future studies. Providing information on the effect of post mortem on the baby's body and on the possible emotional benefits of a post mortem may assist women faced with this decision in the future. © 2018 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  10. Rigor mortis at the myocardium investigated by post-mortem magnetic resonance imaging.

    Science.gov (United States)

    Bonzon, Jérôme; Schön, Corinna A; Schwendener, Nicole; Zech, Wolf-Dieter; Kara, Levent; Persson, Anders; Jackowski, Christian

    2015-12-01

    Post-mortem cardiac MR exams present with different contraction appearances of the left ventricle in cardiac short axis images. It was hypothesized that the grade of post-mortem contraction may be related to the post-mortem interval (PMI) or the cause of death, and that it is a phenomenon caused by internal rigor mortis that may give further insights into the circumstances of death. The cardiac contraction grade was investigated in 71 post-mortem cardiac MR exams (mean age at death 52 y, range 12-89 y; 48 males, 23 females). In cardiac short axis images the left ventricular lumen volume as well as the left ventricular myocardial volume were assessed by manual segmentation. The quotient of both (LVQ) represents the grade of myocardial contraction. LVQ was correlated with the PMI, sex, age, cardiac weight, body mass and height, cause of death and pericardial tamponade when present. For cardiac causes of death a separate correlation was investigated for acute myocardial infarction cases and arrhythmic deaths. LVQ values ranged from 1.99 (maximum dilatation) to 42.91 (maximum contraction) with a mean of 15.13. LVQ decreased slightly with increasing PMI, however without significant correlation. Pericardial tamponade positively correlated with higher LVQ values. Variables such as sex, age, body mass and height, cardiac weight and cause of death did not correlate with LVQ values. There was no difference in LVQ values between myocardial infarction without tamponade and arrhythmic deaths. Based on the observations in our investigated cases, the phenomenon of post-mortem myocardial contraction cannot be explained by the influence of the investigated variables, except in pericardial tamponade cases. Further research addressing post-mortem myocardial contraction has to focus on other, less obvious factors, which may also influence the early post-mortem phase. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Prevalence and concordance between the clinical and the post-mortem diagnosis of dementia in a psychogeriatric clinic.

    Science.gov (United States)

    Grandal Leiros, B; Pérez Méndez, L I; Zelaya Huerta, M V; Moreno Eguinoa, L; García-Bragado, F; Tuñón Álvarez, T; Roldán Larreta, J J

    The aim of our study is to describe the types of dementia found in a series of patients and to estimate the level of agreement between the clinical diagnosis and the post-mortem diagnosis. We conducted a descriptive analysis of the prevalence of the types of dementia found in our series and established the level of concordance between the clinical and the post-mortem diagnoses. Diagnoses were made according to current diagnostic criteria. 114 cases were included. The most common diagnoses, both clinically and at autopsy, were Alzheimer disease and mixed dementia, but their prevalences differed considerably: 39% for Alzheimer disease and 18% for mixed dementia at the clinical level, compared with 22% and 34%, respectively, at autopsy. The agreement between the clinical and the autopsy diagnoses was 62% (95% CI 53-72%). Almost a third of our patients were not correctly diagnosed in vivo. The most common error was underdiagnosis of cerebrovascular pathology. Copyright © 2016 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  12. The importance of post-mortem computed tomography (PMCT) in confrontation with conventional forensic autopsy of victims of motorcycle accidents.

    Science.gov (United States)

    Moskała, Artur; Woźniak, Krzysztof; Kluza, Piotr; Romaszko, Karol; Lopatin, Oleksij

    2016-01-01

    Since traffic accidents are an important problem in forensic medicine, there is a constant search for new solutions to assist the investigation process in such cases. In recent years there has been a rapid development of post-mortem imaging techniques, especially post-mortem computed tomography (PMCT). In our work we concentrated on the potential advantage of PMCT in cases of motorcycle accident fatalities. The results of forensic autopsy were compared with the combined results of autopsy and PMCT to determine in which areas the use of these two techniques gives a statistically significant increase in the number of findings. The hypothesis was confirmed for pneumothorax and for fractures of the skull, spine, clavicle, scapula and lower leg bones. For most other bone fracture locations and for brain injuries there were single cases with pathologies visible only on PMCT, but too few to reach statistical significance. For injuries of solid organs and soft tissues, statistical analysis did not confirm any advantage of unenhanced PMCT. On the whole it has been shown that PMCT used as an adjunct to forensic autopsy can increase the information obtained about vitally important regions in motorcycle accident fatalities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Post-mortem hemoparasite detection in free-living Brazilian brown brocket deer (Mazama gouazoubira, Fischer 1814).

    Science.gov (United States)

    Silveira, Júlia Angélica Gonçalves da; Rabelo, Elida Mara Leite; Lima, Paula Cristina Senra; Chaves, Bárbara Neves; Ribeiro, Múcio Flávio Barbosa

    2014-01-01

    Tick-borne infections can result in serious health problems for wild ruminants, and some of these infectious agents can be considered zoonoses. The aim of the present study was the post-mortem detection of hemoparasites in free-living Mazama gouazoubira from Minas Gerais state, Brazil. The deer samples consisted of free-living M. gouazoubira (n = 9) individuals that died after capture. Necropsy examinations of the carcasses were performed to search for macroscopic alterations. Organ samples were collected for subsequent imprint slides, and nested PCR assays were performed to detect hemoparasite species. Imprint slide assays from four deer showed erythrocytes infected with Piroplasmida small trophozoites, and Anaplasma marginale corpuscles were observed in erythrocytes from two animals. A. marginale and trophozoite co-infections occurred in two deer. Nested PCR analysis of the organs showed that six of the nine samples were positive for Theileria sp., five were positive for A. phagocytophilum and three were positive for A. marginale, with co-infection occurring in four deer. The results of the present study demonstrate that post-mortem diagnostics using imprint slides and molecular assays are an effective method for detecting hemoparasites in organs.

  14. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  15. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactively query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
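
    The clustering-based, unsupervised detection described above can be illustrated with a minimal sketch in which points that no cluster claims are treated as candidate anomalies. The sketch below uses scikit-learn's DBSCAN on synthetic data; the feature layout and thresholds are assumptions made for illustration only, not the authors' algorithm.

        # Minimal sketch of clustering-based outlier detection (illustrative only,
        # not the authors' implementation): DBSCAN labels points that belong to no
        # cluster as -1, which we treat as candidate anomalies.
        import numpy as np
        from sklearn.cluster import DBSCAN
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Synthetic "satellite" features: (latitude, longitude, measured value)
        normal = rng.normal(loc=[45.0, 10.0, 1.0], scale=[1.0, 1.0, 0.1], size=(500, 3))
        odd = rng.normal(loc=[45.0, 10.0, 5.0], scale=[0.2, 0.2, 0.1], size=(5, 3))
        X = np.vstack([normal, odd])

        labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(StandardScaler().fit_transform(X))
        outliers = np.flatnonzero(labels == -1)
        print(f"{outliers.size} candidate anomalies out of {X.shape[0]} points")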

  16. Practical experience in post-mortem tissue donation in consideration of the European tissue law.

    Science.gov (United States)

    Karbe, Thomas; Braun, Christian; Wulff, Birgit; Schröder, Ann Sophie; Püschel, Klaus; Bratzke, Hansjürgen; Parzeller, Markus

    2010-03-01

    As a consequence of the European guidelines on safety and quality standards for the donation, retrieval, storage and distribution of human tissues and cells, the corresponding provisions on tissue transplantation were incorporated into German legislation in May 2007. The law came into effect on August 1st 2007, taking the European rules into account. The Institutes for Legal Medicine of the University of Frankfurt/Main and of the University Medical Center Hamburg-Eppendorf developed a model for tissue retrieval. The Institute of Legal Medicine (I.f.R.) at the University Medical Center Hamburg cooperates with the German Institute of Cell and Tissue Replacement (Deutsches Institut für Zell- und Gewebeersatz, DIZG). Potential post-mortem tissue donors (PMTD) among the deceased are selected by standardized sets of defined criteria. The procedure is guided by the intended exclusion criteria of the draft tissue regulation (German Transplant Law, TPG GewV) in accordance with the European Guideline (2006/17/EC). Following identification of the donor and subsequent removal of tissue, the retrieved samples are sent to the DIZG, a non-profit tissue bank under the tissue regulation. There, the final processing into transplantable tissue grafts takes place, which then results in the allocation of tissue to hospitals in Germany and other European countries. The Center of Legal Medicine at the Johann Wolfgang Goethe-University Medical Center Frankfurt/Main has cooperated since 2000 with Tutogen, a pharmaceutical company. Harvesting of musculoskeletal tissues follows the corresponding regulations. To verify the outcome of PMTD at the I.f.R. Hamburg, two statistical analyses covering 12 and 4 months were carried out. Our results show an increasing number of potentially suitable PMTD within the second inquiry interval, but a relatively small and unvarying rate of successful post-mortem tissue retrievals, similar to the first examination period. Thus, the aim of the model developed by the I.f.R. is to

  17. The cerebrospinal fluid in the post-mortem (O líquido cefalorraqueano no post-mortem)

    Directory of Open Access Journals (Sweden)

    A. Spina-França

    1969-12-01

    The CSF of 45 cadavers was studied, and the results were considered as a function of the time elapsed between the moment of death and the collection of the CSF (TOC). According to this criterion the cases were grouped as follows: 1) those with a TOC of up to 4 hours; 2) those with a TOC of 4 to 8 hours; 3) those with a TOC of 8 hours or more. With increasing TOC, the presence of red blood cells in cadaveric CSF becomes more frequent and more pronounced. The admixture of blood with the CSF compromises the assessment of post-mortem changes in other CSF components, as demonstrated for the concentrations of chloride, glucose and total protein, for the protein profile and for transaminase activity. Therefore, only cases with fewer than 1000 red blood cells/mm³ should be considered when evaluating the post-mortem changes in CSF composition. A normal leukocyte count was proportionally more common in samples from cadavers whose TOC was 8 hours or more. Pleocytosis was observed more frequently than a normal leukocyte count, and was most commonly slight or mild. Counts above 50 leukocytes/mm³ were generally observed in cases of patients who died in the course of acute infectious processes. The concentrations of chloride and glucose in the CSF tend to fall post mortem, and the decreases were, on average, greater the longer the TOC. Hypoglycorrhachia was, on average, more marked in the cases with more intense pleocytosis. The urea concentration tends to rise early, and no significantly different means were found as a function of TOC. GOT activity tends to rise in the post-mortem period, and this rise was, on average, clearer from the group of cases with a TOC of 4 to 8 hours onwards. There is also a tendency for GPT activity to increase; this increase was less marked than that of GOT and, on average, was more evident

  18. Recent advances in metal-organic frameworks and covalent organic frameworks for sample preparation and chromatographic analysis.

    Science.gov (United States)

    Wang, Xuan; Ye, Nengsheng

    2017-12-01

    In the field of analytical chemistry, sample preparation and chromatographic separation are two core procedures. The means by which to improve the sensitivity, selectivity and detection limit of a method have become a topic of great interest. Recently, porous organic frameworks, such as metal-organic frameworks (MOFs) and covalent organic frameworks (COFs), have been widely used in this research area because of their special features, and different methods have been developed. This review summarizes the applications of MOFs and COFs in sample preparation and chromatographic stationary phases. The MOF- or COF-based solid-phase extraction (SPE), solid-phase microextraction (SPME), gas chromatography (GC), high-performance liquid chromatography (HPLC) and capillary electrochromatography (CEC) methods are described. The excellent properties of MOFs and COFs have resulted in intense interest in exploring their performance and mechanisms for sample preparation and chromatographic separation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. UNC-Utah NA-MIC framework for DTI fiber tract analysis.

    Science.gov (United States)

    Verde, Audrey R; Budin, Francois; Berger, Jean-Baptiste; Gupta, Aditya; Farzinfar, Mahshid; Kaiser, Adrien; Ahn, Mihye; Johnson, Hans; Matsui, Joy; Hazlett, Heather C; Sharma, Anuja; Goodlett, Casey; Shi, Yundi; Gouttard, Sylvain; Vachet, Clement; Piven, Joseph; Zhu, Hongtu; Gerig, Guido; Styner, Martin

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exists a number of tractography toolsets, these usually lack tools for preprocessing or to analyze diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUI) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross sectional neuroimaging study of eight healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  20. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.

  1. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of human errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human-error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for the execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.
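
    The "4 Wrongs" check described above lends itself to a small illustration. The sketch below validates a change action against an approved request, its change window, its target configuration item and a whitelist of commands; all identifiers, windows and rules are invented for illustration and do not reflect the HEP Framework's actual implementation.

        # Illustrative sketch only (not the HEP Framework): validate a change action
        # against the "4 Wrongs" before a command is executed.
        from datetime import datetime

        APPROVED_REQUESTS = {"CHG-1042"}
        ALLOWED_WINDOWS = {"CHG-1042": (datetime(2010, 12, 4, 22), datetime(2010, 12, 5, 2))}
        TARGET_CI = {"CHG-1042": "db-server-07"}
        ALLOWED_COMMANDS = {"CHG-1042": {"systemctl restart postgresql"}}

        def four_wrongs_check(request_id, when, ci, command):
            """Return a list of detected 'Wrongs'; an empty list means the action may proceed."""
            wrongs = []
            if request_id not in APPROVED_REQUESTS:
                wrongs.append("wrong request")
            else:
                start, end = ALLOWED_WINDOWS[request_id]
                if not (start <= when <= end):
                    wrongs.append("wrong time")
                if ci != TARGET_CI[request_id]:
                    wrongs.append("wrong configuration item")
                if command not in ALLOWED_COMMANDS[request_id]:
                    wrongs.append("wrong command")
            return wrongs

        print(four_wrongs_check("CHG-1042", datetime(2010, 12, 4, 23), "db-server-07",
                                "systemctl restart postgresql"))   # [] -> allowed
        print(four_wrongs_check("CHG-1042", datetime(2010, 12, 4, 23), "web-server-01",
                                "rm -rf /data"))                   # two wrongs detected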

  2. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. Sentiment analysis can be used to gauge opinion on an issue and to identify the response to something. Millions of digital data items remain unused, although they could provide useful information, especially for government. Sentiment analysis in government is used to monitor government work programs, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for implementing sentiment analysis of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can serve as a reference for decision making in local government.
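
    Since the record names the Support Vector Machine as the supervised classifier, a minimal sketch of that setup may help: TF-IDF features feeding a linear SVM via scikit-learn. The toy tweets and labels below are placeholders, not the study's data or preprocessing.

        # Minimal sketch of SVM-based tweet sentiment classification (illustrative;
        # the tiny labelled set below is a placeholder, not the paper's data).
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        tweets = [
            "program perbaikan jalan sangat membantu",      # positive
            "pelayanan cepat dan ramah, terima kasih",      # positive
            "jalan masih rusak, tidak ada perbaikan",       # negative
            "program ini mengecewakan warga",               # negative
        ]
        labels = ["positive", "positive", "negative", "negative"]

        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
        model.fit(tweets, labels)

        print(model.predict(["perbaikan jalan mengecewakan"]))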

  3. Value of systematic post mortem radiographic examinations of fetuses - 400 cases

    Energy Technology Data Exchange (ETDEWEB)

    Kalifa, G.; Sellier, N.; Barbet, J.P.; Labbe, F.; Houette, A.

    1989-01-01

    A retrospective study of 400 cases of fetal death has been carried out to assess the value of systematic post mortem radiological examination. Apart from general diagnostic purposes, special attention was given to the assessment of bone age and mineralization. The results were correlated with the clinical, U.S., chromosomal and pathological data. Computerized analysis of our information shows the following results: (1) The radiological examination was valuable for the final diagnosis in 13.5% of cases. (2) It brought additional information in 34.5% of cases. (3) It had no diagnostic value in 52%. Furthermore, several points deserve attention, such as the appearance of the teeth (21 weeks) and of the calcaneum (24 weeks). Major osteoporosis was always associated with a constitutional bone disease or an infectious process. An excessive length of the upper limbs (12) was seen in 11 cases of anencephaly. We suggest that a radiological examination should not be routinely performed when the diagnosis is otherwise obvious, but should be considered in the presence of dwarfism or other limb abnormalities and when the gestational age is uncertain. The films provide essential information, especially for further genetic counselling.

  4. Value of systematic post mortem radiographic examinations of fetuses - 400 cases

    International Nuclear Information System (INIS)

    Kalifa, G.; Sellier, N.; Barbet, J.P.; Labbe, F.; Houette, A.

    1989-01-01

    A retrospective study of 400 cases of fetal death has been carried out to assess the value of systematic post mortem radiological examination. Apart from general diagnostic purposes, special attention was given to the assessment of bone age and mineralization. The results were correlated with the clinical, U.S., chromosomal and pathological data. Computerized analysis of our information shows the following results: (1) The radiological examination was valuable for the final diagnosis in 13.5% of cases. (2) It brought additional information in 34.5% of cases. (3) It had no diagnostic value in 52%. Furthermore, several points deserve attention, such as the appearance of the teeth (21 weeks) and of the calcaneum (24 weeks). Major osteoporosis was always associated with a constitutional bone disease or an infectious process. An excessive length of the upper limbs (12) was seen in 11 cases of anencephaly. We suggest that a radiological examination should not be routinely performed when the diagnosis is otherwise obvious, but should be considered in the presence of dwarfism or other limb abnormalities and when the gestational age is uncertain. The films provide essential information, especially for further genetic counselling. (orig./MG)

  5. A generic framework for the description and analysis of energy security in an energy system

    International Nuclear Information System (INIS)

    Hughes, Larry

    2012-01-01

    While many energy security indicators and models have been developed for specific jurisdictions or types of energy, few can be considered sufficiently generic to be applicable to any energy system. This paper presents a framework that attempts to meet this objective by combining the International Energy Agency's definition of energy security with structured systems analysis techniques to create three energy security indicators and a process-flow energy systems model. The framework is applicable to those energy systems which can be described in terms of processes converting or transporting flows of energy to meet the energy–demand flows from downstream processes. Each process affects the environment and is subject to jurisdictional policies. The framework can be employed to capture the evolution of energy security in an energy system by analyzing the results of indicator-specific metrics applied to the energy, demand, and environment flows associated with the system's constituent processes. Energy security policies are treated as flows to processes and classified into one of three actions affecting the process's energy demand or the process or its energy input, or both; the outcome is determined by monitoring changes to the indicators. The paper includes a detailed example of an application of the framework. - Highlights: ► The IEA's definition of energy security is parsed into three energy security indicators: availability, affordability, and acceptability. ► Data flow diagrams and other systems analysis tools can represent an energy system and its processes, flows, and chains. ► Indicator-specific metrics applied to a process's flow determine the state of energy security in an energy system, an energy chain, or process. ► Energy policy is considered as a flow and policy outcomes are obtained by measuring flows with indicator-specific metrics. ► The framework is applicable to most jurisdictions and energy types.
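
    The framework's core idea, applying indicator-specific metrics to the energy, demand and environment flows of each process, can be sketched schematically. In the Python sketch below the process attributes, metric definitions and reference values are all assumptions made for illustration; they are not taken from the paper.

        # Schematic sketch (assumptions throughout): each process carries energy-supply,
        # demand and emission flows; simple indicator metrics are evaluated per process.
        from dataclasses import dataclass

        @dataclass
        class Process:
            name: str
            energy_supplied: float   # energy delivered downstream (PJ)
            energy_demanded: float   # energy requested by downstream processes (PJ)
            unit_cost: float         # cost per unit of energy delivered
            emissions: float         # CO2-equivalent per unit of energy delivered

        def availability(p):                      return p.energy_supplied / p.energy_demanded
        def affordability(p, reference_cost):     return reference_cost / p.unit_cost
        def acceptability(p, emission_cap):       return emission_cap / p.emissions

        chain = [Process("refinery", 95.0, 100.0, 12.0, 0.8),
                 Process("distribution", 93.0, 95.0, 13.5, 0.1)]

        for p in chain:
            print(p.name,
                  round(availability(p), 2),
                  round(affordability(p, reference_cost=12.0), 2),
                  round(acceptability(p, emission_cap=1.0), 2))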

  6. Solving nonlinear, High-order partial differential equations using a high-performance isogeometric analysis framework

    KAUST Repository

    Cortes, Adriano Mauricio; Vignal, Philippe; Sarmiento, Adel; García, Daniel O.; Collier, Nathan; Dalcin, Lisandro; Calo, Victor M.

    2014-01-01

    In this paper we present PetIGA, a high-performance implementation of Isogeometric Analysis built on top of PETSc. We show its use in solving nonlinear and time-dependent problems, such as phase-field models, by taking advantage of the high-continuity of the basis functions granted by the isogeometric framework. In this work, we focus on the Cahn-Hilliard equation and the phase-field crystal equation.

  7. A framework for adaptive e-learning for continuum mechanics and structural analysis

    OpenAIRE

    Mosquera Feijoo, Juan Carlos; Plaza Beltrán, Luis Francisco; González Rodrigo, Beatriz

    2015-01-01

    This paper presents a project for providing the students of Structural Engineering with the flexibility to learn outside classroom schedules. The goal is a framework for adaptive E-learning based on a repository of open educational courseware with a set of basic Structural Engineering concepts and fundamentals. These are paramount for students to expand their technical knowledge and skills in structural analysis and design of tall buildings, arch-type structures as well as bridges. Thus, conc...

  8. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe

    2014-06-06

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation, and the phase-field crystal equation as test cases. These two models allow us to highlight some of the main advantages that we have access to while using PetIGA for scientific computing.

  9. TomoPy: a framework for the analysis of synchrotron tomographic data

    International Nuclear Information System (INIS)

    Gürsoy, Doǧa; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris

    2014-01-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing
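
    A typical TomoPy pipeline of the kind the record describes (normalization, log transform, rotation-centre search, reconstruction) can be sketched as follows. The data are synthetic stand-ins for real projections, and the calls reflect TomoPy's commonly documented interface rather than any script from this paper.

        # Minimal TomoPy usage sketch (synthetic data; calls follow TomoPy's commonly
        # documented interface and are not taken from this paper).
        import numpy as np
        import tomopy

        n_angles, n_slices, n_pixels = 180, 4, 128
        proj = np.random.uniform(0.2, 1.0, (n_angles, n_slices, n_pixels)).astype(np.float32)
        flat = np.ones((1, n_slices, n_pixels), dtype=np.float32)
        dark = np.zeros((1, n_slices, n_pixels), dtype=np.float32)
        theta = tomopy.angles(n_angles)               # projection angles in radians

        proj = tomopy.normalize(proj, flat, dark)     # flat-/dark-field correction
        proj = tomopy.minus_log(proj)                 # transmission -> line integrals
        center = tomopy.find_center(proj, theta)      # estimate the rotation centre
        rec = tomopy.recon(proj, theta, center=center, algorithm='gridrec')
        rec = tomopy.circ_mask(rec, axis=0, ratio=0.95)
        print(rec.shape)                              # (n_slices, n_pixels, n_pixels)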

  10. TomoPy: a framework for the analysis of synchrotron tomographic data

    Energy Technology Data Exchange (ETDEWEB)

    Gürsoy, Doǧa, E-mail: dgursoy@aps.anl.gov; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris [Advanced Photon Source, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439-4837 (United States)

    2014-08-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing.

  11. A framework of analysis for field experiments with alternative materials in road construction.

    Science.gov (United States)

    François, D; Jullien, A

    2009-01-01

    In France, a wide variety of alternative materials is produced or exists in the form of stockpiles built up over time. Such materials are distributed over various regions of the territory depending on local industrial development and urbanisation trends. The use of alternative materials at a national scale implies sharing local knowledge and experience. Building a national database on alternative materials for road construction is useful in gathering and sharing information. An analysis of feedback from onsite experiences (back analysis) is essential to improve knowledge on alternative material use in road construction. Back analysis of field studies has to be conducted in accordance with a single common framework. This could enable drawing comparisons between alternative materials and between road applications. A framework for the identification and classification of data used in back analyses is proposed. Since the road structure is an open system, this framework has been based on a stress-response approach at both the material and structural levels and includes a description of external factors applying during the road service life. The proposal has been shaped from a review of the essential characteristics of road materials and structures, as well as from the state of knowledge specific to alternative material characterisation.

  12. UNC-Utah NA-MIC Framework for DTI Fiber Tract Analysis

    Directory of Open Access Journals (Sweden)

    Audrey Rose Verde

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exists a number of tractography toolsets, these usually lack tools for preprocessing or to analyze diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUI) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross sectional neuroimaging study of 8 healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  13. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and to calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly dissolved oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It revealed the relationship between water quality in the mainstream and the tributary, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
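
    A much-simplified sketch of the granulation-plus-similarity idea is given below: the weekly series is cut into windows summarized by (mean, standard deviation), a pairwise similarity matrix is built, and the least similar granule is flagged. The plain Gaussian window summary stands in for the paper's two-dimensional normal clouds, and the injected anomaly and all parameters are invented for illustration.

        # Simplified sketch of granulation + similarity-based anomaly flagging
        # (a plain (mean, std) window summary stands in for the paper's normal clouds).
        import numpy as np

        rng = np.random.default_rng(1)
        do_series = 8.0 + 0.5 * np.sin(np.linspace(0, 12 * np.pi, 520)) + rng.normal(0, 0.2, 520)
        do_series[260:273] -= 3.0                       # injected pollution-like drop

        window = 13                                     # ~ one quarter of weekly data
        granules = np.array([(w.mean(), w.std()) for w in do_series.reshape(-1, window)])

        # Pairwise similarity between granules (Gaussian kernel on Euclidean distance)
        d = np.linalg.norm(granules[:, None, :] - granules[None, :, :], axis=-1)
        similarity = np.exp(-d**2)

        avg_similarity = (similarity.sum(axis=1) - 1.0) / (len(granules) - 1)
        # The granule containing the injected drop (index 20) should be flagged.
        print("most anomalous granule:", int(avg_similarity.argmin()))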

  14. A Novel Framework for Interactive Visualization and Analysis of Hyperspectral Image Data

    Directory of Open Access Journals (Sweden)

    Johannes Jordan

    2016-01-01

    Multispectral and hyperspectral images are well established in various fields of application like remote sensing, astronomy, and microscopic spectroscopy. In recent years, the availability of new sensor designs, more powerful processors, and high-capacity storage further opened this imaging modality to a wider array of applications like medical diagnosis, agriculture, and cultural heritage. This necessitates new tools that allow general analysis of the image data and are intuitive to users who are new to hyperspectral imaging. We introduce a novel framework that bundles new interactive visualization techniques with powerful algorithms and is accessible through an efficient and intuitive graphical user interface. We visualize the spectral distribution of an image via parallel coordinates with a strong link to traditional visualization techniques, enabling new paradigms in hyperspectral image analysis that focus on interactive raw data exploration. We combine novel methods for supervised segmentation, global clustering, and nonlinear false-color coding to assist in the visual inspection. Our framework coined Gerbil is open source and highly modular, building on established methods and being easily extensible for application-specific needs. It satisfies the need for a general, consistent software framework that tightly integrates analysis algorithms with an intuitive, modern interface to the raw image data and algorithmic results. Gerbil finds its worldwide use in academia and industry alike with several thousand downloads originating from 45 countries.

  15. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  16. VisRseq: R-based visual framework for analysis of sequencing data.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  17. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize responses to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  18. Post-mortem whole-body magnetic resonance imaging of human fetuses: a comparison of 3-T vs. 1.5-T MR imaging with classical autopsy.

    Science.gov (United States)

    Kang, Xin; Cannie, Mieke M; Arthurs, Owen J; Segers, Valerie; Fourneau, Catherine; Bevilacqua, Elisa; Cos Sanchez, Teresa; Sebire, Neil J; Jani, Jacques C

    2017-08-01

    To prospectively compare the diagnostic accuracy of fetal post-mortem whole-body MRI at 3-T vs. 1.5-T. Between 2012 and 2015, post-mortem MRI at 1.5-T and 3-T was performed in fetuses after miscarriage/stillbirth or termination. Clinical MRI diagnoses were assessed using a confidence diagnostic score and compared with classical autopsy to derive a diagnostic error score. The relation of the diagnostic error for each organ group with gestational age was calculated, and 1.5-T was compared with 3-T by accuracy analysis. 135 fetuses at 12-41 weeks underwent post-mortem MRI (followed by conventional autopsy in 92 fetuses). For all organ groups except the brain, and for both modalities, the diagnostic error decreased with gestation. 3-T MRI agreed more closely with autopsy than 1.5-T MRI, especially for the thorax, heart and abdomen in fetuses under 20 weeks' gestation. • Diagnostic agreement with autopsy increases with 3-T. • PM-MRI using 3-T is particularly interesting for thoracic and abdominal organs. • PM-MRI using 3-T is particularly interesting for fetuses < 20 weeks' gestation.

  19. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool in order to determine the main geometric characteristics of the propeller, but also to provide the designer with the capability to optimize the shape of the blade sections based on their specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for the three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting-line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized in order to validate the proposed framework for propeller operation in open-water conditions as well as in a ship's wake.
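
    The regression step described above (learning correction factors with Support Vector Regression) can be illustrated with a small scikit-learn sketch. The input features, the synthetic target function and the hyperparameters below are placeholders, not the thesis's propeller data or models.

        # Hedged sketch of Support Vector Regression for a correction factor
        # (synthetic target function; not the thesis's propeller data or features).
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(2)
        # Illustrative inputs: blade-area ratio, pitch ratio, radial position
        X = rng.uniform([0.4, 0.6, 0.2], [1.1, 1.6, 1.0], size=(300, 3))
        y = 1.0 + 0.3 * X[:, 0] - 0.2 * X[:, 1] * X[:, 2] + rng.normal(0, 0.01, 300)

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.005))
        model.fit(X, y)
        print(model.predict([[0.75, 1.0, 0.7]]))   # predicted correction factor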

  20. The Policy Formation Process: A Conceptual Framework for Analysis. Ph.D. Thesis

    Science.gov (United States)

    Fuchs, E. F.

    1972-01-01

    A conceptual framework for analysis which is intended to assist both the policy analyst and the policy researcher in their empirical investigations into policy phenomena is developed. It is meant to facilitate understanding of the policy formation process by focusing attention on the basic forces shaping the main features of policy formation as a dynamic social-political-organizational process. The primary contribution of the framework lies in its capability to suggest useful ways of looking at policy formation reality. It provides the analyst and the researcher with a group of indicators which suggest where to look and what to look for when attempting to analyze and understand the mix of forces which energize, maintain, and direct the operation of strategic level policy systems. The framework also highlights interconnections, linkage, and relational patterns between and among important variables. The framework offers an integrated set of conceptual tools which facilitate understanding of and research on the complex and dynamic set of variables which interact in any major strategic level policy formation process.

  1. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    Science.gov (United States)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2004-01-01

    The assembly and operation of the ISS have generated significant challenges that have ultimately impacted the resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. However, the approach, content, level of analysis and resulting outputs of these studies vary due to many factors, complicating the Program Manager's job of selecting the best option. To address this, the program requested that a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consisted of a systematic methodology and a decision-support toolset. The framework provides a quantifiable and repeatable means of ranking research-enhancing options for the complex and multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.

  2. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    Science.gov (United States)

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
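
    A schematic illustration of how such study quality and data relevance scores might be combined across lines of evidence is sketched below. The scores, weights and the multiplicative combination rule are assumptions chosen for the example; the published framework defines its own scoring sheets and integration steps.

        # Schematic sketch of QWoE-style aggregation (the scores, weights and the
        # combination rule below are illustrative assumptions, not the published
        # scoring sheets of the framework).
        lines_of_evidence = {
            # LoE name: (LoE weight, [(study quality 0-4, data relevance 0-4), ...])
            "in vivo apical endpoints": (0.5, [(4, 3), (3, 3)]),
            "transcriptomics":          (0.3, [(3, 4), (2, 3), (3, 3)]),
            "in vitro mechanistic":     (0.2, [(2, 2), (3, 2)]),
        }

        def loe_strength(studies):
            # Per-study strength = quality x relevance, normalised by the maximum (4 x 4)
            return sum(q * r for q, r in studies) / (16.0 * len(studies))

        overall = sum(weight * loe_strength(studies)
                      for weight, studies in lines_of_evidence.values())

        for name, (weight, studies) in lines_of_evidence.items():
            print(f"{name}: strength {loe_strength(studies):.2f} (weight {weight})")
        print(f"overall strength of evidence: {overall:.2f}")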

  3. Validation of a Framework for Measuring Hospital Disaster Resilience Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Shuang Zhong

    2014-06-01

    Hospital disaster resilience can be defined as "the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one." This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety; command, communication and cooperation system; disaster plan; resource stockpile; staff capability; disaster training and drills; emergency services and surge capability; and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
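
    A tiny worked example of the reported scoring model may be useful; the factor scores below are invented for illustration, and only the weights come from the abstract.

        # Worked example of the reported scoring model
        # F = 0.615*F1 + 0.202*F2 + 0.103*F3 + 0.080*F4
        # (the factor scores below are invented purely for illustration).
        weights = {"F1": 0.615, "F2": 0.202, "F3": 0.103, "F4": 0.080}
        scores = {"F1": 0.72, "F2": 0.65, "F3": 0.80, "F4": 0.55}   # hypothetical hospital

        F = sum(weights[k] * scores[k] for k in weights)
        print(f"overall resilience score F = {F:.2f}")   # about 0.70 for these inputs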

  4. A framework for 2-stage global sensitivity analysis of GastroPlus™ compartmental models.

    Science.gov (United States)

    Scherholz, Megerle L; Forder, James; Androulakis, Ioannis P

    2018-04-01

    Parameter sensitivity and uncertainty analysis for physiologically based pharmacokinetic (PBPK) models are becoming an important consideration for regulatory submissions, requiring further evaluation to establish the need for global sensitivity analysis. To demonstrate the benefits of an extensive analysis, global sensitivity was implemented for the GastroPlus™ model, a well-known commercially available platform, using four example drugs: acetaminophen, risperidone, atenolol, and furosemide. The capabilities of GastroPlus were expanded by developing an integrated framework to automate the GastroPlus graphical user interface with AutoIt and for execution of the sensitivity analysis in MATLAB®. Global sensitivity analysis was performed in two stages using the Morris method to screen over 50 parameters for significant factors followed by quantitative assessment of variability using Sobol's sensitivity analysis. The 2-staged approach significantly reduced computational cost for the larger model without sacrificing interpretation of model behavior, showing that the sensitivity results were well aligned with the biopharmaceutical classification system. Both methods detected nonlinearities and parameter interactions that would have otherwise been missed by local approaches. Future work includes further exploration of how the input domain influences the calculated global sensitivity measures as well as extending the framework to consider a whole-body PBPK model.
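
    The screening stage can be illustrated with a simplified one-at-a-time elementary-effects calculation in the spirit of the Morris method, applied to a toy model rather than to GastroPlus; the model function, parameter ranges and step size are all assumptions for the example.

        import numpy as np

        rng = np.random.default_rng(0)

        def model(x):
            # Toy stand-in for a PK model output (e.g., AUC); not GastroPlus.
            return x[0] ** 2 + 3.0 * x[1] + 0.1 * x[2] * x[1]

        n_params, n_trajectories, delta = 3, 20, 0.1
        effects = [[] for _ in range(n_params)]

        for _ in range(n_trajectories):
            x = rng.uniform(0.0, 1.0, n_params)
            base = model(x)
            for i in range(n_params):
                x_step = x.copy()
                x_step[i] += delta          # one-at-a-time perturbation
                effects[i].append((model(x_step) - base) / delta)

        for i, e in enumerate(effects):
            e = np.asarray(e)
            # mu* (mean absolute effect) ranks importance; sigma flags interactions/nonlinearity
            print(f"param {i}: mu*={np.abs(e).mean():.3f}, sigma={e.std():.3f}")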

  5. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate, with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  6. Learner Analysis Framework for Globalized E-Learning: A Case Study

    Directory of Open Access Journals (Sweden)

    Mamta Saxena

    2011-06-01

    Full Text Available The shift to technology-mediated modes of instructional delivery and increased global connectivity has led to a rise in globalized e-learning programs. Educational institutions face multiple challenges as they seek to design effective, engaging, and culturally competent instruction for an increasingly diverse learner population. The purpose of this study was to explore strategies for expanding learner analysis within the instructional design process to better address cultural influences on learning. A case study approach leveraged the experience of practicing instructional designers to build a framework for culturally competent learner analysis. The study discussed the related challenges and recommended strategies to improve the effectiveness of cross-cultural learner analysis. Based on the findings, a framework for conducting cross-cultural learner analysis to guide the cultural analysis of diverse learners was proposed. The study identified the most critical factors in improving cross-cultural learner analysis as the judicious use of existing research on cross-cultural theories and joint deliberation on the part of all the participants from the management to the learners. Several strategies for guiding and improving the cultural inquiry process were summarized. Barriers and solutions for the requirements are also discussed.

  7. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    Science.gov (United States)

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and the newly introduced secure hardware component Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error: computed model parameters are identical to the plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not previously been proposed or evaluated. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  8. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Verbruggen, Aviel; Laes, Erik

    2015-01-01

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5

  9. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications including modeling, interpretation and forecast of climatic and ecosystem changes for various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present-day studies of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of three basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for the development of typical components of a web-mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a basis for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library for graphical user interface development is based on the GeoExt library, which combines the ExtJS framework and OpenLayers software. Based on this software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis

  10. Post-mortem fetal MRI: What do we learn from it?

    International Nuclear Information System (INIS)

    Whitby, E.H.; Paley, M.N.J.; Cohen, M.; Griffiths, P.D.

    2006-01-01

    Post-mortem magnetic resonance (MR) imaging is of increasing interest not only as an alternative to autopsy but also as a research tool to aid the interpretation and diagnosis of in utero MR images. The information from post-mortem MR has allowed the development of imaging sequences applicable to in utero and neonatal imaging. It has documented brain development during gestation and has provided data against which in utero MR can be compared. The detail available from post-mortem images is such that brain development can be studied in a non-invasive manner, a permanent record of normal and abnormal areas is available, and a greater understanding of developmental abnormalities is possible.

  11. Traumatic brain injury: Comparison between autopsy and ante-mortem CT.

    Science.gov (United States)

    Panzer, Stephanie; Covaliov, Lidia; Augat, Peter; Peschel, Oliver

    2017-11-01

    The aim of this study was to compare pathological findings after traumatic brain injury between autopsy and ante-mortem computed tomography (CT). A second aim was to identify changes in these findings between the primary posttraumatic CT and the last follow-up CT before death. Through the collaboration between clinical radiology and forensic medicine, 45 patients with traumatic brain injury were investigated. These patients had undergone ante-mortem CT as well as autopsy. During autopsy, the brain was cut in fronto-parallel slices directly after removal without additional fixation or subsequent histology. Typical findings of traumatic brain injury were compared between autopsy and radiology. Additionally, these findings were compared between the primary CT and the last follow-up CT before death. The comparison between autopsy and radiology revealed a high specificity (≥80%) in most of the findings. Sensitivity and positive predictive value were high (≥80%) in almost half of the findings. Sixteen patients had undergone craniotomy with subsequent follow-up CT. Thirteen conservatively treated patients had undergone a follow-up CT. Comparison between the primary CT and the last ante-mortem CT revealed marked changes in the presence and absence of findings, especially in patients with severe traumatic brain injury requiring decompression craniotomy. The main pathological findings of traumatic brain injury were comparable between clinical ante-mortem CT examinations and autopsy. Comparison between the primary CT after trauma and the last ante-mortem CT revealed marked changes in the findings, especially in patients with severe traumatic brain injury. Hence, clinically routine ante-mortem CT should be included in the process of autopsy interpretation. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
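
    The agreement statistics quoted above follow from a standard 2x2 comparison of CT findings against autopsy as the reference. The counts in the sketch below are made up purely to show the calculation.

        # Illustrative 2x2 comparison of ante-mortem CT findings against autopsy
        # (autopsy treated as the reference standard). Counts are invented.
        tp, fp, fn, tn = 18, 2, 4, 21

        sensitivity = tp / (tp + fn)   # findings present at autopsy that CT detected
        specificity = tn / (tn + fp)   # findings absent at autopsy that CT also called absent
        ppv         = tp / (tp + fp)   # probability a CT finding is confirmed at autopsy

        print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, PPV={ppv:.2f}")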

  12. An integrated framework for cost- benefit analysis in road safety projects using AHP method

    Directory of Open Access Journals (Sweden)

    Mahsa Mohamadian

    2011-10-01

    Full Text Available Cost benefit analysis (CBA) is a useful tool for investment decision-making from an economic point of view. When the decision involves conflicting goals, a multi-attribute analysis approach is more capable, because there are social and environmental criteria that cannot be valued or monetized by cost benefit analysis. The complex nature of decision-making in road safety normally makes it difficult to reach a single alternative solution that can satisfy all decision-making problems. Generally, the application of multi-attribute analysis in the road sector is promising; however, the applications are at a preliminary stage. Some multi-attribute analysis techniques, such as the analytic hierarchy process (AHP), have been widely used in practice. This paper presents an integrated framework combining CBA and AHP methods to select the proper alternative in road safety projects. The proposed model is implemented for a case study of improving a road to reduce accidents in Iran. The framework is used as an aid to the cost benefit tool in road safety projects.
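
    The AHP part of such a framework reduces, at its core, to deriving criterion weights from a pairwise comparison matrix. The sketch below uses a hypothetical 3x3 matrix for road-safety criteria and the common geometric-mean approximation of the priority vector; it is not the weighting used in the Iranian case study.

        import numpy as np

        # Hypothetical pairwise comparison matrix for three road-safety criteria
        # (cost, accident reduction, environmental impact), Saaty 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # Priority vector via the normalised geometric-mean approximation
        w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
        w /= w.sum()

        # Consistency check: lambda_max and consistency index
        lam_max = float((A @ w / w).mean())
        ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
        print("criterion weights:", np.round(w, 3), " CI:", round(ci, 3))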

  13. Metacognition and evidence analysis instruction: an educational framework and practical experience.

    Science.gov (United States)

    Parrott, J Scott; Rubinstein, Matthew L

    2015-08-21

    The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.

  14. A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure

    Directory of Open Access Journals (Sweden)

    Yingjie Xia

    2013-01-01

    Full Text Available Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which means integrating heterogeneous traffic data from different kinds of sensors and applying it to ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to these problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), which by nature comprises parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing.

  15. RIPOSTE: a framework for improving the design and analysis of laboratory-based research

    Science.gov (United States)

    Masca, Nicholas GD; Hensor, Elizabeth MA; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam KA; Teare, M Dawn

    2015-01-01

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results. DOI: http://dx.doi.org/10.7554/eLife.05519.001 PMID:25951517

  16. A Framework for the Game-theoretic Analysis of Censorship Resistance

    Directory of Open Access Journals (Sweden)

    Elahi Tariq

    2016-10-01

    Full Text Available We present a game-theoretic analysis of optimal solutions for interactions between censors and censorship resistance systems (CRSs) by focusing on the data channel used by the CRS to smuggle clients’ data past the censors. This analysis leverages the inherent errors (false positives and negatives) made by the censor when trying to classify traffic as either non-circumvention traffic or as CRS traffic, as well as the underlying rate of CRS traffic. We identify Nash equilibrium solutions for several simple censorship scenarios and then extend those findings to more complex scenarios where we find that the deployment of a censorship apparatus does not qualitatively change the equilibrium solutions, but rather only affects the amount of traffic a CRS can support before being blocked. By leveraging these findings, we describe a general framework for exploring and identifying optimal strategies for the censorship circumventor, in order to maximize the amount of CRS traffic not blocked by the censor. We use this framework to analyze several scenarios with multiple data-channel protocols used as cover for the CRS. We show that it is possible to gain insights through this framework even without perfect knowledge of the censor’s (secret) values for the parameters in their utility function.
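
    A minimal numerical illustration of the role played by the censor's classification errors is a Bayes-rule calculation of how much flagged traffic is actually circumvention traffic, and whether blocking flagged flows is worth the collateral damage. All parameter values below are invented for illustration and are not taken from the paper.

        # Toy expected-cost comparison for a censor deciding whether to block flows
        # its classifier labels as circumvention traffic. All parameters are assumed.
        base_rate = 0.02        # fraction of flows that are actually CRS traffic
        fpr, fnr  = 0.01, 0.10  # classifier false-positive / false-negative rates
        cost_collateral = 5.0   # cost of blocking a legitimate flow
        cost_missed     = 1.0   # cost of letting a CRS flow through

        # P(flow is flagged) and the composition of flagged traffic (Bayes' rule)
        p_flag = base_rate * (1 - fnr) + (1 - base_rate) * fpr
        p_crs_given_flag = base_rate * (1 - fnr) / p_flag

        cost_block = (1 - p_crs_given_flag) * cost_collateral   # per flagged flow
        cost_allow = p_crs_given_flag * cost_missed
        print(f"P(CRS | flagged) = {p_crs_given_flag:.2f}")
        print("censor blocks flagged flows" if cost_block < cost_allow else "censor allows flagged flows")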

  17. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Full Text Available The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, having already left a marked impact on personal computers. It refers to the network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malwares and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from malicious corpus. SMARTbot is a component based off-device behavioral analysis framework which can generate mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become the benchmark for future studies.
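
    For readers unfamiliar with the classification step, the sketch below shows a plain logistic-regression classifier of the kind the abstract refers to, trained on synthetic stand-in features rather than the SMARTbot behavioural dataset.

        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        # Synthetic stand-in for behavioural features extracted by dynamic analysis
        # (e.g., network calls, SMS activity, permission use); not the SMARTbot data.
        X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                                   random_state=42)
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                            random_state=42)

        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        print(f"hold-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.4f}")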

  18. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, having already left a marked impact on personal computers. It refers to the network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malwares and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from malicious corpus. SMARTbot is a component based off-device behavioral analysis framework which can generate mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become the benchmark for future studies.

  19. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but they must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
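
    The discrete-time idea can be illustrated with a toy two-component parallel system: the mission time is split into intervals, each component node takes the state "fails in interval i" or "survives", and the system node combines them. The per-interval failure probabilities and the parallel structure below are assumptions for the example, not the authors' case studies.

        import numpy as np

        # Discrete-time sketch: mission time split into n intervals; each component
        # node takes states "fails in interval i" (i=1..n) or "survives the mission".
        n = 10
        p_a, p_b = 0.05, 0.08   # assumed per-interval failure probabilities of A and B

        def interval_distribution(p):
            """P(component fails in interval i) for i=1..n, plus P(survives) as last entry."""
            probs = [(1 - p) ** (i - 1) * p for i in range(1, n + 1)]
            return np.array(probs + [1 - sum(probs)])

        dist_a, dist_b = interval_distribution(p_a), interval_distribution(p_b)

        # Parallel system: it fails in interval max(i_a, i_b); survives if either survives.
        system = np.zeros(n + 1)
        for ia, pa in enumerate(dist_a):
            for ib, pb in enumerate(dist_b):
                if ia == n or ib == n:
                    system[n] += pa * pb          # system survives the mission
                else:
                    system[max(ia, ib)] += pa * pb

        print(f"P(system fails within mission) = {system[:n].sum():.4f}")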

  20. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, having already left a marked impact on personal computers. It refers to the network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malwares and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from malicious corpus. SMARTbot is a component based off-device behavioral analysis framework which can generate mobile botnet learning model by inducing Artificial Neural Networks’ back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become the benchmark for future studies. PMID:26978523

  1. Time to address the problem of post-mortem procurement of organs for transplantation occurring without proper pre-mortem consent.

    Science.gov (United States)

    Garwood-Gowers, Austen

    2013-09-01

    Current cadaveric organ transplant systems allow individuals to be classified as donors after death where they registered wishes in favour of this prior to death. However, systems for registering wishes pertaining to donation fall woefully short of securing proper consent. Furthermore, even jurisdictions which technically require consent to be obtained in order to treat an individual as a donor, allow that consent to be given by next of kin after death in circumstances where there is no evidence of the individual having refused prior to death. This article explores these and related issues with current systems from the perspectives of health law norms, ethics and human rights. It concludes that proper pre-mortem consent ought to be a pre-requisite for post-mortem organ transplantation.

  2. Concepts of person-centred care: a framework analysis of five studies in daily care practices

    Directory of Open Access Journals (Sweden)

    Margreet

    2016-11-01

    Full Text Available Background: Person-centred care is used as a term to indicate a ‘made to measure’ approach in care. But what does this look like in daily practice? The person-centred nursing framework developed by McCormack and McCance (2010) offers specific concepts but these are still described in rather general terms. Empirical studies, therefore, could help to clarify them and make person-centredness more tangible for nurses. Aims: This paper describes how a framework analysis aimed to clarify the concepts described in the model of McCormack and McCance in order to guide professionals using them in practice. Methods: Five separate empirical studies focusing on older adults in the Netherlands were used in the framework analysis. The research question was: ‘How are concepts of person-centred care made tangible where empirical data are used to describe them?’ Analysis was done in five steps, leading to a comparison between the description of the concepts and the empirical significance found in the studies. Findings: Suitable illustrations were found for the majority of concepts. The results show that an empirically derived specification emerges from the data. In the concept of ‘caring relationship’ for example, it is shown that the personal character of each relationship is expressed by what the nurse and the older person know about each other. Other findings show the importance of values being present in care practices. Conclusions: The framework analysis shows that concepts can be clarified when empirical studies are used to make person-centred care tangible so nurses can understand and apply it in practice. Implications for practice: The concepts of the person-centred nursing framework are recognised when: Nurses know unique characteristics of the person they care for and what is important to them, and act accordingly Nurses use values such as trust, involvement and humour in their care practice Acknowledgement of emotions and compassion create

  3. Characterisation of the metabolome of ocular tissues and post-mortem changes in the rat retina.

    Science.gov (United States)

    Tan, Shi Z; Mullard, Graham; Hollywood, Katherine A; Dunn, Warwick B; Bishop, Paul N

    2016-08-01

    Time-dependent post-mortem biochemical changes have been demonstrated in donor cornea and vitreous, but there have been no published studies to date that objectively measure post-mortem changes in the retinal metabolome over time. The aim of the study was firstly, to investigate post-mortem, time-dependent changes in the rat retinal metabolome and secondly, to compare the metabolite composition of healthy rat ocular tissues. To study post-mortem changes in the rat retinal metabolome, globes were enucleated and stored at 4 °C and sampled at 0, 2, 4, 8, 24 and 48 h post-mortem. To study the metabolite composition of rat ocular tissues, eyes were dissected immediately after culling to isolate the cornea, lens, vitreous and retina, prior to storing at -80 °C. Tissue extracts were subjected to Gas Chromatograph Mass Spectrometry (GC-MS) and Ultra High Performance Liquid Chromatography Mass Spectrometry (UHPLC-MS). Generally, the metabolic composition of the retina was stable for 8 h post-mortem when eyes were stored at 4 °C, but showed increasing changes thereafter. However, some more rapid changes were observed such as increases in TCA cycle metabolites after 2 h post-mortem, whereas some metabolites such as fatty acids only showed decreases in concentration from 24 h. A total of 42 metabolites were identified across the ocular tissues by GC-MS (MSI level 1) and 2782 metabolites were annotated by UHPLC-MS (MSI level 2) according to MSI reporting standards. Many of the metabolites detected were common to all of the tissues but some metabolites showed partitioning between different ocular structures with 655, 297, 93 and 13 metabolites being uniquely detected in the retina, lens, cornea and vitreous respectively. Only a small percentage (1.6%) of metabolites found in the vitreous were only detected in the retina and not other tissues. In conclusion, mass spectrometry-based techniques have been used for the first time to compare the metabolic composition of

  4. Increased concentration of α- and γ-endorphin in post mortem hypothalamic tissue of schizophrenic patients

    Energy Technology Data Exchange (ETDEWEB)

    Wiegant, V.M.; Verhoef, C.J.; Burbach, J.P.H.; de Wied, D.

    1988-01-01

    The concentrations of α-, β- and γ-endorphin were determined by radioimmunoassay in HPLC-fractionated extracts of post mortem hypothalamic tissue obtained from schizophrenic patients and controls. The hypothalamic concentration of α- and γ-endorphin was significantly higher in patients than in controls. No difference was found in the concentration of β-endorphin, the putative precursor of α- and γ-endorphins. These results suggest a deviant metabolism of β-endorphin in the brain of schizophrenic patients. Whether this phenomenon is related to the psychopathology, or is a consequence of ante-mortem pharmacotherapy, remains to be established.

  5. Template security analysis of multimodal biometric frameworks based on fingerprint and hand geometry

    Directory of Open Access Journals (Sweden)

    Arvind Selwal

    2016-09-01

    Full Text Available Biometric systems are automatic tools used to provide authentication during various applications of modern computing. In this work, three different design frameworks for multimodal biometric systems based on fingerprint and hand geometry modalities are proposed. An analysis is also presented to diagnose various types of template security issues in the proposed system. Fuzzy analytic hierarchy process (FAHP) is applied with five decision parameters on all the designs, and framework 1 is found to be better in terms of template data security, templates fusion and computational efficiency. It is noticed that template data security before storage in the database is a challenging task. An important observation is that a template may be secured at feature fusion level and an indexing technique may be used to improve the size of secured templates.

  6. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    Science.gov (United States)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

    Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further exacerbated when attempting to integrate new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  7. Sensitivity and uncertainty in flood inundation modelling – concept of an analysis framework

    Directory of Open Access Journals (Sweden)

    T. Weichel

    2007-01-01

    Full Text Available After the extreme flood event of the Elbe in 2002, the legal definition of flood risk areas and their simulation became more important in Germany. This paper describes the concept of an analysis framework to improve the localisation and the duration of validity of flood inundation maps. The two-dimensional finite difference model TrimR2D is used and linked to a Monte-Carlo routine for parameter sampling as well as to selected performance measures. The purpose is to investigate the impact of different spatial resolutions and the influence of changing land uses on the simulation of flood inundation areas. The technical assembly of the framework has been realised and, besides the model calibration, first tests with different parameter ranges were carried out. Preliminary results show good correlations with observed data, but the investigation of shifting land uses shows only small changes in the flood extent.
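
    The coupling of Monte-Carlo parameter sampling with a performance measure can be sketched as follows; the "model run" is a synthetic stand-in for TrimR2D, and the agreement measure (intersection over union of simulated and observed flood extent) is one common choice, assumed here for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        observed = rng.random((50, 50)) > 0.6        # stand-in observed inundation map (boolean)

        def run_model(roughness):
            # Placeholder for a TrimR2D run; here a synthetic map whose extent shrinks
            # with increasing roughness, just to make the loop executable.
            return rng.random((50, 50)) > (0.5 + roughness)

        def fit_measure(sim, obs):
            """Flood-extent agreement: |sim AND obs| / |sim OR obs| (0 = no overlap, 1 = perfect)."""
            return np.logical_and(sim, obs).sum() / np.logical_or(sim, obs).sum()

        # Monte-Carlo sampling of the roughness parameter over a plausible range
        samples = rng.uniform(0.01, 0.15, size=200)
        scores = [(r, fit_measure(run_model(r), observed)) for r in samples]
        best_r, best_f = max(scores, key=lambda t: t[1])
        print(f"best roughness = {best_r:.3f}, fit measure = {best_f:.2f}")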

  8. PetIGA: A framework for high-performance isogeometric analysis

    KAUST Repository

    Dalcin, Lisandro; Collier, N.; Vignal, Philippe; Cortes, Adriano Mauricio; Calo, Victor M.

    2016-01-01

    We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. We show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.

  9. PetIGA: A framework for high-performance isogeometric analysis

    KAUST Repository

    Dalcin, L.

    2016-05-25

    We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. We show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.

  10. Critical asset and portfolio risk analysis: an all-hazards framework.

    Science.gov (United States)

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
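
    A first-order, asset-level screening with the notional product mentioned above can be written in a few lines; the assets, consequences, vulnerabilities and threat likelihoods below are invented for illustration, and interdependency effects are ignored.

        # Toy asset-level risk screening using the notional product quoted above:
        # risk = consequence x vulnerability x threat. Numbers are illustrative only.
        assets = {
            # asset: (consequence in $M, P(attack succeeds | attempt), P(attempt per year))
            "substation":   (120.0, 0.30, 0.02),
            "control_room": (300.0, 0.10, 0.01),
            "warehouse":    ( 15.0, 0.60, 0.05),
        }

        portfolio_risk = 0.0
        for name, (consequence, vulnerability, threat) in assets.items():
            risk = consequence * vulnerability * threat   # expected loss, $M per year
            portfolio_risk += risk
            print(f"{name:12s} annualised risk = {risk:6.3f} $M/yr")

        print(f"portfolio total (ignoring interdependencies) = {portfolio_risk:.3f} $M/yr")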

  11. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs)

    LENUS (Irish Health Repository)

    Wallace, Emma

    2011-10-14

    Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or need for a specified treatment. Most focus on the derivation stage with only a minority progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision-making process improves patient care. However, there is a lack of clear methodology for the design of high-quality impact analysis studies. We have developed a sequential four-phased framework based on the literature and the collective experience of our international working group to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.

  12. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    Directory of Open Access Journals (Sweden)

    Kim Hyun

    2011-12-01

    Full Text Available Abstract Background Genome-scale metabolic network models have contributed to elucidating biological phenomena, and predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for better understanding of the cellular physiology. Results We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism’s metabolism under perturbation. FMB reveals direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  13. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    Science.gov (United States)

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

    Genome-scale metabolic network models have contributed to elucidating biological phenomena, and predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for better understanding of the cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  14. Evaluation of the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers: a systematic review and documentary analysis.

    Science.gov (United States)

    McGraw, Caroline; Drennan, Vari M

    2015-02-01

    To evaluate the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers. The objective was to identify the extent to which these frameworks take account of the setting where the ulcer originated as being the person's home rather than a hospital setting. Pressure ulcers involving full-thickness skin loss are increasingly being regarded as indicators of nursing patient safety failure, requiring investigation using root cause analysis frameworks. Evidence suggests that root cause analysis frameworks developed in hospital settings ignore the unique dimensions of risk in home healthcare settings. A systematic literature review and documentary analysis of frameworks used to investigate community-acquired grade three and four pressure ulcers by home nursing services in England. No published papers were identified for inclusion in the review. Fifteen patient safety investigative frameworks were collected and analysed. Twelve of the retrieved frameworks were intended for the investigation of community-acquired pressure ulcers; seven of which took account of the setting where the ulcer originated as being the patient's home. This study provides evidence to suggest that many of the root cause analysis frameworks used to investigate community-acquired pressure ulcers in England are unsuitable for this purpose. This study provides researchers and practitioners with evidence of the need to develop appropriate home nursing root cause analysis frameworks to investigate community-acquired pressure ulcers. © 2014 John Wiley & Sons Ltd.

  15. Formative assessment framework proposal for transversal competencies: Application to analysis and problem-solving competence

    Directory of Open Access Journals (Sweden)

    Pedro Gómez-Gasquet

    2018-04-01

    Full Text Available Purpose: In recent years there has been increasing interest in the manner in which transversal competences (TCs) are introduced in the curricula. Transversal competences are generic, relevant skills that students have to develop through the successive stages of their degrees. This paper analyses TCs in the context of the learning process of undergraduate and postgraduate courses. The main aim of this paper is to propose a framework to improve results. The framework facilitates the student's training, and one of its important elements is undoubtedly that the student receives constant feedback from assessments, allowing learning to improve. An application to the analysis and problem-solving competence in the context of the Master's Degree in Advanced Engineering Production, Logistics and Supply Chain at the UPV is carried out. Design/methodology/approach: The work is the result of several years of professional experience in the application of the concept of transversal competence at the UPV with undergraduate and graduate students. As a result of this work and various educational innovation projects, a team of experts has been created, which has been discussing aspects relevant to the improvement of the teaching-learning process. One of these areas of work has been the integration of various proposals on the application and deployment of transversal competences. With respect to this work, a conceptual proposal is put forward that has subsequently been empirically validated through the analysis of the results of several groups of students in a degree. Findings: The main result offered in this work is a framework that identifies the elements that are part of the learning process in the area of transversal competences. Likewise, the different items that are part of the framework are linked to the student's life cycle, and a temporal scope is established for their deployment. Practical implications: One of the most noteworthy

  16. Three-dimensional finite element analysis of zirconia all-ceramic cantilevered fixed partial dentures with different framework designs.

    Science.gov (United States)

    Miura, Shoko; Kasahara, Shin; Yamauchi, Shinobu; Egusa, Hiroshi

    2017-06-01

    The purposes of this study were: to perform stress analyses using three-dimensional finite element analysis methods; to analyze the mechanical stress of different framework designs; and to investigate framework designs that will provide for the long-term stability of both cantilevered fixed partial dentures (FPDs) and abutment teeth. An analysis model was prepared for three units of cantilevered FPDs that assume a missing mandibular first molar. Four types of framework design (Design 1, basic type; Design 2, framework width expanded buccolingually by 2 mm; Design 3, framework height expanded by 0.5 mm to the occlusal surface side from the end abutment to the connector area; and Design 4, a combination of Designs 2 and 3) were created. Two types of framework material (yttrium-oxide partially stabilized zirconia and a high precious noble metal gold alloy) and two types of abutment material (dentin and brass) were used. In the framework designs, Design 1 exhibited the highest maximum principal stress value for both zirconia and gold alloy. In the abutment tooth, Design 3 exhibited the highest maximum principal stress value for all abutment teeth. In the present study, Design 4 (the design with expanded framework height and framework width) could contribute to preventing the concentration of stress and protecting abutment teeth. © 2017 Eur J Oral Sci.

  17. Rapid determination of quetiapine in blood by gas chromatography-mass spectrometry. Application to post-mortem cases.

    Science.gov (United States)

    López-Guarnido, Olga; Tabernero, María Jesús; Hernández, Antonio F; Rodrigo, Lourdes; Bermejo, Ana M

    2014-10-01

    A simple, fast and sensitive method for the determination of quetiapine in human blood has been developed and validated. The method involved a basic liquid-liquid extraction procedure and subsequent analysis by gas chromatography-mass spectrometry, following derivatization with bis(trimethylsilyl)trifluoroacetamide and chlorotrimethylsilane (99:1). Method validation included linearity, with a correlation coefficient > 0.99 over the range 0.02-1 µg ml(-1), and intra- and inter-day precision (always < 12%) and accuracy (mean relative error always < 12%), meeting the bioanalytical acceptance criteria. The limit of detection was 0.005 µg ml(-1). The procedure was further applied to post-mortem cases from the Institute of Legal Medicine, University of Santiago de Compostela. Copyright © 2013 John Wiley & Sons, Ltd.

  18. A framework for interactive visual analysis of heterogeneous marine data in an integrated problem solving environment

    Science.gov (United States)

    Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei

    2017-07-01

    This paper presents a novel integrated marine visualization framework which focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze the heterogeneous marine data. Based on the data we processed, several GPU-based visualization methods are explored to interactively demonstrate marine data. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the framework we designed, an integrated visualization system was realized. The effectiveness and efficiency of the framework are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.

  19. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    International Nuclear Information System (INIS)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Won Dea

    2014-01-01

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks

  20. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks.

  1. Bridging Human Reliability Analysis and Psychology, Part 2: A Cognitive Framework to Support HRA

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley; Stacey M. L. Hendrickson; Ronald L. Boring; Jing Xing

    2012-06-01

    This is the second of two papers that discuss the literature review conducted as part of the U.S. Nuclear Regulatory Commission (NRC) effort to develop a hybrid human reliability analysis (HRA) method in response to Staff Requirements Memorandum (SRM) SRM-M061020. This review was conducted with the goal of strengthening the technical basis within psychology, cognitive science and human factors for the hybrid HRA method being proposed. An overview of the literature review approach and high-level structure is provided in the first paper, whereas this paper presents the results of the review. The psychological literature review encompassed research spanning the entirety of human cognition and performance, and consequently produced an extensive list of psychological processes, mechanisms, and factors that contribute to human performance. To make sense of this large amount of information, the results of the literature review were organized into a cognitive framework that identifies causes of failure of macrocognition in humans, and connects those proximate causes to psychological mechanisms and performance influencing factors (PIFs) that can lead to the failure. This cognitive framework can serve as a tool to inform HRA. Beyond this, however, the cognitive framework has the potential to also support addressing human performance issues identified in Human Factors applications.

  2. A classical regression framework for mediation analysis: fitting one model to estimate mediation effects.

    Science.gov (United States)

    Saunders, Christina T; Blume, Jeffrey D

    2017-10-26

    Mediation analysis explores the degree to which an exposure's effect on an outcome is diverted through a mediating variable. We describe a classical regression framework for conducting mediation analyses in which estimates of causal mediation effects and their variance are obtained from the fit of a single regression model. The vector of changes in exposure pathway coefficients, which we named the essential mediation components (EMCs), is used to estimate standard causal mediation effects. Because these effects are often simple functions of the EMCs, an analytical expression for their model-based variance follows directly. Given this formula, it is instructive to revisit the performance of routinely used variance approximations (e.g., delta method and resampling methods). Requiring the fit of only one model reduces the computation time required for complex mediation analyses and permits the use of a rich suite of regression tools that are not easily implemented on a system of three equations, as would be required in the Baron-Kenny framework. Using data from the BRAIN-ICU study, we provide examples to illustrate the advantages of this framework and compare it with the existing approaches. © The Author 2017. Published by Oxford University Press.
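    The abstract's single-model EMC estimator is not reproduced here, but the quantity it targets can be illustrated with the classical difference-of-coefficients view: the indirect (mediated) effect is the change in the exposure coefficient when the mediator is added to the outcome model. A minimal sketch on synthetic data, assuming statsmodels is available; variable names are hypothetical.

```python
# Difference-of-coefficients sketch on synthetic data (not the single-model
# EMC estimator described in the abstract).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)                       # exposure
m = 0.5 * x + rng.normal(size=n)             # mediator
y = 0.3 * x + 0.4 * m + rng.normal(size=n)   # outcome

# Total effect: outcome regressed on exposure alone.
total = sm.OLS(y, sm.add_constant(x)).fit().params[1]

# Direct effect: outcome regressed on exposure and mediator together.
direct = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[1]

indirect = total - direct   # change in the exposure-pathway coefficient
print(f"total={total:.3f} direct={direct:.3f} indirect={indirect:.3f}")
```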

  3. Evaluating ecommerce websites cognitive efficiency: an integrative framework based on data envelopment analysis.

    Science.gov (United States)

    Lo Storto, Corrado

    2013-11-01

    This paper presents an integrative framework to evaluate ecommerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). This framework is inspired by concepts driven from theories of information processing and cognition and considers the website efficiency as a measure of its quality and performance. When the users interact with the website interfaces to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either a sense of satisfaction or dissatisfaction for that. The amount of ambiguity and uncertainty, and the search (over-)time during navigation that they perceive determine the effort size - and, as a consequence, the cognitive cost amount - they have to bear to perform their task. On the contrary, task performing and result achievement provide the users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified in a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 52 ecommerce websites that sell products in the information technology and media market. A stepwise regression is performed to assess the influence of cognitive costs and benefits that mostly affect website efficiency. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
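    Data Envelopment Analysis itself reduces to solving one small linear programme per decision-making unit (DMU). The sketch below solves a generic input-oriented CCR envelopment model with scipy.optimize.linprog on toy data; it is not the 9-variable cognitive cost/benefit model of the study, and the input/output values are invented.

```python
# Toy input-oriented CCR (DEA) efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])  # inputs (costs)
Y = np.array([[10.0], [8.0], [9.0], [12.0]])                    # outputs (benefits)
n, n_in, n_out = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(o):
    # Decision variables: z = [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(n_in):                         # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(n_out):                        # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```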

  4. High-Fidelity Aerothermal Engineering Analysis for Planetary Probes Using DOTNET Framework and OLAP Cubes Database

    Directory of Open Access Journals (Sweden)

    Prabhakar Subrahmanyam

    2009-01-01

    Full Text Available This publication presents the architecture integration and implementation of various modules in the Sparta framework. Sparta is a trajectory engine that is hooked to an Online Analytical Processing (OLAP) database for multi-dimensional analysis capability. OLAP is an Online Analytical Processing database that has a comprehensive list of atmospheric entry probes and their vehicle dimensions, trajectory data, aero-thermal data and material properties like Carbon, Silicon and Carbon-Phenolic based Ablators. An approach is presented for dynamic TPS design. OLAP has the capability to run several different trajectory conditions in one simulation, and the output is stored back into the database and can be queried for the appropriate trajectory type. An OLAP simulation can be set up by spawning individual threads to run for three types of trajectory: Nominal, Undershoot and Overshoot. The Sparta graphical user interface provides capabilities to choose from a list of flight vehicles or enter trajectory and geometry information for a vehicle in design. The DOTNET framework acts as a middleware layer between the trajectory engine and the user interface and also between the web user interface and the OLAP database. Trajectory output can be obtained in TecPlot format, Excel output or in a KML (Keyhole Markup Language) format. The framework employs an API (application programming interface) to convert trajectory data into a formatted KML file that is used by Google Earth for simulating Earth-entry fly-by visualizations.

  5. The Coronal Analysis of SHocks and Waves (CASHeW) framework

    Science.gov (United States)

    Kozarev, Kamen A.; Davey, Alisdair; Kendrick, Alexander; Hammer, Michael; Keith, Celeste

    2017-11-01

    Coronal bright fronts (CBF) are large-scale wavelike disturbances in the solar corona, related to solar eruptions. They are observed (mostly in extreme ultraviolet (EUV) light) as transient bright fronts of finite width, propagating away from the eruption source location. Recent studies of individual solar eruptive events have used EUV observations of CBFs and metric radio type II burst observations to show the intimate connection between waves in the low corona and coronal mass ejection (CME)-driven shocks. EUV imaging with the atmospheric imaging assembly instrument on the solar dynamics observatory has proven particularly useful for detecting large-scale short-lived CBFs, which, combined with radio and in situ observations, holds great promise for early CME-driven shock characterization capability. This characterization can further be automated, and related to models of particle acceleration to produce estimates of particle fluxes in the corona and in the near Earth environment early in events. We present a framework for the coronal analysis of shocks and waves (CASHeW). It combines analysis of NASA Heliophysics System Observatory data products and relevant data-driven models, into an automated system for the characterization of off-limb coronal waves and shocks and the evaluation of their capability to accelerate solar energetic particles (SEPs). The system utilizes EUV observations and models written in the interactive data language. In addition, it leverages analysis tools from the SolarSoft package of libraries, as well as third party libraries. We have tested the CASHeW framework on a representative list of coronal bright front events. Here we present its features, as well as initial results. With this framework, we hope to contribute to the overall understanding of coronal shock waves, their importance for energetic particle acceleration, as well as to the better ability to forecast SEP events fluxes.

  6. Practical static analysis of JavaScript applications in the presence of frameworks and libraries

    DEFF Research Database (Denmark)

    Madsen, Magnus; Livshits, Benjamin; Fanning, Michael

    2013-01-01

    JavaScript is a language that is widely-used for both web-based and standalone applications such as those in the upcoming Windows 8 operating system. Analysis of JavaScript has long been known to be challenging due to its dynamic nature. On top of that, most JavaScript applications rely on large and complex libraries and frameworks, often written in a combination of JavaScript and native code such as C and C++. Stubs have been commonly employed as a partial specification mechanism to address the library problem; however, they are tedious to write, incomplete, and occasionally incorrect. However, the manner in which library code is used within applications often sheds light on what library APIs return or consume as parameters. In this paper, we propose a technique which combines pointer analysis with use analysis to handle many challenges posed by large JavaScript libraries. Our approach enables...

  7. Integrated Information Technology Framework for Analysis of Data from Enrichment Plants to Support the Safeguards Mission

    International Nuclear Information System (INIS)

    Marr, Clifton T.; Thurman, David A.; Jorgensen, Bruce V.

    2008-01-01

    Many examples of software architectures exist that support process monitoring and analysis applications which could be applied to enrichment plants in a fashion that supports the Safeguards Mission. Pacific Northwest National Laboratory (PNNL) has developed mature solutions that will provide the framework to support online statistical analysis of enrichment plants and the entire nuclear fuel cycle. Most recently, PNNL has developed a refined architecture and supporting tools that address many of the common problems analysis and modeling environments experience: pipelining, handling large data volumes, and real-time performance. We propose the architecture and tools may be successfully used in furthering the goals of nuclear material control and accountability as both an aid to processing plant owners and as comprehensive monitoring for oversight teams.

  8. Development of a framework for the neutronics analysis system for next generation (3)

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hirai, Yasushi; Hyoudou, Hideaki; Tatsumi, Masahiro

    2010-02-01

    Development of innovative analysis methods and models in fundamental studies for next-generation nuclear reactor systems is in progress. In order to efficiently and effectively reflect the latest analysis methods and models to primary design of commercial reactor and/or in-core fuel management for power reactors, a next-generation analysis system MARBLE has been developed. The next-generation analysis system provides solutions to the following requirements: (1) flexibility, extensibility and user-friendliness that can apply new methods and models rapidly and effectively for fundamental studies, (2) quantitative proof of solution accuracy and adaptive scoping range for design studies, (3) coupling analysis among different study domains for the purpose of rationalization of plant systems and improvement of reliability, (4) maintainability and reusability for system extensions for the purpose of total quality management and development efficiency. The next-generation analysis system supports many fields, such as thermal-hydraulic analysis, structure analysis, reactor physics etc., and now we are studying reactor physics analysis system for fast reactor in advance. As for reactor physics analysis methods for fast reactor, we have established the JUPITER standard analysis methods based on the past study. But, there has been a problem of extreme inefficiency due to lack of functionality in the conventional analysis system when changing analysis targets and/or modeling levels. That is why, we have developed the next-generation analysis system for reactor physics which reproduces the JUPITER standard analysis method that has been developed so far and newly realizes burnup and design analysis for fast reactor and functions for cross section adjustment. In the present study, we examined in detail the existing design and implementation of ZPPR critical experiment analysis database followed by unification of models within the framework of the next-generation analysis system by

  9. eXframe: reusable framework for storage, analysis and visualization of genomics experiments

    Directory of Open Access Journals (Sweden)

    Sinha Amit U

    2011-11-01

    Full Text Available Abstract Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides (1) the ability to publish structured data compliant with accepted standards, (2) support for multiple data types including microarrays and next-generation sequencing, and (3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples), and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own

  10. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    Science.gov (United States)

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely
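    The core resampling idea can be sketched in a few lines: turn spectral counts into detection probabilities, draw an ensemble of binary interaction matrices, score each draw with an existing binary-data method, and average the scores over the ensemble. The count-to-probability model and the scoring function below are placeholders chosen for illustration, not the probabilistic noise model of the paper.

```python
# Toy sketch of the ensemble idea: low spectral counts -> uncertain
# interactions -> sample many binary outcomes, score each, aggregate.
import numpy as np

rng = np.random.default_rng(1)
counts = np.array([[12, 0, 1],
                   [0,  7, 3],
                   [1,  3, 0]])          # pairwise spectral counts (toy)

def detection_probability(c, k=2.0):
    # Assumed saturating count->probability model, for illustration only.
    return 1.0 - np.exp(-c / k)

def binary_ppi_score(adj):
    # Placeholder for any existing binary-interaction scoring method:
    # here, simply each protein's observed degree.
    return adj.sum(axis=1)

p = detection_probability(counts)
ensemble = [binary_ppi_score(rng.random(p.shape) < p) for _ in range(500)]
print("mean score per protein:", np.mean(ensemble, axis=0))
```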

  11. Big Data Based Analysis Framework for Product Manufacturing and Maintenance Process

    OpenAIRE

    Zhang , Yingfeng; Ren , Shan

    2015-01-01

    Part 8: Cloud-Based Manufacturing; International audience; With the wide use of smart sensor devices in product lifecycle management (PLM), a large amount of real-time, multi-source lifecycle big data is created. These data allow decision makers to make better-informed PLM decisions. In this article, an overview framework of big data based analysis for the product lifecycle (BDA-PL) is presented to provide a new paradigm by extending the techniques of the Internet of Things (IoT) and big data analysis...

  12. Nonlocal approach to the analysis of the stress distribution in granular systems. I. Theoretical framework

    Science.gov (United States)

    Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.

    1998-05-01

    A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.
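    Although the paper's exact equations are not reproduced in the abstract, a generic memory-possessing (non-Markoffian) propagation equation of the kind described, with the depth coordinate playing the role of a formal time variable, can be sketched as follows; the kernel notation is illustrative rather than taken from the paper.

```latex
% Generic memory propagation of the stress \sigma(x,z), with depth z
% acting as a formal time variable and \phi a memory kernel:
\[
  \frac{\partial \sigma(x,z)}{\partial z}
    = \int_{0}^{z} \phi(z - z')\,
      \frac{\partial^{2} \sigma(x,z')}{\partial x^{2}} \, dz' .
\]
% A delta-function kernel \phi(z) \propto \delta(z) recovers the diffusive
% (Markoffian) limit, while a constant kernel \phi(z) = c^{2} yields the
% wavelike limit \partial^{2}_{z}\sigma = c^{2}\,\partial^{2}_{x}\sigma.
```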

  13. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  14. Experience with post-mortem computed tomography in Southern Denmark 2006-11

    DEFF Research Database (Denmark)

    Leth, Peter Mygind

    2013-01-01

    Objectives: (1) To explore the ability of post-mortem computed tomography (PMCT) to establish the cause of death. (2) To investigate the inter-method variation between autopsy and PMCT. (3) To investigate whether PMCT can select cases for autopsy. (4) To investigate the importance of histology...

  15. 42 CFR 35.16 - Autopsies and other post-mortem operations.

    Science.gov (United States)

    2010-10-01

    ... AND EXAMINATIONS HOSPITAL AND STATION MANAGEMENT General § 35.16 Autopsies and other post-mortem... to in writing by a person authorized under the law of the State in which the station or hospital is... made a part of the clinical record. [25 FR 6331, July 6, 1960] ...

  16. Routine perinatal and paediatric post-mortem radiography: detection rates and implications for practice

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [NHS Foundation Trust, Department of Radiology Great Ormond Street Hospital for Children, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Calder, Alistair D. [NHS Foundation Trust, Department of Radiology Great Ormond Street Hospital for Children, London (United Kingdom); Kiho, Liina [Camelia Botnar Laboratories Great Ormond Street Hospital for Children, Department of Paediatric Pathology, London (United Kingdom); Taylor, Andrew M. [Great Ormond Street Hospital for Children, Cardiorespiratory Unit, London (United Kingdom); UCL Institute of Cardiovascular Science, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Sebire, Neil J. [Camelia Botnar Laboratories Great Ormond Street Hospital for Children, Department of Paediatric Pathology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom)

    2014-03-15

    Routine perinatal and paediatric post-mortem plain radiography allows for the diagnosis and assessment of skeletal dysplasias, fractures and other bony abnormalities. The aim of this study was to review the diagnostic yield of this practice. We identified 1,027 cases performed in a single institution over a 2½-year period, including babygrams (whole-body examinations) and full skeletal surveys. Images were reported prior to autopsy in all cases. Radiology findings were cross-referenced with the autopsy findings using an autopsy database. We scored each case from 0 to 4 according to the level of diagnostic usefulness. The overall abnormality rate was 126/1,027 (12.3%). There was a significantly higher rate of abnormality when a skeletal survey was performed (18%) rather than a babygram (10%; P < 0.01); 90% (665/739) of babygrams were normal. Of the 74 abnormal babygrams, we found 33 incidental non-contributory cases, 19 contributory, 20 diagnostic, and 2 false-positive cases. There were only 2 cases out of 739 (0.27%) in whom routine post-mortem imaging identified potentially significant abnormalities that would not have been detected if only selected imaging had been performed. A policy of performing selected, rather than routine, foetal post-mortem radiography could result in a significant cost saving. Routine post-mortem paediatric radiography in foetuses and neonates is neither diagnostically useful nor cost-effective. A more evidence-based, selective protocol should yield significant cost savings. (orig.)

  17. Routine perinatal and paediatric post-mortem radiography: detection rates and implications for practice

    International Nuclear Information System (INIS)

    Arthurs, Owen J.; Calder, Alistair D.; Kiho, Liina; Taylor, Andrew M.; Sebire, Neil J.

    2014-01-01

    Routine perinatal and paediatric post-mortem plain radiography allows for the diagnosis and assessment of skeletal dysplasias, fractures and other bony abnormalities. The aim of this study was to review the diagnostic yield of this practice. We identified 1,027 cases performed in a single institution over a 2½-year period, including babygrams (whole-body examinations) and full skeletal surveys. Images were reported prior to autopsy in all cases. Radiology findings were cross-referenced with the autopsy findings using an autopsy database. We scored each case from 0 to 4 according to the level of diagnostic usefulness. The overall abnormality rate was 126/1,027 (12.3%). There was a significantly higher rate of abnormality when a skeletal survey was performed (18%) rather than a babygram (10%; P < 0.01); 90% (665/739) of babygrams were normal. Of the 74 abnormal babygrams, we found 33 incidental non-contributory cases, 19 contributory, 20 diagnostic, and 2 false-positive cases. There were only 2 cases out of 739 (0.27%) in whom routine post-mortem imaging identified potentially significant abnormalities that would not have been detected if only selected imaging had been performed. A policy of performing selected, rather than routine, foetal post-mortem radiography could result in a significant cost saving. Routine post-mortem paediatric radiography in foetuses and neonates is neither diagnostically useful nor cost-effective. A more evidence-based, selective protocol should yield significant cost savings. (orig.)

  18. Overview of the Systems Analysis Framework for the EU Bioeconomy. Deliverable 1.4 of the EU FP 7 SAT-BBE project Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy (SAT BBE)

    NARCIS (Netherlands)

    Leeuwen, van M.G.A.; Meijl, van H.; Smeets, E.M.W.; Tabeau-Kowalska, E.W.

    2014-01-01

    In November 2012 the Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy project (SAT-BBE) was launched with the purpose to design an analysis tool useful to monitoring the evolution and impacts of the bioeconomy. In the SAT-BBE project the development of the analysis tool for the

  19. Establishing post mortem criteria for the metabolic syndrome: an autopsy based cross-sectional study.

    Science.gov (United States)

    Christensen, Martin Roest; Bugge, Anne; Malik, Mariam Elmegaard; Thomsen, Jørgen Lange; Lynnerup, Niels; Rungby, Jørgen; Banner, Jytte

    2018-01-01

    Individuals who suffer from mental illness are more prone to obesity and related co-morbidities, including the metabolic syndrome. Autopsies provide an outstanding platform for the macroscopic, microscopic and molecular-biological investigation of diseases. Autopsy-based findings may assist in the investigation of the metabolic syndrome. To utilise the vast information that an autopsy encompasses to elucidate the pathophysiology behind the syndrome further, we aimed to both develop and evaluate a method for the post mortem definition of the metabolic syndrome. Based on the nationwide Danish SURVIVE study of deceased mentally ill, we established a set of post mortem criteria for each of the harmonized criteria of the metabolic syndrome. We based the post mortem (PM) evaluation on information from the police reports and the data collected at autopsy, such as anthropometric measurements and biochemical and toxicological analyses (PM information). We compared our PM evaluation with the data from the Danish health registries [ante mortem (AM) information, considered the gold standard] from each individual. The study included 443 deceased individuals (272 male and 171 female) with a mean age of 50.4 (± 15.5) years and a median (interquartile range) post mortem interval of 114 (84-156) hours. We found no significant difference when defining the metabolic syndrome from the PM information in comparison to the AM information ( P  = 0.175). The PM evaluation yielded a high specificity (0.93) and a moderate sensitivity (0.63) with a moderate level of agreement compared to the AM evaluation (Cohen's κ = 0.51). Neither age nor post mortem interval affected the final results. Our model of a PM definition of the metabolic syndrome proved reliable when compared to the AM information. We believe that an appropriate estimate of the prevalence of the metabolic syndrome can be established post mortem. However, while neither the PM nor the AM information is exhaustive in
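    For reference, the agreement statistics quoted in the abstract (sensitivity, specificity and Cohen's kappa for the post-mortem classification against the ante-mortem gold standard) follow directly from a 2x2 table. The counts in the sketch below are placeholders, not the study's data.

```python
# Sensitivity, specificity and Cohen's kappa for a post-mortem (PM)
# classification against the ante-mortem (AM) gold standard.
# The 2x2 counts below are placeholders, not the study's data.
tp, fn = 60, 35     # AM-positive cases: PM-positive / PM-negative
fp, tn = 25, 323    # AM-negative cases: PM-positive / PM-negative
n = tp + fn + fp + tn

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

p_observed = (tp + tn) / n
p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"sensitivity={sensitivity:.2f} "
      f"specificity={specificity:.2f} kappa={kappa:.2f}")
```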

  20. Analysis Community’s Coping Strategies and Local Risk Governance Framework in Relation to Landslide

    Directory of Open Access Journals (Sweden)

    Heru Setiawan

    2014-12-01

    Full Text Available Analysis of people perception and analysis of the coping strategy to landslides are the two elements that are essential to determine the level of preparedness of communities to landslides. To know the preparedness of government and other stakeholders in facing landslide, the analysis of the risk governance framework was required. A survey using questionnaires with random sampling was applied to assess the level of people perception and people coping strategy related to landslide. Analysis of the risk governance framework was done at the district and sub-district level. The study found that people perception related with landslide dominated by high and moderate level. Age and education are two factors that influence the people's perception to landslide. Local people applied four types coping strategy, which are: economic, structural, social and cultural coping strategy. Totally, 51.6% respondents have high level, 33.3% have moderate level and only 15.1% respondents that have low level of coping strategy. The factors that influence the level of coping strategy are education, income and building type. Analysis of the risk governance framework is limited to the three components including stakeholder involvement, risk management and risk communication. Based on the data analysis, the level of stakeholder involvement at the district scope was categorized on the moderate till high and the level of stakeholder involvement at sub-district level was categorized on the high level. Generally, the risk management of Karanganyar was categorized on the moderate level and high level and the risk management in Tawangmangu was categorized on the moderate level. There are some elements must be improved on the risk governance framework, those are data management, the pattern of relationships among stakeholders, increased participation of NGOs, constructed and updated landslide risk map, enhancement of microfinance role in helping the community when disaster strikes

  1. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    International Nuclear Information System (INIS)

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-01-01

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs

  2. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu [Louisiana State University, Baton Rouge, LA (United States); Sattler, Meredith, E-mail: msattler@lsu.edu [School of Architecture, Louisiana State University, Baton Rouge, LA (United States); Friedland, Carol J., E-mail: friedland@lsu.edu [Bert S. Turner Department of Construction Management, Louisiana State University, Baton Rouge, LA (United States)

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.

  3. A decision analysis framework for stakeholder involvement and learning in groundwater management

    Science.gov (United States)

    Karjalainen, T. P.; Rossi, P. M.; Ala-aho, P.; Eskelinen, R.; Reinikainen, K.; Kløve, B.; Pulido-Velazquez, M.; Yang, H.

    2013-12-01

    Multi-criteria decision analysis (MCDA) methods are increasingly used to facilitate both rigorous analysis and stakeholder involvement in natural and water resource planning. Decision-making in that context is often complex and multi-faceted with numerous trade-offs between social, environmental and economic impacts. However, practical applications of decision-support methods are often too technically oriented and hard to use, understand or interpret for all participants. The learning of participants in these processes is seldom examined, even though successful deliberation depends on learning. This paper analyzes the potential of an interactive MCDA framework, the decision analysis interview (DAI) approach, for facilitating stakeholder involvement and learning in groundwater management. It evaluates the results of the MCDA process in assessing land-use management alternatives in a Finnish esker aquifer area where conflicting land uses affect the groundwater body and dependent ecosystems. In the assessment process, emphasis was placed on the interactive role of the MCDA tool in facilitating stakeholder participation and learning. The results confirmed that the structured decision analysis framework can foster learning and collaboration in a process where disputes and diverse interests are represented. Computer-aided interviews helped the participants to see how their preferences affected the desirability and ranking of alternatives. During the process, the participants' knowledge and preferences evolved as they assessed their initial knowledge with the help of fresh scientific information. The decision analysis process led to the opening of a dialogue, showing the overall picture of the problem context and the critical issues for the further process.
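    A typical computational step in such computer-aided MCDA interviews is a weighted aggregation that shows participants how their criterion weights change the ranking of alternatives. The sketch below uses a simple weighted-sum rule with invented alternatives, criteria, scores and weights; it is not the study's actual model.

```python
# Generic weighted-sum MCDA step: normalise the weights, aggregate the
# criterion scores, rank the land-use alternatives. All values are
# hypothetical, for illustration only.
import numpy as np

alternatives = ["status quo", "restricted abstraction", "protection zone"]
criteria = ["groundwater quality", "ecosystem status", "economic impact"]

# Raw scores: rows = alternatives, columns = criteria (higher is better).
scores = np.array([[0.4, 0.3, 0.9],
                   [0.7, 0.6, 0.6],
                   [0.9, 0.9, 0.3]])

def rank(weights):
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalise the weights
    overall = scores @ w                 # weighted-sum value of each alternative
    order = np.argsort(overall)[::-1]
    return [(alternatives[i], round(float(overall[i]), 3)) for i in order]

print(rank([0.5, 0.3, 0.2]))   # an environment-oriented stakeholder
print(rank([0.2, 0.2, 0.6]))   # an economy-oriented stakeholder
```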

  4. Integration of targeted health interventions into health systems: a conceptual framework for analysis.

    Science.gov (United States)

    Atun, Rifat; de Jongh, Thyra; Secci, Federica; Ohiri, Kelechi; Adeyi, Olusoji

    2010-03-01

    The benefits of integrating programmes that emphasize specific interventions into health systems to improve health outcomes have been widely debated. This debate has been driven by narrow binary considerations of integrated (horizontal) versus non-integrated (vertical) programmes, and characterized by polarization of views with protagonists for and against integration arguing the relative merits of each approach. The presence of both integrated and non-integrated programmes in many countries suggests benefits to each approach. While the terms 'vertical' and 'integrated' are widely used, they each describe a range of phenomena. In practice the dichotomy between vertical and horizontal is not rigid and the extent of verticality or integration varies between programmes. However, systematic analysis of the relative merits of integration in various contexts and for different interventions is complicated as there is no commonly accepted definition of 'integration'-a term loosely used to describe a variety of organizational arrangements for a range of programmes in different settings. We present an analytical framework which enables deconstruction of the term integration into multiple facets, each corresponding to a critical health system function. Our conceptual framework builds on theoretical propositions and empirical research in innovation studies, and in particular adoption and diffusion of innovations within health systems, and builds on our own earlier empirical research. It brings together the critical elements that affect adoption, diffusion and assimilation of a health intervention, and in doing so enables systematic and holistic exploration of the extent to which different interventions are integrated in varied settings and the reasons for the variation. The conceptual framework and the analytical approach we propose are intended to facilitate analysis in evaluative and formative studies of-and policies on-integration, for use in systematically comparing and

  5. Introducing Advanced Practice Nurses / Nurse Practitioners in health care systems: a framework for reflection and analysis.

    Science.gov (United States)

    De Geest, Sabina; Moons, Philip; Callens, Betty; Gut, Chris; Lindpaintner, Lyn; Spirig, Rebecca

    2008-11-01

    An increasing number of countries are exploring the option of introducing Advanced Practice Nurses (APN), such as Nurse Practitioners (NP), as part of the health care workforce. This is particularly relevant in light of the increase of the elderly and chronically ill. It is crucial that this introduction is preceded by an in-depth understanding of the concept of advanced practice nursing as well as an analysis of the context. Firstly, a conceptual clarification of Advanced Practice Nurses and Nurse Practitioners is provided. Secondly, a framework is introduced that assists in the analysis of the introduction and development of Advanced Practice Nurse roles in a particular health care system. Thirdly, outcomes research on Advanced Practice Nursing is presented. Argumentation was developed using data-based papers and policy reports on Advanced Practice Nursing. The proposed framework consists of five drivers: (1) the health care needs of the population, (2) education, (3) workforce, (4) practice patterns and (5) legal and health policy framework. These drivers act synergistically and are dynamic in time and space. Outcomes research shows that nurse practitioners show clinical outcomes similar to or better than those of physicians. Further examples demonstrate favourable outcomes in view of the six Ds of outcome research: death, disease, disability, discomfort, dissatisfaction and dollars, for models of care in which Advanced Practice Nurses play a prominent role. Advanced Practice Nurses such as Nurse Practitioners show potential to contribute favourably to guaranteeing optimal health care. Advanced Practice Nurses will wield the greatest influence on health care by focusing on the most pressing health problems in society, especially the care of the chronically ill.

  6. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  7. Current status of paediatric post-mortem imaging: an ESPR questionnaire-based survey

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Rijn, Rick R. van [Academic Medical Centre, Department of Radiology, Amsterdam (Netherlands); Sebire, Neil J. [Great Ormond Street Hospital for Children, Department of Pathology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom)

    2014-03-15

    The use of post-mortem imaging, including skeletal radiography, CT and MRI, is increasing, providing a minimally invasive alternative to conventional autopsy techniques. The development of clinical guidelines and national standards is being encouraged, particularly for cross-sectional techniques. To outline the current practice of post-mortem imaging amongst members of the European Society of Paediatric Radiology (ESPR). We e-mailed an online questionnaire of current post-mortem service provisions to members of the ESPR in January 2013. The survey included direct questions about what services were offered, the population imaged, current techniques used, imaging protocols, reporting experience and intended future involvement. Seventy-one percent (47/66) of centres from which surveys were returned reported performing some form of post-mortem imaging in children, of which 81 % perform radiographs, 51% CT and 38% MRI. Eighty-seven percent of the imaging is performed within the radiology or imaging departments, usually by radiographers (75%), and 89% is reported by radiologists, of which 64% is reported by paediatric radiologists. Overall, 72% of positive respondents have a standardised protocol for radiographs, but only 32% have such a protocol for CT and 27% for MRI. Sixty-one percent of respondents wrote that this is an important area that needs to be developed. Overall, the majority of centres provide some post-mortem imaging service, most of which is performed within an imaging department and reported by a paediatric radiologist. However, the populations imaged as well as the details of the services offered are highly variable among institutions and lack standardisation. We have identified people who would be interested in taking this work forwards. (orig.)

  8. Quantitative susceptibility mapping (QSM) as a means to measure brain iron? A post mortem validation study

    Science.gov (United States)

    Langkammer, Christian; Schweser, Ferdinand; Krebs, Nikolaus; Deistung, Andreas; Goessler, Walter; Scheurer, Eva; Sommer, Karsten; Reishofer, Gernot; Yen, Kathrin; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen R.

    2012-01-01

    Quantitative susceptibility mapping (QSM) is a novel technique which allows determining the bulk magnetic susceptibility distribution of tissue in vivo from gradient echo magnetic resonance phase images. It is commonly assumed that paramagnetic iron is the predominant source of susceptibility variations in gray matter as many studies have reported a reasonable correlation of magnetic susceptibility with brain iron concentrations in vivo. Instead of performing direct comparisons, however, all these studies used the putative iron concentrations reported in the hallmark study by Hallgren and Sourander (1958) for their analysis. Consequently, the extent to which QSM can serve to reliably assess brain iron levels is not yet fully clear. To provide such information we investigated the relation between bulk tissue magnetic susceptibility and brain iron concentration in unfixed (in situ) post mortem brains of 13 subjects using MRI and inductively coupled plasma mass spectrometry. A strong linear correlation between chemically determined iron concentration and bulk magnetic susceptibility was found in gray matter structures (r = 0.84, p < 0.001), whereas the correlation coefficient was much lower in white matter (r = 0.27, p < 0.001). The slope of the overall linear correlation was consistent with theoretical considerations of the magnetism of ferritin supporting that most of the iron in the brain is bound to ferritin proteins. In conclusion, iron is the dominant source of magnetic susceptibility in deep gray matter and can be assessed with QSM. In white matter regions the estimation of iron concentrations by QSM is less accurate and more complex because the counteracting contribution from diamagnetic myelinated neuronal fibers confounds the interpretation. PMID:22634862
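    The headline statistics of such a validation, the Pearson correlation and the regression slope between chemically measured iron and QSM-derived susceptibility, are straightforward to compute; the sketch below uses placeholder per-region values, not the study's measurements.

```python
# Relating chemically measured iron concentration to QSM susceptibility:
# Pearson correlation plus a least-squares slope. The arrays are
# placeholders standing in for per-region post-mortem measurements.
import numpy as np
from scipy import stats

iron_mg_per_kg = np.array([21.0, 48.0, 92.0, 130.0, 155.0, 180.0])     # ICP-MS
susceptibility_ppb = np.array([5.0, 32.0, 70.0, 105.0, 120.0, 150.0])  # QSM

r, p_value = stats.pearsonr(iron_mg_per_kg, susceptibility_ppb)
fit = stats.linregress(iron_mg_per_kg, susceptibility_ppb)
print(f"r={r:.2f} (p={p_value:.3g}), slope={fit.slope:.2f} ppb per mg/kg")
```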

  9. Policy analysis and advocacy in nursing education: the Nursing Education Council of British Columbia framework.

    Science.gov (United States)

    Duncan, Susan M; Thorne, Sally; Van Neste-Kenny, Jocelyne; Tate, Betty

    2012-05-01

    Academic nursing leaders play a crucial role in the policy context for nursing education. Effectiveness in this role requires that they work together in presenting nursing education issues from a position of strength, informed by a critical analysis of policy pertaining to the delivery of quality nursing education and scholarship. We describe a collective process of dialog and critical analysis whereby nurse leaders in one Canadian province addressed pressing policy issues facing governments, nursing programs, faculty, and students. Consensus among academic nurse leaders, formalized through the development of a policy action framework, has enabled us to take a stand, at times highly contested, in the politicized arena of the nursing shortage. We present the components of a policy action framework for nursing education and share examples of how we have used a critical approach to analyze and frame policy issues in nursing education for inclusion on policy agendas. Our belief that our work has influenced provincial and national thinking about policy in nursing education is the foundation of our conclusion that political presence and shared strategy among academic nursing leaders are undeniably critical in the global context of nursing today. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  10. Developing a framework for qualitative engineering: Research in design and analysis of complex structural systems

    Science.gov (United States)

    Franck, Bruno M.

    1990-01-01

    The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.

  11. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
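    A minimal sketch of the kind of workflow described, assuming the data are CMIP-style netCDF files with a daily maximum temperature variable named tasmax (in kelvin); the file pattern, variable name, coordinate names and threshold are assumptions for illustration.

```python
# Dask-backed xarray workflow for a simple local climate indicator:
# annual count of days with Tmax above 35 degC.
import xarray as xr

ds = xr.open_mfdataset("tasmax_*.nc", combine="by_coords",
                       chunks={"time": 365})          # lazy, out-of-core

hot_days = ds["tasmax"] > (35.0 + 273.15)             # boolean DataArray
hot_days_per_year = hot_days.groupby("time.year").sum(dim="time")

# Select the grid cell nearest a site of interest (assumes lat/lon coords)
# and trigger the distributed computation.
site = hot_days_per_year.sel(lat=37.77, lon=237.58, method="nearest")
print(site.compute())
```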

  12. ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization

    CERN Document Server

    Naumann, Axel; Ballintijn, Maarten; Bellenot, Bertrand; Biskup, Marek; Brun, Rene; Buncic, Nenad; Canal, Philippe; Casadei, Diego; Couet, Olivier; Fine, Valery; Franco, Leandro; Ganis, Gerardo; Gheata, Andrei; Gonzalez~Maline, David; Goto, Masaharu; Iwaszkiewicz, Jan; Kreshuk, Anna; Marcos Segura, Diego; Maunder, Richard; Moneta, Lorenzo; Offermann, Eddy; Onuchin, Valeriy; Panacek, Suzanne; Rademakers, Fons; Russo, Paul; Tadel, Matevz

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advance...

  13. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    Science.gov (United States)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

    The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.
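    OpenMDAO is open source, and its basic Problem/Component/driver structure can be illustrated with a single-discipline toy problem (essentially the standard paraboloid example). This is not the joint aircraft/engine sizing model from the abstract, and option names may differ between OpenMDAO releases.

```python
# Minimal OpenMDAO-style problem: minimise a simple analytic objective.
# Illustrates the Problem/Component/driver structure only; API details
# follow recent OpenMDAO releases and may vary between versions.
import openmdao.api as om

prob = om.Problem()
prob.model.add_subsystem(
    "paraboloid",
    om.ExecComp("f = (x - 3.0)**2 + x*y + (y + 4.0)**2 - 3.0"),
    promotes=["*"],
)

prob.driver = om.ScipyOptimizeDriver()
prob.driver.options["optimizer"] = "SLSQP"

prob.model.add_design_var("x", lower=-50, upper=50)
prob.model.add_design_var("y", lower=-50, upper=50)
prob.model.add_objective("f")

prob.setup()
prob.set_val("x", 3.0)
prob.set_val("y", -4.0)
prob.run_driver()
print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))
```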

  14. OpenElectrophy: an electrophysiological data- and analysis-sharing framework

    Directory of Open Access Journals (Sweden)

    Samuel Garcia

    2009-05-01

    Full Text Available Progress in experimental tools and design is allowing the acquisition of increasingly large datasets. Storage, manipulation and efficient analyses of such large amounts of data are now a primary issue. We present OpenElectrophy, an electrophysiological data and analysis sharing framework developed to fill this niche. It stores all experiment data and meta-data in a single central MySQL database, and provides a graphical user interface to visualize and explore the data, and a library of functions for user analysis scripting in Python. It implements multiple spike sorting methods, and oscillation detection based on the ridge extraction methods of Roux et al. (2007). OpenElectrophy is open-source and is freely available for download at http://neuralensemble.org/trac/OpenElectrophy.

  15. A comparison between rib fracture patterns in peri- and post-mortem compressive injury in a piglet model.

    Science.gov (United States)

    Bradley, Amanda L; Swain, Michael V; Neil Waddell, J; Das, Raj; Athens, Josie; Kieser, Jules A

    2014-05-01

    Forensic biomechanics is increasingly being used to explain how observed injuries occur. We studied infant rib fractures from a biomechanical and morphological perspective using a porcine model. We used 24 sixth ribs from one-day-old domestic pigs (Sus scrofa), divided into three groups: desiccated (representing post-mortem trauma), fresh ribs with intact periosteum (representing peri-mortem trauma) and those stored at -20°C. Two experiments were designed to study their biomechanical behaviour and fracture morphology: ribs were axially compressed and subjected to four-point bending in an Instron 3339 fitted with custom jigs. Morphoscopic analysis of resultant fractures consisted of standard optical methods, micro-CT (μCT) and Scanning Electron Microscopy (SEM). During axial compression fresh ribs did not fracture because of the energy absorption capabilities of their soft and fluidic components. In flexure tests, dry ribs showed typical elastic-brittle behaviour with long linear load-extension curves, followed by short non-linear elastic (hyperelastic) behaviour and brittle fracture. Fresh ribs showed initial linear-elastic behaviour, followed by strain softening and visco-plastic responses. During the course of loading, dry bone showed minimal observable damage prior to the onset of unstable fracture. Frozen then thawed bone showed similar patterns to fresh bone. Morphologically, fresh ribs showed extensive periosteal damage to the tensile surface with areas of collagen fibre pull-out along the tensile surface. While all dry ribs fractured precipitously, with associated fibre pull-out, the latter feature was absent in thawed ribs. Our study highlights the fact that under controlled loading, fresh piglet ribs (representing peri-mortem trauma) did not fracture through bone but instead showed periosteal tearing. These results suggest, firstly, that complete lateral rib fracture in infants may in fact not result from pure compression as has been previously assumed; and

  16. Analyzing and modeling interdisciplinary product development a framework for the analysis of knowledge characteristics and design support

    CERN Document Server

    Neumann, Frank

    2015-01-01

    Frank Neumann focuses on establishing a theoretical basis that allows a description of the interplay between individual and collective processes in product development. For this purpose, he introduces the integrated descriptive model of knowledge creation as the first constituent of his research framework. As a second part of the research framework, an analysis and modeling method is proposed that captures the various knowledge conversion activities described by the integrated descriptive model of knowledge creation. Subsequently, this research framework is applied to the analysis of knowledge characteristics of mechatronic product development (MPD). Finally, the results gained from the previous steps are used within a design support system that aims at federating the information and knowledge resources contained in the models published in the various development activities of MPD. Contents Descriptive Model of Knowledge Creation in Interdisciplinary Product Development Research Framework for the Analysis of ...

  17. A Thermorisk framework for the analysis of energy systems by combining risk and exergy analysis

    International Nuclear Information System (INIS)

    Cassetti, G.; Colombo, E.; Zio, E.

    2016-01-01

    Highlights: • An exergy based analysis for improving efficiency and safety of energy systems is presented. • The relation between thermodynamic parameters and the safety characteristics is identified. • Possible modifications in the process are indicated to improve the safety of the system. - Abstract: The impact of energy production, transformation and use on environmental resources encourages efforts to understand the mechanisms of resource degradation and to develop proper analyses to reduce the impact of energy systems on the environment. At the technical level, most attempts at reducing the environmental impact of energy systems focus on the improvement of process efficiency. One way toward an integrated approach is that of adopting exergy analysis to assess efficiency and to test improved design and operation solutions. The paper presents an exergy based analysis for improving efficiency and safety of energy systems, named Thermorisk analysis. The purpose of the Thermorisk analysis is to supply information to control, and eventually reduce, the risk of the systems (i.e. risk of accidents) by acting on the thermodynamic parameters and safety characteristics in the same frame. The proper combination of exergy and risk analysis allows monitoring the effects of efficiency improvement on the safety of the systems analyzed. A case study is presented, showing the potential of the analysis to identify the relation between the exergy efficiency and the risk of the system analyzed, and the contribution of inefficiencies to the safety of the process. Possible modifications in the process are indicated to improve the safety of the system.
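
    As a toy illustration of monitoring efficiency and risk "in the same frame" (not the paper's Thermorisk method itself), the following sketch pairs a standard exergy-efficiency ratio with a frequency-times-consequence risk estimate for two made-up design options.

      # Illustrative sketch only: pair exergy efficiency with a notional risk figure
      # so both can be compared across design options. All numbers are invented.
      def exergy_efficiency(exergy_out_kw, exergy_in_kw):
          """Rational exergy efficiency = useful exergy out / exergy in."""
          return exergy_out_kw / exergy_in_kw

      def annual_risk(event_frequency_per_year, expected_consequence):
          """Classical risk estimate: frequency x consequence."""
          return event_frequency_per_year * expected_consequence

      design_options = {
          # name: (exergy in [kW], exergy out [kW], accident freq [1/yr], consequence)
          "baseline":      (1000.0, 620.0, 1e-3, 5.0e6),
          "higher_T_loop": (1000.0, 680.0, 4e-3, 5.0e6),
      }

      for name, (ex_in, ex_out, freq, cons) in design_options.items():
          print(name,
                "efficiency=%.2f" % exergy_efficiency(ex_out, ex_in),
                "risk=%.0f" % annual_risk(freq, cons))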

  18. Analysis of higher education policy frameworks for open and distance education in Pakistan.

    Science.gov (United States)

    Ellahi, Abida; Zaka, Bilal

    2015-04-01

    The constant rise in demand for higher education has become the biggest challenge for educational planners. This high demand has paved the way for distance education across the globe. This article analyzes the policy documentation of a major distance education initiative in Pakistan for validity, identifying the utility of policy linkages. The study adopted a qualitative research design that consisted of two steps. In the first step, a content analysis of the distance learning policy framework was made. For this purpose, two documents were accessed titled "Framework for Launching Distance Learning Programs in HEIs of Pakistan" and "Guideline on Quality of Distance Education for External Students at the HEIs of Pakistan." In the second step, the policy guidelines mentioned in these two documents were evaluated at two levels. At the first level, the overall policy documents were assessed against a criterion proposed by Cheung, Mirzaei, and Leeder. At the second level, the proposed program of distance learning was assessed against a criterion set by Gellman-Danley and Fetzner and Berge. The distance education program initiative in Pakistan is promising but needs to be assessed regularly. This study has made an initial attempt to assess the policy document against a criterion identified from the literature. The analysis shows that the current policy documents do offer some strengths at this initial level; however, they cannot be considered a comprehensive policy guide. The inclusion or correction of missing or vague areas identified in this study would make this policy guideline document a valuable tool for the Higher Education Commission (HEC). For distance education policy makers, this distance education policy framework model recognizes several fundamental areas with which they should be concerned. The findings of this study in the light of two different policy framework measures highlight certain opportunities that can help strengthen the

  19. CM-DataONE: A Framework for collaborative analysis of climate model output

    Science.gov (United States)

    Xu, Hao; Bai, Yuqi; Li, Sha; Dong, Wenhao; Huang, Wenyu; Xu, Shiming; Lin, Yanluan; Wang, Bin

    2015-04-01

    CM-DataONE is a distributed collaborative analysis framework for climate model data which aims to break through the data access barriers of increasing file size and to accelerate the research process. As the data size involved in projects such as the fifth Coupled Model Intercomparison Project (CMIP5) has reached petabytes, conventional methods for analysis and diagnosis of model outputs have become rather time-consuming and redundant. CM-DataONE is developed for data publishers and researchers from relevant areas. It enables easy access to distributed data and provides extensible analysis functions based on tools such as the NCAR Command Language, NetCDF Operators (NCO) and Climate Data Operators (CDO). CM-DataONE can be easily installed, configured, and maintained. The main web application has two separate parts which communicate with each other through APIs based on the HTTP protocol. The analytic server is designed to be installed in each data node while a data portal can be configured anywhere and connect to the nearest node. Functions such as data query, analytic task submission, status monitoring, visualization and product downloading are provided to end users by the data portal. Data conforming to the CMIP5 Model Output Format in each peer node can be scanned by the server and mapped to a global information database. A scheduler included in the server is responsible for task decomposition, distribution and consolidation. Analysis functions are always executed where the data are located. The analysis function package included in the server provides commonly used functions such as EOF analysis, trend analysis and time series analysis. Functions are coupled with data by XML descriptions and can be easily extended. Various types of results can be obtained by users for further studies. This framework has significantly decreased the amount of data to be transmitted and improved efficiency in model intercomparison jobs by supporting online analysis and multi-node collaboration. To end users, data query is
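
    The server-side analysis functions mentioned above build on tools such as CDO; a minimal sketch of the "execute where the data live" idea is shown below, where the CMIP5 file name and the choice of operators are illustrative assumptions.

      # Sketch of running Climate Data Operators (CDO) next to the data, as a server
      # might. The CMIP5 file name and the chosen operators are illustrative.
      import subprocess

      def time_mean(infile, outfile):
          # cdo timmean computes the mean over the whole time axis of a netCDF file
          subprocess.run(["cdo", "timmean", infile, outfile], check=True)

      def field_mean_series(infile, outfile):
          # cdo fldmean reduces each time step to its area-weighted spatial mean
          subprocess.run(["cdo", "fldmean", infile, outfile], check=True)

      if __name__ == "__main__":
          src = "tas_Amon_model_historical_r1i1p1_185001-200512.nc"   # assumed file
          time_mean(src, "tas_timmean.nc")
          field_mean_series(src, "tas_fldmean.nc")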

  20. The FEDRA-Framework for emulsion data reconstruction and analysis in the OPERA experiment

    International Nuclear Information System (INIS)

    Tioukov, V.; Kreslo, I.; Petukhov, Y.; Sirri, G.

    2006-01-01

    OPERA is a massive lead/emulsion target for a long-baseline neutrino oscillation search. More than 90% of the useful experimental data in OPERA will be produced by the scanning of emulsion plates with the automatic microscopes. The main goal of the data processing in OPERA will be the search, analysis and identification of primary and secondary vertices produced by neutrinos in the lead-emulsion target. The volume of middle- and high-level data to be analysed and stored is expected to be of the order of several Gb per event. The storage, calibration, reconstruction, analysis and visualization of these data is the task of the FEDRA system, written in C++ and based on the ROOT framework. The system is now actively used for processing of test beam and simulation data. Several interesting algorithmic solutions permit us to produce efficient code for fast pattern recognition in heavy signal/noise conditions. The system consists of the storage part, the intercalibration and segment linking part, track finding and fitting, vertex finding and fitting, and kinematical analysis parts. A Kalman filtering technique is used for track and vertex fitting. A ROOT-based event display is used for interactive analysis of special events.
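
    As a generic illustration of the Kalman filtering mentioned above (not FEDRA code), the following sketch applies the standard predict/update equations to a toy straight-line track whose state is position and slope, measured plate by plate; all numbers are invented.

      # Generic Kalman-filter illustration on a toy straight-line "track"
      # (state = [position, slope]); this is not FEDRA code.
      import numpy as np

      dz = 1.0                                  # spacing between emulsion plates
      F = np.array([[1.0, dz], [0.0, 1.0]])     # state propagation plate to plate
      H = np.array([[1.0, 0.0]])                # only position is measured
      Q = np.diag([1e-6, 1e-6])                 # process noise (e.g. scattering)
      R = np.array([[0.05 ** 2]])               # measurement noise (arbitrary)

      rng = np.random.default_rng(0)
      true_slope = 0.3
      zs = np.arange(10)
      meas = true_slope * zs + rng.normal(0, 0.05, zs.size)

      x = np.array([meas[0], 0.0])              # initial state guess
      P = np.diag([1.0, 1.0])                   # initial covariance
      for m in meas[1:]:
          # predict to the next plate
          x = F @ x
          P = F @ P @ F.T + Q
          # update with the measurement on that plate
          y = m - H @ x
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + (K @ y).ravel()
          P = (np.eye(2) - K @ H) @ P

      print("fitted slope:", x[1])              # should be close to 0.3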

  1. Computational medical imaging and hemodynamics framework for functional analysis and assessment of cardiovascular structures.

    Science.gov (United States)

    Wong, Kelvin K L; Wang, Defeng; Ko, Jacky K L; Mazumdar, Jagannath; Le, Thu-Thao; Ghista, Dhanjoo

    2017-03-21

    Cardiac dysfunction constitutes common cardiovascular health issues in society, and has been an investigation topic of strong focus by researchers in the medical imaging community. Diagnostic modalities based on echocardiography, magnetic resonance imaging, chest radiography and computed tomography are common techniques that provide cardiovascular structural information to diagnose heart defects. However, functional information of cardiovascular flow, which can in fact be used to support the diagnosis of many cardiovascular diseases with a myriad of hemodynamics performance indicators, remains unexplored to its full potential. Some of these indicators constitute important cardiac functional parameters associated with cardiovascular abnormalities. With the advancement of computer technology that facilitates high speed computational fluid dynamics, the realization of a diagnostic support platform for hemodynamics quantification and analysis can be achieved. This article reviews the state-of-the-art medical imaging and high fidelity multi-physics computational analyses that together enable reconstruction of cardiovascular structures and hemodynamic flow patterns within them, such as of the left ventricle (LV) and carotid bifurcations. The combined medical imaging and hemodynamic analysis enables us to study the mechanisms of cardiovascular disease-causing dysfunctions, such as how (1) cardiomyopathy causes left ventricular remodeling and loss of contractility leading to heart failure, and (2) modeling of LV construction and simulation of intra-LV hemodynamics can enable us to determine the optimum procedure of surgical ventriculation to restore its contractility and health. This combined medical imaging and hemodynamics framework can potentially extend medical knowledge of cardiovascular defects and associated hemodynamic behavior and their surgical restoration, by means of an integrated medical image diagnostics and hemodynamic performance analysis framework.

  2. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    Science.gov (United States)

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology

  3. GRDC. A Collaborative Framework for Radiological Background and Contextual Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Quiter, Brian J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bandstra, Mark S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-12-01

    The Radiation Mobile Analysis Platform (RadMAP) is unique in its capability to collect both high quality radiological data, from gamma-ray and fast neutron detectors, and a broad array of contextual data that includes positioning and stance data, weather sensor data, high-resolution 3D LiDAR, and visual and hyperspectral imagery. The datasets obtained from RadMAP are both voluminous and complex and require analyses from highly diverse communities within both the national laboratory and academic communities. Maintaining a high level of transparency will enable analysis products to further enrich the RadMAP dataset. It is in this spirit of open and collaborative data that the RadMAP team proposed to collect, calibrate, and make available online data from the RadMAP system. The Berkeley Data Cloud (BDC) is a cloud-based data management framework that enables web-based data browsing and visualization, and connects curated datasets to custom workflows such that analysis products can be managed and disseminated while maintaining user access rights. BDC enables cloud-based analyses of large datasets in a manner that simulates real-time data collection, such that BDC can be used to test algorithm performance on real and source-injected datasets. Using the BDC framework, a subset of the RadMAP datasets has been disseminated via the Gamma Ray Data Cloud (GRDC) that is hosted through the National Energy Research Scientific Computing Center (NERSC), enabling data access to over 40 users at 10 institutions.

  4. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    International Nuclear Information System (INIS)

    Hartwig, Zachary S.

    2016-01-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  5. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Hartwig, Zachary S., E-mail: hartwig@mit.edu

    2016-04-11

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  6. The Tracking and Analysis Framework (TAF): A tool for the integrated assessment of acid deposition

    International Nuclear Information System (INIS)

    Bloyd, C.N.; Henrion, M.; Marnicio, R.J.

    1995-01-01

    A major challenge that has faced policy makers concerned with acid deposition is obtaining an integrated view of the underlying science related to acid deposition. In response to this challenge, the US Department of Energy is sponsoring the development of an integrated Tracking and Analysis Framework (TAF) which links together the key acid deposition components of emissions, air transport, atmospheric deposition, and aquatic effects in a single modeling structure. The goal of TAF is to integrate credible models of the scientific and technical issues into an assessment framework that can directly address key policy issues, and in doing so act as a bridge between science and policy. Key objectives of TAF are to support coordination and communication among scientific researchers; to support communications with policy makers and to provide rapid responses for analyzing newly emerging policy issues; and to provide guidance for prioritizing research programs. This paper briefly describes how TAF was formulated to meet those objectives and the underlying principles which form the basis for its development.

  7. An Ensemble Learning Based Framework for Traditional Chinese Medicine Data Analysis with ICD-10 Labels

    Directory of Open Access Journals (Sweden)

    Gang Zhang

    2015-01-01

    Full Text Available Objective. This study aims to establish a model to analyze clinical experience of TCM veteran doctors. We propose an ensemble learning based framework to analyze clinical records with ICD-10 labels information for effective diagnosis and acupoints recommendation. Methods. We propose an ensemble learning framework for the analysis task. A set of base learners composed of decision tree (DT) and support vector machine (SVM) are trained by bootstrapping the training dataset. The base learners are sorted by accuracy and diversity through a nondominated sort (NDS) algorithm and combined through a deep ensemble learning strategy. Results. We evaluate the proposed method with comparison to two currently successful methods on a clinical diagnosis dataset with manually labeled ICD-10 information. ICD-10 label annotation and acupoints recommendation are evaluated for three methods. The proposed method achieves an accuracy rate of 88.2% ± 2.8% measured by zero-one loss for the first evaluation session and 79.6% ± 3.6% measured by Hamming loss, which are superior to the other two methods. Conclusion. The proposed ensemble model can effectively model the implied knowledge and experience in historic clinical data records. The computational cost of training a set of base learners is relatively low.
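
    A minimal sketch of the bagged base-learner idea (decision trees and SVMs trained on bootstrap resamples) is given below using scikit-learn on synthetic data; the nondominated sorting and deep ensemble combination of the paper are not reproduced, and predictions are simply combined by majority vote.

      # Illustrative sketch: train DT and SVM base learners on bootstrap resamples
      # and combine them by majority vote. Synthetic data, not the paper's method.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.utils import resample

      X, y = make_classification(n_samples=600, n_features=20, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      learners = []
      for seed in range(10):
          Xb, yb = resample(X_tr, y_tr, random_state=seed)      # bootstrap sample
          model = DecisionTreeClassifier(random_state=seed) if seed % 2 else SVC()
          learners.append(model.fit(Xb, yb))

      votes = np.mean([m.predict(X_te) for m in learners], axis=0)
      accuracy = np.mean((votes > 0.5).astype(int) == y_te)
      print("majority-vote accuracy: %.3f" % accuracy)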

  8. The strangeness, excess and lack in the construction of a methodological framework for voice discursive analysis

    Directory of Open Access Journals (Sweden)

    Jael Sânera Sigales Gonçalves

    2017-11-01

    Full Text Available Considering the notions of “lack”, “excess” and “strangeness”, from the perspective of materialist discourse analysis, I argue that those three elements are constitutive of the construction process of a methodological framework that considers the prosodic materiality of the voice to be significant. I present the construction of a discursive object of research in which I intend to analyze the discourse of the Minister-Rapporteur during the trial of the “Mensalão” by the Brazilian Supreme Court. I also show how, from the first file reading and listening gestures, I end up with the “reported speech” as the regularity of that discourse, marked by the presence of another in the linguistic linearity. In conclusion, I discuss three concerns during the construction of the framework: the transcription of the audio recordings of the Minister's speech; the status given to the “reported speech” in the research; and the balance between horizontal exhaustiveness and vertical exhaustiveness.

  9. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    Directory of Open Access Journals (Sweden)

    Mohamed Elgendi

    2016-11-01

    Full Text Available Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages (“TERMA”) involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
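
    A simplified sketch of the two-moving-average idea follows, with window sizes chosen to respect the stated inequality; the synthetic signal and plain thresholding are illustrative and omit the further building blocks of the published algorithm.

      # Sketch of the two event-related moving averages idea: a short window tracks
      # candidate events and a longer window provides the threshold, with window
      # sizes satisfying 8*W1 >= W2 >= 2*W1. Signal and parameters are invented.
      import numpy as np

      def moving_average(x, w):
          return np.convolve(x, np.ones(w) / w, mode="same")

      fs = 250                                   # sampling rate [Hz], assumed
      t = np.arange(0, 10, 1 / fs)
      signal = np.random.default_rng(1).normal(0, 0.1, t.size)
      signal[::fs] += 3.0                        # one synthetic "event" per second

      W1, W2 = 25, 150                           # satisfies 8*W1 >= W2 >= 2*W1
      ma_event = moving_average(np.abs(signal), W1)   # event-related average
      ma_cycle = moving_average(np.abs(signal), W2)   # cycle-related average

      blocks = ma_event > ma_cycle               # candidate blocks of interest
      onsets = np.flatnonzero(np.diff(blocks.astype(int)) == 1)
      print("detected event onsets (s):", onsets / fs)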

  10. Analysis of GEANT4 Physics List Properties in the 12 GeV MOLLER Simulation Framework

    Science.gov (United States)

    Haufe, Christopher; Moller Collaboration

    2013-10-01

    To determine the validity of new physics beyond the scope of the electroweak theory, nuclear physicists across the globe have been collaborating on future endeavors that will provide the precision needed to confirm these speculations. One of these is the MOLLER experiment - a low-energy particle experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of polarized electrons scattered off unpolarized electrons in a liquid hydrogen target. This measurement would allow for a more precise determination of the electron's weak charge and weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called "remoll", is written in GEANT4 code. As a result, the simulation can utilize a number of GEANT4 physics lists that constrain particle interactions according to different particle physics models. By comparing these lists with one another using the data-analysis application ROOT, the optimal physics list for the MOLLER simulation can be determined and implemented. This material is based upon work supported by the National Science Foundation under Grant No. 714001.

  11. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Directory of Open Access Journals (Sweden)

    Peter Husen

    Full Text Available Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  12. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Science.gov (United States)

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  13. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods used in modelling the human factor have not taken into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human reliability analysis that includes specific data from operational events and also psychological models of human behaviour. This method includes new elements such as unsafe actions and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: - identification of the unsafe actions; - classification of the unsafe actions as omission or commission; - establishing the type of error corresponding to the unsafe action: slip, lapse, mistake or circumvention; - establishing the influence of performance shaping factors and some corrective actions. (authors)

  14. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    Science.gov (United States)

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling, from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in their regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  15. FireCalc: An XML-based framework for distributed data analysis

    International Nuclear Information System (INIS)

    Duarte, A.S.; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F.

    2008-01-01

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. The user uses this file to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which decouples them from specific solutions. Currently there is an implementation of the FireCalc framework in Java that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis
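
    A minimal sketch of such an XML-RPC round trip using Python's standard library is shown below; the server URL, method name and payload layout are hypothetical placeholders rather than FireCalc's actual interface.

      # Sketch of submitting an analysis job over XML-RPC with the standard library.
      # The URL, method name and payload fields are hypothetical placeholders.
      import xmlrpc.client

      server = xmlrpc.client.ServerProxy("http://localhost:8080/RPC2")

      job = {
          "code": "disp(mean(signal))",        # e.g. a Scilab snippet
          "data_ref": "isttok/shot/12345",     # reference resolved server-side
      }

      result = server.submit_analysis(job)      # hypothetical remote method
      print(result)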

  16. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, industrial use as well as use for public and municipal authorities and street lightning) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We both visualize the results of the analysis and we perform cluster and outlier analysis using the Anselin local Moran's I statistic as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/ or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
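
    A sketch of the cluster and outlier analysis described above (Anselin Local Moran's I) using the PySAL libraries might look as follows; the shapefile path and the column name are placeholders for the prefecture-level demand data.

      # Sketch of a Local Moran's I cluster/outlier analysis with PySAL tools.
      # The shapefile and the "ELEC_DOM" column are hypothetical placeholders.
      import geopandas as gpd
      import libpysal
      from esda.moran import Moran_Local

      gdf = gpd.read_file("prefectures.shp")            # hypothetical boundaries
      w = libpysal.weights.Queen.from_dataframe(gdf)    # contiguity-based weights
      w.transform = "r"                                 # row-standardise

      lisa = Moran_Local(gdf["ELEC_DOM"].values, w)     # domestic electricity use
      gdf["cluster"] = lisa.q                           # 1=HH, 2=LH, 3=LL, 4=HL
      gdf["significant"] = lisa.p_sim < 0.05
      print(gdf[["cluster", "significant"]].head())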

  17. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars and cell phones, and even of more critical activities like aeronautics and the health sciences. In this context, software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area of software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software. In many cases different analysis techniques are applied following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed to prioritise the addition of new specification languages and analysis tools and to enable a synergic relation between the techniques under a graphical interface that satisfies several well-known usability criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  18. FireCalc: An XML-based framework for distributed data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Duarte, A.S. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)], E-mail: andre.duarte@cfn.ist.utl.pt; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. The user uses this file to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which decouples them from specific solutions. Currently there is an implementation of the FireCalc framework in Java that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis.

  19. A Comparative Analysis of Competency Frameworks for Youth Workers in the Out-of-School Time Field

    OpenAIRE

    Vance, Femi

    2010-01-01

    Research suggests that the quality of out-of-school time (OST) programs is related to positive youth outcomes and skilled staff are a critical component of high quality programming. This descriptive case study of competency frameworks for youth workers in the OST field demonstrates how experts and practitioners characterize a skilled youth worker. A comparative analysis of 11 competency frameworks is conducted to identify a set of common core competencies. A set of 12 competency areas that ar...

  20. PyPWA: A partial-wave/amplitude analysis software framework

    Science.gov (United States)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for Partial Wave and Amplitude Analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or any parametric model) are estimated from the data. This branch also includes software to produce simulated data-sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.

  1. [Technology in nursing care: an analysis from the conceptual framework of Fundamental Nursing].

    Science.gov (United States)

    da Silva, Rafael Celestino; Ferreira, Márcia de Assunção

    2014-01-01

    This is a qualitative field study whose purpose was to discuss the use of technologies in nursing care in intensive therapy, taking as reference the theoretical-conceptual framework of Fundamental Nursing. Observation and interviews were conducted with twenty-two nurses of an intensive therapy unit, with ethnographic analysis. The technology, through the domain of a technological language, provides conditions so that the fundamentals of nursing care can be effectively incorporated into nursing practice. The idea of dehumanization linked to technology can be explained by the way the nurse assigns meaning to the things related to his daily life, which will guide his action. The conclusion is that the technologies help to promote life and to rescue the human.

  2. Framework for the analysis of reactive power dispatch in energy pools

    International Nuclear Information System (INIS)

    Salgado, R.S.; Irving, M.R.

    2004-01-01

    This paper proposes a framework for the simulation and analysis of the reactive power distribution in electric energy markets of the pool type. Firstly, the analytical formulation of the OPF problem, with three optional performance indexes for the reactive power dispatch, is discussed. These OPF objectives are used to determine the reactive power distribution for a given active power dispatch (obtained through merit-order strategy, for instance). An allocation strategy is used to assess the participation of each power system agent in the loss/reactive power distribution. This strategy uses the premise of co-operative game theory. Numerical results obtained with the Ward-Hale 6-bus test system illustrate the main aspects of the proposed methodology. (author)

  3. Comparative Analysis of Two Industries for Validating Green Manufacturing (GM) Framework: An Indian Scenario

    Science.gov (United States)

    Rehman, Minhaj Ahemad Abdul; Shrivastava, Rakesh Lakshmikumar; Shrivastava, Rashmi Rakesh

    2017-04-01

    Green Manufacturing (GM) deals with manufacturing practices that reduce or eliminate adverse environmental impact during any phase of manufacturing. It emphasizes the use of processes that do not contaminate the environment or harm consumers, employees, or other stakeholders. This paper presents a comparative analysis of two Indian industries representing different sectors for validating a GM framework. It also highlights the road map of the companies for achieving performance improvement through GM implementation and its impact on organisational performance. The case studies help in evaluating the companies' GM implementation and overall business performance. For this, a diagnostic instrument in the form of a questionnaire was administered to employees in each company and their responses were analysed. In order to have a better understanding of the impact of GM implementation, information about overall business performance was obtained for the last 3 years. The diagnostic instrument developed here may be used by manufacturing organisations to prioritise their management efforts to assess and implement GM.

  4. Mutational analysis a joint framework for Cauchy problems in and beyond vector spaces

    CERN Document Server

    Lorenz, Thomas

    2010-01-01

    Ordinary differential equations play a central role in science and have been extended to evolution equations in Banach spaces. For many applications, however, it is difficult to specify a suitable normed vector space. Shapes without a priori restrictions, for example, do not have an obvious linear structure. This book generalizes ordinary differential equations beyond the borders of vector spaces with a focus on the well-posed Cauchy problem in finite time intervals. Here are some of the examples: - Feedback evolutions of compact subsets of the Euclidean space - Birth-and-growth processes of random sets (not necessarily convex) - Semilinear evolution equations - Nonlocal parabolic differential equations - Nonlinear transport equations for Radon measures - A structured population model - Stochastic differential equations with nonlocal sample dependence and how they can be coupled in systems immediately - due to the joint framework of Mutational Analysis. Finally, the book offers new tools for modelling.

  5. Model-independent partial wave analysis using a massively-parallel fitting framework

    Science.gov (United States)

    Sun, L.; Aoude, R.; dos Reis, A. C.; Sokoloff, M.

    2017-10-01

    The functionality of GooFit, a GPU-friendly framework for doing maximum-likelihood fits, has been extended to extract model-independent S-wave amplitudes in three-body decays such as D+ → h+h+h−. A full amplitude analysis is done where the magnitudes and phases of the S-wave amplitudes are anchored at a finite number of m²(h+h−) control points, and a cubic spline is used to interpolate between these points. The amplitudes for P-wave and D-wave intermediate states are modeled as spin-dependent Breit-Wigner resonances. GooFit uses the Thrust library, with a CUDA backend for NVIDIA GPUs and an OpenMP backend for threads with conventional CPUs. Performance on a variety of platforms is compared. Executing on systems with GPUs is typically a few hundred times faster than executing the same algorithm on a single CPU.
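
    The model-independent S-wave construction can be illustrated with an ordinary cubic-spline interpolation between control points, as in the sketch below; the knot positions and amplitude values are invented for illustration and this is not GooFit/GPU code.

      # Illustration of the model-independent S-wave idea: anchor magnitude and
      # phase at a few m^2(h+h-) control points and interpolate with a cubic spline.
      # Control-point values are invented; this is not GooFit code.
      import numpy as np
      from scipy.interpolate import CubicSpline

      m2_knots = np.array([0.3, 0.6, 1.0, 1.5, 2.1, 2.8])        # GeV^2, assumed
      mag_knots = np.array([1.0, 1.8, 2.5, 1.9, 1.2, 0.7])        # |A_S| at knots
      phase_knots = np.array([0.2, 0.9, 1.6, 2.1, 2.4, 2.6])      # arg(A_S) [rad]

      mag = CubicSpline(m2_knots, mag_knots)
      phase = CubicSpline(m2_knots, phase_knots)

      m2 = np.linspace(0.3, 2.8, 200)
      amp_swave = mag(m2) * np.exp(1j * phase(m2))   # complex S-wave amplitude
      print(amp_swave[:3])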

  6. Critical Medical Anthropology in Midwifery Research: A Framework for Ethnographic Analysis.

    Science.gov (United States)

    Newnham, Elizabeth C; Pincombe, Jan I; McKellar, Lois V

    2016-01-01

    In this article, we discuss the use of critical medical anthropology (CMA) as a theoretical framework for research in the maternity care setting. With reference to the doctoral research of the first author, we argue for the relevance of using CMA for research into the maternity care setting, particularly as it relates to midwifery. We then give an overview of an existing analytic model within CMA that we adapted for looking specifically at childbirth practices and which was then used in both analyzing the data and structuring the thesis. There is often no clear guide to the analysis or writing up of data in ethnographic research; we therefore offer this Critical analytic model of childbirth practices for other researchers conducting ethnographic research into childbirth or maternity care.

  7. A framework for techno-economic & environmental sustainability analysis by risk assessment for conceptual process evaluation

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Sin, Gürkan; Carvalho, Ana

    2016-01-01

    The need to achieve a sustainable process performance has become increasingly important in order to keep a competitive advantage in the global markets. Development of comprehensive and systematic methods to accomplish this goal is the subject of this work. To this end, a multi-level framework for techno-economic and environmental sustainability analysis through risk assessment is proposed for the early-stage design and screening of conceptual process alternatives. The alternatives within the design space are analyzed following the framework’s work-flow, which targets the following: (i) quantify the economic risk; (ii) perform the monetary valuation of environmental impact categories under uncertainty; (iii) quantify the potential environmental risk; (iv) measure the alternatives’ eco-efficiency identifying possible trade-offs; and, lastly (v) propose a joint risk assessment matrix.

  8. Duopoly Market Analysis within One-Shot Decision Framework with Asymmetric Possibilistic Information

    Directory of Open Access Journals (Sweden)

    Peijun Guo

    2010-12-01

    Full Text Available In this paper, a newly emerging duopoly market with a short life cycle is analyzed. The partially known information of market is characterized by the possibility distribution of the parameter in the demand function. Since the life cycle of the new product is short, how many products should be produced by two rival firms is a typical one-shot decision problem. Within the one-shot decision framework, the possibilistic Cournot equilibrium is obtained for the optimal production level of each firm in a duopoly market with asymmetrical possibilistic information. The analysis results show that the proposed approaches are reasonable for one-shot decision problems, which are extensively encountered in business and economics.
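
    For background, the sketch below works through an ordinary (crisp) Cournot equilibrium for two firms under linear inverse demand; the paper's possibilistic one-shot treatment replaces the crisp demand parameter with a possibility distribution, and the numbers here are purely illustrative.

      # Background worked example: crisp Cournot equilibrium for two firms under
      # linear inverse demand p = a - b*(q1 + q2) with unit cost c. Numbers invented.
      a, b, c = 100.0, 1.0, 10.0

      def best_response(q_other):
          # argmax_q (a - b*(q + q_other) - c) * q  =>  q = (a - c - b*q_other) / (2b)
          return max(0.0, (a - c - b * q_other) / (2 * b))

      q1 = q2 = 0.0
      for _ in range(100):                      # iterate best responses to a fixed point
          q1, q2 = best_response(q2), best_response(q1)

      print("equilibrium quantities:", q1, q2)  # analytic answer: (a - c) / (3b) = 30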

  9. Multiattribute utility analysis as a framework for public participation siting a hazardous waste facility

    International Nuclear Information System (INIS)

    Merkhofer, M.W.; Conway, R.; Anderson, R.G.

    1996-01-01

    How can the public play a role in decisions involving complicated scientific arguments? This paper describes a public participation exercise in which stakeholders used multiattribute utility analysis to select a site for a hazardous waste facility. Key to success was the ability to separate and address the two types of judgements inherent in environmental decisions: technical judgements on the likely consequences of alternative choices and value judgements on the importance or seriousness of those consequences. This enabled technical specialists to communicate the essential technical considerations and allowed stakeholders to establish the value judgements for the decision. Although rarely used in public participation, the multiattribute utility approach appears to provide a useful framework for the collaborative resolution of many complex environmental decision problems

  10. A framework for techno-economic & environmental sustainability analysis by risk assessment for conceptual process evaluation

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Sin, Gürkan; Carvalho, Ana

    2016-01-01

    The need to achieve a sustainable process performance has become increasingly important in order to keep a competitive advantage in the global markets. Development of comprehensive and systematic methods to accomplish this goal is the subject of this work. To this end, a multi-level framework for techno-economic and environmental sustainability analysis through risk assessment is proposed for the early-stage design and screening of conceptual process alternatives. The alternatives within the design space are analyzed following the framework’s work-flow, which targets the following: (i) quantify the economic risk; (ii) perform the monetary valuation of environmental impact categories under uncertainty; (iii) quantify the potential environmental risk; (iv) measure the alternatives’ eco-efficiency identifying possible trade-offs; and, lastly (v) propose a joint risk assessment matrix.

  11. Tracking and Analysis Framework (TAF) model documentation and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Bloyd, C.; Camp, J.; Conzelmann, G. [and others]

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO{sub 2}). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  12. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the operation of the power system is affected by these resources, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles.
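
    The loss-reduction effect mentioned in the abstract follows from the fact that local generation reduces the power imported over the feeder impedance, and losses scale with the square of the current. The toy single-feeder sketch below illustrates this; the voltage, resistance and load figures are assumptions, not the simulated Swedish networks.

      # Toy radial-feeder sketch of the loss-reduction effect: local generation
      # offsets part of the load, so less current flows and I^2*R losses drop.
      # All parameters are illustrative assumptions.

      def feeder_losses_kw(load_kw, local_gen_kw, v_nom_v=400.0, r_ohm=0.05):
          """Approximate three-phase losses for the net power imported over the feeder."""
          net_kw = max(load_kw - local_gen_kw, 0.0)        # ignore reverse flow for simplicity
          current_a = net_kw * 1e3 / (3 ** 0.5 * v_nom_v)  # assume unity power factor
          return 3.0 * current_a ** 2 * r_ohm / 1e3        # back to kW

      for pv_kw in (0.0, 50.0, 100.0):
          print(pv_kw, round(feeder_losses_kw(load_kw=200.0, local_gen_kw=pv_kw), 2))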

  13. Tissues from equine cadaver ligaments up to 72 hours of post-mortem: a promising reservoir of stem cells.

    Science.gov (United States)

    Shikh Alsook, Mohamad Khir; Gabriel, Annick; Piret, Joëlle; Waroux, Olivier; Tonus, Céline; Connan, Delphine; Baise, Etienne; Antoine, Nadine

    2015-12-18

    Mesenchymal stem cells (MSCs) harvested from cadaveric tissues represent a promising approach for regenerative medicine. To date, no study has investigated whether viable MSCs could survive in cadaveric tissues from tendon or ligament up to 72 hours post-mortem. The purpose of the present work was to find out if viable MSCs could survive in cadaveric tissues from adult equine ligaments up to 72 hours post-mortem, and to assess their ability (i) to remain in an undifferentiated state and (ii) to divide and proliferate in the absence of any specific stimulus. MSCs were isolated from equine cadaver (EC) suspensory ligaments within 48-72 hours post-mortem. They were evaluated for viability, proliferation, capacity for tri-lineage differentiation, expression of cell surface markers (CD90, CD105, CD73, CD45), pluripotent transcription factor (OCT-4), stage-specific embryonic antigen-1 (SSEA-1), neuron-specific class III beta-tubulin (TUJ-1), and glial fibrillary acidic protein (GFAP). They were also characterized by transmission electron microscopy (TEM). EC-MSCs were successfully isolated and maintained for 20 passages with high cell viability and proliferation. Phase contrast microscopy revealed that cells with fibroblast-like appearance were predominant in the culture. Differentiation assays proved that EC-MSCs are able to differentiate towards mesodermal lineages (osteogenic, adipogenic, chondrogenic). Flow cytometry analysis demonstrated that EC-MSCs expressed CD90, CD105, and CD73, while being negative for the leukocyte common antigen CD45. Immunofluorescence analysis showed a high percentage of positive cells for OCT-4 and SSEA-1. Surprisingly, in absence of any stimuli, some adherent cells closely resembling neuronal and glial morphology were also observed. Interestingly, our results revealed that approximately 15 % of the cell populations were TUJ-1 positive, whereas GFAP expression was detected in only a few cells. Furthermore, TEM analysis

  14. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 billion rows of Pan-STARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
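
    The per-cell map-reduce pattern described above can be illustrated in plain Python. The sketch below is not the LSD API; it only shows the idea of running a map kernel independently over spatially partitioned "cells" and merging the partial results in a reduce step.

      # Illustration of the per-cell MapReduce pattern (NOT the LSD API).
      from collections import defaultdict
      from concurrent.futures import ProcessPoolExecutor

      def map_kernel(cell_rows):
          """Per-cell kernel: histogram r-band magnitudes in 0.5 mag bins."""
          hist = defaultdict(int)
          for row in cell_rows:
              hist[round(row["r_mag"] * 2) / 2] += 1
          return hist

      def reduce_hists(partials):
          total = defaultdict(int)
          for h in partials:
              for k, v in h.items():
                  total[k] += v
          return dict(total)

      if __name__ == "__main__":
          # Two toy cells standing in for HDF5-backed cells keyed by (lon, lat, t).
          cells = [
              [{"r_mag": 17.2}, {"r_mag": 17.4}, {"r_mag": 18.1}],
              [{"r_mag": 18.0}, {"r_mag": 19.6}],
          ]
          with ProcessPoolExecutor() as pool:
              partials = list(pool.map(map_kernel, cells))
          print(reduce_hists(partials))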

  15. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    Science.gov (United States)

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit to the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but it is exceptionally time consuming and is not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).
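
    Case study (2) above rests on the idea of exposing a module's parameters through a self-documenting command line interface so that it can be driven from scripting environments. The sketch below illustrates that idea with a hypothetical module and parameter set; it is not JIST's actual interface.

      # Sketch of a self-documenting command line wrapper around a processing
      # module.  Module name and parameters are hypothetical, not JIST's API.
      import argparse

      def skull_strip(in_path, out_path, threshold, iterations):
          """Stand-in for an image-processing module; real work omitted."""
          print(f"processing {in_path} -> {out_path} (thr={threshold}, it={iterations})")

      def build_parser():
          p = argparse.ArgumentParser(description="Skull-strip a T1-weighted volume.")
          p.add_argument("--input", required=True, help="input image file")
          p.add_argument("--output", required=True, help="output image file")
          p.add_argument("--threshold", type=float, default=0.5, help="intensity threshold")
          p.add_argument("--iterations", type=int, default=3, help="smoothing iterations")
          return p

      if __name__ == "__main__":
          args = build_parser().parse_args()   # `--help` prints the documentation
          skull_strip(args.input, args.output, args.threshold, args.iterations)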

  16. Critical Care nurses' understanding of the NHS knowledge and skills framework. An interpretative phenomenological analysis.

    Science.gov (United States)

    Stewart, Laura F M; Rae, Agnes M

    2013-01-01

    This small-scale research study aimed to explore Critical Care nurses' understanding of the National Health Service (NHS) Knowledge and Skills Framework (KSF) in relationship to its challenges and their nursing role. The NHS KSF is central to the professional development of nurses in Critical Care and supports the effective delivery of health care in the UK. KSF was implemented in 2004 yet engagement seems lacking with challenges often identified. This qualitative study adopted an Interpretative Phenomenological Analysis framework. Data were collected from five Critical Care nurses using semi-structured interviews that were transcribed for analysis. Two super-ordinate themes of 'engagement' and 'theory-practice gap' were identified. Six subthemes of 'fluency', 'transparency', 'self-assessment', 'achieving for whom', 'reflection' and 'the nursing role' further explained the super-ordinate themes. Critical Care nurses demonstrated layers of understanding about KSF. Challenges identified were primarily concerned with complex language, an unclear process and the use of reflective and self-assessment skills. Two theory-practice gaps were found. Critical Care nurses understood the principles of KSF but they either did not apply or did not realize they applied these principles. They struggled to relate KSF to Critical Care practice and felt it did not capture the 'essence' of their nursing role in Critical Care. Recommendations were made for embedding KSF into Critical Care practice, using education and taking a flexible approach to KSF to support the development and care delivery of Critical Care nurses. © 2012 The Authors. Nursing in Critical Care © 2012 British Association of Critical Care Nurses.

  17. Economic and Nonproliferation Analysis Framework for Assessing Reliable Nuclear Fuel Service Arrangements

    International Nuclear Information System (INIS)

    Phillips, Jon R.; Kreyling, Sean J.; Short, Steven M.; Weimar, Mark R.

    2010-01-01

    Nuclear power is now broadly recognized as an essential technology in national strategies to provide energy security while meeting carbon management goals. Yet a long standing conundrum remains: how to enable rapid growth in the global nuclear power infrastructure while controlling the spread of sensitive enrichment and reprocessing technologies that lie at the heart of nuclear fuel supply and nuclear weapons programs. Reducing the latent proliferation risk posed by a broader horizontal spread of enrichment and reprocessing technology has been a primary goal of national nuclear supplier policies since the beginning of the nuclear power age. Attempts to control the spread of sensitive nuclear technology have been the subject of numerous initiatives in the intervening decades sometimes taking the form of calls to develop fuel supply and service assurances to reduce market pull to increase the number of states with fuel cycle capabilities. A clear understanding of what characteristics of specific reliable nuclear fuel service (RNFS) and supply arrangements qualify them as 'attractive offers' is critical to the success of current and future efforts. At a minimum, RNFS arrangements should provide economic value to all participants and help reduce latent proliferation risks posed by the global expansion of nuclear power. In order to inform the technical debate and the development of policy, Pacific Northwest National Laboratory has been developing an analytical framework to evaluate the economics and nonproliferation merits of alternative approaches to RNFS arrangements. This paper provides a brief overview of the economic analysis framework developed and applied to a model problem of current interest: full-service nuclear fuel leasing arrangements. Furthermore, this paper presents an extended outline of a proposed analysis approach to evaluate the non-proliferation merits of various RNFS alternatives.

  18. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.
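
    For the first verification objective (counting active pins), one natural class of scoring metric compares a thresholded, reconstructed per-pin activity map against the true pin map. The sketch below shows such a metric; the threshold and toy arrays are assumptions for illustration and do not reproduce the collaboration's actual metrics.

      # Possible per-pin scoring metric: threshold the reconstructed activity and
      # score against the true pin map.  Threshold and arrays are illustrative.
      import numpy as np

      def pin_detection_scores(true_active, reconstructed_activity, threshold):
          predicted = reconstructed_activity >= threshold
          tp = np.sum(predicted & true_active)
          fp = np.sum(predicted & ~true_active)
          fn = np.sum(~predicted & true_active)
          tn = np.sum(~predicted & ~true_active)
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          return sensitivity, specificity

      true_active = np.array([1, 1, 0, 1, 1, 0, 1, 1], dtype=bool)   # toy 8-pin row
      recon = np.array([0.9, 0.8, 0.2, 0.7, 0.95, 0.4, 0.85, 0.3])   # reconstructed activity
      print(pin_detection_scores(true_active, recon, threshold=0.5))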

  19. Using Functional Analysis as a Framework to Guide Individualized Treatment for Negative Symptoms

    Directory of Open Access Journals (Sweden)

    Tania M. Lincoln

    2017-12-01

    Full Text Available Although numerous interventions are available for negative symptoms, outcomes have been unsatisfactory with pharmacological and psychological interventions producing changes of only limited clinical significance. Here, we argue that because negative symptoms occur as a complex syndrome caused and maintained by numerous factors that vary between individuals they are unlikely to be treated effectively by the present “one size fits all” approaches. Instead, a well-founded selection of those interventions relevant to each individual is needed to optimize both the efficiency and the efficacy of existing approaches. The concept of functional analysis (FA can be used to structure existing knowledge so that it can guide individualized treatment planning. FA is based on stimulus—response learning mechanisms taking into account the characteristics of the organism that contribute to the responses, their consequences and the contingency with which consequences are tied to the response. FA can thus be flexibly applied to the level of individual patients to understand the factors causing and maintaining negative symptoms and derive suitable interventions. In this article we will briefly introduce the concept of FA and demonstrate—exemplarily—how known psychological and biological correlates of negative symptoms can be incorporated into its framework. We then outline the framework's implications for individual assessment and treatment. Following the logic of FA, we argue that a detailed assessment is needed to identify the key factors causing or maintaining negative symptoms for each individual patient. Interventions can then be selected according to their likelihood of changing these key factors and need to take interactions between different factors into account. Supplementary case vignettes exemplify the usefulness of functional analysis for individual treatment planning. Finally, we discuss and point to avenues for future research guided by this

  20. A Support Analysis Framework for mass movement damage assessment: applications to case studies in Calabria (Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-03-01

    Full Text Available The analysis of data describing damage caused by mass movements in Calabria (Italy) allowed the organisation of the Support Analysis Framework (SAF), a spreadsheet that converts damage descriptions into numerical indices expressing direct, indirect, and intangible damage.

    The SAF assesses damage indices of past mass movements and the potential outcomes of dormant phenomena re-activations. It is based on the effects on damaged elements and is independent of both physical and geometric phenomenon characteristics.

    SAF sections that assess direct damage encompass several lines, each describing an element characterised by a value fixed on a relative arbitrary scale. The levels of loss are classified as: L4: complete; L3: high; L2: medium; or L1: low. For a generic line l, the SAF multiplies the value of a damaged element by its level of loss, obtaining dl, the contribution of the line to the damage.

    Indirect damage is appraised by two sections accounting for: (a) actions aiming to overcome emergency situations and (b) actions aiming to restore pre-movement conditions. The level of loss depends on the number of people involved (a) or the cost of actions (b).

    For intangible damage, the level of loss depends on the number of people involved.

    We examined three phenomena, assessing damage using the SAF and SAFL, customised versions of SAF based on the elements actually present in the analysed municipalities that consider the values of elements in the community framework. We show that in less populated, inland, and affluent municipalities, the impact of mass movements is greater than in coastal areas.

    The SAF can be useful to sort groups of phenomena according to their probable future damage, supplying results significant either for insurance companies or for local authorities involved in both disaster management and planning of defensive measures.
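
    The direct-damage computation described above (each line contributes the value of the damaged element multiplied by its level of loss) can be sketched as follows. The numeric weights attached to the loss levels L1-L4 and the element values are illustrative assumptions, not the SAF's calibrated figures.

      # Sketch of the direct-damage section: d_l = element value x level of loss,
      # summed over lines.  Loss-level weights and element values are illustrative.
      LOSS_LEVEL = {"L1": 0.25, "L2": 0.5, "L3": 0.75, "L4": 1.0}   # assumed numeric weights

      def direct_damage(lines):
          """lines: iterable of (element_value_on_relative_scale, loss_level_code)."""
          return sum(value * LOSS_LEVEL[level] for value, level in lines)

      damaged_elements = [
          (10, "L4"),   # e.g. a dwelling, complete loss
          (4,  "L2"),   # e.g. a secondary road, medium loss
          (1,  "L1"),   # e.g. a fence, low loss
      ]
      print(direct_damage(damaged_elements))   # 10*1.0 + 4*0.5 + 1*0.25 = 12.25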

  1. Development of a framework for the neutronics analysis system for next generation (2) (Contract research)

    International Nuclear Information System (INIS)

    Hirai, Yasushi; Hyoudou, Hideaki; Tatsumi, Masahiro; Jin, Tomoyuki; Yokoyama, Kenji

    2008-10-01

    Japan Atomic Energy Agency promotes development of innovative analysis methods and models in fundamental studies for next-generation nuclear reactor systems. In order to efficiently and effectively reflect the latest analysis methods and models in the primary design of a prototype reactor and/or in-core fuel management for power reactors, a next-generation analysis system, MARBLE, has been developed. The next-generation system provides solutions to the following requirements: (1) flexibility, extensibility and user-friendliness, so that new methods and models can be applied rapidly and effectively in fundamental studies; (2) quantitative assurance of solution accuracy and an adaptive scoping range for design studies; (3) coupling analysis among different study domains for the purpose of rationalization of plant systems and improvement of reliability; (4) maintainability and reusability for system extensions for the purpose of total quality assurance and development efficiency. The conventional analysis system suffered from extreme inefficiency, due to a lack of functionality, whenever analysis targets and/or modeling levels were changed. In order to solve this problem, a hybrid-system policy is adopted for the next-generation system, in which the controlling part is implemented in a scripting language, which offers rich flexibility and maintainability, while the solution kernels that require execution speed are implemented in a system language. In this study, detailed design of a framework, its implementation and tests are conducted so that a Python system layer can drive calculation codes written in C++ and/or Fortran. It is confirmed that various types of calculation codes, such as diffusion, transport and burnup codes, can be treated in the same manner on the platform, a unified management system for calculation codes with a data exchange mechanism based on an abstracted data model between the Python layer and the calculation-code layer. (author)
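
    The hybrid-system idea of a Python control layer driving compiled solution kernels can be illustrated with a ctypes binding. The shared library name and function signature below are hypothetical stand-ins for a C/C++ or Fortran kernel and are not MARBLE's actual interface.

      # Sketch of a Python layer driving a compiled kernel through a flat,
      # abstracted array interface.  Library name and signature are hypothetical.
      import ctypes
      import numpy as np

      lib = ctypes.CDLL("./libdiffusion_kernel.so")          # assumed compiled kernel
      lib.solve_flux.argtypes = [
          ctypes.POINTER(ctypes.c_double),   # cross-section data (in)
          ctypes.POINTER(ctypes.c_double),   # flux solution (out)
          ctypes.c_int,                      # number of mesh cells
      ]
      lib.solve_flux.restype = ctypes.c_int

      def solve_flux(xs: np.ndarray) -> np.ndarray:
          """Exchange contiguous float64 arrays with the kernel and return the flux."""
          xs = np.ascontiguousarray(xs, dtype=np.float64)
          flux = np.zeros_like(xs)
          status = lib.solve_flux(
              xs.ctypes.data_as(ctypes.POINTER(ctypes.c_double)),
              flux.ctypes.data_as(ctypes.POINTER(ctypes.c_double)),
              ctypes.c_int(xs.size),
          )
          if status != 0:
              raise RuntimeError(f"kernel returned status {status}")
          return flux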

  2. A stochastic context free grammar based framework for analysis of protein sequences

    Directory of Open Access Journals (Sweden)

    Nebel Jean-Christophe

    2009-10-01

    Full Text Available Abstract Background In the last decade, there have been many applications of formal language theory in bioinformatics such as RNA structure prediction and detection of patterns in DNA. However, in the field of proteomics, the size of the protein alphabet and the complexity of relationship between amino acids have mainly limited the application of formal language theory to the production of grammars whose expressive power is not higher than stochastic regular grammars. However, these grammars, like other state of the art methods, cannot cover any higher-order dependencies such as nested and crossing relationships that are common in proteins. In order to overcome some of these limitations, we propose a Stochastic Context Free Grammar based framework for the analysis of protein sequences where grammars are induced using a genetic algorithm. Results This framework was implemented in a system aiming at the production of binding site descriptors. These descriptors not only allow detection of protein regions that are involved in these sites, but also provide insight in their structure. Grammars were induced using quantitative properties of amino acids to deal with the size of the protein alphabet. Moreover, we imposed some structural constraints on grammars to reduce the extent of the rule search space. Finally, grammars based on different properties were combined to convey as much information as possible. Evaluation was performed on sites of various sizes and complexity described either by PROSITE patterns, domain profiles or a set of patterns. Results show the produced binding site descriptors are human-readable and, hence, highlight biologically meaningful features. Moreover, they achieve good accuracy in both annotation and detection. In addition, findings suggest that, unlike current state-of-the-art methods, our system may be particularly suited to deal with patterns shared by non-homologous proteins. Conclusion A new Stochastic Context Free
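
    The core operation behind such grammars is scoring a sequence with the inside (probabilistic CYK) algorithm. The toy example below uses a two-letter alphabet (h for hydrophobic, p for polar) and invented rule probabilities; it only illustrates the mechanism and is not one of the induced binding-site grammars.

      # Toy SCFG in Chomsky normal form scored with the inside/CYK algorithm.
      # Alphabet and rule probabilities are invented for illustration.
      from collections import defaultdict

      binary = {                          # A -> B C : probability
          ("S", ("A", "B")): 0.8,
          ("S", ("A", "A")): 0.2,
          ("A", ("A", "A")): 0.3,
          ("B", ("B", "B")): 0.4,
      }
      unary = {                           # A -> terminal : probability
          ("A", "h"): 0.7,
          ("B", "p"): 0.6,
      }

      def inside_probability(seq, start="S"):
          n = len(seq)
          table = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n)]
          for i, sym in enumerate(seq):                       # length-1 spans
              for (nt, term), p in unary.items():
                  if term == sym:
                      table[i][i + 1][nt] += p
          for span in range(2, n + 1):                        # longer spans
              for i in range(n - span + 1):
                  j = i + span
                  for k in range(i + 1, j):
                      for (nt, (left, right)), p in binary.items():
                          table[i][j][nt] += p * table[i][k][left] * table[k][j][right]
          return table[0][n][start]

      print(inside_probability("hhp"))   # probability that S derives the sequence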

  3. A holistic framework of degradation modeling for reliability analysis and maintenance optimization of nuclear safety systems

    International Nuclear Information System (INIS)

    Lin, Yanhui

    2016-01-01

    Components of nuclear safety systems are in general highly reliable, which leads to a difficulty in modeling their degradation and failure behaviors due to the limited amount of data available. Besides, the complexity of such modeling task is increased by the fact that these systems are often subject to multiple competing degradation processes and that these can be dependent under certain circumstances, and influenced by a number of external factors (e.g. temperature, stress, mechanical shocks, etc.). In this complicated problem setting, this PhD work aims to develop a holistic framework of models and computational methods for the reliability-based analysis and maintenance optimization of nuclear safety systems taking into account the available knowledge on the systems, degradation and failure behaviors, their dependencies, the external influencing factors and the associated uncertainties.The original scientific contributions of the work are: (1) For single components, we integrate random shocks into multi-state physics models for component reliability analysis, considering general dependencies between the degradation and two types of random shocks. (2) For multi-component systems (with a limited number of components):(a) a piecewise-deterministic Markov process modeling framework is developed to treat degradation dependency in a system whose degradation processes are modeled by physics-based models and multi-state models; (b) epistemic uncertainty due to incomplete or imprecise knowledge is considered and a finite-volume scheme is extended to assess the (fuzzy) system reliability; (c) the mean absolute deviation importance measures are extended for components with multiple dependent competing degradation processes and subject to maintenance; (d) the optimal maintenance policy considering epistemic uncertainty and degradation dependency is derived by combining finite-volume scheme, differential evolution and non-dominated sorting differential evolution; (e) the

  4. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  5. Forensic Identification of Decomposed Human Body through Comparison between Ante-Mortem and Post-Mortem CT Images of Frontal Sinuses: Case Report

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira Silva

    2017-01-01

    Full Text Available Objective: The aim of this paper is to report on a case of positive human identification of a decomposed body after the comparison of ante-mortem (AM) and post-mortem (PM) computed tomography images of the frontal sinus. Case report: An unknown, highly decomposed human body, aged between 30 and 40 years, was found in a forest region in Brazil. The dental autopsy revealed several teeth missing AM and the presence of removable partial prostheses. The search for AM data resulted in a sequence of 20 axial images of the paranasal sinuses obtained by Multislice Computed Tomography (MSCT). PM reproduction of the MSCT images was performed in order to enable a comparative identification. After a direct comparison between AM and PM MSCT, the data were collected for morphological findings, specifically for the lateral expansion of the left lobe, the anteroposterior dimension, and the position of median and accessory septa of the sinuses. Conclusion: The importance of storing and interpreting radiographic medical data properly is highlighted in this text, thus pointing out the importance of the application of forensic radiology in the field of law.

  6. Patient satisfaction with nursing care: a concept analysis within a nursing framework.

    Science.gov (United States)

    Wagner, Debra; Bear, Mary

    2009-03-01

    This paper is a report of a concept analysis of patient satisfaction with nursing care. Patient satisfaction is an important indicator of quality of care, and healthcare facilities are interested in maintaining high levels of satisfaction in order to stay competitive in the healthcare market. Nursing care has a prominent role in patient satisfaction. Using a nursing model to measure patient satisfaction with nursing care helps define and clarify this concept. Rodgers' evolutionary method of concept analysis provided the framework for this analysis. Data were retrieved from the Cumulative Index of Nursing and Allied Health Literature and MEDLINE databases and the ABI/INFORM global business database. The literature search used the keywords patient satisfaction, nursing care and hospital. The sample included 44 papers published in English, between 1998 and 2007. Cox's Interaction Model of Client Health Behavior was used to analyse the concept of patient satisfaction with nursing care. The attributes leading to the health outcome of patient satisfaction with nursing care were categorized as affective support, health information, decisional control and professional/technical competencies. Antecedents embodied the uniqueness of the patient in terms of demographic data, social influence, previous healthcare experiences, environmental resources, intrinsic motivation, cognitive appraisal and affective response. Consequences of achieving patient satisfaction with nursing care included greater market share of healthcare finances, compliance with healthcare regimens and better health outcomes. The meaning of patient satisfaction continues to evolve. Using a nursing model to measure patient satisfaction with nursing care delineates the concept from other measures of patient satisfaction.

  7. An Analysis Framework for Understanding the Origin of Nuclear Activity in Low-power Radio Galaxies

    Science.gov (United States)

    Lin, Yen-Ting; Huang, Hung-Jin; Chen, Yen-Chi

    2018-05-01

    Using large samples containing nearly 2300 active galaxies of low radio luminosity (1.4 GHz luminosity between 2 × 10²³ and 3 × 10²⁵ W Hz⁻¹, essentially low-excitation radio galaxies) at z ≲ 0.3, we present a self-contained analysis of the dependence of the nuclear radio activity on both intrinsic and extrinsic properties of galaxies, with the goal of identifying the best predictors of the nuclear radio activity. While confirming the established result that stellar mass must play a key role in the triggering of radio activities, we point out that for the central, most massive galaxies, the radio activity also shows a strong dependence on halo mass, which is not likely due to enhanced interaction rates in denser regions in massive, cluster-scale halos. We thus further investigate the effects of various properties of the intracluster medium (ICM) in massive clusters on the radio activities, employing two standard statistical tools, principal component analysis and logistic regression. It is found that ICM entropy, local cooling time, and pressure are the most effective in predicting the radio activity, pointing to the accretion of gas cooling out of a hot atmosphere to be the likely origin in triggering such activities in galaxies residing in massive dark matter halos. Our analysis framework enables us to logically discern the mechanisms responsible for the radio activity separately for central and satellite galaxies.
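
    The logistic-regression step can be sketched as a model that predicts radio-active versus inactive central galaxies from ICM properties. The example below uses synthetic data; the feature names follow the abstract (entropy, cooling time, pressure) but the numbers carry no physical meaning.

      # Hedged sketch of the logistic-regression step on synthetic data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      n = 500
      log_entropy = rng.normal(1.5, 0.5, n)       # synthetic log10 central entropy
      log_cooltime = rng.normal(0.0, 0.6, n)      # synthetic log10 cooling time
      log_pressure = rng.normal(-2.0, 0.4, n)     # synthetic log10 pressure
      X = np.column_stack([log_entropy, log_cooltime, log_pressure])

      # Synthetic truth: low entropy / short cooling time favour radio activity.
      logit = -2.0 * log_entropy - 1.5 * log_cooltime + 1.0
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      model = make_pipeline(StandardScaler(), LogisticRegression())
      model.fit(X, y)
      print(model.named_steps["logisticregression"].coef_)   # entropy/cooling-time weights come out negative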

  8. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (the physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises, perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods.
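
    The generic structure the framework reasons over (targets, threats and the effectiveness of the safeguards between them) can be illustrated with a simple residual-risk score per threat-target pair. The numbers below are illustrative and do not reproduce the expert system's knowledge base.

      # Sketch of residual risk = threat likelihood x target impact x (1 - safeguard
      # effectiveness) for each (threat, target) pair.  Values are illustrative.
      targets = {"hardware": 0.8, "software": 0.9, "documents": 0.5, "physical_plant": 0.7}   # impact
      threats = {"natural_hazard": 0.1, "direct_human": 0.3, "indirect_human": 0.4}           # likelihood
      effectiveness = {                         # safeguard effectiveness per (threat, target)
          ("direct_human", "hardware"): 0.9,
          ("indirect_human", "software"): 0.5,
      }

      def residual_risk(threats, targets, effectiveness):
          risks = {}
          for threat, likelihood in threats.items():
              for target, impact in targets.items():
                  eff = effectiveness.get((threat, target), 0.0)
                  risks[(threat, target)] = likelihood * impact * (1.0 - eff)
          return risks

      worst = max(residual_risk(threats, targets, effectiveness).items(), key=lambda kv: kv[1])
      print(worst)   # points to the most cost-effective place to add safeguards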

  9. A Conjoint Analysis Framework for Evaluating User Preferences in Machine Translation.

    Science.gov (United States)

    Kirchhoff, Katrin; Capurro, Daniel; Turner, Anne M

    2014-03-01

    Despite much research on machine translation (MT) evaluation, there is surprisingly little work that directly measures users' intuitive or emotional preferences regarding different types of MT errors. However, the elicitation and modeling of user preferences is an important prerequisite for research on user adaptation and customization of MT engines. In this paper we explore the use of conjoint analysis as a formal quantitative framework to assess users' relative preferences for different types of translation errors. We apply our approach to the analysis of MT output from translating public health documents from English into Spanish. Our results indicate that word order errors are clearly the most dispreferred error type, followed by word sense, morphological, and function word errors. The conjoint analysis-based model is able to predict user preferences more accurately than a baseline model that chooses the translation with the fewest errors overall. Additionally we analyze the effect of using a crowd-sourced respondent population versus a sample of domain experts and observe that main preference effects are remarkably stable across the two samples.
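
    A ratings-based flavour of conjoint analysis can be sketched as a linear model that recovers the relative "part-worth" penalty of each error type from rated translation variants. The profiles and ratings below are synthetic; only the error taxonomy follows the abstract.

      # Ratings-based conjoint sketch on synthetic data: rating ~ intercept +
      # sum(part_worth * error_count).  Only the error taxonomy follows the paper.
      import numpy as np

      error_types = ["word_order", "word_sense", "morphology", "function_word"]
      # columns: number of errors of each type in a rated translation variant
      X = np.array([
          [0, 0, 0, 0],
          [1, 0, 0, 0],
          [0, 1, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1],
          [1, 1, 0, 0],
          [0, 1, 1, 1],
      ])
      ratings = np.array([5.0, 2.0, 3.0, 3.8, 4.2, 1.0, 2.2])   # synthetic 1-5 acceptability

      design = np.column_stack([np.ones(len(X)), X])
      coef, *_ = np.linalg.lstsq(design, ratings, rcond=None)
      for name, w in zip(error_types, coef[1:]):
          print(f"{name:14s} part-worth {w:+.2f}")   # most negative = most dispreferred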

  10. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

    Highlights: • Six models are employed to establish the technical efficiency of Electricity Distribution Counties. • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties. • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy. - Abstract: European Energy market liberalization has entailed the restructuring of electricity power markets through the unbundling of electricity generation, transmission and distribution, supply activities and introducing competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool that is capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method to conduct this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of using this approach, to determine the technical efficiency and the potential scope for efficiency improvements through reorganizing and the amalgamation of the distribution network in Ireland. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties as opposed to the traditionally employed environmental variables. The adoption of this diagnostic parameter leads to a more intuitive understanding of Electricity Distribution Counties
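
    The input-oriented, constant-returns-to-scale DEA efficiency of a unit is the optimum of a small linear programme. The sketch below solves that programme with scipy for three hypothetical counties; the input and output figures are illustrative, not the study's data.

      # Minimal input-oriented CCR DEA sketch using scipy.optimize.linprog.
      # Inputs/outputs per county are illustrative; efficiency 1.0 = on the frontier.
      import numpy as np
      from scipy.optimize import linprog

      inputs = np.array([[100.0, 20.0], [120.0, 25.0], [90.0, 30.0]])   # e.g. network km, staff
      outputs = np.array([[500.0], [520.0], [480.0]])                   # e.g. energy delivered (GWh)

      def ccr_efficiency(o):
          n_dmu = inputs.shape[0]
          c = np.zeros(1 + n_dmu)
          c[0] = 1.0                                       # minimise theta
          A_ub, b_ub = [], []
          for i in range(inputs.shape[1]):                 # sum_j lam_j * x_ij <= theta * x_io
              A_ub.append(np.r_[-inputs[o, i], inputs[:, i]])
              b_ub.append(0.0)
          for r in range(outputs.shape[1]):                # sum_j lam_j * y_rj >= y_ro
              A_ub.append(np.r_[0.0, -outputs[:, r]])
              b_ub.append(-outputs[o, r])
          res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                        bounds=[(0, None)] * (1 + n_dmu), method="highs")
          return res.x[0]

      for o in range(inputs.shape[0]):
          print(f"county {o}: efficiency {ccr_efficiency(o):.3f}")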

  11. Towards a framework for agent-based image analysis of remote-sensing data.

    Science.gov (United States)

    Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera

    2015-04-03

    Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: Especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects' properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential of transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which is exhaustively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA).

  12. Effect of gamma irradiation on the microstructure and post-mortem anaerobic metabolism of bovine muscle

    International Nuclear Information System (INIS)

    Yook, H.-S.; Lee, J.-W.; Lee, K.-H.; Kim, M.-K.; Song, C.-W.; Byun, M.-W.

    2001-01-01

    Experiments were performed to study the effect of gamma irradiation on morphological properties and post-mortem metabolism in bovine M. sternomandibularis with special reference to ultrastructure, shear force, pH and ATP breakdown. The shortening of sarcomeres was not observed in gamma-irradiated muscle; however, the disappearance of the M-line and of the A- and I-bands was perceptible. During cold storage, the destruction of muscle bundles was faster in the gamma-irradiated muscle than in the non-irradiated muscle, in a dose-dependent manner. The same is true for the post mortem pH drop and ATP breakdown. Thus, the experimental results confirmed that the anaerobic metabolism and morphological properties are noticeably affected by gamma irradiation in beef.

  13. Herpetic brainstem encephalitis: report of a post-mortem case studied electron microscopically and immunohistochemically

    Directory of Open Access Journals (Sweden)

    José Eymard Homem Pitella

    1987-03-01

    Full Text Available A post-mortem examined case of herpetic brainstem encephalitis is presented. Clinically, the patient had headache followed by ataxia, drowsiness and multiple palsies of cranial nerves, progressing to death in eight days. The pathological examination of the brain showed necrotizing encephalitis in multiple foci limited to the brainstem, most distinctly in the pons and medulla oblongata. The immunoperoxidase technique revealed rare glial cells with intranuclear immunoreactivity for herpes antigen. Rare viral particles with the morphological characteristics of the herpesvirus were identified in the nuclei of neurons in 10% formol-fixed material. This is the second reported case of herpetic brainstem encephalitis confirmed by post-mortem examination. The pathway used by the virus to reach the central nervous system and its subsequent dissemination to the oral cavity, the orbitofrontal region and the temporal lobes as well as to the brainstem, after a period of latency and reactivation, are discussed.

  14. Post-mortem cardiac diffusion tensor imaging: detection of myocardial infarction and remodeling of myofiber architecture

    International Nuclear Information System (INIS)

    Winklhofer, Sebastian; Berger, Nicole; Stolzmann, Paul; Stoeck, Christian T.; Kozerke, Sebastian; Thali, Michael; Manka, Robert; Alkadhi, Hatem

    2014-01-01

    To investigate the accuracy of post-mortem diffusion tensor imaging (DTI) for the detection of myocardial infarction (MI) and to demonstrate the feasibility of helix angle (HA) calculation to study remodelling of myofibre architecture. Cardiac DTI was performed in 26 deceased subjects prior to autopsy for medicolegal reasons. Fractional anisotropy (FA) and mean diffusivity (MD) were determined. Accuracy was calculated on per-segment (AHA classification), per-territory, and per-patient bases, with pathology as the reference standard. HAs were calculated and compared between healthy segments and those with MI. Autopsy demonstrated MI in 61/440 segments (13.9 %) in 12/26 deceased subjects. Healthy myocardial segments had significantly higher FA (p 0.05). Post-mortem cardiac DTI enables differentiation between healthy and infarcted myocardial segments by means of FA and MD. HA assessment allows for the demonstration of remodelling of myofibre architecture following chronic MI. (orig.)

  15. Various methods for the estimation of the post mortem interval from Calliphoridae: A review

    Directory of Open Access Journals (Sweden)

    Ruchi Sharma

    2015-03-01

    Forensic entomology is recognized in many countries as an important tool for legal investigations. Unfortunately, it has not received much attention in India as an important investigative tool. The maggots of the flies crawling on dead bodies are widely considered to be just another disgusting element of decay and are not collected at the time of autopsy. They can aid in death investigations (time since death, manner of death, etc.). This paper reviews the various methods of post mortem interval estimation using Calliphoridae to make investigators, law personnel and researchers aware of the importance of entomology in criminal investigations. The various problems confronted by forensic entomologists in estimating the time since death have also been discussed, and there is a need for further research in the field as well as in the laboratory. Correct estimation of the post mortem interval is one of the most important aspects of legal medicine.

  16. Viability and infectivity of Ichthyophonus sp. in post-mortem Pacific herring, Clupea pallasii.

    Science.gov (United States)

    Kocan, Richard; Hart, Lucas; Lewandowski, Naomi; Hershberger, Paul

    2014-12-01

    Ichthyophonus-infected Pacific herring, Clupea pallasii, were allowed to decompose in ambient seawater then serially sampled for 29 days to evaluate parasite viability and infectivity for Pacific staghorn sculpin, Leptocottus armatus. Ichthyophonus sp. was viable in decomposing herring tissues for at least 29 days post-mortem and could be transmitted via ingestion to sculpin for up to 5 days. The parasite underwent morphologic changes during the first 48 hr following death of the host that were similar to those previously reported, but as host tissue decomposition progressed, several previously un-described forms of the parasite were observed. The significance of long-term survival and continued morphologic transformation in the post-mortem host is unknown, but it could represent a saprozoic phase of the parasite life cycle that has survival value for Ichthyophonus sp.

  17. Diagnosis of porcine enzootic pneumonia by post mortem sanitary inspection: comparison with other diagnostic methods

    OpenAIRE

    Kênia de Fátima Carrijo; Elmiro Rosendo do Nascimento; Virginia Léo de Almeida Pereira; Nelson Morés; Catia Silene Klein; Leonardo Muliterno Domingues; Rogerio Tortelly

    2014-01-01

    ABSTRACT. Carrijo K.F., Nascimento E.R., Pereira V.L.A., Morés N., Klein, C.S., Domingues L.M. & Tortelly R. [Diagnosis of porcine enzootic pneumonia by post mortem sanitary inspection: comparison with other diagnostic methods.] Diagnóstico da pneumonia enzoótica suína pela inspeção sanitária post mortem: comparação com outros métodos de diagnóstico. Revista Brasileira de Veterinária Brasileira 36(2):188-194, 2014. Faculdade de Medicina Veterinária, Universidade Federal de Uberlândia, Av. Par...

  18. [Research Progress of Carrion-breeding Phorid Flies for Post-mortem Interval Estimation in Forensic Medicine].

    Science.gov (United States)

    Li, L; Feng, D X; Wu, J

    2016-10-01

    Accurately estimating the post-mortem interval is a difficult problem in forensic medicine. The entomological approach has been regarded as an effective way to estimate the post-mortem interval. The developmental biology of carrion-breeding flies plays an important role in post-mortem interval estimation. Phorid flies are tiny and occur as the main or even the only insect evidence in relatively enclosed environments. This paper reviews the research progress of carrion-breeding phorid flies for estimating the post-mortem interval in forensic medicine, which includes their roles, species identification and age determination of immatures. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  19. Post-mortem MRI as an alternative to non-forensic autopsy in foetuses and children: from research into clinical practice

    Science.gov (United States)

    Addison, S; Arthurs, O J

    2014-01-01

    Although post-mortem MRI (PMMR) was proposed as an alternative to conventional autopsy more than a decade ago, the lack of systematic validation has limited its clinical uptake. Minimally invasive autopsy (MIA) using PMMR together with ancillary investigations has now been shown to be as accurate as conventional autopsy in foetuses, newborns and infants and is particularly useful for cerebral, cardiac and genitourinary imaging. Unlike conventional autopsy, PMMR provides a permanent three-dimensional auditable record, with accurate estimation of internal organ volumes. MIA is becoming highly acceptable to parents and professionals, and there is widespread political support and public interest in its clinical implementation in the UK. In the short to medium term, it is desirable that a supraregional network of specialist centres should be established to provide this service within the current National Health Service framework. PMID:24288400

  20. [Inheritance rights of the child born from post-mortem fertilization].

    Science.gov (United States)

    Iniesta Delgado, Juan José

    2008-01-01

    Spanish law allows for the possibility of post-mortem fertilization, recognizing the paternity of the deceased male. The most prominent legal effects of this fact have to do with the succession of the father. The way of fixing the child's portion in the forced succession and its protection, the question of determining his share in the inheritance, and the necessity of defending his rights until the birth is verified are some of the issues discussed in this article.

  1. Post-mortem changes in the physical meat quality characteristics of ...

    African Journals Online (AJOL)

    ... apparatus) of the muscle generally improved with time. The quadratic equation y = -0.0817x² + 0.4468x + 10.477 best described (R² = 0.32) this improvement in tenderness. The implication of this result is that fresh game meat producers can de-bone carcasses after 24 hours post mortem and leave the primal cuts to age ...

  2. Forensic aspects of incised wounds and bruises in pigs established post-mortem

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2017-01-01

    Recognizing post-mortem (PM) changes is of crucial importance in veterinary forensic pathology. In porcine wounds established PM, contradictory observations regarding infiltration of leukocytes have been described. In the present study, skin, subcutis and muscle tissue sampled from experimental pigs...... of sampling. Moreover, it was found that AM bruises free of leukocyte infiltration cannot be distinguished from PM bruises, an observation which is of crucial importance when timing bruises in forensic cases....

  3. An evaluation of the DRI-ETG EIA method for the determination of ethyl glucuronide concentrations in clinical and post-mortem urine.

    Science.gov (United States)

    Turfus, Sophie C; Vo, Tu; Niehaus, Nadia; Gerostamoulos, Dimitri; Beyer, Jochen

    2013-06-01

    A commercial enzyme immunoassay for the qualitative and semi-quantitative measurement of ethyl glucuronide (EtG) in urine was evaluated. Post-mortem (n=800) and clinical urine (n=200) samples were assayed using a Hitachi 902 analyzer. The determined concentrations were compared with those obtained using a previously published liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the quantification of EtG and ethyl sulfate. Using a cut-off of 0.5 µg/ml and an LC-MS/MS limit of reporting of 0.1 µg/ml, there was a sensitivity of 60.8% and a specificity of 100% for clinical samples. For post-mortem samples, sensitivity and specificity were 82.4% and 97.1%, respectively. When reducing the cut-off to 0.1 µg/ml, the sensitivity and specificity were 83.3% and 100% for clinical samples, whereas for post-mortem samples the sensitivity and specificity were 90.3% and 88.3%, respectively. The best trade-offs between sensitivity and specificity for LC-MS/MS limits of reporting of 0.5 and 0.1 µg/ml were achieved when using immunoassay cut-offs of 0.3 and 0.092 µg/ml, respectively. There was good correlation between quantitative results obtained by both methods, but analysis of samples by LC-MS/MS gave higher concentrations than by enzyme immunoassay (EIA), with a statistically significant proportional bias (P<0.0001, Deming regression) for both sample types. The immunoassay is reliable for the qualitative and semi-quantitative presumptive detection of ethyl glucuronide in urine. Copyright © 2012 John Wiley & Sons, Ltd.
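
    The sensitivity and specificity figures for each cut-off above are derived from a 2x2 table of immunoassay results against the LC-MS/MS reference. The helper below shows that calculation; the counts are hypothetical, not the study's raw data.

      # Sensitivity/specificity of a screening cut-off from a 2x2 confusion table.
      # The counts are hypothetical, not the study's raw data.
      def screening_performance(tp, fp, tn, fn):
          sensitivity = tp / (tp + fn)      # LC-MS/MS positives flagged by the EIA
          specificity = tn / (tn + fp)      # LC-MS/MS negatives correctly left un-flagged
          return sensitivity, specificity

      sens, spec = screening_performance(tp=152, fp=4, tn=590, fn=54)   # hypothetical counts
      print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")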

  4. The use of contrast-enhanced post Mortem CT in the detection of cardiovascular deaths.

    Directory of Open Access Journals (Sweden)

    Jonas Christoph Apitzsch

    Full Text Available OBJECTIVES: To evaluate the diagnostic value of contrast enhanced post mortem computed tomography (PMCT in comparison to non-enhanced post mortem CT in the detection of cardiovascular causes of death (COD. BACKGROUND: As autopsy rates decline, new methods to determine CODs are necessary. So contrast enhanced PMCT shall be evaluated in comparison to established non-enhanced PMCT in order to further improve the method. METHODS: In a prospective study, 20 corpses were examined using a 64-row multisclice CT (MSCT before and after intraarterial perfusion with a newly developed, barium-bearing contrast agent and ventilation of the lungs. The cause of death was determined in enhanced and unenhanced scans and a level of confidence (LOC was given by three experienced radiologists on a scale between 0 and 4. Results were compared to autopsy results as gold standard. Autopsy was performed blinded to PMCT-findings. RESULTS: The method allowed visualization of different types of cause of death. There was a significant improvement in LOC in enhanced scans compared to unenhanced scans as well as an improvement in the detection of COD. The cause of death could be determined in 19 out of 20 patients. CONCLUSIONS: PMCT is feasible and appears to be robust for diagnosing cardiovascular causes of death. When compared with unenhanced post-mortem CT intraarterial perfusion and pulmonary ventilation significantly improve visualization and diagnostic accuracy. These promising results warrant further studies.

  5. Assessment of coronary artery disease by post-mortem cardiac MR

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Bauer-Kreutz, Regula; Ampanozi, Garyfalia; Rosskopf, Andrea B.; Pilgrim, Thomas M.; Weber, Oliver M.; Thali, Michael J.; Hatch, Gary M.

    2012-01-01

    Objectives: Minimally invasive or virtual autopsies are being advocated as alternative to traditional autopsy, but have limited abilities to detect coronary artery disease. It was the objective of this study to assess if the occurrence of chemical shift artifacts (CSA) along the coronary arteries on non-contrast, post-mortem cardiac MR may be used to investigate coronary artery disease. Methods: We retrospectively compared autopsy and CT findings of 30 cases with significant (≥75%), insignificant (<75%), or absent coronary artery stenosis to post-mortem cardiac MR findings. The chi-square test was used to investigate if the occurrence of CSA depends on the presence or absence of stenosis. Sensitivity, specificity and predictive values were calculated for each finding. Results: CSA indicates the absence of (significant) stenosis (p < 0.001). The occurrence of paired dark bands in lieu of CSA on post-mortem cardiac MR suggests (significant) coronary arteries stenosis (p < 0.001). Both findings have a high specificity but low sensitivity. Conclusions: CSA is a marker of vessel patency. The presence of paired dark bands indicates stenosis. These criteria improve the ability of minimally invasive or virtual autopsy to detect coronary artery disease related deaths

  6. Effects of post mortem interval and gender in DNA base excision repair activities in rat brains

    Energy Technology Data Exchange (ETDEWEB)

    Soltys, Daniela Tathiana; Pereira, Carolina Parga Martins; Ishibe, Gabriela Naomi; Souza-Pinto, Nadja Cristhina de, E-mail: nadja@iq.usp.br

    2015-06-15

    Most human tissues used in research are of post mortem origin. This is the case for all brain samples, and due to the difficulty in obtaining a good number of samples, especially in the case of neurodegenerative diseases, male and female samples are often included in the same experimental group. However, the effects of post mortem interval (PMI) and gender differences on the endpoints being analyzed are not always fully understood, as is the case for DNA repair activities. To investigate these effects, in a controlled genetic background, base excision repair (BER) activities were measured in protein extracts obtained from Wistar rat brains of different genders and defined PMI of up to 24 hours, using a novel fluorescence-based in vitro incision assay. Uracil and AP-site incision activities in nuclear and mitochondrial extracts were similar in all groups included in this study. Our results show that gender and PMI of up to 24 hours have no influence on the activities of the BER proteins UDG and APE1 in rat brains. These findings demonstrate that these variables do not interfere with the BER activities examined in this study, and provide a secure window to work with UDG and APE1 proteins in samples of post mortem origin.

  7. Effects of post mortem interval and gender in DNA base excision repair activities in rat brains

    International Nuclear Information System (INIS)

    Soltys, Daniela Tathiana; Pereira, Carolina Parga Martins; Ishibe, Gabriela Naomi; Souza-Pinto, Nadja Cristhina de

    2015-01-01

    Most human tissues used in research are of post mortem origin. This is the case for all brain samples, and due to the difficulty in obtaining a good number of samples, especially in the case of neurodegenerative diseases, male and female samples are often included in the same experimental group. However, the effects of post mortem interval (PMI) and gender differences on the endpoints being analyzed are not always fully understood, as is the case for DNA repair activities. To investigate these effects, in a controlled genetic background, base excision repair (BER) activities were measured in protein extracts obtained from Wistar rat brains of different genders and defined PMI of up to 24 hours, using a novel fluorescence-based in vitro incision assay. Uracil and AP-site incision activities in nuclear and mitochondrial extracts were similar in all groups included in this study. Our results show that gender and PMI of up to 24 hours have no influence on the activities of the BER proteins UDG and APE1 in rat brains. These findings demonstrate that these variables do not interfere with the BER activities examined in this study, and provide a secure window to work with UDG and APE1 proteins in samples of post mortem origin.

  8. Radiological emergency response for community agencies with cognitive task analysis, risk analysis, and decision support framework.

    Science.gov (United States)

    Meyer, Travis S; Muething, Joseph Z; Lima, Gustavo Amoras Souza; Torres, Breno Raemy Rangel; del Rosario, Trystyn Keia; Gomes, José Orlando; Lambert, James H

    2012-01-01

    Radiological and nuclear emergency responders must be able to coordinate evacuation and relief efforts following the release of radioactive material into populated areas. In order to respond quickly and effectively to a nuclear emergency, high-level coordination is needed between a number of large, independent organizations, including police, military, hazmat, and transportation authorities. Given the complexity, scale, time pressure, and potential negative consequences inherent in radiological emergency responses, tracking and communicating information that will assist decision makers during a crisis is crucial. The emergency response team at the Angra dos Reis nuclear power facility, located outside of Rio de Janeiro, Brazil, presently conducts emergency response simulations once every two years to prepare organizational leaders for real-life emergency situations. However, current exercises are conducted without the aid of electronic or software tools, resulting in possible cognitive overload and delays in decision-making. This paper describes the development of a decision support system employing systems methodologies, including cognitive task analysis and human-machine interface design. The decision support system can aid the coordination team by automating cognitive functions and improving information sharing. A prototype of the design will be evaluated by plant officials in Brazil and incorporated into a future trial run of a response simulation.

  9. Analysis Framework of China’s Grain Production System: A Spatial Resilience Perspective

    Directory of Open Access Journals (Sweden)

    Dazhuan Ge

    2017-12-01

    China's grain production has transformed from absolute shortage to a current structural oversupply. High-intensity production has introduced further challenges for the eco-environment, smallholder livelihoods, and the man-land interrelationship. Driven by urban-rural transformation, research on food security patterns and grain production has expanded into a new field. To analyze the challenges and required countermeasures for China's grain production system (GPS), this study constructed a theoretical GPS framework based on spatial resilience. Firstly, a new GPS concept was proposed and a functional system was established for protecting regional food security, guaranteeing smallholder livelihoods, stabilizing urban-rural transformation, and sustaining the eco-environment, in terms of the economic, social, and ecological attributes of the GPS. Secondly, based on a cross-scale interaction analysis ranging from the smallholder scale to the global scale, the systemic crisis of the GPS was analyzed. Thirdly, a cross-scale analytic framework for the GPS was formed from the perspective of spatial resilience, integrating both internal and external disturbance factors of the GPS. Both the spatial heterogeneity and the connectivity of internal and external disturbance factors are important components of the system's spatial resilience. Finally, the hierarchy of the spatial resilience of the GPS became clear. The transformation of the labor force and the transition of land use form key thresholds of the GPS. In summary, based on protecting the basic functions of the GPS, the cross-scale effects of systemic disturbance factors and the relevant countermeasures for spatial resilience are effectively influenced by the coordination of the interests of multiple stakeholders; spatial resilience is an effective analytical tool for GPS regulation, providing a reference for revealing the inherent mechanism and functional evolution of the GPS in the process of urban-rural transformation.

  10. A framework for performance measurement in university using extended network data envelopment analysis (DEA) structures

    Science.gov (United States)

    Kashim, Rosmaini; Kasim, Maznah Mat; Rahman, Rosshairy Abd

    2015-12-01

    Measuring university performance is essential for the efficient allocation and utilization of educational resources. In most previous studies, performance measurement in universities emphasized operational efficiency and resource utilization without investigating the university's ability to fulfill the needs of its stakeholders and society. Therefore, assessment of the performance of a university should be separated into two stages, namely efficiency and effectiveness. In conventional DEA analysis, a decision making unit (DMU), or in this context a university, is generally treated as a black box, which ignores the operation and interdependence of the internal processes. When this happens, the results obtained can be misleading. Thus, this paper suggests an alternative framework for measuring the overall performance of a university by incorporating both efficiency and effectiveness, and applies a network DEA model. Network DEA models are recommended because this approach takes into account the interrelationship between the efficiency and effectiveness processes in the system. This framework also focuses on the university structure, which is expanded from a hierarchical form into a series of horizontal relationships between subordinate units, by assuming that both the intermediate unit and its subordinate units can generate output(s). Three conceptual models are proposed to evaluate the performance of a university. An efficiency model is developed at the first stage using a hierarchical network model. It is followed by an effectiveness model, which takes the output(s) of the first-stage hierarchical structure as input(s) at the second stage. As a result, a new overall performance model is proposed by combining the efficiency and effectiveness models. Thus, once this overall model is realized and utilized, the university's top management can determine the overall performance of each unit more accurately and systematically. Besides that, the result from the network
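
    For readers unfamiliar with DEA, the sketch below solves the basic single-stage, input-oriented CCR model in multiplier form as a linear program. It is not the network or hierarchical model proposed in the paper, and the input/output data are invented placeholders.

```python
# Minimal sketch of a single-stage, input-oriented CCR DEA model in multiplier
# form, solved as a linear program. Data are illustrative placeholders.
import numpy as np
from scipy.optimize import linprog

X = np.array([[3.0, 5.0], [4.0, 3.0], [6.0, 4.0]])  # inputs  (n DMUs x m inputs)
Y = np.array([[2.0], [3.0], [4.0]])                  # outputs (n DMUs x s outputs)

def ccr_efficiency(o, X, Y):
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s), then input weights v (length m).
    c = np.concatenate([-Y[o], np.zeros(m)])            # maximize u . y_o
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]                                        # normalization: v . x_o = 1
    A_ub = np.hstack([Y, -X])                           # u . y_j - v . x_j <= 0 for all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                     # efficiency score in (0, 1]

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```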

  11. Efficient storage, retrieval and analysis of poker hands: An adaptive data framework

    Directory of Open Access Journals (Sweden)

    Gorawski Marcin

    2017-12-01

    In online gambling, poker hands are one of the most popular and fundamental units of the game state and can be considered objects comprising all the events that pertain to a single hand played. In a situation where tens of millions of poker hands are produced daily and need to be stored and analysed quickly, the use of relational databases no longer provides high scalability and performance stability. The purpose of this paper is to present an efficient way of storing and retrieving poker hands in a big data environment. We propose a new, read-optimised storage model that offers significant data access improvements over traditional database systems as well as the existing Hadoop file formats such as ORC, RCFile or SequenceFile. Through index-oriented partition elimination, our file format reduces the number of file splits that need to be accessed, and improves query response time by up to three orders of magnitude in comparison with other approaches. In addition, our file format supports a range of new indexing structures to facilitate fast row retrieval at the split level. Both index types operate independently of the Hive execution context and allow other big data computational frameworks such as MapReduce or Spark to benefit from the optimised data access path to the hand information. Moreover, we present a detailed analysis of our storage model and its supporting index structures, and how they are organised in the overall data framework. We also describe in detail how predicate-based expression trees are used to build effective file-level execution plans. Our experimental tests, conducted on a production cluster holding nearly 40 billion hands spanning over 4000 partitions, show that multi-way partition pruning outperforms other existing file formats, resulting in faster query execution times and better cluster utilisation.
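
    The sketch below illustrates the general idea of index-oriented partition elimination driven by a predicate expression tree; the classes, column names, and per-partition statistics are hypothetical and do not reproduce the paper's file format.

```python
# Minimal sketch: a predicate expression tree evaluated against per-partition
# min/max statistics, so partitions that cannot match are skipped before reading.
from dataclasses import dataclass

@dataclass
class Partition:
    path: str
    stats: dict        # column -> (min, max), e.g. {"hand_id": (0, 999)}

@dataclass
class Pred:            # leaf predicate: column <op> value
    column: str
    op: str            # one of "<", "<=", ">", ">=", "=="
    value: int
    def may_match(self, part):
        lo, hi = part.stats[self.column]
        return {"<":  lo <  self.value, "<=": lo <= self.value,
                ">":  hi >  self.value, ">=": hi >= self.value,
                "==": lo <= self.value <= hi}[self.op]

@dataclass
class And:
    left: object
    right: object
    def may_match(self, part):
        return self.left.may_match(part) and self.right.may_match(part)

@dataclass
class Or:
    left: object
    right: object
    def may_match(self, part):
        return self.left.may_match(part) or self.right.may_match(part)

def prune(partitions, predicate):
    """Keep only the partitions that might contain matching rows."""
    return [p for p in partitions if predicate.may_match(p)]

parts = [Partition("split-00", {"hand_id": (0, 999)}),
         Partition("split-01", {"hand_id": (1000, 1999)}),
         Partition("split-02", {"hand_id": (2000, 2999)})]
query = And(Pred("hand_id", ">=", 1500), Pred("hand_id", "<", 2100))
print([p.path for p in prune(parts, query)])   # only split-01 and split-02 survive
```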

  12. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, an intensive care unit, a surgical ward, and an acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of the quality issues specific to the three settings. Second, objective trees are formed, suggesting solutions to the quality issues. Third, a project plan template using the logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved effective in analysing quality issues and suggesting improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit diverse, settings in one hospital. For validation purposes, it would be ideal to repeat the analysis in other settings within the same hospital, as well as in several hospitals. The study also adopts a bottom-up approach, which should be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators, and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  13. A framework for smartphone-enabled, patient-generated health data analysis

    Directory of Open Access Journals (Sweden)

    Shreya S. Gollamudi

    2016-08-01

    Background: Digital medicine and smartphone-enabled health technologies provide a novel source of human health and human biology data. However, in part due to the intricacies of such data, few methods have been established to analyze and interpret data in this domain. We previously conducted a six-month interventional trial examining the efficacy of a comprehensive smartphone-based health monitoring program for individuals with chronic disease. This included 38 individuals with hypertension who recorded 6,290 blood pressure readings over the trial. Methods: In the present study, we provide a hypothesis testing framework for unstructured time series data, typical of patient-generated mobile device data. We used a mixed model approach for unequally spaced repeated measures using autoregressive and generalized autoregressive models, and applied this to the blood pressure data generated in this trial. Results: We were able to detect, roughly, a 2 mmHg decrease in both systolic and diastolic blood pressure over the course of the trial despite considerable intra- and inter-individual variation. Furthermore, by supplementing this finding with a sequential analysis approach, we observed this result over three months prior to the official study end, highlighting the effectiveness of leveraging the digital nature of this data source to form timely conclusions. Conclusions: Health data generated through the use of smartphones and other mobile devices allow individuals the opportunity to make informed health decisions, and provide researchers the opportunity to address innovative health and biology questions. The hypothesis testing framework we present can be applied in future studies utilizing digital medicine technology or implemented in the technology itself to support the quantified self.
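
    A minimal sketch of the kind of model described above, assuming statsmodels and invented column names (subject, day, sbp): a GEE with an autoregressive working correlation for unequally spaced repeated blood pressure measurements. It is not the study's actual code or data.

```python
# Minimal sketch: GEE with an autoregressive working correlation for
# unequally spaced repeated measures. Column names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
rows = []
for subject in range(20):
    days = np.sort(rng.choice(np.arange(180), size=15, replace=False))  # unequal spacing
    baseline = 135 + rng.normal(0, 8)
    for d in days:
        rows.append({"subject": subject, "day": int(d),
                     "sbp": baseline - 0.01 * d + rng.normal(0, 5)})     # ~2 mmHg drop over 6 months
df = pd.DataFrame(rows)

model = sm.GEE.from_formula(
    "sbp ~ day", groups="subject", data=df,
    time=df["day"],                                  # used by the AR structure
    cov_struct=sm.cov_struct.Autoregressive(),
    family=sm.families.Gaussian(),
)
result = model.fit()
print(result.summary())
print("estimated change over 180 days:", 180 * result.params["day"], "mmHg")
```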

  14. Contextualized analysis of a needs assessment using the Theoretical Domains Framework: a case example in endocrinology.

    Science.gov (United States)

    Lazure, Patrice; Bartel, Robert C; Biller, Beverly M K; Molitch, Mark E; Rosenthal, Stephen M; Ross, Judith L; Bernsten, Brock D; Hayes, Sean M

    2014-07-24

    The Theoretical Domains Framework (TDF) is a set of 14 domains of behavior change that provides a framework for the critical issues and factors influencing optimal knowledge translation. Considering that a previous study identified optimal knowledge translation techniques for each TDF domain, it was hypothesized that the TDF could be used to contextualize and interpret findings from a behavioral and educational needs assessment. To illustrate this hypothesis, findings and recommendations drawn from a 2012 national behavioral and educational needs assessment conducted with healthcare providers who treat and manage Growth and Growth Hormone Disorders will be discussed using the TDF. This needs assessment utilized a mixed-methods research approach that included a combination of: [a] data sources (Endocrinologists (n:120), Pediatric Endocrinologists (n:53), Pediatricians (n:52)), [b] data collection methods (focus groups, interviews, online survey), and [c] analysis methodologies (qualitative, analyzed through thematic analysis; quantitative, analyzed using frequencies, cross-tabulations, and gap analysis). Triangulation was used to generate trustworthy findings on the clinical practice gaps of endocrinologists, pediatric endocrinologists, and general pediatricians in their provision of care to adult patients with adult growth hormone deficiency or acromegaly, or children/teenagers with pediatric growth disorders. The identified gaps were then broken down into key underlying determinants, categorized according to the TDF domains, and linked to optimal behavioral change techniques. The needs assessment identified 13 gaps, each with one or more underlying determinant(s). Overall, these determinants were mapped to 9 of the 14 TDF domains. The Beliefs about Consequences domain was identified as a contributing determinant to 7 of the 13 challenges. Five of the gaps could be related to the Skills domain, while three were linked to the Knowledge domain. The TDF categorization of

  15. Contextualized analysis of a needs assessment using the Theoretical Domains Framework: a case example in endocrinology

    Science.gov (United States)

    2014-01-01

    Background The Theoretical Domains Framework (TDF) is a set of 14 domains of behavior change that provides a framework for the critical issues and factors influencing optimal knowledge translation. Considering that a previous study identified optimal knowledge translation techniques for each TDF domain, it was hypothesized that the TDF could be used to contextualize and interpret findings from a behavioral and educational needs assessment. To illustrate this hypothesis, findings and recommendations drawn from a 2012 national behavioral and educational needs assessment conducted with healthcare providers who treat and manage Growth and Growth Hormone Disorders will be discussed using the TDF. Methods This needs assessment utilized a mixed-methods research approach that included a combination of: [a] data sources (Endocrinologists (n:120), Pediatric Endocrinologists (n:53), Pediatricians (n:52)), [b] data collection methods (focus groups, interviews, online survey), and [c] analysis methodologies (qualitative, analyzed through thematic analysis; quantitative, analyzed using frequencies, cross-tabulations, and gap analysis). Triangulation was used to generate trustworthy findings on the clinical practice gaps of endocrinologists, pediatric endocrinologists, and general pediatricians in their provision of care to adult patients with adult growth hormone deficiency or acromegaly, or children/teenagers with pediatric growth disorders. The identified gaps were then broken down into key underlying determinants, categorized according to the TDF domains, and linked to optimal behavioral change techniques. Results The needs assessment identified 13 gaps, each with one or more underlying determinant(s). Overall, these determinants were mapped to 9 of the 14 TDF domains. The Beliefs about Consequences domain was identified as a contributing determinant to 7 of the 13 challenges. Five of the gaps could be related to the Skills domain, while three were linked to the Knowledge domain

  16. A framework for noise-power spectrum analysis of multidimensional images

    International Nuclear Information System (INIS)

    Siewerdsen, J.H.; Cunningham, I.A.; Jaffray, D.A.

    2002-01-01

    A methodological framework for experimental analysis of the noise-power spectrum (NPS) of multidimensional images is presented that employs well-known properties of the n-dimensional (nD) Fourier transform. The approach is generalized to n dimensions, reducing to familiar cases for n=1 (e.g., time series) and n=2 (e.g., projection radiography) and demonstrated experimentally for two cases in which n=3 (viz., using an active matrix flat-panel imager for x-ray fluoroscopy and cone-beam CT to form three-dimensional (3D) images in spatiotemporal and volumetric domains, respectively). The relationship between fully nD NPS analysis and various techniques for analyzing a 'central slice' of the NPS is formulated in a manner that is directly applicable to measured nD data, highlights the effects of correlation, and renders issues of NPS normalization transparent. The spatiotemporal NPS of fluoroscopic images is analyzed under varying conditions of temporal correlation (image lag) to investigate the degree to which the NPS is reduced by such correlation. For first-frame image lag of ∼5-8 %, the NPS is reduced by ∼20% compared to the lag-free case. A simple model is presented that results in an approximate rule of thumb for computing the effect of image lag on NPS under conditions of spatiotemporal separability. The volumetric NPS of cone-beam CT images is analyzed under varying conditions of spatial correlation, controlled by adjustment of the reconstruction filter. The volumetric NPS is found to be highly asymmetric, exhibiting a ramp characteristic in transverse planes (typical of filtered back-projection) and a band-limited characteristic in the longitudinal direction (resulting from low-pass characteristics of the imager). Such asymmetry could have implications regarding the detectability of structures visualized in transverse versus sagittal or coronal planes. In all cases, appreciation of the full dimensionality of the image data is essential to obtaining
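
    For orientation, the sketch below shows the standard Fourier-based NPS estimate in two dimensions; the same pattern extends to n dimensions via numpy.fft.fftn. The normalization, pixel spacings, and noise parameters are illustrative and are not taken from the paper.

```python
# Minimal sketch: Fourier-based estimate of the 2D noise-power spectrum from a
# stack of noise-only (zero-signal or detrended) ROIs. Parameters are illustrative.
import numpy as np

def nps_2d(realizations, pixel_spacing=(1.0, 1.0)):
    """Estimate the 2D NPS from an array of shape (K, Ny, Nx).

    pixel_spacing: (dy, dx) in mm; sets the normalization (units: value^2 mm^2).
    """
    K, Ny, Nx = realizations.shape
    dy, dx = pixel_spacing
    # Remove the per-realization mean so only stochastic fluctuations remain.
    zero_mean = realizations - realizations.mean(axis=(1, 2), keepdims=True)
    spectra = np.abs(np.fft.fft2(zero_mean, axes=(1, 2))) ** 2
    return (dx * dy) / (Nx * Ny) * spectra.mean(axis=0)

# Example: white noise should give a flat NPS with mean ~ variance * dx * dy.
rois = np.random.default_rng(1).normal(0.0, 2.0, size=(50, 128, 128))
nps = nps_2d(rois, pixel_spacing=(0.2, 0.2))
print(nps.mean(), 2.0**2 * 0.2 * 0.2)   # both approximately 0.16
```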

  17. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    CERN Document Server

    Antcheva, I; Bellenot, B; Biskup,1, M; Brun, R; Buncic, N; Canal, Ph; Casadei, D; Couet, O; Fine, V; Franco,1, L; Ganis, G; Gheata, A; Gonzalez Maline, D; Goto, M; Iwaszkiewicz, J; Kreshuk, A; Marcos Segura, D; Maunder, R; Moneta, L; Naumann, A; Offermann, E; Onuchin, V; Panacek, S; Rademakers, F; Russo, P; Tadel, M

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariat...
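
    A minimal PyROOT sketch of the typical write-and-read TTree workflow, assuming a ROOT installation with Python bindings; the file, tree, and branch names are illustrative.

```python
# Minimal sketch: write a TTree to a compressed ROOT file, then read it back
# and fill a histogram. Names and values are illustrative placeholders.
from array import array
import ROOT

# Write: one double branch, filled event by event.
out = ROOT.TFile("demo.root", "RECREATE")
tree = ROOT.TTree("events", "toy data set")
energy = array("d", [0.0])
tree.Branch("energy", energy, "energy/D")
rand = ROOT.TRandom3(42)
for _ in range(10000):
    energy[0] = rand.Gaus(100.0, 15.0)
    tree.Fill()
tree.Write()
out.Close()

# Read back and histogram, the typical analysis pattern.
infile = ROOT.TFile("demo.root")
events = infile.Get("events")
hist = ROOT.TH1D("h_energy", "energy;E;entries", 100, 0.0, 200.0)
for event in events:
    hist.Fill(event.energy)
print("entries:", hist.GetEntries(), "mean:", hist.GetMean())
infile.Close()
```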

  18. Consistency Analysis of Ultrasound Echoes within a Dual Symmetric Path Inspection Framework

    Directory of Open Access Journals (Sweden)

    VASILE, C.

    2015-05-01

    Non-destructive ultrasound inspection of metallic structures is a perpetual high-interest area of research because of its we