WorldWideScience

Sample records for mortem analysis framework

  1. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. However, they pose certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, a mission-critical system for the Large Hadron Collider (LHC). We used a plugin-based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin does not perturb the work of the overall application; plugin execution control that allows plugin misbehaviour to be detected and acted upon; a robust communication mechanism between plugins; diagnostics facilitation in case of plugin failure; testing of the plugins be...
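
    The paper itself contains no code. As a rough, hypothetical illustration of the kind of plugin isolation it discusses (all class and function names below are invented, not the CERN PMA API), the following Python sketch runs analysis plugins in separate worker processes with a timeout, so that a badly written plugin can neither block nor crash the hosting engine and its failure is captured as a diagnostic result:

        # Minimal sketch, not the actual PMA framework: an engine that executes
        # third-party analysis plugins in worker processes so a faulty plugin
        # cannot block or crash the host. All names here are hypothetical.
        from concurrent.futures import ProcessPoolExecutor, TimeoutError


        class AnalysisPlugin:
            """Base class that third-party analysis plugins would subclass."""

            name = "base"

            def analyse(self, event_data: dict) -> dict:
                raise NotImplementedError


        class DipoleCurrentCheck(AnalysisPlugin):
            """Toy plugin: report whether all current samples stay below a limit."""

            name = "dipole-current-check"

            def analyse(self, event_data: dict) -> dict:
                currents = event_data.get("dipole_currents", [])
                return {"ok": all(i < 11_850 for i in currents)}


        def _run(plugin, event_data):
            return plugin.analyse(event_data)


        def run_all(plugins, event_data, timeout_s=5.0):
            """Run every plugin in its own worker process and isolate failures."""
            results = {}
            with ProcessPoolExecutor() as pool:
                futures = {p.name: pool.submit(_run, p, event_data) for p in plugins}
                for name, future in futures.items():
                    try:
                        results[name] = future.result(timeout=timeout_s)
                    except TimeoutError:
                        # Misbehaving plugin; a real engine would also kill the worker.
                        results[name] = {"error": "timed out"}
                    except Exception as exc:
                        results[name] = {"error": repr(exc)}  # keep diagnostics
            return results


        if __name__ == "__main__":
            event = {"dipole_currents": [11_000, 11_200, 11_900]}
            print(run_all([DipoleCurrentCheck()], event))

    In the real system the execution control and inter-plugin communication are of course richer than a per-process timeout, but the isolation idea is the same.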

  2. [Post-mortem microbiology analysis].

    Science.gov (United States)

    Fernández-Rodríguez, Amparo; Alberola, Juan; Cohen, Marta Cecilia

    2013-12-01

    Post-mortem microbiology is useful in both clinical and forensic autopsies, and allows a suspected infection to be confirmed. Indeed, it is routinely applied to donor studies in the clinical setting, as well as in sudden and unexpected death in the forensic field. Implementation of specific sampling techniques in autopsy can minimize the possibility of contamination, making interpretation of the results easier. Specific interpretation criteria for post-mortem cultures, the use of molecular diagnosis, and its fusion with molecular biology and histopathology have led to post-mortem microbiology playing a major role in autopsy. Multidisciplinary work involving microbiologists, pathologists, and forensic physicians will help to improve the achievements of post-mortem microbiology, prevent infectious diseases, and contribute to a healthier population. Crown Copyright © 2012. Published by Elsevier Espana. All rights reserved.

  3. Deuterium inventory in Tore Supra: reconciling particle balance and post-mortem analysis

    International Nuclear Information System (INIS)

    Tsitrone, E.; Brosset, C.; Pegourie, B.; Gauthier, E.; Bouvet, J.; Bucalossi, J.; Carpentier, S.; Corre, Y.; Delchambre, E.; Dittmar, T.; Douai, D.; Ekedahl, A.; Ghendrih, Ph.; Grisolia, C.; Grosman, A.; Gunn, J.; Hong, S.H.; Desgranges, L.; Escarguel, A.; Jacob, W.

    2009-01-01

    Fuel retention, a crucial issue for next-step devices, is assessed in present-day tokamaks using two methods: particle balance performed during shots and post-mortem analysis carried out during shutdowns between experimental campaigns. Post-mortem analysis generally gives lower estimates of fuel retention than integrated particle balance. In order to understand the discrepancy between these two methods, a dedicated experimental campaign has been performed in Tore Supra to load the vessel walls with deuterium (D) and monitor the trapped D inventory through particle balance. The campaign was followed by an extensive post-mortem analysis phase of the Tore Supra limiter. This paper presents the status of the analysis phase, including the assessment of the D content in the castellated tile structure of the limiter. Indeed, using combined surface analysis techniques, it was possible to derive the relative contributions of different zones of interest on the limiter (erosion, thick deposits, thin deposits), showing that the post-mortem inventory is mainly due to codeposition (90% of the total), in particular due to gap deposits. However, deuterium was also found deep in the material in erosion zones (10% of the total). At the present stage of the analysis, 50% of the inventory deduced from particle balance has been found through post-mortem analysis, significant progress with respect to previous studies (factor 8-10 discrepancy). This shows that post-mortem analysis can be consistent with particle balance provided specific procedures are implemented (dedicated campaign followed by extensive post-mortem analysis). Both techniques are needed for a reliable assessment of fuel retention in tokamaks, giving complementary information on how much and where fuel is retained in the vessel walls.

  4. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

    The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques. A 6-month retrospective study comparing femoral total post-mortem tryptase levels between (1) aspirating femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference (paired t test) in femoral total post-mortem tryptase levels between the two sampling methods. The clinical significance of this finding and what factors may contribute to it are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  5. Joint analysis of three-dimensional anatomical and functional data considering the cerebral post mortem imaging in rodents

    International Nuclear Information System (INIS)

    Dubois, Albertine

    2008-01-01

    The recent development of dedicated small animal anatomical (MRI) and functional (micro-PET) scanners has opened up the possibility of performing repeated functional in vivo studies in the same animal, such as the longitudinal follow-up of cerebral glucose metabolism. However, these systems still suffer from technical limitations, including limited sensitivity and reduced spatial resolution. Hence, autoradiography and histological studies remain the reference and widely used techniques for biological studies in small animals. The major disadvantage of these post-mortem imaging techniques is that they require brain tissue sectioning, entailing the production of large numbers (up to several hundreds) of serial sections and the inherent loss of three-dimensional (3D) spatial consistency. The first step towards improving the analysis of this post-mortem information was the development of reliable, automated procedures for the 3D reconstruction of whole-brain sections. We first developed an optimized acquisition procedure for large amounts of post-mortem data (2D sections and block-face photographs). Then, we proposed different strategies for 3D reconstruction of the corresponding volumes. We also addressed the problem of co-registering histological sections to autoradiographic sections and to block-face photographs (the photographic volume is intrinsically spatially consistent). These developments were essential for the 3D reconstruction but also enabled the evaluation of different methods of functional data analysis, from the most straightforward (manual delineation of regions of interest) to the most automated (Statistical Parametric Mapping-like approaches for group analysis). Two biological applications were carried out: visual stimulation in rats and cerebral metabolism in a transgenic mouse model of Alzheimer's disease. One perspective of this work is to match reconstructed post-mortem data with in vivo images of the same animal. (author) [fr]
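
    The abstract above describes rebuilding a spatially consistent 3D volume from hundreds of serial 2D sections. As a hedged illustration of one ingredient of such a pipeline (not the author's actual method; the function names are mine), the sketch below estimates the translation between consecutive sections with FFT-based phase correlation and stacks the realigned sections into a volume:

        # Hedged sketch of serial-section restacking: translation-only alignment
        # of consecutive 2D images by phase correlation, then stacking into 3D.
        import numpy as np


        def phase_correlation_shift(ref, mov):
            """Estimate the (dy, dx) shift that, applied to mov via np.roll,
            aligns it onto ref (integer-pixel precision only)."""
            f_ref = np.fft.fft2(ref)
            f_mov = np.fft.fft2(mov)
            cross = f_ref * np.conj(f_mov)
            cross /= np.abs(cross) + 1e-12          # normalised cross-power spectrum
            corr = np.fft.ifft2(cross).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            if dy > ref.shape[0] // 2:              # map wrap-around peaks to negative shifts
                dy -= ref.shape[0]
            if dx > ref.shape[1] // 2:
                dx -= ref.shape[1]
            return int(dy), int(dx)


        def stack_sections(sections):
            """Align each section to the previous (already aligned) one and stack."""
            volume = [np.asarray(sections[0], dtype=float)]
            for sec in sections[1:]:
                sec = np.asarray(sec, dtype=float)
                dy, dx = phase_correlation_shift(volume[-1], sec)
                volume.append(np.roll(sec, (dy, dx), axis=(0, 1)))
            return np.stack(volume)


        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            base = rng.random((64, 64))
            shifted = np.roll(base, (3, -2), axis=(0, 1))   # simulate a misaligned section
            print(phase_correlation_shift(base, shifted))   # -> (-3, 2), undoing the roll

    A real pipeline would also handle rotation, tears and staining differences, but the same alignment-then-stacking logic underlies the 3D reconstruction described here.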

  6. Post-mortem CT evaluation of atlanto-occipital dissociation.

    Science.gov (United States)

    Madadin, Mohammed; Samaranayake, Ravindra Priyalal; O'Donnell, Chris; Cordner, Stephen

    2017-02-01

    Atlanto-occipital dissociation injury is an important injury in forensic pathology practice. Clinically, radiological diagnosis of atlanto-occipital dissociation is based on direct measurement of occipito-vertebral skeletal relationships. Different measurements may be used to diagnose atlanto-occipital dissociation, including the basion-dens interval (BDI) and basion-axial interval (BAI). It is not known whether the normal ante-mortem measurements of BDI and BAI described in the literature are applicable to post-mortem CT images of the occipito-cervical junction (OCJ) or whether these measurements could be affected by early post-mortem changes. This study aims to compare post-mortem BDI and BAI measurements with ante-mortem values. Post-mortem CT scans of the cervical spines of 100 deceased adults were reviewed, and the BDI and BAI were measured. Different parameters were recorded in each case. The results from this study suggest that there are no effects of post-mortem changes on the measurement of BAI as relied upon clinically. There appear to be some effects of fully established rigor mortis on BDI measurement, shortening it. This may have consequences for the post-mortem diagnosis of atlanto-occipital dissociation. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  7. Quantification of ante-mortem hypoxic ischemic brain injury by post-mortem cerebral magnetic resonance imaging in neonatal encephalopathy.

    Science.gov (United States)

    Montaldo, Paolo; Chaban, Badr; Lally, Peter J; Sebire, Neil J; Taylor, Andrew M; Thayyil, Sudhin

    2015-11-01

    Post-mortem (PM) magnetic resonance imaging (MRI) is increasingly used as an alternative to conventional autopsy in babies dying from neonatal encephalopathy. However, the confounding effect of post-mortem changes on the detection of ante-mortem ischemic injury is unclear. We examined whether quantitative MR measurements can accurately distinguish ante-mortem ischemic brain injury from artifacts using post-mortem MRI. We compared PM brain MRI (1.5 T Siemens, Avanto) in 7 infants who died with neonatal encephalopathy (NE) of presumed hypoxic-ischemic origin with 7 newborn infants who had sudden unexplained neonatal death (SUND controls) without evidence of hypoxic-ischemic brain injury at autopsy. We measured apparent diffusion coefficients (ADCs), T1-weighted signal intensity ratios (SIRs) compared to vitreous humor and T2 relaxation times from 19 predefined brain areas typically involved in neonatal encephalopathy. There were no differences in mean ADC values, SIRs on T1-weighted images or T2 relaxation times in any of the 19 predefined brain areas between NE and SUND infants. All MRI images showed loss of cortical gray/white matter differentiation, loss of the normal high signal intensity (SI) in the posterior limb of the internal capsule on T1-weighted images, and high white matter SI on T2-weighted images. Normal post-mortem changes may be easily mistaken for ante-mortem ischemic injury, and current PM MRI quantitative assessment cannot reliably distinguish these. These findings may have important implications for appropriate interpretation of PM imaging findings, especially in medico-legal practice. Copyright © 2015 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.

  8. [Legal aspects of post-mortem radiology in the Netherlands].

    Science.gov (United States)

    Venderink, W; Dute, J C J

    2016-01-01

    In the Netherlands, the application of post-mortem radiology (virtual autopsy) is on the rise. In contrast to conventional autopsy, post-mortem radiology leaves the body intact. There is uncertainty concerning the legal admissibility of post-mortem radiology, since the Dutch Corpse Disposal Act does not contain any specific regulations for this technique. Autopsy and post-mortem radiology differ significantly from a technical perspective, but these differences do not have far-reaching legal consequences. Even though the body remains intact during post-mortem radiology, the bodily integrity of a deceased person is breached if it is applied without prior consent. This permission can only be obtained after the relatives are fully informed about the proposed activity. In this respect, it is not relevant which technique is used, be it post-mortem radiology or autopsy. Therefore, the other legal conditions for post-mortem radiology are essentially identical to those for autopsy.

  9. Multimodal imaging and in vivo/post mortem co-registration in rodents and non human primates

    International Nuclear Information System (INIS)

    Delzescaux, T.

    2006-01-01

    Within the framework of neuro-degenerative disease studies, animal models remain essential for improving our understanding of underlying pathological mechanisms and for the discovery and development of potential novel therapeutic approaches. Pre-clinical research especially requires the use of non-human primate models because of the similarities between their brains and the human brain, whereas fundamental investigations in many areas of biology and medicine more widely involve the use of rodent models. The recent development of in vivo imaging systems dedicated to small animals (μ-CT, μ-MRI and μ-PET) has made possible the study of brain anatomical alterations as well as the longitudinal follow-up of metabolism and neurotransmission impairments, which can be involved in neuro-degenerative diseases. In particular, μ-PET is becoming increasingly relevant for assessing the efficiency of potential candidates in the field of drug discovery and development and disease diagnosis. However, to date only a few laboratories are equipped with these systems. Moreover, their limited spatial resolution and the lack of specific biological markers are still major limitations. As a consequence, the scientific community still needs comparative anatomical and/or functional analyses, in particular for studies concerning the rodent brain. Hence, post-mortem biological imaging remains the powerful reference technology predominantly used for small animal imaging and for the validation of in vivo imaging systems. Generally, anatomical and complementary functional information is provided by histological staining and autoradiography of corresponding brain sections, respectively. The large variety of histological dyes (cresyl violet for Nissl bodies, Congo red for amyloid plaques) and radioactive compounds ([14C]deoxyglucose for cerebral glucose metabolism, [14C]leucine for cerebral protein synthesis, [14C]iodoantipyrine for cerebral blood flow), as well as the microscopic range of

  10. Post-mortem MRI of the foetal spine and spinal cord

    International Nuclear Information System (INIS)

    Widjaja, E.; Whitby, E.H.; Cohen, M.; Paley, M.N.J.; Griffiths, P.D.

    2006-01-01

    Aims: To compare the findings of post-mortem magnetic resonance imaging (MRI) of the foetal spine with autopsy with a view to using post-mortem MRI as an alternative or adjunct to autopsy, particularly in foetal and neonatal cases. Materials and Methods: The brains and spines of 41 foetuses, with a gestational age range of 14-41 weeks, underwent post-mortem MRI before autopsy. Post-mortem MRI of the brain consisted of T2-weighted sequences in three orthogonal planes, and MRI of the spine consisted of a T2-weighted sequence in the sagittal and axial planes in all cases and coronal planes in selected cases. Results: Thirty of 41 (78%) foetal spines were found to be normal at autopsy and on post-mortem MRI. Eleven of 41 (22%) foetal spines were abnormal: eight foetuses had myelomeningocoeles and Chiari 2 deformities, one foetus had limited dorsal myeloschisis, one foetus had caudal regression syndrome, and one had diastematomyelia. The post-mortem MRI findings concurred with the autopsy findings in 10/11 of the abnormal cases, the disagreement being the case of diastematomyelia that was shown on post-mortem MRI but was not diagnosed at autopsy. Conclusions: In this series, post-mortem MRI findings agreed with the autopsy findings in 40/41 (98%) cases and in one case the post-mortem MRI demonstrated an abnormality not demonstrated at autopsy.

  11. Isolation of primary microglia from the human post-mortem brain: effects of ante- and post-mortem variables.

    Science.gov (United States)

    Mizee, Mark R; Miedema, Suzanne S M; van der Poel, Marlijn; Adelia; Schuurman, Karianne G; van Strien, Miriam E; Melief, Jeroen; Smolders, Joost; Hendrickx, Debbie A; Heutinck, Kirstin M; Hamann, Jörg; Huitinga, Inge

    2017-02-17

    Microglia are key players in the central nervous system in health and disease. Much pioneering research on microglia function has been carried out in vivo with the use of genetic animal models. However, to fully understand the role of microglia in neurological and psychiatric disorders, it is crucial to study primary human microglia from brain donors. We have developed a rapid procedure for the isolation of pure human microglia from autopsy tissue using density gradient centrifugation followed by CD11b-specific cell selection. The protocol can be completed in 4 h, with an average yield of 450,000 and 145,000 viable cells per gram of white and grey matter tissue respectively. This method allows for the immediate phenotyping of microglia in relation to brain donor clinical variables, and shows the microglia population to be distinguishable from autologous choroid plexus macrophages. This protocol has been applied to samples from over 100 brain donors from the Netherlands Brain Bank, providing a robust dataset to analyze the effects of age, post-mortem delay, brain acidity, and neurological diagnosis on microglia yield and phenotype. Our data show that cerebrospinal fluid pH is positively correlated to microglial cell yield, but donor age and post-mortem delay do not negatively affect viable microglia yield. Analysis of CD45 and CD11b expression showed that changes in microglia phenotype can be attributed to a neurological diagnosis, and are not influenced by variation in ante- and post-mortem parameters. Cryogenic storage of primary microglia was shown to be possible, albeit with variable levels of recovery and effects on phenotype and RNA quality. Microglial gene expression substantially changed due to culture, including the loss of the microglia-specific markers, showing the importance of immediate microglia phenotyping. We conclude that primary microglia can be isolated effectively and rapidly from human post-mortem brain tissue, allowing for the study of the

  12. Multimodal imaging and in vivo/post mortem co-registration in rodents and non human primates

    Energy Technology Data Exchange (ETDEWEB)

    Delzescaux, T. [Service Hospitalier Frederic Joliot, Isotopic Imaging, 91 - Orsay (France)

    2006-07-01

    Within the framework of neuro-degenerative disease studies, animal models remain essential for improving our understanding of underlying pathological mechanisms and for the discovery and development of potential novel therapeutic approaches. Pre-clinical research especially requires the use of non-human primate models because of the similarities between their brains and the human brain, whereas fundamental investigations in many areas of biology and medicine more widely involve the use of rodent models. The recent development of in vivo imaging systems dedicated to small animals (μ-CT, μ-MRI and μ-PET) has made possible the study of brain anatomical alterations as well as the longitudinal follow-up of metabolism and neurotransmission impairments, which can be involved in neuro-degenerative diseases. In particular, μ-PET is becoming increasingly relevant for assessing the efficiency of potential candidates in the field of drug discovery and development and disease diagnosis. However, to date only a few laboratories are equipped with these systems. Moreover, their limited spatial resolution and the lack of specific biological markers are still major limitations. As a consequence, the scientific community still needs comparative anatomical and/or functional analyses, in particular for studies concerning the rodent brain. Hence, post-mortem biological imaging remains the powerful reference technology predominantly used for small animal imaging and for the validation of in vivo imaging systems. Generally, anatomical and complementary functional information is provided by histological staining and autoradiography of corresponding brain sections, respectively. The large variety of histological dyes (cresyl violet for Nissl bodies, Congo red for amyloid plaques) and radioactive compounds ([14C]deoxyglucose for cerebral glucose metabolism, [14C]leucine for cerebral protein synthesis, [14C]iodoantipyrine for cerebral blood flow), as well as

  13. Reviviendo la consulta post-mortem.

    OpenAIRE

    Armando Cortés

    2009-01-01

    These days sees the inauguration of the "Centro de consulta post-mortem del Hospital Universitario del Valle", a more appropriate designation for the autopsy («seeing for oneself») or any of its synonyms: necropsy, post-mortem examination, necroscopy, or thanatopsy; all of them poorly accepted and conditioned by cultural, social or religious factors. These terms have acquired a clearly negative connotation in the medical environment and among the general public. Perhaps the best term is «consulta p...

  14. Post-mortem radiology-a new sub-speciality?

    International Nuclear Information System (INIS)

    O'Donnell, C.; Woodford, N.

    2008-01-01

    Computed tomography (CT) and magnetic resonance imaging (MRI) examinations of deceased individuals are increasingly being utilized in the field of forensic pathology. However, there are differences in the interpretation of post-mortem and clinical imaging. Radiologists with only occasional experience in post-mortem imaging are at risk of misinterpreting the findings if they rely solely on clinical experience. Radiological specialists working in a co-operative environment with pathologists are pivotal in the understanding of post-mortem CT and MRI, and their appropriate integration into the autopsy. This has spawned a novel subspecialty called post-mortem radiology or necro-radiology (radiology of the deceased). In the future it is likely that whole-body CT will be incorporated into the routine forensic autopsy due to its ability to accurately detect and localise abnormalities commonly seen in forensic practice, such as haematoma, abnormal gas collections, fractures, and metallic foreign bodies. In the next 5-10 years most forensic institutes will seek regular access to such CT facilities or install machines into their own mortuaries. MRI is technically more problematic in the deceased but the improved tissue contrast over CT means that it is also very useful for investigation of pathology in the cranial, thoracic, and abdominal cavities, as well as the detection of haematoma in soft tissue. In order for radiologists to be an integral part of this important development in forensic investigation, radiological organizations must recognize the subspecialty of post-mortem radiology and provide a forum for radiologists to advance scientific knowledge in the field.

  15. Delayed Post Mortem Predation in Lightning Strike Carcasses ...

    African Journals Online (AJOL)

    Campbell Murn

    An adult giraffe was struck dead by lightning on a game farm outside Phalaborwa, South Africa, in March 2014. Interestingly, delayed post-mortem predation occurred on the carcass, which according to the farm owners was an atypical phenomenon for the region. Delayed post-mortem scavenging on lightning strike ...

  16. Fuel retention in JET ITER-Like Wall from post-mortem analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heinola, K., E-mail: kalle.heinola@ccfe.ac.uk [Association EURATOM-TEKES, University of Helsinki, PO Box 64, 00560 Helsinki (Finland); EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Widdowson, A. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Likonen, J. [Association EURATOM-TEKES, VTT, PO Box 1000, 02044 VTT, Espoo (Finland); Alves, E. [Instituto Superior Tecnico, Instituto de Plasmas e Fusao Nuclear, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Baron-Wiechec, A. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Barradas, N. [Instituto Superior Tecnico, Instituto de Plasmas e Fusao Nuclear, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Brezinsek, S. [Forschungszentrum Julich GmbH, EURATOM Association, D-52425 Julich (Germany); Catarino, N. [Instituto Superior Tecnico, Instituto de Plasmas e Fusao Nuclear, Universidade de Lisboa, 1049-001 Lisboa (Portugal); Coad, P. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Koivuranta, S. [Association EURATOM-TEKES, VTT, PO Box 1000, 02044 VTT, Espoo (Finland); Matthews, G.F. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Mayer, M. [Max-Planck Institut fur Plasmaphysik, EURATOM Association, D-85748 Garching (Germany); Petersson, P. [Royal Institute of Technology, Association EURATOM-VR, SE-10044 Stockholm (Sweden)

    2015-08-15

    Selected Ion Beam Analysis techniques applicable for detecting deuterium and heavier impurities have been used in the post-mortem analyses of tiles removed after the first JET ITER-Like Wall (JET-ILW) campaign. Over half of the retained fuel was measured in the divertor region. The highest figures for fuel retention were obtained from regions with the thickest deposited layers, i.e. in the inner divertor on top of tile 1 and on the High Field Gap Closure tile, which resides deep in the plasma scrape-off layer. Least retention was found in the main chamber high erosion regions, i.e. in the mid-plane of Inner Wall Guard Limiter. The fuel retention values found typically varied with deposition layer thicknesses. The reported retention values support the observed decrease in fuel retention obtained with gas balance experiments of JET-ILW.

  17. Partitioning the proteome: phase separation for targeted analysis of membrane proteins in human post-mortem brain.

    Directory of Open Access Journals (Sweden)

    Jane A English

    Full Text Available Neuroproteomics is a powerful platform for targeted and hypothesis-driven research, providing comprehensive insights into cellular and sub-cellular disease states, Gene × Environmental effects, and cellular response to medication effects in human, animal, and cell culture models. Analysis of sub-proteomes is becoming increasingly important in clinical proteomics, enriching for otherwise undetectable proteins that are possible markers for disease. Membrane proteins are one such sub-proteome class that merit in-depth targeted analysis, particularly in psychiatric disorders. As membrane proteins are notoriously difficult to analyse using traditional proteomics methods, we evaluate a paradigm to enrich for and study membrane proteins from human post-mortem brain tissue. This is the first study to extensively characterise the integral trans-membrane spanning proteins present in human brain. Using Triton X-114 phase separation and LC-MS/MS analysis, we enriched for and identified 494 membrane proteins, with 194 trans-membrane helices present, ranging from 1 to 21 helices per protein. Isolated proteins included glutamate receptors, G proteins, voltage-gated and calcium channels, synaptic proteins, and myelin proteins, all of which warrant quantitative proteomic investigation in psychiatric and neurological disorders. Overall, our sub-proteome analysis reduced sample complexity and enriched for integral membrane proteins by 2.3-fold, thus allowing for more manageable, reproducible, and targeted proteomics in case vs. control biomarker studies. This study provides a valuable reference for future neuroproteomic investigations of membrane proteins, and validates the use of Triton X-114 detergent phase extraction on human post-mortem brain.

  18. In vitro studies of ante-mortem proliferation kinetics

    International Nuclear Information System (INIS)

    McBride, W.H.; Withers, H.R.

    1986-01-01

    Using K562 human erythroblastoid cells, it was concluded that dose fractionation has no discrepant effect on the ante-mortem proliferation kinetics of doomed cells as opposed to clonogenic cell survival and that effects on ante-mortem proliferation kinetics cannot be solely responsible for the differences in fractionation response between early and late responding tissues. (UK)

  19. Usefulness of post-mortem ophthalmological endoscopy during forensic autopsy: a case report.

    Science.gov (United States)

    Tsujinaka, Masatake; Akaza, Kayoko; Nagai, Atsushi; Nakamura, Isao; Bunai, Yasuo

    2005-01-01

    Post-mortem intraocular findings in two autopsy cases with traumatic intracranial haemorrhage were obtained using an ophthalmological endoscope. The endoscopy results clearly revealed the presence of intraocular haemorrhages and papilledema caused by intracranial haemorrhage. Post-mortem ophthalmological endoscopy offers several benefits. First, post-mortem intraocular findings can be directly observed in corpses with post-mortem clouding of the cornea. Secondly, the endoscopy only requires a 0.9 mm incision in the sclera and does not require the removal of the eye from the corpse, a procedure that should be avoided for ethical and cosmetic reasons. Thus, post-mortem ophthalmological endoscopy is a useful method for obtaining intraocular findings in autopsies.

  20. Application of contrast media in post-mortem imaging (CT and MRI).

    Science.gov (United States)

    Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice

    2015-09-01

    The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider by using different media.

  1. Drowning - post-mortem imaging findings by computed tomography

    International Nuclear Information System (INIS)

    Christe, Andreas; Aghayev, Emin; Jackowski, Christian; Thali, Michael J.; Vock, Peter

    2008-01-01

    The aim of this study was to identify the classic autopsy signs of drowning in post-mortem multislice computed tomography (MSCT). Therefore, the post-mortem pre-autopsy MSCT findings of ten drowning cases were correlated with autopsy and statistically compared with the post-mortem MSCT of 20 non-drowning cases. Fluid in the airways was present in all drowning cases. Central aspiration in either the trachea or the main bronchi was usually observed. Consecutive bronchospasm caused emphysema aquosum. Sixty percent of drowning cases showed a mosaic pattern of the lung parenchyma due to hypo- and hyperperfused lung regions and areas of aspiration. The resorption of fresh water in the lung resulted in hypodensity of the blood, representing haemodilution and possible heart failure. Swallowed water distended the stomach and duodenum, and inflow of water filled the paranasal sinuses (100%). All the typical findings of drowning, except Paltauf's spots, were detected using post-mortem MSCT, and a good correlation of MSCT and autopsy was found. The advantage of MSCT was the direct detection of bronchospasm, haemodilution and water in the paranasal sinuses, which is rather complicated or impossible at classical autopsy. (orig.)

  2. Histopathological features of post-mortem pituitaries: A retrospective analysis

    Directory of Open Access Journals (Sweden)

    Francisco José Tortosa Vallecillos

    Full Text Available SUMMARY Objective: As a result of the use of neuroimaging techniques, silent pituitary lesions are diagnosed more and more frequently; however, there are few published post-mortem studies about this gland. Incidence data on pituitary lesions are rare and in Portugal they are outdated or even non-existent. The aim of this study is to determine the prevalence of normal patterns and incidental post-mortem pituitary pathology at Centro Hospitalar Lisboa Norte, analyzing the associations with clinical data and assessing the clinical relevance of the findings. Method: We reviewed retrospectively and histologically 167 pituitaries from a consecutive series of autopsies from the Department of Pathology of this centre. They were done between 2012 and 2014, and in all cases medical records were reviewed. The morphological patterns observed were classified into three major groups: (1) normal histological patterns and variants; (2) infectious-inflammatory pathology, metabolic and vascular disorders; (3) incidental primary proliferations and those secondary to systemic diseases. Results: The subjects included in this study were of all age groups (from 1 day to 91 years old); 71 were female and 96 male. Fifty-seven of these glands didn’t show any alteration; 51 showed colloid cysts arising from the Rathke cleft; 44 presented hyperplasia in the adenohypophysis; and we identified 20 adenomas in 19 glands (immunohistochemically, eight PRL-producing and five ACTH-producing tumors), ten of which were associated with obesity, 11 with hypertension and six with diabetes mellitus. There were two cases with metastasis. Conclusion: Subclinical pathology in our country is similar to that seen in other parts of the world, but at older ages.

  3. Microstructural analysis of geopolymer developed from wood fly ash, post-mortem doloma refractory and metakaolin

    International Nuclear Information System (INIS)

    Moura, Jailes de Santana; Mafra, Marcio Paulo de Araujo; Rabelo, Adriano Alves; Fagury, Renata Lilian Ribeiro Portugal; Fagury Neto, Elias

    2016-01-01

    Geopolymers are one of the most widely discussed topics in materials science in recent times due to their vast potential as an alternative binder material to cement. This work aimed to evaluate the microstructure of geopolymers developed from wood fly ash, post-mortem doloma refractory and metakaolin. A preliminary study has been completed and achieved significant compressive strength results: the best geopolymer paste formulation reached approximately 25 MPa. Microstructural analysis of the geopolymer paste by scanning electron microscopy allowed the homogeneity and the distribution of components to be verified, provided evidence of unreacted raw materials, and showed whether crystalline phases were present, as well as the porosity and density of the structure. (author)

  4. Dutch guideline for clinical foetal-neonatal and paediatric post-mortem radiology, including a review of literature.

    Science.gov (United States)

    Sonnemans, L J P; Vester, M E M; Kolsteren, E E M; Erwich, J J H M; Nikkels, P G J; Kint, P A M; van Rijn, R R; Klein, W M

    2018-06-01

    Clinical post-mortem radiology is a relatively new field of expertise and not common practice in most hospitals yet. With the declining numbers of autopsies and increasing demand for quality control of clinical care, post-mortem radiology can offer a solution, or at least be complementary. A working group consisting of radiologists, pathologists and other clinical medical specialists reviewed and evaluated the literature on the diagnostic value of post-mortem conventional radiography (CR), ultrasonography, computed tomography (PMCT), magnetic resonance imaging (PMMRI), and minimally invasive autopsy (MIA). Evidence tables were built and subsequently a Dutch national evidence-based guideline for post-mortem radiology was developed. We present this evaluation of the radiological modalities in a clinical post-mortem setting, including MIA, as well as the recently published Dutch guidelines for post-mortem radiology in foetuses, neonates, and children. In general, for post-mortem radiology modalities, PMMRI is the modality of choice in foetuses, neonates, and infants, whereas PMCT is advised in older children. There is a limited role for post-mortem CR and ultrasonography. In most cases, conventional autopsy will remain the diagnostic method of choice. Based on a literature review and clinical expertise, an evidence-based guideline was developed for post-mortem radiology of foetal, neonatal, and paediatric patients. What is Known: • Post-mortem investigations serve as a quality check for the provided health care and are important for reliable epidemiological registration. • Post-mortem radiology, sometimes combined with minimally invasive techniques, is considered an adjunct or alternative to autopsy. What is New: • We present the Dutch guidelines for post-mortem radiology in foetuses, neonates and children. • Autopsy remains the reference standard; however, minimally invasive autopsy with a skeletal survey, post-mortem computed tomography, or post-mortem

  5. A Case Study in Support of Multiple Post Mortem Assessments (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jill Pable

    2015-02-01

    Full Text Available Creative projects in various fields are often subjected to after-the-fact 'post-mortem' assessments to better understand their successes and failures. Names for these include project retrospectives or post-occupancy evaluations (POEs), depending on their field of origin. This case study from the architecture field will show the utility of engaging in multiple rounds of post-mortem activities in order to assess the solution from multiple stakeholder perspectives and, in doing so, more fully recognize its strengths and weaknesses. The design of a homeless shelter bedroom was subjected to two POE analyses: a 'demand side' focused study that analyzed user accommodation, and a 'supply side' study that addressed issues including budget and funding. The two POEs yielded both corroborative and contrasting findings that sometimes worked at cross purposes. Three evaluation tactics emerged that could be extended to other fields' post-mortem assessment activities: (1) conduct two or more POEs; (2) vary the POE criteria so that one is a deep and focused 'demand side' user analysis and the other addresses 'supply side' operational and installation issues; and (3) conduct the POEs over a broad time period.

  6. Late stillbirth post mortem examination in New Zealand: Maternal decision-making.

    Science.gov (United States)

    Cronin, Robin S; Li, Minglan; Wise, Michelle; Bradford, Billie; Culling, Vicki; Zuccollo, Jane; Thompson, John M D; Mitchell, Edwin A; McCowan, Lesley M E

    2018-03-05

    For parents who experience stillbirth, knowing the cause of their baby's death is important. A post mortem examination is the gold standard investigation, but little is known about what may influence parents' decisions to accept or decline. We aimed to identify factors influencing maternal decision-making about post mortem examination after late stillbirth. In the New Zealand Multicentre Stillbirth Study, 169 women with singleton pregnancies, no known abnormality at recruitment, and late stillbirth (≥28 weeks gestation), from seven health regions were interviewed within six weeks of birth. The purpose of this paper was to explore factors related to post mortem examination decision-making and the reasons for declining. We asked women if they would make the same decision again. Maternal decision to decline a post mortem (70/169, 41.4%) was more common among women of Māori (adjusted odds ratio (aOR) 4.99, 95% confidence interval (CI) 1.70-14.64) and Pacific (aOR 3.94, 95% CI 1.47-10.54) ethnicity compared to European, and parity two or more (aOR 2.95, 95% CI 1.14-7.62) compared to primiparous. The main reason for declining was that women 'did not want baby to be cut'. Ten percent (7/70) who declined said they would not make this decision again. No woman who consented regretted her decision. Ethnic differences observed in women's post mortem decision-making should be further explored in future studies. Providing information on the effect of post mortem on the baby's body and the possible emotional benefits of a post mortem may assist women faced with this decision in the future. © 2018 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  7. Effects of post mortem temperature on rigor tension, shortening and ...

    African Journals Online (AJOL)

    Fully developed rigor mortis in muscle is characterised by maximum loss of extensibility. The course of post mortem changes in ostrich muscle was studied by following isometric tension, shortening and change in pH during the first 24 h post mortem within muscle strips from the muscularis gastrocnemius, pars interna at ...

  8. Post-mortem CT-coronary angiography

    DEFF Research Database (Denmark)

    Pøhlsgaard, Camilla; Leth, Peter Mygind

    2007-01-01

    post-mortem coronary angiography and computerized tomography. We describe how to prepare and inject the contrast medium, and how to establish a CT protocol that optimizes spatial resolution, low contrast resolution and noise level. Testing of the method on 6 hearts showed that the lumen...

  9. Traumatic brain injury: Comparison between autopsy and ante-mortem CT.

    Science.gov (United States)

    Panzer, Stephanie; Covaliov, Lidia; Augat, Peter; Peschel, Oliver

    2017-11-01

    The aim of this study was to compare pathological findings after traumatic brain injury between autopsy and ante-mortem computed tomography (CT). A second aim was to identify changes in these findings between the primary posttraumatic CT and the last follow-up CT before death. Through the collaboration between clinical radiology and forensic medicine, 45 patients with traumatic brain injury were investigated. These patients had undergone ante-mortem CT as well as autopsy. During autopsy, the brain was cut in fronto-parallel slices directly after removal without additional fixation or subsequent histology. Typical findings of traumatic brain injury were compared between autopsy and radiology. Additionally, these findings were compared between the primary CT and the last follow-up CT before death. The comparison between autopsy and radiology revealed a high specificity (≥80%) in most of the findings. Sensitivity and positive predictive value were high (≥80%) in almost half of the findings. Sixteen patients had undergone craniotomy with subsequent follow-up CT. Thirteen conservatively treated patients had undergone a follow-up CT. Comparison between the primary CT and the last ante-mortem CT revealed marked changes in the presence and absence of findings, especially in patients with severe traumatic brain injury requiring decompression craniotomy. The main pathological findings of traumatic brain injury were comparable between clinical ante-mortem CT examinations and autopsy. Comparison between the primary CT after trauma and the last ante-mortem CT revealed marked changes in the findings, especially in patients with severe traumatic brain injury. Hence, clinically routine ante-mortem CT should be included in the process of autopsy interpretation. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
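
    For readers less familiar with the reported measures, the sketch below shows how sensitivity, specificity and positive predictive value are computed per finding from a two-by-two table of CT against autopsy (the reference standard); the counts are illustrative only and are not taken from the study.

        # Illustrative only: deriving sensitivity, specificity and positive
        # predictive value (PPV) from a 2x2 table of CT findings versus autopsy.
        # The example counts below are made up, not data from the study.
        def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
            return {
                "sensitivity": tp / (tp + fn),  # autopsy-positive findings also seen on CT
                "specificity": tn / (tn + fp),  # autopsy-negative findings also clear on CT
                "ppv": tp / (tp + fp),          # CT-positive findings confirmed at autopsy
            }


        print(diagnostic_metrics(tp=18, fp=3, fn=4, tn=20))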

  10. Utility of Post-Mortem Genetic Testing in Cases of Sudden Arrhythmic Death Syndrome

    DEFF Research Database (Denmark)

    Lahrouchi, Najim; Raju, Hariharan; Lodder, Elisabeth M

    2017-01-01

    BACKGROUND: Sudden arrhythmic death syndrome (SADS) describes a sudden death with negative autopsy and toxicological analysis. Cardiac genetic disease is a likely etiology. OBJECTIVES: This study investigated the clinical utility and combined yield of post-mortem genetic testing (molecular autopsy...

  11. Post-mortem fetal MRI: What do we learn from it?

    International Nuclear Information System (INIS)

    Whitby, E.H.; Paley, M.N.J.; Cohen, M.; Griffiths, P.D.

    2006-01-01

    Post-mortem magnetic resonance (MR) imaging is of increasing interest not only as an alternative to autopsy but as a research tool to aid the interpretation and diagnosis of in utero MR images. The information from post-mortem MR has allowed the development of imaging sequences applicable to in utero and neonatal imaging. It has characterised brain development during gestation and provided data to which in utero MR can be compared. The detail available from the post-mortem images is such that brain development can be studied in a non-invasive manner, a permanent record of the normal and abnormal areas is available, and a greater understanding of developmental abnormalities is possible.

  12. Forensic radiology: The role of cross-sectional imaging in virtual post-mortem examinations

    International Nuclear Information System (INIS)

    Higginbotham-Jones, Joshua; Ward, Anthony

    2014-01-01

    Aim: The aim of this review is to assess the benefits and limitations of using Multi Slice Computed Tomography and Magnetic Resonance as non-invasive post-mortem imaging methods. Method: The author utilised SciVerse (Science Direct), Scopus, PubMed and Discover to search for relevant articles. The following search terms were used: virtopsy, minimally invasive post-mortem imaging, autopsy, Multi Slice Computed Tomography, Magnetic Resonance. Articles which discussed the use of non-invasive imaging techniques for post-mortem examinations were included in the review. Any articles published before 2003 were excluded with a few exceptions. Findings: The decline in use of the conventional post-mortem method has led to the need for an alternative method of investigation which increases both sensitivity and specificity, and is also more acceptable to the family of the deceased. Discussion/conclusion: There are numerous factors affecting the usability of these non-invasive post-mortem options, including cost and availability. With the price of non-invasive post-mortem examinations often rising above £1000, it is considered to be less economically viable than the conventional method. Therefore, further research into this method and its implementation in hospitals has been delayed.

  13. Utility of Post-Mortem Genetic Testing in Cases of Sudden Arrhythmic Death Syndrome

    NARCIS (Netherlands)

    Lahrouchi, Najim; Raju, Hariharan; Lodder, Elisabeth M.; Papatheodorou, Efstathios; Ware, James S.; Papadakis, Michael; Tadros, Rafik; Cole, Della; Skinner, Jonathan R.; Crawford, Jackie; Love, Donald R.; Pua, Chee J.; Soh, Bee Y.; Bhalshankar, Jaydutt D.; Govind, Risha; Tfelt-Hansen, Jacob; Winkel, Bo G.; van der Werf, Christian; Wijeyeratne, Yanushi D.; Mellor, Greg; Till, Jan; Cohen, Marta C.; Tome-Esteban, Maria; Sharma, Sanjay; Wilde, Arthur A. M.; Cook, Stuart A.; Bezzina, Connie R.; Sheppard, Mary N.; Behr, Elijah R.

    2017-01-01

    Sudden arrhythmic death syndrome (SADS) describes a sudden death with negative autopsy and toxicological analysis. Cardiac genetic disease is a likely etiology. This study investigated the clinical utility and combined yield of post-mortem genetic testing (molecular autopsy) in cases of SADS and

  14. Quality of coroner's post-mortems in a UK hospital.

    Science.gov (United States)

    Al Mahdy, Husayn

    2014-01-01

    The aim of this paper was principally to examine the quality of coroner's post-mortem reports on adult medical patients admitted to an English hospital, and to compare the results with Royal College of Pathologists guidelines. Hospital clinical notes of adult medical patients who died in 2011 and were referred to the coroner's office to determine the cause of death were scrutinised. Their clinical care was also reviewed. There needs to be a comprehensive approach to coroner's post-mortems, such as routinely taking histological and microbiological specimens. Acute adult medical patient care needs to improve. Steps should be taken to ensure that comprehensive coroner's post-mortems are performed throughout the UK, including routine examination of histological and microbiological specimens. Additionally, closer collaboration between clinicians and pathologists needs to occur to improve emergency adult medical patient clinical care. The study highlights inadequacies in coroner's pathology services.

  15. Uses and social meanings of post-mortem photography in Colombia

    OpenAIRE

    Ana María Henao Albarracín

    2013-01-01

    This research is intended to understand the social uses and meanings of post-mortem or funeral photography between the late nineteenth and mid-twentieth century in Colombia. The article seeks to contribute to the analysis of the relationship between photography and society, and more particularly, between photography and a social representation of death, identifying the conventions and rules of this photographic practice that determine aesthetic behaviors around death.

  16. Uses and social meanings of post-mortem photography in Colombia

    Directory of Open Access Journals (Sweden)

    Ana María Henao Albarracín

    2013-06-01

    Full Text Available This research is intended to understand the social uses and meanings of post-mortem or funeral photography between the late nineteenth and mid-twentieth century in Colombia. The article seeks to contribute to the analysis of the relationship between photography and society, and more particularly, between photography and a social representation of death, identifying the conventions and rules of this photographic practice that determine aesthetic behaviors around death.

  17. Rigor mortis at the myocardium investigated by post-mortem magnetic resonance imaging.

    Science.gov (United States)

    Bonzon, Jérôme; Schön, Corinna A; Schwendener, Nicole; Zech, Wolf-Dieter; Kara, Levent; Persson, Anders; Jackowski, Christian

    2015-12-01

    Post-mortem cardiac MR exams present with different contraction appearances of the left ventricle in cardiac short axis images. It was hypothesized that the grade of post-mortem contraction may be related to the post-mortem interval (PMI) or cause of death and a phenomenon caused by internal rigor mortis that may give further insights in the circumstances of death. The cardiac contraction grade was investigated in 71 post-mortem cardiac MR exams (mean age at death 52 y, range 12-89 y; 48 males, 23 females). In cardiac short axis images the left ventricular lumen volume as well as the left ventricular myocardial volume were assessed by manual segmentation. The quotient of both (LVQ) represents the grade of myocardial contraction. LVQ was correlated to the PMI, sex, age, cardiac weight, body mass and height, cause of death and pericardial tamponade when present. In cardiac causes of death a separate correlation was investigated for acute myocardial infarction cases and arrhythmic deaths. LVQ values ranged from 1.99 (maximum dilatation) to 42.91 (maximum contraction) with a mean of 15.13. LVQ decreased slightly with increasing PMI, however without significant correlation. Pericardial tamponade positively correlated with higher LVQ values. Variables such as sex, age, body mass and height, cardiac weight and cause of death did not correlate with LVQ values. There was no difference in LVQ values for myocardial infarction without tamponade and arrhythmic deaths. Based on the observation in our investigated cases, the phenomenon of post-mortem myocardial contraction cannot be explained by the influence of the investigated variables, except for pericardial tamponade cases. Further research addressing post-mortem myocardial contraction has to focus on other, less obvious factors, which may influence the early post-mortem phase too. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
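
    The abstract defines LVQ as the quotient of the two manually segmented volumes but does not spell out the orientation of the ratio; since higher LVQ corresponds to stronger contraction, the sketch below assumes myocardial volume divided by luminal volume (an inference on my part, not a statement from the paper), with volumes taken as voxel counts times voxel volume from binary masks:

        # Hedged sketch of the LVQ computation described in the abstract. The
        # ratio orientation (myocardium / lumen) is inferred from the reported
        # behaviour (higher LVQ = stronger contraction), not stated explicitly.
        import numpy as np


        def volume_ml(mask: np.ndarray, voxel_volume_mm3: float) -> float:
            """Volume of a binary segmentation mask in millilitres."""
            return float(mask.sum()) * voxel_volume_mm3 / 1000.0


        def lvq(myocardium_mask, lumen_mask, voxel_volume_mm3):
            myo = volume_ml(myocardium_mask, voxel_volume_mm3)
            lumen = volume_ml(lumen_mask, voxel_volume_mm3)
            return myo / lumen


        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # Toy masks standing in for manual short-axis segmentations.
            myo = rng.random((10, 128, 128)) > 0.7
            lum = rng.random((10, 128, 128)) > 0.9
            print(round(lvq(myo, lum, voxel_volume_mm3=1.5), 2))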

  18. Post-mortem imaging compared with autopsy in trauma victims--A systematic review.

    Science.gov (United States)

    Jalalzadeh, Hamid; Giannakopoulos, Georgios F; Berger, Ferco H; Fronczek, Judith; van de Goot, Frank R W; Reijnders, Udo J; Zuidema, Wietse P

    2015-12-01

    Post-mortem imaging or virtual autopsy is a rapidly advancing field of post-mortem investigation of trauma victims. In this review we evaluate the feasibility of complementation or replacement of conventional autopsy by post-mortem imaging in trauma victims. A systematic review was performed in compliance with the PRISMA guidelines. MEDLINE, Embase and Cochrane databases were systematically searched for studies published between January 2008 and January 2014, in which post-mortem imaging was compared to conventional autopsy in trauma victims. Studies were included when two or more trauma victims were investigated. Twenty-six studies were included, with a total of 563 trauma victims. Post-mortem computed tomography (PMCT) was performed in 22 studies, post-mortem magnetic resonance imaging (PMMRI) in five studies and conventional radiography in two studies. PMCT and PMMRI both demonstrate moderate to high-grade injuries and cause of death accurately. PMCT is more sensitive than conventional autopsy or PMMRI in detecting skeletal injuries. For detecting minor organ and soft tissue injuries, autopsy remains superior to imaging. Aortic injuries are missed frequently by PMCT and PMMRI and form their main limitation. PMCT should be considered as an essential supplement to conventional autopsy in trauma victims since it detects many additional injuries. Despite some major limitations, PMCT could be used as an alternative for conventional autopsy in situations where conventional autopsy is rejected or unavailable. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Post-mortem cardiac diffusion tensor imaging: detection of myocardial infarction and remodeling of myofiber architecture.

    Science.gov (United States)

    Winklhofer, Sebastian; Stoeck, Christian T; Berger, Nicole; Thali, Michael; Manka, Robert; Kozerke, Sebastian; Alkadhi, Hatem; Stolzmann, Paul

    2014-11-01

    To investigate the accuracy of post-mortem diffusion tensor imaging (DTI) for the detection of myocardial infarction (MI) and to demonstrate the feasibility of helix angle (HA) calculation to study remodelling of myofibre architecture. Cardiac DTI was performed in 26 deceased subjects prior to autopsy for medicolegal reasons. Fractional anisotropy (FA) and mean diffusivity (MD) were determined. Accuracy was calculated on a per-segment (AHA classification), per-territory, and per-patient basis, with pathology as the reference standard. HAs were calculated and compared between healthy segments and those with MI. Autopsy demonstrated MI in 61/440 segments (13.9%) in 12/26 deceased subjects. Healthy myocardial segments had significantly higher FA than infarcted segments. Analysis of HA distribution demonstrated remodelling of myofibre architecture, with significant differences between healthy segments and segments with chronic MI. Post-mortem cardiac DTI enables differentiation between healthy and infarcted myocardial segments by means of FA and MD. HA assessment allows for the demonstration of remodelling of myofibre architecture following chronic MI. • DTI enables post-mortem detection of myocardial infarction with good accuracy. • A decrease in right-handed helical fibre indicates myofibre remodelling following chronic myocardial infarction. • DTI allows for ruling out myocardial infarction by means of FA. • Post-mortem DTI may represent a valuable screening tool in forensic investigations.
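
    Fractional anisotropy and mean diffusivity are standard scalar summaries of the diffusion tensor; their generic definitions from its three eigenvalues are sketched below (this is textbook DTI, not code from the study):

        # Generic definitions of mean diffusivity (MD) and fractional anisotropy
        # (FA) from the three diffusion tensor eigenvalues; not study code.
        import numpy as np


        def md_fa(eigenvalues):
            lam = np.asarray(eigenvalues, dtype=float)
            md = lam.mean()                                         # mean diffusivity
            fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
            return md, fa


        # Example: strongly anisotropic diffusion (units of 1e-3 mm^2/s);
        # a high FA (near 1) indicates coherently ordered fibres.
        print(md_fa([1.7, 0.3, 0.3]))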

  20. Time to address the problem of post-mortem procurement of organs for transplantation occurring without proper pre-mortem consent.

    Science.gov (United States)

    Garwood-Gowers, Austen

    2013-09-01

    Current cadaveric organ transplant systems allow individuals to be classified as donors after death where they registered wishes in favour of this prior to death. However, systems for registering wishes pertaining to donation fall woefully short of securing proper consent. Furthermore, even jurisdictions which technically require consent to be obtained in order to treat an individual as a donor, allow that consent to be given by next of kin after death in circumstances where there is no evidence of the individual having refused prior to death. This article explores these and related issues with current systems from the perspectives of health law norms, ethics and human rights. It concludes that proper pre-mortem consent ought to be a pre-requisite for post-mortem organ transplantation.

  1. Feather retention force in broilers ante-, peri-, and post-mortem as influenced by electrical and carbon dioxide stunning.

    Science.gov (United States)

    Buhr, R J; Cason, J A; Rowland, G N

    1997-11-01

    Stunning and slaughter trials were conducted to evaluate the influence of stunning method (electrical 50 V alternating current, CO2 gas: 0 to 40% for 90 s or 40 to 60% for 30 s) on feather retention force (FRF) in commercial broilers. Feathers from the pectoral, sternal, and femoral feather tracts were sampled with a force gauge before stunning (ante-mortem) and contralaterally either after stunning (peri-mortem from 0.5 to 4 min) or after stunning and bleeding (post-mortem from 2 to 6 min). Prior to stunning, ante-mortem FRF values varied among assigned stunning methods only for the pectoral (7%) feather tract. After stunning, peri-mortem FRF values were higher only for the sternal tract (11% for 40 to 60% CO2 for 30 s); whereas after stunning and bleeding, post-mortem FRF values were lower than ante- or peri-mortem only for the sternal tract (10% lower for 40 to 60% CO2 for 30 s). Peri- and post-mortem FRF values did not differ among stunning methods for the pectoral and femoral feather tracts. Small changes in FRF values occurred from ante-mortem to peri-mortem (-1 to +12%), and from ante-mortem to post-mortem (-2 to +8%) across stunning methods. A significant increase was determined for only the pectoral tract (7%) from ante- to peri-mortem across stunning methods. Electrically stunned broilers that were not bled gained weight in excess of the 36 feathers removed (0.16%), apparently due to body surface water pickup during the brine-stunning process, whereas CO2-stunned broilers lost weight due to excretion of cloacal contents (-0.31 to -0.98%). The change in body weight among stunning methods was significant; defeathering efficiency may not differ after scalding.

  2. The toxicological significance of post-mortem drug concentrations in bile.

    Science.gov (United States)

    Ferner, Robin E; Aronson, Jeffrey K

    2018-01-01

    Some authors have proposed that post-mortem drug concentrations in bile are useful in estimating concentrations in blood. Both The International Association of Forensic Toxicologists (TIAFT) and the US Federal Aviation Administration recommend that samples of bile should be obtained in some circumstances. Furthermore, standard toxicological texts compare blood and bile concentrations, implying that concentrations in bile are of forensic value. To review the evidence on simultaneous measurements of blood and bile drug concentrations reported in the medical literature. We made a systematic search of EMBASE 1980-2016 using the search terms ("bile/" OR "exp drug bile level/concentration/") AND "drug blood level/concentration/", PubMed 1975-2017 for ("bile[tw]" OR "biliary[tw]") AND ("concentration[tw]" OR "concentrations[tw]" OR "level[tw]" OR "levels[tw]") AND "post-mortem[tw]" and also MEDLINE 1990-2016 for information on drugs whose biliary concentrations were mentioned in standard textbooks. The search was limited to human studies without language restrictions. We also examined recent reviews, indexes of relevant journals and citations in Web of Science and Google Scholar. We calculated the bile:blood concentration ratio. The searches together yielded 1031 titles with abstracts. We scanned titles and abstracts for relevance and retrieved 230, of which 161 were considered further. We excluded 49 papers because: the paper reported only one case (30 references); the data referred only to a metabolite (1); the work was published before 1980 (3); the information concerned only samples taken during life (10); or the paper referred to a toxin or unusual recreational drug (5). The remaining 112 papers provided data for analysis, with at least two observations for each of 58 drugs. Bile:blood concentration ratios: Median bile:blood concentration ratios varied from 0.18 (range 0.058-0.32) for dextromoramide to 520 (range 0.62-43,000) for buprenorphine. Median bile
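
    The key derived quantity in this review is the bile:blood concentration ratio for each drug, summarised as a median and range across cases. As a minimal illustration of that summary step, the sketch below uses entirely hypothetical paired bile/blood concentrations; the drug names and values are placeholders, not data from the review.

```python
from statistics import median

# Hypothetical paired post-mortem measurements (bile, blood) in matching units (e.g. mg/L).
# Illustrative values only.
observations = {
    "drug_A": [(0.30, 0.05), (0.80, 0.20), (1.10, 0.40)],
    "drug_B": [(5.0, 0.10), (12.0, 0.50), (40.0, 0.09)],
}

for drug, pairs in observations.items():
    ratios = [bile / blood for bile, blood in pairs]
    print(f"{drug}: median bile:blood ratio = {median(ratios):.2f}, "
          f"range = {min(ratios):.2f}-{max(ratios):.2f}")
```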

  3. An audit of the contribution to post-mortem examination diagnosis of individual analyte results obtained from biochemical analysis of the vitreous.

    Science.gov (United States)

    Mitchell, Rebecca; Charlwood, Cheryl; Thomas, Sunethra Devika; Bellis, Maria; Langlois, Neil E I

    2013-12-01

    Biochemical analysis of the vitreous humor from the eye is an accepted accessory test for post-mortem investigation of cause of death. Modern biochemical analyzers allow testing of a range of analytes from a sample. However, it is not clear which analytes should be requested in order to prevent unnecessary testing (and expense). The means and standard deviation of the values obtained from analysis of the vitreous humor for sodium, potassium, chloride, osmolality, glucose, ketones (β-hydroxybutyrate), creatinine, urea, calcium, lactate, and ammonia were calculated from which the contribution of each analyte was reviewed in the context of post-mortem findings and final cause of death. For sodium 32 cases were regarded as high (more than one standard deviation above the mean), from which 9 contributed to post-mortem diagnosis [drowning (4), heat related death (2), diabetic hyperglycemia (2), and dehydration (1)], but 25 low values (greater than one standard deviation below the mean) made no contribution. For chloride 29 high values contributed to 4 cases--3 drowning and 1 heat-related, but these were all previously identified by a high sodium level. There were 29 high and 35 low potassium values, none of which contributed to determining the final cause of death. Of 22 high values of creatinine, 12 contributed to a diagnosis of renal failure. From 32 high values of urea, 18 contributed to 16 cases of renal failure (2 associated with diabetic hyperglycemia), 1 heat-related death, and one case with dehydration. Osmolarity contributed to 12 cases (5 heat-related, 4 diabetes, 2 renal failure, and 1 dehydration) from 36 high values. There was no contribution from 32 high values and 19 low values of calcium and there was no contribution from 4 high and 2 low values of ammonia. There were 11 high values of glucose, which contributed to the diagnosis of 6 cases of diabetic hyperglycemia and 21 high ketone levels contributed to 8 cases: 4 diabetic ketosis, 3 hypothermia, 3
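
    The audit classifies a vitreous result as "high" or "low" when it falls more than one standard deviation above or below the case-series mean for that analyte. The following sketch shows that flagging rule applied to made-up sodium values; the numbers are illustrative only.

```python
from statistics import mean, stdev

def flag_outliers(values):
    """Flag values more than one standard deviation above/below the series mean."""
    mu, sd = mean(values), stdev(values)
    return [
        ("high" if v > mu + sd else "low" if v < mu - sd else "normal", v)
        for v in values
    ]

# Hypothetical vitreous sodium results (mmol/L); not data from the audit.
sodium = [138, 142, 171, 135, 118, 140, 144, 139]
for label, value in flag_outliers(sodium):
    print(f"{value} mmol/L -> {label}")
```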

  4. Blast furnace hearth lining: post mortem analysis

    International Nuclear Information System (INIS)

    Almeida, Bruno Vidal de; Vernilli Junior, Fernando

    2017-01-01

    The main refractory lining of the blast furnace hearth is composed of carbon blocks that operate in continuous contact with hot gases, liquid slag and hot metal, at temperatures above 1550 deg C, 24 hours a day. To fully understand the wear mechanism acting in this refractory layer system, a post mortem study was performed during the last partial repair of this furnace. The samples were collected from different parts of the hearth lining and characterized using the following techniques: bulk density and apparent porosity, X-ray fluorescence, X-ray diffraction, and scanning electron microscopy with energy-dispersive X-ray spectroscopy. The results showed that the carbon blocks located on the side opposite the blast furnace tap hole kept their main physicochemical characteristics preserved even after the production of 20x10^6 ton of hot metal. However, the carbon blocks around the tap hole showed infiltration by hot metal and slag and severe deposition of zinc and sulfur over their carbon flakes. The presence of these elements is undesirable because it reduces the physicochemical stability of this refractory system. The deposition found in the carbon refractory is associated with impurities present in both the coke and the sinter feed used in this blast furnace in the last few years. (author)

  5. Blast furnace hearth lining: post mortem analysis

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Bruno Vidal de; Vernilli Junior, Fernando, E-mail: bva@usp.br [Universidade de Sao Paulo (USP), Lorena, SP (Brazil). Escola de Engenharia; Neves, Elton Silva; Silva, Sidiney Nascimento [Companhia Siderugica Nacional (CSN), Rio de Janeiro, RJ (Brazil)

    2017-05-15

    The main refractory lining of the blast furnace hearth is composed of carbon blocks that operate in continuous contact with hot gases, liquid slag and hot metal, at temperatures above 1550 deg C, 24 hours a day. To fully understand the wear mechanism acting in this refractory layer system, a post mortem study was performed during the last partial repair of this furnace. The samples were collected from different parts of the hearth lining and characterized using the following techniques: bulk density and apparent porosity, X-ray fluorescence, X-ray diffraction, and scanning electron microscopy with energy-dispersive X-ray spectroscopy. The results showed that the carbon blocks located on the side opposite the blast furnace tap hole kept their main physicochemical characteristics preserved even after the production of 20x10{sup 6} ton of hot metal. However, the carbon blocks around the tap hole showed infiltration by hot metal and slag and severe deposition of zinc and sulfur over their carbon flakes. The presence of these elements is undesirable because it reduces the physicochemical stability of this refractory system. The deposition found in the carbon refractory is associated with impurities present in both the coke and the sinter feed used in this blast furnace in the last few years. (author)

  6. Microstructural analysis of geopolymer developed from wood fly ash, post-mortem doloma refractory and metakaolin; Analise microestrutural de geopolimero desenvolvido a partir de cinza de olaria, tijolo refratario dolomitico post-mortem e metacaulim

    Energy Technology Data Exchange (ETDEWEB)

    Moura, Jailes de Santana; Mafra, Marcio Paulo de Araujo; Rabelo, Adriano Alves; Fagury, Renata Lilian Ribeiro Portugal; Fagury Neto, Elias, E-mail: jailesmoura@hotmail.com, E-mail: fagury@unifesspa.edu.br [Universidade Federal do Sul e Sudeste do Para (UNIFESSPA), PA (Brazil). Faculdade de Engenharia de Materiais

    2016-07-01

    Geopolymers are among the most widely discussed topics in materials science in recent times due to their vast potential as an alternative binder material to cement. This work aimed to evaluate the microstructure of geopolymers developed from wood fly ash, post-mortem doloma refractory and metakaolin. A preliminary study has been completed and achieved significant compressive strength results: the best geopolymer paste formulation reached approximately 25 MPa. Microstructural analysis of the geopolymer paste by scanning electron microscopy allowed the homogeneity and the distribution of components to be verified, and provided evidence of unreacted raw materials, crystalline phases, porosity and the density of the structure. (author)

  7. [Research Progress of Carrion-breeding Phorid Flies for Post-mortem Interval Estimation in Forensic Medicine].

    Science.gov (United States)

    Li, L; Feng, D X; Wu, J

    2016-10-01

    Accurate estimation of the post-mortem interval is a difficult problem in forensic medicine. The entomological approach has been regarded as an effective way to estimate the post-mortem interval, and the developmental biology of carrion-breeding flies plays an important role in this estimation. Phorid flies are tiny and occur as the main or even the only insect evidence in relatively enclosed environments. This paper reviews the research progress on carrion-breeding phorid flies for estimating the post-mortem interval in forensic medicine, including their roles, species identification and age determination of immature stages. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  8. DNA methylation results depend on DNA integrity – role of post mortem interval

    Directory of Open Access Journals (Sweden)

    Mathias eRhein

    2015-05-01

    Full Text Available Major questions of neurological and psychiatric mechanisms involve the brain functions on a molecular level and cannot be easily addressed due to limitations in access to tissue samples. Post mortem studies are able to partly bridge the gap between brain tissue research retrieved from animal trials and the information derived from peripheral analysis (e.g. measurements in blood cells) in patients. Here, we wanted to know how fast DNA degradation is progressing under controlled conditions in order to define thresholds for tissue quality to be used in respective trials. Our focus was on the applicability of partly degraded samples for bisulfite sequencing and the determination of simple means to define cut-off values. After opening the brain cavity, we kept two consecutive pig skulls at ambient temperature (19-21°C) and removed cortex tissue up to a post mortem interval (PMI) of 120h. We calculated the percentage of degradation on DNA gel electrophoresis of brain DNA to estimate quality and relate this estimation spectrum to the quality of human post-mortem control samples. Functional DNA quality was investigated by bisulfite sequencing of two functionally relevant genes for either the serotonin receptor 5 (SLC6A4) or aldehyde dehydrogenase 2 (ALDH2). Testing our approach in a heterogeneous collective of human blood and brain samples, we demonstrate integrity of measurement quality below the threshold of 72h PMI. While sequencing technically worked for all timepoints irrespective of conceivable DNA degradation, there is a good correlation between variance of methylation and degradation levels documented in the gel (R2=0.4311, p=0.0392) for advancing post mortem intervals (PMI). This otherwise elusive phenomenon is an important prerequisite for the interpretation and evaluation of samples prior to in-depth processing via an affordable and easy assay to estimate identical sample quality and thereby comparable methylation measurements.
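
    The reported relationship is a correlation between the percentage of DNA degradation documented on the gel and the variance of the methylation measurements (R2=0.4311). As a minimal, hedged sketch of how such an R2 can be computed from paired observations, the values below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical pairs: % DNA degradation on the gel vs. variance of methylation calls.
# Purely illustrative values; the study reports R^2 = 0.4311 on real samples.
degradation_pct = np.array([5, 15, 30, 45, 60, 80], dtype=float)
methylation_var = np.array([0.8, 1.1, 1.0, 1.9, 2.4, 2.6])

r = np.corrcoef(degradation_pct, methylation_var)[0, 1]  # Pearson correlation
print(f"R^2 = {r**2:.3f}")
```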

  9. Diagnosis of drowning using post-mortem computed tomography - state of the art.

    Science.gov (United States)

    Raux, C; Saval, F; Rouge, D; Telmon, N; Dedouit, F

    Recent studies using post-mortem computed tomography (PMCT) have suggested this imaging modality is of value in the positive diagnosis of drowning. We summarize the data from the literature regarding the diagnostic value of CT in cases of drowning. We performed an all-language search of literature published from 1999 to 2013 with the key words "post-mortem CT scan", "drowning and CT scan", "near-drowning diagnosis", and "drowning diagnosis". Only 11 articles, whose data enabled complementary statistical analysis, were included. The presence of fluid and sediment in paranasal sinuses appear to be the determinants of the diagnosis of drowning. The presence of fluid in the sinuses had a sensitivity of 100%, and of 90% in the trachea and main bronchi. The results were completed by the high specificity of the presence of sediment in the paranasal sinuses, upper airways and stomach, which was 100% for all three. Haemodilution was present in cases of drowning (p < 0.001). The values made it possible to formulate a decision algorithm for the diagnosis of drowning.

  10. Ethical considerations in forensic genetics research on tissue samples collected post-mortem in Cape Town, South Africa.

    Science.gov (United States)

    Heathfield, Laura J; Maistry, Sairita; Martin, Lorna J; Ramesar, Raj; de Vries, Jantina

    2017-11-29

    The use of tissue collected at a forensic post-mortem for forensic genetics research purposes remains of ethical concern as the process involves obtaining informed consent from grieving family members. Two forensic genetics research studies using tissue collected from a forensic post-mortem were recently initiated at our institution and were the first of their kind to be conducted in Cape Town, South Africa. This article discusses some of the ethical challenges that were encountered in these research projects. Among these challenges was the adaptation of research workflows to fit in with an exceptionally busy service delivery that is operating with limited resources. Whilst seeking guidance from the literature regarding research on deceased populations, it was noted that next of kin of decedents are not formally recognised as a vulnerable group in the existing ethical and legal frameworks in South Africa. The authors recommend that research in the forensic mortuary setting is approached using guidance for vulnerable groups, and the benefit to risk standard needs to be strongly justified. Lastly, when planning forensic genetics research, consideration must be given to the potential of uncovering incidental findings, funding to validate these findings and the feedback of results to family members; the latter of which is recommended to occur through a genetic counsellor. It is hoped that these experiences will contribute towards a formal framework for conducting forensic genetic research in medico-legal mortuaries in South Africa.

  11. Cochlear neuropathy in human presbycusis: Confocal analysis of hidden hearing loss in post-mortem tissue.

    Science.gov (United States)

    Viana, Lucas M; O'Malley, Jennifer T; Burgess, Barbara J; Jones, Dianne D; Oliveira, Carlos A C P; Santos, Felipe; Merchant, Saumil N; Liberman, Leslie D; Liberman, M Charles

    2015-09-01

    Recent animal work has suggested that cochlear synapses are more vulnerable than hair cells in both noise-induced and age-related hearing loss. This synaptopathy is invisible in conventional histopathological analysis, because cochlear nerve cell bodies in the spiral ganglion survive for years, and synaptic analysis requires special immunostaining or serial-section electron microscopy. Here, we show that the same quadruple-immunostaining protocols that allow synaptic counts, hair cell counts, neuronal counts and differentiation of afferent and efferent fibers in mouse can be applied to human temporal bones, when harvested within 9 h post-mortem and prepared as dissected whole mounts of the sensory epithelium and osseous spiral lamina. Quantitative analysis of five "normal" ears, aged 54-89 yrs, without any history of otologic disease, suggests that cochlear synaptopathy and the degeneration of cochlear nerve peripheral axons, despite a near-normal hair cell population, may be an important component of human presbycusis. Although primary cochlear nerve degeneration is not expected to affect audiometric thresholds, it may be key to problems with hearing in noise that are characteristic of declining hearing abilities in the aging ear. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Compressive rib fracture: peri-mortem and post-mortem trauma patterns in a pig model.

    Science.gov (United States)

    Kieser, Jules A; Weller, Sarah; Swain, Michael V; Neil Waddell, J; Das, Raj

    2013-07-01

    Despite numerous studies on high impact fractures of ribs, little is known about compressive rib injuries. We studied rib fractures from a biomechanical and morphological perspective using 15, 5th ribs of domestic pigs Sus scrofa, divided into two groups, desiccated (representing post-mortem trauma) and fresh ribs with intact periosteum (representing peri-mortem trauma). Ribs were axially compressed and subjected to four-point bending in an Instron 3339 fitted with custom jigs. Morphoscopic analysis of resultant fractures consisted of standard optical methods, micro-CT (μCT) and scanning electron microscopy (SEM). During axial compression, fresh ribs had slightly higher strength because of energy absorption capabilities of their soft and fluidic components. In flexure tests, dry ribs showed typical elastic-brittle behaviour with long linear load-extension curves, followed by relatively short non-linear elastic (hyperelastic) behaviour and brittle fracture. Fresh ribs showed initial linear-elastic behaviour, followed by strain softening, visco-plastic responses. During the course of loading, dry bone showed minimal observable damage prior to the onset of unstable fracture. In contrast, fresh bone showed buckling-like damage features on the compressive surface and cracking parallel to the axis of the bone. Morphologically, all dry ribs fractured precipitously, whereas all but one of the fresh ribs showed incomplete fracture. The mode of fracture, however, was remarkably similar for both groups, with butterfly fractures predominating (7/15, 46.6% dry and wet). Our study highlights the fact that under controlled loading, despite seemingly similar butterfly fracture morphology, fresh ribs (representing perimortem trauma) show a non-catastrophic response. While extensive strain softening observed for the fresh bone does show some additional micro-cracking damage, it appears that the periosteum may play a key role in imparting the observed pseudo-ductility to the ribs

  13. Current status of paediatric post-mortem imaging: an ESPR questionnaire-based survey

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Rijn, Rick R. van [Academic Medical Centre, Department of Radiology, Amsterdam (Netherlands); Sebire, Neil J. [Great Ormond Street Hospital for Children, Department of Pathology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom)

    2014-03-15

    The use of post-mortem imaging, including skeletal radiography, CT and MRI, is increasing, providing a minimally invasive alternative to conventional autopsy techniques. The development of clinical guidelines and national standards is being encouraged, particularly for cross-sectional techniques. To outline the current practice of post-mortem imaging amongst members of the European Society of Paediatric Radiology (ESPR). We e-mailed an online questionnaire of current post-mortem service provisions to members of the ESPR in January 2013. The survey included direct questions about what services were offered, the population imaged, current techniques used, imaging protocols, reporting experience and intended future involvement. Seventy-one percent (47/66) of centres from which surveys were returned reported performing some form of post-mortem imaging in children, of which 81 % perform radiographs, 51% CT and 38% MRI. Eighty-seven percent of the imaging is performed within the radiology or imaging departments, usually by radiographers (75%), and 89% is reported by radiologists, of which 64% is reported by paediatric radiologists. Overall, 72% of positive respondents have a standardised protocol for radiographs, but only 32% have such a protocol for CT and 27% for MRI. Sixty-one percent of respondents wrote that this is an important area that needs to be developed. Overall, the majority of centres provide some post-mortem imaging service, most of which is performed within an imaging department and reported by a paediatric radiologist. However, the populations imaged as well as the details of the services offered are highly variable among institutions and lack standardisation. We have identified people who would be interested in taking this work forwards. (orig.)

  14. 9 CFR 354.122 - Condemnation on ante-mortem inspection.

    Science.gov (United States)

    2010-01-01

    ... AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY..., on ante-mortem inspection, are condemned shall not be dressed, nor shall they be conveyed into any...

  15. Assessment of coronary artery disease by post-mortem cardiac MR

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Bauer-Kreutz, Regula; Ampanozi, Garyfalia; Rosskopf, Andrea B.; Pilgrim, Thomas M.; Weber, Oliver M.; Thali, Michael J.; Hatch, Gary M.

    2012-01-01

    Objectives: Minimally invasive or virtual autopsies are being advocated as an alternative to traditional autopsy, but have limited abilities to detect coronary artery disease. It was the objective of this study to assess if the occurrence of chemical shift artifacts (CSA) along the coronary arteries on non-contrast, post-mortem cardiac MR may be used to investigate coronary artery disease. Methods: We retrospectively compared autopsy and CT findings of 30 cases with significant (≥75%), insignificant (<75%), or absent coronary artery stenosis to post-mortem cardiac MR findings. The chi-square test was used to investigate if the occurrence of CSA depends on the presence or absence of stenosis. Sensitivity, specificity and predictive values were calculated for each finding. Results: CSA indicates the absence of (significant) stenosis (p < 0.001). The occurrence of paired dark bands in lieu of CSA on post-mortem cardiac MR suggests (significant) coronary artery stenosis (p < 0.001). Both findings have a high specificity but low sensitivity. Conclusions: CSA is a marker of vessel patency. The presence of paired dark bands indicates stenosis. These criteria improve the ability of minimally invasive or virtual autopsy to detect coronary artery disease related deaths.

  16. Changes of microbial spoilage, lipid-protein oxidation and physicochemical properties during post mortem refrigerated storage of goat meat.

    Science.gov (United States)

    Sabow, Azad Behnan; Sazili, Awis Qurni; Aghwan, Zeiad Amjad; Zulkifli, Idrus; Goh, Yong Meng; Ab Kadir, Mohd Zainal Abidin; Nakyinsige, Khadijah; Kaka, Ubedullah; Adeyemi, Kazeem Dauda

    2016-06-01

    The effect of post mortem refrigerated storage on microbial spoilage, lipid-protein oxidation and physicochemical traits of goat meat was examined. Seven Boer bucks were slaughtered, eviscerated and aged for 24 h. The Longissimus lumborum (LL) and Semitendinosus (ST) muscles were excised and subjected to 13 days post mortem refrigerated storage. The pH, lipid and protein oxidation, tenderness, color and drip loss were determined in LL while microbiological analysis was performed on ST. Bacterial counts generally increased with increasing aging time and the limit for fresh meat was reached at day 14 post mortem. Significant differences were observed in malondialdehyde (MDA) content at day 7 of storage. The thiol concentration significantly reduced as aging time increased. The band intensities of myosin heavy chain (MHC) and troponin-T significantly decreased as storage progressed, while actin remained relatively stable. After 14 days of aging, tenderness showed significant improvement while muscle pH and drip loss reduced with increase in storage time. Samples aged for 14 days had higher lightness (P < 0.05). [...] goat meat. © 2016 Japanese Society of Animal Science.

  17. Characterisation of the metabolome of ocular tissues and post-mortem changes in the rat retina.

    Science.gov (United States)

    Tan, Shi Z; Mullard, Graham; Hollywood, Katherine A; Dunn, Warwick B; Bishop, Paul N

    2016-08-01

    Time-dependent post-mortem biochemical changes have been demonstrated in donor cornea and vitreous, but there have been no published studies to date that objectively measure post-mortem changes in the retinal metabolome over time. The aim of the study was firstly, to investigate post-mortem, time-dependent changes in the rat retinal metabolome and secondly, to compare the metabolite composition of healthy rat ocular tissues. To study post-mortem changes in the rat retinal metabolome, globes were enucleated and stored at 4 °C and sampled at 0, 2, 4, 8, 24 and 48 h post-mortem. To study the metabolite composition of rat ocular tissues, eyes were dissected immediately after culling to isolate the cornea, lens, vitreous and retina, prior to storing at -80 °C. Tissue extracts were subjected to Gas Chromatograph Mass Spectrometry (GC-MS) and Ultra High Performance Liquid Chromatography Mass Spectrometry (UHPLC-MS). Generally, the metabolic composition of the retina was stable for 8 h post-mortem when eyes were stored at 4 °C, but showed increasing changes thereafter. However, some more rapid changes were observed such as increases in TCA cycle metabolites after 2 h post-mortem, whereas some metabolites such as fatty acids only showed decreases in concentration from 24 h. A total of 42 metabolites were identified across the ocular tissues by GC-MS (MSI level 1) and 2782 metabolites were annotated by UHPLC-MS (MSI level 2) according to MSI reporting standards. Many of the metabolites detected were common to all of the tissues but some metabolites showed partitioning between different ocular structures with 655, 297, 93 and 13 metabolites being uniquely detected in the retina, lens, cornea and vitreous respectively. Only a small percentage (1.6%) of metabolites found in the vitreous were only detected in the retina and not other tissues. In conclusion, mass spectrometry-based techniques have been used for the first time to compare the metabolic composition of

  18. Post-Mortem Projections: Medieval Mystical Resurrection and the Return of Tupac Shakur

    OpenAIRE

    Spencer-Hall, A.

    2012-01-01

    Medieval hagiographies abound with tales of post-mortem visits and miracles by saints. The saint was a powerful religious individual both in life and in death, a conduit of divine grace and lightning rod for Christian fervour. With her post-mortem presence, the presumptive boundary between living and dead, spirit and flesh, is rent apart: showing the reality of the hereafter and shattering the fantasies of the mortal world. The phenomenon of a glorified individual returning to ...

  19. Profiling of RNA degradation for estimation of post mortem [corrected] interval.

    Directory of Open Access Journals (Sweden)

    Fernanda Sampaio-Silva

    Full Text Available An estimation of the post mortem interval (PMI) is frequently touted as the Holy Grail of forensic pathology. During the first hours after death, PMI estimation is dependent on the rate of physical observable modifications including algor, rigor and livor mortis. However, these assessment methods are still largely unreliable and inaccurate. Alternatively, RNA has been put forward as a valuable tool in forensic pathology, namely to identify body fluids, estimate the age of biological stains and to study the mechanism of death. Nevertheless, the attempts to find correlation between RNA degradation and PMI have been unsuccessful. The aim of this study was to characterize the RNA degradation in different post mortem tissues in order to develop a mathematical model that can be used as a coadjuvant method for a more accurate PMI determination. For this purpose, we performed an eleven-hour kinetic analysis of total extracted RNA from murine visceral and muscle tissues. The degradation profile of total RNA and the expression levels of several reference genes were analyzed by quantitative real-time PCR. A quantitative analysis of normalized transcript levels on the former tissues allowed the identification of four quadriceps muscle genes (Actb, Gapdh, Ppia and Srp72) that were found to significantly correlate with PMI. These results allowed us to develop a mathematical model with predictive value for estimation of the PMI (confidence interval of ±51 minutes at 95%) that can become an important complementary tool for traditional methods.
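
    The predictive model described above rests on regressing PMI against normalized transcript levels of the correlated genes. A rough sketch of that idea with a single synthetic transcript, purely to illustrate the fitting and prediction steps (the published model combines four genes and real qPCR data):

```python
import numpy as np

# Synthetic training data: normalized transcript level vs. known PMI (hours).
# Illustrative only; not the study's measurements.
transcript_level = np.array([1.00, 0.92, 0.85, 0.76, 0.70, 0.61, 0.55])
pmi_hours = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 11.0])

# Least-squares fit of PMI = a * level + b.
a, b = np.polyfit(transcript_level, pmi_hours, 1)

new_level = 0.80
print(f"Predicted PMI for normalized level {new_level}: {a * new_level + b:.1f} h")
```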

  20. Influence of Post-Mortem Sperm Recovery Method and Extender on Unstored and Refrigerated Rooster Sperm Variables.

    Science.gov (United States)

    Villaverde-Morcillo, S; Esteso, M C; Castaño, C; Santiago-Moreno, J

    2016-02-01

    Many post-mortem sperm collection techniques have been described for mammalian species, but their use in birds is scarce. This paper compares the efficacy of two post-mortem sperm retrieval techniques - the flushing and float-out methods - in the collection of rooster sperm, in conjunction with the use of two extenders, i.e., L&R-84 medium and Lake 7.1 medium. To determine whether the protective effects of these extenders against refrigeration are different for post-mortem and ejaculated sperm, pooled ejaculated samples (procured via the massage technique) were also diluted in the above extenders. Post-mortem and ejaculated sperm variables were assessed immediately at room temperature (0 h), and after refrigeration at 5°C for 24 and 48 h. The flushing method retrieved more sperm than the float-out method (596.5 ± 75.4 million sperm vs 341.0 ± 87.6 million sperm; p < 0.05); indeed, the number retrieved by the former method was similar to that obtained by massage-induced ejaculation (630.3 ± 78.2 million sperm). For sperm collected by all methods, the L&R-84 medium provided an advantage in terms of sperm motility variables at 0 h. In the refrigerated sperm samples, however, the Lake 7.1 medium was associated with higher percentages of viable sperm, and had a greater protective effect (p < 0.05) with respect to most motility variables. In conclusion, the flushing method is recommended for collecting sperm from dead birds. If this sperm needs to be refrigerated at 5°C until analysis, Lake 7.1 medium is recommended as an extender. © 2015 Blackwell Verlag GmbH.

  1. Post-Mortem Projections: Medieval Mystical Resurrection and the Return of Tupac Shakur

    OpenAIRE

    Spencer-Hall, Alicia

    2012-01-01

    Medieval hagiographies abound with tales of post-mortem visits and miracles by saints. The saint was a powerful religious individual both in life and in death, a conduit of divine grace and lightning rod for Christian fervour. With her post-mortem presence, the presumptive boundary between living and dead, spirit and flesh, is rent apart: showing the reality of the hereafter and shattering the fantasies of the mortal world. The phenomenon of a glorified individual returning to a worshipful co...

  2. Establishing post mortem criteria for the metabolic syndrome: an autopsy based cross-sectional study.

    Science.gov (United States)

    Christensen, Martin Roest; Bugge, Anne; Malik, Mariam Elmegaard; Thomsen, Jørgen Lange; Lynnerup, Niels; Rungby, Jørgen; Banner, Jytte

    2018-01-01

    Individuals who suffer from mental illness are more prone to obesity and related co-morbidities, including the metabolic syndrome. Autopsies provide an outstanding platform for the macroscopic, microscopic and molecular-biological investigation of diseases. Autopsy-based findings may assist in the investigation of the metabolic syndrome. To utilise the vast information that an autopsy encompasses to elucidate the pathophysiology behind the syndrome further, we aimed to both develop and evaluate a method for the post mortem definition of the metabolic syndrome. Based on the nationwide Danish SURVIVE study of deceased mentally ill, we established a set of post mortem criteria for each of the harmonized criteria of the metabolic syndrome. We based the post mortem (PM) evaluation on information from the police reports and the data collected at autopsy, such as anthropometric measurements and biochemical and toxicological analyses (PM information). We compared our PM evaluation with the data from the Danish health registries [ante mortem (AM) information, considered the gold standard] from each individual. The study included 443 deceased individuals (272 male and 171 female) with a mean age of 50.4 (± 15.5) years and a median (interquartile range) post mortem interval of 114 (84-156) hours. We found no significant difference when defining the metabolic syndrome from the PM information in comparison to the AM information (P = 0.175). The PM evaluation yielded a high specificity (0.93) and a moderate sensitivity (0.63) with a moderate level of agreement compared to the AM evaluation (Cohen's κ = 0.51). Neither age nor post mortem interval affected the final results. Our model of a PM definition of the metabolic syndrome proved reliable when compared to the AM information. We believe that an appropriate estimate of the prevalence of the metabolic syndrome can be established post mortem. However, while neither the PM nor the AM information is exhaustive in
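
    The post-mortem (PM) definition is judged against the ante-mortem (AM) registry data, taken as the gold standard, using sensitivity, specificity and Cohen's kappa. A self-contained sketch of those agreement statistics computed from a 2x2 classification table; the counts below are invented and are not the SURVIVE data:

```python
def agreement_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa for a 2x2 classification table.

    tp = PM-positive & AM-positive, fp = PM-positive & AM-negative,
    fn = PM-negative & AM-positive, tn = PM-negative & AM-negative.
    """
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement expected from the marginal totals.
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

# Hypothetical counts, for illustration only.
sens, spec, kappa = agreement_stats(tp=40, fp=10, fn=20, tn=130)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, kappa={kappa:.2f}")
```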

  3. Post-mortem cardiac diffusion tensor imaging: detection of myocardial infarction and remodeling of myofiber architecture

    Energy Technology Data Exchange (ETDEWEB)

    Winklhofer, Sebastian; Berger, Nicole; Stolzmann, Paul [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland); University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); Stoeck, Christian T.; Kozerke, Sebastian [Institute for Biomedical Engineering University and ETH Zurich, Zurich (Switzerland); Thali, Michael [University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); Manka, Robert [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland); Institute for Biomedical Engineering University and ETH Zurich, Zurich (Switzerland); University Hospital Zurich, Clinic for Cardiology, Zurich (Switzerland); Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland)

    2014-11-15

    To investigate the accuracy of post-mortem diffusion tensor imaging (DTI) for the detection of myocardial infarction (MI) and to demonstrate the feasibility of helix angle (HA) calculation to study remodelling of myofibre architecture. Cardiac DTI was performed in 26 deceased subjects prior to autopsy for medicolegal reasons. Fractional anisotropy (FA) and mean diffusivity (MD) were determined. Accuracy was calculated on per-segment (AHA classification), per-territory, and per-patient basis, with pathology as reference standard. HAs were calculated and compared between healthy segments and those with MI. Autopsy demonstrated MI in 61/440 segments (13.9 %) in 12/26 deceased subjects. Healthy myocardial segments had significantly higher FA (p < 0.01) and lower MD (p < 0.001) compared to segments with MI. Multivariate logistic regression demonstrated that FA (p < 0.10) and MD (p = 0.01) with the covariate post-mortem time (p < 0.01) predicted MI with an accuracy of 0.73. Analysis of HA distribution demonstrated remodelling of myofibre architecture, with significant differences between healthy segments and segments with chronic (p < 0.001) but not with acute MI (p > 0.05). Post-mortem cardiac DTI enables differentiation between healthy and infarcted myocardial segments by means of FA and MD. HA assessment allows for the demonstration of remodelling of myofibre architecture following chronic MI. (orig.)
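
    The per-segment analysis above is a multivariate logistic regression of infarction status on FA and MD with post-mortem time as a covariate. A minimal sketch of fitting such a model with scikit-learn on synthetic segment data; the feature values and labels are invented for illustration and are not the study's measurements:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic per-segment features: [FA, MD (10^-3 mm^2/s), post-mortem time (h)].
rng = np.random.default_rng(0)
healthy = np.column_stack([rng.normal(0.35, 0.05, 80),
                           rng.normal(1.0, 0.1, 80),
                           rng.uniform(10, 90, 80)])
infarct = np.column_stack([rng.normal(0.25, 0.05, 20),
                           rng.normal(1.3, 0.1, 20),
                           rng.uniform(10, 90, 20)])
X = np.vstack([healthy, infarct])
y = np.array([0] * 80 + [1] * 20)  # 1 = myocardial infarction at autopsy

model = LogisticRegression(max_iter=1000).fit(X, y)
print("per-segment accuracy on the synthetic data:", model.score(X, y))
```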

  4. Diagnostic accuracy of post-mortem magnetic resonance imaging in fetuses, children and adults: A systematic review

    Energy Technology Data Exchange (ETDEWEB)

    Thayyil, Sudhin, E-mail: s.thayyil@ucl.ac.u [Centre for Cardiovascular Imaging, UCL Institute of Child Health, London (United Kingdom); UCL Institute for Women's Health, London (United Kingdom); Chandrasekaran, Manigandan [UCL Institute for Women's Health, London (United Kingdom); Chitty, Lyn S. [UCL Institute for Women's Health, London (United Kingdom); Clinical Molecular Genetics Unit, UCL Institute of Child Health, London (United Kingdom); Wade, Angie [Medical Statistics, UCL Institute of Child Health (United Kingdom); Skordis-Worrall, Jolene [Centre for International Health and Development, UCL Institute of Child Health (United Kingdom); Bennett-Britton, Ian [Centre for International Health and Development, UCL Institute of Child Health (United Kingdom); Health Economics and Financing Program, London School of Hygiene and Tropical Medicine, London (United Kingdom); Cohen, Marta [Department of Histopathology, Sheffield Children's Hospital, Sheffield (United Kingdom); Withby, Elspeth [Department of Academic Radiology, Sheffield Children's Hospital, Sheffield (United Kingdom); Sebire, Neil J. [Department of Histopathology, UCL Institute of Child Health and Great Ormond Street Hospital for Children, London (United Kingdom); Robertson, Nicola J. [UCL Institute for Women's Health, London (United Kingdom); Taylor, Andrew M. [Centre for Cardiovascular Imaging, UCL Institute of Child Health, London (United Kingdom)

    2010-07-15

    To determine, in a systematic review, the diagnostic accuracy, acceptability and cost-effectiveness of less invasive autopsy by post-mortem MR imaging, in fetuses, children and adults. We searched Medline, Embase, the Cochrane library and reference lists to identify all studies comparing post-mortem MR imaging with conventional autopsy, published between January 1990 and March 2009. 539 abstracts were identified; 15 papers met the inclusion criteria; data from 9 studies were extracted (total: 146 fetuses, 11 children and 24 adults). In accurately identifying the final cause of death or most clinically significant abnormality, post-mortem MR imaging had a sensitivity and specificity of 69% (95% CI 56%, 80%) and 95% (95% CI 88%, 98%) in fetuses, and 28% (95% CI 13%, 47%) and 64% (95% CI 23%, 94%) in children and adults, respectively; however the published data is limited to small, heterogeneous and poorly designed studies. Insufficient data is available on acceptability and economic evaluation of post-mortem MR imaging. Well designed, large, prospective studies are required to evaluate the accuracy of post-mortem MR imaging, before it can be offered as a clinical tool.

  5. Diagnostic accuracy of post-mortem magnetic resonance imaging in fetuses, children and adults: A systematic review

    International Nuclear Information System (INIS)

    Thayyil, Sudhin; Chandrasekaran, Manigandan; Chitty, Lyn S.; Wade, Angie; Skordis-Worrall, Jolene; Bennett-Britton, Ian; Cohen, Marta; Withby, Elspeth; Sebire, Neil J.; Robertson, Nicola J.; Taylor, Andrew M.

    2010-01-01

    To determine, in a systematic review, the diagnostic accuracy, acceptability and cost-effectiveness of less invasive autopsy by post-mortem MR imaging, in fetuses, children and adults. We searched Medline, Embase, the Cochrane library and reference lists to identify all studies comparing post-mortem MR imaging with conventional autopsy, published between January 1990 and March 2009. 539 abstracts were identified; 15 papers met the inclusion criteria; data from 9 studies were extracted (total: 146 fetuses, 11 children and 24 adults). In accurately identifying the final cause of death or most clinically significant abnormality, post-mortem MR imaging had a sensitivity and specificity of 69% (95% CI 56%, 80%) and 95% (95% CI 88%, 98%) in fetuses, and 28% (95% CI 13%, 47%) and 64% (95% CI 23%, 94%) in children and adults, respectively; however the published data is limited to small, heterogeneous and poorly designed studies. Insufficient data is available on acceptability and economic evaluation of post-mortem MR imaging. Well designed, large, prospective studies are required to evaluate the accuracy of post-mortem MR imaging, before it can be offered as a clinical tool.
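
    The pooled sensitivities and specificities above are proportions reported with 95% confidence intervals. A short sketch of computing a proportion and its Wilson score 95% interval from raw counts; the counts used here are placeholders, not the extracted study data:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Proportion with its Wilson score 95% confidence interval."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, centre - half, centre + half

# Hypothetical counts: correctly identified causes of death among autopsy-confirmed cases.
p, lo, hi = wilson_ci(successes=45, n=60)
print(f"sensitivity = {p:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```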

  6. Deuterium Inventory in Tore Supra (DITS): 2nd post-mortem analysis campaign and fuel retention in the gaps

    International Nuclear Information System (INIS)

    Dittmar, T.; Tsitrone, E.; Pegourie, B.; Cadez, I.; Pelicon, P.; Gauthier, E.; Languille, P.; Likonen, J.; Litnovsky, A.; Markelj, S.; Martin, C.; Mayer, M.; Pascal, J.-Y.; Pardanaud, C.; Philipps, V.; Roth, J.; Roubin, P.; Vavpetic, P.

    2011-01-01

    A dedicated study on fuel retention has been launched in Tore Supra, which includes a D wall-loading campaign and the dismantling of the main limiter (Deuterium Inventory in Tore Supra, DITS project). This paper presents new results from a second post-mortem analysis campaign on 40 tiles with special emphasis on the D retention in the gaps. SIMS analysis reveals that only 1/3 of the thickness of deposits in the plasma shadowed zones is due to the DITS wall-loading campaign. As pre-DITS deposits contain less D than DITS deposits, the contribution of DITS to the D inventory is about 30-50%. The new estimate for the total amount of D retained in the Tore Supra limiter is 1.7 x 10^24 atoms, close to the previous estimate, with the gap surfaces contributing about 33%. NRA measurements show a stepped decrease of D along the gap with strong asymmetries between different gap orientations.

  7. Post-mortem examination and sampling of African flamingos ...

    African Journals Online (AJOL)

    Recent largely unexplained deaths in African flamingos have prompted the need for standard, reproducible methods for the post-mortem examination of these birds, for the taking of samples and for the recording of findings. Here we describe suitable techniques and present three distinct protocols for field-based ...

  8. 9 CFR 381.71 - Condemnation on ante mortem inspection.

    Science.gov (United States)

    2010-01-01

    ... dressed, nor shall they be conveyed into any department of the official establishment where poultry... AGRICULTURE AGENCY ORGANIZATION AND TERMINOLOGY; MANDATORY MEAT AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS Ante Mortem Inspection § 381.71...

  9. Usefulness of post mortem computed tomography versus conventional forensic autopsy of road accident victims (drivers and passengers).

    Science.gov (United States)

    Moskała, Artur; Woźniak, Krzysztof; Kluza, Piotr; Romaszko, Karol; Lopatin, Oleksiy

    2017-01-01

    Aim of the study: Deaths of in-vehicle victims (drivers and passengers) of road accidents represent a significant group of issues addressed by forensic medicine. Expressing opinions in this regard involves first of all the determination of the cause of death and the forensic pathologist's participation in the process of road accident reconstruction through defining the mechanism of bodily harm. The scope of the opinion as well as its accuracy and degree of detail largely depend on the scope of forensic autopsy. In this context, techniques that broaden the capabilities of standard autopsy are of particular importance. This paper compares the results of post mortem computed tomography (PMCT) of road accident victims (drivers and passengers) against the results of standard examination in order to determine the scope to which PMCT significantly enhances autopsy capabilities. Material and methods: The analysis covers 118 in-vehicle victims (drivers and passengers) examined from 2012 to 2014. In each case, post-mortem examination was preceded by PMCT examination using Somatom Emotion 16 (Siemens AG, Germany). Results: The results are presented in a tabular form. Conclusions: In most road accident victims (drivers and passengers), post mortem computed tomography significantly increases the results' degree of detail, particularly with regard to injuries of bones and gas collections.

  10. Corroboration of in utero MRI using post-mortem MRI and autopsy in foetuses with CNS abnormalities

    International Nuclear Information System (INIS)

    Whitby, E.H.; Variend, S.; Rutter, S.; Paley, M.N.J.; Wilkinson, I.D.; Davies, N.P.; Sparey, C.; Griffiths, P.D.

    2004-01-01

    AIMS: To corroborate the findings of in utero magnetic resonance imaging (MRI) with autopsy and post-mortem MRI in cases of known or suspected central nervous system (CNS) abnormalities on ultrasound and to compare the diagnostic accuracy of ante-natal ultrasound and in utero MRI. METHODS: Twelve pregnant women, whose foetuses had suspected central nervous system abnormalities, underwent in utero MRI. The foetuses were imaged using MRI before autopsy. The data were used to evaluate the diagnostic accuracy of in utero MRI when compared with a reference standard of autopsy and post-mortem MRI in 10 cases and post-mortem MRI alone in two cases. RESULTS: The diagnostic accuracy of antenatal ultrasound and in utero MRI in correctly characterizing brain and spine abnormalities was 42% and 100%, respectively. CONCLUSION: In utero MRI provides a useful adjuvant to antenatal ultrasound when assessing CNS abnormalities by providing more accurate anatomical information. Post-mortem MRI assists the diagnosis of macroscopic structural abnormalities.

  11. Fatty kidney diagnosed by post-mortem computed tomography

    DEFF Research Database (Denmark)

    Leth, P. M.

    2016-01-01

    Subnuclear vacuolization of the renal tubular epithelium is indicative of diabetic and alcoholic ketoacidosis and has also been proposed as a postmortem marker for hypothermia. We present for the first time a fatal case of ketoacidosis in combination with exposure where a suspicion of these diagnoses was raised by a marked radiolucency of the kidneys at post-mortem computed tomography (PMCT). © 2015 Elsevier Ltd.

  12. Differential Nuclear and Mitochondrial DNA Preservation in Post-Mortem Teeth with Implications for Forensic and Ancient DNA Studies

    Science.gov (United States)

    Higgins, Denice; Rohrlach, Adam B.; Kaidonis, John; Townsend, Grant; Austin, Jeremy J.

    2015-01-01

    Major advances in genetic analysis of skeletal remains have been made over the last decade, primarily due to improvements in post-DNA-extraction techniques. Despite this, a key challenge for DNA analysis of skeletal remains is the limited yield of DNA recovered from these poorly preserved samples. Enhanced DNA recovery by improved sampling and extraction techniques would allow further advancements. However, little is known about the post-mortem kinetics of DNA degradation and whether the rate of degradation varies between nuclear and mitochondrial DNA or across different skeletal tissues. This knowledge, along with information regarding ante-mortem DNA distribution within skeletal elements, would inform sampling protocols facilitating development of improved extraction processes. Here we present a combined genetic and histological examination of DNA content and rates of DNA degradation in the different tooth tissues of 150 human molars over short-medium post-mortem intervals. DNA was extracted from coronal dentine, root dentine, cementum and pulp of 114 teeth via a silica column method and the remaining 36 teeth were examined histologically. Real time quantification assays based on two nuclear DNA fragments (67 bp and 156 bp) and one mitochondrial DNA fragment (77 bp) showed nuclear and mitochondrial DNA degraded exponentially, but at different rates, depending on post-mortem interval and soil temperature. In contrast to previous studies, we identified differential survival of nuclear and mtDNA in different tooth tissues. Furthermore, histological examination showed pulp and dentine were rapidly affected by loss of structural integrity, and pulp was completely destroyed in a relatively short time period. Conversely, cementum showed little structural change over the same time period. Finally, we confirm that targeted sampling of cementum from teeth buried for up to 16 months can provide a reliable source of nuclear DNA for STR-based genotyping using standard
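
    The quantification assays indicate that nuclear and mitochondrial DNA decay exponentially with post-mortem interval, at rates that differ by tissue, target and temperature. A sketch of estimating such a decay constant by linear regression on log-transformed yields, using synthetic values rather than the study's measurements:

```python
import numpy as np

# Synthetic DNA yields (ng) at increasing post-mortem intervals (months); illustrative only.
pmi_months = np.array([1, 2, 4, 8, 12, 16], dtype=float)
dna_yield = np.array([50.0, 38.0, 22.0, 8.5, 3.1, 1.2])

# Model: yield = N0 * exp(-k * t)  =>  log(yield) = log(N0) - k * t
slope, intercept = np.polyfit(pmi_months, np.log(dna_yield), 1)
k, n0 = -slope, np.exp(intercept)
print(f"decay constant k = {k:.3f} per month, half-life = {np.log(2) / k:.1f} months")
```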

  13. The social life of the dead: The role of post-mortem examinations in medical student socialisation.

    Science.gov (United States)

    Goodwin, Dawn; Machin, Laura; Taylor, Adam

    2016-07-01

    Dissection has held a privileged position in medical education although the professional values it inculcates have been subject to intense debate. Claims vary from it generating a dehumanising level of emotional detachment, to promotion of rational and dispassionate decision-making, even to being a positive vehicle for ethical education. Social scientists have positioned dissection as a critical experience in the emotional socialisation of medical students. However, curricular revision has provoked debate about the style and quantity of anatomy teaching thus threatening this 'rite of passage' of medical students. Consequently, some UK medical schools do not employ dissection at all. In its place, observation of post-mortem examinations - a long established, if underutilised, practice - has re-emerged in an attempt to recoup aspects of anatomical knowledge that are arguably lost when dissection is omitted. Bodies for post-mortem examinations and bodies for dissection, however, have striking differences, meaning that post-mortem examinations and dissection cannot be considered comparable opportunities to learn anatomy. In this article, we explore the distinctions between dissection and post-mortem examinations. In particular, we focus on the absence of a discourse of consent, concerns about bodily integrity, how the body's shifting ontology, between object and person, disrupts students' attempts to distance themselves, and how the observation of post-mortem examinations features in the emotional socialisation of medical students. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Diagnosis of drowning using post-mortem computed tomography – state of the art

    Directory of Open Access Journals (Sweden)

    Catherine Raux

    2014-12-01

    Full Text Available Aim of the study: Recent studies using post-mortem computed tomography (PMCT) have suggested this imaging modality is of value in the positive diagnosis of drowning. We summarize the data from the literature regarding the diagnostic value of CT in cases of drowning. Material and methods: We performed an all-language search of literature published from 1999 to 2013 with the key words “post-mortem CT scan”, “drowning and CT scan”, “near-drowning diagnosis”, and “drowning diagnosis”. Results: Only 11 articles, whose data enabled complementary statistical analysis, were included. The presence of fluid and sediment in paranasal sinuses appear to be the determinants of the diagnosis of drowning. The presence of fluid in the sinuses had a sensitivity of 100%, and of 90% in the trachea and main bronchi. The results were completed by the high specificity of the presence of sediment in the paranasal sinuses, upper airways and stomach, which was 100% for all three. Haemodilution was present in cases of drowning (p < 0.001). The values made it possible to formulate a decision algorithm for the diagnosis of drowning.
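
    The pooled findings suggest combining highly sensitive signs (fluid in the paranasal sinuses and airways) with highly specific signs (sediment in the sinuses, airways or stomach). The sketch below is a hedged, illustrative triage rule in that spirit; the field names and the ordering of checks are assumptions for illustration, not the decision algorithm published by the authors:

```python
def drowning_support(findings):
    """Crude triage inspired by the pooled PMCT findings; illustrative only."""
    # Highly sensitive signs: their absence argues against drowning.
    sensitive = findings.get("fluid_in_paranasal_sinuses") or findings.get("fluid_in_airways")
    # Highly specific signs: their presence strongly supports drowning.
    specific = (findings.get("sediment_in_sinuses")
                or findings.get("sediment_in_airways")
                or findings.get("sediment_in_stomach"))
    if not sensitive:
        return "drowning unlikely (highly sensitive findings absent)"
    if specific:
        return "findings strongly support drowning (highly specific sign present)"
    return "findings compatible with drowning; correlate with autopsy and history"

print(drowning_support({"fluid_in_paranasal_sinuses": True, "sediment_in_sinuses": True}))
```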

  15. Routine perinatal and paediatric post-mortem radiography: detection rates and implications for practice

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [NHS Foundation Trust, Department of Radiology Great Ormond Street Hospital for Children, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Calder, Alistair D. [NHS Foundation Trust, Department of Radiology Great Ormond Street Hospital for Children, London (United Kingdom); Kiho, Liina [Camelia Botnar Laboratories Great Ormond Street Hospital for Children, Department of Paediatric Pathology, London (United Kingdom); Taylor, Andrew M. [Great Ormond Street Hospital for Children, Cardiorespiratory Unit, London (United Kingdom); UCL Institute of Cardiovascular Science, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Sebire, Neil J. [Camelia Botnar Laboratories Great Ormond Street Hospital for Children, Department of Paediatric Pathology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom)

    2014-03-15

    Routine perinatal and paediatric post-mortem plain radiography allows for the diagnosis and assessment of skeletal dysplasias, fractures and other bony abnormalities. The aim of this study was to review the diagnostic yield of this practice. We identified 1,027 cases performed in a single institution over a 2½-year period, including babygrams (whole-body examinations) and full skeletal surveys. Images were reported prior to autopsy in all cases. Radiology findings were cross-referenced with the autopsy findings using an autopsy database. We scored each case from 0 to 4 according to the level of diagnostic usefulness. The overall abnormality rate was 126/1,027 (12.3%). There was a significantly higher rate of abnormality when a skeletal survey was performed (18%) rather than a babygram (10%; P < 0.01); 90% (665/739) of babygrams were normal. Of the 74 abnormal babygrams, we found 33 incidental non-contributory cases, 19 contributory, 20 diagnostic, and 2 false-positive cases. There were only 2 cases out of 739 (0.27%) in whom routine post-mortem imaging identified potentially significant abnormalities that would not have been detected if only selected imaging had been performed. A policy of performing selected, rather than routine, foetal post-mortem radiography could result in a significant cost saving. Routine post-mortem paediatric radiography in foetuses and neonates is neither diagnostically useful nor cost-effective. A more evidence-based, selective protocol should yield significant cost savings. (orig.)

  16. Routine perinatal and paediatric post-mortem radiography: detection rates and implications for practice

    International Nuclear Information System (INIS)

    Arthurs, Owen J.; Calder, Alistair D.; Kiho, Liina; Taylor, Andrew M.; Sebire, Neil J.

    2014-01-01

    Routine perinatal and paediatric post-mortem plain radiography allows for the diagnosis and assessment of skeletal dysplasias, fractures and other bony abnormalities. The aim of this study was to review the diagnostic yield of this practice. We identified 1,027 cases performed in a single institution over a 2½-year period, including babygrams (whole-body examinations) and full skeletal surveys. Images were reported prior to autopsy in all cases. Radiology findings were cross-referenced with the autopsy findings using an autopsy database. We scored each case from 0 to 4 according to the level of diagnostic usefulness. The overall abnormality rate was 126/1,027 (12.3%). There was a significantly higher rate of abnormality when a skeletal survey was performed (18%) rather than a babygram (10%; P < 0.01); 90% (665/739) of babygrams were normal. Of the 74 abnormal babygrams, we found 33 incidental non-contributory cases, 19 contributory, 20 diagnostic, and 2 false-positive cases. There were only 2 cases out of 739 (0.27%) in whom routine post-mortem imaging identified potentially significant abnormalities that would not have been detected if only selected imaging had been performed. A policy of performing selected, rather than routine, foetal post-mortem radiography could result in a significant cost saving. Routine post-mortem paediatric radiography in foetuses and neonates is neither diagnostically useful nor cost-effective. A more evidence-based, selective protocol should yield significant cost savings. (orig.)

  17. Tissues from equine cadaver ligaments up to 72 hours of post-mortem: a promising reservoir of stem cells.

    Science.gov (United States)

    Shikh Alsook, Mohamad Khir; Gabriel, Annick; Piret, Joëlle; Waroux, Olivier; Tonus, Céline; Connan, Delphine; Baise, Etienne; Antoine, Nadine

    2015-12-18

    Mesenchymal stem cells (MSCs) harvested from cadaveric tissues represent a promising approach for regenerative medicine. To date, no study has investigated whether viable MSCs could survive in cadaveric tissues from tendon or ligament up to 72 hours of post-mortem. The purpose of the present work was to find out if viable MSCs could survive in cadaveric tissues from adult equine ligaments up to 72 hours of post-mortem, and to assess their ability (i) to remain in an undifferentiated state and (ii) to divide and proliferate in the absence of any specific stimulus. MSCs were isolated from equine cadaver (EC) suspensory ligaments within 48-72 hours of post-mortem. They were evaluated for viability, proliferation, capacity for tri-lineage differentiation, expression of cell surface markers (CD90, CD105, CD73, CD45), pluripotent transcription factor (OCT-4), stage-specific embryonic antigen-1 (SSEA-1), neuron-specific class III beta-tubulin (TUJ-1), and glial fibrillary acidic protein (GFAP). They were also characterized by transmission electron microscopy (TEM). EC-MSCs were successfully isolated and maintained for 20 passages with high cell viability and proliferation. Phase contrast microscopy revealed that cells with fibroblast-like appearance were predominant in the culture. Differentiation assays proved that EC-MSCs are able to differentiate towards mesodermal lineages (osteogenic, adipogenic, chondrogenic). Flow cytometry analysis demonstrated that EC-MSCs expressed CD90, CD105, and CD73, while being negative for the leukocyte common antigen CD45. Immunofluorescence analysis showed a high percentage of positive cells for OCT-4 and SSEA-1. Surprisingly, in the absence of any stimuli, some adherent cells closely resembling neuronal and glial morphology were also observed. Interestingly, our results revealed that approximately 15% of the cell populations were TUJ-1 positive, whereas GFAP expression was detected in only a few cells. Furthermore, TEM analysis

  18. Forensic Identification of Decomposed Human Body through Comparison between Ante-Mortem and Post-Mortem CT Images of Frontal Sinuses: Case Report

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira Silva

    2017-01-01

    Full Text Available Objective: The aim of this paper is to report on a case of positive human identification of a decomposed body after the comparison of ante-mortem (AM) and post-mortem (PM) computed tomography images of the frontal sinus. Case report: An unknown, highly decomposed human body, aged between 30 and 40 years, was found in a forest region in Brazil. The dental autopsy revealed several teeth missing AM and the presence of removable partial prostheses. The search for AM data resulted in a sequence of 20 axial images of the paranasal sinuses obtained by Multislice Computed Tomography (MSCT). PM reproduction of the MSCT images was performed in order to enable a comparative identification. After a direct confrontation between AM/PM MSCT, the data were collected for morphological findings, specifically for the lateral expansion of the left lobe, the anteroposterior dimension, and the position of median and accessory septa of the sinuses. Conclusion: The importance of storing and interpreting radiographic medical data properly is highlighted in this text, thus pointing out the importance of application of forensic radiology in the field of law.

  19. Diagnostic accuracy of post-mortem CT with targeted coronary angiography versus autopsy for coroner-requested post-mortem investigations: a prospective, masked, comparison study.

    Science.gov (United States)

    Rutty, Guy N; Morgan, Bruno; Robinson, Claire; Raj, Vimal; Pakkal, Mini; Amoroso, Jasmin; Visser, Theresa; Saunders, Sarah; Biggs, Mike; Hollingbury, Frances; McGregor, Angus; West, Kevin; Richards, Cathy; Brown, Laurence; Harrison, Rebecca; Hew, Roger

    2017-07-08

    England and Wales have one of the highest frequencies of autopsy in the world. Implementation of post-mortem CT (PMCT), enhanced with targeted coronary angiography (PMCTA), in adults to avoid invasive autopsy would have cultural, religious, and potential economic benefits. We aimed to assess the diagnostic accuracy of PMCTA as a first-line technique in post-mortem investigations. In this single-centre (Leicester, UK), prospective, controlled study, we selected cases of natural and non-suspicious unnatural death referred to Her Majesty's (HM) Coroners. We excluded cases younger than 18 years, known to have had a transmittable disease, or who weighed more than 125 kg. Each case was assessed by PMCTA, followed by autopsy. Pathologists were masked to the PMCTA findings, unless a potential risk was shown. The primary endpoint was the accuracy of the cause of death diagnosis from PMCTA against a gold standard of autopsy findings, modified by PMCTA findings only if additional substantially incontrovertible findings were identified. Between Jan 20, 2010, and Sept 13, 2012, we selected 241 cases, for which PMCTA was successful in 204 (85%). Seven cases were excluded from the analysis because of procedural unmasking or no autopsy data, as were 24 cases with a clear diagnosis of traumatic death before investigation; 210 cases were included. In 40 (19%) cases, predictable toxicology or histology testing accessible by PMCT informed the result. PMCTA provided a cause of death in 193 (92%) cases. A major discrepancy with the gold standard was noted in 12 (6%) cases identified by PMCTA, and in nine (5%) cases identified by autopsy (because of specific findings on PMCTA). The frequency of autopsy and PMCTA discrepancies were not significantly different (p=0·65 for major discrepancies and p=0·21 for minor discrepancies). Cause of death given by PMCTA did not overlook clinically significant trauma, occupational lung disease, or reportable disease, and did not significantly affect

  20. Influence of operational condition on lithium plating for commercial lithium-ion batteries – Electrochemical experiments and post-mortem-analysis

    International Nuclear Information System (INIS)

    Ecker, Madeleine; Shafiei Sabet, Pouyan; Sauer, Dirk Uwe

    2017-01-01

    Highlights: •Investigation of lithium plating to support reliable system integration. •Influence of operational conditions at low temperature on lithium plating. •Comparison of different lithium-ion battery technologies. •Large differences in low-temperature behaviour for different technologies. •Post-mortem analysis reveals inhomogeneous deposition of metallic lithium. -- Abstract: The lifetime and safety of lithium-ion batteries are key requirements for the successful market introduction of electric mobility. Charging at low temperature and fast charging in particular, both known to provoke lithium plating, are important issues for automotive engineers. Lithium plating, which leads both to ageing and to safety risks, is known to play a crucial role in the system design of the application. To gain knowledge of the different factors influencing lithium plating, low-temperature ageing tests are performed in this work. Commercial lithium-ion batteries of various types are tested under various operational conditions such as temperature, current, state of charge, charging strategy and state of health. To analyse the ageing behaviour, capacity fade and resistance increase are tracked over lifetime. The results of this large experimental survey on lithium plating provide support for the design of operation strategies for implementation in battery management systems. To further investigate the underlying degradation mechanisms, differential voltage curves and impedance spectra are analysed and a post-mortem analysis of anode degradation is performed for a selected technology. The results confirm the deposition of metallic lithium or lithium compounds in the porous structure and suggest a strongly inhomogeneous deposition over the electrode thickness, with a dense deposition layer close to the separator for the considered cell. It is shown that this inhomogeneous deposition can even lead to loss of active material. The plurality of the investigated technologies

  1. 42 CFR 35.16 - Autopsies and other post-mortem operations.

    Science.gov (United States)

    2010-10-01

    ... AND EXAMINATIONS HOSPITAL AND STATION MANAGEMENT General § 35.16 Autopsies and other post-mortem... to in writing by a person authorized under the law of the State in which the station or hospital is... made a part of the clinical record. [25 FR 6331, July 6, 1960] ...

  2. Post-mortem whole-body magnetic resonance imaging of human fetuses: a comparison of 3-T vs. 1.5-T MR imaging with classical autopsy

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Xin; Bevilacqua, Elisa; Cos Sanchez, Teresa; Jani, Jacques C. [University Hospital Brugmann, Universite Libre de Bruxelles, Department of Obstetrics and Gynecology, Fetal Medicine Unit, Brussels (Belgium); Cannie, Mieke M. [University Hospital Brugmann, Universite Libre de Bruxelles, Department of Radiology, Brussels (Belgium); Vrije Universiteit Brussel, Department of Radiology, UZ Brussel, Brussels (Belgium); Arthurs, Owen J.; Sebire, Neil J. [Great Ormond Street Hospital for Children NHS Foundation Trust, London (United Kingdom); UCL Institute of Child Health, London (United Kingdom); Segers, Valerie; Fourneau, Catherine [University Hospital Brugmann, Universite Libre de Bruxelles, Department of Fetopathology, Brussels (Belgium)

    2017-08-15

    To prospectively compare diagnostic accuracy of fetal post-mortem whole-body MRI at 3-T vs. 1.5-T. Between 2012 and 2015, post-mortem MRI at 1.5-T and 3-T was performed in fetuses after miscarriage/stillbirth or termination. Clinical MRI diagnoses were assessed using a confidence diagnostic score and compared with classical autopsy to derive a diagnostic error score. The relation of diagnostic error for each organ group with gestational age was calculated, and 1.5-T was compared with 3-T using accuracy analysis. 135 fetuses at 12-41 weeks underwent post-mortem MRI (followed by conventional autopsy in 92 fetuses). For all organ groups except the brain, and for both modalities, the diagnostic error decreased with gestation (P < 0.0001). 3-T MRI diagnostic error was significantly lower than that of 1.5-T for all anatomic structures and organ groups, except the orbits and brain. This difference was maintained for fetuses <20 weeks gestation. Moreover, 3-T was associated with fewer non-diagnostic scans and greater concordance with classical autopsy than 1.5-T MRI, especially for the thorax, heart and abdomen in fetuses <20 weeks. Post-mortem fetal 3-T MRI improves confidence scores and overall accuracy compared with 1.5-T, mainly for the thorax, heart and abdomen of fetuses <20 weeks of gestation. (orig.)

  3. Post mortem examination report concerning Nadim Nuwwara

    DEFF Research Database (Denmark)

    Leth, Peter Mygind

    2014-01-01

    Post mortem examination report concerning Nadim Nuwwara, 17 years old, who was killed on May 15, 2014 in Beitunia near Ramallah, Palestine. The examination was performed by an international team consisting of Dr. Saber Al-Aloul, director of the Medico Legal Institute at Quds University, Dr. Marc A. Krouse, Deputy Chief Medical Examiner, Office of Chief Medical Examiner, Fort Worth, Texas, USA, Dr. Chen Kugel, Chief Forensic Pathologist, Abu Kabir Institute of Forensic Medicine, Tel Aviv, Dr. Ricardo Pablo Nachman, forensic expert at Abu Kabir Institute of Forensic Medicine, Tel Aviv, and Dr. Peter Mygind Leth.

  4. Post-mortem hemoparasite detection in free-living Brazilian brown brocket deer (Mazama gouazoubira, Fischer 1814).

    Science.gov (United States)

    Silveira, Júlia Angélica Gonçalves da; Rabelo, Elida Mara Leite; Lima, Paula Cristina Senra; Chaves, Bárbara Neves; Ribeiro, Múcio Flávio Barbosa

    2014-01-01

    Tick-borne infections can result in serious health problems for wild ruminants, and some of these infectious agents can be considered zoonosis. The aim of the present study was the post-mortem detection of hemoparasites in free-living Mazama gouazoubira from Minas Gerais state, Brazil. The deer samples consisted of free-living M. gouazoubira (n = 9) individuals that died after capture. Necropsy examinations of the carcasses were performed to search for macroscopic alterations. Organ samples were collected for subsequent imprint slides, and nested PCR assays were performed to detect hemoparasite species. Imprint slide assays from four deer showed erythrocytes infected with Piroplasmida small trophozoites, and A. marginale corpuscles were observed in erythrocytes from two animals. A. marginale and trophozoite co-infections occurred in two deer. A nested PCR analysis of the organs showed that six of the nine samples were positive for Theileria sp., five were positive for A. phagocytophilum and three were positive for A. marginale, with co-infection occurring in four deer. The results of the present study demonstrate that post-mortem diagnostics using imprint slides and molecular assays are an effective method for detecting hemoparasites in organs.

  5. Method for modeling post-mortem biometric 3D fingerprints

    Science.gov (United States)

    Rajeev, Srijith; Shreyas, Kamath K. M.; Agaian, Sos S.

    2016-05-01

    Despite the advancements of fingerprint recognition in 2-D and 3-D domain, authenticating deformed/post-mortem fingerprints continue to be an important challenge. Prior cleansing and reconditioning of the deceased finger is required before acquisition of the fingerprint. The victim's finger needs to be precisely and carefully operated by a medium to record the fingerprint impression. This process may damage the structure of the finger, which subsequently leads to higher false rejection rates. This paper proposes a non-invasive method to perform 3-D deformed/post-mortem finger modeling, which produces a 2-D rolled equivalent fingerprint for automated verification. The presented novel modeling method involves masking, filtering, and unrolling. Computer simulations were conducted on finger models with different depth variations obtained from Flashscan3D LLC. Results illustrate that the modeling scheme provides a viable 2-D fingerprint of deformed models for automated verification. The quality and adaptability of the obtained unrolled 2-D fingerprints were analyzed using NIST fingerprint software. Eventually, the presented method could be extended to other biometric traits such as palm, foot, tongue etc. for security and administrative applications.
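
    The unrolling step described above can be pictured as mapping points on a roughly cylindrical finger surface onto a flat (theta, z) plane. The sketch below is a minimal illustration of that cylindrical-unwrap idea under idealised assumptions; it is not the authors' modeling method, and the array shapes, grid sizes and the synthetic ridge pattern are invented for demonstration (Python/NumPy).

        # Minimal sketch of "unrolling" a 3-D finger surface into a 2-D image.
        # Assumes an idealised cylindrical finger; NOT the method of the paper,
        # only an illustration of mapping (x, y, z) points onto a (theta, z) grid.
        import numpy as np

        def unroll_cylinder(points, intensities, n_theta=360, n_z=200):
            """Project 3-D surface points with per-point intensities onto a
            2-D (theta, z) grid, i.e. a rolled-equivalent image."""
            x, y, z = points.T
            theta = np.arctan2(y, x)                       # angle around the finger axis
            t_idx = ((theta + np.pi) / (2 * np.pi) * (n_theta - 1)).astype(int)
            z_idx = ((z - z.min()) / (np.ptp(z) + 1e-9) * (n_z - 1)).astype(int)
            image = np.zeros((n_z, n_theta))
            counts = np.zeros((n_z, n_theta))
            np.add.at(image, (z_idx, t_idx), intensities)  # accumulate intensities
            np.add.at(counts, (z_idx, t_idx), 1.0)
            return image / np.maximum(counts, 1.0)         # average per grid cell

        if __name__ == "__main__":
            rng = np.random.default_rng(0)                 # synthetic cylinder demo
            theta = rng.uniform(-np.pi, np.pi, 50_000)
            z = rng.uniform(0.0, 20.0, 50_000)
            pts = np.column_stack([np.cos(theta), np.sin(theta), z])
            ridges = np.sin(8 * theta + z)                 # fake ridge/valley signal
            print(unroll_cylinder(pts, ridges).shape)      # (200, 360)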

  6. Reflexiones Acerca del Papel de la Mujer en la Reproducción Artificial Post Mortem (Analysis of the Role of Women in the Posthumous Reproduction)

    Directory of Open Access Journals (Sweden)

    Alma Marìa Rodrìguez Guitián

    2017-03-01

    Full Text Available This contribution focuses on the analysis of the role of women in posthumous reproduction. First, it studies whether a woman's right to have children should be subject to limits and, if so, to which ones. Second, it explores whether posthumous reproduction extends to lesbian couples, married or not, and finally it focuses on the relevance of the gestational mother's will in deciding whether the deceased is to be registered as the parent of the child. DOWNLOAD THIS PAPER FROM SSRN: https://ssrn.com/abstract=2921870

  7. Post-mortem whole-body magnetic resonance imaging of human fetuses: a comparison of 3-T vs. 1.5-T MR imaging with classical autopsy.

    Science.gov (United States)

    Kang, Xin; Cannie, Mieke M; Arthurs, Owen J; Segers, Valerie; Fourneau, Catherine; Bevilacqua, Elisa; Cos Sanchez, Teresa; Sebire, Neil J; Jani, Jacques C

    2017-08-01

    To prospectively compare diagnostic accuracy of fetal post-mortem whole-body MRI at 3-T vs. 1.5-T. Between 2012 and 2015, post-mortem MRI at 1.5-T and 3-T was performed in fetuses after miscarriage/stillbirth or termination. Clinical MRI diagnoses were assessed using a confidence diagnostic score and compared with classical autopsy to derive a diagnostic error score. The relation of diagnostic error for each organ group with gestational age was calculated, and 1.5-T was compared with 3-T using accuracy analysis. 135 fetuses at 12-41 weeks underwent post-mortem MRI (followed by conventional autopsy in 92 fetuses). For all organ groups except the brain, and for both modalities, the diagnostic error decreased with gestation (P < 0.0001). 3-T MRI diagnostic error was significantly lower than that of 1.5-T for all anatomic structures and organ groups, except the orbits and brain, and this difference was maintained for fetuses < 20 weeks gestation, with fewer non-diagnostic scans and greater concordance with classical autopsy than 1.5-T MRI, especially for the thorax, heart and abdomen in fetuses < 20 weeks. • Overall accuracy of PM-MRI compared with classical autopsy increases with 3-T. • PM-MRI using 3-T is particularly interesting for thoracic and abdominal organs. • PM-MRI using 3-T is particularly interesting for fetuses < 20 weeks' gestation.

  8. Experience with post-mortem computed tomography in Southern Denmark 2006-11

    DEFF Research Database (Denmark)

    Leth, Peter Mygind

    2013-01-01

    Objectives: (1) To explore the ability of post-mortem computed tomography (PMCT) to establish the cause of death. (2) To investigate the inter-method variation between autopsy and PMCT. (3) To investigate whether PMCT can select cases for autopsy. (4) To investigate the importance of histology...

  9. Post-mortem changes in the physical meat quality characteristics of ...

    African Journals Online (AJOL)

    ... apparatus) of the muscle generally improved with time. The quadratic equation y = -0.0817x² + 0.4468x + 10.477 best described (R² = 0.32) this improvement in tenderness. The implication of this result is that fresh game meat producers can de-bone carcasses after 24 hours post mortem and leave the primal cuts to age ...
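
    Read as a regression of tenderness (y) on time post mortem (x), the fitted quadratic can be evaluated directly. The snippet below only evaluates the equation quoted in the abstract; the units and exact meaning of x and y are not stated there and are assumed for illustration, and the low R² (0.32) means the modelled trend is weak (Python).

        # Evaluate the fitted quadratic y = -0.0817*x**2 + 0.4468*x + 10.477.
        # The interpretation of x (time post mortem) and y (tenderness measure)
        # and their units are assumptions; the abstract does not specify them.

        def tenderness(x: float) -> float:
            return -0.0817 * x**2 + 0.4468 * x + 10.477

        if __name__ == "__main__":
            for x in range(0, 6):
                print(f"x = {x}: y = {tenderness(x):.2f}")
            x_turn = 0.4468 / (2 * 0.0817)        # vertex of the parabola
            print(f"modelled trend turns over at x = {x_turn:.2f}")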

  10. The effect of stress and exercise on post-mortem biochemistry of Atlantic salmon and rainbow trout

    DEFF Research Database (Denmark)

    Thomas, P.M.; Pankhurst, N.W.; Bremner, Allan

    1999-01-01

    Freshwater Atlantic salmon Salmo salar and rainbow trout Oncorhynchus mykiss responded similarly to increase in water flow (exercise), reduction in holding tank water level (stress), or 30 min chasing with water level reduction (stress and exercise). Stress generally resulted in elevated plasma c...... and exercise, results in mostly transient changes in post-mortem muscle biochemistry. These changes lead to an earlier onset and resolution of rigor, and lower post-mortem muscle pH in comparison to the control. (C) 1999 The Fisheries Society of the British Isles...

  11. Diagnosis of porcine enzootic pneumonia by post mortem sanitary inspection: comparison with other diagnostic methods

    OpenAIRE

    Kênia de Fátima Carrijo; Elmiro Rosendo do Nascimento; Virginia Léo de Almeida Pereira; Nelson Morés; Catia Silene Klein; Leonardo Muliterno Domingues; Rogerio Tortelly

    2014-01-01

    ABSTRACT. Carrijo K.F., Nascimento E.R., Pereira V.L.A., Morés N., Klein, C.S., Domingues L.M. & Tortelly R. [Diagnosis of porcine enzootic pneumonia by post mortem sanitary inspection: comparison with other diagnostic methods.] Diagnóstico da pneumonia enzoótica suína pela inspeção sanitária post mortem: comparação com outros métodos de diagnóstico. Revista Brasileira de Veterinária Brasileira 36(2):188-194, 2014. Faculdade de Medicina Veterinária, Universidade Federal de Uberlândia, Av. Par...

  12. Post-mortem diagnosis of chronic Chagas's disease: comparative evaluation of three serological tests on pericardial fluid.

    Science.gov (United States)

    Lopes, E R; Chapadeiro, E; Batista, S M; Cunha, J G; Rocha, A; Miziara, L; Ribeiro, J U; Patto, R J

    1978-01-01

    In an attempt to improve the post-mortem diagnosis of Chagas's disease the authors performed haemagglutination tests (HAT), fluorescent Trypanosoma cruzi antibody tests (FAT), and complement fixation tests (CFT) on the pericardial fluid obtained at autopsy of 50 individuals with Chagas's heart disease, and 93 patients in whom this disease was not thought to be present. The results demonstrate that all three tests are efficient for the post-mortem diagnosis of Chagas's disease but suggest that their combined use would detect more cases than would one isolated reaction only.

  13. Abnormal fetal cerebral laminar organization in cobblestone complex as seen on post-mortem MRI and DTI

    International Nuclear Information System (INIS)

    Widjaja, Elysa; Geibprasert, Sasikhan; Blaser, Susan; Rayner, Tammy; Shannon, Patrick

    2009-01-01

    We report a unique case of cobblestone complex using post-mortem MR and diffusion tensor imaging to assess the laminar organization of the fetal cerebrum. The imaging findings were correlated with autopsy findings. Abnormal cortical development in cobblestone complex resulted in disruption of normal laminar organization of the fetal brain, which was seen as interruption and nodularity of the high-signal T1 cortical band with increased anisotropy and medium diffusivity extending beyond the cortical band into the cerebral mantle on post-mortem MR and diffusion tensor imaging. (orig.)

  14. Abnormal fetal cerebral laminar organization in cobblestone complex as seen on post-mortem MRI and DTI

    Energy Technology Data Exchange (ETDEWEB)

    Widjaja, Elysa; Geibprasert, Sasikhan; Blaser, Susan; Rayner, Tammy [Hospital for Sick Children, Department of Diagnostic Imaging, Toronto (Canada); Shannon, Patrick [University of Toronto, Department of Pathology, Mount Sinai Hospital, Toronto (Canada)

    2009-08-15

    We report a unique case of cobblestone complex using post-mortem MR and diffusion tensor imaging to assess the laminar organization of the fetal cerebrum. The imaging findings were correlated with autopsy findings. Abnormal cortical development in cobblestone complex resulted in disruption of normal laminar organization of the fetal brain, which was seen as interruption and nodularity of the high-signal T1 cortical band with increased anisotropy and medium diffusivity extending beyond the cortical band into the cerebral mantle on post-mortem MR and diffusion tensor imaging. (orig.)

  15. Using Framework Analysis in nursing research: a worked example.

    Science.gov (United States)

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  16. Burned bodies: post-mortem computed tomography, an essential tool for modern forensic medicine.

    Science.gov (United States)

    Coty, J-B; Nedelcu, C; Yahya, S; Dupont, V; Rougé-Maillart, C; Verschoore, M; Ridereau Zins, C; Aubé, C

    2018-06-07

    Currently, post-mortem computed tomography (PMCT) has become an accessible and contemporary tool for forensic investigations. In the case of burn victims, it provides specific semiologies requiring a prudent understanding to differentiate normal post-mortem changes from heat-related changes. The aim of this pictorial essay is to provide the radiologist with the keys to establishing complete and focused reports in cases of PMCT of burn victims. Thus, the radiologist must note any discrepancies with the forensic history, and must be able to report all the relevant elements needed to answer the forensic pathologist's questions: Are there tomographic features that could help to identify the victim? Is there evidence of remains of biological fluids in liquid form available for toxicological analysis and DNA sampling? Is there an obvious cause of death other than heat-related lesions, especially metallic foreign bodies of ballistic origin? Finally, what are the characteristic burn-related injuries seen on the corpse that should be sought during the autopsy? • CT is highly useful to find features permitting the identification of a severely burned body. • PMCT is a major asset in gunshot injuries to depict ballistic foreign bodies in the burned cadavers. • CT is able to recognise accessible blood for tests versus heat clot (air-crescent sign). • Heat-related fractures are easily differentiated from traumatic fractures. • Epidural collections with a subdural appearance are typical heat-related head lesions.

  17. Viability and infectivity of Ichthyophonus sp. in post-mortem Pacific herring, Clupea pallasii.

    Science.gov (United States)

    Kocan, Richard; Hart, Lucas; Lewandowski, Naomi; Hershberger, Paul

    2014-12-01

    Ichthyophonus-infected Pacific herring, Clupea pallasii, were allowed to decompose in ambient seawater then serially sampled for 29 days to evaluate parasite viability and infectivity for Pacific staghorn sculpin, Leptocottus armatus. Ichthyophonus sp. was viable in decomposing herring tissues for at least 29 days post-mortem and could be transmitted via ingestion to sculpin for up to 5 days. The parasite underwent morphologic changes during the first 48 hr following death of the host that were similar to those previously reported, but as host tissue decomposition progressed, several previously un-described forms of the parasite were observed. The significance of long-term survival and continued morphologic transformation in the post-mortem host is unknown, but it could represent a saprozoic phase of the parasite life cycle that has survival value for Ichthyophonus sp.

  18. Making post-mortem implantable cardioverter defibrillator explantation safe

    DEFF Research Database (Denmark)

    Räder, Sune B E W; Zeijlemaker, Volkert; Pehrson, Steen

    2009-01-01

    AIMS: The aim of this study is to investigate whether protection with rubber or plastic gloves during post-mortem explantation of an implantable cardioverter defibrillator (ICD) offers enough protection for the explanting operator during a worst-case scenario (i.e. ICD shock). METHODS AND RESULTS: ... that the resting voltage over the operating person would not exceed 50 V. CONCLUSION: The use of intact medical gloves made of latex, neoprene, or plastic eliminates the potential electrical risk during explantation of an ICD. Two gloves on each hand offer sufficient protection. We will recommend the use ...

  19. Post mortem magnetic resonance imaging in the fetus, infant and child: A comparative study with conventional autopsy (MaRIAS Protocol)

    Directory of Open Access Journals (Sweden)

    Thayyil Sudhin

    2011-12-01

    Full Text Available Abstract Background Minimally invasive autopsy by post mortem magnetic resonance (MR) imaging has been suggested as an alternative to conventional autopsy in view of the declining consented autopsy rates. However, large prospective studies rigorously evaluating the accuracy of such an approach are lacking. We intend to compare the accuracy of a minimally invasive autopsy approach using post mortem MR imaging with that of conventional autopsy in fetuses, newborns and children for detection of the major pathological abnormalities and/or determination of the cause of death. Methods/Design We recruited 400 consecutive fetuses, newborns and children referred for conventional autopsy to one of the two participating hospitals over a three-year period. We acquired whole body post mortem MR imaging using a 1.5 T MR scanner (Avanto, Siemens Medical Solutions, Erlangen, Germany) prior to autopsy. The total scan time varied between 90 and 120 minutes. Each MR image was reported by a team of four specialist radiologists (paediatric neuroradiology, paediatric cardiology, paediatric chest & abdominal imaging and musculoskeletal imaging), blinded to the autopsy data. Conventional autopsy was performed according to the guidelines set down by the Royal College of Pathologists (UK) by experienced paediatric or perinatal pathologists, blinded to the MR data. The MR and autopsy data were recorded using predefined categorical variables by an independent person. Discussion Using conventional post mortem as the gold standard comparator, the MR images will be assessed for accuracy of the anatomical morphology, associated lesions, clinical usefulness of information and determination of the cause of death. The sensitivities, specificities and predictive values of post mortem MR alone and MR imaging along with other minimally invasive post mortem investigations will be presented for the final diagnosis, broad diagnostic categories and for specific diagnosis of each system

  20. External foam and the post-mortem period in freshwater drowning; results from a retrospective study in Amsterdam, the Netherlands.

    Science.gov (United States)

    Reijnen, G; Buster, M C; Vos, P J E; Reijnders, U J L

    2017-11-01

    Determining the time of death of bodies recovered from water can be difficult. A feature of drowning is the presence of external foam. This study describes the presence of external foam in relation to the post-mortem period. The study utilizes a database of death reports dated between January 2011 and July 2016. For bodies recovered from fresh water, the presence or absence of external foam was noted. In this study, 112 death reports are included. Of these reports, 18 mentioned external foam, which account for 16.1% of the entire study population. In the population with a post-mortem period of less than 24 h, external foam was detected in 27.7% of cases. All 18 incidents with external foam had an estimated post-mortem period of less than 24 h. In our study, external foam was only present in freshwater drowning cases with a post-mortem period of less than 24 h. Based on this finding, the presence of external foam may be useful as an additional indicator when estimating the time of death in freshwater drowning. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  1. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
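
    As a rough illustration of the boilerplate PAF is meant to hide, opening a PROOF-Lite session and running a selector over a chain of flat ntuples looks roughly like the PyROOT sketch below. This is generic ROOT/PROOF usage rather than the PAF API itself, and the tree name, file pattern and selector are placeholders.

        # Generic ROOT / PROOF-Lite usage sketched in PyROOT; NOT the PAF API.
        # Tree name, file pattern and selector are placeholders.
        import ROOT

        proof = ROOT.TProof.Open("lite://")       # PROOF-Lite session on all local cores

        chain = ROOT.TChain("Events")             # assumed tree name in the skimmed files
        chain.Add("skimmed_data_*.root")          # placeholder file pattern

        chain.SetProof()                          # route processing through PROOF
        chain.Process("MyAnalysisSelector.C+")    # placeholder TSelector, compiled with ACLiC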

  2. Histological transformations of the dental pulp as possible indicator of post mortem interval: a pilot study.

    Science.gov (United States)

    Carrasco, Patricio A; Brizuela, Claudia I; Rodriguez, Ismael A; Muñoz, Samuel; Godoy, Marianela E; Inostroza, Carolina

    2017-10-01

    The correct estimation of the post mortem interval (PMI) can be crucial to the success of a forensic investigation. Diverse methods have been used to estimate PMI, considering physical changes that occur after death, such as algor mortis and livor mortis, among others. Degradation of dental pulp after death is a complex process that has not yet been studied thoroughly. It has been described that pulp RNA degradation could be an indicator of PMI; however, that study was limited to 6 days. The tooth is the hardest organ of the human body, and the dental pulp is confined within it. The pulp morphology is defined as a loose connective tissue with great sensory innervation, abundant microcirculation and a large presence of different cell types. The aim of this study is to describe the potential use of post mortem pulp alterations to estimate PMI, using a new methodology that allows pulp tissue to be obtained for histomorphological analysis. The current study will identify potential histological indicators in dental pulp tissue to estimate PMI in time intervals of 24h, 1 month, 3 months and 6 months. This study used 26 teeth from individuals with known PMI of 24h, 1 month, 3 months or 6 months. All samples were manipulated with the new methodology (Carrasco, P. and Inostroza C. inventors; Universidad de los Andes, assignee. Forensic identification, post mortem interval estimation and cause of death determination by recovery of dental tissue. United States patent US 61/826,558 23.05.2013) to extract pulp tissue without the destruction of the tooth. The dental pulp tissues obtained were fixed in formalin for the subsequent generation of histological sections, stained with Hematoxylin-Eosin and Masson's Trichrome. All sections were observed under an optical microscope using magnifications of 10× and 40×. The microscopic analysis of the samples showed a progressive transformation of the cellular components and fibers of the dental pulp over the PMI. These results allowed creating a

  3. Prevalence and concordance between the clinical and the post-mortem diagnosis of dementia in a psychogeriatric clinic.

    Science.gov (United States)

    Grandal Leiros, B; Pérez Méndez, L I; Zelaya Huerta, M V; Moreno Eguinoa, L; García-Bragado, F; Tuñón Álvarez, T; Roldán Larreta, J J

    The aim of our study is to describe the types of dementia found in a series of patients and to estimate the level of agreement between the clinical diagnosis and the post-mortem diagnosis. We conducted a descriptive analysis of the prevalence of the types of dementia found in our series and we established the level of concordance between the clinical and the post-mortem diagnoses. The diagnosis was made based on current diagnostic criteria. 114 cases were included. The most common diagnoses, both clinically and at autopsy, were Alzheimer disease and mixed dementia, but their prevalences differed considerably: at the clinical level, prevalence was 39% for Alzheimer disease and 18% for mixed dementia, whereas at autopsy it was 22% and 34%, respectively. The agreement between the clinical and the autopsy diagnoses was 62% (95% CI 53-72%). Almost a third of our patients were not correctly diagnosed in vivo. The most common mistake was the underdiagnosis of cerebrovascular pathology. Copyright © 2016 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.
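
    The reported agreement of 62% (95% CI 53-72%) in 114 cases can be reproduced to within rounding using a normal approximation for a binomial proportion. The sketch below is only a consistency check on the quoted numbers, not part of the study's methodology, which may have used a different interval method (Python).

        # Consistency check: 62% clinical/post-mortem agreement in 114 cases,
        # reported 95% CI 53-72%. Simple normal (Wald) approximation; the study
        # itself may have computed the interval differently.
        from math import sqrt

        n, p = 114, 0.62
        se = sqrt(p * (1 - p) / n)                 # standard error of a proportion
        lo, hi = p - 1.96 * se, p + 1.96 * se
        print(f"95% CI: {lo:.0%} - {hi:.0%}")      # ~53% - 71%, close to the reported 53-72%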

  4. Increased concentration of α- and γ-endorphin in post mortem hypothalamic tissue of schizophrenic patients

    Energy Technology Data Exchange (ETDEWEB)

    Wiegant, V.M.; Verhoef, C.J.; Burbach, J.P.H.; de Wied, D.

    1988-01-01

    The concentrations of α-, β- and γ-endorphin were determined by radioimmunoassay in HPLC-fractionated extracts of post mortem hypothalamic tissue obtained from schizophrenic patients and controls. The hypothalamic concentration of α- and γ-endorphin was significantly higher in patients than in controls. No difference was found in the concentration of β-endorphin, the putative precursor of α- and γ-endorphins. These results suggest a deviant metabolism of β-endorphin in the brain of schizophrenic patients. Whether this phenomenon is related to the psychopathology, or is a consequence of ante mortem pharmacotherapy, remains to be established.

  5. Effect of cooling rate upon processing characteristics of pork meat of different glycolysis type during post mortem ageing.

    Science.gov (United States)

    Vada, M

    1977-10-01

    Rapid chilling was applied to porcine longissimus dorsi muscles at 1 h post mortem in order to observe its effect on the quality of canned products prepared from those of different pH(1) values. The muscle from one side of each animal was removed from the carcase 50 minutes post mortem and divided into two longitudinal strips. One was chilled immediately to 13-15°C (1 h post mortem); the other, chilled after a further hour (2 h post mortem), acted as control. After the centre temperature had reached 10°C the muscles were stored in a refrigerator at 3-5°C. Compared with the control samples (chilled at 2 h p.m.), rapid chilling from 1 h p.m. caused an improvement in the water-holding capacity and the texture of pork meat, which had higher pH(1) values and was processed at 2, 4 and 48 h p.m. Brine retention and texture score were lowest if samples (both rapidly chilled and control) were processed at 24 h p.m. Although brine retention of PSE pork meat could not be increased even by rapid chilling, the texture of heat-treated PSE pork showed an improvement during storage, which was more pronounced after ageing for 48 h, if PSE samples were chilled at 1 h p.m. Copyright © 1977. Published by Elsevier Ltd.

  6. Effect of gamma irradiation on the microstructure and post-mortem anaerobic metabolism of bovine muscle

    International Nuclear Information System (INIS)

    Yook, H.-S.; Lee, J.-W.; Lee, K.-H.; Kim, M.-K.; Song, C.-W.; Byun, M.-W.

    2001-01-01

    Experiments were performed to study the effect of gamma irradiation on morphological properties and post-mortem metabolism in bovine M. sternomandibularis, with special reference to ultrastructure, shear force, pH and ATP breakdown. Sarcomere shortening was not observed in gamma-irradiated muscle; however, the disappearance of the M-line and of the A- and I-bands was perceptible. During cold storage, the destruction of muscle bundles was faster in the gamma-irradiated muscle than in the non-irradiated muscle, in a dose-dependent manner. The same was true for the post-mortem pH drop and ATP breakdown. These experimental results confirmed that the anaerobic metabolism and morphological properties of beef are noticeably affected by gamma irradiation

  7. The Elusive Universal Post-Mortem Interval Formula

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad Alexander [ORNL

    2011-01-01

    The following manuscript details our initial attempt at developing universal post-mortem interval formulas describing human decomposition. These formulas are empirically derived from data collected over the last 20 years from the University of Tennessee's Anthropology Research Facility, in Knoxville, Tennessee, USA. Two formulas were developed (surface decomposition and burial decomposition) based on temperature, moisture, and the partial pressure of oxygen, which are three of the four primary drivers of human decomposition. It is hoped that worldwide application of these formulas to environments and situations not readily studied in Tennessee will result in interdisciplinary cooperation between scientists and law enforcement personnel that will allow for future refinements of these models, leading to increased accuracy.

  8. Post-mortem radiography of the lungs: Experiments to compare various methods of examination and descriptions of their usefulness in actual practice

    International Nuclear Information System (INIS)

    Pankow, W.

    1986-01-01

    Described is the post-mortem examination of the isolated lung using radiologic-morphologic and histologic methods. Comparisons are made regarding the practical value of the conservation techniques chosen, which were the method of Markaria and Heitzmann, based on fixing in alcohol and air drying, and the nitrogen freezing method developed by Rau and colleagues. Both methods ensure adequate visualisation of the pulmonary ultrastructure by X-rays, even though this observation should be qualified by the fact that pulmonary tissue fixed in alcohol tends to shrink and that intraalveolar edema is thus artificially reduced. Either of the methods under investigation permits angiographic and bronchographic examinations to be carried out without difficulty. In macroscopic evaluations better results are obtained for lungs fixed in alcohol. Freeze-dried samples offer advantages in the histological assessment of the pulmonary ultrastructure. Post-mortem radiography of the lungs is particularly valuable in the analysis of pathological changes in the pulmonary structure. (MBC) [de

  9. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
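
    Of the topics listed, iteration methods for nonlinear equations are the easiest to show concretely. The snippet below is a textbook Newton iteration included purely as a generic illustration of that class of methods; it is not an excerpt from the book (Python).

        # Textbook Newton iteration for a scalar nonlinear equation f(x) = 0.
        # Generic illustration of one topic mentioned above; not from the book.

        def newton(f, df, x0, tol=1e-12, max_iter=50):
            """Solve f(x) = 0 from starting guess x0, given the derivative df."""
            x = x0
            for _ in range(max_iter):
                step = f(x) / df(x)
                x -= step
                if abs(step) < tol:
                    return x
            raise RuntimeError("Newton iteration did not converge")

        if __name__ == "__main__":
            # Example: sqrt(2) as the root of f(x) = x**2 - 2.
            print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))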

  10. Efeito do resfriamento sobre a textura post-mortem da carne do peixe matrinxã Brycon cephalus Chilling effect on the post-mortem texture of the matrinxã fish muscle Brycon cephalus

    Directory of Open Access Journals (Sweden)

    H. Suárez-Mahecha

    2007-08-01

    Full Text Available In order to determine the mechanisms that cause the post-mortem muscle softening of the matrinxã Brycon cephalus, changes in the microstructure of the muscle were observed immediately after death and after 12 hours of storage at -3°C, measuring the firmness of the flesh with test instruments. Observations by transmission electron microscopy were similar to the results obtained for the breaking strength of the muscle measured with a texturometer. The values of the breaking strength of the fish muscle were smaller after chilling. At the same time, it was observed that the collagen fibers of the pericellular connective tissue had disintegrated, while the collagen fibers of the miocommata connective tissue maintained their organization and integrity. There was little evident breakdown of Z-discs. It is suggested that the post-mortem tenderization of the matrinxã muscle during chilled storage was due to the disintegration of the collagen fibers in the pericellular connective tissue and, to a lesser extent, to the weakening of the Z-disc.

  11. Post-mortem MRI as an alternative to non-forensic autopsy in foetuses and children: from research into clinical practice

    Science.gov (United States)

    Addison, S; Arthurs, O J

    2014-01-01

    Although post-mortem MRI (PMMR) was proposed as an alternative to conventional autopsy more than a decade ago, the lack of systematic validation has limited its clinical uptake. Minimally invasive autopsy (MIA) using PMMR together with ancillary investigations has now been shown to be as accurate as conventional autopsy in foetuses, newborns and infants and is particularly useful for cerebral, cardiac and genitourinary imaging. Unlike conventional autopsy, PMMR provides a permanent three-dimensional auditable record, with accurate estimation of internal organ volumes. MIA is becoming highly acceptable to parents and professionals, and there is widespread political support and public interest in its clinical implementation in the UK. In the short to medium term, it is desirable that a supraregional network of specialist centres should be established to provide this service within the current National Health Service framework. PMID:24288400

  12. Clarke's Isolation and identification of drugs in pharmaceuticals, body fluids, and post-mortem material

    National Research Council Canada - National Science Library

    Clarke, E. G. C; Moffat, A. C; Jackson, J. V

    1986-01-01

    This book is intended for scientists faced with the difficult problem of identifying an unknown drug in a pharmaceutical product, in a sample of tissue or body fluid from a living patient, or in post-mortem material...

  13. Luiza: Analysis Framework for GLORIA

    Directory of Open Access Journals (Sweden)

    Aleksander Filip Żarnecki

    2013-01-01

    Full Text Available The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in a new High Energy Physics (HEP) project, the International Linear Collider (ILC). HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited to the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and the additional output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step-by-step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
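
    The processor-chain idea described above (each module reads from a shared data collection and appends its own output to it) can be sketched in a few lines. The class and method names below are assumptions for illustration only and do not mirror the actual Luiza or Marlin interfaces (Python).

        # Minimal sketch of a Marlin/Luiza-style processor chain: each processor
        # reads from a shared collection and adds its output back to it.
        # Class and method names are illustrative assumptions, not the real API.

        class Processor:
            def process(self, data: dict) -> None:
                raise NotImplementedError

        class RawImageLoader(Processor):
            def process(self, data: dict) -> None:
                data["raw_images"] = ["img_001.fits", "img_002.fits"]   # placeholder input

        class PhotometryProcessor(Processor):
            def process(self, data: dict) -> None:
                # Pretend measurement: one brightness value per raw image.
                data["photometry"] = [12.3 + i for i, _ in enumerate(data["raw_images"])]

        class LightCurveBuilder(Processor):
            def process(self, data: dict) -> None:
                data["light_curve"] = list(enumerate(data["photometry"]))

        def run_chain(processors, data=None):
            """Run each processor in order over the shared data collection."""
            data = {} if data is None else data
            for p in processors:
                p.process(data)          # each step appends its output to the collection
            return data

        if __name__ == "__main__":
            result = run_chain([RawImageLoader(), PhotometryProcessor(), LightCurveBuilder()])
            print(result["light_curve"])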

  14. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    Science.gov (United States)

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.) compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.

  15. Can we infer post mortem interval on the basis of decomposition rate? A case from a Portuguese cemetery.

    Science.gov (United States)

    Ferreira, M Teresa; Cunha, Eugénia

    2013-03-10

    Post mortem interval estimation is crucial in forensic sciences for both positive identification and reconstruction of perimortem events. However, reliable dating of skeletonized remains poses a scientific challenge, since the decomposition of human remains involves a set of complex and highly variable processes. Many of the difficulties in determining the post mortem interval and/or the permanence of a body in a specific environment relate to the lack of systematic observations and research on human body decomposition modalities in different environments. In March 2006, in order to solve a problem of misidentification, a team from the South Branch of the Portuguese National Institute of Legal Medicine carried out the exhumation of 25 identified individuals buried for almost five years in the same cemetery plot. Even though all individuals shared similar post mortem intervals, they presented different stages of decomposition. In order to analyze the post mortem factors associated with the different stages of decomposition displayed by the 25 exhumed individuals, the stages of decomposition were scored. Information regarding the age at death and sex of the individuals was gathered and recorded, as well as data on the cause of death and grave and coffin characteristics. Although the observed distinct decay stages may be explained by the burial conditions, namely by the micro-taphonomic environments, individual endogenous factors also play an important role in differential decomposition, as witnessed by the present case. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. The use of contrast-enhanced post Mortem CT in the detection of cardiovascular deaths.

    Directory of Open Access Journals (Sweden)

    Jonas Christoph Apitzsch

    Full Text Available OBJECTIVES: To evaluate the diagnostic value of contrast-enhanced post mortem computed tomography (PMCT) in comparison to non-enhanced post mortem CT in the detection of cardiovascular causes of death (COD). BACKGROUND: As autopsy rates decline, new methods to determine CODs are necessary. Contrast-enhanced PMCT shall therefore be evaluated in comparison to established non-enhanced PMCT in order to further improve the method. METHODS: In a prospective study, 20 corpses were examined using a 64-row multislice CT (MSCT) before and after intraarterial perfusion with a newly developed, barium-bearing contrast agent and ventilation of the lungs. The cause of death was determined in enhanced and unenhanced scans and a level of confidence (LOC) was given by three experienced radiologists on a scale between 0 and 4. Results were compared to autopsy results as gold standard. Autopsy was performed blinded to PMCT findings. RESULTS: The method allowed visualization of different types of cause of death. There was a significant improvement in LOC in enhanced scans compared to unenhanced scans, as well as an improvement in the detection of COD. The cause of death could be determined in 19 out of 20 cases. CONCLUSIONS: PMCT is feasible and appears to be robust for diagnosing cardiovascular causes of death. When compared with unenhanced post-mortem CT, intraarterial perfusion and pulmonary ventilation significantly improve visualization and diagnostic accuracy. These promising results warrant further studies.

  17. Minimally invasive autopsy employing post-mortem CT and targeted coronary angiography: evaluation of its application to a routine Coronial service.

    Science.gov (United States)

    Roberts, Ian S D; Traill, Zoe C

    2014-01-01

    Post-mortem imaging is a potential alternative to traditional medicolegal autopsy. We investigate the reduction in number of invasive autopsies required by use of post-mortem CT ± coronary angiography. A total of 120 adult deaths referred to the Coroner were investigated by CT, with coronary angiography employed only for the second series of 60 cases, in order to determine the added value of angiography. The confidence of imaging cause of death was classified as definite (no autopsy), probable, possible or unascertained. Invasive autopsy was not required in 38% of cases without coronary angiography and 70% of cases with angiography. Full autopsy, including brain dissection, was required in only 9% of cases. There was complete agreement between autopsy and radiological causes of death in the cases with a 'probable' imaging cause of death, indicating that cases for which imaging provides an accurate cause of death without autopsy were identified correctly. In two patients, CT demonstrated unsuspected fractures, not detected at subsequent autopsy. A two-thirds reduction in the number of invasive coronial autopsies can be achieved by use of post-mortem CT plus coronary angiography. At the same time, use of post-mortem CT may improve accuracy of diagnosis, particularly for traumatic deaths. © 2013 John Wiley & Sons Ltd.

  18. Effects of post mortem interval and gender in DNA base excision repair activities in rat brains

    Energy Technology Data Exchange (ETDEWEB)

    Soltys, Daniela Tathiana; Pereira, Carolina Parga Martins; Ishibe, Gabriela Naomi; Souza-Pinto, Nadja Cristhina de, E-mail: nadja@iq.usp.br

    2015-06-15

    Most human tissues used in research are of post mortem origin. This is the case for all brain samples, and due to the difficulty in obtaining a good number of samples, especially in the case of neurodegenerative diseases, male and female samples are often included in the same experimental group. However, the effects of post mortem interval (PMI) and gender differences on the endpoints being analyzed are not always fully understood, as is the case for DNA repair activities. To investigate these effects, in a controlled genetic background, base excision repair (BER) activities were measured in protein extracts obtained from Wistar rat brains of both genders and defined PMI of up to 24 hours, using a novel fluorescence-based in vitro incision assay. Uracil and AP-site incision activities in nuclear and mitochondrial extracts were similar in all groups included in this study. Our results show that gender and PMI of up to 24 hours have no influence on the activities of the BER proteins UDG and APE1 in rat brains. These findings demonstrate that these variables do not interfere with the BER activities analyzed in this study, and provide a safe window for working with UDG and APE1 proteins in samples of post mortem origin.

  19. Effects of post mortem interval and gender in DNA base excision repair activities in rat brains

    International Nuclear Information System (INIS)

    Soltys, Daniela Tathiana; Pereira, Carolina Parga Martins; Ishibe, Gabriela Naomi; Souza-Pinto, Nadja Cristhina de

    2015-01-01

    Most human tissues used in research are of post mortem origin. This is the case for all brain samples, and due to the difficulty in obtaining a good number of samples, especially in the case of neurodegenerative diseases, male and female samples are often included in the same experimental group. However, the effects of post mortem interval (PMI) and gender differences on the endpoints being analyzed are not always fully understood, as is the case for DNA repair activities. To investigate these effects, in a controlled genetic background, base excision repair (BER) activities were measured in protein extracts obtained from Wistar rat brains of both genders and defined PMI of up to 24 hours, using a novel fluorescence-based in vitro incision assay. Uracil and AP-site incision activities in nuclear and mitochondrial extracts were similar in all groups included in this study. Our results show that gender and PMI of up to 24 hours have no influence on the activities of the BER proteins UDG and APE1 in rat brains. These findings demonstrate that these variables do not interfere with the BER activities analyzed in this study, and provide a safe window for working with UDG and APE1 proteins in samples of post mortem origin.

  20. Laboratory experimental infection of sheep to Ornithobilharzia turkestanicum and its confirmation using post-mortem examination and histopathology

    Directory of Open Access Journals (Sweden)

    Gholamreza Karimi

    2014-11-01

    Ornithobilharzia turkestanicum, a member of the genus Ornithobilharzia and the family Schistosomatidae, is an important agent of parasitic infection in sheep. This parasite has been reported from Russia, China, Turkestan (Kazakhstan, Kyrgyzstan, Turkmenistan and Uzbekistan), Pakistan, Iraq, Turkey and Iran. Parasitic infection due to this agent could be one of the important factors decreasing the production rate of livestock in Iran. The purpose of this study was to experimentally infect sheep with this parasite and to confirm the infection by post-mortem examination and histopathology, which was done successfully. Twenty five sheep were used in the study, of which 10 sheep were experimentally infected with Ornithobilharzia turkestanicum by subcutaneous injection and 10 sheep by the skin contact method, while the other 5 sheep were kept as controls. Results of post-mortem examination and histopathology during a one year period confirmed that all of the exposed sheep were infected, and adult worms were seen in their mesentery. The mean number of cercariae used for inducing the infection was 6425, and 462 adult worms were collected post-mortem. There was no significant relationship between the number of cercariae and the number of adult worms collected. Male sheep were more heavily infected than female sheep.

  1. Post-Mortem Analysis after High-Power Operation of the TD24_R05 Tested in Xbox_1

    CERN Document Server

    Degiovanni, Alberto; Mouriz Irazabal, Nerea; Aicheler, Markus

    2016-01-01

    The CLIC prototype structure TD24_R05 was high-power tested in Xbox_1 in 2013. This report summarizes all examinations conducted after the high-power test, including bead-pull measurements, structure cutting, metrology and SEM observations, and then presents a synthesis of the various results. The structure progressively developed a hot cell during operation, and detuning was observed after the test was complete. The post mortem examination clearly showed a developed standing-wave pattern, which was explained by the physical deformation of one of the coupler irises. However, an elevated breakdown count in the suspected hot cell could not be confirmed through SEM imaging, nor could any particular feature be detected that would explain the observed longitudinal breakdown distribution.

  2. [Inheritance rights of the child born from post-mortem fertilization].

    Science.gov (United States)

    Iniesta Delgado, Juan José

    2008-01-01

    Spanish law allows for the possibility of post mortem fertilization, recognizing the paternity of the deceased male. The most prominent legal effects of this fact have to do with the succession of the father. The way of fixing the child's portion in the forced succession and its protection, the question of determining his share in the inheritance, and the necessity of defending his rights until the birth is verified are some of the issues discussed in this article.

  3. Various methods for the estimation of the post mortem interval from Calliphoridae: A review

    Directory of Open Access Journals (Sweden)

    Ruchi Sharma

    2015-03-01

    Forensic entomology is recognized in many countries as an important tool for legal investigations. Unfortunately, it has not received much attention in India as an important investigative tool. The maggots of the flies crawling on dead bodies are widely considered to be just another disgusting element of decay and are not collected at the time of autopsy. They can aid in death investigations (time since death, manner of death, etc.). This paper reviews the various methods of post mortem interval estimation using Calliphoridae, to make investigators, law personnel and researchers aware of the importance of entomology in criminal investigations. The various problems confronted by forensic entomologists in estimating the time since death are also discussed, and there is a need for further research in the field as well as in the laboratory. Correct estimation of the post mortem interval is one of the most important aspects of legal medicine.

  4. Essentials of forensic post-mortem MR imaging in adults

    Science.gov (United States)

    Ruder, T D; Thali, M J; Hatch, G M

    2014-01-01

    Post-mortem MR (PMMR) imaging is a powerful diagnostic tool with a wide scope in forensic radiology. In the past 20 years, PMMR has been used as both an adjunct and an alternative to autopsy. The role of PMMR in forensic death investigations largely depends on the rules and habits of local jurisdictions, availability of experts, financial resources, and individual case circumstances. PMMR images are affected by post-mortem changes, including position-dependent sedimentation, variable body temperature and decomposition. Investigators must be familiar with the appearance of normal findings on PMMR to distinguish them from disease or injury. Coronal whole-body images provide a comprehensive overview. Notably, short tau inversion–recovery (STIR) images enable investigators to screen for pathological fluid accumulation, to which we refer as “forensic sentinel sign”. If scan time is short, subsequent PMMR imaging may be focussed on regions with a positive forensic sentinel sign. PMMR offers excellent anatomical detail and is especially useful to visualize pathologies of the brain, heart, subcutaneous fat tissue and abdominal organs. PMMR may also be used to document skeletal injury. Cardiovascular imaging is a core area of PMMR imaging and growing evidence indicates that PMMR is able to detect ischaemic injury at an earlier stage than traditional autopsy and routine histology. The aim of this review is to present an overview of normal findings on forensic PMMR, provide general advice on the application of PMMR and summarise the current literature on PMMR imaging of the head and neck, cardiovascular system, abdomen and musculoskeletal system. PMID:24191122

  5. Preliminary study of post mortem identification using lip prints.

    Science.gov (United States)

    Utsuno, Hajime; Kanoh, Takashi; Tadokoro, Osamu; Inoue, Katsuhiro

    2005-05-10

    Identification using lip prints was first performed in the 1950s and was the subject of much research in the 1960s and 70s, leading to the acceptance of this technique as evidence in the criminal justice system. Previous research has focused on identifying lip print types or on methods of obtaining hidden lip prints left at the crime scene. The present study aimed to clarify characteristics of lip prints from cadavers with various causes of death (including drowning and hanging) and to determine the effects of fixation on post mortem changes in lip impressions.

  6. Initial Multidisciplinary Design and Analysis Framework

    Science.gov (United States)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; hide

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  7. Herpetic brainstem encephalitis: report of a post-mortem case studied electron microscopically and immunohisiochemically

    Directory of Open Access Journals (Sweden)

    José Eymard Homem Pitella

    1987-03-01

    A post-mortem case of herpetic brainstem encephalitis is presented. Clinically, the patient had headache followed by ataxia, drowsiness and multiple palsies of several cranial nerves, progressing to death in eight days. Pathological examination of the brain showed necrotizing encephalitis in multiple foci limited to the brainstem, most distinctly in the pons and medulla oblongata. The immunoperoxidase technique revealed rare glial cells with intranuclear immunoreactivity for herpes antigen. Rare viral particles with the morphological characteristics of herpesvirus were identified in the nuclei of neurons in 10% formalin-fixed material. This is the second reported case of herpetic brainstem encephalitis confirmed by post-mortem examination. The pathway used by the virus to reach the central nervous system and its subsequent dissemination to the oral cavity, the orbitofrontal region and the temporal lobes, as well as to the brainstem, after a period of latency and reactivation, are discussed.

  8. The capability of high field MRI in demonstrating post-mortem fetal brains at different gestational age

    International Nuclear Information System (INIS)

    Zhang Zhonghe; Liu Shuwei; Lin Xiangtao; Gen Hequn; Teng Gaojun; Fang Fang; Zang Fengchao; Yu Taifei; Zhao Bin

    2009-01-01

    Objective: To study the capability of high-field MRI in demonstrating post-mortem fetal brains at different gestational ages (GA). Methods: One hundred and eight post-mortem fetal brains of 14-40 weeks GA were evaluated by 3.0 T MRI. Eleven brains of 14 to 27 weeks GA with good 3.0 T MRI images were chosen and scanned by 7.0 T MRI. The developing sulci, layered structures of the fetal cerebral cortex and basal nuclei were evaluated on MRI at the different field strengths (3.0 T and 7.0 T) and the results analyzed. Results: On T1WI at 3.0 T, the layered structures of the fetal cerebral cortex were present at 14 weeks GA, and the sulci were more accurately identified after 16 weeks GA. The basal nuclei were clearly distinguishable after 20 weeks GA, and these structures were better visualized as the GA increased. On T2WI at 7.0 T, the sulci, layered structures of the fetal cerebral cortex and basal nuclei were shown more clearly at the same GA than at 3.0 T, especially the sulci at early developmental stages. Conclusions: T1WI at 3.0 T could show the developing structures of the post-mortem fetal brain well, but T2WI at 7.0 T was comparatively better. (authors)

  9. Hypothermic death: Possibility of diagnosis by post-mortem computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Onozuka, Naoki; Kakizaki, Ayana [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Sato, Miho, E-mail: meifan58@m.tains.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi, 980-8575 (Japan)

    2013-02-15

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann–Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p = 0.0007, p < 0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p = 0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high.
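
    The diagnostic measures reported above (sensitivity, specificity, accuracy, PPV and NPV) all derive from a 2x2 table of CT findings against the autopsy-confirmed cause of death. The short Python sketch below shows the arithmetic; the counts are reconstructed from the reported percentages (roughly 7 of 24 hypothermic cases showing all three findings, 0 of 53 controls) and are an inference from the abstract, not data taken from the study.

```python
# Hedged sketch: standard 2x2-table diagnostic performance measures.
# The counts are illustrative reconstructions from the reported figures,
# not values published by the study itself.

def diagnostic_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Return sensitivity, specificity, accuracy, PPV and NPV."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
        "npv": tn / (tn + fn),
    }

# All three CT findings present: ~7 of 24 hypothermic cases, 0 of 53 controls.
print(diagnostic_performance(tp=7, fp=0, fn=17, tn=53))
# -> sensitivity ~0.29, specificity 1.0, accuracy ~0.78, PPV 1.0, NPV ~0.76
```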

  10. Hypothermic death: Possibility of diagnosis by post-mortem computed tomography

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Onozuka, Naoki; Kakizaki, Ayana; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-01-01

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann–Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p = 0.0007, p < 0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p = 0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high.

  11. [Acceptance of post-mortem organ donation in Germany : Representative cross-sectional study].

    Science.gov (United States)

    Tackmann, E; Dettmer, S

    2018-02-01

    The German post-mortem organ donation rate has dropped by one third since 2010. Furthermore, 958 patients died in 2015 in Germany while waiting for an organ. To reduce the organ shortage, an amendment of the transplantation law was established in 2012. An information package including an organ donor card is sent to all German citizens via the postal service. A voluntary national transplantation register was introduced in 2016 to improve transparency in the organ donation process. The influence of several transplantation scandals starting in 2012 on organ donation rates is in question. Therefore, the objective of this article is to discuss approval of and objections to post-mortem organ donation among the next of kin of potential donors and the general public in Germany. Binary logistic regression of data from the 2014 survey by the Federal Centre for Health Education on attitudes towards organ and tissue donation in Germany was conducted, aiming to identify factors influencing the likelihood of organ donor card possession. Additionally, data of the German Organ Transplantation Foundation on post-mortem organ donations in Germany in 2014 were studied to highlight reasons for approval and objection given by the next of kin of potential and explanted post-mortem organ donors. Methods of documentation of the deceased's will, according to data of the German Organ Transplantation Foundation, were analyzed. Male gender and lack of knowledge about organ donation decrease the likelihood of having an organ donor card. Of the respondents in the survey of the Federal Centre for Health Education, 71.0% would donate their own organs, whereas only one third possess an organ donor card. Health insurances and physicians are the most important providers of organ donor cards in Germany. An increase in the percentage of organ donor card possession following the amendment of the transplantation law could not be observed by 2016. Fear of organ trade and unjust organ allocation are the main reasons given for objecting to post-mortem organ donation.

  12. Post-mortem cytogenomic investigations in patients with congenital malformations.

    Science.gov (United States)

    Dias, Alexandre Torchio; Zanardo, Évelin Aline; Dutra, Roberta Lelis; Piazzon, Flavia Balbo; Novo-Filho, Gil Monteiro; Montenegro, Marilia Moreira; Nascimento, Amom Mendes; Rocha, Mariana; Madia, Fabricia Andreia Rosa; Costa, Thais Virgínia Moura Machado; Milani, Cintia; Schultz, Regina; Gonçalves, Fernanda Toledo; Fridman, Cintia; Yamamoto, Guilherme Lopes; Bertola, Débora Romeo; Kim, Chong Ae; Kulikowski, Leslie Domenici

    2016-08-01

    Congenital anomalies are the second highest cause of infant deaths, and, in most cases, diagnosis is a challenge. In this study, we characterize patterns of DNA copy number aberrations in different samples of post-mortem tissues from patients with congenital malformations. Twenty-eight patients undergoing autopsy were cytogenomically evaluated using several methods, specifically, Multiplex Ligation-dependent Probe Amplification (MLPA), microsatellite marker analysis with a MiniFiler kit, FISH, a cytogenomic array technique and bidirectional Sanger sequencing, which were performed on samples of different tissues (brain, heart, liver, skin and diaphragm) preserved in RNAlater, in formaldehyde or by paraffin-embedding. The results identified 13 patients with pathogenic copy number variations (CNVs). Of these, eight presented aneuploidies involving chromosomes 13, 18, 21, X and Y (two presented inter- and intra-tissue mosaicism). In addition, other abnormalities were found, including duplication of the TYMS gene (18p11.32); deletion of the CHL1 gene (3p26.3); deletion of the HIC1 gene (17p13.3); and deletion of the TOM1L2 gene (17p11.2). One patient had a pathogenic missense mutation of g.8535C>G (c.746C>G) in exon 7 of the FGFR3 gene consistent with Thanatophoric Dysplasia type I. Cytogenomic techniques were reliable for the analysis of autopsy material and allowed the identification of inter- and intra-tissue mosaicism and a better understanding of the pathogenesis of congenital malformations. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Design and Analysis of Web Application Frameworks

    DEFF Research Database (Denmark)

    Schwarz, Mathias Romme

    Numerous web application frameworks have been developed in recent years. These frameworks enable programmers to reuse common components and to avoid typical pitfalls in web application development. Although such frameworks help the programmer to avoid many common errors, we find ...-state manipulation vulnerabilities. The hypothesis of this dissertation is that we can design frameworks and static analyses that aid the programmer to avoid such errors. First, we present the JWIG web application framework for writing secure and maintainable web applications. We discuss how this framework solves some of the common errors through an API that is designed to be safe by default. Second, we present a novel technique for checking HTML validity for output that is generated by web applications. Through string analysis, we approximate the output of web applications as context-free grammars. We model ...

  14. An evaluation of the DRI-ETG EIA method for the determination of ethyl glucuronide concentrations in clinical and post-mortem urine.

    Science.gov (United States)

    Turfus, Sophie C; Vo, Tu; Niehaus, Nadia; Gerostamoulos, Dimitri; Beyer, Jochen

    2013-06-01

    A commercial enzyme immunoassay for the qualitative and semi-quantitative measurement of ethyl glucuronide (EtG) in urine was evaluated. Post-mortem (n=800) and clinical urine (n=200) samples were assayed using a Hitachi 902 analyzer. The determined concentrations were compared with those obtained using a previously published liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the quantification of EtG and ethyl sulfate. Using a cut-off of 0.5 µg/ml and an LC-MS/MS limit of reporting of 0.1 µg/ml, there was a sensitivity of 60.8% and a specificity of 100% for clinical samples. For post-mortem samples, sensitivity and specificity were 82.4% and 97.1%, respectively. When reducing the cut-off to 0.1 µg/ml, the sensitivity and specificity were 83.3% and 100% for clinical samples, whereas for post-mortem samples the sensitivity and specificity were 90.3% and 88.3%, respectively. The best trade-offs between sensitivity and specificity for LC-MS/MS limits of reporting of 0.5 and 0.1 µg/ml were achieved when using immunoassay cut-offs of 0.3 and 0.092 µg/ml, respectively. There was good correlation between the quantitative results obtained by both methods, but analysis of samples by LC-MS/MS gave higher concentrations than enzyme immunoassay (EIA), with a statistically significant proportional bias (P<0.0001, Deming regression) for both sample types. The immunoassay is reliable for the qualitative and semi-quantitative presumptive detection of ethyl glucuronide in urine. Copyright © 2012 John Wiley & Sons, Ltd.
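
    The cut-off optimisation described above amounts to sweeping candidate immunoassay thresholds against the LC-MS/MS reference and choosing the one with the best sensitivity-specificity trade-off. The sketch below illustrates one common way of doing this (maximising Youden's J); the paired EtG values are synthetic placeholders and the selection criterion is an assumption, since the abstract does not state which trade-off rule was applied.

```python
# Hedged sketch: tuning an immunoassay cut-off against an LC-MS/MS reference.
# The paired values below are made-up placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
lcms = np.concatenate([rng.uniform(0.0, 0.09, 60),   # below the 0.1 µg/ml reporting limit
                       rng.uniform(0.1, 5.0, 40)])   # EtG-positive by LC-MS/MS
eia = 0.8 * lcms + rng.normal(0.0, 0.05, lcms.size)  # immunoassay tends to read lower

reference_positive = lcms >= 0.1                      # LC-MS/MS limit of reporting

def sens_spec(cutoff: float) -> tuple:
    test_positive = eia >= cutoff
    tp = np.sum(test_positive & reference_positive)
    tn = np.sum(~test_positive & ~reference_positive)
    return tp / reference_positive.sum(), tn / (~reference_positive).sum()

# Sweep candidate cut-offs and keep the one maximising Youden's J = sens + spec - 1.
cutoffs = np.linspace(0.01, 0.5, 50)
j = [sum(sens_spec(c)) - 1.0 for c in cutoffs]
best = cutoffs[int(np.argmax(j))]
print(f"best trade-off cut-off ~ {best:.3f} µg/ml, (sens, spec) = {sens_spec(best)}")
```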

  15. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants' enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors' aim is to reflect on what makes a MOOC successful in order to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data from a specific MOOC and refined, within a qualitative interpretivist methodology. The data were collected from the 'As alterações climáticas nos média escolares - Clima@EduMedia' course, which was developed by the project Clima@EduMedia, and were submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a MOOC success factors framework, the authors attempt to help fill a literature gap concerning the criteria for considering a specific MOOC successful. Findings: This work's major finding is a literature-based and empirically refined analysis framework of MOOC success factors. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work's relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By providing a proposal of a framework on factors to make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future ...

  16. Post-mortem cardiac diffusion tensor imaging: detection of myocardial infarction and remodeling of myofiber architecture

    International Nuclear Information System (INIS)

    Winklhofer, Sebastian; Berger, Nicole; Stolzmann, Paul; Stoeck, Christian T.; Kozerke, Sebastian; Thali, Michael; Manka, Robert; Alkadhi, Hatem

    2014-01-01

    To investigate the accuracy of post-mortem diffusion tensor imaging (DTI) for the detection of myocardial infarction (MI) and to demonstrate the feasibility of helix angle (HA) calculation to study remodelling of myofibre architecture. Cardiac DTI was performed in 26 deceased subjects prior to autopsy for medicolegal reasons. Fractional anisotropy (FA) and mean diffusivity (MD) were determined. Accuracy was calculated on a per-segment (AHA classification), per-territory and per-patient basis, with pathology as the reference standard. HAs were calculated and compared between healthy segments and those with MI. Autopsy demonstrated MI in 61/440 segments (13.9%) in 12/26 deceased subjects. Healthy myocardial segments had significantly higher FA than infarcted segments (p < 0.05). Post-mortem cardiac DTI enables differentiation between healthy and infarcted myocardial segments by means of FA and MD. HA assessment allows for the demonstration of remodelling of myofibre architecture following chronic MI. (orig.)
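
    Fractional anisotropy and mean diffusivity are scalar summaries of the diffusion tensor: MD is the mean of its three eigenvalues and FA measures how strongly they deviate from isotropy. The minimal sketch below shows the conventional formulas; the eigenvalues are arbitrary illustrative numbers, not values from this study.

```python
# Hedged sketch: conventional FA and MD from diffusion-tensor eigenvalues.
import numpy as np

def fa_md(eigvals) -> tuple:
    """MD = mean(lambda); FA = sqrt(3/2) * ||lambda - MD|| / ||lambda||."""
    lam = np.asarray(eigvals, dtype=float)
    md = lam.mean()
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return fa, md

print(fa_md([1.7e-3, 0.3e-3, 0.2e-3]))  # strongly anisotropic (fibre-like), FA ~0.84
print(fa_md([0.9e-3, 0.8e-3, 0.8e-3]))  # nearly isotropic, FA ~0.07
```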

  17. Analysis framework for GLORIA

    Science.gov (United States)

    Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian

    2012-05-01

    GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes, and/or by analyzing data that other users have acquired with GLORIA or data from other free-access databases, such as the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA was based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and any additional output it creates is added to that collection. The advantage of such a modular approach is to keep things as simple as possible. Every single step of the full analysis chain, which goes e.g. from raw images to light curves, can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
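
    The processor model described above can be illustrated in a few lines: each step reads named entries from a shared collection and appends its own output, so steps stay independent and can be chained or run separately. This is a generic sketch of the pattern, not the actual Luiza or Marlin API; the class and function names are invented for illustration.

```python
# Hedged sketch of a Marlin-style processing chain: every step is a "processor"
# that reads from a shared event collection and only adds new entries to it.
# Names (Processor, run_chain, ...) are illustrative, not the Luiza/Marlin API.

class Processor:
    def process(self, event: dict) -> None:
        raise NotImplementedError

class DarkFrameSubtraction(Processor):
    def process(self, event: dict) -> None:
        event["calibrated"] = [x - 1.0 for x in event["raw_image"]]  # toy calibration

class PhotometryProcessor(Processor):
    def process(self, event: dict) -> None:
        event["light_curve_point"] = sum(event["calibrated"])        # toy photometry

def run_chain(event: dict, processors: list) -> dict:
    for p in processors:
        p.process(event)        # each step appends its output to the same collection
    return event

event = {"raw_image": [3.0, 4.0, 5.0]}
print(run_chain(event, [DarkFrameSubtraction(), PhotometryProcessor()]))
# -> raw_image, calibrated and light_curve_point all live in one self-consistent record
```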

  18. Post-mortem detection of gasoline residues in lung tissue and heart blood of fire victims.

    Science.gov (United States)

    Pahor, Kevin; Olson, Greg; Forbes, Shari L

    2013-09-01

    The purpose of this study was to determine whether gasoline residues could be detected post-mortem in lung tissue and heart blood of fire victims. The lungs and heart blood were investigated to determine whether they were suitable samples for collection and could be collected without contamination during an autopsy. Three sets of test subjects (pig carcasses) were investigated under two different fire scenarios. Test subjects 1 were anaesthetized following animal ethics approval, inhaled gasoline vapours for a short period and were then euthanized. The carcasses were clothed and placed in a house where additional gasoline was poured onto the carcass post-mortem in one fire, but not in the other. Test subjects 2 did not inhale gasoline, were clothed and placed in the house and had gasoline poured onto them in both fires. Test subjects 3 were clothed but had no exposure to gasoline either ante- or post-mortem. Following controlled burns and suppression with water, the carcasses were collected, and their lungs and heart blood were excised at a necropsy. The headspace from the samples was analysed using thermal desorption-gas chromatography-mass spectrometry. Gasoline was identified in the lungs and heart blood from the subjects that were exposed to gasoline vapours prior to death (test subjects 1). All other samples were negative for gasoline residues. These results suggest that it is useful to analyse for volatile ignitable liquids in lung tissue and blood, as it may help to determine whether a victim was alive and inhaling gases at the time of a fire.

  19. The importance of post-mortem computed tomography (PMCT) in confrontation with conventional forensic autopsy of victims of motorcycle accidents.

    Science.gov (United States)

    Moskała, Artur; Woźniak, Krzysztof; Kluza, Piotr; Romaszko, Karol; Lopatin, Oleksij

    2016-01-01

    Since traffic accidents are an important problem in forensic medicine, there is a constant search for new solutions to help with the investigation process in such cases. In recent years there has been rapid development of post-mortem imaging techniques, especially post-mortem computed tomography (PMCT). In our work we concentrated on the potential advantage of PMCT in cases of motorcycle accident fatalities. The results of forensic autopsy were compared with the combined results of autopsy and PMCT to check in which areas the use of these two techniques gives a statistically significant increase in the number of findings. The hypothesis was confirmed in the case of pneumothorax and fractures of the skull, spine, clavicle, scapula and lower leg bones. For the majority of other fracture locations and for brain injuries, there were single cases with pathologies visible only on PMCT, but too few to reach the expected level of significance (p-value). For injuries of solid organs and soft tissues, statistical analysis did not confirm any advantage of unenhanced PMCT. On the whole, it has been shown that PMCT used as an adjunct to forensic autopsy can increase the information obtained about vitally important regions in cases of motorcycle accident fatalities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Optimising imaging parameters for post mortem MR imaging of the human brain

    Energy Technology Data Exchange (ETDEWEB)

    Blamire, A.M.; Rowe, J.G.; Styles, P. [MRC Magnetic Resonance Spectroscopy Unit, Dept. of Biochemistry, Univ. of Oxford (United Kingdom); McDonald, B. [MRC Schizophrenia Research Group, Dept. of Neuropathology, Radcliffe Infirmary, Oxford (United Kingdom)

    1999-11-01

    Purpose: MR imaging of post mortem brains has the potential to yield volumetric information and define the extent of structural changes prior to pathological examination. Although standard T2-weighted clinical imaging sequences have been used for the examination of formalin-fixed brains, these protocols may not yield optimum contrast. We examined the effect of varying durations of formalin fixation on the transverse relaxation time (T2) and the tissue spin density. Material and Methods: Three post mortem brains were examined weekly during formalin fixation, from the unfixed state to 35 days of fixation. Standard MR spin-echo imaging was used at 5 echo times (20-100 ms) to calculate the transverse relaxation time (T2) and spin density. Results: T2 decreased significantly (ANOVA, p<0.001) in both grey and white matter by 7 days of fixation, and there was a further (but non-significant) trend towards lower values between 7 and 35 days. Grey and white matter T2 times converged with fixation. Conversely, the grey to white matter spin density ratio increased from 1.19±0.01 to 1.54±0.06 over five weeks of fixation. Conclusion: Our results suggest that spin density-weighted imaging sequences would provide improved grey to white matter contrast over T2-weighted sequences. (orig.)
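
    The T2 and spin-density values above come from fitting a mono-exponential decay, S(TE) = S0·exp(-TE/T2), to the signal measured at the five echo times. A minimal log-linear fit of this kind is sketched below with synthetic signal values; it illustrates the fitting step only and is not the authors' processing pipeline.

```python
# Hedged sketch: estimating T2 and spin density (S0) from multi-echo spin-echo data.
# ln S = ln S0 - TE / T2, so a straight-line fit gives slope = -1/T2.
# The signal values are synthetic placeholders, not measurements from the study.
import numpy as np

te_ms = np.array([20.0, 40.0, 60.0, 80.0, 100.0])   # the five echo times used
true_s0, true_t2 = 1000.0, 75.0
signal = true_s0 * np.exp(-te_ms / true_t2)          # noiseless synthetic decay

slope, intercept = np.polyfit(te_ms, np.log(signal), 1)
t2_est = -1.0 / slope
s0_est = np.exp(intercept)
print(f"T2 ~ {t2_est:.1f} ms, spin density (S0) ~ {s0_est:.0f}")
```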

  1. Quantification of maceration changes using post mortem MRI in fetuses

    International Nuclear Information System (INIS)

    Montaldo, P.; Addison, S.; Oliveira, V.; Lally, P. J.; Taylor, A. M.; Sebire, N. J.; Thayyil, S.; Arthurs, O. J.

    2016-01-01

    Post mortem imaging is playing an increasingly important role in perinatal autopsy, and correct interpretation of imaging changes is paramount. This is particularly important following intra-uterine fetal death, where there may be fetal maceration. The aim of this study was to investigate whether any changes seen on whole-body fetal post mortem magnetic resonance imaging (PMMR) correspond to maceration at conventional autopsy. We performed pre-autopsy PMMR in 75 fetuses using a 1.5 Tesla Siemens Avanto MR scanner (Erlangen, Germany). PMMR images were reported blinded to the clinical history and autopsy data using a numerical severity scale (0 = no maceration changes to 2 = severe maceration changes) for 6 different visceral organs (total 12). The degree of maceration at autopsy was categorized according to severity on a numerical scale (1 = no maceration to 4 = severe maceration). We also generated quantitative maps to measure liver and lung T2. The mean PMMR maceration score correlated well with the autopsy maceration score (R² = 0.93). A PMMR score of ≥4.5 had a sensitivity of 91% and a specificity of 64% for detecting moderate or severe maceration at autopsy. Liver and lung T2 were increased in fetuses with maceration scores of 3–4 in comparison to those with scores of 1–2 (liver p = 0.03, lung p = 0.02). There was a good correlation between the PMMR maceration score and the extent of maceration seen at conventional autopsy. This score may be useful in the interpretation of fetal PMMR.

  2. Hypothermic death: possibility of diagnosis by post-mortem computed tomography.

    Science.gov (United States)

    Kawasumi, Yusuke; Onozuka, Naoki; Kakizaki, Ayana; Usui, Akihito; Hosokai, Yoshiyuki; Sato, Miho; Saito, Haruo; Ishibashi, Tadashi; Hayashizaki, Yoshie; Funayama, Masato

    2013-02-01

    Referring to our experience with post-mortem computed tomography (CT), many hypothermic death cases presented a lack of increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and urine retention in the bladder. Thus we evaluated the diagnostic performance of post-mortem CT on hypothermic death based on the above-mentioned three findings. Twenty-four hypothermic death subjects and 53 non-hypothermic death subjects were examined. Two radiologists assessed the presence or lack of an increase in lung-field concentration, blood clotting in the heart, thoracic aorta or pulmonary artery, and measured urine volume in the bladder. Pearson's chi-square test and Mann-Whitney U-test were used to assess the relationship between the three findings and hypothermic death. The sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) of the diagnosis were also calculated. Lack of an increase in lung-field concentration and blood clotting in the heart, thoracic aorta or pulmonary artery were significantly associated with hypothermic death (p=0.0007, p<0.0001, respectively). The hypothermic death cases had significantly more urine in the bladder than the non-hypothermic death cases (p=0.0011). Regarding the diagnostic performance with all three findings, the sensitivity was 29.2% but the specificity was 100%. These three findings were more common in hypothermic death cases. Although the sensitivity was low, these findings will assist forensic physicians in diagnosing hypothermic death since the specificity was high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  3. Forensic aspects of incised wounds and bruises in pigs established post-mortem

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2017-01-01

    Recognizing post-mortem (PM) changes is of crucial importance in veterinary forensic pathology. In porcine wounds established PM, contradicting observations regarding infiltration of leukocytes have been described. In the present study, skin, subcutis and muscle tissue sampled from experimental pigs ... of sampling. Moreover, it was found that ante-mortem (AM) bruises free of leukocyte infiltration cannot be distinguished from PM bruises, an observation which is of crucial importance when timing bruises in forensic cases.

  4. Intravital and post-mortem CT examinations of cerebral gunshot injuries

    International Nuclear Information System (INIS)

    Schumacher, M.; Oehmichen, M.; Koenig, H.G.; Einighammer, H.; Koeln Univ.; Tuebingen Univ.; Duesseldorf Univ.

    1983-01-01

    The value of CT was assessed in 24 patients who died of cerebral gunshot injuries and in two patients with more recent injuries, in order to reconstruct the mode of injury and to add forensic information. The post-mortem and intravital appearances are described and compared with ultrasound rotation compound scans of the isolated brains. CT showed good agreement with pathological findings. Ultrasound produced images with an accuracy between that of CT and photographs of the brain specimens. Both methods are regarded as valuable additions to the pathological and forensic information concerning gunshot injuries. (orig.)

  5. Investigations into the analysis of the rate of decay of the compound action potentials recorded from the rat sciatic nerve after death: significance for the prediction of the post-mortem period.

    Science.gov (United States)

    Nokes, L D; Daniel, D; Flint, T; Barasi, S

    1991-01-01

    There have been a number of papers reporting investigations of electrical stimulation of muscle groups in order to determine the post-mortem period. To the authors' knowledge, no techniques have been described that analyse the compound action potentials (CAP) of various nerve fibre groups after death. This paper reports the monitoring of both the amplitude and latency changes of the CAP recorded from a stimulated rat sciatic nerve after death. Initial results suggest that the method may be useful in determining the early post-mortem period within 1 or 2 h after death. It may also be of use in measuring nerve conduction delay in various pathological conditions that can affect the neural network; for example, diabetes.

  6. Breast density quantification with cone-beam CT: a post-mortem study

    International Nuclear Information System (INIS)

    Johnson, Travis; Ding, Huanjun; Le, Huy Q; Ducote, Justin L; Molloi, Sabee

    2013-01-01

    Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The per cent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson's r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. (paper)
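
    Fuzzy c-means segmentation assigns each voxel a membership in the adipose and fibroglandular classes, and %FGV then follows as the fraction of voxels falling in the denser class. The sketch below is a generic two-class FCM on synthetic intensities, included only to illustrate the technique named in the abstract; it is not the authors' implementation, and the intensity values are placeholders.

```python
# Hedged sketch: two-class fuzzy c-means (FCM) on voxel intensities, followed by a
# percent-fibroglandular-volume (%FGV) estimate. Generic illustration only.
import numpy as np

def fcm_two_class(x, m=2.0, iters=50):
    """Fuzzy c-means with two clusters on a 1-D intensity array."""
    c = np.array([x.min(), x.max()], dtype=float)        # initial centroids
    for _ in range(iters):
        d = np.abs(x[:, None] - c[None, :]) + 1e-12      # distances to both centroids
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        c = np.sum((u ** m) * x[:, None], axis=0) / np.sum(u ** m, axis=0)
    return u, c

rng = np.random.default_rng(1)
adipose = rng.normal(-100.0, 15.0, 7000)                 # synthetic CT numbers
fibroglandular = rng.normal(40.0, 15.0, 3000)
voxels = np.concatenate([adipose, fibroglandular])

u, centroids = fcm_two_class(voxels)
fg_class = int(np.argmax(centroids))                     # denser class = fibroglandular
percent_fgv = 100.0 * np.mean(np.argmax(u, axis=1) == fg_class)
print(f"%FGV ~ {percent_fgv:.1f}%")                      # ~30% for this synthetic phantom
```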

  7. Establishing a framework for comparative analysis of genome sequences

    Energy Technology Data Exchange (ETDEWEB)

    Bansal, A.K.

    1995-06-01

    This paper describes a framework and a high-level language toolkit for comparative analysis of genome sequence alignments. The framework integrates the information derived from a multiple sequence alignment and a phylogenetic tree (hypothetical tree of evolution) to derive new properties about sequences. Multiple sequence alignments are treated as an abstract data type. Abstract operations are described to manipulate a multiple sequence alignment and to derive mutation-related information from a phylogenetic tree by superimposing parsimonious analysis. The framework has been applied to protein alignments to derive constrained columns (in a multiple sequence alignment) that exhibit evolutionary pressure to preserve a common property in a column despite mutation. A Prolog toolkit based on the framework has been implemented and demonstrated on alignments containing 3000 sequences and 3904 columns.
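
    The 'constrained column' idea can be made concrete: a column whose residues differ across sequences (so mutation has occurred) but which all share one physico-chemical class suggests pressure to preserve that property. The toy alignment and coarse residue classes below are illustrative assumptions, and the sketch is written in Python rather than the Prolog toolkit described in the paper.

```python
# Hedged sketch: find alignment columns that vary in residue identity but are
# uniform in a coarse physico-chemical class. Toy data and toy classes only.

HYDROPHOBIC, POLAR, CHARGED = "AVLIMFWYC", "STNQGP", "DEKRH"

def residue_class(aa: str) -> str:
    for name, members in (("hydrophobic", HYDROPHOBIC), ("polar", POLAR), ("charged", CHARGED)):
        if aa in members:
            return name
    return "other"

def constrained_columns(alignment):
    """Columns not identical across sequences but uniform in property class."""
    out = []
    for col in range(len(alignment[0])):
        residues = {seq[col] for seq in alignment}
        classes = {residue_class(aa) for aa in residues}
        if len(residues) > 1 and len(classes) == 1 and "-" not in residues:
            out.append(col)
    return out

alignment = ["MKVLDE",
             "MKILDE",
             "MKLLNE"]
print(constrained_columns(alignment))  # column 2 varies (V/I/L) yet stays hydrophobic
```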

  8. Is survival improved by the use of NIV and PEG in amyotrophic lateral sclerosis (ALS)? A post-mortem study of 80 ALS patients

    OpenAIRE

    Burkhardt, Christian; Neuwirth, Christoph; Sommacal, Andreas; Andersen, Peter M.; Weber, Markus

    2017-01-01

    Background: Non-invasive ventilation (NIV) and percutaneous gastrostomy (PEG) are guideline-recommended interventions for symptom management in amyotrophic lateral sclerosis (ALS). Their effect on survival is controversial and the impact on causes of death is unknown. Objective: To investigate the effect of NIV and PEG on survival and causes of death in ALS patients. Methods: Eighty deceased ALS patients underwent a complete post mortem analysis for causes of death between 2003 and 2015. Fort...

  9. Overview of the NRC/EPRI common cause analysis framework

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Worledge, D.H.; Mosleh, A.; Fleming, K.; Parry, G.W.; Paula, H.

    1988-01-01

    This paper presents an overview of a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures whose causes are not explicitly included in the logic model as basic events. The emphasis here is on providing guidelines for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework comprises four major stages: (1) Logic Model Development, (2) Identification of Common Cause Component Groups, (3) Common Cause Modeling and Data Analysis, and (4) Quantification and Interpretation of Results. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. 25 references

  10. Towards muscle-specific meat color stability of Chinese Luxi yellow cattle: A proteomic insight into post-mortem storage.

    Science.gov (United States)

    Wu, Wei; Yu, Qian-Qian; Fu, Yu; Tian, Xiao-Jing; Jia, Fei; Li, Xing-Min; Dai, Rui-Tong

    2016-09-16

    Searching for potential predictors of meat color is a challenging task for the meat industry. In this study, the relationship between meat color parameters and the sarcoplasmic proteome of M. longissimus lumborum (LL) and M. psoas major (PM) from Chinese Luxi yellow cattle during post-mortem storage (0, 5, 10 and 15 days) was explored with the aid of integrated proteomics and bioinformatics approaches. Meat color attributes revealed that LL displayed better color stability than PM during storage. Furthermore, the sarcoplasmic proteins of these two muscles were compared between days 5, 10 and 15 and day 0. Several proteins were closely correlated with meat color attributes; they were muscle-specific and responsible for meat color stability at different storage periods. Glycerol-3-phosphate dehydrogenase, fructose-bisphosphate aldolase A isoform, glycogen phosphorylase, peroxiredoxin-2, phosphoglucomutase-1, superoxide dismutase [Cu-Zn] and heat shock cognate protein (71 kDa) might serve as candidate predictors of meat color stability during post-mortem storage. In addition, bioinformatics analyses indicated that more proteins were involved in the glycolytic metabolism of LL, which contributed to the better meat color stability of LL compared with PM. The present results could provide a proteomic insight into the muscle-specific meat color stability of Chinese Luxi yellow cattle during post-mortem storage. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Small vessel disease, neurovascular regulation and cognitive impairment: post-mortem studies reveal a complex relationship, still poorly understood.

    Science.gov (United States)

    Love, Seth; Miners, J Scott

    2017-07-15

    The contribution of vascular disease to cognitive impairment is under-recognized and the pathogenesis is poorly understood. This information gap has multiple causes, including a lack of post-mortem validation of clinical diagnoses of vascular cognitive impairment (VCI) or vascular dementia (VaD), the exclusion of cases with concomitant neurodegenerative disease when diagnosing VCI/VaD, and a lack of standardization of neuropathological assessment protocols for vascular disease. Other contributors include a focus on end-stage destructive lesions to the exclusion of more subtle types of diffuse brain injury, on structural abnormalities of arteries and arterioles to the exclusion of non-structural abnormalities and capillary damage, and the use of post-mortem sampling strategies that are biased towards the identification of neurodegenerative pathologies. Recent studies have demonstrated the value of detailed neuropathology in characterizing vascular contributions to cognitive impairment (e.g. in diabetes), and highlight the importance of diffuse white matter changes, capillary damage and vasoregulatory abnormalities in VCI/VaD. The use of standardized, evidence-based post-mortem assessment protocols and the inclusion of biochemical as well as morphological methods in neuropathological studies should improve the accuracy of determination of the contribution of vascular disease to cognitive impairment and clarify the relative contribution of different pathogenic processes to the tissue damage. © 2017 The Author(s). published by Portland Press Limited on behalf of the Biochemical Society.

  12. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Project ... overview of the process and the structure of the Logical Framework Matrix or Logframe, derivable from it, ... System Approach to Managing The Project.

  13. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), a kind of artificial intelligence (AI) technology, was successfully introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea behind IRCMA is that the historical records of RCM analysis on similar items can be referenced and reused for the current RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising similar cases when conducting RCM analysis. Based on previous theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research focuses on the description of the definition, basic principles and framework of IRCMA, and on a discussion of the critical techniques involved. Finally, the IRCMAS prototype is presented based on a case study.
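
    The central CBR step is retrieval: a new item is compared with previously analysed items, and the most similar historical RCM case is reused and revised. The sketch below shows a minimal attribute-matching retrieval; the attribute scheme, similarity measure and example cases are assumptions for illustration and do not reflect the actual IRCMAS design.

```python
# Hedged sketch: retrieving the most similar historical RCM case for a new item.
# Attribute scheme, similarity measure and cases are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RCMCase:
    item: str
    attributes: dict            # e.g. {"function": ..., "failure_mode": ..., "environment": ...}
    recommended_tasks: list

def similarity(a: dict, b: dict) -> float:
    """Fraction of matching attribute values over the union of attribute keys."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def retrieve(case_base, query: dict) -> RCMCase:
    return max(case_base, key=lambda case: similarity(case.attributes, query))

case_base = [
    RCMCase("hydraulic pump", {"function": "fluid transfer", "failure_mode": "seal leak",
                               "environment": "vehicle"}, ["periodic seal inspection"]),
    RCMCase("cooling fan", {"function": "air flow", "failure_mode": "bearing wear",
                            "environment": "cabinet"}, ["vibration monitoring"]),
]
new_item = {"function": "fluid transfer", "failure_mode": "seal leak", "environment": "vessel"}
best = retrieve(case_base, new_item)
print(best.item, "->", best.recommended_tasks)  # reuse and revise this prior analysis
```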

  14. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018: Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, Keio University. Final report, 26 May 2015 to 25 Nov 2016. ... analyze malicious software (malware) with minimum human interaction. The system autonomously analyzes malware samples by analyzing malware binary program ...

  15. Value of systematic post mortem radiographic examinations of fetuses - 400 cases

    Energy Technology Data Exchange (ETDEWEB)

    Kalifa, G.; Sellier, N.; Barbet, J.P.; Labbe, F.; Houette, A.

    1989-01-01

    A retrospective study of 400 cases of fetal death was carried out to assess the value of systematic post mortem radiological examination. Apart from general diagnostic purposes, special attention was given to the assessment of bone age and mineralization. The results were correlated with the clinical, ultrasound, chromosomal and pathological data. Computerized analysis of the data shows the following results: (1) the radiological examination was valuable for the final diagnosis in 13.5% of cases; (2) it brought additional information in 34.5% of cases; (3) it had no diagnostic value in 52%. Furthermore, several points deserve attention, such as the appearance of the teeth (21 weeks) and of the calcaneum (24 weeks). Major osteoporosis was always associated with a constitutional bone disease or an infectious process. An excessive length of the upper limbs (12) was seen in 11 cases of anencephaly. We suggest that a radiological examination should not be routinely performed when the diagnosis is otherwise obvious, but should be considered in the presence of dwarfism or other limb abnormalities, and when the gestational age is uncertain. The films provide essential information, especially for further genetic counselling.

  16. Value of systematic post mortem radiographic examinations of fetuses - 400 cases

    International Nuclear Information System (INIS)

    Kalifa, G.; Sellier, N.; Barbet, J.P.; Labbe, F.; Houette, A.

    1989-01-01

    A retrospective study of 400 cases of fetal death was carried out to assess the value of systematic post mortem radiological examination. Apart from general diagnostic purposes, special attention was given to the assessment of bone age and mineralization. The results were correlated with the clinical, ultrasound, chromosomal and pathological data. Computerized analysis of the data shows the following results: (1) the radiological examination was valuable for the final diagnosis in 13.5% of cases; (2) it brought additional information in 34.5% of cases; (3) it had no diagnostic value in 52%. Furthermore, several points deserve attention, such as the appearance of the teeth (21 weeks) and of the calcaneum (24 weeks). Major osteoporosis was always associated with a constitutional bone disease or an infectious process. An excessive length of the upper limbs (12) was seen in 11 cases of anencephaly. We suggest that a radiological examination should not be routinely performed when the diagnosis is otherwise obvious, but should be considered in the presence of dwarfism or other limb abnormalities, and when the gestational age is uncertain. The films provide essential information, especially for further genetic counselling. (orig./MG)

  17. Development of comprehensive and versatile framework for reactor analysis, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Numata, Kazuyuki; Jin, Tomoyuki

    2014-01-01

    Highlights: • We have developed a neutronics code system for reactor analysis. • The new code system covers all five phases of the core design procedures. • All the functionalities are integrated and validated in the same framework. • The framework supports continuous improvement and extension. • We report results of validation and practical applications. - Abstract: A comprehensive and versatile reactor analysis code system, MARBLE, has been developed. MARBLE is designed as a software development framework for reactor analysis, which offers reusable and extendible functions and data models based on physical concepts, rather than a reactor analysis code system. From a viewpoint of the code system, it provides a set of functionalities utilized in a detailed reactor analysis scheme for fast criticality assemblies and power reactors, and nuclear data related uncertainty quantification such as cross-section adjustment. MARBLE includes five sub-systems named ECRIPSE, BIBLO, SCHEME, UNCERTAINTY and ORPHEUS, which are constructed of the shared functions and data models in the framework. By using these sub-systems, MARBLE covers all phases required in fast reactor core design prediction and improvement procedures, i.e. integral experiment database management, nuclear data processing, fast criticality assembly analysis, uncertainty quantification, and power reactor analysis. In the present paper, these functionalities are summarized and system validation results are described

  18. The social impacts of dams: A new framework for scholarly analysis

    International Nuclear Information System (INIS)

    Kirchherr, Julian; Charles, Katrina J.

    2016-01-01

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  19. O líquido cefalorraqueano no post-mortem The cerebrospinal fluid in the post-mortem

    Directory of Open Access Journals (Sweden)

    A. Spina-França

    1969-12-01

    Full Text Available The CSF of 45 cadavers was studied, and the results were considered as a function of the time elapsed between the moment of death and the collection of the CSF (TOC). According to this criterion, the cases were grouped as follows: 1) those with a TOC of up to 4 hours; 2) those with a TOC of 4 to 8 hours; 3) those with a TOC of 8 hours or more. With increasing TOC, the presence of red blood cells in cadaveric CSF becomes more frequent and more pronounced. The admixture of blood with the CSF impairs the evaluation of post-mortem changes in other CSF components, as was demonstrated for the concentrations of chlorides, glucose and total proteins, for the protein profile and for transaminase activity. Therefore, to evaluate the changes in CSF composition proper to the post-mortem period, only cases with fewer than 1000 red blood cells/mm³ should be considered. A normal leukocyte count was proportionally more common in samples from cadavers whose TOC was equal to or greater than 8 hours. Pleocytosis was observed more frequently than a normal leukocyte count, and was most commonly slight or mild. Counts above 50 leukocytes/mm³ were generally observed in cases of patients who died in the course of acute infectious processes. The concentrations of chlorides and glucose in the CSF tend to fall post mortem, and the decreases were, on average, more intense the longer the TOC. Low CSF glucose was, on average, more marked in cases with more intense pleocytosis. The urea concentration tends to rise early, and no significantly different means were found as a function of the TOC. TGO activity tends to rise post mortem, this rise being, on average, clearer from the group of cases with a TOC of 4 to 8 hours onwards. There is also a tendency for TGP activity to increase; this was less intense than that of TGO and, on average, was clearer from

  20. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (the general one and the aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
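
    The AHP step mentioned above reduces pairwise expert judgements to numerical criterion weights. The sketch below illustrates that computation in Python; the criteria names, the comparison matrix and the consistency-check constants are illustrative assumptions, not values taken from the paper.

```python
# Sketch of the AHP step used to weight qualitative framework criteria.
# The criteria names and pairwise judgements below are hypothetical
# illustrations, not values from the paper.
import numpy as np

criteria = ["user interface", "code integration", "data management"]

# Saaty-style pairwise comparison matrix: entry [i, j] says how much more
# important criterion i is than criterion j (reciprocal matrix).
A = np.array([
    [1.0, 3.0, 0.5],
    [1/3, 1.0, 0.25],
    [2.0, 4.0, 1.0],
])

# Priority weights = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio (CR) checks that the judgements are not too contradictory;
# CR < 0.1 is the usual acceptance threshold. 0.58 is the random index for n = 3.
n = len(criteria)
ci = (eigvals.real[principal] - n) / (n - 1)
cr = ci / 0.58

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio: {cr:.3f}")
```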

  1. Rapid determination of quetiapine in blood by gas chromatography-mass spectrometry. Application to post-mortem cases.

    Science.gov (United States)

    López-Guarnido, Olga; Tabernero, María Jesús; Hernández, Antonio F; Rodrigo, Lourdes; Bermejo, Ana M

    2014-10-01

    A simple, fast and sensitive method for the determination of quetiapine in human blood has been developed and validated. The method involved a basic liquid-liquid extraction procedure and subsequent analysis by gas chromatography-mass spectrometry, after derivatization with bis(trimethylsilyl)trifluoroacetamide and chlorotrimethylsilane (99 : 1). The method validation included linearity with a correlation coefficient > 0.99 over the range 0.02-1 µg ml(-1), intra- and interday precision (always < 12%) and accuracy (mean relative error always < 12%), meeting the bioanalytical acceptance criteria. The limit of detection was 0.005 µg ml(-1). The procedure was further applied to post-mortem cases from the Institute of Legal Medicine, University of Santiago de Compostela. Copyright © 2013 John Wiley & Sons, Ltd.
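
    As a worked illustration of the linearity criterion quoted above (correlation coefficient > 0.99 over 0.02-1 µg/ml), the short sketch below fits a calibration line and back-calculates an unknown; all concentration/response pairs are invented for the example.

```python
# Minimal sketch of the calibration-linearity check described above.
# The concentration/response pairs are invented for illustration; only the
# working range (0.02-1 ug/ml) and the r > 0.99 acceptance criterion come
# from the abstract.
import numpy as np

conc = np.array([0.02, 0.05, 0.1, 0.25, 0.5, 1.0])       # ug/ml
resp = np.array([0.011, 0.026, 0.052, 0.13, 0.26, 0.51])  # peak-area ratio

slope, intercept = np.polyfit(conc, resp, 1)
r = np.corrcoef(conc, resp)[0, 1]

print(f"slope={slope:.4f}, intercept={intercept:.4f}, r={r:.4f}")
assert r > 0.99, "calibration fails the linearity acceptance criterion"

# Back-calculate an unknown case sample from its measured response.
unknown_response = 0.18
estimated_conc = (unknown_response - intercept) / slope
print(f"estimated quetiapine concentration: {estimated_conc:.3f} ug/ml")
```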

  2. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    The Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities that provide the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment is provided by the facility side, because meta-data formats, analysis functions and the data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework software for developing data reduction and analysis software. The framework is composed of the class library for data reduction and analysis operators, network-distributed data processing modules and data containers. The class library is wrapped by a Python interface created by SWIG. All classes of the framework can be called from the Python language, and Manyo-lib will cooperate with the data acquisition and data-visualization components through the MLF-platform, a user interface unified in MLF, which works on Python. Raw data in the event-data format obtained by the data acquisition systems will be converted into histogram-format data on Manyo-lib with high performance, and data reduction and analysis are performed with user-application software developed based on Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers in Manyo-lib have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data-analysis software for each instrument are being developed by the instrument teams.
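
    The sketch below illustrates the event-to-histogram reduction workflow the abstract describes, using plain NumPy and hypothetical class and function names; it is not the actual Manyo-lib API.

```python
# Conceptual sketch of the event-to-histogram reduction workflow the abstract
# describes. Class and function names here are hypothetical placeholders;
# they are NOT the actual Manyo-lib API.
import numpy as np

class HistogramContainer:
    """Toy stand-in for a framework data container holding a histogram."""
    def __init__(self, edges, counts, meta=None):
        self.edges = edges
        self.counts = counts
        self.meta = meta or {}

def reduce_events_to_histogram(event_tof_us, bin_width_us=10.0):
    """Convert raw event-mode time-of-flight data into a histogram container."""
    edges = np.arange(0.0, event_tof_us.max() + bin_width_us, bin_width_us)
    counts, _ = np.histogram(event_tof_us, bins=edges)
    return HistogramContainer(edges, counts, meta={"units": "microseconds"})

# Example: 100k fake neutron events, then user-level analysis on the container.
events = np.random.default_rng(0).uniform(0.0, 40000.0, size=100_000)
hist = reduce_events_to_histogram(events)
print(hist.counts.sum(), "events in", len(hist.counts), "bins")
```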

  3. Post-mortem 3H-thymidine incorporation in human epidermis and oral mucosa

    International Nuclear Information System (INIS)

    Schellmann, B.

    1981-01-01

    Using the 3H-thymidine labelling method, the authors studied post-mortem incorporation activity in the epidermis and oral mucosa of corpses which were stored with their clothes on under conditions of normal room temperature (+20 °C) and of cooling (+4 °C). Samples were taken in the form of skin punches at 2 h or 4 h intervals, respectively. Using histo-autoradiograms, the incorporation of 3H-thymidine as a function of the time interval between death and sampling was determined in situ and given as the ratio of labelled cells of the germinative layer per 100 μm length of basement membrane. A linear drop of post-mortem thymidine incorporation rates in epidermis and oral mucosa was found in human corpse skin, correlating with increasing time since death. Incorporation rates in the oral mucosa were markedly higher (by a factor of 3 to 5) than those of the epidermis, which agrees well with in vivo conditions. No labelling of cell nuclei, i.e. no synthetic activity of the germinative layer, could be detected in the epidermis 35-40 h after death at the latest (in the oral mucosa after 45-50 h). However, clear incorporation activity could still be observed in the germinative layer of epidermis and oral mucosa after more than 4 d in the case of storage at +4 °C. (orig./MG) [de]

  4. The influence of body temperature on image contrast in post mortem MRI

    International Nuclear Information System (INIS)

    Ruder, Thomas D.; Hatch, Gary M.; Siegenthaler, Lea; Ampanozi, Garyfalia; Mathier, Sandra; Thali, Michael J.; Weber, Oliver M.

    2012-01-01

    Objective: To assess the temperature dependency of tissue contrast on post mortem magnetic resonance (PMMR) images both objectively and subjectively; and to visually demonstrate the changes of image contrast at various temperatures. Materials and methods: The study was approved by the responsible justice department and the ethics committee. The contrast of water, fat, and muscle was measured using regions of interest (ROI) in the orbit of 41 human corpses to assess how body temperature (range 2.1–39.8 °C) relates to image contrast of T1-weighted (T1W) and T2-weighted (T2W) sequences on PMMR. Regressions were calculated using the method of least squares. Three readers judged visible changes of image contrast subjectively by consensus. Results: There was a positive relationship between temperature and contrast on T1-weighted (T1W) images and between temperature and the contrast of fat/muscle on T2-weighted (T2W) images. There was a negative relationship between temperature and the contrast of water/fat and water/muscle on T2W images. Subjectively, the influence of temperature became visible below 20 °C on T2W images, and below 10 °C on T1W images. Conclusion: Image contrast on PMMR depends on the temperature of a corpse. Radiologists involved in post mortem imaging must be aware of temperature-related changes in MR image contrast. To preserve technical quality, scanning corpses below 10 °C should be avoided.
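
    A minimal least-squares sketch of the temperature-versus-contrast regression described above follows; the temperature and contrast values are synthetic, and only the fitting method and the sign of the T1W trend follow the abstract.

```python
# Least-squares sketch of the temperature-versus-contrast regression described
# above. The temperature/contrast pairs are synthetic; only the method
# (ordinary least squares) and the positive T1W trend come from the abstract.
import numpy as np

temperature_c = np.array([2.5, 8.0, 14.0, 20.0, 26.0, 32.0, 39.5])
t1w_contrast  = np.array([0.12, 0.18, 0.22, 0.30, 0.36, 0.41, 0.50])

# Fit contrast = a * T + b by the method of least squares.
a, b = np.polyfit(temperature_c, t1w_contrast, 1)
predicted = a * temperature_c + b
r2 = 1 - np.sum((t1w_contrast - predicted) ** 2) / np.sum(
    (t1w_contrast - t1w_contrast.mean()) ** 2)

print(f"contrast ~ {a:.4f} * T + {b:.4f}  (R^2 = {r2:.3f})")
```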

  5. The social impacts of dams: A new framework for scholarly analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kirchherr, Julian, E-mail: julian.kirchherr@sant.ox.ac.uk; Charles, Katrina J., E-mail: katrina.charles@ouce.ox.ac.uk

    2016-09-15

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  6. Severe malaria - a case of fatal Plasmodium knowlesi infection with post-mortem findings: a case report

    Directory of Open Access Journals (Sweden)

    Adem Patricia

    2010-01-01

    Full Text Available Abstract Background Zoonotic malaria caused by Plasmodium knowlesi is an important, but newly recognized, human pathogen. For the first time, post-mortem findings from a fatal case of knowlesi malaria are reported here. Case presentation A formerly healthy 40-year-old male became symptomatic 10 days after spending time in the jungle of North Borneo. Four days later, he presented to hospital in a state of collapse and died within two hours. He was hyponatraemic and had elevated blood urea, potassium, lactate dehydrogenase and aminotransferase values; he was also thrombocytopenic and eosinophilic. Dengue haemorrhagic shock was suspected and a post-mortem examination performed. Investigations for dengue virus were negative. Blood examined for malaria parasites indicated hyperparasitaemia, and single-species P. knowlesi infection was confirmed by nested PCR. Macroscopic pathology of the brain and endocardium showed multiple petechial haemorrhages, the liver and spleen were enlarged and the lungs had features consistent with ARDS. Microscopic pathology showed sequestration of pigmented parasitized red blood cells in the vessels of the cerebrum, cerebellum, heart and kidney without evidence of chronic inflammatory reaction in the brain or any other organ examined. Brain sections were negative for intracellular adhesion molecule-1. The spleen and liver had abundant pigment-containing macrophages and parasitized red blood cells. The kidney had evidence of acute tubular necrosis and endothelial cells in heart sections were prominent. Conclusions The overall picture in this case was one of systemic malaria infection that fit the WHO classification for severe malaria. Post-mortem findings in this case were unexpectedly similar to those that define fatal falciparum malaria, including cerebral pathology. There were important differences including the absence of coma despite petechial haemorrhages and parasite sequestration in the brain. These results suggest that further

  7. Prevalence of Atherosclerotic Coronary Stenosis in Asymptomatic North Indian Population: A Post-mortem Coronary Angiography Study.

    Science.gov (United States)

    Bansal, Yogender Singh; Mandal, Shatrugan Prasad; Kumar, Senthil; Setia, Puneet

    2015-09-01

    A preliminary study of the coronary arteries using post-mortem angiography was undertaken to assess the prevalence of atherosclerotic coronary stenosis in non-cardiac unnatural deaths. This study was conducted in a tertiary care centre located in Chandigarh. A total of 128 medico-legal cases were studied, comprising 88 males and 40 females. Post-mortem examinations of these medico-legal cases were conducted in the Department of Forensic Medicine, PGIMER, Chandigarh. All hearts were first visually screened by post-mortem coronary angiography and then grossly examined using a serial transverse incision technique in screening-positive cases to determine the degree of narrowing. In the study group, 34% of males and 20% of females showed evidence of narrowing on angiography. Of the males showing coronary stenosis, 83% had single-vessel disease and 13% had double-vessel disease, while only one individual had triple-vessel disease. In females, all cases of coronary stenosis were single-vessel disease. The left anterior descending coronary artery (LAD) was the most commonly involved vessel, followed by the right coronary artery (RCA) and the left circumflex artery (LCX); in cases of double-vessel disease, LAD in combination with LCX was responsible for 75% of the cases. Remarkably, 23.6% of the study population under 40 years of age showed appreciable narrowing in at least one of the coronaries. In general, the prevalence of CAD is on the rise, particularly in the younger population, owing to changes in lifestyle and food habits. This preliminary study revealed evidence of narrowing of at least one coronary artery in 34% of the male and 20% of the female population, and 23.6% of subjects were less than 40 years old. Further detailed studies are needed, especially in the younger age group, to support the need for preventive cardiology in the early years of life.

  8. Practical experience in post-mortem tissue donation in consideration of the European tissue law.

    Science.gov (United States)

    Karbe, Thomas; Braun, Christian; Wulff, Birgit; Schröder, Ann Sophie; Püschel, Klaus; Bratzke, Hansjürgen; Parzeller, Markus

    2010-03-01

    In consequence of the European guidelines on safety and quality standards for the donation, retrieval, storage and distribution of human tissues and cells, tissue transplantation was implemented into German legislation in May 2007. The law came into effect on August 1st, 2007, in consideration of the European rules. The Institutes for Legal Medicine of the University of Frankfurt/Main and the University Medical Center Hamburg-Eppendorf developed a model for tissue retrieval. The Institute of Legal Medicine (I.f.R.) at the University Medical Center Hamburg cooperates with the German Institute of Cell and Tissue Replacement (Deutsches Institut für Zell- und Gewebeersatz, DIZG). Potential post-mortem tissue donors (PMTD) among the deceased are selected by standardized sets of defined criteria. The procedure is guided by the intended exclusion criteria of the draft tissue regulation (German Transplant Law, TPG GewV) in accordance with the European Guideline (2006/17/EC). Following identification of the donor and subsequent removal of tissue, the retrieved samples were sent to the DIZG, a non-profit tissue bank according to the tissue regulation. Here the final processing into transplantable tissue grafts takes place, which then results in the allocation of tissue to hospitals in Germany and other European countries. The Center of Legal Medicine at the Johann Wolfgang Goethe University Medical Center Frankfurt/Main has cooperated since 2000 with Tutogen, a pharmaceutical company. Harvesting of musculoskeletal tissues follows the corresponding regulations. To verify the outcome of PMTD at the I.f.R. Hamburg, two statistical analyses, covering 12 and 4 months, were carried out. Our results show an increasing number of potentially appropriate PMTD within the second inquiry interval but a relatively small and unvarying rate of successful post-mortem tissue retrievals, similar to the first examination period. Thus, the aim of the model developed by the I.f.R. is to

  9. Second generation CO2 FEP analysis: CASSIF - Carbon Sequestration Scenario Identification Framework

    NARCIS (Netherlands)

    Yavuz, F.T.; Tilburg, T. van; Pagnier, H.

    2008-01-01

    A novel scenario analysis framework has been created, called Carbon Sequestration Scenario Identification Framework (CASSIF). This framework addresses containment performance defined by the three major categories: well, fault and seal integrity. The relevant factors that influence the integrity are

  10. Analysis of death in major trauma: value of prompt post mortem computed tomography (pmCT) in comparison to office hour autopsy.

    Science.gov (United States)

    Schmitt-Sody, Markus; Kurz, Stefanie; Reiser, Maximilian; Kanz, Karl Georg; Kirchhoff, Chlodwig; Peschel, Oliver; Kirchhoff, Sonja

    2016-03-29

    To analyze the diagnostic accuracy of prompt post mortem computed tomography (pmCT) in determining causes of death in patients who died during trauma room management, and to compare the results to gold-standard autopsy during office hours. Multiple injured patients who died during trauma room care were enrolled. PmCT was performed immediately, followed by autopsy during office hours. PmCT and autopsy were analyzed primarily regarding the ability of pmCT to find causes of death and secondarily to define exact causes of death including accurate anatomic localizations. For the secondary analysis, data were divided into group-I with equal results of pmCT and autopsy, group-II with autopsy providing superior results, and group-III with pmCT providing superior information contributing to, but not mainly causing, death. Seventeen multiple trauma patients were enrolled. Since multiple trauma patients were enrolled, more injuries than patients are reported. Eight patients sustained fatal head injuries (47.1%), 11 chest (64.7%) and 4 skeletal system (23.5%) injuries, and one patient drowned (5.8%). The primary analysis revealed in 16/17 patients (94.1%) causes of death in accordance with autopsy. The secondary analysis revealed good agreement between autopsy and pmCT in 9/17 cases (group-I). In seven cases autopsy provided superior results (group-II), whereas in 1 case pmCT found more information (group-III). The presented work studied the diagnostic value of pmCT in defining causes of death in comparison to standard autopsy. The primary analysis revealed that in 94.1% of cases pmCT was able to define causes of death, even if only indirect signs were present. The secondary analysis showed that pmCT and autopsy gave equal results regarding causes of death in 52.9%. PmCT is useful in traumatic death, allowing for an immediate identification of causes of death and providing detailed information on bony lesions, brain injuries and gas formations. It is advisable to conduct pmCT especially in cases without consent to

  11. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...
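
    The abstract refers to control, signal and validation regions as core analysis concepts. The sketch below illustrates that idea generically with SciPy (a background normalisation is fitted in a control region and extrapolated to a signal region); it does not use the actual HistFitter API, and all event counts are invented.

```python
# Illustration of the control-region / signal-region idea the abstract refers
# to, written with plain scipy rather than the actual HistFitter API: a
# background normalisation is fitted in a control region and extrapolated to
# the signal region. All numbers are invented.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

observed_cr = 52          # events observed in the background control region
mc_background_cr = 45.0   # Monte Carlo background prediction in that region
mc_background_sr = 6.0    # MC background prediction in the signal region

def neg_log_likelihood(mu_b):
    """Poisson likelihood of the control-region count given scale factor mu_b."""
    return -poisson.logpmf(observed_cr, mu_b * mc_background_cr)

fit = minimize_scalar(neg_log_likelihood, bounds=(0.1, 10.0), method="bounded")
mu_b = fit.x

expected_sr = mu_b * mc_background_sr
print(f"fitted background scale factor: {mu_b:.3f}")
print(f"extrapolated background in signal region: {expected_sr:.2f} events")
```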

  12. Identification of discrete vascular lesions in the extremities using post-mortem computed tomography angiography – Case reports

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2017-01-01

    In this case report, we introduced post-mortem computed tomography angiography (PMCTA) in three cases suffering from vascular lesions in the upper extremities. In each subject, the third part of the axillary arteries and veins were used to catheterize the arms. The vessels were filled with a barium

  13. Post-mortem re-cloning of a transgenic red fluorescent protein dog.

    Science.gov (United States)

    Hong, So Gun; Koo, Ok Jae; Oh, Hyun Ju; Park, Jung Eun; Kim, Minjung; Kim, Geon-A; Park, Eun Jung; Jang, Goo; Lee, Byeong-Chun

    2011-12-01

    Recently, the world's first transgenic dogs were produced by somatic cell nuclear transfer. However, cellular senescence is a major limiting factor for producing more advanced transgenic dogs. To overcome this obstacle, we rejuvenated transgenic cells using a re-cloning technique. Fibroblasts from post-mortem red fluorescent protein (RFP) dog were reconstructed with in vivo matured oocytes and transferred into 10 surrogate dogs. One puppy was produced and confirmed as a re-cloned dog. Although the puppy was lost during birth, we successfully established a rejuvenated fibroblast cell line from this animal. The cell line was found to stably express RFP and is ready for additional genetic modification.

  14. Unusually extensive head trauma in a hydraulic elevator accident: post-mortem MSCT findings, autopsy results and scene reconstruction.

    Science.gov (United States)

    Jacobsen, Christina; Schön, Corinna A; Kneubuehl, Beat; Thali, Michael J; Aghayev, Emin

    2008-10-01

    Accidental or intentional falls from a height are a form of blunt trauma and occur frequently in forensic medicine. Reports describing elevator accidents as a small subcategory of falls from heights are rare in the medical literature, and no report on injury patterns or scene reconstruction of such an accident was found. A case of an accident in a hydraulic elevator, with a man falling 3 m, was examined using post-mortem multi-slice computed tomography (MSCT) and autopsy. The man suffered an unusually extensive trauma and died at the scene. Post-mortem MSCT examination showed a comminuted fracture of the skull, the right femur and the first lumbar vertebra. Severe lacerations of the brain with epidural, subdural and subarachnoid haemorrhages over both hemispheres were diagnosed. Autopsy confirmed these findings. To reconstruct the accident we used radiological and autopsy results as well as findings at the scene.

  15. A Conceptual Framework over Contextual Analysis of Concept Learning within Human-Machine Interplays

    DEFF Research Database (Denmark)

    Badie, Farshad

    2016-01-01

    This research provides a contextual description concerning existential and structural analysis of ‘Relations’ between human beings and machines. Subsequently, it will focus on conceptual and epistemological analysis of (i) my own semantics-based framework [for human meaning construction] and of (ii) a well-structured machine concept learning framework. Accordingly, I will, semantically and epistemologically, focus on linking those two frameworks for logical analysis of concept learning in the context of human-machine interrelationships. It will be demonstrated that the proposed framework provides...

  16. Note sulla concezione del post mortem presso gli Ittiti

    DEFF Research Database (Denmark)

    Vigo, Matteo; Bellucci, Benedetta

    The object of analysis of this paper is the conception of the post-mortem sphere in the Hittite civilization. The study of the themes concerning the conception of the "afterlife" in the Hittite world, both in a physical and in a metaphysical sense, is of great importance in the Hittitological literature and has therefore been treated extensively. In the present contribution it was therefore chosen to examine in depth only some aspects of the conception of the netherworld, understood as the metaphysical dimension in which the body of the deceased lies, rests or passes after death. The themes examined are therefore: 1) the eschatological meaning of death; 2) the definition of the otherworldly status of the deceased (i.e. the different treatment of the deceased, for example from a cultic point of view, on the basis of their social condition); 3) the delineation of the characters and functions of the otherworldly deities; 4) the...

  17. Post-mortem virtual estimation of free abdominal blood volume

    International Nuclear Information System (INIS)

    Ampanozi, Garyfalia; Hatch, Gary M.; Ruder, Thomas D.; Flach, Patricia M.; Germerott, Tanja; Thali, Michael J.; Ebert, Lars C.

    2012-01-01

    Purpose: The purpose of this retrospective study was to examine the reliability of virtually estimated abdominal blood volume using segmentation from postmortem computed tomography (PMCT) data. Materials and methods: Twenty-one cases with free abdominal blood were investigated by PMCT and autopsy. The volume of the blood was estimated using a manual segmentation technique (Amira, Visage Imaging, Germany) and the results were compared to autopsy data. Six of 21 cases had undergone additional post-mortem computed tomographic angiography (PMCTA). Results: The virtually estimated abdominal blood volumes did not differ significantly from those measured at autopsy. Additional PMCTA did not bias data significantly. Conclusion: Virtual estimation of abdominal blood volume is a reliable technique. The virtual blood volume estimation is a useful tool to deliver additional information in cases where autopsy is not performed or in cases where a postmortem angiography is performed
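
    A segmentation-based volume estimate of this kind reduces to counting labelled voxels and multiplying by the voxel volume. The sketch below shows that arithmetic with a synthetic mask and voxel spacing; the study itself used manual segmentation in Amira.

```python
# Sketch of how a segmented blood volume can be computed from PMCT data by
# voxel counting. The mask and voxel spacing below are synthetic; the actual
# study used manual segmentation in Amira.
import numpy as np

# Hypothetical binary segmentation mask (True = voxel labelled as free blood).
rng = np.random.default_rng(1)
mask = rng.random((200, 512, 512)) > 0.999   # sparse toy mask

# Voxel spacing in millimetres (slice thickness, row, column).
spacing_mm = (1.0, 0.7, 0.7)
voxel_volume_ml = np.prod(spacing_mm) / 1000.0  # 1 ml = 1000 mm^3

blood_volume_ml = mask.sum() * voxel_volume_ml
print(f"estimated free abdominal blood volume: {blood_volume_ml:.0f} ml")
```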

  18. Role of forensic odontologist in post mortem person identification

    Directory of Open Access Journals (Sweden)

    Jahagirdar B Pramod

    2012-01-01

    Full Text Available The natural teeth are the most durable organs in the bodies of vertebrates, and humankind's understanding of their own past and evolution relies heavily upon remnant dental evidence found as fossils. The use of features unique to the human dentition as an aid to personal identification is widely accepted within the forensic field. Comparative dental identifications play a major role in identifying the victims of violence, disaster or other mass tragedies. The comparison of ante-mortem and postmortem dental records to determine human identity has long been established. Indeed, it is still a major identification method in criminal investigations, mass disasters, grossly decomposed or traumatized bodies, and in other situations where visual identification is neither possible nor desirable. This article has comprehensively described some of the methods, and additional factors, aiding in postmortem person identification.

  19. Post-mortem toxicology in young sudden cardiac death victims

    DEFF Research Database (Denmark)

    Bjune, Thea; Risgaard, Bjarke; Kruckow, Line

    2017-01-01

    Aims: Several drugs increase the risk of ventricular fibrillation and sudden cardiac death (SCD). We aimed to investigate in detail the toxicological findings of all young SCD throughout Denmark. Methods and results: Deaths in persons aged 1-49 years were included over a 10-year period. Death certificates and autopsy reports were retrieved and read to identify cases of sudden death and establish cause of death. All medico-legal autopsied SCD were included and toxicological reports collected. Positive toxicology was defined as the presence of any substance (licit and/or illicit). All toxicological findings had previously been evaluated not to have caused the death (i.e. lethal concentrations were excluded). We identified 620 medico-legal autopsied cases of SCD, of which 77% (n = 477) were toxicologically investigated post-mortem, and 57% (n = 270) had a positive toxicology profile. Sudden cardiac...

  20. Diagnosis of porcine enzootic pneumonia by post mortem sanitary inspection: comparison with other diagnostic methods

    Directory of Open Access Journals (Sweden)

    Kênia de Fátima Carrijo

    2014-06-01

    Full Text Available ABSTRACT. Carrijo K.F., Nascimento E.R., Pereira V.L.A., Morés N., Klein C.S., Domingues L.M. & Tortelly R. [Diagnosis of porcine enzootic pneumonia by post mortem sanitary inspection: comparison with other diagnostic methods.] Diagnóstico da pneumonia enzoótica suína pela inspeção sanitária post mortem: comparação com outros métodos de diagnóstico. Revista Brasileira de Veterinária Brasileira 36(2):188-194, 2014. Faculdade de Medicina Veterinária, Universidade Federal de Uberlândia, Av. Pará, 1720, Bloco 2T, Jardim Umuarama, Uberlândia, MG 38400-902, Brasil. E-mail: keniacarrijo@famev.ufu.br To compare the concordance of the diagnosis of porcine enzootic pneumonia (PEP) by post-mortem sanitary inspection with other methods (histopathology and immunohistochemistry, IHC), lung tissue samples from 100 pigs slaughtered under sanitary inspection were used; 50 of these had macroscopic lesions suggestive of PEP and 50 had no such lesions. These were fixed in 10% buffered formalin and processed by routine procedures for paraffin embedding and by the IHC technique for Mycoplasma hyopneumoniae using a monospecific polyclonal antibody. The study demonstrated concordance between the diagnosis by sanitary inspection and histopathology, between the diagnosis by sanitary inspection and IHC, and between histopathology and IHC. It can be concluded that when the lung has gross lesions of PEP, the probability that the result is positive for M. hyopneumoniae by IHC and that microscopic lesions are present increases. Thus, the microscopic diagnosis of PEP is feasible because it is associated with the other methods, so that the diagnosis given by the officials of sanitary inspection in slaughterhouses is not wrong; the macroscopic diagnosis is therefore a valid method for the diagnosis of PEP, it being understood that this is not to say that the detection of M. hyopneumoniae.

  1. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business opportunities, a generic modelling framework is proposed to handle this task. This framework outlines a set of building blocks which are necessary for carrying out the economic analysis of various BS applications. Further, special focus is given to describing how to use the rainflow cycle counting algorithm for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...
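
    The role of rainflow cycle counting in such a framework can be illustrated as follows: counted cycle depths are fed into a cycle-life curve to estimate how much battery life a given dispatch profile consumes. The sketch assumes the third-party Python package "rainflow" for the counting itself, and the state-of-charge profile and cycle-life curve parameters are invented.

```python
# Sketch of how rainflow cycle counts feed a battery degradation estimate.
# The cycle-life curve parameters and the state-of-charge profile are invented;
# the rainflow counting itself is delegated to the third-party "rainflow"
# package (an assumption -- any equivalent cycle counter could be used).
import rainflow  # pip install rainflow

# Toy state-of-charge (SOC) profile from a day of 5-minute dispatch decisions.
soc = [0.5, 0.7, 0.4, 0.9, 0.3, 0.6, 0.5, 0.8, 0.2, 0.5]

def cycles_to_failure(depth_of_discharge):
    """Hypothetical Woehler-type cycle-life curve: deeper cycles age the cell faster."""
    return 4000.0 * depth_of_discharge ** -1.5

# rainflow.count_cycles returns (range, count) pairs; range == depth of discharge here.
damage = sum(count / cycles_to_failure(rng) for rng, count in rainflow.count_cycles(soc) if rng > 0)
print(f"fraction of battery life consumed by this profile: {damage:.2e}")
```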

  2. Variation in post mortem rate of glycolysis does not necessarily affect drip loss of non-stimulated veal

    NARCIS (Netherlands)

    Hertog-Meischke, den M.J.; Klont, R.E.; Smulders, F.J.M.; Logtestijn, van J.G.

    1997-01-01

    In this study the effect of the rate of post mortem pH fall on the water-holding capacity of meat from moderately chilled veal carcasses was investigated. Also the relationship between muscle protein denaturation and drip loss of veal was examined. Three groups of 10 Friesian Holstein male veal

  3. Comparative analysis of bones, mites, soil chemistry, nematodes and soil micro-eukaryotes from a suspected homicide to estimate the post-mortem interval.

    Science.gov (United States)

    Szelecz, Ildikó; Lösch, Sandra; Seppey, Christophe V W; Lara, Enrique; Singer, David; Sorge, Franziska; Tschui, Joelle; Perotti, M Alejandra; Mitchell, Edward A D

    2018-01-08

    Criminal investigations of suspected murder cases require estimating the post-mortem interval (PMI, or time after death), which is challenging for long PMIs. Here we present the case of human remains found in a Swiss forest. We have used a multidisciplinary approach involving the analysis of bones and soil samples collected beneath the remains of the head, upper and lower body, and "control" samples taken a few meters away. We analysed soil chemical characteristics, mites and nematodes (by microscopy) and micro-eukaryotes (by Illumina high-throughput sequencing). The PMI estimate based on hair 14C data via bomb-peak radiocarbon dating gave a time range of 1 to 3 years before the discovery of the remains. Cluster analyses of soil chemical constituents, nematodes, mites and micro-eukaryotes revealed two clusters: 1) head and upper body and 2) lower body and controls. From the mite evidence, we conclude that the body was probably brought to the site after death. However, chemical analyses, nematode community analyses and the analyses of micro-eukaryotes indicate that decomposition took place at least partly on site. This study illustrates the usefulness of combining several lines of evidence for the study of homicide cases to better calibrate PMI inference tools.
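
    The grouping of sampling sites reported above can in principle be reproduced with a standard hierarchical cluster analysis, as sketched below; the site-by-variable matrix is synthetic, and only the idea of clustering sites by their soil and community profiles follows the abstract.

```python
# Sketch of the kind of cluster analysis used to group sampling sites
# (head, upper body, lower body, controls). The abundance matrix is synthetic;
# only the idea of clustering sites by their soil/community profiles comes
# from the abstract.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

sites = ["head", "upper body", "lower body", "control 1", "control 2"]
# Rows = sites, columns = hypothetical taxa/chemical variables (normalised).
profiles = np.array([
    [0.9, 0.8, 0.1, 0.2],
    [0.8, 0.9, 0.2, 0.1],
    [0.2, 0.1, 0.8, 0.7],
    [0.1, 0.2, 0.9, 0.8],
    [0.2, 0.2, 0.8, 0.9],
])

dist = pdist(profiles, metric="euclidean")
tree = linkage(dist, method="average")
groups = fcluster(tree, t=2, criterion="maxclust")

for site, g in zip(sites, groups):
    print(f"{site}: cluster {g}")
```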

  4. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, there is only one framework that fits Rogers’ theory on diffusion of innovation. The framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though the elements have similarities to other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  5. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, there is only one framework that fits Rogers’ theory on diffusion of innovation. The framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though the elements have similarities to other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  6. Characteristics of human infant primary fibroblast cultures from Achilles tendons removed post-mortem

    DEFF Research Database (Denmark)

    Rohde, Marianne Cathrine; Corydon, Thomas Juhl; Hansen, Jakob

    2014-01-01

    Primary cell cultures were investigated as a tool for molecular diagnostics in a forensic setting. Fibroblast cultures had been established from human Achilles tendon resected at autopsies, from cases of sudden infant death syndrome and control infants who died in traumatic events (n=41). After...... established from post-mortem tissue are renewable sources of biological material; they can be the foundation for genetic, metabolic and other functional studies and thus constitute a valuable tool for molecular and pathophysiological investigations in biomedical and forensic sciences....

  7. Strategy analysis frameworks for strategy orientation and focus

    OpenAIRE

    Isoherranen, V. (Ville)

    2012-01-01

    Abstract The primary research target of this dissertation is to develop new strategy analysis frameworks, focusing on analysing changes in strategic position as a function of variations in life cycle s-curve/time/typology/market share/orientation. Research is constructive and qualitative by nature, with case study methodology being the adopted approach. The research work is carried out as a compilation dissertation containing four (4) journal articles. The theoretical framework of thi...

  8. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  9. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    Science.gov (United States)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  10. O líquido cefalorraqueano no post-mortem

    Directory of Open Access Journals (Sweden)

    A. Spina-França

    1969-12-01

    Full Text Available The CSF of 45 cadavers was studied, and the results were considered as a function of the time elapsed between the moment of death and the collection of the CSF (TOC). According to this criterion, the cases were grouped as follows: 1) those with a TOC of up to 4 hours; 2) those with a TOC of 4 to 8 hours; 3) those with a TOC of 8 hours or more. With increasing TOC, the presence of red blood cells in cadaveric CSF becomes more frequent and more pronounced. The admixture of blood with the CSF impairs the evaluation of post-mortem changes in other CSF components, as was demonstrated for the concentrations of chlorides, glucose and total proteins, for the protein profile and for transaminase activity. Therefore, to evaluate the changes in CSF composition proper to the post-mortem period, only cases with fewer than 1000 red blood cells/mm³ should be considered. A normal leukocyte count was proportionally more common in samples from cadavers whose TOC was equal to or greater than 8 hours. Pleocytosis was observed more frequently than a normal leukocyte count, and was most commonly slight or mild. Counts above 50 leukocytes/mm³ were generally observed in cases of patients who died in the course of acute infectious processes. The concentrations of chlorides and glucose in the CSF tend to fall post mortem, and the decreases were, on average, more intense the longer the TOC. Low CSF glucose was, on average, more marked in cases with more intense pleocytosis. The urea concentration tends to rise early, and no significantly different means were found as a function of the TOC. TGO activity tends to rise post mortem, this rise being, on average, clearer from the group of cases with a TOC of 4 to 8 hours onwards. There is also a tendency for TGP activity to increase; this was less intense than that of TGO and, on average, was clearer from

  11. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  12. Post-mortem magnetic resonance foetal imaging: a study of morphological correlation with conventional autopsy and histopathological findings.

    Science.gov (United States)

    Vullo, Annamaria; Panebianco, Valeria; Cannavale, Giuseppe; Aromatario, Mariarosaria; Cipolloni, Luigi; Frati, Paola; Santurro, Alessandro; Vullo, Francesco; Catalano, Carlo; Fineschi, Vittorio

    2016-11-01

    The aim of the present study is to offer our experience concerning post-mortem magnetic resonance (PMMR) in foetal death cases and an evaluation of the differences between the findings acquired by PMMR and by forensic autopsy. Fifteen foetuses were recruited from July 2014 to December 2015. The foetuses had suffered intrauterine death, in the 21st to 38th week of gestation, in women who attended the emergency department for non-perception of foetal movements. We performed PMMR on the foetuses, on average 3 ± 1 days from the time of death, and then a complete forensic autopsy was performed. All 15 foetuses were examined with a whole-body study protocol, starting from the skull, down to and including the lower limbs. The total time of examination ranged from 20 to 30 min in each case. External evaluation and description of post-mortem phenomena (maceration), recording of the weight, and the various measurements of foetal diameters were carried out before performing the autopsy. A complete histopathological study was performed in each case. Of the 15 cases examined, eight were negative for structural anatomical abnormalities and/or diseases, both in the preliminary radiological examination and at the traditional autopsy. In the remaining seven cases, pathological findings were detected by PMMR with corresponding results at autopsy. PMMR can provide useful information on foetal medical conditions and result in improved diagnostic classification. It may enable the planning of a more suitable technique before proceeding to autopsy, including focusing on certain aspects of organ pathology otherwise not detectable. The association between PMMR, post-mortem examination and related histological study of the foetus-placenta unit could help reduce the percentage of cases in which the cause of foetal death remains unexplained. Lastly, it may allow a selective sampling of the organ in order to target histological investigations.

  13. Needs Analysis and Course Design; A Framework for Designing Exam Courses

    Directory of Open Access Journals (Sweden)

    Reza Eshtehardi

    2017-09-01

    Full Text Available This paper introduces a framework for designing exam courses and highlights the importance of needs analysis in designing exam courses. The main objectives of this paper are to highlight the key role of needs analysis in designing exam courses, to offer a framework for designing exam courses, to show the language needs of different students for IELTS (International English Language Testing System exam, to offer an analysis of those needs and to explain how they will be taken into account for the design of the course. First, I will concentrate on some distinguishing features in exam classes, which make them different from general English classes. Secondly, I will introduce a framework for needs analysis and diagnostic testing and highlight the importance of needs analysis for the design of syllabus and language courses. Thirdly, I will describe significant features of syllabus design, course assessment, and evaluation procedures.

  14. Comparison of diagnostic performance for perinatal and paediatric post-mortem imaging: CT versus MRI

    International Nuclear Information System (INIS)

    Arthurs, Owen J.; Jacques, Thomas S.; Sebire, Neil J.; Guy, Anna; Chong, W.K.; Gunny, Roxanna; Saunders, Dawn; Olsen, Oystein E.; Thayyil, Sudhin; Wade, Angie; Jones, Rod; Norman, Wendy; Taylor, Andrew M.; Scott, Rosemary; Robertson, Nicola J.; Owens, Catherine M.; Offiah, Amaka C.; Chitty, Lyn S.

    2016-01-01

    To compare the diagnostic yield of whole-body post-mortem computed tomography (PMCT) imaging to post-mortem magnetic resonance (PMMR) imaging in a prospective study of fetuses and children. We compared PMCT and PMMR to conventional autopsy as the gold standard for the detection of (a) major pathological abnormalities related to the cause of death and (b) all diagnostic findings in five different body organ systems. Eighty two cases (53 fetuses and 29 children) underwent PMCT and PMMR prior to autopsy, at which 55 major abnormalities were identified. Significantly more PMCT than PMMR examinations were non-diagnostic (18/82 vs. 4/82; 21.9 % vs. 4.9 %, diff 17.1 % (95 % CI 6.7, 27.6; p < 0.05)). PMMR gave an accurate diagnosis in 24/55 (43.64 %; 95 % CI 31.37, 56.73 %) compared to 18/55 PMCT (32.73 %; 95 % CI 21.81, 45.90). PMCT was particularly poor in fetuses <24 weeks, with 28.6 % (8.1, 46.4 %) more non-diagnostic scans. Where both PMCT and PMMR were diagnostic, PMMR gave slightly higher diagnostic accuracy than PMCT (62.8 % vs. 59.4 %). Unenhanced PMCT has limited value in detection of major pathology primarily because of poor-quality, non-diagnostic fetal images. On this basis, PMMR should be the modality of choice for non-invasive PM imaging in fetuses and children. (orig.)
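
    The non-diagnostic-scan comparison quoted above (18/82 vs. 4/82) can be checked approximately with a simple Wald interval for a difference of proportions, as below; the published figures may rest on a slightly different interval method, so the result is close to, but not identical with, 17.1% (6.7, 27.6).

```python
# Back-of-the-envelope check of the non-diagnostic-scan comparison quoted above
# (18/82 PMCT vs. 4/82 PMMR). A simple Wald interval is used here, so the
# result is close to, but not necessarily identical with, the published
# 17.1% (6.7, 27.6) figures.
import math

n = 82
p_ct, p_mr = 18 / n, 4 / n
diff = p_ct - p_mr

se = math.sqrt(p_ct * (1 - p_ct) / n + p_mr * (1 - p_mr) / n)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"difference: {100*diff:.1f}%  (95% CI {100*lo:.1f}, {100*hi:.1f})")
```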

  15. Post-mortem changes in chicken muscle : some key biochemical processes involved in the conversion of muscle to meat

    NARCIS (Netherlands)

    Schreurs, F.J.G.

    1999-01-01

    The post mortem changes taking place in poultry muscular tissue and the resulting meat quality, until the moment of consumption of the meat by the consumer are described. Modern broiler chickens grow 'at the edge of what is metabolically possible'. This hypothesis is derived from the fact

  16. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    Science.gov (United States)

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the need for a theoretical framework to provide practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework to provide practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  17. Understanding Early Post-Mortem Biochemical Processes Underlying Meat Color and pH Decline in the Longissimus thoracis Muscle of Young Blond d'Aquitaine Bulls Using Protein Biomarkers.

    Science.gov (United States)

    Gagaoua, Mohammed; Terlouw, E M Claudia; Micol, Didier; Boudjellal, Abdelghani; Hocquette, Jean-François; Picard, Brigitte

    2015-08-05

    Many studies of color biochemistry and protein biomarkers have been undertaken in post-mortem beef muscles at ≥24 hours post mortem. The present study was conducted on Longissimus thoracis muscles of 21 Blond d'Aquitaine young bulls to evaluate the relationships between protein biomarkers that are present early post mortem and known to be related to tenderness, and pH decline and color development. pH values at 45 min, 3 h, and 30 h post-mortem were correlated with three, seven, and six biomarkers, respectively. L*a*b* color coordinates 24 h post-mortem were correlated with nine, five, and eight protein biomarkers, respectively. Regression models included Hsp proteins and explained between 47 and 59% of the variability between individuals in pH and between 47 and 65% of the variability in L*a*b* color coordinates. Proteins correlated with pH and/or color coordinates were involved in apoptosis or had antioxidative or chaperone activities. The main results include the negative correlations between pH45 min, pH3 h, and pHu and Prdx6, which may be explained by the antioxidative and phospholipase activities of this biomarker. Similarly, inducible Hsp70-1A/B and μ-calpain were correlated with L*a*b* coordinates, due to the protective action of Hsp70-1A/B on the proteolytic activities of μ-calpain on structural proteins. Correlations further existed between MDH1, ENO3, and LDH-B and pH decline and color stability, probably due to the involvement of these enzymes in the glycolytic pathway and, thus, the energy status of the cell. The present results show that research using protein indicators may increase the understanding of early post-mortem biological mechanisms involved in pH and beef color development.

  18. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

    UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in the standard analysis pipeline operation. This functionality also makes it possible to use them for remote operation over the network. A framework for running UniSampo and Shaman analysis using the standard WWW protocol has been developed. A WWW server receives requests from the client WWW browser and runs the analysis software via a set of CGI scripts. Authentication, input data transfer, and output and display of the final analysis results are all carried out using standard WWW mechanisms. This WWW framework can be utilized, for example, by organizations that have radioactivity surveillance stations in a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)
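
    The server-side pattern described above (receive a request, run the analysis in batch mode, return the result) can be sketched as a minimal CGI handler. The command name and flags below are hypothetical placeholders rather than the real UniSampo invocation, and the standard-library cgi module (deprecated in recent Python releases) is used only to keep the example short.

```python
#!/usr/bin/env python3
# Minimal sketch of the CGI pattern described above: the server-side script
# receives an uploaded spectrum, runs the analysis software in its
# batch/non-interactive mode, and returns the result over HTTP. The command
# name and flags ("unisampo --batch ...") are hypothetical placeholders, not
# the real invocation.
import cgi
import subprocess
import tempfile

form = cgi.FieldStorage()
spectrum = form["spectrum"].file.read()          # uploaded gamma-ray spectrum

with tempfile.NamedTemporaryFile(suffix=".spe", delete=False) as tmp:
    tmp.write(spectrum)
    spectrum_path = tmp.name

# Run the analysis pipeline in batch mode (hypothetical flags).
result = subprocess.run(
    ["unisampo", "--batch", spectrum_path],
    capture_output=True, text=True, timeout=300,
)

print("Content-Type: text/plain")
print()
print(result.stdout)
```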

  19. X-framework: Space system failure analysis framework

    Science.gov (United States)

    Newman, John Steven

    Space program and space systems failures result in financial losses in the multi-hundred million dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned is the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and in some rare cases, reports that treat not only the what, but also the why. In addition, there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures, that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event beginning with the proximate cause, extending to the directly related work or operational processes and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management. The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of

  20. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but still incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with CA of 23 articles from the EBSCO database over the last 20 years (1996–2016). The findings showed that the proposed framework can help researchers better apply CA and suggest that the use of the method, in terms of both quantity and quality, should be expanded in PM research. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM in the last 20 years.

  1. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In the conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among the process, safety systems, and operator actions. As independent TH codes do not have models of operator actions and the full set of safety systems, they cannot literally simulate the integrated and dynamic interactions of the process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulation becomes a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with the stochastic models in the dynamic event tree (DET) framework, which provides flexibility to model the integrated response due to better communication, as all the accident elements are in the same model. The advantages of this framework also include: realistic modeling of dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis)
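
    To make the dynamic event tree idea concrete, the sketch below expands a toy event tree in which a very simple deterministic process model (a linear heat-up) is advanced in time and the tree branches on the stochastic success or failure of a cooling action at predefined demand times. All numbers and the process model itself are invented for illustration; a real DET couples full thermal-hydraulic, safety-system and operator models as described above.

```python
# Toy dynamic event tree (DET): a simple deterministic process model is advanced
# in time and the tree splits at each branching point on the stochastic
# success/failure of a cooling action. All parameters are illustrative only.
from dataclasses import dataclass

DT = 60.0                        # time step [s]
HEATUP = 2.0                     # temperature rise per step without cooling [K]
COOLDOWN = -1.0                  # temperature change per step with cooling [K]
DAMAGE_T = 120.0                 # damage threshold [K]
BRANCH_TIMES = [300.0, 900.0]    # demands on the safety system [s]
P_SUCCESS = 0.9

@dataclass
class Branch:
    time: float
    temp: float
    cooling: bool
    prob: float
    history: tuple

def advance(branch: Branch, t_end: float) -> Branch:
    """Integrate the toy process model until t_end or until damage occurs."""
    t, temp = branch.time, branch.temp
    while t < t_end and temp < DAMAGE_T:
        temp += COOLDOWN if branch.cooling else HEATUP
        t += DT
    return Branch(t, temp, branch.cooling, branch.prob, branch.history)

def expand(branch: Branch, demands, end_time):
    """Recursively expand the event tree at each stochastic branching point."""
    if not demands or branch.temp >= DAMAGE_T:
        final = advance(branch, end_time)
        outcome = "damage" if final.temp >= DAMAGE_T else "ok"
        return [(final.history, outcome, final.prob)]
    t_demand, rest = demands[0], demands[1:]
    at_demand = advance(branch, t_demand)
    sequences = []
    for works, p in ((True, P_SUCCESS), (False, 1.0 - P_SUCCESS)):
        label = "cooling on" if works else "cooling fails"
        child = Branch(at_demand.time, at_demand.temp,
                       works or at_demand.cooling,
                       at_demand.prob * p, at_demand.history + (label,))
        sequences += expand(child, rest, end_time)
    return sequences

if __name__ == "__main__":
    root = Branch(time=0.0, temp=20.0, cooling=False, prob=1.0, history=())
    for history, outcome, prob in expand(root, BRANCH_TIMES, end_time=3600.0):
        print(f"{' -> '.join(history):32s} {outcome:6s} p={prob:.3f}")
```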

  2. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  3. Post-mortem computed tomography angiography utilizing barium sulfate to identify microvascular structures : a preliminary phantom model and case study

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Kuster, Lidy; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2016-01-01

    We investigated the use of computed tomography angiography (CTA) to visualize microvascular structures in a vessel-mimicking phantom and post-mortem (PM) bodies. A contrast agent was used based on 22% barium sulfate, 20% polyethylene glycol and 58% distilled water. A vessel-mimicking phantom

  4. Diagnostic accuracy and limitations of post-mortem MRI for neurological abnormalities in fetuses and children

    International Nuclear Information System (INIS)

    Arthurs, O.J.; Thayyil, S.; Pauliah, S.S.; Jacques, T.S.; Chong, W.K.; Gunny, R.; Saunders, D.; Addison, S.; Lally, P.; Cady, E.; Jones, R.; Norman, W.; Scott, R.; Robertson, N.J.; Wade, A.; Chitty, L.; Taylor, A.M.

    2015-01-01

    Aim: To compare the diagnostic accuracy of non-invasive cerebral post-mortem magnetic resonance imaging (PMMRI) specifically for cerebral and neurological abnormalities in a series of fetuses and children, compared to conventional autopsy. Materials and methods: Institutional ethics approval and parental consent were obtained. Pre-autopsy cerebral PMMRI was performed in a sequential prospective cohort (n = 400) of fetuses (n = 277; 185 ≤ 24 weeks and 92 > 24 weeks gestation) and children <16 years of age (n = 123). PMMRI and conventional autopsy findings were reported blinded and independently of each other. Results: Cerebral PMMRI had sensitivities and specificities (95% confidence interval) of 88.4% (75.5 to 94.9) and 95.2% (92.1 to 97.1), respectively, for cerebral malformations; 100% (83.9 to 100) and 99.1% (97.2 to 99.7) for major intracranial bleeds; and 87.5% (80.1 to 92.4) and 74.1% (68 to 79.4) for overall brain pathology. Formal neuropathological examination was non-diagnostic due to maceration/autolysis in 43/277 (16%) fetuses; of these, cerebral PMMRI imaging provided clinically important information in 23 (53%). The sensitivity of PMMRI for detecting significant ante-mortem ischaemic injury was only 68% (48.4 to 82.8) overall. Conclusions: PMMRI is an accurate investigational technique for identifying significant neuropathology in fetuses and children, and may provide important information even in cases where autolysis prevents formal neuropathological examination; however, PMMRI is less sensitive at detecting hypoxic–ischaemic brain injury, and may not detect rarer disorders not encountered in this study. -- Highlights: •Post mortem MRI (PMMRI) has a sensitivity of >87% for detecting cerebral malformations, intracranial bleeds and neurological cause of death. •PMMRI provides important diagnostic information in >50% of fetuses where conventional brain autopsy is non-diagnostic. •PMMRI is currently poor at reliably identifying

  5. A Framework for Analysis of Case Studies of Reading Lessons

    Science.gov (United States)

    Carlisle, Joanne F.; Kelcey, Ben; Rosaen, Cheryl; Phelps, Geoffrey; Vereb, Anita

    2013-01-01

    This paper focuses on the development and study of a framework to provide direction and guidance for practicing teachers in using a web-based case studies program for professional development in early reading; the program is called Case Studies Reading Lessons (CSRL). The framework directs and guides teachers' analysis of reading instruction by…

  6. Usos y significados sociales de la fotografía post-mortem en Colombia

    OpenAIRE

    Henao Albarracín, Ana María; Universidad Paris Ouest Nanterre La Défense, Paris, Francia

    2013-01-01

    This research aims to understand the social uses and meanings of post-mortem or funerary photography between the end of the 19th century and the middle of the 20th century in Colombia. In doing so, it seeks to contribute to the analysis of the relationships between photography and society and, more particularly, between photography and a social representation of death, identifying the conventions and rules of this photographic practice that determine aesthetic behaviours surrounding death.

  7. UNC-Utah NA-MIC framework for DTI fiber tract analysis.

    Science.gov (United States)

    Verde, Audrey R; Budin, Francois; Berger, Jean-Baptiste; Gupta, Aditya; Farzinfar, Mahshid; Kaiser, Adrien; Ahn, Mihye; Johnson, Hans; Matsui, Joy; Hazlett, Heather C; Sharma, Anuja; Goodlett, Casey; Shi, Yundi; Gouttard, Sylvain; Vachet, Clement; Piven, Joseph; Zhu, Hongtu; Gerig, Guido; Styner, Martin

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exists a number of tractography toolsets, these usually lack tools for preprocessing or to analyze diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUI) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross sectional neuroimaging study of eight healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  8. Big data analysis framework for healthcare and social sectors in Korea.

    Science.gov (United States)

    Song, Tae-Min; Ryu, Seewon

    2015-01-01

    We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached.

  9. Risk and train control : a framework for analysis

    Science.gov (United States)

    2001-01-01

    This report develops and demonstrates a framework for examining the effects of various train control strategies on some of the major risks of railroad operations. Analysis of a hypothetical 1200-mile corridor identified the main factors that increase r...

  10. Transactional Analysis: Conceptualizing a Framework for Illuminating Human Experience

    Directory of Open Access Journals (Sweden)

    Trevor Thomas Stewart PhD

    2011-09-01

    Full Text Available Myriad methods exist for analyzing qualitative data. It is, however, imperative for qualitative researchers to employ data analysis tools that are congruent with the theoretical frameworks underpinning their inquiries. In this paper, I have constructed a framework for analyzing data that could be useful for researchers interested in focusing on the transactional nature of language as they engage in Social Science research. Transactional Analysis (TA) is an inductive approach to data analysis that transcends constant comparative methods of exploring data. Drawing on elements of narrative and thematic analysis, TA uses the theories of Bakhtin and Rosenblatt to attend to the dynamic processes researchers identify as they generate themes in their data and seek to understand how their participants' worldviews are being shaped. This paper highlights the processes researchers can utilize to study the mutual shaping that occurs as participants read and enter into dialogue with the world around them.

  11. Flexible Human Behavior Analysis Framework for Video Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Weilun Lao

    2010-01-01

    Full Text Available We study a flexible framework for semantic analysis of human motion from surveillance video. Successful trajectory estimation and human-body modeling facilitate the semantic analysis of human activities in video sequences. Although human motion is widely investigated, we have extended such research in three aspects. By adding a second camera, not only is more reliable behavior analysis possible, but it also enables mapping the ongoing scene events onto a 3D setting to facilitate further semantic analysis. The second contribution is the introduction of a 3D reconstruction scheme for scene understanding. Thirdly, we apply a fast scheme to detect different body parts and generate a fitting skeleton model, without using the explicit assumption of upright body posture. The extension of multiple-view fusion improves the event-based semantic analysis by 15%–30%. Our proposed framework proves its effectiveness as it achieves a near real-time performance (13–15 frames/second and 6–8 frames/second for monocular and two-view video sequences, respectively).

  12. 9 CFR 314.7 - Carcasses of livestock condemned on ante-mortem inspection not to pass through edible product areas.

    Science.gov (United States)

    2010-01-01

    9 CFR, Animals and Animal Products: DISPOSAL OF CONDEMNED OR OTHER INEDIBLE PRODUCTS AT OFFICIAL ESTABLISHMENTS. § 314.7 Carcasses of livestock condemned on ante-mortem inspection not to pass through edible product areas. Carcasses of livestock which...

  13. A Framework for Collaborative Networked Learning in Higher Education: Design & Analysis

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2014-06-01

    Full Text Available This paper presents a comprehensive framework for building collaborative learning networks within higher educational institutions. This framework focuses on systems design and implementation issues, in addition to a complete set of evaluation and analysis tools. The objective of this project is to improve the standards of higher education in Jordan through the implementation of transparent, collaborative, innovative, and modern quality educational programs. The framework highlights the major steps required to plan, design, and implement collaborative learning systems. Several issues are discussed, such as the unification of courses and programs of study, using an appropriate learning management system, software design and development using the Agile methodology, infrastructure design, access issues, proprietary data storage, and social network analysis (SNA) techniques.

  14. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    Science.gov (United States)

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.

  15. A framework for sensitivity analysis of decision trees.

    Science.gov (United States)

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
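
    The flavour of such a sensitivity analysis can be illustrated with a deliberately small example: two candidate actions, each with a single chance node, where the nominal success probabilities are perturbed pessimistically within an interval of width eps and the action maximising the worst-case expected value is reported. This is a hedged sketch of the general idea, not the authors' algorithm or software; all payoffs and probabilities are invented.

```python
# Illustrative sketch: robustness of the optimal action of a tiny decision tree
# to pessimistic perturbations of the chance-node probabilities.
# (nominal success probability, payoff if success, payoff if failure)
ACTIONS = {
    "conservative": (0.95, 50.0, 20.0),
    "aggressive":   (0.70, 120.0, -40.0),
}

def expected_value(p_success, win, lose):
    return p_success * win + (1.0 - p_success) * lose

def worst_case_value(p_nominal, win, lose, eps):
    # The expected value is monotone increasing in p (win > lose), so the
    # pessimistic perturbation pushes p to the low end of its interval.
    p_low = max(0.0, p_nominal - eps)
    return expected_value(p_low, win, lose)

if __name__ == "__main__":
    for eps in (0.0, 0.05, 0.10, 0.20):
        scores = {name: worst_case_value(p, w, l, eps)
                  for name, (p, w, l) in ACTIONS.items()}
        best = max(scores, key=scores.get)
        summary = ", ".join(f"{k}: {v:.1f}" for k, v in scores.items())
        print(f"eps={eps:.2f}  worst-case EV  {summary}  -> choose {best}")
```

    With these illustrative numbers the expected-value-maximising action switches from "aggressive" to "conservative" once the probability uncertainty becomes large enough, which is exactly the kind of stability question the framework formalises.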

  16. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  17. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)
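
    The control-region/signal-region concept at the heart of such fits can be illustrated with a generic two-region Poisson counting model, written directly against numpy/scipy. This is a conceptual sketch only: it does not use or reproduce the HistFitter, RooStats or HistFactory APIs, and all expected and observed event counts are invented.

```python
# Conceptual sketch of a simultaneous Poisson fit over a control region (CR) and
# a signal region (SR): the CR constrains the background normalisation mu_b, the
# SR then measures the signal strength mu_s.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

B_CR, B_SR = 100.0, 12.0      # nominal expected background in CR and SR
S_SR = 8.0                    # expected signal in SR for mu_s = 1
N_CR_OBS, N_SR_OBS = 118, 25  # "observed" event counts (invented)

def nll(params):
    """Negative log-likelihood of the two-region counting model."""
    mu_s, mu_b = params
    exp_cr = mu_b * B_CR                  # CR is assumed signal-free
    exp_sr = mu_b * B_SR + mu_s * S_SR
    return -(poisson.logpmf(N_CR_OBS, exp_cr) + poisson.logpmf(N_SR_OBS, exp_sr))

if __name__ == "__main__":
    fit = minimize(nll, x0=[1.0, 1.0], bounds=[(0.0, 10.0), (0.1, 10.0)])
    mu_s_hat, mu_b_hat = fit.x
    print(f"fitted signal strength mu_s = {mu_s_hat:.2f}, "
          f"background normalisation mu_b = {mu_b_hat:.2f}")
```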

  18. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters and connected together in a multi-level hierarchy and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to demonstrate the proposed concept. The simulation results show that the software framework can increase the speedup of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing the simulation of multi-scale structural analysis.
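
    A minimal sketch of the two-level decomposition is shown below: detailed component models are evaluated in parallel (a local process pool stands in for the remote clusters of the grid) and their results are combined in a simplified global model. The component and global models are toy closed-form expressions, not the framework's actual structural solvers, and all values are placeholders.

```python
# Sketch of the two-level decomposition: detailed component models run in
# parallel (a local process pool standing in for remote clusters), and their
# effective stiffnesses feed a simplified global model. Toy models only.
from concurrent.futures import ProcessPoolExecutor

COMPONENTS = {          # component id -> (area [m^2], Young's modulus [Pa])
    "joint_a": (0.010, 200e9),
    "joint_b": (0.012, 200e9),
    "brace_c": (0.008,  70e9),
}

def detailed_component_model(item):
    """Stand-in for a detailed (e.g. solid FE) component analysis on a cluster."""
    name, (area, modulus) = item
    effective_axial_stiffness = modulus * area   # k = EA, toy closed form
    return name, effective_axial_stiffness

def global_model(stiffnesses, load=1.0e6):
    """Simplified global model: components act as springs in series under `load`."""
    compliance = sum(1.0 / k for k in stiffnesses.values())
    return load * compliance                     # toy global response

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        stiffnesses = dict(pool.map(detailed_component_model, COMPONENTS.items()))
    print("component stiffnesses:", {k: f"{v:.3e}" for k, v in stiffnesses.items()})
    print(f"global response (toy): {global_model(stiffnesses):.3e}")
```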

  19. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants

  20. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  1. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports

  2. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    Science.gov (United States)

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as well

  3. Accuracy of an efficient framework for structural analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert D.; Fedorov, Vladimir

    2016-01-01

    This paper presents a novel framework for the structural design and analysis of wind turbine blades and establishes its accuracy. The framework is based on a beam model composed of two parts—a 2D finite element-based cross-section analysis tool and a 3D beam finite element model. The cross-section analysis tool is able to capture the effects stemming from material anisotropy and inhomogeneity for sections of arbitrary geometry. The proposed framework is very efficient and therefore ideally suited for integration within wind turbine aeroelastic design and analysis tools. A number of benchmark examples are presented comparing the results from the proposed beam model to 3D shell and solid finite element models. The examples considered include a square prismatic beam, an entire wind turbine rotor blade and a detailed wind turbine blade cross section. Phenomena at both the blade length scale
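
    As a generic illustration of the beam half of such a framework, the sketch below assembles 2-node Euler-Bernoulli bending elements into a cantilever and compares the tip deflection with the analytical solution. In the actual framework the cross-section constants would be produced by the 2D cross-section analysis (which captures anisotropy and inhomogeneity); the isotropic EI value used here is a placeholder for illustration.

```python
# Generic planar Euler-Bernoulli beam sketch: element stiffness matrix and a
# tip-loaded cantilever assembled from identical elements. The section constant
# EI is an isotropic placeholder, not a cross-section-analysis result.
import numpy as np

def bending_stiffness(EI, L):
    """4x4 Euler-Bernoulli element stiffness, DOFs: [w1, theta1, w2, theta2]."""
    return (EI / L**3) * np.array([
        [ 12,     6*L,  -12,     6*L],
        [6*L,  4*L**2, -6*L,  2*L**2],
        [-12,    -6*L,   12,    -6*L],
        [6*L,  2*L**2, -6*L,  4*L**2],
    ])

def cantilever_tip_deflection(EI, length, n_elems, tip_load):
    """Assemble n identical elements, clamp node 0, apply a transverse tip force."""
    L = length / n_elems
    ndof = 2 * (n_elems + 1)
    K = np.zeros((ndof, ndof))
    for e in range(n_elems):
        dofs = slice(2 * e, 2 * e + 4)
        K[dofs, dofs] += bending_stiffness(EI, L)
    f = np.zeros(ndof)
    f[-2] = tip_load                       # transverse force at the tip node
    free = slice(2, ndof)                  # clamp w and theta at node 0
    u = np.linalg.solve(K[free, free], f[free])
    return u[-2]                           # tip deflection

if __name__ == "__main__":
    EI, length, P = 5.0e6, 10.0, 1.0e3     # placeholder section, length and load
    w_fe = cantilever_tip_deflection(EI, length, n_elems=20, tip_load=P)
    w_exact = P * length**3 / (3 * EI)     # analytical cantilever solution
    print(f"FE tip deflection {w_fe:.6f} m vs analytical {w_exact:.6f} m")
```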

  4. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider...

  5. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researcher's attempts at fully understanding it. Time series "Fluorescent in situ Hybridization" (FISH) imaging has increased our ability to observe genome structure, but due to cell type and experimental variability this data is often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection, in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. UNC-Utah NA-MIC Framework for DTI Fiber Tract Analysis

    Directory of Open Access Journals (Sweden)

    Audrey Rose Verde

    2014-01-01

    Full Text Available Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exists a number of tractography toolsets, these usually lack tools for preprocessing or to analyze diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUI) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross sectional neuroimaging study of 8 healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  7. Combinatorial-topological framework for the analysis of global dynamics

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.
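
    The combinatorial step of such an approach can be illustrated on a one-dimensional map: cover the phase space with rectangular boxes, build the directed graph of possible box-to-box transitions of the map, and extract the non-trivial strongly connected components as candidate recurrent (Morse) sets. The sketch below samples the logistic map non-rigorously for brevity, whereas the actual framework uses validated outer approximations of the dynamics; the parameter values are chosen only for illustration.

```python
# Sketch of the combinatorial core: boxes covering [0, 1], a box-to-box
# transition graph of the logistic map (sampled, non-rigorous), and strongly
# connected components as candidate recurrent ("Morse") sets.
N_BOXES = 200
SAMPLES_PER_BOX = 50
R = 3.2                                    # logistic-map parameter

def f(x):
    return R * x * (1.0 - x)

def box_of(x):
    return min(int(x * N_BOXES), N_BOXES - 1)

def transition_graph():
    edges = {b: set() for b in range(N_BOXES)}
    for b in range(N_BOXES):
        lo = b / N_BOXES
        for i in range(SAMPLES_PER_BOX):
            x = lo + (i + 0.5) / (SAMPLES_PER_BOX * N_BOXES)
            edges[b].add(box_of(f(x)))
    return edges

def strongly_connected_components(edges):
    """Kosaraju's algorithm; returns SCCs as lists of box indices."""
    order, seen = [], set()
    def dfs(v, graph, out):
        stack = [(v, iter(graph[v]))]
        seen.add(v)
        while stack:
            node, it = stack[-1]
            for nxt in it:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append((nxt, iter(graph[nxt])))
                    break
            else:
                stack.pop()
                out.append(node)        # postorder (finish time)
    for v in edges:
        if v not in seen:
            dfs(v, edges, order)
    reverse = {v: set() for v in edges}
    for v, nbrs in edges.items():
        for w in nbrs:
            reverse[w].add(v)
    seen, sccs = set(), []
    for v in reversed(order):
        if v not in seen:
            comp = []
            dfs(v, reverse, comp)
            sccs.append(comp)
    return sccs

if __name__ == "__main__":
    edges = transition_graph()
    recurrent = [c for c in strongly_connected_components(edges)
                 if len(c) > 1 or c[0] in edges[c[0]]]
    for comp in recurrent:
        lo, hi = min(comp) / N_BOXES, (max(comp) + 1) / N_BOXES
        print(f"candidate recurrent set: {len(comp)} boxes within [{lo:.3f}, {hi:.3f}]")
```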

  9. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international collaboration R3B has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 will be reported; moreover, a comparison of the tracker performance with experimental data will be presented.

  10. A Stochastic Hybrid Systems framework for analysis of Markov reward models

    International Nuclear Information System (INIS)

    Dhople, S.V.; DeVille, L.; Domínguez-García, A.D.

    2014-01-01

    In this paper, we propose a framework to analyze Markov reward models, which are commonly used in system performability analysis. The framework builds on a set of analytical tools developed for a class of stochastic processes referred to as Stochastic Hybrid Systems (SHS). The state space of an SHS is comprised of: (i) a discrete state that describes the possible configurations/modes that a system can adopt, which includes the nominal (non-faulty) operational mode, but also those operational modes that arise due to component faults, and (ii) a continuous state that describes the reward. Discrete state transitions are stochastic, and governed by transition rates that are (in general) a function of time and the value of the continuous state. The evolution of the continuous state is described by a stochastic differential equation and reward measures are defined as functions of the continuous state. Additionally, each transition is associated with a reset map that defines the mapping between the pre- and post-transition values of the discrete and continuous states; these mappings enable the definition of impulses and losses in the reward. The proposed SHS-based framework unifies the analysis of a variety of previously studied reward models. We illustrate the application of the framework to performability analysis via analytical and numerical examples
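
    A toy Monte Carlo version of such a Markov reward model is sketched below: the discrete state alternates between a nominal and a faulty mode with exponential transition rates, the continuous state (the accumulated reward) accrues at a mode-dependent rate, and the reset map applies an impulse loss at every fault transition. The rates and reward values are invented, and the simulation only estimates what the SHS framework above derives analytically.

```python
# Toy Monte Carlo illustration of a two-mode Markov reward model in the SHS
# spirit: exponential mode transitions, mode-dependent reward accrual, and an
# impulse loss applied by the reset map on each fault. All numbers are made up.
import random

RATE = {("nominal", "faulty"): 0.02, ("faulty", "nominal"): 0.10}   # per hour
REWARD_RATE = {"nominal": 1.0, "faulty": 0.2}                       # reward per hour
FAULT_IMPULSE = -5.0                                                # loss at each fault

def simulate(horizon=1000.0, rng=random):
    t, mode, reward = 0.0, "nominal", 0.0
    while True:
        other = "faulty" if mode == "nominal" else "nominal"
        dwell = rng.expovariate(RATE[(mode, other)])
        if t + dwell >= horizon:
            reward += REWARD_RATE[mode] * (horizon - t)
            return reward
        reward += REWARD_RATE[mode] * dwell          # continuous accrual in this mode
        if other == "faulty":
            reward += FAULT_IMPULSE                  # reset map: impulse loss on fault
        t, mode = t + dwell, other

if __name__ == "__main__":
    random.seed(1)
    runs = [simulate() for _ in range(5000)]
    print(f"mean accumulated reward over 1000 h: {sum(runs) / len(runs):.1f}")
```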

  11. Influence of supplemental maslinic acid (olive-derived triterpene) on the post-mortem muscle properties and quality traits of gilthead seabream

    DEFF Research Database (Denmark)

    Matos, E.; Silva, Tomé Santos; Wulff, Tune

    2013-01-01

    Maslinic acid, a natural triterpene, was evaluated as a dietary supplement to modulate glycogen post-mortem mobilization in gilthead seabream muscle. For this purpose, a multidisciplinary trial was undertaken, where flesh quality criteria, as well as biochemical and histological parameters, enzymatic activities and protein expression in the muscle were assessed. Supplementing gilthead seabream diets with maslinic acid mainly resulted in hypertrophy of muscle fibres and inhibition of cathepsin B activity, with no observed differences in terms of glycogen and ATP content of the muscle, as well as glycogen phosphorylase activity. Proteomic analysis showed a low impact of maslinic acid supplementation on muscle metabolism, with most changes reflecting increased stress coping capacity and muscle hypertrophy in maslinic acid-fed fish. As a finishing strategy to improve the muscle's energetic status

  12. Trace elements distribution and post-mortem intake in human bones from Middle Age by total reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Carvalho, M.L.; Marques, A.F.; Lima, M.T.; Reus, U.

    2004-01-01

    The purpose of the present work is to investigate the suitability of the TXRF technique for studying the distribution of trace elements along human bones of the 13th century, to draw conclusions about the environmental conditions and dietary habits of old populations, and to study the uptake of some elements from the surrounding soil. In this work, we used TXRF to quantify and to make profiles of the elements through long bones. Two femur bones, one from a man and another from a woman, buried in the same grave, were cross-sectioned at four different points at a distance of 1 cm. Microsamples of each section were taken at a distance of 1 mm from each other. Quantitative analysis was performed for Ca, Mn, Fe, Cu, Zn, Sr, Ba and Pb. Very high concentrations of Mn and Fe were obtained in all the analysed samples, reaching values higher than 2% in some samples of trabecular tissue, very similar to the concentrations in the burial soil. A sharp decrease for both elements was observed in cortical tissue. Zn and Sr present steady concentration levels in both kinds of bone tissues. Pb and Cu show very low concentrations in the inner tissue of cortical bone. However, these concentrations increase in the regions in contact with trabecular tissue and at the external surface in contact with the soil, where high levels of both elements were found. We suggest that contamination from the surrounding soil exists for Mn and Fe in the whole bone tissue. Pb can be of both post-mortem and ante-mortem origin. The inner compact tissue might represent in vivo accumulation, while the trabecular one corresponds to uptake during burial. The steady levels of Sr and Zn, together with the lower soil concentrations of these elements, may allow us to conclude that they originate from in vivo incorporation in the hydroxyapatite bone matrix

  13. FEBEX Project Post-mortem Analysis: Corrosion Study

    International Nuclear Information System (INIS)

    Madina, V.; Azkarate, I.

    2004-01-01

    The partial dismantling of the FEBEX in situ test was carried out during the summer of 2002, following 5 years of continuous heating. The operation included the demolition of the concrete plug and the removal of the section of the test corresponding to the first heater. A large number of samples from all types of materials have been taken during the dismantling for subsequent analysis. Part of the samples collected were devoted to the analysis of the corrosion processes that occurred during the first operational phase of the test. These samples comprised corrosion coupons from different metals installed for that purpose, sensors retrieved during the dismantling that were found severely corroded, and bentonite in contact with those sensors. In addition, a corrosion study was performed on the heater extracted and on one section of the liner surrounding it. All the analyses were carried out by Fundacion INASMET (Spain). This report describes in detail the studies carried out on the different materials and the results obtained, as well as the conclusions drawn. (Author)

  14. FEBEX Project Post-mortem Analysis: Corrosion Study

    Energy Technology Data Exchange (ETDEWEB)

    Madina, V.; Azkarate, I.

    2004-07-01

    The partial dismantling of the FEBEX in situ test was carried out during the summer of 2002, following 5 years of continuous heating. The operation included the demolition of the concrete plug and the removal of the section of the test corresponding to the first heater. A large number of samples from all types of materials have been taken during the dismantling for subsequent analysis. Part of the samples collected were devoted to the analysis of the corrosion processes that occurred during the first operational phase of the test. These samples comprised corrosion coupons from different metals installed for that purpose, sensors retrieved during the dismantling that were found severely corroded, and bentonite in contact with those sensors. In addition, a corrosion study was performed on the heater extracted and on one section of the liner surrounding it. All the analyses were carried out by Fundacion INASMET (Spain). This report describes in detail the studies carried out on the different materials and the results obtained, as well as the conclusions drawn. (Author)

  15. A comparison between rib fracture patterns in peri- and post-mortem compressive injury in a piglet model.

    Science.gov (United States)

    Bradley, Amanda L; Swain, Michael V; Neil Waddell, J; Das, Raj; Athens, Josie; Kieser, Jules A

    2014-05-01

    Forensic biomechanics is increasingly being used to explain how observed injuries occur. We studied infant rib fractures from a biomechanical and morphological perspective using a porcine model. We used 24 sixth ribs of one-day-old domestic pigs (Sus scrofa), divided into three groups: desiccated (representing post-mortem trauma), fresh ribs with intact periosteum (representing peri-mortem trauma) and those stored at -20°C. Two experiments were designed to study their biomechanical behaviour and fracture morphology: ribs were axially compressed and subjected to four-point bending in an Instron 3339 fitted with custom jigs. Morphoscopic analysis of resultant fractures consisted of standard optical methods, micro-CT (μCT) and Scanning Electron Microscopy (SEM). During axial compression fresh ribs did not fracture because of the energy absorption capabilities of their soft and fluidic components. In flexure tests, dry ribs showed typical elastic-brittle behaviour with long linear load-extension curves, followed by short non-linear elastic (hyperelastic) behaviour and brittle fracture. Fresh ribs showed initial linear-elastic behaviour, followed by strain softening and visco-plastic responses. During the course of loading, dry bone showed minimal observable damage prior to the onset of unstable fracture. Frozen then thawed bone showed similar patterns to fresh bone. Morphologically, fresh ribs showed extensive periosteal damage to the tensile surface with areas of collagen fibre pull-out along the tensile surface. While all dry ribs fractured precipitously, with associated fibre pull-out, the latter feature was absent in thawed ribs. Our study highlights the fact that under controlled loading, fresh piglet ribs (representing perimortem trauma) did not fracture through bone but showed periosteal tearing. These results suggest firstly, that complete lateral rib fracture in infants may in fact not result from pure compression as has been previously assumed; and

  16. TomoPy: a framework for the analysis of synchrotron tomographic data

    International Nuclear Information System (INIS)

    Gürsoy, Doǧa; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris

    2014-01-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing

  17. TomoPy: a framework for the analysis of synchrotron tomographic data

    Energy Technology Data Exchange (ETDEWEB)

    Gürsoy, Doǧa, E-mail: dgursoy@aps.anl.gov; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris [Advanced Photon Source, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439-4837 (United States)

    2014-08-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing.
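
    A typical pipeline of the kind the framework supports is sketched below using documented top-level TomoPy calls (flat/dark-field normalisation, -log transform, rotation-centre search, gridrec reconstruction). Exact function signatures may vary between TomoPy versions, and the synthetic phantom data merely stand in for real beamline projections.

```python
# Sketch of a typical TomoPy pipeline on synthetic data. Function names follow
# the documented top-level TomoPy API; signatures may differ between versions.
import numpy as np
import tomopy

# Synthetic stand-ins: one 256x256 phantom slice projected at 180 angles.
phantom = tomopy.shepp2d(size=256)                    # test object, shape (1, 256, 256)
theta = tomopy.angles(180)                            # projection angles [rad]
sino = tomopy.project(phantom, theta, pad=False)      # line integrals
counts = 1000.0 * np.exp(-sino)                       # fake transmission counts
flat = np.full_like(counts[:1], 1000.0)               # fake flat fields
dark = np.zeros_like(counts[:1])                      # fake dark fields

proj = tomopy.normalize(counts, flat, dark)           # flat/dark-field correction
proj = tomopy.minus_log(proj)                         # Beer-Lambert -log transform
center = tomopy.find_center(proj, theta, init=128, tol=0.5)
recon = tomopy.recon(proj, theta, center=center, algorithm='gridrec')
recon = tomopy.circ_mask(recon, axis=0, ratio=0.95)   # mask the reconstruction circle
print("reconstructed slice shape:", recon.shape)
```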

  18. Talking Cure Models: A Framework of Analysis

    Directory of Open Access Journals (Sweden)

    Christopher Marx

    2017-09-01

    Full Text Available Psychotherapy is commonly described as a "talking cure," a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what "talking" means and what "cure" implies in the respective context. Accordingly, a clarification of the basic theoretical structure of "talking cure models," i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of "talking cure models": (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of "talking cure models." To demonstrate the applicability and utility of the framework, five distinct "talking cure models" which spell out the details of curative "talking" processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more

  19. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  20. Evaluation of the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers: a systematic review and documentary analysis.

    Science.gov (United States)

    McGraw, Caroline; Drennan, Vari M

    2015-02-01

    To evaluate the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers. The objective was to identify the extent to which these frameworks take account of the setting where the ulcer originated as being the person's home rather than a hospital setting. Pressure ulcers involving full-thickness skin loss are increasingly being regarded as indicators of nursing patient safety failure, requiring investigation using root cause analysis frameworks. Evidence suggests that root cause analysis frameworks developed in hospital settings ignore the unique dimensions of risk in home healthcare settings. A systematic literature review and documentary analysis of frameworks used to investigate community-acquired grade three and four pressure ulcers by home nursing services in England. No published papers were identified for inclusion in the review. Fifteen patient safety investigative frameworks were collected and analysed. Twelve of the retrieved frameworks were intended for the investigation of community-acquired pressure ulcers; seven of which took account of the setting where the ulcer originated as being the patient's home. This study provides evidence to suggest that many of the root cause analysis frameworks used to investigate community-acquired pressure ulcers in England are unsuitable for this purpose. This study provides researchers and practitioners with evidence of the need to develop appropriate home nursing root cause analysis frameworks to investigate community-acquired pressure ulcers. © 2014 John Wiley & Sons Ltd.

  1. Post mortem concentrations of endogenous gamma hydroxybutyric acid (GHB) and in vitro formation in stored blood and urine samples.

    Science.gov (United States)

    Busardò, Francesco Paolo; Bertol, Elisabetta; Vaiano, Fabio; Baglio, Giovanni; Montana, Angelo; Barbera, Nunziata; Zaami, Simona; Romano, Guido

    2014-10-01

    Gamma-hydroxybutyrate (GHB) is a central nervous system depressant, primarily used as a recreational drug of abuse under numerous street names. It has also been involved in various instances of drug-facilitated sexual assault due to its potential incapacitating effects. The first aim of this paper is to measure the post-mortem concentration of endogenous GHB in whole blood and urine samples of 30 GHB-free subjects, divided according to the post-mortem interval (PMI) into three groups (first group: 24-36h; second group: 37-72h; third group: 73-192h), in order to evaluate the role of PMI in affecting post-mortem levels. Second, the authors evaluated the new formation of GHB in vitro in blood and urine samples of the three groups, which were stored at -20°C, 4°C and 20°C over a period of one month. The concentrations were measured by GC-MS after liquid-liquid extraction according to the method validated and published by Elliot (For. Sci. Int., 2003). For urine samples, GHB concentrations were creatinine-normalized. In the first group the GHB mean concentration measured after autopsy was: 2.14mg/L (range 0.54-3.21mg/L) in blood and 3.90mg/g (range 0.60-4.81mg/g) in urine; in the second group it was: 5.13mg/L (range 1.11-9.60mg/L) in blood and 3.93mg/g (range 0.91-7.25mg/g) in urine; in the third group it was: 11.8mg/L (range 3.95-24.12mg/L) in blood and 9.83mg/g (range 3.67-21.90mg/g) in urine. The results obtained in blood and urine samples showed a statistically significant difference among the PMI groups, whereas storage at the different temperatures showed, for both blood and urine samples, a mean difference at 20°C compared to -20°C that was not statistically significant at the 10% level. These findings allow us to affirm that the PMI strongly affects the post-mortem production of GHB in blood and urine samples. Regarding the new formation of GHB in vitro in blood and urine samples of the three groups stored at -20°C, 4°C and 20°C over a period of one month, there was no significant increase of

  2. Estimating the Post-Mortem Interval of skeletonized remains: The use of Infrared spectroscopy and Raman spectro-microscopy

    Science.gov (United States)

    Creagh, Dudley; Cameron, Alyce

    2017-08-01

    When skeletonized remains are found, it becomes a police task to identify the body and establish the cause of death. It assists investigators if the Post-Mortem Interval (PMI) can be established. Hitherto no reliable qualitative method of estimating the PMI has been found. A quantitative method has yet to be developed. This paper shows that IR spectroscopy and Raman microscopy have the potential to form the basis of a quantitative method.

  3. Detection and differentiation of early acute and following age stages of myocardial infarction with quantitative post-mortem cardiac 1.5T MR.

    Science.gov (United States)

    Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J; Schuster, Frederick; Riva, Fabiano; Zech, Wolf-Dieter

    2017-01-01

    Recently, quantitative MR sequences have started being used in post-mortem imaging. The goal of the present study was to evaluate whether early acute and subsequent age stages of myocardial infarction can be detected and discerned by quantitative 1.5T post-mortem cardiac magnetic resonance (PMCMR) based on quantitative T1, T2 and PD values. In 80 deceased individuals (25 female, 55 male), a cardiac MR quantification sequence was performed prior to cardiac dissection at autopsy in a prospective study. Focal myocardial signal alterations detected in synthetically generated MR images were quantified for their T1, T2 and PD values. The locations of signal alteration measurements in PMCMR were targeted at autopsy heart dissection, and cardiac tissue specimens were taken for histologic examinations. Quantified signal alterations in PMCMR were correlated with the corresponding histologic age stage of myocardial infarction. In PMCMR, seventy-three focal myocardial signal alterations were detected in 49 of 80 investigated hearts. These signal alterations were diagnosed histologically as early acute (n=39), acute (n=14), subacute (n=10) and chronic (n=10) age stages of myocardial infarction. Statistical analysis revealed that, based on their quantitative T1, T2 and PD values, a significant difference between all defined age groups of myocardial infarction can be determined. It can be concluded that quantitative 1.5T PMCMR quantification based on quantitative T1, T2 and PD values is feasible for characterization and differentiation of early acute and subsequent age stages of myocardial infarction. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
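    The abstract reports that quantitative T1, T2 and PD values separate the infarct age stages but does not state a decision rule. A minimal nearest-centroid sketch of how such triplets could be assigned to a stage is given below; the centroid values are invented placeholders, not the study's measurements.

    ```python
    import numpy as np

    # Hypothetical stage centroids in (T1 ms, T2 ms, PD %) space - invented numbers,
    # not values measured in the study.
    STAGE_CENTROIDS = {
        "early acute": (1100.0, 95.0, 80.0),
        "acute":       (1250.0, 110.0, 85.0),
        "subacute":    (1350.0, 120.0, 88.0),
        "chronic":     (900.0,  70.0, 70.0),
    }

    def classify_stage(t1, t2, pd):
        """Assign the infarct age stage whose centroid lies nearest (Euclidean
        distance) to the measured quantitative (T1, T2, PD) triplet."""
        sample = np.array([t1, t2, pd], dtype=float)
        return min(STAGE_CENTROIDS,
                   key=lambda s: np.linalg.norm(sample - np.array(STAGE_CENTROIDS[s])))

    print(classify_stage(1230.0, 108.0, 84.0))   # -> 'acute' with these toy centroids
    ```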

  4. An in-depth analysis of theoretical frameworks for the study of care coordination

    Directory of Open Access Journals (Sweden)

    Sabine Van Houdt

    2013-06-01

    Full Text Available Introduction: Complex chronic conditions often require long-term care from various healthcare professionals. Thus, maintaining quality care requires care coordination. Concepts for the study of care coordination require clarification to develop, study and evaluate coordination strategies. In 2007, the Agency for Healthcare Research and Quality defined care coordination and proposed five theoretical frameworks for exploring care coordination. This study aimed to update current theoretical frameworks and clarify key concepts related to care coordination. Methods: We performed a literature review to update existing theoretical frameworks. An in-depth analysis of these theoretical frameworks was conducted to formulate key concepts related to care coordination. Results: Our literature review found seven previously unidentified theoretical frameworks for studying care coordination. The in-depth analysis identified fourteen key concepts that the theoretical frameworks addressed. These were ‘external factors’, ‘structure’, ‘tasks characteristics’, ‘cultural factors’, ‘knowledge and technology’, ‘need for coordination’, ‘administrative operational processes’, ‘exchange of information’, ‘goals’, ‘roles’, ‘quality of relationship’, ‘patient outcome’, ‘team outcome’, and ‘(inter)organizational outcome’. Conclusion: These 14 interrelated key concepts provide a base to develop or choose a framework for studying care coordination. The relational coordination theory and the multi-level framework are interesting as these are the most comprehensive.

  5. The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare

    Science.gov (United States)

    1998-05-26

    USAWC Strategy Research Project. Author: Donald A. Gagliano, M.D. Title: The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare.

  6. VisRseq: R-based visual framework for analysis of sequencing data.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  7. Studies on time of death estimation in the early post mortem period -- application of a method based on eyeball temperature measurement to human bodies.

    Science.gov (United States)

    Kaliszan, Michał

    2013-09-01

    This paper presents a verification of the thermodynamic model allowing an estimation of the time of death (TOD) by calculating the post mortem interval (PMI) based on a single eyeball temperature measurement at the death scene. The study was performed on 30 cases with known PMI, ranging from 1h 35min to 5h 15min, using pin probes connected to a high precision electronic thermometer (Dostmann-electronic). The measured eye temperatures ranged from 20.2 to 33.1°C. Rectal temperature was measured at the same time and ranged from 32.8 to 37.4°C. Ambient temperatures which ranged from -1 to 24°C, environmental conditions (still air to light wind) and the amount of hair on the head were also recorded every time. PMI was calculated using a formula based on Newton's law of cooling, previously derived and successfully tested in comprehensive studies on pigs and a few human cases. Thanks to both the significantly faster post mortem decrease of eye temperature and a residual or nonexistent plateau effect in the eye, as well as practically no influence of body mass, TOD in the human death cases could be estimated with good accuracy. The highest TOD estimation error during the post mortem intervals up to around 5h was 1h 16min, 1h 14min and 1h 03min, respectively in three cases among 30, while for the remaining 27 cases it was not more than 47min. The mean error for all 30 cases was ±31min. All that indicates that the proposed method is of quite good precision in the early post mortem period, with an accuracy of ±1h for a 95% confidence interval. On the basis of the presented method, TOD can be also calculated at the death scene with the use of a proposed portable electronic device (TOD-meter). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
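    The study's calibrated formula is not reproduced in the record. As a rough illustration of the underlying idea, estimating the interval since death from a single temperature reading via Newton's law of cooling, a small sketch follows; the cooling constant and all numbers are illustrative assumptions, not the values derived for the eyeball in the study.

    ```python
    import math

    def pmi_from_cooling(t_measured, t_ambient, t_initial=37.0, k=0.28):
        """Estimate the post-mortem interval (hours) from a single temperature
        reading using Newton's law of cooling:
            T(t) = T_amb + (T_0 - T_amb) * exp(-k * t)
        Solving for t gives t = -ln((T(t) - T_amb) / (T_0 - T_amb)) / k.
        k (1/h) is an illustrative cooling constant, not the study's value.
        """
        ratio = (t_measured - t_ambient) / (t_initial - t_ambient)
        if not 0.0 < ratio < 1.0:
            raise ValueError("measured temperature outside the cooling range")
        return -math.log(ratio) / k

    # Example: eye temperature 28.0 C measured in a 15.0 C environment.
    print(f"Estimated PMI: {pmi_from_cooling(28.0, 15.0):.1f} h")
    ```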

  8. Role of biogenic amines in the post-mortem migration of Anisakis pegreffii (Nematoda: Anisakidae Dujardin, 1845) larvae into fish fillets.

    Science.gov (United States)

    Šimat, Vida; Miletić, Jelena; Bogdanović, Tanja; Poljak, Vedran; Mladineo, Ivona

    2015-12-02

    Infective third-stage larvae (L3) of nematode Anisakis spp. have been recognized as one of the major food-borne threats in lightly processed fish products in Europe, particularly in the Mediterranean region. Therefore, the effect of different storage temperatures of fish on larval post-mortem migration from the visceral cavity into fillets is an important parameter to take into account when evaluating the risk for consumer safety. European anchovies (Engraulis encrasicolus) were caught during the fishing season, a subsample of fillets was checked for the presence of Anisakis larvae at capture (mean abundance=0.07), and the rest was stored at four different temperatures (-18, 0, 4 and 22°C) in order to count migrating larvae and measure the production of biogenic amines over a period of time. Larvae were identified by morphological features and molecular tools. Post-mortem migration was observed in fillets stored at 0 and 4°C after three and five days, respectively, but not at 22 and -18°C. In case of storage at 22°C for two days, at the onset of putrefaction of the visceral organs, larvae migrated out of the visceral cavity towards the fish surface. The measured pH and biogenic amine profile during storage indicated that certain biochemical conditions trigger larval migration into fillets. Likewise, migration was observed at pH ~6.4, when sensory degradation of the fish was markedly visible. Although larval migration was delayed by approximately four days at the lower storage temperature, the correlation between biogenic amine levels and larval migration into the fillet was high and statistically significant at both 0°C (r=0.998, p<0.01) and 4°C (r=0.946, p<0.05). Out of eight biogenic amines measured, cadaverine and putrescine levels correlated the most with the post-mortem migration at 4°C, while tyramine levels were significant at both temperatures. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Analysis of Worldwide Regulatory Framework for On-Line Maintenance

    International Nuclear Information System (INIS)

    Ahn, Sang Kyu; Oh, Kyu Myung; Lee, Chang Ju

    2010-01-01

    With the increasing economic pressures being faced and the potential for shortening outage times under deregulated electricity markets worldwide, licensees are motivated to perform an increasing amount of on-line maintenance (OLM). OLM refers to planned maintenance of nuclear reactor facilities, including structures, systems, and components (SSCs), during power operation. In Korea, a similar situation is emerging, so a regulatory framework for OLM needs to be established. A few years ago, foreign countries' practices related to OLM were surveyed by the Working Group on Inspection Practices (WGIP) of OECD/NEA/CNRA. The survey results and additional new information on each country's status, which are analyzed in this paper, will be helpful in establishing our own regulatory framework for OLM. From the analysis, some key points to be addressed in establishing a regulatory framework for OLM are suggested

  10. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    Science.gov (United States)

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10 per technique) using three fabrication methods: one-piece casting, framework cementation on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the cemented-framework and laser welding techniques (P < .05). Laser welding and cementing the framework on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.

  11. Structural Analysis in a Conceptual Design Framework

    Science.gov (United States)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  12. Learner Analysis Framework for Globalized E-Learning: A Case Study

    Directory of Open Access Journals (Sweden)

    Mamta Saxena

    2011-06-01

    Full Text Available The shift to technology-mediated modes of instructional delivery and increased global connectivity has led to a rise in globalized e-learning programs. Educational institutions face multiple challenges as they seek to design effective, engaging, and culturally competent instruction for an increasingly diverse learner population. The purpose of this study was to explore strategies for expanding learner analysis within the instructional design process to better address cultural influences on learning. A case study approach leveraged the experience of practicing instructional designers to build a framework for culturally competent learner analysis. The study discussed the related challenges and recommended strategies to improve the effectiveness of cross-cultural learner analysis. Based on the findings, a framework for conducting cross-cultural learner analysis to guide the cultural analysis of diverse learners was proposed. The study identified the most critical factors in improving cross-cultural learner analysis as the judicious use of existing research on cross-cultural theories and joint deliberation on the part of all the participants from the management to the learners. Several strategies for guiding and improving the cultural inquiry process were summarized. Barriers and solutions for the requirements are also discussed.

  13. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    Science.gov (United States)

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.

  14. Post-mortem hemoparasite detection in free-living Brazilian brown brocket deer (Mazama gouazoubira, Fischer 1814

    Directory of Open Access Journals (Sweden)

    Júlia Angélica Gonçalves da Silveira

    Full Text Available Tick-borne infections can result in serious health problems for wild ruminants, and some of these infectious agents can be considered zoonoses. The aim of the present study was the post-mortem detection of hemoparasites in free-living Mazama gouazoubira from Minas Gerais state, Brazil. The deer samples consisted of free-living M. gouazoubira (n = 9) individuals that died after capture. Necropsy examinations of the carcasses were performed to search for macroscopic alterations. Organ samples were collected for subsequent imprint slides, and nested PCR assays were performed to detect hemoparasite species. Imprint slide assays from four deer showed erythrocytes infected with Piroplasmida small trophozoites, and A. marginale corpuscles were observed in erythrocytes from two animals. A. marginale and trophozoite co-infections occurred in two deer. A nested PCR analysis of the organs showed that six of the nine samples were positive for Theileria sp., five were positive for A. phagocytophilum and three were positive for A. marginale, with co-infection occurring in four deer. The results of the present study demonstrate that post-mortem diagnostics using imprint slides and molecular assays are an effective method for detecting hemoparasites in organs.

  15. Metacognition and evidence analysis instruction: an educational framework and practical experience.

    Science.gov (United States)

    Parrott, J Scott; Rubinstein, Matthew L

    2015-08-21

    The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.

  16. Optimisation of post mortem cardiac computed tomography compared to optical coherence tomography and histopathology - Technical note

    DEFF Research Database (Denmark)

    Falk, Erling

    2014-01-01

    . Here, a new method for optimising cardiac coronary CT with optical coherence tomography (OCT) and histopathology is presented. Materials and methods: Twenty human hearts obtained from autopsies were used. A contrast agent that solidifies after cooling was injected into the coronary arteries. CT...... of the images was also developed. Results: We have succeeded in developing a new method for post-mortem coronary CT angiography in which an autopsy heart is placed in a chest phantom to simulate clinical CT. Conclusion: The new method permits comparison of CT with OCT and histopathology. This method can also...

  17. Phosphatidylcholine 36:1 concentration decreases along with demyelination in the cuprizone animal model and post-mortem of multiple sclerosis brain tissue.

    Science.gov (United States)

    Trépanier, Marc-Olivier; Hildebrand, Kayla D; Nyamoya, Stella D; Amor, Sandra; Bazinet, Richard P; Kipp, Markus

    2018-03-25

    Multiple sclerosis (MS) is a demyelinating and inflammatory disease. Myelin is enriched in lipids and, more specifically, oleic acid. The goal of this study was to evaluate the concentration of oleic acid following demyelination and remyelination in the cuprizone model, test whether these changes occurred in specific lipid species, and determine whether differences in the cuprizone model correlate with changes observed in post-mortem human brains. Eight-week-old C57Bl/6 mice were fed a 0.2% cuprizone diet for 5 weeks and some animals were allowed to recover for 11 days. Demyelination, inflammation, and lipid concentrations were measured in the corpus callosum. Standard fatty acid techniques and liquid chromatography combined with tandem mass spectrometry were performed to measure concentrations of fatty acids in total brain lipids and a panel of lipid species within the phosphatidylcholine (PC) class. Similar measurements were conducted in post-mortem brain tissues of MS patients and were compared to healthy controls. Five weeks of cuprizone administration resulted in demyelination followed by significant remyelination after 11 days of recovery. Compared to control, oleic acid was decreased after 5 weeks of cuprizone treatment and increased during the recovery phase. This decrease in oleic acid was associated with a specific decrease in the PC 36:1 pool. Similar results were observed in human post-mortem brains. The decrease in myelin content in the cuprizone model was accompanied by a decrease in oleic acid concentration and was associated with PC 36:1, suggesting that specific lipids could be a potential biomarker for myelin degeneration. The biological relevance of oleic acid for disease progression remains to be verified. This article is protected by copyright. All rights reserved.

  18. Decreased CSF transferrin in sCJD: a potential pre-mortem diagnostic test for prion disorders.

    Directory of Open Access Journals (Sweden)

    Ajay Singh

    2011-03-01

    Full Text Available Sporadic Creutzfeldt-Jakob-disease (sCJD) is a fatal neurodegenerative condition that escapes detection until autopsy. Recently, brain iron dyshomeostasis accompanied by increased transferrin (Tf) was reported in sCJD cases. The consequence of this abnormality on cerebrospinal-fluid (CSF) levels of Tf is uncertain. We evaluated the accuracy of CSF Tf, a 'new' biomarker, as a pre-mortem diagnostic test for sCJD when used alone or in combination with the 'current' biomarker total-tau (T-tau). Levels of total-Tf (T-Tf), isoforms of Tf (Tf-1 and Tf-β2), and iron saturation of Tf were quantified in CSF collected 0.3-36 months before death (duration) from 99 autopsy-confirmed sCJD (CJD+) and 75 confirmed cases of dementia of non-CJD origin (CJD-). Diagnostic accuracy was estimated by non-parametric tests, logistic regression, and receiver operating characteristic (ROC) analysis. Area under the ROC curve (AUC), sensitivity, specificity, positive and negative predictive values (PV), and likelihood ratios (LR) of each biomarker and biomarker combination were calculated. We report that relative to CJD-, CJD+ cases had lower median CSF T-Tf (125,7093 vs. 217,7893) and higher T-tau (11530 vs. 1266) values. AUC was 0.90 (95% confidence interval (CI), 0.85-0.94) for T-Tf, and 0.93 (95% CI, 0.89-0.97) for T-Tf combined with T-tau. With cut-offs defined to achieve a sensitivity of ∼85%, T-Tf identified CJD+ cases with a specificity of 71.6% (95% CI, 59.1-81.7), positive LR of 3.0 (95% CI, 2.1-4.5), negative LR of 0.2 (95% CI, 0.1-0.3), and accuracy of 80.1%. The effect of patient age and duration was insignificant. T-Tf combined with T-tau identified CJD+ with improved specificity of 87.5% (95% CI, 76.3-94.1), positive LR of 6.8 (95% CI, 3.5-13.1), negative LR of 0.2 (95% CI, 0.1-0.3), positive PV of 91.0%, negative PV of 80.0%, and accuracy of 86.2%. Thus, CSF T-Tf, a new biomarker, when combined with the current biomarker T-tau, is a reliable pre-mortem diagnostic test for sCJD.
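    For orientation, the diagnostic-accuracy quantities quoted above (sensitivity, specificity, likelihood ratios, predictive values) can be computed from a biomarker cutoff as in the toy sketch below; the marker values and the cutoff are invented, not the study data.

    ```python
    import numpy as np

    def diagnostic_metrics(values, is_case, cutoff, lower_in_cases=True):
        """Compute sensitivity, specificity, LR+, LR- and predictive values for a
        'positive test' defined as value < cutoff (lower_in_cases=True), mirroring
        a marker such as CSF transferrin that is decreased in cases."""
        values = np.asarray(values, dtype=float)
        is_case = np.asarray(is_case, dtype=bool)
        positive = values < cutoff if lower_in_cases else values > cutoff
        tp = np.sum(positive & is_case)
        fn = np.sum(~positive & is_case)
        fp = np.sum(positive & ~is_case)
        tn = np.sum(~positive & ~is_case)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return {
            "sensitivity": sens,
            "specificity": spec,
            "LR+": sens / (1 - spec),
            "LR-": (1 - sens) / spec,
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
        }

    # Toy example with fabricated marker values (cases tend to be lower).
    cases = [90, 110, 130, 150, 170]
    controls = [160, 180, 200, 220, 240]
    print(diagnostic_metrics(cases + controls, [True] * 5 + [False] * 5, cutoff=175))
    ```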

  19. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
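    A minimal sketch of the landscape-based cumulative effects modelling step, fitting a multiple linear regression of aquatic condition on land-use stressors and applying it to a future scenario, is shown below; the predictor variables and all numbers are hypothetical, not the study's data.

    ```python
    import numpy as np

    # Hypothetical site-level data: each row is [% mined area, road density, conductivity].
    # The response is a biotic condition score. All numbers are invented for illustration.
    X = np.array([
        [ 2.0, 0.5,  150.0],
        [10.0, 1.2,  400.0],
        [25.0, 2.0,  900.0],
        [ 5.0, 0.8,  250.0],
        [40.0, 3.1, 1500.0],
        [15.0, 1.5,  600.0],
    ])
    y = np.array([85.0, 70.0, 50.0, 80.0, 30.0, 60.0])

    # Ordinary least squares with an intercept column.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Predict the condition score for a hypothetical future land-use scenario at one site.
    scenario = np.array([1.0, 20.0, 1.8, 800.0])  # intercept, mined %, roads, conductivity
    print("Fitted coefficients:", coef)
    print("Predicted condition under scenario:", scenario @ coef)
    ```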

  20. A new kernel discriminant analysis framework for electronic nose recognition

    International Nuclear Information System (INIS)

    Zhang, Lei; Tian, Feng-Chun

    2014-01-01

    Graphical abstract: - Highlights: • This paper proposes a new discriminant analysis framework for feature extraction and recognition. • The principle of the proposed NDA is derived mathematically. • The NDA framework is coupled with kernel PCA for classification. • The proposed KNDA is compared with state-of-the-art e-Nose recognition methods. • The proposed KNDA shows the best performance in e-Nose experiments. - Abstract: Electronic nose (e-Nose) technology based on a metal oxide semiconductor gas sensor array is widely studied for detection of gas components. This paper proposes a new discriminant analysis framework (NDA) for dimension reduction and e-Nose recognition. In the NDA, the between-class and within-class Laplacian scatter matrices are designed from sample to sample, respectively, to characterize between-class separability and within-class compactness, by seeking a discriminant matrix that simultaneously maximizes the between-class Laplacian scatter and minimizes the within-class Laplacian scatter. In terms of the linear separability in the high-dimensional kernel mapping space and the dimension reduction of principal component analysis (PCA), an effective kernel PCA plus NDA method (KNDA) is proposed for rapid detection of gas mixture components by an e-Nose. The NDA framework is derived in this paper, as well as the specific implementations of the proposed KNDA method in the training and recognition processes. The KNDA is examined on e-Nose datasets of six kinds of gas components, and compared with state-of-the-art e-Nose classification methods. Experimental results demonstrate that the proposed KNDA method shows the best performance, with an average recognition rate and total recognition rate of 94.14% and 95.06%, which leads to promising feature extraction and multi-class recognition in e-Nose
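    The abstract does not give the exact Laplacian scatter definitions, so the sketch below shows only a generic discriminant-analysis construction: between-class and within-class scatter matrices and the associated generalized eigenproblem. It is a plain LDA-style illustration under that assumption, not the authors' Laplacian-weighted NDA or its kernel PCA stage.

    ```python
    import numpy as np

    def discriminant_directions(X, labels, n_components=2):
        """Toy discriminant analysis: maximize between-class scatter while
        minimizing within-class scatter by solving Sw^-1 Sb w = lambda w.
        This is a plain LDA-style sketch, not the paper's Laplacian NDA."""
        X = np.asarray(X, dtype=float)
        labels = np.asarray(labels)
        mean_all = X.mean(axis=0)
        d = X.shape[1]
        Sb = np.zeros((d, d))
        Sw = np.zeros((d, d))
        for c in np.unique(labels):
            Xc = X[labels == c]
            mc = Xc.mean(axis=0)
            diff = (mc - mean_all)[:, None]
            Sb += len(Xc) * diff @ diff.T
            Sw += (Xc - mc).T @ (Xc - mc)
        # Regularize Sw slightly so the solve is stable for tiny toy data.
        evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
        order = np.argsort(evals.real)[::-1]
        return evecs[:, order[:n_components]].real

    # Toy sensor-array responses for two gases (rows: samples, columns: sensors).
    X = np.array([[1.0, 0.2, 0.1], [1.1, 0.3, 0.0], [0.2, 1.0, 0.9], [0.1, 1.2, 1.0]])
    y = np.array([0, 0, 1, 1])
    W = discriminant_directions(X, y, n_components=1)
    print("Projected samples:", X @ W)
    ```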

  1. Information management and ante-mortem inspection procedures for the emerging diseases control: Experiences acquired in the epidemiological surveillance of bluetongue and lumpy skin disease.

    Science.gov (United States)

    Corradini, Alessandra; Trevisani, Marcello; Dosa, Geremia; Padovani, Anna

    2018-03-31

    The spread of exotic, emerging and re-emerging diseases has become, in recent years, one of the most important threats to animal production and public health, representing a new challenge for the European Community. In a global-market framework, where trade and contacts between countries are simplified, effective and well-developed surveillance systems are necessary. Multiple factors are, in fact, associated with the emergence of new, known or exotic diseases in this new economic panorama, and for these reasons controls on animal imports, traceability and timely detection of infected animals should be considered the basis of sound surveillance. In this work, we focused our attention on the management of Bluetongue and on the risk of introduction of Lumpy Skin Disease in Italy, in order to describe the national and European surveillance systems for these diseases. In particular, we underlined the crucial role of the information that reaches the Official Veterinarian at the slaughterhouse concerning the epidemiological situation of the sending countries. This information is important for the management of the ante-mortem inspection and for increasing the awareness of Veterinary Inspectors of their role in surveillance.

  2. Information management and ante-mortem inspection procedures for the emerging diseases control: Experiences acquired in the epidemiological surveillance of bluetongue and lumpy skin disease

    Directory of Open Access Journals (Sweden)

    Alessandra Corradini

    2018-03-01

    Full Text Available The spread of exotic, emerging and re-emerging diseases has become, in recent years, one of the most important threats to animal production and public health, representing a new challenge for the European Community. In a global-market framework, where trade and contacts between countries are simplified, effective and well-developed surveillance systems are necessary. Multiple factors are, in fact, associated with the emergence of new, known or exotic diseases in this new economic panorama, and for these reasons controls on animal imports, traceability and timely detection of infected animals should be considered the basis of sound surveillance. In this work, we focused our attention on the management of Bluetongue and on the risk of introduction of Lumpy Skin Disease in Italy, in order to describe the national and European surveillance systems for these diseases. In particular, we underlined the crucial role of the information that reaches the Official Veterinarian at the slaughterhouse concerning the epidemiological situation of the sending countries. This information is important for the management of the ante-mortem inspection and for increasing the awareness of Veterinary Inspectors of their role in surveillance.

  3. An analysis of a national strategic framework to promote tourism ...

    African Journals Online (AJOL)

    An analysis of a national strategic framework to promote tourism, leisure, sport and ... is to highlight the extent to which selected macro policy components namely, ... tourism growth, tourism safety and security, environmental management and ...

  4. A framework for the economic analysis of data collection methods for vital statistics.

    Science.gov (United States)

    Jimenez-Soto, Eliana; Hodge, Andrew; Nguyen, Kim-Huong; Dettrick, Zoe; Lopez, Alan D

    2014-01-01

    Over recent years there has been a strong movement towards the improvement of vital statistics and other types of health data that inform evidence-based policies. Collecting such data is not cost free. To date there is no systematic framework to guide investment decisions on methods of data collection for vital statistics or health information in general. We developed a framework to systematically assess the comparative costs and outcomes/benefits of the various data collection methods (DCMs) for vital statistics. The proposed framework is four-pronged and utilises two major economic approaches to systematically assess the available data collection methods: cost-effectiveness analysis and efficiency analysis. We built a stylised example of a hypothetical low-income country to perform a simulation exercise in order to illustrate an application of the framework. Using simulated data, the results from the stylised example show that the rankings of the data collection methods are not affected by the use of either cost-effectiveness or efficiency analysis. However, the rankings are affected by how quantities are measured. There have been several calls for global improvements in collecting useable data, including vital statistics, from health information systems to inform public health policies. Ours is the first study that proposes a systematic framework to assist countries in undertaking an economic evaluation of DCMs. Despite numerous challenges, we demonstrate that a systematic assessment of outputs and costs of DCMs is not only necessary, but also feasible. The proposed framework is general enough to be easily extended to other areas of health information.

  5. FIND--a unified framework for neural data analysis.

    Science.gov (United States)

    Meier, Ralph; Egert, Ulrich; Aertsen, Ad; Nawrot, Martin P

    2008-10-01

    The complexity of neurophysiology data has increased tremendously over the last years, especially due to the widespread availability of multi-channel recording techniques. With adequate computing power the current limit for computational neuroscience is the effort and time it takes for scientists to translate their ideas into working code. Advanced analysis methods are complex and often lack reproducibility on the basis of published descriptions. To overcome this limitation we develop FIND (Finding Information in Neural Data) as a platform-independent, open source framework for the analysis of neuronal activity data based on Matlab (Mathworks). Here, we outline the structure of the FIND framework and describe its functionality, our measures of quality control, and the policies for developers and users. Within FIND we have developed a unified data import from various proprietary formats, simplifying standardized interfacing with tools for analysis and simulation. The toolbox FIND covers a steadily increasing number of tools. These analysis tools address various types of neural activity data, including discrete series of spike events, continuous time series and imaging data. Additionally, the toolbox provides solutions for the simulation of parallel stochastic point processes to model multi-channel spiking activity. We illustrate two examples of complex analyses with FIND tools: First, we present a time-resolved characterization of the spiking irregularity in an in vivo extracellular recording from a mushroom-body extrinsic neuron in the honeybee during odor stimulation. Second, we describe layer specific input dynamics in the rat primary visual cortex in vivo in response to visual flash stimulation on the basis of multi-channel spiking activity.
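    FIND itself is Matlab-based; purely as an illustration of one capability mentioned above, simulating parallel stochastic point processes as multi-channel spike trains, a small homogeneous-Poisson sketch in Python follows. The rates and duration are arbitrary, and the code is not the FIND toolbox implementation.

    ```python
    import numpy as np

    def poisson_spike_trains(rates_hz, duration_s, rng=None):
        """Simulate independent homogeneous Poisson spike trains, one per channel,
        by drawing exponential inter-spike intervals. Returns a list of spike-time
        arrays (seconds)."""
        rng = np.random.default_rng(rng)
        trains = []
        for rate in rates_hz:
            # Draw more intervals than expected so the cumulative sum covers duration_s.
            isis = rng.exponential(1.0 / rate, size=int(rate * duration_s * 2) + 10)
            times = np.cumsum(isis)
            trains.append(times[times < duration_s])
        return trains

    trains = poisson_spike_trains([5.0, 20.0, 50.0], duration_s=2.0, rng=0)
    for i, t in enumerate(trains):
        print(f"channel {i}: {len(t)} spikes")
    ```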

  6. An intersectionality-based policy analysis framework: critical reflections on a methodology for advancing equity.

    Science.gov (United States)

    Hankivsky, Olena; Grace, Daniel; Hunting, Gemma; Giesbrecht, Melissa; Fridkin, Alycia; Rudrum, Sarah; Ferlatte, Olivier; Clark, Natalie

    2014-12-10

    In the field of health, numerous frameworks have emerged that advance understandings of the differential impacts of health policies to produce inclusive and socially just health outcomes. In this paper, we present the development of an important contribution to these efforts - an Intersectionality-Based Policy Analysis (IBPA) Framework. Developed over the course of two years in consultation with key stakeholders and drawing on best and promising practices of other equity-informed approaches, this participatory and iterative IBPA Framework provides guidance and direction for researchers, civil society, public health professionals and policy actors seeking to address the challenges of health inequities across diverse populations. Importantly, we present the application of the IBPA Framework in seven priority health-related policy case studies. The analysis of each case study is focused on explaining how IBPA: 1) provides an innovative structure for critical policy analysis; 2) captures the different dimensions of policy contexts including history, politics, everyday lived experiences, diverse knowledges and intersecting social locations; and 3) generates transformative insights, knowledge, policy solutions and actions that cannot be gleaned from other equity-focused policy frameworks. The aim of this paper is to inspire a range of policy actors to recognize the potential of IBPA to foreground the complex contexts of health and social problems, and ultimately to transform how policy analysis is undertaken.

  7. VisRseq: R-based visual framework for analysis of sequencing data

    OpenAIRE

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven JM

    2015-01-01

    Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for ...

  8. Hippocampal Microbleed on a Post-Mortem T2*-Weighted Gradient-Echo 7.0-Tesla Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    J. De Reuck

    2011-09-01

    Full Text Available In the present post-mortem study of a brain from an Alzheimer patient, a T2*-weighted gradient-echo 7.0-T MRI of a coronal brain section showed a hyposignal in the hippocampus, suggesting a microbleed. On the corresponding histological examination, only iron deposits around the granular cellular layer and in blood vessel walls of the hippocampus were observed, without evidence of bleeding. This case report illustrates that the detection of microbleeds on MRI has to be interpreted with caution.

  9. Post-mortem prediction of primal and selected retail cut weights of New Zealand lamb from carcass and animal characteristics.

    Science.gov (United States)

    Ngo, L; Ho, H; Hunter, P; Quinn, K; Thomson, A; Pearson, G

    2016-02-01

    Post-mortem measurements (cold weight, grade and external carcass linear dimensions) as well as live animal data (age, breed, sex) were used to predict ovine primal and retail cut weights for 792 lamb carcases. Significant levels of variance could be explained using these predictors. The predictive power of those measurements on primal and retail cut weights was studied by using the results from principal component analysis and the absolute value of the t-statistics of the linear regression model. High prediction accuracy for primal cut weight was achieved (adjusted R(2) up to 0.95), as well as moderate accuracy for key retail cut weight: tenderloins (adj-R(2)=0.60), loin (adj-R(2)=0.62), French rack (adj-R(2)=0.76) and rump (adj-R(2)=0.75). The carcass cold weight had the best predictive power, with the accuracy increasing by around 10% after including the next three most significant variables. Copyright © 2015 Elsevier Ltd. All rights reserved.
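    A minimal sketch of the kind of regression reported above, predicting a cut weight from carcass measurements and reporting the adjusted R², is given below; the carcass data are invented toy values, not the study's 792 carcases.

    ```python
    import numpy as np

    def adjusted_r2(y, y_hat, n_predictors):
        """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
        y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
        ss_res = np.sum((y - y_hat) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        n = len(y)
        return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

    # Invented carcass data: [cold weight (kg), leg length (cm)] -> loin weight (kg).
    X = np.array([[16.0, 24.0], [18.5, 25.0], [20.0, 26.5], [22.0, 27.0],
                  [24.5, 28.0], [26.0, 29.5], [28.0, 30.0], [30.0, 31.0]])
    y = np.array([0.62, 0.71, 0.78, 0.85, 0.95, 1.02, 1.10, 1.18])

    A = np.column_stack([np.ones(len(X)), X])   # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("adjusted R^2:", round(adjusted_r2(y, A @ coef, n_predictors=X.shape[1]), 3))
    ```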

  10. Framework for Interactive Parallel Dataset Analysis on the Grid

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, David A.; Ananthan, Balamurali; /Tech-X Corp.; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and to construct professional-quality visualizations of the results.

  11. Interactive Safety Analysis Framework of Autonomous Intelligent Vehicles

    Directory of Open Access Journals (Sweden)

    Cui You Xiang

    2016-01-01

    Full Text Available More than 100,000 people were killed and around 2.6 million injured in road accidents in the People’s Republic of China (PRC, that is four to eight times that of developed countries, equivalent to 6.2 mortality per 10 thousand vehicles—the highest rate in the world. There are more than 1,700 fatalities and 840,000 injuries yearly due to vehicle crashes off public highways. In this paper, we proposed a interactive safety situation and threat analysis framework based on driver behaviour and vehicle dynamics risk analysis based on ISO26262…

  12. A threat analysis framework as applied to critical infrastructures in the Energy Sector.

    Energy Technology Data Exchange (ETDEWEB)

    Michalski, John T.; Duggan, David Patrick

    2007-09-01

    The need to protect national critical infrastructure has led to the development of a threat analysis framework. The threat analysis framework can be used to identify the elements required to quantify threats against critical infrastructure assets and provide a means of distributing actionable threat information to critical infrastructure entities for the protection of infrastructure assets. This document identifies and describes five key elements needed to perform a comprehensive analysis of threat: the identification of an adversary, the development of generic threat profiles, the identification of generic attack paths, the discovery of adversary intent, and the identification of mitigation strategies.

  13. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    Full Text Available “Smart city” is a concept that has been the subject of increasing attention in urban planning and governance during recent years. The first step to create Smart Cities is to understand its concept. However, a brief review of literature shows that the concept of Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define Smart City. To this aim, an extensive literature review was done. Then, a keyword analysis on literature was held against main research questions (why, what, who, when, where, how and based on three main domains involved in the policy decision making process and Smart City plan development: Academic, Industrial and Governmental. This resulted in a conceptual framework for Smart City. The result clarifies the definition of Smart City, while providing a framework to define Smart City’s each sub-system. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.

  14. Development of fetal brain of 20 weeks gestational age: Assessment with post-mortem Magnetic Resonance Imaging

    International Nuclear Information System (INIS)

    Zhang Zhonghe; Liu Shuwei; Lin Xiangtao; Teng Gaojun; Yu Taifei; Fang Fang; Zang Fengchao

    2011-01-01

    Background: The 20th week gestational age (GA) is at mid-gestation and corresponds to the age at which the termination of pregnancy in several countries and the first Magnetic Resonance Imaging (MRI) can be performed, and at which the premature babies may survive. However, at present, very little is known about the exact anatomical character at this GA. Objective: To delineate the developing fetal brain of 20 weeks GA and obtain the three dimensional visualization model. Materials and methods: 20 fetal specimens were scanned by 3.0 T and 7.0 T post-mortem MRI, and the three dimensional visualization model was obtained with Amira 4.1. Results: Most of the sulci or their anlage, except the postcentral sulcus and intraparietal sulcus, were present. The laminar organization, described as layers with different signal intensities, was most clearly distinguished at the parieto-occipital lobe and peripheral regions of the hippocampus. The basal nuclei could be clearly visualized, and the brain stem and cerebellum had formed their common shape. On the visualization model, the shape and relative relationship of the structures could be appropriately delineated. The ranges of normal values of the brain structures were obtained, but no sexual dimorphisms or cerebral asymmetries were found. Conclusions: The developing fetal brain of 20 weeks GA can be clearly delineated on 3.0 T and 7.0 T post-mortem MRIs, and the three dimensional visualization model supplies great help in precise cognition of the immature brain. These results may have positive influences on the evaluation of the fetal brain in the uterus.

  15. The Measurand Framework: Scaling Exploratory Data Analysis

    Science.gov (United States)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of Earth's time-varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed with a need to support legacy, in-house processing but with an eye towards big-data processing with Apache Spark and other modern big data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.
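    The framework's code is not part of this record; the fragment below is a hypothetical sketch of the two ideas highlighted in the abstract, chained data transformation stages with provenance tracking and recomputation of downstream results when upstream data change. All class and method names are invented.

    ```python
    class Stage:
        """A minimal transformation stage: recomputes itself (and lets downstream
        stages recompute) whenever it is pulled, and records a version counter as
        lightweight provenance. Hypothetical sketch, not the Measurand framework."""

        def __init__(self, name, func, inputs=()):
            self.name, self.func, self.inputs = name, func, list(inputs)
            self.version = 0          # bumped on every recomputation (provenance)
            self.value = None

        def compute(self):
            upstream = [s.compute() for s in self.inputs]
            self.value = self.func(*upstream)
            self.version += 1
            return self.value

        def provenance(self):
            return {self.name: self.version,
                    **{k: v for s in self.inputs for k, v in s.provenance().items()}}


    raw = Stage("raw_timeseries", lambda: [1.0, 2.0, 4.0, 2.0, 1.0])
    detrended = Stage("detrend", lambda x: [v - sum(x) / len(x) for v in x], [raw])
    energy = Stage("energy", lambda x: sum(v * v for v in x), [detrended])

    print("energy:", energy.compute())
    print("provenance:", energy.provenance())
    ```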

  16. Lipoma of the midbrain: post-mortem finding in a patient with breast cancer

    Directory of Open Access Journals (Sweden)

    Verônica Maia Gouvea

    1989-09-01

    Full Text Available Intracranial lipomas are rare, usually do not have clinical expression and are located more frequently in the corpus callosum. Other locations include the spinal cord, midbrain tectum, superior vermis, tuber cinereum, infundibulum and, more rarely, the cerebellopontine angle, hypothalamus, superior medullary velum and insula. We report the case of a lipoma of the left inferior colliculus which was a post-mortem finding in a woman who died of breast cancer. Although there are reports of intracranial lipomas in patients with malignant tumors, there is no explanation for the co-existence of the two tumors. The present tumor also includes a segment of a nerve, which is not uncommon, but a less common finding was the presence of nests of Schwann cells within it, shown by immunohistochemistry.

  17. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe; Dalcin, Lisandro; Collier, Nathan; Calo, Victor M.

    2014-01-01

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation
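    The record is truncated at this point. For context, the Cahn-Hilliard phase-field equation it refers to is commonly written (in one standard form, with order parameter u, mobility M, double-well potential f and interface parameter ε) as:

    ```latex
    \frac{\partial u}{\partial t} \;=\; \nabla \cdot \Big( M \, \nabla \big( f'(u) - \epsilon^{2} \nabla^{2} u \big) \Big),
    \qquad f(u) = \tfrac{1}{4}\,\big(u^{2} - 1\big)^{2}.
    ```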

  18. PageRank, HITS and a unified framework for link analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first-order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
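    As a point of reference for one of the two extremes discussed above, a minimal power-iteration sketch of standard PageRank on a tiny adjacency matrix is shown below; it implements only the classic algorithm, not the intermediate normalized ranking schemes the paper analyzes.

    ```python
    import numpy as np

    def pagerank(adj, damping=0.85, tol=1e-10, max_iter=1000):
        """Standard PageRank by power iteration on a column-stochastic matrix.
        adj[i, j] = 1 means a link from page j to page i."""
        adj = np.asarray(adj, dtype=float)
        n = adj.shape[0]
        col_sums = adj.sum(axis=0)
        col_sums[col_sums == 0] = 1.0            # dangling columns: avoid divide-by-zero
        M = adj / col_sums
        rank = np.full(n, 1.0 / n)
        for _ in range(max_iter):
            new_rank = (1 - damping) / n + damping * M @ rank
            if np.abs(new_rank - rank).sum() < tol:
                break
            rank = new_rank
        return rank

    # Tiny 4-page web graph (entry [i, j] = link from page j to page i).
    adj = np.array([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 0, 1, 0]])
    print(pagerank(adj).round(3))
    ```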

  19. Agricultural Value Chains in Developing Countries; a Framework for Analysis

    NARCIS (Netherlands)

    Trienekens, J.H.

    2011-01-01

    The paper presents a framework for developing country value chain analysis made up of three components. The first consists of identifying major constraints for value chain upgrading: market access restrictions, weak infrastructures, lacking resources and institutional voids. In the second component

  20. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2011-01-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performances and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization, in order to achieve the performance requirements of the experiment. In the central tracker a first track fit is performed using a conformal map transformation based on a helix assumption, then the track is used as input for a Kalman Filter (package genfit), using GEANE as track follower. The track is then correlated to the pid detectors (e.g. Cerenkov detectors, EM Calorimeter or Muon Chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further implemented packages in PandaRoot are: the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. The contribution will report about the status of PandaRoot and show some example results for analysis of physics benchmark channels.

  1. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    Science.gov (United States)

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
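    The modified Beer-Lambert law mentioned above converts changes in optical density at two wavelengths into changes in oxy- and deoxy-hemoglobin concentration. A generic sketch of that conversion follows; the extinction coefficients, source-detector distance and differential pathlength factors are placeholder numbers, not ICNNA's configuration or API.

    ```python
    import numpy as np

    def mbll_concentrations(delta_od, extinction, distance_cm, dpf):
        """Modified Beer-Lambert law: delta_OD(lambda) =
        sum_over_chromophores extinction(lambda, c) * delta_conc(c) * distance * DPF(lambda).
        Solves for [delta HbO2, delta HbR] at each time sample."""
        delta_od = np.atleast_2d(np.asarray(delta_od, dtype=float))   # samples x wavelengths
        E = np.asarray(extinction, dtype=float)                       # wavelengths x 2
        L = distance_cm * np.asarray(dpf, dtype=float)                # effective path per wavelength
        A = E * L[:, None]
        conc, *_ = np.linalg.lstsq(A, delta_od.T, rcond=None)         # least-squares per sample
        return conc.T                                                  # samples x [HbO2, HbR]

    # Placeholder extinction coefficients (1/(mM*cm)) at roughly 760 nm and 850 nm.
    extinction = [[0.60, 1.55],    # 760 nm: [HbO2, HbR]
                  [1.10, 0.78]]    # 850 nm
    delta_od = [[0.012, 0.009], [0.015, 0.011]]
    print(mbll_concentrations(delta_od, extinction, distance_cm=3.0, dpf=[6.0, 5.5]))
    ```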

  2. A Framework for the Game-theoretic Analysis of Censorship Resistance

    Directory of Open Access Journals (Sweden)

    Elahi Tariq

    2016-10-01

    Full Text Available We present a game-theoretic analysis of optimal solutions for interactions between censors and censorship resistance systems (CRSs) by focusing on the data channel used by the CRS to smuggle clients’ data past the censors. This analysis leverages the inherent errors (false positives and negatives) made by the censor when trying to classify traffic as either non-circumvention traffic or as CRS traffic, as well as the underlying rate of CRS traffic. We identify Nash equilibrium solutions for several simple censorship scenarios and then extend those findings to more complex scenarios where we find that the deployment of a censorship apparatus does not qualitatively change the equilibrium solutions, but rather only affects the amount of traffic a CRS can support before being blocked. By leveraging these findings, we describe a general framework for exploring and identifying optimal strategies for the censorship circumventor, in order to maximize the amount of CRS traffic not blocked by the censor. We use this framework to analyze several scenarios with multiple data-channel protocols used as cover for the CRS. We show that it is possible to gain insights through this framework even without perfect knowledge of the censor’s (secret) values for the parameters in their utility function.
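    A toy numeric sketch of the trade-off this analysis leverages is given below: given the censor's false-positive and false-negative rates and the base rate of CRS traffic, one can tabulate the collateral damage and missed circumvention of a block-on-detection policy. The numbers are invented, and the sketch is not the paper's game formulation.

    ```python
    def censor_outcomes(fpr, fnr, crs_rate):
        """For a censor that blocks every flow its classifier flags as CRS traffic,
        compute the fraction of all traffic that is wrongly blocked benign traffic
        (collateral damage), missed CRS traffic, and correctly blocked CRS traffic."""
        blocked_benign = (1.0 - crs_rate) * fpr
        missed_crs = crs_rate * fnr
        blocked_crs = crs_rate * (1.0 - fnr)
        return {"blocked_benign": blocked_benign,
                "missed_crs": missed_crs,
                "blocked_crs": blocked_crs}

    # Invented operating point: 1% false positives, 10% false negatives,
    # and CRS traffic making up 0.1% of all flows.
    print(censor_outcomes(fpr=0.01, fnr=0.10, crs_rate=0.001))
    # With these numbers, roughly 0.999% of all traffic is benign but blocked,
    # versus only 0.09% correctly blocked CRS traffic; this asymmetry is what
    # a circumventor can exploit when choosing a cover protocol.
    ```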

  3. Post-mortem diagnostics in cases of sepsis. Part 1. Aetiology, epidemiology and microbiological tests

    Directory of Open Access Journals (Sweden)

    Marta Rorat

    2015-03-01

    Full Text Available Clinical practice has an effective methodology of diagnostic procedures to be followed in cases of sepsis. However, there are as yet no corresponding standards of action in post-mortem diagnostics. The scope of examinations is limited to an autopsy and histopathological tests. This situation may lead to errors in medico-legal opinions on the cause of death and in the assessment of appropriateness of medical procedures. In cases of suspected sepsis, medico-legal investigations require obtaining detailed information about the circumstances of death (including symptoms and results of intravital examinations) before autopsy is performed, as well as sterile collection of specimens for microbiological tests and interpretation of their results on the basis of knowledge of epidemiology, pathophysiology and clinical progression of sepsis.

  4. Assessing various Infrared (IR) microscopic imaging techniques for post-mortem interval evaluation of human skeletal remains

    Science.gov (United States)

    Roider, Clemens; Ritsch-Marte, Monika; Pemberger, Nadin; Cemper-Kiesslich, Jan; Hatzer-Grubwieser, Petra; Parson, Walther; Pallua, Johannes Dominikus

    2017-01-01

    time. Cluster-analyses of data from Raman microscopic imaging reconstructed histo-anatomical features in comparison to the light microscopic image and finally, by application of principal component analyses (PCA), it was possible to see a clear distinction between forensic and archaeological bone samples. Hence, the spectral characterization of inorganic and organic compounds by the aforementioned techniques, followed by analyses such as multivariate imaging analysis (MIAs) and principal component analyses (PCA), appear to be suitable for the post mortem interval (PMI) estimation of human skeletal remains. PMID:28334006

  5. Assessing various Infrared (IR) microscopic imaging techniques for post-mortem interval evaluation of human skeletal remains.

    Directory of Open Access Journals (Sweden)

    Claudia Woess

    decreases with time. Cluster-analyses of data from Raman microscopic imaging reconstructed histo-anatomical features in comparison to the light microscopic image and finally, by application of principal component analyses (PCA), it was possible to see a clear distinction between forensic and archaeological bone samples. Hence, the spectral characterization of inorganic and organic compounds by the aforementioned techniques, followed by analyses such as multivariate imaging analysis (MIAs) and principal component analyses (PCA), appear to be suitable for the post mortem interval (PMI) estimation of human skeletal remains.

  6. Assessing various Infrared (IR) microscopic imaging techniques for post-mortem interval evaluation of human skeletal remains.

    Science.gov (United States)

    Woess, Claudia; Unterberger, Seraphin Hubert; Roider, Clemens; Ritsch-Marte, Monika; Pemberger, Nadin; Cemper-Kiesslich, Jan; Hatzer-Grubwieser, Petra; Parson, Walther; Pallua, Johannes Dominikus

    2017-01-01

    Cluster-analyses of data from Raman microscopic imaging reconstructed histo-anatomical features in comparison to the light microscopic image and finally, by application of principal component analyses (PCA), it was possible to see a clear distinction between forensic and archaeological bone samples. Hence, the spectral characterization of inorganic and organic compounds by the aforementioned techniques, followed by analyses such as multivariate imaging analysis (MIAs) and principal component analyses (PCA), appear to be suitable for the post mortem interval (PMI) estimation of human skeletal remains.
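
    The final PCA step described in these records can be illustrated with a short Python sketch using scikit-learn on synthetic spectra: samples with a weaker "organic" band (standing in for older, archaeological bone) separate from fresher samples along the first principal component. The spectra, band position and group sizes are invented; this is not the study's data or code.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_wavenumbers = 200
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 60) / 8.0) ** 2)  # fake "organic" band

def fake_spectrum(organic_strength):
    """Synthetic IR-like spectrum: noise plus an organic band of given strength."""
    return rng.normal(0.0, 0.02, n_wavenumbers) + organic_strength * band

forensic = np.array([fake_spectrum(1.0) for _ in range(10)])        # fresher bone
archaeological = np.array([fake_spectrum(0.3) for _ in range(10)])  # degraded organics
X = np.vstack([forensic, archaeological])

scores = PCA(n_components=2).fit_transform(X)
print("PC1 mean, forensic:      ", scores[:10, 0].mean())
print("PC1 mean, archaeological:", scores[10:, 0].mean())  # the groups separate on PC1
```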

  7. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  8. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    Science.gov (United States)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
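
    A minimal Python sketch of the multi-level, modular idea described above: each module either transforms the waveform to the next analysis level or condenses it into an analysis parameter. The module names, toy pulse and crude amplitude estimator are illustrative assumptions and are unrelated to GELATIO's actual C++ modules.

```python
import numpy as np

def baseline_subtraction(trace, n_baseline=100):
    """Intermediate-level module: subtract the mean of the pre-trigger samples."""
    return trace - trace[:n_baseline].mean()

def pulse_amplitude(trace, window=200):
    """Condensing module: crude pulse height = late-time mean minus baseline mean."""
    return float(trace[-window:].mean() - trace[:window].mean())

def run_pipeline(trace, modules):
    """Run modules in order; arrays feed the next module, scalars are stored
    as condensed analysis parameters."""
    parameters = {}
    for name, module in modules:
        out = module(trace)
        if np.isscalar(out):
            parameters[name] = out
        else:
            trace = out
    return parameters

raw = np.random.default_rng(1).normal(0.0, 1.0, 1000)
raw[500:] += 40.0  # a step-like pulse, standing in for a detector signal
print(run_pipeline(raw, [("baseline", baseline_subtraction),
                         ("amplitude", pulse_amplitude)]))
```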

  9. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    International Nuclear Information System (INIS)

    Hartwig, Zachary S.

    2016-01-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  10. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Hartwig, Zachary S., E-mail: hartwig@mit.edu

    2016-04-11

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  11. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  12. Longitudinal assessment of short-term memory deterioration in a logopenic variant primary progressive aphasia with post-mortem confirmed Alzheimer's Disease pathology.

    Science.gov (United States)

    Tree, Jeremy; Kay, Janice

    2015-09-01

    In the field of dementia research, there are reports of neurodegenerative cases with a focal loss of language, termed primary progressive aphasia (PPA). Currently, this condition has been further sub-classified, with the most recent sub-type dubbed logopenic variant (PPA-LV). As yet, there remains somewhat limited evaluation of the characteristics of this condition, with no studies providing longitudinal assessment accompanied by post-mortem examination. Moreover, a key characteristic of the PPA-LV case is a deterioration of phonological short-term memory, but again little work has scrutinized the nature of this impairment over time. The current study seeks to redress these oversights and presents detailed longitudinal examination of language and memory function in a case of PPA-LV, with special focus on tests linked to components of phonological short-term memory function. Our findings are then considered with reference to a contemporary model of the neuropsychology of phonological short-term memory. Additionally, post-mortem examinations indicated Alzheimer's disease type pathology, providing further evidence that the PPA-LV presentation may reflect an atypical presentation of this condition. © 2014 The British Psychological Society.

  13. Distinction between saltwater drowning and freshwater drowning by assessment of sinus fluid on post-mortem computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke; Sato, Yuki; Sato, Yumi; Ishibashi, Tadashi [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, Sendai, Miyagi (Japan); Usui, Akihito; Daigaku, Nami; Hosokai, Yoshiyuki [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, Sendai, Miyagi (Japan); Hayashizaki, Yoshie; Funayama, Masato [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, Sendai, Miyagi (Japan)

    2016-04-15

    To evaluate the difference in sinus fluid volume and density between saltwater and freshwater drowning and diagnose saltwater drowning in distinction from freshwater drowning. Ninety-three drowning cases (22 saltwater and 71 freshwater) were retrospectively investigated; all had undergone post-mortem CT and forensic autopsy. Sinus fluid volume and density were calculated using a 3D-DICOM workstation, and differences were evaluated. Diagnostic performance of these indicators for saltwater drowning was evaluated using a cut-off value calculated by receiver operating characteristic (ROC) analysis. The median sinus fluid volume was 5.68 mL in cases of saltwater drowning (range 0.08 to 37.55) and 5.46 mL in cases of freshwater drowning (0.02 to 27.68), and the average densities were 47.28 (14.26 to 75.98) HU and 32.56 (-14.38 to 77.43) HU, respectively. While sinus volume did not differ significantly (p = 0.6000), sinus density was significantly higher in saltwater than freshwater drowning cases (p = 0.0002). ROC analysis for diagnosis of saltwater drowning determined the cut-off value as 37.77 HU, with a sensitivity of 77 %, specificity of 72 %, PPV of 46 % and NPV of 91 %. The average density of sinus fluid in cases of saltwater drowning was significantly higher than in freshwater drowning cases; there was no significant difference in the sinus fluid volume. (orig.)

  14. Distinction between saltwater drowning and freshwater drowning by assessment of sinus fluid on post-mortem computed tomography

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Sato, Yuki; Sato, Yumi; Ishibashi, Tadashi; Usui, Akihito; Daigaku, Nami; Hosokai, Yoshiyuki; Hayashizaki, Yoshie; Funayama, Masato

    2016-01-01

    To evaluate the difference in sinus fluid volume and density between saltwater and freshwater drowning and diagnose saltwater drowning in distinction from freshwater drowning. Ninety-three drowning cases (22 saltwater and 71 freshwater) were retrospectively investigated; all had undergone post-mortem CT and forensic autopsy. Sinus fluid volume and density were calculated using a 3D-DICOM workstation, and differences were evaluated. Diagnostic performance of these indicators for saltwater drowning was evaluated using a cut-off value calculated by receiver operating characteristic (ROC) analysis. The median sinus fluid volume was 5.68 mL in cases of saltwater drowning (range 0.08 to 37.55) and 5.46 mL in cases of freshwater drowning (0.02 to 27.68), and the average densities were 47.28 (14.26 to 75.98) HU and 32.56 (-14.38 to 77.43) HU, respectively. While sinus volume did not differ significantly (p = 0.6000), sinus density was significantly higher in saltwater than freshwater drowning cases (p = 0.0002). ROC analysis for diagnosis of saltwater drowning determined the cut-off value as 37.77 HU, with a sensitivity of 77 %, specificity of 72 %, PPV of 46 % and NPV of 91 %. The average density of sinus fluid in cases of saltwater drowning was significantly higher than in freshwater drowning cases; there was no significant difference in the sinus fluid volume. (orig.)
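
    The cut-off selection reported above can be illustrated with a short Python sketch: synthetic sinus-fluid densities are drawn for the two groups (loosely mimicking the reported means, not the actual data), and receiver operating characteristic analysis with Youden's index picks the threshold. This uses scikit-learn and is for illustration only.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
# Synthetic sinus-fluid densities (HU); means loosely mimic the reported values.
saltwater = rng.normal(47.0, 15.0, 22)
freshwater = rng.normal(33.0, 18.0, 71)

y_true = np.concatenate([np.ones_like(saltwater), np.zeros_like(freshwater)])
density = np.concatenate([saltwater, freshwater])

fpr, tpr, thresholds = roc_curve(y_true, density)
best = np.argmax(tpr - fpr)  # Youden's J statistic picks the operating point
print(f"cut-off = {thresholds[best]:.1f} HU, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```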

  15. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  16. SU-E-T-481: In Vivo and Post Mortem Animal Irradiation: Measured Vs. Calculated Doses

    Energy Technology Data Exchange (ETDEWEB)

    Heintz, P [Univ New Mexico Radiology Dept., Albuquerque, NM (United States); Heintz, B [Texas Oncology, PA, Southlake, TX (United States); Sandoval, D [University of New Mexico, Albuquerque, NM (United States); Weber, W; Melo, D; Guilmette, R [Lovelace Respiratory Research Institute, Albuquerque, NM (United States)

    2015-06-15

    Purpose: Computerized radiation therapy treatment planning is performed on almost all patients today. However, it is seldom used for laboratory irradiations. The first objective is to assess whether modern radiation therapy treatment planning (RTP) systems accurately predict the subject dose by comparing in vivo and decedent dose measurements to calculated doses. The other objective is to determine the importance of using an RTP system for laboratory irradiations. Methods: Five MOSFET radiation dosimeters were placed enterically in each subject (2 sedated Rhesus Macaques) to measure the absorbed dose at 5 levels (carina, lung, heart, liver and rectum) during whole body irradiation. The subjects were treated with large opposed lateral fields and extended distances to cover the entire subject using a Varian 600C linac. CT simulation was performed ante-mortem (AM) and post-mortem (PM). To compare AM and PM doses, calculation points were placed at the location of each dosimeter in the treatment plan. The measured results were compared to the results using the Varian Eclipse and Prowess Panther RTP systems. Results: The Varian and Prowess treatment planning systems agreed to within +1.5% for both subjects. However, there were significant differences between the measured and calculated doses. For both animals the calculated central axis dose was higher than prescribed by 3–5%. This was caused in part by inaccurate measurement of animal thickness at the time of irradiation. For one subject the doses ranged from 4% to 7% high, and for the other subject from 7% to 14% high, when compared to the RTP doses. Conclusions: Our results suggest that using a proper CT-based RTP system can deliver the prescribed dose to laboratory subjects more accurately. The results also show that there is significant dose variation in such subjects when inhomogeneities are not considered in the planning process.

  17. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of the EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  18. Generic Formal Framework for Compositional Analysis of Hierarchical Scheduling Systems

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; Hyun Kim, Jin; Thi Xuan Phan, Linh

    We present a compositional framework for the specification and analysis of hierarchical scheduling systems (HSS). Firstly we provide a generic formal model, which can be used to describe any type of scheduling system. The concept of Job automata is introduced in order to model job instantiation...

  19. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  20. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need not only to capture the complex dynamic behavior of the system components, but must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
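
    To make the discrete-time idea concrete, here is a small Python sketch (not the authors' formalism): component lifetimes are discretized into mission-time intervals, each component gets a per-interval failure probability from its lifetime distribution, and the failure-time distribution of a simple two-component parallel (AND) system follows by combining the two. The failure rates, number of intervals and mission time are illustrative.

```python
import numpy as np

n_intervals, mission_time = 10, 1000.0          # discretization of the mission
edges = np.linspace(0.0, mission_time, n_intervals + 1)

def interval_failure_probs(rate):
    """P(component fails in each interval) from an exponential lifetime;
    the last bin holds the probability of surviving the whole mission."""
    cdf = 1.0 - np.exp(-rate * edges)
    return np.append(np.diff(cdf), 1.0 - cdf[-1])

pA = interval_failure_probs(1e-3)  # illustrative failure rates
pB = interval_failure_probs(2e-3)

# Parallel (AND) system: it fails in the interval where the LAST component fails.
p_sys = np.zeros(n_intervals + 1)
for i, a in enumerate(pA):
    for j, b in enumerate(pB):
        p_sys[max(i, j)] += a * b

print("P(system fails within the mission) =", p_sys[:-1].sum())
```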

  1. Myocardial contrast defect associated with thrombotic coronary occlusion: Pre-autopsy diagnosis of a cardiac death with post-mortem CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Heon; Cha, Jang Gyu [Dept. of Radiology, Soonchunhyang University Hospital, Bucheon (Korea, Republic of); Park, Hye Jin; Lee, Soo Kyoung; Yang, Kyung Moo [Dept. of Forensic Medicine, National Forensic Service, Wonju (Korea, Republic of)

    2015-10-15

    We report the case of a female who died of suspected acute myocardial infarction. Post-mortem CT angiography (PMCTA) was performed with intravascular contrast infusion before the standard autopsy, and it successfully demonstrated the complete thrombotic occlusion of a coronary artery and also a corresponding perfusion defect on myocardium. We herein describe the PMCTA findings of a cardiac death with special emphasis on the potential benefits of this novel CT technique in forensic practice.

  2. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Verbruggen, Aviel; Laes, Erik

    2015-01-01

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5

  3. A comparative analysis of protected area planning and management frameworks

    Science.gov (United States)

    Per Nilsen; Grant Tayler

    1997-01-01

    A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...

  4. A Demonstrative Analysis of News Articles Using Fairclough’s Critical Discourse Analysis Framework

    Directory of Open Access Journals (Sweden)

    Roy Randy Y. Briones

    2017-07-01

    Full Text Available This paper attempts to demonstrate Norman Fairclough’s Critical Discourse Analysis (CDA) framework by conducting internal and external level analyses on two online news articles that report on the Moro Islamic Liberation Front’s (MILF) submission of its findings on the “Mamasapano Incident” that happened in the Philippines in 2015. In performing analyses using this framework, the social context and background for these texts, as well as the relationship between the internal discourse features and the external social practices and structures in which the texts were produced, are thoroughly examined. As a result, it can be noted that from the texts’ internal discourse features, the news articles portray ideological and social distinctions among social actors such as the Philippine Senate, the SAF troopers, the MILF, the MILF fighters, and the civilians. Moreover, from the viewpoint of the texts as being external social practices, the texts maintain institutional identities as news reports, but they also reveal some evaluative stance as exemplified by the adjectival phrases that the writers employed. Having both the internal and external features examined, it can be said that the way these texts were written seems to portray power relations that exist between the Philippine government and the MILF. Key words: Critical Discourse Analysis, discourse analysis, news articles, social practices, social structures, power relations

  5. Causes of Stillbirth and Time of Death in Swedish Holstein Calves Examined Post Mortem

    Directory of Open Access Journals (Sweden)

    Elvander M

    2003-09-01

    Full Text Available This study was initiated due to the observation of increasing and rather high levels of stillbirths, especially in first-calving Swedish Holstein cows (10.3%, 2002). Seventy-six Swedish Holstein calves born to heifers at 41 different farms were examined post mortem in order to investigate possible reasons for stillbirth and at what time in relation to full-term gestation death had occurred. A stillborn calf was defined as one dead at birth or within 24 h after birth, after at least 260 days of gestation. Eight calves were considered to have died already in utero. Slightly less than half of the examined calves (46.1%) were classified as having died due to a difficult calving. Four calves (5.3%) had different kinds of malformations (heart defects, enlarged thymus, urine bladder defect). Approximately one third of the calves (31.6%) were clinically normal at full term with no signs of malformation and born with no indication of difficulties at parturition or any other reason that could explain the stillbirth. The numbers of male and female calves were rather equally distributed within the groups. A wide variation in post-mortem weights was seen in all groups, although a number of the calves in the group of clinically normal calves with unexplained cause of death were rather small and, compared with e.g. those calves categorised as having died due to a difficult calving, their average birth weight was 6 kg lower (39.9 ± 1.7 kg vs. 45.9 ± 1.5 kg, p ≤ 0.01). It was concluded that the cause of stillbirth with a non-infectious aetiology is likely to be multifactorial and that difficult calving may explain only about half of the stillbirths. As much as one third of the calves seemed clinically normal with no obvious reason for death. This is a target group of calves that warrants a more thorough investigation in further studies.

  6. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continues to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development for such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  7. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  8. Framework for SEM contour analysis

    Science.gov (United States)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.
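
    The use of a topological skeleton as a measurement tool can be sketched in a few lines of Python, assuming scikit-image and SciPy are available: skeletonize a binary pattern and read the local critical dimension as twice the distance from the skeleton to the pattern edge. The synthetic line pattern below stands in for a segmented SEM image; this is not the authors' framework.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize

# Synthetic binary pattern: a 20-pixel-wide line standing in for a segmented SEM feature.
pattern = np.zeros((200, 200), dtype=bool)
pattern[20:180, 90:110] = True

skeleton = skeletonize(pattern)             # 1-pixel-wide topological skeleton
distance = distance_transform_edt(pattern)  # distance from each pixel to the pattern edge

# Local CD along the skeleton is roughly twice the skeleton-to-edge distance.
local_cd = 2.0 * distance[skeleton]
print(f"mean CD = {local_cd.mean():.1f} px "
      f"(min {local_cd.min():.1f}, max {local_cd.max():.1f})")
```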

  9. Ablation of fast-spiking interneurons in the dorsal striatum, recapitulating abnormalities seen post-mortem in Tourette syndrome, produces anxiety and elevated grooming.

    Science.gov (United States)

    Xu, M; Li, L; Pittenger, C

    2016-06-02

    Tic disorders, including Tourette syndrome (TS), are thought to involve pathology of cortico-basal ganglia loops, but their pathology is not well understood. Post-mortem studies have shown a reduced number of several populations of striatal interneurons, including the parvalbumin-expressing fast-spiking interneurons (FSIs), in individuals with severe, refractory TS. We tested the causal role of this interneuronal deficit by recapitulating it in an otherwise normal adult mouse using a combination transgenic-viral cell ablation approach. FSIs were reduced bilaterally by ∼40%, paralleling the deficit found post-mortem. This did not produce spontaneous stereotypies or tic-like movements, but there was increased stereotypic grooming after acute stress in two validated paradigms. Stereotypy after amphetamine, in contrast, was not elevated. FSI ablation also led to increased anxiety-like behavior in the elevated plus maze, but not to alterations in motor learning on the rotarod or to alterations in prepulse inhibition, a measure of sensorimotor gating. These findings indicate that a striatal FSI deficit can produce stress-triggered repetitive movements and anxiety. These repetitive movements may recapitulate aspects of the pathophysiology of tic disorders. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.

  10. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example, the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment.

  11. Teaching and Learning Numerical Analysis and Optimization: A Didactic Framework and Applications of Inquiry-Based Learning

    Science.gov (United States)

    Lappas, Pantelis Z.; Kritikos, Manolis N.

    2018-01-01

    The main objective of this paper is to propose a didactic framework for teaching Applied Mathematics in higher education. After describing the structure of the framework, several applications of inquiry-based learning in teaching numerical analysis and optimization are provided to illustrate the potential of the proposed framework. The framework…

  12. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  13. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  14. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
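
    A toy Python sketch of the uncertainty-propagation and parallel-execution pattern the toolkit automates: input parameters are sampled from assumed distributions, a stand-in "simulation code" is evaluated for each sample across worker processes, and the output distribution is summarised. The parameter names, distributions and model are invented for illustration; this is not PAPIRUS code.

```python
import numpy as np
from multiprocessing import Pool

def simulation_code(params):
    """Stand-in for an engineering simulation returning a scalar response."""
    conductivity, heat_source = params
    return heat_source / conductivity   # e.g. a steady-state temperature rise

def sample_parameters(n, rng):
    """Draw input samples from assumed (illustrative) uncertainty distributions."""
    conductivity = rng.normal(10.0, 1.0, n)
    heat_source = rng.normal(500.0, 25.0, n)
    return list(zip(conductivity, heat_source))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    samples = sample_parameters(1000, rng)
    with Pool() as pool:                       # evaluate the samples in parallel
        responses = np.array(pool.map(simulation_code, samples))
    low, high = np.percentile(responses, [2.5, 97.5])
    print(f"mean = {responses.mean():.2f}, 95% interval = [{low:.2f}, {high:.2f}]")
```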

  15. Benchmarking of Modern Data Analysis Tools for a 2nd generation Transient Data Analysis Framework

    CERN Document Server

    Goncalves, Nuno

    2016-01-01

    During the past year of operating the Large Hadron Collider (LHC), the amount of transient accelerator data to be persisted and analysed has been steadily growing. Since the startup of the LHC in 2006, the weekly data storage requirements have exceeded what the system was initially designed to accommodate in a full year of operation. Moreover, it is predicted that the data acquisition rates will continue to increase in the future, due to foreseen improvements in the infrastructure within the scope of the High Luminosity LHC project. Despite the efforts to improve and optimize the current data storage infrastructures (CERN Accelerator Logging Service and Post Mortem database), some limitations still persist and require a different approach in order to scale up and provide efficient services for future machine upgrades. This project aims to explore one of the possibilities among novel solutions proposed to solve the problem of working with large datasets. The configuration is composed of Spark for ...

  16. Digital Trade Infrastructures: A Framework for Analysis

    Directory of Open Access Journals (Sweden)

    Boriana Boriana

    2018-04-01

    Full Text Available In global supply chains, information about transactions resides in fragmented pockets within business and government systems. The lack of reliable, accurate and complete information makes it hard to detect risks (such as safety, security, compliance and commercial risks) and at the same time makes international trade inefficient. The introduction of digital infrastructures that transcend organizational and system domains is driven by the prospect of reducing the fragmentation of information, thereby enabling improved security and efficiency in the trading process. This article develops a digital trade infrastructure framework through an empirically grounded analysis of four digital infrastructures in the trade domain, using the conceptual lens of digital infrastructure.

  17. RECTAL BIOPSY IN SHEEP AND GOATS FOR MONITORING AND ANTE-MORTEM DIAGNOSIS OF SCRAPIE: NUMBER OF LYMPHOID FOLLICLES IN TWO CONSECUTIVE COLLECTIONS

    Directory of Open Access Journals (Sweden)

    Helen Caroline Raksa

    2016-07-01

    The accumulation of PrPSc in lymphoid tissues led to the development of biopsy procedures for the ante-mortem diagnosis of scrapie in sheep, using accessible tissues such as the tonsil(5) and the third eyelid(6), together with the immunohistochemistry (IHC) technique. In addition, the large area of lymphoid follicles present in the rectum of sheep(7) has made rectal biopsy another option for the ante-mortem diagnosis of scrapie. Samples of rectal mucosa have been collected and analysed by IHC to assess the presence of PrPSc in the recto-anal mucosa-associated lymphoid tissue (RAMALT)(8,9). In Brazil, the first report of scrapie was in 1978, in a Hampshire Down sheep imported from England(10). According to the OIE, 41 animals were culled in the country between 2008 and 2014 during scrapie outbreaks(11). Since 2008, the diagnosis of scrapie has been performed by IHC on samples of the CNS and lymphoid tissues(12). However, in the case of lymphoid tissue associated with the rectal mucosa, repeat collections at short intervals may be necessary owing to the scarcity of tissue available for diagnosis, which, according to Leal et al.(13), should comprise at least three lymphoid follicles (LF) per sample. With a view to establishing good techniques for the monitoring and ante-mortem diagnosis of scrapie, the present study aimed to evaluate the amount of lymphoid tissue associated with the rectal mucosa obtained by the rectal biopsy technique for immunohistochemical evaluation, as well as the feasibility of performing two consecutive biopsy procedures, at different time intervals, in sheep and goats.

  18. The Coronal Analysis of SHocks and Waves (CASHeW) framework

    Science.gov (United States)

    Kozarev, Kamen A.; Davey, Alisdair; Kendrick, Alexander; Hammer, Michael; Keith, Celeste

    2017-11-01

    Coronal bright fronts (CBF) are large-scale wavelike disturbances in the solar corona, related to solar eruptions. They are observed (mostly in extreme ultraviolet (EUV) light) as transient bright fronts of finite width, propagating away from the eruption source location. Recent studies of individual solar eruptive events have used EUV observations of CBFs and metric radio type II burst observations to show the intimate connection between waves in the low corona and coronal mass ejection (CME)-driven shocks. EUV imaging with the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory has proven particularly useful for detecting large-scale short-lived CBFs, which, combined with radio and in situ observations, holds great promise for early CME-driven shock characterization capability. This characterization can further be automated, and related to models of particle acceleration to produce estimates of particle fluxes in the corona and in the near-Earth environment early in events. We present a framework for the Coronal Analysis of SHocks and Waves (CASHeW). It combines analysis of NASA Heliophysics System Observatory data products and relevant data-driven models into an automated system for the characterization of off-limb coronal waves and shocks and the evaluation of their capability to accelerate solar energetic particles (SEPs). The system utilizes EUV observations and models written in the Interactive Data Language (IDL). In addition, it leverages analysis tools from the SolarSoft package of libraries, as well as third-party libraries. We have tested the CASHeW framework on a representative list of coronal bright front events. Here we present its features, as well as initial results. With this framework, we hope to contribute to the overall understanding of coronal shock waves, their importance for energetic particle acceleration, as well as to a better ability to forecast SEP event fluxes.

  19. Gamma-hydroxybutyric acid endogenous production and post-mortem behaviour - the importance of different biological matrices, cut-off reference values, sample collection and storage conditions.

    Science.gov (United States)

    Castro, André L; Dias, Mário; Reis, Flávio; Teixeira, Helena M

    2014-10-01

    Gamma-Hydroxybutyric Acid (GHB) is an endogenous compound with a history of clinical use since the 1960s. However, due to its secondary effects, it has become a controlled substance, entering the illicit market for recreational and "dance club scene" use, muscle enhancement purposes and drug-facilitated sexual assaults. Its endogenous nature can create difficulties when interpreting, in a forensic context, the analytical values obtained in biological samples. This manuscript reviewed several crucial aspects related to the forensic toxicological evaluation of GHB, such as its post-mortem behaviour in biological samples; endogenous production values, both in in vivo and in post-mortem samples; sampling and storage conditions (including stability tests); and the evaluation of cut-off reference values for different biological samples, such as whole blood, plasma, serum, urine, saliva, bile, vitreous humour and hair. This review highlights the need for specific sampling care, storage conditions, and cut-off reference value interpretation in different biological samples, essential for proper practical application in forensic toxicology. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  20. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  1. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Full Text Available Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
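
    The "one HMM per call type, pick the model with the highest likelihood" scheme can be sketched with a self-contained Python forward algorithm over Gaussian-emission HMMs. The two call-type models, their parameters and the synthetic feature sequence below are invented for illustration and do not correspond to the species or features used in the paper.

```python
import numpy as np

def log_gauss(x, means, variances):
    """Log density of diagonal Gaussians (one per state), summed over features."""
    return -0.5 * np.sum(np.log(2 * np.pi * variances) + (x - means) ** 2 / variances, axis=-1)

def log_likelihood(obs, start, trans, means, variances):
    """Forward algorithm in log space for a Gaussian-emission HMM."""
    log_alpha = np.log(start) + log_gauss(obs[0], means, variances)
    for x in obs[1:]:
        log_alpha = log_gauss(x, means, variances) + \
            np.logaddexp.reduce(log_alpha[:, None] + np.log(trans), axis=0)
    return np.logaddexp.reduce(log_alpha)

# Two made-up 2-state call-type models over a 1-D feature; all values illustrative.
models = {
    "call_type_A": dict(start=np.array([0.9, 0.1]),
                        trans=np.array([[0.95, 0.05], [0.10, 0.90]]),
                        means=np.array([[0.0], [1.0]]),
                        variances=np.array([[0.5], [0.5]])),
    "call_type_B": dict(start=np.array([0.5, 0.5]),
                        trans=np.array([[0.70, 0.30], [0.30, 0.70]]),
                        means=np.array([[3.0], [5.0]]),
                        variances=np.array([[0.5], [0.5]])),
}

obs = np.array([[0.1], [0.2], [0.9], [1.1], [0.8]])  # one synthetic vocalization
scores = {name: log_likelihood(obs, **m) for name, m in models.items()}
print(max(scores, key=scores.get), scores)  # classification = highest-likelihood model
```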

  2. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  3. RIPOSTE: a framework for improving the design and analysis of laboratory-based research

    Science.gov (United States)

    Masca, Nicholas GD; Hensor, Elizabeth MA; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam KA; Teare, M Dawn

    2015-01-01

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results. DOI: http://dx.doi.org/10.7554/eLife.05519.001 PMID:25951517

  4. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework can produce output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
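
    The modulation factor mentioned above is the instrument response that links the measured azimuthal modulation to the true polarization. The sketch below (plain NumPy, not XIMPOL's API) samples photo-electron azimuthal angles for an assumed modulation factor and polarization, then recovers the polarization degree and angle from the modulation curve; all parameter values are illustrative.

```python
# Illustrative sketch (not XIMPOL code): sample azimuthal angles for a polarized
# source given a detector modulation factor, then recover P and phi0.
import numpy as np

rng = np.random.default_rng(1)
mu, pol_deg, pol_ang = 0.3, 0.5, np.deg2rad(30.0)   # assumed modulation factor, P, phi0

def sample_phi(n):
    """Rejection-sample phi from N(phi) proportional to 1 + mu*P*cos(2*(phi - phi0))."""
    out = []
    while len(out) < n:
        phi = rng.uniform(0, 2 * np.pi, n)
        target = 1 + mu * pol_deg * np.cos(2 * (phi - pol_ang))
        accept = rng.uniform(0, 1 + mu * pol_deg, n) < target
        out.extend(phi[accept])
    return np.array(out[:n])

phi = sample_phi(200_000)
# Stokes-like estimators of the modulation amplitude and phase
q = 2 * np.mean(np.cos(2 * phi))
u = 2 * np.mean(np.sin(2 * phi))
modulation = np.hypot(q, u)                          # equals mu * P in expectation
print("P_hat =", modulation / mu,
      "phi0_hat (deg) =", np.rad2deg(0.5 * np.arctan2(u, q)))
```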

  5. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
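
    A rough sketch of the outlier-based segmentation idea is given below: each subsequence of a feature time series is summarised by a Gaussian model, an affinity matrix is built from a symmetric divergence between these models, and windows that couple weakly to the dominant eigen-component are ranked as outliers. The window sizes, divergence, and scoring rule are simplifying assumptions, not the paper's exact formulation.

```python
# Sketch of inlier/outlier temporal segmentation via eigen-analysis of an affinity
# matrix built from per-window Gaussian models (illustrative only).
import numpy as np

def window_models(series, win=50, hop=25):
    """Fit (mean, variance) for each overlapping window of a 1-D feature series."""
    starts = range(0, len(series) - win + 1, hop)
    return np.array([(series[s:s + win].mean(), series[s:s + win].var() + 1e-6)
                     for s in starts])

def affinity_matrix(models, scale=1.0):
    m, v = models[:, 0], models[:, 1]
    # symmetric KL divergence between 1-D Gaussians
    kl = 0.5 * ((v[:, None] / v[None, :]) + (v[None, :] / v[:, None])
                + (m[:, None] - m[None, :]) ** 2 * (1 / v[:, None] + 1 / v[None, :]) - 2)
    return np.exp(-kl / scale)

rng = np.random.default_rng(2)
audio_feature = rng.normal(0, 1, 2000)
audio_feature[900:1000] += rng.normal(5, 2, 100)        # sparse "interesting" event

A = affinity_matrix(window_models(audio_feature))
eigvals, eigvecs = np.linalg.eigh(A)
background = np.abs(eigvecs[:, -1])                     # dominant eigenvector ~ background
outlier_score = 1 - background / background.max()       # weak membership => outlier
print("most outlying windows:", np.argsort(outlier_score)[-3:])
```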

  6. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs)

    LENUS (Irish Health Repository)

    Wallace, Emma

    2011-10-14

    Abstract Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or need for a specified treatment. Most focus on the derivation stage with only a minority progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision making process improves patient care. However there is a lack of clear methodology for the design of high quality impact analysis studies. We have developed a sequential four-phased framework based on the literature and the collective experience of our international working group to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.

  7. A Novel Framework for Interactive Visualization and Analysis of Hyperspectral Image Data

    Directory of Open Access Journals (Sweden)

    Johannes Jordan

    2016-01-01

    Full Text Available Multispectral and hyperspectral images are well established in various fields of application like remote sensing, astronomy, and microscopic spectroscopy. In recent years, the availability of new sensor designs, more powerful processors, and high-capacity storage further opened this imaging modality to a wider array of applications like medical diagnosis, agriculture, and cultural heritage. This necessitates new tools that allow general analysis of the image data and are intuitive to users who are new to hyperspectral imaging. We introduce a novel framework that bundles new interactive visualization techniques with powerful algorithms and is accessible through an efficient and intuitive graphical user interface. We visualize the spectral distribution of an image via parallel coordinates with a strong link to traditional visualization techniques, enabling new paradigms in hyperspectral image analysis that focus on interactive raw data exploration. We combine novel methods for supervised segmentation, global clustering, and nonlinear false-color coding to assist in the visual inspection. Our framework coined Gerbil is open source and highly modular, building on established methods and being easily extensible for application-specific needs. It satisfies the need for a general, consistent software framework that tightly integrates analysis algorithms with an intuitive, modern interface to the raw image data and algorithmic results. Gerbil finds its worldwide use in academia and industry alike with several thousand downloads originating from 45 countries.

  8. Analyzing and modeling interdisciplinary product development a framework for the analysis of knowledge characteristics and design support

    CERN Document Server

    Neumann, Frank

    2015-01-01

    Frank Neumann focuses on establishing a theoretical basis that allows a description of the interplay between individual and collective processes in product development. For this purpose, he introduces the integrated descriptive model of knowledge creation as the first constituent of his research framework. As a second part of the research framework, an analysis and modeling method is proposed that captures the various knowledge conversion activities described by the integrated descriptive model of knowledge creation. Subsequently, this research framework is applied to the analysis of knowledge characteristics of mechatronic product development (MPD). Finally, the results gained from the previous steps are used within a design support system that aims at federating the information and knowledge resources contained in the models published in the various development activities of MPD. Contents Descriptive Model of Knowledge Creation in Interdisciplinary Product Development Research Framework for the Analysis of ...

  9. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  10. Post-mortem whole body computed tomography of opioid (heroin and methadone) fatalities: frequent findings and comparison to autopsy

    Energy Technology Data Exchange (ETDEWEB)

    Winklhofer, Sebastian; Stolzmann, Paul [University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland); Surer, Eddie; Ampanozi, Garyfalia; Thali, Michael; Schweitzer, Wolf [University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); Ruder, Thomas [University of Zurich, Department of Forensic Medicine and Radiology, Institute of Forensic Medicine, Zurich (Switzerland); University Hospital Bern, Institute of Diagnostic, Interventional and Pediatric Radiology, Bern (Switzerland); Elliott, Marina [Simon Fraser University, Department of Archaeology, Burnaby, BC (Canada); Oestreich, Andrea; Kraemer, Thomas [University of Zurich, Department of Forensic Pharmacology and Toxicology, Institute of Forensic Medicine, Zurich (Switzerland); Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology, Zurich (Switzerland)

    2014-06-15

    To investigate frequent findings in cases of fatal opioid intoxication in whole-body post-mortem computed tomography (PMCT). PMCT of 55 cases in which heroin and/or methadone had been found responsible for death were retrospectively evaluated (study group), and were compared with PMCT images of an age- and sex-matched control group. Imaging results were compared with conventional autopsy. The most common findings in the study group were: pulmonary oedema (95 %), aspiration (66 %), distended urinary bladder (42 %), cerebral oedema (49 %), pulmonary emphysema (38 %) and fatty liver disease (36 %). These PMCT findings occurred significantly more often in the study group than in the control group (p < 0.05). The combination of lung oedema, brain oedema and distended urinary bladder was seen in 26 % of the cases in the study group but never in the control group (0 %). This triad, as indicator of opioid-related deaths, had a specificity of 100 %, as confirmed by autopsy and toxicological analysis. Frequent findings in cases of fatal opioid intoxication were demonstrated. The triad of brain oedema, lung oedema and a distended urinary bladder on PMCT was highly specific for drug-associated cases of death. (orig.)

  11. Post-mortem whole body computed tomography of opioid (heroin and methadone) fatalities: frequent findings and comparison to autopsy

    International Nuclear Information System (INIS)

    Winklhofer, Sebastian; Stolzmann, Paul; Surer, Eddie; Ampanozi, Garyfalia; Thali, Michael; Schweitzer, Wolf; Ruder, Thomas; Elliott, Marina; Oestreich, Andrea; Kraemer, Thomas; Alkadhi, Hatem

    2014-01-01

    To investigate frequent findings in cases of fatal opioid intoxication in whole-body post-mortem computed tomography (PMCT). PMCT of 55 cases in which heroin and/or methadone had been found responsible for death were retrospectively evaluated (study group), and were compared with PMCT images of an age- and sex-matched control group. Imaging results were compared with conventional autopsy. The most common findings in the study group were: pulmonary oedema (95 %), aspiration (66 %), distended urinary bladder (42 %), cerebral oedema (49 %), pulmonary emphysema (38 %) and fatty liver disease (36 %). These PMCT findings occurred significantly more often in the study group than in the control group (p < 0.05). The combination of lung oedema, brain oedema and distended urinary bladder was seen in 26 % of the cases in the study group but never in the control group (0 %). This triad, as indicator of opioid-related deaths, had a specificity of 100 %, as confirmed by autopsy and toxicological analysis. Frequent findings in cases of fatal opioid intoxication were demonstrated. The triad of brain oedema, lung oedema and a distended urinary bladder on PMCT was highly specific for drug-associated cases of death. (orig.)

  12. PetIGA: A framework for high-performance isogeometric analysis

    KAUST Repository

    Dalcin, Lisandro; Collier, N.; Vignal, Philippe; Cortes, Adriano Mauricio; Calo, Victor M.

    2016-01-01

    We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. We show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.
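
    To make the phrase "assemble matrices and vectors which come from a Galerkin weak form, discretized with B-spline basis functions" concrete, the sketch below assembles and solves a one-dimensional Poisson problem with cubic B-splines in plain SciPy. It is a pedagogical stand-in under assumed boundary conditions and does not use PetIGA's or PETSc's API.

```python
# Tiny 1-D illustration (not PetIGA): Galerkin assembly for -u'' = f with cubic B-splines.
import numpy as np
from scipy.interpolate import BSpline

k = 3                                                   # cubic B-splines
breaks = np.linspace(0.0, 1.0, 9)                       # 8 knot spans ("elements")
knots = np.concatenate(([0.0] * k, breaks, [1.0] * k))  # open (clamped) knot vector
n_basis = len(knots) - k - 1

def basis(i):
    c = np.zeros(n_basis); c[i] = 1.0
    return BSpline(knots, c, k)

N = [basis(i) for i in range(n_basis)]
dN = [b.derivative() for b in N]
f = lambda x: np.pi ** 2 * np.sin(np.pi * x)            # manufactured source, exact u = sin(pi x)
xg, wg = np.polynomial.legendre.leggauss(6)             # Gauss rule on [-1, 1]

K = np.zeros((n_basis, n_basis)); F = np.zeros(n_basis)
for a, b in zip(breaks[:-1], breaks[1:]):               # assemble element by element
    x = 0.5 * (b - a) * xg + 0.5 * (a + b)
    w = 0.5 * (b - a) * wg
    for i in range(n_basis):
        F[i] += np.sum(w * N[i](x) * f(x))
        for j in range(n_basis):
            K[i, j] += np.sum(w * dN[i](x) * dN[j](x))

u = np.zeros(n_basis)                                   # homogeneous Dirichlet BCs:
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])       # drop first/last basis functions
uh = BSpline(knots, u, k)
print("max nodal error vs sin(pi x):", np.abs(uh(breaks) - np.sin(np.pi * breaks)).max())
```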

  13. PetIGA: A framework for high-performance isogeometric analysis

    KAUST Repository

    Dalcin, L.

    2016-05-25

    We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. We show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.

  14. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool in order to determine the main geometric characteristics of the propeller but also to provide the designer with the capability to optimize the shape of the blade sections based on their specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for the three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized in order to validate the proposed framework for propeller operation in open-water conditions but also in a ship's wake.
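
    The correction-factor prediction step described above is, in essence, a regression surrogate fitted to tabulated data. The sketch below fits a Support Vector Regression surrogate with scikit-learn on synthetic data; the parameter names, ranges, and response are invented for illustration and are not taken from the thesis.

```python
# Illustrative only: SVR surrogate for lifting-surface correction factors as a
# function of propeller parameters (synthetic data, not the thesis' dataset).
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Placeholder design parameters: expanded area ratio, pitch ratio, radial station r/R
X = rng.uniform([0.4, 0.6, 0.3], [1.1, 1.4, 0.9], size=(300, 3))
# Synthetic "camber correction factor" with a smooth nonlinear dependence plus noise
y = 1.0 + 0.3 * X[:, 0] * X[:, 1] - 0.2 * np.sin(np.pi * X[:, 2]) + rng.normal(0, 0.01, 300)

surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.005))
surrogate.fit(X[:250], y[:250])
print("hold-out R^2:", surrogate.score(X[250:], y[250:]))
```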

  15. Muon g-2 Reconstruction and Analysis Framework for the Muon Anomalous Precession Frequency

    Energy Technology Data Exchange (ETDEWEB)

    Khaw, Kim Siang [Washington U., Seattle

    2017-10-21

    The Muon g-2 experiment at Fermilab, with the aim to measure the muon anomalous magnetic moment to an unprecedented level of 140 ppb, has started beam and detector commissioning in Summer 2017. To deal with incoming data projected to be around tens of petabytes, a robust data reconstruction and analysis chain based on Fermilab's art event-processing framework is developed. Herein, I report the current status of the framework, together with its novel features such as multi-threaded algorithms for online data quality monitor (DQM) and fast-turnaround operation (nearline). Performance of the framework during the commissioning run is also discussed.

  16. Pregnant woman and road safety: experimental crash test with post mortem human subject.

    Science.gov (United States)

    Delotte, Jerome; Behr, Michel; Thollon, Lionel; Arnoux, Pierre-Jean; Baque, Patrick; Bongain, Andre; Brunet, Christian

    2008-05-01

    Trauma affects between 3 and 7% of all pregnancies in industrialized countries, and the leading cause of these traumas is car crashes. The difficulty of accounting for the physiologic and anatomic changes that occur during pregnancy explains why the majority of studies have not been based on anatomical data. We present a protocol to create a realistic anatomical model of a pregnant woman using a post mortem human subject (PMHS). We inserted a physical model of the gravid uterus into the pelvis of a PMHS. 3D acceleration sensors were placed on the subject to measure the acceleration of different body segments. We simulated three frontal impact situations at 20 km/h between two average European cars. Two main kinematic events were identified as possible causes of injuries: lap belt loading and backrest impact. Cadaver experiments provide an interesting complementary approach to studying injury mechanisms related to road accidents involving pregnant women. This anatomical accuracy makes it possible to progress in the field of safety devices.

  17. A Framework for Security Analysis of Mobile Wireless Networks

    DEFF Research Database (Denmark)

    Nanz, Sebastian; Hankin, Chris

    2006-01-01

    We present a framework for specification and security analysis of communication protocols for mobile wireless networks. This setting introduces new challenges which are not being addressed by classical protocol analysis techniques. The main complication stems from the fact that the actions of intermediate nodes and their connectivity can no longer be abstracted into a single unstructured adversarial environment, as they form an inherent part of the system's security. In order to model this scenario faithfully, we present a broadcast calculus which makes a clear distinction between the protocol processes and the network's connectivity graph, which may change independently from protocol actions. We identify a property characterising an important aspect of security in this setting and express it using behavioural equivalences of the calculus. We complement this approach with a control flow analysis ...

  18. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method for mining hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
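
    A much simplified sketch of the first part of the framework is shown below: a weekly dissolved-oxygen series is granulated into per-window summaries (reduced here to mean and standard deviation rather than full two-dimensional normal clouds), a similarity matrix is computed, and the least similar windows are flagged as anomalies. Data and thresholds are synthetic assumptions.

```python
# Simplified sketch: granulate a water-quality time series, build a similarity
# matrix, and flag the least similar windows as anomalies (illustrative only).
import numpy as np

def granulate(series, win=12):
    """Split the series into non-overlapping windows and summarise each one."""
    n = len(series) // win
    chunks = series[:n * win].reshape(n, win)
    return np.column_stack([chunks.mean(axis=1), chunks.std(axis=1) + 1e-9])

def similarity_matrix(granules):
    d = np.linalg.norm(granules[:, None, :] - granules[None, :, :], axis=-1)
    return np.exp(-d / d.mean())

rng = np.random.default_rng(4)
weekly_do = 8 + np.sin(np.arange(520) * 2 * np.pi / 52) + rng.normal(0, 0.3, 520)
weekly_do[300:312] -= 4.0                     # injected pollution-like DO drop

S = similarity_matrix(granulate(weekly_do))
anomaly_score = 1 - S.mean(axis=1)            # low average similarity => anomalous window
print("most anomalous windows:", np.argsort(anomaly_score)[-2:])
```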

  19. A framework of analysis for field experiments with alternative materials in road construction.

    Science.gov (United States)

    François, D; Jullien, A

    2009-01-01

    In France, a wide variety of alternative materials is produced or exists in the form of stockpiles built up over time. Such materials are distributed over various regions of the territory depending on local industrial development and urbanisation trends. The use of alternative materials at a national scale implies sharing local knowledge and experience. Building a national database on alternative materials for road construction is useful in gathering and sharing information. An analysis of feedback from onsite experiences (back analysis) is essential to improve knowledge on alternative material use in road construction. Back analysis of field studies has to be conducted in accordance with a single common framework. This could enable drawing comparisons between alternative materials and between road applications. A framework for the identification and classification of data used in back analyses is proposed. Since the road structure is an open system, this framework has been based on a stress-response approach at both the material and structural levels and includes a description of external factors applying during the road service life. The proposal has been shaped from a review of the essential characteristics of road materials and structures, as well as from the state of knowledge specific to alternative material characterisation.

  20. An ovine in vivo framework for tracheobronchial stent analysis.

    Science.gov (United States)

    McGrath, Donnacha J; Thiebes, Anja Lena; Cornelissen, Christian G; O'Shea, Mary B; O'Brien, Barry; Jockenhoevel, Stefan; Bruzzi, Mark; McHugh, Peter E

    2017-10-01

    Tracheobronchial stents are most commonly used to restore patency to airways stenosed by tumour growth. Currently all tracheobronchial stents are associated with complications such as stent migration, granulation tissue formation, mucous plugging and stent strut fracture. The present work develops a computational framework to evaluate tracheobronchial stent designs in vivo. Pressurised computed tomography is used to create a biomechanical lung model which takes into account the in vivo stress state, global lung deformation and local loading from pressure variation. Stent interaction with the airway is then evaluated for a number of loading conditions including normal breathing, coughing and ventilation. Results of the analysis indicate that three of the major complications associated with tracheobronchial stents can potentially be analysed with this framework, which can be readily applied to the human case. Airway deformation caused by lung motion is shown to have a significant effect on stent mechanical performance, including implications for stent migration, granulation formation and stent fracture.

  1. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    Science.gov (United States)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  2. The Policy Formation Process: A Conceptual Framework for Analysis. Ph.D. Thesis

    Science.gov (United States)

    Fuchs, E. F.

    1972-01-01

    A conceptual framework for analysis which is intended to assist both the policy analyst and the policy researcher in their empirical investigations into policy phenomena is developed. It is meant to facilitate understanding of the policy formation process by focusing attention on the basic forces shaping the main features of policy formation as a dynamic social-political-organizational process. The primary contribution of the framework lies in its capability to suggest useful ways of looking at policy formation reality. It provides the analyst and the researcher with a group of indicators which suggest where to look and what to look for when attempting to analyze and understand the mix of forces which energize, maintain, and direct the operation of strategic level policy systems. The framework also highlights interconnections, linkage, and relational patterns between and among important variables. The framework offers an integrated set of conceptual tools which facilitate understanding of and research on the complex and dynamic set of variables which interact in any major strategic level policy formation process.

  3. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  4. Three-dimensional finite element analysis of zirconia all-ceramic cantilevered fixed partial dentures with different framework designs.

    Science.gov (United States)

    Miura, Shoko; Kasahara, Shin; Yamauchi, Shinobu; Egusa, Hiroshi

    2017-06-01

    The purposes of this study were: to perform stress analyses using three-dimensional finite element analysis methods; to analyze the mechanical stress of different framework designs; and to investigate framework designs that will provide for the long-term stability of both cantilevered fixed partial dentures (FPDs) and abutment teeth. An analysis model was prepared for three units of cantilevered FPDs that assume a missing mandibular first molar. Four types of framework design (Design 1, basic type; Design 2, framework width expanded buccolingually by 2 mm; Design 3, framework height expanded by 0.5 mm to the occlusal surface side from the end abutment to the connector area; and Design 4, a combination of Designs 2 and 3) were created. Two types of framework material (yttrium-oxide partially stabilized zirconia and a high precious noble metal gold alloy) and two types of abutment material (dentin and brass) were used. In the framework designs, Design 1 exhibited the highest maximum principal stress value for both zirconia and gold alloy. In the abutment tooth, Design 3 exhibited the highest maximum principal stress value for all abutment teeth. In the present study, Design 4 (the design with expanded framework height and framework width) could contribute to preventing the concentration of stress and protecting abutment teeth. © 2017 Eur J Oral Sci.

  5. Environmental risk analysis for nanomaterials: Review and evaluation of frameworks

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    2012-01-01

    ... to occupational settings with minor environmental considerations, and most have not been thoroughly tested on a wide range of NM. Care should also be taken when selecting the most appropriate risk analysis strategy for a given risk context. Given this, we recommend a multi-faceted approach to assess the environmental risks of NM as well as increased applications and testing of the proposed frameworks for different NM.

  6. Integrating Poverty and Environmental Concerns into Value-Chain Analysis: A Strategic Framework and Practical Guide

    DEFF Research Database (Denmark)

    Riisgaard, Lone; Bolwig, Simon; Ponte, Stefano

    2010-01-01

    This article aims to guide the design and implementation of action-research projects in value-chain analysis by presenting a strategic framework focused on small producers and trading and processing firms in developing countries. Its stepwise approach – building on the conceptual framework set out ... purpose of increasing the rewards and/or reducing the risks.

  7. DNA quality and quantity from up to 16 years old post-mortem blood stored on FTA cards.

    Science.gov (United States)

    Rahikainen, Anna-Liina; Palo, Jukka U; de Leeuw, Wiljo; Budowle, Bruce; Sajantila, Antti

    2016-04-01

    Blood samples preserved on FTA cards offer unique opportunities for genetic research. DNA recovered from these cards should be stable for long periods of time. However, it is not well established how well DNA stored on FTA cards for substantial time periods meets the demands of forensic or genomic DNA analyses, especially for post-mortem (PM) samples, in which the quality can vary at initial collection. The aim of this study was to evaluate the effect of time-dependent degradation on the quality and quantity of DNA extracted from post-mortem bloodstained FTA cards up to 16 years old. Four random FTA samples from eight time points spanning 1998 to 2013 (n=32) were collected and extracted in triplicate. The quantity and quality of the extracted DNA samples were determined with the Quantifiler(®) Human Plus (HP) Quantification kit. Internal sample and sample-to-sample variation were evaluated by comparing recovered DNA yields. The DNA from the triplicate samplings was subsequently combined and normalized for further analysis. The practical effect of degradation on DNA quality was evaluated from normalized samples with both forensic and pharmacogenetic target markers. Our results suggest that (1) a PM change, e.g. blood clotting prior to sampling, affects the recovered DNA yield, creating both internal and sample-to-sample variation; (2) a negative correlation between FTA card storage time and DNA quantity (r=-0.836 at the 0.01 level) was observed; (3) a positive correlation (r=0.738 at the 0.01 level) was found between FTA card storage time and degradation levels. However, no inhibition was observed with the method used. The effect of degradation was manifested clearly in functional applications. Although complete STR profiles were obtained for all samples, there was evidence of degradation manifested as decreased peak heights in the larger-sized amplicons. Lower amplification success was notable with the large 5.1 kb CYP2D6 gene fragment, which strongly supports

  8. A unified framework for risk and vulnerability analysis covering both safety and security

    International Nuclear Information System (INIS)

    Aven, Terje

    2007-01-01

    Recently, we have seen several attempts to establish adequate risk and vulnerability analysis tools and related management frameworks dealing not only with accidental events but also with security problems. These attempts have been based on different analysis approaches and alternative building blocks. In this paper, we discuss some of these and show how a unified framework for such analyses and management tasks can be developed. The framework is based on the use of probability as a measure of uncertainty, as seen through the eyes of the assessor, and defines risk as the combination of possible consequences and related uncertainties. Risk and vulnerability characterizations are introduced, incorporating ideas both from the vulnerability analysis literature and from the risk classification scheme introduced by Renn and Klinke

  9. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend beyond a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus, after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  10. OpenElectrophy: an electrophysiological data- and analysis-sharing framework

    Directory of Open Access Journals (Sweden)

    Samuel Garcia

    2009-05-01

    Full Text Available Progress in experimental tools and design is allowing the acquisition of increasingly large datasets. Storage, manipulation and efficient analysis of such large amounts of data are now a primary issue. We present OpenElectrophy, an electrophysiological data- and analysis-sharing framework developed to fill this niche. It stores all experiment data and metadata in a single central MySQL database, provides a graphical user interface to visualize and explore the data, and offers a library of functions for user analysis scripting in Python. It implements multiple spike-sorting methods, and oscillation detection based on the ridge extraction method of Roux et al. (2007). OpenElectrophy is open source and is freely available for download at http://neuralensemble.org/trac/OpenElectrophy.
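
    The ridge-extraction idea cited above can be illustrated generically: compute a Morlet wavelet scalogram and follow, at each time point, the frequency of maximum power. The sketch below is plain NumPy, not OpenElectrophy's implementation, and all signal and wavelet parameters are assumptions.

```python
# Generic sketch of ridge-based oscillation detection on a Morlet scalogram.
import numpy as np

def morlet_scalogram(x, fs, freqs, n_cycles=7):
    """Wavelet power via direct convolution with complex Morlet kernels."""
    power = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        half = n_cycles / f                                  # kernel half-width in seconds
        t = np.arange(-half, half, 1.0 / fs)
        sigma = n_cycles / (2 * np.pi * f)                   # Gaussian envelope width
        kernel = np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2 * sigma ** 2))
        kernel /= np.abs(kernel).sum()
        power[i] = np.abs(np.convolve(x, kernel, mode="same")) ** 2
    return power

fs = 1000.0
t = np.arange(0, 4, 1 / fs)
lfp = np.random.default_rng(5).normal(0, 1, t.size)          # synthetic "LFP" trace
lfp[1000:3000] += 2 * np.sin(2 * np.pi * 40 * t[1000:3000])  # 40 Hz oscillation burst

freqs = np.arange(5.0, 90.0, 1.0)
power = morlet_scalogram(lfp, fs, freqs)
ridge = freqs[np.argmax(power, axis=0)]                      # frequency ridge over time
print("median ridge frequency inside the burst:", np.median(ridge[1200:2800]), "Hz")
```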

  11. The SAFE FOODS Risk Analysis Framework suitable for GMOs? A case study

    NARCIS (Netherlands)

    Kuiper, H.A.; Davies, H.V.

    2010-01-01

    This paper describes the current EU regulatory framework for risk analysis of genetically modified (GM) crop cultivation and market introduction of derived food/feed. Furthermore the risk assessment strategies for GM crops and derived food/feed as designed by the European Food Safety Authority

  12. Using a Strategic Planning Tool as a Framework for Case Analysis

    Science.gov (United States)

    Lai, Christine A.; Rivera, Julio C., Jr.

    2006-01-01

    In this article, the authors describe how they use a strategic planning tool known as SWOT as a framework for case analysis, using it to analyze the strengths, weaknesses, opportunities, and threats of a public works project intended to enhance regional economic development in Tempe, Arizona. Students consider the project in light of a variety of…

  13. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    International Nuclear Information System (INIS)

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-01-01

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs

  14. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu [Louisiana State University, Baton Rouge, LA (United States); Sattler, Meredith, E-mail: msattler@lsu.edu [School of Architecture, Louisiana State University, Baton Rouge, LA (United States); Friedland, Carol J., E-mail: friedland@lsu.edu [Bert S. Turner Department of Construction Management, Louisiana State University, Baton Rouge, LA (United States)

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.

  15. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of human errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human-error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.

  16. A framework for 2-stage global sensitivity analysis of GastroPlus™ compartmental models.

    Science.gov (United States)

    Scherholz, Megerle L; Forder, James; Androulakis, Ioannis P

    2018-04-01

    Parameter sensitivity and uncertainty analysis for physiologically based pharmacokinetic (PBPK) models are becoming an important consideration for regulatory submissions, requiring further evaluation to establish the need for global sensitivity analysis. To demonstrate the benefits of an extensive analysis, global sensitivity was implemented for the GastroPlus™ model, a well-known commercially available platform, using four example drugs: acetaminophen, risperidone, atenolol, and furosemide. The capabilities of GastroPlus were expanded by developing an integrated framework to automate the GastroPlus graphical user interface with AutoIt and to execute the sensitivity analysis in MATLAB®. Global sensitivity analysis was performed in two stages, using the Morris method to screen over 50 parameters for significant factors, followed by quantitative assessment of variability using Sobol's sensitivity analysis. The 2-stage approach significantly reduced computational cost for the larger model without sacrificing interpretation of model behavior, showing that the sensitivity results were well aligned with the biopharmaceutical classification system. Both methods detected nonlinearities and parameter interactions that would have otherwise been missed by local approaches. Future work includes further exploration of how the input domain influences the calculated global sensitivity measures as well as extending the framework to consider a whole-body PBPK model.
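
    The two-stage screen-then-quantify workflow can be mirrored on a stand-in analytic model with the SALib library, as sketched below. The actual framework drives GastroPlus through AutoIt and MATLAB; here the model, parameter names, bounds, and screening threshold are all placeholder assumptions.

```python
# Two-stage global sensitivity sketch with SALib on a stand-in analytic model
# (illustrative only; it does not drive GastroPlus).
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {"num_vars": 4,
           "names": ["clearance", "Vd", "ka", "fu"],        # placeholder parameters
           "bounds": [[0.5, 5.0], [10, 100], [0.1, 2.0], [0.01, 0.5]]}

def model(params):                                           # stand-in PK-like response
    cl, vd, ka, fu = params.T
    return fu * ka / (cl / vd + ka)

# Stage 1: Morris screening to find influential factors
X1 = morris_sample.sample(problem, N=100, num_levels=4)
res1 = morris_analyze.analyze(problem, X1, model(X1), num_levels=4)
keep = [name for name, mu in zip(problem["names"], res1["mu_star"]) if mu > 0.01]
print("factors kept after screening:", keep)

# Stage 2: variance-based Sobol indices (here on the full model for brevity)
X2 = saltelli.sample(problem, 1024)
res2 = sobol.analyze(problem, model(X2))
print("first-order indices:", dict(zip(problem["names"], np.round(res2["S1"], 3))))
```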

  17. Overview of the Systems Analysis Framework for the EU Bioeconomy. Deliverable 1.4 of the EU FP 7 SAT-BBE project Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy (SAT BBE)

    NARCIS (Netherlands)

    Leeuwen, van M.G.A.; Meijl, van H.; Smeets, E.M.W.; Tabeau-Kowalska, E.W.

    2014-01-01

    In November 2012 the Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy project (SAT-BBE) was launched with the purpose of designing an analysis tool useful for monitoring the evolution and impacts of the bioeconomy. In the SAT-BBE project the development of the analysis tool for the

  18. The FairRoot framework

    International Nuclear Information System (INIS)

    Al-Turany, M; Bertini, D; Karabowicz, R; Kresan, D; Malzacher, P; Uhlig, F; Stockmanns, T

    2012-01-01

    The FairRoot framework is an object-oriented simulation, reconstruction and data analysis framework based on ROOT. It includes core services for detector simulation and offline analysis. The framework delivers base classes which enable the users to easily construct their experimental setup in a fast and convenient way. By using the Virtual Monte Carlo concept it is possible to perform the simulations using either Geant3 or Geant4 without changing the user code or the geometry description. Using and extending the task mechanism of ROOT it is possible to implement complex analysis tasks in a convenient way. Moreover, using the FairCuda interface of the framework it is possible to run some of these tasks also on GPUs. Data IO, as well as parameter handling and database connections, are also handled by the framework. Since some of the experiments will not have an experimental setup with a conventional trigger system, the framework can also handle free-flowing input streams of detector data. For this mode of operation the framework provides classes to create the needed time-sorted input streams of detector data out of the event-based simulation data. There are also tools to do radiation studies and to visualize the simulated data. A CMake-CDash based building and monitoring system is also part of the FairRoot services, which helps to build and test the framework on many different platforms in an automatic way, including also Continuous Integration.

  19. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    Science.gov (United States)

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component, Intel Software Guard Extensions (Intel SGX), to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error: computed model parameters are identical to the plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  20. Metabolomic Profiling of Post-Mortem Brain Reveals Changes in Amino Acid and Glucose Metabolism in Mental Illness Compared with Controls

    Directory of Open Access Journals (Sweden)

    Rong Zhang

    2016-01-01

    Full Text Available Metabolomic profiling was carried out on 53 post-mortem brain samples from subjects diagnosed with schizophrenia, depression, bipolar disorder (SDB), diabetes, and controls. Chromatography on a ZIC-pHILIC column was used with detection by Orbitrap mass spectrometry. Data extraction was carried out with m/z Mine 2.14 with metabolite searching against an in-house database. There was no clear discrimination between the controls and the SDB samples on the basis of a principal components analysis (PCA) model of 755 identified or putatively identified metabolites. Orthogonal partial least squares discriminant analysis (OPLS-DA) produced clear separation between 17 of the controls and 19 of the SDB samples (cumulative R2 0.976, Q2 0.671, p-value of the cross-validated ANOVA score 0.0024). The most important metabolites producing discrimination were the lipophilic amino acids leucine/isoleucine, proline, methionine, phenylalanine, and tyrosine; the neurotransmitters GABA and NAAG; and the sugar metabolites sorbitol, gluconic acid, xylitol, ribitol, arabitol, and erythritol. Eight samples from diabetic brains were analysed, six of which grouped with the SDB samples without compromising the model (cumulative R2 0.850, cumulative Q2 0.534, p-value for the cross-validated ANOVA score 0.00087). There appears, on the basis of this small sample set, to be some commonality between the metabolic perturbations resulting from diabetes and from SDB.
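
    The PCA-then-supervised-discrimination workflow can be sketched with scikit-learn as below, using plain PLS-DA as a stand-in for OPLS-DA (which scikit-learn does not provide) and synthetic intensity data with the same sample sizes; no real metabolomic data are used.

```python
# Illustrative PCA followed by PLS-DA on synthetic metabolite-intensity data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_control, n_case, n_metab = 17, 19, 755                     # mirrors the sample sizes above
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_control + n_case, n_metab))
X[n_control:, :20] *= 1.6                                    # 20 synthetic "discriminating" metabolites
y = np.r_[np.zeros(n_control), np.ones(n_case)]

X_log = np.log(X)
X_log -= X_log.mean(axis=0)                                  # mean-centre after log transform

pca = PCA(n_components=2).fit(X_log)                         # unsupervised overview
print("PCA explained variance ratios:", pca.explained_variance_ratio_.round(3))

plsda = PLSRegression(n_components=2)                        # PLS-DA as an OPLS-DA stand-in
accuracy = lambda est, Xv, yv: float(((est.predict(Xv).ravel() > 0.5) == yv).mean())
scores = cross_val_score(plsda, X_log, y, cv=5, scoring=accuracy)
print("cross-validated PLS-DA accuracy:", scores.mean().round(2))
```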

  1. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  2. Post-mortem computed tomography findings of the lungs: Retrospective review and comparison with autopsy results of 30 infant cases

    Energy Technology Data Exchange (ETDEWEB)

    Kawasumi, Yusuke, E-mail: ssu@rad.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Usui, Akihito, E-mail: t7402r0506@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosokai, Yoshiyuki, E-mail: hosokai@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Igari, Yui, E-mail: igari@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hosoya, Tadashi [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Hayashizaki, Yoshie, E-mail: yoshie@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Saito, Haruo, E-mail: hsaito@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Diagnostic Image Analysis, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Ishibashi, Tadashi, E-mail: tisibasi@med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Clinical Imaging, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan); Funayama, Masato, E-mail: funayama@forensic.med.tohoku.ac.jp [Tohoku University Graduate School of Medicine, Department of Forensic Medicine, 2-1 Seiryo-machi, Aoba-ku, Sendai, Miyagi 980-8575 (Japan)

    2015-04-15

    Highlights: •Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). •In this study, twenty-two of the thirty sudden infant death cases showed increasing concentration in the entire lung field. •Based on the autopsy results, the lungs simply collapsed and no other abnormal lung findings were identified. •The radiologist should not consider increasing concentration in all lung fields as simply a pulmonary disorder when diagnosing the cause of infant death using PMCT. -- Abstract: Objectives: Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). However, the lungs often show simply atelectasis at autopsy in the absence of any other abnormal changes. Thus, we retrospectively reviewed the PMCT findings of lungs following sudden infant death and correlated them with the autopsy results. Materials and methods: We retrospectively reviewed infant cases (0 year) who had undergone PMCT and a forensic autopsy at our institution between May 2009 and June 2013. Lung opacities were classified according to their type; consolidation, ground-glass opacity and mixed, as well as distribution; bilateral diffuse and areas of sparing. Statistical analysis was performed to assess the relationships among lung opacities, causes of death and resuscitation attempt. Results: Thirty infant cases were selected, which included 22 sudden and unexplained deaths and 8 other causes of death. Resuscitation was attempted in 22 of 30 cases. Bilateral diffuse opacities were observed in 21 of the 30 cases. Of the 21 cases, 18 were sudden and unexplained deaths. Areas of sparing were observed in 4 sudden and unexplained deaths and 5 other causes of death. Distribution of opacities was not significantly associated with causes of death or resuscitation attempt. The 21 cases with bilateral diffuse opacities included 6 consolidations (4 sudden and unexplained

  3. Post-mortem computed tomography findings of the lungs: Retrospective review and comparison with autopsy results of 30 infant cases

    International Nuclear Information System (INIS)

    Kawasumi, Yusuke; Usui, Akihito; Hosokai, Yoshiyuki; Igari, Yui; Hosoya, Tadashi; Hayashizaki, Yoshie; Saito, Haruo; Ishibashi, Tadashi; Funayama, Masato

    2015-01-01

    Highlights: •Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). •In this study, twenty-two of the thirty sudden infant death cases showed an increased concentration in the entire lung field. •Based on the autopsy results, the lungs had simply collapsed and no other abnormal lung findings were identified. •Radiologists should not interpret an increased concentration in all lung fields simply as a pulmonary disorder when diagnosing the cause of infant death using PMCT. -- Abstract: Objectives: Infant cases frequently show a diffuse increase in the concentration of lung fields on post-mortem computed tomography (PMCT). However, the lungs often simply show atelectasis at autopsy in the absence of any other abnormal changes. Thus, we retrospectively reviewed the PMCT findings of lungs following sudden infant death and correlated them with the autopsy results. Materials and methods: We retrospectively reviewed infant cases (age 0 years) who had undergone PMCT and a forensic autopsy at our institution between May 2009 and June 2013. Lung opacities were classified by type (consolidation, ground-glass opacity, or mixed) and by distribution (bilateral diffuse, or with areas of sparing). Statistical analysis was performed to assess the relationships among lung opacities, causes of death and resuscitation attempts. Results: Thirty infant cases were selected, which included 22 sudden and unexplained deaths and 8 other causes of death. Resuscitation was attempted in 22 of 30 cases. Bilateral diffuse opacities were observed in 21 of the 30 cases. Of the 21 cases, 18 were sudden and unexplained deaths. Areas of sparing were observed in 4 sudden and unexplained deaths and 5 other causes of death. Distribution of opacities was not significantly associated with causes of death or resuscitation attempts. The 21 cases with bilateral diffuse opacities included 6 consolidations (4 sudden and unexplained

  4. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    7.1.7 Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials. Khara D. Grieger (1), Igor Linkov (2), Steffen Foss Hansen (1), Anders Baun (1). (1) Technical University of Denmark, Kgs. Lyngby, Denmark; (2) Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA. Email: kdg@env.dtu.dk. Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming ... and approaches which have been developed or proposed by large organizations or regulatory bodies for NM. These frameworks and approaches were evaluated and assessed based on a select number of criteria which have been previously proposed as important parameters for inclusion in successful risk assessment ...

  5. An integrated framework for cost- benefit analysis in road safety projects using AHP method

    Directory of Open Access Journals (Sweden)

    Mahsa Mohamadian

    2011-10-01

    Full Text Available Cost-benefit analysis (CBA) is a useful tool for investment decision-making from an economic point of view. When the decision involves conflicting goals, the multi-attribute analysis approach is more capable, because there are some social and environmental criteria that cannot be valued or monetized by cost-benefit analysis. The complex nature of decision-making in road safety normally makes it difficult to reach a single alternative solution that can satisfy all decision-making problems. Generally, the application of multi-attribute analysis in the road sector is promising; however, the applications are at a preliminary stage. Some multi-attribute analysis techniques, such as the analytic hierarchy process (AHP), have been widely used in practice. This paper presents an integrated framework with CBA and AHP methods to select a proper alternative in road safety projects. The proposed model of this paper is implemented for a case study of improving a road to reduce accidents in Iran. The framework is used as an aid to the cost-benefit tool in road safety projects.
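
    To make the AHP step concrete, the sketch below (not taken from the paper; the criteria, comparison values and consistency threshold are illustrative assumptions) derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio.

    import numpy as np

    # Hypothetical pairwise comparison matrix for three road-safety criteria
    # (cost, accident reduction, environmental impact); values are illustrative only.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # The principal eigenvector gives the AHP priority weights.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio (RI = 0.58 is Saaty's random index for n = 3).
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    cr = ci / 0.58
    print("priority weights:", weights.round(3), "consistency ratio:", round(cr, 3))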

  6. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe

    2014-06-06

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation, and the phase-field crystal equation as test cases. These two models allow us to highlight some of the main advantages that we have access to while using PetIGA for scientific computing.

  7. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    Science.gov (United States)

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
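
    The cross-validated interpolation error reported above can be reproduced in outline as follows; this is a minimal sketch with synthetic data, using a Gaussian-process regressor as a stand-in for kriging and a range-normalized RMSE, both assumptions rather than the authors' exact implementation.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)
    # Synthetic stand-in for sampling positions (x, y, depth) and a water-quality variable.
    X = rng.uniform(0, 1, size=(300, 3))
    y = np.sin(4 * X[:, 0]) + 0.5 * X[:, 2] + rng.normal(0, 0.05, 300)

    nrmse = []
    for train, test in KFold(n_splits=3, shuffle=True, random_state=0).split(X):
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2) + WhiteKernel(1e-3),
                                      normalize_y=True)
        gp.fit(X[train], y[train])
        pred = gp.predict(X[test])
        rmse = np.sqrt(np.mean((pred - y[test]) ** 2))
        nrmse.append(rmse / (y[test].max() - y[test].min()))  # range-normalized RMSE

    print("3-fold NRMSE:", np.round(nrmse, 3))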

  8. A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure

    Directory of Open Access Journals (Sweden)

    Yingjie Xia

    2013-01-01

    Full Text Available Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which aim to integrate heterogeneous traffic data from different kinds of sensors and apply it to ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to these problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), which by nature comprises parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later, based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing.
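
    A toy illustration of the domain-decomposition and parallel-processing idea (not the Cyber-ITS code; the segment names and the per-segment statistic are hypothetical) might look like this in Python:

    import numpy as np
    from multiprocessing import Pool

    # Split traffic-sensor records by road segment and estimate a mean speed
    # per segment in parallel; fields and numbers are illustrative only.
    def segment_state(args):
        seg_id, speeds = args
        return seg_id, float(np.mean(speeds))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        segments = {f"seg_{i}": rng.normal(45, 8, 500) for i in range(8)}
        with Pool(processes=4) as pool:
            states = dict(pool.map(segment_state, segments.items()))
        print("estimated mean speed on seg_0:", round(states["seg_0"], 1))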

  9. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently widely used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which at present can reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of three basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for developing typical components of a web-mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library for graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis

  10. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high-resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high-resolution DTM is showcased.
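
    As an illustration of one such data product, the sketch below derives a slope map and a simple slope-hazard mask from a synthetic DTM; the 15-degree threshold and the terrain are assumptions, not LandSAfe parameters.

    import numpy as np

    def slope_map(dtm, cell_size):
        """Slope (degrees) from a digital terrain model via finite differences."""
        dz_dy, dz_dx = np.gradient(dtm, cell_size)
        return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    # Toy 5 m-resolution DTM; a simple hazard mask flags slopes steeper than 15 degrees.
    dtm = np.fromfunction(lambda i, j: 0.5 * i + 8.0 * np.sin(j / 3.0), (200, 200))
    slope = slope_map(dtm, cell_size=5.0)
    hazard = slope > 15.0
    print("share of cells flagged as slope hazard:", hazard.mean().round(3))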

  11. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has

  12. Effect of ante-mortem hypoxia on the physicochemical and functional properties of white shrimp (Litopenaeus vannamei) muscle stored on ice.

    Science.gov (United States)

    García-Sifuentes, Celia Olivia; Pacheco-Aguilar, Ramón; Scheuren-Acevedo, Susana María; Carvallo-Ruiz, Gisela; Garcia-Sanchez, Guillermina; Gollas-Galván, Teresa; Hernández-López, Jorge

    2013-06-01

    The effect of ante-mortem hypoxia on the physicochemical and functional properties of raw and cooked white shrimp was studied. Hue angle was greater (p ≤ 0.05) for stressed raw shrimp compared to control (greener color); whereas a lower angle was detected for cooked stressed shrimp (redder/orange coloration). In addition, hue angle increased (p ≤ 0.05) over the ice storage period for control and stressed shrimp (raw and/or cooked). Muscle hardness and shear force showed no differences when comparing control and stressed shrimp (raw and/or cooked). However, during ice storage, shear force increased (p ≤ 0.05) by 22% and 9% for control and stressed raw shrimp, respectively; in contrast, shear force and muscle hardness decreased for cooked shrimp (p ≤ 0.05). Control showed more (p ≤ 0.05) elasticity than stressed cooked shrimp. Stressed raw shrimp showed a water holding capacity 10.8% lower (p ≤ 0.05) than control. However, during the storage, water holding capacity increased (p ≤ 0.05) reaching similar values to control after day 4. Muscle protein solubility of stressed shrimp was 31% lower than control; however, no differences (p > 0.05) were observed after the second day. The thermal stability of myosin (T max) showed differences (p ≤ 0.05) among control and stressed shrimp, whereas no differences for ΔH were observed. Results showed the influence of ante-mortem hypoxia on the physicochemical and functional properties of white shrimp muscle.

  13. The potential of non-invasive pre- and post-mortem carcass measurements to predict the contribution of carcass components to slaughter yield of guinea pigs.

    Science.gov (United States)

    Barba, Lida; Sánchez-Macías, Davinia; Barba, Iván; Rodríguez, Nibaldo

    2018-06-01

    Guinea pig meat consumption is increasing exponentially worldwide. Evaluating the contribution of carcass components to carcass quality can potentially allow estimation of the value added to foods of animal origin and make research in guinea pigs more practicable. The aim of this study was to propose a methodology for modelling the contribution of different carcass components to the overall carcass quality of guinea pigs by using non-invasive pre- and post-mortem carcass measurements. Predictors were selected through correlation analysis and statistical significance, whereas the prediction models were based on multiple linear regression. The prediction results showed higher accuracy when carcass component contributions were expressed in grams than when expressed as a percentage of the carcass. The proposed prediction models can be useful for the guinea pig meat industry and research institutions by using non-invasive and time- and cost-efficient carcass component measuring techniques. Copyright © 2018 Elsevier Ltd. All rights reserved.
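
    A minimal sketch of the described workflow (correlation-based predictor selection followed by multiple linear regression) is shown below; the measurement names, data and the 0.5 correlation threshold are hypothetical, not the study's.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    # Hypothetical non-invasive measurements; column names are illustrative, not the paper's.
    df = pd.DataFrame({
        "live_weight_g":   rng.normal(1000, 150, 80),
        "body_length_cm":  rng.normal(30, 3, 80),
        "thorax_girth_cm": rng.normal(18, 2, 80),
    })
    df["carcass_muscle_g"] = (0.45 * df["live_weight_g"]
                              + 25 * df["thorax_girth_cm"]
                              + rng.normal(0, 20, 80))

    # Keep predictors whose correlation with the target exceeds an assumed threshold.
    corr = df.corr()["carcass_muscle_g"].drop("carcass_muscle_g")
    selected = corr[corr.abs() > 0.5].index.tolist()

    model = LinearRegression().fit(df[selected], df["carcass_muscle_g"])
    print("selected predictors:", selected)
    print("R^2:", round(model.score(df[selected], df["carcass_muscle_g"]), 3))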

  14. A Comparative Analysis of PISA Scientific Literacy Framework in Finnish and Thai Science Curricula

    Science.gov (United States)

    Sothayapetch, Pavinee; Lavonen, Jari; Juuti, Kalle

    2013-01-01

    A curriculum is a master plan that regulates teaching and learning. This paper compares Finnish and Thai primary school level science curricula to the PISA 2006 Scientific Literacy Framework. Curriculum comparison was made following the procedure of deductive content analysis. In the analysis, there were four main categories adopted from PISA…

  15. Critical asset and portfolio risk analysis: an all-hazards framework.

    Science.gov (United States)

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
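
    A minimal numerical sketch of the notional risk product described above (asset values, probabilities and the flat interdependency uplift are illustrative assumptions, not CAPRA outputs) could look like this:

    # Notional all-hazards product R = C x V x T for a small asset portfolio.
    assets = {
        #              consequence ($M), vulnerability, annual threat likelihood
        "substation":  (50.0, 0.4, 0.02),
        "pump_house":  (20.0, 0.7, 0.05),
        "data_center": (80.0, 0.2, 0.01),
    }

    def asset_risk(consequence, vulnerability, threat):
        return consequence * vulnerability * threat

    portfolio = sum(asset_risk(*v) for v in assets.values())
    # First-order interdependency: assume a successful attack on one asset adds a
    # fraction of the losses of dependent assets (here a flat 10% uplift).
    portfolio_with_dependencies = 1.10 * portfolio

    for name, v in assets.items():
        print(f"{name}: expected annual loss ~ ${asset_risk(*v):.2f}M")
    print(f"portfolio (with 10% interdependency uplift): ~ ${portfolio_with_dependencies:.2f}M")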

  16. Concepts of person-centred care: a framework analysis of five studies in daily care practices

    Directory of Open Access Journals (Sweden)

    Margreet

    2016-11-01

    Full Text Available Background: Person-centred care is used as a term to indicate a ‘made to measure’ approach in care. But what does this look like in daily practice? The person-centred nursing framework developed by McCormack and McCance (2010) offers specific concepts, but these are still described in rather general terms. Empirical studies, therefore, could help to clarify them and make person-centredness more tangible for nurses. Aims: This paper describes how a framework analysis aimed to clarify the concepts described in the model of McCormack and McCance in order to guide professionals using them in practice. Methods: Five separate empirical studies focusing on older adults in the Netherlands were used in the framework analysis. The research question was: ‘How are concepts of person-centred care made tangible where empirical data are used to describe them?’ Analysis was done in five steps, leading to a comparison between the description of the concepts and the empirical significance found in the studies. Findings: Suitable illustrations were found for the majority of concepts. The results show that an empirically derived specification emerges from the data. In the concept of ‘caring relationship’ for example, it is shown that the personal character of each relationship is expressed by what the nurse and the older person know about each other. Other findings show the importance of values being present in care practices. Conclusions: The framework analysis shows that concepts can be clarified when empirical studies are used to make person-centred care tangible so nurses can understand and apply it in practice. Implications for practice: The concepts of the person-centred nursing framework are recognised when: nurses know the unique characteristics of the person they care for and what is important to them, and act accordingly; nurses use values such as trust, involvement and humour in their care practice; acknowledgement of emotions and compassion create

  17. FEBEX II Project Post-mortem analysis EDZ assessment

    International Nuclear Information System (INIS)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-01-01

    Within the framework of the FEBEX II project a multidisciplinary team studied the mechanisms of creation of the potential damaged zone around the test drift. The research program includes laboratory and in situ investigations as well as the numerical modelling of the observed phenomena. Where laboratory investigations are concerned, the ¹⁴C-PMMA technique was applied to study the spatial distribution of porosity in the samples taken from the test drift wall. In addition complementary microscopy and scanning electron microscopy (SEM) studies were performed to make qualitative investigations on the pore apertures and minerals in porous regions. The results obtained with the PMMA method have not shown any clear increased porosity zone adjacent to the tunnel wall. The total porosity of the samples varied between 0.6-1.2%. The samples from the unplugged region did not differ from those from the plugged region. A clear increase in porosity to depths of 10-15 mm from the tunnel wall was detected in lamprophyre samples. According to the SEM/EDX analyses the excavation-disturbed zone in the granite matrix extended to depths of 1-3 mm from the wall surface. A few quartz grains were crushed and some micro fractures were found. Gas permeability tests were carried out on two hollow cylinder samples, each about 1 m long, taken from the granite wall perpendicular to the drift axis. The first sample was cored in the service area far from the heated zone and the second one at the level of the heater. The tests were performed at constant gas pressure by setting a steady state radial flow through a section 1 cm wide isolated by means of four mini-packers. The profile of the gas permeability along the core length has been established. The results obtained for both samples have shown permeability ranging between 3.5×10⁻¹⁸ and 8.4×10⁻¹⁹ m², pointing out the absence of marked damage. Acoustic investigations have been carried out with the objective of quantifying the
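
    For orientation, permeability values of this order can be related to a measured flow via the standard Darcy expression for steady-state radial flow of an ideal gas; the sketch below uses that textbook formula with purely illustrative numbers and is not the authors' exact test interpretation.

    import math

    # Standard Darcy expression for steady-state radial flow of an ideal gas from a
    # borehole (radius r1) out to radius r2 through a packed-off section of width h.
    def radial_gas_permeability(q_atm, p_atm, mu, r1, r2, h, p1, p2):
        """k = q_atm * p_atm * mu * ln(r2/r1) / (pi * h * (p1**2 - p2**2))  [m^2]"""
        return q_atm * p_atm * mu * math.log(r2 / r1) / (math.pi * h * (p1**2 - p2**2))

    k = radial_gas_permeability(
        q_atm=2.0e-9,        # measured flow rate at atmospheric pressure (m^3/s), illustrative
        p_atm=1.0e5,         # reference pressure (Pa)
        mu=1.8e-5,           # gas viscosity (Pa.s)
        r1=0.014, r2=0.5,    # borehole and outer radii (m), illustrative
        h=0.01,              # width of the isolated section (m)
        p1=3.0e5, p2=1.0e5,  # injection and far-field pressures (Pa), illustrative
    )
    print(f"k ~ {k:.2e} m^2")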

  18. FEBEX II Project Post-mortem analysis EDZ assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-07-01

    Within the framework of the FEBEX II project a multidisciplinary team studied the mechanisms of creation of the potential damaged zone around the test drift. The research program includes laboratory and in situ investigations as well as the numerical modelling of the observed phenomena. Where laboratory investigations are concerned, the ¹⁴C-PMMA technique was applied to study the spatial distribution of porosity in the samples taken from the test drift wall. In addition complementary microscopy and scanning electron microscopy (SEM) studies were performed to make qualitative investigations on the pore apertures and minerals in porous regions. The results obtained with the PMMA method have not shown any clear increased porosity zone adjacent to the tunnel wall. The total porosity of the samples varied between 0.6-1.2%. The samples from the unplugged region did not differ from those from the plugged region. A clear increase in porosity to depths of 10-15 mm from the tunnel wall was detected in lamprophyre samples. According to the SEM/EDX analyses the excavation-disturbed zone in the granite matrix extended to depths of 1-3 mm from the wall surface. A few quartz grains were crushed and some micro fractures were found. Gas permeability tests were carried out on two hollow cylinder samples, each about 1 m long, taken from the granite wall perpendicular to the drift axis. The first sample was cored in the service area far from the heated zone and the second one at the level of the heater. The tests were performed at constant gas pressure by setting a steady state radial flow through a section 1 cm wide isolated by means of four mini-packers. The profile of the gas permeability along the core length has been established. The results obtained for both samples have shown permeability ranging between 3.5×10⁻¹⁸ and 8.4×10⁻¹⁹ m², pointing out the absence of marked damage. Acoustic investigations have been carried out with the objective of quantifying the

  19. Comparability of outcome frameworks in medical education: Implications for framework development.

    Science.gov (United States)

    Hautz, Stefanie C; Hautz, Wolf E; Feufel, Markus A; Spies, Claudia D

    2015-01-01

    Given the increasing mobility of medical students and practitioners, there is a growing need for harmonization of medical education and qualifications. Although several initiatives have sought to compare national outcome frameworks, this task has proven a challenge. Drawing on an analysis of existing outcome frameworks, we identify factors that hinder comparability and suggest ways of facilitating comparability during framework development and revisions. We searched MedLine, EmBase and the Internet for outcome frameworks in medical education published by national or governmental organizations. We analyzed these frameworks for differences and similarities that influence comparability. Of 1816 search results, 13 outcome frameworks met our inclusion criteria. These frameworks differ in five core features: history and origins, formal structure, medical education system, target audience and key terms. Many frameworks reference other frameworks without acknowledging these differences. Importantly, the level of detail of the outcomes specified differs both within and between frameworks. The differences identified explain some of the challenges involved in comparing outcome frameworks and medical qualifications. We propose a two-level model distinguishing between "core" competencies and culture-specific "secondary" competencies. This approach could strike a balance between local specifics and cross-national comparability of outcome frameworks and medical education.

  20. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++ programming, with the Qt platform currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation, using both the McCabe and Halstead methods, to the BCI framework, which consists of two important types of BCI (SSVEP and P300), we found that there are two classes in the framework which are very complex and prone to violating the cohesion principle of OOP. The other metrics fit the criteria of the proposed framework aspects: MPC is less than 20, average complexity is around 5, and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
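
    The paper targets C++/Qt, but the McCabe idea can be sketched in a few lines of Python: cyclomatic complexity approximated as one plus the number of decision points found in the syntax tree. The set of node types counted below is one common approximation, not the paper's tool.

    import ast

    DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                      ast.With, ast.Assert, ast.BoolOp, ast.IfExp)

    def cyclomatic_complexity(source: str) -> int:
        """McCabe-style complexity: one plus the number of decision points."""
        tree = ast.parse(source)
        return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

    sample = """
    def classify(x):
        if x is None:
            return "missing"
        for v in x:
            if v < 0 and v != -1:
                return "negative"
        return "ok"
    """
    print("cyclomatic complexity:", cyclomatic_complexity(sample.replace("\n    ", "\n")))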

  1. Exploring intellectual capital through social network analysis: a conceptual framework

    Directory of Open Access Journals (Sweden)

    Ivana Tichá

    2011-01-01

    Full Text Available The purpose of this paper is to develop a framework to assess intellectual capital. Intellectual capital is a key element in an organization’s future earning potential. Theoretical and empirical studies show that it is the unique combination of the different elements of intellectual capital and tangible investments that determines an enterprise’s competitive advantage. Intellectual capital has been defined as the combination of an organization’s human, organizational and relational resources and activities. It includes the knowledge, skills, experience and abilities of the employees, its R&D activities, organizational routines, procedures, systems, databases and its Intellectual Property Rights, as well as all the resources linked to its external relationships, such as those with its customers, suppliers, R&D partners, etc. This paper focuses on relational capital and suggests a conceptual framework to assess this part of intellectual capital by applying a social network analysis approach. The SNA approach allows for mapping and measuring of relationships and flows between people, groups, organizations, computers, URLs, and other connected information/knowledge entities. The conceptual framework is developed for the assessment of collaborative networks in the Czech higher education sector as the representation of its relational capital. It also builds on previous work aiming at a methodology to guide efforts to report intellectual capital at Czech public universities.
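
    A minimal sketch of the SNA step (the collaboration ties are hypothetical, not data from the Czech higher education sector) could compute standard relational-capital indicators such as density and centrality:

    import networkx as nx

    # Hypothetical collaboration ties between universities and an external partner.
    edges = [("UnivA", "UnivB"), ("UnivA", "UnivC"), ("UnivB", "UnivC"),
             ("UnivC", "FirmX"), ("UnivD", "FirmX")]
    G = nx.Graph(edges)

    print("network density:", round(nx.density(G), 3))
    print("degree centrality:", {n: round(c, 2) for n, c in nx.degree_centrality(G).items()})
    print("betweenness:", {n: round(c, 2) for n, c in nx.betweenness_centrality(G).items()})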

  2. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change assessed was based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  3. Degradation of Kidney and Psoas Muscle Proteins as Indicators of Post-Mortem Interval in a Rat Model, with Use of Lateral Flow Technology.

    Directory of Open Access Journals (Sweden)

    Dong-Gi Lee

    Full Text Available We investigated potential protein markers of post-mortem interval (PMI using rat kidney and psoas muscle. Tissue samples were taken at 12 h intervals for up to 96 h after death by suffocation. Expression levels of eight soluble proteins were analyzed by Western blotting. Degradation patterns of selected proteins were clearly divided into three groups: short-term, mid-term, and long-term PMI markers based on the half maximum intensity of intact protein expression. In kidney, glycogen synthase (GS and glycogen synthase kinase-3β were degraded completely within 48 h making them short-term PMI markers. AMP-activated protein kinase α, caspase 3 and GS were short-term PMI markers in psoas muscle. Glyceraldehyde 3-phosphate dehydrogenase (GAPDH was a mid-term PMI marker in both tissues. Expression levels of the typical long-term PMI markers, p53 and β-catenin, were constant for at least 96 h post-mortem in both tissues. The degradation patterns of GS and caspase-3 were verified by immunohistochemistry in both tissues. GAPDH was chosen as a test PMI protein to perform a lateral flow assay (LFA. The presence of recombinant GAPDH was clearly detected in LFA and quantified in a concentration-dependent manner. These results suggest that LFA might be used to estimate PMI at a crime scene.
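
    The half-maximum criterion used to group markers can be illustrated with a simple decay fit; the densitometry values below are invented for illustration, and the single-exponential model is an assumption rather than the study's analysis.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical intact-protein band intensity (relative units) at 12 h intervals.
    t = np.array([0, 12, 24, 36, 48, 60, 72, 84, 96], dtype=float)
    intensity = np.array([1.00, 0.82, 0.60, 0.41, 0.30, 0.21, 0.15, 0.11, 0.08])

    def decay(t, k):
        return np.exp(-k * t)

    (k_hat,), _ = curve_fit(decay, t, intensity, p0=[0.02])
    t_half_max = np.log(2) / k_hat   # time at which intensity falls to half its initial value
    print(f"estimated half-maximum time: {t_half_max:.1f} h post-mortem")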

  4. Tissue Microarray Analysis Applied to Bone Diagenesis

    OpenAIRE

    Barrios Mello, Rafael; Regis Silva, Maria Regina; Seixas Alves, Maria Teresa; Evison, Martin; Guimarães, Marco Aurélio; Francisco, Rafaella Arrabaça; Dias Astolphi, Rafael; Miazato Iwamura, Edna Sadayo

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens....

  5. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

    In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze the data directly inside the application where the data is entered, a tool that allows the scientists to easily work with their data, supports and motivates the execution of further analysis of their data, which…

  6. Back to the Future - Part 2. Post-mortem assessment and evolutionary role of the bio-medicolegal sciences.

    Science.gov (United States)

    Ferrara, Santo Davide; Cecchetto, Giovanni; Cecchi, Rossana; Favretto, Donata; Grabherr, Silke; Ishikawa, Takaki; Kondo, Toshikazu; Montisci, Massimo; Pfeiffer, Heidi; Bonati, Maurizio Rippa; Shokry, Dina; Vennemann, Marielle; Bajanowski, Thomas

    2017-07-01

    Part 2 of the review "Back to the Future" is dedicated to the evolutionary role of the bio-medicolegal sciences, reporting the historical profiles, the state of the art, and prospects for future development of the main related techniques and methods of the ancillary disciplines that have risen to the role of "autonomous" sciences, namely, Genetics and Genomics, Toxicology, Radiology, and Imaging, involved in historic synergy in the "post-mortem assessment," together with the mother discipline Legal Medicine, by way of its primary fundament, universally denominated as Forensic Pathology. The evolution of the scientific research and the increased accuracy of the various disciplines will be oriented towards the elaboration of an "algorithm," able to weigh the value of "evidence" placed at the disposal of the "justice system" as real truth and proof.

  7. Clinical and post mortem analysis of combat neck injury used to inform a novel coverage of armour tool.

    Science.gov (United States)

    Breeze, J; Fryer, R; Hare, J; Delaney, R; Hunt, N C; Lewis, E A; Clasper, J C

    2015-04-01

    There is a requirement in the Ministry of Defence for an objective method of comparing the area of coverage of different body armour designs for future applications. Existing comparisons derived from surface wound mapping are limited in that they can only demonstrate the skin entry wound location. The Coverage of Armour Tool (COAT) is a novel three-dimensional model capable of comparing the coverage provided by body armour designs, but limited information exists as to which anatomical structures require inclusion. The aim of this study was to assess the utility of COAT, in the assessment of neck protection, using clinically relevant injury data. Hospital notes and post mortem records of all UK soldiers injured by an explosive fragment to the neck between 01 Jan 2006 and 31 December 2012 from Iraq and Afghanistan were analysed to determine which anatomical structures were responsible for death or functional disability at one year post injury. Using COAT a comparison of three ballistic neck collar designs was undertaken with reference to the percentage of these anatomical structures left exposed. 13/81 (16%) survivors demonstrated complications at one year, most commonly upper limb weakness from brachial plexus injury or a weak voice from laryngeal trauma. In 14/94 (15%) soldiers the neck wound was believed to have been the sole cause of death, primarily from carotid artery damage, spinal cord transection or rupture of the larynx. COAT objectively demonstrated that despite the larger OSPREY collar having almost double the surface area than the two-piece prototype collar, the percentage area of vulnerable cervical structures left exposed only reduced from 16.3% to 14.4%. COAT demonstrated its ability to objectively quantify the potential effectiveness of different body armour designs in providing coverage of vulnerable anatomical structures from different shot line orientations. To improve its utility, it is recommended that COAT be further developed to enable weapon

  8. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
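
    A minimal sketch of clustering-based anomaly detection in a spatio-temporal context is shown below; the data are synthetic, and DBSCAN with these parameters is a stand-in, not necessarily the algorithm the authors use.

    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(7)
    # Synthetic stand-in for gridded satellite observations: (latitude, longitude, value).
    normal = np.column_stack([rng.uniform(-60, 60, 2000),
                              rng.uniform(-180, 180, 2000),
                              rng.normal(0.0, 1.0, 2000)])
    # A small patch of unusually high values mimics an anomalous event.
    event = np.column_stack([rng.normal(10, 0.5, 30),
                             rng.normal(40, 0.5, 30),
                             rng.normal(8.0, 0.3, 30)])
    X = np.vstack([normal, event])

    # Scale features so space and value contribute comparably, then cluster.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(Xs)

    # Label -1 marks isolated outliers; a small, compact cluster far from the bulk
    # of the data is a candidate "anomalous event" worth ranking and inspecting.
    sizes = {lab: int((labels == lab).sum()) for lab in set(labels) if lab != -1}
    print("isolated outliers:", int((labels == -1).sum()))
    print("cluster sizes:", sizes)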

  9. PGSB/MIPS PlantsDB Database Framework for the Integration and Analysis of Plant Genome Data.

    Science.gov (United States)

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai; Gundlach, Heidrun; Mayer, Klaus F X

    2017-01-01

    Plant Genome and Systems Biology (PGSB), formerly Munich Institute for Protein Sequences (MIPS) PlantsDB, is a database framework for the integration and analysis of plant genome data, developed and maintained for more than a decade now. Major components of that framework are genome databases and analysis resources focusing on individual (reference) genomes providing flexible and intuitive access to data. Another main focus is the integration of genomes from both model and crop plants to form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny). Data exchange and integrated search functionality with/over many plant genome databases is provided within the transPLANT project.

  10. GRDC. A Collaborative Framework for Radiological Background and Contextual Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Quiter, Brian J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bandstra, Mark S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-12-01

    The Radiation Mobile Analysis Platform (RadMAP) is unique in its capability to collect both high-quality radiological data from gamma-ray detectors and fast neutron detectors and a broad array of contextual data, including positioning and stance data and high-resolution 3D data from weather sensors, LiDAR, and visual and hyperspectral cameras. The datasets obtained from RadMAP are both voluminous and complex and require analysis by highly diverse groups within both the national laboratory and academic communities. Maintaining a high level of transparency will enable analysis products to further enrich the RadMAP dataset. It is in this spirit of open and collaborative data that the RadMAP team proposed to collect, calibrate, and make available online data from the RadMAP system. The Berkeley Data Cloud (BDC) is a cloud-based data management framework that enables web-based data browsing and visualization, and connects curated datasets to custom workflows such that analysis products can be managed and disseminated while maintaining user access rights. BDC enables cloud-based analyses of large datasets in a manner that simulates real-time data collection, such that BDC can be used to test algorithm performance on real and source-injected datasets. Using the BDC framework, a subset of the RadMAP datasets has been disseminated via the Gamma Ray Data Cloud (GRDC), hosted at the National Energy Research Scientific Computing Center (NERSC), enabling data access to over 40 users at 10 institutions.

  11. SU-C-12A-04: Diagnostic Imaging Research Using Decedents as a Proxy for the Living: Are Radiation Dosimetry and Tissue Property Measurements Affected by Post-Mortem Changes?

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval, D; Heintz, P [Department of Radiology University of New Mexico School of Medicine, Albuquerque, NM (United States); Weber, W; Melo, D [Lovelace Respiratory Research Institute, Albuquerque, New Mexico (United States); Adolphi, N; Hatch, P [Radiology-Pathology Center for Forensic Imaging, University of New Mexico School of Medicine, Albuquerque, New Mexico (United States)

    2014-06-01

    Purpose: Radiation dose (RD) from diagnostic imaging is a growing public health concern. Implanting dosimeters is a more accurate way to assess organ dose, relative to commonly used mathematical estimations. However, performing accurate dosimetry using live subjects is hindered by patient motion and safety considerations, which limit the RD and placement of implanted dosimeters. Performing multiple scans on the same subject would be the ideal way to assess the impact of dose reduction on image quality; however, performing multiple non-standard-of-care scans on live subjects for dosimetry and image quality measurements is generally prohibited by IRB committees. Our objective is to assess whether RD and tissue property (TP) measurements in post-mortem (PM) subjects are sufficiently similar to those in live subjects to justify the use of deceased subjects in future dosimetry and image quality studies. Methods: 4 MOSFET radiation dosimeters were placed enterically in each subject (2 sedated Rhesus Macaques) to measure the RD at 4 levels (carina, lung, heart, and liver) during CT scanning. The CT protocol was performed ante-mortem (AM) and 2 and 3 hours PM. For TP analysis, additional scans were taken at 24 hours PM. To compare AM and PM TP, regions-of-interest were drawn on selected organs and the average CT density with standard deviation (in units of HU) were taken; additionally, visual comparisons of images were made at each PM interval. Results: No significant difference was observed in 8 of 9 measurements comparing AM and PM RD. Only one measurement (liver of the first subject) showed a significant difference (7% lower on PM measurement), possibly due to subject re-positioning. Initial TP visual and quantitative analyses show little to no change PM. Conclusion: Our results suggest that realistic radiation dosimetry and image quality measurements based on tissue properties can be performed reliably on recently deceased subjects.

  12. A decision analysis framework for stakeholder involvement and learning in groundwater management

    Science.gov (United States)

    Karjalainen, T. P.; Rossi, P. M.; Ala-aho, P.; Eskelinen, R.; Reinikainen, K.; Kløve, B.; Pulido-Velazquez, M.; Yang, H.

    2013-12-01

    Multi-criteria decision analysis (MCDA) methods are increasingly used to facilitate both rigorous analysis and stakeholder involvement in natural and water resource planning. Decision-making in that context is often complex and multi-faceted with numerous trade-offs between social, environmental and economic impacts. However, practical applications of decision-support methods are often too technically oriented and hard to use, understand or interpret for all participants. The learning of participants in these processes is seldom examined, even though successful deliberation depends on learning. This paper analyzes the potential of an interactive MCDA framework, the decision analysis interview (DAI) approach, for facilitating stakeholder involvement and learning in groundwater management. It evaluates the results of the MCDA process in assessing land-use management alternatives in a Finnish esker aquifer area where conflicting land uses affect the groundwater body and dependent ecosystems. In the assessment process, emphasis was placed on the interactive role of the MCDA tool in facilitating stakeholder participation and learning. The results confirmed that the structured decision analysis framework can foster learning and collaboration in a process where disputes and diverse interests are represented. Computer-aided interviews helped the participants to see how their preferences affected the desirability and ranking of alternatives. During the process, the participants' knowledge and preferences evolved as they assessed their initial knowledge with the help of fresh scientific information. The decision analysis process led to the opening of a dialogue, showing the overall picture of the problem context and the critical issues for the further process.
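
    To illustrate the kind of ranking such a process produces, the sketch below applies a simple additive (weighted-sum) value model; the alternatives, criteria, scores and weights are all invented, and the actual decision analysis interview approach may use a different aggregation.

    import numpy as np

    # Hypothetical scores (0-1, higher is better) of three land-use alternatives on
    # three criteria, and stakeholder weights elicited in an interview.
    alternatives = ["status quo", "restricted extraction", "protection zone"]
    criteria = ["groundwater quality", "local economy", "ecosystem health"]
    scores = np.array([[0.4, 0.9, 0.3],
                       [0.7, 0.6, 0.6],
                       [0.9, 0.3, 0.9]])
    weights = np.array([0.5, 0.2, 0.3])   # must sum to 1

    overall = scores @ weights            # additive (weighted-sum) value model
    for alt, v in sorted(zip(alternatives, overall), key=lambda p: -p[1]):
        print(f"{alt}: {v:.2f}")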

  13. Partial autolysis of μ/m-calpain during post mortem aging of chicken muscle.

    Science.gov (United States)

    Zhao, Liang; Jiang, Nanqi; Li, Miaozhen; Huang, Ming; Zhou, Guanghong

    2016-12-01

    The objective of this study was to investigate changes occurring in μ/m-calpain in post mortem chicken muscles and to determine the origin of the unknown bands found in calpain casein zymography. The unknown bands were reported with slightly greater mobility compared to conventional μ/m-calpain bands in casein zymography. Identification of these bands was accomplished using native polyacrylamide gel electrophoresis, liquid chromatography tandem mass spectrometry and with protein phosphatase treatment. Results showed that the unknown bands were corresponding to μ/m-calpain, and dephosphorylation by protein phosphatase did not change their appearance. The calpain samples were then incubated with various concentrations of Ca 2+ to determine the relationship between changes in μ/m-calpain and the appearance of the unknown bands. The products of μ/m-calpain partial autolysis were found to be consistent with the appearance of the unknown bands. Therefore, the appearance of these bands did not result from phosphorylation of μ/m-calpain as previously hypothesized, but from partial autolysis of μ/m-calpain. Also their presence suggests that μ/m-calpain undergoes partial autolysis during aging which may play certain roles in meat quality improvement. © 2016 Japanese Society of Animal Science.

  14. Re-establishment of rigor mortis: evidence for a considerably longer post-mortem time span.

    Science.gov (United States)

    Crostack, Chiara; Sehner, Susanne; Raupach, Tobias; Anders, Sven

    2017-07-01

    Re-establishment of rigor mortis following mechanical loosening is used as part of the complex method for the forensic estimation of the time since death in human bodies and has formerly been reported to occur up to 8-12 h post-mortem (hpm). We recently described our observation of the phenomenon in up to 19 hpm in cases with in-hospital death. Due to the case selection (preceding illness, immobilisation), transfer of these results to forensic cases might be limited. We therefore examined 67 out-of-hospital cases of sudden death with known time points of death. Re-establishment of rigor mortis was positive in 52.2% of cases and was observed up to 20 hpm. In contrast to the current doctrine that a recurrence of rigor mortis is always of a lesser degree than its first manifestation in a given patient, muscular rigidity at re-establishment equalled or even exceeded the degree observed before dissolving in 21 joints. Furthermore, this is the first study to describe that the phenomenon appears to be independent of body or ambient temperature.

  15. Building a Conceptual Framework: Philosophy, Definitions, and Procedure

    OpenAIRE

    Yosef Jabareen

    2009-01-01

    In this paper the author proposes a new qualitative method for building conceptual frameworks for phenomena that are linked to multidisciplinary bodies of knowledge. First, he redefines the key terms of concept, conceptual framework, and conceptual framework analysis. Concept has some components that define it. A conceptual framework is defined as a network or a “plane” of linked concepts. Conceptual framework analysis offers a procedure of theorization for building conceptual frameworks base...

  16. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.

    Science.gov (United States)

    Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate

    2014-11-23

    Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases (Web of Science, Scopus and Google Scholar) with the facility for citation searching. Limits of English language and publication year (2006 to June 2013) were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution. When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. Prevailing wisdom

  17. Academic Libraries and Quality: An Analysis and Evaluation Framework

    Science.gov (United States)

    Atkinson, Jeremy

    2017-01-01

    The paper proposes and describes a framework for academic library quality to be used by new and more experienced library practitioners and by others involved in considering the quality of academic libraries' services and provision. The framework consists of eight themes and a number of questions to examine within each theme. The framework was…

  18. CLARA: CLAS12 Reconstruction and Analysis Framework

    Energy Technology Data Exchange (ETDEWEB)

    Gyurjyan, Vardan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)]; Matta, Sebastian Mancilla [Santa Maria U., Valparaiso, Chile]; Oyarzun, Ricardo [Santa Maria U., Valparaiso, Chile]

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and a service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven, data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework offers solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
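
    The flow-based-programming flavour can be conveyed with a toy pipeline of chained "services" over an event stream; this is purely illustrative Python, not the CLARA API (which is SOA-based and multilingual).

    # Each "service" consumes a stream of events and emits an enriched stream.
    def decode(stream):
        for raw in stream:
            yield {"event_id": raw, "hits": raw % 5}

    def track(events):
        for ev in events:
            ev["tracks"] = ev["hits"] // 2
            yield ev

    def histogram(events):
        counts = {}
        for ev in events:
            counts[ev["tracks"]] = counts.get(ev["tracks"], 0) + 1
        return counts

    # Chaining the services processes events one at a time, stream-style.
    print(histogram(track(decode(range(20)))))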

  19. A Framework for Analysis of Music Similarity Measures

    DEFF Research Database (Denmark)

    Jensen, Jesper Højvang; Christensen, Mads G.; Jensen, Søren Holdt

    2007-01-01

    To analyze specific properties of music similarity measures that the commonly used genre classification evaluation procedure does not reveal, we introduce a MIDI based test framework for music similarity measures. We introduce the framework by example and thus outline an experiment to analyze the...

  20. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    Science.gov (United States)

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction e-Health is an innovative way to make health services more effective and efficient, and its application is increasing worldwide. e-Health represents a substantial ICT investment and its failure usually results in substantial losses in time, money (including opportunity costs) and effort. Therefore it is important to assess e-health readiness prior to implementation. Several frameworks have been published on e-health readiness assessment, under various circumstances and geographical regions of the world. However, their utility for the developing world is unknown. Methods A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine if any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion Eight types of e-health readiness were identified and no paper directly addressed all of these. The frameworks were based upon varying assumptions and perspectives. There was no unifying theory underpinning the frameworks. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  1. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for the extraction of anatomical or functional structures in medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed
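
    A single active-contour (snake) run, the building block that DMA coordinates in arrays, can be sketched with scikit-image; the synthetic image and parameters below are illustrative only, not the DMA implementation.

    import numpy as np
    from skimage.draw import disk
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    # Synthetic image with a bright disk standing in for an anatomical structure.
    img = np.zeros((200, 200))
    rr, cc = disk((100, 100), 40)
    img[rr, cc] = 1.0
    img = gaussian(img, sigma=3)

    # Initial snake: a circle of points surrounding the object.
    s = np.linspace(0, 2 * np.pi, 200)
    init = np.column_stack([100 + 70 * np.sin(s), 100 + 70 * np.cos(s)])

    # Evolve the contour toward the object boundary.
    snake = active_contour(img, init, alpha=0.015, beta=10, gamma=0.001)
    print("snake points:", snake.shape)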

  2. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    Directory of Open Access Journals (Sweden)

    Kim Hyun

    2011-12-01

    Full Text Available Abstract Background Genome-scale metabolic network models have contributed to elucidating biological phenomena, and predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also become crucial for a better understanding of cellular physiology. Results We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism’s metabolism under perturbation. FMB reveals the direction of influence among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic module level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.
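
    The first step the abstract describes, grouping reactions whose flux variation patterns are positively correlated into modules, can be sketched as follows. This is an illustrative approximation with synthetic flux values, not the published FMB implementation, and the subsequent Bayesian network step over the resulting modules is omitted.

```python
# Illustrative module detection: cluster reactions by flux-correlation distance.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
fluxes = rng.normal(size=(30, 12))      # 30 reactions x 12 flux samples (synthetic)
corr = np.corrcoef(fluxes)              # reaction-reaction flux correlation
dist = 1.0 - corr                       # positively correlated -> small distance

# Condensed (upper-triangle) distance vector expected by scipy's linkage
iu = np.triu_indices_from(dist, k=1)
modules = fcluster(linkage(dist[iu], method="average"), t=0.8, criterion="distance")
print(sorted(set(modules)))             # module label assigned to each reaction
```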

  3. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    Science.gov (United States)

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

    Genome-scale metabolic network models have contributed to elucidating biological phenomena, and predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also become crucial for a better understanding of cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influence among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic module level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  4. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance in novel situations have made cognitive reliability an urgent and essential aspect of system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving that are difficult to recover from under the task constraints imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other hand, become the focal points of the proposed framework, which is illustrated in an analysis of a simulated emergency

  5. Applicability of long-term electroencephalography in pre-mortem diagnosis of Creutzfeldt-Jakob disease: A case report.

    Science.gov (United States)

    Attaripour Isfahani, Sanaz; Dougherty, Michelle; Gliebus, Gediminas Peter

    2017-01-01

    Creutzfeldt-Jakob disease accounts for more than 90% of all sporadic prion disease cases. The molecular MM2 genotype has been divided into cortical and thalamic subtypes based on the structures involved and is characterized clinically by progressive dementia without ataxia or typical electroencephalography changes. Proposed diagnostic criteria for MM2 cortical type sporadic Creutzfeldt-Jakob disease include progressive dementia, cortical hyper-intensity on diffusion-weighted magnetic resonance imaging, increased cerebrospinal fluid 14-3-3 protein level, and the exclusion of other types of dementia. The presence of periodic discharges on electroencephalography in the MM2 cortical type was reported in 42% of cases. We report a case of sporadic Creutzfeldt-Jakob disease cortical MM2-type presenting with rapid cognitive decline, in a patient who survived 8 months after symptom onset. Brain imaging, cerebrospinal fluid analysis, and long-term electroencephalography monitoring were obtained, and the diagnosis was confirmed by autopsy. Short-term electroencephalography recording, performed 5 months after symptom onset, demonstrated diffuse background slowing without epileptiform activity. Long-term video electroencephalography monitoring demonstrated generalized slowing, maximal in the bilateral frontal areas, which would intermittently become rhythmic (1-2 Hz) without hemispheric predominance. If the findings do not clearly meet the proposed clinical criteria for sporadic Creutzfeldt-Jakob disease, the use of long-term electroencephalography could increase the sensitivity. We question whether the lack of characteristic findings on electroencephalography in some cases could be due to insufficient recording time. The application of long-term electroencephalography monitoring increases the sensitivity of electroencephalography and the certainty of pre-mortem diagnosis of sporadic Creutzfeldt-Jakob disease.

  6. Evaluation and Policy Analysis: A Communicative Framework

    Directory of Open Access Journals (Sweden)

    Cynthia Wallat

    1997-07-01

    Full Text Available A major challenge for the next generation of students of human development is to help shape the paradigms by which we analyze and evaluate public policies for children and families. Advocates of building research and policy connections point to health care and stress experiences across home, school, and community as critical policy issues that expand the scope of contexts and outcomes studied. At a minimum, development researchers and practitioners will need to be well versed in available methods of inquiry; they will need to be "methodologically multilingual" when conducting evaluation and policy analysis, producing reports, and reporting their interpretations to consumer and policy audiences. This article suggests how traditional approaches to policy inquiry can be reconsidered in light of these research inquiry and communicative skills needed by all policy researchers. A fifteen year review of both policy and discourse processes research is presented to suggest ways to conduct policy studies within a communicative framework.

  7. AIGO: Towards a unified framework for the Analysis and the Inter-comparison of GO functional annotations

    Directory of Open Access Journals (Sweden)

    Defoin-Platel Michael

    2011-11-01

    Full Text Available Abstract Background In response to the rapid growth of available genome sequences, efforts have been made to develop automatic inference methods to functionally characterize them. Pipelines that infer functional annotation are now routinely used to produce new annotations at a genome scale and for a broad variety of species. These pipelines differ widely in their inference algorithms, confidence thresholds and data sources for reasoning. This heterogeneity makes a comparison of the relative merits of each approach extremely complex. The evaluation of the quality of the resultant annotations is also challenging given there is often no existing gold-standard against which to evaluate precision and recall. Results In this paper, we present a pragmatic approach to the study of functional annotations. An ensemble of 12 metrics, describing various aspects of functional annotations, is defined and implemented in a unified framework, which facilitates their systematic analysis and inter-comparison. The use of this framework is demonstrated on three illustrative examples: analysing the outputs of state-of-the-art inference pipelines, comparing electronic versus manual annotation methods, and monitoring the evolution of publicly available functional annotations. The framework is part of the AIGO library (http://code.google.com/p/aigo) for the Analysis and the Inter-comparison of the products of Gene Ontology (GO) annotation pipelines. The AIGO library also provides functionalities to easily load, analyse, manipulate and compare functional annotations and also to plot and export the results of the analysis in various formats. Conclusions This work is a step toward developing a unified framework for the systematic study of GO functional annotations. This framework has been designed so that new metrics on GO functional annotations can be added in a very straightforward way.

  8. Tissue Microarray Analysis Applied to Bone Diagenesis.

    Science.gov (United States)

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-04

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered.

  9. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Science.gov (United States)

    Convertino, Matteo; Valverde, L James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of

  10. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Directory of Open Access Journals (Sweden)

    Matteo Convertino

    Full Text Available Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the

  11. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management

    Science.gov (United States)

    Convertino, Matteo; Valverde, L. James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of

  12. Analysis Community’s Coping Strategies and Local Risk Governance Framework in Relation to Landslide

    Directory of Open Access Journals (Sweden)

    Heru Setiawan

    2014-12-01

    Full Text Available Analysis of people's perception and analysis of coping strategies related to landslides are the two elements essential to determining the level of community preparedness for landslides. To assess the preparedness of government and other stakeholders in facing landslides, an analysis of the risk governance framework was required. A survey using questionnaires with random sampling was applied to assess the level of people's perception and coping strategies related to landslides. Analysis of the risk governance framework was done at the district and sub-district level. The study found that people's perception of landslides was dominated by the high and moderate levels. Age and education are the two factors that influence people's perception of landslides. Local people applied four types of coping strategy: economic, structural, social and cultural. In total, 51.6% of respondents had a high level, 33.3% a moderate level and only 15.1% a low level of coping strategy. The factors that influence the level of coping strategy are education, income and building type. Analysis of the risk governance framework is limited to three components: stakeholder involvement, risk management and risk communication. Based on the data analysis, the level of stakeholder involvement within the district scope was categorized as moderate to high, and the level of stakeholder involvement at the sub-district level was categorized as high. Generally, the risk management of Karanganyar was categorized as moderate to high, and the risk management in Tawangmangu was categorized as moderate. There are some elements of the risk governance framework that must be improved: data management, the pattern of relationships among stakeholders, increased participation of NGOs, a constructed and updated landslide risk map, and an enhanced role of microfinance in helping the community when disaster strikes

  13. Using the framework method for the analysis of qualitative data in multi-disciplinary health research.

    Science.gov (United States)

    Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi

    2013-09-18

    The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.

  14. Validation of a Framework for Measuring Hospital Disaster Resilience Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Shuang Zhong

    2014-06-01

    Full Text Available Hospital disaster resilience can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one.” This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety, command, communication and cooperation system, disaster plan, resource stockpile, staff capability, disaster training and drills, emergency services and surge capability, and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely, emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
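
    The quoted scoring model can be applied directly; the sketch below implements the weighted sum F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4, with made-up factor scores for a hypothetical hospital.

```python
# Weighted combination of the four factor scores into the overall level F,
# using the weights quoted in the abstract; the example scores are invented.
WEIGHTS = {"F1": 0.615, "F2": 0.202, "F3": 0.103, "F4": 0.080}

def resilience_score(factor_scores: dict) -> float:
    """Combine the four factor scores into the overall resilience level F."""
    return sum(WEIGHTS[k] * factor_scores[k] for k in WEIGHTS)

example = {"F1": 0.7, "F2": 0.5, "F3": 0.8, "F4": 0.6}   # hypothetical hospital
print(round(resilience_score(example), 3))               # -> 0.662
```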

  15. [18F]FDDNP PET in Tauopathies: Correlation to post mortem Pathology in a Case of Progressive Supranuclear Palsy (PSP)

    Science.gov (United States)

    Villegas, Brendon Josef

    This investigation of [18F]FDDNP was conducted in an effort to confirm the presence of disease in a patient with Progressive Supranuclear Palsy (PSP) and to correlate the ante mortem PET scan results to the post mortem pathology. The immunohistochemical and immunofluorescent staining of Paired Helical Filamentous (PHF) tau (AT8) and Amyloid Beta (6F/3D) misfolded proteins demonstrated a widespread deposition in the cortical and subcortical nuclei, the white matter, cerebellar white matter and the medulla oblongata. The in vitro autoradiography demonstrated a neocortical signal comprised of well-delineated amyloid beta in the nucleated layers I/II and hyperphosphorylated tau in the deeper layers III through VI. The autoradiography was well correlated with the immunohistochemical staining in adjacent tissue slides. The binding of the parametric [18F]FDDNP distribution volume ratio (DVR) correlated well (Spearman's rho = 0.962, p = .004) with the deposition of tau but not with the presence of amyloid beta (Spearman's rho = -0.829, p = .041). The [18F]FDDNP DVR signal appears to be primarily due to the large amount of bound hyperphosphorylated tau (p-tau) and the amyloid beta negligibly contributes to the total signal. Unlabeled FDDNP was shown to bind to tau in the form of globose tangles in the rostral ventromedial medulla as confirmed with both Thioflavin S and PHF-tau Immunofluorescence. The binding of [18F]FDDNP to the human neuroanatomy was investigated in two cohorts of distinct tauopathies and compared to the binding in two tau-negative cohorts against control patients. A cohort of PSP patients (n = 12) with a mean age of 63.8 years and a cohort of Chronic Traumatic Encephalopathy (CTE) patients (n = 14) with a mean age of 58.1 years are both characterized by the presence of various degrees of tau pathology in their brains. The cohort of Parkinson's Disease (PD) patients (n = 16) with a mean age of 63.2 years is initially characterized by clinical symptoms

  16. Preliminary approach on early post mortem stress and quality indexes changes in large size bluefin tuna (Thunnus thynnus

    Directory of Open Access Journals (Sweden)

    R. Ugolini

    2010-01-01

    Full Text Available Bluefin tuna (Thunnus thynnus) is highly appreciated on the Japanese and US markets for the preparation of sushi and sashimi. The market price of the fresh product can vary from 8 to 33 Euro/kg (farm gate/producer prices) according to size, shape, fat level, meat colour, consistency and freshness (absence of “hyake”), all parameters strictly connected to feeding quality and quantity, rearing and killing stress factors, and refrigeration times and conditions after death. Excessive levels of stress during slaughtering can affect meat quality, contributing to a significant decrease in the tuna’s price. The present trial was carried out to evaluate the possible effect of harvesting/slaughtering stress on reared bluefin tuna meat quality, starting from the examination of changes in the most important stress and quality parameters during the early post mortem period.

  17. HIF1α protein and mRNA expression as a new marker for post mortem interval estimation in human gingival tissue.

    Science.gov (United States)

    Fais, Paolo; Mazzotti, Maria Carla; Teti, Gabriella; Boscolo-Berto, Rafael; Pelotti, Susi; Falconi, Mirella

    2018-06-01

    Estimating the post mortem interval (PMI) is still a crucial step in Forensic Pathology. Although several methods are available for assessing the PMI, a precise estimation is still quite unreliable and can be inaccurate. The present study aimed to investigate the immunohistochemical distribution and mRNA expression of hypoxia inducible factor (HIF-1α) in post mortem gingival tissues to establish a correlation between the presence of HIF-1α and the time since death, with the final goal of achieving a more accurate PMI estimation. Samples of gingival tissues were obtained from 10 cadavers at different PMIs (1-3 days, 4-5 days and 8-9 days), and were processed for immunohistochemistry and quantitative reverse transcription-polymerase chain reaction. The results showed a time-dependent correlation of HIF-1α protein and its mRNA with different times since death, which suggests that HIF-1α is a potential marker for PMI estimation. The results showed a high HIF-1α protein signal that was mainly localized in the stratum basale of the oral mucosa in samples collected at a short PMI (1-3 days). It gradually decreased in samples collected at a medium PMI (4-5 days), but it was not detected in samples collected at a long PMI (8-9 days). These results are in agreement with the mRNA data. These data indicate an interesting potential utility of Forensic Anatomy-based techniques, such as immunohistochemistry, as important complementary tools to be used in forensic investigations. © 2018 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  18. The Health System Dynamics Framework: The introduction of an analytical model for health system analysis and its application to two case-studies

    Directory of Open Access Journals (Sweden)

    J van Olmen

    2012-03-01

    Full Text Available Frameworks can clarify concepts and improve understanding of underlying mechanisms in the domain of health systems research and strengthening. Many existing frameworks have a limited capacity to analyze interactions and equilibria within a health system, overlooking values as an underlying steering mechanism. This paper introduces the health system dynamics framework and demonstrates its application as a tool for analysis and modelling. The added value of this framework is: (1) consideration of the different levels of a health system, tracing how interventions or events at one level influence other elements and other levels; (2) emphasis on the importance of values; (3) a central axis linking governance, human resources, service delivery and population; and (4) taking into account the key elements of complexity in analysis and strategy development. We urge the analysis of individual health systems and meta-analysis, for a better understanding of their functioning and strengthening.

  19. Understanding Universities in Ontario, Canada: An Industry Analysis Using Porter's Five Forces Framework

    Science.gov (United States)

    Pringle, James; Huisman, Jeroen

    2011-01-01

    In analyses of higher education systems, many models and frameworks are based on governance, steering, or coordination models. Although much can be gained by such analyses, we argue that the language used in the present-day policy documents (knowledge economy, competitive position, etc.) calls for an analysis of higher education as an industry. In…

  20. Big Data Based Analysis Framework for Product Manufacturing and Maintenance Process

    OpenAIRE

    Zhang , Yingfeng; Ren , Shan

    2015-01-01

    Part 8: Cloud-Based Manufacturing; International audience; With the wide use of smart sensor devices in product lifecycle management (PLM), a large amount of real-time and multi-source lifecycle big data is created. These data allow decision makers to make better-informed PLM decisions. In this article, an overview framework of big data based analysis for the product lifecycle (BDA-PL) was presented to provide a new paradigm by extending the techniques of the Internet of Things (IoT) and big data analys...

  1. Framework for analysis of solar energy systems in the built environment from an exergy perspective

    OpenAIRE

    Torio, H.; Schmidt, D.

    2010-01-01

    Exergy analysis is a more powerful tool than mere energy analysis for showing the improvement potential of energy systems. Direct use of solar radiation instead of degrading other high quality energy resources found in nature is advantageous. Yet, due to physical inconsistencies present in the exergy analysis framework for assessing direct-solar systems commonly found in literature, high exergy losses arise in the conversion process of solar radiation in direct-solar systems. However, these l...

  2. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    Directory of Open Access Journals (Sweden)

    Mohamed Elgendi

    2016-11-01

    Full Text Available Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages (“TERMA”) involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
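
    A hedged sketch of the TERMA idea, two moving averages whose windows obey (8 × W1) ≥ W2 ≥ (2 × W1), with blocks of interest declared where the short-window average exceeds the long-window one, is given below; it is a simplification using a synthetic signal, not the published implementation.

```python
# Two event-related moving averages over a synthetic peaky signal.
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def terma_blocks(signal, w1=11, w2=55):
    assert 2 * w1 <= w2 <= 8 * w1, "window sizes violate the TERMA inequality"
    ma_peak = moving_average(signal, w1)    # short window, follows peaks
    ma_event = moving_average(signal, w2)   # long window, follows events
    return ma_peak > ma_event               # boolean mask of candidate blocks

t = np.linspace(0, 2, 500)
peaky = np.sin(2 * np.pi * 8 * t) ** 20 + 0.05 * np.random.randn(t.size)
mask = terma_blocks(peaky)
print(int(mask.sum()), "samples inside candidate event blocks")
```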

  3. Diagnostic accuracy of post-mortem MRI for thoracic abnormalities in fetuses and children

    International Nuclear Information System (INIS)

    Arthurs, Owen J.; Thayyil, Sudhin; Addison, Shea; Olsen, Oystein E.; Wade, Angie; Jones, Rod; Norman, Wendy; Taylor, Andrew M.; Scott, Rosemary J.; Robertson, Nicola J.; Chitty, Lyn S.; Sebire, Neil J.; Owens, Catherine M.

    2014-01-01

    To compare the diagnostic accuracy of post-mortem magnetic resonance imaging (PMMR) specifically for non-cardiac thoracic pathology in fetuses and children, compared with conventional autopsy. Institutional ethics approval and parental consent was obtained. A total of 400 unselected fetuses and children underwent PMMR before conventional autopsy, reported blinded to the other dataset. Of 400 non-cardiac thoracic abnormalities, 113 (28 %) were found at autopsy. Overall sensitivity and specificity (95 % confidence interval) of PMMR for any thoracic pathology was poor at 39.6 % (31.0, 48.9) and 85.5 % (80.7, 89.2) respectively, with positive predictive value (PPV) 53.7 % (42.9, 64.0) and negative predictive value (NPV) 77.0 % (71.8, 81.4). Overall agreement was 71.8 % (67.1, 76.2). PMMR was most sensitive at detecting anatomical abnormalities, including pleural effusions and lung or thoracic hypoplasia, but particularly poor at detecting infection. PMMR currently has relatively poor diagnostic detection rates for the commonest intra-thoracic pathologies identified at autopsy in fetuses and children, including respiratory tract infection and diffuse alveolar haemorrhage. The reasonable NPV suggests that normal thoracic appearances at PMMR exclude the majority of important thoracic lesions at autopsy, and so could be useful in the context of minimally invasive autopsy for detecting non-cardiac thoracic abnormalities. (orig.)
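
    The reported figures follow from a standard 2 × 2 comparison of PMMR against autopsy; the sketch below shows the arithmetic behind sensitivity, specificity, PPV, NPV and agreement, using invented counts rather than the study data.

```python
# Diagnostic accuracy metrics from a 2x2 confusion matrix (hypothetical counts).
def diagnostic_accuracy(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "agreement": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only, not taken from the paper.
print(diagnostic_accuracy(tp=45, fp=39, fn=68, tn=248))
```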

  4. Diagnostic accuracy of post-mortem MRI for thoracic abnormalities in fetuses and children

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); UCL Institute of Child Health, London (United Kingdom); Thayyil, Sudhin; Addison, Shea [Imperial College London, Perinatal Neurology and Neonatology, London (United Kingdom); Olsen, Oystein E. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); Wade, Angie [UCL Institute of Child Health, Paediatric Epidemiology and Biostatistics Unit, London (United Kingdom); Jones, Rod; Norman, Wendy; Taylor, Andrew M. [UCL Institute of Cardiovascular Science, Centre for Cardiovascular Imaging, London (United Kingdom); Great Ormond Street Hospital for Children NHS Foundation Trust, Cardiorespiratory Division, London (United Kingdom); Scott, Rosemary J. [University College London Hospital NHS Trust, Department of Histopathology, London (United Kingdom); Robertson, Nicola J. [UCL Institute for Women' s Health, Academic Neonatology, London (United Kingdom); Chitty, Lyn S. [UCL Institute of Child Health, Genetics and Genomic Medicine, London (United Kingdom); UCLH NHS Foundation Trusts, London (United Kingdom); Sebire, Neil J. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Histopathology, London (United Kingdom); UCL Institute of Child Health, London (United Kingdom); Owens, Catherine M. [Great Ormond Street Hospital for Children NHS Foundation Trust, Department of Radiology, London (United Kingdom); UCL Institute of Cardiovascular Science, Centre for Cardiovascular Imaging, London (United Kingdom); Great Ormond Street Hospital for Children NHS Foundation Trust, Cardiorespiratory Division, London (United Kingdom); Collaboration: Magnetic Resonance Imaging Autopsy Study (MaRIAS) Collaborative Group

    2014-11-15

    To compare the diagnostic accuracy of post-mortem magnetic resonance imaging (PMMR) specifically for non-cardiac thoracic pathology in fetuses and children, compared with conventional autopsy. Institutional ethics approval and parental consent was obtained. A total of 400 unselected fetuses and children underwent PMMR before conventional autopsy, reported blinded to the other dataset. Of 400 non-cardiac thoracic abnormalities, 113 (28 %) were found at autopsy. Overall sensitivity and specificity (95 % confidence interval) of PMMR for any thoracic pathology was poor at 39.6 % (31.0, 48.9) and 85.5 % (80.7, 89.2) respectively, with positive predictive value (PPV) 53.7 % (42.9, 64.0) and negative predictive value (NPV) 77.0 % (71.8, 81.4). Overall agreement was 71.8 % (67.1, 76.2). PMMR was most sensitive at detecting anatomical abnormalities, including pleural effusions and lung or thoracic hypoplasia, but particularly poor at detecting infection. PMMR currently has relatively poor diagnostic detection rates for the commonest intra-thoracic pathologies identified at autopsy in fetuses and children, including respiratory tract infection and diffuse alveolar haemorrhage. The reasonable NPV suggests that normal thoracic appearances at PMMR exclude the majority of important thoracic lesions at autopsy, and so could be useful in the context of minimally invasive autopsy for detecting non-cardiac thoracic abnormalities. (orig.)

  5. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of automatically understanding, extracting, and processing textual data to obtain information. It can be used to gauge opinion on an issue and to identify responses to it. Vast amounts of digital data remain unused even though they could provide useful information, especially for government. Sentiment analysis in government is used to monitor the work programs of the government, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for implementing sentiment analysis of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can serve as a reference for decision making in local government.
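
    A minimal sketch of the kind of pipeline described (TF-IDF features feeding a Support Vector Machine, here scikit-learn's LinearSVC) is shown below; the labelled tweets are invented placeholders, not data from the Bandung study.

```python
# TF-IDF + linear SVM sentiment classifier on a toy Indonesian-language corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

tweets = ["pelayanan program ini bagus sekali",      # positive
          "program pemerintah sangat membantu",       # positive
          "jalan rusak tidak pernah diperbaiki",      # negative
          "pelayanan lambat dan mengecewakan"]        # negative
labels = ["pos", "pos", "neg", "neg"]

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(tweets, labels)
print(model.predict(["program ini sangat bagus"]))
```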

  6. A Comparative Analysis of Competency Frameworks for Youth Workers in the Out-of-School Time Field

    OpenAIRE

    Vance, Femi

    2010-01-01

    Research suggests that the quality of out-of-school time (OST) programs is related to positive youth outcomes and skilled staff are a critical component of high quality programming. This descriptive case study of competency frameworks for youth workers in the OST field demonstrates how experts and practitioners characterize a skilled youth worker. A comparative analysis of 11 competency frameworks is conducted to identify a set of common core competencies. A set of 12 competency areas that ar...

  7. A generic framework for the description and analysis of energy security in an energy system

    International Nuclear Information System (INIS)

    Hughes, Larry

    2012-01-01

    While many energy security indicators and models have been developed for specific jurisdictions or types of energy, few can be considered sufficiently generic to be applicable to any energy system. This paper presents a framework that attempts to meet this objective by combining the International Energy Agency's definition of energy security with structured systems analysis techniques to create three energy security indicators and a process-flow energy systems model. The framework is applicable to those energy systems which can be described in terms of processes converting or transporting flows of energy to meet the energy–demand flows from downstream processes. Each process affects the environment and is subject to jurisdictional policies. The framework can be employed to capture the evolution of energy security in an energy system by analyzing the results of indicator-specific metrics applied to the energy, demand, and environment flows associated with the system's constituent processes. Energy security policies are treated as flows to processes and classified into one of three actions affecting the process's energy demand or the process or its energy input, or both; the outcome is determined by monitoring changes to the indicators. The paper includes a detailed example of an application of the framework. - Highlights: ► The IEA's definition of energy security is parsed into three energy security indicators: availability, affordability, and acceptability. ► Data flow diagrams and other systems analysis tools can represent an energy system and its processes, flows, and chains. ► Indicator-specific metrics applied to a process's flow determine the state of energy security in an energy system, an energy chain, or process. ► Energy policy is considered as a flow and policy outcomes are obtained by measuring flows with indicator-specific metrics. ► The framework is applicable to most jurisdictions and energy types.
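
    A loose sketch of how the framework's vocabulary (processes carrying energy, demand and environment flows, scored against the three indicators) might be expressed as data structures is given below; the metric definitions and the numbers are invented placeholders, not the author's metrics.

```python
# Toy process-flow representation of an energy chain with indicator-style metrics.
from dataclasses import dataclass, field

@dataclass
class Process:
    name: str
    energy_in: float      # energy flow supplied to the process, e.g. PJ/yr
    demand: float         # demand flow from downstream processes
    emissions: float      # environment flow, e.g. Mt CO2/yr
    cost: float           # delivered cost, e.g. $/GJ

@dataclass
class EnergyChain:
    processes: list = field(default_factory=list)

    def indicators(self):
        supply = sum(p.energy_in for p in self.processes)
        need = sum(p.demand for p in self.processes)
        return {
            "availability": min(1.0, supply / need),               # can demand be met?
            "affordability": max(p.cost for p in self.processes),  # worst-case cost
            "acceptability": sum(p.emissions for p in self.processes),
        }

chain = EnergyChain([Process("refinery", 120, 100, 8.0, 4.5),
                     Process("distribution", 100, 95, 0.5, 1.2)])
print(chain.indicators())
```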

  8. eMZed: an open source framework in Python for rapid and interactive development of LC/MS data analysis workflows

    OpenAIRE

    Kiefer, P; Schmitt, U; Vorholt, J A

    2013-01-01

    Summary: The Python-based, open-source eMZed framework was developed for mass spectrometry (MS) users to create tailored workflows for liquid chromatography (LC)/MS data analysis. The goal was to establish a unique framework with comprehensive basic functionalities that are easy to apply and allow for the extension and modification of the framework in a straightforward manner. eMZed supports the iterative development and prototyping of individual evaluation strategies by providing a computing...

  9. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
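
    A small sketch of this kind of workflow, a dask-chunked xarray dataset reduced to a simple local indicator, is shown below; the synthetic temperature data, variable name and threshold are assumptions standing in for a real downscaled netCDF archive, not Four Twenty Seven's actual code.

```python
# Lazily chunked xarray DataArray reduced to a per-year "hot days" indicator.
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic stand-in for a downscaled daily-maximum-temperature dataset
# (in practice: xr.open_dataset("<file>.nc", chunks={"time": 365})).
time = pd.date_range("2000-01-01", "2009-12-31", freq="D")
data = 290 + 15 * np.random.rand(time.size, 4, 4)
tasmax = xr.DataArray(data, dims=("time", "lat", "lon"),
                      coords={"time": time,
                              "lat": np.linspace(36, 39, 4),
                              "lon": np.linspace(-124, -121, 4)},
                      name="tasmax").chunk({"time": 365})

# Example indicator: days per year above 35 degC (308.15 K) in each grid cell.
hot_days = (tasmax > 308.15).groupby("time.year").sum("time")
site = hot_days.sel(lat=37.8, lon=-122.3, method="nearest").compute()
print(site.values)
```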

  10. Analysis of lipid experiments (ALEX: a software framework for analysis of high-resolution shotgun lipidomics data.

    Directory of Open Access Journals (Sweden)

    Peter Husen

    Full Text Available Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1. The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  11. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Science.gov (United States)

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  12. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Full Text Available The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving a profound impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become the benchmark for future studies.
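
    The classification step highlighted in the abstract can be sketched with scikit-learn's logistic regression over per-app behavioural feature vectors; the feature matrix below is synthetic and merely stands in for features extracted by dynamic analysis, so this is an illustration rather than the SMARTbot implementation.

```python
# Logistic-regression classifier over synthetic per-app behavioural features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.random((200, 15))                  # 200 apps x 15 behavioural features
y = (X[:, 0] + X[:, 3] > 1.0).astype(int)  # 1 = botnet-like (synthetic label)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.3f}")
```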

  13. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving a profound impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become the benchmark for future studies.

  14. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving a profound impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing Artificial Neural Networks’ back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become the benchmark for future studies. PMID:26978523

  15. Overview of JET post-mortem results following the 2007-9 operational period, and comparisons with previous campaigns

    International Nuclear Information System (INIS)

    Coad, J P; Gruenhagen, S; Widdowson, A; Hole, D E; Hakola, A; Koivuranta, S; Likonen, J; Rubel, M

    2011-01-01

    In 2010, all the plasma-facing components were removed from JET so that the carbon-based surfaces could be replaced with beryllium (Be) or tungsten as part of the ITER-like wall (ILW) project. This gives unprecedented opportunities for post-mortem analyses of these plasma-facing surfaces; this paper reviews the data obtained so far and relates the information to studies of tiles removed during previous JET shutdowns. The general pattern of erosion/deposition at the JET divertor has been maintained, with deposition of impurities in the scrape-off layer (SOL) at the inner divertor and preferential removal of carbon and transport into the corner. However, the remaining films in the SOL contain very high Be/C ratios at the surface. The first measurements of erosion using a tile profiler have been completed, with up to 200 microns erosion being recorded at points on the inner wall guard limiters.

  16. Framework for Infectious Disease Analysis: A comprehensive and integrative multi-modeling approach to disease prediction and management.

    Science.gov (United States)

    Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark

    2017-12-01

    The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.

  17. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    Science.gov (United States)

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  18. The role of network theory and object-oriented modeling within a framework for the vulnerability analysis of critical infrastructures

    International Nuclear Information System (INIS)

    Eusgeld, Irene; Kroeger, Wolfgang; Sansavini, Giovanni; Schlaepfer, Markus; Zio, Enrico

    2009-01-01

    A framework for the analysis of the vulnerability of critical infrastructures has been proposed by some of the authors. The framework basically consists of two successive stages: (i) a screening analysis for identifying the parts of the critical infrastructure most relevant with respect to its vulnerability and (ii) a detailed modeling of the operational dynamics of the identified parts for gaining insights on the causes and mechanisms responsible for the vulnerability. In this paper, a critical presentation is offered of the results of a set of investigations aimed at evaluating the potentials of (i) using network analysis based on measures of topological interconnection and reliability efficiency, for the screening task; (ii) using object-oriented modeling as the simulation framework to capture the detailed dynamics of the operational scenarios involving the most vulnerable parts of the critical infrastructure as identified by the preceding network analysis. A case study based on the Swiss high-voltage transmission system is considered. The results are cross-compared and evaluated; the needs of further research are defined

  19. A multilevel evolutionary framework for sustainability analysis

    Directory of Open Access Journals (Sweden)

    Timothy M. Waring

    2015-06-01

    Full Text Available Sustainability theory can help achieve desirable social-ecological states by generalizing lessons across contexts and improving the design of sustainability interventions. To accomplish these goals, we argue that theory in sustainability science must (1) explain the emergence and persistence of social-ecological states, (2) account for endogenous cultural change, (3) incorporate cooperation dynamics, and (4) address the complexities of multilevel social-ecological interactions. We suggest that cultural evolutionary theory broadly, and cultural multilevel selection in particular, can improve on these fronts. We outline a multilevel evolutionary framework for describing social-ecological change and detail how multilevel cooperative dynamics can determine outcomes in environmental dilemmas. We show how this framework complements existing sustainability frameworks with a description of the emergence and persistence of sustainable institutions and behavior, a means to generalize causal patterns across social-ecological contexts, and a heuristic for designing and evaluating effective sustainability interventions. We support these assertions with case examples from developed and developing countries in which we track cooperative change at multiple levels of social organization as they impact social-ecological outcomes. Finally, we make suggestions for further theoretical development, empirical testing, and application.

  20. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  1. Post-mortem analysis on LiFePO4|Graphite cells describing the evolution & composition of covering layer on anode and their impact on cell performance

    Science.gov (United States)

    Lewerenz, Meinert; Warnecke, Alexander; Sauer, Dirk Uwe

    2017-11-01

    During cyclic aging of lithium-ion batteries, a μm-thick covering layer is found to form on top of the anode, facing the separator. In this work several post-mortem analyses of cyclically aged cylindrical LFP|Graphite cells are evaluated to give a detailed characterization of the covering layer and to find possible causes for the evolution of such a layer. Analyses of the layer with different methods show that it consists to a high percentage of plated active lithium, deposited Fe and products of a solid electrolyte interphase (SEI). The deposition is located mainly in the center of the cell, symmetrical to the coating direction. The origin of these depositions is assumed to lie in locally overcharged particles, Fe deposition or an inhomogeneous distribution of capacity density. As a secondary effect, the deposition on one side increases the thickness locally; thereafter a pressure-induced overcharging due to charge agglomeration on the back side of the anode occurs. Finally, a compact and dense covering layer in a late state of aging leads to deactivation of the covered parts of the anode and cathode due to suppressed lithium-ion conductivity. This leads to an increasing slope of capacity fade and an increase in internal resistance.

  2. Effect of pre-rigor stretch and various constant temperatures on the rate of post-mortem pH fall, rigor mortis and some quality traits of excised porcine biceps femoris muscle strips.

    Science.gov (United States)

    Vada-Kovács, M

    1996-01-01

    Porcine biceps femoris strips of 10 cm original length were stretched by 50% and fixed within 1 hr post mortem then subjected to temperatures of 4 °, 15 ° or 36 °C until they attained their ultimate pH. Unrestrained control muscle strips, which were left to shorten freely, were similarly treated. Post-mortem metabolism (pH, R-value) and shortening were recorded; thereafter ultimate meat quality traits (pH, lightness, extraction and swelling of myofibrils) were determined. The rate of pH fall at 36 °C, as well as ATP breakdown at 36 and 4 °C, were significantly reduced by pre-rigor stretch. The relationship between R-value and pH indicated cold shortening at 4 °C. Myofibrils isolated from pre-rigor stretched muscle strips kept at 36 °C showed the most severe reduction of hydration capacity, while paleness remained below extreme values. However, pre-rigor stretched myofibrils - when stored at 4 °C - proved to be superior to shortened ones in their extractability and swelling.

  3. Quantitative evaluation of volatile hydrocarbons in post-mortem blood in forensic autopsy cases of fire-related deaths.

    Science.gov (United States)

    Yonemitsu, Kosei; Sasao, Ako; Oshima, Toru; Mimasaka, Sohtaro; Ohtsu, Yuki; Nishitani, Yoko

    2012-04-10

    Volatile hydrocarbons in post-mortem blood from victims of fires were analyzed quantitatively by headspace gas chromatography mass spectrometry. The benzene and styrene concentrations in the blood were positively correlated with the carboxyhemoglobin (CO-Hb) concentration, which is evidence that the deceased inhaled the hydrocarbons and carbon monoxide simultaneously. By contrast, the concentrations of toluene and CO-Hb in the blood were not significantly correlated. This lack of correlation could be explained by two different sources of toluene, with low blood concentrations of toluene arising when the deceased inhaled smoke and high blood concentrations of toluene arising when the deceased inhaled petroleum vapor or other unknown vapors. The quantity of soot deposited in the respiratory tract was classified into four grades (-, 1+, 2+, 3+). The mean CO-Hb concentration in the 1+ soot group was significantly lower than those in the 2+ and 3+ groups, which may reflect the different types of smoke produced by different materials. For example, petroleum combustion with a limited supply of oxygen, like in a compartment fire, may produce a large volume of dense black smoke that contains a large quantity of soot. Soot deposits in the airways and the blood CO-Hb concentration are basic and essential autopsy findings that are used to investigate fire-related deaths. The quantitative GC-MS analysis of blood volatile hydrocarbons can provide additional useful information on the cause of the fire and the circumstances surrounding the death. In combination, these three findings are useful for the reconstruction of cases. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
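    The reported correlation analysis can be illustrated with a short sketch; the values below are invented placeholders rather than the study's measurements, and the use of scipy is an assumption.

```python
# Hedged sketch of correlating a blood hydrocarbon concentration with CO-Hb.
from scipy.stats import pearsonr

co_hb   = [12.0, 35.5, 48.0, 60.2, 72.1]   # carboxyhemoglobin saturation (%), made-up values
benzene = [0.05, 0.21, 0.30, 0.42, 0.55]   # blood benzene concentration, made-up values

r, p = pearsonr(co_hb, benzene)
print(f"r = {r:.2f}, p = {p:.3f}")          # a positive r supports simultaneous inhalation
```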

  4. ARTS: A System-Level Framework for Modeling MPSoC Components and Analysis of their Causality

    DEFF Research Database (Denmark)

    Mahadevan, Shankar; Storgaard, Michael; Madsen, Jan

    2005-01-01

    Designing complex heterogeneous multiprocessor System-on-Chip (MPSoC) requires support for modeling and analysis of the different layers, i.e. application, operating system (OS) and platform architecture. This paper presents an abstract system-level modeling framework, called ARTS, to support...

  5. Quantitative susceptibility mapping (QSM) as a means to measure brain iron? A post mortem validation study

    Science.gov (United States)

    Langkammer, Christian; Schweser, Ferdinand; Krebs, Nikolaus; Deistung, Andreas; Goessler, Walter; Scheurer, Eva; Sommer, Karsten; Reishofer, Gernot; Yen, Kathrin; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen R.

    2012-01-01

    Quantitative susceptibility mapping (QSM) is a novel technique which allows determining the bulk magnetic susceptibility distribution of tissue in vivo from gradient echo magnetic resonance phase images. It is commonly assumed that paramagnetic iron is the predominant source of susceptibility variations in gray matter as many studies have reported a reasonable correlation of magnetic susceptibility with brain iron concentrations in vivo. Instead of performing direct comparisons, however, all these studies used the putative iron concentrations reported in the hallmark study by Hallgren and Sourander (1958) for their analysis. Consequently, the extent to which QSM can serve to reliably assess brain iron levels is not yet fully clear. To provide such information we investigated the relation between bulk tissue magnetic susceptibility and brain iron concentration in unfixed (in situ) post mortem brains of 13 subjects using MRI and inductively coupled plasma mass spectrometry. A strong linear correlation between chemically determined iron concentration and bulk magnetic susceptibility was found in gray matter structures (r = 0.84, p < 0.001), whereas the correlation coefficient was much lower in white matter (r = 0.27, p < 0.001). The slope of the overall linear correlation was consistent with theoretical considerations of the magnetism of ferritin supporting that most of the iron in the brain is bound to ferritin proteins. In conclusion, iron is the dominant source of magnetic susceptibility in deep gray matter and can be assessed with QSM. In white matter regions the estimation of iron concentrations by QSM is less accurate and more complex because the counteracting contribution from diamagnetic myelinated neuronal fibers confounds the interpretation. PMID:22634862
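    The regional regression described (bulk susceptibility against chemically determined iron) can be sketched as follows; the numbers are placeholders and scipy is an assumed tool, not the authors' analysis pipeline.

```python
# Hedged sketch: linear fit of QSM susceptibility against ICP-MS iron concentration.
import numpy as np
from scipy.stats import linregress

iron_mg_per_kg = np.array([20.0, 50.0, 90.0, 130.0, 180.0])      # made-up regional iron values
susceptibility = np.array([0.010, 0.030, 0.070, 0.100, 0.140])   # made-up QSM values (ppm)

fit = linregress(iron_mg_per_kg, susceptibility)
print(f"slope = {fit.slope:.5f} ppm per (mg/kg), r = {fit.rvalue:.2f}")
```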

  6. Analysis of higher education policy frameworks for open and distance education in Pakistan.

    Science.gov (United States)

    Ellahi, Abida; Zaka, Bilal

    2015-04-01

    The constant rise in demand for higher education has become the biggest challenge for educational planners. This high demand has paved the way for distance education across the globe. This article innovatively analyzes the policy documentation of a major distance education initiative in Pakistan for validity that will identify the utility of policy linkages. The study adopted a qualitative research design that consisted of two steps. In the first step, a content analysis of the distance learning policy framework was made. For this purpose, two documents were accessed, titled "Framework for Launching Distance Learning Programs in HEIs of Pakistan" and "Guideline on Quality of Distance Education for External Students at the HEIs of Pakistan." In the second step, the policy guidelines mentioned in these two documents were evaluated at two levels. At the first level, the overall policy documents were assessed against a criterion proposed by Cheung, Mirzaei, and Leeder. At the second level, the proposed program of distance learning was assessed against a criterion set by Gellman-Danley and Fetzner and Berge. The distance education program initiative in Pakistan is of a promising nature and needs to be assessed regularly. This study has made an initial attempt to assess the policy document against a criterion identified from the literature. The analysis shows that the current policy documents do offer some strengths at this initial level; however, they cannot be considered a comprehensive policy guide. The inclusion or correction of missing or vague areas identified in this study would make this policy guideline document a valuable tool for the Higher Education Commission (HEC). For distance education policy makers, this distance education policy framework model recognizes several fundamental areas with which they should be concerned. The findings of this study in the light of two different policy framework measures highlight certain opportunities that can help strengthen the

  7. Stomach: ultrasonography evaluation and post mortem inspection in adult horses

    Directory of Open Access Journals (Sweden)

    Cristiano Chaves Pessoa da Veiga

    2014-06-01

    Full Text Available ABSTRACT. Veiga C.C.P., Cascon C.M., Souza B.G., Braga L.S.M., Souza V.C., Ferreira A.M.R. & Leite J.S. [Stomach: ultrasonography evaluation and post mortem inspection in adult horses.] Avaliação ultrassonográfica e anatomopatológica macroscópica do estômago de equinos destinados ao abate comercial. Revista Brasileira de Medicina Veterinária, 36(2):125-130, 2014. Instituto de Veterinária, Universidade Federal Rural do Rio de Janeiro, BR 465, km 7, Seropédica, 23890-000, RJ, Brasil. E-mail: radiovet@ufrrj.br The equine gastric ulcer syndrome (EGUS) includes all symptomatic or asymptomatic cases of erosions, ulcers, gastritis, gastric emptying disorders, duodenitis, duodenal ulcers and complications of these disorders. It occupies a prominent place in equine clinical practice and can lead to the death of the animal. Ultrasonography of the stomach is indicated when animals show clinical signs of gastric disease. The aim of this study was to describe the sonographic evaluation and macroscopic pathological findings of the stomach of adult horses intended for commercial slaughter. For this purpose, 39 horses intended for commercial slaughter were evaluated. Transabdominal sonographic evaluation of the stomach via the left side of the abdomen was performed before slaughter. After the slaughter of these animals their stomachs were collected, evaluated and photographed. The study concluded that ultrasonography identified the stomach in all animals evaluated, but did not allow a careful evaluation of the entire length of the viscera, especially the aglandular region and pleated border. All animals evaluated had gastric mucosal injury to different degrees. In the animals evaluated, the stomach region most affected by injuries was the glandular region, although the most severe lesions were found in the ruffled border adjacent to the aglandular region.

  8. Performance of post-mortem CT compared to autopsy in children.

    Science.gov (United States)

    Krentz, Beatriz V; Alamo, Leonor; Grimm, Jochen; Dédouit, Fabrice; Bruguier, Christine; Chevallier, Christine; Egger, Coraline; Da Silva, Luiz F F; Grabherr, Silke

    2016-07-01

    Radiological techniques such as non-enhanced post-mortem computed tomography (PMCT) play an increasingly important role in death investigations, especially in cases with a non-medicolegal context of death, where the consent of the next of kin is required to perform an autopsy. Such consent is often difficult to obtain for deceased children, and radiological methods may be an acceptable alternative. The aim of our study was to evaluate the performance of PMCT explorations compared to medicolegal conventional autopsies in children and its potential usefulness in non-medicolegal situations. We retrospectively reviewed a group of 26 children aged 0-12 years who died of different causes, which were investigated by both conventional autopsy and PMCT. We compared the findings extracted from radiological and autopsy reports. All findings were grouped according to their importance with respect to cause of death and to the anatomical structure they covered: organs, vascular system, soft tissue, and skeletal system. A significantly larger number of findings were detected by autopsy compared to PMCT. Autopsy proved to be superior to PMCT, notably at detecting organ, soft tissue, and vascular findings, while PMCT was superior at detecting bone findings. However, no statistically significant differences were found between the methods concerning the essential findings used to define the cause of death. In children, PMCT was less sensitive than conventional autopsy for detecting general findings. However, most essential findings were detected by both methods. PMCT was superior to autopsy for the detection of bone lesions in children. To date, very little literature exists concerning PMCT in children, especially in a forensic setting. This article investigates the advantages and limitations of PMCT compared to autopsy in a unique study group and discusses possibilities for future developments.

  9. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  10. Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework

    Science.gov (United States)

    2011-03-01

    Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework. Thesis, Jeffrey B. Scott. ABACAS offers a level of flexibility in software development that would be very useful later in the software engineering life cycle.

  11. A framework for noise-power spectrum analysis of multidimensional images

    International Nuclear Information System (INIS)

    Siewerdsen, J.H.; Cunningham, I.A.; Jaffray, D.A.

    2002-01-01

    A methodological framework for experimental analysis of the noise-power spectrum (NPS) of multidimensional images is presented that employs well-known properties of the n-dimensional (nD) Fourier transform. The approach is generalized to n dimensions, reducing to familiar cases for n=1 (e.g., time series) and n=2 (e.g., projection radiography) and demonstrated experimentally for two cases in which n=3 (viz., using an active matrix flat-panel imager for x-ray fluoroscopy and cone-beam CT to form three-dimensional (3D) images in spatiotemporal and volumetric domains, respectively). The relationship between fully nD NPS analysis and various techniques for analyzing a 'central slice' of the NPS is formulated in a manner that is directly applicable to measured nD data, highlights the effects of correlation, and renders issues of NPS normalization transparent. The spatiotemporal NPS of fluoroscopic images is analyzed under varying conditions of temporal correlation (image lag) to investigate the degree to which the NPS is reduced by such correlation. For first-frame image lag of ∼5-8 %, the NPS is reduced by ∼20% compared to the lag-free case. A simple model is presented that results in an approximate rule of thumb for computing the effect of image lag on NPS under conditions of spatiotemporal separability. The volumetric NPS of cone-beam CT images is analyzed under varying conditions of spatial correlation, controlled by adjustment of the reconstruction filter. The volumetric NPS is found to be highly asymmetric, exhibiting a ramp characteristic in transverse planes (typical of filtered back-projection) and a band-limited characteristic in the longitudinal direction (resulting from low-pass characteristics of the imager). Such asymmetry could have implications regarding the detectability of structures visualized in transverse versus sagittal or coronal planes. In all cases, appreciation of the full dimensionality of the image data is essential to obtaining
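    The general recipe described here — an ensemble of zero-mean noise realizations, an nD Fourier transform, squared modulus, averaging, and voxel-size normalization — can be sketched as below. This is a schematic illustration on synthetic noise, not the paper's measurement code, and the normalization convention may differ from the one used there.

```python
# Hedged sketch of an n-dimensional noise-power spectrum (NPS) estimate.
import numpy as np

def nps_nd(realizations, voxel_size):
    """realizations: (M, *shape) noise-only volumes; voxel_size: spacing per axis."""
    # remove the ensemble mean so only noise fluctuations remain
    realizations = realizations - realizations.mean(axis=0, keepdims=True)
    axes = tuple(range(1, realizations.ndim))
    n_vox = np.prod(realizations.shape[1:])
    spectra = np.abs(np.fft.fftn(realizations, axes=axes)) ** 2
    return spectra.mean(axis=0) * np.prod(voxel_size) / n_vox

volumes = np.random.default_rng(1).normal(size=(20, 32, 32, 32))  # 20 synthetic 3D noise volumes
nps = nps_nd(volumes, voxel_size=(0.5, 0.5, 0.5))
print(nps.shape)   # (32, 32, 32): the volumetric NPS on the 3D frequency grid
```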

  12. Sustainability principles in strategic environmental assessment: A framework for analysis and examples from Italian urban planning

    Energy Technology Data Exchange (ETDEWEB)

    Lamorgese, Lydia, E-mail: lydial@tin.it; Geneletti, Davide, E-mail: davide.geneletti@unitn.it

    2013-09-15

    This paper presents a framework for analysing the degree of consideration of sustainability principles in Strategic environmental assessment (SEA), and demonstrates its application to a sample of SEA of Italian urban plans. The framework is based on Gibson's (2006) sustainability principles, which are linked to a number of guidance criteria and eventually to review questions, resulting from an extensive literature review. A total of 71 questions are included in the framework, which gives particular emphasis to key concepts, such as intragenerational and intergenerational equity. The framework was applied to review the Environmental Report of the urban plans of 15 major Italian cities. The results of this review show that, even if sustainability is commonly considered as a pivotal concept, there is still work to be done in order to effectively integrate sustainability principles into SEA. In particular, most of the attention is given to mitigation and compensation measures, rather than to actual attempts to propose more sustainable planning decisions in the first place. Concerning the proposed framework of analysis, further research is required to clarify equity concerns and particularly to identify suitable indicators for operationalizing the concepts of intra/inter-generational equity in decision-making. -- Highlights: ► A framework was developed in order to evaluate planning against sustainability criteria. ► The framework was applied to analyse how sustainable principles are addressed in 15 Italian SEA reports. ► Over 85% of the reports addressed, to some extent, at least 40% of the framework questions. ► Criteria explicitly linked to intra and inter-generational equity are rarely addressed.

  13. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    Science.gov (United States)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2004-01-01

    The assembly and operation of the ISS has generated significant challenges that have ultimately impacted resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. The approach, content level of analysis and resulting outputs of these studies vary due to many factors, however, complicating the Program Manager's job of selecting the best option. To address this, the program requested a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consisted of a systematic methodology and a decision-support toolset. The framework provides quantifiable and repeatable means for ranking research-enhancing options for the complex and multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.

  14. Template security analysis of multimodal biometric frameworks based on fingerprint and hand geometry

    Directory of Open Access Journals (Sweden)

    Arvind Selwal

    2016-09-01

    Full Text Available Biometric systems are automatic tools used to provide authentication during various applications of modern computing. In this work, three different design frameworks for multimodal biometric systems based on fingerprint and hand geometry modalities are proposed. An analysis is also presented to diagnose various types of template security issues in the proposed system. Fuzzy analytic hierarchy process (FAHP) is applied with five decision parameters on all the designs, and framework 1 is found to be better in terms of template data security, template fusion and computational efficiency. It is noticed that template data security before storage in the database is a challenging task. An important observation is that a template may be secured at the feature fusion level and an indexing technique may be used to improve the size of secured templates.

  15. Runtime Detection Framework for Android Malware

    Directory of Open Access Journals (Sweden)

    TaeGuen Kim

    2018-01-01

    Full Text Available As the number of Android malware samples has increased rapidly over the years, various malware detection methods have been proposed. Existing methods can be classified into two categories: static analysis-based methods and dynamic analysis-based methods. Both approaches have limitations: static analysis-based methods are relatively easy to evade through transformation techniques such as junk instruction insertion, code reordering, and so on, while dynamic analysis-based methods incur relatively high analysis overheads and may require kernel modification to extract dynamic features. In this paper, we propose a dynamic analysis framework for Android malware detection that overcomes the aforementioned shortcomings. The framework uses a suffix tree that contains API (Application Programming Interface) subtraces and their probabilistic confidence values, generated using HMMs (Hidden Markov Models), to reduce the malware detection overhead, and we designed the framework with a client-server architecture since the suffix tree is infeasible to deploy on mobile devices. In addition, an application rewriting technique is used to trace API invocations without any modifications to the Android kernel. In our experiments, we measured the detection accuracy and the computational overheads to evaluate the effectiveness and efficiency of the proposed framework.
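    As a rough, hypothetical illustration of the lookup idea (stored API subtraces with HMM-derived confidence values), the sketch below substitutes a plain dictionary and a sliding-window scan for the paper's suffix tree; the subtraces and confidence values are invented.

```python
# Hedged sketch: score a runtime API trace against known malicious subtraces.
known_subtraces = {
    ("getDeviceId", "openConnection", "write"): 0.92,   # hypothetical HMM confidence values
    ("sendTextMessage", "deleteMessage"): 0.87,
}

def malware_score(trace):
    """Return the highest confidence of any stored subtrace found in the trace."""
    score = 0.0
    for sub, confidence in known_subtraces.items():
        n = len(sub)
        for i in range(len(trace) - n + 1):
            if tuple(trace[i:i + n]) == sub:
                score = max(score, confidence)
    return score

print(malware_score(["onCreate", "getDeviceId", "openConnection", "write"]))  # 0.92
```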

  16. CM-DataONE: A Framework for collaborative analysis of climate model output

    Science.gov (United States)

    Xu, Hao; Bai, Yuqi; Li, Sha; Dong, Wenhao; Huang, Wenyu; Xu, Shiming; Lin, Yanluan; Wang, Bin

    2015-04-01

    CM-DataONE is a distributed collaborative analysis framework for climate model data which aims to break through the data access barriers of increasing file size and to accelerate the research process. As the data size involved in projects such as the fifth Coupled Model Intercomparison Project (CMIP5) has reached petabytes, conventional methods for analysis and diagnosis of model outputs have become rather time-consuming and redundant. CM-DataONE is developed for data publishers and researchers from relevant areas. It enables easy access to distributed data and provides extensible analysis functions based on tools such as NCAR Command Language, NetCDF Operators (NCO) and Climate Data Operators (CDO). CM-DataONE can be easily installed, configured, and maintained. The main web application has two separate parts which communicate with each other through APIs based on the HTTP protocol. The analytic server is designed to be installed in each data node, while a data portal can be configured anywhere and connect to the nearest node. Functions such as data query, analytic task submission, status monitoring, visualization and product downloading are provided to end users by the data portal. Data conforming to the CMIP5 Model Output Format in each peer node can be scanned by the server and mapped to a global information database. A scheduler included in the server is responsible for task decomposition, distribution and consolidation. Analysis functions are always executed where the data are located. The analysis function package included in the server provides commonly used functions such as EOF analysis, trend analysis and time series analysis. Functions are coupled with data by XML descriptions and can be easily extended. Various types of results can be obtained by users for further studies. This framework has significantly decreased the amount of data to be transmitted and improved efficiency in model intercomparison jobs by supporting online analysis and multi-node collaboration. To end users, data query is

  17. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
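    The core idea — identifying a linear one-step operator directly from a time series, as in dynamic mode decomposition — can be shown in a minimal sketch; this is a toy illustration, not the authors' implementation.

```python
# Hedged sketch: DMD-style estimation of a linear Koopman-type model x_{k+1} ≈ A x_k.
import numpy as np

rng = np.random.default_rng(2)
A_true = np.array([[0.95, 0.10],
                   [-0.10, 0.95]])
x = np.zeros((200, 2))
x[0] = [1.0, 0.0]
for k in range(199):
    x[k + 1] = A_true @ x[k] + 0.01 * rng.normal(size=2)   # synthetic time series

X, Y = x[:-1].T, x[1:].T                 # snapshot matrices
A_hat = Y @ np.linalg.pinv(X)            # least-squares one-step operator
print(np.round(np.linalg.eigvals(A_hat), 3))   # Koopman/DMD eigenvalues of the fitted model
```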

  18. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    Science.gov (United States)

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure
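    The clustering step on gene-by-timepoint expression profiles can be sketched as below; the synthetic "early transient" and "downregulated" profiles and the use of scikit-learn are illustrative assumptions only.

```python
# Hedged sketch: k-means clustering of expression time profiles (genes x days).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
days = np.arange(11)                                               # an 11-day series, as in the study
early_transient = np.exp(-((days - 2) ** 2) / 4) + 0.1 * rng.normal(size=(40, 11))
downregulated = np.linspace(1.0, 0.0, 11) + 0.1 * rng.normal(size=(40, 11))
profiles = np.vstack([early_transient, downregulated])             # 80 synthetic genes

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
print(np.bincount(labels))   # cluster sizes; real data would use more clusters and genes
```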

  19. Avaliação da urina e de leucócitos como amostras biológicas para a detecção ante mortem do vírus da cinomose canina por RT-PCR em cães naturalmente infectados Evaluation of the urine and leucocytes as biological samples for ante mortem detection of canine distemper virus by RT-PCR assay in naturally infected dogs

    Directory of Open Access Journals (Sweden)

    F.J. Negrão

    2007-02-01

    Full Text Available Urine and leucocytes were comparatively evaluated as clinical samples for ante mortem detection of canine distemper virus (CDV) by a reverse transcription-polymerase chain reaction (RT-PCR) assay. One hundred and eighty-eight dogs with clinical symptoms of distemper were distributed into three groups. Group A consisted of 93 dogs with systemic signs of distemper; group B of 11 dogs with neurological signs; and group C of 84 dogs that presented simultaneously systemic and neurological signs. In 66.5% (125/188) of the dogs, an amplicon of 287 base pairs of the CDV nucleoprotein gene was amplified. In 60.8% (76/125) of the animals the CDV was detected simultaneously in urine and leucocytes, and in 39.2% (49/125) of the dogs only one type of clinical sample (urine: n=37; leucocytes: n=12) was positive. These results demonstrate that the different forms of clinical distemper can hinder the choice of a single type of clinical sample for the ante mortem etiological diagnosis of CDV infection, and false-negative results can be generated.

  20. A framework for adaptive e-learning for continuum mechanics and structural analysis

    OpenAIRE

    Mosquera Feijoo, Juan Carlos; Plaza Beltrán, Luis Francisco; González Rodrigo, Beatriz

    2015-01-01

    This paper presents a project for providing the students of Structural Engineering with the flexibility to learn outside classroom schedules. The goal is a framework for adaptive E-learning based on a repository of open educational courseware with a set of basic Structural Engineering concepts and fundamentals. These are paramount for students to expand their technical knowledge and skills in structural analysis and design of tall buildings, arch-type structures as well as bridges. Thus, conc...

  1. FireCalc: An XML-based framework for distributed data analysis

    International Nuclear Information System (INIS)

    Duarte, A.S.; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F.

    2008-01-01

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. This is used by the user to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which unbinds them from specific solutions. Currently there is an implementation of the FireCalc framework in Java, that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis
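    FireCalc itself is implemented in Java, but the client-server XML-RPC pattern it describes can be sketched in a few lines. The Python sketch below is a hypothetical illustration of a server exposing an analysis call, not FireCalc code.

```python
# Hedged sketch: an XML-RPC server exposing a single analysis entry point.
from xmlrpc.server import SimpleXMLRPCServer

def run_analysis(script, data):
    # Placeholder for handing the script and data to an interpreter (e.g. Scilab).
    return {"status": "ok", "n_points": len(data)}

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(run_analysis, "run_analysis")
# server.serve_forever()   # a client would then call, from any language:
# import xmlrpc.client
# proxy = xmlrpc.client.ServerProxy("http://localhost:8000")
# print(proxy.run_analysis("mean(x)", [1, 2, 3]))
```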

  2. FireCalc: An XML-based framework for distributed data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Duarte, A.S. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)], E-mail: andre.duarte@cfn.ist.utl.pt; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. This is used by the user to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which unbinds them from specific solutions. Currently there is an implementation of the FireCalc framework in Java, that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis.

  3. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    Science.gov (United States)

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  4. PyPWA: A partial-wave/amplitude analysis software framework

    Science.gov (United States)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for Partial Wave and Amplitude Analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: one general shell where amplitude parameters (or any parametric model) are to be estimated from the data. This branch also includes software to produce simulated data-sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface into the computer resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.

  5. Crystallization Kinetics within a Generic Modelling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist

    2013-01-01

    An existing generic modelling framework has been expanded with tools for kinetic model analysis. The analysis of kinetics is carried out within the framework where kinetic constitutive models are collected, analysed and utilized for the simulation of crystallization operations. A modelling...... procedure is proposed to gain the information of crystallization operation kinetic model analysis and utilize this for faster evaluation of crystallization operations....

  6. High resolution post-mortem MRI of non-fixed in situ foetal brain in the second trimester of gestation. Normal foetal brain development

    Energy Technology Data Exchange (ETDEWEB)

    Scola, Elisa; Palumbo, Giovanni; Avignone, Sabrina; Cinnante, Claudia Maria [Fondazione IRCCS Ca Granda Ospedale Maggiore Policlinico, Neuroradiology Unit, Milan (Italy); Conte, Giorgio [Fondazione IRCCS Ca Granda Ospedale Maggiore Policlinico, Neuroradiology Unit, Milan (Italy); Universita degli Studi di Milano, Postgraduation School in Radiodiagnostics, Milan (Italy); Boito, Simona; Persico, Nicola [Fondazione IRCCS Ca Granda Ospedale Maggiore Policlinico, Department of Obstetrics and Gynaecology ' L. Mangiagalli' , Milan (Italy); Rizzuti, Tommaso [Fondazione IRCCS Ca Granda Ospedale Maggiore Policlinico, Pathology Unit, Milan (Italy); Triulzi, Fabio [Fondazione IRCCS Ca Granda Ospedale Maggiore Policlinico, Neuroradiology Unit, Milan (Italy); Universita degli Studi di Milano, Department of Pathophysiology and Transplantation, Milan (Italy)

    2018-01-15

    To describe normal foetal brain development with high resolution post-mortem MRI (PMMRI) of non-fixed foetal brains. We retrospectively collected PMMRIs of foetuses without intracranial abnormalities and chromosomal aberrations studied after a termination of pregnancy due to extracranial abnormalities or after a spontaneous intrauterine death. PMMRIs were performed on a 3-T scanner without any fixation and without removing the brain from the skull. All PMMRIs were evaluated in consensus by two neuroradiologists. Our analysis included ten PMMRIs (median gestational age (GA): 21 weeks; range: 17-28 weeks). At 19 and 20 weeks of GA, the corticospinal tracts are recognisable in the medulla oblongata, becoming less visible from 21 weeks. Prior to 20 weeks the posterior limb of the internal capsule (PLIC) is more hypointense than surrounding deep grey nuclei; starting from 21 weeks the PLIC becomes isointense, and is hyperintense at 28 weeks. From 19-22 weeks, the cerebral hemispheres show transient layers: marginal zone, cortical plate, subplate, and intermediate, subventricular and germinal zones. PMMRI of non-fixed in situ foetal brains preserves the natural tissue contrast and skull integrity. We assessed foetal brain development in a small cohort of foetuses, focusing on 19-22 weeks of gestation. (orig.)

  7. High resolution post-mortem MRI of non-fixed in situ foetal brain in the second trimester of gestation. Normal foetal brain development

    International Nuclear Information System (INIS)

    Scola, Elisa; Palumbo, Giovanni; Avignone, Sabrina; Cinnante, Claudia Maria; Conte, Giorgio; Boito, Simona; Persico, Nicola; Rizzuti, Tommaso; Triulzi, Fabio

    2018-01-01

    To describe normal foetal brain development with high resolution post-mortem MRI (PMMRI) of non-fixed foetal brains. We retrospectively collected PMMRIs of foetuses without intracranial abnormalities and chromosomal aberrations studied after a termination of pregnancy due to extracranial abnormalities or after a spontaneous intrauterine death. PMMRIs were performed on a 3-T scanner without any fixation and without removing the brain from the skull. All PMMRIs were evaluated in consensus by two neuroradiologists. Our analysis included ten PMMRIs (median gestational age (GA): 21 weeks; range: 17-28 weeks). At 19 and 20 weeks of GA, the corticospinal tracts are recognisable in the medulla oblongata, becoming less visible from 21 weeks. Prior to 20 weeks the posterior limb of the internal capsule (PLIC) is more hypointense than surrounding deep grey nuclei; starting from 21 weeks the PLIC becomes isointense, and is hyperintense at 28 weeks. From 19-22 weeks, the cerebral hemispheres show transient layers: marginal zone, cortical plate, subplate, and intermediate, subventricular and germinal zones. PMMRI of non-fixed in situ foetal brains preserves the natural tissue contrast and skull integrity. We assessed foetal brain development in a small cohort of foetuses, focusing on 19-22 weeks of gestation. (orig.)

  8. The application of language-game theory to the analysis of science learning: Developing an interpretive classroom-level learning framework

    Science.gov (United States)

    Ahmadibasir, Mohammad

    In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanation power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication. At the end of the

  9. An Integrated Strategy Framework (ISF) for Combining Porter's 5-Forces, Diamond, PESTEL, and SWOT Analysis

    OpenAIRE

    Anton, Roman

    2015-01-01

    INTRODUCTION Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy framework (ISF) combines all major concepts. PURPOSE Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy fr...

  10. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    Science.gov (United States)

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit to the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but exceptionally time consuming and is not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).

  11. The Water-Energy-Food Security Nexus through the Lenses of the Value Chain and the Institutional Analysis and Development Frameworks

    Directory of Open Access Journals (Sweden)

    Sergio Villamayor-Tomas

    2015-02-01

    Full Text Available A number of frameworks have been used to study the water-food-energy nexus, but few of these consider the role of institutions in mediating environmental outcomes. In this paper we aim to start filling that gap by combining insights from the Institutional Analysis and Development (IAD) framework and value chain analysis. Specifically we study food, energy and water value chains as networks of action situations (NAS) where actors' decisions depend not only on the institutional structure of a particular situation but also on the decisions made in related situations. Although the IAD framework has developed a solid reputation in the policy sciences, empirical applications of the related NAS concept are rare. Value-chain analysis can help draw the empirical boundaries of NAS as embedded in production processes. In this paper we first use value-chain analysis to identify important input-output linkages among water, food and energy production processes, and then apply the IAD-NAS approach to better understand the effect of institutions within and across those processes. The resulting combined framework is then applied to four irrigation-related case studies including: the use of energy for water allocation and food production in an irrigation project in Spain; the production and allocation of treated water for food and bioenergy production in Germany; the allocation of water for food production and urban use in Kenya; and the production and allocation of energy for food production in Hyderabad, India. The case analyses reveal the value of the framework by demonstrating the importance of establishing linkages across energy, water and food-related situations and the ways in which institutions limit or facilitate synergies along the value chains.

  12. Is survival improved by the use of NIV and PEG in amyotrophic lateral sclerosis (ALS)? A post-mortem study of 80 ALS patients.

    Science.gov (United States)

    Burkhardt, Christian; Neuwirth, Christoph; Sommacal, Andreas; Andersen, Peter M; Weber, Markus

    2017-01-01

    Non-invasive ventilation (NIV) and percutaneous gastrostomy (PEG) are guideline-recommended interventions for symptom management in amyotrophic lateral sclerosis (ALS). Their effect on survival is controversial and the impact on causes of death is unknown. To investigate the effect of NIV and PEG on survival and causes of death in ALS patients, eighty deceased ALS patients underwent a complete post mortem analysis for causes of death between 2003 and 2015. Forty-two of these patients consented to genetic testing. Effects of NIV and PEG on survival and causes of death were analyzed in a multivariable Cox proportional hazard regression. Six patients, who requested assisted suicide causing drug-induced hypoxia, were excluded from the final analysis. Respiratory failure was the main cause of death in 72 out of 74 patients. Fifteen out of 74 died of aspiration pneumonia, 23/74 of bronchopneumonia, and 8/74 of a combination of aspiration pneumonia and bronchopneumonia. Twenty died of hypoxia without concomitant infection, and six patients had pulmonary embolism alone or in combination with pneumonia. NIV (p = 0.01) and PEG were associated with prolonged survival. In NIV patients, bronchopneumonia was significantly more frequent as a cause of death than in non-NIV patients, and this effect was even more pronounced in limb onset patients. NIV and PEG prolong survival in ALS. This study supports current AAN and EFNS guidelines, which recommend NIV and PEG as treatment options in ALS. The risk of bronchopneumonia as cause of death may be increased by NIV.
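    The multivariable Cox proportional hazard regression mentioned here can be sketched as below; the data frame is invented and the lifelines package is an assumption, since the study does not name its software.

```python
# Hedged sketch: Cox proportional hazards model for survival with NIV/PEG covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "survival_months": [18, 30, 24, 41, 12, 36, 22, 28, 15, 33],   # made-up follow-up times
    "death_observed":  [1, 1, 1, 1, 1, 0, 1, 1, 1, 0],
    "niv":             [0, 1, 0, 1, 0, 1, 1, 0, 0, 1],             # non-invasive ventilation used
    "peg":             [0, 1, 1, 1, 0, 0, 1, 0, 1, 1],             # gastrostomy placed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="death_observed")
cph.print_summary()   # hazard ratios for NIV and PEG on this toy data
```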

  13. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    Science.gov (United States)

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
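    Steps 3-6 (scoring study quality and relevance, deriving LoE-specific strength, and integrating across weighted LoEs) lend themselves to a small numerical sketch; the LoEs, weights, scores and aggregation rule below are illustrative assumptions, not the published scoring sheets.

```python
# Hedged sketch of the quantitative aggregation in a QWoE-style evaluation.
lines_of_evidence = {
    "in vitro 'omics": {"weight": 0.4, "studies": [(3, 4), (2, 3)]},         # (quality, relevance), each 0-4
    "in vivo apical":  {"weight": 0.6, "studies": [(4, 3), (3, 3), (2, 2)]},
}

overall = 0.0
for name, loe in lines_of_evidence.items():
    study_strengths = [q * r / 16 for q, r in loe["studies"]]   # normalise each study to 0-1
    loe_strength = sum(study_strengths) / len(study_strengths)
    overall += loe["weight"] * loe_strength
    print(f"{name}: strength of evidence = {loe_strength:.2f}")

print(f"overall strength of evidence = {overall:.2f}")
```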

  14. eXframe: reusable framework for storage, analysis and visualization of genomics experiments

    Directory of Open Access Journals (Sweden)

    Sinha Amit U

    2011-11-01

    Full Text Available Abstract Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides (1) the ability to publish structured data compliant with accepted standards, (2) support for multiple data types including microarrays and next generation sequencing, and (3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples), and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own

  15. Towards a Cloud Based Smart Traffic Management Framework

    Science.gov (United States)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges such as the heterogeneity, storage, management, processing and analysis of traffic big data may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework can answer the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network data in a distributed environment. Our evaluation results indicate that the speed of data import into this framework exceeds 8000 records per second when the size of the dataset is near 5 million. We also evaluated the performance of data retrieval in our proposed framework. The data retrieval speed exceeds 15000 records per second when the size of the dataset is near 5 million. We have also evaluated the scalability and performance of our proposed framework using parallelisation of a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  16. Layering of stomach contents in drowning cases in post-mortem computed tomography compared to forensic autopsy.

    Science.gov (United States)

    Gotsmy, Walther; Lombardo, Paolo; Jackowski, Christian; Brencicova, Eva; Zech, Wolf-Dieter

    2018-04-24

    In forensic autopsy, the analysis of stomach contents is important when investigating drowning cases. Three-layering of stomach contents may be interpreted as a diagnostic hint to drowning due to the swallowing of larger amounts of water or other drowning media. The authors have frequently encountered discrepancies in the number of stomach content layers in drowning cases between post-mortem computed tomography (PMCT) and autopsy in forensic casework. Therefore, the goal of this study was to compare the layering of stomach contents in drowning cases between PMCT and forensic autopsy. Drowning cases (n = 55; 40 male, 15 female, mean age 45.3 years; mean amount of stomach content 223 ml) that received PMCT prior to forensic autopsy were retrospectively analyzed by a forensic pathologist and a radiologist. The number of layers of stomach content in PMCT was compared to the number of layers at forensic autopsy. In 28 of the 55 evaluated drowning cases, a discrepancy between the layering of stomach contents at autopsy and in PMCT was observed: 1 layer at autopsy (n = 28): 50% discrepancy to PMCT, 2 layers (n = 20): 45% discrepancy, and 3 layers (n = 7): 71.4% discrepancy. The sensitivity of correctly determining layering (as observed at forensic autopsy) in PMCT was 52% (positive predictive value 44.8%). Specificity was 46.6% (negative predictive value 53.8%). In a control group (n = 35) of non-drowning cases, three-layering of stomach contents was not observed. Discrepancies in the observed number of stomach content layers between PMCT and forensic autopsy are a frequent finding, possibly due to the stomach content sampling technique at autopsy and movement of the corpse prior to PMCT and autopsy. Three-layering in PMCT, if indeed present, may be interpreted as a hint to drowning.

  17. A semi-automated method for non-invasive internal organ weight estimation by post-mortem magnetic resonance imaging in fetuses, newborns and children

    International Nuclear Information System (INIS)

    Thayyil, Sudhin; Schievano, Silvia; Robertson, Nicola J.; Jones, Rodney; Chitty, Lyn S.; Sebire, Neil J.; Taylor, Andrew M.

    2009-01-01

    Magnetic resonance (MR) imaging allows minimally invasive autopsy, especially when consent is declined for traditional autopsy. Estimation of individual visceral organ weights is an important component of traditional autopsy. Objective: To examine whether a semi-automated method can be used for non-invasive internal organ weight measurement using post-mortem MR imaging in fetuses, newborns and children. Methods: Phase 1: In vitro scanning of 36 animal organs (heart, liver, kidneys) was performed to check the accuracy of the volume reconstruction methodology. Real volumes were measured by the water displacement method. Phase 2: Sixty-five whole-body post-mortem MR scans were performed in fetuses (n = 30), newborns (n = 5) and children (n = 30) at 1.5 T using a 3D TSE T2-weighted sequence. These data were analysed offline using the image processing software Mimics 11.0. Results: Phase 1: Mean differences (S.D.) between estimated and actual volumes were -0.3 (1.5) ml for kidney, -0.7 (1.3) ml for heart and -1.7 (3.6) ml for liver in the animal experiments. Phase 2: In fetuses, newborns and children, mean differences (S.D.) between estimated and actual weights were -0.6 (4.9) g for liver, -5.1 (1.2) g for spleen, -0.3 (0.6) g for adrenals, 0.4 (1.6) g for thymus, 0.9 (2.5) g for heart, -0.7 (2.4) g for kidneys and 2.7 (14) g for lungs. Excellent correlation was noted between estimated and actual weights (r² = 0.99, p < 0.001). Accuracy was lower when fetuses were less than 20 weeks or less than 300 g. Conclusion: Rapid, accurate and reproducible estimation of solid internal organ weights is feasible using the semi-automated 3D volume reconstruction method.
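    The core of the volumetric approach above is the conversion of a segmented organ volume into an estimated weight. A minimal sketch of that conversion is given below; the voxel-counting logic and the assumed soft-tissue density of 1.05 g/ml are illustrative choices and are not taken from the study.

      import numpy as np

      def organ_weight_grams(mask, voxel_size_mm, density_g_per_ml=1.05):
          """Estimate organ weight from a binary segmentation mask of a 3D MR volume.

          mask             : 3D array, True/1 inside the segmented organ
          voxel_size_mm    : (dx, dy, dz) voxel dimensions in millimetres
          density_g_per_ml : assumed soft-tissue density (illustrative value)
          """
          voxel_volume_ml = np.prod(voxel_size_mm) / 1000.0   # mm^3 -> ml
          volume_ml = np.count_nonzero(mask) * voxel_volume_ml
          return volume_ml * density_g_per_ml

      # Toy example: a 40 x 40 x 40 voxel "organ" at 1 mm isotropic resolution
      mask = np.zeros((60, 60, 60), dtype=bool)
      mask[10:50, 10:50, 10:50] = True
      print(round(organ_weight_grams(mask, (1.0, 1.0, 1.0)), 1))  # 67.2 g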

  18. Vital analysis: field validation of a framework for annotating biological signals of first responders in action.

    Science.gov (United States)

    Gomes, P; Lopes, B; Coimbra, M

    2012-01-01

    First responders are professionals who are exposed to extreme stress and fatigue during extended periods of time. That is why it is necessary to research and develop technological solutions, based on wearable sensors, that can continuously monitor the health of these professionals in action, namely their stress and fatigue levels. In this paper we present the Vital Analysis smartphone-based framework, integrated into the broader Vital Responder project, which allows the annotation and contextualization of the signals collected during real action. After a contextual study, we implemented and deployed this framework in a firefighter team of 5 members, from which we collected over 3300 hours of annotations during 174 days, covering 382 different events. The results are analysed and discussed, validating the framework as a useful and usable tool for annotating biological signals of first responders in action.

  19. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constantinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that the current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods used to model the human factor have not taken into account human performance and reliability as they have been observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human reliability analysis that includes specific data from operational events as well as psychological models of human behaviour. This method introduces new elements such as the unsafe action and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: - identification of the unsafe actions; - classification of the unsafe actions into a category, omission or commission; - establishing the type of error corresponding to the unsafe action: slip, lapse, mistake and circumvention; - establishing the influence of performance shaping factors and some corrective actions. (authors)

  20. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, cell phones, and even more critical activities like aeronautics and health sciences. In this context, software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area of software engineering concerned with the application of diverse techniques in order to prove the absence of errors in pieces of software. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed to prioritise the possibility of adding new specification languages and analysis tools and to enable a synergistic combination of the techniques under a graphical interface that satisfies several well-known usability enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  1. Analysis of sustainable leadership for science learning management in the 21st Century under education THAILAND 4.0 framework

    Science.gov (United States)

    Jedaman, Pornchai; Buaraphan, Khajornsak; Pimdee, Paitoon; Yuenyong, Chokchai; Sukkamart, Aukkapong; Suksup, Charoen

    2018-01-01

    This article aims to study and analyze sustainable leadership in the 21st Century under the education THAILAND 4.0 framework, and to perform a factor analysis of sustainable leadership for science learning. The study employed both quantitative and qualitative approaches in collecting data, including a questionnaire survey, a documentary review and Participatory Action Learning (PAL). The sample was selected purposively and comprised 225 administrators of Primary and Secondary Education Area Offices throughout Thailand. Out of 225, 183 (83.33%) and 42 (16.67%) respondents were administrators of Primary and Secondary Education Offices, respectively. The quantitative data were analyzed by descriptive statistics, including mean and standard deviation. Also, a Confirmatory Factor Analysis (CFA) was conducted to analyze the factors associated with sustainable leadership under the education THAILAND 4.0 framework. The qualitative data were analyzed in three main stages, i.e., data reduction, data organization and data interpretation leading to conclusions. The study revealed that sustainable leadership under the education THAILAND 4.0 framework needs to focus on development, awareness of duty and responsibility, equality, morality and knowledge. All aspects should be integrated in order to achieve the organizational goals, a culture of good governance, and identity. Importantly, there were six "key" elements of sustainable leadership under the education THAILAND 4.0 framework: i) Professional Leadership Role, ii) Leadership Under Change, iii) Leadership Skills 4.0 in the 21st Century, iv) Development in the Pace With Change, v) Creativity and Creative Tension, and vi) Hold True Assessments. The CFA showed that the weights of each of the six key elements of sustainable leadership under the education THAILAND 4.0 framework were significant at the .01 significance level.

  2. Sensitivity and uncertainty in flood inundation modelling – concept of an analysis framework

    Directory of Open Access Journals (Sweden)

    T. Weichel

    2007-01-01

    Full Text Available After the extreme flood event of the Elbe in 2002, the definition of flood risk areas by law and their simulation became more important in Germany. This paper describes the concept of an analysis framework to improve the localisation and the duration of validity of flood inundation maps. The two-dimensional finite difference model TrimR2D is used and linked to a Monte-Carlo routine for parameter sampling as well as to selected performance measures. The purpose is to investigate the impact of different spatial resolutions and the influence of changing land uses on the simulation of flood inundation areas. The technical assembly of the framework has been realised and, besides the model calibration, first tests with different parameter ranges have been carried out. Preliminary results show good correlations with observed data, but the investigation of shifting land uses shows only small changes in the flood extent.
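    The coupling sketched in the abstract (a hydraulic model driven by a Monte Carlo parameter-sampling routine and scored with performance measures) can be outlined generically as follows; the placeholder model function, the parameter ranges and the Nash-Sutcliffe-style score are illustrative assumptions and do not represent TrimR2D or the authors' configuration.

      import random

      def run_flood_model(roughness, resolution_m):
          """Placeholder for a call to the hydraulic model (e.g. TrimR2D);
          returns a simulated inundation depth series. Purely illustrative."""
          return [roughness * 10 + resolution_m * 0.01 + random.gauss(0, 0.05)
                  for _ in range(20)]

      def nash_sutcliffe(simulated, observed):
          mean_obs = sum(observed) / len(observed)
          num = sum((s - o) ** 2 for s, o in zip(simulated, observed))
          den = sum((o - mean_obs) ** 2 for o in observed)
          return 1.0 - num / den

      observed = [0.35] * 10 + [0.45] * 10          # stand-in observed depths
      results = []
      for _ in range(500):                          # Monte Carlo parameter sampling
          roughness = random.uniform(0.02, 0.06)    # Manning-type roughness
          resolution = random.choice([10, 25, 50])  # grid resolution in metres
          sim = run_flood_model(roughness, resolution)
          results.append((nash_sutcliffe(sim, observed), roughness, resolution))

      best = max(results)
      print("best score %.3f at roughness %.3f, resolution %d m" % best)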

  3. Integration of targeted health interventions into health systems: a conceptual framework for analysis.

    Science.gov (United States)

    Atun, Rifat; de Jongh, Thyra; Secci, Federica; Ohiri, Kelechi; Adeyi, Olusoji

    2010-03-01

    The benefits of integrating programmes that emphasize specific interventions into health systems to improve health outcomes have been widely debated. This debate has been driven by narrow binary considerations of integrated (horizontal) versus non-integrated (vertical) programmes, and characterized by polarization of views with protagonists for and against integration arguing the relative merits of each approach. The presence of both integrated and non-integrated programmes in many countries suggests benefits to each approach. While the terms 'vertical' and 'integrated' are widely used, they each describe a range of phenomena. In practice the dichotomy between vertical and horizontal is not rigid and the extent of verticality or integration varies between programmes. However, systematic analysis of the relative merits of integration in various contexts and for different interventions is complicated as there is no commonly accepted definition of 'integration'-a term loosely used to describe a variety of organizational arrangements for a range of programmes in different settings. We present an analytical framework which enables deconstruction of the term integration into multiple facets, each corresponding to a critical health system function. Our conceptual framework builds on theoretical propositions and empirical research in innovation studies, and in particular adoption and diffusion of innovations within health systems, and builds on our own earlier empirical research. It brings together the critical elements that affect adoption, diffusion and assimilation of a health intervention, and in doing so enables systematic and holistic exploration of the extent to which different interventions are integrated in varied settings and the reasons for the variation. The conceptual framework and the analytical approach we propose are intended to facilitate analysis in evaluative and formative studies of-and policies on-integration, for use in systematically comparing and

  4. Skeletal abnormalities in fetuses with Down's syndrome: a radiographic post-mortem study

    International Nuclear Information System (INIS)

    Stempfle, N.; Brisse, H.; Huten, Y.; Fredouille, C.; Nessmann, C.

    1999-01-01

    Objective. To evaluate skeletal abnormalities on post-mortem radiographs of fetuses with Down's syndrome. Materials and methods. Biometrical and morphological criteria, which are used for US prenatal detection of trisomy 21, were assessed. Limb long bones, biparietal diameter (BPD)/occipito-frontal diameter (OFD) ratio, ossification of nasal bones and appearance of the middle phalanx of the fifth digit (P2) in 60 fetuses with Down's syndrome were analysed and compared with 82 normal fetuses matched for gestational age (GA) from 15 to 40 weeks' gestation (WG). Results. We observed reduced growth velocity of limb long bones during the third trimester in both groups, but the reduction was more pronounced in the trisomic group. Brachycephaly was found as early as 15 WG in Down's syndrome and continued throughout gestation (sensitivity 0.28, specificity 1). Ossification of the nasal bones, which can be detected in normal fetuses from 14 WG, was absent in one quarter of trisomic fetuses, regardless of GA. The middle phalanx of the fifth digit was evaluated by comparison with the distal phalanx (P3) of the same digit. We found that P2 was not ossified in 11/31 trisomic fetuses before 23 WG, and was either not ossified or hypoplastic in 17/29 cases after 24 WG (sensitivity 0.56, specificity 1). Conclusions. Three key skeletal signs were present in trisomic fetuses: brachycephaly, absence of nasal bone ossification, and hypoplasia of the middle phalanx of the fifth digit. All these signs are appropriate to prenatal US screening. When present, they fully justify determination of the fetal karyotype by amniocentesis. (orig.)

  5. A Conjoint Analysis Framework for Evaluating User Preferences in Machine Translation.

    Science.gov (United States)

    Kirchhoff, Katrin; Capurro, Daniel; Turner, Anne M

    2014-03-01

    Despite much research on machine translation (MT) evaluation, there is surprisingly little work that directly measures users' intuitive or emotional preferences regarding different types of MT errors. However, the elicitation and modeling of user preferences is an important prerequisite for research on user adaptation and customization of MT engines. In this paper we explore the use of conjoint analysis as a formal quantitative framework to assess users' relative preferences for different types of translation errors. We apply our approach to the analysis of MT output from translating public health documents from English into Spanish. Our results indicate that word order errors are clearly the most dispreferred error type, followed by word sense, morphological, and function word errors. The conjoint analysis-based model is able to predict user preferences more accurately than a baseline model that chooses the translation with the fewest errors overall. Additionally we analyze the effect of using a crowd-sourced respondent population versus a sample of domain experts and observe that main preference effects are remarkably stable across the two samples.
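    Conjoint analysis of this kind amounts to estimating part-worth utilities for attribute levels (here, MT error types) from users' preference judgements over candidate translations. The sketch below fits such part-worths by ordinary least squares on dummy-coded error counts; the error-type attributes follow the abstract, but the profiles and ratings are fabricated for illustration and do not reproduce the study's data.

      import numpy as np

      error_types = ["word_order", "word_sense", "morphology", "function_word"]

      # Each row: number of errors of each type present in a candidate translation
      profiles = np.array([
          [1, 0, 0, 0],
          [0, 1, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1],
          [1, 1, 0, 0],
          [0, 1, 1, 1],
      ])
      ratings = np.array([2.0, 3.0, 3.5, 3.8, 1.5, 2.8])  # acceptability, 1-5 scale

      # Least-squares fit: rating ~ intercept + sum(part_worth_i * error_count_i)
      X = np.column_stack([np.ones(len(profiles)), profiles])
      coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
      for name, w in zip(error_types, coef[1:]):
          print(f"{name:14s} part-worth {w:+.2f}")  # more negative = more dispreferred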

  6. Formative assessment framework proposal for transversal competencies: Application to analysis and problem-solving competence

    Directory of Open Access Journals (Sweden)

    Pedro Gómez-Gasquet

    2018-04-01

    Full Text Available Purpose: In recent years, there has been increasing interest in the manner in which transversal competences (TCs) are introduced into curricula. Transversal competences are generic and relevant skills that students have to develop throughout the several stages of their educational degrees. This paper analyses TCs in the context of the learning process of undergraduate and postgraduate courses. The main aim of this paper is to propose a framework to improve results. The framework facilitates the student's training, and one of its important pieces is undoubtedly that the student receives constant feedback from his or her assessments, allowing learning to improve. An application to the analysis and problem-solving competence in the context of the Master Degree in Advanced Engineering Production, Logistics and Supply Chain at the UPV is carried out. Design/methodology/approach: The work is the result of several years of professional experience in the application of the concept of transversal competence at the UPV with undergraduate and graduate students. As a result of this work and various educational innovation projects, a team of experts has been created, which has been discussing aspects relevant to the improvement of the teaching-learning process. One of these areas of work has concerned the integration of various proposals on the application and deployment of transversal competences. With respect to this work, a conceptual proposal is put forward that has subsequently been empirically validated through the analysis of the results of several groups of students in a degree. Findings: The main result offered in this work is a framework that allows the elements that are part of the learning process in the area of transversal competences to be identified. Likewise, the different items that are part of the framework are linked to the student's life cycle, and a temporal scope is established for their deployment. Practical implications: One of the most noteworthy

  7. Post-mortem analysis of suicide victims shows ABCB1 haplotype 1236T-2677T-3435T as a candidate predisposing factor behind adverse drug reactions in females.

    Science.gov (United States)

    Rahikainen, Anna-Liina; Palo, Jukka U; Haukka, Jari; Sajantila, Antti

    2018-04-01

    Genetic variation in the efflux transporter permeability glycoprotein (P-gp) has recently been associated with completed violent suicides and also with violent suicide attempts. As depression is known to be a risk factor for suicide and many antidepressants are P-gp substrates, it has been speculated that inadequate antidepressant treatment response or adverse side effects could be involved. The aim of this study was to investigate whether there is an association between the P-gp-coding ABCB1 gene and completed suicides in citalopram users. Also, the effect of sex and of the suicide method used (violent vs. non-violent) was evaluated. All cases included in the study population, 349 completed suicide victims and 284 controls, were shown to be positive for the antidepressant citalopram in a post-mortem toxicological drug screen. ABCB1 1236C>T, 2677G>T/A and 3435C>T polymorphisms were determined by TaqMan genotyping assays. Haplotypes were constructed from genotype data using the PHASE software. The association between the manner of death and the ABCB1 haplotype was tested with logistic regression analysis. No statistically significant differences were observed in the ABCB1 allele or genotype frequencies between the suicide and control groups. However, the ABCB1 1236T-2677T-3435T haplotype was associated with completed suicides of female citalopram users (odds ratio: 2.23; 95% confidence interval: 1.22-4.07; P=0.009). After stratification by the method used for suicide, the association emerged in fatal intoxications (odds ratio: 2.51; 95% confidence interval: 1.29-4.87; P=0.007). In the other groups, no statistically significant associations were observed. Our results suggest that female citalopram users with ABCB1 1236T-2677T-3435T are more vulnerable to adverse effects of the drug, as this haplotype was enriched in non-violent suicides of female citalopram users. Even though the biological mechanism behind this observation is unknown, the results provide another example of the importance
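    The association statistic reported above (an odds ratio with a 95% confidence interval) can be illustrated with a toy calculation from a 2x2 carrier/non-carrier table; the counts below are invented, and the study itself estimated the effect with logistic regression rather than this simple cross-tabulation.

      from math import exp, log, sqrt

      # Invented haplotype-carrier counts, purely for illustration
      carriers_cases, noncarriers_cases = 50, 100        # suicide victims
      carriers_controls, noncarriers_controls = 25, 125  # controls

      odds_ratio = ((carriers_cases * noncarriers_controls)
                    / (noncarriers_cases * carriers_controls))
      se_log_or = sqrt(1 / carriers_cases + 1 / noncarriers_cases +
                       1 / carriers_controls + 1 / noncarriers_controls)
      ci_low = exp(log(odds_ratio) - 1.96 * se_log_or)
      ci_high = exp(log(odds_ratio) + 1.96 * se_log_or)
      print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")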

  8. Post mortem scientific sampling and the search for causes of death in intensive care: what information should be given and what consent should be obtained?

    Science.gov (United States)

    Rigaud, J P; Quenot, J P; Borel, M; Plu, I; Hervé, C; Moutel, G

    2011-03-01

    The search for cause of death is important to improve knowledge and provide answers for the relatives of the deceased. Medical autopsy following unexplained death in hospital is one way to identify cause of death but is difficult to carry out routinely. Post mortem sampling (PMS) of tissues via thin biopsy needle or 'mini incisions' in the skin may be a useful alternative. A study was undertaken to assess how this approach is perceived by intensive care doctors and also to evaluate how this practice is considered in ethical terms in France. A study of PMS practices immediately after death in 10 intensive care departments was performed. The medical director of each centre was interviewed by telephone and asked to describe practices in their unit and to outline the questions raised by this practice. PMS is routinely performed in 70% of the units which responded, without systematically obtaining formal consent and without precise rules for communicating results. Approaches to PMS differed between centres, but all physicians felt that PMS is useful for the scientific information it gives and also for the information it provides for relatives. All physicians regret the lack of standards to structure PMS practices. Information from post mortem examinations is important for society to inform about causes of death, for doctors to improve practices and for decision-makers responsible for organising care. Debate persists regarding the balance between individual rights and community interests. It is suggested that an approach for identifying cause of death could easily be integrated into the relationship between carers and relatives, provided full transparency is maintained.

  9. Recent advances in metal-organic frameworks and covalent organic frameworks for sample preparation and chromatographic analysis.

    Science.gov (United States)

    Wang, Xuan; Ye, Nengsheng

    2017-12-01

    In the field of analytical chemistry, sample preparation and chromatographic separation are two core procedures. The means by which to improve the sensitivity, selectivity and detection limit of a method have become a topic of great interest. Recently, porous organic frameworks, such as metal-organic frameworks (MOFs) and covalent organic frameworks (COFs), have been widely used in this research area because of their special features, and different methods have been developed. This review summarizes the applications of MOFs and COFs in sample preparation and chromatographic stationary phases. The MOF- or COF-based solid-phase extraction (SPE), solid-phase microextraction (SPME), gas chromatography (GC), high-performance liquid chromatography (HPLC) and capillary electrochromatography (CEC) methods are described. The excellent properties of MOFs and COFs have resulted in intense interest in exploring their performance and mechanisms for sample preparation and chromatographic separation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. A classical regression framework for mediation analysis: fitting one model to estimate mediation effects.

    Science.gov (United States)

    Saunders, Christina T; Blume, Jeffrey D

    2017-10-26

    Mediation analysis explores the degree to which an exposure's effect on an outcome is diverted through a mediating variable. We describe a classical regression framework for conducting mediation analyses in which estimates of causal mediation effects and their variance are obtained from the fit of a single regression model. The vector of changes in exposure pathway coefficients, which we named the essential mediation components (EMCs), is used to estimate standard causal mediation effects. Because these effects are often simple functions of the EMCs, an analytical expression for their model-based variance follows directly. Given this formula, it is instructive to revisit the performance of routinely used variance approximations (e.g., delta method and resampling methods). Requiring the fit of only one model reduces the computation time required for complex mediation analyses and permits the use of a rich suite of regression tools that are not easily implemented on a system of three equations, as would be required in the Baron-Kenny framework. Using data from the BRAIN-ICU study, we provide examples to illustrate the advantages of this framework and compare it with the existing approaches. © The Author 2017. Published by Oxford University Press.
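    For contrast with the single-model approach described above, the familiar two-regression (Baron-Kenny-style) decomposition that it streamlines can be sketched as follows; the simulated data and coefficients are illustrative, and the sketch does not reproduce the authors' essential-mediation-components estimator.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      x = rng.normal(size=n)                      # exposure
      m = 0.5 * x + rng.normal(size=n)            # mediator
      y = 0.3 * x + 0.7 * m + rng.normal(size=n)  # outcome

      def ols(design, response):
          beta, *_ = np.linalg.lstsq(design, response, rcond=None)
          return beta

      a = ols(np.column_stack([np.ones(n), x]), m)[1]               # X -> M path
      c_prime, b = ols(np.column_stack([np.ones(n), x, m]), y)[1:]  # X, M -> Y paths

      print(f"direct effect   {c_prime:.2f}")
      print(f"indirect effect {a * b:.2f}")   # approx. 0.5 * 0.7 = 0.35
      print(f"total effect    {c_prime + a * b:.2f}")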

  11. TOWARDS A CLOUD BASED SMART TRAFFIC MANAGEMENT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    M. M. Rahimi

    2017-09-01

    Full Text Available Traffic big data has brought many opportunities for traffic management applications. However, several challenges like heterogeneity, storage, management, processing and analysis of traffic big data may hinder their efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework can answer the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network data in a distributed environment. Our evaluation results indicate that the speed of data import into this framework exceeds 8000 records per second when the size of the dataset is near 5 million. We also evaluated the performance of data retrieval in our proposed framework. The data retrieval speed exceeds 15000 records per second when the size of the dataset is near 5 million. We have also evaluated the scalability and performance of our proposed framework using parallelisation of a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  12. A framework for the system-of-systems analysis of the risk for a safety-critical plant exposed to external events

    International Nuclear Information System (INIS)

    Zio, E.; Ferrario, E.

    2013-01-01

    We consider a critical plant exposed to risk from external events. We propose an original framework of analysis, which extends the boundaries of the study to the interdependent infrastructures which support the plant. For the purpose of clearly illustrating the conceptual framework of system-of-systems analysis, we work out a case study of seismic risk for a nuclear power plant embedded in the connected power and water distribution, and transportation networks which support its operation. The technical details of the systems considered (including the nuclear power plant) are highly simplified, in order to preserve the purpose of illustrating the conceptual, methodological framework of analysis. Yet, as an example of the approaches that can be used to perform the analysis within the proposed framework, we consider the Muir Web as a system analysis tool to build the system-of-systems model and Monte Carlo simulation for the quantitative evaluation of the model. The numerical exercise, albeit performed on a simplified case study, serves the purpose of showing the opportunity of accounting for the contribution of the interdependent infrastructure systems to the safety of a critical plant. This is relevant as it can lead to considerations with respect to the decision making related to safety-critical issues. -- Highlights: ► We consider a critical plant exposed to risk from external events. ► We consider also the interdependent infrastructures that support the plant. ► We use the Muir Web as a system analysis tool to build the system-of-systems model. ► We use Monte Carlo simulation for the quantitative evaluation of the model. ► We find that the interdependent infrastructures should be considered as they can be a support for the critical plant safety

  13. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Full Text Available Abstract Background A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method An equal number of cardiac cycles was extracted from heart sounds with different heart rates using information from envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using auto-correlation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with a 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion The proposed method showed promising results and high noise robustness for a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, and then further evaluating the method based on this new training set.
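    The segmentation-free step at the core of this method, estimating the cardiac cycle length from the autocorrelation of the signal envelope rather than from labelled S1/S2 sounds, can be sketched as below. The synthetic heart sound and the peak-picking rule are simplifications introduced for illustration and do not reproduce the published processing chain.

      import numpy as np
      from scipy.signal import hilbert, correlate

      fs = 2000                                    # sampling rate in Hz
      t = np.arange(0, 5, 1 / fs)
      cycle_s = 0.8                                # true cycle length (75 bpm)
      heart = np.zeros_like(t)
      for beat_start in np.arange(0, 5, cycle_s):  # crude S1/S2 bursts per cycle
          for offset in (0.0, 0.3):
              idx = (t >= beat_start + offset) & (t < beat_start + offset + 0.05)
              heart[idx] += np.sin(2 * np.pi * 60 * t[idx])
      heart += 0.05 * np.random.randn(len(t))

      envelope = np.abs(hilbert(heart))            # envelope detection
      env = envelope - envelope.mean()
      acf = correlate(env, env, mode="full", method="fft")[len(env) - 1:]

      min_lag = int(0.4 * fs)                      # ignore lags shorter than 0.4 s
      peak_lag = min_lag + np.argmax(acf[min_lag:int(1.5 * fs)])
      print(f"estimated cycle length: {peak_lag / fs:.2f} s")  # close to 0.8 s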

  14. THE STAR OFFLINE FRAMEWORK

    International Nuclear Information System (INIS)

    FINE, V.; FISYAK, Y.; PEREVOZTCHIKOV, V.; WENAUS, T.

    2000-01-01

    The Solenoidal Tracker At RHIC (STAR) is a large-acceptance collider detector, commissioned at Brookhaven National Laboratory in 1999. STAR has developed a software framework supporting simulation, reconstruction and analysis in offline production, interactive physics analysis and online monitoring environments that is well matched both to STAR's present status of transition between Fortran- and C++-based software and to STAR's evolution to a fully OO software base. This paper presents the results of two years' effort developing a modular C++ framework based on the ROOT package that encompasses both wrapped Fortran components (legacy simulation and reconstruction code) served by IDL-defined data structures, and fully OO components (all physics analysis code) served by a recently developed object model for event data. The framework supports chained components, which can themselves be composite subchains, with components ("makers") managing "data sets" they have created and are responsible for. An St-DataSet class from which data sets and makers inherit allows the construction of hierarchical organizations of components and data, and centralizes almost all system tasks such as data set navigation, I/O, database access, and inter-component communication. This paper will present an overview of this system, now deployed and well exercised in production environments with real and simulated data, and in an active physics analysis development program
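    The chained-maker pattern described above, in which components ("makers") own the data sets they create inside a hierarchical data-set tree, can be illustrated generically as follows; the class and method names are invented for this sketch and are not STAR's actual ROOT/C++ classes.

      # Generic sketch of chained "makers" managing a hierarchical data-set tree;
      # names are illustrative, not STAR's actual classes.
      class DataSet:
          def __init__(self, name):
              self.name, self.children, self.payload = name, [], None

          def add(self, child):
              self.children.append(child)
              return child

      class Maker:
          def __init__(self, name, submakers=()):
              self.name, self.submakers = name, list(submakers)

          def make(self, parent):
              ds = parent.add(DataSet(self.name))   # each maker owns the data it creates
              self.fill(ds)
              for sub in self.submakers:            # chains can themselves be composite
                  sub.make(ds)

          def fill(self, ds):
              pass

      class HitMaker(Maker):
          def fill(self, ds):
              ds.payload = [(1.0, 2.0), (3.5, 0.4)]  # fake reconstructed hits

      class TrackMaker(Maker):
          def fill(self, ds):
              ds.payload = "tracks built from sibling hit data"

      chain = Maker("chain", [HitMaker("hits"), TrackMaker("tracks")])
      event = DataSet("event")
      chain.make(event)
      print([child.name for child in event.children[0].children])  # ['hits', 'tracks']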

  15. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation modeling in the Redundancy Analysis framework based on so-called Extended Redundancy Analysis (ERA) has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we propose a simulation study of small samples. Moreover, we propose an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  16. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  17. Effective integrated frameworks for assessing mining sustainability.

    Science.gov (United States)

    Virgone, K M; Ramirez-Andreotta, M; Mainhagu, J; Brusseau, M L

    2018-05-28

    The objectives of this research are to review existing methods used for assessing mining sustainability, analyze the limited prior research that has evaluated the methods, and identify key characteristics that would constitute an enhanced sustainability framework that would serve to improve sustainability reporting in the mining industry. Five of the most relevant frameworks were selected for comparison in this analysis, and the results show that there are many commonalities among the five, as well as some disparities. In addition, relevant components are missing from all five. An enhanced evaluation system and framework were created to provide a more holistic, comprehensive method for sustainability assessment and reporting. The proposed framework has five components that build from and encompass the twelve evaluation characteristics used in the analysis. The components include Foundation, Focus, Breadth, Quality Assurance, and Relevance. The enhanced framework promotes a comprehensive, location-specific reporting approach with a concise set of well-defined indicators. Built into the framework is quality assurance, as well as a defined method to use information from sustainability reports to inform decisions. The framework incorporates human health and socioeconomic aspects via initiatives such as community-engaged research, economic valuations, and community-initiated environmental monitoring.

  18. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    Science.gov (United States)

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology

  19. Alterações bioquímicas post-mortem de matrinxã Brycon cephalus (Günther, 1869 procedente da piscicultura, mantido em gelo Post-mortem biochemical alterations in aquacultured matrinxã fish Brycon cephalus (Günther, 1869 when stored on ice

    Directory of Open Access Journals (Sweden)

    Gilvan Machado Batista

    2004-12-01

    Full Text Available No presente trabalho, foram estudadas as alterações bioquímicas post-mortem que ocorreram em matrinxã Brycon cephalus (Günther, 1869 procedente da piscicultura e mantido em gelo em Manaus - AM. Foi determinado o tempo de estocagem em gelo por meio das avaliações sensoriais físicas e gustativas, das análises de pH, Nitrogênio das Bases Voláteis Totais (N-BVT e bacteriológicas durante 29 dias. Foram determinados os índices de rigor-mortis, as concentrações de ATP e seus produtos de degradações e o valor K. De acordo com a composição química, o peixe foi classificado como "semi-gordo". Os peixes entraram em rigor-mortis aos 75 minutos após a morte por hipotermia, tendo permanecido durante 10 dias. As avaliações sensoriais (físicas e gustativas mostraram que os peixes apresentaram condição de consumo até 26 dias. As análises de ATP e de seus produtos de degradação mostraram que a referida espécie foi considerada formadora de inosina (HxR, nas condições de experimento. O valor K mostrou que os exemplares de matrinxãs permaneceram "muito frescos" até 16 dias de estocagem em gelo, concordante com a avaliação sensorial gustativa.Post-mortem biochemical alterations in aquacultured matrinxã fish Brycon cephalus (Günther, 1869 when stored on ice in Manaus-AM, were studied in this paper. The storage time on ice was determined through tasting and physical sensory evaluations, pH, total volatile bases nitrogen (TVB-N and bacteriological analyses during 29 days. Rigor-mortis index, ATP-related compounds and K value were also determined. Chemical composition demonstrated that fish was classified as "semi-fat". The specimens presented rigor-mortis 75 minutes after death caused by hypothermia and remained that way for 10 days. Shelf life time on ice was 26 days, according to sensory evaluations, pH, TVBN determinations and bacteriological analyses. ATP-related compounds pointed out that the referred species was considered to

  20. Solving nonlinear, high-order partial differential equations using a high-performance isogeometric analysis framework

    KAUST Repository

    Cortes, Adriano Mauricio; Vignal, Philippe; Sarmiento, Adel; García, Daniel O.; Collier, Nathan; Dalcin, Lisandro; Calo, Victor M.

    2014-01-01

    In this paper we present PetIGA, a high-performance implementation of Isogeometric Analysis built on top of PETSc. We show its use in solving nonlinear and time-dependent problems, such as phase-field models, by taking advantage of the high-continuity of the basis functions granted by the isogeometric framework. In this work, we focus on the Cahn-Hilliard equation and the phase-field crystal equation.
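    As a much simpler stand-in for the model problem mentioned above, the sketch below integrates the one-dimensional Cahn-Hilliard equation du/dt = Lap(u^3 - u - gamma*Lap(u)) with an explicit finite-difference scheme on a periodic grid; it illustrates the equation being solved but none of the isogeometric (B-spline/NURBS) discretisation or PETSc machinery that PetIGA provides.

      import numpy as np

      n, dx, gamma, dt, steps = 100, 1.0, 1.0, 0.01, 5000

      def lap(u):
          # periodic second-difference Laplacian
          return (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

      rng = np.random.default_rng(1)
      u = 0.05 * rng.standard_normal(n)            # small perturbation around u = 0
      for _ in range(steps):
          mu = u**3 - u - gamma * lap(u)           # chemical potential
          u = u + dt * lap(mu)

      print(f"mass (conserved): {u.mean():+.4f}, "
            f"range: [{u.min():.2f}, {u.max():.2f}]")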