WorldWideScience

Sample records for release validation process

  1. Validation of software releases for CMS

    International Nuclear Information System (INIS)

    Gutsche, Oliver

    2010-01-01

    The CMS software stack currently consists of more than two million lines of code developed by over 250 authors, with a new version being released every week. CMS has set up a validation process for quality assurance which enables the developers to compare the performance of a release to previous releases and references. The validation process provides the developers with reconstructed datasets of real data and MC samples. The samples span the whole range of detector effects and important physics signatures to benchmark the performance of the software. They are used to investigate interdependency effects of all CMS software components and to find and fix bugs. The release validation process described here is an integral part of CMS software development and contributes significantly to ensuring stable production and analysis. It represents a sizable contribution to the overall MC production of CMS. Its success emphasizes the importance of a streamlined release validation process for projects with a large code base and a significant number of developers, and it can serve as a model for future projects.

  2. The ALICE Software Release Validation cluster

    International Nuclear Information System (INIS)

    Berzano, D; Krzewicki, M

    2015-01-01

    One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, all of which must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service: in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed by a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how the Release Validation Cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any past snapshot of the operating system: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future. (paper)

  3. Gaia Data Release 1. Catalogue validation

    NARCIS (Netherlands)

    Arenou, F.; Luri, X.; Babusiaux, C.; Fabricius, C.; Helmi, A.; Robin, A. C.; Vallenari, A.; Blanco-Cuaresma, S.; Cantat-Gaudin, T.; Findeisen, K.; Reylé, C.; Ruiz-Dern, L.; Sordo, R.; Turon, C.; Walton, N. A.; Shih, I.-C.; Antiche, E.; Barache, C.; Barros, M.; Breddels, M.; Carrasco, J. M.; Costigan, G.; Diakité, S.; Eyer, L.; Figueras, F.; Galluccio, L.; Heu, J.; Jordi, C.; Krone-Martins, A.; Lallement, R.; Lambert, S.; Leclerc, N.; Marrese, P. M.; Moitinho, A.; Mor, R.; Romero-Gómez, M.; Sartoretti, P.; Soria, S.; Soubiran, C.; Souchay, J.; Veljanoski, J.; Ziaeepour, H.; Giuffrida, G.; Pancino, E.; Bragaglia, A.

    Context. Before the publication of the Gaia Catalogue, the contents of the first data release have undergone multiple dedicated validation tests. Aims: These tests aim to provide in-depth analysis of the Catalogue content in order to detect anomalies and individual problems in specific objects or in

  4. Process validation for radiation processing

    International Nuclear Information System (INIS)

    Miller, A.

    1999-01-01

    Process validation concerns the establishment of the irradiation conditions that will lead to the desired changes of the irradiated product. Process validation therefore establishes the link between absorbed dose and the characteristics of the product, such as degree of crosslinking in a polyethylene tube, prolongation of shelf life of a food product, or degree of sterility of the medical device. Detailed international standards are written for the documentation of radiation sterilization, such as EN 552 and ISO 11137, and the steps of process validation that are described in these standards are discussed in this paper. They include material testing for the documentation of the correct functioning of the product, microbiological testing for selection of the minimum required dose and dose mapping for documentation of attainment of the required dose in all parts of the product. The process validation must be maintained by reviews and repeated measurements as necessary. This paper presents recommendations and guidance for the execution of these components of process validation. (author)

  5. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  6. Validation of radiation sterilization process

    International Nuclear Information System (INIS)

    Kaluska, I.

    2007-01-01

    The standards for quality management systems recognize that, for certain processes used in manufacturing, the effectiveness of the process cannot be fully verified by subsequent inspection and testing of the product. Sterilization is an example of such a process. For this reason, sterilization processes are validated for use, the performance of the sterilization process is monitored routinely, and the equipment is maintained according to ISO 13485. Different aspects of this standard are presented.

  7. Validation of kinetic modeling of progesterone release from polymeric membranes

    Directory of Open Access Journals (Sweden)

    Analia Irma Romero

    2018-01-01

    Mathematical modeling of drug release systems is fundamental to their development and optimization, since it allows drug release rates to be predicted and the physical transport mechanisms involved to be elucidated. In this paper we validate a novel mathematical model that describes progesterone (Prg) controlled release from poly-3-hydroxybutyric acid (PHB) membranes. A statistical analysis was conducted to compare the fit of our model with six different models, and the Akaike information criterion (AIC) was used to find the equation with the best fit. A simple relation between mass and drug release rate was found, which allows the effect of Prg loads on the release behavior to be predicted. Our proposed model was the one with the minimum AIC value, and therefore the one that statistically best fitted the experimental data obtained for all the Prg loads tested. Furthermore, the initial release rate was calculated, the interface mass transfer coefficient estimated, and the equilibrium distribution constant of Prg between the PHB and the release medium determined. The results lead us to conclude that our proposed model best fits the experimental data and can be successfully used to describe Prg release from PHB membranes.
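    The model-selection step described above, ranking candidate release models by the Akaike information criterion, can be sketched as follows. This is an illustrative reconstruction with invented data and two classical empirical release models (Higuchi and Korsmeyer-Peppas), not the paper's own model:

```python
import numpy as np

def aic_least_squares(y_obs, y_pred, n_params):
    """AIC for a least-squares fit: n*ln(RSS/n) + 2k,
    where RSS is the residual sum of squares and k the parameter count."""
    y_obs, y_pred = np.asarray(y_obs, float), np.asarray(y_pred, float)
    n = y_obs.size
    rss = np.sum((y_obs - y_pred) ** 2)
    return n * np.log(rss / n) + 2 * n_params

# Hypothetical cumulative-release data (fraction released vs. time, hours)
t = np.array([1.0, 2.0, 4.0, 8.0, 12.0, 24.0])
released = np.array([0.12, 0.18, 0.27, 0.39, 0.47, 0.66])

# Higuchi model r = k*sqrt(t), 1 parameter; closed-form least squares for k
k_hig = np.sum(released * np.sqrt(t)) / np.sum(t)
pred_hig = k_hig * np.sqrt(t)

# Korsmeyer-Peppas model r = k*t**m, 2 parameters; linear fit in log-log space
m, log_k = np.polyfit(np.log(t), np.log(released), 1)
pred_kp = np.exp(log_k) * t ** m

aic_hig = aic_least_squares(released, pred_hig, 1)
aic_kp = aic_least_squares(released, pred_kp, 2)
best = "Korsmeyer-Peppas" if aic_kp < aic_hig else "Higuchi"
```

    The lower AIC wins; the +2k term penalizes the extra Korsmeyer-Peppas parameter, so the richer model must improve the fit enough to justify it.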

  8. Biological processes influencing contaminant release from sediments

    International Nuclear Information System (INIS)

    Reible, D.D.

    1996-01-01

    The influence of biological processes, including bioturbation, on the mobility of contaminants in freshwater sediments is described. Effective mass transfer coefficients are estimated for tubificid oligochaetes as a function of worm behavior and biomass density. The mass transfer coefficients were observed to be inversely proportional to water oxygen content and proportional to the square root of biomass density. The sediment reworking and contaminant release are contrasted with those of freshwater amphipods. The implications of these and other biological processes for contaminant release and in-situ remediation of soils and sediments are summarized. 4 figs., 1 tab.

  9. Development, description and validation of a Tritium Environmental Release Model (TERM).

    Science.gov (United States)

    Jeffers, Rebecca S; Parker, Geoffrey T

    2014-01-01

    Tritium is a radioisotope of hydrogen that exists naturally in the environment and may also be released through anthropogenic activities. It bonds readily with hydrogen and oxygen atoms to form tritiated water, which then cycles through the hydrosphere. This paper seeks to model the migration of tritiated species throughout the environment - including atmospheric, river and coastal systems - more comprehensively and more consistently across release scenarios than is currently in the literature. A review of the features and underlying conceptual models of some existing tritium release models was conducted, and an underlying aggregated conceptual process model defined, which is presented. The new model, dubbed 'Tritium Environmental Release Model' (TERM), was then tested against multiple validation sets from literature, including experimental data and reference tests for tritium models. TERM has been shown to be capable of providing reasonable results which are broadly comparable with atmospheric HTO release models from the literature, spanning both continuous and discrete release conditions. TERM also performed well when compared with atmospheric data. TERM is believed to be a useful tool for examining discrete and continuous atmospheric releases or combinations thereof. TERM also includes further capabilities (e.g. river and coastal release scenarios) that may be applicable to certain scenarios that atmospheric models alone may not handle well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  11. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  12. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case. Empirical validation is residual in nature, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model and can also serve as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: this can be performed with DSA, differential sensitivity analysis, and with MCSA, Monte Carlo sensitivity analysis. Search for the optimal domains of the input parameters: a procedure based on Monte Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis has been carried out in both the time domain and the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings is presented, studying the behavior of building components in a Test Cell of the LECE of CIEMAT (Spain). (Author) 17 refs
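    The Monte Carlo sensitivity analysis step of such a methodology can be illustrated with a toy stand-in for a detailed building thermal model; the model equation, parameter ranges, and coefficients below are all invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def thermal_model(u_wall, solar_gain, ach):
    """Toy steady-state indoor-temperature model (hypothetical stand-in
    for a detailed simulation): wall U-value and air changes per hour
    (ach) increase heat loss; solar gain adds heat."""
    t_out = 10.0                              # outdoor temperature, degC
    gains = solar_gain + 2000.0               # W: solar + internal gains
    losses = u_wall * 120.0 + ach * 40.0      # W/K: envelope + ventilation
    return t_out + gains / losses

# Monte Carlo sensitivity analysis: sample inputs over plausible ranges
# and correlate each input with the simulated output.
n = 5000
u_wall = rng.uniform(0.2, 1.5, n)             # W/m2K
solar = rng.uniform(0.0, 1500.0, n)           # W
ach = rng.uniform(0.2, 2.0, n)                # 1/h

t_in = thermal_model(u_wall, solar, ach)

sens = {name: np.corrcoef(x, t_in)[0, 1]
        for name, x in [("u_wall", u_wall), ("solar_gain", solar), ("ach", ach)]}
most_influential = max(sens, key=lambda k: abs(sens[k]))
```

    The sign and magnitude of each correlation coefficient indicate how strongly, and in which direction, each input drives the output over its sampled domain.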

  13. Nonlinearities in Drug Release Process from Polymeric Microparticles: Long-Time-Scale Behaviour

    Directory of Open Access Journals (Sweden)

    Elena Simona Bacaita

    2012-01-01

    A theoretical model of the drug release process from polymeric microparticles (a particular type of polymer matrix) is built through a dispersive fractal approximation of motion. As a result, the drug release process takes place through cnoidal oscillation modes of a normalized concentration field. This indicates that, for long-time-scale evolutions, the drug particles assemble into a lattice of nonlinear oscillators, which manifests macroscopically as variations of drug concentration. The model is validated by experimental results.

  14. Processing ruminal ingesta to release bacteria attached to feed ...

    African Journals Online (AJOL)

    A comparison was made of different methods of processing ingesta to release bacteria attached to solid particles, prior to making viable counts. Initially processing was performed under a stream of anaerobic gas and counts were made using the roll tube technique. Later, processing was done in an anaerobic cabinet and ...

  15. Soil Moisture Active Passive Mission L4_C Data Product Assessment (Version 2 Validated Release)

    Science.gov (United States)

    Kimball, John S.; Jones, Lucas A.; Glassy, Joseph; Stavros, E. Natasha; Madani, Nima; Reichle, Rolf H.; Jackson, Thomas; Colliander, Andreas

    2016-01-01

    The SMAP satellite was successfully launched January 31, 2015, and began acquiring Earth observation data following in-orbit sensor calibration. Global data products derived from the SMAP L-band microwave measurements include Level 1 calibrated and geolocated radiometric brightness temperatures, Level 2/3 surface soil moisture and freeze/thaw geophysical retrievals mapped to a fixed Earth grid, and model-enhanced Level 4 data products for surface to root zone soil moisture and terrestrial carbon (CO2) fluxes. The post-launch SMAP mission Cal/Val Phase had two primary objectives for each science product team: 1) calibrate, verify, and improve the performance of the science algorithms, and 2) validate accuracies of the science data products as specified in the L1 science requirements. This report provides analysis and assessment of the SMAP Level 4 Carbon (L4_C) product pertaining to the validated release. The L4_C validated product release effectively replaces an earlier L4_C beta-product release (Kimball et al. 2015). The validated release described in this report incorporates a longer data record and benefits from algorithm and Cal/Val refinements acquired during the SMAP post-launch Cal/Val intensive period. The SMAP L4_C algorithms utilize a terrestrial carbon flux model informed by SMAP soil moisture inputs along with optical remote sensing (e.g. MODIS) vegetation indices and other ancillary biophysical data to estimate global daily net ecosystem CO2 exchange (NEE) and component carbon fluxes for vegetation gross primary production (GPP) and ecosystem respiration (Reco). Other L4_C product elements include surface (10 cm depth) soil organic carbon (SOC) stocks and associated environmental constraints to these processes, including soil moisture and landscape freeze/thaw (FT) controls on GPP and respiration (Kimball et al. 2012). The L4_C product encapsulates SMAP carbon cycle science objectives by: 1) providing a direct link between terrestrial carbon fluxes and
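    The component fluxes named in this abstract are linked by the standard relation NEE = Reco − GPP; a minimal illustration, with hypothetical flux values for a productive grid cell:

```python
def net_ecosystem_exchange(gpp, reco):
    """NEE = Reco - GPP (both in g C m^-2 day^-1); negative NEE
    indicates net carbon uptake by the land surface."""
    return reco - gpp

# Hypothetical daily fluxes for a grassland grid cell
gpp = 8.0    # gross primary production
reco = 5.5   # ecosystem respiration
nee = net_ecosystem_exchange(gpp, reco)  # -2.5: net uptake
```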

  16. Streamlining Compliance Validation Through Automation Processes

    Science.gov (United States)

    2014-03-01

    ...enemy. Of course, a common standard for DoD security personnel to write and share compliance validation content would prevent duplicate work and aid in... process and consume much of the SCAP content available. Finally, it is free and easy to install as part of the Apache/MySQL/PHP (AMP) [37

  17. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
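    The Monte Carlo use of an integrated process model can be sketched as two toy stacked unit operations with parameter variation propagated through both; all functions, distributions, and specification limits here are hypothetical illustrations, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

def fermentation(titer_setpoint, ph_offset):
    """Unit operation 1 (hypothetical): titer degrades quadratically
    with deviation from the pH setpoint."""
    return titer_setpoint * (1.0 - 0.05 * ph_offset ** 2)

def capture_step(titer, load_density):
    """Unit operation 2 (hypothetical): step yield drops when the
    chromatography column is loaded beyond 30 g/L resin."""
    yield_frac = 0.95 - 0.02 * np.maximum(load_density - 30.0, 0.0)
    return titer * yield_frac

# Monte Carlo over the integrated process model: sample process-parameter
# variation, propagate it through both unit operations, and count
# out-of-specification (OOS) batches against a lower specification limit.
n = 20000
ph_offset = rng.normal(0.0, 0.5, n)    # deviation from pH setpoint
load = rng.normal(32.0, 3.0, n)        # g/L resin load density

titer = fermentation(5.0, ph_offset)   # g/L after fermentation
product = capture_step(titer, load)    # g/L after capture

spec_lower = 4.0                       # hypothetical lower spec limit
oos_probability = np.mean(product < spec_lower)
```

    Because the second unit operation consumes the first one's output, parameter interactions across operations show up directly in the simulated OOS rate, which is the point of integrating the model rather than analyzing each step in isolation.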

  18. Validation of measured friction by process tests

    DEFF Research Database (Denmark)

    Eriksen, Morten; Henningsen, Poul; Tan, Xincai

    The objective of sub-task 3.3 is to evaluate, under actual process conditions, the friction formulations determined by simulative testing. As regards task 3.3, the following tests have been used according to the original project plan: 1. standard ring test and 2. double cup extrusion test. The task has, however, been extended to include a number of newly developed process tests: 3. forward rod extrusion test, 4. special ring test at low normal pressure, 5. spike test (especially developed for warm and hot forging). Validation of the measured friction values in cold forming from sub-task 3.1 has been made with forward rod extrusion, and very good agreement was obtained between the measured friction values in simulative testing and process testing.

  19. Digital processing method for monitoring the radioactivity of stack releases

    International Nuclear Information System (INIS)

    Vialettes, H.; Leblanc, P.; Perotin, J.P.; Lazou, J.P.

    1978-01-01

    The digital processing method proposed is adapted for data supplied by a fixed-filter detector normally used for analogue processing (integrator system). On the basis of the raw data (pulses) from the detector, the technique makes it possible to determine the rate of activity released whereas analogue processing gives only the released activity. Furthermore, the method can be used to develop alarm systems on the basis of a possible exposure rate at the point of fall-out, and by including in the program a coefficient which allows for atmospheric diffusion conditions at any given time one can improve the accuracy of the results. In order to test the digital processing method and demonstrate its advantages over analogue processing, various atmospheric contamination situations were simulated in a glove-box and analysed simultaneously, using both systems, from the pulses transmitted by the same sampling and fixed-filter detection unit. The experimental results confirm the advantages foreseen in the theoretical research. (author)

  20. Understanding and Predicting the Process of Software Maintenance Releases

    Science.gov (United States)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

    One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.

  1. Bioactive peptides released during digestion of processed milk

    Science.gov (United States)

    Most of the proteins contained in milk consist of alpha-s1-, alpha-s2-, beta- and kappa-casein, and some of the peptides contained in these caseins may impart health benefits. To determine if processing affected release of peptides, samples of raw (R), homogenized (H), homogenized and pasteurized (...

  2. Estimation of in-vivo neurotransmitter release by brain microdialysis: the issue of validity.

    Science.gov (United States)

    Di Chiara, G.; Tanda, G.; Carboni, E.

    1996-11-01

    Although microdialysis is commonly understood as a method of sampling low molecular weight compounds in the extracellular compartment of tissues, this definition appears insufficient to specifically describe brain microdialysis of neurotransmitters. In fact, transmitter overflow from the brain into dialysates is critically dependent upon the composition of the perfusing Ringer. Therefore, the dialysing Ringer not only recovers the transmitter from the extracellular brain fluid but is a main determinant of its in-vivo release. Two types of brain microdialysis are distinguished: quantitative microdialysis and conventional microdialysis. Quantitative microdialysis provides an estimate of neurotransmitter concentrations in the extracellular fluid in contact with the probe. However, this information might poorly reflect the kinetics of neurotransmitter release in vivo. Conventional microdialysis involves perfusion at a constant rate with a transmitter-free Ringer, resulting in the formation of a steep neurotransmitter concentration gradient extending from the Ringer into the extracellular fluid. This artificial gradient might be critical for the ability of conventional microdialysis to detect and resolve phasic changes in neurotransmitter release taking place in the implanted area. On the basis of these characteristics, conventional microdialysis of neurotransmitters can be conceptualized as a model of the in-vivo release of neurotransmitters in the brain. As such, the criteria of face-validity, construct-validity and predictive-validity should be applied to select the most appropriate experimental conditions for estimating neurotransmitter release in specific brain areas in relation to behaviour.

  3. SIMULATION OF VINPOCETINE RELEASE PROCESS FROM MICROCAPSULES WITH HYDROPHOBIC SHELL

    Directory of Open Access Journals (Sweden)

    Yu. A. Polkovnikova

    2017-01-01

    Nowadays microcapsules are widely used in different industries. Microcapsules with vitamins and with etheric and fatty oils are included in various cosmetics (creams, gels, serums), and microencapsulated probiotics are used in foods and in fodder additives in veterinary medicine. An important field of application of microencapsulation in pharmacy is the combination, in a single dosage form, of drugs that are incompatible when mixed in free form. The aim of this work is a comparative analysis of the thermodynamic characteristics of vinpocetine release from a 3:2 melt of beeswax and cacao butter into water, 0.01 M hydrochloric acid solution, and ethanol. Materials and methods: To simulate the process of vinpocetine release from the melt into the different media, models of the components of the studied systems were built and their atomic charges were calculated by a quantum-chemical method. Spatial models of the components were built in HyperChem 8.01. As the initial state for the calculation of the thermodynamic characteristics of vinpocetine release from the melt, a conformation of the melt-vinpocetine system was used after thermodynamic equilibration by molecular dynamics simulation in the Bioeurica program for 5 ns. For isolated systems a vibrational analysis was performed using the unrestricted Hartree-Fock method in the STO-3G basis set in the Orca 4.0 program. Results and discussion: Vinpocetine release from the 3:2 melt of beeswax and cacao butter into water at different pH values and into ethanol depends on its solubility in these media, and also on the solubility of the melt. Conclusion: The molecular dynamics study of vinpocetine release from the 3:2 melt of beeswax and cacao butter demonstrates that vinpocetine can be released into water at pH 2 and into ethanol. The obtained results suggest a lower degree of vinpocetine release from the melt into ethanol compared with 0.01 M hydrochloric acid solution.

  4. IV&V Project Assessment Process Validation

    Science.gov (United States)

    Driskell, Stephen

    2012-01-01

    The Space Launch System (SLS) will launch NASA's Multi-Purpose Crew Vehicle (MPCV). This launch vehicle will provide American launch capability for human exploration and travel beyond Earth orbit. SLS is designed to be flexible for crew or cargo missions. The first test flight is scheduled for December 2017. The SLS SRR/SDR provided insight into the project development life cycle. NASA IV&V ran the standard Risk Based Assessment and Portfolio Based Risk Assessment to identify analysis tasking for the SLS program. This presentation examines the SLS System Requirements Review/System Definition Review (SRR/SDR) and correlates IV&V findings to and from the selected IV&V tasking and capabilities for IV&V process validation. It also provides a reusable IEEE 1012 scorecard for programmatic completeness across the software development life cycle.

  5. Modeling and validating tritium transfer in a grassland ecosystem in response to {sup 3}H releases

    Energy Technology Data Exchange (ETDEWEB)

    Le Dizes, S.; Maro, D.; Rozet, M.; Hebert, D.; Solier, L.; Nicoulaud, V. [Institut de radioprotection et de surete nucleaire - IRSN (France); Vermorel, F.; Aulagnier, C. [Electricite de France - EDF (France)

    2014-07-01

    Tritium ({sup 3}H) is a major radionuclide released in several forms (HTO, HT) by nuclear facilities under normal operating conditions. In terrestrial ecosystems, tritium can be found in two forms: tritium in tissue free water (TFWT), following absorption of tritiated water by leaves or roots, and Organically Bound Tritium (OBT), resulting from TFWT incorporation into plant organic matter during photosynthesis. In order to study transfers of tritium from atmospheric releases to terrestrial ecosystems such as grasslands, an in-situ laboratory has been set up by IRSN on a ryegrass field plot located 2 km downwind of the AREVA NC La Hague nuclear reprocessing plant (North-West of France), as was done in the past for the assessment of radiocarbon transfer in grasslands. The objectives of this experimental field are: (i) to better understand OBT formation in plants by photosynthesis, (ii) to evaluate transfer processes of tritium in its several forms (HT, HTO) from the atmosphere (air and rainwater) to grass and soil, and (iii) to develop a model that reproduces the dynamic response of the ecosystem to tritium atmospheric releases under variable environmental conditions. For this purpose, tritium activity measurements will be carried out in grass (monthly measurements of HTO, OBT) and in air, rainwater and soil (daily measurements of HT, HTO), and CO{sub 2} and H{sub 2}O fluxes between the soil and air compartments will be measured. Then, the TOCATTA-c model, previously developed to simulate {sup 14}C transfers to pasture on an hourly time-step basis, will be adapted to account for processes specific to tritium. The model will be tested by comparing simulated results with measurements. The objectives of this presentation are (1) to present the organization of the experimental design of the VATO study (Validation of TOCATTA), dedicated to transfers of tritium in a grassland ecosystem, and (2) to document the major assumptions, conceptual modelling and

  6. Spent Nuclear Fuel (SNF) Process Validation Technical Support Plan

    Energy Technology Data Exchange (ETDEWEB)

    SEXTON, R.A.

    2000-03-13

    The purpose of Process Validation is to confirm that nominal process operations are consistent with the expected process envelope. The Process Validation activities described in this document are not part of the safety basis, but are expected to demonstrate that the process operates well within the safety basis. Some adjustments to the process may be made as a result of information gathered in Process Validation.
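    The envelope check described above can be sketched as a simple comparison of logged process readings against expected limits. Parameter names and limit values below are illustrative only, not taken from the support plan:

```python
# Sketch: flag logged process readings that fall outside an expected
# operating envelope. Parameter names and limits are illustrative only,
# not taken from the SNF support plan.
ENVELOPE = {
    "temperature_C": (20.0, 50.0),
    "pressure_kPa": (95.0, 110.0),
}

def out_of_envelope(readings):
    """Return (parameter, value) pairs that violate the envelope."""
    return [(name, value) for name, value in readings.items()
            if not ENVELOPE[name][0] <= value <= ENVELOPE[name][1]]

print(out_of_envelope({"temperature_C": 55.2, "pressure_kPa": 101.3}))
# → [('temperature_C', 55.2)]
```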

  7. Spent Nuclear Fuel (SNF) Process Validation Technical Support Plan

    International Nuclear Information System (INIS)

    SEXTON, R.A.

    2000-01-01

    The purpose of Process Validation is to confirm that nominal process operations are consistent with the expected process envelope. The Process Validation activities described in this document are not part of the safety basis, but are expected to demonstrate that the process operates well within the safety basis. Some adjustments to the process may be made as a result of information gathered in Process Validation.

  8. Prediction and Validation of Heat Release Direct Injection Diesel Engine Using Multi-Zone Model

    Science.gov (United States)

    Anang Nugroho, Bagus; Sugiarto, Bambang; Prawoto; Shalahuddin, Lukman

    2014-04-01

    The objective of this study is to develop a simulation model capable of predicting the heat release of diesel combustion accurately and within an efficient computation time. A multi-zone packet model has been applied to solve the combustion phenomena inside the diesel cylinder. The model formulations are presented first, and the numerical results are then validated against a single-cylinder direct injection diesel engine at various engine speeds and injection timings. The model was found to be promising in fulfilling the above objective.
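    For context, the apparent heat release that such models predict is commonly derived from the cylinder pressure trace via the standard single-zone energy balance. A minimal sketch follows; the paper itself uses a multi-zone packet model, and the gamma value here is an assumed constant:

```python
# Standard single-zone apparent heat release rate from a cylinder pressure
# trace. The study uses a multi-zone packet model; this simpler energy
# balance only illustrates the quantity being predicted. GAMMA is an
# assumed constant ratio of specific heats.
GAMMA = 1.35

def heat_release_rate(p, V, dp_dtheta, dV_dtheta, gamma=GAMMA):
    """dQ/dtheta [J/deg] from pressure p [Pa], volume V [m^3] and their
    derivatives with respect to crank angle [per deg]."""
    return (gamma / (gamma - 1.0)) * p * dV_dtheta \
         + (1.0 / (gamma - 1.0)) * V * dp_dtheta

# e.g. p = 5 MPa, V = 0.5 L, dp/dtheta = 0.1 MPa/deg, dV/dtheta = 1 cm^3/deg
print(round(heat_release_rate(5e6, 5e-4, 1e5, 1e-6), 1))
# → 162.1
```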

  9. Zero-Release Mixed Waste Process Facility Design and Testing

    International Nuclear Information System (INIS)

    Richard D. Boardman; John A. Deldebbio; Robert J. Kirkham; Martin K. Clemens; Robert Geosits; Ping Wan

    2004-01-01

    A zero-release off-gas cleaning system for mixed-waste thermal treatment processes has been evaluated through experimental scoping tests and process modeling. The principles can be adapted to a fluidized-bed calcination or steam reforming process, a waste melter, a rotary kiln process, and possibly other thermal waste treatment processes. The basic concept of a zero-release off-gas cleaning system is to recycle the bulk of the off-gas stream to the thermal treatment process. A slip stream is taken off the off-gas recycle to separate and purge benign constituents that may build up in the gas, such as water vapor, argon, nitrogen, and CO₂. Contaminants are separated from the slip stream and returned to the thermal unit for eventual destruction or incorporation into the waste immobilization media. In the current study, a standard packed-bed scrubber, followed by gas separation membranes, is proposed for removal of contaminants from the off-gas recycle slipstream. The scrub solution is continuously regenerated by cooling and precipitating sulfate, nitrate, and other salts that reach a solubility limit in the scrub solution. Mercury is also separated by the scrubber. A miscible chemical oxidizing agent was shown to effectively oxidize mercury and also NO, thus increasing their removal efficiency. The current study indicates that the proposed process is a viable option for reducing off-gas emissions. Consideration of the proposed closed-loop off-gas cleaning system is warranted when emission limits are stringent, or when a reduction in the total gas emission volume is desired. Although the closed loop appears to be technically feasible, economic considerations must also be evaluated on a case-by-case basis.

  10. Release of ultrafine particles from three simulated building processes

    International Nuclear Information System (INIS)

    Kumar, Prashant; Mulheron, Mike; Som, Claudia

    2012-01-01

    Building activities are recognised to produce coarse particulate matter, but less is known about the release of airborne ultrafine particles (UFPs; those below 100 nm in diameter). For the first time, this study has investigated the release of particles in the 5–560 nm range from three simulated building activities: the crushing of concrete cubes, the demolition of old concrete slabs, and the recycling of concrete debris. A fast response differential mobility spectrometer (Cambustion DMS50) was used to measure particle number concentrations (PNCs) and size distributions (PNDs) at a sampling frequency of 10 Hz in a confined laboratory room providing a controlled environment and near-steady background PNCs. The sampling point was intentionally kept close to the test samples so that the release of new UFPs during these simulated processes could be quantified. Tri-modal particle size distributions were recorded for all cases, demonstrating different peak diameters against background PNCs of the order of 10⁴ cm⁻³. These background modal peaks shifted towards larger sizes during the work periods (i.e. actual experiments) and the total PNCs increased between 2 and 17 times over the background PNCs for the different activities. After adjusting for background concentrations, the net release of PNCs during cube crushing, slab demolition, and 'dry' and 'wet' recycling events was measured as ∼0.77, 19.1, 22.7 and 1.76 (×10⁴) cm⁻³, respectively. The PNDs were converted into particle mass concentrations (PMCs). While the majority of the new PNC release was below 100 nm (i.e. UFPs), the bulk of the new PMC emissions was constituted by particles over 100 nm: ∼95, 79, 73 and 90% of total PNCs, and ∼71, 92, 93 and 91% of total PMCs, for cube crushing, slab demolition, dry recycling and wet recycling, respectively. The results of this study firmly elucidate the release of UFPs and raise the need for further detailed studies and for the design of health and safety related exposure guidelines for
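    The number-to-mass conversion described above can be sketched as follows, assuming spherical particles of uniform density; the density value is an assumption for illustration, not taken from the study:

```python
import math

# Sketch: mass concentration contributed by one size bin of a particle
# number distribution, assuming spherical particles of uniform density.
# RHO is an assumed density, not a value from the study.
RHO = 2000.0  # kg/m^3

def bin_mass_conc(number_conc_per_cm3, diameter_nm, rho=RHO):
    """Mass concentration (ug/m^3) from one size bin."""
    d_m = diameter_nm * 1e-9                       # nm -> m
    particle_mass_kg = rho * math.pi * d_m ** 3 / 6.0
    n_per_m3 = number_conc_per_cm3 * 1e6           # cm^-3 -> m^-3
    return n_per_m3 * particle_mass_kg * 1e9       # kg/m^3 -> ug/m^3

# 10^4 cm^-3 of 100 nm particles at this density is roughly 10 ug/m^3;
# the cubic dependence on diameter is why particles over 100 nm dominate
# the mass even when UFPs dominate the number count.
print(round(bin_mass_conc(1e4, 100.0), 1))
# → 10.5
```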

  11. Compressive strength test for cemented waste forms: validation process

    International Nuclear Information System (INIS)

    Haucz, Maria Judite A.; Candido, Francisco Donizete; Seles, Sandro Rogerio

    2007-01-01

    In the Cementation Laboratory (LABCIM) of the Development Centre of Nuclear Technology (CNEN/CDTN-MG), hazardous/radioactive wastes are incorporated in cement to transform them into monolithic products, preventing or minimizing contaminant release to the environment. The compressive strength test is important for evaluating the quality of the cemented product: it determines the compressive load necessary to rupture the cemented waste form. In LABCIM a specific procedure was developed to determine the compressive strength of cemented waste forms, based on the Brazilian standard NBR 7215. Accreditation of this procedure is essential to assure reproducible and accurate results in the evaluation of these products. To achieve this goal, the laboratory personnel implemented technical and administrative improvements in accordance with the NBR ISO/IEC 17025 standard 'General requirements for the competence of testing and calibration laboratories'. As the developed procedure is not a standard one, ISO/IEC 17025 requires its validation, for which several methodologies exist. This paper describes the current status of the accreditation project, in particular the validation of the referred procedure and its results. (author)

  12. AN OVERVIEW OF PHARMACEUTICAL PROCESS VALIDATION AND PROCESS CONTROL VARIABLES OF TABLETS MANUFACTURING PROCESSES IN INDUSTRY

    OpenAIRE

    Mahesh B. Wazade*, Sheelpriya R. Walde and Abhay M. Ittadwar

    2012-01-01

    Validation is an integral part of quality assurance; the product quality is derived from careful attention to a number of factors including selection of quality parts and materials, adequate product and manufacturing process design, control of the process variables, in-process and end-product testing. Recently validation has become one of the pharmaceutical industry’s most recognized and discussed subjects. It is a critical success factor in product approval and ongoing commercialization, fac...

  13. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are associated with notable tool wear. On the other hand, thermal processing methods are critical because the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in the form of pores or delamination. An emerging innovative method for processing CFRP materials is laser technology. As a principally thermal method, laser processing is associated with the release of potentially hazardous gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) under the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  14. Development and validation of a dissolution test for diltiazem hydrochloride in immediate release capsules

    Directory of Open Access Journals (Sweden)

    Taciane Ferreira Mendonça

    2011-01-01

    This work describes the development and validation of a dissolution test for 60 mg of diltiazem hydrochloride in immediate release capsules. The best dissolution in vitro profile was achieved using potassium phosphate buffer at pH 6.8 as the dissolution medium and paddle as the apparatus at 50 rpm. The drug concentrations in the dissolution media were determined by UV spectrophotometry and HPLC and a statistical analysis revealed that there were significant differences between HPLC and spectrophotometry. This study illustrates the importance of an official method for the dissolution test, since there is no official monograph for diltiazem hydrochloride in capsules.

  15. Validation of the Vanderbilt Holistic Face Processing Test

    OpenAIRE

    Wang, Chao-Chih; Ross, David A.; Gauthier, Isabel; Richler, Jennifer J.

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the ...

  16. Validation of the Vanderbilt Holistic Face Processing Test.

    OpenAIRE

    Chao-Chih Wang; Chao-Chih Wang; David Andrew Ross; Isabel Gauthier; Jennifer Joanna Richler

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the ...

  17. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  18. Soil Moisture Active Passive Mission L4_SM Data Product Assessment (Version 2 Validated Release)

    Science.gov (United States)

    Reichle, Rolf Helmut; De Lannoy, Gabrielle J. M.; Liu, Qing; Ardizzone, Joseph V.; Chen, Fan; Colliander, Andreas; Conaty, Austin; Crow, Wade; Jackson, Thomas; Kimball, John

    2016-01-01

    During the post-launch SMAP calibration and validation (Cal/Val) phase there are two objectives for each science data product team: 1) calibrate, verify, and improve the performance of the science algorithm, and 2) validate the accuracy of the science data product as specified in the science requirements and according to the Cal/Val schedule. This report provides an assessment of the SMAP Level 4 Surface and Root Zone Soil Moisture (L4_SM) product specifically for the product's public Version 2 validated release scheduled for 29 April 2016. The assessment of the Version 2 L4_SM data product includes comparisons of SMAP L4_SM soil moisture estimates with in situ soil moisture observations from core validation sites and sparse networks. The assessment further includes a global evaluation of the internal diagnostics from the ensemble-based data assimilation system that is used to generate the L4_SM product. This evaluation focuses on the statistics of the observation-minus-forecast (O-F) residuals and the analysis increments. Together, the core validation site comparisons and the statistics of the assimilation diagnostics are considered primary validation methodologies for the L4_SM product. Comparisons against in situ measurements from regional-scale sparse networks are considered a secondary validation methodology because such in situ measurements are subject to up-scaling errors from the point scale to the grid cell scale of the data product. Based on the limited set of core validation sites, the wide geographic range of the sparse network sites, and the global assessment of the assimilation diagnostics, the assessment presented here meets the criteria established by the Committee on Earth Observing Satellites for Stage 2 validation and supports the validated release of the data. An analysis of the time-average surface and root zone soil moisture shows that the global patterns of arid and humid regions are captured by the L4_SM estimates. Results from the

  19. Development and validation of dissolution study of sustained release dextromethorphan hydrobromide tablets.

    Science.gov (United States)

    Rajan, Sekar; Colaco, Socorrina; Ramesh, N; Meyyanathan, Subramania Nainar; Elango, K

    2014-02-01

    This study describes the development and validation of dissolution tests for sustained release dextromethorphan hydrobromide tablets using an HPLC method. Chromatographic separation was achieved on a C18 column utilizing 0.5% triethylamine (pH 7.5) and acetonitrile in the ratio of 50:50. The detection wavelength was 280 nm. The method was validated and the response was found to be linear in the drug concentration range of 10-80 microg mL(-1). Suitable conditions were established after testing sink conditions, dissolution medium and agitation intensity. The best dissolution conditions tested for dextromethorphan hydrobromide were applied to evaluate the dissolution profiles. The method was established to have sufficient intermediate precision, as similar separation was achieved on another instrument handled by different operators. Mean recovery was 101.82%. Intra-run precisions (%RSD) for three different concentrations were 1.23, 1.10, 0.72 and 1.57, 1.69, 0.95, and inter-run precisions were 0.83, 1.36 and 1.57%, respectively. The method was successfully applied to the dissolution study of the developed dextromethorphan hydrobromide tablets.
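    The %RSD precision figures quoted above are simply the sample standard deviation expressed relative to the mean. A minimal sketch with made-up replicate values (not the study's data):

```python
import statistics

# Percent relative standard deviation (%RSD), the precision metric quoted
# in the abstract, computed from replicate assay results. The replicate
# values below are made up for illustration.
def percent_rsd(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# e.g. replicate recoveries of 101.2, 102.5 and 101.7 %
print(round(percent_rsd([101.2, 102.5, 101.7]), 2))
# → 0.64
```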

  20. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost-effective environmental data collection process should produce analytical data which meet regulatory and program-specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested; the verification process determines whether all the requirements were met. Validation is more complicated than verification: it attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consist of a sample result with an associated error. Therefore, radiochemical validation is different from, and more quantitative than, what is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify the significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations.
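    The statistical comparison described above, where each radiochemical result carries an uncertainty, can be sketched as follows; the coverage factor k = 2 is an assumed convention, not taken from the paper:

```python
import math

# Sketch of a significance test between two radiochemical results, each
# reported as value +/- standard uncertainty. The coverage factor k = 2
# is an assumed convention, not taken from the paper.
def significantly_different(x1, u1, x2, u2, k=2.0):
    """True if |x1 - x2| exceeds k times the combined uncertainty."""
    return abs(x1 - x2) > k * math.sqrt(u1 ** 2 + u2 ** 2)

# 10.0 +/- 0.5 vs 11.8 +/- 0.6 Bq/kg: |1.8| > 2 * 0.78, so different
print(significantly_different(10.0, 0.5, 11.8, 0.6))
# → True
```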

  1. Fast Running Urban Dispersion Model for Radiological Dispersal Device (RDD) Releases: Model Description and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Gowardhan, Akshay; Neuscamman, Stephanie; Donetti, John; Walker, Hoyt; Belles, Rich; Eme, Bill; Homann, Steven; Simpson, Matthew; Nasstrom, John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). National Atmospheric Release Advisory Center (NARAC)]

    2017-05-24

    Aeolus is an efficient three-dimensional computational fluid dynamics code, based on the finite volume method, developed for predicting the transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equations on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature, using the Boussinesq approximation. The model further includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds Averaged Navier-Stokes (RANS) mode with a run time of several minutes, or in a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities, including a decay chain model and an explosive Radiological Dispersal Device (RDD) source term, are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).
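    A Lagrangian dispersion step of the general kind used in such models combines advection by the local mean wind with a random turbulent displacement. A one-dimensional sketch with illustrative parameters (not Aeolus's actual implementation):

```python
import math
import random

# One-dimensional sketch of a Lagrangian particle dispersion step:
# advection by the local mean wind plus a Gaussian turbulent displacement.
# Parameter values are illustrative, not Aeolus's implementation.
def step(x, u_mean, sigma_w, dt, rng):
    return x + u_mean * dt + rng.gauss(0.0, sigma_w * math.sqrt(dt))

rng = random.Random(42)   # fixed seed for reproducibility
x = 0.0
for _ in range(100):      # 100 one-second steps in a 2 m/s mean wind
    x = step(x, u_mean=2.0, sigma_w=0.5, dt=1.0, rng=rng)
# The particle ends near x = 200 m, offset by the accumulated turbulence.
```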

  2. Microsoft Visio 2013 business process diagramming and validation

    CERN Document Server

    Parker, David

    2013-01-01

    Microsoft Visio 2013 Business Process Diagramming and Validation provides a comprehensive and practical tutorial including example code and demonstrations for creating validation rules, writing ShapeSheet formulae, and much more.If you are a Microsoft Visio 2013 Professional Edition power user or developer who wants to get to grips with both the essential features of Visio 2013 and the validation rules in this edition, then this book is for you. A working knowledge of Microsoft Visio and optionally .NET for the add-on code is required, though previous knowledge of business process diagramming

  3. Experiments to validate the assumptions on Pu release in an aircraft crash

    International Nuclear Information System (INIS)

    Seehars, H.D.; Hochrainer, D.

    1983-01-01

    This report describes simulation experiments with the substitute powder CeO₂ to study the release and dispersion of PuO₂ powder induced by kerosene fires after an aeroplane crash on a plutonium-processing fuel element plant. The release rates of CeO₂ powder were found to be a nonlinear function of the kerosene combustion rate. The release rates during a 'micro-scale' fire inside the glovebox (pool area of some 20 cm²) were characterized by values of less than 10 μg/s, those during a conflagration (pool area of some 200 m²) by values of somewhat more than 25 mg/s. Because no other weather conditions occurred, the dispersion experiments were carried out exclusively during weak to moderate winds. The maximum inhalation hazards from production PuO₂ powder induced by small-scale fires essentially exceeded those of large-scale conflagrations. Obviously the activity intake by inhalation exceeded to some extent the admissible threshold of the annual activity intake. (orig.) [de]

  4. Conditional release of materials from decommissioning process into the environment in the form of steel railway tracks

    International Nuclear Information System (INIS)

    Tatransky, Peter; Necas, Vladimir

    2009-01-01

    This work points to the possibilities of conditionally releasing materials from the decommissioning of a nuclear unit. According to the valid legislation, materials which do not meet the conditions for direct, unconditional release into the environment should be processed into a matrix designed for final disposal in a repository. However, there exists a group of materials whose activity is on the borderline of the release limit, and these can be released conditionally. The point of conditional release is that a notable amount of material, mainly metal, is usually contaminated only by radionuclides with relatively short half-lives. Such materials are suitable for specific industrial purposes where long-term fixation of short-lived radionuclides in one place is expected. This work deals with the conditional release of metals in the form of steel railway tracks. It describes the groups of workers working with the steel railway tracks and defines numerically the critical group and its critical individual. For the critical individual it quantifies the amount of material that can be conditionally released from one twin-unit plant of the type VVER 440 V-230 whose operation was terminated as planned. According to calculations with the VISIPLAN and OMEGA software, the amount of released steel is determined such that the internationally recommended maximal effective dose for the critical individual of 10 μSv/year [IAEA, 2008. Managing low radioactivity material from the decommissioning of nuclear facility. Technical reports series no. 462] is not exceeded. In the final part, cost estimates of the decommissioning process with and without the application of conditional release are compared, which is directly reflected in the amount of saved costs and the number of containers for surface disposal.

  5. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO₂ fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, covering various temperatures and fuel-surrounding atmospheres (oxidising and reducing), is simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory agreement between ELSA calculations and experimental measurements demonstrates the ability of the analytical models to describe fission product release under severe accident conditions.
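    Fission product release models of the kind validated here are often written as a temperature-dependent first-order release law, e.g. the CORSOR-M form k(T) = A·exp(−Q/RT). A sketch with illustrative coefficients (not ELSA's actual parameters):

```python
import math

# Sketch of a CORSOR-M-style fractional release law, the general form used
# in fission product release codes: k(T) = A * exp(-Q / (R * T)), with the
# released fraction F(t) = 1 - exp(-k * t). A and Q below are illustrative
# coefficients, not ELSA's actual parameters.
R = 8.314  # J/(mol K)

def release_fraction(A, Q, T, t):
    """Fraction of the inventory released after t seconds at T kelvin."""
    k = A * math.exp(-Q / (R * T))
    return 1.0 - math.exp(-k * t)

# Release accelerates strongly with temperature:
f_2000K = release_fraction(1e5, 3e5, 2000.0, 600.0)
f_2500K = release_fraction(1e5, 3e5, 2500.0, 600.0)
```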

  6. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  7. Development, validation and routine control of a radiation process

    International Nuclear Information System (INIS)

    Kishor Mehta

    2010-01-01

    Today, radiation is used in industrial processing for a variety of applications, from low doses for blood irradiation to very high doses for materials modification, and even higher doses for gemstone colour enhancement. At present, radiation is mainly provided by either radionuclide or machine sources; cobalt-60 is the most predominant radionuclide in use. Currently, there are several hundred irradiation facilities worldwide. As in other industries, quality management systems can assist radiation processing facilities in enhancing customer satisfaction and in maintaining and improving product quality. To help fulfill quality management requirements, several national and international organizations have developed various standards related to radiation processing. They all contain requirements and guidelines for the development, validation and routine control of the radiation process. For radiation processing, these three phases involve the following activities. The development phase includes selecting the type of radiation source, the irradiation facility and the dose required for the process. The validation phase includes conducting activities that give assurance that the process will be successful. Routine control then involves activities that provide evidence that the process has been successfully realized. These standards require documentary evidence that process validation and process control have been followed; dosimetry information gathered during these processes provides this evidence. (authors)

  8. SPECIFICITY OF MANUFACTURING PROCESS VALIDATION FOR DIAGNOSTIC SEROLOGICAL DEVICES

    Directory of Open Access Journals (Sweden)

    O. Yu. Galkin

    2018-02-01

    The aim of this research was to analyze recent scientific literature, as well as national and international legislation, on manufacturing process validation in biopharmaceutical production, in particular for devices for serological diagnostics. Technology validation in the field of medical devices for serological diagnostics is most influenced by the Technical Regulation for Medical Devices for in vitro Diagnostics, the State Standards of Ukraine SSU EN ISO 13485:2015 'Medical devices. Quality management system. Requirements for regulation' and SSU EN ISO 14971:2015 'Medical devices. Instructions for risk management', Instruction ST-N of the Ministry of Healthcare of Ukraine 42-4.0:2014 'Medications. Suitable industrial practice', the State Pharmacopoeia of Ukraine, and the ICH Q9 guideline on risk management. Current recommendations for the validation of drug manufacturing processes, including biotechnological manufacturing, cannot be directly applied to medical devices for in vitro diagnostics. It was shown that the specifics of application and raw materials require individual validation parameters and process validations for serological diagnostics devices. Critical parameters to consider in validation plans were provided for every typical stage of production of in vitro diagnostics devices, using immunoassay kits as an example: obtaining protein antigens, including recombinant ones; preparing mono- and polyclonal antibodies, immunoenzyme conjugates and immunosorbents; chemical reagents; etc. The bottlenecks of technologies for in vitro diagnostics devices were analyzed from the bioethical and biosafety points of view.

  9. Validation of the Vanderbilt Holistic Face Processing Test.

    Science.gov (United States)

    Wang, Chao-Chih; Ross, David A; Gauthier, Isabel; Richler, Jennifer J

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and in the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1.

  10. Validation of the Vanderbilt Holistic Face Processing Test.

    Directory of Open Access Journals (Sweden)

    Chao-Chih Wang

    2016-11-01

    Full Text Available The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and in the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1.

  11. Materials of the Regional Training Course on Validation and Process Control for Electron Beam Radiation Processing

    International Nuclear Information System (INIS)

    Kaluska, I.; Gluszewski, W.

    2007-01-01

    Irradiation with electron beams is used in the polymer industry and in the food, pharmaceutical and medical device industries for sterilization of surfaces. About 20 lectures presented during the Course were devoted to all aspects of control and validation of low energy electron beam processes. They should help product manufacturers better understand the application of the ANSI/AAMI/ISO 11137 standard, which defines the requirements and standard practices for validation of the irradiation process and the process controls required during routine processing.

  12. Validation of a numerical release model (REPCOM) for the Finnish reactor waste disposal systems: Pt.1

    International Nuclear Information System (INIS)

    Nykyri, Mikko

    1987-05-01

    The aim of the work is to model experimentally the inner structures and materials of two reactor waste repositories and to use the results for the validation of a numerical near-field release model, REPCOM. The experimental modelling of the multibarrier systems is conducted on a laboratory scale using the same principal materials as are employed in the Finnish reactor waste disposal concepts. The migration of radionuclides is studied in two or more consecutive material layers. The laboratory arrangements include the following test materials: bituminized resin, cemented resin, concrete, crushed rock, and water. The materials correspond to the local materials in the planned disposal systems. Cs-137, Co-60, Sr-85, and Sr-90 are used as tracers, with which the resin, water, and crushed rock are labeled depending on the specimen type. The basic specimen geometries are cylindrical and cubic. In the cylindrical geometry the test materials were placed into PVC tubes; the corresponding numerical model is one-dimensional. In the cubic geometry the materials were placed inside each other; the boundaries form cubes, and the numerical model is three-dimensional. Altogether 12 test system types were produced. The gamma-active nuclides in the cylindrical samples were measured nondestructively with a scanner in order to determine the activity profiles in the specimens. The gamma-active nuclides in the cubic samples and the beta-emitting Sr-90 in separate samples will be measured after splitting the samples. One to five activity profiles were determined for each cylindrical gamma-active sample. Clear diffusion profiles have already been obtained for strontium in crushed rock, and for cesium in crushed rock and in concrete. Cobalt indicated no diffusion. No activity profiles were measured for the cubic samples or for the beta-active, Sr-90-doped samples.

  13. IT Support for Release Management Processes in the Automotive Industry

    NARCIS (Netherlands)

    Muller, D.; Herbst, J.; Hammori, M.; Reichert, M.U.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    Car development is based on long running, concurrently executed and highly dependent processes. The coordination and synchronization of these processes has become a complex and error-prone task due to the increasing number of functions and embedded systems in modern cars. These systems realize

  14. A novel technique for die-level post-processing of released optical MEMS

    International Nuclear Information System (INIS)

    Elsayed, Mohannad Y; Beaulieu, Philippe-Olivier; Briere, Jonathan; Ménard, Michaël; Nabki, Frederic

    2016-01-01

    This work presents a novel die-level post-processing technique for dies that include released movable structures. The procedure was applied to microelectromechanical systems (MEMS) chips fabricated in a commercial process, SOIMUMPs from MEMSCAP. It allows a clean DRIE etch of sidewalls to be performed on the diced chips, enabling optical testing of the pre-released MEMS mirrors through the chip edges. The etched patterns are defined by photolithography using photoresist spray coating. The photoresist thickness is tuned to create photoresist bridges over the pre-released gaps, protecting the released structures during subsequent wet processing steps. The chips are then subjected to a sequence of wet and dry etching steps prior to dry photoresist removal in O2 plasma. Processed micromirrors were tested and found to rotate similarly to devices without post-processing, demonstrating that the procedure does not significantly affect the mechanical performance of the devices. (technical note)

  15. Study of benzene release from Savannah River in-tank precipitation process slurry simulant

    International Nuclear Information System (INIS)

    Rappe, K.G.; Gauglitz, P.A.

    1998-08-01

    At the Savannah River Site, the in-tank precipitation (ITP) process uses sodium tetraphenylborate (NaTPB) to precipitate radioactive cesium from alkaline wastes. During this process, potassium is also precipitated to form a 4-wt% KTPB/CsTPB slurry. Residual NaTPB decomposes to form benzene, which is retained by the waste slurry. The retained benzene is also readily released from the waste during subsequent waste processing. While the release of benzene certainly poses flammability and toxicological safety concerns, the magnitude of the hazard depends on the rate of release. Currently, the mechanisms controlling the benzene release rates are not well understood, and predictive models for estimating benzene release rates are not available. The overall purpose of this study is to obtain quantitative measurements of benzene release rates from a series of ITP slurry simulants. This information will become a basis for developing a quantitative mechanistic model of benzene release rates. The transient benzene release rate was measured from the surface of various ITP slurry (solution) samples mixed with benzene. The benzene release rate was determined by continuously purging the headspace of a sealed sample vessel with an inert gas (nitrogen) and analyzing that purged headspace vapor for benzene every minute.
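
    The purge-gas measurement described above lends itself to a simple mass-balance calculation. The sketch below (an illustration with invented numbers, not data from the report) estimates the instantaneous release rate from the measured headspace concentration and the purge flow, assuming a quasi-steady headspace:

```python
# Hedged sketch of the headspace-purge mass balance: at quasi-steady
# state, benzene carried out by the purge gas equals the release rate
# from the slurry surface (R = Q * C). Flow rate, sampling interval and
# concentrations are illustrative values, not data from the report.

def release_rates(conc_mg_per_L, purge_L_per_min):
    """Instantaneous release rate (mg/min) from each purge-gas sample."""
    return [purge_L_per_min * c for c in conc_mg_per_L]

def cumulative_release(rates_mg_per_min, dt_min=1.0):
    """Trapezoidal integration of the rate signal over time (mg)."""
    return sum(0.5 * (r0 + r1) * dt_min
               for r0, r1 in zip(rates_mg_per_min, rates_mg_per_min[1:]))

conc = [0.80, 0.60, 0.45, 0.34, 0.25]  # mg/L benzene in purge gas, 1-min samples
rates = release_rates(conc, purge_L_per_min=0.5)
print(rates[0])                   # initial release rate, mg/min
print(cumulative_release(rates))  # benzene released over the run, mg
```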

  16. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  17. The Islamic State Battle Plan: Press Release Natural Language Processing

    Science.gov (United States)

    2016-06-01

    We apply Natural Language Processing (NLP) tools to a unique database of text documents collected by Whiteside (2014). His collection ... from Arabic to English. Compared to other terrorism databases, Whiteside's collection methodology limits the scope of the database and avoids coding ...
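
    As an illustration of the kind of text processing applied to such a press-release corpus, the sketch below tokenizes a toy corpus and counts term frequencies; the documents and stopword list are invented placeholders, not material from the Whiteside database:

```python
# Hedged sketch: basic NLP preprocessing (tokenization, stopword removal,
# term counting) over an invented toy corpus.
import re
from collections import Counter

STOPWORDS = {"the", "of", "in", "a", "an", "and", "to", "on"}

def tokenize(text):
    """Lowercase word tokens with stopwords removed."""
    return [t for t in re.findall(r"[a-z']+", text.lower())
            if t not in STOPWORDS]

def term_frequencies(docs):
    """Corpus-wide term counts."""
    counts = Counter()
    for doc in docs:
        counts.update(tokenize(doc))
    return counts

corpus = [
    "Forces attacked the northern district.",
    "The district reported an attack on forces.",
]
freqs = term_frequencies(corpus)
print(freqs["forces"], freqs["district"])
```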

  18. Processing and validation of intermediate energy evaluated data files

    International Nuclear Information System (INIS)

    2000-01-01

    Current accelerator-driven and other intermediate energy technologies require accurate nuclear data to model the performance of the target/blanket assembly, neutron production, activation, heating and damage. In a previous WPEC subgroup, SG13 on intermediate energy nuclear data, various aspects of intermediate energy data, such as nuclear data needs, experiments, model calculations and file formatting issues were investigated and categorized to come to a joint evaluation effort. The successor of SG13, SG14 on the processing and validation of intermediate energy evaluated data files, goes one step further. The nuclear data files that have been created with the aforementioned information need to be processed and validated in order to be applicable in realistic intermediate energy simulations. We emphasize that the work of SG14 excludes the 0-20 MeV data part of the neutron evaluations, which is supposed to be covered elsewhere. This final report contains the following sections: section 2: a survey of the data files above 20 MeV that have been considered for validation in SG14; section 3: a summary of the review of the 150 MeV intermediate energy data files for ENDF/B-VI and, more briefly, the other libraries; section 4: validation of the data library against an integral experiment with MCNPX; section 5: conclusions. (author)

  19. Validated, Ultra Violet Spectroscopy method for the Dissolution study of Mycophenolate mofetil immediate release 500mg tablets

    OpenAIRE

    Surajpal P. Verma; Ozair Alam; Pooja Mullick; Nadeem Siddiqui; Suroor A. Khan

    2008-01-01

    A simple, selective and precise dissolution method was developed and validated for Mycophenolate mofetil immediate release tablets. The method employed 0.1 N HCl (pH 1.2) as the dissolution medium, with a volume of 900 ml, using USP Apparatus II (paddle). Detection was made by measuring the UV absorbance at the λmax of 250 nm. The method shows linearity over the concentration range 5 µg/ml to 40 µg/ml with r² = 0.999. The method was also validated as per the International Conference on Harmonization ...
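
    The reported linearity (r² = 0.999 over 5-40 µg/ml) corresponds to an ordinary least-squares calibration line. The sketch below, with invented absorbance values, shows how such a calibration is fitted and used to back-calculate an unknown sample concentration:

```python
# Hedged sketch: an ordinary least-squares UV calibration of the kind the
# validated method relies on (absorbance vs. concentration at lambda-max
# 250 nm). The absorbance values are invented for illustration.

def fit_line(x, y):
    """Least-squares slope m and intercept b for y = m*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return m, my - m * mx

conc = [5, 10, 20, 30, 40]                   # ug/mL standards
absorbance = [0.11, 0.22, 0.44, 0.66, 0.88]  # toy, perfectly linear data

m, b = fit_line(conc, absorbance)
unknown = (0.50 - b) / m  # back-calculate a sample from its absorbance
print(round(unknown, 2))  # concentration in ug/mL
```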

  20. Development and preliminary validation of flux map processing code MAPLE

    International Nuclear Information System (INIS)

    Li Wenhuai; Zhang Xiangju; Dang Zhen; Chen Ming'an; Lu Haoliang; Li Jinggang; Wu Yuanbao

    2013-01-01

    The self-reliant flux map processing code MAPLE was developed by China General Nuclear Power Corporation (CGN). The weight coefficient method (WCM), polynomial expansion method (PEM) and thin plate spline (TPS) method were applied to fit the deviation between measured and predicted detector signals in the two-dimensional radial plane, and to interpolate or extrapolate the deviation at non-instrumented locations. Comparison of results in the test cases shows that the TPS method captures the information of curved fitting lines better than the other methods. The measured flux map data of the Lingao Nuclear Power Plant were processed using MAPLE as validation test cases, combined with the SMART code. Validation results show that the calculation results of MAPLE are reasonable and satisfactory. (authors)
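
    MAPLE's implementation is not public, but the thin plate spline fit it names can be sketched generically: given detector positions and measured-minus-predicted deviations, solve for spline weights plus an affine term, then evaluate at a non-instrumented location. The positions and deviations below are invented:

```python
import numpy as np

# Generic 2-D thin plate spline interpolation (an illustration of the TPS
# approach named in the abstract, not MAPLE's actual code; the detector
# positions and deviations are invented).

def tps_phi(r):
    """TPS radial basis: phi(r) = r^2 * ln(r), with phi(0) = 0."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(r > 0.0, r * r * np.log(r), 0.0)

def tps_fit(pts, vals):
    """Solve for spline weights w and affine terms a = (a0, ax, ay)."""
    n = len(pts)
    K = tps_phi(np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2))
    P = np.hstack([np.ones((n, 1)), pts])
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    coef = np.linalg.solve(A, np.concatenate([vals, np.zeros(3)]))
    return coef[:n], coef[n:]

def tps_eval(pts, w, a, query):
    """Evaluate the fitted surface at query points."""
    d = np.linalg.norm(query[:, None, :] - pts[None, :, :], axis=2)
    return tps_phi(d) @ w + a[0] + query @ a[1:]

# Detector positions (normalized core coordinates) and signal deviations:
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
vals = np.array([0.0, 1.0, 1.0, 2.0, 1.0])
w, a = tps_fit(pts, vals)
est = tps_eval(pts, w, a, np.array([[0.25, 0.75]]))  # non-instrumented spot
print(float(est[0]))
```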

  1. Accidental Continuous Releases from Coal Processing in Semi-Confined Environment

    Directory of Open Access Journals (Sweden)

    Bruno Fabiano

    2013-09-01

    Full Text Available Notwithstanding the enforcement of the ATEX EU Directive (94/9/EC of 23 March 1994) and the application of safety management systems, explosions in the coal sector still claim lives and cause huge economic losses. Even a consolidated activity like coke dry distillation offers opportunities for preventing the explosion risk connected to fugitive emissions of coke oven gas. Considering accidental releases under semi-confined conditions, a simplified mathematical approach to the maximum allowed gaseous build-up is developed on the basis of the intrinsic hazards of the released compound. The results help in identifying and assessing the consequences of low-rate releases and thus in setting up appropriate prevention and control measures. The developed methodology was tested at real scale and validated by numerical computational fluid dynamics (CFD) simulations, showing its effectiveness in evaluating and mitigating the risk connected to confined hazardous releases.
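
    A simplified build-up model of the kind described above can be sketched for a well-mixed, ventilated semi-confined volume: the concentration approaches G/Q, so a maximum tolerable leak rate follows from keeping that steady-state value below a fraction of the lower flammability limit (LFL). The LFL value and safety factor below are assumptions for illustration:

```python
import math

# Hedged sketch: well-mixed build-up of a flammable gas in a ventilated,
# semi-confined volume. The LFL and safety factor are assumed values for
# illustration, not figures from the paper.

def concentration(t_s, G_m3_s, Q_m3_s, V_m3):
    """Volume fraction at time t: C(t) = (G/Q) * (1 - exp(-Q*t/V))."""
    return (G_m3_s / Q_m3_s) * (1.0 - math.exp(-Q_m3_s * t_s / V_m3))

def max_allowed_release(Q_m3_s, lfl_vol_frac, safety_factor=0.25):
    """Largest leak rate keeping the steady-state level (G/Q) below a
    chosen fraction of the lower flammability limit."""
    return Q_m3_s * lfl_vol_frac * safety_factor

# Assumed LFL of ~5 vol% (order of magnitude for a fuel gas) and a
# ventilation rate of 0.5 m3/s:
G_max = max_allowed_release(Q_m3_s=0.5, lfl_vol_frac=0.05)
print(G_max)                                  # tolerable leak rate, m3/s
print(concentration(600, G_max, 0.5, 100.0))  # build-up after 10 min
```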

  2. Formulation and evaluation of sustained release matrix tablets of pioglitazone hydrochloride using processed Aloe vera mucilage as release modifier

    Directory of Open Access Journals (Sweden)

    Manoj Choudhary

    2015-01-01

    Full Text Available Background: Natural gums and mucilages, which hydrate and swell on contact with aqueous media, are used as additives in the formulation of hydrophilic drug delivery systems. Aim: The purpose of this study was to develop a new monolithic matrix system for complete delivery of Pioglitazone hydrochloride (HCl in a zero-order manner over an extended time period, using processed Aloe vera gel mucilage (PAG as a release modifier. Materials and Methods: The matrices were prepared by dry blending of selected ratios of polymer and ingredients using the direct compression technique. Physicochemical properties of the dried powdered mucilage of A. vera were studied. Various formulations of pioglitazone HCl and A. vera mucilage were prepared using different drug:polymer ratios (1:1, 1:2, 1:3, 1:4, 1:5 for PAG by direct compression. Results: The formulated matrix tablets were found to have good uniformity of weight and drug content with low statistical deviation. The swelling behavior and in vitro release rate characteristics were also studied. Conclusion: The study showed that the dried A. vera mucilage can be used as a matrix-forming material for controlled release of Pioglitazone HCl matrix tablets.

  3. Study of the Release Process of Open Source Software: Case Study

    OpenAIRE

    Eide, Tor Erik

    2007-01-01

    This report presents the results of a case study focusing on the release process of open source projects initiated with commercial motives. The purpose of the study is to gain an increased understanding of the release process, how a community can be attracted to the project, and how the interaction with the community evolves in commercial open source initiatives. Data has been gathered from four distinct sources to form the basis of this thesis. A thorough review of the open source literatu...

  4. Hot Experiment on Fission Gas Release Behavior from Voloxidation Process using Spent Fuel

    International Nuclear Information System (INIS)

    Park, Geun Il; Park, J. J.; Jung, I. H.; Shin, J. M.; Cho, K. H.; Yang, M. S.; Song, K. C.

    2007-08-01

    Quantitative analysis of the fission gas release characteristics during the voloxidation and OREOX processes of spent PWR fuel was carried out using spent PWR fuel in a hot cell of the DFDF. The release of 85Kr and 14C fission gases during the voloxidation process at 500 °C is closely linked to the degree of conversion of UO2 to U3O8 powder, and it can be interpreted that release from grain boundaries dominates during this step. The volatile fission gases 14C and 85Kr were released to near completion during the OREOX process; both have similar release characteristics under the voloxidation and OREOX process conditions. A higher burn-up spent fuel showed a higher release fraction than a low burn-up fuel during the voloxidation step at 500 °C. It was also observed that the release fraction of semi-volatile Cs was about 16% during reduction of the oxidized powder at 1,000 °C, but over 90% during voloxidation at 1,250 °C.

  5. Validation of new CFD release by Ground-Coupled Heat Transfer Test Cases

    Directory of Open Access Journals (Sweden)

    Sehnalek Stanislav

    2017-01-01

    Full Text Available In this article, validation of ANSYS Fluent against IEA BESTEST Task 34 is presented. The article starts with an overview of the topic; afterward, the steady-state cases used for validation are described, followed by their implementation in CFD. The article concludes with a presentation of the simulated results and a comparison against results from simulation software already validated by the IEA. The validation shows high correlation with an older version of ANSYS as well as with the other main software packages. The paper ends with a discussion outlining future research.
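
    The steady-state ground-coupled cases have analytical (or near-analytical) references against which a CFD result can be judged. As a minimal illustration (with invented dimensions and conductivity, not the Task 34 values), one-dimensional conduction through a uniform slab gives such a reference:

```python
# Hedged sketch: an analytical steady-state conduction reference of the
# kind ground-coupled benchmark cases provide. Dimensions, temperatures
# and soil conductivity are illustrative, not the Task 34 values.

def steady_heat_flow(k_W_mK, area_m2, t_in_C, t_out_C, thickness_m):
    """Fourier's law for 1-D conduction through a uniform slab:
    Q = k * A * (Ti - To) / d."""
    return k_W_mK * area_m2 * (t_in_C - t_out_C) / thickness_m

# 12 m x 12 m floor slab over 2 m of soil (assumed k ~ 1.9 W/m.K):
q = steady_heat_flow(k_W_mK=1.9, area_m2=144.0, t_in_C=30.0, t_out_C=10.0,
                     thickness_m=2.0)
print(q)  # steady heat loss in W
```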

  6. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
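
    The validation idea, scoring the approximation's error in units of the original model's own stochastic variability, can be sketched with a toy stochastic process (not the epidemic models of the paper):

```python
import random
import statistics

# Hedged sketch of the validation idea: score an analytical approximation
# against the stochastic variability of the original model. The "original
# model" is a toy mean-reverting process, not the paper's epidemic models.

def original_model(n_steps, rng):
    """Toy stochastic process whose state relaxes toward 10 with noise."""
    x = 0.0
    for _ in range(n_steps):
        x += 0.5 * (10.0 - x) + rng.gauss(0.0, 0.5)
    return x

def approximation(n_steps):
    """Deterministic approximation: same dynamics with the noise dropped."""
    x = 0.0
    for _ in range(n_steps):
        x += 0.5 * (10.0 - x)
    return x

rng = random.Random(42)
runs = [original_model(20, rng) for _ in range(200)]
mu, sd = statistics.mean(runs), statistics.stdev(runs)
z = abs(approximation(20) - mu) / sd  # error in units of stochastic spread
print(z < 2.0)  # approximation falls within the model's own variability
```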

  7. Interim report: Study of benzene release from Savannah River in-tank precipitation process slurry simulant

    International Nuclear Information System (INIS)

    Rappe, K.G.; Gauglitz, P.A.

    1997-09-01

    At the Savannah River Site, the in-tank precipitation (ITP) process uses sodium tetraphenylborate (NaTPB) to precipitate radioactive cesium from alkaline wastes. During this process, potassium is also precipitated to form a 4-wt% KTPB/CsTPB slurry. Residual NaTPB decomposes to form benzene, which is retained by the waste slurry. The retained benzene is also readily released from the waste during subsequent waste processing. While the release of benzene certainly poses both flammability and toxicological safety concerns, the magnitude of the hazard depends on the rate of release. Currently, the mechanisms controlling the benzene release rates are not well understood, and predictive models for estimating benzene release rates are not available. The overall purpose of this study is to obtain quantitative measurements of benzene release rates from a series of ITP slurry simulants. This information will become a basis for developing a quantitative mechanistic model of benzene release rates. The transient benzene release rate was measured from the surface of various ITP slurry (solution) samples mixed with benzene. The benzene release rate was determined by continuously purging the headspace of a sealed sample vessel with an inert gas (nitrogen) and analyzing that purged headspace vapor for benzene every 3 minutes. The following 75-mL samples were measured for release rates: KTPB slurry with 15,000 ppm freshly added benzene that was gently mixed with the slurry, KTPB slurry homogenized (energetically mixed) with 15,000 ppm and 5,000 ppm benzene, clear and filtered KTPB salt solution saturated with benzene (with and without a pure benzene layer on top of the solution), and a slurry sample from a large demonstration experiment (DEMO slurry) containing benzene generated in situ.

  8. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  9. Best practice strategies for validation of micro moulding process simulation

    DEFF Research Database (Denmark)

    Costa, Franco; Tosello, Guido; Whiteside, Ben

    2009-01-01

    The use of simulation for injection moulding design is a powerful tool which can be used up-front to avoid costly tooling modifications and reduce the number of mould trials. However, the accuracy of the simulation results depends on many component technologies and information, some of which can be easily controlled or known by the simulation analyst and others which are not easily known. For this reason, experimental validation studies are an important tool for establishing best practice methodologies for use during analysis set up on all future design projects. During the validation studies, detailed information about the moulding process is gathered and used to establish these methodologies. In routine design projects, these methodologies are then relied on to provide efficient but reliable working practices. Data analysis and simulations on preliminary micro-moulding experiments have

  10. Nanosized sustained-release pyridostigmine bromide microcapsules: process optimization and evaluation of characteristics

    Science.gov (United States)

    Tan, Qunyou; Jiang, Rong; Xu, Meiling; Liu, Guodong; Li, Songlin; Zhang, Jingqing

    2013-01-01

    Background Pyridostigmine bromide (3-[[(dimethylamino)-carbonyl]oxy]-1-methylpyridinium bromide), a reversible inhibitor of cholinesterase, is given orally in tablet form, and a treatment schedule of multiple daily doses is recommended for adult patients. Nanotechnology was used in this study to develop an alternative sustained-release delivery system for pyridostigmine, a synthetic drug with high solubility and poor oral bioavailability, hence a Class III drug according to the Biopharmaceutics Classification System. Novel nanosized pyridostigmine-poly(lactic acid) microcapsules (PPNMCs) were expected to have a longer duration of action than free pyridostigmine and previously reported sustained-release formulations of pyridostigmine. Methods The PPNMCs were prepared using a double emulsion-solvent evaporation method to achieve sustained-release characteristics for pyridostigmine. The preparation process for the PPNMCs was optimized by single-factor experiments. The size distribution, zeta potential, and sustained-release behavior were evaluated in different types of release medium. Results The optimal volume ratio of inner phase to external phase, poly(lactic acid) concentration, polyvinyl alcohol concentration, and amount of pyridostigmine were 1:10, 6%, 3% and 40 mg, respectively. The negatively charged PPNMCs had an average particle size of 937.9 nm. Compared with free pyridostigmine, PPNMCs showed an initial burst release and a subsequent very slow release in vitro. The release profiles for the PPNMCs in four different types of dissolution medium were fitted to the Ritger-Peppas and Weibull models. The similarity between pairs of dissolution profiles for the PPNMCs in different types of medium was statistically significant, and the difference between the release curves for PPNMCs and free pyridostigmine was also statistically significant. Conclusion PPNMCs prepared by the optimized protocol described here were in the nanometer range and had good uniformity
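
    Fitting a release profile to the Ritger-Peppas model, M_t/M_inf = k·t^n, is typically done by linear regression on log-transformed data; an exponent n near 0.5 indicates Fickian diffusion. The sketch below uses invented release fractions, not the PPNMC dissolution data:

```python
import math

# Hedged sketch: fitting M_t/M_inf = k * t**n (Ritger-Peppas) by linear
# regression on log-transformed data. Times and fractions are invented,
# not the PPNMC dissolution data.

def fit_ritger_peppas(t_h, frac_released):
    """Return (k, n) from least squares on log(frac) vs log(t)."""
    xs = [math.log(t) for t in t_h]
    ys = [math.log(f) for f in frac_released]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    k = math.exp(my - n * mx)
    return k, n

t_h = [1, 2, 4, 8]                 # hours
frac = [0.10, 0.141, 0.20, 0.283]  # ~ 0.1 * sqrt(t)
k, n = fit_ritger_peppas(t_h, frac)
print(round(n, 2))  # exponent near 0.5 suggests Fickian diffusion
```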

  11. Use of fuzzy logic in signal processing and validation

    International Nuclear Information System (INIS)

    Heger, A.S.; Alang-Rashid, N.K.; Holbert, K.E.

    1993-01-01

    The advent of fuzzy logic technology has afforded another opportunity to reexamine the signal processing and validation (SPV) process. The features offered by fuzzy logic can lend themselves to a more reliable and perhaps fault-tolerant approach to SPV. This is particularly attractive for complex system operations, where optimal control for safe operation depends on reliable input data. The reason for using fuzzy logic as the tool for SPV is its ability to transform information from the linguistic domain to a mathematical domain for processing, and then to transform the result back into the linguistic domain for presentation. To ensure the safe and optimal operation of a nuclear plant, for example, reliable and valid data must be available to the human and computer operators. Based on these input data, the operators determine the current state of the power plant and project corrective actions for future states. This determination is based on available data and the conceptual and mathematical models for the plant. A fault-tolerant SPV based on fuzzy logic can help the operators meet the objective of effective, efficient, and safe operation of the nuclear power plant. The ultimate product of this project will be a code that will assist plant operators in making informed decisions under uncertain conditions when conflicting signals may be present.
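
    A minimal sketch of the idea (not the project's actual code): replace a hard accept/reject threshold with a fuzzy membership grade of how well a redundant sensor reading agrees with the expected value. The readings and tolerance below are invented:

```python
# Hedged illustration of fuzzy signal validation: a triangular membership
# function grades each redundant reading in [0, 1] instead of a binary
# accept/reject. Sensor names, readings and tolerance are invented.

def triangular(x, low, peak, high):
    """Triangular fuzzy membership in [0, 1]."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

def validate(readings, expected, tol):
    """Degree to which each redundant reading agrees with expectation."""
    return {name: triangular(v, expected - tol, expected, expected + tol)
            for name, v in readings.items()}

scores = validate({"sensor_a": 300.5, "sensor_b": 301.0, "sensor_c": 320.0},
                  expected=300.0, tol=5.0)
print(scores["sensor_a"])  # close to expectation -> high validity grade
print(scores["sensor_c"])  # outside the fuzzy support -> 0.0
```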

  12. Validation of the production process of core-equipment HYNIC-Bombesin-Sn

    International Nuclear Information System (INIS)

    Rubio C, N. I.

    2008-01-01

    The validation process establishes documented evidence providing a high degree of assurance that a specific process will consistently produce a product meeting its preset specifications and quality attributes, and therefore ensures the efficiency and effectiveness of a product. The radiopharmaceutical 99mTc-HYNIC-Bombesin belongs to the gastrin-releasing peptide (GRP) analogues of bombesin that are radiolabelled with technetium-99 metastable for obtaining molecular images. It is obtained from freeze-dried formulation kits (core-equipment) and has shown very high stability in human serum, specific binding to receptors and rapid internalization. Biodistribution data in mice showed rapid blood clearance with predominant renal excretion and specific binding to tissues with a positive response to GRP receptors. According to biokinetic studies performed on patients with breast cancer, the breasts show a marked asymmetry, with increased uptake in the neoplastic breast, whereas in healthy women the uptake of the radiopharmaceutical is symmetrical in both breasts. No adverse reactions have been reported. In this paper, the prospective validation of the core-equipment HYNIC-Bombesin-Sn is described; it was shown consistently that the product meets the preset specifications and quality attributes for the third-generation diagnostic radiopharmaceutical obtained from it: 99mTc-HYNIC-Bombesin. The process was successfully validated, thereby ensuring the efficiency and effectiveness of this diagnostic agent as a preliminary step toward approval for marketing. (Author)

  13. Leach resistance properties and release processes for salt-occluded zeolite A

    International Nuclear Information System (INIS)

    Lewis, M.A.; Fischer, D.F.; Laidler, J.J.

    1992-01-01

    The pyrometallurgical processing of spent fuel from the Integral Fast Reactor (IFR) results in a waste of LiCl-KCl-NaCl salt containing approximately 10 wt% fission products, primarily CsCl and SrCl2. For disposal, this waste must be immobilized in a form that is leach resistant. A salt-occluded zeolite has been identified as a potential waste form for the salt. Its leach resistance properties were investigated using powdered samples. Strontium was not released, and cesium had a low release, 0.056 g/m² over the 56-day leach test. The initial release (within 7 days) of alkali metal cations was rapid, and subsequent releases were much smaller. The releases of aluminum and silicon were 0.036 and 0.028 g/m², respectively, and were constant. Neither alkali metal cation hydrolysis nor exchange between cations in the leachate and those in the zeolite was significant. Only sodium release followed t^0.5 kinetics. Selective dissolution of the occluded salt was the primary release process. These results confirm that salt-occluded zeolite has promise as the waste form for IFR pyroprocess salt.
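
    Whether a cumulative release follows t^0.5 (diffusion-controlled) kinetics can be checked by testing whether release divided by sqrt(t) is constant across sampling times. The sketch below uses invented release data of the same qualitative form as reported, not the paper's measurements:

```python
import math

# Hedged sketch: a simple test for t**0.5 leach kinetics. If cumulative
# release R(t) = k * sqrt(t), then R/sqrt(t) is the same at every
# sampling time. The release values below are invented.

def follows_sqrt_kinetics(days, cum_release, tol=0.02):
    """True if the fitted rate constant k = R/sqrt(t) is constant
    (within a relative tolerance) across all sampling times."""
    ks = [r / math.sqrt(t) for t, r in zip(days, cum_release)]
    return max(ks) - min(ks) <= tol * max(ks)

days = [7, 14, 28, 56]
sodium = [0.010, 0.0141, 0.020, 0.0283]  # ~ k*sqrt(t): diffusion-controlled
cobalt = [0.001, 0.001, 0.001, 0.001]    # no further release after day 7
print(follows_sqrt_kinetics(days, sodium))
print(follows_sqrt_kinetics(days, cobalt))
```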

  14. Gas release during salt-well pumping: Model predictions and laboratory validation studies for soluble and insoluble gases

    International Nuclear Information System (INIS)

    Peurrung, L.M.; Caley, S.M.; Gauglitz, P.A.

    1997-08-01

    The Hanford Site has 149 single-shell tanks (SSTs) containing radioactive wastes that are complex mixes of radioactive and chemical products. Of these, 67 are known or suspected to have leaked liquid from the tanks into the surrounding soil. Salt-well pumping, or interim stabilization, is a well-established operation for removing drainable interstitial liquid from SSTs. The overall objective of this ongoing study is to develop a quantitative understanding of the release rates and cumulative releases of flammable gases from SSTs as a result of salt-well pumping. The current study is an extension of the previous work reported by Peurrung et al. (1996). The first objective of this current study was to conduct laboratory experiments to quantify the release of soluble and insoluble gases. The second was to determine experimentally the role of characteristic waste heterogeneities on the gas release rates. The third objective was to evaluate and validate the computer model STOMP (Subsurface Transport over Multiple Phases) used by Peurrung et al. (1996) to predict the release of both soluble (typically ammonia) and insoluble gases (typically hydrogen) during and after salt-well pumping. The fourth and final objective of the current study was to predict the gas release behavior for a range of typical tank conditions and actual tank geometry. In these models, the authors seek to include all the pertinent salt-well pumping operational parameters and a realistic range of physical properties of the SST wastes. For predicting actual tank behavior, two-dimensional (2-D) simulations were performed with a representative 2-D tank geometry

  15. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation, which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  16. Electric ignition energy evaluation and the energy distribution structure of energy released in electrostatic discharge process

    International Nuclear Information System (INIS)

    Liu Qingming; Huang Jinxiang; Shao Huige; Zhang Yunming

    2017-01-01

    Ignition energy is one of the important parameters of flammable materials, and evaluating ignition energy precisely is essential to the safety of the process industry and to combustion science and technology. Using an electric spark discharge test system, a series of electric spark discharge experiments were conducted with capacitor-stored energies on the order of 10 J, 100 J, and 1000 J. Evaluation methods for the energy consumed by the electric spark, the wire, and the switch during the capacitor discharge process were studied. The resistance of the wire, the switch, and the plasma between the electrodes was evaluated by different methods and an optimized evaluation method was obtained. The electric energy consumed by the wire, the electric switch, and the spark-induced plasma between the electrodes was obtained, and the energy structure of the capacitor-released energy was analyzed. The dynamic process and the characteristic parameters (maximum power, duration of the discharge process) of the electric spark discharge were analyzed. Experimental results showed that the energy consumed by the electric spark accounts for only 8%–14% of the capacitor-released energy. With increasing capacitor-released energy, the duration of the discharge process becomes longer and the plasma accounts for a larger share of the capacitor-released energy. The power of the electric spark varies with time as a damped sinusoid, whose period and maximum value increase with the capacitor-released energy. (paper)
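The damped-sinusoid discharge and the split of stored energy between circuit elements can be illustrated with an idealized underdamped series-RLC model, in which each series element dissipates a share of the stored energy proportional to its (assumed constant) resistance. All component values below are hypothetical, not those of the paper's test system:

```python
import math

# Hypothetical circuit parameters (not the paper's test system)
C = 100e-6                 # capacitance (F)
V0 = 1000.0                # initial voltage (V); stored energy 0.5*C*V0**2 = 50 J
L = 5e-6                   # loop inductance (H)
R_wire, R_switch, R_spark = 0.13, 0.05, 0.02   # assumed constant resistances (ohm)
R = R_wire + R_switch + R_spark

alpha = R / (2 * L)                    # damping coefficient (1/s)
w0 = 1.0 / math.sqrt(L * C)            # undamped natural frequency (rad/s)
wd = math.sqrt(w0**2 - alpha**2)       # damped frequency; real -> underdamped

def current(t):
    """Underdamped series-RLC discharge current i(t)."""
    return (V0 / (wd * L)) * math.exp(-alpha * t) * math.sin(wd * t)

# Trapezoidal integration of i(t)^2 over the (effectively complete) discharge
dt, T = 1e-8, 2e-3
i2_integral, prev = 0.0, current(0.0) ** 2
steps = int(T / dt)
for n in range(1, steps + 1):
    cur = current(n * dt) ** 2
    i2_integral += 0.5 * (prev + cur) * dt
    prev = cur

E_stored = 0.5 * C * V0**2
E_spark = R_spark * i2_integral        # energy dissipated in the spark gap
print(f"spark share of stored energy: {E_spark / E_stored:.1%}")
```

Because the same current flows through every series element, the spark's share equals R_spark/R (here 10%), which is why the spark receives only a small fraction of the capacitor-released energy, as the experiments above report. A real spark resistance is nonlinear, so this constant-resistance model is only a first approximation.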

  17. Process behavior and environmental assessment of 14C releases from an HTGR fuel reprocessing facility

    International Nuclear Information System (INIS)

    Snider, J.W.; Kaye, S.V.

    1976-01-01

    Large quantities of 14CO2 will be evolved when graphite fuel blocks are burned during reprocessing of spent fuel from HTGR reactors. The possible release of some or all of this 14C to the environment is a matter of concern which is investigated in this paper. Various alternatives are considered in this study for decontaminating and releasing the process off-gas to the environment. Concomitant radiological analyses have been done for the waste process scenarios to supply the necessary feedbacks for process design

  18. The Chernobyl I-131 Release: Model Validation and Assessment of the Countermeasure Effectiveness. Report of the Chernobyl 131I Release Working Group of EMRAS Theme 1

    International Nuclear Information System (INIS)

    2012-01-01

    The Chernobyl 131I Release Working Group (IWG), established within the framework of the EMRAS Programme, continues some of the more traditional work of previous international programmes aimed at increasing confidence in methods and models for the assessment of radiation exposure related to environmental releases. There is still very little information regarding the quantitative relationship between radiation dose to the thyroid from Chernobyl and the risk of thyroid cancer. The uncertainty associated with individual estimates of radiation dose constitutes a crucial point in establishing this relationship, since any release of radioiodine into the environment creates a wide range of uncertainty for internal dose assessments. The 131I scenarios provide an excellent opportunity to compare a number of modelling approaches to a single assessment problem, in a dose reconstruction context. Nine experts in environmental modelling participated in the Plavsk Scenario, dealing with areas of assessment modelling for which the capabilities are not yet well established. A remarkable improvement in model performance was observed compared with previous radioiodine scenarios. Predictions of the various models were within a factor of three of the observations, and discrepancies between the estimates of average doses to the thyroid produced by most participants did not exceed a factor of ten. The process of testing independent model calculations against an independent data set also provided useful information to the originators of the test data.
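The "within a factor of three" agreement criterion used above is a standard predicted-to-observed ratio test in model validation; a sketch with invented numbers:

```python
import math

def within_factor(predicted, observed, factor=3.0):
    """True if predicted/observed lies within [1/factor, factor]."""
    ratio = predicted / observed
    return 1.0 / factor <= ratio <= factor

# Invented model predictions vs. observations (e.g. 131I deposition, kBq/m^2)
pairs = [(120.0, 95.0), (40.0, 70.0), (8.0, 30.0)]
flags = [within_factor(p, o) for p, o in pairs]
print(flags)  # [True, True, False]

# Geometric-mean P/O ratio summarizes the model's overall bias
gm_bias = math.exp(sum(math.log(p / o) for p, o in pairs) / len(pairs))
```

The ratio is bounded on both sides, so the test penalizes over- and under-prediction symmetrically; the geometric mean of the ratios is the usual single-number summary of bias across a scenario.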

  19. How Monte Carlo heuristics aid to identify the physical processes of drug release kinetics.

    Science.gov (United States)

    Lecca, Paola

    2018-01-01

    We implement a Monte Carlo heuristic algorithm to model drug release from a solid dosage form. We show that with Monte Carlo simulations it is possible to identify and explain the causes of the unsatisfactory predictive power of current drug release models. It is well known that the power-law and exponential models, as well as those derived from or inspired by them, accurately reproduce only the first 60% of the release curve of a drug from a dosage form. In this study, using Monte Carlo simulation approaches, we show that these models fit quite accurately almost the entire release profile when the release kinetics is not governed by the coexistence of different physico-chemical mechanisms. We show that the accuracy of the traditional models is comparable with that of Monte Carlo heuristics when these heuristics approximate and oversimplify the phenomenology of drug release. This observation suggests developing and using novel Monte Carlo simulation heuristics able to describe the complexity of the release kinetics, and consequently to generate data more similar to those observed in real experiments. Implementing Monte Carlo simulation heuristics of the drug release phenomenology may be much more straightforward and efficient than hypothesizing and implementing complex mathematical models of the physical processes involved in drug release from scratch. Identifying and understanding through simulation heuristics which processes of this phenomenology reproduce the observed data, and then formalizing them mathematically, may allow avoiding time-consuming, trial-and-error regression procedures. Three bullet points, highlighting the customization of the procedure. •An efficient heuristic based on Monte Carlo methods for simulating drug release from a solid dosage form is presented. It specifies the model of the physical process in a simple but accurate way in the formula of the Monte Carlo Micro Step (MCS) time interval.•Given the experimentally observed curve of
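As a minimal illustration of the kind of Monte Carlo heuristic described (not the authors' algorithm), drug molecules can be modelled as unbiased random walkers on a 1-D lattice, counted as released once they cross the exposed face of the matrix. The lattice depth, particle count, and step rule below are arbitrary choices:

```python
import random

def simulate_release(n_molecules=1000, depth=15, n_steps=2000, seed=1):
    """Fraction of drug released vs. time (in Monte Carlo steps).

    Molecules start at random integer depths in [0, depth) inside a slab
    matrix; each step they move +/-1. Crossing below 0 (the exposed face)
    counts as release; the far face at `depth` is reflecting.
    """
    rng = random.Random(seed)
    positions = [rng.randrange(depth) for _ in range(n_molecules)]
    released, curve = 0, []
    for _ in range(n_steps):
        inside = []
        for x in positions:
            x += rng.choice((-1, 1))
            if x < 0:
                released += 1                     # escaped through exposed face
            else:
                inside.append(min(x, depth - 1))  # reflect at the far face
        positions = inside
        curve.append(released / n_molecules)
    return curve

curve = simulate_release()
# Early release grows roughly like sqrt(t); the curve then saturates toward 1
```

A pure-diffusion walk like this produces the square-root-of-time early behaviour that power-law models capture; adding a second mechanism (e.g. matrix erosion) to the step rule is how such heuristics probe the coexistence of mechanisms discussed above.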

  20. Understanding aroma release from model cheeses by a statistical multiblock approach on oral processing.

    Directory of Open Access Journals (Sweden)

    Gilles Feron

    For human beings, the mouth is the first organ to perceive food and the different signalling events associated with food breakdown. These events are very complex and, as such, their description necessitates combining different data sets. This study proposed an integrated approach to understand the relative contribution of the main food oral processing events involved in aroma release during cheese consumption. In vivo aroma release was monitored on forty-eight subjects who were asked to eat four different model cheeses varying in fat content and firmness and flavoured with ethyl propanoate and nonan-2-one. A multiblock partial least squares regression was performed to explain aroma release from the different physiological data sets (masticatory behaviour, bolus rheology, saliva composition and flux, mouth coating and bolus moistening). This statistical approach was relevant to point out that aroma release was mostly explained by masticatory behaviour whatever the cheese and the aroma, with a specific influence of mean amplitude on aroma release after swallowing. Aroma release from the firmer cheeses was explained mainly by bolus rheology. The persistence of hydrophobic compounds in the breath was mainly explained by bolus spreadability, in close relation with bolus moistening. Resting saliva contributed poorly to the analysis, whereas the composition of stimulated saliva was negatively correlated with aroma release, mostly for soft cheeses, when significant.

  1. Effects of process variables on micromeritic properties and drug release of non-degradable microparticles

    Directory of Open Access Journals (Sweden)

    Mitra Jelvehgari

    2011-06-01

    Introduction: The purpose of this investigation was to evaluate a microencapsulated controlled-release preparation of theophylline using Eudragit RS 100 as the retardant material, with high entrapment efficiency. Methods: Microspheres were prepared by the emulsion-solvent evaporation method. A mixed solvent system consisting of methanol and acetone, with light liquid paraffin as the oily phase, was chosen. Sucrose stearate was used as the surfactant to stabilize the emulsification process. The prepared microspheres were characterized by drug loading, Fourier-transform infrared spectroscopy (FTIR), differential scanning calorimetry (DSC) and scanning electron microscopy (SEM). The in vitro release studies were performed in aqueous media at pH 1.2 and 7.4. Results: Increasing the concentration of the emulsifier, sucrose fatty acid ester F-70, decreased the particle size, which contributed to an increased drug release rate. The Eudragit RS100 microparticles (drug:polymer 1:6) showed 60-75% entrapment and a mean particle size of 205.93-352.76 µm. The results showed that an increase in the polymer:drug ratio (F5, 6:1) resulted in a reduction in the release rate of the drug, which may be attributed to the hydrophobic nature of the polymer. Conclusion: The release of theophylline is influenced by the drug-to-polymer ratio and particle size. Drug release is controlled by diffusion and the best-fit release kinetic is the Higuchi model.

  2. Effects of process variables on micromeritic properties and drug release of non-degradable microparticles.

    Science.gov (United States)

    Jelvehgari, Mitra; Barar, Jaleh; Nokhodchi, Ali; Shadrou, Sanam; Valizadeh, Hadi

    2011-01-01

    The purpose of this investigation was to evaluate a microencapsulated controlled-release preparation of theophylline using Eudragit RS 100 as the retardant material, with high entrapment efficiency. Microspheres were prepared by the emulsion-solvent evaporation method. A mixed solvent system consisting of methanol and acetone, with light liquid paraffin as the oily phase, was chosen. Sucrose stearate was used as the surfactant to stabilize the emulsification process. The prepared microspheres were characterized by drug loading, Fourier-transform infrared spectroscopy (FTIR), differential scanning calorimetry (DSC) and scanning electron microscopy (SEM). The in vitro release studies were performed in aqueous media at pH 1.2 and 7.4. Increasing the concentration of the emulsifier, sucrose fatty acid ester F-70, decreased the particle size, which contributed to an increased drug release rate. The Eudragit RS100 microparticles (drug:polymer 1:6) showed 60-75% entrapment and a mean particle size of 205.93-352.76 μm. The results showed that an increase in the polymer:drug ratio (F5, 6:1) resulted in a reduction in the release rate of the drug, which may be attributed to the hydrophobic nature of the polymer. The release of theophylline is influenced by the drug-to-polymer ratio and particle size. Drug release is controlled by diffusion and the best-fit release kinetic is the Higuchi model.
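The Higuchi model named as the best fit predicts cumulative release proportional to the square root of time, Q(t) = k_H·√t. A sketch fitting k_H by least squares through the origin; the dissolution data are invented for illustration, not taken from the study:

```python
import math

def fit_higuchi(times_h, released_pct):
    """Least-squares Higuchi constant k_H for Q = k_H * sqrt(t), through origin."""
    x = [math.sqrt(t) for t in times_h]
    return (sum(xi * q for xi, q in zip(x, released_pct))
            / sum(xi * xi for xi in x))

# Invented dissolution data: time (h) vs. cumulative % drug released
t_h = [0.5, 1, 2, 4, 6, 8]
q_pct = [14.0, 20.5, 28.7, 41.0, 49.5, 57.0]
k_H = fit_higuchi(t_h, q_pct)
print(k_H)  # ~20, in % released per sqrt(hour)
```

A high coefficient of determination for this √t fit, compared with zero-order or first-order alternatives, is the usual evidence for the diffusion-controlled release mechanism concluded above.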

  3. Sustained-release microsphere formulation containing an agrochemical by polyurethane polymerization during an agitation granulation process.

    Science.gov (United States)

    Terada, Takatoshi; Tagami, Manabu; Ohtsubo, Toshiro; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-07-25

    In this report, a new solventless microencapsulation method was developed, in which polyurethane (PU) is synthesized from polyol and isocyanate during an agglomeration process in a high-speed mixing apparatus. Clothianidin (CTD), a neonicotinoid insecticide that is highly effective against a wide variety of insect pests, was used as the model compound. The microencapsulated samples covered with PU (CTD microspheres) had a median diameter of <75 μm and sustained-release properties. The CTD microspheres were analyzed by synchrotron X-ray computed tomography measurements. Multiple cores of CTD and other solid excipient were dispersed in PU. Although voids appeared in the CTD microspheres after CTD release, the spherical shape of the microspheres remained stable and no change in the framework was observed. The experimental release data were highly consistent with the Baker-Lonsdale model, derived for drug release from spherical monolithic dispersions, and consistent with the computed tomography measurements. Copyright © 2016 Elsevier B.V. All rights reserved.
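The Baker-Lonsdale model cited above relates the fraction released f = Mt/M∞ from a spherical monolithic dispersion to time via (3/2)[1 − (1 − f)^(2/3)] − f = k·t. A sketch that linearizes release data to estimate k; the data here are synthetic, generated to follow the model with k ≈ 0.001 h⁻¹, not the paper's measurements:

```python
def baker_lonsdale_lhs(f):
    """Left-hand side of the Baker-Lonsdale equation for fraction released f."""
    return 1.5 * (1.0 - (1.0 - f) ** (2.0 / 3.0)) - f

def fit_k(times, fractions):
    """Least-squares slope k (through origin) of LHS(f) versus time."""
    y = [baker_lonsdale_lhs(f) for f in fractions]
    return (sum(t * yi for t, yi in zip(times, y))
            / sum(t * t for t in times))

# Synthetic release data generated to follow the model with k ~ 0.001 per hour
times_h = [10.0, 20.0, 40.0, 80.0]
fractions = [0.23, 0.32, 0.435, 0.584]
k = fit_k(times_h, fractions)
print(k)  # ~0.001 per hour
```

Plotting the transformed left-hand side against time and checking for linearity is the usual way a data set is judged "highly consistent" with this model.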

  4. Analyses of postulated accidental releases of UF6 inside process buildings

    International Nuclear Information System (INIS)

    Oliveira Neto, Jose Messias de; Nunes, Beatriz Guimaraes; Dias, Cristiane

    2009-01-01

    Uranium hexafluoride is a material used in the various processes which comprise the front end of the nuclear fuel cycle (conversion, enrichment and fuel fabrication). Confinement of UF6 is a very important safety requirement, since this material is highly reactive and presents safety hazards to humans. The present paper discusses the safety-relevant aspects of accidental releases of UF6 inside process confinement buildings. Postulated accident scenarios are analyzed and their consequences evaluated. In-plant release rates are estimated using computer code predictions. A time-dependent homogeneous compartment model is used to predict concentrations of UF6, hydrogen fluoride and uranyl fluoride inside a confinement building, as well as to evaluate source terms released to the atmosphere. These source terms can be used as input to atmospheric dispersion models to evaluate consequences to the environment. The results can also be used to define adequate protective measures for emergency situations. (author)

  5. Modeling and validating tritium transfer in a grassland ecosystem in response to 3H releases

    Energy Technology Data Exchange (ETDEWEB)

    Le Dizes, S. [Institute for Radioprotection and Nuclear Safety, IRSN/PRP-ENV/SERIS/LM2E, Centre de Cadarache, Saint-Paul-lez-Durance (France); Maro, D.; Rozet, M.; Hebert, D. [IRSN/PRP-ENV/SERIS/LRC, Cherbourg-Octeville (France)

    2015-03-15

    In this paper a radioecological model (TOCATTA) for tritium transfer in a grassland ecosystem developed on an hourly time-step basis is proposed and compared with the first data set obtained in the vicinity of the AREVA-NC reprocessing plant of La Hague (France). The TOCATTA model aims at simulating dynamics of tritium transfer in agricultural soil and plant ecosystems exposed to time-varying HTO concentrations in air water vapour and possibly in irrigation and rain water. In the present study, gaseous releases of tritium from the AREVA NC nuclear reprocessing plant in normal operation can be intense and intermittent over a period of less than 24 hours. A first comparison of the model predictions with the field data has shown that TOCATTA should be improved in terms of kinetics of tritium transfer.

  6. Laser Processing of Carbon Fiber Reinforced Plastics - Release of Carbon Fiber Segments During Short-pulsed Laser Processing of CFRP

    Science.gov (United States)

    Walter, Juergen; Brodesser, Alexander; Hustedt, Michael; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    Cutting and ablation using short-pulsed laser radiation are promising technologies to produce or repair CFRP components with outstanding mechanical properties, e.g., for the automotive and aircraft industries. Using sophisticated laser processing strategies and avoiding excessive heating of the workpiece, a high processing quality can be achieved. However, the interaction of laser radiation and composite material causes a notable release of hazardous substances from the process zone, among them carbon fiber segments and fibrous particles. In this work, the amounts and geometries of the released fiber segments are analyzed and discussed in terms of their hazardous potential. Moreover, it is investigated to what extent gaseous organic process emissions are adsorbed on the fiber segments, similar to the adsorption of volatile organic compounds on activated carbon, which is typically used as a filter material.

  7. Towards elucidation of the drug release mechanism from compressed hydrophilic matrices made of cellulose ethers. III. Critical use of thermodynamic parameters of activation for modeling the water penetration and drug release processes.

    Science.gov (United States)

    Ferrero, Carmen; Massuelle, Danielle; Jeannerat, Damien; Doelker, Eric

    2013-09-10

    The two main purposes of this work were: (i) to critically consider the use of thermodynamic parameters of activation for elucidating the drug release mechanism from hydroxypropyl methylcellulose (HPMC) matrices, and (ii) to examine the effect of neutral (pH 6) and acidic (pH 2) media on the release mechanism. For this, caffeine was chosen as the model drug and various processes were investigated for the effect of temperature and pH: caffeine diffusion in solution and in HPMC gels, and drug release from and water penetration into the HPMC tablets. Generally, the kinetics of the processes was not significantly affected by pH. As for the temperature dependence, the activation energy (E(a)) values calculated from caffeine diffusivities were in the range of Fickian transport (20-40 kJ mol⁻¹). Regarding caffeine release from HPMC matrices, fitting the profiles using the Korsmeyer-Peppas model would indicate anomalous transport. However, the low apparent E(a) values obtained were not compatible with a swelling-controlled mechanism and can be assigned to the dimensional change of the system during drug release. Unexpectedly, negative apparent E(a) values were calculated for the water uptake process, which can be ascribed to the exothermic dissolution of water into the initially dry HPMC, the expansion of the matrix and the polymer dissolution. Taking these contributions into account, the true E(a) would fall into the range valid for Fickian diffusion. Consequently, a relaxation-controlled release mechanism can be dismissed. The apparent anomalous drug release from HPMC matrices results from a coupled Fickian diffusion-erosion mechanism, both at pH 6 and pH 2. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Additive advantage in characteristics of MIMCAPs on flexible silicon (100) fabric with release-first process

    KAUST Repository

    Ghoneim, Mohamed T.; Rojas, Jhonathan Prieto; Hussain, Aftab M.; Hussain, Muhammad Mustafa

    2013-01-01

    We report the inherent increase in capacitance per unit planar area of state-of-the art high-κ integrated metal/insulator/metal capacitors (MIMCAPs) fabricated on flexible silicon fabric with release-first process. We methodically study and show

  9. Requirements for the workflow-based support of release management processes in the automotive sector

    NARCIS (Netherlands)

    Bestfleisch, U.; Herbst, J.; Reichert, M.U.; Abdelmalek, B.

    One of the challenges the automotive industry currently has to master is the complexity of the electrical/electronic system of a car. One key factor for reaching short product development cycles and high quality in this area is well-defined, properly executed test and release processes. In this

  10. Drug-releasing shape-memory polymers - the role of morphology, processing effects, and matrix degradation.

    Science.gov (United States)

    Wischke, Christian; Behl, Marc; Lendlein, Andreas

    2013-09-01

    Shape-memory polymers (SMPs) have gained interest for temporary drug-release systems that should be anchored in the body by self-sufficient active movements of the polymeric matrix. Based on the so far published scientific literature, this review highlights three aspects that require particular attention when combining SMPs with drug molecules: i) the defined polymer morphology as required for the shape-memory function, ii) the strong effects that processing conditions such as drug-loading methodologies can have on the drug-release pattern from SMPs, and iii) the independent control of drug release and degradation by their timely separation. The combination of SMPs with a drug-release functionality leads to multifunctional carriers that are an interesting technology for pharmaceutical sciences and can be further expanded by new materials such as thermoplastic SMPs or temperature-memory polymers. Experimental studies should include relevant molecules as (model) drugs and provide a thermomechanical characterization also in an aqueous environment, report on the potential effect of drug type and loading levels on the shape-memory functionality, and explore the potential correlation of polymer degradation and drug release.

  11. Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.

    Science.gov (United States)

    Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S

    2014-07-01

    This work focused on the control of the manufacturing process for a controlled release (CR) pellet product, within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: firstly layering active pharmaceutical ingredient (API) onto sugar pellet cores and secondly a controlled release (CR) coating. For each of these two steps, development of a Process Analytical Technology (PAT) method is discussed and also a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness and dissolution performance. These correlations allow the coating process to be monitored at-line and so better control of the product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.
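The chemometric link between NIR spectra and coating attributes is, at its simplest, a calibration of a spectral response against the reference value. The single-wavelength ordinary-least-squares fit below is a deliberately simplified, hypothetical stand-in for the multivariate PLS models such studies actually build; all data values are invented:

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# Invented at-line data: NIR band response vs. reference coating weight gain (%)
absorbance = [0.11, 0.18, 0.26, 0.33, 0.41]
weight_gain = [2.1, 4.0, 6.2, 7.9, 10.1]
a, b = linear_fit(absorbance, weight_gain)

# Predict coating weight gain for a new in-process spectrum reading
pred = a + b * 0.30
print(round(pred, 2))  # % weight gain
```

In practice the full spectrum is used and a PLS model compresses the correlated wavelengths into a few latent variables, but the at-line monitoring idea is the same: spectra in, predicted CQA out, without destructive sampling.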

  12. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of the DRPS V and V process are preparation of software planning documentation, verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V of DRPS comprises technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing the integrated system test plan, software safety analysis, and software configuration management. Likewise, SDS V and V of DRPS comprises technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing the integrated software test plan, software safety analysis, and software configuration management. The code V and V of DRPS comprises traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis at the SRS phase uses the Hazard and Operability (HAZOP) method, at the SDS phase it uses HAZOP and Fault Tree Analysis (FTA), and at the implementation phase it uses FTA. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project.
Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  13. Release process for non-real property containing residual radioactive material

    International Nuclear Information System (INIS)

    Ranek, N.L.; Chen, S.Y.; Kamboj, S.; Hensley, J.; Burns, D.; Fleming, R.; Warren, S.; Wallo, A.

    1997-01-01

    It is DOE's objective to operate its facilities and to conduct its activities so that radiation exposures to members of the public are maintained within acceptable limits and exposures to residual radioactive materials are controlled. To accomplish this, DOE has adopted Order DOE 5400.5, 'Radiation Protection of the Public and the Environment', and will be promulgating 10 CFR Part 834 to codify and clarify the requirements of DOE 5400.5. Under both DOE 5400.5 and 10 CFR Part 834, radioactively contaminated DOE property is prohibited from release unless specific actions have been completed prior to the release. This paper outlines a ten-step process that, if followed, will assist DOE Operations and contractor personnel in ensuring that the required actions established by Order DOE 5400.5 and 10 CFR Part 834 have been appropriately completed prior to the release for reuse or recycling of non-real property (e.g., office furniture, computers, hand tools, machinery, vehicles and scrap metal). Following the process will assist in ensuring that radiological doses to the public from the released materials meet applicable regulatory standards and are as low as reasonably achievable (ALARA)

  14. Airborne engineered nanomaterials in the workplace—a review of release and worker exposure during nanomaterial production and handling processes

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yaobo [Institute for Work and Health (IST), Universities of Lausanne and Geneva, Route de la Corniche 2, 1066, Epalinges (Switzerland); Kuhlbusch, Thomas A.J. [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Centre for Nanointegration (CENIDE), University Duisburg-Essen, Duisburg (Germany); Van Tongeren, Martie; Jiménez, Araceli Sánchez [Centre for Human Exposure Science, Institute of Occupational Medicine (IOM), Research Avenue North, Edinburgh EH14 4AP (United Kingdom); Tuinman, Ilse [TNO, Lange Kleiweg 137, Rijswijk (Netherlands); Chen, Rui [CAS Key Laboratory for Biomedical Effects of Nanomaterials and Nanosafety & CAS Center for Excellence in Nanoscience, National Center for Nanoscience and Technology of China, Beijing 100190 (China); Alvarez, Iñigo Larraza [ACCIONA Infrastructure, Materials Area, Innovation Division, C/Valportillo II 8, 28108, Alcobendas (Spain); Mikolajczyk, Urszula [Nofer Institute of Occupational Medicine, Lodz (Poland); Nickel, Carmen; Meyer, Jessica; Kaminski, Heinz [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Wohlleben, Wendel [Dept. Material Physics, BASF SE, Advanced Materials Research, Ludwigshafen (Germany); Stahlmecke, Burkhard [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Clavaguera, Simon [NanoSafety Platform, Commissariat à l’Energie Atomique et aux Energies Alternatives (CEA), Univ. Grenoble Alpes, Grenoble, 38054 (France); and others

    2017-01-15

    Highlights: • Release characteristics can be grouped by the type of occupational activities. • Release levels may be linked to process energy. • A better data reporting practice will facilitate exposure assessment. • The results help prioritize industrial processes for human risk assessment. - Abstract: For exposure and risk assessment in occupational settings involving engineered nanomaterials (ENMs), it is important to understand the mechanisms of release and how they are influenced by the ENM, the matrix material, and process characteristics. This review summarizes studies providing ENM release information in occupational settings, during different industrial activities and using various nanomaterials. It also assesses the contextual information — such as the amounts of materials handled, protective measures, and measurement strategies — to understand which release scenarios can result in exposure. High-energy processes such as synthesis, spraying, and machining were associated with the release of large numbers of predominantly small-sized particles. Low-energy processes, including laboratory handling, cleaning, and industrial bagging activities, usually resulted in slight or moderate releases of relatively large agglomerates. The present analysis suggests that process-based release potential can be ranked, thus helping to prioritize release assessments, which is useful for tiered exposure assessment approaches and for guiding the implementation of workplace safety strategies. The contextual information provided in the literature was often insufficient to directly link release to exposure. The studies that did allow an analysis suggested that significant worker exposure might mainly occur when engineering safeguards and personal protection strategies were not carried out as recommended.

  15. Airborne engineered nanomaterials in the workplace—a review of release and worker exposure during nanomaterial production and handling processes

    International Nuclear Information System (INIS)

    Ding, Yaobo; Kuhlbusch, Thomas A.J.; Van Tongeren, Martie; Jiménez, Araceli Sánchez; Tuinman, Ilse; Chen, Rui; Alvarez, Iñigo Larraza; Mikolajczyk, Urszula; Nickel, Carmen; Meyer, Jessica; Kaminski, Heinz; Wohlleben, Wendel; Stahlmecke, Burkhard; Clavaguera, Simon

    2017-01-01

    Highlights: • Release characteristics can be grouped by the type of occupational activities. • Release levels may be linked to process energy. • A better data reporting practice will facilitate exposure assessment. • The results help prioritize industrial processes for human risk assessment. - Abstract: For exposure and risk assessment in occupational settings involving engineered nanomaterials (ENMs), it is important to understand the mechanisms of release and how they are influenced by the ENM, the matrix material, and process characteristics. This review summarizes studies providing ENM release information in occupational settings, during different industrial activities and using various nanomaterials. It also assesses the contextual information — such as the amounts of materials handled, protective measures, and measurement strategies — to understand which release scenarios can result in exposure. High-energy processes such as synthesis, spraying, and machining were associated with the release of large numbers of predominantly small-sized particles. Low-energy processes, including laboratory handling, cleaning, and industrial bagging activities, usually resulted in slight or moderate releases of relatively large agglomerates. The present analysis suggests that process-based release potential can be ranked, thus helping to prioritize release assessments, which is useful for tiered exposure assessment approaches and for guiding the implementation of workplace safety strategies. The contextual information provided in the literature was often insufficient to directly link release to exposure. The studies that did allow an analysis suggested that significant worker exposure might mainly occur when engineering safeguards and personal protection strategies were not carried out as recommended.

  16. Flight Dynamic Simulation of Fighter In the Asymmetric External Store Release Process

    Science.gov (United States)

    Safi’i, Imam; Arifianto, Ony; Nurohman, Chandra

    2018-04-01

    In fighter design, it is important to evaluate and analyze the flight dynamics of the aircraft early in the development process. One such case is the dynamics of the external store release process. A simulation tool can be used to analyze the fighter/external-store system's dynamics in the preliminary design stage. This paper reports simulations of the flight dynamics of the Jet Fighter Experiment (JF-1 E) during an asymmetric release of an Advanced Medium Range Air-to-Air Missile (AMRAAM). The JF-1 E and AIM-120 AMRAAM models are built using the Advanced Aircraft Analysis (AAA) and Missile Datcom software. These tools yield the aerodynamic stability and control derivatives used to model the dynamic characteristics of the fighter and the external store. The dynamic system is modeled in MATLAB/Simulink, in which both the fighter/external-store integration and the external store release process are simulated so that the dynamics of the system can be analyzed.
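A full store-separation analysis needs the six-degree-of-freedom models and AAA/Datcom derivatives described in the abstract; as a much simpler illustration of the release kinematics, the sketch below integrates a hypothetical point-mass store under gravity and quadratic drag. All numbers are illustrative, not JF-1 E or AIM-120 data.

```python
def store_drop(v0=200.0, mass=150.0, cd_area=0.05, rho=1.0, dt=0.01, t_end=2.0):
    """Point-mass trajectory of a store after release: gravity plus
    quadratic drag, integrated with explicit Euler steps.

    v0      : aircraft speed at release [m/s] (inherited by the store)
    mass    : store mass [kg]
    cd_area : drag coefficient times reference area [m^2]
    rho     : air density [kg/m^3]
    Returns (x, z, vx, vz) at t_end; z is measured downward from the pylon.
    """
    g = 9.81
    x = z = 0.0
    vx, vz = v0, 0.0
    for _ in range(int(round(t_end / dt))):
        v = (vx * vx + vz * vz) ** 0.5
        drag = 0.5 * rho * cd_area * v * v   # opposes the velocity vector
        ax = -drag * vx / (v * mass)
        az = g - drag * vz / (v * mass)
        vx += ax * dt
        vz += az * dt
        x += vx * dt
        z += vz * dt
    return x, z, vx, vz
```

A real separation study would add the store's pitch/yaw response and the aircraft's interference flow field, which is what the Simulink model in the paper captures.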

  17. Disruption of Brewers' yeast by hydrodynamic cavitation: Process variables and their influence on selective release.

    Science.gov (United States)

    Balasundaram, B; Harrison, S T L

    2006-06-05

    Intracellular products that are not secreted from the microbial cell are released by breaking the cell envelope, which consists of a cytoplasmic membrane and an outer cell wall. Hydrodynamic cavitation has been reported to cause microbial cell disruption. By manipulating the operating variables involved, a wide range of cavitation intensities can be achieved, resulting in varying extents of disruption. The effects of process variables, including the cavitation number, the initial cell concentration of the suspension, and the number of passes across the cavitation zone, on the release of enzymes from various locations in Brewers' yeast were studied. The enzymes whose release profiles were studied include alpha-glucosidase (periplasmic), invertase (cell wall bound), alcohol dehydrogenase (ADH; cytoplasmic) and glucose-6-phosphate dehydrogenase (G6PDH; cytoplasmic). An optimum cavitation number Cv of 0.13 for maximum disruption was observed across the range Cv 0.09-0.99. The optimum cell concentration was found to be 0.5% (w/v, wet wt) when varied over the range 0.1%-5%. The sustained effect of cavitation on the yeast cell wall when re-circulating the suspension across the cavitation zone was found to release the cell-wall-bound enzyme invertase (86%) to a greater extent than enzymes from other locations in the cell (e.g. periplasmic alpha-glucosidase at 17%). Localised damage to the cell wall could be observed by transmission electron microscopy (TEM) of cells subjected to less intense cavitation conditions. The absence of significant release of cytoplasmic enzymes, the absence of micronisation as observed by TEM, and the presence of fewer protein bands in the culture supernatant on SDS-PAGE analysis following hydrodynamic cavitation, compared with disruption by high-pressure homogenisation, confirmed the selective release offered by hydrodynamic cavitation. Copyright 2006 Wiley Periodicals, Inc.
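The cavitation number quoted above can be computed directly. One common definition for orifice/venturi devices (an assumption here; the paper may use a slightly different form) is Cv = (p2 − pv) / (½ρv²):

```python
def cavitation_number(p_downstream, p_vapour, rho, velocity):
    """One common form of the cavitation number for an orifice/venturi:
    Cv = (p2 - pv) / (0.5 * rho * v^2),
    where p2 is the fully recovered downstream pressure, pv the liquid
    vapour pressure, and v the velocity at the constriction.
    Lower Cv means more intense cavitation."""
    return (p_downstream - p_vapour) / (0.5 * rho * velocity**2)

# Illustrative water properties, not the paper's operating data:
cv = cavitation_number(p_downstream=101325.0, p_vapour=2339.0,
                       rho=998.0, velocity=39.0)
```

With these illustrative numbers, Cv lands near the 0.13 optimum reported in the abstract.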

  18. Evaluation of microplastic release caused by textile washing processes of synthetic fabrics.

    Science.gov (United States)

    De Falco, Francesca; Gullo, Maria Pia; Gentile, Gennaro; Di Pace, Emilia; Cocca, Mariacristina; Gelabert, Laura; Brouta-Agnésa, Marolda; Rovira, Angels; Escudero, Rosa; Villalba, Raquel; Mossotti, Raffaella; Montarsolo, Alessio; Gavignano, Sara; Tonin, Claudio; Avella, Maurizio

    2018-05-01

    A new and more alarming source of marine contamination has recently been identified in micro- and nanosized plastic fragments. Microplastics are difficult to see with the naked eye and difficult to biodegrade in the marine environment, representing a problem since they can be ingested by plankton or other marine organisms, potentially entering the food web. An important source of microplastics appears to be sewage contaminated by synthetic fibres from washing clothes. Since this phenomenon still lacks a comprehensive analysis, the objective of this contribution was to investigate the role of washing processes of synthetic textiles in microplastic release. In particular, an analytical protocol was set up, based on filtration of the washing water of synthetic fabrics and analysis of the filters by scanning electron microscopy. The microfibre shedding from three different synthetic fabric types (woven polyester, knitted polyester, and woven polypropylene) during washing trials simulating domestic conditions was quantified and statistically analysed. The highest release of microplastics was recorded for the wash of woven polyester, and this was correlated with the fabric characteristics. Moreover, the extent of microfibre release from woven polyester fabrics under different detergents, washing parameters and industrial washes was evaluated. The number of microfibres released from a typical 5 kg wash load of polyester fabrics was estimated to be over 6,000,000, depending on the type of detergent used. The use of a softener during washes reduces the number of microfibres released by more than 35%. The amount and size of the released microfibres confirm that they cannot be totally retained by wastewater treatment plants and can potentially affect the aquatic environment. Copyright © 2017 Elsevier Ltd. All rights reserved.
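The per-load estimate in the abstract is a scaling calculation from microscope counts on a filtered aliquot; a minimal sketch (with hypothetical counts, not the paper's data) looks like:

```python
def fibres_per_load(count_on_filter, aliquot_ml, wash_water_ml, fabric_g,
                    load_kg=5.0):
    """Scale a microscope fibre count up to a full wash load.

    count_on_filter : fibres counted on the filtered aliquot
    aliquot_ml      : volume of washing water filtered [mL]
    wash_water_ml   : total washing water used for the tested fabric [mL]
    fabric_g        : mass of fabric washed in the trial [g]
    load_kg         : assumed domestic load mass [kg]
    """
    fibres_per_wash = count_on_filter * (wash_water_ml / aliquot_ml)
    fibres_per_gram = fibres_per_wash / fabric_g
    return fibres_per_gram * load_kg * 1000.0

# Hypothetical trial: 1300 fibres counted in a 50 mL aliquot of 500 mL
# of washing water from a 10 g fabric swatch
estimate = fibres_per_load(1300, 50.0, 500.0, 10.0)
```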

  19. Experimental Study on Ice Forming Process of Cryogenic Liquid Releasing underwater

    Science.gov (United States)

    Zhang, Bin; Wu, Wanqing; Zhang, Xingdong; Zhang, Yi; Zhang, Chuanlin; Zhang, Haoran; Wang, Peng

    2017-11-01

    The release of a cryogenic liquid into water is a process that combines vigorous boiling with ice formation. There has been little experimental study of the environmental conditions that determine the ice forming speed and the liquid's surviving state. In this paper, to advance our understanding of the factors that govern ice formation when LN2 is released underwater, a visualization experimental system was built. The results show that the pressure difference significantly influences the ice forming speed and the liquid surviving distance, which was observed experimentally and analysed theoretically in terms of the Kelvin-Helmholtz instability. Adding a nucleating agent helps provide ice nuclei, which accelerates the ice forming speed. Water flow has some effect on the pressure difference, and thereby affects the ice forming speed and the liquid surviving distance.
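The Kelvin-Helmholtz argument invoked above has a textbook threshold: for two superposed inviscid fluids with gravity and surface tension, the interface becomes unstable once the relative velocity exceeds a critical value. The sketch below evaluates that classical criterion; the fluid properties are illustrative, not the paper's conditions.

```python
import math

def kh_critical_velocity(rho_heavy, rho_light, surface_tension, g=9.81):
    """Textbook Kelvin-Helmholtz threshold for two superposed inviscid
    fluids with gravity and surface tension: the interface is unstable
    once the relative velocity dU satisfies
        dU^2 > 2 * (r1 + r2) / (r1 * r2) * sqrt(g * sigma * (r1 - r2)),
    where r1 is the heavier (lower) fluid density."""
    r1, r2, s = rho_heavy, rho_light, surface_tension
    return math.sqrt(2.0 * (r1 + r2) / (r1 * r2)
                     * math.sqrt(g * s * (r1 - r2)))

# Illustrative: water (998 kg/m^3) over a cold nitrogen vapour layer
# (~4.6 kg/m^3), with an assumed interfacial tension of 8.9 mN/m
du = kh_critical_velocity(998.0, 4.6, 0.0089)
```

A relative velocity above this threshold (about 2 m/s for the illustrative values) destabilizes the interface, consistent with the paper's observation that the pressure difference, which drives the relative velocity, controls the breakup and ice forming behaviour.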

  20. Important processes affecting the release and migration of radionuclides from a deep geological repository

    International Nuclear Information System (INIS)

    Barátová, Dana; Nečas, Vladimír

    2017-01-01

    The processes that significantly affect the transport of contaminants through the near field and far field of a deep geological repository for spent nuclear fuel were studied. These processes can be broadly divided into (i) processes related to the release of radionuclides from the spent nuclear fuel; (ii) processes related to the radionuclide transport mechanisms (such as advection and diffusion); and (iii) processes affecting the rate of radionuclide migration through the multi-barrier repository system. A near-field and geosphere model of an unspecified geological repository sited in crystalline rock is also described. The treatment focuses on the effects of the different processes on the activity flow of the major safety-relevant radionuclides. The activity flow was simulated for one spent fuel cask using the GoldSim simulation tool. (orig.)
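For the transport-mechanism class (ii) above, the simplest analytic picture is steady-state one-dimensional diffusion with radioactive decay, where the concentration falls as exp(−L√(λ/D)) across a barrier of thickness L. This toy model (not the paper's GoldSim near-field model; all values below are illustrative) already shows why short-lived nuclides are strongly attenuated while long-lived ones such as I-129 pass almost unattenuated:

```python
import math

def barrier_attenuation(half_life_yr, diff_coeff_m2_per_yr, thickness_m):
    """Steady-state attenuation of a radionuclide diffusing through a
    barrier while decaying: C(L)/C(0) = exp(-L * sqrt(lambda / D))."""
    lam = math.log(2.0) / half_life_yr          # decay constant [1/yr]
    return math.exp(-thickness_m * math.sqrt(lam / diff_coeff_m2_per_yr))

# Illustrative: Cs-137 (30 yr half-life) vs I-129 (1.57e7 yr) through
# a 1 m bentonite-like barrier with an assumed D = 1e-3 m^2/yr
short_lived = barrier_attenuation(30.0, 1e-3, 1.0)
long_lived = barrier_attenuation(1.57e7, 1e-3, 1.0)
```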

  1. Equations for gas releasing process from pressurized vessels in ODH evaluation

    International Nuclear Information System (INIS)

    Jia, L.X.; Wang, L.

    2001-01-01

    In the evaluation of oxygen deficiency hazards (ODH), the calculation of the spill rate from the pressurized vessel is the central task. The accuracy of the engineering estimation becomes one of the safety design issues. This paper summarizes the equations for the oxygen concentration calculation in different cases, and discusses the equations for the gas release process calculation both for the high-pressure gas tank and for the low-temperature liquid container.

  2. [Spectral characteristics of dissolved organic matter released during the metabolic process of small medusa].

    Science.gov (United States)

    Guo, Dong-Hui; Yi, Yue-Yuan; Zhao, Lei; Guo, Wei-Dong

    2012-06-01

    The metabolic processes of jellyfish produce dissolved organic matter (DOM) that influences the functioning of aquatic ecosystems, yet the optical properties of the DOM released by jellyfish are unknown. Here we report the absorption and fluorescence properties of DOM released by the medusa species Blackfordia virginica during a 24 h incubation experiment. Compared with the control group, an obvious increase in the concentration of dissolved organic carbon (DOC), the absorption coefficient (a280) and total dissolved nitrogen (TDN) was observed in the incubation group. This clearly demonstrated the release of DOM, chromophoric DOM (CDOM) and dissolved nutrients by B. virginica, which had been fed sufficient Artemia sp. before the experiment. The increase in the spectral slope ratio (SR) and the decrease in the humification index (HIX) indicated that the released DOM was less humified and of relatively low molecular weight. Parallel factor analysis (PARAFAC) decomposed the fluorescence matrices of the DOM into three humic-like components (C1-C3) and one protein-like component (C4). The Fmax of two of the components, with emission wavelengths below 400 nm, increased markedly, while the components with emission above 400 nm showed little change. We therefore suggest a zooplankton index (ZIX) to trace and characterize the DOM excreted by the metabolic activity of zooplankton, calculated as the ratio of the sum of the Fmax of the fluorescence components with emission wavelengths below 400 nm to that of the components with emission above 400 nm.
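A minimal sketch of the proposed ZIX, assuming it is computed as the ratio of the summed Fmax of components emitting below 400 nm (protein-like) to those emitting at or above 400 nm (humic-like); the function and the component data are illustrative:

```python
def zooplankton_index(fmax_by_emission):
    """Hypothetical ZIX: ratio of summed maximum fluorescence (Fmax) of
    PARAFAC components emitting below 400 nm to those emitting at or
    above 400 nm.

    fmax_by_emission : list of (emission_wavelength_nm, fmax) tuples
    """
    below = sum(f for wl, f in fmax_by_emission if wl < 400)
    above = sum(f for wl, f in fmax_by_emission if wl >= 400)
    return below / above

# Illustrative components: one protein-like (340 nm) and three humic-like
zix = zooplankton_index([(340, 0.8), (420, 0.3), (460, 0.2), (500, 0.1)])
```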

  3. Precooking as a Control for Histamine Formation during the Processing of Tuna: An Industrial Process Validation.

    Science.gov (United States)

    Adams, Farzana; Nolte, Fred; Colton, James; De Beer, John; Weddig, Lisa

    2018-02-23

    An experiment to validate precooking of tuna as a control for histamine formation was carried out at a commercial tuna factory in Fiji. Albacore tuna (Thunnus alalunga) were brought on board long-line catcher vessels alive, immediately chilled but never frozen, and delivered to an on-shore facility within 3 to 13 days. These fish were then allowed to spoil at 25 to 30°C for 21 to 25 h to induce high levels of histamine (>50 ppm), as a simulation of worst-case postharvest conditions, and were subsequently frozen. The spoiled fish were later thawed normally and precooked at a commercial tuna processing facility to a target maximum core temperature of 60°C. The tuna were then held at ambient temperatures of 19 to 37°C for up to 30 h, and samples were collected every 6 h for histamine analysis. After precooking, no further histamine formation was observed for 12 to 18 h, indicating that a conservative minimum core temperature of 60°C pauses subsequent histamine formation for 12 to 18 h. Using the maximum core temperature of 60°C provided a challenge study to validate a recommended minimum core temperature of 60°C, and 12 to 18 h was sufficient to convert precooked tuna into frozen loins or canned tuna. This industrial-scale process validation study provides support, at a high confidence level, for the preventive histamine control associated with precooking. The study was conducted with tuna deliberately allowed to spoil, inducing high concentrations of histamine and histamine-forming capacity and failing standard organoleptic evaluations, and the critical limits for precooking were validated. Thus, these limits can be used in a hazard analysis critical control point (HACCP) plan in which precooking is identified as a critical control point.
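The validated critical limits can be expressed as a simple acceptance check. This is a sketch; the function and parameter names are illustrative, and it uses the conservative 12 h bound of the 12 to 18 h window reported above.

```python
def precook_meets_limits(core_temp_c, hold_hours,
                         min_core_c=60.0, max_hold_h=12.0):
    """Check a precook step against the critical limits validated in the
    study: a core temperature of at least 60 C pauses histamine
    formation, and the pause was demonstrated for at least 12 h
    (conservative bound of the 12-18 h window)."""
    return core_temp_c >= min_core_c and hold_hours <= max_hold_h
```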

  4. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    Science.gov (United States)

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research is the unavailability of a validated questionnaire on information processing. This article presents two studies describing the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic-processing or a heuristic-processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and did so again two weeks later. The questionnaire was subjected to factor, reliability and validity analyses at both measurement times for purposes of cross-validation of the results. A two-factor solution was observed, representing a systematic-processing and a heuristic-processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic condition scoring significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. The results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.
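The reliability analysis mentioned above typically reports Cronbach's alpha for each subscale; the abstract does not give the formula used, so the standard definition is sketched here:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal-consistency reliability:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).

    item_scores : list of per-respondent lists, one score per item.
    """
    k = len(item_scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in item_scores]) for i in range(k)]
    total_var = var([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items give alpha = 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```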

  5. The method validation step of biological dosimetry accreditation process

    International Nuclear Information System (INIS)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph.

    2006-01-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in patients' lymphocytes, which is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patients. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, following the recommendations of both ISO 17025, General Requirements for the Competence of Testing and Calibration Laboratories, and ISO 19238, Performance criteria for service laboratories performing biological dosimetry by cytogenetics. Diagnostic reviews and risk analyses were performed to control the whole analysis process, leading to the writing of documents. Purchasing, the personnel department and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardized; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdR (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties on the way they were measured, i.e. pipettes, thermometers, test tubes. None of the factors has a significant impact on the yield of dicentrics.
Therefore the uncertainty linked to their use was considered as
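The screening design used above follows the classical Plackett-Burman construction; for seven two-level factors the smallest such design has eight runs, built from cyclic shifts of a generator row plus a row of all low levels. This is a generic sketch of the construction, not the laboratory's actual run sheet.

```python
# An 8-run Plackett-Burman screening design for up to 7 two-level
# factors: cyclic shifts of the generator row + + + - + - -,
# plus a final row of all minuses.

def plackett_burman_8():
    g = [1, 1, 1, -1, 1, -1, -1]
    rows = [[g[(j - i) % 7] for j in range(7)] for i in range(7)]
    rows.append([-1] * 7)
    return rows

design = plackett_burman_8()
```

Each factor appears four times at each level and any two columns are orthogonal, which is what allows seven main effects (e.g. BUdR, PHA and colcemid concentrations, culture duration, temperature, blood and medium volumes) to be screened in only eight cultures.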

  6. The method validation step of biological dosimetry accreditation process

    Energy Technology Data Exchange (ETDEWEB)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph. [Institut de Radioprotection et de Surete Nucleaire, LDB, 92 - Fontenay aux Roses (France)

    2006-07-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in patients' lymphocytes, which is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patients. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, following the recommendations of both ISO 17025, General Requirements for the Competence of Testing and Calibration Laboratories, and ISO 19238, Performance criteria for service laboratories performing biological dosimetry by cytogenetics. Diagnostic reviews and risk analyses were performed to control the whole analysis process, leading to the writing of documents. Purchasing, the personnel department and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardized; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdR (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties on the way they were measured, i.e. pipettes, thermometers, test tubes. None of the factors has a significant impact on the yield of dicentrics. Therefore the uncertainty linked to their use was

  7. Adequate Measuring Technology and System of Fission Gas Release Behavior from Voloxidation Process

    International Nuclear Information System (INIS)

    Park, Geun Il; Park, J. J.; Jung, I. H.; Shin, J. M.; Yang, M. S.; Song, K. C.

    2006-09-01

    Based on the published literature and an understanding of available hot cell technologies, more accurate methods for measuring each volatile fission product released from the voloxidation process were reviewed and selected. A conceptual design of an apparatus for measuring volatile and/or semi-volatile fission products released from spent fuel was prepared. It was identified that on-line measurement techniques can be applied to gamma-emitting fission products, while off-line measurements such as chemical and/or neutron activation analysis can be applied to analyze beta-emitting fission gases. Collection methods using appropriate materials or solutions were selected to measure the release fraction of beta-emitting gaseous fission products in the IMEF M6 hot cell. In particular, an on-line gamma-ray counting system for monitoring 85Kr and an off-line measuring system for 14C were established. An on-line measuring system for obtaining the removal ratios of the semi-volatile fission products, mainly gamma-emitting fission products such as Cs and Ru, was also developed at the IMEF M6 hot cell; it is based on measuring the fuel inventory before and after the voloxidation test by gamma-ray measurement. The development of this measurement system may provide the basic information needed to support the design of the off-gas treatment system for the voloxidation process at INL, USA.
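The removal ratio obtained from the before/after gamma inventories reduces to a one-line calculation (a sketch with illustrative activities, not measured values):

```python
def removal_ratio(activity_before_bq, activity_after_bq):
    """Fraction of a semi-volatile fission product (e.g. Cs, Ru) released
    from the fuel during voloxidation, from gamma-measured inventories
    taken before and after the test, as described for the M6 hot cell."""
    return 1.0 - activity_after_bq / activity_before_bq

# Illustrative: 1.0e10 Bq before, 9.7e9 Bq after -> 3% released
r = removal_ratio(1.0e10, 9.7e9)
```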

  8. An investigation of effects of modification processes on physical properties and mechanism of drug release for sustaining drug release from modified rice

    Energy Technology Data Exchange (ETDEWEB)

    Ngo, Vuong Duy; Luu, Thinh Duc; Van Vo, Toi [Pharmaceutical Engineering Laboratory, Biomedical Engineering Department, International University, Vietnam National University, Ho Chi Minh City (Viet Nam); Tran, Van-Thanh [Faculty of Pharmacy, University of Medicine and Pharmacy, Ho Chi Minh City (Viet Nam); Duan, Wei [School of Medicine, Deakin University, Pigdons Road, Waurn Ponds, Victoria (Australia); Tran, Phuong Ha-Lien, E-mail: phuong.tran1@deakin.edu.au [School of Medicine, Deakin University, Pigdons Road, Waurn Ponds, Victoria (Australia); Tran, Thao Truong-Dinh, E-mail: ttdthao@hcmiu.edu.vn [Pharmaceutical Engineering Laboratory, Biomedical Engineering Department, International University, Vietnam National University, Ho Chi Minh City (Viet Nam)

    2016-10-01

    The aim of this study was to investigate the effects of modification processes on physical properties and to explain the mechanism of sustained drug release from modified rice (MR). Various types of Vietnamese rice were introduced in the study as matrices for a sustained-release dosage form. Rice was thermally modified in water at a defined temperature for different times using a simple process. Tablets containing MR and isradipine, the model drug, were then prepared to investigate the capability for sustained drug release. Scanning electron microscopy (SEM) was used to determine the different morphologies of the MR formulations. The flow property of MR was analyzed by the Hausner ratio and Carr's index. The dissolution rate and the swelling/erosion behavior of the tablets were evaluated at pH 1.2 and pH 6.8 at 37 ± 0.5 °C. The matrix tablet containing MR showed sustained release as compared to the control. The SEM analyses and swelling/erosion studies indicated that the morphology as well as the swelling/erosion rate of MR were modulated by the modification time, drying method and incubation. The modification process was found to be crucial because it could strongly affect the granule morphologies, thereby changing the flowability and the swelling/erosion capacity for sustained drug release. - Highlights: • Modification process affected granule morphologies and flowability of modified rice. • Modification process affected swelling/erosion capacity for drug sustained release. • Freeze-drying could decrease the erosion as well as increase the swelling rate.
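Release mechanisms of swellable matrices like these are commonly classified by fitting the Korsmeyer-Peppas power law Mt/M∞ = k·tⁿ to the dissolution data (n near 0.45 indicates Fickian diffusion; 0.45-0.89 anomalous transport combining diffusion and erosion). The model choice is an assumption here, not taken from the abstract; the sketch fits the power law by log-log least squares on synthetic data:

```python
import math

def fit_power_law(times, fractions):
    """Least-squares fit of log(Mt/Minf) = log(k) + n*log(t).
    Returns (k, n)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    k = math.exp(my - slope * mx)
    return k, slope

# Synthetic release data generated from k=0.1, n=0.5 is recovered exactly
k, n = fit_power_law([1, 2, 4, 8], [0.1 * t ** 0.5 for t in [1, 2, 4, 8]])
```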

  9. An investigation of effects of modification processes on physical properties and mechanism of drug release for sustaining drug release from modified rice

    International Nuclear Information System (INIS)

    Ngo, Vuong Duy; Luu, Thinh Duc; Van Vo, Toi; Tran, Van-Thanh; Duan, Wei; Tran, Phuong Ha-Lien; Tran, Thao Truong-Dinh

    2016-01-01

    The aim of this study was to investigate the effects of modification processes on physical properties and to explain the mechanism of sustained drug release from modified rice (MR). Various types of Vietnamese rice were introduced in the study as matrices for a sustained-release dosage form. Rice was thermally modified in water at a defined temperature for different times using a simple process. Tablets containing MR and isradipine, the model drug, were then prepared to investigate the capability for sustained drug release. Scanning electron microscopy (SEM) was used to determine the different morphologies of the MR formulations. The flow property of MR was analyzed by the Hausner ratio and Carr's index. The dissolution rate and the swelling/erosion behavior of the tablets were evaluated at pH 1.2 and pH 6.8 at 37 ± 0.5 °C. The matrix tablet containing MR showed sustained release as compared to the control. The SEM analyses and swelling/erosion studies indicated that the morphology as well as the swelling/erosion rate of MR were modulated by the modification time, drying method and incubation. The modification process was found to be crucial because it could strongly affect the granule morphologies, thereby changing the flowability and the swelling/erosion capacity for sustained drug release. - Highlights: • Modification process affected granule morphologies and flowability of modified rice. • Modification process affected swelling/erosion capacity for drug sustained release. • Freeze-drying could decrease the erosion as well as increase the swelling rate.

  10. Recommendations for elaboration, transcultural adaptation and validation process of tests in Speech, Hearing and Language Pathology.

    Science.gov (United States)

    Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de

    2017-06-08

    To present a guide with recommendations for translation, adaptation, elaboration and the validation process of tests in Speech and Language Pathology. The recommendations were based on international guidelines with a focus on the elaboration, translation, cross-cultural adaptation and validation process of tests. The recommendations were grouped into two charts, one with procedures for translation and transcultural adaptation and the other for obtaining evidence of validity, reliability and measures of accuracy of the tests. A guide with norms for the organization and systematization of the process of elaboration, translation, cross-cultural adaptation and validation of tests in Speech and Language Pathology was created.

  11. Production process validation of 2-[18F]-fluoro-2-deoxy-D-glucose

    International Nuclear Information System (INIS)

    Cantero, Miguel; Iglesias, Rocio; Aguilar, Juan; Sau, Pablo; Tardio, Evaristo; Narrillos, Marcos

    2003-01-01

    The aim of validating the production process of 2-[18F]-fluoro-2-deoxy-D-glucose (FDG) was to check that: A) the equipment and services involved in the production process were correctly installed, well documented, and worked properly; and B) the production of FDG was performed reproducibly according to predefined parameters. The main document was the Validation Master Plan, and the steps were: installation qualification, operational qualification, process qualification and validation report. After all tests established in the qualification steps were completed without deviations, we concluded that the production process was validated, because it is performed reproducibly according to predefined parameters. (Au)

  12. Determination of the radionuclide release factor for an evaporator process using nondestructive assay

    International Nuclear Information System (INIS)

    Johnson, R.E.

    1998-01-01

    The 242-A Evaporator is the primary waste evaporator for the Hanford Site radioactive liquid waste stored in underground double-shell tanks. Low-pressure evaporation is used to remove water from the waste, thus reducing the amount of tank space required for storage. The process produces a concentrated slurry, a process condensate, and an offgas. The offgas exhausts through two stages of high-efficiency particulate air (HEPA) filters before being discharged to the atmosphere. 40 CFR 61, Subpart H, requires assessment of the unfiltered exhaust to determine if continuous compliant sampling is required. Because potential (unfiltered) emissions are not measured, methods have been developed to estimate these emissions. One of the methods accepted by the Environmental Protection Agency is measurement of the accumulation of radionuclides on the HEPA filters. Nondestructive assay (NDA) was selected for determining the accumulation on the HEPA filters. NDA was performed on the HEPA filters before and after a campaign in 1997. The NDA results indicate that 2.1E+4 becquerels of cesium-137 accumulated on the primary HEPA 1700 filter during the campaign. The feed material processed in the campaign contained a total of 1.4E+16 Bq of cesium-137. The release factor for the evaporator process is therefore 1.5E-12. Based on this release factor, continuous compliant sampling is not required.
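The release factor reported above is simply the filter accumulation divided by the feed inventory; reproducing the abstract's figures:

```python
def release_factor(released_bq, feed_bq):
    """Unfiltered-release factor as used in the abstract: activity
    accumulated on the primary HEPA filter divided by the activity
    in the feed processed during the campaign."""
    return released_bq / feed_bq

# Figures from the abstract: 2.1E+4 Bq of Cs-137 on the filter,
# 1.4E+16 Bq of Cs-137 in the feed -> 1.5E-12
rf = release_factor(2.1e4, 1.4e16)
```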

  13. An Overview of Pharmaceutical Validation and Process Controls in ...

    African Journals Online (AJOL)

    It has always been known that the processes involved in pharmaceutical production impact significantly on the quality of the products. These processes include raw material and equipment inspections as well as in-process controls. Process controls are mandatory in good manufacturing practice (GMP). The purpose is to ...

  14. Validity and Reliability of Revised Inventory of Learning Processes.

    Science.gov (United States)

    Gadzella, B. M.; And Others

    The Inventory of Learning Processes (ILP) was developed by Schmeck, Ribich, and Ramanaiah in 1977 as a self-report inventory to assess learning style through a behavioral-oriented approach. The ILP was revised by Schmeck in 1983. The Revised ILP contains six scales: (1) Deep Processing; (2) Elaborative Processing; (3) Shallow Processing; (4)…

  15. Uranium geochemistry in estuarine sediments: Controls on removal and release processes

    International Nuclear Information System (INIS)

    Barnes, C.E.; Cochran, J.K.

    1993-01-01

    Porewater uranium profiles from Long Island Sound (LIS) and Amazon shelf sediments and LIS sediment incubation experiments indicate that both removal and release processes control U geochemistry in estuarine sediments. Release of U from sediments occurs in association with Fe reduction. A correlation between U and Fe (and Mn) observed in sediment incubation experiments suggests that there is release of U from Fe-Mn-oxides as they are reduced, consistent with data from the Amazon shelf. In both sediment porewater profiles (LIS and Amazon) and sediment incubation experiments (LIS), there is removal of U from porewater under conditions of sulfate reduction. Sediment incubation experiments indicate that the removal rate is first-order with respect to U concentration, and the rate constant is linearly correlated to sulfate reduction rates. The link between U removal and sulfate reduction (a measure of diagenetic microbial activity) is consistent with a microbial mediation of U reduction. The diffusion flux of U into LIS sediments is estimated from porewater profiles. The inclusion of this estuarine removal term in the oceanic U balance increases the importance of the sediment sink. 62 refs., 12 figs., 2 tabs
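The first-order removal observed in the incubations corresponds to exponential decay of the porewater U concentration. In the sketch below, the linear coefficient linking the rate constant to the sulfate reduction rate is an illustrative assumption, as are all the numerical values:

```python
import math

def porewater_u(c0, k, t):
    """First-order removal of dissolved U from porewater:
    dC/dt = -k*C, so C(t) = C0 * exp(-k*t)."""
    return c0 * math.exp(-k * t)

def rate_constant(sulfate_reduction_rate, a=0.05):
    """Hypothetical linear link k = a * SRR, reflecting the reported
    correlation between the removal rate constant and the sulfate
    reduction rate (units chosen for illustration only)."""
    return a * sulfate_reduction_rate

# Illustrative: c0 = 10 (arbitrary conc. units), SRR = 2, over t = 10
c = porewater_u(10.0, rate_constant(2.0), 10.0)
```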

  16. Capture of Tritium Released from Cladding in the Zirconium Recycle Process

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, Barry B [ORNL; Bruffey, Stephanie H [ORNL; DelCul, Guillermo Daniel [ORNL; Walker, Trenton Baird [ORNL

    2016-08-31

    Zirconium may be recovered from the Zircaloy® cladding of used nuclear fuel (UNF) for recycle or to reduce the quantities of high-level waste destined for a geologic repository. Recovery of zirconium using a chlorination process is currently under development at the Oak Ridge National Laboratory. The approach is to treat the cladding with chlorine gas to convert the zirconium in the alloy (~98 wt % of the alloy mass) to zirconium tetrachloride. A significant fraction of the tritium (0–96%) produced in nuclear fuel during irradiation may be found in zirconium-based cladding and could be released from the cladding when the solid matrix is destroyed by the chlorination reaction. To prevent uncontrolled release of radioactive tritium to other parts of the plant or to the environment, a method to recover the tritium may be required. The focus of this effort was to (1) identify potential methods for the recovery of tritium from the off-gas of the zirconium recycle process, (2) perform scoping tests on selected recovery methods using nonradioactive gas simulants, and (3) select a process design appropriate for testing on radioactive gas streams generated by the engineering-scale zirconium recycle demonstrations on radioactive used cladding.

  17. Capture of Tritium Released from Cladding in the Zirconium Recycle Process

    Energy Technology Data Exchange (ETDEWEB)

    Bruffey, Stephanie H [ORNL; Spencer, Barry B [ORNL; DelCul, Guillermo Daniel [ORNL

    2016-08-31

    This report is issued as the first revision to FCRD-MRWFD-2016-000297. Zirconium may be recovered from the Zircaloy® cladding of used nuclear fuel (UNF) for recycle or to reduce the quantities of high-level waste destined for a geologic repository. Recovery of zirconium using a chlorination process is currently under development at the Oak Ridge National Laboratory. The approach is to treat the cladding with chlorine gas to convert the zirconium in the alloy (~98 wt % of the alloy mass) to zirconium tetrachloride. A significant fraction of the tritium (0–96%) produced in nuclear fuel during irradiation may be found in zirconium-based cladding and could be released from the cladding when the solid matrix is destroyed by the chlorination reaction. To prevent uncontrolled release of radioactive tritium to other parts of the plant or to the environment, a method to recover the tritium may be required. The focus of this effort was to (1) identify potential methods for the recovery of tritium from the off-gas of the zirconium recycle process, (2) perform scoping tests on selected recovery methods using non-radioactive gas simulants, and (3) select a process design appropriate for testing on radioactive gas streams generated by the engineering-scale zirconium recycle demonstrations on radioactive used cladding.

  18. Signal validation with control-room information-processing computers

    International Nuclear Information System (INIS)

    Belblidia, L.A.; Carlson, R.W.; Russell, J.L. Jr.

    1985-01-01

    One of the 'lessons learned' from the Three Mile Island accident focuses upon the need for a validated source of plant-status information in the control room. The utilization of computer-generated graphics to display the readings of the major plant instrumentation has introduced the capability of validating signals prior to their presentation to the reactor operations staff. The current operations philosophies allow the operator a quick look at the gauges to form an impression of the fraction of full scale as the basis for knowledge of the current plant conditions. After the introduction of a computer-based information-display system such as the Safety Parameter Display System (SPDS), operational decisions can be based upon precise knowledge of the parameters that define the operation of the reactor and auxiliary systems. The principal impact of this system on the operator will be to remove the continuing concern for the validity of the instruments which provide the information that governs the operator's decisions. (author)
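
    The record above does not specify the SPDS validation algorithm, but a common generic approach to validating redundant instrument channels before display is a median-based consistency check. A sketch under that assumption (channel values hypothetical):

```python
from statistics import median

def validate_signal(readings, tolerance):
    """Validate redundant sensor readings: flag channels deviating from the median.

    Returns (validated_value, list_of_suspect_channel_indices).
    """
    m = median(readings)
    suspects = [i for i, r in enumerate(readings) if abs(r - m) > tolerance]
    good = [r for i, r in enumerate(readings) if i not in suspects]
    return sum(good) / len(good), suspects

# Three redundant pressure channels; channel 2 has drifted
value, suspects = validate_signal([15.2, 15.3, 18.9], tolerance=0.5)
```

The validated value (the mean of the consistent channels) would be displayed, while the suspect channel is flagged for the operator.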

  19. Mechanisms of within- and across-channel processing in comodulation masking release

    DEFF Research Database (Denmark)

    Piechowiak, Tobias

    2007-01-01

    The audibility of a target sound embedded in another masking sound can be improved by adding sound energy that is remote in frequency from both the masker and the target. This effect is known as comodulation masking release (CMR) and is observed when the remote sound and the masker share coherent...... role in our ability to deal with natural complex acoustic environments. While a large body of data has been presented, the mechanisms underlying CMR are not clear. This study proposes an auditory processing model that accounts for various aspects of CMR. The model includes an equalization...

  20. Production process validation of 2-[18F]-fluoro-2-deoxy-D-glucose

    International Nuclear Information System (INIS)

    Cantero, Miguel; Iglesias, Rocio; Aguilar, Juan; Sau, Pablo; Tardio, Evaristo; Narrillos, Marcos

    2003-01-01

    The aim of production process validation of 2-[18F]-fluoro-2-deoxy-D-glucose (FDG) was to check: A) equipments and services implicated in the production process were correctly installed, well documented, and worked properly, and B) production of FDG was done in a repetitive way according to predefined parameters. The main document was the Validation Master Plan, and the steps were: installation qualification, operational qualification, performance qualification and validation final report. After finalization of all tests established in the qualification steps without deviations, we concluded that the production process was validated because it consistently produced FDG meeting its pre-determined specifications and quality characteristics (Au)
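
    The qualification sequence named in the abstract (installation, operational and performance qualification, then a final report) is sequential: a stage with deviations blocks the ones after it. A minimal sketch of that gating logic (the pass/fail bookkeeping is illustrative, not the cited Validation Master Plan):

```python
STAGES = ["installation qualification", "operational qualification",
          "performance qualification", "validation final report"]

def run_validation(results):
    """Walk the qualification stages in order; stop at the first deviation.

    `results` maps stage name -> True (no deviations) / False (deviations found).
    Returns (validated, completed_stages).
    """
    completed = []
    for stage in STAGES:
        if not results.get(stage, False):
            return False, completed
        completed.append(stage)
    return True, completed

ok, done = run_validation({s: True for s in STAGES})          # all stages pass
ok2, done2 = run_validation({STAGES[0]: True})                # OQ missing: halts after IQ
```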

  1. Comparison of the accident process, radioactivity release and ground contamination between Chernobyl and Fukushima-1

    International Nuclear Information System (INIS)

    Imanaka, Tetsuji; Hayashi, Gohei; Endo, Satoru

    2015-01-01

    In this report, we have reviewed the basic features of the accident processes and radioactivity releases that occurred in the Chernobyl accident (1986) and in the Fukushima-1 accident (2011). The Chernobyl accident was a power-surge accident caused by a failure to control a fission chain reaction, which instantaneously destroyed the reactor and building, whereas the Fukushima-1 accident was a loss-of-coolant accident in which the reactor cores of three units were melted by decay heat after losing the electricity supply. Although the quantity of radioactive noble gases released from Fukushima-1 exceeded the amount released from Chernobyl, the land area severely contaminated by 137Cs was 10 times smaller around Fukushima-1 than around Chernobyl. The differences in the accident process are reflected in the composition of the discharged radioactivity as well as in the composition of the ground contamination. Volatile radionuclides (such as 132Te-132I, 131I, 134Cs and 137Cs) contributed to the gamma-ray exposure from the ground contamination around Fukushima-1, whereas a greater variety of radionuclides contributed significantly around Chernobyl. When radioactivity deposition occurred, the radiation exposure rate near Chernobyl is estimated to have been 770 μGy/h per initial 137Cs deposition of 1000 kBq/m², whereas it was 100 μGy/h around Fukushima-1. Estimates of the cumulative exposure for 30 years are 970 and 570 mGy per initial deposition of 1000 kBq/m² for Chernobyl and Fukushima-1, respectively. Of these exposures, 49 and 98% were contributed by radiocesium (134Cs + 137Cs) around Chernobyl and Fukushima-1, respectively.
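
    The exposure rates above are normalized to an initial 137Cs deposition of 1000 kBq/m². Assuming, as that normalization implies, a linear scaling of exposure rate with deposition, the reported figures can be applied to other deposition levels:

```python
# Exposure figures reported in the abstract, per 1000 kBq/m2 of initial 137Cs deposition
RATE_PER_1000KBQ = {"chernobyl": 770.0, "fukushima-1": 100.0}     # uGy/h, initial rate
CUMULATIVE_30Y_MGY = {"chernobyl": 970.0, "fukushima-1": 570.0}   # mGy over 30 years

def initial_exposure_rate(site, deposition_kbq_m2):
    """Initial exposure rate (uGy/h), assuming linear scaling with deposition."""
    return RATE_PER_1000KBQ[site] * deposition_kbq_m2 / 1000.0

# At half the reference deposition, the assumed linear model gives half the rate
rate = initial_exposure_rate("fukushima-1", 500.0)  # 50 uGy/h
```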

  2. Validation of a pulsed electric field process to pasteurize strawberry puree

    Science.gov (United States)

    An inexpensive data acquisition method was developed to validate the exact number and shape of the pulses applied during pulsed electric fields (PEF) processing. The novel validation method was evaluated in conjunction with developing a pasteurization PEF process for strawberry puree. Both buffered...

  3. Process-generated nanoparticles from ceramic tile sintering: Emissions, exposure and environmental release.

    Science.gov (United States)

    Fonseca, A S; Maragkidou, A; Viana, M; Querol, X; Hämeri, K; de Francisco, I; Estepa, C; Borrell, C; Lennikov, V; de la Fuente, G F

    2016-09-15

    The ceramic industry is an industrial sector in need of significant process changes, which may benefit from innovative technologies such as laser sintering of ceramic tiles. Such innovations result in a considerable research gap within exposure assessment studies for process-generated ultrafine and nanoparticles. This study addresses this issue, aiming to characterise particle formation, release mechanisms and their impact on personal exposure during a tile sintering activity in an industrial-scale pilot plant, as a follow-up of a previous study in a laboratory-scale plant. In addition, possible particle transformations in the exhaust system, the potential for particle release to the outdoor environment, and the effectiveness of the filtration system were also assessed. For this purpose, a tiered measurement strategy was conducted. The main findings evidence that nanoparticle emission patterns were strongly linked to temperature and tile chemical composition, and mainly independent of the laser treatment. Also, new particle formation (from gaseous precursors) events were detected. The efficiency of the filtration system was successfully tested and evidenced a >87% efficiency in particle number concentration removal. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
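
    The >87% figure quoted above corresponds to the usual definition of filtration efficiency in particle number concentration, efficiency = 1 − downstream/upstream. A sketch (the concentrations below are hypothetical):

```python
def filtration_efficiency(upstream, downstream):
    """Particle number removal efficiency: fraction of particles removed by the filter."""
    if upstream <= 0:
        raise ValueError("upstream concentration must be positive")
    return 1.0 - downstream / upstream

# Hypothetical number concentrations (particles/cm3) before and after the filter
eff = filtration_efficiency(upstream=2.0e5, downstream=2.4e4)  # 0.88, i.e. 88%
```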

  4. Additive advantage in characteristics of MIMCAPs on flexible silicon (100) fabric with release-first process

    KAUST Repository

    Ghoneim, Mohamed T.

    2013-11-20

    We report the inherent increase in capacitance per unit planar area of state-of-the-art high-κ integrated metal/insulator/metal capacitors (MIMCAPs) fabricated on flexible silicon fabric with a release-first process. We methodically study and show that our approach to transform bulk silicon (100) into a flexible fabric adds an inherent advantage of enabling higher-integration-density dynamic random access memory (DRAM) on the same chip area. Our approach is to release an ultra-thin silicon (100) fabric (25 μm thick) from the bulk silicon wafer, then build MIMCAPs using sputtered aluminium electrodes and successive atomic layer depositions (ALD), without breaking the vacuum, of a high-κ aluminium oxide sandwiched between two tantalum nitride layers. This result shows that we can obtain flexible electronics on silicon without sacrificing the high-density integration aspects and also utilize the non-planar geometry associated with the fabrication process to obtain a higher integration density compared to bulk silicon integration, due to an increased normalized capacitance per unit planar area. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. NASA Construction of Facilities Validation Processes - Total Building Commissioning (TBCx)

    Science.gov (United States)

    Hoover, Jay C.

    2004-01-01

    Key attributes include: Total Quality Management (TQM) system that looks at all phases of a project. A team process that spans boundaries. A Commissioning Authority to lead the process. Commissioning requirements in contracts. Independent design review to verify compliance with Facility Project Requirements (FPR). Formal written Commissioning Plan with documented results. Functional performance testing (FPT) against the requirements document.

  6. Definition and validation of process mining use cases

    NARCIS (Netherlands)

    Ailenei, I.; Rozinat, A.; Eckert, A.; Aalst, van der W.M.P.; Daniel, F.; Barkaoui, K.; Dustdar, S.

    2012-01-01

    Process mining is an emerging topic in the BPM marketplace. Recently, several (commercial) software solutions have become available. Due to the lack of an evaluation framework, it is very difficult for potential users to assess the strengths and weaknesses of these process mining tools. As the first

  7. Simple process capability analysis and quality validation of ...

    African Journals Online (AJOL)

    Many ways can be applied to improve the process and one of them is by choosing the correct six sigma's design of experiment (DOE). In this study, Taguchi's experimental design was applied to achieve high percentage of cell viability in the fermentation experiment. The process capability of this study was later analyzed by ...

  8. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    Science.gov (United States)

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.
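
    The step counts reported above (467 total steps, 207 of them devoted to 59 exceptional situations, and 82 validation events composed of 35 + 16 + 31 findings) can be tallied directly:

```python
def exception_share(total_steps, exception_steps):
    """Fraction of process-definition steps devoted to exception handling."""
    return exception_steps / total_steps

total_steps = 467
exception_steps = 207
share = exception_share(total_steps, exception_steps)  # about 0.44, the 44% in the abstract

# Unique events found during validation, by category
new_expected_steps = 35
new_exceptional_situations = 16
new_response_steps = 31
validation_events = new_expected_steps + new_exceptional_situations + new_response_steps
```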

  9. A study on the validity of strategic classification processes

    International Nuclear Information System (INIS)

    Tae, Jae Woong; Shin, Dong Hun

    2013-01-01

    Commodity classification identifies strategic commodities. The export license verifies that exports have met the conditions required by the international export control system. NSSC (Nuclear Safety and Security Commission) operates NEPS (Nuclear Export Promotion Service) for export control of nuclear items. NEPS has contributed to reducing the processing time for submitting documents and issuing certificates and licenses. Nonetheless, it became necessary to enhance the capacity to implement export control precisely and efficiently, as the development of the Korean nuclear industry led to a sharp increase in exports. To provide more efficient ways, development of an advanced export control system, IXCS (Intelligent eXport Control System), was suggested. To build IXCS successfully, export control experts have analyzed the Korean export control system. Two classification processes, for items and for technology, were derived as a result of the research. However, they may reflect real cases insufficiently because they were derived from experts' discussion. This study evaluated how well the processes explain real cases. Although the derived processes explained real cases well, some recommendations for improvement were found through this study. These evaluation results will help to make the classification flow charts more compatible with the current export system. Most classification reports on equipment and materials deliberated on specifications and functions, while related systems were not considered. If a 'specification review' stage is added to the current process and unnecessary stages are deleted, this will improve the accuracy of the flow chart. In the classification of nuclear technology, a detailed process to identify specific information and data needs to be specified to decrease subjectivity. Whether they are imitations or not is an unnecessary factor in both processes. The successful development of IXCS needs accurate export control processes as well as IT technology. 
If these classification processes are

  10. Analysis of Wigner energy release process in graphite stack of shut-down uranium-graphite reactor

    OpenAIRE

    Bespala, E. V.; Pavliuk, A. O.; Kotlyarevskiy, S. G.

    2015-01-01

    Data obtained during differential thermal analysis of sampled irradiated graphite are presented. Results of computational modeling of the Wigner energy release process from the irradiated graphite stack are demonstrated. It is shown that spontaneous combustion of the graphite is possible only in the adiabatic case.

  11. Notification: Preliminary Research on EPA's Decision Making Process to Release Information Under the Freedom of Information Act

    Science.gov (United States)

    July 19, 2013. The Office of Inspector General plans to begin preliminary research on the U.S. Environmental Protection Agency’s process for deciding to release information requested under the Freedom of Information Act.

  12. Validation process of ISIS CFD software for fire simulation

    International Nuclear Information System (INIS)

    Lapuerta, C.; Suard, S.; Babik, F.; Rigollet, L.

    2012-01-01

    Fire propagation constitutes a major safety concern in nuclear facilities. In this context, IRSN is developing a CFD code, named ISIS, dedicated to fire simulations. This software is based on a coherent set of models that can be used to describe a fire in large, mechanically ventilated compartments. The system of balance equations obtained by combining these models is discretized in time using fractional step methods, including a pressure correction technique for solving hydrodynamic equations. Discretization in space combines two techniques, each proven in the relevant context: mixed finite elements for hydrodynamic equations and finite volumes for transport equations. ISIS is currently in an advanced stage of verification and validation. The results obtained for a full-scale fire test performed at IRSN are presented.

  13. Process chain validation in micro and nano replication

    DEFF Research Database (Denmark)

    Calaon, Matteo

    to quantification of replication quality over large areas of surface topography based on areal detection technique and angular diffraction measurements were developed. A series of injection molding and compression molding experiments aimed at process analysis and optimization showed the possibility to control...... features dimensional accuracy variation through the identification of relevant process parameters. Statistical design of experiment results, showed the influence of both process parameters (mold temperature, packing time, packing pressure) and design parameters (channel width and direction with respect......Innovations in nanotechnology propose applications integrating micro and nanometer structures fabricated as master geometries for final replication on polymer substrates. The possibility for polymer materials of being processed with technologies enabling large volume production introduces solutions...

  14. Simple process capability analysis and quality validation of ...

    African Journals Online (AJOL)

    GREGORY

    2011-12-16

    Dec 16, 2011 ... University Malaysia, Gombak, P.O. Box 10, 50728 Kuala Lumpur, Malaysia. Accepted 7 .... used in the manufacturing industry as a process performance indicator. ... Six Sigma for Electronics design and manufacturing.

  15. Development and validation of an in vitro–in vivo correlation (IVIVC) model for propranolol hydrochloride extended-release matrix formulations

    Directory of Open Access Journals (Sweden)

    Chinhwa Cheng

    2014-06-01

    Full Text Available The objective of this study was to develop an in vitro–in vivo correlation (IVIVC) model for hydrophilic matrix extended-release (ER) propranolol dosage formulations. The in vitro release characteristics of the drug were determined using USP apparatus I at 100 rpm, in a medium of varying pH (from pH 1.2 to pH 6.8). In vivo plasma concentrations and pharmacokinetic parameters in male beagle dogs were obtained after administering oral ER formulations and immediate-release (IR) commercial products. The similarity factor f2 was used to compare the dissolution data. The IVIVC model was developed using the pooled fraction dissolved and fraction absorbed of propranolol ER formulations ER-F and ER-S, with different release rates. An additional formulation, ER-V, with a different release rate of propranolol, was prepared for evaluating the external predictability. The results showed that the percentage prediction error (%PE) values of Cmax and AUC0–∞ were 0.86% and 5.95%, respectively, for the external validation study. The observed low prediction errors for Cmax and AUC0–∞ demonstrated that the propranolol IVIVC model was valid.
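
    The %PE figures quoted above follow the standard prediction-error definition used in IVIVC studies, %PE = |observed − predicted| / observed × 100. A sketch with hypothetical observed and predicted Cmax values (not the study's data):

```python
def percent_prediction_error(observed, predicted):
    """%PE = |observed - predicted| / observed * 100, the usual IVIVC criterion."""
    return abs(observed - predicted) / observed * 100.0

# Hypothetical Cmax (ng/mL) for an external-validation formulation
pe_cmax = percent_prediction_error(observed=120.0, predicted=118.8)  # 1.0 %
```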

  16. PolyNano M.6.1.1 Process validation state-of-the-art

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Calaon, Matteo

    2012-01-01

    Nano project. Methods for replication process validation are presented and will be further investigated in WP6 “Process Chain Validation” and applied to PolyNano study cases. Based on the available information, effective best practice standard process validation will be defined and implemented...... assessment methods, and presents measuring procedures/techniques suitable for replication fidelity studies. The report reviews state‐of‐the‐art research results regarding replication obtained at different scales, tooling technologies based on surface replication, process validation trough design...

  17. Guideline validation in multiple trauma care through business process modeling.

    Science.gov (United States)

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery, a specific guideline is available paper-based as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version, represented with a standardized meta-model, is necessary for automatic processing. In our project we transferred the paper-based version into an electronic format and analyzed the structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step, the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools, which check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or specific providers is necessary. Then, clinical guidelines could additionally be used for eLearning, process optimization and workflow management.
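
    The formal errors mentioned above (the study's seven error categories are not listed in the abstract) are typically structural defects in the process graph. A generic sketch of two such checks, dangling edge references and unreachable nodes, on a hypothetical trauma-care flowchart:

```python
def check_flowchart(nodes, edges, start):
    """Return a list of formal errors: edges to undefined nodes, unreachable nodes."""
    errors = []
    for src, dst in edges:
        for n in (src, dst):
            if n not in nodes:
                errors.append(f"edge references undefined node: {n}")
    # Reachability from the start node via depth-first search
    adjacency = {}
    for src, dst in edges:
        adjacency.setdefault(src, []).append(dst)
    seen, stack = set(), [start]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(adjacency.get(n, []))
    for n in nodes:
        if n not in seen:
            errors.append(f"unreachable node: {n}")
    return errors

# Hypothetical flowchart fragment: "orphan" is defined but never reached
nodes = {"triage", "stabilize", "imaging", "orphan"}
edges = [("triage", "stabilize"), ("stabilize", "imaging")]
errs = check_flowchart(nodes, edges, start="triage")
```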

  18. Halogenation processes of secondary organic aerosol and implications on halogen release mechanisms

    Directory of Open Access Journals (Sweden)

    J. Ofner

    2012-07-01

    Full Text Available Reactive halogen species (RHS), such as X·, X2 and HOX containing X = chlorine and/or bromine, are released by various sources like photo-activated sea-salt aerosol or from salt pans and salt lakes. Despite many studies of RHS reactions, the potential of RHS reacting with secondary organic aerosol (SOA) and organic aerosol derived from biomass burning (BBOA) has been neglected. Such reactions can constitute sources of gaseous organohalogen compounds or halogenated organic matter in the tropospheric boundary layer and can influence the physicochemical properties of atmospheric aerosols.

    Model SOA from α-pinene, catechol, and guaiacol was used to study heterogeneous interactions with RHS. Particles were exposed to molecular chlorine and bromine in an aerosol smog chamber in the presence of UV/VIS irradiation, and to RHS released from simulated natural halogen sources like salt pans. Subsequently, the aerosol was characterized in detail using a variety of physicochemical and spectroscopic methods. Fundamental features were correlated with heterogeneous halogenation, which results in new functional groups (FTIR spectroscopy), changes in UV/VIS absorption, chemical composition (ultrahigh-resolution mass spectrometry, ICR-FT/MS), or aerosol size distribution. However, the halogen release mechanisms were also found to be affected by the presence of organic aerosol. Those interaction processes, changing chemical and physical properties of the aerosol, are likely to influence e.g. the ability of the aerosol to act as cloud condensation nuclei, its potential to adsorb other gases with low volatility, or its contribution to radiative forcing and ultimately the Earth's radiation balance.

  19. SmartWeld/SmartProcess - intelligent model based system for the design and validation of welding processes

    Energy Technology Data Exchange (ETDEWEB)

    Mitchner, J.

    1996-04-01

    Diagrams are presented on an intelligent model-based system for the design and validation of welding processes. Key capabilities identified include 'right the first time' manufacturing, continuous improvement, and on-line quality assurance.

  20. Best practice strategies for validation of micro moulding process simulation

    DEFF Research Database (Denmark)

    Costa, Franco; Tosello, Guido; Whiteside, Ben

    2009-01-01

    are the optimization of the moulding process and of the tool using simulation techniques. Therefore, in polymer micro manufacturing technology, software simulation tools adapted from conventional injection moulding can provide useful assistance for the optimization of moulding tools, mould inserts, micro component...... are discussed. Recommendations regarding sampling rate, meshing quality, filling analysis methods (micro short shots, flow visualization) and machine geometry modelling are given on the basis of the comparison between simulated and experimental results within the two considered study cases.......Simulation programs in polymer micro replication technology are used for the same reasons as in conventional injection moulding. To avoid the risks of costly re-engineering, the moulding process is simulated before starting the actual manufacturing process. Important economic factors...

  1. Thermal effects from the release of selenium from a coal combustion during high-temperature processing: a review.

    Science.gov (United States)

    Hu, Jianjun; Sun, Qiang; He, Huan

    2018-04-11

    The release of selenium (Se) during coal combustion can have serious impacts on the ecological environment and human health. Therefore, it is very important to study the factors that concern the release of Se from coal combustion. In this paper, the characteristics of the release of Se from coal combustion, pyrolysis, and gasification of different coal species under different conditions are studied. The results show that the amount of released Se increases at higher combustion temperatures. There are obvious increases in the amount of released Se, especially in the temperature range of 300 to 800 °C. In addition, more Se is released from coal gasification than from coal combustion, and more from coal combustion than from pyrolysis. The type of coal, rate of heating, type of mineral ions, and combustion atmosphere have different effects on the released percentage of Se. Therefore, having a good understanding of the factors that surround the release of Se during coal combustion, and then establishing the combustion conditions accordingly, can reduce the impacts of this toxic element on humans and the environment.

  2. The Release of Trace Elements in the Process of Coal Coking

    Directory of Open Access Journals (Sweden)

    Jan Konieczyński

    2012-01-01

    Full Text Available In order to assess the penetration of individual trace elements into the air through their release in the coal coking process, it is necessary to determine the loss of these elements by comparing their contents in the charge coal and in the coke obtained. The present research covered four coke oven batteries differing in age, technology, and technical equipment. By using the mercury analyzer MA-2 and the ICP-MS method, As, Be, Cd, Co, Hg, Mn, Ni, Se, Sr, Tl, V, and Zn were determined in samples of charge coal and yielded coke. Based on the analysis results, the release coefficients of selected elements were determined. Their values ranged from 0.5 to 94%. The high volatility of cadmium, mercury, and thallium was confirmed. The tests have shown that, although the results refer to the selected case studies, it may be concluded that air purity is affected by controlled emission occurring when coke oven batteries are fired with crude coke oven gas. Fugitive emission of the trace elements investigated, occurring due to coke oven leaks and openings, is small and is not a real threat to the environment, except for mercury.
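
    The release coefficients above are obtained by comparing element content in the charge coal with the content retained in the coke. A sketch, assuming the coefficient is the mass fraction of the element not retained in the coke (the concentrations and coke yield below are hypothetical):

```python
def release_coefficient(conc_coal, conc_coke, coke_yield):
    """Fraction of a trace element released during coking.

    conc_coal, conc_coke: element concentration (mg/kg) in charge coal and in coke.
    coke_yield: kg of coke obtained per kg of charge coal.
    """
    released = conc_coal - conc_coke * coke_yield
    return released / conc_coal

# Hypothetical mercury balance: 0.2 mg/kg in coal, 0.015 mg/kg in coke, 0.75 coke yield
r_hg = release_coefficient(0.2, 0.015, 0.75)  # ~0.94, consistent with Hg's high volatility
```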

  3. Dynamics of nanomaterials released from polymer composites in the pelletizing process

    International Nuclear Information System (INIS)

    Kato, Nobuyuki; Yoneda, Minoru; Matsui, Yasuto

    2017-01-01

    Measures against exposure to carbon nanotubes (CNT) are necessary, especially in workplaces that handle nanomaterials, because adverse health effects are a concern. This study focuses on the dynamics of CNT released from CNT/polymer composites during the pelletizing process at a pilot factory. It is difficult to identify CNT and the base resin. By characterizing the possibility of separating CNT from the composite with a kinetic weighting coefficient, estimation can be carried out using a Computational Fluid Dynamics (CFD) simulation. The mass concentration of black carbon and the particle number concentration by diameter were measured using two different measurement apparatuses. The simulation results were then compared to the measured data. The model was verified by the correlation between the simulation and measured results. The model provided a strong correlation, indicating that the dynamics of CNT and the base resin released from the polymer composite can be simulated. It is expected that the model using the CFD simulation can be applied to the occupational health field. (paper)

  4. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    Science.gov (United States)

    Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark R.; Koning, Arjan; Rochman, Dimitri

    2017-09-01

    The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model codes system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤116 with half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian averages, 14 MeV and various accelerator-driven neutron source spectra. This has been assembled using the most up-to-date, internationally-recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found with a small set of errors within the reference databases and TENDL-2014 predictions.

  5. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    Directory of Open Access Journals (Sweden)

    Fleming Michael

    2017-01-01

    Full Text Available The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model codes system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤116 with half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The validation of the TENDL-2014/2015 libraries against standard, evaluated, microscopic and integral cross sections has been performed against a newly compiled UKAEA database of thermal, resonance integral, Maxwellian averages, 14 MeV and various accelerator-driven neutron source spectra. This has been assembled using the most up-to-date, internationally-recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found with a small set of errors within the reference databases and TENDL-2014 predictions.

  6. Development and Validation of a National System for Routine Monitoring of Mortality in People Recently Released from Prison.

    Directory of Open Access Journals (Sweden)

    Stuart A Kinner

    Full Text Available People released from prison are at increased risk of death. However, no country has established a system for routine monitoring of mortality in this population. The aims of this study were to (a) evaluate a system for routine monitoring of deaths after release from prison in Australia and (b) estimate the number of deaths annually within 28 and 365 days of prison release from 2000 to 2013. Persons released from prison and deaths were identified in records held by Centrelink, Australia's national provider of unemployment benefits. Estimates generated in this manner were compared with those from a study that probabilistically linked correctional records with the National Death Index (NDI), for each calendar year 2000 to 2007. Using Centrelink data, national estimates of mortality within 28 and 365 days of release were produced for each calendar year 2000 to 2013. Compared with estimates based on linkage with the NDI, the estimated crude mortality rate based on Centrelink records was on average 52% lower for deaths within 28 days of release and 24% lower for deaths within 365 days of release. Nationally, over the period 2000 to 2013, we identified an average of 32 deaths per year within 28 days of release and 188 deaths per year within 365 days of release. The crude mortality rate for deaths within both 28 and 365 days of release increased over this time. Using routinely collected unemployment benefits data, we detected the majority of deaths in people recently released from prison in Australia. These data may be sufficient for routine monitoring purposes and it may be possible to adopt a similar approach in other countries. Routine surveillance of mortality in ex-prisoners serves to highlight their extreme vulnerability and provides a basis for evaluating policy reforms designed to reduce preventable deaths.
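The monitoring statistic here is a crude mortality rate over a fixed post-release window. A minimal sketch using the death counts quoted in the abstract; the annual number of releases is invented, since the abstract does not report a denominator:

```python
def crude_mortality_rate(deaths, person_releases, per=1000):
    """Deaths within a fixed window of release, per `per` releases."""
    return per * deaths / person_releases

# Average annual death counts are from the abstract; the denominator
# (releases per year) is hypothetical.
releases_per_year = 40000
rate_28d = crude_mortality_rate(32, releases_per_year)
rate_365d = crude_mortality_rate(188, releases_per_year)

# Centrelink-based 28-day estimates averaged 52% below NDI linkage, so a
# rough upward correction divides by (1 - 0.52):
rate_28d_corrected = rate_28d / (1 - 0.52)
print(rate_28d, rate_365d, rate_28d_corrected)
```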

  7. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique

    Directory of Open Access Journals (Sweden)

    Sriram Valavala

    2018-01-01

    Full Text Available A simple, sensitive, accurate, and robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol are used in the manufacturing process of the tartaric acid-based pellets, considering the solubility of dipyridamole and the excipients at the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. Method validation was carried out according to the Q2 guideline on validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.
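Two of the ICH Q2 validation characteristics implied here (linearity, detection and quantitation limits) reduce to short calculations: a least-squares calibration fit, then LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. A sketch with invented calibration data:

```python
# ICH Q2-style linearity and LOD/LOQ estimation for a headspace-GC
# calibration. Concentrations and peak areas below are hypothetical.

def linear_fit(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

conc = [10, 20, 40, 60, 80, 100]          # ppm acetone (hypothetical)
area = [102, 198, 405, 601, 795, 1003]    # peak areas (hypothetical)

slope, intercept = linear_fit(conc, area)
resid = [y - (slope * x + intercept) for x, y in zip(conc, area)]
sigma = (sum(r * r for r in resid) / (len(conc) - 2)) ** 0.5  # residual SD

lod = 3.3 * sigma / slope   # ICH Q2 detection limit
loq = 10.0 * sigma / slope  # ICH Q2 quantitation limit
print(round(slope, 3), round(lod, 2), round(loq, 2))
```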

  8. Material model validation for laser shock peening process simulation

    International Nuclear Information System (INIS)

    Amarchinta, H K; Grandhi, R V; Langer, K; Stargel, D S

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters such as laser spot size, pressure profile and material model that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process the material is subjected to strain rates of 10⁶ s⁻¹, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly differently at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic–plastic behavior of materials. Elastic perfectly plastic, Johnson–Cook and Zerilli–Armstrong models are used, and the performance of each model is compared with available experimental results.

  9. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems together with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  10. Field studies of the atmospheric processing of radionuclides released at Chernobyl

    International Nuclear Information System (INIS)

    Slinn, W.G.N.; Brodzinski, R.L.; Busness, K.M.

    1987-01-01

    This report gives a summary description of the goals and execution of field studies to characterize the radionuclides released from Chernobyl and to examine their long-range atmospheric transport and processing. Starting on April 28, 1986, an intensive planning and preparation effort was mounted over a very short time to intercept and characterize the Chernobyl plume on its arrival over the western US. During the execution of these studies, the investigators had very little firm information (beyond their own measurements and trajectory estimates) to confirm the location of the Chernobyl plume. On May 20, 1986, the studies were terminated, after extensive airborne sampling of air, cloud water, precipitation, soil, and vegetation as the plume traversed the western and central US and moved into Canada.

  11. Dynamic modeling and validation of a lignocellulosic enzymatic hydrolysis process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2013-01-01

    The enzymatic hydrolysis process is one of the key steps in second generation biofuel production. After being thermally pretreated, the lignocellulosic material is liquefied by enzymes prior to fermentation. The scope of this paper is to evaluate a dynamic model of the hydrolysis process on a demonstration scale reactor. The following novel features are included: the application of the Convection–Diffusion–Reaction equation to a hydrolysis reactor to assess transport and mixing effects; the extension of a competitive kinetic model with enzymatic pH dependency and hemicellulose hydrolysis; a comprehensive pH model; and viscosity estimations during the course of reaction. The model is evaluated against real data extracted from a demonstration scale biorefinery throughout several days of operation. All measurements are within prediction uncertainty and, therefore, the model constitutes a valuable…
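The Convection–Diffusion–Reaction equation mentioned above has the generic 1-D form ∂C/∂t = D ∂²C/∂x² − u ∂C/∂x + r(C). A minimal explicit finite-difference sketch of that equation with a first-order sink term; this is not the authors' reactor model, and all coefficients and grid parameters are invented:

```python
import numpy as np

# Explicit finite-difference solution of dC/dt = D*d2C/dx2 - u*dC/dx - k*C
# on a unit domain with a fixed inlet concentration. Parameters satisfy the
# stability limits D*dt/dx**2 <= 0.5 and u*dt/dx <= 1 (here 0.1 and 0.01).

D, u, k = 1e-3, 0.01, 0.05          # diffusivity, velocity, reaction rate
nx, dx, dt, steps = 100, 0.01, 0.01, 500
C = np.zeros(nx)
C[0] = 1.0                           # Dirichlet inlet condition

for _ in range(steps):
    Cn = C.copy()
    # interior nodes: central diffusion, upwind convection, linear sink
    C[1:-1] = (Cn[1:-1]
               + dt * D * (Cn[2:] - 2 * Cn[1:-1] + Cn[:-2]) / dx**2
               - dt * u * (Cn[1:-1] - Cn[:-2]) / dx
               - dt * k * Cn[1:-1])
    C[0] = 1.0                       # hold the inlet concentration
    C[-1] = C[-2]                    # zero-gradient outlet

print(C[:5])  # decaying concentration profile near the inlet
```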

  12. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with Cpk > 1.33 as performing well within statistical control; a Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million, which is equivalent to an acceptance probability of >99.99%.
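The closing Cpk figure can be checked directly: for a centered, normally distributed process, the defect rate is the two-tail probability beyond ±3·Cpk standard deviations. A short sketch reproducing the ~63 defects-per-million figure:

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def acceptance_probability(cpk):
    """P(in spec) for a centered, normally distributed process."""
    defect_rate = 2.0 * normal_cdf(-3.0 * cpk)   # both tails, centered case
    return 1.0 - defect_rate

dpm = (1.0 - acceptance_probability(4.0 / 3.0)) * 1e6
print(round(dpm, 1))  # ~63 defective units per million, i.e. Pa > 99.99%
```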

  13. Empiric validation of a process for behavior change.

    Science.gov (United States)

    Elliot, Diane L; Goldberg, Linn; MacKinnon, David P; Ranby, Krista W; Kuehl, Kerry S; Moe, Esther L

    2016-09-01

    Most behavior change trials focus on outcomes rather than deconstructing how those outcomes relate to programmatic theoretical underpinnings and intervention components. In this report, the process of change is compared for three evidence-based programs that shared theories, intervention elements and potential mediating variables. Each investigation was a randomized trial that assessed pre- and post-intervention variables using survey constructs with established reliability. Each also used mediation analyses to define relationships. The findings were combined using a pattern matching approach. Surprisingly, knowledge was a significant mediator in each program (significant a and b path effects) of behavior change.
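The product-of-coefficients logic behind the a and b paths can be sketched in a few lines: path a regresses the mediator (knowledge) on the intervention, path b regresses the outcome on the mediator while controlling for the intervention, and a·b estimates the indirect effect. All data below are simulated, not from the trials:

```python
import numpy as np

# Simulated mediation example: intervention x -> knowledge m -> behavior y.
# True path values are a = 0.5 and b = 0.4, so the indirect effect is 0.2.
rng = np.random.default_rng(0)
n = 500
x = rng.integers(0, 2, n).astype(float)      # intervention assignment (0/1)
m = 0.5 * x + rng.normal(0, 1, n)            # mediator: knowledge
y = 0.4 * m + 0.1 * x + rng.normal(0, 1, n)  # outcome: behavior

# Path a: regress m on x; path b: regress y on x and m (coefficient of m).
a = np.linalg.lstsq(np.c_[np.ones(n), x], m, rcond=None)[0][1]
b = np.linalg.lstsq(np.c_[np.ones(n), x, m], y, rcond=None)[0][2]

print(a * b)   # indirect (mediated) effect estimate; population value is 0.2
```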

  14. Sludge thermal oxidation processes: mineral recycling, energy impact, and greenhouse effect gases release

    Energy Technology Data Exchange (ETDEWEB)

    Guibelin, Eric

    2003-07-01

    Different treatment routes have been studied for a mixed sludge: the conventional agricultural use is compared with thermal oxidation processes, including incineration (in the gaseous phase) and wet air oxidation (in the liquid phase). The interest of sludge digestion prior to the final treatment has also been considered according to the two major criteria, which are fossil energy utilisation and the release of greenhouse effect gases (CO{sub 2}, CH{sub 4}, N{sub 2}O). Thermal energy has to be recovered in thermal processes to make them environmentally friendly; otherwise their main interest is to extract or destroy micropollutants and pathogens from the carbon cycle. In the case of continuous energy recovery, incineration can produce more energy than it consumes. Digestion is especially interesting for agriculture: under these two schemes, the final energy balance can also be in excess. As for wet air oxidation, it is probably one of the best ways to minimize greenhouse effect gas emissions. (author)

  15. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative Experience Questionnaire.
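The Rasch model used in this case study has a simple closed form for dichotomous items: the probability of endorsing an item is a logistic function of the difference between person ability θ and item difficulty b. A minimal sketch (parameter values are illustrative, not from the questionnaire):

```python
from math import exp

def rasch_p(theta, b):
    """Dichotomous Rasch model: P(endorse | ability theta, difficulty b)."""
    return 1.0 / (1.0 + exp(-(theta - b)))

# A respondent whose ability equals the item difficulty endorses it with P = 0.5;
# one logit of extra ability raises that to about 0.73.
print(rasch_p(0.0, 0.0))   # 0.5
print(rasch_p(1.0, 0.0))   # ~0.73
```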

  16. Process Modeling and Validation for Metal Big Area Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, Srdjan [ORNL; Nycz, Andrzej [ORNL; Noakes, Mark W. [ORNL; Chin, Charlie [Dassault Systemes; Oancea, Victor [Dassault Systemes

    2017-05-01

    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology based on the metal arc welding. A continuously fed metal wire is melted by an electric arc that forms between the wire and the substrate, and deposited in the form of a bead of molten metal along the predetermined path. Objects are manufactured one layer at a time starting from the base plate. The final properties of the manufactured object are dependent on its geometry and the metal deposition path, in addition to depending on the basic welding process parameters. Computational modeling can be used to accelerate the development of the mBAAM technology as well as a design and optimization tool for the actual manufacturing process. We have developed a finite element method simulation framework for mBAAM using the new features of software ABAQUS. The computational simulation of material deposition with heat transfer is performed first, followed by the structural analysis based on the temperature history for predicting the final deformation and stress state. In this formulation, we assume that two physics phenomena are coupled in only one direction, i.e. the temperatures are driving the deformation and internal stresses, but their feedback on the temperatures is negligible. The experiment instrumentation (measurement types, sensor types, sensor locations, sensor placements, measurement intervals) and the measurements are presented. The temperatures and distortions from the simulations show good correlation with experimental measurements. Ongoing modeling work is also briefly discussed.

  17. Modulation of drug release kinetics of shellac-based matrix tablets by in-situ polymerization through annealing process.

    Science.gov (United States)

    Limmatvapirat, Sontaya; Limmatvapirat, Chutima; Puttipipatkhachorn, Satit; Nunthanid, Jurairat; Luangtana-anan, Manee; Sriamornsak, Pornsak

    2008-08-01

    A new oral controlled-release matrix tablet based on shellac polymer was designed and developed, using metronidazole (MZ) as a model drug. The shellac-based matrix tablets were prepared by wet granulation using different amounts of shellac and lactose. The effect of annealing temperature and pH of the medium on drug release from the matrix tablets was investigated. Increasing the amount of shellac and the annealing temperature significantly affected the physical properties (i.e., tablet hardness and tablet disintegration) and MZ release from the matrix tablets. In-situ polymerization played a major role in the changes in shellac properties during the annealing process. Though shellac did not dissolve in the acid medium, MZ release in 0.1 N HCl was faster than in pH 7.3 buffer, resulting from the higher solubility of MZ in acid medium. The modulation of MZ release kinetics from shellac-based matrix tablets could be accomplished by varying the amount of shellac or the annealing temperature. The release kinetics shifted from relaxation-controlled to diffusion-controlled release when the amount of shellac or the annealing temperature was increased.
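A standard way to classify release as diffusion- versus relaxation-controlled (the abstract does not name the method, but the Korsmeyer–Peppas power law Mt/M∞ = k·tⁿ is the usual choice) is to fit the release exponent n by log-log least squares. A hedged sketch with invented release data:

```python
from math import log, exp

# Fit Mt/Minf = k * t**n by linear regression in log-log space.
# The time points and release fractions below are hypothetical.

def fit_power_law(times, fractions):
    lx = [log(t) for t in times]
    ly = [log(f) for f in fractions]
    m = len(times)
    sx, sy = sum(lx), sum(ly)
    sxx = sum(v * v for v in lx)
    sxy = sum(a * b for a, b in zip(lx, ly))
    n = (m * sxy - sx * sy) / (m * sxx - sx * sx)   # release exponent
    k = exp((sy - n * sx) / m)
    return n, k

t = [0.5, 1, 2, 4, 6]                    # hours (hypothetical)
frac = [0.09, 0.13, 0.18, 0.25, 0.31]    # Mt/Minf (hypothetical)
n, k = fit_power_law(t, frac)
print(round(n, 2))   # an exponent near 0.45 suggests Fickian diffusion control
```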

  18. EXPERIMENTAL VALIDATION OF CUMULATIVE SURFACE LOCATION ERROR FOR TURNING PROCESSES

    Directory of Open Access Journals (Sweden)

    Adam K. Kiss

    2016-02-01

    Full Text Available The aim of this study is to create a mechanical model which is suitable to investigate the surface quality in turning processes, based on the Cumulative Surface Location Error (CSLE), which describes the series of consecutive Surface Location Errors (SLE) in roughing operations. In the established model, the investigated CSLE depends on the current and the previously resulting SLE by means of the variation of the width of cut. The phenomenon of the system can be described as an implicit discrete map. The stationary Surface Location Error and its bifurcations were analysed and a flip-type bifurcation was observed for CSLE. Experimental verification of the theoretical results was carried out.

  19. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    Science.gov (United States)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy savings and safer vehicles has driven the development of Ultra High Strength Steel (UHSS) material. To strengthen a UHSS material such as boron steel, it must undergo a hot stamping process of heating at a certain temperature and time. In this paper, the Taguchi method is applied to determine the appropriate parameters of thickness, heating temperature and heating time to achieve optimum strength of boron steel. The experiment is conducted using a flat square hot stamping tool with a tensile dog bone as the blank product. Then, tensile strength and hardness are measured as responses. The results showed that lower thickness, higher heating temperature and longer heating time give higher strength and hardness for the final product. In conclusion, boron steel blanks are able to achieve up to 1200 MPa tensile strength and 650 HV hardness.
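Since strength and hardness are "larger-the-better" responses, Taguchi analysis typically ranks parameter settings by the signal-to-noise ratio S/N = −10·log₁₀(mean(1/yᵢ²)). A short sketch with hypothetical replicate strengths (the abstract reports only the peak values achieved):

```python
from math import log10

# Larger-the-better Taguchi signal-to-noise ratio; higher S/N marks the
# better parameter setting. The tensile strengths (MPa) are hypothetical.

def sn_larger_is_better(values):
    return -10.0 * log10(sum(1.0 / v**2 for v in values) / len(values))

sn_high = sn_larger_is_better([1150, 1180, 1200])  # strong, consistent setting
sn_low = sn_larger_is_better([900, 950, 980])      # weaker setting
print(round(sn_high, 2), round(sn_low, 2))
```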

  20. Defense Waste Processing Facility Canister Closure Weld Current Validation Testing

    Energy Technology Data Exchange (ETDEWEB)

    Korinko, P. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Maxwell, D. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2018-01-29

    Two closure welds on filled Defense Waste Processing Facility (DWPF) canisters failed to be within the acceptance criteria in the DWPF operating procedure SW4-15.80-2.3 (1). In one case, the weld heat setting was inadvertently provided to the canister at the value used for test welds (i.e., 72%) and this oversight produced a weld at a current of nominally 210 kA compared to the operating procedure range (i.e., 82%) of 240 kA to 263 kA. The second weld appeared to experience an instrumentation and data acquisition upset. The current for this weld was reported as 191 kA. Review of the data from the Data Acquisition System (DAS) indicated that three of the four current legs were reading the expected values, approximately 62 kA each, and the fourth leg read zero current. Since there is no feasible way by further examination of the process data to ascertain if this weld was actually welded at either the target current or the lower current, a test plan was executed to provide assurance that these Nonconforming Welds (NCWs) meet the requirements for strength and leak tightness. Acceptance of the welds is based on evaluation of Test Nozzle Welds (TNW) made specifically for comparison. The TNW were nondestructively and destructively evaluated for plug height, heat tint, ultrasonic testing (UT) for bond length and ultrasonic volumetric examination for weld defects, burst pressure, fractography, and metallography. The testing was conducted in agreement with a Task Technical and Quality Assurance Plan (TTQAP) (2) and applicable procedures.

  1. Perceiving pain in others: validation of a dual processing model.

    Science.gov (United States)

    McCrystal, Kalie N; Craig, Kenneth D; Versloot, Judith; Fashler, Samantha R; Jones, Daniel N

    2011-05-01

    Accurate perception of another person's painful distress would appear to be accomplished through sensitivity to both automatic (unintentional, reflexive) and controlled (intentional, purposive) behavioural expression. We examined whether observers would construe diverse behavioural cues as falling within these domains, consistent with cognitive neuroscience findings describing activation of both automatic and controlled neuroregulatory processes. Using online survey methodology, 308 research participants rated behavioural cues as "goal directed vs. non-goal directed," "conscious vs. unconscious," "uncontrolled vs. controlled," "fast vs. slow," "intentional (deliberate) vs. unintentional," "stimulus driven (obligatory) vs. self driven," and "requiring contemplation vs. not requiring contemplation." The behavioural cues were the 39 items provided by the PROMIS pain behaviour bank, constructed to be representative of the diverse possibilities for pain expression. Inter-item correlations among rating scales provided evidence of sufficient internal consistency justifying a single score on an automatic/controlled dimension (excluding the inconsistent fast vs. slow scale). An initial exploratory factor analysis on 151 participant data sets yielded factors consistent with "controlled" and "automatic" actions, as well as behaviours characterized as "ambiguous." A confirmatory factor analysis using the remaining 151 data sets replicated EFA findings, supporting theoretical predictions that observers would distinguish immediate, reflexive, and spontaneous reactions (primarily facial expression and paralinguistic features of speech) from purposeful and controlled expression (verbal behaviour, instrumental behaviour requiring ongoing, integrated responses). There are implicit dispositions to organize cues signaling pain in others into the well-defined categories predicted by dual process theory. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  2. QA/QC Reflected in ISO 11137; The Role of Dosimetry in the Validation Process

    International Nuclear Information System (INIS)

    Kovacs, A.

    2007-01-01

    Standardized dosimetry (ISO/ASTM standards), as a tool of QC, plays a key role in the validation of sterilization and food irradiation processes, as well as in controlling the radiation processing of polymer products. In radiation processing, validation and process control (e.g. sterilization, food irradiation) depend on the measurement of absorbed dose. These measurements shall be performed using a dosimetric system or systems having a known level of accuracy and precision (European standard EN 552:1994). In the presented lecture, different aspects of operational qualification during the radiation processing of polymer products are described.

  3. Quantitative Analysis of Kr-85 Fission Gas Release from Dry Process for the Treatment of Spent PWR Fuel

    International Nuclear Information System (INIS)

    Park, Geun Il; Cho, Kwang Hun; Lee, Dou Youn; Lee, Jung Won; Park, Jang Jin; Song, Kee Chan

    2007-01-01

    As spent UO2 fuel oxidizes to U3O8 by air oxidation, the corresponding volume expansion separates grains, releasing the grain-boundary inventory of fission gases. Fission products in spent UO2 fuel can be distributed in three major regions: the inventory in the fuel-sheath gap, the inventory on grain boundaries, and the inventory in the UO2 matrix. The release characteristics of fission gases depend on the amounts distributed among these three regions as well as on spent fuel burn-up. Oxidation experiments on spent fuel at 500 °C give information on the fission gas inventory in spent fuel, and further annealing experiments at higher temperature yield the matrix inventory of fission gases on segregated grains. In a previous study, the fractional release characteristics of Kr-85 during the OREOX (Oxidation and REduction of Oxide fuel) treatment, the principal key process for recycling spent PWR fuel via the DUPIC cycle, were evaluated as a function of fuel burn-up at 27.3, 35 and 65 MWd/tU. In this paper, new release experiment results of Kr-85 using spent fuel with a burn-up of 58 GWd/tU are included to evaluate the fission gas release behavior. In summary, the quantitative analysis of Kr-85 release characteristics from various spent fuels with different burn-ups during the voloxidation and OREOX processes was reviewed.

  4. Process data validation according VDI 2048 in conventional and nuclear power plants

    International Nuclear Information System (INIS)

    Langenstein, M.; Laipple, B.; Schmid, F.

    2004-01-01

    Process data validation according to VDI 2048 in conventional and nuclear power plants is required for acceptance testing, process and component monitoring, and status-oriented maintenance. Once a validation system like VALI III has been certified according to VDI 2048, power plant owners can use the data obtained for efficiency increases. Further, all control variables can be adjusted so as to ensure maximum plant efficiency. (orig.)
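The core of VDI 2048-style validation is Gaussian data reconciliation: redundant measurements are corrected, weighted by their uncertainties, so the balance equations close exactly. A minimal numpy sketch with invented flows and uncertainties (this is an illustration of the reconciliation step, not the VALI III software):

```python
import numpy as np

# Reconcile redundant flow measurements subject to a linear mass balance
# A x = 0 by variance-weighted constrained least squares. The closed-form
# correction is x = m - S A^T (A S A^T)^-1 (A m). Values are hypothetical.

m = np.array([100.0, 60.5, 38.0])        # measured: inlet, outlet 1, outlet 2
S = np.diag([1.0**2, 0.8**2, 0.5**2])    # measurement covariance (sigma^2)
A = np.array([[1.0, -1.0, -1.0]])        # mass balance: f0 - f1 - f2 = 0

lam = np.linalg.solve(A @ S @ A.T, A @ m)
x = m - S @ A.T @ lam                    # reconciled (validated) flows

print(x, (A @ x).item())  # corrected flows; balance residual is ~0
```

Note how the least-certain measurement (the inlet, with the largest variance) absorbs the largest share of the 1.5-unit imbalance.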

  5. Hanford Environmental Restoration data validation process for chemical and radiochemical analyses

    International Nuclear Information System (INIS)

    Adams, M.R.; Bechtold, R.A.; Clark, D.E.; Angelos, K.M.; Winter, S.M.

    1993-10-01

    Detailed procedures for validation of chemical and radiochemical data are used to assure consistent application of validation principles and support a uniform database of quality environmental data. During application of these procedures, it was determined that laboratory data packages were frequently missing certain types of documentation, causing subsequent delays in meeting critical milestones in the completion of validation activities. A quality improvement team was assembled to address the problems caused by missing documentation and to streamline the entire process. The result was the development of a separate data package verification procedure and revisions to the data validation procedures. This has resulted in a system whereby deficient data packages are immediately identified and corrected prior to validation, and in revised validation procedures that more closely match the common analytical reporting practices of laboratory service vendors.

  6. A model system for targeted drug release triggered by biomolecular signals logically processed through enzyme logic networks.

    Science.gov (United States)

    Mailloux, Shay; Halámek, Jan; Katz, Evgeny

    2014-03-07

    A new Sense-and-Act system was realized by the integration of a biocomputing system, performing analytical processes, with a signal-responsive electrode. A drug-mimicking release process was triggered by biomolecular signals processed by different logic networks, including three concatenated AND logic gates or a 3-input OR logic gate. Biocatalytically produced NADH, controlled by various combinations of input signals, was used to activate the electrochemical system. A biocatalytic electrode associated with signal-processing "biocomputing" systems was electrically connected to another electrode coated with a polymer film, which was dissolved upon the formation of negative potential releasing entrapped drug-mimicking species, an enzyme-antibody conjugate, operating as a model for targeted immune-delivery and consequent "prodrug" activation. The system offers great versatility for future applications in controlled drug release and personalized medicine.
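The two networks described reduce to simple Boolean structures: three concatenated 2-input AND gates (so all four biochemical inputs must be present to produce NADH and trigger release) versus a 3-input OR gate (any single input suffices). A sketch with generic 0/1 inputs, since the abstract does not name the individual biomolecular signals:

```python
# Boolean sketch of the two enzyme logic networks: a cascade of three
# 2-input AND gates combining four inputs, and a 3-input OR gate.

def three_and_cascade(i1, i2, i3, i4):
    """((i1 AND i2) AND i3) AND i4 -> NADH output triggers drug release."""
    return bool(((i1 and i2) and i3) and i4)

def three_input_or(i1, i2, i3):
    """i1 OR i2 OR i3 -> NADH output triggers drug release."""
    return bool(i1 or i2 or i3)

print(three_and_cascade(1, 1, 1, 1))  # True: all inputs present -> release
print(three_and_cascade(1, 1, 0, 1))  # False: one missing input blocks release
print(three_input_or(0, 1, 0))        # True: any single input triggers release
```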

  7. Model determination and validation for reactive wetting processes

    Energy Technology Data Exchange (ETDEWEB)

    Yost, F.G.; O'Toole, E.J.; Sackinger, P.A. [Sandia National Labs., Albuquerque, NM (United States); Swiler, T.P. [Univ. of New Mexico, Albuquerque, NM (United States). Dept. of Chemical and Nuclear Engineering

    1998-01-01

    It is shown that dissolutive wetting initially yields a metastable equilibrium. A compact model for the kinetics of approach to this metastable state is described. The technique for constructing these kinetics stems from the early work of Onsager and begins with a relationship for the entropy production. From this, a coupled set of nonlinear, ordinary differential equations can be written directly. The equations are solved numerically for the wetted area and compared with experimental data. The model captures many of the subtle complexities of dissolutive wetting such as multiple metastable states. Sessile drop experiments involving a variety of Bi-Sn alloys on solid Bi substrates were performed. Substrates prepared from small and large-grained polycrystals and single crystals were used to measure equilibrium and metastable contact angles and estimate the surface tension and equilibrium contact angle of the solid-liquid interface. The substrates were also used to investigate the coupling of the dissolution and wetting processes and to investigate the effect of substrate grain size on wetting. It was determined that the equilibrium wetting geometry is independent of linear scale and that grain size has little influence on wetting or dissolution in the Bi-Sn system. To investigate the atomic behavior of liquids at interfaces during wetting, the authors simulated wetting in the Ag-Cu system using molecular dynamics with atomic potentials and observed both atomic dynamics and structural correlations of the liquid-solid interface. The authors found that spreading is prompted by interactions between the liquid and the substrate surface that cause the liquid layer in contact with the substrate to take on some of the symmetry of the substrate surface and result in the formation of a liquid monolayer that extends beyond the major part of the liquid droplet.

  8. Insights into the swelling process and drug release mechanisms from cross-linked pectin/high amylose starch matrices

    Directory of Open Access Journals (Sweden)

    Fernanda M. Carbinatto

    2014-02-01

    Cross-linked pectin/high amylose mixtures were evaluated as a new excipient for matrix tablet formulations, since the mixing of polymers and the cross-linking reaction are rational tools for producing materials with modulated, specific properties that meet particular therapeutic needs. Objective: In this work the influence of polymer ratio and cross-linking process on the swelling and on the mechanism driving drug release from swellable matrix tablets prepared with this excipient was investigated. Methods: Cross-linked samples were characterized by their micromeritic properties (size and shape, density, angle of repose and flow rate) and liquid uptake ability. Matrix tablets were evaluated according to their physical properties, and the drug release rates and mechanisms were also investigated. Results: Cross-linked samples demonstrated size homogeneity and irregular shape, with liquid uptake ability insensitive to pH. Cross-linking of the samples allowed control of drug release rates, and the drug release mechanism was influenced by both polymer ratio and cross-linking process. Drug release from samples with a minor proportion of pectin was driven by anomalous transport, and increasing the pectin proportion contributed to erosion of the matrix. Conclusion: The cross-linked mixtures of high amylose and pectin proved a suitable excipient for slowing drug release rates.

  9. A Supervised Learning Process to Validate Online Disease Reports for Use in Predictive Models.

    Science.gov (United States)

    Patching, Helena M M; Hudson, Laurence M; Cooke, Warrick; Garcia, Andres J; Hay, Simon I; Roberts, Mark; Moyes, Catherine L

    2015-12-01

    Pathogen distribution models that predict spatial variation in disease occurrence require data from a large number of geographic locations to generate disease risk maps. Traditionally, this process has used data from public health reporting systems; however, using online reports of new infections could speed up the process dramatically. Data from both public health systems and online sources must be validated before they can be used, but no mechanisms exist to validate data from online media reports. We have developed a supervised learning process to validate geolocated disease outbreak data in a timely manner. The process uses three input features, the data source and two metrics derived from the location of each disease occurrence. The location of disease occurrence provides information on the probability of disease occurrence at that location based on environmental and socioeconomic factors and the distance within or outside the current known disease extent. The process also uses validation scores, generated by disease experts who review a subset of the data, to build a training data set. The aim of the supervised learning process is to generate validation scores that can be used as weights going into the pathogen distribution model. After analyzing the three input features and testing the performance of alternative processes, we selected a cascade of ensembles comprising logistic regressors. Parameter values for the training data subset size, number of predictors, and number of layers in the cascade were tested before the process was deployed. The final configuration was tested using data for two contrasting diseases (dengue and cholera), and 66%-79% of data points were assigned a validation score. The remaining data points are scored by the experts, and the results inform the training data set for the next set of predictors, as well as going to the pathogen distribution model. The new supervised learning process has been implemented within our live site and is
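As a rough sketch of the cascade-of-ensembles idea (not the authors' implementation: the features, training data, confidence rule, and thresholds below are all invented for illustration), each layer scores only the points it is confident about and passes the rest down, mimicking how unresolved reports fall through to expert review:

```python
import numpy as np

def train_logistic(X, y, lr=0.5, epochs=500):
    """Plain gradient-descent logistic regression, an illustrative
    stand-in for one of the cascade's logistic regressors."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        z = np.clip(Xb @ w, -30, 30)
        p = 1.0 / (1.0 + np.exp(-z))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    z = np.clip(Xb @ w, -30, 30)
    return 1.0 / (1.0 + np.exp(-z))

def cascade_scores(layers, X, margin=0.25):
    """Assign a validation score once some layer is confident
    (probability at least `margin` away from 0.5); points no layer
    resolves keep score None, i.e. they go to the experts."""
    scores = [None] * len(X)
    pending = np.arange(len(X))
    for w in layers:
        if pending.size == 0:
            break
        p = predict_proba(w, X[pending])
        sure = np.abs(p - 0.5) >= margin
        for i, prob in zip(pending[sure], p[sure]):
            scores[i] = float(prob)
        pending = pending[~sure]
    return scores

# Toy stand-ins for the two location-derived features the paper names:
# environmental suitability and distance to the known disease extent.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
layers = [train_logistic(X[:100], y[:100]), train_logistic(X[100:], y[100:])]
scores = cascade_scores(layers, X)
auto_validated = sum(s is not None for s in scores)
```

The fraction of points that receive a score automatically plays the role of the 66%-79% figure reported for dengue and cholera; the remainder would be routed to expert review.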

  10. Neural processes mediating the preparation and release of focal motor output are suppressed or absent during imagined movement

    Science.gov (United States)

    Eagles, Jeremy S.; Carlsen, Anthony N.

    2016-01-01

    Movements that are executed or imagined activate a similar subset of cortical regions, but the extent to which this activity represents functionally equivalent neural processes is unclear. During preparation for an executed movement, presentation of a startling acoustic stimulus (SAS) evokes a premature release of the planned movement with the spatial and temporal features of the task essentially intact. If imagined movement incorporates the same preparatory processes as executed movement, then a SAS should release the planned movement during preparation. This hypothesis was tested using an instructed-delay cueing paradigm during which subjects were required to rapidly release a handheld weight while maintaining the posture of the arm, or to perform first-person imagery of the same task while holding the weight. In a subset of trials, a SAS was presented at 1500, 500, or 200 ms prior to the release cue. Task-appropriate preparation during executed and imagined movements was confirmed by electroencephalographic recording of a contingent negative variation waveform. During preparation for executed movement, a SAS often resulted in premature release of the weight, with the probability of release progressively increasing from 24% at −1500 ms to 80% at −200 ms. In contrast, the SAS rarely resulted in release of the weight during imagined movement. However, the SAS frequently evoked the planned postural response (suppression of biceps brachii muscle activity) irrespective of the task or timing of stimulation (even during periods of postural hold without preparation). These findings provide evidence that neural processes mediating the preparation and release of the focal motor task (release of the weight) are markedly attenuated or absent during imagined movement and that postural and focal components of the task are prepared independently. PMID:25744055

  11. Improvement of the model for surface process of tritium release from lithium oxide

    International Nuclear Information System (INIS)

    Yamaki, Daiju; Iwamoto, Akira; Jitsukawa, Shiro

    2000-01-01

    Among the various tritium transport processes in lithium ceramics, the importance and the detailed mechanism of surface reactions remain to be elucidated. A dynamic adsorption and desorption model for tritium desorption from lithium ceramics, especially Li2O, was constructed. From the experimental results, it was considered that both H2 and H2O are dissociatively adsorbed on Li2O and generate OH⁻ on the surface. In the first model, developed in 1994, it was assumed that dissociative adsorption of either H2 or H2O on Li2O generates two OH⁻ on the surface. However, recent calculation results show that generation of one OH⁻ and one H⁻ by the dissociative adsorption of H2 is more stable than generation of two OH⁻. Therefore, the treatment of H2 adsorption and desorption in the first model was revised, and the tritium release behavior from the Li2O surface was re-evaluated using the improved model. The tritium residence time on the Li2O surface was calculated using the improved model, and the results were compared with the experimental results. The calculation results using the improved model agree better with the experimental results than those using the first model
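The residence-time calculation mentioned above can be illustrated with a standard first-order desorption estimate; the Arrhenius form is textbook surface kinetics, but the attempt frequency and activation energy below are placeholder values, not the paper's fitted parameters:

```python
import math

# Illustrative surface residence time, tau = (1/nu) * exp(E_d / (k_B * T)).
# NU and E_D are assumed values for the sketch, not from the study.
K_B = 8.617e-5          # Boltzmann constant, eV/K
NU = 1.0e13             # assumed attempt frequency, 1/s
E_D = 1.2               # assumed desorption activation energy, eV

def residence_time(T):
    """Mean residence time (s) of an adsorbed species at temperature T (K)."""
    return math.exp(E_D / (K_B * T)) / NU

tau_700 = residence_time(700.0)   # lower temperature: longer residence
tau_900 = residence_time(900.0)   # higher temperature: shorter residence
```

This captures the qualitative behavior such models reproduce: residence time on the surface falls steeply as temperature rises.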

  12. Release of carbon nanotubes from an epoxy-based nanocomposite during an abrasion process.

    Science.gov (United States)

    Schlagenhauf, Lukas; Chu, Bryan T T; Buha, Jelena; Nüesch, Frank; Wang, Jing

    2012-07-03

    The abrasion behavior of an epoxy/carbon nanotube (CNT) nanocomposite was investigated. An experimental setup has been established to perform abrasion, particle measurement, and collection all in one. The abraded particles were characterized by particle size distribution and by electron microscopy. The abrasion process was carried out with a Taber Abraser, and the released particles were collected by a tube for further investigation. The particle size distributions were measured with a scanning mobility particle sizer (SMPS) and an aerodynamic particle sizer (APS) and revealed four size modes for all measured samples. The mode corresponding to the smallest particle sizes of 300-400 nm was measured with the SMPS and showed a trend of increasing size with increasing nanofiller content. The three measured modes with particle sizes from 0.6 to 2.5 μm, measured with the APS, were similar for all samples. The measured particle concentrations were between 8000 and 20,000 particles/cm³ for measurements with the SMPS and between 1000 and 3000 particles/cm³ for measurements with the APS. Imaging by transmission electron microscopy (TEM) revealed that free-standing individual CNTs and agglomerates were emitted during abrasion.
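To put the reported number concentrations in perspective, a back-of-the-envelope count of sampled particles is N = C × Q × t (concentration times sampling flow times duration); the flow rate and duration below are assumed, not taken from the study:

```python
# Hypothetical sampling scenario using a concentration within the
# paper's reported SMPS range of 8,000-20,000 particles/cm3.
C_SMPS = 15_000     # particles/cm3, assumed value within the reported range
Q = 16.7            # sampling flow, cm3/s (~1 L/min, assumed)
t = 600             # sampling duration, s (assumed)

total_particles = C_SMPS * Q * t   # total particles drawn through the sampler
```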

  13. Melanosomes are transferred from melanocytes to keratinocytes through the processes of packaging, release, uptake, and dispersion.

    Science.gov (United States)

    Ando, Hideya; Niki, Yoko; Ito, Masaaki; Akiyama, Kaoru; Matsui, Mary S; Yarosh, Daniel B; Ichihashi, Masamitsu

    2012-04-01

    Recent studies have described the role of shedding vesicles as physiological conveyers of intracellular components between neighboring cells. Here we report that melanosomes are one example of shedding vesicle cargo, but are processed by a previously unreported mechanism. Pigment globules were observed to be connected to the filopodia of melanocyte dendrites, which have previously been shown to be conduits for melanosomes. Pigment globules containing multiple melanosomes were released from various areas of the dendrites of normal human melanocytes derived from darkly pigmented skin. The globules were then captured by the microvilli of normal human keratinocytes, also derived from darkly pigmented skin, which incorporated them in a protease-activated receptor-2 (PAR-2)-dependent manner. After the pigment globules were ingested by the keratinocytes, the membrane that surrounded each melanosome cluster was gradually degraded, and the individual melanosomes then spread into the cytosol and were distributed primarily in the perinuclear area of each keratinocyte. These results suggest a melanosome transfer pathway wherein melanosomes are transferred from melanocytes to keratinocytes via the shedding vesicle system. This packaging system generates pigment globules containing multiple melanosomes in a unique manner.

  14. Kinetics and Mechanism of Metal Retention/Release in Geochemical Processes in Soil - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Robert W.

    2000-12-29

    Effective remediation of soils contaminated with heavy metals requires a better understanding of the mechanisms by which the metals are retained/released in soils over long periods of time. Studies on the reaction of Cr(VI) with iron-rich clays indicated that structural iron(II) in these surfaces is capable of reducing chromate to chromium(III). We found that iron(II), either found naturally or produced by treatment of clay with sodium dithionite, effectively reduced Cr(VI) to Cr(III). Thus, in situ remediation of chromium combines reduction of Cr(VI) to Cr(III) and immobilization of chromium on mineral surfaces. During this study, lead sorption on a kaolin surface was found to be a rapid and pH-dependent process in which lead sorption significantly increased with the amount of phosphate on the clay surface. This study verifies that the methylmercury cation remains intact when it binds to humic acids, forming a monodentate complex with some sub-population of humic thiol ligands.

  15. Adopted levels and derived limits for Ra-226 and the decision making processes concerning TENORM releases

    International Nuclear Information System (INIS)

    Paschoa, A.S.

    2002-01-01

    A fraction of a primary dose limit can, in general, be agreed upon as a dose-related level to be adopted in decision-making processes. In the case of TENORM releases, fractions of primary dose levels for 226 Ra, 228 Ra, and 210 Po may be of particular importance to establish adopted levels and derived limits to guide decision-making processes. Thus, for example, a registration level for 226 Ra could be adopted at the highest portion of the natural background variation. Above such a level, intervention and remedial action levels could also be adopted. All those levels would be fractions of the primary level, but translated in terms of derived limits expressed in practical units. Derived limits would then be calculated by using environmental models. In such an approach, 'critical groups' would have to be carefully defined and identified. In addition, the size of a critical group would be chosen for use in environmental modeling. Site-specific environmental models and parameters are desirable, though unavailable, or very difficult to obtain, in most cases. Thus, mathematical models and parameters of a more generic nature are often used. A parametric sensitivity analysis can rank the parameters used in a model, allowing one to judge how important each parameter is for the model output. The paper points out that, when using the adopted levels and derived limits suggested above, the uncertainties and relative importance of the parameters entering an environmental model can make the difference between decision makers taking the right or the wrong decision, as far as radiological protection is concerned. (author)
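The parameter ranking described above can be sketched as a one-at-a-time perturbation study; the toy dose model and every parameter value below are invented for illustration and carry no radiological meaning:

```python
# Hypothetical dose model: concentration * intake rate * dose conversion
# factor, diluted by an environmental dilution factor.  All values invented.
def dose(params):
    return params["conc"] * params["intake"] * params["dcf"] / params["dilution"]

base = {"conc": 0.5, "intake": 2.0, "dcf": 2.8e-4, "dilution": 10.0}

def rank_parameters(model, base, rel_step=0.1):
    """Perturb each parameter by rel_step (one at a time) and rank
    parameters by the relative change they induce in the model output."""
    d0 = model(base)
    sens = {}
    for name in base:
        p = dict(base)
        p[name] *= 1.0 + rel_step
        sens[name] = abs(model(p) - d0) / abs(d0)
    return sorted(sens, key=sens.get, reverse=True)

ranking = rank_parameters(dose, base)   # most influential parameter first
```

In this multiplicative toy model the numerator parameters dominate and the dilution factor ranks last; in a realistic environmental model the ranking would tell the decision maker which parameters most need site-specific values.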

  16. Can gamma irradiation during radiotherapy influence the metal release process for biomedical CoCrMo and 316L alloys?

    Science.gov (United States)

    Wei, Zheng; Edin, Jonathan; Karlsson, Anna Emelie; Petrovic, Katarina; Soroka, Inna L; Odnevall Wallinder, Inger; Hedberg, Yolanda

    2018-02-09

    The extent of metal release from implant materials that are irradiated during radiotherapy may be influenced by irradiation-formed radicals. The influence of gamma irradiation, with a total dose of relevance for radiotherapy (e.g., for cancer treatments), on the extent of metal release from biomedical stainless steel AISI 316L and a cobalt-chromium alloy (CoCrMo) was investigated in physiologically relevant solutions (phosphate buffered saline with and without 10 g/L bovine serum albumin) at pH 7.3. Directly after irradiation, the released amounts of metals were significantly higher for irradiated CoCrMo as compared to nonirradiated CoCrMo, resulting in an increased surface passivation (enhanced passive conditions) that hindered further release. A similar effect was observed for 316L, which showed lower nickel release after 1 h for initially irradiated samples as compared to nonirradiated samples. However, the effect of irradiation (total dose of 16.5 Gy) on metal release and on surface oxide composition and thickness was generally small. Most metals were released initially (within seconds) upon immersion from CoCrMo but not from 316L. Albumin induced an increased amount of released metals from AISI 316L but not from CoCrMo. Albumin was not found to aggregate to any greater extent either upon gamma irradiation or in the presence of trace metal ions, as determined using different light scattering techniques. Further studies should elucidate the effect of repeated friction and fractionated low irradiation doses on the short- and long-term metal release process of biomedical materials. © 2018 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 2018.

  17. Development and validation of stability indicating method for the quantitative determination of venlafaxine hydrochloride in extended release formulation using high performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Jaspreet Kaur

    2010-01-01

    Objective: Venlafaxine hydrochloride is a structurally novel phenethyl bicyclic antidepressant, usually categorized as a serotonin-norepinephrine reuptake inhibitor (SNRI), although it has also been referred to as a serotonin-norepinephrine-dopamine reuptake inhibitor because it inhibits the reuptake of dopamine. Venlafaxine HCl is widely prescribed in the form of sustained-release formulations. In the current article we report the development and validation of a fast, simple, stability-indicating, isocratic high performance liquid chromatographic (HPLC) method for the determination of venlafaxine hydrochloride in sustained-release formulations. Materials and Methods: The quantitative determination of venlafaxine hydrochloride was performed on a Kromasil C18 analytical column (250 x 4.6 mm i.d., 5 μm particle size) with 0.01 M phosphate buffer (pH 4.5):methanol (40:60) as the mobile phase, at a flow rate of 1.0 ml/min. UV detection was made at 225 nm. Results: During method validation, parameters such as precision, linearity, accuracy, stability, limit of quantification and detection, and specificity were evaluated and remained within acceptable limits. Conclusions: The method has been successfully applied to the quantification and dissolution profiling of venlafaxine HCl in sustained-release formulations. The method presents a simple and reliable solution for routine quantitative analysis of venlafaxine HCl.
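Two of the validation parameters listed (linearity and precision) reduce to simple statistics over calibration and replicate data; the numbers below are invented, and the acceptance thresholds are typical ICH-style choices rather than the paper's criteria:

```python
import statistics

# Invented calibration standards and peak areas for the sketch.
conc = [10, 20, 40, 60, 80, 100]          # µg/mL standards (assumed)
area = [102, 205, 398, 601, 797, 1003]    # peak areas (assumed)

# Linearity: correlation coefficient of the calibration line.
mx, my = statistics.mean(conc), statistics.mean(area)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, area))
sxx = sum((x - mx) ** 2 for x in conc)
syy = sum((y - my) ** 2 for y in area)
r = sxy / (sxx * syy) ** 0.5

# Precision: relative standard deviation of replicate injections.
replicates = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2]  # % recovery (assumed)
rsd = 100 * statistics.stdev(replicates) / statistics.mean(replicates)

linearity_ok = r ** 2 >= 0.999   # typical acceptance: r^2 >= 0.999
precision_ok = rsd <= 2.0        # typical acceptance: RSD <= 2%
```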

  18. Nifedipine-loaded polymeric nanocapsules: validation of a stability-indicating HPLC method to evaluate the drug entrapment efficiency and in vitro release profiles.

    Science.gov (United States)

    Granada, Andréa; Tagliari, Monika Piazzon; Soldi, Valdir; Silva, Marcos António Segatto; Zanetti-Ramos, Betina Ghiel; Fernandes, Daniel; Stulzer, Hellen Karine

    2013-01-01

    A simple stability-indicating analytical method was developed and validated to quantify nifedipine in polymeric nanocapsule suspensions; an in vitro drug release study was then carried out. The analysis was performed using an RP C18 column, UV-Vis detection at 262 nm, and methanol-water (70 + 30, v/v) mobile phase at a flow rate of 1.2 mL/min. The method was validated in terms of specificity, linearity and range, LOQ, accuracy, precision, and robustness. The results obtained were within the acceptable ranges. The nanocapsules, made of poly(epsilon-caprolactone), were prepared by the solvent displacement technique and showed high entrapment efficiency. The entrapment efficiency was 97.6 and 98.2% for the nifedipine-loaded polymeric nanocapsules prepared from polyvinyl alcohol (PVA) and Pluronic F68 (PF68), respectively. The particle size and zeta potential of nanocapsules were found to be influenced by the nature of the stabilizer used. The mean diameter and zeta potential for nanocapsules with PVA and PF68 were 290.9 and 179.9 nm, and -17.7 mV and -32.7 mV, respectively. The two formulations prepared showed a drug release of up to 70% over 4 days. This behavior indicates the viability of this drug delivery system for use as a controlled-release system.
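The entrapment-efficiency figure such an assay yields is commonly computed indirectly: quantify the free (unentrapped) drug against the calibration line and subtract from the total. The calibration constants and measured values below are invented for the sketch:

```python
# Assumed calibration line for the sketch: peak area = slope*conc + intercept.
slope, intercept = 52.4, 1.8

def conc_from_area(area):
    """Invert the calibration line to get a concentration in µg/mL."""
    return (area - intercept) / slope

total_drug = 100.0        # µg/mL nominal nifedipine in the suspension (assumed)
free_area = 127.5         # peak area measured in the ultrafiltrate (assumed)
free_drug = conc_from_area(free_area)

# Indirect entrapment efficiency: drug not found free is taken as entrapped.
ee_percent = (total_drug - free_drug) / total_drug * 100
```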

  20. The role of surface charge in the desolvation process of gelatin: implications in nanoparticle synthesis and modulation of drug release

    Directory of Open Access Journals (Sweden)

    Ahsan SM

    2017-01-01

    Saad M Ahsan, Chintalagiri Mohan Rao, Centre for Cellular and Molecular Biology, Council of Scientific and Industrial Research, Hyderabad, Telangana, India. Abstract: The process of moving hydrophobic amino acids into the core of a protein by desolvation is important in protein folding. However, a rapid and forced desolvation can lead to precipitation of proteins. Desolvation of proteins under controlled conditions generates nanoparticles: homogeneous aggregates with a narrow size distribution. The protein nanoparticles, under physiological conditions, undergo surface erosion due to the action of proteases, releasing the entrapped drug/gene. The packing density of protein nanoparticles significantly influences the release kinetics. We have investigated the desolvation process of gelatin, exploring the role of pH and the desolvating agent in nanoparticle synthesis. Our results show that the desolvation process, initiated by the addition of acetone, follows distinct pathways for gelatin incubated at different pH values and results in the generation of nanoparticles with varying matrix densities. The nanoparticles synthesized with varying matrix densities show variations in drug loading and protease-dependent extra- and intracellular drug release. These results will be useful in fine-tuning the synthesis of nanoparticles with desirable drug release profiles. Keywords: protein desolvation, nanoparticle assembly, gelatin nanoparticle synthesis, protease susceptibility, intracellular drug release

  1. Natural and industrial analogues for release of CO2 from storage reservoirs: Identification of features, events, and processes and lessons learned

    Energy Technology Data Exchange (ETDEWEB)

    Lewicki, Jennifer L.; Birkholzer, Jens; Tsang, Chin-Fu

    2006-03-03

    The injection and storage of anthropogenic CO2 in deep geologic formations is a potentially feasible strategy to reduce CO2 emissions and atmospheric concentrations. While the purpose of geologic carbon storage is to trap CO2 underground, CO2 could migrate away from the storage site into the shallow subsurface and atmosphere if permeable pathways such as well bores or faults are present. Large-magnitude releases of CO2 have occurred naturally from geologic reservoirs in numerous volcanic, geothermal, and sedimentary basin settings. Carbon dioxide and natural gas have also been released from geologic CO2 reservoirs and natural gas storage facilities, respectively, due to influences such as well defects and injection/withdrawal processes. These systems serve as natural and industrial analogues for the potential release of CO2 from geologic storage reservoirs and provide important information about the key features, events, and processes (FEPs) that are associated with releases, as well as the health, safety, and environmental consequences of releases and mitigation efforts that can be applied. We describe a range of natural releases of CO2 and industrial releases of CO2 and natural gas in the context of these characteristics. Based on this analysis, several key conclusions can be drawn, and lessons can be learned for geologic carbon storage. First, CO2 can both accumulate beneath, and be released from, primary and secondary reservoirs with capping units located at a wide range of depths. Both primary and secondary reservoir entrapments for CO2 should therefore be well characterized at storage sites. Second, many natural releases of CO2 have been correlated with a specific event that triggered the release, such as magmatic fluid intrusion or seismic activity. The potential for processes that could cause geomechanical damage to sealing cap rocks and trigger the release of CO2 from a storage

  2. Respiratory symptoms and ex vivo cytokine release are associated in workers processing herring

    DEFF Research Database (Denmark)

    Bønløkke, Jakob Hjort; Thomassen, Mads; Viskum, Sven

    2004-01-01

    in the plasma by chemiluminescence ELISA. RESULTS: Among smoking fish-factory workers the forced vital capacity (FVC) was higher (per cent predicted 92.0 vs 85.0; P=0.028) than among municipal workers. Fish rinsing water induced WBA IL-8 release to higher levels than LPS and glucan. Among non-smokers the induced IL-1beta release for rinsing water (P=0.007) and the IL-8 release for skin (P=0.001) and meat (P=0.003) were higher in fish-factory workers than in municipal workers. The IL-1beta release for rinsing water (P=0.028) and skin (P=0.041) was higher among non-smokers than among smokers, and so was the IL-8 release for rinsing water (P=0.008). CONCLUSIONS: Assessing the cytokine release by use of the WBA, we identified substances in the occupational environment with a pro-inflammatory potential comparable to that of LPS. The cytokine release for fish constituents was highest among non-smoking fish...

  3. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  4. Enhancing phosphorus release from waste activated sludge containing ferric or aluminum phosphates by EDTA addition during anaerobic fermentation process.

    Science.gov (United States)

    Zou, Jinte; Zhang, Lili; Wang, Lin; Li, Yongmei

    2017-03-01

    The effect of ethylene diamine tetraacetic acid (EDTA) addition on phosphorus release from biosolids and phosphate precipitates during anaerobic fermentation was investigated, and the impact of EDTA addition on the anaerobic fermentation process itself was assessed. The results indicate that EDTA addition significantly enhanced the release of phosphorus from biosolids, ferric phosphate precipitate and aluminum phosphate precipitate during anaerobic fermentation, which is attributed to the complexation of metal ions and the damage to cell membranes caused by EDTA. With the optimal EDTA addition of 19.5 mM (0.41 g EDTA/g SS), phosphorus release efficiency from biosolids was 82%, much higher than that (40%) without EDTA addition. Meanwhile, with 19.5 mM EDTA addition, almost all the phosphorus in ferric phosphate precipitate was released, while only 57% of the phosphorus in aluminum phosphate precipitate was released. This indicates that phosphorus in ferric phosphate precipitate was much more easily released than that in aluminum phosphate precipitate during anaerobic fermentation of sludge. In addition, proper EDTA addition facilitated the production of soluble total organic carbon and volatile fatty acids, as well as solids reduction during sludge fermentation, although methane production could be inhibited. Therefore, EDTA addition can be used as an alternative method for recovering phosphorus from waste activated sludge containing ferric or aluminum precipitates, as well as for recovery of a soluble carbon source. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Effect of WOW process parameters on morphology and burst release of FITC-dextran loaded PLGA microspheres.

    Science.gov (United States)

    Mao, Shirui; Xu, Jing; Cai, Cuifang; Germershaus, Oliver; Schaper, Andreas; Kissel, Thomas

    2007-04-04

    Using fluorescein isothiocyanate labeled dextran (FITC-dextran 40, FD40) as a hydrophilic model compound, microspheres were prepared by a WOW (water-in-oil-in-water) double emulsion technique. The influence of process parameters on microsphere morphology and burst release of FD40 from PLGA microspheres was studied. The internal morphology of the microspheres was investigated by a stereological method via a cryo-cutting technique and scanning electron microscopy (SEM). Drug distribution in the microspheres was observed with confocal laser scanning microscopy (CLSM). Polymer nature (RG503 and RG503H) had a significant influence on the micro-morphology of the microspheres. An increase in continuous water phase volume (W2) led to increased surface porosity but decreased internal porosity. By increasing the PVA concentration in the continuous phase from 0.1 to 1%, particle size changed marginally but burst release decreased from 12.2 to 5.9%. The internal porosity of the microspheres decreased considerably with increasing polymer concentration. An increase in homogenization speed during the primary emulsion preparation led to decreased internal porosity. Burst release decreased with increasing drug loading but increased with drug molecular weight. Drug distribution in the microspheres depended on the preparation method. The porosity of the microspheres decreased with time in the diffusion stage, but internal morphology had no influence on the release behavior in the bioerosion stage. In summary, surface porosity and internal morphology play a significant role in the release of hydrophilic macromolecules from biodegradable microspheres in the initial release phase, which is characterized by pore diffusion.

  6. Designing Process Improvement of Finished Good On Time Release and Performance Indicator Tool in Milk Industry Using Business Process Reengineering Method

    Science.gov (United States)

    Dachyar, M.; Christy, E.

    2014-04-01

    To maintain its position as a major milk producer, the Indonesian milk industry should pursue business development aimed at increasing the customer service level. One strategy is to achieve on-time release of finished goods distributed to customers and distributors. To reach this condition, the management information system for finished-goods on-time release needs to be improved. The focus of this research is business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve this goal, evaluation, reengineering, and improvement of the ERP system were conducted. To visualize the predicted implementation, a simulation model was built in Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce process lead time and increase the number of quality releases.

  7. Development and evaluation of diltiazem hydrochloride controlled-release pellets by fluid bed coating process

    Directory of Open Access Journals (Sweden)

    Mikkilineni Bhanu Prasad

    2013-01-01

    The aim of the present study was to develop controlled-release pellets of diltiazem HCl with ethyl cellulose and hydroxypropyl methylcellulose phthalate as the release-rate-retarding polymers by a fluid bed coating technique. The prepared pellets were evaluated for drug content and particle size, subjected to Scanning Electron Microscopy (SEM) and Differential Scanning Calorimetry (DSC), and evaluated for in vitro release. Stability studies were carried out on the optimized formulations for a period of 3 months. The drug content was in the range of 97%-101%. The mean particle size of the drug-loaded pellets was in the range 700-785 μm. The drug release rate decreased as the concentration of ethyl cellulose in the pellet formulations increased. Among the prepared formulations, FDL10 and FDL11 showed 80% drug release in 16 h, matching USP dissolution test 6 for diltiazem HCl extended-release capsules. SEM photographs confirmed that the prepared formulations were spherical with a smooth surface. The compatibility between drug and polymers in the drug-loaded pellets was confirmed by DSC studies. Stability studies indicated that the pellets were stable.

  8. Evaluating the Release, Delivery, and Deployment Processes of Eight Large Product Software Vendors applying the Customer Configuration Update Model

    NARCIS (Netherlands)

    Jansen, S.R.L.; Brinkkemper, S.

    2006-01-01

    For software vendors the processes of release, delivery, and deployment to customers are inherently complex. However, software vendors could greatly improve their product quality and quality of service by applying a model that focuses on customer interaction, if such a model were available.

  9. Using the Bongwana natural CO2 release to understand leakage processes and develop monitoring

    Science.gov (United States)

    Jones, David; Johnson, Gareth; Hicks, Nigel; Bond, Clare; Gilfillan, Stuart; Kremer, Yannick; Lister, Bob; Nkwane, Mzikayise; Maupa, Thulani; Munyangane, Portia; Robey, Kate; Saunders, Ian; Shipton, Zoe; Pearce, Jonathan; Haszeldine, Stuart

    2016-04-01

    Natural CO2 leakage along the Bongwana Fault in South Africa is being studied to help understand processes of CO2 leakage and to develop monitoring protocols. The Bongwana Fault crops out over approximately 80 km in KwaZulu-Natal province, South Africa. In outcrop the fault is expressed as a broad fracture corridor in Dwyka Tillite, with fractures oriented approximately N-S. Natural emissions of CO2 occur at various points along the fault, manifested as travertine cones and terraces, bubbling in the rivers, and gas fluxes through soil. Exposed rock outcrop shows evidence of Fe-staining around fractures and is locally extensively kaolinitised. The gas has also been released through a shallow water well, and was exploited commercially in the past. Preliminary studies have been carried out to better document the surface emissions using near-surface gas monitoring, to understand the origin of the gas through major gas composition and stable and noble gas isotopes, and to improve understanding of the structural controls on gas leakage through mapping. In addition, the impact of the leaking CO2 on local water sources (surface and ground) is being investigated, along with the seismic activity of the fault. The investigation will help to build technical capacity in South Africa and to develop monitoring techniques and plans for a future CO2 storage pilot there. Early results suggest that CO2 leakage is confined to a relatively small number of spatially restricted locations along the weakly seismically active fault. Fracture permeability appears to be the main pathway by which the CO2 migrates to the surface. The bulk of the CO2 is of deep origin, with a minor contribution from near-surface biogenic processes, as determined by major gas composition. Water chemistry, including pH, DO and TDS, is notably different between CO2-rich and CO2-poor sites. Soil gas content and flux effectively delineate the fault trace at active leakage sites. The fault thus provides an effective testing ground for monitoring techniques.

  10. On the improvement of IT process maturity: assessment, recommendation and validation

    Directory of Open Access Journals (Sweden)

    Dirgahayu Teduh

    2018-01-01

    The use of information technology (IT) in enterprises must be governed and managed appropriately using IT processes. The notion of IT process maturity is useful for measuring the actual performance and defining the desired performance of IT processes. Improvements are necessary when there are gaps between actual and desired performance. Most of the literature focuses on IT process maturity assessment and does not address how to improve IT process maturity. This paper proposes an approach to improving enterprise IT process maturity for COBIT processes. The approach consists of three activities: IT process maturity assessment, recommendation, and validation. Assessment establishes the maturity of the process's control objectives. From the assessment results, recommendation identifies control objectives that must be improved and then suggests improvement actions; the prescriptive nature of the control objectives facilitates suggesting those actions. Recommendations for management are defined by abstracting similar actions. Validation checks whether the recommendations match the enterprise's needs and capability. It includes a validation scale in which the enterprise's capability is categorized as (i) not capable, (ii) capable with great efforts, or (iii) fully capable. The paper illustrates the approach with a case study.
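As an illustrative sketch only (none of these figures or names come from the paper), the assessment, recommendation, and validation flow described above can be expressed over hypothetical COBIT maturity values, with the three-point capability scale used as the validation filter:

```python
# hypothetical assessed vs. desired maturity per COBIT process (illustrative)
actual  = {"DS5": 2, "PO9": 3, "AI6": 1}
desired = {"DS5": 4, "PO9": 3, "AI6": 3}

# recommendation step: a process needs improvement where desired > actual
gaps = {p: desired[p] - actual[p] for p in actual if desired[p] > actual[p]}

# validation step: the paper's three-point capability scale
CAPABILITY = {"not capable": 0, "capable with great efforts": 1, "fully capable": 2}

def is_executable(capability):
    """Keep a recommendation only if the enterprise is at least
    'capable with great efforts' of carrying it out."""
    return CAPABILITY[capability] > 0

accepted = {p: g for p, g in gaps.items() if is_executable("capable with great efforts")}
```

The dictionary comprehension keeps PO9 out of the gap list (its actual maturity already meets the target), and the validation filter would drop any recommendation the enterprise is not capable of executing.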

  11. A Self-Peeling Vat for Improved Release Capabilities During DLP Materials Processing

    DEFF Research Database (Denmark)

    Pedersen, David Bue; Zhang, Yang; Nielsen, Jakob Skov

    2016-01-01

    The vat design allows for an eased release of the manufactured part by means of a flexible membrane system. A membrane of fluorinated ethylene polymer will, through elastic deformation, automatically peel off the part as it is lifted during layer changes. Peeling has been qualified using a truncated inverted cone as test geometry. As the cross-sectional diameter of the cone increases throughout the build job, the geometry will release from the glass-based build platform at the point where the peeling force exceeds the adhesion force between platform and part. At the failure point, the lateral surface area of the top and bottom of the truncated cone is used as a measure of the vat's release capability. This has been tested at increasing manufacturing rates. The new self-peeling vat outperformed industrial state-of-the-art vats by 814%.

  12. Selective Activation of Cholinergic Interneurons Enhances Accumbal Phasic Dopamine Release: Setting the Tone for Reward Processing

    Directory of Open Access Journals (Sweden)

    Roger Cachope

    2012-07-01

    Dopamine plays a critical role in motor control, addiction, and reward-seeking behaviors, and its release dynamics have traditionally been linked to changes in midbrain dopamine neuron activity. Here, we report that selective endogenous cholinergic activation achieved via in vitro optogenetic stimulation of nucleus accumbens, a terminal field of dopaminergic neurons, elicits real-time dopamine release. This mechanism occurs via direct actions on dopamine terminals, does not require changes in neuron firing within the midbrain, and is dependent on glutamatergic receptor activity. More importantly, we demonstrate that in vivo selective activation of cholinergic interneurons is sufficient to elicit dopamine release in the nucleus accumbens. Therefore, the control of accumbal extracellular dopamine levels by endogenous cholinergic activity results from a complex convergence of neurotransmitter/neuromodulator systems that may ultimately synergize to drive motivated behavior.

  13. On the release of cppxfel for processing X-ray free-electron laser images.

    Science.gov (United States)

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian

    2016-06-01

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++ that showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to also be useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  14. Release modes and processes relevant to source-term calculations at Yucca Mountain

    International Nuclear Information System (INIS)

    Apted, M.J.

    1994-01-01

    The feasibility of permanent disposal of radioactive high-level waste (HLW) in repositories located in deep geologic formations is being studied world-wide. The most credible release pathway is interaction between groundwater and nuclear waste forms, followed by migration of radionuclide-bearing groundwater to the accessible environment. Under hydrologically unsaturated conditions, vapor transport of volatile radionuclides is also possible. The near-field encompasses the waste packages composed of engineered barriers (e.g. man-made materials such as vitrified waste forms and corrosion-resistant containers), while the far-field includes the natural barriers (e.g. host rock, hydrologic setting). Taken together, these two subsystems define a series of multiple, redundant barriers that act to assure the safe isolation of nuclear waste. In the U.S., the Department of Energy (DOE) is investigating the feasibility of safe, long-term disposal of high-level nuclear waste at the Yucca Mountain site in Nevada. The proposed repository horizon is located in non-welded tuffs within the unsaturated zone (i.e. above the water table) at Yucca Mountain. The purpose of this paper is to describe the source-term models for radionuclide release from waste packages at the Yucca Mountain site. The first section describes the conceptual release modes that are relevant for this site and waste package design, based on a consideration of the performance of currently proposed engineered barriers under expected and unexpected conditions. No attempt is made to assess the reasonableness or probability of occurrence of any specific release mode. The following section reviews the waste-form characteristics that are required to model and constrain the release of radionuclides from the waste package. The next section presents mathematical models for the conceptual release modes, selected from those that have been implemented into a probabilistic total system assessment code.

  15. Process for the transport of heat energy released by a nuclear reactor

    International Nuclear Information System (INIS)

    Nuernberg, H.W.; Wolff, G.

    1978-01-01

    The heat produced in a nuclear reactor is converted into latent chemical binding energy by decomposing ethane or propane into ethylene or propylene and hydrogen. After transport, the heat can be released again below 400 °C by recombination. (TK)
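The chemical heat-pipe scheme summarized above rests on a reversible reaction pair; written out for the ethane case (standard dehydrogenation chemistry, the propane/propylene pair being analogous):

```latex
\underbrace{\mathrm{C_2H_6} \;\longrightarrow\; \mathrm{C_2H_4} + \mathrm{H_2}}_{\text{endothermic: absorbs reactor heat}}
\qquad
\underbrace{\mathrm{C_2H_4} + \mathrm{H_2} \;\longrightarrow\; \mathrm{C_2H_6}}_{\text{exothermic: releases heat below } 400\,^{\circ}\mathrm{C}}
```

The forward (dehydrogenation) step stores the reactor heat as chemical binding energy in the transported gas mixture; the reverse (hydrogenation) step recovers it at the point of use.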

  16. Fibrinopeptides A and B release in the process of surface fibrin formation

    Czech Academy of Sciences Publication Activity Database

    Riedel, T.; Suttnar, J.; Brynda, Eduard; Houska, Milan; Medved, L.; Dyr, J. E.

    2011-01-01

    Vol. 117, No. 5 (2011), p. 1700-1706 ISSN 0006-4971 R&D Projects: GA AV ČR KAN200670701 Institutional research plan: CEZ:AV0Z40500505 Keywords: fibrinopeptide release * adsorbed fibrinogen * thrombin Subject RIV: CD - Macromolecular Chemistry Impact factor: 9.898, year: 2011

  17. Preliminary Process Theory does not validate the Comparison Question Test: A comment on Palmatier and Rovner

    NARCIS (Netherlands)

    Ben-Shakar, G.; Gamer, M.; Iacono, W.; Meijer, E.; Verschuere, B.

    2015-01-01

    Palmatier and Rovner (2015) attempt to establish the construct validity of the Comparison Question Test (CQT) by citing extensive research ranging from modern neuroscience to memory and psychophysiology. In this comment we argue that merely citing studies on the preliminary process theory (PPT) of

  18. Validation of a functional model for integration of safety into process system design

    DEFF Research Database (Denmark)

    Wu, J.; Lind, M.; Zhang, X.

    2015-01-01

    with the process system functionalities as required for the intended safety applications. To provide the scientific rigor and facilitate the acceptance of qualitative modelling, this contribution focuses on developing a scientifically based validation method for functional models. The Multilevel Flow Modeling (MFM...

  19. 76 FR 4360 - Guidance for Industry on Process Validation: General Principles and Practices; Availability

    Science.gov (United States)

    2011-01-25

    ... and Development (HFM-40), Center for Biologics Evaluation and Research (CBER), Food and Drug...] Guidance for Industry on Process Validation: General Principles and Practices; Availability AGENCY: Food... of Drug Information, Center for Drug Evaluation and Research, Food and Drug Administration, 10903 New...

  20. Identification, detection, and validation of vibrating structures: a signal processing approach

    International Nuclear Information System (INIS)

    Candy, J.V.; Lager, D.L.

    1979-01-01

    This report discusses the application of modern signal processing techniques to characterize parameters governing the vibrational response of a structure. Simulated response data are used to explore the feasibility of applying these techniques to various structural problems. On-line estimators/identifiers are used to estimate structural parameters, validate designed structures, and, in combination with a detector, detect structural failure.

  1. Examining the Validity of Self-Reports on Scales Measuring Students' Strategic Processing

    Science.gov (United States)

    Samuelstuen, Marit S.; Braten, Ivar

    2007-01-01

    Background: Self-report inventories trying to measure strategic processing at a global level have been much used in both basic and applied research. However, the validity of global strategy scores is open to question because such inventories assess strategy perceptions outside the context of specific task performance. Aims: The primary aim was to…

  2. Construct Validity in TOEFL iBT Speaking Tasks: Insights from Natural Language Processing

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.; McNamara, Danielle S.

    2016-01-01

    This study explores the construct validity of speaking tasks included in the TOEFL iBT (e.g., integrated and independent speaking tasks). Specifically, advanced natural language processing (NLP) tools, MANOVA difference statistics, and discriminant function analyses (DFA) are used to assess the degree to which and in what ways responses to these…

  3. Rates and mechanisms of radioactive release and retention inside a waste disposal canister - in Can Processes

    Energy Technology Data Exchange (ETDEWEB)

    Oversby, V.M. (ed.) [and others]

    2003-10-01

    the system that will not be present under long term disposal conditions. A simulation of long-term conditions can be done using uranium dioxide that contains a short-lived isotope of uranium, but this will not include the effects of fission product and higher actinide elements on the behaviour of the spent fuel. We designed a project that had as its objective to improve the scientific understanding of the processes that control release of radioactive species from spent fuel inside a disposal canister and the chemical changes in those species that might limit release of radioactivity from the canister. If the mechanisms that control dissolution of the fuel matrix, including self-irradiation effects, can be clarified, a more realistic assessment of the long-term behaviour of spent fuel under disposal conditions can be made. By removing uncertainties concerning waste form performance, a better assessment of the individual and collective role of the engineered barriers can be made. To achieve the overall objective of the project, the following scientific and technical objectives were set. 1. Measure the actual rate of matrix dissolution of uranium dioxide under oxidising and reducing conditions. 2. Measure the effect of alpha radiolysis on the dissolution rate of uranium dioxide under oxidising and reducing conditions. 3. Measure the dissolution rate of the matrix material of spent fuel and thereby determine the additional effects of beta and gamma radiation on uranium dioxide dissolution rate under oxidising and reducing conditions. 4. Measure the ability of actively corroding iron to reduce oxidised U(VI) to U(IV) when U is present as the complex ion uranyl carbonate. 5. Measure the rate of reduction of Np(V) species in the presence of actively corroding iron. 6. 
Calculate the expected equilibrium and steady state concentrations of U under the conditions of the experiments used for meeting objectives 1 through 3 and compare the calculated results with those measured in

  4. NHPoisson: An R Package for Fitting and Validating Nonhomogeneous Poisson Processes

    Directory of Open Access Journals (Sweden)

    Ana C. Cebrián

    2015-03-01

    NHPoisson is an R package for the modeling of nonhomogeneous Poisson processes in one dimension. It includes functions for data preparation, maximum likelihood estimation, covariate selection and inference based on asymptotic distributions and simulation methods. It also provides specific methods for the estimation of Poisson processes resulting from a peak-over-threshold approach. In addition, the package supports a wide range of model validation tools and functions for generating nonhomogeneous Poisson process trajectories. This paper is a description of the package and aims to help those interested in modeling data using nonhomogeneous Poisson processes.
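NHPoisson itself is an R package; purely to illustrate the kind of process it models, a nonhomogeneous Poisson process with a given intensity function can be simulated by Lewis-Shedler thinning. The sketch below (in Python, with an invented periodic intensity) is not taken from the package:

```python
import math
import random

def simulate_nhpp(rate, t_max, rate_max, rng):
    """Simulate a nonhomogeneous Poisson process on (0, t_max] by thinning:
    draw candidate events from a homogeneous process with intensity rate_max,
    and keep each candidate at time t with probability rate(t) / rate_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)   # next homogeneous candidate
        if t > t_max:
            return events
        if rng.random() < rate(t) / rate_max:
            events.append(t)

rng = random.Random(42)
# hypothetical periodic intensity, bounded above by rate_max = 3.5
rate = lambda t: 2.0 + 1.5 * math.sin(2.0 * math.pi * t / 10.0)
events = simulate_nhpp(rate, t_max=100.0, rate_max=3.5, rng=rng)
```

The expected event count is the integral of the intensity over the window (about 200 here), which is a quick sanity check on any such simulator.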

  5. On-line validation of linear process models using generalized likelihood ratios

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1981-12-01

    A real-time method for testing the validity of linear models of nonlinear processes is described and evaluated. Using generalized likelihood ratios, the model dynamics are continually monitored to see if the process has moved far enough away from the nominal linear model operating point to justify generation of a new linear model. The method is demonstrated using a seventh-order model of a natural circulation steam generator
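The abstract's method is demonstrated on a steam generator model; as a minimal sketch of the generalized-likelihood-ratio idea (mean-shift alternative with known noise level; all numbers below are illustrative, not from the report), one can monitor model residuals and take the maximum log-likelihood ratio over candidate shift-onset times:

```python
def glr_mean_shift(residuals, sigma):
    """GLR statistic for a sustained shift in the mean of model residuals
    with known noise std sigma.  For a shift starting at index k, the
    log-likelihood ratio against 'no shift' is
        S_k = (sum of residuals from k)^2 / (2 * sigma^2 * (n - k)),
    and the GLR statistic is the maximum of S_k over all k."""
    best = 0.0
    for k in range(len(residuals)):
        tail = residuals[k:]
        s = sum(tail)
        best = max(best, s * s / (2.0 * sigma * sigma * len(tail)))
    return best

# residuals while the linear model is valid, then after the process drifts
good = [0.1, -0.2, 0.05, -0.1, 0.15]
drift = good + [1.0, 1.2, 0.9, 1.1]
assert glr_mean_shift(good, sigma=0.2) < glr_mean_shift(drift, sigma=0.2)
```

Crossing a threshold on this statistic is what would "justify generation of a new linear model" in the scheme described above; the threshold itself is a tuning choice trading false alarms against detection delay.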

  6. Sterilization of health care products - Ethylene oxide - Part 1: Requirements for development, validation and routine control of a sterilization process for medical devices

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 11135 describes requirements that, if met, will provide an ethylene oxide sterilization process, intended to sterilize medical devices, which has appropriate microbicidal activity. Furthermore, compliance with the requirements ensures that this activity is both reliable and reproducible, so that it can be predicted, with reasonable confidence, that there is a low probability of a viable microorganism being present on product after sterilization. Specification of this probability is a matter for regulatory authorities and may vary from country to country. The document covers scope, normative references, terms and definitions, quality management systems, sterilizing agent characterization, process and equipment characterization, product definition, process definition, validation, routine monitoring and control, product release from sterilization and maintaining process effectiveness, followed by Annex A (Determination of lethal rate of the sterilization process - biological indicator/bioburden approach), Annex B (Conservative determination of lethal rate of the sterilization process - overkill approach), Annex C (General guidance) and a bibliography.
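The overkill approach referenced in Annex B is conventionally sized for a 12-spore-log reduction; the underlying log-linear D-value arithmetic can be sketched as follows (the D-value below is a hypothetical example, not a figure from the standard):

```python
def surviving_population(n0, exposure_min, d_value_min):
    """Log-linear microbial inactivation: each D-value of exposure reduces
    the population tenfold, so N(t) = N0 * 10^(-t / D)."""
    return n0 * 10.0 ** (-exposure_min / d_value_min)

d = 2.5            # assumed D-value in minutes (hypothetical)
exposure = 12 * d  # overkill sizing: 12 log reductions in total

# starting from a 10^6 biological indicator, 12 log reductions leave a
# survivor probability (sterility assurance level) of 10^-6
sal = surviving_population(1e6, exposure, d)
```

This is why the probability statement in the abstract is expressed as a prediction with "reasonable confidence" rather than a certainty: the sterility assurance level is a calculated survivor probability, not a measured count.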

  7. Recalculation of an artificially released avalanche with SAMOS and validation with measurements from a pulsed Doppler radar

    Directory of Open Access Journals (Sweden)

    R. Sailer

    2002-01-01

    A joint experiment was carried out on 10 February 1999 by the Swiss Federal Institute for Snow and Avalanche Research (SFISAR) and the Austrian Institute for Avalanche and Torrent Research (AIATR) of the Federal Office and Research Centre for Forests (BFW) to measure forces and velocities at the full-scale experimental site CRÊTA BESSE in VALLÉE DE LA SIONNE, Canton du Valais, Switzerland. A large avalanche was released artificially, which permitted extensive investigations (dynamic measurements, improvement of measurement systems, simulation model verification, design of protective measures, etc.). The results of the velocity measurements from the dual-frequency pulsed Doppler avalanche radar of the AIATR and the recalculation with the numerical simulation model SAMOS are explained in this paper.

  8. Validation of the TTM processes of change measure for physical activity in an adult French sample.

    Science.gov (United States)

    Bernard, Paquito; Romain, Ahmed-Jérôme; Trouillet, Raphael; Gernigon, Christophe; Nigg, Claudio; Ninot, Gregory

    2014-04-01

    Processes of change (POC) are constructs from the transtheoretical model that propose to examine how people engage in a behavior. However, there is no consensus about a leading model explaining POC, and there is no validated French POC scale for physical activity. This study aimed to compare the existing models in order to validate a French POC scale. Three studies, including 748 subjects, were carried out to translate the items and evaluate their clarity (study 1, n = 77), to assess factorial validity (n = 200) and invariance/equivalence (study 2, n = 471), and to analyze concurrent validity by stage × process analyses (study 3, n = 671). Two models displayed adequate fit to the data; however, based on the Akaike information criterion, the fully correlated five-factor model appeared the most appropriate for measuring POC in physical activity. Invariance/equivalence was also confirmed across genders and student status. Four of the five factors discriminated pre-action from post-action stages. These data support the validation of the POC questionnaire in physical activity in a French sample. More research is needed to explore the longitudinal properties of this scale.
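The model-selection step above relies on the Akaike information criterion; the selection logic itself is small enough to write out (the parameter counts and log-likelihoods below are invented for illustration and are not the study's fitted values):

```python
def aic(k_params, log_likelihood):
    """Akaike information criterion: AIC = 2k - 2 ln L (lower is better).
    It rewards fit but penalizes each extra free parameter."""
    return 2.0 * k_params - 2.0 * log_likelihood

# hypothetical fits of two competing factor structures for the POC scale
models = {
    "fully correlated five-factor": aic(k_params=55, log_likelihood=-1200.0),
    "higher-order two-factor": aic(k_params=40, log_likelihood=-1230.0),
}
best = min(models, key=models.get)
```

With these illustrative numbers the five-factor model wins despite its extra parameters, because its likelihood advantage outweighs the 2-per-parameter penalty, which mirrors the kind of trade-off the AIC formalizes.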

  9. Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Greenwald, Martin

    2017-06-02

    The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA, USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with a view to extrapolating needs to next-step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Meeting topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems & equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing & pattern recognition; experimental design and synthetic diagnostics; and data management.

  10. Natural hazards that may trigger a radiological release from a plutonium processing facility

    Energy Technology Data Exchange (ETDEWEB)

    Selvidge, J. E.

    1977-04-28

    Calculations show the probability of a tornado striking a plutonium area at Rocky Flats is 2.2 x 10^-4 per year. The source term (expected value of plutonium release) should such an event occur is calculated at 3.3 x 10^-7 grams. The source term for high-velocity, downslope winds is higher: 2.2 x 10^-3 grams. The probability of a meteorite that weighs one or more pounds (453 grams) striking a plutonium area is estimated at 8.88 x 10^-7 per year. Because of this small probability and the remote chance that a plutonium release would occur even if a meteorite hit occurred, the hazard from meteorite impact is considered negligible. Conservative assumptions result in all calculated frequencies being almost certainly too high. Empirical observations have indicated lower frequencies than those calculated.
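Combining the two tornado figures quoted above, the annual strike probability and the conditional source term, gives the unconditional expected annual release, a small calculation worth making explicit:

```python
# figures quoted in the abstract above
p_tornado_per_year = 2.2e-4          # probability of a tornado strike per year
release_given_strike_g = 3.3e-7      # expected plutonium release per strike, grams

# unconditional expectation: P(strike) x E[release | strike]
expected_annual_release_g = p_tornado_per_year * release_given_strike_g
```

The product is about 7.3 x 10^-11 grams per year, which makes concrete why the report treats these hazards as small even before the stated conservatism of the assumptions is taken into account.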

  11. Natural hazards that may trigger a radiological release from a plutonium processing facility

    International Nuclear Information System (INIS)

    Selvidge, J.E.

    1977-01-01

    Calculations show the probability of a tornado striking a plutonium area at Rocky Flats is 2.2 x 10^-4 per year. The source term (expected value of plutonium release) should such an event occur is calculated at 3.3 x 10^-7 grams. The source term for high-velocity, downslope winds is higher: 2.2 x 10^-3 grams. The probability of a meteorite that weighs one or more pounds (453 grams) striking a plutonium area is estimated at 8.88 x 10^-7 per year. Because of this small probability and the remote chance that a plutonium release would occur even if a meteorite hit occurred, the hazard from meteorite impact is considered negligible. Conservative assumptions result in all calculated frequencies being almost certainly too high. Empirical observations have indicated lower frequencies than those calculated.

  12. Medial release and lateral imbrication for intractable anterior knee pain: diagnostic process, technique, and results

    Directory of Open Access Journals (Sweden)

    Meldrum AR

    2015-01-01

    Alexander R Meldrum,1 Jeremy R Reed,2 Megan D Dash3 1Department of Surgery, Section of Orthopedic Surgery, University of Calgary, Calgary, AB, Canada; 2Department of Surgery, University of Saskatchewan College of Medicine, Regina, SK, Canada; 3Department of Family Medicine, College of Medicine, University of Saskatchewan, Regina, SK, Canada Purpose: To present two cases of intractable patellofemoral pain syndrome treated with a novel procedure: arthroscopic medial release and lateral imbrication of the patellar retinaculum. Patients and methods: This case series presents the treatment of three knees in two patients (one bilateral) in whom an all-inside arthroscopic medial release and lateral imbrication of the patellar retinaculum was performed. Subjective measurement of pain was the primary outcome measure, and subjective patellofemoral instability was the secondary outcome measure. Results: Subjectively, the two patients had full resolution of their pain, without any patellofemoral instability. Conclusion: Medial release and lateral imbrication of the patellar retinaculum is a new surgical procedure that has been used in the treatment of intractable patellofemoral pain syndrome. This is the first report of its kind in the literature. While the outcome measures were less than ideal, the patients had positive outcomes, both functionally and in terms of pain. Keywords: anterior knee pain syndrome, chondromalacia patellae, runner's knee, patellar chondropathy, patellofemoral dysfunction, patellofemoral tracking disorder

  13. Validation of designing tools as part of nuclear pump development process

    International Nuclear Information System (INIS)

    Klemm, T.; Sehr, F.; Spenner, P.; Fritz, J.

    2010-01-01

    Nuclear pumps are characterized by high safety standards, operational reliability and long life cycles. In the design process it is common to use a down-scaled model pump to qualify operating data and simulate exceptional operating conditions. Where the pump design is modified compared to existing reactor coolant pumps, a model pump is required to develop the methods and tools used to design the full-scale pump. In the presented case the model has a geometric scale of 1:2 relative to the full-scale pump. The experimental data from the model pump are the basis for validating the methods and tools applied in designing the full-scale pump. In this paper the selection of qualified tools and the validation process are demonstrated exemplarily on a cooling circuit, the aim being to predict the resulting flow rate. Tools are chosen for different components depending on their benefit-to-effort ratio. For elementary flow phenomena, such as fluid flow in straight pipes or gaps, analytic or empirical laws can be used; for more complex flow situations, numerical methods are utilized. The main focus is on the validation of the applied numerical flow simulation. In this case it is not sufficient to compare integral data only; the local flow structure of the numerical flow simulation must also be validated to avoid systematic errors in CFD model generation. Because of the complex design, internal flow measurements are not possible, so simple comparisons with similar flow test cases are used instead. The results of this study show that the flow simulation data closely match the measured integral pump and test case data. With this validation it is now possible to qualify CFD simulations as a design tool for the full-scale pump in a similar cooling circuit. (authors)

  14. PEANO, a toolbox for real-time process signal validation and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals, such as the readings of process monitoring sensors, computes their expected values, and raises an alert if the real values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: Artificial Neural Networks and Fuzzy Logic models are combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a 'don't know' classification that results in fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability are increased compared with a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)
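    The possibilistic 'don't know' classification can be illustrated with a minimal sketch; the cluster centres, typicality function, and threshold below are assumptions for illustration, not the PEANO implementation:

```python
# Sketch of a possibilistic classifier: each operating point gets a typicality
# per cluster; if all typicalities are low the answer is None ("don't know"),
# instead of forcing an assignment as a probabilistic classifier would.
def typicality(point, center, eta=50.0):
    """Possibilistic membership in (0, 1]; decays with squared distance."""
    d2 = sum((p - c) ** 2 for p, c in zip(point, center))
    return 1.0 / (1.0 + d2 / eta)

def classify(point, centers, threshold=0.5):
    """Best-matching operating region, or None for unforeseen conditions."""
    scores = {name: typicality(point, c) for name, c in centers.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Hypothetical operating-map clusters in (power %, flow %) coordinates:
centers = {"full_power": (100.0, 100.0), "low_power": (30.0, 35.0)}
print(classify((98.0, 101.0), centers))   # near a known region -> "full_power"
print(classify((60.0, 10.0), centers))    # outlier -> None ("don't know")
```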

  15. PEANO, a toolbox for real-time process signal validation and estimation

    International Nuclear Information System (INIS)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals, such as the readings of process monitoring sensors, computes their expected values, and raises an alert if the real values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: Artificial Neural Networks and Fuzzy Logic models are combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a 'don't know' classification that results in fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability are increased compared with a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)

  16. Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes

    Science.gov (United States)

    Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.

    2018-04-01

    To provide a better understanding of pultrusion processes with or without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time integration scheme and the nodal control volumes method has been developed. In the present study its experimental validation is carried out with the developed cure sensors, which measure the electrical resistivity and temperature on the profile surface. Through this verification process the set of initial data used for a simulation of the pultrusion of a rod profile has been successfully corrected and finally defined.
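    A generic cure-kinetics step of the kind such thermo-chemical simulations integrate can be sketched as follows; the nth-order Arrhenius model and all constants are illustrative assumptions, not the authors' algorithm:

```python
import math

# Generic cure-kinetics sketch (assumed nth-order Arrhenius model): the degree
# of cure alpha evolves as d(alpha)/dt = A * exp(-E/(R*T)) * (1 - alpha)**n.
A, E, R, n = 1.0e5, 60_000.0, 8.314, 1.5   # 1/s, J/mol, J/(mol K), order

def cure_step(alpha, T, dt):
    """One explicit Euler step of the cure ODE at temperature T (kelvin)."""
    rate = A * math.exp(-E / (R * T)) * (1.0 - alpha) ** n
    return min(1.0, alpha + rate * dt)       # clamp to full cure

alpha, T = 0.0, 430.0            # isothermal hold near a typical die temperature
for _ in range(600):             # 600 s with dt = 1 s
    alpha = cure_step(alpha, T, 1.0)
print(round(alpha, 3))           # degree of cure reached during the hold
```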

  17. Guidelines for the Development, Validation and Routine Control of Industrial Radiation Processes

    DEFF Research Database (Denmark)

    Safrany, A.; Miller, Arne; Kovacs, A.

    Radiation processing has become a well-accepted technology on the global market, with uses ranging from the sterilization of medical devices to polymer cross-linking and curing to the irradiation of selected food items. Besides these well-established uses, new radiation technology applications...... are emerging for environmental remediation and the synthesis of advanced materials and products. Quality assurance is vital for the success of these technologies and requires the development of standardized procedures as well as the harmonization of process validation and process control. It is recognized...

  18. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. Because the variability of the sampling method and the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.

  19. Review of processes for the release of DOE real and non-real property for reuse and recycle

    International Nuclear Information System (INIS)

    Ranek, N.L.; Kamboj, S.; Hensley, J.; Chen, S.Y.; Blunt, D.

    1997-11-01

    This report summarizes the underlying historical and regulatory framework supporting the concept of authorizing release for restricted or unrestricted reuse or recycle of real and non-real U.S. Department of Energy (DOE) properties containing residual radioactive material. Basic radiation protection principles as recommended by the International Commission on Radiological Protection are reviewed, and international initiatives to investigate radiological clearance criteria are reported. Applicable requirements of the U.S. Nuclear Regulatory Commission, the Environmental Protection Agency, DOE, and the State of Washington are discussed. Several processes that have been developed for establishing cleanup and release criteria for real and non-real DOE property containing residual radioactive material are presented. Examples of DOE real property for which radiological cleanup criteria were established to support unrestricted release are provided. Properties discussed include Formerly Utilized Sites Remedial Action Project sites, Uranium Mill Tailings Remedial Action Project sites, the Shippingport decommissioning project, the south-middle and south-east vaults in the 317 area at Argonne National Laboratory, the Heavy Water Components Test Reactor at DOE's Savannah River Site, the Experimental Boiling Water Reactor at Argonne National Laboratory, and the Weldon Spring site. Some examples of non-real property for which DOE sites have established criteria to support unrestricted release are also furnished. 10 figs., 4 tabs

  20. Validation Process for LEWICE by Use of a Navier-Stokes Solver

    Science.gov (United States)

    Wright, William B.; Porter, Christopher E.

    2017-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. A quantitative comparison of the results against a database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will extend the comparison of ice shapes between LEWICE 3.5 and experimental data from a previous paper. Comparisons of lift and drag are made between experimentally collected data from experimentally obtained ice shapes and simulated (CFD) data on simulated (LEWICE) ice shapes. Comparisons are also made between experimentally collected and simulated performance data on select experimental ice shapes to ensure the CFD solver, FUN3D, is valid within the flight regime. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  1. Technical Report Series on Global Modeling and Data Assimilation. Volume 42; Soil Moisture Active Passive (SMAP) Project Calibration and Validation for the L4_C Beta-Release Data Product

    Science.gov (United States)

    Koster, Randal D. (Editor); Kimball, John S.; Jones, Lucas A.; Glassy, Joseph; Stavros, E. Natasha; Madani, Nima (Editor); Reichle, Rolf H.; Jackson, Thomas; Colliander, Andreas

    2015-01-01

    During the post-launch Cal/Val Phase of SMAP there are two objectives for each science product team: 1) calibrate, verify, and improve the performance of the science algorithms, and 2) validate accuracies of the science data products as specified in the L1 science requirements according to the Cal/Val timeline. This report provides analysis and assessment of the SMAP Level 4 Carbon (L4_C) product specifically for the beta release. The beta-release version of the SMAP L4_C algorithms utilizes a terrestrial carbon flux model informed by SMAP soil moisture inputs along with optical remote sensing (e.g. MODIS) vegetation indices and other ancillary biophysical data to estimate global daily net ecosystem exchange (NEE) and component carbon fluxes, particularly vegetation gross primary production (GPP) and ecosystem respiration (Reco). Other L4_C product elements include surface (<10 cm depth) soil organic carbon (SOC) stocks and associated environmental constraints to these processes, including soil moisture and landscape freeze/thaw (FT) controls on GPP and Reco (Kimball et al. 2012). The L4_C product encapsulates SMAP carbon cycle science objectives by: 1) providing a direct link between terrestrial carbon fluxes and underlying freeze/thaw and soil moisture constraints to these processes, 2) documenting primary connections between terrestrial water, energy and carbon cycles, and 3) improving understanding of terrestrial carbon sink activity in northern ecosystems.

  2. Validity of Scientific Based Chemistry Android Module to Empower Science Process Skills (SPS) in Solubility Equilibrium

    Science.gov (United States)

    Antrakusuma, B.; Masykuri, M.; Ulfa, M.

    2018-04-01

    Advances in Android technology can be applied to chemistry learning. One complex chemistry concept is solubility equilibrium, which requires science process skills (SPS). This study aims to: 1) characterize a scientific-based chemistry Android module for empowering SPS, and 2) assess the validity of the module based on content validity and a feasibility test. This research uses a Research and Development (R&D) approach. The research subjects were 135 students and three teachers at three high schools in Boyolali, Central Java. The content validity of the module was tested by seven experts using Aiken's V technique, and the module feasibility was tested with the students and teachers in each school. The chemistry module can be accessed using an Android device. Validation of the module contents yielded V = 0.89 (valid), and the feasibility test obtained scores of 81.63% (from students) and 73.98% (from teachers), indicating that the module meets the criteria for a good rating.
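    The content-validity index used here is commonly computed as Aiken's V = S / (n(c-1)), where S sums each rater's score minus the lowest category; a small sketch with hypothetical expert ratings, not the study's raw data:

```python
# Aiken's V content-validity index: V = S / (n * (c - 1)), where S sums each
# rater's score minus the lowest category, n is the number of raters, and
# c is the number of rating categories. Ratings below are hypothetical.
def aikens_v(ratings, lo, hi):
    """ratings: one score per expert on a lo..hi scale."""
    s = sum(r - lo for r in ratings)
    return s / (len(ratings) * (hi - lo))

# Seven experts rating one item on a 1-5 scale:
print(round(aikens_v([5, 5, 4, 5, 4, 5, 4], lo=1, hi=5), 2))  # -> 0.89
```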

  3. Donabedian's structure-process-outcome quality of care model: Validation in an integrated trauma system.

    Science.gov (United States)

    Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean

    2015-06-01

    According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and to evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average sum of proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and between process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; r = 0.63 for complications-LOS). The significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes.
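    The domain-correlation analysis can be illustrated with a brief sketch; the per-centre quality-indicator scores below are simulated assumptions, not the study's data:

```python
import numpy as np

# Sketch of the correlation analysis across quality domains: Pearson's r
# between structure, process, and outcome scores over 57 hypothetical centres.
rng = np.random.default_rng(1)
structure = rng.uniform(50, 100, size=57)            # accreditation-grid score
process = 0.4 * structure + rng.normal(0, 8, 57)     # conformity to process QIs
readmission = -0.3 * process + rng.normal(0, 6, 57)  # risk-adjusted rate

r_sp = np.corrcoef(structure, process)[0, 1]         # structure-process
r_pr = np.corrcoef(process, readmission)[0, 1]       # process-outcome
print(round(r_sp, 2), round(r_pr, 2))
```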

  4. Validating the extract, transform, load process used to populate a large clinical research database.

    Science.gov (United States)

    Denney, Michael J; Long, Dustin M; Armistead, Matthew G; Anderson, Jamie L; Conway, Baqiyyah N

    2016-10-01

    Informaticians at any institution developing clinical research support infrastructure are tasked with populating research databases with data extracted and transformed from their institution's operational databases, such as electronic health records (EHRs). These data must be properly extracted from the source systems, transformed into a standard data structure, and then loaded into the data warehouse while maintaining their integrity. We validated the correctness of the extract, transform, load (ETL) process used to populate West Virginia Clinical and Translational Science Institute's Integrated Data Repository, a clinical data warehouse that includes data extracted from two EHR systems. Four hundred ninety-eight observations were randomly selected from the integrated data repository and compared with the two source EHR systems. Of the 498 observations, 479 were concordant and 19 were discordant. The discordant observations fell into three general categories: a) design decision differences between the IDR and the source EHRs, b) timing differences, and c) user interface settings. After resolving the apparent discordances, our integrated data repository was found to be 100% accurate relative to its source EHR systems. Any institution that uses a clinical data warehouse populated by extraction processes from operational databases, such as EHRs, employs some form of ETL process. As secondary use of EHR data begins to transform the research landscape, the importance of basic validation of the extracted EHR data cannot be overstated and should start with the validation of the extraction process itself. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
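    The audit described above, which found 479 concordant and 19 discordant of 498 sampled observations, can be sketched minimally; the record identifiers and field values below are hypothetical:

```python
# Minimal sketch of ETL validation: compare sampled warehouse observations
# against the source EHR and tally concordant vs. discordant values.
warehouse = {1: "120/80", 2: "98.6", 3: "140/90", 4: "72"}
source_ehr = {1: "120/80", 2: "98.6", 3: "135/90", 4: "72"}   # record 3 differs

discordant = [i for i in warehouse if warehouse[i] != source_ehr[i]]
rate = (len(warehouse) - len(discordant)) / len(warehouse)
print(discordant, rate)   # -> [3] 0.75
```

Each discordance would then be triaged into a category (design decision, timing, or user-interface setting) before deciding whether the ETL logic itself needs correction.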

  5. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increase geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  6. A Probabilistic Consideration on Nuclide Releases from a Pyro-processed Waste Repository

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youn Myoung; Jeong, Jong Tae [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    Very recently, a GoldSim template program, GSTSPA, has been developed for the safety assessment of a conceptual hybrid-type repository system, called the 'A-KRS,' in which two kinds of pyroprocessed radioactive wastes, low-level metal wastes and ceramic high-level wastes that arise from the pyroprocessing of PWR spent nuclear fuels, are to be disposed of under a 'separate disposal' strategy. The A-KRS is considered to be constructed at two different depths in geological media: at a 200 m depth, at which possible human intrusion is considered to be limited after closure, for the pyroprocessed metal wastes with lower or no decay-heat-producing nuclides, and at a 500 m depth, believed to be under reducing conditions, for nuclides with a rather higher radioactivity and heat generation rate. This program is ready for a probabilistic total system performance assessment (TSPA), which is able to evaluate nuclide release from the repository and further transport into the geosphere and biosphere under the various normal and disruptive natural and manmade events and scenarios that can occur after a failure of a waste package and canister, with the associated uncertainty. To quantify the nuclide release and transport through the various possible pathways in the near- and far-fields of the A-KRS repository system under a normal groundwater flow scenario, some illustrative evaluations have been made through this study. Even though all parameter values associated with the A-KRS were assumed for the time being, the illustrative results should be informative, since the evaluation of such releases is very important not only in view of the safety assessment of the repository, but also for design feedback on its performance.

  7. Influence of Groundwater Flow Rate on Nuclide Releases from Pyro-processed Waste Repository

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Jeong, Jong Tae

    2011-01-01

    Since the early 2000s, several template programs for the safety assessment of high-level radioactive waste repository systems as well as low- and intermediate-level radioactive waste repository systems have been developed at KAERI utilizing GoldSim and AMBER. Very recently, another template program has been developed for a conceptual hybrid-type repository system, called the 'A-KRS,' in which two kinds of pyroprocessed radioactive wastes, low-level metal wastes and ceramic high-level wastes that arise from the pyroprocessing of PWR spent nuclear fuels, are to be disposed of under a separate disposal strategy. The A-KRS is considered to be constructed at two different depths in geological media: a 200 m depth, at which possible human intrusion is considered to be limited after closure, for the pyroprocessed metal wastes with lower or no decay-heat-producing nuclides, and a 500 m depth, believed to be under reducing conditions, for nuclides with a rather higher radioactivity and heat generation rate. This program is ready for a total system performance assessment, which is able to evaluate nuclide release from the repository and further transport into the geosphere and biosphere under the various normal and disruptive natural and manmade events and scenarios that can occur after a failure of the waste package and canister. To quantify nuclide release and transport through the various possible pathways, especially in the near-field of the A-KRS repository system, some illustrative evaluations have been made through this study. Even though all parameter values associated with the A-KRS were assumed for the time being, the illustrative results should be informative, since the evaluation of such releases is very important not only in view of the safety assessment of the repository, but also for design feedback on its performance.
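    The influence of groundwater flow on release can be illustrated with a single-compartment sketch (an assumption for illustration, not the GoldSim A-KRS model): leaching, taken as proportional to flow, competes with radioactive decay, so the fraction ultimately released grows with the flow rate.

```python
import math

# Illustrative compartment model: inventory decays with constant lam and
# leaches at a rate k proportional to the groundwater flow q. All parameter
# values (half-life, leach constant) are assumptions for illustration.
half_life = 1.0e4            # years
lam = math.log(2) / half_life
k_per_flow = 2.0e-5          # leach-rate constant per unit flow (1/yr per m3/yr)

def released_fraction(q, t):
    """Cumulative fraction released by time t (years) under flow q (m3/yr)."""
    k = k_per_flow * q
    # fraction routed to release out of total removal (release + decay):
    return k / (k + lam) * (1.0 - math.exp(-(k + lam) * t))

for q in (0.1, 1.0, 10.0):
    print(q, round(released_fraction(q, 1.0e4), 4))   # grows with flow
```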

  8. A Probabilistic Consideration on Nuclide Releases from a Pyro-processed Waste Repository

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Jeong, Jong Tae

    2012-01-01

    Very recently, a GoldSim template program, GSTSPA, has been developed for the safety assessment of a conceptual hybrid-type repository system, called the 'A-KRS,' in which two kinds of pyroprocessed radioactive wastes, low-level metal wastes and ceramic high-level wastes that arise from the pyroprocessing of PWR spent nuclear fuels, are to be disposed of under a 'separate disposal' strategy. The A-KRS is considered to be constructed at two different depths in geological media: at a 200 m depth, at which possible human intrusion is considered to be limited after closure, for the pyroprocessed metal wastes with lower or no decay-heat-producing nuclides, and at a 500 m depth, believed to be under reducing conditions, for nuclides with a rather higher radioactivity and heat generation rate. This program is ready for a probabilistic total system performance assessment (TSPA), which is able to evaluate nuclide release from the repository and further transport into the geosphere and biosphere under the various normal and disruptive natural and manmade events and scenarios that can occur after a failure of a waste package and canister, with the associated uncertainty. To quantify the nuclide release and transport through the various possible pathways in the near- and far-fields of the A-KRS repository system under a normal groundwater flow scenario, some illustrative evaluations have been made through this study. Even though all parameter values associated with the A-KRS were assumed for the time being, the illustrative results should be informative, since the evaluation of such releases is very important not only in view of the safety assessment of the repository, but also for design feedback on its performance.

  9. Using 'big data' to validate claims made in the pharmaceutical approval process.

    Science.gov (United States)

    Wasser, Thomas; Haynes, Kevin; Barron, John; Cziraky, Mark

    2015-01-01

    Big Data in the healthcare setting refers to the storage, assimilation, and analysis of large quantities of information regarding patient care. These data can be collected and stored in a wide variety of ways, including electronic medical records collected at the patient bedside or medical records that are coded and passed to insurance companies for reimbursement. When these data are processed, it is possible to validate claims as part of the regulatory review process regarding the anticipated performance of medications and devices. In order to properly analyze claims made by manufacturers and others, there is a need to express claims in terms that are testable in a timeframe that is useful and meaningful to formulary committees. Claims for the comparative benefits and costs, including budget impact, of products and devices need to be expressed in measurable terms, ideally in the context of submission or validation protocols. Claims should be either consistent with accessible Big Data or able to support observational studies in which Big Data identifies target populations. Protocols should identify, in disaggregated terms, key variables that would lead to direct or proxy validation. Once these variables are identified, Big Data can be used to query massive quantities of data in the validation process. Research can be passive or active in nature: passive, where the data are collected retrospectively, or active, where the researcher prospectively looks for indicators of co-morbid conditions, side-effects, or adverse events, testing these indicators to determine whether claims are within the desired ranges set forth by the manufacturer. Additionally, Big Data can be used to assess the effectiveness of therapy through health insurance records; this, for example, could indicate that a disease or co-morbid condition has ceased to be treated. Understanding the basic strengths and weaknesses of Big Data in the claim validation process provides a glimpse of the value that this research
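    The range check described above amounts to comparing an observed rate against the manufacturer's claimed bound; a minimal sketch, with a hypothetical claim and hypothetical counts:

```python
# Hedged sketch: test whether an adverse-event rate observed in a large
# claims dataset falls within the range set forth by the manufacturer.
# The counts and the claimed bound below are hypothetical.
def claim_supported(events, patients, claimed_max_rate):
    observed = events / patients
    return observed, observed <= claimed_max_rate

rate, ok = claim_supported(events=230, patients=100_000, claimed_max_rate=0.003)
print(rate, ok)   # -> 0.0023 True
```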

  10. Validation of the sterilization process for radiopharmaceuticals and materials with humid heat

    International Nuclear Information System (INIS)

    Robles, Anita; Moore, Mariel; Morote, Mario; Guevara, Buenaventura; Castro, Delcy; Paragulla, Wilson; Martinez, Ramos; Ocana, Elias; Novoa, Carlos

    2014-01-01

    A validation protocol has been designed and applied for the humid-heat sterilization process for radiopharmaceuticals and materials, using a sodium pertechnetate Tc-99m injection solution (placebo) and materials, in compliance with good manufacturing practices for pharmaceutical products. The sterilization cycle set for each load was developed according to the following parameters: 121 °C ± 1 °C (temperature), 15 ± 0.5 psi (pressure), and an exposure time of 20 and 15 minutes, respectively. In the penetration test with load, the F0 values were higher than 20 minutes at 121 °C, and the biological challenge with biological indicators (Bacillus stearothermophilus) was negative at the coldest spots, in three consecutive runs. The sterilization process for each load and piece of equipment has been validated and meets the established acceptance criteria. (authors).
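    The F0 values reported above follow the standard lethality integral F0 = sum of 10^((T - 121.1)/z) * dt with z = 10 °C, accumulating equivalent minutes at the reference temperature; a sketch with a hypothetical cold-spot temperature trace:

```python
# Standard F0 lethality calculation: equivalent sterilization minutes at
# 121.1 °C, with z = 10 °C. The probe readings below are hypothetical.
def f0(temperatures_c, dt_min, z=10.0, t_ref=121.1):
    return sum(10 ** ((t - t_ref) / z) * dt_min for t in temperatures_c)

# Hypothetical cold-spot trace: heat-up, a 20 min plateau, then cool-down
trace = [100.0, 110.0, 118.0] + [121.1] * 20 + [115.0, 105.0]
print(round(f0(trace, dt_min=1.0), 2))   # equivalent minutes at 121.1 °C
```

Note how the heat-up and cool-down phases contribute only fractions of a minute each, so nearly all of the lethality accrues during the plateau.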

  11. Chitosan microparticles: influence of the gelation process on the release profile and oral bioavailability of albendazole, a class II compound.

    Science.gov (United States)

    Piccirilli, Gisela N; García, Agustina; Leonardi, Darío; Mamprin, María E; Bolmaro, Raúl E; Salomón, Claudio J; Lamas, María C

    2014-11-01

    Encapsulation of albendazole, a class II compound, into polymeric microparticles based on chitosan-sodium lauryl sulfate was investigated as a strategy to improve drug dissolution and oral bioavailability. The microparticles were prepared by the spray-drying technique and further characterized by means of X-ray powder diffractometry, infrared spectroscopy, and scanning electron microscopy. The formation of a novel polymeric structure between chitosan and sodium lauryl sulfate, after the internal or external gelation process, was observed by infrared spectroscopy. The encapsulation efficiency was found to be between 60 and 85%, depending on whether the internal or external gelation process was used. Almost spherical spray-dried microparticles were observed using scanning electron microscopy. In vitro dissolution results indicated that the microparticles prepared by internal gelation released 8% of the drug within 30 min, while the microparticles prepared by external gelation released 67% within 30 min. The AUC and Cmax values of ABZ from the microparticles were greatly improved in comparison with the non-encapsulated drug. In conclusion, the release properties and oral bioavailability of albendazole were greatly improved by using spray-dried chitosan-sodium lauryl sulfate microparticles.

  12. Mitigation of release of volatile iodine species during severe reactor accidents - a novel reliable process of safety technology

    International Nuclear Information System (INIS)

    Guentay, S.; Bruchertseifer, H.

    2010-01-01

    In severe accidents, a significant risk to public health may arise from the release of gaseous iodine species into the environment through containment leaks or containment venting filter systems with low retention efficiency. Elemental iodine and volatile organic iodides are the main gaseous iodine species in the containment. The potential release of large quantities of gaseous elemental iodine from the reactor coolant system, or its radiolytic generation in the containment sump, constitutes the key source of gaseous elemental iodine in the containment atmosphere. Iodine-paint reactions as well as the reaction of iodine with organic residuals in sump water are the main mechanisms for the generation of highly volatile organic iodides in the containment. Although a solution was very much desired, the significant research activities conducted in the 1970s unfortunately did not produce any technically feasible means of mitigating iodine release into the environment under the prevailing conditions. The development of a process leading to fast, comprehensive, and reliable retention of volatile iodine species in aqueous solution, with the aim of implementing it for severe accident management applications, has been the subject of a research project in recent years at the Paul Scherrer Institut. The developed process utilizes two customary technical chemical additives simultaneously in an aqueous solution. The results of the experimental program have demonstrated fast and reliable destruction of highly volatile organic iodine species, fast reduction of elemental iodine into iodide ions in aqueous solutions, and efficient mitigation of the re-formation of gaseous iodine from iodide ions. The investigations covered a broad range of anticipated severe accident conditions in the containment. The project additionally focused on the possible application of the process to existing containment venting filter systems, specifically as a passive add-on for back-fitting. This paper describes the process.

  13. Food and Drug Administration process validation activities to support 99Mo production at Sandia National Laboratories

    International Nuclear Information System (INIS)

    McDonald, M.J.; Bourcier, S.C.; Talley, D.G.

    1997-01-01

    Prior to 1989, 99Mo was produced in the US by a single supplier, Cintichem Inc., Tuxedo, NY. Because of problems associated with operating its facility, in 1989 Cintichem elected to decommission the facility rather than incur the costs of repair. The demise of the 99Mo capability at Cintichem left the US totally reliant upon a single foreign source, Nordion International, located in Ottawa, Canada. In 1992 the DOE purchased the Cintichem 99Mo Production Process and Drug Master File (DMF). In 1994 the DOE funded Sandia National Laboratories (SNL) to produce 99Mo. Although Cintichem produced 99Mo and 99mTc generators for many years, there was no requirement for the process validation that is now required by the Food and Drug Administration (FDA). In addition to the validation requirement, the requirements for current Good Manufacturing Practices were codified into law. The purpose of this paper is to describe the process validation being conducted at SNL for the qualification of SNL as a supplier of 99Mo to US pharmaceutical companies.

  14. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi

    2016-02-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and steady-state phases, contributing to an understanding of the process performance, especially when it is driven by an intermittent energy supply such as solar energy. The model is experimentally validated in the steady-state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, the experimental validation includes the time-variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with a 0.1 °C increment every 2 min. The validation shows a relative error of less than 5%, indicating a strong correlation between the model predictions and the experiments.
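
The steady-state validation described above amounts to comparing predicted and measured permeate flux point by point. A minimal sketch of that comparison in Python; the flux values below are illustrative placeholders, not data from the study:

```python
def relative_error(predicted, measured):
    """Point-wise relative error (%) between model predictions and measurements."""
    return [abs(p - m) / abs(m) * 100.0 for p, m in zip(predicted, measured)]

# Hypothetical permeate-flux values (kg/m2/h) at several feed inlet temperatures
model = [4.1, 7.9, 13.2, 20.5]
experiment = [4.0, 8.2, 13.0, 21.3]

errors = relative_error(model, experiment)
assert max(errors) < 5.0  # the abstract reports relative error below 5%
```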

  15. Dynamic modeling and experimental validation for direct contact membrane distillation (DCMD) process

    KAUST Repository

    Eleiwi, Fadi; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Francis, Lijo; Laleg-Kirati, Taous-Meriem

    2016-01-01

    This work proposes a mathematical dynamic model for the direct contact membrane distillation (DCMD) process. The model is based on a 2D Advection–Diffusion Equation (ADE), which describes the heat and mass transfer mechanisms that take place inside the DCMD module. The model studies the behavior of the process in the time-varying and steady-state phases, contributing to an understanding of the process performance, especially when it is driven by an intermittent energy supply such as solar energy. The model is experimentally validated in the steady-state phase, where the permeate flux is measured for different feed inlet temperatures and the maximum absolute error recorded is 2.78 °C. Moreover, the experimental validation includes the time-variation phase, where the feed inlet temperature ranges from 30 °C to 75 °C with a 0.1 °C increment every 2 min. The validation shows a relative error of less than 5%, indicating a strong correlation between the model predictions and the experiments.

  16. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    Science.gov (United States)

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study is to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified from our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least squares (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not statistically influence the quality attributes of the granules and tablets. However, the lubricant type significantly impacted the granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models yielded higher SEP values for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors that could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics as a powerful tool to monitor critical quality attributes (CQAs) identified during formulation development.
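
The standard error of prediction (SEP) used to judge the PLS models above has a simple closed form: the bias-corrected root-mean-square of the prediction residuals. A short sketch, with hypothetical crushing-force values standing in for the study's data:

```python
import math

def sep(y_reference, y_predicted):
    """Standard error of prediction: bias-corrected root-mean-square of the
    prediction residuals, as commonly used in NIR chemometrics."""
    n = len(y_reference)
    residuals = [p - r for r, p in zip(y_reference, y_predicted)]
    bias = sum(residuals) / n
    return math.sqrt(sum((e - bias) ** 2 for e in residuals) / (n - 1))

# Hypothetical reference vs. NIR-predicted crushing-force values (N)
reference = [98.0, 105.0, 110.0, 121.0, 130.0]
predicted = [99.5, 104.0, 112.0, 119.5, 131.0]
print(round(sep(reference, predicted), 2))
```

A model transferred to a new manufacturing site typically shows a larger SEP, as the abstract reports, because the calibration no longer spans the new site's variability.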

  17. First do no harm: developing an ethical process of consent and release for digital storytelling in healthcare

    Directory of Open Access Journals (Sweden)

    Pip Hardy

    2015-11-01

    Full Text Available Opportunities to disseminate the stories of patients and those who care for them via the internet create new dilemmas with respect to ethical processes of consent and release. The possibility of utilising images as well as words further complicates this issue. Balancing the need to protect the safety and security of those who share their stories with their own desire for their stories to be widely heard presents a complex blend of ethical issues. The Patient Voices programme has been helping people create and share digital stories of healthcare since 2003. During that time, careful thought has been given to the development of a respectful process that both empowers and protects storytellers, affording time at every stage of the process to reflect and make informed decisions about consent, sharing and dissemination. This paper describes how that process has been developed and explores the issues that it was designed to address.

  18. Inhibition of basophil histamine release by gangliosides. Further studies on the significance of cell membrane sialic acid in the histamine release process

    DEFF Research Database (Denmark)

    Jensen, C; Norn, S; Thastrup, Ole

    1987-01-01

    Histamine release from human basophils was inhibited by preincubation of the cells with a glucolipid mixture containing sialic acid-containing gangliosides. This was true for histamine release induced by anti-IgE, Concanavalin A and the calcium ionophore A23187, whereas the release induced by S. aureus Wood 46 was not affected. It was demonstrated that the inhibitory capacity of the glucolipid mixture could be attributed to its content of gangliosides, since no inhibition was obtained with cerebrosides or with gangliosides from which sialic acid was removed. Preincubation of the cells with the glucolipid mixture increased the sialic acid content of the cells, and this increase was attributed to an insertion of gangliosides into the cell membrane. The inhibition of histamine release was abolished by increasing the calcium concentration, which substantiates our previous findings that cell membrane...

  19. Challenges in validating the sterilisation dose for processed human amniotic membranes

    International Nuclear Information System (INIS)

    Yusof, Norimah; Hassan, Asnah; Firdaus Abd Rahman, M.N.; Hamid, Suzina A.

    2007-01-01

    Most of the tissue banks in the Asia Pacific region have been using ionising radiation at 25 kGy to sterilise human tissues for safe clinical use. Under a tissue banking quality system, any dose employed for sterilisation has to be validated, and the validation exercise has to be part of the quality documentation. Tissue grafts, unlike medical items, are not produced in large numbers per processing batch, and tissues have comparatively varied microbial populations. A Code of Practice established by the International Atomic Energy Agency (IAEA) in 2004 offers several validation methods using smaller numbers of samples than ISO 11137 (1995), which is meant for medical products. The methods emphasise bioburden determination, followed by a sterility test on samples exposed to a verification dose, to attain a sterility assurance level (SAL) of 10⁻¹. This paper describes our experience in using the IAEA Code of Practice to conduct the validation exercise substantiating 25 kGy as the sterilisation dose for both air-dried amnion and amnion preserved in 99% glycerol
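
Under the simplest log-linear inactivation model, the dose needed to reach a given sterility assurance level follows directly from the bioburden and the microbial D10 value. The sketch below shows that relationship only; the IAEA and ISO 11137 methods cited in the abstract use standardized microbial resistance distributions and look-up tables rather than a single D10, and all numbers here are hypothetical:

```python
import math

def dose_for_sal(bioburden_cfu, d10_kgy, sal):
    """Dose (kGy) needed to reduce an initial bioburden to a target sterility
    assurance level under simple log-linear inactivation:
    N(D) = N0 * 10**(-D / D10)."""
    return d10_kgy * math.log10(bioburden_cfu / sal)

# Hypothetical inputs: average bioburden of 100 CFU per graft, D10 of 2 kGy
verification_dose = dose_for_sal(100, 2.0, 1e-1)  # SAL 10^-1, as in the verification test
full_dose = dose_for_sal(100, 2.0, 1e-6)          # SAL 10^-6, required for a sterile claim
print(verification_dose, full_dose)
```

The asymmetry between the two doses is why a verification dose experiment at SAL 10⁻¹ can substantiate a much higher sterilisation dose with a small sample count.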

  20. Challenges in validating the sterilisation dose for processed human amniotic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Yusof, Norimah [Malaysian Nuclear Agency, Bangi, 43000 Kajang, Selangor (Malaysia)], E-mail: norimah@mint.gov.my; Hassan, Asnah [Malaysian Nuclear Agency, Bangi, 43000 Kajang, Selangor (Malaysia); Firdaus Abd Rahman, M.N.; Hamid, Suzina A. [National Tissue Bank, Hospital Universiti Sains Malaysia, Kubang Kerian, 16130 Kelantan (Malaysia)

    2007-11-15

    Most of the tissue banks in the Asia Pacific region have been using ionising radiation at 25 kGy to sterilise human tissues for safe clinical use. Under a tissue banking quality system, any dose employed for sterilisation has to be validated, and the validation exercise has to be part of the quality documentation. Tissue grafts, unlike medical items, are not produced in large numbers per processing batch, and tissues have comparatively varied microbial populations. A Code of Practice established by the International Atomic Energy Agency (IAEA) in 2004 offers several validation methods using smaller numbers of samples than ISO 11137 (1995), which is meant for medical products. The methods emphasise bioburden determination, followed by a sterility test on samples exposed to a verification dose, to attain a sterility assurance level (SAL) of 10⁻¹. This paper describes our experience in using the IAEA Code of Practice to conduct the validation exercise substantiating 25 kGy as the sterilisation dose for both air-dried amnion and amnion preserved in 99% glycerol.

  1. Process validation for the manufacturing of Tc-99m generator at Nuclear Malaysia

    International Nuclear Information System (INIS)

    Noriah Jamal; Rehir Dahlan; Wan Anuar Wan Awang; Zakaria Ibrahim; Shaaban Kassim; Wan Firdaus Wan Ishak; Nelly Bo Nai Lee; Noraisyah Yusof; Siti Selina Abdul Hamid; Ng Yen; Rahimah Abdul Rahim; Muhammad Hanafi Mohamad Mokhtar; Azahari Kasbollah; Abd Jalil Abd Hamid; Yahya Talib; Shafii Khamis; Zulkifli Mohamed Hashim

    2007-01-01

    Process validation provides the best platform for identifying potential problems in the actual radiopharmaceutical manufacturing work. The purpose of this paper is to present experience in performing process validation for the manufacturing of the Tc-99m generator at Nuclear Malaysia. Process validation for the manufacturing of the Tc-99m generator was done by performing four try runs between October 2006 and April 2007, using saline instead of the actual product. Each try run took four days to complete. On day 1, the clean room was cleaned and disinfected. On day 2, washing and sterilisation of utensils, columns, rubber stoppers and aluminium caps was carried out. On day 3, preparation of the white top, the alumina-packed column and the mixing solutions was performed, and the apparatus was sent for sterility testing. On day 4, the actual production day of the try run, the column was impregnated with sterile saline. Prior to the manufacturing activities, particle count measurements and area clearance were performed to ensure that the temperature and humidity of the clean room were suitable for the production work. Settle plates were placed at the identified positions, including in the Hot Cell. Personnel finger prints were taken before and after the production work using touch plates. After completion of each try run, the eluate from the manufactured generators, together with the settle and touch plates, was sent to the quality control unit for microbiological testing; it took fourteen days to get the test results. The first try run failed, possibly because of insufficient arrangement and preparation of the work, or because the cleaning and disinfection of the clean room was not done properly. The three subsequent consecutive try runs met all the specifications, including the sterility test, the endotoxin test and the finger prints. This shows that the manufacturing of the Tc-99m generator at Nuclear Malaysia is validated and ready for the active run. (Author)

  2. Validation of the FEA of a deep drawing process with additional force transmission

    Science.gov (United States)

    Behrens, B.-A.; Bouguecha, A.; Bonk, C.; Grbic, N.; Vucetic, M.

    2017-10-01

    In order to meet requirements of the automotive industry, such as decreasing CO2 emissions by reducing vehicle mass in the car body, the chassis and the powertrain, continuous innovation and further development of existing production processes are required. In sheet metal forming processes, the process limits and component characteristics are defined by the process-specific loads. When the load limits are exceeded, material failure occurs; this can be avoided by additional force transmission activated in the deep drawing process before the process limit is reached. This contribution deals with experimental investigations of a forming process with additional force transmission with regard to the extension of the process limits. Based on FEA, a tool system is designed and developed by IFUM. For this purpose, the steel material HCT600 is analyzed numerically. Within the experimental investigations, deep drawing processes with and without the additional force transmission are carried out and the produced rectangular cups are compared. Subsequently, the identical deep drawing processes are investigated numerically. The values of the punch reaction force and displacement are estimated and compared with the experimental results; thus, the material model is successfully validated at the process scale. For further quantitative verification of the FEA results, the experimentally determined geometry of the rectangular cup is measured optically with the ATOS system of GOM mbH and digitally compared using the external software Geomagic® Qualify™. The goal of this paper is to verify the transferability of the FEA model for a conventional deep drawing process to a deep drawing process with additional force transmission with a counter punch.
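
Comparing FEA-predicted and measured punch-force histories, as described above, is typically reduced to a scalar deviation measure such as the root-mean-square error. A minimal sketch with hypothetical force samples, not values from the paper:

```python
import math

def rmse(simulated, measured):
    """Root-mean-square deviation between FEA-predicted and measured values."""
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(simulated, measured))
                     / len(simulated))

# Hypothetical punch-force samples (kN) over the punch stroke
fea_force = [0.0, 52.0, 118.0, 176.0, 201.0]
measured_force = [0.0, 50.0, 120.0, 180.0, 198.0]
print(round(rmse(fea_force, measured_force), 2))
```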

  3. A2A-D2 receptor-receptor interaction modulates gliotransmitter release from striatal astrocyte processes.

    Science.gov (United States)

    Cervetto, Chiara; Venturini, Arianna; Passalacqua, Mario; Guidolin, Diego; Genedani, Susanna; Fuxe, Kjell; Borroto-Esquela, Dasiel O; Cortelli, Pietro; Woods, Amina; Maura, Guido; Marcoli, Manuela; Agnati, Luigi F

    2017-01-01

    Evidence for striatal A2A-D2 heterodimers has led to a new perspective on molecular mechanisms involved in schizophrenia and Parkinson's disease. Despite the increasing recognition of astrocytes' participation in neuropsychiatric disease vulnerability, involvement of striatal astrocytes in A2A and D2 receptor signal transmission has never been explored. Here, we investigated the presence of D2 and A2A receptors in isolated astrocyte processes prepared from adult rat striatum by confocal imaging; the effects of receptor activation were measured on the 4-aminopyridine-evoked release of glutamate from the processes. Confocal analysis showed that A2A and D2 receptors were co-expressed on the same astrocyte processes. Evidence for A2A-D2 receptor-receptor interactions was obtained by measuring the release of the gliotransmitter glutamate: D2 receptors inhibited the glutamate release, while activation of A2A receptors, per se ineffective, abolished the effect of D2 receptor activation. The synthetic D2 peptide VLRRRRKRVN corresponding to the receptor region involved in electrostatic interaction underlying A2A-D2 heteromerization abolished the ability of the A2A receptor to antagonize the D2 receptor-mediated effect. Together, the findings are consistent with heteromerization of native striatal astrocytic A2A-D2 receptors that via allosteric receptor-receptor interactions could play a role in the control of striatal glutamatergic transmission. These new findings suggest possible new pathogenic mechanisms and/or therapeutic approaches to neuropsychiatric disorders. © 2016 International Society for Neurochemistry.

  4. [Sediment-water flux and processes of nutrients and gaseous nitrogen release in a China River Reservoir].

    Science.gov (United States)

    Chen, Zhu-hong; Chen, Neng-wang; Wu, Yin-qi; Mo, Qiong-li; Zhou, Xing-peng; Lu, Ting; Tian, Yun

    2014-09-01

    The key processes and fluxes of nutrients (N and P) and gaseous N (N2 and N2O) across the sediment-water interface in a river reservoir (Xipi) of the Jiulong River watershed in southeast China were studied. Intact-core sediment incubations for nutrient exchange, in-situ observation and lab incubation of excess dissolved N2 and N2O (products of nitrification, denitrification and Anammox), and determination of physicochemical and microbial parameters were carried out in 2013 for three representative sites along the lacustrine zone of the reservoir. Results showed that ammonium and phosphate were generally released from sediment to overlying water [with average fluxes of N (479.8 ± 675.4) mg·(m²·d)⁻¹ and P (4.56 ± 0.54) mg·(m²·d)⁻¹], while nitrate and nitrite diffused into the sediment. Flood events in the wet season could introduce a large amount of particulate organic matter that would be trapped by the dam reservoir, resulting in the high release fluxes of ammonium and phosphate observed in the following low-flow season. No clear spatial variation of sediment nutrient release was found in the lacustrine zone of the reservoir. Gaseous N release was dominated by excess dissolved N2 (98% of the total), and the N2 flux from sediment was (15.8 ± 12.5) mg·(m²·d)⁻¹. There was a longitudinal and vertical variation of excess dissolved N2, reflecting the combined results of denitrification and Anammox occurring in anoxic sediment and of fluvial transport. Nitrification mainly occurred in the lower lacustrine zone, and the enrichment of N2O was likely regulated by the ratio of ammonium to DIN in the water.
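
Fluxes of the kind reported above are derived from intact-core incubations: the concentration change in the overlying water, scaled by water volume, core area and incubation time. A simplified sketch with hypothetical numbers (real incubations also correct for water replaced during sampling):

```python
def sediment_flux(c_start, c_end, volume_l, area_m2, days):
    """Flux across the sediment-water interface from a core incubation,
    in mg per m2 per day. Concentrations in mg/L; positive values mean
    release from the sediment to the overlying water."""
    return (c_end - c_start) * volume_l / (area_m2 * days)

# Hypothetical incubation: 2 L of overlying water above a 0.005 m2 core for
# 2 days, with ammonium-N rising from 0.50 to 1.10 mg/L
flux = sediment_flux(0.50, 1.10, 2.0, 0.005, 2.0)
print(flux)
```

A negative result under the same convention would indicate uptake by the sediment, as the abstract describes for nitrate and nitrite.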

  5. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    Science.gov (United States)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from earlier studies and experiments were used for the validation. The results showed that the correlation coefficients between the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and those between the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. With this accuracy, the developed model can be used to predict water contents at different soil depths and temperatures.
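
The correlation coefficients quoted for the validation can be reproduced with the ordinary Pearson formula; a small sketch with illustrative water-content series, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative simulated vs. measured liquid water contents
simulated = [0.28, 0.22, 0.15, 0.09, 0.05]
measured = [0.27, 0.23, 0.14, 0.10, 0.05]
print(round(pearson_r(simulated, measured), 3))
```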

  6. Conceptual dissonance: evaluating the efficacy of natural language processing techniques for validating translational knowledge constructs.

    Science.gov (United States)

    Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B

    2009-03-01

    The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.

  7. Assessing middle school students' understanding of science relationships and processes: Year 2 - instrument validation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Schau, C.; Mattern, N.; Weber, R.; Minnick, K.

    1997-01-01

    Our overall purpose for this multi-year project was to develop an alternative assessment format measuring rural middle school students' understanding of science concepts and processes and the interrelationships among them. This kind of understanding is called structural knowledge. We had three major interrelated goals: (1) synthesize the existing literature and critically evaluate the actual and potential use of measures of structural knowledge in science education; (2) develop a structural knowledge alternative assessment format; and (3) examine the validity of our structural knowledge format. We accomplished the first two goals during year 1. The structural knowledge assessment we identified and developed further was a select-and-fill-in concept map format. The goal for our year 2 work was to begin to validate this assessment approach. This final report summarizes our year 2 work.

  8. Modeling mesoscale diffusion and transport processes for releases within coastal zones during land/sea breezes

    International Nuclear Information System (INIS)

    Lyons, W.A.; Keen, C.S.; Schuh, J.A.

    1983-12-01

    This document discusses the impacts of coastal mesoscale regimes (CMRs) upon the transport and diffusion of potential accidental radionuclide releases from a shoreline nuclear power plant. CMRs exhibit significant spatial (horizontal and vertical) and temporal variability. Case studies illustrate land breezes, sea/lake breeze inflows and return flows, thermal internal boundary layers, fumigation, plume trapping, coastal convergence zones, thunderstorms and snow squalls. The direct application of a conventional Gaussian straight-line dose assessment model, initialized only by on-site tower data, can potentially produce highly misleading guidance as to plume impact locations. Since much is known concerning CMRs, there are many potential improvements to modularized dose assessment codes, such as by proper parameterization of TIBLs, forecasting the inland penetration of convergence zones, etc. A three-dimensional primitive equation prognostic model showed excellent agreement with detailed lake breeze field measurements, giving indications that such codes can be used in both diagnostic and prognostic studies. The use of relatively inexpensive supplemental meteorological data especially from remote sensing systems (Doppler sodar, radar, lightning strike tracking) and computerized data bases should save significantly on software development costs. Better quality assurance of emergency response codes could include systems of flags providing personnel with confidence levels as to the applicability of a code being used during any given CMR
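
For contrast with the mesoscale effects discussed, the conventional straight-line Gaussian plume model that the document warns about can be written in a few lines. The sketch below includes ground reflection; the dispersion parameters are passed in directly rather than derived from a stability class, and all symbols are generic rather than taken from the document:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state, straight-line Gaussian plume concentration with ground
    reflection -- the conventional dose-assessment model that can mislead
    under coastal mesoscale regimes.

    q: source strength (Bq/s); u: mean wind speed (m/s); y: crosswind and
    z: vertical receptor coordinates (m); h: effective release height (m);
    sigma_y/sigma_z: dispersion parameters at the receptor's downwind
    distance, normally taken from stability-class correlations."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

In practice sigma_y and sigma_z grow with downwind distance according to the assumed stability class, and it is exactly that straight-line, single-stability assumption that land/sea breezes, TIBL fumigation and convergence zones invalidate.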

  9. Obtaining Valid Safety Data for Software Safety Measurement and Process Improvement

    Science.gov (United States)

    Basili, Victor r.; Zelkowitz, Marvin V.; Layman, Lucas; Dangle, Kathleen; Diep, Madeline

    2010-01-01

    We report on a preliminary case study to examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Our goal is to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. Our purpose was two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to identify potential risks due to incorrect application of the safety process, deficiencies in the safety process, or the lack of a defined process. One early outcome of this work was to show that there are structural deficiencies in collecting valid safety data that make software safety different from hardware safety. In our conclusions we present some of these deficiencies.

  10. Formulation and validation of applied engineering equations for heat transfer processes in the food industry

    DEFF Research Database (Denmark)

    Christensen, Martin Gram

    The study is focused on convective heat transfer in the processing of solid foods, specifically with the scope to develop simple analytical calculation tools that can be incorporated into spreadsheet solutions. In areas of food engineering such as equipment manufacture, predictive calculations, modelling activities and simulations are employed to a high degree for improved design. In food manufacture, process calculations are seldom applied. Even though the calculation of thermal processes is not a challenging task in academia, this is not the case in food manufacture. However, the calculations need fundamental validation and a generality that ensures wide application; thus the development of simplified approximations and engineering equations also has to be conducted in academia. The focus group for the utilization of the presented work is food manufacture and the authorities ensuring food

  11. Environmental processes and parameters influencing the consequences of an accidental release of radioactivity weather and season

    International Nuclear Information System (INIS)

    Boeri, G.C.

    1989-01-01

    Seasonal, climatic and meteorological conditions may have a substantial influence on the physical factors involved in the transport and deposition of airborne contaminants, and on the transfer and accumulation of radionuclides in terrestrial and aquatic ecosystems. These environmental conditions can also have a significant influence on living habits and practices, and thus on potential radiological and economic impacts. Moreover, they may affect the features and the impact of countermeasures that are adopted for the protection of the public in case of an accidental release. During a Special Session that the Committee on Radiation Protection and Public Health (CRPPH) held on 1st-2nd September 1986 to review the radiological aspects of the Chernobyl nuclear accident, it was agreed that a consultant should prepare a report reviewing different accident consequences in a radiation protection and public health perspective, and identify the influence of such parameters as time of the year, weather and environmental conditions on the overall impact and the determination of appropriate countermeasures. A Consultant Report on this issue was prepared by Dr. G. Boeri and submitted to the CRPPH for review and consideration at its meeting of 22nd-24th November 1987. The CRPPH subsequently agreed that the Consultant Report should be revised and completed, taking into account comments and suggestions sent to the Secretariat and focussing especially on the effect of seasonal and weather conditions in terms of their influence on the radiological impact of an accident and on the emergency countermeasures to be taken. It was decided that the Consultant Report should be developed into an Overview Paper for a workshop on this issue to be organised by the NEA in 1988

  12. Observational study on the fine structure and dynamics of a solar jet. II. Energy release process revealed by spectral analysis

    Science.gov (United States)

    Sakaue, Takahito; Tei, Akiko; Asai, Ayumi; Ueno, Satoru; Ichimoto, Kiyoshi; Shibata, Kazunari

    2018-01-01

    We report on a solar jet phenomenon associated with the C5.4 class flare on 2014 November 11. The jet data were provided by the Solar Dynamics Observatory, the X-Ray Telescope (XRT) aboard Hinode, and the Interface Region Imaging Spectrograph and the Domeless Solar Telescope (DST) at Hida Observatory, Kyoto University. These plentiful data enabled us to present this series of papers discussing all the processes of the observed phenomena, including energy storage, event trigger, and energy release. In this paper, we focus on the energy release process of the observed jet, and mainly describe our spectral analysis of the DST Hα data to investigate the internal structure of the Hα jet and its temporal evolution. This analysis reveals that in the physical-quantity distributions of the Hα jet, such as line-of-sight velocity and optical thickness, there is a significant gradient in the direction crossing the jet. We interpret this internal structure as the consequence of the migration of the energy release site, based on the idea of ubiquitous reconnection. Moreover, by measuring the horizontal flow of the fine structures in the jet, we succeeded in deriving the three-dimensional velocity field and the line-of-sight acceleration field of the Hα jet. The analysis result indicates that part of the ejecta in the Hα jet experienced additional acceleration after it had been ejected from the lower atmosphere. This secondary acceleration was found to occur in the vicinity of the intersection between the trajectories of the Hα jet and the X-ray jet observed by Hinode/XRT. We propose that a fundamental cause of this phenomenon is magnetic reconnection involving the plasmoid in the observed jet.
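
The line-of-sight velocities obtained from the Hα spectra follow from the Doppler shift of the line. A minimal sketch: the rest wavelength is the standard Hα value, while the observed wavelength below is purely illustrative:

```python
C_KM_S = 299792.458   # speed of light, km/s
H_ALPHA_A = 6562.8    # rest wavelength of the H-alpha line, angstroms

def los_velocity(observed_a, rest_a=H_ALPHA_A):
    """Line-of-sight velocity (km/s) from the Doppler shift of a spectral
    line; positive means red-shifted (motion away from the observer)."""
    return C_KM_S * (observed_a - rest_a) / rest_a

# Illustrative observed wavelength, not a value from the study
print(round(los_velocity(6563.9), 1))
```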

  13. Design of generic coal conversion facilities: Process release---Direct coal liquefaction

    Energy Technology Data Exchange (ETDEWEB)

    1991-09-01

    The direct liquefaction portion of the PETC generic direct coal liquefaction process development unit (PDU) is being designed to provide maximum operating flexibility. The PDU design will permit catalytic and non-catalytic liquefaction concepts to be investigated at their proof-of-concept stages before any larger-scale operations are attempted. The principal variations from concept to concept are reactor configurations and types. These include thermal, ebullating-bed, slurry-phase and fixed-bed reactors, as well as different types of catalyst. All of these operating modes are necessary to define and identify the optimum process conditions and configurations for determining improved, economical liquefaction technology.

  14. Improving the residency admissions process by integrating a professionalism assessment: a validity and feasibility study.

    Science.gov (United States)

    Bajwa, Nadia M; Yudkowsky, Rachel; Belli, Dominique; Vu, Nu Viet; Park, Yoon Soo

    2017-03-01

    The purpose of this study was to provide validity and feasibility evidence for measuring professionalism using Professionalism Mini-Evaluation Exercise (P-MEX) scores as part of a residency admissions process. In 2012 and 2013, three standardized-patient-based P-MEX encounters were administered to applicants invited for an interview at the University of Geneva Pediatrics Residency Program. Validity evidence was gathered for P-MEX content (item analysis); response process (qualitative feedback); internal structure (inter-rater reliability with intraclass correlation and generalizability); relations to other variables (correlations); and consequences (logistic regression to predict admission). To improve reliability, Kane's formula was used to create an applicant composite score using P-MEX, structured letter of recommendation (SLR), and structured interview (SI) scores. Applicant rank lists using composite scores versus faculty global ratings were compared using the Wilcoxon signed-rank test. Seventy applicants were assessed. Moderate associations were found between pairwise correlations of P-MEX scores and SLR (r = 0.25, P = .036), SI (r = 0.34, P = .004), and global ratings (r = 0.48, P < .001). Rank lists of applicants using the composite score versus the global rating differed significantly (z = 5.41, P < .001), suggesting that integrating the P-MEX can improve the reliability of the residency admissions process by improving applicant composite score reliability.
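The composite-scoring step can be illustrated with a generic z-score weighting sketch. This is not Kane's formula, which the abstract does not reproduce; the weights and toy score vectors below are hypothetical.

```python
def zscores(xs):
    """Standardize a list of scores (sample standard deviation)."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / (n - 1)) ** 0.5
    return [(x - mean) / sd for x in xs]

def composite(pmex, slr, si, weights=(0.4, 0.3, 0.3)):
    """Weighted composite of standardized P-MEX, SLR and SI scores.

    The weights are hypothetical; a real admissions composite would use
    weights derived from the program's own reliability analysis.
    """
    cols = [zscores(pmex), zscores(slr), zscores(si)]
    return [sum(w * col[k] for w, col in zip(weights, cols))
            for k in range(len(pmex))]
```

Ranking applicants by such a composite, rather than by a single global rating, is what the abstract compares with the Wilcoxon signed-rank test.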

  15. Real-time process signal validation based on neuro-fuzzy and possibilistic approach

    International Nuclear Information System (INIS)

    Figedy, S.; Fantoni, P.F.; Hoffmann, M.

    2001-01-01

    Real-time process signal validation is an application field where the use of fuzzy logic and artificial neural networks can improve the diagnostics of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process is to be performed. The possibilistic approach allows fast detection of unforeseen plant conditions. Specialized artificial neural networks are used, one for each fuzzy cluster. This offers two main advantages: the accuracy and generalization capability are increased compared to the case of a single network working in the entire operating region, and the ability to identify abnormal conditions, in which the system cannot operate with satisfactory accuracy, is improved. The system analyzes the signals, e.g. the readings of process monitoring sensors, computes their expected values, and raises an alert if real values deviate from the expected ones by more than the allowed limits. The reliability level of the current analysis is also produced. The model has been tested on simulated data from a PWR nuclear power plant, to monitor safety-related reactor variables over the entire power-flow operating map, and was installed under real conditions in a BWR nuclear reactor. (Authors)
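The routing idea described above (classify the operating point into a fuzzy cluster, then compare the measurement against that cluster's specialized model) can be sketched as follows; the cluster centers, linear stand-in models, and tolerance are hypothetical placeholders for the paper's trained neural networks and possibilistic extension.

```python
def fuzzy_memberships(x, centers, m=2.0):
    """Fuzzy c-means memberships of a scalar x to known cluster centers."""
    d = [abs(x - c) for c in centers]
    # If x coincides with a center, assign it full membership there.
    for i, di in enumerate(d):
        if di == 0:
            return [1.0 if j == i else 0.0 for j in range(len(centers))]
    u = []
    for i in range(len(centers)):
        s = sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(len(centers)))
        u.append(1.0 / s)
    return u

# Hypothetical per-cluster "expected value" models: simple linear stand-ins
# for the specialized neural networks described in the abstract.
models = {0: lambda p: 2.0 * p, 1: lambda p: 1.5 * p + 25.0}
centers = [20.0, 80.0]  # assumed operating-region centers (% power)

def validate_signal(power, measured, tol=5.0):
    """Route the operating point to its cluster model and flag deviations."""
    u = fuzzy_memberships(power, centers)
    cluster = max(range(len(u)), key=lambda i: u[i])
    expected = models[cluster](power)
    return {"cluster": cluster, "expected": expected,
            "ok": abs(measured - expected) <= tol}
```

A full possibilistic treatment would additionally keep the raw (non-normalized) typicalities, so that a point far from every center is flagged as an unforeseen plant condition rather than forced into the nearest cluster.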

  16. Validation of the production process of core-equipment HYNIC-Bombesin-Sn; Validacion del proceso de produccion del nucleo-equipo HYNIC-Bombesina-Sn

    Energy Technology Data Exchange (ETDEWEB)

    Rubio C, N I [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    2008-07-01

    Process validation establishes documented evidence providing a high degree of assurance that a specific process will consistently produce a product meeting its preset specifications and quality attributes, and therefore ensures the efficiency and effectiveness of a product. The radiopharmaceutical {sup 99m}Tc-HYNIC-Bombesin belongs to the gastrin-releasing peptide (GRP) analogues of bombesin that are radiolabelled with metastable technetium-99 for obtaining molecular images. It is obtained from freeze-dried formulation kits (core-equipment) and has shown very high stability in human serum, specific binding to receptors and rapid internalization. Biodistribution data in mice showed rapid blood clearance with predominant renal excretion and specific binding to tissues with a positive response to GRP receptors. According to biokinetic studies performed on patients with breast cancer, the breasts show a marked asymmetry, with increased uptake in the neoplastic breast, whereas in healthy women the uptake of the radiopharmaceutical is symmetrical in both breasts. No adverse reactions have been reported. This paper presents the prospective validation of the core-equipment HYNIC-Bombesin-Sn, which demonstrated consistently that the product meets its preset specifications and quality attributes for the preparation of the third-generation diagnostic radiopharmaceutical {sup 99m}Tc-HYNIC-Bombesin. The process was successfully validated, thereby ensuring the efficiency and effectiveness of this diagnostic agent as a preliminary step toward approval for marketing. (Author)

  17. Modeling within- and across-channel processes in comodulation masking release

    DEFF Research Database (Denmark)

    Dau, Torsten; Piechowiak, Tobias; Ewert, Stephan D

    2013-01-01

    al., J. Acoust. Soc. Am. 124, 422-438 (2008)] was used and extended by an across-channel modulation processing stage according to Piechowiak et al. [J. Acoust. Soc. Am. 121, 2111-2126 (2007)]. Five experimental paradigms were considered: CMR with a broadband noise masker as a function of the masker...

  18. HEPA filter leaching concept validation trials at the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Chakravartty, A.C.

    1995-04-01

    The enclosed report documents six New Waste Calcining Facility (NWCF) HEPA filter leaching trials conducted at the Idaho Chemical Processing Plant using a filter leaching system to validate the filter leaching treatment concept. The test results show that a modified filter leaching system will be able to successfully remove both hazardous and radiological constituents to RCRA disposal levels. Based on the success of the filter leach trials, the existing leaching system will be modified to provide a safe, simple, effective, and operationally flexible filter leaching system

  19. Modification and Validation of an Automotive Data Processing Unit, Compressed Video System, and Communications Equipment

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.J.

    1997-04-01

    The primary purpose of the "modification and validation of an automotive data processing unit (DPU), compressed video system, and communications equipment" cooperative research and development agreement (CRADA) was to modify and validate both hardware and software, developed by Scientific Atlanta, Incorporated (S-A) for defense applications (e.g., rotary-wing aircraft), for the commercial-sector surface transportation domain (i.e., automobiles and trucks). S-A also furnished a state-of-the-art compressed video digital storage and retrieval system (CVDSRS) and off-the-shelf data storage and transmission equipment to support the data acquisition system for crash avoidance research (DASCAR) project conducted by Oak Ridge National Laboratory (ORNL). In turn, S-A received access to hardware and technology related to DASCAR. DASCAR was subsequently removed completely, and installation was repeated a number of times to gain an accurate idea of complete installation, operation, and removal of DASCAR. Upon satisfactory completion of the DASCAR construction and preliminary shakedown, ORNL provided NHTSA with an operational demonstration of DASCAR at their East Liberty, OH test facility. The demonstration included an on-the-road demonstration of the entire data acquisition system using NHTSA's test track. In addition, the demonstration also consisted of a briefing, containing the following: ORNL generated a plan for validating the prototype data acquisition system with regard to: removal of DASCAR from an existing vehicle, and installation and calibration in other vehicles; reliability of the sensors and systems; data collection and transmission process (data integrity); impact on the drivability of the vehicle and obtrusiveness of the system to the driver; data analysis procedures; conspicuousness of the vehicle to other drivers; and DASCAR installation and removal training and documentation. In order to identify any operational problems not captured by the systems

  20. Resolving Radiological Waste Classification and Release Issues Using Material Process Information and Simple Measurements and Models

    International Nuclear Information System (INIS)

    Hochel, R.C.

    1997-11-01


  1. The energy distribution structure and dynamic characteristics of energy release in electrostatic discharge process

    OpenAIRE

    Liu, Qingming; Shao, Huige; Zhang, Yunming

    2015-01-01

    The detailed structure of the energy output and the dynamic characteristics of the electric spark discharge process were studied in order to accurately calculate the energy of electric-spark-induced plasma under different discharge conditions. A series of electric spark discharge experiments was conducted with capacitor-stored energies of 10 J, 100 J, and 1000 J, respectively. The resistances of the wire, the switch, and the plasma between the electrodes were evaluated by different methods. An optimized method ...
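The bookkeeping behind such an energy evaluation can be sketched with two standard relations: the capacitor stored energy E = 0.5 C V^2, and the split of dissipated energy among the series resistances of the discharge loop (assuming roughly constant resistances). The component values below are hypothetical.

```python
def capacitor_energy(c_farad, v_volt):
    """Stored energy of a charged capacitor: E = 0.5 * C * V^2 (joules)."""
    return 0.5 * c_farad * v_volt ** 2

def plasma_energy_fraction(r_plasma, r_wire, r_switch):
    """In a series discharge loop, dissipated energy splits in proportion
    to each resistance, so the spark gap receives R_plasma / R_total."""
    total = r_plasma + r_wire + r_switch
    return r_plasma / total

# Hypothetical example: 20 uF charged to 1 kV stores 10 J; with the plasma
# accounting for half the loop resistance, about 5 J reaches the gap.
E = capacitor_energy(20e-6, 1000.0)
E_spark = E * plasma_energy_fraction(0.5, 0.2, 0.3)
```

In practice the resistances vary during the discharge, which is why the abstract emphasizes evaluating the wire, switch, and plasma resistances by different methods before apportioning the energy.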

  2. Controlled field release of a bioluminescent genetically engineered microorganism for bioremediation process monitoring and control

    Energy Technology Data Exchange (ETDEWEB)

    Ripp, S.; Nivens, D.E.; Ahn, Y.; Werner, C.; Jarrell, J. IV; Easter, J.P.; Cox, C.D.; Burlage, R.S.; Sayler, G.S.

    2000-03-01

    Pseudomonas fluorescens HK44 represents the first genetically engineered microorganism approved for field testing in the United States for bioremediation purposes. Strain HK44 harbors an introduced lux gene fused within a naphthalene degradative pathway, thereby allowing this recombinant microbe to bioluminesce as it degrades specific polyaromatic hydrocarbons such as naphthalene. The bioremediation process can therefore be monitored by the detection of light. P. fluorescens HK44 was inoculated into the vadose zone of intermediate-scale, semicontained soil lysimeters contaminated with naphthalene, anthracene, and phenanthrene, and the population dynamics were followed over an approximately 2-year period in order to assess the long-term efficacy of using strain HK44 for monitoring and controlling bioremediation processes. Results showed that P. fluorescens HK44 was capable of surviving initial inoculation into both hydrocarbon-contaminated and uncontaminated soils and was recoverable from these soils 660 days post-inoculation. It was also demonstrated that strain HK44 was capable of generating bioluminescence in response to soil hydrocarbon bioavailability. Bioluminescence approaching 166,000 counts/s was detected in fiber optic-based biosensor devices responding to volatile polyaromatic hydrocarbons, while a portable photomultiplier module detected bioluminescence at an average of 4300 counts/s directly from soil-borne HK44 cells within localized treatment areas. The utilization of lux-based bioreporter microorganisms therefore promises to be a viable option for in situ determination of environmental contaminant bioavailability and biodegradation process monitoring and control.

  3. Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

    CERN Multimedia

    2005-01-01


  4. Dynamic Modeling and Validation of a Biomass Hydrothermal Pretreatment Process - A Demonstration Scale Study

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jakobsen, Jon Geest

    2015-01-01

    Hydrothermal pretreatment of lignocellulosic biomass is a cost effective technology for second generation biorefineries. The process occurs in large horizontal and pressurized thermal reactors where the biomatrix is opened under the action of steam pressure and temperature to expose cellulose for the enzymatic hydrolysis process. Several by-products are also formed, which disturb and act as inhibitors downstream. The objective of this study is to formulate and validate a large scale hydrothermal pretreatment dynamic model based on mass and energy balances, together with a complex conversion mechanism and kinetics. The study includes a comprehensive sensitivity and uncertainty analysis, with parameter estimation from real data in the 178-185°C range. To highlight the application utility of the model, a state estimator for biomass composition is developed. The predictions capture well the dynamic trends...
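The record does not give the model equations; as a minimal sketch of the kind of conversion kinetics such a dynamic model contains, the following integrates a first-order mass balance dX/dt = k(T)(1 - X) with an Arrhenius rate constant by explicit Euler. The pre-exponential factor and activation energy are hypothetical.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def simulate_conversion(T_kelvin, minutes, A=1e10, Ea=1.1e5, dt=0.1):
    """Euler integration of dX/dt = k(T) * (1 - X), k = A * exp(-Ea / (R T)).

    A (1/s) and Ea (J/mol) are hypothetical placeholders; a real pretreatment
    model would estimate them from plant data, as the abstract describes.
    """
    k = A * math.exp(-Ea / (R * T_kelvin))
    X, t = 0.0, 0.0
    while t < minutes * 60.0:
        X += dt * k * (1.0 - X)
        t += dt
    return min(X, 1.0)
```

Running this at the two ends of the 178-185°C estimation range (451 K vs 458 K) shows the strong temperature sensitivity that makes parameter uncertainty analysis worthwhile.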

  5. Validation of precision powder injection molding process simulations using a spiral test geometry

    DEFF Research Database (Denmark)

    Marhöfer, Maximilian; Müller, Tobias; Tosello, Guido

    2015-01-01

    Like in many other areas of engineering, process simulations find application in precision injection molding to assist and optimize the quality and design of precise products and the molding process. Injection molding comprises mainly the manufacturing of plastic components. However, the variant ... for powder injection molding. This characterization includes measurements of the rheological, thermal, and pvT behavior of the powder-binder mixes. The acquired material data were used to generate new material models for the database of the commercially available Autodesk Moldflow® simulation software. The necessary data and the implementation procedure of the new material models are outlined. In order to validate the simulation studies and evaluate their accuracy, the simulation results are compared with experiments performed using a spiral test geometry.

  6. Brain regions for sound processing and song release in a small grasshopper.

    Science.gov (United States)

    Balvantray Bhavsar, Mit; Stumpner, Andreas; Heinrich, Ralf

    2017-05-01

    We investigated brain regions - mostly neuropils - that process auditory information relevant for the initiation of response songs of female grasshoppers Chorthippus biguttulus during bidirectional intraspecific acoustic communication. Male-female acoustic duets in the species Ch. biguttulus require the perception of sounds, their recognition as a species- and gender-specific signal and the initiation of commands that activate thoracic pattern generating circuits to drive the sound-producing stridulatory movements of the hind legs. To study sensory-to-motor processing during acoustic communication we used multielectrodes that allowed simultaneous recordings of acoustically stimulated electrical activity from several ascending auditory interneurons or local brain neurons and subsequent electrical stimulation of the recording site. Auditory activity was detected in the lateral protocerebrum (where most of the described ascending auditory interneurons terminate), in the superior medial protocerebrum and in the central complex, that has previously been implicated in the control of sound production. Neural responses to behaviorally attractive sound stimuli showed no or only poor correlation with behavioral responses. Current injections into the lateral protocerebrum, the central complex and the deuto-/tritocerebrum (close to the cerebro-cervical fascicles), but not into the superior medial protocerebrum, elicited species-typical stridulation with high success rate. Latencies and numbers of phrases produced by electrical stimulation were different between these brain regions. Our results indicate three brain regions (likely neuropils) where auditory activity can be detected with two of these regions being potentially involved in song initiation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Retraction: Calculation and modeling of the energy released in result of water freezing process (WFP)

    Directory of Open Access Journals (Sweden)

    M. Ghodsi Hassanabad

    Full Text Available This article has been retracted: please see Elsevier Policy on Article Withdrawal (https://www.elsevier.com/about/our-business/policies/article-withdrawal.After a thorough investigation, the Editors have concluded that the acceptance of this article was based upon the positive advice of at least one illegitimate reviewer report. The report was submitted from an email account which was provided to the journal as a suggested reviewer during the submission of the article. Although purportedly a real reviewer account, the Editors have concluded that this was not of an appropriate, independent reviewer.This manipulation of the peer-review process represents a clear violation of the fundamentals of peer review, our publishing policies, and publishing ethics standards. Apologies are offered to the reviewers whose identities were assumed and to the readers of the journal that this deception was not detected during the submission process.Further, no reason has been provided for the addition of the author names M. Ghodsi Hassanabad and A. Dehghani Mehrbadi to the authorship of the revised article.

  8. PCB and PAH release from power stations and waste incineration processes in the UK

    Energy Technology Data Exchange (ETDEWEB)

    Dyke, Patrick H. [PD Consulting, Magdalen, Brobury, HR3 6DX (United Kingdom); Foan, Colin [The Environment Agency, National Centre for Risk Analysis and Options Appraisal, Kings Meadow House, Kings Meadow Road, Reading, (United Kingdom); Fiedler, Heidelore [United Nations Environment Programme (UNEP) Chemicals, 11-13, chemin des Anemones, CH-1219, Chatelaine (Switzerland)

    2003-01-01

    This study focused on emissions of polychlorinated biphenyls (PCB) and polycyclic aromatic hydrocarbons (PAH) from incineration and power generation processes. Increased concern over human exposure to both classes of compounds has meant that environmental regulators need to assess the contribution made by emissions from regulated processes to human exposure. In the first part of an assessment in the UK we reviewed literature data on emissions of PCB, focusing on the dioxin-like PCB assigned toxic equivalency factors by the World Health Organization, and PAH. The literature study was supplemented by a series of plant tests to gather initial real plant data. Literature data were limited and the lack of standard protocols for measurement and reporting of both PCB and PAH meant that few data sets were comparable. Levels of dioxin-like PCB reported in the literature and measured in UK plant tests showed that well-controlled modern combustion plants with comprehensive pollution controls gave low emissions, typically about 5-10% of the toxic equivalent of the emissions of polychlorinated dibenzodioxins and dibenzofurans at the same plants and below the widely used standard of 0.1 ng TEQ/N m{sup 3}. (Author)
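A TEQ figure such as the 0.1 ng TEQ/Nm3 standard cited above is computed as the sum of each congener concentration multiplied by its toxic equivalency factor (TEF). A minimal sketch, with illustrative TEF values that should be verified against the current WHO tables before any real use:

```python
# Illustrative WHO-style toxic equivalency factors for dioxin-like PCB
# congeners (verify against the current WHO tables before real use).
TEF = {"PCB-126": 0.1, "PCB-169": 0.03, "PCB-77": 0.0001}

def teq(concentrations_ng_per_nm3):
    """TEQ = sum over congeners of concentration x TEF (ng TEQ/Nm3)."""
    return sum(c * TEF[name] for name, c in concentrations_ng_per_nm3.items())

# Hypothetical stack-gas sample: 0.05 + 0.03 + 0.002 = 0.082 ng TEQ/Nm3
sample = {"PCB-126": 0.5, "PCB-169": 1.0, "PCB-77": 20.0}
limit = 0.1  # ng TEQ/Nm3, the standard cited in the abstract
value = teq(sample)
```

This weighting is why a well-controlled plant can emit measurable PCB mass yet remain below the 0.1 ng TEQ/Nm3 standard: only the strongly dioxin-like congeners contribute significantly.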

  9. The Signal Validation method of Digital Process Instrumentation System on signal conditioner for SMART

    International Nuclear Information System (INIS)

    Moon, Hee Gun; Park, Sang Min; Kim, Jung Seon; Shon, Chang Ho; Park, Heui Youn; Koo, In Soo

    2005-01-01

    The function of the Process Instrumentation System (PIS) for SMART is to acquire process data from sensors or transmitters. The PIS consists of a signal conditioner, an A/D converter, a digital signal processor (DSP) and a network interface card (NIC); downstream of the A/D converter it is a fully digital system. In a commercial plant, the PI cabinet and the Plant Data Acquisition System (PDAS) are responsible for data acquisition from sensors and transmitters, including RTD, TC, level, flow, pressure and so on. The PDAS has software that processes each sensor's data, and the PI cabinet has the signal conditioner needed for maintenance and testing. The signal conditioner has potentiometers to adjust span and zero for test and maintenance. The PIS of SMART also has a signal conditioner with span and zero adjustment, as in a commercial plant, because the signal conditioner conditions the signal for the A/D converter to a range such as 0-10 Vdc. However, adjusting span and zero is a manual test and calibration procedure. This paper therefore presents a method of signal validation and calibration that exploits the digital features of SMART. The converters include I/E (current to voltage), R/E (resistance to voltage), F/E (frequency to voltage), V/V (voltage to voltage), etc. This paper shows only the signal validation and calibration for the I/E converter, which converts level, pressure and flow signals of 4-20 mA into signals for A/D conversion of 0-10 Vdc.
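The I/E conversion described above is a linear mapping of the 4-20 mA loop signal onto the 0-10 Vdc A/D input range; a simple validation step can reject signals outside the live-zero loop range. A minimal sketch (the range limits are the ones quoted in the abstract):

```python
def ie_convert(current_ma, i_min=4.0, i_max=20.0, v_min=0.0, v_max=10.0):
    """Linear I/E conversion: map a 4-20 mA loop signal to 0-10 V.

    A reading below 4 mA (the live zero) or above 20 mA indicates a broken
    loop or a faulty transmitter, so it is rejected rather than converted.
    """
    if not (i_min <= current_ma <= i_max):
        raise ValueError(
            f"signal {current_ma} mA outside {i_min}-{i_max} mA loop range")
    frac = (current_ma - i_min) / (i_max - i_min)
    return v_min + frac * (v_max - v_min)
```

Because the mapping is fully determined by its two endpoints, a digital system can verify span and zero by injecting reference currents, replacing the manual potentiometer adjustment the abstract describes.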

  10. Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1

    Directory of Open Access Journals (Sweden)

    Thomas Zahel

    2017-10-01

    Full Text Available Identification of critical process parameters that impact product quality is a central task during regulatory-requested process validation. Commonly, this is done via design of experiments and identification of parameters that significantly impact product quality (rejection of the null hypothesis that the effect equals 0). However, parameters that show a large uncertainty, and that might drive product quality beyond an undesirable critical limit, may be missed. This can occur during the evaluation of experiments when the residual/un-modelled variance in the experiments is larger than expected a priori. Estimating such a risk is the task of the presented novel retrospective power analysis permutation test. This is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance and increasing patient safety.
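The retrospective power analysis permutation test itself is not reproduced in the abstract; as a sketch of the underlying permutation machinery, the following estimates a p-value for the effect of a two-level process parameter by shuffling responses between the two levels. The data and permutation count are hypothetical.

```python
import random

def permutation_test(y_low, y_high, n_perm=5000, seed=1):
    """Two-sided permutation p-value for a two-level parameter effect.

    The observed effect is the difference of group means; the null
    distribution is built by shuffling responses between the groups.
    """
    rng = random.Random(seed)
    observed = sum(y_high) / len(y_high) - sum(y_low) / len(y_low)
    pooled = list(y_low) + list(y_high)
    n_low = len(y_low)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = (sum(pooled[n_low:]) / len(y_high)
                - sum(pooled[:n_low]) / n_low)
        if abs(diff) >= abs(observed):
            hits += 1
    # add-one correction keeps the estimate away from an exact zero
    return observed, (hits + 1) / (n_perm + 1)
```

A retrospective power check in the spirit of the abstract would then ask, for each non-significant parameter, how large an effect this test could have detected given the residual variance actually observed.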

  11. Identification and validation of multiple cell surface markers of clinical-grade adipose-derived mesenchymal stromal cells as novel release criteria for good manufacturing practice-compliant production.

    Science.gov (United States)

    Camilleri, Emily T; Gustafson, Michael P; Dudakovic, Amel; Riester, Scott M; Garces, Catalina Galeano; Paradise, Christopher R; Takai, Hideki; Karperien, Marcel; Cool, Simon; Sampen, Hee-Jeong Im; Larson, A Noelle; Qu, Wenchun; Smith, Jay; Dietz, Allan B; van Wijnen, Andre J

    2016-08-11

    Clinical translation of mesenchymal stromal cells (MSCs) necessitates basic characterization of the cell product since variability in biological source and processing of MSCs may impact therapeutic outcomes. Although expression of classical cell surface markers (e.g., CD90, CD73, CD105, and CD44) is used to define MSCs, identification of functionally relevant cell surface markers would provide more robust release criteria and options for quality control. In addition, cell surface expression may distinguish between MSCs from different sources, including bone marrow-derived MSCs and clinical-grade adipose-derived MSCs (AMSCs) grown in human platelet lysate (hPL). In this work we utilized quantitative PCR, flow cytometry, and RNA-sequencing to characterize AMSCs grown in hPL and validated non-classical markers in 15 clinical-grade donors. We characterized the surface marker transcriptome of AMSCs, validated the expression of classical markers, and identified nine non-classical markers (i.e., CD36, CD163, CD271, CD200, CD273, CD274, CD146, CD248, and CD140B) that may potentially discriminate AMSCs from other cell types. More importantly, these markers exhibit variability in cell surface expression among different cell isolates from a diverse cohort of donors, including freshly prepared, previously frozen, or proliferative state AMSCs, and may be informative when manufacturing cells. Our study establishes that clinical-grade AMSCs expanded in hPL represent a homogeneous cell culture population according to classical markers. Additionally, we validated new biomarkers for further AMSC characterization that may provide novel information guiding the development of new release criteria. Use of Autologous Bone Marrow Aspirate Concentrate in Painful Knee Osteoarthritis (BMAC): Clinicaltrials.gov NCT01931007 . Registered August 26, 2013. MSC for Occlusive Disease of the Kidney: Clinicaltrials.gov NCT01840540 . Registered April 23, 2013. 
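Release criteria built on such markers ultimately reduce to threshold checks on measured surface-marker expression. A minimal sketch with hypothetical acceptance thresholds; real criteria must come from the validated manufacturing specification, not from this illustration.

```python
# Hypothetical acceptance thresholds (percent positive cells). CD45 is used
# here as an example negative marker; the limits are illustrative only.
CLASSICAL_MIN = {"CD90": 95.0, "CD73": 95.0, "CD105": 95.0, "CD44": 95.0}
NEGATIVE_MAX = {"CD45": 2.0}

def release_gate(panel):
    """Return (pass/fail, list of failing markers) for a flow-cytometry panel."""
    failures = [m for m, lim in CLASSICAL_MIN.items()
                if panel.get(m, 0.0) < lim]
    failures += [m for m, lim in NEGATIVE_MAX.items()
                 if panel.get(m, 0.0) > lim]
    return (len(failures) == 0, failures)
```

The abstract's point is that gates like this, based only on classical markers, pass nearly every isolate; adding the validated non-classical markers would let the gate discriminate between otherwise indistinguishable cell products.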
Mesenchymal Stem Cell Therapy in Multiple

  12. Fate of stable strontium in the sewage treatment process as an analog for radiostrontium released by nuclear accidents

    International Nuclear Information System (INIS)

    Kamei-Ishikawa, Nao; Ito, Ayumi; Umita, Teruyuki

    2013-01-01

    Highlights: • 76% of the Sr entering the plant was discharged to receiving water. • 21% of the Sr flowing through the plant was transferred to the sewage sludge. • Almost all of the Sr in the sewage sludge was concentrated in incinerated sewage sludge ash. • Activated sludge had a lower sorption capacity for Sr than metals such as Cd. -- Abstract: Radionuclides were widely released into the environment by the nuclear accident at the Fukushima Daiichi Nuclear Power Plant. Some of these radionuclides have flowed into municipal sewage treatment plants through sewer systems. We observed the fate of stable Sr in the sewage treatment process as a means to predict the fate of radiostrontium. Concentrations of stable Sr were determined in sewage influent, effluent, dewatered sludge, and incinerated sewage sludge ash collected from a sewage treatment plant once a month from July to December 2011. In the mass balance of Sr in the sewage treatment plant, 76% of the Sr entering the plant was discharged to the receiving water on average. Additionally, 14% of the Sr flowing through the plant was transferred to the sewage sludge and then concentrated in the sludge ash without being released to the atmosphere. We also investigated Sr sorption by activated sludge in a batch experiment. Measurements at 3 and 6 h after contact showed that Sr was sorbed by the activated sludge; however, the measurements indicated that Sr desorption from the activated sludge occurred 48 h after contact
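The reported mass balance can be checked with simple arithmetic: summing the measured outflow fractions against the influent load shows how much of the Sr is unaccounted for. Using the 76% (effluent) and 14% (sludge) figures from the abstract body:

```python
def mass_balance_closure(inflow, outflows):
    """Fraction of the influent load accounted for by the measured outflows."""
    return sum(outflows.values()) / inflow

# Illustrative figures based on the fractions reported in the abstract;
# the influent load is an arbitrary reference quantity.
influent = 100.0
paths = {"effluent": 76.0, "sludge": 14.0}
closure = mass_balance_closure(influent, paths)  # 0.90
```

A closure of 0.90 leaves about 10% of the load unaccounted for, which is within the kind of sampling and measurement uncertainty typical of plant-scale balances (note the record's highlight quotes 21% to sludge while the abstract body quotes 14%).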

  13. When Medical News Comes from Press Releases-A Case Study of Pancreatic Cancer and Processed Meat.

    Directory of Open Access Journals (Sweden)

    Joseph W Taylor

    Full Text Available The media have a key role in communicating advances in medicine to the general public, yet the accuracy of medical journalism is an under-researched area. This project adapted an established monitoring instrument to analyse all identified news reports (n = 312 on a single medical research paper: a meta-analysis published in the British Journal of Cancer which showed a modest link between processed meat consumption and pancreatic cancer. Our most significant finding was that three sources (the journal press release, a story on the BBC News website and a story appearing on the 'NHS Choices' website appeared to account for the content of over 85% of the news stories which covered the meta analysis, with many of them being verbatim or moderately edited copies and most not citing their source. The quality of these 3 primary sources varied from excellent (NHS Choices, 10 of 11 criteria addressed to weak (journal press release, 5 of 11 criteria addressed, and this variance was reflected in the accuracy of stories derived from them. Some of the methods used in the original meta-analysis, and a proposed mechanistic explanation for the findings, were challenged in a subsequent commentary also published in the British Journal of Cancer, but this discourse was poorly reflected in the media coverage of the story.

  14. A Mathematical Model for Scheduling a Batch Processing Machine with Multiple Incompatible Job Families, Non-identical Job dimensions, Non-identical Job sizes, Non-agreeable release times and due dates

    International Nuclear Information System (INIS)

    Ramasubramaniam, M; Mathirajan, M

    2013-01-01

    The paper addresses the problem of scheduling a batch processing machine with multiple incompatible job families, non-identical job dimensions, non-identical job sizes and non-agreeable release dates to minimize makespan. The research problem is solved by proposing a mixed integer programming model that appropriately takes into account the parameters considered in the problem. The proposed model is validated using a numerical example. The experiments conducted show that the model can pose significant difficulties in solving large-scale instances. The paper concludes by giving the scope for future work and some alternative approaches one can use for solving this class of problems.
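The mixed integer programming model is not reproduced in the record; the constraint structure it encodes (jobs in a batch must share a family, batch job sizes must fit the machine capacity, and a batch cannot start before its latest job release) can be illustrated with a brute-force search over batch partitions and sequences. The job data and per-family processing times below are hypothetical.

```python
from itertools import permutations

# A job is (family, size, release_time); processing time is assumed to be
# determined by the family of the batch.
JOBS = [("A", 2, 0), ("A", 3, 0), ("B", 4, 2), ("B", 1, 2)]
CAPACITY = 5
PROC_TIME = {"A": 4, "B": 3}  # hypothetical per-family batch times

def partitions(items):
    """Yield all ways to split items into non-empty batches."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield part + [[first]]

def feasible(batch):
    """Incompatible families may not be mixed; sizes must fit the machine."""
    families = {job[0] for job in batch}
    return len(families) == 1 and sum(job[1] for job in batch) <= CAPACITY

def makespan(batch_order):
    """Sequential processing; each batch waits for its latest release."""
    t = 0
    for batch in batch_order:
        start = max(t, max(job[2] for job in batch))
        t = start + PROC_TIME[batch[0][0]]
    return t

def best_makespan(jobs):
    """Exhaustive search: optimal only for tiny instances, by design."""
    best = None
    for part in partitions(jobs):
        if all(feasible(b) for b in part):
            for order in permutations(part):
                ms = makespan(list(order))
                best = ms if best is None or ms < best else best
    return best
```

The exponential blow-up of this enumeration (Bell numbers times factorials) is exactly why the paper reports that even the MIP formulation struggles on large instances.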

  15. Basic study of influence of radiation defects on tritium release processes from lithium silicates

    Energy Technology Data Exchange (ETDEWEB)

    Abramenkovs, A.; Tiliks, J.; Kizane, G.; Supe, A. [Latvia Univ., Riga (Latvia). Dept. of Chem.; Grishmanovs, V. [Department of Quantum Engineering and System Science, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113 (Japan)

    1997-09-01

    The radiolysis of Li{sub 2}SiO{sub 3} and Li{sub 4}SiO{sub 4} was studied using the chemical scavengers method (CSM), thermoluminescence, lyoluminescence, electron spin resonance and spectrometric methods. The influence of the absorbed dose and of many other parameters, such as irradiation conditions, sample preparation conditions and concentration of impurities, on the accumulation rate of each type of RD and RP was studied. Several possibilities for reducing the radiolysis of silicates were discussed. It was found that tritium localization on the surface and in grains proceeds by two different mechanisms. Tritium thermoextraction from the surface proceeds as chemical desorption of tritiated water, but from the bulk as diffusion. The tritium retention processes were studied. It was found that tritium retention depends on the irradiation conditions. Tritium retention is due to the formation of chemical Li-T bonds and thermally stable {identical_to}Si-T bonds. The accumulation of colloidal silicon and lithium can increase the tritium retention up to 25-35%. (orig.).

  16. Evaluation of occupational exposure to toxic metals released in the process of aluminum welding.

    Science.gov (United States)

    Matczak, Wanda; Gromiec, Jan

    2002-04-01

    The objective of this study was to evaluate the occupational exposure of aluminum welders in Polish industry to welding fumes and their elements. The study included 52 MIG/Al fume samples and 18 TIG/Al samples in 3 plants. Air samples were collected in the breathing zone of welders (total and respirable dust). Dust concentration was determined gravimetrically, and the elements in the collected dust were determined by AAS. Mean time-weighted average (TWA) concentrations of the welding dusts/fumes and their components in the breathing zone obtained for different welding processes were, in mg/m³: MIG/Al fumes 6.0 (0.8-17.8), Al 2.1 (0.1-7.7), Mg 0.2 (…); TIG/Al fumes 0.7 (0.3-1.4), Al 0.17 (0.07-0.50). A correlation was found between the concentration of the main components and the fume/dust concentrations in MIG/Al and TIG/Al fumes. Mean percentages of the individual components in MIG/Al fumes/dusts were Al: 30 (9-56) percent; Mg: 3 (1-5.6) percent; Mn: 0.2 (0.1-0.3) percent; Cu: 0.2 (…) percent. Exposure levels depended on the welding methods, the nature of welding-related operations, and work environment conditions.
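    A TWA concentration of the kind reported above is a duration-weighted mean over the sampling intervals of a shift; a minimal sketch with hypothetical sample values (not the study's measurements):

```python
# Hypothetical personal air samples over an 8-h shift: (duration in h, concentration in mg/m3)
samples = [(2.0, 8.5), (3.0, 5.2), (3.0, 4.1)]

total_time = sum(t for t, _ in samples)
twa = sum(t * c for t, c in samples) / total_time  # time-weighted average, mg/m3
print(round(twa, 2))
```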

  17. Contribution of the different erosion processes to material release from the vessel walls of fusion devices during plasma operation

    International Nuclear Information System (INIS)

    Behrisch, R.

    2002-01-01

    In high temperature plasma experiments several processes contribute to erosion and loss of material from the vessel walls. This material may enter the plasma edge and the central plasma where it acts as impurities. It will finally be re-deposited at other wall areas. These erosion processes are: evaporation due to heating of wall areas. At very high power deposition evaporation may become very large, which has been named ''blooming''. Large evaporation and melting at some areas of the vessel wall surface may occur during heat pulses, as observed in plasma devices during plasma disruptions. At tips on the vessel walls and/or hot spots on the plasma exposed solid surfaces electrical arcs between the plasma and the vessel wall may ignite. They cause the release of ions, atoms and small metal droplets, or of carbon dust particles. Finally, atoms from the vessel walls are removed by physical and chemical sputtering caused by the bombardment of the vessel walls with ions as well as energetic neutral hydrogen atoms from the boundary plasma. All these processes have been, and are, observed in today's plasma experiments. Evaporation can in principle be controlled by very effective cooling of the wall tiles, arcing is reduced by very stable plasma operation, and sputtering by ions can be reduced by operating with a cold plasma in front of the vessel walls. However, sputtering by energetic neutrals, which impinge on all areas of the vessel walls, is likely to be the most critical process because ions lost from the plasma recycle as neutrals or have to be refuelled by neutrals leading to the charge exchange processes in the plasma. In order to quantify the wall erosion, ''materials factors'' (MF) have been introduced in the following for the different erosion processes. (orig.)

  18. BICYCLE HELMET DESIGN AND THE VIRTUAL VALIDATION OF THE IMPACT, AERODYNAMICS AND PRODUCTION PROCESS

    Directory of Open Access Journals (Sweden)

    Bojan Boshevski

    2017-12-01

    This paper presents the development process of a bicycle helmet through individual research, creation, presentation and analysis of the results of the most important product development stages. The quality of the development and manufacturing process of protective equipment for extreme sports is imperative for a successful product and its flawless function. The bicycle helmet was designed following established design principles in order to create a well-grounded and functional product. After creating design sketches, a virtual prototype was developed in "SolidWorks" using the required ergonomic dimensions. A 3D-printed model of the human head with adapted ergonomic dimensions, together with the designed bicycle helmet, was produced in order to verify the applied ergonomic measures. The virtual model will be used as an input in the finite element analysis of the helmet impact test based on the EN1078 standard and in the aerodynamic simulations executed in "SolidWorks Simulation and Flow Simulation", for verification of the impact and aerodynamic properties. Virtual testing of the aerodynamic features and of the helmet's ability to ventilate the user's head indicates that the helmet performs its function in the desired way. The virtual prototype will also be used for production process simulation in "SolidWorks Plastics" in order to analyze the production of the bicycle helmet. The polycarbonate outer shell is subjected to a number of simulations of the production process in order to obtain its desired characteristics and to avoid the defects that occur in the manufacturing process. The main goal of this paper is to develop a safety bicycle helmet with improved ergonomics and validated impact behaviour, aerodynamic characteristics and production process, in order to produce a high-quality product for mass use.

  19. Benchmarking Multilayer-HySEA model for landslide generated tsunami. NTHMP validation process.

    Science.gov (United States)

    Macias, J.; Escalante, C.; Castro, M. J.

    2017-12-01

    Landslide tsunami hazard may be dominant along significant parts of the coastline around the world, in particular in the USA, as compared to hazards from other tsunamigenic sources. This fact motivated the NTHMP to benchmark models for landslide-generated tsunamis, following the same methodology already used for standard tsunami models with a seismic source. To perform the above-mentioned validation process, a set of candidate benchmarks was proposed. These benchmarks are based on a subset of available laboratory data sets for solid slide experiments and deformable slide experiments, and include both submarine and subaerial slides. A benchmark based on a historic field event (Valdez, AK, 1964) closes the list of proposed benchmarks, for a total of seven. The Multilayer-HySEA model, which includes non-hydrostatic effects, has been used to perform all the benchmark problems dealing with laboratory experiments proposed in the workshop organized at Texas A&M University - Galveston on January 9-11, 2017 by the NTHMP. The aim of this presentation is to show some of the latest numerical results obtained with the Multilayer-HySEA (non-hydrostatic) model in the framework of this validation effort. Acknowledgements: This research has been partially supported by the Spanish Government Research project SIMURISK (MTM2015-70490-C02-01-R) and the University of Malaga, Campus de Excelencia Internacional Andalucía Tech. The GPU computations were performed at the Unit of Numerical Methods (University of Malaga).

  20. Session 3, Measurement systems and signal validation/processing: Rapporteur's report

    International Nuclear Information System (INIS)

    Shepard, R.L.

    1991-01-01

    Eight papers scheduled for presentation dealt with in-core flux and temperature detectors and the interpretation of their signals. One theme discussed was how core models could be used to validate in-core detector signals, and conversely, how the detector signals could validate the core models. Methods were proposed for distinguishing between detector malfunction (invalid signals) and actual changes in core conditions. It is necessary to reconcile these conflicting possibilities so that accurate and timely assessments of the present and future state of the core may be made during reactor operation, particularly during upset conditions. A second theme addressed the advantages and disadvantages of fixed vs. movable in-core detectors -- their characteristics, employment, and signal interpretation. The economic and operating tradeoffs of fixed and movable detectors were addressed. A third theme was the use of signal processing to distinguish between gamma noise and neutron flux signals and to improve the response times of in-core detectors. The discussion in this session relates to a broader discussion of the relative merits of self-powered neutron detectors and gamma thermometers for in-core flux monitoring which took place at the Cadarache meeting in 1988, and which was continued in Session 1 of this meeting

  1. Development and validation of a notational system to study the offensive process in football.

    Science.gov (United States)

    Sarmento, Hugo; Anguera, Teresa; Campaniço, Jorge; Leitão, José

    2010-01-01

    The most striking change within football development is the application of science to its problems and, in particular, the use of increasingly sophisticated technology that, supported by scientific data, allows us to establish a "code of reading" for the reality of the game. This study therefore describes the development and validation of an ad hoc categorization system that allows the different methods of offensive play in football, and their interactions, to be analyzed. Through an exploratory phase of the study, we identified 10 core criteria and the respective observable behaviors for each of these criteria. A panel of five experts was consulted for the purpose of content validation. The resulting instrument is characterized by a combination of field formats and category systems. The reliability of the instrument was calculated by intraobserver agreement, and values above 0.95 were achieved for all criteria. Two FC Barcelona games were coded and analyzed, which allowed the detection of various T-patterns. The results show that the instrument serves the purpose for which it was developed and can provide important information for the understanding of game interaction in football.
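    Intraobserver agreement of the kind reported above (values over 0.95) is often quantified with Cohen's kappa, which corrects raw agreement for chance; a self-contained sketch in which the event codes and the two codings of the same game are invented for illustration:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two codings of the same sequence of events."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    pa, pb = Counter(codes_a), Counter(codes_b)
    expected = sum(pa[c] * pb[c] for c in set(codes_a) | set(codes_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# the same observer coding the same game twice (hypothetical codes)
first  = ["pass", "dribble", "pass", "shot", "pass", "dribble"]
second = ["pass", "dribble", "pass", "shot", "pass", "pass"]
print(round(cohens_kappa(first, second), 2))
```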

  2. Development and validation of a CFD model predicting the backfill process of a nuclear waste gallery

    International Nuclear Information System (INIS)

    Gopala, Vinay Ramohalli; Lycklama a Nijeholt, Jan-Aiso; Bakker, Paul; Haverkate, Benno

    2011-01-01

    Research highlights: → This work presents the CFD simulation of the backfill process of Supercontainers with nuclear waste emplaced in a disposal gallery. → The cement-based material used for backfill is grout, and the flow of grout is modelled as a Bingham fluid. → The model is verified against an analytical solution and validated against flowability tests for concrete. → Comparison between a backfill plexiglass experiment and the simulation shows a distinct difference in the filling pattern. → The numerical model needs to be further developed to include segregation effects and the thixotropic behavior of grout. - Abstract: Nuclear waste material may be stored in underground tunnels for long-term storage. The example treated in this article is based on the current Belgian disposal concept for High-Level Waste (HLW), in which the nuclear waste material is packed in concrete shielded packages, called Supercontainers, which are inserted into these tunnels. After placement of the packages in the underground tunnels, the remaining voids between the packages and the tunnel lining are filled with a cement-based material called grout in order to encase the stored containers into the underground spacing. This encasement of the stored containers inside the tunnels is known as the backfill process. A good backfill process is necessary to stabilize the waste gallery against ground settlements. A numerical model simulating the backfill process can help to improve and optimize the process by ensuring a homogeneous filling with no air voids and by optimizing the injection positions. The objective of the present work is to develop such a numerical code that can predict the backfill process well, and to validate the model against the available experiments and analytical solutions. In the present work the rheology of grout is modelled as a Bingham fluid which is implemented in OpenFOAM - a finite volume-based open source computational fluid
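    A Bingham fluid resists flow below a yield stress and flows with a plastic viscosity above it; in CFD codes this is commonly handled with a regularised effective viscosity (e.g. the Papanastasiou form). The sketch below illustrates the idea; the yield stress and plastic viscosity are hypothetical, not the study's grout parameters.

```python
import math

def bingham_effective_viscosity(gamma_dot, tau_y=20.0, mu_p=0.05, m=1000.0):
    """Papanastasiou-regularised Bingham viscosity [Pa.s] at shear rate gamma_dot [1/s].
    tau_y: yield stress [Pa]; mu_p: plastic viscosity [Pa.s] (hypothetical grout values);
    m: regularisation parameter [s] controlling the transition sharpness."""
    return mu_p + tau_y * (1.0 - math.exp(-m * gamma_dot)) / max(gamma_dot, 1e-12)

# Below the yield stress the material barely deforms: huge apparent viscosity
# at a vanishing shear rate, modest viscosity once the grout is flowing.
print(bingham_effective_viscosity(1e-4) > bingham_effective_viscosity(10.0))
```

    The regularisation avoids the singular "infinite viscosity" of the ideal Bingham model at zero shear rate, which is what makes the constitutive law tractable in a finite-volume solver.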

  3. Effect of processing on polyamine content and bioactive peptides released after in vitro gastrointestinal digestion of infant formulas.

    Science.gov (United States)

    Gómez-Gallego, C; Recio, I; Gómez-Gómez, V; Ortuño, I; Bernal, M J; Ros, G; Periago, M J

    2016-02-01

    This study examined the influence of processing on polyamines and peptide release after the digestion of a commercial infant formula designed for children during the first months of life. Polyamine oxidase activity was not suppressed during the manufacturing process, which implies that polyamine concentrations were reduced over time during the infant formula's shelf-life. In vitro gastrointestinal digestion of samples with reduced amounts of enzymes and digestion times showed an increase in protein digestibility, reflected in the increase in nonprotein nitrogen after digestion and the disappearance of the β-lactoglobulin and α-lactalbumin bands in gel electrophoresis. Depending on the sample, between 22 and 87 peptides were identified after gastrointestinal digestion. A peptide from β-casein f(98-105) with the sequence VKEAMAPK and antioxidant activity appeared in all of the samples. Other peptides with antioxidant, immunomodulatory, and antimicrobial activities were frequently found, which could have an effect on infant health. The present study confirms that the infant formula manufacturing process determines the polyamine content and the peptide profile after digestion. Because compositional dissimilarity between human milk and infant formula in polyamines and proteins could be responsible for some of the differences in health reported between breast-fed and formula-fed children, these changes must be taken into consideration, as they may have a great effect on infant nutrition and development. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  4. Impact of the Bergeron-Findeisen process on the release of aerosol particles during the evolution of cloud ice

    Science.gov (United States)

    Schwarzenböck, A.; Mertes, S.; Heintzenberg, J.; Wobrock, W.; Laj, P.

    The paper focuses on the redistribution of aerosol particles (APs) during the artificial nucleation and subsequent growth of ice crystals in a supercooled cloud. A significant number of the supercooled cloud droplets during icing periods (seeding agents: C₃H₈, CO₂) did not freeze as was presumed prior to the experiment but instead evaporated. The net mass flux of water vapour from the evaporating droplets to the nucleating ice crystals (Bergeron-Findeisen mechanism) led to the release of residual particles that simultaneously appeared in the interstitial phase. The strong decrease of the droplet residuals confirms the nucleation of ice particles on seeding germs without natural aerosol particles serving as ice nuclei. As the number of residual particles during the seedings did not drop to zero, other processes such as heterogeneous ice nucleation, spontaneous freezing, entrainment of supercooled droplets and diffusion to the created particle-free ice germs must have contributed to the experimental findings. During the icing periods, residual mass concentrations in the condensed phase dropped by a factor of 1.1-6.7, as compared to the unperturbed supercooled cloud. As the Bergeron-Findeisen process also occurs without artificial seeding in the atmosphere, this study demonstrated that the hydrometeors in mixed-phase clouds might be much cleaner than anticipated from the simple freezing of supercooled droplets in tropospheric mid-latitude clouds.

  5. Development process and initial validation of the Ethical Conflict in Nursing Questionnaire-Critical Care Version.

    Science.gov (United States)

    Falcó-Pegueroles, Anna; Lluch-Canut, Teresa; Guàrdia-Olmos, Joan

    2013-06-01

    Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables 'frequency' and 'degree of conflict'. In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable 'exposure to conflict', as well as considering six 'types of ethical conflict'. An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validation, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model
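    Cronbach's alpha, the reliability coefficient reported above, is computed from the item variances and the variance of the respondents' total scores; a minimal sketch with invented item scores (not the ECNQ-CCV data):

```python
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, respondents in the same order."""
    k = len(items)
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# three items scored by four respondents (hypothetical data)
items = [[2, 4, 3, 5], [3, 5, 4, 5], [1, 4, 3, 4]]
print(round(cronbach_alpha(items), 2))
```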

  6. Process-oriented tests for validation of baroclinic shallow water models: The lock-exchange problem

    Science.gov (United States)

    Kolar, R. L.; Kibbey, T. C. G.; Szpilka, C. M.; Dresback, K. M.; Tromble, E. M.; Toohey, I. P.; Hoggan, J. L.; Atkinson, J. H.

    A first step often taken to validate prognostic baroclinic codes is a series of process-oriented tests, such as those suggested by Haidvogel and Beckmann [Haidvogel, D., Beckmann, A., 1999. Numerical Ocean Circulation Modeling. Imperial College Press, London], among others. One of these tests is the so-called "lock-exchange" test or "dam break" problem, wherein water of different densities is separated by a vertical barrier, which is removed at time zero. Validation against these tests has primarily consisted of comparing the propagation speed of the wave front, as predicted by various theoretical and experimental results, to model output. In addition, inter-model comparisons of the lock-exchange test have been used to validate codes. Herein, we present a high-resolution data set, taken from a laboratory-scale model, for direct and quantitative comparison of experimental and numerical results throughout the domain, not just at the wave front. Data are captured every 0.2 s using high-resolution digital photography, with salt concentration extracted by comparing the pixel intensity of the dyed fluid against calibration standards. Two scenarios are discussed in this paper, symmetric and asymmetric mixing, depending on the proportion of dense/light water (17.5 ppt/0.0 ppt) in the experiment; the Boussinesq approximation applies to both. Front speeds, cast in terms of the dimensionless Froude number, show excellent agreement with literature-reported values. Data are also used to quantify the degree of mixing, as measured by the front thickness, which also provides an error band on the front speed. Finally, experimental results are used to validate baroclinic enhancements to the barotropic shallow water ADvanced CIRCulation (ADCIRC) model, including the effect of the vertical mixing scheme on simulation results.
Based on salinity data, the model provides an average root-mean-square (rms) error of 3.43 ppt for the symmetric case and 3.74 ppt for the asymmetric case, most of which can
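    The dimensionless front speed mentioned above follows from the reduced gravity of the density contrast; a back-of-the-envelope sketch using Benjamin's energy-conserving result (Froude number 1/2 based on full depth). The tank depth and the salinity-to-density rule of thumb are assumptions for illustration, not the paper's values.

```python
import math

g = 9.81            # m/s^2
H = 0.10            # m, water depth in a hypothetical lab-scale tank
rho0 = 998.0        # kg/m^3, fresh water
drho = 0.78 * 17.5  # kg/m^3; ~0.78 kg/m^3 per ppt of salinity (rule of thumb)

g_prime = g * drho / rho0               # reduced gravity, m/s^2
u_front = 0.5 * math.sqrt(g_prime * H)  # gravity-current front speed, Fr = 1/2
print(round(u_front, 3))
```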

  7. Electron sterilization validation techniques using the controlled depth of sterilization process

    International Nuclear Information System (INIS)

    Cleghorn, D.A.; Nablo, S.V.

    1990-01-01

    Many pharmaceutical products, especially parenteral drugs, cannot be sterilized with gamma rays or high-energy electrons due to the concomitant product degradation. In view of the well-controlled electron energy spectrum available in modern electron processors, it is practical to deliver sterilizing doses over depths considerably less than those defining the thickness of blister-pack constructions or pharmaceutical containers. Because bremsstrahlung and X-ray production are minimized at these low electron energies and in these low-Z materials, very high ratios of electron dose to penetrating X-ray dose are possible for the application of the technique. Thin-film dosimetric techniques have been developed utilizing radiochromic film in the 10-60 g/m² range for determining the surface dose distribution in occluded surface areas where direct electron illumination is not possible. Procedures for validation of the process using dried spore inoculum on the product, as well as in good geometry, are employed to determine the process lethality and its dependence on product surface geometry. Applications of the process to labile pharmaceuticals in glass and polystyrene syringes are reviewed. The process has been applied to the sterilization of commercial sterile products since 1987, and the advantages and natural limitations of the technique are discussed. (author)

  8. Nuclear data for fusion: Validation of typical pre-processing methods for radiation transport calculations

    International Nuclear Information System (INIS)

    Hutton, T.; Sublet, J.C.; Morgan, L.; Leadbeater, T.W.

    2015-01-01

    Highlights: • We quantify the effect of processing nuclear data from ENDF to ACE format. • We consider the differences between fission and fusion angular distributions. • C-nat(n,el) at 2.0 MeV has a 0.6% deviation between original and processed data. • Fe-56(n,el) at 14.1 MeV has a 11.0% deviation between original and processed data. • Processed data do not accurately depict ENDF distributions for fusion energies. - Abstract: Nuclear data form the basis of the radiation transport codes used to design and simulate the behaviour of nuclear facilities, such as the ITER and DEMO fusion reactors. Typically these data and codes are biased towards fission and high-energy physics applications yet are still applied to fusion problems. With increasing interest in fusion applications, the lack of fusion-specific codes and relevant data libraries is becoming increasingly apparent. Industry-standard radiation transport codes require pre-processing of the evaluated data libraries prior to use in simulation. Historically these methods focus on speed of simulation at the cost of accurate data representation. For legacy applications this has not been a major concern, but current fusion needs differ significantly. Pre-processing reconstructs the differential and double-differential interaction cross sections with a coarse binned structure, or more recently as a tabulated cumulative distribution function. This work looks at the validity of applying these processing methods to data used in fusion-specific calculations in comparison to fission. The relative effects of applying this pre-processing mechanism to both fission- and fusion-relevant reaction channels are demonstrated, and as such the poor representation of these distributions for the fusion energy regime. For the C-nat(n,el) reaction at 2.0 MeV, the binned differential cross section deviates from the original data by 0.6% on average. For the Fe-56(n,el) reaction at 14.1 MeV, the deviation increases to 11.0%. We
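    The deviations quoted above come from replacing a pointwise angular distribution with coarse per-bin averages; a hedged illustration of that effect using an invented forward-peaked shape (not evaluated nuclear data):

```python
import math

def binned_representation_error(f, lo, hi, nbins, nprobe=1000):
    """Mean relative error when f is replaced by per-bin averages, mimicking
    the coarse-binned angular data produced by legacy pre-processing."""
    width = (hi - lo) / nbins
    def bin_avg(i):  # average of f over bin i via midpoint subsampling
        xs = [lo + i * width + (j + 0.5) * width / 50 for j in range(50)]
        return sum(f(x) for x in xs) / 50
    avgs = [bin_avg(i) for i in range(nbins)]
    err = 0.0
    for k in range(nprobe):
        x = lo + (k + 0.5) * (hi - lo) / nprobe
        i = min(int((x - lo) / width), nbins - 1)
        err += abs(f(x) - avgs[i]) / f(x)
    return err / nprobe

# an invented forward-peaked, elastic-like distribution on mu in [-1, 1]
f = lambda mu: math.exp(2.0 * mu)
coarse = binned_representation_error(f, -1.0, 1.0, nbins=8)
fine = binned_representation_error(f, -1.0, 1.0, nbins=64)
print(coarse > fine)  # finer binning tracks a peaked distribution better
```

    The more forward-peaked the distribution (as at fusion energies), the worse a fixed coarse binning represents it, which is the qualitative point of the paper's comparison.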

  9. Screening tool for oropharyngeal dysphagia in stroke - Part I: evidence of validity based on the content and response processes.

    Science.gov (United States)

    Almeida, Tatiana Magalhães de; Cola, Paula Cristina; Pernambuco, Leandro de Araújo; Magalhães, Hipólito Virgílio; Magnoni, Carlos Daniel; Silva, Roberta Gonçalves da

    2017-08-17

    The aim of the present study was to identify evidence of validity based on the content and response processes of the Rastreamento de Disfagia Orofaríngea no Acidente Vascular Encefálico (RADAVE; "Screening Tool for Oropharyngeal Dysphagia in Stroke"). The criteria used to elaborate the questions were based on a literature review. A group of judges consisting of 19 health professionals from different fields evaluated the relevance and representativeness of the questions, and the results were analyzed using the Content Validity Index. In order to gather validity evidence based on the response processes, 23 health professionals administered the screening tool and analyzed the questions using a structured scale and a cognitive interview. The RADAVE is structured to be applied in two stages. The first version consisted of 18 questions in stage I and 11 questions in stage II. Eight questions in stage I and four in stage II did not reach the minimum Content Validity Index, requiring reformulation by the authors. The cognitive interview revealed some misinterpretations. New adjustments were made and the final version was produced with 12 questions in stage I and six questions in stage II. It was thus possible to develop a screening tool for dysphagia in stroke with adequate evidence of validity based on content and response processes. The two sources of validity evidence obtained so far made it possible to adjust the screening tool in relation to its construct. Future studies will analyze the other sources of validity evidence and measures of accuracy.
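    The item-level Content Validity Index used above is simply the fraction of judges rating an item as relevant (3 or 4 on a 4-point scale); a sketch with hypothetical ratings from a 19-judge panel (the cut-off mentioned in the comment is a common convention, not this study's stated threshold):

```python
def item_cvi(ratings, relevant=(3, 4)):
    """I-CVI: fraction of judges rating the item 3 or 4 on a 4-point relevance scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

# hypothetical ratings from a 19-judge panel for one question
ratings = [4, 4, 3, 4, 2, 3, 4, 4, 3, 3, 4, 2, 4, 3, 4, 4, 3, 4, 4]
cvi = item_cvi(ratings)
print(round(cvi, 2))  # items below ~0.78 are commonly flagged for reformulation
```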

  10. Quality assessment of the Ozone_cci Climate Research Data Package (release 2017 – Part 1: Ground-based validation of total ozone column data products

    Directory of Open Access Journals (Sweden)

    K. Garane

    2018-03-01

    The GOME-type Total Ozone Essential Climate Variable (GTO-ECV) is a level-3 data record, which combines individual sensor products into one single cohesive record covering the 22-year period from 1995 to 2016, generated in the frame of the European Space Agency's Climate Change Initiative Phase II. It is based on level-2 total ozone data produced by the GODFIT (GOME-type Direct FITting) v4 algorithm as applied to the GOME/ERS-2, OMI/Aura, SCIAMACHY/Envisat and GOME-2/Metop-A and Metop-B observations. In this paper we examine whether GTO-ECV meets the specific requirements set by the international climate-chemistry modelling community for decadal stability and long-term and short-term accuracy. In the following, we present the validation of the 2017 release of the Climate Research Data Package Total Ozone Column (CRDP TOC) at both level 2 and level 3. The inter-sensor consistency of the individual level-2 data sets has mean differences generally within 0.5 % at moderate latitudes (±50°), whereas the level-3 data sets show mean differences with respect to the OMI reference data record that span between −0.2 ± 0.9 % (for GOME-2B) and 1.0 ± 1.4 % (for SCIAMACHY). Very similar findings are reported for the level-2 validation against independent ground-based TOC observations reported by Brewer, Dobson and SAOZ instruments: the mean bias between GODFIT v4 satellite TOC and the ground instrument is well within 1.0 ± 1.0 % for all sensors, the drift per decade spans between −0.5 % and 1.0 ± 1.0 % depending on the sensor, and the peak-to-peak seasonality of the differences ranges from ∼ 1 % for GOME and OMI to ∼ 2 % for SCIAMACHY. For the level-3 validation, our first goal was to show that the level-3 CRDP produces findings consistent with the level-2 individual sensor comparisons. We show a very good agreement with 0.5 to 2 % peak-to-peak amplitude for the monthly mean difference time series and a

  11. Quality assessment of the Ozone_cci Climate Research Data Package (release 2017) - Part 1: Ground-based validation of total ozone column data products

    Science.gov (United States)

    Garane, Katerina; Lerot, Christophe; Coldewey-Egbers, Melanie; Verhoelst, Tijl; Elissavet Koukouli, Maria; Zyrichidou, Irene; Balis, Dimitris S.; Danckaert, Thomas; Goutail, Florence; Granville, Jose; Hubert, Daan; Keppens, Arno; Lambert, Jean-Christopher; Loyola, Diego; Pommereau, Jean-Pierre; Van Roozendael, Michel; Zehner, Claus

    2018-03-01

    The GOME-type Total Ozone Essential Climate Variable (GTO-ECV) is a level-3 data record, which combines individual sensor products into one single cohesive record covering the 22-year period from 1995 to 2016, generated in the frame of the European Space Agency's Climate Change Initiative Phase II. It is based on level-2 total ozone data produced by the GODFIT (GOME-type Direct FITting) v4 algorithm as applied to the GOME/ERS-2, OMI/Aura, SCIAMACHY/Envisat and GOME-2/Metop-A and Metop-B observations. In this paper we examine whether GTO-ECV meets the specific requirements set by the international climate-chemistry modelling community for decadal stability and long-term and short-term accuracy. In the following, we present the validation of the 2017 release of the Climate Research Data Package Total Ozone Column (CRDP TOC) at both level 2 and level 3. The inter-sensor consistency of the individual level-2 data sets has mean differences generally within 0.5 % at moderate latitudes (±50°), whereas the level-3 data sets show mean differences with respect to the OMI reference data record that span between -0.2 ± 0.9 % (for GOME-2B) and 1.0 ± 1.4 % (for SCIAMACHY). Very similar findings are reported for the level-2 validation against independent ground-based TOC observations reported by Brewer, Dobson and SAOZ instruments: the mean bias between GODFIT v4 satellite TOC and the ground instrument is well within 1.0 ± 1.0 % for all sensors, the drift per decade spans between -0.5 % and 1.0 ± 1.0 % depending on the sensor, and the peak-to-peak seasonality of the differences ranges from ˜ 1 % for GOME and OMI to ˜ 2 % for SCIAMACHY. For the level-3 validation, our first goal was to show that the level-3 CRDP produces findings consistent with the level-2 individual sensor comparisons. We show a very good agreement with 0.5 to 2 % peak-to-peak amplitude for the monthly mean difference time series and a negligible drift per decade of the differences in the Northern Hemisphere
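    The mean bias and decadal drift reported in these validations reduce to a mean and an ordinary least-squares slope of the satellite-minus-ground difference time series; a minimal sketch on invented data (not the GTO-ECV comparisons):

```python
def bias_and_drift(times_years, diffs_percent):
    """Return (mean bias in %, linear drift in % per decade) of a
    satellite-minus-ground difference time series via a least-squares fit."""
    n = len(times_years)
    tm = sum(times_years) / n
    dm = sum(diffs_percent) / n
    slope = sum((t - tm) * (d - dm) for t, d in zip(times_years, diffs_percent)) \
        / sum((t - tm) ** 2 for t in times_years)
    return dm, slope * 10.0  # per year -> per decade

# invented yearly-mean differences (%) at 5-year spacing
bias, drift = bias_and_drift([0, 5, 10, 15, 20], [0.2, 0.4, 0.5, 0.7, 0.8])
print(round(bias, 2), round(drift, 2))
```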

  12. Irradiation process validation in the new conveyor system installed at PISI: special dosimetry

    International Nuclear Information System (INIS)

    Pantano, Barbara P.; Docters, Andrea S.

    2009-01-01

    The Semi Industrial Irradiation Plant (PISI) is a multipurpose facility which uses ⁶⁰Co sources to treat different products for numerous purposes, such as sterilization of medical devices, pharmaceutical and veterinarian products, and control of pathogenic microorganisms, shelf-life extension and insect disinfestation of food, among others. In order to achieve the desired effect, the product is carried inside the irradiation chamber by means of a conveyor system and is exposed to radiation following a pre-established path. The recent installation of a new conveyor system at PISI demands the execution of a thorough validation programme. The scope of this presentation is to describe the dose mapping tasks that will be performed in order to characterize the irradiator and its new conveyor system with respect to the distribution and variability of dose, complying with international standards on good irradiation practices. Information about the distribution and variability of dose in a product irradiated under defined conditions will allow process parameters to be obtained, which will form the process specifications for future routine irradiations. The initial stages of the Validation Programme are the Installation Qualification (IQ), the Operational Qualification (OQ) and the Performance Qualification (PQ). To accomplish the IQ, diverse tests are being carried out at PISI in order to verify that the system has been installed and is operating according to its technical specifications. Both OQ and PQ require dose mapping on simulated and real product, respectively. Dose mapping consists of placing dosimeters on a process load of homogeneous material -under certain irradiator and process parameters- according to a three-dimensional pre-established placement pattern. 
Since the replacement of the conveyor system introduces a significant modification in the source-to-product geometry, and therefore in the dose distribution, there is no reference dosimetry data available, so a more exhaustive

  13. Validation of basophil histamine release against the autologous serum skin test and outcome of serum-induced basophil histamine release studies in a large population of chronic urticaria patients

    DEFF Research Database (Denmark)

    Platzer, M H; Grattan, C E H; Poulsen, Lars K.

    2005-01-01

    the immunoglobulin E (IgE) or the high affinity IgE receptor (FcepsilonRI) and serum-induced histamine release (HR) from basophils and mast cells. We have examined the correlation between the ASST and a new basophil histamine-releasing assay (the HR-Urtikaria test) in a group of well-characterized CU patients...... and subsequently determined the frequency of HR-Urtikaria-positive sera from a larger population of CU patients....

  14. Investigation of the Effect of Internal Mold Release Agent and Filler on the Pulling Force in Pultrusion Process

    Directory of Open Access Journals (Sweden)

    M. Esfandeh

    2007-08-01

    Full Text Available Pulling force is one of the most important variables in the pultrusion process, since it determines the capacity of the pultrusion machine. A desirable pultrusion process combines a low pulling force with a high line speed. Among the important factors affecting the pulling force are the internal mold release agent (IMR) and the content and particle size of the filler in the resin formulation. In addition to facilitating the separation of the part from the die, IMR also affects the curing kinetics and, in turn, the pulling force. In this research, a commercial IMR was used in the range of 1-5 phr. DSC and DMTA analyses showed that the presence of IMR in concentrations above 3 phr reduces the heat of the curing reaction and also the curing rate, which results in an increase in pulling force. Study of the filler effect showed that increasing the filler content from 4 to 8 phr reduces the pulling force, but beyond that the pulling force increases. Also, decreasing the filler particle size reduces the pulling force at line speeds below 30 cm/min but increases it at higher line speeds.

  15. A novel co-processed directly compressible release-retarding polymer: In vitro, solid state and in vivo evaluation

    Directory of Open Access Journals (Sweden)

    Prashant Kumar Choudhari

    2018-06-01

    Full Text Available A directly compressible (DC) co-processed excipient capable of providing nearly zero-order release with improved functionality was developed, without any chemical modification, by employing various techniques such as physical mixing, high-shear mixer granulation and spray drying. The co-processed excipient was developed using the release-retarding polymer Eudragit RSPO, separately, in combination with different concentrations of hydroxypropyl methylcellulose 100 cps (Methocel K100 LV, HPMC), ethyl cellulose (Ethocel N50, EC) and hydroxypropyl cellulose (Klucel EF, HPC). All co-processed excipients were evaluated for their flow properties in terms of angle of repose, bulk density, tapped density, compressibility index and Hausner's ratio. Out of eighteen combinations, the nine co-processed excipients that exhibited promising flow properties were found suitable for direct compression and formulated as tablets. Metoprolol succinate, a BCS Class I drug, was selected as a model drug and the formulation was developed employing a direct compression approach. The developed tablets were evaluated for physical parameters such as uniformity of weight, thickness, hardness, friability and assay. An in vitro dissolution study confirmed that the formulation prepared using the co-processed excipient showed sustained drug release. The optimized tablet formulation was characterized by DSC, FTIR and PXRD, which confirmed the absence of any chemical change during co-processing. The optimized formulation was kept on stability study for six months as per ICH guidelines and was found to be stable. An in vivo pharmacokinetic study of the optimized formulation in rats showed pharmacokinetic behaviour similar to that observed with the marketed brand. The study revealed that the co-processed excipient has an advantage over polymers with a single property and can be utilised for sustained-release formulations. Keywords: Co-processed excipient, Metoprolol succinate, Extended-release, Direct compression, Zero-order release
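
    The flow-property screening described above combines bulk and tapped density into Carr's compressibility index and Hausner's ratio; a minimal sketch of these standard formulas (the density values and the "good flow" thresholds quoted in the comments are illustrative, not the study's data):

```python
def flow_properties(bulk_density: float, tapped_density: float):
    """Carr's compressibility index (%) and Hausner ratio from densities (g/mL)."""
    ci = 100.0 * (tapped_density - bulk_density) / tapped_density
    hausner = tapped_density / bulk_density
    return ci, hausner

# Hypothetical densities for one co-processed excipient batch
ci, hr = flow_properties(bulk_density=0.45, tapped_density=0.52)
print(f"Carr's index: {ci:.1f}%  Hausner ratio: {hr:.2f}")
# Carr's index <= 15% and Hausner ratio <= 1.25 are commonly read as "good" flow,
# one screening criterion for direct-compression suitability.
```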

  16. Experiments to populate and validate a processing model for polyurethane foam. BKC 44306 PMDI-10

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, Lisa Ann [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Rao, Rekha Ranjana [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Shelden, Bion [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; O'Hern, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grillet, Anne [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Wyatt, Nicholas B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Russick, Edward Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Bauer, Stephen J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Hileman, Michael Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Urquhart, Alexander [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Thompson, Kyle Richard [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Smith, David Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2014-03-01

    We are developing computational models to elucidate the expansion and dynamic filling process of a polyurethane foam, PMDI. The polyurethane of interest is chemically blown, where carbon dioxide is produced via the reaction of water, the blowing agent, and isocyanate. The isocyanate also reacts with polyol in a competing reaction, which produces the polymer. Here we detail the experiments needed to populate a processing model and provide parameters for the model based on these experiments. The model entails solving the conservation equations, including the equations of motion, an energy balance, and two rate equations for the polymerization and foaming reactions, following a simplified mathematical formalism that decouples these two reactions. Parameters for the polymerization kinetics model are reported based on infrared spectrophotometry. Parameters describing the gas generating reaction are reported based on measurements of volume, temperature and pressure evolution with time. A foam rheology model is proposed and parameters determined through steady-shear and oscillatory tests. Heat of reaction and heat capacity are determined through differential scanning calorimetry. Thermal conductivity of the foam as a function of density is measured using a transient method based on the theory of the transient plane source technique. Finally, density variations of the resulting solid foam in several simple geometries are directly measured by sectioning and sampling mass, as well as through x-ray computed tomography. These density measurements will be useful for model validation once the complete model is implemented in an engineering code.
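
    The two rate equations mentioned above (one for polymerization, one for the gas-generating blowing reaction) can be sketched as decoupled conversion ODEs integrated explicitly; the Arrhenius constants, reaction orders and isothermal assumption below are illustrative placeholders, not the fitted PMDI parameters reported by the authors:

```python
import math

def foam_kinetics(t_end=600.0, dt=0.1, T=320.0):
    """Integrate two competing conversions with simple nth-order Arrhenius kinetics.
    x_p: polymerization (isocyanate + polyol); x_b: blowing (isocyanate + water -> CO2).
    All rate constants, activation energies and orders are hypothetical placeholders."""
    R = 8.314                             # gas constant, J/(mol K)
    A_p, E_p, n_p = 1.0e3, 4.0e4, 1.5     # assumed pre-exponential, activation energy, order
    A_b, E_b, n_b = 5.0e2, 3.5e4, 1.0
    x_p, x_b, t = 0.0, 0.0, 0.0
    k_p = A_p * math.exp(-E_p / (R * T))  # isothermal, so rate constants are fixed
    k_b = A_b * math.exp(-E_b / (R * T))
    while t < t_end:
        x_p += dt * k_p * (1.0 - x_p) ** n_p   # explicit Euler step
        x_b += dt * k_b * (1.0 - x_b) ** n_b
        t += dt
    return x_p, x_b

xp, xb = foam_kinetics()
print(f"polymerization conversion: {xp:.3f}, blowing conversion: {xb:.3f}")
```

A full processing model would couple these rates to the energy balance and foam rheology, as the abstract describes; this sketch only shows the decoupled-kinetics formalism.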

  17. Validity of a High School Physics Module with Character Values Using a Process Skills Approach at STKIP PGRI West Sumatera

    Science.gov (United States)

    Anaperta, M.; Helendra, H.; Zulva, R.

    2018-04-01

    This study aims to describe the validity of a physics module with character-oriented values using a process skills approach for dynamic electricity material in senior high school (SMA/MA) and vocational school (SMK) physics. The type of research is development research. The module development model follows the model proposed by Plomp, which consists of (1) the preliminary research phase, (2) the prototyping phase, and (3) the assessment phase. This research covers the preliminary investigation and design phases. The data collection techniques used to assess validity were observation and a questionnaire. In the preliminary investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design and realization phase, the module was designed for SMA/MA and SMK subjects on dynamic electricity material. After that, a formative evaluation was carried out, comprising self-evaluation and prototyping (expert reviews, one-to-one, and small group evaluation); at this stage the validation was performed. The research data were obtained through a module validation sheet, which resulted in a valid module.

  18. Development of In Vitro-In Vivo Correlation for Amorphous Solid Dispersion Immediate-Release Suvorexant Tablets and Application to Clinically Relevant Dissolution Specifications and In-Process Controls.

    Science.gov (United States)

    Kesisoglou, Filippos; Hermans, Andre; Neu, Colleen; Yee, Ka Lai; Palcza, John; Miller, Jessica

    2015-09-01

    Although in vitro-in vivo correlations (IVIVCs) are commonly pursued for modified-release products, there are limited reports of successful IVIVCs for immediate-release (IR) formulations. This manuscript details the development of a Multiple Level C IVIVC for the amorphous solid dispersion formulation of suvorexant, a BCS class II compound, and its application to establishing dissolution specifications and in-process controls. Four different 40 mg batches were manufactured at different tablet hardnesses to produce distinct dissolution profiles. These batches were evaluated in a relative bioavailability clinical study in healthy volunteers. Although no differences were observed for the total exposure (AUC) of the different batches, a clear relationship between dissolution and Cmax was observed. A validated Multiple Level C IVIVC against Cmax was developed for the 10, 15, 20, 30, and 45 min dissolution time points and the tablet disintegration time. The relationship established between tablet tensile strength and dissolution was subsequently used to inform suitable tablet hardness ranges within acceptable Cmax limits. This is the first published report for a validated Multiple Level C IVIVC for an IR solid dispersion formulation demonstrating how this approach can facilitate Quality by Design in formulation development and help toward clinically relevant specifications and in-process controls. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
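
    A Multiple Level C correlation of the kind validated here relates single dissolution time points to a single PK parameter (Cmax) across batches, typically by linear regression; a minimal sketch for one time point, using hypothetical batch data rather than the suvorexant study values:

```python
import numpy as np

# Hypothetical batch data (not the suvorexant study values):
# percent dissolved at 15 min (Q15) and observed Cmax for four batches
q15 = np.array([55.0, 68.0, 80.0, 92.0])
cmax = np.array([410.0, 455.0, 500.0, 540.0])   # arbitrary concentration units

# Level C correlation: linear regression of one PK parameter on one dissolution point
slope, intercept = np.polyfit(q15, cmax, 1)
r = np.corrcoef(q15, cmax)[0, 1]

# Predict Cmax for a new batch dissolving 75% at 15 min, e.g. to check whether a
# proposed dissolution specification keeps Cmax within acceptable limits
pred = slope * 75.0 + intercept
print(f"Cmax ~ {slope:.2f} * Q15 + {intercept:.1f} (r = {r:.3f}); Q15=75% -> Cmax {pred:.0f}")
```

In the study this exercise is repeated for several time points (10-45 min) plus disintegration time, and the dissolution-hardness relationship then translates the Cmax limits into tablet hardness ranges.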

  19. Development and validation of a production process of platelet lysate for autologous use.

    Science.gov (United States)

    Plöderl, Karin; Strasser, Cornelia; Hennerbichler, Simone; Peterbauer-Scherb, Anja; Gabriel, Christian

    2011-01-01

    Growth factors (GF) contained in platelets are a potential means to improve wound healing through the stimulation and acceleration of soft tissue and bone healing. This led to the idea that autologous platelet-rich plasma or platelet lysate (PL) containing high levels of GF might improve healing processes. Today platelet products are already applied in bone and maxillofacial surgery. In recent years, cosmetic surgery and facial rejuvenation procedures have grown steadily, and new methods involving platelet products are being developed, aiming at the non-surgical reduction of wrinkles upon topical injection and at minimizing surgical risks in general. Several point-of-care devices are already available on the market. However, the amount of PL obtained by these kits is far too high for certain applications in cosmetic surgery, and they offer no possibility of storing the remaining material in a sterile manner. We therefore developed a procedure for the sterile production of smaller amounts of PL in a closed system that can also be split into several products for repeated administration. The closed system chosen was a bag system designed for an autologous blood donation of 100 ml whole blood. We set a special focus on the validation of the production procedure, mainly regarding sterility and platelet recovery. For validation, 22 healthy volunteers were asked for a blood donation, which was centrifuged twice to obtain concentrated platelets (CP). A freeze-thaw cycle lysed the CP to yield approximately 8.48 ± 1.36 ml of PL. We achieved satisfying results of 100% sterility and a platelet recovery of 36.92% ± 18.71%. We therefore conclude that the PL obtained is ready for studies comparing it with traditional treatments.
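
    The reported platelet recovery is simply the fraction of the donated platelets that ends up in the concentrate; a minimal sketch of that calculation (the counts and volumes below are hypothetical, not the study's measurements):

```python
def platelet_recovery(wb_volume_ml, wb_count_per_nl, cp_volume_ml, cp_count_per_nl):
    """Recovery (%) = platelets in concentrate / platelets in the whole-blood donation.
    Counts in platelets per nanolitre (10^3/uL); volumes in mL. Values are hypothetical."""
    total_wb = wb_volume_ml * wb_count_per_nl
    total_cp = cp_volume_ml * cp_count_per_nl
    return 100.0 * total_cp / total_wb

# Hypothetical donation: 100 mL whole blood at 250/nL, 8.5 mL concentrate at 1100/nL
rec = platelet_recovery(100.0, 250.0, 8.5, 1100.0)
print(f"platelet recovery: {rec:.1f}%")
```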

  20. Hearing aids in children: the importance of the verification and validation processes.

    Science.gov (United States)

    Rissatto, Mara Renata; Novaes, Beatriz Cavalcanti de Albuquerque Caiuby

    2009-01-01

    During the fitting of hearing aids in children it is important, besides using a verification protocol, to have a validation process. This study describes and discusses the use of a protocol for the fitting and verification of hearing aids in children, as well as the impact of adjusting the acoustic characteristics on speech perception tasks. Ten children aged from three to eleven years were enrolled in this study. All children presented bilateral sensorineural hearing impairment, were users of hearing aids and were followed at a public hearing health care service in Bahia. The children were submitted to the following procedures: pure tone air and bone conduction thresholds; real-ear to coupler difference (RECD); verification with real-ear measurement equipment (coupler gain/output and insertion gain); and speech perception tasks: 'The Six-Sound Test' (Ling, 2006) and the 'Word Associations for Syllable Perception' (WASP - Koch, 1999). The programmed electroacoustic characteristics of the hearing aids were compared to those prescribed by the DSL [i/o] v4.1 software. The speech perception tasks were reapplied on three occasions: straight after the modification of the electroacoustic characteristics, after 30 days and after 60 days. For more than 50% of the tested children, the programmed electroacoustic characteristics of the hearing aids did not correspond to those suggested by the DSL [i/o] software. Adequate prescription was verified in 70% of the investigated sample; this was also confirmed by the results of the speech perception tasks (p=0.000). These data confirmed that the mean percentage of correct answers increased after the modification of the electroacoustic characteristics. The use of a protocol that verifies and validates the fitting of hearing aids in children is necessary.
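
    The verification step described here compares the hearing aid's measured gain against the prescriptive targets computed by the DSL software; a minimal sketch of such a target-matching check (the frequencies, gain values and the ±5 dB tolerance are hypothetical assumptions, not the study's protocol values):

```python
# Hypothetical DSL targets and measured insertion gain (dB) per frequency (Hz)
targets  = {500: 25.0, 1000: 32.0, 2000: 38.0, 4000: 35.0}
measured = {500: 23.5, 1000: 30.0, 2000: 31.0, 4000: 34.0}
TOLERANCE_DB = 5.0  # assumed acceptance window, not the study's criterion

deviations = {f: measured[f] - targets[f] for f in targets}
out_of_spec = [f for f, d in deviations.items() if abs(d) > TOLERANCE_DB]

for f, d in sorted(deviations.items()):
    flag = "ADJUST" if abs(d) > TOLERANCE_DB else "ok"
    print(f"{f:>5} Hz: {d:+.1f} dB  {flag}")
# In this made-up fitting, 2000 Hz misses its target by -7 dB, so the aid would be
# reprogrammed and the speech-perception validation (e.g. the Six-Sound Test) repeated.
```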

  1. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for the validation of models and the thought processes emerging in the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  2. Validation Testing of the Nitric Acid Dissolution Step Within the K Basin Sludge Pretreatment Process

    Energy Technology Data Exchange (ETDEWEB)

    AJ Schmidt; CH Delegard; KL Silvers; PR Bredt; CD Carlson; EW Hoppe; JC Hayes; DE Rinehart; SR Gano; BM Thornton

    1999-03-24

    The work described in this report involved comprehensive bench-scale testing of nitric acid (HNO3) dissolution of actual sludge materials from the Hanford K East (KE) Basin to confirm the baseline chemical pretreatment process. In addition, process monitoring and material balance information was collected to support the development and refinement of process flow diagrams. The testing was performed by Pacific Northwest National Laboratory (PNNL) for the US Department of Energy's Office of Spent Fuel Stabilization (EM-67) and Numatec Hanford Corporation (NHC) to assist in the development of the K Basin Sludge Pretreatment Process. The baseline chemical pretreatment process for K Basin sludge is nitric acid dissolution of all particulate material passing a 1/4-in. screen. The acid-insoluble fraction (residual solids) will be stabilized (possibly by chemical leaching/rinsing and grouting), packaged, and transferred to the Hanford Environmental Restoration Disposal Facility (ERDF). The liquid fraction is to be diluted with depleted uranium for uranium criticality safety and iron nitrate for plutonium criticality safety, and neutralized with sodium hydroxide. The liquid fraction and associated precipitates are to be stored in the Hanford Tank Waste Remediation Systems (TWRS) pending vitrification. It is expected that most of the polychlorinated biphenyls (PCBs) associated with some K Basin sludges will remain with the residual solids for ultimate disposal to ERDF. Filtration and precipitation during the neutralization step will further remove trace quantities of PCBs within the liquid fraction. The purpose of the work discussed in this report was to examine the dissolution behavior of actual KE Basin sludge materials at baseline flowsheet conditions and to validate the dissolution process step through bench-scale testing. The progress of the dissolution was evaluated by measuring the solution electrical conductivity and concentrations of key species in the

  3. Validation Testing of the Nitric Acid Dissolution Step Within the K Basin Sludge Pretreatment Process

    International Nuclear Information System (INIS)

    AJ Schmidt; CH Delegard; KL Silvers; PR Bredt; CD Carlson; EW Hoppe; JC Hayes; DE Rinehart; SR Gano; BM Thornton

    1999-01-01

    The work described in this report involved comprehensive bench-scale testing of nitric acid (HNO3) dissolution of actual sludge materials from the Hanford K East (KE) Basin to confirm the baseline chemical pretreatment process. In addition, process monitoring and material balance information was collected to support the development and refinement of process flow diagrams. The testing was performed by Pacific Northwest National Laboratory (PNNL) for the US Department of Energy's Office of Spent Fuel Stabilization (EM-67) and Numatec Hanford Corporation (NHC) to assist in the development of the K Basin Sludge Pretreatment Process. The baseline chemical pretreatment process for K Basin sludge is nitric acid dissolution of all particulate material passing a 1/4-in. screen. The acid-insoluble fraction (residual solids) will be stabilized (possibly by chemical leaching/rinsing and grouting), packaged, and transferred to the Hanford Environmental Restoration Disposal Facility (ERDF). The liquid fraction is to be diluted with depleted uranium for uranium criticality safety and iron nitrate for plutonium criticality safety, and neutralized with sodium hydroxide. The liquid fraction and associated precipitates are to be stored in the Hanford Tank Waste Remediation Systems (TWRS) pending vitrification. It is expected that most of the polychlorinated biphenyls (PCBs) associated with some K Basin sludges will remain with the residual solids for ultimate disposal to ERDF. Filtration and precipitation during the neutralization step will further remove trace quantities of PCBs within the liquid fraction. The purpose of the work discussed in this report was to examine the dissolution behavior of actual KE Basin sludge materials at baseline flowsheet conditions and to validate the dissolution process step through bench-scale testing. The progress of the dissolution was evaluated by measuring the solution electrical conductivity and concentrations of key species in the dissolver

  4. Validating and extending the three process model of alertness in airline operations.

    Directory of Open Access Journals (Sweden)

    Michael Ingre

    Full Text Available Sleepiness and fatigue are important risk factors in the transport sector and bio-mathematical sleepiness, sleep and fatigue modeling is increasingly becoming a valuable tool for assessing safety of work schedules and rosters in Fatigue Risk Management Systems (FRMS). The present study sought to validate the inner workings of one such model, the Three Process Model (TPM), on aircrews and extend the model with functions to model jetlag and to directly assess the risk of any sleepiness level in any shift schedule or roster with and without knowledge of sleep timings. We collected sleep and sleepiness data from 136 aircrews in a real life situation by means of an application running on a handheld touch screen computer device (iPhone, iPod or iPad) and used the TPM to predict sleepiness with varying level of complexity of model equations and data. The results based on multilevel linear and non-linear mixed effects models showed that the TPM predictions correlated with observed ratings of sleepiness, but explorative analyses suggest that the default model can be improved and reduced to include only two processes (S+C), with adjusted phases of the circadian process based on a single question of circadian type. We also extended the model with a function to model jetlag acclimatization and with estimates of individual differences including reference limits accounting for 50%, 75% and 90% of the population as well as functions for predicting the probability of any level of sleepiness for ecological assessment of absolute and relative risk of sleepiness in shift systems for safety applications.

  5. A Mathematical Model for Reactions During Top-Blowing in the AOD Process: Validation and Results

    Science.gov (United States)

    Visuri, Ville-Valtteri; Järvinen, Mika; Kärnä, Aki; Sulasalmi, Petri; Heikkinen, Eetu-Pekka; Kupari, Pentti; Fabritius, Timo

    2017-06-01

    In earlier work, a fundamental mathematical model was proposed for side-blowing operation in the argon oxygen decarburization (AOD) process. In the preceding part "Derivation of the Model," a new mathematical model was proposed for reactions during top-blowing in the AOD process. In this model it was assumed that reactions occur simultaneously at the surface of the cavity caused by the gas jet and at the surface of the metal droplets ejected from the metal bath. This paper presents validation and preliminary results with twelve industrial heats. In the studied heats, the last combined-blowing stage was altered so that oxygen was introduced from the top lance only. Four heats were conducted using an oxygen-nitrogen mixture (1:1), while eight heats were conducted with pure oxygen. Simultaneously, nitrogen or argon gas was blown via tuyères in order to provide mixing that is comparable to regular practice. The measured carbon content varied from 0.4 to 0.5 wt pct before the studied stage to 0.1 to 0.2 wt pct after the studied stage. The results suggest that the model is capable of predicting changes in metal bath composition and temperature with a reasonably high degree of accuracy. The calculations indicate that the top slag may supply oxygen for decarburization during top-blowing. Furthermore, it is postulated that the metal droplets generated by the shear stress of top-blowing create a large mass exchange area, which plays an important role in enabling the high decarburization rates observed during top-blowing in the AOD process. The overall rate of decarburization attributable to top-blowing in the last combined-blowing stage was found to be limited by the mass transfer of dissolved carbon.
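
    A mass-transfer-limited decarburization of the kind identified in the conclusion is often idealized as first-order in the dissolved carbon above its equilibrium content; a hedged sketch of that idealization (the rate constant and equilibrium content are hypothetical, chosen only to reproduce the order of magnitude of the reported carbon drop, not the paper's fitted model):

```python
import math

def carbon_content(c0, c_eq, k, t):
    """First-order, mass-transfer-limited decarburization:
    dC/dt = -k (C - C_eq)  =>  C(t) = C_eq + (C0 - C_eq) * exp(-k t).
    c0, c_eq in wt pct; k in 1/min; t in minutes. All parameters are illustrative."""
    return c_eq + (c0 - c_eq) * math.exp(-k * t)

# Hypothetical stage: 0.45 wt pct C relaxing toward an assumed 0.05 wt pct equilibrium
for t in (0, 2, 5, 10):
    print(f"t = {t:2d} min: C = {carbon_content(0.45, 0.05, 0.25, t):.3f} wt pct")
```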

  6. Validating and extending the three process model of alertness in airline operations.

    Science.gov (United States)

    Ingre, Michael; Van Leeuwen, Wessel; Klemets, Tomas; Ullvetter, Christer; Hough, Stephen; Kecklund, Göran; Karlsson, David; Åkerstedt, Torbjörn

    2014-01-01

    Sleepiness and fatigue are important risk factors in the transport sector and bio-mathematical sleepiness, sleep and fatigue modeling is increasingly becoming a valuable tool for assessing safety of work schedules and rosters in Fatigue Risk Management Systems (FRMS). The present study sought to validate the inner workings of one such model, Three Process Model (TPM), on aircrews and extend the model with functions to model jetlag and to directly assess the risk of any sleepiness level in any shift schedule or roster with and without knowledge of sleep timings. We collected sleep and sleepiness data from 136 aircrews in a real life situation by means of an application running on a handheld touch screen computer device (iPhone, iPod or iPad) and used the TPM to predict sleepiness with varying level of complexity of model equations and data. The results based on multilevel linear and non-linear mixed effects models showed that the TPM predictions correlated with observed ratings of sleepiness, but explorative analyses suggest that the default model can be improved and reduced to include only two-processes (S+C), with adjusted phases of the circadian process based on a single question of circadian type. We also extended the model with a function to model jetlag acclimatization and with estimates of individual differences including reference limits accounting for 50%, 75% and 90% of the population as well as functions for predicting the probability of any level of sleepiness for ecological assessment of absolute and relative risk of sleepiness in shift systems for safety applications.
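
    The reduced two-process (S+C) model the authors arrive at can be sketched as a homeostatic process S decaying exponentially during wakefulness plus a sinusoidal circadian process C; the constants below are generic illustrative choices in the spirit of the three-process-model literature, not the validated TPM parameters:

```python
import math

def alertness(hours_awake: float, clock_time: float, s_at_wake: float = 14.0,
              phase: float = 16.8) -> float:
    """Predicted alertness (higher = more alert) as S + C.
    S decays exponentially toward a lower asymptote while awake;
    C is sinusoidal with an afternoon/evening acrophase.
    All constants are illustrative assumptions, not fitted TPM values."""
    low = 2.4   # assumed lower asymptote of the homeostatic process
    S = low + (s_at_wake - low) * math.exp(-0.0353 * hours_awake)
    C = 2.5 * math.cos(2 * math.pi * (clock_time - phase) / 24.0)
    return S + C

# Alertness at 07:00 just after waking vs 03:00 after 20 h awake
print(f"07:00, 0 h awake : {alertness(0.0, 7.0):.1f}")
print(f"03:00, 20 h awake: {alertness(20.0, 3.0):.1f}")
```

The study's extension to jetlag could then be expressed as a gradual shift of the `phase` parameter toward the destination time zone; individual differences would enter as person-specific offsets on S and C.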

  7. The process of mass transfer on the solid-liquid boundary layer during the release of diclofenac sodium and papaverine hydrochloride from tablets in a paddle apparatus.

    Science.gov (United States)

    Kasperek, Regina; Zimmer, Lukasz; Poleszak, Ewa

    2016-01-01

    A release study of diclofenac sodium (DIC) and papaverine hydrochloride (PAP) from two tablet formulations was carried out in the paddle apparatus using different rotation speeds, in order to characterize the process of mass transfer in the solid-liquid boundary layer. The dissolution process of the active substances was described by the values of the mass transfer coefficients, the thickness of the diffusion boundary layer and dimensionless numbers (Sh and Re). The values of the calculated parameters showed that the release of DIC and PAP from tablets comprising potato starch proceeded faster than from tablets containing HPMC and microcrystalline cellulose. Direct relationships between Sh and Re were obtained in the range from 75 rpm to 125 rpm for both substances from all tablets. Describing the dissolution process with dimensionless numbers makes it possible to design a drug with the required release profile under given in vitro conditions.
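
    The dimensionless description above builds the Reynolds number from the paddle rotation speed and the Sherwood number from the mass transfer coefficient; a minimal sketch of these standard definitions (all numerical values are hypothetical, not the study's measurements):

```python
def reynolds(rot_per_min, d_paddle_m, rho_kg_m3, mu_pa_s):
    """Impeller Reynolds number Re = N d^2 rho / mu, with N in rev/s."""
    n = rot_per_min / 60.0
    return n * d_paddle_m ** 2 * rho_kg_m3 / mu_pa_s

def sherwood(k_m_s, length_m, diffusivity_m2_s):
    """Sherwood number Sh = k L / D (dimensionless mass-transfer coefficient)."""
    return k_m_s * length_m / diffusivity_m2_s

# Hypothetical paddle-apparatus-like conditions at 75 rpm in water
Re = reynolds(75.0, 0.075, 1000.0, 1.0e-3)
Sh = sherwood(2.0e-5, 0.075, 8.0e-10)
delta = 8.0e-10 / 2.0e-5   # diffusion boundary layer thickness: delta = D / k (m)
print(f"Re = {Re:.0f}, Sh = {Sh:.0f}, delta = {delta * 1e6:.0f} um")
```

A fitted Sh-Re relationship of this kind is what lets one choose a rotation speed (here within 75-125 rpm) to hit a target release rate.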

  8. On the Load-Unload (L-U) and Force-Release (F-R) Algorithms for Simulating Brittle Fracture Processes via Lattice Models

    KAUST Repository

    Liu, Jinxing

    2011-11-11

    General summaries of the load-unload and force-release methods indicate that the two methods are efficient for quasi-static failures of different character; it is therefore important to choose the right one for a given application. We then take, as an example, the case where the release of the ruptured element's internal force is infinitely slower than the relaxation of the lattice system, and analyze why the force-release method works better than the load-unload method in this particular case. The two methods use different trial deformation fields to track the next equilibrium state. The force-release method ensures that the deformation throughout the whole failure process coincides exactly with the controlled-displacement boundary conditions, and we utilize the 'left modulus' concept to prove that this method satisfies the energetic evolution in the force-displacement diagram; neither property is satisfied by the load-unload method. To illustrate that the force-release method is not just another form of the load-unload method, a tensile test on a specifically designed system is analyzed to further compare the two methods, showing that their predicted sequences of elemental failures can differ. In closing, we simulate the uniaxial tensile test on a beam lattice system by the load-unload and force-release methods and examine the details of the resulting fracture processes. © The Author(s), 2011.

  9. Enhancement of the sludge disintegration and nutrients release by a treatment with potassium ferrate combined with an ultrasonic process.

    Science.gov (United States)

    Li, Wei; Yu, Najiaowa; Liu, Qian; Li, Yiran; Ren, Nanqi; Xing, Defeng

    2018-09-01

    Sludge disintegration by ultrasound is a promising sludge treatment method. In order to enhance the efficiency of sludge reduction and hydrolysis, potassium ferrate (K2FeO4, PF) was used. A novel method was developed to improve sludge disintegration: sludge pretreatment using PF in combination with an ultrasonic treatment (PF + ULT). After a short-term PF + ULT treatment, 17.23% of the volatile suspended solids (VSS) were reduced after a 900-min reaction time, which is 61.3% higher than the VSS reduction for the raw sludge. The supernatant soluble chemical oxygen demand (SCOD), total nitrogen (TN), volatile fatty acids (VFAs), soluble protein and polysaccharides increased by 522.5%, 1029.4%, 878.4%, 2996.6% and 801.9%, respectively. The constituent parts of the dissolved organic matter of the sludge products were released efficiently, demonstrating the positive effect of the PF + ULT treatment. The enhanced sludge disintegration process further alleviates environmental risk and offers a more efficient and convenient method for utilizing sludge. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Anaerobic digestion of nitrogen rich poultry manure: Impact of thermophilic biogas process on metal release and microbial resistances.

    Science.gov (United States)

    Anjum, Reshma; Grohmann, Elisabeth; Krakat, Niclas

    2017-02-01

    Poultry manure is a nitrogen-rich fertilizer, which is usually recycled and spread on agricultural fields. Due to its high nutrient content, chicken manure is considered to be one of the most valuable animal wastes as organic fertilizer. However, when chicken litter is applied in its native form, concerns are raised, as such fertilizers also include high amounts of antibiotic-resistant pathogenic bacteria and heavy metals. We studied the impact of an anaerobic thermophilic digestion process on poultry manure. In particular, microbial antibiotic resistance profiles, mobile genetic elements promoting the dissemination of resistance in the environment, and the presence of heavy metals were the focus of this study. The heat treatment fostered a community shift from pathogenic to less pathogenic bacterial groups. Phenotypic and molecular studies demonstrated a clear reduction of multi-resistant pathogens and self-transmissible plasmids in the heat-treated manure. The treatment also induced a higher release of metals and macroelements; in particular, Zn and Cu exceeded toxic thresholds. Although the concentrations of a few metals reached toxic levels after the anaerobic thermophilic treatment, the quality of poultry manure as organic fertilizer may rise significantly due to the elimination of antibiotic resistance genes (ARG) and self-transmissible plasmids. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. An overview of mesoscale aerosol processes, comparisons, and validation studies from DRAGON networks

    Science.gov (United States)

    Holben, Brent N.; Kim, Jhoon; Sano, Itaru; Mukai, Sonoyo; Eck, Thomas F.; Giles, David M.; Schafer, Joel S.; Sinyuk, Aliaksandr; Slutsker, Ilya; Smirnov, Alexander; Sorokin, Mikhail; Anderson, Bruce E.; Che, Huizheng; Choi, Myungje; Crawford, James H.; Ferrare, Richard A.; Garay, Michael J.; Jeong, Ukkyo; Kim, Mijin; Kim, Woogyung; Knox, Nichola; Li, Zhengqiang; Lim, Hwee S.; Liu, Yang; Maring, Hal; Nakata, Makiko; Pickering, Kenneth E.; Piketh, Stuart; Redemann, Jens; Reid, Jeffrey S.; Salinas, Santo; Seo, Sora; Tan, Fuyi; Tripathi, Sachchida N.; Toon, Owen B.; Xiao, Qingyang

    2018-01-01

    Over the past 24 years, the AErosol RObotic NETwork (AERONET) program has provided highly accurate remote-sensing characterization of aerosol optical and physical properties for an increasingly extensive geographic distribution including all continents and many oceanic island and coastal sites. The measurements and retrievals from the AERONET global network have addressed satellite and model validation needs very well, but there have been challenges in making comparisons to similar parameters from in situ surface and airborne measurements. Additionally, with improved spatial and temporal satellite remote sensing of aerosols, there is a need for higher spatial-resolution ground-based remote-sensing networks. An effort to address these needs resulted in a number of field campaign networks called Distributed Regional Aerosol Gridded Observation Networks (DRAGONs) that were designed to provide a database for in situ and remote-sensing comparison and analysis of local to mesoscale variability in aerosol properties. This paper describes the DRAGON deployments that will continue to contribute to the growing body of research related to meso- and microscale aerosol features and processes. The research presented in this special issue illustrates the diversity of topics that has resulted from the application of data from these networks.

  12. A Validation System for the Complex Event Processing Directives of the ATLAS Shifter Assistant Tool

    CERN Document Server

    Anders, Gabriel; The ATLAS collaboration; Kazarov, Andrei; Lehmann Miotto, Giovanna; Santos, Alejandro; Soloviev, Igor

    2015-01-01

    Complex Event Processing (CEP) is a methodology that combines data from different sources in order to identify events or patterns that require particular attention. It has gained considerable momentum in the computing world in the past few years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions, and to guide the experiment’s operators. This technology is very powerful if experts regularly insert and update their knowledge about the system’s behaviour in the CEP engine. Nevertheless, writing or modifying CEP directives is not trivial, since the programming paradigm involved is quite different from what developers are normally familiar with. To help experts verify that the directives work as expected, we have therefore developed a complete testing and validation environment. This system consists of three main parts: the first is the persistent storage of all relevant data streams that are produced during data taking, the second is a...
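
    The core idea of a CEP directive can be illustrated with a toy example. The actual ATLAS Shifter Assistant directives are written for a dedicated CEP engine whose syntax is not shown in the abstract; the sketch below is a generic Python stand-in, with all names (event fields, sources, thresholds) invented for illustration. It raises an alert when several ERROR events from the same source arrive within a sliding time window.

```python
from collections import deque

def cep_rule(events, threshold=3, window=10.0):
    """Toy CEP directive: alert when `threshold` ERROR events from the
    same source arrive within a `window`-second sliding interval."""
    recent = {}                      # source -> deque of event times
    alerts = []
    for ev in events:
        if ev["level"] != "ERROR":
            continue
        q = recent.setdefault(ev["source"], deque())
        q.append(ev["time"])
        while q and ev["time"] - q[0] > window:
            q.popleft()              # expire events outside the window
        if len(q) >= threshold:
            alerts.append((ev["source"], ev["time"]))
            q.clear()                # re-arm the rule after it fires
    return alerts

# A small synthetic event stream (all field names are invented)
stream = [
    {"source": "ROS-1", "level": "ERROR", "time": 1.0},
    {"source": "ROS-2", "level": "INFO",  "time": 1.5},
    {"source": "ROS-1", "level": "ERROR", "time": 2.0},
    {"source": "ROS-1", "level": "ERROR", "time": 3.0},
]
alerts = cep_rule(stream)            # fires once, on the third ROS-1 error
```

    A validation environment of the kind described would replay such recorded streams through the directives and compare the alerts produced against the expected ones.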

  13. A Validation System for the Complex Event Processing Directives of the ATLAS Shifter Assistant Tool

    CERN Document Server

    Santos, Alejandro; The ATLAS collaboration; Avolio, Giuseppe; Kazarov, Andrei; Lehmann Miotto, Giovanna; Soloviev, Igor

    2015-01-01

    Complex Event Processing (CEP) is a methodology that combines data from many sources in order to identify events or patterns that require particular attention. It has gained considerable momentum in the computing world in the past few years and is used in ATLAS to continuously monitor the behaviour of the data acquisition system, to trigger corrective actions, and to guide the experiment’s operators. This technology is very powerful if experts regularly insert and update their knowledge about the system’s behaviour in the CEP engine. Nevertheless, writing or modifying CEP rules is not trivial, since the programming paradigm involved is quite different from what developers are normally familiar with. To help experts verify that the directives work as expected, we have therefore developed a complete testing and validation environment. This system consists of three main parts: the first is the data reader from existing storage of all relevant data streams that are produced during data taking, the second...

  14. An overview of mesoscale aerosol processes, comparisons, and validation studies from DRAGON networks

    Directory of Open Access Journals (Sweden)

    B. N. Holben

    2018-01-01

    Full Text Available Over the past 24 years, the AErosol RObotic NETwork (AERONET) program has provided highly accurate remote-sensing characterization of aerosol optical and physical properties for an increasingly extensive geographic distribution including all continents and many oceanic island and coastal sites. The measurements and retrievals from the AERONET global network have addressed satellite and model validation needs very well, but there have been challenges in making comparisons to similar parameters from in situ surface and airborne measurements. Additionally, with improved spatial and temporal satellite remote sensing of aerosols, there is a need for higher spatial-resolution ground-based remote-sensing networks. An effort to address these needs resulted in a number of field campaign networks called Distributed Regional Aerosol Gridded Observation Networks (DRAGONs) that were designed to provide a database for in situ and remote-sensing comparison and analysis of local to mesoscale variability in aerosol properties. This paper describes the DRAGON deployments that will continue to contribute to the growing body of research related to meso- and microscale aerosol features and processes. The research presented in this special issue illustrates the diversity of topics that has resulted from the application of data from these networks.

  15. The influence of the structural characteristics of polyethylene on the release of gas mixtures for extrusion processing

    Directory of Open Access Journals (Sweden)

    V. I. Korchagin

    2017-01-01

    Full Text Available The effects of thermal and mechano-thermal action, in inert and air environments, on the evolution of gaseous products from high-pressure polyethylene (LDPE) were studied on a Smart RHEO 1000 capillary rheometer with CeastView 5.94.4D software, using capillaries 5 mm in length and 1 mm in diameter. The composition of the gaseous products evolved during the deformation of polyethylenes of different grades in the channel of the capillary viscometer was studied at shear rates close to production conditions, in the range from 50 to 300 s-1, at temperatures of 160, 190 and 220 °C. The objects of the study were domestic thermoplastics of the following grades, which differ in their structural characteristics: LDPE 10803-020; LDPE 15803020; LPVD F-03020-S; HDPE 293-285-D. It was established that gas evolution during extrusion processing is promoted by the branching of the polyethylene, while the extent of the degradation processes increases with increasing temperature and depends on the reaction medium in the working volume of the equipment. Critical shear stresses arising in the absence of oxidants and impurities contribute to mechano-destruction, accompanied by the formation of free radicals, which recombine to form a more branched polymer structure. In turn, high temperature promotes degradation in the side branches of the polymer, with the formation of volatile products that are released from the reaction volume. It should be noted that gas evolution due to thermal action is promoted by an air environment, but to a lesser extent than under mechano-thermal action. The smaller effect in the inert medium is apparently associated with limited access of oxidants to the destruction centres.

  16. Process-oriented dose assessment model for 14C due to releases during normal operation of a nuclear power plant

    International Nuclear Information System (INIS)

    Aquilonius, Karin; Hallberg, Bengt

    2005-01-01

    Swedish nuclear utility companies are required to assess doses due to releases of radionuclides during normal operation. In 2001, the calculation methods used earlier were updated in response to new regulatory requirements. The isotope 14C is of special interest in dose assessments because of the role of carbon in the metabolism of all life forms. Previously, factors expressing the ratio between the concentration of 14C in air and in various plants were used. In order to extend the possibility of taking local conditions into account, a process-oriented assessment model for the uptake of carbon and for doses from releases of 14C to air was developed (POM 14C). The model uses part of Daisy, a model developed to simulate the turnover of carbon in crops [Hansen, S., Jensen, H.E., Nielsen, N.E., Svendsen, H., 1993. Description of the Soil Plant System Model DAISY, Basic Principles and Modelling Approach. Simulation Model for Transformation and Transport of Energy and Matter in the Soil Plant Atmosphere System. Jordbruksfoerlaget, The Royal Veterinary and Agricultural University, Copenhagen, Denmark]. The main objectives were to test the performance of the former method and to investigate whether taking site-specific parameters into account to a greater degree would lead to major differences in the results. Several exposure pathways were considered: direct consumption of locally grown cereals, vegetables, and root vegetables, as well as consumption of milk and meat from cows fed fodder cereals and green fodder from the area around the nuclear plant. The total dose of the earlier model was compared with that of POM 14C. The result of the former was shown to be slightly higher than that of the latter, but POM 14C confirmed that the earlier results were of a reasonable magnitude. When full account of local conditions was taken, e.g. as regards solar radiation, temperature, and the concentration of 14C in air at various places in the surroundings of each nuclear plant, a difference in dose between
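
    The ratio-factor approach that the process-oriented model was benchmarked against can be sketched in a few lines. The ratio value and the intake figures below are illustrative placeholders, not values from the paper; the only real constant used is the ICRP 72 adult ingestion dose coefficient for 14C.

```python
DCF_14C = 5.8e-10   # Sv/Bq, adult ingestion dose coefficient for 14C (ICRP 72)

def plant_concentration(c_air, ratio):
    """Ratio-factor method: activity concentration in a plant (Bq/kg)
    taken as a fixed multiple of the 14C concentration in air (Bq/m3)."""
    return ratio * c_air

def annual_ingestion_dose(pathways):
    """pathways: iterable of (annual intake in kg, concentration in Bq/kg).
    Returns the committed effective dose in Sv."""
    return DCF_14C * sum(intake * conc for intake, conc in pathways)

# Illustrative pathway set (e.g. cereals and vegetables); numbers invented
c_plant = plant_concentration(c_air=0.05, ratio=200.0)     # 10 Bq/kg
dose = annual_ingestion_dose([(80.0, c_plant), (40.0, c_plant)])
```

    A process-oriented model such as POM 14C replaces the fixed `ratio` with a computed carbon uptake that depends on local solar radiation, temperature and air concentration.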

  17. The kinetics of sterane biological marker release and degradation processes during the hydrous pyrolysis of vitrinite kerogen

    Science.gov (United States)

    Abbott, G. D.; Wang, G. Y.; Eglinton, T. I.; Home, A. K.; Petch, G. S.

    1990-09-01

    The hydrous pyrolysis of a mineral-free vitrinite kerogen (Dinantian coal, Lower Carboniferous, North East England) has been carried out at four temperatures (270, 300, 330, and 350°C) for heating times ranging from 2 to 648 h. No significant differences in the epimer-based maturation parameters 20S/(20S + 20R)-5α(H),14α(H),17α(H) C29 non-rearranged steranes and 22S/(22S + 22R)-17α(H),21β(H) homohopanes were found in a comparison between the "expelled oil" and "bitumen" fractions of the resulting pyrolysates. A deuterated model compound ((20R)-5α(H),14α(H),17α(H)-[2,2,4,4-d4]cholestane) was added to a number of pre-extracted kerogens (vitrinite, Kimmeridge, Messel and Monterey) and the mixtures were heated under typical hydrous pyrolysis conditions. These experiments showed that direct chiral isomerisation at C-20 in the non-rearranged steranes appears to be relatively unimportant during hydrous pyrolysis, as has also been suggested by other recent studies on geological samples. A kinetic model comprising consecutive release and degradation processes was derived to measure first-order rate coefficients from the bi-exponential concentration-time functions of both the (20R)- and (20S)-5α(H),14α(H),17α(H) C29 "free" steranes in the vitrinite kerogen pyrolysates. These data were then used to calculate preliminary Arrhenius parameters for release ((20S): ΔEa = 125 ± 30 kJ mol^-1, A ≈ 4.7 × 10^5 s^-1; (20R): ΔEa = 151 ± 39 kJ mol^-1, A ≈ 2.7 × 10^9 s^-1) and degradation ((20S): ΔEa = 104 ± 22 kJ mol^-1, A ≈ 5.8 × 10^3 s^-1; (20R): ΔEa = 87 ± 6 kJ mol^-1, A ≈ 2.2 × 10^2 s^-1) of the individual isomers, and the values were found to be consistent with a free-radical chain mechanism. This work contributes to a greater understanding of the important biomarker reactions that prevail in hydrous pyrolysis experiments.
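
    The consecutive release/degradation scheme and the Arrhenius parameters quoted above can be written down directly. The sketch below assumes the standard closed-form solution for the intermediate in a first-order chain (bound sterane, then free sterane, then degraded product); it illustrates the form of the model, not the authors' actual code, and the example rate evaluation simply plugs in the quoted 20R release parameters.

```python
import math

R = 8.314                  # gas constant, J mol^-1 K^-1

def arrhenius(A, Ea_kJ, T_K):
    """Rate coefficient k = A * exp(-Ea / RT), with Ea given in kJ/mol."""
    return A * math.exp(-Ea_kJ * 1000.0 / (R * T_K))

def free_sterane(t, A0, k_rel, k_deg):
    """Bi-exponential concentration of the intermediate ('free') sterane in
    the consecutive scheme: bound --k_rel--> free --k_deg--> degraded."""
    return A0 * k_rel / (k_deg - k_rel) * (math.exp(-k_rel * t)
                                           - math.exp(-k_deg * t))

# Release rate of the 20R epimer at 330 degC from the quoted parameters
k_rel_20R = arrhenius(A=2.7e9, Ea_kJ=151.0, T_K=330.0 + 273.15)
```

    The "free" sterane concentration starts at zero, rises as release dominates, then decays as degradation takes over, which is exactly the bi-exponential behaviour fitted in the paper.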

  18. Formation and release of β-glucosidase by Aspergillus niger ZIMET 43 746 in correlation to process operations

    Energy Technology Data Exchange (ETDEWEB)

    Kerns, G; Dalchow, E; Klappach, G; Meyer, D

    1986-01-01

    The total formation of β-glucosidase by the wild strain of Aspergillus niger ZIMET 43 746 is non-growth-associated. In discontinuous culture the total β-glucosidase activity related to the mycelium increases with the age of the mycelium. The complete release of the remaining mycelium-associated β-glucosidase depends on the structure of the mycelium. When the mycelium forms pellets throughout the growth phase, the release of β-glucosidase is accelerated compared with the release from loosely branched mycelium. Increasing shear stress, caused by increasing the impeller speed, promotes the formation of pellets.

  19. The Metacognitive Anger Processing (MAP) Scale - Validation in a Mixed Clinical and a Forensic In-Patient Sample

    DEFF Research Database (Denmark)

    Moeller, Stine Bjerrum; Bech, Per

    2018-01-01

    BACKGROUND: The metacognitive approach by Wells and colleagues has gained empirical support for a broad range of symptoms. The Metacognitive Anger Processing (MAP) scale was developed to provide a metacognitive measure of anger (Moeller, 2016). In the preliminary validation, three components (positive beliefs, negative beliefs and rumination) were identified as positively correlated with anger. AIMS: To validate the MAP in a sample of mixed clinical patients (n = 88) and a sample of male forensic patients (n = 54). METHOD: The MAP was administered together with measures of metacognition, anger, rumination, anxiety and depressive symptoms. RESULTS: The MAP showed acceptable scalability and excellent reliability. Convergent validity was evidenced using the general metacognitive measure (MCQ-30), and concurrent validity was supported using two different anger measures (STAXI-2 and NAS). CONCLUSIONS...

  20. Validation of King's transaction process for healthcare provider-patient communication in pharmaceutical context: One cross-sectional study.

    Science.gov (United States)

    Wang, Dan; Liu, Chenxi; Zhang, Zinan; Ye, Liping; Zhang, Xinping

    2018-03-27

    With the advantages of patient-pharmacist communication widely advocated and pharmacist-patient communication remaining poor in many settings, it is of great significance and urgency to explore the mechanism of the pharmacist-patient communicative relationship. King's theory of goal attainment is proposed as one of the most promising models to apply, because it takes into consideration both improving the patient-pharmacist relationship and attaining patients' health outcomes. This study aimed to validate King's transaction process and to build the linkage between the transaction process and patient satisfaction in a pharmaceutical context. A cross-sectional study was conducted in four tertiary hospitals in two provincial cities (Wuhan and Shanghai) in central and east China in July 2017. Patients over 18 were surveyed in the pharmacies of the hospitals. The instrument for the transaction process was revised and tested. Path analysis was conducted for King's transaction process and its relationship with patient satisfaction. Five hundred eighty-nine participants were included in the main study. Prior to the addition of covariates, the hypothesised model of King's transaction process was validated: all paths of the transaction process were statistically significant, and the transaction process had direct effects on patient satisfaction. King's transaction process was thus established as a valid theoretical framework for healthcare provider-patient communication in a pharmaceutical context. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Kinetics and mechanisms of metal retention/release in geochemical processes in soil. 1998 annual progress report

    International Nuclear Information System (INIS)

    Taylor, R.W.

    1998-01-01

    'The long-term fate of toxic metals in soils cannot be precisely predicted, and remediation recommendations and techniques may therefore be ineffective or unnecessary. This work will generate basic knowledge on the kinetics and mechanism(s) of heavy metal retention/release by soil mineral colloids. The information should assist in improving remediation strategies for soils contaminated with toxic heavy metals. The objectives are: (1) to determine the effects of residence time on the mechanisms of Cr(VI), Cu(II), Co(II), Cd(II), Pb(II), and Ni(II) sorption/release on Fe and Al oxide and clay mineral surfaces, using kinetic studies coupled with extended X-ray absorption fine structure (EXAFS) spectroscopy and Fourier transform infrared (FTIR) spectroscopy; (2) to study the effect of temperature, pH, and phosphate on metal sorption by oxides, and to derive thermodynamic parameters describing the sorption process. As of June 16, 1997, several clay minerals had been tested for their efficiency in removing Cr from aqueous systems. The materials tested--smectite, vermiculites, illites, and kaolinite--represent the natural clay minerals that are abundant in soils and sediments. The clays were used either in their original form or in reduced form (reduced with sodium dithionite). The experimental results indicate that the reduced clays efficiently removed Cr(VI) from aqueous systems. The XANES spectra of the Cr-treated clays provided evidence that the clays reduced Cr(VI) to Cr(III) and at the same time immobilized the Cr in the clays. Sodium dithionite applied directly to aqueous systems reduced Cr(VI) to Cr(III), but could not immobilize the Cr even in the presence of the clays. The Cr(VI) removal capacity varied with the clay mineral type and the structural Fe content. For the clays used in this study, the removal capacity follows the order smectites > vermiculites and illites > kaolinite. Within the same type of clay mineral, the reduction of Cr(VI) is highly related to the ferrous iron

  2. Effects of adenosine on renin release from isolated rat glomeruli and kidney slices

    DEFF Research Database (Denmark)

    Skøtt, O; Baumbach, L

    1985-01-01

    was used. The specificity of the renin release process was validated by measuring adenylate kinase as a marker for cytoplasmic leakage. Adenosine (10 micrograms/ml) halved basal renin release from incubated KS as compared with controls (P less than 0.001, n = 8, 8). Renin release from LAG stimulated by calcium depletion was also inhibited (P less than 0.05, n = 8, 9), whereas basal release was not affected (n = 6, 12). No effect was detected on either basal or calcium-stimulated renin release from SAG. We conclude that adenosine inhibits renin release in vitro by a mechanism independent...

  3. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of a pilot software system incorporating typical features of critical software for the safety protection of nuclear power plants. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  4. Disruption of Pseudomonas putida by high pressure homogenization: a comparison of the predictive capacity of three process models for the efficient release of arginine deiminase.

    Science.gov (United States)

    Patil, Mahesh D; Patel, Gopal; Surywanshi, Balaji; Shaikh, Naeem; Garg, Prabha; Chisti, Yusuf; Banerjee, Uttam Chand

    2016-12-01

    Disruption of Pseudomonas putida KT2440 by high-pressure homogenization in a French press is discussed with respect to the release of arginine deiminase (ADI). The enzyme release response of the disruption process was modelled for the experimental factors of biomass concentration in the broth being disrupted, the homogenization pressure, and the number of passes of the cell slurry through the homogenizer. For the same data, the response surface method (RSM), the artificial neural network (ANN) and the support vector machine (SVM) models were compared for their ability to predict the performance parameters of the cell disruption. The ANN model proved to be best for predicting the ADI release. The fractional disruption of the cells was best modelled by the RSM. The fraction of the cells disrupted depended mainly on the operating pressure of the homogenizer. The concentration of the biomass in the slurry was the most influential factor in determining the total protein release. Nearly 27 U/mL of ADI was released within a single pass from slurry with a biomass concentration of 260 g/L at an operating pressure of 510 bar. Using a biomass concentration of 100 g/L, the ADI release by French press was 2.7-fold greater than in a conventional high-speed bead mill. In the French press, the total protein release was 5.8-fold more than in the bead mill. The statistical analysis of completely unseen data showed ANN and SVM modelling to be proficient alternatives to RSM for the prediction and generalization of the cell disruption process in the French press.
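
    Of the three models compared, the RSM is the simplest to sketch: a second-order polynomial fitted by least squares over coded factor levels. The snippet below uses synthetic data with invented coefficients (it is not the paper's data) just to show the form of such a model for two coded factors, say pressure and number of passes.

```python
import numpy as np

def fit_rsm(X, y):
    """Least-squares fit of a second-order response surface in two factors:
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    D = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta

# Synthetic enzyme-release data generated from invented coefficients
# (coded units: x1 = homogenization pressure, x2 = number of passes)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))
true = np.array([50.0, 8.0, 5.0, -3.0, -2.0, 1.5])
design = np.column_stack([np.ones(30), X[:, 0], X[:, 1],
                          X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
y = design @ true
beta = fit_rsm(X, y)     # recovers `true` exactly on noise-free data
```

    ANN and SVM models replace this fixed polynomial with learned nonlinear functions, which is why they can generalize better on unseen disruption data.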

  5. A Proposed Mechanism for Development of CTE Following Concussive Events: Head Impact, Water Hammer Injury, Neurofilament Release, and Autoimmune Processes.

    Science.gov (United States)

    Kornguth, Steven; Rutledge, Neal; Perlaza, Gabe; Bray, James; Hardin, Allen

    2017-12-19

    During the past decade, there has been increasing interest in the early diagnosis and treatment of traumatic brain injuries (TBI) that lead to chronic traumatic encephalopathy (CTE). The subjects involved range from soldiers exposed to concussive injuries from improvised explosive devices (IEDs) to a significant number of athletes involved in repetitive high-force impacts. Although the forces from IEDs are greater by an order of magnitude than those from contact sports, the higher frequency associated with contact sports allows for more controlled assessment of the mechanism of action. In our study, we report findings in university-level women's soccer athletes followed over a period of four and a half years from accession to graduation. Parameters investigated included T1-, T2-, and susceptibility-weighted magnetic resonance images (SWI), IMPACT (Immediate Post-Concussion Assessment and Cognitive Testing), and C3 Logix behavioral and physiological assessment measures. The MRI studies show several significant findings: first, a marked increase in the width of sulci in the frontal to occipital cortices; second, the appearance of subtle hemorrhagic changes at the base of the sulci; and third, a sustained reduction in total brain volume in several soccer players at a developmental time when brain growth is generally seen. Although all of the athletes successfully completed their college degree and none exhibited long-term clinical deficits at the time of graduation, the changes documented by MRI represent a clue to the pathological mechanism that follows injury. The authors propose that our findings and those of prior publications support a mechanism of injury in CTE caused by an autoimmune process associated with the release of neural proteins from nerve cells at the base of the sulcus from a water hammer injury effect. As evidence accumulates to support this hypothesis, there are pharmacological treatment strategies that may be able to mitigate the development of

  6. A Proposed Mechanism for Development of CTE Following Concussive Events: Head Impact, Water Hammer Injury, Neurofilament Release, and Autoimmune Processes

    Directory of Open Access Journals (Sweden)

    Steven Kornguth

    2017-12-01

    Full Text Available During the past decade, there has been increasing interest in the early diagnosis and treatment of traumatic brain injuries (TBI) that lead to chronic traumatic encephalopathy (CTE). The subjects involved range from soldiers exposed to concussive injuries from improvised explosive devices (IEDs) to a significant number of athletes involved in repetitive high-force impacts. Although the forces from IEDs are greater by an order of magnitude than those from contact sports, the higher frequency associated with contact sports allows for more controlled assessment of the mechanism of action. In our study, we report findings in university-level women's soccer athletes followed over a period of four and a half years from accession to graduation. Parameters investigated included T1-, T2-, and susceptibility-weighted magnetic resonance images (SWI), IMPACT (Immediate Post-Concussion Assessment and Cognitive Testing), and C3 Logix behavioral and physiological assessment measures. The MRI studies show several significant findings: first, a marked increase in the width of sulci in the frontal to occipital cortices; second, the appearance of subtle hemorrhagic changes at the base of the sulci; and third, a sustained reduction in total brain volume in several soccer players at a developmental time when brain growth is generally seen. Although all of the athletes successfully completed their college degree and none exhibited long-term clinical deficits at the time of graduation, the changes documented by MRI represent a clue to the pathological mechanism that follows injury. The authors propose that our findings and those of prior publications support a mechanism of injury in CTE caused by an autoimmune process associated with the release of neural proteins from nerve cells at the base of the sulcus from a water hammer injury effect. As evidence accumulates to support this hypothesis, there are pharmacological treatment strategies that may be able to mitigate the

  7. Validation of the baking process as a kill-step for controlling Salmonella in muffins.

    Science.gov (United States)

    Channaiah, Lakshmikantha H; Michael, Minto; Acuff, Jennifer C; Phebus, Randall K; Thippareddi, Harshavardhan; Olewnik, Maureen; Milliken, George

    2017-06-05

    This research investigates the potential risk of Salmonella in muffins when contamination is introduced via flour, the main ingredient. Flour was inoculated with a 3-strain cocktail of Salmonella serovars (Newport, Typhimurium, and Senftenberg) and re-dried to achieve a target concentration of ~8 log CFU/g. The inoculated flour was then used to prepare muffin batter following a standard commercial recipe. The survival of Salmonella during and after baking at 190.6°C for 21 min was analyzed by plating samples on selective and injury-recovery media at regular intervals. The thermal inactivation parameters (D- and z-values) of the 3-strain Salmonella cocktail were determined. A ≥5 log CFU/g reduction in the Salmonella population was demonstrated by 17 min of baking, and a 6.1 log CFU/g reduction by 21 min of baking. The D-values of the Salmonella serovar cocktail in muffin batter were 62.2 ± 3.0, 40.1 ± 0.9 and 16.5 ± 1.7 min at 55, 58 and 61°C, respectively, and the z-value was 10.4 ± 0.6°C. The water activity (aw) of the muffin crumb after baking and 30 min of cooling (0.928) was similar to that of the pre-baked muffin batter, whereas the aw of the muffin crust decreased to 0.700. This study validates a typical commercial muffin baking process utilizing an oven temperature of 190.6°C for at least 17 min as an effective kill-step in reducing a Salmonella serovar population by ≥5 log CFU/g. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
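
    The reported D- and z-values are internally consistent, which can be checked directly: the z-value is the negative reciprocal slope of log10(D) against temperature, i.e. the temperature rise that cuts the D-value tenfold. The helper below is a generic sketch of that calculation, not the authors' analysis code.

```python
import math

def z_value(temps_c, d_minutes):
    """z-value from a log-linear fit: negative reciprocal slope of
    log10(D) versus temperature (simple least-squares regression)."""
    n = len(temps_c)
    tbar = sum(temps_c) / n
    ybar = sum(math.log10(d) for d in d_minutes) / n
    num = sum((t - tbar) * (math.log10(d) - ybar)
              for t, d in zip(temps_c, d_minutes))
    den = sum((t - tbar) ** 2 for t in temps_c)
    return -den / num

# D-values reported for the Salmonella cocktail in muffin batter
z = z_value([55.0, 58.0, 61.0], [62.2, 40.1, 16.5])   # about 10.4 degC
```

    Running this on the three reported D-values reproduces the paper's z-value of roughly 10.4°C.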

  8. Non-isothermal processes during the drying of bare soil: Model Development and Validation

    Science.gov (United States)

    Sleep, B.; Talebi, A.; O'Carrol, D. M.

    2017-12-01

    Several coupled liquid water, water vapor, and heat transfer models have been developed either to study non-isothermal processes in the subsurface immediately below the ground surface or to predict the evaporative flux from the ground surface. Equilibrium phase change between the water and gas phases is typically assumed in these models. Recently, a few studies have questioned this assumption and proposed coupled models that consider kinetic phase change; however, none of these models were validated against real field data. In this study, a non-isothermal coupled model incorporating kinetic phase change was developed and tested against measured data from a green roof test module. The model also incorporated a new surface boundary condition for water vapor transport at the ground surface. The measured field data included soil moisture content and temperature at several depths, down to 15 cm below the ground surface. Lysimeter data were collected to determine the evaporation rates. Short- and long-wave radiation, wind velocity, ambient air temperature and relative humidity were measured and used as model input. Field data were collected over a period of three months during the warm seasons in southeastern Canada. The model was calibrated using one drying period, and several other drying periods were then simulated. In general, the model underestimated the evaporation rates in the early stage of each drying period; however, the cumulative evaporation was in good agreement with the field data. The model predicted the trends in temperature and moisture content at the different depths in the green roof module. The simulated temperature was lower than the measured temperature for most of the simulation time, with a maximum difference of 5°C. The simulated moisture content changes had the same temporal trend as the lysimeter data for the events simulated.
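
    The distinction between equilibrium and kinetic phase change can be illustrated with a first-order evaporation source term: instead of forcing the vapor density to its saturated value, the rate is taken proportional to the vapor-density deficit. The Magnus saturation-pressure formula used below is standard; the rate constant `k` is a purely hypothetical placeholder, not a value from the paper.

```python
import math

R_V = 461.5  # J kg^-1 K^-1, specific gas constant for water vapor

def sat_vapor_density(T_c):
    """Saturated vapor density (kg/m3) at temperature T_c in degC,
    via the Magnus formula for saturation vapor pressure."""
    e_s = 610.94 * math.exp(17.625 * T_c / (T_c + 243.04))   # Pa
    return e_s / (R_V * (T_c + 273.15))

def kinetic_evaporation(rho_v, T_c, k=1.0e-3):
    """Kinetic (non-equilibrium) phase-change source term: evaporation
    rate proportional to the deficit below saturation. `k` is a
    hypothetical rate constant (1/s)."""
    return k * (sat_vapor_density(T_c) - rho_v)
```

    At equilibrium (vapor density equal to saturation) the source term vanishes; an undersaturated gas phase yields a positive evaporation rate, which is the behaviour a kinetic model relaxes toward rather than imposes instantaneously.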

  9. Development of Maltodextrin-Based Immediate-Release Tablets Using an Integrated Twin-Screw Hot-Melt Extrusion and Injection-Molding Continuous Manufacturing Process.

    Science.gov (United States)

    Puri, Vibha; Brancazio, Dave; Desai, Parind M; Jensen, Keith D; Chun, Jung-Hoon; Myerson, Allan S; Trout, Bernhardt L

    2017-11-01

    The combination of hot-melt extrusion and injection molding (HME-IM) is a promising process technology for the continuous manufacturing of tablets. However, there has been limited research on its application to the formulation of immediate-release tablets containing a crystalline drug. Furthermore, studies that have applied the HME-IM process to molded tablets have used a noncontinuous 2-step approach. The present study develops maltodextrin (MDX)-based extrusion-molded immediate-release tablets for a crystalline drug (griseofulvin) using an integrated twin-screw HME-IM continuous process. At 10% w/w drug loading, MDX was selected as the tablet matrix former based on a preliminary screen. Furthermore, liquid and solid polyols were evaluated for melt processing of MDX and for their impact on tablet performance. Smooth-surfaced tablets, comprising a crystalline griseofulvin solid suspension in the amorphous MDX-xylitol matrix, were produced by a continuous process on a twin-screw extruder coupled to a horizontally opening IM machine. Real-time HME process profiles were used to develop automated HME-IM cycles. Formulation adjustments overcame process challenges and improved tablet strength. The developed MDX tablets exhibited adequate strength and a fast-dissolving matrix (85% drug release in 20 min), and maintained performance under accelerated stability conditions. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  10. Validation of the Spanish Version of the Emotional Skills Assessment Process (ESAP) with College Students in Mexico

    Science.gov (United States)

    Teliz Triujeque, Rosalia

    2009-01-01

    The major purpose of the study was to determine the construct validity of the Spanish version of the Emotional Skills Assessment Process (ESAP) in a targeted population of agriculture college students in Mexico. The ESAP is a self assessment approach that helps students to identify and understand emotional intelligence skills relevant for…

  11. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis

    Science.gov (United States)

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double-sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate the different engineering parameters that mos...

  12. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for digital modelling phase. • Importance of the product life-cycle management in the verification and validation framework. -- Abstract: This paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. The paper is based on the results of the project “verification and validation (V and V) of the ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design within a system engineering (SE) framework. The paper reviews the definitions of DMU and virtual prototype, and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on this survey of best industrial practices, the paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  13. Validation of the manufacturing process used to produce long-acting recombinant factor IX Fc fusion protein.

    Science.gov (United States)

    McCue, J; Osborne, D; Dumont, J; Peters, R; Mei, B; Pierce, G F; Kobayashi, K; Euwart, D

    2014-07-01

    Recombinant factor IX Fc (rFIXFc) fusion protein is the first of a new class of bioengineered long-acting factors approved for the treatment and prevention of bleeding episodes in haemophilia B. The aim of this work was to describe the manufacturing process for rFIXFc, to assess product quality and to evaluate the capacity of the process to remove impurities and viruses. This manufacturing process utilized a transferable and scalable platform approach established for therapeutic antibody manufacturing and adapted for production of the rFIXFc molecule. rFIXFc was produced using a process free of human- and animal-derived raw materials and a host cell line derived from human embryonic kidney (HEK) 293H cells. The process employed multi-step purification and viral clearance processing, including use of a protein A affinity capture chromatography step, which binds to the Fc portion of the rFIXFc molecule with high affinity and specificity, and a 15 nm pore size virus removal nanofilter. Process validation studies were performed to evaluate identity, purity, activity and safety. The manufacturing process produced rFIXFc with consistent product quality and high purity. Impurity clearance validation studies demonstrated robust and reproducible removal of process-related impurities and adventitious viruses. The rFIXFc manufacturing process produces a highly pure product, free of non-human glycan structures. Validation studies demonstrate that this product is produced with consistent quality and purity. In addition, the scalability and transferability of this process are key attributes to ensure consistent and continuous supply of rFIXFc. © 2014 The Authors. Haemophilia Published by John Wiley & Sons Ltd.

  14. Validation of a DNA IQ-based extraction method for TECAN robotic liquid handling workstations for processing casework.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Fourney, Ron M

    2010-10-01

    A semi-automated DNA extraction process for casework samples based on the Promega DNA IQ™ system was optimized and validated on TECAN Genesis 150/8 and Freedom EVO robotic liquid handling stations configured with fixed tips and a TECAN TE-Shake™ unit. The use of an orbital shaker during the extraction process promoted efficiency with respect to DNA capture, magnetic bead/DNA complex washes and DNA elution. Validation studies determined the reliability and limitations of this shaker-based process. Reproducibility of DNA yields on the tested robotic workstations proved to be excellent and not significantly different from that offered by manual phenol/chloroform extraction. DNA extraction of animal:human blood mixtures contaminated with soil demonstrated that a human profile was detectable even in the presence of abundant animal blood. For exhibits containing small amounts of biological material, concordance studies confirmed that DNA yields for this shaker-based extraction process are equivalent or superior to those observed with phenol/chloroform extraction as well as our original validated automated magnetic bead percolation-based extraction process. Our data further support the increasing use of robotics for the processing of casework samples. Crown Copyright © 2009. Published by Elsevier Ireland Ltd. All rights reserved.

  15. Iodine-131 Releases from Radioactive Lanthanum Processing at the X-10 Site in Oak Ridge, Tennessee (1944-1956) - An Assessment of Quantities Released, Off-Site Radiation Doses, and Potential Excess Risks of Thyroid Cancer, Volume 1

    International Nuclear Information System (INIS)

    Apostoaei, A.I.; Burns, R.E.; Hoffman, F.O.; Ijaz, T.; Lewis, C.J.; Nair, S.K.; Widner, T.E.

    1999-01-01

    In the early 1990s, concern about the Oak Ridge Reservation's past releases of contaminants to the environment prompted Tennessee's public health officials to pursue an in-depth study of potential off-site health effects at Oak Ridge. This study, the Oak Ridge dose reconstruction, was supported by an agreement between the U.S. Department of Energy (DOE) and the State of Tennessee, and was overseen by a 12-member panel appointed by Tennessee's Commissioner of Health. One of the major contaminants studied in the dose reconstruction was radioactive iodine, which was released to the air by X-10 (now called Oak Ridge National Laboratory) as it processed spent nuclear reactor fuel from 1944 through 1956. The process recovered radioactive lanthanum for use in weapons development. Iodine concentrates in the thyroid gland, so health concerns include various diseases of the thyroid, such as thyroid cancer. The large report, “Iodine-131 Releases from Radioactive Lanthanum Processing at the X-10 Site in Oak Ridge, Tennessee (1944-1956) - An Assessment of Quantities Released, Off-site Radiation Doses, and Potential Excess Risks of Thyroid Cancer,” is in two volumes. Volume 1 is the main body of the report, and Volume 1A, which has the same title, consists of 22 supporting appendices. Together, these reports serve the following purposes: (1) describe the methodologies used to estimate the amount of iodine-131 (I-131) released; (2) evaluate I-131's pathway from air to vegetation to food to humans; (3) estimate doses received by human thyroids; (4) estimate the excess risk of acquiring a thyroid cancer during one's lifetime; and (5) provide equations, examples of historical documents used, and tables of calculated values. Results indicate that females born in 1952 who consumed milk from a goat pastured a few miles east of X-10 received the highest doses from I-131 and would have had the highest risks of contracting thyroid cancer. Doses from cow's milk are considerably less.
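
The ingestion-pathway calculation outlined in purposes (2) and (3) can be sketched as intake times a dose conversion factor. All numbers and the function below are hypothetical illustrations of the general form of such a calculation, not values or methods taken from the report.

```python
# Hypothetical sketch of a milk-ingestion thyroid dose calculation:
# committed dose = (concentration x consumption x duration) x dose coefficient.
def thyroid_dose_sv(milk_conc_bq_per_l, litres_per_day, days, dcf_sv_per_bq):
    intake_bq = milk_conc_bq_per_l * litres_per_day * days  # total I-131 ingested
    return intake_bq * dcf_sv_per_bq                        # committed dose in Sv

# Illustrative inputs only (invented, not from the dose reconstruction)
dose = thyroid_dose_sv(milk_conc_bq_per_l=50.0, litres_per_day=0.5,
                       days=30, dcf_sv_per_bq=2.0e-6)
print(f"{dose:.4f} Sv")
```

The actual reconstruction layered many more terms on this skeleton (air-to-vegetation transfer, feed-to-milk transfer, age-dependent coefficients), but each stage multiplies into the chain in the same way.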

  16. Iodine-131 Releases from Radioactive Lanthanum Processing at the X-10 Site in Oak Ridge, Tennessee (1944-1956) - An Assessment of Quantities Released, Off-Site Radiation Doses, and Potential Excess Risks of Thyroid Cancer, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Apostoaei, A.I.; Burns, R.E.; Hoffman, F.O.; Ijaz, T.; Lewis, C.J.; Nair, S.K.; Widner, T.E.

    1999-07-01

    In the early 1990s, concern about the Oak Ridge Reservation's past releases of contaminants to the environment prompted Tennessee's public health officials to pursue an in-depth study of potential off-site health effects at Oak Ridge. This study, the Oak Ridge dose reconstruction, was supported by an agreement between the U.S. Department of Energy (DOE) and the State of Tennessee, and was overseen by a 12-member panel appointed by Tennessee's Commissioner of Health. One of the major contaminants studied in the dose reconstruction was radioactive iodine, which was released to the air by X-10 (now called Oak Ridge National Laboratory) as it processed spent nuclear reactor fuel from 1944 through 1956. The process recovered radioactive lanthanum for use in weapons development. Iodine concentrates in the thyroid gland, so health concerns include various diseases of the thyroid, such as thyroid cancer. The large report, “Iodine-131 Releases from Radioactive Lanthanum Processing at the X-10 Site in Oak Ridge, Tennessee (1944-1956) - An Assessment of Quantities Released, Off-site Radiation Doses, and Potential Excess Risks of Thyroid Cancer,” is in two volumes. Volume 1 is the main body of the report, and Volume 1A, which has the same title, consists of 22 supporting appendices. Together, these reports serve the following purposes: (1) describe the methodologies used to estimate the amount of iodine-131 (I-131) released; (2) evaluate I-131's pathway from air to vegetation to food to humans; (3) estimate doses received by human thyroids; (4) estimate the excess risk of acquiring a thyroid cancer during one's lifetime; and (5) provide equations, examples of historical documents used, and tables of calculated values. Results indicate that females born in 1952 who consumed milk from a goat pastured a few miles east of X-10 received the highest doses from I-131 and would have had the highest risks of contracting thyroid cancer. Doses from cow's milk are considerably less.

  17. The influence of DOM and microbial processes on arsenic release from karst during ASR operations in the Floridan Aquifer

    Science.gov (United States)

    Jin, J.; Zimmerman, A. R.

    2011-12-01

    The mobilization of subsurface As poses a serious threat to human health, particularly in a region such as Florida where the population is heavily dependent on highly porous karstic aquifers for drinking water. Injection water used in aquifer storage and recovery (ASR) or aquifer recharge (AR) operations is commonly high in dissolved organic matter (DOM), and OM can also be present in the subsurface carbonate rock. Using batch incubation experiments, this study examined the role of core preservation methods, as well as the influence of labile and more refractory DOM, on the mobilization of As from carbonate rock. Incubation experiments used sealed reaction vessels with preserved and homogenized core materials collected by coring the Suwannee Formation in southwest Florida, with treatment additions consisting of 1) source water (SW) enriched in sterilized soil DOM, 2) SW enriched in soil DOM and microbes, and 3) SW enriched in sodium acetate. During an initial equilibration phase in native groundwater (NGW) with low dissolved oxygen (DO; Phase 1), we observed the greatest As release of the whole incubation. At the beginning of Phase 2 (N2 headspace), in which NGW was replaced with treatment solutions, there was little As release except in the vessel with Na-acetate added, which also had the lowest ORP. At the start of Phase 3, when incubations were exposed to air, most vessels showed more ion (including As) release into solution. The vessel with Na-acetate had less As release in Phase 3 than in Phase 2. During all experimental phases, DOM or microbe additions had no apparent effect on the amount of As released. The core material was found to contain a significant amount of indigenous DOM (about 8 g OC/kg core), which was released during the incubation, so DOC concentrations displayed no clear pattern among the different treatments. At least three abiotic As mobilization mechanisms may play a role in As release during different stages of the experiment.
Desorption of As from iron

  18. Management of the General Process of Parenteral Nutrition Using mHealth Technologies: Evaluation and Validation Study.

    Science.gov (United States)

    Cervera Peris, Mercedes; Alonso Rorís, Víctor Manuel; Santos Gago, Juan Manuel; Álvarez Sabucedo, Luis; Wanden-Berghe, Carmina; Sanz-Valero, Javier

    2018-04-03

    Any system applied to the control of parenteral nutrition (PN) ought to prove that the process meets the established requirements and should include a repository of records that allows evaluation of information about PN processes at any time. The goal of the research was to evaluate the mobile health (mHealth) app and validate its effectiveness in monitoring the management of the PN process. We studied the evaluation and validation of the general PN process using an mHealth app. The units of analysis were the PN bags prepared and administered at the Son Espases University Hospital, Palma, Spain, from June 1 to September 6, 2016. For the evaluation of the app, we used the Poststudy System Usability Questionnaire with subsequent analysis using the Cronbach alpha coefficient. Validation was performed by checking control compliance for all operations at each stage (validation and transcription of the prescription, preparation, conservation, and administration) and by monitoring the operative control points and critical control points. The results obtained from 387 bags were analyzed, with 30 interruptions of administration. The fulfillment of stages was 100%, including noncritical nonconformities in storage control. The average deviation in the weight of the bags was less than 5%, and the infusion time did not deviate by more than 1 hour. The developed app successfully passed the evaluation and validation tests and was implemented to perform the monitoring procedures for the overall PN process. A new mobile solution derived from this project, managing the quality and traceability of sensitive medicines such as blood-derivative drugs and hazardous drugs, is currently being deployed. ©Mercedes Cervera Peris, Víctor Manuel Alonso Rorís, Juan Manuel Santos Gago, Luis Álvarez Sabucedo, Carmina Wanden-Berghe, Javier Sanz-Valero. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 03.04.2018.

  19. Modifying an Active Compound’s Release Kinetic Using a Supercritical Impregnation Process to Incorporate an Active Agent into PLA Electrospun Mats

    Directory of Open Access Journals (Sweden)

    Carol López de Dicastillo

    2018-04-01

    The main objective of this work was to study the release of cinnamaldehyde (CIN) from electrospun poly(lactic acid) (e-PLA) mats obtained through two techniques: (i) direct incorporation of the active compound during the electrospinning process (e-PLA-CIN); and (ii) supercritical carbon dioxide (scCO2) impregnation of CIN within electrospun PLA mats (e-PLA/CINimp). The development and characterization of both of these active electrospun mats were investigated with the main purpose of modifying the release kinetics of this active compound. Morphological, structural, and thermal properties of these materials were also studied, and control mats (e-PLA and e-PLA-CO2) were developed in order to understand the effects of electrospinning and scCO2 impregnation, respectively, on PLA properties. The two strategies for incorporating this active compound into the PLA matrix resulted in different morphologies that influenced the chemical and physical properties of these composites, and in different release kinetics of CIN. The electrospinning and scCO2 impregnation processes and the presence of CIN altered PLA thermal and structural properties compared with an extruded PLA material. The incorporation of CIN through scCO2 impregnation resulted in a higher release rate and lower diffusion coefficients compared with active electrospun mats in which CIN was incorporated during the electrospinning process.
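
Release kinetics of the kind compared here are often summarized by fitting a power-law (Korsmeyer-Peppas) model, Mt/Minf = k·t^n, to the release curve. The sketch below is a generic illustration with invented data points, not the authors' fitting procedure: given two measurements, n and k can be solved directly.

```python
import math

# Two illustrative release measurements (fraction released at time t, in hours);
# the data are synthetic, generated from a known power law for demonstration
t1, f1 = 2.0, 0.12 * 2.0 ** 0.45
t2, f2 = 8.0, 0.12 * 8.0 ** 0.45

# Korsmeyer-Peppas: Mt/Minf = k * t^n  ->  solve n, then k, from the two points
n = math.log(f2 / f1) / math.log(t2 / t1)
k = f1 / t1 ** n
print(round(n, 2), round(k, 2))  # 0.45 0.12
```

In practice one fits all measured points by regression in log-log space; the exponent n then indicates whether release is diffusion-controlled (n near 0.45 for cylinders) or governed by other mechanisms.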

  20. On the experimental approaches for the assessment of the release of engineered nanomaterials from nanocomposites by physical degradation processes

    International Nuclear Information System (INIS)

    Blázquez, M; Unzueta, I; Egizabal, A

    2014-01-01

    The LIFE+ Project SIRENA, Simulation of the release of nanomaterials from consumer products for environmental exposure assessment (LIFE11 ENV/ES/596), has set up a Technological Surveillance System (TSS) to trace technical references worldwide related to nanocomposites and release from nanocomposites. So far a total of seventy-three items of different nature (from peer-reviewed articles to presentations and contributions to congresses) have been selected and classified as “nanomaterials release simulation technologies”. In the present document, different approaches for simulating different life cycle stages through the physical degradation of polymer nanocomposites at laboratory scale are assessed. In the absence of a reference methodology, the comparison of the different protocols used still remains a challenge.

  1. On the experimental approaches for the assessment of the release of engineered nanomaterials from nanocomposites by physical degradation processes

    Science.gov (United States)

    Blázquez, M.; Egizabal, A.; Unzueta, I.

    2014-08-01

    The LIFE+ Project SIRENA, Simulation of the release of nanomaterials from consumer products for environmental exposure assessment (LIFE11 ENV/ES/596), has set up a Technological Surveillance System (TSS) to trace technical references worldwide related to nanocomposites and release from nanocomposites. So far a total of seventy-three items of different nature (from peer-reviewed articles to presentations and contributions to congresses) have been selected and classified as "nanomaterials release simulation technologies". In the present document, different approaches for simulating different life cycle stages through the physical degradation of polymer nanocomposites at laboratory scale are assessed. In the absence of a reference methodology, the comparison of the different protocols used still remains a challenge.

  2. A validated stability-indicating RP-HPLC method for levofloxacin in the presence of degradation products, its process related impurities and identification of oxidative degradant.

    Science.gov (United States)

    Lalitha Devi, M; Chandrasekhar, K B

    2009-12-05

    The objective of the current study was to develop a validated, specific, stability-indicating reversed-phase liquid chromatographic method for the quantitative determination of levofloxacin and its related substances in bulk samples and pharmaceutical dosage forms in the presence of degradation products and process-related impurities. Forced degradation studies were performed on a bulk sample of levofloxacin under the ICH-prescribed stress conditions (acid, base, oxidative, water hydrolysis, thermal and photolytic stress) to demonstrate the stability-indicating power of the method. Significant degradation was observed under oxidative stress, and the degradation product formed was identified by LC-MS/MS; slight degradation occurred under acidic stress, and no degradation was observed under the other stress conditions. The chromatographic method was optimized using the samples generated from the forced degradation studies and an impurity-spiked solution. Good resolution between the analyte peak and the peaks corresponding to process-related impurities and degradation products was achieved on an ACE C18 column using a mobile phase consisting of a mixture of 0.5% (v/v) triethylamine in sodium dihydrogen orthophosphate dihydrate (25 mM; pH 6.0) and methanol with a simple linear gradient. Detection was carried out at 294 nm. The limits of detection and quantitation for levofloxacin and its process-related impurities were established. The stressed test solutions were assayed against the qualified working standard of levofloxacin, and the mass balance in each case was between 99.4 and 99.8%, indicating that the developed LC method was stability indicating. Validation of the developed LC method was carried out as per ICH requirements. The method was found to be suitable for checking the quality of bulk samples of levofloxacin at the time of batch release and during stability studies (long-term and accelerated).
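
The 99.4-99.8% mass balance cited above is conventionally computed as the assay of the stressed sample plus the total of degradants/impurities, relative to the unstressed assay. The sketch below is a generic illustration of that arithmetic with invented values, not figures from the study.

```python
def mass_balance_percent(assay_percent, total_impurities_percent):
    """Mass balance = assay of the stressed sample + total degradants/impurities,
    both expressed as % of the initial (unstressed) content."""
    return assay_percent + total_impurities_percent

# Illustrative values only, chosen to land in the range the study reports
mb = mass_balance_percent(97.6, 1.9)
print(round(mb, 1))
```

A mass balance close to 100% supports the claim that the method detects all significant degradation products rather than losing them in the chromatography.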

  3. [Support of the nursing process through electronic nursing documentation systems (UEPD) – Initial validation of an instrument].

    Science.gov (United States)

    Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi

    2016-01-01

    Electronic nursing documentation systems, with standardized nursing terminology, are IT-based systems for recording the nursing process. These systems have the potential to improve the documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined by means of an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial evidence for the validity of the UEPD was obtained. The analysis showed a stable four-factor model (FS = 0.89) comprising 25 items. All factor loadings were ≥ 0.50 and the scales demonstrated high internal consistency (Cronbach's α = 0.73 – 0.90). Principal component analysis revealed four dimensions of support: establishing nursing diagnoses and goals; recording a case history/an assessment and documenting the nursing process; implementation and evaluation; and information exchange. Further testing with larger samples and with different electronic documentation systems is needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
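
The internal consistency figures quoted here (Cronbach's α = 0.73 – 0.90) follow from the standard formula α = k/(k-1) · (1 - Σ item variances / variance of totals). A minimal stdlib-only sketch, with invented response data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists,
    all covering the same respondents in the same order."""
    k = len(items)
    item_vars = sum(pvariance(it) for it in items)          # sum of item variances
    totals = [sum(vals) for vals in zip(*items)]            # per-respondent totals
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Three perfectly consistent items (illustrative data) -> alpha of 1.0
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(round(cronbach_alpha(items), 3))
```

Real scales never reach 1.0; values in the 0.7-0.9 band, as reported for the UEPD subscales, are conventionally read as acceptable to good reliability.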

  4. Process simulation and experimental validation of Hot Metal Gas Forming with new press hardening steels

    Science.gov (United States)

    Paul, A.; Reuther, F.; Neumann, S.; Albert, A.; Landgrebe, D.

    2017-09-01

    One field of work at the Fraunhofer Institute for Machine Tools and Forming Technology IWU in Chemnitz is applied industrial research in Hot Metal Gas Forming combined with press hardening in one process step. In this paper the results of investigations on new press hardening steels from SSAB AB (Docol®1800 Bor and Docol®2000 Bor) are presented. Hot tensile tests recorded by the project partner (University of West Bohemia, Faculty of Mechanical Engineering) were used to create a material model for thermo-mechanical forming simulations. For this purpose the provided raw data were converted into flow curve approximations of the true stress-true strain curves for both materials and then integrated into an LS-DYNA simulation model of Hot Metal Gas Forming with all relevant boundary conditions and sub-stages. Preliminary experimental tests were carried out using a tool at room temperature to permit evaluation of the forming behaviour of Docol 1800 Bor and Docol 2000 Bor tubes as well as validation of the simulation model. Using this demonstrator geometry (outer diameter 57 mm, tube length 300 mm, wall thickness 1.5 mm), the intention was to perform a series of tests with different furnace temperatures (from 870 °C to 1035 °C), maximum internal pressures (up to 67 MPa) and pressure build-up rates (up to 40 MPa/s) to evaluate the formability of Docol 1800 Bor and Docol 2000 Bor. Selected demonstrator parts produced in this way were subsequently analysed by wall thickness and hardness measurements. The tests were carried out using the completely modernized Dunkes/AP&T HS3-1500 hydroforming press at the Fraunhofer IWU. In summary, a consistent simulation model with all relevant sub-stages was successfully established in LS-DYNA. The computation results show a high correlation with the experimental data regarding the thinning behaviour. Hot Metal Gas Forming of the demonstrator geometry was successfully established as well. Different hardness values

  5. Applicability of near-infrared spectroscopy in the monitoring of film coating and curing process of the prolonged release coated pellets.

    Science.gov (United States)

    Korasa, Klemen; Hudovornik, Grega; Vrečer, Franc

    2016-10-10

    Although process analytical technology (PAT) guidance was introduced to the pharmaceutical industry only a decade ago, this innovative approach has already become an important part of efficient pharmaceutical development, manufacturing, and quality assurance. PAT tools are especially important in technologically complex operations that require strict control of critical process parameters and have a significant effect on final product quality. Manufacturing of prolonged release film coated pellets is definitely one such process. The aim of the present work was to study the applicability of the at-line near-infrared spectroscopy (NIR) approach in the monitoring of the pellet film coating and curing steps. Film coated pellets were manufactured by coating the active ingredient containing pellets with a film coating based on polymethacrylate polymers (Eudragit® RS/RL). NIR proved to be a useful tool for monitoring the curing process, since it was able to determine the extent of curing and hence predict the drug release rate by using a partial least squares (PLS) model. However, this approach also showed a number of limitations, such as low reliability and high susceptibility to pellet moisture content, and was thus not able to predict drug release from pellets with high moisture content. On the other hand, the at-line NIR was capable of predicting the thickness of the Eudragit® RS/RL film coating over a wide range (up to 40 μm) with good accuracy, even in pellets with high moisture content. To sum up, the high applicability of at-line NIR in the monitoring of prolonged release pellet production was demonstrated in the present study. The present findings may contribute to more efficient and reliable PAT solutions in the manufacturing of prolonged release dosage forms. Copyright © 2016 Elsevier B.V. All rights reserved.
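
The study's coating-thickness prediction uses a multivariate PLS model over full NIR spectra. As a much-simplified stand-in, the sketch below calibrates a single (invented) spectral response against coating thickness by ordinary least squares and inverts the fit to predict thickness from a new reading; all data and the linear single-wavelength assumption are illustrative only.

```python
# Illustrative single-wavelength calibration (a simplified stand-in for PLS):
# absorbance vs. coating thickness, with synthetic perfectly linear data
thickness_um = [5, 10, 20, 30, 40]
absorbance   = [0.11, 0.21, 0.41, 0.61, 0.81]

# Ordinary least squares fit: absorbance = a * thickness + b
n = len(thickness_um)
mx = sum(thickness_um) / n
my = sum(absorbance) / n
a = sum((x - mx) * (y - my) for x, y in zip(thickness_um, absorbance)) / \
    sum((x - mx) ** 2 for x in thickness_um)
b = my - a * mx

# Invert the calibration: predict thickness from a new absorbance reading
predicted = (0.51 - b) / a
print(round(predicted, 1))  # 25.0
```

A real PLS model plays the same role as the slope/intercept here, but projects hundreds of correlated wavelengths onto a few latent variables before regressing, which is what makes it robust to overlapping spectral features.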

  6. Processing of thyrotropin-releasing hormone prohormone (pro-TRH) generates a biologically active peptide, prepro-TRH-(160-169), which regulates TRH-induced thyrotropin secretion

    International Nuclear Information System (INIS)

    Bulant, M.; Vaudry, H.; Roussel, J.P.; Astier, H.; Nicolas, P.

    1990-01-01

    Rat thyrotropin-releasing hormone (TRH) prohormone contains five copies of the TRH progenitor sequence Gln-His-Pro-Gly linked together by connecting sequences whose biological activity is unknown. Both the predicted connecting peptide prepro-TRH-(160-169) (Ps4) and TRH are predominant storage forms of TRH precursor-related peptides in the hypothalamus. To determine whether Ps4 is co-released with TRH, rat median eminence slices were perfused in vitro. Infusion of depolarizing concentrations of KCl stimulated the release of Ps4- and TRH-like immunoreactivity. The possible effect of Ps4 on thyrotropin release was investigated in vitro using quartered anterior pituitaries. Infusion of Ps4 alone had no effect on thyrotropin secretion but potentiated TRH-induced thyrotropin release in a dose-dependent manner. In addition, the occurrence of specific binding sites for 125I-labeled Tyr-Ps4 in the distal lobe of the pituitary was demonstrated by binding analysis and autoradiographic localization. These findings indicate that these two peptides, which arise from a single multifunctional precursor, the TRH prohormone, act in a coordinated manner on the same target cells to promote hormonal secretion. These data suggest that differential processing of the TRH prohormone may modulate the biological activity of TRH.

  7. Individual differences in processing styles: validity of the Rational-Experiential Inventory.

    Science.gov (United States)

    Björklund, Fredrik; Bäckström, Martin

    2008-10-01

    In Study 1 (N = 203) the factor structure of a Swedish translation of Pacini and Epstein's Rational-Experiential Inventory (REI-40) was investigated using confirmatory factor analysis. The hypothesized model with rationality and experientiality as orthogonal factors had satisfactory fit to the data, significantly better than alternative models (with two correlated factors or a single factor). Inclusion of "ability" and "favorability" subscales for rationality and experientiality increased fit further. It was concluded that the structural validity of the REI is adequate. In Study 2 (N = 72) the REI factors were shown to have theoretically meaningful correlations with other personality traits, indicating convergent and discriminant validity. Finally, scores on the rationality scale were negatively related to risky choice framing effects in Kahneman and Tversky's Asian disease task, indicating concurrent validity. On the basis of these findings it was concluded that the test has satisfactory psychometric properties.

  8. Valid knowledge for the professional design of large and complex design processes

    NARCIS (Netherlands)

    Aken, van J.E.

    2004-01-01

    The organization and planning of design processes, which we may regard as design process design, is an important issue. Especially for large and complex design processes, traditional approaches to process design may no longer suffice. The design literature offers quite a few design process models. As

  9. The Adaptation, Validation, Reliability Process of the Turkish Version Orientations to Happiness Scale

    Directory of Open Access Journals (Sweden)

    Hakan Saricam

    2015-12-01

    The purpose of this research is to adapt the Scale of Happiness Orientations, which was developed by Peterson, Park, and Seligman (2005), into Turkish and examine the psychometric properties of the scale. The participants of the research consisted of 489 students. The psychometric properties of the scale were examined using linguistic equivalence, exploratory factor analysis, confirmatory factor analysis, criterion-related validity, internal consistency, and test-retest methods. For criterion-related validity (concurrent validity), the Oxford Happiness Questionnaire-Short Form was used. Items resulting from the exploratory factor analysis for the structural validity of the scale loaded onto three factors (life of meaning, life of pleasure, life of engagement), in accordance with the original form. Confirmatory factor analysis yielded the following fit indices for the three-factor, 18-item model: χ2/df = 1.94, RMSEA = .059, CFI = .96, GFI = .95, IFI = .95, NFI = .96, RFI = .95 and SRMR = .044. Factor loadings of the scale range from .36 to .59. In the criterion-validity analysis, strong positive correlations (p < .01) were observed between the Scale of Happiness Orientations and the Oxford Happiness Questionnaire. The Cronbach alpha internal consistency coefficient was .88 for the life of meaning subscale, .84 for the life of pleasure subscale, and .81 for the life of engagement subscale. In addition, corrected item-total correlations range from .39 to .61. According to these results, it can be said that the scale is a valid and reliable assessment instrument for positive psychology, educational psychology, and other fields.
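
Fit indices like those reported here are usually judged against rule-of-thumb cutoffs. The thresholds in the sketch below are commonly cited conventions (χ2/df < 3, RMSEA < .08, CFI > .90, SRMR < .08), assumed for illustration rather than taken from the article.

```python
# Rule-of-thumb CFA fit check (assumed conventional cutoffs, not the article's)
def acceptable_fit(chi2_df, rmsea, cfi, srmr):
    return chi2_df < 3 and rmsea < 0.08 and cfi > 0.90 and srmr < 0.08

# Indices reported for the 18-item, three-factor model
print(acceptable_fit(chi2_df=1.94, rmsea=0.059, cfi=0.96, srmr=0.044))  # True
```

By these conventions, every reported index for the Turkish adaptation clears its cutoff, which is the basis for calling the three-factor model an acceptable fit.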

  10. Soil process modelling in CZO research: gains in data harmonisation and model validation

    Science.gov (United States)

    van Gaans, Pauline; Andrianaki, Maria; Kobierska, Florian; Kram, Pavel; Lamacova, Anna; Lair, Georg; Nikolaidis, Nikos; Duffy, Chris; Regelink, Inge; van Leeuwen, Jeroen P.; de Ruiter, Peter

    2014-05-01

    Various soil process models were applied to four European Critical Zone Observatories (CZOs), the core research sites of the FP7 project SoilTrEC: the Damma glacier forefield (CH), a set of three forested catchments on geochemically contrasting bedrocks in the Slavkov Forest (CZ), a chronosequence of soils in the former floodplain of the Danube at Fuchsenbigl/Marchfeld (AT), and the Koiliaris catchments in the north-western part of Crete (GR). The aim of the modelling exercises was to apply and test soil process models with data from the CZOs for calibration/validation, to identify potential limits to the application scope of the models, to interpret soil state and soil functions at key stages of the soil life cycle, represented by the four SoilTrEC CZOs, and to contribute towards harmonisation of data and data acquisition. The models identified as specifically relevant were: the Penn State Integrated Hydrologic Model (PIHM), a fully coupled, multiprocess, multi-scale hydrologic model, to better understand water flow and pathways; the Soil and Water Assessment Tool (SWAT), a deterministic, continuous-time (daily time step) basin-scale model, to evaluate the impact of soil management practices; the Rothamsted Carbon model (Roth-C), to simulate organic carbon turnover, together with the Carbon, Aggregation, and Structure Turnover (CAST) model, to include the role of soil aggregates in carbon dynamics; the Ligand Charge Distribution (LCD) model, to understand the interaction between organic matter and oxide surfaces in soil aggregate formation; and the Terrestrial Ecology Model (TEM), to obtain insight into the link between foodweb structure and carbon and nutrient turnover. With some exceptions, all models were applied to all four CZOs. The need for specific model input contributed largely to data harmonisation. The comparisons between the CZOs turned out to be of great value for understanding the strengths and limitations of the models, as well as the differences in soil conditions

  11. Co-extrusion as a processing technique to manufacture a dual sustained release fixed-dose combination product.

    Science.gov (United States)

    Vynckier, An-Katrien; Voorspoels, Jody; Remon, Jean Paul; Vervaet, Chris

    2016-05-01

    This study aimed to design a fixed-dose combination dosage form which provides a sustained release profile for both the freely water-soluble metformin HCl and the poorly soluble gliclazide, two antidiabetic compounds used to treat diabetes mellitus. Hot-melt co-extrusion was used as an innovative manufacturing technique for a pharmaceutical fixed-dose combination product. In this way, a matrix formulation that sustained metformin release could be developed, despite the high drug load in the formulation and the freely soluble nature of the drug. It was clear that co-extrusion was perfectly suited to produce a fixed-dose combination product with adequate properties for each of the incorporated active pharmaceutical ingredients (APIs). A coat layer containing at least 30% CAPA(®) 6506 as a hydrophobic polymer was necessary to adequately sustain the release of the highly dosed, freely soluble drug from the 70% metformin HCl-loaded CAPA(®) 6506 core of the co-extrudate. To obtain complete gliclazide release over 24 h, solubilization in Kollidon(®) VA, added as a second polymer to the CAPA(®) 6506 in the coat, was needed. Both APIs, which have different physicochemical characteristics, were thus formulated in a single dosage form using co-extrusion. © 2016 Royal Pharmaceutical Society, Journal of Pharmacy and Pharmacology.

  12. Floating matrix tablets based on low density foam powder: effects of formulation and processing parameters on drug release.

    Science.gov (United States)

    Streubel, A; Siepmann, J; Bodmeier, R

    2003-01-01

    The aim of this study was to develop and physicochemically characterize single unit, floating controlled drug delivery systems consisting of (i) polypropylene foam powder, (ii) matrix-forming polymer(s), (iii) drug, and (iv) filler (optional). The highly porous foam powder provided low density and, thus, excellent in vitro floating behavior of the tablets. All foam powder-containing tablets remained floating for at least 8 h in 0.1 N HCl at 37 degrees C. Different types of matrix-forming polymers were studied: hydroxypropyl methylcellulose (HPMC), polyacrylates, sodium alginate, corn starch, carrageenan, guar gum and gum arabic. The tablets eroded upon contact with the release medium, and the relative importance of drug diffusion, polymer swelling and tablet erosion for the resulting release patterns varied significantly with the type of matrix former. The release rate could effectively be modified by varying the "matrix-forming polymer/foam powder" ratio, the initial drug loading, the tablet geometry (radius and height), the type of matrix-forming polymer, the use of polymer blends and the addition of water-soluble or water-insoluble fillers (such as lactose or microcrystalline cellulose). The floating behavior of the low density drug delivery systems could successfully be combined with accurate control of the drug release patterns.

  13. Investigation of some factors affecting on release of radon-222 from phosphogypsum waste associated with phosphate ore processing.

    Science.gov (United States)

    Hilal, M A; El Afifi, E M; Nayl, A A

    2015-07-01

    The aim of this study is to investigate the influence of physicochemical factors such as radium distribution, grain size, moisture content and chemical constituents on the release of radon-222 from accumulated phosphogypsum (PG) waste. The emanation fraction, the activity concentration in the pores and the surface exhalation rate of radon-222 in the bulk PG waste are 34.5 ± 0.3%, 238.6 ± 7.8 kBq m(-3) and 213 ± 6.9 mBq m(-2) s(-1), respectively. These values varied and were slightly enhanced in the fine grain fraction (F1), by a factor of 1.05 compared with the bulk residue. It was also found that radon release from the residual PG waste was positively controlled by radium (Ra-226), calcium (CaSO4) and strontium (SrO). About 67% of the radon release was attributed to grain sizes below 0.5 mm, and 33% to grain sizes above 0.5 mm. The emanation fraction of Rn-222 increases with moisture content, reaching a maximum of ∼43% at a moisture content of 3-8%; it then decreases slowly as moisture increases further, up to 20%. PG waste in situ can thus enhance the background exposure of nearby workers and/or the public. Therefore, the negative environmental impacts of Rn-222 release can be minimized by legislation restricting its civil uses, by increasing its moisture content to ∼10%, or by particle-size separation of the fine fraction containing the high levels of Ra-226 followed by suitable chemical treatment or disposal; the low-release fraction can be diluted and used in the cement industry, road building or dam construction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Ultrasound as an Outcome Measure in Gout. A Validation Process by the OMERACT Ultrasound Working Group

    DEFF Research Database (Denmark)

    Terslev, Lene; Gutierrez, Marwin; Schmidt, Wolfgang A

    2015-01-01

    OBJECTIVE: To summarize the work performed by the Outcome Measures in Rheumatology (OMERACT) Ultrasound (US) Working Group on the validation of US as a potential outcome measure in gout. METHODS: Based on the lack of definitions, highlighted in a recent literature review on US as an outcome tool...

  15. Aerosol release factor for Pu as a consequence of an ion exchange resin fire in the process cell of a fuel reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Bhanti, D.P.; Malvankar, S.V.; Kotrappa, P.; Somasundaram, S.; Raghunath, B.; Curtay, A.M.

    1988-12-01

    One of the upper-limit accidents usually considered in the safety analysis of a fuel reprocessing plant is an accidental explosion, followed by a fire, of an ion exchange column containing resin loaded with large quantities of plutonium. In such accidents, a certain fraction (release factor) of Pu is released in the form of an aerosol into the ventilation system, and finally to the environment through HEPA filters and the stack. The present study was undertaken to determine the aerosol release factor for Pu in the process cell of a typical fuel reprocessing plant. Geometrically similar scaled-down models of three different sizes were built, and suitably scaled-down quantities of resin loaded with thorium in nitric acid medium were burnt in these model cells. Thorium was used in place of Pu because of its physical and chemical similarities to Pu. The release factor was obtained by comparing the amount of Th in air with the total. The study also dealt with aerosol characteristics and the kinematics of the fire process. The aerosol release factors for the three models were found to lie in the range 0.01-0.07%, and varied non-monotonically with model size. The analysis of the scaled-down results in conjunction with simplified aerosol modelling yielded a release factor for the actual cell conditions of 0.012%, with an upper-limit value of 0.1%. Particle size analysis based on Th radioactivity and particle mass indicated non-uniform tagging of Th to the aerosol particles. These particles were irregularly shaped, but not long chain-like aggregates. The study proposes, with a reasonable degree of conservatism, a release factor of 0.1% for such fires, with aerosol parameters AMAD and sigma(g) of 2 μm and 2, respectively. However, for situations significantly different from the present one, the release factor of 1% recommended by the American National Standards Institute may be used with a greater degree of confidence in the light of the present work.
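    The reported aerosol parameters describe a lognormal activity-size distribution: the AMAD is the median of the activity distribution and sigma(g) its geometric standard deviation. A small sketch (the helper function is ours, not from the study) of the cumulative activity fraction below a given aerodynamic diameter:

```python
import math

def activity_fraction_below(d_um, amad_um=2.0, sigma_g=2.0):
    """Cumulative activity fraction below aerodynamic diameter d for a
    lognormal activity-size distribution with the stated AMAD and GSD."""
    z = math.log(d_um / amad_um) / math.log(sigma_g)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

half = activity_fraction_below(2.0)   # at the AMAD, half the activity lies below
coarse = activity_fraction_below(10.0)  # nearly all activity below 10 um
```

    With AMAD = 2 μm and sigma(g) = 2, roughly 98% of the activity is carried by particles below 8 μm, which is why these two parameters suffice for inhalation dose assessment.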

  16. Reuse of conditionally released steel; proposals and evaluation of processes for manufacturing of steel elements and processes for construction of selected scenarios - 59130

    International Nuclear Information System (INIS)

    Bezak, Peter; Ondra, Frantisek; Hajkova, Eva; Necas, Vladimir

    2012-01-01

    The project includes a systematic scenario analysis of conditionally released materials from the decommissioning of nuclear installations and the creation of new knowledge in this field, which will be used in implementing projects for the reuse of these materials. The new knowledge includes data about materials from decommissioning (types of materials and radiological data based on the analysis of various scenarios). The scenarios contain information about the conditionally released materials, data on the external exposure of the personnel who will assemble the structures, and data on those who will use the constructions up to the target scenario. The scenarios assume a guarantee that the final products will remain in place for a very long period, from 50 to 100 years. The paper presents a review of activities for the manufacturing of various steel construction elements made of conditionally released steel and of activities for the realisation of selected scenarios for the reuse of construction elements. Ingots from the melting of decommissioned radioactive steel serve as the starting material for manufacturing the steel components. Ingots from the controlled area will be melted in an induction furnace and the liquid steel will be alloyed to achieve the required chemical parameters. Typical steel products suitable for the established scenarios are steel rebar for concrete, steel profiles of various forms, railway rails and rolled steel sheets. The target scenarios include an analysis of staff exposure during installation of the steel constructions as well as the exposure of individuals from critical groups of the population during their exploitation. The various scenarios, provided within the scope of the CONRELMAT project, are focused on the systematic analysis of the use of conditionally released steel from the decommissioning of nuclear facilities. The scenarios focus on research and development of model situations in constructions in the areas of transport, civil constructions, industry and

  17. Validation Process for LEWICE Coupled by Use of a Navier-stokes Solver

    Science.gov (United States)

    Wright, William B.

    2016-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth for many meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. An extensive comparison of the results in a quantifiable manner against the database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will show the differences in ice shape between LEWICE 3.5 and experimental data. In addition, comparisons will be made between the lift and drag calculated on the ice shapes from experiment and those produced by LEWICE. This report will also provide a description of both programs. Quantitative geometric comparisons are shown for horn height, horn angle, icing limit, area and leading edge thickness. Quantitative comparisons of calculated lift and drag will also be shown. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  18. Exploration, Development, and Validation of Patient-reported Outcomes in Antineutrophil Cytoplasmic Antibody–associated Vasculitis Using the OMERACT Process

    Science.gov (United States)

    Robson, Joanna C.; Milman, Nataliya; Tomasson, Gunnar; Dawson, Jill; Cronholm, Peter F.; Kellom, Katherine; Shea, Judy; Ashdown, Susan; Boers, Maarten; Boonen, Annelies; Casey, George C.; Farrar, John T.; Gebhart, Don; Krischer, Jeffrey; Lanier, Georgia; McAlear, Carol A.; Peck, Jacqueline; Sreih, Antoine G.; Tugwell, Peter; Luqmani, Raashid A.; Merkel, Peter A.

    2016-01-01

    Objective Antineutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV) is a group of linked multisystem life- and organ-threatening diseases. The Outcome Measures in Rheumatology (OMERACT) vasculitis working group has been at the forefront of outcome development in the field and has achieved OMERACT endorsement of a core set of outcomes for AAV. Patients with AAV report as important some manifestations of disease not routinely collected through physician-completed outcome tools; and they rate common manifestations differently from investigators. The core set includes the domain of patient-reported outcomes (PRO). However, PRO currently used in clinical trials of AAV do not fully characterize patients’ perspectives on their burden of disease. The OMERACT vasculitis working group is addressing the unmet needs for PRO in AAV. Methods Current activities of the working group include (1) evaluating the feasibility and construct validity of instruments within the PROMIS (Patient-Reported Outcome Measurement Information System) to record components of the disease experience among patients with AAV; (2) creating a disease-specific PRO measure for AAV; and (3) applying The International Classification of Functioning, Disability and Health to examine the scope of outcome measures used in AAV. Results The working group has developed a comprehensive research strategy, organized an investigative team, included patient research partners, obtained peer-reviewed funding, and is using a considerable research infrastructure to complete these interrelated projects to develop evidence-based validated outcome instruments that meet the OMERACT filter of truth, discrimination, and feasibility. Conclusion The OMERACT vasculitis working group is on schedule to achieve its goals of developing validated PRO for use in clinical trials of AAV. (First Release September 1 2015; J Rheumatol 2015;42:2204–9; doi:10.3899/jrheum.141143) PMID:26329344

  19. Validation process of simulation model; Proceso de validacion de modelos de simulacion

    Energy Technology Data Exchange (ETDEWEB)

    San Isidro Pindado, M J

    1998-12-31

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation is residual in nature, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model and can also serve as a guide for the design of subsequent experiments. Three steps can be distinguished: - Sensitivity analysis, performed either as differential sensitivity analysis (DSA) or as Monte-Carlo sensitivity analysis (MCSA). - Search for the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find these domains. - Residual analysis, carried out in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model for buildings (ESP) is presented, studying the behaviour of building components in a test cell of the LECE at CIEMAT. (Author)
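    The Monte-Carlo sensitivity step described above can be sketched as follows. The "model" here is a stand-in linear surrogate and the parameter domains are assumptions for illustration, not the ESP model or LECE test-cell data: inputs are sampled over their domains, the model is run for each sample, and parameters are ranked by the correlation of their samples with the output.

```python
import random

def model(u_wall, solar_gain):
    """Stand-in for a detailed thermal model (illustrative surrogate, not ESP)."""
    return 20.0 - 3.0 * u_wall + 1.5 * solar_gain

def pearson(x, y):
    """Pearson correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def mc_sensitivity(n=2000, seed=1):
    """Monte-Carlo sensitivity analysis: sample inputs over assumed domains,
    run the model, and correlate each input with the output."""
    rng = random.Random(seed)
    u = [rng.uniform(0.2, 1.0) for _ in range(n)]  # wall U-value (assumed range)
    s = [rng.uniform(0.0, 0.5) for _ in range(n)]  # solar gain proxy (assumed)
    y = [model(ui, si) for ui, si in zip(u, s)]
    return pearson(u, y), pearson(s, y)

r_u, r_s = mc_sensitivity()  # the U-value dominates the output variance here
```

    A large |correlation| flags a parameter whose uncertainty dominates the output, and therefore a parameter whose domain must be pinned down before residual analysis is meaningful.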

  1. Validation of contractor HMA testing data in the materials acceptance process.

    Science.gov (United States)

    2010-08-01

    "This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee comprised of SCDOT, FHWA, and Industry representatives provided oversight of the process. The research process included a literature review, a brief surve...

  2. Design of sustained release fine particles using two-step mechanical powder processing: particle shape modification of drug crystals and dry particle coating with polymer nanoparticle agglomerate.

    Science.gov (United States)

    Kondo, Keita; Ito, Natsuki; Niwa, Toshiyuki; Danjo, Kazumi

    2013-09-10

    We attempted to prepare sustained release fine particles using a two-step mechanical powder processing method: particle-shape modification and dry particle coating. First, the particle shape of the bulk drug was modified by mechanical treatment to yield drug crystals suitable for the coating process. Drug crystals became more rounded with increasing rotation speed, which demonstrates that powerful mechanical stress yields spherical drug crystals with narrow size distribution. This process is the result of destruction, granulation and refinement of drug crystals. Second, the modified drug particles and polymer coating powder were mechanically treated to prepare composite particles. Polymer nanoparticle agglomerate obtained by drying poly(meth)acrylate aqueous dispersion was used as the coating powder. The porous nanoparticle agglomerate has superior coating performance, because it is completely deagglomerated under mechanical stress to form fine fragments that act as guest particles. As a result, spherical drug crystals treated with the porous agglomerate were effectively coated by poly(meth)acrylate powder, showing sustained release after curing. From these findings, particle-shape modification of drug crystals and dry particle coating with nanoparticle agglomerate using a mechanical powder processor is expected to be an innovative technique for preparing controlled-release coated particles having high drug content and a size smaller than 100 μm. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. In situ detection of the Zn(2+) release process of ZnO NPs in tumour cells by confocal laser scanning fluorescence microscopy.

    Science.gov (United States)

    Song, Wenshuang; Tang, Xiaoling; Li, Yong; Sun, Yang; Kong, Jilie; Qingguang, Ren

    2016-08-01

    The use of zinc oxide (ZnO) nanoparticles (NPs) against cancer is not yet ready for human clinical application, primarily because the action mechanisms and cellular consequences of the direct exposure of cells to these NPs are not well understood. In this work, the authors selected zinquin ethyl ester, a Zn(2+)-specific fluorescent molecular probe, to efficiently differentiate ZnO NPs from Zn(2+), and combined it with confocal laser scanning microscopy (CLSM) to study in situ the Zn(2+) release process of ZnO NPs in a cancer-cell system by detecting the change in Zn(2+) level over time. During the experiments, the authors additionally designed the test group ZnO-2 to assess the influence of long-term storage on the characteristics of ZnO NPs in aqueous solution and on their Zn(2+) release process in the cancer-cell system. After three months of storage at room temperature, the release process became earlier and faster, consistent with the transmission electron microscopy, UV-Vis and PL spectra results. The combination of a Zn(2+)-specific fluorescent molecular probe with CLSM is a good detection method, which should be helpful for clinical research on ZnO NPs.

  4. Development and Validation of a Q-Sort Measure of Identity Processing Style: The Identity Processing Style Q-Sort

    Science.gov (United States)

    Pittman, Joe F.; Kerpelman, Jennifer L.; Lamke, Leanne K.; Sollie, Donna L.

    2009-01-01

    Identity styles represent strategies individuals use to explore identity-related issues. Berzonsky (Berzonsky, M. D. (1992). Identity style and coping strategies. "Journal of Personality, 60", 771-788) identified three styles: informational, normative, and diffuse. In three studies, this paper presents (a) the identity processing style Q-sort…

  5. Validation experiment of a numerically processed millimeter-wave interferometer in a laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Kogi, Y., E-mail: kogi@fit.ac.jp; Higashi, T.; Matsukawa, S. [Department of Information Electronics, Fukuoka Institute of Technology, Fukuoka 811-0295 (Japan); Mase, A. [Art, Science and Technology Center for Cooperative Research, Kyushu University, Kasuga, Fukuoka 816-0811 (Japan); Kohagura, J.; Yoshikawa, M. [Plasma Research Center, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Nagayama, Y.; Kawahata, K. [National Institute for Fusion Science, Toki, Gifu 509-5202 (Japan); Kuwahara, D. [Tokyo University of Agriculture and Technology, Koganei, Tokyo 184-8588 (Japan)

    2014-11-15

    We propose a new interferometer system for density profile measurements. This system produces multiple measurement chords by a leaky-wave antenna driven by multiple frequency inputs. The proposed system was validated in laboratory evaluation experiments. We confirmed that the interferometer generates a clear image of a Teflon plate as well as the phase shift corresponding to the plate thickness. In another experiment, we confirmed that quasi-optical mirrors can produce multiple measurement chords; however, the finite spot size of the probe beam degrades the sharpness of the resulting image.

  6. Validation of CFD predictions using process data obtained from flow through an industrial control valve

    International Nuclear Information System (INIS)

    Green, J; Mishra, R; Charlton, M; Owen, R

    2012-01-01

    This study uses experimental flow test data to validate CFD simulations of a complex control valve trim. In both the simulation and the experimental flow test, the capacity of the trim (Cv) is calculated in order to test the ability of CFD software to serve as a design tool for these trims. While the CFD tests produced capacity results that were consistent across a series of five different simulations, the predicted capacity differed from the experimental flow data by nearly 25%. This indicates that CFD simulations need to be properly calibrated before being used to design complex valve trims.
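    For reference, the valve capacity compared above follows the standard incompressible-flow sizing relation Cv = Q * sqrt(SG / ΔP) in US units (Q in gpm, ΔP in psi). A hedged sketch with illustrative numbers, not the paper's trim data:

```python
import math

def flow_coefficient_cv(q_gpm, delta_p_psi, specific_gravity=1.0):
    """Valve flow coefficient Cv from the standard incompressible sizing
    relation (US units: flow in gpm, pressure drop in psi)."""
    return q_gpm * math.sqrt(specific_gravity / delta_p_psi)

def percent_difference(predicted_cv, measured_cv):
    """Relative deviation of a CFD-predicted Cv from the flow-test Cv."""
    return 100.0 * abs(predicted_cv - measured_cv) / measured_cv

cv = flow_coefficient_cv(100.0, 4.0)  # 100 gpm of water across 4 psi -> Cv = 50
```

    A deviation like percent_difference(37.5, 50.0) = 25% illustrates the magnitude of discrepancy the study reports between simulation and flow test.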

  7. The Effect of Varying Speed Release of Nutrients from Fertilizers on Growth-production Process of Turf

    Directory of Open Access Journals (Sweden)

    Peter Hric

    2016-01-01

    The aim of this experiment was to compare the influence of fertilizers with different speeds of nutrient release on growth-production indicators of turf under non-irrigated conditions. The experiment was carried out under the warm, dry conditions of the Nitra area (Slovak Republic). Five treatments were followed: (1) without fertilization; (2) nitre with dolomite, superphosphate, potassium salt; (3) turf fertilizer Travcerit®; (4) slow-release fertilizer SRF NPK 14–5–14 (+4CaO +4MgO +7S); (5) controlled-release fertilizer Duslocote® NPK(S) 13–9–18 (+6S). The highest height gain was reached by the treatment fertilized with SRF NPK 14–5–14 (+4CaO +4MgO +7S). Comparison of the individual treatments over the whole period showed significantly lower average daily height gains on the control treatment compared with the treatments fertilized with nitre with dolomite, superphosphate, potassium salt; SRF NPK 14–5–14 (+4CaO +4MgO +7S); and Duslocote® NPK(S) 13–9–18 (+6S). During the reported period the highest weight gain was reached by the treatment with Duslocote® NPK(S) 13–9–18 (+6S). Over the whole period, significantly lower average daily phytomass production was found on the control treatment in comparison with the turfs fertilized with Travcerit® and Duslocote® NPK(S) 13–9–18 (+6S).

  8. Kinetics and mechanisms of metal retention/release in geochemical processes in soil. 1997 annual progress report

    International Nuclear Information System (INIS)

    Taylor, R.W.

    1997-01-01

    Remediation of soils polluted with heavy metals is a major challenge facing the nation. This is especially so at many DOE facilities and other Superfund sites. In many cases, speciation of the metals is inaccurate and difficult, and the mechanisms by which the metals are retained/released in soils over long times are poorly understood. Consequently, the long-term fate of metals in soils cannot be precisely predicted and, often, the remediation recommendations and techniques that are employed to clean up soils may be ineffective or unnecessary. Accordingly, the authors are proposing work to generate basic knowledge on the kinetics and mechanism(s) of heavy metal retention/release by soil mineral colloids as affected by inorganic anions. The nature of the interaction of Cd(II), Co(II), Cr(VI), Cu(II), Ni(II) and Pb(II) with pure soil minerals and extracted soil clays will be investigated. The colloids will be characterized in terms of surface area, surface charge and surface site density. They will be used to study the effect(s) of pH, phosphate rate, and temperature on metal retention/release. The experiments will involve using various kinetic and isothermal sorption equations as models to describe the data thus acquired. The spectroscopic methods will involve using extended x-ray absorption fine structure spectroscopy (EXAFS) and Fourier transform infrared spectroscopy (FTIR). The data generated from the proposed study will assist in designing better remediation strategies to effectively clean up toxic heavy-metal-contaminated soils at DOE facilities and other Superfund sites.
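    As one concrete example of the kinetic sorption equations such proposals fit to retention/release data, the widely used pseudo-second-order model can be fitted via its linearised form t/q = 1/(k*qe^2) + t/qe. A self-contained sketch with synthetic, noiseless data (not experimental measurements):

```python
def pseudo_second_order(t, qe, k):
    """Pseudo-second-order sorption kinetics: q(t) = k*qe^2*t / (1 + k*qe*t)."""
    return (k * qe ** 2 * t) / (1.0 + k * qe * t)

def fit_pso(ts, qs):
    """Recover (qe, k) by least squares on the linearised form
    t/q = 1/(k*qe^2) + t/qe: slope = 1/qe, intercept = 1/(k*qe^2)."""
    ys = [t / q for t, q in zip(ts, qs)]
    n = len(ts)
    mx, my = sum(ts) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(ts, ys))
             / sum((x - mx) ** 2 for x in ts))
    intercept = my - slope * mx
    qe = 1.0 / slope
    k = slope ** 2 / intercept
    return qe, k

ts = [5.0, 10.0, 20.0, 40.0, 80.0, 160.0]          # contact times (min)
qs = [pseudo_second_order(t, qe=2.5, k=0.05) for t in ts]
qe_hat, k_hat = fit_pso(ts, qs)  # recovers qe = 2.5 and k = 0.05 exactly
```

    With real data, competing models (pseudo-first-order, Elovich, intraparticle diffusion) would be fitted the same way and compared by residuals, which is the discrimination step such kinetic studies rely on.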

  9. Data Validation Package - April and July 2015 Groundwater and Surface Water Sampling at the Gunnison, Colorado, Processing Site

    Energy Technology Data Exchange (ETDEWEB)

    Linard, Joshua [Dept. of Energy (DOE), Washington, DC (United States). Office of Legacy Management; Campbell, Sam [Navarro Research and Engineering, Inc., Oak Ridge, TN (United States)

    2016-02-01

    This event included annual sampling of groundwater and surface water locations at the Gunnison, Colorado, Processing Site. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites. Samples were collected from 28 monitoring wells, three domestic wells, and six surface locations in April at the processing site, as specified in the 2010 Ground Water Compliance Action Plan for the Gunnison, Colorado, Processing Site. Domestic wells 0476 and 0477 were sampled in July because the homes were unoccupied in April and the wells were not in use. Duplicate samples were collected from locations 0113, 0248, and 0477. One equipment blank was collected during this sampling event. Water levels were measured at all monitoring wells that were sampled. No issues requiring additional action or follow-up were identified during the data validation process.

  10. AXAIR: A Computer Code for SAR Assessment of Plume-Exposure Doses from Potential Process-Accident Releases to Atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Pillinger, W.L.

    2001-05-17

    This report describes the AXAIR computer code which is available to terminal users for evaluating the doses to man from exposure to the atmospheric plume from postulated stack or building-vent releases at the Savannah River Plant. The emphasis herein is on documentation of the methodology only. The total-body doses evaluated are those that would be exceeded only 0.5 percent of the time based on worst-sector, worst-case meteorological probability analysis. The associated doses to other body organs are given in the dose breakdowns by radionuclide, body organ and pathway.
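    The worst-case probability criterion above amounts to taking the dose that is not exceeded 99.5% of the time over the sampled meteorological cases, i.e. the 99.5th percentile of the per-weather-case dose distribution. A minimal sketch (the dose values below are illustrative, not AXAIR output):

```python
def dose_at_exceedance(doses, exceed_frac=0.005):
    """Dose exceeded only `exceed_frac` of the time: the (1 - exceed_frac)
    empirical quantile of the per-weather-case dose distribution."""
    ranked = sorted(doses)
    idx = min(len(ranked) - 1, int((1.0 - exceed_frac) * len(ranked)))
    return ranked[idx]

# Illustrative ensemble: 200 weather cases with doses 0.1 .. 20.0 (arbitrary units)
doses = [0.1 * i for i in range(1, 201)]
d995 = dose_at_exceedance(doses)  # -> 20.0, the 99.5th-percentile dose
```

    Repeating this per downwind sector and reporting the worst sector yields the "worst-sector, worst-case" value the abstract refers to.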

  11. Quantitative estimation of hydrogen concentration on the Ni3Al specimens surface in the process of hydrogen release

    International Nuclear Information System (INIS)

    Katano, Gen; Sano, Shogo; Saito, Hideo; Mori, Minoru

    2000-01-01

    A method is given to calculate the hydrogen concentration in metal specimens from tritium counts obtained with a liquid scintillation counter. Ni3Al intermetallic compound crystals were used as specimens. Tritium was charged into the crystals by cathodic charging. The charged tritium was transported by diffusion and released from the specimen surface. The tritium release rate was calculated from the rate of increase of tritium activity, and the hydrogen concentration at the surface was then calculated from the tritium counts. The results showed that the hydrogen concentration at the specimen surface decreases with elapsed time. Tritium diffusion was affected by boron doping (up to 0.235 atom% B and 0.470 atom% B) in the Ni3Al crystals: as the amount of boron increased, the tritium diffusion coefficient decreased, and the hydrogen concentration varied with the amount of boron. After sufficient time had passed, the hydrogen concentration in crystals with boron was much larger than in those without boron. Since the hydrogen concentration is very likely governed by the number of hydrogen sites in the crystal, these phenomena indicate that boron doping creates additional hydrogen trapping sites. As the hydrogen distribution becomes homogeneous after sufficient time, it is possible to measure the hydrogen concentration throughout the crystal from β-ray counts at the specimen surface. (author)
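    The conversion from a measured tritium activity to an atom number that underlies such estimates is N = A/λ with λ = ln 2 / T½. A small sketch (the half-life is the standard ~12.32 y value; the helper names and the sample volume are ours, for illustration):

```python
import math

T_HALF_S = 12.32 * 365.25 * 24 * 3600   # tritium half-life, ~12.32 y, in seconds

def tritium_atoms(activity_bq):
    """Number of tritium atoms implied by a measured activity: N = A / lambda."""
    lam = math.log(2.0) / T_HALF_S       # decay constant, s^-1
    return activity_bq / lam

def concentration_atoms_per_m3(activity_bq, sample_volume_m3):
    """Tritium (hydrogen isotope) number concentration in the sampled region."""
    return tritium_atoms(activity_bq) / sample_volume_m3

n = tritium_atoms(1.0)   # one becquerel corresponds to roughly 5.6e8 T atoms
```

    Because one decay per second corresponds to several hundred million tritium atoms, scintillation counting can resolve very small hydrogen concentrations at the specimen surface.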

  12. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    Science.gov (United States)

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  13. Validation of activity determination codes and nuclide vectors by using results from processing of retired components and operational waste

    International Nuclear Information System (INIS)

    Lundgren, Klas; Larsson, Arne

    2012-01-01

    Decommissioning studies for nuclear power reactors are performed in order to assess the decommissioning costs and the waste volumes as well as to provide data for the licensing and construction of the LILW repositories. An important part of this work is to estimate the amount of radioactivity in the different types of decommissioning waste. Studsvik ALARA Engineering has performed such assessments for LWRs and other nuclear facilities in Sweden. These assessments depend to a large extent on calculations, senior experience and sampling at the facilities. The precision of the calculations has been found to be relatively high close to the reactor core; for natural reasons, the precision declines with distance. Even where the activity values are lower, the content of hard-to-measure nuclides can cause problems in the long-term safety demonstration of LLW repositories. At the same time, Studsvik is processing significant volumes of metallic and combustible waste from power stations in operation and in the decommissioning phase, as well as from other nuclear facilities such as research and waste treatment facilities. By combining its unique knowledge in assessment of radioactivity inventory with the large data bank that the waste processing represents, the activity determination codes can be validated and the waste processing analysis supported with additional data. The intention of this presentation is to highlight how the European nuclear industry could jointly use the waste processing data for validation of activity determination codes. (authors)

  14. The process of processing: exploring the validity of Neisser's perceptual cycle model with accounts from critical decision-making in the cockpit.

    Science.gov (United States)

    Plant, Katherine L; Stanton, Neville A

    2015-01-01

    The perceptual cycle model (PCM) has been widely applied in ergonomics research in domains including road, rail and aviation. The PCM assumes that information processing occurs in a cyclical manner drawing on top-down and bottom-up influences to produce perceptual exploration and actions. However, the validity of the model has not been addressed. This paper explores the construct validity of the PCM in the context of aeronautical decision-making. The critical decision method was used to interview 20 helicopter pilots about critical decision-making. The data were qualitatively analysed using an established coding scheme, and composite PCMs for incident phases were constructed. It was found that the PCM provided a mutually exclusive and exhaustive classification of the information-processing cycles for dealing with critical incidents. However, a counter-cycle was also discovered which has been attributed to skill-based behaviour, characteristic of experts. The practical applications and future research questions are discussed. Practitioner Summary: This paper explores whether information processing, when dealing with critical incidents, occurs in the manner anticipated by the perceptual cycle model. In addition to the traditional processing cycle, a reciprocal counter-cycle was found. This research can be utilised by those who use the model as an accident analysis framework.

  15. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    Science.gov (United States)

    2012-04-01

    NATO RTO Task Group MSG-054 report: Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondée sur le risque, des processus de vérification, de validation, et d'accréditation/d'acceptation).

  16. Validation of Case Finding Algorithms for Hepatocellular Cancer From Administrative Data and Electronic Health Records Using Natural Language Processing.

    Science.gov (United States)

    Sada, Yvonne; Hou, Jason; Richardson, Peter; El-Serag, Hashem; Davila, Jessica

    2016-02-01

    Accurate identification of hepatocellular cancer (HCC) cases from automated data is needed for efficient and valid quality improvement initiatives and research. We validated HCC International Classification of Diseases, 9th Revision (ICD-9) codes, and evaluated whether natural language processing by the Automated Retrieval Console (ARC) for document classification improves HCC identification. We identified a cohort of patients with ICD-9 codes for HCC during 2005-2010 from Veterans Affairs administrative data. Pathology and radiology reports were reviewed to confirm HCC. The positive predictive value (PPV), sensitivity, and specificity of ICD-9 codes were calculated. A split validation study of pathology and radiology reports was performed to develop and validate ARC algorithms. Reports were manually classified as diagnostic of HCC or not. ARC generated document classification algorithms using the Clinical Text Analysis and Knowledge Extraction System. ARC performance was compared with manual classification. PPV, sensitivity, and specificity of ARC were calculated. A total of 1138 patients with HCC were identified by ICD-9 codes. On the basis of manual review, 773 had HCC. The HCC ICD-9 code algorithm had a PPV of 0.67, sensitivity of 0.95, and specificity of 0.93. For a random subset of 619 patients, we identified 471 pathology reports for 323 patients and 943 radiology reports for 557 patients. The pathology ARC algorithm had PPV of 0.96, sensitivity of 0.96, and specificity of 0.97. The radiology ARC algorithm had PPV of 0.75, sensitivity of 0.94, and specificity of 0.68. A combined approach of ICD-9 codes and natural language processing of pathology and radiology reports improves HCC case identification in automated data.
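The PPV, sensitivity, and specificity figures quoted above all follow from a 2×2 confusion matrix comparing the algorithm's classification against manual review. A minimal sketch (the counts below are hypothetical, not the study's data):

```python
def classification_metrics(tp: int, fp: int, fn: int, tn: int):
    """PPV, sensitivity, and specificity from 2x2 confusion-matrix counts."""
    ppv = tp / (tp + fp)          # fraction of flagged cases that are true cases
    sensitivity = tp / (tp + fn)  # fraction of true cases that are flagged
    specificity = tn / (tn + fp)  # fraction of non-cases correctly not flagged
    return ppv, sensitivity, specificity

# Hypothetical counts, for illustration only:
print(classification_metrics(tp=90, fp=10, fn=5, tn=95))
```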

  17. Calibration and validation processes for relative humidity measurement by a Hygrochron iButton.

    Science.gov (United States)

    Shin, Mirim; Patton, Raymond; Mahar, Trevor; Ireland, Angus; Swan, Paul; Chow, Chin Moi

    2017-10-01

    Accurate relative humidity (RH) measurement is demanded in studies of thermal comfort. Thermal discomfort occurs when the near-to-skin temperature or RH is outside of the thermal comfort zone. The Hygrochron, a small wireless device that measures both temperature and RH, would be suitable and convenient in exercise or sleep studies. However, its RH measurement has not been validated. This paper has three parts. Part 1: In evaluating the sensor surface for RH detection, four Hygrochrons were placed on a wet paper towel, two with the protruding surface facing up and two facing down. The results showed that the protruding side of the Hygrochron is the sensor surface for detecting RH. Part 2: Twenty-seven Hygrochrons were calibrated in a humidity calibration chamber over an RH range from 40 to 90% at constant temperatures from 32 to 37°C. The mean bias between the Hygrochrons and the calibration chamber was -1.08%. The Hygrochron overestimated RH at the lower range (40-60%) and underestimated RH at the higher range (80-90%). The application of an individual regression equation to each Hygrochron improved accuracy and reduced the mean bias to -0.002%. However, one Hygrochron showed outlier values that may be due to a manufacturing defect. Part 3: The reproducibility of the Hygrochron for RH measurements was tested twice under the same condition of 35°C over a three-month interval. The intra-class coefficient was 0.996 to 1.000, with non-significant differences in mean RH between test and re-test results (p=0.159). Hygrochrons are valid for RH measurements and show high reproducibility. It is recommended that Hygrochrons be calibrated over the range of desired RH and temperature prior to use, to improve accuracy and detect any manufacturing defects. Copyright © 2017 Elsevier Inc. All rights reserved.
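The per-device correction described above amounts to fitting a least-squares line from each Hygrochron's raw reading to the chamber reference, then applying that line to subsequent readings. A sketch under that assumption (the paired readings below are hypothetical):

```python
def fit_calibration(raw, reference):
    """Least-squares line mapping a sensor's raw RH readings onto the
    calibration-chamber reference values."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in raw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, reference))
    slope = sxy / sxx
    return slope, my - slope * mx

def correct(raw_rh, slope, intercept):
    """Apply a sensor's individual calibration line to a new reading."""
    return slope * raw_rh + intercept

# Hypothetical paired readings (raw sensor vs. chamber reference), % RH:
slope, intercept = fit_calibration([41.0, 61.0, 79.0, 88.0], [40.0, 60.0, 80.0, 90.0])
```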

  18. Development and Validation of a Constitutive Model for Dental Composites during the Curing Process

    Science.gov (United States)

    Wickham Kolstad, Lauren

    Debonding is a critical failure mode of the dental composites used for dental restorations. Whether a composite debonds can be determined by comparing its shrinkage stress to the debonding strength of the adhesive that bonds it to the tooth surface. It is difficult to measure shrinkage stress experimentally. In this study, finite element analysis is used to predict the stress in the composite during cure. A new constitutive law is presented that will allow composite developers to evaluate composite shrinkage stress at early stages in material development. Shrinkage stress and shrinkage strain experimental data were gathered for three dental resins: Z250, Z350, and P90. The experimental data were used to develop a constitutive model for the Young's modulus of the dental composite as a function of time during cure. A Maxwell model, a spring and dashpot in series, was used to simulate the composite. The compliance of the shrinkage stress device was also taken into account by including a spring in series with the Maxwell model. A coefficient of thermal expansion was also determined for internal loading of the composite by dividing shrinkage strain by time. Three FEA models are presented. A spring-disk model validates that the constitutive law is self-consistent. A quarter cuspal deflection model uses separate experimental data to verify that the constitutive law is valid. Finally, an axisymmetric tooth model is used to predict interfacial stresses in the composite. These stresses are compared to the debonding strength to check whether the composite debonds. The new constitutive model accurately predicted the cuspal deflection data. Predictions for interfacial bond stress in the tooth model compare favorably with debonding characteristics observed in practice for dental resins.
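A Maxwell element (spring of modulus E in series with a dashpot of viscosity η) obeys dσ/dt = E·dε/dt − (E/η)·σ. A minimal explicit time-stepping sketch under a constant applied strain rate; the constants are illustrative only, not fitted values for Z250, Z350, or P90:

```python
def maxwell_stress(strain_rate, E, eta, dt, n_steps):
    """Explicit-Euler stress history of a Maxwell element (spring E in series
    with dashpot eta) under a constant applied strain rate:
        dsigma/dt = E * strain_rate - (E / eta) * sigma
    """
    sigma, history = 0.0, []
    for _ in range(n_steps):
        sigma += (E * strain_rate - (E / eta) * sigma) * dt
        history.append(sigma)
    return history

# Illustrative constants (not measured resin properties):
stress = maxwell_stress(strain_rate=1e-4, E=2.0e9, eta=2.0e9, dt=0.01, n_steps=1000)
```

The stress rises toward the steady-state plateau η·(dε/dt) instead of growing without bound, which is the qualitative behavior that distinguishes a viscoelastic Maxwell material from a pure spring.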

  19. Full scale validation of helminth ova (Ascaris suum) inactivation by different sludge treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Paulsrud, B.; Gjerde, B.; Lundar, A.

    2003-07-01

    The Norwegian sewage sludge regulation requires disinfection (hygienisation) of all sludges for land application, and one of the criteria is that disinfected sludge should not contain viable helminth ova. All disinfection processes have to be designed and operated in order to comply with this criterion, and four processes employed in Norway (thermophilic aerobic pre-treatment, pre-pasteurisation, thermal vacuum drying in membrane filter presses and lime treatment) have been tested in full scale by inserting semipermeable bags of Ascaris suum eggs into the processes for certain times. For lime treatment, supplementary laboratory tests were conducted. The paper presents the results of the experiments, and it could be concluded that all processes except lime treatment could be operated at less stringent time-temperature regimes than commonly employed at Norwegian plants today. (author)

  20. Experimental Validation of Hybrid Distillation-Vapor Permeation Process for Energy Efficient Ethanol-Water Separation

    Science.gov (United States)

    The energy demand of distillation-based systems for ethanol recovery and dehydration can be significant, particularly for dilute solutions. An alternative separation process integrating vapor stripping with a vapor compression step and a vapor permeation membrane separation step,...

  1. Validation and Application of a New Reversed Phase HPLC Method for In Vitro Dissolution Studies of Rabeprazole Sodium in Delayed-Release Tablets

    Directory of Open Access Journals (Sweden)

    Md. Saddam Nawaz

    2013-01-01

    Full Text Available The purpose of this study was to develop and validate a new reversed phase high performance liquid chromatographic (RP-HPLC) method to quantify the in vitro dissolution assay of rabeprazole sodium in pharmaceutical tablet dosage form. Method development was performed on a C18, 100×4.6 mm ID, 10 μm particle size column with an injection volume of 20 μL, using a diode array detector (DAD) to monitor detection at 280 nm. The mobile phase consisted of buffer and acetonitrile at a ratio of 60:40 (v/v), and the flow rate was maintained at 1.0 mL/min. The method was validated in terms of suitability, linearity, specificity, accuracy, precision, stability, and sensitivity. Linearity was observed over the concentration range of 0.05–12.0 μg/mL, and the correlation coefficient was excellent (>0.999). The method was specific with respect to rabeprazole sodium, and the peak purity was found to be 99.99%. The method was precise, with relative standard deviations (RSD) of less than 2%. Accuracy was found to be in the range of 99.9 to 101.9%. The method was robust under different variable conditions and reproducible. This fast, reliable, cost-effective method can be used as a quality control tool for the estimation of rabeprazole sodium in routine dissolution test analysis.
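The precision criterion above (RSD below 2%) is the sample standard deviation of replicate measurements expressed relative to their mean. A sketch with hypothetical replicate recoveries:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%RSD): 100 * sample st. dev. / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate assay recoveries (%), for illustration only:
replicates = [99.9, 100.0, 100.1, 99.8, 100.2, 100.0]
print(round(rsd_percent(replicates), 3))
```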

  2. Process simulation and statistical approaches for validating waste form qualification models

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Toland, M.R.; Pulsipher, B.A.

    1989-05-01

    This report describes recent progress toward one of the principal objectives of the Nuclear Waste Treatment Program (NWTP) at the Pacific Northwest Laboratory (PNL): to establish relationships between vitrification process control and glass product quality. During testing of a vitrification system, it is important to show that departures affecting product quality can be sufficiently detected through process measurements to prevent an unacceptable canister from being produced. Meeting this goal is a practical definition of a successful sampling, data analysis, and process control strategy. A simulation model has been developed and preliminarily tested by applying it to approximate operation of the West Valley Demonstration Project (WVDP) vitrification system at West Valley, New York. Multivariate statistical techniques have been identified and described that can be applied to analyze large sets of process measurements. Information on components, tanks, and time is then combined to create a single statistic through which all of the information can be used at once to determine whether the process has shifted away from a normal condition.
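The abstract does not name the single combined statistic, but Hotelling's T² is a standard multivariate technique for folding many correlated process measurements into one number for shift detection, so a two-variable sketch may illustrate the idea (all numbers are hypothetical):

```python
def hotelling_t2(x, mean, cov):
    """T^2 = (x - mean)^T * inv(cov) * (x - mean), written out for 2 variables."""
    d0, d1 = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c                  # 2x2 determinant
    i00, i01 = d / det, -b / det         # closed-form 2x2 inverse
    i10, i11 = -c / det, a / det
    return d0 * (i00 * d0 + i01 * d1) + d1 * (i10 * d0 + i11 * d1)

# In practice the in-control mean and covariance are estimated from
# historical process data; here they are illustrative numbers only.
print(hotelling_t2((1.0, 1.0), (0.0, 0.0), ((1.0, 0.0), (0.0, 1.0))))  # 2.0
```

A large T² relative to its control limit flags a departure from normal operation even when no single measurement is individually out of bounds.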

  3. [Validity assessment of a low level phonological processing test material in preschool children].

    Science.gov (United States)

    Ptok, M; Altwein, F

    2012-07-01

    The BISC (Bielefelder Screening) is a German test that evaluates phonological skills believed to be a prerequisite for future reading and writing skills; BISC results may indicate an elevated risk for dyslexia. Our research group has put forward test material in order to specifically examine low-level phonological processing (LLPP). In this study we analysed whether the BISC and low-level phonological processing correlate. A retrospective correlation analysis was carried out on primary school children's test results on the BISC and the newly developed low-level phonological processing test material. Spearman's rho was 0.52 between total LLPP and total BISC; the subscales correlated with a rho below 0.5. The results indicate that low-level and higher-level phonological processing can be differentiated. Future studies will have to clarify whether these results can be used to construct specifically targeted therapy strategies and whether the LLPP test material can also be used to assess the risk of subsequent dyslexia. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Rejoinder to commentary on Palmatier and Rovner (2015): credibility assessment: Preliminary Process Theory, the polygraph process, and construct validity.

    Science.gov (United States)

    Palmatier, John J; Rovner, Louis

    2015-01-01

    We briefly review comments submitted in response to the target article in this series (Palmatier & Rovner, 2015), which argued that a scientifically defensible construct for the instrumental assessment of credibility (i.e., polygraph) may be found in Barry's Preliminary Process Theory (PPT). Our review of the relevant scientific literature discovered a growing body of converging evidence, particularly from the neurosciences, that focuses not only on deception but more broadly on memory, emotion, and the orienting response (OR), leading to this conclusion. After reviewing the submitted comments, we are further convinced, especially as applied scientists, that at this time the most viable direction forward is in the context of the PPT. Concurrently, we candidly acknowledge that research must be conducted to address not only commentator concerns but also, if warranted, modification of existing theory. Although disagreement continues to exist regarding the order in which questions are asked, the most significant finding is perhaps that not a single commentator argues against this growing and vital applied science (i.e., the instrumental assessment of credibility - polygraph). Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Validating and Determining the Weight of Items Used for Evaluating Clinical Governance Implementation Based on Analytic Hierarchy Process Model

    Directory of Open Access Journals (Sweden)

    Elaheh Hooshmand

    2015-10-01

    Background The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing professional knowledge and the accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. Methods The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytical Hierarchy Process (AHP) model. Results The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients’ non-medical needs, complaints and patients’ participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients’ non-medical needs, patients’ participation in the treatment process and research and development. Conclusion The fundamental requirements of CG implementation included having an effective policy at national level, avoiding perfectionism, using the expertise and potentials of the entire country and the coordination of this model with other models of quality improvement such as accreditation and patient safety.

  6. Validating and determining the weight of items used for evaluating clinical governance implementation based on analytic hierarchy process model.

    Science.gov (United States)

    Hooshmand, Elaheh; Tourani, Sogand; Ravaghi, Hamid; Vafaee Najar, Ali; Meraji, Marziye; Ebrahimipour, Hossein

    2015-04-08

    The purpose of implementing a system such as Clinical Governance (CG) is to integrate, establish and globalize distinct policies in order to improve quality through increasing professional knowledge and the accountability of healthcare professionals toward providing clinical excellence. Since CG is related to change, and change requires money and time, CG implementation has to be focused on priority areas that are in more dire need of change. The purpose of the present study was to validate and determine the significance of items used for evaluating CG implementation. The present study was descriptive-quantitative in method and design. Items used for evaluating CG implementation were first validated by the Delphi method and then compared with one another and ranked based on the Analytical Hierarchy Process (AHP) model. The items that were validated for evaluating CG implementation in Iran include performance evaluation, training and development, personnel motivation, clinical audit, clinical effectiveness, risk management, resource allocation, policies and strategies, external audit, information system management, research and development, CG structure, implementation prerequisites, the management of patients' non-medical needs, complaints and patients' participation in the treatment process. The most important items based on their degree of significance were training and development, performance evaluation, and risk management. The least important items included the management of patients' non-medical needs, patients' participation in the treatment process and research and development. The fundamental requirements of CG implementation included having an effective policy at national level, avoiding perfectionism, using the expertise and potentials of the entire country and the coordination of this model with other models of quality improvement such as accreditation and patient safety. © 2015 by Kerman University of Medical Sciences.
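Neither record details the AHP computation, but ranking items from pairwise comparisons typically reduces to extracting priority weights from a reciprocal comparison matrix; the row geometric-mean method is a common approximation to the principal eigenvector. A sketch (the comparison matrix below is hypothetical):

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    via the row geometric-mean approximation to the principal eigenvector."""
    n = len(pairwise)
    gms = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical 3x3 comparison of three evaluation items:
print(ahp_weights([[1.0, 2.0, 4.0],
                   [0.5, 1.0, 2.0],
                   [0.25, 0.5, 1.0]]))
```

For a perfectly consistent matrix such as this one, the geometric-mean weights coincide with the exact eigenvector weights (here 4/7, 2/7, 1/7).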

  7. Validation of a Process-Based Agro-Ecosystem Model (Agro-IBIS) for Maize in Xinjiang, Northwest China

    Directory of Open Access Journals (Sweden)

    Tureniguli Amuti

    2018-03-01

    Agricultural oasis expansion and intensive management practices have occurred in arid and semiarid regions of China during the last few decades. Accordingly, regional carbon and water budgets have been profoundly impacted by agroecosystems in these regions. Therefore, study of the methods used to accurately estimate energy, water, and carbon exchanges is becoming increasingly important. Process-based models can represent the complex land-atmosphere processes of agricultural ecosystems. However, before the models can be applied they must be validated under different environmental and climatic conditions. In this study, a process-based agricultural ecosystem model (Agro-IBIS) was validated for maize crops using 3 years of soil and biometric measurements at the Wulanwusu agrometeorological site (WAS) located in the Shihezi oasis in Xinjiang, northwest China. The model satisfactorily represented leaf area index (LAI) during the growing season, simulating its peak values to within 0–10%. The total biomass carbon was overestimated by 15%, 8%, and 16% in 2004, 2005, and 2006, respectively. The model satisfactorily simulated the soil temperature (0–10 cm) and volumetric water content (VWC) (0–25 cm) of farmland during the growing season. However, it overestimated soil temperature by approximately 4 °C and VWC by 15–30% during the winter, coinciding with the period of no vegetation cover in Xinjiang. Overall, the results indicate that the model can represent crop growth and seems to be applicable to multiple sites in the arid oasis agroecosystems of Xinjiang. Future application of the model will impose more comprehensive validation using eddy covariance flux data, and consider including dynamics of crop residue and improving characterization of the final stage of leaf development.

  8. Development and Validation of an Acid Mine Drainage Treatment Process for Source Water

    Energy Technology Data Exchange (ETDEWEB)

    Lane, Ann [Battelle Memorial Institute, Columbus, OH (United States)

    2016-03-01

    Throughout Northern Appalachia and surrounding regions, hundreds of abandoned mine sites exist which frequently are the source of Acid Mine Drainage (AMD). AMD typically contains metal ions in solution with sulfate ions which have been leached from the mine. These large volumes of water, if treated to a minimum standard, may be of use in Hydraulic Fracturing (HF) or other industrial processes. This project’s focus is to evaluate an AMD water treatment technology for the purpose of providing treated AMD as an alternative source of water for HF operations. The HydroFlex™ technology allows the conversion of a previous environmental liability into an asset while reducing stress on potable water sources. The technology achieves greater than 95% water recovery, while removing sulfate to concentrations below 100 mg/L and common metals (e.g., iron and aluminum) below 1 mg/L. The project is intended to demonstrate the capability of the process to provide AMD as alternative source water for HF operations. The second budget period of the project has been completed during which Battelle conducted two individual test campaigns in the field. The first test campaign demonstrated the ability of the HydroFlex system to remove sulfate to levels below 100 mg/L, meeting the requirements indicated by industry stakeholders for use of the treated AMD as source water. The second test campaign consisted of a series of focused confirmatory tests aimed at gathering additional data to refine the economic projections for the process. Throughout the project, regular communications were held with a group of project stakeholders to ensure alignment of the project objectives with industry requirements. Finally, the process byproduct generated by the HydroFlex process was evaluated for the treatment of produced water against commercial treatment chemicals. It was found that the process byproduct achieved similar results for produced water treatment as the chemicals currently in use. Further

  9. ACE-FTS version 3.0 data set: validation and data processing update

    Directory of Open Access Journals (Sweden)

    Claire Waymark

    2014-01-01

    On 12 August 2003, the Canadian-led Atmospheric Chemistry Experiment (ACE) was launched into a 74° inclination orbit at 650 km with the mission objective to measure atmospheric composition using infrared and UV-visible spectroscopy (Bernath et al., 2005). The ACE mission consists of two main instruments, ACE-FTS and MAESTRO (McElroy et al., 2007), which are being used to investigate the chemistry and dynamics of the Earth’s atmosphere. Here, we focus on the high resolution (0.02 cm-1) infrared Fourier Transform Spectrometer, ACE-FTS, which measures in the 750-4400 cm-1 (2.2 to 13.3 µm) spectral region. This instrument has been making regular solar occultation observations for more than nine years. The current ACE-FTS data version (version 3.0) provides profiles of temperature and volume mixing ratios (VMRs) of more than 30 atmospheric trace gas species, as well as 20 subsidiary isotopologues of the most abundant trace atmospheric constituents, over a latitude range of ~85°N to ~85°S. This letter describes the current data version and recent validation comparisons and provides a description of our planned updates for the ACE-FTS data set. [...]

  10. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    Science.gov (United States)

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
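The PPQ sample sizes quoted above (299, 59, and 29) are reproduced exactly by the standard zero-failure "success run" relation n = ln(1 − C)/ln(R) at 95% confidence, which is consistent with the Bayes success run theorem the abstract cites. A sketch:

```python
import math

def success_run_sample_size(reliability, confidence=0.95):
    """Smallest number n of consecutive passing units such that observing
    n successes demonstrates the given reliability at the given confidence
    (zero-failure 'success run' plan): n = ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for risk, reliability in [("high", 0.99), ("medium", 0.95), ("low", 0.90)]:
    print(risk, success_run_sample_size(reliability))  # 299, 59, 29
```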

  11. Language Analysis in the Context of the Asylum Process: Procedures, Validity, and Consequences

    Science.gov (United States)

    Reath, Anne

    2004-01-01

    In 1993, the language section of the Swedish Migration Board initiated the production of documents they called "language analyses" to aid in the processing of asylum seekers. Today, 11 years later, 2 privately owned companies in Stockholm produce these documents. These companies have produced language analyses not only for the Swedish…

  12. Assessment of Sensory Processing and Executive Functions in Childhood: Development, Reliability, and Validity of the EPYFEI

    Directory of Open Access Journals (Sweden)

    Dulce Romero-Ayuso

    2018-03-01

    The aim of this study was to determine the psychometric properties of the "Assessment of Sensory Processing and Executive Functions in Childhood" (EPYFEI), a questionnaire designed to assess the sensory processing and executive functions of children aged between 3 and 11 years. The EPYFEI was completed by a sample of 1,732 parents of children aged between 3 and 11 years who lived in Spain. An exploratory factor analysis was conducted and showed five main factors: (1) executive attention, working memory, and initiation of actions; (2) general sensory processing; (3) emotional and behavioral self-regulation; (4) supervision, correction of actions, and problem solving; and (5) inhibitory. The reliability of the analysis was high, both for the whole questionnaire and for the factors of which it is composed. The results provide evidence of the potential usefulness of the EPYFEI in clinical contexts for the early detection of neurodevelopmental disorders in which there may be a deficit of executive functions and sensory processing.

  13. Short-Term Memory and Auditory Processing Disorders: Concurrent Validity and Clinical Diagnostic Markers

    Science.gov (United States)

    Maerlender, Arthur

    2010-01-01

    Auditory processing disorders (APDs) are of interest to educators and clinicians, as they impact school functioning. Little work has been completed to demonstrate how children with APDs perform on clinical tests. In a series of studies, standard clinical (psychometric) tests from the Wechsler Intelligence Scale for Children, Fourth Edition…

  14. The Development and Validation of Scores on the Mathematics Information Processing Scale (MIPS).

    Science.gov (United States)

    Bessant, Kenneth C.

    1997-01-01

    This study reports on the development and psychometric properties of a new 87-item Mathematics Information Processing Scale that explores learning strategies, metacognitive problem-solving skills, and attentional deployment. Results with 340 college students support the use of the instrument, for which factor analysis identified five theoretically…

  15. Information-Processing Architectures in Multidimensional Classification: A Validation Test of the Systems Factorial Technology

    Science.gov (United States)

    Fific, Mario; Nosofsky, Robert M.; Townsend, James T.

    2008-01-01

    A growing methodology, known as the systems factorial technology (SFT), is being developed to diagnose the types of information-processing architectures (serial, parallel, or coactive) and stopping rules (exhaustive or self-terminating) that operate in tasks of multidimensional perception. Whereas most previous applications of SFT have been in…

  16. Experimental validation of viscous and viscoelastic simulations of micro injection molding process

    DEFF Research Database (Denmark)

    Gava, Alberto; Tosello, Guido; Lucchetta, Giovanni

    2009-01-01

    The effects of two different rheological models used in the simulation of the micro injection molding (µIM) process are investigated. The Cross-WLF viscous model and the Giesekus viscoelastic model are selected and their performance evaluated using 3D models implemented on two different...

  17. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    Science.gov (United States)

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  18. The Association of Social Work Boards' Licensure Examinations: A Review of Reliability and Validity Processes

    Science.gov (United States)

    Marson, Stephen M.; DeAngelis, Donna; Mittal, Nisha

    2010-01-01

    Objectives: The purpose of this article is to create transparency for the psychometric methods employed for the development of the Association of Social Work Boards' (ASWB) exams. Results: The article includes an assessment of the macro (political) and micro (statistical) environments of testing social work competence. The seven-step process used…

  19. [Twenty-year History and Future Challenges in Transparency Enhancement of Review Process for Approval: Focus on Public Release of Review Reports regarding New Drugs and Medical Devices].

    Science.gov (United States)

    Morimoto, Kazushige; Kawasaki, Satoko; Yoshida, Yasunori

    2015-01-01

    For 20 years, the Ministry of Health, Labour and Welfare (MHLW, formerly the Ministry of Health and Welfare (MHW)) has been working to increase the transparency of the approval review process in order to promote the rational use of newly approved drugs and medical devices. The first Summary Basis of Approval (SBA) was published by the MHW in 1994. In 1999, evaluation reports were prepared by the MHW and the Pharmaceuticals and Medical Devices Evaluation Center to make them available to the public. In 2005, a notice from the Chief Executive of the Pharmaceuticals and Medical Devices Agency (PMDA) established procedures for the public release of information on the review of applications for new drugs. In 2006, 90 review reports on newly approved drugs and eight on medical devices were published on the PMDA website. The dissemination of information by the United States Food and Drug Administration (FDA) and the European Medicines Agency (EMA) was studied and compared with that of the MHLW and PMDA. While common technical documents (CTD) for new drugs and summary technical documents (STED) for new medical devices have been released by the PMDA, such documents are not released by the FDA and EMA. The European Public Assessment Report (EPAR) summary for the public is an interesting questionnaire approach that uses the "What," "How," and "Why" format. Finally, future proposals for the next decade are also outlined.

  20. Microparticles obtained by complex coacervation: influence of the type of reticulation and the drying process on the release of the core material

    Directory of Open Access Journals (Sweden)

    Izabela Dutra Alvim

    2010-12-01

    Full Text Available Microparticles obtained by complex coacervation were crosslinked with glutaraldehyde or with transglutaminase and dried using freeze drying or spray drying. Moist samples presented an encapsulation efficiency (%EE) higher than 96%. The mean diameters ranged from 43.7 ± 3.4 to 96.4 ± 10.3 µm for moist samples, from 38.1 ± 5.36 to 65.2 ± 16.1 µm for dried samples, and from 62.5 ± 7.5 to 106.9 ± 26.1 µm for rehydrated microparticles. The integrity of the particles without crosslinking was maintained when freeze drying was used. After spray drying, only crosslinked samples were able to maintain wall integrity. Microparticles had a round shape, and in the case of dried samples, rugged walls apparently without cracks were observed. Core distribution inside the particles was multinuclear and homogeneous, and core release was evaluated using anhydrous ethanol. Moist particles crosslinked with glutaraldehyde at a concentration of 1.0 mM.g-1 protein (ptn) were more efficient with respect to core retention compared to 0.1 mM.g-1 ptn or those crosslinked with transglutaminase (10 U.g-1 ptn). The drying processes had a strong influence on the core release profile, reducing the amount released for all dry samples.

  1. An integrated system for dissolution studies and magnetic resonance imaging of controlled release, polymer-based dosage forms-a tool for quantitative assessment of hydrogel formation processes.

    Science.gov (United States)

    Kulinowski, Piotr; Dorozyński, Przemysław; Jachowicz, Renata; Weglarz, Władysław P

    2008-11-04

    Controlled release (CR) dosage forms are often based on polymeric matrices, e.g., sustained-release tablets and capsules. It is crucial to visualise and quantify processes of the hydrogel formation during the standard dissolution study. A method for imaging of CR, polymer-based dosage forms during dissolution study in vitro is presented. Imaging was performed in a non-invasive way by means of the magnetic resonance imaging (MRI). This study was designed to simulate in vivo conditions regarding temperature, volume, state and composition of dissolution media. Two formulations of hydrodynamically balanced systems (HBS) were chosen as model CR dosage forms. HBS release active substance in stomach while floating on the surface of the gastric content. Time evolutions of the diffusion region, hydrogel formation region and "dry core" region were obtained during a dissolution study of L-dopa as a model drug in two simulated gastric fluids (i.e. in fed and fasted state). This method seems to be a very promising tool for examining properties of new formulations of CR, polymer-based dosage forms or for comparison of generic and originator dosage forms before carrying out bioequivalence studies.

  2. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    Science.gov (United States)

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One hundred sixty participants were prospectively included for the model development (n = 80) and time-split validation (n = 80) stages. The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (D_V-M) were used to predict the distance from the C7SP to the vertex (D_V-C7). Multivariate linear regression modeling, limits of agreement plot, histogram of residues, receiver operating characteristic (ROC) curves, and confusion tables were analyzed. The multivariate linear prediction model for D_V-C7 (in centimeters) was D_V-C7 = 0.986 D_V-M + 0.018(mass) + 0.014(age) - 1.008. ROC curves had better discrimination of D_V-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than D_V-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted D_V-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P information. Copyright © 2016. Published by Elsevier Inc.
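    The published regression can be evaluated directly. A minimal sketch (assuming mass in kilograms and age in years, as the units are not restated in the abstract; the example participant values are invented):

```python
def predict_dvc7(d_vm_cm: float, mass: float, age: float) -> float:
    """Predicted vertex-to-C7SP distance (cm) from the abstract's model:
    D_V-C7 = 0.986*D_V-M + 0.018*mass + 0.014*age - 1.008."""
    return 0.986 * d_vm_cm + 0.018 * mass + 0.014 * age - 1.008

# Hypothetical participant: D_V-M = 25.0 cm, mass = 70, age = 40
print(round(predict_dvc7(25.0, 70.0, 40.0), 3))  # → 25.462
```

    The vertex-marker distance dominates the prediction (coefficient near 1), with mass and age acting as small corrections.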

  3. Sorption-desorption processes of radioisotopes with solid materials from liquid releases and atmosphere deposits. The distribution coefficient (Ksub(d)), its uses, limitations, and practical applications

    International Nuclear Information System (INIS)

    Saas, Arsene

    1979-03-01

    The various sorption-desorption processes of radionuclides with environmental materials are presented. The parameters governing the distribution coefficient are reviewed in the light of various examples. The factors affecting equilibria between the different phases are: reaction time, concentration of the solid phase, water quality, salinity, competition between ions, concentration of radioisotopes or stable isotopes, pH of the mobile phase, particle diameter, chemical form of the radioisotopes, nature of the solid phase, temperature. The effects of the biological parameters on the distribution coefficient are discussed. Biological processes affect the main chemical transformations: mineralization, insolubilization, oxidation-reduction, complexation, ... The importance of these processes is demonstrated by a number of examples in various media. Finally, the practical use of Ksub(d) in the assessment of the environmental impact of radioactive releases is developed, with special emphasis on the limits of its use in siting studies and its essential interest in specifying pathways and capacity of a river system [fr

  4. Time Sharing Between Robotics and Process Control: Validating a Model of Attention Switching.

    Science.gov (United States)

    Wickens, Christopher Dow; Gutzwiller, Robert S; Vieane, Alex; Clegg, Benjamin A; Sebok, Angelia; Janes, Jess

    2016-03-01

    The aim of this study was to validate the strategic task overload management (STOM) model, which predicts task switching when concurrence is impossible. The STOM model predicts that, in overload, tasks will be switched to according to how attractive they are on the task attributes: high priority, interest, and salience, and low difficulty. More-difficult tasks, however, are less likely to be switched away from once they are being performed. In Experiment 1, participants performed four tasks of the Multi-Attribute Task Battery and provided task-switching data to inform the roles of difficulty and priority. In Experiment 2, participants concurrently performed an environmental control task and a robotic arm simulation. Workload was varied by automation of arm movement, the phases of environmental control, and the existence of decision support for fault management. Attention to the two tasks was measured using a head tracker. Experiment 1 revealed the lack of influence of task priority and confirmed the differing roles of task difficulty. In Experiment 2, the percentage of attention allocated across the eight conditions was predicted by the STOM model using participants' ratings of the four attributes. Model predictions were compared against empirical data and accounted for over 95% of the variance in task allocation. More-difficult tasks were performed longer than easier tasks. Task priority does not influence allocation. The multiattribute decision model provided a good fit to the data. The STOM model is useful for predicting cognitive tunneling, given that human-in-the-loop simulation is time-consuming and expensive. © 2016, Human Factors and Ergonomics Society.
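    The attribute calculus described above can be sketched as a weighted sum (attribute names come from the abstract; the weights, scores, and task names here are invented for illustration, not the paper's fitted values):

```python
# A task's switch attractiveness rises with priority, interest, and salience
# and falls with difficulty; in overload, attention switches to the most
# attractive task.
def attractiveness(task, weights=(1.0, 1.0, 1.0, 1.0)):
    wp, wi, ws, wd = weights
    return (wp * task["priority"] + wi * task["interest"]
            + ws * task["salience"] - wd * task["difficulty"])

def next_task(tasks, weights=(1.0, 1.0, 1.0, 1.0)):
    # Choose the task to switch to: highest attribute-weighted attractiveness.
    return max(tasks, key=lambda name: attractiveness(tasks[name], weights))

tasks = {
    "robotic_arm":  {"priority": 0.5, "interest": 0.7, "salience": 0.9, "difficulty": 0.3},
    "process_ctrl": {"priority": 0.5, "interest": 0.4, "salience": 0.2, "difficulty": 0.8},
}
print(next_task(tasks))  # → robotic_arm
```

    Note that the experiments found no effect of the priority attribute; a fitted version of this sketch would therefore carry a near-zero priority weight.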

  5. New pediatric vision screener, part II: electronics, software, signal processing and validation.

    Science.gov (United States)

    Gramatikov, Boris I; Irsch, Kristina; Wu, Yi-Kai; Guyton, David L

    2016-02-04

    We have developed an improved pediatric vision screener (PVS) that can reliably detect central fixation, eye alignment and focus. The instrument identifies risk factors for amblyopia, namely eye misalignment and defocus. The device uses the birefringence of the human fovea (the most sensitive part of the retina). The optics have been reported in more detail previously. The present article focuses on the electronics and the analysis algorithms used. The objective of this study was to optimize the analog design, data acquisition, noise suppression techniques, the classification algorithms and the decision making thresholds, as well as to validate the performance of the research instrument on an initial group of young test subjects-18 patients with known vision abnormalities (eight male and 10 female), ages 4-25 (only one above 18) and 19 controls with proven lack of vision issues. Four statistical methods were used to derive decision making thresholds that would best separate patients with abnormalities from controls. Sensitivity and specificity were calculated for each method, and the most suitable one was selected. Both the central fixation and the focus detection criteria worked robustly and allowed reliable separation between normal test subjects and symptomatic subjects. The sensitivity of the instrument was 100 % for both central fixation and focus detection. The specificity was 100 % for central fixation and 89.5 % for focus detection. The overall sensitivity was 100 % and the overall specificity was 94.7 %. Despite the relatively small initial sample size, we believe that the PVS instrument design, the analysis methods employed, and the device as a whole, will prove valuable for mass screening of children.
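    The reported percentages follow the standard screening-accuracy definitions; a generic helper (not the screener's actual software) makes the arithmetic explicit:

```python
# sensitivity = TP / (TP + FN): true-positive rate among affected subjects
# specificity = TN / (TN + FP): true-negative rate among controls
def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

# With 19 controls, one false referral gives 18/19 ≈ 94.7% (the reported overall
# specificity), and two give 17/19 ≈ 89.5% (the focus-detection specificity).
print(round(100 * specificity(18, 1), 1))  # → 94.7
```

    This also shows why small samples quantize the achievable figures: with 19 controls, specificity can only move in steps of 1/19.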

  6. A comparison between the example reference biosphere model ERB 2B and a process-based model: simulation of a natural release scenario.

    Science.gov (United States)

    Almahayni, T

    2014-12-01

    The BIOMASS methodology was developed with the objective of constructing defensible assessment biospheres for assessing the potential radiological impacts of radioactive waste repositories. To this end, a set of Example Reference Biospheres was developed to demonstrate the use of the methodology and to provide an international point of reference. In this paper, the performance of the Example Reference Biosphere model ERB 2B associated with the natural release scenario, discharge of contaminated groundwater to the surface environment, was evaluated by comparing its long-term projections of radionuclide dynamics and distribution in a soil-plant system to those of a process-based, transient advection-dispersion (AD) model. The models were parametrised with data characteristic of a typical rainfed winter wheat crop grown on a sandy loam soil under temperate climate conditions. Three safety-relevant radionuclides, (99)Tc, (129)I and (237)Np, with different degrees of sorption were selected for the study. Although the models were driven by the same hydraulic (soil moisture content and water fluxes) and radiological (Kds) input data, their projections were remarkably different. On one hand, both models were able to capture short- and long-term variation in activity concentration in the subsoil compartment. On the other hand, the Reference Biosphere model did not project any radionuclide accumulation in the topsoil and crop compartments. This behaviour would underestimate the radiological exposure under natural release scenarios. The results highlight the potential role deep roots play in soil-to-plant transfer under a natural release scenario where radionuclides are released into the subsoil. When considering the relative activity and root-depth profiles within the soil column, much of the radioactivity was taken up into the crop from the subsoil compartment. Further improvements were suggested to address the limitations of the Reference Biosphere model presented in this paper.
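    The kind of process-based transport model contrasted here can be illustrated with a toy 1D advection-dispersion scheme with linear sorption (all parameter values below are invented for illustration and are not taken from the study):

```python
def advect_disperse(n=50, steps=2000, dx=0.02, dt=0.001,
                    v=0.1, D=1e-3, Kd=0.0, rho_b=1400.0, theta=0.3):
    """Explicit upwind scheme for dC/dt = (D*C'' - v*C') / R with
    retardation R = 1 + rho_b*Kd/theta (linear, equilibrium sorption)."""
    R = 1.0 + rho_b * Kd / theta
    c = [0.0] * n
    c[0] = 1.0                          # fixed-concentration inlet (source)
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            adv = -v * (c[i] - c[i - 1]) / dx                      # upwind advection
            disp = D * (c[i + 1] - 2.0 * c[i] + c[i - 1]) / dx**2  # dispersion
            new[i] = c[i] + dt * (adv + disp) / R
        new[0], new[-1] = 1.0, new[-2]  # inlet and zero-gradient outlet
        c = new
    return c

conservative = advect_disperse()        # non-sorbing tracer
retarded = advect_disperse(Kd=1e-4)     # sorbing species lags behind
```

    A sorbing species (nonzero Kd) is retarded, so its front lags the conservative tracer's; a compartment model such as ERB 2B, by contrast, resolves no such depth profile within a compartment.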

  7. Validation of the solidifying soil process using laser-induced breakdown spectroscopy

    Science.gov (United States)

    Lin, Zhao-Xiang; Liu, Lin-Mei; Liu, Lu-Wen

    2016-09-01

    Although an Ionic Soil Stabilizer (ISS) has been widely used in landslide control, it is desirable to effectively monitor the stabilization process. With the application of laser-induced breakdown spectroscopy (LIBS), the ion contents of K, Ca, Na, Mg, Al, and Si in the permeable fluid are detected after the solidified soil samples have been permeated. The processes of the Ca ion exchange are analyzed at pressures of 2 and 3 atm, and it was determined that the cation exchanged faster as the pressure increased. The Ca ion exchanges were monitored for different stabilizer mixtures, and it was found that a ratio of 1:200 of ISS to soil is most effective. The investigated plasticity and liquidity indexes also showed that the 1:200 ratio delivers the best performance. The research work indicates that it is possible to evaluate the engineering performances of soil solidified by ISS in real time and online by LIBS.

  8. Manipulation of natural subsurface processes: Field research and validation. Interim report

    International Nuclear Information System (INIS)

    Fruchter, J.S.; Spane, F.A.; Amonette, J.E.

    1994-11-01

    Often the only alternative for treating deep subsurface contamination is in situ manipulation of natural processes to change the mobility or form of contaminants. However, the complex interactions of natural subsurface physical, chemical, and microbial processes limit the predictability of the system-wide impact of manipulation based on current knowledge. This report is a summary of research conducted to examine the feasibility of controlling the oxidation-reduction (redox) potential of the unconfined aquifer at the Hanford Site in southeastern Washington State by introducing chemical reagents and microbial nutrients. The experiment would allow the testing of concepts and hypotheses developed from fundamental research in the US Department of Energy's (DOE's) Subsurface Science Program. Furthermore, the achievement of such control is expected to have implications for in situ remediation of dispersed aqueous contaminants in the subsurface environment at DOE sites nationwide, and particularly at the Hanford Site. This interim report summarizes initial research that was conducted between July 1990 and October 1991

  9. Tritium processing tests for the validation of upgraded PERMCAT mechanical design

    Energy Technology Data Exchange (ETDEWEB)

    Demange, D.; Glugla, M.; Guenther, K.; Le, T. L.; Simon, K. H.; Wagner, R.; Welte, S. [Forschungszentrum Karlsruhe GmbH, Institue for Technical Physics, Tritium Laboratory Karlsruhe, P.O Box 36 40, D-76021 Karlsruhe (Germany)

    2008-07-15

    The PERMCAT process, chosen for the final clean-up stage of the Tritium Exhaust Processing system in ITER, directly combines a Pd/Ag membrane and a catalyst bed for the detritiation of gaseous mixtures containing molecular and chemically bound tritium. Upgraded PERMCAT mechanical designs have been proposed to both increase the robustness and simplify the design of the reactor. One uses a special corrugated Pd/Ag membrane able to withstand change in length of the membrane during both normal operation and in the case of off-normal events. Based on this design, an upgraded PERMCAT reactor has been produced at FZK and successfully tested at TLK with ITER relevant tritiated gaseous mixtures using the CAPER facility. (authors)

  10. Tritium processing tests for the validation of upgraded PERMCAT mechanical design

    International Nuclear Information System (INIS)

    Demange, D.; Glugla, M.; Guenther, K.; Le, T. L.; Simon, K. H.; Wagner, R.; Welte, S.

    2008-01-01

    The PERMCAT process, chosen for the final clean-up stage of the Tritium Exhaust Processing system in ITER, directly combines a Pd/Ag membrane and a catalyst bed for the detritiation of gaseous mixtures containing molecular and chemically bound tritium. Upgraded PERMCAT mechanical designs have been proposed to both increase the robustness and simplify the design of the reactor. One uses a special corrugated Pd/Ag membrane able to withstand change in length of the membrane during both normal operation and in the case of off-normal events. Based on this design, an upgraded PERMCAT reactor has been produced at FZK and successfully tested at TLK with ITER relevant tritiated gaseous mixtures using the CAPER facility. (authors)

  11. Micro injection moulding process validation for high precision manufacture of thermoplastic elastomer micro suspension rings

    DEFF Research Database (Denmark)

    Calaon, M.; Tosello, G.; Elsborg Hansen, R.

    Micro injection moulding (μIM) is one of the most suitable micro manufacturing processes for flexible mass-production of multi-material functional micro components. The technology was employed in this research to produce thermoplastic elastomer (TPE) micro suspension rings identified... main μIM process parameters (melt temperature, injection speed, packing pressure) using the Design of Experiments statistical technique. Measurement results demonstrated the importance of calibrating the mould's master geometries to ensure correct part production and effective quality conformance... on the frequency in order to improve the signal quality and assure acoustic reproduction fidelity. Production quality of the TPE rings drastically influences the product functionality. In the present study, a procedure for optimization of μIM TPE micro ring production has been established. The procedure entails using...

  12. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.8

    Science.gov (United States)

    2013-06-28

    OC operating in the turbid coastal waters of the Gulf of Mexico. Strict criteria exist for data collection, protocols and processing to calculate the... cruise. The Gulf of Mexico chlorophyll cruise has been added to this report to show the correlation of the chlorophyll products and demonstrate how the N2Gen 9 upgrades in AOPS v4.8 improve the matchups. Figure 18 shows a VIIRS image of the Gulf of Mexico with oceanographic cruise stations 1

  13. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.12

    Science.gov (United States)

    2015-09-03

    Sensor (SeaWiFS), Moderate Resolution Imaging Spectrometers (MODIS on Aqua), and Medium Resolution Imaging Spectrometer (MERIS). This VTR documents... the MOBY site. 3.2.1 Image to Image Comparison: The AOPS v4.12 upgrades are detailed in Section 2, System Description. Figure 5 shows the MODIS... bottom images were processed with AOPS v4.12; MODIS imagery on the left and VIIRS on the right. The most significant change in the VIIRS imagery is

  14. Pose-varied multi-axis optical finishing systems theory and process validation

    CERN Document Server

    Cheng, Haobo

    2015-01-01

    This book focuses on advanced optical finishing techniques and design for high-performance manufacturing systems. It provides numerous detailed examples of how advanced automation techniques have been applied to optical fabrication processes. The simulations, removal rate and accurate experimental results offer useful resources for engineering practice. Researchers, engineers and graduate students working in optical engineering and precision manufacture engineering will benefit from this book.

  15. Modeling and Experimental Validation of the Electron Beam Selective Melting Process

    Directory of Open Access Journals (Sweden)

    Wentao Yan

    2017-10-01

    Full Text Available Electron beam selective melting (EBSM) is a promising additive manufacturing (AM) technology. The EBSM process consists of three major procedures: ① spreading a powder layer, ② preheating to slightly sinter the powder, and ③ selectively melting the powder bed. The highly transient multi-physics phenomena involved in these procedures pose a significant challenge for in situ experimental observation and measurement. To advance the understanding of the physical mechanisms in each procedure, we leverage high-fidelity modeling and post-process experiments. The models resemble the actual fabrication procedures, including ① a powder-spreading model using the discrete element method (DEM), ② a phase field (PF) model of powder sintering (solid-state sintering), and ③ a powder-melting (liquid-state sintering) model using the finite volume method (FVM). Comprehensive insights into all the major procedures are provided, which have rarely been reported. Preliminary simulation results (including powder particle packing within the powder bed, sintering neck formation between particles, and single-track defects) agree qualitatively with experiments, demonstrating the ability to understand the mechanisms and to guide the design and optimization of the experimental setup and manufacturing process.

  16. Validation of DWI pre-processing procedures for reliable differentiation between human brain gliomas.

    Science.gov (United States)

    Vellmer, Sebastian; Tonoyan, Aram S; Suter, Dieter; Pronin, Igor N; Maximov, Ivan I

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) is a powerful tool in clinical applications, in particular, in oncology screening. dMRI demonstrated its benefit and efficiency in the localisation and detection of different types of human brain tumours. Clinical dMRI data suffer from multiple artefacts such as motion and eddy-current distortions, contamination by noise, outliers etc. In order to increase the image quality of the derived diffusion scalar metrics and the accuracy of the subsequent data analysis, various pre-processing approaches are actively developed and used. In the present work we assess the effect of different pre-processing procedures such as a noise correction, different smoothing algorithms and spatial interpolation of raw diffusion data, with respect to the accuracy of brain glioma differentiation. As a set of sensitive biomarkers of the glioma malignancy grades we chose the derived scalar metrics from diffusion and kurtosis tensor imaging as well as the neurite orientation dispersion and density imaging (NODDI) biophysical model. Our results show that the application of noise correction, anisotropic diffusion filtering, and cubic-order spline interpolation resulted in the highest sensitivity and specificity for glioma malignancy grading. Thus, these pre-processing steps are recommended for the statistical analysis in brain tumour studies. Copyright © 2017. Published by Elsevier GmbH.

  17. Kozeny-Carman permeability relationship with disintegration process predicted from early dissolution profiles of immediate release tablets.

    Science.gov (United States)

    Kumari, Parveen; Rathi, Pooja; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir

    2017-07-01

    This study was oriented toward the disintegration profiling of diclofenac sodium (DS) immediate-release (IR) tablets and the development of its relationship with the medium permeability k_perm based on the Kozeny-Carman equation. Batches (L1-L9) of DS IR tablets with different porosities and specific surface areas were prepared at different compression forces and evaluated for porosity, in vitro dissolution, and particle-size analysis of the disintegrated mass. The k_perm was calculated from porosities and specific surface areas, and disintegration profiles were predicted from the dissolution profiles of the IR tablets by the stripping/residual method. The disintegration profiles were subjected to exponential regression to find the respective disintegration equations and rate constants k_d. Batches L1 and L2 showed the fastest disintegration rates, as evident from their bi-exponential equations, while the remaining batches L3-L9 exhibited first-order or mono-exponential disintegration kinetics. The 95% confidence interval (CI 95%) revealed significant differences between k_d values of different batches except L4 and L6. Similar results were also spotted for dissolution profiles of the IR tablets by the similarity (f2) test. The final relationship between k_d and k_perm was found to be hyperbolic, signifying the initial effect of k_perm on the disintegration rate. The results showed that disintegration profiling is possible because a relationship exists between k_d and k_perm, the latter being relatable to porosity and specific surface area and determinable by nondestructive tests.
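    The two relationships named above can be sketched directly (the Kozeny constant K = 5 and all input values are illustrative; the study's fitted k_d values are not reproduced here):

```python
import math

def kozeny_carman(porosity: float, S: float, K: float = 5.0) -> float:
    """k_perm = eps**3 / (K * S**2 * (1 - eps)**2)  (Kozeny-Carman form)."""
    return porosity**3 / (K * S**2 * (1.0 - porosity)**2)

def fraction_disintegrated(t: float, k_d: float) -> float:
    """Mono-exponential (first-order) disintegration: 1 - exp(-k_d * t)."""
    return 1.0 - math.exp(-k_d * t)

# A denser compact (lower porosity) is less permeable:
print(kozeny_carman(0.10, 1.0) < kozeny_carman(0.25, 1.0))  # → True
```

    Because compression force sets porosity, and porosity sets k_perm, this chain is what links a nondestructive porosity measurement to the disintegration rate constant.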

  18. Growth Hormone-Releasing Peptide 6 Enhances the Healing Process and Improves the Esthetic Outcome of the Wounds

    Directory of Open Access Journals (Sweden)

    Yssel Mendoza Marí

    2016-01-01

    Full Text Available In addition to its cytoprotective effects, growth hormone-releasing peptide 6 (GHRP-6) proved to reduce liver fibrotic induration. CD36, one of the GHRP-6 receptors, is abundantly represented in the granulation tissue of cutaneous wounds. The healing response in a scenario of CD36 agonistic stimulation had not been previously investigated. Excisional full-thickness wounds (6 mm Ø) were created in the dorsum of Wistar rats and topically treated twice a day for 5 days. The universal model of rabbit ear hypertrophic scars was implemented, and the animals were treated daily for 30 days. Treatments for both species were based on a CMC jelly composition containing GHRP-6 at 400 μg/mL. Wound response characterization included closure dynamics, RT-PCR transcriptional profile, histology, and histomorphometric procedures. The rat experiment indicated that GHRP-6 pharmacodynamics involves attenuation of immunoinflammatory mediators, their effector cells, and the reduction of the expression of fibrotic cytokines. Importantly, in the rabbit hypertrophic scar model, GHRP-6 intervention dramatically reduced the onset of exuberant scars by activating PPARγ and reducing the expression of fibrogenic cytokines. GHRP-6 showed no effect on the reversion of consolidated lesions. This evidence supports the notion that CD36 is an active and pharmacologically approachable receptor to attenuate wound inflammation and accelerate wound closure so as to improve the esthetic outcome.

  19. Titanium carbide-carbon porous nanocomposite materials for radioactive ion beam production: processing, sintering and isotope release properties

    CERN Document Server

    AUTHOR|(CDS)2081922; Stora, Thierry

    2017-01-26

    The Isotope Separator OnLine (ISOL) technique is used at the ISOLDE (Isotope Separator OnLine DEvice) facility at CERN to produce radioactive ion beams for physics research. At CERN, protons are accelerated to 1.4 GeV and made to collide with one of two targets located at the ISOLDE facility. When the protons collide with the target material, nuclear reactions produce isotopes, which are thermalized in the bulk of the target material grains. During irradiation the target is kept at high temperature (up to 2300 °C) to promote diffusion and effusion of the produced isotopes into an ion source, producing a radioactive ion beam. Ti-foil targets are currently used at ISOLDE to deliver beams of K, Ca and Sc; however, they are operated at temperatures close to their melting point, which causes target degradation through sintering and/or melting and reduces the beam intensities over time. For the past 10 years, nanostructured target materials have been developed and have shown improved release rates of the produced i...

  20. FACTAR validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Wadsworth, S.L.; Rock, R.C.; Sills, H.E.; Langman, V.J.

    1995-01-01

    A detailed strategy to validate fuel channel thermal-mechanical behaviour codes for use in current power reactor safety analysis is presented. The strategy is derived from a validation process that has recently been adopted industry-wide. The discussion focuses on the validation plan for the code FACTAR, for application in assessing fuel channel integrity safety concerns during a large-break loss of coolant accident (LOCA). (author)

  1. Impact of treated effluents released from processing of radioactive mineral on the aquatic environment of Periyar river

    International Nuclear Information System (INIS)

    Radhakrishnan, Sujata; Haridasan, P.P.; Radhakrishna Pillai, K.; Pillai, P.M.B.; Khan, A.H.

    2005-01-01

    The chemical processing of monazite/thorium concentrate for the separation of thorium, uranium and rare earths generates both acidic and alkaline effluents. Indian Rare Earths Ltd (IREL), Udyogamandal, had carried out chemical processing of monazite for nearly 50 years. Since 2004, the plant has shifted to processing previously stocked thorium hydroxide concentrate retrieved from silos to produce thorium oxalate (along with a small percentage of rare earth elements), nuclear-grade ammonium di-uranate (NGADU), and small quantities of nuclear-grade thorium oxide (the 'THRUST' project). The treated effluents are monitored and then discharged to the river Periyar, which is the recipient water body for treated effluents from IREL as well as a host of other chemical industries. The present paper discusses the characteristics of the effluents generated under this project, their treatment, the monitoring methodology, their discharge, and their impact on the aquatic environment of the river Periyar. The impact on the aquatic environment, in terms of enhancement of the natural background radioactivity in the river, has been insignificant. (author)

  2. The Music Therapy Session Assessment Scale (MT-SAS): Validation of a new tool for music therapy process evaluation.

    Science.gov (United States)

    Raglio, Alfredo; Gnesi, Marco; Monti, Maria Cristina; Oasi, Osmano; Gianotti, Marta; Attardo, Lapo; Gontero, Giulia; Morotti, Lara; Boffelli, Sara; Imbriani, Chiara; Montomoli, Cristina; Imbriani, Marcello

    2017-11-01

    Music therapy (MT) interventions are aimed at creating and developing a relationship between patient and therapist. However, there is a lack of validated observational instruments for consistently evaluating the MT process. The purpose of this study was the validation of the Music Therapy Session Assessment Scale (MT-SAS), designed to assess the relationship between therapist and patient during active MT sessions. Videotapes of a single 30-min session per patient were considered. A pilot study on the videotapes of 10 patients was carried out to help refine the items, define the scoring system and improve inter-rater reliability among the five raters. A validation study on 100 patients with different clinical conditions was then carried out. The Italian MT-SAS was used throughout the process, although an English translation is also provided. The final scale consisted of 7 binary items accounting for eye contact, countenance, and nonverbal and sound-music communication. In the pilot study, raters shared an acceptable level of agreement in their assessments. Exploratory factor analysis disclosed a single homogeneous factor including 6 items (thus supporting an ordinal total score), with only the item about eye contact being unrelated to the others. Moreover, the existence of 2 different archetypal profiles of attuned and disattuned behaviours was highlighted through multiple correspondence analysis. As suggested by the consistent results of 2 different analyses, the MT-SAS is a reliable tool that globally evaluates sonorous-musical and nonverbal behaviours related to emotional attunement and the empathetic relationship between patient and therapist during active MT sessions. Copyright © 2017 John Wiley & Sons, Ltd.
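The inter-rater agreement on binary items mentioned in this abstract can be quantified with a chance-corrected statistic such as Cohen's kappa; the abstract does not name the statistic the authors used, so the following two-rater sketch is purely illustrative:

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same
    binary (0/1) items, as commonly used for inter-rater reliability."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa1 = sum(rater_a) / n  # rater A's proportion of 1s
    pb1 = sum(rater_b) / n  # rater B's proportion of 1s
    # agreement expected by chance from the two raters' marginals
    p_exp = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1 for perfect agreement and 0 for chance-level agreement; extending it to the five raters of the pilot study would require a multi-rater generalization such as Fleiss' kappa.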

  3. Sensitivity, applicability and validation of bi-gaussian off- and on-line models for the evaluation of the consequences of accidental releases in nuclear facilities

    International Nuclear Information System (INIS)

    Kretzschmar, J.G.; Mertens, I.; Vanderborght, B.

    1984-01-01

    A computer code, CAERS (Computer Aided Emergency Response System), has been developed for simulating the short-term concentrations caused by an atmospheric emission. The concentration calculations are based on the bi-Gaussian plume model, with a choice of twelve different sets of turbulence typing schemes and dispersion parameters; alternatively, the plume can be simulated with a two-dimensional puff trajectory model with tri-Gaussian diffusion of the puffs. With the puff trajectory model, the emission and wind conditions can vary in time. Sixteen SF6 tracer dispersion experiments, with mobile as well as stationary time-averaged sampling, were carried out to validate the on-line and off-line models of CAERS. The tracer experiments showed that the CAERS system, using the bi-Gaussian model and the SCK/CEN turbulence typing scheme, simulates short-term concentration levels very well. The variations of the plume under non-steady emission and meteorological conditions are well simulated by the puff trajectory model. This leads to the general conclusion that the atmospheric dispersion models of the CAERS system can contribute significantly to the management and interpretation of air pollution concentration measurements in emergency situations.
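As a rough illustration of the bi-Gaussian calculation underlying such codes (not CAERS itself), the ground-level concentration of a continuous point release with total ground reflection can be sketched as follows; the sigma power-law coefficients are placeholder assumptions, not one of the twelve schemes mentioned in the abstract:

```python
import math

def gaussian_plume_glc(q, u, x, y, h_eff, a=0.08, b=0.0001, c=0.06, d=0.0015):
    """Ground-level concentration of a bi-Gaussian plume with total ground
    reflection: C = Q/(pi*u*sy*sz) * exp(-y^2/(2*sy^2)) * exp(-H^2/(2*sz^2)).
    q: source strength, u: wind speed, x: downwind and y: crosswind distance,
    h_eff: effective release height. sigma_y/sigma_z use illustrative
    Briggs-style power laws; a..d are placeholders, not a validated scheme."""
    sigma_y = a * x / math.sqrt(1.0 + b * x)
    sigma_z = c * x / math.sqrt(1.0 + d * x)
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-h_eff**2 / (2.0 * sigma_z**2)))
```

The crosswind Gaussian factor makes the centerline (y = 0) concentration the maximum at any downwind distance, which is the behaviour the tracer experiments test.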

  4. Bayesian model selection validates a biokinetic model for zirconium processing in humans

    Science.gov (United States)

    2012-01-01

    Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152

  5. Experimental validation on the effect of material geometries and processing methodology of Polyoxymethylene (POM)

    Science.gov (United States)

    Hafizzal, Y.; Nurulhuda, A.; Izman, S.; Khadir, AZA

    2017-08-01

    POM-copolymer bond breaking leads to changes that depend on processing methodology and material geometry. This paper presents the effects of different geometries and processing methodologies on material integrity. Thermo-analytical methods were used to examine thermomechanical degradation: Thermogravimetric Analysis (TGA) was used to judge the thermal stability of samples from their major decomposition temperature, and Differential Scanning Calorimetry (DSC) was performed to identify the thermal behaviour and thermal properties of the materials. The results showed that plastic gear geometries injection-molded on a higher-tonnage machine are more thermally stable than resin geometries. Plastic gear geometries injection-molded on a low-tonnage machine showed major decomposition temperatures at 313.61 °C, 305.76 °C and 307.91 °C, while those from the higher-tonnage process were fully decomposed at 890 °C, significantly higher than the low-tonnage condition and the resin geometry specimens at 398 °C. The chemical compositions of plastic gear geometries injection-molded at higher and lower tonnage were compared on the basis of moisture and volatile organic compound (VOC) content, polymeric material content and the absence of filler. Higher moisture and VOC content was reported in the resin geometries (0.120%) compared with the higher-tonnage injection-molded plastic gear geometries (1.264%). The higher-tonnage injection-molded plastic gear geometries are less sensitive to thermo-mechanical degradation owing to polymer chain length and molecular weight, which govern material properties such as tensile strength, flexural strength, fatigue strength and creep resistance.

  6. Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace

    Science.gov (United States)

    Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis

    2018-05-01

    This work focuses on the modelling of a small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that optimize material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for modelling the industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, showing sufficient agreement with experimental results.

  7. Monitoring Bare Soil Freeze–Thaw Process Using GPS-Interferometric Reflectometry: Simulation and Validation

    Directory of Open Access Journals (Sweden)

    Xuerui Wu

    2017-12-01

    Frozen soil and permafrost affect ecosystem diversity and productivity as well as global energy and water cycles. Although some space-based radar techniques and ground-based sensors can monitor frozen soil and permafrost variations, they have shortcomings and face challenges. For the first time, we use GPS Interferometric Reflectometry (GPS-IR) as a new remote sensing tool to monitor and investigate the bare-soil freeze-thaw process. Mixed-texture permittivity models are employed to calculate the frozen and thawed soil permittivities. When the soil freezes or thaws, there is an abrupt change in the soil permittivity, which results in variations in soil scattering. The corresponding theoretical simulations from the forward GPS multipath simulator show variations in the GPS multipath observables. For the in-situ measurements, a virtual bistatic radar is employed to simplify the analysis. Within the GPS-IR spatial resolution, one SNOTEL site (ID 958) and one corresponding PBO (Plate Boundary Observatory) GPS site (AB33) are used for analysis. In 2011, two representative days (frozen soil on day of year (DOY) 318 and thawed soil on DOY 322) show the SNR changes in phase and amplitude. The GPS site and the corresponding SNOTEL site are analyzed for comparison over four different years. When the soil freeze/thaw process occurred and no confounding snow depth or soil moisture effects existed, it exhibited a good absolute correlation with the average detrended SNR data (|R| = 0.72 in 2009, |R| = 0.902 in 2012, |R| = 0.646 in 2013, and |R| = 0.7017 in 2014). Our theoretical simulations and experimental results demonstrate that GPS-IR has potential for monitoring bare-soil temperature during the freeze-thaw process, although more test work is needed. GNSS-R polarimetry is also discussed as an option for detection; further retrieval work on elevation and polarization combinations is the focus of future development.
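The "average detrended SNR" comparison behind the |R| values in this abstract can be illustrated with a toy detrending and correlation step. The moving-average detrend below is a simplification (GPS-IR work typically removes a low-order polynomial fit of the direct signal), and both helpers are illustrative, not the authors' processing chain:

```python
from statistics import mean

def pearson_r(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def detrend(series, window=5):
    """Subtract a centred moving average to strip the slow direct-signal
    trend, keeping the faster multipath (reflected-signal) oscillation."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(series[i] - mean(series[lo:hi]))
    return out
```

The detrended SNR residual is what carries the reflected-signal amplitude and phase that change abruptly when the surface soil freezes or thaws.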

  8. Development and validation of a novel monitoring system for batch flocculant solids settling process

    DEFF Research Database (Denmark)

    Valverde Pérez, Borja; Zhang, Xueqian; Penkarski-Rodon, Elena

    2017-01-01

    Secondary sedimentation is the main hydraulic bottleneck to effective pollution control in WWTPs under wet-weather flow conditions. Therefore, online monitoring tools are required for control and optimization of the settling process under dynamic conditions. In this work we propose a novel monitoring system able to monitor batch settling tests by tracking the sludge blanket height and solids concentration along the column in the range of 1 to 8 g L-1. The system could be efficiently applied to monitor batch settling tests at several full-scale treatment plants run under different operational...

  9. Multibody dynamical modeling for spacecraft docking process with spring-damper buffering device: A new validation approach

    Science.gov (United States)

    Daneshjou, Kamran; Alibakhshi, Reza

    2018-01-01

    In the current manuscript, the process of spacecraft docking, one of the main risky operations in an on-orbit servicing mission, is modeled based on unconstrained multibody dynamics. A spring-damper buffering device is used in the docking probe-cone system for micro-satellites. Because impact inevitably occurs during the docking process and the motion characteristics of multibody systems are markedly affected by it, a continuous contact force model must be considered. The spring-damper buffering device, which keeps the spacecraft stable in orbit when impact occurs, connects a base (cylinder) inserted in the chaser satellite to the end of the docking probe. Furthermore, by considering a revolute joint equipped with a torsional shock absorber between the base and the chaser satellite, the docking probe can experience translational and rotational motions simultaneously. Although the spacecraft docking process with buffering mechanisms may be modeled by constrained multibody dynamics, this paper presents a simple and efficient formulation that eliminates the surplus generalized coordinates and solves the impact docking problem based on unconstrained Lagrangian mechanics. In an example problem, the model is first verified by comparing the computed results with those recently reported in the literature. Second, according to a new alternative validation approach based on the constrained multibody problem, the accuracy of the presented model can also be evaluated. This proposed verification approach can be applied to indirectly solve constrained multibody problems with minimum effort. The time history of the impact force, the influence of system flexibility, and the physical interaction between the shock absorber and the penetration depth caused by impact are the issues followed in this paper. Third, the MATLAB/SIMULINK multibody dynamic analysis software is applied to build the impact docking model to validate computed results and...
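The "continuous contact force model" invoked in this abstract is commonly taken to be of the Lankarani-Nikravesh form; the paper's exact choice is not stated here, so the following Python sketch is illustrative only:

```python
def contact_force(k, delta, ddelta, e, ddelta_impact, n=1.5):
    """Lankarani-Nikravesh continuous contact force:
    F = k * delta^n * [1 + 3*(1 - e^2)/4 * (ddelta / ddelta_impact)],
    where delta is the penetration depth, ddelta its rate, e the
    restitution coefficient, and ddelta_impact the relative normal
    velocity at first contact. k and n depend on contact geometry and
    materials; the values used in any call are illustrative."""
    if delta <= 0.0:
        return 0.0  # bodies not in contact
    damping = 3.0 * (1.0 - e**2) / 4.0 * (ddelta / ddelta_impact)
    return k * delta**n * (1.0 + damping)
```

With e = 1 the hysteresis damping term vanishes and the law reduces to the pure Hertzian spring F = k·δ^n, which is a quick sanity check on any implementation.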

  10. Composite Cure Process Modeling and Simulations using COMPRO(Registered Trademark) and Validation of Residual Strains using Fiber Optics Sensors

    Science.gov (United States)

    Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.

    2016-01-01

    Composite cure process induced residual strains and warping deformations in composite components present significant challenges in the manufacturing of advanced composite structure. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which the process induced factors influence the residual responses. In this regard, analytical studies have been conducted on the cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetic analysis, the model predictions on the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effect of temperature on the resin and ply modulus, and thermal coefficient changes during curing on predicted mechanical strains and chemical cure shrinkage strains were studied to understand the residual strains and stress response predictions. In addition to computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin impregnated panels. The residual strain measurements from laboratory tests were then compared with the analytical model predictions. The paper describes the cure process

  11. Influence of predictive contamination to agricultural products due to dry and wet processes during an accidental release of radionuclides

    International Nuclear Information System (INIS)

    Hwang, Won Tae; Kim, Eun Han; Suh, Kyung Suk; Jeong, Hyo Joon; Han, Moon Hee; Lee, Chang Woo

    2003-01-01

    The influence of predicted contamination of agricultural products due to wet as well as dry deposition processes from radioactive air concentrations during a nuclear emergency is comprehensively analyzed. The previous dynamic food chain model DYNACON, which considers Korean agricultural and environmental conditions and whose initial input parameter was the radionuclide concentration on the ground, is improved so that radioactive contamination of agricultural products can be evaluated from either radioactive air concentrations or radionuclide concentrations on the ground. The results show that wet deposition is a more dominant mechanism than dry deposition for contamination of the ground, whereas the contamination levels of agricultural products are strongly dependent on the radionuclide and the precipitation when deposition occurs. This means that the contamination levels of agricultural products are determined by whichever process is more dominant: deposition on the ground or interception by agricultural plants.
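The dry and wet deposition pathways compared in this abstract are conventionally modeled as a deposition-velocity flux and a rainfall-dependent scavenging term; the sketch below uses generic placeholder coefficients, not DYNACON's parameters:

```python
def ground_deposition(c_air, dt_s=3600.0, v_d=0.002, rain_mm_h=0.0,
                      h_mix=1000.0, a=8.0e-5, b=0.8):
    """Ground deposition (Bq/m^2) accumulated over one time step from a
    near-ground air concentration c_air (Bq/m^3): a dry term v_d * C plus
    a wet term Lambda * C * H integrated over the mixing column, with the
    common power-law scavenging coefficient Lambda = a * I^b (I in mm/h).
    All coefficient values here are generic placeholders."""
    dry = v_d * c_air * dt_s
    lam = a * rain_mm_h**b if rain_mm_h > 0 else 0.0  # scavenging, 1/s
    wet = lam * c_air * h_mix * dt_s
    return dry + wet
```

Because the wet term scales with the whole mixed-layer burden while the dry term uses only the near-ground concentration, even modest rain dominates total deposition, consistent with the abstract's conclusion.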

  12. Validation Studies of Temperature Distribution and Mould Filling Process for Composite Skeleton Castings

    Directory of Open Access Journals (Sweden)

    M. Cholewa

    2007-07-01

    In this work the authors present selected results of simulation and experimental studies on the temperature distribution during solidification of a composite skeleton casting and on the mould filling process (Fig. 4, 5, 6). The basic subject of the computer simulation was the analysis of the ability of the metal to fill the channels creating the skeleton shape, prepared in the form of a core. The filling of each consecutive level of the skeleton casting was analysed for the simulation results and the real casting. The skeleton casting was manufactured according to the proposed technology (Fig. 5). The number of fully filled nodes in the simulation was higher than that obtained in the experimental studies. It was observed in the experiment that the metal did not flow through the whole channel section during pouring, which suggested the possibility of reducing the channel section and pointed out the necessity of a local pressure increase.

  13. Study of the production of the radiopharmaceutical 18F-FLT in automated system: contribution for process validation

    International Nuclear Information System (INIS)

    Zanette, Camila

    2013-01-01

    The radiopharmaceutical 18F-FLT is a thymidine nucleoside analogue and a promising tumor proliferation marker for PET imaging. The synthesis of this radiopharmaceutical is not simple and often has low yields. Although it has been studied for some years, there is no production, nor are there clinical studies, in Brazil. The study of the production process and its compliance with the guidelines of Good Manufacturing Practices (ANVISA) is of extreme importance. This study aimed to investigate the synthesis of this radiopharmaceutical; evaluate quality control methods to be used in future production routines; and perform cytotoxicity studies, biodistribution studies and PET imaging in animals, thereby contributing to the development of the process validation protocol and to the establishment of analytical methods for production routines. Initially, we studied the synthesis and production of 18F-FLT, evaluating three different radiolabeling temperatures to check the behavior of the radiochemical yield and the stability of the final product. The analytical methodology comprised radionuclide identification, determination of chromatographic profiles, radiochemical purity, residual solvents, and pH. In vitro studies of internalization and cytotoxicity were also carried out. In the in vivo studies, we evaluated the pharmacokinetics and biodistribution in healthy animals and in tumor-bearing animals, in addition to PET/CT imaging in animals with melanomas. The final product had high radiochemical purity and was stable for up to 10 hours after synthesis, but had a relatively low radiochemical yield, as described in the literature. The tested analytical methods proved suitable for use in the quality control of 18F-FLT. In the in vitro studies, 18F-FLT showed a significant percentage of binding to tumor cells, and the nonradiolabeled molecule was not considered toxic for these studied...

  14. Validation of natural language processing to extract breast cancer pathology procedures and results

    Directory of Open Access Journals (Sweden)

    Arika E Wieneke

    2015-01-01

    Background: Pathology reports typically require manual review to abstract research data. We developed a natural language processing (NLP) system to automatically interpret free-text breast pathology reports with limited assistance from manual abstraction. Methods: We used an iterative approach of machine learning algorithms and constructed groups of related findings to identify breast-related procedures and results from free-text pathology reports. We evaluated the NLP system using an all-or-nothing approach to determine which reports could be processed entirely using NLP and which needed manual review beyond NLP. We divided 3234 reports into development (2910, 90%) and evaluation (324, 10%) sets, using manually reviewed pathology data as our gold standard. Results: NLP correctly coded 12.7% of the evaluation set, flagged 49.1% of reports for manual review, incorrectly coded 30.8%, and correctly omitted 7.4% from the evaluation set as irrelevant (i.e., not breast-related). Common procedures and results were identified correctly (e.g., invasive ductal with 95.5% precision and 94.0% sensitivity), but entire reports were flagged for manual review because of rare findings and substantial variation in pathology report text. Conclusions: The NLP system we developed did not perform sufficiently well for abstracting entire breast pathology reports. The all-or-nothing approach resulted in too broad a scope of work and limited our flexibility to identify breast pathology procedures and results. Our NLP system was also limited by the lack of gold-standard data on rare findings and the wide variation in pathology text. Focusing on individual, common elements and improving pathology report standardization may improve performance.
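The precision and sensitivity figures reported in this abstract reduce to simple count ratios against the gold standard; a minimal sketch:

```python
def precision_sensitivity(tp, fp, fn):
    """Precision = TP/(TP+FP) and sensitivity (recall) = TP/(TP+FN),
    computed from counts of NLP-extracted codes scored against a
    manually abstracted gold standard."""
    return tp / (tp + fp), tp / (tp + fn)
```

Precision penalizes spurious NLP codes, while sensitivity penalizes gold-standard findings the NLP system missed, which is why both are reported together for each finding type.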

  15. Experiments to Populate and Validate a Processing Model for Polyurethane Foam: Additional Data for Structural Foams

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Giron, Nicholas Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Russick, Edward M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    We are developing computational models to help understand manufacturing processes, final properties and aging of structural foam, polyurethane PMDI. The resulting model predictions of density and cure gradients from the manufacturing process will be used as input to foam heat transfer and mechanical models. BKC 44306 PMDI-10 and BKC 44307 PMDI-18 are the most prevalent foams used in structural parts. Experiments needed to parameterize models of the reaction kinetics and the equations of motion during the foam blowing stages were described for BKC 44306 PMDI-10 in the first report of this series (Mondy et al. 2014). BKC 44307 PMDI-18 is a new foam that will be used to make relatively dense structural supports via overpacking. It uses a different catalyst than those in the BKC 44306 family of foams; hence, we expect that the reaction kinetics models must be modified. Here we detail the experiments needed to characterize the reaction kinetics of BKC 44307 PMDI-18 and suggest parameters for the model based on these experiments. In addition, the second part of this report describes data taken to provide input to the preliminary nonlinear viscoelastic structural response model developed for BKC 44306 PMDI-10 foam. We show that the standard cure schedule used by KCP does not fully cure the material, and, upon temperature elevation above 150°C, oxidation or decomposition reactions occur that alter the composition of the foam. These findings suggest that achieving a fully cured foam part with this formulation may not be possible through thermal curing. As such, viscoelastic characterization procedures developed for curing thermosets can provide only approximate material properties, since the state of the material continuously evolves during tests.
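Reaction-kinetics models of the kind being parameterized here are often written as an Arrhenius nth-order rate law in the degree of cure; the sketch below is a generic stand-in, and every constant is an illustrative placeholder, not a fitted BKC 44306/44307 value:

```python
import math

def cure_rate(alpha, temp_k, pre_exp=1.0e5, ea=60.0e3, order=1.5, r_gas=8.314):
    """nth-order Arrhenius cure kinetics:
    d(alpha)/dt = A * exp(-Ea / (R*T)) * (1 - alpha)^n,
    with alpha the degree of cure (0..1) and T in kelvin. All constants
    are illustrative placeholders for a generic thermoset."""
    return pre_exp * math.exp(-ea / (r_gas * temp_k)) * (1.0 - alpha) ** order

def integrate_cure(temp_k, t_end=3600.0, dt=1.0):
    """Explicit-Euler integration of the degree of cure at constant T,
    clamped to 1.0 (full cure)."""
    alpha = 0.0
    for _ in range(int(t_end / dt)):
        alpha = min(1.0, alpha + cure_rate(alpha, temp_k) * dt)
    return alpha
```

With such a model, an incomplete cure under the standard schedule shows up as alpha plateauing below 1 at the hold temperature, which is the kind of behaviour the report attributes to the KCP cure schedule.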

  16. The Citadel of Alessandria: Values and strategies involved in the process of releasing from the public ownership

    Directory of Open Access Journals (Sweden)

    Cristina Coscia

    2015-06-01

    The issues addressed concern the valorization of state-owned property assets, their management, and financial rebalancing through a careful policy of disposals and growth of profitability. These dynamics, through grants or leases to third parties, have been expanding, with increasing attention paid to public finance issues. A radical change of perspective has begun in the evaluation of the role of asset management by local authorities. Heritage assets are no longer considered static but dynamic: they are treated as a strategic asset in overall financial management, which local governments use to meet their service delivery goals and to maximize the well-being of the community. The assets of the Defense Ministry transferred to the State Property Office offer important opportunities for development: not only properties to place on the real estate market for monetary returns to help local government finances (a strategy that did not lead to the desired results), but also opportunities to initiate valorization processes affecting the industrial area and the surrounding geographical area. In this sense, the case of the Citadel of Alessandria becomes a paradigmatic case for applying technical decision-making tools (SWOT, Analytic Hierarchy Process, Analytic Network Process, etc.) that can support the delineation of the most compatible functional scenario.

  17. Volpe Aircraft Noise Certification DGPS Validation/Audit General Information, Data Submittal Guidelines, and Process Details; Letter Report V324-FB48B3-LR5

    Science.gov (United States)

    2018-01-09

    As required by Federal Aviation Administration Order 8110.4C, Type Certification Process, the Volpe Center Acoustics Facility (Volpe), in support of the Federal Aviation Administration Office of Environment and Energy (AEE), has completed valid...

  18. Selective and validated data processing techniques for performance improvement of automatic lines

    Directory of Open Access Journals (Sweden)

    D’Aponte Francesco

    2016-01-01

    Optimization of the data processing techniques for accelerometers and force transducers provided information that was used to improve the behavior of the cutting stage of a converting machine for diaper production. In particular, different mechanical configurations were studied and compared in order to reduce the stresses due to the impacts between knives and anvil, to obtain clean and accurate cuts, and to reduce wear of the knives themselves. Reducing the measurement uncertainty made it possible to correctly identify the best configuration for the pneumatic system that realizes the coupling between anvil and knife. The size of the pipes, the working pressure and the type of fluid used in the coupling system were examined. Experimental results obtained from acceleration and force measurements allowed the geometry of the pushing device and the working pressure range of the hydraulic fluid to be identified in a reproducible and coherent way. The remarkable reduction of knife and anvil vibrations is expected to strongly reduce the wear of the cutting-stage components.

  19. Data Validation Package May 2016 Groundwater Sampling at the Lakeview, Oregon, Processing Site August 2016

    Energy Technology Data Exchange (ETDEWEB)

    Linard, Joshua [USDOE Office of Legacy Management, Washington, DC (United States); Hall, Steve [Navarro Research and Engineering, Inc., Oak Ridge, TN (United States)

    2016-08-01

    This biennial event includes sampling of five groundwater locations (four monitoring wells and one domestic well) at the Lakeview, Oregon, Processing Site. For this event, the domestic well (location 0543) could not be sampled because no one was in residence during the sampling event (note: notification was provided to the resident prior to the event). Per Appendix A of the Groundwater Compliance Action Plan, sampling is conducted to monitor groundwater quality on a voluntary basis. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated). One duplicate sample was collected from location 0505. Water levels were measured at each sampled monitoring well. The constituents monitored at the Lakeview site are manganese and sulfate. Monitoring locations that exceeded the U.S. Environmental Protection Agency (EPA) Secondary Maximum Contaminant Levels for these constituents are listed in Table 1. Review of the time-concentration graphs included in this report indicates that manganese and sulfate concentrations are consistent with historical measurements.
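
    The screening against EPA Secondary Maximum Contaminant Levels described above can be sketched as a simple table check. A minimal illustration, assuming the published secondary standards for manganese (0.05 mg/L) and sulfate (250 mg/L); the well identifiers and measured values below are hypothetical, not the report's data:

```python
# Hedged sketch: screening measured groundwater concentrations against
# EPA Secondary Maximum Contaminant Levels (SMCLs), as done in Table 1.
# The sample values below are hypothetical, not the report's data.
SMCL_MG_PER_L = {"manganese": 0.05, "sulfate": 250.0}  # EPA secondary standards

def exceedances(samples):
    """Return (location, constituent, value) tuples that exceed the SMCL."""
    out = []
    for location, results in samples.items():
        for constituent, value in results.items():
            limit = SMCL_MG_PER_L.get(constituent)
            if limit is not None and value > limit:
                out.append((location, constituent, value))
    return out

measurements = {  # hypothetical monitoring-well results, mg/L
    "0505": {"manganese": 0.12, "sulfate": 310.0},
    "0507": {"manganese": 0.03, "sulfate": 180.0},
}
print(exceedances(measurements))
```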

  20. Validation of formability of laminated sheet metal for deep drawing process using GTN damage model

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Yongbin; Cha, Wan-gi; Kim, Naksoo [Department of Mechanical Engineering, Sogang University, 1 Sinsu-dong, Mapo-gu, Seoul, 121-742 (Korea, Republic of); Ko, Sangjin [Mold/die and forming technology team, Product prestige research lab, LG electronics, 222, LG-ro, Jinwi-myeon, Pyeongtaek-si, Gyeonggi-do, 451-713 (Korea, Republic of)

    2013-12-16

    In this study, we investigated the formability of a PET/PVC laminated sheet metal named VCM (Vinyl Coated Metal). VCM offers various patterns and a good-looking metal surface and is used for appliances such as refrigerators and washing machines. However, this sheet suffers from cracking and peeling of the film when the material is formed by a deep drawing process. To predict these problems, we used the finite element method with the GTN (Gurson-Tvergaard-Needleman) damage model to represent damage in the material. We divided the VCM into 3 layers (PET film, adhesive, and PVC-coated steel) in the finite element analysis model to express the cracking and peeling phenomena. The material properties of each layer were determined by reverse engineering based on tensile test results. Furthermore, we performed a simple rectangular deep drawing experiment and simulated it. The simulation result shows good agreement with the drawing experiment in the position and punch stroke at which cracks occur. We also studied the fracture mechanism of the PET film on VCM by comparing the width-direction strains of the metal and the PET film.

  1. Validity of the independent-processes approximation for resonance structures in electron-ion scattering cross sections

    International Nuclear Information System (INIS)

    Badnell, N.R.; Pindzola, M.S.; Griffin, D.C.

    1991-01-01

    The total inelastic cross section for electron-ion scattering may be found in the independent-processes approximation by adding the resonant cross section to the nonresonant background cross section. We study the validity of this approximation for electron excitation of multiply charged ions. The resonant-excitation cross section is calculated independently using distorted waves for various Li-like and Na-like ions, with (N+1)-electron atomic-structure methods previously developed for the calculation of dielectronic-recombination cross sections. To check the effects of interference between the two scattering processes, we also carry out detailed close-coupling calculations for the same atomic ions using the R-matrix method. For low ionization stages, interference effects sometimes manifest themselves as strong window features in the close-coupling cross section, which are not present in the independent-processes cross section. For higher ionization stages, however, the resonance features found in the independent-processes approximation are in good agreement with the close-coupling results.

  2. Process evaluation to explore internal and external validity of the "Act in Case of Depression" care program in nursing homes.

    Science.gov (United States)

    Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J

    2012-06-01

    A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents, "Act in case of Depression" (AiD), was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs). Before the effect analyses, AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed) were evaluated; these data can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on residents' personal files, interviews with nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. Setting: nursing homes. The pattern of residents' informed consent rates differed between dementia special care units and somatic units during the study. The nursing home staff were satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and were satisfied with the program content, individual AiD components may differ in feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning. The importance of

  3. Implementation of quality by design approach in manufacturing process optimization of dry granulated, immediate release, coated tablets - a case study.

    Science.gov (United States)

    Teżyk, Michał; Jakubowska, Emilia; Milanowski, Bartłomiej; Lulek, Janina

    2017-10-01

    The aim of this study was to optimize the tablet compression process and identify the film-coating critical process parameters (CPPs) affecting critical quality attributes (CQAs) using the quality by design (QbD) approach. Design of experiments (DOE) and regression methods were employed to investigate hardness, disintegration time, and thickness of uncoated tablets as functions of slugging and tableting compression force (CPPs). A Plackett-Burman experimental design was applied to identify critical coating process parameters among the selected ones, that is: drying and preheating time, atomization air pressure, spray rate, air volume, inlet air temperature, and drum pressure, that may influence the hardness and disintegration time of coated tablets. As a result of the research, a design space was established to facilitate an in-depth understanding of the existing relationship between CPPs and CQAs of the intermediate product (uncoated tablets). Screening revealed that spray rate and inlet air temperature are the two most important factors affecting the hardness of coated tablets. At the same time, none of the tested coating factors influenced disintegration time. This observation was confirmed by film coating of pilot-size batches.
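
    The Plackett-Burman screening mentioned above can be sketched with the standard 8-run, two-level construction. A minimal illustration: the cyclic generator is textbook material, while the response values are invented, not the study's design matrix or hardness data:

```python
import numpy as np

# Hedged sketch of a Plackett-Burman screening design like the one used to
# screen the selected coating factors. The 8-run design below is the standard
# cyclic construction; the response values are illustrative, not the study's data.

def plackett_burman_8():
    """8-run Plackett-Burman design for up to 7 two-level factors."""
    g = np.array([1, 1, 1, -1, 1, -1, -1])      # cyclic generator row
    rows = [np.roll(g, r) for r in range(7)]    # 7 cyclic shifts
    rows.append(-np.ones(7, dtype=int))         # final all-low run
    return np.array(rows, dtype=int)

def main_effects(design, response):
    """Main effect of each factor: mean at the high level minus mean at the low level."""
    return design.T @ response / (len(response) / 2)

X = plackett_burman_8()
# Orthogonality check: columns are balanced and mutually orthogonal.
assert np.all(X.T @ X == 8 * np.eye(7, dtype=int))

y = np.array([212., 195., 201., 188., 207., 190., 183., 176.])  # invented responses
print(main_effects(X, y))
```

    With an orthogonal two-level design, each main effect is estimated independently of the others, which is what makes a small screening run like this informative.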

  4. The Musical Emotional Bursts: A validated set of musical affect bursts to investigate auditory affective processing.

    Directory of Open Access Journals (Sweden)

    Sébastien ePaquette

    2013-08-01

    Full Text Available The Musical Emotional Bursts (MEB) consist of 80 brief musical executions expressing basic emotional states (happiness, sadness, and fear) and neutrality. These musical bursts were designed to be the musical analogue of the Montreal Affective Voices (MAV), a set of brief non-verbal affective vocalizations portraying different basic emotions. The MEB consist of short (mean duration: 1.6 s) improvisations on a given emotion or imitations of a given MAV stimulus, played on a violin (n = 40) or a clarinet (n = 40). The MEB arguably represent a primitive form of musical emotional expression, just as the MAV represent a primitive form of vocal, nonlinguistic emotional expression. To create the MEB, stimuli were recorded from 10 violinists and 10 clarinetists and then evaluated by 60 participants. Participants evaluated 240 stimuli (30 stimuli x 4 [3 emotions + neutral] x 2 instruments) by performing either a forced-choice emotion categorization task, a valence rating task, or an arousal rating task (20 subjects per task); 40 MAVs were also used in the same session with similar task instructions. Recognition accuracy of the emotional categories expressed by the MEB (n = 80) was lower than for the MAVs but still very high, with an average percent correct recognition score of 80.4%. The highest recognition accuracies were obtained for the happy clarinet (92.0%) and the fearful or sad violin (88.0% each) MEB stimuli. The MEB can be used to compare the cerebral processing of emotional expressions in music and vocal communication, or for testing affective perception in patients with communication problems.

  5. Validation and sensitivity tests on improved parametrizations of a land surface process model (LSPM) in the Po Valley

    International Nuclear Information System (INIS)

    Cassardo, C.; Carena, E.; Longhetto, A.

    1998-01-01

    The Land Surface Process Model (LSPM) has been improved with respect to the first version of 1994. The modifications involve the parametrizations of the radiation terms and of the turbulent heat fluxes. A parametrization of runoff has also been developed in order to close the hydrologic balance. This second version of LSPM has been validated against experimental data gathered at Mottarone (Verbania, Northern Italy) during a field experiment. The results of this validation show that the new version is able to apportion the energy into sensible and latent heat fluxes. LSPM has also been submitted to a series of sensitivity tests in order to investigate the hydrological part of the model. The physical quantities selected in these sensitivity experiments were the initial soil moisture content and the rainfall intensity. In each experiment, the model was forced using the observations carried out at the synoptic station of San Pietro Capofiume (Po Valley, Italy). The observed characteristics of soil and vegetation (not involved in the sensitivity tests) were used as initial and boundary conditions. The results of the simulation show that LSPM can reproduce well the energy, heat and water budgets and their behaviour as the selected parameters vary. A careful analysis of the LSPM output also shows the importance of identifying the effective soil type.

  6. Study on the Rationality and Validity of Probit Models of Domino Effect to Chemical Process Equipment caused by Overpressure

    International Nuclear Information System (INIS)

    Sun, Dongliang; Huang, Guangtuan; Jiang, Juncheng; Zhang, Mingguang; Wang, Zhirong

    2013-01-01

    Overpressure is one important cause of domino effects in accidents involving chemical process equipment. Models for the propagation probability and threshold values of the domino effect caused by overpressure have been proposed in a previous study. In order to test the rationality and validity of the models reported in the reference, the two boundary values separating the three damage degrees were treated as random variables in the interval [0, 100%]. Based on the overpressure data for damage to the equipment and the damage states, and the calculation method reported in the references, the mean square errors of the four categories of overpressure damage probability models were calculated with random boundary values, yielding a relationship between mean square error and the two boundary values, from which the minimum mean square error was obtained. Compared with the result of the present work, the mean square error decreases by about 3%. The error is therefore within the acceptable range for engineering applications, and the reported models can be considered reasonable and valid.
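
    The boundary-value search described above can be sketched as a grid search. A minimal illustration, in which the probit coefficients, the overpressure data, and the observed damage degrees are all invented stand-ins for the models and accident data of the referenced work:

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch of the boundary-value search: the two thresholds separating
# the three damage degrees are swept over a grid, and for each pair the mean
# square error between the model-predicted damage degree and the observed
# degree is computed. All coefficients and data below are invented.

def probit_probability(overpressure_kpa, a=-3.0, b=1.2):
    """Invented probit-style damage model: P = Phi(a + b * ln(overpressure))."""
    return norm.cdf(a + b * np.log(overpressure_kpa))

def damage_degree(p, b1, b2):
    """Map a damage probability to degree 0/1/2 using two boundaries b1 < b2."""
    return np.digitize(p, [b1, b2])

def mse_for_boundaries(overpressure, observed_degree, b1, b2):
    predicted = damage_degree(probit_probability(overpressure), b1, b2)
    return np.mean((predicted - observed_degree) ** 2)

# Hypothetical observations: overpressure (kPa) and observed damage degree.
op = np.array([5., 10., 20., 40., 80., 160.])
obs = np.array([0, 0, 1, 1, 2, 2])

grid = np.linspace(0.05, 0.95, 19)
best = min((mse_for_boundaries(op, obs, b1, b2), b1, b2)
           for b1 in grid for b2 in grid if b1 < b2)
print(best)  # (minimum MSE, best lower boundary, best upper boundary)
```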

  7. Study on the Rationality and Validity of Probit Models of Domino Effect to Chemical Process Equipment caused by Overpressure

    Science.gov (United States)

    Sun, Dongliang; Huang, Guangtuan; Jiang, Juncheng; Zhang, Mingguang; Wang, Zhirong

    2013-04-01

    Overpressure is one important cause of domino effects in accidents involving chemical process equipment. Models for the propagation probability and threshold values of the domino effect caused by overpressure have been proposed in a previous study. In order to test the rationality and validity of the models reported in the reference, the two boundary values separating the three damage degrees were treated as random variables in the interval [0, 100%]. Based on the overpressure data for damage to the equipment and the damage states, and the calculation method reported in the references, the mean square errors of the four categories of overpressure damage probability models were calculated with random boundary values, yielding a relationship between mean square error and the two boundary values, from which the minimum mean square error was obtained. Compared with the result of the present work, the mean square error decreases by about 3%. The error is therefore within the acceptable range for engineering applications, and the reported models can be considered reasonable and valid.

  8. Release of hydroxycinnamic acids and formation of flavour-active volatile phenols during the beer production process

    OpenAIRE

    Vanbeneden, Nele

    2007-01-01

    Most of the flavour-active volatile phenols in beer originate from the raw materials used in the brewing process. Only some of them can be formed by yeast activity, namely 4-vinylguaiacol (4VG) and 4-vinylphenol (4VP). The presence of these volatile phenolic compounds is considered undesirable when they occur in excessive concentrations in bottom-fermented pilsner beers, hence the term "phenolic off-flavour" (POF), attributed to beers with a strong medicinal, clove-like aroma. D...

  9. Communication behaviours of skilled and less skilled oncologists: a validation study of the Medical Interaction Process System (MIPS).

    Science.gov (United States)

    Ford, Sarah; Hall, Angela

    2004-09-01

    The Medical Interaction Process System (MIPS) was originally developed to create a reliable observation tool for analysing doctor-patient encounters in the oncology setting. This paper reports a series of analyses carried out to establish whether the behaviour categories of the MIPS can discriminate between skilled and less skilled communicators. MIPS-coded cancer consultations were used to compare the MIPS indices of 10 clinicians evaluated by an independent professional as skilled communicators with those of 10 who were considered less skilled. Eleven of the 15 MIPS variables tested were able to distinguish the skilled from the less skilled group. Although limitations of the study are discussed, the results indicate that the MIPS has satisfactory discriminatory power, and they provide validity data that meet key objectives for developing the system. There is an ever-increasing need for reliable methods of assessing doctors' communication skills and evaluating medical interview teaching programmes. Copyright 2004 Elsevier Ireland Ltd.

  10. Analysis of residual stresses due to roll-expansion process: Finite element computation and validation by experimental tests

    International Nuclear Information System (INIS)

    Aufaure, M.; Boudot, R.; Zacharie, G.; Proix, J.M.

    1987-01-01

    The steam generator heat exchangers of pressurized water reactors are made of U-shaped tubes, both ends of which are fixed to a plate by roll-expansion. This process consists in increasing the tube section by means of a rotating tool in order to press its outer side against the surface of the hole through the plate. As reported by de Keroulas (1986), in-service cracks appeared on these tubes in the transition from expanded to nonexpanded portions. We therefore developed a program to compute the residual stresses at the surface of the tubes, which caused the cracking, and endeavoured to lower their level by acting on some process parameters. This program was validated by experimental tests. (orig.)

  11. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-09-25

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten foot tall by two foot diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.
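
    The "feed forward" SPC idea above can be sketched as a composition-based acceptance test: a property-composition model predicts a melt property from the feed, and the feed is accepted only if the 95% one-sided prediction bound stays inside the property limit. A minimal illustration; the coefficients, model standard error, and limit below are invented placeholders, not DWPF's actual models or constraints:

```python
# Hedged illustration of "feed forward" SPC. All numbers are invented.
COEF = {"SiO2": 0.045, "B2O3": -0.020, "Na2O": 0.060, "Li2O": 0.055}
MODEL_STD_ERR = 0.08          # invented model prediction standard error
Z95 = 1.645                   # one-sided 95% normal quantile
LOG_PROPERTY_LIMIT = 2.2      # invented upper limit on ln(property)

def predicted_log_property(feed_wt_pct):
    """Invented linear property-composition model on oxide weight percents."""
    return sum(COEF[ox] * feed_wt_pct[ox] for ox in COEF)

def feed_acceptable(feed_wt_pct):
    """Accept the feed only if the 95% upper bound stays below the limit."""
    upper = predicted_log_property(feed_wt_pct) + Z95 * MODEL_STD_ERR
    return upper < LOG_PROPERTY_LIMIT

feed = {"SiO2": 30.0, "B2O3": 8.0, "Na2O": 9.0, "Li2O": 4.0}
print(feed_acceptable(feed))
```

    The design point is that the acceptance decision is made on the feed composition, before vitrification, rather than by sampling the poured glass afterwards.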

  12. Modeling benthic–pelagic nutrient exchange processes and porewater distributions in a seasonally hypoxic sediment: evidence for massive phosphate release by Beggiatoa?

    Directory of Open Access Journals (Sweden)

    K. Wallmann

    2013-02-01

    Full Text Available This study presents benthic data from 12 samplings from February to December 2010 in a 28 m deep channel in the southwest Baltic Sea. In winter, the distribution of solutes in the porewater was strongly modulated by bioirrigation, which efficiently flushed the upper 10 cm of sediment, leading to concentrations which varied little from bottom water values. Solute pumping by bioirrigation fell sharply in the summer as the bottom waters became severely hypoxic (O2 < 2). At this point the giant sulfide-oxidizing bacteria Beggiatoa were visible on surface sediments. Despite an increase in O2 following mixing of the water column in November, macrofauna remained absent until the end of the sampling. Contrary to expectations, metabolites such as dissolved inorganic carbon, ammonium and hydrogen sulfide did not accumulate in the upper 10 cm during the hypoxic period when bioirrigation was absent, but instead tended toward bottom water values. This was taken as evidence for episodic bubbling of methane gas out of the sediment acting as an abiogenic irrigation process. Porewater-seawater mixing by escaping bubbles provides a pathway for enhanced nutrient release to the bottom water and may exacerbate the feedback with hypoxia. Subsurface dissolved phosphate (TPO4) peaks in excess of 400 μM developed in autumn, resulting in a very large diffusive TPO4 flux to the water column of 0.7 ± 0.2 mmol m−2 d−1. The model was not able to simulate this TPO4 source as release of iron-bound P (Fe–P) or organic P. As an alternative hypothesis, the TPO4 peak was reproduced using new kinetic expressions that allow Beggiatoa to take up porewater TPO4 and accumulate an intracellular P pool during periods with oxic bottom waters. TPO4 is then released during hypoxia, as previously published results with sulfide-oxidizing bacteria indicate. The TPO4 added to the porewater over the year as organic P and Fe–P is recycled through Beggiatoa, meaning that no additional source of
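
    The alternative hypothesis above can be sketched as a two-state kinetic toggle: the intracellular P pool charges while the bottom water is oxic and discharges back to porewater TPO4 during hypoxia. A minimal illustration in which the rate constants, hypoxia threshold, and oxygen time series are invented, not the paper's calibrated expressions:

```python
# Hedged sketch of a Beggiatoa intracellular P pool that takes up porewater
# phosphate while bottom water is oxic and releases it during hypoxia.
# Rate constants, units, and the oxygen time series are invented.
DT = 1.0          # time step (days)
K_UPTAKE = 0.05   # 1/day, uptake rate while oxic (invented)
K_RELEASE = 0.20  # 1/day, release rate while hypoxic (invented)
O2_HYPOXIC = 2.0  # hypoxia threshold for bottom-water O2 (invented units)

def step(tpo4, pool, o2):
    """Advance porewater TPO4 and the intracellular pool by one time step."""
    if o2 > O2_HYPOXIC:                 # oxic: Beggiatoa accumulates P
        flux = K_UPTAKE * tpo4 * DT
    else:                               # hypoxic: the stored P is released
        flux = -K_RELEASE * pool * DT
    return tpo4 - flux, pool + flux

tpo4, pool = 50.0, 0.0                  # porewater TPO4 (uM) and stored pool
for day in range(365):
    o2 = 6.0 if day < 180 else 1.0      # oxic half-year, then hypoxia
    tpo4, pool = step(tpo4, pool, o2)
print(round(tpo4, 1), round(pool, 1))
```

    Because the scheme only shuttles phosphate between the two reservoirs, the annual TPO4 peak appears without any additional external P source, which is the point of the hypothesis.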

  13. Assessment of impacts at the advanced test reactor as a result of chemical releases at the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Rood, A.S.

    1991-02-01

    This report provides an assessment of potential impacts at the Advanced Test Reactor Facility (ATR) resulting from an accidental chemical spill at the Idaho Chemical Processing Plant (ICPP). Spills postulated to occur at the Lincoln Blvd turnoff to ICPP were also evaluated. Peak and time-weighted-average concentrations were calculated for receptors at the ATR facility and the Test Reactor Area guard station at a height of 1.0 m above ground level. Calculated concentrations were then compared to the 15-minute-averaged Threshold Limit Value - Short Term Exposure Limit (TLV-STEL) and the 30-minute-averaged Immediately Dangerous to Life and Health (IDLH) limit. Several different methodologies were used to estimate source strength and dispersion. Fifteen-minute time-weighted-average concentrations of hydrofluoric acid and anhydrous ammonia exceeded TLV-STEL values for the cases considered. The IDLH values for these chemicals were not exceeded. Calculated concentrations of ammonium hydroxide, hexone, nitric acid, propane, gasoline, chlorine and liquid nitrogen were all below the TLV-STEL values.

  14. Development and validation of the social information processing application: a Web-based measure of social information processing patterns in elementary school-age boys.

    Science.gov (United States)

    Kupersmidt, Janis B; Stelter, Rebecca; Dodge, Kenneth A

    2011-12-01

    The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and ethnically diverse sample of 244 boys ages 8 through 12 (M = 9.4) from public elementary schools in 3 states. The SIP-AP includes 8 videotaped vignettes, filmed from the first-person perspective, that depict common misunderstandings among boys. Each vignette shows a negative outcome for the victim and ambiguous intent on the part of the perpetrator. Boys responded to 16 Web-based questions representing the 5 social information processing mechanisms, after viewing each vignette. Parents and teachers completed measures assessing boys' antisocial behavior. Confirmatory factor analyses revealed that a model positing the original 5 cognitive mechanisms fit the data well when the items representing prosocial cognitions were included on their own factor, creating a 6th factor. The internal consistencies for each of the 16 individual cognitions as well as for the 6 cognitive mechanism scales were excellent. Boys with elevated scores on 5 of the 6 cognitive mechanisms exhibited more antisocial behavior than boys whose scores were not elevated. These findings highlight the need for further research on the measurement of prosocial cognitions or cognitive strengths in boys in addition to assessing cognitive deficits. Findings suggest that the SIP-AP is a reliable and valid tool for use in future research of social information processing skills in boys.
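
    The internal-consistency results reported above rest on a statistic such as Cronbach's alpha, which can be computed directly from an item-response matrix. A minimal sketch on random correlated data, not the SIP-AP dataset:

```python
import numpy as np

# Hedged sketch of an internal-consistency check: Cronbach's alpha for a
# scale computed from item-level responses. The 5-item response matrix is
# random illustrative data, not the SIP-AP dataset.

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))                 # shared latent trait
items = latent + 0.5 * rng.normal(size=(100, 5))   # 5 correlated items
print(round(cronbach_alpha(items), 2))
```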

  15. Nuclide Release Behavior from a Repository for a Pyro-process HLW and SF due to Variation of the MWCF Properties

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Hwang, Yong Soo

    2009-01-01

    An assessment program for an optional evaluation of a repository both for disposal of such high-level wastes (HLWs) from various steps of pyro-processes of PWR spent nuclear fuel (SF) and for direct disposal of PWR and CANDU SFs has been developed by utilizing general purpose GoldSim developing tool, by which nuclide transports in the near- and far-field of a repository as well as a transport through a biosphere under various natural and manmade disruptive events affecting a nuclide release could be modeled and evaluated. KAERI has been in charge of modeling and developing assessment tools by which the above mentioned repository system could be assessed in accordance with various features, events, and processes (FEPs) that could happen in and around the repository system. To cope with such various natural and manmade disruptive FEPs as well as normal release scenarios, all the possible cases in view of the Korean circumstances should be modeled and have been evaluated even though we have not yet have any repository. A possible case, among many others, with the variation of such physical properties as the fracture width and the rock matrix diffusion depth, associated with the natural fractures in the geological rock media, along which nuclide could be transported preferentially with the flow of groundwater is considered in the current study. Due to whatever the reason, such as e,g., the earthquake or human intrusion, it is assumed that the physical properties of the major water conducting fault (MWCF) is changed resulting in the size of fracture width and the matrix diffusion depth. For such case another illustration is made for probabilistic evaluation of a hypothetical Korean HLW repository, as similarly done in the previous studies

  16. Validation of new 3D post processing algorithm for improved maximum intensity projections of MR angiography acquisitions in the brain

    Energy Technology Data Exchange (ETDEWEB)

    Bosmans, H; Verbeeck, R; Vandermeulen, D; Suetens, P; Wilms, G; Maaly, M; Marchal, G; Baert, A L [Louvain Univ. (Belgium)

    1995-12-01

    The objective of this study was to validate a new post processing algorithm for improved maximum intensity projections (mip) of intracranial MR angiography acquisitions. The core of the post processing procedure is a new brain segmentation algorithm. Two seed areas, background and brain, are automatically detected. A 3D region grower then grows both regions towards each other, preferentially towards white regions. In this way, the skin gets included in the final 'background region' whereas cortical blood vessels and all brain tissues are included in the 'brain region'. The latter region is then used for mip. The algorithm runs in less than 30 minutes on a full dataset on a Unix workstation. Images from different acquisition strategies, including multiple overlapping thin slab acquisition, magnetization transfer (MT) MRA, Gd-DTPA enhanced MRA, normal and high resolution acquisitions, and acquisitions from mid field and high field systems, were filtered. A series of contrast enhanced MRA acquisitions obtained with identical parameters was filtered to study the robustness of the filter parameters. In all cases, only minimal manual interaction was necessary to segment the brain. The quality of the mip was significantly improved, especially in post Gd-DTPA acquisitions or those using MT, due to the absence of high intensity signals from skin, sinuses and eyes that otherwise superimpose on the angiograms. It is concluded that the filter is a robust technique for improving the quality of MR angiograms.
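
    The two-stage procedure above (brain mask, then projection) can be sketched with a masked maximum intensity projection. A minimal illustration in which a plain boolean mask stands in for the seeded 3D region grower, on a synthetic volume rather than MRA data:

```python
import numpy as np

# Hedged sketch: a crude brain mask (here a simple boolean stand-in for the
# seeded 3D region grower) followed by a maximum intensity projection
# restricted to the masked voxels. The volume is synthetic, not MRA data.

def masked_mip(volume, mask, axis=0):
    """MIP over `axis`, ignoring voxels outside the brain mask."""
    suppressed = np.where(mask, volume, volume.min())
    return suppressed.max(axis=axis)

vol = np.zeros((4, 4, 4))
vol[0, 0, 0] = 100.0          # bright voxel outside the mask (e.g. skin)
vol[2, 2, 2] = 80.0           # bright voxel inside the mask (e.g. vessel)
mask = np.zeros_like(vol, dtype=bool)
mask[1:3, 1:3, 1:3] = True    # stand-in for the grown "brain region"

print(masked_mip(vol, mask).max())   # the skin voxel no longer dominates
```

    Without the mask, the bright skin-like voxel would win the projection; restricting the MIP to the segmented region is exactly what removes the superimposed skin, sinus, and eye signals described above.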

  17. Validation of new 3D post processing algorithm for improved maximum intensity projections of MR angiography acquisitions in the brain

    International Nuclear Information System (INIS)

    Bosmans, H.; Verbeeck, R.; Vandermeulen, D.; Suetens, P.; Wilms, G.; Maaly, M.; Marchal, G.; Baert, A.L.

    1995-01-01

    The objective of this study was to validate a new post processing algorithm for improved maximum intensity projections (mip) of intracranial MR angiography acquisitions. The core of the post processing procedure is a new brain segmentation algorithm. Two seed areas, background and brain, are automatically detected. A 3D region grower then grows both regions towards each other, preferentially towards white regions. In this way, the skin gets included in the final 'background region' whereas cortical blood vessels and all brain tissues are included in the 'brain region'. The latter region is then used for mip. The algorithm runs in less than 30 minutes on a full dataset on a Unix workstation. Images from different acquisition strategies, including multiple overlapping thin slab acquisition, magnetization transfer (MT) MRA, Gd-DTPA enhanced MRA, normal and high resolution acquisitions, and acquisitions from mid field and high field systems, were filtered. A series of contrast enhanced MRA acquisitions obtained with identical parameters was filtered to study the robustness of the filter parameters. In all cases, only minimal manual interaction was necessary to segment the brain. The quality of the mip was significantly improved, especially in post Gd-DTPA acquisitions or those using MT, due to the absence of high intensity signals from skin, sinuses and eyes that otherwise superimpose on the angiograms. It is concluded that the filter is a robust technique for improving the quality of MR angiograms.

  18. Assessment of Social Information Processing in early childhood: development and initial validation of the Schultz Test of Emotion Processing-Preliminary Version.

    Science.gov (United States)

    Schultz, David; Ambike, Archana; Logie, Sean Kevin; Bohner, Katherine E; Stapleton, Laura M; Vanderwalde, Holly; Min, Christopher B; Betkowski, Jennifer A

    2010-07-01

    Crick and Dodge's (Psychological Bulletin 115:74-101, 1994) social information processing model has proven very useful in guiding research focused on aggressive and peer-rejected children's social-cognitive functioning. Its application to early childhood, however, has been much more limited. The present study responds to this gap by developing and validating a video-based assessment tool appropriate for early childhood, the Schultz Test of Emotion Processing-Preliminary Version (STEP-P). One hundred twenty-five Head Start preschool children participated in the study. More socially competent children more frequently attributed sadness to the victims of provocation and labeled aggressive behaviors as both morally unacceptable and less likely to lead to positive outcomes. More socially competent girls labeled others' emotions more accurately. More disruptive children more frequently produced physically aggressive solutions to social provocations, and more disruptive boys less frequently interpreted social provocations as accidental. The STEP-P holds promise as an assessment tool that assesses knowledge structures related to the SIP model in early childhood.

  19. Structural and morphological studies on poly(3-hydroxybutyrate acid) (PHB)/chitosan drug releasing microspheres prepared by both single and double emulsion processes

    Energy Technology Data Exchange (ETDEWEB)

    Shih, W.-J. [Department of Materials Science and Engineering, National Cheng Kung University, 1 Ta-Hsueh Road, Tainan 70101, Taiwan (China); Chen, Y.-H. [Department of Mechanical Engineering, National Kaohsiung University of Applied Sciences, 415 Chien-kung Road, Kaohsiung 80782, Taiwan (China); Shih, C.-J. [Faculty of Fragrance and Cosmetics, Kaohsiung Medical University, No. 100, Shih-Chuang 1st Rd., Sanmin District, Kaohsiung 80708, Taiwan (China); Hon, M.-H. [Department of Materials Science and Engineering, National Cheng Kung University, 1 Ta-Hsueh Road, Tainan 70101, Taiwan (China); Dayeh University, 112 Shan-Jiau Road, Da-Tsuen, Changhua 515, Taiwan (China); Wang, M.-C. [Department of Mechanical Engineering, National Kaohsiung University of Applied Sciences, 415 Chien-kung Road, Kaohsiung 80782, Taiwan (China) and Department of Materials Science and Engineering, National United University, 1 Lien-Da Road, Kung-ching Li, Miao Li 360, Taiwan (China)]. E-mail: mcwang@cc.kuas.edu.tw

    2007-05-31

    Drug-releasing microspheres of poly(3-hydroxybutyric acid)/chitosan (PHB/CTS) with various compositions have been synthesized by both single and double emulsion methods and collected by a freeze-drying process. In this study, gentamicin was used as an antibacterial medicine coated with PHB. The PHB/CTS microspheres of various compositions prepared by a single emulsion process (SEP) were identified by X-ray diffraction (XRD) and FT-IR as consisting of a major PHB phase together with a minor unknown phase (Phase X). However, in the microspheres prepared using a double emulsion process (DEP), Phase X was dominant and PHB was the minor phase. The size of the PHB/CTS microspheres prepared by SEP increased with the PHB/CTS ratio, from 1 μm at 1:1 to 2 μm at 5:1. However, the size of the PHB/CTS microspheres prepared by DEP decreased with the PHB/CTS ratio, from 1 μm at 1:1 to 800 nm at 5:1.

  20. Structural and morphological studies on poly(3-hydroxybutyrate acid) (PHB)/chitosan drug releasing microspheres prepared by both single and double emulsion processes

    International Nuclear Information System (INIS)

    Shih, W.-J.; Chen, Y.-H.; Shih, C.-J.; Hon, M.-H.; Wang, M.-C.

    2007-01-01

    Drug-releasing microspheres of poly(3-hydroxybutyric acid)/chitosan (PHB/CTS) with various compositions have been synthesized by both single and double emulsion methods and collected by a freeze-drying process. In this study, gentamicin was used as an antibacterial medicine coated with PHB. The PHB/CTS microspheres of various compositions prepared by a single emulsion process (SEP) were identified by X-ray diffraction (XRD) and FT-IR as consisting of a major PHB phase together with a minor unknown phase (Phase X). However, in the microspheres prepared using a double emulsion process (DEP), Phase X was dominant and PHB was the minor phase. The size of the PHB/CTS microspheres prepared by SEP increased with the PHB/CTS ratio, from 1 μm at 1:1 to 2 μm at 5:1. However, the size of the PHB/CTS microspheres prepared by DEP decreased with the PHB/CTS ratio, from 1 μm at 1:1 to 800 nm at 5:1.

  1. Flash release an alternative for releasing complex MEMS devices

    NARCIS (Netherlands)

    Deladi, S.; Krijnen, Gijsbertus J.M.; Elwenspoek, Michael Curt

    2004-01-01

    A novel time-saving and cost-effective release technique has been developed and is described. The physical nature of the process is explained in combination with experimental observations. The results of the flash release process are compared with those of freeze-drying and supercritical CO2 drying.

  2. Investigating the feasibility of temperature-controlled accelerated drug release testing for an intravaginal ring.

    Science.gov (United States)

    Externbrink, Anna; Clark, Meredith R; Friend, David R; Klein, Sandra

    2013-11-01

    The objective of the present study was to investigate whether temperature can be utilized to accelerate drug release from Nuvaring®, a reservoir-type intravaginal ring based on polyethylene vinyl acetate copolymer that releases a constant dose of contraceptive steroids over a duration of 3 weeks. The reciprocating holder apparatus (USP 7) was utilized to determine real-time and accelerated etonogestrel release from ring segments. It was demonstrated that drug release increased with increasing temperature, which can be attributed to enhanced drug diffusion. An Arrhenius relationship of the zero-order release constants was established, indicating that temperature is a valid parameter to accelerate drug release from this dosage form and that the release mechanism is maintained under these accelerated test conditions. Accelerated release tests are particularly useful for routine quality control to assist during batch release of extended release formulations that typically release the active over several weeks, months or even years, since they can increase the product shelf life. The accelerated method should therefore be able to discriminate between formulations with different release characteristics that can result from normal manufacturing variance. In the case of Nuvaring®, it is well known that the process parameters during the extrusion process strongly influence the polymeric structure. These changes in the polymeric structure can affect the permeability which, in turn, is reflected in the release properties. Results from this study indicate that changes in the polymeric structure can lead to a different temperature dependence of the release rate, and as a consequence, the accelerated method can become less sensitive to detect changes in the release properties. When the accelerated method is utilized during batch release, it is therefore important to take this possible restriction into account and to evaluate the accelerated method with samples from non
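
    The Arrhenius extrapolation described above can be sketched numerically. This is a hypothetical illustration only: the release constants below are invented, not measured Nuvaring® data, and serve merely to show how zero-order constants fitted at elevated temperatures are extrapolated back to body temperature.

```python
# Hypothetical sketch of the Arrhenius treatment of zero-order release
# constants: ln k = ln A - Ea/(R*T), fitted at accelerated temperatures
# and extrapolated to 37 degC. All k values are invented for illustration.
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

temps_c = np.array([45.0, 50.0, 55.0, 60.0])  # accelerated temperatures, deg C
k_obs = np.array([25.0, 41.0, 66.0, 103.0])   # fitted zero-order constants, ug/day

inv_T = 1.0 / (temps_c + 273.15)              # 1/K
slope, intercept = np.polyfit(inv_T, np.log(k_obs), 1)

Ea = -slope * R                                # apparent activation energy, J/mol
k_37 = float(np.exp(intercept + slope / (37.0 + 273.15)))  # real-time estimate

print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, predicted k(37 degC) ~ {k_37:.1f} ug/day")
```

    A linear fit of ln k versus 1/T is valid only if the release mechanism is unchanged at the elevated temperatures, which is exactly the condition the abstract says can break down when the polymeric structure varies.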

  3. On the selection and validation of biological treatment processes. The GDF experience; Le choix et la validation des procédés de traitement biologique. L'expérience de GDF

    Energy Technology Data Exchange (ETDEWEB)

    Druelle, V. [Gaz de France (GDF), 75 - Paris (France)]

    1996-12-31

    The biological treatment process was selected by Gaz de France (GDF), the French national gas utility, for the de-pollution of an old gas works where the main pollutants are coal tars containing polycyclic aromatic hydrocarbons. Microorganism-based biological treatment techniques may involve bio-reactors, static ground knolls (where oxygen is brought through drains) and dynamic knolls (where oxygenation is carried out by turning up the soil). Issues of sampling, sorting, process testing, site preparation, process control, etc. are reviewed.

  4. On the selection and validation of biological treatment processes. The GDF experience; Le choix et la validation des procédés de traitement biologique. L'expérience de GDF

    Energy Technology Data Exchange (ETDEWEB)

    Druelle, V. [Gaz de France (GDF), 75 - Paris (France)]

    1997-12-31

    The biological treatment process was selected by Gaz de France (GDF), the French national gas utility, for the de-pollution of an old gas works where the main pollutants are coal tars containing polycyclic aromatic hydrocarbons. Microorganism-based biological treatment techniques may involve bio-reactors, static ground knolls (where oxygen is brought through drains) and dynamic knolls (where oxygenation is carried out by turning up the soil). Issues of sampling, sorting, process testing, site preparation, process control, etc. are reviewed.

  5. Application for approval of derived authorized limits for the release of the 190-C trenches and 105-C process water tunnels at the Hanford Site: Volume 2 - source term development

    International Nuclear Information System (INIS)

    Denham, D.H.; Winslow, S.L.; Moeller, M.P.; Kennedy, W.E. Jr.

    1997-03-01

    As part of environmental restoration activities at the Hanford Site, Bechtel Hanford, Inc. is conducting a series of evaluations to determine appropriate release conditions for specific facilities following the completion of decontamination and decommissioning projects. The release conditions, with respect to the residual volumetric radioactive contamination, are termed authorized limits. This report presents the summary of the supporting information and the final application for approval of derived authorized limits for the release of the 190-C trenches and the 105-C process water tunnels. This document contains two volumes; this volume (Vol. 2) contains the radiological characterization data, spreadsheet analyses, and radiological source terms.

  6. Controlling the Release of Indomethacin from Glass Solutions Layered with a Rate Controlling Membrane Using Fluid-Bed Processing. Part 1: Surface and Cross-Sectional Chemical Analysis.

    Science.gov (United States)

    Dereymaker, Aswin; Scurr, David J; Steer, Elisabeth D; Roberts, Clive J; Van den Mooter, Guy

    2017-04-03

    Fluid bed coating has been shown to be a suitable manufacturing technique to formulate poorly soluble drugs in glass solutions. Layering inert carriers with a drug-polymer mixture enables these beads to be immediately filled into capsules, thus avoiding additional, potentially destabilizing, downstream processing. In this study, fluid bed coating is proposed for the production of controlled release dosage forms of glass solutions by applying a second, rate controlling membrane on top of the glass solution. Adding a second coating layer adds to the physical and chemical complexity of the drug delivery system, so a thorough understanding of the physical structure and phase behavior of the different coating layers is needed. This study aimed to investigate the surface and cross-sectional characteristics (employing scanning electron microscopy (SEM) and time-of-flight secondary ion mass spectrometry (ToF-SIMS)) of an indomethacin-polyvinylpyrrolidone (PVP) glass solution, top-coated with a release rate controlling membrane consisting of either ethyl cellulose or Eudragit RL. The implications of the addition of a pore former (PVP) and the coating medium (ethanol or water) were also considered. In addition, polymer miscibility and the phase analysis of the underlying glass solution were investigated. Significant differences in surface and cross-sectional topography of the different rate controlling membranes or the way they are applied (solution vs dispersion) were observed. These observations can be linked to the polymer miscibility differences. The presence of PVP was observed in all rate controlling membranes, even if it is not part of the coating solution. This could be attributed to residual powder presence in the coating chamber. The distribution of PVP among the sample surfaces depends on the concentration and the rate controlling polymer used. Differences can again be linked to polymer miscibility. Finally, it was shown that the underlying glass solution layer

  7. Possibilities for the Reuse of Steel from Decommissioning. Selected Scenarios in the Process of Proposal and Evaluation of Manufacturing Processes for Conditional Released Steel and their Application in General and Nuclear Industry

    International Nuclear Information System (INIS)

    Bezak, P.; Daniska, V.; Ondra, F.; Necas, V.

    2012-01-01

    Conditional release of steel from NPP decommissioning enables the controlled reuse of non-negligible volumes of steel. To propose scenarios for steel reuse, the elementary activities of the whole process must be identified and evaluated, from the conditional release of the steel through the manufacturing of various elements to the realisation of the scenarios. The reuse scenarios consider steel products such as reinforcement bars, rails, profiles and sheets for technical constructions such as bridges, tunnels and railways, which guarantee long-term properties over periods of 50-100 years. This approach also offers the possibility of using this type of steel for particular technical constructions directly usable in nuclear facilities. The paper reviews the activities involved in manufacturing various steel construction elements from conditionally released steel and their use in general industry as well as in the nuclear industry. The starting material for manufacturing steel elements can be ingots or fragments of steel dismantled in the controlled area. These input materials are re-melted in industrial facilities in order to achieve the required physical and chemical characteristics. The technique most commonly used for manufacturing steel construction elements is rolling. The products considered in the reuse scenarios include bars for reinforced concrete, rolled steel sheets and other rolled profiles. For the nuclear industry, possibilities include casting thick-walled steel containers for long-term storage of high-level radioactive components in integral storage, and assembling stainless steel tanks for storing liquid radioactive waste. Lists of the elementary activities needed for manufacturing the selected steel elements are elaborated; these elementary activities then form the basis for the detailed safety evaluation of external

  8. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    International Nuclear Information System (INIS)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequences failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  9. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    Energy Technology Data Exchange (ETDEWEB)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequences failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.
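
    The event-tree quantification step described in the two records above can be illustrated with a toy calculation: fragility-derived failure probabilities for each top event are combined into accident-sequence probabilities by multiplying along the branch paths. The top events and probabilities below are invented for illustration and bear no relation to the actual PREPP model.

```python
# Toy event-tree quantification sketch: each top event either succeeds or
# fails, and a sequence probability is the product of the per-event
# probabilities along its branch. Events and numbers are hypothetical.
from itertools import product

top_events = {"confinement": 0.05, "filtration": 0.10, "fire_suppression": 0.02}

sequences = {}
for outcome in product([False, True], repeat=len(top_events)):  # True = fails
    p = 1.0
    label = []
    for (name, pf), failed in zip(top_events.items(), outcome):
        p *= pf if failed else (1.0 - pf)
        label.append(f"{name}:{'F' if failed else 'OK'}")
    sequences[" / ".join(label)] = p

total = sum(sequences.values())  # branch probabilities must sum to 1
p_all_fail = sequences["confinement:F / filtration:F / fire_suppression:F"]
print(f"sum over sequences = {total:.6f}, P(all fail) = {p_all_fail:.2e}")
```

    Real PRA tools also prune dependent events and apply conditional split fractions, but the sum-to-one check above is a useful sanity test in any quantification.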

  10. Approaches to learning for the ANZCA Final Examination and validation of the revised Study Process Questionnaire in specialist medical training.

    Science.gov (United States)

    Weller, J M; Henning, M; Civil, N; Lavery, L; Boyd, M J; Jolly, B

    2013-09-01

    When evaluating assessments, the impact on learning is often overlooked. Approaches to learning can be deep, surface and strategic. To provide insights into exam quality, we investigated the learning approaches taken by trainees preparing for the Australian and New Zealand College of Anaesthetists (ANZCA) Final Exam. The revised two-factor Study Process Questionnaire (R-SPQ-2F) was modified and validated for this context and was administered to ANZCA advanced trainees. Additional questions were asked about perceived value for anaesthetic practice, study time and approaches to learning for each exam component. Overall, 236 of 690 trainees responded (34%). Responses indicated both deep and surface approaches to learning with a clear preponderance of deep approaches. The anaesthetic viva was valued most highly and the multiple choice question component the least. Despite this, respondents spent the most time studying for the multiple choice questions. The traditionally low short answer questions pass rate could not be explained by limited study time, perceived lack of value or study approaches. Written responses suggested that preparation for multiple choice questions was characterised by a surface approach, with rote memorisation of past questions. Minimal reference was made to the ANZCA syllabus as a guide for learning. These findings indicate that, although trainees found the exam generally relevant to practice and adopted predominantly deep learning approaches, there was considerable variation between the four components. These results provide data with which to review the existing ANZCA Final Exam and comparative data for future studies of the revisions to the ANZCA curriculum and exam process.

  11. Validation and Demonstration of the NOAA Unique Combined Atmospheric Processing System (NUCAPS) in Support of User Applications

    Science.gov (United States)

    Nalli, N. R.; Gambacorta, A.; Tan, C.; Iturbide, F.; Barnet, C. D.; Reale, A.; Sun, B.; Liu, Q.

    2017-12-01

    This presentation overviews the performance of the operational SNPP NOAA Unique Combined Atmospheric Processing System (NUCAPS) environmental data record (EDR) products. The SNPP Cross-track Infrared Sounder and Advanced Technology Microwave Sounder (CrIS/ATMS) suite, the first of the Joint Polar Satellite System (JPSS) Program, is one of NOAA's major investments in our nation's future operational environmental observation capability. The NUCAPS algorithm is a world-class NOAA-operational IR/MW retrieval algorithm based upon the well-established AIRS science team algorithm for deriving temperature, moisture, ozone and carbon trace gas products, providing users with state-of-the-art EDRs. Operational use of the products includes the NOAA National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS), along with numerous science-user applications. NUCAPS EDR product assessments are made with reference to JPSS Level 1 global requirements, which provide the definitive metrics for assessing that the products have minimally met predefined global performance specifications. The NESDIS/STAR NUCAPS development and validation team recently delivered the Phase 4 algorithm, which incorporated critical updates necessary for compatibility with full spectral-resolution (FSR) CrIS sensor data records (SDRs). Based on comprehensive analyses, the NUCAPS Phase 4 CrIS-FSR temperature, moisture and ozone profile EDRs, as well as the carbon trace gas EDRs (CO, CH4 and CO2), are shown to be meeting or close to meeting the JPSS program global requirements. Regional and temporal assessments of interest to EDR users (e.g., AWIPS) will also be presented.

  12. Using Healthcare Failure Mode and Effect Analysis to reduce medication errors in the process of drug prescription, validation and dispensing in hospitalised patients.

    Science.gov (United States)

    Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa

    2013-01-01

    To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process times 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.

  13. Transient simulation of an endothermic chemical process facility coupled to a high temperature reactor: Model development and validation

    International Nuclear Information System (INIS)

    Brown, Nicholas R.; Seker, Volkan; Revankar, Shripad T.; Downar, Thomas J.

    2012-01-01

    Highlights: ► Models for PBMR and thermochemical sulfur cycle based hydrogen plant are developed. ► Models are validated against available data in literature. ► Transient in coupled reactor and hydrogen plant system is studied. ► For loss-of-heat sink accident, temperature feedback within the reactor core enables shut down of the