WorldWideScience

Sample records for modeling tool nemo

  1. Spectral modeling of scintillator for the NEMO-3 and SuperNEMO detectors

    Energy Technology Data Exchange (ETDEWEB)

    Argyriades, J. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Arnold, R. [IPHC, Universite de Strasbourg, CNRS/IN2P3, F-67037 Strasbourg (France); Augier, C. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Baker, J. [INL, Idaho Falls, 83415 (United States); Barabash, A.S. [Institute of Theoretical and Experimental Physics, 117259 Moscow (Russian Federation); Bongrand, M.; Broudin-Bay, G. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Brudanin, V.B. [Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation); Caffrey, A.J. [INL, Idaho Falls, 83415 (United States); Cebrian, S. [University of Zaragoza, C/ Pedro Cerbuna 12, 50009 Zaragoza (Spain); Chapon, A. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Chauveau, E. [CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR 5797, F-33175 Gradignan (France); Universite de Bordeaux, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR 5797, F-33175 Gradignan (France); Dafni, Th. [University of Zaragoza, C/ Pedro Cerbuna 12, 50009 Zaragoza (Spain); Daraktchieva, Z. [University College London, WC1E 6BT London (United Kingdom); Diaz, J. [IFIC, CSIC - Universidad de Valencia, Valencia (Spain); Durand, D. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Egorov, V.G. [Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation); Evans, J.J. [University College London, WC1E 6BT London (United Kingdom); Fatemi-Ghomi, N. [University of Manchester, M13 9PL Manchester (United Kingdom); Flack, R. [University College London, WC1E 6BT London (United Kingdom)

    2011-01-01

    We have constructed a GEANT4-based detailed software model of photon transport in plastic scintillator blocks and have used it to study the NEMO-3 and SuperNEMO calorimeters employed in experiments designed to search for neutrinoless double beta decay. We compare our simulations to measurements using conversion electrons from a calibration source of {sup 207}Bi and show that the agreement is improved if wavelength-dependent properties of the calorimeter are taken into account. In this article, we briefly describe our modeling approach and results of our studies.

  2. NEMO: A Stellar Dynamics Toolbox

    Science.gov (United States)

    Barnes, Joshua; Hut, Piet; Teuben, Peter

    2010-10-01

    NEMO is an extendible Stellar Dynamics Toolbox that follows an open-source software model. It provides programs to create, integrate, analyze and visualize N-body and SPH-like systems, following a pipe-and-filter architecture. In addition, there are various tools to operate on images, tables and orbits, including FITS support to export/import data to/from other astronomical data reduction packages. A large and growing fraction of NEMO has been contributed by a long list of authors. The source code consists of a little over 4000 files and a little under 1,000,000 lines of code and documentation, mostly C, with some C++ and Fortran. NEMO development was started in 1986 in Princeton (USA) by Barnes, Hut and Teuben. See also ZENO (ascl:1102.027) for the version that Barnes maintains.
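
    The pipe-and-filter architecture mentioned above chains small single-purpose programs, each consuming the previous stage's output stream. As a rough illustration of the pattern only (not NEMO's actual API; the stage names below are hypothetical), a minimal Python sketch:

```python
# Minimal pipe-and-filter sketch: each stage is a generator that
# consumes the previous stage's stream, mirroring how a toolbox of
# small programs can be chained. All stage names are hypothetical.
def make_particles(n):
    """'Create' stage: emit n particles with index-based positions."""
    for i in range(n):
        yield {"id": i, "x": float(i)}

def shift(stream, dx):
    """'Transform' stage: translate every particle by dx."""
    for p in stream:
        yield {**p, "x": p["x"] + dx}

def select(stream, xmin):
    """'Analyze' stage: keep only particles with x >= xmin."""
    return (p for p in stream if p["x"] >= xmin)

# Compose the stages exactly as a shell pipeline would.
pipeline = select(shift(make_particles(5), dx=10.0), xmin=12.0)
result = [p["x"] for p in pipeline]
print(result)  # [12.0, 13.0, 14.0]
```

    Each stage stays independent and testable, which is the property that lets a growing list of contributors add filters without touching the rest of the toolbox.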

  3. Development of quantum device simulator NEMO-VN1

    Science.gov (United States)

    Hien, Dinh Sy; Thi Luong, Nguyen; Hoang Minh, Le; Tien Phuc, Tran; Thanh Trung, Pham; Dong, Bui An; Thu Thao, Huynh Lam; Van Le Thanh, Nguyen; Tuan, Thi Tran Anh; Hoang Trung, Huynh; Thi Thanh Nhan, Nguyen; Viet Nga, Dinh

    2009-09-01

    We have developed NEMO-VN1 (NanoElectronic MOdelling), a new modelling tool that simulates a wide variety of quantum devices, including the Quantum Dot (QD), Resonant Tunneling Diode (RTD), Resonant Tunneling Transistor (RTT), Single Electron Transistor (SET), Molecular FET (MFET), Carbon Nanotube FET (CNTFET) and Spin FET (SPINFET). It offers a collection of models that allow the user to trade off calculation speed against accuracy. NEMO-VN1 also includes a Matlab-based graphical user interface that enables parameter entry, calculation control, intuitive display of calculation results, and in-situ data analysis.

  4. Development of quantum device simulator NEMO-VN1

    International Nuclear Information System (INIS)

    Dinh Sy Hien; Nguyen Thi Luong; Le Hoang Minh; Tran Tien Phuc; Pham Thanh Trung; Bui An Dong; Huynh Lam Thu Thao; Nguyen Van Le Thanh; Thi Tran Anh Tuan; Huynh Hoang Trung; Nguyen Thi Thanh Nhan; Dinh Viet Nga

    2009-01-01

    We have developed NEMO-VN1 (NanoElectronic MOdelling), a new modelling tool that simulates a wide variety of quantum devices, including the Quantum Dot (QD), Resonant Tunneling Diode (RTD), Resonant Tunneling Transistor (RTT), Single Electron Transistor (SET), Molecular FET (MFET), Carbon Nanotube FET (CNTFET) and Spin FET (SPINFET). It offers a collection of models that allow the user to trade off calculation speed against accuracy. NEMO-VN1 also includes a Matlab-based graphical user interface that enables parameter entry, calculation control, intuitive display of calculation results, and in-situ data analysis.

  5. NEMO. Netherlands Energy demand MOdel. A top-down model based on bottom-up information

    International Nuclear Information System (INIS)

    Koopmans, C.C.; Te Velde, D.W.; Groot, W.; Hendriks, J.H.A.

    1999-06-01

    The title model links energy use to other production factors, (physical) production, energy prices, technological trends and government policies. It uses a 'putty-semiputty' vintage production structure in which new investments, adaptations to existing capital goods (retrofit) and 'good housekeeping' are distinguished. Price elasticities are relatively large in the long term and small in the short term. Most predictions of energy use are based either on econometric models or on 'bottom-up' information, i.e. disaggregated lists of technical possibilities for, and costs of, saving energy. Typically, one predicts larger energy-efficiency improvements using bottom-up information than using econometric ('top-down') models. We bridged this so-called 'energy-efficiency gap' by designing our macro/meso model NEMO in such a way that we can use bottom-up (micro) information to estimate most model parameters. In our view, reflected in NEMO, the energy-efficiency gap arises for two reasons. The first is that firms and households use a fairly high discount rate of 15% when evaluating the profitability of energy-efficiency improvements. The second is that our bottom-up information ('ICARUS') for most economic sectors does not (as NEMO does) take account of the fact that implementation of new, energy-efficient technology in the capital stock takes place only gradually. Parameter estimates for 19 sectors point to a long-term technological energy-efficiency improvement trend in Netherlands final energy use of 0.8% per year. The long-term price elasticity is estimated to be 0.29. These values are comparable to other studies based on time-series data. Simulations of the effects of the oil price shocks of the seventies and the subsequent fall of oil prices show that NEMO's price elasticities are consistent with historical data. However, the present pace at which new technologies become available (reflected in NEMO) appears to be lower than in the seventies and eighties. This suggests that it
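
    The 15% discount rate cited above explains part of the efficiency gap: a measure that looks profitable at a low social discount rate can be rejected by firms discounting at 15%. A hedged numerical illustration (the investment cost, savings and lifetime below are invented for the example, not taken from NEMO or ICARUS):

```python
def npv(investment, annual_saving, lifetime, rate):
    """Net present value of an efficiency measure: an upfront cost
    followed by a stream of constant annual energy-cost savings."""
    return -investment + sum(
        annual_saving / (1 + rate) ** t for t in range(1, lifetime + 1)
    )

# Hypothetical measure: 1000 upfront, saving 180 per year for 10 years.
print(npv(1000, 180, 10, 0.05))  # positive at a 5% social discount rate
print(npv(1000, 180, 10, 0.15))  # negative at the firms' 15% rate
```

    The same cash flows flip sign between the two rates, so the stock of "profitable" bottom-up measures shrinks considerably once the behavioral discount rate is used.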

  6. Surface wave effects in the NEMO ocean model: Forced and coupled experiments

    Science.gov (United States)

    Breivik, Øyvind; Mogensen, Kristian; Bidlot, Jean-Raymond; Balmaseda, Magdalena Alonso; Janssen, Peter A. E. M.

    2015-04-01

    The NEMO general circulation ocean model is extended to incorporate three physical processes related to ocean surface waves, namely the surface stress (modified by growth and dissipation of the oceanic wavefield), the turbulent kinetic energy flux from breaking waves, and the Stokes-Coriolis force. Experiments are done with NEMO in ocean-only (forced) mode and coupled to the ECMWF atmospheric and wave models. Ocean-only integrations are forced with fields from the ERA-Interim reanalysis. All three effects are noticeable in the extratropics, but the sea-state-dependent turbulent kinetic energy flux yields by far the largest difference. This is partly because the control run has too vigorous deep mixing due to an empirical mixing term in NEMO. We investigate the relation between this ad hoc mixing and Langmuir turbulence and find that it is much more effective than the Langmuir parameterization used in NEMO. The biases in sea surface temperature as well as subsurface temperature are reduced, and the total ocean heat content exhibits a trend closer to that observed in a recent ocean reanalysis (ORAS4) when wave effects are included. Seasonal integrations of the coupled atmosphere-wave-ocean model consisting of NEMO, the wave model ECWAM, and the atmospheric model of ECMWF similarly show that the sea surface temperature biases are greatly reduced when the mixing is controlled by the sea state and properly weighted by the thickness of the uppermost level of the ocean model. These wave-related physical processes were recently implemented in the operational coupled ensemble forecast system of ECMWF.
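
    The Stokes-Coriolis force incorporated above depends on the Stokes drift profile; for a deep-water monochromatic wave the drift decays exponentially with depth, u_s(z) = u_s(0)·exp(2kz). A minimal sketch of that textbook profile (the surface drift and wavelength values are illustrative, not from the paper):

```python
import math

def stokes_drift(z, u_s0, k):
    """Deep-water monochromatic Stokes drift at depth z (z <= 0, m):
    u_s(z) = u_s0 * exp(2 k z)."""
    return u_s0 * math.exp(2.0 * k * z)

# Illustrative values: 10 cm/s surface drift, 100 m wavelength.
k = 2.0 * math.pi / 100.0  # wavenumber (1/m)
print(stokes_drift(0.0, 0.10, k))    # full drift at the surface
print(stokes_drift(-10.0, 0.10, k))  # strongly decayed 10 m down
```

    The rapid decay with depth is why the weighting by the thickness of the uppermost model level, mentioned at the end of the abstract, matters for the coupled results.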

  7. A comparative signaling cost analysis of Macro Mobility scheme in NEMO (MM-NEMO) with mobility management protocol

    Science.gov (United States)

    Islam, Shayla; Abdalla, Aisha H.; Habaebi, Mohamed H.; Latif, Suhaimi A.; Hassan, Wan H.; Hasan, Mohammad K.; Ramli, H. A. M.; Khalifa, Othman O.

    2013-12-01

    NEMO BSP is an extension of Mobile IPv6 (MIPv6). Since MIPv6 and its enhancements (e.g. HMIPv6) suffer from limitations such as high handoff latency and packet loss, NEMO BSP inherits these shortcomings. Network Mobility (NEMO) handles the movement of a Mobile Router (MR) and its Mobile Network Nodes (MNNs) during handoff. Hence it is essential to improve the performance of the mobility management protocol to obtain continuous session connectivity with lower delay and packet loss in a NEMO environment. Completing the handoff process in NEMO BSP usually takes a long time, since the MR needs to register its single primary care-of address (CoA) with the home network, which may degrade the performance of applications running on the Mobile Network Nodes. Moreover, when a change in the point of attachment of the mobile network is accompanied by a sudden burst of signaling messages, a "Signaling Storm" occurs, which eventually results in temporary congestion, packet delays or even packet loss. This effect is particularly significant in wireless environments, where links are not as steady as wired links and bandwidth is relatively limited. Hence, providing uninterrupted Internet connectivity by applying multihoming techniques and route optimization mechanisms in NEMO has become a focus of current research. In this paper, we propose a handoff cost model to compare the signaling cost of MM-NEMO with the NEMO Basic Support Protocol (NEMO BSP) and HMIPv6. The numerical results show that the signaling cost of the MM-NEMO scheme is about 69.6% less than that of NEMO-BSP and HMIPv6.
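
    The 69.6% figure is a relative signaling-cost reduction. With hypothetical absolute costs (the numbers below are illustrative, not taken from the paper's cost model), the comparison reduces to:

```python
def reduction_pct(baseline, proposed):
    """Relative signaling-cost reduction of a proposed scheme
    versus a baseline scheme, in percent."""
    return 100.0 * (baseline - proposed) / baseline

# Illustrative per-handoff signaling costs (arbitrary units).
cost_nemo_bsp = 1000.0  # hypothetical baseline (NEMO BSP)
cost_mm_nemo = 304.0    # hypothetical proposed scheme (MM-NEMO)
print(reduction_pct(cost_nemo_bsp, cost_mm_nemo))  # 69.6
```

    In the paper the two costs are themselves functions of handoff rate, hop counts and message sizes; the sketch only shows how the headline percentage is formed from them.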

  8. A comparative signaling cost analysis of Macro Mobility scheme in NEMO (MM-NEMO) with mobility management protocol

    International Nuclear Information System (INIS)

    Islam, Shayla; Abdalla, Aisha H; Habaebi, Mohamed H; Latif, Suhaimi A; Hassan, Wan H; Hasan, Mohammad K; Ramli, H A M; Khalifa, Othman O

    2013-01-01

    NEMO BSP is an extension of Mobile IPv6 (MIPv6). Since MIPv6 and its enhancements (e.g. HMIPv6) suffer from limitations such as high handoff latency and packet loss, NEMO BSP inherits these shortcomings. Network Mobility (NEMO) handles the movement of a Mobile Router (MR) and its Mobile Network Nodes (MNNs) during handoff. Hence it is essential to improve the performance of the mobility management protocol to obtain continuous session connectivity with lower delay and packet loss in a NEMO environment. Completing the handoff process in NEMO BSP usually takes a long time, since the MR needs to register its single primary care-of address (CoA) with the home network, which may degrade the performance of applications running on the Mobile Network Nodes. Moreover, when a change in the point of attachment of the mobile network is accompanied by a sudden burst of signaling messages, a "Signaling Storm" occurs, which eventually results in temporary congestion, packet delays or even packet loss. This effect is particularly significant in wireless environments, where links are not as steady as wired links and bandwidth is relatively limited. Hence, providing uninterrupted Internet connectivity by applying multihoming techniques and route optimization mechanisms in NEMO has become a focus of current research. In this paper, we propose a handoff cost model to compare the signaling cost of MM-NEMO with the NEMO Basic Support Protocol (NEMO BSP) and HMIPv6. The numerical results show that the signaling cost of the MM-NEMO scheme is about 69.6% less than that of NEMO-BSP and HMIPv6.

  9. Modelling turbulent vertical mixing sensitivity using a 1-D version of NEMO

    Science.gov (United States)

    Reffray, G.; Bourdalle-Badie, R.; Calone, C.

    2015-01-01

    Through two numerical experiments, a 1-D vertical model called NEMO1D was used to investigate physical and numerical turbulent-mixing behaviour. The results show that all the turbulent closures tested (k+l from Blanke and Delecluse, 1993, and two-equation models: generic length scale closures from Umlauf and Burchard, 2003) are able to correctly reproduce the classical test of Kato and Phillips (1969) under favourable numerical conditions, while some solutions may diverge depending on the degradation of the spatial and time discretization. The performance of the turbulence models was then compared with data measured over a 1-year period (mid-2010 to mid-2011) at the PAPA station, located in the North Pacific Ocean. The modelled temperature and salinity were in good agreement with the observations, with a maximum temperature error between -2 and 2 °C during the stratified period (June to October). However, the results also depend on the numerical conditions. The vertical RMSE varied, for the different turbulent closures, from 0.1 to 0.3 °C during the stratified period and from 0.03 to 0.15 °C during the homogeneous period. This 1-D configuration at the PAPA station (called PAPA1D) is now available in NEMO as a reference configuration, including the input files and atmospheric forcing set described in this paper. Thus, all the results described can be recovered by downloading and launching PAPA1D. The configuration is described on the NEMO site (http://www.nemo-ocean.eu/Using-NEMO/Configurations/C1D_PAPA). This package is a good starting point for further investigation of vertical processes.
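
    The vertical RMSE quoted above compares modelled and observed temperature profiles level by level. A minimal sketch of the metric (the profile values are invented for illustration, not PAPA data):

```python
import math

def rmse(model, obs):
    """Root-mean-square error between modelled and observed profiles
    sampled at the same depth levels."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# Hypothetical temperature profiles (°C) at four depth levels.
model = [14.2, 12.1, 9.8, 7.5]
obs   = [14.0, 12.3, 9.6, 7.6]
print(rmse(model, obs))  # a single score per closure and period
```

    Computing this score separately for the stratified and homogeneous periods, as in the abstract, is what exposes the spread between closures (0.1-0.3 °C versus 0.03-0.15 °C).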

  10. Background constraints of the SuperNEMO experiment for neutrinoless double beta-decay searches

    Energy Technology Data Exchange (ETDEWEB)

    Povinec, Pavel P.

    2017-02-11

    The SuperNEMO experiment is a new-generation experiment dedicated to the search for neutrinoless double beta-decay, which, if observed, would confirm the existence of physics beyond the Standard Model. It is based on tracking and calorimetry techniques, which allow the reconstruction of the final-state topology, including timing and kinematics of the double beta-decay transition events, offering a powerful tool for background rejection. While the basic detection strategy of the SuperNEMO detector remains the same as that of the NEMO-3 detector, a number of improvements have been accomplished for each of the detector's main components. Upgrades of the detector technologies and the development of low-level counting techniques ensure radiopurity control of the construction parts of the SuperNEMO detector. A reference material made of glass pellets has been developed to assure quality management and quality control of radiopurity measurements. The first module of the SuperNEMO detector (the Demonstrator) is currently under construction in the Modane underground laboratory. No background event is expected in the neutrinoless double beta-decay region in 2.5 years of operation using 7 kg of {sup 82}Se. The half-life sensitivity of the Demonstrator is expected to be >6.5·10{sup 24} y, corresponding to an effective Majorana neutrino mass sensitivity of |0.2−0.4| eV (90% C.L.). The full SuperNEMO experiment, comprising 20 modules with 100 kg of {sup 82}Se source, should reach an effective Majorana neutrino mass sensitivity of |0.04−0.1| eV and a half-life limit of 1·10{sup 26} y. - Highlights: • The SuperNEMO detector for 2β0ν-decay of {sup 82}Se should reach a half-life limit of 10{sup 26} y. • The radiopurity of the SuperNEMO internal detector parts was checked down to 0.1 mBq/kg. • A reference material of glass pellets was developed for underground γ-spectrometry.
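
    The two sensitivity figures quoted above are consistent with the standard light-neutrino-exchange scaling, in which the effective-mass limit improves as the square root of the half-life limit, ⟨m_ββ⟩ ∝ 1/√T. A hedged consistency check (ignoring any differences in nuclear-matrix-element assumptions between the two quoted numbers):

```python
import math

def scale_mass_limit(m_ev, t_from, t_to):
    """Scale an effective Majorana mass limit (eV) from half-life
    limit t_from to t_to (years), assuming m ∝ 1 / sqrt(T)."""
    return m_ev * math.sqrt(t_from / t_to)

# Demonstrator: 0.2-0.4 eV at 6.5e24 y; full experiment at 1e26 y.
lo = scale_mass_limit(0.2, 6.5e24, 1e26)
hi = scale_mass_limit(0.4, 6.5e24, 1e26)
print(lo, hi)  # close to the quoted 0.04-0.1 eV band
```

    The scaled band (roughly 0.05-0.10 eV) matches the full-experiment sensitivity quoted in the abstract, so the two numbers follow from the same scaling law.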

  11. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Fraunhofer Institute for Solar Energy Systems ISE, Freiburg (Germany)]

    2013-07-01

    With the increasing use of electric vehicles (EVs), grid operators need to predict energy flows as a function of electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested on realistic showcases. It is a combination of three professional tools. The first tool aims at combined techno-economic design and operation, primarily modeling plants under contracts or on the spot market while simultaneously participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third is mainly used to quickly discover potential conflicts between grid operation approaches through load-flow analysis. The tool suite is being used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that stress on distribution grid lines could be significantly alleviated by a few intelligent restrictions on EV charging procedures.

  12. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    International Nuclear Information System (INIS)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard

    2013-01-01

    With the increasing use of electric vehicles (EVs), grid operators need to predict energy flows as a function of electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested on realistic showcases. It is a combination of three professional tools. The first tool aims at combined techno-economic design and operation, primarily modeling plants under contracts or on the spot market while simultaneously participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third is mainly used to quickly discover potential conflicts between grid operation approaches through load-flow analysis. The tool suite is being used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that stress on distribution grid lines could be significantly alleviated by a few intelligent restrictions on EV charging procedures.

  13. Sea-ice evaluation of NEMO-Nordic 1.0: a NEMO-LIM3.6-based ocean-sea-ice model setup for the North Sea and Baltic Sea

    Science.gov (United States)

    Pemberton, Per; Löptien, Ulrike; Hordoir, Robinson; Höglund, Anders; Schimanke, Semjon; Axell, Lars; Haapala, Jari

    2017-08-01

    The Baltic Sea is a seasonally ice-covered marginal sea in northern Europe with intense wintertime ship traffic and a sensitive ecosystem. Understanding and modeling the evolution of the sea-ice pack is important for climate effect studies and forecasting purposes. Here we present and evaluate the sea-ice component of a new NEMO-LIM3.6-based ocean-sea-ice setup for the North Sea and Baltic Sea region (NEMO-Nordic). The setup includes a new depth-based fast-ice parametrization for the Baltic Sea. The evaluation focuses on long-term statistics, from a 45-year long hindcast, although short-term daily performance is also briefly evaluated. We show that NEMO-Nordic is well suited for simulating the mean sea-ice extent, concentration, and thickness as compared to the best available observational data set. The variability of the annual maximum Baltic Sea ice extent is well in line with the observations, but the 1961-2006 trend is underestimated. Capturing the correct ice thickness distribution is more challenging. Based on the simulated ice thickness distribution we estimate the undeformed and deformed ice thickness and concentration in the Baltic Sea, which compares reasonably well with observations.

  14. Evaluation of the coupled COSMO-CLM+NEMO-Nordic model with focus on North and Baltic seas

    Science.gov (United States)

    Lenhardt, J.; Pham, T. V.; Früh, B.; Brauch, J.

    2017-12-01

    The region east of the Baltic Sea has been identified as a hot-spot of climate change (Giorgi, 2006) on the basis of temperature and precipitation variability. For this purpose, the atmosphere model COSMO-CLM has been coupled to the ocean model NEMO, including the sea-ice model LIM3, via the OASIS3-MCT coupler (Pham et al., 2014). The coupler interpolates heat, fresh-water and momentum fluxes, sea-level pressure and the sea-ice fraction at the interface in space and time. Our aim is to find an optimal configuration of the existing coupled regional atmosphere-ocean model COSMO-CLM+NEMO-Nordic. So far, results for the North and Baltic seas show that the coupled run has large biases compared with the E-OBS reference data. Therefore, additional evaluations against independent satellite observations (e.g. Copernicus, EURO4M) are planned. We have performed a series of runs with the coupled COSMO-CLM+NEMO-Nordic model to quantify differences in model output due to different coupling time steps. First analyses of COSMO-CLM 2 m temperatures suggest that the coupling time step has an impact on the results of the coupled run. Additional tests over a longer period are being conducted to understand whether the signal-to-noise ratio could influence the bias. The results will be presented in our poster.

  15. Nor-ursodeoxycholic acid reverses hepatocyte-specific NEMO-dependent steatohepatitis.

    Science.gov (United States)

    Beraza, Naiara; Ofner-Ziegenfuss, Lisa; Ehedego, Haksier; Boekschoten, Mark; Bischoff, Stephan C; Mueller, Michael; Trauner, Michael; Trautwein, Christian

    2011-03-01

    Hepatocyte-specific NEMO/NF-κB-deleted mice (NEMO(Δhepa)) develop spontaneous non-alcoholic steatohepatitis (NASH). Free fatty acids and bile acids promote DR5 expression, and TRAIL/NK-cell-mediated activation of TRAIL-R2/DR5 plays an important role during acute injury in NEMO(Δhepa) mice. Our aim was to inhibit the progression of NASH in the absence of hepatocyte NEMO/NF-κB signaling. NEMOf/f and NEMO(Δhepa) mice were fed a low-fat diet and two anticholestatic diets, UDCA and NorUDCA, and the impact of these treatments on the progression of NASH was evaluated. We show that the high expression of DR5 in livers from NEMO(Δhepa) mice is accompanied by an abundance of bile acids (BAs), misregulation of BA transporters and significant alteration of lipid-metabolism-related genes. Additionally, mice lacking NEMO in hepatocytes spontaneously showed a ductular response at a young age. Unexpectedly, feeding NEMO(Δhepa) mice a low-fat diet failed to improve chronic liver injury. Conversely, anticholestatic treatment with nor-ursodeoxycholic acid (NorUDCA), but not with ursodeoxycholic acid (UDCA), led to a significant attenuation of liver damage in NEMO(Δhepa) mice. The strong therapeutic effect of NorUDCA relied on a significant downregulation of LXR-dependent lipogenesis and the normalisation of BA metabolism through mechanisms involving cross-talk between Cyp7a1 and SHP. This was associated with a significant improvement of liver histology: NorUDCA-treated NEMO(Δhepa) mice showed less apoptosis and reduced CyclinD1 expression, indicating attenuation of the compensatory proliferative response to hepatocellular damage. Finally, fibrosis and ductular-reaction markers were significantly reduced in NorUDCA-treated NEMO(Δhepa) mice. Overall, our work demonstrates the contribution of bile-acid metabolism to the progression of NASH in the absence of hepatocyte NF-κB through mechanisms involving DR5-mediated apoptosis, inflammation and fibrosis. Our work suggests a potential

  16. Double hit of NEMO gene in preeclampsia.

    Directory of Open Access Journals (Sweden)

    Agata Sakowicz

    The precise etiology of preeclampsia is unknown. Family studies indicate that both genetic and environmental factors influence its development. One of these factors is NFκB, whose activation depends on NEMO (NFκB essential modulator). This is the first study to investigate the association between single-nucleotide variants of the NEMO gene and the appearance of preeclampsia. A total of 151 women (72 preeclamptic women and 79 controls) and their children were examined. Sanger sequencing was performed to identify variants in the NEMO gene in the preeclamptic mothers. The identified maternal variants were then sought in the studied groups of children, and in the maternal and child controls, using RFLP-PCR. Real-time RT-PCR was performed to assess NEMO gene expression in maternal blood, umbilical cord blood and placentas. Sequencing indicated the existence of two variants in the 3'UTR region of the NEMO gene of preeclamptic women (IKBKG:c.*368C>A and IKBKG:c.*402C>T). The simultaneous occurrence of the TT genotype in the mother and the TT genotype in the daughter, or a T allele in the son, increased the risk of preeclampsia development 2.59-fold. Additionally, we found that the configuration of maternal/fetal genotypes (maternal TT/daughter TT or maternal TT/son T) of the IKBKG:c.*402C/T variant is associated with the level of NEMO gene expression. Our results showed that the simultaneous occurrence of the maternal TT genotype (IKBKG:c.*402C>T variant) and a TT genotype in the daughter or a T allele in the son correlates with the level of NEMO gene expression and increases the risk of preeclampsia development. Our observations may offer new insight into the genetic etiology and pathogenesis of preeclampsia.

  17. Neutrino Physics without Neutrinos: Recent results from the NEMO-3 experiment and plans for SuperNEMO

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The observation of neutrino oscillations has proved that neutrinos have mass. This discovery has renewed and strengthened interest in neutrinoless double beta decay experiments, which provide the only practical way to determine whether neutrinos are Majorana or Dirac particles. The recently completed NEMO-3 experiment, located in the Laboratoire Souterrain de Modane in the Frejus Tunnel, searched for neutrinoless double beta decays using a powerful technique for detecting a two-electron final state, employing an apparatus combining tracking, calorimetry, and time-of-flight measurements. We will present the latest results from NEMO-3 and discuss the status of SuperNEMO, the next-generation experiment that will exploit the same experimental technique to extend the sensitivity of the current search.

  18. Study of the background neutron and gamma components of the ββ(0ν) decay in the NEMO2 prototype detector. Consequences for the NEMO3 detector

    International Nuclear Information System (INIS)

    Marquet, Christine

    1999-01-01

    Neutrinoless double beta decay ββ(0ν) is a test of physics beyond the Standard Model, since it requires the existence of a massive Majorana neutrino (ν = ν-bar). To try to observe such a process with a sensitivity of 0.1 eV on the effective neutrino mass ⟨m_ν⟩, the NEMO collaboration built the NEMO3 detector, able to measure half-lives greater than 10^24 years, corresponding to a few detected events per year. For that, it is necessary to know and master all background sources. This work was first dedicated to the study of the external (to the double beta source) background from crossing electrons recorded with the NEMO2 prototype detector, and then to the simulation of this background in the NEMO3 detector. Comparison between NEMO2 data and the results of gamma and neutron simulations for different shieldings, with and without a neutron source, made it possible to determine the background contributions of radon, thoron, 208Tl contamination in materials, the photon flux produced in the laboratory, and neutrons. This study, which required improvements to the MICAP neutron simulation code through the development of a photon generator, proved that radiative capture of fast neutrons thermalized in the detector was the source of events in the energy domain of the ββ(0ν) signal. In order to reach the required sensitivity on ⟨m_ν⟩, it was shown that both neutron shielding and a magnetic field are necessary for the NEMO3 detector. (author) [fr]

  19. Development of an optical simulation for the SuperNEMO calorimeter

    Science.gov (United States)

    Huber, Arnaud; SuperNEMO Collaboration

    2017-09-01

    The SuperNEMO double beta decay project is a modular tracker-calorimeter-based experiment. The aim of the project is to reach a sensitivity of the order of 10^26 years on the neutrinoless double beta decay half-life, corresponding to a Majorana neutrino mass of 50-100 meV. The main calorimeter of the SuperNEMO demonstrator is based on 520 Optical Modules made of large-volume plastic scintillators (10 L) coupled with large-area photomultipliers (Hamamatsu R5912-MOD and R6594). The design of the calorimeter is optimized for double beta decay detection and allows gamma tagging for background rejection. In large scintillator volumes, the same energy deposited by electrons or by photons gives different visible energies and signal shapes, owing to their different interactions inside the scintillator. The aim of the optical simulation developed for SuperNEMO is to model the energy and timing response of the Optical Modules as a function of particle type.

  20. Review paper of gateway selection schemes for MANET of NEMO (MANEMO)

    International Nuclear Information System (INIS)

    Mahmood, Z; Hashim, A; Khalifa, O; Anwar, F; Hameed, S

    2013-01-01

    The fast growth of Internet applications brings new challenges for researchers to provide solutions that guarantee better Internet access for mobile hosts and networks. The globally reachable, Home-Agent-based, infrastructure Network Mobility (NEMO) and the local, multi-hop, infrastructure-less Mobile Ad hoc Network (MANET), both developed by the Internet Engineering Task Force (IETF), support different topologies of mobile networks. A new architecture was proposed that combines both topologies to obtain Mobile Ad Hoc NEMO (MANEMO). However, the integration of NEMO and MANET introduces many challenges, such as network loops, sub-optimal routes, the redundant tunnel problem, the absence of communication when the Home Agent is unreachable, and exit-router selection when multiple Exit Routers to the Internet exist. This paper reviews the different models that have been proposed to implement the gateway selection mechanism and highlights the strengths as well as the limitations of these approaches.

  1. Quantification of cellular NEMO content and its impact on NF-κB activation by genotoxic stress.

    Directory of Open Access Journals (Sweden)

    Byounghoon Hwang

NF-κB essential modulator, NEMO, plays a key role in canonical NF-κB signaling induced by a variety of stimuli, including cytokines and genotoxic agents. To dissect the different biochemical and functional roles of NEMO in NF-κB signaling, various mutant forms of NEMO have previously been analyzed. However, transient or stable overexpression of wild-type NEMO can significantly inhibit NF-κB activation, thereby confounding the analysis of NEMO mutant phenotypes. What levels of NEMO overexpression lead to such an artifact, and what levels are tolerated with no significant impact on NEMO function in NF-κB activation, were previously unknown. Here we purified full-length recombinant human NEMO protein and used it as a standard to quantify the average number of NEMO molecules per cell in a 1.3E2 NEMO-deficient murine pre-B cell clone stably reconstituted with full-length human NEMO (C5). We determined that the C5 cell clone has an average of 4 × 10⁵ molecules of NEMO per cell. Stable reconstitution of 1.3E2 cells with different numbers of NEMO molecules per cell demonstrated that a 10-fold range of NEMO expression (0.6-6 × 10⁵ molecules per cell) yields statistically equivalent NF-κB activation in response to the DNA-damaging agent etoposide. Using the C5 cell line, we also quantified the number of NEMO molecules per cell in several commonly employed human cell lines. These results establish baseline numbers of endogenous NEMO per cell and highlight the surprisingly normal functionality of NEMO in the DNA damage pathway over a wide range of expression levels, providing a guideline for future NEMO reconstitution studies.
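The conversion behind a molecules-per-cell figure is simple mole arithmetic: mass divided by molecular weight gives moles, times Avogadro's number gives molecules, divided by the number of cells gives the average copy number. The numbers below are hypothetical (a NEMO monomer mass of roughly 48 kDa and an illustrative mass/cell count chosen to land on the 4 × 10⁵ scale quoted above); they are not the paper's actual measurements.

```python
AVOGADRO = 6.022e23  # molecules per mole

def molecules_per_cell(sample_mass_ng, molecular_weight_da, n_cells):
    """Average protein copies per cell from a mass estimated against
    a recombinant-protein standard curve."""
    moles = sample_mass_ng * 1e-9 / molecular_weight_da  # ng -> g -> mol
    return moles * AVOGADRO / n_cells

# Hypothetical example: 32 ng of NEMO (~48 kDa) in a lysate of 1e6 cells
n = molecules_per_cell(32.0, 48000.0, 1.0e6)  # ~4e5 molecules per cell
```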

  2. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    Science.gov (United States)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
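The components the testbed exposes, direct particle-particle force summation plus numerical integration of the equations of motion, can be sketched with a standard kick-drift-kick leapfrog step. The O(N²) force loop below is exactly the part that MD-GRAPE2-class hardware accelerates; units, the gravitational constant, and the softening constant are arbitrary illustrative choices, not NBodyLab's actual code.

```python
def leapfrog_step(pos, vel, masses, dt, G=1.0):
    """One kick-drift-kick leapfrog step for a direct-summation
    N-body system. pos/vel are lists of [x, y, z] lists."""
    def accelerations(p):
        n = len(p)
        acc = [[0.0, 0.0, 0.0] for _ in range(n)]
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                dx = [p[j][k] - p[i][k] for k in range(3)]
                r2 = sum(d * d for d in dx) + 1e-8  # Plummer-style softening
                inv_r3 = r2 ** -1.5
                for k in range(3):
                    acc[i][k] += G * masses[j] * dx[k] * inv_r3
        return acc

    acc = accelerations(pos)                                        # kick
    vel = [[v[k] + 0.5 * dt * a[k] for k in range(3)]
           for v, a in zip(vel, acc)]
    pos = [[x[k] + dt * v[k] for k in range(3)]                     # drift
           for x, v in zip(pos, vel)]
    acc = accelerations(pos)                                        # kick
    vel = [[v[k] + 0.5 * dt * a[k] for k in range(3)]
           for v, a in zip(vel, acc)]
    return pos, vel
```

Leapfrog is the usual choice here because it is symplectic: energy errors stay bounded over the long integrations needed for galaxy-collision experiments like those described above.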

  3. NEMO educational kit on micro-optics at the secondary school

    Science.gov (United States)

    Flores-Arias, M. T.; Bao-Varela, Carmen

    2014-07-01

    NEMO was the "Network of Excellence in Micro-Optics", funded under the European Union's Sixth Framework Programme. It aimed at providing Europe with a complete micro-optics food chain by setting up centres for optical modelling and design; measurement and instrumentation; mastering, prototyping and replication; integration and packaging; and reliability and standardization. More than 300 researchers from 30 groups in 12 countries participated in the project. One of the objectives of NEMO was to spread excellence and disseminate knowledge on micro-optics and micro-photonics. To convince pupils, already from secondary school level on, of the crucial role of light and micro-optics and the opportunities this combination holds, several NEMO partners collaborated to create an educational kit. In Spain the partner involved was the "Microoptics and GRIN Optics Group" at the University of Santiago de Compostela (USC). The educational kits provided to secondary schools consisted of two plastic cards carrying the following microoptical elements: different kinds of diffractive optical elements (DOEs) and refractive optical elements (ROEs), namely arrays of micro-lenses. The kit also included a DVD with a handbook for performing the experiments, as well as a laser pointer source. The kit was distributed free of charge in the countries with NEMO partners. In Spain it was offered to around 200 secondary school centres, of which 80 agreed to evaluate the kit.

  4. Expanding the substantial interactome of NEMO using protein microarrays.

    LENUS (Irish Health Repository)

    Fenner, Beau J

    2010-01-01

    Signal transduction by the NF-kappaB pathway is a key regulator of a host of cellular responses to extracellular and intracellular messages. The NEMO adaptor protein lies at the top of this pathway and serves as a molecular conduit, connecting signals transmitted from upstream sensors to the downstream NF-kappaB transcription factor and subsequent gene activation. The position of NEMO within this pathway makes it an attractive target from which to search for new proteins that link NF-kappaB signaling to additional pathways and upstream effectors. In this work, we have used protein microarrays to identify novel NEMO interactors. A total of 112 protein interactors were identified, with the most statistically significant hit being the canonical NEMO interactor IKKbeta, with IKKalpha also being identified. Of the novel interactors, more than 30% were kinases, while at least 25% were involved in signal transduction. Binding of NEMO to several interactors, including CALB1, CDK2, SAG, SENP2 and SYT1, was confirmed using GST pulldown assays and coimmunoprecipitation, validating the initial screening approach. Overexpression of CALB1, CDK2 and SAG was found to stimulate transcriptional activation by NF-kappaB, while SYT1 overexpression repressed TNFalpha-dependent NF-kappaB transcriptional activation in human embryonic kidney cells. Corresponding with this finding, RNA silencing of CDK2, SAG and SENP2 reduced NF-kappaB transcriptional activation, supporting a positive role for these proteins in the NF-kappaB pathway. The identification of a host of new NEMO interactors opens up new research opportunities to improve understanding of this essential cell signaling pathway.

  5. Epithelial NEMO/IKKγ limits fibrosis and promotes regeneration during pancreatitis.

    Science.gov (United States)

    Chan, Lap Kwan; Gerstenlauer, Melanie; Konukiewitz, Björn; Steiger, Katja; Weichert, Wilko; Wirth, Thomas; Maier, Harald Jakob

    2017-11-01

    Inhibitory κB kinase (IKK)/nuclear factor κB (NF-κB) signalling has been implicated in the pathogenesis of pancreatitis, but its precise function has remained controversial. Here, we analyse the contribution of IKK/NF-κB signalling in epithelial cells to the pathogenesis of pancreatitis by targeting the IKK subunit NF-κB essential modulator (NEMO) (IKKγ), which is essential for canonical NF-κB activation. Mice with a targeted deletion of NEMO in the pancreas were subjected to caerulein pancreatitis. Pancreata were examined at several time points and analysed for inflammation, fibrosis, cell death, cell proliferation, as well as cellular differentiation. Human samples were used to corroborate findings established in mice. In acute pancreatitis, NEMO deletion in the pancreatic parenchyma resulted in minor changes during the early phase but led to the persistence of inflammatory and fibrotic foci in the recovery phase. In chronic pancreatitis, NEMO deletion aggravated inflammation and fibrosis, inhibited compensatory acinar cell proliferation, and enhanced acinar atrophy and acinar-ductal metaplasia. Gene expression analysis revealed sustained activation of profibrogenic genes and the CXCL12/CXCR4 axis in the absence of epithelial NEMO. In human chronic pancreatitis samples, the CXCL12/CXCR4 axis was activated as well, with CXCR4 expression correlating with the degree of fibrosis. The aggravating effects of NEMO deletion were attenuated by the administration of the CXCR4 antagonist AMD3100. Our results suggest that NEMO in epithelial cells exerts a protective effect during pancreatitis by limiting inflammation and fibrosis and improving acinar cell regeneration. The CXCL12/CXCR4 axis is an important mediator of that effect and may also be of importance in human chronic pancreatitis. Published by the BMJ Publishing Group Limited. 

  6. AlignNemo: a local network alignment method to integrate homology and topology.

    Directory of Open Access Journals (Sweden)

    Giovanni Ciriello

Local network alignment is an important component of the analysis of protein-protein interaction networks and may lead to the identification of evolutionarily related complexes. We present AlignNemo, a new algorithm that, given the networks of two organisms, uncovers subnetworks of proteins related in biological function and in the topology of their interactions. The discovered conserved subnetworks have a general topology and need not correspond to specific interaction patterns, so they more closely fit the models of functional complexes proposed in the literature. The algorithm handles sparse interaction data with an expansion process that at each step explores the local topology of the networks beyond the proteins directly interacting with the current solution. To assess the performance of AlignNemo, we ran a series of benchmarks using statistical measures as well as biological knowledge. Based on reference datasets of protein complexes, AlignNemo shows better performance than other methods in terms of both precision and recall. We show our solutions to be biologically sound using the concept of semantic similarity applied to Gene Ontology vocabularies. The binaries of AlignNemo and supplementary details about the algorithms and the experiments are available at: sourceforge.net/p/alignnemo.
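A minimal seed-and-extend sketch conveys the flavour of such a local alignment: grow an aligned pair set by repeatedly adding the candidate protein pair with the best combined homology-plus-topology score. The function name and the scoring below are simplified illustrative stand-ins, not AlignNemo's published model.

```python
from itertools import product

def greedy_local_alignment(g1, g2, homology, seed):
    """Toy seed-and-extend local network alignment.

    g1, g2: adjacency dicts {node: set(neighbours)} for each network.
    homology: dict {(u, v): similarity score} across the two networks.
    seed: initial aligned pair (u, v).
    Returns the set of aligned node pairs.
    """
    aligned = {seed}
    used1, used2 = {seed[0]}, {seed[1]}
    improved = True
    while improved:
        improved = False
        # Candidates: nodes adjacent to the current solution in each network.
        frontier1 = {n for u, _ in aligned for n in g1[u]} - used1
        frontier2 = {n for _, v in aligned for n in g2[v]} - used2
        best, best_score = None, 0.0
        for u, v in product(frontier1, frontier2):
            hom = homology.get((u, v), 0.0)
            # Topology term: aligned pairs adjacent to (u, v) in both networks.
            topo = sum(1 for a, b in aligned if u in g1[a] and v in g2[b])
            score = hom + topo
            if hom > 0 and score > best_score:
                best, best_score = (u, v), score
        if best:
            aligned.add(best)
            used1.add(best[0]); used2.add(best[1])
            improved = True
    return aligned
```

The expansion loop mirrors the idea described in the abstract: each step looks one hop beyond the proteins already in the solution rather than requiring a fixed interaction pattern.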

  7. RedNemo

    DEFF Research Database (Denmark)

    Alkan, Ferhat; Erten, Cesim

    2017-01-01

    is their erroneous nature; they contain false-positive interactions and usually many more false negatives. Recently, several computational methods have been proposed for network reconstruction based on topology, where given an input PPI network the goal is to reconstruct the network by identifying false... material including source code, useful scripts, experimental data and the results are available at http://webprs.khas.edu.tr/~cesim/RedNemo.tar.gz CONTACT: cesim@khas.edu.tr Supplementary information: Supplementary data are available at Bioinformatics online.

  8. GeNemo: a search engine for web-based functional genomic data.

    Science.gov (United States)

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-07-08

    A set of new data types has emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundreds of bases to hundreds of thousands of bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
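Pattern matching of genomic regions ultimately reduces to comparing sets of intervals. The base-pair-level Jaccard score below is a toy stand-in for GeNemo's similarity measure (the real engine maximizes its score over candidate windows with MCMC, which is not reproduced here); the function name is illustrative.

```python
def region_overlap_score(query, target):
    """Base-pair-level Jaccard similarity between two sorted,
    non-overlapping interval lists [(start, end), ...] on one
    chromosome: |intersection| / |union| in base pairs."""
    def total(intervals):
        return sum(e - s for s, e in intervals)

    inter = 0
    i = j = 0
    # Sweep both sorted lists once, accumulating overlap.
    while i < len(query) and j < len(target):
        s = max(query[i][0], target[j][0])
        e = min(query[i][1], target[j][1])
        if s < e:
            inter += e - s
        if query[i][1] < target[j][1]:
            i += 1
        else:
            j += 1
    union = total(query) + total(target) - inter
    return inter / union if union else 0.0
```

For example, a query peak at bases 0-10 against a target peak at 5-15 overlaps in 5 of 15 covered bases, scoring 1/3.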

  9. Explicit representation and parametrised impacts of under ice shelf seas in the z∗ coordinate ocean model NEMO 3.6

    Directory of Open Access Journals (Sweden)

    P. Mathiot

    2017-07-01

Ice-shelf-ocean interactions are a major source of freshwater on the Antarctic continental shelf and have a strong impact on ocean properties, ocean circulation and sea ice. However, climate models based on the ocean-sea ice model NEMO (Nucleus for European Modelling of the Ocean) currently do not include these interactions in any detail. The capability of explicitly simulating the circulation beneath ice shelves is introduced in the non-linear free surface model NEMO. Its implementation into the NEMO framework and its assessment in an idealised and a realistic circum-Antarctic configuration are described in this study. Compared with the current prescription of ice shelf melting (i.e. at the surface), inclusion of open sub-ice-shelf cavities leads to a decrease in sea ice thickness along the coast, a weakening of the ocean stratification on the shelf, a decrease in salinity of high-salinity shelf water on the Ross and Weddell sea shelves and an increase in the strength of the gyres that circulate within the over-deepened basins on the West Antarctic continental shelf. Mimicking the overturning circulation under the ice shelves by introducing a prescribed meltwater flux over the depth range of the ice shelf base, rather than at the surface, is also assessed. It yields similar improvements in the simulated ocean properties and circulation over the Antarctic continental shelf to those from the explicit ice shelf cavity representation. With the ice shelf cavities opened, the widely used three-equation ice shelf melting formulation, which enables an interactive computation of melting, is tested. Comparison with observational estimates of ice shelf melting indicates realistic results for most ice shelves. However, melting rates for the Amery, Getz and George VI ice shelves are considerably overestimated.
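The "three-equation" melting formulation mentioned above couples a linearized freezing point at the ice shelf base with heat and salt balances across the ice-ocean boundary layer. In its simplest commonly used form (neglecting heat conduction into the ice), it reads:

```latex
\begin{aligned}
T_b &= \lambda_1 S_b + \lambda_2 + \lambda_3 z_b
  && \text{(freezing point at the interface)} \\
\rho_w\, c_{p,w}\, \gamma_T \,(T_w - T_b) &= \rho_i\, L_f\, m
  && \text{(heat balance)} \\
\rho_w\, \gamma_S \,(S_w - S_b) &= \rho_i\, m\, S_b
  && \text{(salt balance)}
\end{aligned}
```

Here $m$ is the melt rate, $(T_b, S_b, z_b)$ are the temperature, salinity and depth at the ice-ocean interface, $(T_w, S_w)$ the ocean properties in the boundary layer, $\gamma_T, \gamma_S$ turbulent exchange velocities, $L_f$ the latent heat of fusion, and $\lambda_{1,2,3}$ the freezing-point coefficients. Solving the three equations simultaneously for $T_b$, $S_b$ and $m$ is what makes the melting "interactive": it responds to the simulated ocean state beneath the cavity.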

  10. Nemo-3 experiment assets and limitations. Perspective for the double β physics

    International Nuclear Information System (INIS)

    Augier, C.

    2005-06-01

    After an introduction in Chapter 1, I present the status of our knowledge of neutrino physics in Chapter 2. In Chapter 3, I detail all the choices made in the design and realisation of the NEMO 3 detector for the search for double beta decay processes. The performance of the detector is presented, concerning both its capacity to identify backgrounds and its ability to study all the ββ processes. I also explain the methods chosen by the NEMO collaboration to reduce the radon activity inside the detector and to make this background negligible today. This chapter, written in English, is the 'Technical report of the NEMO 3 detector' and forms an independent report for the NEMO collaborators. I conclude in Chapter 4 with a ten-year outlook on experimental projects in the field, covering both the SuperNEMO project and its experimental programme, and comparing the most promising experiments, CUORE and GERDA, showing as an example the effect of nuclear matrix elements on the effective neutrino mass measurement. (author)

  11. Teaching materials for writing fable stories with the film Finding Nemo as stimulus

    Directory of Open Access Journals (Sweden)

    Lia Noviana Qostantia

    2017-03-01

The objectives of this research were (1) to describe instructional material for writing fable stories using the movie Finding Nemo as a stimulus, and (2) to describe the feasibility of that instructional material as assessed by expert tests and by practitioner (teacher and student) tests. The developed instructional material is a complementary book on writing fable stories for students, with content, language, and layout adjusted to students' needs. These objectives can serve as guidance in developing instructional material covering the feasibility of the material content, the language, and the presentation of the complementary book.

  12. Analysis of the data from the NEMO3 experiment and search for neutrinoless double beta decay - Study of systematic bias of the calorimeter and development of analysis tools

    International Nuclear Information System (INIS)

    Hugon, C.

    2012-11-01

    The NEMO3 experiment searched for neutrinoless double-beta (0νββ) decay using various sources of double beta decay isotopes (mainly ¹⁰⁰Mo, ⁸²Se, ¹¹⁶Cd and ¹³⁰Te, about 10 kg in total). The detector was located in the Modane underground laboratory (France), at the halfway point of the Frejus tunnel. The experiment demonstrated that the 'tracko-calo' technique is truly competitive; in addition, it produced new results for two-neutrino double-beta (2νββ) decay and for 0νββ searches, and it opened the way for its successor SuperNEMO, which aims at a mass of 100 kg of ⁸²Se (for a sensitivity of 10²⁶ years). The main goal of this thesis is to measure the 2νββ and 0νββ decays of ¹⁰⁰Mo to the excited 0₁⁺ state of ¹⁰⁰Ru using the complete NEMO3 data set, with new, original analysis methods and through the development of the collaboration's analysis software. The half-lives obtained for the ground-state (g.s.) and excited-state 2νββ decays of ¹⁰⁰Mo are T₁/₂(2νββ, g.s.) = [7.05 ± 0.01(stat) ± 0.54(syst)] × 10¹⁸ years and T₁/₂(2νββ, 0₁⁺) = [6.15 ± 1.1(stat) ± 0.78(syst)] × 10²⁰ years. These results are compatible with the most recent values published by the collaboration. For 0νββ(0₁⁺), this work gives a half-life limit of T₁/₂(0νββ, 0₁⁺) > 2.6 × 10²³ years, significantly improving on the last published results. The same methods also yielded a new, more exhaustive background model for the experiment. The second part of this work was to measure the systematic errors of the NEMO3 calorimeter, due among other things to the wavelength of the NEMO3 calibration systems. This was done with a new LED-based test bench. The bench also contributed to the development of the SuperNEMO calorimeter, in particular to the measurement of the time characteristics and energy linearity of the photomultipliers intended for the experiment's demonstrator. (author)
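Half-lives like those above come from a counting measurement: with N_obs observed signal events, detection efficiency ε, N source atoms and exposure t (and λt ≪ 1 so the source is effectively undepleted), T½ = ln 2 · ε · N · t / N_obs. The sketch below uses hypothetical round numbers for the source mass, efficiency, exposure and event count; they are chosen only to land in the 10¹⁸-year ballpark and are not NEMO3's actual analysis inputs.

```python
import math

AVOGADRO = 6.022e23  # atoms per mole

def half_life_years(n_observed, efficiency, n_atoms, exposure_years):
    """Half-life from a counting measurement, assuming the decay is
    slow compared with the exposure: N_obs = eps * N * lambda * t."""
    decay_rate_per_year = n_observed / (efficiency * n_atoms * exposure_years)
    return math.log(2) / decay_rate_per_year

# Hypothetical example: ~7 kg of 100Mo, 2% efficiency, 5 years,
# ~4.1e5 selected 2-electron events.
n_atoms = 7000.0 / 100.0 * AVOGADRO          # grams / (g per mol) * N_A
t_half = half_life_years(4.1e5, 0.02, n_atoms, 5.0)  # ~7e18 years
```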

  13. Low background techniques for SuperNEMO

    International Nuclear Information System (INIS)

    Liu, Xin Ran; Mott, James

    2015-01-01

    The UK contribution to achieving the ultra-low background conditions required inside the detectors of the SuperNEMO experiment is described. A dedicated facility has been established for the screening and selection of materials through gamma-ray spectroscopy using germanium detectors. Initial results from two detectors are shown. The radon level inside the SuperNEMO detector must be less than 150 μBq/m³ in order to achieve the target sensitivity. A Radon Concentration Line (RnCL) has been developed capable of measuring radon levels in large gas volumes down to 5 μBq/m³, improving on standard state-of-the-art radon detectors by 3 orders of magnitude. The development, commissioning and first measurements of radon content using the RnCL are also presented. (paper)
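Such activity concentrations correspond to astonishingly few atoms, which is what makes the measurement hard. Since activity A = λN with λ = ln 2 / T½ and T½(²²²Rn) ≈ 3.82 days, the number density follows directly (a small worked conversion, not from the paper):

```python
import math

RN222_HALF_LIFE_S = 3.8235 * 86400  # 222Rn half-life in seconds

def radon_atoms_per_m3(activity_uBq_per_m3):
    """Number density of 222Rn atoms for a given activity
    concentration, using n = A / lambda (A in Bq = decays/s)."""
    decay_const = math.log(2) / RN222_HALF_LIFE_S  # per second
    return activity_uBq_per_m3 * 1e-6 / decay_const
```

At the 150 μBq/m³ specification this is only about 70 radon atoms per cubic metre, and the RnCL's 5 μBq/m³ reach corresponds to just a couple of atoms.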

  14. Disulfide-mediated stabilization of the IκB kinase binding domain of NF-κB essential modulator (NEMO).

    Science.gov (United States)

    Zhou, Li; Yeo, Alan T; Ballarano, Carmine; Weber, Urs; Allen, Karen N; Gilmore, Thomas D; Whitty, Adrian

    2014-12-23

    Human NEMO (NF-κB essential modulator) is a 419-residue scaffolding protein that, together with catalytic subunits IKKα and IKKβ, forms the IκB kinase (IKK) complex, a key regulator of NF-κB pathway signaling. NEMO is an elongated homodimer comprising mostly α-helix. It has been shown that a NEMO fragment spanning residues 44-111, which contains the IKKα/β binding site, is structurally disordered in the absence of bound IKKβ. Herein we show that enforcing dimerization of NEMO(1-120) or NEMO(44-111) constructs through introduction of one or two interchain disulfide bonds, through oxidation of the native Cys54 residue and/or at position 107 through a Leu107Cys mutation, induces a stable α-helical coiled-coil structure that is preorganized to bind IKKβ with high affinity. Chemical and thermal denaturation studies showed that, in the context of a covalent dimer, the ordered structure was stabilized relative to the denatured state by up to 3 kcal/mol. A full-length NEMO-L107C protein formed covalent dimers upon treatment of mammalian cells with H2O2. Furthermore, NEMO-L107C bound endogenous IKKβ in A293T cells, reconstituted TNF-induced NF-κB signaling in NEMO-deficient cells, and interacted with TRAF6. Our results indicate that the IKKβ binding domain of NEMO possesses an ordered structure in the unbound state, provided that it is constrained within a dimer as is the case in the constitutively dimeric full-length NEMO protein. The stability of the NEMO coiled coil is maintained by strong interhelix interactions in the region centered on residue 54. The disulfide-linked constructs we describe herein may be useful for crystallization of NEMO's IKKβ binding domain in the absence of bound IKKβ, thereby facilitating the structural characterization of small-molecule inhibitors.

  15. Porcine deltacoronavirus nsp5 inhibits interferon-β production through the cleavage of NEMO.

    Science.gov (United States)

    Zhu, Xinyu; Fang, Liurong; Wang, Dang; Yang, Yuting; Chen, Jiyao; Ye, Xu; Foda, Mohamed Frahat; Xiao, Shaobo

    2017-02-01

    Porcine deltacoronavirus (PDCoV) causes acute enteric disease and mortality in seronegative neonatal piglets. We have previously demonstrated that PDCoV infection suppresses the production of interferon-beta (IFN-β), although the detailed mechanisms remained poorly understood. Here, we demonstrate that nonstructural protein 5 (nsp5) of PDCoV, the 3C-like protease, significantly inhibits Sendai virus (SEV)-induced IFN-β production by targeting the NF-κB essential modulator (NEMO), as confirmed by the diminished function of the cleaved NEMO. The PDCoV nsp5 cleavage site in the NEMO protein was identified as glutamine 231 and is identical to the porcine epidemic diarrhea virus nsp5 cleavage site, revealing the likelihood of a common target in NEMO for coronaviruses. Furthermore, this cleavage impaired the ability of NEMO to activate the IFN response and downstream signaling. Taken together, our findings reveal PDCoV nsp5 to be a newly identified IFN antagonist and enhance the understanding of immune evasion by deltacoronaviruses. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Advanced energy systems and technologies (NEMO 2). Final report 1993-1998

    Energy Technology Data Exchange (ETDEWEB)

    Lund, P.; Konttinen, P. [eds.]

    1998-12-31

    NEMO2 has been the major Finnish energy research programme on advanced energy systems and technologies during 1993-1998. The main objective of the programme has been to support industrial technology development but also to increase the utilisation of wind and solar energy in Finland. The main technology fields covered are wind and solar energy. In addition, the programme has supported projects on energy storage and other small-scale energy technologies such as fuel cells that support the main technology fields chosen. NEMO2 is one of the energy research programmes of the Technology Development Centre of Finland (TEKES). The total R and D funding over the whole programme period was FIM 130 million (ECU 22 million). The public funding of the total programme costs has been 43 %. The industrial participation has been strong. International co-operation has been an important aspect in NEMO2: the programme has stimulated 24 EU-projects and participation in several IEA co-operative tasks. International funding adds nearly 20 % to the NEMO2 R and D funding. (orig.)

  17. Advanced energy systems and technologies (NEMO 2). Final report 1993-1998

    International Nuclear Information System (INIS)

    Lund, P.; Konttinen, P.

    1998-01-01

    NEMO2 has been the major Finnish energy research programme on advanced energy systems and technologies during 1993-1998. The main objective of the programme has been to support industrial technology development but also to increase the utilisation of wind and solar energy in Finland. The main technology fields covered are wind and solar energy. In addition, the programme has supported projects on energy storage and other small-scale energy technologies such as fuel cells that support the main technology fields chosen. NEMO2 is one of the energy research programmes of the Technology Development Centre of Finland (TEKES). The total R and D funding over the whole programme period was FIM 130 million (ECU 22 million). The public funding of the total programme costs has been 43 %. The industrial participation has been strong. International co-operation has been an important aspect in NEMO2: the programme has stimulated 24 EU-projects and participation in several IEA co-operative tasks. International funding adds nearly 20 % to the NEMO2 R and D funding. (orig.)

  18. Probing new physics models of neutrinoless double beta decay with SuperNEMO

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, R. [CNRS/IN2P3, IPHC, Universite de Strasbourg, Strasbourg (France); Augier, C.; Bongrand, M.; Garrido, X.; Jullian, S.; Sarazin, X.; Simard, L. [CNRS/IN2P3, LAL, Universite Paris-Sud 11, Orsay (France); Baker, J.; Caffrey, A.J.; Horkley, J.J.; Riddle, C.L. [INL, Idaho Falls, ID (United States); Barabash, A.S.; Konovalov, S.I.; Umatov, V.I.; Vanyushin, I.A. [Institute of Theoretical and Experimental Physics, Moscow (Russian Federation); Basharina-Freshville, A.; Evans, J.J.; Flack, R.; Holin, A.; Kauer, M.; Richards, B.; Saakyan, R.; Thomas, J.; Vasiliev, V.; Waters, D. [University College London, London (United Kingdom); Brudanin, V.; Egorov, V.; Kochetov, O.; Nemchenok, I.; Timkin, V.; Tretyak, V.; Vasiliev, R. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Cebrian, S.; Dafni, T.; Irastorza, I.G.; Gomez, H.; Iguaz, F.J.; Luzon, G.; Rodriguez, A. [University of Zaragoza, Zaragoza (Spain); Chapon, A.; Durand, D.; Guillon, B.; Mauger, F. [Universite de Caen, LPC Caen, ENSICAEN, Caen (France); Chauveau, E.; Hubert, P.; Hugon, C.; Lutter, G.; Marquet, C.; Nachab, A.; Nguyen, C.H.; Perrot, F.; Piquemal, F.; Ricol, J.S. [UMR 5797, Universite de Bordeaux, Centre d' Etudes Nucleaires de Bordeaux Gradignan, Gradignan (France); UMR 5797, CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, Gradignan (France); Deppisch, F.F.; Jackson, C.M.; Nasteva, I.; Soeldner-Rembold, S. [Univ. of Manchester (United Kingdom); Diaz, J.; Monrabal, F.; Serra, L.; Yahlali, N. [CSIC - Univ. de Valencia, IFIC (Spain); Fushima, K.I. [Tokushima Univ., Tokushima (Japan); Holy, K.; Povinec, P.P.; Simkovic, F. [Comenius Univ., FMFI, Bratislava (Slovakia); Ishihara, N. [KEK, Tsukuba, Ibaraki (Japan); Kovalenko, V. [CNRS/IN2P3, IPHC, Univ. de Strasbourg (France); Joint Inst. for Nuclear Research, Dubna (Russian Federation); Lamhamdi, T. [USMBA, Fes (Morocco); Lang, K.; Pahlka, R.B. [Univ. of Texas, Austin, TX (United States)] (and others)

    2010-12-15

    The possibility to probe new physics scenarios of light Majorana neutrino exchange and right-handed currents at the planned next-generation neutrinoless double β decay experiment SuperNEMO is discussed. Its ability to study different isotopes and track the outgoing electrons provides the means to discriminate between different underlying mechanisms for neutrinoless double β decay by measuring the decay half-life and the electron angular and energy distributions. (orig.)

  19. Withaferin A disrupts ubiquitin-based NEMO reorganization induced by canonical NF-κB signaling

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Shawn S. [McArdle Laboratory for Cancer Research, Department of Oncology, University of Wisconsin-Madison, 6159 Wisconsin Institute for Medical Research, 1111 Highland Avenue, Madison, WI 53705 (United States); Medical Scientist Training Program, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, WI 53705 (United States); Cellular and Molecular Biology Program, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, WI 53705 (United States); Oberley, Christopher [McArdle Laboratory for Cancer Research, Department of Oncology, University of Wisconsin-Madison, 6159 Wisconsin Institute for Medical Research, 1111 Highland Avenue, Madison, WI 53705 (United States); Hooper, Christopher P. [McArdle Laboratory for Cancer Research, Department of Oncology, University of Wisconsin-Madison, 6159 Wisconsin Institute for Medical Research, 1111 Highland Avenue, Madison, WI 53705 (United States); Cellular and Molecular Biology Program, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, WI 53705 (United States); Grindle, Kreg [Department of Medicine, Division of Hematology and Oncology, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, WI 53705 (United States); Wuerzberger-Davis, Shelly [McArdle Laboratory for Cancer Research, Department of Oncology, University of Wisconsin-Madison, 6159 Wisconsin Institute for Medical Research, 1111 Highland Avenue, Madison, WI 53705 (United States); Wolff, Jared [Department of Medicine, Division of Hematology and Oncology, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, WI 53705 (United States); and others

    2015-02-01

    The NF-κB family of transcription factors regulates numerous cellular processes, including cell proliferation and survival responses. The constitutive activation of NF-κB has also emerged as an important oncogenic driver in many malignancies, such as activated B-cell-like diffuse large B cell lymphoma, among others. In this study, we investigated the impact and mechanisms of action of Withaferin A, a naturally produced steroidal lactone, against both signal-inducible and constitutive NF-κB activities. We found that Withaferin A is a robust inhibitor of canonical and constitutive NF-κB activities, leading to apoptosis of certain lymphoma lines. In the canonical pathway induced by TNF, Withaferin A did not disrupt RIP1 polyubiquitination or the NEMO-IKKβ interaction and was a poor direct IKKβ inhibitor, but it prevented the formation of TNF-induced NEMO foci, which colocalized with the TNF ligand. While GFP-NEMO efficiently formed TNF-induced foci, a GFP-NEMO Y308S mutant that is defective in binding to polyubiquitin chains did not form foci. Our study reveals that Withaferin A is a novel type of IKK inhibitor which acts by disrupting NEMO reorganization into ubiquitin-based signaling structures in vivo. - Highlights: • Withaferin A, an NF-κB inhibitor, disrupts signaling-induced NEMO localization, a novel point of inhibition. • NEMO can be localized to distinct signaling foci after treatment with TNF. • ABC-type DLBCL cells can be sensitized to apoptosis after treatment with Withaferin A.

  20. Strategy of HPGe screening measurements in the SuperNEMO experiment

    Energy Technology Data Exchange (ETDEWEB)

    Perrot, Frédéric [Université de Bordeaux, Centre d' Etudes Nucléaires de Bordeaux Gradignan, UMR 5797, Chemin du Solarium, Le Haut-Vigneau, BP120, F-33175 Gradignan, France and CNRS/IN2P3, Centre d' Etudes Nucléaires de Bordeaux Gradignan, UMR 5797 (France); Collaboration: SuperNEMO Collaboration

    2013-08-08

    SuperNEMO is a double beta decay experiment that will use a tracko-calorimeter technique. The goal is to reach a sensitivity of T{sub 1/2}(0ν)>10{sup 26} y corresponding to an effective Majorana neutrino mass of 0.04-0.11 eV with 100 kg of {sup 82}Se. The general strategy of the HPGe screening measurements is described for the materials of the SuperNEMO demonstrator, regarding their radiopurity and their location. The two platforms, PRISNA and LSM, used for this screening are also briefly described.

  1. Naval EarthMap Observer (NEMO) science and naval products

    Science.gov (United States)

    Davis, Curtiss O.; Kappus, Mary E.; Gao, Bo-Cai; Bissett, W. Paul; Snyder, William A.

    1998-11-01

    A wide variety of applications of imaging spectrometry have been demonstrated using data from aircraft systems. Based on this experience, the Navy is pursuing the Hyperspectral Remote Sensing Technology (HRST) Program to use hyperspectral imagery to characterize the littoral environment, for scientific and environmental studies and to meet Naval needs. To obtain the required space-based hyperspectral imagery, the Navy has joined in a partnership with industry to build and fly the Naval EarthMap Observer (NEMO). The NEMO spacecraft carries the Coastal Ocean Imaging Spectrometer (COIS), a hyperspectral imager with adequate spectral and spatial resolution and a high signal-to-noise ratio to provide long-term monitoring and real-time characterization of the coastal environment. It includes on-board processing for rapid data analysis and data compression, a large-volume recorder, and a high-speed downlink to handle the required large volumes of data. This paper describes the algorithms for processing the COIS data to provide at-launch ocean data products and the research and modeling that are planned to use COIS data to advance our understanding of the dynamics of the coastal ocean.

  2. Study of water masses variability in the Mediterranean Sea using in-situ data / NEMO-Med12 model.

    Science.gov (United States)

    Margirier, Félix; Testor, Pierre; Mortier, Laurent; Arsouze, Thomas; Bosse, Anthony; Houpert, Loic; Hayes, Dan

    2016-04-01

    In the past 10 years, numerous observation programs in the Mediterranean have deployed autonomous platforms (moorings, Argo floats, gliders) and thus considerably increased the number of in-situ observations and the data coverage. In this study, we analyse time series built from profile data on interannual scales. Sorting the data into regional boxes, we follow the evolution of the different water masses in the basin and generate indexes to characterize their evolution. We then relate those indexes to external (atmospheric) forcings and present an intercomparison with the NEMO-Med12 model to estimate both the skill of the model and the relevance of the data sampling in reproducing the evolution of water-mass properties.
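
    The box-averaging approach described in the abstract can be sketched in a few lines. The profile records, region name, and depth bounds below are hypothetical placeholders, not the study's actual dataset:

```python
from statistics import mean

# Hypothetical profile records: (year, regional box, depth in m, temperature in degC).
profiles = [
    (2008, "NW-Med", 600, 12.91), (2008, "NW-Med", 800, 12.90),
    (2012, "NW-Med", 600, 12.98), (2012, "NW-Med", 800, 12.97),
]

def water_mass_index(records, region, zmin, zmax):
    """Yearly mean temperature of one water-mass layer in one regional box."""
    by_year = {}
    for year, reg, depth, temp in records:
        if reg == region and zmin <= depth <= zmax:
            by_year.setdefault(year, []).append(temp)
    return {year: mean(temps) for year, temps in sorted(by_year.items())}

# Interannual index for an intermediate layer (500-1000 m) of one box.
index = water_mass_index(profiles, "NW-Med", 500, 1000)
```

    The same per-box, per-layer time series can then be compared point-by-point with the model's output to assess its skill.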

  3. Development and Pre-Operational Validation of NEMO Based Eddy Resolving Regional Configuration for Gulf of Finland

    Science.gov (United States)

    Sofina, Ekaterina; Vankevich, Roman; Tatiana, Eremina

    2014-05-01

    At present, the Operational Oceanographic System for the Gulf of Finland (GULFOOS) is in trial operation at RSHU. The quality of the operational system depends strongly on the spatial resolution of its hydro-thermodynamic model, so a new model configuration has been implemented, based on the international NEMO project (Nucleus for European Modelling of the Ocean). Using the NEMO toolbox, a new eddy-permitting z-coordinate configuration was realized with a horizontal resolution of 30x15'' (~500 m) and a 1 m vertical step. The chosen horizontal resolution is sufficient to resolve typical submesoscale eddies in this basin, where the internal Rossby radius is usually 2-4 km [1]. Verification was performed using all available measurements, including vessel data, ferry boxes, autonomous profilers and satellite SST. It was shown that submesoscale eddies and filaments generated by baroclinic instability of fronts in the upper layers of the Gulf can change the vertical stratification and deepen the mixed layer. The increase in model resolution leads to a clear improvement in the representation of the key hydro-physical fields: filament propagation and local eddies. The obtained results confirm that the model adequately reproduces the general circulation and the seasonal evolution of the vertical water structure. It is shown that the NEMO model, initially designed for the global ocean, can be used in regional operational applications in the case of a highly stratified shallow basin with complex bathymetry. The computational efficiency of the system, including 3DVar assimilation, was sufficient for a 24x7 operational task on 12 nodes of an Intel-based cluster. The proposed regional modeling system has the potential to give information on non-observed physical quantities and to provide links between observations by identifying small-scale patterns and processes. References 1. Alenius P., Nekrasov A., Myrberg, K. The baroclinic Rossby-radius in the Gulf of Finland. Continental Shelf Research, 2003, 23, 563-573.
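
    The claim that ~500 m spacing is eddy-permitting can be checked with the standard estimate R1 = c1/|f|, where c1 is the first-mode internal gravity wave speed and f the Coriolis parameter. The value of c1 below is an illustrative assumption for a strongly stratified shallow basin, not a number from the paper:

```python
import math

OMEGA = 7.2921e-5                               # Earth's rotation rate, rad/s
lat = 60.0                                      # latitude of the Gulf of Finland, deg N
f = 2 * OMEGA * math.sin(math.radians(lat))     # Coriolis parameter, 1/s

c1 = 0.3                 # m/s, assumed first-mode internal wave speed (illustrative)
R1 = c1 / f              # first internal Rossby radius, m (~2.4 km here)

dx = 500.0               # model grid spacing, m
points_per_radius = R1 / dx   # grid points spanning one Rossby radius
```

    With these assumptions R1 falls in the 2-4 km range quoted from [1], and roughly five grid points span one radius, which is consistent with calling the configuration eddy-permitting.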

  4. NEMO-SN-1 the first 'real-time' seafloor observatory of ESONET

    International Nuclear Information System (INIS)

    Favali, Paolo; Beranzoli, Laura; D'Anna, Giuseppe; Gasparoni, Francesco; Gerber, Hans W.

    2006-01-01

    The fruitful collaboration between Italian research institutions, particularly Istituto Nazionale di Fisica Nucleare (INFN) and Istituto Nazionale di Geofisica e Vulcanologia (INGV), together with marine engineering companies, led to the development of NEMO-SN-1, the first European cabled seafloor multiparameter observatory. This observatory, deployed at 2060 m water depth about 12 miles offshore the eastern coast of Sicily (Southern Italy), has been acquiring data in real time since January 2005 and addresses different sets of measurements: geophysical and oceanographic. In particular, the SN-1 seismological data are integrated into the INGV land-based national seismic network and arrive in real time at the Operative Centre in Rome. In the European Commission (EC) European Seafloor Observatory NETwork (ESONET) project, in connection with the Global Monitoring for Environment and Security (GMES) action plan, the NEMO-SN-1 site has been proposed as a European key area, both for its intrinsic importance for geo-hazards and for the availability of infrastructure as a stepwise development in the GMES program. Presently, NEMO-SN-1 is the only operational ESONET site. The paper gives a description of the SN-1 observatory with examples of data

  5. NEMO on the shelf: assessment of the Iberia–Biscay–Ireland configuration

    Directory of Open Access Journals (Sweden)

    C. Maraldi

    2013-08-01

    Full Text Available This work describes the design and validation of a high-resolution (1/36°) ocean forecasting model over the "Iberian–Biscay–Irish" (IBI) area. The system has been set up using the NEMO model (Nucleus for European Modelling of the Ocean). New developments have been incorporated in NEMO to make it suitable to open- as well as coastal-ocean modelling. In this paper, we pursue three main objectives: (1) to give an overview of the model configuration used for the simulations; (2) to give a broad-brush account of one particular aspect of this work, namely consistency verification; this type of validation is conducted upstream of the implementation of the system, before it is used for production and routinely validated; it is meant to guide model development in identifying gross deficiencies in the modelling of several key physical processes; and (3) to show that such a regional modelling system has potential as a complement to patchy observations (an integrated approach) to give information on non-observed physical quantities and to provide links between observations by identifying broader-scale patterns and processes. We concentrate on the year 2008. We first provide domain-wide consistency verification results in terms of barotropic tides, transports, sea surface temperature and stratification. We then focus on two dynamical subregions: the Celtic shelves and the Bay of Biscay slope and deep regions. The model–data consistency is checked for variables and processes such as tidal currents, tidal fronts, internal tides and residual elevation. We also examine the representation in the model of a seasonal pattern of the Bay of Biscay circulation: the warm extension of the Iberian Poleward Current along the northern Spanish coast (the Navidad event) in the winter of 2007–2008.

  6. Implementation of the NEMO model for estimating the spread of leakage from chemical munitions in the Baltic Sea - the first approach

    Science.gov (United States)

    Andrzejewski, Jan

    2017-04-01

    After the Second World War, during the Potsdam Conference, a decision about the demilitarization of Germany was made, and as a consequence, ammunition including chemical warfare agents (CWA) was dumped into the basins of the Baltic Sea. This type of weapon was stored in metal barrels that were under strong influence of electrochemical oxidation, also known as corrosion. Several decades later, scientists began to wonder what consequences a leakage from these weapons could bring for the marine ecosystem. Although over 70 years have passed since the Second World War, the influence of a potential leakage of the CWA has not been properly estimated. Thus, the main goal of this work is to estimate the area endangered by a potential leakage using the NEMO (Nucleus for European Modelling of the Ocean) ocean model. The NEMO ocean model is developed by a European consortium including research institutes from France, England and Italy. The first step of this work is to implement the model for the area of the Baltic Sea. This requires generation of the horizontal and vertical grid, bathymetry, atmospheric forcing and lateral boundary conditions. The implemented model will then have to pass a validation process. The Baltic Sea is one of the best-measured seas in the world; as a consequence, a lot of data are freely available to researchers. After validating and tuning the model, implementation of a passive tracer is planned. A passive tracer is a prognostic variable that can represent the concentration of a potential leakage and has no influence on the density field of the model. Based on the distribution of the passive tracer, dangerous areas around the dumpsites will be assessed. The research work was funded by the European Union (European Regional Development Fund) under the Interreg Baltic Sea Region Programme 2014-2020, project #R013 DAIMON (Decision Aid for Marine Munitions).
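
    The idea of a passive tracer — advected and diffused by the flow without feeding back on density — can be illustrated with a minimal 1-D upwind advection-diffusion step. This is a generic textbook sketch, not NEMO's TOP tracer code; velocity, diffusivity, and grid values are arbitrary:

```python
# One explicit time step for a passive tracer concentration C on a periodic 1-D grid.
def step(C, u, kappa, dx, dt):
    n = len(C)
    new = C[:]
    for i in range(n):
        left, right = C[i - 1], C[(i + 1) % n]          # periodic boundaries
        # First-order upwind advection, direction chosen by the sign of u.
        adv = -u * (C[i] - left) / dx if u >= 0 else -u * (right - C[i]) / dx
        diff = kappa * (right - 2 * C[i] + left) / dx**2  # centred diffusion
        new[i] = C[i] + dt * (adv + diff)
    return new

C = [0.0] * 50
C[10] = 1.0                      # point release at one grid cell (the "leakage")
for _ in range(100):
    C = step(C, u=0.1, kappa=0.01, dx=1.0, dt=0.5)   # CFL-stable choices
```

    The tracer total is conserved on the periodic grid while the plume drifts downstream and spreads, which is exactly the behaviour used to map the endangered area around a dumpsite.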

  7. Nemo-3 calorimeter electronics

    International Nuclear Information System (INIS)

    Bernaudin, P.; Cheikali, C.; Lavigne, B.; Richard, A.; Lebris, J.

    2000-11-01

    The calorimeter electronics of the NEMO-3 double beta decay experiment fulfills three functions: - energy measurement of the electrons by measuring the charge of the pulses, - time measurement, - fast first-level triggering. The electronics for the 1940 scintillator-PM modules is implemented as 40 '9U x 400 mm VME' boards of up to 51 channels. For each channel, the analog signal conditioning is implemented as one SMD daughter board. Each board performs 12-bit charge measurements with 0.35 pC charge resolution, 12-bit time measurements with 50 ps time resolution, and a fast analog multiplicity level for triggering. The total handling and conversion time for all the channels is less than 100 μs. The electronics will be presented as well as the test system. (authors)
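
    From the 12-bit resolution figures quoted above, the full-scale ranges of the charge and time measurements follow by simple arithmetic (the derived totals are computed here, not stated in the abstract):

```python
# Full-scale ranges implied by 12-bit converters with the quoted LSB values.
adc_bits = 12
charge_lsb_pC = 0.35          # charge per ADC count, from the abstract
time_lsb_ps = 50              # time per TDC count, from the abstract

counts = 2 ** adc_bits - 1    # 4095 usable counts

full_scale_charge_pC = counts * charge_lsb_pC       # ~1.43 nC dynamic range
full_scale_time_ns = counts * time_lsb_ps / 1000    # ~205 ns dynamic range
```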

  8. Study of the tracking detector of the NEMO3 experiment - simulation of the measurement of the ultra-low {sup 208}Tl radioactivity in the source foils used as neutrinoless double beta decay emitters in the NEMO3 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Errahmane, K

    2001-04-01

    The purpose of the NEMO3 experiment is the search for neutrinoless double beta decay. This low-energy process would signal the massive Majorana nature of the neutrino. The experiment, with a very low radioactive background and containing 10 kg of enriched isotopes, studies mainly {sup 100}Mo. Installed at the Frejus underground laboratory, NEMO3 is a cylindrical detector consisting of very thin central source foils, a tracking detector made up of vertical drift cells operating in Geiger mode, a calorimeter and a suitable shielding. This thesis is divided into two parts. The first part is a full study of the features of the tracking detector. With a prototype composed of 9 drift cells, we characterised the longitudinal and transverse reconstruction of the position of the ionisation created by a laser. With the first 3 modules in operation, we used external radioactive neutron sources to measure the transverse resolution of the ionisation position in a drift cell for high-energy electrons. To study the vertex reconstruction on the source foil, sources of {sup 207}Bi, which produce conversion electrons, were used inside the 3 modules. In the second part of this thesis, we show with simulations that we can measure, with the NEMO3 detector itself, the ultra-low level of {sup 208}Tl contamination of the source foils, which comes from the natural thorium radioactive chain. Using electron-photon channels, we can obtain the {sup 208}Tl activity in the sources. With an analysis of the energy and of the time of flight of the particles, NEMO3 is able to reach a sensitivity of 20 {mu}Bq/kg after only 2 months of measurement. This sensitivity corresponds to the maximum {sup 208}Tl activity accepted for the sources in the NEMO3 proposal. (author)

  9. Novel hypomorphic mutation in IKBKG impairs NEMO-ubiquitylation causing ectodermal dysplasia, immunodeficiency, incontinentia pigmenti, and immune thrombocytopenic purpura.

    Science.gov (United States)

    Ramírez-Alejo, Noé; Alcántara-Montiel, Julio C; Yamazaki-Nakashimada, Marco; Duran-McKinster, Carola; Valenzuela-León, Paola; Rivas-Larrauri, Francisco; Cedillo-Barrón, Leticia; Hernández-Rivas, Rosaura; Santos-Argumedo, Leopoldo

    2015-10-01

    NF-κB essential modulator (NEMO) is a component of the IKK complex, which participates in the activation of the NF-κB pathway. Hypomorphic mutations in the IKBKG gene result in different forms of anhidrotic ectodermal dysplasia with immunodeficiency (EDA-ID) in males without affecting carrier females. Here, we describe a hypomorphic and missense mutation, designated c.916G>A (p.D306N), which affects our patient, his mother, and his sister. This mutation did not affect NEMO expression; however, an immunoprecipitation assay revealed reduced ubiquitylation upon CD40-stimulation in the patient's cells. Functional studies have demonstrated reduced phosphorylation and degradation of IκBα, affecting NF-κB recruitment into the nucleus. The patient presented with clinical features of ectodermal dysplasia, immunodeficiency, and immune thrombocytopenic purpura, the latter of which has not been previously reported in a patient with NEMO deficiency. His mother and sister displayed incontinentia pigmenti indicating that, in addition to amorphic mutations, hypomorphic mutations in NEMO can affect females. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. NEMO-SN1 observatory developments in view of the European Research Infrastructures EMSO and KM3NET

    Energy Technology Data Exchange (ETDEWEB)

    Favali, Paolo, E-mail: emsopp@ingv.i [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Roma 2, Via di Vigna Murata 605, 00143 Roma (Italy); Beranzoli, Laura [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Roma 2, Via di Vigna Murata 605, 00143 Roma (Italy); Italiano, Francesco [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Palermo, Via Ugo La Malfa 153, 90146 Palermo (Italy); Migneco, Emilio; Musumeci, Mario; Papaleo, Riccardo [Istituto Nazionale di Fisica Nucleare (INFN), Laboratori Nazionali del Sud, Via di S. Sofia 62, 95125 Catania (Italy)

    2011-01-21

    NEMO-SN1 (Western Ionian Sea, off Eastern Sicily), the first real-time multiparameter observatory operating in Europe since 2005, is one of the nodes of the upcoming European ESFRI large-scale research infrastructure EMSO (European Multidisciplinary Seafloor Observatory), a network of seafloor observatories placed at marine sites on the European Continental Margin. NEMO-SN1 also constitutes an important test site for the study of prototypes of the Kilometre Cube Neutrino Telescope (KM3NeT), another European ESFRI large-scale research infrastructure. Italian resources have been devoted to the development of NEMO-SN1 facilities and logistics, as with the PEGASO project, while the EC project ESONET-NoE is funding a demonstration mission and a technological test. EMSO and KM3NeT are presently in their Preparatory Phases as projects funded under the EC-FP7.

  11. Advanced energy systems and technologies research in Finland. NEMO 2 annual report 1994-1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    Advanced energy technologies were linked to the national energy research in the beginning of 1988, when energy research was reorganised in Finland. The Ministry of Trade and Industry set up many energy research programmes, and NEMO was one of them. The major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies such as energy storage and hydrogen technology. Resources have been focused on three specific areas: Arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin-film solar cells). It seems that in Finland the growth of the new energy technology industry is focused on these areas. The sales of the industry have been growing considerably due to the national research activities and support of technology development. The sales have increased 6 - 7 times compared to the year 1987 and are now over 200 million FIM. The support to industries and their involvement in the programme has grown more than 15 times compared to 1988. The total funding of the NEMO 2 programme was 30 million FIM in 1994 and 21 million FIM in 1995. The programme consists of 20 research projects, 15 joint development projects, and 5 EU projects. In this report, the essential research projects of the programme in 1994-1995 are described. The total funding for these projects was about 25 million FIM, of which TEKES's share was about half. When the research projects and joint development projects are

  12. Double-beta decay measurement of 100Mo to the excited 01+ state of 100Ru in the NEMO3 experiment - R/D program for SuperNEMO: development of a BiPo detector to measure ultra low contaminations in the source foils

    International Nuclear Information System (INIS)

    Chapon, A.

    2011-10-01

    The NEMO3 detector was designed for the study of double beta decay and in particular the search for neutrinoless double beta decay (ββ0ν). The quantity of {sup 100}Mo in the detector (7 kg) also allows a competitive measurement of the two-neutrino double beta decay (ββ2ν) of {sup 100}Mo to the excited 0{sub 1}{sup +} state of {sup 100}Ru (eeNγ channel). Monte Carlo simulations of the effect and of all the possible sources of background have been studied in order to determine their contributions to the full NEMO3 experimental data (2003-2011). These data have then been analysed: the ββ2ν decay half-life has been measured, and a limit on the ββ0ν decay has been obtained. Moreover, the SuperNEMO experiment aims to reach a sensitivity of up to 10{sup 26} years on the half-life of neutrinoless double beta decay. The SuperNEMO detector radioactivity has to be as low as possible. In particular, radiopurity levels of 2 μBq/kg in {sup 208}Tl and 10 μBq/kg in {sup 214}Bi are required for the source foils. Gamma spectrometry cannot measure such low contamination levels. Hence, a dedicated BiPo detector has been developed to measure the {sup 208}Tl and {sup 214}Bi contaminations, identifying the Bi→Po→Pb β-α chains. A proof of principle has been performed and the detector background has been measured. Assuming these values, a full BiPo detector of 3.6 m{sup 2} can achieve the required sensitivities for the SuperNEMO source foils within six months of measurement. (author)

  13. Evaluation of QoS supported in Network Mobility NEMO environments

    International Nuclear Information System (INIS)

    Hussien, L F; Abdalla, A H; Habaebi, M H; Khalifa, O O; Hassan, W H

    2013-01-01

    The Network Mobility Basic Support (NEMO BS) protocol lets an entire network roam as a unit, changing its point of attachment to the Internet and consequently its reachability in the network topology. Like traditional Internet IP and Mobile IPv6, NEMO BS does not provide QoS guarantees to its users. Typically, all users receive the same level of service regardless of their application requirements. This poses a problem for real-time applications that require QoS guarantees. To gain more effective control of the network, integrated QoS support is needed. Within a QoS-enabled network, traffic flows can be assigned different priorities. Also, the network bandwidth and resources can be allocated to different applications and users. Internet Engineering Task Force (IETF) working groups have proposed several QoS solutions for static networks, such as IntServ, DiffServ and MPLS. These QoS solutions are designed in the context of a static environment (i.e. fixed hosts and networks). However, they are not fully adapted to mobile environments. They essentially need to be extended and adjusted to meet the various challenges involved in mobile environments. Building on existing QoS mechanisms, many proposals have been developed to provide QoS for individual mobile nodes (i.e. host mobility). In contrast, research on the movement of a whole mobile network in IPv6 is still under way in the IETF working groups (i.e. network mobility). Little research has been done in the area of providing QoS for roaming networks. Therefore, this paper aims to review and investigate previous and current related work on providing QoS in mobile networks. Consequently, a new scheme is proposed to enhance QoS within the NEMO environment, achieving seamless mobility for users of the mobile network node (MNN)

  14. Atmospheric muons in the NEMO Phase 1 detector at the Catania test site

    International Nuclear Information System (INIS)

    Margiotta, Annarita

    2006-01-01

    The NEMO Collaboration is involved in a long-term R and D activity towards the construction of a km{sup 3} telescope in the Mediterranean Sea. It has dedicated special efforts to the development of technologies for a km{sup 3} detector and to the search for, characterization and monitoring of a deep-sea site adequate for the installation of the Mediterranean km{sup 3}. The NEMO Collaboration is now involved in Phase 1 of the project, planning to install a fully equipped deep-sea facility to test prototypes and develop new technologies for the detector. A full Monte Carlo simulation has been performed to analyse the response of a reduced-size detector to the passage of atmospheric muons. Preliminary steps of the simulation are presented in this work

  15. Radon emanation chamber: High sensitivity measurements for the SuperNEMO experiment

    Energy Technology Data Exchange (ETDEWEB)

    Soulé, B. [Université Bordeaux 1, Centre d' Etudes Nucléaires de Bordeaux Gradignan, UMR 5797, Chemin du Solarium, Le Haut-Vigneau, BP120, F-33175 Gradignan (France); Collaboration: SuperNEMO Collaboration; and others

    2013-08-08

    Radon is a well-known source of background in ββ0ν experiments due to the high Q{sub β} value of one of its daughter nuclei, {sup 214}Bi. The SuperNEMO collaboration requires a maximum radon contamination of 0.1 mBq/m{sup 3} inside its next-generation double beta decay detector. To reach such a low activity, a drastic screening process has been set up for the selection of the detector's materials. In addition to good radiopurity, a low emanation rate is required. To test this parameter, a Radon Emanation Setup is running at CENBG. It consists of a large emanation chamber connected to an electrostatic detector. By measuring large samples and having a low background level, this setup reaches a sensitivity of a few μBq·m{sup −2}·d{sup −1} and is able to qualify materials used in the construction of the SuperNEMO detector.

  16. Search for evidence of lepton number violation by neutrinoless double beta decay process from 82Se and 150Nd in NEMO-3 experiment: Bi-Po decay study from thoron chain

    International Nuclear Information System (INIS)

    Lemiere, Y.

    2008-09-01

    The NEMO-3 experiment searches for a neutrinoless double beta decay signal (ββ0ν) with an expected sensitivity, in terms of the half-life limit, of the order of 10{sup 24} years. The discovery of this signal, forbidden in the Standard Model, would imply the violation of lepton number conservation and would allow the nature of this particle (Dirac or Majorana) to be determined and the neutrino mass scale to be measured. The goal of this work is to study the high-energy events from the {sup 82}Se and {sup 150}Nd ββ sources used in the NEMO-3 detector. The first part of this work consists in the elaboration of a background model using NEMO-3 data. In the second part, the ββ2ν half-life and a lower limit on the ββ0ν half-life are computed under the massive Majorana neutrino exchange hypothesis; we obtain: T(0ν) > 1.44*10{sup 22} years for {sup 150}Nd and T(0ν) > 1.82*10{sup 23} years for {sup 82}Se. Upper limits on the effective Majorana neutrino mass are also derived for {sup 150}Nd and {sup 82}Se. In the last part, the measurement of specific thallium contaminations is performed thanks to the NEMO-3 capability to detect the {sup 212}Bi-Po decay. The measured surface contamination of the calorimeter is about (150 ± 30) μBq/m{sup 3}. This surface contamination is too low to affect the NEMO-3 data analysis but is important for next-generation scintillators

  17. Advanced energy systems and technologies research in Finland. NEMO-2 Programme Annual Report 1996-1997

    International Nuclear Information System (INIS)

    1998-01-01

    Advanced energy technologies were linked to the national energy research in the beginning of 1988, when energy research was reorganised in Finland. The Ministry of Trade and Industry established several energy research programmes, and NEMO was one of them. The major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on a few promising technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies, such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin-film solar cells). In Finland, the growth of the new energy technology industry is concentrated on these areas. The turnover of the Finnish industry has been growing considerably due to the national research activities and support of technology development. The sales have increased more than 10 times compared with the year 1987 and are now over 300 million FIM. The support to industries and their involvement in the programme has grown considerably. In this report, the essential research projects of the programme during 1996-1997 are described. The total funding for these projects was about 30 million FIM per year, of which TEKES's share was about 40 per cent. The programme consists of 10 research projects, some 15 joint development projects, and 9 EU projects. In case the research projects and joint development projects are acting very closely, the description of the project is

  18. Advanced energy systems and technologies research in Finland. NEMO-2 Programme Annual Report 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-10-01

    Advanced energy technologies were linked to the national energy research in the beginning of 1988, when energy research was reorganised in Finland. The Ministry of Trade and Industry established several energy research programmes, and NEMO was one of them. The major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on a few promising technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies, such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin-film solar cells). In Finland, the growth of the new energy technology industry is concentrated on these areas. The turnover of the Finnish industry has been growing considerably due to the national research activities and support of technology development. The sales have increased more than 10 times compared with the year 1987 and are now over 300 million FIM. The support to industries and their involvement in the programme has grown considerably. In this report, the essential research projects of the programme during 1996-1997 are described. The total funding for these projects was about 30 million FIM per year, of which TEKES's share was about 40 per cent. The programme consists of 10 research projects, some 15 joint development projects, and 9 EU projects. In case the research projects and joint development projects are acting very closely, the description of the project is

  19. A Spectrum Handoff Scheme for Optimal Network Selection in NEMO Based Cognitive Radio Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Krishan Kumar

    2017-01-01

    Full Text Available When a mobile network changes its point of attachment in Cognitive Radio (CR) vehicular networks, the Mobile Router (MR) requires a spectrum handoff. Network Mobility (NEMO) in CR vehicular networks is concerned with the management of this movement. In future NEMO-based CR vehicular network deployments, multiple radio access networks may coexist in overlapping areas, having different characteristics in terms of multiple attributes. A CR vehicular node may have the capability to make calls for two or more types of non-safety services, such as voice, video, and best effort, simultaneously. Hence, it becomes difficult for the MR to select the optimal network for the spectrum handoff. This can be done by performing the spectrum handoff using Multiple Attribute Decision Making (MADM) methods, which is the objective of this paper. MADM methods such as grey relational analysis and cost-based methods are used. The application of MADM methods provides a wider and optimal choice among the available networks with quality of service. Numerical results reveal that the proposed scheme is effective for the spectrum handoff decision for optimal network selection with reduced complexity in NEMO-based CR vehicular networks.
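
    Grey relational analysis, one of the MADM methods the abstract names, can be sketched as follows. The candidate networks, attribute values, and weights-free averaging below are illustrative assumptions, not the paper's actual evaluation:

```python
# Grey relational analysis (GRA) ranking of candidate networks for a
# spectrum handoff decision. Attribute values are hypothetical.
networks = {
    "WiMAX": [40.0, 60.0, 5.0],   # bandwidth (Mbps), delay (ms), cost
    "LTE":   [75.0, 30.0, 8.0],
    "UMTS":  [2.0,  90.0, 3.0],
}
benefit = [True, False, False]    # larger-is-better holds only for bandwidth

def gra_rank(alts, benefit, rho=0.5):
    """Rank alternatives by their grey relational grade vs the ideal sequence."""
    cols = list(zip(*alts.values()))
    scores = {}
    for name, row in alts.items():
        coeffs = []
        for j, x in enumerate(row):
            lo, hi = min(cols[j]), max(cols[j])
            # Normalize so 1 is best for every attribute type.
            v = (x - lo) / (hi - lo) if benefit[j] else (hi - x) / (hi - lo)
            # Ideal value after normalization is 1, so delta = 1 - v;
            # global delta_min = 0 and delta_max = 1 for the coefficient.
            coeffs.append(rho / ((1.0 - v) + rho))
        scores[name] = sum(coeffs) / len(coeffs)   # equal-weight grade
    return sorted(scores, key=scores.get, reverse=True)

ranking = gra_rank(networks, benefit)   # best candidate first
```

    In this toy setting the high-bandwidth, low-delay candidate wins despite its higher cost; attribute weights would let the MR trade these criteria off differently per service type.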

  20. Correlating interleukin-12 stimulated interferon-γ production and the absence of ectodermal dysplasia and anhidrosis (EDA) in patients with mutations in NF-κB essential modulator (NEMO).

    Science.gov (United States)

    Haverkamp, Margje H; Marciano, Beatriz E; Frucht, David M; Jain, Ashish; van de Vosse, Esther; Holland, Steven M

    2014-05-01

    Patients with hypomorphic mutations in Nuclear Factor-κB Essential Modulator (NEMO) are immunodeficient (ID) and most display ectodermal dysplasia and anhidrosis (EDA). We compared cytokine production by NEMO-ID patients with and without EDA. PBMCs of NEMO-ID patients, four with EDA carrying E315A, C417R, D311N and Q403X, and three without EDA carrying E315A, E311_L333del and R254G, were cultured with PHA, PHA plus IL-12p70, LPS, LPS plus IFN-γ, TNF and IL-1β. The production of various cytokines was measured in the supernatants. Fifty-nine healthy individuals served as controls. PBMCs of NEMO-ID patients without EDA produce subnormal amounts of IFN-γ after stimulation with PHA, but normal amounts of IFN-γ after PHA plus IL-12p70. In contrast, IFN-γ production by patients with EDA was low in both cases. Patients with EDA also generate lower PHA-stimulated IL-10 and IL-1β than controls, whereas the production of these cytokines by patients without EDA was normal. Responses of PBMCs in NEMO-ID patients with EDA to PHA with and without IL-12p70 appear less robust than in NEMO-ID patients without EDA. This possibly indicates a better preserved NEMO function in our patients without EDA.

  1. Jules Verne's Captain Nemo and French Revolutionary Gustave Flourens: A Hidden Character Model?

    Directory of Open Access Journals (Sweden)

    Leonidas Kallivretakis

    2005-01-01

    This article examines the recent suggestion by Vernian specialist William Butcher that Jules Verne's most famous character, Captain Nemo, is based on the French revolutionary intellectual Gustave Flourens (1838-1871), son of the eminent physiologist J. P. M. Flourens. Gustave Flourens fought in the Cretan insurrection of 1866-1868, later participated in the republican opposition against Napoleon III's imperial regime, eventually became a friend of Karl Marx, and was finally killed as a general of the Paris Commune. By comparing, step by step, Verne's inspiration and writing process with Flourens' unfolding activities and fame, it is concluded that there is little basis for such an assumption. The article also includes a brief account of the Cretan question in the nineteenth century and of the deep discord between Marx's and Flourens' respective analyses of the Eastern Question.

  2. Development of high performance and very low radioactivity scintillation counters for the SuperNEMO calorimeter

    International Nuclear Information System (INIS)

    Chauveau, E.

    2010-11-01

    SuperNEMO is a next-generation double beta decay experiment which will extend the successful 'tracko-calo' technique employed in NEMO 3. The main characteristic of this type of detector is its ability not only to identify double beta decays, but also to measure its own background components. The project aims to reach a sensitivity of up to 10{sup 26} years on the half-life of {sup 82}Se. One of the main challenges of the research and development is to achieve an unprecedented energy resolution for the electron calorimeter, better than 8% FWHM at 1 MeV. This thesis contributes to improving scintillator and photomultiplier performance and reducing their radioactivity, including in particular the development of a new photomultiplier in collaboration with Photonis. (author)

  3. Cellular automaton and elastic net for event reconstruction in the NEMO-2 experiment

    International Nuclear Information System (INIS)

    Kovalenko, V.

    1997-01-01

    A cellular automaton for track searching and an elastic net for charged particle trajectory fitting are presented. The advantages of the methods are: simplicity of the algorithms, fast and stable convergence to real tracks, and a reconstruction efficiency close to 100%. Demonstration programs are available at http://nuweb.jinr.dubna.su/LNP/NEMO using a Java enabled browser. (orig.)
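
    A minimal cellular-automaton track finder in the spirit described above can be sketched as follows. The hit coordinates, the slope-compatibility cut, and the whole geometry are toy assumptions for illustration, not the NEMO-2 algorithm itself:

```python
# Toy event: hit transverse positions per detector layer (illustrative).
hits = {0: [0.0], 1: [1.1, 3.0], 2: [2.1, 5.2], 3: [3.0]}
max_slope_diff = 0.5  # compatibility cut between adjacent segments (assumed)

# Cells are candidate segments joining hits on adjacent layers.
cells = [(l, y1, y2)
         for l in range(3)
         for y1 in hits[l]
         for y2 in hits[l + 1]]

# CA sweep: a cell's counter becomes 1 + the best counter among
# compatible segments on the inner layer; repeat until stable.
counter = {c: 1 for c in cells}
changed = True
while changed:
    changed = False
    for (l, y1, y2) in cells:
        best = 0
        for (lp, yp1, yp2) in cells:
            if (lp == l - 1 and yp2 == y1
                    and abs((y2 - y1) - (yp2 - yp1)) < max_slope_diff):
                best = max(best, counter[(lp, yp1, yp2)])
        if 1 + best != counter[(l, y1, y2)]:
            counter[(l, y1, y2)] = 1 + best
            changed = True

# The longest chain of compatible segments is the track candidate; a
# subsequent fit (e.g. an elastic net) would refine its parameters.
track_len = max(counter.values())
```

    In this toy event the straight chain through the four aligned hits reaches counter 3 (three segments), while the segments toward the stray hits never grow, which is how the automaton separates the real track from combinatorial noise.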

  4. Long term monitoring of the optical background in the Capo Passero deep-sea site with the NEMO tower prototype

    International Nuclear Information System (INIS)

    Adrian-Martinez, S.; Ardid, M.; Llorens Alvarez, C.D.; Saldana, M.; Aiello, S.; Giordano, V.; Leonora, E.; Longhitano, F.; Randazzo, N.; Sipala, V.; Ventura, C.; Ameli, F.; Biagioni, A.; De Bonis, G.; Fermani, P.; Lonardo, A.; Nicolau, C.A.; Simeone, F.; Vicini, P.; Anghinolfi, M.; Hugon, C.; Musico, P.; Orzelli, A.; Sanguineti, M.; Barbarino, G.; Barbato, F.C.T.; De Rosa, G.; Di Capua, F.; Garufi, F.; Vivolo, D.; Barbarito, E.; Beverini, N.; Calamai, M.; Maccioni, E.; Marinelli, A.; Terreni, G.; Biagi, S.; Cacopardo, G.; Cali, C.; Caruso, F.; Cocimano, R.; Coniglione, R.; Costa, M.; Cuttone, G.; D'Amato, C.; De Luca, V.; Distefano, C.; Gmerk, A.; Grasso, R.; Imbesi, M.; Kulikovskiy, V.; Larosa, G.; Lattuada, D.; Leismueller, K.P.; Litrico, P.; Migneco, E.; Miraglia, A.; Musumeci, M.; Orlando, A.; Papaleo, R.; Pulvirenti, S.; Riccobene, G.; Rovelli, A.; Sapienza, P.; Sciacca, V.; Speziale, F.; Spitaleri, A.; Trovato, A.; Viola, S.; Bouhadef, B.; Flaminio, V.; Raffaelli, F.; Bozza, C.; Grella, G.; Stellacci, S.M.; Calvo, D.; Real, D.; Capone, A.; Masullo, R.; Perrina, C.; Ceres, A.; Circella, M.; Mongelli, M.; Sgura, I.; Chiarusi, T.; D'Amico, A.; Deniskina, N.; Migliozzi, P.; Mollo, C.M.; Enzenhoefer, A.; Lahmann, R.; Ferrara, G.; Fusco, L.A.; Margiotta, A.; Pellegrino, C.; Spurio, M.; Lo Presti, D.; Pugliatti, C.; Martini, A.; Trasatti, L.; Morganti, M.; Pellegriti, M.G.; Piattelli, P.; Taiuti, M.

    2016-01-01

    The NEMO Phase-2 tower is the first detector operated underwater for more than one year at the record depth of 3500 m. It was designed and built within the framework of the NEMO (NEutrino Mediterranean Observatory) project. The 380 m high tower was successfully installed in March 2013, 80 km offshore of Capo Passero (Italy). This is the first prototype operated on the site where the Italian node of the KM3NeT neutrino telescope will be built. The installation and operation of the NEMO Phase-2 tower have proven the functionality of the infrastructure and its operability at 3500 m depth. More than one year of monitoring of the deep-water characteristics of the site has also been provided. In this paper the infrastructure, the tower structure and the instrumentation are described. The results of long-term optical background measurements are presented. The rates show stable and low baseline values, compatible with the contribution of {sup 40}K light emission, with a small percentage of light bursts due to bioluminescence. All these features confirm the stability and good optical properties of the site. (orig.)

  5. Long term monitoring of the optical background in the Capo Passero deep-sea site with the NEMO tower prototype

    Energy Technology Data Exchange (ETDEWEB)

    Adrian-Martinez, S.; Ardid, M.; Llorens Alvarez, C.D.; Saldana, M. [Universitat Politecnica de Valencia, Instituto de Investigacion para la Gestion Integrada de las Zonas Costeras, Gandia (Spain); Aiello, S.; Giordano, V.; Leonora, E.; Longhitano, F.; Randazzo, N.; Sipala, V.; Ventura, C. [INFN Sezione Catania, Catania (Italy); Ameli, F.; Biagioni, A.; De Bonis, G.; Fermani, P.; Lonardo, A.; Nicolau, C.A.; Simeone, F.; Vicini, P. [INFN Sezione Roma, Rome (Italy); Anghinolfi, M.; Hugon, C.; Musico, P.; Orzelli, A.; Sanguineti, M. [INFN Sezione Genova, Genoa (Italy); Barbarino, G.; Barbato, F.C.T.; De Rosa, G.; Di Capua, F.; Garufi, F.; Vivolo, D. [INFN Sezione Napoli, Naples (Italy); Dipartimento di Scienze Fisiche Universita di Napoli, Naples (Italy); Barbarito, E. [INFN Sezione Bari, Bari (Italy); Dipartimento Interateneo di Fisica Universita di Bari, Bari (Italy); Beverini, N.; Calamai, M.; Maccioni, E.; Marinelli, A.; Terreni, G. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Dipartimento di Fisica Universita di Pisa, Polo Fibonacci, Pisa (Italy); Biagi, S.; Cacopardo, G.; Cali, C.; Caruso, F.; Cocimano, R.; Coniglione, R.; Costa, M.; Cuttone, G.; D' Amato, C.; De Luca, V.; Distefano, C.; Gmerk, A.; Grasso, R.; Imbesi, M.; Kulikovskiy, V.; Larosa, G.; Lattuada, D.; Leismueller, K.P.; Litrico, P.; Migneco, E.; Miraglia, A.; Musumeci, M.; Orlando, A.; Papaleo, R.; Pulvirenti, S.; Riccobene, G.; Rovelli, A.; Sapienza, P.; Sciacca, V.; Speziale, F.; Spitaleri, A.; Trovato, A.; Viola, S. [INFN Laboratori Nazionali del Sud, Catania (Italy); Bouhadef, B.; Flaminio, V.; Raffaelli, F. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Bozza, C.; Grella, G.; Stellacci, S.M. [INFN Gruppo Collegato di Salerno, Fisciano (Italy); Dipartimento di Fisica Universita di Salerno, Fisciano (Italy); Calvo, D.; Real, D. [CSIC-Universitat de Valencia, IFIC-Instituto de Fisica Corpuscular, Valencia (Spain); Capone, A.; Masullo, R.; Perrina, C. 
[INFN Sezione Roma, Rome (Italy); Dipartimento di Fisica Universita ' ' Sapienza' ' , Rome (Italy); Ceres, A.; Circella, M.; Mongelli, M.; Sgura, I. [INFN Sezione Bari, Bari (Italy); Chiarusi, T. [INFN Sezione Bologna, Bologna (Italy); D' Amico, A. [INFN Laboratori Nazionali del Sud, Catania (Italy); Nikhef, Science Park, Amsterdam (Netherlands); Deniskina, N.; Migliozzi, P.; Mollo, C.M. [INFN Sezione Napoli, Naples (Italy); Enzenhoefer, A.; Lahmann, R. [Friedrich-Alexander-Universitaet Erlangen-Nuernberg, Erlangen Centre for Astroparticle Physics, Erlangen (Germany); Ferrara, G. [INFN Laboratori Nazionali del Sud, Catania (Italy); Dipartimento di Fisica e Astronomia Universita di Catania, Catania (Italy); Fusco, L.A.; Margiotta, A.; Pellegrino, C.; Spurio, M. [INFN Sezione Bologna, Bologna (Italy); Dipartimento di Fisica ed Astronomia Universita di Bologna, Bologna (Italy); Lo Presti, D.; Pugliatti, C. [INFN Sezione Catania, Catania (Italy); Dipartimento di Fisica e Astronomia Universita di Catania, Catania (Italy); Martini, A.; Trasatti, L. [INFN Laboratori Nazionali di Frascati, Frascati (Italy); Morganti, M. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Accademia Navale di Livorno, Livorno (Italy); Pellegriti, M.G. [INFN Laboratori Nazionali del Sud, Catania (IT); Piattelli, P. [INFN Laboratori Nazionali del Sud, Catania (IT); Taiuti, M. [INFN Sezione Genova, Genoa (IT); Dipartimento di Fisica Universita di Genova, Genoa (IT)

    2016-02-15

    The NEMO Phase-2 tower is the first detector operated underwater for more than one year at the record depth of 3500 m. It was designed and built within the framework of the NEMO (NEutrino Mediterranean Observatory) project. The 380 m high tower was successfully installed in March 2013, 80 km offshore of Capo Passero (Italy). This is the first prototype operated on the site where the Italian node of the KM3NeT neutrino telescope will be built. The installation and operation of the NEMO Phase-2 tower have proven the functionality of the infrastructure and its operability at 3500 m depth. More than one year of monitoring of the deep-water characteristics of the site has also been provided. In this paper the infrastructure, the tower structure and the instrumentation are described. The results of long-term optical background measurements are presented. The rates show stable and low baseline values, compatible with the contribution of {sup 40}K light emission, with a small percentage of light bursts due to bioluminescence. All these features confirm the stability and good optical properties of the site. (orig.)

  6. Response of water temperature to surface wave effects in the Baltic Sea: simulations with the coupled NEMO-WAM model

    Science.gov (United States)

    Alari, Victor; Staneva, Joanna; Breivik, Øyvind; Bidlot, Jean-Raymond; Mogensen, Kristian; Janssen, Peter

    2016-04-01

    The effects of wind waves on Baltic Sea water temperature have been studied by coupling the hydrodynamical model NEMO with the wave model WAM. The wave forcing terms taken into consideration are: the Stokes-Coriolis force, the sea-state-dependent energy flux and the sea-state-dependent momentum flux. The combined role of these processes, as well as their individual contributions to the simulated temperature, is analysed. The results indicate a pronounced effect of waves on surface temperature, on the vertical temperature distribution and on upwellings. In the northern parts of the Baltic Sea, a warming of the surface layer occurs in the simulations that include waves. This in turn reduces the cold bias between simulated and measured data. The warming is primarily caused by the sea-state-dependent energy flux. Wave-induced cooling is mostly observed in near-coastal areas and is mainly due to the Stokes-Coriolis forcing, which intensifies upwellings near the coasts, depending on the direction of the wind. The effect of the sea-state-dependent momentum flux is predominantly to warm the surface layer. During the summer, wave-induced water temperature changes were up to 1 °C.

  7. PORFIDO on the NEMO Phase 2 tower

    Energy Technology Data Exchange (ETDEWEB)

    Ciaffoni, Orlando; Cordelli, Marco; Habel, Roberto; Martini, Agnese; Trasatti, Luciano [INFN-Laboratori Nazionali di Frascati, Via E. Fermi 40, I-00044 Frascati (RM) (Italy)

    2014-11-18

    We have designed and built an underwater measurement system, PORFIDO (Physical Oceanography by RFID Outreach), to gather oceanographic data from the Optical Modules of a neutrino telescope with a minimum of disturbance to the main installation. PORFIDO is composed of a sensor glued to the outside of an Optical Module, in contact with seawater, and of a reader placed inside the sphere, facing the sensor. Data are transmitted to the reader through the glass by RFID, and to shore in real time, for periods of years. The sensor gathers power from the radio frequency, thus eliminating the need for batteries or connectors through the glass. We deployed four PORFIDO probes measuring temperatures with the NEMO-KM3Net-Italy Phase 2 tower in April 2013. The four probes are operative and are transmitting temperature data from 3500 m depth.

  8. PORFIDO on the NEMO Phase 2 tower

    International Nuclear Information System (INIS)

    Ciaffoni, Orlando; Cordelli, Marco; Habel, Roberto; Martini, Agnese; Trasatti, Luciano

    2014-01-01

    We have designed and built an underwater measurement system, PORFIDO (Physical Oceanography by RFID Outreach), to gather oceanographic data from the Optical Modules of a neutrino telescope with a minimum of disturbance to the main installation. PORFIDO is composed of a sensor glued to the outside of an Optical Module, in contact with seawater, and of a reader placed inside the sphere, facing the sensor. Data are transmitted to the reader through the glass by RFID, and to shore in real time, for periods of years. The sensor gathers power from the radio frequency, thus eliminating the need for batteries or connectors through the glass. We deployed four PORFIDO probes measuring temperatures with the NEMO-KM3Net-Italy Phase 2 tower in April 2013. The four probes are operative and are transmitting temperature data from 3500 m depth.

  9. Oceanographic conditions in the NEMO region during the KM3NeT project (April 2006-May 2009)

    International Nuclear Information System (INIS)

    Sparnocchia, Stefania; Pietro Gasparini, Gian; Schroeder, Katrin; Borghini, Mireno

    2011-01-01

    An intense observational activity was conducted in the NEMO region, in the western Ionian Sea, 40 nm south-east of Capo Passero (Sicily), in the framework of the KM3NeT project. Several oceanographic cruises were performed from 2006 to 2009 and current measurements were carried out. The new data describe the present status of the deep layer and its evolution after a notable change that affected the Eastern Mediterranean water masses and circulation during the 1990s. In particular, they show the presence of a newly formed water mass in the abyssal layer of the Ionian Sea, most likely coming from the Adriatic. Deep currents in the region are quite energetic, as already known, and highly variable both spatially and in strength. They are organized in a cyclonic circuit, with a prevalent north-west direction at the NEMO site.

  10. Comparative study of sea ice dynamics simulations with a Maxwell elasto-brittle rheology and the elastic-viscous-plastic rheology in NEMO-LIM3

    Science.gov (United States)

    Raulier, Jonathan; Dansereau, Véronique; Fichefet, Thierry; Legat, Vincent; Weiss, Jérôme

    2017-04-01

    Sea ice is a highly dynamic environment characterized by a dense mesh of fractures or leads, constantly opening and closing over short time scales. This characteristic geomorphology is linked to the existence of linear kinematic features, quasi-linear patterns emerging from the observed strain-rate field of sea ice. Standard rheologies used in most state-of-the-art sea ice models, like the well-known elastic-viscous-plastic rheology, are thought to misrepresent these linear kinematic features and the observed statistical distribution of deformation rates. Dedicated rheologies built to capture the processes known to be at the origin of lead formation have been developed, but still need to be evaluated at the global scale. One of them, based on a Maxwell elasto-brittle formulation, is being integrated in the NEMO-LIM3 global ocean-sea ice model (www.nemo-ocean.eu; www.elic.ucl.ac.be/lim). In the present study, we compare the results of the sea ice model LIM3 obtained with two different rheologies: the elastic-viscous-plastic rheology commonly used in LIM3 and a Maxwell elasto-brittle rheology. The comparison focuses on the statistical characteristics of the simulated deformation rates and on the ability of the model to reproduce the existence of leads within the ice pack. The impact of the lead representation on fluxes between ice, atmosphere and ocean is also assessed.

  11. Cellular automaton and elastic net for event reconstruction in the NEMO-2 experiment

    International Nuclear Information System (INIS)

    Kisel, I.; Kovalenko, V.; Laplanche, F.

    1997-01-01

    A cellular automaton for track searching combined with an elastic net for charged particle trajectory fitting is presented. The advantages of the methods are: the simplicity of the algorithms, fast and stable convergence to real tracks, and a good reconstruction efficiency. The combination of techniques has been used successfully for event reconstruction on data from the NEMO-2 double-beta (ββ) decay experiments. (orig.)

  12. NEMO-SMO acoustic array: A deep-sea test of a novel acoustic positioning system for a km3-scale underwater neutrino telescope

    Science.gov (United States)

    Viola, S.; Ardid, M.; Bertin, V.; Enzenhöfer, A.; Keller, P.; Lahmann, R.; Larosa, G.; Llorens, C. D.; NEMO Collaboration; SMO Collaboration

    2013-10-01

    Within the activities of the NEMO project, the installation of an 8-floor tower (NEMO-Phase II) at a depth of 3500 m is foreseen in 2012. The tower will be installed about 80 km offshore of Capo Passero, in Sicily. On board the NEMO tower, an array of 18 acoustic sensors will be installed, permitting acoustic detection of biological sources, studies of acoustic neutrino detection and, primarily, acoustic positioning of the underwater structures. For the latter purpose, the sensors register acoustic signals emitted by five acoustic beacons anchored on the sea floor. The data acquisition system of the acoustic sensors is fully integrated with the detector data transport system and is based on an “all data to shore” philosophy. Signals coming from the hydrophones are continuously sampled underwater at 192 kHz/24 bit and transmitted to shore through an electro-optical cable for real-time analysis. A novel technology for underwater GPS time-stamping of the data has been implemented and tested. The operation of the acoustic array will permit long-term tests of the sensor and electronics technologies proposed for the acoustic positioning system of KM3NeT.

  13. Hypohidrotic ectodermal dysplasia and immunodeficiency with coincident NEMO and EDA Mutations

    Directory of Open Access Journals (Sweden)

    Michael D. Keller

    2011-11-01

    Ectodermal dysplasias (ED) are uncommon genetic disorders resulting in abnormalities in ectodermally derived structures. Though many ED-associated genes have been described, the NF-κB Essential Modulator (NEMO), encoded by the IKBKG gene, is unique in that mutations also result in severe humoral and cellular immunologic defects. We describe three unrelated kindreds with defects in both EDA and IKBKG resulting from an X-chromosome crossover. This demonstrates the importance of thorough immunologic consideration of patients with ED even when an EDA etiology is confirmed, and raises the possibility of a specific phenotype arising from coincident mutations in EDA and IKBKG.

  14. NEMO: Extraction and normalization of organization names from PubMed affiliations.

    Science.gov (United States)

    Jonnalagadda, Siddhartha Reddy; Topham, Philip

    2010-10-04

    Today, there are more than 18 million articles related to biomedical research indexed in MEDLINE, and information derived from them could be used effectively to save the great amount of time and resources spent by government agencies in understanding the scientific landscape, including key opinion leaders and centers of excellence. Associating biomedical articles with organization names could significantly benefit the pharmaceutical marketing industry, health care funding agencies and public health officials, and be useful for other scientists in normalizing author names, automatically creating citations, indexing articles and identifying potential resources or collaborators. The large amount of extracted information helps in disambiguating organization names using machine-learning algorithms. We propose NEMO, a system for extracting organization names from the affiliation field and normalizing them to a canonical organization name. Our parsing process involves multi-layered rule matching with multiple dictionaries. The system achieves more than 98% F-score in extracting organization names. Our normalization process involves clustering based on local sequence alignment metrics and local learning based on finding connected components. A high precision was also observed in normalization. NEMO is the missing link in associating each biomedical paper and its authors with an organization name in its canonical form and with the geopolitical location of the organization. This research could potentially help in analyzing large social networks of organizations for landscaping a particular topic, improving the performance of author disambiguation, adding weak links in the co-author network, augmenting NLM's MARS system for correcting errors in OCR output of the affiliation field, and automatically indexing PubMed citations with the normalized organization name and country. Our system is available as a graphical user interface available for download along with this paper.
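
    The normalization step described above, clustering by string similarity and then taking connected components, can be sketched in a few lines. Here difflib's `SequenceMatcher` merely stands in for the paper's local sequence alignment metric, and the names and the 0.8 threshold are hypothetical:

```python
from difflib import SequenceMatcher

# Toy affiliation variants; values and threshold are illustrative only.
names = [
    "Harvard Medical School",
    "Harvard Med. School",
    "Stanford University",
    "Stanford Univ",
    "Mayo Clinic",
]

def similar(a, b, threshold=0.8):
    # difflib ratio as a stand-in for a local-alignment similarity score
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Connected components via union-find: names joined by an edge whenever
# their pairwise similarity clears the threshold end up in one cluster.
parent = list(range(len(names)))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if similar(names[i], names[j]):
            parent[find(i)] = find(j)

clusters = {}
for i, name in enumerate(names):
    clusters.setdefault(find(i), []).append(name)

# Pick a canonical form per cluster -- here simply the longest variant.
canonical = {variant: max(group, key=len)
             for group in clusters.values() for variant in group}
```

    Each raw affiliation string then maps through `canonical` to a single representative, which is the essence of normalizing spelling variants of one organization to one name.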

  15. NEMO binds ubiquitinated TANK-binding kinase 1 (TBK1 to regulate innate immune responses to RNA viruses.

    Directory of Open Access Journals (Sweden)

    Lingyan Wang

    RIG-I-like receptors (RLR) are intracellular sensors utilized by nearly all cell types for recognition of viral RNA, initiation of antiviral defense, and induction of type I interferons (IFN). TBK1 is a critical kinase implicated in RLR-dependent IFN transcription. Posttranslational modification of TBK1 by K63-linked ubiquitin is required for RLR-driven signaling. However, the TBK1 ubiquitin acceptor sites and the function of ubiquitinated TBK1 in the signaling cascade are unknown. We now show that TBK1 is ubiquitinated on residues K69, K154, and K372 in response to infection with RNA virus. The K69 and K154 residues are critical for innate antiviral responses and IFN production. Ubiquitinated TBK1 recruits the downstream adaptor NEMO through ubiquitin-binding domains. The assembly of the NEMO/TBK1 complex on the mitochondrial protein MAVS leads to activation of TBK1 kinase activity and phosphorylation of the transcription factor interferon regulatory factor 3. The combined results refine current views of RLR signaling, define the role of TBK1 polyubiquitination, and detail the mechanisms involved in signalosome assembly.

  16. Study of the background of the neutrinoless double {beta} decay with the detector NEMO 2: contribution arising from the radon diffusion and internal pollution of the source {sup 214}Bi have been estimated; Etude du bruit de fond de la double-desintegration {beta} sans emission de neutrino dans le detecteur NEMO 2: contribution du radon ambiant et mesure de la pollution interne de la source en {sup 214}Bi

    Energy Technology Data Exchange (ETDEWEB)

    Mauger, F.

    1995-02-01

    The NEMO experiment is designed to understand the nature of the neutrino by studying the double beta decay of Mo-100, which is related to the Majorana neutrino effective mass. In this kind of experiment a good understanding of the different sources of background is crucial, as only a few events are expected per year at the required level of sensitivity. In this thesis we present the main theoretical and experimental aspects of the measurement of the neutrinoless double beta decay of Mo-100 with the prototype detector NEMO 2. The goal of this study is to obtain a realistic interpretation of the few events detected at high energy in the two-electron channel as a background to neutrinoless double beta decay. In particular, the contribution arising from Bi-214 has been investigated. These events have been selected and analysed by means of the beta-alpha decays of Bi-214 into Pb-210. The events are characterized by a delayed track in the wire chamber and the corresponding signal is rather clean. The study has demonstrated the diffusion of Rn-222 into the detector, and its contribution to the Bi-214 pollution has been estimated. A measurement of the Bi-214 internal contamination of the source has been made, as well as an estimation of the Bi-214 deposit due to Rn-222. As a result of this study it appears that, under the conditions of the NEMO 2 experiment, the Bi and Rn contributions are of the same order of magnitude as the background induced at high energy by two-neutrino double beta decay. In conclusion, the backgrounds of the neutrinoless double beta decay of Mo-100 are well understood in the NEMO 2 experiment, leading to an extrapolation to the NEMO 3 experiment. (authors)

  17. Gene therapy decreases seizures in a model of Incontinentia pigmenti.

    Science.gov (United States)

    Dogbevia, Godwin K; Töllner, Kathrin; Körbelin, Jakob; Bröer, Sonja; Ridder, Dirk A; Grasshoff, Hanna; Brandt, Claudia; Wenzel, Jan; Straub, Beate K; Trepel, Martin; Löscher, Wolfgang; Schwaninger, Markus

    2017-07-01

    Incontinentia pigmenti (IP) is a genetic disease leading to severe neurological symptoms, such as epileptic seizures, but no specific treatment is available. IP is caused by pathogenic variants that inactivate the Nemo gene. Replacing Nemo through gene therapy might provide therapeutic benefits. In a mouse model of IP, we administered a single intravenous dose of the adeno-associated virus (AAV) vector, AAV-BR1-CAG-NEMO, delivering the Nemo gene to the brain endothelium. Spontaneous epileptic seizures and the integrity of the blood-brain barrier (BBB) were monitored. The endothelium-targeted gene therapy improved the integrity of the BBB. In parallel, it reduced the incidence of seizures and delayed their occurrence. Neonate mice intravenously injected with the AAV-BR1-CAG-NEMO vector developed no hepatocellular carcinoma or other major adverse effects 11 months after vector injection, demonstrating that the vector has a favorable safety profile. The data show that the BBB is a target of antiepileptic treatment and, more specifically, provide evidence for the therapeutic benefit of a brain endothelial-targeted gene therapy in IP. Ann Neurol 2017;82:93-104. © 2017 American Neurological Association.

  18. Results of the BiPo-1 prototype for radiopurity measurements for the SuperNEMO double beta decay source foils

    Energy Technology Data Exchange (ETDEWEB)

    Argyriades, J. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Arnold, R. [IPHC, Universite de Strasbourg, CNRS/IN2P3, F-67037 Strasbourg (France); Augier, C. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Baker, J. [INL, Idaho Falls, ID 83415 (United States); Barabash, A.S. [Institute of Theoretical and Experimental Physics, 117259 Moscow (Russian Federation); Basharina-Freshville, A. [University College London, WC1E 6BT London (United Kingdom); Bongrand, M.; Bourgeois, C.; Breton, D.; Briere, M.; Broudin-Bay, G. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Brudanin, V.B. [Joint Institute for Neear Research, 141980 Dubna (Russian Federation); Caffrey, A.J. [INL, Idaho Falls, ID 83415 (United States); Carcel, S. [Instituto de Fisica Corpuscular, CSIC, Universidad de Valencia, Valencia (Spain); Cebrian, S. [Instituto de Fisica Nuclear y Altas Energias, Universidad de Zaragoza, Zaragoza (Spain); Chapon, A. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Chauveau, E. [CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR 5797, F-33175 Gradignan (France); Universite de Bordeaux, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR 5797, F-33175 Gradignan (France); Dafni, Th. [Instituto de Fisica Nuclear y Altas Energias, Universidad de Zaragoza, Zaragoza (Spain); Diaz, J. [Instituto de Fisica Corpuscular, CSIC, Universidad de Valencia, Valencia (Spain); Durand, D. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France)

    2010-10-01

    The development of BiPo detectors is dedicated to the measurement of extremely high radiopurity in {sup 208}Tl and {sup 214}Bi for the SuperNEMO double beta decay source foils. A modular prototype, called BiPo-1, with 0.8 m{sup 2} of sensitive surface area, has been running in the Modane Underground Laboratory since February 2008. The goal of BiPo-1 is to measure the different components of the background, and in particular the surface radiopurity of the plastic scintillators that make up the detector. The first phase of data collection has been dedicated to the measurement of the radiopurity in {sup 208}Tl. After more than one year of background measurement, a surface activity of the scintillators of A({sup 208}Tl)=1.5 {mu}Bq/m{sup 2} is reported here. Given this level of background, a larger BiPo detector with 12 m{sup 2} of active surface area is able to qualify the radiopurity of the SuperNEMO selenium double beta decay foils with the required sensitivity of A({sup 208}Tl)<2 {mu}Bq/kg (90% C.L.) within a six-month measurement.

  19. Assessment of the sea-ice carbon pump: Insights from a three-dimensional ocean-sea-ice biogeochemical model (NEMO-LIM-PISCES)

    Directory of Open Access Journals (Sweden)

    Sébastien Moreau

    2016-08-01

    The role of sea ice in the carbon cycle is minimally represented in current Earth System Models (ESMs). Among potentially important flaws, mentioned by several authors and generally overlooked during ESM design, is the link between sea-ice growth and melt and oceanic dissolved inorganic carbon (DIC) and total alkalinity (TA). Here we investigate whether this link is indeed an important feature of the marine carbon cycle misrepresented in ESMs. We use an ocean general circulation model (NEMO-LIM-PISCES) with sea-ice and marine carbon cycle components, forced by atmospheric reanalyses, adding a first-order representation of DIC and TA storage in, and release from, sea ice. Our results suggest that DIC rejection during sea-ice growth releases several hundred Tg C yr−1 to the surface ocean, of which < 2% is exported to depth, leading to a notable but weak redistribution of DIC towards deep polar basins. Active carbon processes (mainly CaCO3 precipitation, but also ice-atmosphere CO2 fluxes and net community production) increasing the TA/DIC ratio in sea ice modified ocean-atmosphere CO2 fluxes by a few Tg C yr−1 in the sea-ice zone, with specific hemispheric effects: the DIC content of the Arctic basin decreased while that of the Southern Ocean increased. For the global ocean, DIC content increased by 4 Tg C yr−1, or 2 Pg C after 500 years of model run. The simulated numbers are generally small compared to the present-day global ocean annual CO2 sink (2.6 ± 0.5 Pg C yr−1). However, sea-ice carbon processes seem important at regional scales, as they act significantly on DIC redistribution within and outside polar basins. The efficiency of carbon export to depth depends on the representation of surface-subsurface exchanges and their relationship with sea ice, and could differ substantially if a higher resolution or a different ocean model were used.

  20. Correcting Biases in a lower resolution global circulation model with data assimilation

    Science.gov (United States)

    Canter, Martin; Barth, Alexander

    2016-04-01

    With this work, we aim at developing a new method of bias correction using data assimilation, based on the stochastic forcing of a model. First, through a preliminary run, we estimate the bias of the model and its possible sources. Then, we establish a forcing term which is added directly inside the model's equations. We create an ensemble of runs and treat the forcing term as a control variable during the assimilation of observations. We then use this analysed forcing term to correct the bias of the model. Since the forcing is added inside the model, it acts as a source term, unlike external forcings such as wind. This procedure has been developed and successfully tested in a twin experiment on a Lorenz 95 model. It is currently being applied and tested on the sea-ice ocean model NEMO-LIM, which is used in the PredAntar project. NEMO-LIM is a global, low-resolution (2 degrees) coupled model (hydrodynamic model and sea-ice model) with long time steps allowing simulations over several decades. Due to its low resolution, the model is subject to bias in areas where strong currents are present. We aim at correcting this bias by using perturbed current fields from higher-resolution models and randomly generated perturbations. The random perturbations need to be constrained in order to respect the physical properties of the ocean and not create unwanted phenomena. To construct these random perturbations, we first create a random field with the Diva tool (Data-Interpolating Variational Analysis). Using a cost function, this tool penalizes abrupt variations in the field, while using a custom correlation length. It also decouples disconnected areas based on topography. Then, we filter the field to smooth it and remove small-scale variations. We use this field as a random stream function, and take its derivatives to obtain the zonal and meridional velocity fields.
We also constrain the stream function along the coasts so that the derived velocities do not cross land boundaries.
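
    The stream-function step described above can be illustrated with a minimal sketch (the toy grid and centred finite differences are assumptions for illustration; this is not the PredAntar/Diva code): deriving velocities from a single stream function guarantees a non-divergent perturbation, and holding the stream function constant along a coastline suppresses flow across it.

```python
import math

def velocities_from_streamfunction(psi, dx=1.0, dy=1.0):
    """Toy sketch: derive zonal (u) and meridional (v) velocities from a
    stream function psi[j][i] with centred differences, so that the
    resulting field is non-divergent (u = -dpsi/dy, v = dpsi/dx)."""
    ny, nx = len(psi), len(psi[0])
    u = [[0.0] * nx for _ in range(ny)]
    v = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            u[j][i] = -(psi[j + 1][i] - psi[j - 1][i]) / (2.0 * dy)
            v[j][i] = (psi[j][i + 1] - psi[j][i - 1]) / (2.0 * dx)
    return u, v

# A smooth field standing in for the Diva-generated random stream function.
psi = [[math.sin(0.3 * i) * math.cos(0.2 * j) for i in range(12)]
       for j in range(12)]
u, v = velocities_from_streamfunction(psi)

# The discrete divergence du/dx + dv/dy vanishes at interior points
# by construction, since both velocities derive from the same scalar.
div = ((u[5][6] - u[5][4]) / 2.0) + ((v[6][5] - v[4][5]) / 2.0)
```

    Because the velocities are derivatives of one scalar field, the discrete divergence cancels term by term, which is what keeps the random perturbation dynamically acceptable.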

  1. NEMO 2 - Be aware: Wind and solar are coming

    Energy Technology Data Exchange (ETDEWEB)

    Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland)

    1996-12-31

    Finnish research and development is well placed with respect to new renewable energy technologies, in that there exists considerable expertise in specialized areas. For example, over 20 % of all power transmission equipment and generators used in wind energy systems world-wide are manufactured in Finland, while advanced instruments for monitoring wind speed are also highly regarded internationally. Moreover, unique wind technology for complex windy and freezing conditions has been developed. Finland has a 10 % share in the European photovoltaic market, and has competitive advantages in photovoltaic systems and applications, thin-film solar cells, and automated electronic control systems. A unique solar energy storage system based on hydrogen technology demonstrates skill in overcoming the summer-winter syndrome of large-scale solar energy utilization. The annual turnover of the Finnish solar and wind energy industries has increased from 5 million ECU in 1988 to almost 50 million ECU in 1996. The national RD&D effort from 1988 onwards has played an important role in this context. Most of the research and development into new and renewable energy technologies in Finland has been carried out through the Advanced New Energy Systems and Technologies Research Programme (NEMO2) of Tekes.

  2. South Atlantic meridional transports from NEMO-based simulations and reanalyses

    Science.gov (United States)

    Mignac, Davi; Ferreira, David; Haines, Keith

    2018-02-01

    The meridional heat transport (MHT) of the South Atlantic plays a key role in the global heat budget: it is the only equatorward basin-scale ocean heat transport and it sets the northward direction of the global cross-equatorial transport. Its strength and variability, however, are not well known. The South Atlantic transports are evaluated for four state-of-the-art global ocean reanalyses (ORAs) and two free-running models (FRMs) in the period 1997-2010. All products employ the Nucleus for European Modelling of the Oceans (NEMO) model, and the ORAs share very similar configurations. Very few previous works have looked at ocean circulation patterns in reanalysis products, but here we show that the ORA basin interior transports are consistently improved by the assimilated in situ and satellite observations relative to the FRMs, especially in the Argo period. The ORAs also exhibit systematically higher meridional transports than the FRMs, in closer agreement with observational estimates at 35 and 11° S. However, the impact of data assimilation on the meridional transports still varies greatly among the ORAs, leading to differences of up to ˜ 8 Sv and 0.4 PW in the South Atlantic Meridional Overturning Circulation and the MHTs, respectively. We narrow this down to large inter-product discrepancies in the western boundary currents (WBCs) at both upper and deep levels, which explain up to ˜ 85 % of the inter-product differences in MHT. We show that meridional velocity differences, rather than temperature differences, in the WBCs drive ˜ 83 % of this MHT spread. These findings show that the present ocean observation network and data assimilation schemes can be used to consistently constrain the South Atlantic interior circulation, but not the overturning component, which is dominated by the narrow western boundary currents. This will likely limit the effectiveness of ORA products for climate or decadal prediction studies.

  3. Advancing nanoelectronic device modeling through peta-scale computing and deployment on nanoHUB

    International Nuclear Information System (INIS)

    Haley, Benjamin P; Luisier, Mathieu; Klimeck, Gerhard; Lee, Sunhee; Ryu, Hoon; Bae, Hansang; Saied, Faisal; Clark, Steve

    2009-01-01

    Recent improvements to the existing HPC codes NEMO 3-D and OMEN, combined with access to peta-scale computing resources, have enabled realistic device engineering simulations that were previously infeasible. NEMO 3-D can now simulate 1-billion-atom systems and, using 3D spatial decomposition, scale to 32,768 cores. The simulation time for the band structure of an experimental P-doped Si quantum computing device fell from 40 minutes to 1 minute. OMEN can perform fully quantum mechanical transport calculations for real-world UTB FETs on 147,456 cores in roughly 5 minutes. Both of these tools power simulation engines on nanoHUB, giving the community access to previously unavailable research capabilities.

  4. Finding NEMO (novel electromaterial muscle oscillator): a polypyrrole powered robotic fish with real-time wireless speed and directional control

    International Nuclear Information System (INIS)

    McGovern, Scott; Alici, Gursel; Spinks, Geoffrey; Truong, Van-Tan

    2009-01-01

    This paper presents the development of an autonomously powered and controlled robotic fish that incorporates an active flexural joint tail fin, activated through conducting polymer actuators based on polypyrrole (PPy). The novel electromaterial muscle oscillator (NEMO) tail fin assembly on the fish could be controlled wirelessly in real time by varying the frequency and duty cycle of the voltage signal supplied to the PPy bending-type actuators. Directional control was achieved by altering the duty cycle of the voltage input to the NEMO tail fin, which shifted the axis of oscillation and enabled turning of the robotic fish. At low speeds, the robotic fish had a turning circle as small as 15 cm (or 1.1 body lengths) in radius. The highest speed of the fish robot was estimated to be approximately 33 mm s⁻¹ (or 0.25 body lengths s⁻¹) and was achieved with a flapping frequency of 0.6-0.8 Hz, which also corresponded to the most hydrodynamically efficient mode of tail fin operation. This speed is approximately ten times faster than those of any previously reported artificial-muscle-based device that also offers real-time speed and directional control. This study contributes to previously published studies on bio-inspired functional devices, demonstrating that electroactive polymer actuators can be real alternatives to conventional means of actuation such as electric motors.
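
    The figures quoted above can be cross-checked with simple arithmetic (the body length is inferred here, not stated in the abstract): 33 mm s⁻¹ at 0.25 body lengths s⁻¹ implies a body roughly 132 mm long, consistent with the 15 cm turning radius being about 1.1 body lengths.

```python
top_speed_mm_s = 33.0          # from the abstract
speed_body_lengths_s = 0.25    # from the abstract
turn_radius_mm = 150.0         # 15 cm, from the abstract

body_length_mm = top_speed_mm_s / speed_body_lengths_s   # inferred: 132 mm
turn_radius_in_body_lengths = turn_radius_mm / body_length_mm
```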

  5. Finding NEMO (novel electromaterial muscle oscillator): a polypyrrole powered robotic fish with real-time wireless speed and directional control

    Science.gov (United States)

    McGovern, Scott; Alici, Gursel; Truong, Van-Tan; Spinks, Geoffrey

    2009-09-01

    This paper presents the development of an autonomously powered and controlled robotic fish that incorporates an active flexural joint tail fin, activated through conducting polymer actuators based on polypyrrole (PPy). The novel electromaterial muscle oscillator (NEMO) tail fin assembly on the fish could be controlled wirelessly in real time by varying the frequency and duty cycle of the voltage signal supplied to the PPy bending-type actuators. Directional control was achieved by altering the duty cycle of the voltage input to the NEMO tail fin, which shifted the axis of oscillation and enabled turning of the robotic fish. At low speeds, the robotic fish had a turning circle as small as 15 cm (or 1.1 body lengths) in radius. The highest speed of the fish robot was estimated to be approximately 33 mm s⁻¹ (or 0.25 body lengths s⁻¹) and was achieved with a flapping frequency of 0.6-0.8 Hz, which also corresponded to the most hydrodynamically efficient mode of tail fin operation. This speed is approximately ten times faster than those of any previously reported artificial-muscle-based device that also offers real-time speed and directional control. This study contributes to previously published studies on bio-inspired functional devices, demonstrating that electroactive polymer actuators can be real alternatives to conventional means of actuation such as electric motors.

  6. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims with regard to Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: the traditional ones start with a model editor and a model repository, while the most advanced ones add a code generator (possibly driven by a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  7. Failure of the Nemo trial: bumetanide is a promising agent to treat many brain disorders but not newborn seizures

    Directory of Open Access Journals (Sweden)

    Yehezkel eBen-Ari

    2016-04-01

    The diuretic bumetanide failed to treat acute seizures due to hypoxic-ischemic encephalopathy (HIE) in newborn babies and was associated with hearing loss (NEMO trial; 1). On the other hand, clinical and experimental observations suggest that the diuretic might provide novel therapy for many brain disorders, including autistic spectrum disorder, schizophrenia, Rett syndrome and Parkinson disease. Here, we discuss the differences between the pathophysiology of severe recurrent seizures in neonates and that of neurological and psychiatric disorders, stressing the uniqueness of severe seizures in newborns in comparison to other disorders.

  8. AMM15: a new high-resolution NEMO configuration for operational simulation of the European north-west shelf

    Science.gov (United States)

    Graham, Jennifer A.; O'Dea, Enda; Holt, Jason; Polton, Jeff; Hewitt, Helene T.; Furner, Rachel; Guihou, Karen; Brereton, Ashley; Arnold, Alex; Wakelin, Sarah; Castillo Sanchez, Juan Manuel; Mayorga Adame, C. Gabriela

    2018-02-01

    This paper describes the next-generation ocean forecast model for the European north-west shelf, which will become the basis of operational forecasts in 2018. This new system will provide a step change in resolution and therefore our ability to represent small-scale processes. The new model has a resolution of 1.5 km compared with a grid spacing of 7 km in the current operational system. AMM15 (Atlantic Margin Model, 1.5 km) is introduced as a new regional configuration of NEMO v3.6. Here we describe the technical details behind this configuration, with modifications appropriate for the new high-resolution domain. Results from a 30-year non-assimilative run using the AMM15 domain demonstrate the ability of this model to represent the mean state and variability of the region. Overall, there is an improvement in the representation of the mean state across the region, suggesting similar improvements may be seen in the future operational system. However, the reduction in seasonal bias is greater off-shelf than on-shelf. In the North Sea, biases are largely unchanged. Since there has been no change to the vertical resolution or parameterization schemes, performance improvements are not expected in regions where stratification is dominated by vertical processes rather than advection. This highlights the fact that increased horizontal resolution will not lead to domain-wide improvements. Further work is needed to target bias reduction across the north-west shelf region.

  9. Development from the seafloor to the sea surface of the cabled NEMO-SN1 observatory in the Western Ionian Sea

    Science.gov (United States)

    Sparnocchia, Stefania; Beranzoli, Laura; Borghini, Mireno; Durante, Sara; Favali, Paolo; Giovanetti, Gabriele; Italiano, Francesco; Marinaro, Giuditta; Meccia, Virna; Papaleo, Riccardo; Riccobene, Giorgio; Schroeder, Katrin

    2015-04-01

    A prototype cabled deep-sea observatory has been operating in real time since 2005 in Southern Italy (East Sicily, 37°30' N - 15°06' E), at 2100 m water depth, 25 km from the harbor of the city of Catania. It is the first-established real-time node of the "European Multidisciplinary Seafloor and water column Observatory" (EMSO, http://www.emso-eu.org), a research infrastructure of the Environment Sector of ESFRI. In the present configuration it consists of two components: the multi-parametric station NEMO-SN1 (TSN branch), equipped with geophysical and environmental sensors for measurements at the seafloor, and the NEMO-OνDE station (TSS branch), equipped with 4 wideband hydrophones. A 28 km long electro-optical cable connects the observatory to a shore laboratory in the Catania harbor, which hosts the data acquisition system and supplies power and data transmission to the underwater instrumentation. The NEMO-SN1 observatory is located in an area particularly suited to multidisciplinary studies. The site is one of the most seismically active areas of the Mediterranean (some of the strongest earthquakes occurred in 1169, 1693 and 1908, also causing very intense tsunami waves) and is close to Mount Etna, one of the largest and most active volcanoes in Europe. The deployment area is also a key site for monitoring deep-water dynamics in the Ionian Sea, connecting the Levantine basin to the southern Adriatic basin, where intermediate and deep waters are formed, and finally to the western Mediterranean Sea via the Strait of Sicily. The observatory is being further developed under EMSO MedIT (http://www.emso-medit.it/en/), a structural enhancement project contributing to the consolidation and enhancement of the European research infrastructure EMSO in the Italian Convergence Regions. In this framework, a new junction box will be connected to the TSN branch and will provide wired and wireless (acoustic) connections for seafloor platforms and moorings. This will allow the observing capabilities of the infrastructure to be extended.

  10. Hepatic tissue environment in NEMO-deficient mice critically regulates positive selection of donor cells after hepatocyte transplantation.

    Directory of Open Access Journals (Sweden)

    Michaela Kaldenbach

    BACKGROUND: Hepatocyte transplantation (HT) is a promising alternative treatment strategy for end-stage liver diseases compared with orthotopic liver transplantation. A limitation of this approach is the low engraftment of donor cells. The deletion of the IκB kinase regulatory subunit IKKγ/NEMO in hepatocytes prevents nuclear factor-κB (NF-κB) activation and triggers spontaneous liver apoptosis, chronic hepatitis and the development of liver fibrosis and hepatocellular carcinoma. We hypothesized that NEMOΔhepa mice may therefore serve as an experimental model to study HT. METHODS: Pre-conditioned NEMOΔhepa mice were transplanted with donor hepatocytes from wildtype (WT) mice and from mice deficient for the pro-apoptotic mediator Caspase-8 (Casp8Δhepa). RESULTS: Transplantation of isolated WT hepatocytes into pre-conditioned NEMOΔhepa mice resulted in a 6-7-fold increase of donor cells 12 weeks after HT, while WT recipients showed no liver repopulation. The use of apoptosis-resistant Casp8Δhepa-derived donor cells further enhanced the selection 3-fold after 12 weeks, and up to 10-fold after 52 weeks, compared with WT donors. While analysis of NEMOΔhepa mice revealed strong liver injury, HT-recipient NEMOΔhepa mice showed improved liver morphology and a decrease in serum transaminases. Concomitant with these findings, histological examination revealed an improved liver tissue architecture associated with significantly lower levels of apoptosis, decreased proliferation and a smaller amount of liver fibrogenesis. Altogether, our data clearly support the therapeutic benefit of the HT procedure in NEMOΔhepa mice. CONCLUSION: This study demonstrates the feasibility of the NEMOΔhepa mouse as an in vivo tool to study liver repopulation after HT. The improvement of the characteristic phenotype of chronic liver injury in NEMOΔhepa mice after HT suggests the therapeutic potential of HT in liver diseases with a chronic inflammatory phenotype.

  11. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models, facilitating the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely: (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input; (2) the iModel Tool, a platform for users to upload their own models to compose; and (3) the SimCom Tool, which provides a side-by-side comparison of the models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students and anyone else who wants to learn about systems biology will benefit from the described functionalities. The SBML Test Suite models are a good starting point for beginners and, for more advanced purposes, users are able to access and employ models from the BioModels Database.
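
    The merging step at the heart of model composition can be sketched with the standard library alone (a toy example with hand-written SBML-like fragments; the real PathCase-SB tool operates on full SBML documents and resolves far more than species lists):

```python
import xml.etree.ElementTree as ET

NS = {"sbml": "http://www.sbml.org/sbml/level3/version1/core"}

MODEL_A = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core">
  <model id="glycolysis_fragment">
    <listOfSpecies><species id="glucose"/><species id="ATP"/></listOfSpecies>
  </model>
</sbml>"""

MODEL_B = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core">
  <model id="tca_fragment">
    <listOfSpecies><species id="ATP"/><species id="pyruvate"/></listOfSpecies>
  </model>
</sbml>"""

def species_ids(sbml_text):
    """Collect species ids from one document, in document order."""
    root = ET.fromstring(sbml_text)
    return [s.get("id") for s in root.findall(".//sbml:species", NS)]

def merged_species(a, b):
    """Union of species, keeping first-seen order; shared species
    (here 'ATP') appear only once, which is the crux of composition."""
    seen, out = set(), []
    for sid in species_ids(a) + species_ids(b):
        if sid not in seen:
            seen.add(sid)
            out.append(sid)
    return out

merged = merged_species(MODEL_A, MODEL_B)   # ['glucose', 'ATP', 'pyruvate']
```

    Identifying which elements the two source models share, and keeping a single copy of each, is exactly the bookkeeping that a composition interface automates across species, reactions, and parameters.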

  12. Design and assembly of the optical modules for phase-2 of the NEMO project

    Energy Technology Data Exchange (ETDEWEB)

    Leonora, E., E-mail: emanuele.leonora@ct.infn.it; Aiello, S.

    2013-10-11

    The NEMO collaboration has undertaken a Phase-2 project, which aims at the realization and installation of a new infrastructure at the Capo Passero (Italy) deep-sea site at a depth of 3500 m. With this objective in mind, a fully equipped eight-storey tower, hosting two optical modules at each end of every storey, is under construction. Following a well-established procedure, 32 optical modules have been assembled. The optical module consists of a large-area photomultiplier tube enclosed in a pressure-resistant glass sphere with a diameter of 13 in. The photomultiplier is an R7081 type, produced by Hamamatsu, with a photocathode diameter of 10 in. and 10 dynodes. Mechanical and optical contact between the front of the photomultiplier tube and the glass surface is ensured by a bi-component optical silicone gel. A mu-metal cage shields the photomultiplier against the influence of the Earth's magnetic field.

  13. Measurement of the two neutrino double beta decay half-life of Zr-96 with the NEMO-3 detector

    Energy Technology Data Exchange (ETDEWEB)

    Argyriades, J. [LAL, Universite Paris-Sud 11, CNRS/IN2P3, F-91405 Orsay (France); Arnold, R. [IPHC, Universite de Strasbourg, CNRS/IN2P3, F-67037 Strasbourg (France); Augier, C. [LAL, Universite Paris-Sud 11, CNRS/IN2P3, F-91405 Orsay (France); Baker, J. [INL, Idaho National Laboratory, 83415 Idaho Falls (United States); Barabash, A.S. [ITEP, Institute of Theoretical and Experimental Physics, 117259 Moscow (Russian Federation); Basharina-Freshville, A. [University College London, WC1E 6BT London (United Kingdom); Bongrand, M. [LAL, Universite Paris-Sud 11, CNRS/IN2P3, F-91405 Orsay (France); Broudin-Bay, G. [Universite Bordeaux, CENBG, UMR 5797, F-33175 Gradignan (France); CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR5797, F-33175 Gradignan (France); Brudanin, V. [JINR, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation); Caffrey, A.J. [INL, Idaho National Laboratory, 83415 Idaho Falls (United States); Chapon, A. [LPC, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Chauveau, E. [Universite Bordeaux, CENBG, UMR 5797, F-33175 Gradignan (France); CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR5797, F-33175 Gradignan (France); Daraktchieva, Z. [University College London, WC1E 6BT London (United Kingdom); Durand, D. [LPC, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Egorov, V. [JINR, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation); Fatemi-Ghomi, N. [University of Manchester, M13 9PL Manchester (United Kingdom); Flack, R. [University College London, WC1E 6BT London (United Kingdom); Guillon, B. [LPC, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Hubert, Ph. [Universite Bordeaux, CENBG, UMR 5797, F-33175 Gradignan (France); CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, UMR5797, F-33175 Gradignan (France); Jullian, S. [LAL, Universite Paris-Sud 11, CNRS/IN2P3, F-91405 Orsay (France)

    2010-12-08

    Using 9.4 g of the isotope ⁹⁶Zr and 1221 days of data from the NEMO-3 detector, corresponding to an exposure of 0.031 kg·y, the measured 2νββ decay half-life is T1/2(2ν) = [2.35 ± 0.14(stat) ± 0.16(syst)] × 10¹⁹ yr. Different characteristics of the final-state electrons have been studied, such as the energy sum, individual electron energy, and angular distribution. The 2ν nuclear matrix element is extracted using the measured 2νββ half-life and is M(2ν) = 0.049 ± 0.002. Constraints on 0νββ decay have also been set.
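
    As a rough consistency check (an illustrative back-of-the-envelope sketch, not part of the published analysis; the atomic mass and year length are assumed values), the quoted source mass, live time and half-life imply the number of 2νββ decays that occurred in the foil:

```python
import math

N_A = 6.02214e23        # Avogadro's number, mol^-1
MASS_G = 9.4            # grams of 96Zr (from the abstract)
MOLAR_MASS = 96.0       # g/mol, approximate atomic mass of 96Zr (assumption)
LIVE_DAYS = 1221.0      # live time (from the abstract)
T_HALF = 2.35e19        # measured 2nubb half-life, years (from the abstract)

atoms = MASS_G / MOLAR_MASS * N_A
years = LIVE_DAYS / 365.25
decay_constant = math.log(2) / T_HALF       # per year
n_decays = atoms * decay_constant * years   # lambda*t << 1, so N*lambda*t
# → a few thousand decays over the full run
```

    About 6 × 10³ decays in 1221 days is the right order of magnitude for the event sample underlying such a measurement, before detection efficiency is applied.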

  14. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during...... integration of formerly disconnected tools improves tool usability as well as decision maker productivity....

  15. Toward variational assimilation of SARAL/Altika altimeter data in a North Atlantic circulation model at eddy-permitting resolution: assessment of a NEMO-based 4D-VAR system

    Science.gov (United States)

    Bouttier, Pierre-Antoine; Brankart, Jean-Michel; Candille, Guillem; Vidard, Arthur; Blayo, Eric; Verron, Jacques; Brasseur, Pierre

    2015-04-01

    In this project, the response of a variational data assimilation system based on NEMO and its tangent linear and adjoint models is investigated using a 4DVAR algorithm in a North Atlantic model at eddy-permitting resolution. The assimilated data consist of the Jason-2 and SARAL/AltiKa datasets collected during the 2013-2014 period. The main objective is to explore the robustness of the 4DVAR algorithm in the context of a realistic turbulent oceanic circulation at mid-latitude constrained by multi-satellite altimetry missions. This work relies on two previous studies. First, a study with similar objectives was performed on an academic double-gyre turbulent model with synthetic SARAL/AltiKa data, using the same DA experimental framework. Its main goal was to investigate the impact of turbulence on the performance of variational DA methods. The comparison with this previous work will bring to light the methodological and physical issues encountered by variational DA algorithms in a realistic context at a similar, eddy-permitting spatial resolution. We have also demonstrated how a dataset mimicking future SWOT observations improves incremental 4DVAR performance at eddy-permitting resolution. Then, in the context of the OSTST and FP7 SANGOMA projects, an ensemble DA experiment based on the same model and observational datasets has been realized (see poster by Brasseur et al.). This work offers the opportunity to compare the efficiency, pros and cons of both DA methods in the context of Ka-band altimetric data, at the spatial resolution commonly used today for research and operational applications. In this poster we will present the validation plan proposed to evaluate the skill of the variational experiment vs. ensemble assimilation experiments covering the same period, using independent observations (e.g. from the CryoSat-2 mission).
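
    The 4DVAR machinery being exercised here can be reduced to a scalar toy (the numbers and the linear model x_{t+1} = a·x_t are illustrative assumptions, not the NEMO configuration): the cost function penalizes departure from the background and misfit to observations along the model trajectory, and its gradient, supplied by the adjoint, drives the minimization.

```python
# Toy scalar 4DVAR: find the initial condition x0 minimizing
#   J(x0) = (x0 - xb)^2 / (2B) + sum_t (a^t * x0 - y_t)^2 / (2R)
a, B, R = 0.9, 1.0, 0.5           # model factor, background/obs error variances
xb = 1.0                          # background (first-guess) initial condition
y = {1: 0.95, 2: 0.80, 3: 0.75}   # "observations" at times 1..3

def cost(x0):
    j = (x0 - xb) ** 2 / (2 * B)
    for t, yt in y.items():
        j += (a ** t * x0 - yt) ** 2 / (2 * R)
    return j

def grad(x0):
    # The adjoint of the linear model a^t is just multiplication by a^t.
    g = (x0 - xb) / B
    for t, yt in y.items():
        g += a ** t * (a ** t * x0 - yt) / R
    return g

# The cost is quadratic, so the minimum is where the gradient vanishes.
num = xb / B + sum(a ** t * yt / R for t, yt in y.items())
den = 1.0 / B + sum(a ** (2 * t) / R for t in y)
x0_analysis = num / den
```

    In the real system the control variable is a full model state, the model is nonlinear, and the minimization is iterative, but the structure of the cost function and the role of the adjoint are the same.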

  16. Aluminium in an ocean general circulation model compared with the West Atlantic Geotraces cruises

    CSIR Research Space (South Africa)

    Van Hulten, M

    2013-10-01

    A model of aluminium has been developed and implemented in an Ocean General Circulation Model (NEMO-PISCES). In the model, aluminium enters the ocean by means of dust deposition. The internal oceanic processes are described by advection, mixing...

  17. Evaluation of an operational ocean model configuration at 1/12° spatial resolution for the Indonesian seas (NEMO2.3/INDO12) - Part 1: Ocean physics

    Science.gov (United States)

    Tranchant, Benoît; Reffray, Guillaume; Greiner, Eric; Nugroho, Dwiyoga; Koch-Larrouy, Ariane; Gaspar, Philippe

    2016-03-01

    INDO12 is a 1/12° regional version of the NEMO physical ocean model covering the whole Indonesian EEZ (Exclusive Economic Zone). It has been developed and is now running every week in the framework of the INDESO (Infrastructure Development of Space Oceanography) project implemented by the Indonesian Ministry of Marine Affairs and Fisheries. The initial hydrographic conditions as well as open-boundary conditions are derived from the operational global ocean forecasting system at 1/4° operated by Mercator Océan. Atmospheric forcing fields (3-hourly ECMWF (European Centre for Medium-Range Weather Forecasts) analyses) are used to force the regional model. INDO12 is also forced by tidal currents and elevations, and by the inverse barometer effect. The turbulent mixing induced by internal tides is taken into account through a specific parameterisation. In this study we evaluate the model skill through comparisons with various data sets, including outputs of the parent model, climatologies, in situ temperature and salinity measurements, and satellite data. The assessment of the biogeochemical model results is presented in a companion paper (Gutknecht et al., 2015). The simulated and altimeter-derived eddy kinetic energy fields display similar patterns and confirm that tides are a dominant forcing in the area. The volume transport of the Indonesian throughflow (ITF) is in good agreement with the INSTANT estimates, while the transport through Luzon Strait is, on average, westward but probably too weak. Compared to satellite data, surface salinity and temperature fields display marked biases in the South China Sea. Significant water mass transformation occurs along the main routes of the ITF and compares well with observations. Vertical mixing is able to modify the South and North Pacific subtropical water-salinity maximum, as seen in T-S diagrams. In spite of a few weaknesses, INDO12 proves able to provide a very realistic simulation of the ocean circulation and water masses.

  18. Isolating Tracers of Phytoplankton with Allometric Zooplankton (TOPAZ) from Modular Ocean Model (MOM5) to Couple it with a Global Ocean Model

    Science.gov (United States)

    Jung, H. C.; Moon, B. K.; Wie, J.; Park, H. S.; Kim, K. Y.; Lee, J.; Byun, Y. H.

    2017-12-01

    This research is motivated by the need to develop a new coupled ocean-biogeochemistry model, a key tool for climate projections. The Modular Ocean Model (MOM5) is a global ocean/ice model developed by the Geophysical Fluid Dynamics Laboratory (GFDL) in the US, and it incorporates Tracers of Phytoplankton with Allometric Zooplankton (TOPAZ), which simulates the marine biota associated with carbon cycles. We isolated TOPAZ from MOM5 into a stand-alone version (TOPAZ-SA) and set it up to receive the required initial data and ocean physical fields. Its reliability was then verified by comparing the simulation results of TOPAZ-SA with those of MOM5/TOPAZ. This stand-alone version of TOPAZ is to be coupled to the Nucleus for European Modelling of the Ocean (NEMO). Here we present the preliminary results. Acknowledgements: This research was supported by the project "Research and Development for KMA Weather, Climate, and Earth system Services" (NIMS-2016-3100) of the National Institute of Meteorological Sciences/Korea Meteorological Administration.
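
    The decoupling pattern described above (a biogeochemical component that no longer calls its host ocean model directly, but is handed the physical fields each step) can be sketched as follows; the class, field names and growth law are hypothetical, for illustration only:

```python
class StandaloneTracerModel:
    """Toy stand-alone biogeochemistry component: the host ocean model
    (MOM5, NEMO, ...) pushes physical fields in through set_physics();
    the component then steps only its own tracers. The names and the
    temperature-dependent growth law are invented, not TOPAZ's equations."""

    def __init__(self, phytoplankton=1.0):
        self.phytoplankton = phytoplankton
        self.temperature = None

    def set_physics(self, temperature):
        """Called by the host coupler every time step."""
        self.temperature = temperature

    def step(self, dt=1.0):
        if self.temperature is None:
            raise RuntimeError("physics not provided by host model")
        growth = 0.01 * self.temperature   # illustrative rate
        self.phytoplankton *= (1.0 + growth * dt)
        return self.phytoplankton

bgc = StandaloneTracerModel()
for temp in (10.0, 12.0, 11.0):   # fields that would come from the ocean model
    bgc.set_physics(temp)
    bgc.step()
```

    Once the component depends only on this narrow interface, swapping the host from MOM5 to NEMO reduces to wiring the same fields into the same entry points.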

  19. Graph and model transformation tools for model migration : empirical results from the transformation tool contest

    NARCIS (Netherlands)

    Rose, L.M.; Herrmannsdoerfer, M.; Mazanek, S.; Van Gorp, P.M.E.; Buchwald, S.; Horn, T.; Kalnina, E.; Koch, A.; Lano, K.; Schätz, B.; Wimmer, M.

    2014-01-01

    We describe the results of the Transformation Tool Contest 2010 workshop, in which nine graph and model transformation tools were compared for specifying model migration. The model migration problem—migration of UML activity diagrams from version 1.4 to version 2.2—is non-trivial and practically relevant.

  20. Captain Nemo/Lt-General Pitt Rivers and Cleopatra’s Needle — A Story of Flagships

    Directory of Open Access Journals (Sweden)

    Christopher Evans

    2005-11-01

    Full Text Available Recently re-reading Verne’s 20,000 Leagues Beneath the Sea for our children I was struck by the marked similarities between the novel’s elusive protagonist, Captain Nemo, and the renowned later 19th century British archaeologist, Lt.-General Pitt Rivers. Could they have been the same person? How could something so seemingly blatant have gone unnoticed? These questions are, of course, only raised in a spirit of academic tongue-in-cheek. Yet, in an ethos of ‘learning through amusement’ (itself directly relevant to the themes of this study), exploring the parallels between these two ‘heroic’ individuals provides insights into the nature of 19th century science, Victorian edification and disciplinary institutionalisation (e.g. Levine 1986). This eclectic contribution will, moreover, be introduced with the third component of its headline title – Cleopatra’s Needle – as this provides an appropriately quasi-nautical parable on the project of 19th century archaeology and the problem of ‘deep time’ (Murray 1993).

  1. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    Science.gov (United States)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
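
    The model above estimates tool wear as a function of cutting time from mechanistic wear mechanisms. As a much simpler stand-in, the sketch below uses the classical Taylor tool-life relation v·T^n = C together with a crude linear growth of wear-land width VB up to a wear criterion; the constants n, C and VB_max are illustrative assumptions, not values from the paper:

```python
def taylor_tool_life(v_c, n=0.25, C=350.0):
    """Taylor tool-life relation v_c * T**n = C  ->  T = (C / v_c)**(1/n).
    n and C are illustrative carbide-on-steel constants (assumptions)."""
    return (C / v_c) ** (1.0 / n)

def wear_land_width(t, T_life, VB_max=0.3):
    """Crude linear flank-wear growth, reaching the VB_max criterion (mm)
    at the end of tool life; a simplification of the mechanistic model."""
    return VB_max * min(t / T_life, 1.0)

T = taylor_tool_life(200.0)        # tool life in minutes at v_c = 200 m/min
vb = wear_land_width(0.5 * T, T)   # wear-land width (mm) after half the tool life
```

    The paper's model goes further by resolving the worn surface shape and the separate contributions of adhesion, abrasion and fracture, which a single lumped relation like Taylor's cannot capture.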

  2. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    International Nuclear Information System (INIS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-01-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools

  3. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the model...

  4. Thermodynamic and dynamic ice thickness contributions in the Canadian Arctic Archipelago in NEMO-LIM2 numerical simulations

    Science.gov (United States)

    Hu, Xianmin; Sun, Jingfan; Chan, Ting On; Myers, Paul G.

    2018-04-01

    Sea ice thickness evolution within the Canadian Arctic Archipelago (CAA) is of great interest to science, as well as local communities and their economy. In this study, based on the NEMO numerical framework including the LIM2 sea ice module, simulations at both 1/4 and 1/12° horizontal resolution were conducted from 2002 to 2016. The model captures well the general spatial distribution of ice thickness in the CAA region, with very thick sea ice (˜ 4 m and thicker) in the northern CAA, thick sea ice (2.5 to 3 m) in the west-central Parry Channel and M'Clintock Channel, and thin (< 2 m) ice elsewhere. Simulated thickness agrees with ice thickness program data at first-year landfast ice sites, except at the northern sites with a high concentration of old ice. At 1/4 to 1/12° scale, model resolution does not play a significant role in the sea ice simulation except to improve local dynamics because of better coastline representation. Sea ice growth is decomposed into thermodynamic and dynamic (including all non-thermodynamic processes in the model) contributions to study the ice thickness evolution. A relatively smaller thermodynamic contribution to ice growth between December and the following April is found in the thick and very thick ice regions, with larger contributions in the thin ice-covered region. No significant trend in winter maximum ice volume is found in the northern CAA and Baffin Bay, while a decline (r2 ≈ 0.6, p < 0.01) is simulated in the Parry Channel region. The two main contributors (thermodynamic growth and lateral transport) have high interannual variabilities which largely balance each other, so that maximum ice volume can vary interannually by ±12 % in the northern CAA, ±15 % in Parry Channel, and ±9 % in Baffin Bay. Further quantitative evaluation is required.
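
    The decomposition described above amounts to treating the dynamic contribution as the residual of the thickness budget once thermodynamic growth is accounted for. A toy sketch with made-up monthly values (not model output):

```python
def dynamic_residual(total_change, thermo_growth):
    """Non-thermodynamic (dynamic) contribution as the residual of the
    thickness budget at each time step: dynamic = total - thermodynamic."""
    return [t - g for t, g in zip(total_change, thermo_growth)]

# toy monthly ice-thickness changes (m), December through April
total = [0.30, 0.25, 0.20, 0.15, 0.10]    # net change actually realised
thermo = [0.35, 0.30, 0.25, 0.18, 0.12]   # thermodynamic growth alone
dynamic = dynamic_residual(total, thermo)  # negative = net export/ridging loss
winter_dynamic = sum(dynamic)              # seasonal dynamic contribution
```

    In the study's budget the two terms largely balance, which is why the residual bookkeeping matters: a small net change can hide large opposing thermodynamic and dynamic contributions.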

  5. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  6. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  7. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  8. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  9. Mixed layer depth calculation in deep convection regions in ocean numerical models

    Science.gov (United States)

    Courtois, Peggy; Hu, Xianmin; Pennelly, Clark; Spence, Paul; Myers, Paul G.

    2017-12-01

    Mixed Layer Depths (MLDs) diagnosed by conventional numerical models are generally based on a density difference with the surface (e.g., 0.01 kg m^-3). However, temperature-salinity compensation and the lack of vertical resolution contribute to over-estimated MLDs, especially in regions of deep convection. In the present work, we examined the diagnostic MLD, associated with the deep convection of the Labrador Sea Water (LSW), calculated with a simple density-difference criterion. The over-estimated MLD led us to develop a new tool, based on an observational approach, to recalculate MLD from model output. We used an eddy-permitting, 1/12° regional configuration of the Nucleus for European Modelling of the Ocean (NEMO) to test and discuss our newly defined MLD. We compared our new MLD with that from observations, and we showed a major improvement with our new algorithm. To show the new MLD is not dependent on a single model and its horizontal resolution, we extended our analysis to include 1/4° eddy-permitting simulations and simulations using the Modular Ocean Model (MOM).
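
    The simple density-difference criterion that the study starts from can be sketched as follows (toy profile values; the 0.01 kg m^-3 threshold is the one quoted above):

```python
def mixed_layer_depth(depth, sigma, threshold=0.01):
    """Depth at which potential density first exceeds its surface value
    by `threshold` (kg/m^3), with linear interpolation between the
    bracketing model levels. This is the conventional criterion the
    study identifies as prone to over-estimation in convective regions."""
    excess = [s - sigma[0] for s in sigma]
    for i in range(1, len(depth)):
        if excess[i] > threshold:
            d0, d1 = depth[i - 1], depth[i]
            e0, e1 = excess[i - 1], excess[i]
            return d0 + (threshold - e0) * (d1 - d0) / (e1 - e0)
    return depth[-1]  # threshold never exceeded: mixed to the bottom level

# toy density profile: depths (m) and potential densities (kg/m^3)
depth = [0.0, 10.0, 20.0, 50.0, 100.0]
sigma = [27.000, 27.002, 27.005, 27.030, 27.200]
mld = mixed_layer_depth(depth, sigma)  # ~26 m for this profile
```

    With coarse vertical levels and nearly density-compensated T-S profiles, the interpolated crossing point can land far deeper than the physically mixed layer, which is the failure mode motivating the study's observational-style recalculation.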

  10. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple teeth tools minimizing the tool wear and the feed force, (2) the optimization of tool coating and (3) the development of a phenomenological model between the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.

  11. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved, is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels......, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found...

  12. Measurement of the 2νββ decay of ^100Mo to the excited 0^+_1 state in the NEMO3 experiment

    International Nuclear Information System (INIS)

    Vala, L.

    2003-09-01

    The NEMO3 detector was designed for the study of double beta decay and in particular to search for the neutrinoless double beta decay process (0νββ). The intended sensitivity in terms of a half-life limit for the 0νββ decay is of the order of 10^25 y, which corresponds to an effective neutrino mass m_ν at the level of (0.3 - 0.1) eV. The 0νββ process is today the most promising test of the Majorana nature of the neutrino. The detector was constructed in the Modane Underground Laboratory (LSM) in France by an international collaboration including France, Russia, the Czech Republic, the USA, the UK, Finland, and Japan. The experiment has been taking data since May 2002. The quantity of ^100Mo in the detector (7 kg) allows an efficient measurement of the two-neutrino double beta decay (2νββ) of ^100Mo to the excited 0^+_1 state (eeNγ channel). Monte Carlo simulations of the effect and of all the relevant sources of background have been produced in order to define a set of appropriate selection criteria. Both Monte Carlo simulations and special runs with sources of ^208Tl and ^214Bi showed that the only significant background in the eeNγ channel comes from radon that penetrated inside the wire chamber of NEMO3. The experimental data acquired from May 2002 to May 2003 have been analysed in order to determine the signal from the 2νββ decay of ^100Mo to the excited 0^+_1 state and the corresponding background level. The physical result, obtained at the level of four standard deviations, is given in the form of an interval of half-life values at 95% confidence level: [5.84×10^20, 2.26×10^21] y for method A and [5.83×10^20, 1.71×10^21] y for method B. (author)

  13. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, the ... model transformation tool sharing the model editor’s benefits, transparently...

  14. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and would slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and cost. To do so, we must review the processing sequence and add several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  15. Approximate Stokes Drift Profiles and their use in Ocean Modelling

    Science.gov (United States)

    Breivik, Oyvind; Bidlot, Jea-Raymond; Janssen, Peter A. E. M.; Mogensen, Kristian

    2016-04-01

    Deep-water approximations to the Stokes drift velocity profile are explored as alternatives to the monochromatic profile. The alternative profiles investigated rely on the same two quantities required for the monochromatic profile, viz the Stokes transport and the surface Stokes drift velocity. Comparisons against parametric spectra and profiles under wave spectra from the ERA-Interim reanalysis and buoy observations reveal much better agreement than the monochromatic profile even for complex sea states. That the profiles give a closer match and a more correct shear has implications for ocean circulation models, since the Coriolis-Stokes force depends on the magnitude and direction of the Stokes drift profile and Langmuir turbulence parameterizations depend sensitively on the shear of the profile. Of the two Stokes drift profiles explored here, the profile based on the Phillips spectrum is by far the best. In particular, the shear near the surface is almost identical to that influenced by the f^-5 tail of spectral wave models. The NEMO general circulation ocean model was recently extended to incorporate the Stokes-Coriolis force along with two other wave-related effects. The ECMWF coupled atmosphere-wave-ocean ensemble forecast system now includes these wave effects in the ocean model component (NEMO).
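
    The monochromatic baseline profile that the alternatives are compared against is fully determined by the two quantities named above: the surface Stokes drift u0 and the Stokes transport V, via u_s(z) = u0·exp(2kz) with the wavenumber k fixed by V = u0/(2k). A minimal sketch with toy values (m/s and m^2/s, not values from the paper):

```python
import math

def monochromatic_stokes(z, u0, V):
    """Monochromatic Stokes drift profile u_s(z) = u0 * exp(2 k z), z <= 0,
    with k chosen so the depth-integrated transport equals V:
    V = u0 / (2 k)  ->  k = u0 / (2 V)."""
    k = u0 / (2.0 * V)
    return u0 * math.exp(2.0 * k * z)

u_surface = monochromatic_stokes(0.0, 0.13, 0.015)    # surface value = u0
u_10m = monochromatic_stokes(-10.0, 0.13, 0.015)      # decays rapidly with depth
```

    The Phillips-spectrum profile favoured in the study keeps the same two inputs but distributes the shear more realistically near the surface; the exponential above over-concentrates it at one e-folding scale.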

  16. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    Science.gov (United States)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool that is being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by substituting outputs from another model for the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
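
    The simplest whole-ice-sheet scalar metrics of the kind described above are the bias and RMSE of model-minus-observation differences. A hypothetical sketch (not the CmCt implementation; elevation values are made up):

```python
import math

def bias_and_rmse(model, obs):
    """Mean difference (bias) and root-mean-square error between paired
    model and observed values, e.g. surface elevations at crossover points."""
    diffs = [m - o for m, o in zip(model, obs)]
    n = len(diffs)
    bias = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return bias, rmse

# toy surface elevations (m) at three locations: model vs. altimetry
bias, rmse = bias_and_rmse([102.0, 98.0, 101.0], [100.0, 100.0, 100.0])
```

    Bias and RMSE answer different questions (systematic offset vs. overall misfit), which is why a scoring scheme typically reports both rather than collapsing them into one number.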

  17. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  18. Intestinal exposure to PCB 153 induces inflammation via the ATM/NEMO pathway.

    Science.gov (United States)

    Phillips, Matthew C; Dheer, Rishu; Santaolalla, Rebeca; Davies, Julie M; Burgueño, Juan; Lang, Jessica K; Toborek, Michal; Abreu, Maria T

    2018-01-15

    Polychlorinated biphenyls (PCBs) are persistent organic pollutants that adversely affect human health. PCBs bio-accumulate in organisms important for human consumption. PCB accumulation in the body leads to activation of the transcription factor NF-κB, a major driver of inflammation. Despite dietary exposure being one of the main routes of exposure to PCBs, the gut has been widely ignored when studying the effects of PCBs. We investigated the effects of PCB 153 on the intestine and addressed whether PCB 153 affected intestinal permeability or inflammation and the mechanism by which this occurred. Mice were orally exposed to PCB 153 and gut permeability was assessed. Intestinal epithelial cells (IECs) were collected and evaluated for evidence of genotoxicity and inflammation. A human IEC line (SW480) was used to examine the direct effects of PCB 153 on epithelial function. NF-κB activation was measured using a reporter assay, DNA damage was assessed, and cytokine expression was ascertained with real-time PCR. Mice orally exposed to PCB 153 had an increase in intestinal permeability and inflammatory cytokine expression in their IECs; inhibition of NF-κB ameliorated both these effects. This inflammation was associated with genotoxic damage and NF-κB activation. Exposure of SW480 cells to PCB 153 led to similar effects as seen in vivo. We found that activation of the ATM/NEMO pathway by genotoxic stress was upstream of NF-κB activation. These results demonstrate that oral exposure to PCB 153 is genotoxic to IECs and induces downstream inflammation and barrier dysfunction in the intestinal epithelium. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  20. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Science.gov (United States)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing since these affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and the residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The obtained results allow us to conclude that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  1. Estimating the Numerical Diapycnal Mixing in the GO5.0 Ocean Model

    Science.gov (United States)

    Megann, A.; Nurser, G.

    2014-12-01

    Constant-depth (or "z-coordinate") ocean models such as MOM4 and NEMO have become the de facto workhorse in climate applications; they have attained a mature stage in their development and are well understood. A generic shortcoming of this model type, however, is a tendency for the advection scheme to produce unphysical numerical diapycnal mixing, which in some cases may exceed the explicitly parameterised mixing based on observed physical processes, and this is likely to have effects on the long-timescale evolution of the simulated climate system. Despite this, few quantitative estimates have been made of the magnitude of the effective diapycnal diffusivity due to numerical mixing in these models. GO5.0 is the latest ocean model configuration developed jointly by the UK Met Office and the National Oceanography Centre (Megann et al., 2014), and forms part of the GC1 and GC2 climate models. It uses version 3.4 of the NEMO model, on the ORCA025 1/4° global tripolar grid. We describe various approaches to quantifying the numerical diapycnal mixing in this model, and present results from analysis of the GO5.0 model, based on the isopycnal watermass analysis of Lee et al. (2002), that indicate that numerical mixing does indeed form a significant component of the watermass transformation in the ocean interior.
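
    As a back-of-the-envelope illustration of how advection schemes generate spurious mixing, the first-order upwind scheme has a well-known effective diffusivity kappa_num = (u·dx/2)(1 - C), with Courant number C = u·dt/dx. This textbook estimate is not the isopycnal watermass diagnostic of Lee et al. (2002) used in the study; it only indicates the scale of the problem (grid spacing and time step below are illustrative assumptions):

```python
def upwind_numerical_diffusivity(u, dx, dt):
    """Effective diffusivity of the first-order upwind advection scheme:
    kappa_num = (u * dx / 2) * (1 - C), Courant number C = u * dt / dx.
    Modern schemes do much better, but rarely reach zero."""
    C = u * dt / dx
    return 0.5 * u * dx * (1.0 - C)

# 0.1 m/s flow on a ~25 km grid (roughly ORCA025 mid-latitude spacing),
# 1350 s time step: kappa_num comes out in m^2/s
kappa = upwind_numerical_diffusivity(0.1, 25e3, 1350.0)
```

    Values of order 10^3 m^2/s dwarf observed interior diapycnal diffusivities (order 10^-5 m^2/s), which is why even a small diapycnal projection of the advective error can dominate the explicitly parameterised mixing.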

  2. AGAMA: Action-based galaxy modeling framework

    Science.gov (United States)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  3. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  4. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments, such as optical stellar interferometers. 'End-to-end modeling' denotes the feature that the overall model comprises, besides optical sub-models, also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that system-level instrument performance under disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  5. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking techniques.

  6. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new area of research.

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  8. Nemo Solus Satis Sapit: Trends of Research Collaborations in the Vietnamese Social Sciences, Observing 2008–2017 Scopus Data

    Directory of Open Access Journals (Sweden)

    Quan-Hoang Vuong

    2017-10-01

    Full Text Available “Nemo solus satis sapit”—no one can be wise enough on his own. This is particularly true when it comes to collaborations in scientific research. Concerns over this issue in Vietnam, a developing country with limited academic resources, led to an in-depth study of Vietnamese social science research, using Google Scholar and Scopus, during 2008–2017. The results showed that more than 90% of scientists had worked with colleagues to publish, and they had collaborated 13 times on average during the period covered by the data sample. These collaborations, both domestic and international, mildly boosted author performance. On the other hand, the modest number of publications by Vietnamese authors was reportedly linked to Vietnamese social scientists’ heavy reliance on collaborative work as non-leading co-authors: over the entire decade (2008–2017), the average author assumed the leading role in merely two articles and hardly ever published alone. This implies that policy-makers ought to consider promoting institutional collaborations while also encouraging authors to acquire the experience of publishing solo.

  9. Medicanes in an ocean-atmosphere coupled regional climate model

    Science.gov (United States)

    Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.

    2014-08-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and the coupled model, with different setups (horizontal atmospheric grid spacings of 0.44, 0.22, and 0.08°; with/without spectral nudging; and an ocean grid spacing of 1/12°). The results show that at high resolution the coupled model is able not only to simulate most of the medicane events but also to improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is better suited for systematic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.

  10. A comparison of tools for modeling freshwater ecosystem services.

    Science.gov (United States)

    Vigerstol, Kari L; Aukema, Juliann E

    2011-10-01

    Interest in ecosystem services has grown tremendously among a wide range of sectors, including government agencies, NGOs and the business community. Ecosystem services entailing freshwater (e.g. flood control, the provision of hydropower, and water supply), as well as carbon storage and sequestration, have received the greatest attention in both scientific and on-the-ground applications. Given the newness of the field and the variety of tools for predicting water-based services, it is difficult to know which tools to use for different questions. There are two types of freshwater-related tools: traditional hydrologic tools and newer ecosystem services tools. Here we review two of the most prominent tools of each type and their possible applications. In particular, we compare the data requirements, ease of use, questions addressed, and interpretability of results among the models. We discuss the strengths, challenges and most appropriate applications of the different models. Traditional hydrological tools provide more detail, whereas ecosystem services tools tend to be more accessible to non-experts and can provide a good general picture of these ecosystem services. We also suggest gaps in the modeling toolbox that would provide the greatest advances by improving existing tools. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  12. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.......Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...

  13. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium and long distance modes (cars, vans, trucks, train, inland

  14. A new direction-sensitive optical module for deep-sea neutrino telescopy

    International Nuclear Information System (INIS)

    Brunoldi, Marco

    2009-01-01

    Within the KM3NeT framework, the NEMO (NEutrino Mediterranean Observatory) project is studying new technologies for a km³-scale neutrino telescope in the Mediterranean Sea. The telescope's goal will be the investigation of the high-energy component of the cosmic neutrino spectrum: a promising tool to better understand the mechanisms that originate extreme-energy cosmic rays. Neutrino energy and direction will be reconstructed using the Cherenkov light produced in water by muons coming from neutrino interactions. Two prototypes of a new large-area (10 in.) 4-anode photomultiplier, manufactured by Hamamatsu at the request of the NEMO Collaboration, have been extensively studied. These tubes will be integrated into spherical glass pressure-resistant optical modules and used for the first time to detect the direction of the detected Cherenkov light at the NEMO deep-sea (3600 m) site near Capo Passero in Sicily. The photocathode surface in these optical modules will be effectively divided into four quadrants by a pair of crescent-shaped mirrors embedded in the optical gel linking the PMT to the glass pressure sphere. A series of measurements was performed at the testing facility of the NEMO group at the INFN Sezione di Catania. The single photoelectron peak, the transit time spread, the gain and the cross-talk of the prototype have been studied, to obtain a complete characterization and enable a comparison with previous models. The first prototype of the direction-sensitive optical module has been assembled and tested with a dedicated experimental setup at the INFN Sezione di Genova. First results of tests of the prototype are presented.

  15. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  16. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model built on body babbling and a neurodynamical system, enabling robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  17. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied...

  18. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  19. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    , but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models...... of formerly disconnected tools could improve tool usability as well as decision maker productivity....

  20. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  1. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

    .... Scientific potentials and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop computational framework and modeling tools for cell biology...

  2. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is growing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts on both the ecosystem and the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments include different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The focus here is on fish habitat simulation models, with methods and examples from Norway; some ideas on integrated modelling tools for impact assessment studies are also included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost-effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. Model choice should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy. 58 refs

  3. Naval EarthMap Observer: overview and data processing

    Science.gov (United States)

    Bowles, Jeffrey H.; Davis, Curtiss O.; Carney, Megan; Clamons, Dean; Gao, Bo-Cai; Gillis, David; Kappus, Mary E.; Lamela, G.; Montes, Marcos J.; Palmadesso, Peter J.; Rhea, J.; Snyder, William A.

    1999-12-01

    We present an overview of the Naval EarthMap Observer (NEMO) spacecraft and then focus on the processing of NEMO data both on board the spacecraft and on the ground. The NEMO spacecraft provides for Joint Naval needs and demonstrates the use of hyperspectral imagery for the characterization of the littoral environment and for littoral ocean model development. NEMO is being funded jointly by the U.S. government and commercial partners. The Coastal Ocean Imaging Spectrometer (COIS) is the primary instrument on NEMO and covers the spectral range from 400 to 2500 nm at 10-nm resolution with either 30 or 60 m GSD. The hyperspectral data is processed on board NEMO using NRL's Optical Real-time Automated Spectral Identification System (ORASIS) algorithm, which provides real-time analysis, feature extraction and greater than 10:1 data compression. The high compression factor allows for ground coverage of greater than 10⁶ km²/day. Calibration of the sensor is done with a combination of moon imaging, an onboard light source, and vicarious calibration using a number of earth sites monitored for that purpose. The data will be atmospherically corrected using ATREM. Algorithms will also be available to determine water clarity, bathymetry and bottom type.

  4. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperature, lubrication or tool wear complicate the setup procedure. In view of the increasing demand for production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution and exemplarily applied in combination with a progressive tool. First, progressive tools, and more specifically their setup process, are described, and based on that, the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Subsequently, the process is investigated with an FE analysis regarding the effects of the disturbances. In the next step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary subsequent adjustment of the progressive tool under the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
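
    The DoE-to-regression-to-optimization chain described in this abstract can be sketched in a few lines. The data, parameter names and quadratic model form below are purely illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical DoE data: press stroke adjustment x (mm) versus measured
# bending-angle error y (degrees). Values are illustrative only.
x = np.array([-0.4, -0.2, 0.0, 0.2, 0.4])
y = np.array([0.95, 0.42, 0.15, 0.12, 0.35])

# Fit a quadratic regression model y ≈ a*x² + b*x + c by least squares,
# standing in for the DoE-derived model of the system's behaviour.
a, b, c = np.polyfit(x, y, deg=2)

# "Optimization" step: pick the stroke adjustment that minimizes the
# predicted error over the admissible range.
candidates = np.linspace(-0.5, 0.5, 1001)
predicted = a * candidates**2 + b * candidates + c
best = candidates[np.argmin(predicted)]
print(f"suggested stroke adjustment: {best:+.3f} mm")
```

    In the real assistant this loop would run per disturbance state (material thickness, temperature, wear) rather than once.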

  5. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying...

  6. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  7. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
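
    The core idea of such a hybrid, where an agent-based spatial layer queries a metabolic submodel for local growth rates, can be sketched in a toy form. The function below is a crude stand-in for a constraint-based (FBA-like) prediction, with made-up rates; it is not the MatNet model:

```python
# Toy hybrid sketch: a 1-D "biofilm" (layers over depth) coupled to a
# stand-in metabolic model. All rate constants are illustrative only.

def metabolic_growth(o2, no3):
    """Stand-in for a constraint-based (FBA-like) growth prediction:
    cells use oxygen if available, otherwise respire nitrate more slowly."""
    aerobic = 1.0 * min(o2, 1.0)       # oxygen-limited aerobic growth
    anaerobic = 0.6 * min(no3, 1.0)    # slower growth via nitrate respiration
    return max(aerobic, anaerobic)

def simulate(depth_layers=10, nitrate=0.0, steps=20):
    # Oxygen decays with depth into the biofilm; nitrate assumed uniform.
    biomass = [1.0] * depth_layers
    for _ in range(steps):
        for d in range(depth_layers):
            o2 = max(0.0, 1.0 - 0.2 * d)
            biomass[d] *= 1.0 + 0.05 * metabolic_growth(o2, nitrate)
    return sum(biomass)

print(simulate(nitrate=0.0), simulate(nitrate=1.0))
```

    Even this caricature reproduces the qualitative result reported above: adding nitrate lets the oxygen-starved deep layers grow, increasing total biomass.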

  8. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  9. Modeling the dielectric logging tool at high frequency

    International Nuclear Information System (INIS)

    Chew, W.C.

    1987-01-01

    The high frequency dielectric logging tool has been used widely in electromagnetic well logging, because by measuring the dielectric constant at high frequencies (1 GHz), the water saturation of rocks can be determined without measuring the water salinity in the rocks. As such, it can be used to delineate fresh-water-bearing zones, as the dielectric constant of fresh water is much higher than that of oil while they may have the same resistivity. The authors present a computer model, based on electromagnetic field analysis, of the response of such a measurement tool in a well logging environment. As the measurement is performed at high frequency, usually with small separation between the transmitter and receivers, some small geological features can be measured by such a tool. They use the computer model to study the behavior of such a tool across geological bed boundaries, and also across thin geological beds. Such a study can be very useful in understanding the limitations on the resolution of the tool. Furthermore, they can study the standoff effect and the depth of investigation of such a tool. This delineates the range of usefulness of the measurement.
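
    The physical basis mentioned above, that water's permittivity dwarfs that of oil and rock, is often illustrated with the textbook CRIM (complex refractive index method) mixing rule. This is a generic petrophysics illustration with typical values, not the interpretation model of the tool in the paper:

```python
import math

# Typical relative permittivities (textbook values, illustrative only).
EPS_WATER, EPS_OIL, EPS_MATRIX = 80.0, 2.2, 4.65

def bulk_permittivity(porosity, sw):
    """CRIM forward model: sqrt(eps) mixes in proportion to volume fractions."""
    root = (porosity * sw * math.sqrt(EPS_WATER)
            + porosity * (1.0 - sw) * math.sqrt(EPS_OIL)
            + (1.0 - porosity) * math.sqrt(EPS_MATRIX))
    return root ** 2

def water_saturation(porosity, eps_bulk):
    """Invert CRIM for water saturation Sw given a measured permittivity."""
    num = (math.sqrt(eps_bulk)
           - porosity * math.sqrt(EPS_OIL)
           - (1.0 - porosity) * math.sqrt(EPS_MATRIX))
    return num / (porosity * (math.sqrt(EPS_WATER) - math.sqrt(EPS_OIL)))

eps = bulk_permittivity(0.25, 0.6)
print(round(water_saturation(0.25, eps), 3))  # recovers Sw = 0.6
```

    The strong EPS_WATER term is why a 1 GHz permittivity measurement resolves saturation independently of salinity-driven resistivity.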

  10. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    OpenAIRE

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid mod...

  11. User Manual for the PROTEUS Mesh Tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Micheal A. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, Emily R [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-19

    PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real-world problems, experience has shown that using a commercial mesh generator to create rather simple input geometries is overly complex and slow. As a consequence, significant effort has been invested in creating multiple codes that assist in mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFMESH input allows for simple assembly mesh generation, while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured-grid geometry options. NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given an input mesh format acceptable to PROTEUS, several tools have been constructed which allow further mesh and geometry construction (i.e., mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and the output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and MT_RadialLattice.x codes. The former allows the conversion between most mesh types handled by PROTEUS, while the latter allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input is specific to a given mesh tool (such as .axial

  12. System Dynamics Modeling of interactive cost factors for small modular reactors

    International Nuclear Information System (INIS)

    Ahn, Nam Sung; Lee, Keun Dae; Yoon, Suk Ho

    2011-01-01

    As a part of the Study on Economic Efficiency and Marketability of Small Modular Reactors project, we at Nemo partners NEC consulting corporation studied the various cost factors of small modular reactors (SMRs). To better understand the interactions between these cost factors, a System Dynamics model has been developed. This model will contribute to our understanding of how the major factors affecting the unit cost of SMRs interact with the SMRs' market share under market competition.
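
    A System Dynamics treatment of interacting cost factors typically closes a feedback loop such as: lower unit cost → more orders → more cumulative builds → learning-curve cost reduction. The loop below is a generic illustration of that structure with invented parameters; it is not the model from this study:

```python
# Toy system-dynamics-style feedback loop for SMR unit cost.
# All parameters (base cost, learning rate, demand response) are
# hypothetical, chosen only to show the loop structure.

def simulate(years=20, learning_rate=0.1, base_cost=5000.0):
    cumulative_units = 1.0
    cost = base_cost                      # $/kW, illustrative
    history = []
    for _ in range(years):
        # Demand stand-in: cheaper reactors capture more orders per year.
        orders = max(0.0, 10.0 * (6000.0 - cost) / 6000.0)
        cumulative_units += orders
        # Learning curve: cost falls as cumulative build-out grows.
        cost = base_cost * cumulative_units ** (-learning_rate)
        history.append(cost)
    return history

costs = simulate()
print(round(costs[0]), "->", round(costs[-1]))
```

    The point of casting this as a simulation rather than a formula is that the cost and market-share variables feed back on each other over time, which is exactly what System Dynamics stock-and-flow models capture.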

  13. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry now spans more than 15 years. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases poses new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  14. A look at the ocean in the EC-Earth climate model

    Energy Technology Data Exchange (ETDEWEB)

    Sterl, Andreas; Bintanja, Richard; Severijns, Camiel [Royal Netherlands Meteorological Institute (KNMI), P.O. Box 201, De Bilt (Netherlands); Brodeau, Laurent [Stockholm University, Department of Meteorology, Stockholm (Sweden); Gleeson, Emily; Semmler, Tido [Met Eireann, Dublin (Ireland); Koenigk, Torben; Wyser, Klaus [Swedish Meteorological and Hydrological Institute (SMHI), Norrkoeping (Sweden); Schmith, Torben; Yang, Shuting [Danish Meteorological Institute (DMI), Copenhagen (Denmark)

    2012-12-15

    EC-Earth is a newly developed global climate system model. Its core components are the Integrated Forecast System (IFS) of the European Centre for Medium-Range Weather Forecasts (ECMWF) as the atmosphere component and the Nucleus for European Modelling of the Ocean (NEMO), developed by the Institute Pierre Simon Laplace (IPSL), as the ocean component. Both components are used with a horizontal resolution of roughly one degree. In this paper we describe the performance of NEMO in the coupled system by comparing model output with ocean observations. We concentrate on the surface ocean and mass transports. It appears that in general the model has a cold and fresh bias, but a much too warm Southern Ocean. While sea ice concentration and extent have realistic values, the ice tends to be too thick along the Siberian coast. Transports through important straits have realistic values, but generally are at the lower end of the range of observational estimates. Exceptions are very narrow straits (Gibraltar, Bering), which are too wide due to the limited resolution; consequently the modelled transports through them are too high. The strength of the Atlantic meridional overturning circulation is also at the lower end of observational estimates. The interannual variability of key variables and correlations between them are realistic in size and pattern. This is especially true for the variability of surface temperature in the tropical Pacific (El Niño). Overall the ocean component of EC-Earth performs well and helps make EC-Earth a reliable climate model. (orig.)
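
    The model-versus-observation comparisons summarized above (e.g. the cold and fresh surface bias) come down to area-weighted differences on a latitude-longitude grid. The sketch below shows that computation on synthetic stand-in fields; the numbers are not EC-Earth output:

```python
import numpy as np

# Synthetic stand-ins for a model SST field and an observational
# climatology on a regular 1-degree grid (illustrative values only).
lat = np.linspace(-89.5, 89.5, 180)
lon = np.linspace(0.5, 359.5, 360)
obs = 15.0 + 12.0 * np.cos(np.deg2rad(lat))[:, None] * np.ones((1, lon.size))
model = obs - 0.4          # impose a uniform 0.4 K cold bias for the demo

# Grid cells shrink toward the poles, so weight by cos(latitude).
weights = np.cos(np.deg2rad(lat))[:, None] * np.ones((1, lon.size))
bias = np.average(model - obs, weights=weights)
print(f"area-weighted SST bias: {bias:+.2f} K")
```

    In a real evaluation the same weighting would be applied per basin and per season, and masked for land and sea ice.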

  15. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation......With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer’s......, and envision future directions which focus on personalizing the processes to a designer’s particular wishes....

  16. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    The aim of the research was to create a methodology that enables effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models in one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation utilizing virtual representations of granular microstructures. The latter have been intensively developed recently, and they form a potentially powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.

  17. Tool wear modeling using abductive networks

    Science.gov (United States)

    Masory, Oren

    1992-09-01

    A tool wear model based on Abductive Networks, which consists of a network of `polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear. Thus real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
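
    As a rough illustration of the `polynomial node' idea behind such networks, the following GMDH-style sketch fits a quadratic node on every pair of inputs and keeps the node with the lowest validation error. The synthetic data, the wear law, and all constants are invented for the example; this is not the Abductive Networks (AIM) software used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def design(x1, x2):
    """Quadratic polynomial basis of two inputs (the 'polynomial node')."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def fit_node(x1, x2, y):
    coef, *_ = np.linalg.lstsq(design(x1, x2), y, rcond=None)
    return coef

# Synthetic "machining" data: cutting speed, feed and time -> flank wear (mm).
# The wear law below is invented purely to generate example data.
n = 200
speed = rng.uniform(100, 300, n)   # m/min
feed = rng.uniform(0.1, 0.4, n)    # mm/rev
time = rng.uniform(1, 30, n)       # min
wear = 1e-4 * speed * time**0.6 + 0.3 * feed + rng.normal(0, 0.01, n)

inputs = [speed, feed, time]
train, val = slice(0, 150), slice(150, None)

# One GMDH layer: fit a node for every input pair, keep the one with the
# lowest validation error (the network-building "training process").
best = None
for i in range(len(inputs)):
    for j in range(i + 1, len(inputs)):
        coef = fit_node(inputs[i][train], inputs[j][train], wear[train])
        pred = design(inputs[i][val], inputs[j][val]) @ coef
        err = float(np.mean((pred - wear[val]) ** 2))
        if best is None or err < best[0]:
            best = (err, i, j, coef)

err, i, j, coef = best
print(f"best node uses inputs {i} and {j}, validation MSE = {err:.5f}")
```

    A full abductive/GMDH network would stack several such layers, feeding the outputs of the surviving nodes into the next layer; one layer is enough to show the selection mechanism.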

  18. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables when performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing issues of the type of human disease to mimic, the parameters to follow, and collection of the appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and contribute to our deeper understanding of how these infections occur, progress and can be controlled and eliminated.

  19. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems", for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability"; ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries. A related activity is the Danish Centre for Energy, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark.

  20. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views improves the value of the system as a whole, as data becomes information

  1. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    ..., called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  2. Measurement of the 2{nu}{beta}{beta} decay of {sup 100}Mo to the excited 0{sub 1}{sup +} state in the NEMO3 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Vala, L

    2003-09-01

    The NEMO3 detector was designed for the study of double beta decay and in particular to search for the neutrinoless double beta decay process (0{nu}{beta}{beta}). The intended sensitivity in terms of a half-life limit for the 0{nu}{beta}{beta} decay is of the order of 10{sup 25} y, which corresponds to an effective neutrino mass m{sub {nu}} on the level of (0.3 - 0.1) eV. The 0{nu}{beta}{beta} process is today the most promising test of the Majorana nature of the neutrino. The detector was constructed in the Modane Underground Laboratory (LSM) in France by an international collaboration including France, Russia, the Czech Republic, the USA, the UK, Finland, and Japan. The experiment has been taking data since May 2002. The quantity of {sup 100}Mo in the detector (7 kg) allows an efficient measurement of the two-neutrino double beta decay (2{nu}{beta}{beta}) of {sup 100}Mo to the excited 0{sub 1}{sup +} state (eeN{gamma} channel). Monte Carlo simulations of the effect and of all the relevant sources of background have been produced in order to define a set of appropriate selection criteria. Both the Monte Carlo simulations and special runs with sources of {sup 208}Tl and {sup 214}Bi showed that the only significant background in the eeN{gamma} channel comes from radon that penetrated inside the wire chamber of NEMO3. The experimental data acquired from May 2002 to May 2003 have been analysed in order to determine the signal from the 2{nu}{beta}{beta} decay of {sup 100}Mo to the excited 0{sub 1}{sup +} state and the corresponding background level. The physical result, which was obtained at the level of four standard deviations, is given in the form of an interval of half-life values at 95% confidence level: [5.84*10{sup 20}, 2.26*10{sup 21}] y for method A and [5.83*10{sup 20}, 1.71*10{sup 21}] y for method B. (author)

  3. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004).] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can well be modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  4. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    Science.gov (United States)

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high-quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of the stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools are downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand can be employed according to individual needs. These new tools, involving realistic images of a cadaver and diverse functions, are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  5. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. This paper builds on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states, and presents a TEA (Tool Efficiency Analysis) system model that analyses tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified; it yielded the parameter values used to measure equipment performance, along with suggestions for improvement.
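
    The state/transition bookkeeping such a tool-efficiency model performs can be sketched with a small finite state machine. The state names below loosely follow the SEMI E10 categories, but the transition rules, the event log, and the utilization metric are illustrative assumptions, not the paper's TEA model.

```python
# Equipment state categories, loosely following SEMI E10.
E10_STATES = {"PRODUCTIVE", "STANDBY", "ENGINEERING",
              "SCHEDULED_DOWNTIME", "UNSCHEDULED_DOWNTIME", "NON_SCHEDULED"}

# Illustrative transition rules: which states each state may move to.
TRANSITIONS = {
    "STANDBY": {"PRODUCTIVE", "ENGINEERING", "SCHEDULED_DOWNTIME",
                "UNSCHEDULED_DOWNTIME"},
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWNTIME"},
    "ENGINEERING": {"STANDBY"},
    "SCHEDULED_DOWNTIME": {"STANDBY"},
    "UNSCHEDULED_DOWNTIME": {"STANDBY", "SCHEDULED_DOWNTIME"},
    "NON_SCHEDULED": {"STANDBY"},
}

class ToolStateMachine:
    def __init__(self, state="STANDBY"):
        self.state = state
        self.time_in_state = {s: 0.0 for s in E10_STATES}

    def dwell(self, hours):
        """Accumulate time spent in the current state."""
        self.time_in_state[self.state] += hours

    def transition(self, new_state):
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

    def utilization(self):
        """Productive time as a fraction of total tracked time."""
        total = sum(self.time_in_state.values())
        return self.time_in_state["PRODUCTIVE"] / total if total else 0.0

# Replay a simple event log: (hours spent in current state, next state).
fsm = ToolStateMachine()
for hours, nxt in [(2, "PRODUCTIVE"), (10, "STANDBY"),
                   (1, "UNSCHEDULED_DOWNTIME"), (3, "STANDBY"),
                   (8, "PRODUCTIVE")]:
    fsm.dwell(hours)
    fsm.transition(nxt)
fsm.dwell(6)  # final stretch in PRODUCTIVE

print(f"utilization = {fsm.utilization():.2f}")
```

    Rejecting illegal transitions is what makes the log trustworthy: any event sequence that violates the declared rules raises immediately instead of silently corrupting the per-state time totals.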

  6. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how model parameter space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  7. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  8. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcome this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an

  9. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged

  10. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks in order to provide guidance in selecting appropriate applications. In particular, their use is described for improving expert systems from actual data and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and mathematical model of a power plant adapt automatically to changes in the plant's characteristics
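
    The basic select-recombine-mutate loop of a genetic algorithm can be shown in a few lines. The sketch below solves the classic OneMax toy problem (maximize the number of 1-bits in a string); it is a generic illustration of the technique, unrelated to the power-plant application discussed above.

```python
import random

random.seed(1)
N_BITS, POP, GENS, P_MUT = 20, 30, 60, 0.02

def fitness(ind):
    return sum(ind)  # OneMax: count the 1-bits

def tournament(pop):
    # binary tournament selection: pick two at random, keep the fitter
    a, b = random.sample(pop, 2)
    return max(a, b, key=fitness)

def crossover(p1, p2):
    # single-point crossover
    cut = random.randrange(1, N_BITS)
    return p1[:cut] + p2[cut:]

def mutate(ind):
    # flip each bit independently with probability P_MUT
    return [bit ^ 1 if random.random() < P_MUT else bit for bit in ind]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]

best = max(pop, key=fitness)
print(f"best fitness after {GENS} generations: {fitness(best)}/{N_BITS}")
```

    In a real application the bitstring would encode model parameters or rules, and the fitness function would score an expert system or mathematical model against plant data; only the encoding and fitness change, not the loop.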

  11. Surviving the present: Modeling tools for organizational change

    International Nuclear Information System (INIS)

    Pangaro, P.

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them

  12. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems', for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project, 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe, was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, to evaluate assumptions and results in international models covering Denmark. (au)

  13. Open source modeling and optimization tools for planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state's planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  14. Agent-based modeling as a tool for program design and evaluation.

    Science.gov (United States)

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
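
    A minimal example of what an agent-based model looks like in practice: agents following simple local rules whose interactions produce system-level behavior, here the spread of an intervention's message through a peer network. The adoption rule, the network, and all parameters are invented for illustration and are not drawn from the paper.

```python
import random

random.seed(42)
N_AGENTS, STEPS, P_ADOPT = 100, 20, 0.3

class Agent:
    def __init__(self):
        self.adopted = False
        self.peers = []  # filled in below

agents = [Agent() for _ in range(N_AGENTS)]
for a in agents:
    # each agent interacts with 4 random peers (self-links are harmless here)
    a.peers = random.sample(agents, 4)

# Seed the "intervention" with 5 initial adopters.
for a in random.sample(agents, 5):
    a.adopted = True

history = []
for _ in range(STEPS):
    for a in agents:
        # local rule: an agent exposed to an adopting peer adopts with
        # probability P_ADOPT; adoption is permanent in this sketch
        if not a.adopted and any(p.adopted for p in a.peers):
            if random.random() < P_ADOPT:
                a.adopted = True
    history.append(sum(a.adopted for a in agents))

print("adopters per step:", history)
```

    An evaluator could use exactly this structure to compare intervention designs before fielding one, e.g. by varying the number of seeds or the network density and inspecting how the adoption curve responds.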

  15. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  16. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  17. Assessment of wear dependence parameters in complex model of cutting tool wear

    Science.gov (United States)

    Antsev, A. V.; Pasko, N. I.; Antseva, N. V.

    2018-03-01

    This paper addresses the wear dependence of the effective life period of cutting tools, treated as an aggregate of the distribution law of the tool wear rate and the dependence of this law's parameters on the cutting mode, factoring in randomness as exemplified by a complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, the variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique for assessing the wear dependence parameters in a complex model of cutting tool wear is provided. The technique is supported by a numerical example.
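
    The ingredients described above, a wear-rate distribution across a batch of tools plus a stochastic wear path for each tool, can be sketched with a short Monte Carlo simulation. All rates, distributions, and the speed dependence below are illustrative assumptions, not the paper's fitted parameters.

```python
import random
import statistics

WEAR_LIMIT = 0.3  # mm of flank wear taken as the tool-life criterion
DT = 1.0          # minutes per simulation step

def tool_life(speed, rng):
    """Minutes until the wear limit is reached, for one randomly drawn tool."""
    # batch-to-batch variation: each tool draws its own mean wear rate,
    # with an assumed Taylor-like power-law dependence on cutting speed
    base_rate = 2e-5 * speed**1.5 * rng.lognormvariate(0.0, 0.2)
    wear, t = 0.0, 0.0
    while wear < WEAR_LIMIT:
        # stochastic wear path: noisy, never-negative increments
        wear += max(0.0, rng.gauss(base_rate * DT, 0.1 * base_rate * DT))
        t += DT
    return t

rng = random.Random(7)
lives = [tool_life(speed=150, rng=rng) for _ in range(500)]
print(f"mean tool life {statistics.mean(lives):.1f} min, "
      f"stdev {statistics.stdev(lives):.1f} min")
```

    The resulting empirical life distribution is what a parameter-assessment technique would be fitted against; repeating the experiment at several speeds recovers the dependence of the distribution's parameters on the cutting mode.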

  18. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling that were identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  19. Thermomechanical modelling of laser surface glazing for H13 tool steel

    Science.gov (United States)

    Kabir, I. R.; Yin, D.; Tamanna, N.; Naher, S.

    2018-03-01

    A two-dimensional thermomechanical finite element (FE) model of laser surface glazing (LSG) has been developed for H13 tool steel. The direct coupling technique of ANSYS 17.2 (APDL) has been utilised to solve the transient thermomechanical process. An H13 tool steel cylindrical cross-section has been modelled for laser powers of 200 W and 300 W at a constant 0.2 mm beam width and 0.15 ms residence time. The model can predict the temperature distribution and the stress-strain increments in the elastic and plastic regions over time and space. The tendency toward crack formation can also be assessed by analysing the von Mises stress in the heat-concentrated zone. Isotropic and kinematic hardening models have been applied separately to predict the after-yield phenomena. At 200 W laser power, the peak surface temperature achieved is 1520 K, which is below the melting point (1727 K) of H13 tool steel. For 300 W laser power, the peak surface temperature is 2523 K. Tensile residual stresses on the surface have been found after cooling, which are in agreement with the literature. The isotropic model shows a higher residual stress that increases with laser power. Conversely, the kinematic model gives a lower residual stress which decreases with laser power. Therefore, both plasticity models could work in LSG for H13 tool steel.
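
    The thermal side of such a calculation can be illustrated with a one-dimensional explicit finite-difference model of a short laser pulse on a steel surface. This is a deliberate simplification of the paper's 2-D coupled FE model: the material properties are rough tool-steel values, the spot area is an assumption, and the 1-D geometry ignores lateral conduction, so the result is only order-of-magnitude comparable to the paper's 1520 K at 200 W.

```python
import numpy as np

# Rough thermal properties for tool steel (assumed, not from the paper)
k, rho, cp = 25.0, 7800.0, 460.0  # W/(m.K), kg/m^3, J/(kg.K)
alpha = k / (rho * cp)            # thermal diffusivity, m^2/s

L, nx = 2e-3, 200                 # 2 mm deep 1-D domain
dx = L / nx
dt = 0.4 * dx**2 / alpha          # stable explicit time step (r = 0.4 < 0.5)
q = 200.0 / (0.2e-3 * 1e-3)      # 200 W over an assumed 0.2 mm x 1 mm spot
t_on = 0.15e-3                    # 0.15 ms residence time

T = np.full(nx, 300.0)            # initial temperature, K
t, T_peak = 0.0, 300.0
while t < 3 * t_on:               # heat for t_on, then watch it cool
    Tn = T.copy()
    # interior nodes: explicit update of the 1-D heat equation
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    flux = q if t < t_on else 0.0
    Tn[0] = Tn[1] + flux * dx / k  # surface heat-flux boundary condition
    Tn[-1] = 300.0                 # far boundary held at ambient
    T, t = Tn, t + dt
    T_peak = max(T_peak, float(T[0]))

print(f"peak surface temperature ~ {T_peak:.0f} K")
```

    Even this crude model lands in the right regime (a peak of order 10^3 K for a sub-millisecond pulse), which is why a quick 1-D estimate is a common sanity check before running the full 2-D thermomechanical analysis.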

  20. Study and development of microporous organic compounds for radon adsorption and his application in particle physics

    International Nuclear Information System (INIS)

    Noel, Raymond

    2015-01-01

    The neutrino is one of the twelve elementary particles of the standard model. It is characterized by a neutral electrical charge and an extremely low mass. Many experiments have been set up in order to study the properties of the neutrino. Despite scientific breakthroughs, the nature of this particle is still unknown. The NEMO collaboration is studying neutrinoless double beta decay, a very rare radioactive process, to find out the nature of the neutrino and to know whether the neutrino is equivalent to the antineutrino. Today, the NEMO collaboration is building a new detector called SuperNEMO. The gas inside the detector needs to have a radon concentration below 100 μBq/m{sup 3} to minimize the radioactive background. The purification of this gas is achieved through the adsorption of radon by microporous materials. In this work, we have developed at CPPM a bench test to measure the radon adsorption of various materials, in order to propose an adsorption model and to reach the purity condition needed for SuperNEMO. Alongside the study of available adsorbents, and to better understand radon adsorption, we synthesized and studied at CINaM star-shaped polyaromatic hydrocarbons and branched or dendritic aromatic polymers incorporating sulfur, to adsorb radon [fr

  1. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementation. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modelled in the rotor reference frame with currents as state ...

  2. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z), prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, the same parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned: all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself), applied at the iteration level, and temporal variance (uncertainty caused by random environmental fluctuations with time), applied at the time-step level; they included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, it also makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs: using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this tool allows the user to evaluate a range of management options and implications. The goal is a user-friendly modeling tool for developing fish population models useful to natural resource
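
    The distinction between iteration-level and time-step-level variance can be sketched with a toy survival projection. All numbers are hypothetical; this is not the report's pallid sturgeon parameterisation:

```python
import numpy as np

rng = np.random.default_rng(42)

def project(n_iter=1000, n_years=20, partition=True):
    """Toy survival projection contrasting two ways to apply variance.

    partition=True : parameter uncertainty drawn once per iteration,
                     temporal variance drawn each time step (the
                     Wildhaber and others, 2015, partitioned scheme).
    partition=False: a single pooled variance applied at every time
                     step, loosely mimicking an unpartitioned model.
    """
    finals = np.empty(n_iter)
    for i in range(n_iter):
        # iteration-level draw: uncertainty about mean survival itself
        mean_s = rng.normal(0.92, 0.02) if partition else 0.92
        sd = 0.03 if partition else (0.02**2 + 0.03**2) ** 0.5
        n = 1000.0
        for _ in range(n_years):
            # time-step-level draw: random environmental fluctuation
            s = np.clip(rng.normal(mean_s, sd), 0.0, 1.0)
            n *= s
        finals[i] = n
    return finals

partitioned, pooled = project(partition=True), project(partition=False)
print(partitioned.mean(), pooled.mean())
```

    The two schemes give similar means but different spreads across iterations, which is why the choice of where variance is applied matters for uncertainty statements.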

  3. Development of modeling tools for pin-by-pin precise reactor simulation

    International Nuclear Information System (INIS)

    Ma Yan; Li Shu; Li Gang; Zhang Baoyin; Deng Li; Fu Yuanguang

    2013-01-01

    In order to develop large-scale transport simulation and calculation methods (such as simulation of whole-core pin-by-pin problems), the Institute of Applied Physics and Computational Mathematics developed the neutron-photon coupled transport code JMCT and the toolkit JCOGIN. Creating physical calculation models easily and efficiently can substantially reduce problem-solving time. Currently, many visual modeling programs have been developed based on different CAD systems. In this article, the design of a visual modeling tool based on field-oriented development is introduced. Considering the features of physical modeling, fast and convenient operation modules were developed. In order to solve the storage and conversion problems of large-scale models, a data structure and conversion algorithm based on a hierarchical geometry tree were designed. The automatic conversion and generation of the physical-model input file for JMCT were realized. Using this modeling tool, the whole-core physical model of the Dayawan reactor was created, and the transformed file was delivered to JMCT for transport calculation. The results validate the correctness of the visual modeling tool. (authors)

  4. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, minimizing production costs while meeting reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above motivated us to bridge CEM with PCM by building a capacity expansion-to-production cost model Linking Tool (CEPCoLT). The Linking Tool maps capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.
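
    What such a linking step involves can be sketched as translating aggregate capacity into the discrete generator records a production cost model consumes. The names, unit sizes and schema here are invented, not CEPCoLT's actual format:

```python
# Hypothetical capacity-expansion output: MW by (region, technology, year)
cem_builds = {
    ("west", "wind", 2030): 1200.0,
    ("west", "gas_cc", 2030): 400.0,
    ("east", "solar_pv", 2030): 800.0,
}

# Assumed discrete unit sizes the production cost model works with
UNIT_SIZE_MW = {"wind": 100.0, "gas_cc": 200.0, "solar_pv": 100.0}

def to_pcm_units(builds, year):
    """Translate aggregate CEM capacity into a discrete generator list
    (a sketch of what a linking tool does, not CEPCoLT's schema)."""
    units = []
    for (region, tech, yr), mw in builds.items():
        if yr != year:
            continue
        size = UNIT_SIZE_MW[tech]
        n_units = int(mw // size)
        units += [{"name": f"{region}_{tech}_{i}", "max_mw": size}
                  for i in range(n_units)]
    return units

print(len(to_pcm_units(cem_builds, 2030)))  # 22 discrete generator records
```

    A real linking tool must also carry over retirements, regional topology and operating parameters, but the core operation is this kind of schema translation.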

  5. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. From this information, the generalized forces are computed using the doublet-lattice method, and a rational function approximation is computed using Roger's approximation. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information, so there is little required interaction with the model developer; all parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation, carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
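
    Once the lag roots are fixed, Roger's rational function approximation reduces to a linear least-squares fit. A scalar sketch with synthetic data follows; the coefficients and lag roots are invented, not the tool's doublet-lattice output:

```python
import numpy as np

# Reduced frequencies at which "tabulated" generalized forces are known
k = np.linspace(0.05, 1.0, 20)
s = 1j * k                                   # nondimensional Laplace points

b = np.array([0.2, 0.6])                     # fixed lag roots (assumed)

def basis(s, b):
    """Columns of the Roger approximation: 1, s, s^2, s/(s+b_j)."""
    cols = [np.ones_like(s), s, s**2] + [s / (s + bj) for bj in b]
    return np.column_stack(cols)

# Synthetic 'tabulated' aerodynamic data from known coefficients
true_coef = np.array([1.0, -0.5, 0.2, 0.8, -0.3])
Q = basis(s, b) @ true_coef

# Least-squares fit of real and imaginary parts simultaneously
A = basis(s, b)
stacked = np.vstack([A.real, A.imag])
rhs = np.concatenate([Q.real, Q.imag])
coef, *_ = np.linalg.lstsq(stacked, rhs, rcond=None)
print(coef)   # recovers true_coef up to numerical precision
```

    In the matrix-valued case each generalized-force entry is fitted this way, and the lag terms become the additional aerodynamic states of the state space model.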

  6. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

    Background: The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015, including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken for modelling the impact of scaling up health interventions on stillbirths in the Lives Saved Tool (LiST), and potential future refinements. Methods: The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions about the fraction of stillbirths that could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results: High-quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths, so stillbirths are modelled in LiST using an attributable-fraction approach by timing (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention, and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, detection and management of hypertensive disorders of pregnancy, and diabetes) the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available, with coverage
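
    The attributable-fraction approach can be illustrated with a simplified impact calculation. The real LiST computation handles residual coverage and cause structure in more detail, and all numbers here are hypothetical:

```python
def stillbirths_averted(baseline, affected_fraction, effectiveness,
                        coverage_old, coverage_new):
    """Illustrative LiST-style impact calculation (simplified).

    baseline          : stillbirths before scale-up
    affected_fraction : share of stillbirths the intervention can act on
                        (e.g. antepartum vs intrapartum timing)
    effectiveness     : proportional reduction among those affected
    coverage_old/new  : intervention coverage before and after scale-up
    """
    return (baseline * affected_fraction * effectiveness
            * (coverage_new - coverage_old))

# e.g. 10 000 stillbirths, half intrapartum, a 30 %-effective intervention
# scaled from 40 % to 80 % coverage (all numbers hypothetical)
averted = stillbirths_averted(10_000, 0.5, 0.30, 0.40, 0.80)
print(averted)  # 600.0
```

    The attributable fraction substitutes for the cause-specific mortality structure that is unavailable for stillbirths.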

  7. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Morris, Katherine [National Institute of Standards and Technology (NIST); Buhwan, Jeong [POSTECH University, South Korea; Goyal, Puja [National Institute of Standards and Technology (NIST)

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for the creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools, primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  8. Multi-category micro-milling tool wear monitoring with continuous hidden Markov models

    Science.gov (United States)

    Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon

    2009-02-01

    In-process monitoring of tool conditions is important in micro-machining due to the high precision requirements and high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous hidden Markov models (HMMs) are adapted for modelling the tool wear process in micro-milling and for estimating the tool wear state from cutting force features. For noise robustness, the HMM outputs are passed through a median filter to suppress spurious state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
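
    The post-processing step can be sketched as a sliding median over the per-frame HMM state decisions; this is a generic median filter, not the authors' exact implementation:

```python
import numpy as np

def median_filter(states, window=5):
    """Smooth a per-frame tool-state sequence with a sliding median,
    suppressing isolated misclassifications caused by noisy force signals."""
    half = window // 2
    padded = np.pad(states, half, mode="edge")
    return np.array([int(np.median(padded[i:i + window]))
                     for i in range(len(states))])

# Hypothetical raw HMM decisions: wear state 0 -> 1 -> 2 with noise spikes
raw = np.array([0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 2, 1, 2, 2, 2])
print(median_filter(raw))
```

    Isolated single-frame flips are removed while the underlying monotone wear progression is preserved.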

  9. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  10. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on an HP-Apollo workstation system, has proved very general and of immediate physical interpretation
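
    The idea of fitting orbit data against transfer-matrix predictions can be sketched with a toy 2x2 lattice. The elements and the single kick parameter are invented, not the MAD-generated Booster lattice:

```python
import numpy as np

def drift(L):            # 2x2 transfer matrix of a field-free drift
    return np.array([[1.0, L], [0.0, 1.0]])

def quad(f):             # thin-lens quadrupole of focal length f
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A toy lattice; the exit of each element is a 'BPM' reading location
lattice = [drift(1.0), quad(2.0), drift(1.0), quad(-2.0), drift(1.0)]

def orbit(kick):
    """Propagate an initial angular kick through the lattice and return
    the transverse position at each element exit (the 'orbit data')."""
    x = np.array([0.0, kick])        # (position, angle)
    readings = []
    for M in lattice:
        x = M @ x
        readings.append(x[0])
    return np.array(readings)

# Model-based fit: the orbit is linear in the kick, so one least-squares
# step recovers it from noisy measurements -- the essence of iterative
# model fitting to locate magnet errors.
measured = orbit(0.5e-3) + 1e-6 * np.random.default_rng(0).standard_normal(5)
basis = orbit(1.0)                   # model response to a unit kick
fitted = float(basis @ measured / (basis @ basis))
print(fitted)   # ~ 5e-4
```

    In the real tool the unknowns are many magnet displacements and field errors, so the fit is multidimensional and iterated, but the principle is the same.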

  11. ADAS tools for collisional–radiative modelling of molecules

    Energy Technology Data Exchange (ETDEWEB)

    Guzmán, F., E-mail: francisco.guzman@cea.fr [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); CEA, IRFM, Saint-Paul-lez-Durance 13108 (France); O’Mullane, M.; Summers, H.P. [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom)

    2013-07-15

    New theoretical and computational tools for molecular collisional–radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H{sub 2} are presented, and this vibronic-resolution model is compared with an electronic-resolution model in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional–radiative) rate coefficients versus temperature and density are presented.
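
    The core of any collisional–radiative model is a linear rate-matrix balance for the level populations. A minimal sketch with an invented three-level system follows; the rates are arbitrary, not H2 data:

```python
import numpy as np

# Hypothetical 3-level system: R[i, j] is the rate (s^-1) INTO level i
# from level j, lumping collisional and radiative processes together.
R = np.array([[0.0, 2e3, 1e2],
              [5e2, 0.0, 3e3],
              [1e1, 4e2, 0.0]])

# Collisional-radiative matrix: off-diagonals are rates in, diagonals
# are the total rates out of each level.
M = R.copy()
M[np.diag_indices(3)] = -R.sum(axis=0)

# Steady state: M n = 0 with sum(n) = 1. Replace one equation with the
# normalisation condition and solve the linear system.
A = M.copy()
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
n = np.linalg.solve(A, b)
print(n)   # relative populations, summing to 1
```

    Effective rate coefficients are then obtained by repeating such solves over a grid of temperatures and densities, with the rate entries evaluated at each condition.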

  12. ADAS tools for collisional-radiative modelling of molecules

    Science.gov (United States)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronic-resolution model is compared with an electronic-resolution model in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.

  13. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    In this paper we present a tool that performs CUDA-accelerated LTL Model Checking. The tool exploits the parallel algorithm MAP, adjusted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL Model Checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss where the limits of the tool are and what we intend to do in the future to overcome them.
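
    The problem being solved, accepting-cycle detection, can be sketched sequentially with a nested depth-first search. The MAP algorithm used by the tool is a different, parallelisable approach; this only illustrates what is being detected:

```python
def has_accepting_cycle(graph, accepting, start):
    """Nested DFS sketch: report whether some accepting state lies on a
    cycle reachable from `start` (a sequential stand-in for what the
    GPU-parallel MAP algorithm detects)."""
    visited, flagged = set(), set()

    def inner(seed):
        # search for a path from the accepting seed state back to itself
        stack, seen = [seed], {seed}
        while stack:
            u = stack.pop()
            for v in graph.get(u, []):
                if v == seed:
                    return True
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return False

    def outer(s):
        visited.add(s)
        for v in graph.get(s, []):
            if v not in visited and outer(v):
                return True
        if s in accepting and s not in flagged:
            flagged.add(s)
            if inner(s):
                return True
        return False

    return outer(start)

# Toy automaton: state 2 is accepting and lies on the cycle 2 -> 3 -> 2
g = {0: [1], 1: [2], 2: [3], 3: [2]}
print(has_accepting_cycle(g, accepting={2}, start=0))  # True
```

    In automata-based LTL model checking the graph is the product of the system and a Büchi automaton for the negated property; an accepting cycle is a counterexample.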

  14. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  15. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  16. Estimating the numerical diapycnal mixing in an eddy-permitting ocean model

    Science.gov (United States)

    Megann, Alex

    2018-01-01

    Constant-depth (or "z-coordinate") ocean models such as MOM4 and NEMO have become the de facto workhorse in climate applications, having attained a mature stage in their development and being well understood. A generic shortcoming of this model type, however, is a tendency for the advection scheme to produce unphysical numerical diapycnal mixing, which in some cases may exceed the explicitly parameterised mixing based on observed physical processes, and this is likely to affect the long-timescale evolution of the simulated climate system. Despite this, few quantitative estimates have been made of the typical magnitude of the effective diapycnal diffusivity due to numerical mixing in these models. GO5.0 is a recent ocean model configuration developed jointly by the UK Met Office and the National Oceanography Centre. It forms the ocean component of the GC2 climate model, and is closely related to the ocean component of the UKESM1 Earth System Model, the UK's contribution to the CMIP6 model intercomparison. GO5.0 uses version 3.4 of the NEMO model, on the ORCA025 global tripolar grid. An approach to quantifying the numerical diapycnal mixing in this model, based on the isopycnal watermass analysis of Lee et al. (2002), is described, and the estimates thereby obtained of the effective diapycnal diffusivity in GO5.0 are compared with the values of the explicit diffusivity used by the model. It is shown that the effective mixing in this model configuration is up to an order of magnitude higher than the explicit mixing in much of the ocean interior, implying that mixing in the model below the mixed layer is largely dominated by numerical mixing. This is likely to have adverse consequences for the representation of heat uptake in climate models intended for decadal climate projections, and in particular is highly relevant to the interpretation of the CMIP6 class of climate models, many of which use constant-depth ocean models at ¼° resolution.
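
    The phenomenon can be illustrated on a far simpler system: a first-order upwind advection scheme smears a sharp tracer front with an implicit diffusivity of roughly u·Δx·(1−c)/2, a classic modified-equation result. This toy is unrelated to the isopycnal watermass analysis actually used in the study:

```python
import numpy as np

# Advect a tracer front with first-order upwind: the scheme's implicit
# (numerical) diffusivity ~ u*dx*(1-c)/2 can rival explicit values.
nx, u = 400, 1.0
dx = 1.0 / nx
dt = 0.4 * dx / u                      # Courant number c = 0.4
c = u * dt / dx
T = np.where(np.arange(nx) * dx < 0.5, 1.0, 0.0)   # sharp front

for _ in range(200):
    T[1:] = T[1:] - c * (T[1:] - T[:-1])           # upwind update

implied = u * dx * (1 - c) / 2
print(f"implied numerical diffusivity ~ {implied:.2e} (nondimensional)")
```

    The initially sharp front spreads over several grid cells even though no explicit diffusion was applied; diagnosing the equivalent effect in a full ocean model is what the watermass analysis accomplishes.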

  17. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate

  18. Dynamic wind turbine models in power system simulation tool DIgSILENT

    OpenAIRE

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar; Iov, F.; Blaabjerg, F.

    2004-01-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The repo...

  19. On the influence of model physics on simulations of Arctic and Antarctic sea ice

    Directory of Open Access Journals (Sweden)

    F. Massonnet

    2011-09-01

    Two hindcast (1983–2007) simulations are performed with the global ocean-sea ice models NEMO-LIM2 and NEMO-LIM3, driven by atmospheric reanalyses and climatologies. The two simulations differ only in their sea ice component, while all other elements of the experimental design (resolution, initial conditions, atmospheric forcing) are kept identical. The main differences between the sea ice models lie in the formulation of the subgrid-scale ice thickness distribution, the thermodynamic processes, the sea ice salinity and the sea ice rheology. To assess the differences in model skill over the period of investigation, we develop a set of metrics for both hemispheres, comparing the main sea ice variables (concentration, thickness and drift) to available observations and focusing on both the mean state and seasonal to interannual variability. Based upon these metrics, we discuss the physical processes potentially responsible for the differences in model skill. In particular, we suggest that (i) a detailed representation of the ice thickness distribution increases the seasonal to interannual variability of ice extent, with spectacular improvement for the simulation of the recent observed summer Arctic sea ice retreats; (ii) the elastic-viscous-plastic rheology enhances the response of ice to wind stress, compared to the classical viscous-plastic approach; (iii) the grid formulation and the air-sea ice drag coefficient affect the simulated ice export through Fram Strait and the ice accumulation along the Canadian Archipelago; and (iv) both models show less skill in the Southern Ocean, probably due to the low quality of the reanalyses in this region and to the absence of important small-scale oceanic processes at the models' resolution (~1°).

  20. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  1. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    This workshop aims to gather theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model physics, in an attempt to be prepared for the analysis of data from the Large Hadron Collider. Since a large number of excellent tools.... To identify promising models (or processes) for which the tools have not yet been constructed and to start filling these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  2. Simulation of double beta decay in the 'SeXe' TPC

    Energy Technology Data Exchange (ETDEWEB)

    Mauger, F [LPC Caen and University of Caen, ENSICAEN, 6 Bd Marechal Juin, 14050 CAEN CEDEX 4 (France)

    2007-04-15

    In 2004, the NEMO collaboration started preliminary studies for a next-generation double beta decay experiment: SuperNEMO. The possibility of using a large gaseous TPC was investigated using simulation and extrapolation from former experiments. In this talk, I report on the reasons why this technique was not selected in 2004, leading the NEMO collaboration to reuse the techniques implemented in the NEMO3 detector.

  3. Simulation of double beta decay in the 'SeXe' TPC

    Science.gov (United States)

    Mauger, F.

    2007-04-01

    In 2004, the NEMO collaboration started preliminary studies for a next-generation double beta decay experiment: SuperNEMO. The possibility of using a large gaseous TPC was investigated using simulation and extrapolation from former experiments. In this talk, I report on the reasons why this technique was not selected in 2004, leading the NEMO collaboration to reuse the techniques implemented in the NEMO3 detector.

  4. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    1998-01-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling Radio Frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks, including TFTR, and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  5. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  6. AgMIP Training in Multiple Crop Models and Tools

    Science.gov (United States)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.

  7. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib, E-mail: garib@mrc-lmb.cam.ac.uk [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom)

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC.
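
    The flavour of reference-based restraint generation can be sketched generically: harvest short interatomic distances from a reference structure and emit them as targets for a refinement program to penalise deviations from. This is a toy, not ProSMART's or LIBG's actual restraint format:

```python
import numpy as np
from itertools import combinations

def distance_restraints(coords, names, max_dist=4.0):
    """Generate pairwise distance restraints from a reference structure:
    every atom pair closer than max_dist (in Angstroms) becomes a
    (name_i, name_j, target) triple. (A generic sketch; real tools use
    richer, chemically curated restraint types.)"""
    restraints = []
    for i, j in combinations(range(len(coords)), 2):
        d = float(np.linalg.norm(coords[i] - coords[j]))
        if d <= max_dist:
            restraints.append((names[i], names[j], round(d, 3)))
    return restraints

# Toy reference fragment (coordinates in Angstroms, invented)
coords = np.array([[0.0, 0.0, 0.0],
                   [1.5, 0.0, 0.0],
                   [1.5, 1.5, 0.0],
                   [6.0, 0.0, 0.0]])
names = ["C1", "C2", "C3", "C4"]
print(distance_restraints(coords, names))
```

    During refinement against a low-resolution map, such external targets stabilise geometry that the map alone cannot determine, which is the role the text describes for ProSMART and LIBG restraints.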

  8. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    International Nuclear Information System (INIS)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC

  9. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

    Full Text Available This article presents a customer data analysis model for a telecommunication company, along with business intelligence tools for data modelling, transformation, visualization and dynamic report building. In a mature market, extracting the information inside the data and making forecasts for strategic decisions have become more important in the Romanian market. Business intelligence tools are used in business organizations as support for decision making.

  10. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology...... is used for checking the consistency of a design with respect to the availability of services and resources. In the second application, a tool for automatically implementing the communication infrastructure of a process network application, the Service Relation Model is used for analyzing the capabilities...

  11. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  12. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used....... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling......, an important part of this technique is the attachment of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  13. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model includes a tunable cut-off parameter that allows selection of the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
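    A minimal sketch of the consensus idea, assuming each tool's sensitivity and specificity are known and the tools' predictions are conditionally independent (a naive-Bayes simplification of the paper's ensemble). Varying the cut-off moves along the sensitivity/specificity trade-off the abstract describes. All tool names and performance numbers below are invented.

```python
# Toy Bayesian ensemble of binary QSAR tools. Each tool's (sensitivity,
# specificity) is assumed known; predictions are combined under a
# naive-Bayes independence assumption. All numbers are illustrative.

TOOLS = {
    "toolA": (0.80, 0.70),   # (sensitivity, specificity), invented
    "toolB": (0.75, 0.85),
    "toolC": (0.65, 0.90),
}

def posterior_toxic(preds, prior=0.5):
    """P(toxic | tool predictions) under naive-Bayes independence."""
    p_tox, p_safe = prior, 1.0 - prior
    for name, pred in preds.items():
        sens, spec = TOOLS[name]
        if pred:                        # tool flags the chemical
            p_tox *= sens               # true-positive likelihood
            p_safe *= (1.0 - spec)      # false-positive likelihood
        else:
            p_tox *= (1.0 - sens)
            p_safe *= spec
    return p_tox / (p_tox + p_safe)

def classify(preds, cutoff=0.5):
    """Lower cutoff -> higher sensitivity, lower specificity."""
    return posterior_toxic(preds) >= cutoff
```

    Lowering `cutoff` toward 0 flags more chemicals (fewer false negatives), which matches the regulatory preference stated in the abstract.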

  14. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  15. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn Frits; Remke, Anne Katharina Ingrid; Haverkort, Boudewijn R.H.M.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to

  16. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    Science.gov (United States)

    Zhang, X.; Srinivasan, R.

    2008-12-01

    In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD using raingauge data. This GIS tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). This tool was applied in two test watersheds at hourly and daily temporal scales. The preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is developed as an extension of SWAT, which is used as a water quantity and quality modeling tool by the USDA and EPA. The flexible module-based design of this tool also makes it easy to adapt for other hydrologic models in hydrological modeling and water resources management.
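    Of the geostatistical methods listed, ordinary kriging is the simplest to sketch. The toy below kriges gauge values with an exponential covariance model (parameters and sample data invented) and illustrates two properties such gauge-radar merging methods rely on: the weights sum to one, and the estimator interpolates exactly at a gauge location.

```python
import math

# Minimal ordinary-kriging sketch in the spirit of gauge-based NEXRAD
# correction: here we krige gauge measurements alone. The exponential
# covariance parameters and the sample data are invented.

def cov(h, sill=1.0, rng=20.0):
    """Exponential covariance model."""
    return sill * math.exp(-3.0 * h / rng)

def solve(a, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def ordinary_kriging(pts, vals, target):
    """Kriging weights via the augmented system (Lagrange multiplier)."""
    n = len(pts)
    A = [[cov(math.dist(pts[i], pts[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(math.dist(p, target)) for p in pts] + [1.0]
    w = solve(A, b)[:n]                 # drop the Lagrange multiplier
    return sum(wi * vi for wi, vi in zip(w, vals)), w

gauges = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rain   = [2.0, 5.0, 3.0]
```

    The methods in the tool extend this core system with local means, external drift (the radar field), or cross-covariances, but the weight solve is the common ingredient.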

  17. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients using sparse-matrix technology for chemical kinetics is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate a model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines from packages such as SLODE, a modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
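    The direct method the abstract describes, integrating the model equation together with its coupled auxiliary sensitivity equations, can be shown on a one-reaction toy problem: for dc/dt = -k c, the sensitivity s = ∂c/∂k obeys ds/dt = -c - k s (the Jacobian term -k s plus the explicit parameter derivative -c), with the analytic solution s = -t e^{-kt} c0. A fixed-step RK4 stands in below for the stiff Gear-type integrator; the mechanism and rate are invented.

```python
import math

# Direct-method sensitivity on a toy mechanism: state y = (c, s) where
# c is concentration and s = dc/dk. Both ODEs are integrated together,
# mirroring the coupled model/sensitivity system in the tool model.

def rhs(y, k):
    c, s = y
    return (-k * c, -c - k * s)

def rk4(y, k, h):
    def add(u, v, f):
        return tuple(a + f * b for a, b in zip(u, v))
    k1 = rhs(y, k)
    k2 = rhs(add(y, k1, h / 2), k)
    k3 = rhs(add(y, k2, h / 2), k)
    k4 = rhs(add(y, k3, h), k)
    return tuple(y[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                 for i in range(2))

def integrate(c0, k, t_end, n=1000):
    """Integrate (c, s) from t = 0 to t_end with n RK4 steps."""
    y, h = (c0, 0.0), t_end / n
    for _ in range(n):
        y = rk4(y, k, h)
    return y
```

    For c0 = 1, k = 0.5, t = 2 the result should approach c = e^{-1} and s = -2 e^{-1}, which is what the assertions check.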

  18. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a large panel of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach of oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  19. A Quasiphysics Intelligent Model for a Long Range Fast Tool Servo

    Science.gov (United States)

    Liu, Qiang; Zhou, Xiaoqin; Lin, Jieqiong; Xu, Pengzi; Zhu, Zhiwei

    2013-01-01

    Accurately modeling the dynamic behaviors of a fast tool servo (FTS) is one of the key issues in the ultraprecision positioning of the cutting tool. Herein, a quasiphysics intelligent model (QPIM) integrating a linear physics model (LPM) and a radial basis function (RBF) based neural model (NM) is developed to accurately describe the dynamic behaviors of a voice coil motor (VCM) actuated long range fast tool servo (LFTS). To identify the parameters of the LPM, a novel Opposition-based Self-adaptive Replacement Differential Evolution (OSaRDE) algorithm is proposed, which has been shown to converge faster without compromising solution quality and to outperform the similar evolutionary algorithms considered. The modeling errors of the LPM and the QPIM are investigated by experiments. The modeling error of the LPM exhibits an obvious trend component of about ±1.15% of the full-span range, verifying the efficiency of the proposed OSaRDE algorithm for system identification. As for the QPIM, the trend component in the residual error of the LPM can be well suppressed, and the error of the QPIM remains at the noise level. All the results verify the efficiency and superiority of the proposed modeling and identification approaches. PMID:24163627
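    A hedged sketch of the quasi-physics idea: fit a linear model (the LPM role) first, then fit a Gaussian RBF network to its residual so the combined model (the QPIM role) tracks the data down to noise level. The "plant" below is a made-up scalar nonlinearity, not a VCM-actuated servo, and the identification is plain least squares and interpolation rather than OSaRDE.

```python
import math

# Quasi-physics toy: linear fit + RBF correction of its residual.
# The plant, grid, and RBF shape parameter are all invented.

def plant(u):                         # ground-truth response (invented)
    return 2.0 * u + 0.3 * math.sin(3.0 * u)

U = [i / 5.0 for i in range(6)]       # training inputs on [0, 1]
Y = [plant(u) for u in U]

# "LPM": least-squares line through the origin, y ~ a * u
a = sum(u * y for u, y in zip(U, Y)) / sum(u * u for u in U)
resid = [y - a * u for u, y in zip(U, Y)]

def phi(r, eps=5.0):
    """Gaussian RBF."""
    return math.exp(-(eps * r) ** 2)

# "NM": RBF interpolation of the residual -- solve Phi w = resid by
# Gaussian elimination with partial pivoting.
n = len(U)
M = [[phi(abs(U[i] - U[j])) for j in range(n)] + [resid[i]] for i in range(n)]
for col in range(n):
    piv = max(range(col, n), key=lambda r: abs(M[r][col]))
    M[col], M[piv] = M[piv], M[col]
    for r in range(col + 1, n):
        f = M[r][col] / M[col][col]
        for c in range(col, n + 1):
            M[r][c] -= f * M[col][c]
w = [0.0] * n
for r in range(n - 1, -1, -1):
    w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]

def qpim(u):
    """Linear model plus RBF residual correction."""
    return a * u + sum(wi * phi(abs(u - ui)) for wi, ui in zip(w, U))
```

    At the training points the linear model alone leaves a visible trend in the residual, while the combined model reproduces the data, the same qualitative picture as the ±1.15% trend versus noise-level errors reported in the abstract.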

  20. Laguna Verde simulator: A new TRAC-RT based application

    International Nuclear Information System (INIS)

    Munoz Cases, J.J.; Tanarro Onrubia, A.

    2006-01-01

    In a partnership with GSE Systems, TECNATOM is developing a full scope training simulator for Laguna Verde Unit 2 (LV2). The simulator design is based upon current state-of-the-art technology regarding the simulation platform, instructor station, visualization tools, advanced thermalhydraulic and neutronic models, I/O systems and automated model building technology. When completed, the LV2 simulator will achieve a remarkable level of modeling fidelity by using TECNATOM's TRAC-RT advanced thermalhydraulic code for the reactor coolant and main steam systems, and the NEMO neutronic model for the reactor core calculations. These models have been used to date for the development or upgrading of nine NPP simulators in Spain and abroad, with more than 8000 hours of training sessions, and have developed an excellent reputation for their robustness and high fidelity. (author)

  1. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  2. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    For several years, EDF has been developing an integrated set of knowledge-based and algorithmic tools for the automation of reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal all the powerful software tools for qualitative and quantitative processing; besides, he has various means to generate the inputs for these tools automatically, through the acquisition of graphical data. The development of these tools has been based on FIGARO, a specific language built to obtain homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models, such as fault trees, Markov chains and Petri nets. In this report, we introduce the main basics of the FIGARO language, illustrating them with examples.
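    To give a flavour of the quantitative processing such a FIGARO model can be compiled into, the toy below solves the simplest Markov reliability model: a two-state (up/down) repairable component with failure rate lam and repair rate mu, whose steady-state availability is mu / (lam + mu). The rates are invented, and real FIGARO-derived Markov chains are of course much larger.

```python
# Two-state Markov availability model: dA/dt = -lam*A + mu*(1 - A),
# where A(t) is the probability the component is up. Rates are invented.

lam, mu = 1.0e-3, 1.0e-1       # failures/hour, repairs/hour (illustrative)

def availability_steady():
    """Closed-form steady-state availability of the two-state chain."""
    return mu / (lam + mu)

def availability_transient(t, steps=10000):
    """Explicit-Euler integration of the state equation, A(0) = 1."""
    a, h = 1.0, t / steps
    for _ in range(steps):
        a += h * (-lam * a + mu * (1.0 - a))
    return a
```

    After a few multiples of the relaxation time 1/(lam + mu), the transient solution settles onto the closed-form value, a standard consistency check for such chains.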

  3. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)

  4. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and driven by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
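    The frequency ratio technique mentioned above is simple enough to sketch: for each class of a conditioning factor, FR is the share of event pixels falling in the class divided by the share of total area in the class, so FR > 1 marks a class that concentrates events. The class names and pixel counts below are invented.

```python
# Frequency-ratio (FR) sketch, one of the three BSA techniques the tool
# automates. Class labels and pixel counts are invented for illustration.

def frequency_ratio(class_pixels, event_pixels):
    """FR per class = (% of events in class) / (% of area in class)."""
    total_area = sum(class_pixels.values())
    total_events = sum(event_pixels.values())
    return {c: (event_pixels.get(c, 0) / total_events)
               / (class_pixels[c] / total_area)
            for c in class_pixels}

# e.g. a slope-class raster: pixel count per class, and how many observed
# landslide pixels fall in each class (made-up numbers)
area  = {"gentle": 6000, "moderate": 3000, "steep": 1000}
slide = {"gentle": 10,   "moderate": 40,   "steep": 50}
```

    Summing the FR values of the classes a pixel belongs to, across all conditioning factors, gives the susceptibility index that the AUC validation then scores.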

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and driven by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  6. Transformation of UML models to CSP : a case study for graph transformation tools

    NARCIS (Netherlands)

    Varró, D.; Asztalos, M.; Bisztray, D.; Boronat, A.; Dang, D.; Geiß, R.; Greenyer, J.; Van Gorp, P.M.E.; Kniemeyer, O.; Narayanan, A.; Rencis, E.; Weinell, E.; Schürr, A.; Nagl, M.; Zündorf, A.

    2008-01-01

    Graph transformation provides an intuitive mechanism for capturing model transformations. In the current paper, we investigate and compare various graph transformation tools using a compact practical model transformation case study carried out as part of the AGTIVE 2007 Tool Contest [22]. The aim of

  7. Assessment of Global Forecast Ocean Assimilation Model (FOAM) using new satellite SST data

    Science.gov (United States)

    Ascione Kenov, Isabella; Sykes, Peter; Fiedler, Emma; McConnell, Niall; Ryan, Andrew; Maksymczuk, Jan

    2016-04-01

    There is an increased demand for accurate ocean weather information for applications in the fields of marine safety and navigation, water quality, offshore commercial operations, and monitoring of oil spills and pollutants, among others. The Met Office, UK, provides ocean forecasts to customers from governmental, commercial and ecological sectors using the Global Forecast Ocean Assimilation Model (FOAM), an operational modelling system which covers the global ocean and runs daily, using the NEMO (Nucleus for European Modelling of the Ocean) ocean model with a horizontal resolution of 1/4° and 75 vertical levels. The system assimilates salinity and temperature profiles, sea surface temperature (SST), sea surface height (SSH), and sea ice concentration observations on a daily basis. In this study, the FOAM system is updated to assimilate Advanced Microwave Scanning Radiometer 2 (AMSR2) and Spinning Enhanced Visible and Infrared Imager (SEVIRI) SST data. Model results from one-month trials are assessed against observations using verification tools which provide a quantitative description of model performance and error, based on statistical metrics including mean error, root mean square error (RMSE), correlation coefficient, and Taylor diagrams. A series of hindcast experiments is used to run the FOAM system with AMSR2 and SEVIRI SST data, with a control run for comparison. Results show that all trials perform well over the global ocean and that the largest SST mean errors were found in the Southern Hemisphere. The geographic distribution of the model error for SST and temperature profiles is discussed using statistical metrics evaluated over sub-regions of the global ocean.
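    The point-verification statistics referred to above (mean error, RMSE, correlation) can be computed as in the sketch below; the SST sample values are invented, and the real verification suite evaluates such metrics per trial and per sub-region.

```python
import math

# Basic model-vs-observation verification metrics: mean error (bias),
# RMSE, and Pearson correlation. The SST samples are invented.

def verify(model, obs):
    n = len(model)
    errors = [m - o for m, o in zip(model, obs)]
    bias = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    var_m = sum((m - mm) ** 2 for m in model)
    var_o = sum((o - mo) ** 2 for o in obs)
    corr = cov / math.sqrt(var_m * var_o)
    return bias, rmse, corr

sst_model = [15.2, 16.1, 14.8, 17.0]   # degrees C, invented
sst_obs   = [15.0, 16.4, 14.9, 16.8]

bias, rmse, corr = verify(sst_model, sst_obs)
```

    Note that a near-zero bias does not imply a small RMSE, which is why both appear (together with correlation and Taylor diagrams) in the assessment.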

  8. BPMNDiffViz : a tool for BPMN models comparison

    NARCIS (Netherlands)

    Ivanov, S.Y.; Kalenkova, A.A.; Aalst, van der W.M.P.; Daniel, F.; Zugal, S.

    2015-01-01

    Automatic comparison of business processes plays an important role in their analysis and optimization. In this paper we present the web-based tool BPMNDiffViz, that finds business processes discrepancies and visualizes them. BPMN (Business Process Model and Notation) 2.0 - one of the most commonly

  9. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have...

  10. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Directory of Open Access Journals (Sweden)

    Okokpujie Imhade Princess

    2017-12-01

    Full Text Available In modern machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear against the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using central composite design (CCD), in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination, found to achieve the minimum tool wear of 0.213 mm, is a spindle speed of 2500 rpm, a feed rate of 200 mm/min, an axial depth of cut of 20 mm, and a radial depth of cut of 1.0 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable accuracy range for tool wear prediction.
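    The 31-run experiment follows directly from the central composite design arithmetic for k = 4 factors: 2^4 = 16 factorial points, 2k = 8 axial (star) points, plus centre replicates. The sketch below generates such a design in coded units; alpha = 2.0 (the rotatable value for k = 4) and 7 centre runs are assumptions chosen so that the count comes out at 31, matching the paper's sample size.

```python
from itertools import product

# Central composite design (CCD) generator in coded units. For k = 4,
# alpha = 2.0 and 7 centre runs (both assumed), this yields 16 + 8 + 7
# = 31 design points.

def ccd(k, alpha, n_center):
    pts = [list(p) for p in product((-1.0, 1.0), repeat=k)]   # factorial
    for axis in range(k):                                     # axial points
        for sign in (-alpha, alpha):
            p = [0.0] * k
            p[axis] = sign
            pts.append(p)
    pts += [[0.0] * k for _ in range(n_center)]               # centre runs
    return pts

design = ccd(k=4, alpha=2.0, n_center=7)
```

    Each coded point is then mapped to physical levels of N, f, a and r, run on the machine, and the measured wear is fitted with a second-order response surface.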

  11. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Science.gov (United States)

    Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.

    2017-12-01

    In modern machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear against the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using central composite design (CCD), in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination, found to achieve the minimum tool wear of 0.213 mm, is a spindle speed of 2500 rpm, a feed rate of 200 mm/min, an axial depth of cut of 20 mm, and a radial depth of cut of 1.0 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable accuracy range for tool wear prediction.

  12. Theoretical Modeling of Rock Breakage by Hydraulic and Mechanical Tool

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available Rock breakage by coupled mechanical and hydraulic action has been developed over the past several decades, but theoretical study of rock fragmentation by a mechanical tool with water-pressure assistance is still lacking. A theoretical model of rock breakage by a mechanical tool was developed based on rock fracture mechanics and the solution of Boussinesq's problem; it can explain the process of rock fragmentation as well as predict the peak reacting force. A theoretical model of rock breakage by coupled mechanical and hydraulic action was then developed according to the superposition principle of intensity factors at the crack tip; the reacting force of the mechanical tool assisted by hydraulic action can be reduced considerably if a crack with a critical length can be produced by mechanical or hydraulic impact. The experimental results indicated that the peak reacting force can be reduced by about 15% with the assistance of medium water pressure, and the quick reduction of the reacting force after the peak value decreases the specific energy consumption of rock fragmentation by a mechanical tool. Crack formation by mechanical or hydraulic impact is the prerequisite for improving the effectiveness of combined breakage.
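    The superposition principle the abstract invokes can be shown with a textbook-style toy: the stress-intensity contributions from the mechanical tool and from water pressure in the crack add at the tip, so reaching the fracture toughness K_IC requires less mechanical force when hydraulic pressure assists. The formulas and numbers below are simplified stand-ins, not the paper's actual model.

```python
import math

# Toy superposition of stress-intensity factors at a crack tip:
# K_mech (taken proportional to tool force) + K_hyd (pressurized crack,
# Griffith-type estimate K = p * sqrt(pi * a)) must reach K_IC for
# fracture. All constants are illustrative.

K_IC = 1.5e6            # fracture toughness, Pa*sqrt(m) (invented)
a = 0.01                # crack length, m (invented)

def k_hyd(p):
    """Stress-intensity factor from water pressure p inside the crack."""
    return p * math.sqrt(math.pi * a)

def required_force(p, c=1.0e3):
    """Force F such that c*F + K_hyd(p) = K_IC, with K_mech = c*F assumed."""
    return max(0.0, (K_IC - k_hyd(p)) / c)
```

    With these made-up constants, a 2 MPa water pressure cuts the required mechanical force by roughly a quarter, qualitatively consistent with the reported ~15% reduction of the peak reacting force.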

  13. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but also to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the

  14. Basin-wide seasonal evolution of the Indian Ocean's phytoplankton blooms

    Digital Repository Service at National Institute of Oceanography (India)

    Levy, M.; Shankar, D.; Andre, J.M.; Shenoi, S.S.C.; Durand, F.; De

    JOURNAL OF GEOPHYSICAL RESEARCH, VOL. ???, XXXX, DOI:10.1029/, Basin-wide seasonal evolution of the Indian Ocean's phytoplankton blooms. M. Lévy, D. Shankar, J.-M. André, S. S. C. Shenoi, F. Durand and C. de Boyer Montégut. M. Lévy, LOCEAN... 2. The physical model: We used outputs from the NEMO OGCM in its global configuration ORCA05 (http://www.locean-ipsl.upmc.fr/NEMO). The model run that we used is an updated version of the simulation validated by de Boyer Montégut et al. (in press) over...

  15. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance: in routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more severely impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems: a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict both operator performance in complex systems and the effects of different automation designs on that performance.

  16. Applications and issues of GIS as tool for civil engineering modeling

    Science.gov (United States)

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exists, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations of traditional GIS data models and of the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  17. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
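
    The Viterbi-path feature mentioned above rests on the standard Viterbi algorithm. The following is a minimal, generic sketch over a toy two-state HMM, not HMMEditor's code and not a full profile-HMM topology with match/insert/delete states; all state names and probabilities are invented for illustration.

    ```python
    import math

    def viterbi(obs, states, start_p, trans_p, emit_p):
        """Return the most likely state path (Viterbi path) for an observed sequence."""
        # V[t][s] = log-probability of the best path ending in state s at position t
        V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
        back = [{}]
        for t in range(1, len(obs)):
            V.append({})
            back.append({})
            for s in states:
                best_prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
                V[t][s] = (V[t - 1][best_prev] + math.log(trans_p[best_prev][s])
                           + math.log(emit_p[s][obs[t]]))
                back[t][s] = best_prev
        # Trace back from the best final state
        last = max(states, key=lambda s: V[-1][s])
        path = [last]
        for t in range(len(obs) - 1, 0, -1):
            path.append(back[t][path[-1]])
        return list(reversed(path))

    # Toy two-state "match/insert" model over a DNA alphabet (illustrative numbers)
    states = ("M", "I")
    start = {"M": 0.9, "I": 0.1}
    trans = {"M": {"M": 0.8, "I": 0.2}, "I": {"M": 0.4, "I": 0.6}}
    emit = {"M": {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
            "I": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}
    print(viterbi("AATG", states, start, trans, emit))
    ```

    Log-probabilities are used throughout to avoid numerical underflow on longer sequences, which is the usual choice in HMM implementations.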

  18. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is by no means a new buzzword within the system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability and model management as a whole become indispensable activities of the model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  19. An empirical approach for evaluating the usability of model-driven tools

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Panach, Jose Ignacio; Baars, Arthur Iwan; Vos, Tanja; Pastor, Oscar

    2013-01-01

    MDD tools are very useful to draw conceptual models and to automate code generation. Even though this would bring many benefits, wide adoption of MDD tools is not yet a reality. Various research activities are being undertaken to find out why and to provide the required solutions. However, insufficient

  20. ModelMage: a tool for automatic model generation, selection and management.

    Science.gov (United States)

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating the models, the software can automatically fit them to experimental data, when available, and provide a ranking for model selection. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine. Thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software.
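
    The master-model-to-candidates step can be sketched as a simple subset enumeration. This is an illustrative reconstruction, not ModelMage's code: the reaction names and the split into required versus optional reactions are invented, and the real tool operates on SBML models and delegates fitting to COPASI.

    ```python
    from itertools import combinations

    # Hypothetical master model, reduced to reaction names for illustration
    master_reactions = ["R1_synthesis", "R2_degradation", "R3_feedback", "R4_transport"]

    # Directives: reactions the user marks as optional; all others appear
    # in every candidate model
    optional = ["R3_feedback", "R4_transport"]
    required = [r for r in master_reactions if r not in optional]

    def candidate_models(required, optional):
        """Enumerate candidates: every subset of the optional reactions is left out."""
        for k in range(len(optional) + 1):
            for dropped in combinations(optional, k):
                yield sorted(set(required) | (set(optional) - set(dropped)))

    models = list(candidate_models(required, optional))
    # 2**len(optional) candidates, from the full master model down to required-only
    for m in models:
        print(m)
    ```

    Marking only a small set of reactions as optional keeps the enumeration tractable, which mirrors the abstract's point that ModelMage generates a limited, user-defined set rather than all possible submodels.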

  1. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  2. Metabolic engineering tools in model cyanobacteria.

    Science.gov (United States)

    Carroll, Austin L; Case, Anna E; Zhang, Angela; Atsumi, Shota

    2018-03-26

    Developing sustainable routes for producing chemicals and fuels is one of the most important challenges in metabolic engineering. Photoautotrophic hosts are particularly attractive because of their potential to utilize light as an energy source and CO2 as a carbon substrate through photosynthesis. Cyanobacteria are unicellular organisms capable of photosynthesis and CO2 fixation. While engineering in heterotrophs, such as Escherichia coli, has resulted in a plethora of tools for strain development and hosts capable of producing valuable chemicals efficiently, these techniques are not always directly transferable to cyanobacteria. However, recent efforts have led to an increase in the scope and scale of chemicals that cyanobacteria can produce. Adaptations of important metabolic engineering tools have also been optimized to function in photoautotrophic hosts, including Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-Cas9, 13C Metabolic Flux Analysis (MFA), and Genome-Scale Modeling (GSM). This review explores innovations in cyanobacterial metabolic engineering, and highlights how photoautotrophic metabolism has shaped their development. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  3. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool are proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  4. Design and development of nEMoS, an all-in-one, low-cost, web-connected and 3D-printed device for environmental analysis.

    Science.gov (United States)

    Salamone, Francesco; Belussi, Lorenzo; Danza, Ludovico; Ghellere, Matteo; Meroni, Italo

    2015-06-04

    The Indoor Environmental Quality (IEQ) refers to the quality of the environment in relation to the health and well-being of the occupants. It is a holistic concept, which considers several categories, each related to a specific environmental parameter. This article describes a low-cost and open-source hardware architecture able to detect the indoor variables necessary for the IEQ calculation as an alternative to the traditional hardware used for this purpose. The system consists of some sensors and an Arduino board. One of the key strengths of Arduino is the possibility it affords of loading the script into the board's memory and letting it run without interfacing with computers, thus granting complete independence, portability and accuracy. Recent works have demonstrated that the cost of scientific equipment can be reduced by applying open-source principles to their design using a combination of the Arduino platform and a 3D printer. The evolution of the 3D printer has provided a new means of open design capable of accelerating self-directed development. The proposed nano Environmental Monitoring System (nEMoS) instrument is shown to have good reliability and it provides the foundation for a more critical approach to the use of professional sensors as well as for conceiving new scenarios and potential applications.

  5. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    Science.gov (United States)

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  6. Towards an Experimental Framework for Measuring Usability of Model-Driven Tools

    NARCIS (Netherlands)

    Panach, Jose Ignacio; Condori-Fernandez, Nelly; Baar, Arthur; Vos, Tanja; Romeu, Ignacio; Pastor, Oscar; Campos, Pedro; Graham, Nicholas; Jorge, Joaquim; Nunes, Nuno; Palanque, Philippe; Winckler, Marco

    2011-01-01

    According to the Model-Driven Development (MDD) paradigm, analysts can substantially improve the software development process concentrating their efforts on a conceptual model, which can be transformed into code by means of transformation rules applied by a model compiler. However, MDD tools are not

  7. Modelling and Development of a High Performance Milling Process with Monolithic Cutting Tools

    International Nuclear Information System (INIS)

    Ozturk, E.; Taylor, C. M.; Turner, S.; Devey, M.

    2011-01-01

    Critical aerospace components usually require difficult-to-machine workpiece materials like nickel-based alloys. Moreover, there is a pressing need to maximize the productivity of machining operations. This need can be satisfied by selecting higher feed velocity and axial and radial depths, but several problems may arise during machining in this case. Due to high cutting speeds in high performance machining, the tool life may be unacceptably low. If the magnitudes of cutting forces are high, out-of-tolerance static form errors may result; moreover, in extreme cases, the cutting tool may break apart. Forced vibrations may deteriorate the surface quality, and chatter vibrations may develop if the selected parameters result in instability. In this study, in order to deal with the tool life issue, several experimental cuts are made with different tool geometries, and the best combination in terms of tool life is selected. A force model is developed and its results are verified by experimental results. The force model is used in predicting the effect of process parameters on cutting forces. In order to account for the other concerns, such as static form errors and forced and chatter vibrations, additional process models are currently under development.

  8. GAMBIT: the global and modular beyond-the-standard-model inference tool

    Science.gov (United States)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  9. GAMBIT. The global and modular beyond-the-standard-model inference tool

    Energy Technology Data Exchange (ETDEWEB)

    Athron, Peter; Balazs, Csaba [Monash University, School of Physics and Astronomy, Melbourne, VIC (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are [University of Oslo, Department of Physics, Oslo (Norway); Buckley, Andy [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Chrzaszcz, Marcin [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Polish Academy of Sciences, H. Niewodniczanski Institute of Nuclear Physics, Krakow (Poland); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Dickinson, Hugh [University of Minnesota, Minnesota Institute for Astrophysics, Minneapolis, MN (United States); Jackson, Paul; White, Martin [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); University of Adelaide, Department of Physics, Adelaide, SA (Australia); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); McKay, James [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Mahmoudi, Farvah [Univ Lyon, Univ Lyon 1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon UMR5574, Saint-Genis-Laval (France); CERN, Theoretical Physics Department, Geneva (Switzerland); Martinez, Gregory D. 
[University of California, Physics and Astronomy Department, Los Angeles, CA (United States); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Ripken, Joachim [Max Planck Institute for Solar System Research, Goettingen (Germany); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Saavedra, Aldo [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); The University of Sydney, Faculty of Engineering and Information Technologies, Centre for Translational Data Science, School of Physics, Sydney, NSW (Australia); Scott, Pat [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Seo, Seon-Hee [Seoul National University, Department of Physics and Astronomy, Seoul (Korea, Republic of); Serra, Nicola [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Wild, Sebastian [DESY, Hamburg (Germany); Collaboration: The GAMBIT Collaboration

    2017-11-15

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  10. GAMBIT. The global and modular beyond-the-standard-model inference tool

    International Nuclear Information System (INIS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are; Buckley, Andy; Chrzaszcz, Marcin; Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan; Cornell, Jonathan M.; Dickinson, Hugh; Jackson, Paul; White, Martin; Kvellestad, Anders; Savage, Christopher; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; Wild, Sebastian

    2017-01-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  11. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  12. Assessment of the Clinical Trainer as a Role Model: A Role Model Apperception Tool (RoMAT)

    NARCIS (Netherlands)

    Jochemsen-van der Leeuw, H. G. A. Ria; van Dijk, Nynke; Wieringa-de Waard, Margreet

    2014-01-01

    Purpose Positive role modeling by clinical trainers is important for helping trainees learn professional and competent behavior. The authors developed and validated an instrument to assess clinical trainers as role models: the Role Model Apperception Tool (RoMAT). Method On the basis of a 2011

  13. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  14. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    Science.gov (United States)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used for developing a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out using a CNC turning machine Colchester Master Tornado T4 in dry cutting condition. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each with its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and its results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
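
    The coefficient-to-wear regression described above can be sketched as a simple least-squares fit. All numbers below are invented for illustration (they only mimic the reported inverse trend), and a plain linear model is assumed; the paper's actual regression form and coefficient values are not reproduced here.

    ```python
    import numpy as np

    # Illustrative data only: hypothetical I-kaz 3D coefficients and the
    # corresponding measured flank wear land VB (mm). The coefficient is
    # made to decrease as wear grows, matching the abstract's finding.
    ikaz = np.array([0.0082, 0.0065, 0.0051, 0.0040, 0.0031])
    vb   = np.array([0.05,   0.12,   0.18,   0.25,   0.31])

    # Fit VB = a * ikaz + b; a negative slope a encodes the inverse relationship
    a, b = np.polyfit(ikaz, vb, 1)

    # Estimate wear for a newly observed coefficient (hypothetical reading)
    predicted_vb = a * 0.0046 + b
    print(a, predicted_vb)
    ```

    In a monitoring loop, each new force-signal window would yield a fresh coefficient, and the fitted line would convert it to a wear estimate in real time.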

  15. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API, an Internet-based tool combining DHTML and AJAX that allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which is then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
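
    The coordinate conversion mentioned above boils down to the standard Mercator formulas. The sketch below is a generic illustration using the usual Web-Mercator latitude clamp, not the MARTIAN project's actual utility code.

    ```python
    import math

    def to_mercator(lon_deg, lat_deg):
        """Convert geographic coordinates (degrees) to unit-sphere Mercator x/y.

        x is proportional to longitude; y uses the Mercator stretch
        y = ln(tan(pi/4 + lat/2)), which diverges toward the poles, so
        latitude is clamped to the conventional Web-Mercator limit.
        """
        lat_deg = max(-85.05112878, min(85.05112878, lat_deg))
        x = math.radians(lon_deg)
        y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
        return x, y

    # The equator maps to y = 0; higher latitudes stretch increasingly in y
    print(to_mercator(0.0, 0.0))
    print(to_mercator(0.0, 60.0))
    ```

    Data already expressed in another cylindrical projection would first be converted back to longitude/latitude and then pushed through this mapping, matching the pre-projection step the abstract describes.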

  16. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process.

    Science.gov (United States)

    Araujo, Pablo Granda; Gras, Anna; Ginovart, Marta

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor for modelling microbial activity is the calculation of nutrient amounts and products generated as a result of the microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3-, NO2-, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, which is applicable to a wide range of academic research aimed at designing, optimizing and modelling microbial activity without extensive chemical, microbiological or programming experience.

  17. NASCENT: an automatic protein interaction network generation tool for non-model organisms.

    Science.gov (United States)

    Banky, Daniel; Ordog, Rafael; Grolmusz, Vince

    2009-04-24

    Large quantities of reliable protein interaction data are available for model organisms in public repositories (e.g., MINT, DIP, HPRD, INTERACT). Most data correspond to experiments with the proteins of Saccharomyces cerevisiae, Drosophila melanogaster, Homo sapiens, Caenorhabditis elegans, Escherichia coli and Mus musculus. For other important organisms the data availability is poor or non-existent. Here we present NASCENT, a completely automatic web-based tool and also a downloadable Java program, capable of modeling and generating protein interaction networks even for non-model organisms. The tool performs protein interaction network modeling through gene-name mapping, and outputs the resulting network in graphical form and also in computer-readable graph forms, directly applicable by popular network modeling software. http://nascent.pitgroup.org.
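
    The gene-name-mapping idea behind such network transfer can be sketched in a few lines. Everything below is hypothetical: the gene names, the ortholog map, and the `transfer` helper are illustrative only and do not reflect NASCENT's actual mapping procedure or data sources.

    ```python
    # Hypothetical map from a model organism's gene names to orthologs in the
    # target (non-model) organism; a tool like NASCENT would derive such a
    # mapping automatically from sequence or annotation data.
    ortholog = {"CDC28": "tgCDK1", "CLN2": "tgCYC2", "SIC1": "tgCKI1"}

    # Known interactions in the model organism, as gene-name pairs
    model_interactions = [("CDC28", "CLN2"), ("CDC28", "SIC1"), ("CDC28", "FUS3")]

    def transfer(interactions, ortholog):
        """Keep only interactions where both partners map to the target organism."""
        return [(ortholog[a], ortholog[b]) for a, b in interactions
                if a in ortholog and b in ortholog]

    print(transfer(model_interactions, ortholog))
    # The FUS3 edge is dropped because it has no entry in the ortholog map
    ```

    The surviving edge list is exactly the kind of computer-readable graph form that downstream network modeling software would consume.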

  18. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to obtain optimum performance in hard turning. Various models of hard turning with a cubic boron nitride (CBN) tool have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.

  19. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings, often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models, to characterize system and building behavior. In contrast, physics-based modeling uses first principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  20. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
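The global index described above can be sketched as a weighted aggregation: SERVQUAL-style requirement weights are combined with characteristic scores through a QFD-style relationship matrix. This is a hedged illustration of the general computation, not the SQ model's actual formula; all weights, strengths and scores below are invented for the example.

```python
def global_index(weights, relationship, scores):
    """weights[i]: importance of requirement i (sums to 1);
    relationship[i][j]: QFD-style strength linking requirement i to characteristic j;
    scores[j]: measured attainment of characteristic j (0..1)."""
    idx = 0.0
    for i, w in enumerate(weights):
        row = relationship[i]
        norm = sum(row) or 1.0
        # contribution of requirement i: weighted mean of the characteristics it maps to
        idx += w * sum(r * s for r, s in zip(row, scores)) / norm
    return idx

weights = [0.4, 0.6]             # e.g. reliability, responsiveness (illustrative)
relationship = [[9, 1], [3, 9]]  # QFD-style 9/3/1 relationship strengths
scores = [0.8, 0.5]              # characteristic attainment levels
print(round(global_index(weights, relationship, scores), 3))  # → 0.653
```

A value near 1 would mean the measured characteristics fully satisfy the weighted customer requirements; the normalisation per row keeps each requirement's contribution on the same 0..1 scale.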

  1. Development and application of modeling tools for sodium fast reactor inspection

    Energy Technology Data Exchange (ETDEWEB)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan [CEA LIST, Centre de Saclay F-91191 Gif-sur-Yvette (France)

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  2. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  3. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it was stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and produces as result a state transition system modelling...

  4. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Current and soon-to-be-available sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are in development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. The various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non-thermal particle distribution models. By default, the application integrates IDL-callable DLLs and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows these default libraries to be interchanged with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  5. Greenhouse gases from wastewater treatment — A review of modelling tools

    International Nuclear Information System (INIS)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-01-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N_2O formation, especially due to autotrophic biomass, is still incomplete. It also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a plant-wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs, owing to the considerable difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at large real scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. - Highlights: • The state of the art in GHG production/emission/modelling from WWTPs is outlined. • Detailed mechanisms of N_2O production by AOB are still not completely known. • N_2O modelling could be improved by considering the contribution of both AOB pathways. • To improve protocols, the regulatory framework has to be harmonized among countries. • Plant-wide modelling can help modellers/engineers/operators reduce GHG emissions.

  6. Greenhouse gases from wastewater treatment — A review of modelling tools

    Energy Technology Data Exchange (ETDEWEB)

    Mannina, Giorgio, E-mail: giorgio.mannina@unipa.it [Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, Università di Palermo, Viale delle Scienze, 90100 Palermo (Italy); Ekama, George [Water Research Group, Department of Civil Engineering, University of Cape Town, Rondebosch, 7700 Cape (South Africa); Caniani, Donatella [Department of Engineering and Physics of the Environment, University of Basilicata, viale dell' Ateneo Lucano 10, 85100 Potenza (Italy); Cosenza, Alida [Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, Università di Palermo, Viale delle Scienze, 90100 Palermo (Italy); Esposito, Giovanni [Department of Civil and Mechanical Engineering, University of Cassino and the Southern Lazio, Via Di Biasio, 43, 03043 Cassino, FR (Italy); Gori, Riccardo [Department of Civil and Environmental Engineering, University of Florence, Via Santa Marta 3, 50139 Florence (Italy); Garrido-Baserba, Manel [Department of Civil & Environmental Engineering, University of California, Irvine, CA 92697-2175 (United States); Rosso, Diego [Department of Civil & Environmental Engineering, University of California, Irvine, CA 92697-2175 (United States); Water-Energy Nexus Center, University of California, Irvine, CA 92697-2175 (United States); Olsson, Gustaf [Department of Industrial Electrical Engineering and Automation (IEA), Lund University, Box 118, SE-22100 Lund (Sweden)

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N{sub 2}O formation, especially due to autotrophic biomass, is still incomplete. It also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a plant-wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs, owing to the considerable difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at large real scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. - Highlights: • The state of the art in GHG production/emission/modelling from WWTPs is outlined. • Detailed mechanisms of N{sub 2}O production by AOB are still not completely known. • N{sub 2}O modelling could be improved by considering the contribution of both AOB pathways. • To improve protocols, the regulatory framework has to be harmonized among countries. • Plant-wide modelling can help modellers/engineers/operators reduce GHG emissions.

  7. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To this end, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  8. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    Science.gov (United States)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict the milling tool wear in a regular cut as well as the entry cut and exit cut of a milling tool. The model is based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involved kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, grounded in statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents runs on a milling machine under various operating conditions. Data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on the milling tool flank wear, with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a determination coefficient of 0.95 was obtained. The agreement of this model with the experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
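The PSO component of such a hybrid model can be sketched in isolation: particles explore a hyperparameter space, attracted to their own best and the swarm's best positions. In the paper PSO tunes SVM kernel parameters; here, as a hedged stand-in, it minimises a toy objective playing the role of a cross-validation error surface. All constants, bounds and the objective are illustrative assumptions.

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimise `objective` over box `bounds` with a basic inertia-weight PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # move and clamp to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            v = objective(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Stand-in for an SVM cross-validation error over (log C, log gamma) — illustrative
err = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best, val = pso(err, [(-5, 5), (-5, 5)])
print([round(x, 2) for x in best])  # near [1.0, -2.0]
```

In the real hybrid model the objective would retrain an SVM per candidate parameter set, so the cheap swarm bookkeeping above is dominated by the training cost inside `objective`.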

  9. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is defining the terms supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  10. Predicting the Abrasion Resistance of Tool Steels by Means of Neurofuzzy Model

    Directory of Open Access Journals (Sweden)

    Dragutin Lisjak

    2013-07-01

    This work considers the use of neurofuzzy set theory to estimate the abrasion wear resistance of steels based on chemical composition, heat treatment (austenitising temperature, quenchant and tempering temperature), hardness after hardening and at different tempering temperatures, and volume loss of materials according to ASTM G 65-94. Volume-loss test data for the following groups of materials were taken as the fuzzy data set: carbon tool steels, cold work tool steels, hot work tool steels and high-speed steels. The modelled adaptive neuro-fuzzy inference system (ANFIS) is compared to a statistical model of multivariable non-linear regression (MNLR). The results indicate that abrasion wear resistance can be estimated well for steels whose volume loss is unknown, thus eliminating unnecessary testing.

  11. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process

    Directory of Open Access Journals (Sweden)

    Pablo Araujo Granda

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor in modelling microbial activity is the calculation of the nutrient amounts and products generated as a result of microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs), using the approach of the Thermodynamic Electron Equivalents Model, has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3−, NO2−, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be defined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research aimed at designing, optimizing and modelling microbial activity without requiring extensive chemical, microbiological or programming experience.
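The electron-equivalents bookkeeping behind such reactions can be sketched as follows: an overall metabolic reaction is assembled as R = fe·Ra + fs·Rc − Rd, where Rd, Ra and Rc are the electron-donor, electron-acceptor and cell-synthesis half-reactions and fe + fs = 1 splits the donated electrons between energy and synthesis. This is a hedged illustration of the combination step only; the species coefficients below are invented placeholders, not MbT-Tool's built-in half-reactions.

```python
def combine(Rd, Ra, Rc, fe):
    """Combine half-reactions written per electron equivalent into R = fe*Ra + fs*Rc - Rd.
    Positive coefficients are consumed as written, negative are produced."""
    fs = 1.0 - fe
    overall = {}
    for sp in set(Rd) | set(Ra) | set(Rc):
        coef = fe * Ra.get(sp, 0.0) + fs * Rc.get(sp, 0.0) - Rd.get(sp, 0.0)
        if abs(coef) > 1e-12:
            overall[sp] = round(coef, 4)
    return overall

# Illustrative half-reactions (coefficients per electron equivalent, signs assumed)
Rd = {"CH3COO-": -0.125, "CO2": 0.25}            # acetate as electron donor
Ra = {"O2": 0.25, "H2O": -0.5}                   # oxygen as electron acceptor
Rc = {"CO2": 0.2, "NH4+": 0.05, "cells": -0.05}  # biomass synthesis

overall = combine(Rd, Ra, Rc, fe=0.6)  # 60% of electrons to energy, 40% to cells
print(overall)
```

Varying fe then shifts the balance between oxygen consumed and biomass formed, which is exactly the knob the thermodynamic model computes for each functional group.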

  12. Linear regression metamodeling as a tool to summarize and present simulation model results.

    Science.gov (United States)

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
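The core of the approach can be sketched with a toy two-parameter "simulation model" standing in for the cancer cure model: PSA outcomes are regressed on standardized inputs, so the intercept approximates the base-case outcome and each coefficient ranks a parameter's influence. All parameter names and values here are invented for illustration.

```python
import random

rng = random.Random(0)
n = 2000
x1, x2, y = [], [], []
for _ in range(n):
    p_cure = rng.gauss(0.6, 0.05)   # PSA draw of an effectiveness parameter
    cost = rng.gauss(10.0, 1.0)     # PSA draw of a cost parameter
    x1.append(p_cure)
    x2.append(cost)
    y.append(20.0 * p_cure - 0.5 * cost + rng.gauss(0.0, 0.1))  # model outcome

def standardize(col):
    """Return z-scores: (x - mean) / sd."""
    m = sum(col) / n
    sd = (sum((v - m) ** 2 for v in col) / n) ** 0.5
    return [(v - m) / sd for v in col]

z1, z2 = standardize(x1), standardize(x2)

# With independently sampled (near-orthogonal) z-scores, the OLS coefficients
# reduce to simple averages of z * y.
b0 = sum(y) / n                              # ~ base-case outcome: 20*0.6 - 0.5*10 = 7
b1 = sum(a * b for a, b in zip(z1, y)) / n   # ~ 20 * 0.05 = 1.0
b2 = sum(a * b for a, b in zip(z2, y)) / n   # ~ -0.5 * 1.0 = -0.5
print(round(b0, 2), round(b1, 2), round(b2, 2))
```

Because the inputs are standardized, |b1| > |b2| immediately says the effectiveness parameter drives more outcome uncertainty than the cost parameter, which is the one-glance sensitivity summary the metamodel provides.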

  13. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
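The engineering half of such a tool can be sketched as a decibel-domain link budget feeding a rough channel-count estimate. This is a hedged illustration of the kind of calculation involved, not the cited model's actual method; all link parameters (EIRP, path loss, G/T, bit rate, margin) are invented for the example.

```python
import math

def cn0_dbhz(eirp_dbw, path_loss_db, gt_dbk):
    """Carrier-to-noise-density ratio: C/N0 = EIRP - Lp + G/T - 10*log10(k)."""
    boltzmann_db = -228.6  # 10*log10(1.38e-23 J/K), in dBW/K/Hz
    return eirp_dbw - path_loss_db + gt_dbk - boltzmann_db

def supported_channels(cn0, required_ebn0_db, bitrate_bps, margin_db=3.0):
    """Rough number of channels a transponder C/N0 can carry at a
    required Eb/N0 plus link margin (power-limited case)."""
    per_channel_db = required_ebn0_db + 10 * math.log10(bitrate_bps) + margin_db
    return int(10 ** ((cn0 - per_channel_db) / 10))

# Illustrative L-band mobile downlink numbers
cn0 = cn0_dbhz(eirp_dbw=45.0, path_loss_db=188.0, gt_dbk=5.0)
print(round(cn0, 1), supported_channels(cn0, required_ebn0_db=6.0, bitrate_bps=4800))
```

In a spreadsheet-style model such as the one described, this capacity figure would then feed the revenue and income-statement side, since the number of supportable channels bounds the sellable service.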

  14. How to define the tool kit for the corrective maintenance service? : a tool kit definition model under the service performance criterion

    NARCIS (Netherlands)

    Chen, Denise

    2009-01-01

    Currently, the rules for defining tool kits vary and are oriented mainly toward engineers' perspectives. However, defining a tool kit is a trade-off between cost and service performance. This project is designed to develop a model that can integrate the engineer's preferences

  15. Collaboro: a collaborative (meta) modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  16. Final Report: Simulation Tools for Parallel Microwave Particle in Cell Modeling

    International Nuclear Information System (INIS)

    Stoltz, Peter H.

    2008-01-01

    Transport of high-power rf fields and the subsequent deposition of rf power into plasma is an important component of developing tokamak fusion energy. Two limitations on rf heating are: (i) breakdown of the metallic structures used to deliver rf power to the plasma, and (ii) a detailed understanding of how rf power couples into a plasma. Computer simulation is a main tool for helping solve both of these problems, but one of the premier tools, VORPAL, is traditionally too difficult to use for non-experts. During this Phase II project, we developed the VorpalView user interface tool. This tool allows Department of Energy researchers a fully graphical interface for analyzing VORPAL output to more easily model rf power delivery and deposition in plasmas.

  17. I Feel You: The Design and Evaluation of a Domotic Affect-Sensitive Spoken Conversational Agent

    Directory of Open Access Journals (Sweden)

    Juan Manuel Montero

    2013-08-01

    We describe work on the infusion of emotion into a limited-task autonomous spoken conversational agent situated in the domestic environment, using a need-inspired, task-independent emotion model (NEMO). In order to demonstrate the generation of affect through the use of the model, we describe the work of integrating it with a natural-language, mixed-initiative, HiFi-control spoken conversational agent (SCA). NEMO and the host system communicate externally, removing the need for the Dialog Manager to be modified in order to be adaptive, as is done in most existing dialog systems. The first part of the paper concerns the integration between NEMO and the host agent. The second part summarizes the work on automatic affect prediction, namely of frustration and contentment, from dialog features, a non-conventional source, in an attempt to move towards a more user-centric approach. The final part reports the evaluation results obtained from a user study in which both versions of the agent (non-adaptive and emotionally adaptive) were compared. The results provide substantial evidence of the benefits of adding emotion to a spoken conversational agent, especially in mitigating users' frustration and, ultimately, improving their satisfaction.

  18. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  19. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data.

    Science.gov (United States)

    García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio

    2016-01-28

    Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC-MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a determination coefficient of 0.94 was obtained. The ABC-MARS-based model's goodness of fit to the experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear, with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented.
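The ABC optimisation layer of such a model can be sketched on its own: employed bees perturb candidate "food sources", onlooker bees reinforce good sources via roulette-wheel selection, and scouts replace stagnant ones. In the paper ABC tunes MARS training parameters; here, as a hedged stand-in, it minimises a toy objective playing the role of a cross-validation error surface, and all constants are illustrative assumptions.

```python
import random

def abc_minimize(f, bounds, n_sources=15, iters=100, limit=20, seed=2):
    """Minimise `f` over box `bounds` with a basic artificial bee colony."""
    rng = random.Random(seed)
    dim = len(bounds)
    new_src = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    src = [new_src() for _ in range(n_sources)]
    val = [f(s) for s in src]
    trials = [0] * n_sources  # stagnation counters for the scout phase

    def try_move(i):
        # perturb one dimension of source i relative to a random partner source
        k, d = rng.randrange(n_sources), rng.randrange(dim)
        cand = src[i][:]
        cand[d] += rng.uniform(-1, 1) * (src[i][d] - src[k][d])
        cand[d] = min(max(cand[d], bounds[d][0]), bounds[d][1])
        v = f(cand)
        if v < val[i]:
            src[i], val[i], trials[i] = cand, v, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_sources):               # employed-bee phase
            try_move(i)
        fit = [1.0 / (1.0 + v) for v in val]     # onlooker phase (roulette wheel)
        for _ in range(n_sources):
            try_move(rng.choices(range(n_sources), weights=fit)[0])
        for i in range(n_sources):               # scout phase: abandon stale sources
            if trials[i] > limit:
                src[i] = new_src()
                val[i] = f(src[i])
                trials[i] = 0

    b = min(range(n_sources), key=val.__getitem__)
    return src[b], val[b]

# Stand-in for a MARS cross-validation error over two hyperparameters — illustrative
cv_error = lambda p: (p[0] - 0.5) ** 2 + (p[1] - 3.0) ** 2
best, err = abc_minimize(cv_error, [(-5.0, 5.0), (-5.0, 5.0)])
print([round(x, 2) for x in best])  # near [0.5, 3.0]
```

As with PSO-SVM hybrids, the swarm bookkeeping is cheap; in practice the cost sits inside the objective, which would retrain a MARS model per candidate parameter set.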

  20. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    Directory of Open Access Journals (Sweden)

    Paulino José García Nieto

    2016-01-01

    Full Text Available Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC–MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a coefficient of determination of 0.94 was obtained. The ABC–MARS-based model's goodness of fit to experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented.

  1. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    Science.gov (United States)

    García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio

    2016-01-01

    Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC–MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a coefficient of determination of 0.94 was obtained. The ABC–MARS-based model's goodness of fit to experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented. PMID:28787882

  2. Investigating the turbulence response of a 1-D idealized water column located in the sub-Antarctic zone with focus on the upper ocean dynamics

    CSIR Research Space (South Africa)

    Boodhraj, Kirodh

    2017-09-01

    Full Text Available A one-dimensional ocean physical model was implemented in the sub-Antarctic Southern Ocean using the Nucleus for European Modelling of the Ocean (NEMO) model. It was used to examine the effects of the turbulence response of the simulation...

  3. KENO3D visualization tool for KENO V.a geometry models

    International Nuclear Information System (INIS)

    Bowman, S.M.; Horwedel, J.E.

    1999-01-01

    The standardized computer analyses for licensing evaluations (SCALE) computer software system developed at Oak Ridge National Laboratory (ORNL) is widely used and accepted around the world for criticality safety analyses. SCALE includes the well-known KENO V.a three-dimensional Monte Carlo criticality computer code. Criticality safety analyses often require detailed modeling of complex geometries. Checking the accuracy of these models can be enhanced by effective visualization tools. To address this need, ORNL has recently developed a powerful state-of-the-art visualization tool called KENO3D that enables KENO V.a users to interactively display their three-dimensional geometry models. The interactive options include the following: (1) having shaded or wireframe images; (2) showing standard views, such as top view, side view, front view, and isometric three-dimensional view; (3) rotating the model; (4) zooming in on selected locations; (5) selecting parts of the model to display; (6) editing colors and displaying legends; (7) displaying properties of any unit in the model; (8) creating cutaway views; (9) removing units from the model; and (10) printing the image or saving it to common graphics formats.

  4. Design and Development of nEMoS, an All-in-One, Low-Cost, Web-Connected and 3D-Printed Device for Environmental Analysis

    Directory of Open Access Journals (Sweden)

    Francesco Salamone

    2015-06-01

    Full Text Available The Indoor Environmental Quality (IEQ) refers to the quality of the environment in relation to the health and well-being of the occupants. It is a holistic concept, which considers several categories, each related to a specific environmental parameter. This article describes a low-cost and open-source hardware architecture able to detect the indoor variables necessary for the IEQ calculation as an alternative to the traditional hardware used for this purpose. The system consists of some sensors and an Arduino board. One of the key strengths of Arduino is the possibility it affords of loading the script into the board’s memory and letting it run without interfacing with computers, thus granting complete independence, portability and accuracy. Recent works have demonstrated that the cost of scientific equipment can be reduced by applying open-source principles to their design using a combination of the Arduino platform and a 3D printer. The evolution of the 3D printer has provided a new means of open design capable of accelerating self-directed development. The proposed nano Environmental Monitoring System (nEMoS) instrument is shown to have good reliability and it provides the foundation for a more critical approach to the use of professional sensors as well as for conceiving new scenarios and potential applications.

  5. Graphite-MicroMégas, a tool for DNA modeling

    OpenAIRE

    Hornus , Samuel; Larivière , Damien

    2011-01-01

    National audience; MicroMégas is the current state of an ongoing effort to develop tools for modeling biological assemblies of molecules. We here present its DNA modeling part. MicroMégas is implemented as a plug-in to Graphite, which is a research platform for computer graphics, 3D modeling and numerical geometry that is developed by members of the ALICE team of INRIA. We describe the MicroMégas tool and the techniques it brings into play for the modeling of molecular assemblies, in par...

  6. Tools and data for the geochemical modeling. Thermodynamic data for sulfur species and background salts and tools for the uncertainty analysis; WEDA. Werkzeuge und Daten fuer die Geochemische Modellierung. Thermodynamische Daten fuer Schwefelspezies und Hintergrundsalze sowie Tools zur Unsicherheitsanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Sven; Schoenwiese, Dagmar; Scharge, Tina

    2015-07-15

    The report on tools and data for the geochemical modeling covers the following issues: experimental methods and theoretical models, design of a thermodynamic model for reduced sulfur species, thermodynamic models for background salts, tools for the uncertainty and sensitivity analyses of geochemical equilibrium modeling.

  7. A Modeling approach for analysis and improvement of spindle-holder-tool assembly dynamics

    OpenAIRE

    Budak, Erhan; Ertürk, A.; Erturk, A.; Özgüven, H. N.; Ozguven, H. N.

    2006-01-01

    The most important information required for chatter stability analysis is the dynamics of the involved structures, i.e. the frequency response functions (FRFs) which are usually determined experimentally. In this study, the tool point FRF of a spindle-holder-tool assembly is analytically determined by using the receptance coupling and structural modification techniques. Timoshenko’s beam model is used for increased accuracy. The spindle is also modeled analytically with elastic supports repre...

  8. An assessment of the role of the k-e vertical mixing scheme in the simulation of Southern Ocean upper dynamics

    CSIR Research Space (South Africa)

    Boodhraj, K

    2016-11-01

    Full Text Available Following the work done by Reffray, Calone and Bourdalle-Badie (2015), we implemented a one-dimensional (1D) ocean physical model in the sub-Antarctic Southern Ocean using the Nucleus for European Modelling of the Ocean (NEMO) model. The 1D model...

  9. MODERN TOOLS FOR MODELING ACTIVITY IT-COMPANIES

    Directory of Open Access Journals (Sweden)

    Марина Петрівна ЧАЙКОВСЬКА

    2015-05-01

    Full Text Available Increasing competition in the market for web-based applications increases the importance of service quality and of optimizing the processes of interaction with customers. The purpose of the article is to develop recommendations for improving the business processes of IT enterprises in the web-application segment based on technological tools for business modeling; to shape requirements for the development of an information system (IS) for customer interaction; and to analyze effective means of implementation and evaluate the economic effects of its introduction. A scheme of the business process for developing and launching a website was built based on the analysis of business-process models and “swim lane” models, and requirements for an IS for customer relationship management for a web studio were established. The market of software for creating such an IS was analyzed, and the products corresponding to the requirements were selected. The IS was developed and tested, implemented in the company, and an appraisal of the economic effect was conducted.

  10. Force Sensor Based Tool Condition Monitoring Using a Heterogeneous Ensemble Learning Model

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2014-11-01

    Full Text Available Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which the harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, the homogeneous ensemble learning model and majority voting strategy are also adopted to make a comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
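    The mRMR step named in this abstract is usually formulated with mutual information; the sketch below substitutes absolute Pearson correlation for both the relevance and redundancy terms, a common simplification. The function name and scoring are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy mRMR feature selection (correlation-based approximation).

    Relevance  : |Pearson correlation| between a feature and the target.
    Redundancy : mean |correlation| with the already-selected features.
    """
    n_feat = X.shape[1]
    # absolute correlation of each feature column with the target
    rel = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_feat)])
    selected = [int(np.argmax(rel))]          # start with the most relevant
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                           for s in selected])
            score = rel[j] - red              # maximal relevance, minimal redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# usage: feature 1 duplicates feature 0, so it is skipped in favour of feature 2
rng = np.random.default_rng(1)
a, b = rng.normal(size=200), rng.normal(size=200)
y = a + b
X = np.column_stack([a, a, b, rng.normal(size=200)])
picked = mrmr_select(X, y, 2)
```

    The paper's pipeline would feed the selected harmonic force features into the stacked SVM/HMM/RBF ensemble.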

  11. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    International Nuclear Information System (INIS)

    Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin

    2014-01-01

    First book available on systematic evaluations of the performance of the global climate model FGOALS. Covers the whole field, ranging from the development to the applications of this climate system model. Provides an outlook for the future development of the FGOALS model system. Offers a brief introduction to how to run FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal predictions and scenario projections of future climate change. ''Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community'' is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook on the future development of FGOALS and offers an overview of how to employ the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  12. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    Science.gov (United States)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

    Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, and this is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF files (Resource Description Framework). This ontology is based on core concepts

  13. KENO3D Visualization Tool for KENO V.a and KENO-VI Geometry Models

    International Nuclear Information System (INIS)

    Horwedel, J.E.; Bowman, S.M.

    2000-01-01

    Criticality safety analyses often require detailed modeling of complex geometries. Effective visualization tools can enhance checking the accuracy of these models. This report describes the KENO3D visualization tool developed at the Oak Ridge National Laboratory (ORNL) to provide visualization of KENO V.a and KENO-VI criticality safety models. The development of KENO3D is part of the current efforts to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system

  14. Scale models: A proven cost-effective tool for outage planning

    Energy Technology Data Exchange (ETDEWEB)

    Lee, R. [Commonwealth Edison Co., Morris, IL (United States); Segroves, R. [Sargent & Lundy, Chicago, IL (United States)

    1995-03-01

    As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged by lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation, planning, and monitoring of maintenance, modifications, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.

  15. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
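    The parallelization idea is that property checks on independently sampled runs are embarrassingly parallel. The toy sketch below estimates a reachability probability of a random walk across worker threads; it only illustrates the sampling step, and the model and names are assumptions (PVeStA itself executes Maude specifications and distributes runs over many servers):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_holds(seed, start=1, lo=0, hi=3):
    """Simulate one run of a toy stochastic model (a symmetric random walk)
    and check the property 'the walk reaches hi before lo'."""
    rng = random.Random(seed)
    state = start
    while lo < state < hi:
        state += rng.choice((-1, 1))
    return state == hi

def estimate_probability(n_samples=20000, n_workers=4):
    """Embarrassingly parallel estimate of P(property) over independent runs."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        outcomes = pool.map(run_holds, range(n_samples))  # one seed per run
        return sum(outcomes) / n_samples

p_hat = estimate_probability()
```

    For the symmetric walk started at 1 with absorbing barriers 0 and 3, gambler's-ruin theory gives P = 1/3, so the estimate lands near 0.333; real statistical model checkers add sequential stopping rules to decide how many samples are enough for a requested confidence.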

  16. Successful treatment with infliximab for inflammatory colitis in a patient with X-linked anhidrotic ectodermal dysplasia with immunodeficiency.

    Science.gov (United States)

    Mizukami, Tomoyuki; Obara, Megumi; Nishikomori, Ryuta; Kawai, Tomoki; Tahara, Yoshihiro; Sameshima, Naoki; Marutsuka, Kousuke; Nakase, Hiroshi; Kimura, Nobuhiro; Heike, Toshio; Nunoi, Hiroyuki

    2012-02-01

    X-linked anhidrotic ectodermal dysplasia with immunodeficiency (X-EDA-ID) is caused by hypomorphic mutations in the gene encoding nuclear factor-κB essential modulator protein (NEMO). Patients are susceptible to diverse pathogens due to insufficient cytokine production and frequently show severe chronic colitis. An 11-year-old boy with X-EDA-ID was hospitalized with autoimmune symptoms and severe chronic colitis which had been refractory to immunosuppressive drugs. Since tumor necrosis factor (TNF) α is responsible for the pathogenesis of NEMO colitis according to studies of intestinal NEMO and additional TNFR1 knockout mice, and high levels of TNFα-producing mononuclear cells were detected in the patient due to unexpected gene reversion mosaicism of NEMO, an anti-TNFα monoclonal antibody was administered to ameliorate his abdominal symptoms. Repeated administrations improved his colonoscopic findings as well as his dry skin, along with a reduction of TNFα-expressing T cells. These findings suggest that TNF blockade therapy is of value for refractory NEMO colitis with gene reversion.

  17. A model for flexible tools used in minimally invasive medical virtual environments.

    Science.gov (United States)

    Soler, Francisco; Luzon, M Victoria; Pop, Serban R; Hughes, Chris J; John, Nigel W; Torres, Juan Carlos

    2011-01-01

    Within the limits of current technology, many applications of a virtual environment will trade off accuracy for speed. This is not an acceptable compromise in a medical training application, where both are essential. Efficient algorithms must therefore be developed. The purpose of this project is the development and validation of a novel physics-based real-time tool manipulation model, which is easy to integrate into any medical virtual environment that requires support for the insertion of long flexible tools into complex geometries. This encompasses medical specialities such as vascular interventional radiology, endoscopy, and laparoscopy, where training, prototyping of new instruments/tools and mission rehearsal can all be facilitated by using an immersive medical virtual environment. Our model recognises patient-specific data and uses it accurately, adapting to the geometrical complexity of the vessel in real time.

  18. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns gain importance for non-technical business users to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use those patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns are theoretical constructs, not artefacts that can immediately be executed. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. To this end, we introduce a continuous tool chain beginning at the design phase and ending in the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  19. MTK: An AI tool for model-based reasoning

    Science.gov (United States)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  20. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    OpenAIRE

    García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio

    2016-01-01

    Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with multivariate adaptive regression splines (MARS) technique. This optimization mechanism i...

  1. Human-scale interaction for virtual model displays: a clear case for real tools

    Science.gov (United States)

    Williams, George C.; McDowall, Ian E.; Bolas, Mark T.

    1998-04-01

    We describe a hand-held user interface for interacting with virtual environments displayed on a Virtual Model Display. The tool, constructed entirely of transparent materials, is see-through. We render a graphical counterpart of the tool on the display and map it one-to-one with the real tool. This feature, combined with a capability for touch-sensitive, discrete input, results in a useful spatial input device that is visually versatile. We discuss the tool's design and the interaction techniques it supports. Briefly, we look at the human factors issues and engineering challenges presented by this tool and, in general, by the class of hand-held user interfaces that are see-through.

  2. A temperature dependent cyclic plasticity model for hot work tool steel including particle coarsening

    Science.gov (United States)

    Jilg, Andreas; Seifert, Thomas

    2018-05-01

    Hot work tools are subjected to complex thermal and mechanical loads during hot forming processes. Locally, the stresses can exceed the material's yield strength in highly loaded areas as e.g. in small radii in die cavities. To sustain the high loads, the hot forming tools are typically made of martensitic hot work steels. While temperatures for annealing of the tool steels usually lie in the range between 400 and 600 °C, the steels may experience even higher temperatures during hot forming, resulting in softening of the material due to coarsening of strengthening particles. In this paper, a temperature dependent cyclic plasticity model for the martensitic hot work tool steel 1.2367 (X38CrMoV5-3) is presented that includes softening due to particle coarsening and that can be applied in finite-element calculations to assess the effect of softening on the thermomechanical fatigue life of hot work tools. To this end, a kinetic model for the evolution of the mean size of secondary carbides based on Ostwald ripening is coupled with a cyclic plasticity model with kinematic hardening. Mechanism-based relations are developed to describe the dependency of the mechanical properties on carbide size and temperature. The material properties of the mechanical and kinetic model are determined on the basis of tempering hardness curves as well as monotonic and cyclic tests.
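    The coarsening kinetics referred to in this abstract are commonly written in the Lifshitz-Slyozov-Wagner (LSW) form; the relations below are the generic textbook versions, not necessarily the authors' calibrated formulation:

```latex
% Ostwald ripening of the mean secondary-carbide radius \bar{r}
\bar{r}^{3}(t) - \bar{r}_{0}^{3} = K(T)\,t,
\qquad
K(T) = K_{0}\exp\!\left(-\frac{Q}{RT}\right)
```

    Here $Q$ is an activation energy and $K_{0}$ a rate constant, both generic symbols rather than the paper's fitted values. A precipitate-strengthening term that decays as the carbides coarsen, e.g. an Orowan-type contribution proportional to $\sqrt{f_v}/\bar{r}$ for particle volume fraction $f_v$, is one way such a kinetic law can be coupled into a cyclic plasticity model with kinematic hardening.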

  3. Prediction of the wear and evolution of cutting tools in a carbide / titanium-aluminum-vanadium machining tribosystem by volumetric tool wear characterization and modeling

    Science.gov (United States)

    Kuttolamadom, Mathew Abraham

    The objective of this research work is to create a comprehensive microstructural wear mechanism-based predictive model of tool wear in the tungsten carbide / Ti-6Al-4V machining tribosystem, and to develop a new topology characterization method for worn cutting tools in order to validate the model predictions. This is accomplished by blending first principle wear mechanism models using a weighting scheme derived from scanning electron microscopy (SEM) imaging and energy dispersive x-ray spectroscopy (EDS) analysis of tools worn under different operational conditions. In addition, the topology of worn tools is characterized through scanning by white light interferometry (WLI), and then application of an algorithm to stitch and solidify data sets to calculate the volume of the tool worn away. The methodology was to first combine and weight dominant microstructural wear mechanism models, to be able to effectively predict the tool volume worn away. Then, by developing a new metrology method for accurately quantifying the bulk-3D wear, the model-predicted wear was validated against worn tool volumes obtained from corresponding machining experiments. On analyzing worn crater faces using SEM/EDS, adhesion was found dominant at lower surface speeds, while dissolution wear dominated with increasing speeds -- this is in conformance with the lower relative surface speed requirement for micro welds to form and rupture, essentially defining the mechanical load limit of the tool material. It also conforms to the known dominance of high temperature-controlled wear mechanisms with increasing surface speed, which is known to exponentially increase temperatures especially when machining Ti-6Al-4V due to its low thermal conductivity. Thus, straight tungsten carbide wear when machining Ti-6Al-4V is mechanically-driven at low surface speeds and thermally-driven at high surface speeds. 
Further, at high surface speeds, craters were formed due to carbon diffusing to the tool surface and

  4. Transposons As Tools for Functional Genomics in Vertebrate Models.

    Science.gov (United States)

    Kawakami, Koichi; Largaespada, David A; Ivics, Zoltán

    2017-11-01

    Genetic tools and mutagenesis strategies based on transposable elements are currently under development with a vision to link primary DNA sequence information to gene functions in vertebrate models. By virtue of their inherent capacity to insert into DNA, transposons can be developed into powerful tools for chromosomal manipulations. Transposon-based forward mutagenesis screens have numerous advantages including high throughput, easy identification of mutated alleles, and providing insight into genetic networks and pathways based on phenotypes. For example, the Sleeping Beauty transposon has become highly instrumental to induce tumors in experimental animals in a tissue-specific manner with the aim of uncovering the genetic basis of diverse cancers. Here, we describe a battery of mutagenic cassettes that can be applied in conjunction with transposon vectors to mutagenize genes, and highlight versatile experimental strategies for the generation of engineered chromosomes for loss-of-function as well as gain-of-function mutagenesis for functional gene annotation in vertebrate models, including zebrafish, mice, and rats. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Community Intercomparison Suite (CIS) v1.4.0: A tool for intercomparing models and observations

    NARCIS (Netherlands)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip

    2016-01-01

    The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in-situ and model data. While there are a number of tools available for working with climate model data, the large diversity of

  6. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.; Roth, E.M.

    1990-01-01

    The US Nuclear Regulatory Commission is sponsoring a research program to develop improved methods to model the cognitive behavior of nuclear power plant (NPP) personnel. Under this program, a tool for simulating how people form intentions to act in NPP emergency situations was developed using artificial intelligence (AI) techniques. This tool is called the Cognitive Environment Simulation (CES). The Cognitive Reliability Assessment Technique (or CREATE) was also developed to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. The next step in the research program was to evaluate the modeling tool and the method for using the tool for Human Reliability Analysis (HRA) in PRAs. Three evaluation activities were conducted. First, a panel of highly distinguished experts in cognitive modeling, AI, PRA and HRA provided a technical review of the simulation development work. Second, based on panel recommendations, CES was exercised on a family of steam generator tube rupture incidents where empirical data on operator performance already existed. Third, a workshop with HRA practitioners was held to analyze a worked example of the CREATE method to evaluate the role of CES/CREATE in HRA. The results of all three evaluations indicate that CES/CREATE represents a promising approach to modeling operator intention formation during emergency operations

  7. Modelling Greenland icebergs

    Science.gov (United States)

    Marson, Juliana M.; Myers, Paul G.; Hu, Xianmin

    2017-04-01

The Atlantic Meridional Overturning Circulation (AMOC) is well known for carrying heat from low to high latitudes, moderating local temperatures. Numerical studies have examined the AMOC's variability under the influence of freshwater input to subduction and deep convection sites. However, an important source of freshwater has often been overlooked or misrepresented: icebergs. While liquid runoff decreases the ocean salinity near the coast, icebergs are a gradual and remote source of freshwater - a difference that affects sea ice cover, temperature, and salinity distribution in ocean models. Icebergs originating from the Greenland ice sheet, in particular, can affect the subduction process in the Labrador Sea by decreasing surface water density. Our study aims to evaluate the distribution of icebergs originating from Greenland and their contribution to freshwater input in the North Atlantic. To do so, we use an interactive iceberg module coupled with the Nucleus for European Modelling of the Ocean (NEMO v3.4), which calves icebergs from Greenland according to rates established by Bamber et al. (2012). Details on the distribution and trajectory of icebergs within the model may also be of use for understanding potential navigation threats, as shipping increases in northern waters.

  8. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Science.gov (United States)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median

  9. Modeling as a tool for process control: alcoholic fermentation

    Energy Technology Data Exchange (ETDEWEB)

Tayeb, A M; Ashour, I A; Mostafa, N A [El-Minia Univ. (EG), Faculty of Engineering]

    1991-01-01

    The results of the alcoholic fermentation of beet sugar molasses and wheat milling residues (Akalona) were fed into a computer program. Consequently, the kinetic parameters for these fermentation reactions were determined. These parameters were put into a kinetic model. Next, the model was tested, and the results obtained were compared with the experimental results of both beet molasses and Akalona. The deviation of the experimental results from the results obtained from the model was determined. An acceptable deviation of 1.2% for beet sugar molasses and 3.69% for Akalona was obtained. Thus, the present model could be a tool for chemical engineers working in fermentation processes both with respect to the control of the process and the design of the fermentor. (Author).
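
The workflow described above (fit kinetic parameters, run the model, compare against experiment) can be sketched generically. The snippet below uses a simple Monod-type batch-fermentation model and computes a mean relative deviation between model and measured biomass; all parameters and data points are invented for illustration and are not the paper's fitted values.

```python
import numpy as np

MU_MAX, KS, YXS = 0.4, 2.0, 0.5   # illustrative kinetic parameters (1/h, g/L, g/g)

def biomass_at(hours, s0=20.0, x0=1.0, dt=0.01):
    """Integrate dX/dt = mu(S) * X with Monod kinetics by explicit Euler."""
    x, s = x0, s0
    for _ in range(int(hours / dt)):
        mu = MU_MAX * s / (KS + s)        # Monod specific growth rate
        dx = mu * x * dt
        x += dx
        s = max(s - dx / YXS, 0.0)        # substrate consumed per unit biomass
    return x

experimental = np.array([2.0, 4.1, 8.3])             # invented measurements (g/L)
model = np.array([biomass_at(t) for t in (2, 4, 6)])
deviation = 100.0 * np.mean(np.abs(model - experimental) / experimental)
```

The deviation figure plays the role of the 1.2% / 3.69% agreement numbers quoted in the abstract, though here it is computed on made-up data.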

  10. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    Science.gov (United States)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-10-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school students. To evaluate usability, we analyzed students' task performance and task completion time as they worked on an activity with the repurposed modeling technology. In stage 1, we conducted remote testing of an early modeling prototype with urban middle school students (n = 84). In stages 2 and 3, we used screencasting software to record students' mouse and keyboard movements during collaborative think-alouds (n = 22) and conducted a qualitative analysis of their peer discussions. Taken together, the study findings revealed two kinds of usability issues that interfered with students' productive use of the tool: issues related to the use of data and information, and issues related to the use of the modeling technology. The study findings resulted in design improvements that led to stronger usability outcomes and higher task performance among students. In this paper, we describe our methods for usability testing, our research findings, and our design solutions for supporting students' use of the modeling technology and use of data. The paper concludes with implications for the design and study of modeling technologies for science learning.

  11. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.
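
The dual-use idea, one abstract tool interface with framework-specific implementations behind it, can be sketched in a few lines (class and method names here are illustrative, not ATLAS code):

```python
from abc import ABC, abstractmethod

class AnalysisTool(ABC):
    """Abstract tool interface shared by both frameworks (hypothetical names)."""
    @abstractmethod
    def execute(self, event: str) -> str: ...

class AthenaBackend(AnalysisTool):
    def execute(self, event: str) -> str:
        return f"athena:{event}"          # stand-in for Athena-specific work

class RootBackend(AnalysisTool):
    def execute(self, event: str) -> str:
        return f"root:{event}"            # stand-in for ROOT-specific work

def run_analysis(tool: AnalysisTool, events):
    # Analysis code depends only on the abstract interface, so the same
    # logic runs unchanged under either framework backend.
    return [tool.execute(e) for e in events]
```

The point of the pattern is that `run_analysis` never names a framework; swapping backends requires no change to the analysis logic.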

  12. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  13. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    Energy Technology Data Exchange (ETDEWEB)

Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin (eds.) [Chinese Academy of Sciences, Beijing (China), Inst. of Atmospheric Physics]

    2014-04-01

First book available on systematic evaluations of the performance of the global climate model FGOALS. Covers the whole field, ranging from the development to the applications of this climate system model. Provides an outlook for the future development of the FGOALS model system. Offers a brief introduction to how to run FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal predictions and scenario projections of future climate change. "Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community" is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook on the future development of FGOALS and offers an overview of how to employ the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  14. A remote sensing computer-assisted learning tool developed using the unified modeling language

    Science.gov (United States)

    Friedrich, J.; Karslioglu, M. O.

The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners, and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are being used. After introducing the constructed UML model, its implementation is briefly described, followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the authors' institution.

  15. TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.

    Science.gov (United States)

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-10-01

Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER® v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should manage to reduce the uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under Registration, Evaluation, Authorisation and restriction of CHemicals (REACH). © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
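
The translation idea can be sketched as a lookup from one model's vocabulary to suggested parameters in another's. The parameter names and mappings below are invented for illustration; they are not TREXMO's actual translation tables.

```python
# Hypothetical one-way mapping from ART-style parameters to TRA-style suggestions.
ART_TO_TRA = {
    "near_field_source": {"use_pattern": "manual application"},
    "general_ventilation_good": {"ventilation": "enhanced general"},
}

def translate_to_tra(art_parameters):
    """Collect TRA-side suggestions for each recognised ART-side parameter."""
    suggestions = {}
    for name in art_parameters:
        suggestions.update(ART_TO_TRA.get(name, {}))
    return suggestions
```

A real translation is only semi-automatic because some parameters have no clean counterpart, which is why the tool narrows choices rather than deciding outright.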

  16. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  17. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and to demonstrate, a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  18. MODEL CAR TRANSPORT SYSTEM - MODERN ITS EDUCATION TOOL

    Directory of Open Access Journals (Sweden)

    Karel Bouchner

    2017-12-01

Full Text Available The model car transport system is a laboratory intended for practical development in the area of motor traffic. It is also an important education tool for students' hands-on training, enabling students to test the results of their own studies. The main part of the model car transportation network is a model at a scale of 1:87 (HO), based on component units of the FALLER Car system, e.g. cars, traffic lights, carriageways, parking spaces, stop sections, branch-off junctions, sensors and control sections. The model makes it possible to simulate real traffic situations. It includes motor traffic in a city, in a small village, and on a carriageway between a city and a village, including a railway crossing. The traffic infrastructure includes different kinds of intersections, such as T-junctions, a classic four-way crossroad and a four-way traffic circle, with and without traffic-light control. Another important part of the model is a segment of a highway which includes an elevated crossing with highway approaches and exits.

  19. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    Directory of Open Access Journals (Sweden)

    Wilson David P

    2008-02-01

Full Text Available SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT, as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel, but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling, and their usefulness in this context is demonstrated.
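
The core analysis SaSAT automates, stratified sampling of parameter space followed by rank correlation between each parameter and the model output, can be sketched generically. The toy model and all numbers below are ours, not SaSAT's.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_params):
    """One sample per equal-probability stratum, shuffled independently per parameter."""
    strata = (np.arange(n_samples) + rng.uniform(size=(n_params, n_samples))) / n_samples
    for row in strata:
        rng.shuffle(row)
    return strata.T          # shape (n_samples, n_params), values in [0, 1)

X = latin_hypercube(500, 2)
# Toy model: output depends strongly on parameter 0, weakly on parameter 1.
y = 3.0 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0.0, 0.01, 500)

rho_0, _ = spearmanr(X[:, 0], y)   # rank correlation for the dominant parameter
rho_1, _ = spearmanr(X[:, 1], y)   # rank correlation for the weak parameter
```

Ranking parameters by |rho| is one simple form of the factor prioritisation the abstract mentions.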

  20. Materials modelling - a possible design tool for advanced nuclear applications

    International Nuclear Information System (INIS)

    Hoffelner, W.; Samaras, M.; Bako, B.; Iglesias, R.

    2008-01-01

The design of components for power plants is usually based on codes, standards and design rules or code cases. However, it is very difficult to obtain the necessary experimental data to prove these lifetime assessment procedures for long-term applications in environments where complex damage interactions (temperature, stress, environment, irradiation) can occur. The rules used are often very simple and do not have a basis which takes physical damage into consideration. The linear life-fraction rule for creep and fatigue interaction can be taken as a prominent example. Materials modelling based on a multi-scale approach in principle provides a tool to convert microstructural findings into mechanical response and therefore has the capability of providing a set of tools for the improvement of design life assessments. The strength of current multi-scale modelling efforts is the insight they offer as regards experimental phenomena. To obtain an understanding of these phenomena it is important to focus on the issues that are relevant at the various time and length scales of the modelling code. In this presentation the multi-scale path will be demonstrated with a few recent examples which focus on VHTR applications. (authors)

  1. Modeling with a view to target identification in metabolic engineering: a critical evaluation of the available tools.

    Science.gov (United States)

    Maertens, Jo; Vanrolleghem, Peter A

    2010-01-01

The state-of-the-art tools for modeling metabolism, typically used in the domain of metabolic engineering, were reviewed. The tools considered are stoichiometric network analysis (elementary modes and extreme pathways), stoichiometric modeling (metabolic flux analysis, flux balance analysis, and carbon modeling), mechanistic and approximative modeling, cybernetic modeling, and multivariate statistics. In the context of metabolic engineering, one should be aware that the usefulness of these tools for optimizing microbial metabolism to overproduce a target compound depends predominantly on the characteristic properties of that compound. Because of their shortcomings, not all tools are suitable for every kind of optimization; issues such as the dependence of the target compound's synthesis on severe (redox) constraints, the characteristics of its formation pathway, and the achievable/desired flux towards the target compound should play a role when choosing the optimization strategy.
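
Of the tools listed, flux balance analysis is the most compact to illustrate: maximize a target flux subject to steady-state stoichiometry and flux bounds. The three-reaction network below is invented for illustration and is not taken from the review.

```python
import numpy as np
from scipy.optimize import linprog

# Invented toy network:  v1: uptake -> A,   v2: A -> B,   v3: B -> product
S = np.array([
    [1, -1,  0],   # steady-state balance on metabolite A
    [0,  1, -1],   # steady-state balance on metabolite B
])
bounds = [(0, 10), (0, None), (0, None)]     # uptake flux capped at 10 units

# FBA: maximize the product flux v3, i.e. minimize -v3, subject to S @ v = 0
result = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
optimal_fluxes = result.x
```

At the optimum all three fluxes sit at the uptake cap, since the chain forces v1 = v2 = v3 at steady state.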

  2. Modelling tools to evaluate China's future energy system - a review of the Chinese perspective

    DEFF Research Database (Denmark)

    Mischke, Peggy; Karlsson, Kenneth Bernard

    2014-01-01

This paper compares 18 energy modelling tools from ten Chinese institutions. These models have been described in English-language publications between 2005 and 2013, although not all are published in peer-reviewed journals. When comparing the results for three main energy system indicators across models, this paper finds that there are considerable ranges in the reference scenarios: (i) GDP is projected to grow by 630-840% from 2010 to 2050, (ii) energy demand could increase by 200-300% from 2010 to 2050, and (iii) CO2 emissions could rise by 160-250% from 2010 to 2050. Although the access to the modelling tools...

  3. A Micro-Grid Simulator Tool (SGridSim) using Effective Node-to-Node Complex Impedance (EN2NCI) Models

    Energy Technology Data Exchange (ETDEWEB)

Udhay Ravishankar; Milos Manic

    2013-08-01

This paper presents a micro-grid simulator tool (SGridSim) useful for implementing and testing multi-agent controllers. As a common engineering practice it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents a SGridSim micro-grid simulator tool that simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of the components that typically make up a micro-grid. The term EN2NCI models means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; and 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
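
The node-to-node impedance idea reduces, for a single branch, to phasor arithmetic in the complex domain: a single complex impedance tied between two voltage nodes yields the branch current and complex power directly. All values below are invented for illustration.

```python
import cmath

V1 = cmath.rect(480.0, 0.0)     # node-1 voltage phasor (V)
V2 = cmath.rect(460.0, -0.05)   # node-2 voltage phasor (V), slight angle lag
Z12 = complex(0.5, 2.0)         # effective node-to-node impedance (ohm)

I12 = (V1 - V2) / Z12           # branch current phasor (A), Ohm's law
S1 = V1 * I12.conjugate()       # complex power leaving node 1 (VA)
P, Q = S1.real, S1.imag         # active (W) and reactive (var) components
```

Because the whole computation stays in the complex (phasor) domain, no time-domain transient simulation is needed, which is the speed advantage the abstract claims.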

  4. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  5. Bio-AIMS Collection of Chemoinformatics Web Tools based on Molecular Graph Information and Artificial Intelligence Models.

    Science.gov (United States)

    Munteanu, Cristian R; Gonzalez-Diaz, Humberto; Garcia, Rafael; Loza, Mabel; Pazos, Alejandro

    2015-01-01

Encoding molecular information into molecular descriptors is the first step in in silico Chemoinformatics methods in Drug Design. Machine Learning methods offer a powerful way to find prediction models for specific biological properties of molecules. These models connect molecular structure information, such as atom connectivity (molecular graphs) or physical-chemical properties of an atom or group of atoms, to molecular activity (Quantitative Structure-Activity Relationship, QSAR). Due to the complexity of proteins, the prediction of their activity is a complicated task, and the interpretation of the models is more difficult. The current review presents a series of 11 prediction models for proteins, implemented as free Web tools on an Artificial Intelligence Model Server in Biosciences, Bio-AIMS (http://bio-aims.udc.es/TargetPred.php). Six tools predict protein activity, two models evaluate drug-protein target interactions, and the other three calculate protein-protein interactions. The input information is based on the protein 3D structure for nine models, the 1D peptide amino acid sequence for three tools, and drug SMILES formulas for two servers. The molecular graph descriptor-based Machine Learning models could be useful tools for the in silico screening of new peptides/proteins as future drug targets for specific treatments.
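
The QSAR idea in its simplest form is a regression from descriptor vectors to an activity value. The descriptor matrix and activities below are invented for illustration; real QSAR models use far richer descriptors and non-linear learners.

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 3.0],
              [4.0, 2.0]])                 # rows: molecules, cols: descriptors
y = np.array([1.1, 1.0, 2.1, 2.0])         # measured activity (invented)

Xb = np.column_stack([X, np.ones(len(X))]) # add an intercept column
coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)  # least-squares QSAR fit
predicted = Xb @ coef                      # in-sample activity predictions
```

In practice such a fitted model is then applied to descriptors of unseen molecules, which is the virtual-screening use the abstract describes.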

  6. DYNAMO-HIA: a Dynamic Modeling tool for generic Health Impact Assessments.

    Directory of Open Access Journals (Sweden)

    Stefan K Lhachimi

Full Text Available BACKGROUND: Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, general accessibility) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. METHODS AND RESULTS: DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures (e.g. life expectancy and disease-free life expectancy) and detailed data (e.g. prevalences and mortality/survival rates) by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease burden of smoking. CONCLUSION: By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based on epidemiological evidence.
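
The Markov backbone of such a tool, explicit risk-factor states with transition probabilities projected forward over a population, can be sketched in a few lines. The states and rates below are invented for illustration and are not DYNAMO-HIA parameters.

```python
import numpy as np

# Hypothetical 3-state annual transition matrix (rows sum to 1).
P = np.array([
    [0.97, 0.03, 0.00],   # never-smoker  -> never / current / former
    [0.00, 0.94, 0.06],   # current smoker
    [0.00, 0.02, 0.98],   # former smoker
])

prevalence = np.array([0.60, 0.30, 0.10])  # initial state distribution
for _ in range(20):                        # project 20 annual cycles
    prevalence = prevalence @ P            # left-multiply: state row vector
```

A full HIA tool layers disease incidence and mortality on top of each state; this sketch shows only the risk-factor projection step.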

  7. SPRINT: A Tool to Generate Concurrent Transaction-Level Models from Sequential Code

    Directory of Open Access Journals (Sweden)

    Richard Stahl

    2007-01-01

    Full Text Available A high-level concurrent model such as a SystemC transaction-level model can provide early feedback during the exploration of implementation alternatives for state-of-the-art signal processing applications like video codecs on a multiprocessor platform. However, the creation of such a model starting from sequential code is a time-consuming and error-prone task. It is typically done only once, if at all, for a given design. This lack of exploration of the design space often leads to a suboptimal implementation. To support our systematic C-based design flow, we have developed a tool to generate a concurrent SystemC transaction-level model for user-selected task boundaries. Using this tool, different parallelization alternatives have been evaluated during the design of an MPEG-4 simple profile encoder and an embedded zero-tree coder. Generation plus evaluation of an alternative was possible in less than six minutes. This is fast enough to allow extensive exploration of the design space.

  8. Efficient Mobility Management Signalling in Network Mobility Supported PMIPV6.

    Science.gov (United States)

    Samuelraj, Ananthi Jebaseeli; Jayapal, Sundararajan

    2015-01-01

Proxy Mobile IPV6 (PMIPV6) is a network-based mobility management protocol which supports a node's mobility without the contribution of the mobile node itself. PMIPV6 was initially designed to support individual node mobility, and it should be enhanced to support mobile network movement. NEMO-BSP is an existing protocol to support network mobility (NEMO) in a PMIPV6 network. Due to the underlying differences in the basic protocols, NEMO-BSP cannot be directly applied to a PMIPV6 network. Mobility management signaling and the data structures used for an individual node's mobility should be modified to support group nodes' mobility management efficiently. Though a lot of research work is in progress to implement mobile network movement in PMIPV6, it is not yet standardized and each proposed approach suffers from different shortcomings. This research work proposes modifications to NEMO-BSP and PMIPV6 to achieve NEMO support in PMIPV6. It mainly concentrates on optimizing the number and size of the mobility signaling messages exchanged when a mobile network or mobile network node changes its access point.

  9. Efficient Mobility Management Signalling in Network Mobility Supported PMIPV6

    Directory of Open Access Journals (Sweden)

    Ananthi Jebaseeli Samuelraj

    2015-01-01

Full Text Available Proxy Mobile IPV6 (PMIPV6) is a network-based mobility management protocol which supports a node's mobility without the contribution of the mobile node itself. PMIPV6 was initially designed to support individual node mobility, and it should be enhanced to support mobile network movement. NEMO-BSP is an existing protocol to support network mobility (NEMO) in a PMIPV6 network. Due to the underlying differences in the basic protocols, NEMO-BSP cannot be directly applied to a PMIPV6 network. Mobility management signaling and the data structures used for an individual node's mobility should be modified to support group nodes' mobility management efficiently. Though a lot of research work is in progress to implement mobile network movement in PMIPV6, it is not yet standardized and each proposed approach suffers from different shortcomings. This research work proposes modifications to NEMO-BSP and PMIPV6 to achieve NEMO support in PMIPV6. It mainly concentrates on optimizing the number and size of the mobility signaling messages exchanged when a mobile network or mobile network node changes its access point.

  10. About Using Predictive Models and Tools To Assess Chemicals under TSCA

    Science.gov (United States)

    As part of EPA's effort to promote chemical safety, OPPT provides public access to predictive models and tools which can help inform the public on the hazards and risks of substances and improve chemical management decisions.

  11. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, A.C.; Jauch, C.; Soerensen, P.; Iov, F.; Blaabjerg, F.

    2003-12-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are part of the results of a national research project whose overall objective is to create a model database in different simulation tools. This model database should support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report describes the wind turbine modelling both at component level and at system level. It covers the DIgSILENT built-in models for the electrical components of a grid-connected wind turbine (e.g. induction generators, power converters, transformers) as well as the models developed by the user, in DIgSILENT's dynamic simulation language DSL, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). Initialisation of the wind turbine models within the power system simulation is also presented. The main attention, however, is given to system-level modelling of two wind turbine concepts: 1. an active stall wind turbine with induction generator; 2. a variable speed, variable pitch wind turbine with doubly fed induction generator. These concept models can be used, and even extended, to study different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid, and storage systems. For both concepts, control strategies are developed and implemented, and their performance is assessed and discussed by means of simulations. (au)
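
A minimal sketch of the aerodynamic core of such wind turbine models (this is not the DIgSILENT implementation): rotor power is computed from wind speed through a power coefficient Cp(tip-speed ratio, pitch). The Cp expression below is a widely used empirical approximation (Heier); all numeric constants here are illustrative assumptions.

```python
import math

RHO = 1.225  # air density, kg/m^3

def power_coefficient(tip_speed_ratio, pitch_deg):
    """Empirical Cp(lambda, beta) approximation (Heier-style coefficients)."""
    lam_i = 1.0 / (1.0 / (tip_speed_ratio + 0.08 * pitch_deg)
                   - 0.035 / (pitch_deg ** 3 + 1.0))
    return 0.22 * (116.0 / lam_i - 0.4 * pitch_deg - 5.0) * math.exp(-12.5 / lam_i)

def rotor_power(wind_speed, rotor_radius, rotor_speed, pitch_deg=0.0):
    """Mechanical power captured by the rotor, in watts."""
    lam = rotor_speed * rotor_radius / wind_speed      # tip-speed ratio
    cp = max(0.0, power_coefficient(lam, pitch_deg))   # clip unphysical values
    area = math.pi * rotor_radius ** 2
    return 0.5 * RHO * area * cp * wind_speed ** 3
```

In a full model this aerodynamic block feeds the drive-train and generator models; pitch control acts by lowering Cp at high wind speeds.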

  12. Unusual Father-to-Daughter Transmission of Incontinentia Pigmenti Due to Mosaicism in IP Males.

    Science.gov (United States)

    Fusco, Francesca; Conte, Matilde Immacolata; Diociaiuti, Andrea; Bigoni, Stefania; Branda, Maria Francesca; Ferlini, Alessandra; El Hachem, Maya; Ursini, Matilde Valeria

    2017-09-01

    Incontinentia pigmenti (IP; Online Mendelian Inheritance in Man catalog #308300) is an X-linked dominant ectodermal disorder caused by mutations of the inhibitor of κ light polypeptide gene enhancer in B cells, kinase γ (IKBKG)/nuclear factor κB essential modulator (NEMO) gene. Hemizygous IKBKG/NEMO loss-of-function (LoF) mutations are lethal in males; thus patients are female, and the disease is always transmitted from an IP-affected mother to her daughter. We present 2 families with father-to-daughter transmission of IP and provide for the first time molecular evidence that the combination of somatic and germ-line mosaicism for IKBKG/NEMO LoF mutations in IP males resulted in the transmission of the disease to a female child. We searched for the IKBKG/NEMO mutant allele in blood, urine, skin, and sperm DNA and found that the 2 fathers were somatic and germ-line mosaics for the p.Gln132* mutation or the exon 4-10 deletion of IKBKG/NEMO, respectively. The highest level of IKBKG/NEMO mutant cells was detected in the sperm, which might explain the recurrence of the disease. We therefore recommend careful clinical evaluation of IP male cases and genetic investigation of sperm DNA to ensure correct genetic counseling and prevent the risk of paternal transmission of IP. Copyright © 2017 by the American Academy of Pediatrics.

  13. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and by limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS toolkit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and open source nature of these tools supports this participatory approach.
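
The core of raster travel-time modelling can be sketched in a few lines (a generic illustration, not the tools described in the abstract): each grid cell holds a traversal cost in minutes, and Dijkstra's algorithm accumulates the cheapest time from a set of service locations to every cell. Grid values below are invented.

```python
import heapq

def travel_time_surface(cost_grid, sources):
    """cost_grid[r][c] = minutes to cross a cell; sources = [(row, col), ...]."""
    rows, cols = len(cost_grid), len(cost_grid[0])
    best = [[float("inf")] * cols for _ in range(rows)]
    heap = [(0.0, r, c) for (r, c) in sources]
    for _, r, c in heap:
        best[r][c] = 0.0
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > best[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + cost_grid[nr][nc]
                if nt < best[nr][nc]:
                    best[nr][nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return best

grid = [[1, 1, 5],     # a 3x3 cost surface: cheap road cells and an
        [1, 9, 5],     # expensive swamp cell in the middle
        [1, 1, 1]]
surface = travel_time_surface(grid, [(0, 0)])  # one clinic at the corner
```

Changing the cost grid (a "what-if" scenario such as a flooded road) and re-running gives the interactive scenario exploration the abstract argues for.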

  14. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
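
The simplest age-depth model step that systems like Hobbes automate and argue about can be sketched as follows (the tie points are invented; real pipelines would start from calibrated 14C dates):

```python
def check_reversals(tie_points):
    """Return depths where age decreases with depth (stratigraphic reversals)."""
    pts = sorted(tie_points)
    return [d2 for (d1, a1), (d2, a2) in zip(pts, pts[1:]) if a2 < a1]

def age_at(tie_points, depth):
    """Linearly interpolate age at a given depth between dated tie points."""
    pts = sorted(tie_points)
    for (d1, a1), (d2, a2) in zip(pts, pts[1:]):
        if d1 <= depth <= d2:
            return a1 + (a2 - a1) * (depth - d1) / (d2 - d1)
    raise ValueError("depth outside dated interval")

core = [(0, 50), (100, 2100), (200, 4500)]   # (depth cm, age yr BP), invented
assert check_reversals(core) == []           # monotone: no reversals
assert age_at(core, 50) == 1075.0            # midpoint of the first segment
```

A rule-based engine would run exactly this kind of check (among many others) and use a detected reversal as evidence against a candidate model.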

  15. Assessment of local models and tools for analyzing smart-growth strategies.

    Science.gov (United States)

    2007-07-01

    There is a growing interest in California in smart-growth land-use and transportation: strategies designed to provide mobility options and reduce demand on automobile-oriented facilities. This study focuses on models and tools available for u...

  16. FORMAL MODELLING OF BUSINESS RULES: WHAT KIND OF TOOL TO USE?

    Directory of Open Access Journals (Sweden)

    Sandra Lovrenčić

    2006-12-01

    Full Text Available Business rules are today an essential part of a business system model. But at present there are still various approaches to, and definitions and classifications of, this concept. Similarly, there are different approaches to business rules formalization and implementation. This paper investigates formalization using a formal language in association with easy domain modelling. Two tools that enable such an approach are described and compared according to several factors. They represent ontology modelling and UML, the widely used standard for object-oriented modelling. A simple example is also presented.

  17. MAPIT: A new software tool to assist in the transition from conceptual model to numerical simulation models

    International Nuclear Information System (INIS)

    Canales, T.W.; Grant, C.W.

    1996-01-01

    MapIt is a new software tool developed at Lawrence Livermore National Laboratory to assist ground water remediation professionals in generating numerical simulation models from a variety of physical and chemical data sources and the corresponding 1, 2, and 3 dimensional conceptual models that emerge from analysis of such data.

  18. Tool path in torus tool CNC machining

    Directory of Open Access Journals (Sweden)

    XU Ying

    2016-10-01

    Full Text Available This paper addresses tool paths in torus tool CNC machining. A mathematical model of the torus tool is established. The tool path planning algorithm is determined through calculation of the cutter location, boundary discretization, calculation of adjacent tool paths, and so on; according to the conversion formula, each cutter contact point is converted to a cutter location point, and these points are then fitted to a tool path. The path planning algorithm is implemented in Matlab: the cutter location points for the torus tool are calculated and fitted to a tool path. Using UG software, another tool path over the same free-surface data is simulated. Comparison of the two tool paths shows that the torus tool is more efficient.
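
A minimal sketch of one cutter-contact (CC) to cutter-location (CL) conversion for a torus end mill, under assumed conventions that may differ from the paper's: tool axis along +z, CL taken at the tool-tip centre, R the distance from the axis to the tube centre circle, and r the corner radius.

```python
import math

def cc_to_cl(cc, normal, R, r):
    """Convert a cutter contact point to a cutter location point.

    cc: contact point (x, y, z); normal: outward unit surface normal at cc.
    """
    nx, ny, nz = normal
    # The tool-surface normal at the contact passes through the tube-section
    # centre, which therefore lies at cc + r * normal.
    cx, cy, cz = (cc[0] + r * nx, cc[1] + r * ny, cc[2] + r * nz)
    h = math.hypot(nx, ny)
    if h < 1e-12:
        raise ValueError("normal parallel to tool axis: CL not unique")
    hx, hy = nx / h, ny / h  # horizontal direction from tool axis to tube centre
    # Step back to the axis (distance R) and down to the tip plane (distance r).
    return (cx - R * hx, cy - R * hy, cz - r)

# Example: contact at the origin with a 45-degree normal in the x-z plane.
s = math.sqrt(0.5)
cl = cc_to_cl((0.0, 0.0, 0.0), (s, 0.0, s), R=4.0, r=1.0)
```

Fitting a curve through a sequence of such CL points yields the tool path described in the abstract.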

  19. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared with the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes and both may be needed for providing the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation. They present a set of simple calculations that allow the estimation of the origin of loads. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrient loads from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in guideline 6 and because it

  20. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  1. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  2. Bayesian networks modeling for thermal error of numerical control machine tools

    Institute of Scientific and Technical Information of China (English)

    Xin-hua YAO; Jian-zhong FU; Zi-chen CHEN

    2008-01-01

    The interaction between the heat source location, its intensity, the thermal expansion coefficient, the machine system configuration and the running environment creates complex thermal behavior in a machine tool, and also makes thermal error prediction difficult. To address this issue, a novel prediction method for machine tool thermal error based on Bayesian networks (BNs) was presented. The method described causal relationships among the factors inducing thermal deformation by graph theory and estimated the thermal error by Bayesian statistical techniques. Due to the effective combination of domain knowledge and sampled data, the BN method could adapt to changes in the running state of the machine and obtain satisfactory prediction accuracy. Experiments on spindle thermal deformation were conducted to evaluate the modeling performance. Experimental results indicate that the BN method performs far better than least squares (LS) analysis in terms of modeling estimation accuracy.
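
The idea behind BN-based prediction can be illustrated with a tiny discrete network, Speed -> Temperature -> Error, with inference by summing out the hidden variable. All states and probabilities below are invented for illustration; the paper's network and variables are richer.

```python
# Conditional probability tables for a toy thermal-error network (invented).
P_temp = {  # P(temperature | spindle speed)
    "high": {"hot": 0.9, "cool": 0.1},
    "low":  {"hot": 0.2, "cool": 0.8},
}
P_error = {  # P(thermal error | temperature)
    "hot":  {"large": 0.8, "small": 0.2},
    "cool": {"large": 0.1, "small": 0.9},
}

def p_error_given_speed(error, speed):
    """P(error | speed) by marginalising over the temperature variable."""
    return sum(P_temp[speed][t] * P_error[t][error] for t in ("hot", "cool"))

p = p_error_given_speed("large", "high")   # 0.9*0.8 + 0.1*0.1 = 0.73
```

The practical appeal noted in the abstract is that such tables can mix expert knowledge with sampled data and be updated as the machine's running state changes.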

  3. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
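
The shape of such a hierarchical model can be sketched generically: qualitative leaf grades are mapped to numbers (a QQ-style step) and aggregated bottom-up with weights. The criteria, weights and grades below are invented and do not reproduce the paper's DEX model.

```python
GRADE = {"poor": 1, "acceptable": 2, "good": 3, "excellent": 4}

HIERARCHY = {  # criterion -> (weight, sub-criteria dict, or None for a leaf)
    "visual":     (0.3, None),
    "simulation": (0.5, {"animation": (0.4, None), "statistics": (0.6, None)}),
    "reporting":  (0.2, None),
}

def score(node, grades):
    """Weighted bottom-up aggregation of qualitative grades."""
    total = 0.0
    for name, (weight, children) in node.items():
        value = GRADE[grades[name]] if children is None else score(children, grades)
        total += weight * value
    return total

tool_a = {"visual": "good", "animation": "excellent", "statistics": "good", "reporting": "poor"}
tool_b = {"visual": "poor", "animation": "poor", "statistics": "acceptable", "reporting": "good"}
ranking = sorted(["A", "B"], key=lambda t: -score(HIERARCHY, {"A": tool_a, "B": tool_b}[t]))
```

The extendibility claimed in the abstract corresponds here to adding a key anywhere in `HIERARCHY` without touching the aggregation code.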

  4. Triad Issue Paper: Using Geophysical Tools to Develop the Conceptual Site Model

    Science.gov (United States)

    This technology bulletin explains how hazardous-waste site professionals can use geophysical tools to provide information about subsurface conditions to create a more representative conceptual site model (CSM).

  5. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for the runtime model generation; (2) support for the runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as a software as a service is demonstrated.

  6. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.
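
The kind of model such toolboxes set up and run can be illustrated with a self-contained training loop: a single-layer network trained with the delta rule on the Boolean AND task. This pure-Python sketch shows only the learning mechanics and does not reflect OXlearn's actual interface.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Input patterns and targets for Boolean AND.
patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
b = random.uniform(-0.5, 0.5)

for _ in range(4000):                              # online gradient descent epochs
    for (x1, x2), target in patterns:
        out = sigmoid(w[0] * x1 + w[1] * x2 + b)
        delta = (target - out) * out * (1 - out)   # delta-rule error term
        w[0] += 0.5 * delta * x1
        w[1] += 0.5 * delta * x2
        b += 0.5 * delta

outputs = {x: sigmoid(w[0] * x[0] + w[1] * x[1] + b) for x, _ in patterns}
```

After training, the network's output exceeds 0.5 only for the (1, 1) pattern; a GUI toolbox adds exactly this kind of loop behind its "run" button, plus inspection and plotting.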

  7. Wave and Wind Model Performance Metrics Tools

    Science.gov (United States)

    Choi, J. K.; Wang, D. W.

    2016-02-01

    Continual improvements and upgrades of Navy ocean wave and wind models are essential to the assurance of battlespace environment predictability of ocean surface wave and surf conditions in support of Naval global operations. Thus, constant verification and validation of model performance is equally essential to assure the progress of model developments and maintain confidence in the predictions. Global and regional scale model evaluations may require large areas and long periods of time. As observational data to compare against, altimeter winds and waves along the tracks of past and current operational satellites, as well as moored/drifting buoys, can be used for global and regional coverage. Using data and model runs from previous trials such as the planned experiment, the Dynamics of the Adriatic in Real Time (DART), we demonstrated the use of altimeter wind and wave data accumulated over several years to obtain an objective evaluation of the performance of the SWAN (Simulating Waves Nearshore) model running in the Adriatic Sea. The assessment provided detailed performance of the wind and wave models using maps of cell-averaged statistical variables, with spatial statistics including slope, correlation, and scatter index to summarize model performance. Such a methodology is easily generalized to other regions and to global scales. Operational technology currently used by subject matter experts evaluating the Navy Coastal Ocean Model and the Hybrid Coordinate Ocean Model can be expanded to evaluate wave and wind models using tools developed for ArcMAP, a GIS application developed by ESRI. Recent inclusion of altimeter and buoy data into a common format through the Naval Oceanographic Office's (NAVOCEANO) quality control system, and the netCDF standards applicable to all model output, make it possible to fuse these data and verify models directly. Procedures were also developed for the accumulation of match-ups of modelled and observed parameters to form a database.
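
The cell-wise statistics named in the abstract (bias, correlation, scatter index, regression slope) are standard and can be sketched directly; the sample match-ups below are invented numbers, not DART data.

```python
import math

def skill(model, obs):
    """Basic model-vs-observation skill statistics for one grid cell."""
    n = len(model)
    mm, mo = sum(model) / n, sum(obs) / n
    bias = mm - mo
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs)) / n
    sm = math.sqrt(sum((m - mm) ** 2 for m in model) / n)
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    corr = cov / (sm * so)
    scatter_index = rmse / mo          # RMSE normalised by the observed mean
    slope = cov / so ** 2              # least-squares slope of model on obs
    return {"bias": bias, "rmse": rmse, "corr": corr,
            "si": scatter_index, "slope": slope}

stats = skill(model=[1.1, 2.0, 2.9, 4.2], obs=[1.0, 2.0, 3.0, 4.0])
```

Computing these per grid cell over accumulated match-ups, then mapping them, yields the cell-averaged statistics maps described above.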

  8. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    OpenAIRE

    Ritou , Mathieu; Garnier , Sébastien; Furet , Benoît; Hascoët , Jean-Yves

    2014-01-01

    The paper presents a new complete approach for Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damages so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability concerns. The tool condition is determined by estimates of the radial eccentricity of the teeth. An adequate criterion is proposed combining mechanical model of milling and angular approach. Then, a new solution i...

  9. A new tool for accelerator system modeling and analysis

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternative designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated.

  10. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  11. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
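
A one-dimensional toy version of relative-entropy parameter fitting: choose the parameter of a coarse-grained Gaussian model that minimises the KL divergence to a reference "atomistic" Gaussian. The analytic KL formula for Gaussians is standard; the reference values and the grid scan (standing in for a real optimiser) are illustrative assumptions.

```python
import math

def kl_gaussian(mu1, s1, mu2, s2):
    """KL( N(mu1, s1^2) || N(mu2, s2^2) ), closed form."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

REF_MU, REF_SIGMA = 1.3, 0.7   # "atomistic" reference distribution (invented)
CG_SIGMA = 1.0                 # fixed width of the coarse-grained model

# Scan the coarse model's mean and keep the KL-minimising value.
candidates = [i / 100.0 for i in range(-300, 301)]
best_mu = min(candidates, key=lambda mu: kl_gaussian(REF_MU, REF_SIGMA, mu, CG_SIGMA))
```

As expected, the optimal coarse mean coincides with the reference mean; in the non-equilibrium setting the paper studies, the analogous quantity is the relative entropy rate between path distributions rather than a static KL.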

  12. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  13. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  14. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report. Version 1.0

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie; Labbe, Steve G.; Rotter, Hank A.

    2005-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments, and real-time on-orbit assessments. The tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  15. Nemo haunted by a plagiarism accusation [Nemot kummitab plagiaadisüüdistus]

    Index Scriptorium Estoniae

    2003-01-01

    Animated film "Finding Nemo" ("Kalapoeg Nemo"): director Andrew Stanton: United States, 2003. French children's author Franck Le Calvez is suing the film's makers and distributors for copyright infringement.

  16. The Capital Asset Pricing Model: An Evaluation of its Potential as a Strategic Planning Tool

    OpenAIRE

    Thomas H. Naylor; Francis Tapon

    1982-01-01

    In this paper we provide a summary of the capital asset pricing model (CAPM) and point out how it might possibly be used as a tool for strategic planning by corporations that own a portfolio of businesses. We also point out some of the assumptions underlying the CAPM which must be satisfied if it is to be used for strategic planning. Next we include a critical appraisal of the CAPM as a strategic planning tool. Finally, we state the case for linking competitive strategy models, CAPM models, a...
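
The CAPM relation the paper evaluates is simple enough to state directly: the expected return of a business unit is the risk-free rate plus its beta times the market risk premium. The numbers below are invented for illustration.

```python
def capm_expected_return(risk_free, beta, market_return):
    """E[r] = r_f + beta * (E[r_m] - r_f)."""
    return risk_free + beta * (market_return - risk_free)

# A portfolio of businesses: a CAPM hurdle rate per unit from its beta
# (unit names and betas are hypothetical).
units = {"utility": 0.6, "retail": 1.0, "startup": 1.8}
hurdles = {name: capm_expected_return(0.04, b, 0.10) for name, b in units.items()}
```

Used as a planning tool, this gives each business in the portfolio a risk-adjusted required return against which its strategy can be judged, subject to the assumptions the paper criticises.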

  17. System dynamics models as decision-making tools in agritourism

    Directory of Open Access Journals (Sweden)

    Jere Jakulin Tadeja

    2016-12-01

    Full Text Available Agritourism, as a type of niche tourism, is a complex and softly defined phenomenon. The demand for fast, integrated decisions regarding agritourism and its interconnections with the environment, the economy (investments, traffic) and social factors (tourists) is urgent. Many different methodologies and methods address softly structured questions and dilemmas with global and local properties. Here we present methods of systems thinking and system dynamics, which were first brought into use in education and training in the form of computer simulations, and later as tools for decision-making and organisational re-engineering. We develop system dynamics models in order to demonstrate the methodology. These models are essentially simple and serve only to describe the basic mutual influences among variables. We pay particular attention to the methodology for determining model parameter values and to the so-called mental model, which is the basis of the causal connections among model variables. At the end, we establish a connection between qualitative and quantitative models in the frame of system dynamics.
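
A minimal stock-and-flow sketch in the spirit of such system dynamics models (not a model from the paper): one stock of yearly visitors, fed by word-of-mouth arrivals that saturate near capacity and drained by loss of interest, integrated with Euler steps. All rates are invented.

```python
def simulate_visitors(initial, capacity, growth=0.3, churn=0.1, years=30):
    """One stock (visitors), two flows (arrivals, departures); dt = 1 year."""
    visitors = float(initial)
    history = [visitors]
    for _ in range(years):
        arrivals = growth * visitors * (1 - visitors / capacity)  # saturating inflow
        departures = churn * visitors                             # proportional outflow
        visitors += arrivals - departures
        history.append(visitors)
    return history

traj = simulate_visitors(initial=100, capacity=10000, years=60)
```

Under these assumed rates the stock approaches an equilibrium below capacity, at the level where arrivals balance departures; changing a rate and re-running is the "what-if" use such models support.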

  18. Danish heat atlas as a support tool for energy system models

    International Nuclear Information System (INIS)

    Petrovic, Stefan N.; Karlsson, Kenneth B.

    2014-01-01

    Highlights: • The GIS method for calculating costs of district heating expansion is presented. • High socio-economic potential for district heating is identified within urban areas. • The method for coupling a heat atlas and TIMES optimization model is proposed. • Presented methods can be used for any geographical region worldwide. - Abstract: In the four decades since the global oil crisis of 1973, Denmark has implemented remarkable changes in its energy sector, mainly through energy conservation measures on the demand side and energy efficiency improvements on the supply side. Nowadays, capital-intensive infrastructure investments, such as the expansion of district heating networks and the introduction of significant heat saving measures, require a highly detailed decision-support tool. A Danish heat atlas provides such a highly detailed database, with extensive information about more than 2.5 million buildings in Denmark. Energy system analysis tools incorporate environmental, economic, energy and engineering analysis of future energy systems and are considered crucial for the quantitative assessment of transitional scenarios towards future milestones, such as the EU 2020 goals and Denmark’s goal of achieving a fossil-free society after 2050. The present paper shows how a Danish heat atlas can be used to provide inputs to energy system models, especially for the analysis of heat saving measures within the building stock and the expansion of district heating networks. As a result, marginal cost curves are created, approximated and prepared for use in an optimization energy system model. Moreover, it is concluded that a heat atlas can serve as a tool for data storage and visualisation of results.
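    The marginal cost curves mentioned above can be illustrated with a minimal sketch: given per-building records of annual heat demand and connection cost, rank by unit cost and accumulate supply. The record layout and units are assumptions for illustration, not the atlas's actual schema.

    ```python
    # Hypothetical records: (annual heat demand in MWh, connection cost in EUR).
    def marginal_cost_curve(buildings):
        ranked = sorted(buildings, key=lambda b: b[1] / b[0])  # cheapest EUR/MWh first
        curve, cumulative = [], 0.0
        for demand, cost in ranked:
            cumulative += demand
            curve.append((cumulative, cost / demand))  # (supplied MWh, marginal EUR/MWh)
        return curve
    ```

    An optimization model can then read the curve as a step function: each point gives the marginal cost of connecting the next tranche of heat demand.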

  19. Design and implementation of the infrastructure of HadGEM3: the next-generation Met Office climate modelling system

    Directory of Open Access Journals (Sweden)

    H. T. Hewitt

    2011-04-01

    Full Text Available This paper describes the development of a technically robust climate modelling system, HadGEM3, which couples the Met Office Unified Model atmosphere component, the NEMO ocean model and the Los Alamos sea ice model (CICE) using the OASIS coupler. Details of the coupling and technical solutions of the physical model (HadGEM3-AO) are documented, in addition to a description of the configurations of the individual submodels. The paper demonstrates that the implementation of the model has resulted in accurate conservation of heat and freshwater across the model components. The model performance in early versions of this climate model is briefly described to demonstrate that the results are scientifically credible. HadGEM3-AO is the basis for a number of modelling efforts outside of the Met Office, both within the UK and internationally. This documentation of the HadGEM3-AO system provides a detailed reference for developers of HadGEM3-based climate configurations.

  20. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
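    The chance-constraint construction described can be sketched under stated assumptions: FOSM propagates a parameter covariance through a sensitivity (Jacobian) row to obtain a constraint standard deviation, and the constraint bound is shifted by the normal quantile of a user-specified risk level. This is a generic illustration, not PESTPP-OPT code; all names and numbers are hypothetical.

    ```python
    import numpy as np
    from statistics import NormalDist

    def chance_constrained_bound(h_min, jacobian_row, param_cov, risk=0.95):
        """Tightened lower bound replacing h_min in the SLP constraint set.

        FOSM first-order propagation: var(h) ~= J C J^T, for sensitivity
        row J and parameter covariance C (both hypothetical here).
        """
        j = np.asarray(jacobian_row, dtype=float)
        sigma = float(np.sqrt(j @ np.asarray(param_cov, dtype=float) @ j))
        z = NormalDist().inv_cdf(risk)  # normal quantile for the risk level
        return h_min + z * sigma
    ```

    At risk = 0.5 the margin vanishes and the problem reduces to the deterministic LP; raising the risk value buys reliability at the cost of a more conservative optimum.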

  1. Numerical modelling of the buoyant marine microplastics in the South-Eastern Baltic Sea

    Science.gov (United States)

    Bagaev, Andrei; Mizyuk, Artem; Chubarenko, Irina; Khatmullilna, Liliya

    2017-04-01

    Microplastics is a burning issue in marine pollution science. Its sources, propagation pathways and ultimate fate pose many questions to modern oceanographers. Hence, a numerical model is an optimal tool for reconstructing microplastics pathways and fate. Within the MARBLE project (lamp.ocean.ru), a model of Lagrangian particle transport was developed. It was tested coupled with oceanographic transport fields from the operational oceanography product of the Copernicus Marine Environment Monitoring Service. Our model deals with two major types of microplastics: microfibres and buoyant spheroidal particles. We are currently working to increase the grid resolution by means of the NEMO regional configuration for the south-eastern Baltic Sea. Several expeditions were organised to three regions of the Baltic Sea (the Gotland, Bornholm, and Gdansk basins). Water samples from the surface and different water layers were collected, processed, and analysed by our team. A set of laboratory experiments was specifically designed to establish the settling velocity of particles of various shapes and densities. This analysis provided the understanding necessary for the model to reproduce the large-scale dynamics of microfibres. In the simulation, particles spread from the shore to the deep sea, slowly sinking to the bottom while decreasing in quantity due to conditional sedimentation. Our model is expected to map out the microplastics life cycle and to account for its distribution patterns under the impact of wind and currents. For this purpose, we have already included a parameterization of the wind drag force applied to a particle. Initial results of numerical experiments indicate the importance of a proper implicit parameterization of particle dynamics at the vertical solid boundary. Our suggested solutions to that problem will be presented at EGU-2017.
The MARBLE project is supported by Russian Science
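    The Lagrangian transport idea can be illustrated with a minimal sketch (not the MARBLE code): particles are advected by a current field, sink with a settling velocity, and are reflected at the sea surface. Depth is measured positive downward; the velocity field and parameters are hypothetical.

    ```python
    # particles: list of [x, y, z] with z = depth (m, positive downward);
    # current(x, y, z) returns horizontal velocity (u, v); w_settle > 0 sinks.
    def step_particles(particles, current, w_settle, dt):
        for p in particles:
            u, v = current(p[0], p[1], p[2])
            p[0] += u * dt
            p[1] += v * dt
            p[2] += w_settle * dt
            if p[2] < 0.0:       # crossed the sea surface: reflect back down
                p[2] = -p[2]
        return particles
    ```

    A buoyant particle simply carries a negative settling velocity; a laboratory-derived settling velocity per shape and density class, as measured in the project, would be supplied as `w_settle`.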

  2. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teachers' pedagogical style is proposed by the authors as a means of advancing its predictive power for the level of classroom technology integration beyond 90%. Suggested advantages of this expansion include more precise identification of areas to be targeted for…

  3. RPAP3 enhances cytotoxicity of doxorubicin by impairing NF-kappa B pathway

    International Nuclear Information System (INIS)

    Shimada, Kana; Saeki, Makio; Egusa, Hiroshi; Fukuyasu, Sho; Yura, Yoshiaki; Iwai, Kazuhiro; Kamisaki, Yoshinori

    2011-01-01

    Research highlights: → RNA polymerase II-associated protein 3 (RPAP3) binds NEMO and inhibits the ubiquitination of NEMO. → RPAP3 enhances doxorubicin-induced cell death in the breast cancer cell line T-47D through marked impairment of the NF-κB pathway. → RPAP3 is a novel modulator of the NF-κB pathway in apoptosis induced by anti-cancer chemotherapeutic agents. -- Abstract: Activation of anti-apoptotic gene transcription by NF-κB (nuclear factor-kappa B) has been linked with the resistance of cancer cells to chemotherapy. NEMO (NF-κB essential modulator) interacts with a number of proteins and modulates the activity of the NF-κB pathway. In this study, we revealed that RPAP3 (RNA polymerase II-associated protein 3) binds NEMO and inhibits its ubiquitination, and that RPAP3 enhances doxorubicin-induced cell death in the breast cancer cell line T-47D through marked impairment of the NF-κB pathway. These results indicate that RPAP3 may be a novel modulator of the NF-κB pathway in apoptosis induced by anti-cancer chemotherapeutic agents.

  4. Optimal Vehicle Design Using the Integrated System and Cost Modeling Tool Suite

    Science.gov (United States)

    2010-08-01

    The abstract of this record survives only as briefing-slide residue; the legible fragments list the tools and modules of the integrated suite: CEA, an SRM model, POST, ACEIT (space vehicle costing), an inflation model, rotor blade design, Microsoft Project, ATSV, STK and SOAP (mission-specific orbit propagation), SMAD-based space vehicle design and propulsion, a new small-sat model development and production cost module, an O&M cost module, and radiation exposure, radiation detector response, reliability, availability and risk analyses.

  5. Implementation of structure-mapping inference by event-file binding and action planning: a model of tool-improvisation analogies.

    Science.gov (United States)

    Fields, Chris

    2011-03-01

    Structure-mapping inferences are generally regarded as dependent upon relational concepts that are understood and expressible in language by subjects capable of analogical reasoning. However, tool-improvisation inferences are executed by members of a variety of non-human primate and other species. Tool improvisation requires correctly inferring the motion and force-transfer affordances of an object; hence tool improvisation requires structure mapping driven by relational properties. Observational and experimental evidence can be interpreted to indicate that structure-mapping analogies in tool improvisation are implemented by multi-step manipulation of event files by binding and action-planning mechanisms that act in a language-independent manner. A functional model of language-independent event-file manipulations that implement structure mapping in the tool-improvisation domain is developed. This model provides a mechanism by which motion and force representations commonly employed in tool-improvisation structure mappings may be sufficiently reinforced to be available to inwardly directed attention and hence conceptualization. Predictions and potential experimental tests of this model are outlined.

  6. Development of a surrogate model for elemental analysis using a natural gamma ray spectroscopy tool

    International Nuclear Information System (INIS)

    Zhang, Qiong

    2015-01-01

    A systematic computational method for obtaining accurate elemental standards efficiently for varying borehole conditions was developed based on Monte Carlo simulations, surrogate modeling, and data assimilation. Elemental standards are essential for spectral unfolding in the formation evaluation applications commonly used by nuclear well logging tools. Typically, elemental standards are obtained by standardized measurements, but these experiments are expensive and lack the flexibility to address different logging conditions. In contrast, computer-based Monte Carlo simulations provide an accurate and more flexible approach to obtaining elemental standards for formation evaluation. The presented computational method recognizes that, in contrast to typical neutron–photon simulations, where the source is typically artificial and well characterized (Galford, 2009), an accurate knowledge of the source is essential for matching the obtained Monte Carlo elemental standards with their experimental counterparts. Therefore, source distributions are adjusted to minimize the L2 difference between the Monte Carlo computed and experimental standards. Subsequently, an accurate surrogate model is developed accounting for different casing and cement thicknesses, and tool positions within the borehole. The adjusted source distributions are then utilized to generate and validate spectra for varying borehole conditions: tool position, casing and cement thickness. The effect of these conditions on the spectra is investigated and discussed in this work. Given that Monte Carlo modeling provides much lower cost and more flexibility, employing Monte Carlo computed standards could enhance the processing of nuclear logging tool data. - Highlights: • A novel computational model for efficiently computing elemental standards for varying borehole conditions has been developed. • A model of an experimental test pit was implemented in the Monte Carlo code GEANT4 for computing elemental standards.
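    The source-adjustment step described, minimizing the L2 difference between computed and experimental standards, can be sketched as an ordinary least-squares fit of source-component weights. The spectra and array layout here are assumptions for illustration, not the tool's actual data.

    ```python
    import numpy as np

    def adjust_source_weights(mc_spectra, experimental):
        """Least-squares source weights minimizing the L2 misfit.

        mc_spectra: (n_channels, n_components) simulated component spectra;
        experimental: (n_channels,) measured standard. Returns the weight
        vector minimizing ||mc_spectra @ w - experimental||_2 and the
        residual norm.
        """
        A = np.asarray(mc_spectra, dtype=float)
        y = np.asarray(experimental, dtype=float)
        weights, *_ = np.linalg.lstsq(A, y, rcond=None)
        misfit = float(np.linalg.norm(y - A @ weights))
        return weights, misfit
    ```

    In practice the adjustment would likely also constrain the weights to be non-negative (e.g. via a bounded solver), since source intensities cannot be negative; plain least squares keeps the sketch minimal.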

  7. Surface modeling of workpiece and tool trajectory planning for spray painting robot.

    Directory of Open Access Journals (Sweden)

    Yang Tang

    Full Text Available Automated tool trajectory planning for spray-painting robots is still a challenging problem, especially for a large free-form surface. A grid approximation of a free-form surface is adopted in CAD modeling in this paper. A free-form surface model is approximated by a set of flat patches. We describe here an efficient and flexible tool trajectory optimization scheme using T-Bézier curves calculated in a new way from trigonometrical bases. The distance between the spray gun and the free-form surface along the normal vector is varied. Automotive body parts, which are large free-form surfaces, are used to test the scheme. The experimental results show that the trajectory planning algorithm achieves satisfactory performance. This algorithm can also be extended to other applications.
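    A curve evaluation of the general kind described can be sketched as follows. The paper's exact T-Bézier basis is not reproduced here; this illustrative version blends three control points with Bernstein-style weights in the trigonometric parameter s = sin(πt/2), which preserves the endpoint-interpolation property of Bézier curves.

    ```python
    import math

    # p0, p1, p2: control points (tuples of equal length); t in [0, 1].
    # Illustrative trigonometric-basis blend, not the paper's exact basis.
    def t_bezier_point(p0, p1, p2, t):
        s = math.sin(math.pi * t / 2.0)                    # trigonometric parameter
        b0, b1, b2 = (1 - s) ** 2, 2 * s * (1 - s), s * s  # partition of unity
        return tuple(b0 * a + b1 * b + b2 * c for a, b, c in zip(p0, p1, p2))
    ```

    Sampling t densely along each flat patch, then offsetting the gun position along the surface normal by the varying stand-off distance, would follow the same pattern; only the basis above is an assumption.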

  8. Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel

    Science.gov (United States)

    McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.

    1999-01-01

    The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.

  9. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    Science.gov (United States)

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool (CLIMEX Match Climates), the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated against the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity was compared using a generalized linear mixed model. The modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance in prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones when predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and the testing of both new and experienced users under blind conditions that approximate operational conditions.

  10. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine the extent to which the process area can be damaged. At the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists, so insurers reach different results by applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software tools, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to determine which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  11. Latest Community Coordinated Modeling Center (CCMC) services and innovative tools supporting the space weather research and operational communities.

    Science.gov (United States)

    Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.

    2017-12-01

    The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has served as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based points of access, ranging from the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps that let the scientific community view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including those that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).

  12. Analysis method for the search for neutrinoless double beta decay in the NEMO3 experiment: study of the background and first results

    International Nuclear Information System (INIS)

    Etienvre, A.I.

    2003-04-01

    The NEMO3 detector, installed in the Frejus Underground Laboratory, is dedicated to the study of neutrinoless double beta decay: observation of this process would establish the massive, Majorana nature of the neutrino. The experiment consists of very thin central source foils (with a total mass of 10 kg), a tracking detector made of drift cells operating in Geiger mode, a calorimeter made of plastic scintillators coupled to photomultipliers, a coil producing a 30 gauss magnetic field, and two shields dedicated to the reduction of the γ-ray and neutron fluxes. In the first part, I describe the implications for double beta decay of several mechanisms related to trilinear R-parity violation. The second part is dedicated to a detailed study of the tracking detector of the experiment: after a description of the different working tests, I present the determination of the characteristics of track reconstruction (transverse and longitudinal resolution per Geiger cell, precision of vertex determination, and charge recognition). The last part corresponds to the analysis of the data taken by the experiment. On the one hand, an upper limit on the Tl-208 activity of the sources has been determined: it is lower than 68 mBq/kg at the 90% confidence level. On the other hand, I have developed and tested on these data a method to analyse the neutrinoless double beta decay signal; this method is based on a maximum likelihood using all the available information. Using this method, I determined a first, very preliminary upper limit on the effective mass of the neutrino. (author)

  13. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    Science.gov (United States)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use and as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  14. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases the integration of software tools for life science research and provides a common workflow-based framework for computational experiments in biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.

  15. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4-m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, it revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
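    The deterministic physics checks described can be illustrated with a unit test in the style of Python's unittest framework. The integration-time formula below is a textbook shot-noise expression, t = SNR²(Cp + Cb)/Cp², used here as a stand-in; it is not claimed to be EXOSIMS's exact model, and all count rates are hypothetical.

    ```python
    import unittest

    def integration_time(snr, c_planet, c_background):
        """Seconds to reach the target SNR under shot noise (assumed formula)."""
        return snr ** 2 * (c_planet + c_background) / c_planet ** 2

    class TestIntegrationTime(unittest.TestCase):
        def test_known_value(self):
            # Cp = 10 ph/s, Cb = 90 ph/s, SNR = 5  ->  t = 25 * 100 / 100 = 25 s
            self.assertAlmostEqual(integration_time(5.0, 10.0, 90.0), 25.0)

        def test_snr_recovered(self):
            # Inverting the formula must reproduce the requested SNR.
            t = integration_time(7.0, 4.0, 12.0)
            self.assertAlmostEqual(4.0 * t / (t * (4.0 + 12.0)) ** 0.5, 7.0)

    if __name__ == "__main__":
        unittest.main()
    ```

    Fixing the planets at quadrature, as the abstract describes, is what makes such closed-form checks deterministic: every input rate is known, so the expected output can be asserted exactly.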

  16. MAFALDA: An early warning modeling tool to forecast volcanic ash dispersal and deposition

    Science.gov (United States)

    Barsotti, S.; Nannipieri, L.; Neri, A.

    2008-12-01

    Forecasting the dispersal of ash from explosive volcanoes is a scientific challenge to modern volcanology. It also represents a fundamental step in mitigating the potential impact of volcanic ash on urban areas and transport routes near explosive volcanoes. To this end we developed a Web-based early warning modeling tool named MAFALDA (Modeling and Forecasting Ash Loading and Dispersal in the Atmosphere) able to quantitatively forecast ash concentrations in the air and on the ground. The main features of MAFALDA are the usage of (1) a dispersal model, named VOL-CALPUFF, that couples the column ascent phase with the ash cloud transport and (2) high-resolution weather forecasting data, the capability to run and merge multiple scenarios, and the Web-based structure of the procedure that makes it suitable as an early warning tool. MAFALDA produces plots for a detailed analysis of ash cloud dynamics and ground deposition, as well as synthetic 2-D maps of areas potentially affected by dangerous concentrations of ash. A first application of MAFALDA to the long-lasting weak plumes produced at Mt. Etna (Italy) is presented. A similar tool can be useful to civil protection authorities and volcanic observatories in reducing the impact of the eruptive events. MAFALDA can be accessed at http://mafalda.pi.ingv.it.

  17. Models, methods and software tools for building complex adaptive traffic systems

    International Nuclear Information System (INIS)

    Alyushin, S.A.

    2011-01-01

    The paper studies modern methods and tools for simulating the behavior of complex adaptive systems (CAS) and reviews existing traffic-modeling systems in simulators and their characteristics; it proposes requirements for assessing the suitability of a system to simulate CAS behavior in simulators. The author has developed a model of adaptive-agent representation and its operating environment to meet the requirements set out above, and presents methods for agent interaction and for conflict resolution in simulated traffic situations. A simulation system realizing computer modeling of CAS behavior in traffic situations has been created.

  18. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    Science.gov (United States)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    The agricultural sector is the most common suspected source of nutrient pollution in Irish rivers. However, it is also often the most difficult source to characterise due to its predominantly diffuse nature. Particulate phosphorus in surface water and dissolved phosphorus in groundwater are of particular concern in Irish water bodies. Hence the further development of models and indices to assess diffuse sources of contaminants is required for use by the Irish Environmental Protection Agency (EPA) to support river basin planning. Understanding connectivity in the landscape is a vital component of characterising the source-pathway-receptor relationships for water-borne contaminants, and hence is a priority in this research. The DIFFUSE Project will focus on connectivity modelling and the incorporation of connectivity into sediment, nutrient and pesticide risk mapping. The Irish approach to understanding and managing natural water bodies has developed substantially in recent years, assisted by outputs from multiple research projects, including modelling and analysis tools developed during the Pathways and CatchmentTools projects. These include the Pollution Impact Potential (PIP) maps, which are an example of research output that is used by the EPA to support catchment management. The PIP maps integrate an understanding of the pollution pressures and mobilisation pathways and, using the source-pathways-receptor model, provide a scientific basis for the evaluation of mitigation measures. These maps indicate the potential risk posed by nitrate and phosphate from diffuse agricultural sources to surface and groundwater receptors and delineate critical source areas (CSAs) as a means of facilitating the targeting of mitigation measures. Building on this previous research, the DIFFUSE Project will develop revised and new catchment management tools focused on connectivity, sediment, phosphorus and pesticides. The DIFFUSE project will strive to identify the state

  19. NREL Multiphysics Modeling Tools and ISC Device for Designing Safer Li-Ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad A.; Yang, Chuanbo

    2016-03-24

    The National Renewable Energy Laboratory has developed a portfolio of multiphysics modeling tools to help battery designers better understand the response of lithium-ion batteries to abusive conditions. We will discuss this portfolio, which includes coupled electrical, thermal, chemical, electrochemical, and mechanical modeling. These models can simulate the response of a cell to overheating, overcharge, mechanical deformation, nail penetration, and internal short circuit. Cell-to-cell thermal propagation modeling will also be discussed.

  20. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    Science.gov (United States)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. 
maps and animations), as well
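The asynchronous write path described for PatchDB can be sketched with a plain in-process queue. This is an illustrative Python sketch, not PatchDB code: `PatchWriter` is a hypothetical name, and the in-memory `store` dict stands in for an Apache Cassandra table that a real deployment would write to via a driver.

```python
# Illustrative sketch of message-queue-decoupled writes: the model thread
# enqueues output records and returns immediately; a consumer thread drains
# the queue and persists each record (here, into a plain dict).
import queue
import threading

class PatchWriter:
    def __init__(self):
        self.q = queue.Queue()
        self.store = {}           # stand-in for a Cassandra table
        self.worker = threading.Thread(target=self._drain, daemon=True)
        self.worker.start()

    def write(self, patch_id, timestep, variables):
        # Model side: non-blocking enqueue, so simulation I/O never stalls.
        self.q.put((patch_id, timestep, variables))

    def _drain(self):
        # Consumer side: persist records as they arrive.
        while True:
            patch_id, timestep, variables = self.q.get()
            self.store[(patch_id, timestep)] = variables
            self.q.task_done()

writer = PatchWriter()
writer.write(patch_id=42, timestep=1, variables={"lai": 3.2, "et": 0.7})
writer.q.join()   # block until the queue has been drained
print(writer.store[(42, 1)]["lai"])   # -> 3.2
```

The design point is the same as in the abstract: the producer (the model) is insulated from datastore latency, and the consumer side can be scaled out horizontally.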

  1. GraphCrunch 2: Software tool for network modeling, alignment and clustering.

    Science.gov (United States)

    Kuchaiev, Oleksii; Stevanović, Aleksandar; Hayes, Wayne; Pržulj, Nataša

    2011-01-19

    Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other existing tool. Finally, GraphCrunch 2 implements an
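Two of the "easily computable network properties" the abstract mentions, the degree distribution and the clustering coefficient, can be sketched directly. This is a toy illustration on an adjacency-set graph, not GraphCrunch code; the function names are hypothetical.

```python
# Degree distribution and local clustering coefficient for an undirected
# graph stored as {node: set(neighbours)}.
from collections import Counter

def degree_distribution(adj):
    # Map degree k -> number of nodes with that degree.
    return Counter(len(nbrs) for nbrs in adj.values())

def clustering_coefficient(adj, v):
    # Fraction of neighbour pairs of v that are themselves connected.
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
    return 2.0 * links / (k * (k - 1))

# Triangle plus one pendant node: a--b, b--c, a--c, c--d
adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
print(degree_distribution(adj))          # degrees are 2, 2, 3, 1
print(clustering_coefficient(adj, "a"))  # -> 1.0 (both neighbours linked)
```

Comparing such property vectors between a random-model network and a data network is exactly the heuristic comparison the abstract describes.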

  2. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Full Text Available Abstract Background Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other

  3. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  4. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlative to national competitiveness, the issue of port performance evaluation has been raised significantly. Port performance can simply be indicated by the level of service to ships (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for improving the port’s performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on queuing theory.
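The queuing logic at the heart of such a microsimulation can be sketched in a few lines. This is a minimal single-berth sketch in Python rather than MATLAB/Simulink; the fixed interarrival and service times stand in for random draws from distributions fitted to actual port data.

```python
# Single-berth queue: each ship arrives, waits if the berth is occupied,
# then is served; we record the waiting time of every ship.
def simulate_berth(interarrivals, service_times):
    t_arrive = 0.0
    berth_free_at = 0.0
    waits = []
    for gap, service in zip(interarrivals, service_times):
        t_arrive += gap
        start = max(t_arrive, berth_free_at)   # wait if berth is busy
        waits.append(start - t_arrive)
        berth_free_at = start + service
    return waits

waits = simulate_berth(interarrivals=[1.0, 0.5, 2.0],
                       service_times=[2.0, 1.0, 1.0])
print(waits)   # -> [0.0, 1.5, 0.5]
```

Replacing the fixed lists with stochastic draws (e.g. exponential interarrivals) and running many replications yields the distribution of waiting times that a deterministic evaluation cannot capture.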

  5. Finite Element Modelling of the effect of tool rake angle on tool temperature and cutting force during high speed machining of AISI 4340 steel

    International Nuclear Information System (INIS)

    Sulaiman, S; Roshan, A; Ariffin, M K A

    2013-01-01

    In this paper, a Finite Element Method (FEM) based on the ABAQUS explicit software, which involves the Johnson-Cook material model, was used to simulate cutting force and tool temperature during high speed machining (HSM) of AISI 4340 steel. In this simulation work, tool rake angles ranging from 0° to 20° and cutting speeds between 300 and 550 m/min were investigated. The purpose of this simulation analysis was to find the optimum tool rake angle, at which the cutting force is smallest and the tool temperature is lowest during high speed machining. It was found that cutting forces have a decreasing trend as the rake angle increases in the positive direction. The optimum rake angle was observed between 10° and 18°, where the cutting force decreased by about 20% for all simulated cutting speeds. In addition, increasing the cutting tool rake angle beyond its optimum value had a negative influence on the tool's performance and led to an increase in cutting temperature. The results give a better understanding and recognition of cutting tool design for high speed machining processes
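The Johnson-Cook flow stress law used in such cutting simulations has the closed form sigma = (A + B*eps^n) * (1 + C*ln(rate/rate0)) * (1 - T*^m), and can be sketched directly. The constants below are illustrative textbook-style values for AISI 4340, not the ones used in the paper.

```python
# Johnson-Cook flow stress: strain hardening x strain-rate hardening x
# thermal softening. T* is the homologous temperature.
import math

def johnson_cook_stress(strain, strain_rate, T,
                        A=792.0, B=510.0, n=0.26, C=0.014, m=1.03,
                        ref_rate=1.0, T_room=25.0, T_melt=1520.0):
    """Flow stress in MPa; T, T_room, T_melt in deg C."""
    T_star = (T - T_room) / (T_melt - T_room)
    return ((A + B * strain ** n)
            * (1.0 + C * math.log(strain_rate / ref_rate))
            * (1.0 - T_star ** m))

# Stress rises with strain and strain rate, and falls as the temperature
# approaches melting:
print(johnson_cook_stress(0.2, 1e3, 500.0))
```

In the FEM model this relation is evaluated at every integration point; here it simply shows why higher cutting temperatures (e.g. at over-positive rake angles) soften the chip material.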

  6. Tools and Methods for RTCP-Nets Modeling and Verification

    Directory of Open Access Journals (Sweden)

    Szpyrka Marcin

    2016-09-01

    Full Text Available RTCP-nets are high-level Petri nets similar to timed colored Petri nets, but with a different time model and some structural restrictions. The paper deals with practical aspects of using RTCP-nets for modeling and verification of real-time systems. It contains a survey of software tools developed to support RTCP-nets. Verification of RTCP-nets is based on coverability graphs, which represent the set of reachable states in the form of a directed graph. Two approaches to verification of RTCP-nets are considered in the paper. The former is oriented towards states and is based on translation of a coverability graph into a nuXmv (NuSMV) finite-state model. The latter approach is oriented towards transitions and uses the CADP toolkit to check whether requirements given as μ-calculus formulae hold for a given coverability graph. All presented concepts are discussed using illustrative examples.
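The state-graph idea underlying this verification approach can be illustrated on a plain (untimed, uncolored) Petri net, where the coverability graph reduces to a reachability graph built by breadth-first exploration. This is a sketch only; RTCP-net time semantics and the nuXmv/CADP translations are omitted.

```python
# Build the reachability graph of a small Petri net by BFS over markings.
from collections import deque

def enabled(marking, pre):
    return all(marking[p] >= k for p, k in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, k in pre.items():
        m[p] -= k
    for p, k in post.items():
        m[p] = m.get(p, 0) + k
    return m

def reachability_graph(m0, transitions):
    # transitions: {name: (pre, post)}; states are frozen marking tuples.
    key = lambda m: tuple(sorted(m.items()))
    seen, edges = {key(m0)}, []
    todo = deque([m0])
    while todo:
        m = todo.popleft()
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                edges.append((key(m), name, key(m2)))
                if key(m2) not in seen:
                    seen.add(key(m2))
                    todo.append(m2)
    return seen, edges

# Two-place net: transition t moves a token from p1 to p2.
states, edges = reachability_graph({"p1": 1, "p2": 0},
                                   {"t": ({"p1": 1}, {"p2": 1})})
print(len(states))   # -> 2
```

Model checkers then evaluate temporal or μ-calculus formulae over exactly this kind of directed state graph.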

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database ... connection of the wind turbine at different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control ... of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report thus provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built...

  8. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that has to be solved for software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the ability of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  9. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    Full Text Available A hybrid neural network approach based tool for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from the studies performed by the authors and the works present in the literature, it was found that a direct computation of the five parameters via a multiple-input multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network, which in our case is aimed at predicting just two of the five parameters identifying the model: the other three parameters are computed by the reduced form. The present hybrid approach is efficient from the computational cost point of view and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy-to-use tool suitable for implementation in a microcontroller-based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
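The one-diode model being identified is I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh, implicit in I, and can be sketched with a simple fixed-point solve. All parameter values below are illustrative, not outputs of the hybrid tool.

```python
# Solve the implicit one-diode equation for the terminal current I at a
# given voltage V by fixed-point iteration, starting from I = Iph.
import math

def one_diode_current(V, Iph=5.0, I0=1e-9, Rs=0.2, Rsh=300.0,
                      n=1.3, Vt=0.0257, cells=36, iters=200):
    nVt = n * Vt * cells          # modified ideality voltage for the module
    I = Iph                       # initial guess: the photocurrent
    for _ in range(iters):
        I = (Iph
             - I0 * math.expm1((V + I * Rs) / nVt)
             - (V + I * Rs) / Rsh)
    return I

# At short circuit (V = 0) the current is close to Iph, reduced slightly
# by the series/shunt resistances:
print(round(one_diode_current(0.0), 3))
```

The five parameters (Iph, I0, Rs, Rsh, n) are exactly the quantities the hybrid neural-network/reduced-form approach estimates; the fixed-point loop converges well near short circuit, while a robust solver (e.g. Newton or the Lambert-W form) would be used over the full I-V curve.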

  10. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping

    2015-01-01

    Background: Biospecimens are essential resources for advancing basic and translational research. However, there is little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911
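The kind of cost-recovery arithmetic the BEMT automates can be illustrated with a deliberately simplified sketch. The formula and all numbers below are hypothetical, not the BEMT's actual model.

```python
# Hypothetical per-specimen fee: sum direct and indirect annual costs,
# choose what fraction to recover through fees, and divide by throughput.
def cost_recovery_fee(direct_costs, indirect_costs, specimens_per_year,
                      recovery_fraction=0.6):
    total = sum(direct_costs.values()) + sum(indirect_costs.values())
    return recovery_fraction * total / specimens_per_year

fee = cost_recovery_fee(
    direct_costs={"staff": 120000, "consumables": 30000},
    indirect_costs={"facilities": 50000},
    specimens_per_year=4000,
    recovery_fraction=0.6,   # remaining 40% assumed covered by grants
)
print(fee)   # -> 30.0
```

The BEMT's seven-step process wraps this sort of calculation with survey-derived benchmark data and multi-year financial forecasting.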

  11. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there is little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.

  12. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  13. Analysis method for the search for neutrinoless double beta decay in the NEMO3 experiment: study of the background and first results; Methode d'analyse pour la recherche de la double desintegration {beta} sans emission de neutrinos dans l'experience NEMO3. Etude du bruit de fond et premiers resultats

    Energy Technology Data Exchange (ETDEWEB)

    Etienvre, A I

    2003-04-15

    The NEMO3 detector, installed in the Frejus Underground Laboratory, is dedicated to the study of neutrinoless double beta decay: the observation of this process would establish the massive and Majorana nature of the neutrino. The experiment consists of very thin central source foils (the total mass is equal to 10 kg), a tracking detector made of drift cells operating in Geiger mode, a calorimeter made of plastic scintillators associated with photomultipliers, a coil producing a 30 gauss magnetic field and two shields dedicated to the reduction of the {gamma}-ray and neutron fluxes. In the first part, I describe the implications of several mechanisms related to trilinear R-parity violation for double beta decay. The second part is dedicated to a detailed study of the tracking detector of the experiment: after a description of the different working tests, I present the determination of the characteristics of the tracking reconstruction (transverse and longitudinal resolution per Geiger cell, precision of the vertex determination, and charge recognition). The last part corresponds to the analysis of the data taken by the experiment. On the one hand, an upper limit on the Tl{sup 208} activity of the sources has been determined: it is lower than 68 mBq/kg at the 90% confidence level. On the other hand, I have developed and tested on these data a method to analyse the neutrinoless double beta decay signal; this method is based on a maximum likelihood using all the available information. Using this method, I could determine a first and very preliminary upper limit on the effective mass of the neutrino. (author)
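The way such a counting analysis translates into a half-life limit can be sketched with the standard relation T_half > ln(2) * N_atoms * efficiency * t / N_UL. The numbers below are illustrative, not NEMO3 results; 2.44 is the familiar 90% CL Feldman-Cousins upper limit for zero observed events with no expected background.

```python
# Convert an upper limit on signal counts into a lower limit on the
# 0-neutrino double-beta-decay half-life (in years).
import math

def halflife_limit(mass_kg, molar_mass_g, efficiency, live_time_yr, n_ul):
    N_A = 6.022e23                                   # Avogadro's number
    n_atoms = mass_kg * 1000.0 / molar_mass_g * N_A  # source atoms
    return math.log(2) * n_atoms * efficiency * live_time_yr / n_ul

# Example: 7 kg of an isotope with molar mass 100 g/mol, 10% detection
# efficiency, 1 year of live time, 90% CL limit of 2.44 signal events.
print(f"{halflife_limit(7.0, 100.0, 0.10, 1.0, 2.44):.2e}")   # years
```

A likelihood analysis like the one described in the thesis effectively sharpens N_UL by weighting events with all available kinematic information rather than simply counting them.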

  14. A weak AMOC in a cold climate: Causes and remedies for a bias in the low-resolution version of the UK Earth System Model

    Science.gov (United States)

    Kuhlbrodt, T.; Jones, C.

    2016-02-01

    The UK Earth System Model (UKESM) is currently being developed by the UK Met Office and the academic community in the UK. The low-resolution version of UKESM has a nominal grid cell size of 150 km in the atmosphere (Unified Model [UM], N96) and 1° in the ocean (NEMO, ORCA1). In several preliminary test configurations of UKESM-N96-ORCA1, we find a significant cold bias in the northern hemisphere in comparison with HadGEM2 (N96-ORCA025, i.e. 0.25° resolution in the ocean). The sea surface is too cold by more than 2 K, and up to 6 K, in large parts of the North Atlantic and the northwest Pacific. In addition to the cold bias, the maximum AMOC transport (diagnosed below 500 m depth) decreases in all the configurations, displaying values between 11 and 14 Sv after a run length of 50 years. Transport at 26°N is even smaller and hence too weak in relation to observed values (approx. 18 Sv). The mixed layer is too deep within the North Atlantic Current and the Kuroshio, but too shallow north of these currents. The cold bias extends to a depth of several hundred metres. In the North Atlantic, it is accompanied by a freshening of up to 1.5 psu, compared to present-day climatology, along the path of the North Atlantic Current. A core problem appears to be the cessation of deep-water formation in the Labrador Sea. Remarkably, using earlier versions of NEMO and the UM, the AMOC is stable at around 16 or 17 Sv in the N96-ORCA1 configuration. We report on various strategies to reduce the cold bias and enhance the AMOC transport. Changing various parameters that affect the vertical mixing in NEMO has no significant effect. Modifying the bathymetry to deepen and widen the channels across the Greenland-Iceland-Scotland sill leads to a short-term improvement in AMOC transport, but only for about ten years. Strikingly, in a configuration with longer time steps for the atmosphere model we find a climate that is even colder, but has a more vigorous maximum AMOC transport (14 Sv
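A "maximum AMOC transport" diagnostic of the kind quoted above is typically computed from the meridional overturning streamfunction: integrate the meridional velocity zonally, accumulate it in depth, and take the maximum. Below is a pure-Python sketch on a toy grid; an actual NEMO/ORCA1 analysis would use the model's mesh and metric files rather than uniform cell sizes.

```python
# Overturning streamfunction psi(k, j) in Sverdrups from a meridional
# velocity field v[k][j][i] (m/s), with uniform zonal cell width dx (m)
# and level thickness dz (m).
def overturning_sv(v, dx, dz):
    nz, ny = len(v), len(v[0])
    psi = [[0.0] * ny for _ in range(nz)]
    for j in range(ny):
        running = 0.0
        for k in range(nz):
            zonal = sum(v[k][j][i] * dx for i in range(len(v[k][j])))
            running += zonal * dz          # cumulative depth integral, m^3/s
            psi[k][j] = running / 1e6      # 1 Sv = 1e6 m^3/s
    return psi

# Toy ocean: 2 levels, 1 latitude, 2 longitudes; 100 km cells, 500 m levels.
# Northward flow above, compensating southward flow below.
psi = overturning_sv(v=[[[0.1, 0.1]], [[-0.1, -0.1]]], dx=1e5, dz=500.0)
print(psi)   # upper cell carries 10 Sv northward, fully compensated below
```

Restricting the maximum search to levels below 500 m, as in the abstract, avoids counting the shallow wind-driven cell as part of the AMOC.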

  15. Hanford River Protection Project Life cycle Cost Modeling Tool to Enhance Mission Planning - 13396

    International Nuclear Information System (INIS)

    Dunford, Gary; Williams, David; Smith, Rick

    2013-01-01

    The Life cycle Cost Model (LCM) Tool is an overall systems model that incorporates budget and schedule impacts for the entire life cycle of the River Protection Project (RPP) mission, and is replacing the Hanford Tank Waste Operations Simulator (HTWOS) model as the foundation of the RPP system planning process. Currently, the DOE frequently requests HTWOS simulations of alternative technical and programmatic strategies for completing the RPP mission. Analysis of technical and programmatic changes can be performed with HTWOS; however, life cycle costs and schedules were previously generated by manual transfer of time-based data from HTWOS to Primavera P6. The LCM Tool automates the preparation of life cycle costs and schedules and is needed to provide timely turnaround capability for RPP mission alternative analyses. LCM is the simulation component of the LCM Tool. The simulation component is a replacement of the HTWOS model with new capability to support life cycle cost modeling. It is currently deployed in G22, but has been designed to work in any full object-oriented language with an extensive feature set focused on networking and cross-platform compatibility. The LCM retains existing HTWOS functionality needed to support system planning and alternatives studies going forward. In addition, it incorporates new functionality, coding improvements that streamline programming and model maintenance, and capability to input/export data to/from the LCM using the LCM Database (LCMDB). The LCM Cost/Schedule (LCMCS) contains cost and schedule data and logic. The LCMCS is used to generate life cycle costs and schedules for waste retrieval and processing scenarios. It uses time-based output data from the LCM to produce the logic ties in Primavera P6 necessary for shifting activities. The LCM Tool is evolving to address the needs of decision makers who want to understand the broad spectrum of risks facing complex organizations like DOE-RPP to understand how near

  16. Hanford River Protection Project Life cycle Cost Modeling Tool to Enhance Mission Planning - 13396

    Energy Technology Data Exchange (ETDEWEB)

    Dunford, Gary [AEM Consulting, LLC, 1201 Jadwin Avenue, Richland, WA 99352 (United States); Williams, David [WIT, Inc., 11173 Oak Fern Court, San Diego, CA 92131 (United States); Smith, Rick [Knowledge Systems Design, Inc., 13595 Quaker Hill Cross Rd, Nevada City, CA 95959 (United States)

    2013-07-01

    The Life cycle Cost Model (LCM) Tool is an overall systems model that incorporates budget and schedule impacts for the entire life cycle of the River Protection Project (RPP) mission, and is replacing the Hanford Tank Waste Operations Simulator (HTWOS) model as the foundation of the RPP system planning process. Currently, the DOE frequently requests HTWOS simulations of alternative technical and programmatic strategies for completing the RPP mission. Analysis of technical and programmatic changes can be performed with HTWOS; however, life cycle costs and schedules were previously generated by manual transfer of time-based data from HTWOS to Primavera P6. The LCM Tool automates the preparation of life cycle costs and schedules and is needed to provide timely turnaround capability for RPP mission alternative analyses. LCM is the simulation component of the LCM Tool. The simulation component is a replacement of the HTWOS model with new capability to support life cycle cost modeling. It is currently deployed in G22, but has been designed to work in any full object-oriented language with an extensive feature set focused on networking and cross-platform compatibility. The LCM retains existing HTWOS functionality needed to support system planning and alternatives studies going forward. In addition, it incorporates new functionality, coding improvements that streamline programming and model maintenance, and capability to input/export data to/from the LCM using the LCM Database (LCMDB). The LCM Cost/Schedule (LCMCS) contains cost and schedule data and logic. The LCMCS is used to generate life cycle costs and schedules for waste retrieval and processing scenarios. It uses time-based output data from the LCM to produce the logic ties in Primavera P6 necessary for shifting activities. The LCM Tool is evolving to address the needs of decision makers who want to understand the broad spectrum of risks facing complex organizations like DOE-RPP to understand how near

  17. PAH plant uptake prediction: Evaluation of combined availability tools and modeling approach

    OpenAIRE

    Ouvrard, Stéphanie; DUPUY, Joan; Leglize, Pierre; Sterckeman, Thibault

    2015-01-01

    Transfer to plants is one of the main human exposure pathways for polycyclic aromatic hydrocarbons (PAH) from contaminated soils. However, existing models implemented in risk assessment tools mostly rely on i) total contaminant concentration and ii) plant uptake models based on hydroponics experiments established with pesticides (Briggs et al., 1982, 1983). Total concentrations of soil contaminants are useful to indicate pollution; however, they do not necessarily indicate risk. Me...

  18. Modelling tools for managing Induced RiverBank Filtration MAR schemes

    Science.gov (United States)

    De Filippis, Giovanna; Barbagli, Alessio; Marchina, Chiara; Borsi, Iacopo; Mazzanti, Giorgio; Nardi, Marco; Vienken, Thomas; Bonari, Enrico; Rossetto, Rudy

    2017-04-01

    Induced RiverBank Filtration (IRBF) is a widely used technique in Managed Aquifer Recharge (MAR) schemes, when aquifers are hydraulically connected with surface water bodies, with proven positive effects on the quality and quantity of groundwater. IRBF allows abstraction of a large volume of water while avoiding a large decrease in groundwater heads. Moreover, thanks to the filtration process through the soil, the concentration of chemical species in surface water can be reduced, thus becoming an excellent resource for the production of drinking water. Within the FP7 MARSOL project (demonstrating Managed Aquifer Recharge as a SOLution to water scarcity and drought; http://www.marsol.eu/), the Sant'Alessio IRBF (Lucca, Italy) was used to demonstrate the feasibility and the technical and economic benefits of managing IRBF schemes (Rossetto et al., 2015a). The Sant'Alessio IRBF along the Serchio river allows abstraction of an overall amount of about 0.5 m3/s, providing drinking water for 300000 people of coastal Tuscany (mainly the towns of Lucca, Pisa and Livorno). The supplied water is made available by enhancing river bank infiltration into a high yield (10^-2 m2/s transmissivity) sandy-gravelly aquifer by raising the river head and using ten vertical wells along the river embankment. A Decision Support System, consisting of connected measurements from an advanced monitoring network and modelling tools, was set up to manage the IRBF. The modelling system is based on spatially distributed and physically based coupled ground-/surface-water flow and solute transport models integrated in the FREEWAT platform (developed within the H2020 FREEWAT project - FREE and Open Source Software Tools for WATer Resource Management; Rossetto et al., 2015b), an open source and public domain GIS-integrated modelling environment for the simulation of the hydrological cycle. The platform aims at improving water resource management by simplifying the application of EU water-related Directives and at
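The leverage of river stage on bank infiltration can be shown with a back-of-envelope Darcy-type sketch, q = T * W * (h_river - h_aquifer) / L. The numbers are illustrative only; the actual Sant'Alessio DSS relies on the full coupled, spatially distributed flow and transport model described above.

```python
# Darcy-style estimate of bank infiltration (m^3/s) across a reach:
# transmissivity T (m^2/s), bank length W (m), head difference (m),
# horizontal flow-path length L (m).
def infiltration_m3s(transmissivity, bank_length, head_river, head_aquifer,
                     flow_path):
    return (transmissivity * bank_length
            * (head_river - head_aquifer) / flow_path)

base = infiltration_m3s(1e-2, 1000.0, 12.0, 10.0, 50.0)
raised = infiltration_m3s(1e-2, 1000.0, 13.0, 10.0, 50.0)
print(base, raised)   # -> 0.4 0.6  (raising the stage boosts abstraction)
```

Raising the river head by one metre here increases the induced flux by 50%, which is the mechanism the Sant'Alessio scheme exploits with its in-river weir and wells.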

  19. Operation room tool handling and miscommunication scenarios: an object-process methodology conceptual model.

    Science.gov (United States)

    Wachs, Juan P; Frenkel, Boaz; Dori, Dov

    2014-11-01

    Errors in the delivery of medical care are the principal cause of inpatient mortality and morbidity, accounting for around 98,000 deaths annually in the United States of America (USA). Ineffective team communication, especially in the operating room (OR), is a major root cause of these errors. This miscommunication can be reduced by analyzing and constructing a conceptual model of communication and miscommunication in the OR. We introduce the principles underlying Object-Process Methodology (OPM)-based modeling of the intricate interactions between the surgeon and the surgical technician while handling surgical instruments in the OR. This model is a software- and hardware-independent description of the agents engaged in communication events, their physical activities, and their interactions. The model enables assessing whether the task-related objectives of the surgical procedure were achieved and completed successfully and what errors can occur during the communication. The facts used to construct the model were gathered from observations of various types of operations, miscommunications in the operating room, and their outcomes. The model takes advantage of the compact ontology of OPM, which comprises stateful objects - things that exist physically or informatically - and processes - things that transform objects by creating them, consuming them, or changing their state. The modeled communication modalities are verbal and non-verbal, and errors are modeled as processes that deviate from the "sunny day" scenario. Using the OPM refinement mechanism of in-zooming, key processes are drilled into and elaborated, along with the objects that are required as agents or instruments, or that these processes transform. The model was developed through an iterative process of observation, modeling, group discussions, and simplification. The model faithfully represents the processes related to tool handling that take place in an OR during an operation. The specification is at

  20. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
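
    The event-queue scheme described above (schedule discretised events with time delays, then pop them in time order until the queue is empty) can be sketched in a few lines; the class, the two-mode component, and all names below are illustrative, not taken from the patented tool.

```python
import heapq

class DiscreteEventSimulator:
    """Minimal event-queue simulator: continuous behaviour is discretised
    into timed events, executed in time order until the queue empties."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []
        self._seq = 0          # tie-breaker for simultaneous events
        self.log = []

    def schedule(self, delay, action, label):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, action, label))
        self._seq += 1

    def run(self):
        while self._queue:      # run until the event queue is emptied
            time, _, action, label = heapq.heappop(self._queue)
            self.clock = time
            self.log.append((time, label))
            action(self)

# A toy two-mode component: it stays in a 'heat' mode for three ticks,
# then a mode-transition process switches it to 'idle'.
def heat(sim, level=[0]):
    level[0] += 1
    if level[0] < 3:
        sim.schedule(1.0, heat, "heat")
    else:
        sim.schedule(0.5, lambda s: None, "idle")

sim = DiscreteEventSimulator()
sim.schedule(0.0, heat, "heat")
sim.run()
print(sim.log)                  # events executed in time order
```

    The mutable-default trick in `heat` is only a compact way to give the toy component state; a real library component would carry its modes explicitly, as the record describes.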

  1. A modeling tool to support decision making in future hydropower development in Chile

    Science.gov (United States)

    Vicuna, S.; Hermansen, C.; Cerda, J. P.; Olivares, M. A.; Gomez, T. I.; Toha, E.; Poblete, D.; Mao, L.; Falvey, M. J.; Pliscoff, P.; Melo, O.; Lacy, S.; Peredo, M.; Marquet, P. A.; Maturana, J.; Gironas, J. A.

    2017-12-01

    Modeling tools support planning by providing transparent means to assess the outcomes of natural resource management alternatives within technical frameworks in the presence of conflicting objectives. Such tools, when employed to model different scenarios, complement discussion in a policy-making context. Examples of practical use of this type of tool exist, such as Canadian public forest management, but they are not common, especially in developing countries. We present a tool to support the selection from a portfolio of potential future hydropower projects in Chile. This tool, developed by a large team of researchers under the guidance of the Chilean Energy Ministry, is especially relevant in a context of evident regionalism, skepticism and changing societal values in a country that has achieved sustained growth alongside increased demands from society. The tool operates at the scale of a river reach (1-5 km long) on a domain that can be defined according to the scale needs of the related discussion, and its application can vary from river basins to regions or other spatial configurations of interest. The tool addresses both the available hydropower potential and the existence (inferred or observed) of other ecological, social, cultural and productive characteristics of the territory that are valuable to society, and provides a means to evaluate their interaction. The occurrence of each of these valuable characteristics in the territory is measured by generating a presence-density score for each. Considering the level of constraint each characteristic imposes on hydropower development, they are weighted against each other and an aggregate score is computed. With this information, optimal trade-offs between additional hydropower capacity and valuable local characteristics are computed over the entire domain using the classical 0-1 knapsack optimization algorithm. Various scenarios of different weightings and hydropower
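
    The trade-off computation can be sketched as a textbook 0-1 knapsack: each candidate project contributes capacity but consumes part of an aggregate territorial-impact budget. The numbers and the single-score formulation below are invented for illustration; the paper describes its actual weighting scheme only qualitatively.

```python
def knapsack(projects, capacity):
    """Classical 0-1 knapsack by dynamic programming: choose projects
    maximising hydropower capacity (MW) subject to an aggregate
    territorial-impact budget (integer score)."""
    n = len(projects)
    # dp[i][b]: best MW using the first i projects within impact budget b
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        mw, impact = projects[i - 1]
        for b in range(capacity + 1):
            dp[i][b] = dp[i - 1][b]
            if impact <= b:
                dp[i][b] = max(dp[i][b], dp[i - 1][b - impact] + mw)
    # Walk back through the table to recover the selected projects
    selected, b = [], capacity
    for i in range(n, 0, -1):
        if dp[i][b] != dp[i - 1][b]:
            selected.append(i - 1)
            b -= projects[i - 1][1]
    return dp[n][capacity], sorted(selected)

# (MW, impact score) per candidate reach -- hypothetical numbers
projects = [(120, 4), (60, 2), (200, 7), (90, 3)]
best_mw, chosen = knapsack(projects, capacity=9)
print(best_mw, chosen)
```

    Re-running the optimisation with re-weighted impact scores is exactly how different societal-value scenarios would shift the selected portfolio.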

  2. Digital soil mapping as a tool for quantifying state-and-transition models

    Science.gov (United States)

    Ecological sites and associated state-and-transition models (STMs) are rapidly becoming important land management tools in rangeland systems in the US and around the world. Descriptions of states and transitions are largely developed from expert knowledge and generally accepted species and community...

  3. Rogeaulito: A World Energy Scenario Modeling Tool for Transparent Energy System Thinking

    International Nuclear Information System (INIS)

    Benichou, Léo; Mayr, Sebastian

    2014-01-01

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It’s a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty it computes energy supply and demand independently from each other, revealing potentially missing energy supply by 2100. It is also simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.
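
    The model's core idea, computing supply and demand trajectories independently and reading off the gap, can be illustrated with a minimal sketch; the anchor values below are invented for illustration and are not The Shift Project's data.

```python
def interpolate(anchors, year):
    """Piecewise-linear interpolation between (year, value) anchor points."""
    for (y0, v0), (y1, v1) in zip(anchors, anchors[1:]):
        if y0 <= year <= y1:
            return v0 + (v1 - v0) * (year - y0) / (y1 - y0)
    raise ValueError("year outside anchor range")

# Independent supply and demand trajectories (EJ/yr) -- hypothetical numbers
supply_anchors = [(2020, 580), (2060, 520), (2100, 400)]   # declining supply
demand_anchors = [(2020, 580), (2060, 700), (2100, 820)]   # growing demand

# The key output: the potentially missing supply by 2100, obtained by
# confronting the two independently computed trajectories.
gap_2100 = interpolate(demand_anchors, 2100) - interpolate(supply_anchors, 2100)
print(gap_2100)
```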

  4. Rogeaulito: A World Energy Scenario Modeling Tool for Transparent Energy System Thinking

    Energy Technology Data Exchange (ETDEWEB)

    Benichou, Léo, E-mail: leo.benichou@theshiftproject.org [The Shift Project, Paris (France); Mayr, Sebastian, E-mail: communication@theshiftproject.org [Paris School of International Affairs, Sciences Po., Paris (France)

    2014-01-13

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It’s a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty it computes energy supply and demand independently from each other, revealing potentially missing energy supply by 2100. It is also simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.

  5. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  6. 75 FR 54627 - ICLUS v1.3 User's Manual: ArcGIS Tools and Datasets for Modeling U.S. Housing Density Growth

    Science.gov (United States)

    2010-09-08

    ...'s guide titled, ``ICLUS v1.3 User's Manual: ArcGIS Tools and Datasets for Modeling U.S. Housing...: ``ICLUS v1.3 User's Manual: ArcGIS Tools and Datasets for Modeling U.S. Housing Density Growth'' and the... final document title, ``ICLUS v1.3 User's Manual: ArcGIS Tools and Datasets for Modeling U.S. Housing...

  7. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  8. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  9. CINE CLUB

    CERN Multimedia

    Ciné Club

    2009-01-01

    Main Auditorium, CERN. Thursday 17 December 2009 at 18:15 in the CERN Main Auditorium (Building 500)   As every year before Christmas, the CERN CineClub is pleased to invite young and old to a free screening of the film   FINDING NEMO (LE MONDE DE NEMO) (USA, 2003, Andrew Stanton and Lee Unkrich)   In the tropical waters of the Great Barrier Reef, a clownfish named Marin leads a peaceful existence with his only son, Nemo. Fearing the ocean and its unpredictable dangers, he does his best to protect his son. Like all young fish his age, however, Nemo dreams of exploring the mysterious reefs. When Nemo disappears, Marin reluctantly becomes the hero of a...

  10. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically sounder methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  11. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    Science.gov (United States)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) development has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present the tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool is therefore user friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
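
    The kind of repetitive data handling such a plugin automates can be illustrated outside QGIS with a minimal, stdlib-only sketch that aggregates daily gridded rainfall records to one mean series per sub-basin, ready for a hydrological model's input files; the column names and values below are assumed, not the tool's actual schema.

```python
import csv
import io
from collections import defaultdict

# Toy stand-in for a national gridded rainfall export: one row per grid
# cell per day, tagged with the sub-basin the cell falls in.
raw = io.StringIO("""\
date,cell_id,basin,rain_mm
2012-07-01,101,upper,12.0
2012-07-01,102,upper,8.0
2012-07-01,201,lower,3.0
2012-07-02,101,upper,0.0
2012-07-02,102,upper,4.0
2012-07-02,201,lower,1.5
""")

sums = defaultdict(lambda: [0.0, 0])           # (date, basin) -> [total, count]
for row in csv.DictReader(raw):
    key = (row["date"], row["basin"])
    sums[key][0] += float(row["rain_mm"])
    sums[key][1] += 1

# Mean daily rainfall per basin -- the per-unit input a model expects
basin_means = {k: total / n for k, (total, n) in sums.items()}
print(basin_means)
```

    In the actual plugin the same aggregation would run over raster layers via the QGIS/GDAL APIs rather than a CSV, but the bookkeeping is the same.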

  12. Wear-dependent specific coefficients in a mechanistic model for turning of nickel-based superalloy with ceramic tools

    Science.gov (United States)

    López de Lacalle, Luis Norberto; Urbicain Pelayo, Gorka; Fernández-Valdivielso, Asier; Alvarez, Alvaro; González, Haizea

    2017-09-01

    Difficult-to-cut materials such as nickel and titanium alloys are used in the aeronautical industry, the former for their heat-resistant behavior and the latter for their high strength-to-weight ratio. Ceramic tools made of alumina reinforced with SiC whiskers are a common choice in turning for the roughing and semifinishing workpiece stages. The wear rate is high in the machining of these alloys, and consequently cutting forces tend to increase over the course of an operation. This paper establishes the cutting force relation between workpiece and tool in the turning of such difficult-to-cut alloys by means of a mechanistic cutting force model that considers the effect of tool wear. The model demonstrates the sensitivity of the force to the cutting engagement parameters (ap, f) when ceramic inserts are used and wear is considered. Wear is introduced through a cutting-time factor, which is useful under real conditions given that wear appears quickly when machining these alloys. Accurate cutting force model coefficients are the key to accurate prediction of turning forces, which can serve as a criterion for tool replacement or as input for chatter or other models.
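
    A generic mechanistic model of this kind can be sketched as a specific cutting coefficient, growing with cutting time to mimic wear, multiplied by the chip section ap x f. The coefficient values and the linear wear law below are hypothetical illustrations, not the paper's fitted coefficients.

```python
def cutting_force(ap, f, t, K0=2500.0, w=0.02):
    """Tangential cutting force (N) for turning: a specific cutting
    coefficient (N/mm2) times the chip section ap * f (mm2), with the
    coefficient growing linearly with cutting time t (min) to represent
    tool wear. K0 and the wear factor w are assumed values."""
    Ks = K0 * (1.0 + w * t)        # wear-dependent specific coefficient
    return Ks * ap * f             # chip section: depth of cut x feed

# Force growth for a fixed engagement (ap = 2 mm, f = 0.15 mm/rev) as the
# ceramic insert wears over 10 min of cutting
fresh = cutting_force(ap=2.0, f=0.15, t=0.0)
worn = cutting_force(ap=2.0, f=0.15, t=10.0)
print(fresh, worn)
```

    Monitoring the measured force against such a curve is one way the model could trigger tool replacement, as the abstract suggests.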

  13. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...

  14. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    Science.gov (United States)

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, current use is limited, and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and the four input resources required (time, money, knowledge and data). The characterisation is presented in matrix form to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce, let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision-making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to obtain from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on method comparison and selection.
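
    A matrix-based characterisation like the one described lends itself to a simple filter: score each method on a few characteristics, then shortlist those matching the resources a project actually has. The methods, characteristics and scores below are toy assumptions, not the paper's actual twenty-eight-method matrix.

```python
# Each method is characterised by a few illustrative traits.
methods = {
    "discrete-event simulation": {"stage": "operation", "data": "high", "time": "high"},
    "system dynamics":           {"stage": "strategy",  "data": "low",  "time": "medium"},
    "soft systems methodology":  {"stage": "problem structuring", "data": "low", "time": "low"},
}

def shortlist(methods, **required):
    """Return method names whose characterisation matches every requirement."""
    return sorted(
        name for name, traits in methods.items()
        if all(traits.get(k) == v for k, v in required.items())
    )

# A team with little data available would shortlist the low-data methods.
candidates = shortlist(methods, data="low")
print(candidates)
```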

  15. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  16. Modeling and Adhesive Tool Wear in Dry Drilling of Aluminum Alloys

    International Nuclear Information System (INIS)

    Girot, F.; Gutierrez-Orrantia, M. E.; Calamaz, M.; Coupard, D.

    2011-01-01

    One of the challenges in aeronautic drilling operations is the elimination of cutting fluids while maintaining the quality of drilled parts. This paper therefore aims to increase tool life and process quality by working on the relationships existing between drilling parameters (cutting speed and feed rate), coatings and tool geometry. In dry drilling, the Built-Up Layer phenomenon is the predominant damage mechanism. A model fitting the axial force to the cutting parameters and the damage has been developed. The burr thickness and its dispersion decrease with the feed rate. Current diamond coatings, which exhibit strong adhesion to the carbide substrate, can limit this adhesive layer phenomenon. A relatively smooth nano-structured coating strongly limits the development of this layer.

  17. Nemo-like kinase as a negative regulator of nuclear receptor Nurr1 gene transcription in prostate cancer

    International Nuclear Information System (INIS)

    Wang, Jian; Yang, Zhi-Hong; Chen, Hua; Li, Hua-Hui; Chen, Li-Yong; Zhu, Zhu; Zou, Ying; Ding, Cong-Cong; Yang, Jing; He, Zhi-Wei

    2016-01-01

    Nurr1, a member of the orphan receptor family, plays an important role in several types of cancer. Our previous work demonstrated that increased expression of Nurr1 plays a significant role in the initiation and progression of prostate cancer (PCa), though the mechanisms for regulation of Nurr1 expression remain unknown. In this study, we investigated the hypothesis that Nemo-like kinase (NLK) is a key regulator of Nurr1 expression in PCa. Immunohistochemistry and Western blot analysis were used to evaluate levels of NLK and Nurr1 in prostatic tissues and cell lines. The effects of overexpression or knockdown of Nurr1 were evaluated in PCa cells through use of PCR, Western blots and promoter reporter assays. The role of Nurr1 promoter cis element was studied by creation of two mutant Nurr1 promoter luciferase constructs, one with a mutated NF-κB binding site and one with a mutated CREB binding site. In addition, three specific inhibitors were used to investigate the roles of these proteins in transcriptional activation of Nurr1, including BAY 11–7082 (NF-κB inhibitor), KG-501 (CREB inhibitor) and ICG-001 (CREB binding protein, CBP, inhibitor). The function of CBP in NLK-mediated regulation of Nurr1 expression was investigated using immunofluorescence, co-immunoprecipitation (Co-IP) and chromatin immunoprecipitation assays (ChIPs). NLK expression was inversely correlated with Nurr1 expression in prostate cancer tissues and cell lines. Overexpression of NLK suppressed Nurr1 promoter activity, leading to downregulation of Nurr1 expression. In contrast, knockdown of NLK demonstrated opposite results, leading to upregulation of Nurr1. When compared with the wild-type Nurr1 promoter, mutation of NF-κB- and CREB-binding sites of the Nurr1 promoter region significantly reduced the upregulation of Nurr1 induced by knockdown of NLK in LNCaP cells; treatment with inhibitors of CREB, CBP and NF-κB led to similar results. We also found that NLK directly interacts with CBP

  18. Clinical Prediction Model and Tool for Assessing Risk of Persistent Pain After Breast Cancer Surgery

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Andersen, Kenneth Geving; Bruce, Julie

    2017-01-01

    are missing. The aim was to develop a clinically applicable risk prediction tool. Methods The prediction models were developed and tested using three prospective data sets from Finland (n = 860), Denmark (n = 453), and Scotland (n = 231). Prediction models for persistent pain of moderate to severe intensity......), high body mass index ( P = .039), axillary lymph node dissection ( P = .008), and more severe acute postoperative pain intensity at the seventh postoperative day ( P = .003) predicted persistent pain in the final prediction model, which performed well in the Danish (ROC-AUC, 0.739) and Scottish (ROC......-AUC, 0.740) cohorts. At the 20% risk level, the model had 32.8% and 47.4% sensitivity and 94.4% and 82.4% specificity in the Danish and Scottish cohorts, respectively. Conclusion Our validated prediction models and an online risk calculator provide clinicians and researchers with a simple tool to screen...
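
    The reported operating points (e.g. 47.4% sensitivity and 82.4% specificity at the 20% risk level in the Scottish cohort) come from dichotomising the model's predicted risks at a threshold. The computation can be sketched as follows, with made-up risks and outcomes rather than the study's data.

```python
def confusion_at_threshold(risks, outcomes, threshold):
    """Sensitivity and specificity of a risk prediction model dichotomised
    at a predicted-risk threshold (e.g. the 20% risk level)."""
    tp = sum(1 for r, y in zip(risks, outcomes) if r >= threshold and y)
    fn = sum(1 for r, y in zip(risks, outcomes) if r < threshold and y)
    tn = sum(1 for r, y in zip(risks, outcomes) if r < threshold and not y)
    fp = sum(1 for r, y in zip(risks, outcomes) if r >= threshold and not y)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical predicted risks and observed persistent-pain outcomes
risks    = [0.05, 0.25, 0.40, 0.10, 0.30, 0.15, 0.22, 0.08]
outcomes = [0,    1,    1,    0,    0,    1,    1,    0   ]
sens, spec = confusion_at_threshold(risks, outcomes, threshold=0.20)
print(sens, spec)
```

    Sweeping the threshold over all values and plotting sensitivity against 1 - specificity is what produces the ROC curves whose areas (ROC-AUC) the abstract reports.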

  19. ARCHITECTURAL FORM CREATION IN THE DESIGN STUDIO: PHYSICAL MODELING AS AN EFFECTIVE DESIGN TOOL

    Directory of Open Access Journals (Sweden)

    Wael Abdelhameed

    2011-11-01

    Full Text Available This research paper attempts to shed more light on an area of the design studio concerned with the use of physical modeling as a design medium in architectural form creation. An experiment was carried out during an architectural design studio in order not only to investigate physical modeling as a tool of form creation but also to improve the visual design thinking that students employ while using this manual tool. To achieve the research objective, a method was proposed and applied to track form creation processes, based upon three types of operation, namely: sketching transformations, divergent physical-modeling transformations, and convergent physical-modeling transformations. The method helps record the innovative transitions of form during conceptual designing in a simple way. Investigating form creation processes and the activities associated with visual design thinking enables the research to draw general conclusions about the role of physical modeling in the conceptual phase of designing, and specific conclusions about the methods used in this architectural design studio experiment.

  20. Ocean modelling aspects for drift applications

    Science.gov (United States)

    Stephane, L.; Pierre, D.

    2010-12-01

    Nowadays, many authorities in charge of rescue-at-sea operations lean on operational oceanography products to outline search perimeters. Moreover, current fields estimated with sophisticated ocean forecasting systems can be used as input data for models of oil spill or adrift object fate. This emphasises the necessity of an accurate sea state forecast with a mastered level of reliability. This work focuses on several problems inherent to drift modeling, dealing in the first place with the quality of the oceanic current field representation. As we want to discriminate the relevance of a particular physical process or modeling option, the idea is to generate series of current fields with different characteristics and then qualify them in terms of drift prediction efficiency. Benchmark drift scenarios were set up from real surface drifter data collected in the Mediterranean Sea and off the coast of Angola. The time and space scales of interest are about 72 hr forecasts (the typical timescale communicated in case of crisis), with distance errors of a few dozen km around the forecast (acceptable for reconnaissance by aircraft). For the ocean prediction, we used regional oceanic configurations based on the NEMO 2.3 code, nested in the Mercator 1/12° operational system. Drift forecasts were computed offline with Mothy (the Météo France oil spill modeling system) and Ariane (B. Blanke, 1997), a Lagrangian diagnostic tool. We were particularly interested in the importance of the horizontal resolution, the vertical mixing schemes, and any processes that may impact the surface layer. The aim of the study is ultimately to point at the most suitable set of parameters for drift forecasting within operational oceanic systems. We also assess the relevance of ensemble forecasts with respect to deterministic predictions. Several tests showed that mis-described observed trajectories can finally be modelled statistically by using uncertainties
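
    The Lagrangian drift computation at the heart of tools like Ariane can be caricatured as forward integration of particle positions through a current field. The sketch below uses a steady, uniform current and simple forward-Euler stepping, a deliberately minimal stand-in for integration through full NEMO model fields; positions and currents are in hypothetical units.

```python
def advect(pos, current, hours, dt=1.0):
    """Forward-Euler advection of one drifting particle by a steady,
    uniform surface current. Positions in km, current in km/h."""
    x, y = pos
    u, v = current
    for _ in range(int(hours / dt)):
        x += u * dt
        y += v * dt
    return x, y

# A 72 h forecast (the crisis timescale cited above) under an assumed
# 0.5 km/h eastward, 0.2 km/h northward surface current
final = advect((0.0, 0.0), (0.5, 0.2), hours=72.0)
print(final)
```

    With real model fields the current is sampled at each particle's position and time, and spreading a cloud of particles through an ensemble of current fields yields the statistical (rather than deterministic) forecasts the abstract discusses.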

  1. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  2. Using Modeling Tools to Better Understand Permafrost Hydrology

    Directory of Open Access Journals (Sweden)

    Clément Fabre

    2017-06-01

    Full Text Available Modification of the hydrological cycle and, subsequently, of other global cycles is expected in Arctic watersheds owing to global change. Future climate scenarios imply widespread permafrost degradation caused by an increase in air temperature, and the expected effect on permafrost hydrology is immense. This study aims at analyzing and quantifying daily water transfer in the largest Arctic river system, the Yenisei River in central Siberia, Russia, which is partially underlain by permafrost. The semi-distributed SWAT (Soil and Water Assessment Tool) hydrological model has been calibrated and validated at a daily time step against historical discharge for the 2003–2014 period. The model parameters have been adjusted to capture the hydrological features of permafrost. SWAT is shown to be capable of estimating water fluxes at a daily time step, especially during unfrozen periods, once specific climatic and soil conditions adapted to a permafrost watershed are taken into account. The model simulates an average annual contribution to runoff of 263 millimeters per year (mm yr−1), distributed as 152 mm yr−1 (58%) of surface runoff, 103 mm yr−1 (39%) of lateral flow and 8 mm yr−1 (3%) of return flow from the aquifer. These results are integrated over a reduced basin area downstream of large dams and are closer to observations than previous modeling exercises.

  3. Rogeaulito: a world energy scenario modeling tool for transparent energy system thinking

    Directory of Open Access Journals (Sweden)

    Léo eBenichou

    2014-01-01

    Full Text Available Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It’s a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty it computes energy supply and demand independently from each other, revealing potentially missing energy supply by 2100. It is also simple to use, didactic and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.

  4. Tools for assessing mitochondrial dynamics in mouse tissues and neurodegenerative models

    Science.gov (United States)

    Pham, Anh H.

    Mitochondria are dynamic organelles that undergo membrane fusion, fission and transport. The dynamic properties of mitochondria are important for regulating mitochondrial function. Defects in mitochondrial dynamics are linked to neurodegenerative diseases and affect the development of many tissues. To investigate the role of mitochondrial dynamics in diseases, versatile tools are needed to explore the physiology of these dynamic organelles in multiple tissues. Current tools for monitoring mitochondrial dynamics have been limited to studies in cell culture, which may be an inadequate model system for exploring a network of tissues. Here, we have generated mouse models for monitoring mitochondrial dynamics in a broad spectrum of tissues and cell types. The Photo-Activatable Mitochondrial (PhAM floxed) line enables Cre-inducible expression of a mitochondrially targeted photoconvertible protein, Dendra2 (mito-Dendra2). In the PhAMexcised line, mito-Dendra2 is ubiquitously expressed to facilitate broad analysis of mitochondria across various developmental processes. We have utilized these models to study mitochondrial dynamics in the nigrostriatal circuit of Parkinson's disease (PD) and in the development of skeletal muscles. Increasing evidence implicates aberrant regulation of mitochondrial fusion and fission in models of PD. To assess the function of mitochondrial dynamics in the nigrostriatal circuit, we utilized transgenic techniques to abrogate mitochondrial fusion. We show that deletion of Mfn2 leads to the degeneration of dopaminergic neurons and Parkinson's-like features in mice. To elucidate the dynamic properties of mitochondria during muscle development, we established a platform for examining mitochondrial compartmentalization in skeletal muscles. This model system may yield clues to the role of mitochondrial dynamics in mitochondrial myopathies.

  5. The Climate-Agriculture-Modeling and Decision Tool (CAMDT) for Climate Risk Management in Agriculture

    Science.gov (United States)

    Ines, A. V. M.; Han, E.; Baethgen, W.

    2017-12-01

    Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, the models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF makes it possible to simulate "what-if" scenarios with different crop choices or management practices and better inform decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs into the DSSAT-CSM-Rice model to guide decision-makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. CAMDT can disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or a non-parametric disaggregation method) and run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user interface allows non-technical users to easily implement several "what-if" scenarios and visualize the results of the scenario runs. In addition, CAMDT translates crop model outputs into economic terms once the user provides the expected crop price and cost. CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, with great flexibility for adaptation to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
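
    One common way to disaggregate a tercile forecast is to resample historical analog years in proportion to the forecast probabilities. The sketch below illustrates that general idea only; the function name, category labels and analog-year scheme are assumptions, not CAMDT's actual implementation:

    ```python
    import random

    def sample_realizations(tercile_probs, years_by_category, n=100, seed=42):
        """Draw n analog years, weighting rainfall categories by the forecast.

        tercile_probs: e.g. {'below': 0.5, 'near': 0.3, 'above': 0.2}
        years_by_category: historical years grouped into the same terciles.
        Each drawn year would then supply one daily weather realization.
        """
        rng = random.Random(seed)
        cats = list(tercile_probs)
        weights = [tercile_probs[c] for c in cats]
        out = []
        for _ in range(n):
            cat = rng.choices(cats, weights=weights)[0]   # pick a category
            out.append(rng.choice(years_by_category[cat]))  # pick an analog year
        return out

    probs = {"below": 0.5, "near": 0.3, "above": 0.2}
    years = {"below": [1987, 1991], "near": [1995, 2001], "above": [1998, 2010]}
    reals = sample_realizations(probs, years)
    print(len(reals))  # 100 analog years, one per realization
    ```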

  6. License Management System

    OpenAIRE

    Urhonen, Matti

    2014-01-01

    Anite Finland Ltd. is a Finnish company providing a full range of software and hardware solutions for mobile network testing. The Nemo product portfolio consists of several products, from handheld measurement tools all the way to powerful data-analysing server solutions. The main customers are mobile operators all around the world. Anti-piracy and copy protection is a serious business, as there are skilful people trying to break the copy protections for illegal income or to g...

  7. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundations are presented, as well as some applications for known models of rack-gear type tools used on Maag teething machines.

  8. Climate change web picker. A tool bridging daily climate needs in process based modelling in forestry and agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Palma, J.H.N.

    2017-11-01

    Aim of study: Climate data are needed for many types of modeling assessments, especially those involving process-based modeling focused on climate change impacts. However, there is a scarcity of tools delivering easy access to climate datasets for use in biologically related modeling. This study aimed at the development of a tool providing a user-friendly interface to facilitate access to the climate datasets that are used to supply climate scenarios for the Intergovernmental Panel on Climate Change. Area of study: The tool provides daily datasets across Europe and also parts of northern Africa. Material and Methods: The tool uses climatic datasets generated from third-party sources (IPCC related), while a web-based interface was developed in JavaScript to ease access to the datasets. Main Results: The interface delivers daily (or monthly) climate data for a user-defined location in Europe for 7 climate variables: minimum and maximum temperature, precipitation, radiation, minimum and maximum relative humidity, and wind speed. The time frame ranges from 1951 to 2100, providing the basis to use the data for climate change impact assessments. The tool is free and publicly available at http://www.isa.ulisboa.pt/proj/clipick/. Research Highlights: A new and easy-to-use tool is presented that will promote the use of climate change scenarios across Europe, especially when daily time steps are needed. CliPick eases the communication between the climatic and modelling communities, such as agriculture and forestry.

  9. Climate change web picker. A tool bridging daily climate needs in process based modelling in forestry and agriculture

    International Nuclear Information System (INIS)

    Palma, J.H.N.

    2017-01-01

    Aim of study: Climate data are needed for many types of modeling assessments, especially those involving process-based modeling focused on climate change impacts. However, there is a scarcity of tools delivering easy access to climate datasets for use in biologically related modeling. This study aimed at the development of a tool providing a user-friendly interface to facilitate access to the climate datasets that are used to supply climate scenarios for the Intergovernmental Panel on Climate Change. Area of study: The tool provides daily datasets across Europe and also parts of northern Africa. Material and Methods: The tool uses climatic datasets generated from third-party sources (IPCC related), while a web-based interface was developed in JavaScript to ease access to the datasets. Main Results: The interface delivers daily (or monthly) climate data for a user-defined location in Europe for 7 climate variables: minimum and maximum temperature, precipitation, radiation, minimum and maximum relative humidity, and wind speed. The time frame ranges from 1951 to 2100, providing the basis to use the data for climate change impact assessments. The tool is free and publicly available at http://www.isa.ulisboa.pt/proj/clipick/. Research Highlights: A new and easy-to-use tool is presented that will promote the use of climate change scenarios across Europe, especially when daily time steps are needed. CliPick eases the communication between the climatic and modelling communities, such as agriculture and forestry.

  10. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Through experiments, this paper studies thermal error testing and intelligent modeling for the spindle of vertical high-speed CNC machine tools, a current focus of research on machine tool thermal error. Several thermal error testing devices are designed, in which 7 temperature sensors are used to measure the temperature of the machine tool spindle system and 2 displacement sensors are used to detect the thermal error displacement. A thermal error compensation model with good inverse-prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network techniques.
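
    The pipeline described above (PCA over correlated temperature sensors, then a model mapping the reduced features to thermal displacement) can be sketched as follows. The data are synthetic, and an ordinary least-squares fit stands in for the paper's artificial neural network:

    ```python
    import numpy as np

    # Synthetic spindle data: 7 temperature sensors sharing one warm-up trend,
    # and a thermal displacement (µm) driven by the same trend.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4, 200)                       # hours of spindle running
    base = 1 - np.exp(-t / 1.5)                      # common warm-up trend
    temps = base[:, None] * rng.uniform(5, 15, 7) + rng.normal(0, 0.1, (200, 7))
    displacement = 12.0 * base + rng.normal(0, 0.2, 200)

    # PCA via SVD on the centred temperature matrix: keep 95% of variance
    X = temps - temps.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    explained = S**2 / np.sum(S**2)
    k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
    scores = X @ Vt[:k].T                            # reduced features

    # Stand-in predictor: least squares from PCA scores to displacement
    A = np.column_stack([scores, np.ones(len(t))])
    coef, *_ = np.linalg.lstsq(A, displacement, rcond=None)
    rmse = float(np.sqrt(np.mean((A @ coef - displacement) ** 2)))
    print(k, round(rmse, 2))  # one dominant component is expected here
    ```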

  11. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  12. A regionally-linked, dynamic material flow modelling tool for rolled, extruded and cast aluminium products

    DEFF Research Database (Denmark)

    Bertram, M.; Ramkumar, S.; Rechberger, H.

    2017-01-01

    A global aluminium flow modelling tool, comprising nine trade-linked regions, namely China, Europe, Japan, Middle East, North America, Other Asia, Other Producing Countries, South America and Rest of World, has been developed. The purpose of the Microsoft Excel-based tool is the quantification of regional stocks and flows of rolled, extruded and casting alloys across space and over time, giving the industry the ability to evaluate the potential to recycle aluminium scrap most efficiently. The International Aluminium Institute will update the tool annually and publish a visualisation of results...
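
    The accounting at the core of such a dynamic, regionally linked material flow tool reduces to a per-region, per-alloy stock update: next year's in-use stock is this year's stock plus inflow minus end-of-life outflow. The sketch below uses invented numbers and a single end-of-life rate purely for illustration:

    ```python
    # Minimal dynamic material-flow accounting: stock(t+1) = stock(t) + inflow - outflow,
    # tracked per (region, alloy class). All numbers are invented.
    regions = ["China", "Europe", "North America"]
    alloys = ["rolled", "extruded", "cast"]

    stock = {(r, a): 100.0 for r in regions for a in alloys}   # kt, hypothetical
    inflow = {(r, a): 10.0 for r in regions for a in alloys}   # kt/yr, hypothetical
    eol_rate = 0.03   # fraction of in-use stock leaving use each year

    for year in range(2017, 2020):            # three annual updates
        for key in stock:
            outflow = eol_rate * stock[key]   # end-of-life scrap
            stock[key] = stock[key] + inflow[key] - outflow

    print(round(stock[("Europe", "rolled")], 1))  # 120.4
    ```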

  13. Mediterranea Forecasting System: a focus on wave-current coupling

    Science.gov (United States)

    Clementi, Emanuela; Delrosso, Damiano; Pistoia, Jenny; Drudi, Massimiliano; Fratianni, Claudia; Grandi, Alessandro; Pinardi, Nadia; Oddo, Paolo; Tonani, Marina

    2016-04-01

    The Mediterranean Forecasting System (MFS) is a numerical ocean prediction system that produces analyses, reanalyses and short-term forecasts for the entire Mediterranean Sea and its adjacent Atlantic Ocean areas. MFS became operational in the late 90's, has been developed and continuously improved in the framework of a series of EU- and nationally funded programs, and is now part of the Copernicus Marine Service. The MFS is composed of the hydrodynamic model NEMO (Nucleus for European Modelling of the Ocean) 2-way coupled with the third-generation wave model WW3 (WaveWatchIII), implemented in the Mediterranean Sea at 1/16° horizontal resolution and forced by ECMWF atmospheric fields. The model solutions are corrected by the data assimilation system (a 3D variational scheme adapted to the oceanic assimilation problem) with a daily assimilation cycle, using a background error correlation matrix varying seasonally and in different sub-regions of the Mediterranean Sea. The focus of this work is to present the latest modelling system upgrades and the related improvements achieved. In order to evaluate the performance of the coupled system, a set of experiments has been built by coupling the wave and circulation models, which hourly exchange the following fields: the sea surface currents and the air-sea temperature difference are transferred from NEMO to WW3, modifying respectively the mean momentum transfer of waves and the wind speed stability parameter, while the neutral drag coefficient computed by WW3 is passed to NEMO, which computes the turbulent component. In order to validate the modelling system, numerical results have been compared with in-situ and remote sensing data. This work suggests that a coupled model may be capable of a better description of wave-current interactions; in particular, feedback from the ocean to the waves may improve the prediction of wave characteristics, and suggests proceeding toward a fully
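
    The hourly two-way exchange described above can be summarized as a coupling loop. The code below is a schematic, runnable toy with placeholder formulas, not NEMO/WW3 code; every coefficient in it is invented:

    ```python
    # Toy two-way coupling cycle: the ocean side sends surface current and the
    # air-sea temperature difference to the wave side; the wave side returns a
    # neutral drag coefficient used in the ocean's momentum input.
    def wave_model_step(wind_speed, current, d_t_air_sea):
        # Placeholder: waves feel wind relative to the surface current,
        # scaled by a crude stability factor from the air-sea temperature gap.
        effective_wind = (wind_speed - current) * (1 + 0.01 * d_t_air_sea)
        return 1.0e-3 + 1.0e-4 * effective_wind      # toy neutral drag law

    def ocean_model_step(current, cd_neutral, wind_speed):
        # Placeholder momentum input: wind stress accelerates the current.
        return current + 1e-2 * cd_neutral * wind_speed**2

    current, wind, d_t = 0.1, 10.0, 2.0              # m/s, m/s, K (invented)
    for hour in range(3):                            # three hourly coupling cycles
        cd = wave_model_step(wind, current, d_t)
        current = ocean_model_step(current, cd, wind)
    print(round(current, 3))  # 0.106
    ```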

  14. Interdisciplinary semantic model for managing the design of a steam-assisted gravity drainage tooling system

    Directory of Open Access Journals (Sweden)

    Michael Leitch

    2018-01-01

    Full Text Available Complex engineering systems often require extensive coordination between different expert areas in order to avoid costly design iterations and rework. Cyber-physical system (CPS) engineering methods can provide valuable insights to help model these interactions and optimize the design of such systems. In this work, steam-assisted gravity drainage (SAGD), a complex oil extraction process that requires deep understanding of several physical-chemical phenomena, is examined, and the complexities and interdependencies of the system are explored. Based on an established unified feature modeling scheme, a software modeling framework is proposed to manage the design process of the production tools used for SAGD oil extraction. Applying CPS methods to unify complex phenomena and engineering models, the proposed CPS model combines effective simulation with embedded knowledge of completion tooling design in order to optimize reservoir performance. The system design is expressed using graphical diagrams of the Unified Modelling Language (UML) convention. To demonstrate the capability of this system, a distributed research group is described and their activities coordinated using the described CPS model.

  15. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or the marketing branch. The paper contributes to highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  16. PORFIDO: Oceanographic data for neutrino telescopes

    International Nuclear Information System (INIS)

    Cordelli, Marco; Martini, Agnese; Habel, Roberto; Trasatti, Luciano

    2011-01-01

    PORFIDO (Physical Oceanography by RFID Outreach) is a system designed to be installed in the optical modules of the NEMO experiment and, possibly, of future underwater neutrino telescopes, to gather oceanographic data with a minimum of disturbance to the main project and on a very limited budget. The system gathers oceanographic data (temperature, etc.) from passive RFID tags (WISPs) attached to the outside of the NEMO optical modules, with an RF reader situated inside the glass sphere, without the need for connectors or penetrators, which are very expensive and offer low reliability. Ten PORFIDOs will be deployed with the NEMO Phase 2 tower in 2011.

  17. PORFIDO: Oceanographic data for neutrino telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Cordelli, Marco; Martini, Agnese; Habel, Roberto [INFN-Laboratori Nazionali di Frascati, Via E. Fermi 40, I-00044 Frascati (Italy); Trasatti, Luciano, E-mail: luciano.trasatti@gmail.co [INFN-Laboratori Nazionali di Frascati, Via E. Fermi 40, I-00044 Frascati (Italy)

    2011-01-21

    PORFIDO (Physical Oceanography by RFID Outreach) is a system designed to be installed in the optical modules of the NEMO experiment and, possibly, of future underwater neutrino telescopes, to gather oceanographic data with a minimum of disturbance to the main project and on a very limited budget. The system gathers oceanographic data (temperature, etc.) from passive RFID tags (WISPs) attached to the outside of the NEMO optical modules, with an RF reader situated inside the glass sphere, without the need for connectors or penetrators, which are very expensive and offer low reliability. Ten PORFIDOs will be deployed with the NEMO Phase 2 tower in 2011.

  18. Mathematical modelling of migration: A suitable tool for the enforcement authorities?

    DEFF Research Database (Denmark)

    Petersen, Jens Højslev; Trier, Xenia Thorsager; Fabech, B.

    2005-01-01

    A few years ago, it became accepted that the plastics industry could use migration modelling for compliance testing. When a calculation confirms that the migration of a compound from a plastic material or article is below the specific migration limit, this is considered sufficient documentation... possibilities of implementing migration-modelling software as a tool in official food control and possibly in improving the own-check programmes of Danish plastic-converting plants. Food inspectors from nine regional food control centres initially attended a training course in the use of a commercial modelling... reason was a lack of information from those in the raw material supply chain who considered their products protected by commercial confidentiality. In general, the food inspectors were in favour of using migration modelling for future control visits.

  19. Techniques for the construction of an elliptical-cylindrical model using circular rotating tools in non CNC machines

    International Nuclear Information System (INIS)

    Villalobos Mendoza, Brenda; Cordero Davila, Alberto; Gonzalez Garcia, Jorge

    2011-01-01

    This paper describes the construction of an elliptical-cylindrical model without spherical aberration using vertical rotating tools. The motor of the circular tool is mounted on an arm so that the tool rests on the surface, and the arm in turn is moved by an X-Y table. The test method and the computer algorithms that predict the desired wear are described.

  20. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    Becker, Janine; Zankl, Maria; Petoussi-Henss, Nina

    2007-01-01

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and locations of organs of virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers that are arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an already existing voxel model is a complicated process that raises many problems, and the new software tool presented here was developed to solve them in an easy way. If the organs are modified, no bit of tissue, i.e. voxel, may vanish, nor should an extra one appear; that means organs cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. the skin. In the software tool described here, the modifications are done by semi-automatic routines but include human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to provide a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
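
    The reassignment principle (relabel voxels between neighbouring tissues rather than delete or create them, so the total voxel count is conserved) can be sketched as follows. The organ IDs, the tiny single-slice array and the growth rule are invented for illustration; these are not VolumeChange's actual routines:

    ```python
    import numpy as np

    LIVER, ADIPOSE = 5, 9        # hypothetical organ identification numbers

    model = np.full((8, 8), ADIPOSE, dtype=np.int16)
    model[2:5, 2:5] = LIVER      # a 3x3 "organ" in one slice

    def grow_organ(voxels, organ_id, donor_id, n):
        """Relabel up to n donor voxels touching the organ (4-neighbourhood)."""
        out = voxels.copy()
        relabelled = 0
        for _ in range(n):                        # grow one boundary voxel at a time
            organ = out == organ_id
            nbr = np.zeros_like(organ)            # voxels adjacent to the organ
            nbr[1:, :] |= organ[:-1, :]; nbr[:-1, :] |= organ[1:, :]
            nbr[:, 1:] |= organ[:, :-1]; nbr[:, :-1] |= organ[:, 1:]
            candidates = np.argwhere(nbr & (out == donor_id))
            if len(candidates) == 0:
                break
            i, j = candidates[0]
            out[i, j] = organ_id                  # relabel; never delete or add
            relabelled += 1
        return out, relabelled

    grown, n = grow_organ(model, LIVER, ADIPOSE, 4)
    print(int((grown == LIVER).sum()), grown.size == model.size)  # 13 True
    ```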

  1. Analysis method for the search for neutrinoless double beta decay in the NEMO3 experiment: study of the background and first results; Methode d'analyse pour la recherche de la double desintegration {beta} sans emission de neutrinos dans l'experience NEMO3. Etude du bruit de fond et premiers resultats

    Energy Technology Data Exchange (ETDEWEB)

    Etienvre, A.I

    2003-04-15

    The NEMO3 detector, installed in the Frejus Underground Laboratory, is dedicated to the study of neutrinoless double beta decay: the observation of this process would signal the massive and Majorana nature of the neutrino. The experiment consists of very thin central source foils (the total mass is equal to 10 kg), a tracking detector made of drift cells operating in Geiger mode, a calorimeter made of plastic scintillators associated with photomultipliers, a coil producing a 30 gauss magnetic field, and two shields dedicated to the reduction of the {gamma}-ray and neutron fluxes. In the first part, I describe the implications of several mechanisms, related to trilinear R-parity violation, for double beta decay. The second part is dedicated to a detailed study of the tracking detector of the experiment: after a description of the different working tests, I present the determination of the characteristics of the track reconstruction (transverse and longitudinal resolution by Geiger cell, precision of vertex determination, and charge recognition). The last part corresponds to the analysis of the data taken by the experiment. On the one hand, an upper limit on the Tl{sup 208} activity of the sources has been determined: it is lower than 68 mBq/kg at 90% confidence level. On the other hand, I have developed and tested on these data a method to analyse the neutrinoless double beta decay signal; this method is based on a maximum likelihood using all the available information. Using this method, I could determine a first and very preliminary upper limit on the effective mass of the neutrino. (author)
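
    As background to the limit-setting language used above, the simplest textbook construction is the classical Poisson upper limit for a counting experiment. The sketch below shows that construction only; it is not the maximum-likelihood method developed in the thesis:

    ```python
    import math

    # Classical 90% CL Poisson upper limit on a signal mean s, given n
    # observed events and no background: the smallest s for which
    # P(N <= n | s) drops below 1 - CL.
    def poisson_cdf(n, mu):
        return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

    def upper_limit(n_obs, cl=0.90, step=1e-3):
        s = 0.0
        while poisson_cdf(n_obs, s) > 1 - cl:
            s += step
        return s

    print(round(upper_limit(0), 2))  # 2.3 — the textbook 2.30-event limit for n=0
    ```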

  2. Coloured Petri Nets and CPN Tools for Modelling and Validation of Concurrent Systems

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kristensen, Lars Michael; Wells, Lisa Marie

    2007-01-01

    Coloured Petri Nets (CPNs) is a language for the modelling and validation of systems in which concurrency, communication, and synchronisation play a major role. Coloured Petri Nets is a discrete-event modelling language combining Petri Nets with the functional programming language Standard ML. Petr... with user-defined Standard ML functions. A license for CPN Tools can be obtained free of charge, also for commercial use...
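
    The execution semantics underlying Petri nets (a transition fires when all its input places hold enough tokens, consuming and producing tokens) can be shown in a few lines. This untyped sketch omits what makes CPNs "coloured", namely typed tokens and Standard ML expressions on arcs:

    ```python
    # Minimal untyped Petri-net firing rule: a transition is enabled when every
    # input place holds enough tokens; firing consumes and produces tokens.
    marking = {"ready": 2, "resource": 1, "done": 0}

    transitions = {
        "work": {"consume": {"ready": 1, "resource": 1},
                 "produce": {"done": 1, "resource": 1}},
    }

    def enabled(t):
        return all(marking[p] >= n for p, n in transitions[t]["consume"].items())

    def fire(t):
        assert enabled(t), f"{t} is not enabled"
        for p, n in transitions[t]["consume"].items():
            marking[p] -= n
        for p, n in transitions[t]["produce"].items():
            marking[p] += n

    fire("work")   # the shared resource is returned, so "work" can fire again
    fire("work")
    print(marking)  # {'ready': 0, 'resource': 1, 'done': 2}
    ```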

  3. Modelling raw water quality: development of a drinking water management tool.

    Science.gov (United States)

    Kübeck, Ch; van Berk, W; Bergmann, A

    2009-01-01

    Ensuring future drinking water supply requires rigorous management of groundwater resources. However, recent practices of economic resource control often do not involve aspects of the hydrogeochemical and geohydraulic groundwater system. With respect to analysing the available quantity and quality of future raw water, effective resource management requires a full understanding of the hydrogeochemical and geohydraulic processes within the aquifer. For example, knowledge of how raw water quality develops over time helps to work out strategies for water treatment as well as to plan financial resources. On the other hand, the effectiveness of planned measures to reduce the infiltration of harmful substances such as nitrate can be checked and optimized by hydrogeochemical modelling. Thus, within the framework of the InnoNet program funded by the Federal Ministry of Economics and Technology, a network of research institutes and water suppliers is working in close cooperation to develop a planning and management tool particularly oriented towards water management problems. The tool involves an innovative material flux model that calculates the hydrogeochemical processes under consideration of the dynamics in agricultural land use. The program's integrated graphical data evaluation is aligned with the needs of water suppliers.

  4. Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?

    Science.gov (United States)

    Zuberi, Aamir; Lutz, Cathleen

    2016-01-01

    Abstract The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows for genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allow for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed by which models are made available in the public domain. Predictably, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of the disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome editing tools, along with the addition of genetic diversity in new modeling

  5. Mouse Models for Drug Discovery. Can New Tools and Technology Improve Translational Power?

    Science.gov (United States)

    Zuberi, Aamir; Lutz, Cathleen

    2016-12-01

    The use of mouse models in biomedical research and preclinical drug evaluation is on the rise. The advent of new molecular genome-altering technologies such as CRISPR/Cas9 allows for genetic mutations to be introduced into the germ line of a mouse faster and less expensively than previous methods. In addition, the rapid progress in the development and use of somatic transgenesis using viral vectors, as well as manipulations of gene expression with siRNAs and antisense oligonucleotides, allow for even greater exploration into genomics and systems biology. These technological advances come at a time when cost reductions in genome sequencing have led to the identification of pathogenic mutations in patient populations, providing unprecedented opportunities in the use of mice to model human disease. The ease of genetic engineering in mice also offers a potential paradigm shift in resource sharing and the speed by which models are made available in the public domain. Predictably, the knowledge alone that a model can be quickly remade will provide relief to resources encumbered by licensing and Material Transfer Agreements. For decades, mouse strains have provided an exquisite experimental tool to study the pathophysiology of the disease and assess therapeutic options in a genetically defined system. However, a major limitation of the mouse has been the limited genetic diversity associated with common laboratory mice. This has been overcome with the recent development of the Collaborative Cross and Diversity Outbred mice. These strains provide new tools capable of replicating genetic diversity approaching that found in human populations. The Collaborative Cross and Diversity Outbred strains thus provide a means to observe and characterize toxicity or efficacy of new therapeutic drugs for a given population. The combination of traditional and contemporary mouse genome editing tools, along with the addition of genetic diversity in new modeling systems

  6. CMS Partial Releases Model, Tools, and Applications. Online and Framework-Light Releases

    CERN Document Server

    Jones, Christopher D; Meschi, Emilio; Shahzad Muzaffar; Andreas Pfeiffer; Ratnikova, Natalia; Sexton-Kennedy, Elizabeth

    2009-01-01

    The CMS Software project CMSSW embraces more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities and tools. The release integration process is highly automated by using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the CMSSW full package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in Root environment require only a few core libraries and the description of CMS specific data formats. We present a model of CMS Partial Releases, used for preparation of the customized CMS software builds, including description of the tools used, the implementation, and how we deal with technical challenges, suc...

  7. Modelling tools for assessing bioremediation performance and risk of chlorinated solvents in clay tills

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia

    design are challenging. This thesis presents the development and application of analytical and numerical models to improve our understanding of transport and degradation processes in clay tills, which is crucial for assessing bioremediation performance and risk to groundwater. A set of modelling tools...... to groundwater and bioremediation performance in low-permeability media....

  8. The Applicability of Taylor’s Model to the Drilling of CFRP Using Uncoated WC-Co Tools: The Influence of Cutting Speed on Tool Wear

    OpenAIRE

    Merino Perez, J.L.; Merson, E.; Ayvar-Soberanis, S.; Hodzic, A.

    2014-01-01

    This work investigates the applicability of Taylor’s model to the drilling of CFRP using uncoated WC-Co tools by assessing the influence of cutting speed (Vc) on tool wear. Two different resins, possessing low and high glass transition temperatures (Tg), and two different reinforcements, high-strength and high-modulus woven fabrics, were combined into three different systems. The flank wear rate gradient was found to be more reinforcement-dependent, while the actual flank wear rate was ...

  9. PRISM -- A tool for modelling proton energy deposition in semiconductor materials

    International Nuclear Information System (INIS)

    Oldfield, M.K.; Underwood, C.I.

    1996-01-01

    This paper presents a description of, and test results from, a new PC-based software simulation tool, PRISM (Protons in Semiconductor Materials). The model describes proton energy deposition in complex 3D sensitive volumes of semiconductor materials. PRISM is suitable for simulating energy deposition in surface-barrier detectors and semiconductor memory devices, the latter being susceptible to Single-Event Upset (SEU) and Multiple-Bit Upset (MBU). The design methodology on which PRISM is based, together with the techniques used to simulate ion transport and energy deposition, is described. Preliminary test results used to analyze the PRISM model are presented.

  10. 3D-Printed Craniosynostosis Model: New Simulation Surgical Tool.

    Science.gov (United States)

    Ghizoni, Enrico; de Souza, João Paulo Sant Ana Santos; Raposo-Amaral, Cassio Eduardo; Denadai, Rafael; de Aquino, Humberto Belém; Raposo-Amaral, Cesar Augusto; Joaquim, Andrei Fernandes; Tedeschi, Helder; Bernardes, Luís Fernando; Jardini, André Luiz

    2018-01-01

    Craniosynostosis is a complex disease, since its treatment requires deep anatomical understanding and a minor mistake during surgery can be fatal. The objective of this report is to present novel 3-dimensional-printed polyamide craniosynostosis models that can improve the understanding and treatment of complex pathologies. The software InVesalius was used for segmentation of the anatomy images (from 3 patients between 6 and 9 months old). Afterward, the file was transferred to a 3-dimensional printing system and, with the use of an infrared laser, slices of PA 2200 powder were consecutively added to build a polyamide model of the cranial bone. The 3 craniosynostosis models allowed fronto-orbital advancement, the Pi procedure, and posterior distraction in an operating room environment. All aspects of the craniofacial anatomy could be shown on the models, as well as the most common craniosynostosis pathologic variations (sphenoid wing elevation, shallow orbits, jugular foramen stenosis). Another advantage of our model is its low cost, about 100 U.S. dollars or even less when several models are produced. Simulation is becoming an essential part of medical education, both for surgical training and for improving surgical safety through adequate planning. This new polyamide craniosynostosis model allowed the surgeons to have realistic tactile feedback when manipulating a child's bone and permitted execution of the main procedures for anatomic correction. Given its low cost, our model is an excellent option for training purposes and potentially an important new tool to improve the quality of management of patients with craniosynostosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. DESIGN AND ANALYSIS OF THE SNS CCL HOT MODEL WATER COOLING SYSTEM USING THE SINDA/FLUINT NETWORK MODELING TOOL

    Energy Technology Data Exchange (ETDEWEB)

    C. AMMERMAN; J. BERNARDIN

    1999-11-01

    This report presents results for design and analysis of the hot model water cooling system for the Spallation Neutron Source (SNS) coupled-cavity linac (CCL). The hot model, when completed, will include segments for both the CCL and coupled-cavity drift-tube linac (CCDTL). The scope of this report encompasses the modeling effort for the CCL portion of the hot model. This modeling effort employed the SINDA/FLUINT network modeling tool. This report begins with an introduction of the SNS hot model and network modeling using SINDA/FLUINT. Next, the development and operation of the SINDA/FLUINT model are discussed. Finally, the results of the SINDA/FLUINT modeling effort are presented and discussed.
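SINDA/FLUINT represents a cooling system as a network of lumped nodes joined by conductances. A minimal standalone sketch of that idea for a steady-state thermal network is shown below; the node layout, temperatures and conductance values are invented for illustration and are not taken from the SNS CCL report.

```python
# Minimal steady-state thermal-network solve in the spirit of
# lumped-parameter SINDA-style models. All values are illustrative.

def solve_network(conductances, fixed, n_nodes, iters=2000):
    """Gauss-Seidel solve of the nodal energy balance
    sum_j G_ij * (T_j - T_i) = 0 at every free node.

    conductances: dict {(i, j): G} symmetric thermal conductances [W/K]
    fixed: dict {node: T} boundary nodes held at fixed temperature [K]
    """
    G = {}
    for (i, j), g in conductances.items():
        G.setdefault(i, {})[j] = g
        G.setdefault(j, {})[i] = g
    T = [300.0] * n_nodes                 # initial guess everywhere
    for node, temp in fixed.items():
        T[node] = temp
    for _ in range(iters):
        for i in range(n_nodes):
            if i in fixed:
                continue
            # energy balance at node i: conductance-weighted neighbor mean
            T[i] = sum(g * T[j] for j, g in G[i].items()) / sum(G[i].values())
    return T

# Three nodes in series: heated cavity wall (400 K), copper structure,
# coolant water boundary (293 K), with assumed conductances.
T = solve_network({(0, 1): 2.0, (1, 2): 8.0},
                  fixed={0: 400.0, 2: 293.0}, n_nodes=3)
```

Production tools add capacitances, fluid loops and pump curves on top of exactly this kind of nodal balance.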

  12. Creating Electronic Books-Chapters for Computers and Tablets Using Easy Java/JavaScript Simulations, EjsS Modeling Tool

    OpenAIRE

    Wee, Loo Kang

    2015-01-01

    This paper shares my journey (tools used, design principles derived and modeling pedagogy implemented) in creating electronic book chapters (epub3 format) for computers and tablets using the Easy Java/JavaScript Simulations (old name EJS, new EjsS) Modeling Tool. The theory underpinning this work is grounded in learning by doing through dynamic and interactive simulation models, which are easier to make sense of than static printed materials. I started combining related co...

  13. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum parameter values for maximum liquefaction, subject to constraints on the other parameters. The resulting analysis gives a clear basis for choosing parameter values before implementation of the actual plant in the field. It also indicates the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
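As a back-of-the-envelope companion to the simulator analysis above, the liquid yield of a Linde-Hampson cycle follows from an energy balance over the heat exchanger, J-T valve and separator. The sketch below uses invented round-number enthalpies for air, not Aspen HYSYS output.

```python
# Illustrative Linde-Hampson liquid-yield estimate. The enthalpy values
# below are placeholders chosen for the example, not simulator results.

def linde_yield(h_hp_in, h_lp_out, h_liquid):
    """Liquid fraction y from the energy balance over the heat exchanger,
    J-T valve and separator:  h_in = y*h_f + (1 - y)*h_out,  hence
    y = (h_out - h_in) / (h_out - h_f).  Enthalpies in kJ/kg."""
    return (h_lp_out - h_hp_in) / (h_lp_out - h_liquid)

# Assumed values: high-pressure stream in at ambient T, low-pressure
# return gas out at ambient T, saturated liquid drawn off.
y = linde_yield(h_hp_in=270.0, h_lp_out=300.0, h_liquid=-120.0)
# y = 30 / 420, i.e. roughly 7% of the compressed stream liquefies
```

A real study would take the enthalpies from a property package (as HYSYS does internally) and then optimize the compression pressure to maximize y.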

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum parameter values for maximum liquefaction, subject to constraints on the other parameters. The resulting analysis gives a clear basis for choosing parameter values before implementation of the actual plant in the field. It also indicates the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  15. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

    Natural gas is a non-renewable energy source used by different sectors of the economy of Ceara. Its uses may be industrial, residential or commercial, as an automotive fuel, in cogeneration, and as a source for generating electricity from heat. For its practicality this energy source enjoys strong market acceptance and serves a broad list of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to reach all potential clients interested in this source of energy. To facilitate the design, analysis and expansion of the distribution network, and the location of bottlenecks and breaks in it, modeling software is used that allows the network operator to manage the various kinds of information about the network. This paper presents the advantages of modeling the gas distribution networks of natural gas companies in Ceara, describing the tool used, the steps necessary for implementing the models, the benefits of using the software, and the findings obtained from its use. (author)

  16. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve systems of differential equations describing the transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation-solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation-solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was made of the biosphere modules for dose assessment used in the earlier safety assessment project SR 97. Results acquired for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification, with equations and parameter values, so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the
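The core numerical task described above, solving coupled decay equations for radionuclide chains, has a well-known analytic benchmark in the two-member Bateman solution. The sketch below is a standalone illustration of that kind of system; the half-lives are arbitrary example values, not parameters from Tensit or SR 97.

```python
import math

# Two-member radioactive decay chain  N1 -> N2 -> (stable), solved with
# the analytic Bateman formula. Half-lives below are illustrative only.

def bateman_pair(n1_0, lam1, lam2, t):
    """Return (N1, N2) at time t for initial parent inventory n1_0."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

lam1 = math.log(2) / 30.0   # parent half-life: 30 years (assumed)
lam2 = math.log(2) / 5.0    # daughter half-life: 5 years (assumed)
n1, n2 = bateman_pair(1.0, lam1, lam2, t=10.0)
```

Closed-form cases like this are exactly what one checks a numerical solver (Simulink blocks, in Tensit's case) against before trusting it on larger chains.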

  17. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    International Nuclear Information System (INIS)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve systems of differential equations describing the transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation-solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation-solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was made of the biosphere modules for dose assessment used in the earlier safety assessment project SR 97. Results acquired for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification, with equations and parameter values, so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the perspective of

  18. Numerical modelling of tools steel hardening. A thermal phenomena and phase transformations

    Directory of Open Access Journals (Sweden)

    T. Domański

    2010-01-01

    Full Text Available In this paper a model of tool steel hardening that takes into account thermal phenomena and phase transformations in the solid state is presented. In the modelling of the thermal phenomena, the heat transfer equation has been solved by the finite element method. The continuous heating (CHT) and continuous cooling (CCT) diagrams of the considered steel are used in the model of phase transformations. The phase fractions transformed during continuous heating (austenite) and continuous cooling (pearlite or bainite) are described in the model by the Johnson-Mehl-Avrami formula. For cooling rates >100 K/s the modified Koistinen-Marburger equation is used, which determines the fraction of martensite formed.
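The two phase-fraction formulas named in this abstract have simple closed forms, sketched below. The coefficients (k, n, Ms, alpha) are generic illustrative values, not the values calibrated in the paper.

```python
import math

# Sketch of the two phase-transformation kinetics formulas mentioned
# above. All coefficients here are assumed, for illustration only.

def jmak_fraction(k, n, t):
    """Johnson-Mehl-Avrami (JMAK): transformed fraction of a diffusive
    product phase (e.g. pearlite) after isothermal hold time t [s]."""
    return 1.0 - math.exp(-k * t ** n)

def koistinen_marburger(Ms, T, alpha=0.011):
    """Koistinen-Marburger: martensite fraction after quenching to
    temperature T [deg C] below the martensite-start temperature Ms.
    alpha = 0.011 1/K is the classic value for many steels."""
    return 1.0 - math.exp(-alpha * (Ms - T))

fp = jmak_fraction(k=1e-4, n=2.5, t=30.0)    # e.g. pearlite after 30 s
fm = koistinen_marburger(Ms=330.0, T=20.0)   # quench to room temperature
```

In a full FEM hardening model these fractions are evaluated at every node and time step, driven by the local thermal history from the heat transfer solve.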

  19. GEOQUIMICO: an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K_D)

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K_D approach has been the method of choice due to its ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models; it currently supports the K_D and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given
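To make concrete why the choice of sorption model matters: under a constant-K_D assumption, sorption enters transport only through a single retardation factor, computed below with assumed soil parameters (illustrative values, not from the report).

```python
# Constant-Kd sorption collapses to a retardation factor R; an SCM would
# instead recompute sorption from local chemistry (pH, site densities,
# competing ions) at every grid cell and time step. Values are assumed.

def retardation_factor(kd_mL_g, bulk_density_g_cm3, porosity):
    """R = 1 + (rho_b / theta) * Kd, with Kd in mL/g (= cm^3/g)."""
    return 1.0 + (bulk_density_g_cm3 / porosity) * kd_mL_g

R = retardation_factor(kd_mL_g=5.0, bulk_density_g_cm3=1.6, porosity=0.3)
# The sorbing plume migrates at the groundwater velocity divided by R.
```

The regulatory question the tool addresses is essentially whether this single number is adequate, or whether spatially and chemically variable sorption changes the answer.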

  20. A tool to convert CAD models for importation into Geant4

    Science.gov (United States)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.

  1. Finding Difference: Nemo and Friends Opening the Door to Disability Theory

    Science.gov (United States)

    Preston, Daniel L.

    2010-01-01

    While middle school and high school students may have watched the Disney and Disney/Pixar films when they were younger, chances are they did not do so with a critical eye toward difference and disability, despite the fact that these films serve as excellent tools for teaching about difference. Recent estimates label 20% of the world's population…

  2. The neutron porosity tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1988-01-01

    The report contains a review of available information on neutron porosity tools with the emphasis on dual thermal-neutron-detector porosity tools and epithermal-neutron-detector porosity tools. The general principle of such tools is discussed and theoretical models are very briefly reviewed. Available data on tool designs are summarized with special regard to the source-detector distance. Tool operational data, porosity determination and correction of measurements are briefly discussed. (author) 15 refs

  3. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Full Text Available Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  4. Research on Error Modelling and Identification of 3 Axis NC Machine Tools Based on Cross Grid Encoder Measurement

    International Nuclear Information System (INIS)

    Du, Z C; Lv, C F; Hong, M S

    2006-01-01

    A new error modelling and identification method based on the cross grid encoder is proposed in this paper. In general, there are 21 error components in the geometric error of 3-axis NC machine tools. However, according to our theoretical analysis, the squareness error among different guideways affects not only the translational error components but also the rotational ones. Therefore, a revised synthetic error model is developed, and the mapping relationship between the error components and the radial motion error of a round workpiece manufactured on the NC machine tool is deduced. This mapping relationship shows that the radial error of circular motion is the combined result of all the error components of the link, worktable, sliding table and main spindle block. To overcome the solution-singularity shortcoming of traditional error-component identification methods, a new multi-step identification method using cross grid encoder measurement is proposed, based on the kinematic error model of the NC machine tool. First, the 12 translational error components of the NC machine tool are measured and identified by the least squares method (LSM) while the machine performs linear motions in the three orthogonal planes: the XOY, XOZ and YOZ planes. Second, the circular error tracks are measured while the machine performs circular motions in the same orthogonal planes using the cross grid encoder Heidenhain KGM 182, from which the 9 rotational error components are identified by LSM. Finally, experimental validation of the above modelling theory and identification method is carried out on the 3-axis CNC vertical machining centre Cincinnati 750 Arrow. All 21 error components were successfully measured by this method. This research shows that the multi-step modelling and identification method is well suited to on-machine measurement.
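The identification step described above ultimately reduces to least squares: stacking measurements into a linear system and solving for the error parameters. The toy sketch below fits a hypothetical straight-line positioning-error model e(z) = a + b*z to made-up measurements via the 2x2 normal equations; it illustrates the LSM step only, not the paper's full 21-component model.

```python
# Toy least-squares identification of a linear axis-error model
# e(z) = a + b*z (offset + linear scale error). Data are invented.

def fit_line(zs, es):
    """Solve the 2x2 normal equations of ordinary least squares."""
    n = len(zs)
    sz = sum(zs); szz = sum(z * z for z in zs)
    se = sum(es); sze = sum(z * e for z, e in zip(zs, es))
    det = n * szz - sz * sz
    a = (se * szz - sz * sze) / det   # offset term
    b = (n * sze - sz * se) / det     # slope, e.g. microns per mm
    return a, b

zs = [0.0, 100.0, 200.0, 300.0]       # commanded positions [mm]
es = [1.0, 2.0, 3.0, 4.0]             # measured errors [um], exactly linear
a, b = fit_line(zs, es)               # a = 1.0, b = 0.01
```

The multi-step scheme in the paper plays the same game with a much larger design matrix, choosing measurement motions so that the system stays well-conditioned (avoiding the singularity that plagues one-shot identification).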

  5. Empirical flow parameters : a tool for hydraulic model validity

    Science.gov (United States)

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were: (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and empirical distributions of the various flow parameters, providing a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
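Two of the ancillary values named in objective (3) are direct one-line computations from quantities a hydraulic model already provides; the example numbers below are invented for illustration.

```python
import math

# Standard open-channel formulas for two of the ancillary parameters
# mentioned above. Input values are assumed, not project data.

def froude_number(velocity_m_s, depth_m, g=9.81):
    """Fr = V / sqrt(g*D): Fr < 1 subcritical, Fr > 1 supercritical."""
    return velocity_m_s / math.sqrt(g * depth_m)

def stream_power(discharge_m3_s, slope, gamma=9810.0):
    """Total stream power per unit channel length, Omega = gamma*Q*S [W/m],
    with gamma the unit weight of water (~9810 N/m^3)."""
    return gamma * discharge_m3_s * slope

Fr = froude_number(velocity_m_s=1.2, depth_m=0.9)     # subcritical flow
Om = stream_power(discharge_m3_s=15.0, slope=0.002)   # 294.3 W/m
```

A modeled velocity implying, say, Fr > 2 at a mild-slope Texas crossing is exactly the kind of "way off" result the empirical distributions are meant to flag.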

  6. Mathematical modeling of physiological systems: an essential tool for discovery.

    Science.gov (United States)

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.

  7. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population-level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers the spatial aspects of a model, allowing for the description of intricate micro-scale structures and enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI)-recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.
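The essence of the agent-based approach described above is per-cell state updated each timestep in continuous space. BSim itself is a Java framework; the standalone toy below only illustrates the idea, using a run-and-tumble motility rule with made-up parameters.

```python
import math
import random

# Minimal agent-based sketch of bacterial run-and-tumble motility.
# All parameters (speed, tumble probability, timestep) are assumed.

random.seed(0)

class Bacterium:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = random.uniform(0.0, 2.0 * math.pi)

    def step(self, dt=0.1, speed=20.0, tumble_prob=0.1):
        # Tumble: occasionally pick a fresh random heading,
        # otherwise keep running straight at constant speed.
        if random.random() < tumble_prob:
            self.heading = random.uniform(0.0, 2.0 * math.pi)
        self.x += speed * math.cos(self.heading) * dt
        self.y += speed * math.sin(self.heading) * dt

population = [Bacterium() for _ in range(50)]
for _ in range(100):
    for cell in population:
        cell.step()
# Tumbling randomizes direction, so the population spreads diffusively
# rather than ballistically -- a single-cell rule with a population-level
# consequence, which is the relationship tools like BSim are built to probe.
```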

  8. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    Science.gov (United States)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, high measured concentrations remain, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
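The UA/SA workflow described above can be illustrated on a toy model: sample uncertain inputs, propagate them through the model (UA), then apportion the output variance among the inputs (SA). Everything below is a generic Monte Carlo sketch; the real SHERPA source-receptor model is far more complex.

```python
import random

# Toy UA/SA demonstration on y = 3*x1 + x2 with independent standard-normal
# inputs. For this linear model the output variance decomposes exactly as
# 3^2 + 1^2 = 10, so x1 carries a first-order sensitivity index of 0.9.

random.seed(1)

def toy_model(x1, x2):
    return 3.0 * x1 + 1.0 * x2   # x1 deliberately the dominant input

# Uncertainty analysis: Monte Carlo propagation of input uncertainty.
ys = [toy_model(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20000)]
mean = sum(ys) / len(ys)
variance = sum((y - mean) ** 2 for y in ys) / len(ys)   # close to 10

# Sensitivity analysis (analytic here, sampled in practice): the share of
# output variance attributable to x1 alone.
s1_x1 = 9.0 / 10.0
```

For a nonlinear model like SHERPA the indices cannot be read off the coefficients, which is why sampling-based estimators are used instead, but the interpretation of the resulting ranking is the same.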

  9. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of settings for the generation of discharges in stable micro-EDM conditions on the phenomenon of tool electrode wear. A stable sparking...... a condition for minimum tool wear for this micro-EDM process configuration....

  10. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have a great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for the parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  11. Logic models as a tool for sexual violence prevention program development.

    Science.gov (United States)

    Hawkins, Stephanie R; Clinton-Sherrod, A Monique; Irvin, Neil; Hart, Laurie; Russell, Sarah Jane

    2009-01-01

    Sexual violence is a growing public health problem, and there is an urgent need to develop sexual violence prevention programs. Logic models have emerged as a vital tool in program development. The Centers for Disease Control and Prevention funded an empowerment evaluation designed to work with programs focused on the prevention of first-time male perpetration of sexual violence, and it included, as one of its goals, the development of program logic models. Two case studies are presented that describe how significant positive changes can be made to programs as a result of their developing logic models that accurately describe desired outcomes. The first case study describes how the logic model development process made an organization aware of the importance of a program's environmental context for program success; the second case study demonstrates how developing a program logic model can elucidate gaps in organizational programming and suggest ways to close those gaps.

  12. Initialization methods and ensembles generation for the IPSL GCM

    Science.gov (United States)

    Labetoulle, Sonia; Mignot, Juliette; Guilyardi, Eric; Denvil, Sébastien; Masson, Sébastien

    2010-05-01

    The protocol used and developments made for decadal and seasonal predictability studies at IPSL (Paris, France) are presented. The strategy chosen is to initialize the IPSL-CM5 (NEMO ocean and LMDZ atmosphere) model only at the ocean-atmosphere interface, following the guidance and expertise gained from ocean-only NEMO experiments. Two novel approaches are presented for initializing the coupled system. First, a nudging of sea surface temperature and wind stress towards available reanalysis is made with the surface salinity climatologically restored. Second, the heat, salt and momentum fluxes received by the ocean model are computed as a linear combination of the fluxes computed by the atmospheric model and by a CORE-style bulk formulation using up-to-date reanalysis. The steps that led to these choices are presented, as well as a description of the code adaptation and a comparison of the computational cost of both methods. The strategy for the generation of ensembles at the end of the initialization phase is also presented. We show how the technical environment of IPSL-CM5 (LibIGCM) was modified to achieve these goals.
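The two initialization ideas can be sketched schematically. The relaxation timescale, blending weight, and all field values below are hypothetical; NEMO's actual nudging and flux-blending machinery is far more elaborate:

```python
# 1) Newtonian relaxation ("nudging") of model SST toward a reanalysis
#    value: dT/dt includes a term -(T - T_obs) / tau.
# 2) Blending coupled-model and bulk-formula surface fluxes linearly.
def nudge_sst(t_model, t_obs, tau_days, dt_days=1.0):
    # one explicit Euler step of the relaxation term
    return t_model - dt_days * (t_model - t_obs) / tau_days

def blended_flux(flux_atm, flux_bulk, alpha=0.5):
    # alpha = 1 recovers the fully coupled flux, alpha = 0 the bulk flux
    return alpha * flux_atm + (1.0 - alpha) * flux_bulk

sst = 20.0
for _day in range(30):
    sst = nudge_sst(sst, t_obs=18.0, tau_days=10.0)
print(round(sst, 2))                  # relaxes from 20 degC toward 18 degC
print(blended_flux(10.0, 20.0, 0.5))  # halfway between the two flux estimates
```

The relaxation timescale tau controls how strongly the coupled state is pulled toward the reanalysis during the initialization phase.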

  13. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  14. A trade-off analysis design tool. Aircraft interior noise-motion/passenger satisfaction model

    Science.gov (United States)

    Jacobson, I. D.

    1977-01-01

    A design tool was developed to enhance aircraft passenger satisfaction. The effect of aircraft interior motion and noise on passenger comfort and satisfaction was modelled. Effects of individual aircraft noise sources were accounted for, and the impact of noise on passenger activities and noise levels to safeguard passenger hearing were investigated. The motion noise effect models provide a means for tradeoff analyses between noise and motion variables, and also provide a framework for optimizing noise reduction among noise sources. Data for the models were collected onboard commercial aircraft flights and specially scheduled tests.

  15. Modeling in the Classroom: An Evolving Learning Tool

    Science.gov (United States)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called "isee Player" is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth Science System follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) of the system that displays all of system's central questions, components, relationships and required inputs. 
At this stage
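The four-step procedure above ends, in STELLA, with a visual stock-and-flow diagram; reduced to code, such a model is just a few lines of numerical integration. A minimal "bathtub" example with invented values (not taken from the workshops described):

```python
# A stock-and-flow model of the kind STELLA builds visually: one stock,
# a constant inflow, and an outflow proportional to the stock, stepped
# forward with explicit Euler integration.
stock = 100.0                  # e.g. water volume in a reservoir
inflow, out_frac = 5.0, 0.10   # constant inflow; fractional outflow rate
dt, history = 1.0, []

for _step in range(50):
    outflow = out_frac * stock         # the flow depends on the stock
    stock += (inflow - outflow) * dt   # Euler update of the stock
    history.append(stock)

print(round(history[-1], 1))   # approaches equilibrium inflow/out_frac = 50
```

Students can change `inflow` or `out_frac` and rerun to see the new equilibrium, which is the same experiment the "isee Player" enables with published STELLA models.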

  16. Study of coastal line change modelling around the NPP site of Muria Peninsula

    International Nuclear Information System (INIS)

    Tumpal Pahala Tua Sinaga; Henu Susiati

    2007-01-01

    Coastal areas are constantly changing where the energies of the sea and the land meet. These changes take the form of advance and retreat of the coastline, driven by coastal sediment transport, namely longshore sediment transport and cross-shore sediment transport. This research aimed to model the sediment transport rate, direction and volume, and to identify the abrasion and accretion areas on the Muria Peninsula. Sampling followed a purposive sampling method, and data were processed using the NEMOS software. Overall, the sediment transport model for the Muria Peninsula gave transport rates of Q+ = 2,471,331.00 m³/year, Q- = -1,325,456.80 m³/year, Qgs = 3,796,792.60 m³/year and Qnet = 1,145,874.40 m³/year; the average abrasion and accretion distances were -0.982 m/year and 0.770 m/year, and the transport volumes were 13,431.15 m³/year to the right and -7,203.53 m³/year to the left. (author)
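As a quick arithmetic cross-check (values copied from the abstract): the net littoral drift is the signed sum of the two directional components and the gross drift is the sum of their magnitudes, and both agree with the reported figures to within rounding in the source:

```python
# Consistency check of the reported littoral drift budget.
q_plus = 2471331.00    # m3/year, transport in the positive direction
q_minus = -1325456.80  # m3/year, transport in the negative direction

q_net = q_plus + q_minus               # signed sum
q_gross = abs(q_plus) + abs(q_minus)   # sum of magnitudes

print(round(q_net, 2))    # close to the reported 1,145,874.40 m3/year
print(round(q_gross, 2))  # close to the reported 3,796,792.60 m3/year
```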

  17. Model-Based Fault Diagnosis Techniques Design Schemes, Algorithms and Tools

    CERN Document Server

    Ding, Steven X

    2013-01-01

    Guaranteeing a high system performance over a wide operating range is an important issue surrounding the design of automatic control systems with successively increasing complexity. As a key technology in the search for a solution, advanced fault detection and identification (FDI) is receiving considerable attention. This book introduces basic model-based FDI schemes, advanced analysis and design algorithms, and mathematical and control-theoretic tools. This second edition of Model-Based Fault Diagnosis Techniques contains: new material on fault isolation and identification, and fault detection in feedback control loops; extended and revised treatment of systematic threshold determination for systems with both deterministic unknown inputs and stochastic noises; addition of the continuously-stirred tank heater as a representative process-industrial benchmark; and enhanced discussion of residual evaluation in stochastic processes. Model-based Fault Diagno...

  18. Complex Coronary Hemodynamics - Simple Analog Modelling as an Educational Tool.

    Science.gov (United States)

    Parikh, Gaurav R; Peter, Elvis; Kakouros, Nikolaos

    2017-01-01

    Invasive coronary angiography remains the cornerstone for evaluation of coronary stenoses despite there being a poor correlation between luminal loss assessment by coronary luminography and myocardial ischemia. This is especially true for coronary lesions deemed moderate by visual assessment. Coronary pressure-derived fractional flow reserve (FFR) has emerged as the gold standard for the evaluation of hemodynamic significance of coronary artery stenosis, which is cost effective and leads to improved patient outcomes. There are, however, several limitations to the use of FFR including the evaluation of serial stenoses. In this article, we discuss the electronic-hydraulic analogy and the utility of simple electrical modelling to mimic the coronary circulation and coronary stenoses. We exemplify the effect of tandem coronary lesions on the FFR by modelling of a patient with sequential disease segments and complex anatomy. We believe that such computational modelling can serve as a powerful educational tool to help clinicians better understand the complexity of coronary hemodynamics and improve patient care.

  19. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.
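SBML-PET-MPI itself distributes objective-function evaluations over MPI ranks; as a rough stand-in for that master/worker pattern, the sketch below collectively fits one shared rate constant to two hypothetical datasets, evaluating candidate parameters in parallel with a thread pool. The data, the toy decay model, and the grid-search strategy are all illustrative, not the tool's actual algorithm:

```python
import math
from concurrent.futures import ThreadPoolExecutor

# Two hypothetical datasets (t, y) generated from exponential decay with
# different initial values but a single shared rate parameter k.
datasets = [
    [(0.0, 1.00), (1.0, 0.61), (2.0, 0.37)],   # dataset A: y0 = 1
    [(0.0, 2.00), (1.0, 1.22), (2.0, 0.74)],   # dataset B: y0 = 2
]

def model(t, y0, k):
    # closed-form solution of dy/dt = -k*y
    return y0 * math.exp(-k * t)

def cost(k):
    # collective fit: sum of squared errors over *all* datasets at once
    err = 0.0
    for y0, data in zip((1.0, 2.0), datasets):
        err += sum((model(t, y0, k) - y) ** 2 for t, y in data)
    return err

candidates = [i / 100.0 for i in range(1, 101)]        # k in (0, 1]
with ThreadPoolExecutor(max_workers=4) as pool:
    errors = list(pool.map(cost, candidates))          # parallel evaluations
best_k = candidates[min(range(len(errors)), key=errors.__getitem__)]
print(best_k)
```

Fitting all datasets through one shared cost function is what "collectively fitting multiple experimental datasets" means in practice: the parameter must explain every experiment simultaneously.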

  20. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    Science.gov (United States)

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resources efficiently is a task to which effective planning and policy design aspire. This may be a non-trivial task. For example, the seventh Sustainable Development Goal (SDG) of Agenda 2030 is to provide access to affordable, sustainable energy for all. On the one hand, energy is required to realise almost all other SDGs. (A clinic requires electricity for fridges to store vaccines for maternal health; irrigated agriculture requires energy to pump water to crops in dry periods; etc.) On the other hand, the energy system is non-trivial. It requires mapping resources, their conversion into useable energy, and then into the machines that we use to meet our needs. That requires new tools that draw on standard techniques and best-in-class models, and allow the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage, and run linear programming models, and to store their input and results data. MoManI is a browser-based, open source interface for systems modelling, available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model would represent the current energy system of a country, region or city; in it, equations and constraints are specified and calibrated to a base year. From that, future technologies and policy options are represented; from those, scenarios are designed and run. Efficient allocation of energy resources and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity building activities. The novel
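OSeMOSYS formulates such questions as linear programmes. To give a flavour of what the model generator sets up, the toy below chooses discrete capacity blocks of two hypothetical technologies to meet a demand at least cost, with exhaustive search standing in for the LP solver. All names, costs, and constraints are invented:

```python
from itertools import product

DEMAND = 100.0            # energy demand to satisfy (arbitrary units)
COST = {"solar": 3.0, "diesel": 5.0}   # cost per capacity block
OUTPUT_PER_BLOCK = 10.0   # energy delivered by one block of either tech
MAX_SOLAR_BLOCKS = 6      # resource constraint on solar

best = None
for n_solar, n_diesel in product(range(16), repeat=2):
    if n_solar > MAX_SOLAR_BLOCKS:
        continue                      # violates the solar resource limit
    output = (n_solar + n_diesel) * OUTPUT_PER_BLOCK
    if output < DEMAND:
        continue                      # infeasible: demand unmet
    cost = n_solar * COST["solar"] + n_diesel * COST["diesel"]
    if best is None or cost < best[0]:
        best = (cost, n_solar, n_diesel)

print(best)   # cheapest feasible plan: max out solar, fill rest with diesel
```

A real OSeMOSYS model adds time slices, technology lifetimes, emission limits and many more constraints, but the structure (minimise cost subject to demand and resource constraints) is the same.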

  1. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed
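Compilers such as GRESS augment FORTRAN codes so that derivatives are propagated alongside values. A minimal forward-mode illustration of that idea, using dual numbers (this is a conceptual stand-in, not GRESS's actual source-transformation mechanism, and the response function is invented):

```python
# Forward-mode automatic differentiation with dual numbers: every
# arithmetic operation carries both a value and its derivative d/dx.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def response(k):
    # hypothetical model response whose sensitivity dR/dk we want
    return 3.0 * k * k + 2.0 * k + 1.0

x = Dual(2.0, 1.0)   # seed the derivative: dx/dx = 1
r = response(x)
print(r.val, r.dot)  # value 17.0, sensitivity dR/dk = 6k + 2 = 14.0
```

The appeal, as with GRESS, is that the sensitivity comes out of the unmodified model logic in a single pass, with no finite-difference rerun of the code.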

  2. Designing an ICT tool platform to support SME business model innovation: Results of a first design cycle

    NARCIS (Netherlands)

    de Reuver, G.A.; Athanasopoulou, A.; Haaker, T.I.; Roelfsema, M.; Riedle, M; Breitfuss, G.

    2016-01-01

    Business model innovation (BMI) is becoming increasingly relevant for enterprises as they are faced with profound changes like digitalization. While business model thinking in academia has advanced, practical tooling that supports business model innovation for small and medium sized enterprises

  3. Rescheduling nursing shifts: scoping the challenge and examining the potential of mathematical model based tools.

    Science.gov (United States)

    Clark, Alistair; Moule, Pam; Topping, Annie; Serpell, Martin

    2015-05-01

    To review research in the literature on nursing shift scheduling/rescheduling, and to report key issues identified in a consultation exercise with managers in four English National Health Service trusts, to inform the development of mathematical tools for rescheduling decision-making. Shift rescheduling is unrecognised as an everyday, time-consuming management task with different imperatives from scheduling. Poor rescheduling decisions can have quality, cost and morale implications. A systematic critical literature review identified rescheduling issues and existing mathematical modelling tools. A consultation exercise with nursing managers examined the complex challenges associated with rescheduling. Minimal research exists on rescheduling compared with scheduling. Poor rescheduling can result in greater disruption to planned nursing shifts and may impact negatively on the quality and cost of patient care, and on nurse morale and retention. Very little research examines management challenges or mathematical modelling for rescheduling. Shift rescheduling is a complex and frequent management activity that is more challenging than scheduling. Mathematical modelling may have potential as a tool to support managers to minimise rescheduling disruption. The lack of specific methodological support for rescheduling that takes into account its complexity increases the likelihood of harm for patients and stress for nursing staff and managers. © 2013 John Wiley & Sons Ltd.

  4. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
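A deliberately tiny sketch of the kind of agent simulation described: agents in a one-dimensional corridor step toward the exit, a cell holds at most one agent (a crude form of collision avoidance), and the time-to-evacuate statistic falls out of the loop. This is an illustration of the concept, not IMPACT's actual engine:

```python
# Minimal multi-agent evacuation: cell 0 is adjacent to the exit, agents
# move one cell per tick, and blocked agents wait until the cell ahead
# is free.
def evacuate(start_positions):
    agents = sorted(start_positions)   # agents nearest the exit move first
    ticks = 0
    while agents:
        ticks += 1
        occupied = set(agents)
        remaining = []
        for pos in agents:
            target = pos - 1
            if target < 0:
                occupied.discard(pos)  # agent leaves through the exit
            elif target in occupied:
                remaining.append(pos)  # cell ahead is taken: wait one tick
            else:
                occupied.discard(pos)
                occupied.add(target)
                remaining.append(target)
        agents = remaining
    return ticks                       # time-to-evacuate statistic

print(evacuate([1, 2, 3, 5]))
```

A tabletop-exercise version of this idea replaces the corridor with a geospatial grid and adds obstacles and multiple exits, but the evaluation loop (move, resolve conflicts, record times) is the same.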

  5. USER FRIENDLY OPEN GIS TOOL FOR LARGE SCALE DATA ASSIMILATION – A CASE STUDY OF HYDROLOGICAL MODELLING

    Directory of Open Access Journals (Sweden)

    P. K. Gupta

    2012-08-01

    Full Text Available Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool is therefore user-friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
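At its core, the assimilation step such a plugin automates is resampling national grids onto model cells. A minimal nearest-neighbour illustration in pure Python with made-up coordinates and rainfall values (the actual tool would use QGIS/GDAL machinery for this):

```python
# Nearest-neighbour lookup of a gridded field at arbitrary model points.
# grid[row][col]; origin = (x0, y0) of the lower-left cell centre;
# row 0 is the southernmost row.
def nearest_value(grid, origin, cell_size, x, y):
    col = int(round((x - origin[0]) / cell_size))
    row = int(round((y - origin[1]) / cell_size))
    return grid[row][col]

rain = [[10.0, 12.0],     # hypothetical 2x2 rainfall grid (mm),
        [14.0, 16.0]]     # 0.25-degree spacing
model_points = [(68.05, 8.0), (68.3, 8.2)]   # (lon, lat) of model cells
resampled = [nearest_value(rain, (68.0, 8.0), 0.25, x, y)
             for x, y in model_points]
print(resampled)
```

Repeating this lookup over every time step and every input layer is exactly the tedious bookkeeping the plugin removes from calibration/validation cycles.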

  6. Nongeneric tool support for model-driven product development; Werkzeugunterstuetzung fuer die modellbasierte Produktentwicklung. Maschinenlesbare Spezifikationen selbst erstellen

    Energy Technology Data Exchange (ETDEWEB)

    Bock, C. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Zuehlke, D. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Deutsches Forschungszentrum fuer Kuenstliche Intelligenz (DFKI), Kaiserslautern (DE). Zentrum fuer Mensch-Maschine-Interaktion (ZMMI)

    2006-07-15

    A well-defined specification process is a central success factor in human-machine interface development. Consequently, in interdisciplinary development teams specification documents are an important communication instrument. In order to replace today's typically paper-based specifications and to leverage the benefits of their electronic equivalents, developers demand comprehensive and applicable computer-based tool kits. Manufacturers' increasing awareness of appropriate tool support is causing alternative approaches for tool kit creation to emerge. This article therefore introduces meta-modelling as a promising way to create nongeneric tool support with justifiable effort. This enables manufacturers to take advantage of electronic specifications in product development processes.

  7. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    electrification simulator; A national CLEW tool allows for the optimization of national level integrated resource use and Macro-CLEW presents the same allowing for detailed economic-biophysical interactions. Finally open Model Management Infrastructure (MoManI) is presented that allows for the rapid prototyping of new additions to, or new resource optimization tools. Collectively these tools provide insights to some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

  8. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
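The task-network idea can be sketched as a Monte Carlo loop over a sequence of steps, each with a duration range and a human error probability. All task names, durations, and probabilities below are invented for illustration; they are not the study's data, and the real discrete-event model also covered postures, tool handling, and cognitive load:

```python
import random

# (name, (min_duration_s, max_duration_s), per-step error probability)
tasks = [
    ("walk to canal",        (30, 45),  0.000),
    ("position handling tool", (20, 60), 0.002),
    ("lift fuel element",    (40, 90),  0.005),
    ("visual inspection",    (60, 120), 0.001),
]

def run_once(rng):
    t, errors = 0.0, 0
    for _name, (lo, hi), p_err in tasks:
        t += rng.uniform(lo, hi)          # stochastic task duration
        if rng.random() < p_err:
            errors += 1                   # count human-error events
    return t, errors

rng = random.Random(1)
runs = [run_once(rng) for _ in range(20000)]
mean_time = sum(t for t, _ in runs) / len(runs)
p_any_error = sum(1 for _, e in runs if e) / len(runs)
print(round(mean_time, 1), round(p_any_error, 4))
```

Aggregating over many runs yields exactly the kinds of outputs the study reports: expected completion time, workload proxies, and the probability of at least one error somewhere in the task sequence.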

  9. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    The increasing ship traffic and maritime transport of dangerous substances make it more difficult to significantly reduce the environmental, economic and social risks posed by potential spills, even though the security rules are becoming more restrictive (ships with double hulls, etc.) and the surveillance systems more developed (VTS, AIS). In fact, the problems associated with spills are, and will always be, a main topic: spill events happen continuously, most of them unknown to the general public because of their small-scale impact, but some of them (a much smaller number) become authentic media phenomena in this information era, due to their large dimensions and their environmental and socio-economic impacts on ecosystems and local communities, and also due to the spectacular or shocking pictures generated. Hence, the adverse consequences posed by these types of accidents increase the preoccupation with avoiding them in the future, or minimizing their impacts, using not only surveillance and monitoring tools, but also an increased capacity to predict the fate and behaviour of bodies, objects, or substances in the hours following an accident - numerical models can now have a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis for oil, inert (ship debris, or floating containers) and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency or planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies or risk maps, using historical data, reference situations, and typical scenarios. After a spill, the use of fast and simple modelling applications allows understanding of the fate and behaviour of the spilt

  10. The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods

    NARCIS (Netherlands)

    Verpoorten, Dominique; Poumay, M; Leclercq, D

    2006-01-01

    Please, cite this publication as: Verpoorten, D., Poumay, M., & Leclercq, D. (2006). The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence

  11. Cost-benefit analysis model: A tool for area-wide fruit fly management. Procedures manual

    International Nuclear Information System (INIS)

    Enkerlin, W.; Mumford, J.; Leach, A.

    2007-03-01

    The Generic Fruit Fly Cost-Benefit Analysis Model assists in economic decision making associated with area-wide fruit fly control options. The FRUIT FLY COST-BENEFIT ANALYSIS PROGRAM (available on 1 CD-ROM from the Joint FAO/IAEA Programme of Nuclear Techniques in Food and Agriculture) is an Excel 2000 Windows-based program, for which all standard Windows and Excel conventions apply. The Model is user-friendly and thus largely self-explanatory. Nevertheless, it includes a procedures manual that has been prepared to guide the user, and should thus be used together with the software. Please note that the table presenting the pest management options on the Introductory Page of the model is controlled by spin buttons and click boxes. These controls are linked to macros that hide non-relevant tables and boxes. N.B. It is important that the medium level of security is selected from the Tools menu of Excel; to do this, go to Tools|Macros|Security| and select Medium. When the file is opened, a form will appear containing three buttons; click on the middle button, 'Enable Macros', so that the macros may be used. Ideally the model should be used as a support tool by working groups aiming to assess the economic returns of different fruit fly control options (suppression, eradication, containment and prevention). The working group should include professionals in agriculture with experience in area-wide implementation of integrated pest management programmes, an economist or at least someone with basic knowledge of economics, and, if relevant, an entomologist with some background in the application of the sterile insect technique (SIT)

  12. An axisymmetrical non-linear finite element model for induction heating in injection molding tools

    DEFF Research Database (Denmark)

    Guerrier, Patrick; Nielsen, Kaspar Kirstein; Menotti, Stefano

    2016-01-01

    To analyze the heating and cooling phase of an induction heated injection molding tool accurately, the temperature dependent magnetic properties, namely the non-linear B-H curves, need to be accounted for in an induction heating simulation. Hence, a finite element model has been developed......, including the non-linear temperature dependent magnetic data described by a three-parameter modified Frohlich equation fitted to the magnetic saturation curve, and solved with an iterative procedure. The numerical calculations are compared with experiments conducted with two types of induction coils, built...... in to the injection molding tool. The model shows very good agreement with the experimental temperature measurements. It is also shown that the non-linearity can be used without the temperature dependency in some cases, and a proposed method is presented of how to estimate an effective linear permeability to use...
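The paper's exact three-parameter modified Frohlich equation is not reproduced in the abstract; the sketch below assumes a common Frohlich-type saturation curve, B(H) = H/(a + b·H) + mu0·H, with purely illustrative parameters, just to show how the relative permeability collapses as the steel saturates, which is why a non-linear B-H description (or a carefully chosen effective linear permeability) matters for the induction simulation:

```python
MU0 = 4e-7 * 3.141592653589793   # vacuum permeability (H/m)

# Assumed Frohlich-type saturation curve; a and b are illustrative fit
# parameters, not values from the paper.
def b_field(h, a=80.0, b=0.55):
    return h / (a + b * h) + MU0 * h

# Relative permeability falls with field strength as the material
# saturates toward roughly 1/b tesla.
for h in (100.0, 1000.0, 10000.0):
    mu_r = b_field(h) / (MU0 * h)
    print(h, round(b_field(h), 3), round(mu_r))
```

An "effective linear permeability" of the kind the abstract mentions would be a single mu_r chosen so that a linear simulation reproduces the heating of this non-linear curve over the relevant field range.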

  13. OMNIITOX - operational life-cycle impact assessment models and information tools for practitioners

    DEFF Research Database (Denmark)

    Molander, S; Lidholm, Peter; Schowanek, Diederik

    2004-01-01

    of the characterisation model(s) and limited input data on chemical properties, which often has resulted in the omission of toxicants from the LCIA, or at best focus on well characterised chemicals. The project addresses both problems and integrates models, as well as data, in an information system – the OMNIITOX IS....... There is also a need for clarification of the relations between the (environmental) risk assessments of toxicants and LCIA, in addition to investigating the feasibility of introducing LCA into European chemicals legislation, tasks that also were addressed in the project.......This article is the preamble to a set of articles describing initial results from an on-going European Commission funded, 5th Framework project called OMNIITOX, Operational Models aNd Information tools for Industrial applications of eco/TOXicological impact assessments. The different parts...

  14. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    Science.gov (United States)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimensions of tunnels and their structural requirements are growing, and so are safety and security demands. New engineering tools are needed for safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM has its foundation in the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters which are relevant for estimating the performance of a tunnel boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine, and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.
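
    The abstract does not specify the wear model actually coupled to the PFEM contact algorithm; a classical Archard-type relation, V = k·F_n·s/H, is a common choice for cutting-tool wear and is sketched here purely as an illustration of how wear could be accumulated over excavation increments.

```python
def archard_wear_volume(k, normal_force, sliding_distance, hardness):
    """Worn volume [m^3] from an Archard-type law.

    k: dimensionless wear coefficient, normal_force [N],
    sliding_distance [m], hardness [Pa].
    """
    return k * normal_force * sliding_distance / hardness

def accumulate_wear(k, hardness, increments):
    """Sum wear over excavation increments, as a time-stepping contact
    solver might do at each cutter/rock interaction point.

    increments: iterable of (normal_force, sliding_distance) pairs.
    """
    return sum(archard_wear_volume(k, f, s, hardness) for f, s in increments)
```

    In a PFEM setting, the per-increment contact force and sliding distance would come from the frictional contact solution at the excavation front rather than being prescribed as they are here.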

  15. An online tool for business modelling and a refinement of the Business Canvas

    NARCIS (Netherlands)

    Rogier Brussee; Peter de Groot

    2016-01-01

    We give a refinement of the well-known business model canvas by Osterwalder and Pigneur by splitting the basic blocks into further subblocks to reduce confusion and increase its expressive power. The splitting is used in an online tool which in addition comes with a set of questions to further …

  16. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostics methodologies. The system uses analytical models based on thermodynamic principles combined with the knowledge of component diagnostics experts. An issue in modeling expert knowledge is to have a framework that can represent and process uncertainty in complex systems, in which it is nearly impossible to build deterministic models for the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships. The methodology estimates the likelihood of the various component failures using the fault-symptom relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically. An application is the estimation of the reactor power in a nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for on-line performance monitoring and diagnostics at the Vattenfall Ringhals nuclear power plants in Sweden. It has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which had led to plant operation below its optimal power. The paper shows how the problem was discovered using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) to the diagnosis of a condenser failure using causal probabilistic graphs.
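
    A minimal sketch of the kind of fault-symptom inference a Bayesian belief network supports. The network structure and all probabilities below are illustrative assumptions, not taken from EnergiTools(R): one fault node F (e.g. a faulty instrument) and two symptom nodes S1, S2 that are conditionally independent given F.

```python
# Illustrative conditional probability tables (assumed, not from the product).
P_F = {True: 0.01, False: 0.99}           # prior probability of the fault
P_S1_GIVEN_F = {True: 0.90, False: 0.05}  # P(S1 observed | F)
P_S2_GIVEN_F = {True: 0.70, False: 0.10}  # P(S2 observed | F)

def posterior_fault(s1, s2):
    """P(F = true | S1 = s1, S2 = s2) by enumerating the joint distribution.

    Uses the naive-Bayes factorization P(F, S1, S2) = P(F) P(S1|F) P(S2|F),
    i.e. the symptoms are conditionally independent given the fault.
    """
    def joint(f):
        p1 = P_S1_GIVEN_F[f] if s1 else 1.0 - P_S1_GIVEN_F[f]
        p2 = P_S2_GIVEN_F[f] if s2 else 1.0 - P_S2_GIVEN_F[f]
        return P_F[f] * p1 * p2
    numerator = joint(True)
    return numerator / (numerator + joint(False))
```

    Observing both symptoms raises the fault probability from the 1% prior to well over 50%, while observing neither drives it below the prior; a production system performs the same computation over a much larger graph of components and indicators.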

  17. Using interactive modeling tools to engage with, inform and empower decision making in local communities of landscape managers

    DEFF Research Database (Denmark)

    Christensen, Andreas Aagaard

    During the last decade, digital modelling tools for environmental impact assessment have become increasingly interactive, agile and user-oriented. This has made it possible to implement models in situ, using them in live scenario situations with local stakeholders. As a result, modelling tools … short- and long-term environmental impact of landscape management. This opens up a number of questions regarding the status and consequence of scientific data and modelled impact estimates as compared to locally held knowledge and expertise. It also opens up questions regarding how the injection of modelling … for modelling the effect of agricultural land use decisions on nitrogen emission to the environment at landscape scales. Recently, Danish authorities proposed to shift the scale of regulation from national regulatory instruments to a more local level to better fit relevant socio-political and agro-environmental …

  18. An Interactive Tool for Automatic Predimensioning and Numerical Modeling of Arch Dams

    Directory of Open Access Journals (Sweden)

    D. J. Vicente

    2017-01-01

    The construction of double-curvature arch dams is an attractive solution from an economic viewpoint due to the reduced volume of concrete necessary for their construction as compared to conventional gravity dams. Due to their complex geometry, many criteria have arisen for their design. However, the most widespread methods are based on recommendations in traditional technical documents that do not take into account the possibilities of computer-aided design. In this paper, an innovative software tool for designing FEM models of double-curvature arch dams is presented. The tool offers several capabilities: simplified geometry creation (of interest for academic purposes), preliminary geometrical design, highly detailed model construction, and stochastic calculations (introducing uncertainty associated with material properties and other parameters). This paper focuses especially on geometrical issues, describing the functionalities of the tool and the fundamentals of the design procedure with regard to the following aspects: topography, reference cylinder, excavation depth, crown cantilever thickness and curvature, horizontal arch curvature, excavation and concrete mass volume, and additional elements such as joints or spillways. Examples of application to two Spanish dams are presented and the results obtained are analyzed.

  19. The Innsbruck/ESO sky models and telluric correction tools

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

    While ground-based astronomical observatories just have to correct for the line-of-sight integral of these atmospheric effects, Čerenkov telescopes use the atmosphere itself as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus the possible application of the method to Čerenkov telescopes.

  20. Something black in the American Psyche: formal innovation and Freudian imagery in the comics of Winsor McCay and Robert Crumb.

    Science.gov (United States)

    Shannon, Edward A

    2010-01-01

    Winsor McCay’s Little Nemo in Slumberland anticipates Robert Crumb’s work. McCay’s innocent dreamscapes seem antithetical to the sexually explicit work of anti-capitalist Crumb, but Nemo looks forward to Crumb in subject and form. Nemo’s presentation of class, gender, and race, and its pre-Freudian sensibility are ironic counterpoints to Crumb’s political, Freudian comix.