WorldWideScience

Sample records for modeling tool nemo

  1. Towards petascaling of the NEMO ocean model

    Science.gov (United States)

    Donners, J.; Audiffren, N.; Molines, J.-M.

    2012-04-01

    PRACE, the Partnership for Advanced Computing in Europe, offers access to the largest high-performance computing systems in Europe. These systems follow the trend of increasing numbers of nodes, each with an increasing number of cores. To utilize these computing systems, it is necessary to use a model that is parallelized and has good scalability. This poster describes different efforts to improve the scalability of the NEMO ocean model. Most importantly, the problem size needs to be chosen adequately: it should contain enough computations to keep thousands of cores busy, but above all it has to be scientifically relevant. The global, 1/12°, NEMO ocean model configuration, developed by the Mercator team, is used for operational ocean forecasting. Therefore, PRACE selected this model for the PRACE benchmarking suite. However, an increased problem size alone was not enough to use these petascale systems efficiently. Different optimizations were required to reach the necessary performance. Scientifically, the model should simulate one year within a wallclock day. Technically, the application needs to scale up to a minimum number of cores. For example, to utilize the fastest system in Europe, the new Curie system in France, the lower limit is 2048 cores. Scalability can be increased by minimizing the time needed for communication between cores. This has been done in two ways. Firstly, advanced parameters of the MPI communication library were optimized. The improvement consists of: 1. using RDMA for eager messages (NEMO message sizes are below the eager size limit) combined with adequate openib flags; 2. tuning OpenMPI collective communication through the btl_coll_tuned_dynamic_rules flag. Overall, the improvement is 33%. Secondly, NEMO uses a tri-polar, staggered grid, which involves a complicated fold across the North Pole. Communication along this fold involves collective gather and scatter operations which create a bottleneck at a single core, so
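
    The record is truncated at this point. A plausible continuation, and a standard remedy (an assumption here, not taken from the poster), is to replace the gather/scatter across the fold with direct point-to-point exchanges between the ranks that share it. A minimal mpi4py sketch of that idea, assuming a 1-D decomposition of the northernmost row along longitude and a symmetric rank layout:

      # Sketch only: NEMO itself is Fortran; the mirror-rank pairing and
      # buffer sizes below are illustrative assumptions.
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      nlocal = 16                          # points of the top row owned locally
      top_row = np.random.rand(nlocal)     # northernmost physical row
      folded = np.empty(nlocal)            # buffer for the partner's row

      # Across a tripolar fold, longitude index i pairs with index (N - i), so
      # rank r can exchange directly with rank (size - 1 - r) instead of
      # funnelling the whole row through a collective gather on one core.
      partner = size - 1 - rank
      comm.Sendrecv(top_row, dest=partner, sendtag=0,
                    recvbuf=folded, source=partner, recvtag=0)

      ghost_row = folded[::-1]             # folded values arrive mirrored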

  2. NEMO: A Stellar Dynamics Toolbox

    Science.gov (United States)

    Barnes, Joshua; Hut, Piet; Teuben, Peter

    2010-10-01

    NEMO is an extendible Stellar Dynamics Toolbox, following an Open-Source Software model. It has various programs to create, integrate, analyze and visualize N-body and SPH-like systems, following the pipe-and-filter architecture. In addition, there are various tools to operate on images, tables and orbits, including FITS files to export/import to/from other astronomical data reduction packages. A large and growing fraction of NEMO has been contributed by a growing list of authors. The source code consists of a little over 4000 files and a little under 1,000,000 lines of code and documentation, mostly C, with some C++ and Fortran. NEMO development started in 1986 in Princeton (USA) with Barnes, Hut and Teuben. See also ZENO (ascl:1102.027) for the version that Barnes maintains.

  3. Modelling turbulent vertical mixing sensitivity using a 1-D version of NEMO

    Science.gov (United States)

    Reffray, G.; Bourdalle-Badie, R.; Calone, C.

    2015-01-01

    Through two numerical experiments, a 1-D vertical model called NEMO1D was used to investigate physical and numerical turbulent-mixing behaviour. The results show that all the turbulent closures tested (k+l from Blanke and Delecluse, 1993, and two-equation models: the generic length scale closures from Umlauf and Burchard, 2003) are able to correctly reproduce the classical test of Kato and Phillips (1969) under favourable numerical conditions, while some solutions may diverge depending on the degradation of the spatial and time discretization. The performance of the turbulence models was then compared with data measured over a 1-year period (mid-2010 to mid-2011) at the PAPA station, located in the North Pacific Ocean. The modelled temperature and salinity were in good agreement with the observations, with a maximum temperature error between -2 and 2 °C during the stratified period (June to October). However, the results also depend on the numerical conditions. The vertical RMSE varied, for different turbulent closures, from 0.1 to 0.3 °C during the stratified period and from 0.03 to 0.15 °C during the homogeneous period. This 1-D configuration at the PAPA station (called PAPA1D) is now available in NEMO as a reference configuration, including the input files and atmospheric forcing set described in this paper. Thus, all the results described can be recovered by downloading and launching PAPA1D. The configuration is described on the NEMO site (http://www.nemo-ocean.eu/Using-NEMO/Configurations/C1D_PAPA). This package is a good starting point for further investigation of vertical processes.
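
    For reference, the "vertical RMSE" quoted above is the root-mean-square temperature difference between model and observations over the depth levels of the period in question. A minimal sketch (function and variable names are illustrative, not from the NEMO1D/PAPA1D code):

      import numpy as np

      def vertical_rmse(t_model, t_obs):
          """RMSE (degC) between two matching (depth,) temperature profiles."""
          d = np.asarray(t_model) - np.asarray(t_obs)
          return float(np.sqrt(np.mean(d ** 2)))

      # A uniform 0.2 degC bias gives an RMSE of 0.2 degC:
      print(vertical_rmse([10.1, 9.2, 8.3], [9.9, 9.0, 8.1]))  # -> ~0.2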

  4. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Fraunhofer Institute for Solar Energy Systems ISE, Freiburg (Germany)

    2013-07-01

    With the increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at combined techno-economic design and operation, primarily modeling plants on contracts or the spot market while at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by a few intelligent restrictions on EV charging procedures.

  5. Performance and results of the high-resolution biogeochemical model PELAGOS025 within NEMO

    Science.gov (United States)

    Epicoco, I.; Mocavero, S.; Macchia, F.; Vichi, M.; Lovato, T.; Masina, S.; Aloisio, G.

    2015-12-01

    The present work aims at evaluating the scalability performance of a high-resolution global ocean biogeochemistry model (PELAGOS025) on massively parallel architectures and the benefits in terms of time-to-solution reduction. PELAGOS025 is an online coupling between the physical ocean model NEMO and the BFM biogeochemical model. Both models use a parallel domain decomposition along the horizontal dimension. The parallelisation is based on the message passing paradigm. The performance analysis has been done on two parallel architectures, an IBM BlueGene/Q at ALCF (Argonne Leadership Computing Facility) and an IBM iDataPlex with Sandy Bridge processors at CMCC (Euro-Mediterranean Center on Climate Change). The outcome of the analysis demonstrated that the lack of scalability is due to several factors, such as the I/O operations, the memory contention, the load imbalance due to the memory structure of the BFM component and, for the BlueGene/Q, the absence of a hybrid parallelisation approach.
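
    To make the decomposition concrete: NEMO splits the global horizontal grid into jpni x jpnj rectangular subdomains, one per MPI process (jpni, jpnj, jpiglo and jpjglo are NEMO namelist names; the bookkeeping below is a simplified sketch assuming a plain block decomposition with no halos and no elimination of land-only subdomains):

      def subdomain_shape(jpiglo, jpjglo, jpni, jpnj, rank):
          """Size (ni, nj) of this rank's subdomain in a jpni x jpnj split."""
          i, j = rank % jpni, rank // jpni      # Cartesian position of the rank
          ni = jpiglo // jpni + (1 if i < jpiglo % jpni else 0)
          nj = jpjglo // jpnj + (1 if j < jpjglo % jpnj else 0)
          return ni, nj

      # e.g. a 1442 x 1021 grid (a 1/4 degree ORCA-like grid) on 16 x 8 ranks
      print(subdomain_shape(1442, 1021, 16, 8, rank=0))   # -> (91, 128)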

  6. Analysis of the data from the NEMO3 experiment and search for neutrinoless double beta decay - Study of systematic bias of the calorimeter and development of analysis tools

    International Nuclear Information System (INIS)

    Hugon, C.

    2012-11-01

    The NEMO3 experiment searched for neutrinoless double-beta (0νββ) decay using various sources of double beta decay isotopes (mainly 100Mo, 82Se, 116Cd and 130Te, about 10 kg in total). The detector was located in the underground laboratory of Modane (France), at the halfway point of the Frejus tunnel. This experiment demonstrated that the 'tracko-calo' technique is really competitive and, in addition, it produced new results for two-neutrino double-beta (2νββ) decay and for 0νββ decay searches. Moreover, it opened the way for its successor SuperNEMO, whose aim is to reach a mass of 100 kg of 82Se (for a sensitivity of 10^26 years). The main goal of the thesis is to measure the 2νββ and 0νββ decay of 100Mo to the excited state 0_1^+ of 100Ru using the whole NEMO3 data set, with new original methods of analysis and through the development of the collaboration's analysis software. The half-lives obtained for the ground-state (gs) and excited-state 2νββ decays of 100Mo are T(2νββ, gs) = [7.05 ± 0.01(stat) ± 0.54(syst)] × 10^18 years and T(2νββ, 0_1^+) = [6.15 ± 1.1(stat) ± 0.78(syst)] × 10^20 years. These results are compatible with the latest ones published by the collaboration. For the 0νββ(0_1^+) decay, this work gives a half-life limit of T(0νββ, 0_1^+) > 2.6 × 10^23 years, significantly improving the last published results. Furthermore, these methods also allowed a new and more exhaustive background model to be presented for this experiment. The second part of this work was to measure the systematic errors of the NEMO3 calorimeter, due among other things to the wavelength of the NEMO3 calibration systems. This work was done using a new LED-based test bench. This bench also contributed to the development of the SuperNEMO calorimeter, especially for the timing characteristics and energy linearity measurements of the photomultipliers intended for the experiment's demonstrator. (author)

  7. Explicit representation and parametrised impacts of under ice shelf seas in the z∗ coordinate ocean model NEMO 3.6

    Directory of Open Access Journals (Sweden)

    P. Mathiot

    2017-07-01

    Ice-shelf–ocean interactions are a major source of freshwater on the Antarctic continental shelf and have a strong impact on ocean properties, ocean circulation and sea ice. However, climate models based on the ocean–sea ice model NEMO (Nucleus for European Modelling of the Ocean) currently do not include these interactions in any detail. The capability of explicitly simulating the circulation beneath ice shelves is introduced in the non-linear free surface model NEMO. Its implementation into the NEMO framework and its assessment in an idealised and a realistic circum-Antarctic configuration are described in this study. Compared with the current prescription of ice shelf melting (i.e. at the surface), inclusion of open sub-ice-shelf cavities leads to a decrease in sea ice thickness along the coast, a weakening of the ocean stratification on the shelf, a decrease in salinity of high-salinity shelf water on the Ross and Weddell sea shelves and an increase in the strength of the gyres that circulate within the over-deepened basins on the West Antarctic continental shelf. Mimicking the overturning circulation under the ice shelves by introducing a prescribed meltwater flux over the depth range of the ice shelf base, rather than at the surface, is also assessed. It yields similar improvements in the simulated ocean properties and circulation over the Antarctic continental shelf to those from the explicit ice shelf cavity representation. With the ice shelf cavities opened, the widely used three-equation ice shelf melting formulation, which enables an interactive computation of melting, is tested. Comparison with observational estimates of ice shelf melting indicates realistic results for most ice shelves. However, melting rates for the Amery, Getz and George VI ice shelves are considerably overestimated.
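
    The "three-equation formulation" referred to above couples a heat balance and a salt balance at the ice shelf base with the pressure- and salinity-dependent freezing point. One common statement of it is given below (following Holland and Jenkins, 1999; the exact coefficients and sign conventions used in NEMO 3.6 may differ). Solving the system yields the melt rate m and the interface temperature and salinity T_b, S_b:

      % Heat balance, salt balance and freezing-point relation at the base;
      % (T_w, S_w): ocean properties near the base; (gamma_T, gamma_S):
      % turbulent exchange velocities; z_b: depth of the ice shelf base.
      \begin{align}
        \rho_w c_{p,w}\,\gamma_T\,(T_w - T_b) &= \rho_i\,m\,\bigl(L_f + c_{p,i}\,(T_b - T_i)\bigr)\\
        \rho_w\,\gamma_S\,(S_w - S_b) &= \rho_i\,m\,S_b\\
        T_b &= \lambda_1 S_b + \lambda_2 + \lambda_3 z_b
      \end{align}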

  8. Interactions between Arctic sea ice drift, concentration and thickness modelled by NEMO-LIM3.6

    Science.gov (United States)

    Docquier, David; Massonnet, François; Tandon, Neil F.; Lecomte, Olivier; Fichefet, Thierry

    2017-04-01

    Sea ice cover and thickness have substantially decreased in the Arctic Ocean since the beginning of the satellite era. As a result, sea ice strength has been reduced, allowing more deformation and fracturing and leading to increased sea ice drift speed. We use the global ocean-sea ice NEMO-LIM3.6 model as well as satellite and buoy observations over the period from 1979 to 2013 to study the interactions between sea ice drift, concentration and thickness. Overall, the model agrees well with observations in terms of sea ice extent, concentration and thickness. Although the seasonal cycle of sea ice drift is reasonably well reproduced by the model, the modelled values are generally higher and the trend is weaker compared to observations, resulting in lower sea ice export at Fram Strait than observed. NEMO-LIM3.6 is able to capture the relationship between sea ice drift and strength in terms of seasonal cycle, with higher drift for both lower concentration and lower thickness, in agreement with observations. Sensitivity experiments are carried out by varying the initial ice strength and show that higher values of ice strength lead to lower ice thickness. The negative feedback between sea ice strength, heat loss and thickness can explain these results. This study forms part of the EU Horizon 2020 PRIMAVERA project aiming at developing a new generation of advanced and well-evaluated high-resolution global climate models.
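
    The "ice strength" varied in these sensitivity experiments is, in Hibler-type rheologies such as the one LIM3 builds on (an assumption here; the LIM3.6 documentation gives the exact formulation), a simple function of ice thickness h and concentration A, with the empirical coefficient P* as the perturbed parameter:

      % Hibler (1979) ice strength: thin or low-concentration ice is weak,
      % so reduced thickness lowers P and permits faster drift.
      P \;=\; P^{*}\, h \, e^{-C\,(1 - A)}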

  9. Finding NEMO in preeclampsia.

    Science.gov (United States)

    Sakowicz, Agata; Hejduk, Paulina; Pietrucha, Tadeusz; Nowakowska, Magdalena; Płuciennik, Elżbieta; Pospiech, Karolina; Gach, Agnieszka; Rybak-Krzyszkowska, Magda; Sakowicz, Bartosz; Kaminski, Marek; Krasomski, Grzegorz; Biesiada, Lidia

    2016-04-01

    The mechanism of preeclampsia and its mode of inheritance are still a mystery. Biochemical and immunochemical studies reveal a substantial increase in tumor necrosis factor alpha, interleukin-1 beta, and interleukin-6 concentrations in the blood of women with preeclampsia. The level of these factors is regulated by nuclear factor-kappa B, whose activation in the classical pathway requires inhibitory kappa B kinase gamma (known as NEMO or IKBKG). Moreover, NEMO can shuttle between the cytoplasm and the nucleus. In the nucleus, IKBKG interacts with other proteins and is thus implicated in the regulation of expression of different genes related to cell cycle progression, proliferation, differentiation, and apoptosis. This is the first study investigating the association between the level of NEMO gene expression and the presence of preeclampsia. We tested the hypothesis that a simultaneous increase in NEMO gene expression both in the mother and her fetus may be responsible for preeclampsia development. Moreover, the relationships between clinical risk factors of preeclampsia and the levels of NEMO gene expression in blood, umbilical cord blood, and placentas were investigated. A total of 91 women (43 preeclamptic women and 48 controls) and their children were examined. Real-time reverse transcription-polymerase chain reaction was used to assess the total NEMO messenger ribonucleic acid (mRNA) content and the mRNA level of each NEMO transcript from exons 1A, 1B, and 1C in maternal blood, umbilical cord blood, and placentas. Univariate analyses and correlation tests were performed to examine the association between NEMO gene expression and preeclampsia. Newborn weight and height, maternal platelet number, and gestational age (week of delivery) were lower in the group of women with preeclampsia than in controls. The NEMO gene expression level was found to be almost 7 times higher in the group of women with preeclampsia than in healthy controls. The correlation

  10. A comparative signaling cost analysis of Macro Mobility scheme in NEMO (MM-NEMO) with mobility management protocol

    International Nuclear Information System (INIS)

    Islam, Shayla; Abdalla, Aisha H; Habaebi, Mohamed H; Latif, Suhaimi A; Hassan, Wan H; Hasan, Mohammad K; Ramli, H A M; Khalifa, Othman O

    2013-01-01

    NEMO BSP is an upgraded addition to Mobile IPv6 (MIPv6). As MIPv6 and its enhancements (e.g. HMIPv6) possess some limitations, such as higher handoff latency and packet loss, NEMO BSP also inherits all these shortcomings. Network Mobility (NEMO) handles the movement of a Mobile Router (MR) and its Mobile Network Nodes (MNNs) during handoff. Hence, it is essential to upgrade the performance of the mobility management protocol to obtain continuous session connectivity with lower delay and packet loss in a NEMO environment. The completion of the handoff process in NEMO BSP usually takes a long period, since the MR needs to register its single primary care-of address (CoA) with the home network, which may degrade the performance of applications running on the Mobile Network Nodes. Moreover, when a change in the point of attachment of the mobile network is accompanied by a sudden burst of signaling messages, a 'signaling storm' occurs, which eventually results in temporary congestion, packet delays or even packet loss. This effect is particularly significant in wireless environments, where a wireless link is not as steady as a wired link and bandwidth is relatively limited. Hence, providing continuous Internet connection without any interruption by applying multihoming techniques and route optimization mechanisms in NEMO has become a focus of current research. In this paper, we propose a handoff cost model to compare the signaling cost of MM-NEMO with the NEMO Basic Support Protocol (NEMO BSP) and HMIPv6. The numerical results show that the signaling cost for the MM-NEMO scheme is about 69.6% less than for NEMO-BSP and HMIPv6.
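
    As a toy illustration of how such a signaling cost model is typically built (the message sets, sizes and hop counts below are assumptions for illustration, not parameter values from the MM-NEMO paper): the cost of one handoff is the sum of message size times hop distance over all signaling messages, and a localized scheme wins by terminating most registrations at a nearby anchor instead of the distant Home Agent.

      def handoff_cost(messages):
          """Signaling cost of one handoff: sum of size x hops per message."""
          return sum(size * hops for size, hops in messages)

      # NEMO BSP: Binding Update + Ack travel all the way to the Home Agent.
      bsp = handoff_cost([(56, 12), (56, 12)])     # 2 messages over 12 hops

      # Localized scheme: the same exchange stops at a nearby anchor point.
      local = handoff_cost([(56, 3), (56, 3)])     # 2 messages over 3 hops

      print(f"localized scheme saves {100 * (1 - local / bsp):.1f}%")  # 75.0%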

  11. Background constraints of the SuperNEMO experiment for neutrinoless double beta-decay searches

    Energy Technology Data Exchange (ETDEWEB)

    Povinec, Pavel P.

    2017-02-11

    The SuperNEMO experiment is a new generation of experiments dedicated to the search for neutrinoless double beta-decay, which, if observed, would confirm the existence of physics beyond the Standard Model. It is based on the tracking and calorimetry techniques, which allow the reconstruction of the final state topology, including timing and kinematics of the double beta-decay transition events, offering a powerful tool for background rejection. While the basic detection strategy of the SuperNEMO detector remains the same as that of the NEMO-3 detector, a number of improvements were accomplished for each of the detector's main components. Upgrades of the detector technologies and development of low-level counting techniques ensure radiopurity control of construction parts of the SuperNEMO detector. A reference material made of glass pellets has been developed to assure quality management and quality control of radiopurity measurements. The first module of the SuperNEMO detector (Demonstrator) is currently under construction in the Modane underground laboratory. No background event is expected in the neutrinoless double beta-decay region in 2.5 years of its operation using 7 kg of {sup 82}Se. The half-life sensitivity of the Demonstrator is expected to be >6.5·10{sup 24} y, corresponding to an effective Majorana neutrino mass sensitivity of |0.2−0.4| eV (90% C.L.). The full SuperNEMO experiment, comprising 20 modules with 100 kg of {sup 82}Se source, should reach an effective Majorana neutrino mass sensitivity of |0.04−0.1| eV, and a half-life limit of 1·10{sup 26} y. - Highlights: • SuperNEMO detector for 2β0ν-decay of {sup 82}Se should reach half-life limit of 10{sup 26} y. • Radiopurity of the SuperNEMO internal detector parts was checked down to 0.1 mBq/kg. • Reference material of glass pellets was developed for underground γ-spectrometry.
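
    The quoted Demonstrator sensitivity follows from the standard zero-background counting estimate. With the exposure given above it is reproduced as follows (the signal efficiency of roughly 18% is an assumption inserted here to make the arithmetic work, not a number from the paper):

      % N_A/W . M t = source atoms x live time; N_excl = 2.44 events is the
      % 90% C.L. upper limit for zero observed events (Feldman-Cousins).
      T_{1/2}^{0\nu} \;>\; \ln 2\;\frac{N_A}{W}\,M\,t\;\frac{\varepsilon}{N_{\mathrm{excl}}}
      \;=\; \ln 2\cdot\frac{6.022\times 10^{23}}{82\ \mathrm{g\,mol^{-1}}}\cdot 7000\ \mathrm{g}\cdot 2.5\ \mathrm{y}\cdot\frac{0.18}{2.44}
      \;\approx\; 6.6\times 10^{24}\ \mathrm{y}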

  12. Implementation of Black Sea numerical model based on NEMO and 3DVAR data assimilation scheme for operational forecasting

    Science.gov (United States)

    Ciliberti, Stefania Angela; Peneva, Elisaveta; Storto, Andrea; Kandilarov, Rostislav; Lecci, Rita; Yang, Chunxue; Coppini, Giovanni; Masina, Simona; Pinardi, Nadia

    2016-04-01

    This study describes a new model implementation for the Black Sea, which uses data assimilation, towards operational forecasting, based on NEMO (Nucleus for European Modelling of the Ocean, Madec et al., 2012). The Black Sea domain is resolved with 1/27°×1/36° horizontal resolution (~3 km) and 31 z-levels with partial steps based on the GEBCO bathymetry data (Grayek et al., 2010). The model is forced by momentum, water and heat fluxes interactively computed by bulk formulae using high resolution atmospheric forcing provided by the European Centre for Medium-Range Weather Forecasts (ECMWF). The initial condition is calculated from long-term climatological temperature and salinity 3D fields. The precipitation field over the basin has been computed from the climatological GPCP rainfall monthly data (Adler et al., 2003; Huffman et al., 2009), while the evaporation is derived from the latent heat flux. The climatological monthly mean runoff of the major rivers in the Black Sea is computed using the hydrological dataset provided by the SESAME project (Ludwig et al., 2009). The exchange with the Mediterranean Sea through the Bosporus Strait is represented by a surface boundary condition taking into account the barotropic transport calculated to balance the fresh water fluxes on a monthly basis (Stanev and Beckers, 1999; Peneva et al., 2001). A multi-annual run 2011-2015 has been completed in order to describe the main characteristics of the Black Sea circulation dynamics and thermohaline structure, and the numerical results have been validated using in-situ (ARGO) and satellite (SST, SLA) data. The Black Sea model also represents the core of the new Black Sea Forecasting System, implemented at CMCC operationally since January 2016, which produces at daily frequency 10-day forecasts, 3-day analyses and a 1-day simulation. Once a week, the system is run 15 days into the past in analysis mode to compute the new optimal initial condition for the forecast cycle. The assimilation is performed by a
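
    The record breaks off while naming the assimilation scheme; the title identifies it as 3DVAR. For context, a 3DVAR analysis minimizes the standard variational cost function, trading off distance to the background state x_b against distance to the observations y:

      % B and R are the background- and observation-error covariances,
      % H the observation operator; the minimizer is the analysis state.
      J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
      \;+\; \tfrac{1}{2}\,\bigl(H(\mathbf{x})-\mathbf{y}\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(H(\mathbf{x})-\mathbf{y}\bigr)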

  13. Implementation of the NEMO model for estimating the spread of leakage from chemical munitions in the Baltic Sea - the first approach

    Science.gov (United States)

    Andrzejewski, Jan

    2017-04-01

    After the Second World War, during the Potsdam Conference, a decision about the demilitarization of Germany was made, and as a consequence, ammunition including chemical warfare agents (CWA) was dumped into the basins of the Baltic Sea. This type of weapon was stored in metal barrels that were under strong influence of electrochemical oxidation, also known as corrosion. Several decades later, scientists began to wonder what consequences a leakage from these weapons could bring for the marine ecosystem. Although over 70 years have passed since the Second World War, the influence of potential leakage of the CWA has not been properly estimated. Thus, the main goal of this work is to estimate the dangerous area caused by a potential leakage using the NEMO (Nucleus for European Modelling of the Ocean) ocean model. The NEMO ocean model is developed by a European consortium including research institutes from France, England and Italy. The first step of this work is to implement the model for the area of the Baltic Sea. It requires generation of the horizontal and vertical grid, bathymetry, atmospheric forcing and lateral boundary conditions. The implemented model will then have to pass a validation process. The Baltic Sea is one of the best-measured seas in the world; as a consequence, a lot of data are freely available for researchers. After validation and tuning of the model, implementation of a passive tracer is planned. The passive tracer is a prognostic variable that can represent the concentration of a potential leakage without influencing the density of the model. Based on the distribution of the passive tracer, dangerous areas in the locations of the dumpsites will be assessed. The research work was funded by the European Union (European Regional Development Fund) under the Interreg Baltic Sea Region Programme 2014-2020, project #R013 DAIMON (Decision Aid for Marine Munitions).
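
    For context, a passive tracer C obeys the usual advection-diffusion equation, driven by the model's velocity and mixing fields but with no feedback on density (S is the leakage source term; this is the generic form, not a NEMO-specific discretization):

      \frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C
      \;=\; \nabla\cdot(\mathbf{K}\,\nabla C) + S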

  14. RedNemo

    DEFF Research Database (Denmark)

    Alkan, Ferhat; Erten, Cesim

    2017-01-01

    MOTIVATION: Analysis of protein-protein interaction (PPI) networks provides invaluable insight into several systems biology problems. High-throughput experimental techniques together with computational methods provide large-scale PPI networks. However, a major issue with these networks is their e...... material including source code, useful scripts, experimental data and the results are available at http://webprs.khas.edu.tr/∼cesim/RedNemo.tar.gz CONTACT: cesim@khas.edu.tr Supplementary information: Supplementary data are available at Bioinformatics online.

  15. Polyubiquitin Drives the Molecular Interactions of the NF-κB Essential Modulator (NEMO) by Allosteric Regulation.

    Science.gov (United States)

    Catici, Dragana A M; Horne, James E; Cooper, Grace E; Pudney, Christopher R

    2015-05-29

    The NF-κB essential modulator (NEMO) is the master regulator of NF-κB signaling, controlling the immune and nervous systems. NEMO affects the activity of IκB kinase-β (IKKβ), which relieves the inhibition of the NF-κB transcriptional regulation machinery. Despite major effort, there is only a very sparse, phenomenological understanding of how NEMO regulates IKKβ and shows specificity in its large range of molecular interactions. We explore the key molecular interactions of NEMO using a molecular biophysics approach, incorporating rapid-mixing stopped-flow, high-pressure, and CD spectroscopies. Our study demonstrates that NEMO has a significant degree of native structural disorder and that molecular flexibility and ligand-induced conformational change are at the heart of the molecular interactions of NEMO. We found that long chain length, unanchored, linear polyubiquitin drives NEMO activity, enhancing the affinity of NEMO for IKKβ and the kinase substrate IκBα and promoting membrane association. We present evidence that unanchored polyubiquitin achieves this regulation by inducing NEMO conformational change by an allosteric mechanism. We combine our quantitative findings to give a detailed molecular mechanistic model for the activity of NEMO, providing insight into the molecular mechanism of NEMO activity with broad implications for the biological role of free polyubiquitin. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. ArcNEMO, a spatially distributed nutrient emission model developed in Python to quantify losses of nitrogen and phosphorous from agriculture to surface waters

    Science.gov (United States)

    Van Opstal, Mattias; Tits, Mia; Beckers, Veronique; Batelaan, Okke; Van Orshoven, Jos; Elsen, Annemie; Diels, Jan; D'heygere, Tom; Van Hoof, Kor

    2014-05-01

    Pollution of surface water bodies with nitrogen (N) and phosphorus (P) from agricultural sources is a major problem in areas with intensive agriculture in Europe. The Flemish Environment Agency requires information on how spatially explicit policy measures on manure and fertilizer use, and changes in land use and soil management, affect the N and P concentrations in the surface waters in the region of Flanders, Belgium. To assist in this, a new spatially distributed, mechanistic nutrient emission model was developed in the open-source language Python. The model is called ArcNEMO (Nutrient Emission MOdel). The model is fully integrated in ArcGIS, but could easily be adapted to work with open-source GIS software. In Flanders, detailed information is available each year on the delineation of each agricultural parcel and the crops grown on it. Parcels are linked to farms, and for each farm yearly manure and fertilizer use is available. To take full advantage of this information and to be able to simulate nutrient losses to the high-density surface water network, the model makes use of grid cells of 50 by 50 m. A fertilizer allocation model was developed to calculate, from the yearly parcel and farm data, the fertilizer and manure input per grid cell for further use in the ArcNEMO model. The model architecture was chosen such that the model can be used to simulate spatially explicit monthly discharge and losses of N and P to the surface water for the whole of Flanders (13,500 km²) over periods of 10-20 years. The extended time period is necessary because residence times in groundwater and the rates of organic matter turnover imply that water quality reacts slowly to changes in land use and fertilization practices. Vertical water flow and nutrient transport in the unsaturated zone are described per grid cell using a cascading bucket-type model with daily time steps. Groundwater flow is described by solving the 2D groundwater flow equation using an explicit numerical
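
    The abstract breaks off while describing the explicit solution of the 2-D groundwater flow equation. A minimal sketch of one such explicit finite-difference step on a uniform 50 m grid (uniform transmissivity, periodic boundaries and the parameter values are simplifying assumptions made here, not ArcNEMO's actual configuration):

      import numpy as np

      def step_head(h, T=50.0, S=0.2, R=0.001, dx=50.0, dt=1.0):
          """One explicit step of S*dh/dt = T*laplacian(h) + R.

          h: 2-D hydraulic head (m); T: transmissivity (m^2/day);
          S: storage coefficient (-); R: recharge (m/day); dt in days.
          Stable while dt <= S*dx**2 / (4*T) (= 2.5 days here).
          """
          lap = (np.roll(h, 1, 0) + np.roll(h, -1, 0) +
                 np.roll(h, 1, 1) + np.roll(h, -1, 1) - 4.0 * h) / dx**2
          return h + dt / S * (T * lap + R)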

  17. Study of the background neutron and gamma components of the ββ(0ν) decay in the NEMO2 prototype detector. Consequences for the NEMO3 detector

    International Nuclear Information System (INIS)

    Marquet, Christine

    1999-01-01

    Neutrinoless double beta decay ββ(0ν) is a test of physics beyond the Standard Model, since it requires the existence of a massive Majorana neutrino (ν = ν-bar). To try to observe such a process with a sensitivity of 0.1 eV on the effective neutrino mass ⟨mν⟩, the NEMO collaboration built the NEMO3 detector, able to measure half-lives greater than 10^24 years, corresponding to a few detected events per year. For that, it is necessary to know and master all background sources. This work was first dedicated to the study of the external (to the double beta source) background with crossing electrons recorded with the NEMO2 prototype detector, and then to the simulation of this background in the NEMO3 detector. Comparison between NEMO2 data and the results of gamma and neutron simulations for different shieldings, with and without a neutron source, allowed the determination of the background contributions of radon, thoron, 208Tl contamination in materials, the photon flux produced in the laboratory, and neutrons. This study, which required improvements to the MICAP neutron simulation code through the development of a photon generator, proved that radiative capture of fast neutrons thermalized in the detector was the source of events in the energy domain of the ββ(0ν) signal. In order to reach the required sensitivity on ⟨mν⟩, it was shown that both neutron shielding and a magnetic field are necessary for the NEMO3 detector. (author) [fr

  18. Evaluation of an operational ocean model configuration at 1/12° spatial resolution for the Indonesian seas (NEMO2.3/INDO12) - Part 1: Ocean physics

    Science.gov (United States)

    Tranchant, Benoît; Reffray, Guillaume; Greiner, Eric; Nugroho, Dwiyoga; Koch-Larrouy, Ariane; Gaspar, Philippe

    2016-03-01

    INDO12 is a 1/12° regional version of the NEMO physical ocean model covering the whole Indonesian EEZ (Exclusive Economic Zone). It has been developed and is now running every week in the framework of the INDESO (Infrastructure Development of Space Oceanography) project implemented by the Indonesian Ministry of Marine Affairs and Fisheries. The initial hydrographic conditions as well as open-boundary conditions are derived from the operational global ocean forecasting system at 1/4° operated by Mercator Océan. Atmospheric forcing fields (3-hourly analyses from the European Centre for Medium-Range Weather Forecasts, ECMWF) are used to force the regional model. INDO12 is also forced by tidal currents and elevations, and by the inverse barometer effect. The turbulent mixing induced by internal tides is taken into account through a specific parameterisation. In this study we evaluate the model skill through comparisons with various data sets, including outputs of the parent model, climatologies, in situ temperature and salinity measurements, and satellite data. The assessment of the biogeochemical model results is presented in a companion paper (Gutknecht et al., 2015). The simulated and altimeter-derived Eddy Kinetic Energy fields display similar patterns and confirm that tides are a dominant forcing in the area. The volume transport of the Indonesian throughflow (ITF) is in good agreement with the INSTANT estimates, while the transport through Luzon Strait is, on average, westward but probably too weak. Compared to satellite data, surface salinity and temperature fields display marked biases in the South China Sea. Significant water mass transformation occurs along the main routes of the ITF and compares well with observations. Vertical mixing is able to modify the South and North Pacific subtropical water-salinity maximum, as seen in T-S diagrams. In spite of a few weaknesses, INDO12 proves to be able to provide a very realistic simulation of the ocean circulation and water mass

  19. Assessment of the sea-ice carbon pump: Insights from a three-dimensional ocean-sea-ice biogeochemical model (NEMO-LIM-PISCES)

    Directory of Open Access Journals (Sweden)

    Sébastien Moreau

    2016-08-01

    The role of sea ice in the carbon cycle is minimally represented in current Earth System Models (ESMs). Among potentially important flaws, mentioned by several authors and generally overlooked during ESM design, is the link between sea-ice growth and melt and oceanic dissolved inorganic carbon (DIC) and total alkalinity (TA). Here we investigate whether this link is indeed an important feature of the marine carbon cycle misrepresented in ESMs. We use an ocean general circulation model (NEMO-LIM-PISCES) with sea-ice and marine carbon cycle components, forced by atmospheric reanalyses, adding a first-order representation of DIC and TA storage and release in/from sea ice. Our results suggest that DIC rejection during sea-ice growth releases several hundred Tg C yr−1 to the surface ocean, of which < 2% is exported to depth, leading to a notable but weak redistribution of DIC towards deep polar basins. Active carbon processes (mainly CaCO3 precipitation, but also ice-atmosphere CO2 fluxes and net community production) increasing the TA/DIC ratio in sea ice modified ocean-atmosphere CO2 fluxes by a few Tg C yr−1 in the sea-ice zone, with specific hemispheric effects: the DIC content of the Arctic basin decreased, but the DIC content of the Southern Ocean increased. For the global ocean, DIC content increased by 4 Tg C yr−1, or 2 Pg C after 500 years of model run. The simulated numbers are generally small compared to the present-day global ocean annual CO2 sink (2.6 ± 0.5 Pg C yr−1). However, sea-ice carbon processes seem important at regional scales, as they act significantly on DIC redistribution within and outside polar basins. The efficiency of carbon export to depth depends on the representation of surface-subsurface exchanges and their relationship with sea ice, and could differ substantially if a higher resolution or different ocean model were used.

  20. Review paper of gateway selection schemes for MANET of NEMO (MANEMO)

    International Nuclear Information System (INIS)

    Mahmood, Z; Hashim, A; Khalifa, O; Anwar, F; Hameed, S

    2013-01-01

    The fast growth of Internet applications brings with it new challenges for researchers to provide new solutions that guarantee better Internet access for mobile hosts and networks. The globally reachable, Home-Agent-based, infrastructure Network Mobility (NEMO) and the local, multi-hop, infrastructure-less Mobile Ad hoc Network (MANET), both developed by the Internet Engineering Task Force (IETF), support different topologies of mobile networks. A new architecture, Mobile Ad Hoc NEMO (MANEMO), was proposed by combining both topologies. However, the integration of NEMO and MANET introduces many challenges, such as network loops, sub-optimal routes, the redundant tunnel problem, absence of communication when the Home Agent is not reachable, and exit router selection when multiple Exit Routers to the Internet exist. This paper reviews the different models proposed for implementing the gateway selection mechanism and highlights the strengths as well as the limitations of these approaches.

  1. Green Infrastructure Modeling Tools

    Science.gov (United States)

    Modeling tools support planning and design decisions on a range of scales from setting a green infrastructure target for an entire watershed to designing a green infrastructure practice for a particular site.

  2. Timing Calibration of the NEMO Optical Sensors

    Science.gov (United States)

    Circella, M.; de Marzo, C.; Megna, R.; Ruppi, M.

    2006-04-01

    This paper describes the timing calibration system for the NEMO underwater neutrino telescope. The NEMO Project aims at the construction of a km3 detector, equipped with a large number of photomultipliers, in the Mediterranean Sea. We foresee a redundant system to perform the time calibration of our apparatus: 1) A two-step procedure for measuring the offsets in the time measurements of the NEMO optical sensors, so as to measure separately the time delay for the synchronization signals to reach the offshore electronics and the response time of the photomultipliers to calibration signals delivered from optical pulsers through an optical fibre distribution system; 2) an all-optical procedure for measuring the differences in the time offsets of the different optical modules illuminated by calibration pulses. Such a system can be extended to work for a very large apparatus, even for complex arrangements of widely spaced sensors. The NEMO prototyping activities ongoing at a test site off the coast of Sicily will allow the system described in this work to be operated and tested in situ next year.

  3. Population Density Modeling Tool

    Science.gov (United States)

    2012-06-26

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, NAWCADPAX/TR-2012/194, Patuxent River, Maryland, 26 June 2012.

  4. Tracking Nemo: Help Scientists Understand Zebrafish Behavior.

    Science.gov (United States)

    Tolbert, Tyrone J; Nakayama, Shinnosuke; Porfiri, Maurizio

    2018-02-22

    The advent of automated tracking software has significantly reduced the time required to record movement trajectories, thereby facilitating behavioral studies of zebrafish. However, results are substantially influenced by tracking errors, such as loss and misidentification of individuals. In this study, we present the development of an online citizen science platform, Tracking Nemo, to improve data accuracy on swimming trajectories of zebrafish groups. As an online extension of software for tracking the position of zebrafish from video recordings, Tracking Nemo offers volunteers the opportunity to contribute to science by manually correcting tracked trajectory data from their personal computers. Researchers can upload their videos that require human intervention for correcting and validating the data. Citizen scientists can monitor their contributions through a leaderboard system, which is designed to strengthen participant retention and contribution by tapping into intrinsic and extrinsic motivations. Tracking Nemo is expected to help scientists improve data accuracy through the involvement of citizen scientists, who, in turn, engage in an authentic research activity and learn more about the behavior of zebrafish.

  5. AlignNemo: a local network alignment method to integrate homology and topology.

    Directory of Open Access Journals (Sweden)

    Giovanni Ciriello

    Local network alignment is an important component of the analysis of protein-protein interaction networks that may lead to the identification of evolutionarily related complexes. We present AlignNemo, a new algorithm that, given the networks of two organisms, uncovers subnetworks of proteins that are related in biological function and topology of interactions. The discovered conserved subnetworks have a general topology and need not correspond to specific interaction patterns, so that they more closely fit the models of functional complexes proposed in the literature. The algorithm is able to handle sparse interaction data with an expansion process that at each step explores the local topology of the networks beyond the proteins directly interacting with the current solution. To assess the performance of AlignNemo, we ran a series of benchmarks using statistical measures as well as biological knowledge. Based on reference datasets of protein complexes, AlignNemo shows better performance than other methods in terms of both precision and recall. We show our solutions to be biologically sound using the concept of semantic similarity applied to Gene Ontology vocabularies. The binaries of AlignNemo and supplementary details about the algorithms and the experiments are available at: sourceforge.net/p/alignnemo.

  6. NEMO educational kit on micro-optics at the secondary school

    Science.gov (United States)

    Flores-Arias, M. T.; Bao-Varela, Carmen

    2014-07-01

    NEMO was the "Network of Excellence in Micro-Optics", funded under the Sixth Framework Programme of the European Union. It aimed at providing Europe with a complete micro-optics food chain, by setting up centres for optical modelling and design; measurement and instrumentation; mastering, prototyping and replication; integration and packaging; and reliability and standardization. More than 300 researchers from 30 groups in 12 countries participated in the project. One of the objectives of NEMO was to spread excellence and disseminate knowledge on micro-optics and micro-photonics. To convince pupils, from secondary school level onwards, of the crucial role of light and micro-optics and the opportunities this combination holds, several partners of NEMO collaborated to create an Educational Kit. In Spain, the partner involved in this effort was the "Micro-optics and GRIN Optics Group" at the University of Santiago de Compostela (USC). The educational kits provided to secondary schools consisted of two plastic cards with the following micro-optical elements: different kinds of diffractive optical elements (DOEs) and refractive optical elements (ROEs), namely arrays of micro-lenses. The kit also included a DVD with a handbook for performing the experiments, as well as a laser pointer source. This kit was distributed free of charge in the countries with partners in NEMO. In particular, in Spain it was offered to around 200 secondary school centres, of which only 80 replied accepting to evaluate the kit.

  7. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Science.gov (United States)

    Billeci, Lucia; Magliaro, Chiara; Pioggia, Giovanni; Ahluwalia, Arti

    2013-01-01

    Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so do the amount and complexity of data generated. The NEuronMOrphological analysis tool NEMO was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis (PCA). Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism. PMID:23420185

  8. The cartoon "Finding Nemo" makes children into fish killers

    Index Scriptorium Estoniae

    2004-01-01

    Animated film "Finding Nemo" ("Kalapoeg Nemo"): director Andrew Stanton: United States 2003. After watching the film, thousands of children released their aquarium fish into the wild, causing the fish to perish or creating environmental problems

  9. PORFIDO on the NEMO Phase 2 tower

    Energy Technology Data Exchange (ETDEWEB)

    Ciaffoni, Orlando; Cordelli, Marco; Habel, Roberto; Martini, Agnese; Trasatti, Luciano [INFN-Laboratori Nazionali di Frascati, Via E. Fermi 40, I-00044 Frascati (RM) (Italy)

    2014-11-18

    We have designed and built an underwater measurement system, PORFIDO (Physical Oceanography by RFID Outreach), to gather oceanographic data from the Optical Modules of a neutrino telescope with a minimum of disturbance to the main installation. PORFIDO is composed of a sensor glued to the outside of an Optical Module, in contact with seawater, and of a reader placed inside the sphere, facing the sensor. Data are transmitted to the reader through the glass by RFID and to shore in real time for periods of years. The sensor gathers power from the radio frequency, thus eliminating the need for batteries or connectors through the glass. We deployed four PORFIDO probes measuring temperatures with the NEMO-KM3NeT-Italy Phase 2 tower in April 2013. The four probes are operative and are transmitting temperature data from 3500 m depth.

  10. PORFIDO on the NEMO Phase 2 tower

    International Nuclear Information System (INIS)

    Ciaffoni, Orlando; Cordelli, Marco; Habel, Roberto; Martini, Agnese; Trasatti, Luciano

    2014-01-01

    We have designed and built an underwater measurement system, PORFIDO (Physical Oceanography by RFID Outreach), to gather oceanographic data from the Optical Modules of a neutrino telescope with a minimum of disturbance to the main installation. PORFIDO is composed of a sensor glued to the outside of an Optical Module, in contact with seawater, and of a reader placed inside the sphere, facing the sensor. Data are transmitted to the reader through the glass by RFID and to shore in real time for periods of years. The sensor gathers power from the radio frequency, thus eliminating the need for batteries or connectors through the glass. We deployed four PORFIDO probes measuring temperatures with the NEMO-KM3NeT-Italy Phase 2 tower in April 2013. The four probes are operative and are transmitting temperature data from 3500 m depth

  11. Low power electronics for NEMO detector

    International Nuclear Information System (INIS)

    Lo Presti, Domenico

    2000-01-01

    For the realization of the submarine detector NEMO it is necessary to design an acquisition system that is able to capture the signals coming from the photomultipliers (PMs) of the optical modules (OMs) and to satisfy several specifications: low power consumption; few submarine interconnections, for reliability and simplicity of deployment; flexibility of the system; minimum dead time; high dynamic range; and accuracy sufficient for good resolution. Here we present a Very Large Scale Integration full-custom solution for the OMs according to these requirements. It foresees the use of a switched-capacitor analog memory, a trigger and single-photon classification system, a PLL and a control unit able to manage the different operation states of the whole system. For such a system we foresee a power dissipation of no more than 200-300 mW in each OM, a 20-bit dynamic range and a dead time of about 0.1%

  12. Development of an optical simulation for the SuperNEMO calorimeter

    Science.gov (United States)

    Huber, Arnaud; SuperNEMO Collaboration

    2017-09-01

    The SuperNEMO double beta decay project is a modular tracker-calorimeter based experiment. The aim of this project is to reach a sensitivity of the order of 10^26 years on the neutrinoless double beta decay half-life, corresponding to a Majorana neutrino mass of 50-100 meV. The main calorimeter of the SuperNEMO demonstrator is based on 520 Optical Modules made of large-volume plastic scintillators (10 L) coupled with large-area photomultipliers (Hamamatsu R5912-MOD and R6594). The design of the calorimeter is optimized for double beta decay detection and allows gamma tagging for background rejection. In large volumes of scintillator, the same energy deposited by electrons or photons gives different visible energy and signal shapes, owing to their different interactions inside the scintillator. The aim of the optical simulation developed for SuperNEMO is to model the Optical Module response in energy and time as a function of particle type.

  13. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  14. NF-κB Essential MOdulator (NEMO) Is Critical for Thyroid Function

    OpenAIRE

    Reale, Carla; Iervolino, Anna; Scudiero, Ivan; Ferravante, Angela; D'Andrea, Luca Egildo; Mazzone, Pellegrino; Zotti, Tiziana; Leonardi, Antonio; Roberto, Luca; Zannini, Mariastella; de Cristofaro, Tiziana; Shanmugakonar, Muralitharan; Capasso, Giovambattista; Pasparakis, Manolis; Vito, Pasquale

    2016-01-01

    The I-κB kinase (IKK) subunit NEMO/IKKγ (NEMO) is an adapter molecule that is critical for canonical activation of NF-κB, a pleiotropic transcription factor controlling immunity, differentiation, cell growth, tumorigenesis, and apoptosis. To explore the functional role of canonical NF-κB signaling in thyroid gland differentiation and function, we have generated a murine strain bearing a genetic deletion of the NEMO locus in thyroid. Here we show that thyrocyte-specific NEMO knock-out mice gra...

  15. Data transmission and acquisition in NEMO

    Energy Technology Data Exchange (ETDEWEB)

    Bunkheila, G. [Istituto Nazionale di Fisica Nucleare (INFN), sez. Roma 1, Marconi Building, University of Rome ' La Sapienza' , P.le Aldo Moro 2 - 00185 (Italy)]. E-mail: gabriele.bunkheila@gmail.com

    2006-11-15

    A comprehensive system for data transmission and acquisition has been developed for an 'a la NEMO' underwater neutrino telescope based on Cherenkov light detection using photomultipliers (PMTs) as sensors. Signals generated by each sensor are triggered, sampled and tagged by an electronics board called the Front End Module (FEM). Data streams from up to eight FEMs located on one tower floor are collected by a concentration board called the Floor Control Module (FCM) and sent, through a DWDM-compliant optical fibre and using a self-synchronous serial protocol, to a twin FCM board located at the onshore station and plugged into an interface machine (FCM Interface, or FCMI) via a PCI bus. All sensor data reach the onshore lab through the FCMI, where they are made available to subsequent elaboration processes, such as time-wise alignment and muon track event triggering. To meet the requirements of the latter, onshore data unpacking is carried out with respect to their topological origin. The system promised, and keeps showing, very low demands on power consumption and infrastructure complexity, while having recently proved to perform at high levels in its optical part.

  16. Expanding the substantial interactome of NEMO using protein microarrays.

    LENUS (Irish Health Repository)

    Fenner, Beau J

    2010-01-01

    Signal transduction by the NF-kappaB pathway is a key regulator of a host of cellular responses to extracellular and intracellular messages. The NEMO adaptor protein lies at the top of this pathway and serves as a molecular conduit, connecting signals transmitted from upstream sensors to the downstream NF-kappaB transcription factor and subsequent gene activation. The position of NEMO within this pathway makes it an attractive target from which to search for new proteins that link NF-kappaB signaling to additional pathways and upstream effectors. In this work, we have used protein microarrays to identify novel NEMO interactors. A total of 112 protein interactors were identified, with the most statistically significant hit being the canonical NEMO interactor IKKbeta, with IKKalpha also being identified. Of the novel interactors, more than 30% were kinases, while at least 25% were involved in signal transduction. Binding of NEMO to several interactors, including CALB1, CDK2, SAG, SENP2 and SYT1, was confirmed using GST pulldown assays and coimmunoprecipitation, validating the initial screening approach. Overexpression of CALB1, CDK2 and SAG was found to stimulate transcriptional activation by NF-kappaB, while SYT1 overexpression repressed TNFalpha-dependent NF-kappaB transcriptional activation in human embryonic kidney cells. Corresponding with this finding, RNA silencing of CDK2, SAG and SENP2 reduced NF-kappaB transcriptional activation, supporting a positive role for these proteins in the NF-kappaB pathway. The identification of a host of new NEMO interactors opens up new research opportunities to improve understanding of this essential cell signaling pathway.

  17. Neutrino Physics without Neutrinos: Recent results from the NEMO-3 experiment and plans for SuperNEMO

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The observation of neutrino oscillations has proved that neutrinos have mass. This discovery has renewed and strengthened the interest in neutrinoless double beta decay experiments, which provide the only practical way to determine whether neutrinos are Majorana or Dirac particles. The recently completed NEMO-3 experiment, located in the Laboratoire Souterrain de Modane in the Frejus Tunnel, searched for neutrinoless double beta decays using a powerful technique for detecting a two-electron final state, employing an apparatus combining tracking, calorimetry, and time-of-flight measurements. We will present the latest results from NEMO-3 and will discuss the status of SuperNEMO, the next-generation experiment that will exploit the same experimental technique to extend the sensitivity of the current search.

  18. Nutritional education for management of osteodystrophy (NEMO) trial: Design and patient characteristics, Lebanon.

    Science.gov (United States)

    Karavetian, Mirey; Abboud, Saade; Elzein, Hafez; Haydar, Sarah; de Vries, Nanne

    2014-02-01

    This study aims to determine the effect of a trained dedicated dietitian on clinical outcomes among Lebanese hemodialysis (HD) patients, and thus to demonstrate a viable developing-country model. This paper describes the study protocol and baseline data. The study was a multicenter randomized controlled trial with parallel-group design involving 12 HD units, assigned to cluster A (n = 6) or B (n = 6). A total of 570 patients met the inclusion criteria. Patients in cluster A were randomly assigned per dialysis shift to the following: Dedicated Dietitian (DD) (n = 133) and Existing Practice (EP) (n = 138) protocols. Cluster B patients (n = 299) received the Trained Hospital Dietitian (THD) protocol. Dietitians of the DD and THD groups were trained by the research team on Kidney Disease Outcomes Quality Initiative nutrition guidelines. The DD protocol included individualized nutrition education for 2 hours/month per HD patient for 6 months, focusing on renal osteodystrophy and using the Trans-theoretical theory for behavioral change. The EP protocol included nutrition education given to patients by hospital dietitians who were blinded to the study. The THD protocol included nutrition education given to patients by the hospital dietitian as per the training received, but within hospital responsibilities, with no set educational protocol or tools. Baseline data revealed that 40% of patients were hyperphosphatemic (> 5.5 mg/dl), with low dietary adherence and low knowledge of dietary P restriction, in addition to inadequate daily protein intake (58.86% ± 33.87% of needs) yet adequate dietary P intake (795.52 ± 366.94 mg/day). Quality of life (QOL) ranged from 48-75% of full health. Baseline differences between the 3 groups revealed significant differences in serum P, malnutrition status, adherence to diet and P chelators, and in 2 factors of the QOL: physical and social functioning. The data show room for improvement in the nutritional status of the patients. The NEMO trial may be able to

  19. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as subsequent decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers through the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and the Architectural Decision Knowledge Wiki.

  20. Quantification of cellular NEMO content and its impact on NF-κB activation by genotoxic stress.

    Directory of Open Access Journals (Sweden)

    Byounghoon Hwang

    NF-κB essential modulator (NEMO) plays a key role in canonical NF-κB signaling induced by a variety of stimuli, including cytokines and genotoxic agents. To dissect the different biochemical and functional roles of NEMO in NF-κB signaling, various mutant forms of NEMO have been previously analyzed. However, transient or stable overexpression of wild-type NEMO can significantly inhibit NF-κB activation, thereby confounding the analysis of NEMO mutant phenotypes. What levels of NEMO overexpression lead to such an artifact, and what levels are tolerated with no significant impact on NEMO function in NF-κB activation, are currently unknown. Here we purified full-length recombinant human NEMO protein and used it as a standard to quantify the average number of NEMO molecules per cell in a 1.3E2 NEMO-deficient murine pre-B cell clone stably reconstituted with full-length human NEMO (C5). We determined that the C5 cell clone has an average of 4 × 10^5 molecules of NEMO per cell. Stable reconstitution of 1.3E2 cells with different numbers of NEMO molecules per cell demonstrated that a 10-fold range of NEMO expression (0.6–6 × 10^5 molecules per cell) yields statistically equivalent NF-κB activation in response to the DNA-damaging agent etoposide. Using the C5 cell line, we also quantified the number of NEMO molecules per cell in several commonly employed human cell lines. These results establish baseline numbers of endogenous NEMO per cell and highlight surprisingly normal functionality of NEMO in the DNA damage pathway over a wide range of expression levels, providing a guideline for future NEMO reconstitution studies.
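
    As an illustration of the arithmetic behind such quantification, the sketch below converts a NEMO mass measured against a recombinant standard into molecules per cell; the mass and cell numbers used are hypothetical, chosen only to land near the reported 4 × 10^5 figure.

```python
# Hypothetical worked example: converting a measured NEMO mass per cell
# into molecules per cell, as in quantitative blotting against a
# recombinant protein standard. The input numbers are illustrative only.

AVOGADRO = 6.022e23      # molecules per mole
NEMO_MW = 48_000.0       # g/mol; human NEMO is roughly 48 kDa

def molecules_per_cell(nemo_mass_g: float, n_cells: float) -> float:
    """Molecules per cell from total NEMO mass (g) in a lysate of n_cells."""
    moles = nemo_mass_g / NEMO_MW
    return moles * AVOGADRO / n_cells

# e.g. 32 ng of NEMO detected in a lysate of 1e6 cells (hypothetical):
print(f"{molecules_per_cell(32e-9, 1e6):.2e} molecules/cell")
# -> ~4.0e+05, i.e. on the order of the 4 x 10^5 reported for clone C5
```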

  1. NF-κB Essential Modulator (NEMO) Is Critical for Thyroid Function*

    Science.gov (United States)

    Reale, Carla; Iervolino, Anna; Scudiero, Ivan; Ferravante, Angela; D'Andrea, Luca Egildo; Mazzone, Pellegrino; Zotti, Tiziana; Leonardi, Antonio; Roberto, Luca; Zannini, Mariastella; de Cristofaro, Tiziana; Shanmugakonar, Muralitharan; Capasso, Giovambattista; Pasparakis, Manolis; Vito, Pasquale; Stilo, Romania

    2016-01-01

    The IκB kinase (IKK) subunit NEMO/IKKγ is an adapter molecule that is critical for canonical activation of NF-κB, a pleiotropic transcription factor controlling immunity, differentiation, cell growth, tumorigenesis, and apoptosis. To explore the functional role of canonical NF-κB signaling in thyroid gland differentiation and function, we have generated a murine strain bearing a genetic deletion of the NEMO locus in the thyroid. Here we show that thyrocyte-specific NEMO knock-out mice gradually develop hypothyroidism after birth, which leads to reduced body weight and a shortened life span. Histological and molecular analyses indicate that the absence of NEMO in thyrocytes results in a dramatic loss of thyroid gland cellularity, associated with down-regulation of thyroid differentiation markers and ongoing apoptosis. Thus, NEMO-dependent signaling is essential for normal thyroid physiology. PMID:26786105

  2. Recent results and perspectives of the NEMO project

    Science.gov (United States)

    Capone, A.; Aiello, S.; Aloisio, A.; Ameli, F.; Amore, I.; Anghinolfi, M.; Anzalone, A.; Barbarino, G.; Barbarito, E.; Battaglieri, M.; Bazzotti, M.; Bellotti, R.; Bersani, A.; Beverini, N.; Biagi, S.; Bonori, M.; Bouhdaef, B.; Brescia, M.; Cacopardo, G.; Calì, C.; Caponetto, L.; Carminati, G.; Cassano, B.; Castorina, E.; Ceres, A.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Cordelli, M.; Costa, M.; D'Amico, A.; D'Amato, C.; D'Amato, V.; De Bonis, G.; De Rosa, G.; De Ruvo, G.; De Vita, R.; Distefano, C.; Falchini, E.; Flaminio, V.; Fratini, K.; Gabrielli, A.; Galeotti, S.; Gandolfi, E.; Giacomelli, G.; Giorgi, F.; Giovanetti, G.; Grimaldi, A.; Grmek, A.; Habel, R.; Leonora, E.; Lonardo, A.; Longo, G.; Lo Presti, D.; Lucarelli, F.; Maccione, L.; Margiotta, A.; Marinelli, A.; Martini, A.; Masullo, R.; Maugeri, F.; Megna, R.; Migneco, E.; Minutoli, S.; Mongelli, M.; Morganti, M.; Musico, P.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Osipenko, M.; Osteria, G.; Papaleo, R.; Pappalardo, V.; Petta, C.; Piattelli, P.; Piombo, D.; Raffaelli, F.; Raia, G.; Randazzo, N.; Reito, S.; Ricco, G.; Riccobene, G.; Ripani, M.; Rovelli, A.; Ruppi, M.; Russo, G. V.; Russo, S.; Sapienza, P.; Sedita, M.; Shirokov, E.; Simeone, F.; Sipala, V.; Sollima, C.; Speziale, F.; Spurio, M.; Stefani, F.; Taiuti, M.; Terreni, G.; Trasatti, L.; Urso, S.; Valente, V.; Vecchi, M.; Vicini, P.; Wischnewski, R.

    2009-04-01

    The latest results and the activities towards the realization of a km³ Cherenkov neutrino detector carried out by the NEMO Collaboration are described. The realization of the Phase-1 project has validated all relevant technologies proposed for the realization of the km³ detector on a test site at 2000 m depth. The realization of a new infrastructure on the candidate Capo Passero site (for the Phase-2 project) will provide the possibility to test detector components at 3500 m depth.

  3. NEMO on the shelf: assessment of the Iberia–Biscay–Ireland configuration

    Directory of Open Access Journals (Sweden)

    C. Maraldi

    2013-08-01

    This work describes the design and validation of a high-resolution (1/36°) ocean forecasting model over the Iberian–Biscay–Irish (IBI) area. The system has been set up using the NEMO model (Nucleus for European Modelling of the Ocean). New developments have been incorporated in NEMO to make it suitable for open-ocean as well as coastal-ocean modelling. In this paper, we pursue three main objectives: (1) to give an overview of the model configuration used for the simulations; (2) to give a broad-brush account of one particular aspect of this work, namely consistency verification; this type of validation is conducted upstream of the implementation of the system, before it is used for production and routinely validated, and is meant to guide model development by identifying gross deficiencies in the modelling of several key physical processes; and (3) to show that such a regional modelling system has potential as a complement to patchy observations (an integrated approach), giving information on non-observed physical quantities and providing links between observations by identifying broader-scale patterns and processes. We concentrate on the year 2008. We first provide domain-wide consistency verification results in terms of barotropic tides, transports, sea surface temperature and stratification. We then focus on two dynamical subregions: the Celtic shelves, and the Bay of Biscay slope and deep regions. The model–data consistency is checked for variables and processes such as tidal currents, tidal fronts, internal tides and residual elevation. We also examine the representation in the model of a seasonal pattern of the Bay of Biscay circulation: the warm extension of the Iberian Poleward Current along the northern Spanish coast (the Navidad event) in the winter of 2007–2008.
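
    Consistency verification of this kind ultimately reduces to computing co-location statistics between model fields and observations. Below is a minimal sketch with made-up SST values; it is a generic illustration, not the actual IBI validation code.

```python
import numpy as np

def consistency_stats(model: np.ndarray, obs: np.ndarray):
    """Bias and RMSE of co-located model and observed values (e.g. SST)."""
    valid = ~np.isnan(model) & ~np.isnan(obs)   # ignore missing data
    diff = model[valid] - obs[valid]
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Hypothetical co-located SST values (degC):
sst_model = np.array([15.2, 15.8, 16.1, np.nan, 14.9])
sst_obs   = np.array([15.0, 16.0, 15.9, 15.5,   14.7])
bias, rmse = consistency_stats(sst_model, sst_obs)
print(f"bias = {bias:+.2f} degC, RMSE = {rmse:.2f} degC")
```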

  4. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used to validate the tool's correctness. A very close correspondence between the performance of the field trial and that predicted by the modeling tool has been observed.
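
    The abstract does not state the estimation model, but as a point of reference, the textbook pre-FEC BER of a Gray-coded (PM-)QPSK signal in additive white Gaussian noise is

    \[ \mathrm{BER} \approx \tfrac{1}{2}\,\operatorname{erfc}\!\left(\sqrt{E_b/N_0}\right), \]

    where practical tools map a measured OSNR or EVM onto an equivalent \(E_b/N_0\) before applying such a relation.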

  5. Nano-electromechanical oscillators (NEMOs) for RF technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Joel Robert; Czaplewski, David A.; Gibson, John Murray (Argonne National Laboratory, Argonne, IL); Webster, James R.; Carton, Andrew James; Keeler, Bianca Elizabeth Nelson; Carr, Dustin Wade; Friedmann, Thomas Aquinas; Tallant, David Robert; Boyce, Brad Lee; Sullivan, John Patrick; Dyck, Christopher William; Chen, Xidong (Cedarville University, Cedarville, OH)

    2004-12-01

    Nano-electromechanical oscillators (NEMOs), capacitively-coupled radio frequency (RF) MEMS switches incorporating dissipative dielectrics, new processing technologies for tetrahedral amorphous carbon (ta-C) films, and scientific understanding of dissipation mechanisms in small mechanical structures were developed in this project. NEMOs are defined as mechanical oscillators with critical dimensions of 50 nm or less and resonance frequencies approaching 1 GHz. Target applications for these devices include simple, inexpensive clocks in electrical circuits, passive RF electrical filters, or platforms for sensor arrays. Ta-C NEMO arrays were used to demonstrate a novel optomechanical structure that shows remarkable sensitivity to small displacements (better than 160 fm/√Hz) and suitability as an extremely sensitive accelerometer. The RF MEMS capacitively-coupled switches used ta-C as a dissipative dielectric. The devices showed a unipolar switching response to a unipolar stimulus, indicating the absence of significant dielectric charging, which has historically been the major reliability issue with these switches. This technology is promising for the development of reliable, low-power RF switches. An excimer laser annealing process was developed that permits full in-plane stress relaxation in ta-C films in air under ambient conditions, permitting the application of stress-reduced ta-C films in areas where a low thermal budget is required, e.g. MEMS integration with pre-existing CMOS electronics. Studies of mechanical dissipation in micro- and nano-scale ta-C mechanical oscillators at room temperature revealed that mechanical losses are limited by dissipation associated with mechanical relaxation in a broad spectrum of defects with activation energies for mechanical relaxation ranging from 0.35 eV to over 0.55 eV. This work has established a foundation for the creation of devices based on nanomechanical structures, and outstanding critical research areas that need …

  6. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified list of tools is one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, there was a sufficiently high level of agreement to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels, and i…

  8. Nemo-3 experiment assets and limitations. Perspective for the double β physics; Experience Nemo 3 avantage et limitations. Prospective pour la physique double β

    Energy Technology Data Exchange (ETDEWEB)

    Augier, C

    2005-06-15

    After an introduction to this report in Chapter 1, I present a status of our knowledge in neutrino physics in Chapter 2. Then, I detail in Chapter 3 all the choices made for the design and realisation of the NEMO 3 detector for the search for the double beta decay process. The performance of the detector is presented, concerning both its capacity to identify the backgrounds and its ability to study all the ββ processes. I also explain the methods chosen by the NEMO collaboration to reduce the radon activity inside the detector and to make this background negligible today. This chapter, which is written in English, is the 'Technical report of the NEMO 3 detector' and forms an independent report for the NEMO collaborators. I finish this report in Chapter 4 with a ten-year prospect for experimental projects in physics, covering both the SuperNEMO project and its experimental program, and comparing the most interesting experiments, CUORE and GERDA, showing as an example the effect of nuclear matrix elements on the neutrino effective mass measurement. (author)

  9. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere–ocean model called ModelE, written in fixed-format Fortran.

  10. ANSYS tools in modeling tires

    Science.gov (United States)

    Ali, Ashraf; Lovell, Michael

    1995-01-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.
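
    As a concrete example of such a hyperelastic constitutive relationship (the summary does not say which form a given tire model uses), the two-parameter Mooney–Rivlin strain-energy function for nearly incompressible rubber can be written

    \[ W = C_{10}\,(\bar{I}_1 - 3) + C_{01}\,(\bar{I}_2 - 3) + \frac{1}{d}\,(J - 1)^2, \]

    where \(\bar{I}_1, \bar{I}_2\) are the deviatoric strain invariants, \(J\) the volume ratio, \(C_{10}, C_{01}\) material constants, and \(d\) an incompressibility parameter; the near-incompressibility of rubber (\(J \approx 1\)) is exactly what makes the volumetric term numerically delicate.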

  11. GeNemo: a search engine for web-based functional genomic data.

    Science.gov (United States)

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-07-08

    A set of new data types has emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundreds of bases to hundreds of thousands of bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
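
    GeNemo's actual matching relies on an MCMC-based maximization, which the abstract does not detail; the sketch below only conveys the flavor of pattern matching on genomic regions, scoring two BED-like peak sets by their base-pair Jaccard overlap.

```python
def jaccard(peaks_a, peaks_b):
    """Base-pair Jaccard index of two sorted, non-overlapping interval
    lists of (start, end) tuples on the same chromosome."""
    def total(peaks):
        return sum(e - s for s, e in peaks)
    inter = 0
    i = j = 0
    while i < len(peaks_a) and j < len(peaks_b):
        s = max(peaks_a[i][0], peaks_b[j][0])
        e = min(peaks_a[i][1], peaks_b[j][1])
        inter += max(0, e - s)
        # advance whichever interval ends first
        if peaks_a[i][1] < peaks_b[j][1]:
            i += 1
        else:
            j += 1
    union = total(peaks_a) + total(peaks_b) - inter
    return inter / union if union else 0.0

query   = [(100, 200), (500, 650)]
dataset = [(150, 250), (600, 700)]
print(f"Jaccard similarity: {jaccard(query, dataset):.3f}")
```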

  12. Measurement of the 130Te double beta decay process in the NEMO-3 experiment - R and D of the SuperNEMO project: study of the BiPo detector

    International Nuclear Information System (INIS)

    Bongrand, M.

    2008-09-01

    This thesis contains two parts: data analysis of the NEMO-3 experiment and a study of a BiPo detector for the SuperNEMO project. NEMO-3 searches for the neutrinoless double beta decay process (2β0ν) through direct detection of the two emitted electrons by a tracking detector coupled to a calorimeter. I studied the backgrounds in several analysis channels and obtained the most accurate measurement to date of the allowed two-neutrino process for ^130Te: T1/2(2ν) = (6.1 ± 1.2 (stat) ± 0.6 (syst)) × 10^20 years. This result provides good knowledge of the ultimate 2β2ν background for the 2β0ν search and helps to constrain or check theoretical calculations of the nuclear matrix elements, which have to be known with good precision to determine the effective neutrino mass in case of a 2β0ν observation. From the NEMO-3 data, I also set a limit on the neutrinoless decay of ^130Te: T1/2(0ν) > 6.3 × 10^22 years. Due to the low mass of ^130Te contained in NEMO-3 (454 g), this result is not competitive with the limit recently published by CUORICINO for this isotope, T1/2(0ν) > 3.0 × 10^24 years, and the corresponding bound on the effective neutrino mass. The SuperNEMO project aims at sensitivities of the order of T1/2(0ν) > 10^26 years, using the NEMO-3 detection principle but improving efficiency, radio-purity and energy resolution and reducing backgrounds. The dominant background will then be the natural radioactive contaminations inside the source foils, so the SuperNEMO specifications on source-foil radio-purity place very stringent limits on the ^208Tl and ^214Bi activities. The BiPo detector is designed to measure these ^208Tl and ^214Bi contaminations, using identification of the Bi → Po decay chains. The source foil to be measured is placed between two scintillator planes allowing energy and time measurements. I studied the BiPo-1 prototype, showed its technical feasibility, validated the principle and determined the sensitivity of the source measurement compared to backgrounds. Data analysis of BiPo-1 showed the possibility to measure 5 μBq/kg of ^208Tl with the final BiPo detector. This result is not so far from the SuperNEMO …
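
    For reference, the half-life/effective-mass link mentioned above is conventionally written, assuming light Majorana neutrino exchange, as

    \[ \left[T^{0\nu}_{1/2}\right]^{-1} = G^{0\nu}\,\left|M^{0\nu}\right|^{2}\left(\frac{\langle m_{\beta\beta}\rangle}{m_e}\right)^{2}, \]

    where \(G^{0\nu}\) is a calculable phase-space factor, \(M^{0\nu}\) the nuclear matrix element and \(m_e\) the electron mass; the spread among matrix-element calculations is why a single half-life limit maps onto a range of effective-mass limits.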

  13. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium- and long-distance modes (cars, vans, trucks, train, inland …

  14. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    This work presents a system-level design methodology and its supporting open-source tool chain, called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we …

  15. Epithelial NEMO/IKKγ limits fibrosis and promotes regeneration during pancreatitis.

    Science.gov (United States)

    Chan, Lap Kwan; Gerstenlauer, Melanie; Konukiewitz, Björn; Steiger, Katja; Weichert, Wilko; Wirth, Thomas; Maier, Harald Jakob

    2017-11-01

    Inhibitory κB kinase (IKK)/nuclear factor κB (NF-κB) signalling has been implicated in the pathogenesis of pancreatitis, but its precise function has remained controversial. Here, we analyse the contribution of IKK/NF-κB signalling in epithelial cells to the pathogenesis of pancreatitis by targeting the IKK subunit NF-κB essential modulator (NEMO) (IKKγ), which is essential for canonical NF-κB activation. Mice with a targeted deletion of NEMO in the pancreas were subjected to caerulein pancreatitis. Pancreata were examined at several time points and analysed for inflammation, fibrosis, cell death, cell proliferation, as well as cellular differentiation. Human samples were used to corroborate findings established in mice. In acute pancreatitis, NEMO deletion in the pancreatic parenchyma resulted in minor changes during the early phase but led to the persistence of inflammatory and fibrotic foci in the recovery phase. In chronic pancreatitis, NEMO deletion aggravated inflammation and fibrosis, inhibited compensatory acinar cell proliferation, and enhanced acinar atrophy and acinar-ductal metaplasia. Gene expression analysis revealed sustained activation of profibrogenic genes and the CXCL12/CXCR4 axis in the absence of epithelial NEMO. In human chronic pancreatitis samples, the CXCL12/CXCR4 axis was activated as well, with CXCR4 expression correlating with the degree of fibrosis. The aggravating effects of NEMO deletion were attenuated by the administration of the CXCR4 antagonist AMD3100. Our results suggest that NEMO in epithelial cells exerts a protective effect during pancreatitis by limiting inflammation and fibrosis and improving acinar cell regeneration. The CXCL12/CXCR4 axis is an important mediator of that effect and may also be of importance in human chronic pancreatitis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  16. Porcine deltacoronavirus nsp5 inhibits interferon-β production through the cleavage of NEMO.

    Science.gov (United States)

    Zhu, Xinyu; Fang, Liurong; Wang, Dang; Yang, Yuting; Chen, Jiyao; Ye, Xu; Foda, Mohamed Frahat; Xiao, Shaobo

    2017-02-01

    Porcine deltacoronavirus (PDCoV) causes acute enteric disease and mortality in seronegative neonatal piglets. We previously demonstrated that PDCoV infection suppresses the production of interferon-beta (IFN-β), but the detailed mechanisms remained poorly understood. Here, we demonstrate that nonstructural protein 5 (nsp5) of PDCoV, the 3C-like protease, significantly inhibits Sendai virus (SEV)-induced IFN-β production by targeting the NF-κB essential modulator (NEMO), as confirmed by the diminished function of NEMO cleaved by PDCoV. The PDCoV nsp5 cleavage site in the NEMO protein was identified as glutamine 231 and is identical to the porcine epidemic diarrhea virus nsp5 cleavage site, revealing the likelihood of a common target in NEMO for coronaviruses. Furthermore, this cleavage impaired the ability of NEMO to activate the IFN response and downstream signaling. Taken together, our findings reveal PDCoV nsp5 to be a newly identified IFN antagonist and enhance the understanding of immune evasion by deltacoronaviruses. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Nemo-3 experiment assets and limitations. Perspective for the double β physics

    International Nuclear Information System (INIS)

    Augier, C.

    2005-06-01

    After an introduction to this report in Chapter 1, I present a status of our knowledge in neutrino physics in Chapter 2. Then, I detail in Chapter 3 all the choices made for the design and realisation of the NEMO 3 detector for the search for the double beta decay process. The performance of the detector is presented, concerning both its capacity to identify the backgrounds and its ability to study all the ββ processes. I also explain the methods chosen by the NEMO collaboration to reduce the radon activity inside the detector and to make this background negligible today. This chapter, which is written in English, is the 'Technical report of the NEMO 3 detector' and forms an independent report for the NEMO collaborators. I finish this report in Chapter 4 with a ten-year prospect for experimental projects in physics, covering both the SuperNEMO project and its experimental program, and comparing the most interesting experiments, CUORE and GERDA, showing as an example the effect of nuclear matrix elements on the neutrino effective mass measurement. (author)

  18. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of a GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases presents new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  19. Graphical tools for model selection in generalized linear models.

    Science.gov (United States)

    Murray, K; Heritier, S; Müller, S

    2013-11-10

    Model selection techniques have existed for many years; however, to date, simple, clear and effective methods of visualising the model building process are sparse. This article describes graphical methods that assist in the selection of models and the comparison of many different selection criteria. Specifically, we describe, for logistic regression, how to visualize measures of description loss and of model complexity to facilitate the model selection dilemma. We advocate the use of the bootstrap to assess the stability of selected models and to enhance our graphical tools. We demonstrate which variables are important using variable inclusion plots and show that these plots can be invaluable for the model building process. We show with two case studies how these proposed tools are useful for learning more about important variables in the data and how they can assist the understanding of the model building process. Copyright © 2013 John Wiley & Sons, Ltd.
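
    A variable inclusion plot can be sketched as follows: rerun a selection procedure on bootstrap resamples and record how often each candidate variable is retained. The toy example below is an illustration of that idea, not the authors' implementation; it uses L1-penalized logistic regression as a simple selector.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: 200 observations, 5 candidate predictors, only x0 and x1 matter.
n, p = 200, 5
X = rng.normal(size=(n, p))
logit = 1.5 * X[:, 0] - 1.0 * X[:, 1]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# Count how often each coefficient survives across bootstrap resamples.
inclusion = np.zeros(p)
B = 200
for _ in range(B):
    idx = rng.integers(0, n, size=n)            # bootstrap resample
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    model.fit(X[idx], y[idx])
    inclusion += np.abs(model.coef_[0]) > 1e-8  # selected variables

for j, frac in enumerate(inclusion / B):
    print(f"x{j}: included in {frac:.0%} of bootstrap fits")
```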

  20. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables in performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow, and the collection of appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and will contribute to our deeper understanding of how these infections occur, progress, and can be controlled and eliminated.

  1. A tool box for implementing supersymmetric models

    Science.gov (United States)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM is considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: The

  2. Design of the optical Raman amplifier for the shore station of NEMO phase 2

    Energy Technology Data Exchange (ETDEWEB)

    D' Amico, A., E-mail: damico@lns.infn.i [LNS-INFN, Via S. Sofia 62 I-95123, Catania (Italy)

    2011-01-21

    A distributed Raman amplifier system for the NEMO phase 2 project has been simulated. The simulation goal was to optimize the Raman pump wavelengths in order to maximize the gain in the spectral region extending between 1530 and 1563 nm, where the DWDM channels of the data transport system are allocated. The results of the simulated gain will be shown.
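
    The simulated system is multi-pump, but for orientation the small-signal ('on-off') gain of a single-pump distributed Raman amplifier follows the textbook relation

    \[ G_{\text{on-off}} = \exp\!\left(\frac{g_R\,P_p\,L_{\text{eff}}}{A_{\text{eff}}}\right), \qquad L_{\text{eff}} = \frac{1 - e^{-\alpha_p L}}{\alpha_p}, \]

    where \(g_R\) is the Raman gain coefficient, \(P_p\) the pump power, \(A_{\text{eff}}\) the fibre effective area and \(\alpha_p\) the attenuation at the pump wavelength; optimizing several pump wavelengths superposes shifted copies of the Raman gain profile to flatten the net gain across the 1530-1563 nm DWDM band.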

  3. Advanced energy systems and technologies (NEMO 2). Final report 1993-1998

    Energy Technology Data Exchange (ETDEWEB)

    Lund, P.; Konttinen, P. [eds.]

    1998-12-31

    NEMO2 has been the major Finnish energy research programme on advanced energy systems and technologies during 1993-1998. The main objective of the programme has been to support industrial technology development but also to increase the utilisation of wind and solar energy in Finland. The main technology fields covered are wind and solar energy. In addition, the programme has supported projects on energy storage and other small-scale energy technologies such as fuel cells that support the main technology fields chosen. NEMO2 is one of the energy research programmes of the Technology Development Centre of Finland (TEKES). The total R and D funding over the whole programme period was FIM 130 million (ECU 22 million). The public funding of the total programme costs has been 43 %. The industrial participation has been strong. International co-operation has been an important aspect in NEMO2: the programme has stimulated 24 EU-projects and participation in several IEA co-operative tasks. International funding adds nearly 20 % to the NEMO2 R and D funding. (orig.)

  4. Advanced energy systems and technologies (NEMO 2). Final report 1993-1998

    International Nuclear Information System (INIS)

    Lund, P.; Konttinen, P.

    1998-01-01

    NEMO2 has been the major Finnish energy research programme on advanced energy systems and technologies during 1993-1998. The main objective of the programme has been to support industrial technology development but also to increase the utilisation of wind and solar energy in Finland. The main technology fields covered are wind and solar energy. In addition, the programme has supported projects on energy storage and other small-scale energy technologies such as fuel cells that support the main technology fields chosen. NEMO2 is one of the energy research programmes of the Technology Development Centre of Finland (TEKES). The total R and D funding over the whole programme period was FIM 130 million (ECU 22 million). The public funding of the total programme costs has been 43 %. The industrial participation has been strong. International co-operation has been an important aspect in NEMO2: the programme has stimulated 24 EU-projects and participation in several IEA co-operative tasks. International funding adds nearly 20 % to the NEMO2 R and D funding. (orig.)

  5. Cellular automaton and elastic net for event reconstruction in the NEMO-2 experiment

    International Nuclear Information System (INIS)

    Kovalenko, V.

    1997-01-01

    A cellular automaton for track searching and an elastic net for charged particle trajectory fitting are presented. The advantages of the methods are: simplicity of the algorithms, fast and stable convergence to real tracks, and a reconstruction efficiency close to 100%. Demonstration programs are available at http://nuweb.jinr.dubna.su/LNP/NEMO using a Java-enabled browser. (orig.)
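
    The following is a deliberately minimal sketch of the cellular-automaton idea in 2D (detector layers along one axis, hit positions along the other); the real NEMO reconstruction is more elaborate, and the elastic-net fitting stage is omitted entirely.

```python
# Minimal cellular-automaton track finder: hits on adjacent layers are
# joined into segments; each segment's counter grows while a well-aligned
# neighbour exists on the previous layer; the track is read off by
# following decreasing counters from the segment with the largest one.

from itertools import product

def find_track(hits_per_layer, max_kink=0.2):
    """hits_per_layer: list of lists of x-positions, one list per layer."""
    # 1. Build all segments between adjacent layers.
    segments = []
    for layer in range(len(hits_per_layer) - 1):
        for x0, x1 in product(hits_per_layer[layer], hits_per_layer[layer + 1]):
            segments.append({"layer": layer, "x0": x0, "x1": x1, "count": 1})

    # 2. CA evolution: a segment inherits 1 + the largest counter among
    #    aligned neighbours on the previous layer that end where it starts.
    changed = True
    while changed:
        changed = False
        for s in segments:
            best = max((n["count"] for n in segments
                        if n["layer"] == s["layer"] - 1 and n["x1"] == s["x0"]
                        and abs((n["x1"] - n["x0"]) - (s["x1"] - s["x0"])) < max_kink),
                       default=0)
            if best + 1 > s["count"]:
                s["count"] = best + 1
                changed = True

    # 3. Trace back from the highest counter to collect the track.
    track, cur = [], max(segments, key=lambda s: s["count"])
    while cur:
        track.append(cur)
        cur = next((n for n in segments
                    if n["layer"] == cur["layer"] - 1 and n["x1"] == cur["x0"]
                    and n["count"] == cur["count"] - 1), None)
    pts = [(s["layer"], s["x0"]) for s in reversed(track)]
    pts.append((track[0]["layer"] + 1, track[0]["x1"]))
    return pts

# Four layers; a straight track near x = 1.0 plus noise hits.
layers = [[1.0, 3.2], [1.05, 2.0], [1.1], [1.15, 0.2]]
print(find_track(layers))   # -> [(0, 1.0), (1, 1.05), (2, 1.1), (3, 1.15)]
```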

  6. South Atlantic meridional transports from NEMO-based simulations and reanalyses

    Science.gov (United States)

    Mignac, Davi; Ferreira, David; Haines, Keith

    2018-02-01

    The meridional heat transport (MHT) of the South Atlantic plays a key role in the global heat budget: it is the only equatorward basin-scale ocean heat transport and it sets the northward direction of the global cross-equatorial transport. Its strength and variability, however, are not well known. The South Atlantic transports are evaluated for four state-of-the-art global ocean reanalyses (ORAs) and two free-running models (FRMs) in the period 1997-2010. All products employ the Nucleus for European Modelling of the Ocean (NEMO) model, and the ORAs share very similar configurations. Very few previous works have looked at ocean circulation patterns in reanalysis products, but here we show that the ORA basin interior transports are consistently improved by the assimilated in situ and satellite observations relative to the FRMs, especially in the Argo period. The ORAs also exhibit systematically higher meridional transports than the FRMs, in closer agreement with observational estimates at 35 and 11° S. However, the data assimilation impact on the meridional transports still varies greatly among the ORAs, leading to differences of up to ~8 Sv and 0.4 PW in the South Atlantic Meridional Overturning Circulation and the MHTs, respectively. We narrow this down to large inter-product discrepancies in the western boundary currents (WBCs) at both upper and deep levels, explaining up to ~85% of the inter-product differences in MHT. We show that meridional velocity differences, rather than temperature differences, in the WBCs drive ~83% of this MHT spread. These findings show that the present ocean observation network and data assimilation schemes can be used to consistently constrain the South Atlantic interior circulation but not the overturning component, which is dominated by the narrow western boundary currents. This will likely limit the effectiveness of ORA products for climate or decadal prediction studies.
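
    For reference, the MHT at a latitude y is the zonally and vertically integrated temperature transport (standard definition; the exact reference-temperature convention may differ between products):

    \[ \mathrm{MHT}(y) = \rho_0\,c_p \int_{-H}^{0}\!\int_{x_w}^{x_e} v\,\theta\;\mathrm{d}x\,\mathrm{d}z, \]

    where \(\rho_0\) is a reference density, \(c_p\) the specific heat of seawater, \(v\) the meridional velocity and \(\theta\) the potential temperature. Because the overturning component is set by the zonal integral of \(v\), errors in the narrow western boundary currents feed directly into the MHT, as the study finds.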

  7. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components, edit component parameters, save models to a web-accessible server, share saved models with the community, submit runs to an HPC system, and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged …
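
    A client session against such a layered REST API might look like the sketch below; the endpoint paths, payload fields and component names here are hypothetical illustrations, not the documented WMT interface.

```python
import requests

BASE = "https://csdms.example.edu/wmt"  # hypothetical server URL

# Fetch component metadata from the (hypothetical) database layer...
components = requests.get(f"{BASE}/db/components").json()

# ...assemble a two-component coupled model...
model = {
    "name": "delta-demo",
    "components": [
        {"id": "hydrotrend", "parameters": {"run_duration": 100}},
        {"id": "cem", "parameters": {"wave_height": 2.0}},
    ],
}
resp = requests.post(f"{BASE}/api/models", json=model)
model_id = resp.json()["id"]

# ...and submit it for execution on a remote HPC resource.
run = requests.post(f"{BASE}/exe/runs", json={"model": model_id, "host": "hpc"})
print("run status:", run.json().get("status"))
```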

  8. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software generation systems have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar arguments with regard to Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  9. New tools for generation IV assemblies modelling

    International Nuclear Information System (INIS)

    Sylvie Aniel-Buchheit; Edwige Richebois

    2005-01-01

    Full text of publication follows: In the framework of the development of Generation IV concepts, the need for new assembly modelling tools arises. These concepts present more geometrical and spectral heterogeneities (radially and axially). Moreover, thermal-hydraulics and neutronics aspects are so closely related that coupled computations are necessary. That raises the need for more precise and flexible tools with 3D features. The 3D coupling of the thermal-hydraulic code FLICA4 with the Monte Carlo neutronics code TRIPOLI4 was developed in that frame. This new tool enables, for the first time, realistic axial and radial power profiles to be obtained with real feedback effects in an assembly where thermal-hydraulics and neutronics effects are closely related. The BWR is the existing concept whose heterogeneous characteristics are closest to the various newly proposed concepts. This assembly design is thus chosen to compare this new tool, which presents real 3D characteristics, to the existing ones. For design studies, the evaluation of the assembly behaviour currently necessitates a depletion scheme using a 3D thermal-hydraulics assembly calculation coupled with a 1D axial neutronics deterministic calculation (or an axial power profile chosen as a function of the assembly-averaged burn-up). The 3D neutronics code (CRONOS2) uses neutronic data built by 2D deterministic assembly calculations without feedback. These cross-section libraries enable feedbacks to be taken into account via parameters such as fuel temperature and moderator density and temperature (history parameters such as void and control rod are not useful in design evaluation). Recently, the library build-up has been replaced by on-line multi-2D deterministic assembly calculations performed by a cell code (APOLLO2). That avoids interpolation between pre-determined parameters in the cross-section data used by the 1D axial neutronics calculation and enables a radial power map to be given to the 3D thermal …
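
    The coupled computation described above amounts to a fixed-point iteration between the two codes. A schematic sketch follows; the solver functions are placeholders passed in by the caller, not the FLICA4/TRIPOLI4 APIs.

```python
# Schematic neutronics / thermal-hydraulics coupling loop. The callables
# run_neutronics and run_thermal_hydraulics stand in for codes such as
# TRIPOLI4 and FLICA4; they are placeholders, not real interfaces.

def couple(run_neutronics, run_thermal_hydraulics,
           power_guess, tol=1e-4, max_iter=50, relax=0.5):
    """Fixed-point iteration: power -> T/density fields -> power -> ..."""
    power = power_guess
    for it in range(max_iter):
        th_state = run_thermal_hydraulics(power)   # fuel T, coolant density
        new_power = run_neutronics(th_state)       # feedback via cross sections
        diff = max(abs(a - b) for a, b in zip(new_power, power))
        # Under-relaxation damps oscillations between the two solvers.
        power = [relax * a + (1 - relax) * b for a, b in zip(new_power, power)]
        if diff < tol:
            return power, it
    raise RuntimeError("coupling did not converge")
```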

  10. Comparative study of sea ice dynamics simulations with a Maxwell elasto-brittle rheology and the elastic-viscous-plastic rheology in NEMO-LIM3

    Science.gov (United States)

    Raulier, Jonathan; Dansereau, Véronique; Fichefet, Thierry; Legat, Vincent; Weiss, Jérôme

    2017-04-01

    Sea ice is a highly dynamical environment characterized by a dense mesh of fractures or leads, constantly opening and closing over short time scales. This characteristic geomorphology is linked to the existence of linear kinematic features, which consist of quasi-linear patterns emerging from the observed strain rate field of sea ice. Standard rheologies used in most state-of-the-art sea ice models, like the well-known elastic-viscous-plastic rheology, are thought to misrepresent those linear kinematic features and the observed statistical distribution of deformation rates. Dedicated rheologies built to capture the processes known to be at the origin of the formation of leads have been developed but still need evaluation on the global scale. One of them, based on a Maxwell elasto-brittle formulation, is being integrated in the NEMO-LIM3 global ocean-sea ice model (www.nemo-ocean.eu; www.elic.ucl.ac.be/lim). In the present study, we compare the results of the sea ice model LIM3 obtained with two different rheologies: the elastic-viscous-plastic rheology commonly used in LIM3, and a Maxwell elasto-brittle rheology. This comparison focuses on the statistical characteristics of the simulated deformation rate and on the ability of the model to reproduce the existence of leads within the ice pack. The impact of the lead representation on fluxes between ice, atmosphere and ocean is also assessed.
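
    In its simplest scalar form, the Maxwell part of such a rheology relates stress to strain rate through

    \[ \dot{\sigma} + \frac{\sigma}{\lambda} = E\,\dot{\varepsilon}, \qquad \lambda = \frac{\eta}{E}, \]

    where \(E\) is the elastic modulus, \(\eta\) the apparent viscosity and \(\lambda\) the relaxation time: stresses build elastically over short times and relax viscously over times longer than \(\lambda\). The full Maxwell elasto-brittle model additionally evolves a damage variable that lowers \(E\) and \(\eta\) where the ice fractures; this is a schematic form, not the exact set of equations integrated in NEMO-LIM3.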

  11. Collaboro: a collaborative (meta) modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members to define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  12. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  13. A pandemic influenza modeling and visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Maciejewski, Ross; Livengood, Philip; Rudolph, Stephen; Collins, Timothy F.; Ebert, David S.; Brigantic, Robert T.; Corley, Courtney D.; Muller, George A.; Sanders, Stephen W.

    2011-08-01

    The National Strategy for Pandemic Influenza outlines a plan for community response to a potential pandemic. In this outline, state and local communities are charged with enhancing their preparedness. In order to help public health officials better understand these charges, we have developed a modeling and visualization toolkit (PanViz) for analyzing the effect of decision measures implemented during a simulated pandemic influenza scenario. Spread vectors based on the point of origin and distance traveled over time are calculated and the factors of age distribution and population density are taken into effect. Healthcare officials are able to explore the effects of the pandemic on the population through a spatiotemporal view, moving forward and backward through time and inserting decision points at various days to determine the impact. Linked statistical displays are also shown, providing county level summaries of data in terms of the number of sick, hospitalized and dead as a result of the outbreak. Currently, this tool has been deployed in Indiana State Department of Health planning and preparedness exercises, and as an educational tool for demonstrating the impact of social distancing strategies during the recent H1N1 (swine flu) outbreak.

  14. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    This thesis presents multidisciplinary modelling techniques in a Design For Reliability (DFR) approach for power electronic circuits. With the increasing penetration of renewable energy systems, the demand for reliable power conversion systems is becoming critical. Since a large part of electricity is processed through power electronics, highly efficient, sustainable, reliable and cost-effective power electronic devices are needed. Reliability of a product is defined as the ability to perform within its predefined functions under given conditions for a specific time. Because power electronic devices are delivered as a package, e.g. a power module, the DFR approach meets trade-offs in the electrical, thermal and mechanical design of the device. Today, virtual prototyping of power electronic circuits using advanced simulation tools is becoming attractive due to the cost/time savings in building potential designs. With simulations …

  15. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  16. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling that were identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  17. Development of high performance and very low radioactivity scintillation counters for the SuperNEMO calorimeter

    International Nuclear Information System (INIS)

    Chauveau, E.

    2010-11-01

    SuperNEMO is a next-generation double beta decay experiment which will extend the successful 'tracko-calo' technique employed in NEMO 3. The main characteristic of this type of detector is its ability not only to identify double beta decays, but also to measure its own background components. The project aims to reach a sensitivity of up to 10^26 years on the half-life of ^82Se. One of the main challenges of the research and development is to achieve an unprecedented energy resolution for the electron calorimeter, better than 8% FWHM at 1 MeV. This thesis contributes to improving scintillator and photomultiplier performance and reducing their radioactivity, including in particular the development of a new photomultiplier in collaboration with Photonis. (author)
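
    For context, scintillator-calorimeter resolution is typically dominated by photoelectron statistics and so scales roughly as

    \[ \frac{\mathrm{FWHM}(E)}{E} \approx \frac{k}{\sqrt{E\,[\mathrm{MeV}]}}, \]

    a generic scaling rather than a SuperNEMO-specific formula; with k = 8% fixed at 1 MeV, it would imply roughly 4.6% FWHM near the ^82Se ββ endpoint at about 3 MeV.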

  18. Atmospheric muons in the NEMO Phase 1 detector at the Catania test site

    International Nuclear Information System (INIS)

    Margiotta, Annarita

    2006-01-01

    The NEMO Collaboration is involved in a long-term R and D activity towards the construction of a km³ telescope in the Mediterranean Sea. It has dedicated special efforts to the development of technologies for a km³ detector and to the search, characterization and monitoring of a deep-sea site adequate for the installation of the Mediterranean km³ telescope. The NEMO Collaboration is now involved in Phase 1 of the project, planning to install a fully equipped deep-sea facility to test prototypes and develop new technologies for the detector. A full Monte Carlo simulation has been performed to analyse the response of a reduced-size detector to the passage of atmospheric muons. Preliminary steps of the simulation are presented in this work.

  19. Radon Mitigation Strategy and Results for the SuperNEMO Experiment

    Science.gov (United States)

    Liu, Xin Ran; SuperNEMO Collaboration

    2017-09-01

    SuperNEMO is a modern neutrinoless double beta decay (0νββ) experiment with a design capability to reach a half-life sensitivity of T1/2(0ν) > 10^26 years, equivalent to an effective Majorana neutrino mass of m(ββ) < 50-100 meV [1]. To achieve this sensitivity, SuperNEMO aims to become a zero-background 0νββ experiment in the first Demonstrator phase. This target places challenging demands on the radiopurity of detector components and the radon activity within the tracker. To minimise radon levels, all internal detector components were screened for radon emanation, which was then confirmed through direct measurement of the gaseous tracker. First measurements of the tracker indicated that target radon levels of < 0.15 mBq/m³ can be achieved.

  20. Super-resolution microscopy reveals a preformed NEMO lattice structure that is collapsed in incontinentia pigmenti

    CSIR Research Space (South Africa)

    Scholefield, Janine

    2016-09-01

    control of signal amplification, enhancing the efficiency of subsequent catalytic reactions. Such cooperativity would be greatly facilitated if a lattice-like structure pre-existed within the cell, before signal induction [5]. Higher-order structures have been... known as IKKγ), responsible for the regulation of the catalytic IKK subunits [8,9]. Indeed, hints of a higher-order oligomeric structure emerged in a previous study labelling NEMO following IL-1 stimulation [10]. However, confocal microscopy revealed...

  1. Cellular automaton and elastic net for event reconstruction in the NEMO-2 experiment

    International Nuclear Information System (INIS)

    Kisel, I.; Kovalenko, V.; Laplanche, F.

    1997-01-01

    A cellular automaton for track searching combined with an elastic net for charged particle trajectory fitting is presented. The advantages of the methods are: the simplicity of the algorithms, the fast and stable convergence to real tracks, and a good reconstruction efficiency. The combination of techniques has been used with success for event reconstruction on the data of the NEMO-2 double-beta (ββ) decay experiment. (orig.)
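
    A minimal sketch of the cellular-automaton track-search idea on made-up hits (a generic illustration, not the NEMO-2 reconstruction code): segments between hits in adjacent layers act as cells, each cell's state counts the longest chain of aligned neighbours feeding it, and a track candidate is read out by walking back from the highest state:

        # Hits as (layer, y); a real detector would supply drift-cell positions.
        hits = [(0, 0.0), (1, 0.1), (2, 0.25), (3, 0.35), (1, 2.0), (2, 2.1)]

        # Cells: segments joining hits in adjacent layers.
        segs = [(a, b) for a in hits for b in hits if b[0] == a[0] + 1]

        def aligned(s, t, tol=0.2):
            # Neighbours share a hit and change slope by less than tol.
            return s[1] == t[0] and abs((s[1][1] - s[0][1]) - (t[1][1] - t[0][1])) < tol

        # Cellular automaton: each segment's state becomes 1 + the best state
        # among its aligned predecessors; iterate until stable.
        state = {s: 1 for s in segs}
        changed = True
        while changed:
            changed = False
            for t in segs:
                best = max((state[s] for s in segs if aligned(s, t)), default=0)
                if state[t] != best + 1:
                    state[t] = best + 1
                    changed = True

        # Read out: start at the highest-state segment and walk backwards.
        end = max(segs, key=lambda s: state[s])
        track, cur = [end], end
        while state[cur] > 1:
            cur = max((s for s in segs if aligned(s, cur)), key=lambda s: state[s])
            track.append(cur)
        print(list(reversed(track)))  # segments of the longest smooth chain

    The state update converges because states are bounded by the number of layers; a fitting stage (the elastic net in this record) would then refine the candidate into a trajectory.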

  2. Thermal behaviour modelling of superplastic forming tools

    OpenAIRE

    Velay , Vincent; Cutard , Thierry; Guegan , N.

    2008-01-01

    High-temperature operational conditions of superplastic forming (SPF) tools induce very complex thermomechanical loadings responsible for their failure. Various materials can be used to manufacture forming tools: ceramics, refractory castables or heat-resistant steels. In this paper, an experimental and numerical analysis is performed in order to characterise the environmental loadings undergone by the tool whatever the considered material. This investigation allows a thermal calculation...

  3. NEMO-SN-1 the first 'real-time' seafloor observatory of ESONET

    International Nuclear Information System (INIS)

    Favali, Paolo; Beranzoli, Laura; D'Anna, Giuseppe; Gasparoni, Francesco; Gerber, Hans W.

    2006-01-01

    The fruitful collaboration between Italian research institutions, particularly Istituto Nazionale di Fisica Nucleare (INFN) and Istituto Nazionale di Geofisica e Vulcanologia (INGV), together with marine engineering companies, led to the development of NEMO-SN-1, the first European cabled seafloor multiparameter observatory. This observatory, deployed at 2060 m water depth about 12 miles off-shore the eastern coast of Sicily (Southern Italy), has been acquiring data in real time since January 2005 and addresses different sets of measurements: geophysical and oceanographic. In particular, the SN-1 seismological data are integrated into the INGV land-based national seismic network and arrive in real time at the Operative Centre in Rome. In the European Commission (EC) European Seafloor Observatory NETwork (ESONET) project, in connection with the Global Monitoring for Environment and Security (GMES) action plan, the NEMO-SN-1 site has been proposed as a European key area, both for its intrinsic importance for geo-hazards and for the availability of infrastructure as a stepwise development in the GMES programme. Presently, NEMO-SN-1 is the only ESONET site in operation. The paper gives a description of the SN-1 observatory with examples of data

  4. A Spectrum Handoff Scheme for Optimal Network Selection in NEMO Based Cognitive Radio Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Krishan Kumar

    2017-01-01

    When a mobile network changes its point of attachment in Cognitive Radio (CR) vehicular networks, the Mobile Router (MR) requires spectrum handoff. Network Mobility (NEMO) in CR vehicular networks is concerned with the management of this movement. In future NEMO-based CR vehicular network deployments, multiple radio access networks may coexist in overlapping areas, having different characteristics in terms of multiple attributes. A CR vehicular node may have the capability to make calls for two or more types of non-safety services, such as voice, video, and best effort, simultaneously. Hence, it becomes difficult for the MR to select the optimal network for spectrum handoff. This can be done by performing spectrum handoff using Multiple Attribute Decision Making (MADM) methods, which is the objective of this paper. MADM methods such as grey relational analysis and cost-based methods are used. The application of MADM methods provides a wider and optimal choice among the available networks with quality of service. Numerical results reveal that the proposed scheme is effective for spectrum handoff decisions for optimal network selection with reduced complexity in NEMO-based CR vehicular networks.
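
    A minimal sketch of grey relational analysis for ranking candidate networks; the attributes, weights, and numbers are illustrative assumptions, not values from the paper:

        import numpy as np

        # Candidate networks x attributes (bandwidth Mbps, delay ms, cost/unit).
        # Illustrative numbers only.
        attrs = np.array([
            [10.0, 40.0, 5.0],   # network 0
            [ 6.0, 20.0, 3.0],   # network 1
            [ 8.0, 30.0, 4.0],   # network 2
        ])
        benefit = np.array([True, False, False])  # higher-is-better per attribute
        weights = np.array([0.5, 0.3, 0.2])

        # Normalise each attribute to [0, 1], flipping cost-type attributes.
        lo, hi = attrs.min(axis=0), attrs.max(axis=0)
        norm = np.where(benefit, (attrs - lo) / (hi - lo), (hi - attrs) / (hi - lo))

        # Grey relational coefficient against the ideal sequence (all ones).
        rho = 0.5  # distinguishing coefficient
        delta = np.abs(1.0 - norm)
        grc = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

        # Weighted grey relational grade; the best network maximises it.
        grade = grc @ weights
        print("grades:", grade, "-> select network", int(np.argmax(grade)))

    Changing the weight vector per service class (voice, video, best effort) is how a scheme like this can select different networks for different simultaneous calls.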

  5. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  6. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  7. Simulation Tools Model Icing for Aircraft Design

    Science.gov (United States)

    2012-01-01

    the years from strictly a research tool to one used routinely by industry and other government agencies. Glenn contractor William Wright has been the architect of this development, supported by a team of researchers investigating icing physics, creating validation data, and ensuring development according to standard software engineering practices. The program provides a virtual simulation environment for determining where water droplets strike an airfoil in flight, what kind of ice would result, and what shape that ice would take. Users can enter geometries for specific, two-dimensional cross sections of an airfoil or other airframe surface and then apply a range of inputs - different droplet sizes, temperatures, airspeeds, and more - to model how ice would build up on the surface in various conditions. The program's versatility, ease of use, and speed - LEWICE can run through complex icing simulations in only a few minutes - have contributed to it becoming a popular resource in the aviation industry.

  8. Advanced energy systems and technologies research in Finland. NEMO-2 Programme Annual Report 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-10-01

    Advanced energy technologies were linked to the national energy research in the beginning of 1988 when energy research was reorganised in Finland. The Ministry of Trade and Industry established several energy research programmes and NEMO was one of them. Major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on a few promising technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies, such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin film solar cells). In Finland, the growth of the new energy technology industry is concentrated on these areas. The turnover of the Finnish industry has been growing considerably due to the national research activities and support of technology development. Sales have increased more than 10 times compared with 1987 and are now over 300 million FIM. The support to industries and their involvement in the programme has grown considerably. In this report, the essential research projects of the programme during 1996-1997 are described. The total funding for these projects was about 30 million FIM per year, of which TEKES's share was about 40 per cent. The programme consists of 10 research projects, some 15 joint development projects, and 9 EU projects. In case the research projects and joint development projects are acting very closely, the description of the project is

  9. Advanced energy systems and technologies research in Finland. NEMO 2 annual report 1994-1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    Advanced energy technologies were linked to the national energy research in the beginning of 1988 when energy research was reorganised in Finland. The Ministry of Trade and Industry set up many energy research programmes and NEMO was one of them. Major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin film solar cells). It seems that in Finland the growth of the new energy technology industry is focused on these areas. The sales of the industry have been growing considerably due to the national research activities and support of technology development. Sales have increased 6-7 times compared with 1987 and are now over 200 million FIM. The support to industries and their involvement in the programme has grown more than 15 times compared to 1988. The total funding of the NEMO 2 programme was 30 million FIM in 1994 and 21 million FIM in 1995. The programme consists of 20 research projects, 15 joint development projects, and 5 EU projects. In this report, the essential research projects of the programme in 1994-1995 are described. The total funding for these projects was about 25 million FIM, of which TEKES's share was about half. When the research projects and joint development projects are

  10. Advanced energy systems and technologies research in Finland. NEMO-2 Programme Annual Report 1996-1997

    International Nuclear Information System (INIS)

    1998-01-01

    Advanced energy technologies were linked to the national energy research in the beginning of 1988 when energy research was reorganised in Finland. The Ministry of Trade and Industry established several energy research programmes and NEMO was one of them. Major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on a few promising technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies, such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin film solar cells). In Finland, the growth of the new energy technology industry is concentrated on these areas. The turnover of the Finnish industry has been growing considerably due to the national research activities and support of technology development. Sales have increased more than 10 times compared with 1987 and are now over 300 million FIM. The support to industries and their involvement in the programme has grown considerably. In this report, the essential research projects of the programme during 1996-1997 are described. The total funding for these projects was about 30 million FIM per year, of which TEKES's share was about 40 per cent. The programme consists of 10 research projects, some 15 joint development projects, and 9 EU projects. In case the research projects and joint development projects are acting very closely, the description of the project is

  11. AMM15: a new high-resolution NEMO configuration for operational simulation of the European north-west shelf

    Science.gov (United States)

    Graham, Jennifer A.; O'Dea, Enda; Holt, Jason; Polton, Jeff; Hewitt, Helene T.; Furner, Rachel; Guihou, Karen; Brereton, Ashley; Arnold, Alex; Wakelin, Sarah; Castillo Sanchez, Juan Manuel; Mayorga Adame, C. Gabriela

    2018-02-01

    This paper describes the next-generation ocean forecast model for the European north-west shelf, which will become the basis of operational forecasts in 2018. This new system will provide a step change in resolution and therefore our ability to represent small-scale processes. The new model has a resolution of 1.5 km compared with a grid spacing of 7 km in the current operational system. AMM15 (Atlantic Margin Model, 1.5 km) is introduced as a new regional configuration of NEMO v3.6. Here we describe the technical details behind this configuration, with modifications appropriate for the new high-resolution domain. Results from a 30-year non-assimilative run using the AMM15 domain demonstrate the ability of this model to represent the mean state and variability of the region. Overall, there is an improvement in the representation of the mean state across the region, suggesting similar improvements may be seen in the future operational system. However, the reduction in seasonal bias is greater off-shelf than on-shelf. In the North Sea, biases are largely unchanged. Since there has been no change to the vertical resolution or parameterization schemes, performance improvements are not expected in regions where stratification is dominated by vertical processes rather than advection. This highlights the fact that increased horizontal resolution will not lead to domain-wide improvements. Further work is needed to target bias reduction across the north-west shelf region.

  12. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  13. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  14. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple-teeth tools, minimizing the tool wear and the feed force, (2) the optimization of the tool coating and (3) the development of a phenomenological model relating the feed force to the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on the abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
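
    A hedged sketch of how such a phenomenological force-wear law can be fitted; the functional form F = k·f^a·Vc^b·(1 + c·W) and all numbers below are illustrative assumptions, not the paper's model:

        import numpy as np

        # Hypothetical form: F = k * feed**a * speed**b * (1 + c * wear).
        # Synthetic "measurements" stand in for routing experiments.
        rng = np.random.default_rng(0)
        feed = rng.uniform(0.05, 0.3, 50)      # mm/tooth
        speed = rng.uniform(100, 400, 50)      # m/min
        wear = rng.uniform(0.0, 0.3, 50)       # mm flank wear
        F = 80 * feed**0.7 * speed**0.1 * (1 + 2.5 * wear)

        # Linearise: log F - log(1 + c*wear) = log k + a log feed + b log speed.
        # For a fixed trial c, solve the linear part by least squares.
        def fit_for_c(c):
            y = np.log(F) - np.log1p(c * wear)
            A = np.column_stack([np.ones_like(feed), np.log(feed), np.log(speed)])
            coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef, (res[0] if res.size else np.inf)

        # Coarse 1-D search over the wear coefficient c.
        cs = np.linspace(0.0, 5.0, 101)
        best_c = min(cs, key=lambda c: fit_for_c(c)[1])
        (logk, a, b), _ = fit_for_c(best_c)
        print(f"k={np.exp(logk):.1f}, a={a:.2f}, b={b:.2f}, c={best_c:.2f}")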

  15. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Numerous design decisions including architectural decisions are made while developing a software system, which influence the architecture of the system as well as subsequent decisions. Several tools already exist for managing design decisions, i.e. capturing, documenting, and maintaining them, but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models. In this paper, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from an example: the UML modeling tool shall show all decisions related to a model and allow extending or updating them; the decision management tool shall trigger the modeling tool

  16. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    of its strengths and areas of improvement (Section 6). Several key appendices are attached to this report, including user manuals for teachers and students (Appendix 3). Fundamentally, all relevant information is included in the report for those wishing to do further development work with the tool...

  17. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    To illustrate these concepts a number of examples are used. These include models of polymer membranes, distillation and catalyst behaviour. Some detailed considerations within these models are stated and discussed. Model generation concepts are introduced and ideas of a reference model are given that shows...

  18. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  19. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  20. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA's newest tool, the Stormwater Management Model (SWMM) Climate Adjustment Tool (CAT), is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  1. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  2. Evaluation of QoS supported in Network Mobility NEMO environments

    International Nuclear Information System (INIS)

    Hussien, L F; Abdalla, A H; Habaebi, M H; Khalifa, O O; Hassan, W H

    2013-01-01

    The Network Mobility Basic Support (NEMO BS) protocol manages an entire network, roaming as a unit, which changes its point of attachment to the Internet and consequently its reachability in the network topology. Like traditional Internet IP and Mobile IPv6, NEMO BS does not provide QoS guarantees to its users. Typically, all users will have the same level of service without consideration of their application requirements. This poses a problem for real-time applications that require QoS guarantees. To gain more effective control of the network, incorporated QoS is needed. Within a QoS-enabled network the traffic flow can be distributed to various priorities. Also, the network bandwidth and resources can be allocated to different applications and users. The Internet Engineering Task Force (IETF) working groups have proposed several QoS solutions for static networks, such as IntServ, DiffServ and MPLS. These QoS solutions are designed in the context of a static environment (i.e. fixed hosts and networks). However, they are not fully adapted to mobile environments; they essentially need to be extended and adjusted to meet the various challenges involved in mobile environments. With existing QoS mechanisms, many proposals have been developed to provide QoS for individual mobile nodes (i.e. host mobility). In contrast, research based on the movement of the whole mobile network in IPv6 is still being undertaken by the IETF working groups (i.e. network mobility). Little research has been done in the area of providing QoS for roaming networks. Therefore, this paper aims to review and investigate previous and current related works that have been developed to provide QoS in mobile networks. Consequently, a new proposed scheme will be introduced to enhance QoS within the NEMO environment, achieving seamless mobility for users of the mobile network node (MNN)

  3. The scientific modeling assistant: An advanced software tool for scientific model building

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  4. The km³ Mediterranean neutrino observatory - the NEMO.RD project

    CERN Document Server

    De Marzo, C N

    2001-01-01

    The NEMO.RD Project is a feasibility study of a km³ underwater telescope for high energy astrophysical neutrinos to be located in the Mediterranean Sea. Results on various issues of this project are presented: i) a Monte Carlo simulation study of the capabilities of various arrays of phototubes, in order to determine the detector geometry that can optimize performance and cost; ii) an oceanographic survey of various sites in search of the optimal one; iii) a feasibility study of the mechanics, deployment, connections and maintenance of such a detector. Parameters of a site near Capo Passero, Sicily, where depth, transparency and other water parameters seem optimal, are shown.

  5. Hypohidrotic ectodermal dysplasia and immunodeficiency with coincident NEMO and EDA Mutations

    Directory of Open Access Journals (Sweden)

    Michael D. Keller

    2011-11-01

    Ectodermal dysplasias (EDs) are uncommon genetic disorders resulting in abnormalities in ectodermally-derived structures. Though many ED-associated genes have been described, the NF-κB Essential Modulator (NEMO), encoded by the IKBKG gene, is unique in that mutations also result in severe humoral and cellular immunologic defects. We describe three unrelated kindreds with defects in both EDA and IKBKG resulting from an X-chromosome crossover. This demonstrates the importance of thorough immunologic consideration of patients with ED even when an EDA etiology is confirmed, and raises the possibility of a specific phenotype arising from coincident mutations in EDA and IKBKG.

  6. Measurement of the atmospheric muon flux at 3500 m depth with the NEMO Phase-2 detector

    Directory of Open Access Journals (Sweden)

    Distefano C.

    2016-01-01

    In March 2013, the NEMO Phase-2 tower was successfully deployed 80 km off-shore from Capo Passero (Italy) at 3500 m depth. The tower operated continuously until August 2014. We present the results of the atmospheric muon analysis from the data collected in 411 days of live time. The zenith-angle distribution of atmospheric muons was measured and the results compared with Monte Carlo simulations. The associated depth intensity relation was then measured and compared with previous measurements and theoretical predictions.

  7. The optical modules of the phase-2 of the NEMO project

    Science.gov (United States)

    Aiello, S.; Leonora, E.; Ameli, F.; Anghinolfi, M.; Anzalone, A.; Barbarino, G.; Barbarito, E.; Barbato, F.; Bersani, A.; Beverini, N.; Biagi, S.; Bonori, M.; Bouhadef, B.; Bozza, C.; Cacopardo, G.; Capone, A.; Caruso, F.; Ceres, A.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Cordelli, M.; Costa, M.; D'Amico, A.; De Asmundis, R.; De Bonis, G.; De Rosa, G.; De Vita, R.; Distefano, C.; Fermani, P.; Flaminio, V.; Fusco, L. A.; Garufi, F.; Giordano, V.; Giovanetti, G.; Grella, G.; Grimaldi, A.; Habel, R.; Imbesi, M.; Kulikovsky, V.; Lattuada, D.; Leotta, G.; Lonardo, A.; Longhitano, F.; Lo Presti, D.; Maccioni, E.; Margiotta, A.; Marinelli, A.; Martini, A.; Masullo, R.; Maugeri, F.; Migliozzi, P.; Migneco, E.; Minutoli, S.; Miraglia, A.; Mollo, C.; Mongelli, M.; Morganti, M.; Musico, P.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Papaleo, R.; Pappalardo, V.; Pellegrino, C.; Perrina, C.; Piattelli, P.; Pugliatti, C.; Pulvirenti, S.; Raffaelli, F.; Raia, G.; Randazzo, N.; Riccobene, G.; Rovelli, A.; Russo, A.; Russo, G. V.; Sapienza, P.; Sciliberto, D.; Sedita, M.; Sgura, I.; Shirokov, E.; Simeone, F.; Sipala, V.; Sollima, C.; Spina, M.; Spurio, M.; Stefani, F.; Taiuti, M.; Terreni, G.; Trasatti, L.; Trovato, A.; Vicini, P.; Viola, S.; Vivolo, D.

    2013-07-01

    A 13-inch Optical Module (OM) containing a large-area (10-inch) photomultiplier was designed as part of Phase-2 of the NEMO project. An intense R&D activity on the photomultipliers, the voltage supply boards, the optical coupling as well as the study of the influences of the Earth's magnetic field has driven the choice of each single component of the OM. Following a well-established production procedure, 32 OMs were assembled and their functionality tested. The design, the testing and the production phases are thoroughly described in this paper.

  8. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language adapted for primary and secondary students, is being used more and more in schools, as it offers students and teachers the opportunity to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.

  9. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  10. Spatial Modeling Tools for Cell Biology

    Science.gov (United States)

    2006-10-01

    of the cell's total volume. The cytosol contains thousands of enzymes that are responsible for the catalyzation of glycolysis and gluconeogenesis ... dog, swine and pig models [Pantely, 1990, 1991; Stanley 1992]. In these studies, blood flow through the left anterior descending (LAD) coronary ... perfusion. In conclusion, even though our model falls within the (rather large) error bounds of experimental dog, pig and swine models, the

  11. Towards a generalized energy prediction model for machine tools.

    Science.gov (United States)

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan

    2017-04-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
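
    A minimal sketch of the data-driven idea using Gaussian Process regression; the feature set, kernel choice, and data below are illustrative assumptions (scikit-learn is assumed available), not the authors' implementation:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Synthetic training data: process parameters -> measured energy (J).
        # Columns: feed (mm/rev), spindle speed (rpm), depth of cut (mm).
        rng = np.random.default_rng(1)
        X = rng.uniform([0.05, 1000, 0.5], [0.3, 4000, 3.0], size=(60, 3))
        y = 50 * X[:, 0] + 0.01 * X[:, 1] + 20 * X[:, 2] + rng.normal(0, 2, 60)

        # Anisotropic RBF kernel plus a noise term; the GP returns both a
        # mean prediction and an uncertainty estimate.
        kernel = RBF(length_scale=[0.1, 1000.0, 1.0]) + WhiteKernel(noise_level=4.0)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

        x_new = np.array([[0.2, 2500, 1.5]])
        mean, std = gp.predict(x_new, return_std=True)
        print(f"predicted energy: {mean[0]:.1f} J +/- {1.96 * std[0]:.1f} (95% interval)")

    The predictive standard deviation is what supports the uncertainty intervals mentioned in the abstract when the model is applied to new parts or process plans.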

  12. A model of tool wear monitoring system for turning

    OpenAIRE

    Šimunović, Goran; Ficko, Mirko; Šarić, Tomislav; Milošević, Mijodrag; Antić, Aco

    2015-01-01

    Acquiring high-quality and timely information on the tool wear condition in real time presents a necessary prerequisite for identification of the tool wear degree, which significantly improves the stability and quality of the machining process. This paper defines a model of a tool wear monitoring system, with special emphasis on the module for acquisition and processing of the vibration acceleration signal by applying discrete wavelet transformations (DWT) in signal decomposition. The paper prese...
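
    A minimal sketch of DWT-based feature extraction from a vibration signal, assuming the PyWavelets package; the wavelet, decomposition level, and band-energy features are illustrative choices, not the paper's exact module:

        import numpy as np
        import pywt  # PyWavelets

        # Synthetic vibration acceleration signal standing in for sensor data.
        fs = 10_000  # Hz
        t = np.arange(0, 1.0, 1 / fs)
        signal = np.sin(2 * np.pi * 120 * t) \
                 + 0.3 * np.random.default_rng(2).normal(size=t.size)

        # Multi-level DWT decomposition; band-wise energies are a common
        # wear-sensitive feature vector for a downstream classifier.
        coeffs = pywt.wavedec(signal, "db4", level=4)  # [cA4, cD4, cD3, cD2, cD1]
        features = [float(np.sum(c**2)) for c in coeffs]
        for name, e in zip(["A4", "D4", "D3", "D2", "D1"], features):
            print(f"band {name}: energy {e:.1f}")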

  13. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates where the experimental effort could be focused. In this project a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its new model import and export capabilities, has been developed. The new feature for model transfer has been developed by establishing a connection with an external modelling environment for code generation. The main contribution of this thesis is the creation of modelling templates and their connection with other modelling tools within a modelling framework. The goal was to create a user...

  14. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. [Figures: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.]

  15. Graphical Tools for Linear Structural Equation Modeling

    Science.gov (United States)

    2014-06-01

    regression coefficient β_{SA.CQ1} vanishes, which can be used to test whether the specification of Model 2 is compatible with the data. Most ... because they are all compatible with the graph in Figure 19a, which displays the skeleton and v-structures. Note that we cannot reverse the edge from ... implications of linear structural equation models. R-428, <http://ftp.cs.ucla.edu/pub/stat_ser/r428.pdf>, CA. To appear in Proceedings of AAAI-2014

  16. Toposcopy : A modelling tool for CITYGML

    NARCIS (Netherlands)

    Groneman, A.; Zlatanova, S.

    2009-01-01

    The new 3D standard CityGML has been attracting a lot of attention in the last few years. Many characteristics of the XML-based format make it suitable for storage and exchange of virtual 3D city models. It provides possibilities to store semantic and geometric information and has the potential to

  17. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    -annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops ..., Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model

  18. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  19. Sounds of silence : A research into the relationship between administrative supervision, criminal investigation and the nemo-tenetur principle

    NARCIS (Netherlands)

    Peçi, I.

    2006-01-01

    The subject of this thesis is the relationship between administrative supervision, criminal investigation and the nemo-tenetur principle. The point of departure is the distinction made in Dutch law and doctrine between administrative supervision and criminal investigation. Such a distinction is

  20. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on an HP-Apollo workstation system, has proved very general and of immediate physical interpretation
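
    A minimal sketch of the model-based idea, assuming linear optics: a kick θ_j at candidate element j shifts the orbit at BPM i by R_ij·θ_j, so error kicks can be fitted to measured orbit data by least squares. The matrices and numbers are illustrative, not the AGS Booster model:

        import numpy as np

        # Hypothetical response matrix R: orbit shift at each BPM per unit
        # error kick at each candidate magnet (computed with MAD in practice).
        rng = np.random.default_rng(3)
        n_bpms, n_magnets = 24, 6
        R = rng.normal(0.0, 1.0, (n_bpms, n_magnets))

        # Synthetic "measured" orbit produced by two real error kicks plus noise.
        true_kicks = np.zeros(n_magnets)
        true_kicks[[1, 4]] = [0.3e-3, -0.2e-3]  # rad
        orbit = R @ true_kicks + rng.normal(0, 1e-5, n_bpms)

        # Least-squares fit recovers the error kicks; large residuals would
        # indicate that the lattice model itself needs correction.
        kicks, *_ = np.linalg.lstsq(R, orbit, rcond=None)
        print(np.round(kicks * 1e3, 3), "mrad")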

  1. Static Stiffness Modeling of Parallel Kinematics Machine Tool Joints

    OpenAIRE

    O. K. Akmaev; B. A. Enikeev; A. I. Nigmatullin

    2015-01-01

    The possible variants of an original parallel kinematics machine-tool structure are explored in this article. A new Hooke's universal joint design based on needle roller bearings with the ability of preload setting is proposed. The bearing stiffness modeling is carried out using a variety of methods. Elastic deformation models of a Hooke's joint and a spherical rolling joint have been developed to assess the possibility of using these joints in machine tools with parallel k...

  2. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, th... model transformation tool sharing the model editor's benefits, transparently...

  3. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents different tools available for risk assessment in fractured clayey tills, and their advantages and limitations are discussed. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately describe contaminant transport in fractured media and develop practical tools with the relevant processes and level of complexity.

  4. Rasp Tool on Phoenix Robotic Arm Model

    Science.gov (United States)

    2008-01-01

    This close-up photograph taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of NASA's Phoenix Mars Lander's Robotic Arm. The rasp will be placed against the hard Martian surface to cut into the hard material and acquire an icy soil sample for analysis by Phoenix's scientific instruments. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  5. Thermodynamic and dynamic ice thickness contributions in the Canadian Arctic Archipelago in NEMO-LIM2 numerical simulations

    Science.gov (United States)

    Hu, Xianmin; Sun, Jingfan; Chan, Ting On; Myers, Paul G.

    2018-04-01

    Sea ice thickness evolution within the Canadian Arctic Archipelago (CAA) is of great interest to science, as well as local communities and their economy. In this study, based on the NEMO numerical framework including the LIM2 sea ice module, simulations at both 1/4 and 1/12° horizontal resolution were conducted from 2002 to 2016. The model captures well the general spatial distribution of ice thickness in the CAA region, with very thick sea ice (~4 m and thicker) in the northern CAA, thick sea ice (2.5 to 3 m) in the west-central Parry Channel and M'Clintock Channel, and thin sea ice elsewhere. The simulated ice thickness compares well with Environment and Climate Change Canada (ECCC) New Ice Thickness Program data at first-year landfast ice sites, except at the northern sites with a high concentration of old ice. At 1/4 to 1/12° scale, model resolution does not play a significant role in the sea ice simulation except to improve local dynamics because of better coastline representation. Sea ice growth is decomposed into thermodynamic and dynamic (including all non-thermodynamic processes in the model) contributions to study the ice thickness evolution. A relatively smaller thermodynamic contribution to ice growth between December and the following April is found in the thick and very thick ice regions, with larger contributions in the thin ice-covered region. No significant trend in winter maximum ice volume is found in the northern CAA and Baffin Bay, while a decline (r² ≈ 0.6, p < 0.01) is simulated in the Parry Channel region. The two main contributors (thermodynamic growth and lateral transport) have high interannual variabilities which largely balance each other, so that maximum ice volume can vary interannually by ±12 % in the northern CAA, ±15 % in Parry Channel, and ±9 % in Baffin Bay. Further quantitative evaluation is required.
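
    Schematically, the decomposition used above can be written as a per-area ice volume budget (standard form, not quoted from the paper):

        \[
        \frac{\partial V}{\partial t} = \Theta \; - \; \nabla \cdot (V \mathbf{u}) \; + \; \Psi
        \]

    where V is the ice volume per unit area, Θ the thermodynamic growth/melt term, u the ice drift velocity, and Ψ the mechanical redistribution (ridging); the advection and redistribution terms together form the "dynamic" contribution reported in the study.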

  6. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter an extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software, from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  7. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. The use of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  8. Study of the tracking detector of the NEMO3 experiment - simulation of the measurement of the ultra-low 208Tl radioactivity in the source foils used as neutrinoless double beta decay emitters in the NEMO3 experiment; Etude du detecteur de traces de l'experience NEMO3. Simulation de la mesure de l'ultra-faible radioactivite en 208Tl des sources de l'experience NEMO3 candidates a la double desintegration β sans emission de neutrino

    Energy Technology Data Exchange (ETDEWEB)

    Errahmane, K

    2001-04-01

    The purpose of the NEMO3 experiment is the search for neutrinoless double beta decay. This low-energy process can sign the massive and Majorana nature of the neutrino. This experiment, with a very low radioactive background and containing 10 kg of enriched isotopes, studies mainly 100Mo. Installed at the Frejus underground laboratory, NEMO3 is a cylindrical detector which consists of very thin central source foils, a tracking detector made up of vertical drift cells operating in Geiger mode, a calorimeter and suitable shielding. This thesis is divided into two parts. The first part is a full study of the features of the tracking detector. With a prototype composed of 9 drift cells, we characterised the longitudinal and transverse reconstruction of the position of the ionisation created by a LASER. With the first 3 modules in operation, external radioactive neutron sources were used to measure the transverse resolution of the ionisation position in a drift cell for high-energy electrons. To study the vertex reconstruction on the source foil, 207Bi sources, which produce conversion electrons, were used inside the 3 modules. In the second part of this thesis, we show with simulations that we can measure, with the NEMO3 detector itself, the ultra-low level of 208Tl contamination in the source foils, which comes from the natural thorium radioactive chain. Using electron-photon channels, we can obtain the 208Tl activity in the sources. With an analysis of the energy and the time of flight of the particles, NEMO3 is able to reach a sensitivity of 20 μBq/kg after only 2 months of measurement. This sensitivity is the maximum 208Tl activity accepted for the sources in the NEMO3 proposal. (author)

  9. State-of-the-art of NEMO 3. Detector dedicated to study of ββ0ν decay

    International Nuclear Information System (INIS)

    Dassie, D.; Guiral, A.; Levy, G.; Lewko, D.; Mesples-Carrere, F.

    1997-01-01

    The NEMO collaboration started the construction of the NEMO 3 detector with the objective of reaching a 0.1 eV limit on the mass of a Majorana-type neutrino. The project was accepted by IN2P3 in 1994 and construction started in 1995, with a planned completion date of 1998. The chosen design is similar to that of NEMO 2 up to a scale factor of about 10; it is this scale factor, together with a calorimetric enclosure covering almost 4π and measuring times of the order of several years, that will permit reaching the necessary sensitivity. Studies of enriched samples of up to 10 kg of the emitting isotopes are foreseen. NEMO 3 will allow investigation of the different processes ββ0ν, ββ0νχ and ββ2ν up to lifetimes of 10²⁵, 10²³ and 10²² years respectively. Such long lifetimes impose a severe selection on all the materials used in the construction of the detector, to avoid the expected signal being polluted by spurious signals induced by the detector's own radioactivity. A perfect knowledge of the radiation background of the laboratory and its effects on the detector is also required. The paper gives a general layout of the NEMO 3 detector and presents the CENBG contribution to the construction of the calorimetric wells, to the measurements of material radiopurity by γ spectroscopy and to the computation by simulation of the effect of neutrons on the detector

  10. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and would slow down the processing. The biggest challenge nowadays is to get high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. Existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.
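
    A toy illustration of the pre-processing idea (filter early so only relevant data reaches the computing module); the dataset and predicate below are invented:

        # Push the filter ahead of the expensive computation so only
        # relevant records reach it.
        records = [{"region": r, "value": v} for r, v in
                   zip(["eu", "us", "eu", "apac"], [10, 20, 30, 40])]

        def expensive_compute(batch):
            # Stand-in for the costly processing module.
            return sum(item["value"] ** 2 for item in batch)

        # Pre-processing step: select only the data related to the request.
        relevant = [item for item in records if item["region"] == "eu"]
        print(expensive_compute(relevant))  # computes over 2 records, not 4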

  11. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer's preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation...

  12. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts on both the ecosystem and the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments include different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. This paper concentrates on fish habitat simulation models, with methods and examples from Norway. Some ideas on integrated modelling tools for impact assessment studies are included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored to a multi-disciplinary study. Model choice should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy of the output. 58 refs

  13. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good understanding of the systems under study before practical implementation. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  14. Advanced REACH Tool (ART) : Calibration of the mechanistic model

    NARCIS (Netherlands)

    Schinkel, J.; Warren, N.; Fransman, W.; Tongeren, M. van; McDonnell, P.; Voogd, E.; Cherrie, J.W.; Tischer, M.; Kromhout, H.; Tielemans, E.

    2011-01-01

    The mechanistic model of the Advanced Reach Tool (ART) provides a relative ranking of exposure levels from different scenarios. The objectives of the calibration described in this paper are threefold: to study whether the mechanistic model scores are accurately ranked in relation to exposure

  15. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    Molecular modeling has become a valuable and essential tool to medicinal chemists in the drug design process. Molecular modeling describes the generation, manipulation or representation of three-dimensional structures of molecules and associated physico-chemical properties. It involves a range of computerized ...

  16. NEMO medium voltage converter factory acceptance, operational and final integration tests

    Energy Technology Data Exchange (ETDEWEB)

    Cocimano, Rosanna, E-mail: cocimano@lns.infn.i [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali del Sud, Via S. Sofia 62, 95123 Catania (Italy)

    2011-01-21

    The NEMO Collaboration, as part of the KM3NeT EU-funded consortium, is developing technical solutions for the construction of a cubic-kilometer-scale neutrino telescope in the Mediterranean Sea, several kilometers below sea level and far from the shore. In this framework, after years of design, development, assembly and testing, the Alcatel deep-sea medium voltage power converter (MVC) is ready for deployment 100 km from the Capo Passero shore station. The MVC converts 10 kV to an instrument-friendly 375 V at 10 kW. The MVC will be presented with a focus on the factory acceptance, operational and final integration tests that have recently been carried out.

  17. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  18. Hypermedia as an experiential learning tool: a theoretical model

    OpenAIRE

    Jose Miguel Baptista Nunes; Susan P. Fowell

    1996-01-01

    The process of methodical design and development is of extreme importance in the production of educational software. However, this process will only be effective if it is based on a theoretical model that explicitly defines what educational approach is being used and how specific features of the technology can best support it. This paper proposes a theoretical model of how hypermedia can be used as an experiential learning tool. The development of the model was based on an experiential learni...

  19. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which minimizes translation errors and reduces the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and, subsequently, the corresponding tools.
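
    As a minimal illustration of such tool-to-tool exchange, the sketch below reads an SBML model with the python-libsbml bindings and inspects its contents; it assumes the python-libsbml package is installed, uses a placeholder file name, and is not taken from the chapter itself:

        # A minimal sketch of model exchange via SBML using python-libsbml
        # (pip install python-libsbml). "model.xml" is a placeholder for any
        # SBML model exported by another tool.
        import libsbml

        doc = libsbml.readSBML("model.xml")   # parse the exchanged model file
        if doc.getNumErrors() > 0:
            doc.printErrors()                 # report parsing/validation problems
        else:
            model = doc.getModel()
            print("species:", model.getNumSpecies())
            print("reactions:", model.getNumReactions())
            for s in model.getListOfSpecies():
                print(s.getId(), s.getInitialConcentration())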

  20. Static Stiffness Modeling of Parallel Kinematics Machine Tool Joints

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2015-09-01

    Full Text Available The possible variants of an original parallel kinematics machine-tool structure are explored in this article. A new Hooke's universal joint design based on needle roller bearings with the ability to set a preload is proposed. The bearing stiffness modeling is carried out using a variety of methods. Models of the elastic deformation of a Hooke's joint and a spherical rolling joint have been developed to assess the possibility of using these joints in machine tools with parallel kinematics.

  1. Modeling with data: tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  2. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  3. Analytical Modelling Of Milling For Tool Design And Selection

    Science.gov (United States)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-01

    This paper presents an efficient analytical model which allows a wide range of milling operations to be simulated. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant-lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.
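
    The paper's force prediction rests on a thermomechanical oblique-cutting analysis that is not reproduced here; the sketch below instead shows the simplest mechanistic alternative, in which the tangential force follows the instantaneous chip thickness. All coefficients are illustrative, not taken from the paper:

        # Simplest mechanistic milling force sketch (NOT the paper's
        # thermomechanical model): tangential force proportional to the
        # instantaneous chip thickness, summed over the engaged teeth.
        import numpy as np

        fz  = 0.1e-3   # feed per tooth (m), illustrative
        ap  = 2e-3     # axial depth of cut (m)
        Ktc = 2000e6   # tangential cutting coefficient (Pa), illustrative
        Nt  = 3        # number of teeth
        phi = np.linspace(0.0, 2 * np.pi, 720, endpoint=False)  # cutter rotation

        Ft_total = np.zeros_like(phi)
        for j in range(Nt):
            phi_j = (phi + 2 * np.pi * j / Nt) % (2 * np.pi)
            engaged = (phi_j > 0) & (phi_j < np.pi)  # slot milling: entry 0, exit pi
            h = fz * np.sin(phi_j) * engaged         # instantaneous chip thickness
            Ft_total += Ktc * ap * h                 # tangential force per tooth

        print(f"peak tangential force: {Ft_total.max():.1f} N")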

  4. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows a wide range of milling operations to be simulated. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant-lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  5. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL]; Morris, Katherine [National Institute of Standards and Technology (NIST)]; Buhwan, Jeong [POSTECH University, South Korea]; Goyal, Puja [National Institute of Standards and Technology (NIST)]

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for the creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools, primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  6. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked against experimental data and then used to complement and expand the database, making it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma-ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  7. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked against experimental data and then used to complement and expand the database, making it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma-ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  8. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming increasingly popular for the development of software systems. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. This analysis reveals the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel: a system of geometrical objects, allowing one to build the spatial structure of physical models and to set a distribution of physical properties. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider developed prototypes of software tools.

  9. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save the HMM and its parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
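
    For readers unfamiliar with the Viterbi path that HMMEditor visualizes, the following generic dynamic-programming implementation for a small discrete HMM (a sketch, not HMMEditor's own code) shows how such a path is computed:

        # Generic Viterbi decoding for a small discrete HMM: find the most
        # probable hidden-state path for an observation sequence.
        import numpy as np

        def viterbi(obs, start_p, trans_p, emit_p):
            """Return the most probable state path for an observation sequence."""
            n_states, T = len(start_p), len(obs)
            logv = np.full((T, n_states), -np.inf)   # best log-probabilities
            back = np.zeros((T, n_states), dtype=int)
            logv[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
            for t in range(1, T):
                for s in range(n_states):
                    scores = logv[t - 1] + np.log(trans_p[:, s])
                    back[t, s] = np.argmax(scores)
                    logv[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
            path = [int(np.argmax(logv[-1]))]
            for t in range(T - 1, 0, -1):            # backtrack
                path.append(int(back[t, path[-1]]))
            return path[::-1]

        # Toy 2-state HMM emitting symbols {0, 1}
        start = np.array([0.6, 0.4])
        trans = np.array([[0.7, 0.3], [0.4, 0.6]])
        emit  = np.array([[0.9, 0.1], [0.2, 0.8]])
        print(viterbi([0, 0, 1, 1], start, trans, emit))  # [0, 0, 1, 1]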

  10. Accessing Curriculum Through Technology Tools (ACTTT): A Model Development Project

    Science.gov (United States)

    Daytner, Katrina M.; Johanson, Joyce; Clark, Letha; Robinson, Linda

    2012-01-01

    Accessing Curriculum Through Technology Tools (ACTTT), a project funded by the U.S. Office of Special Education Programs (OSEP), developed and tested a model designed to allow children in early elementary school, including those "at risk" and with disabilities, to better access, participate in, and benefit from the general curriculum.…

  11. Combining modelling tools to evaluate a goose management scheme

    NARCIS (Netherlands)

    Baveco, Hans; Bergjord, Anne Kari; Bjerke, Jarle W.; Chudzińska, Magda E.; Pellissier, Loïc; Simonsen, Caroline E.; Madsen, Jesper; Tombre, Ingunn M.; Nolet, Bart A.

    2017-01-01

    Many goose species feed on agricultural land, and with growing goose numbers, conflicts with agriculture are increasing. One possible solution is to designate refuge areas where farmers are paid to leave geese undisturbed. Here, we present a generic modelling tool that can be used to designate the

  12. Combining modelling tools to evaluate a goose management scheme.

    NARCIS (Netherlands)

    Baveco, J.M.; Bergjord, A.K.; Bjerke, J.W.; Chudzińska, M.E.; Pellissier, L.; Simonsen, C.E.; Madsen, J.; Tombre, Ingunn M.; Nolet, B.A.

    2017-01-01

    Many goose species feed on agricultural land, and with growing goose numbers, conflicts with agriculture are increasing. One possible solution is to designate refuge areas where farmers are paid to leave geese undisturbed. Here, we present a generic modelling tool that can be used to designate the

  13. Integrated landscape/hydrologic modeling tool for semiarid watersheds

    Science.gov (United States)

    Mariano Hernandez; Scott N. Miller

    2000-01-01

    An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...

  14. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    Rama Rao Nadendla. General Article. Resonance – Journal of Science Education, Volume 9, Issue 5, May 2004, pp. 51–60.

  15. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed with the doublet-lattice method, and a rational function approximation is then computed using Roger's method. The output, computed in a few seconds, is a state-space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information, so little interaction with the model developer is required; all parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation, carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite-element plates is compared to a modal analysis from commercial software and to an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and to wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state-space model is presented through damping-versus-velocity and frequency-versus-velocity analysis, including the response of the model to a 1-cos gust.
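
    The damping-versus-velocity and frequency-versus-velocity analysis mentioned above can be sketched generically: assemble the first-order system matrix at each velocity and inspect its eigenvalues. The toy 2-DOF numbers below are illustrative and unrelated to the report's models:

        # Generic v-g style sweep: modal damping and frequency vs velocity for
        # a toy 2-DOF aeroelastic section whose aerodynamic stiffness scales
        # with dynamic pressure and aerodynamic damping with velocity.
        import numpy as np

        M  = np.diag([1.0, 0.5])                       # mass matrix
        K0 = np.array([[2000.0, 0.0], [0.0, 800.0]])   # structural stiffness
        C0 = 0.002 * K0                                # light structural damping
        Ka = np.array([[0.0, -1.2], [0.0, 0.9]])       # aero stiffness per unit q
        Ca = np.array([[0.05, 0.0], [0.02, 0.03]])     # aero damping per unit V

        for V in np.linspace(10.0, 120.0, 12):
            q = 0.5 * 1.225 * V**2                     # dynamic pressure
            K = K0 + q * Ka
            C = C0 + V * Ca
            A = np.block([[np.zeros((2, 2)), np.eye(2)],
                          [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
            lam = np.linalg.eigvals(A)
            lam = lam[np.imag(lam) > 0]                # one eigenvalue per mode
            zeta = -np.real(lam) / np.abs(lam)         # modal damping ratios
            freq = np.abs(lam) / (2 * np.pi)           # modal frequencies (Hz)
            flag = "  <-- flutter" if (zeta < 0).any() else ""
            print(f"V={V:6.1f} m/s  zeta={np.round(zeta, 4)}  "
                  f"f={np.round(freq, 2)} Hz{flag}")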

  16. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. The model takes the five dimensions of requirements and three of characteristics from the SERVQUAL method, and the application methodology from the QFD method. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
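
    A minimal sketch of such a global index (hypothetical weights and scores; the paper's exact aggregation may differ):

        # Weighted global quality index over SERVQUAL-style dimensions.
        # All weights and scores below are hypothetical.
        requirements = {   # importance weights per dimension (sum to 1)
            "tangibles": 0.15, "reliability": 0.30, "responsiveness": 0.20,
            "assurance": 0.20, "empathy": 0.15,
        }
        scores = {         # accomplishment level per dimension, on [0, 1]
            "tangibles": 0.8, "reliability": 0.7, "responsiveness": 0.9,
            "assurance": 0.75, "empathy": 0.85,
        }
        global_index = sum(w * scores[k] for k, w in requirements.items())
        print(f"global SQ index: {global_index:.3f}")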

  17. Greenhouse gases from wastewater treatment - A review of modelling tools.

    Science.gov (United States)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially by autotrophic biomass, is still incomplete. It also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs, due to the considerable difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced.

  18. Long term monitoring of the optical background in the Capo Passero deep-sea site with the NEMO tower prototype

    Energy Technology Data Exchange (ETDEWEB)

    Adrian-Martinez, S.; Ardid, M.; Llorens Alvarez, C.D.; Saldana, M. [Universitat Politecnica de Valencia, Instituto de Investigacion para la Gestion Integrada de las Zonas Costeras, Gandia (Spain); Aiello, S.; Giordano, V.; Leonora, E.; Longhitano, F.; Randazzo, N.; Sipala, V.; Ventura, C. [INFN Sezione Catania, Catania (Italy); Ameli, F.; Biagioni, A.; De Bonis, G.; Fermani, P.; Lonardo, A.; Nicolau, C.A.; Simeone, F.; Vicini, P. [INFN Sezione Roma, Rome (Italy); Anghinolfi, M.; Hugon, C.; Musico, P.; Orzelli, A.; Sanguineti, M. [INFN Sezione Genova, Genoa (Italy); Barbarino, G.; Barbato, F.C.T.; De Rosa, G.; Di Capua, F.; Garufi, F.; Vivolo, D. [INFN Sezione Napoli, Naples (Italy); Dipartimento di Scienze Fisiche Universita di Napoli, Naples (Italy); Barbarito, E. [INFN Sezione Bari, Bari (Italy); Dipartimento Interateneo di Fisica Universita di Bari, Bari (Italy); Beverini, N.; Calamai, M.; Maccioni, E.; Marinelli, A.; Terreni, G. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Dipartimento di Fisica Universita di Pisa, Polo Fibonacci, Pisa (Italy); Biagi, S.; Cacopardo, G.; Cali, C.; Caruso, F.; Cocimano, R.; Coniglione, R.; Costa, M.; Cuttone, G.; D' Amato, C.; De Luca, V.; Distefano, C.; Gmerk, A.; Grasso, R.; Imbesi, M.; Kulikovskiy, V.; Larosa, G.; Lattuada, D.; Leismueller, K.P.; Litrico, P.; Migneco, E.; Miraglia, A.; Musumeci, M.; Orlando, A.; Papaleo, R.; Pulvirenti, S.; Riccobene, G.; Rovelli, A.; Sapienza, P.; Sciacca, V.; Speziale, F.; Spitaleri, A.; Trovato, A.; Viola, S. [INFN Laboratori Nazionali del Sud, Catania (Italy); Bouhadef, B.; Flaminio, V.; Raffaelli, F. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Bozza, C.; Grella, G.; Stellacci, S.M. [INFN Gruppo Collegato di Salerno, Fisciano (Italy); Dipartimento di Fisica Universita di Salerno, Fisciano (Italy); Calvo, D.; Real, D. [CSIC-Universitat de Valencia, IFIC-Instituto de Fisica Corpuscular, Valencia (Spain); Capone, A.; Masullo, R.; Perrina, C. [INFN Sezione Roma, Rome (Italy); Dipartimento di Fisica Universita ' ' Sapienza' ' , Rome (Italy); Ceres, A.; Circella, M.; Mongelli, M.; Sgura, I. [INFN Sezione Bari, Bari (Italy); Chiarusi, T. [INFN Sezione Bologna, Bologna (Italy); D' Amico, A. [INFN Laboratori Nazionali del Sud, Catania (Italy); Nikhef, Science Park, Amsterdam (Netherlands); Deniskina, N.; Migliozzi, P.; Mollo, C.M. [INFN Sezione Napoli, Naples (Italy); Enzenhoefer, A.; Lahmann, R. [Friedrich-Alexander-Universitaet Erlangen-Nuernberg, Erlangen Centre for Astroparticle Physics, Erlangen (Germany); Ferrara, G. [INFN Laboratori Nazionali del Sud, Catania (Italy); Dipartimento di Fisica e Astronomia Universita di Catania, Catania (Italy); Fusco, L.A.; Margiotta, A.; Pellegrino, C.; Spurio, M. [INFN Sezione Bologna, Bologna (Italy); Dipartimento di Fisica ed Astronomia Universita di Bologna, Bologna (Italy); Lo Presti, D.; Pugliatti, C. [INFN Sezione Catania, Catania (Italy); Dipartimento di Fisica e Astronomia Universita di Catania, Catania (Italy); Martini, A.; Trasatti, L. [INFN Laboratori Nazionali di Frascati, Frascati (Italy); Morganti, M. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Accademia Navale di Livorno, Livorno (Italy); Pellegriti, M.G. [INFN Laboratori Nazionali del Sud, Catania (IT); Piattelli, P. [INFN Laboratori Nazionali del Sud, Catania (IT); Taiuti, M. [INFN Sezione Genova, Genoa (IT); Dipartimento di Fisica Universita di Genova, Genoa (IT)

    2016-02-15

    The NEMO Phase-2 tower is the first detector operated underwater for more than 1 year at the record depth of 3500 m. It was designed and built within the framework of the NEMO (NEutrino Mediterranean Observatory) project. The 380 m high tower was successfully installed in March 2013, 80 km offshore from Capo Passero (Italy). This is the first prototype operated on the site where the Italian node of the KM3NeT neutrino telescope will be built. The installation and operation of the NEMO Phase-2 tower have proven the functionality of the infrastructure and its operability at 3500 m depth. More than 1 year of monitoring of the deep-water characteristics of the site has also been provided. In this paper the infrastructure, the tower structure and the instrumentation are described, and the results of long-term optical background measurements are presented. The rates show stable and low baseline values, compatible with the contribution of ⁴⁰K light emission, with a small percentage of light bursts due to bioluminescence. All these features confirm the stability and good optical properties of the site. (orig.)

  19. Requirement of FADD, NEMO, and BAX/BAK for Aberrant Mitochondrial Function in Tumor Necrosis Factor Alpha-Induced Necrosis

    Science.gov (United States)

    Irrinki, Krishna M.; Mallilankaraman, Karthik; Thapa, Roshan J.; Chandramoorthy, Harish C.; Smith, Frank J.; Jog, Neelakshi R.; Gandhirajan, Rajesh Kumar; Kelsen, Steven G.; Houser, Steven R.; May, Michael J.; Balachandran, Siddharth; Madesh, Muniswamy

    2011-01-01

    Necroptosis represents a form of alternative programmed cell death that is dependent on the kinase RIP1. RIP1-dependent necroptotic death manifests as increased reactive oxygen species (ROS) production in mitochondria and is accompanied by loss of ATP biogenesis and eventual dissipation of mitochondrial membrane potential. Here, we show that tumor necrosis factor alpha (TNF-α)-induced necroptosis requires the adaptor proteins FADD and NEMO. FADD was found to mediate formation of the TNF-α-induced pronecrotic RIP1-RIP3 kinase complex, whereas the IκB kinase (IKK) subunit NEMO appears to function downstream of RIP1-RIP3. Interestingly, loss of RelA potentiated TNF-α-dependent necroptosis, indicating that NEMO regulates necroptosis independently of NF-κB. Using both pharmacologic and genetic approaches, we demonstrate that the overexpression of antioxidants alleviates ROS elevation and necroptosis. Finally, elimination of BAX and BAK or overexpression of Bcl-xL protects cells from necroptosis at a later step. These findings provide evidence that mitochondria play an amplifying role in inflammation-induced necroptosis. PMID:21746883

  20. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine to what extent the process area can be damaged. At the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists, so insurers reach different results by applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software packages, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to see which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  1. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  2. A communication tool to improve the patient journey modeling process.

    Science.gov (United States)

    Curry, Joanne; McGregor, Carolyn; Tracy, Sally

    2006-01-01

    Quality improvement is high on the agenda of Health Care Organisations (HCO) worldwide. Patient journey modeling is a relatively recent innovation in healthcare quality improvement that models the patient's movement through the HCO by viewing it from a patient-centric perspective. Critical to the success of redesigning the care process is the involvement of all stakeholders and their commitment to actively participate in the process. Tools which promote this type of communication are a critical enabler that can significantly affect the overall process redesign outcomes. Such a tool must also be able to incorporate additional factors such as relevant policies and procedures, staff roles, system usage and measurements such as process time and cost. This paper presents a graphically based communication tool that can be used as part of the patient journey modeling process to promote stakeholder involvement, commitment and ownership, as well as highlighting the relationship of other relevant variables that contribute to the patient's journey. Examples of how the tool has been used, and the framework employed, are demonstrated via a midwife-led primary care case study. A key contribution of this research is the provision of a graphical communication framework that is simple to use, is easily understood by a diverse range of stakeholders, and enables ready recognition of patient journey issues. Results include strong stakeholder buy-in and significant enhancement to the overall design of the future patient journey. Initial results indicate that the use of such a communication tool can improve the patient journey modeling process and the overall quality improvement outcomes.

  3. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial-and-error based experiments..., which can be expensive and time consuming. An alternative approach is the use of a systematic model-based framework according to an established work-flow in product-process design, replacing some of the time-consuming and/or repetitive experimental steps. The advantages of the use of a model... Illustrative examples highlighting the need for efficient model-based systems will be presented, where the need for predictive models for innovative chemical product-process design will be highlighted. The examples will cover aspects of chemical product-process design where the idea of the grand...

  4. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
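
    For orientation, the one-loop running performed by the SMEFTrunner module takes the standard schematic form below (notation as commonly used in the SMEFT literature, not copied from the paper):

        % One-loop renormalization group evolution of the Warsaw-basis
        % Wilson coefficients C_i (summation over j implied):
        \begin{equation}
          \mu \frac{\mathrm{d} C_i}{\mathrm{d} \mu}
            = \frac{1}{16\pi^{2}} \, \gamma_{ij} \, C_j ,
        \end{equation}
        % where \gamma_{ij} is the one-loop anomalous dimension matrix
        % previously derived in the literature.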

  5. Numerical Model Metrics Tools in Support of Navy Operations

    Science.gov (United States)

    Dykes, J. D.; Fanguy, P.

    2017-12-01

    Increasing demand for accurate ocean forecasts relevant to Navy mission decision makers calls for tools that quickly provide relevant numerical model metrics to the forecasters. Growing modelling capabilities, with ever-higher-resolution domains including coupled and ensemble systems, together with the increasing volume of observations and other data sources against which to compare model output, require more tools that enable the forecaster to do more with less. These data can be handled in a geographic information system (GIS) and fused together to provide useful information and analyses, and ultimately a better understanding of how the pertinent model performs with respect to ground truth. Oceanographic measurements like surface elevation, profiles of temperature and salinity, and wave height can all be incorporated into a set of layers correlated to geographic information such as bathymetry and topography. In addition, an automated system that runs concurrently with the models on high-performance machines matches routinely available observations to modelled values to form a database of matchups with which statistics can be calculated and displayed, to facilitate validation of forecast state and derived variables. ArcMap, developed by the Environmental Systems Research Institute, is a GIS application used by the Naval Research Laboratory (NRL) and naval operational meteorological and oceanographic centers to analyse the environment in support of a range of Navy missions. For example, acoustic propagation in the ocean is described with a three-dimensional analysis of sound speed that depends on profiles of temperature, pressure and salinity predicted by the Navy Coastal Ocean Model. The data and model output must include geo-referencing information suitable for accurately placing the data within the ArcMap framework. NRL has developed tools that facilitate merging these geophysical data and their analyses, including intercomparisons between model

  6. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform: seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK, for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time, river and groundwater flooding resulting from high rainfall events is increasing in scale and frequency, and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal, as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that a single science discipline is unable to answer these questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human system. Management and planning require scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently unsuited to simulating the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of 2-dimensional paper maps and reports, many GSOs now produce 3-dimensional geological framework models and groundwater flow models as their standard output. Additionally, the British Geological Survey has developed standard routines to link geological

  7. MODELING OF ANIMATED SIMULATIONS BY MAXIMA PROGRAM TOOLS

    Directory of Open Access Journals (Sweden)

    Nataliya O. Bugayets

    2015-06-01

    Full Text Available The article deals with the methodical features of teaching computer simulation of systems and processes using animation. The importance of visual educational material, which combines the sensory and cognitive sides of learning, is noted. The concept of modeling and the process of building models are described. Attention is paid to the development of skills that are essential for effective learning of animated simulation with visual aids. The graphical environment tools of the computer mathematics system Maxima for animated simulation are described. Examples of the creation of animated visual aids and their use for the development of research skills are presented.

  8. Transfer Entropy as a Tool for Hydrodynamic Model Validation

    Directory of Open Access Journals (Sweden)

    Alicia Sendrowski

    2018-01-01

    Full Text Available The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE) to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind) and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high-frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales.
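
    A generic histogram-based TE estimator (an illustration of the quantity used in the paper, not the authors' implementation) can be sketched as:

        # Transfer entropy from a source series x to a target series y,
        # estimated with simple histogram binning:
        # TE_{x->y} = I(y_{t+lag} ; x_t | y_t), in bits.
        import numpy as np

        def transfer_entropy(x, y, bins=8, lag=1):
            yf, yp, xp = y[lag:], y[:-lag], x[:-lag]
            joint, _ = np.histogramdd(np.column_stack([yf, yp, xp]), bins=bins)
            p = joint / joint.sum()
            p_yp = p.sum(axis=(0, 2))        # p(y_t)
            p_yf_yp = p.sum(axis=2)          # p(y_{t+lag}, y_t)
            p_yp_xp = p.sum(axis=0)          # p(y_t, x_t)
            te = 0.0
            for i, j, k in zip(*np.nonzero(p)):
                num = p[i, j, k] * p_yp[j]
                den = p_yf_yp[i, j] * p_yp_xp[j, k]
                if den > 0:
                    te += p[i, j, k] * np.log2(num / den)
            return te

        rng = np.random.default_rng(0)
        x = rng.normal(size=5000)
        y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)   # y driven by lagged x
        print(f"TE x->y: {transfer_entropy(x, y):.3f} bits")
        print(f"TE y->x: {transfer_entropy(y, x):.3f} bits")  # should be smaller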

  9. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    Science.gov (United States)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models, and computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models; it has been used in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasting. There is a strong need to develop ANN models based on real catchment data and to compare them with the conceptual models actually in use in real catchments. In this paper, the results of an investigation on the use of the RRL and ANNs are presented. Of the five conceptual models in the RRL toolkit, the SimHyd model was used, with a genetic algorithm (GA) as the optimizer to calibrate it. Trial-and-error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer, and the results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network structure trained by the back-propagation algorithm was adopted to develop the ANN models. The daily rainfall and runoff data derived from the Bird Creek Basin, Oklahoma, USA were employed to develop all the models included here. A wide range of error statistics have been used to evaluate the performance of all the models
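
    A minimal feed-forward rainfall-runoff ANN in the spirit of the paper can be sketched with scikit-learn; the synthetic series below merely stand in for the Bird Creek rainfall and runoff data:

        # Feed-forward ANN for rainfall-runoff, using lagged rainfall and
        # runoff as inputs. Data are synthetic stand-ins, not Bird Creek data.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        rain = rng.gamma(shape=0.6, scale=8.0, size=2000)         # daily rain (mm)
        runoff = np.convolve(rain, [0.3, 0.4, 0.2, 0.1], "same")  # toy response

        # Inputs: rainfall and runoff at previous steps; target: today's runoff
        X = np.column_stack([rain[2:-1], rain[1:-2], runoff[1:-2], runoff[:-3]])
        y = runoff[3:]
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False,
                                                  test_size=0.3)

        ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                           random_state=0)
        ann.fit(X_tr, y_tr)
        print(f"test R^2: {ann.score(X_te, y_te):.3f}")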

  10. Theoretical Modeling of Rock Breakage by Hydraulic and Mechanical Tool

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available Rock breakage by coupled mechanical and hydraulic action has been developed over the past several decades, but a theoretical study of rock fragmentation by a mechanical tool with water pressure assistance was still lacking. A theoretical model of rock breakage by a mechanical tool was developed based on rock fracture mechanics and the solution of Boussinesq's problem; it can explain the process of rock fragmentation as well as predict the peak reaction force. A theoretical model of rock breakage by coupled mechanical and hydraulic action was then developed according to the superposition principle of intensity factors at the crack tip; the reaction force on the mechanical tool assisted by hydraulic action can be reduced considerably if a crack of critical length is produced by mechanical or hydraulic impact. The experimental results indicated that the peak reaction force could be reduced by about 15% with the assistance of medium water pressure, and the quicker reduction of the reaction force after the peak value decreased the specific energy consumption of rock fragmentation by the mechanical tool. Crack formation by mechanical or hydraulic impact was the prerequisite for improving the effectiveness of combined breakage.
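
    The superposition principle invoked above can be written schematically as follows; the notation is illustrative rather than copied from the paper:

        % Superposition of mode-I stress intensity factors at the crack tip:
        % the crack propagates once the combined mechanical and hydraulic
        % contributions reach the rock's fracture toughness K_IC.
        \begin{equation}
          K_{\mathrm{I}}
            = K_{\mathrm{I}}^{\mathrm{mech}} + K_{\mathrm{I}}^{\mathrm{hyd}}
            \geq K_{\mathrm{IC}} ,
        \end{equation}
        % so any positive hydraulic contribution lowers the mechanical force
        % required to drive the same crack.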

  11. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems', for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project, 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020), under Intelligent Energy Europe, was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, evaluation of assumptions and results in international models covering Denmark. (au)

  12. Designing a training tool for imaging mental models

    Science.gov (United States)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem-solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and the links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  13. ADAS tools for collisional–radiative modelling of molecules

    Energy Technology Data Exchange (ETDEWEB)

    Guzmán, F., E-mail: francisco.guzman@cea.fr [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); CEA, IRFM, Saint-Paul-lez-Durance 13108 (France); O’Mullane, M.; Summers, H.P. [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom)

    2013-07-15

    New theoretical and computational tools for molecular collisional–radiative models are presented, with an application to the hydrogen molecule system. At the same time, a structured database has been created in which fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronically resolved model is compared with electronic resolution, where vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional–radiative) rate coefficients versus temperature and density are presented.
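
    The collisional–radiative balance behind such effective coefficients can be written generically as below (notation illustrative): the population N_i of level i evolves through electron-impact rate coefficients q (at electron density n_e) and radiative rates A.

        \begin{equation}
          \frac{\mathrm{d} N_i}{\mathrm{d} t}
            = \sum_{j \neq i} \left( n_e \, q_{j \to i} + A_{j \to i} \right) N_j
            - N_i \sum_{j \neq i} \left( n_e \, q_{i \to j} + A_{i \to j} \right)
        \end{equation}
        % Solving the quasi-static system yields the effective
        % (collisional-radiative) rate coefficients as functions of
        % temperature and density, as tabulated by the tools above.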

  14. ADAS tools for collisional-radiative modelling of molecules

    Science.gov (United States)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented, with an application to the hydrogen molecule system. At the same time, a structured database has been created in which fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronically resolved model is compared with electronic resolution, where vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.

  15. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks in order to provide guidance in selecting appropriate applications. In particular, their use is described for improving expert systems from actual data and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and mathematical model of a power plant adapt automatically to changes in the plant's characteristics
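
    A compact genetic algorithm of the kind introduced here, as a toy sketch maximising the number of 1-bits in a bit string:

        # Toy genetic algorithm: tournament selection, one-point crossover,
        # bit-flip mutation, maximising the ones count of a bit string.
        import random

        random.seed(1)
        N_BITS, POP, GENS, MUT = 20, 30, 40, 0.02

        def fitness(ind):                  # toy objective: number of 1-bits
            return sum(ind)

        def select(pop):                   # tournament selection of size 2
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b

        pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
        for gen in range(GENS):
            nxt = []
            while len(nxt) < POP:
                p1, p2 = select(pop), select(pop)
                cut = random.randrange(1, N_BITS)             # one-point crossover
                child = p1[:cut] + p2[cut:]
                child = [b ^ (random.random() < MUT) for b in child]  # mutation
                nxt.append(child)
            pop = nxt
        print("best fitness:", max(fitness(i) for i in pop))  # approaches N_BITS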

  16. Surviving the present: Modeling tools for organizational change

    Energy Technology Data Exchange (ETDEWEB)

    Pangaro, P. (Pangaro Inc., Washington, DC (United States))

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them.

  17. Surviving the present: Modeling tools for organizational change

    International Nuclear Information System (INIS)

    Pangaro, P.

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them

  18. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

    Background: The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015, including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken for modelling the impact of scaling up health interventions on stillbirths in the Lives Saved Tool (LiST), and potential future refinements. Methods: The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions about the affected fraction of stillbirths who could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results: High-quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths, so stillbirths are modelled in LiST using an attributable-fraction approach by timing of stillbirth (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention, and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, and detection and management of hypertensive disorders of pregnancy and of diabetes), the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available…
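
    The attributable-fraction logic described above can be made concrete with a few lines of arithmetic. The sketch below uses invented numbers purely to illustrate the general LiST-style impact calculation (stillbirths averted scale with the affected fraction, intervention effectiveness, and the coverage change); it is not the calibrated model.

```python
# Hypothetical illustration of the general impact logic: stillbirths averted
# scale with the affected fraction, the intervention effectiveness, and the
# change in coverage. All numbers below are invented for illustration.
baseline_stillbirths = 10_000   # annual stillbirths in the projection (hypothetical)
affected_fraction    = 0.40     # share of stillbirths the intervention can act on
effectiveness        = 0.25     # proportional reduction at full coverage
coverage_start       = 0.30
coverage_end         = 0.80

averted = (baseline_stillbirths * affected_fraction * effectiveness
           * (coverage_end - coverage_start))
print(f"stillbirths averted: {averted:.0f}")  # 10000*0.40*0.25*0.50 = 500
```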

  19. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure-activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used in chemical risk assessment for the protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques from the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model exposes a cut-off parameter that selects the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) in predicting carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the Gold carcinogenic potency database (480 chemicals). Leave-one-out cross-validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8% and 80.4%, and balanced accuracy: 80.6% and 80.8%) and the highest inter-rater agreement [kappa (κ): …].
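
    To see how a Bayesian ensemble of this kind can work in principle, the sketch below naively combines binary calls from the four tools named above, assuming a known sensitivity/specificity for each and conditional independence between tools. All performance numbers are invented for illustration; the paper's actual model and parameters are not reproduced here. Lowering the cut-off trades specificity for sensitivity, mirroring the regulatory preference for fewer false negatives.

```python
import math

# Hypothetical per-tool performance (sensitivity, specificity); the paper's
# actual values are not given in the abstract.
tools = {"Toxtree": (0.80, 0.70), "Lazar": (0.75, 0.78),
         "OECD Toolbox": (0.72, 0.74), "Danish QSAR": (0.78, 0.72)}

def posterior_log_odds(predictions, prior=0.5):
    """Naive-Bayes combination of binary carcinogenicity calls (1 = positive)."""
    log_odds = math.log(prior / (1 - prior))
    for tool, pred in predictions.items():
        sens, spec = tools[tool]
        if pred == 1:   # P(positive call | carcinogen) = sens; | non = 1-spec
            log_odds += math.log(sens / (1 - spec))
        else:
            log_odds += math.log((1 - sens) / spec)
    return log_odds

calls = {"Toxtree": 1, "Lazar": 1, "OECD Toolbox": 0, "Danish QSAR": 1}
cutoff = 0.0   # lowering this trades specificity for sensitivity
print("carcinogen" if posterior_log_odds(calls) > cutoff else "non-carcinogen")
```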

  20. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information-theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse-grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force-matching methods to non-equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
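
    In symbols, the optimization described above takes the following generic form (the notation is assumed for illustration, not copied from the paper): with Q the path measure of the atomistic dynamics and Q^θ that of the parametrized coarse-grained dynamics,

```latex
\theta^{*} \;=\; \operatorname*{arg\,min}_{\theta}\; \mathcal{H}\!\left(Q \,\|\, Q^{\theta}\right),
\qquad
\mathcal{H}\!\left(Q \,\|\, Q^{\theta}\right)
  \;=\; \lim_{T\to\infty} \frac{1}{T}\,
        \mathbb{E}_{Q}\!\left[\,\log \frac{dQ\big|_{[0,T]}}{dQ^{\theta}\big|_{[0,T]}}\,\right].
```

    For stationary diffusions with a common diffusion coefficient, the Girsanov theorem turns this relative entropy rate into a mean-squared difference of drifts, which is how the force-matching generalization arises.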

  1. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case, where nothing is assumed about the components comprising a platform or about the platform topology, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model, used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components…

  2. Evaluation of air pollution modelling tools as environmental engineering courseware.

    Science.gov (United States)

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, so it is difficult to understand the relationships between inputs and outputs, and in a 3D context where it becomes hard to analyze alphanumeric results. In this work, three different software tools, built as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to run on PCs, follows an approach representing one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate) in order to evaluate their ability to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.
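
    The abstract does not state which dispersion model the solvers implement; the steady-state Gaussian plume equation is the standard classroom starting point, so a minimal sketch of it is given below. All inputs are illustrative, and in practice the dispersion coefficients would come from a stability-class correlation rather than being fixed.

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) with ground reflection.

    Q: emission rate (g/s); u: wind speed (m/s); H: effective stack height (m);
    sigma_y, sigma_z: dispersion coefficients (m) evaluated at the downwind
    distance of interest (normally taken from a stability-class correlation).
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground "image" source
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration roughly 1 km downwind (illustrative inputs)
print(gaussian_plume(Q=100.0, u=5.0, y=0.0, z=0.0, H=50.0,
                     sigma_y=80.0, sigma_z=40.0))
```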

  3. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
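
    Once such a fault tree exists, quantifying the top event is mechanical. The sketch below evaluates a two-gate tree over independent basic events; the tree and probabilities are hypothetical and stand in for the output of the LFM backtracking procedure, which is not reproduced here.

```python
import math

# Minimal fault-tree evaluator for independent basic events.
def p_and(*ps):  # all inputs must fail
    return math.prod(ps)

def p_or(*ps):   # at least one input fails
    return 1 - math.prod(1 - p for p in ps)

# Hypothetical top event: "controller output wrong" if the sensor fails OR
# (a software task overruns AND the watchdog fails to catch it).
p_sensor, p_overrun, p_watchdog = 1e-3, 1e-2, 1e-2
print(p_or(p_sensor, p_and(p_overrun, p_watchdog)))  # ~1.1e-3
```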

  4. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  5. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, modelling and simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with intelligent modelling and simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and…

  6. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor working with CAD tools to optimally guide students through an understandable 3D modeling approach, one that will not only enhance their knowledge of the tool's usage but also enable them to achieve the desired result in comparatively less time. In practice, very little information is available on applying CAD skills to formal beginners' training sessions. Additionally, the advent of new 3D software makes keeping up to date an ever more difficult task. Meeting the industry's advanced requirements emphasizes the need for more skilled hands in CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the varieties of CAD tools currently available on the market. Utilizing performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters are also provided (Asperl, 2005).

  7. Measurement of the double-β decay half-life and search for the neutrinoless double-β decay of 48Ca with the NEMO-3 detector

    Science.gov (United States)

    Waters, David; Vilela, Cristóvão; NEMO-3 Collaboration

    2017-09-01

    Neutrinoless double-β decay is a powerful probe of lepton-number-violating processes that may arise from Majorana terms in neutrino masses, or from supersymmetric, left-right symmetric, and other extensions of the Standard Model. Of the candidate isotopes for the observation of this process, 48Ca has the highest Qββ-value, resulting in decays with energies significantly above most naturally occurring backgrounds. The nucleus also lends itself to precise matrix element calculations within the nuclear shell model. We present the world's best measurement of the two-neutrino double-β decay of 48Ca, obtained by the NEMO-3 collaboration using 5.25 yr of data recorded with a 6.99 g sample of isotope, yielding ≈150 events with a signal-to-background ratio larger than 3. Neutrinoless modes of double-β decay are also investigated, with no evidence of new physics. Furthermore, these results indicate that two-neutrino double-β decay would be the main source of background for similar future searches using 48Ca with significantly larger exposures.
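
    The quoted exposure and event count can be tied together with the standard counting relation N = ln2 · (m/W) · N_A · ε · t / T½. The detection efficiency ε is not given in the abstract, so the value below is an assumption chosen purely for illustration; with it, the arithmetic lands at a half-life of order 10^19-10^20 yr.

```python
import math

# Back-of-envelope consistency check of the numbers quoted above. The signal
# efficiency (eps) is NOT given in the abstract; 0.03 is an assumed value.
N_A  = 6.022e23
mass = 6.99      # g of 48Ca in the source foil
W    = 47.95     # g/mol, molar mass of 48Ca
t    = 5.25      # yr of data taking
n_ev = 150.0     # observed two-neutrino signal events (approximate)
eps  = 0.03      # assumed detection efficiency (hypothetical)

atoms = mass / W * N_A
half_life = math.log(2) * atoms * eps * t / n_ev
print(f"T1/2(2v) ~ {half_life:.1e} yr")   # order of 6e19 yr with these inputs
```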

  8. The EDF/SEPTEN crisis team calculation tools and models

    International Nuclear Information System (INIS)

    De Magondeaux, B.; Grimaldi, X.

    1993-01-01

    Electricité de France (EDF) has developed a set of simplified tools and models, called TOUTEC and CRISALIDE, intended for use by the French utility's national crisis team in performing diagnosis and prognosis during an emergency situation. As a severe accident could have important radiological consequences, the method focuses on diagnosing the state of the safety barriers and forecasting their behaviour. These tools allow the crisis team to provide public authorities with information on the radiological risk and to give advice for managing the accident on the damaged unit. At the first level, TOUTEC is intended to complement the handbook with simplified calculation models and predefined relationships, avoiding tedious calculation under stress conditions. The main items are the calculation of the primary circuit breach size and the evaluation of hydrogen overpressurization. The set of models called CRISALIDE evaluates the following critical parameters: the delay before core uncovery (which would signify more severe consequences should it occur), containment pressure behaviour, and finally the source term. With these models, the crisis team is able to take into account combinations of boundary conditions according to the availability of safety and auxiliary systems.

  9. MODERN TOOLS FOR MODELING ACTIVITY IT-COMPANIES

    Directory of Open Access Journals (Sweden)

    Марина Петрівна ЧАЙКОВСЬКА

    2015-05-01

    Increasing competition in the market for web-based applications raises the importance of service quality and of optimizing the processes of interaction with customers. The purpose of the article is to develop recommendations for improving the business processes of IT enterprises in the web application segment, based on technological tools for business modeling; to shape requirements for the development of an information system (IS) for customer interaction; and to analyze effective means of implementation and evaluate the economic effects of its introduction. A scheme of the business process of developing and launching a website was built and, based on the analysis of business process models and swim-lane models, requirements for a customer relationship management IS for a web studio were established. The market of software for building such an IS was analyzed, and the products matching the requirements were selected. The IS was developed, tested and implemented in the company, and an appraisal of the economic effect was conducted.

  10. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML captures with the sequence diagram. In addressing this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. Because of its specific structure, the UML sequence diagram is selected for a deeper analysis of element layout. The authors examine the ability of modern UML modelling tools to lay out UML sequence diagrams automatically and analyse them according to criteria required for diagram perception.

  11. A new model for the sonic borehole logging tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1990-12-01

    A number of models of the sonic borehole logging tool have been developed previously. These models, which are mainly based on experimental data, are discussed and compared. On this background the new model is developed. It is based on the assumptions that the pores of low-porosity formations and the grains of high-porosity media may be approximated by cylinders, and that the dimensions of these cylinders are given by distribution functions. From these assumptions the transit time Δt_p of low-porosity formations and Δt_g of high-porosity media are calculated by use of the Monte Carlo method. Combining the Δt_p and Δt_g values by use of selected weighting functions seems to permit the determination of the transit time Δt over the full porosity range (0 ≤ φ ≤ 100%). (author)
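
    A toy version of such a Monte Carlo estimate is sketched below: segment lengths along a ray are drawn at random so that pores occupy the target volume fraction, and the average slowness is accumulated. The distributions and slowness values are assumed for illustration and are not those of the report; for large samples the result approaches the classical Wyllie time average.

```python
import random

# Illustrative Monte Carlo in the spirit described above (all distributions
# and parameter values are assumed, not taken from the report).
DT_MATRIX = 182.0   # slowness of rock matrix, microseconds per metre
DT_FLUID  = 620.0   # slowness of pore fluid, microseconds per metre

def transit_time(porosity, n_segments=10_000):
    """Average slowness along a ray whose segments alternate between matrix
    and pore 'cylinders' with random (exponential) lengths."""
    total_len = total_t = 0.0
    for _ in range(n_segments):
        lm = random.expovariate(1.0)                       # matrix segment
        lp = random.expovariate(1.0) * porosity / (1.0 - porosity)  # pore
        total_len += lm + lp
        total_t   += lm * DT_MATRIX + lp * DT_FLUID
    return total_t / total_len          # microseconds per metre

for phi in (0.05, 0.20, 0.35):
    print(f"phi={phi:.2f}  dt ~ {transit_time(phi):.0f} us/m")
```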

  12. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database… variable speed doubly-fed induction generator wind turbine concept; 3. variable speed multi-pole permanent magnet synchronous generator wind turbine concept. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies…, connection of the wind turbine to different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control…

  13. Mathematical modeling of physiological systems: an essential tool for discovery.

    Science.gov (United States)

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules, etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g., disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes such as enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high-throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems, with an emphasis on the study of excitable cells. We conclude with a discussion of opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?"
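
    Enzyme kinetics, the first example named above, is also the easiest to make concrete. The sketch below integrates Michaelis-Menten substrate depletion with illustrative parameters; it is a generic textbook example, not a model from the review.

```python
from scipy.integrate import solve_ivp

# Michaelis-Menten substrate depletion: dS/dt = -Vmax * S / (Km + S).
Vmax, Km = 1.0, 0.5          # illustrative parameters (mM/min, mM)

def rhs(t, y):
    S = y[0]
    return [-Vmax * S / (Km + S)]

sol = solve_ivp(rhs, t_span=(0, 10), y0=[2.0])
print(sol.y[0, -1])          # substrate remaining at t = 10 min
```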

  14. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives were: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, the utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical performance-prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  15. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using the Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance prediction and degradation detection is investigated. The FRM is developed as a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded in a fuzzy logic inference engine that employs Self-Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy of prediction and the rate of convergence. The efficacy of the proposed FRM is tested through a case study: predicting the remaining useful life of a ball-nose milling cutter during dry machining of hardened tool steel with a hardness of 52-54 HRC. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior to conventional MRM, Back-Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
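
    The core idea (partition the operating regime, then fit a simple regression per region) can be shown compactly. The sketch below substitutes k-means for the SOM and crisp routing for the fuzzy inference, so it is a crisp, simplified analogue of the FRM rather than the algorithm itself.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Synthetic piecewise-linear data standing in for tool-wear measurements.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(300, 1))
y = np.where(X[:, 0] < 5, 2 * X[:, 0], -1.5 * X[:, 0] + 20)
y = y + rng.normal(0, 0.3, 300)

# Cluster the input space, then fit one local linear model per cluster.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
          for c in range(km.n_clusters)}

def predict(x):
    c = km.predict(x)[0]          # route the query to its cluster's model
    return models[c].predict(x)[0]

print(predict(np.array([[2.0]])), predict(np.array([[8.0]])))
```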

  16. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. scaling study of HPC applications; 2. evaluation of programming models; 3. hardening of performance tools; 4. performance modeling of irregular codes; and 5. statistical analysis of historical performance data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big-data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  17. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and the social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  18. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, it revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
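
    The integration-time tests mentioned above validate photometric bookkeeping that, in its most stripped-down Poisson-limited form, reduces to t = SNR² · (C_p + C_b) / C_p². The sketch below evaluates that relation with invented count rates; EXOSIMS's actual calculation carries many more instrument-specific terms.

```python
# Poisson-limited integration time to reach a target SNR. The count rates are
# invented for illustration; they are not from the EXOSIMS test cases.
snr_target  = 5.0
planet_rate = 0.05    # planet photons/s at the detector (hypothetical)
background  = 0.20    # zodi + speckle + detector photons/s (hypothetical)

t_int = snr_target**2 * (planet_rate + background) / planet_rate**2
print(f"integration time ~ {t_int / 3600:.1f} h")   # 2500 s, about 0.7 h
```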

  19. ISRU System Model Tool: From Excavation to Oxygen Production

    Science.gov (United States)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late '80s, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the Moon and venture beyond, the need for flexible, up-to-date models of the oxygen extraction and production process has become even clearer. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates of energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  20. Modeling and Control of the Cobelli Model as a Personalized Prescriptive Tool for Diabetes Treatment

    Science.gov (United States)

    2016-11-05

    A physiologically accurate model allows for the use of control theory to investigate applications as a personalized prescription tool. … utilization increases toward healthy levels. The second pathway is by decreasing the endogenous glucose production of the liver to the bloodstream [6,7].
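
    The Cobelli simulator itself is far too large to reproduce here; to show the control-theoretic framing in miniature, the sketch below instead integrates the much simpler Bergman minimal model of glucose-insulin dynamics with a crude proportional insulin input. All parameter values are illustrative, not fitted.

```python
from scipy.integrate import solve_ivp

# Bergman minimal model (a simpler cousin of the Cobelli model) with a
# proportional controller dosing insulin above basal. Parameters illustrative.
p1, p2, p3 = 0.03, 0.02, 1e-5    # 1/min rate constants
Gb, Ib = 90.0, 10.0              # basal glucose (mg/dL) and insulin (uU/mL)

def rhs(t, y):
    G, X, I = y
    u = max(0.0, 0.05 * (G - Gb))   # proportional control: dose when high
    dG = -(p1 + X) * G + p1 * Gb    # glucose dynamics
    dX = -p2 * X + p3 * (I - Ib)    # remote insulin action
    dI = -0.1 * (I - Ib) + u        # first-order insulin kinetics + input
    return [dG, dX, dI]

sol = solve_ivp(rhs, (0, 300), [250.0, 0.0, Ib], max_step=1.0)  # post-meal start
print(f"glucose after 5 h: {sol.y[0, -1]:.0f} mg/dL")
```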

  1. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  2. Failure of the Nemo trial: bumetanide is a promising agent to treat many brain disorders but not newborn seizures

    Directory of Open Access Journals (Sweden)

    Yehezkel eBen-Ari

    2016-04-01

    The diuretic bumetanide failed to treat acute seizures due to hypoxic-ischemic encephalopathy (HIE) in newborn babies and was associated with hearing loss (NEMO trial) [1]. On the other hand, clinical and experimental observations suggest that the diuretic might provide novel therapy for many brain disorders, including autistic spectrum disorder, schizophrenia, Rett syndrome and Parkinson disease. Here, we discuss the differences between the pathophysiology of severe recurrent seizures in neonates and that of neurological and psychiatric disorders, stressing the uniqueness of severe seizures in the newborn in comparison to other disorders.

  3. Intestinal exposure to PCB 153 induces inflammation via the ATM/NEMO pathway.

    Science.gov (United States)

    Phillips, Matthew C; Dheer, Rishu; Santaolalla, Rebeca; Davies, Julie M; Burgueño, Juan; Lang, Jessica K; Toborek, Michal; Abreu, Maria T

    2018-01-15

    Polychlorinated biphenyls (PCBs) are persistent organic pollutants that adversely affect human health. PCBs bio-accumulate in organisms important for human consumption. PCB accumulation in the body leads to activation of the transcription factor NF-κB, a major driver of inflammation. Despite dietary exposure being one of the main routes of exposure to PCBs, the gut has been widely ignored when studying the effects of PCBs. We investigated the effects of PCB 153 on the intestine and addressed whether PCB 153 affected intestinal permeability or inflammation and the mechanism by which this occurred. Mice were orally exposed to PCB 153 and gut permeability was assessed. Intestinal epithelial cells (IECs) were collected and evaluated for evidence of genotoxicity and inflammation. A human IEC line (SW480) was used to examine the direct effects of PCB 153 on epithelial function. NF-κB activation was measured using a reporter assay, DNA damage was assessed, and cytokine expression was ascertained with real-time PCR. Mice orally exposed to PCB 153 had an increase in intestinal permeability and inflammatory cytokine expression in their IECs; inhibition of NF-κB ameliorated both these effects. This inflammation was associated with genotoxic damage and NF-κB activation. Exposure of SW480 cells to PCB 153 led to effects similar to those seen in vivo. We found that activation of the ATM/NEMO pathway by genotoxic stress was upstream of NF-κB activation. These results demonstrate that oral exposure to PCB 153 is genotoxic to IECs and induces downstream inflammation and barrier dysfunction in the intestinal epithelium.

  4. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations on the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of the ductile failure behaviour of cracked structures. (author)
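
    For reference, the Gurson model mentioned above is usually applied in its Gurson-Tvergaard-Needleman (GTN) form, whose yield surface is (standard notation assumed here, not taken from the paper):

```latex
\Phi \;=\; \left(\frac{\sigma_{eq}}{\sigma_{y}}\right)^{2}
      \;+\; 2\,q_{1}\,f^{*}\cosh\!\left(\frac{3\,q_{2}\,\sigma_{m}}{2\,\sigma_{y}}\right)
      \;-\; \left(1 + q_{3}\,{f^{*}}^{2}\right) \;=\; 0 .
```

    Here σ_eq is the von Mises stress, σ_m the mean stress, σ_y the matrix yield stress, f* the effective void volume fraction, and q1-q3 the fitting parameters whose transferability from tensile tests to cracked structures is the issue studied.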

  5. MTK: An AI tool for model-based reasoning

    Science.gov (United States)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  6. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significance. Port performance can be indicated simply by the port's service levels to ships (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop policies for making the port more effective and efficient. However, the evaluation is frequently conducted with a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed in MATLAB and Simulink based on queuing theory.
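
    The paper's stochastic microsimulation is in MATLAB/Simulink; as a compact illustration of the queuing-theory baseline it builds on, the sketch below computes the Erlang-C mean waiting time for ships arriving at c identical berths (an M/M/c queue). The traffic numbers are invented.

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean wait in queue (Wq) for arrival rate lam, service rate mu, c servers."""
    a = lam / mu                        # offered load (erlangs)
    rho = a / c
    if rho >= 1:
        raise ValueError("unstable queue: utilization >= 1")
    s = sum(a**k / math.factorial(k) for k in range(c))
    p_wait = (a**c / math.factorial(c)) / ((1 - rho) * s
                                           + a**c / math.factorial(c))
    return p_wait / (c * mu - lam)

# e.g. 4 ships/day arriving, 0.8-day turnaround per berth, 4 berths
print(f"mean wait for a berth: {erlang_c_wait(4.0, 1 / 0.8, 4) * 24:.1f} hours")
```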

  7. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order for people to have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin, which has a long history of isotope collection and analysis and also of climate modelling, both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone.

  8. Edge effect modeling of small tool polishing in planetary movement

    Science.gov (United States)

    Li, Qi-xin; Ma, Zhen; Jiang, Bo; Yao, Yong-sheng

    2018-03-01

    As one of the most challenging problems in Computer Controlled Optical Surfacing (CCOS), the edge effect greatly affects polishing accuracy and efficiency. CCOS relies on a stable tool influence function (TIF); however, at the edge of the mirror surface, with the grinding head partly off the mirror, the contact area and pressure distribution change, resulting in a non-linear change of the TIF that leads to tilting or sagging at the edge of the mirror. In order to reduce these adverse effects and improve polishing accuracy and efficiency, we used finite element simulation to analyze the pressure distribution at the mirror edge and combined it with an improved traditional method to establish a new model. The new method fully considers the non-uniformity of the pressure distribution. After modeling the TIFs at different locations, the description and prediction of the edge effects are realized, which has positive significance for the control and suppression of edge effects.
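
    A common simplified baseline behind TIF modeling is Preston's law, removal rate = k × pressure × relative speed, which makes clear why an edge pressure spike distorts the removal profile. The sketch below fakes the edge pressure redistribution with a simple analytic bump purely for illustration; the paper derives it from finite element analysis instead.

```python
import numpy as np

# Preston's law: removal depth = k * pressure * speed * dwell time.
k = 1e-7                        # Preston coefficient (mm^3 / (N * mm)), assumed
r = np.linspace(-10, 10, 201)   # coordinate across the tool footprint (mm)
v = 500.0                       # relative speed (mm/s), taken as uniform

p = np.full_like(r, 0.02)              # nominal contact pressure (MPa)
overhang = r > 6.0                     # part of the tool past the mirror edge
p[overhang] = 0.0                      # no contact past the edge...
p[(r > 4.0) & ~overhang] *= 1.8        # ...and a pressure spike just inside it

dwell = 2.0                            # seconds the tool dwells here
removal = k * p * v * dwell            # removal depth profile (mm)
print(f"peak/nominal removal ratio: {removal.max() / (k * 0.02 * v * dwell):.1f}")
```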

  9. Empirical flow parameters : a tool for hydraulic model validity

    Science.gov (United States)

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were: (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, producing charts such as Figure 1 and empirical distributions of the various flow parameters, to provide a methodology to "check if model results are way off"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values against a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
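
    Two of those ancillary values are one-line calculations once a cross-section is characterized; the sketch below evaluates them with illustrative numbers (field values would come from gauged data, not these assumptions).

```python
import math

# Illustrative cross-section values (assumed, not from the Texas dataset).
g, rho = 9.81, 1000.0    # gravity (m/s^2), water density (kg/m^3)
V, D = 1.2, 0.9          # mean velocity (m/s), hydraulic depth (m)
Q, S = 35.0, 0.0008      # discharge (m^3/s), channel slope (-)

froude = V / math.sqrt(g * D)      # < 1 subcritical, > 1 supercritical
stream_power = rho * g * Q * S     # watts per metre of channel length
print(f"Fr = {froude:.2f}, stream power = {stream_power:.0f} W/m")
```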

  10. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
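
    GRESS and EXAP work by instrumenting FORTRAN source; the underlying idea, propagating derivatives alongside values through a computation, can be sketched in a few lines with forward-mode dual numbers. The toy model function below is hypothetical and merely stands in for a code being differentiated.

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,                   # product rule
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    """Toy response whose sensitivity to parameter k we want."""
    return 3.0 * k * k + 2.0 * k + 1.0

x = Dual(2.0, 1.0)      # seed the derivative d/dk at k = 2
y = model(x)
print(y.val, y.der)     # 17.0 and dy/dk = 6k + 2 = 14.0
```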

  11. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for achieving the maximum liquefaction in the plant, considering the constraints on other parameters. The analysis results give a clear idea for deciding the various parameter values before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
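
    The thermodynamic heart of the Linde-Hampson analysis is the ideal liquid-yield balance y = (h1 − h2)/(h1 − hf). The sketch below checks it for nitrogen with the open CoolProp property library (the paper works with air in HYSYS; the conditions here are illustrative).

```python
from CoolProp.CoolProp import PropsSI

# Ideal Linde-Hampson liquid yield: state 1 at ambient T and low P, state 2 at
# ambient T and high P (after isothermal compression), hf the saturated-liquid
# enthalpy at low P. Pressures and temperature are illustrative.
T_amb, P_low, P_high = 300.0, 101325.0, 200e5   # K, Pa, Pa

h1 = PropsSI('H', 'T', T_amb, 'P', P_low,  'Nitrogen')
h2 = PropsSI('H', 'T', T_amb, 'P', P_high, 'Nitrogen')
hf = PropsSI('H', 'P', P_low, 'Q', 0,      'Nitrogen')

y = (h1 - h2) / (h1 - hf)
print(f"ideal liquid yield: {y:.3f}")   # a few percent at these conditions
```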

  12. MODEL CAR TRANSPORT SYSTEM - MODERN ITS EDUCATION TOOL

    Directory of Open Access Journals (Sweden)

    Karel Bouchner

    2017-12-01

    The model car transport system is a laboratory intended for practical development work in the area of road traffic. It is also an important educational tool for students' hands-on training, enabling students to test the results of their own studies. The main part of the model car transportation network is a model at a scale of 1:87 (HO), based on component units of the FALLER Car system, e.g. cars, traffic lights, carriageways, parking spaces, stop sections, branch-off junctions, sensors and control sections. The model enables the simulation of real traffic situations. It includes motor traffic in a city and in a small village, and on a carriageway between the city and the village, including a railway crossing. The traffic infrastructure includes different kinds of intersections, such as T-junctions, a classic four-way crossroad and a four-way traffic circle, with and without traffic light control. Another important part of the model is a segment of highway, which includes an elevated crossing with highway approaches and exits.

  13. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
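
    The final interpolation step that Hobbes automates can be illustrated in miniature: given a handful of dated horizons, build an age-depth model and apply a simple sanity rule. The depths and ages below are invented, and linear interpolation stands in for the more sophisticated BACON-style Bayesian models CSciBox offers.

```python
import numpy as np

# Invented dated horizons (depth in cm, calibrated age in years BP).
depth_cm  = np.array([ 10,   55,  120,  180])
age_calBP = np.array([450, 2100, 5600, 9100])

query = np.arange(10, 181, 10)                # model the core every 10 cm
ages  = np.interp(query, depth_cm, age_calBP) # linear age-depth model

# A crude reversal check of the kind an expert (or Hobbes' rulebase) applies:
assert np.all(np.diff(ages) >= 0), "age reversal detected"
print(dict(zip(query.tolist(), ages.round().tolist())))
```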

  14. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources, with planning tools helping to assess the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients; Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared with the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes, and both may be needed to provide the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation. They present a set of simple calculations that allow the estimation of the origin of loads. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrient loads from diffuse sources associated with land use/land cover. The model chosen for this was SWAT (Arnold & Fohrer, 2005), because it is suggested in Guideline 6 and because it…

  15. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    Science.gov (United States)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models were improved and expanded upon as part of the Slab2 modeling effort. With a new data-driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining the incorporated seismicity to slab-related events only; (3) a 3-D data interpolation approach which captures both high-resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates the uncertainties of contributing datasets to identify the most probable slab surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic provenance, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open-source library in Python, such that suites of updated models can be released as further data become available. This presentation discusses the extent of Slab2 development, as well as the current availability of the model and modeling tools.

  16. Using Modeling Tools to Better Understand Permafrost Hydrology

    Directory of Open Access Journals (Sweden)

    Clément Fabre

    2017-06-01

    Modification of the hydrological cycle and, subsequently, of other global cycles is expected in Arctic watersheds owing to global change. Future climate scenarios imply widespread permafrost degradation caused by an increase in air temperature, and the expected effect on permafrost hydrology is immense. This study aims at analyzing and quantifying the daily water transfer in the largest Arctic river system, the Yenisei River in central Siberia, Russia, partially underlain by permafrost. The semi-distributed SWAT (Soil and Water Assessment Tool) hydrological model has been calibrated and validated at a daily time step against historical discharge for the 2003-2014 period. The model parameters have been adjusted to embrace the hydrological features of permafrost. SWAT is shown to be capable of estimating water fluxes at a daily time step, especially during unfrozen periods, once specific climatic and soil conditions adapted to a permafrost watershed are taken into account. The model simulates an average annual contribution to runoff of 263 millimeters per year (mm yr−1), distributed as 152 mm yr−1 (58%) of surface runoff, 103 mm yr−1 (39%) of lateral flow and 8 mm yr−1 (3%) of return flow from the aquifer. These results are integrated over a reduced basin area downstream of large dams and are closer to observations than previous modeling exercises.

  17. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure that the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities and summarized bird-habitat relationships from the literature in terms of variables represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation statuses, that predict habitat suitability from i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability of different land-use types (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.

  18. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
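
    To make the ingredients named above concrete (agents, collision avoidance, and a time-to-evacuate statistic), here is a deliberately tiny grid-world sketch. It is not IMPACT's engine; the geometry, agent count and movement rule are all invented for illustration.

```python
import random

# Toy area-based evacuation: agents step toward the exit each tick; a cell may
# be claimed by only one mover per tick (crude collision avoidance).
random.seed(1)
W, H, EXIT = 40, 40, (0, 20)
agents = [(random.randrange(10, W), random.randrange(H)) for _ in range(150)]

def step_toward(pos):
    x, y = pos
    dx = (EXIT[0] > x) - (EXIT[0] < x)   # unit step toward the exit
    dy = (EXIT[1] > y) - (EXIT[1] < y)
    return (x + dx, y + dy)

t = 0
while agents and t < 10_000:
    t += 1
    claimed, remaining = set(), []
    for pos in agents:
        nxt = step_toward(pos)
        if nxt == EXIT:
            continue                 # agent has left the area
        if nxt in claimed:
            remaining.append(pos)    # blocked: wait one tick
        else:
            claimed.add(nxt)
            remaining.append(nxt)
    agents = remaining

print(f"time to evacuate: {t} ticks")
```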

  19. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    Science.gov (United States)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and associated wildfire resource catalog includes a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data model describes how to integrate the data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'Observables' captured by each instrument using multiple ontologies including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog using the Web Application Description Language (WADL). We are creating web-based user interfaces and mobile device apps that use the REST interface for dissemination to the wildfire modeling community and project partners covering academic, private, and government laboratories, while generating value for emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.
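    A hedged sketch of how a client might query such a REST catalog for observables; the endpoint URL, query parameters, and response fields below are hypothetical placeholders, not WIFIRE's published interface.

        import requests

        BASE = "https://catalog.example.org/api"  # placeholder, not the real service

        resp = requests.get(f"{BASE}/observables",
                            params={"type": "wind_speed", "network": "HPWREN"},
                            timeout=10)
        resp.raise_for_status()
        for obs in resp.json().get("observables", []):  # hypothetical schema
            print(obs["station"], obs["value"], obs.get("accuracy"))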

  20. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  1. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce side effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  2. The Innsbruck/ESO sky models and telluric correction tools*

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

    While the ground-based astronomical observatories just have to correct for the line-of-sight integral of these effects, the Čerenkov telescopes use the atmosphere as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus, the possible application of the method for Čerenkov telescopes.

  3. Development of hydrogeological modelling tools based on NAMMU

    International Nuclear Information System (INIS)

    Marsic, N.; Hartley, L.; Jackson, P.; Poole, M.; Morvik, A.

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock-mass. Whether a fracture zone is modelled deterministically or stochastically its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site

  4. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  5. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

    Natural gas is a source of non-renewable energy used by different sectors of the economy of Ceara. Its use may be industrial, residential, commercial, as a source of automotive fuel, as a co-generation of energy and as a source for generating electricity from heat. Owing to its practicality, this energy source enjoys strong market acceptance and serves a broad range of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to meet all potential clients interested in this source of energy. To facilitate the design, analysis, expansion and location of bottlenecks and breaks in the distribution network, modeling software is used that allows the network manager to organize the various data about the network. This paper presents the advantages of modeling the gas distribution network of natural gas companies in Ceara, showing the tool used, the steps necessary for the implementation of the models, the advantages of using the software and the findings obtained with its use. (author)

  6. Complex Coronary Hemodynamics - Simple Analog Modelling as an Educational Tool.

    Science.gov (United States)

    Parikh, Gaurav R; Peter, Elvis; Kakouros, Nikolaos

    2017-01-01

    Invasive coronary angiography remains the cornerstone for evaluation of coronary stenoses despite there being a poor correlation between luminal loss assessment by coronary luminography and myocardial ischemia. This is especially true for coronary lesions deemed moderate by visual assessment. Coronary pressure-derived fractional flow reserve (FFR) has emerged as the gold standard for the evaluation of hemodynamic significance of coronary artery stenosis, which is cost-effective and leads to improved patient outcomes. There are, however, several limitations to the use of FFR including the evaluation of serial stenoses. In this article, we discuss the electronic-hydraulic analogy and the utility of simple electrical modelling to mimic the coronary circulation and coronary stenoses. We exemplify the effect of tandem coronary lesions on the FFR by modelling a patient with sequential disease segments and complex anatomy. We believe that such computational modelling can serve as a powerful educational tool to help clinicians better understand the complexity of coronary hemodynamics and improve patient care.
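    The electronic-hydraulic analogy lends itself to a few lines of code: pressure maps to voltage, flow to current, and each stenosis to a series resistor. The sketch below is a linearized toy model with illustrative numbers, not a clinical tool, but it shows why serial lesions each look milder when assessed in isolation.

        # Toy series-resistor model of two serial coronary stenoses (linearized;
        # real coronary hemodynamics are nonlinear). All values are illustrative.
        P_AORTIC = 100.0         # mean aortic pressure, mmHg
        P_VENOUS = 0.0           # downstream pressure, mmHg
        R_MICRO = 1.0            # microvascular resistance, mmHg per (mL/s)
        stenoses = [0.25, 0.40]  # resistances of two serial lesions (same units)

        # Hyperemic flow through the series circuit (Ohm's-law analogue).
        flow = (P_AORTIC - P_VENOUS) / (R_MICRO + sum(stenoses))

        # Pressure distal to both lesions, and the resulting FFR = Pd / Pa.
        p_distal = P_AORTIC - flow * sum(stenoses)
        print(f"FFR across both lesions: {p_distal / P_AORTIC:.2f}")

        # Each lesion assessed alone looks less severe, illustrating why serial
        # stenoses complicate FFR interpretation.
        for r in stenoses:
            f_alone = (P_AORTIC - P_VENOUS) / (R_MICRO + r)
            print(f"FFR if isolated: {(P_AORTIC - f_alone * r) / P_AORTIC:.2f}")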

  7. A crowdsourcing model for creating preclinical medical education study tools.

    Science.gov (United States)

    Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U

    2013-06-01

    During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning.

  8. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks, which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, they are available to download from Github, and they can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analysis for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
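    One simple way to organize a regulatory network into a hierarchy, in the spirit of the algorithm in [1] though not that algorithm itself, is to assign each node a level equal to the longest regulatory chain above it. A hypothetical sketch (assumes an acyclic network):

        from functools import lru_cache

        # Hypothetical regulator -> targets adjacency (assumed acyclic).
        edges = {"tf1": ["tf2", "tf3"], "tf2": ["gene_a"],
                 "tf3": ["tf2", "gene_b"], "gene_a": [], "gene_b": []}

        targets = {t for outs in edges.values() for t in outs}
        print("top-level regulators:", [n for n in edges if n not in targets])

        @lru_cache(maxsize=None)
        def level(node):
            """Hierarchy level = length of the longest regulatory chain above."""
            regulators = [r for r, outs in edges.items() if node in outs]
            return 0 if not regulators else 1 + max(level(r) for r in regulators)

        for n in sorted(edges, key=level):
            print(f"level {level(n)}: {n}")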

  9. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
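    As a toy illustration of the outbreak-recognition idea (flagging counts that break from a trailing baseline), the sketch below is a deliberately simple stand-in for the space-time cluster and GANN methods described above; the window, threshold, and counts are invented.

        import statistics

        # Flag days whose case count exceeds a trailing-window baseline by more
        # than `threshold` standard deviations (illustrative detector only).
        def detect_outbreaks(daily_counts, window=14, threshold=3.0):
            for day in range(window, len(daily_counts)):
                baseline = daily_counts[day - window:day]
                mean = statistics.mean(baseline)
                sd = statistics.stdev(baseline) or 1.0  # guard against zero spread
                if (daily_counts[day] - mean) / sd > threshold:
                    yield day

        counts = [12, 9, 11, 10, 13, 8, 12, 11, 9, 10, 12, 11, 10, 9, 31, 38]
        print(list(detect_outbreaks(counts)))  # -> [14, 15]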

  10. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on the short-term (daily, hourly and sub-hourly) scales. Production-cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data by minimizing production costs and following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide more detailed simulation for the short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system that is based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity expansion - to - production cost model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.

  11. An Integrated Package of Neuromusculoskeletal Modeling Tools in Simulink (TM)

    National Research Council Canada - National Science Library

    Davoodi, R

    2001-01-01

    .... Blocks representing the skeletal linkage, sensors, muscles, and neural controllers are developed using separate software tools and integrated in the powerful simulation environment of Simulink (Mathworks Inc., USA...

  12. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Full Text Available Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber’s modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
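    The time-delay localization step reduces to one formula: the event distance is half the round-trip delay times the group velocity of light in the fiber. A small sketch with assumed values:

        # Locating a hydrocarbon-contact event from OTDR backscatter timing, per
        # the time-delay principle described above (illustrative numbers only).
        C = 299_792_458.0  # speed of light in vacuum, m/s
        N_CORE = 1.468     # typical silica-core group index (assumed)

        def event_distance(round_trip_delay_s):
            """Distance to the loss event along the fiber, in metres.

            The pulse travels out and back, hence the factor of 2.
            """
            return (C / N_CORE) * round_trip_delay_s / 2.0

        # A 49-microsecond round trip corresponds to roughly 5 km of fiber.
        print(f"{event_distance(49e-6) / 1000:.2f} km")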

  13. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  14. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...

  15. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
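    A compact sketch of the same pipeline shape, Latin hypercube sampling, an expensive model evaluation, then a Gaussian-process fit, using standard scipy/scikit-learn calls with a stand-in function in place of a TPMC simulation:

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor

        def fake_drag_coefficient(x):
            """Placeholder for a TPMC run: Cd as a smooth function of 2 inputs."""
            speed_ratio, temp_ratio = x
            return 2.2 + 0.4 * np.exp(-speed_ratio) + 0.1 * np.sqrt(temp_ratio)

        # Latin hypercube sample of the parameter space (bounds are assumed).
        sampler = qmc.LatinHypercube(d=2, seed=0)
        unit = sampler.random(n=200)
        X = qmc.scale(unit, l_bounds=[1.0, 0.1], u_bounds=[10.0, 1.0])
        y = np.array([fake_drag_coefficient(x) for x in X])

        # Fit the Gaussian-process response surface and query one point.
        rsm = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        cd, sd = rsm.predict([[5.0, 0.5]], return_std=True)
        print(f"Cd ~ {cd[0]:.3f} +/- {sd[0]:.3f}")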

  16. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam report for PhD Student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM and the concept of unifying an abstract syntax tree with the ability for isolated extensions is described. The tool support includes a connection...... to UML and a test automation principle based on traces written as a kind of regular expressions....

  17. Use of System Dynamics Techniques in the Garrison Health Modelling Tool

    Science.gov (United States)

    2010-11-01

    Joint Health Command (JHC) tasked DSTO to develop techniques for modelling Defence health service delivery both in a Garrison environment in Australia and ... the Garrison Health Modelling Tool, a prototype software package designed to provide decision-support to JHC health officers and managers in a garrison

  18. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  19. A Modeling Tool for Household Biogas Burner Flame Port Design

    Science.gov (United States)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
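    The quantities the tool targets, port hydraulic diameter and mixture velocity, are quick to compute for a candidate geometry. A back-of-envelope sketch with illustrative numbers (not the Lotus stove's actual data):

        import math

        def hydraulic_diameter(area_m2, perimeter_m):
            """D_h = 4A / P; equals the diameter for a circular port."""
            return 4.0 * area_m2 / perimeter_m

        def port_exit_velocity(volume_flow_m3s, n_ports, port_area_m2):
            """Mean mixture velocity through the open area of all ports."""
            return volume_flow_m3s / (n_ports * port_area_m2)

        d = 2.5e-3  # a 2.5 mm circular port (assumed)
        area = math.pi * d**2 / 4
        print(f"D_h = {hydraulic_diameter(area, math.pi * d) * 1000:.2f} mm")
        print(f"v   = {port_exit_velocity(1.4e-4, 60, area):.2f} m/s")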

  20. SBML qualitative models: a model representation format and infrastructure to foster interactions between qualitative modelling formalisms and tools.

    Science.gov (United States)

    Chaouiya, Claudine; Bérenguier, Duncan; Keating, Sarah M; Naldi, Aurélien; van Iersel, Martijn P; Rodriguez, Nicolas; Dräger, Andreas; Büchel, Finja; Cokelaer, Thomas; Kowal, Bryan; Wicks, Benjamin; Gonçalves, Emanuel; Dorier, Julien; Page, Michel; Monteiro, Pedro T; von Kamp, Axel; Xenarios, Ioannis; de Jong, Hidde; Hucka, Michael; Klamt, Steffen; Thieffry, Denis; Le Novère, Nicolas; Saez-Rodriguez, Julio; Helikar, Tomáš

    2013-12-10

    Qualitative frameworks, especially those based on the logical discrete formalism, are increasingly used to model regulatory and signalling networks. A major advantage of these frameworks is that they do not require precise quantitative data, and that they are well-suited for studies of large networks. While numerous groups have developed specific computational tools that provide original methods to analyse qualitative models, a standard format to exchange qualitative models has been missing. We present the Systems Biology Markup Language (SBML) Qualitative Models Package ("qual"), an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks. We demonstrate the interoperability of models via SBML qual through the analysis of a specific signalling network by three independent software tools. Furthermore, the collective effort to define the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyse qualitative models. SBML qual allows the exchange of qualitative models among a number of complementary software tools. SBML qual has the potential to promote collaborative work on the development of novel computational approaches, as well as on the specification and the analysis of comprehensive qualitative models of regulatory and signalling networks.
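    The models SBML qual encodes are logical networks; the sketch below shows the kind of Boolean update dynamics such a model describes, as a generic Python illustration rather than the SBML qual XML serialization itself. The species and rules are invented.

        # A three-species Boolean regulatory network with synchronous updates.
        rules = {
            "A": lambda s: not s["C"],         # C represses A
            "B": lambda s: s["A"],             # A activates B
            "C": lambda s: s["A"] and s["B"],  # A and B jointly activate C
        }

        def synchronous_step(state):
            """Update every species simultaneously from the current state."""
            return {species: bool(rule(state)) for species, rule in rules.items()}

        state = {"A": True, "B": False, "C": False}
        for t in range(6):
            print(t, state)
            state = synchronous_step(state)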

  1. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Full Text Available Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions as compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.

  2. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model that combines body babbling and a neurodynamical system, enabling robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  3. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Susannah B. Lerman; Keith H. Nislow; David J. Nowak; Stephen DeStefano; David I. King; D. Todd. Jones-Farrand

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat...

  4. Scenario Evaluator for Electrical Resistivity Survey Pre-modeling Tool

    Science.gov (United States)

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, su...

  5. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or spee...... of processing elements, the size of local memories, and the operating systems (scheduling algorithm)....

  6. An Energy Systems Modelling Tool for the Social Simulation Community

    NARCIS (Netherlands)

    Bollinger, L. Andrew; van Blijswijk, Martti J.; Dijkema, Gerard P.J.; Nikolic, Igor

    2016-01-01

    The growing importance of links between the social and technical dimensions of the electricity infrastructure mean that many research problems cannot be effectively addressed without joint consideration of social and technical dynamics. This paper motivates the need for and introduces a tool to

  7. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  8. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  9. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K[D] approach has been the method of choice due to ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use; that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The model currently supports the K[D] and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO and a brief verification study comparing GEOQUIMICO results to data found in the literature is given
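    The contrast between the two conceptual models can be made concrete: a constant-K[D] isotherm stays linear at all concentrations, while a site-limited isotherm (a Langmuir form is used below as a crude stand-in for surface-complexation behaviour) saturates as sites fill. Parameter values are illustrative assumptions only.

        import numpy as np

        KD = 5.0     # L/kg, linear distribution coefficient (assumed)
        S_MAX = 2.0  # mol/kg, finite density of sorption sites (assumed)
        K_L = 10.0   # L/mol, affinity constant (assumed)

        c = np.logspace(-4, 0, 5)                      # aqueous conc., mol/L
        sorbed_kd = KD * c                             # linear at all c
        sorbed_scm = S_MAX * K_L * c / (1 + K_L * c)   # saturates as sites fill

        for ci, a, b in zip(c, sorbed_kd, sorbed_scm):
            print(f"c={ci:.1e}  Kd model={a:.3e}  site-limited model={b:.3e}")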

  10. Software tools for object-based audio production using the Audio Definition Model

    OpenAIRE

    Geier, Matthias; Carpentier, Thibaut; Noisternig, Markus; Warusfel, Olivier

    2017-01-01

    We present a publicly available set of tools for the integration of the Audio Definition Model (ADM) in production workflows. ADM is an open metadata model for the description of channel-, scene-, and object-based media within a Broadcast Wave Format (BWF) container. The software tools were developed within the European research project ORPHEUS (https://orpheus-audio.eu/) that aims at developing new end-to-end object-based media chains for broadcast. These tools allow ...

  11. A unified tool for performance modelling and prediction

    International Nuclear Information System (INIS)

    Gilmore, Stephen; Kloul, Leila

    2005-01-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony

  12. Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.

    Science.gov (United States)

    Brockman, Vicki

    As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.

  13. influence.ME: tools for detecting influential data in mixed effects models

    NARCIS (Netherlands)

    Nieuwenhuis, Rense; te Grotenhuis, M.; Pelzer, B.

    2012-01-01

    influence.ME provides tools for detecting influential data in mixed effects models. The application of these models has become common practice, but the development of diagnostic tools has lagged behind. influence.ME calculates standardized measures of influential data for the point estimates of

  14. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool is proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisations risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  15. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been developed stepwise and formally using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as result a state transition system modelling...

  16. NEMO-OνDE: a submarine station for real-time monitoring of acoustic background installed at 2000 m depth in the Mediterranean Sea

    OpenAIRE

    The NEMO Collaboration; Cosentino, L.; Favetta, M.; Larosa, G.; Pavan, G.; Romeo, D. J.; Privitera, S.; Speziale, F.

    2008-01-01

    The NEMO (NEutrino Mediterranean Observatory) Collaboration installed, 25 km E offshore the port of Catania (Sicily) at 2000 m depth, an underwater laboratory to perform long-term tests of prototypes and new technologies for an underwater high energy neutrino km³-scale detector in the Mediterranean Sea. In this framework the collaboration deployed and successfully operated for about two years, starting from January 2005, an experimental apparatus for on-line monitoring of deep-sea noise. T...

  17. Double-beta decay measurement of ¹⁰⁰Mo to the excited 0₁⁺ state of ¹⁰⁰Ru in the NEMO3 experiment - R&D program for SuperNEMO: development of a BiPo detector to measure ultra-low contaminations in the source foils

    International Nuclear Information System (INIS)

    Chapon, A.

    2011-10-01

    The NEMO3 detector was designed for the study of double beta decay and in particular the search for neutrinoless double beta decay (ββ0ν). The quantity of ¹⁰⁰Mo in the detector (7 kg) also allows a competitive measurement of the two-neutrino double beta decay (ββ2ν) of ¹⁰⁰Mo to the excited 0₁⁺ state of ¹⁰⁰Ru (eeNγ channel). Monte-Carlo simulations of the effect and of all the possible sources of background have been studied in order to determine their contributions to the full NEMO3 experimental data (2003-2011). These have then been analysed: the ββ2ν decay half-life has been measured, and a limit on the ββ0ν decay has been obtained. Moreover, the SuperNEMO experiment aims to reach a sensitivity up to 10²⁶ years on the half-life of neutrinoless double beta decay. The SuperNEMO detector radioactivity has to be as low as possible. In particular, radio-purity levels of 2 μBq·kg⁻¹ in ²⁰⁸Tl and 10 μBq·kg⁻¹ in ²¹⁴Bi are required for the source foils. Gamma-spectrometry cannot measure such low contamination levels. Hence, a dedicated BiPo detector has been developed to measure ²⁰⁸Tl and ²¹⁴Bi contaminations by identifying the Bi→Po→Pb β-α chains. A proof of principle has been performed and the detector background has been measured. Assuming these values, a full BiPo detector of 3.6 m² can achieve the required sensitivities for the SuperNEMO source foils within six months of measurement. (author)

  18. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used.... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling......, an important part of this technique is attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiate the whole integration...

  19. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state-drives (SSDs) which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against performances of real SSDs under different traffic workloads. Pros and cons of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market, together with product cost and quality. Over the last few years the authors developed an advanced simulator named “SSDExplorer” which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  20. Tools and data for the geochemical modeling. Thermodynamic data for sulfur species and background salts and tools for the uncertainty analysis; WEDA. Werkzeuge und Daten fuer die Geochemische Modellierung. Thermodynamische Daten fuer Schwefelspezies und Hintergrundsalze sowie Tools zur Unsicherheitsanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Sven; Schoenwiese, Dagmar; Scharge, Tina

    2015-07-15

    The report on tools and data for the geochemical modeling covers the following issues: experimental methods and theoretical models, design of a thermodynamic model for reduced sulfur species, thermodynamic models for background salts, tools for the uncertainty and sensitivity analyses of geochemical equilibrium modeling.

  1. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to

  2. The Quantum Atomic Model "Electronium": A Successful Teaching Tool.

    Science.gov (United States)

    Budde, Marion; Niedderer, Hans; Scott, Philip; Leach, John

    2002-01-01

    Focuses on the quantum atomic model Electronium. Outlines the Bremen teaching approach in which this model is used, and analyzes the learning of two students as they progress through the teaching unit. (Author/MM)

  3. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavior...

  4. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance of hard turning. Various models in hard turning by cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries so that the appropriate model can be used according to user requirement in hard turning.
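    Of the models reviewed, Usui's wear equation is compact enough to state directly: the wear rate scales with contact stress and sliding velocity and exponentially with inverse absolute temperature, dW/dt = A·σn·Vs·exp(−B/T). A sketch with hypothetical calibration constants:

        import math

        # Usui's characteristic wear-rate equation. A and B are illustrative
        # placeholders; in practice they are calibrated from turning experiments.
        A, B = 1.0e-8, 2.5e3

        def usui_wear_rate(sigma_n_mpa, v_s_m_min, temp_k):
            """Flank wear rate (arbitrary units) from Usui's model."""
            return A * sigma_n_mpa * v_s_m_min * math.exp(-B / temp_k)

        # Wear rises steeply with contact temperature at fixed stress and speed.
        for t in (900.0, 1100.0, 1300.0):
            print(t, usui_wear_rate(1500.0, 150.0, t))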

  5. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.

  6. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  7. Regional models - Emerging research tools for synoptic meteorologists

    Science.gov (United States)

    Keyser, Daniel; Uccellini, Louis W.

    1987-01-01

    A number of regional-scale numerical weather prediction models are discussed together with their application to the study of the structure and the dynamics of mesoscale phenomena. Consideration is given to investigations of natural phenomena (such as midlatitude cyclones and related baroclinic disturbances; upper-level jet-front systems; surface frontal zones, squall lines, and rain bands; mesoscale convective systems; and severe-storm environments) in which two operational models and four research models are used for regional-model studies. It is shown that these models provide investigators with four-dimensional dynamically consistent data sets to supplement and extend those available from observations.

  8. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    Full Text Available In this paper we present a tool that performs CUDA accelerated LTL Model Checking. The tool exploits parallel algorithm MAP adjusted to the NVIDIA CUDA architecture in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL Model Checking. We demonstrate that the tool outperforms non-accelerated version of the algorithm and we discuss where the limits of the tool are and what we intend to do in the future to avoid them.

  9. WeedML: a Tool for Collaborative Weed Demographic Modeling

    OpenAIRE

    Holst, Niels

    2010-01-01

    WeedML is a proposed standard to formulate models of weed demography, or maybe even complex models in general, that are both transparent and straightforward to re-use as building blocks for new models. The paper describes the design and thoughts behind WeedML which relies on XML and object-oriented systems development. Proof-of-concept software is provided as open-source C++ code and executables that can be downloaded freely.

  10. Ecological Modeling: A Tool for the Urban Educator.

    Science.gov (United States)

    Spikes, Frank

    Ecological modeling is a holistic systems level approach to situational analysis which can be used in planning activities for lifelong learning in an urban setting. It is the purpose of this essay to present a discussion of ecological modeling in its pure or conceptual sense and concomitantly to translate this analysis into an effective and…

  11. simulation tools for electrical machines modelling: teaching and ...

    African Journals Online (AJOL)

    Dr Obe

    used to model non-linearities in a synchronous machine. The machine is modeled in ... Electrical machines who are involved in engineering undergraduate education will find the script very useful in terms of ... Keywords: Asynchronous machine; MATLAB scripts; engineering education; skin-effect; saturation effect; dynamic ...

  12. KENO3D Visualization Tool for KENO V.a and KENO-VI Geometry Models

    International Nuclear Information System (INIS)

    Horwedel, J.E.; Bowman, S.M.

    2000-01-01

    Criticality safety analyses often require detailed modeling of complex geometries. Effective visualization tools can enhance checking the accuracy of these models. This report describes the KENO3D visualization tool developed at the Oak Ridge National Laboratory (ORNL) to provide visualization of KENO V.a and KENO-VI criticality safety models. The development of KENO3D is part of the current efforts to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system

  13. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also found to be invaluable at the design stage to influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effect on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermal-hydraulic analysis and field data, fouling margins are calculated. Individual effects of primary- and secondary-side fouling are separated through analyses, which allow station operators to decide what type of maintenance activity to perform and when to perform the maintenance activity. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance

  14. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs; this process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
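
    As a rough illustration of the frequency-ratio computation at the heart of such a BSA workflow, the sketch below shows the core ratio in plain Python/pandas. The column names and toy data are our assumptions for illustration, not part of the BSM tool itself.

```python
import pandas as pd

def frequency_ratio(classes: pd.Series, events: pd.Series) -> pd.Series:
    """Frequency ratio per class of a conditioning factor.

    classes : class label of each map cell (e.g. slope class)
    events  : 1 where a hazard event occurred in the cell, else 0

    FR(c) = (events in c / all events) / (cells in c / all cells);
    FR > 1 means the class is positively associated with events.
    """
    df = pd.DataFrame({"cls": classes, "evt": events})
    grouped = df.groupby("cls")
    event_share = grouped["evt"].sum() / df["evt"].sum()
    area_share = grouped.size() / len(df)
    return event_share / area_share

# Toy example: 10 cells in three slope classes, 3 event cells.
cls = pd.Series(["low", "low", "mid", "mid", "mid", "mid",
                 "high", "high", "high", "high"])
evt = pd.Series([0, 0, 0, 1, 0, 0, 1, 1, 0, 0])
print(frequency_ratio(cls, evt))
```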

  15. Towards diagnostic tools for analysing Swarm data through model retrievals

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Plank, Gernot; Haagmans, R.

    The objective of the Swarm mission is to provide the best ever survey of the geomagnetic field and its temporal dependency, and to gain new insights into improving our knowledge of the Earth’s interior and climate. The Swarm concept consists of a constellation of three satellites in three different polar orbits between 300 and 550 km altitude. The goal of the current study is to build tools and to analyze datasets, in order to allow a fast diagnosis of the Swarm system performance in orbit during the commissioning phase and operations of the spacecraft. The effects on the reconstruction of the magnetic... to test the influence of ionospheric residual signal or the impact of data selection on the lithospheric retrieval. Initially, the study considers one satellite and emphasises the lithospheric field reconstruction, but in a second step it is extended to a realistic Swarm constellation of three...

  16. Econometric Model – A Tool in Financial Management

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2011-06-01

    The economic situation in Romania requires from the trader a rigorous analysis of the vulnerabilities and opportunities offered by the external environment and a careful analysis of the internal conditions in which the entity operates. In this context, particular attention is paid to the indicators presented in the financial statements. They often serve as a basis for economic forecasts and future plans, and businesses that use them benefit from a sound forecasting activity. In this paper we propose to analyze the comparative evolution of the main financial indicators highlighted in the financial statements (profit and loss) through a multi-equation econometric model, namely a dynamic Keynesian model.

  17. Tactical Medical Logistics Planning Tool: Modeling Operational Risk Assessment

    National Research Council Canada - National Science Library

    Konoske, Paula

    2004-01-01

    ...) models the patient flow from the point of injury through more definitive care, and (2) supports operations research and systems analysis studies, operational risk assessment, and field medical services planning. TML+...

  18. An Introduction to Model Selection: Tools and Algorithms

    Directory of Open Access Journals (Sweden)

    Sébastien Hélie

    2006-03-01

    Model selection is a complicated matter in science, and psychology is no exception. In particular, the high variance in the object of study (i.e., humans) prevents the use of Popper’s falsification principle (which is the norm in other sciences). Therefore, the desirability of quantitative psychological models must be assessed by measuring the capacity of the model to fit empirical data. In the present paper, an error measure (likelihood), as well as five methods to compare model fits (the likelihood ratio test, Akaike’s information criterion, the Bayesian information criterion, bootstrapping and cross-validation), are presented. The use of each method is illustrated by an example, and the advantages and weaknesses of each method are also discussed.
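
    A minimal sketch of two of the fit-comparison criteria named above (AIC and BIC). The formulas are the standard textbook ones and the log-likelihood values are hypothetical, not taken from the paper.

```python
import numpy as np

def aic(log_likelihood: float, k: int) -> float:
    """Akaike's information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood: float, k: int, n: int) -> float:
    """Bayesian information criterion: k ln n - 2 ln L (lower is better)."""
    return k * np.log(n) - 2 * log_likelihood

# Toy comparison: two models fitted to n = 50 observations.
n = 50
candidates = [("simple (2 params)", -112.4, 2),
              ("complex (5 params)", -108.9, 5)]
for name, ll, k in candidates:
    print(f"{name}: AIC={aic(ll, k):.1f}  BIC={bic(ll, k, n):.1f}")
# BIC penalizes extra parameters more heavily than AIC, so the two
# criteria can disagree -- exactly the trade-off the paper discusses.
```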

  19. Modeling and Calculator Tools for State and Local Transportation Resources

    Science.gov (United States)

    Air quality models, calculators, guidance and strategies are offered for estimating and projecting vehicle air pollution, including ozone or smog-forming pollutants, particulate matter and other emissions that pose public health and air quality concerns.

  20. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    Science.gov (United States)

    2012-08-01

    [Abstract not recoverable: the record text consists of figure annotations from a MetalMapper cued survey at the Beale site, plus reference fragments: "Statistical classification of buried unexploded ordnance using nonparametric prior models," IEEE Trans. Geosci. Remote Sensing, 45:2794–2806, 2007; Bell and Barrow, "Subsurface discrimination using electromagnetic induction sensors," IEEE Trans. Geosci. Remote Sensing, 39:1286–1293, 2001.]

  1. Overview of software tools for modeling single event upsets in microelectronic devices

    Directory of Open Access Journals (Sweden)

    Anatoly Alexandrovich Smolin

    2016-10-01

    The paper presents the results of an analysis of existing simulation tools for evaluating the single event upset susceptibility of microelectronic devices with deep sub-micron feature sizes. These simulation tools are meant to replace the obsolete approach to single event rate estimation based on the integral rectangular parallelepiped model. Three main approaches implemented in simulation tools are considered: the combined use of particle transport codes and the rectangular parallelepiped model; the combined use of particle transport codes, analytical models of charge collection, and circuit simulators; and the combined use of particle transport codes and TCAD simulators.
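
    For orientation, the rectangular-parallelepiped picture these tools move beyond reduces to a simple charge-deposition estimate; a sketch using the standard LET-to-charge conversion for silicon (the cell parameters are illustrative, not from the paper):

```python
# In silicon, an LET of 1 MeV*cm^2/mg deposits roughly 1/97 pC of
# charge per micrometre of path length (standard rule of thumb).
PC_PER_MEV_CM2_MG_UM = 1.0 / 97.0

def deposited_charge_pc(let: float, path_um: float) -> float:
    """Charge (pC) deposited by a particle of a given LET crossing
    path_um micrometres of silicon."""
    return let * path_um * PC_PER_MEV_CM2_MG_UM

def threshold_let(q_crit_pc: float, depth_um: float) -> float:
    """LET (MeV*cm^2/mg) needed to deposit the critical charge over a
    normal-incidence traversal of the sensitive volume."""
    return q_crit_pc / (depth_um * PC_PER_MEV_CM2_MG_UM)

# A cell with 10 fC critical charge and a 1 um-deep sensitive volume:
print(f"threshold LET ~ {threshold_let(0.010, 1.0):.1f} MeV*cm^2/mg")
```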

  2. Monte Carlo tools for Beyond the Standard Model Physics , April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools already exist for the study of low energy supersymmetry and the MSSM in particular, this workshop will instead focus on tools for alternative TeV-scale physics models. The main goals of the workshop are: To survey what is available. To provide feedback on user experiences with Monte Carlo tools for BSM. To identify promising models (or processes) for which the tools have not yet been constructed and start filling up these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  3. SARAH 4: A tool for (not only SUSY) model builders

    Science.gov (United States)

    Staub, Florian

    2014-06-01

    We present the new version of the Mathematica package SARAH, which provides the same features for a non-supersymmetric model as previous versions did for supersymmetric models. This includes an easy and straightforward definition of the model, and the calculation of all vertices, mass matrices, tadpole equations, and self-energies. The two-loop renormalization group equations for a general gauge theory are now also included and have been validated against the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD and the UFO format can be written, and source code for SPheno can be generated for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version, and by linking Susyno the handling of Lie groups has been improved and extended.

  4. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data, and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by current versions of Internet Explorer. We suggest using other browsers, e.g. Google Chrome or Mozilla Firefox, when accessing BNW.) ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.
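
    BNW itself is a web server, but the same workflow it describes (constraint-aware structure learning followed by parameter fitting) can be sketched with the open-source pgmpy library. The toy data and the choice of pgmpy are our assumptions for illustration, not BNW internals.

```python
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore
from pgmpy.models import BayesianNetwork

# Toy discrete dataset: genotype -> expression -> trait (500 samples).
rng = np.random.default_rng(0)
genotype = rng.integers(0, 2, 500)
expression = (genotype + rng.integers(0, 2, 500)) // 2
trait = (expression + rng.integers(0, 2, 500)) // 2
data = pd.DataFrame({"genotype": genotype,
                     "expression": expression,
                     "trait": trait})

# Structural constraint, analogous to BNW's constraint interface:
# forbid edges pointing *into* genotype, which is biologically upstream.
blacklist = [("expression", "genotype"), ("trait", "genotype")]
dag = HillClimbSearch(data).estimate(scoring_method=BicScore(data),
                                     black_list=blacklist)

model = BayesianNetwork(dag.edges())
model.fit(data)  # maximum-likelihood conditional probability tables
print(sorted(dag.edges()))
```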

  5. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J.L.; Perez, L.; Gonzalez, R.M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate systems for predicting smoke into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the South-East of Spain and the North of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module calculates, at each time step, the fuel moisture content of the dead fuels and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)
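
    The abstract does not spell out the custom fuel moisture model, but a common baseline for dead-fuel moisture is a time-lag (exponential relaxation) scheme in which moisture relaxes toward the equilibrium moisture content with a class-dependent time constant. The sketch below is that generic scheme under assumed parameters, not the authors' module.

```python
import numpy as np

def dead_fuel_moisture(m0: float, m_eq: np.ndarray, dt_hours: float,
                       tau_hours: float) -> np.ndarray:
    """Time-lag model: dm/dt = (m_eq - m) / tau, stepped in time.

    m0        initial moisture content (fraction of dry weight)
    m_eq      equilibrium moisture content per step (from T and RH)
    dt_hours  time step
    tau_hours fuel time-lag class (1, 10 or 100 h for dead fuels)
    """
    m = np.empty(len(m_eq))
    decay = np.exp(-dt_hours / tau_hours)  # exact solution over one step
    prev = m0
    for i, eq in enumerate(m_eq):
        prev = eq + (prev - eq) * decay
        m[i] = prev
    return m

# Diurnal cycle of equilibrium moisture over 48 hourly steps.
hours = np.arange(48)
m_eq = 0.10 + 0.05 * np.sin(2 * np.pi * hours / 24)
print(dead_fuel_moisture(0.25, m_eq, dt_hours=1.0, tau_hours=10.0)[:6])
```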

  6. Forest fire forecasting tool for air quality modelling systems

    International Nuclear Information System (INIS)

    San Jose, R.; Perez, J.L.; Perez, L.; Gonzalez, R.M.; Pecci, J.; Palacios, M.

    2015-01-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate systems for predicting smoke into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the South-East of Spain and the North of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module calculates, at each time step, the fuel moisture content of the dead fuels and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  7. Forest fire forecasting tool for air quality modelling systems

    International Nuclear Information System (INIS)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-01-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate systems for predicting smoke into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the South-East of Spain and the North of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module calculates, at each time step, the fuel moisture content of the dead fuels and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  8. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate systems for predicting smoke into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the South-East of Spain and the North of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module calculates, at each time step, the fuel moisture content of the dead fuels and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  9. Monte Carlo tools for Beyond the Standard Model Physics , April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools already exist for the study of low energy supersymmetry and the MSSM in particular, this workshop will instead focus on tools for alternative TeV-scale physics models. The main goals of the workshop are: To survey what is available. To provide feedback on user experiences with Monte Carlo tools for BSM. To identify promising models (or processes) for which the tools have not yet been constructed and start filling up these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  10. Static Stiffness Modeling of a Novel PKM-Machine Tool Structure

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2014-07-01

    This article presents a new configuration of a 3-dof machine tool with parallel kinematics. Elastic deformations of the machine tool have been modeled with finite elements, and stiffness coefficients at characteristic points of the working area have been calculated for different cutting forces.

  11. Nemo Solus Satis Sapit: Trends of Research Collaborations in the Vietnamese Social Sciences, Observing 2008–2017 Scopus Data

    Directory of Open Access Journals (Sweden)

    Quan-Hoang Vuong

    2017-10-01

    “Nemo solus satis sapit”—no one can be wise enough on his own. This is particularly true when it comes to collaborations in scientific research. Concerns over this issue in Vietnam, a developing country with limited academic resources, led to an in-depth study of Vietnamese social science research, using Google Scholar and Scopus, during 2008–2017. The results showed that more than 90% of scientists had worked with colleagues to publish, and that they had collaborated 13 times on average within the period covered by the data sample. These collaborations, both domestic and international, mildly boosted author performance. On the other hand, the modest number of publications by Vietnamese authors was reportedly linked to Vietnamese social scientists’ heavy reliance on collaborative work as non-leading co-authors: over the entire decade (2008–2017), the average author assumed the leading role in merely two articles and hardly ever published alone. This implies that policy-makers ought to consider promoting institutional collaborations while also encouraging authors to acquire the experience of publishing solo.

  12. Biological profiling and dose-response modeling tools ...

    Science.gov (United States)

    Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number of concentration-response data sets. Standard processing of these data sets involves finding a best-fitting mathematical model and the set of parameters that specify this model. The model parameters include quantities such as the half-maximal activity concentration (or “AC50”) that have biological significance and can be used to inform the efficacy or potency of a given chemical with respect to a given assay. All of these data are processed and stored in an online-accessible database and website: http://actor.epa.gov/dashboard2. Results from these in vitro assays are used in a multitude of ways. New pathways and targets can be identified and incorporated into new or existing adverse outcome pathways (AOPs). Pharmacokinetic models such as those implemented in EPA’s HTTK R package can be used to translate an in vitro concentration into an in vivo dose; i.e., one can predict the oral equivalent dose that might be expected to activate a specific biological pathway. Such predicted values can then be compared with estimated actual human exposures to prioritize chemicals for further testing. Any quantitative examination should be accompanied by estimation of uncertainty. We are developing met
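
    A hedged sketch of the kind of concentration-response fit described above, using a standard Hill model and synthetic data. The ToxCast pipeline's actual model-selection and fitting procedure is more elaborate; the parameter values and bounds here are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50, n):
    """Hill model: response rising from 0 to `top`, half-maximal at ac50."""
    return top * conc**n / (ac50**n + conc**n)

# Synthetic concentration-response data (log-spaced doses, arbitrary units).
conc = np.logspace(-3, 2, 8)
rng = np.random.default_rng(1)
resp = hill(conc, top=85.0, ac50=1.5, n=1.2) + rng.normal(0, 3, conc.size)

popt, pcov = curve_fit(hill, conc, resp, p0=[80.0, 1.0, 1.0],
                       bounds=([0, 1e-4, 0.3], [150, 1e3, 8]))
perr = np.sqrt(np.diag(pcov))  # rough 1-sigma parameter uncertainty
print(f"AC50 = {popt[1]:.2f} +/- {perr[1]:.2f}")
```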

  13. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API (an Internet-based tool combining DHTML and AJAX) which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which are then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
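
    The coordinate handling described (converting coordinates for display in Mercator projection) reduces to the standard spherical Mercator formulas. A minimal sketch, with the planetary radius as an assumed parameter; this is not the MARTIAN project's code.

```python
import math

def to_mercator(lat_deg: float, lon_deg: float, radius: float):
    """Convert geographic coordinates on a sphere to Mercator x, y
    (same units as radius). Mercator is undefined at the poles, so
    callers should clamp |lat| < 90.
    """
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = radius * lam
    y = radius * math.log(math.tan(math.pi / 4 + phi / 2))
    return x, y

# A point at 45 N, 90 E on a Mars-sized sphere (mean radius ~3389.5 km):
print(to_mercator(45.0, 90.0, radius=3389.5))
```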

  14. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling

    Directory of Open Access Journals (Sweden)

    Raul Torres-Ruiz

    2015-09-01

    The cancer-modelling field is now experiencing a conversion with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has been proven as a robust technology that makes it possible to generate cellular and animal models that recapitulate those cooperative alterations rapidly and at low cost. In this review, we will discuss the innovative applications of the CRISPR-Cas9 system to generate new models, providing a new way to interrogate the development and progression of cancers.

  15. Models and tools for studying drought stress responses in peas.

    Science.gov (United States)

    Magyar-Tábori, Katalin; Mendler-Drienyovszki, Nóra; Dobránszki, Judit

    2011-12-01

    The pea (Pisum sativum L.) is an important pulse crop, but its growing area is limited because of its relatively low yield stability. In many parts of the world the most important abiotic factor limiting the survival and yield of plants is a restricted water supply, and crop productivity can only be increased by improving drought tolerance. Development of pea cultivars well adapted to dry conditions has been one of the major tasks in breeding programs. Conventional breeding of new cultivars for dry conditions requires extensive selection and testing for yield performance over diverse environments using various biometrical approaches. Several morphological and biochemical traits have been shown to be related to drought resistance, and methods based on physiological attributes can also be used in the development of better varieties. Osmoregulation plays a role in the maintenance of turgor pressure under water stress conditions, and information on the behaviour of genotypes under osmotic stress can help selection for drought resistance. Biotechnological approaches, including in vitro tests, genetic transformation, and the use of molecular markers and mutants, could be useful tools in pea breeding. In this minireview we summarize the present status of different approaches related to improving drought stress tolerance in pea.

  16. Queuing Models: A Tool For Assessing The Profitability Of Barbing ...

    African Journals Online (AJOL)

    The study considered small-scale business as an option for reducing the unemployment rate in our society. It uses queuing models to assess the profitability of the barbing salon business in Agbor town of Delta State. The results of the study indicate that the distribution of inter-arrival times, service times, and waiting ...
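
    The abstract is truncated before the model details, but the simplest queuing model for a single-barber shop is M/M/1. A sketch of its standard steady-state formulas with illustrative arrival and service rates (not the study's data):

```python
def mm1_metrics(lam: float, mu: float) -> dict:
    """Steady-state metrics of an M/M/1 queue (one server,
    Poisson arrivals, exponential service times).

    lam : mean arrival rate (customers/hour)
    mu  : mean service rate (customers/hour); requires lam < mu
    """
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                # server utilization
    return {
        "utilization": rho,
        "L": rho / (1 - rho),     # mean number in system
        "W": 1 / (mu - lam),      # mean time in system (hours)
        "Lq": rho**2 / (1 - rho), # mean queue length
        "Wq": rho / (mu - lam),   # mean wait before service
    }

# Example: 4 arrivals/hour, one barber serving 5 customers/hour on average.
print(mm1_metrics(lam=4.0, mu=5.0))
```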

  17. Numerical modeling as a tool for sustainable water management

    Science.gov (United States)

    Zacharias, I.; Dimitriou, E.; Koussouris, Th.

    2003-04-01

    Combining environmental preservation and economic prosperity is a primary objective of most developmental activities nowadays. Sustainable water resources management can contribute to achieving this objective, especially in wetland areas that often undergo significant stresses due to irrational water exploitation schemes. Applying numerical modeling to design sustainable water management scenarios has been common practice during the last decade, but it is also contested by many scientists and environmental managers. This scientific effort attempted to develop and assess a methodology for forming water management plans in lake catchments by combining GIS applications, remote-sensing techniques and physically-based hydrologic modeling. The advantages and disadvantages of this methodology, and particularly of the use of numerical modeling in the water management formation process, have been examined through a case study in the Trichonis lake catchment, W. Greece. In this area, significant wetlands with the endangered calcareous fens habitat are encountered, and these have undergone significant degradation during the last 30 years. The results indicated that the methodology provided water management scenarios that fulfilled both the environmental and anthropogenic demands without compromising the replenishment potential of the local water resources. Numerical modeling operated efficiently, accelerated the formation of the water management plan and offered scenarios that can be easily applied and amended by the local water authorities.

  18. Mathematical modelling : a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, H; Hellriegel, B

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  19. Mathematical modelling: a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B.

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  20. Mathematical modelling: a tool for hospital infection control.

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  1. Agent-Based Modeling: A Powerful Tool for Tourism Researchers

    NARCIS (Netherlands)

    Nicholls, Sarah; Amelung, B.; Student, Jillian

    2017-01-01

    Agent-based modeling (ABM) is a way of representing complex systems of autonomous agents or actors, and of simulating the multiple potential outcomes of these agents’ behaviors and interactions in the form of a range of alternatives or futures. Despite the complexity of the tourism system, and the

  2. 3-C Models Teaching Tools to Promote Social Justice

    Science.gov (United States)

    Marbley, Aretha Faye; Rouson, Leon; Burley, Hansel; Ross, Wendy; Bonner, Fred A., II; Lértora, Ian; Huang, Shih-Han

    2017-01-01

    Equipping future professionals and educators with critical global multicultural competences and skills to work with people from diverse backgrounds is a challenge for both predominantly White institutions (PWIs) and Historically Black Colleges and Universities (HBCUs). The major objective of this article is to introduce an adaptable model with an…

  3. Modeling mind-wandering: a tool to better understand distraction

    NARCIS (Netherlands)

    van Vugt, Marieke; Taatgen, Niels; Sackur, Jerome; Bastian, Mikael; Taatgen, Niels; van Vugt, Marieke; Borst, Jelmer; Mehlhorn, Katja

    2015-01-01

    When we get distracted, we may engage in mind-wandering, or task-unrelated thinking, which impairs performance on cognitive tasks. Yet, we do not have cognitive models that make this process explicit. On the basis of both recent experiments that have started to investigate mind-wandering and

  4. Selecting Tools to Model Integer and Binomial Multiplication

    Science.gov (United States)

    Pratt, Sarah Smitherman; Eddy, Colleen M.

    2017-01-01

    Mathematics teachers frequently provide concrete manipulatives to students during instruction; however, the rationale for using certain manipulatives in conjunction with concepts may not be explored. This article focuses on area models that are currently used in classrooms to provide concrete examples of integer and binomial multiplication. The…

  5. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation

  6. The Visible Signature Modelling and Evaluation ToolBox

    Science.gov (United States)

    2008-12-01

    [Table-of-contents residue removed; the listed sections cover Rhinoceros, MODTRAN and HYDROLIGHT.] The commercial support software has a number of different functionalities. Rhinoceros provides the wireframe models required as input... Greyscale texture synthesis takes a greyscale input image and a uniform white-noise texture. The white-noise texture is modified to reproduce certain...

  7. Computer modelling as a tool for understanding language evolution

    NARCIS (Netherlands)

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  8. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    ...data. ...of programmable calculators (starting around 1956 with the introduction of Fortran), computers as visualization aids (around 1970)... Various applications of computer-assisted molecular modeling techniques are... The methods are less complicated, fast, and are able to handle very large systems...

  9. Animal models for arthritis: innovative tools for prevention and treatment

    NARCIS (Netherlands)

    Kollias, G.; Papadaki, P.; Apparailly, F.; Vervoordeldonk, M.J.; Holmdahl, R.; Baumans, V.; Desaintes, C.; Di Santo, J.; Distler, J.; Garside, P.; Hegen, M.; Huizinga, T.W.J.; Jüngel, A.; Klareskog, L.; McInnes, I.; Ragoussis, I.; Schett, G.; Hart, B.t.; Tak, P.P.; Toes, R.; van den Berg, W.; Wurst, W.; Gay, S.

    2011-01-01

    The development of novel treatments for rheumatoid arthritis (RA) requires the interplay between clinical observations and studies in animal models. Given the complex molecular pathogenesis and highly heterogeneous clinical picture of RA, there is an urgent need to dissect its multifactorial nature

  10. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
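
    A minimal sketch of the Fourier-transform variant of this idea: reduce two full-field strain maps to a few low-frequency descriptors and compare those instead of raw pixels. The synthetic fields and the simple agreement metric are our assumptions, not the paper's procedure.

```python
import numpy as np

def fourier_descriptors(field: np.ndarray, k: int = 8) -> np.ndarray:
    """Reduce a full-field map (e.g. a strain image) to the magnitudes
    of its k x k lowest spatial-frequency Fourier coefficients."""
    coeffs = np.fft.fft2(field)
    return np.abs(coeffs[:k, :k]).ravel()

# Compare a "measured" and a "modelled" strain field on a 128x128 grid.
y, x = np.mgrid[0:128, 0:128] / 128.0
measured = np.sin(2 * np.pi * x) * np.exp(-y)  # stand-in for DIC data
modelled = measured + 0.02 * np.random.default_rng(2).normal(size=measured.shape)

d_meas = fourier_descriptors(measured)   # 64 descriptors vs 16384 pixels
d_model = fourier_descriptors(modelled)
# One simple agreement metric: relative discrepancy of descriptor vectors.
print(np.linalg.norm(d_meas - d_model) / np.linalg.norm(d_meas))
```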

  11. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  12. About Using Predictive Models and Tools To Assess Chemicals under TSCA

    Science.gov (United States)

    As part of EPA's effort to promote chemical safety, OPPT provides public access to predictive models and tools which can help inform the public on the hazards and risks of substances and improve chemical management decisions.

  13. The Integrated Medical Model: A Decision Support Tool for In-flight Crew Health Care

    Science.gov (United States)

    Butler, Doug

    2009-01-01

    This viewgraph presentation reviews the development of an Integrated Medical Model (IMM) decision support tool for in-flight crew health care safety. Clinical methods, resources, and case scenarios are also addressed.

  14. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...
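
    The abstract is truncated, but one standard way to use a nonlinear growth model for an EAC is to fit a saturating curve to cumulative cost and read off its asymptote. A sketch with hypothetical data and a Gompertz curve, which is one plausible choice rather than necessarily the thesis's model:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """Gompertz growth curve; `a` is the asymptote, read here as the EAC."""
    return a * np.exp(-b * np.exp(-c * t))

# Hypothetical cumulative cost (in $M) reported over 10 program reviews.
t = np.arange(1, 11)
cost = np.array([5.1, 9.8, 16.2, 23.5, 30.1, 35.4, 39.2, 41.8, 43.5, 44.6])

popt, _ = curve_fit(gompertz, t, cost, p0=[50.0, 3.0, 0.3], maxfev=10000)
print(f"Estimate at completion (asymptote): ${popt[0]:.1f}M")
```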

  15. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  16. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  17. Prediction Model and Risk Stratification Tool for Survival in Patients With CKD

    Directory of Open Access Journals (Sweden)

    Alexander S. Goldfarb-Rumyantzev

    2018-03-01

    Conclusion: The risk stratification tool and prediction model of 2-year mortality demonstrated good performance and may be used in clinical practice to quantify the risk of death for individual patients with CKD.

  18. The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods

    NARCIS (Netherlands)

    Verpoorten, Dominique; Poumay, M; Leclercq, D

    2006-01-01

    Please, cite this publication as: Verpoorten, D., Poumay, M., & Leclercq, D. (2006). The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence

  19. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  20. Multi-Physics Computational Modeling Tool for Materials Damage Assessment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is to provide a multi-physics modeling tool for materials damage assessment for application to future aircraft design. The software...

  1. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    coastal ocean sufficiently to have a complete picture of the flow. The analysis will thus consist of comparing these incomplete pictures of the current...50 cm. This would suggest that tidal flats would exist at synoptic scales but not daily because there are expanses of the lagoon that are < 50 cm...historical daily data from the correct time of year but not from the correct day. This indicates that the model flow is generally correct at synoptic

  2. 3D model tools for architecture and archaeology reconstruction

    Science.gov (United States)

    Vlad, Ioan; Herban, Ioan Sorin; Stoian, Mircea; Vilceanu, Clara-Beatrice

    2016-06-01

    The main objective of architectural and patrimonial survey is to provide a precise documentation of the status quo of the surveyed objects (monuments, buildings, archaeological object and sites) for preservation and protection, for scientific studies and restoration purposes, for the presentation to the general public. Cultural heritage documentation includes an interdisciplinary approach having as purpose an overall understanding of the object itself and an integration of the information which characterize it. The accuracy and the precision of the model are directly influenced by the quality of the measurements realized on field and by the quality of the software. The software is in the process of continuous development, which brings many improvements. On the other side, compared to aerial photogrammetry, close range photogrammetry and particularly architectural photogrammetry is not limited to vertical photographs with special cameras. The methodology of terrestrial photogrammetry has changed significantly and various photographic acquisitions are widely in use. In this context, the present paper brings forward a comparative study of TLS (Terrestrial Laser Scanner) and digital photogrammetry for 3D modeling. The authors take into account the accuracy of the 3D models obtained, the overall costs involved for each technology and method and the 4th dimension - time. The paper proves its applicability as photogrammetric technologies are nowadays used at a large scale for obtaining the 3D model of cultural heritage objects, efficacious in their assessment and monitoring, thus contributing to historic conservation. Its importance also lies in highlighting the advantages and disadvantages of each method used - very important issue for both the industrial and scientific segment when facing decisions such as in which technology to invest more research and funds.

  3. Variable fused deposition modelling - concept design and tool path generation

    OpenAIRE

    Brooks, Hadley Laurence

    2011-01-01

    Current Fused Deposition Modelling (FDM) techniques use fixed diameter nozzles to deposit a filament of plastic layer by layer. The consequence is that the same small nozzle, essential for fine details, is also used to fill in relatively large volumes. In practice a Pareto-optimal nozzle diameter is chosen that attempts to maximise resolution while minimising build time. This paper introduces a concept for adapting an additive manufacturing system, which exploits a variable diameter nozzle fo...

  4. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with a non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  5. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    Science.gov (United States)

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad, using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal uncertainties as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations
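
    The simplest probabilistic backbone of such a weighing scheme is Bayesian updating on the log-odds scale. The sketch below assumes the lines of evidence are conditionally independent (a full Bayesian network, as in the paper, can relax this); all numbers are illustrative.

```python
import numpy as np

def posterior_probability(prior: float, likelihood_ratios: list) -> float:
    """Combine independent lines of evidence on the log-odds scale.

    prior              prior probability that the site is impaired
    likelihood_ratios  P(observation | impaired) / P(observation | unimpaired)
                       for each line of evidence
    """
    log_odds = np.log(prior / (1 - prior)) + np.sum(np.log(likelihood_ratios))
    return 1 / (1 + np.exp(-log_odds))

# Sediment triad example: chemistry strongly indicative, bioassay mildly
# indicative, benthic community nearly uninformative (LR ~ 1).
print(posterior_probability(prior=0.2, likelihood_ratios=[6.0, 2.0, 1.1]))
```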

  6. Habitat hydraulic models - a tool for Danish stream quality assessment?

    DEFF Research Database (Denmark)

    Olsen, Martin

    In relation to the European Water Framework Directive (WFD), Danish water management has to change to a holistic management approach that considers both groundwater and surface water at the same time. Furthermore, the WFD introduces the concept of "good ecological status", where the quality of the biological community... in Danish stream management and stream quality assessment. The stream Ledreborg catchment is modelled using a precipitation-runoff model (NAM) and, as an addition to the normal calibration procedure (Kronvang et al., 2000), the model is calibrated using DAISY-adjusted evaporation data. The impact of groundwater abstraction on stream discharge is assessed, and in relation to this the relative importance of variations in precipitation, evaporation/temperature and groundwater abstraction is discussed. Physical habitat preferences for trout in the stream Ledreborg are assessed through a series of field...

  7. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  8. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  9. Decision modelling tools for utilities in the deregulated energy market

    Energy Technology Data Exchange (ETDEWEB)

    Makkonen, S. [Process Vision Oy, Helsinki (Finland)

    2005-07-01

    ... strategic decision support has also faced new challenges. This thesis introduces two applications involving multiple criteria decision making methods. The first application explores the decision making problem caused by the introduction of 'green' electricity that creates additional value for renewable energy. In this problem the stochastic multicriteria acceptability analysis method (SMAA) is applied. The second strategic multi-criteria decision making study discusses two different energy-related operations research problems: the elements of risk analysis in the energy field and the evaluation of different choices with a decision support tool accommodating incomplete preference information to help energy companies select a proper risk management system. The application is based on the rank inclusion in criteria hierarchies (RICH) method. (orig.)

  10. Decision modelling tools for utilities in the deregulated energy market

    International Nuclear Information System (INIS)

    Makkonen, S.

    2005-01-01

    ... strategic decision support has also faced new challenges. This thesis introduces two applications involving multiple criteria decision making methods. The first application explores the decision making problem caused by the introduction of 'green' electricity that creates additional value for renewable energy. In this problem the stochastic multicriteria acceptability analysis method (SMAA) is applied. The second strategic multi-criteria decision making study discusses two different energy-related operations research problems: the elements of risk analysis in the energy field and the evaluation of different choices with a decision support tool accommodating incomplete preference information to help energy companies select a proper risk management system. The application is based on the rank inclusion in criteria hierarchies (RICH) method. (orig.)

  11. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models

    Science.gov (United States)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community. PMID:27536246
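
    The MetaboTools are built on the MATLAB COBRA Toolbox. A comparable minimal workflow can be sketched in Python with cobrapy (our choice of library, not part of MetaboTools), constraining an exchange flux as a stand-in for integrating an extracellular metabolomic measurement:

```python
from cobra.io import load_model

# The "textbook" E. coli core model ships with cobrapy.
model = load_model("textbook")

# Mimic integration of a measured extracellular metabolite by bounding
# the glucose exchange flux (uptake is a negative flux by convention).
model.reactions.get_by_id("EX_glc__D_e").lower_bound = -5.0  # mmol/gDW/h

solution = model.optimize()  # flux balance analysis
print(f"Predicted growth rate: {solution.objective_value:.3f} 1/h")
```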

  12. Finding Difference: Nemo and Friends Opening the Door to Disability Theory

    Science.gov (United States)

    Preston, Daniel L.

    2010-01-01

    While middle school and high school students may have watched the Disney and Disney/Pixar films when they were younger, chances are they did not do so with a critical eye toward difference and disability, despite the fact that these films serve as excellent tools for teaching about difference. Recent estimates label 20% of the world's population…

  13. How can land-use modelling tools inform bioenergy policies?

    Science.gov (United States)

    Davis, Sarah C.; House, Joanna I.; Diaz-Chavez, Rocio A.; Molnar, Andras; Valin, Hugo; DeLucia, Evan H.

    2011-01-01

    Targets for bioenergy have been set worldwide to mitigate climate change. Although feedstock sources are often ambiguous, pledges in European nations, the United States and Brazil amount to more than 100 Mtoe of biorenewable fuel production by 2020. As a consequence, the biofuel sector is developing rapidly, and it is increasingly important to distinguish bioenergy options that can address energy security and greenhouse gas mitigation from those that cannot. This paper evaluates how bioenergy production affects land-use change (LUC), and to what extent land-use modelling can inform sound decision-making. We identified local and global internalities and externalities of biofuel development scenarios, reviewed relevant data sources and modelling approaches, identified sources of controversy about indirect LUC (iLUC) and then suggested a framework for comprehensive assessments of bioenergy. Ultimately, plant biomass must be managed to produce energy in a way that is consistent with the management of food, feed, fibre, timber and environmental services. Bioenergy production provides opportunities for improved energy security, climate mitigation and rural development, but the environmental and social consequences depend on feedstock choices and geographical location. The most desirable solutions for bioenergy production will include policies that incentivize regionally integrated management of diverse resources with low inputs, high yields, co-products, multiple benefits and minimal risks of iLUC. Many integrated assessment models include energy resources, trade, technological development and regional environmental conditions, but do not account for biodiversity and lack detailed data on the location of degraded and underproductive lands that would be ideal for bioenergy production. Specific practices that would maximize the benefits of bioenergy production regionally need to be identified before a global analysis of bioenergy-related LUC can be accomplished.

  14. Modelling as an indispensible research tool in the information society.

    Science.gov (United States)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda", defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To

  15. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software which helps to improve human work by reducing error rates, risk factors in the working environment and workplace injuries, and by eliminating emerging occupational diseases. The tools are categorized in the field of micro ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the problems at hand.

  16. Systematic Methods and Tools for Computer Aided Modelling

    OpenAIRE

    Fedorova, Marina; Gani, Rafiqul; Sin, Gürkan

    2015-01-01

    Models play important roles in the design and analysis of chemically and biochemically based products and of the processes that manufacture them. Model-based methods and tools have the potential to reduce the number of experiments, which can be expensive and time-consuming, and to select the candidates on which the experimental effort should be focused. In this project, a general modelling framework was developed for systematic model construction by means of model templates. The modelling framework supports a...

  17. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through a 'manual' management of quality. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall's, Boehm's...

  18. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have effectively used Ideogramic UML to teach object-oriented modeling and the UML to groups of students using the UML for project assignments.

  19. Uranium resources evaluation model as an exploration tool

    International Nuclear Information System (INIS)

    Ruzicka, V.

    1976-01-01

    Evaluation of uranium resources, as conducted by the Uranium Resources Evaluation Section of the Geological Survey of Canada, comprises operations analogous with those performed during the preparatory stages of uranium exploration. The uranium resources evaluation model, simulating the estimation process, can be divided into four steps. The first step includes definition of major areas and ''unit subdivisions'' for which geological data are gathered, coded, computerized and retrieved. Selection of these areas and ''unit subdivisions'' is based on a preliminary appraisal of their favourability for uranium mineralization. The second step includes analyses of the data, definition of factors controlling uranium mineralization, classification of uranium occurrences into genetic types, and final delineation of favourable areas; this step corresponds to the selection of targets for uranium exploration. The third step includes geological field work; it is equivalent to geological reconnaissance in exploration. The fourth step comprises computation of resources; the preliminary evaluation techniques in exploration are, as a rule, analogous with the simplest methods employed in the resource evaluation. The uranium resources evaluation model can be conceptually applied for decision-making during exploration or for formulation of exploration strategy using the quantified data as weighting factors. (author)

  20. Assessment of the Clinical Trainer as a Role Model: A Role Model Apperception Tool (RoMAT)

    NARCIS (Netherlands)

    Jochemsen-van der Leeuw, H. G. A. Ria; van Dijk, Nynke; Wieringa-de Waard, Margreet

    2014-01-01

    Purpose Positive role modeling by clinical trainers is important for helping trainees learn professional and competent behavior. The authors developed and validated an instrument to assess clinical trainers as role models: the Role Model Apperception Tool (RoMAT). Method On the basis of a 2011

  1. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Full Text Available Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block, and analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, using this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, involving experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
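
    The multi-parameter fitting step mentioned above can be illustrated with a tiny example: estimating a latency/bandwidth pair for one "communication block" from timing measurements. The sketch below is a generic least-squares fit in Python with made-up timings, not the authors' compiler or interpreter.

        import numpy as np

        # Hypothetical measurements for one communication block:
        # message sizes in bytes and observed transfer times in seconds.
        sizes = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
        times = np.array([1.2e-5, 3.1e-5, 2.2e-4, 2.1e-3, 2.0e-2])

        # Model: time = alpha + size / beta, with alpha the latency and
        # beta the bandwidth; fit time as a linear function of size.
        slope, alpha = np.polyfit(sizes, times, 1)
        beta = 1.0 / slope  # bytes per second

        print(f"latency   alpha = {alpha:.3e} s")
        print(f"bandwidth beta  = {beta:.3e} B/s")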

  2. Network Models: An Underutilized Tool in Wildlife Epidemiology?

    Directory of Open Access Journals (Sweden)

    Meggan E. Craft

    2011-01-01

    Full Text Available Although the approach of contact network epidemiology has been increasing in popularity for studying transmission of infectious diseases in human populations, it has generally been an underutilized approach for investigating disease outbreaks in wildlife populations. In this paper we explore the differences between the type of data that can be collected on human and wildlife populations, provide an update on recent advances that have been made in wildlife epidemiology by using a network approach, and discuss why networks might have been underutilized and why networks could and should be used more in the future. We conclude with ideas for future directions and a call for field biologists and network modelers to engage in more cross-disciplinary collaboration.

  3. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal is to provide a user-friendly tool for developing fish population models useful to natural resource
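
    As a rough illustration of the distinction drawn above, the sketch below applies stochastic variation at both the iteration level (a parameter drawn once per model run) and the time-step level (a new environmental draw every year). All parameter values are hypothetical, and the real tool is implemented in Visual Basic for Applications rather than Python.

        import numpy as np

        rng = np.random.default_rng(42)
        n_iterations, n_years = 1000, 50

        final_sizes = np.empty(n_iterations)
        for i in range(n_iterations):
            # Iteration-level (parameter) variance: one draw per run.
            mean_survival = rng.normal(0.85, 0.02)
            population = 500.0
            for t in range(n_years):
                # Time-step (temporal) variance: a new draw every year.
                survival = np.clip(rng.normal(mean_survival, 0.05), 0.0, 1.0)
                population *= survival * 1.2  # 0.2 recruits per survivor
            final_sizes[i] = population

        print("median final population:", round(np.median(final_sizes), 1))
        print("5th-95th percentile:", np.percentile(final_sizes, [5, 95]))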

  4. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  5. Model-based development of a course of action scheduling tool

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Mechlenborg, Peter; Zhang, Lin

    2008-01-01

    This paper shows how a formal method in the form of Coloured Petri Nets (CPNs) and the supporting CPN Tools have been used in the development of the Course of Action Scheduling Tool (COAST). The aim of COAST is to support human planners in the specification and scheduling of tasks in a Course of Action. CPNs have been used to develop a formal model of the task execution framework underlying COAST. The CPN model has been extracted in executable form from CPN Tools and embedded directly into COAST, thereby automatically bridging the gap between the formal specification and its implementation. The scheduling capabilities of COAST are based on state space exploration of the embedded CPN model. Planners interact with COAST using a domain-specific graphical user interface (GUI) that hides the embedded CPN model and analysis algorithms. This means that COAST is based on a rigorous semantical model...

  6. Aluminium in an ocean general circulation model compared with the West Atlantic Geotraces cruises

    NARCIS (Netherlands)

    van Hulten, M. M. P.; Sterl, A.; Tagliabue, A.; Dutay, J. -C.; Gehlen, M.; de Baar, H. J. W.; Middag, R.

    2013-01-01

    A model of aluminium has been developed and implemented in an Ocean General Circulation Model (NEMO-PISCES). In the model, aluminium enters the ocean by means of dust deposition. The internal oceanic processes are described by advection, mixing and reversible scavenging. The model has been evaluated

  7. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  8. Next-Generation Model-based Variability Management: Languages and Tools

    OpenAIRE

    Acher , Mathieu; Heymans , Patrick; Collet , Philippe; Lahire , Philippe

    2012-01-01

    Variability modelling and management is a key activity in a growing number of software engineering contexts, from software product lines to dynamic adaptive systems. Feature models are the de facto standard to formally represent and reason about commonality and variability of a software system. This tutorial aims at presenting the next generation of feature modelling languages and tools, directly applicable to a wide range of model-based variability problems and application...

  9. NREL Multiphysics Modeling Tools and ISC Device for Designing Safer Li-Ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad A.; Yang, Chuanbo

    2016-03-24

    The National Renewable Energy Laboratory has developed a portfolio of multiphysics modeling tools to help battery designers better understand the response of lithium-ion batteries to abusive conditions. We will discuss this portfolio, which includes coupled electrical, thermal, chemical, electrochemical, and mechanical modeling. These models can simulate the response of a cell to overheating, overcharge, mechanical deformation, nail penetration, and internal short circuit. Cell-to-cell thermal propagation modeling will be discussed.

  10. Rapid evaluation of machine tools with position-dependent milling stability based on response surface model

    Directory of Open Access Journals (Sweden)

    Li Zhang

    2016-03-01

    Full Text Available The milling stability is one of the important evaluation criteria for the dynamic characteristics of machine tools, and it is of great importance for machine tools' design and manufacturing. The milling stability of machine tools generally varies with the position combinations of the moving parts. The traditional milling stability analysis of machine tools is based on a few specific positions in the whole workspace, so the results are not comprehensive, and completing the analysis for multiple positions is very time-consuming. A new method to rapidly evaluate the stability of machine tools with position dependence is developed in this article. In this method, key position combinations of the moving parts are set as the calculation samples for which the dynamic characteristics of the machine tool are computed with the SAMCEF finite element simulation software. Then the minimum critical axial cutting depth for each sample is obtained. The relationship between position and minimum critical axial cutting depth at any position in the whole workspace can then be obtained through the established response surface model. The precision of the response surface model is evaluated, and the model can be used to rapidly evaluate the milling stability of machine tools with position dependence. Taking a precision horizontal machining center with a box-in-box structure as an example, the value of the minimum critical axial cutting depth at any position is shown. This method of rapidly evaluating the position-dependent stability of machine tools avoids complicated theoretical calculation, so it can be easily adopted by engineers and technicians in the design process of machine tools.
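
    The response surface step can be sketched as an ordinary second-order polynomial fit: compute the stability limit at a handful of sampled axis positions, fit a quadratic surface, then evaluate it anywhere in the workspace. The sketch below uses scikit-learn with fabricated sample values in place of the SAMCEF finite element results.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures

        # Hypothetical samples: (x, z) positions of the moving parts in mm
        # and the minimum critical axial cutting depth (mm) at each sample.
        positions = np.array([[0, 0], [0, 500], [500, 0], [500, 500], [250, 250],
                              [0, 250], [250, 0], [500, 250], [250, 500]])
        a_lim = np.array([4.1, 3.2, 3.8, 2.9, 3.5, 3.7, 4.0, 3.0, 3.1])

        # Second-order response surface model over the workspace.
        rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        rsm.fit(positions, a_lim)

        # Rapid evaluation at an arbitrary position combination.
        print(rsm.predict([[120.0, 400.0]]))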

  11. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian

    2014-01-01

    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.

  12. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that enables effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation that utilize virtual representations of granular microstructures. The latter have been intensively developed recently and potentially form a powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.

  13. California Geriatric Education Center Logic Model: An Evaluation and Communication Tool

    Science.gov (United States)

    Price, Rachel M.; Alkema, Gretchen E.; Frank, Janet C.

    2009-01-01

    A logic model is a communications tool that graphically represents a program's resources, activities, priority target audiences for change, and the anticipated outcomes. This article describes the logic model development process undertaken by the California Geriatric Education Center in spring 2008. The CGEC is one of 48 Geriatric Education…

  14. Circumplex Model of Family Systems: A Treatment Tool in Family Counseling.

    Science.gov (United States)

    Maynard, Peter E.; Olson, David H.

    1987-01-01

    Describes the Circumplex Model of Marital and Family Systems and its diagnostic inventory, the Family Adaptation and Coping Evaluation Scales, as important tools for the family counselor. A brief case example demonstrates how the model can be used in counseling a multiproblem family. (NB)

  15. Coloured Petri Nets and CPN Tools for Modelling and Validation of Concurrent Systems

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kristensen, Lars Michael; Wells, Lisa Marie

    2007-01-01

    Coloured Petri Nets (CPNs) is a language for the modeling and validation of systems in which concurrency, communication, and synchronisation play a major role. Coloured Petri Nets is a discrete-event modeling language combining Petri nets with the functional programming language Standard ML. Petri nets provide the foundation of the graphical notation and the basic primitives for modeling concurrency, communication, and synchronisation. Standard ML provides the primitives for the definition of data types, for describing data manipulation, and for creating compact and parameterisable models. A CPN... taken to execute events in the modelled system. CPN Tools is an industrial-strength computer tool for constructing and analysing CPN models. Using CPN Tools, it is possible to investigate the behaviour of the modelled system using simulation, to verify properties by means of state space methods...

  16. Force Sensor Based Tool Condition Monitoring Using a Heterogeneous Ensemble Learning Model

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2014-11-01

    Full Text Available Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and the tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, a homogeneous ensemble learning model and a majority voting strategy were also adopted for comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
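
    The general shape of such a heterogeneous stack can be sketched with scikit-learn. The sketch below substitutes two readily available heterogeneous base learners (an RBF-kernel SVM and k-nearest neighbours) with a logistic-regression meta-learner, and uses synthetic data in place of the harmonic force features; it is not the authors' SVM/HMM/RBF implementation.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import StackingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        # Synthetic stand-in for harmonic force features labelled with
        # three tool wear states.
        X, y = make_classification(n_samples=300, n_features=12, n_classes=3,
                                   n_informative=6, random_state=0)

        # Heterogeneous base classifiers; the meta-learner stacks their outputs.
        stack = StackingClassifier(
            estimators=[("svm", SVC(kernel="rbf", probability=True)),
                        ("knn", KNeighborsClassifier(n_neighbors=5))],
            final_estimator=LogisticRegression(max_iter=1000))

        print("cross-validated accuracy:", cross_val_score(stack, X, y, cv=5).mean())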

  17. Tool flank wear model and parametric optimization in end milling of metal matrix composite using carbide tool: Response surface methodology approach

    Directory of Open Access Journals (Sweden)

    R. Arokiadass

    2012-04-01

    Full Text Available Highly automated CNC end milling machines in the manufacturing industry require a reliable model for the prediction of tool flank wear. Such a model can be used to predict the tool flank wear (VBmax) from the process parameters. In this investigation an attempt was made to develop an empirical relationship to predict the tool flank wear (VBmax) of carbide tools while machining LM25 Al/SiCp, incorporating process parameters such as spindle speed (N), feed rate (f), depth of cut (d) and various % wt. of silicon carbide (S). Response surface methodology (RSM) was applied to optimize the end milling process parameters to attain the minimum tool flank wear. Predicted values obtained from the developed model and experimental results were compared, and an error of less than 5 percent was observed. In addition, it is concluded that the flank wear increases with the increase of the SiCp percentage weight in the MMC.

  18. Thermomechanical modelling of laser surface glazing for H13 tool steel

    Science.gov (United States)

    Kabir, I. R.; Yin, D.; Tamanna, N.; Naher, S.

    2018-03-01

    A two-dimensional thermomechanical finite element (FE) model of laser surface glazing (LSG) has been developed for H13 tool steel. The direct coupling technique of ANSYS 17.2 (APDL) has been utilised to solve the transient thermomechanical process. A H13 tool steel cylindrical cross-section has been modelled for laser powers of 200 W and 300 W at a constant 0.2 mm beam width and 0.15 ms residence time. The model can predict the temperature distribution and the stress-strain increments in the elastic and plastic regions with time and space. The tendency toward crack formation can also be assessed by analysing the von Mises stress in the heat-concentrated zone. Isotropic and kinematic hardening models have been applied separately to predict the after-yield phenomena. At 200 W laser power, the peak surface temperature achieved is 1520 K, which is below the melting point (1727 K) of H13 tool steel. For 300 W laser power, the peak surface temperature is 2523 K. Tensile residual stresses on the surface have been found after cooling, in agreement with the literature. The isotropic model shows a higher residual stress, which increases with laser power. Conversely, the kinematic model gives a lower residual stress, which decreases with laser power. Therefore, both plasticity models could work in LSG for H13 tool steel.

  19. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    Science.gov (United States)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482

  20. Modelling thermomechanical conditions at the tool/matrix interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper

    2004-01-01

    In friction stir welding the material flow is controlled by, among other factors, the contact condition at the tool interface, the thermomechanical state of the matrix and the welding parameters. The conditions under which the deposition process is successful are not fully understood, and in most models presented previously in the literature the modelling of the material flow at the tool interface has been prescribed as boundary conditions, i.e. the material is forced to keep contact with the tool. The objective of the present work is to analyse the thermomechanical conditions under which a consolidated weld... frictional and plastic dissipation. Of special interest is the contact condition along the shoulder/matrix and probe/matrix interfaces, as especially the latter affects the efficiency of the deposition process. The thermo-mechanical state in the workpiece is established by modelling both the dwell and weld...

  1. Snoopy's hybrid simulator: a tool to construct and simulate hybrid biological models.

    Science.gov (United States)

    Herajy, Mostafa; Liu, Fei; Rohr, Christian; Heiner, Monika

    2017-07-28

    Hybrid simulation of (computational) biochemical reaction networks, which combines stochastic and deterministic dynamics, is an important direction to tackle future challenges due to complex and multi-scale models. Inherently hybrid computational models of biochemical networks entail two time scales: fast and slow. Therefore, it is intricate to efficiently and accurately analyse them using only either deterministic or stochastic simulation. However, there are only a few software tools that support such an approach. These tools are often limited with respect to the number as well as the functionalities of the provided hybrid simulation algorithms. We present Snoopy's hybrid simulator, an efficient hybrid simulation software which builds on Snoopy, a tool to construct and simulate Petri nets. Snoopy's hybrid simulator provides a wide range of state-of-the-art hybrid simulation algorithms. Using this tool, a computational model of biochemical networks can be constructed using a (coloured) hybrid Petri net's graphical notations, or imported from other compatible formats (e.g. SBML), and afterwards executed via dynamic or static hybrid simulation. Snoopy's hybrid simulator is a platform-independent tool providing an accurate and efficient simulation of hybrid (biological) models. It can be downloaded free of charge as part of Snoopy from http://www-dssz.informatik.tu-cottbus.de/DSSZ/Software/Snoopy .

  2. A Temperature Sensor Clustering Method for Thermal Error Modeling of Heavy Milling Machine Tools

    Directory of Open Access Journals (Sweden)

    Fengchun Li

    2017-01-01

    Full Text Available A clustering method is an effective way to select the proper temperature sensor location for thermal error modeling of machine tools. In this paper, a new temperature sensor clustering method is proposed. By analyzing the characteristics of the temperature of the sensors in a heavy floor-type milling machine tool, an indicator involving both the Euclidean distance and the correlation coefficient was proposed to reflect the differences between temperature sensors, and the indicator was expressed by a distance matrix to be used for hierarchical clustering. Then, the weight coefficient in the distance matrix and the number of the clusters (groups were optimized by a genetic algorithm (GA, and the fitness function of the GA was also rebuilt by establishing the thermal error model at one rotation speed, then deriving its accuracy at two different rotation speeds with a temperature disturbance. Thus, the parameters for clustering, as well as the final selection of the temperature sensors, were derived. Finally, the method proposed in this paper was verified on a machine tool. According to the selected temperature sensors, a thermal error model of the machine tool was established and used to predict the thermal error. The results indicate that the selected temperature sensors can accurately predict thermal error at different rotation speeds, and the proposed temperature sensor clustering method for sensor selection is expected to be used for the thermal error modeling for other machine tools.
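
    The combined sensor-to-sensor distance described above can be sketched directly with SciPy's hierarchical clustering. In the sketch below the weight between the two terms and the number of groups are fixed, illustrative values (these are exactly the quantities the paper optimises with a genetic algorithm), and the temperature histories are simulated.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import squareform

        # Simulated temperature histories: rows = sensors, columns = samples.
        rng = np.random.default_rng(3)
        temps = 25 + rng.normal(0, 0.05, size=(16, 400)).cumsum(axis=1)

        w = 0.5  # weight between the two terms (GA-optimised in the paper)
        eucl = np.linalg.norm(temps[:, None, :] - temps[None, :, :], axis=2)
        corr = np.corrcoef(temps)

        # Indicator mixing normalised Euclidean distance and (1 - correlation).
        dist = w * eucl / eucl.max() + (1 - w) * (1 - corr)
        np.fill_diagonal(dist, 0.0)
        dist = (dist + dist.T) / 2  # enforce exact symmetry for squareform

        # Hierarchical clustering into 4 sensor groups (illustrative count).
        labels = fcluster(linkage(squareform(dist), method="average"),
                          t=4, criterion="maxclust")
        print(labels)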

  3. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    International Nuclear Information System (INIS)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC

  4. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib, E-mail: garib@mrc-lmb.cam.ac.uk [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom)

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC.

  5. Agent-based modeling as a tool for program design and evaluation.

    Science.gov (United States)

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field.
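
    For readers new to the technique, an agent-based model is just a population of simple rule-following entities whose interactions produce aggregate behaviour. The following deliberately tiny sketch, entirely hypothetical and not drawn from the paper, models word-of-mouth programme engagement spreading between participants.

        import random

        random.seed(1)

        # Each agent carries an engagement level in [0, 1]; contact with a
        # highly engaged peer nudges engagement upward.
        agents = [random.random() for _ in range(100)]

        for step in range(500):
            a, b = random.sample(range(len(agents)), 2)
            if agents[a] > 0.7:  # agent a is "engaged" and influences b
                agents[b] = min(1.0, agents[b] + 0.05)

        engaged = sum(level > 0.7 for level in agents)
        print(f"engaged agents after 500 interactions: {engaged}")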

  6. Model and numerical analysis of mechanical phenomena of tools steel hardening

    Directory of Open Access Journals (Sweden)

    A. Bokota

    2010-01-01

    Full Text Available In this paper, a model of tool steel hardening that takes mechanical phenomena into consideration is presented. Stress and strain fields are obtained by solving FEM equilibrium equations in rate form. The stresses generated during hardening were assumed to result from thermal load, structural deformation, plastic deformation and transformation plasticity. Thermophysical values in the constitutive relations depend upon both the temperature and the phase composition. The Huber-Mises condition with isotropic strengthening is used for the creation of plastic strains, while the Leblond model is applied to determine transformation plasticity. An analysis of the stresses associated with the hardening of elements made of tool steel was performed.

  7. An axisymmetrical non-linear finite element model for induction heating in injection molding tools

    DEFF Research Database (Denmark)

    Guerrier, Patrick; Nielsen, Kaspar Kirstein; Menotti, Stefano

    2016-01-01

    To analyze the heating and cooling phases of an induction heated injection molding tool accurately, the temperature dependent magnetic properties, namely the non-linear B-H curves, need to be accounted for in an induction heating simulation. Hence, a finite element model has been developed... into the injection molding tool. The model shows very good agreement with the experimental temperature measurements. It is also shown that the non-linearity can be used without the temperature dependency in some cases, and a method is proposed for estimating an effective linear permeability to use...

  8. NEMO-SN1 seafloor observatory at EMSO Western Ionian Sea site: a multidisciplinary approach for geophysical, oceanographic and environmental studies.

    Science.gov (United States)

    Embriaco, Davide; Marinaro, Giuditta; Monna, Stephen; Lo Bue, Nadia; Giovanetti, Gabriele; De Caro, Mariagrazia; De Santis, Angelo; Sgroi, Tiziana; Frugoni, Francesco; Montuori, Caterina; Riccobene, Giorgio; Viola, Salvo; Sciacca, Virginia; Pulvirenti, Sara; Caruso, Francesco; Simeone, Francesco; Chierici, Francesco; D'Amico, Antonio; Beranzoli, Laura; Favali, Paolo

    2017-04-01

    The Western Ionian Sea is one of the sites of the European Multidisciplinary Seafloor and water-column Observatory Research Infrastructure (EMSO). A prototype of a cabled deep-sea observatory (NEMO-SN1) was set up and has been operational in real-time since 2005 at 2100 m depth, 25 km off the harbour of Catania. In 2012 the observatory was upgraded to a fully integrated system for multidisciplinary deep-sea science, capable of transmitting and distributing data in real time to the scientific community and to the general public. NEMO-SN1 hosts a large number of sensors to monitor and study oceanographic and environmental parameters (CTD, ADCP, current meter) and geophysical phenomena (hydrophones, accelerometer, gravity meter, magnetometers, seismometer, pressure gauges). Ocean noise monitoring and identification of biological acoustic sources in the deep sea have also been possible with hydrophones working at low and high frequencies. The whole system was connected and powered from shore, by means of the electro-optical cable net installed at the East Sicily Site Infrastructure, and synchronised with GPS time. Sensor data sampling is performed underwater and the data are transmitted via an optical fibre link. A dedicated computing and networking infrastructure for data acquisition, storage and distribution through the internet has also been operational. Some examples of seafloor data analyses will be described to show the importance of such an integrated multidisciplinary infrastructure to geophysical, oceanographic and environmental studies.

  9. Results of the BiPo-1 prototype for radiopurity measurements for the SuperNEMO double beta decay source foils

    Energy Technology Data Exchange (ETDEWEB)

    Argyriades, J. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Arnold, R. [IPHC, Universite de Strasbourg, CNRS/IN2P3, F-67037 Strasbourg (France); Augier, C. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Baker, J. [INL, Idaho Falls, ID 83415 (United States); Barabash, A.S. [Institute of Theoretical and Experimental Physics, 117259 Moscow (Russian Federation); Basharina-Freshville, A. [University College London, WC1E 6BT London (United Kingdom); Bongrand, M.; Bourgeois, C.; Breton, D.; Briere, M.; Broudin-Bay, G. [LAL, Universite Paris-Sud, CNRS/IN2P3, F-91405 Orsay (France); Brudanin, V.B. [Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation); Caffrey, A.J. [INL, Idaho Falls, ID 83415 (United States); Carcel, S. [Instituto de Fisica Corpuscular, CSIC, Universidad de Valencia, Valencia (Spain); Cebrian, S. [Instituto de Fisica Nuclear y Altas Energias, Universidad de Zaragoza, Zaragoza (Spain); Chapon, A. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France); Chauveau, E. [CNRS/IN2P3, Centre d'Etudes Nucleaires de Bordeaux Gradignan, UMR 5797, F-33175 Gradignan (France); Universite de Bordeaux, Centre d'Etudes Nucleaires de Bordeaux Gradignan, UMR 5797, F-33175 Gradignan (France); Dafni, Th. [Instituto de Fisica Nuclear y Altas Energias, Universidad de Zaragoza, Zaragoza (Spain); Diaz, J. [Instituto de Fisica Corpuscular, CSIC, Universidad de Valencia, Valencia (Spain); Durand, D. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, F-14032 Caen (France)

    2010-10-01

    The development of BiPo detectors is dedicated to the measurement of extremely high radiopurity in ²⁰⁸Tl and ²¹⁴Bi for the SuperNEMO double beta decay source foils. A modular prototype, called BiPo-1, with 0.8 m² of sensitive surface area, has been running in the Modane Underground Laboratory since February 2008. The goal of BiPo-1 is to measure the different components of the background and in particular the surface radiopurity of the plastic scintillators that make up the detector. The first phase of data collection has been dedicated to the measurement of the radiopurity in ²⁰⁸Tl. After more than one year of background measurement, a surface activity of the scintillators of A(²⁰⁸Tl) = 1.5 μBq/m² is reported here. Given this level of background, a larger BiPo detector having 12 m² of active surface area is able to qualify the radiopurity of the SuperNEMO selenium double beta decay foils with the required sensitivity of A(²⁰⁸Tl) < 2 μBq/kg (90% C.L.) with a six-month measurement.

  10. Barcoding nemo: DNA-based identifications for the ornamental fish trade.

    Directory of Open Access Journals (Sweden)

    Dirk Steinke

    Full Text Available BACKGROUND: Trade in ornamental fishes represents, by far, the largest route for the importation of exotic vertebrates. There is growing pressure to regulate this trade with the goal of ensuring that species are sustainably harvested and that their point of origin is accurately reported. One important element of such regulation involves easy access to specimen identifications, a task that is currently difficult for all but specialists because of the large number of species involved. The present study represents an important first step in making identifications more accessible by assembling a DNA barcode reference sequence library for nearly half of the ornamental fish species imported into North America. METHODOLOGY/PRINCIPAL FINDINGS: Analysis of the cytochrome c oxidase subunit I (COI) gene from 391 species from 8 coral reef locations revealed that 98% of these species exhibit distinct barcode clusters, allowing their unambiguous identification. Most species showed little intra-specific variation (adjusted mean = 0.21%), but nine species included two or three lineages showing much more divergence (2.19-6.52%) and likely represent overlooked species complexes. By contrast, three genera contained a species pair or triad that lacked barcode divergence, cases that may reflect hybridization, young taxa or taxonomic over-splitting. CONCLUSIONS/SIGNIFICANCE: Although incomplete, this barcode library already provides a new species identification tool for the ornamental fish industry, opening a realm of applications linked to collection practices, regulatory control and conservation.
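
    The intra-specific variation figures quoted above are pairwise sequence divergences. A minimal sketch of the uncorrected p-distance between two aligned COI fragments (the sequences here are made up) is:

        def p_distance(seq1: str, seq2: str) -> float:
            """Uncorrected pairwise divergence: fraction of differing sites."""
            assert len(seq1) == len(seq2), "sequences must be aligned"
            diffs = sum(a != b for a, b in zip(seq1, seq2))
            return diffs / len(seq1)

        # Hypothetical aligned COI fragments from two specimens.
        s1 = "ATGGCATTCCTACGAATGCACTAGCC"
        s2 = "ATGGCATTCTTACGAATGCATTAGCC"
        print(f"divergence: {p_distance(s1, s2):.2%}")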

  11. Analysis, Design, Implementation and Evaluation of Graphical Design Tool to Develop Discrete Event Simulation Models Using Event Graphs and Simkit

    National Research Council Canada - National Science Library

    San

    2001-01-01

    ... (OR) modeling and analysis. However, designing and implementing DES can be a time-consuming and error-prone task. This thesis designed, implemented and evaluated a tool, the Event Graph Graphical Design Tool (EGGDT...

  12. Final Report: Simulation Tools for Parallel Microwave Particle in Cell Modeling

    International Nuclear Information System (INIS)

    Stoltz, Peter H.

    2008-01-01

    Transport of high-power rf fields and the subsequent deposition of rf power into plasma is an important component of developing tokamak fusion energy. Two limitations on rf heating are: (i) breakdown of the metallic structures used to deliver rf power to the plasma, and (ii) a detailed understanding of how rf power couples into a plasma. Computer simulation is a main tool for helping solve both of these problems, but one of the premier tools, VORPAL, is traditionally too difficult for non-experts to use. During this Phase II project, we developed the VorpalView user interface tool. This tool gives Department of Energy researchers a fully graphical interface for analyzing VORPAL output, to more easily model rf power delivery and deposition in plasmas.

  13. Clinical Prediction Model and Tool for Assessing Risk of Persistent Pain After Breast Cancer Surgery

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Andersen, Kenneth Geving; Bruce, Julie

    2017-01-01

    are missing. The aim was to develop a clinically applicable risk prediction tool. Methods The prediction models were developed and tested using three prospective data sets from Finland (n = 860), Denmark (n = 453), and Scotland (n = 231). Prediction models for persistent pain of moderate to severe intensity...), high body mass index (P = .039), axillary lymph node dissection (P = .008), and more severe acute postoperative pain intensity at the seventh postoperative day (P = .003) predicted persistent pain in the final prediction model, which performed well in the Danish (ROC-AUC, 0.739) and Scottish (ROC-AUC, 0.740) cohorts. At the 20% risk level, the model had 32.8% and 47.4% sensitivity and 94.4% and 82.4% specificity in the Danish and Scottish cohorts, respectively. Conclusion Our validated prediction models and an online risk calculator provide clinicians and researchers with a simple tool to screen
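
    A clinical prediction model of this type is commonly a logistic regression whose discrimination is summarised by the ROC-AUC and whose output can be thresholded for screening. The generic scikit-learn sketch below uses simulated predictors, not the study data, and a 20% risk threshold mirroring the one quoted above.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Simulated stand-ins for predictors such as preoperative pain,
        # body mass index, axillary dissection and acute postoperative pain.
        X, y = make_classification(n_samples=800, n_features=4, n_informative=4,
                                   n_redundant=0, random_state=7)
        X_dev, X_val, y_dev, y_val = train_test_split(X, y, random_state=7)

        model = LogisticRegression().fit(X_dev, y_dev)
        risk = model.predict_proba(X_val)[:, 1]
        print("validation ROC-AUC:", round(roc_auc_score(y_val, risk), 3))

        # Screening rule: flag patients whose predicted risk is >= 20%.
        flagged = risk >= 0.20
        print("flagged for follow-up:", int(flagged.sum()), "of", len(risk))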

  14. Modelling tools to evaluate China's future energy system - a review of the Chinese perspective

    DEFF Research Database (Denmark)

    Mischke, Peggy; Karlsson, Kenneth Bernard

    2014-01-01

    compares 18 energy modelling tools from ten Chinese institutions. These models have been described in English language publications between 2005 and 2013, although not all are published in peer-reviewed journals. When comparing the results for three main energy system indicators across models, this paper finds that there are considerable ranges in the reference scenarios: (i) GDP is projected to grow by 630–840% from 2010 to 2050, (ii) energy demand could increase by 200–300% from 2010 to 2050, and (iii) CO2 emissions could rise by 160–250% from 2010 to 2050. Although access to the modelling tools and the underlying data remains challenging, this study concludes that the Chinese perspective, independently of the modelling approach and institution, suggests a rather gradual and long-term transition towards a low carbon economy in China. Few reference scenarios include an emission peak or stabilisation period

  15. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  16. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  17. Thermal Error Modeling of a Machine Tool Using Data Mining Scheme

    Science.gov (United States)

    Wang, Kun-Chieh; Tseng, Pai-Chang

    In this paper the knowledge discovery technique is used to build an effective and transparent mathematic thermal error model for machine tools. Our proposed thermal error modeling methodology (called KRL) integrates the schemes of K-means theory (KM), rough-set theory (RS), and linear regression model (LR). First, to explore the machine tool's thermal behavior, an integrated system is designed to simultaneously measure the temperature ascents at selected characteristic points and the thermal deformations at spindle nose under suitable real machining conditions. Second, the obtained data are classified by the KM method, further reduced by the RS scheme, and a linear thermal error model is established by the LR technique. To evaluate the performance of our proposed model, an adaptive neural fuzzy inference system (ANFIS) thermal error model is introduced for comparison. Finally, a verification experiment is carried out and results reveal that the proposed KRL model is effective in predicting thermal behavior in machine tools. Our proposed KRL model is transparent, easily understood by users, and can be easily programmed or modified for different machining conditions.
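
    The KM + LR portion of the proposed KRL pipeline can be sketched in a few lines with scikit-learn; the rough-set reduction step is omitted here, and the temperature and deformation data are simulated rather than measured.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        temps = rng.normal(30, 5, size=(200, 8))  # 8 candidate temperature points
        # Simulated spindle-nose thermal deformation driven by the temperatures.
        drift = temps @ rng.normal(0.5, 0.1, 8) + rng.normal(0, 1, 200)

        # K-means groups correlated temperature points; keep, per group, the
        # point closest to its cluster centre as the representative sensor.
        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(temps.T)
        reps = [int(np.argmin(np.linalg.norm(temps.T - c, axis=1)
                              + (km.labels_ != k) * 1e9))
                for k, c in enumerate(km.cluster_centers_)]

        # Linear thermal error model on the selected representatives.
        model = LinearRegression().fit(temps[:, reps], drift)
        print("selected points:", reps, "R^2 =", model.score(temps[:, reps], drift))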

  18. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more wide-spread use of GIS analysis of service access and allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modeled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
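
    A hedged sketch of the core travel-time computation such tools perform (not the paper's actual tool-kit): Dijkstra's algorithm over a cost raster, with an invented friction surface and facility location:

    ```python
    # Minimal raster travel-time sketch: Dijkstra over a 4-connected grid
    # whose cell costs are minutes-per-cell; all values are invented.
    import heapq
    import numpy as np

    def travel_time(cost, sources):
        """cost: 2D array of per-cell traversal times; sources: [(r, c), ...]."""
        t = np.full(cost.shape, np.inf)
        pq = [(0.0, rc) for rc in sources]
        for _, (r, c) in pq:
            t[r, c] = 0.0
        while pq:
            d, (r, c) = heapq.heappop(pq)
            if d > t[r, c]:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < cost.shape[0] and 0 <= nc < cost.shape[1]:
                    nd = d + 0.5 * (cost[r, c] + cost[nr, nc])
                    if nd < t[nr, nc]:
                        t[nr, nc] = nd
                        heapq.heappush(pq, (nd, (nr, nc)))
        return t

    cost = np.ones((100, 100))        # base walking time per cell
    cost[40:60, 0:50] = 5.0           # e.g. swamp: five times slower
    minutes = travel_time(cost, sources=[(0, 0)])   # clinic at one corner
    print(minutes[99, 99])
    ```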

  19. Bio-logic builder: a non-technical tool for building dynamical, qualitative models.

    Science.gov (United States)

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized "bio-logic" modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanism as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, an influenza A replication cycle model with 127 species and 200+ interactions, and mammalian and budding yeast cell cycle models. We also show that any and all qualitative regulatory mechanisms can be built using this tool.
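
    A toy example of the discrete (Boolean) formalism such modules compile into, with invented rules and a synchronous update scheme:

    ```python
    # Toy synchronous Boolean network; the rules are invented and stand in
    # for the Boolean expressions Bio-Logic Builder generates.
    rules = {
        "ligand":      lambda s: s["ligand"],                  # external input
        "receptor":    lambda s: s["ligand"],
        "kinase":      lambda s: s["receptor"] and not s["phosphatase"],
        "phosphatase": lambda s: s["kinase"],                  # negative feedback
        "tf":          lambda s: s["kinase"],
    }

    state = {n: False for n in rules}
    state["ligand"] = True
    for step in range(8):
        state = {n: f(state) for n, f in rules.items()}        # synchronous update
        print(step, {k: int(v) for k, v in state.items()})
    ```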

  20. Design process and tools for dynamic neuromechanical models and robot controllers.

    Science.gov (United States)

    Szczecinski, Nicholas S; Hunt, Alexander J; Quinn, Roger D

    2017-02-01

    We present a serial design process with associated tools to select parameter values for a posture and locomotion controller for simulation of a robot. The controller is constructed from dynamic neuron and synapse models and simulated with the open-source neuromechanical simulator AnimatLab 2. Each joint has a central pattern generator (CPG), whose neurons possess persistent sodium channels. The CPG rhythmically inhibits motor neurons that control the servomotor's velocity. Sensory information coordinates the joints in the leg into a cohesive stepping motion. The parameter value design process is intended to run on a desktop computer, and has three steps. First, our tool FEEDBACKDESIGN uses classical control methods to find neural and synaptic parameter values that stably and robustly control servomotor output. This method is fast, testing over 100 parameter value variations per minute. Next, our tool CPGDESIGN generates bifurcation diagrams and phase response curves for the CPG model. This reveals neural and synaptic parameter values that produce robust oscillation cycles, whose phase can be rapidly entrained to sensory feedback. It also designs the synaptic conductance of inter-joint pathways. Finally, to understand sensitivity to parameters and how descending commands affect a leg's stepping motion, our tool SIMSCAN runs batches of neuromechanical simulations with specified parameter values, which is useful for searching the parameter space of a complicated simulation. These design tools are demonstrated on a simulation of a robot, but may be applied to neuromechanical animal models or physical robots as well.
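
    As a hedged illustration of the CPG concept, here is a Matsuoka-style mutual-inhibition oscillator (a stand-in for AnimatLab's persistent-sodium neuron model; all parameters are illustrative):

    ```python
    # Two-neuron CPG sketch: mutual inhibition plus slow adaptation
    # produces alternating bursts; parameters are illustrative only.
    import numpy as np

    def cpg(T=10.0, dt=1e-3, tau=0.1, tau_a=0.5, b=2.5, w=2.0, drive=1.0):
        u = np.array([0.1, 0.0])   # membrane states (asymmetric start)
        a = np.zeros(2)            # adaptation states
        ys = []
        for _ in range(int(T / dt)):
            y = np.maximum(u, 0.0)                          # firing rates
            du = (-u - w * y[::-1] - b * a + drive) / tau   # mutual inhibition
            da = (y - a) / tau_a                            # slow adaptation
            u += dt * du
            a += dt * da
            ys.append(y.copy())
        return np.array(ys)

    y = cpg()
    print("final firing rates of the two half-centers:", y[-3:])
    ```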

  1. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    Science.gov (United States)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (JAVA, C++, Python etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps which are repetitively carried out for the same study region. As such, the developed tool is user friendly and can be used efficiently for these repetitive processes by reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
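
    A minimal skeleton of a QGIS Python plugin of the kind described, showing only the standard entry points (classFactory, initGui, unload); the assimilation logic itself is omitted and the class name is invented:

    ```python
    # Minimal QGIS Python plugin skeleton; only the standard QGIS plugin
    # entry points are shown, the actual assimilation logic is omitted.
    from qgis.PyQt.QtWidgets import QAction, QMessageBox

    class AssimilationPlugin:
        def __init__(self, iface):
            self.iface = iface          # handle to the QGIS interface
            self.action = None

        def initGui(self):
            self.action = QAction("Assimilate gridded data",
                                  self.iface.mainWindow())
            self.action.triggered.connect(self.run)
            self.iface.addToolBarIcon(self.action)

        def unload(self):
            self.iface.removeToolBarIcon(self.action)

        def run(self):
            # here: read gridded rainfall/temperature/DEM layers, resample
            # them onto the model grid and write hydrological model inputs
            QMessageBox.information(self.iface.mainWindow(),
                                    "Demo", "Run assimilation")

    def classFactory(iface):   # QGIS calls this on plugin load
        return AssimilationPlugin(iface)
    ```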

  2. Development of a surrogate model for elemental analysis using a natural gamma ray spectroscopy tool

    International Nuclear Information System (INIS)

    Zhang, Qiong

    2015-01-01

    A systematic computational method for obtaining accurate elemental standards efficiently for varying borehole conditions was developed based on Monte Carlo simulations, surrogate modeling, and data assimilation. Elemental standards are essential for spectral unfolding in formation evaluation applications commonly used for nuclear well logging tools. Typically, elemental standards are obtained by standardized measurements, but these experiments are expensive and lack the flexibility to address different logging conditions. In contrast, computer-based Monte Carlo simulations provide an accurate and more flexible approach to obtaining elemental standards for formation evaluation. The presented computational method recognizes that, in contrast to typical neutron–photon simulations, where the source is typically artificial and well characterized (Galford, 2009), an accurate knowledge of the source is essential for matching the obtained Monte Carlo elemental standards with their experimental counterparts. Therefore, source distributions are adjusted to minimize the L2 difference between the Monte Carlo computed and experimental standards. Subsequently, an accurate surrogate model is developed accounting for different casing and cement thicknesses, and tool positions within the borehole. The adjusted source distributions are then utilized to generate and validate spectra for varying borehole conditions: tool position, casing and cement thickness. The effect of these conditions on the spectra is investigated and discussed in this work. Given that Monte Carlo modeling provides much lower cost and more flexibility, employing Monte Carlo computed standards could enhance the processing of nuclear well logging data. - Highlights: • A novel computational model for efficiently computing elemental standards for varying borehole conditions has been developed. • A model of an experimental test pit was implemented in the Monte Carlo code GEANT4 for computing elemental standards.
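
    The source-adjustment step can be illustrated as a small non-negative least-squares problem, assuming SciPy and synthetic spectra (bin counts and weights below are invented):

    ```python
    # Sketch of the source-adjustment idea: choose source-bin weights w >= 0
    # that minimize the L2 difference between Monte Carlo spectra S (one row
    # per source bin) and an experimental standard; all arrays are synthetic.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)
    S = rng.random((5, 64))                 # MC spectra for 5 source bins
    w_true = np.array([0.1, 0.4, 0.2, 0.2, 0.1])
    measured = w_true @ S + rng.normal(0, 0.005, 64)

    w, resid = nnls(S.T, measured)          # min ||S.T w - measured||_2, w >= 0
    print("recovered weights:", np.round(w / w.sum(), 3), "residual:", resid)
    ```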

  3. Modelling of the Contact Condition at the Tool/Matrix Interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2003-01-01

    The objective of the present paper is to investigate the heat generation and contact condition during Friction Stir Welding (FSW). For this purpose, an analytical model is developed for the heat generation and this is combined with a Eulerian FE-analysis of the temperature field. The heat generation is closely related to the friction condition at the contact interface between the FSW tool and the weld piece material as well as the material flow in the weld matrix, since the mechanisms for heat generation by frictional and plastic dissipation are different. The heat generation from the tool is governed by the contact condition, i.e. whether there is sliding, sticking or partial sliding/sticking. The contact condition in FSW is complex (dependent on alloy, welding parameters, tool design etc.), and previous models (both analytical and numerical) for simulation of the heat generation assume …
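
    For a flat-shoulder, cylindrical-probe tool, the analytical heat generation commonly quoted in this line of work reduces to Q = (2/3)πτω(Rs³ + 3Rp²Hp), with the contact shear stress τ set by the sliding or sticking condition. A sketch with illustrative numbers (not the paper's case study):

    ```python
    # FSW heat generation for a flat shoulder and cylindrical probe in the
    # sliding and sticking limits; all numerical values are illustrative.
    import math

    def fsw_heat(omega, tau, r_shoulder, r_probe, h_probe):
        """Total heat rate Q [W]: shoulder + probe tip + probe side,
        Q = (2/3)*pi*tau*omega*(Rs^3 + 3*Rp^2*Hp)."""
        return (2.0 / 3.0) * math.pi * tau * omega * (
            r_shoulder**3 + 3.0 * r_probe**2 * h_probe)

    omega = 2 * math.pi * 400 / 60          # 400 rpm -> rad/s
    tau_stick = 100e6 / math.sqrt(3)        # sticking: shear yield stress [Pa]
    tau_slide = 0.4 * 50e6                  # sliding: mu * contact pressure
    for name, tau in [("sticking", tau_stick), ("sliding", tau_slide)]:
        print(name, f"{fsw_heat(omega, tau, 0.009, 0.003, 0.004):.0f} W")
    ```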

  4. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2009-01-01

    Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical phenomena which occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: 1. Air flow in a ventilated cavity such as in the exterior cladding of building envelopes, i.e. a flow which is parallel to the construction plane. 2. Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The new models make it possible to predict the thermal …

  5. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Sørensen, Karl Grau; Rode, Carsten

    2009-01-01

    Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical phenomena that occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: (1) Air flow in a ventilated cavity such as behind the exterior cladding of a building envelope, i.e. a flow which is parallel to the construction plane. (2) Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The paper presents the models and how they have …

  6. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. In view of the current research focus on machine tool thermal error, this paper experimentally studies thermal error testing and intelligent modeling for the spindle of vertical high-speed CNC machine tools. Several testing devices for thermal error are designed, of which 7 temperature sensors are used to measure the temperature of the machine tool spindle system and 2 displacement sensors are used to detect the thermal error displacement. A thermal error compensation model, which has a good ability in inversion prediction, is established by applying principal component analysis technology, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.
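
    A compact sketch of the described pipeline, assuming scikit-learn and synthetic data: PCA condenses the 7 temperature channels into a few features, and a small neural network maps them to the 2 displacement channels:

    ```python
    # PCA + small neural network for spindle thermal error; all data and
    # hyperparameters here are synthetic/illustrative.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    T = rng.normal(30, 5, size=(500, 7))                 # 7 temperature sensors
    defl = np.c_[0.8 * T[:, 0] - 0.2 * T[:, 3],          # 2 displacement sensors
                 0.5 * T[:, 2] + 0.1 * T[:, 5]] + rng.normal(0, 0.2, (500, 2))

    model = make_pipeline(StandardScaler(),
                          PCA(n_components=3),           # reduced temp. features
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                       random_state=0))
    model.fit(T[:400], defl[:400])
    print("held-out R^2:", model.score(T[400:], defl[400:]))
    ```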

  7. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Science.gov (United States)

    Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo

    2017-05-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two-turntable five-axis machine tool are investigated. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experiment system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of spindle thermal errors. Experiment results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, and the residual error is smaller than 3 μm; the new modeling method is feasible. The proposed research provides instruction to compensate thermal errors and improve the machining accuracy of NC machine tools.
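
    A heavily simplified illustration of the ABC-NN idea, training a one-hidden-layer network's weights with the employed-bee move and greedy selection only (scout and onlooker phases omitted; data and dimensions are synthetic, not the paper's):

    ```python
    # Simplified artificial-bee-colony loop training a tiny 4->8->1 network.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))                    # temperature variables
    y = X @ np.array([0.5, -0.3, 0.2, 0.1]) + rng.normal(0, 0.05, 200)

    def unpack(v):      # flat parameter vector -> network weights
        W1, b1 = v[:32].reshape(4, 8), v[32:40]
        W2, b2 = v[40:48], v[48]
        return W1, b1, W2, b2

    def mse(v):
        W1, b1, W2, b2 = unpack(v)
        h = np.tanh(X @ W1 + b1)
        return float(np.mean((h @ W2 + b2 - y) ** 2))

    n_bees, dim = 20, 49
    food = rng.normal(0, 0.5, (n_bees, dim))         # candidate weight vectors
    for it in range(300):
        for i in range(n_bees):                      # employed-bee phase
            k = rng.integers(n_bees)
            j = rng.integers(dim)
            trial = food[i].copy()
            trial[j] += rng.uniform(-1, 1) * (food[i, j] - food[k, j])
            if mse(trial) < mse(food[i]):            # greedy selection
                food[i] = trial
    best = min(food, key=mse)
    print("best MSE:", round(mse(best), 5))
    ```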

  8. Reach adaptation: what determines whether we learn an internal model of the tool or adapt the model of our arm?

    Science.gov (United States)

    Kluzik, JoAnn; Diedrichsen, Jörn; Shadmehr, Reza; Bastian, Amy J

    2008-09-01

    We make errors when learning to use a new tool. However, the cause of error may be ambiguous: is it because we misestimated properties of the tool or of our own arm? We considered a well-studied adaptation task in which people made goal-directed reaching movements while holding the handle of a robotic arm. The robot produced viscous forces that perturbed reach trajectories. As reaching improved with practice, did people recalibrate an internal model of their arm, or did they build an internal model of the novel tool (robot), or both? What factors influenced how the brain solved this credit assignment problem? To investigate these questions, we compared transfer of adaptation between three conditions: catch trials in which robot forces were turned off unannounced, robot-null trials in which subjects were told that forces were turned off, and free-space trials in which subjects still held the handle but watched as it was detached from the robot. Transfer to free space was 40% of that observed in unannounced catch trials. We next hypothesized that transfer to free space might increase if the training field changed gradually, rather than abruptly. Indeed, this method increased transfer to free space from 40 to 60%. Therefore although practice with a novel tool resulted in formation of an internal model of the tool, it also appeared to produce a transient change in the internal model of the subject's arm. Gradual changes in the tool's dynamics increased the extent to which the nervous system recalibrated the model of the subject's own arm.

  9. Procedures and results of the measurements on large area photomultipliers for the NEMO project

    Science.gov (United States)

    Aiello, S.; Leonora, E.; Aloisio, A.; Ameli, F.; Amore, I.; Anghinolfi, M.; Anzalone, A.; Barbarino, G.; Barbarito, E.; Battaglieri, M.; Bazzotti, M.; Bellotti, R.; Bersani, A.; Beverini, N.; Biagi, S.; Bonori, M.; Bouhdaef, B.; Cacopardo, G.; Calı, C.; Capone, A.; Caponetto, L.; Carminati, G.; Cassano, B.; Ceres, A.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Cordelli, M.; Costa, M.; D'Amico, A.; DeBonis, G.; DeRosa, G.; DeRuvo, G.; DeVita, R.; Distefano, C.; Flaminio, V.; Fratini, K.; Gabrielli, A.; Galeotti, S.; Gandolfi, E.; Giacomelli, G.; Giorgi, F.; Giovanetti, G.; Grimaldi, A.; Grmek, A.; Habel, R.; Imbesi, M.; Lonardo, A.; LoPresti, D.; Lucarelli, F.; Margiotta, A.; Marinelli, A.; Martini, A.; Masullo, R.; Maugeri, F.; Migneco, E.; Minutoli, S.; Mongelli, M.; Morganti, M.; Musico, P.; Musumeci, M.; Orlando, A.; Osipenko, M.; Papaleo, R.; Pappalardo, V.; Piattelli, P.; Piombo, D.; Raffaelli, F.; Raia, G.; Randazzo, N.; Reito, S.; Ricco, G.; Riccobene, G.; Ripani, M.; Rovelli, A.; Ruppi, M.; Russo, G. V.; Russo, S.; Sapienza, P.; Sedita, M.; Shirokov, E.; Simeone, F.; Sciliberto, D.; Sipala, V.; Sollima, C.; Spurio, M.; Stefani, F.; Taiuti, M.; Terreni, G.; Trasatti, L.; Urso, S.; Vecchi, M.; Vicini, P.; Wischnewski, R.

    2010-03-01

    The selection of the photomultiplier plays a crucial role in the R&D activity related to a large-scale underwater neutrino telescope. This paper illustrates the main procedures and facilities used to characterize the performances of 72 large area photomultipliers, Hamamatsu model R7081 sel. The voltage to achieve a gain of 5×10⁷, the dark count rate, and the single photoelectron time and charge properties of the overall response were measured with a properly attenuated 410 nm pulsed laser. A dedicated study of the spurious pulses was also performed. The results prove that the photomultipliers comply with the general requirements imposed by the project.

  10. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    International Nuclear Information System (INIS)

    Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin

    2014-01-01

    The first book available on systematic evaluations of the performance of the global climate model FGOALS. Covers the whole field, ranging from the development to the applications of this climate system model. Provides an outlook for the future development of the FGOALS model system. Offers a brief introduction on how to run FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal predictions and scenario projections of future climate change. "Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community" is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook on the future development of FGOALS and offers an overview of how to employ the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  11. GAMBIT. The global and modular beyond-the-standard-model inference tool

    International Nuclear Information System (INIS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are; Buckley, Andy; Chrzaszcz, Marcin; Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan; Cornell, Jonathan M.; Dickinson, Hugh; Jackson, Paul; White, Martin; Kvellestad, Anders; Savage, Christopher; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; Wild, Sebastian

    2017-01-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  12. GAMBIT. The global and modular beyond-the-standard-model inference tool

    Energy Technology Data Exchange (ETDEWEB)

    Athron, Peter; Balazs, Csaba [Monash University, School of Physics and Astronomy, Melbourne, VIC (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are [University of Oslo, Department of Physics, Oslo (Norway); Buckley, Andy [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Chrzaszcz, Marcin [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Polish Academy of Sciences, H. Niewodniczanski Institute of Nuclear Physics, Krakow (Poland); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Dickinson, Hugh [University of Minnesota, Minnesota Institute for Astrophysics, Minneapolis, MN (United States); Jackson, Paul; White, Martin [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); University of Adelaide, Department of Physics, Adelaide, SA (Australia); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); McKay, James [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Mahmoudi, Farvah [Univ Lyon, Univ Lyon 1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon UMR5574, Saint-Genis-Laval (France); CERN, Theoretical Physics Department, Geneva (Switzerland); Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Ripken, Joachim [Max Planck Institute for Solar System Research, Goettingen (Germany); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Saavedra, Aldo [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); The University of Sydney, Faculty of Engineering and Information Technologies, Centre for Translational Data Science, School of Physics, Sydney, NSW (Australia); Scott, Pat [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Seo, Seon-Hee [Seoul National University, Department of Physics and Astronomy, Seoul (Korea, Republic of); Serra, Nicola [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Wild, Sebastian [DESY, Hamburg (Germany); Collaboration: The GAMBIT Collaboration

    2017-11-15

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  13. New tools in modulating Maillard reaction from model systems to food

    NARCIS (Netherlands)

    Troise, A.D.

    2015-01-01

    New tools in modulating Maillard reaction from model systems to food
    The Maillard reaction (MR) supervises the final quality of foods and occupies a prominent place in food science. The first stable compounds, the Amadori rearrangement products

  14. Using Model-Eliciting Activities as a Tool to Identify and Develop Mathematically Creative Students

    Science.gov (United States)

    Coxbill, Emmy; Chamberlin, Scott A.; Weatherford, Jennifer

    2013-01-01

    Traditional classroom methods for identifying mathematically creative students have been inadequate. Identifying students who could potentially be mathematically creative is instrumental in the development of students and in meeting their affective and educational needs. One prospective identification tool is the use of model-eliciting activities…

  15. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    Science.gov (United States)

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  16. OMNIITOX - operational life-cycle impact assessment models and information tools for practitioners

    DEFF Research Database (Denmark)

    Molander, S; Lidholm, Peter; Schowanek, Diederik

    2004-01-01

    This article is the preamble to a set of articles describing initial results from an on-going European Commission funded, 5th Framework project called OMNIITOX, Operational Models aNd Information tools for Industrial applications of eco/TOXicological impact assessments. The different parts of thi...

  17. Recommender System and Web 2.0 Tools to Enhance a Blended Learning Model

    Science.gov (United States)

    Hoic-Bozic, Natasa; Dlab, Martina Holenko; Mornar, Vedran

    2016-01-01

    Blended learning models that combine face-to-face and online learning are of great importance in modern higher education. However, their development should be in line with the recent changes in e-learning that emphasize a student-centered approach and use tools available on the Web to support the learning process. This paper presents research on…

  18. Towards Semantically Integrated Models and Tools for Cyber-Physical Systems Design

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Fitzgerald, John; Woodcock, Jim

    2016-01-01

    We describe an approach to the model-based engineering of embedded and cyber-physical systems, based on the semantic integration of diverse discipline-specific notations and tools. Using the example of a small unmanned aerial vehicle, we explain the need for multiple notations and collaborative...

  19. An online tool for business modelling and a refinement of the Business Canvas

    NARCIS (Netherlands)

    Rogier Brussee; Peter de Groot

    2016-01-01

    We give a refinement of the well known business model canvas by Osterwalder and Pigneur by splitting the basic blocks into further subblocks to reduce confusion and increase its expressive power. The splitting is used in an online tool which in addition comes with a set of questions to further

  20. Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis

    Science.gov (United States)

    Moffitt, Kevin Christopher

    2011-01-01

    The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

  1. Simulation of Forming Process as an Educational Tool Using Physical Modeling

    Science.gov (United States)

    Abdullah, A. B.; Muda, M. R.; Samad, Z.

    2008-01-01

    Metal forming process simulation requires a very high cost, including the cost of dies, machines and material, and tight process control, since the process involves very high pressures. A physical modeling technique is developed and initiates a new era of educational tools for simulating the process effectively. Several publications and findings have…

  2. Using the Cognitive Apprenticeship Model with a Chat Tool to Enhance Online Collaborative Learning

    Science.gov (United States)

    Rodríguez-Bonces, Mónica; Ortiz, Kris

    2016-01-01

    In Colombia, many institutions are in the firm quest of virtual learning environments to improve instruction, and making the most of online tools is clearly linked to offering quality learning. Thus, the purpose of this action research was to identify how the Cognitive Apprenticeship Model enhances online collaborative learning by using a chat…

  3. What's new in the Atmospheric Model Evaluation Tool (AMET) version 1.3

    Science.gov (United States)

    A new version of the Atmospheric Model Evaluation Tool (AMET) has been released. The new version of AMET, version 1.3 (AMETv1.3), contains a number of updates and changes from the previous version of AMET (v1.2) released in 2012. First, the Perl scripts used in the previous ve...

  4. Exposure Modeling Tools and Databases for Consideration for Relevance to the Amended TSCA (ISES)

    Science.gov (United States)

    The Agency’s Office of Research and Development (ORD) has a number of ongoing exposure modeling tools and databases. These efforts are anticipated to be useful in supporting ongoing implementation of the amended Toxic Substances Control Act (TSCA). Under ORD’s Chemic...

  5. Validation of Multiple Tools for Flat Plate Photovoltaic Modeling Against Measured Data

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Blair, N.; Dobos, A. P.

    2014-08-01

    This report expands upon a previous work by the same authors, published in the 40th IEEE Photovoltaic Specialists conference. In this validation study, comprehensive analysis is performed on nine photovoltaic systems for which NREL could obtain detailed performance data and specifications, including three utility-scale systems and six commercial scale systems. Multiple photovoltaic performance modeling tools were used to model these nine systems, and the error of each tool was analyzed compared to quality-controlled measured performance data. This study shows that, excluding identified outliers, all tools achieve annual errors within +/-8% and hourly root mean squared errors less than 7% for all systems. It is further shown using SAM that module model and irradiance input choices can change the annual error with respect to measured data by as much as 6.6% for these nine systems, although all combinations examined still fall within an annual error range of +/-8.5%. Additionally, a seasonal variation in monthly error is shown for all tools. Finally, the effects of irradiance data uncertainty and the use of default loss assumptions on annual error are explored, and two approaches to reduce the error inherent in photovoltaic modeling are proposed.
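
    The two headline metrics can be computed as follows; the formulas are standard, but the normalization of the hourly RMSE by the mean measured value is an assumption, and the arrays are placeholders:

    ```python
    # Annual error and normalized hourly RMSE for a PV model validation;
    # the arrays below are placeholders, not the report's data.
    import numpy as np

    measured = np.random.default_rng(0).uniform(0, 500, 8760)   # hourly kWh
    modeled = measured * 1.03                                   # a 3% biased model

    annual_error = (modeled.sum() - measured.sum()) / measured.sum() * 100
    hourly_rmse = np.sqrt(np.mean((modeled - measured) ** 2))
    norm_rmse = hourly_rmse / measured.mean() * 100             # % of mean

    print(f"annual error {annual_error:+.1f}%  hourly nRMSE {norm_rmse:.1f}%")
    ```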

  6. Characterization and Modeling of Insect Swarms Using tools from Fluid Dynamics

    Science.gov (United States)

    2016-09-01

    The goals of this project were to develop a laboratory system for quantitatively measuring the flight trajectories of swarming insects and to use the resulting data to evaluate currently used models of collective behavior. We were successful in completing both goals, leading to the first highly resolved, statistically robust data sets for insect swarms, which we …

  7. Visual Representation in GENESIS as a tool for Physical Modeling, Sound Synthesis and Musical Composition

    OpenAIRE

    Villeneuve, Jérôme; Cadoz, Claude; Castagné, Nicolas

    2015-01-01

    The motivation of this paper is to highlight the importance of visual representations for artists when modeling and simulating mass-interaction physical networks in the context of sound synthesis and musical composition. GENESIS is a musician-oriented software environment for sound synthesis and musical composition. However, despite this orientation, a substantial amount of effort has been put into building a rich variety of tools based on static or dynamic visual representations of models an...

  8. Rogeaulito: A World Energy Scenario Modeling Tool for Transparent Energy System Thinking

    International Nuclear Information System (INIS)

    Benichou, Léo; Mayr, Sebastian

    2014-01-01

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It’s a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty it computes energy supply and demand independently from each other revealing potentially missing energy supply by 2100. It is further simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.

  9. Hot metal temperature prediction in blast furnace using advanced model based on fuzzy logic tools

    Energy Technology Data Exchange (ETDEWEB)

    Martin, R.D.; Obeso, F.; Mochon, J.; Barea, R.; Jimenez, J.

    2007-05-15

    The present work presents a model based on fuzzy logic tools to predict and simulate the hot metal temperature in a blast furnace (BF). As input variables this model uses the control variables of a current BF, such as moisture, pulverised coal injection, oxygen addition, mineral/coke ratio and blast volume, and it yields the hot metal temperature as a result. The variables employed to develop the model have been obtained from data supplied by current sensors of a Spanish BF. In the model training stage, the adaptive neuro-fuzzy inference system and subtractive clustering algorithms have been used.
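
    A tiny zero-order Takagi-Sugeno sketch in the spirit of the described predictor, with two invented rules over two of the listed inputs; membership functions and rule consequents are made up:

    ```python
    # Zero-order Takagi-Sugeno sketch of a fuzzy hot-metal-temperature
    # predictor; memberships and consequents are invented for illustration.
    import numpy as np

    def gauss(x, c, s):
        return np.exp(-0.5 * ((x - c) / s) ** 2)

    def predict_temp(moisture, pci):
        # rule firing strengths (product t-norm)
        w_hot = gauss(moisture, 10.0, 5.0) * gauss(pci, 180.0, 40.0)
        w_cold = gauss(moisture, 30.0, 5.0) * gauss(pci, 100.0, 40.0)
        temps = np.array([1510.0, 1470.0])           # rule consequents [degC]
        w = np.array([w_hot, w_cold])
        return float(w @ temps / w.sum())            # weighted average

    print(predict_temp(moisture=15.0, pci=160.0))
    ```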

  10. FORMAL MODELLING OF BUSINESS RULES: WHAT KIND OF TOOL TO USE?

    Directory of Open Access Journals (Sweden)

    Sandra Lovrenčić

    2006-12-01

    Business rules are today essential parts of a business system model. But presently, there are still various approaches to, definitions of, and classifications of this concept. Similarly, there are also different approaches to business rules formalization and implementation. This paper investigates formalization using a formal language in association with easy domain modelling. Two of the tools that enable such an approach are described and compared according to several factors. They represent ontology modelling and UML, nowadays a widely used standard for object-oriented modelling. A simple example is also presented.

  11. Not finding Nemo: limited reef-scale retention in a coral reef fish

    KAUST Repository

    Nanninga, Gerrit B.

    2015-02-03

    The spatial scale of larval dispersal is a key predictor of marine metapopulation dynamics and an important factor in the design of reserve networks. Over the past 15 yr, studies of larval dispersal in coral reef fishes have generated accumulating evidence of consistently high levels of self-recruitment and local retention at various spatial scales. These findings have, to a certain degree, created a paradigm shift toward the perception that large fractions of locally produced recruitment may be the rule rather than the exception. Here we examined the degree of localized settlement in an anemonefish, Amphiprion bicinctus, at a solitary coral reef in the central Red Sea by integrating estimates of self-recruitment obtained from genetic parentage analysis with predictions of local retention derived from a biophysical dispersal model parameterized with real-time physical forcing. Self-recruitment at the reef scale (c. 0.7 km2) was virtually absent during two consecutive January spawning events (1.4 % in 2012 and 0 % in 2013). Predicted levels of local retention at the reef scale varied temporally, but were comparatively low for both simulations (7 % in 2012 and 0 % in 2013). At the same time, the spatial scale of simulated dispersal was restricted to approximately 20 km from the source. Model predictions of reef-scale larval retention were highly dependent on biological parameters, underlining the need for further empirical validations of larval traits over a range of species. Overall, our findings present an urgent caution when assuming the potential for self-replenishment in small marine reserves.

  12. Graphical and numerical diagnostic tools to assess suitability of multiple imputations and imputation models.

    Science.gov (United States)

    Bondarenko, Irina; Raghunathan, Trivellore

    2016-07-30

    Multiple imputation has become a popular approach for analyzing incomplete data. Many software packages are available to multiply impute the missing values and to analyze the resulting completed data sets. However, diagnostic tools to check the validity of the imputations are limited, and the majority of the currently available methods need considerable knowledge of the imputation model. In many practical settings, however, the imputer and the analyst may be different individuals or from different organizations, and the analyst model may or may not be congenial to the model used by the imputer. This article develops and evaluates a set of graphical and numerical diagnostic tools for two practical purposes: (i) for an analyst to determine whether the imputations are reasonable under his/her model assumptions without actually knowing the imputation model assumptions; and (ii) for an imputer to fine tune the imputation model by checking the key characteristics of the observed and imputed values. The tools are based on the numerical and graphical comparisons of the distributions of the observed and imputed values conditional on the propensity of response. The methodology is illustrated using simulated data sets created under a variety of scenarios. The examples focus on continuous and binary variables, but the principles can be used to extend methods for other types of variables. Copyright © 2016 John Wiley & Sons, Ltd.
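
    A minimal sketch of the paper's central idea, comparing observed and imputed values within strata of estimated response propensity (synthetic MAR data; the quartile binning is an illustrative choice, not the paper's prescription):

    ```python
    # Compare observed vs. imputed values conditional on response propensity.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)                        # fully observed covariate
    y = x + rng.normal(0, 1, 1000)
    miss = rng.random(1000) < 1 / (1 + np.exp(-x))   # MAR missingness
    y_imp = y.copy()
    y_imp[miss] = x[miss] + rng.normal(0, 1, miss.sum())   # one imputation

    prop = LogisticRegression().fit(x[:, None], miss).predict_proba(x[:, None])[:, 1]
    bins = np.quantile(prop, [0, .25, .5, .75, 1])
    for lo, hi in zip(bins[:-1], bins[1:]):
        sel = (prop >= lo) & (prop <= hi)
        obs = y[sel & ~miss].mean()
        imp = y_imp[sel & miss].mean() if (sel & miss).any() else float("nan")
        print(f"propensity [{lo:.2f},{hi:.2f}]: observed {obs:+.2f}, imputed {imp:+.2f}")
    ```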

  13. Multi-Model R-Tool for uncertainty assessment in landslides susceptibility analysis

    Science.gov (United States)

    Cosmin Sandric, Ionut; Chitu, Zenaida; Jurchescu, Marta; Micu, Mihai

    2014-05-01

    The evaluation of landslide susceptibility requires understanding of the spatial distribution of the factors that control slope instability. It is known that the behavior of landslides is difficult to evaluate because of the various factors that trigger mass movements. The methodology used is very diverse, based on statistical, probabilistic, deterministic or empirical methods, or a combination of them, and the main factors used for landslide susceptibility assessment comprise basic morphometric parameters, such as slope gradient, curvature, aspect, solar radiation etc., in combination with lithology, land-use/land-cover, soil types or soil properties. The reliability of susceptibility maps is mostly estimated by a comparison with ground truth and visualized as charts and statistical tables, and less often by maps of landslide susceptibility uncertainty. Due to the similarity of inputs required by numerous susceptibility models, we have developed a Multi-Model tool for R, a free software environment for statistical computing and graphics, that combines several landslide susceptibility models into one forecast, thereby improving the forecast accuracy even further. The tool uses as inputs all the predisposing factors and generates susceptibility maps for each model; it combines the resulting susceptibility maps into just one and assesses the uncertainty as a function of the susceptibility levels from each map. The final results are susceptibility and uncertainty maps as a function of several susceptibility models. The Multi-Model R-Tool was tested in different areas of the Romanian Subcarpathians with very good results.

  14. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality reporting etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with respect to currently available results.

  15. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  16. A MASTER THESIS ON PORTING THE ENTERPRISE ARCHITECTURE ANALYSIS TOOL TO ECLIPSE MODELING PROJECT

    OpenAIRE

    Ivanov, Stanislav

    2011-01-01

    This master thesis is a part of the ongoing research on the EAT development project. Its main goal is to research whether the Eclipse Modeling Project can be used as an alternative platform to NetBeans for implementing the EAT tool. In order to fulfill this goal, it contains an analysis of the current EAT tool version and design research for a new version using EMP. The design addresses most of the issues related to building a new version and eventually recommends porting EAT to EMP.

  17. Learning how to use a tool: Mutually exclusive tool-function mappings are selectively acquired from linguistic in-group models.

    Science.gov (United States)

    Pető, Réka; Elekes, Fruzsina; Oláh, Katalin; Király, Ildikó

    2018-07-01

    The current study investigated whether 4-year-olds used language as a cue to social group membership to infer whether the tool-use behavior of a model needed to be encoded as indicative of the tool's function. We built on children's tendency to treat functions as mutually exclusive, that is, their propensity to refrain from using the same tool for more than one function. We hypothesized that children would form mutually exclusive tool-function mappings only if the source of the function information was a linguistic in-group person (native) as opposed to an out-group (foreign) person. In Experiment 1, participants (N = 39) were presented with four tool-function pairs by a model who had previously spoken either in their native language or in a foreign language. During the test phase, children encountered new purposes for which they could either use the demonstrated tools' color variant or use another equally suitable, as yet unseen, alternative tool. In line with our predictions, children preferred to use the alternative tool for the new function only in the native language condition (native: 63.3%; foreign: 42.7%). Experiment 2 replicated the initial finding using another foreign language and demonstrated that the lack of mutually exclusive tool choice in the foreign condition did not originate from children's failure to encode the demonstration. These findings suggest that children restrict learning artifact functions from linguistic in-group models. The mutual exclusivity principle in the domain of function learning is used more flexibly than previously proposed. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Directory of Open Access Journals (Sweden)

    Okokpujie Imhade Princess

    2017-12-01

    In recent machining operations, tool life is one of the most demanding tasks in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear with respect to the machining parameters and to develop a mathematical model using response surface methodology. The various machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using central composite design (CCD), in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm, and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable accuracy range for tool wear prediction.
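
    A hedged sketch of the response-surface step: a second-order polynomial in (N, f, a, r) fitted by least squares, assuming scikit-learn; the 31 synthetic runs below are not the paper's CCD data:

    ```python
    # Second-order response-surface fit of tool wear vs. machining
    # parameters; the design points and coefficients are synthetic.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)
    X = np.c_[rng.uniform(1500, 3000, 31),    # N: spindle speed [rpm]
              rng.uniform(100, 300, 31),      # f: feed rate [mm/min]
              rng.uniform(10, 25, 31),        # a: axial depth of cut [mm]
              rng.uniform(0.5, 1.5, 31)]      # r: radial depth of cut [mm]
    wear = 0.4 - 5e-5 * X[:, 0] + 1e-3 * X[:, 1] + rng.normal(0, 0.01, 31)

    quad = PolynomialFeatures(degree=2, include_bias=False)
    model = LinearRegression().fit(quad.fit_transform(X), wear)
    print("R^2 on the design points:", model.score(quad.transform(X), wear))
    ```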

  19. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Science.gov (United States)

    Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.

    2017-12-01

    In recent machining operations, tool life is one of the most demanding tasks in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear with respect to the machining parameters and to develop a mathematical model using response surface methodology. The various machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using central composite design (CCD), in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm, and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable accuracy range for tool wear prediction.

  20. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs.

  1. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    … materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of settings for the generation of discharges in stable conditions of micro-EDM on the phenomenon of tool electrode wear. A stable sparking … a condition for the minimum tool wear for this micro-EDM process configuration.

  2. Techniques for the construction of an elliptical-cylindrical model using circular rotating tools in non CNC machines

    International Nuclear Information System (INIS)

    Villalobos Mendoza, Brenda; Cordero Davila, Alberto; Gonzalez Garcia, Jorge

    2011-01-01

    This paper describes the construction of an elliptical-cylindrical model without spherical aberration using vertical rotating tools. The engine of the circular tool is placed on one arm so that the tool fits on the surface and this in turn is moved by an X-Y table. The test method and computer algorithms that predict the desired wear are described.

  3. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar

    2004-01-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. … 1. Active stall wind turbine with induction generator 2. Variable speed, variable pitch wind turbine with doubly-fed induction generator. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grids and storage systems. For both of these concepts, control strategies are developed and implemented, and their performance assessed and discussed by means of simulations.

  4. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    Science.gov (United States)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  5. Modelling tools for assessing bioremediation performance and risk of chlorinated solvents in clay tills

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia

    … are trapped in the low-permeability matrix and can then slowly back diffuse to the fracture network, forming a long-term secondary contamination source to the underlying aquifers. Because of the complex transport and degradation processes and the mass transfer limitations, risk assessment and remediation design are challenging. This thesis presents the development and application of analytical and numerical models to improve our understanding of transport and degradation processes in clay tills, which is crucial for assessing bioremediation performance and risk to groundwater. A set of modelling tools … of future remediation of chlorinated ethenes in low-permeability settings. In conclusion, this PhD-project has developed our understanding of transport and degradation processes of chlorinated solvents in clay tills, and this knowledge was used to develop modelling tools for assessment of risk …

  6. A Distributed Electrochemistry Modeling Tool for Simulating SOFC Performance and Degradation

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P.; Ryan, Emily M.; Khaleel, Mohammad A.

    2011-10-13

    This report presents a distributed electrochemistry (DEC) model capable of investigating the electrochemistry and local conditions within the SOFC MEA based on the local microstructure and multi-physics. The DEC model can calculate the global current-voltage (I-V) performance of the cell as determined by the spatially varying local conditions through the thickness of the electrodes and electrolyte. The simulation tool is able to investigate the electrochemical performance based on characteristics of the electrode microstructure, such as particle size, pore size, electrolyte and electrode phase volume fractions, and triple-phase-boundary length. It can also investigate performance as affected by fuel and oxidant gas flow distributions and other environmental/experimental conditions such as temperature and fuel gas composition. The long-term objective for the DEC modeling tool is to investigate factors that cause electrode degradation and the decay of SOFC performance, which decreases longevity.
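
    A one-cell sketch of the kind of global I-V calculation described, using standard activation (Butler-Volmer in arcsinh form), ohmic, and concentration polarization terms; all parameter values are illustrative, not the report's:

    ```python
    # Global I-V sketch with standard polarization terms for one SOFC cell.
    import numpy as np

    F, R, T = 96485.0, 8.314, 1073.0          # SOFC at 800 degC

    def cell_voltage(j, E0=1.0, j0=2000.0, r_ohm=1.5e-5, j_lim=3e4):
        """j in A/m^2; returns V after activation, ohmic, concentration losses."""
        eta_act = (R * T / (2 * F)) * np.arcsinh(j / (2 * j0))   # Butler-Volmer
        eta_ohm = r_ohm * j
        eta_conc = -(R * T / (2 * F)) * np.log(1 - j / j_lim)
        return E0 - eta_act - eta_ohm - eta_conc

    for j in (1000.0, 5000.0, 10000.0):
        print(f"{j:7.0f} A/m^2 -> {cell_voltage(j):.3f} V")
    ```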

  7. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or, more broadly, Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
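
    A minimal sketch of the hybrid idea, assuming a chiller whose physics-based expectation is a fixed fraction of the Carnot COP (the paper's actual models and fault rules are not given in this abstract): measured performance is compared against the first-principles expectation, and a fault is flagged when the residual exceeds a tolerance.

```python
import numpy as np

def expected_cop(t_chw_supply, t_cond_water, eta_carnot=0.55):
    """Physics-based expectation: an assumed fixed fraction of Carnot COP."""
    t_evap = t_chw_supply + 273.15          # K, chilled-water supply temp
    t_cond = t_cond_water + 273.15          # K, condenser-water temp
    return eta_carnot * t_evap / (t_cond - t_evap)

def fdd_rule(cooling_kw, power_kw, t_chw, t_cw, tol=0.15):
    """Flag samples whose measured COP falls more than tol below the model."""
    cop_meas = np.asarray(cooling_kw) / np.asarray(power_kw)
    cop_model = expected_cop(np.asarray(t_chw), np.asarray(t_cw))
    residual = (cop_model - cop_meas) / cop_model
    return residual > tol                   # True = suspected efficiency fault

# One healthy and one degraded sample (numbers invented)
print(fdd_rule([350, 300], [60, 75], t_chw=[7, 7], t_cw=[30, 30]))
```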

  8. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently available, and soon-to-be available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of the art of these tools in development, which have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  9. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process

    Directory of Open Access Journals (Sweden)

    Pablo Araujo Granda

    2016-01-01

    Full Text Available Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for bio-technological processes. A key factor for modelling microbial activity is the calculation of nutrient amounts and products generated as a result of the microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3−, NO2−, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, which are applicable in a wide range of academic research interested in designing, optimizing and modelling microbial activity without any extensive chemical, microbiological and programming experience.
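
    The core bookkeeping behind this approach is a linear combination of reduction half-reactions: R = fe*Ra + fs*Rc - Rd, with fe + fs = 1. The sketch below implements just that combination in Python for an aerobic heterotroph growing on acetate; the half-reaction coefficients are the standard textbook (Rittmann and McCarty) values, but the energy fraction fe = 0.6 is assumed here for illustration, whereas MbT-Tool derives fe and fs thermodynamically.

```python
from collections import defaultdict

# Reduction half-reactions per electron equivalent.
# Negative = consumed in the reduction, positive = produced.
R_DONOR_ACETATE = {"CO2": -1/8, "HCO3-": -1/8, "H+": -1, "e-": -1,
                   "CH3COO-": 1/8, "H2O": 3/8}
R_ACCEPTOR_O2 = {"O2": -1/4, "H+": -1, "e-": -1, "H2O": 1/2}
R_CELLS = {"CO2": -1/5, "NH4+": -1/20, "HCO3-": -1/20, "H+": -1, "e-": -1,
           "C5H7O2N": 1/20, "H2O": 9/20}

def overall_reaction(fe, donor, acceptor, cells):
    """R = fe*Ra + fs*Rc - Rd with fs = 1 - fe (energy/synthesis split)."""
    fs = 1.0 - fe
    total = defaultdict(float)
    for coeff, rxn in [(fe, acceptor), (fs, cells), (-1.0, donor)]:
        for species, nu in rxn.items():
            total[species] += coeff * nu
    # Drop species that cancel out (electrons and protons should vanish)
    return {s: round(v, 4) for s, v in total.items() if abs(v) > 1e-9}

# Aerobic heterotroph on acetate, 60% of electrons to energy (fe assumed)
print(overall_reaction(0.6, R_DONOR_ACETATE, R_ACCEPTOR_O2, R_CELLS))
```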

  10. Applications and issues of GIS as tool for civil engineering modeling

    Science.gov (United States)

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exists, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations regarding traditional GIS data models and the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  11. Hanford River Protection Project Life cycle Cost Modeling Tool to Enhance Mission Planning - 13396

    International Nuclear Information System (INIS)

    Dunford, Gary; Williams, David; Smith, Rick

    2013-01-01

    The Life cycle Cost Model (LCM) Tool is an overall systems model that incorporates budget and schedule impacts for the entire life cycle of the River Protection Project (RPP) mission, and is replacing the Hanford Tank Waste Operations Simulator (HTWOS) model as the foundation of the RPP system planning process. Currently, the DOE frequently requests HTWOS simulations of alternative technical and programmatic strategies for completing the RPP mission. Analysis of technical and programmatic changes can be performed with HTWOS; however, life cycle costs and schedules were previously generated by manual transfer of time-based data from HTWOS to Primavera P6. The LCM Tool automates the preparation of life cycle costs and schedules and is needed to provide timely turnaround capability for RPP mission alternative analyses. LCM is the simulation component of the LCM Tool. The simulation component is a replacement of the HTWOS model with new capability to support life cycle cost modeling. It is currently deployed in G2, but has been designed to work in any full object-oriented language with an extensive feature set focused on networking and cross-platform compatibility. The LCM retains existing HTWOS functionality needed to support system planning and alternatives studies going forward. In addition, it incorporates new functionality, coding improvements that streamline programming and model maintenance, and the capability to import/export data from/to the LCM using the LCM Database (LCMDB). The LCM Cost/Schedule (LCMCS) contains cost and schedule data and logic. The LCMCS is used to generate life cycle costs and schedules for waste retrieval and processing scenarios. It uses time-based output data from the LCM to produce the logic ties in Primavera P6 necessary for shifting activities. The LCM Tool is evolving to address the needs of decision makers who want to understand the broad spectrum of risks facing complex organizations like DOE-RPP to understand how near

  12. CMS Partial Releases Model, Tools, and Applications. Online and Framework-Light Releases

    CERN Document Server

    Jones, Christopher D; Meschi, Emilio; Muzaffar, Shahzad; Pfeiffer, Andreas; Ratnikova, Natalia; Sexton-Kennedy, Elizabeth

    2009-01-01

    The CMS Software project CMSSW embraces more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities and tools. The release integration process is highly automated by using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the CMSSW full package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in Root environment require only a few core libraries and the description of CMS specific data formats. We present a model of CMS Partial Releases, used for preparation of the customized CMS software builds, including description of the tools used, the implementation, and how we deal with technical challenges, suc...

  13. PredicT-ML: a tool for automating machine learning model building with big clinical data.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is fundamental to transforming large clinical data sets, or "big clinical data," into actionable knowledge for various healthcare applications. Machine learning is a major predictive modeling approach, but two barriers make its use in healthcare challenging. First, a machine learning tool user must choose an algorithm and assign one or more model parameters called hyper-parameters before model training. The algorithm and hyper-parameter values used typically impact model accuracy by over 40 %, but their selection requires many labor-intensive manual iterations that can be difficult even for computer scientists. Second, many clinical attributes are repeatedly recorded over time, requiring temporal aggregation before predictive modeling can be performed. Many labor-intensive manual iterations are required to identify a good pair of aggregation period and operator for each clinical attribute. Both barriers result in time and human resource bottlenecks, and preclude healthcare administrators and researchers from asking a series of what-if questions when probing opportunities to use predictive models to improve outcomes and reduce costs. This paper describes our design of and vision for PredicT-ML (prediction tool using machine learning), a software system that aims to overcome these barriers and automate machine learning model building with big clinical data. The paper presents the detailed design of PredicT-ML. PredicT-ML will open the use of big clinical data to thousands of healthcare administrators and researchers and increase the ability to advance clinical research and improve healthcare.
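
    PredicT-ML's internals are not given in this abstract, but the two barriers it targets can be illustrated with standard scikit-learn machinery: one candidate (aggregation period, operator) pair turns a repeated clinical attribute into a feature, and a randomized search explores algorithm hyper-parameters automatically. The data, the 30-day window and the mean operator below are toy assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Toy longitudinal data: repeated lab values per patient (synthetic)
rng = np.random.default_rng(0)
obs = pd.DataFrame({"patient": rng.integers(0, 200, 2000),
                    "day": rng.integers(0, 90, 2000),
                    "glucose": rng.normal(110, 25, 2000)})

# One (aggregation period, operator) choice: mean over the last 30 days
feat = (obs[obs.day >= 60].groupby("patient")["glucose"]
        .mean().rename("glucose_30d_mean").to_frame())
y = (feat.glucose_30d_mean + rng.normal(0, 10, len(feat)) > 115).astype(int)

# Automated search over algorithm hyper-parameters
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": [50, 100, 200], "max_depth": [3, 5, None],
     "min_samples_leaf": [1, 5, 20]},
    n_iter=10, cv=3, random_state=0)
search.fit(feat, y)
print(search.best_params_, round(search.best_score_, 3))
```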

  14. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Full Text Available Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  15. Identifying a minimal rheological configuration: a tool for effective and efficient constitutive modeling of soft tissues.

    Science.gov (United States)

    Jordan, Petr; Kerdok, Amy E; Howe, Robert D; Socrate, Simona

    2011-04-01

    We describe a modeling methodology intended as a preliminary step in the identification of appropriate constitutive frameworks for the time-dependent response of biological tissues. The modeling approach comprises a customizable rheological network of viscous and elastic elements governed by user-defined 1D constitutive relationships. The model parameters are identified by iterative nonlinear optimization, minimizing the error between experimental and model-predicted structural (load-displacement) tissue response under a specific mode of deformation. We demonstrate the use of this methodology by determining the minimal rheological arrangement, constitutive relationships, and model parameters for the structural response of various soft tissues, including ex vivo perfused porcine liver in indentation, ex vivo porcine brain cortical tissue in indentation, and ex vivo human cervical tissue in unconfined compression. Our results indicate that the identified rheological configurations provide good agreement with experimental data, including multiple constant strain rate load/unload tests and stress relaxation tests. Our experience suggests that the described modeling framework is an efficient tool for exploring a wide array of constitutive relationships and rheological arrangements, which can subsequently serve as a basis for 3D constitutive model development and finite-element implementations. The proposed approach can also be employed as a self-contained tool to obtain simplified 1D phenomenological models of the structural response of biological tissue to single-axis manipulations for applications in haptic technologies.
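
    A minimal instance of the described workflow, assuming a one-branch standard-linear-solid network and synthetic stress-relaxation data in place of tissue tests: the 1D model parameters are identified by nonlinear least squares, and Maxwell branches would be added only if the fit were inadequate, mirroring the "minimal configuration" strategy.

```python
import numpy as np
from scipy.optimize import curve_fit

def sls_relaxation(t, f_inf, f_1, tau):
    """Force relaxation of a standard-linear-solid network after a
    step indentation: equilibrium term plus one Maxwell branch."""
    return f_inf + f_1 * np.exp(-t / tau)

# Synthetic "experimental" relaxation data (stand-in for tissue tests)
t = np.linspace(0, 60, 200)
f_meas = (sls_relaxation(t, 0.8, 0.5, 7.0)
          + np.random.default_rng(1).normal(0, 0.01, t.size))

popt, _ = curve_fit(sls_relaxation, t, f_meas, p0=[1.0, 1.0, 1.0])
print("f_inf=%.3f N, f_1=%.3f N, tau=%.2f s" % tuple(popt))
```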

  16. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end-to-end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and, after refining the process models via the human-in-the-loop simulation, the

  17. Impact of an electronic handoff documentation tool on team shared mental models in pediatric critical care.

    Science.gov (United States)

    Jiang, Silis Y; Murphy, Alexandrea; Heitkemper, Elizabeth M; Hum, R Stanley; Kaufman, David R; Mamykina, Lena

    2017-05-01

    To examine the impact of the implementation of an electronic handoff tool (the Handoff Tool) on shared mental models (SMM) within patient care teams as measured by content overlap and discrepancies in verbal handoff presentations given by different clinicians caring for the same patient. Researchers observed, recorded, and transcribed verbal handoffs given by different members of patient care teams in a pediatric intensive care unit. The transcripts were qualitatively coded and analyzed for content overlap scores and the number of discrepancies in handoffs of different team members before and after the implementation of the tool. Content overlap scores did not change post-implementation. The average number of discrepancies nearly doubled following the implementation (from 0.76 discrepancies per handoff group pre-implementation to 1.17 discrepancies per handoff group post-implementation); however, this change was not statistically significant (p=0.37). Discrepancies classified as related to dosage of treatment or procedure and to patients' symptoms increased in frequency post-implementation. The results suggest that the Handoff Tool did not have the desired positive impact on SMM within patient care teams. Future electronic tools for facilitating team handoff may need longer implementation times, complementary changes to handoff process and structure, and improved designs that integrate a common core of shared information with discipline-specific records. While electronic handoff tools provide great opportunities to improve communication and facilitate the formation of shared mental models within patient care teams, further work is necessary to realize their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.

  19. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    Science.gov (United States)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project, which are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of (1) dynamic vulnerability assessment (pre-event) and (2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  20. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    Full Text Available The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations to both innovate and increase productivity, and the volumes of data and information that are available to support both, the 5I model addresses an important organizational issue.

  1. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    Science.gov (United States)

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289
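
    A small sketch of global (Sobol) sensitivity analysis on a deliberately tiny systems-pharmacology stand-in, a one-compartment PK model, assuming the SALib Python package; the variance-based first-order and total-order indices indicate which parameters most influence the output and are therefore worth careful estimation.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# One-compartment PK model: concentration at t = 6 h as the scalar output
def conc(CL, V, dose=100.0, t=6.0):
    k = CL / V                       # elimination rate constant, 1/h
    return dose / V * np.exp(-k * t)

problem = {"num_vars": 2,
           "names": ["CL", "V"],
           "bounds": [[0.5, 5.0],    # clearance, L/h (assumed range)
                      [10.0, 50.0]]} # volume, L (assumed range)

X = saltelli.sample(problem, 1024)   # N*(2D+2) parameter sets
Y = np.array([conc(cl, v) for cl, v in X])
Si = sobol.analyze(problem, Y)
print("First-order:", Si["S1"], "Total-order:", Si["ST"])
```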

  2. A trade-off analysis design tool. Aircraft interior noise-motion/passenger satisfaction model

    Science.gov (United States)

    Jacobson, I. D.

    1977-01-01

    A design tool was developed to enhance aircraft passenger satisfaction. The effect of aircraft interior motion and noise on passenger comfort and satisfaction was modelled. Effects of individual aircraft noise sources were accounted for, and the impact of noise on passenger activities and noise levels to safeguard passenger hearing were investigated. The motion noise effect models provide a means for tradeoff analyses between noise and motion variables, and also provide a framework for optimizing noise reduction among noise sources. Data for the models were collected onboard commercial aircraft flights and specially scheduled tests.

  3. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    Science.gov (United States)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    The agricultural sector is the most common suspected source of nutrient pollution in Irish rivers. However, it is also often the most difficult source to characterise due to its predominantly diffuse nature. Particulate phosphorus in surface water and dissolved phosphorus in groundwater are of particular concern in Irish water bodies. Hence the further development of models and indices to assess diffuse sources of contaminants is required for use by the Irish Environmental Protection Agency (EPA) to provide support for river basin planning. Understanding connectivity in the landscape is a vital component of characterising the source-pathway-receptor relationships for water-borne contaminants, and hence is a priority in this research. The DIFFUSE Project will focus on connectivity modelling and incorporation of connectivity into sediment, nutrient and pesticide risk mapping. The Irish approach to understanding and managing natural water bodies has developed substantially in recent years, assisted by outputs from multiple research projects, including modelling and analysis tools developed during the Pathways and CatchmentTools projects. These include the Pollution Impact Potential (PIP) maps, which are an example of research output that is used by the EPA to support catchment management. The PIP maps integrate an understanding of the pollution pressures and mobilisation pathways and, using the source-pathways-receptor model, provide a scientific basis for evaluation of mitigation measures. These maps indicate the potential risk posed by nitrate and phosphate from diffuse agricultural sources to surface and groundwater receptors and delineate critical source areas (CSAs) as a means of facilitating the targeting of mitigation measures. Building on this previous research, the DIFFUSE Project will develop revised and new catchment management tools focused on connectivity, sediment, phosphorus and pesticides. The DIFFUSE project will strive to identify the state

  4. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have a great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  5. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, A.C.; Jauch, C.; Soerensen, P.; Iov, F.; Blaabjerg, F.

    2003-12-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built-in models for the electrical components of a grid-connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). Initialisation issues for the wind turbine models in the power system simulation are also presented. However, the main attention in this report is drawn to the modelling at the system level of two wind turbine concepts: 1. Active stall wind turbine with induction generator 2. Variable speed, variable pitch wind turbine with doubly fed induction generator. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid and storage systems. For both of these concepts, control strategies are developed and implemented, and their performance assessed and discussed by means of simulations. (au)

  6. A tool for multi-scale modelling of the renal nephron

    Science.gov (United States)

    Nickerson, David P.; Terkildsen, Jonna R.; Hamilton, Kirk L.; Hunter, Peter J.

    2011-01-01

    We present the development of a tool, which provides users with the ability to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as they select them in the three-dimensional model view. Alternatively, the user chooses to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature. PMID:22670210

  7. Testing and thermal modeling of radiant panels systems as commissioning tool

    International Nuclear Information System (INIS)

    Fonseca Diaz, Nestor; Cuevas, Cristian

    2010-01-01

    This paper presents the results of a study performed to develop a thermal model of radiant panel systems, to be used in situ as a diagnosis tool in commissioning processes to determine the main operating conditions of the system in cooling or heating mode. The model considers the radiant panels as a finned heat exchanger in dry regime. Using as inputs the ceiling and room dimensions, the radiant ceiling material properties and the measurements of air and water mass flow rates and temperatures, the model is able to calculate the radiant ceiling capacity, ceiling surface average temperature, water exhaust temperature and resultant temperature as a comfort indicator. The proposed model considers combined convection, the perforation effect and a detailed radiative heat exchange method for radiant ceiling systems. An example of each system considered in this study is shown, illustrating the validation of the model. A sensitivity analysis of the model is performed.
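
    As an illustration of the diagnosis idea (not the paper's detailed radiative model), the sketch below treats the radiant ceiling as a dry-regime heat exchanger between water and room, in an effectiveness-NTU form; the lumped conductance ua is an assumed value standing in for the combined fin, convection and radiation resistances.

```python
import numpy as np

def radiant_ceiling(m_dot, t_w_in, t_room, ua=300.0, cp=4186.0):
    """Dry-regime ceiling capacity via an effectiveness-NTU analogy.

    ua (W/K) lumps fin, convection and radiation resistances (assumed);
    the room is treated as an isothermal stream of infinite capacity.
    """
    ntu = ua / (m_dot * cp)
    eps = 1.0 - np.exp(-ntu)                   # exchanger effectiveness
    q = eps * m_dot * cp * (t_room - t_w_in)   # cooling capacity, W
    t_w_out = t_w_in + q / (m_dot * cp)        # water exhaust temperature
    return q, t_w_out

q, t_out = radiant_ceiling(m_dot=0.15, t_w_in=16.0, t_room=26.0)
print("capacity %.0f W, water leaving at %.1f degC" % (q, t_out))
```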

  8. The Business Model Evaluation Tool for Smart Cities: Application to SmartSantander Use Cases

    Directory of Open Access Journals (Sweden)

    Raimundo Díaz-Díaz

    2017-02-01

    Full Text Available New technologies open up the door to multiple business models applied to public services in smart cities. However, there is no commonly adopted methodology for evaluating business models in smart cities that can help both practitioners and researchers to choose the best option. This paper addresses this gap by introducing the Business Model Evaluation Tool for Smart Cities. This methodology is a simple, organized, flexible and transparent system that facilitates the work of the evaluators of potential business models. It is useful for comparing two or more business models and taking strategic decisions promptly. The method follows a prior process of content analysis and is based on the widely utilized Business Model Canvas. The evaluation method has been assessed by 11 experts and subsequently validated by applying it to the case studies of Santander's waste management and street lighting systems, which take advantage of innovative technologies commonly used in smart cities.

  9. A GUI-based Tool for Bridging the Gap between Models and Process-Oriented Studies

    Science.gov (United States)

    Kornfeld, A.; Van der Tol, C.; Berry, J. A.

    2014-12-01

    Models used for simulation of photosynthesis and transpiration by canopies of terrestrial plants typically have subroutines such as STOMATA.F90, PHOSIB.F90 or BIOCHEM.m that solve for photosynthesis and associated processes. Key parameters such as the Vmax for Rubisco and temperature response parameters are required by these subroutines. These are often taken from the literature or determined by separate analysis of gas exchange experiments. It is useful to note, however, that subroutines can be extracted and run as standalone models to simulate leaf responses collected in gas exchange experiments. Furthermore, there are excellent non-linear fitting tools that can be used to optimize the parameter values in these models to fit the observations. Ideally the Vmax fit in this way should be the same as that determined by a separate analysis, but it may not be, because of interactions with other kinetic constants and the temperature dependence of these in the full subroutine. We submit that it is more useful to fit the complete model to the calibration experiments rather than to fit disaggregated constants. We designed a graphical user interface (GUI) based tool that uses gas exchange photosynthesis data to directly estimate model parameters in the SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) model and, at the same time, allows researchers to change parameters interactively to visualize how variation in model parameters affects predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. We have also ported some of this functionality to an Excel spreadsheet, which could be used as a teaching tool to help integrate process-oriented and model-oriented studies.
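
    A compact example of fitting a complete response rather than disaggregated constants, assuming the Rubisco-limited branch of the Farquhar model and synthetic A-Ci data in place of a gas-exchange file; scipy's curve_fit recovers Vcmax and Rd jointly. The kinetic constants are common 25 degC literature values, not parameters from the SCOPE tool itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def a_rubisco(ci, vcmax, rd, gamma_star=42.75, kc=404.9, ko=278.4, o=210.0):
    """Rubisco-limited net assimilation (umol m-2 s-1), Farquhar model."""
    km = kc * (1.0 + o / ko)
    return vcmax * (ci - gamma_star) / (ci + km) - rd

# Synthetic A-Ci points standing in for a gas-exchange calibration file
ci = np.array([50, 100, 150, 200, 250, 300], dtype=float)
a_obs = a_rubisco(ci, 60.0, 1.5) + np.random.default_rng(2).normal(0, 0.3, ci.size)

# Fit the full response; only vcmax and rd are free parameters here
popt, _ = curve_fit(a_rubisco, ci, a_obs, p0=[50.0, 1.0])
print("Vcmax=%.1f, Rd=%.2f" % tuple(popt))
```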

  10. SIMULATION MODELLING AS A TOOL FOR PERFORMING AVAILABILITY AND SENSITIVITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    P.S. Kruger

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Simulation modelling is a general purpose tool that may be used to provide decision support in a number of application areas. It may be used to analyze, design or "optimize" manufacturing, materials handling, management, commercial and a wide variety of other systems.
    This paper will report on the design of a prototype decision support tool, based on a simulation model of a vehicle fleet availability problem. The primary purpose of the model is to serve as a tool for the evaluation of the availability of equipment under different conditions and to perform sensitivity analysis.

    AFRIKAANSE OPSOMMING: Simulation modelling is a general-purpose technique that can be used to provide decision support in a number of application areas. It may be used for the analysis, design or "optimisation" of manufacturing, materials-handling, management, commercial and a wide variety of other systems.
    This paper reports on the development of a prototype decision-support aid based on a simulation model of a vehicle-fleet availability problem. The main objective of the model is to serve as an aid in evaluating the availability of equipment under different circumstances and in performing sensitivity analysis.
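
    A minimal Monte Carlo sketch of the kind of availability question such a simulation model addresses, assuming exponential failure and repair times with illustrative MTBF/MTTR values; rerunning with perturbed inputs is exactly the sensitivity-analysis use case described above.

```python
import numpy as np

def fleet_availability(n_vehicles=40, mtbf=120.0, mttr=8.0,
                       horizon=10_000.0, n_runs=100, seed=0):
    """Monte Carlo availability of a vehicle fleet with exponential
    failure (MTBF) and repair (MTTR) times, both in hours (assumed)."""
    rng = np.random.default_rng(seed)
    up_frac = []
    for _ in range(n_runs):
        up_time = 0.0
        for _ in range(n_vehicles):
            t, up = 0.0, 0.0
            while t < horizon:
                run = rng.exponential(mtbf)       # time to next failure
                up += min(run, horizon - t)       # operating time, clipped
                t += run + rng.exponential(mttr)  # add downtime for repair
            up_time += up
        up_frac.append(up_time / (n_vehicles * horizon))
    return np.mean(up_frac), np.std(up_frac)

mean_a, sd_a = fleet_availability()
print("availability %.3f +/- %.3f" % (mean_a, sd_a))  # ~ MTBF/(MTBF+MTTR)
```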

  11. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    Becker, Janine; Zankl, Maria; Petoussi-Henss, Nina

    2007-01-01

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and location of organs of virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers that are arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an already existing voxel model is a complicated process, leading to many problems that have to be solved. To solve those intricacies in an easy way, a new software tool was developed and is presented here. If the organs are modified, no bit of tissue, i.e. voxel, may vanish nor should an extra one appear. That means that organs cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. skin. In the software tool described here, the modifications are done by semi-automatic routines but including human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to fulfil the purpose of a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
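
    A toy illustration of the reassignment principle described above (voxels are moved between organs, never created or destroyed except at the skin), assuming a synthetic numpy phantom rather than a real voxel model; a real tool would add the anatomical plausibility checks the note emphasizes.

```python
import numpy as np
from scipy import ndimage

def grow_organ(model, organ_id, donor_id, target_voxels):
    """Reassign voxels of donor_id that border organ_id until the organ
    reaches target_voxels; the total voxel count never changes."""
    model = model.copy()
    while (model == organ_id).sum() < target_voxels:
        # Donor voxels directly adjacent to the organ surface
        shell = ndimage.binary_dilation(model == organ_id) & (model == donor_id)
        if not shell.any():                  # no adjacent donor tissue left
            break
        idx = np.argwhere(shell)
        need = target_voxels - (model == organ_id).sum()
        for i, j, k in idx[:need]:           # take only as many as needed
            model[i, j, k] = organ_id
    return model

# 3-D toy phantom: organ 2 embedded in soft tissue 1
phantom = np.ones((30, 30, 30), dtype=np.int16)
phantom[12:18, 12:18, 12:18] = 2
grown = grow_organ(phantom, organ_id=2, donor_id=1, target_voxels=400)
print((phantom == 2).sum(), "->", (grown == 2).sum())
```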

  12. A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles

    Science.gov (United States)

    Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.

    2015-01-01

    Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
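
    The core profile-generation step can be illustrated without MagicDraw or Maple: given a power equipment list of component modes and a scenario of timed mode constraints, the electric load profile is the time-wise sum of mode powers. All component names and numbers below are invented for illustration, not taken from SPLAT.

```python
import numpy as np

# Hypothetical power equipment list: power draw (W) per component mode
PEL = {"transponder": {"off": 0.0, "standby": 2.0, "transmit": 45.0},
       "camera":      {"off": 0.0, "imaging": 12.0},
       "heater":      {"off": 0.0, "on": 30.0}}

# Scenario: (component, mode, start_min, end_min) temporal constraints
SCENARIO = [("transponder", "standby", 0, 120),
            ("transponder", "transmit", 120, 150),
            ("camera", "imaging", 60, 150),
            ("heater", "on", 0, 60)]

def load_profile(pel, scenario, horizon_min=180):
    """Sum component mode powers over time to get the electric load."""
    profile = np.zeros(horizon_min)
    for comp, mode, t0, t1 in scenario:
        profile[t0:t1] += pel[comp][mode]
    return profile

p = load_profile(PEL, SCENARIO)
print("peak %.0f W at minute %d" % (p.max(), p.argmax()))
```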

  13. Interdisciplinary semantic model for managing the design of a steam-assisted gravity drainage tooling system

    Directory of Open Access Journals (Sweden)

    Michael Leitch

    2018-01-01

    Full Text Available Complex engineering systems often require extensive coordination between different expert areas in order to avoid costly design iterations and rework. Cyber-physical system (CPS) engineering methods could provide valuable insights to help model these interactions and optimize the design of such systems. In this work, steam-assisted gravity drainage (SAGD), a complex oil extraction process that requires deep understanding of several physical-chemical phenomena, is examined, and the complexities and interdependencies of the system are explored. Based on an established unified feature modeling scheme, a software modeling framework is proposed to manage the design process of the production tools used for SAGD oil extraction. Applying CPS methods to unify complex phenomena and engineering models, the proposed CPS model combines effective simulation with embedded knowledge of completion tooling design in order to optimize reservoir performance. The system design is expressed using graphical diagrams of the unified modelling language (UML) convention. To demonstrate the capability of this system, a distributed research group is described, and their activities are coordinated using the described CPS model.

  14. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the type of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  15. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step during the management of urban areas from a sound standpoint should be the evaluation of the soundscape in such an area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step to evaluate it, providing a basis for designing or adapting it to match people's expectations as well. With this aim, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
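
    A brief sketch of this kind of classification setup, assuming synthetic acoustic descriptors in place of the paper's features; scikit-learn's SVC is libsvm-based and therefore trained with an SMO-type solver, loosely matching the better-performing configuration reported.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for acoustic/perceptual descriptors per recording:
# [LAeq, loudness, sharpness, temporal variability] (values invented)
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([70, 30, 1.5, 6], 3, (50, 4)),    # traffic-dominated
               rng.normal([55, 12, 1.1, 12], 3, (50, 4))])  # nature-dominated
y = np.array([0] * 50 + [1] * 50)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy: %.3f" % cross_val_score(clf, X, y, cv=5).mean())
```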

  16. KENO3D visualization tool for KENO V.a geometry models

    International Nuclear Information System (INIS)

    Bowman, S.M.; Horwedel, J.E.

    1999-01-01

    The Standardized Computer Analyses for Licensing Evaluations (SCALE) computer software system developed at Oak Ridge National Laboratory (ORNL) is widely used and accepted around the world for criticality safety analyses. SCALE includes the well-known KENO V.a three-dimensional Monte Carlo criticality computer code. Criticality safety analyses often require detailed modeling of complex geometries. Checking the accuracy of these models can be enhanced by effective visualization tools. To address this need, ORNL has recently developed a powerful state-of-the-art visualization tool called KENO3D that enables KENO V.a users to interactively display their three-dimensional geometry models. The interactive options include the following: (1) having shaded or wireframe images; (2) showing standard views, such as top view, side view, front view, and isometric three-dimensional view; (3) rotating the model; (4) zooming in on selected locations; (5) selecting parts of the model to display; (6) editing colors and displaying legends; (7) displaying properties of any unit in the model; (8) creating cutaway views; (9) removing units from the model; and (10) printing image or saving image to common graphics formats

  17. CSML2SBML: a novel tool for converting quantitative biological pathway models from CSML into SBML.

    Science.gov (United States)

    Li, Chen; Nagasaki, Masao; Ikeda, Emi; Sekiya, Yayoi; Miyano, Satoru

    2014-07-01

    CSML and SBML are XML-based model definition standards developed with the aim of creating exchange formats for modeling, visualizing and simulating biological pathways. In this article we report a release of a format convertor for quantitative pathway models, namely CSML2SBML. It translates models encoded in CSML into SBML without loss of structural and kinetic information. The simulation and parameter estimation of the resulting SBML model can be carried out with the compliant tool CellDesigner for further analysis. The convertor is based on the standards CSML version 3.0 and SBML Level 2 Version 4. In our experiments, 11 out of 15 pathway models in the CSML model repository and 228 models in the Macrophage Pathway Knowledgebase (MACPAK) are successfully converted to SBML models. The consistency of the resulting model is validated by the libSBML Consistency Check of CellDesigner. Furthermore, the converted SBML model, assigned the kinetic parameters translated from the CSML model, can reproduce the same dynamics with CellDesigner as the CSML one running on Cell Illustrator. CSML2SBML, along with its instructions and examples for use, is available at http://csml2sbml.csml.org. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
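
    The metric suite named above is straightforward to reproduce; the sketch below computes Probability of Detection, Probability of False Detection, Heidke Skill Score, RMSE and Prediction Efficiency for one model-data time-series pair, with a synthetic series standing in for CCMC model output and observations.

```python
import numpy as np

def skill_scores(model, obs, threshold):
    """Event-detection metrics from a 2x2 contingency table plus
    continuous-error scores, as used in model-data comparisons."""
    hit = np.sum((model >= threshold) & (obs >= threshold))
    false = np.sum((model >= threshold) & (obs < threshold))
    miss = np.sum((model < threshold) & (obs >= threshold))
    corr_neg = np.sum((model < threshold) & (obs < threshold))
    n = hit + false + miss + corr_neg

    pod = hit / (hit + miss)                  # probability of detection
    pofd = false / (false + corr_neg)         # probability of false detection
    exp_correct = ((hit + miss) * (hit + false)
                   + (corr_neg + miss) * (corr_neg + false)) / n
    heidke = (hit + corr_neg - exp_correct) / (n - exp_correct)

    rmse = np.sqrt(np.mean((model - obs) ** 2))
    pe = 1.0 - np.sum((obs - model) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return {"POD": pod, "POFD": pofd, "HSS": heidke, "RMSE": rmse, "PE": pe}

rng = np.random.default_rng(4)
obs = rng.normal(0, 1, 500)                # synthetic observed index
model = obs + rng.normal(0, 0.5, 500)      # imperfect model time series
print({k: round(v, 3) for k, v in skill_scores(model, obs, 1.0).items()})
```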

  19. A modeling tool to support decision making in future hydropower development in Chile

    Science.gov (United States)

    Vicuna, S.; Hermansen, C.; Cerda, J. P.; Olivares, M. A.; Gomez, T. I.; Toha, E.; Poblete, D.; Mao, L.; Falvey, M. J.; Pliscoff, P.; Melo, O.; Lacy, S.; Peredo, M.; Marquet, P. A.; Maturana, J.; Gironas, J. A.

    2017-12-01

    Modeling tools support planning by providing transparent means to assess the outcome of natural resources management alternatives within technical frameworks in the presence of conflicting objectives. Such tools, when employed to model different scenarios, complement discussion in a policy-making context. Examples of practical use of this type of tool exist, such as the Canadian public forest management, but are not common, especially in the context of developing countries. We present a tool to support the selection from a portfolio of potential future hydropower projects in Chile. This tool, developed by a large team of researchers under the guidance of the Chilean Energy Ministry, is especially relevant in the context of evident regionalism, skepticism and change in societal values in a country that has achieved a sustained growth alongside increased demands from society. The tool operates at a scale of a river reach, between 1-5 km long, on a domain that can be defined according to the scale needs of the related discussion, and its application can vary from river basins to regions or other spatial configurations that may be of interest. The tool addresses both available hydropower potential and the existence (inferred or observed) of other ecological, social, cultural and productive characteristics of the territory which are valuable to society, and provides a means to evaluate their interaction. The occurrence of each of these other valuable characteristics in the territory is measured by generating a presence-density score for each. Considering the level of constraint each characteristic imposes on hydropower development, they are weighted against each other and an aggregate score is computed. With this information, optimal trade-offs are computed between additional hydropower capacity and valuable local characteristics over the entire domain, using the classical knapsack 0-1 optimization algorithm. Various scenarios of different weightings and hydropower
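
    The selection step reduces to the classical 0-1 knapsack: maximize added capacity subject to a cap on the aggregated territorial-impact score. A dynamic-programming sketch with hypothetical river reaches (the real tool scores reaches from presence-density layers, which this sketch does not attempt) is shown below.

```python
def knapsack_select(projects, budget):
    """Classic 0-1 knapsack: maximize added hydropower capacity (MW)
    subject to a cap on the aggregate territorial-impact score.
    Impact scores are integers to keep the DP table small (illustrative)."""
    best = [0.0] * (budget + 1)
    keep = [[False] * (budget + 1) for _ in projects]
    for i, (_, mw, impact) in enumerate(projects):
        for b in range(budget, impact - 1, -1):   # descending: each item once
            if best[b - impact] + mw > best[b]:
                best[b] = best[b - impact] + mw
                keep[i][b] = True
    # Trace back the chosen set of river reaches
    chosen, b = [], budget
    for i in range(len(projects) - 1, -1, -1):
        if keep[i][b]:
            chosen.append(projects[i][0])
            b -= projects[i][2]
    return best[budget], chosen

# (reach id, capacity MW, impact score) -- hypothetical candidate reaches
projects = [("R1", 120.0, 7), ("R2", 45.0, 2), ("R3", 80.0, 5), ("R4", 60.0, 3)]
print(knapsack_select(projects, budget=10))   # -> (185.0, ['R4', 'R3', 'R2'])
```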

  20. The Climate-Agriculture-Modeling and Decision Tool (CAMDT) for Climate Risk Management in Agriculture

    Science.gov (United States)

    Ines, A. V. M.; Han, E.; Baethgen, W.

    2017-12-01

    Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, these models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF allows simulating "what-if" scenarios with different crop choices or management practices, better informing the decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision-makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. CAMDT has functionality to disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or a non-parametric disaggregation method) and to run DSSAT-CSM-Rice with the disaggregated realizations. The convenient graphical user interface allows non-technical users to easily implement several "what-if" scenarios and visualize the results of the scenario runs. In addition, CAMDT translates crop model outputs into economic terms once the user provides expected crop prices and costs. CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and is flexible enough to be adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
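
    The paper's own parametric and non-parametric disaggregation methods are not reproduced in the abstract. As a rough sketch under stated assumptions (a simple resampling-style scheme with hypothetical tercile probabilities and a historical daily-rainfall archive keyed by year), turning a tercile SCF into daily weather realizations might look like:

```python
import numpy as np

def sample_realizations(tercile_probs, years_by_tercile, daily_archive,
                        n_realizations=100, rng=None):
    """Non-parametric disaggregation sketch: draw a rainfall tercile
    according to the forecast probabilities, then resample a historical
    season's daily rainfall from years falling in that tercile.

    tercile_probs    : (p_below, p_near, p_above), summing to 1
    years_by_tercile : dict mapping 'below'/'near'/'above' -> list of years
    daily_archive    : dict mapping year -> array of daily rainfall (mm)
    """
    rng = rng or np.random.default_rng()
    categories = ['below', 'near', 'above']
    realizations = []
    for _ in range(n_realizations):
        cat = rng.choice(categories, p=tercile_probs)  # pick a category
        year = rng.choice(years_by_tercile[cat])       # pick an analog year
        realizations.append(daily_archive[year])       # its daily rainfall
    return realizations
```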

  1. Modeling tools for the assessment of microbiological risks during floods: a review

    Science.gov (United States)

    Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin

    2015-04-01

    Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing

  2. Development from the seafloor to the sea surface of the cabled NEMO-SN1 observatory in the Western Ionian Sea

    Science.gov (United States)

    Sparnocchia, Stefania; Beranzoli, Laura; Borghini, Mireno; Durante, Sara; Favali, Paolo; Giovanetti, Gabriele; Italiano, Francesco; Marinaro, Giuditta; Meccia, Virna; Papaleo, Riccardo; Riccobene, Giorgio; Schroeder, Katrin

    2015-04-01

    A prototype cabled deep-sea observatory has been operating in real time since 2005 in Southern Italy (East Sicily, 37°30' N - 15°06' E), at 2100 m water depth, 25 km from the harbor of the city of Catania. It is the first-established real-time node of the "European Multidisciplinary Seafloor and water column Observatory" (EMSO, http://www.emso-eu.org), a research infrastructure of the Environment sector of ESFRI. In the present configuration it consists of two components: the multi-parametric station NEMO-SN1 (TSN branch), equipped with geophysical and environmental sensors for measurements at the seafloor, and the NEMO-OνDE station (TSS branch), equipped with 4 wideband hydrophones. A 28 km long electro-optical cable connects the observatory to a shore laboratory in the Catania harbor, which hosts the data acquisition system and supplies power and data transmission to the underwater instrumentation. The NEMO-SN1 observatory is located in an area particularly suited to multidisciplinary studies. The site is one of the most seismically active areas of the Mediterranean (some of the strongest earthquakes occurred in 1169, 1693 and 1908, also causing very intense tsunami waves) and is close to Mount Etna, one of the largest and most active volcanoes in Europe. The deployment area is also a key site for monitoring deep-water dynamics in the Ionian Sea, which connects the Levantine basin to the southern Adriatic basin, where intermediate and deep waters are formed, and finally to the western Mediterranean Sea via the Strait of Sicily. The observatory is being further developed under EMSO MedIT (http://www.emso-medit.it/en/), a structural enhancement project contributing to the consolidation and enhancement of the European research infrastructure EMSO in the Italian Convergence Regions. In this framework, a new Junction Box will be connected to the TSN branch and will provide wired and wireless (acoustic) connections for seafloor platforms and moorings. This will allow the

  3. Who stole Nemo?

    Science.gov (United States)

    Thibodeau, Edward; Mentasti, Lauren

    2007-05-01

    Motion pictures have the ability to reach wide audiences and affect the perceptions and behaviors of the general public. Unfortunately, depictions of the dentist throughout cinematic history often have resulted in negative images and stereotypes. The authors set out to determine whether the motion picture industry's portrayal of dentists and the dental profession has changed in the past 100 years. Dentists are often still portrayed in the movies in a comedic role or as incompetent, sadistic, immoral, disturbed or corrupt. The only significant change in recent years has been the inclusion of historically underrepresented groups, such as African-Americans and women, cast in the role of dentist. While many hold dentists and the dental profession in high regard, millions of Americans still avoid dental care because of fear and anxiety. Countering the negative stereotypes of the dental profession as it is often portrayed in the cinema is a challenge that has yet to be addressed adequately.

  4. Scale models: A proven cost-effective tool for outage planning

    Energy Technology Data Exchange (ETDEWEB)

    Lee, R. [Commonwealth Edison Co., Morris, IL (United States); Segroves, R. [Sargent & Lundy, Chicago, IL (United States)

    1995-03-01

    As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation, planning, and monitoring of maintenance, modification, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.

  5. Surface Modeling of Workpiece and Tool Trajectory Planning for Spray Painting Robot

    Science.gov (United States)

    Tang, Yang; Chen, Wei

    2015-01-01

    Automated tool trajectory planning for spray-painting robots is still a challenging problem, especially for a large free-form surface. A grid approximation of a free-form surface is adopted in CAD modeling in this paper. A free-form surface model is approximated by a set of flat patches. We describe here an efficient and flexible tool trajectory optimization scheme using T-Bézier curves calculated in a new way from trigonometrical bases. The distance between the spray gun and the free-form surface along the normal vector is varied. Automotive body parts, which are large free-form surfaces, are used to test the scheme. The experimental results show that the trajectory planning algorithm achieves satisfactory performance. This algorithm can also be extended to other applications. PMID:25993663

  6. Transformation of Baumgarten's aesthetics into a tool for analysing works and for modelling

    DEFF Research Database (Denmark)

    Thomsen, Bente Dahl

    2006-01-01

      Abstract: Is this the best form, or does it need further work? The aesthetic object does not possess the perfect qualities; but how do I proceed with the form? These are questions that all modellers ask themselves at some point, and with which they can grapple for days - even weeks - before the inspiration to deliver the form finally presents itself. This was the outlet for our plan to devise a tool for analysing works and the practical development of forms. The tool is a set of cards with suggestions for investigations that may assist the modeller in identifying the weaknesses of the form, or convince him-/herself about its strengths. The cards also contain aesthetical reflections that may be of inspiration in the development of the form.

  7. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper presents browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU); it is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European and Australian institutes, focuses especially on developing an e-learning curriculum and interactive design and modeling tools, as well as on the development of a virtual laboratory. Snapshots from these two projects are presented.

  8. Finite Element Modelling of the effect of tool rake angle on tool temperature and cutting force during high speed machining of AISI 4340 steel

    Science.gov (United States)

    Sulaiman, S.; Roshan, A.; Ariffin, M. K. A.

    2013-12-01

    In this paper, a Finite Element Method (FEM) simulation, based on the ABAQUS/Explicit software with a Johnson-Cook material model, was used to simulate cutting force and tool temperature during high speed machining (HSM) of AISI 4340 steel. In this simulation work, tool rake angles ranging from 0° to 20° and cutting speeds between 300 and 550 m/min were investigated. The purpose of the analysis was to find the optimum tool rake angle, at which the cutting force is smallest and the tool temperature is lowest during high speed machining. Cutting forces were found to decrease as the rake angle increased in the positive direction. The optimum rake angle was observed between 10° and 18°, where the cutting force decreased by about 20% for all simulated cutting speeds. In addition, increasing the tool rake angle beyond its optimum value had a negative influence on the tool's performance and led to an increase in cutting temperature. The results give a better understanding and recognition of cutting tool design for high speed machining processes.

  9. Reducing the operational energy demand in buildings using building information modeling tools and sustainability approaches

    OpenAIRE

    Shoubi, Mojtaba Valinejad; Shoubi, Masoud Valinejad; Bagchi, Ashutosh; Barough, Azin Shakiba

    2015-01-01

    A sustainable building is constructed of materials that could decrease environmental impacts, such as energy usage, during the lifecycle of the building. Building Information Modeling (BIM) has been identified as an effective tool for building performance analysis virtually in the design stage. The main aims of this study were to assess various combinations of materials using BIM and identify alternative, sustainable solutions to reduce operational energy consumption. The amount of energy con...

  10. Graphical surface-vegetation-atmosphere transfer (SVAT) model as a pedagogical and research tool

    OpenAIRE

    Gillies, Robert R.; Carlson, Toby N.; Ripley, David A.J.

    1998-01-01

    This paper considers, by example, the use of a Surface-Vegetation-Atmosphere Transfer (SVAT), Atmospheric Boundary Layer (ABL) model designed as a pedagogical tool. The goal of the computer software and the approach is to improve the efficiency and effectiveness of communicating often complex, mathematically based disciplines (e.g., micrometeorology, land surface processes) to the non-specialist interested in studying problems involving interactions between vegetation and the atmosphere and,...

  11. Army Sustainability Modelling Analysis and Reporting Tool Phase 1: User Manual and Results Interpretation Guide

    Science.gov (United States)

    2009-11-01

    [Only search-result fragments of this report are available. Glossary excerpt: Force Sustainability Modelling Tool Prototype; GB (Gigabyte); GRES (General Reserve); HQ (Headquarters); HTA (Hardening the Army); JOLTS (Joint Operational...). Body-text fragments: the tool was applied to the Hardening the Army (HTA) proposed force structure; following this work, the Director General Preparedness and Plans - Army (DGPP-A) approached DSTO to...; "that the different elements of the results for the corps have been identified, we can turn our attention to what the results say about the..."]

  12. Teaching Integrated Scope-Cost Methods with Model-based Tools

    OpenAIRE

    Peterson, Forest; Fischer, Martin; Wingate, Thomas; Seppänen, Olli; Tutti, Tomi; See, Richard

    2009-01-01

    The purpose of this paper is to outline teaching integrated scope-cost methods in a course on fabrication and construction planning using model-based tools. Through project-based active discovery using project documents students create an integrated takeoff, schedule and cost estimate. The goal is to illustrate the processes and interrelation between professions required to effectively obtain the scope, schedule and cost of a proposed project. Students who are provided with a scope-time-cost ...

  13. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  14. Towards a Tool-Supported Quality Model for Model-Driven Engineering

    OpenAIRE

    Mohagheghi, Parastoo

    2008-01-01

    This paper reviews definitions of model quality before introducing five properties of models that are important for building high-quality models. These are identified to be correctness, completeness, consistency, comprehensibility and confinement. We have earlier defined a quality model that separates intangible quality goals from tangible quality-carrying properties and practices that should be in place to support these properties.  A part of that work was to define a metamodel for deve...

  15. Using explanatory models to derive simple tools for Advanced Life Support system studies - Crop Modelling

    Science.gov (United States)

    Cavazzoni, J.

    System-level analyses for Advanced Life Support (ALS) require mathematical models for various processes, such as biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these are based on an understanding of processes specific to ALS studies. However, integrating such models may not always be practicable because of their complexity, especially for initial system-level analyses where simple sub-models may be satisfactory. One way to address this is to capture important features of explanatory models in simple models that may be readily integrated for system-level analyses. In this paper, explanatory crop models were used to generate parameters and multi-variable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling due to off-nominal conditions for ALS system studies. The simplest variant of these models consists of only a few equations, and has been integrated into a top-level SIMULINK model for the Bioregenerative Planetary Life Support Systems Test Complex (BIO-Plex), a large-scale human-rated test facility under development at NASA Johnson Space Center. When included in system studies, the simple crop models may help identify issues that need to be addressed using more detailed modeling studies and specific experiments. Similar modeling simplifications may also prove useful for other ALS sub-systems, as well as for Earth system applications.
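
    As a hedged illustration of the surrogate-model idea (the explanatory-model outputs below are fabricated for the example; the actual models use multi-variable polynomials), fitting a simple polynomial to canopy gas-exchange responses generated by a detailed crop model might look like:

```python
import numpy as np

# Hypothetical training data: daily canopy net photosynthesis (mol CO2/m^2/d)
# produced by a detailed explanatory crop model at several temperatures (C).
temp = np.array([16, 20, 24, 28, 32])
photo = np.array([0.28, 0.36, 0.41, 0.39, 0.31])

# Fit a low-order polynomial surrogate suitable for system-level integration.
coeffs = np.polyfit(temp, photo, deg=2)
surrogate = np.poly1d(coeffs)

# Estimate direction and magnitude of change under an off-nominal excursion.
print(surrogate(26) - surrogate(24))  # change in daily canopy gas exchange
```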

  16. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Science.gov (United States)

    Gupta, Saurabh; Black-Schaffer, W. Stephen; Crawford, James M.; Gross, David; Karcher, Donald S.; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B.; Wheeler, Thomas M.; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B.

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models. PMID:28725751

  17. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Directory of Open Access Journals (Sweden)

    Saurabh Gupta BPharm

    2015-10-01

    Full Text Available Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models.

  18. Mathematical Modeling: A Tool for Optimization of Lipid Nanoparticle-Mediated Delivery of siRNA.

    Science.gov (United States)

    Mihaila, Radu; Ruhela, Dipali; Keough, Edward; Cherkaev, Elena; Chang, Silvia; Galinski, Beverly; Bartz, René; Brown, Duncan; Howell, Bonnie; Cunningham, James J

    2017-06-16

    Lipid nanoparticles (LNPs) have been used to successfully deliver small interfering RNAs (siRNAs) to target cells in both preclinical and clinical studies and currently are the leading systems for in vivo delivery. Here, we propose the use of an ordinary differential equation (ODE)-based model as a tool for optimizing LNP-mediated delivery of siRNAs. As a first step, we have used a combination of experimental and computational approaches to develop and validate a mathematical model that captures the critical features for efficient siRNA-LNP delivery in vitro. This model accurately predicts mRNA knockdown resulting from novel combinations of siRNAs and LNPs in vitro. As demonstrated, this model can be effectively used as a screening tool to select the most efficacious LNPs, which can then further be evaluated in vivo. The model serves as a starting point for the future development of next generation models capable of capturing the additional complexity of in vivo delivery. Copyright © 2017 Elena Cherkaev, Merck Sharp & Dohme Corp., a subsidiary of Merck & Co., Inc., Kenilworth, NJ USA. Published by Elsevier Inc. All rights reserved.
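
    The authors' actual ODE system is not given in the abstract. Purely as a schematic sketch (the three-compartment structure and all rate constants below are assumptions, not the published model), an ODE description of LNP uptake, cytosolic siRNA release and mRNA knockdown could be prototyped as:

```python
import numpy as np
from scipy.integrate import solve_ivp

def lnp_model(t, y, k_uptake, k_escape, k_deg, k_kd, k_syn, k_m):
    """Toy three-compartment model: extracellular LNP, cytosolic siRNA, mRNA."""
    lnp, sirna, mrna = y
    d_lnp = -k_uptake * lnp
    d_sirna = k_escape * k_uptake * lnp - k_deg * sirna
    d_mrna = k_syn - k_m * mrna - k_kd * sirna * mrna  # RISC-mediated knockdown
    return [d_lnp, d_sirna, d_mrna]

# mRNA starts at its untreated steady state k_syn / k_m = 1.0 (normalized).
params = (0.5, 0.3, 0.2, 2.0, 0.1, 0.1)
sol = solve_ivp(lnp_model, (0, 72), [1.0, 0.0, 1.0], args=params,
                t_eval=np.linspace(0, 72, 145))
print(f"mRNA remaining at 72 h: {sol.y[2, -1]:.2f}")
```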

  19. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model, allowing for the description of intricate micro-scale structures and enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI) recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.

  20. The mesoscale dispersion modeling system a simulation tool for development of an emergency response system

    International Nuclear Information System (INIS)

    Uliasz, M.

    1990-01-01

    The mesoscale dispersion modeling system is under continuous development. The included numerical models require further improvements and evaluation against data from meteorological and tracer field experiments. The system cannot be directly applied to real-time predictions. However, it seems to be a useful simulation tool for solving several problems related to planning the monitoring network and developing the emergency response system for a nuclear power plant located in a coastal area. The modeling system can also be applied to other environmental problems connected with air pollution dispersion in complex terrain. The presented numerical models are designed for use on personal computers and are relatively fast in comparison with similar mesoscale models developed on mainframe computers.

  1. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way, so they can be used to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually; they were not conceived as artefacts to be executed directly. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. We introduce a continuous tool chain beginning at the design phase and ending in the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  2. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and demonstrate, a unique set of web application tools linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  3. A remote sensing computer-assisted learning tool developed using the unified modeling language

    Science.gov (United States)

    Friedrich, J.; Karslioglu, M. O.

    The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be a very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are being used. After introducing the constructed UML model, its implementation is briefly described followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the author's institution.

  4. A sensitivity driven meta-model optimisation tool for hydrological models

    Science.gov (United States)

    Oppel, Henning; Schumann, Andreas

    2017-04-01

    The calibration of rainfall-runoff models containing a high number of parameters can be done readily using different calibration methods and algorithms. Monte Carlo methods, gradient-based search algorithms and others are well known and established in the hydrological sciences. Thus, calibrating a model for a desired application is not a challenging task, but retaining regional comparability and process integrity is, due to the equifinality problem, a prevailing topic. This set of issues is mainly a result of the overdetermination caused by the high number of parameters in rainfall-runoff models, where different parameters affect the same facet of model performance (i.e., runoff volume, variance and timing). In this study a calibration strategy is presented which considers model sensitivity as well as parameter interaction and different criteria of model performance. First, a range of valid values for each model parameter was defined, and the individual effect on model performance within the defined parameter range was evaluated. Using this knowledge, a meta-model was established that lumps together different parameters affecting the same facet of model performance. Hereafter, the parsimonious meta-model, in which each parameter is assigned to a nearly disjoint facet of model performance, is optimized. By retransformation of the lumped parameters to the original model, a parametrisation for the original model is obtained. An application of this routine to a set of watersheds in the eastern part of Germany displays the benefits of the routine. Results of the meta-parametrised model are compared to parametrisations obtained from common calibration routines in a validation study and a process-oriented numerical experiment.

  5. The role of measurement and modelling of machine tools in improving product quality

    Directory of Open Access Journals (Sweden)

    Longstaff A.P.

    2013-01-01

    Full Text Available Manufacturing of high-quality components and assemblies is clearly recognised by industrialised nations as an important means of wealth generation. A "right first time" paradigm for producing finished components is the desirable goal to maximise economic benefits and reduce environmental impact. Such an ambition is only achievable through an accurate model of the machinery used to shape the finished article. In the first analysis, computer aided design (CAD) and computer aided manufacturing (CAM) can be used to produce an instruction list of three-dimensional coordinates and intervening tool paths to translate the intent of a design engineer into an unambiguous set of commands for a manufacturing machine. However, in order for the resultant manufacturing program to produce the desired output within the specified tolerance, the model of the machine has to be sufficiently accurate. In this paper, the spatial and temporal sources of error and various contemporary means of modelling are discussed. Limitations and assumptions in the models are highlighted and an estimate of their impact is made. Measurement of machine tools plays a vital role in establishing the accuracy of a particular machine and calibrating its unique model, but is an often misunderstood and misapplied discipline. Typically, the individual errors of the machine will be quantified at a given moment in time, but without sufficient consideration either for the uncertainty of individual measurements or a full appreciation of the complex interaction between each independently measured error. This paper draws on the concept of a "conformance zone", as specified in ISO 230-1:2012, to emphasise the need for a fuller understanding of the complex measurement-uncertainty model for a machine tool. Work towards closing the gap in this understanding is described and limitations are noted.

  6. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    Science.gov (United States)

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resources efficiently is a task to which efficient planning and policy design aspire. This may be a non-trivial task. For example, the seventh Sustainable Development Goal (SDG) of Agenda 2030 is to provide access to affordable, sustainable energy for all. On the one hand, energy is required to realise almost all other SDGs (a clinic requires electricity for fridges to store vaccines for maternal health, irrigated agriculture requires energy to pump water to crops in dry periods, etc.). On the other hand, the energy system is non-trivial: it requires the mapping of resources and their conversion into useable energy and then into the machines that we use to meet our needs. That calls for new tools that draw from standard techniques and best-in-class models, and allow the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage and run linear programming models, and to store their input and results data. MoManI is a browser-based, open source interface for systems modelling, available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model would represent the current energy system of a country, region or city; in it, equations and constraints are specified and calibrated to a base year. From that base, future technologies and policy options are represented, scenarios are designed and run, and the efficient allocation of energy resources and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity building activities. The novel
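
    OSeMOSYS models ultimately reduce to linear programs. As a minimal, hypothetical sketch (two invented technologies meeting one demand, with made-up costs and capacity limits; this is not an actual OSeMOSYS formulation), the class of problem MoManI manages could look like:

```python
from scipy.optimize import linprog

# Decision variables: energy supplied (GWh) by two hypothetical technologies.
cost = [40.0, 55.0]            # production cost per GWh

# Meet a demand of 100 GWh: x1 + x2 >= 100  ->  -x1 - x2 <= -100
A_ub = [[-1.0, -1.0]]
b_ub = [-100.0]

# Capacity limits per technology (GWh).
bounds = [(0, 70), (0, 80)]

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, res.fun)  # optimal dispatch and minimum total cost
```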

  7. Software tools for 3d modeling as a part of design and technology in primary school

    OpenAIRE

    Mihovec, Nastja

    2013-01-01

    There are numerous programs that enable 3D modeling. We can choose between various free programs and commercial ones that must be paid for. Many designers and engineers use paid programs such as AutoCad, Maya, ProEngineer, Cinema 3D, SolidWorks, etc. In their opinion these programs give their users more than the free ones, mainly because of their better modeling quality, tools, functions, ease of use, support, maintenance, etc. Free program developers try very hard to convince these users to reconsider,...

  8. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use, and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to existing resource optimization tools, or of new ones. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

  9. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  10. Translating statistical species-habitat models to interactive decision support tools.

    Directory of Open Access Journals (Sweden)

    Lyndsie S Wszola

    Full Text Available Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  11. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    Science.gov (United States)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and the resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool capable of linking and overseeing the operations of two existing models: the Water Evaluation and Planning (WEAP) water resource planning model and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision
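
    MMW itself is VBA inside Excel. Purely to illustrate the middleware pattern in more portable terms (the executables, flags and CSV hand-off file below are hypothetical stand-ins for the WEAP/WEAPhish coupling), a system-level orchestration of two models might be sketched as:

```python
import subprocess
import csv

# Hypothetical executables and exchange file standing in for WEAP/WEAPhish.
HYDRO_CMD = ["weap_runner.exe", "--scenario", "butte_creek"]
FISH_CMD = ["weaphish_runner.exe", "--flows", "flows.csv"]

def run_chain():
    # Step 1: run the water-resources model; it writes simulated flows.
    subprocess.run(HYDRO_CMD, check=True)

    # Step 2: sanity-check the hand-off file before launching the fish model.
    with open("flows.csv", newline="") as f:
        n_rows = sum(1 for _ in csv.reader(f))
    if n_rows < 2:
        raise RuntimeError("hydrology model produced no flow records")

    # Step 3: run the fish population model on the simulated flows.
    subprocess.run(FISH_CMD, check=True)

if __name__ == "__main__":
    run_chain()
```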

  12. NOTATION TOOLS OF BUSINESS MODELING OF THE SERVICES ON REAL ESTATE MARKET

    Directory of Open Access Journals (Sweden)

    Mishlanova Marina Yur’evna

    2016-04-01

    Full Text Available The article is devoted to developing the main provisions of realtor business modeling. The paper presents the development of a notational complex involved in the design of the conceptual model, the formation of a reference model of the real estate business, and the basic rules for implementing the model. Important notational aspects of the proposed model are highlighted. The functional orientation of the real estate business toward rendering services reflects a functional approach to business modeling. In order to support the assessment of the offered services, a nested model of the object is proposed. A reasonable functional approach using object-based elements allows optimizing the processes of business modeling and assessing the results. The article discusses functional modeling of business, focusing on results. Synchronizing the functional model with the models of business processes and sub-models of objects, in particular the model of the business result, contributes to the improvement of the notation tools. The article presents the adaptation of a business model template to the conditions of realtor activity. The proposed reference model specifies a logical scheme of activity decomposition, which distinguishes economic, social and other values. The decomposition of services into functional groups, with account for individual values and functional modules, is presented: buying and selling real estate; mortgages and loans; rent of residential and commercial property; independent valuation of real estate; and consultations concerning real estate transactions. Focusing on the results of business processes and the performance standards of realtor organizations, a transitional notation toward a system for evaluating the efficiency of business performance is developed. The simplest feedback method for assessing customer satisfaction and, consequently, system efficiency is offered.

  13. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve systems of differential equations describing the transport and decay of radionuclides, and is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation-solving capability is derived from the Matlab/Simulink environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. These benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation-solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessment used in the earlier safety assessment project SR 97 were implemented. Results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the
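
    The SR 97 biosphere modules themselves are not reproduced here. As a generic sketch of the class of problem Tensit solves (a two-compartment radionuclide transfer chain with an invented transfer coefficient, half-life and parameter distribution), a probabilistic compartment simulation might look like:

```python
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA = np.log(2) / 30.0   # decay constant, 1/yr (30-year half-life assumed)

def compartments(t, y, k12):
    """Soil -> water transfer with radioactive decay in both compartments."""
    soil, water = y
    return [-(k12 + LAMBDA) * soil, k12 * soil - LAMBDA * water]

rng = np.random.default_rng(1)
inventories = []
for _ in range(1000):                      # probabilistic (Monte Carlo) runs
    k12 = rng.lognormal(mean=np.log(0.05), sigma=0.5)  # uncertain transfer rate
    sol = solve_ivp(compartments, (0, 100), [1.0, 0.0], args=(k12,))
    inventories.append(sol.y[1, -1])       # water inventory at 100 yr

print(f"median: {np.median(inventories):.3g}, "
      f"95th pct: {np.percentile(inventories, 95):.3g}")
```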

  14. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo; Goodsell, Mark D.; Harries, Dylan; Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby; Ubaldi, Lorenzo; Vicente, Avelino; Voigt, Alexander

    2016-01-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  15. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Athron, Peter [Monash Univ., Melbourne (Australia). ARC Center of Excellence for Particle Physics at the Terascale; Basso, Lorenzo [Aix-Marseille Univ., CNRS-IN2P3, UMR 7346 (France). CPPM; and others

    2016-02-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  16. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
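
    As a generic illustration of the discrete-event core described above (the event types and handlers are invented; this is not the patented tool's code), a minimal simulator can be built around a time-ordered priority queue that is executed until empty:

```python
import heapq

class DiscreteEventSimulator:
    """Minimal discrete-event core: a time-ordered event queue that is
    executed until empty, as in the simulation module described above."""

    def __init__(self):
        self.time = 0.0
        self.queue = []          # heap of (time, seq, action)
        self._seq = 0            # tie-breaker for simultaneous events

    def schedule(self, delay, action):
        heapq.heappush(self.queue, (self.time + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self.queue:        # run until the event queue is emptied
            self.time, _, action = heapq.heappop(self.queue)
            action(self)

# Hypothetical continuous behavior discretized as invocation + time delay:
def valve_open(sim):
    print(f"t={sim.time:4.1f}: valve opens")
    sim.schedule(5.0, valve_close)   # effect statement after a time delay

def valve_close(sim):
    print(f"t={sim.time:4.1f}: valve closes")

sim = DiscreteEventSimulator()
sim.schedule(1.0, valve_open)
sim.run()
```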

  17. MESSI: An engineering tool for conceptual hydrological modeling using SUPERFLEX, MOSCEM and GLUE

    Science.gov (United States)

    van Osnabrugge, Bart; Mondeel, Herman; Hrachowitz, Markus

    2014-05-01

    The progress of hydrology as a science is mentioned quite often, and indeed much theoretical research is devoted to improving hydrological rainfall-runoff (RR) modeling. At the same time, however, engineering practice is generally concluded to lag behind this scientific progress by at least a couple of years. In this research, we investigate how this gap can be closed. An engineering tool called Model Ensemble, Sampling, Selection and Interpretation (MESSI) is developed and tested in an engineering environment. The tool uses the model hypothesis framework SUPERFLEX to build an a priori ensemble of possible model structures for the case at hand. Then, the Multi-objective Shuffled Complex Evolution Metropolis algorithm (MOSCEM) is used to sample the parameter space. Finally, the Generalized Likelihood Uncertainty Estimation (GLUE) methodology is used to select a posterior ensemble, which is then interpreted using the Pareto front and the generated uncertainty bounds. During the trial it was found that MESSI provides a plug-and-play method able to deliver catchment process information, a mathematically optimal model, and a measure of uncertainty based on the observations. Most importantly, it is shown that with little effort new techniques can be brought directly to the engineering arena, improving the interaction between the scientist and the engineer.
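
    As a schematic of the GLUE step named here (the toy model, the Nash-Sutcliffe likelihood measure and the behavioral threshold are assumptions for illustration, not MESSI's configuration), selecting a posterior ensemble from Monte Carlo samples could look like:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def toy_model(rain, k):
    """Toy linear-reservoir runoff model: outflow is a fraction k of storage."""
    storage, flow = 0.0, []
    for r in rain:
        storage += r
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 2.0, size=200)
obs = toy_model(rain, k=0.3) + rng.normal(0, 0.2, size=200)  # synthetic "obs"

# GLUE: sample parameters, keep the behavioral set above a likelihood cutoff.
samples = rng.uniform(0.05, 0.95, size=5000)
likelihoods = np.array([nash_sutcliffe(obs, toy_model(rain, k)) for k in samples])
behavioral = samples[likelihoods > 0.7]      # assumed behavioral threshold

print(len(behavioral), behavioral.min(), behavioral.max())
```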

  18. SModelS: A Tool for Making Systematic Use of Simplified Models Results

    Science.gov (United States)

    Waltenberger, Wolfgang; SModelS Group

    2016-10-01

    We present an automated software tool "SModelS" to systematically confront theories Beyond the Standard Model (BSM) with experimental data. The tool consists of a general procedure to decompose such BSM theories into their Simplified Models Spectra (SMS). In addition, SModelS features a database containing the majority of the published SMS results of CMS and ATLAS. These results consist of the 95% confidence level upper limits on signal production cross sections. The two components together allow us to quickly confront any BSM model with LHC results. As a show-case example we briefly discuss an application of our procedure to a specific supersymmetric model. One of our ongoing efforts is to extend the framework to also include efficiency maps produced by the experimental collaborations, by efforts within the phenomenological groups, or possibly by ourselves. While the current implementation can handle null results only, our ultimate goal is to build the Next Standard Model in a bottom-up fashion from both negative and positive results of several experiments. The implementation is open source, written in python, and available from http://smodels.hephy.at.
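
    The essential check SModelS performs for each simplified-model topology is a comparison of the predicted signal cross section against the published 95% CL upper limit. A minimal sketch of that logic (the numbers below are invented) is:

```python
def r_value(sigma_predicted_fb, sigma_upper_limit_fb):
    """Exclusion ratio in the SModelS sense: r > 1 means the model point
    is excluded by that simplified-model result at 95% CL."""
    return sigma_predicted_fb / sigma_upper_limit_fb

# Hypothetical topology: predicted vs. published upper-limit cross section.
r = r_value(sigma_predicted_fb=12.0, sigma_upper_limit_fb=8.5)
print(f"r = {r:.2f} -> {'excluded' if r > 1 else 'allowed'}")
```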

  19. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  20. ARCHITECTURAL FORM CREATION IN THE DESIGN STUDIO: PHYSICAL MODELING AS AN EFFECTIVE DESIGN TOOL

    Directory of Open Access Journals (Sweden)

    Wael Abdelhameed

    2011-11-01

    Full Text Available This research paper attempts to shed more light on an area of the design studio that concerns the use of physical modeling as a design medium in architectural form creation. An experiment was carried out during an architectural design studio in order not only to investigate physical modeling as a tool of form creation but also to improve the visual design thinking that students employ while using this manual tool. To achieve the research objective, a method was proposed and applied to track form creation processes, based upon three types of operation, namely: sketching transformations, divergent physical-modeling transformations, and convergent physical-modeling transformations. The method helps record the innovative transitions of form during conceptual designing in a simple way. Investigating form creation processes and the activities associated with visual design thinking enables the research to draw general conclusions about the role of physical modeling in the conceptual phase of designing, and specific conclusions about the methods used in this architectural design studio experiment.