WorldWideScience

Sample records for modeling tool nemo

  1. NEMO Oceanic Model Optimization

    Science.gov (United States)

    Epicoco, I.; Mocavero, S.; Murli, A.; Aloisio, G.

    2012-04-01

NEMO is an oceanic model used by the climate community for stand-alone or coupled experiments. Its parallel implementation, based on MPI, limits the exploitation of emerging computational infrastructures at peta- and exascale due to the weight of communications. As a case study we considered the MFS configuration developed at INGV, with a resolution of 1/16° tailored to the Mediterranean Basin. The work focuses on the analysis of the code on the MareNostrum cluster and on the optimization of critical routines. The first performance analysis of the model aimed at establishing how much the computational performance is influenced by the GPFS file system versus the local disks, and which domain decomposition is best. The results highlight that the exploitation of local disks can reduce the wall clock time by up to 40% and that the best performance is achieved with a 2D decomposition when the local domain has a square shape. A deeper performance analysis highlights that the obc_rad, dyn_spg and tra_adv routines are the most time consuming. The obc_rad routine implements the evaluation of the open boundaries and was the first to be optimized. Its communication pattern has been redesigned: before the optimizations all processes were involved in the communication, but only the processes on the boundaries hold data that actually needs to be exchanged, and only the data on the boundaries must be exchanged. Moreover, the data along the vertical levels are "packed" and sent with only one MPI_Send invocation. The overall efficiency increases compared with the original version, as does the parallel speed-up. The execution time was reduced by about 33.81%. The second phase of optimization involved the SOR solver routine, which implements the Red-Black Successive Over-Relaxation method. The high frequency of data exchanges among processes accounts for most of the overall communication time.
The number of communication is
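The Red-Black SOR scheme mentioned above colours the grid like a checkerboard so that each half-sweep updates only mutually independent points, which is what makes the method parallelizable (each colour needs just one halo exchange). A minimal serial sketch for a 2-D Poisson problem; grid size, over-relaxation factor and tolerance are illustrative, and this is not the NEMO solver itself:

```python
import numpy as np

def red_black_sor(rhs, omega=1.7, tol=1e-8, max_iter=10_000):
    """Red-Black SOR for the 5-point Poisson problem laplacian(phi) = rhs
    on a unit-spaced grid with zero Dirichlet boundaries."""
    phi = np.zeros_like(rhs, dtype=float)
    ny, nx = phi.shape
    jj, ii = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    interior = (jj > 0) & (jj < ny - 1) & (ii > 0) & (ii < nx - 1)
    for it in range(max_iter):
        max_delta = 0.0
        for color in (0, 1):                      # red sweep, then black sweep
            mask = interior & ((ii + jj) % 2 == color)
            # Gauss-Seidel target value from the four neighbours
            gs = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                         np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - rhs)
            new = (1.0 - omega) * phi + omega * gs
            max_delta = max(max_delta, np.abs(new - phi)[mask].max())
            phi[mask] = new[mask]                 # update one colour in place
        if max_delta < tol:
            break
    return phi, it
```

In a distributed version, each colour's sweep would be followed by a single halo exchange of boundary rows/columns, which is exactly where the communication frequency discussed in the abstract arises.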

  2. NEMO-Nordic: A NEMO-based ocean modelling configuration for the Baltic & North Seas

    Science.gov (United States)

    Hordoir, Robinson; Schimanke, Semjon; Axell, Lars; Gröger, Matthias; Dieterich, Christian; Liu, Ye; Höglund, Anders; Kuznetsov, Ivan; Ljungemyr, Patrik; Nygren, Petter; Jönsson, Anette; Meier, Markus

    2015-04-01

    Based on the NEMO ocean engine, three regional setups for the North Sea and Baltic Sea domain have been developed: the NEMO-Nordic configuration comes as an operational setup, a stand-alone version used for climate and process studies, and a NEMO-Nordic-RCA4 atmosphere/ocean coupled configuration used for downscaling climate scenarios. We give a brief overview of the options chosen within the NEMO engine to design the configurations. Based on the results provided by each of the three configurations, we also provide an assessment of the strengths and weaknesses of NEMO-Nordic. Finally, a validation of the configurations is provided, based on an extensive comparison between in-situ measurements and model results for temperature, salinity, sea-ice extent, sea level and mean circulation.

  3. Nanoelectronic Modeling (NEMO): Moving from commercial grade 1-D simulation to prototype 3-D simulation

    Science.gov (United States)

    Klimeck, Gerhard

    2001-03-01

    The quantum mechanical functionality of commercially pursued heterostructure devices such as resonant tunneling diodes (RTDs), quantum well infrared photodetectors, and quantum well lasers is enabled by material variations on an atomic scale. The creation of these heterostructure devices is realized in a vast design space of material compositions, layer thicknesses and doping profiles. The full experimental exploration of this design space is unfeasible and a reliable design tool is needed. The Nanoelectronic Modeling tool (NEMO) is one of the first commercial-grade attempts at such a modeling tool. NEMO was developed as a general-purpose quantum-mechanics-based 1-D device design and analysis tool from 1993-97 by the Central Research Laboratory of Texas Instruments (later Raytheon Systems). NEMO enables the fundamentally sound inclusion of the required physics: bandstructure, scattering, and charge self-consistency based on the non-equilibrium Green function approach (R. Lake, G. Klimeck, R. C. Bowen, and D. Jovanovic, J. Appl. Phys. 81, 7845 (1997); G. Klimeck et al., 1997 55th Annual Device Research Conference Digest, IEEE, NJ, 1997, p. 92; R. C. Bowen et al., J. Appl. Phys. 81, 3207 (1997)). A new class of devices which require full 3-D quantum-mechanics-based models is starting to emerge: quantum dots, or in general semiconductor-based deca-nano devices. We are currently building a 3-D modeling tool based on NEMO to include the physics needed to understand electronic states in such superscaled structures. This presentation will overview various facets of the NEMO 1-D tool, such as electron transport physics in RTDs, numerical technology, software engineering and the graphical user interface. The lessons learned from that work are now entering the NEMO 3-D development, and first results from the NEMO 3-D prototype will be shown. More information about the publicly available NEMO 1-D executables can be found at http://hpc.jpl.nasa.gov/PEP/gekco/nemo
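To give a flavour of the 1-D coherent transport such tools compute, here is a textbook transfer-matrix calculation of transmission through a double-barrier (RTD-like) potential. This is the elementary piecewise-constant-potential method with ħ = m = 1 and illustrative barrier parameters, not the NEGF machinery that NEMO actually implements:

```python
import numpy as np

def transmission(E, interfaces, potentials):
    """Transmission |t|^2 through piecewise-constant potentials (hbar = m = 1).
    interfaces: sorted positions x_1..x_n; potentials: V_0..V_n for the n+1
    regions; the two outer leads must share the same potential."""
    ks = np.sqrt(2.0 * (E - np.asarray(potentials)) + 0j)  # complex under barriers
    M = np.eye(2, dtype=complex)
    for x, k1, k2 in zip(interfaces, ks[:-1], ks[1:]):
        r = k1 / k2
        # matching matrix from continuity of psi and psi' at x
        m = 0.5 * np.array(
            [[(1 + r) * np.exp(1j * (k1 - k2) * x),
              (1 - r) * np.exp(-1j * (k1 + k2) * x)],
             [(1 - r) * np.exp(1j * (k1 + k2) * x),
              (1 + r) * np.exp(-1j * (k1 - k2) * x)]])
        M = m @ M                      # compose interfaces left to right
    rfl = -M[1, 0] / M[1, 1]           # no wave incident from the right
    t = M[0, 0] + M[0, 1] * rfl
    return abs(t) ** 2

# double barrier: height 0.3, width 2, well width 6 (illustrative values)
interfaces = [0.0, 2.0, 8.0, 10.0]
potentials = [0.0, 0.3, 0.0, 0.3, 0.0]
energies = np.linspace(0.01, 0.29, 801)
T = np.array([transmission(E, interfaces, potentials) for E in energies])
```

Scanning the energy reveals the resonant transmission peak through the quasi-bound well state, the signature RTD behaviour the abstract refers to.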

  4. Towards petascaling of the NEMO ocean model

    Science.gov (United States)

    Donners, J.; Audiffren, N.; Molines, J.-M.

    2012-04-01

    PRACE, the Partnership for Advanced Computing in Europe, offers access to the largest high-performance computing systems in Europe. These systems follow the trend of increasing numbers of nodes, each with an increasing number of cores. To utilize these computing systems, it is necessary to use a model that is parallelized and scales well. This poster describes different efforts to improve the scalability of the NEMO ocean model. Most importantly, the problem size needs to be chosen adequately: it should contain enough computation to keep thousands of cores busy, but foremost it has to be scientifically relevant. The global, 1/12°, NEMO ocean model configuration, developed by the Mercator team, is used for operational ocean forecasting; PRACE therefore selected this model for the PRACE benchmarking suite. However, an increased problem size alone was not enough to use these petascale systems efficiently, and different optimizations were required to reach the necessary performance. Scientifically, the model should simulate one year within a wallclock day; technically, the application needs to scale up to a minimum number of cores. For example, to utilize the fastest system in Europe, the new Curie system in France, the lower limit is 2048 cores. Scalability can be increased by minimizing the time needed for communication between cores. This has been done in two ways. Firstly, advanced parameters of the MPI communication library were optimized: (1) using RDMA for eager messages (NEMO message sizes are below the eager size limit) combined with appropriate openib flags, and (2) tuning OpenMPI collective communication through the btl_coll_tuned_dynamic_rules flag. Overall, the improvement is 33%. Secondly, NEMO uses a tri-polar and staggered grid, which involves a complicated fold across the north pole.
Communication along this fold involves collective gather and scatter operations which create a bottleneck at a single core, so
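One way to relieve such a single-core bottleneck is to observe that only the processes owning the northernmost row of subdomains hold fold data, and that under the 180° rotation each of them has exactly one mirror partner it could exchange with directly, instead of everyone gathering at a root. A schematic rank mapping for a row-major jpni × jpnj decomposition; the index conventions are illustrative, not NEMO's actual layout:

```python
def fold_partner(rank, jpni, jpnj):
    """For a jpni x jpnj domain decomposition with row-major rank numbering
    (rank = j * jpni + i, row jpnj-1 at the north), return the rank holding
    the mirror segment across the tri-polar fold, or None if this rank does
    not touch the northern boundary. The fold maps column i to jpni-1-i."""
    j, i = divmod(rank, jpni)
    if j != jpnj - 1:
        return None                       # no fold data on this process
    return (jpnj - 1) * jpni + (jpni - 1 - i)
```

Pairwise exchanges between these partners replace the collective gather/scatter, so the message volume per process stays bounded as the core count grows.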

  5. Spectral modeling of scintillator for the NEMO-3 and SuperNEMO detectors

    CERN Document Server

    Argyriades, J; Augier, C; Baker, J; Barabash, A S; Bongrand, M; Broudin-Bay, G; Brudanin, V B; Caffrey, A J; Cebrián, S; Chapon, A; Chauveau, E; Dafni, Th; Daraktchieva, Z; Díaz, J D; Durand, D; Egorov, V G; Evans, J J; Fatemi-Ghomi, N; Flack, R; Basharina-Freshville, A; Fushimi, K-I; Garrido, X; Gómez, H; Guillon, B; Holin, A; Holy, K; Horkey, J J; Hubert, Ph; Hugon, C; Iguaz, F J; Irastorza, I G; Ishihara, N; Jackson, C M; Jullian, S; Kanamaru, S; Kauer, M; Kochetov, O I; Konovalov, S I; Kovalenko, V E; Lalanne, D; Lemière, Y; Lutter, G; Luzón, G; Mamedov, F; Marquet, Ch; Martin-Albo, J; Mauger, F; Monrabal, F; Nachab, A; Nasteva, I; Nemchenok, I B; Nguyen, C H; Nova, F; Novella, P; Ohsumi, H; Pahlka, R B; Perrot, F; Piquemal, F; Povinec, P P; Richards, B; Ricol, J S; Riddle, C L; Rodriguez, A; Saakyan, R; Sarazin, X; Sedgbeer, J K; Serra, L; Simard, L; Šimkovic, F; Shitov, Yu A; Smolnikov, A A; Soldner-Rembold, S; Štekl, I; Sugaya, Y; Sutton, C S; Szklarz, G; Tamagawa, Y; Thomas, J; Thompson, R; Timkin, V V; Tretyak, V I; Tretyak, Vl I; Umatov, V I; Vála, L; Vanyushin, I A; Vasiliev, R; Vorobel, V; Vylov, Ts; Waters, D; Yahlali, N; Žukauskas, A

    2010-01-01

    We have constructed a GEANT4-based detailed software model of photon transport in plastic scintillator blocks and have used it to study the NEMO-3 and SuperNEMO calorimeters employed in experiments designed to search for neutrinoless double beta decay. We compare our simulations to measurements using conversion electrons from a calibration source of $^{207}$Bi and show that the agreement is improved if wavelength-dependent properties of the calorimeter are taken into account. In this article, we briefly describe our modeling approach and results of our studies.

  6. Coarsening of physics for biogeochemical model in NEMO

    Science.gov (United States)

    Bricaud, Clement; Le Sommer, Julien; Madec, Gurvan; Deshayes, Julie; Chanut, Jerome; Perruche, Coralie

    2017-04-01

    Ocean mesoscale and submesoscale turbulence contribute to ocean tracer transport and to shaping the distribution of ocean biogeochemical tracers. Adequately representing tracer transport in ocean models therefore requires increasing model resolution so that the impact of ocean turbulence is properly accounted for. But due to supercomputer power and storage limitations, global biogeochemical models are not yet run routinely at eddying resolution. Still, because the "effective resolution" of eddying ocean models is much coarser than the physical model grid resolution, tracer transport can be reconstructed to a large extent by computing tracer transport and diffusion at a grid resolution close to the effective resolution of the physical model. This observation has motivated the implementation of a new capability in the NEMO ocean model (http://www.nemo-ocean.eu/) that allows the physical model and the tracer transport model to run at different grid resolutions. First, we present results obtained with this new capability applied to a synthetic age tracer in a global eddying model configuration in which ocean dynamics are computed at 1/4° resolution but tracer transport is computed at 3/4° resolution. The solution is compared to two reference setups: one at 1/4° resolution for both the physics and the passive tracer model, and one at 3/4° resolution for both. We discuss possible options for defining the vertical diffusivity coefficient for the tracer transport model based on information from the high-resolution grid, and we describe the impact of this choice on the distribution and the penetration of the age tracer. Second, we present results obtained by coupling the physics with the biogeochemical model PISCES, and we look at the impact of this methodology on the distribution and dynamics of selected tracers. The method described here can find applications in ocean forecasting, such as the Copernicus Marine
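The fine-to-coarse step underlying this approach can be illustrated by a conservative block average from the dynamics grid to the tracer grid. The sketch below uses a uniform 3×3 block, mirroring the 1/4° → 3/4° ratio; real NEMO grids are curvilinear with varying cell areas, which is why the average is area-weighted:

```python
import numpy as np

def coarsen(field, cell_area, factor=3):
    """Conservative coarsening: area-weighted mean over factor x factor blocks
    (e.g. a 1/4-degree dynamics field feeding a 3/4-degree tracer grid).
    Assumes array dimensions are multiples of `factor`."""
    ny, nx = field.shape
    fb = field.reshape(ny // factor, factor, nx // factor, factor)
    ab = cell_area.reshape(ny // factor, factor, nx // factor, factor)
    # weighted block sum divided by block area preserves the integral
    return (fb * ab).sum(axis=(1, 3)) / ab.sum(axis=(1, 3))
```

The area weighting guarantees that the tracer content integrated over the domain is identical on both grids, which is the property the coupling relies on.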

  7. OpenDA-NEMO framework for ocean data assimilation

    NARCIS (Netherlands)

    Van Velzen, C.; Altaf, M.U.; Verlaan, M.

    2016-01-01

    Data assimilation methods provide a means to handle the modeling errors and uncertainties in sophisticated ocean models. In this study, we have created an OpenDA-NEMO framework unlocking the data assimilation tools available in OpenDA for use with NEMO models. This includes data assimilation methods
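To illustrate the kind of update such a framework orchestrates, here is a minimal deterministic (square-root) ensemble Kalman analysis step for a directly observed scalar state. It is a sketch of the textbook algorithm under simplifying assumptions, not the OpenDA implementation, and the numbers are arbitrary:

```python
import numpy as np

def enkf_analysis(ensemble, obs, obs_var):
    """Square-root ensemble Kalman update for a scalar state observed
    directly (H = 1): shift the mean by the Kalman gain, shrink anomalies."""
    m = ensemble.mean()
    s2 = ensemble.var(ddof=1)              # ensemble (background) variance
    K = s2 / (s2 + obs_var)                # Kalman gain
    mean_a = m + K * (obs - m)             # analysis mean
    anomalies = (ensemble - m) * np.sqrt(1.0 - K)   # deterministic spread shrink
    return mean_a + anomalies

ens = np.array([12.1, 13.4, 11.8, 12.9, 13.0])
post = enkf_analysis(ens, obs=14.0, obs_var=0.5)
```

The analysis mean lands between the background mean and the observation, and the ensemble spread contracts, which is the qualitative behaviour any of the OpenDA ensemble methods would exhibit.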

  8. Performance Optimization of NEMO Oceanic Model at High Resolution

    Science.gov (United States)

    Epicoco, Italo; Mocavero, Silvia; Aloisio, Giovanni

    2014-05-01

    The NEMO oceanic model is based on the Navier-Stokes equations along with a nonlinear equation of state, which couples the two active tracers (temperature and salinity) to the fluid velocity. The code is written in Fortran 90 and parallelized using MPI. The resolution of the global ocean models used today for climate change studies limits the prediction accuracy. To overcome this limit, a new high-resolution global model, based on NEMO, simulating at 1/16° with 100 vertical levels has been developed at CMCC. The model is computationally and memory intensive, so it requires many resources to run, and an optimization activity is needed. The strategy requires a preliminary analysis to highlight scalability bottlenecks; this has been performed on a SandyBridge architecture at CMCC, where an efficiency of 48% on 7K cores (the maximum available) has been achieved. The analysis has also been carried out at routine level, so that improvement actions can be designed for the entire code or for a single kernel. The analysis highlighted, for example, a loss of performance due to the routine implementing the north fold algorithm (i.e. handling the points at the north pole of the three-pole grid): an optimization of the routine implementation is needed. The folding is achieved by considering only the last 4 rows at the top of the global domain and applying a rotation pivoting on the point in the middle. During the folding, the point at the top left is updated with the value of the point at the bottom right, and so on. The current version of the parallel algorithm is based on domain decomposition: each MPI process takes care of a block of points and can update its points using values belonging to the symmetric process. In the current implementation, each received message is placed in a buffer with a number of elements equal to the total dimension of the global domain.
Each process sweeps the entire buffer, but only a part of that computation is really useful for the
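The fold update itself can be sketched on a small array: the northernmost halo rows of the global domain are filled with a 180° rotation of the interior rows just below, pivoting about the midpoint of the fold row. Exact row and column offsets differ between NEMO grid-point types (T, U, V, F), so the indices below are an illustration only:

```python
import numpy as np

def apply_north_fold(field):
    """Schematic tri-polar fold: overwrite the two northernmost (halo) rows
    with a 180-degree rotation of the interior rows just below them.
    Offsets are illustrative, not NEMO's per-grid-point conventions."""
    f = field.copy()
    f[-1, :] = f[-4, ::-1]   # outermost halo row <- rotated interior row
    f[-2, :] = f[-3, ::-1]   # inner halo row    <- rotated interior row
    return f
```

Because only the few rows adjacent to the fold are involved, an optimized implementation needs buffers sized to those rows rather than to the whole global domain, which is the inefficiency the abstract identifies.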

  9. Adapting NEMO for use as the UK operational storm surge forecasting model

    Science.gov (United States)

    Furner, Rachel; Williams, Jane; Horsburgh, Kevin; Saulter, Andrew

    2016-04-01

    The United Kingdom is an area vulnerable to damage due to storm surges, particularly the East Coast, which suffered losses estimated at over £1 billion during the North Sea surge event of the 5th and 6th December 2013. Accurate forecasting of storm surge events for this region is crucial to enable government agencies to assess the risk of overtopping of coastal defences so they can respond appropriately, minimising risk to life and infrastructure. There has been an operational storm surge forecast service for this region since 1978, using a numerical model developed by the National Oceanography Centre (NOC) and run at the UK Met Office. This is also implemented as part of an ensemble prediction system, using perturbed atmospheric forcing to produce an ensemble surge forecast. In order to ensure efficient use of future supercomputer developments and to create synergy with existing operational coastal ocean models, the Met Office and NOC have begun a joint project transitioning the storm surge forecast system from the current CS3X code base to a configuration based on the Nucleus for European Modelling of the Ocean (NEMO). This work involves both adapting NEMO to add functionality, such as allowing ocean cells to dry out, and making changes that let NEMO run efficiently as a two-dimensional, barotropic model. As the ensemble surge forecast system is run with 12 members 4 times a day, computational efficiency is of high importance. Upon completion this project will enable interesting scientific comparisons to be made between a NEMO-based surge model and the full three-dimensional baroclinic NEMO-based models currently run within the Met Office, facilitating assessment of the impact of baroclinic processes and vertical resolution on sea surface height forecasts. Moving to a NEMO code base will also allow many future developments to be more easily used within the storm surge model, due to the wide range of options which currently exist within NEMO or are planned for
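In essence, a wetting-and-drying capability flags cells whose total water column falls below a minimum thickness and suppresses fluxes out of them. A toy sketch of that logic; the threshold, variable names and the cell-rather-than-face masking are illustrative simplifications, not the scheme adopted operationally:

```python
import numpy as np

def dry_mask(eta, bathy, h_min=0.05):
    """Flag cells whose total water column (surface elevation eta plus
    resting depth bathy, both in metres) is thinner than h_min."""
    return (eta + bathy) < h_min

def limit_fluxes(u, mask):
    """Zero velocities in dry cells - a crude stand-in for blocking
    fluxes across dry cell faces in a barotropic surge model."""
    u = np.asarray(u, dtype=float).copy()
    u[mask] = 0.0
    return u

eta = np.array([0.3, -0.9, -1.01])     # sea surface elevation (m)
bathy = np.array([2.0, 1.0, 1.02])     # resting water depth (m)
mask = dry_mask(eta, bathy)
```

A production scheme applies the test at velocity faces and conserves volume as cells re-wet, but the thin-column threshold above is the core idea.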

  10. Surface Wave Effects in the NEMO Ocean Model: Forced and Coupled Experiments

    CERN Document Server

    Breivik, Øyvind; Bidlot, Jean-Raymond; Balmaseda, Magdalena Alonso; Janssen, Peter A E M

    2015-01-01

    The NEMO general circulation ocean model is extended to incorporate three physical processes related to ocean surface waves, namely the surface stress (modified by growth and dissipation of the oceanic wave field), the turbulent kinetic energy flux from breaking waves, and the Stokes-Coriolis force. Experiments are done with NEMO in ocean-only (forced) mode and coupled to the ECMWF atmospheric and wave models. Ocean-only integrations are forced with fields from the ERA-Interim reanalysis. All three effects are noticeable in the extra-tropics, but the sea-state dependent turbulent kinetic energy flux yields by far the largest difference. This is partly because the control run has too vigorous deep mixing due to an empirical mixing term in NEMO. We investigate the relation between this ad hoc mixing and Langmuir turbulence and find that it is much more effective than the Langmuir parameterization used in NEMO. The biases in sea surface temperature as well as subsurface temperature are reduced, and the total oce...
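The Stokes-Coriolis force depends on the Stokes drift profile, which in the simplest monochromatic deep-water approximation decays exponentially with depth. A sketch with illustrative surface-drift and wavenumber values (real wave models integrate over the full spectrum):

```python
import numpy as np

def stokes_drift(z, u_s0=0.1, k=0.05):
    """Monochromatic deep-water Stokes drift: u_s(z) = u_s0 * exp(2 k z),
    with z <= 0 below the surface. u_s0 [m/s] and k [1/m] are illustrative."""
    return u_s0 * np.exp(2.0 * k * np.asarray(z))
```

With k = 0.05 m⁻¹ the e-folding depth is 1/(2k) = 10 m, showing why the wave-induced terms act mainly on the upper-ocean mixed layer discussed in the abstract.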

  11. Modelling turbulent vertical mixing sensitivity using a 1-D version of NEMO

    Directory of Open Access Journals (Sweden)

    G. Reffray

    2014-08-01

    Full Text Available Through two numerical experiments, a 1-D vertical model called NEMO1D was used to investigate physical and numerical turbulent-mixing behaviour. The results show that all the turbulent closures tested (k-l from Blanke and Delecluse, 1993, and two-equation Generic Length Scale closures from Umlauf and Burchard, 2003) are able to correctly reproduce the classical test of Kato and Phillips (1969) under favourable numerical conditions, while some solutions may diverge as the spatial and temporal discretization degrades. The performance of the turbulence models was then compared with data measured over a one-year period (mid-2010 to mid-2011) at the PAPA station, located in the North Pacific Ocean. The modelled temperature and salinity were in good agreement with the observations, with a maximum temperature error between −2 and 2 °C during the stratified period (June to October). However, the results also depend on the numerical conditions. The vertical RMSE varied, for different turbulent closures, from 0.1 to 0.3 °C during the stratified period and from 0.03 to 0.15 °C during the homogeneous period. This 1-D configuration at the PAPA station (called PAPA1D) is now available in NEMO as a reference configuration, including the input files and atmospheric forcing set described in this paper. Thus, all the results described can be recovered by downloading and launching PAPA1D. The configuration is described on the NEMO site (http://www.nemo-ocean.eu/Using-NEMO/Configurations/C1D_PAPA). This package is a good starting point for further investigation of vertical processes.
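The Kato and Phillips (1969) test has a commonly quoted empirical deepening law against which such 1-D closures are checked: the mixed layer deepens as h(t) ≈ 1.05 u* √(t/N₀) (coefficient after Price, 1979). A sketch with typical values for the friction velocity u* and initial buoyancy frequency N₀; treat both the coefficient and the parameters as the conventional benchmark numbers rather than anything specific to NEMO1D:

```python
import numpy as np

def kato_phillips_depth(t, u_star=0.01, N0=0.01):
    """Empirical mixed-layer depth [m] for the Kato-Phillips experiment:
    h(t) = 1.05 * u_star * sqrt(t / N0), with t in s, u_star in m/s,
    N0 in 1/s (values here are the usual benchmark settings)."""
    return 1.05 * u_star * np.sqrt(t / N0)
```

The square-root-in-time growth is the signature a well-behaved closure must reproduce; divergence from it under coarsened discretization is exactly the sensitivity the abstract reports.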

  12. NEMO-ICB (v1.0): interactive icebergs in the NEMO ocean model globally configured at coarse and eddy-permitting resolution

    Directory of Open Access Journals (Sweden)

    R. Marsh

    2014-08-01

    NEMO-ICB features interactive icebergs in the NEMO ocean model. Simulations with coarse (2°) and eddy-permitting (0.25°) global configurations of NEMO-ICB are undertaken to evaluate the influence of icebergs on sea ice, hydrography and transports, through comparison with control simulations in which the equivalent iceberg mass flux is applied as coastal runoff, the default forcing in NEMO. Comparing a short (14-year) spin-up of the 0.25° model with a computationally cheaper 105-year spin-up of the 2° configuration, calving, drift and melting of icebergs are evidently near equilibrium in the shorter simulation, justifying closer examination of iceberg influences in the eddy-permitting configuration. Freshwater forcing due to iceberg melt is most pronounced at southern high latitudes, where it is locally dominant over precipitation. Sea ice concentration and thickness in the Southern Ocean are locally increased with icebergs, by up to ~8% and ~25% respectively. Iceberg melting reduces surface salinity by ~0.2 psu around much of Antarctica, with compensating increases immediately adjacent to Antarctica, where coastal runoff is suppressed. Discernible effects on salinity and temperature extend to 1000 m. At many locations and levels, freshening and cooling indicate a degree of density compensation. However, freshening is a dominant influence on upper-ocean density gradients across much of the high-latitude Southern Ocean, leading to weaker meridional density gradients, a reduced eastward transport tendency, and hence an increase of ~20% in westward transport of the Antarctic Coastal Current.

  13. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Fraunhofer Institute for Solar Energy Systems ISE, Freiburg (Germany)]

    2013-07-01

    With an increasing use of electric vehicles (EVs), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. The first tool aims at combined techno-economic design and operation, primarily modeling plants on contracts or the spot market while at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third is mainly used to quickly discover potential conflicts between grid operation approaches through load-flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by a few intelligent restrictions on EV charging procedures.

  14. The Network Modification (NeMo) Tool: Elucidating the Effect of White Matter Integrity Changes on Cortical and Subcortical Structural Connectivity

    OpenAIRE

    Kuceyeski, Amy; Maruta, Jun; Relkin, Norman; Raj, Ashish

    2013-01-01

    Accurate prediction of brain dysfunction caused by disease or injury requires the quantification of resultant neural connectivity changes compared with the normal state. There are many methods with which to assess anatomical changes in structural or diffusion magnetic resonance imaging, but most overlook the topology of white matter (WM) connections that make up the healthy brain network. Here, a new neuroimaging software pipeline called the Network Modification (NeMo) Tool is presented that ...

  15. Development of a probabilistic ocean modelling system based on NEMO 3.5: application at eddying resolution

    Science.gov (United States)

    Bessières, Laurent; Leroux, Stéphanie; Brankart, Jean-Michel; Molines, Jean-Marc; Moine, Marie-Pierre; Bouttier, Pierre-Antoine; Penduff, Thierry; Terray, Laurent; Barnier, Bernard; Sérazin, Guillaume

    2017-03-01

    This paper presents the technical implementation of a new, probabilistic version of the NEMO ocean-sea-ice modelling system. Ensemble simulations with N members running simultaneously within a single executable, and interacting mutually if needed, are made possible through an enhanced message-passing interface (MPI) strategy, including a double parallelization in the spatial and ensemble dimensions. An example application is then given to illustrate the implementation, performance, and potential use of this novel probabilistic modelling tool. A large ensemble of 50 global ocean-sea-ice hindcasts has been performed over the period 1960-2015 at eddy-permitting resolution (1/4°) for the OCCIPUT (oceanic chaos - impacts, structure, predictability) project. This application aims to simultaneously simulate the intrinsic/chaotic and the atmospherically forced contributions to ocean variability, from mesoscale turbulence to interannual-to-multidecadal timescales. Such an ensemble indeed provides a unique way to disentangle and study both contributions, as the forced variability may be estimated through the ensemble mean, and the intrinsic chaotic variability may be estimated through the ensemble spread.
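The double parallelization can be pictured as a two-level rank mapping: consecutive blocks of MPI ranks form one ensemble member, within which the usual spatial domain decomposition applies. In MPI terms this corresponds to an MPI_Comm_split with color = rank // procs_per_member; the sketch below is the pure mapping, with illustrative member and process counts:

```python
def rank_to_member_subdomain(rank, procs_per_member):
    """Map a global MPI rank to (ensemble member, subdomain index) under a
    double parallelization: block r // P is the member, r % P the subdomain
    (equivalent to MPI_Comm_split color/key on the member communicator)."""
    return divmod(rank, procs_per_member)
```

With, say, 50 members on 40 processes each, ranks 0-39 run member 0's domain decomposition, ranks 40-79 member 1's, and so on, while cross-member collectives (ensemble mean, spread) use the complementary split.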

  16. Performance and results of the high-resolution biogeochemical model PELAGOS025 within NEMO

    Directory of Open Access Journals (Sweden)

    I. Epicoco

    2015-12-01

    The present work aims at evaluating the scalability of a high-resolution global ocean biogeochemistry model (PELAGOS025) on massively parallel architectures, and the benefits in terms of time-to-solution reduction. PELAGOS025 is an online coupling between the physical ocean model NEMO and the BFM biogeochemical model. Both models use a parallel domain decomposition along the horizontal dimension, and the parallelization is based on the message-passing paradigm. The performance analysis has been done on two parallel architectures, an IBM BlueGene/Q at ALCF (Argonne Leadership Computing Facility) and an IBM iDataPlex with Sandy Bridge processors at CMCC (Euro-Mediterranean Center on Climate Change). The outcome of the analysis demonstrates that the lack of scalability is due to several factors, such as the I/O operations, memory contention, load imbalance due to the memory structure of the BFM component and, for the BlueGene/Q, the absence of a hybrid parallelization approach.

  17. Simulation of snowbands in the Baltic Sea area with the coupled atmosphere-ocean-ice model COSMO-CLM/NEMO

    Directory of Open Access Journals (Sweden)

    Trang Van Pham

    2017-02-01

    Wind-parallel bands of snowfall over the Baltic Sea area are common during late autumn and early winter. This phenomenon occurs when cold air flows over the warm water surface, enhancing convection and leading to heavy snowfall. Six snowband events from 1985 to 2010 are simulated using the coupled atmosphere-ocean-ice model COSMO-CLM/NEMO. The model resolution is sufficiently high to capture the snowbands: the atmospheric model COSMO-CLM has a horizontal grid spacing of approximately 25 km and the ocean-sea-ice model NEMO has a horizontal grid spacing of approximately 3 km. The model results show that the coupled system COSMO-CLM/NEMO successfully reproduces the snowband events, with a high contrast between surface and atmospheric temperatures, sharp bands of precipitation over the sea, and the enormous heat fluxes released by the ocean to the atmosphere on the days when snowbands occurred. In the two cases for which radar data are available, the modelled precipitation is in satisfactory agreement, and the precipitation patterns closely follow the cloud shapes on satellite images. When not coupled with the ocean model, the atmospheric stand-alone model provides acceptable results if forced by high-quality sea surface temperatures (SSTs) from reanalysis data; however, COSMO-CLM forced with lower-quality SSTs could not recreate the snowbands. The results indicate the need for an atmospheric model with high SST skill, or a coupled ocean model, when extreme-event climatology is the primary aim in the Baltic Sea area.

  18. The link between the Barents Sea and ENSO events simulated by NEMO model

    Directory of Open Access Journals (Sweden)

    V. N. Stepanov

    2012-11-01

    An analysis of observational data in the Barents Sea along a meridian at 33°30' E between 70°30' and 72°30' N has revealed a negative correlation between El Niño/La Niña Southern Oscillation (ENSO) events and water temperature in the top 200 m: the temperature drops by about 0.5 °C during warm ENSO events, while during cold ENSO events the top 200 m layer of the Barents Sea is warmer.

    Results from 1° and 1/4° global NEMO models show a similar response for the whole Barents Sea. During the strong warm ENSO event in 1997–1998, an anomalous anticyclonic atmospheric circulation over the Barents Sea enhances heat losses, as well as substantially influencing the Barents Sea inflow from the North Atlantic via changes in ocean currents. Under normal conditions there is a warm current along the Scandinavian peninsula entering the Barents Sea from the North Atlantic; however, after the 1997–1998 event this current weakened.

    During 1997–1998 the model annual mean temperature in the Barents Sea decreased by about 0.8 °C, also resulting in a higher sea ice volume. In contrast, during the cold ENSO events in 1999–2000 and 2007–2008, the model shows a lower sea ice volume and higher annual mean temperatures in the upper layer of the Barents Sea, by about 0.7 °C. An analysis of model data shows that the strength of the Atlantic inflow into the Barents Sea is the main cause of heat content variability, and is forced by changing pressure and winds in the North Atlantic. However, surface heat exchange with the atmosphere provides the means by which the Barents Sea heat budget relaxes to normal in the year following the ENSO events.

  19. Sea-ice evaluation of NEMO-Nordic 1.0: a NEMO-LIM3.6-based ocean-sea-ice model setup for the North Sea and Baltic Sea

    Science.gov (United States)

    Pemberton, Per; Löptien, Ulrike; Hordoir, Robinson; Höglund, Anders; Schimanke, Semjon; Axell, Lars; Haapala, Jari

    2017-08-01

    The Baltic Sea is a seasonally ice-covered marginal sea in northern Europe with intense wintertime ship traffic and a sensitive ecosystem. Understanding and modeling the evolution of the sea-ice pack is important for climate effect studies and forecasting purposes. Here we present and evaluate the sea-ice component of a new NEMO-LIM3.6-based ocean-sea-ice setup for the North Sea and Baltic Sea region (NEMO-Nordic). The setup includes a new depth-based fast-ice parametrization for the Baltic Sea. The evaluation focuses on long-term statistics, from a 45-year long hindcast, although short-term daily performance is also briefly evaluated. We show that NEMO-Nordic is well suited for simulating the mean sea-ice extent, concentration, and thickness as compared to the best available observational data set. The variability of the annual maximum Baltic Sea ice extent is well in line with the observations, but the 1961-2006 trend is underestimated. Capturing the correct ice thickness distribution is more challenging. Based on the simulated ice thickness distribution we estimate the undeformed and deformed ice thickness and concentration in the Baltic Sea, which compares reasonably well with observations.

  20. Explicit representation and parametrised impacts of under ice shelf seas in the z∗ coordinate ocean model NEMO 3.6

    Directory of Open Access Journals (Sweden)

    P. Mathiot

    2017-07-01

    Ice-shelf–ocean interactions are a major source of freshwater on the Antarctic continental shelf and have a strong impact on ocean properties, ocean circulation and sea ice. However, climate models based on the ocean–sea ice model NEMO (Nucleus for European Modelling of the Ocean) currently do not include these interactions in any detail. The capability of explicitly simulating the circulation beneath ice shelves is introduced in the non-linear free surface model NEMO. Its implementation into the NEMO framework and its assessment in an idealised and realistic circum-Antarctic configuration is described in this study. Compared with the current prescription of ice shelf melting (i.e. at the surface), inclusion of open sub-ice-shelf cavities leads to a decrease in sea ice thickness along the coast, a weakening of the ocean stratification on the shelf, a decrease in salinity of high-salinity shelf water on the Ross and Weddell sea shelves and an increase in the strength of the gyres that circulate within the over-deepened basins on the West Antarctic continental shelf. Mimicking the overturning circulation under the ice shelves by introducing a prescribed meltwater flux over the depth range of the ice shelf base, rather than at the surface, is also assessed. It yields similar improvements in the simulated ocean properties and circulation over the Antarctic continental shelf to those from the explicit ice shelf cavity representation. With the ice shelf cavities opened, the widely used three equation ice shelf melting formulation, which enables an interactive computation of melting, is tested. Comparison with observational estimates of ice shelf melting indicates realistic results for most ice shelves. However, melting rates for the Amery, Getz and George VI ice shelves are considerably overestimated.
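The "three equation" formulation couples a heat balance, a salt balance and the liquidus condition at the ice-ocean interface. A minimal sketch that neglects heat conduction into the ice and uses constant, illustrative exchange velocities; NEMO's actual implementation uses velocity-dependent exchange coefficients, so treat every parameter value below as an assumption:

```python
import numpy as np

def three_equation_melt(Tw, Sw, z,
                        gammaT=1e-5, gammaS=1e-5 / 35.0,
                        cw=3974.0, L=3.34e5,
                        l1=-0.0573, l2=0.0832, l3=7.61e-4):
    """Solve the three-equation melt formulation for interface salinity Sb,
    interface temperature Tb and melt rate m [m/s water equivalent]:
      heat:     cw * gammaT * (Tw - Tb) = L * m      (ice conduction neglected)
      salt:     m * Sb = gammaS * (Sw - Sb)
      liquidus: Tb = l1 * Sb + l2 + l3 * z           (z < 0 below sea level)
    Eliminating m and Tb gives A*Sb**2 + B*Sb + C = 0; the positive root is
    physical. Exchange velocities gammaT, gammaS are illustrative constants."""
    A = -cw * gammaT * l1
    B = cw * gammaT * (Tw - l2 - l3 * z) + L * gammaS
    C = -L * gammaS * Sw
    Sb = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)
    Tb = l1 * Sb + l2 + l3 * z
    m = cw * gammaT * (Tw - Tb) / L
    return m, Tb, Sb
```

The melt rate grows with the thermal driving Tw − Tb, which is why warm-cavity shelves are the ones whose melting the interactive formulation tends to overestimate when the exchange coefficients are not locally tuned.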

  1. Explicit representation and parametrised impacts of under ice shelf seas in the z∗ coordinate ocean model NEMO 3.6

    Science.gov (United States)

    Mathiot, Pierre; Jenkins, Adrian; Harris, Christopher; Madec, Gurvan

    2017-07-01

    Ice-shelf-ocean interactions are a major source of freshwater on the Antarctic continental shelf and have a strong impact on ocean properties, ocean circulation and sea ice. However, climate models based on the ocean-sea ice model NEMO (Nucleus for European Modelling of the Ocean) currently do not include these interactions in any detail. The capability of explicitly simulating the circulation beneath ice shelves is introduced in the non-linear free surface model NEMO. Its implementation into the NEMO framework and its assessment in an idealised and realistic circum-Antarctic configuration is described in this study. Compared with the current prescription of ice shelf melting (i.e. at the surface), inclusion of open sub-ice-shelf cavities leads to a decrease in sea ice thickness along the coast, a weakening of the ocean stratification on the shelf, a decrease in salinity of high-salinity shelf water on the Ross and Weddell sea shelves and an increase in the strength of the gyres that circulate within the over-deepened basins on the West Antarctic continental shelf. Mimicking the overturning circulation under the ice shelves by introducing a prescribed meltwater flux over the depth range of the ice shelf base, rather than at the surface, is also assessed. It yields similar improvements in the simulated ocean properties and circulation over the Antarctic continental shelf to those from the explicit ice shelf cavity representation. With the ice shelf cavities opened, the widely used three equation ice shelf melting formulation, which enables an interactive computation of melting, is tested. Comparison with observational estimates of ice shelf melting indicates realistic results for most ice shelves. However, melting rates for the Amery, Getz and George VI ice shelves are considerably overestimated.
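For readers unfamiliar with it, the "three equation" melt formulation referred to above couples a pressure- and salinity-dependent freezing point with heat and salt balances at the ice-ocean interface. A commonly quoted simplified form (coefficients and exchange velocities vary between implementations, so treat this as a sketch rather than the exact NEMO 3.6 code) is:

```latex
\begin{aligned}
T_b &= \lambda_1 S_b + \lambda_2 + \lambda_3 z_b && \text{(freezing point at the ice shelf base)}\\
\rho_w c_w \gamma_T \,(T_w - T_b) &= \rho_i L_f \, m && \text{(heat balance at the interface)}\\
\rho_w \gamma_S \,(S_w - S_b) &= \rho_i \, m \, S_b && \text{(salt balance at the interface)}
\end{aligned}
```

Here $T_b$, $S_b$ are the interface temperature and salinity, $z_b$ the ice base depth, $T_w$, $S_w$ the ambient ocean values, $\gamma_T$, $\gamma_S$ turbulent exchange velocities, and $m$ the melt rate; solving the three equations simultaneously for $T_b$, $S_b$ and $m$ is what makes the computed melting "interactive".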

  2. The link between the Barents Sea and ENSO events reproduced by NEMO model

    Directory of Open Access Journals (Sweden)

    V. N. Stepanov

    2012-05-01

Full Text Available An analysis of observational data in the Barents Sea along a meridian at 33°30´ E between 70°30´ and 72°30´ N has reported a negative correlation between El Niño/La Niña-Southern Oscillation (ENSO) events and water temperature in the top 200 m: the temperature drops about 0.5 °C during warm ENSO events, while during cold ENSO events the top 200 m layer of the Barents Sea is warmer. Results from 1 and 1/4-degree global NEMO models show a similar response for the whole Barents Sea. During the strong warm ENSO event in 1997–1998, an anticyclonic atmospheric circulation settles over the Barents Sea instead of the usual cyclonic circulation. This change enhances heat losses in the Barents Sea and substantially influences the Barents Sea inflow from the North Atlantic, via changes in ocean currents. Under normal conditions a warm current enters the Barents Sea from the North Atlantic along the Scandinavian peninsula; after the 1997–1998 event, however, this current weakened.

During 1997–1998 the model annual mean temperature in the Barents Sea decreased by about 0.8 °C, also resulting in a higher sea ice volume. In contrast, during the cold ENSO events in 1999–2000 and 2007–2008 the model shows a lower sea ice volume and annual mean temperatures in the upper layer of the Barents Sea that are higher by about 0.7 °C.

An analysis of model data shows that the Barents Sea inflow is the main source of variability of the Barents Sea heat content, and is forced by changing pressure and winds in the North Atlantic. However, surface heat exchange with the atmosphere can also play a dominant role in the annual heat balance of the Barents Sea, especially in the year following ENSO events.

  3. OpenDA-NEMO framework for ocean data assimilation

    Science.gov (United States)

    van Velzen, Nils; Altaf, Muhammad Umer; Verlaan, Martin

    2016-05-01

Data assimilation methods provide a means to handle modeling errors and uncertainties in sophisticated ocean models. In this study, we have created an OpenDA-NEMO framework unlocking the data assimilation tools available in OpenDA for use with NEMO models. This includes data assimilation methods, automatic parallelization, and a recently implemented automatic localization algorithm that removes spurious correlations in the model based on uncertainties in the computed Kalman gain matrix. We have set up a twin experiment in which we assimilate sea surface height (SSH) satellite measurements. From the experiments, we conclude that the OpenDA-NEMO framework performs as expected and that the automatic localization significantly improves the performance of the data assimilation algorithm by successfully removing spurious correlations. Based on these results, it looks promising to extend the framework with new kinds of observations and to work on improving the computational speed of the automatic localization technique so that it becomes feasible to include a large number of observations.
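The abstract does not spell out OpenDA's gain-based automatic localization algorithm. As a generic illustration of the underlying idea — damping long-range, likely spurious, ensemble correlations with a distance-based taper — the following sketch uses the classic Gaspari-Cohn Schur-product localization (function and parameter names are ours for illustration, not OpenDA's API):

```python
import numpy as np

def gaspari_cohn(r):
    """Gaspari-Cohn 5th-order compactly supported taper.
    r is distance divided by the localization half-radius; the taper is
    1 at r = 0 and exactly 0 for r >= 2."""
    r = np.abs(np.asarray(r, dtype=float))
    w = np.zeros_like(r)
    near = r <= 1.0
    far = (r > 1.0) & (r < 2.0)
    x = r[near]
    w[near] = 1 - (5/3)*x**2 + (5/8)*x**3 + (1/2)*x**4 - (1/4)*x**5
    x = r[far]
    w[far] = (4 - 5*x + (5/3)*x**2 + (5/8)*x**3
              - (1/2)*x**4 + (1/12)*x**5 - 2/(3*x))
    return w

def localize(cov, dist, radius):
    """Schur (elementwise) product of an ensemble covariance with the
    taper: correlations beyond twice `radius` are removed entirely."""
    return cov * gaspari_cohn(dist / radius)
```

The Schur product keeps nearby covariances essentially intact while forcing distant ones to zero, which is what "removing spurious correlations" amounts to in ensemble methods with small ensembles.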

  4. Assimilation of simulated satellite altimetric data and ARGO temperature data into a double-gyre NEMO ocean model

    Science.gov (United States)

    Yan, Yajing; Barth, Alexander; Laenen, François; Beckers, Jean-Marie

    2013-04-01

In recent years, data assimilation, which addresses the problem of producing useful analyses and forecasts given imperfect dynamical models and observations, has attracted increasing interest in the atmosphere and ocean science community. The efficiency of data assimilation in improving model predictions has been demonstrated by numerous studies. However, it is still a challenge to design operational data assimilation schemes that can be operated with realistic ocean models, with reasonable quality and at acceptable cost. In this work, several experiments assimilating simulated altimetry and temperature observations into a double-gyre NEMO ocean model are performed with the objective of investigating the impact of different assimilation setups — varying the observation distribution, the ensemble size and the localisation scale — on the quality of the analysis. The double-gyre NEMO ocean model corresponds to an idealized configuration of the NEMO model: a square, 5000 m deep, flat-bottomed ocean at mid latitudes (the so-called square-box or SQB configuration). The main physical parameters governing the dominant characteristics of the flow are the initial stratification, the wind stress, the bottom friction and the lateral mixing parameterization. The domain extends from 24° N to 44° N, over 30° in longitude (60° W - 30° W), with 11 vertical levels between 152 m and 4613 m in depth. The minimum horizontal resolution of the model is 1/4°. The observations are generated from the model simulations (the truth) by adding spatially uncorrelated Gaussian noise with a given standard deviation. Two types of observation are considered: sea surface height (SSH) and temperature. The observation grid of the SSH is simulated from the ENVISAT and Jason-1 satellite tracks, and that of the temperature is generated to mimic ARGO float profiles. Observation localisation is performed in order to avoid spurious correlations at large distances. For this, the observations are weighted
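The observation-generation step described above — sampling the "truth" run at instrument locations and adding spatially uncorrelated Gaussian noise — can be sketched as follows (grid size, noise level and names are illustrative, not the paper's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_observations(truth, obs_idx, noise_std, rng):
    """Twin-experiment observations: the truth field sampled at flat
    indices obs_idx, perturbed by spatially uncorrelated Gaussian
    noise of standard deviation noise_std."""
    return truth.ravel()[obs_idx] + rng.normal(0.0, noise_std, size=len(obs_idx))

# hypothetical SSH 'truth' on a small grid; 3 cm observation noise
ssh_truth = 0.1 * rng.standard_normal((40, 40))
track = rng.choice(ssh_truth.size, size=200, replace=False)  # mimic satellite tracks
ssh_obs = simulate_observations(ssh_truth, track, noise_std=0.03, rng=rng)
```

Because the noise is drawn independently per observation point, the implied observation-error covariance is diagonal, which is the standard assumption in such twin experiments.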

  5. Interactions between Arctic sea ice drift, concentration and thickness modeled by NEMO-LIM3 at different resolutions

    Science.gov (United States)

    Docquier, David; Massonnet, François; Raulier, Jonathan; Lecomte, Olivier; Fichefet, Thierry

    2016-04-01

Sea ice concentration and thickness have substantially decreased in the Arctic since the beginning of the satellite era. As a result, mechanical strength has decreased, allowing more fracturing and leading to increased sea ice drift. However, recent studies have highlighted that the interplay between sea ice thermodynamics and dynamics is poorly represented in contemporary global climate model (GCM) simulations. Thus, the considerable inter-model spread in future sea ice extent projections could be reduced by better understanding the interactions between drift, concentration and thickness. This study focuses on results from the global coupled ocean-sea ice model NEMO-LIM3 between 1979 and 2012. Three different simulations are forced by the Drakkar Forcing Set (DFS) 5.2 and run on the global tripolar ORCA grid at spatial resolutions of 0.25°, 1° and 2°. The relation between modeled sea ice drift, concentration and thickness is analyzed, compared to observations and discussed in the framework of the above-mentioned poor representation, and is proposed as a process-based metric for evaluating model performance. This study forms part of the EU Horizon 2020 PRIMAVERA project, which aims to develop a new generation of advanced and well-evaluated high-resolution GCMs.
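The strength-thickness-concentration link invoked above is, in Hibler-type viscous-plastic sea ice rheologies of the kind used in the LIM model family, conventionally parameterised as (a standard textbook form; the exact constants used in LIM3 may differ):

```latex
P = P^{*} \, h \, e^{-C\,(1 - A)}
```

where $P$ is the ice strength, $h$ the mean ice thickness, $A$ the ice concentration, and $P^{*}$ and $C$ empirical constants; thinner or less compact ice is mechanically weaker and therefore fractures and drifts more readily, which is the feedback the study sets out to evaluate.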

  6. Response of water temperature to surface wave effects in the Baltic Sea: simulations with the coupled NEMO-WAM model

    Science.gov (United States)

    Alari, Victor; Staneva, Joanna; Breivik, Øyvind; Bidlot, Jean-Raymond; Mogensen, Kristian; Janssen, Peter

    2016-04-01

The effects of wind waves on Baltic Sea water temperature have been studied by coupling the hydrodynamical model NEMO with the wave model WAM. The wave forcing terms taken into consideration are the Stokes-Coriolis force, the sea-state dependent energy flux and the sea-state dependent momentum flux. The combined role of these processes, as well as their individual contributions to the simulated temperature, is analysed. The results indicate a pronounced effect of waves on surface temperature, on the vertical temperature distribution and on upwellings. In northern parts of the Baltic Sea a warming of the surface layer occurs in the simulations that include waves. This in turn reduces the cold bias between simulated and measured data. The warming is primarily caused by the sea-state dependent energy flux. Wave-induced cooling is mostly observed in near-coastal areas and is mainly due to the Stokes-Coriolis forcing, which intensifies upwelling near the coasts, depending on the wind direction. The effect of the sea-state dependent momentum flux is predominantly to warm the surface layer. During the summer, the wave-induced water temperature changes were up to 1 °C.

  7. A comparative signaling cost analysis of Macro Mobility scheme in NEMO (MM-NEMO) with mobility management protocol

    Science.gov (United States)

    Islam, Shayla; Abdalla, Aisha H.; Habaebi, Mohamed H.; Latif, Suhaimi A.; Hassan, Wan H.; Hasan, Mohammad K.; Ramli, H. A. M.; Khalifa, Othman O.

    2013-12-01

NEMO BSP is an extension of Mobile IPv6 (MIPv6). Since MIPv6 and its enhancements (e.g. HMIPv6) suffer from limitations such as high handoff latency and packet loss, NEMO BSP inherits the same shortcomings. Network Mobility (NEMO) handles the movement of a Mobile Router (MR) and its Mobile Network Nodes (MNNs) during handoff. Hence it is essential to improve the mobility management protocol so as to obtain continuous session connectivity with lower delay and packet loss in a NEMO environment. Completing the handoff process in NEMO BSP usually takes a long time, since the MR needs to register its single primary care-of address (CoA) with the home network, which may degrade the performance of applications running on the Mobile Network Nodes. Moreover, when a change in the point of attachment of the mobile network is accompanied by a sudden burst of signaling messages, a "signaling storm" occurs, which eventually results in temporary congestion, packet delays or even packet loss. This effect is particularly significant in wireless environments, where a wireless link is not as steady as a wired link and bandwidth is relatively limited. Hence, providing continuous Internet connectivity without interruption, by applying multihoming techniques and route optimization mechanisms in NEMO, has become a focus of current research. In this paper, we propose a handoff cost model to compare the signaling cost of MM-NEMO with the NEMO Basic Support Protocol (NEMO BSP) and HMIPv6. The numerical results show that the signaling cost of the MM-NEMO scheme is about 69.6 % less than that of NEMO BSP and HMIPv6.
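The abstract does not reproduce the paper's cost model, but location-management cost models of this kind typically sum, per handoff, the transmission and processing cost of each binding-update message. A purely illustrative sketch (all names and numbers are hypothetical, not the paper's values):

```python
def handoff_signaling_cost(messages):
    """Sum the per-handoff signaling cost over control messages, where
    each message is (hop_count, per_hop_transmission_cost, processing_cost).
    Illustrative only -- not the MM-NEMO paper's actual cost model."""
    return sum(hops * tx + proc for hops, tx, proc in messages)

# Hypothetical comparison: a scheme registering with a nearby anchor
# sends fewer, shorter-path messages than one registering with a
# distant home agent on every movement.
local_scheme = [(2, 1.0, 0.5)] * 2   # two short-range binding updates
home_scheme = [(10, 1.0, 0.5)] * 4   # four long-range binding updates

assert handoff_signaling_cost(local_scheme) < handoff_signaling_cost(home_scheme)
```

Reducing either the number of registration messages or the hop distance they travel is precisely how schemes like MM-NEMO lower the total cost relative to NEMO BSP.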

  8. Background constrains of the SuperNEMO experiment for neutrinoless double beta-decay searches

    Science.gov (United States)

    Povinec, Pavel P.

    2017-02-01

The SuperNEMO experiment is a new generation of experiments dedicated to the search for neutrinoless double beta-decay, which, if observed, would confirm the existence of physics beyond the Standard Model. It is based on tracking and calorimetry techniques, which allow the reconstruction of the final state topology, including timing and kinematics of the double beta-decay transition events, offering a powerful tool for background rejection. While the basic detection strategy of the SuperNEMO detector remains the same as that of the NEMO-3 detector, a number of improvements have been accomplished for each of the detector's main components. Upgrades of the detector technologies and the development of low-level counting techniques ensure radiopurity control of the construction parts of the SuperNEMO detector. A reference material made of glass pellets has been developed to assure quality management and quality control of the radiopurity measurements. The first module of the SuperNEMO detector (the Demonstrator) is currently under construction in the Modane underground laboratory. No background event is expected in the neutrinoless double beta-decay region in 2.5 years of operation using 7 kg of 82Se. The half-life sensitivity of the Demonstrator is expected to be >6.5·10^24 y, corresponding to an effective Majorana neutrino mass sensitivity of |0.2-0.4| eV (90% C.L.). The full SuperNEMO experiment, comprising 20 modules with 100 kg of 82Se source, should reach an effective Majorana neutrino mass sensitivity of |0.04-0.1| eV and a half-life limit of 1·10^26 y.
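The quoted conversion between a half-life limit and an effective Majorana mass range follows from the standard neutrinoless double beta-decay rate formula (the quoted mass ranges reflect the spread of nuclear matrix element calculations):

```latex
\left( T_{1/2}^{0\nu} \right)^{-1}
  = G^{0\nu} \, \left| M^{0\nu} \right|^{2}
    \left( \frac{\langle m_{\beta\beta} \rangle}{m_e} \right)^{2}
```

where $G^{0\nu}$ is the phase-space factor, $M^{0\nu}$ the nuclear matrix element and $m_e$ the electron mass, so $\langle m_{\beta\beta} \rangle \propto 1/\sqrt{T_{1/2}^{0\nu}}$: a roughly 15-fold improvement in half-life sensitivity yields the quoted factor of a few in mass sensitivity.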

  9. Recent Developments of NEMO: Detection of Solar Eruptions Characteristics

    CERN Document Server

    Podladchikova, Olena; Leontiev, Pavel; Van der Linden, Ronald

    2011-01-01

Recent developments in space instrumentation for solar observations and telemetry have created the need for advanced pattern-recognition tools for the different classes of solar events. The Extreme ultraviolet Imaging Telescope (EIT), observing the solar corona on board the SOHO spacecraft, has uncovered a new class of eruptive events which are often identified as signatures of Coronal Mass Ejection (CME) initiation on the solar disk. A crucial task is therefore the development of an automatic detection tool for CME precursors. The Novel EIT wave Machine Observing (NEMO) (http://sidc.be/nemo) code is an operational tool that automatically detects solar eruptions using EIT image sequences. NEMO applies techniques based on the general statistical properties of the underlying physical mechanisms of eruptive events on the solar disc. In this work, the most recent updates of the NEMO code - which have increased the recognition efficiency of solar eruptions linked to CMEs - are presented. These updates pro...

  10. Surface wave effects on water temperature in the Baltic Sea: simulations with the coupled NEMO-WAM model

    Science.gov (United States)

    Alari, Victor; Staneva, Joanna; Breivik, Øyvind; Bidlot, Jean-Raymond; Mogensen, Kristian; Janssen, Peter

    2016-08-01

A coupled circulation (NEMO) and wave (WAM) model system was used to study the effects of surface ocean waves on water temperature distribution and heat exchange at regional scale (the Baltic Sea). Four scenarios — including the Stokes-Coriolis force, the sea-state dependent energy flux (additional turbulent kinetic energy due to breaking waves), the sea-state dependent momentum flux and the combination of these forcings — were simulated to test the impact of the different terms on the simulated temperature distribution. The scenario simulations were compared to a control simulation, which included a constant wave-breaking coefficient but was otherwise without any wave effects. The results indicate a pronounced effect of waves on surface temperature, on the vertical temperature distribution and on upwellings. Overall, the temperature estimates improved relative to the control simulation when all three wave effects were accounted for. During the summer, the wave-induced water temperature changes were up to 1 °C. In northern parts of the Baltic Sea, a warming of the surface layer occurs in the wave-included simulations in the summer months. This in turn reduces the cold bias between simulated and measured data, i.e. the control simulation was too cold compared to measurements. The warming is related to the sea-state dependent energy flux. This implies that a spatio-temporally varying wave-breaking coefficient is necessary, because it depends on the actual sea state. Wave-induced cooling is mostly observed in near-coastal areas and results from intensified upwelling in the scenario in which the Stokes-Coriolis forcing is accounted for. Accounting for the sea-state dependent momentum flux modifies the heat exchange at the water-air boundary, which consequently leads to a warming of surface water compared to the control simulation.

  11. Effects of lateral processes on the seasonal water stratification of the Gulf of Finland: 3-D NEMO-based model study

    Science.gov (United States)

    Vankevich, Roman E.; Sofina, Ekaterina V.; Eremina, Tatiana E.; Ryabchenko, Vladimir A.; Molchanov, Mikhail S.; Isaev, Alexey V.

    2016-08-01

This paper aims to fill the gaps in knowledge of processes affecting the seasonal water stratification in the Gulf of Finland (GOF). We used the state-of-the-art modelling framework NEMO (Nucleus for European Modelling of the Ocean), designed for oceanographic research, operational oceanography, seasonal forecasting and climate studies, to build an eddy-resolving model of the GOF. To evaluate the model skill and performance, two different solutions were obtained on a 0.5 km eddy-resolving grid and a commonly used 2 km grid for a 1-year simulation. We also explore the efficacy of the non-hydrostatic effect (convection) parameterizations available in NEMO for coastal applications. It is found that the solutions resolving submesoscales have a more complex mixed layer structure in the regions of the GOF directly affected by upwelling/downwelling and intrusions from the open Baltic Sea. The presented model estimates of the upper mixed layer depth are in good agreement with in situ CTD (BED) data. A number of model sensitivity tests of the vertical mixing parameterization confirm the model's robustness. Further progress in simulating and understanding submesoscale processes appears to depend not so much on finer grid resolution as on the use of non-hydrostatic models, because the hydrostatic approach fails at the submesoscale.

  12. Effects of lateral processes on the seasonal water stratification of the Gulf of Finland: 3-D NEMO-based model study

    Directory of Open Access Journals (Sweden)

    R. E. Vankevich

    2015-10-01

Full Text Available This paper tries to fill the gaps in knowledge of processes affecting the seasonal water stratification in the Gulf of Finland (GOF). We used the state-of-the-art modeling framework NEMO, designed for oceanographic research, operational oceanography, seasonal forecasting and climate studies, to build an eddy-resolving model of the GOF. To evaluate the model skill and performance, two different solutions were obtained on a 0.5 km eddy-resolving grid and a commonly used 2 km grid for a one-year simulation. We also explore the efficacy of the non-hydrostatic effect (convection) parameterizations available in NEMO for coastal applications. It is found that the solutions resolving sub-mesoscales have a more complex mixed layer structure in the regions of the GOF directly affected by upwelling/downwelling and intrusions from the open Baltic Sea. The presented model estimates of the upper mixed layer depth are in good agreement with in situ CTD data. A number of model sensitivity tests of the vertical mixing parameterization confirm the model's robustness.

  13. Implementation of Black Sea numerical model based on NEMO and 3DVAR data assimilation scheme for operational forecasting

    Science.gov (United States)

Ciliberti, Stefania Angela; Peneva, Elisaveta; Storto, Andrea; Kandilarov, Rostislav; Lecci, Rita; Yang, Chunxue; Coppini, Giovanni; Masina, Simona; Pinardi, Nadia

    2016-04-01

This study describes a new model implementation for the Black Sea which uses data assimilation, towards operational forecasting, based on NEMO (Nucleus for European Modelling of the Ocean, Madec et al., 2012). The Black Sea domain is resolved with a 1/27°×1/36° horizontal resolution (~3 km) and 31 z-levels with partial steps, based on the GEBCO bathymetry data (Grayek et al., 2010). The model is forced by momentum, water and heat fluxes interactively computed by bulk formulae using high-resolution atmospheric forcing provided by the European Centre for Medium-Range Weather Forecasts (ECMWF). The initial condition is calculated from long-term climatological temperature and salinity 3D fields. The precipitation field over the basin has been computed from the climatological GPCP monthly rainfall data (Adler et al., 2003; Huffman et al., 2009), while the evaporation is derived from the latent heat flux. The climatological monthly mean runoff of the major rivers in the Black Sea is computed using the hydrological dataset provided by the SESAME project (Ludvig et al., 2009). The exchange with the Mediterranean Sea through the Bosporus Strait is represented by a surface boundary condition taking into account the barotropic transport calculated to balance the freshwater fluxes on a monthly basis (Stanev and Beckers, 1999; Peneva et al., 2001). A multi-annual run over 2011-2015 has been completed in order to describe the main characteristics of the Black Sea circulation dynamics and thermohaline structure, and the numerical results have been validated using in-situ (ARGO) and satellite (SST, SLA) data. The Black Sea model also represents the core of the new Black Sea Forecasting System, implemented operationally at CMCC since January 2016, which produces daily 10-day forecasts, 3-day analyses and a 1-day simulation. Once a week, the system is run 15 days into the past in analysis mode to compute the new optimal initial condition for the forecast cycle. The assimilation is performed by a

  14. Implementation of the NEMO model for estimating the spread of leakage from chemical munitions in the Baltic Sea - the first approach

    Science.gov (United States)

    Andrzejewski, Jan

    2017-04-01

After the Second World War, during the Potsdam Conference, a decision was made about the demilitarization of Germany, and as a consequence ammunition including chemical warfare agents (CWA) was dumped into the basins of the Baltic Sea. This type of weapon was stored in metal barrels subject to strong electrochemical oxidation, also known as corrosion. Several decades later, scientists began to wonder what consequences a leakage from these weapons could bring for the marine ecosystem. Although over 70 years have passed since the Second World War, the influence of a potential leakage of CWA has not been properly estimated. Thus, the main goal of this work is to estimate the dangerous area caused by a potential leakage using the NEMO (Nucleus for European Modelling of the Ocean) ocean model. The NEMO ocean model is developed by a European consortium including research institutes from France, England and Italy. The first step of this work is to implement the model for the area of the Baltic Sea. This requires the generation of horizontal and vertical grids, bathymetry, atmospheric forcing and lateral boundary conditions. The implemented model will then have to pass a validation process. The Baltic Sea is one of the best-measured seas in the world, so a lot of data are freely available to researchers. After validation and tuning of the model, the implementation of a passive tracer is planned. A passive tracer is a prognostic variable that can represent the concentration of a potential leakage and does not influence the density in the model. Based on the distribution of the passive tracer, dangerous areas in the locations of the dumpsites will be assessed. The research work was funded by the European Union (European Regional Development Fund) under the Interreg Baltic Sea Region Programme 2014-2020, project #R013 DAIMON (Decision Aid for Marine Munitions).
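The passive-tracer idea described above — a prognostic concentration carried by the model flow without feeding back on density — reduces, in its simplest 1-D periodic form, to an explicit advection-diffusion update (a sketch only, not NEMO's actual tracer code):

```python
import numpy as np

def tracer_step(c, u, kappa, dx, dt):
    """One explicit step for a passive tracer on a periodic 1-D grid:
    first-order upwind advection (assumes u > 0) plus central diffusion.
    Stability requires u*dt/dx <= 1 and 2*kappa*dt/dx**2 <= 1."""
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = kappa * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2
    return c + dt * (adv + dif)

# a hypothetical leak released in a few cells, then spread by the flow
c = np.zeros(100)
c[48:52] = 1.0
for _ in range(200):
    c = tracer_step(c, u=0.5, kappa=0.01, dx=1.0, dt=1.0)
```

The scheme is conservative on the periodic domain, mirroring the physical point in the abstract: the tracer is transported and mixed but never alters the flow that carries it.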

  15. NEMO-O$\

    CERN Document Server

    Aiello, S

    2008-01-01

The NEMO (NEutrino Mediterranean Observatory) Collaboration installed, 25 km east of the port of Catania (Sicily) at 2000 m depth, an underwater laboratory to perform long-term tests of prototypes and new technologies for an underwater high energy neutrino km$^3$-scale detector in the Mediterranean Sea. In this framework the collaboration deployed and successfully operated for about two years, starting from January 2005, an experimental apparatus for on-line monitoring of deep-sea noise. The station was equipped with 4 hydrophones and was operational in the range 30 Hz - 43 kHz. This frequency interval matches the range suitable for the proposed acoustic detection technique for high energy neutrinos. Hydrophone signals were digitized underwater at 96 kHz sampling frequency and 24-bit resolution. Custom software was developed to record the data in high-resolution 4-channel digital audio files. This paper deals with the data analysis procedure and first results on the determination of sea noise sound pr...

  16. Stochastic parameterizations of biogeochemical uncertainties in a 1/4° NEMO/PISCES model for probabilistic comparisons with ocean color data

    Science.gov (United States)

    Garnier, F.; Brankart, J.-M.; Brasseur, P.; Cosme, E.

    2016-03-01

In spite of recent advances, biogeochemical models are still unable to represent the full complexity of natural ecosystems. Their formulations are mainly based on empirical laws involving many parameters. Improving biogeochemical models therefore requires properly characterizing model uncertainties and their consequences. This paper investigates the potential of using random processes to simulate some uncertainties of the 1/4° coupled physical-biogeochemical NEMO/PISCES model of the North Atlantic Ocean. Starting from a deterministic simulation performed with the original PISCES formulation, we propose a generic method based on AR(1) random processes to generate perturbations with temporal and spatial correlations. These perturbations are introduced into the model formulations to simulate two classes of uncertainties: uncertainties in the biogeochemical parameters and uncertainties induced by unresolved scales in the presence of non-linear processes. Using these stochastic parameterizations, a probabilistic version of PISCES is designed and a 60-member ensemble simulation is performed. The relevance of the probabilistic configuration and the impacts of these stochastic parameterizations are assessed with respect to the simulation of chlorophyll. In particular, it is shown that the ensemble simulation is in good agreement with the SeaWiFS ocean color data. Using these observations, the statistical consistency (reliability) of the ensemble is evaluated with rank histograms. Finally, the benefits expected from the probabilistic description of uncertainties (model error) are discussed in the context of future ocean color data assimilation.
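The AR(1) building block of the perturbation method can be sketched as follows — this shows only the temporal-correlation part; the paper additionally correlates the perturbations in space, and the parameterisation here (decorrelation timescale, stationary spread) is generic rather than the paper's configuration:

```python
import numpy as np

def ar1_series(n, tau, sigma, rng):
    """AR(1) noise x_{k+1} = phi*x_k + sigma*sqrt(1-phi**2)*eps_k with
    phi = exp(-1/tau), so the sequence has stationary standard deviation
    sigma and e-folding decorrelation timescale tau (in time steps)."""
    phi = np.exp(-1.0 / tau)
    amp = sigma * np.sqrt(1.0 - phi**2)
    x = np.empty(n)
    x[0] = sigma * rng.standard_normal()
    for k in range(1, n):
        x[k] = phi * x[k - 1] + amp * rng.standard_normal()
    return x
```

One such series per uncertain quantity (or per grid point, smoothed in space) can then perturb the deterministic parameter value at each model time step, which is the essence of the stochastic parameterization described above.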

  17. ArcNEMO, a spatially distributed nutrient emission model developed in Python to quantify losses of nitrogen and phosphorous from agriculture to surface waters

    Science.gov (United States)

    Van Opstal, Mattias; Tits, Mia; Beckers, Veronique; Batelaan, Okke; Van Orshoven, Jos; Elsen, Annemie; Diels, Jan; D'heygere, Tom; Van Hoof, Kor

    2014-05-01

Pollution of surface water bodies with nitrogen (N) and phosphorus (P) from agricultural sources is a major problem in areas with intensive agriculture in Europe. The Flemish Environment Agency requires information on how spatially explicit policy measures on manure and fertilizer use, and changes in land use and soil management, affect the N and P concentrations in the surface waters of the region of Flanders, Belgium. To assist in this, a new spatially distributed, mechanistic nutrient emission model was developed in the open-source language Python. The model is called ArcNEMO (Nutrient Emission MOdel). The model is fully integrated in ArcGIS, but could easily be adapted to work with open-source GIS software. In Flanders, detailed information is available each year on the delineation of each agricultural parcel and the crops grown on it. Parcels are linked to farms, and for each farm yearly manure and fertilizer use is available. To take full advantage of this information and to be able to simulate nutrient losses to the high-density surface water network, the model uses grid cells of 50 by 50 m. A fertilizer allocation model was developed to calculate, from the yearly parcel and farm data, the fertilizer and manure input per grid cell for further use in the ArcNEMO model. The model architecture was chosen such that the model can simulate spatially explicit monthly discharge and losses of N and P to the surface water for the whole of Flanders (13,500 km²) over periods of 10-20 years. This extended time period is necessary because residence times in groundwater and the rates of organic matter turnover imply that water quality reacts slowly to changes in land use and fertilization practices. Vertical water flow and nutrient transport in the unsaturated zone are described per grid cell using a cascading bucket-type model with daily time steps. Groundwater flow is described by solving the 2D groundwater flow equation using an explicit numerical
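The "cascading bucket-type" vertical water balance mentioned above can be illustrated with a minimal daily step, written here in the model's own language, Python (a generic sketch under our own assumptions — layer capacities, names and the rain/ET treatment are illustrative, not the actual ArcNEMO scheme):

```python
def step_bucket_cascade(storages, capacities, rain, et):
    """One daily step of a simple cascading ('bucket-type') soil water
    model: net input fills each layer up to its capacity, the surplus
    cascades to the next layer down, and whatever leaves the bottom
    layer is groundwater recharge. Illustrative sketch only."""
    inflow = max(rain - et, 0.0)  # net water input at the surface
    new = []
    for s, cap in zip(storages, capacities):
        s += inflow
        inflow = max(s - cap, 0.0)  # surplus drains to the layer below
        new.append(min(s, cap))
    return new, inflow  # updated storages, recharge to groundwater
```

Run per grid cell with daily forcing, the recharge term is what would feed a 2D groundwater solver, and a solute carried with the drainage would give the nutrient-transport part.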

  18. Status of NEMO: results from the NEMO Phase-1 detector

    Energy Technology Data Exchange (ETDEWEB)

    Distefano, C. [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali del Sud, Via S. Sofia 62, 95123 Catania (Italy)

    2009-05-15

The NEMO Collaboration installed an underwater detector including most of the critical elements of a possible km³ neutrino telescope: a four-floor tower (called the Mini-Tower) and a Junction Box, including the data transmission, power distribution, timing calibration and acoustic positioning systems. These technical solutions will be evaluated, among others proposed for the construction of the km³ detector, within the KM3NeT Consortium. The main aim of this test experiment was the validation of the proposed design solutions mentioned above. We present results of the analysis of data collected with the NEMO Mini-Tower. The position of the PMTs is determined through the acoustic positioning system; signals detected with the PMTs are used to reconstruct the tracks of atmospheric muons. The angular distribution of atmospheric muons was measured and the results were compared with Monte Carlo simulations.

  19. Status of NEMO: results from the NEMO Phase-1 detector

    CERN Document Server

    Distefano, Carla

    2009-01-01

The NEMO Collaboration installed an underwater detector including most of the critical elements of a possible km$^3$ neutrino telescope: a four-floor tower (called the Mini-Tower) and a Junction Box, including the data transmission, power distribution, timing calibration and acoustic positioning systems. These technical solutions will be evaluated, among others proposed for the construction of the km$^3$ detector, within the KM3NeT Consortium. The main aim of this test experiment was the validation of the proposed design solutions mentioned above. We present results of the analysis of data collected with the NEMO Mini-Tower. The position of the PMTs is determined through the acoustic positioning system; signals detected with the PMTs are used to reconstruct the tracks of atmospheric muons. The angular distribution of atmospheric muons was measured and the results were compared with Monte Carlo simulations.

  20. Hepatocyte-specific NEMO deletion promotes NK/NKT cell- and TRAIL-dependent liver damage.

    Science.gov (United States)

    Beraza, Naiara; Malato, Yann; Sander, Leif E; Al-Masaoudi, Malika; Freimuth, Julia; Riethmacher, Dieter; Gores, Gregory J; Roskams, Tania; Liedtke, Christian; Trautwein, Christian

    2009-08-03

    Nuclear factor κB (NF-κB) is one of the main transcription factors involved in regulating apoptosis, inflammation, chronic liver disease, and cancer progression. The IKK complex mediates NF-κB activation, and deletion of its regulatory subunit NEMO in hepatocytes (NEMO(Δhepa)) triggers chronic inflammation and spontaneous hepatocellular carcinoma development. We show that NEMO(Δhepa) mice were resistant to Fas-mediated apoptosis but hypersensitive to tumor necrosis factor-related apoptosis-inducing ligand (TRAIL) as the result of a strong up-regulation of its receptor DR5 on hepatocytes. Additionally, natural killer (NK) cells, the main source of TRAIL, were activated in NEMO(Δhepa) livers. Interestingly, depletion of NK1.1(+) cells promoted a significant reduction of liver inflammation and an improvement of liver histology in NEMO(Δhepa) mice. Furthermore, hepatocyte-specific NEMO deletion strongly sensitized the liver to concanavalin A (ConA)-mediated injury. The critical role of the NK cell/TRAIL axis in NEMO(Δhepa) livers during ConA hepatitis was further confirmed by selective NK cell depletion and adoptive transfer of TRAIL-deficient mononuclear cells. Our results uncover an essential mechanism of NEMO-mediated protection of the liver by preventing NK cell tissue damage via TRAIL/DR5 signaling. As this mechanism is important in human liver diseases, NEMO(Δhepa) mice are an interesting tool to give insight into liver pathophysiology and to develop future therapeutic strategies.

  1. Evaluation of an operational ocean model configuration at 1/12° spatial resolution for the Indonesian seas (NEMO2.3/INDO12) - Part 2: Biogeochemistry

    Science.gov (United States)

    Gutknecht, Elodie; Reffray, Guillaume; Gehlen, Marion; Triyulianti, Iis; Berlianty, Dessy; Gaspar, Philippe

    2016-04-01

    In the framework of the INDESO (Infrastructure Development of Space Oceanography) project, an operational ocean forecasting system was developed to monitor the state of the Indonesian seas in terms of circulation, biogeochemistry and fisheries. This forecasting system combines a suite of numerical models connecting physical and biogeochemical variables to the population dynamics of large marine predators (tunas). The physical-biogeochemical coupled component (the INDO12BIO configuration) covers a large region extending from the western Pacific Ocean to the eastern Indian Ocean at 1/12° horizontal resolution. The NEMO-OPA (Nucleus for European Modelling of the Ocean) physical ocean model and the PISCES (Pelagic Interactions Scheme for Carbon and Ecosystem Studies) biogeochemical model run simultaneously ("online" coupling) at the same resolution. The operational global ocean forecasting system (1/4°) operated by Mercator Océan provides the physical forcing, while climatological open boundary conditions are prescribed for the biogeochemistry. This paper describes the skill assessment of the INDO12BIO configuration. Model skill is assessed by evaluating a reference hindcast simulation covering the last 8 years (2007-2014). Model results are compared to satellite, climatological and in situ observations. Diagnostics are performed on nutrients, oxygen, chlorophyll a, net primary production and mesozooplankton. The model reproduces the large-scale distributions of nutrients, oxygen, chlorophyll a, net primary production and mesozooplankton biomasses. Modelled vertical distributions of nutrients and oxygen are comparable to in situ data sets, although gradients are slightly smoothed. The model simulates realistic biogeochemical characteristics of North Pacific tropical waters entering the archipelago. Hydrodynamic transformation of water masses across the Indonesian archipelago allows for conserving nitrate and oxygen vertical distributions close to observations, in the

  2. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke. Report NAWCADPAX/TR-2012/194, Maryland, 26 June 2012.

  3. Evaluation of an operational ocean model configuration at 1/12° spatial resolution for the Indonesian seas (NEMO2.3/INDO12) - Part 1: Ocean physics

    Science.gov (United States)

    Tranchant, Benoît; Reffray, Guillaume; Greiner, Eric; Nugroho, Dwiyoga; Koch-Larrouy, Ariane; Gaspar, Philippe

    2016-03-01

    INDO12 is a 1/12° regional version of the NEMO physical ocean model covering the whole Indonesian EEZ (Exclusive Economic Zone). It has been developed and is now running every week in the framework of the INDESO (Infrastructure Development of Space Oceanography) project implemented by the Indonesian Ministry of Marine Affairs and Fisheries. The initial hydrographic conditions as well as open-boundary conditions are derived from the operational global ocean forecasting system at 1/4° operated by Mercator Océan. Atmospheric forcing fields (3-hourly ECMWF (European Centre for Medium-Range Weather Forecasts) analyses) are used to force the regional model. INDO12 is also forced by tidal currents and elevations, and by the inverse barometer effect. The turbulent mixing induced by internal tides is taken into account through a specific parameterisation. In this study we evaluate the model skill through comparisons with various data sets including outputs of the parent model, climatologies, in situ temperature and salinity measurements, and satellite data. The assessment of the biogeochemical model results is presented in a companion paper (Gutknecht et al., 2015). The simulated and altimeter-derived Eddy Kinetic Energy fields display similar patterns and confirm that tides are a dominant forcing in the area. The volume transport of the Indonesian throughflow (ITF) is in good agreement with the INSTANT estimates, while the transport through Luzon Strait is, on average, westward but probably too weak. Compared to satellite data, surface salinity and temperature fields display marked biases in the South China Sea. Significant water mass transformation occurs along the main routes of the ITF and compares well with observations. Vertical mixing is able to modify the South and North Pacific subtropical water salinity maximum, as seen in T-S diagrams. In spite of a few weaknesses, INDO12 proves able to provide a very realistic simulation of the ocean circulation and water mass

  4. Assessment of the sea-ice carbon pump: Insights from a three-dimensional ocean-sea-ice biogeochemical model (NEMO-LIM-PISCES)

    Directory of Open Access Journals (Sweden)

    Sébastien Moreau

    2016-08-01

    The role of sea ice in the carbon cycle is minimally represented in current Earth System Models (ESMs). Among potentially important flaws, mentioned by several authors and generally overlooked during ESM design, is the link between sea-ice growth and melt and oceanic dissolved inorganic carbon (DIC) and total alkalinity (TA). Here we investigate whether this link is indeed an important feature of the marine carbon cycle misrepresented in ESMs. We use an ocean general circulation model (NEMO-LIM-PISCES) with sea-ice and marine carbon cycle components, forced by atmospheric reanalyses, adding a first-order representation of DIC and TA storage and release in/from sea ice. Our results suggest that DIC rejection during sea-ice growth releases several hundred Tg C yr⁻¹ to the surface ocean, of which < 2% is exported to depth, leading to a notable but weak redistribution of DIC towards deep polar basins. Active carbon processes (mainly CaCO3 precipitation but also ice-atmosphere CO2 fluxes and net community production) increasing the TA/DIC ratio in sea ice modified ocean-atmosphere CO2 fluxes by a few Tg C yr⁻¹ in the sea-ice zone, with specific hemispheric effects: the DIC content of the Arctic basin decreased while that of the Southern Ocean increased. For the global ocean, DIC content increased by 4 Tg C yr⁻¹, or 2 Pg C after 500 years of model run. The simulated numbers are generally small compared to the present-day global ocean annual CO2 sink (2.6 ± 0.5 Pg C yr⁻¹). However, sea-ice carbon processes seem important at regional scales as they act significantly on DIC redistribution within and outside polar basins. The efficiency of carbon export to depth depends on the representation of surface-subsurface exchanges and their relationship with sea ice, and could differ substantially if a higher resolution or different ocean model were used.
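
A quick cross-check of the cumulative carbon figure quoted above: a constant DIC increase of 4 Tg C yr⁻¹ sustained over the 500-year run amounts to 2 Pg C, as the abstract states. A minimal sketch of the unit conversion:

```python
# Sanity check of the abstract's cumulative DIC figure:
# a 4 Tg C per year increase sustained over a 500-year model run.
TG_PER_PG = 1000.0  # 1 Pg C = 1000 Tg C

def cumulative_dic_pg(rate_tg_per_yr, years):
    """Integrate a constant annual DIC change (Tg C/yr) into a total in Pg C."""
    return rate_tg_per_yr * years / TG_PER_PG

total = cumulative_dic_pg(4.0, 500)
print(f"{total:.1f} Pg C")  # 2.0 Pg C, consistent with the abstract
```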

  5. Sea-ice evaluation of NEMO-Nordic 1.0: a NEMO–LIM3.6-based ocean–sea-ice model setup for the North Sea and Baltic Sea

    Directory of Open Access Journals (Sweden)

    P. Pemberton

    2017-08-01

    The Baltic Sea is a seasonally ice-covered marginal sea in northern Europe with intense wintertime ship traffic and a sensitive ecosystem. Understanding and modeling the evolution of the sea-ice pack is important for climate effect studies and forecasting purposes. Here we present and evaluate the sea-ice component of a new NEMO–LIM3.6-based ocean–sea-ice setup for the North Sea and Baltic Sea region (NEMO-Nordic). The setup includes a new depth-based fast-ice parametrization for the Baltic Sea. The evaluation focuses on long-term statistics, from a 45-year-long hindcast, although short-term daily performance is also briefly evaluated. We show that NEMO-Nordic is well suited for simulating the mean sea-ice extent, concentration, and thickness as compared to the best available observational data set. The variability of the annual maximum Baltic Sea ice extent is well in line with the observations, but the 1961–2006 trend is underestimated. Capturing the correct ice thickness distribution is more challenging. Based on the simulated ice thickness distribution we estimate the undeformed and deformed ice thickness and concentration in the Baltic Sea, which compare reasonably well with observations.
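
The abstract does not specify the depth-based fast-ice parametrization; purely as an illustration of what a landfast-ice criterion of this general kind looks like, ice can be flagged as fast where the water is shallow and the ice thick enough to anchor (function name and both thresholds below are hypothetical, not NEMO-Nordic's actual values):

```python
import numpy as np

# Illustrative depth-based fast-ice criterion (hypothetical thresholds).
# Cells flagged as landfast would typically have their ice velocity set
# to zero in the dynamics; only the mask itself is sketched here.

def fastice_mask(depth_m, ice_thickness_m, d_crit=15.0, h_crit=0.1):
    """Return a boolean mask of grid cells treated as landfast ice."""
    return (depth_m < d_crit) & (ice_thickness_m > h_crit)

depth = np.array([5.0, 12.0, 40.0, 8.0])   # bathymetry per cell (m)
hice = np.array([0.3, 0.05, 0.5, 0.2])     # ice thickness per cell (m)
print(fastice_mask(depth, hice))           # fast ice only where shallow AND thick
```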

  6. NEMO: Advanced energy systems and technologies

    Science.gov (United States)

    Lund, P.

    This report presents the contents and major results of the national research program on advanced energy systems and technologies (NEMO). The NEMO program was one of the energy research programs of the Ministry of Trade and Industry during 1988-1992; Helsinki University of Technology was responsible for the overall coordination of the program. NEMO has been the largest resource allocation to advanced energy systems in Finland so far, with a total budget of 70 million FIM. The focus of the program has been on solar energy, wind power, and energy storage; hydrogen and fuel cells have been included to a smaller extent. Useful, high-quality results have been obtained in all major fields of the NEMO program. Results of international significance include, among others, arctic wind energy, new approaches to the energy storage problem in solar energy applications, and the development of a completely new storage battery. International collaboration has been given high priority. The NEMO program has also been active in informing industry of the various business and utilization possibilities that advanced energy technologies offer; for example, major demonstration plants for each technology group have been realized. It is recommended that further R&D be focused still more on commercial applications: a good technology base should be maintained through research efforts at universities, whereas industry should take a stronger position in commercializing new technology. Parallel to technology R&D, more public resources should be allocated to market introduction.

  7. The NEMO project: A status report

    Science.gov (United States)

    Taiuti, M.; Aiello, S.; Ameli, F.; Amore, I.; Anghinolfi, M.; Anzalone, A.; Barbarino, G.; Battaglieri, M.; Bazzotti, M.; Bersani, A.; Beverini, N.; Biagi, S.; Bonori, M.; Bouhdaef, B.; Brunoldi, M.; Cacopardo, G.; Capone, A.; Caponetto, L.; Carminati, G.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Cordelli, M.; Costa, M.; D'Amico, A.; de Bonis, G.; de Rosa, G.; de Ruvo, G.; de Vita, R.; Distefano, C.; Falchini, E.; Flaminio, V.; Fratini, K.; Gabrielli, A.; Galatà, S.; Gandolfi, E.; Giacomelli, G.; Giorgi, F.; Giovanetti, G.; Grimaldi, A.; Habel, R.; Imbesi, M.; Kulikovsky, V.; Lattuada, D.; Leonora, E.; Lonardo, A.; Lo Presti, D.; Lucarelli, F.; Margiotta, A.; Marinelli, A.; Martini, A.; Masullo, R.; Migneco, E.; Minutoli, S.; Morganti, M.; Musico, P.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Osipenko, M.; Papaleo, R.; Pappalardo, V.; Piattelli, P.; Piombo, D.; Raia, G.; Randazzo, N.; Reito, S.; Ricco, G.; Riccobene, G.; Ripani, M.; Rovelli, A.; Ruppi, M.; Russo, G. V.; Russo, S.; Sapienza, P.; Sciliberto, D.; Sedita, M.; Shirokov, E.; Simeone, F.; Sipala, V.; Spurio, M.; Trasatti, L.; Urso, S.; Vecchi, M.; Vicini, P.; Wischnewski, R.

    2011-01-01

    The latest results and the activities towards the construction of a km³ Cherenkov neutrino detector carried out by the NEMO Collaboration are described. Long-term exploration of a 3500 m deep-sea site close to the Sicilian coast has shown that it is optimal for the installation of the detector. The NEMO Phase-1 project has validated several technologies proposed for the construction of the km³ detector on a test site at 2000 m depth. The new infrastructure set up on the candidate Capo Passero site as part of the Phase-2 project will provide the possibility to test detector components at 3500 m depth.

  8. The NEMO project: A status report

    Energy Technology Data Exchange (ETDEWEB)

    Taiuti, M., E-mail: Mauro.Taiuti@ge.infn.i [INFN Sezione di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Dipartimento di Fisica, Universita di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Aiello, S. [INFN Sezione di Catania, Via S. Sofia 64, 95123 Catania (Italy); Ameli, F. [INFN Sezione di Roma 1, P.le A. Moro 2, 00185 Roma (Italy); Amore, I. [Laboratori Nazionali del Sud INFN, Via S. Sofia 62, 95123 Catania (Italy); Dipartimento di Fisica e Astronomia, Universita di Catania, Via S. Sofia 64, 95123 Catania (Italy); Anghinolfi, M. [INFN Sezione di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Anzalone, A. [Laboratori Nazionali del Sud INFN, Via S. Sofia 62, 95123 Catania (Italy); Barbarino, G. [INFN Sezione di Napoli, Via Cintia, 80126 Napoli (Italy); Dipartimento di Scienze Fisiche, Universita di Napoli, Via Cintia, 80126 Napoli (Italy); Battaglieri, M. [INFN Sezione di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Bazzotti, M. [INFN Sezione di Bologna, V.le Berti Pichat 6/2, 40127 Bologna (Italy); Dipartimento di Fisica, Universita di Bologna, V.le Berti Pichat 6/2, 40127 Bologna (Italy); Bersani, A. [INFN Sezione di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Beverini, N. [INFN Sezione di Pisa, Polo Fibonacci, Largo B. Pontecorvo 3, 56127 Pisa (Italy); Dipartimento di Fisica, Universita di Pisa, Polo Fibonacci, Largo B. Pontecorvo 3, 56127 Pisa (Italy); Biagi, S. [INFN Sezione di Bologna, V.le Berti Pichat 6/2, 40127 Bologna (Italy); Dipartimento di Fisica, Universita di Bologna, V.le Berti Pichat 6/2, 40127 Bologna (Italy); Bonori, M. [INFN Sezione di Roma 1, P.le A. Moro 2, 00185 Roma (Italy); Dipartimento di Fisica, Universita di Roma La Sapienza, P.le A. Moro 2, 00185 Roma (Italy); Bouhdaef, B. [INFN Sezione di Pisa, Polo Fibonacci, Largo B. Pontecorvo 3, 56127 Pisa (Italy); Dipartimento di Fisica, Universita di Pisa, Polo Fibonacci, Largo B. Pontecorvo 3, 56127 Pisa (Italy)

    2011-01-21

    The latest results and the activities towards the construction of a km³ Cherenkov neutrino detector carried out by the NEMO Collaboration are described. Long-term exploration of a 3500 m deep-sea site close to the Sicilian coast has shown that it is optimal for the installation of the detector. The NEMO Phase-1 project has validated several technologies proposed for the construction of the km³ detector on a test site at 2000 m depth. The new infrastructure set up on the candidate Capo Passero site as part of the Phase-2 project will provide the possibility to test detector components at 3500 m depth.

  9. Comparing sea ice, hydrography and circulation between NEMO3.6 LIM3 and LIM2

    Science.gov (United States)

    Uotila, Petteri; Iovino, Doroteaciro; Vancoppenolle, Martin; Lensu, Mikko; Rousset, Clement

    2017-03-01

    A set of hindcast simulations with the new version 3.6 of the Nucleus for European Modelling of the Ocean (NEMO) ocean-ice model in the ORCA1 configuration and forced by the DRAKKAR Forcing Set version 5.2 (DFS5.2) atmospheric data was performed from 1958 to 2012. The simulations differed in their sea-ice component: the old standard version, the Louvain-la-Neuve Sea Ice Model (LIM2), and its successor LIM3. The main differences between these sea-ice models are the parameterisations of the sub-grid-scale sea-ice thickness distribution, ice deformation, thermodynamic processes, and sea-ice salinity. Our main objective was to analyse the sensitivity of the ocean-ice system to the change in sea-ice physics. Additional sensitivity simulations were carried out for the attribution of observed differences between the two main simulations. In the Arctic, NEMO-LIM3 compares better with observations by realistically reproducing the sea-ice extent decline during the last few decades, due to its multi-category sea-ice thickness distribution. In the Antarctic, NEMO-LIM3 more realistically simulates the seasonal evolution of sea-ice extent than NEMO-LIM2. In terms of oceanic properties, improvements are not as evident, although NEMO-LIM3 reproduces a more realistic hydrography in the Labrador Sea and in the Arctic Ocean, including a reduced cold temperature bias of the Arctic Intermediate Water at 250 m. In the extra-polar regions, the oceanographic conditions of the two NEMO-LIM versions remain relatively similar, although they slowly drift apart over decades. This drift is probably due to a stronger deep water formation around Antarctica in LIM3.

  10. Review paper of gateway selection schemes for MANET of NEMO (MANEMO)

    Science.gov (United States)

    Mahmood, Z.; Hashim, A.; Khalifa, O.; Anwar, F.; Hameed, S.

    2013-12-01

    The fast growth of Internet applications brings with it new challenges for researchers to provide solutions that guarantee better Internet access for mobile hosts and networks. The globally reachable, Home-Agent-based, infrastructure Network Mobility (NEMO) protocol and the local, multi-hop, infrastructure-less Mobile Ad hoc Network (MANET), both developed by the Internet Engineering Task Force (IETF), support different topologies of mobile networks. A new architecture was proposed combining both topologies to obtain Mobile Ad hoc NEMO (MANEMO). However, the integration of NEMO and MANET introduces many challenges, such as network loops, sub-optimal routes, the redundant tunnel problem, absence of communication without Home Agent reachability, and exit router selection when multiple Exit Routers to the Internet exist. This paper reviews the different models proposed to implement the gateway selection mechanism and highlights the strengths as well as the limitations of these approaches.
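
Many gateway selection schemes of the kind surveyed above can be abstracted as scoring each candidate exit router on a weighted combination of routing metrics and choosing the best score. The sketch below illustrates only that abstraction; the metric names, weights and data are hypothetical and do not reproduce any specific published scheme:

```python
# Hypothetical weighted-metric gateway (exit router) selection sketch.
# Lower hop count and load are preferred; higher link quality is preferred.
from dataclasses import dataclass

@dataclass
class Gateway:
    name: str
    hop_count: int       # hops from the mobile router to this exit router
    load: float          # normalized load, 0 (idle) .. 1 (saturated)
    link_quality: float  # normalized link quality, 0 (bad) .. 1 (good)

def score(gw, w_hops=0.5, w_load=0.3, w_lq=0.2):
    # Penalize hops and load, reward link quality (weights are assumptions).
    return -w_hops * gw.hop_count - w_load * gw.load + w_lq * gw.link_quality

def select_gateway(gateways):
    return max(gateways, key=score)

candidates = [
    Gateway("GW-A", hop_count=2, load=0.8, link_quality=0.9),
    Gateway("GW-B", hop_count=3, load=0.1, link_quality=0.7),
    Gateway("GW-C", hop_count=1, load=0.4, link_quality=0.6),
]
print(select_gateway(candidates).name)  # GW-C: nearest gateway wins here
```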

  11. Sensitivity of the northwestern Mediterranean Sea coastal and thermohaline circulations simulated by the 1/12°-resolution ocean model NEMO-MED12 to the spatial and temporal resolution of atmospheric forcing

    Science.gov (United States)

    Lebeaupin Brossier, Cindy; Béranger, Karine; Drobinski, Philippe

    The northwestern Mediterranean (NWM) Sea is prone to intense weather events, associated with high winds, that are characterized by strong shallow jets and high spatial and temporal variability. The ocean response in this area is very sensitive to the atmospheric conditions, particularly in the Gulf of Lions coastal zone. The ocean response to strong winds is here investigated using the NEMO-MED12 eddy-resolving model, driven by four atmospheric forcings differing in spatial resolution (20 km, 6.7 km) and temporal resolution (daily or 3 h) and produced with the non-hydrostatic mesoscale WRF model. The noticeable effects of the higher-frequency forcing are (i) to reduce the shelf dense-water formation and the deep offshore convection in winter, due to the explicit simulation of the diurnal cycle that warms and stratifies the ocean upper layers, and (ii) to increase the vertical velocity in the upwelling cells. The higher spatial resolution allows, in particular, the production of stronger winds and the accurate reproduction of near-surface sub-mesoscale eddies in the coastal areas, in agreement with observations.

  12. NEuronMOrphological analysis tool: open-source software for quantitative morphometrics

    Directory of Open Access Journals (Sweden)

    Lucia eBilleci

    2013-02-01

    Morphometric analysis of neurons and brain tissue is relevant to the study of neuron circuitry development during the first phases of brain growth or for probing the link between microstructural morphology and degenerative diseases. As neural imaging techniques become ever more sophisticated, so does the amount and complexity of data generated. The NEuronMOrphological analysis tool (NEMO) was purposely developed to handle and process large numbers of optical microscopy image files of neurons in culture or slices, in order to automatically run batch routines, store data and apply multivariate classification and feature extraction using 3-way principal component analysis. Here we describe the software's main features, underlining the differences between NEMO and other commercial and non-commercial image processing tools, and show an example of how NEMO can be used to classify neurons from wild-type mice and from animal models of autism.
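
The 3-way principal component analysis mentioned above can be illustrated in simplified form by unfolding a three-mode data cube along one mode and applying an ordinary SVD (a Tucker-1-style approximation). The shapes and random data below are invented for illustration and do not reflect NEMO's actual pipeline:

```python
import numpy as np

# A (neurons x features x images) cube, unfolded so each neuron is one row,
# then decomposed by SVD to extract per-neuron principal-component scores.
rng = np.random.default_rng(0)
cube = rng.standard_normal((20, 6, 5))   # 20 neurons, 6 morphometric features, 5 images

unfolded = cube.reshape(20, 6 * 5)       # mode-1 unfolding: one row per neuron
unfolded = unfolded - unfolded.mean(axis=0)  # center before PCA
U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)
scores = U[:, :2] * s[:2]                # 2 principal-component scores per neuron
print(scores.shape)                      # (20, 2): ready for classification
```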

  13. Multi-million Atom Electronic Structure Simulations using NEMO 3-D

    Science.gov (United States)

    Klimeck, Gerhard; Oyafuso, Fabiano; Boykin, Timothy B.; Bowen, R. Chris

    2002-03-01

    The detailed physical understanding of heterostructure interfaces enabled the creation of now well-developed devices such as quantum well lasers, quantum well detectors, heterostructure field-effect transistors and resonant tunneling diodes. The design and optimization of these devices and their implementation required the development and utilization of quantitative simulation tools. One such example is the nanoelectronic modeling tool (NEMO 1-D) originally developed by Texas Instruments. The need for such simulation tools is expected to only increase as device feature sizes and experimental characterization capabilities decrease and as manufacturing uncertainties increase. Quantum dots are a prototypical 3-D nanoelectronic device, and they have been studied extensively, experimentally and theoretically, in the past few years. The presentation will outline our recent developments to model such quantum dots on an atomistic level using the tight-binding method. The parallelization of the software on Intel-based Beowulfs and an SGI Origin will be discussed. Simulation domains consisting of several million atoms will be analyzed for the effects of random particle disorder, interfaces and confinement. More information about the work can be found at http://hpc.jpl.nasa.gov/PEP/gekco.

  14. The SuperNEMO tracking detector

    CERN Document Server

    Cascella, M

    2015-01-01

    The SuperNEMO detector will search for neutrinoless double beta decay at the Modane Underground Laboratory on the French-Italian border. This decay mode, if observed, would be proof that the neutrino is its own antiparticle, would constitute evidence for total lepton number violation, and could allow a measurement of the absolute neutrino mass. The SuperNEMO experiment is designed to reach a half-life sensitivity of $10^{26}$ years corresponding to an effective Majorana neutrino mass of $50-100~$meV. The SuperNEMO detector design allows complete topological reconstruction of the double beta decay event enabling excellent levels of background rejection. In the event of a discovery, such topological measurements will be vital in determining the nature of the lepton number violating process. This reconstruction will be performed by a gaseous tracking detector, consisting of 2034 drift cells per module operated in Geiger mode. The tracker of the Demonstrator Module is currently under construction in the UK. This ...

  15. AlignNemo: a local network alignment method to integrate homology and topology.

    Science.gov (United States)

    Ciriello, Giovanni; Mina, Marco; Guzzi, Pietro H; Cannataro, Mario; Guerra, Concettina

    2012-01-01

    Local network alignment is an important component of the analysis of protein-protein interaction networks that may lead to the identification of evolutionarily related complexes. We present AlignNemo, a new algorithm that, given the networks of two organisms, uncovers subnetworks of proteins that are related in biological function and in the topology of their interactions. The discovered conserved subnetworks have a general topology and need not correspond to specific interaction patterns, so that they more closely fit the models of functional complexes proposed in the literature. The algorithm is able to handle sparse interaction data with an expansion process that at each step explores the local topology of the networks beyond the proteins directly interacting with the current solution. To assess the performance of AlignNemo, we ran a series of benchmarks using statistical measures as well as biological knowledge. Based on reference datasets of protein complexes, AlignNemo shows better performance than other methods in terms of both precision and recall. We show our solutions to be biologically sound using the concept of semantic similarity applied to Gene Ontology vocabularies. The binaries of AlignNemo and supplementary details about the algorithms and the experiments are available at: sourceforge.net/p/alignnemo.
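
The expansion process described above can be caricatured as a greedy seed-growing loop over an adjacency structure. This generic sketch only illustrates the idea of growing a solution by examining the local topology around it; it is not the AlignNemo algorithm, and the toy graph is invented:

```python
# Greedy local-expansion sketch: starting from a seed node, repeatedly add
# the frontier node with the most edges into the current solution.
adj = {
    "a": {"b", "c"}, "b": {"a", "c", "d"}, "c": {"a", "b"},
    "d": {"b", "e"}, "e": {"d"},
}

def expand(seed, adj, steps=2):
    sol = {seed}
    for _ in range(steps):
        # candidates: neighbours of the solution not yet included
        frontier = {n for v in sol for n in adj[v]} - sol
        if not frontier:
            break
        # sorted() makes tie-breaking deterministic for this illustration
        best = max(sorted(frontier), key=lambda n: len(adj[n] & sol))
        sol.add(best)
    return sol

print(sorted(expand("a", adj)))  # ['a', 'b', 'c']: the dense triangle is found
```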

  16. Toward variational assimilation of SARAL/Altika altimeter data in a North Atlantic circulation model at eddy-permitting resolution: assessment of a NEMO-based 4D-VAR system

    Science.gov (United States)

    Bouttier, Pierre-Antoine; Brankart, Jean-Michel; Candille, Guillem; Vidard, Arthur; Blayo, Eric; Verron, Jacques; Brasseur, Pierre

    2015-04-01

    In this project, the response of a variational data assimilation system based on NEMO and its tangent-linear and adjoint model is investigated using a 4D-Var algorithm in a North Atlantic model at eddy-permitting resolution. The assimilated data consist of Jason-2 and SARAL/AltiKa data collected during the 2013-2014 period. The main objective is to explore the robustness of the 4D-Var algorithm in the context of a realistic turbulent oceanic circulation at mid-latitude constrained by multi-satellite altimetry missions. This work relies on two previous studies. First, a study with similar objectives was performed based on an academic double-gyre turbulent model and synthetic SARAL/AltiKa data, using the same DA experimental framework. Its main goal was to investigate the impact of turbulence on the performance of variational DA methods. The comparison with this previous work will bring to light the methodological and physical issues encountered by variational DA algorithms in a realistic context at a similar, eddy-permitting spatial resolution. We have also demonstrated how a dataset mimicking future SWOT observations improves incremental 4D-Var performance at eddy-permitting resolution. Then, in the context of the OSTST and FP7 SANGOMA projects, an ensemble DA experiment based on the same model and observational datasets has been realized (see poster by Brasseur et al.). This work offers the opportunity to compare the efficiency, pros and cons of both DA methods in the context of Ka-band altimetric data, at a spatial resolution commonly used today for research and operational applications. In this poster we will present the validation plan proposed to evaluate the skill of the variational experiment against ensemble assimilation experiments covering the same period, using independent observations (e.g. from the CryoSat-2 mission).
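
As background, incremental 4D-Var minimizes a quadratic cost function of the standard textbook form (generic notation, not the specifics of this NEMO-based system):

```latex
J(\delta\mathbf{x}) \;=\; \tfrac{1}{2}\,\delta\mathbf{x}^{\mathrm{T}}\mathbf{B}^{-1}\delta\mathbf{x}
\;+\; \tfrac{1}{2}\sum_{i=0}^{N}
\bigl(\mathbf{H}_i\,\mathbf{M}_{0\to i}\,\delta\mathbf{x}-\mathbf{d}_i\bigr)^{\mathrm{T}}
\mathbf{R}_i^{-1}
\bigl(\mathbf{H}_i\,\mathbf{M}_{0\to i}\,\delta\mathbf{x}-\mathbf{d}_i\bigr)
```

where δx is the increment to the background state, B and R_i the background- and observation-error covariances, M_{0→i} the tangent-linear model propagating the increment to observation time i, H_i the linearized observation operator, and d_i the innovation (observation minus background trajectory). The adjoint model supplies the gradient of J, which is why the tangent-linear and adjoint of NEMO mentioned above are needed.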

  17. The SuperNEMO double beta decay experiment

    OpenAIRE

    Nasteva, Irina; Collaboration, for the SuperNEMO

    2007-01-01

    The SuperNEMO project studies the feasibility of employing a technique of tracking plus calorimetry to search for neutrinoless double beta decay in 100 kg of enriched isotopes. It aims to reach an effective neutrino mass sensitivity of 50 meV. The current status of the SuperNEMO R&D programme is described, focusing on the main areas of improvement.

  18. Graphical Modeling Language Tool

    NARCIS (Netherlands)

    Rumnit, M.

    2003-01-01

    A group of the EE-Math-CS faculty of the University of Twente is developing a graphical modeling language for specifying concurrency in software design. This graphical modeling language has a mathematical background based on the theory of CSP. This language contains the power to create trustworthy

  19. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  20. Death receptor-independent FADD signalling triggers hepatitis and hepatocellular carcinoma in mice with liver parenchymal cell-specific NEMO knockout.

    Science.gov (United States)

    Ehlken, H; Krishna-Subramanian, S; Ochoa-Callejero, L; Kondylis, V; Nadi, N E; Straub, B K; Schirmacher, P; Walczak, H; Kollias, G; Pasparakis, M

    2014-11-01

    Hepatocellular carcinoma (HCC) usually develops in the context of chronic hepatitis triggered by viruses or toxic substances causing hepatocyte death, inflammation and compensatory proliferation of liver cells. Death receptors of the TNFR superfamily regulate cell death and inflammation and are implicated in liver disease and cancer. Liver parenchymal cell-specific ablation of NEMO/IKKγ, a subunit of the IκB kinase (IKK) complex that is essential for the activation of canonical NF-κB signalling, sensitized hepatocytes to apoptosis and caused the spontaneous development of chronic hepatitis and HCC in mice. Here we show that hepatitis and HCC development in NEMO(LPC-KO) mice is triggered by death receptor-independent, FADD-mediated hepatocyte apoptosis. TNF deficiency in all cells or conditional LPC-specific ablation of TNFR1, Fas or TRAIL-R did not prevent hepatocyte apoptosis, hepatitis and HCC development in NEMO(LPC-KO) mice. To address potential functional redundancies between death receptors we generated and analysed NEMO(LPC-KO) mice with combined LPC-specific deficiency of TNFR1, Fas and TRAIL-R and found that the simultaneous lack of all three death receptors also did not prevent hepatocyte apoptosis, chronic hepatitis and HCC development. However, LPC-specific combined deficiency in TNFR1, Fas and TRAIL-R protected the NEMO-deficient liver from LPS-induced liver failure, showing that different mechanisms trigger spontaneous and LPS-induced hepatocyte apoptosis in NEMO(LPC-KO) mice. In addition, NK cell depletion did not prevent liver damage and hepatitis. Moreover, NEMO(LPC-KO) mice crossed into a RAG-1-deficient genetic background developed hepatitis and HCC. Collectively, these results show that the spontaneous development of hepatocyte apoptosis, chronic hepatitis and HCC in NEMO(LPC-KO) mice occurs independently of death receptor signalling, NK cells and B and T lymphocytes, arguing against an immunological trigger as the critical stimulus driving

  1. The animated film "Finding Nemo" is turning children into fish killers

    Index Scriptorium Estoniae

    2004-01-01

    Animated film "Kalapoeg Nemo" ("Finding Nemo"): director Andrew Stanton: United States, 2003. After seeing the film, thousands of children have released their aquarium fish into the wild, causing the fishes' death or environmental problems.

  2. SuperNEMO - the next generation double beta decay experiment

    CERN Document Server

    Nasteva, Irina

    2009-01-01

    The SuperNEMO experiment is being designed to search for neutrinoless double beta decay to test whether neutrinos are Majorana particles. The experimental technique follows that of the currently running NEMO-3 experiment, which successfully combines tracking and calorimetry to measure the topology and energy of the final-state electrons. The unique particle identification capabilities of SuperNEMO will be employed with about 100 kg of ⁸²Se and will reach sensitivity to a half-life of about 2 × 10^26 years, which corresponds to Majorana neutrino masses of about 50 meV, depending on the calculated value of the nuclear matrix element. In this poster, the current status of the SuperNEMO project is presented.
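As a back-of-the-envelope check on the scale of these numbers, the decay rate implied by a half-life of 2 × 10^26 years in 100 kg of ⁸²Se follows from the standard relation rate = N·ln2/T½. The sketch below assumes full isotopic enrichment and 100% detection efficiency, which real experiments do not achieve:

```python
import math

AVOGADRO = 6.02214076e23  # atoms per mole

def decays_per_year(isotope_mass_g: float, molar_mass_g: float,
                    half_life_yr: float) -> float:
    """Expected decays per year: rate = ln(2) * N_atoms / T_half.
    Idealized: assumes full enrichment and 100% detection efficiency."""
    n_atoms = isotope_mass_g / molar_mass_g * AVOGADRO
    return math.log(2) * n_atoms / half_life_yr

# 100 kg of 82Se at the quoted half-life sensitivity of 2e26 years:
rate = decays_per_year(100_000, 82.0, 2e26)
print(f"{rate:.1f} decays/year")  # only a handful of events per year
```

This illustrates why such searches need very large isotope masses and extremely low backgrounds: even at the target sensitivity, only a few signal decays per year would occur.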

  3. Brain endothelial TAK1 and NEMO safeguard the neurovascular unit

    Science.gov (United States)

    Ridder, Dirk A.; Wenzel, Jan; Müller, Kristin; Töllner, Kathrin; Tong, Xin-Kang; Assmann, Julian C.; Stroobants, Stijn; Weber, Tobias; Niturad, Cristina; Fischer, Lisanne; Lembrich, Beate; Wolburg, Hartwig; Grand’Maison, Marilyn; Papadopoulos, Panayiota; Korpos, Eva; Truchetet, Francois; Rades, Dirk; Sorokin, Lydia M.; Schmidt-Supprian, Marc; Bedell, Barry J.; Pasparakis, Manolis; Balschun, Detlef; D’Hooge, Rudi; Löscher, Wolfgang; Hamel, Edith

    2015-01-01

    Inactivating mutations of the NF-κB essential modulator (NEMO), a key component of NF-κB signaling, cause the genetic disease incontinentia pigmenti (IP). This leads to severe neurological symptoms, but the mechanisms underlying brain involvement were unclear. Here, we show that selectively deleting Nemo or the upstream kinase Tak1 in brain endothelial cells resulted in death of endothelial cells, a rarefaction of brain microvessels, cerebral hypoperfusion, a disrupted blood–brain barrier (BBB), and epileptic seizures. TAK1 and NEMO protected the BBB by activating the transcription factor NF-κB and stabilizing the tight junction protein occludin. They also prevented brain endothelial cell death in a NF-κB–independent manner by reducing oxidative damage. Our data identify crucial functions of inflammatory TAK1–NEMO signaling in protecting the brain endothelium and maintaining normal brain function, thus explaining the neurological symptoms associated with IP. PMID:26347470

  4. PORFIDO on the NEMO Phase 2 tower

    Energy Technology Data Exchange (ETDEWEB)

    Ciaffoni, Orlando; Cordelli, Marco; Habel, Roberto; Martini, Agnese; Trasatti, Luciano [INFN-Laboratori Nazionali di Frascati, Via E. Fermi 40, I-00044 Frascati (RM) (Italy)

    2014-11-18

    We have designed and built an underwater measurement system, PORFIDO (Physical Oceanography by RFID Outreach), to gather oceanographic data from the Optical Modules of a neutrino telescope with a minimum of disturbance to the main installation. PORFIDO is composed of a sensor glued to the outside of an Optical Module, in contact with seawater, and of a reader placed inside the sphere, facing the sensor. Data are transmitted to the reader through the glass by RFID and to shore in real time for periods of years. The sensor gathers power from the radio frequency, thus eliminating the need for batteries or connectors through the glass. We deployed four PORFIDO probes measuring temperatures with the NEMO-KM3Net-Italy Phase 2 tower in April 2013. The four probes are operative and are transmitting temperature data from 3500 m depth.

  5. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as following decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers...... the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall...... trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless...

  6. Epithelial NEMO links innate immunity to chronic intestinal inflammation.

    Science.gov (United States)

    Nenci, Arianna; Becker, Christoph; Wullaert, Andy; Gareus, Ralph; van Loo, Geert; Danese, Silvio; Huth, Marion; Nikolaev, Alexei; Neufert, Clemens; Madison, Blair; Gumucio, Deborah; Neurath, Markus F; Pasparakis, Manolis

    2007-03-29

    Deregulation of intestinal immune responses seems to have a principal function in the pathogenesis of inflammatory bowel disease. The gut epithelium is critically involved in the maintenance of intestinal immune homeostasis-acting as a physical barrier separating luminal bacteria and immune cells, and also expressing antimicrobial peptides. However, the molecular mechanisms that control this function of gut epithelial cells are poorly understood. Here we show that the transcription factor NF-kappaB, a master regulator of pro-inflammatory responses, functions in gut epithelial cells to control epithelial integrity and the interaction between the mucosal immune system and gut microflora. Intestinal epithelial-cell-specific inhibition of NF-kappaB through conditional ablation of NEMO (also called IkappaB kinase-gamma (IKKgamma)) or both IKK1 (IKKalpha) and IKK2 (IKKbeta)-IKK subunits essential for NF-kappaB activation-spontaneously caused severe chronic intestinal inflammation in mice. NF-kappaB deficiency led to apoptosis of colonic epithelial cells, impaired expression of antimicrobial peptides and translocation of bacteria into the mucosa. Concurrently, this epithelial defect triggered a chronic inflammatory response in the colon, initially dominated by innate immune cells but later also involving T lymphocytes. Deficiency of the gene encoding the adaptor protein MyD88 prevented the development of intestinal inflammation, demonstrating that Toll-like receptor activation by intestinal bacteria is essential for disease pathogenesis in this mouse model. Furthermore, NEMO deficiency sensitized epithelial cells to tumour-necrosis factor (TNF)-induced apoptosis, whereas TNF receptor-1 inactivation inhibited intestinal inflammation, demonstrating that TNF receptor-1 signalling is crucial for disease induction. These findings demonstrate that a primary NF-kappaB signalling defect in intestinal epithelial cells disrupts immune homeostasis in the gastrointestinal tract

  7. Modeling Tool Advances Rotorcraft Design

    Science.gov (United States)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  8. Expanding the substantial interactome of NEMO using protein microarrays.

    LENUS (Irish Health Repository)

    Fenner, Beau J

    2010-01-01

    Signal transduction by the NF-kappaB pathway is a key regulator of a host of cellular responses to extracellular and intracellular messages. The NEMO adaptor protein lies at the top of this pathway and serves as a molecular conduit, connecting signals transmitted from upstream sensors to the downstream NF-kappaB transcription factor and subsequent gene activation. The position of NEMO within this pathway makes it an attractive target from which to search for new proteins that link NF-kappaB signaling to additional pathways and upstream effectors. In this work, we have used protein microarrays to identify novel NEMO interactors. A total of 112 protein interactors were identified, with the most statistically significant hit being the canonical NEMO interactor IKKbeta, with IKKalpha also being identified. Of the novel interactors, more than 30% were kinases, while at least 25% were involved in signal transduction. Binding of NEMO to several interactors, including CALB1, CDK2, SAG, SENP2 and SYT1, was confirmed using GST pulldown assays and coimmunoprecipitation, validating the initial screening approach. Overexpression of CALB1, CDK2 and SAG was found to stimulate transcriptional activation by NF-kappaB, while SYT1 overexpression repressed TNFalpha-dependent NF-kappaB transcriptional activation in human embryonic kidney cells. Corresponding with this finding, RNA silencing of CDK2, SAG and SENP2 reduced NF-kappaB transcriptional activation, supporting a positive role for these proteins in the NF-kappaB pathway. The identification of a host of new NEMO interactors opens up new research opportunities to improve understanding of this essential cell signaling pathway.

  9. First results from the NEMO Phase 1 experiment

    CERN Document Server

    Amore, Isabella

    2008-01-01

    The NEMO prototype detector, called "NEMO Phase-1", was successfully operated at 2000 m depth from December 2006 to May 2007. The apparatus comprises a Junction Box and a Mini-Tower hosting 16 optical sensors. Preliminary results are presented. Positions of the optical sensors in the Mini-Tower were reconstructed through the acoustic positioning system with high accuracy. Environmental parameters were analyzed. From data corresponding to a livetime of 11.3 hours, atmospheric muon tracks were reconstructed and their angular distributions were measured and compared with Monte Carlo simulations.

  10. Neutrino Physics without Neutrinos: Recent results from the NEMO-3 experiment and plans for SuperNEMO

    CERN Document Server

    CERN. Geneva

    2015-01-01

    The observation of neutrino oscillations has proved that neutrinos have mass. This discovery has renewed and strengthened the interest in neutrinoless double beta decay experiments which provide the only practical way to determine whether neutrinos are Majorana or Dirac particles. The recently completed NEMO-3 experiment, located in the Laboratoire Souterrain de Modane in the Frejus Tunnel, was an experiment searching for neutrinoless double beta decays using a powerful technique for detecting a two-electron final state by employing an apparatus combining tracking, calorimetry, and the time-of-flight measurements. We will present latest results from NEMO-3 and will discuss the status of SuperNEMO, the next generation experiment that will exploit the same experimental technique to extend the sensitivity of the current search.

  11. Hepatocyte-specific NEMO deletion promotes NK/NKT cell– and TRAIL-dependent liver damage

    Science.gov (United States)

    Beraza, Naiara; Malato, Yann; Sander, Leif E.; Al-Masaoudi, Malika; Freimuth, Julia; Riethmacher, Dieter; Gores, Gregory J.; Roskams, Tania; Liedtke, Christian

    2009-01-01

    Nuclear factor κB (NF-κB) is one of the main transcription factors involved in regulating apoptosis, inflammation, chronic liver disease, and cancer progression. The IKK complex mediates NF-κB activation and deletion of its regulatory subunit NEMO in hepatocytes (NEMOΔhepa) triggers chronic inflammation and spontaneous hepatocellular carcinoma development. We show that NEMOΔhepa mice were resistant to Fas-mediated apoptosis but hypersensitive to tumor necrosis factor–related apoptosis-inducing ligand (TRAIL) as the result of a strong up-regulation of its receptor DR5 on hepatocytes. Additionally, natural killer (NK) cells, the main source of TRAIL, were activated in NEMOΔhepa livers. Interestingly, depletion of the NK1.1+ cells promoted a significant reduction of liver inflammation and an improvement of liver histology in NEMOΔhepa mice. Furthermore, hepatocyte-specific NEMO deletion strongly sensitized the liver to concanavalin A (ConA)–mediated injury. The critical role of the NK cell/TRAIL axis in NEMOΔhepa livers during ConA hepatitis was further confirmed by selective NK cell depletion and adoptive transfer of TRAIL-deficient mononuclear cells. Our results uncover an essential mechanism of NEMO-mediated protection of the liver by preventing NK cell tissue damage via TRAIL/DR5 signaling. As this mechanism is important in human liver diseases, NEMOΔhepa mice are an interesting tool to give insight into liver pathophysiology and to develop future therapeutic strategies. PMID:19635861

  12. Nutritional education for management of osteodystrophy (NEMO) trial: Design and patient characteristics, Lebanon.

    Science.gov (United States)

    Karavetian, Mirey; Abboud, Saade; Elzein, Hafez; Haydar, Sarah; de Vries, Nanne

    2014-02-01

    This study aims to determine the effect of a trained dedicated dietitian on clinical outcomes among Lebanese hemodialysis (HD) patients, and thus demonstrate a viable developing-country model. This paper describes the study protocol and baseline data. The study was a multicenter randomized controlled trial with parallel-group design involving 12 HD units: assigned to cluster A (n = 6) or B (n = 6). A total of 570 patients met the inclusion criteria. Patients in cluster A were randomly assigned as per dialysis shift to the following: Dedicated Dietitian (DD) (n = 133) and Existing Practice (EP) (n = 138) protocols. Cluster B patients (n = 299) received the Trained Hospital Dietitian (THD) protocol. Dietitians of the DD and THD groups were trained by the research team on Kidney Disease Outcomes Quality Initiative nutrition guidelines. The DD protocol included individualized nutrition education for 2 hours/month/HD patient for 6 months, focusing on renal osteodystrophy and using the Trans-theoretical theory for behavioral change. The EP protocol included nutrition education given to patients by hospital dietitians who were blinded to the study. The THD protocol included nutrition education given to patients by the hospital dietitian as per the training received but within hospital responsibilities, with no set educational protocol or tools. Baseline data revealed that 40% of patients were hyperphosphatemic (> 5.5 mg/dl) with low dietary adherence and knowledge of dietary P restriction, in addition to inadequate daily protein intake (58.86% ± 33.87% of needs) yet adequate dietary P intake (795.52 ± 366.94 mg/day). Quality of life (QOL) ranged from 48-75% of full health. Baseline differences between the 3 groups revealed significant differences in serum P, malnutrition status, adherence to diet and P chelators, and in 2 factors of the QOL: physical and social functioning. The data show room for improvement in the nutritional status of the patients. The NEMO trial may be able to

  13. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...
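The record describes a tool for pre-FEC BER estimation of PM-QPSK alien wavelengths. The tool itself is not detailed here, but the textbook AWGN relation for Gray-coded QPSK, BER = ½·erfc(√(Eb/N0)), gives a minimal illustration of what such an estimator computes. The FEC threshold in the comment is an illustrative figure, not a value from the record:

```python
import math

def qpsk_pre_fec_ber(ebn0_db: float) -> float:
    """Theoretical pre-FEC bit error rate for Gray-coded (PM-)QPSK on an
    AWGN channel: BER = 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10.0)  # convert dB to linear
    return 0.5 * math.erfc(math.sqrt(ebn0))

# A hard-decision FEC with ~7% overhead typically corrects up to a
# pre-FEC BER threshold around 3.8e-3 (illustrative figure).
if __name__ == "__main__":
    for db in (4, 6, 8, 10):
        print(f"Eb/N0 = {db:2d} dB -> pre-FEC BER = {qpsk_pre_fec_ber(db):.2e}")
```

In practice a planning tool compares the estimated pre-FEC BER against the FEC correction threshold to decide whether an alien wavelength can be provisioned on a given path.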

  14. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  15. Quantification of cellular NEMO content and its impact on NF-κB activation by genotoxic stress.

    Directory of Open Access Journals (Sweden)

    Byounghoon Hwang

    NF-κB essential modulator, NEMO, plays a key role in canonical NF-κB signaling induced by a variety of stimuli, including cytokines and genotoxic agents. To dissect the different biochemical and functional roles of NEMO in NF-κB signaling, various mutant forms of NEMO have been previously analyzed. However, transient or stable overexpression of wild-type NEMO can significantly inhibit NF-κB activation, thereby confounding the analysis of NEMO mutant phenotypes. What levels of NEMO overexpression lead to such an artifact and what levels are tolerated with no significant impact on NEMO function in NF-κB activation are currently unknown. Here we purified full-length recombinant human NEMO protein and used it as a standard to quantify the average number of NEMO molecules per cell in a 1.3E2 NEMO-deficient murine pre-B cell clone stably reconstituted with full-length human NEMO (C5). We determined that the C5 cell clone has an average of 4 × 10^5 molecules of NEMO per cell. Stable reconstitution of 1.3E2 cells with different numbers of NEMO molecules per cell has demonstrated that a 10-fold range of NEMO expression (0.6-6 × 10^5 molecules per cell) yields statistically equivalent NF-κB activation in response to the DNA-damaging agent etoposide. Using the C5 cell line, we also quantified the number of NEMO molecules per cell in several commonly employed human cell lines. These results establish baseline numbers of endogenous NEMO per cell and highlight surprisingly normal functionality of NEMO in the DNA damage pathway over a wide range of expression levels that can provide a guideline for future NEMO reconstitution studies.
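The per-cell copy-number arithmetic underlying such a quantification (protein mass measured against a purified recombinant standard, converted via molecular weight and Avogadro's number) can be sketched as follows. The ~48 kDa molecular weight and the input masses are illustrative assumptions, not values from the study:

```python
AVOGADRO = 6.02214076e23  # molecules per mole

def molecules_per_cell(total_mass_ng: float, n_cells: float,
                       mol_weight_da: float) -> float:
    """Convert a protein mass quantified against a recombinant standard
    into an average copy number per cell (Daltons ~ g/mol)."""
    mass_g = total_mass_ng * 1e-9
    moles = mass_g / mol_weight_da
    return moles * AVOGADRO / n_cells

# Illustrative inputs: ~32 ng of NEMO in 1e6 cells, NEMO ~ 48 kDa;
# this lands on the order of 4e5 molecules per cell.
copies = molecules_per_cell(32.0, 1e6, 48_000)
print(f"{copies:.2e} molecules per cell")
```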

  16. Fire behavior modeling-a decision tool

    Science.gov (United States)

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  17. Spatial Modeling Tools for Cell Biology

    Science.gov (United States)

    2006-10-01

    Only table-of-contents and figure-caption fragments are recoverable from this record, among them "Figure 5.1: Computational results for a diffusion problem on planar square thin film" and "Figure 4.1: Cell biology" (a simulation-environment diagram listing the Open Microscopy Environment, Pre-CoBi model library, CFDRC CoBi tools, and JigCell tools).

  18. NF-κB Essential Modulator (NEMO) Is Critical for Thyroid Function.

    Science.gov (United States)

    Reale, Carla; Iervolino, Anna; Scudiero, Ivan; Ferravante, Angela; D'Andrea, Luca Egildo; Mazzone, Pellegrino; Zotti, Tiziana; Leonardi, Antonio; Roberto, Luca; Zannini, Mariastella; de Cristofaro, Tiziana; Shanmugakonar, Muralitharan; Capasso, Giovambattista; Pasparakis, Manolis; Vito, Pasquale; Stilo, Romania

    2016-03-11

    The I-κB kinase (IKK) subunit NEMO/IKKγ (NEMO) is an adapter molecule that is critical for canonical activation of NF-κB, a pleiotropic transcription factor controlling immunity, differentiation, cell growth, tumorigenesis, and apoptosis. To explore the functional role of canonical NF-κB signaling in thyroid gland differentiation and function, we have generated a murine strain bearing a genetic deletion of the NEMO locus in thyroid. Here we show that thyrocyte-specific NEMO knock-out mice gradually develop hypothyroidism after birth, which leads to reduced body weight and shortened life span. Histological and molecular analysis indicate that absence of NEMO in thyrocytes results in a dramatic loss of the thyroid gland cellularity, associated with down-regulation of thyroid differentiation markers and ongoing apoptosis. Thus, NEMO-dependent signaling is essential for normal thyroid physiology.

  19. NF-κB Essential Modulator (NEMO) Is Critical for Thyroid Function*

    Science.gov (United States)

    Reale, Carla; Iervolino, Anna; Scudiero, Ivan; Ferravante, Angela; D'Andrea, Luca Egildo; Mazzone, Pellegrino; Zotti, Tiziana; Leonardi, Antonio; Roberto, Luca; Zannini, Mariastella; de Cristofaro, Tiziana; Shanmugakonar, Muralitharan; Capasso, Giovambattista; Pasparakis, Manolis; Vito, Pasquale; Stilo, Romania

    2016-01-01

    The I-κB kinase (IKK) subunit NEMO/IKKγ (NEMO) is an adapter molecule that is critical for canonical activation of NF-κB, a pleiotropic transcription factor controlling immunity, differentiation, cell growth, tumorigenesis, and apoptosis. To explore the functional role of canonical NF-κB signaling in thyroid gland differentiation and function, we have generated a murine strain bearing a genetic deletion of the NEMO locus in thyroid. Here we show that thyrocyte-specific NEMO knock-out mice gradually develop hypothyroidism after birth, which leads to reduced body weight and shortened life span. Histological and molecular analysis indicate that absence of NEMO in thyrocytes results in a dramatic loss of the thyroid gland cellularity, associated with down-regulation of thyroid differentiation markers and ongoing apoptosis. Thus, NEMO-dependent signaling is essential for normal thyroid physiology. PMID:26786105

  20. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified tools list one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, the level of agreement was high enough for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
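The essential/recommended split produced by such a survey can be illustrated with a minimal tally over expert ratings. The 70% agreement threshold and the rating labels below are assumptions for illustration, not the consensus criterion used in the study:

```python
def classify_requirements(votes: dict, essential_threshold: float = 0.7) -> dict:
    """Classify each requirement as 'essential' or 'recommended' from
    expert ratings. `votes` maps requirement -> list of 'E'/'R' ratings;
    the 70% agreement threshold is an assumed illustration."""
    result = {}
    for req, ratings in votes.items():
        share_essential = ratings.count("E") / len(ratings)
        result[req] = ("essential" if share_essential >= essential_threshold
                       else "recommended")
    return result

# Hypothetical second-round ratings for two requirements:
classified = classify_requirements({
    "versioning of models": ["E"] * 8 + ["R"] * 2,   # 80% essential
    "export to spreadsheet": ["E"] * 3 + ["R"] * 7,  # 30% essential
})
```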

  1. UK low-background infrastructure for delivering SuperNEMO

    CERN Document Server

    Liu, Xin Ran

    2015-01-01

    SuperNEMO is a next-generation neutrinoless double beta decay experiment with a design capability to reach a half-life sensitivity of 10^26 years, corresponding to an effective Majorana neutrino mass of ⟨m_ββ⟩ < 50-100 meV. To achieve this sensitivity, stringent radio-purity requirements are imposed, resulting in an equally stringent screening programme. Dedicated facilities have been established in the UK for screening and selection of detector construction materials. Gamma-ray spectroscopy using high-purity germanium (HPGe) detectors has been the standard method for the measurement of material contamination. A low-background facility has been established at Boulby Underground Laboratory. First results from the two current HPGe detectors are shown. Radon is one of the most critical backgrounds for SuperNEMO and most other low-background experiments. It can enter the detector either through diffusion, contamination during construction or emanation from the detector material...

  2. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define...... a taxonomy of aspects around conservation, constraints and constitutive relations. Aspects of the ICAS-MoT toolbox are given to illustrate the functionality of a computer aided modelling tool, which incorporates an interface to MS Excel....

  3. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using the internal Levenberg-Marquardt algorithm
    - model calibration using the lmfit package
    - model calibration using the levmar package
    - Markov Chain Monte Carlo using the pymc package
    MATK also facilitates model analysis using scipy for calibration (scipy.optimize) and rpy2 as a Python interface to R.
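Of the capabilities listed, Latin-Hypercube sampling of parameters is simple enough to sketch directly. The following stand-alone implementation (an illustration of the technique, not MATK's actual API) splits each parameter's range into N equal strata and places exactly one sample in every stratum:

```python
import random

def latin_hypercube(n_samples, bounds, seed=None):
    """Latin-Hypercube sample: for each parameter (lo, hi) in `bounds`,
    the range is split into n_samples equal strata and every stratum
    receives exactly one point; strata are paired randomly across
    dimensions. Returns a list of n_samples parameter tuples."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        # one uniform draw inside each of the n_samples strata of [0, 1)
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(col)  # decouple this dimension from the others
        columns.append([lo + u * (hi - lo) for u in col])
    # transpose columns into sample points
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

# e.g. 10 samples of two parameters with different ranges
samples = latin_hypercube(10, bounds=[(0.0, 1.0), (50.0, 150.0)], seed=42)
```

Compared with plain random sampling, this guarantees that each parameter's range is covered evenly even for small sample counts, which is why it is a standard choice for parameter studies.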

  4. Sensitivity and pointing accuracy of the NEMO km³ telescope

    CERN Document Server

    Distefano, C

    2006-01-01

    In this paper we present the results of Monte Carlo simulation studies on the capability of the proposed NEMO km³ telescope to detect high energy neutrinos. We calculated the detector sensitivity to muon neutrinos coming from a generic point-like source. We also simulated the lack of atmospheric muons in correspondence with the Moon disk in order to determine the detector angular resolution and to check the absolute pointing capability.

  5. Sensitivity of the NEMO telescope to neutrinos from microquasars

    OpenAIRE

    Distefano, C.

    2006-01-01

    We present the results of Monte Carlo simulation studies of the capability of the proposed NEMO telescope to detect TeV muon neutrinos from Galactic microquasars. In particular we determined the number of the detectable events from each known microquasar together with the expected atmospheric neutrino and muon background events. We also discuss the detector sensitivity to neutrino fluxes expected from microquasars, optimizing the event selection in order to reject the atmospheric background, ...

  6. Sensitivity of the NEMO detector to galactic microquasars

    OpenAIRE

    Distefano, C.

    2007-01-01

    We present the results of Monte Carlo simulation studies of the capability of the proposed NEMO km³ telescope to detect TeV muon neutrinos from Galactic microquasars. In particular we determined the detector sensitivity to each known microquasar, optimizing the event selection in order to reject the atmospheric background. We also determined the expected number of source and background events surviving the selection.

  7. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  8. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

    In this paper, an integrated simulation optimization model for the inventory system is developed. An effective algorithm is developed to evaluate and analyze the back-end stored simulation results. This paper proposes the simulation tool SIMIN (Inventory Simulation) to simulate inventory models. SIMIN is a tool which simulates and compares the results of different inventory models. To overcome various practical restrictive assumptions, SIMIN provides values for a number of performance measurement...

  9. NEMO on the shelf: assessment of the Iberia–Biscay–Ireland configuration

    Directory of Open Access Journals (Sweden)

    C. Maraldi

    2013-08-01

    This work describes the design and validation of a high-resolution (1/36°) ocean forecasting model over the "Iberian–Biscay–Irish" (IBI) area. The system has been set up using the NEMO model (Nucleus for European Modelling of the Ocean). New developments have been incorporated in NEMO to make it suitable for open-ocean as well as coastal-ocean modelling. In this paper, we pursue three main objectives: (1) to give an overview of the model configuration used for the simulations; (2) to give a broad-brush account of one particular aspect of this work, namely consistency verification; this type of validation is conducted upstream of the implementation of the system before it is used for production and routinely validated; it is meant to guide model development in identifying gross deficiencies in the modelling of several key physical processes; and (3) to show that such a regional modelling system has potential as a complement to patchy observations (an integrated approach) to give information on non-observed physical quantities and to provide links between observations by identifying broader-scale patterns and processes. We concentrate on the year 2008. We first provide domain-wide consistency verification results in terms of barotropic tides, transports, sea surface temperature and stratification. We then focus on two dynamical subregions: the Celtic shelves and the Bay of Biscay slope and deep regions. The model–data consistency is checked for variables and processes such as tidal currents, tidal fronts, internal tides and residual elevation. We also examine the representation in the model of a seasonal pattern of the Bay of Biscay circulation: the warm extension of the Iberian Poleward Current along the northern Spanish coast (the Navidad event) in the winter of 2007–2008.

  10. NEMO on the shelf: assessment of the Iberia-Biscay-Ireland configuration

    Directory of Open Access Journals (Sweden)

    C. Maraldi

    2013-01-01

    The Iberia-Biscay-Ireland (IBI) system serves one of the 7 MyOcean "Monitoring and Forecasting Centres". A high-resolution simulation covering the IBI region is set up over July 2007–February 2009. The NEMO (Nucleus for European Modelling of the Ocean) model is used with a 1/36° horizontal resolution and 50 z-levels in the vertical. New developments have been incorporated in NEMO to make it suitable for open- as well as coastal-ocean modelling. In this paper, we pursue three main objectives: (1) give an overview of the model configuration used for the simulations; (2) give a broad-brush account of one particular aspect of this work, namely consistency verification; this type of validation is conducted upstream of the implementation of the system, before it is used for production and routinely validated, and is meant to guide model development by identifying gross deficiencies in the modelling of several key physical processes; (3) show that such a regional modelling system has potential as a complement to patchy observations (an integrated approach), giving information on non-observed physical quantities and providing links between observations by identifying broader-scale patterns and processes. We concentrate on the year 2008. We first provide domain-wide consistency verification results in terms of barotropic tides, transports, sea surface temperature and stratification. We then focus on two dynamical sub-regions: the Celtic shelves and the Bay of Biscay slope and deep regions. The model-data consistency is checked for variables and processes such as tidal currents, tidal fronts, internal tides and residual elevation. We also examine the representation in the model of a seasonal pattern of the Bay of Biscay circulation: the warm extension of the Iberian Poleward Current along the northern Spanish coast (Navidad event) in winter 2007–2008.
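
The domain-wide consistency checks described above (model–data comparison for fields such as sea surface temperature) typically reduce to simple statistics like bias and root-mean-square error. A minimal sketch follows; the function and the sample numbers are illustrative, not taken from the IBI system.

```python
import math

def consistency_stats(model, obs):
    """Bias and RMSE between co-located model and observed values.

    Generic illustration of a model-data consistency metric; neither the
    function nor the numbers below come from the paper itself.
    """
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Toy check: a "model" SST series warm-biased by 0.2 degC vs. "observations";
# a pure offset gives bias equal to RMSE.
obs_sst = [14.0, 15.5, 16.0, 17.2, 18.1]
mod_sst = [t + 0.2 for t in obs_sst]
bias, rmse = consistency_stats(mod_sst, obs_sst)
print(bias, rmse)
```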

  11. ANSYS tools in modeling tires

    Science.gov (United States)

    Ali, Ashraf; Lovell, Michael

    1995-08-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.

  12. Detection of point-like neutrino sources with the NEMO-km3 telescope

    CERN Document Server

    Distefano, C

    2006-01-01

    The NEMO Collaboration is conducting an R&D activity towards the construction of a Mediterranean km3 neutrino telescope. In this work, we present the results of Monte Carlo simulation studies on the capability of the proposed NEMO telescope to detect and identify point-like sources of high energy muon neutrinos.

  13. Detection potential to point-like neutrino sources with the NEMO-km3 telescope

    OpenAIRE

    Distefano, C.

    2006-01-01

    The NEMO Collaboration is conducting an R&D activity towards the construction of a Mediterranean km3 neutrino telescope. In this work, we present the results of Monte Carlo simulation studies on the capability of the proposed NEMO telescope to detect and identify point-like sources of high energy muon neutrinos.

  14. GSK-3β controls NF-κB activity via IKKγ/NEMO

    Science.gov (United States)

    Medunjanin, Senad; Schleithoff, Lisa; Fiegehenn, Christian; Weinert, Soenke; Zuschratter, Werner; Braun-Dullaeus, Ruediger C.

    2016-01-01

    The NF-κB signaling pathway is central to the innate immune response, and its deregulation is found in multiple disorders such as autoimmune, chronic inflammatory and metabolic diseases. IKKγ/NEMO is essential for NF-κB activation, and NEMO dysfunction in humans has been linked to so-called progeria syndromes, which are characterized by advanced ageing due to age-dependent inflammatory diseases. It has been suggested that glycogen synthase kinase-3β (GSK-3β) participates in NF-κB regulation, but the exact mechanism remained incompletely understood. In this study, we identified NEMO as a GSK-3β substrate that is phosphorylated at serines 8, 17, 31 and 43, located within its N-terminal domain. The kinase forms a complex with wild-type NEMO, while point mutations of NEMO at the specific serines abrogated GSK-3β binding and subsequent phosphorylation of NEMO, resulting in its destabilization. However, K63-linked polyubiquitination was augmented in mutated NEMO, explaining an increased binding to IKKα and IKKβ. IκBα was also found to be degraded. Nevertheless, TNFα-stimulated NF-κB activation was impaired, pointing towards an uncontrolled signalling process. Our data suggest that GSK-3β is critically important for ordered NF-κB signalling through modulation of NEMO phosphorylation. PMID:27929056

  15. Measurement of the background in the NEMO 3 double beta decay experiment

    CERN Document Server

    Argyriades, J; Augier, C; Baker, J; Barabash, A S; Bongrand, M; Broudin-Bay, G; Brudanin, V B; Caffrey, A J; Chapon, A; Chauveau, E; Daraktchieva, Z; Durand, D; Egorov, V G; Fatemi-Ghomi, N; Flack, R; Freshville, A; Guillon, B; Hubert, Ph; Jullian, S; Kauer, M; King, S; Kochetov, O I; Konovalov, S I; Kovalenko, V E; Lalanne, D; Lang, K; Lemière, Y; Lutter, G; Mamedov, F; Marquet, Ch; Martín-Albo, J; Mauger, F; Nachab, A; Nasteva, I; Nemchenok, I B; Nova, F; Novella, P; Ohsumi, H; Pahlka, R B; Perrot, F; Piquemal, F; Reyss, J L; Ricol, J S; Saakyan, R; Sarazin, X; Simard, L; Shitov, Yu A; Smolnikov, A A; Snow, S; Söldner-Rembold, S; Stekl, I; Sutton, C S; Szklarz, G; Thomas, J; Timkin, V V; Tretyak, V I; Tretyak, Vl I; Umatov, V I; Vála, L; Vanyushin, I A; Vasiliev, V A; Vorobel, V; Vylov, Ts

    2009-01-01

    In the double beta decay experiment NEMO 3, precise knowledge of the background in the signal region is of outstanding importance. This article presents the methods used in NEMO 3 to evaluate the backgrounds resulting from most, if not all, possible origins. It also illustrates the power of the combined tracking-calorimetry technique used in the experiment.

  16. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found...
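
The effect reported above, that airflow maldistribution reduces total cooling capacity, can be reproduced with a minimal two-channel effectiveness-NTU sketch. It assumes a constant evaporation temperature on the refrigerant side; all parameter values are hypothetical, chosen only to illustrate the mechanism, and are not taken from the paper.

```python
import math

def channel_capacity(m_air, ua=1.0, cp=1.0, dT=1.0):
    """Cooling capacity of one evaporator channel (effectiveness-NTU form).

    With a constant refrigerant (evaporation) temperature the effectiveness
    is 1 - exp(-NTU), where NTU = UA / (m_air * cp).  Units are arbitrary.
    """
    ntu = ua / (m_air * cp)
    return m_air * cp * (1.0 - math.exp(-ntu)) * dT

def total_capacity(split, m_total=2.0):
    """Total capacity of two parallel channels for an airflow split in (0, 1)."""
    return (channel_capacity(split * m_total)
            + channel_capacity((1.0 - split) * m_total))

uniform = total_capacity(0.5)          # even airflow over both channels
maldistributed = total_capacity(0.25)  # 25% / 75% airflow split
assert maldistributed < uniform        # capacity drops with maldistribution
print(uniform, maldistributed)
```

Because the single-channel capacity is concave in the air mass flow, any deviation from an even split lowers the sum, which is the qualitative result the record reports.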

  17. Renzo Piano's NEMO. Example of integration. Decentralization of technical spaces

    Energy Technology Data Exchange (ETDEWEB)

    Zeiler, W. [Installatietechnologie, Technische Universiteit Eindhoven, Eindhoven (Netherlands)

    2007-11-15

    At the time of its opening, Nemo, designed by the architect Renzo Piano, was immediately the largest 'science center' in the Netherlands, with over 300,000 visitors annually. It started under the name NewMetropolis and is now called Nemo. Fitting in the installations required close collaboration between the architect and the advisor.

  18. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  19. Nano-electromechanical oscillators (NEMOs) for RF technologies.

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Joel Robert; Czaplewski, David A.; Gibson, John Murray (Argonne National Laboratory, Argonne, IL); Webster, James R.; Carton, Andrew James; Keeler, Bianca Elizabeth Nelson; Carr, Dustin Wade; Friedmann, Thomas Aquinas; Tallant, David Robert; Boyce, Brad Lee; Sullivan, John Patrick; Dyck, Christopher William; Chen, Xidong (Cedarville University, Cedarville, OH)

    2004-12-01

    Nano-electromechanical oscillators (NEMOs), capacitively-coupled radio frequency (RF) MEMS switches incorporating dissipative dielectrics, new processing technologies for tetrahedral amorphous carbon (ta-C) films, and scientific understanding of dissipation mechanisms in small mechanical structures were developed in this project. NEMOs are defined as mechanical oscillators with critical dimensions of 50 nm or less and resonance frequencies approaching 1 GHz. Target applications for these devices include simple, inexpensive clocks in electrical circuits, passive RF electrical filters, or platforms for sensor arrays. Ta-C NEMO arrays were used to demonstrate a novel optomechanical structure that shows remarkable sensitivity to small displacements (better than 160 fm/Hz^(1/2)) and suitability as an extremely sensitive accelerometer. The RF MEMS capacitively-coupled switches used ta-C as a dissipative dielectric. The devices showed a unipolar switching response to a unipolar stimulus, indicating the absence of significant dielectric charging, which has historically been the major reliability issue with these switches. This technology is promising for the development of reliable, low-power RF switches. An excimer laser annealing process was developed that permits full in-plane stress relaxation in ta-C films in air under ambient conditions, permitting the application of stress-reduced ta-C films in areas where low thermal budget is required, e.g. MEMS integration with pre-existing CMOS electronics. Studies of mechanical dissipation in micro- and nano-scale ta-C mechanical oscillators at room temperature revealed that mechanical losses are limited by dissipation associated with mechanical relaxation in a broad spectrum of defects with activation energies for mechanical relaxation ranging from 0.35 eV to over 0.55 eV.
This work has established a foundation for the creation of devices based on nanomechanical structures, and outstanding critical research areas that need
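
The dissipation picture above, thermally activated relaxation of defects with activation energies of 0.35 to over 0.55 eV, implies relaxation rates at room temperature that vary by several orders of magnitude across that range. A hedged Arrhenius estimate follows; the 1e13 Hz attempt frequency is a typical assumed value, not one reported in the record.

```python
import math

K_B_EV = 8.617e-5   # Boltzmann constant in eV/K

def arrhenius_rate(e_a_ev, temp_k=300.0, attempt_hz=1e13):
    """Thermally activated relaxation rate, rate = f0 * exp(-Ea / kT).

    The attempt frequency f0 = 1e13 Hz is an assumed, typical phonon-scale
    value used only to illustrate the spread in rates.
    """
    return attempt_hz * math.exp(-e_a_ev / (K_B_EV * temp_k))

# Rates at room temperature across the quoted activation-energy range:
for e_a in (0.35, 0.45, 0.55):
    print(f"Ea = {e_a:.2f} eV -> rate ~ {arrhenius_rate(e_a):.2e} Hz")
```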

  20. Sensitivity of the NEMO telescope to neutrinos from microquasars

    Energy Technology Data Exchange (ETDEWEB)

    Distefano, C. [LNS-INFN, via S. Sofia 62, 95123 Catania (Italy)

    2007-03-15

    We present the results of Monte Carlo simulation studies of the capability of the proposed NEMO telescope to detect TeV muon neutrinos from Galactic microquasars. In particular we determined the number of the detectable events from each known microquasar together with the expected atmospheric neutrino and muon background events. We also discuss the detector sensitivity to neutrino fluxes expected from microquasars, optimizing the event selection in order to reject the atmospheric background, and we show the number of events surviving the event selection.
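
The event-selection optimization mentioned above can be illustrated with a toy search-cone scan that maximizes a simple s/sqrt(b) figure of merit. The Gaussian point-spread function, the flat atmospheric background density and the rates below are all hypothetical stand-ins, not NEMO simulation results.

```python
import math

def significance(n_sig, n_bkg):
    """Simple s / sqrt(b) figure of merit used to rank selection cuts."""
    return n_sig / math.sqrt(n_bkg) if n_bkg > 0 else float("inf")

def optimize_cone(signal_psf_deg=0.3, bkg_per_deg2=10.0, n_sig_total=20.0):
    """Scan the search-cone half-angle around a candidate source.

    Signal containment follows a 2-D Gaussian point-spread function and the
    atmospheric background is taken flat per unit solid angle; every number
    is hypothetical, chosen only to illustrate the optimization.
    """
    best = None
    for i in range(1, 101):
        r = 0.02 * i                                 # half-angle in degrees
        frac = 1.0 - math.exp(-(r / signal_psf_deg) ** 2 / 2.0)
        n_sig = n_sig_total * frac                   # signal inside the cone
        n_bkg = bkg_per_deg2 * math.pi * r ** 2      # flat background in cone
        q = significance(n_sig, n_bkg)
        if best is None or q > best[1]:
            best = (r, q)
    return best

r_opt, q_opt = optimize_cone()
print(r_opt, q_opt)   # optimum near 1.6 sigma of the assumed PSF
```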

  1. Sensitivity of the NEMO telescope to neutrinos from microquasars

    CERN Document Server

    Distefano, C

    2006-01-01

    We present the results of Monte Carlo simulation studies of the capability of the proposed NEMO telescope to detect TeV muon neutrinos from Galactic microquasars. In particular we determined the number of the detectable events from each known microquasar together with the expected atmospheric neutrino and muon background events. We also discuss the detector sensitivity to neutrino fluxes expected from microquasars, optimizing the event selection in order to reject the atmospheric background, and we show the number of events surviving the event selection.

  2. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium- and long-distance modes (cars, vans, trucks, train, inland

  3. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium- and long-distance modes (cars, vans, trucks, train, inland waterways).

  4. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh;

    , called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  5. TOOL FORCE MODEL FOR DIAMOND TURNING

    Institute of Scientific and Technical Information of China (English)

    Wang Hongxiang; Sun Tao; Li Dan; Dong Shen

    2004-01-01

    A new tool force model is presented based upon the process geometry and the characteristics of the force system, in which the forces acting on the tool rake face, the cutting edge rounding and the clearance face are considered; the size effect is also accounted for in the new model. It is expected that the model is well applicable to conventional diamond turning and that it may be employed as a tool in the design of diamond tools. This approach is quite different from traditional investigations, which are primarily based on empirical studies. As the depth of cut becomes of the same order as the rounded cutting edge radius, sliding along the clearance face due to elastic recovery of the workpiece material and plowing due to the rounded cutting edge may become important in micro-machining, so the forces acting on the cutting edge rounding and the clearance face cannot be neglected. For this reason, it is very important to understand the influence of these parameters on tool forces and to develop a model of the relationship between them.
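
As a rough illustration of the three-term decomposition described above (rake face, edge rounding, clearance face) and of the resulting size effect, consider the following sketch. Every coefficient is hypothetical and chosen only to make the qualitative behaviour visible; this is not the paper's model.

```python
def cutting_force(depth_um, width_um, r_edge_um=0.5,
                  k_shear=2.0, k_plow=1.5, k_clear=0.3):
    """Illustrative three-term tool-force decomposition.

    All coefficients are hypothetical: k_shear scales the rake-face shearing
    term (proportional to uncut chip area), k_plow the plowing at the rounded
    cutting edge, and k_clear the clearance-face rubbing from elastic recovery.
    """
    f_rake = k_shear * depth_um * width_um    # ~ uncut chip cross-section
    f_edge = k_plow * r_edge_um * width_um    # plowing at the rounded edge
    f_clear = k_clear * width_um              # elastic-recovery rubbing
    return f_rake + f_edge + f_clear

# Size effect: the specific force (force per unit chip area) grows as the
# depth of cut approaches the edge radius, because the edge and clearance
# terms do not shrink with the chip area.
for depth in (5.0, 1.0, 0.5):
    force = cutting_force(depth, width_um=10.0)
    print(depth, force / (depth * 10.0))
```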

  6. Nemo-like kinase is a novel regulator of spinal and bulbar muscular atrophy.

    Science.gov (United States)

    Todd, Tiffany W; Kokubu, Hiroshi; Miranda, Helen C; Cortes, Constanza J; La Spada, Albert R; Lim, Janghoo

    2015-08-26

    Spinal and bulbar muscular atrophy (SBMA) is a progressive neuromuscular disease caused by polyglutamine expansion in the androgen receptor (AR) protein. Despite extensive research, the exact pathogenic mechanisms underlying SBMA remain elusive. In this study, we present evidence that Nemo-like kinase (NLK) promotes disease pathogenesis across multiple SBMA model systems. Most remarkably, loss of one copy of Nlk rescues SBMA phenotypes in mice, including extending lifespan. We also investigated the molecular mechanisms by which NLK exerts its effects in SBMA. Specifically, we have found that NLK can phosphorylate the mutant polyglutamine-expanded AR, enhance its aggregation, and promote AR-dependent gene transcription by regulating AR-cofactor interactions. Furthermore, NLK modulates the toxicity of a mutant AR fragment via a mechanism that is independent of AR-mediated gene transcription. Our findings uncover a crucial role for NLK in controlling SBMA toxicity and reveal a novel avenue for therapy development in SBMA.

  7. NEMO: A Mission to Explore and Return Samples from Europa's Oceans

    Science.gov (United States)

    Powell, James R.; Paniagua, John C.; Maise, George

    2004-02-01

    The NEMO [Nuclear Europa Mobile Ocean] mission would explore and return samples and possible life forms from Europa's sub-surface oceans to Earth. The NEMO spacecraft would land on Europa two years after leaving Earth, using a compact bi-modal NTP engine. NEMO's small nuclear reactor melt probe would then melt a channel through the multi-km ice sheet to the ocean, which a small robotic submarine would explore, transmitting data by sonic link and optical fiber to the spacecraft for relay to Earth. After its exploration, the submarine would rejoin the melt probe for return to the NEMO spacecraft. Using electricity from the bi-modal MITEE engine, fresh H2 propellant would be manufactured by electrolysis of melt water from surface ice. NEMO would then hop to a new site, exploring ten sites in a year before returning with samples and life forms to Earth, six years after it left. The design and performance of the NEMO spacecraft, MITEE engine, melt probe, and submarine are described. The probe and submarine use existing reactor technology. A NEMO mission could launch shortly after 2013 AD.

  8. A review of electricity market modelling tools

    Directory of Open Access Journals (Sweden)

    Sandra Milena Londoño Hernández

    2010-05-01

    Deregulating electricity markets around the world in the search for efficiency has introduced competition into the electricity marketing and generation business. Studying the interactions amongst participants has thus acquired great importance for regulators and market participants in analysing market evolution and suitably defining their bidding strategies. Different tools have therefore been used for modelling competitive electricity markets during the last few years. This paper presents an analytical review of the bibliography found regarding this subject; it also presents the most used tools along with their advantages and disadvantages. The analysis was done by comparing the models used and identifying the main market characteristics, such as market structure, bid structure and kind of bidding. It concluded that the kind of tool to be used mainly depends on a particular study's goal and scope.

  9. HYDROLOGICAL PROCESSES MODELLING USING ADVANCED HYDROINFORMATIC TOOLS

    Directory of Open Access Journals (Sweden)

    BEILICCI ERIKA

    2014-03-01

    Water has an essential role in the functioning of ecosystems, integrating the complex physical, chemical and biological processes that sustain life. Water is a key factor in determining the productivity of ecosystems, biodiversity and species composition. Water is also essential for humanity: water supply systems for the population, agriculture, fisheries, industries and hydroelectric power depend on water supplies. The modelling of hydrological processes is an important activity for water resources management, especially now, when climate change is one of the major challenges of our century, with a strong influence on the dynamics of hydrological processes. Climate change and the need for more knowledge of water resources require the use of advanced hydroinformatic tools in hydrological process modelling. The rationale and purpose of advanced hydroinformatic tools is to develop a new relationship between the stakeholders and the users and suppliers of the systems: to offer the basis (systems which supply useable results, the validity of which cannot be put in reasonable doubt by any of the stakeholders involved). Successful modelling of hydrological processes also needs well-trained specialists able to use advanced hydroinformatic tools. The results of modelling can be a useful tool for decision makers in taking efficient measures in the social, economic and ecological domains regarding water resources, within an integrated water resources management.

  10. An analytical model for resistivity tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1991-04-01

    An analytical model for resistivity tools is developed. It takes into account the effect of the borehole and the actual shape of the electrodes. The model is two-dimensional, i.e. the model does not deal with eccentricity. The electrical potential around a current source satisfies Poisson's equation. The method used here to solve Poisson's equation is the expansion of the potential function in terms of a complete set of functions involving one of the coordinates, with coefficients which are undetermined functions of the other coordinate. Numerical examples of the use of the model are presented. The results are compared with results given in the literature.
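
The expansion method described above can be demonstrated on a textbook case rather than the paper's borehole geometry: solving Laplace's equation on a strip by expanding in a complete set of sine functions of one coordinate, with the dependence on the other coordinate determined analytically (here exp(-n*pi*y)). This is a generic sketch of the technique, not the paper's own solution.

```python
import math

def laplace_strip(f_boundary, x, y, n_terms=50, n_quad=400):
    """Series solution of Laplace's equation on 0 <= x <= 1, y >= 0 with
    V(x, 0) = f(x), V = 0 on x = 0 and x = 1, and V -> 0 as y -> infinity.

    Each sine mode's Fourier coefficient is computed by midpoint-rule
    quadrature; its y-dependence is the analytic factor exp(-n*pi*y).
    """
    v = 0.0
    h = 1.0 / n_quad
    for n in range(1, n_terms + 1):
        # Fourier sine coefficient of the boundary potential
        b_n = 2.0 * h * sum(f_boundary((k + 0.5) * h)
                            * math.sin(n * math.pi * (k + 0.5) * h)
                            for k in range(n_quad))
        v += b_n * math.sin(n * math.pi * x) * math.exp(-n * math.pi * y)
    return v

# Check against the exact solution for f(x) = sin(pi*x),
# V(x, y) = sin(pi*x) * exp(-pi*y):
exact = math.sin(math.pi * 0.3) * math.exp(-math.pi * 0.5)
approx = laplace_strip(lambda x: math.sin(math.pi * x), 0.3, 0.5)
print(abs(approx - exact))
```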

  11. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and then obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  12. Nemo 3 experiment: assets and limitations. Perspective for double β physics

    Energy Technology Data Exchange (ETDEWEB)

    Augier, C

    2005-06-15

    After an introduction to this report in Chapter 1, I present the status of our knowledge in neutrino physics in Chapter 2. In Chapter 3, I then detail all the choices made in the design and realisation of the NEMO 3 detector for the search for double beta decay processes. The performance of the detector is presented, concerning both its capacity to identify the backgrounds and its ability to study all the ββ processes. I also explain the methods chosen by the NEMO collaboration to reduce the radon activity inside the detector and to make this background negligible today. This chapter, written in English, is the 'Technical report of the NEMO 3 detector' and forms an independent report for the NEMO collaborators. I close in Chapter 4 with a ten-year outlook for experimental projects in physics, covering both the SuperNEMO project and its experimental programme, and comparing the most interesting experiments, CUORE and GERDA, showing as an example the effect of nuclear matrix elements on the measurement of the effective neutrino mass. (author)

  13. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis and therapy. The ability to control numerous variables when performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow and the collection of the appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and will contribute to our deeper understanding of how these infections occur, progress, and can be controlled and eliminated.

  14. Calorimeter R&D for the SuperNEMO Double Beta Decay Experiment

    CERN Document Server

    Kauer, Matthew

    2008-01-01

    SuperNEMO is a next-generation double beta decay experiment based on the successful tracking plus calorimetry design approach of the NEMO3 experiment currently running in the Laboratoire Souterrain de Modane (LSM). SuperNEMO can study a range of isotopes; the baseline isotopes are 82Se and possibly 150Nd. The total isotope mass will be 100-200 kg. A sensitivity to a neutrinoless double beta decay half-life greater than 10^26 years can be reached, which gives access to Majorana neutrino masses of 50-100 meV. One of the main challenges of the SuperNEMO R&D is the development of the calorimeter, with an unprecedented energy resolution of 4% FWHM at 3 MeV (the Qββ value of 82Se).
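
For context on the 4% FWHM at 3 MeV target: a common rule of thumb (an assumption here, not a statement from the abstract) is that a photostatistics-dominated scintillator resolution scales as 1/sqrt(E), which lets one translate the figure to other energies.

```python
import math

def fwhm_percent(e_mev, fwhm_ref=4.0, e_ref=3.0):
    """Fractional FWHM scaled with energy assuming a purely stochastic
    (photo-statistics) resolution term, FWHM/E proportional to 1/sqrt(E).

    The 4% at 3 MeV anchor is the SuperNEMO target quoted above; the
    scaling law itself is an assumed idealization.
    """
    return fwhm_ref * math.sqrt(e_ref / e_mev)

print(round(fwhm_percent(1.0), 2))  # ~6.93 % FWHM at 1 MeV under this scaling
```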

  15. NEMO: A Project for a km$^3$ Underwater Detector for Astrophysical Neutrinos in the Mediterranean Sea

    CERN Document Server

    Amore, I; Ambriola, M; Ameli, F; Anghinolfi, M; Anzalone, A; Barbarino, G; Barbarito, E; Battaglieri, M; Bellotti, R; Beverini, N; Bonori, M; Bouhadef, B; Brescia, M; Cacopardo, G; Cafagna, F; Capone, A; Caponetto, L; Castorina, E; Ceres, A; Chiarusi, T; Circella, M; Cocimano, R; Coniglione, R; Cordelli, M; Costa, M; Cuneo, S; D'Amico, A; De Bonis, G; De Marzo, C; De Rosa, G; De Vita, R; Distefano, C; Falchini, E; Fiorello, C; Flaminio, V; Fratini, K; Gabrielli, A; Galeotti, S; Gandolfi, E; Giacomelli, G; Giorgi, F; Grimaldi, A; Habel, R; Leonora, E; Lonardo, A; Longo, G; Lo Presti, D; Lucarelli, F; Maccioni, E; Margiotta, A; Martini, A; Masullo, R; Megna, R; Migneco, E; Mongelli, M; Montaruli, T; Morganti, M; Musumeci, M S; Nicolau, C A; Orlando, A; Osipenko, M; Osteria, G; Papaleo, R; Pappalardo, V; Petta, C; Piattelli, P; Raia, G; Randazzo, N; Reito, S; Ricco, G; Riccobene, G; Ripani, M; Rovelli, A; Ruppi, M; Russo, G V; Russo, S; Sapienza, P; Sedita, M; Shirokov, E; Simeone, F; Sipala, V; Spurio, M; Taiuti, M; Terreni, G; Trasatti, L; Urso, S; Valente, V; Vicini, P

    2007-01-01

    The status of the project is described: the activity on long-term characterization of water optical and oceanographic parameters at the Capo Passero site, candidate for the Mediterranean km$^3$ neutrino telescope; the feasibility study; the physics performance and underwater technology for the km$^3$; the activity on NEMO Phase 1, a technological demonstrator that has been deployed at 2000 m depth, 25 km offshore Catania; and the realization of an underwater infrastructure at 3500 m depth at the candidate site (NEMO Phase 2).

  16. GeNemo: a search engine for web-based functional genomic data.

    Science.gov (United States)

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-07-08

    A set of new data types emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example, a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundred bases to hundred thousand bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org.
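
A much simpler stand-in for GeNemo's matching, comparing two peak (BED-style) interval sets by region overlap rather than by text, can be sketched as follows; it is meant only to illustrate the idea of region-based comparison, not GeNemo's MCMC-based algorithm.

```python
def merge(intervals):
    """Merge overlapping (start, end) regions on one chromosome."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def total_length(intervals):
    return sum(end - start for start, end in intervals)

def jaccard(a, b):
    """Jaccard similarity of two peak sets: overlap length / union length."""
    a, b = merge(a), merge(b)
    overlap = sum(max(0, min(e1, e2) - max(s1, s2))
                  for s1, e1 in a for s2, e2 in b)
    union = total_length(a) + total_length(b) - overlap
    return overlap / union if union else 0.0

query_peaks = [(100, 200), (300, 400)]    # e.g. regions from a user's BED file
encode_peaks = [(150, 250), (380, 460)]   # e.g. regions from one ENCODE dataset
print(jaccard(query_peaks, encode_peaks)) # 70 bp overlap over a 310 bp union
```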

  17. A tool box for implementing supersymmetric models

    Science.gov (United States)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM is considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: The

  18. GridTool: A surface modeling and grid generation tool

    Science.gov (United States)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool is stored parametrically, so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics is based on the GL library. The code has been tested successfully on IRIS workstations running IRIX4.0 and above. The memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time, there is always an active object which is drawn in magenta, or in their highlighted colors as defined by the resource file which will be discussed later.
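
    The bi-linear patches underlying GridTool's surface grids can be illustrated with a short sketch. This is a Python illustration only (GridTool itself is written in ANSI C), and the corner and parameter names below are invented for the example:

```python
# Evaluate a point on a bi-linear patch spanned by four corner points.
# P(u, v) = (1-u)(1-v)*P00 + u(1-v)*P10 + (1-u)v*P01 + uv*P11
def bilinear_patch(p00, p10, p01, p11, u, v):
    return tuple(
        (1 - u) * (1 - v) * a + u * (1 - v) * b + (1 - u) * v * c + u * v * d
        for a, b, c, d in zip(p00, p10, p01, p11)
    )

# A unit-square patch with one lifted corner: grid points lie exactly on the
# patch, but in general only close to (not on) the original CAD surface.
corners = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 1.0, 1.0))
center = bilinear_patch(*corners, 0.5, 0.5)  # -> (0.5, 0.5, 0.25)
```

    Because the patch is only a bi-linear approximation, a projection step back onto the CAD surface is needed when exact surface grids are required, as the abstract notes.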

  19. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and, then, obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  20. An MCMC Circumstellar Disks Modeling Tool

    Science.gov (United States)

    Wolff, Schuyler; Perrin, Marshall D.; Mazoyer, Johan; Choquet, Elodie; Soummer, Remi; Ren, Bin; Pueyo, Laurent; Debes, John H.; Duchene, Gaspard; Pinte, Christophe; Menard, Francois

    2016-01-01

    We present an enhanced software framework for the Markov Chain Monte Carlo (MCMC) modeling of circumstellar disk observations, including spectral energy distributions and multi-wavelength images from a variety of instruments (e.g. GPI, NICI, HST, WFIRST). The goal is to self-consistently and simultaneously fit a wide variety of observables in order to place constraints on the physical properties of a given disk, while also rigorously assessing the uncertainties in the derived properties. This modular code is designed to work with a collection of existing modeling tools, ranging from simple scripts that define the geometry of optically thin debris disks, to full radiative transfer modeling of complex grain structures in protoplanetary disks (using the MCFOST radiative transfer modeling code). The MCMC chain relies on direct chi-squared comparison of model images/spectra to observations. We will include a discussion of how best to weight different observations in the modeling of a single disk and how to incorporate forward modeling from PCA PSF-subtraction techniques. The code is open-source Python and available from GitHub. Results for several disks at various evolutionary stages will be discussed.
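
    The direct chi-squared comparison driving such an MCMC chain can be sketched with a toy Metropolis sampler. This is a minimal illustration, not the actual MCFOST-based pipeline: the one-parameter flux-scale model, the synthetic data, and all numbers are assumptions of the example.

```python
import math, random

random.seed(42)

# Synthetic "observation": a model scaled by 2.0 plus Gaussian noise.
model_points = (0.5, 1.0, 1.5, 2.0)
sigma = 0.1
data = [2.0 * m + random.gauss(0.0, sigma) for m in model_points]

def chi2(scale):
    # Direct chi-squared comparison of the scaled model to the observation.
    return sum(((d - scale * m) / sigma) ** 2 for d, m in zip(data, model_points))

# Minimal Metropolis sampler over the single parameter "scale".
scale, samples = 1.0, []
for _ in range(20000):
    proposal = scale + random.gauss(0.0, 0.05)
    # Accept with probability exp(-0.5 * (chi2_new - chi2_old)).
    if math.log(random.random()) < 0.5 * (chi2(scale) - chi2(proposal)):
        scale = proposal
    samples.append(scale)

posterior = samples[5000:]  # discard burn-in
estimate = sum(posterior) / len(posterior)
```

    The posterior mean recovers the injected flux scale to within the noise; in the real tool the scalar parameter is replaced by disk geometry and grain properties, and the model evaluation by a radiative transfer computation.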

  1. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
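
    The JSON-encoded component metadata used to couple models can be illustrated with a small sketch. The schema, the field names, and the validation rule below are hypothetical simplifications for the example, not WMT's actual wmt-db format:

```python
import json

# Hypothetical component metadata: each component declares "provides" and
# "uses" ports; a coupling is complete when every used port is provided.
metadata = json.loads("""
{
  "components": [
    {"name": "hydrotrend", "provides": ["river"], "uses": []},
    {"name": "cem",        "provides": ["coastline"], "uses": ["river", "waves"]}
  ]
}
""")

def unmatched_ports(components):
    # Report every "uses" port that no component in the model provides.
    provided = {port for c in components for port in c["provides"]}
    return sorted(
        (c["name"], port)
        for c in components for port in c["uses"] if port not in provided
    )

print(unmatched_ports(metadata["components"]))  # [('cem', 'waves')]
```

    A client can run this kind of check before submitting a run, flagging ports that still need a providing component.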

  2. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software generation systems have been offered, targeting problems with development productivity and the quality of the resulting software. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction, and it is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly driven by a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  3. Dynamical downscaling of warming scenarios with NEMO-Nordic setup for the North Sea and Baltic Sea

    Science.gov (United States)

    Gröger, Matthias; Almroth Rosell, Elin; Anderson, Helén; Axell, Lars; Dieterich, Christian; Edman, Moa; Eilola, Kari; Höglund, Anders; Hordoir, Robinson; Hieronymus, Jenny; Karlsson, Bengt; Liu, Ye; Meier, Markus; Pemberton, Per; Saraiva, Sofia

    2016-04-01

    The North Sea and the Baltic Sea constitute one of the most complex and challenging marine areas in the world. The oceanographic setting ranges from quasi-open-ocean conditions in the northern North Sea to more brackish conditions in the Baltic Sea, which is also affected by sea ice in winter. The two seas are connected by narrow straits which sporadically allow important inflows of salt- and oxygen-rich bottom waters into the Baltic Sea. To this end, the high-resolution regional model NEMO-Nordic has recently been developed. Here, the model is applied in hindcast simulations and used to downscale several climate warming scenarios. The model can be interactively coupled to the regional atmosphere model RCA4 by exchanging air-sea fluxes of mass and energy (Wang et al., 2015). Comparison with well-established models and newly compiled observational data sets (Bersch et al., 2013) indicates that NEMO-Nordic performs well on climate-relevant time scales. Emphasis is placed on thermal dynamics. Hindcast simulations demonstrate that simulated winter temperatures in the Baltic Sea can benefit from interactive air-sea coupling, which allows feedback loops between the ocean and the atmosphere to take place (Gröger et al., 2015). Likewise, the more realistic dynamical behaviour makes the interactively coupled model suitable for dynamical downscaling of climate warming scenarios. Depending on the driving global climate model and the IPCC representative concentration pathway scenario, NEMO-Nordic shows an average warming of the North Sea of between 2 and 4 K at the end of the 21st century. However, the warming pattern is spatially inhomogeneous, showing strong east-west gradients. Involved processes, such as circulation changes and changes in radiative forcing, will be discussed. Bersch, M., Gouretski, V., Sadikni, R., Hinrichs, I., 2013. Hydrographic climatology of the North Sea and surrounding regions. Centre for Earth System Research and Sustainability, University of Hamburg, www

  4. Status and first results of the NEMO Phase-2 tower

    Science.gov (United States)

    Chiarusi, T.; Aiello, S.; Ameli, F.; Anghinolfi, M.; Barbarino, G.; Barbarito, E.; Barbato, F.; Beverini, N.; Biagi, S.; Bouhadef, B.; Bozza, C.; Cacopardo, G.; Calamai, M.; Calì, C.; Capone, A.; Caruso, F.; Ceres, A.; Circella, M.; Cocimano, R.; Coniglione, R.; Costa, M.; Cuttone, G.; D'Amato, C.; D'Amato, V.; D'Amico, A.; DeBonis, G.; De Luca, V.; Deniskina, N.; De Rosa, G.; Distefano, C.; Fermani, P.; Flaminio, V.; Fusco, L. A.; Garufi, F.; Giordano, V.; Giovanetti, G.; Gmerk, A.; Grasso, R.; Grella, G.; Hugon, C.; Imbesi, M.; Kulikovsky, V.; Larosa, G.; Lattuada, D.; Leonora, E.; Litrico, P.; Lonardo, A.; Longhitano, F.; Lo Presti, D.; Maccioni, E.; Margiotta, A.; Martini, A.; Masullo, R.; Migliozzi, P.; Migneco, E.; Miraglia, A.; Mollo, C.; Mongelli, M.; Morganti, M.; Musico, P.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Papaleo, R.; Pellegrino, C.; Pellegriti, M. G.; Perrina, C.; Piattelli, P.; Pugliatti, C.; Pulvirenti, S.; Raffaelli, F.; Randazzo, N.; Riccobene, G.; Rovelli, A.; Sanguineti, M.; Sapienza, P.; Sgura, I.; Simeone, F.; Sipala, V.; Spurio, M.; Speziale, F.; Spitaleri, A.; Taiuti, M.; Terreni, G.; Trasatti, L.; Trovato, A.; Ventura, C.; Vicini, P.; Viola, S.; Vivolo, D.

    2014-03-01

    In March 2013, the NEMO Phase 2 tower was successfully installed at the Capo Passero site, at a depth of 3500 m and 80 km off the southern coast of Sicily. The unfurled tower is 450 m high; it is composed of 8 mechanical floors, carrying a total of 32 PMTs and various instruments for environmental measurements. The tower positioning is achieved by an acoustic system. The tower continuously acquires and transmits all the measured signals to shore. Data reduction is performed entirely in the Portopalo shore station by a dedicated computing facility connected to the persistent storage system at LNS, in Catania. Results from the last 9 months of acquisition will be presented. In particular, the analyzed optical rates, showing stable and low baseline values, are compatible with a contribution mainly from 40K light emission, with a small percentage of light bursts due to bioluminescence. These features confirm the suitability of the Capo Passero abyssal site to host a km3-sized neutrino telescope.

  5. NEMO inhibits programmed necrosis in an NFκB-independent manner by restraining RIP1.

    Directory of Open Access Journals (Sweden)

    Marie Anne O'Donnell

    TNF can trigger two opposing responses: cell survival and cell death. TNFR1 activates caspases that orchestrate apoptosis, but some cell types switch to a necrotic death when treated with caspase inhibitors. Several genes that are required to orchestrate cell death by programmed necrosis have been identified, such as the kinase RIP1, but very little is known about the inhibitory signals that keep this necrotic cell death pathway in check. We demonstrate that T cells lacking the regulatory subunit of IKK, NF-κB essential modifier (NEMO), are hypersensitive to programmed necrosis when stimulated with TNF in the presence of caspase inhibitors. Surprisingly, this pro-survival activity of NEMO is independent of NF-κB-mediated gene transcription. Instead, NEMO inhibits necrosis by binding to ubiquitinated RIP1, restraining RIP1 from engaging the necrotic death pathway. In the absence of NEMO, or if ubiquitination of RIP1 is blocked, necrosis ensues when caspases are blocked. These results indicate that recruitment of NEMO to ubiquitinated RIP1 is a key step in the TNFR1 signaling pathway that determines whether RIP1 triggers a necrotic death response.

  6. Porcine deltacoronavirus nsp5 inhibits interferon-β production through the cleavage of NEMO.

    Science.gov (United States)

    Zhu, Xinyu; Fang, Liurong; Wang, Dang; Yang, Yuting; Chen, Jiyao; Ye, Xu; Foda, Mohamed Frahat; Xiao, Shaobo

    2017-02-01

    Porcine deltacoronavirus (PDCoV) causes acute enteric disease and mortality in seronegative neonatal piglets. We have previously demonstrated that PDCoV infection suppresses the production of interferon-beta (IFN-β), although the detailed mechanisms are poorly understood. Here, we demonstrate that nonstructural protein 5 (nsp5) of PDCoV, the 3C-like protease, significantly inhibits Sendai virus (SEV)-induced IFN-β production by targeting the NF-κB essential modulator (NEMO), as confirmed by the diminished function of NEMO cleaved by PDCoV. The PDCoV nsp5 cleavage site in the NEMO protein was identified as glutamine 231 and is identical to the porcine epidemic diarrhea virus nsp5 cleavage site, suggesting a common target in NEMO for coronaviruses. Furthermore, this cleavage impaired the ability of NEMO to activate the IFN response and downstream signaling. Taken together, our findings reveal PDCoV nsp5 to be a newly identified IFN antagonist and enhance the understanding of immune evasion by deltacoronaviruses.

  7. NEMO: A mission to search for and return to Earth possible life forms on Europa

    Science.gov (United States)

    Powell, Jesse; Powell, James; Maise, George; Paniagua, John

    2005-07-01

    The Nuclear Europa Mobile Ocean (NEMO) mission would land on the surface of Europa and deploy a small, lightweight melt probe powered by a compact nuclear reactor to melt down through the multi-kilometer ice sheet. After reaching the sub-surface ocean, a small nuclear Autonomous Underwater Vehicle (AUV) would deploy to explore the sub-ice ocean. After exploration and sample collection, the AUV would return to the probe and melt back up to the lander. The lander would have replenished its H2 propellant by electrolysis of H2O ice, and would then hop to a new site on Europa to repeat the probe/AUV process. After completing the mission, the NEMO spacecraft would return to Earth with its collected samples. The NEMO melt probe and AUV utilize enriched U-235 fuel and conventional water reactor technology. The lander utilizes a compact nuclear thermal propulsion (NTP) engine based on the 710 tungsten/UO2 cermet fuel and high-temperature H2 propellant. The compact nuclear reactors in both the NEMO melt probe and the AUV drive a steam power cycle, generating over 10 kW(e) for use in each. Each nuclear reactor's operating lifetime is several years. With its high mobility and long mission duration, NEMO provides an ideal platform for life detection experiments.

  8. Collaboro: a collaborative (meta)modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and tracing back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  9. A pandemic influenza modeling and visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Maciejewski, Ross; Livengood, Philip; Rudolph, Stephen; Collins, Timothy F.; Ebert, David S.; Brigantic, Robert T.; Corley, Courtney D.; Muller, George A.; Sanders, Stephen W.

    2011-08-01

    The National Strategy for Pandemic Influenza outlines a plan for community response to a potential pandemic. In this outline, state and local communities are charged with enhancing their preparedness. In order to help public health officials better understand these charges, we have developed a modeling and visualization toolkit (PanViz) for analyzing the effect of decision measures implemented during a simulated pandemic influenza scenario. Spread vectors based on the point of origin and distance traveled over time are calculated, and the factors of age distribution and population density are taken into account. Healthcare officials are able to explore the effects of the pandemic on the population through a spatiotemporal view, moving forward and backward through time and inserting decision points at various days to determine the impact. Linked statistical displays are also shown, providing county-level summaries of data in terms of the number of sick, hospitalized and dead as a result of the outbreak. Currently, this tool has been deployed in Indiana State Department of Health planning and preparedness exercises, and as an educational tool for demonstrating the impact of social distancing strategies during the recent H1N1 (swine flu) outbreak.
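
    A spread vector of the kind described, anchored at a point of origin with distance traveled over time, can be sketched as a great-circle distance computation. This is an illustrative example only, not PanViz's actual model: the coordinates, the constant spread rate, and the arrival-day rule are assumptions.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on the Earth (radius ~6371 km).
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def arrival_day(origin, county, km_per_day=50.0):
    # Day on which a front expanding at a fixed rate reaches the county seat.
    return haversine_km(*origin, *county) / km_per_day

indianapolis = (39.77, -86.16)  # hypothetical point of origin
fort_wayne = (41.08, -85.14)
print(round(arrival_day(indianapolis, fort_wayne), 1))
```

    Real spread models would replace the fixed rate with factors such as age distribution and population density, as the abstract describes.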

  10. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  11. A 1/16° eddying simulation of the global NEMO sea-ice-ocean system

    Science.gov (United States)

    Iovino, Doroteaciro; Masina, Simona; Storto, Andrea; Cipollone, Andrea; Stepanov, Vladimir N.

    2016-08-01

    Analysis of a global eddy-resolving simulation using the NEMO general circulation model is presented. The model has 1/16° horizontal spacing at the Equator, employs two displaced poles in the Northern Hemisphere, and uses 98 vertical levels. The simulation was spun up from rest and integrated for 11 model years, using the ERA-Interim reanalysis as surface forcing. The primary intent of this hindcast is to test how the model represents upper-ocean characteristics and sea ice properties. Analysis of the zonally averaged temperature and salinity, and of the mixed layer depth, indicates that the model's average state is in good agreement with observed fields and that the model successfully represents the variability in the upper ocean and at intermediate depths. Comparisons against observational estimates of mass transports through key straits indicate that most aspects of the model circulation are realistic. As expected, the simulation exhibits turbulent behaviour, and the spatial distribution of the sea surface height (SSH) variability from the model is close to the observed pattern. The distribution and volume of the sea ice are, to a large extent, comparable to observed values. Compared with a corresponding eddy-permitting configuration, the performance of the model is significantly improved: reduced temperature and salinity biases, in particular at intermediate depths, improved mass and heat transports, better representation of fluxes through narrow and shallow straits, and increased global-mean eddy kinetic energy (by ˜ 40 %). However, relatively minor weaknesses still exist, such as a lower than observed magnitude of the SSH variability. We conclude that the model output is suitable for broader analysis to better understand upper-ocean dynamics and ocean variability at global scales. This simulation represents a major step forward in global ocean modelling at the Euro-Mediterranean Centre on Climate Change and constitutes the groundwork for future applications to short

  12. Hepatic tissue environment in NEMO-deficient mice critically regulates positive selection of donor cells after hepatocyte transplantation.

    Directory of Open Access Journals (Sweden)

    Michaela Kaldenbach

    BACKGROUND: Hepatocyte transplantation (HT) is a promising alternative treatment strategy for end-stage liver diseases compared with orthotopic liver transplantation. A limitation of this approach is the low engraftment of donor cells. The deletion of the I-kappa B kinase regulatory subunit IKKγ/NEMO in hepatocytes prevents nuclear factor (NF)-κB activation and triggers spontaneous liver apoptosis, chronic hepatitis and the development of liver fibrosis and hepatocellular carcinoma. We hypothesized that NEMOΔhepa mice may therefore serve as an experimental model to study HT. METHODS: Pre-conditioned NEMOΔhepa mice were transplanted with donor hepatocytes from wildtype (WT) mice and from mice deficient for the pro-apoptotic mediator Caspase-8 (Casp8Δhepa). RESULTS: Transplantation of isolated WT hepatocytes into pre-conditioned NEMOΔhepa mice resulted in a 6-7 fold increase of donor cells 12 weeks after HT, while WT recipients showed no liver repopulation. The use of apoptosis-resistant Casp8Δhepa-derived donor cells further enhanced the selection 3-fold after 12 weeks, and up to a 10-fold increase after 52 weeks, compared with WT donors. While analysis of NEMOΔhepa mice revealed strong liver injury, HT-recipient NEMOΔhepa mice showed improved liver morphology and a decrease in serum transaminases. Concomitant with these findings, the histological examination revealed an improved liver tissue architecture associated with significantly lower levels of apoptosis, decreased proliferation and a lesser amount of liver fibrogenesis. Altogether, our data clearly support the therapeutic benefit of the HT procedure in NEMOΔhepa mice. CONCLUSION: This study demonstrates the feasibility of the NEMOΔhepa mouse as an in vivo tool to study liver repopulation after HT. The improvement of the characteristic phenotype of chronic liver injury in NEMOΔhepa mice after HT suggests the therapeutic potential of HT in liver diseases with a chronic inflammatory phenotype and

  13. Hepatic Tissue Environment in NEMO-Deficient Mice Critically Regulates Positive Selection of Donor Cells after Hepatocyte Transplantation

    Science.gov (United States)

    Kaldenbach, Michaela; Cubero, Francisco Javier; Erschfeld, Stephanie; Liedtke, Christian; Trautwein, Christian; Streetz, Konrad

    2014-01-01

    Background Hepatocyte transplantation (HT) is a promising alternative treatment strategy for end-stage liver diseases compared with orthotopic liver transplantation. A limitation for this approach is the low engraftment of donor cells. The deletion of the I-kappa B kinase-regulatory subunit IKKγ/NEMO in hepatocytes prevents nuclear factor (NF)-kB activation and triggers spontaneous liver apoptosis, chronic hepatitis and the development of liver fibrosis and hepatocellular carcinoma. We hypothesized that NEMOΔhepa mice may therefore serve as an experimental model to study HT. Methods Pre-conditioned NEMOΔhepa mice were transplanted with donor-hepatocytes from wildtype (WT) and mice deficient for the pro-apoptotic mediator Caspase-8 (Casp8Δhepa). Results Transplantation of isolated WT-hepatocytes into pre-conditioned NEMOΔhepa mice resulted in a 6-7 fold increase of donor cells 12 weeks after HT, while WT-recipients showed no liver repopulation. The use of apoptosis-resistant Casp8Δhepa-derived donor cells further enhanced the selection 3-fold after 12-weeks and up to 10-fold increase after 52 weeks compared with WT donors. While analysis of NEMOΔhepa mice revealed strong liver injury, HT-recipient NEMOΔhepa mice showed improved liver morphology and decrease in serum transaminases. Concomitant with these findings, the histological examination elicited an improved liver tissue architecture associated with significantly lower levels of apoptosis, decreased proliferation and a lesser amount of liver fibrogenesis. Altogether, our data clearly support the therapeutic benefit of the HT procedure into NEMOΔhepa mice. Conclusion This study demonstrates the feasibility of the NEMOΔhepa mouse as an in vivo tool to study liver repopulation after HT. The improvement of the characteristic phenotype of chronic liver injury in NEMOΔhepa mice after HT suggests the therapeutic potential of HT in liver diseases with a chronic inflammatory phenotype and opens a new door for

  14. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria, related to displaying semantic relationships between concepts and communication with terminology servers, had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  15. NEMO5: Achieving High-end Internode Communication for Performance Projection Beyond Moore's Law

    CERN Document Server

    Andrawis, Robert; Charles, James; Fang, Jianbin; Fonseca, Jim; He, Yu; Klimeck, Gerhard; Jiang, Zhengping; Kubis, Tillmann; Mejia, Daniel; Lemus, Daniel; Povolotskyi, Michael; Rubiano, Santiago Alonso Perez; Sarangapani, Prasad; Zeng, Lang

    2015-01-01

    Electronic performance predictions of modern nanotransistors require nonequilibrium Green's functions including incoherent scattering on phonons, as well as the inclusion of random alloy disorder and surface roughness effects. Solving for all these effects is numerically extremely expensive and has to be done on the world's largest supercomputers, due to the large memory requirement and the high performance demands on the communication network between the compute nodes. In this work, it is shown that NEMO5 covers all the required physical effects and their combination. Furthermore, it is shown that NEMO5's implementation of the algorithm scales very well up to about 178,176 CPUs with a sustained performance of about 857 TFLOPS. Therefore, NEMO5 is ready to simulate future nanotransistors.
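
    The quoted scaling figures imply a per-core sustained throughput that is easy to check; a back-of-the-envelope sketch using only the numbers from the abstract:

```python
# Sustained performance reported for NEMO5: ~857 TFLOPS on ~178,176 CPU cores.
sustained_flops = 857e12
cores = 178176

per_core = sustained_flops / cores  # sustained FLOPS per core
print(f"{per_core / 1e9:.2f} GFLOPS per core")  # ~4.81 GFLOPS per core
```

    A few GFLOPS of sustained performance per core is a plausible fraction of peak for a communication-heavy Green's function solver, consistent with the claim of good scaling.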

  16. In vitro detection of NEMO-ubiquitin binding using DELFIA and microscale thermophoresis assays.

    Science.gov (United States)

    Vincendeau, Michelle; Krappmann, Daniel; Hadian, Kamyar

    2015-01-01

    Canonical NF-κB signaling in response to various stimuli converges at the level of the IκB kinase (IKK) complex to ultimately activate NF-κB. To achieve this, the IKK complex uses one of its regulatory subunits (IKKγ/NEMO) to sense ubiquitin chains formed by upstream complexes. Various studies have shown that different ubiquitin chains are involved in the binding of NEMO and thereby the activation of NF-κB. We have utilized two distinct biochemical methods, Dissociation-Enhanced Lanthanide Fluorescence Immunoassay (DELFIA) and Microscale Thermophoresis (MST), to detect the interaction of NEMO with linear and K63-linked ubiquitin chains, respectively. Here, we describe the basis of the methods and a detailed underlying protocol.
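
    Quantitative readouts from such binding assays are commonly fit to a 1:1 binding isotherm to extract a dissociation constant. A minimal sketch of such a fit follows; the titration data are synthetic and noiseless, and the Kd of 5 µM is an assumed value for the example, not a measurement from this protocol:

```python
# Fraction of receptor bound at ligand concentration L for a 1:1 model:
# bound(L) = L / (Kd + L)
def bound_fraction(conc, kd):
    return conc / (kd + conc)

# Synthetic titration: ligand concentrations (µM) and noiseless
# fraction-bound values generated with an assumed Kd of 5 µM.
concs = [0.5, 1, 2, 5, 10, 20, 50]
data = [bound_fraction(c, 5.0) for c in concs]

def fit_kd(concs, data):
    # Least-squares grid search over candidate Kd values (0.01 to 100 µM).
    candidates = [k / 100 for k in range(1, 10001)]
    return min(
        candidates,
        key=lambda kd: sum((d - bound_fraction(c, kd)) ** 2 for c, d in zip(concs, data)),
    )

print(fit_kd(concs, data))  # -> 5.0
```

    With real DELFIA or MST data, the same isotherm is fit with a nonlinear least-squares routine, and noise determines the confidence interval on Kd.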

  17. DATA QUALITY TOOLS FOR DATAWAREHOUSE MODELS

    Directory of Open Access Journals (Sweden)

    JASPREETI SINGH

    2015-05-01

    Full Text Available Data quality tools aim at detecting and correcting data problems that influence the accuracy and efficiency of data analysis applications. Data warehousing activities require data quality tools to ready the data and ensure that clean data populates the warehouse, thus raising the usability of the warehouse. This research targets the problems in the data that are addressed by data quality tools. We classify data quality tools based on the data warehouse stages at which they operate and on their features, in order to examine which data quality problems they address and to understand their functionalities.

  18. Update on the diagnosis and treatment of neuromyelitis optica: recommendations of the Neuromyelitis Optica Study Group (NEMOS).

    Science.gov (United States)

    Trebst, Corinna; Jarius, Sven; Berthele, Achim; Paul, Friedemann; Schippling, Sven; Wildemann, Brigitte; Borisow, Nadja; Kleiter, Ingo; Aktas, Orhan; Kümpfel, Tania

    2014-01-01

    Neuromyelitis optica (NMO, Devic's syndrome), long considered a clinical variant of multiple sclerosis, is now regarded as a distinct disease entity. Major progress has been made in the diagnosis and treatment of NMO since aquaporin-4 antibodies (AQP4-Ab; also termed NMO-IgG) were first described in 2004. In this review, the Neuromyelitis Optica Study Group (NEMOS) summarizes recently obtained knowledge on NMO and highlights new developments in its diagnosis and treatment, based on current guidelines, the published literature and expert discussion at regular NEMOS meetings. Testing of AQP4-Ab is essential and is the most important test in the diagnostic work-up of suspected NMO, and helps to distinguish NMO from other autoimmune diseases. Furthermore, AQP4-Ab testing has expanded our knowledge of the clinical presentation of NMO spectrum disorders (NMOSD). In addition, imaging techniques, particularly magnetic resonance imaging of the brain and spinal cord, are obligatory in the diagnostic workup. It is important to note that brain lesions in NMO and NMOSD are not uncommon, do not rule out the diagnosis, and show characteristic patterns. Other imaging modalities such as optical coherence tomography are proposed as useful tools in the assessment of retinal damage. Therapy of NMO should be initiated early. Azathioprine and rituximab are suggested as first-line treatments, the latter being increasingly regarded as an established therapy with long-term efficacy and an acceptable safety profile in NMO patients. Other immunosuppressive drugs, such as methotrexate, mycophenolate mofetil and mitoxantrone, are recommended as second-line treatments. Promising new therapies are emerging in the form of anti-IL6 receptor, anti-complement or anti-AQP4-Ab biologicals.

  19. GeNemo: a search engine for web-based functional genomic data

    OpenAIRE

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-01-01

    A set of new data types emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of E...

  20. General model for boring tool optimization

    Science.gov (United States)

    Moraru, G. M.; rbes, M. V. Ze; Popescu, L. G.

    2016-08-01

    Optimizing a tool (and therefore a boring tool) consists in improving its performance by maximizing the objective functions chosen by the designer and/or the user. Numerous features and performance requirements demanded by tool users contribute to defining and implementing the proposed objective functions. Incorporating new features makes the cutting tool competitive in the market and able to meet user requirements.

  1. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources......, are the needed models for such a framework available? Or, are modelling tools that can help to develop the needed models available? Can such a model-based framework provide the needed model-based work-flows matching the requirements of the specific chemical product-process design problems? What types of models...

  2. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  3. Frequent somatic mosaicism of NEMO in T cells of patients with X-linked anhidrotic ectodermal dysplasia with immunodeficiency.

    Science.gov (United States)

    Kawai, Tomoki; Nishikomori, Ryuta; Izawa, Kazushi; Murata, Yuuki; Tanaka, Naoko; Sakai, Hidemasa; Saito, Megumu; Yasumi, Takahiro; Takaoka, Yuki; Nakahata, Tatsutoshi; Mizukami, Tomoyuki; Nunoi, Hiroyuki; Kiyohara, Yuki; Yoden, Atsushi; Murata, Takuji; Sasaki, Shinya; Ito, Etsuro; Akutagawa, Hiroshi; Kawai, Toshinao; Imai, Chihaya; Okada, Satoshi; Kobayashi, Masao; Heike, Toshio

    2012-06-07

    Somatic mosaicism has been described in several primary immunodeficiency diseases and causes modified phenotypes in affected patients. X-linked anhidrotic ectodermal dysplasia with immunodeficiency (XL-EDA-ID) is caused by hypomorphic mutations in the NF-κB essential modulator (NEMO) gene and manifests clinically in various ways. We have previously reported a case of XL-EDA-ID with somatic mosaicism caused by a duplication mutation of the NEMO gene, but the frequency of somatic mosaicism of NEMO and its clinical impact on XL-EDA-ID is not fully understood. In this study, somatic mosaicism of NEMO was evaluated in XL-EDA-ID patients in Japan. Cells expressing wild-type NEMO, most of which were derived from the T-cell lineage, were detected in 9 of 10 XL-EDA-ID patients. These data indicate that the frequency of somatic mosaicism of NEMO is high in XL-EDA-ID patients and that the presence of somatic mosaicism of NEMO could have an impact on the diagnosis and treatment of XL-EDA-ID patients.

  4. Advanced energy systems and technologies (NEMO 2). Final report 1993-1998

    Energy Technology Data Exchange (ETDEWEB)

    Lund, P.; Konttinen, P. [eds.]

    1998-12-31

    NEMO2 has been the major Finnish energy research programme on advanced energy systems and technologies during 1993-1998. The main objective of the programme has been to support industrial technology development but also to increase the utilisation of wind and solar energy in Finland. The main technology fields covered are wind and solar energy. In addition, the programme has supported projects on energy storage and other small-scale energy technologies such as fuel cells that support the main technology fields chosen. NEMO2 is one of the energy research programmes of the Technology Development Centre of Finland (TEKES). The total R and D funding over the whole programme period was FIM 130 million (ECU 22 million). The public funding of the total programme costs has been 43 %. The industrial participation has been strong. International co-operation has been an important aspect in NEMO2: the programme has stimulated 24 EU-projects and participation in several IEA co-operative tasks. International funding adds nearly 20 % to the NEMO2 R and D funding. (orig.)

  5. Design and Experimental Evaluation of a Vehicular Network Based on NEMO and MANET

    Science.gov (United States)

    Tsukada, Manabu; Santa, José; Mehani, Olivier; Khaled, Yacine; Ernst, Thierry

    2010-12-01

    Mobile Ad hoc Network (MANET) routing protocols and Network Mobility (NEMO) Basic Support are considered key technologies for vehicular networks. MANEMO, that is, the combination of MANET (for infrastructureless communications) and NEMO (for infrastructure-based communications) offers a number of benefits, such as route optimization or multihoming. With the aim of assessing the benefits of this synergy, this paper presents a policy-based solution to distribute traffic among multiple paths to improve the overall performance of a vehicular network. An integral vehicular communication testbed has been developed to carry out field trials. First, the performance of the Optimized Link State Routing protocol (OLSR) is evaluated in a vehicular network with up to four vehicles. To analyze the impact of the vehicles' position and movement on network performance, an integrated evaluation environment called AnaVANET has been developed. Performance results have been geolocated using GPS information. Second, by switching from NEMO to MANET, routes between vehicles are optimized, and the final performance is improved in terms of latency and bandwidth. Our experimental results show that the network operation is further improved with simultaneous usage of NEMO and MANET.

  6. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  7. Tool for physics beyond the standard model

    Science.gov (United States)

    Newby, Christopher A.

    The standard model (SM) of particle physics is a well studied theory, but there are hints that the SM is not the final story. What the full picture is, no one knows, but this thesis looks into three methods useful for exploring a few of the possibilities. To begin I present a paper by Spencer Chang, Nirmal Raj, Chaowaroj Wanotayaroj, and me, that studies the Higgs boson. The scalar particle first seen in 2012 may be the vanilla SM version, but there is some evidence that its couplings are different than predicted. By means of increasing the Higgs' coupling to vector bosons and fermions, we can be more consistent with the data. Next, in a paper by Spencer Chang, Gabriel Barello, and me, we elaborate on a tool created to study dark matter (DM) direct detection. The original work by Anand et al. focused on elastic dark matter, whereas we extended this work to include the inelastic case, where different DM mass states enter and leave the collision. We also examine several direct detection experiments with our new framework to see if DAMA's modulation can be explained while avoiding the strong constraints imposed by the other experiments. We find that there are several operators that can do this. Finally, in a paper by Spencer Chang, Gabriel Barello, and me, we study an interesting phenomenon known as kinetic mixing, where two gauge bosons can share interactions with particles even though these particles aren't charged under both gauge groups. This, in and of itself, is not new, but we discuss a different method of obtaining this mixing where instead of mixing between two Abelian groups one of the groups is Nonabelian. Using this we then see that there is an inherent mass scale in the mixing strength; something that is absent in the Abelian-Abelian case. Furthermore, if the Nonabelian symmetry is the SU(2)L of the SM then the mass scale of the physics responsible for the mixing is about 1 TeV, right around the sweet spot for detection at the LHC. This dissertation

  8. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied... for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific... hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results....

  9. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. T

  10. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...
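
    The core idea here, adjusting a historical precipitation series with climate-derived monthly factors, can be sketched in a few lines. This is a hypothetical illustration only; SWMM-CAT's actual adjustment procedure, factor sources, and interface differ:

    ```python
    def apply_monthly_adjustments(rainfall, monthly_factors):
        """Scale a precipitation series by month-specific adjustment factors.

        rainfall: list of (month, depth) pairs, month in 1..12
        monthly_factors: dict mapping month -> multiplicative factor;
        months without a factor are left unchanged.
        """
        return [(month, depth * monthly_factors.get(month, 1.0))
                for month, depth in rainfall]

    # E.g. a +10% July adjustment leaves a January observation untouched:
    series = [(1, 10.0), (7, 20.0)]
    adjusted = apply_monthly_adjustments(series, {7: 1.1})
    ```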

  11. Comparative study of sea ice dynamics simulations with a Maxwell elasto-brittle rheology and the elastic-viscous-plastic rheology in NEMO-LIM3

    Science.gov (United States)

    Raulier, Jonathan; Dansereau, Véronique; Fichefet, Thierry; Legat, Vincent; Weiss, Jérôme

    2017-04-01

    Sea ice is a highly dynamical environment characterized by a dense mesh of fractures or leads, constantly opening and closing over short time scales. This characteristic geomorphology is linked to the existence of linear kinematic features, which consist of quasi-linear patterns emerging from the observed strain rate field of sea ice. Standard rheologies used in most state-of-the-art sea ice models, like the well-known elastic-viscous-plastic rheology, are thought to misrepresent those linear kinematic features and the observed statistical distribution of deformation rates. Dedicated rheologies built to capture the processes known to be at the origin of lead formation have been developed but still need to be evaluated at the global scale. One of them, based on a Maxwell elasto-brittle formulation, is being integrated into the NEMO-LIM3 global ocean-sea ice model (www.nemo-ocean.eu; www.elic.ucl.ac.be/lim). In the present study, we compare the results of the sea ice model LIM3 obtained with two different rheologies: the elastic-viscous-plastic rheology commonly used in LIM3 and a Maxwell elasto-brittle rheology. This comparison is focused on the statistical characteristics of the simulated deformation rate and on the ability of the model to reproduce the existence of leads within the ice pack. The impact of the lead representation on fluxes between ice, atmosphere and ocean is also assessed.

  12. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...... provides a description of the wind turbine modelling, both at a component level and at a system level....

  13. Modeling Languages: metrics and assessing tools

    OpenAIRE

    Fonte, Daniela; Boas, Ismael Vilas; Azevedo, José; Peixoto, José João; Faria, Pedro; Silva, Pedro; Sá, Tiago de, 1990-; Costa, Ulisses; da Cruz, Daniela; Henriques, Pedro Rangel

    2012-01-01

    Any traditional engineering field has metrics to rigorously assess the quality of their products. Engineers know that the output must satisfy the requirements, must comply with the production and market rules, and must be competitive. Professionals in the new field of software engineering started a few years ago to define metrics to appraise their product: individual programs and software systems. This concern motivates the need to assess not only the outcome but also the process and tools em...

  14. The mathematical and computer modeling of the worm tool shaping

    Science.gov (United States)

    Panchuk, K. L.; Lyashkov, A. A.; Ayusheev, T. V.

    2017-06-01

    Traditionally, mathematical profiling of the worm tool is carried out by the first method of T. Olivier, known in the theory of gearing, which requires an intermediate surface of the generating rack. This complicates the profiling process and its realization by means of computer 3D modeling. The purpose of this work is to improve the mathematical model of profiling and to realize it using 3D modeling methods. The research problems are: obtaining a mathematical model of profiling that excludes the generating rack; realization of the obtained model by means of wireframe and surface modeling; development and testing of a solid modeling technology for solving the profiling problem. The kinematic method for the study of mutually enveloping surfaces is taken as the basis. Computer research is carried out in a CAD system using 3D modeling methods. We have developed a mathematical model of profiling of the worm tool; wireframe, surface and solid models of the shaping of the mutually enveloping surfaces of the workpiece and the tool have been obtained. The proposed mathematical models and 3D modeling technologies constitute tools for theoretical and experimental profiling of the worm tool. The results of these researches can be used in the design of metal-cutting tools.

  15. Many-Task Computing Tools for Multiscale Modeling

    OpenAIRE

    Katz, Daniel S.; Ripeanu, Matei; Wilde, Michael

    2011-01-01

    This paper discusses the use of many-task computing tools for multiscale modeling. It defines multiscale modeling and places different examples of it on a coupling spectrum, discusses the Swift parallel scripting language, describes three multiscale modeling applications that could use Swift, and then talks about how the Swift model is being extended to cover more of the multiscale modeling coupling spectrum.

  16. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.

  17. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  18. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    -friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend and/or adopt a model. This is based on the idea of model reuse, which emphasizes the use...... and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer...... aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation; 4) model transfer – export/import to/from other application for further extension and application – several types of formats, such as XML...

  19. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  20. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
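
    The coupling pattern described here, an agent-based loop querying a metabolic model for growth rates, can be illustrated with a deliberately toy sketch. All rates and resource values below are invented for illustration; the actual MatNet model couples NetLogo agents to genome-scale constraint-based metabolism:

    ```python
    def toy_metabolic_model(oxygen, nitrate):
        """Stand-in for the constraint-based metabolic sub-model: returns a
        growth rate given the available electron acceptors (invented rates)."""
        if oxygen > 0:
            return 1.0   # aerobic growth
        if nitrate > 0:
            return 0.4   # slower anaerobic respiration on nitrate
        return 0.0       # no growth without an electron acceptor

    def simulate_biofilm(steps=50, oxygen=30.0, nitrate=0.0, biomass=10.0):
        """Minimal agent-based loop: each step, the population queries the
        metabolic sub-model for a growth rate, grows, and consumes the
        resource that supported the growth."""
        for _ in range(steps):
            rate = toy_metabolic_model(oxygen, nitrate)
            growth = biomass * rate * 0.1
            biomass += growth
            if oxygen > 0:
                oxygen = max(0.0, oxygen - growth)
            elif nitrate > 0:
                nitrate = max(0.0, nitrate - growth)
        return biomass

    # Adding nitrate lets growth continue after oxygen runs out,
    # mirroring the paper's qualitative prediction.
    ```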

  1. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  2. XLISP-Stat Tools for Building Generalised Estimating Equation Models

    Directory of Open Access Journals (Sweden)

    Thomas Lumley

    1996-12-01

    Full Text Available This paper describes a set of Lisp-Stat tools for building Generalised Estimating Equation models to analyse longitudinal or clustered measurements. The user interface is based on the built-in regression and generalised linear model prototypes, with the addition of object-based error functions, correlation structures and model formula tools. Residual and deletion diagnostic plots are available on the cluster and observation level and use the dynamic graphics capabilities of Lisp-Stat.

  3. The Ising model as a pedagogical tool

    Science.gov (United States)

    Smith, Ryan; Hart, Gus L. W.

    2010-10-01

    Though originally developed to analyze ferromagnetic systems, the Ising model also provides an excellent framework for modeling alloys. The original Ising model represented magnetic moments (up or down) by a +1 or -1 at each point on a lattice and allowed only nearest-neighbor interactions to be non-zero. In alloy modeling, the values ±1 represent A and B atoms. The Ising Hamiltonian can be used in a Monte Carlo approach to simulate the thermodynamics of the system (e.g., an order-disorder transition occurring as the temperature is lowered). The simplicity of the model makes it an ideal starting point for a qualitative understanding of magnetism or configurational ordering in a metal. I will demonstrate the application of the Ising model in simple, two-dimensional ferromagnetic systems and alloys.
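
    The ±1-spin, nearest-neighbor Monte Carlo approach described above can be sketched as a small Metropolis simulation. This is a minimal illustration, not the authors' code; lattice size, temperature, and step count are arbitrary choices:

    ```python
    import math
    import random

    def metropolis_ising(L=16, T=2.0, steps=200_000, seed=0):
        """2D Ising model on an L x L lattice with periodic boundaries.

        Spins are +1/-1 (up/down moments, or A/B atoms in the alloy
        interpretation); only nearest-neighbor interactions are non-zero
        (coupling J = 1, Boltzmann constant k_B = 1).
        """
        rng = random.Random(seed)
        spins = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
        for _ in range(steps):
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum over the four nearest neighbors, wrapping at the edges.
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2.0 * spins[i][j] * nb  # energy change if this spin flips
            # Metropolis rule: accept downhill moves always,
            # uphill moves with probability exp(-dE/T).
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] *= -1
        # Magnetization per site: |m| approaches 1 in the ordered phase
        # below T_c ~ 2.27 and stays near 0 well above it.
        return sum(map(sum, spins)) / (L * L)
    ```

    Sweeping T across the critical region and plotting the magnetization is the classic way to expose the order-disorder transition mentioned in the abstract.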

  4. A Components Library System Model and the Support Tool

    Institute of Scientific and Technical Information of China (English)

    MIAO Huai-kou; LIU Hui; LIU Jing; LI Xiao-bo

    2004-01-01

    Component-based development needs a well-designed components library and a set of support tools. This paper presents the design and implementation of a components library system model and its support tool UMLCASE. A set of practical CASE tools is constructed. UMLCASE can use UML to design Use Case Diagrams, Class Diagrams etc., and it integrates with the components library system.

  5. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sourc

  6. Techniques and tools for efficiently modeling multiprocessor systems

    Science.gov (United States)

    Carpenter, T.; Yalamanchili, S.

    1990-01-01

    System-level tools and methodologies associated with an integrated approach to the development of multiprocessor systems are examined. Tools for capturing initial program structure, automated program partitioning, automated resource allocation, and high-level modeling of the combined application and resource are discussed. The primary language focus of the current implementation is Ada, although the techniques should be appropriate for other programming paradigms.

  7. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  9. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcoming this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an estimat

  10. Model atmospheres - Tool for identifying interstellar features

    Science.gov (United States)

    Frisch, P. C.; Slojkowski, S. E.; Rodriguez-Bell, T.; York, D.

    1993-01-01

    Model atmosphere parameters are derived for 14 early A stars with rotation velocities, from optical spectra, in excess of 80 km/s. The models are compared with IUE observations of the stars in regions where interstellar lines are expected. In general, with the assumption of solar abundances, excellent fits are obtained in regions longward of 2580 Å, and accurate interstellar equivalent widths can be derived using models to establish the continuum. The fits are poorer at shorter wavelengths, particularly at 2026-2062 Å, where the stellar model parameters seem inadequate. Features indicating mass flows are evident in stars with known infrared excesses. In gamma TrA, variability in the Mg II lines is seen over the 5-year interval of these data, and also over timescales as short as 26 days. The present technique should be useful in systematic studies of episodic mass flows in A stars and for stellar abundance studies, as well as interstellar features.

  11. Measurement of the atmospheric muon flux with the NEMO Phase-1 detector

    Science.gov (United States)

    Aiello, S.; Ameli, F.; Amore, I.; Anghinolfi, M.; Anzalone, A.; Barbarino, G.; Battaglieri, M.; Bazzotti, M.; Bersani, A.; Beverini, N.; Biagi, S.; Bonori, M.; Bouhadef, B.; Brunoldi, M.; Cacopardo, G.; Capone, A.; Caponetto, L.; Carminati, G.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Cordelli, M.; Costa, M.; D'Amico, A.; De Bonis, G.; De Marzo, C.; De Rosa, G.; De Ruvo, G.; De Vita, R.; Distefano, C.; Falchini, E.; Flaminio, V.; Fratini, K.; Gabrielli, A.; Galatà, S.; Gandolfi, E.; Giacomelli, G.; Giorgi, F.; Giovanetti, G.; Grimaldi, A.; Habel, R.; Imbesi, M.; Kulikovsky, V.; Lattuada, D.; Leonora, E.; Lonardo, A.; Lo Presti, D.; Lucarelli, F.; Marinelli, A.; Margiotta, A.; Martini, A.; Masullo, R.; Migneco, E.; Minutoli, S.; Morganti, M.; Musico, P.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Osipenko, M.; Papaleo, R.; Pappalardo, V.; Piattelli, P.; Piombo, D.; Raia, G.; Randazzo, N.; Reito, S.; Ricco, G.; Riccobene, G.; Ripani, M.; Rovelli, A.; Ruppi, M.; Russo, G. V.; Russo, S.; Sapienza, P.; Sciliberto, D.; Sedita, M.; Shirokov, E.; Simeone, F.; Sipala, V.; Spurio, M.; Taiuti, M.; Trasatti, L.; Urso, S.; Vecchi, M.; Vicini, P.; Wischnewski, R.

    2010-05-01

    The NEMO Collaboration installed and operated an underwater detector including prototypes of the critical elements of a possible underwater km3 neutrino telescope: a four-floor tower (called Mini-Tower) and a Junction Box. The detector was developed to test some of the main systems of the km3 detector, including the data transmission, the power distribution, the timing calibration and the acoustic positioning systems, as well as to verify the capabilities of a single tridimensional detection structure to reconstruct muon tracks. We present results of the analysis of the data collected with the NEMO Mini-Tower. The position of the photomultiplier tubes (PMTs) is determined through the acoustic positioning system. Signals detected with the PMTs are used to reconstruct the tracks of atmospheric muons. The angular distribution of atmospheric muons was measured and the results compared with Monte Carlo simulations.
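    The comparison between a measured zenith-angle distribution and simulation can be illustrated with a toy Monte Carlo. The cos²θ·sinθ sampling density below is a common sea-level parameterization used purely for illustration; it is not the NEMO simulation chain, and all numbers are invented.

```python
import math
import random

random.seed(42)

def sample_zenith(n):
    # Rejection-sample zenith angles with dN/dtheta proportional to
    # cos^2(theta) * sin(theta) -- an illustrative parameterization only.
    out = []
    while len(out) < n:
        th = random.uniform(0.0, math.pi / 2)
        if random.random() < math.cos(th) ** 2 * math.sin(th):
            out.append(th)
    return out

def histogram(samples, nbins=9):
    # Bin zenith angles into equal 10-degree bins over [0, 90] degrees.
    width = (math.pi / 2) / nbins
    counts = [0] * nbins
    for th in samples:
        counts[min(int(th / width), nbins - 1)] += 1
    return counts

counts = histogram(sample_zenith(50000))
# The distribution peaks in the 30-40 degree bin, where cos^2*sin is maximal.
print(counts)
```

A measured histogram would be compared bin by bin against such a simulated one; here the "data" and "simulation" are the same toy density.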

  12. Measurement of the atmospheric muon flux with the NEMO Phase-1 detector

    CERN Document Server

    Aiello, S; Amore, I; Anghinolfi, M; Anzalone, A; Barbarino, G; Battaglieri, M; Bazzotti, M; Bersani, A; Beverini, N; Biagi, S; Bonori, M; Bouhadef, B; Brunoldi, M; Cacopardo, G; Capone, A; Caponetto, L; Carminati, G; Chiarusi, T; Circella, M; Cocimano, R; Coniglione, R; Cordelli, M; Costa, M; D'Amico, A; De Bonis, G; De Marzo, C; De Rosa, G; De Ruvo, G; De Vita, R; Distefano, C; Falchini, E; Flaminio, V; Fratini, K; Gabrielli, A; Galatà, S; Gandolfi, E; Giacomelli, G; Giorgi, F; Giovanetti, G; Grimaldi, A; Habel, R; Imbesi, M; Kulikovsky, V; Lattuada, D; Leonora, E; Lonardo, A; Presti, D Lo; Lucarelli, F; Marinelli, A; Margiotta, A; Martini, A; Masullo, R; Migneco, E; Minutoli, S; Morganti, M; Musico, P; Musumeci, M; Nicolau, C A; Orlando, A; Osipenko, M; Papaleo, R; Pappalardo, V; Piattelli, P; Piombo, D; Raia, G; Randazzo, N; Reito, S; Ricco, G; Riccobene, G; Ripani, M; Rovelli, A; Ruppi, M; Russo, G V; Russo, S; Sapienza, P; Sciliberto, D; Sedita, M; Shirokov, E; Simeone, F; Sipala, V; Spurio, M; Taiuti, M; Trasatti, L; Urso, S; Vecchi, M; Vicini, P; Wischnewski, R

    2009-01-01

    The NEMO Collaboration installed and operated an underwater detector including prototypes of the critical elements of a possible underwater km3 neutrino telescope: a four-floor tower (called Mini-Tower) and a Junction Box. The detector was developed to test some of the main systems of the km3 detector, including the data transmission, the power distribution, the timing calibration and the acoustic positioning systems as well as to verify the capabilities of a single tridimensional detection structure to reconstruct muon tracks. We present results of the analysis of the data collected with the NEMO Mini-Tower. The position of photomultiplier tubes (PMTs) is determined through the acoustic position system. Signals detected with PMTs are used to reconstruct the tracks of atmospheric muons. The angular distribution of atmospheric muons was measured and results compared with Monte Carlo simulations.

  13. Decreased linear ubiquitination of NEMO and FADD on apoptosis with caspase-mediated cleavage of HOIP.

    Science.gov (United States)

    Goto, Eiji; Tokunaga, Fuminori

    2017-02-09

    NF-κB is crucial to regulate immune and inflammatory responses and cell survival. LUBAC generates a linear ubiquitin chain and activates NF-κB through the ubiquitin ligase (E3) activity of its HOIP subunit. Here, we show that HOIP is predominantly cleaved by caspase at Asp390 upon apoptosis, and that the cleavage product is subjected to proteasomal degradation. We identified that FADD, as well as NEMO, is a substrate for LUBAC. Although the C-terminal fragment of HOIP retains NF-κB activity, linear ubiquitination of NEMO and FADD decreases upon apoptosis. Moreover, the N-terminal fragment of HOIP binds deubiquitinases such as OTULIN and CYLD-SPATA2. These results indicate that caspase-mediated cleavage of HOIP divides critical functional regions of HOIP, and that this regulates linear (de)ubiquitination of substrates upon apoptosis.

  14. NEMO: Extraction and normalization of organization names from PubMed affiliation strings

    CERN Document Server

    Jonnalagadda, Siddhartha

    2011-01-01

    We propose NEMO, a system for extracting organization names from the affiliation field and normalizing them to a canonical organization name. Our parsing process involves multi-layered rule matching with multiple dictionaries. The system achieves more than 98% f-score in extracting organization names. Our normalization process involves clustering based on local sequence alignment metrics and local learning based on finding connected components. A high precision was also observed in normalization. NEMO is the missing link in associating each biomedical paper and its authors to an organization name in its canonical form and the geopolitical location of the organization. This research could potentially help in analyzing large social networks of organizations for landscaping a particular topic, improving performance of author disambiguation, adding weak links in the co-author network of authors, augmenting NLM's MARS system for correcting errors in OCR output of the affiliation field, and automatically indexing the P...
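    The clustering step can be sketched with standard-library stand-ins: a string-similarity metric (here difflib's ratio, a crude proxy for the local sequence alignment the paper uses) plus union-find to extract connected components. The names, threshold, and data below are illustrative assumptions, not NEMO's actual dictionaries or metrics.

```python
from difflib import SequenceMatcher

def similar(a, b, threshold=0.8):
    # Crude stand-in for a local-alignment similarity score (assumption).
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def cluster_names(names, threshold=0.8):
    # Union-find over the "similar enough" graph; each connected
    # component becomes one cluster of organization-name variants.
    parent = list(range(len(names)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if similar(names[i], names[j], threshold):
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[rj] = ri
    clusters = {}
    for i, name in enumerate(names):
        clusters.setdefault(find(i), []).append(name)
    return list(clusters.values())

names = [
    "Univ. of Chicago",
    "University of Chicago",
    "Universty of Chicago",   # OCR-style typo
    "Stanford University",
]
print(cluster_names(names))
```

With this sample list and threshold, the three Chicago variants fall into one cluster and Stanford into its own; a canonical name could then be chosen per cluster.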

  15. TRAF6-mediated ubiquitination of NEMO requires p62/sequestosome-1.

    Science.gov (United States)

    Zotti, Tiziana; Scudiero, Ivan; Settembre, Pio; Ferravante, Angela; Mazzone, Pellegrino; D'Andrea, Luca; Reale, Carla; Vito, Pasquale; Stilo, Romania

    2014-03-01

    The atypical protein kinase C-interacting protein p62/sequestosome-1 (p62) has emerged as a crucial molecule in a variety of cellular functions due to its involvement in various signaling mechanisms. p62 has been implicated in the activation of NF-κB in TNFα-stimulated cells and has been shown to be activated in response to interleukin-1β (IL-1β). Here we demonstrate that p62 interacts with NEMO, the regulatory subunit of the complex responsible for activation of the NF-κB transcription factor. Depletion of p62, obtained through a short interfering RNA targeting p62 mRNA, abrogated the capacity of TRAF6 to promote NEMO ubiquitination and severely impaired NF-κB activation following IL-1β stimulation. Together, these results indicate that p62 is an important intermediary in the NF-κB activation pathways implemented through non-degradative ubiquitination events.

  16. TRAF6-mediated ubiquitination of NEMO requires p62/sequestosome-1☆☆☆

    Science.gov (United States)

    Zotti, Tiziana; Scudiero, Ivan; Settembre, Pio; Ferravante, Angela; Mazzone, Pellegrino; D’Andrea, Luca; Reale, Carla; Vito, Pasquale; Stilo, Romania

    2014-01-01

    The atypical protein kinase C-interacting protein p62/sequestosome-1 (p62) has emerged as a crucial molecule in a variety of cellular functions due to its involvement in various signaling mechanisms. p62 has been implicated in the activation of NF-κB in TNFα-stimulated cells and has been shown to be activated in response to interleukin-1β (IL-1β). Here we demonstrate that p62 interacts with NEMO, the regulatory subunit of the complex responsible for activation of the NF-κB transcription factor. Depletion of p62, obtained through a short interfering RNA targeting p62 mRNA, abrogated the capacity of TRAF6 to promote NEMO ubiquitination and severely impaired NF-κB activation following IL-1β stimulation. Together, these results indicate that p62 is an important intermediary in the NF-κB activation pathways implemented through non-degradative ubiquitination events. PMID:24270048

  17. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.
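    The idea of preliminary modeling with borrowed parameters can be sketched with a Schaefer-type surplus-production model for exploring management options. The parameter values below are illustrative placeholders, not estimates for the Green Bay yellow perch fishery.

```python
def simulate(biomass0, r, K, harvest_rate, years):
    # Discrete-time Schaefer surplus-production model: logistic growth
    # minus a constant-rate harvest. Parameters are illustrative only.
    b = biomass0
    series = [b]
    for _ in range(years):
        b = b + r * b * (1 - b / K) - harvest_rate * b
        series.append(max(b, 0.0))
    return series

# Compare two management options using borrowed, order-of-magnitude parameters.
for h in (0.1, 0.4):
    final = simulate(biomass0=500.0, r=0.5, K=1000.0, harvest_rate=h, years=50)[-1]
    print(f"harvest rate {h}: long-run biomass about {final:.0f}")
```

Analytically the model settles at b* = K(1 - h/r), i.e. 800 and 200 biomass units for the two harvest rates; even such a crude learning tool makes the trade-off between yield and stock size concrete before any data are collected.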

  18. Sensitivity and pointing accuracy of the NEMO km3 telescope

    Energy Technology Data Exchange (ETDEWEB)

    Distefano, C. [Laboratori Nazionali del Sud, INFN, Catania (Italy)]. E-mail: carla.distefano@lns.infn.it

    2006-11-15

    In this paper we present the results of Monte Carlo simulation studies on the capability of the proposed NEMO km3 telescope to detect high-energy neutrinos. We calculated the detector sensitivity to muon neutrinos coming from a generic point-like source. We also simulated the deficit of atmospheric muons in the direction of the Moon disk in order to determine the detector angular resolution and to check the absolute pointing capability.

  19. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    This thesis presents multidisciplinary modelling techniques in a Design For Reliability (DFR) approach for power electronic circuits. With increasing penetration of renewable energy systems, the demand for reliable power conversion systems is becoming critical. Since a large part of electricity...... in reliability assessment of power modules, a three-dimensional lumped thermal network is proposed to be used for fast, accurate and detailed temperature estimation of power module in dynamic operation and different boundary conditions. Since an important issue in the reliability of power electronics...... are generic and valid to be used in circuit simulators or any programing software. These models are important building blocks for the reliable design process or performance assessment of power electronic circuits. The models can save time and cost in power electronics packaging and power converter to evaluate...

  20. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.
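    The strong effect of alloying elements on the β transus is sometimes summarized with empirical linear corrections. The sketch below uses purely hypothetical per-element coefficients to illustrate that idea; it is not how a CALPHAD tool such as Pandat works, which instead computes full multicomponent phase equilibria from Gibbs-energy minimization.

```python
# Hypothetical linear-correction model for the beta-transus temperature of a
# titanium alloy. The coefficients below are illustrative placeholders, NOT
# assessed thermodynamic data.
BASE_C = 882.0  # beta transus of pure Ti, deg C

SHIFT_PER_WT_PCT = {  # hypothetical deg C shift per wt.% of each element
    "Al": +20.0,   # alpha stabilizers raise the transus
    "O":  +200.0,  # interstitials have an outsized influence
    "V":  -15.0,   # beta stabilizers lower it
    "Mo": -10.0,
}

def beta_transus(composition_wt_pct):
    # Sum the hypothetical per-element shifts onto the pure-Ti transus.
    t = BASE_C
    for element, wt in composition_wt_pct.items():
        t += SHIFT_PER_WT_PCT.get(element, 0.0) * wt
    return t

# A Ti-6Al-4V-like composition with a small oxygen content (illustrative).
print(beta_transus({"Al": 6.0, "V": 4.0, "O": 0.15}))
```

The point of the sketch is the structure (base temperature plus stabilizer shifts), which mirrors why the interstitials O, N, H, and C matter so much; real coefficients must come from assessed databases or experiment.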

  1. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open-source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.
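    The core model-versus-observation comparison such a diagnostic package automates can be reduced to a sketch like the following; the metrics, model names, and numbers are invented for illustration and are not CCMVal-Diag output.

```python
import math

def bias(model, obs):
    # Mean model-minus-observation difference.
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    # Root-mean-square error against the observation series.
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# Toy annual-mean series; names and values are invented.
obs = [2.1, 2.3, 2.2, 2.5, 2.4]
models = {
    "model_A_run1": [2.0, 2.2, 2.1, 2.4, 2.3],
    "model_A_run2": [2.2, 2.4, 2.3, 2.6, 2.5],
    "model_B_run1": [2.6, 2.8, 2.7, 3.0, 2.9],
}

for name, series in models.items():
    print(f"{name}: bias={bias(series, obs):+.2f} rmse={rmse(series, obs):.2f}")
```

Comparing several runs of the same model against runs of another model, as in the loop above, is the ensemble-member versus inter-model distinction the abstract describes.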

  2. NEMO-SN-1 the first 'real-time' seafloor observatory of ESONET

    Energy Technology Data Exchange (ETDEWEB)

    Favali, Paolo [Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome (Italy) and Università degli Studi 'La Sapienza', Rome (Italy)]. E-mail: Paolofa@ingv.it; Beranzoli, Laura [Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome (Italy)]; D'Anna, Giuseppe [Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome (Italy)]; Gasparoni, Francesco [Tecnomare-ENI S.p.A., Venice (Italy)]; Gerber, Hans W. [Technische Fachhochschule, Berlin (Germany)]

    2006-11-15

    The fruitful collaboration between Italian research institutions, particularly Istituto Nazionale di Fisica Nucleare (INFN) and Istituto Nazionale di Geofisica e Vulcanologia (INGV), together with marine engineering companies, led to the development of NEMO-SN-1, the first European cabled seafloor multiparameter observatory. This observatory, deployed at 2060 m w.d. about 12 miles offshore of the eastern coast of Sicily (Southern Italy), has been in real-time acquisition since January 2005 and addresses different sets of measurements: geophysical and oceanographic. In particular, the SN-1 seismological data are integrated in the INGV land-based national seismic network, and they arrive in real time at the Operative Centre in Rome. In the European Commission (EC) European Seafloor Observatory NETwork (ESONET) project, in connection with the Global Monitoring for Environment and Security (GMES) action plan, the NEMO-SN-1 site has been proposed as a European key area, both for its intrinsic importance for geo-hazards and for the availability of infrastructure as a stepwise development in the GMES program. Presently, NEMO-SN-1 is the only operative ESONET site. The paper gives a description of the SN-1 observatory with examples of data.

  3. A Spectrum Handoff Scheme for Optimal Network Selection in NEMO Based Cognitive Radio Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Krishan Kumar

    2017-01-01

    When a mobile network changes its point of attachment in Cognitive Radio (CR) vehicular networks, the Mobile Router (MR) requires spectrum handoff. Network Mobility (NEMO) in CR vehicular networks is concerned with the management of this movement. In future NEMO based CR vehicular network deployments, multiple radio access networks may coexist in overlapping areas, having different characteristics in terms of multiple attributes. A CR vehicular node may have the capability to make calls for two or more types of nonsafety services, such as voice, video, and best effort, simultaneously. Hence, it becomes difficult for the MR to select the optimal network for the spectrum handoff. This can be done by performing spectrum handoff using Multiple Attribute Decision Making (MADM) methods, which is the objective of this paper. MADM methods such as grey relational analysis and cost based methods are used. The application of MADM methods provides a wider and optimum choice among the available networks with quality of service. Numerical results reveal that the proposed scheme is effective for the spectrum handoff decision for optimal network selection, with reduced complexity, in NEMO based CR vehicular networks.
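    Grey relational analysis, one of the MADM methods mentioned, can be sketched as follows. The candidate networks, attributes, and values are hypothetical, and the distinguishing coefficient 0.5 is the conventional choice; this is not the paper's exact formulation.

```python
def grey_relational_rank(candidates, benefit):
    # candidates: {name: [attr1, attr2, ...]}; benefit[i] is True when larger
    # values are better (e.g. bandwidth) and False when smaller are better
    # (e.g. delay, cost). Attribute choices here are illustrative.
    names = list(candidates)
    cols = list(zip(*candidates.values()))
    # Min-max normalize each attribute to [0, 1], flipping cost-type attributes.
    norm = []
    for name in names:
        row = []
        for i, v in enumerate(candidates[name]):
            lo, hi = min(cols[i]), max(cols[i])
            x = 0.5 if hi == lo else (v - lo) / (hi - lo)
            row.append(x if benefit[i] else 1.0 - x)
        norm.append(row)
    # Grey relational coefficient against the ideal sequence (all ones);
    # after normalization the global delta_min is 0 and delta_max is 1.
    zeta = 0.5  # distinguishing coefficient, conventional value
    scores = []
    for row in norm:
        coeffs = [zeta / (abs(1.0 - x) + zeta) for x in row]
        scores.append(sum(coeffs) / len(coeffs))
    return sorted(zip(names, scores), key=lambda p: -p[1])

# Hypothetical candidate networks: [bandwidth Mb/s, delay ms, cost]
nets = {"WiMAX": [40.0, 60.0, 4.0], "LTE": [80.0, 30.0, 8.0], "UMTS": [10.0, 90.0, 2.0]}
ranking = grey_relational_rank(nets, benefit=[True, False, False])
print(ranking[0][0])  # best candidate for these made-up numbers
```

The MR would run such a ranking over the coexisting radio access networks and hand off to the top-scoring one; a cost based method would simply replace the scoring function.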

  4. Viral mediated redirection of NEMO/IKKγ to autophagosomes curtails the inflammatory cascade.

    Directory of Open Access Journals (Sweden)

    Patricia M Fliss

    2012-02-01

    The early host response to viral infections involves transient activation of pattern recognition receptors, leading to an induction of inflammatory cytokines such as interleukin-1β (IL-1β) and tumor necrosis factor α (TNFα). Subsequent activation of cytokine receptors in an autocrine and paracrine manner results in an inflammatory cascade. The precise mechanisms by which viruses avert an inflammatory cascade are incompletely understood. Nuclear factor (NF)-κB is a central regulator of the inflammatory signaling cascade that is controlled by inhibitor of NF-κB (IκB) proteins and the IκB kinase (IKK) complex. In this study we show that murine cytomegalovirus inhibits the inflammatory cascade by blocking Toll-like receptor (TLR)- and IL-1 receptor-dependent NF-κB activation. Inhibition occurs through an interaction of the viral M45 protein with the NF-κB essential modulator (NEMO), the regulatory subunit of the IKK complex. M45 induces proteasome-independent degradation of NEMO by targeting NEMO to autophagosomes for subsequent degradation in lysosomes. We propose that the selective and irreversible degradation of a central regulatory protein by autophagy represents a new viral strategy to dampen the inflammatory response.

  5. Modular target acquisition model & visualization tool

    NARCIS (Netherlands)

    Bijl, P.; Hogervorst, M.A.; Vos, W.K.

    2008-01-01

    We developed a software framework for image-based simulation models in the chain: scene-atmosphere-sensor-image enhancement-display-human observer: EO-VISTA. The goal is to visualize the steps and to quantify (Target Acquisition) task performance. EO-VISTA provides an excellent means to systematical

  6. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    This document contains a wealth of information about the design and implementation of the Next-TELL open learner model. Information is included about the final specification (Section 3), the interfaces and features (Section 4), its implementation and technical design (Section 5) and also a summary...

  7. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn; Remke, Anne; Haverkort, Boudewijn R.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to demonstra

  8. Engineering tools for robust creep modelling

    OpenAIRE

    Holmström, Stefan

    2010-01-01

    High-temperature creep is often dealt with using simplified models to assess and predict the future behavior of materials and components. Also, for most applications the creep properties of interest require costly long-term testing, which limits the available data to support design and life assessment. Such test data sets are even smaller for welded joints, which are often the weakest links of structures. It is of considerable interest to be able to reliably predict and extrapolate long-term creep beha...

  9. Theme E: disabilities: analysis models and tools

    OpenAIRE

    Vigouroux, Nadine; Gorce, Philippe; Roby-Brami, Agnès; Rémi-Néris, Olivier

    2013-01-01

    This paper presents the topics and the activity of theme E, “disabilities: analysis models and tools”, within the GDR STIC Santé. This group organized a conference and a workshop during the period 2011–2012. The conference focused on technologies for cognitive, sensory and motor impairments, assessment and use studies of assistive technologies, user-centered design methods, and the place of ethics in these research topics. The objective of “bodily integration of ...

  10. Constructing an advanced software tool for planetary atmospheric modeling

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  11. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  13. Rasp Tool on Phoenix Robotic Arm Model

    Science.gov (United States)

    2008-01-01

    This close-up photograph, taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of the Robotic Arm of NASA's Phoenix Mars Lander. The rasp will be placed against the hard Martian surface to cut into it and acquire an icy soil sample for analysis by Phoenix's scientific instruments. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  14. A Hybrid Tool for User Interface Modeling and Prototyping

    Science.gov (United States)

    Trætteberg, Hallvard

    Although many methods have been proposed, model-based development methods have only to some extent been adopted for UI design. In particular, they are not easy to combine with user-centered design methods. In this paper, we present a hybrid UI modeling and GUI prototyping tool, which is designed to fit better with IS development and UI design traditions. The tool includes a diagram editor for domain and UI models and an execution engine that integrates UI behavior, live UI components and sample data. Thus, both model-based user interface design and prototyping-based iterative design are supported.

  15. Advanced energy systems and technologies research in Finland. NEMO-2 Programme Annual Report 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-10-01

    Advanced energy technologies were linked to the national energy research in the beginning of 1988, when energy research was reorganised in Finland. The Ministry of Trade and Industry established several energy research programmes, and NEMO was one of them. Major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on a few promising technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies, such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin film solar cells). In Finland, the growth of the new energy technology industry is concentrated on these areas. The turnover of the Finnish industry has been growing considerably due to the national research activities and support of technology development. The sales have increased more than 10 times compared with the year 1987 and are now over 300 million FIM. The support to industries and their involvement in the programme has grown considerably. In this report, the essential research projects of the programme during 1996-1997 are described. The total funding for these projects was about 30 million FIM per year, of which TEKES's share was about 40 per cent. The programme consists of 10 research projects, some 15 joint development projects, and 9 EU projects. Where the research projects and joint development projects are closely linked, the description of the project is

  16. Advanced energy systems and technologies research in Finland. NEMO 2 annual report 1994-1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    Advanced energy technologies were linked to the national energy research in the beginning of 1988, when energy research was reorganised in Finland. The Ministry of Trade and Industry set up many energy research programmes, and NEMO was one of them. Major objectives of the programme were to assess the potential of new energy systems for the national energy supply system and to promote industrial activities. Within the NEMO 2 programme for the years 1993-1998, research was focused on technological solutions. In the beginning of 1995, the national energy research activities were passed on to the Technology Development Centre TEKES. The NEMO 2 programme is directed towards those areas that have particular potential for commercial exploitation or development. Emphasis is placed particularly on solar and wind energy, as well as supporting technologies such as energy storage and hydrogen technology. Resources have been focused on three specific areas: arctic wind technology, wind turbine components, and the integration of solar energy into applications (including thin film solar cells). It seems that in Finland the growth of the new energy technology industry is focused on these areas. The sales of the industry have been growing considerably due to the national research activities and support of technology development. The sales have increased 6-7 times compared to the year 1987 and are now over 200 million FIM. The support to industries and their involvement in the programme has grown more than 15 times compared to 1988. The total funding of the NEMO 2 programme was 30 million FIM in 1994 and 21 million FIM in 1995. The programme consists of 20 research projects, 15 joint development projects, and 5 EU projects. In this report, the essential research projects of the programme in 1994-1995 are described. The total funding for these projects was about 25 million FIM, of which TEKES's share was about half. When the research projects and joint development projects are

  17. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send to the computing module only the data related to the requested result. The remaining data are not relevant and will only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must revise the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  18. Sensitivity of the Mediterranean sea level to atmospheric pressure and free surface elevation numerical formulation in NEMO

    Directory of Open Access Journals (Sweden)

    P. Oddo

    2014-06-01

    Full Text Available The sensitivity of the dynamics of the Mediterranean Sea to atmospheric pressure and free surface elevation formulation using NEMO (Nucleus for European Modelling of the Ocean) was evaluated. Four different experiments were carried out in the Mediterranean Sea using filtered or explicit free surface numerical schemes and accounting for the effect of atmospheric pressure in addition to wind and buoyancy fluxes. Model results were evaluated by coherency and power spectrum analysis against tide gauge data. We found that atmospheric pressure plays an important role for periods shorter than 100 days. The free surface formulation is important to obtain the correct ocean response for periods shorter than 30 days. At frequencies higher than (15 days)⁻¹ the Mediterranean basin's response to atmospheric pressure was not coherent and the performance of the model strongly depended on the specific area considered. A large-amplitude seasonal oscillation observed in the experiments using a filtered free surface was not evident in the corresponding explicit free surface formulation case, owing to a phase shift between mass fluxes in the Gibraltar Strait and at the surface. The configuration with time splitting and atmospheric pressure always performed best; the differences were enhanced at very high frequencies.
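    The coherency analysis described above compares two sea-level time series in the frequency domain. A minimal sketch of that kind of analysis, using synthetic series and illustrative variable names (not data or code from the NEMO study), could look like this:

```python
# Hedged sketch of a coherency analysis between a modelled sea-surface-height
# series and a tide-gauge record; the forcing period and noise levels are
# invented for illustration.
import numpy as np
from scipy.signal import coherence

fs = 1.0  # one sample per day
t = np.arange(2048) / fs
rng = np.random.default_rng(0)

# Common atmospheric-pressure forcing at a 20-day period plus independent noise.
forcing = np.sin(2 * np.pi * t / 20.0)
tide_gauge = forcing + 0.3 * rng.standard_normal(t.size)
model_ssh = 0.8 * forcing + 0.3 * rng.standard_normal(t.size)

# Magnitude-squared coherence as a function of frequency (cycles per day).
f, Cxy = coherence(tide_gauge, model_ssh, fs=fs, nperseg=256)

# Coherence near the forcing frequency (1/20 cpd) should be high.
i = np.argmin(np.abs(f - 1.0 / 20.0))
print(float(Cxy[i]))
```

At frequencies dominated by the shared forcing the coherence approaches one, while at noise-dominated frequencies it drops, which is the behaviour the abstract uses to separate coherent and incoherent basin responses.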

  19. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer’s preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation...

  20. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. The concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  1. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is not at all a new buzzword within the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling discussion around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability and model management as a whole become indispensable activities of the model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  2. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We first present the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
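    The ROC curve and AUC named above can be computed with a short threshold sweep. This is a generic numpy sketch of the statistic, not Dinamica EGO code; the toy labels and scores are invented:

```python
# Minimal ROC/AUC computation: sort by predicted score, sweep the threshold,
# and accumulate true-positive and false-positive rates.
import numpy as np

def roc_auc(labels, scores):
    """Return FPR, TPR arrays and the area under the ROC curve."""
    order = np.argsort(-scores)                    # descend by predicted score
    labels = np.asarray(labels)[order]
    tpr = np.concatenate([[0.0], np.cumsum(labels) / labels.sum()])
    fpr = np.concatenate([[0.0], np.cumsum(1 - labels) / (1 - labels).sum()])
    auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)  # trapezoid rule
    return fpr, tpr, auc

labels = np.array([1, 1, 0, 1, 0, 0])              # observed events
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.4, 0.2])  # predicted probabilities
fpr, tpr, auc = roc_auc(labels, scores)
print(round(float(auc), 3))  # → 0.889
```

The same curve underlies the partial AUC and the AUC confidence interval the tools report; those only restrict or resample the (FPR, TPR) points produced here.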

  3. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.

  4. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents different tools available for risk assessment in fractured clayey tills, and their advantages and limitations are discussed. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately...

  5. Homology modeling: an important tool for the drug discovery.

    Science.gov (United States)

    França, Tanos Celmar Costa

    2015-01-01

    In recent decades, homology modeling has become a popular tool for accessing theoretical three-dimensional (3D) structures of molecular targets. So far several 3D models of proteins have been built by this technique and used in a great diversity of structural biology studies. But are those models consistent enough with experimental structures to make this technique an effective and reliable tool for drug discovery? Here we briefly present the fundamentals and current state of the art of the homology modeling techniques used to build 3D structures of molecular targets whose experimental structures are not available in databases, and list some of the more important works using this technique available in the literature today. In many cases those studies have afforded successful models for the design of more selective agonists/antagonists of the molecular targets in focus and guided promising experimental work, proving that, when appropriate templates are available, useful models can be built using any of the several software packages available today for this purpose. Limitations of the experimental techniques used to solve 3D structures, allied to constant improvements in homology modeling software, will maintain the need for theoretical models, establishing homology modeling as a fundamental tool for drug discovery.

  6. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  7. NEMO-SN1 observatory developments in view of the European Research Infrastructures EMSO and KM3NET

    Energy Technology Data Exchange (ETDEWEB)

    Favali, Paolo, E-mail: emsopp@ingv.i [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Roma 2, Via di Vigna Murata 605, 00143 Roma (Italy); Beranzoli, Laura [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Roma 2, Via di Vigna Murata 605, 00143 Roma (Italy); Italiano, Francesco [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Palermo, Via Ugo La Malfa 153, 90146 Palermo (Italy); Migneco, Emilio; Musumeci, Mario; Papaleo, Riccardo [Istituto Nazionale di Fisica Nucleare (INFN), Laboratori Nazionali del Sud, Via di S. Sofia 62, 95125 Catania (Italy)

    2011-01-21

    NEMO-SN1 (Western Ionian Sea off Eastern Sicily), the first real-time multiparameter observatory operating in Europe since 2005, is one of the nodes of the upcoming European ESFRI large-scale research infrastructure EMSO (European Multidisciplinary Seafloor Observatory), a network of seafloor observatories placed at marine sites on the European Continental Margin. NEMO-SN1 also constitutes an important test site for the study of prototypes of the Kilometre Cube Neutrino Telescope (KM3NeT), another European ESFRI large-scale research infrastructure. Italian resources have been devoted to the development of NEMO-SN1 facilities and logistics, as with the PEGASO project, while the EC project ESONET-NoE is funding a demonstration mission and a technological test. EMSO and KM3NeT are presently in the Preparatory Phase as projects funded under the EC-FP7.

  8. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Numerous design decisions, including architectural decisions, are made while developing a software system, and they influence the architecture of the system as well as subsequent decisions. Several tools already exist for managing design decisions, i.e. capturing, documenting, and maintaining them..., but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models... to enforce design decisions (modify the models). We define tool-independent concepts and architecture building blocks supporting these requirements and present first ideas on how this can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless integration...

  9. A general thermal model of machine tool spindle

    Directory of Open Access Journals (Sweden)

    Yanfang Dong

    2017-01-01

    Full Text Available As the core component of a machine tool, the thermal characteristics of the spindle have a significant influence on machine tool running status. Lack of an accurate model of the spindle system, particularly of the load–deformation coefficient between the bearing rolling elements and rings, severely limits the thermal error analytic precision of the spindle. In this article, bearing internal loads, especially the functional relationships between the principal curvature difference F(ρ) and the auxiliary parameter nδ, semi-major axis a, and semi-minor axis b, have been determined; furthermore, high-precision heat generation, combined with the heat sinks in the spindle system, is calculated; finally, an accurate thermal model of the spindle was established. Moreover, a conventional spindle with embedded fiber Bragg grating temperature sensors has been developed. Comparison of the experimental results with the simulation indicates that the model has good accuracy, which verifies the reliability of the modeling process.

  10. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating the first-order sensitivity coefficients, using sparse matrix technology, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
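    The direct method above integrates the model ODE together with its first-order sensitivity equations, ds/dt = J·s + ∂f/∂p, where J is the Jacobian of the model equation. A minimal sketch of that coupling for a single first-order decay reaction (a toy stand-in for the FORTRAN package, using scipy's BDF solver as the Gear-type integrator) is:

```python
# Forward-sensitivity sketch: dy/dt = -k*y, with s = dy/dk satisfying
# ds/dt = J*s + df/dk = (-k)*s + (-y). Analytic solution: s(t) = -t*exp(-k*t).
import numpy as np
from scipy.integrate import solve_ivp

k = 0.5  # rate constant (illustrative value)

def rhs(t, z):
    y, s = z                      # model state and its sensitivity to k
    return [-k * y,               # model equation
            -k * s - y]           # Jacobian term (-k)*s plus df/dk = -y

sol = solve_ivp(rhs, (0.0, 4.0), [1.0, 0.0], method="BDF",
                rtol=1e-8, atol=1e-10)  # BDF is a Gear-type stiff method
y_end, s_end = sol.y[:, -1]
print(y_end, s_end)
```

For a full mechanism the Jacobian is sparse, which is where the triangularization with sparse matrix technology mentioned in the abstract pays off.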

  11. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems", for the period 2005 to 2007. The main activity is semi-annual... workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops..., Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model...

  12. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked against experimental data and then used to complement and expand the database, making it more detailed and inclusive of more measurement environments that are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma-ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  13. Evaluation of QoS supported in Network Mobility NEMO environments

    Science.gov (United States)

    Hussien, L. F.; Abdalla, A. H.; Habaebi, M. H.; Khalifa, O. O.; Hassan, W. H.

    2013-12-01

    In the Network Mobility Basic Support (NEMO BS) protocol, an entire network roams as a unit, changing its point of attachment to the Internet and consequently its reachability in the network topology. NEMO BS does not provide QoS guarantees to its users, just as the traditional IP Internet and Mobile IPv6 do not. Typically, all users receive the same level of service without consideration of their application requirements. This poses a problem for real-time applications that require QoS guarantees. To gain more effective control of the network, incorporating QoS is needed. Within a QoS-enabled network, traffic flows can be assigned different priorities, and network bandwidth and resources can be allocated to different applications and users. The Internet Engineering Task Force (IETF) working groups have proposed several QoS solutions for static networks, such as IntServ, DiffServ and MPLS. These QoS solutions were designed in the context of a static environment (i.e. fixed hosts and networks) and are not fully adapted to mobile environments; they essentially need to be extended and adjusted to meet the various challenges involved in mobile environments. Building on existing QoS mechanisms, many proposals have been developed to provide QoS for individual mobile nodes (i.e. host mobility). In contrast, research on the movement of a whole mobile network in IPv6 is still being undertaken by the IETF working groups (i.e. network mobility), and little research has been done in the area of providing QoS for roaming networks. Therefore, this paper aims to review and investigate previous and current related works that have been developed to provide QoS in mobile networks. Consequently, a new scheme will be introduced to enhance QoS within the NEMO environment, achieving seamless mobility for users of the mobile network node (MNN).

  14. Launching a home for the elderly and infirm in Požega-Slavonia County

    OpenAIRE

    Asančaić, Ana; Plazibat, Olga; Vukoja, Ivan; Vukoja, Marko

    2015-01-01

    The business venture of building a home for the elderly and infirm in Požega-Slavonia County is based on the demographic trend of an ageing population and the increased emigration of young people. An increased need has been observed for appropriate forms of accommodation for elderly and infirm persons that would satisfy users through the quality of its services. Based on Porter's five competitive forces model, the residential-home services industry proved to be moderately attractive; for the entry of new ventures, due to the threat of entry of n...

  15. RedNemo: topology-based PPI network reconstruction via repeated diffusion with neighborhood modifications

    DEFF Research Database (Denmark)

    Alkan, Ferhat; Erten, Cesim

    2016-01-01

    tested on small-scale networks thus far, and when applied to large-scale networks of popular PPI databases, the executions require unreasonable amounts of time, or may even crash without producing any output for some instances even after several months of execution. We provide an algorithm, RedNemo... material including source code, useful scripts, experimental data and the results are available at http://webprs.khas.edu.tr/∼cesim/RedNemo.tar.gz CONTACT: cesim@khas.edu.tr Supplementary information: Supplementary data are available at Bioinformatics online.

  16. Measurement of the atmospheric muon flux at 3500 m depth with the NEMO Phase-2 detector

    Directory of Open Access Journals (Sweden)

    Distefano C.

    2016-01-01

    Full Text Available In March 2013, the NEMO Phase-2 tower was successfully deployed 80 km off-shore from Capo Passero (Italy) at 3500 m depth. The tower operated continuously until August 2014. We present the results of the atmospheric muon analysis from the data collected in 411 days of live time. The zenith-angle distribution of atmospheric muons was measured and the results compared with Monte Carlo simulations. The associated depth-intensity relation was then measured and compared with previous measurements and theoretical predictions.

  17. The km³ Mediterranean neutrino observatory - the NEMO.RD project

    CERN Document Server

    De Marzo, C N

    2001-01-01

    The NEMO.RD Project is a feasibility study of a km³ underwater telescope for high energy astrophysical neutrinos to be located in the Mediterranean Sea. Results on various issues of this project are presented on: i) Monte Carlo simulation study of the capabilities of various arrays of phototubes in order to determine the detector geometry that can optimize performance and cost; ii) oceanographic survey of various sites in search of the optimal one; iii) feasibility study of mechanics, deployment, connections and maintenance of such a detector. Parameters of a site near Capo Passero, Sicily, where depth, transparency and other water parameters seem optimal are shown.

  18. Integrated landscape/hydrologic modeling tool for semiarid watersheds

    Science.gov (United States)

    Mariano Hernandez; Scott N. Miller

    2000-01-01

    An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...

  19. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer’s preferences, goals and processes..., and envisions future directions which focus on personalizing the processes to a designer’s particular wishes...

  20. Simulation modeling: a powerful tool for process improvement.

    Science.gov (United States)

    Boxerman, S B

    1996-01-01

    Simulation modeling provides an efficient means of examining the operation of a system under a variety of alternative conditions. This tool can potentially enhance a benchmarking project by providing a means for evaluating proposed modifications to the system or process under study.

  1. Designing a Training Tool for Imaging Mental Models

    Science.gov (United States)

    1990-11-01

    about how to weave together their disparate fields into a seamless web of knowledge. Learners often cannot visualize how the concepts and skills they... a seamless web of knowledge? "How does the availability of a mental modeling tool enhance the ability of instructional designers to prepare

  2. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis reveals the invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel. It is a system of geometrical objects, allowing one to build the spatial structure of physical models and to set a distribution of physical properties. For such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  3. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem are successfully integrated into the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  4. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMMs and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
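    The Viterbi path that HMMEditor visualizes is computed by dynamic programming over the HMM's transition and emission probabilities. A toy two-state sketch of that algorithm (invented probabilities, not HMMEditor's profile-HMM machinery) is:

```python
# Viterbi decoding for a tiny two-state HMM in log space: keep the best
# log-probability per state at each position, then backtrack.
import numpy as np

states = ["match", "insert"]
start = np.log([0.7, 0.3])            # initial state probabilities
trans = np.log([[0.8, 0.2],           # P(match -> match), P(match -> insert)
                [0.4, 0.6]])          # P(insert -> match), P(insert -> insert)
emit = {"A": np.log([0.6, 0.25]),     # emission probability per state
        "C": np.log([0.4, 0.75])}

def viterbi(seq):
    v = start + emit[seq[0]]
    back = []
    for sym in seq[1:]:
        scores = v[:, None] + trans   # scores[i, j] = v[i] + log P(i -> j)
        back.append(scores.argmax(axis=0))
        v = scores.max(axis=0) + emit[sym]
    path = [int(v.argmax())]
    for ptr in reversed(back):        # backtrack through stored pointers
        path.append(int(ptr[path[-1]]))
    return [states[i] for i in reversed(path)]

print(viterbi("AACC"))
```

A profile HMM adds match/insert/delete states per alignment column, but the recurrence and backtracking are the same idea.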

  5. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    Science.gov (United States)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and the corresponding heat treatment is mainly based on a trial-and-error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of the mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of the feedforward neural network with an error backpropagation training scheme and four layers of neurons (8-20-20-2) was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. An optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
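    The 8-20-20-2 architecture above maps eight inputs (composition and heat-treatment variables) to two outputs (e.g. hardness and fracture toughness). A shape-level sketch of the forward pass, with random placeholder weights rather than the trained model from the paper, is:

```python
# Forward pass of an 8-20-20-2 feedforward network: two tanh hidden layers
# and a linear output layer. Weights are random stand-ins, not trained values.
import numpy as np

rng = np.random.default_rng(42)
sizes = [8, 20, 20, 2]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(x @ w + b)            # hidden layers, sigmoidal activation
    return x @ weights[-1] + biases[-1]   # linear output: e.g. [hardness, K_Ic]

x = rng.standard_normal((5, 8))  # five candidate composition/treatment vectors
print(forward(x).shape)
```

Training with error backpropagation would then adjust `weights` and `biases` to fit the measured tempering diagrams; the point here is only the data flow through the four layers.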

  6. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
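    Roger's approximation fits the tabulated generalized aerodynamic forces with a rational function of the Laplace variable so the model becomes a finite-dimensional state space system. A hedged single-term sketch, with assumed (fixed) lag roots and synthetic data rather than doublet-lattice output, is:

```python
# Roger-style rational function approximation for one generalized-force term:
#   Q(s) ≈ A0 + A1*s + A2*s^2 + sum_l A_{l+2} * s / (s + b_l),  s = i*k.
# Lag roots b_l are fixed; the coefficients follow from linear least squares.
import numpy as np

ks = np.linspace(0.01, 1.0, 40)   # sample reduced frequencies
s = 1j * ks
lags = np.array([0.2, 0.6])       # assumed lag roots b_l

def roger(c, s):
    return c[0] + c[1] * s + c[2] * s**2 + sum(
        cl * s / (s + b) for cl, b in zip(c[3:], lags))

# Synthetic "tabulated" unsteady aerodynamic data from known coefficients.
true = np.array([1.0, 0.5, -0.2, 0.8, -0.3])
Q = roger(true, s)

# Stack real and imaginary parts so the fit stays a real least-squares problem.
cols = [np.ones_like(s), s, s**2] + [s / (s + b) for b in lags]
A = np.array(cols).T
M = np.vstack([A.real, A.imag])
rhs = np.concatenate([Q.real, Q.imag])
coef, *_ = np.linalg.lstsq(M, rhs, rcond=None)
print(np.allclose(coef, true, atol=1e-8))
```

Each lag term contributes one extra state per mode, which is how the few-second state space model the report describes is assembled from the frequency-domain data.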

  7. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. From the SERVQUAL method the model takes the five dimensions of requirements and three of characteristics, and from the QFD method it takes the application methodology. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfil the customers' requirements. To prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
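    The kind of global index described can be sketched as requirement weights applied to a QFD-style relationship matrix and characteristic fulfilment scores. All numbers, the normalisation, and the 0-to-1 scoring scale below are illustrative assumptions, not the paper's actual formulas.

```python
import numpy as np

# Sketch of a QFD-style global quality index: customer requirement weights
# times a requirement-vs-characteristic relationship matrix times the
# measured fulfilment of each characteristic. The 5x3 layout mirrors the
# five SERVQUAL requirement dimensions and three characteristic groups;
# all values are illustrative.

req_weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # sum to 1
relationship = np.array([                                # 5 reqs x 3 chars
    [9, 3, 1],
    [3, 9, 1],
    [1, 3, 9],
    [3, 1, 9],
    [9, 1, 3],
])
fulfilment = np.array([0.8, 0.6, 0.9])                   # 0..1 per characteristic

# normalise each requirement row so every requirement contributes proportionally
row_norm = relationship / relationship.sum(axis=1, keepdims=True)
global_index = float(req_weights @ row_norm @ fulfilment)  # in [0, 1]
```

A single number like this lets the provider track the overall requirements-accomplishment level over successive surveys.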

  8. Greenhouse gases from wastewater treatment - A review of modelling tools.

    Science.gov (United States)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially by autotrophic biomass, is still incomplete. The review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs; indeed, several studies have confirmed that a plant-wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs, owing to the difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release must be enhanced, together with data acquisition.

  9. AgMIP Training in Multiple Crop Models and Tools

    Science.gov (United States)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. Several major limitations must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used by the various models. Two activities were undertaken to address these shortcomings and enable the RRTs to use multiple models to evaluate climate impacts on crop production and food security. First, we designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model with which they had the least experience. Second, the AgMIP IT group created templates for entering data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing the entry and translation tools are reviewed in this chapter.
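    The translation step can be pictured as rendering harmonized records into a model-specific input file. The field names and the fixed-width target layout below are hypothetical placeholders for illustration; they are not the real AgMIP database schema or any actual crop model's format.

```python
# Sketch of a harmonized-to-model translation tool: harmonized daily
# weather records (plain dicts here) are rendered as fixed-width rows
# for one hypothetical crop model. Field names and layout are invented.

harmonized_weather = [
    {"date": "2010-01-01", "srad": 12.1, "tmax": 28.4, "tmin": 17.2, "rain": 0.0},
    {"date": "2010-01-02", "srad": 10.8, "tmax": 27.9, "tmin": 16.8, "rain": 4.2},
]

def to_model_format(records):
    """Render harmonized daily weather as fixed-width text for one model."""
    lines = ["@DATE      SRAD  TMAX  TMIN  RAIN"]
    for r in records:
        lines.append(f"{r['date']}  {r['srad']:5.1f} {r['tmax']:5.1f} "
                     f"{r['tmin']:5.1f} {r['rain']:5.1f}")
    return "\n".join(lines)

text = to_model_format(harmonized_weather)
```

One translator per target model lets a single harmonized database drive every crop model in the intercomparison.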

  10. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... of the framework. The issue of commercial simulators or software providing the necessary features for product-process synthesis-design as opposed to their development by the academic PSE community will also be discussed. An example of a successful collaboration between academia-industry for the development...

  11. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth of understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric science draws on core knowledge traditionally taught in physics, chemistry, and, in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and conceptual tools are then deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and, in some cases, a re-envisioning of the format of the science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections within it.

  12. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D; Halford, Keith J.; Binley, Andrew; Lane, John; Werkema, Dale

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.
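    The central message of such a pre-modeling tool, that an inversion recovers a blurred, depth-degraded version of the true resistivity section, can be caricatured in a few lines. The sketch below is only a cartoon of that idea (a depth-dependent averaging kernel standing in for inversion smoothing); it is not the SEER tool's actual forward/inverse engine.

```python
import numpy as np

# Cartoon of ERI pre-modeling: a "true" resistivity section containing a
# small conductive target is blurred with a kernel that widens with depth,
# mimicking the resolution loss a real inversion would show.

nz, nx = 20, 40
true_model = np.full((nz, nx), 100.0)   # background resistivity (ohm-m)
true_model[5:8, 18:22] = 10.0           # conductive target

def blur_row(row, half_width):
    k = np.ones(2 * half_width + 1)
    k /= k.size
    return np.convolve(row, k, mode="same")

recovered = np.vstack([
    blur_row(true_model[z], half_width=1 + z // 4)   # wider blur with depth
    for z in range(nz)
])

# the anomaly is weaker (closer to background) in the recovered image
true_contrast = true_model.min()
recovered_contrast = recovered[5:8, 18:22].min()
```

Comparing `true_model` with `recovered` side by side conveys, before any fieldwork, whether a target of this size and depth would survive the survey's resolution.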

  13. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and to their QCD and QED renormalization group evolution below the electroweak scale. (orig.)
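    The structure of a one-loop running calculation like the one such a package automates can be illustrated on a toy two-coefficient system. The anomalous dimension matrix and initial conditions below are made up; real matrices are taken from the literature, and DsixTools itself works in Mathematica, not Python.

```python
import numpy as np

# Toy one-loop renormalization group evolution of two Wilson coefficients:
#   dC/dln(mu) = (gamma^T / (16 pi^2)) C
# integrated with small Euler steps and checked against the closed-form
# matrix-exponential solution. gamma and C0 are invented for illustration.

gamma = np.array([[4.0, 1.0],
                  [0.5, -2.0]])
pref = gamma.T / (16 * np.pi**2)

def evolve(C0, t_final, steps=10000):
    C = C0.astype(float).copy()
    dt = t_final / steps
    for _ in range(steps):
        C = C + dt * (pref @ C)
    return C

C0 = np.array([1.0, 0.2])
t = np.log(1000.0 / 91.0)      # ln(mu_high / mu_low), e.g. 1 TeV down to M_Z

# closed-form solution via eigen-decomposition of the constant matrix
w, V = np.linalg.eig(pref)
C_exact = (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V) @ C0).real

C_euler = evolve(C0, t)
```

Because the anomalous dimension matrix is constant at one loop, the evolution is just a matrix exponential; the numerical integration is only there to mirror how a general (scale-dependent) running would be handled.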

  14. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine the extent to which the process area can be damaged. At the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists, so insurers reach different results by applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software packages, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to determine which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  15. Teuvincenone F Suppresses LPS-Induced Inflammation and NLRP3 Inflammasome Activation by Attenuating NEMO Ubiquitination.

    Science.gov (United States)

    Zhao, Xibao; Pu, Debing; Zhao, Zizhao; Zhu, Huihui; Li, Hongrui; Shen, Yaping; Zhang, Xingjie; Zhang, Ruihan; Shen, Jianzhong; Xiao, Weilie; Chen, Weilin

    2017-01-01

    Inflammation causes many diseases that are serious threats to human health. However, the molecular mechanisms underlying the regulation of inflammation and inflammasome activation are not fully understood, which has delayed the discovery of urgently needed new anti-inflammatory drugs. Here, we found that the natural compound Teuvincenone F, isolated and purified from the stems and leaves of Premna szemaoensis, could significantly inhibit lipopolysaccharide (LPS)-induced pro-inflammatory cytokine production and NLRP3 inflammasome activation. Our results showed that Teuvincenone F attenuated K63-linked ubiquitination of NF-κB-essential modulator (NEMO, also known as IKKγ) to suppress LPS-induced phosphorylation of NF-κB, and inhibited mRNA expression of IL-1β, IL-6, TNF-α, and NLRP3. In addition, we found that decreased NLRP3 expression under Teuvincenone F treatment suppressed NLRP3 inflammasome activation and IL-1β/IL-18 maturation. In vivo, we revealed that Teuvincenone F treatment relieved LPS-induced inflammation. In conclusion, Teuvincenone F is a highly effective natural compound that suppresses LPS-induced inflammation by attenuating K63-linked ubiquitination of NEMO, highlighting that it may be a potential new anti-inflammatory drug for the treatment of inflammatory and NLRP3 inflammasome-driven diseases.

  16. Nephrectomized and hepatectomized animal models as tools in preclinical pharmacokinetics.

    Science.gov (United States)

    Vestergaard, Bill; Agersø, Henrik; Lykkesfeldt, Jens

    2013-08-01

    Early understanding of the pharmacokinetics and metabolic patterns of new drug candidates is essential for the selection of optimal candidates to move further into the drug development process. In vitro methodologies can be used to investigate metabolic patterns, but in general they lack several aspects of whole-body physiology. In contrast, the complexity of intact animals does not necessarily allow individual processes to be identified. Animal models lacking a major excretion organ can be used to investigate these individual processes. Nephrectomized and hepatectomized animal models have considerable potential as tools in preclinical pharmacokinetics to assess which organs are important for drug clearance, and thereby to identify metabolic processes that could be manipulated to improve the pharmacokinetic properties of the molecules. Detailed knowledge of anatomy and surgical techniques is crucial to successfully establish the models, and well-balanced anaesthesia and adequate monitoring of the animals are also of major importance. An obvious drawback of animal models lacking an organ is the disruption of normal homoeostasis and the induction of dramatic and ultimately mortal systemic changes in the animals. Refining the surgical techniques and the post-operative supportive care of the animals can increase the value of these models by minimizing the systemic changes induced, and thorough validation of nephrectomy and hepatectomy models is needed before they are used as tools in preclinical pharmacokinetics. The present MiniReview discusses the pros and cons of the available techniques for establishing nephrectomy and hepatectomy models.

  17. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... strategies have different goals e.g. fast response over disturbances, optimum power efficiency over a wider range of wind speeds, voltage ride-through capability including grid support. A dynamic model of a DC connection for active stall wind farms to the grid including the control is also implemented...

  18. Hypermedia as an experiential learning tool: a theoretical model

    Directory of Open Access Journals (Sweden)

    Jose Miguel Baptista Nunes

    1996-01-01

    Full Text Available The process of methodical design and development is of extreme importance in the production of educational software. However, this process will only be effective if it is based on a theoretical model that explicitly defines what educational approach is being used and how specific features of the technology can best support it. This paper proposes a theoretical model of how hypermedia can be used as an experiential learning tool. The development of the model was based on an experiential learning approach and simultaneously aims at minimising the inherent problems of hypermedia as the underlying support technology.

  19. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    Science.gov (United States)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydro power generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models, and computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models; it has been used in many water resources applications in Australia. Recently, soft artificial intelligence tools such as artificial neural networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasting. There is a strong need to develop ANN models based on real catchment data and to compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation of the use of the RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A genetic algorithm (GA) has been used as the optimizer in the RRL to calibrate the SimHyd model, and trial-and-error procedures were employed to arrive at the best values of the various parameters of the GA optimizer. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network trained by the back-propagation algorithm has been adopted to develop the ANN models. The daily rainfall and runoff data derived from the Bird Creek Basin, Oklahoma, USA, have been employed to develop all the models included here. A wide range of error statistics has been used to evaluate the performance of all the models.
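    The GA-calibration workflow can be sketched on a toy one-bucket rainfall-runoff model standing in for SimHyd. Everything below, the model, the synthetic rainfall, and the GA settings, is an illustrative assumption, not the RRL's actual implementation.

```python
import numpy as np

# Sketch of GA-style calibration of a toy one-bucket rainfall-runoff model.
# Parameters: k (recession coefficient), c (runoff fraction). The "observed"
# flows are generated from known parameters so the fit can be checked.

rng = np.random.default_rng(1)
rain = rng.gamma(0.5, 5.0, size=200)        # synthetic daily rainfall

def bucket(params, rain):
    k, c = params
    store, flow = 0.0, []
    for p in rain:
        store += c * p
        q = k * store
        store -= q
        flow.append(q)
    return np.array(flow)

obs = bucket((0.3, 0.6), rain)              # "observed" flows (known truth)

def rmse(params):
    return float(np.sqrt(np.mean((bucket(params, rain) - obs) ** 2)))

# minimal genetic algorithm: truncation selection plus Gaussian mutation
pop = rng.uniform([0.01, 0.01], [0.99, 0.99], size=(30, 2))
for gen in range(40):
    fitness = np.array([rmse(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]                 # keep the 10 best
    children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.05, (20, 2))
    pop = np.clip(np.vstack([parents, children]), 0.01, 0.99)

best = pop[np.argmin([rmse(ind) for ind in pop])]
```

A real GA calibration would add crossover and tuned mutation rates, which is exactly the kind of trial-and-error parameter selection the abstract describes.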

  20. Sounds of silence : A research into the relationship between administrative supervision, criminal investigation and the nemo-tenetur principle

    NARCIS (Netherlands)

    Peçi, I.

    2006-01-01

    The subject of this thesis is the relationship between administrative supervision, criminal investigation and the nemo-tenetur principle. The point of departure is the distinction made in Dutch law and doctrine between administrative supervision and criminal investigation. Such a distinction is

  1. Study of the tracking detector of the NEMO3 experiment. Simulation of the measurement of the ultra-low $^{208}$Tl radioactivity of the NEMO3 experiment source foils, candidates for neutrinoless double $\beta$ decay

    CERN Document Server

    Errahmane, K

    2001-01-01

  2. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform: seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK, for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time, river and groundwater flooding resulting from high rainfall events is increasing in scale and frequency, and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal, as well as for sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that a single science discipline is unable to answer these questions and their inter-relationships. Modern science increasingly employs computer models to simulate natural, economic and human systems. Management and planning require scenario modelling, forecasts and 'predictions'. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently unsuited to simulating the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSOs) are increasingly employing advances in information technology to visualise and improve their understanding of geological systems. Instead of two-dimensional paper maps and reports, many GSOs now produce three-dimensional geological framework models and groundwater flow models as their standard output. Additionally, the British Geological Survey has developed standard routines to link geological

  3. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems', for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'; ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project, 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020), under Intelligent Energy Europe was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, evaluation of assumptions and results in international models covering Denmark. (au)

  4. Application of Process Modeling Tools to Ship Design

    Science.gov (United States)

    2011-05-01

    Approved for Public Release; Distribution is unlimited. The briefing notes that different people have different preferences, so process data must be viewable in multiple formats (DSM matrices, Gantt charts, IDEF diagrams, spreadsheets, and flow charts, including flow charts by geography), produced with scheduling, spreadsheet, information modeling, and DSM software.

  5. Schistosomiasis japonica: modelling as a tool to explore transmission patterns.

    Science.gov (United States)

    Xu, Jun-Fang; Lv, Shan; Wang, Qing-Yun; Qian, Men-Bao; Liu, Qin; Bergquist, Robert; Zhou, Xiao-Nong

    2015-01-01

    Modelling is an important tool for the exploration of Schistosoma japonicum transmission patterns. It provides a general theoretical framework for decision-makers and lends itself specifically to assessing the progress of the national control programme by following the outcome of surveys. The challenge of keeping up with the many changes in the social, ecological and environmental factors involved in control activities is greatly facilitated by modelling, which can also indicate which activities are critical and which are less important. This review examines the application of modelling tools in the epidemiological study of schistosomiasis japonica during the last 20 years and explores the application of enhanced models for surveillance and response. Updated and timely information for decision-makers in the national elimination programme is provided but, in spite of the new modelling techniques introduced, many questions remain. Issues in the application of modelling are discussed with a view to improving the current situation with respect to schistosomiasis japonica. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs, forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors, and output from earth science models, are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations, reconciled through a US Forest Service SOAP web service, are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed by an analysis tool that extracts areas of high particulate matter concentrations and by a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics for polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and

  7. Virtual Sensor for Calibration of Thermal Models of Machine Tools

    Directory of Open Access Journals (Sweden)

    Alexander Dementjev

    2014-01-01

    strictly depends on the accuracy of these machines, but they are prone to deformation caused by their own heat. The deformation needs to be compensated in order to assure accurate production. An adequate model of the high-dimensional thermal deformation process must therefore be created and the parameters of this model must be evaluated. Unfortunately, such parameters are often unknown and cannot be calculated a priori. Parameter identification during real experiments is not an option for these models because of the high engineering effort and machine time required, and the installation of additional sensors to measure these parameters directly is uneconomical. Instead, an effective calibration of thermal models can be achieved by combining real and virtual measurements on a machine tool during its real operation, without installing additional sensors. In this paper, a new approach for thermal model calibration is presented. The results are very promising, and the approach can be recommended as an effective solution for this class of problems.
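    The calibration idea, fitting unknown thermal-model parameters to displacement data gathered during normal operation, can be sketched with a first-order deformation model. The model form, the synthetic "measurements", and the grid-search fit below are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Sketch of calibrating a simple thermal deformation model from operational
# data: tool-point displacement is assumed to follow d(t) = a*(1 - exp(-t/tau)),
# and the unknown parameters (a, tau) are fitted by minimising the squared
# error against noisy "measured" displacements.

rng = np.random.default_rng(2)
t = np.linspace(0, 120, 60)                 # minutes of machine operation
a_true, tau_true = 35.0, 40.0               # micrometres, minutes (synthetic)
d_meas = a_true * (1 - np.exp(-t / tau_true)) + rng.normal(0, 0.3, t.size)

def sse(params):
    a, tau = params
    return float(np.sum((a * (1 - np.exp(-t / tau)) - d_meas) ** 2))

# simple grid search over both parameters (no external optimizer needed)
grid_a = np.linspace(10, 60, 101)
grid_tau = np.linspace(5, 100, 96)
best = min(((a, tau) for a in grid_a for tau in grid_tau), key=sse)
```

Once calibrated, the model predicts the deformation from temperature history alone, which is what allows compensation without extra sensors.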

  8. Study of tracking detector of NEMO3 experiment - simulation of the measurement of the ultra low {sup 208}Tl radioactivity in the source foils used as neutrinoless double beta decay emitters in NEMO3 experiment; Etude du detecteur de traces de l'experience NEMO3. Simulation de la mesure de l'ultra-faible radioactivite en {sup 208}Tl des sources de l'experience NEMO3 candidates a la double desintegration {beta} sans emission de neutrino

    Energy Technology Data Exchange (ETDEWEB)

    Errahmane, K

    2001-04-01

    The purpose of the NEMO3 experiment is the search for neutrinoless double beta decay. This low-energy process would signal the massive Majorana nature of the neutrino. The experiment, with a very low radioactive background and containing 10 kg of enriched isotopes, studies mainly {sup 100}Mo. Installed at the Frejus underground laboratory, NEMO3 is a cylindrical detector consisting of very thin central source foils, a tracking detector made up of vertical drift cells operating in Geiger mode, a calorimeter, and suitable shielding. This thesis is divided into two parts. The first part is a full study of the features of the tracking detector. With a prototype composed of 9 drift cells, we characterised the longitudinal and transverse reconstruction of the position of the ionisation created by a LASER. With the first 3 modules in operation, we used external radioactive neutron sources to measure the transverse resolution of the ionisation position in a drift cell for high-energy electrons. To study vertex reconstruction on the source foil, {sup 207}Bi sources, which produce conversion electrons, were used inside the 3 modules. In the second part of this thesis, we show with simulations that the NEMO3 detector itself can measure the ultra-low level of {sup 208}Tl contamination of the source foils, which comes from the natural thorium radioactive chain. Using electron-photon channels, we can obtain the {sup 208}Tl activity of the sources. With an analysis of the energy and time of flight of the particles, NEMO3 is able to reach a sensitivity of 20 {mu}Bq/kg after only 2 months of measurement. This sensitivity corresponds to the maximum {sup 208}Tl activity accepted for the sources in the NEMO3 proposal. (author)

  9. Evaluation of air pollution modelling tools as environmental engineering courseware.

    Science.gov (United States)

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, making it difficult to understand the relationships between inputs and outputs, and in a 3D context, where it becomes hard to analyse alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to run on PCs, represents one of three generations of programming languages (Fortran 77, Visual Basic and Java) applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate) in order to evaluate their ability to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.

  10. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology......, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

  11. Error Model and Accuracy Calibration of 5-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Fangyu Pan

    2013-08-01

    Full Text Available To improve the machining precision and reduce the geometric errors of a 5-axis machine tool, an error model and a calibration procedure are presented in this paper. The error model is built on the theory of multi-body systems and characteristic matrixes, which establishes the relationship between the cutting tool and the workpiece in theory. Accuracy calibration is difficult to achieve, but with laser instruments (a laser interferometer and a laser tracker) the errors can be measured accurately, which benefits later compensation.
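
    The characteristic matrixes of the multi-body approach are typically 4x4 homogeneous transforms chained along the machine's bodies, with small-angle error matrices inserted between them. A minimal sketch (the axis layout and error magnitude are invented for illustration):

```python
import numpy as np

def translation(dx, dy, dz):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

def small_rotation(ex, ey, ez):
    """First-order characteristic matrix for small angular errors (rad)."""
    T = np.eye(4)
    T[:3, :3] = [[1.0, -ez,  ey],
                 [ ez, 1.0, -ex],
                 [-ey,  ex, 1.0]]
    return T

# Nominal two-body chain (bed -> X carriage -> spindle), lengths invented
nominal = translation(100.0, 0.0, 0.0) @ translation(0.0, 0.0, 200.0)
# Same chain with a 0.1 mrad squareness error inserted on the X carriage
actual = (translation(100.0, 0.0, 0.0)
          @ small_rotation(0.0, 1e-4, 0.0)
          @ translation(0.0, 0.0, 200.0))

tip = np.array([0.0, 0.0, 0.0, 1.0])
volumetric_error = (actual @ tip) - (nominal @ tip)  # error at the tool tip
```

    The 0.1 mrad angular error 200 mm from the tip produces a 0.02 mm positional error, which is the kind of quantity the laser calibration measures and the compensation corrects.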

  12. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
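
    For overdamped dynamics with a linear (harmonic) coarse-grained force ansatz, minimizing the relative entropy rate reduces to a force-matching least-squares problem with a closed-form solution. A toy sketch on synthetic data (the fine-scale force law below is assumed purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "atomistic" samples: positions and forces from an anharmonic well
x = rng.normal(0.0, 0.5, size=2000)
f = -4.0 * x - 0.8 * x**3

# Coarse-grained ansatz: harmonic force F_cg(x; k) = -k x.
# Minimizing the relative entropy rate over k reduces to force matching:
#   k* = argmin_k mean((f + k x)^2),
# which has the closed-form least-squares solution below.
k_star = -np.sum(f * x) / np.sum(x * x)

residual = np.mean((f + k_star * x) ** 2)  # remaining model-form error
```

    The nonzero residual reflects the anharmonic term the coarse model cannot represent, mirroring the abstract's point that the diffusion coefficient and model form matter in the optimization.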

  13. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model includes a cut-off parameter that allows selecting the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
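
    The abstract does not give the ensemble's exact formulation; one simple Bayesian way to combine binary tool predictions, with a cut-off that trades specificity for sensitivity, is a naive-Bayes log-odds sum. The per-tool sensitivities and specificities below are invented, not the published values:

```python
import math

def ensemble_posterior(predictions, sensitivities, specificities, prior=0.5):
    """Posterior probability that a chemical is active, combining binary
    predictions from several QSAR tools under a naive-Bayes assumption."""
    log_odds = math.log(prior / (1 - prior))
    for p, se, sp in zip(predictions, sensitivities, specificities):
        if p == 1:   # tool flags the chemical as active
            log_odds += math.log(se / (1 - sp))
        else:        # tool clears the chemical
            log_odds += math.log((1 - se) / sp)
    return 1 / (1 + math.exp(-log_odds))

tools_se = [0.80, 0.70, 0.75, 0.85]   # illustrative sensitivities
tools_sp = [0.75, 0.80, 0.70, 0.65]   # illustrative specificities
post = ensemble_posterior([1, 1, 0, 1], tools_se, tools_sp)

# A cut-off below 0.5 favours sensitivity (fewer false negatives),
# matching the regulatory preference stated in the abstract.
CUTOFF = 0.3
label = int(post >= CUTOFF)
```

    Lowering `CUTOFF` flags more chemicals as active at the cost of more false positives, which is the trade-off the model's cut-off parameter exposes.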

  14. A Tool for Sharing Empirical Models of Climate Impacts

    Science.gov (United States)

    Rising, J.; Kopp, R. E.; Hsiang, S. M.

    2013-12-01

    Scientists, policy advisors, and the public struggle to synthesize the quickly evolving empirical work on climate change impacts. The Integrated Assessment Models (IAMs) used to estimate the impacts of climate change and the effects of adaptation and mitigation policies can also benefit greatly from recent empirical results (Kopp, Hsiang & Oppenheimer, Impacts World 2013 discussion paper). This paper details a new online tool for exploring, analyzing, combining, and communicating a wide range of impact results, and supporting their integration into IAMs. The tool uses a new database of statistical results, which researchers can expand both in depth (by providing additional results that describe existing relationships) and breadth (by adding new relationships). Scientists can use the tool to quickly perform meta-analyses of related results, using Bayesian techniques to produce pooled and partially-pooled posterior distributions. Policy advisors can apply the statistical results to particular contexts, and combine different kinds of results in a cost-benefit framework. For example, models of the impact of temperature changes on agricultural yields can be first aggregated to build a best-estimate of the effect under given assumptions, then compared across countries using different temperature scenarios, and finally combined to estimate a social cost of carbon. The general public can better understand the many estimates of climate impacts and their range of uncertainty by exploring these results dynamically, with maps, bar charts, and dose-response-style plots. [Figure: front page of the climate impacts tool website; sample "collections" of models, within which all results are estimates of the same fundamental relationship, are shown on the right. Figure: simple pooled result for Gelman's "8 schools" example; pooled results are calculated analytically, while partial pooling (Bayesian hierarchical estimation) uses posterior simulations.]
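
    The analytically pooled result mentioned for Gelman's "8 schools" example is a precision-weighted mean; a minimal sketch using the classic dataset:

```python
import numpy as np

# Gelman's classic "8 schools" data: estimated treatment effects and
# their standard errors, one per school
y = np.array([28., 8., -3., 7., -1., 1., 18., 12.])
sigma = np.array([15., 10., 16., 11., 9., 11., 10., 18.])

# Complete pooling: precision-weighted mean, computed analytically
w = 1.0 / sigma**2
pooled_mean = np.sum(w * y) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
```

    Partial pooling (the Bayesian hierarchical estimate the tool also supports) has no closed form and requires posterior simulation, which is why the abstract distinguishes the two cases.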

  15. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...

  16. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide his/her students through an understandable 3D modeling approach which will not only enhance their knowledge of the tool's usage but also enable them to achieve their desired result in comparatively less time. In practice, very little information is available on applying CAD skills in formal beginners' training sessions. Additionally, the advent of new 3D software makes keeping training material up to date an increasingly difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the varieties of CAD tools currently available in the market. Utilizing performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters have also been provided (Asperl, 2005).

  17. Modeling as a tool for process control: alcoholic fermentation

    Energy Technology Data Exchange (ETDEWEB)

    Tayeb, A.M.; Ashour, I.A.; Mostafa, N.A. (El-Minia Univ. (EG). Faculty of Engineering)

    1991-01-01

    The results of the alcoholic fermentation of beet sugar molasses and wheat milling residues (Akalona) were fed into a computer program. Consequently, the kinetic parameters for these fermentation reactions were determined. These parameters were put into a kinetic model. Next, the model was tested, and the results obtained were compared with the experimental results for both beet molasses and Akalona. The deviation of the experimental results from the model results was determined. An acceptable deviation of 1.2% for beet sugar molasses and 3.69% for Akalona was obtained. Thus, the present model could be a tool for chemical engineers working in fermentation processes, with respect to both process control and fermentor design. (Author).
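
    The quoted deviations are consistent with a mean percent deviation between model predictions and experimental data; a small sketch of that metric (the concentration values below are invented, not taken from the cited study):

```python
import numpy as np

def mean_percent_deviation(model, experiment):
    """Mean absolute deviation of model predictions from experimental
    values, expressed as a percentage of the experimental values."""
    model, experiment = np.asarray(model), np.asarray(experiment)
    return 100.0 * np.mean(np.abs(model - experiment) / np.abs(experiment))

# Illustrative ethanol concentrations (g/L) over a fermentation run
experiment = [5.2, 14.8, 28.1, 41.0, 48.5]
model      = [5.0, 15.1, 27.6, 41.8, 48.0]
dev = mean_percent_deviation(model, experiment)
```
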

  18. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  19. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  20. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
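
    The fault trees produced by backtracking ultimately encode minimal cut sets: the smallest combinations of basic events that trigger the top event. For a small model these can be brute-forced; the structure function below is invented for illustration, not taken from LFM:

```python
from itertools import combinations

# Toy embedded-system fault model (hypothetical basic events)
BASIC_EVENTS = ["sensor_fail", "sw_timeout", "actuator_fail", "power_loss"]

def top_event(active):
    """Structure function: the top event occurs on power_loss, or when a
    software timeout coincides with a sensor or actuator failure."""
    return ("power_loss" in active
            or {"sensor_fail", "sw_timeout"} <= active
            or {"sw_timeout", "actuator_fail"} <= active)

def minimal_cut_sets():
    """Brute-force the minimal event combinations triggering the top event."""
    cuts = []
    for r in range(1, len(BASIC_EVENTS) + 1):
        for combo in combinations(BASIC_EVENTS, r):
            s = set(combo)
            # keep only sets not containing an already-found smaller cut
            if top_event(s) and not any(c <= s for c in cuts):
                cuts.append(s)
    return cuts
```

    Real tools derive these sets symbolically rather than by enumeration, but the output, combinations of lower-level events leading to the top event, is the same notion the abstract describes.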

  1. Laser melting of carbide tool surface: Model and experimental studies

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S., E-mail: bsyilbas@kfupm.edu.sa [ME Department, King Fahd University of Petroleum and Minerals, KFUPM Box 1913, Dhahran 31261 (Saudi Arabia); Shuja, S.Z.; Khan, S.M.A.; Aleem, A. [ME Department, King Fahd University of Petroleum and Minerals, KFUPM Box 1913, Dhahran 31261 (Saudi Arabia)

    2009-09-15

    Laser controlled melting is one of the methods to achieve structural integrity in the surface region of carbide tools. In the present study, laser heating of a carbide cutting tool and the temperature distribution in the irradiated region are examined. The phase change process during heating is modeled using the enthalpy-porosity method. The influence of the laser pulse intensity distribution across the irradiated surface ({beta}) on temperature distribution and melt formation is investigated. An experiment is carried out and the microstructural changes due to consecutive laser pulse heating are examined using a scanning electron microscope (SEM). It is found that the predicted melt depth agrees with the experimental results. The maximum depth of the melt layer moves away from the symmetry axis with increasing {beta}.

  2. Laser melting of carbide tool surface: Model and experimental studies

    Science.gov (United States)

    Yilbas, B. S.; Shuja, S. Z.; Khan, S. M. A.; Aleem, A.

    2009-09-01

    Laser controlled melting is one of the methods to achieve structural integrity in the surface region of carbide tools. In the present study, laser heating of a carbide cutting tool and the temperature distribution in the irradiated region are examined. The phase change process during heating is modeled using the enthalpy-porosity method. The influence of the laser pulse intensity distribution across the irradiated surface ( β) on temperature distribution and melt formation is investigated. An experiment is carried out and the microstructural changes due to consecutive laser pulse heating are examined using a scanning electron microscope (SEM). It is found that the predicted melt depth agrees with the experimental results. The maximum depth of the melt layer moves away from the symmetry axis with increasing β.

  3. MGP : a tool for wide range temperature modelling

    Energy Technology Data Exchange (ETDEWEB)

    Morales, A.F. [Inst. Tecnologico Autonomo de Mexico, Mexico City (Mexico); Seisdedos, L.V. [Univ. de Oriente, Santiago de Cuba (Cuba). Dept. de Control Automatico

    2006-07-01

    This paper proposed a practical temperature modelling tool that used genetic multivariate polynomials to determine polynomial expressions of enthalpy and empirical heat transfer equations in superheaters. The model was designed to transform static parameter estimations from distributed into lumped parameter systems. Two dynamic regimes were explored: (1) a power dynamics regime containing the major inputs and outputs needed for overall plant control; and (2) a steam temperature dynamics scheme that considered consecutive superheater sections in terms of cooling water mass flow and steam mass flow. The single lumped parameter model was developed to provide temperature control for a fossil fuel-fired power plant. The design procedure used enthalpy to determine the plant's energy balance. The enthalpy curve was treated as a function of both temperature and steam pressure. A graphic simulation tool was used to optimize the model by comparing real and simulated plant data. The study showed that the amount of energy taken by the steam mass flow per unit time can be calculated by measuring temperatures and pressures at both ends of the superheater. An algorithm was then developed to determine the polynomial's coefficients according to the best curve fit over the training set and the best maximum errors. It was concluded that a unified approach is now being developed to simulate and emulate the dynamics of steam temperature for each section's attemperator-superheater. 14 refs., 3 tabs., 5 figs.
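
    The polynomial enthalpy expressions described above support a simple energy balance across a superheater section; a sketch with a low-order fit at fixed pressure (the enthalpy values are rough illustrations, not steam-table data):

```python
import numpy as np

# Approximate superheated-steam enthalpy (kJ/kg) vs temperature (deg C)
# at a fixed pressure; illustrative values only
T = np.array([350., 400., 450., 500., 550.])
h = np.array([3115., 3230., 3344., 3456., 3569.])

coeffs = np.polyfit(T, h, deg=2)   # low-order polynomial expression
h_of = np.poly1d(coeffs)

# Energy taken up by the steam per unit time across the superheater:
# q = m_dot * (h_out - h_in), from temperatures at both ends
m_dot = 120.0                               # steam mass flow, kg/s
q = m_dot * (h_of(540.0) - h_of(360.0))     # kW
```

    A full model would fit h(T, p) in both temperature and pressure, as the abstract describes; the one-variable fit above just shows the curve-fitting step and the resulting energy balance.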

  4. Advanced Reach Tool (ART): development of the mechanistic model.

    Science.gov (United States)

    Fransman, Wouter; Van Tongeren, Martie; Cherrie, John W; Tischer, Martin; Schneider, Thomas; Schinkel, Jody; Kromhout, Hans; Warren, Nick; Goede, Henk; Tielemans, Erik

    2011-11-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. The ART mechanistic model is based on a conceptual framework that adopts a source receptor approach, which describes the transport of a contaminant from the source to the receptor and defines seven independent principal modifying factors: substance emission potential, activity emission potential, localized controls, segregation, personal enclosure, surface contamination, and dispersion. ART currently differentiates between three different exposure types: vapours, mists, and dust (fumes, fibres, and gases are presently excluded). Various sources were used to assign numerical values to the multipliers to each modifying factor. The evidence used to underpin this assessment procedure was based on chemical and physical laws. In addition, empirical data obtained from literature were used. Where this was not possible, expert elicitation was applied for the assessment procedure. Multipliers for all modifying factors were peer reviewed by leading experts from industry, research institutes, and public authorities across the globe. In addition, several workshops with experts were organized to discuss the proposed exposure multipliers. The mechanistic model is a central part of the ART tool and with advancing knowledge on exposure, determinants will require updates and refinements on a continuous basis, such as the effect of worker behaviour on personal exposure, 'best practice' values that describe the maximum achievable effectiveness of control measures, the intrinsic emission potential of various solid objects (e.g. metal, glass, plastics, etc.), and extending the applicability domain to certain types of exposures (e.g. gas, fume, and fibre exposure).
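
    The seven principal modifying factors of the ART framework combine multiplicatively along the source-receptor path; a toy score function makes that structure concrete (the multiplier values are placeholders, not the tool's calibrated, peer-reviewed values):

```python
def art_exposure_score(substance_emission, activity_emission,
                       localized_control, segregation,
                       personal_enclosure, surface_contamination,
                       dispersion):
    """Illustrative source-receptor exposure score in the spirit of the
    ART mechanistic model: seven independent modifying factors combined
    multiplicatively into a relative exposure score."""
    return (substance_emission * activity_emission * localized_control
            * segregation * personal_enclosure * surface_contamination
            * dispersion)

baseline = art_exposure_score(1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0)
# Adding a localized control assumed to cut transmission to 10 %
with_control = art_exposure_score(1.0, 1.0, 0.1, 1.0, 1.0, 1.0, 1.0)
```

    The multiplicative form is what lets a single factor, such as an effective localized control, scale the whole exposure estimate; the real tool assigns each multiplier from physical laws, empirical data, or expert elicitation as described above.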

  5. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL......, the first transformation language implementing the principles of Transparent Model Transformation: syntax, environment, and execution transparency. VMTL works by weaving a transformation aspect into its host modeling language. We show how our implementation of VMTL turns any model editor into a flexible...

  6. The ADAPT Tool: From AADL Architectural Models to Stochastic Petri Nets through Model Transformation

    CERN Document Server

    Rugina, Ana E; Kaaniche, Mohamed

    2008-01-01

    ADAPT is a tool that aims at easing the task of evaluating dependability measures in the context of modern model driven engineering processes based on AADL (Architecture Analysis and Design Language). Hence, its input is an AADL architectural model annotated with dependability-related information. Its output is a dependability evaluation model in the form of a Generalized Stochastic Petri Net (GSPN). The latter can be processed by existing dependability evaluation tools to compute quantitative measures such as reliability, availability, etc. ADAPT interfaces OSATE (the Open Source AADL Tool Environment) on the AADL side and SURF-2 on the dependability evaluation side. In addition, ADAPT provides the GSPN in XML/XMI format, which represents a gateway to other dependability evaluation tools, as the processing techniques for XML files allow it to be easily converted to a tool-specific GSPN.

  7. Modeling in the Classroom: An Evolving Learning Tool

    Science.gov (United States)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs. John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called the "isee Player", is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth Science System follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) of the system that displays all of the system's central questions, components, relationships and required inputs. At this stage...

  8. A Tool for Model-Based Language Specification

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2011-01-01

    Formal languages let us define the textual representation of data with precision. Formal grammars, typically in the form of BNF-like productions, describe the language syntax, which is then annotated for syntax-directed translation and completed with semantic actions. When, apart from the textual representation of data, an explicit representation of the corresponding data structure is required, the language designer has to devise the mapping between the suitable data model and its proper language specification, and then develop the conversion procedure from the parse tree to the data model instance. Unfortunately, whenever the format of the textual representation has to be modified, changes have to be propagated throughout the entire language processor tool chain. These updates are time-consuming, tedious, and error-prone. Moreover, if different applications use the same language, several copies of the same language specification have to be maintained. In this paper, we introduce a model-based parser generat...

  9. T:XML: A Tool Supporting User Interface Model Transformation

    Science.gov (United States)

    López-Jaquero, Víctor; Montero, Francisco; González, Pascual

    Model driven development of user interfaces is based on the transformation of an abstract specification into the final user interface the user will interact with. The design of transformation rules to carry out this transformation process is a key issue in any model-driven user interface development approach. In this paper, we introduce T:XML, an integrated development environment for managing, creating and previewing transformation rules. The tool supports the specification of transformation rules by using a graphical notation that works on the basis of the transformation of the input model into a graph-based representation. T:XML allows the design and execution of transformation rules in an integrated development environment. Furthermore, the designer can also preview what the generated user interface looks like after the transformations have been applied. These previewing capabilities can be used to quickly create prototypes to discuss with the users in user-centered design methods.

  10. Ontology-based tools to expedite predictive model construction.

    Science.gov (United States)

    Haug, Peter; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Ferraro, Jeffrey

    2014-01-01

    Large amounts of medical data are collected electronically during the course of caring for patients using modern medical information systems. This data presents an opportunity to develop clinically useful tools through data mining and observational research studies. However, the work necessary to make sense of this data and to integrate it into a research initiative can require substantial effort from medical experts as well as from experts in medical terminology, data extraction, and data analysis. This slows the process of medical research. To reduce the effort required for the construction of computable, diagnostic predictive models, we have developed a system that hybridizes a medical ontology with a large clinical data warehouse. Here we describe components of this system designed to automate the development of preliminary diagnostic models and to provide visual clues that can assist the researcher in planning for further analysis of the data behind these models.

  11. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  12. NEMO medium voltage converter factory acceptance, operational and final integration tests

    Science.gov (United States)

    Cocimano, Rosanna; NEMO Collaboration

    2011-01-01

    The NEMO Collaboration, as part of the KM3NeT EU-funded consortium, is developing technical solutions for the construction of a cubic-kilometer scale neutrino telescope in the Mediterranean Sea, several kilometers below sea level and far from the shore. In this framework, after years of design, development, assembly and testing, the Alcatel deep sea medium voltage power converter (MVC) is ready for deployment 100 km from the Capo Passero shore station. The MVC converts the 10 kV supply to an instrument-friendly 375 V at 10 kW. The MVC will be presented with focus on the factory acceptance, operational and final integration tests that have recently been carried out.

  13. NEMO medium voltage converter factory acceptance, operational and final integration tests

    Energy Technology Data Exchange (ETDEWEB)

    Cocimano, Rosanna, E-mail: cocimano@lns.infn.i [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali del Sud, Via S. Sofia 62, 95123 Catania (Italy)

    2011-01-21

    The NEMO Collaboration, as part of the KM3NeT EU-funded consortium, is developing technical solutions for the construction of a cubic-kilometer scale neutrino telescope in the Mediterranean Sea, several kilometers below sea level and far from the shore. In this framework, after years of design, development, assembly and testing, the Alcatel deep sea medium voltage power converter (MVC) is ready for deployment 100 km from the Capo Passero shore station. The MVC converts the 10 kV supply to an instrument-friendly 375 V at 10 kW. The MVC will be presented with focus on the factory acceptance, operational and final integration tests that have recently been carried out.

  14. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and rate of convergence. The efficacy of the proposed FRM is tested through a case study - namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRc. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior as compared with conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
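
    The FRM couples SOM clustering with multiple regression models fitted per cluster. The idea can be sketched with a fixed split standing in for the SOM and ordinary least squares per regime; the data and split point are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic tool-wear data: feature (cutting time) -> response (wear),
# nonlinear overall but roughly linear within each regime
x = np.sort(rng.uniform(0, 10, 200))
y = np.where(x < 5, 2 * x, 10 + 6 * (x - 5)) + rng.normal(0, 0.3, 200)

def fit_piecewise(x, y, boundary=5.0):
    """Fit one linear model per regime; the fixed boundary stands in for
    the SOM clustering step of the real FRM."""
    models = {}
    for name, mask in [("low", x < boundary), ("high", x >= boundary)]:
        A = np.vstack([x[mask], np.ones(mask.sum())]).T
        models[name] = np.linalg.lstsq(A, y[mask], rcond=None)[0]
    return models

def predict(models, xq, boundary=5.0):
    slope, intercept = models["low"] if xq < boundary else models["high"]
    return slope * xq + intercept

models = fit_piecewise(x, y)
```

    Converting the nonlinear wear curve into locally linear pieces is what the abstract means by "a simplified linear format"; the fuzzy inference engine additionally blends the local models near regime boundaries.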

  15. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  16. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  17. Numerical modelling of structure and mechanical properties for medical tools

    Directory of Open Access Journals (Sweden)

    L. Jeziorski

    2007-09-01

    , mechanical properties and work condition of medical tools. For the bowl cutter, the geometrical shape and the distribution of cutting holes were improved. Originality/value: This paper presents a concept for obtaining new medical tools by optimizing the basic construction parameters through numerical calculations. The prepared model could be helpful for engineering decisions in the design and production of forceps and bowl cutters.

  18. Nemo-like kinase (NLK) expression in osteoblastic cells and suppression of osteoblastic differentiation

    Energy Technology Data Exchange (ETDEWEB)

    Nifuji, Akira, E-mail: nifuji-a@tsurumi-u.ac.jp [Transcriptome profiling group, National Institute of Radiological Sciences, Chiba (Japan); Department of Pharmacology, Tsurumi University School of Dental Medicine, Yokohama (Japan); Ideno, Hisashi [Transcriptome profiling group, National Institute of Radiological Sciences, Chiba (Japan); Ohyama, Yoshio [Department of Molecular Pharmacology, Medical Research Institute, Tokyo Medical and Dental University, Tokyo (Japan); Takanabe, Rieko; Araki, Ryoko; Abe, Masumi [Transcriptome profiling group, National Institute of Radiological Sciences, Chiba (Japan); Noda, Masaki [Department of Molecular Pharmacology, Medical Research Institute, Tokyo Medical and Dental University, Tokyo (Japan); Shibuya, Hiroshi [Department of Molecular Cell Biology, Medical Research Institute and School of Biomedical Science, Tokyo Medical and Dental University, Tokyo (Japan)

    2010-04-15

    Mitogen-activated protein kinases (MAPKs) regulate proliferation and differentiation in osteoblasts. The vertebrate homologue of nemo, nemo-like kinase (NLK), is an atypical MAPK that targets several signaling components, including the T-cell factor/lymphoid enhancer factor (TCF/Lef1) transcription factor. Recent studies have shown that NLK forms a complex with the histone H3-K9 methyltransferase SETDB1 and suppresses peroxisome proliferator-activated receptor (PPAR)-gamma action in the mesenchymal cell line ST2. Here we investigated whether NLK regulates osteoblastic differentiation. We showed that NLK mRNA is expressed in vivo in osteoblasts of embryonic day 18.5 (E18.5) mouse calvariae. By using retrovirus vectors, we performed forced expression of NLK in primary calvarial osteoblasts (pOB cells) and the mesenchymal cell line ST2. Wild-type NLK (NLK-WT) suppressed alkaline phosphatase (ALP) activity and the expression of bone marker genes such as alkaline phosphatase, type I procollagen, runx2, osterix, osteopontin and osteocalcin in these cells. NLK-WT also decreased type I collagen protein expression in pOB and ST2 cells. Furthermore, mineralized nodule formation was reduced in pOB cells overexpressing NLK-WT. In contrast, a kinase-negative form of NLK (NLK-KN) did not suppress, or only partially suppressed, ALP activity and bone marker gene expression in pOB and ST2 cells, and did not suppress nodule formation in pOB cells. In addition to forced expression, suppression of endogenous NLK expression by siRNA increased bone marker gene expression in pOB and ST2 cells. Finally, transcriptional activity analysis of gene promoters revealed that NLK-WT suppressed Wnt1 activation of the TOPflash promoter and Runx2 activation of the osteocalcin promoter. Taken together, these results suggest that NLK negatively regulates osteoblastic differentiation.

  19. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  20. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values that maximize the liquefaction yield of the plant under the constraints imposed by the other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field, as well as of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
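    As a small worked example of the kind of cycle calculation such a simulator performs, the liquid yield of an idealized Linde-Hampson cycle follows from a single energy balance over the cold box (heat exchanger, throttle valve and separator). The enthalpy values below are illustrative round numbers, not data from the paper.

```python
def linde_liquid_yield(h_return, h_supply, h_liquid):
    """Liquid fraction y from the steady-state energy balance
    h_supply = y * h_liquid + (1 - y) * h_return, i.e.
    y = (h_return - h_supply) / (h_return - h_liquid)."""
    return (h_return - h_supply) / (h_return - h_liquid)

# Illustrative air enthalpies in kJ/kg (hypothetical round numbers):
h1 = 290.0   # low-pressure return gas at ambient temperature
h2 = 270.0   # high-pressure supply gas at ambient temperature
hf = 0.0     # saturated liquid, taken as the reference state

y = linde_liquid_yield(h1, h2, hf)
print(f"liquid yield: {y:.4f}")   # ~0.069: about 7% of the circulated gas liquefies
```

    A simulator like Aspen HYSYS evaluates the same balance with real-fluid property models, which is why the yield depends so strongly on the chosen compression pressure.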

  1. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significance. Port performance can simply be indicated by port service levels to ships (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop policies for improving the port's performance to be more effective and efficient. However, the evaluation is frequently conducted with a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed in MATLAB and Simulink based on queuing theory.
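    The paper's MATLAB/Simulink model is not reproduced in the abstract; a minimal single-berth queueing microsimulation of the same flavor can be sketched as follows (arrival and service rates are hypothetical), with the analytic M/M/1 waiting time as a sanity check.

```python
import random

random.seed(42)

def simulate_berth(arrival_rate, service_rate, n_ships):
    """Single-berth queue with exponential interarrival and service times.
    Returns the mean waiting time between arrival and berthing."""
    t_arrive = 0.0
    berth_free = 0.0
    total_wait = 0.0
    for _ in range(n_ships):
        t_arrive += random.expovariate(arrival_rate)
        start = max(t_arrive, berth_free)       # wait if the berth is busy
        total_wait += start - t_arrive
        berth_free = start + random.expovariate(service_rate)
    return total_wait / n_ships

lam, mu = 0.8, 1.0                       # ships per day arriving vs. served
sim_wq = simulate_berth(lam, mu, 200_000)
analytic_wq = lam / (mu * (mu - lam))    # M/M/1 mean queueing delay: 4 days
print(sim_wq, analytic_wq)
```

    The stochastic runs scatter around the analytic value, which is exactly the variability a deterministic evaluation would miss.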

  2. Computational Tools for Modeling and Measuring Chromosome Structure

    Science.gov (United States)

    Ross, Brian Christopher

    DNA conformation within cells has many important biological implications, but there are challenges both in modeling DNA, due to the need for specialized techniques, and experimentally, since tracing out in vivo conformations is currently impossible. This thesis contributes two computational projects to these efforts. The first project is a set of online and offline calculators of conformational statistics using a variety of published and unpublished methods, addressing the current lack of DNA model-building tools intended for general use. The second project is a reconstructive analysis that could enable in vivo mapping of DNA conformation at high resolution with current experimental technology. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)

  3. Mathematical modelling: a tool for hospital infection control.

    Science.gov (United States)

    Grundmann, H; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the way to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections in hospitals and communities. We aim to explain to a broader audience of professionals in health care, infection control, and health systems administration some of these models that can improve the understanding of the hidden dynamics of health-care-associated infections. We also appraise their usefulness and limitations as an innovative research and decision tool for control purposes.
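    The abstract does not give a specific model; as a minimal sketch of the kind of transmission model used in this literature, here is a single-ward colonization model with assumed (hypothetical) rates, showing how such models quantify the effect of a control measure such as hand hygiene.

```python
def ward_colonized(beta, gamma, n_patients=20, days=365, c0=1.0):
    """Daily update of a toy single-ward model (not the authors' model):
    colonized patients transmit to susceptibles at rate beta and are
    discharged or decolonized at rate gamma."""
    c = c0
    for _ in range(days):
        c += beta * c * (n_patients - c) / n_patients - gamma * c
    return c

baseline = ward_colonized(beta=0.30, gamma=0.10)      # R0 = beta/gamma = 3
with_hygiene = ward_colonized(beta=0.12, gamma=0.10)  # hygiene cuts transmission
print(baseline, with_hygiene)   # endemic level falls from ~13.3 to ~3.3 patients
```

    The endemic equilibrium is N(1 - gamma/beta), so the model makes the payoff of reducing beta directly computable, which is the kind of testable, quantitative statement the review highlights.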

  4. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  5. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    Science.gov (United States)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which when applied to SPF provides a host of benefits including accurate prediction of strain levels in a part, presence of wrinkles and predicting pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform have applied and continue to do so to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  6. Modeling as a research tool in poultry science.

    Science.gov (United States)

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  7. Introducing BioSARN - an ecological niche model refinement tool.

    Science.gov (United States)

    Heap, Marshall J

    2016-08-01

    Environmental niche modeling outputs a biological species' potential distribution. Further work is needed to arrive at a species' realized distribution. The Biological Species Approximate Realized Niche (BioSARN) application provides the ecological modeler with a toolset to refine environmental niche models (ENMs). These tools include soil and land class filtering and niche area quantification, as well as novelties such as enhanced temporal corridor definition and output to a high spatial resolution land class model. BioSARN is exemplified with a study on Fraser fir, a tree species with strong land class and edaphic correlations. Soil and land class filtering caused the potential distribution area to decline 17%. Enhanced temporal corridor definition permitted distinction of current, continuing, and future niches, and thus niche change and movement. Tile quantification analysis provided further corroboration of these trends. BioSARN does not substitute other established ENM methods. Rather, it allows the experimenter to work with their preferred ENM, refining it using their knowledge and experience. Output from lower spatial resolution ENMs to a high spatial resolution land class model is a pseudo high-resolution result. Still, it may be the best that can be achieved until wide-range, high spatial resolution environmental data and accurate, high-precision species occurrence data become generally available.
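    A toy illustration of the filtering step BioSARN performs, reducing a potential distribution to an approximate realized one with land-class and edaphic masks; the grid cells, land classes, and pH tolerance below are invented for the example, not taken from the study.

```python
# Hypothetical 1-km grid cells: (climate_suitable, land_class, soil_ph)
cells = [
    (True,  "spruce-fir", 4.5),
    (True,  "spruce-fir", 5.0),
    (True,  "developed",  4.8),
    (True,  "hardwood",   6.5),
    (False, "spruce-fir", 4.6),
]

GOOD_LAND = {"spruce-fir"}    # land classes the species actually occupies (assumed)
PH_RANGE = (3.5, 5.5)         # edaphic (soil) tolerance (assumed)

potential = [c for c in cells if c[0]]              # raw ENM output
realized = [c for c in potential
            if c[1] in GOOD_LAND and PH_RANGE[0] <= c[2] <= PH_RANGE[1]]

decline = 100 * (1 - len(realized) / len(potential))
print(f"area retained: {len(realized)} of {len(potential)} cells "
      f"({decline:.0f}% decline)")
```

    The study's reported 17% decline for Fraser fir is the same quantity computed over real grids and real masks.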

  8. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared to the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes, and both may be needed to provide the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation. They present a set of simple calculations that allow the estimation of the origin of loads. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the sources of nutrients from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in Guideline 6 and because it

  9. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    Science.gov (United States)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well
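    The Cassandra-backed implementation is not shown in the abstract; the asynchronous message-queue write pattern it describes can be sketched with the Python standard library, using an in-memory dict as a stand-in for the cluster (all names here are illustrative, not the PatchDB API).

```python
import queue
import threading

# In-memory stand-in for the distributed datastore (the real PatchDB
# targets an Apache Cassandra cluster; this sketch only shows the
# asynchronous write pattern that decouples the model from storage).
store = {}
outbox: "queue.Queue" = queue.Queue()

def writer():
    while True:
        item = outbox.get()
        if item is None:                 # sentinel: shut the writer down
            break
        (variable, timestep, patch_id), value = item
        store[(variable, timestep, patch_id)] = value

t = threading.Thread(target=writer)
t.start()

# The model loop stays decoupled from storage latency: it just enqueues.
for timestep in range(3):
    for patch_id in range(4):
        outbox.put((("streamflow", timestep, patch_id), patch_id / 10))

outbox.put(None)
t.join()

# Spatial query for one variable at one timestep (a "map" slice):
snapshot = {k[2]: v for k, v in store.items()
            if k[0] == "streamflow" and k[1] == 1}
print(snapshot)   # {0: 0.0, 1: 0.1, 2: 0.2, 3: 0.3}
```

    Keying records by (variable, timestep, patch) is what makes per-variable, per-timestep slices cheap to retrieve for visualization.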

  10. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; Destefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation statuses, that predict habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
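    The paper's species-specific equations are not given in the abstract; a generic weighted-sum suitability index of the kind described might look as follows. The weights, variables, and land-use values are hypothetical, chosen only to show how per-land-use scores would be compared.

```python
def habitat_suitability(tree_cover, large_trees, shrub_cover, weights):
    """Weighted-sum suitability index in [0, 1]; all inputs are fractions.
    Weights are per-species and entirely hypothetical here."""
    w1, w2, w3 = weights
    return w1 * tree_cover + w2 * large_trees + w3 * shrub_cover

# A cavity-nesting species assumed to favour large trees:
weights = (0.3, 0.5, 0.2)
residential = habitat_suitability(0.40, 0.20, 0.50, weights)
parkland = habitat_suitability(0.80, 0.60, 0.30, weights)
print(residential, parkland)   # parkland scores higher for this species
```

    Applying such an equation to each city's i-Tree plot data and averaging by land-use type yields exactly the kind of per-land-use comparison the study reports.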

  11. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
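    IMPACT's simulation module is not described in detail in the abstract; a minimal sketch of an area-based evacuation with naive collision avoidance (an agent waits a tick rather than step into an occupied cell) illustrates the idea on a toy grid. Everything here, including the movement rule, is an assumption for illustration.

```python
def evacuate(agents, exit_cell):
    """Agents step one cell toward the exit each tick; a step into an
    occupied cell is skipped that tick (naive collision avoidance).
    Returns the number of ticks until everyone has left."""
    occupied = set(agents)
    ticks = 0
    while agents:
        ticks += 1
        remaining = []
        for (x, y) in agents:
            ex, ey = exit_cell
            step = (x + (ex > x) - (ex < x), y + (ey > y) - (ey < y))
            if step == exit_cell:
                occupied.discard((x, y))      # agent leaves the floor
            elif step in occupied:
                remaining.append((x, y))      # blocked: wait one tick
            else:
                occupied.discard((x, y))
                occupied.add(step)
                remaining.append(step)
        agents = remaining
    return ticks

# Three agents queued in a corridor behind one exit:
time_to_evacuate = evacuate([(3, 0), (2, 0), (1, 0)], exit_cell=(0, 0))
print(time_to_evacuate)
```

    Time-to-evacuate, the statistic the tool reports to first responders, falls out of the loop counter; congestion shows up as agents spending ticks blocked behind one another.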

  12. Modeling of tool path for the CNC sheet cutting machines

    Science.gov (United States)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as a discrete optimization problem (a generalized traveling salesman problem with additional constraints, GTSP). The formalization of some constraints for these tasks is described. To solve the GTSP we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and on dynamic programming.
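    Chentsov's megalopolis model is beyond the scope of the abstract, but the traveling-salesman core that these tool path problems reduce to can be solved exactly for small instances with the standard Held-Karp dynamic program. The cost matrix below is hypothetical.

```python
from itertools import combinations

def held_karp(dist):
    """Exact minimum tour cost via the Held-Karp DP, O(n^2 * 2^n).
    dist[i][j] = travel cost of the cutting head from pierce point i to j
    (may be asymmetric)."""
    n = len(dist)
    # dp[(S, j)] = cheapest path from point 0 through set S, ending at j
    dp = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            s = frozenset(subset)
            for j in subset:
                dp[(s, j)] = min(dp[(s - {j}, k)] + dist[k][j]
                                 for k in subset if k != j)
    full = frozenset(range(1, n))
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

# Four pierce points on a sheet (hypothetical asymmetric costs):
d = [[0, 2, 9, 10],
     [1, 0, 6, 4],
     [15, 7, 0, 8],
     [6, 3, 12, 0]]
print(held_karp(d))   # 21, the tour 0 -> 2 -> 3 -> 1 -> 0
```

    The GTSP variants in the paper add membership constraints (each "megalopolis" of candidate pierce points must be visited exactly once) and precedence constraints on top of this basic recursion.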

  13. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  14. Standalone visualization tool for three-dimensional DRAGON geometrical models

    Energy Technology Data Exchange (ETDEWEB)

    Lukomski, A.; McIntee, B.; Moule, D.; Nichita, E. [Faculty of Energy Systems and Nuclear Science, Univ. of Ontario Inst. of Tech., Oshawa, Ontario (Canada)

    2008-07-01

    DRAGON is a neutron transport and depletion code able to solve one-, two- and three-dimensional problems. To date DRAGON provides two visualization modules, able to represent respectively two- and three-dimensional geometries. The two-dimensional visualization module generates a postscript file, while the three dimensional visualization module generates a MATLAB M-file with instructions for drawing the tracks in the DRAGON TRACKING data structure, which implicitly provide a representation of the geometry. The current work introduces a new, standalone, tool based on the open-source Visualization Toolkit (VTK) software package which allows the visualization of three-dimensional geometrical models by reading the DRAGON GEOMETRY data structure and generating an axonometric image which can be manipulated interactively by the user. (author)

  15. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change to the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  16. Long term monitoring of the optical background in the Capo Passero deep-sea site with the NEMO tower prototype

    CERN Document Server

    Adrián-Martínez, S; Ameli, F; Anghinolfi, M; Ardid, M; Barbarino, G; Barbarito, E; Barbato, F C T; Beverini, N; Biagi, S; Biagioni, A; Bouhadef, B; Bozza, C; Cacopardo, G; Calamai, M; Calí, C; Calvo, D; Capone, A; Caruso, F; Ceres, A; Chiarusi, T; Circella, M; Cocimano, R; Coniglione, R; Costa, M; Cuttone, G; D'Amato, C; D'Amico, A; De Bonis, G; De Luca, V; Deniskina, N; De Rosa, G; di Capua, F; Distefano, C; Enzenhöfer, A; Fermani, P; Ferrara, G; Flaminio, V; Fusco, L A; Garufi, F; Giordano, V; Gmerk, A; Grasso, R; Grella, G; Hugon, C; Imbesi, M; Kulikovskiy, V; Lahmann, R; Larosa, G; Lattuada, D; Leismüller, K P; Leonora, E; Litrico, P; Alvarez, C D Llorens; Lonardo, A; Longhitano, F; Presti, D Lo; Maccioni, E; Margiotta, A; Marinelli, A; Martini, A; Masullo, R; Migliozzi, P; Migneco, E; Miraglia, A; Mollo, C M; Mongelli, M; Morganti, M; Musico, P; Musumeci, M; Nicolau, C A; Orlando, A; Orzelli, A; Papaleo, R; Pellegrino, C; Pellegriti, M G; Perrina, C; Piattelli, P; Pugliatti, C; Pulvirenti, S; Raffaelli, F; Randazzo, N; Real, D; Riccobene, G; Rovelli, A; Saldaña, M; Sanguineti, M; Sapienza, P; Sciacca, V; Sgura, I; Simeone, F; Sipala, V; Speziale, F; Spitaleri, A; Spurio, M; Stellacci, S M; Taiuti, M; Terreni, G; Trasatti, L; Trovato, A; Ventura, C; Vicini, P; Viola, S; Vivolo, D

    2015-01-01

    The NEMO Phase-2 tower is the first detector which was operated underwater for more than one year at the "record" depth of 3500 m. It was designed and built within the framework of the NEMO (NEutrino Mediterranean Observatory) project. The 380 m high tower was successfully installed in March 2013, 80 km offshore Capo Passero (Italy). This is the first prototype operated on the site where the Italian node of the KM3NeT neutrino telescope will be built. The installation and operation of the NEMO Phase-2 tower have proven the functionality of the infrastructure and its operability at 3500 m depth. A more than one year long monitoring of the deep water characteristics of the site has also been provided. In this paper the infrastructure and the tower structure and instrumentation are described. The results of long term optical background measurements are presented. The rates show stable and low baseline values, compatible with the contribution of ^{40}K light emission, with a small percentage of light bursts due to bio...

  17. Autoubiquitination of TRIM26 links TBK1 to NEMO in RLR-mediated innate antiviral immune response.

    Science.gov (United States)

    Ran, Yong; Zhang, Jing; Liu, Li-Li; Pan, Zhao-Yi; Nie, Ying; Zhang, Hong-Yan; Wang, Yan-Yi

    2016-02-01

    The transcription factors IRF3 and NF-κB are required for the expression of many genes involved in the antiviral innate immune response, including type I interferons (IFNs) and proinflammatory cytokines. It is well established that TBK1 is an essential kinase engaged downstream of multiple pattern-recognition receptors (PRRs) to mediate IRF3 phosphorylation and activation, whereas the precise mechanisms of TBK1 activation have not yet been fully elucidated. Here, we identified tripartite motif 26 (TRIM26) as an important regulator of the RNA virus-triggered innate immune response. Knockdown of TRIM26 impaired virus-triggered IRF3 and NF-κB activation, IFN-β induction, and the cellular antiviral response. TRIM26 was physically associated with TBK1 independent of viral infection. As an E3 ligase, TRIM26 underwent autoubiquitination upon viral infection. Ubiquitinated TRIM26 subsequently associated with NEMO, thus bridging the TBK1-NEMO interaction, which is critical for the recruitment of TBK1 to the VISA signalsome and the activation of TBK1. Our findings suggest that TRIM26 is an important regulator of innate immune responses against RNA viruses, which functions by bridging TBK1 to NEMO and mediating the activation of TBK1.

  18. Long term monitoring of the optical background in the Capo Passero deep-sea site with the NEMO tower prototype

    Science.gov (United States)

    Adrián-Martínez, S.; Aiello, S.; Ameli, F.; Anghinolfi, M.; Ardid, M.; Barbarino, G.; Barbarito, E.; Barbato, F. C. T.; Beverini, N.; Biagi, S.; Biagioni, A.; Bouhadef, B.; Bozza, C.; Cacopardo, G.; Calamai, M.; Calì, C.; Calvo, D.; Capone, A.; Caruso, F.; Ceres, A.; Chiarusi, T.; Circella, M.; Cocimano, R.; Coniglione, R.; Costa, M.; Cuttone, G.; D'Amato, C.; D'Amico, A.; De Bonis, G.; De Luca, V.; Deniskina, N.; De Rosa, G.; di Capua, F.; Distefano, C.; Enzenhöfer, A.; Fermani, P.; Ferrara, G.; Flaminio, V.; Fusco, L. A.; Garufi, F.; Giordano, V.; Gmerk, A.; Grasso, R.; Grella, G.; Hugon, C.; Imbesi, M.; Kulikovskiy, V.; Lahmann, R.; Larosa, G.; Lattuada, D.; Leismüller, K. P.; Leonora, E.; Litrico, P.; Llorens Alvarez, C. D.; Lonardo, A.; Longhitano, F.; Lo Presti, D.; Maccioni, E.; Margiotta, A.; Marinelli, A.; Martini, A.; Masullo, R.; Migliozzi, P.; Migneco, E.; Miraglia, A.; Mollo, C. M.; Mongelli, M.; Morganti, M.; Musico, P.; Musumeci, M.; Nicolau, C. A.; Orlando, A.; Orzelli, A.; Papaleo, R.; Pellegrino, C.; Pellegriti, M. G.; Perrina, C.; Piattelli, P.; Pugliatti, C.; Pulvirenti, S.; Raffaelli, F.; Randazzo, N.; Real, D.; Riccobene, G.; Rovelli, A.; Saldaña, M.; Sanguineti, M.; Sapienza, P.; Sciacca, V.; Sgura, I.; Simeone, F.; Sipala, V.; Speziale, F.; Spitaleri, A.; Spurio, M.; Stellacci, S. M.; Taiuti, M.; Terreni, G.; Trasatti, L.; Trovato, A.; Ventura, C.; Vicini, P.; Viola, S.; Vivolo, D.

    2016-02-01

    The NEMO Phase-2 tower is the first detector that was operated underwater for more than 1 year at the "record" depth of 3500 m. It was designed and built within the framework of the NEMO (NEutrino Mediterranean Observatory) project. The 380 m high tower was successfully installed in March 2013, 80 km offshore Capo Passero (Italy). This is the first prototype operated on the site where the Italian node of the KM3NeT neutrino telescope will be built. The installation and operation of the NEMO Phase-2 tower have proven the functionality of the infrastructure and the operability at 3500 m depth. A more than 1 year long monitoring of the deep-water characteristics of the site has also been provided. In this paper the infrastructure and the tower structure and instrumentation are described. The results of long-term optical background measurements are presented. The rates show stable and low baseline values, compatible with the contribution of ^{40}K light emission, with a small percentage of light bursts due to bioluminescence. All these features confirm the stability and good optical properties of the site.

  19. Long term monitoring of the optical background in the Capo Passero deep-sea site with the NEMO tower prototype

    Energy Technology Data Exchange (ETDEWEB)

    Adrian-Martinez, S.; Ardid, M.; Llorens Alvarez, C.D.; Saldana, M. [Universitat Politecnica de Valencia, Instituto de Investigacion para la Gestion Integrada de las Zonas Costeras, Gandia (Spain); Aiello, S.; Giordano, V.; Leonora, E.; Longhitano, F.; Randazzo, N.; Sipala, V.; Ventura, C. [INFN Sezione Catania, Catania (Italy); Ameli, F.; Biagioni, A.; De Bonis, G.; Fermani, P.; Lonardo, A.; Nicolau, C.A.; Simeone, F.; Vicini, P. [INFN Sezione Roma, Rome (Italy); Anghinolfi, M.; Hugon, C.; Musico, P.; Orzelli, A.; Sanguineti, M. [INFN Sezione Genova, Genoa (Italy); Barbarino, G.; Barbato, F.C.T.; De Rosa, G.; Di Capua, F.; Garufi, F.; Vivolo, D. [INFN Sezione Napoli, Naples (Italy); Dipartimento di Scienze Fisiche Universita di Napoli, Naples (Italy); Barbarito, E. [INFN Sezione Bari, Bari (Italy); Dipartimento Interateneo di Fisica Universita di Bari, Bari (Italy); Beverini, N.; Calamai, M.; Maccioni, E.; Marinelli, A.; Terreni, G. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Dipartimento di Fisica Universita di Pisa, Polo Fibonacci, Pisa (Italy); Biagi, S.; Cacopardo, G.; Cali, C.; Caruso, F.; Cocimano, R.; Coniglione, R.; Costa, M.; Cuttone, G.; D'Amato, C.; De Luca, V.; Distefano, C.; Gmerk, A.; Grasso, R.; Imbesi, M.; Kulikovskiy, V.; Larosa, G.; Lattuada, D.; Leismueller, K.P.; Litrico, P.; Migneco, E.; Miraglia, A.; Musumeci, M.; Orlando, A.; Papaleo, R.; Pulvirenti, S.; Riccobene, G.; Rovelli, A.; Sapienza, P.; Sciacca, V.; Speziale, F.; Spitaleri, A.; Trovato, A.; Viola, S. [INFN Laboratori Nazionali del Sud, Catania (Italy); Bouhadef, B.; Flaminio, V.; Raffaelli, F. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Bozza, C.; Grella, G.; Stellacci, S.M. [INFN Gruppo Collegato di Salerno, Fisciano (Italy); Dipartimento di Fisica Universita di Salerno, Fisciano (Italy); Calvo, D.; Real, D. [CSIC-Universitat de Valencia, IFIC-Instituto de Fisica Corpuscular, Valencia (Spain); Capone, A.; Masullo, R.; Perrina, C. [INFN Sezione Roma, Rome (Italy); Dipartimento di Fisica Universita ''Sapienza'', Rome (Italy); Ceres, A.; Circella, M.; Mongelli, M.; Sgura, I. [INFN Sezione Bari, Bari (Italy); Chiarusi, T. [INFN Sezione Bologna, Bologna (Italy); D'Amico, A. [INFN Laboratori Nazionali del Sud, Catania (Italy); Nikhef, Science Park, Amsterdam (Netherlands); Deniskina, N.; Migliozzi, P.; Mollo, C.M. [INFN Sezione Napoli, Naples (Italy); Enzenhoefer, A.; Lahmann, R. [Friedrich-Alexander-Universitaet Erlangen-Nuernberg, Erlangen Centre for Astroparticle Physics, Erlangen (Germany); Ferrara, G. [INFN Laboratori Nazionali del Sud, Catania (Italy); Dipartimento di Fisica e Astronomia Universita di Catania, Catania (Italy); Fusco, L.A.; Margiotta, A.; Pellegrino, C.; Spurio, M. [INFN Sezione Bologna, Bologna (Italy); Dipartimento di Fisica ed Astronomia Universita di Bologna, Bologna (Italy); Lo Presti, D.; Pugliatti, C. [INFN Sezione Catania, Catania (Italy); Dipartimento di Fisica e Astronomia Universita di Catania, Catania (Italy); Martini, A.; Trasatti, L. [INFN Laboratori Nazionali di Frascati, Frascati (Italy); Morganti, M. [INFN Sezione Pisa, Polo Fibonacci, Pisa (Italy); Accademia Navale di Livorno, Livorno (Italy); Pellegriti, M.G. [INFN Laboratori Nazionali del Sud, Catania (Italy); Piattelli, P. [INFN Laboratori Nazionali del Sud, Catania (Italy); Taiuti, M. [INFN Sezione Genova, Genoa (Italy); Dipartimento di Fisica Universita di Genova, Genoa (Italy)

    2016-02-15

    The NEMO Phase-2 tower is the first detector that was operated underwater for more than 1 year at the ''record'' depth of 3500 m. It was designed and built within the framework of the NEMO (NEutrino Mediterranean Observatory) project. The 380 m high tower was successfully installed in March 2013, 80 km offshore Capo Passero (Italy). This is the first prototype operated on the site where the Italian node of the KM3NeT neutrino telescope will be built. The installation and operation of the NEMO Phase-2 tower have proven the functionality of the infrastructure and the operability at 3500 m depth. A more than 1 year long monitoring of the deep-water characteristics of the site has also been provided. In this paper the infrastructure and the tower structure and instrumentation are described. The results of long-term optical background measurements are presented. The rates show stable and low baseline values, compatible with the contribution of {sup 40}K light emission, with a small percentage of light bursts due to bioluminescence. All these features confirm the stability and good optical properties of the site. (orig.)

  20. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

    Natural gas is a non-renewable energy source used by several sectors of the economy of Ceara. It is used in industry, households and commerce, as an automotive fuel, in energy co-generation, and as a source for generating electricity from heat. Because of its practicality, this energy source enjoys strong market acceptance and serves a broad range of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to reach all potential clients interested in this source of energy. To facilitate the design, analysis and expansion of the distribution network, and to locate bottlenecks and breaks in it, modeling software is used that allows the network manager to handle the various kinds of information about the network. This paper presents the advantages of modeling the distribution networks of natural gas companies in Ceara, showing the tool used, the steps necessary for the implementation of the models, the advantages of using the software and the findings obtained with its use. (author)

  1. Development of hydrogeological modelling tools based on NAMMU

    Energy Technology Data Exchange (ETDEWEB)

    Marsic, N. [Kemakta Konsult AB, Stockholm (Sweden); Hartley, L.; Jackson, P.; Poole, M. [AEA Technology, Harwell (United Kingdom); Morvik, A. [Bergen Software Services International AS, Bergen (Norway)

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock-mass. Whether a fracture zone is modelled deterministically or stochastically its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site
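The stochastic-modelling item above mentions efficient Monte-Carlo simulation of stochastic permeability fields. A minimal sketch of the underlying idea, sampling a log-normal permeability model and propagating it through a Darcy flux: the distribution parameters, gradient and viscosity below are illustrative assumptions, not values from the NAMMU work.

```python
import random

def sample_lognormal_permeability(n, log10_mean=-14.0, log10_sigma=1.0, seed=42):
    """Draw n permeability values (m^2) from a log-normal distribution,
    a common stochastic model for rock-mass permeability.  The mean and
    spread here are illustrative placeholders."""
    rng = random.Random(seed)
    return [10.0 ** rng.gauss(log10_mean, log10_sigma) for _ in range(n)]

def monte_carlo_mean_flux(n_realisations, pressure_gradient=0.01, viscosity_pa_s=1e-3):
    """Monte-Carlo estimate of the mean Darcy flux q = (k / mu) * grad(p)
    over the sampled permeability ensemble."""
    ks = sample_lognormal_permeability(n_realisations)
    fluxes = [k / viscosity_pa_s * pressure_gradient for k in ks]
    return sum(fluxes) / len(fluxes)
```

In a real hydrogeological workflow each realisation would be a spatially correlated field on the model mesh rather than a single scalar, but the ensemble-averaging step is the same.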

  2. Implementing an HL7 version 3 modeling tool from an Ecore model.

    Science.gov (United States)

    Bánfai, Balázs; Ulrich, Brandon; Török, Zsolt; Natarajan, Ravi; Ireland, Tim

    2009-01-01

    One of the main challenges of achieving interoperability using the HL7 V3 healthcare standard is the lack of clear definition and supporting tools for modeling, testing, and conformance checking. Currently, the knowledge defining the modeling is scattered around in MIF schemas, tools and specifications or simply with the domain experts. Modeling core HL7 concepts, constraints, and semantic relationships in Ecore/EMF encapsulates the domain-specific knowledge in a transparent way while unifying Java, XML, and UML in an abstract, high-level representation. Moreover, persisting and versioning the core HL7 concepts as a single Ecore context allows modelers and implementers to create, edit and validate message models against a single modeling context. The solution discussed in this paper is implemented in the new HL7 Static Model Designer as an extensible toolset integrated as a standalone Eclipse RCP application.

  3. Congenital alterations of NEMO glutamic acid 223 result in hypohidrotic ectodermal dysplasia and immunodeficiency with normal serum IgG levels.

    Science.gov (United States)

    Karamchandani-Patel, Gital; Hanson, Eric P; Saltzman, Rushani; Kimball, C Eve; Sorensen, Ricardo U; Orange, Jordan S

    2011-07-01

    Hypomorphic mutations in the nuclear factor-κB (NF-κB) essential modulator (NEMO) gene result in a variable syndrome of somatic and immunologic abnormalities. Clinically relevant genotype-phenotype associations are essential to understanding this complex disease. Our objective was to study 2 unrelated boys with novel NEMO mutations altering codon 223 for similarity in phenotype, in consideration of potential genotype-phenotype associations. Clinical and laboratory features, including cell counts, immunoglobulin quantity and quality, natural killer cell cytotoxicity, and Toll-like and tumor necrosis factor receptor signaling, were evaluated. Because both mutations affected NEMO codon 223 and were novel, consideration was given to new potential genotype-phenotype associations. Both patients were diagnosed as having hypohidrotic ectodermal dysplasia and had severe or recurrent infections. One had recurrent sinopulmonary infections and the other necrotizing soft tissue methicillin-resistant Staphylococcus aureus infection and Streptococcus anginosus subdural empyema with bacteremia. NEMO gene sequencing demonstrated a 3-nucleotide deletion (c.667_669delGAG) in one patient and a substitution (667G>A) in the other. These findings predict either the deletion of NEMO glutamic acid 223 or its replacement with lysine, respectively. Both patients had normal serum IgG levels but poor specific antibodies. Natural killer cell cytotoxicity and Toll-like and tumor necrosis factor receptor signaling were also impaired. Serious bacterial infection did not occur in either patient after immunoglobulin replacement therapy. Two different novel mutations affecting NEMO glutamic acid 223 resulted in clinically relevant similar phenotypes, providing further evidence to support genotype-phenotype correlations in this disease. They suggest NEMO residue 223 is required for ectodermal development and immunity and is apparently dispensable for quantitative IgG production but may be required for specific antibody

  4. Conversion of Rapid Prototyping Models into Metallic Tools by Ceramic Moulding—an Indirect Rapid Tooling Process

    Institute of Scientific and Technical Information of China (English)

    Teresa P. DUARTE; J. M. FERREIRA; F. Jorge LINO; A. BARBEDO; Rui NETO

    2002-01-01

    A process to convert models made by rapid prototyping techniques like SL (stereolithography) and LOM (laminated object manufacturing) or by conventional techniques (silicones, resins, wax, etc.) into metallic moulds or tools has been developed. The main purpose of this technique is to rapidly obtain the first prototypes of parts, for plastics injection, forging or any other manufacturing process, using the tools produced by casting a metal into a ceramic mould. Briefly, it can be said that the ceramic...

  5. Radon emanation based material measurement and selection for the SuperNEMO double beta experiment

    Energy Technology Data Exchange (ETDEWEB)

    Cerna, Cédric, E-mail: cerna@cenbg.in2p3.fr; Soulé, Benjamin; Perrot, Frédéric [Centre d’Études Nucléaires de Bordeaux Gradignan, UMR 5797, F-33170 Gradignan (France)

    2015-08-17

    The SuperNEMO Demonstrator experiment aims to study the neutrinoless double beta decay of 7 kg of {sup 82}Se in order to reach a limit on the light Majorana neutrino mass mechanism of T{sub 1/2}(ββ0ν) > 6.5 x 10{sup 24} years (90% CL), equivalent to a mass sensitivity m{sub ββ} < 0.20 - 0.40 eV (90% CL), in two years of data taking. The detector construction started in 2014 and its installation in the Laboratoire Souterrain de Modane (LSM) is expected during the course of 2015. The remaining level of {sup 226}Ra ({sup 238}U chain) in the detector components can lead to the emanation of {sup 222}Rn gas. This isotope should be controlled and reduced down to the level of 150 µBq/m{sup 3} in the tracker chamber of the detector to achieve the physics goals. Besides the HPGe selection of the detector materials for their radiopurity, the most critical materials have been tested and selected in a dedicated setup facility able to measure their {sup 222}Rn emanation level. The operating principle relies on a large emanation tank (0.7 m{sup 3}) that allows measuring large material surfaces or a large number of construction pieces. The emanation tank is coupled to an electrostatic detector equipped with a silicon diode to perform the alpha spectroscopy of the gas it contains and extract the {sup 222}Rn daughters. The transfer efficiency and the detector efficiency have been carefully calibrated through different methods. The intrinsic background of the system allows one to measure {sup 222}Rn activities down to 3 mBq, leading to a typical emanation sensitivity of 20 µBq/m{sup 2}/day for a 30 m{sup 2} surface sample. Several construction materials have been measured and selected, such as nylon and aluminized Mylar films and the photomultipliers and tracker components of the SuperNEMO Demonstrator.
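The quoted sensitivity (20 µBq/m²/day for a 30 m² sample, given a 3 mBq activity limit) is consistent with a simple secular-equilibrium estimate: in a sealed tank the accumulated 222Rn activity saturates at A = r·S·τ, with τ the 222Rn mean lifetime. A sketch of that arithmetic, as our reconstruction rather than the collaboration's calibration procedure:

```python
import math

RN222_HALF_LIFE_D = 3.8235  # 222Rn half-life in days

def emanation_sensitivity(a_min_bq, surface_m2):
    """Smallest measurable surface-emanation rate (Bq/m^2/day).
    At secular equilibrium the accumulated 222Rn activity in a sealed
    tank is A = r * S * tau, so r_min = A_min / (S * tau), where tau is
    the 222Rn mean lifetime (~5.5 days)."""
    tau = RN222_HALF_LIFE_D / math.log(2)
    return a_min_bq / (surface_m2 * tau)
```

With A_min = 3 mBq and S = 30 m² this gives roughly 18 µBq/m²/day, in line with the 20 µBq/m²/day quoted above.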

  6. M4AST - A Tool for Asteroid Modelling

    Science.gov (United States)

    Birlan, Mirel; Popescu, Marcel; Irimiea, Lucian; Binzel, Richard

    2016-10-01

    M4AST (Modelling for asteroids) is an online tool devoted to the analysis and interpretation of reflection spectra of asteroids in the visible and near-infrared spectral intervals. It consists of a spectral database of individual objects and a set of analysis routines which address scientific aspects such as: taxonomy, curve matching with laboratory spectra, space weathering models, and mineralogical diagnosis. Spectral data were obtained using ground-based facilities; part of these data are compiled from the literature [1]. The database is composed of permanent and temporary files. Each permanent file contains a header and two or three columns (wavelength, spectral reflectance, and the error on spectral reflectance). Temporary files can be uploaded anonymously, and are purged to protect the ownership of the submitted data. The computing routines are organized in order to accomplish several scientific objectives: visualize spectra, compute the asteroid taxonomic class, compare an asteroid spectrum with similar spectra of meteorites, and compute mineralogical parameters. A facility for using the Virtual Observatory protocols was also developed. A new version of the service was released in June 2016. This new release of M4AST contains a database and facilities to model more than 6,000 spectra of asteroids. A new web interface was designed. This development allows new functionalities in a user-friendly environment. A bridge system for accessing and exploiting the SMASS-MIT database (http://smass.mit.edu) allows the treatment and analysis of these data in the framework of the M4AST environment. Reference: [1] M. Popescu, M. Birlan, and D.A. Nedelcu, "Modeling of asteroids: M4AST," Astronomy & Astrophysics 544, EDP Sciences, pp. A130, 2012.
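Curve matching with laboratory spectra, one of the routines listed above, amounts to ranking candidate spectra by their distance to the asteroid spectrum after normalisation. A toy stand-in for that idea (not the actual M4AST implementation), assuming all spectra are sampled on the same wavelength grid:

```python
def curve_match(asteroid, lab_spectra):
    """Rank laboratory spectra by mean squared distance to an asteroid
    reflectance curve.  Both sides are normalised to unit mean first so
    that only spectral shape, not absolute albedo, is compared.
    lab_spectra: {name: reflectance_list on the same wavelength grid}."""
    def normalise(refl):
        m = sum(refl) / len(refl)
        return [r / m for r in refl]

    a = normalise(asteroid)
    scores = []
    for name, refl in lab_spectra.items():
        b = normalise(refl)
        chi2 = sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
        scores.append((chi2, name))
    return sorted(scores)  # best match first
```

A production version would also weight each wavelength bin by the reflectance error column stored in the database files.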

  7. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks, which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, and they are available to download from Github, and can be incorporated in the Knowledgebase. Here, we summarize our work as follow. Understanding the topology of the integrated networks is the first step toward understanding its dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore for Aim 2 of this grant, we have developed several tools and carried out analysis for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interactions networks with various dynamical interfaces [2]. We then examined the association between network topology with phenotypic effects such as gene essentiality. In particular, we have organized E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. 
In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
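The hierarchy of a transcriptional regulatory network, which the abstract says captures the direction of information flow, can be illustrated by assigning each node the length of the longest regulator-to-target path that ends at it. This sketch is an illustration of that idea for acyclic networks, not the published algorithm [1]:

```python
from functools import lru_cache

def hierarchy_levels(edges):
    """Assign each node of an acyclic regulatory network a hierarchy
    level: top regulators (no incoming edges) get level 0, and every
    other node gets one more than its highest-level regulator, i.e.
    the longest path from the top.  edges: list of (regulator, target)."""
    regulators_of = {}
    nodes = set()
    for reg, tgt in edges:
        regulators_of.setdefault(tgt, []).append(reg)
        nodes.update((reg, tgt))

    @lru_cache(maxsize=None)
    def level(node):
        regs = regulators_of.get(node, [])
        return 0 if not regs else 1 + max(level(r) for r in regs)

    return {n: level(n) for n in sorted(nodes)}
```

Real regulatory networks contain feedback loops, so the published method has to handle cycles; this sketch assumes a DAG.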

  8. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
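The first outbreak-recognition algorithm described above searches for space-time data clusters. A naive illustration of the underlying idea, flagging cases whose space-time neighbourhood is unusually dense, as a toy stand-in for the project's pattern-recognition technique:

```python
import math

def spacetime_clusters(cases, radius_km, window_days, min_count):
    """Flag each case whose space-time neighbourhood (within radius_km
    and window_days, including itself) contains at least min_count
    cases.  cases: list of (x_km, y_km, day).  Returns flagged indices."""
    flagged = []
    for i, (xi, yi, ti) in enumerate(cases):
        count = sum(
            1 for (xj, yj, tj) in cases
            if math.hypot(xi - xj, yi - yj) <= radius_km
            and abs(ti - tj) <= window_days
        )
        if count >= min_count:
            flagged.append(i)
    return flagged
```

Operational scan statistics additionally compare each window's count against a baseline expectation and assess significance, which this sketch omits.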

  9. Nuclear initiated NF-κB signaling: NEMO and ATM take center stage

    Institute of Scientific and Technical Information of China (English)

    Shigeki Miyamoto

    2011-01-01

    A large body of literature describes elaborate NF-κB signaling networks induced by inflammatory and immune signals.Decades of research has revealed that transcriptionally functional NF-κB dimers are activated by two major pathways,canonical and non-canonical.Both pathways involve the release of NF-κB dimers from inactive cytoplasmic complexes to cause their nuclear translocation to modulate gene expression programs and biological responses.NF-κB is also responsive to genotoxic agents; however,signal communication networks that are initiated in the nucleus following DNA damage induction are less defined.Evidence in the literature supports the presence of such signaling pathways induced by multiple distinct genotoxic agents,resulting in the activation of cytoplasmic IKK complex.An example is a pathway that involves the DNA damage-responsive kinase ataxia telangiectasia mutated(ATM)and a series of post-translational modifications of NF-κB essential modulator(NEMO)in the nucleus of a genotoxinexposed cell.Recent evidence also suggests that this nuclear-initiated NF-κB signaling pathway plays significant physiological and pathological roles,particularly in lymphocyte development and human cancer progression.This review will summarize these new developments,while identifying significant unanswered questions and providing new hypotheses that may be addressed in future studies.

  10. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on the short-term (daily, hourly and sub-hourly) scales. Production-cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data by minimizing production costs and following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide more detailed simulation of the short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system that is based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.
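The core of such a linking step is translating the CEM's build and retirement decisions into an updated generator fleet for the PCM. A schematic sketch with a hypothetical record layout; CEPCoLT itself maps ReEDS outputs onto PLEXOS inputs, and none of the names below are its actual API:

```python
def apply_expansion_plan(fleet, builds, retirements):
    """Map a capacity-expansion solution onto a production-cost-model
    fleet: add newly built capacity and drop retired units.
    fleet, builds: {unit_name: capacity_mw}; retirements: unit names.
    Returns a new fleet dict; the input fleet is left unchanged."""
    updated = dict(fleet)
    for name, mw in builds.items():
        updated[name] = updated.get(name, 0.0) + mw
    for name in retirements:
        updated.pop(name, None)
    return updated
```

A real linking tool must also carry over unit-level operating parameters (heat rates, ramp limits, outage rates) that the capacity expansion model represents only in aggregate.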

  11. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience and applications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. Those are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  12. Measurement of the double-β decay half-life and search for the neutrinoless double-β decay of 48Ca with the NEMO-3 detector

    Science.gov (United States)

    Waters, David; Vilela, Cristóvão; NEMO-3 collaboration

    2017-09-01

    Neutrinoless double-β decay is a powerful probe of lepton number violating processes that may arise from Majorana terms in neutrino masses, or from supersymmetric, left-right symmetric, and other extensions of the Standard Model. Of the candidate isotopes for the observation of this process, 48Ca has the highest Qββ -value, resulting in decays with energies significantly above most naturally occurring backgrounds. The nucleus also lends itself to precise matrix element calculations within the nuclear shell model. We present the world’s best measurement of the two-neutrino double-β decay of 48Ca, obtained by the NEMO-3 collaboration using 5.25 yr of data recorded with a 6.99 g sample of isotope, yielding ≈ 150 events with a signal to background ratio larger than 3. Neutrinoless modes of double-β decay are also investigated, with no evidence of new physics. Furthermore, these results indicate that two-neutrino double-β decay would be the main source of background for similar future searches using 48Ca with significantly larger exposures.
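For a counting measurement in which the exposure is far shorter than the decay half-life, the half-life follows from T1/2 = ln 2 · N_atoms · ε · t / N_signal. A sketch of that estimate; the detection efficiency used in the example is an assumed placeholder, not NEMO-3's published value:

```python
import math

AVOGADRO = 6.02214076e23

def two_nu_half_life(mass_g, molar_mass_g, live_time_yr, n_signal, efficiency):
    """Half-life (years) from observed double-beta counts:
    T_1/2 = ln2 * N_atoms * efficiency * t / N_signal,
    valid in the t << T_1/2 limit where the source activity is constant."""
    n_atoms = mass_g / molar_mass_g * AVOGADRO
    return math.log(2) * n_atoms * efficiency * live_time_yr / n_signal
```

Plugging in the quoted exposure (6.99 g of 48Ca, 5.25 yr, ~150 events) with an assumed efficiency of a few per cent puts the estimate at the 10^19-yr scale, the order of magnitude expected for the 48Ca two-neutrino mode.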

  13. An Executable Architecture Tool for the Modeling and Simulation of Operational Process Models

    Science.gov (United States)

    2015-03-16

    This paper includes a literature review, background information on process models and architecture, and a brief description of other available tools; the approach relies on national coordination and iterative development. Future work involves coordination with Subject Matter Experts (SMEs) and extracting data from experiments to assign more appropriate values. (Fig. 10: Snapshot of simulation output results for Example 3.)

  14. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
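The interplay between the "RSM Area" look-up table and the ballistic coefficient described above can be illustrated with a 1-D pitch-only table and the relation β = m / (C_d · A). This is a simplified sketch: the real code tabulates projected area over both pitch and yaw, and obtains C_d from the Gaussian-process RSM rather than as an input.

```python
from bisect import bisect_left

def make_area_table(pitches_deg, areas_m2):
    """Build a pitch-angle -> projected-area look-up with linear
    interpolation between tabulated points and clamping at the ends
    (a 1-D analogue of the pitch/yaw area table)."""
    def lookup(pitch):
        if pitch <= pitches_deg[0]:
            return areas_m2[0]
        if pitch >= pitches_deg[-1]:
            return areas_m2[-1]
        i = bisect_left(pitches_deg, pitch)
        frac = (pitch - pitches_deg[i - 1]) / (pitches_deg[i] - pitches_deg[i - 1])
        return areas_m2[i - 1] + frac * (areas_m2[i] - areas_m2[i - 1])
    return lookup

def ballistic_coefficient(mass_kg, cd, area_m2):
    """Ballistic coefficient beta = m / (Cd * A), the quantity derived
    from the drag-coefficient RSM and the projected-area table."""
    return mass_kg / (cd * area_m2)
```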

  15. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to ...... to UML and a test automation principle based on traces written as a kind of regular expressions....

  16. Improved Modeling Tools For High Speed Reacting Flows

    Science.gov (United States)

    2006-09-01

    putting the tools in place and operating them as a single system on the Beowulf cluster which was purposely built by Blue Blanket LLC (BBLLC) for this...a commercial tool, available from the Program Development Company (PDC). Computational Cluster An eight processor cluster was leased from BBLLC...SBIR I - FA8650-05-M-2594 3 Software Installation Once this cluster was in place, the off-the-shelf software was installed and tested

  17. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... and other analytical tools for conducting analyses for the planning, design, construction,...

  18. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  19. Designing a new tool for modeling and simulation of discrete-event based systems

    OpenAIRE

    2009-01-01

    This paper discusses the design, development, and application of a new Petri net simulator for the modeling and simulation of discrete event systems (e.g. information systems). The new tool is called GPenSIM (General Purpose Petri Net Simulator). Firstly, the paper presents the reasons for developing a new tool, through a brief literature study. Secondly, the design and architectural issues of the tool are given. Finally, an application example of the tool is given.

  20. Failure of the Nemo trial: bumetanide is a promising agent to treat many brain disorders but not newborn seizures

    Directory of Open Access Journals (Sweden)

    Yehezkel eBen-Ari

    2016-04-01

    Full Text Available The diuretic bumetanide failed to treat acute seizures due to hypoxic ischemic encephalopathy (HIE) in newborn babies and was associated with hearing loss (NEMO trial [1]). On the other hand, clinical and experimental observations suggest that the diuretic might provide novel therapy for many brain disorders including autistic spectrum disorder, schizophrenia, Rett syndrome and Parkinson disease. Here, we discuss the differences between the pathophysiology of severe recurrent seizures in neonates and that of neurological and psychiatric disorders, stressing the uniqueness of severe seizures in newborns in comparison to other disorders.

  1. Failure of the Nemo Trial: Bumetanide Is a Promising Agent to Treat Many Brain Disorders but Not Newborn Seizures.

    Science.gov (United States)

    Ben-Ari, Yehezkel; Damier, Philippe; Lemonnier, Eric

    2016-01-01

    The diuretic bumetanide failed to treat acute seizures due to hypoxic ischemic encephalopathy (HIE) in newborn babies and was associated with hearing loss (NEMO trial, Pressler et al., 2015). On the other hand, clinical and experimental observations suggest that the diuretic might provide novel therapy for many brain disorders including Autism Spectrum Disorders (ASD), schizophrenia, Rett syndrome, and Parkinson disease. Here, we discuss the differences between the pathophysiology of severe recurrent seizures in neonates and that of neurological and psychiatric disorders, stressing the uniqueness of severe seizures in newborns in comparison to other disorders.

  2. Measurement of the Double Beta Decay Half-life of 130Te with the NEMO-3 Detector

    CERN Document Server

    Arnold, R; Baker, J; Barabash, A S; Basharina-Freshville, A; Blondel, S; Bongrand, M; Broudin-Bay, G; Brudanin, V; Caffrey, A J; Chapon, A; Chauveau, E; Durand, D; Egorov, V; Flack, R; Garrido, X; Grozier, J; Guillon, B; Hubert, Ph; Jackson, C M; Jullian, S; Kauer, M; Klimenko, A; Kochetov, O; Konovalov, S I; Kovalenko, V; Lalanne, D; Lamhamdi, T; Lang, K; Liptak, Z; Lutter, G; Mamedov, F; Marquet, Ch; Martin-Albo, J; Mauger, F; Mott, J; Nachab, A; Nemchenok, I; Nguyen, C H; Nova, F; Novella, P; Ohsumi, H; Pahlka, R B; Perrot, F; Piquemal, F; Reyss, J L; Richards, B; Ricol, J S; Saakyan, R; Sarazin, X; Shitov, Yu; Simard, L; Šimkovic, F; Smolnikov, A; Söldner-Rembold, S; Štekl, I; Suhonen, J; Sutton, C S; Szklarz, G; Thomas, J; Timkin, V; Torre, S; Tretyak, V I; Umatov, V; Vála, L; Vanyushin, I; Vasiliev, V; Vorobel, V; Vylov, T; Zukauskas, A

    2011-01-01

    This Letter reports results from the NEMO-3 experiment based on an exposure of 1275 days with 661g of 130Te in the form of enriched and natural tellurium foils. With this data set the double beta decay rate of 130Te is found to be non-zero with a significance of 7.7 standard deviations and the half-life is measured to be T1/2 = (7.0 +/- 0.9(stat) +/- 1.1(syst)) x 10^{20} yr. This represents the most precise measurement of this half-life yet published and the first real-time observation of this decay.
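
    The quoted half-life can be sanity-checked against the quoted exposure with a short calculation. This is only an order-of-magnitude sketch: it ignores detection efficiency and backgrounds, which the actual analysis of course accounts for.

```python
from math import log

# inputs taken from the abstract above
mass_g = 661.0                 # mass of 130Te in the foils, grams
exposure_yr = 1275 / 365.25    # 1275 days of data taking
half_life_yr = 7.0e20          # measured T1/2

N_A = 6.022e23                 # Avogadro's number, atoms/mol
molar_mass = 130.0             # g/mol for the 130 isotope

n_atoms = mass_g / molar_mass * N_A
# since t << T1/2, the decay rate is simply N * ln(2) / T1/2
decays = n_atoms * log(2) / half_life_yr * exposure_yr
print(f"expected decays in the source: {decays:.0f}")
```

    Roughly ten thousand double beta decays occur in the foils over the exposure; only a fraction of these are reconstructed in the detector.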

  3. Examining an important urban transportation management tool: subarea modeling

    Directory of Open Access Journals (Sweden)

    Xueming CHEN

    2009-12-01

    Full Text Available At present, customized subarea models have been widely used in local transportation planning throughout the United States. The biggest strengths of a subarea model lie in its more detailed and accurate modeling outputs which better meet local planning requirements. In addition, a subarea model can substantially reduce database size and model running time. In spite of these advantages, subarea models remain quite weak in maintaining consistency with a regional model, modeling transit projects, smart growth measures, air quality conformity, and other areas. Both opportunities and threats exist for subarea modeling. In addition to examining subarea models, this paper introduces the decision-making process in choosing a proper subarea modeling approach (windowing versus focusing and software package. This study concludes that subarea modeling will become more popular in the future. More GIS applications, travel surveys, transit modeling, microsimulation software utilization, and other modeling improvements are expected to be incorporated into the subarea modeling process.

  4. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model built on body babbling and a neurodynamical system, enabling robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  5. Ergonomics applications of a mechanical model of the human operator in power hand tool operation.

    Science.gov (United States)

    Lin, Jia-Hua; Radwin, Robert; Nembhard, David

    2005-02-01

    Applications of a new model for predicting power threaded-fastener-driving tool operator response and capacity to react against impulsive torque reaction forces are explored for use in tool selection and ergonomic workplace design. The model is based on a mechanical analog of the human operator, with parameters dependent on work location (horizontal and vertical distances); work orientation (horizontal and vertical); and tool shape (in-line, pistol grip, and right angle); and is stratified by gender. This model enables prediction of group means and variances of handle displacement and force for a given tool configuration. Response percentiles can be ascertained for specific tool operations. For example, a sample pistol grip nutrunner used on a horizontal surface at 30 cm in front of the ankles and 140 cm above the floor results in a predicted mean handle reaction displacement of 39.0 (SD=28.1) mm for males. Consequently, 63% of the male users exceed a 30 mm handle displacement limit. When a right angle tool of similar torque output is used instead, the model predicted that only 4.6% of the male tool users exceed a 30 mm handle displacement. A method is described for interpolating individual subject model parameters at any given work location using linear combinations in relation to the range of modeled factors. Additional examples pertinent to ergonomic workstation design and tool selection are provided to demonstrate how the model can be used to aid tool selection and workstation design.
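
    The percentile figures in the example follow directly from the predicted group mean and SD if one assumes normally distributed handle displacement (an assumption made here for illustration, not stated in the abstract):

```python
from math import erf, sqrt

def pct_exceeding(limit, mean, sd):
    """Fraction of operators predicted to exceed a handle-displacement
    limit, assuming displacement is normally distributed with the
    model's predicted group mean and standard deviation."""
    z = (limit - mean) / sd
    # 1 - Phi(z), with Phi the standard normal CDF expressed via erf
    return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

# pistol-grip nutrunner example from the abstract: mean 39.0 mm, SD 28.1 mm
print(round(pct_exceeding(30.0, 39.0, 28.1) * 100))  # → 63
```

    This reproduces the 63% exceedance quoted for the pistol-grip tool; the right-angle tool's 4.6% figure would follow the same way from its own (unstated) mean and SD.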

  6. Measurement Model for Division as a Tool in Computing Applications

    Science.gov (United States)

    Abramovich, Sergei; Strock, Tracy

    2002-01-01

    The paper describes the use of a spreadsheet in a mathematics teacher education course. It shows how the tool can serve as a link between seemingly disconnected mathematical concepts. The didactical triad of using a spreadsheet as an agent, consumer, and amplifier of mathematical activities allows for an extended investigation of simple yet…

  7. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid;

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed...... of processing elements, the size of local memories, and the operating systems (scheduling algorithm)....

  8. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  9. An Energy Systems Modelling Tool for the Social Simulation Community

    NARCIS (Netherlands)

    Bollinger, L. Andrew; van Blijswijk, Martti J.; Dijkema, Gerard P.J.; Nikolic, Igor

    2016-01-01

    The growing importance of links between the social and technical dimensions of the electricity infrastructure means that many research problems cannot be effectively addressed without joint consideration of social and technical dynamics. This paper motivates the need for and introduces a tool to facilitate…

  10. A practical tool for modeling biospecimen user fees.

    Science.gov (United States)

    Matzke, Lise; Dee, Simon; Bartlett, John; Damaraju, Sambasivarao; Graham, Kathryn; Johnston, Randal; Mes-Masson, Anne-Marie; Murphy, Leigh; Shepherd, Lois; Schacter, Brent; Watson, Peter H

    2014-08-01

    The question of how best to attribute the unit costs of the annotated biospecimen product that is provided to a research user is a common issue for many biobanks. Some of the factors influencing user fees are capital and operating costs, internal and external demand and market competition, and moral standards that dictate that fees must have an ethical basis. It is therefore important to establish a transparent and accurate costing tool that can be utilized by biobanks and aid them in establishing biospecimen user fees. To address this issue, we built a biospecimen user fee calculator tool, accessible online at www.biobanking.org . The tool was built to allow input of: i) annual operating and capital costs; ii) costs categorized by the major core biobanking operations; iii) specimen products requested by a biobank user; and iv) services provided by the biobank beyond core operations (e.g., histology, tissue micro-array); as well as v) several user defined variables to allow the calculator to be adapted to different biobank operational designs. To establish default values for variables within the calculator, we first surveyed the members of the Canadian Tumour Repository Network (CTRNet) management committee. We then enrolled four different participants from CTRNet biobanks to test the hypothesis that the calculator tool could change approaches to user fees. Participants were first asked to estimate user fee pricing for three hypothetical user scenarios based on their biobanking experience (estimated pricing) and then to calculate fees for the same scenarios using the calculator tool (calculated pricing). Results demonstrated significant variation in estimated pricing that was reduced by calculated pricing, and that higher user fees are consistently derived when using the calculator. We conclude that adoption of this online calculator for user fee determination is an important first step towards harmonization and realistic user fees.

  11. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    Science.gov (United States)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-01-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school…

  12. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool is proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisations risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  13. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  14. A Model for Developing Meta-Cognitive Tools in Teacher Apprenticeships

    Science.gov (United States)

    Bray, Paige; Schatz, Steven

    2013-01-01

    This research investigates a model for developing meta-cognitive tools to be used by pre-service teachers during apprenticeship (student teaching) experience to operationalise the epistemological model of Cook and Brown (2009). Meta-cognitive tools have proven to be effective for increasing performance and retention of undergraduate students.…

  16. Tools and data for the geochemical modeling. Thermodynamic data for sulfur species and background salts and tools for the uncertainty analysis; WEDA. Werkzeuge und Daten fuer die Geochemische Modellierung. Thermodynamische Daten fuer Schwefelspezies und Hintergrundsalze sowie Tools zur Unsicherheitsanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Sven; Schoenwiese, Dagmar; Scharge, Tina

    2015-07-15

    The report on tools and data for the geochemical modeling covers the following issues: experimental methods and theoretical models, design of a thermodynamic model for reduced sulfur species, thermodynamic models for background salts, tools for the uncertainty and sensitivity analyses of geochemical equilibrium modeling.

  17. Modeling Heterogeneity in Networks using Uncertainty Quantification Tools

    CERN Document Server

    Rajendran, Karthikeyan; Siettos, Constantinos I; Laing, Carlo R; Kevrekidis, Ioannis G

    2015-01-01

    Using the dynamics of information propagation on a network as our illustrative example, we present and discuss a systematic approach to quantifying heterogeneity and its propagation that borrows established tools from Uncertainty Quantification. The crucial assumption underlying this mathematical and computational "technology transfer" is that the evolving states of the nodes in a network quickly become correlated with the corresponding node "identities": features of the nodes imparted by the network structure (e.g. the node degree, the node clustering coefficient). The node dynamics thus depend on heterogeneous (rather than uncertain) parameters, whose distribution over the network results from the network structure. Knowing these distributions allows us to obtain an efficient coarse-grained representation of the network state in terms of the expansion coefficients in suitable orthogonal polynomials. This representation is closely related to mathematical/computational tools for uncertainty quantification (th...

  18. MQ-2 A Tool for Prolog-based Model Querying

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2012-01-01

    MQ-2 integrates a Prolog console into the MagicDraw1 modeling environment and equips this console with features targeted specifically to the task of querying models. The vision of MQ-2 is to make Prolog-based model querying accessible to both student and expert modelers by offering powerful query...

  19. A unified tool for performance modelling and prediction

    Energy Technology Data Exchange (ETDEWEB)

    Gilmore, Stephen [Laboratory for Foundations of Computer Science, University of Edinburgh, King's Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)]. E-mail: stg@inf.ed.ac.uk; Kloul, Leila [Laboratory for Foundations of Computer Science, University of Edinburgh, King's Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)

    2005-07-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony.

  20. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. The pros and cons of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market, together with product cost and quality. Over the last few years the authors developed an advanced simulator named "SSDExplorer" which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  1. Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.

    Science.gov (United States)

    Brockman, Vicki

    As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.

  2. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with a cubic boron nitride tool have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear is depicted in this review paper. An effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be used according to user requirements in hard turning.
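
    Of the wear models surveyed, Usui's wear model is the one most often quoted in closed form. As a reference point (symbols follow common usage in the machining literature, not this abstract), it relates the volumetric wear rate per unit contact area to contact stress, sliding velocity, and temperature:

```latex
\frac{dW}{dt} = A\,\sigma_t\,V_s\,\exp\!\left(-\frac{B}{\theta}\right)
```

    where \(\sigma_t\) is the normal stress on the tool face, \(V_s\) the chip sliding velocity, \(\theta\) the absolute tool-chip interface temperature, and \(A\), \(B\) empirical constants calibrated from wear tests.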

  3. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    Full Text Available In this paper we present a tool that performs CUDA accelerated LTL Model Checking. The tool exploits parallel algorithm MAP adjusted to the NVIDIA CUDA architecture in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL Model Checking. We demonstrate that the tool outperforms non-accelerated version of the algorithm and we discuss where the limits of the tool are and what we intend to do in the future to avoid them.

  4. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  5. Implementation and Test of Multiple Interfaces-Supported NEMO

    Institute of Scientific and Technical Information of China (English)

    邱陆威; 高德云; 周华春

    2012-01-01

    The Network Mobility (NEMO) Basic Support Protocol allows nodes inside a mobile sub-network to maintain session continuity while the network moves. The Multiple Care-of Addresses (MCoA) protocol allows a mobile node to register multiple care-of addresses simultaneously. In this paper we integrate the NEMO and MCoA protocols, study the implementation of multiple-interface support in NEMO, and design experiments to verify this functionality and analyse its performance.

  6. The Quantum Atomic Model "Electronium": A Successful Teaching Tool.

    Science.gov (United States)

    Budde, Marion; Niedderer, Hans; Scott, Philip; Leach, John

    2002-01-01

    Focuses on the quantum atomic model Electronium. Outlines the Bremen teaching approach in which this model is used, and analyzes the learning of two students as they progress through the teaching unit. (Author/MM)

  7. Tools for modeling radioactive contaminants in chip materials

    Science.gov (United States)

    Wrobel, F.; Kaouache, A.; Saigné, F.; Touboul, A. D.; Schrimpf, R. D.; Warot, G.; Bruguier, O.

    2017-03-01

    Radioactive pollutants are naturally present in microelectronic device materials and can be an issue for the reliability of devices. The main concern is alpha emitters, which produce high-energy particles (a few MeV) that ionize the semiconductor and can then trigger soft errors. The key questions are what kinds of radionuclides are present in the device, where they are located, and how abundant each species is. In this paper we describe tools that are required to address the issue of radioactive pollutants in electronic devices.

  8. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are implemented in the newly proposed ArcMAP tool. The tool is programmed in Python and provided with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
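
    The simplest of the three BSA techniques, the frequency ratio, can be sketched as follows. This is an illustrative re-implementation of the standard definition, not code from the BSM tool, and the toy raster layout is invented for the example.

```python
import numpy as np

def frequency_ratio(class_map, hazard_mask):
    """Frequency ratio per class of a thematic raster: the class's share
    of hazard occurrences divided by its share of total area. FR > 1
    indicates a positive association between the class and the hazard."""
    total_pixels = class_map.size
    total_hazard = hazard_mask.sum()
    ratios = {}
    for c in np.unique(class_map):
        in_class = class_map == c
        area_share = in_class.sum() / total_pixels
        hazard_share = (hazard_mask & in_class).sum() / total_hazard
        ratios[int(c)] = hazard_share / area_share
    return ratios

# toy example: a 2x4 raster with two classes and two hazard pixels,
# both falling in class 1
classes = np.array([[1, 1, 2, 2], [1, 1, 2, 2]])
hazards = np.array([[True, False, False, False], [True, False, False, False]])
print(frequency_ratio(classes, hazards))
```

    In a susceptibility study, these per-class ratios are computed for each conditioning factor (slope, lithology, land use, and so on) and summed per pixel to form the hazard index that the AUC then evaluates.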

  9. Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics

    Science.gov (United States)

    2008-10-01

    AFRL-RH-WP-TR-2009-0110, Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics (contract FA8650-07-1-6848). This final technical report describes the research findings of the project Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics.

  10. WeedML: a Tool for Collaborative Weed Demographic Modeling

    OpenAIRE

    Holst, Niels

    2010-01-01

    WeedML is a proposed standard to formulate models of weed demography, or maybe even complex models in general, that are both transparent and straightforward to re-use as building blocks for new models. The paper describes the design and thoughts behind WeedML which relies on XML and object-oriented systems development. Proof-of-concept software is provided as open-source C++ code and executables that can be downloaded freely.

  11. Modeling of Tool Wear in Vibration Assisted Nano Impact-Machining by Loose Abrasives

    Directory of Open Access Journals (Sweden)

    Sagil James

    2014-01-01

    Full Text Available Vibration assisted nano impact-machining by loose abrasives (VANILA) is a novel nanomachining process that combines the principles of vibration assisted abrasive machining and tip-based nanomachining, to perform target specific nanoabrasive machining of hard and brittle materials. An atomic force microscope (AFM) is used as a platform in this process wherein nanoabrasives, injected in slurry between the workpiece and the vibrating AFM probe which is the tool, impact the workpiece and cause nanoscale material removal. The VANILA process is conducted such that the tool tip does not directly contact the workpiece. The level of precision and the quality of the machined features in a nanomachining process are contingent on tool wear, which is inevitable. Initial experimental studies have demonstrated reduced tool wear in the VANILA process as compared to an indentation process in which the tool directly contacts the workpiece surface. In this study, the tool wear rate during the VANILA process is analytically modeled considering impacts of abrasive grains on the tool tip surface. Experiments are conducted using several tools in order to validate the predictions of the theoretical model. It is seen that the model is capable of accurately predicting the tool wear rate within 10% deviation.

  12. Towards diagnostic tools for analysing Swarm data through model retrievals

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Plank, Gernot; Haagmans, R.

    The objective of the Swarm mission is to provide the best ever survey of the geomagnetic field and its temporal dependency, and to gain new insights into improving our knowledge of the Earth's interior and climate. The Swarm concept consists of a constellation of three satellites in three different...... polar orbits between 300 and 550 km altitude. The goal of the current study is to build tools and to analyze datasets, in order to allow a fast diagnosis of the Swarm system performance in orbit during the commissioning phase and operations of the spacecraft. The effects on the reconstruction of the magnetic...... field resulting from various error sources are investigated. Using a specially developed software package, closed-loop simulations are performed aiming at different scenarios. We start from the simple noise-free case and move on to more complex and realistic situations which include attitude errors...

  13. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  14. KINEROS2 – AGWA Suite of Modeling Tools

    Science.gov (United States)

    KINEROS2 (K2) originated in the 1960s as a distributed event-based rainfall-runoff erosion model abstracting the watershed as a cascade of overland flow elements contributing to channel model elements. Development and improvement of K2 has continued for a variety of projects and ...
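
    The "cascade" abstraction described above can be illustrated with a toy routing sketch: each overland-flow plane feeds a channel element. KINEROS2 itself solves kinematic-wave equations; the linear reservoirs below are only an invented stand-in for the cascade structure.

```python
# Toy cascade: two overland-flow planes whose outflow feeds one channel
# element. Linear reservoirs (dS/dt = I - S/k) stand in for the actual
# kinematic-wave elements; all numbers are illustrative.
def linear_reservoir(inflow, k, dt=1.0):
    storage, outflow = 0.0, []
    for i in inflow:
        storage += (i - storage / k) * dt   # explicit Euler mass balance
        outflow.append(storage / k)
    return outflow

rain = [0.0, 5.0, 10.0, 5.0, 0.0, 0.0, 0.0, 0.0]
plane_a = linear_reservoir(rain, k=2.0)          # upslope plane
plane_b = linear_reservoir(rain, k=3.0)          # adjacent plane
channel_in = [a + b for a, b in zip(plane_a, plane_b)]
channel_out = linear_reservoir(channel_in, k=1.5)  # channel element
print(round(max(channel_in), 2), round(max(channel_out), 2))
```

    The channel element attenuates the combined hillslope peak, which is the qualitative behavior the cascade abstraction is meant to capture.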

  15. Using a Parametric Solid Modeler as an Instructional Tool

    Science.gov (United States)

    Devine, Kevin L.

    2008-01-01

    This paper presents the results of a quasi-experimental study that brought 3D constraint-based parametric solid modeling technology into the high school mathematics classroom. This study used two intact groups; a control group and an experimental group, to measure the extent to which using a parametric solid modeler during instruction affects…

  16. Computerized models : tools for assessing the future of complex systems?

    NARCIS (Netherlands)

    Ittersum, van M.K.; Sterk, B.

    2015-01-01

    Models are commonly used to make decisions. At some point all of us will have employed a mental model, that is, a simplification of reality, in an everyday situation. For instance, when we want to make the best decision for the environment and consider whether to buy our vegetables in a large

  17. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to unders

  18. Homology Modeling a Fast Tool for Drug Discovery: Current Perspectives

    Science.gov (United States)

    Vyas, V. K.; Ukawala, R. D.; Ghate, M.; Chintha, C.

    2012-01-01

    A major goal of structural biology involves the formation of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. Therefore, understanding protein-ligand interactions is very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improvements in modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling is a representation of the similarity of environmental residues at topologically corresponding positions in the reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. Recent advances in homology modeling, particularly in detecting and aligning sequences with template structures and distant homologues, modeling loops and side chains, and detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at different stages of drug design and discovery. PMID:23204616

  19. MuscleBuilder: A Modeling Tool for Human Anatomy

    Institute of Scientific and Technical Information of China (English)

    Amaury Aubel; Daniel Thalmann

    2004-01-01

    A traditional multi-layered approach is adopted for human body modeling and deformation. The model is split into three general anatomical structures: the skeleton, musculature and skin. Each of these layers is modeled and deformed using fast, procedural, ad-hoc methods that can painlessly be reimplemented. The modeling approach is generic enough to handle muscles of varying shape, size and characteristics, and does not break in extreme skeleton poses. The integrated MuscleBuilder system is also described; its main features are: i) easy and quick creation of muscle deformation models; ii) automatic deformation of an overlying skin. Visually realistic results can be obtained at interactive frame rates with very little input from the designer.

  20. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing;

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools....... To identify promising models (or processes) for which the tools have not yet been constructed and start filling up these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  1. The Use of the Articulated Total Body Model as a Robot Dynamics Simulation Tool

    Science.gov (United States)

    1988-07-01

    AARL-SR-90-512; AD-A235 930. The Use of the Articulated Total Body Model as a Robot Dynamics Simulation Tool; Louise A...; PE 62202F. In this paper the use of the ATB model as a robot dynamics simulation tool is discussed and various simulations are demonstrated. For this

  2. Application of Krylov Reduction Technique for a Machine Tool Multibody Modelling

    Directory of Open Access Journals (Sweden)

    M. Sulitka

    2014-02-01

    Quick calculation of machine tool dynamic response is one of the major requirements for machine tool virtual modelling and virtual machining, which aim at simulating the machining process performance, quality, and precision of a workpiece. Enhanced time effectiveness in machine tool dynamic simulations may be achieved by employing model order reduction (MOR) techniques on the full finite element (FE) models. The paper provides a case study comparing the Krylov subspace and mode truncation reduction techniques. The application of both reduction techniques to creating a machine tool multibody model is evaluated. The Krylov subspace reduction technique shows high quality in terms of the dynamic properties of the reduced multibody model while imposing very low time demands.
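
    The idea behind Krylov-subspace reduction can be sketched generically (the system, names, and sizes below are invented, not taken from the paper): project a large first-order system E x' = A x + b u onto a low-dimensional Krylov subspace so that leading moments of the transfer function around s = 0 are matched.

```python
# Minimal one-sided Krylov (moment-matching) reduction sketch for
# E x' = A x + b u. V spans span{A^-1 b, (A^-1 E) A^-1 b, ...}; the
# reduced system is (V'AV, V'EV, V'b). Illustrative random system.
import numpy as np

def krylov_reduce(A, E, b, r):
    n = len(b)
    V = np.zeros((n, r))
    v = np.linalg.solve(A, b)                    # first basis vector
    for j in range(r):
        v /= np.linalg.norm(v)
        V[:, j] = v
        v = np.linalg.solve(A, E @ V[:, j])      # next Krylov direction
        v -= V[:, :j + 1] @ (V[:, :j + 1].T @ v) # Gram-Schmidt step
    return V.T @ A @ V, V.T @ E @ V, V.T @ b, V

rng = np.random.default_rng(0)
n = 50
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
E = np.eye(n)
b = rng.standard_normal(n)
c = rng.standard_normal(n)

Ar, Er, br, V = krylov_reduce(A, E, b, r=8)
# Zeroth transfer-function moment at s = 0 is -c A^-1 b; the projection
# reproduces it because A^-1 b lies in span(V).
full = -c @ np.linalg.solve(A, b)
red = -(c @ V) @ np.linalg.solve(Ar, br)
print(abs(full - red))  # tiny: zeroth moment matched
```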

  3. CPS Modeling of CNC Machine Tool Work Processes Using an Instruction-Domain Based Approach

    Directory of Open Access Journals (Sweden)

    Jihong Chen

    2015-06-01

    Building cyber-physical system (CPS) models of machine tools is a key technology for intelligent manufacturing. The massive electronic data from a computer numerical control (CNC) system during the work processes of a CNC machine tool is the main source of the big data on which a CPS model is established. In this work-process model, a method based on the instruction domain is applied to analyze the electronic big data, and a quantitative description of the numerical control (NC) processes is built according to the G code of the processes. Utilizing the instruction domain, a work-process CPS model is established on the basis of the accurate, real-time mapping of the manufacturing tasks, resources, and status of the CNC machine tool. Using such models, case studies are conducted on intelligent-machining applications, such as the optimization of NC processing parameters and the health assurance of CNC machine tools.

  4. Ecotoxicological mechanisms and models in an impact analysis tool for oil spills

    NARCIS (Netherlands)

    Laender, de F.; Olsen, G.H.; Frost, T.; Grosvik, B.E.; Klok, T.C.

    2011-01-01

    In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consisted of three coupled ecological models that describe (1) plankton biomas

  6. Static Stiffness Modeling of a Novel PKM-Machine Tool Structure

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2014-07-01

    This article presents a new configuration of a 3-dof machine tool with parallel kinematics. Elastic deformations of the machine tool have been modeled with finite elements, and stiffness coefficients at characteristic points of the working area have been calculated for different cutting forces.

  7. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Science.gov (United States)

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.
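
    The "electrical circuit" analogy used in lumped circulatory models of this kind can be illustrated with the classic two-element Windkessel, where arterial compliance C acts as a capacitor and peripheral resistance R as a resistor. This is a generic sketch with invented parameter values, not the authors' model.

```python
# Two-element Windkessel sketch: C dP/dt = Q(t) - P/R, integrated with
# explicit Euler. All parameter values are illustrative.
import math

C = 1.2    # mL/mmHg, arterial compliance (illustrative)
R = 1.0    # mmHg*s/mL, peripheral resistance (illustrative)
T = 0.8    # s, cardiac period

def inflow(t):
    """Pulsatile inflow: half-sine ejection for the first 0.3 s of each beat."""
    phase = t % T
    return 400.0 * math.sin(math.pi * phase / 0.3) if phase < 0.3 else 0.0

dt, P = 1e-4, 80.0                       # start from 80 mmHg
trace = []
for step in range(int(10 * T / dt)):     # simulate 10 beats to steady state
    P += dt * (inflow(step * dt) - P / R) / C
    trace.append(P)

last_beat = trace[-int(T / dt):]
print(round(min(last_beat), 1), round(max(last_beat), 1))  # ~diastolic/systolic, mmHg
```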

  8. Econometric Model – A Tool in Financial Management

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2011-06-01

    The economic situation in Romania requires from the trader a rigorous analysis of the vulnerabilities and opportunities offered by the external environment, as well as a careful analysis of the internal environmental conditions in which the entity operates. In this context, particular attention is paid to the indicators presented in the financial statements. They often serve as a basis for economic forecasts and future plans in businesses that maintain a good forecasting activity. In this paper we propose to analyze the comparative evolution of the main financial indicators highlighted in the financial statements (profit and loss) through a multi-equation econometric model, namely a dynamic Keynesian model.
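
    The kind of multi-equation dynamic Keynesian model the paper alludes to can be sketched in a few lines; the coefficients below are invented for illustration, not estimated from the paper's data.

```python
# Toy dynamic Keynesian model: consumption follows lagged income, and
# income is the sum of consumption, investment and government spending.
#   C_t = a + b * Y_{t-1}
#   Y_t = C_t + I_t + G_t
a, b = 20.0, 0.6          # hypothetical consumption parameters
I, G = 30.0, 25.0         # exogenous investment and government spending

Y = 100.0
path = []
for t in range(60):
    Cons = a + b * Y      # consumption from last period's income
    Y = Cons + I + G
    path.append(Y)

# With |b| < 1 the path converges to Y* = (a + I + G) / (1 - b)
print(round(path[-1], 3), round((a + I + G) / (1 - b), 3))  # -> 187.5 187.5
```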

  9. An Introduction to Model Selection: Tools and Algorithms

    Directory of Open Access Journals (Sweden)

    Sébastien Hélie

    2006-03-01

    Model selection is a complicated matter in science, and psychology is no exception. In particular, the high variance in the object of study (i.e., humans) prevents the use of Popper’s falsification principle (which is the norm in other sciences). Therefore, the desirability of quantitative psychological models must be assessed by measuring the capacity of the model to fit empirical data. In the present paper, an error measure (likelihood), as well as five methods to compare model fits (the likelihood ratio test, Akaike’s information criterion, the Bayesian information criterion, bootstrapping and cross-validation), are presented. The use of each method is illustrated by an example, and the advantages and weaknesses of each method are also discussed.
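
    Two of the criteria mentioned above can be computed directly from a model's maximized log-likelihood L, parameter count k and sample size n: AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L. The fitted values below are invented for illustration.

```python
# Information criteria from (hypothetical) maximized log-likelihoods.
import math

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    return k * math.log(n) - 2 * log_lik

# Hypothetical fits: model B fits slightly better but uses more parameters.
fits = {"A": (-120.0, 3), "B": (-118.5, 6)}
n = 50
for name, (ll, k) in fits.items():
    print(name, round(aic(ll, k), 2), round(bic(ll, k, n), 2))
# With these numbers, both criteria penalize B's extra parameters and
# favor the simpler model A; BIC penalizes them more heavily.
```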

  10. Modeling and Calculator Tools for State and Local Transportation Resources

    Science.gov (United States)

    Air quality models, calculators, guidance and strategies are offered for estimating and projecting vehicle air pollution, including ozone or smog-forming pollutants, particulate matter and other emissions that pose public health and air quality concerns.

  11. Tools and Algorithms to Link Horizontal Hydrologic and Vertical Hydrodynamic Models and Provide a Stochastic Modeling Framework

    Science.gov (United States)

    Salah, Ahmad M.; Nelson, E. James; Williams, Gustavious P.

    2010-04-01

    We present algorithms and tools we developed to automatically link an overland flow model to a hydrodynamic water quality model with different spatial and temporal discretizations. These tools run the linked models, which provide a stochastic simulation framework. We also briefly present the tools and algorithms we developed to facilitate and analyze stochastic simulations of the linked models. We demonstrate the algorithms by linking the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model for overland flow with the CE-QUAL-W2 model for water quality and reservoir hydrodynamics. GSSHA uses a two-dimensional horizontal grid while CE-QUAL-W2 uses a two-dimensional vertical grid. We implemented the algorithms and tools in the Watershed Modeling System (WMS) which allows modelers to easily create and use models. The algorithms are general and could be used for other models. Our tools create and analyze stochastic simulations to help understand uncertainty in the model application. While a number of examples of linked models exist, the ability to perform automatic, unassisted linking is a step forward and provides the framework to easily implement stochastic modeling studies.
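
    One small piece of the linking problem can be illustrated directly: the two models use different time steps, so one model's flux series must be resampled onto the other's time grid before it can serve as a boundary condition. The series and values below are invented; the actual WMS tools also handle spatial mapping between the horizontal and vertical grids.

```python
# Temporal resampling of a (hypothetical) hourly overland-flow series
# onto a 15-minute water-quality time grid, with a volume-conservation
# check via the trapezoid rule.
import numpy as np

t_hydro = np.linspace(0.0, 24.0, 25)            # hourly hydrologic output
q_hydro = 10.0 * np.sin(np.pi * t_hydro / 24.0) # illustrative flow, m^3/s

t_wq = np.linspace(0.0, 24.0, 97)               # 15-minute water-quality steps
q_wq = np.interp(t_wq, t_hydro, q_hydro)        # linear resampling

def trapezoid(q, t):
    return float(np.sum((q[1:] + q[:-1]) / 2.0 * np.diff(t)))

v_in, v_out = trapezoid(q_hydro, t_hydro), trapezoid(q_wq, t_wq)
print(round(v_in, 6), round(v_out, 6))  # equal: linear resampling conserves volume
```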

  13. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    The report addresses the accuracy of model forecasts of currents in coastal areas. The MVV module is implemented as part of the Geospatial Analysis and Model Evaluation Software

  14. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by the current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.
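
    The kind of model BNW builds can be illustrated with a minimal discrete Bayesian network. The three-node chain below (genotype → expression → trait) and all probability tables are invented; inference is done by direct enumeration rather than BNW's methods.

```python
# Toy discrete Bayesian network with inference by enumeration.
# All conditional probability tables are made-up illustrative numbers.
P_g = {0: 0.5, 1: 0.5}                                 # genotype prior
P_e_g = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}     # expression | genotype
P_t_e = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.25, 1: 0.75}}   # trait | expression

def p_trait(t):
    """P(T=t) by summing over the hidden variables."""
    return sum(P_g[g] * P_e_g[g][e] * P_t_e[e][t]
               for g in P_g for e in (0, 1))

def p_genotype_given_trait(g, t):
    """Bayes' rule: P(G=g | T=t)."""
    joint = sum(P_g[g] * P_e_g[g][e] * P_t_e[e][t] for e in (0, 1))
    return joint / p_trait(t)

print(round(p_trait(1), 4), round(p_genotype_given_trait(1, 1), 4))  # -> 0.3925 0.707
```

    Observing the trait raises the posterior probability of genotype 1 from 0.5 to about 0.71, the sort of testable relationship such network models expose.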

  15. SARAH 4: A tool for (not only SUSY) model builders

    CERN Document Server

    Staub, Florian

    2013-01-01

    We present the new version of the Mathematica package SARAH which provides the same features for a non-supersymmetric model as previous versions for supersymmetric models. This includes an easy and straightforward definition of the model, the calculation of all vertices, mass matrices, tadpole equations, and self-energies. Also the two-loop renormalization group equations for a general gauge theory are now included and have been validated with the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD and in the UFO format can be written, and source code for SPheno for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states can be generated. Furthermore, the new version includes routines to output model files for Vevacious for both, supersymmetric and non-supersymmetric, models. Global symmetries are also supported with this version and by linking Susyno the handling of Lie groups has been improved and extended.

  17. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    Science.gov (United States)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedentedly stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace of hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycles and costs. This goal has three elements: a reactor-scale model, a feature-level model, and a database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk aims at a description of the various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders-of-magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces.
Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such
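
    The level-set approach mentioned for microtopography evolution can be sketched in one dimension: a surface front is tracked implicitly as the zero crossing of a function phi advanced with phi_t + F |dphi/dx| = 0. The constant etch rate and all numbers below are illustrative, not from the talk.

```python
# Toy 1-D level set: track a front as the zero crossing of phi,
# advanced with a Godunov upwind scheme. F = constant speed.
import numpy as np

nx, dx, F, dt = 200, 0.01, 1.0, 0.004
x = np.arange(nx) * dx
phi = x - 0.5                        # signed distance; front starts at x = 0.5

for _ in range(250):                 # advance to t = 250 * dt = 1.0
    dminus = np.diff(phi, prepend=phi[0]) / dx
    dplus = np.diff(phi, append=phi[-1]) / dx
    grad = np.maximum(dminus, 0.0)**2 + np.minimum(dplus, 0.0)**2
    phi = phi - dt * F * np.sqrt(grad)   # phi_t = -F |grad phi|

front = x[np.argmin(np.abs(phi))]    # recover the front position
print(round(float(front), 2))        # analytic answer: 0.5 + F * t = 1.5
```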

  18. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate systems for predicting smoke into an air quality modelling system, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed; the new module calculates, at each time step, the fuel moisture content of dead and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtaining precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emission inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results for forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)
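
    A fuel moisture content module of the kind described above is, in many fire models, a first-order lag toward an equilibrium moisture content. This sketch uses that standard timelag form with invented values; it is not the authors' actual formulation.

```python
# Timelag fuel moisture sketch: m(t+dt) = m_eq + (m - m_eq) * exp(-dt/tau).
# Dead fuels respond much faster (smaller tau) than live fuels.
import math

def step_moisture(m, m_eq, tau_h, dt_h):
    return m_eq + (m - m_eq) * math.exp(-dt_h / tau_h)

m_dead, m_live = 0.15, 1.0        # fractional moisture (illustrative)
tau_dead, tau_live = 10.0, 240.0  # timelags in hours (illustrative)
m_eq = 0.08                       # dry-afternoon equilibrium (illustrative)

for _ in range(24):               # 24 hourly steps of constant dry weather
    m_dead = step_moisture(m_dead, m_eq, tau_dead, 1.0)
    m_live = step_moisture(m_live, m_eq, tau_live, 1.0)

print(round(m_dead, 3), round(m_live, 3))  # -> 0.086 0.912
```

    After a dry day the dead fuels have nearly equilibrated while the live fuels have barely moved, which is why a per-time-step moisture module matters for fire-behavior accuracy.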

  20. Cooperative development of logical modelling standards and tools with CoLoMoTo.

    Science.gov (United States)

    Naldi, Aurélien; Monteiro, Pedro T; Müssel, Christoph; Kestler, Hans A; Thieffry, Denis; Xenarios, Ioannis; Saez-Rodriguez, Julio; Helikar, Tomas; Chaouiya, Claudine

    2015-04-01

    The identification of large regulatory and signalling networks involved in the control of crucial cellular processes calls for proper modelling approaches. Indeed, models can help elucidate properties of these networks, understand their behaviour and provide (testable) predictions by performing in silico experiments. In this context, qualitative, logical frameworks have emerged as relevant approaches, as demonstrated by a growing number of published models, along with new methodologies and software tools. This productive activity now requires a concerted effort to ensure model reusability and interoperability between tools. Following an outline of the logical modelling framework, we present the most important achievements of the Consortium for Logical Models and Tools, along with future objectives. Our aim is to advertise this open community, which welcomes contributions from all researchers interested in logical modelling or in related mathematical and computational developments.
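
    The logical (Boolean) formalism the consortium standardizes can be shown with a minimal example: a three-gene toy network (invented, not from any published model) whose synchronous dynamics are iterated until an attractor is found.

```python
# Toy synchronous Boolean network and attractor detection.
RULES = {
    "A": lambda s: not s["C"],          # C represses A
    "B": lambda s: s["A"],              # A activates B
    "C": lambda s: s["A"] and s["B"],   # A and B jointly activate C
}

def step(state):
    """Synchronous update: all rules read the previous state."""
    return {g: f(state) for g, f in RULES.items()}

def attractor_from(state):
    """Iterate until a state repeats; return the cycle (the attractor)."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return seen[seen.index(state):]

cycle = attractor_from({"A": True, "B": False, "C": False})
print(len(cycle))   # length of the attractor reached from this state
```

    In silico experiments of the kind the abstract mentions amount to exhausting such state transitions (here the trajectory falls into a 5-state cycle) and checking how perturbed rules change the attractors.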

  1. The synergy professional practice model and its patient characteristics tool: a staff empowerment strategy.

    Science.gov (United States)

    MacPhee, Maura; Wardrop, Andrea; Campbell, Cheryl; Wejr, Patricia

    2011-10-01

    Nurse leaders can positively influence practice environments through a number of empowerment strategies, among them professional practice models. These models encompass the philosophy, structures and processes that support nurses' control over their practice and their voice within healthcare organizations. Nurse-driven professional practice models can serve as a framework for collaborative decision-making among nursing and other staff. This paper describes a province-wide pilot project in which eight nurse-led project teams in four healthcare sectors worked with the synergy professional practice model and its patient characteristics tool. The teams learned how the model and tool can be used to classify patients' acuity levels and make staffing assignments based on a "best fit" between patient needs and staff competencies. The patient characteristics tool scores patients' acuities on eight characteristics such as stability, vulnerability and resource availability. This tool can be used to make real-time patient assessments. Other potential applications for the model and tool are presented, such as care planning, team-building and determining appropriate staffing levels. Our pilot project evidence suggests that the synergy model and its patient characteristics tool may be an empowerment strategy that nursing leaders can use to enhance their practice environments.
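
    A scoring tool of this shape can be sketched as follows. Only three characteristic names (stability, vulnerability, resource availability) come from the abstract; the remaining names, the 1-5 rating scale, and the acuity thresholds are entirely hypothetical.

```python
# Hypothetical acuity scorer: rate eight characteristics 1-5 (lower =
# sicker, as a working assumption) and map the mean to an acuity level.
CHARACTERISTICS = [
    "stability", "vulnerability", "resource_availability",   # from abstract
    "complexity", "predictability", "resiliency",            # invented names
    "participation", "decision_making",
]

def acuity_level(ratings):
    missing = [c for c in CHARACTERISTICS if c not in ratings]
    if missing:
        raise ValueError(f"unrated characteristics: {missing}")
    mean = sum(ratings[c] for c in CHARACTERISTICS) / len(CHARACTERISTICS)
    return "high" if mean <= 2 else "moderate" if mean <= 3.5 else "low"

ratings = {c: 2 for c in CHARACTERISTICS}   # a fairly unstable patient
print(acuity_level(ratings))                # -> high
```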

  2. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling

    Directory of Open Access Journals (Sweden)

    Raul Torres-Ruiz

    2015-09-01

    The cancer-modelling field is now experiencing a conversion with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has been proven as a robust technology that makes it possible to generate cellular and animal models that recapitulate those cooperative alterations rapidly and at low cost. In this review, we will discuss the innovative applications of the CRISPR-Cas9 system to generate new models, providing a new way to interrogate the development and progression of cancers.

  4. Model-Based Design Tools for Extending COTS Components To Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  5. Model-Based Design Tools for Extending COTS Components To Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  6. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  7. The Integrated Medical Model: A Decision Support Tool for In-flight Crew Health Care

    Science.gov (United States)

    Butler, Doug

    2009-01-01

    This viewgraph presentation reviews the development of an Integrated Medical Model (IMM) decision support tool for in-flight crew health care safety. Clinical methods, resources, and case scenarios are also addressed.

  8. Physical Modeling of Contact Processes on the Cutting Tools Surfaces of STM When Turning

    Science.gov (United States)

    Belozerov, V. A.; Uteshev, M. H.

    2016-08-01

    This article describes the creation of an optimization model of the fine-turning process for superalloys and steels with cutting tools made of superhard materials (STM) on CNC machines, flexible manufacturing modules (GPM) and machining centers. The optimization model links the contact processes occurring simultaneously on the rake and flank faces of the STM tool, making it possible to manage those contact processes and the dynamic strength of the cutting edge at the tool tip. The established optimization model for managing the dynamic strength of STM cutters during fine turning is based on a previously developed thermomechanical (physical, thermal) model, which allows a systematic thermomechanical approach to selecting STM grades (domestic and foreign) for cutting tools intended for fine turning of heat-resistant alloys and steels.

  9. Multi-Physics Computational Modeling Tool for Materials Damage Assessment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is to provide a multi-physics modeling tool for materials damage assessment for application to future aircraft design. The software...

  10. The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods

    NARCIS (Netherlands)

    Verpoorten, Dominique; Poumay, M; Leclercq, D

    2006-01-01

    Please, cite this publication as: Verpoorten, D., Poumay, M., & Leclercq, D. (2006). The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence

  11. High Performance Computing tools for the Integrated Tokamak Modelling project

    Energy Technology Data Exchange (ETDEWEB)

    Guillerminet, B., E-mail: bernard.guillerminet@cea.f [Association Euratom-CEA sur la Fusion, IRFM, DSM, CEA Cadarache (France); Plasencia, I. Campos [Instituto de Fisica de Cantabria (IFCA), CSIC, Santander (Spain); Haefele, M. [Universite Louis Pasteur, Strasbourg (France); Iannone, F. [EURATOM/ENEA Fusion Association, Frascati (Italy); Jackson, A. [University of Edinburgh (EPCC) (United Kingdom); Manduchi, G. [EURATOM/ENEA Fusion Association, Padova (Italy); Plociennik, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland); Sonnendrucker, E. [Universite Louis Pasteur, Strasbourg (France); Strand, P. [Chalmers University of Technology (Sweden); Owsiak, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland)

    2010-07-15

    Fusion modelling and simulation are very challenging, and the High Performance Computing issues are addressed here. Toolsets for job launching and scheduling, data communication and visualization have been developed by the EUFORIA project and used with a plasma edge simulation code.

  12. Cognitive Bargaining Model: An Analysis Tool for Third Party Incentives?

    Science.gov (United States)

    2009-12-01


  13. Modeling mind-wandering: a tool to better understand distraction

    NARCIS (Netherlands)

    van Vugt, Marieke; Taatgen, Niels; Sackur, Jerome; Bastian, Mikael; Taatgen, Niels; van Vugt, Marieke; Borst, Jelmer; Mehlhorn, Katja

    2015-01-01

    When we get distracted, we may engage in mind-wandering, or task-unrelated thinking, which impairs performance on cognitive tasks. Yet, we do not have cognitive models that make this process explicit. On the basis of both recent experiments that have started to investigate mind-wandering and introspection

  14. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation serves

  15. Modeling tools to Account for Ethanol Impacts on BTEX Plumes

    Science.gov (United States)

    Widespread usage of ethanol in gasoline leads to impacts at leak sites which differ from those of non-ethanol gasolines. The presentation reviews current research results on the distribution of gasoline and ethanol, biodegradation, phase separation and cosolvancy. Model results f...

  16. Enhancing Technology-Mediated Communication: Tools, Analyses, and Predictive Models

    Science.gov (United States)

    2007-09-01


  17. Inverse thermal history modelling as a hydrocarbon exploration tool

    Energy Technology Data Exchange (ETDEWEB)

    Gallagher, K. [Imperial College of Science, Technology and Medicine, London (United Kingdom). TH Huxley School of Environment, Earth Science and Engineering

    1998-12-31

    Thermal history modelling is a significant part of hydrocarbon exploration and resource assessment. Its primary use is to predict the volume and timing of hydrocarbon generation as a sedimentary basin evolves on timescales of 10^7-10^8 years. Forward modelling is commonly used to constrain the thermal history in sedimentary basins. Alternatively, inversion schemes may be used which have many advantages over the conventional forward modelling approach. An example of an inversion approach is presented here, wherein the preferred philosophy is to find the least complex model that fits the data. In this case, we estimate a heat flow function (of time) which provides an adequate fit to the available thermal indicator calibration data. The function is also constrained to be smooth, in either a first or second derivative sense. Extra complexity or structure is introduced into the function only where required to fit the data and the regularization stabilizes the inversion. The general formulation is presented and a real data example from the North Slope, Alaska is discussed. (author)
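    The smoothness-constrained inversion idea described above can be illustrated with a generic Tikhonov-style sketch (synthetic data and a toy forward operator; this is a textbook illustration, not the author's actual heat-flow inversion scheme):

```python
import numpy as np

# Estimate a smooth function f(t) from noisy indirect data d = G @ f + noise,
# penalizing second-derivative roughness so that extra structure appears only
# where the data demand it (the philosophy described in the abstract).
rng = np.random.default_rng(1)
n = 60
t = np.linspace(0.0, 1.0, n)
f_true = 1.0 + 0.5 * np.sin(2 * np.pi * t)   # "true" smooth history (invented)

G = np.tril(np.ones((n, n))) / n             # toy integrating forward operator
d = G @ f_true + rng.normal(0, 0.005, n)     # noisy synthetic data

# Second-difference operator L penalizes roughness of the estimate.
L = np.diff(np.eye(n), n=2, axis=0)

lam = 1e-2                                   # regularization weight (assumed)
A = np.vstack([G, lam * L])
b = np.concatenate([d, np.zeros(L.shape[0])])
f_est, *_ = np.linalg.lstsq(A, b, rcond=None)

misfit = float(np.linalg.norm(G @ f_est - d))
print("data misfit:", round(misfit, 4))
```

Increasing `lam` trades data fit for smoothness; the regularization also stabilizes the otherwise ill-conditioned inversion, exactly as described above.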

  18. Mathematical modelling: a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B.

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  19. Mathematical modelling : a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, H; Hellriegel, B

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  20. Mathematical modelling: a tool for hospital infection control.

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  1. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  2. Numerical Tools for the Bayesian Analysis of Stochastic Frontier Models

    NARCIS (Netherlands)

    Osiewalski, J.; Steel, M.F.J.

    1996-01-01

    In this paper we describe the use of modern numerical integration methods for making posterior inferences in composed error stochastic frontier models for panel data or individual cross-sections. Two Monte Carlo methods have been used in practical applications. We survey these two methods in some

  3. Interactive model evaluation tool based on IPython notebook

    Science.gov (United States)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion used to measure the goodness of fit (a likelihood or any objective function) is an essential step in all of these methodologies and will affect the final selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, the optimization process is an iterative procedure, so in the course of the modelling process an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded, along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. The user selects the two parameters to be visualised, an objective function, and a time period of interest. Based on this information, a two-dimensional parameter response surface is created: a scatter plot of the parameter combinations with a color scale corresponding to the goodness of fit of each parameter combination. Finally, a slider is available to change the color mapping of the points: it provides a threshold to exclude non-behavioural parameter sets, and the color scale is only attributed to the
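    The scoring-and-thresholding step described above can be sketched as follows. This is a minimal, self-contained illustration with a synthetic two-parameter model; the model, the RMSE objective and all values are assumptions for demonstration, not the authors' code:

```python
import numpy as np

def rmse(sim, obs):
    """Objective function: root-mean-square error over the chosen period."""
    return np.sqrt(np.mean((sim - obs) ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)

def model(a, b):
    """Hypothetical two-parameter model standing in for a hydrological simulation."""
    return a * np.exp(-b * t)

obs = model(2.0, 0.3) + rng.normal(0, 0.05, t.size)  # synthetic "observations"

# Monte Carlo sample of the 2-D parameter space
params = rng.uniform([0.5, 0.05], [4.0, 1.0], size=(500, 2))
scores = np.array([rmse(model(a, b), obs) for a, b in params])

# GLUE-style split: parameter sets under the threshold are 'behavioural';
# in the interactive tool this threshold is what the slider controls.
threshold = 0.1
behavioural = scores <= threshold
print(behavioural.sum(), "of", len(params), "parameter sets are behavioural")
```

A scatter plot of `params` colored by `scores`, with `behavioural` masking the color map, reproduces the two-dimensional response surface the tool displays.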

  4. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    OpenAIRE

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes...

  5. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  6. Road traffic pollution monitoring and modelling tools and the UK national air quality strategy.

    OpenAIRE

    Marsden, G.R.; Bell, M.C.

    2001-01-01

    This paper provides an assessment of the tools required to fulfil the air quality management role now expected of local authorities within the UK. The use of a range of pollution monitoring tools in assessing air quality is discussed and illustrated with evidence from a number of previous studies of urban background and roadside pollution monitoring in Leicester. A number of approaches to pollution modelling currently available for deployment are examined. Subsequently, the modelling and moni...

  7. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and, or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and, or observed hot-spots and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
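    The reduction from full-field pixel data to a small vector of image descriptors can be sketched with a low-order Fourier decomposition (a generic illustration of the idea; the paper itself uses Zernike moments and Fourier transforms on measured strain fields, and the fields below are synthetic):

```python
import numpy as np

def descriptors(field, k=5):
    """Keep only the k x k lowest-frequency 2-D Fourier coefficient magnitudes
    as image descriptors -- a reduction from ~10^4 pixels to ~10^1 numbers,
    analogous to the decomposition described above."""
    F = np.fft.fft2(field)
    return np.abs(F[:k, :k]).ravel()

x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
model_strain = np.exp(-(x**2 + y**2) / 0.2)          # "numerical model" field
exper_strain = model_strain + np.random.default_rng(2).normal(0, 0.01, x.shape)  # "measurement"

d_model = descriptors(model_strain)
d_exper = descriptors(exper_strain)

# Quantitative comparison in descriptor space rather than pixel space
rel_dist = float(np.linalg.norm(d_model - d_exper) / np.linalg.norm(d_model))
print("relative descriptor distance:", round(rel_dist, 4))
```

Here 16384 pixels per field collapse to 25 descriptors, so a statistical comparison between experiment and model becomes straightforward.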

  8. The generalized mathematical model of the failure of the cutting tool

    Science.gov (United States)

    Pasko, N. I.; Antsev, A. V.; Antseva, N. V.; Fyodorov, V. P.

    2017-02-01

    We offer a mathematical model which takes into account the following factors: the spread of the cutting properties of the tool, the spread of the workpiece blank parameters, and the possibility of fracture of the cutting wedge. The reliability function, taking into account the above-mentioned factors, has five parameters, for whose assessment we propose a method based on our experience. A numerical illustration of the method is shown in the article. We suggest using the model when optimizing preventive maintenance of the cutting tool.
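    The five-parameter reliability function itself is not given in the abstract; as a hypothetical illustration of the same idea, a competing-risks reliability model combining gradual wear (Weibull) with sudden fracture of the cutting wedge (exponential) might look like this (all names and parameter values are invented for demonstration):

```python
import math

def reliability(t, eta=60.0, beta=2.5, lam=0.002):
    """Probability the tool survives t minutes of cutting.
    Competing risks: gradual wear-out (Weibull shape beta, scale eta)
    in series with sudden edge fracture (exponential rate lam)."""
    wear = math.exp(-((t / eta) ** beta))
    fracture = math.exp(-lam * t)
    return wear * fracture

# Survival probability after 30 minutes of cutting
print(round(reliability(30.0), 3))
```

A preventive tool change can then be scheduled at the time where `reliability(t)` drops below an acceptable level.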

  9. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    Directory of Open Access Journals (Sweden)

    Maike Kathrin Aurich

    2016-08-01

    Full Text Available Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  10. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models.

    Science.gov (United States)

    Aurich, Maike K; Fleming, Ronan M T; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
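    The constraint-based modeling that MetaboTools builds on can be sketched independently of the toolbox. The following toy flux-balance analysis (FBA) is a generic illustration in Python (MetaboTools itself is MATLAB code; the network, bounds and reaction names here are invented):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass, at steady state S @ v = 0.
#              v_upt  v_rxn  v_bio
S = np.array([[ 1.0,  -1.0,   0.0],   # metabolite A balance
              [ 0.0,   1.0,  -1.0]])  # metabolite B balance

# Flux bounds: uptake capped at 10 units (e.g. a measured exchange flux
# integrated from extracellular metabolomic data), internal fluxes bounded.
bounds = [(0, 10), (0, 100), (0, 100)]

# Maximize biomass flux v_bio  <=>  minimize -v_bio subject to S @ v = 0
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("max biomass flux:", res.x[2])
```

With these bounds the optimum routes 10 units through the whole chain, so the maximum biomass flux is 10; constraining the uptake bound from measured data is exactly how extracellular metabolomics enters the model.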

  11. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  12. A NUI Based Multiple Perspective Variability Modelling CASE Tool

    OpenAIRE

    Bashroush, Rabih

    2010-01-01

    With current trends towards moving variability from hardware to software, and given the increasing desire to postpone design decisions as much as is economically feasible, managing the variability from requirements elicitation to implementation is becoming a primary business requirement in the product line engineering process. One of the main challenges in variability management is the visualization and management of industry size variability models. In this demonstrat...

  13. Mathematical modelling: a tool for hospital infection control.

    OpenAIRE

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associate...

  14. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling

    OpenAIRE

    Raul Torres-Ruiz; Sandra Rodriguez-Perales

    2015-01-01

    The cancer-modelling field is now experiencing a conversion with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has been proven as a ...

  15. The Visible Signature Modelling and Evaluation ToolBox

    Science.gov (United States)

    2008-12-01


  16. 3D model tools for architecture and archaeology reconstruction

    Science.gov (United States)

    Vlad, Ioan; Herban, Ioan Sorin; Stoian, Mircea; Vilceanu, Clara-Beatrice

    2016-06-01

    The main objective of architectural and patrimonial survey is to provide a precise documentation of the status quo of the surveyed objects (monuments, buildings, archaeological object and sites) for preservation and protection, for scientific studies and restoration purposes, for the presentation to the general public. Cultural heritage documentation includes an interdisciplinary approach having as purpose an overall understanding of the object itself and an integration of the information which characterize it. The accuracy and the precision of the model are directly influenced by the quality of the measurements realized on field and by the quality of the software. The software is in the process of continuous development, which brings many improvements. On the other side, compared to aerial photogrammetry, close range photogrammetry and particularly architectural photogrammetry is not limited to vertical photographs with special cameras. The methodology of terrestrial photogrammetry has changed significantly and various photographic acquisitions are widely in use. In this context, the present paper brings forward a comparative study of TLS (Terrestrial Laser Scanner) and digital photogrammetry for 3D modeling. The authors take into account the accuracy of the 3D models obtained, the overall costs involved for each technology and method and the 4th dimension - time. The paper proves its applicability as photogrammetric technologies are nowadays used at a large scale for obtaining the 3D model of cultural heritage objects, efficacious in their assessment and monitoring, thus contributing to historic conservation. Its importance also lies in highlighting the advantages and disadvantages of each method used - very important issue for both the industrial and scientific segment when facing decisions such as in which technology to invest more research and funds.

  17. Toxicokinetic models and related tools in environmental risk assessment of chemicals.

    Science.gov (United States)

    Grech, Audrey; Brochot, Céline; Dorne, Jean-Lou; Quignot, Nadia; Bois, Frédéric Y; Beaudouin, Rémy

    2017-02-01

    Environmental risk assessment of chemicals for the protection of ecosystems integrity is a key regulatory and scientific research field which is undergoing constant development in modelling approaches and harmonisation with human risk assessment. This review focuses on state-of-the-art toxicokinetic tools and models that have been applied to terrestrial and aquatic species relevant to environmental risk assessment of chemicals. Both empirical and mechanistic toxicokinetic models are discussed using the results of extensive literature searches together with tools and software for their calibration and an overview of applications in environmental risk assessment. These include simple tools such as one-compartment models, multi-compartment models to physiologically-based toxicokinetic (PBTK) models, mostly available for aquatic species such as fish species and a number of chemical classes including plant protection products, metals, persistent organic pollutants, nanoparticles. Data gaps and further research needs are highlighted.
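    The simplest tool in the hierarchy surveyed above, the one-compartment toxicokinetic model, can be written down directly (parameter names and values are illustrative, not taken from the review):

```python
import numpy as np

# One-compartment toxicokinetic model: a single well-mixed compartment with
# constant uptake and first-order elimination,
#   dC/dt = uptake / V - k_e * C
k_e = 0.2      # first-order elimination rate constant (1/day), assumed
uptake = 1.0   # constant uptake rate (ug/day), assumed
V = 1.0        # volume of distribution (L), assumed

# Analytic solution from C(0) = 0: C(t) = (uptake / (V * k_e)) * (1 - exp(-k_e * t))
t = np.linspace(0, 30, 301)
C = uptake / (V * k_e) * (1 - np.exp(-k_e * t))

print("steady-state concentration:", uptake / (V * k_e))   # 5.0 ug/L
print("concentration at day 30:", round(float(C[-1]), 3))
```

Multi-compartment and PBTK models extend this same mass-balance idea to several physiologically meaningful compartments coupled by blood flow.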

  18. Peer Assessment with Online Tools to Improve Student Modeling

    Science.gov (United States)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  19. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have effectively used Ideogramic UML to teach object-oriented modeling and the UML to groups of students using the UML for project assignments.

  20. Conceptual Model As The Tool For Managing Bank Services Quality

    Directory of Open Access Journals (Sweden)

    Kornelija Severović

    2009-07-01

    Full Text Available Quality has become a basic factor of economic efficiency and a basic principle of the business activities of successful organizations. Its consequence is a revolution in the area of quality that has encompassed all kinds of products and services, banking services included. Understanding the present and future needs of clients, and knowing how to meet and try to exceed their expectations, is the task of every efficient economy. Banks in developed economies therefore try to reorient their business organizationally, technologically and informationally, placing the client at the core of their activity. Significant indicators of the quality of the services a bank offers are the time clients wait for the desired service and the number of clients who give up entering the bank because of long queues. A dissatisfied client is the worst possible outcome of a bank's work. Accordingly, great effort is put into improving service quality, which encompasses the professionalism and communication skills of the personnel with whom clients come into contact, punctual and clear information, and short waiting times in queues. The aim of this work is to present and describe the functioning of the banking system under the conditions of establishing quality in offering services to clients, and to identify basic guidelines for increasing the quality of work in sub-branches. Since banking is a very dynamic and complex system, a conceptual model is developed to optimize the stated quality parameters of bank business activity; in further research, this model will serve for the development of a simulation model.
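    The waiting-time indicators discussed above can be illustrated with the standard M/M/1 queueing formulas (a minimal stand-in; the paper itself develops a richer conceptual and simulation model, and the rates below are invented):

```python
# M/M/1 queue: Poisson arrivals, exponential service, a single teller.
lam = 0.8   # client arrival rate (clients/minute), assumed
mu = 1.0    # teller service rate (clients/minute), assumed

rho = lam / mu                # utilization; must be < 1 for a stable queue
Wq = rho / (mu - lam)         # mean waiting time in the queue
Lq = lam * Wq                 # mean queue length (Little's law)

print(f"utilization: {rho:.2f}, mean wait: {Wq:.1f} min, mean queue: {Lq:.2f}")
```

Even this toy model shows why waiting time explodes as utilization approaches 1, which is the mechanism behind clients abandoning long queues.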

  1. ENISI SDE: A New Web-Based Tool for Modeling Stochastic Processes.

    Science.gov (United States)

    Mei, Yongguo; Carbo, Adria; Hoops, Stefan; Hontecillas, Raquel; Bassaganya-Riera, Josep

    2015-01-01

    Modeling and simulation approaches have been widely used in computational biology, mathematics, bioinformatics and engineering to represent complex existing knowledge and to effectively generate novel hypotheses. While deterministic modeling strategies are widely used in computational biology, stochastic modeling techniques are not as popular due to a lack of user-friendly tools. This paper presents ENISI SDE, a novel web-based modeling tool with stochastic differential equations. ENISI SDE provides user-friendly web user interfaces to facilitate adoption by immunologists and computational biologists. This work provides three major contributions: (1) discussion of SDE as a generic approach for stochastic modeling in computational biology; (2) development of ENISI SDE, a web-based user-friendly SDE modeling tool that highly resembles regular ODE-based modeling; (3) applying the ENISI SDE modeling tool through a use case for studying stochastic sources of cell heterogeneity in the context of CD4+ T cell differentiation. The CD4+ T cell differentiation ODE model has been published [8] and can be downloaded from biomodels.net. The case study reproduces a biological phenomenon that is not captured by the previously published ODE model and shows the effectiveness of SDE as a stochastic modeling approach in biology in general and immunology in particular, and the power of ENISI SDE.
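    The SDE-based approach that ENISI SDE exposes through its web interface can be sketched generically with Euler-Maruyama integration (a textbook illustration, not ENISI SDE code; the process and parameters are invented):

```python
import numpy as np

# Euler-Maruyama integration of an Ornstein-Uhlenbeck process,
#   dX = theta * (mu - X) dt + sigma dW,
# a minimal example of the SDE modeling approach described above.
theta, mu, sigma = 1.0, 2.0, 0.3
dt, n_steps, n_paths = 0.01, 1000, 500
rng = np.random.default_rng(3)

X = np.zeros(n_paths)               # all paths start at X(0) = 0
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    X += theta * (mu - X) * dt + sigma * dW

# After t = 10 the ensemble has relaxed toward the mean-reversion level mu = 2,
# with stationary standard deviation sigma / sqrt(2 * theta) ~ 0.21; the spread
# across paths is the kind of cell-to-cell heterogeneity an ODE cannot capture.
print("ensemble mean:", round(float(X.mean()), 2))
print("ensemble std:", round(float(X.std()), 2))
```

An ODE with the same drift would send every path to exactly mu; the diffusion term is what generates a distribution of outcomes.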

  2. Modelling as an indispensable research tool in the information society.

    Science.gov (United States)

    Bouma, Johan

    2016-04-01

Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than producing clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow the generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. the effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence.
Only clear results in the end are convincing proof of the impact of science, requiring therefore the continued involvement of scientists up to the very end of projects. To

  3. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors of the working environment and workplace injuries, and by eliminating emerging occupational diseases. These tools are categorized in the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the problems posed.

  4. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

In the spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  5. Stochastic Modelling as a Tool for Seismic Signals Segmentation

    Directory of Open Access Journals (Sweden)

    Daniel Kucharczyk

    2016-01-01

Full Text Available In order to model nonstationary real-world processes, one can find an appropriate theoretical model with properties following the analyzed data. However, in this case many trajectories of the analyzed process are required. Alternatively, one can extract parts of the signal that have a homogeneous structure via segmentation. Proper segmentation can lead to the extraction of important features of the analyzed phenomena that cannot be described without it. There is no one universal method that can be applied to all phenomena; thus novel methods should be invented for specific cases. They might address the specific character of the signal in different domains (time, frequency, time-frequency, etc.). In this paper we propose two novel segmentation methods that take into consideration the stochastic properties of the analyzed signals in the time domain. Our research is motivated by the analysis of vibration signals acquired in an underground mine. In such signals we observe seismic events which appear after mining activity, like blasting or provoked relaxation of rock, and some unexpected events, like natural rock bursts. The proposed segmentation procedures allow for the extraction of those parts of the analyzed signals which are related to the mentioned events.
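A minimal illustration of segmentation driven by the stochastic properties of a signal (here, a change in variance, as between quiet background vibration and a seismic event) can be sketched as follows. The two-segment Gaussian log-likelihood cost is a common textbook choice for variance changepoints and stands in for, rather than reproduces, the paper's procedures:

```python
import math
import random

def best_split(signal, min_len=5):
    """Return the split index minimizing the two-segment Gaussian cost
    n1*log(var1) + n2*log(var2). Unlike a plain sum-of-squares split,
    this cost responds to changes in variance, not only in mean."""
    def seg_cost(seg):
        m = sum(seg) / len(seg)
        var = sum((v - m) ** 2 for v in seg) / len(seg)
        return len(seg) * math.log(max(var, 1e-12))
    best_k, best_c = None, float("inf")
    for k in range(min_len, len(signal) - min_len):
        c = seg_cost(signal[:k]) + seg_cost(signal[k:])
        if c < best_c:
            best_k, best_c = k, c
    return best_k

# Synthetic data: quiet background followed by a high-variance "event"
rng = random.Random(0)
sig = [rng.gauss(0.0, 0.1) for _ in range(100)] + \
      [rng.gauss(0.0, 2.0) for _ in range(100)]
split = best_split(sig)   # recovered boundary, near index 100
```

Applying the split recursively to each segment (binary segmentation) extends the idea to signals with several events.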

  6. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, the same parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, it also makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal of this modeling tool is to be a user-friendly modeling tool for developing fish population models useful to natural resource
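The iteration/time-step structure described above can be sketched as a small Monte Carlo projection in which a new growth rate is drawn at every time step of every iteration, i.e. all variance enters at the iteration and time-step levels with no partitioning. All parameter values below are hypothetical placeholders, not pallid sturgeon estimates, and the real tool models full demographic structure rather than a single growth rate:

```python
import math
import random

def project_population(n0, n_years, n_iterations, mean_r, sd_r, seed=42):
    """Monte Carlo population projection. Each iteration is one
    stochastic replicate; each time step draws a fresh log growth
    rate, so variance is applied at both levels jointly."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_iterations):
        n = n0
        for _ in range(n_years):
            r = rng.gauss(mean_r, sd_r)   # stochastic log growth rate
            n = n * math.exp(r)
        finals.append(n)
    return finals

# Hypothetical inputs: 1000 fish, 20 years, 500 replicates
finals = project_population(n0=1000.0, n_years=20, n_iterations=500,
                            mean_r=0.0, sd_r=0.1)
```

The spread of `finals` across iterations is what a partitioned-variance model would further decompose into parameter and temporal components.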

  7. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar

    2004-01-01

The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools… The report contains both the description of DIgSILENT built-in models for the electrical components of a grid connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). The initialisation of the wind turbine models in the power system simulation is also presented. However, the main attention in this report is drawn to the modelling at the system level of two wind turbine concepts: 1…

  8. Software tool for the prosthetic foot modeling and stiffness optimization.

    Science.gov (United States)

    Strbac, Matija; Popović, Dejan B

    2012-01-01

We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. It is based on an optimization in which the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.
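The optimization loop described, minimizing the mismatch between healthy and prosthetic knee torques over candidate foot parameters, can be sketched as a grid search. The `simulate` function below is a deliberately trivial stand-in for the paper's biomechanical simulation, and all numbers are invented:

```python
def knee_torque_error(stiffness, healthy_torque, simulate):
    """Cost function: sum of squared differences between the simulated
    prosthetic knee torque and the healthy reference over the gait cycle."""
    prosthetic = simulate(stiffness)
    return sum((p - h) ** 2 for p, h in zip(prosthetic, healthy_torque))

def optimize_stiffness(candidates, healthy_torque, simulate):
    """Pick the candidate stiffness minimizing the torque mismatch;
    a grid-search stand-in for the tool's optimization."""
    return min(candidates,
               key=lambda k: knee_torque_error(k, healthy_torque, simulate))

# Toy model: simulated torque scales with stiffness around a reference of 50,
# so the healthy profile is reproduced exactly at stiffness 50
healthy = [10.0, 20.0, 15.0, 5.0]
simulate = lambda k: [t * (k / 50.0) for t in healthy]
best = optimize_stiffness([30, 40, 50, 60], healthy, simulate)
```

In practice the candidate set would span foot segment lengths and material elasticities, and the simulation would compute knee torque from the gait trajectory.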

  9. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

Full Text Available Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block and, analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, using this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, involving experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
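Fitting a proportionality constant to one basic block from trace data reduces to a one-parameter least-squares problem: with observed times t and annotated work counts w, the model t ≈ c·w gives c = Σ(w·t)/Σ(w²). The sketch below uses hypothetical timings and is not the described compiler/interpreter pipeline:

```python
def fit_block_constant(work_counts, times):
    """Least-squares fit of t ~= c * w for a single basic block:
    c = sum(w*t) / sum(w*w). The tool would repeat this per basic
    block and per communication block from the trace files."""
    num = sum(w * t for w, t in zip(work_counts, times))
    den = sum(w * w for w in work_counts)
    return num / den

# Hypothetical trace: roughly 0.002 time units per unit of work, with noise
w = [100, 200, 400, 800]
t = [0.21, 0.39, 0.82, 1.61]
c = fit_block_constant(w, t)   # close to 0.002
```

Communication blocks would be fitted the same way against a two-parameter latency/bandwidth model instead of a single constant.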

  10. Genetic Mouse Models: The Powerful Tools to Study Fat Tissues.

    Science.gov (United States)

    Kong, Xingxing; Williams, Kevin W; Liu, Tiemin

    2017-01-01

Obesity and Type 2 diabetes (T2D) are associated with a variety of comorbidities that contribute to mortality around the world. Although significant effort has been expended in understanding mechanisms that mitigate the consequences of this epidemic, the field has experienced limited success thus far. The potential ability of brown adipose tissue (BAT) to counteract obesity and metabolic disease in rodents (and potentially in humans) has recently attracted considerable attention. Another thermogenic fat cell type, the beige adipocyte, has also been described; beige adipocytes are located among white adipocytes and share similar cyclic AMP-activated responses with classical BAT. In this chapter, we review contemporary molecular strategies to investigate the role of adipose tissue depots in metabolism. In particular, we discuss the generation of adipose tissue-specific knockout and overexpression of target genes in various mouse models. We also discuss how to use different Cre (cyclization recombination) mouse lines to investigate diverse types of adipocytes.

  11. Software Tool for the Prosthetic Foot Modeling and Stiffness Optimization

    Directory of Open Access Journals (Sweden)

    Matija Štrbac

    2012-01-01

    Full Text Available We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and the walking with the transfemural prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.

  12. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
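Of the three techniques, the frequency ratio model is the simplest to state: for each class of a conditioning factor (e.g. a slope band), it is the share of event pixels falling in the class divided by the share of all pixels in the class, with FR > 1 marking classes where events are over-represented. A minimal sketch with invented counts, not data from the Malaysian test area:

```python
def frequency_ratio(class_pixels, event_pixels):
    """Frequency ratio per class:
    (event pixels in class / total event pixels)
    / (pixels in class / total pixels)."""
    total_pixels = sum(class_pixels.values())
    total_events = sum(event_pixels.values())
    fr = {}
    for cls in class_pixels:
        pct_events = event_pixels.get(cls, 0) / total_events
        pct_pixels = class_pixels[cls] / total_pixels
        fr[cls] = pct_events / pct_pixels
    return fr

# Hypothetical slope-angle classes: pixel counts and landslide counts
slope_pixels = {"0-15": 5000, "15-30": 3000, "30-45": 2000}
slide_pixels = {"0-15": 10,   "15-30": 40,   "30-45": 50}
fr = frequency_ratio(slope_pixels, slide_pixels)
# steep slopes are over-represented among landslides (FR > 1)
```

Summing the FR values of a pixel's classes across all conditioning factors gives the susceptibility index that the tool rasterizes; the automation in BSM lies in doing this class-by-class bookkeeping directly on ArcMAP layers.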

  13. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Directory of Open Access Journals (Sweden)

    M. N. Jebur

    2014-10-01

Full Text Available Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  14. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain whether a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 1-2-3 are used for the model in order to provide as universal an application as possible, such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  15. Process Modeling In Cold Forging Considering The Process-Tool-Machine Interactions

    Science.gov (United States)

    Kroiss, Thomas; Engel, Ulf; Merklein, Marion

    2010-06-01

In this paper, a methodic approach is presented for the determination and modeling of the axial deflection characteristic of the whole system of stroke-controlled press and tooling system. This is realized by a combination of experiment and FE simulation. The press characteristic is measured once in experiment. The tooling system characteristic is determined in FE simulation to avoid experimental investigations on various tooling systems. The stiffnesses of press and tooling system are combined into a substitute stiffness that is integrated into the FE process simulation as a spring element. Non-linear initial effects of the press are modeled with a constant shift factor. The approach was applied to a full forward extrusion process on a press with a C-frame. A comparison between experiments and the results of the integrated FE simulation model showed a high accuracy of the FE model. The simulation model with integrated deflection characteristic represents the entire process behavior and can be used for the calculation of a mathematical process model based on variant simulations and response surfaces. In a subsequent optimization step, an adjusted process and tool design can be determined that compensates for the influence of the deflections on the workpiece dimensions, leading to high workpiece accuracy. Using knowledge of the process behavior, the required number of variant simulations was reduced.
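The substitute stiffness that gets integrated into the FE simulation as a spring element follows from treating press and tooling as springs in series under the same axial force: 1/k_sub = 1/k_press + 1/k_tooling. The numbers below are hypothetical, and the constant `offset` merely stands in for the paper's shift factor for non-linear initial effects:

```python
def substitute_stiffness(k_press, k_tooling):
    """Press and tooling deflect in series under the same axial force,
    so the combined (substitute) stiffness obeys the series-spring
    rule: 1/k_sub = 1/k_press + 1/k_tooling."""
    return 1.0 / (1.0 / k_press + 1.0 / k_tooling)

def axial_deflection(force, k_sub, offset=0.0):
    """Axial deflection at a given press force; `offset` is a constant
    shift factor standing in for non-linear initial press effects."""
    return force / k_sub + offset

k_sub = substitute_stiffness(k_press=500.0, k_tooling=2000.0)  # kN/mm, invented
d = axial_deflection(1000.0, k_sub, offset=0.05)               # mm
```

Note that the combined stiffness (here 400 kN/mm) is always below the softer of the two components, which is why ignoring tooling compliance overestimates system stiffness.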

  16. Universal geometric error modeling of the CNC machine tools based on the screw theory

    Science.gov (United States)

    Tian, Wenjie; He, Baiyan; Huang, Tian

    2011-05-01

The methods to improve the precision of CNC (Computerized Numerical Control) machine tools can be classified into two categories: error prevention and error compensation. Error prevention improves precision via high accuracy in manufacturing and assembly. Error compensation analyzes the source errors that affect the machining error, establishes the error model, and reaches the ideal position and orientation by modifying the trajectory in real time. Error modeling is the key to compensation, so the error modeling method is of great significance. Many researchers have focused on this topic and proposed many methods, but few can describe the 6-dimensional configuration error of the machine tools. In this paper, a universal geometric error model of CNC machine tools is obtained utilizing screw theory. The 6-dimensional error vector is expressed as a twist, and the error vector transforms between different frames with the adjoint transformation matrix. This model can entirely describe the overall position and orientation errors of the tool relative to the workpiece. It provides the mathematical model for compensation, and also provides a guideline for the manufacture, assembly and precision synthesis of machine tools.
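The twist/adjoint machinery can be sketched compactly: a 6-dimensional error twist (v, ω) transforms between frames related by rotation R and translation p via ω' = Rω, v' = Rv + p × (Rω). The example below is a generic illustration of the adjoint map under one common twist convention, not the paper's full machine-tool error model:

```python
def skew(p):
    """3x3 skew-symmetric matrix such that skew(p) @ w == p x w."""
    return [[0.0, -p[2], p[1]],
            [p[2], 0.0, -p[0]],
            [-p[1], p[0], 0.0]]

def mat_vec(M, v):
    """3x3 matrix times 3-vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def adjoint_map(R, p, twist):
    """Transform a 6-D twist (v, w) between frames related by (R, p):
    w' = R w,  v' = R v + p x (R w). This is the adjoint transformation
    used to move 6-D error twists between frames."""
    v, w = twist[:3], twist[3:]
    Rw = mat_vec(R, w)
    Rv = mat_vec(R, v)
    p_x_Rw = mat_vec(skew(p), Rw)
    return [Rv[i] + p_x_Rw[i] for i in range(3)] + Rw

# Identity rotation, frame offset 0.5 along x: a small rotational error
# about z picks up a translational error component at the offset frame
R_id = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = adjoint_map(R_id, [0.5, 0.0, 0.0], [0, 0, 0, 0, 0, 0.001])
```

This coupling of angular error into positional error at a distance is precisely why a full 6-dimensional model is needed: a pure rotation at one axis becomes a mixed position/orientation error at the tool tip.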

  17. Tool flank wear model and parametric optimization in end milling of metal matrix composite using carbide tool: Response surface methodology approach

    Directory of Open Access Journals (Sweden)

    R. Arokiadass

    2012-04-01

Full Text Available Highly automated CNC end milling machines in the manufacturing industry require a reliable model for the prediction of tool flank wear. This model can later be used to predict the tool flank wear (VBmax) according to the process parameters. In this investigation an attempt was made to develop an empirical relationship to predict the tool flank wear (VBmax) of carbide tools while machining LM25 Al/SiCp, incorporating process parameters such as spindle speed (N), feed rate (f), depth of cut (d) and various weight percentages of silicon carbide (S). Response surface methodology (RSM) was applied to optimize the end milling process parameters to attain minimum tool flank wear. Predicted values obtained from the developed model and experimental results are compared, and an error of less than 5 percent is observed. In addition, it is concluded that the flank wear increases with increasing SiCp weight percentage in the MMC.

  18. NREL Multiphysics Modeling Tools and ISC Device for Designing Safer Li-Ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad A.; Yang, Chuanbo

    2016-03-24

The National Renewable Energy Laboratory has developed a portfolio of multiphysics modeling tools to help battery designers better understand the response of lithium-ion batteries to abusive conditions. We will discuss this portfolio, which includes coupled electrical, thermal, chemical, electrochemical, and mechanical modeling. These models can simulate the response of a cell to overheating, overcharge, mechanical deformation, nail penetration, and internal short circuit. Cell-to-cell thermal propagation modeling will also be discussed.

  19. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    Science.gov (United States)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
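The kind of one-dimensional reactive transport problem such a tool solves can be sketched with an explicit upwind finite-difference step for the advection-dispersion-reaction equation dc/dt = -v dc/dx + D d²c/dx² - kc. This is a minimal single-species illustration of the governing equation, not the RT1D code itself (which is VBA-based and handles coupled multi-component chemistry):

```python
def rt1d_step(c, dx, dt, v, D, k):
    """One explicit finite-difference step of
        dc/dt = -v dc/dx + D d2c/dx2 - k c
    with upwind advection (v > 0), central dispersion and
    first-order decay; stable for small Courant/diffusion numbers."""
    n = len(c)
    new = c[:]
    for i in range(1, n - 1):
        adv = -v * (c[i] - c[i - 1]) / dx
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (adv + disp - k * c[i])
    new[-1] = new[-2]          # zero-gradient outlet boundary
    return new

# Column experiment sketch: constant-concentration inlet, clean column
c = [1.0] + [0.0] * 49
for _ in range(200):           # simulate to t = 2 (dt = 0.01)
    c = rt1d_step(c, dx=0.1, dt=0.01, v=0.5, D=0.01, k=0.0)
```

With v·t = 1.0, the concentration front sits near x = 1.0 after the loop; swapping the decay term for nonlinear kinetic expressions is the step that benchmark problems of increasing reaction complexity exercise.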

  20. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian

    2014-01-01

    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.

  1. Force sensor based tool condition monitoring using a heterogeneous ensemble learning model.

    Science.gov (United States)

    Wang, Guofeng; Yang, Yinwei; Li, Zhimeng

    2014-11-14

Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and the tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which harmonic features are extracted from force signals and a minimal-redundancy-maximal-relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, a homogeneous ensemble learning model and a majority voting strategy were also adopted for comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
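The stacking idea, base-classifier outputs becoming meta-features for a second-level learner, can be sketched in a few lines. The threshold "classifiers" and perceptron meta-learner below are deliberate toy stand-ins for the paper's SVM/HMM/RBF base models and its stacking strategy, and the force-feature data are invented:

```python
def threshold_clf(feature_idx, threshold):
    """Toy base classifier: scores 1.0 if one feature exceeds a
    threshold (a stand-in for an SVM/HMM/RBF base model's output)."""
    return lambda x: 1.0 if x[feature_idx] > threshold else 0.0

def train_stacking(base_clfs, X, y, lr=0.5, epochs=200):
    """Stacking: base-classifier outputs form the meta-features, and a
    simple perceptron meta-learner maps them to the final wear label."""
    w, b = [0.0] * len(base_clfs), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = [clf(xi) for clf in base_clfs]        # meta-features
            pred = 1.0 if sum(wi * zi for wi, zi in zip(w, z)) + b > 0 else 0.0
            err = yi - pred
            w = [wi + lr * err * zi for wi, zi in zip(w, z)]
            b += lr * err
    def predict(xi):
        z = [clf(xi) for clf in base_clfs]
        return 1.0 if sum(wi * zi for wi, zi in zip(w, z)) + b > 0 else 0.0
    return predict

# Invented 2-D "force features"; worn tools (label 1.0) show a high
# value in either feature, so neither base classifier alone suffices
X = [[0.1, 0.2], [0.9, 0.1], [0.2, 0.8], [0.9, 0.9], [0.15, 0.1], [0.8, 0.7]]
y = [0.0, 1.0, 1.0, 1.0, 0.0, 1.0]
predict = train_stacking([threshold_clf(0, 0.5), threshold_clf(1, 0.5)], X, y)
```

The meta-learner here learns an OR-like combination of the base outputs, which is the essential advantage of stacking over majority voting: the combination weights are fitted to the data rather than fixed.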

  2. Numerical modeling of friction stir welding using the tools with polygonal pins

    Directory of Open Access Journals (Sweden)

    M. Mehta

    2015-09-01

Full Text Available Friction stir welding using tools with polygonal pins is often found to improve the mechanical strength of the weld joint in comparison to tools with circular pins. However, the impact of the pin profile on the peak temperature, tool torque and traverse force, and the resultant mechanical stresses experienced by the tool, has rarely been reported in a systematic manner. Estimating the rate of heat generation for tools with polygonal pins is challenging due to their non-axisymmetric cross-section about the tool axis. A novel methodology is presented to analytically estimate the rate of heat generation for tools with polygonal pins. A three-dimensional heat transfer analysis of friction stir welding is carried out using the finite element method. The computed temperature field from the heat transfer model is used to estimate the torque, traverse force and mechanical stresses experienced by regular triangular, square, pentagonal and hexagonal pins following the principles of solid mechanics. The computed results show that the peak temperature experienced by the tool pin increases with the number of pin sides. However, the resultant maximum shear stress experienced by the pin decreases from triangular to hexagonal pins.

  3. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    DEFF Research Database (Denmark)

    Clavreul, Julie

to be modelled rather than monitored as in classical LCA (e.g. landfilling or the application of processed waste on agricultural land). Therefore LCA tools are needed which specifically address these issues and enable practitioners to properly model their systems. In this thesis several pieces of work… are presented. First, a review was carried out of all LCA studies of waste management systems published before mid-2012. This provided a global overview of the technologies and waste fractions which have attracted focus within LCA, while enabling an analysis of methodological tendencies, the use of tools… and databases and the application of uncertainty analysis methods. The major outcome of this thesis was the development of a new LCA model, called EASETECH, building on the experience with previous LCA tools, in particular the EASEWASTE model. Before the actual implementation phase, a design phase involved…

  4. Modelling of the Contact Condition at the Tool/Matrix Interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2003-01-01

generation is closely related to the friction condition at the contact interface between the FSW tool and the weld piece material, as well as the material flow in the weld matrix, since the mechanisms for heat generation by frictional and plastic dissipation are different. The heat generation from the tool… a known contact condition at the contact interface, e.g. either pure sliding or sticking. The present model uses Coulomb's law of friction for the sliding condition and the material yield shear stress for the sticking condition to model the contact forces. The model includes heat generation…

  5. Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners, so that we can be leaders in our teaching practice. Through this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experience with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.

  6. Development of a visualization tool for integrated surface water-groundwater modeling

    Science.gov (United States)

    Tian, Yong; Zheng, Yi; Zheng, Chunmiao

    2016-01-01

    Physically-based, fully integrated surface water (SW)-groundwater (GW) models have been increasingly used in water resources research and management. The integrated modeling involves a large amount of scientific data. The use of three-dimensional (3D) visualization software to integrate all the scientific data into a comprehensive system can facilitate the interpretation and validation of modeling results. Nevertheless, few software tools can currently perform efficient data visualization for integrated SW-GW modeling. In this study, a visualization tool named IHM3D was designed and developed specifically for integrated SW-GW modeling. In IHM3D, spatially distributed model inputs/outputs and geo-referenced data sets are visualized in a virtual globe-based 3D environment. End users can conveniently explore and validate modeling results within the 3D environment. A GSFLOW (an integrated SW-GW model developed by the USGS) modeling case in the Heihe River Basin (Northwest China) was used to demonstrate the applicability of IHM3D at a large basin scale. The visualization of the modeling results significantly improved the understanding of the complex hydrologic cycle in this water-limited area, and provided insights into regional water resources management. This study shows that visualization tools like IHM3D can promote data and model sharing in the water resources research community, and make it more practical to perform complex hydrological modeling in real-world water resources management.

  7. Mathematical Practices in the Sciences: The Potential of Computers as a Modelling Tool.

    Science.gov (United States)

    Molyneux-Hodgson, Susan; Mochon, Simon

    This paper is concerned with the role of spreadsheets as a tool for the development of mathematical models in science, one aspect of a collaborative project which worked with two groups of pre-university students from Mexico and the United Kingdom. The purpose of the modeling activities designed was to engage students in creating an…

  8. The 3 "C" Design Model for Networked Collaborative E-Learning: A Tool for Novice Designers

    Science.gov (United States)

    Bird, Len

    2007-01-01

    This paper outlines a model for online course design aimed at the mainstream majority of university academics rather than at the early adopters of technology. It has been developed from work at Coventry Business School where tutors have been called upon to design online modules for the first time. Like many good tools, the model's key strength is…

  9. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  10. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    The aim of the research was to create a methodology that will enable effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation utilizing virtual representations of granular microstructures. The latter have been intensively developed recently and potentially form a powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.
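
The hybrid structure described above, with per-mechanism sub-models summed into a total, can be sketched as follows. The Archard-type abrasive term is a standard phenomenological law; the other mechanism callables are placeholders, and none of this reproduces the paper's actual sub-models:

```python
# Hedged sketch of a hybrid wear model: total wear as a sum of independent
# per-mechanism contributions. Only the Archard abrasive term is filled in;
# adhesive, fatigue, oxidation, etc. would be further callables.
def hybrid_wear(mechanisms, **state):
    """Sum wear contributions from a list of mechanism models."""
    return sum(model(**state) for model in mechanisms)

def archard_abrasive(load, sliding_distance, hardness, k=1e-4, **_):
    """Archard's law: worn volume = k * F * s / H."""
    return k * load * sliding_distance / hardness
```

A call such as `hybrid_wear([archard_abrasive], load=2000.0, sliding_distance=1e3, hardness=5e9)` then evaluates the combined (here, single-mechanism) wear estimate.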

  11. A new validation-assessment tool for health-economic decision models

    NARCIS (Netherlands)

    Mauskopf, J.; Vemer, P.; Voorn, van G.A.K.; Corro Ramos, I.

    2014-01-01

    A validation-assessment tool is being developed for decision makers to transparently and consistently evaluate the validation status of different health-economic decision models. It is designed as a list of validation techniques covering all relevant aspects of model validation to be filled in by

  12. JTorX: A Tool for On-Line Model-Driven Test Derivation and Execution

    NARCIS (Netherlands)

    Belinfante, Axel; Esparza, Javier; Majumdar, Rupak

    We introduce JTorX, a tool for model-driven test derivation and execution, based on the ioco theory. This theory, originally presented in [Tretmans,1996], has been refined in [Tretmans,2008] with test-cases that are input-enabled. For models with underspecified traces [vdBijl+,2004] introduced

  13. Modelling and Optimization of Technological Process for Magnetron Synthesis of AlTiN Nanocomposite Films on Cutting Tools

    Science.gov (United States)

    Kozhina, T. D.

    2016-04-01

    The paper highlights the results of the research on developing the mechanism to model the technological process for magnetron synthesis of nanocomposite films on cutting tools, which provides their specified physical and mechanical characteristics by controlling pulsed plasma parameters. The paper presents optimal conditions for AlTiN coating deposition on cutting tools according to the ion energy of sputtered atoms in order to provide their specified physical and mechanical characteristics.

  14. Model-based calculating tool for pollen-mediated gene flow frequencies in plants.

    Science.gov (United States)

    Lei, Wang; Bao-Rong, Lu

    2016-12-30

    The potential social-economic and environmental impacts caused by transgene flow from genetically engineered (GE) crops have stimulated worldwide biosafety concerns. Determining transgene flow frequencies resulting from pollination is the first critical step for assessing such impacts, in addition to the determination of transgene expression and fitness in crop-wild hybrid descendants. Two methods are commonly used to estimate pollen-mediated gene flow (PMGF) frequencies: field experiments and mathematical modeling. Field experiments can provide relatively accurate results but are time/resource consuming. Modeling offers an effective complement to PMGF experimental assessment. However, many published models describe PMGF by mathematical equations and are not easy to use in practice. To increase the application of PMGF modeling for the estimation of transgene flow, we established a tool to calculate PMGF frequencies based on a quasi-mechanistic PMGF model for wind-pollinated species. This tool includes a calculating program displayed by an easy-to-operate interface. PMGF frequencies of different plant species can be quickly calculated under different environmental conditions by including a number of biological and wind speed parameters that can be measured in the field/laboratory or obtained from published data. The tool is freely available in the public domain (http://ecology.fudan.edu.cn/userfiles/cn/files/Tool_Manual.zip). Case studies including rice, wheat, and maize demonstrated similar results between the calculated frequencies based on this tool and those from published PMGF data. This PMGF calculating tool will provide useful information for assessing and monitoring the social-economic and environmental impacts caused by transgene flow from GE crops. This tool can also be applied to determine the isolation distances between GE and non-GE crops in a coexistence agro-ecosystem, and to ensure the purity of certified seeds by setting proper isolation distances.
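
As a purely hypothetical illustration (not the quasi-mechanistic model the record describes), a generic distance-decay curve shows how such a calculating tool can relate gene-flow frequency, distance, and isolation distance:

```python
import math

# Hypothetical illustration only: an exponential-decay curve for gene-flow
# frequency versus distance, and the isolation distance keeping frequency
# below a threshold. The parameters f0 and decay_length are invented.
def gene_flow_frequency(distance, f0=0.05, decay_length=10.0):
    """Hypothetical PMGF frequency at `distance` metres from the source."""
    return f0 * math.exp(-distance / decay_length)

def isolation_distance(threshold, f0=0.05, decay_length=10.0):
    """Distance beyond which the hypothetical frequency stays below
    `threshold` (inverting the decay curve)."""
    return decay_length * math.log(f0 / threshold)
```

Inverting the frequency curve is how a threshold on acceptable gene flow translates into an isolation distance between GE and non-GE fields.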

  15. Numerical Modeling of Nanoelectronic Devices

    Science.gov (United States)

    Klimeck, Gerhard; Oyafuso, Fabiano; Bowen, R. Chris; Boykin, Timothy

    2003-01-01

    Nanoelectronic Modeling 3-D (NEMO 3-D) is a computer program for numerical modeling of the electronic structure properties of a semiconductor device that is embodied in a crystal containing as many as 16 million atoms in an arbitrary configuration and that has overall dimensions of the order of tens of nanometers. The underlying mathematical model represents the quantum-mechanical behavior of the device resolved to the atomistic level of granularity. The system of electrons in the device is represented by a sparse Hamiltonian matrix that contains hundreds of millions of terms. NEMO 3-D solves the matrix equation on a Beowulf-class cluster computer, by use of a parallel-processing matrix-vector multiplication algorithm coupled to a Lanczos and/or Rayleigh-Ritz algorithm that solves for eigenvalues. In a recent update of NEMO 3-D, a new strain treatment, parameterized for bulk material properties of GaAs and InAs, was developed for two tight-binding submodels. The utility of NEMO 3-D was demonstrated in an atomistic analysis of the effects of disorder in alloys and, in particular, in bulk In(x)Ga(1-x)As and in In0.6Ga0.4As quantum dots.
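
The numerical core of this record (Lanczos eigensolution of a sparse tight-binding Hamiltonian, driven by matrix-vector products) can be illustrated on a toy scale. A tiny 1-D chain stands in for NEMO 3-D's multi-million-atom 3-D matrix, and the chain's exact spectrum 2t·cos(kπ/(n+1)) verifies the result:

```python
import numpy as np

# Toy illustration of the Lanczos approach: lowest eigenvalue of a symmetric
# operator given only as a mat-vec callable. For simplicity the toy runs a
# full n-step factorization with reorthogonalization; production codes stop
# after far fewer steps on far larger matrices.
def lanczos_lowest(matvec, n, m=None, seed=0):
    """Estimate the lowest eigenvalue from m Lanczos steps (default m = n)."""
    m = n if m is None else m
    rng = np.random.default_rng(seed)
    Q = np.zeros((m, n))
    alpha = np.zeros(m)
    beta = np.zeros(max(m - 1, 0))
    q = rng.standard_normal(n)
    Q[0] = q / np.linalg.norm(q)
    for j in range(m):
        w = matvec(Q[j])
        alpha[j] = Q[j] @ w                       # diagonal of T
        w = w - Q[: j + 1].T @ (Q[: j + 1] @ w)   # full reorthogonalization
        if j < m - 1:
            beta[j] = np.linalg.norm(w)           # off-diagonal of T
            Q[j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)[0]

def chain_matvec(v, hopping=-1.0):
    """Mat-vec for a nearest-neighbour tight-binding chain, zero on-site."""
    w = np.zeros_like(v)
    w[:-1] += hopping * v[1:]
    w[1:] += hopping * v[:-1]
    return w
```

For a 60-site chain the estimate matches the analytic ground-state energy -2·cos(π/61) to numerical precision.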

  16. TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.

    Science.gov (United States)

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-10-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should reduce uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required to run several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under Registration, Evaluation, Authorisation and restriction of CHemicals (REACH).

  17. Modelling thermomechanical conditions at the tool/matrix interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper

    2004-01-01

    In friction stir welding the material flow is among others controlled by the contact condition at the tool interface, the thermomechanical state of the matrix and the welding parameters. The conditions under which the deposition process is successful are not fully understood and in most models...... presented previously in literature, the modelling of the material flow at the tool interface has been prescribed as boundary conditions, i.e. the material is forced to keep contact with the tool. The objective of the present work is to analyse the thermomechanical conditions under which a consolidated weld...... frictional and plastic dissipation. Of special interest is the contact condition along the shoulder/matrix and probe/matrix interfaces, as especially the latter affects the efficiency of the deposition process. The thermo-mechanical state in the workpiece is established by modelling both the dwell and weld...

  19. Development of the software generation method using model driven software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a test and QA tool, and the code coverage and complexity of the example code are examined at this stage. The pressurizer low-pressure reactor trip module of the Plant Protection System was programmed as the subject for this task. Testing showed that errors in the generated source code were easily detected using the test tool, and the accuracy of input/output processing by the execution modules was clearly identified.

  20. Tools and Products of Real-Time Modeling: Opportunities for Space Weather Forecasting

    Science.gov (United States)

    Hesse, Michael

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation will focus on the last element. Specifically, we will discuss present capabilities, and the potential to derive further tools. These capabilities will be interpreted in the context of a broad-based, bootstrapping activity for modern Space Weather forecasting.

  1. Optimal Vehicle Design Using the Integrated System and Cost Modeling Tool Suite

    Science.gov (United States)

    2010-08-01

    Only fragments of this record's abstract survived extraction. The recoverable content indicates a tool suite integrating models for space vehicle design (SMAD), space vehicle propulsion, orbit propagation (POST), mission-specific analysis (STK, SOAP), space vehicle costing (ACEIT) including small-satellite development, production, and O&M cost modules, radiation exposure and detector response, reliability, availability, and risk, with supporting tools including CEA, an SRM model, an inflation model, rotor blade design, Microsoft Project, and ATSV.

  2. MODELING AND COMPENSATION TECHNIQUE FOR THE GEOMETRIC ERRORS OF FIVE-AXIS CNC MACHINE TOOLS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    One of the important trends in precision machining is the development of real-time error compensation techniques. Error compensation for multi-axis CNC machine tools is difficult but attractive. A model for the geometric error of five-axis CNC machine tools based on multi-body systems is proposed, and the key technique of the compensation, identifying the geometric error parameters, is developed. Simulation of cutting a workpiece to verify the multi-body-systems-based model is also considered.
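
Multi-body geometric error models of this kind typically chain 4×4 homogeneous transforms, each ideal axis motion multiplied by a small error transform. A minimal sketch under that assumption (here with purely translational errors, not the paper's full parameter set):

```python
import numpy as np

# Hedged sketch of multi-body geometric error modelling: the actual tool-tip
# position is the product of ideal homogeneous transforms, each perturbed by
# a small error transform.
def translation(x, y, z):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = (x, y, z)
    return T

def tool_position(ideal_transforms, error_transforms):
    """Chain ideal*error pairs along the kinematic chain; return the
    resulting tool-tip position."""
    T = np.eye(4)
    for ideal, err in zip(ideal_transforms, error_transforms):
        T = T @ ideal @ err
    return T[:3, 3]
```

Comparing the chained position with and without the error transforms gives exactly the volumetric error that a real-time compensation scheme would correct.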

  3. FLOW STRESS MODEL FOR HARD MACHINING OF AISI H13 WORK TOOL STEEL

    Institute of Scientific and Technical Information of China (English)

    H. Yan; J. Hua; R. Shivpuri

    2005-01-01

    An approach is presented to characterize the stress response of the workpiece in hard machining, accounting for the effect of the initial workpiece hardness, temperature, strain and strain rate on flow stress. AISI H13 work tool steel was chosen to verify this methodology. The proposed flow stress model demonstrates good agreement with data collected from published experiments. Therefore, the proposed model can be used to predict the corresponding flow stress-strain response of AISI H13 work tool steel with variation of the initial workpiece hardness in hard machining.
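
The record does not give the model's functional form; as a hedged stand-in, the classic Johnson-Cook flow stress law illustrates how strain, strain rate, and temperature typically enter such models (the hardness dependence the paper adds is not represented, and the parameter values below are illustrative, not fitted):

```python
import math

# Classic Johnson-Cook flow stress, shown only to illustrate how strain,
# strain rate and temperature enter machining flow-stress models; the
# paper's hardness-dependent model differs, and these constants are invented.
def johnson_cook(strain, strain_rate, T,
                 A=700.0, B=330.0, n=0.28, C=0.03, m=1.5,
                 eps0=1.0, T_room=293.0, T_melt=1760.0):
    """sigma = (A + B*eps^n) * (1 + C*ln(eps_dot/eps0)) * (1 - T*^m),
    with homologous temperature T* = (T - T_room)/(T_melt - T_room)."""
    T_star = (T - T_room) / (T_melt - T_room)
    return (A + B * strain**n) \
        * (1.0 + C * math.log(strain_rate / eps0)) \
        * (1.0 - T_star**m)
```

The three factors separate strain hardening, strain-rate sensitivity, and thermal softening, which is why flow stress falls monotonically as cutting temperature rises.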

  4. An axisymmetrical non-linear finite element model for induction heating in injection molding tools

    DEFF Research Database (Denmark)

    Guerrier, Patrick; Nielsen, Kaspar Kirstein; Menotti, Stefano;

    2016-01-01

    To analyze the heating and cooling phase of an induction heated injection molding tool accurately, the temperature dependent magnetic properties, namely the non-linear B-H curves, need to be accounted for in an induction heating simulation. Hence, a finite element model has been developed...... in to the injection molding tool. The model shows very good agreement with the experimental temperature measurements. It is also shown that the non-linearity can be used without the temperature dependency in some cases, and a proposed method is presented of how to estimate an effective linear permeability to use...

  5. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
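
The core of statistical model checking (estimating the probability that a property holds by sampling independent simulation runs until a confidence target is met) can be sketched as follows. The runs are independent, which is what makes the method trivially parallelizable; this is an illustration of the general technique, not PVeStA's algorithm:

```python
import random

# Sketch of the sampling core of statistical model checking: Monte Carlo
# estimation of P(property) with a normal-approximation confidence interval
# used as the stopping rule. Each run_once() call is one simulation run.
def estimate_probability(run_once, half_width=0.01, z=1.96, max_runs=200_000):
    """Estimate P(property) to the requested CI half-width."""
    successes, n, p = 0, 0, 0.0
    while n < max_runs:
        successes += 1 if run_once() else 0
        n += 1
        p = successes / n
        if n >= 100 and z * (p * (1 - p) / n) ** 0.5 <= half_width:
            break
    return p, n

rng = random.Random(42)
p, n = estimate_probability(lambda: rng.random() < 0.3)
```

Because the runs are independent, a parallel implementation simply farms batches of `run_once` calls out to workers and pools the counts.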

  6. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more wide-spread use of GIS analysis of service access and allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modeled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
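
The computational core of raster travel-time modelling, as used by cost-distance tools of the kind this record describes, is a shortest-path search over a grid of per-cell crossing times. A minimal sketch (not the authors' code):

```python
import heapq

# Illustrative core of raster travel-time modelling: Dijkstra's algorithm
# over a grid whose cells hold the time needed to cross them (4-connected
# moves). Returns minimum travel time from `start` to every reachable cell.
def travel_time(grid, start):
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist
```

Multi-modal travel and temporal service availability are then modelled by changing the per-cell costs, which is the kind of scenario flexibility the record argues for.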

  7. Bio-Logic Builder: A Non-Technical Tool for Building Dynamical, Qualitative Models

    Science.gov (United States)

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A.

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized “bio-logic” modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanisms as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, influenza A replication cycle with 127 species and 200+ interactions, and mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool. PMID:23082121
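
The discrete formalism behind such tools is a Boolean network: each species' next state is a Boolean rule over current states, stepped synchronously. A toy three-node regulatory motif (hypothetical, not one of the paper's models) illustrates the idea:

```python
# Hedged sketch of the discrete formalism behind qualitative model builders:
# a Boolean network stepped synchronously, with one update rule per species.
def step(state, rules):
    """Apply one synchronous Boolean update to all species."""
    return {name: rule(state) for name, rule in rules.items()}

# Toy 3-node motif (hypothetical): A activates B, C represses B, B activates C.
rules = {
    "A": lambda s: s["A"],                  # constitutively held input
    "B": lambda s: s["A"] and not s["C"],
    "C": lambda s: s["B"],
}

state = {"A": True, "B": False, "C": False}
for _ in range(4):
    state = step(state, rules)
```

This particular negative-feedback motif oscillates: after four synchronous updates the network returns to its initial state.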

  8. Bio-logic builder: a non-technical tool for building dynamical, qualitative models.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized "bio-logic" modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanisms as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, influenza A replication cycle with 127 species and 200+ interactions, and mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool.

  9. Swine models, genomic tools and services to enhance our understanding of human health and diseases.

    Science.gov (United States)

    Walters, Eric M; Wells, Kevin D; Bryda, Elizabeth C; Schommer, Susan; Prather, Randall S

    2017-03-22

    The pig is becoming increasingly important as a biomedical model. Given the similarities between pigs and humans, a greater understanding of the underlying biology of human health and diseases may come from the pig rather than from classical rodent models. With an increasing need for swine models, it is essential that the genomic tools, models and services be readily available to the scientific community. Many of these are available through the National Swine Resource and Research Center (NSRRC), a facility funded by the US National Institutes of Health at the University of Missouri. The goal of the NSRRC is to provide high-quality biomedical swine models to the scientific community.

  10. Thermal Error Modeling of a Machine Tool Using Data Mining Scheme

    Science.gov (United States)

    Wang, Kun-Chieh; Tseng, Pai-Chang

    In this paper the knowledge discovery technique is used to build an effective and transparent mathematic thermal error model for machine tools. Our proposed thermal error modeling methodology (called KRL) integrates the schemes of K-means theory (KM), rough-set theory (RS), and linear regression model (LR). First, to explore the machine tool's thermal behavior, an integrated system is designed to simultaneously measure the temperature rises at selected characteristic points and the thermal deformations at the spindle nose under suitable real machining conditions. Second, the obtained data are classified by the KM method, further reduced by the RS scheme, and a linear thermal error model is established by the LR technique. To evaluate the performance of our proposed model, an adaptive neural fuzzy inference system (ANFIS) thermal error model is introduced for comparison. Finally, a verification experiment is carried out and results reveal that the proposed KRL model is effective in predicting thermal behavior in machine tools. Our proposed KRL model is transparent, easily understood by users, and can be easily programmed or modified for different machining conditions.
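
The final regression stage of such a model (after clustering and rough-set reduction have selected a few key temperature points) can be sketched as a least-squares fit of spindle drift against temperature rises. An illustration of that stage only, not the paper's full KRL pipeline:

```python
import numpy as np

# Hedged sketch of the LR stage of a thermal-error model: fit spindle drift
# as a linear function of the temperature rises at the selected key points.
def fit_thermal_error(delta_T, drift):
    """Least-squares fit drift ~ c0 + delta_T @ c. Returns (c0, c)."""
    A = np.column_stack([np.ones(len(delta_T)), delta_T])
    coef = np.linalg.lstsq(A, drift, rcond=None)[0]
    return coef[0], coef[1:]

def predict(c0, c, delta_T):
    """Predicted thermal drift for new temperature-rise measurements."""
    return c0 + delta_T @ c
```

Such a linear model is "transparent" in the paper's sense: each coefficient states directly how much drift a one-degree rise at its sensor contributes.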

  11. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  12. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2009-01-01

    Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical...... phenomena which occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: 1. Air flow in ventilated...... cavity such as in the exterior cladding of building envelopes, i.e. a flow which is parallel to the construction plane. 2. Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The new models make it possible to predict the thermal...

  13. Modeling Tool for Decision Support during Early Days of an Anthrax Event

    Science.gov (United States)

    Meltzer, Martin I.; Shadomy, Sean; Bower, William A.; Hupert, Nathaniel

    2017-01-01

    Health officials lack field-implementable tools for forecasting the effects that a large-scale release of Bacillus anthracis spores would have on public health and hospitals. We created a modeling tool (combining inhalational anthrax caseload projections based on initial case reports, effects of variable postexposure prophylaxis campaigns, and healthcare facility surge capacity requirements) to project hospitalizations and casualties from a newly detected inhalation anthrax event, and we examined the consequences of intervention choices. With only 3 days of case counts, the model can predict final attack sizes for simulated Sverdlovsk-like events (1979 USSR) with sufficient accuracy for decision making and confirms the value of early postexposure prophylaxis initiation. According to a baseline scenario, hospital treatment volume peaks 15 days after exposure, deaths peak earlier (day 5), and recovery peaks later (day 23). This tool gives public health, hospital, and emergency planners scenario-specific information for developing quantitative response plans for this threat. PMID:27983505

  14. Multiscale Multiphysics-Based Modeling and Analysis on the Tool Wear in Micro Drilling

    Science.gov (United States)

    Niu, Zhichao; Cheng, Kai

    2016-02-01

    In micro-cutting processes, process variables including cutting force, cutting temperature and drill-workpiece interfacing conditions (lubrication and interaction, etc.) significantly affect the tool wear in a dynamic interactive in-process manner. The resultant tool life and cutting performance directly affect the component surface roughness, material removal rate and form accuracy control, etc. In this paper, a multiscale multiphysics oriented approach to modeling and analysis is presented particularly on tooling performance in micro drilling processes. The process optimization is also taken account based on establishing the intrinsic relationship between process parameters and cutting performance. The modeling and analysis are evaluated and validated through well-designed machining trials, and further supported by metrology measurements and simulations. The paper is concluded with a further discussion on the potential and application of the approach for broad micro manufacturing purposes.

  15. Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel

    Science.gov (United States)

    McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.

    1999-01-01

    The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.

  16. Modelling of the Contact Condition at the Tool/Matrix Interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2003-01-01

    The objective of the present paper is to investigate the heat generation and contact condition during Friction Stir Welding (FSW). For this purpose, an analytical model is developed for the heat generation and this is combined with a Eulerian FE-analysis of the temperature field. The heat generation is closely related to the friction condition at the contact interface between the FSW tool and the weld piece material as well as the material flow in the weld matrix, since the mechanisms for heat generation by frictional and plastic dissipation are different. The heat generation from the tool is governed by the contact condition, i.e. whether there is sliding, sticking or partial sliding/sticking. The contact condition in FSW is complex (dependent on alloy, welding parameters, tool design etc.), and previous models (both analytical and numerical) for simulation of the heat generation assume …

  17. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    Science.gov (United States)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

    The data volumes used in hydrological modeling are increasing. This increases the effort needed to debug models, since a single output value is influenced by a multitude of input values, and it is difficult to keep an overview of the data dependencies. Further, even knowing these dependencies, it is a tedious job to infer all the relevant data values. The aforementioned data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph, without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and uses more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consumes around 40 GB of storage. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, it is a beneficial tool that provides the modeler with efficient information to investigate the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about that function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R
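
    As a minimal illustration of the approach described in this record (a sketch only: the actual tool also tracks individual raster cells in space and time), Python's standard ast module can recover name-level data dependencies from a script without executing it. The function and example script below are hypothetical, not taken from the tool itself:

```python
import ast

def dependency_graph(source: str) -> dict:
    """Map each assigned variable to the set of names its expression reads.

    A name-level sketch of static provenance extraction: the script is
    parsed, not executed, and each assignment contributes edges from the
    names read on the right-hand side to the assigned target(s).
    """
    graph = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            targets = [t.id for t in node.targets if isinstance(t, ast.Name)]
            reads = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
            for t in targets:
                graph.setdefault(t, set()).update(reads)
    return graph

# A hypothetical three-line hydrological script to trace.
script = """
demand = population * per_capita_use
supply = runoff + groundwater
deficit = demand - supply
"""
print(sorted(dependency_graph(script)["deficit"]))  # -> ['demand', 'supply']
```

    Following the graph transitively from "deficit" would reach all four input variables, which is exactly the kind of backward tracing the tool uses to explain an outlier.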

  18. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Institute of Scientific and Technical Information of China (English)

    Qianjian GUO; Shuo FAN; Rufeng XU; Xiang CHENG; Guoyong ZHAO; Jianguo YANG

    2017-01-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two turntable five-axis machine tool are researched. Measurement experiment of heat sources and thermal errors are carried out, and GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and ABC (artificial bee colony) algorithm is introduced to train the link weights of ANN, a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of ABC-NN model, an experiment system is developed, the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of spindle thermal errors. Experiment results show that the prediction accuracy of ABC-NN model is higher than LSR and ANN, and the residual error is smaller than 3 μm, the new modeling method is feasible. The proposed research provides instruction to compensate thermal errors and improve machining accuracy of NC machine tools.

  19. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Science.gov (United States)

    Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo

    2017-03-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two turntable five-axis machine tool are researched. Measurement experiment of heat sources and thermal errors are carried out, and GRA(grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and ABC(artificial bee colony) algorithm is introduced to train the link weights of ANN, a new ABC-NN(Artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of ABC-NN model, an experiment system is developed, the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of spindle thermal errors. Experiment results show that the prediction accuracy of ABC-NN model is higher than LSR and ANN, and the residual error is smaller than 3 μm, the new modeling method is feasible. The proposed research provides instruction to compensate thermal errors and improve machining accuracy of NC machine tools.
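
    The LSR baseline that the paper compares against can be sketched in a few lines; the temperature and elongation data below are synthetic stand-ins, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: temperatures (deg C) at two heat sources and
# the resulting spindle thermal elongation (micrometres), generated from
# an assumed linear relation plus noise.
temps = rng.uniform(20.0, 45.0, size=(50, 2))
error_um = 0.8 * temps[:, 0] + 0.3 * temps[:, 1] - 20.0 + rng.normal(0.0, 0.5, 50)

# LSR baseline: fit error = w0 + w1*T1 + w2*T2 by ordinary least squares.
X = np.column_stack([np.ones(len(temps)), temps])
w, *_ = np.linalg.lstsq(X, error_um, rcond=None)

residual = error_um - X @ w
print(f"max |residual|: {np.abs(residual).max():.2f} um")
```

    The paper's contribution is to replace this linear map with a neural network whose link weights are searched by the artificial bee colony algorithm, which can capture the nonlinear temperature-error relationship that LSR misses.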

  20. Digital soil mapping as a tool for quantifying state-and-transition models

    Science.gov (United States)

    Ecological sites and associated state-and-transition models (STMs) are rapidly becoming important land management tools in rangeland systems in the US and around the world. Descriptions of states and transitions are largely developed from expert knowledge and generally accepted species and community...

  1. Towards Semantically Integrated Models and Tools for Cyber-Physical Systems Design

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Fitzgerald, John; Woodcock, Jim

    2016-01-01

    We describe an approach to the model-based engineering of embedded and cyber-physical systems, based on the semantic integration of diverse discipline-specific notations and tools. Using the example of a small unmanned aerial vehicle, we explain the need for multiple notations and collaborative...

  2. Validation of Multiple Tools for Flat Plate Photovoltaic Modeling Against Measured Data

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Blair, N.; Dobos, A. P.

    2014-08-01

    This report expands upon previous work by the same authors, published in the 40th IEEE Photovoltaic Specialists Conference. In this validation study, comprehensive analysis is performed on nine photovoltaic systems for which NREL could obtain detailed performance data and specifications, including three utility-scale systems and six commercial-scale systems. Multiple photovoltaic performance modeling tools were used to model these nine systems, and the error of each tool was analyzed compared to quality-controlled measured performance data. This study shows that, excluding identified outliers, all tools achieve annual errors within +/-8% and hourly root mean squared errors less than 7% for all systems. It is further shown using SAM that module model and irradiance input choices can change the annual error with respect to measured data by as much as 6.6% for these nine systems, although all combinations examined still fall within an annual error range of +/-8.5%. Additionally, a seasonal variation in monthly error is shown for all tools. Finally, the effects of irradiance data uncertainty and the use of default loss assumptions on annual error are explored, and two approaches to reduce the error inherent in photovoltaic modeling are proposed.
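
    The two error metrics reported above can be computed as follows; note that normalizing the hourly RMSE by mean measured power is an assumption made here for illustration (the report's exact normalization may differ), and the one-day profile is synthetic:

```python
import numpy as np

def annual_error_pct(modeled, measured):
    """Signed annual energy error, as a percent of measured annual energy."""
    return 100.0 * (modeled.sum() - measured.sum()) / measured.sum()

def hourly_rmse_pct(modeled, measured):
    """Hourly RMSE as a percentage of mean measured power (an assumed
    normalization, chosen only for this illustration)."""
    rmse = np.sqrt(np.mean((modeled - measured) ** 2))
    return 100.0 * rmse / measured.mean()

# Synthetic one-day profile: measured output and a model biased 5% high.
hours = np.arange(24)
measured = 100.0 * np.clip(np.sin((hours - 6) / 12.0 * np.pi), 0.0, None)  # kW
modeled = 1.05 * measured
print(round(annual_error_pct(modeled, measured), 1))  # -> 5.0
```

    A uniformly 5%-high model lands at exactly +5% annual error, within the +/-8% band the study reports.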

  3. Recommender System and Web 2.0 Tools to Enhance a Blended Learning Model

    Science.gov (United States)

    Hoic-Bozic, Natasa; Dlab, Martina Holenko; Mornar, Vedran

    2016-01-01

    Blended learning models that combine face-to-face and online learning are of great importance in modern higher education. However, their development should be in line with the recent changes in e-learning that emphasize a student-centered approach and use tools available on the Web to support the learning process. This paper presents research on…

  4. New tools in modulating Maillard reaction from model systems to food

    NARCIS (Netherlands)

    Troise, A.D.

    2015-01-01

    New tools in modulating Maillard reaction from model systems to food
    The Maillard reaction (MR) supervises the final quality of foods and occupies a prominent place in food science. The first stable compounds, the Amadori rearrangement products (

  5. Simulation of Forming Process as an Educational Tool Using Physical Modeling

    Science.gov (United States)

    Abdullah, A. B.; Muda, M. R.; Samad, Z.

    2008-01-01

    Metal forming process simulation requires a very high cost including the cost for dies, machine and material and tight process control since the process involve very huge pressure. A physical modeling technique is developed and initiates a new era of educational tool of simulating the process effectively. Several publications and findings have…

  6. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    OpenAIRE

    R. Fernandes; F. Braunschweig; Lourenço, F.; R. Neves

    2015-01-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time and space variable shoreline risk levels from ships has b...

  7. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    OpenAIRE

    R. Fernandes; F. Braunschweig; Lourenço, F.; R. Neves

    2016-01-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time and space variable individual vessel accident risk levels and shoreline cont...

  8. Facebook and the Cognitive Model: A Tool for Promoting Adolescent Self-Awareness

    Science.gov (United States)

    Lewis, Lucy; Wahesh, Edward

    2012-01-01

    A homework activity incorporating the social networking site Facebook is presented as a tool for teaching adolescent clients about the cognitive model and increasing their ability to identify and modify problematic thinking. The authors describe how a worksheet developed to help clients examine information presented on their Facebook profile can…

  11. Realistic tool-tissue interaction models for surgical simulation and planning

    NARCIS (Netherlands)

    Misra, Sarthak

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in pre- and intra-operative surgical planning. Realistic modeling of medical interventions involving tool-tissue interactions has been considered to be a key requirement in the development

  12. PLASMA PROTEIN PROFILING AS A HIGH THROUGHPUT TOOL FOR CHEMICAL SCREENING USING A SMALL FISH MODEL

    Science.gov (United States)

    Hudson, R. Tod, Michael J. Hemmer, Kimberly A. Salinas, Sherry S. Wilkinson, James Watts, James T. Winstead, Peggy S. Harris, Amy Kirkpatrick and Calvin C. Walker. In press. Plasma Protein Profiling as a High Throughput Tool for Chemical Screening Using a Small Fish Model (Abstra...

  13. TENCompetence Assessment Model and Related Tools for Non Traditional Methods of Assessment

    NARCIS (Netherlands)

    Petrov, Milen; Aleksieva-Petrova, Adelina; Stefanov, Krassen; Schoonenboom, Judith; Miao, Yongwu

    2008-01-01

    Petrov, M., Aleksieva-Petrova, A., Stefanov, K., Schoonenboom, J., & Miao, Y. (2008). TENCompetence Assessment Model and Related Tools for Non Traditional Methods of Assessment. In H. W. Sligte & R. Koper (Eds). Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for Lifelong Com

  14. GAMBIT: The Global and Modular Beyond-the-Standard-Model Inference Tool arXiv

    CERN Document Server

    Athron, Peter; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  15. Modeling movements of a long hand-held tool with effects of moments of inertia.

    Science.gov (United States)

    Lin, Chiuhsiang Joe; Chen, Hung-Jen

    2014-04-01

    The current experiment aimed to investigate the effects of weight position on movement time in target acquisition tasks. Subsequently, a simple mathematical model was developed to describe the movement time with the moments of inertia. Ten right-handed participants conducted continuous Fitts pointing tasks using a laparoscopic instrument as a long hand-held tool. The results showed significant effects of weight position on movement time. Furthermore, an extended Fitts' law model is proposed for the moments of inertia produced by the hand, instrument, and a constant mass in different positions. This predictive model accounted for 63% of the variance in movement time. The predictive model proposed in the present study can be applied not only to estimate movement time given a particular target width, instrument movement amplitude, and weight position of a long hand-held tool but also to standardize movement time and establish training standards.
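
    The extended model described above can be sketched as follows; the functional form (a moment-of-inertia term added to classic Fitts' law) follows the abstract, but the coefficients and masses below are illustrative placeholders, not the study's fitted values:

```python
import math

def movement_time(amplitude, width, inertia, a=0.10, b=0.15, c=10.0):
    """Extended Fitts' law sketch: MT = a + b*ID + c*I, where
    ID = log2(2*A/W) is the classic index of difficulty and I is the
    moment of inertia (kg*m^2) of the hand-instrument system.
    The coefficients a, b, c are illustrative, not fitted values."""
    index_of_difficulty = math.log2(2.0 * amplitude / width)
    return a + b * index_of_difficulty + c * inertia

# Moving a 0.2 kg constant mass outward along the instrument raises
# I = m*r^2 (point-mass approximation about the wrist) and hence MT.
mt_near = movement_time(0.25, 0.01, inertia=0.2 * 0.10 ** 2)  # mass near grip
mt_far = movement_time(0.25, 0.01, inertia=0.2 * 0.40 ** 2)   # mass near tip
```

    The same mass placed near the tip contributes sixteen times the inertia it does near the grip, which is why weight position, not just total weight, drives movement time.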

  16. ESMValTool (v1.0) - a community diagnostic and performance metrics tool for routine evaluation of Earth system models in CMIP

    Science.gov (United States)

    Eyring, Veronika; Righi, Mattia; Lauer, Axel; Evaldsson, Martin; Wenzel, Sabrina; Jones, Colin; Anav, Alessandro; Andrews, Oliver; Cionni, Irene; Davin, Edouard L.; Deser, Clara; Ehbrecht, Carsten; Friedlingstein, Pierre; Gleckler, Peter; Gottschaldt, Klaus-Dirk; Hagemann, Stefan; Juckes, Martin; Kindermann, Stephan; Krasting, John; Kunert, Dominik; Levine, Richard; Loew, Alexander; Mäkelä, Jarmo; Martin, Gill; Mason, Erik; Phillips, Adam S.; Read, Simon; Rio, Catherine; Roehrig, Romain; Senftleben, Daniel; Sterl, Andreas; van Ulft, Lambertus H.; Walton, Jeremy; Wang, Shiyu; Williams, Keith D.

    2016-05-01

    A community diagnostics and performance metrics tool for the evaluation of Earth system models (ESMs) has been developed that allows for routine comparison of single or multiple models, either against predecessor versions or against observations. The priority of the effort so far has been to target specific scientific themes focusing on selected essential climate variables (ECVs), a range of known systematic biases common to ESMs, such as coupled tropical climate variability, monsoons, Southern Ocean processes, continental dry biases, and soil hydrology-climate interactions, as well as atmospheric CO2 budgets, tropospheric and stratospheric ozone, and tropospheric aerosols. The tool is being developed in such a way that additional analyses can easily be added. A set of standard namelists for each scientific topic reproduces specific sets of diagnostics or performance metrics that have demonstrated their importance in ESM evaluation in the peer-reviewed literature. The Earth System Model Evaluation Tool (ESMValTool) is a community effort open to both users and developers encouraging open exchange of diagnostic source code and evaluation results from the Coupled Model Intercomparison Project (CMIP) ensemble. This will facilitate and improve ESM evaluation beyond the state-of-the-art and aims at supporting such activities within CMIP and at individual modelling centres. Ultimately, we envisage running the ESMValTool alongside the Earth System Grid Federation (ESGF) as part of a more routine evaluation of CMIP model simulations while utilizing observations available in standard formats (obs4MIPs) or provided by the user.

  17. ESMValTool (v1.0) – a community diagnostic and performance metrics tool for routine evaluation of Earth System Models in CMIP

    Directory of Open Access Journals (Sweden)

    V. Eyring

    2015-09-01

    Full Text Available A community diagnostics and performance metrics tool for the evaluation of Earth System Models (ESMs) has been developed that allows for routine comparison of single or multiple models, either against predecessor versions or against observations. The priority of the effort so far has been to target specific scientific themes focusing on selected Essential Climate Variables (ECVs), a range of known systematic biases common to ESMs, such as coupled tropical climate variability, monsoons, Southern Ocean processes, continental dry biases and soil hydrology-climate interactions, as well as atmospheric CO2 budgets, tropospheric and stratospheric ozone, and tropospheric aerosols. The tool is being developed in such a way that additional analyses can easily be added. A set of standard namelists for each scientific topic reproduces specific sets of diagnostics or performance metrics that have demonstrated their importance in ESM evaluation in the peer-reviewed literature. The Earth System Model Evaluation Tool (ESMValTool) is a community effort open to both users and developers encouraging open exchange of diagnostic source code and evaluation results from the CMIP ensemble. This will facilitate and improve ESM evaluation beyond the state-of-the-art and aims at supporting such activities within the Coupled Model Intercomparison Project (CMIP) and at individual modelling centres. Ultimately, we envisage running the ESMValTool alongside the Earth System Grid Federation (ESGF) as part of a more routine evaluation of CMIP model simulations while utilizing observations available in standard formats (obs4MIPs) or provided by the user.

  18. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    Science.gov (United States)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimension of tunnels and their structural requirements are growing, and so do safety and security demands. New engineering tools are needed to perform safer planning and design. This work presents the advances in the particle finite element method (PFEM) for the modelling and the analysis of tunnelling processes, including the wear of the cutting tools. The PFEM has its foundation in the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters which are relevant for estimating the performance of a tunnel boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.
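
    A generic Archard-type wear law, of the kind commonly used to couple frictional contact and tool wear (an illustration only, not the constitutive model actually implemented in the PFEM), estimates worn volume from contact pressure, sliding velocity and material hardness:

```python
def archard_wear_volume(k, pressure, velocity, area, time, hardness):
    """Archard's law: worn volume V = k * p * v * A * t / H, with wear
    coefficient k, contact pressure p, sliding speed v, contact area A,
    sliding time t and material hardness H."""
    return k * pressure * velocity * area * time / hardness

# Assumed order-of-magnitude numbers for a cutter sliding on hard rock:
# k = 1e-4, p = 150 MPa, v = 2 m/s, A = 1 cm^2, one hour, H = 6 GPa.
V = archard_wear_volume(k=1e-4, pressure=150e6, velocity=2.0,
                        area=1e-4, time=3600.0, hardness=6e9)  # m^3
```

    Doubling the sliding speed or the contact pressure doubles the predicted wear, which is why the pressure distribution on the face of the boring machine matters for tool life.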

  19. Thermal modelling of cooling tool cutting when milling by electrical analogy

    Directory of Open Access Journals (Sweden)

    Benmoussa H.

    2010-06-01

    Full Text Available Temperatures measured by some devices are applied immediately after shut-down and may be corrected for the temperature drop that occurs in the interval between shut-down and measurement. This paper presents a new procedure for thermal modelling of the cutting tool just after machining, when the tool is out of the chip, in order to extrapolate the cutting temperature from the temperature measured when the tool is at standstill. A fin approximation accounting for the enhanced heat loss (by conduction and convection) to the air stream is used. In the modelling we introduce an equivalent thermal network to estimate the cutting temperature as a function of specific energy. On the other hand, a locally modified lumped-element conduction equation, with initial and boundary conditions, is used to predict the temperature variation with time while the tool is being cooled. These predictions provide a detailed view of the global heat transfer coefficient as a function of cutting speed, because the heat loss for the tool in an air stream is an order of magnitude larger than in a normal environment. Finally, we deduce the cutting temperature by an inverse method.
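
    The extrapolation step can be illustrated with a plain lumped-capacitance cooling model (a simplification of the modified lumped-element equations in the paper; all parameter values below are assumed, not taken from the study):

```python
import math

def cooled_temperature(T_cut, T_air, h, A, m, c, t):
    """Lumped-capacitance cooling after shut-down:
    T(t) = T_air + (T_cut - T_air) * exp(-h*A*t / (m*c))."""
    tau = m * c / (h * A)          # thermal time constant, s
    return T_air + (T_cut - T_air) * math.exp(-t / tau)

def cutting_temperature(T_measured, T_air, h, A, m, c, t_delay):
    """Inverse step: extrapolate the cutting temperature back from a
    measurement taken t_delay seconds after shut-down."""
    tau = m * c / (h * A)
    return T_air + (T_measured - T_air) * math.exp(t_delay / tau)

# Round trip with assumed parameters for a small tool insert:
# h = 120 W/m^2K, A = 2 cm^2, m = 10 g, c = 250 J/kgK, 2 s delay.
T_meas = cooled_temperature(T_cut=600.0, T_air=25.0, h=120.0, A=2e-4,
                            m=0.01, c=250.0, t=2.0)
T_est = cutting_temperature(T_meas, 25.0, 120.0, 2e-4, 0.01, 250.0, 2.0)
```

    The inverse step simply runs the exponential decay backwards, so the quality of the extrapolated cutting temperature hinges on knowing the global heat transfer coefficient h at the given cutting speed.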

  20. Operation reliability assessment for cutting tools by applying a proportional covariate model to condition monitoring information.

    Science.gov (United States)

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-09-25

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
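
    Reduced to its simplest scalar form, the proportional covariate model assumes the condition-monitoring covariate is proportional to the failure rate; the vibration-feature and failure-rate numbers below are invented for illustration only:

```python
import numpy as np

# Historical sample where both the covariate (e.g. a wavelet-packet
# vibration feature) and the failure rate are known; values made up.
hist_covariate = np.array([0.4, 0.9, 1.7])
hist_failure_rate = np.array([0.002, 0.0045, 0.0084])  # failures per minute

# PCM assumption: z(t) = k * lambda(t). Here the baseline covariate
# function is reduced to a single proportionality constant k,
# estimated from the small historical sample.
k = np.mean(hist_covariate / hist_failure_rate)

def failure_rate(z):
    """Failure rate of the tool being assessed, from a new covariate value."""
    return z / k
```

    A new vibration reading then maps directly to an operation-specific failure rate, rather than the population-wide average a purely statistical method would give.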