WorldWideScience

Sample records for preliminary computer modeling

  1. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows simulation of systems large enough that relevant defect structures and their effects on functional properties like magnetism can be captured. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
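
    As a concrete illustration of the numerical core such models share, the sketch below time-steps the Landau-Lifshitz-Gilbert equation for a small chain of unit magnetization vectors. It is a toy sketch under stated assumptions, not the PNNL model; the constants, the simple exchange-plus-Zeeman effective field, and the grid are placeholders.

    ```python
    # Minimal, illustrative Landau-Lifshitz-Gilbert (LLG) integrator for a 1-D
    # chain of unit magnetization vectors. This is a toy sketch of the dynamics
    # underlying phase-field micromagnetic models, NOT the PNNL code; material
    # constants and the exchange/Zeeman effective field are placeholders.
    import numpy as np

    gamma, alpha = 1.0, 0.1                          # reduced gyromagnetic ratio, damping
    A_ex = 1.0                                       # exchange weight (assumed)
    H_app = np.array([0.0, 0.0, 0.2])                # applied field (assumed)

    def effective_field(m):
        # Exchange field from nearest neighbours (periodic) plus applied field.
        return A_ex * (np.roll(m, 1, axis=0) + np.roll(m, -1, axis=0)) + H_app

    def llg_step(m, dt):
        H = effective_field(m)
        mxH = np.cross(m, H)
        dmdt = -gamma / (1 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
        m = m + dt * dmdt
        return m / np.linalg.norm(m, axis=1, keepdims=True)  # keep |m| = 1

    m = np.random.randn(64, 3)
    m /= np.linalg.norm(m, axis=1, keepdims=True)
    for _ in range(2000):
        m = llg_step(m, dt=0.01)
    print("mean magnetization:", m.mean(axis=0))     # aligns toward the applied field
    ```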

  2. A Generative Computer Model for Preliminary Design of Mass Housing

    Directory of Open Access Journals (Sweden)

    Ahmet Emre DİNÇER

    2014-05-01

    Today, we live in what we call the “Information Age”, an age in which information technologies are constantly being renewed and developed. Out of this has emerged a new approach called “Computational Design” or “Digital Design”. In addition to significantly influencing all fields of engineering, this approach has come to play a similar role in all stages of the architectural design process. Besides providing solutions for analytical design problems such as cost estimation, circulation-system evaluation and environmental effects, which resemble engineering problems, the approach is used in the evaluation, representation and presentation of traditionally designed buildings. With developments in software and hardware technology, it has evolved into studies of designing architectural products, and of their production and implementation, with digital tools aimed at the preliminary design stages. This paper presents a digital model that may be used in the preliminary stage of mass housing design with Cellular Automata, one of the generative design systems based on computational design approaches. This computational model, developed with scripts in 3ds Max software, was applied to the site plan design of mass housing, floor plan organizations shaped by user preferences, and facade designs. With the developed computer model, many alternative housing types can be produced rapidly. The interactive design tool of this computational model allows users to convey their dimensional and functional housing preferences through the interface prepared for the model. The results of the study are discussed in the light of innovative architectural approaches.
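
    To make the generative idea concrete, here is a minimal cellular-automaton sketch that grows clustered “built” cells on a site grid from a random seed. The rule and all parameters are illustrative assumptions, not the authors' 3ds Max scripts.

    ```python
    # Toy cellular-automaton site-plan generator: a cell becomes "built" when
    # its built-neighbour count falls inside a chosen range, which produces
    # clustered housing blocks. Rule and parameters are invented placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    site = (rng.random((20, 30)) < 0.25).astype(int)     # 25% initial seed density

    def neighbours(grid):
        n = np.zeros_like(grid)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx or dy:
                    n += np.roll(np.roll(grid, dx, 0), dy, 1)
        return n

    def step(grid, lo=2, hi=5):
        n = neighbours(grid)
        born = (grid == 0) & (n >= lo + 1) & (n <= hi)   # densify near clusters
        keep = (grid == 1) & (n >= lo) & (n <= hi + 1)   # thin out isolated cells
        return (born | keep).astype(int)

    for _ in range(8):                                   # a few generations suffice
        site = step(site)
    print("\n".join("".join("#" if c else "." for c in row) for row in site))
    ```

    Varying the seed density or the neighbour thresholds plays the role of the user preferences described above.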

  3. Developing ontological model of computational linear algebra - preliminary considerations

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for application of ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware, specifically, help the user to choose the optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge about the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" whether her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without requiring technical knowledge about the Grid middleware. To achieve the mentioned goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriteria decision making. As a starting point, the area of computational linear algebra was selected to be modeled; however, the paper presents a general approach that should be easily extendable to other domains.
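
    Once candidate (method, resource) pairs have been retrieved from the knowledge base, the final recommendation step can be as simple as a weighted-sum multicriteria ranking. The sketch below is a hypothetical illustration; the criteria names, weights, and scores are invented, not taken from the Agents in Grid system.

    ```python
    # Hypothetical weighted-sum multicriteria ranking of (algorithm, resource)
    # candidates. All names, weights, and scores are invented placeholders.
    candidates = {
        ("conjugate-gradient", "cluster-A"): {"fit": 0.9, "cost": 0.4, "speed": 0.7},
        ("direct-LU",          "cluster-B"): {"fit": 0.6, "cost": 0.8, "speed": 0.9},
        ("GMRES",              "cluster-A"): {"fit": 0.8, "cost": 0.5, "speed": 0.6},
    }
    weights = {"fit": 0.5, "cost": 0.2, "speed": 0.3}   # user/expert preferences

    def score(c):
        # Simple weighted sum over normalized criterion scores in [0, 1].
        return sum(weights[k] * v for k, v in candidates[c].items())

    best = max(candidates, key=score)
    print("recommended:", best, "score = %.2f" % score(best))
    ```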

  4. Preliminary development of a global 3-D magnetohydrodynamic computational model for solar wind-cometary and planetary interactions

    International Nuclear Information System (INIS)

    Stahara, S.S.

    1986-05-01

    This is the final summary report by Resource Management Associates, Inc., of the first year's work under Contract No. NASW-4011 to the National Aeronautics and Space Administration. The work under this initial phase of the contract relates to the preliminary development of a global, 3-D magnetohydrodynamic computational model to quantitatively describe the detailed continuum field and plasma interaction process of the solar wind with cometary and planetary bodies throughout the solar system. The work extends a highly successful, observationally verified computational model previously developed by the author, and is appropriate for the global determination of supersonic, super-Alfvenic solar wind flows past planetary obstacles. This report provides a concise description of the problems studied, a summary of all the important research results, and copies of the publications.

  5. Individualized computer-aided education in mammography based on user modeling: concept and preliminary experiments.

    Science.gov (United States)

    Mazurowski, Maciej A; Baker, Jay A; Barnhart, Huiman X; Tourassi, Georgia D

    2010-03-01

    The authors propose the framework for an individualized adaptive computer-aided educational system in mammography that is based on user modeling. The underlying hypothesis is that user models can be developed to capture the individual error-making patterns of radiologists-in-training. In this pilot study, the authors test the above hypothesis for the task of breast cancer diagnosis in mammograms. The concept of a user model was formalized as the function that relates image features to the likelihood/extent of the diagnostic error made by a radiologist-in-training, and therefore to the level of difficulty that a case will pose to the radiologist-in-training (or "user"). Then, machine learning algorithms were implemented to build such user models. Specifically, the authors explored k-nearest neighbor, artificial neural networks, and multiple regression for the task of building the model using observer data collected from ten Radiology residents at Duke University Medical Center for the problem of breast mass diagnosis in mammograms. For each resident, a user-specific model was constructed that predicts the user's expected level of difficulty for each presented case based on two BI-RADS image features. In the experiments, a leave-one-out data-handling scheme was applied to assign each case to a low-predicted-difficulty or a high-predicted-difficulty group for each resident based on each of the three user models. To evaluate whether the user model is useful in predicting difficulty, the authors performed statistical tests using the generalized estimating equations approach to determine whether the mean actual error differs between the low-predicted-difficulty group and the high-predicted-difficulty group. When the results for all observers were pooled together, the actual errors made by residents were statistically significantly higher for cases in the high-predicted-difficulty group than for cases in the low-predicted-difficulty group for all modeling
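
    The k-nearest-neighbor variant of such a user model is straightforward to sketch: predict a trainee's expected error on a new case as the average error on the k most similar training cases. Everything below (feature values, error scores, the decision threshold) is an invented placeholder, not the Duke observer data.

    ```python
    # Minimal k-NN "user model": predict a trainee's expected diagnostic error
    # (case difficulty) from two image features. All values are placeholders.
    import numpy as np

    # (BI-RADS-like feature 1, feature 2) -> observed diagnostic error per case
    X_train = np.array([[3, 1], [4, 2], [2, 1], [5, 3], [4, 4], [1, 1]], float)
    y_train = np.array([0.2, 0.6, 0.1, 0.9, 0.8, 0.0])

    def predict_difficulty(x, k=3):
        d = np.linalg.norm(X_train - x, axis=1)
        return y_train[np.argsort(d)[:k]].mean()   # mean error of k nearest cases

    new_case = np.array([4.0, 3.0])
    p = predict_difficulty(new_case)
    print("high-predicted-difficulty" if p > 0.5 else "low-predicted-difficulty",
          "(expected error %.2f)" % p)
    ```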

  6. Preliminary analysis of the MER magnetic properties experiment using a computational fluid dynamics model

    DEFF Research Database (Denmark)

    Kinch, K.M.; Merrison, J.P.; Gunnlaugsson, H.P.

    2006-01-01

    Motivated by questions raised by the magnetic properties experiments on the NASA Mars Pathfinder and Mars Exploration Rover (MER) missions, we have studied in detail the capture of airborne magnetic dust by permanent magnets using a computational fluid dynamics (CFD) model supported by laboratory simulations. The magnets studied are identical to the capture magnet and filter magnet on MER, though results are more generally applicable. The dust capture process is found to be dependent upon wind speed, dust magnetization, dust grain size and dust grain mass density. Here we develop an understanding of how these parameters affect dust capture rates and patterns on the magnets and set bounds for these parameters based on MER data and results from the numerical model. This results in a consistent picture of the dust as containing varying amounts of at least two separate components with different...

  7. A PRELIMINARY JUPITER MODEL

    International Nuclear Information System (INIS)

    Hubbard, W. B.; Militzer, B.

    2016-01-01

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen–helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen–helium-rich envelope with approximately three times solar metallicity.

  8. A PRELIMINARY JUPITER MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Hubbard, W. B. [Lunar and Planetary Laboratory, The University of Arizona, Tucson, AZ 85721 (United States); Militzer, B. [Department of Earth and Planetary Science, Department of Astronomy, University of California, Berkeley, CA 94720 (United States)]

    2016-03-20

    In anticipation of new observational results for Jupiter's axial moment of inertia and gravitational zonal harmonic coefficients from the forthcoming Juno orbiter, we present a number of preliminary Jupiter interior models. We combine results from ab initio computer simulations of hydrogen–helium mixtures, including immiscibility calculations, with a new nonperturbative calculation of Jupiter's zonal harmonic coefficients, to derive a self-consistent model for the planet's external gravity and moment of inertia. We assume helium rain modified the interior temperature and composition profiles. Our calculation predicts zonal harmonic values to which measurements can be compared. Although some models fit the observed (pre-Juno) second- and fourth-order zonal harmonics to within their error bars, our preferred reference model predicts a fourth-order zonal harmonic whose absolute value lies above the pre-Juno error bars. This model has a dense core of about 12 Earth masses and a hydrogen–helium-rich envelope with approximately three times solar metallicity.

  9. Preliminary validation of computational model for neutron flux prediction of Thai Research Reactor (TRR-1/M1)

    Science.gov (United States)

    Sabaibang, S.; Lekchaum, S.; Tipayakul, C.

    2015-05-01

    This study is part of on-going work to develop a computational model of the Thai Research Reactor (TRR-1/M1) which is capable of accurately predicting the neutron flux level and spectrum. The computational model was created with the MCNPX program, and the CT (Central Thimble) in-core irradiation facility was selected as the location for validation. The comparison was performed with the typical flux measurement method routinely practiced at TRR-1/M1, that is, the foil activation technique. In this technique, a gold foil is irradiated for a certain period of time and the activity of the irradiated target is measured to derive the thermal neutron flux. Additionally, a flux measurement with an SPND (self-powered neutron detector) was also performed for comparison. The thermal neutron flux from the MCNPX simulation was found to be 1.79×10¹³ neutron/cm²s, while that from the foil activation measurement was 4.68×10¹³ neutron/cm²s. On the other hand, the thermal neutron flux from the measurement using the SPND was 2.47×10¹³ neutron/cm²s. An assessment of the differences among the three methods was made: the difference between MCNPX and the foil activation technique was found to be 67.8%, and the difference between MCNPX and the SPND was found to be 27.8%.

  10. A Statistical Model and Computer program for Preliminary Calculations Related to the Scaling of Sensor Arrays; TOPICAL

    International Nuclear Information System (INIS)

    Max Morris

    2001-01-01

    Recent advances in sensor technology and engineering have made it possible to assemble many related sensors in a common array, often of small physical size. Sensor arrays may report an entire vector of measured values in each data collection cycle, typically one value per sensor per sampling time. The larger quantities of data provided by larger arrays certainly contain more information; however, in some cases experience suggests that dramatic increases in array size do not always lead to corresponding improvements in the practical value of the data. The work leading to this report was motivated by the need to develop computational planning tools to approximate the relative effectiveness of arrays of different size (or scale) in a wide variety of contexts. The basis of the work is a statistical model of a generic sensor array. It includes features representing measurement error, both common to all sensors and independent from sensor to sensor, and the stochastic relationships between the quantities to be measured by the sensors. The model can be used to assess the effectiveness of hypothetical arrays in classifying objects or events from two classes. A computer program is presented for evaluating the misclassification rates that can be expected when arrays are calibrated using a given number of training samples, or the number of training samples required to attain a given level of classification accuracy. The program is also available via email from the first author for a limited time.
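
    A minimal Monte Carlo version of such a planning tool is sketched below: each sensor reading is a class mean plus an error component common to all sensors plus an independent per-sensor error, and a nearest-centroid classifier calibrated on a fixed number of training samples is scored on test data. The variances, sample sizes, and classifier choice are assumptions for illustration, not the report's model.

    ```python
    # Monte Carlo sketch of a generic two-class sensor-array model:
    # reading = class signal + common (shared) noise + per-sensor noise.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(n_sensors, n_train=20, n_test=2000, s_common=0.5, s_indep=1.0):
        mu0, mu1 = 0.0, 1.0                    # class means, same at each sensor
        def draw(n, mu):
            common = s_common * rng.standard_normal((n, 1))          # shared error
            indep = s_indep * rng.standard_normal((n, n_sensors))    # per-sensor
            return mu + common + indep
        Xtr = np.vstack([draw(n_train, mu0), draw(n_train, mu1)])
        ytr = np.r_[np.zeros(n_train), np.ones(n_train)]
        Xte = np.vstack([draw(n_test, mu0), draw(n_test, mu1)])
        yte = np.r_[np.zeros(n_test), np.ones(n_test)]
        # Nearest-centroid classifier calibrated from the training samples.
        c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
        pred = np.linalg.norm(Xte - c1, axis=1) < np.linalg.norm(Xte - c0, axis=1)
        return (pred != yte).mean()

    for n in (1, 4, 16, 64, 256):
        print(f"{n:4d} sensors -> misclassification rate {simulate(n):.3f}")
    ```

    Because the common error component does not average away, the misclassification rate typically flattens out as the array grows, mirroring the diminishing returns noted above.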

  11. Preliminary model for core/concrete interactions

    International Nuclear Information System (INIS)

    Murfin, W.B.

    1977-08-01

    A preliminary model is described for computing the rate of penetration of concrete by a molten LWR core. Among the phenomena included are convective stirring of the melt by evolved gases, admixture of concrete decomposition products to the melt, chemical reactions, radiative heat loss, and variation of heat transfer coefficients with local pressure. The model is most applicable to a two-phase melt (metallic plus oxidic) having a fairly high metallic content.

  12. Functional computed tomography imaging of tumor-induced angiogenesis. Preliminary results of new tracer kinetic modeling using a computer discretization approach

    International Nuclear Information System (INIS)

    Kaneoya, Katsuhiko; Ueda, Takuya; Suito, Hiroshi

    2008-01-01

    The aim of this study was to establish functional computed tomography (CT) imaging as a method for assessing tumor-induced angiogenesis. Functional CT imaging was mathematically analyzed for 14 renal cell carcinomas by means of two-compartment modeling using a computer-discretization approach. The model incorporated diffusible kinetics of contrast medium including leakage from the capillary to the extravascular compartment and back-flux to the capillary compartment. The correlations between functional CT parameters [relative blood volume (rbv), permeability 1 (Pm1), and permeability 2 (Pm2)] and histopathological markers of angiogenesis [microvessel density (MVD) and vascular endothelial growth factor (VEGF)] were statistically analyzed. The modeling was successfully performed, showing similarity between the mathematically simulated curve and the measured time-density curve. There were significant linear correlations between MVD grade and Pm1 (r=0.841, P=0.001) and between VEGF grade and Pm2 (r=0.804, P=0.005) by Pearson's correlation coefficient. This method may be a useful tool for the assessment of tumor-induced angiogenesis. (author)
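
    The two-compartment structure described above can be sketched as a simple balance equation discretized in time: contrast leaks from the capillary compartment at a rate proportional to a Pm1-like constant and fluxes back at a rate proportional to a Pm2-like constant, and the tissue time-density curve is the blood-volume-weighted sum. The arterial input function and parameter values below are placeholders, not the fitted values from the study.

    ```python
    # Illustrative two-compartment tracer model with capillary-to-interstitium
    # leakage and back-flux, solved by simple Euler discretization.
    import numpy as np

    dt, T = 0.1, 60.0                      # time step and total time (s)
    t = np.arange(0.0, T, dt)
    Cp = 5.0 * t * np.exp(-t / 8.0)        # assumed arterial input function

    rbv, Pm1, Pm2 = 0.15, 0.05, 0.02       # blood volume, leakage, back-flux (assumed)
    Ce = np.zeros_like(t)                  # extravascular concentration
    for i in range(len(t) - 1):
        dCe = Pm1 * Cp[i] - Pm2 * Ce[i]    # leakage in, back-flux out
        Ce[i + 1] = Ce[i] + dt * dCe

    tissue = rbv * Cp + Ce                 # simulated time-density curve
    print("peak enhancement at t = %.1f s" % t[np.argmax(tissue)])
    ```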

  13. Post-mortem computed tomography angiography utilizing barium sulfate to identify microvascular structures : a preliminary phantom model and case study

    NARCIS (Netherlands)

    Haakma, Wieke; Rohde, Marianne; Kuster, Lidy; Uhrenholt, Lars; Pedersen, Michael; Boel, Lene Warner Thorup

    2016-01-01

    We investigated the use of computed tomography angiography (CTA) to visualize microvascular structures in a vessel-mimicking phantom and post-mortem (PM) bodies. A contrast agent was used based on 22% barium sulfate, 20% polyethylene glycol and 58% distilled water. A vessel-mimicking phantom

  14. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    Plant cell walls are the source of biofuels and biomaterials; our modeling investigates their properties. Quantum mechanical models: NREL studies chemical and electronic properties and processes to reduce barriers. NREL uses computational modeling to increase the ...

  15. Preliminary ECLSS waste water model

    Science.gov (United States)

    Carter, Donald L.; Holder, Donald W., Jr.; Alexander, Kevin; Shaw, R. G.; Hayase, John K.

    1991-01-01

    A preliminary waste water model for input to the Space Station Freedom (SSF) Environmental Control and Life Support System (ECLSS) Water Processor (WP) has been generated for design purposes. Data have been compiled from various ECLSS tests and flight sample analyses. A discussion of the characterization of the waste streams comprising the model is presented, along with a discussion of the waste water model and the rationale for the inclusion of contaminants in their respective concentrations. The major objective is to establish a methodology for the development of a waste water model and to present the current state of that model.

  16. Support vector machine model for diagnosis of lymph node metastasis in gastric cancer with multidetector computed tomography: a preliminary study

    Directory of Open Access Journals (Sweden)

    Gao Yun

    2011-01-01

    Background. Lymph node metastasis (LNM) of gastric cancer is an important prognostic factor for long-term survival, but several imaging techniques commonly used in the stomach cannot satisfactorily assess gastric cancer lymph node status; they cannot achieve both high sensitivity and specificity. As a machine-learning method, the Support Vector Machine (SVM) has the potential to solve this complex issue. Methods. The institutional review board approved this retrospective study. 175 consecutive patients with gastric cancer who underwent MDCT before surgery were included. We evaluated the tumor and lymph node indicators on CT images, including serosal invasion, tumor classification, tumor maximum diameter, number of lymph nodes, maximum lymph node size and lymph node station, which reflect the biological behavior of gastric cancer. Univariate analysis was used to analyze the relationship between the six image indicators and LNM. An SVM model was built with these indicators as the input index; the output index was whether the patient's lymph node metastasis was positive or negative, as confirmed by surgery and histopathology. A standard machine-learning technique called k-fold cross-validation (5-fold in our study) was used to train and test the SVM models. We evaluated the diagnostic capability of the SVM models for lymph node metastasis with receiver operating characteristic (ROC) curves, and a radiologist classified the lymph node metastasis of patients using maximum lymph node size on CT images as the criterion. We compared the areas under the ROC curves (AUC) of the radiologist and the SVM models. Results. Of the 175 cases, 134 had lymph node metastasis and 41 did not. All six image indicators showed statistically significant differences between the LNM-negative and LNM-positive groups. The means of the sensitivity, specificity and AUC of the SVM models with 5-fold cross-validation were 88.5%, 78.5% and 0.876, respectively.

  17. Support vector machine model for diagnosis of lymph node metastasis in gastric cancer with multidetector computed tomography: a preliminary study

    International Nuclear Information System (INIS)

    Zhang, Xiao-Peng; Wang, Zhi-Long; Tang, Lei; Sun, Ying-Shi; Cao, Kun; Gao, Yun

    2011-01-01

    Lymph node metastasis (LNM) of gastric cancer is an important prognostic factor for long-term survival, but several imaging techniques commonly used in the stomach cannot satisfactorily assess gastric cancer lymph node status; they cannot achieve both high sensitivity and specificity. As a machine-learning method, the Support Vector Machine (SVM) has the potential to solve this complex issue. The institutional review board approved this retrospective study. 175 consecutive patients with gastric cancer who underwent MDCT before surgery were included. We evaluated the tumor and lymph node indicators on CT images, including serosal invasion, tumor classification, tumor maximum diameter, number of lymph nodes, maximum lymph node size and lymph node station, which reflect the biological behavior of gastric cancer. Univariate analysis was used to analyze the relationship between the six image indicators and LNM. An SVM model was built with these indicators as the input index; the output index was whether the patient's lymph node metastasis was positive or negative, as confirmed by surgery and histopathology. A standard machine-learning technique called k-fold cross-validation (5-fold in our study) was used to train and test the SVM models. We evaluated the diagnostic capability of the SVM models for lymph node metastasis with receiver operating characteristic (ROC) curves, and a radiologist classified the lymph node metastasis of patients using maximum lymph node size on CT images as the criterion. We compared the areas under the ROC curves (AUC) of the radiologist and the SVM models. Of the 175 cases, 134 had lymph node metastasis and 41 did not. All six image indicators showed statistically significant differences between the LNM-negative and LNM-positive groups. The means of the sensitivity, specificity and AUC of the SVM models with 5-fold cross-validation were 88.5%, 78.5% and 0.876, respectively. While the diagnostic power of the
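
    A minimal version of the described pipeline (six indicators in, LNM status out, 5-fold cross-validation, ROC AUC) can be sketched with scikit-learn as below. The feature matrix is random placeholder data standing in for the 175-patient dataset, so the printed AUC is meaningless except as a smoke test.

    ```python
    # Sketch of an SVM + 5-fold cross-validation pipeline with ROC AUC scoring.
    # The data are random placeholders, NOT the study's 175-patient dataset.
    import numpy as np
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(175, 6))                 # six image indicators per patient
    y = (X[:, 3] + 0.5 * X[:, 4] + rng.normal(size=175)) > 0   # synthetic LNM label

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print("mean AUC over 5 folds: %.3f" % auc.mean())
    ```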

  18. Preliminary evaluation of the BIODOSE computer program

    International Nuclear Information System (INIS)

    Bonner, N.A.; Ng, Y.C.

    1979-09-01

    The BIODOSE computer program simulates the environmental transport of radionuclides released to surface water and predicts the dosage to humans. We have evaluated the program for its suitability to the needs of the Nuclear Regulatory Commission Waste Management Program; in particular, we evaluated whether the BIODOSE models account for the significant pathways and mechanisms resulting in radiological doses to man. In general, BIODOSE is a satisfactory code for converting radionuclide releases to the aqueous environment into doses to man.

  19. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  20. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology.

  1. Dissociation - a preliminary contextual model

    Directory of Open Access Journals (Sweden)

    C Krüger

    2007-02-01

    Background. The Diagnostic and Statistical Manual of Mental Disorders (DSM) system has certain limitations when applied to two South African examples of dissociation, because it is descriptive (non-explanatory) and focuses on intrapsychic (non-communal) processes. Even the existing Western explanatory models of dissociation fail to accommodate fully the communal aspects of dissociation in our South African context. Objectives and methods. The aim was to explore an expanded perspective on dissociation that does not limit it to an intrapsychic phenomenon, but that accounts for the interrelatedness of individuals within their social context. Auto-ethnography was used. In this article a collective, socially orientated, contextual hermeneutic was applied to two local examples of dissociation. Three existing Western models were expanded along multicontextual, collective lines, for them to be more useful in the pluralistic South African context. Results. This preliminary contextual model of dissociation includes a person’s interpersonal, socio-cultural, and spiritual contexts, in addition to the intrapsychic context. Dissociation is considered to be a normal information-processing tool that maintains balanced, coherent selves-in-society, i.e. individuals connected to each other. In the South African context dissociation appears mostly as a normal phenomenon and seldom as a sign of mental illness. Dissociation is pivotal for the normal construction of individual and communal identities in the face of conflicting sets of information from various contexts. Dissociation may help individuals or communities to survive in a world of conflicting messages, where conflict is often interpersonal/cultural/societal in nature, rather than primarily intrapsychic. Conclusions. This model should be developed and evaluated further. Such evaluation would require suitable new local terminology.

  2. Reservoir souring: Problems, uncertainties and modelling. Part I: Problems and uncertainty involved in prediction. Part II: Preliminary investigations of a computational model

    International Nuclear Information System (INIS)

    Paulsen, J.E.; Read, P.A.; Thompson, C.P.; Jelley, C.; Lezeau, P.

    1996-01-01

    The paper relates to improved oil recovery (IOR) techniques by mathematical modelling. The uncertainty involved in the modelling of reservoir souring is discussed. IOR processes are speculated to influence a souring process in a positive direction. Most models do not take into account the pH of reservoir fluids, and thus do not account for the partitioning behaviour of sulfide. Also, sulfide is antagonistic to bacterial metabolism and impedes the sulfate reduction rate; this may be an important factor in modelling. Biofilms are thought to play a crucial role in a reservoir souring process. Biofilm in a reservoir matrix is different from biofilm in open systems, which has major implications for microbial transport and behaviour. Studies on microbial activity in reservoir matrices must be carried out with model cores in order to mimic a realistic situation; sufficient data do not exist today. The main conclusion is that a model does not reflect a true situation before the nature of these elements is understood. A simplified version of a Norwegian-developed biofilm model is discussed. The model incorporates all the important physical phenomena studied in the above references, such as bacterial growth limited by nutrients and/or energy sources and hydrogen sulfide adsorption. 18 refs., 8 figs., 1 tab

  3. Reservoir souring: Problems, uncertainties and modelling. Part I: Problems and uncertainty involved in prediction. Part II: Preliminary investigations of a computational model

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, J.E. [Rogalandsforskning, Stavanger (Norway)]; Read, P.A.; Thompson, C.P.; Jelley, C.; Lezeau, P.

    1996-12-31

    The paper relates to improved oil recovery (IOR) techniques by mathematical modelling. The uncertainty involved in the modelling of reservoir souring is discussed. IOR processes are speculated to influence a souring process in a positive direction. Most models do not take into account the pH of reservoir fluids, and thus do not account for the partitioning behaviour of sulfide. Also, sulfide is antagonistic to bacterial metabolism and impedes the sulfate reduction rate; this may be an important factor in modelling. Biofilms are thought to play a crucial role in a reservoir souring process. Biofilm in a reservoir matrix is different from biofilm in open systems, which has major implications for microbial transport and behaviour. Studies on microbial activity in reservoir matrices must be carried out with model cores in order to mimic a realistic situation; sufficient data do not exist today. The main conclusion is that a model does not reflect a true situation before the nature of these elements is understood. A simplified version of a Norwegian-developed biofilm model is discussed. The model incorporates all the important physical phenomena studied in the above references, such as bacterial growth limited by nutrients and/or energy sources and hydrogen sulfide adsorption. 18 refs., 8 figs., 1 tab.
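
    A well-mixed toy version of the kinetics both records describe (growth limited by nutrients and inhibited by the sulfide produced) can be written as a few coupled rate equations. All constants below are illustrative assumptions, not values from the Norwegian biofilm model.

    ```python
    # Toy well-mixed souring kinetics: sulfate-reducing biomass grows on a
    # nutrient (Monod kinetics) while the H2S it produces inhibits further
    # growth. All rate constants and the stoichiometry are assumptions.
    dt, steps = 0.1, 2000                     # day, number of Euler steps
    mu_max, Ks, Y, Ki = 0.5, 2.0, 0.4, 5.0    # 1/d, mg/l, yield, inhibition const.
    X, S, H2S = 0.01, 50.0, 0.0               # biomass, nutrient, sulfide (mg/l)

    for _ in range(steps):
        mu = mu_max * S / (Ks + S) * Ki / (Ki + H2S)   # Monod * sulfide inhibition
        growth = mu * X
        X += dt * growth
        S = max(S - dt * growth / Y, 0.0)     # nutrient consumed
        H2S += dt * 0.8 * growth / Y          # sulfide produced (assumed stoich.)

    print(f"biomass {X:.2f}, nutrient {S:.2f}, H2S {H2S:.2f} mg/l "
          f"after {steps * dt:.0f} d")
    ```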

  4. Can a numerically stable subgrid-scale model for turbulent flow computation be ideally accurate?: a preliminary theoretical study for the Gaussian filtered Navier-Stokes equations.

    Science.gov (United States)

    Ida, Masato; Taniguchi, Nobuyuki

    2003-09-01

    This paper introduces a candidate for the origin of the numerical instabilities in large eddy simulation repeatedly observed in academic and practical industrial flow computations. Without resorting to any subgrid-scale modeling, but based on a simple assumption regarding the streamwise component of flow velocity, it is shown theoretically that in a channel-flow computation, the application of Gaussian filtering to the incompressible Navier-Stokes equations yields a numerically unstable term, a cross-derivative term, which is similar to one appearing in the Gaussian filtered Vlasov equation derived by Klimas [J. Comput. Phys. 68, 202 (1987)] and also to one derived recently by Kobayashi and Shimomura [Phys. Fluids 15, L29 (2003)] from the tensor-diffusivity subgrid-scale term in a dynamic mixed model. The present result predicts that not only the numerical methods and the subgrid-scale models employed, but also the applied filtering process alone, can be a seed of this numerical instability. An investigation of the relationship between turbulent energy scattering and the unstable term shows that the instability of the term does not necessarily represent the backscatter of kinetic energy, which has been considered a possible origin of numerical instabilities in large eddy simulation. The present findings raise the question whether a numerically stable subgrid-scale model can be ideally accurate.
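
    For orientation, in one common convention the Gaussian filter and the cross-derivative (tensor-diffusivity) term at issue take the following form; the coefficients depend on the filter-width convention, so treat this as a hedged reconstruction rather than the paper's exact expressions:

    ```latex
    \bar{u}(x) = \int G(x-x')\,u(x')\,\mathrm{d}x', \qquad
    G(\xi) = \sqrt{\frac{6}{\pi\bar{\Delta}^{2}}}\,
             \exp\!\left(-\frac{6\xi^{2}}{\bar{\Delta}^{2}}\right), \qquad
    \tau_{ij} \approx \frac{\bar{\Delta}^{2}}{12}\,
    \frac{\partial\bar{u}_{i}}{\partial x_{k}}\,
    \frac{\partial\bar{u}_{j}}{\partial x_{k}}.
    ```

    Products of first derivatives of this kind can act as negative (anti-)diffusion along compressive flow directions, which is why terms of this cross-derivative family are candidates for the instability discussed above.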

  5. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at the LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on the trigger, and on achieving maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  6. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulation on its computational and functional roles rather than on anatomical or chemical criteria. We review the main frameworks in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single-cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  7. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The overhead crane system considered consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion of these mechanisms, derived from the Lagrange equation of the second kind, it is possible to build an overhead crane computer model. The computer model was implemented in Matlab. Transients of coordinate, linear speed and motor torque were simulated for the trolley and crane mechanism systems. In addition, transients of payload sway about the vertical axis were obtained. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.
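
    A minimal trolley-plus-swinging-payload instance of such a model, with equations of motion obtainable from the Lagrange equations of the second kind, can be simulated in a few lines (here with SciPy rather than Matlab; the masses, rope length, and force profile are illustrative assumptions, not the paper's parameters).

    ```python
    # Trolley (mass M) driving a payload (mass m) on a rope of length l;
    # standard cart-pendulum equations of motion, integrated with SciPy.
    import numpy as np
    from scipy.integrate import solve_ivp

    M, m, l, g = 50.0, 10.0, 5.0, 9.81      # all values assumed for illustration

    def force(t):
        return 100.0 if t < 3.0 else 0.0    # accelerate the trolley, then coast

    def rhs(t, s):
        x, xd, th, thd = s                  # trolley position/speed, sway angle/rate
        F = force(t)
        xdd = (F + m * np.sin(th) * (g * np.cos(th) + l * thd**2)) \
              / (M + m * np.sin(th)**2)
        thdd = -(xdd * np.cos(th) + g * np.sin(th)) / l
        return [xd, xdd, thd, thdd]

    sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0, 0.0, 0.0], max_step=0.01)
    x, th = sol.y[0], sol.y[2]
    print("trolley travel %.2f m, max sway %.2f deg"
          % (x[-1], np.degrees(np.abs(th)).max()))
    ```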

  8. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  9. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and making more intelligent use of the network.

  10. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  11. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2013-12-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
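
    The temporal-modeling step can be illustrated with the standard forward algorithm: score an observed cue sequence under HMMs representing different trust levels and pick the more likely one. All states, symbols, and probabilities below are invented placeholders, not the learned models from the study.

    ```python
    # Forward-algorithm scoring of a discretized nonverbal-cue sequence under
    # two hypothetical HMMs; the higher likelihood classifies the interaction.
    import numpy as np

    def forward_loglik(obs, pi, A, B):
        """log P(obs | HMM) via the scaled forward algorithm."""
        alpha = pi * B[:, obs[0]]
        loglik = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            loglik += np.log(alpha.sum())
            alpha /= alpha.sum()
        return loglik

    # Two invented 2-state models over 3 cue symbols (e.g. lean, touch, cross).
    pi = np.array([0.6, 0.4])
    A_hi = np.array([[0.8, 0.2], [0.3, 0.7]])
    B_hi = np.array([[0.7, 0.2, 0.1], [0.2, 0.5, 0.3]])
    A_lo = np.array([[0.5, 0.5], [0.5, 0.5]])
    B_lo = np.array([[0.1, 0.3, 0.6], [0.3, 0.3, 0.4]])

    cues = [0, 0, 1, 0, 2, 0]                 # observed cue sequence (placeholder)
    hi = forward_loglik(cues, pi, A_hi, B_hi)
    lo = forward_loglik(cues, pi, A_lo, B_lo)
    print("classified as", "high-trust" if hi > lo else "low-trust")
    ```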

  12. Analysis of Vector Models in Quantification of Artifacts Produced by Standard Prosthetic Inlays in Cone-Beam Computed Tomography (CBCT – a Preliminary Study

    Directory of Open Access Journals (Sweden)

    Ingrid Różyło-Kalinowska

    2014-11-01

    Cone-beam computed tomography (CBCT) is a relatively new but highly efficient imaging method, first applied in dentistry in 1998. However, the quality of the obtained slices depends, among other things, on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large-field-of-view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as Materialise Magics software for the generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). The ceramic inlay with zirconium dioxide (Cera Post) as well as the epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. The carbon fiber inlay did not considerably affect the image quality.

  13. Computed tomography of the chest with model-based iterative reconstruction using a radiation exposure similar to chest X-ray examination: preliminary observations

    Energy Technology Data Exchange (ETDEWEB)

    Neroladaki, Angeliki; Botsikas, Diomidis; Boudabbous, Sana; Becker, Christoph D.; Montet, Xavier [Geneva University Hospital, Department of Radiology, Geneva 4 (Switzerland)]

    2013-02-15

    The purpose of this study was to assess the diagnostic image quality of ultra-low-dose chest computed tomography (ULD-CT) obtained with a radiation dose comparable to chest radiography and reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR) and model-based iterative reconstruction (MBIR), in comparison with standard-dose diagnostic CT (SDD-CT) or low-dose diagnostic CT (LDD-CT) reconstructed with FBP alone. Unenhanced chest CT images of 42 patients acquired with ULD-CT were compared with images obtained with SDD-CT or LDD-CT in the same examination. Noise measurements and image quality, based on the conspicuity of chest lesions on all CT data sets, were assessed on a five-point scale. The radiation dose of ULD-CT was 0.16 ± 0.006 mSv, compared with 11.2 ± 2.7 mSv for SDD-CT (P < 0.0001) and 2.7 ± 0.9 mSv for LDD-CT. Image quality of ULD-CT increased significantly when using MBIR compared with FBP or ASIR (P < 0.001). ULD-CT reconstructed with MBIR enabled detection of as many non-calcified pulmonary nodules as seen on SDD-CT or LDD-CT. However, the image quality of ULD-CT was clearly inferior for the characterisation of ground-glass opacities or emphysema. Model-based iterative reconstruction allows detection of pulmonary nodules with ULD-CT at a radiation exposure in the range of a posterior-anterior (PA) and lateral chest X-ray. (orig.)

  14. Preliminary Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd

    2009-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground based telescope models which include the dome cost will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents preliminary analysis of single and multi-variable space telescope cost models.
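
    A single-variable model of the kind being tested is often a power law in aperture, fit by least squares in log-log space. The sketch below uses invented data points purely to show the mechanics of such a fit, not the 22-telescope NASA dataset.

    ```python
    # Fit a power-law cost model, cost = a * D**b, by ordinary least squares
    # in log-log space. Data points are invented placeholders for illustration.
    import numpy as np

    aperture = np.array([0.4, 0.85, 1.1, 2.4, 3.5])   # m (hypothetical)
    cost = np.array([60, 220, 390, 2500, 4800])       # $M (hypothetical)

    b, log_a = np.polyfit(np.log(aperture), np.log(cost), 1)
    a = np.exp(log_a)
    print("cost ~ %.1f * D^%.2f  ($M, D in m)" % (a, b))
    print("predicted cost for D = 1.5 m: $%.0fM" % (a * 1.5**b))
    ```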

  15. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Chaos Modelling with Computers: Unpredictable Behaviour of Deterministic Systems. Balakrishnan Ramasamy and T S K V Iyer. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 29-39.

  16. Modelling computer networks

    International Nuclear Information System (INIS)

    Max, G

    2011-01-01

    Traffic in computer networks can be described as a complicated system: it shows non-linear features, and the behaviour of such systems is difficult to simulate. Before installing network equipment, users want to know the capability of their computer network; they do not want the servers to be overloaded during temporary traffic peaks when more requests arrive than the server is designed for. As a starting point for our study, a non-linear system model of network traffic is established to examine the behaviour of the planned network. The paper presents the setup of a non-linear simulation model that helps us to observe dataflow problems of the networks. This simple model captures the relationship between the competing traffic and the input and output dataflow. In this paper, we also focus on measuring the bottleneck of the network, which is defined as the difference between the link capacity and the competing traffic volume on the link that limits end-to-end throughput. We validate the model using measurements on a working network. The results show that the initial model estimates the main behaviours and critical parameters of the network well. Based on this study, we propose to develop a new algorithm which experimentally determines and predicts the available parameters of the network modelled.

  17. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10⁶ MIPS. This will be installed at the experiment and will be reused during non-data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10⁶ MIPS) and physics analysis (0.5 × 10⁶ MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  18. Preliminary Computational Analysis of the (HIRENASD) Configuration in Preparation for the Aeroelastic Prediction Workshop

    Science.gov (United States)

    Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.

    2011-01-01

    This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to the HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.

  19. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)]

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  20. Computational multiscale modeling of intergranular cracking

    International Nuclear Information System (INIS)

    Simonovski, Igor; Cizelj, Leon

    2011-01-01

    A novel computational approach for simulation of intergranular cracks in a polycrystalline aggregate is proposed in this paper. The computational model includes a topological model of the experimentally determined microstructure of a 400 μm diameter stainless steel wire and automatic finite element discretization of the grains and grain boundaries. The microstructure was spatially characterized by X-ray diffraction contrast tomography and contains 362 grains and some 1600 grain boundaries. Available constitutive models currently include isotropic elasticity for the grain interior and cohesive behavior with damage for the grain boundaries. The experimentally determined lattice orientations are employed to distinguish between resistant low energy and susceptible high energy grain boundaries in the model. The feasibility and performance of the proposed computational approach is demonstrated by simulating the onset and propagation of intergranular cracking. The preliminary numerical results are outlined and discussed.

  1. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  2. Plasticity modeling & computation

    CERN Document Server

    Borja, Ronaldo I

    2013-01-01

    There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.

  3. Preliminary Study on the Enhancement of Reconstruction Speed for Emission Computed Tomography Using Parallel Processing

    International Nuclear Information System (INIS)

    Park, Min Jae; Lee, Jae Sung; Kim, Soo Mee; Kang, Ji Yeon; Lee, Dong Soo; Park, Kwang Suk

    2009-01-01

    Conventional image reconstruction uses simplified physical models of projection. Reconstruction with realistic physics, for example fully 3-D reconstruction, takes too long to process all the data in the clinic and is infeasible on a common reconstruction machine because complex physical models require large amounts of memory. We suggest a realistic distributed-memory model of fast reconstruction using parallel processing on personal computers to enable such large-scale techniques. Preliminary feasibility tests on virtual machines and various performance tests on the commercial supercomputer Tachyon were performed. The expectation maximization algorithm was tested with common 2-D projections and with realistic 3-D lines of response. Since processing slowed down (by up to 6 times) after a certain number of iterations, compiler optimization was performed to maximize the efficiency of parallelization. Parallel processing of a program on multiple computers was available on Linux with MPICH and NFS. We verified that differences between the parallel-processed image and the single-processed image at the same iterations were below the precision of the floating-point representation, about 6 bits. Two processors showed good parallel computing efficiency (a speedup of 1.96 times). The delay phenomenon was solved by vectorization using SSE. Through the study, a realistic parallel computing system for clinical use was established, able to perform reconstruction with realistic physical models, which are impossible to simplify, by drawing on the ample aggregate memory of multiple machines.
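
    A distributed-memory MLEM sketch in the spirit of this work is shown below using mpi4py (rather than the MPICH-era C setup described): projection rows are partitioned across ranks, each rank computes a partial backprojection, and the multiplicative update is assembled with Allreduce. The tiny random system matrix is a placeholder for a real 3-D line-of-response projector.

    ```python
    # Distributed MLEM update with mpi4py; run e.g.: mpiexec -n 4 python mlem_mpi.py
    # The random system matrix A is a stand-in for a real projector model.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_pix, n_proj = 256, 1024
    rng = np.random.default_rng(42)              # same seed -> same system everywhere
    A = rng.random((n_proj, n_pix)) * (rng.random((n_proj, n_pix)) < 0.05)
    x_true = rng.random(n_pix)
    y = A @ x_true                               # noiseless projection data

    rows = np.array_split(np.arange(n_proj), size)[rank]   # this rank's projections
    A_loc, y_loc = A[rows], y[rows]

    sens_loc = A_loc.sum(axis=0)                 # local part of A^T 1
    sens = np.empty_like(sens_loc)
    comm.Allreduce(sens_loc, sens, op=MPI.SUM)   # global sensitivity image

    x = np.ones(n_pix)
    for _ in range(50):                          # MLEM: x <- x/(A^T 1) * A^T(y/Ax)
        ratio_loc = A_loc.T @ (y_loc / np.maximum(A_loc @ x, 1e-12))
        ratio = np.empty_like(ratio_loc)
        comm.Allreduce(ratio_loc, ratio, op=MPI.SUM)
        x *= ratio / np.maximum(sens, 1e-12)

    if rank == 0:
        err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
        print("relative error: %.3f" % err)
    ```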

  4. Plasma brake model for preliminary mission analysis

    Science.gov (United States)

    Orsini, Leonardo; Niccolai, Lorenzo; Mengali, Giovanni; Quarta, Alessandro A.

    2018-03-01

    Plasma brake is an innovative propellantless propulsion system concept that exploits the Coulomb collisions between a charged tether and the ions in the surrounding environment (typically, the ionosphere) to generate an electrostatic force orthogonal to the tether direction. Previous studies on the plasma brake effect have emphasized the existence of a number of different parameters necessary to obtain an accurate description of the propulsive acceleration from a physical viewpoint. The aim of this work is to discuss an analytical model capable of estimating, with the accuracy required by a preliminary mission analysis, the performance of a spacecraft equipped with a plasma brake in a (near-circular) low Earth orbit. The simplified mathematical model is first validated through numerical simulations, and is then used to evaluate the plasma brake performance in some typical mission scenarios, in order to quantify the influence of the system parameters on the mission performance index.
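    To give a flavor of the kind of estimate such a preliminary analysis produces: for a near-circular orbit with a small tangential braking acceleration f_t, the Gauss variational equation gives da/dt = -2*sqrt(a^3/mu)*f_t, which integrates directly. The braking level and altitudes below are placeholders, not values from the paper.

```python
import numpy as np

MU = 3.986e14   # Earth gravitational parameter [m^3/s^2]
R_E = 6.371e6   # Earth radius [m]

def decay_time(h0_km, hf_km, f_brake, dt=100.0):
    """Seconds to spiral from altitude h0 to hf under a constant tangential
    deceleration f_brake [m/s^2], using da/dt = -2*sqrt(a^3/mu)*f_brake."""
    a = R_E + h0_km * 1e3
    a_final = R_E + hf_km * 1e3
    t = 0.0
    while a > a_final:
        a -= 2.0 * np.sqrt(a**3 / MU) * f_brake * dt
        t += dt
    return t

# e.g. ~1 mN of Coulomb drag on a 100 kg spacecraft -> 1e-5 m/s^2 (placeholder)
print(decay_time(800.0, 400.0, 1e-5) / 86400.0, "days")
```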

  5. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  6. Preliminary Test for Constitutive Models of CAP

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Lee, Keo Hyung; Kim, Min Ki; Lee, Byung Chul [FNC Tech., Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2010-05-15

    The development project for the domestic design code was launched for use in the safety and performance analysis of pressurized light water reactors. As a part of this project, the CAP (Containment Analysis Package) code has been under development for containment safety and performance analysis, side by side with SPACE. The CAP code treats three fields (vapor, continuous liquid and dispersed drop) for the assessment of containment-specific phenomena, and features assessment capabilities in both multi-dimensional and lumped-parameter thermal-hydraulic cells. The thermal-hydraulics solver has been developed and is making significant progress. Implementation of well-proven constitutive models and correlations is essential in order for a containment code to be used for generalized or optimized purposes. Generally, constitutive equations are composed of interfacial and wall transport models and correlations. These equations are included in the source terms of the governing field equations. In order to develop the best model and correlation package for the CAP code, various models currently used in major containment analysis codes, such as GOTHIC, CONTAIN 2.0 and CONTEMPT-LT, were reviewed. Several models and correlations were incorporated for a preliminary test of CAP's performance; the test results and future plans for improving the level of execution are discussed in this paper.
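    For orientation, closures of this kind enter the phasic field equations as source terms; a generic mass-conservation example (an illustrative textbook form, not the actual CAP equations) is

```latex
\frac{\partial (\alpha_k \rho_k)}{\partial t}
  + \nabla \cdot (\alpha_k \rho_k \mathbf{u}_k) = \Gamma_k ,
\qquad
\Gamma_k = \frac{q''_{i,k}\, A_i}{h_{fg}} ,
```

    where alpha_k, rho_k and u_k are the volume fraction, density and velocity of field k, and the interfacial mass-transfer rate Gamma_k is supplied by a constitutive correlation (here a typical heat-flux-based evaporation/condensation closure with interfacial area density A_i and latent heat h_fg).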

  7. Preliminary Test for Constitutive Models of CAP

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Lee, Keo Hyung; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The development project for the domestic design code was launched for use in the safety and performance analysis of pressurized light water reactors. As a part of this project, the CAP (Containment Analysis Package) code has been under development for containment safety and performance analysis, side by side with SPACE. The CAP code treats three fields (vapor, continuous liquid and dispersed drop) for the assessment of containment-specific phenomena, and features assessment capabilities in both multi-dimensional and lumped-parameter thermal-hydraulic cells. The thermal-hydraulics solver has been developed and is making significant progress. Implementation of well-proven constitutive models and correlations is essential in order for a containment code to be used for generalized or optimized purposes. Generally, constitutive equations are composed of interfacial and wall transport models and correlations. These equations are included in the source terms of the governing field equations. In order to develop the best model and correlation package for the CAP code, various models currently used in major containment analysis codes, such as GOTHIC, CONTAIN 2.0 and CONTEMPT-LT, were reviewed. Several models and correlations were incorporated for a preliminary test of CAP's performance; the test results and future plans for improving the level of execution are discussed in this paper.

  8. Introducing handheld computing into a residency program: preliminary results from qualitative and quantitative inquiry.

    OpenAIRE

    Manning, B.; Gadd, C. S.

    2001-01-01

    Although published reports describe specific handheld computer applications in medical training, we know very little yet about how, and how well, handheld computing fits into the spectrum of information resources available for patient care and physician training. This paper reports preliminary quantitative and qualitative results from an evaluation study designed to track changes in computer usage patterns and computer-related attitudes before and after introduction of handheld computing. Pre...

  9. A physicist's model of computation

    International Nuclear Information System (INIS)

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs

  10. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics and scale-coupling methods.

  11. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice

    International Nuclear Information System (INIS)

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Amedee, Joelle; Fricain, Jean-Christophe; Catros, Sylvain; Miraux, Sylvain

    2010-01-01

    We present the first attempt to apply bioprinting technologies from the perspective of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical-size bone defects were created in OF-1 male mouse calvaria with a 4 mm diameter trephine. Prior to the laser printing experiments, the absence of inflammation due to laser irradiation of the mouse dura mater was demonstrated by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove helpful in the future for medical robotics and computer-assisted medical interventions.

  12. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice

    Energy Technology Data Exchange (ETDEWEB)

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Amedee, Joelle; Fricain, Jean-Christophe; Catros, Sylvain [INSERM, U577, Bordeaux, F-33076 (France) and Universite Victor Segalen Bordeaux 2, UMR-S577 Bordeaux, F-33076 (France); Miraux, Sylvain [Centre de Resonance Magnetique des Systemes Biologiques, UMR 5536 (France)

    2010-03-15

    We present the first attempt to apply bioprinting technologies from the perspective of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical-size bone defects were created in OF-1 male mouse calvaria with a 4 mm diameter trephine. Prior to the laser printing experiments, the absence of inflammation due to laser irradiation of the mouse dura mater was demonstrated by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove helpful in the future for medical robotics and computer-assisted medical interventions.

  13. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  14. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  15. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals-based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working … development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines … In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed-bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse …

  16. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    International Nuclear Information System (INIS)

    Fletcher, C.D.

    1986-01-01

    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady-state full-power conditions and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes. 6 refs., 17 figs., 1 tab

  17. RFQ modeling computer program

    International Nuclear Information System (INIS)

    Potter, J.M.

    1985-01-01

    The mathematical background for a multiport-network-solving program is described. A method for accurately numerically modeling an arbitrary, continuous, multiport transmission line is discussed. A modification to the transmission-line equations to accommodate multiple rf drives is presented. An improved model for the radio-frequency quadrupole (RFQ) accelerator that corrects previous errors is given. This model permits treating the RFQ as a true eight-port network for simplicity in interpreting the field distribution and ensures that all modes propagate at the same velocity in the high-frequency limit. The flexibility of the multiport model is illustrated by simple modifications to otherwise two-dimensional systems that permit modeling them as linear chains of multiport networks
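    The core idea, reduced to its two-port analogue, can be sketched as below (our illustration, not the paper's code): discretize the continuous line into short segments, represent each segment by an ABCD chain matrix, and cascade the segments by matrix multiplication; the paper's eight-port RFQ model generalizes this to larger port counts.

```python
import numpy as np

def segment_abcd(beta, z0, dz):
    """ABCD matrix of a lossless line segment of electrical length beta*dz
    and characteristic impedance z0."""
    th = beta * dz
    return np.array([[np.cos(th), 1j * z0 * np.sin(th)],
                     [1j * np.sin(th) / z0, np.cos(th)]])

def cascade(beta_profile, z0_profile, dz):
    """Chain a nonuniform line from per-segment parameters."""
    M = np.eye(2, dtype=complex)
    for beta, z0 in zip(beta_profile, z0_profile):
        M = M @ segment_abcd(beta, z0, dz)
    return M
```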

  18. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based … universities, and later did system analysis, … personal computers (PC) and low-cost software packages and tools. They can serve as useful learning experience through student projects. Models are … Let us consider a numerical example: to calculate the velocity of a trainer aircraft …

  19. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  20. Computational modelling in fluid mechanics

    International Nuclear Information System (INIS)

    Hauguel, A.

    1985-01-01

    The modelling of the greatest part of environmental or industrial flow problems gives very similar types of equations. The considerable increase in computing capacity over the last ten years consequently allowed numerical models of growing complexity to be processed. The varied group of computer codes presented are now a complementary tool of experimental facilities to achieve studies in the field of fluid mechanics. Several codes applied in the nuclear field (reactors, cooling towers, exchangers, plumes...) are presented among others [fr

  1. The Architectural Designs of a Nanoscale Computing Model

    Directory of Open Access Journals (Sweden)

    Mary M. Eshaghian-Wilner

    2004-08-01

    A generic nanoscale computing model is presented in this paper. The model consists of a collection of fully interconnected nanoscale computing modules, where each module is a cube of cells made out of quantum dots, spins, or molecules. The cells dynamically switch between two states by quantum interactions among their neighbors in all three dimensions. This paper includes a brief introduction to the field of nanotechnology from a computing point of view and presents a set of preliminary architectural designs for fabricating the nanoscale model studied.
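    A toy classical stand-in (ours, with an arbitrary rule and grid size) for the neighbor-driven two-state switching described above: a 3D grid of binary cells updated synchronously by a majority rule over the six face neighbors.

```python
import numpy as np

def step(grid):
    """One synchronous update of a two-state 3D cellular automaton: a cell
    becomes 1 when a majority (>= 4) of its 6 face neighbors are 1."""
    nbrs = sum(np.roll(grid, shift, axis)
               for axis in range(3) for shift in (1, -1))
    return (nbrs > 3).astype(np.int8)   # periodic boundaries via np.roll

rng = np.random.default_rng(0)
grid = rng.integers(0, 2, size=(16, 16, 16), dtype=np.int8)
for _ in range(10):
    grid = step(grid)
```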

  2. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Chaos is one of the major scientific discoveries of our times. In fact many scientists … But there are other natural phenomena that are not predictable though … characteristics of chaos. … The position and velocity are all that are needed to determine the motion of a … a system of equations that modelled the earth's weather …

  3. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years, and interest in the area is continually growing; it is expected to develop further in the near future.

  4. Computer model for ductile fracture

    International Nuclear Information System (INIS)

    Moran, B.; Reaugh, J. E.

    1979-01-01

    A computer model is described for predicting ductile fracture initiation and propagation. The computer fracture model is calibrated by simple and notched round-bar tension tests and a precracked compact tension test. The model is used to predict fracture initiation and propagation in a Charpy specimen, and the results are compared with experiments. The calibrated model provides a correlation between Charpy V-notch (CVN) fracture energy and any measure of fracture toughness, such as J/sub Ic/. A second, simpler empirical correlation was obtained using the energy to initiate fracture in the Charpy specimen rather than the total CVN energy, and the results were compared with the empirical correlation of Rolfe and Novak.

  5. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  6. Preliminary Study on Hybrid Computational Phantom for Radiation Dosimetry Based on Subdivision Surface

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Choi, Sang Hyoun; Cho, Sung Koo; Kim, Chan Hyeong

    2007-01-01

    The anthropomorphic computational phantoms are classified into two groups. One group is the stylized phantoms, or MIRD phantoms, which are based on mathematical representations of the anatomical structures. The shapes and positions of the organs and tissues in these phantoms can be adjusted by changing the coefficients of the equations in use. The other group is the voxel phantoms, which are based on tomographic images of a real person, such as CT, MR and serially sectioned color slice images from a cadaver. Obviously, the voxel phantoms represent the anatomical structures of a human body much more realistically than the stylized phantoms. A realistic representation of anatomical structure is very important for an accurate calculation of radiation dose in the human body. Consequently, the ICRP recently decided to use the voxel phantoms for the forthcoming update of the dose conversion coefficients. However, the voxel phantoms also have some limitations: (1) the topology and dimensions of the organs and tissues in a voxel model are extremely difficult to change, and (2) thin organs, such as oral mucosa and skin, cannot be realistically modeled unless the voxel resolution is prohibitively high. Recently, a new approach has been implemented by several investigators. The investigators converted their voxel phantoms to hybrid computational phantoms based on NURBS (Non-Uniform Rational B-Splines) surfaces, which are smooth and deformable. It is claimed that these new phantoms have the flexibility of the stylized phantoms along with the realistic representations of the anatomical structures. The topology and dimensions of the anatomical structures can be easily changed as necessary. Thin organs can be modeled without affecting computational speed or memory requirements. The hybrid phantoms can also be used for 4-D Monte Carlo simulations. In this preliminary study, the external shape of a voxel phantom (i.e., skin), HDRK-Man, was converted to a hybrid computational phantom.

  7. Electromagnetic Physics Models for Parallel Computing Architectures

    International Nuclear Information System (INIS)

    Amadio, G; Bianchini, C; Iope, R; Ananya, A; Apostolakis, J; Aurora, A; Bandieramonte, M; Brun, R; Carminati, F; Gheata, A; Gheata, M; Goulas, I; Nikitina, T; Bhattacharyya, A; Mohanty, A; Canal, P; Elvira, D; Jun, S Y; Lima, G; Duhem, L

    2016-01-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next-generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of a preliminary performance evaluation and physics validation are presented as well. (paper)

  8. Electromagnetic Physics Models for Parallel Computing Architectures

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next-generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of a preliminary performance evaluation and physics validation are presented as well.
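    The restructuring such projects rely on can be illustrated with a structure-of-arrays "basket" of tracks, where a single update touches all tracks at once and maps naturally onto SIMD lanes; numpy stands in for explicit vector intrinsics here, and the stopping-power formula is a placeholder, not GeantV physics.

```python
import numpy as np

# Structure-of-arrays track basket: one array per field, not one object per track.
n = 1024
energy = np.random.uniform(1.0, 100.0, n)   # MeV (placeholder values)
step   = np.random.uniform(0.01, 0.1, n)    # cm

def dedx(E):
    # Placeholder stopping-power model, not a physics-accurate formula.
    return 2.0 / np.sqrt(E)

energy -= dedx(energy) * step               # whole basket updated at once
np.clip(energy, 0.0, None, out=energy)      # energies cannot go below zero
```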

  9. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews the creation of a preliminary multi-variable cost model for the contract costs of making a space telescope. There is discussion of the methodology for collecting the data, the definition of the statistical analysis methodology, single-variable model results, the testing of historical models, and an introduction to the multi-variable models.
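    A common form for such parametric models is a power law fitted in log space; the sketch below (assumed functional form and made-up numbers, not the presentation's data or model) shows the mechanics of estimating a two-variable cost estimating relationship.

```python
import numpy as np

# Placeholder data: cost = c * aperture^b1 * mass^b2, fitted by OLS in log space.
aperture = np.array([0.85, 2.4, 0.3, 1.1])         # m   (made-up)
mass     = np.array([1500., 11000., 300., 2000.])  # kg  (made-up)
cost     = np.array([500., 4500., 90., 700.])      # $M  (made-up)

X = np.column_stack([np.ones_like(aperture), np.log(aperture), np.log(mass)])
coef, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
log_c, b1, b2 = coef
print(f"cost ~ {np.exp(log_c):.2f} * D^{b1:.2f} * M^{b2:.2f}")
```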

  10. Trust models in ubiquitous computing.

    Science.gov (United States)

    Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro

    2008-10-28

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  11. Ch. 33 Modeling: Computational Thermodynamics

    International Nuclear Information System (INIS)

    Besmann, Theodore M.

    2012-01-01

    This chapter considers methods and techniques for computational modeling for nuclear materials with a focus on fuels. The basic concepts for chemical thermodynamics are described and various current models for complex crystalline and liquid phases are illustrated. Also included are descriptions of available databases for use in chemical thermodynamic studies and commercial codes for performing complex equilibrium calculations.

  12. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling Deterministic Systems. N K Srinivasan. General Article, Volume 6, Issue 3, March 2001, pp. 46-54.

  13. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Results of numerical modeling of dynamic problems are summarized in this article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.

  14. Preliminary results of steel containment vessel model test

    International Nuclear Information System (INIS)

    Matsumoto, T.; Komine, K.; Arai, S.

    1997-01-01

    A high pressure test of a mixed-scaled model (1:10 in geometry and 1:4 in shell thickness) of a steel containment vessel (SCV), representing an improved boiling water reactor (BWR) Mark II containment, was conducted on December 11-12, 1996 at Sandia National Laboratories. This paper describes the preliminary results of the high pressure test. In addition, the preliminary post-test measurement data and the preliminary comparison of test data with pretest analysis predictions are also presented

  15. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameters within a bounded environment, which allows controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window into the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  16. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  17. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer.

  18. Computational model of a whole tree combustor

    Energy Technology Data Exchange (ETDEWEB)

    Bryden, K.M.; Ragland, K.W. [Univ. of Wisconsin, Madison, WI (United States)

    1993-12-31

    A preliminary computational model has been developed for the whole tree combustor and compared to test results. In the simulation model presented, hardwood logs 15 cm in diameter are burned in a 4 m deep fuel bed. Solid and gas temperature, solid and gas velocity, CO, CO{sub 2}, H{sub 2}O, HC and O{sub 2} profiles are calculated. This deep, fixed-bed combustor obtains high energy release rates per unit area due to the high inlet air velocity and extended reaction zone. The lowest portion of the overall bed is an oxidizing region and the remainder of the bed acts as a gasification and drying region. The overfire air region completes the combustion. Approximately 40% of the energy is released in the lower oxidizing region. The wood consumption rate obtained from the computational model is 4,110 kg/m{sup 2}-hr, which matches well the consumption rate of 3,770 kg/m{sup 2}-hr observed during the peak test period of the Aurora, MN test. The predicted heat release rate is 16 MW/m{sup 2} (5.0*10{sup 6} Btu/hr-ft{sup 2}).
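    The quoted heat release rate can be checked by direct unit conversion (1 W = 3.412 Btu/hr, 1 m² = 10.764 ft²):

```latex
16\ \mathrm{MW/m^2}
  = 16\times 10^{6}\ \mathrm{W/m^2}
    \times \frac{3.412\ \mathrm{Btu/hr}}{1\ \mathrm{W}}
    \times \frac{1\ \mathrm{m^2}}{10.764\ \mathrm{ft^2}}
  \approx 5.1\times 10^{6}\ \mathrm{Btu/(hr\cdot ft^2)} ,
```

    in agreement with the stated 5.0*10{sup 6} Btu/hr-ft{sup 2}.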

  19. Getting computer models to communicate

    International Nuclear Information System (INIS)

    Caremoli, Ch.; Erhard, P.

    1999-01-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes, such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution: to couple the existing validated numerical models together so that they work as one. (authors)

  20. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  1. Computer content analysis of schizophrenic speech: a preliminary report.

    Science.gov (United States)

    Tucker, G J; Rosenberg, S D

    1975-06-01

    Computer analysis significantly differentiated the thematic content of the free speech of 10 schizophrenic patients from that of 10 nonschizophrenic patients and from the content of transcripts of dream material from 10 normal subjects. Schizophrenic patients used the thematic categories in factor 1 (the "schizophrenic factor") 3 times more frequently than the nonschizophrenics and 10 times more frequently than the normal subjects (p < .01). In general, the language content of the schizophrenic patients mirrored an almost agitated attempt to locate oneself in time and space and to defend against internal discomfort and confusion. The authors discuss the implications of this study for future research.

  2. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Li Chuanfu; Zhou Kangyuan

    2006-01-01

    Objective: To study the variability of normal brain volume with sex and age, and to put forward an objective standard for computer-based automatic quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with the newly developed algorithm for automatic quantification of brain atrophy. Using polynomial curve fitting, the mathematical relationship of BPF with age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm^3, (1 211 725 ± 122 077) mm^3 and (95.3471 ± 2.3453)%, respectively, and those of atrophy subjects were (1 276 900 ± 125 180) mm^3, (1 203 400 ± 117 760) mm^3 and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was extremely significant (P < 0.05). The expression P(x)=-0.0008x^2 + 0.0193x + 96.9999 accurately describes the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y=-0.0008x^2 + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the mathematical relationship between BPF and age can be used as an objective criterion for computer-based automatic quantification of brain atrophy. (authors)
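    The reported criterion can be applied directly; the sketch below transcribes the fitted polynomial and its 95% CI lower limit from the abstract (the function names are ours).

```python
def bpf_percent(brain_volume_mm3, cranial_volume_mm3):
    """Brain parenchymal fraction in percent."""
    return 100.0 * brain_volume_mm3 / cranial_volume_mm3

def normal_bpf(age):
    """Fitted normal BPF-age curve from the abstract."""
    return -0.0008 * age**2 + 0.0193 * age + 96.9999

def lower_limit_bpf(age):
    """Lower limit of the 95% CI of the normal curve (from the abstract)."""
    return -0.0008 * age**2 + 0.0184 * age + 95.1090

def suggests_atrophy(brain_volume_mm3, cranial_volume_mm3, age):
    """Proposed objective criterion: BPF below the age-specific lower limit."""
    return bpf_percent(brain_volume_mm3, cranial_volume_mm3) < lower_limit_bpf(age)
```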

  3. Dynamic computed tomography scanning of benign bone lesions: Preliminary results

    International Nuclear Information System (INIS)

    Levine, E.; Neff, J.R.

    1983-01-01

    The majority of benign bone lesions can be evaluated adequately using conventional radiologic techniques. However, it is not always possible to differentiate reliably between different types of benign bone lesions on the basis of plain film appearances alone. Dynamic computed tomography (CT) scanning provides a means for further characterizing such lesions by assessing their degree of vascularity. Thus, it may help in distinguishing an osteoid osteoma, which has a hypervascular nidus, from a Brodie's abscess, which is avascular. Dynamic CT scanning may also help in the differentiation between a fluid-containing simple bone cyst, which is avascular, and other solid or semi-solid benign bone lesions, which show varying degrees of vascularity. However, because of the additional irradiation involved, dynamic CT scanning should be reserved for evaluation of selected patients with benign bone lesions in whom the plain film findings are not definitive and in whom the CT findings may have a significant influence on management. (orig.)

  4. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class, where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language, which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale required for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  5. preliminary multidomain modelling and simulation study

    African Journals Online (AJOL)

    … rad/s at a height of 10 m. At this angular … exerted on the rotor, the power train and the nacelle into the ground. … Thesis, Department of Electrical and Computer Engineering … "studies", International Conference on Power Systems Transients …

  6. Cosmic logic: a computational model

    International Nuclear Information System (INIS)

    Vanchurin, Vitaly

    2016-01-01

    We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape, and CM machines take a CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO Turing numbers as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal, which halt in finite time, and immortal, which run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps

  7. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
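    For reference, the second-order maximum-noise-entropy model for a binary output takes the logistic form (notation ours)

```latex
P(\mathrm{spike}\mid \mathbf{x})
  = \frac{1}{1 + \exp\!\left[-\left(a + \mathbf{h}^{\top}\mathbf{x}
        + \mathbf{x}^{\top}\mathbf{J}\,\mathbf{x}\right)\right]},
```

    where the scalar a, vector h and matrix J are fixed by the constraints on the first- and second-order input/output moments.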

  8. Krypton for computed tomography lung ventilation imaging: preliminary animal data.

    Science.gov (United States)

    Mahnken, Andreas H; Jost, Gregor; Pietsch, Hubertus

    2015-05-01

    The objective of this study was to assess the feasibility and safety of krypton ventilation imaging with intraindividual comparison to xenon ventilation computed tomography (CT). In a first step, attenuation of different concentrations of xenon and krypton was analyzed in a phantom setting. Thereafter, 7 male New Zealand white rabbits (4.4-6.0 kg) were included in an animal study. After orotracheal intubation, an unenhanced CT scan was obtained in end-inspiratory breath-hold. Thereafter, xenon- (30%) and krypton-enhanced (70%) ventilation CT was performed in random order. After a 2-minute wash-in of gas A, CT imaging was performed. After a 45-minute wash-out period and another 2-minute wash-in of gas B, another CT scan was performed using the same scan protocol. Heart rate and oxygen saturation were measured. Unenhanced and krypton or xenon data were registered and subtracted using a nonrigid image registration tool. Enhancement was quantified and statistically analyzed. One animal had to be excluded from data analysis owing to problems during intubation. The CT scans in the remaining 6 animals were completed without complications. There were no relevant differences in oxygen saturation or heart rate between the scans. Xenon resulted in a mean increase of enhancement of 35.3 ± 5.5 HU, whereas krypton achieved a mean increase of 21.9 ± 1.8 HU in enhancement (P = 0.0055). The use of krypton for lung ventilation imaging appears to be feasible and safe. Despite the use of a markedly higher concentration of krypton, enhancement is significantly worse when compared with xenon CT ventilation imaging, but sufficiently high for CT ventilation imaging studies.

  9. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply do not exist. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of rock failure.
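    For reference, one common form of the pressure-dependent yield condition in question (standard Drucker-Prager notation, not specific to the authors' setup) is

```latex
F(\boldsymbol{\sigma}) = \sqrt{J_2} + \alpha\, I_1 - k \le 0 ,
```

    where I_1 is the first stress invariant, J_2 the second invariant of the deviatoric stress, and alpha and k derive from the friction angle and cohesion; coupling such a law with exact incompressibility is what produces the ill-posedness discussed above.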

  10. Preliminary Multivariable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multi-variable space telescope cost model. The validity of the previously published models is tested, cost-estimating relationships that are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  11. Synthesis, Preliminary Bioevaluation and Computational Analysis of Caffeic Acid Analogues

    Directory of Open Access Journals (Sweden)

    Zhiqian Liu

    2014-05-01

    A series of caffeic acid amides were designed, synthesized and evaluated for anti-inflammatory activity. Most of them exhibited promising anti-inflammatory activity against nitric oxide (NO) generation in murine macrophage RAW264.7 cells. A 3D pharmacophore model was created based on the biological results for further structural optimization. Moreover, prediction of the potential targets was also carried out with the PharmMapper server. These amide analogues represent a promising class of anti-inflammatory scaffolds for further exploration and target identification.

  12. V and V Efforts of Auroral Precipitation Models: Preliminary Results

    Science.gov (United States)

    Zheng, Yihua; Kuznetsova, Masha; Rastaetter, Lutz; Hesse, Michael

    2011-01-01

    Auroral precipitation models have been valuable both in terms of space weather applications and space science research. Yet very limited testing has been performed regarding model performance. A variety of auroral models are available, including empirical models that are parameterized by geomagnetic indices or upstream solar wind conditions, nowcasting models that are based on satellite observations, and those derived from physics-based, coupled global models. In this presentation, we will show our preliminary results regarding V&V efforts for some of the models.

  13. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  14. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  15. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used on any platform, in any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  16. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  17. [Efficiency of computer-based documentation in long-term care--preliminary project].

    Science.gov (United States)

    Lüngen, Markus; Gerber, Andreas; Rupprecht, Christoph; Lauterbach, Karl W

    2008-06-01

    In Germany the documentation of processes in long-term care is mainly paper-based. Planning, realization and evaluation are not supported in an optimal way. In a preliminary study we evaluated the consequences of the introduction of a computer-based documentation system using handheld devices. We interviewed 16 persons before and after introducing the computer-based documentation and assessed costs for the documentation process and administration. The results show that reducing costs is likely. The job satisfaction of the personnel increased, more time could be spent for caring for the residents. We suggest further research to reach conclusive results.

  18. Preliminary report on electromagnetic model studies

    Science.gov (United States)

    Frischknecht, F.C.; Mangan, G.B.

    1960-01-01

    More than 70 response curves for various models have been obtained using the slingram and turam electromagnetic methods. Results show that for the slingram method, horizontal co-planar coils are usually more sensitive than vertical co-axial or vertical co-planar coils. The shape of the anomaly is usually simpler for the vertical coils.

  19. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17-19, 2015. The book collects innovations in the broad research areas of computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing the design, analysis and modeling of these key areas.

  20. Computer modeling of liquid crystals

    International Nuclear Information System (INIS)

    Al-Barwani, M.S.

    1999-01-01

    In this thesis, we investigate several aspects of the behaviour of liquid crystal molecules near interfaces using computer simulation. We briefly discuss experimental, theoretical and computer simulation studies of some of the liquid crystal interfaces. We then describe three essentially independent research topics. The first of these concerns extensive simulations of a liquid crystal formed by long flexible molecules. We examined the bulk behaviour of the model and its structure, and studies of a film of smectic liquid crystal surrounded by vapour were also carried out. Extensive simulations were also done for a long-molecule/short-molecule mixture, and studies were then carried out to investigate the liquid-vapour interface of the mixture. Next, we report the results of large-scale simulations of soft spherocylinders of two different lengths. We examined the bulk coexistence of the nematic and isotropic phases of the model. Once the bulk coexistence behaviour was known, properties of the nematic-isotropic interface were investigated. This was done by fitting order parameter and density profiles to appropriate mathematical functions and calculating the biaxial order parameter. We briefly discuss the ordering at the interfaces and make attempts to calculate the surface tension. Finally, in our third project, we study the effects of different surface topographies on creating bistable nematic liquid crystal devices. This was carried out using a model based on the discretisation of the free energy on a lattice. We use simulation to find the lowest-energy states and investigate whether they are degenerate in energy. We also test our model by studying the Frederiks transition and comparing with analytical and other simulation results. (author)

  1. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  2. Preliminary Development of Regulatory PSA Models for SFR

    International Nuclear Information System (INIS)

    Choi, Yong Won; Shin, Andong; Bae, Moohoon; Suh, Namduk; Lee, Yong Suk

    2013-01-01

    Well-developed PRA methodology exists for LWRs (Light Water Reactors) and PHWRs (Pressurized Heavy Water Reactors). Since KAERI is developing a prototype SFR with the target of applying for a license by 2017, KINS needs PRA models to assess the safety of this prototype reactor. The purpose of this study is to develop regulatory PSA models for the independent verification of SFR safety. Since the design of the prototype SFR is not yet mature, we have developed preliminary models based on the data of KAERI's previous SFR design. In this study, the preliminary initiating events of level 1 internal events for the SFR were selected through reviews of existing PRA models (LWR, PRISM, ASTRID and KALIMER-600). Then, the event tree for each selected initiating event was developed. The regulatory PRA models developed for the SFR are preliminary in the sense that the prototype SFR design is not yet mature or available. Still, they might be utilized in the forthcoming licensing review for assessing the risk of safety issues and the configuration control of the design

  3. Modelling Extortion Racket Systems: Preliminary Results

    Science.gov (United States)

    Nardin, Luis G.; Andrighetto, Giulia; Székely, Áron; Conte, Rosaria

    Mafias are highly powerful and deeply entrenched organised criminal groups that cause both economic and social damage. Overcoming, or at least limiting, their harmful effects is a societally beneficial objective, which makes understanding their dynamics an objective of both scientific and political interest. We propose an agent-based simulation model aimed at understanding how the independent and combined effects of legal and social norm-based processes help to counter mafias. Our results show that legal processes are effective in directly countering mafias by reducing their activities and changing the behaviour of the rest of the population, yet they are not able to change people's mind-set, which renders the change fragile. When combined with social norm-based processes, however, people's mind-set shifts towards a culture of legality, rendering the observed behaviour resilient to change.

  4. Preliminary Multi-Variable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. This paper reviews the methodology used to develop space telescope cost models; summarizes recently published single variable models; and presents preliminary results for two and three variable cost models. Some of the findings are that increasing mass reduces cost; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and technology development as a function of time reduces cost at the rate of 50% per 17 years.
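
    As a worked reading of the time-dependence finding above (a 50% cost reduction every 17 years), the technology term of such a model acts as a halving multiplier; this is an illustration of the quoted rate, not the authors' exact parametric form:

        C(t) = C_0 \cdot 2^{-(t - t_0)/(17\,\mathrm{yr})}, \qquad t - t_0 = 34~\mathrm{yr} \;\Rightarrow\; C = C_0 / 4.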

  5. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before the sensitivity analysis can be performed. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows the sensitivity indices of each scalar model input to be estimated, while the 'dispersion model' allows the total sensitivity index of the functional model inputs to be derived. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
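
    As an illustration of the variance-based indices discussed above, the following is a minimal sketch of first-order Sobol' indices estimated with a pick-freeze (Saltelli-type) scheme on a toy function; the function, sample size, and input distributions are assumptions for illustration, not the industrial code or the joint GLM/GAM metamodel of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def f(x):
            # Toy model with three independent uniform inputs on [0, 1]
            return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

        n, d = 100_000, 3
        A = rng.uniform(size=(n, d))
        B = rng.uniform(size=(n, d))
        yA = f(A)

        for i in range(d):
            AB = B.copy()
            AB[:, i] = A[:, i]   # "freeze" input i at the values used in sample A
            # First-order index: fraction of output variance explained by input i
            S_i = np.mean(yA * (f(AB) - f(B))) / yA.var()
            print(f"S_{i} ~= {S_i:.3f}")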

  6. Sensitivity Measurement of Transmission Computer Tomography: the Preliminary Experimental Study

    International Nuclear Information System (INIS)

    Widodo, Chomsin-S; Sudjatmoko; Kusminarto; Agung-BS Utomo; Suparta, Gede B

    2000-01-01

    This paper reports the results of a preliminary experimental study on a measurement method for the sensitivity of a computed tomography (CT) scanner. A CT scanner has been built at the Department of Physics, FMIPA UGM, and its performance was measured in terms of its sensitivity. The results showed that the measurement method for sensitivity may be developed further as a measurement standard. Although the CT scanner developed has a number of shortcomings, the analytical results from the sensitivity measurement suggest a number of repairs and improvements for the system so that improved reconstructed CT images can be obtained. (author)

  7. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  8. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  9. Biologically Inspired Visual Model With Preliminary Cognition and Active Attention Adjustment.

    Science.gov (United States)

    Qiao, Hong; Xi, Xuanyang; Li, Yinlin; Wu, Wei; Li, Fengfu

    2015-11-01

    Recently, many computational models have been proposed to simulate the visual cognition process. For example, the hierarchical Max-Pooling (HMAX) model was proposed according to the hierarchical and bottom-up structure of V1 to V4 in the ventral pathway of the primate visual cortex, which could achieve position- and scale-tolerant recognition. In our previous work, we introduced memory and association into the HMAX model to simulate the visual cognition process. In this paper, we improve our theoretical framework by mimicking a more elaborate structure and function of the primate visual cortex. We mainly focus on the new formation of memory and association in visual processing under different circumstances, as well as preliminary cognition and active adjustment in the inferior temporal cortex, which are absent in the HMAX model. The main contributions of this paper are: 1) in the memory and association part, we apply deep convolutional neural networks to extract various episodic features of the objects, since people use different features for object recognition. Moreover, to achieve fast and robust recognition in the retrieval and association process, different types of features are stored in separate clusters and the feature binding of the same object is stimulated in a loop-discharge manner; and 2) in the preliminary cognition and active adjustment part, we introduce preliminary cognition to classify different types of objects, since distinct neural circuits in the human brain are used for the identification of various types of objects. Furthermore, active cognition adjustment for occlusion and orientation is implemented in the model to mimic the top-down effect in the human cognition process. Finally, our model is evaluated on two face databases, CAS-PEAL-R1 and AR. The results demonstrate that our model exhibits efficiency in the visual recognition process, with much lower memory storage requirements and better performance compared with traditional purely computational...

  10. Computer modeling of the gyrocon

    International Nuclear Information System (INIS)

    Tallerico, P.J.; Rankin, J.E.

    1979-01-01

    A gyrocon computer model is discussed in which the electron beam is followed from the gun output to the collector region. The initial beam may be selected either as a uniform circular beam or may be taken from the output of an electron gun simulated by the program of William Herrmannsfeldt. The fully relativistic equations of motion are then integrated numerically to follow the beam successively through a drift tunnel, a cylindrical rf beam deflection cavity, a combination drift space and magnetic bender region, and an output rf cavity. The parameters for each region are variable input data from a control file. The program calculates power losses in the cavity wall, power required by beam loading, power transferred from the beam to the output cavity fields, and electronic and overall efficiency. Space-charge effects are approximated if selected. Graphical displays of beam motions are produced. We discuss the Los Alamos Scientific Laboratory (LASL) prototype design as an example of code usage. The design shows a gyrocon of about two-thirds megawatt output at 450 MHz with up to 86% overall efficiency

  11. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  12. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  13. Modeling and Testing of EVs - Preliminary Study and Laboratory Development

    DEFF Research Database (Denmark)

    Yang, Guang-Ya; Marra, Francesco; Nielsen, Arne Hejde

    2010-01-01

    Electric vehicles (EVs) are expected to play a key role in the future energy management system, stabilizing both supply and consumption in the presence of a high penetration of renewable generation. A reasonably accurate battery model is a key element for the study of EV behavior and grid impact in different geographical areas, as well as of driving and charging patterns. An electric circuit model is deployed in this work to represent the electrical properties of a lithium-ion battery. This paper reports the preliminary modeling and validation work based on manufacturer data sheets and realistic tests, followed by suggestions towards a feasible battery model for further studies.
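
    For concreteness, the following is a minimal sketch of a first-order Thevenin equivalent-circuit battery model of the kind referred to above; all parameter values (R0, R1, C1, the linear open-circuit-voltage curve, the load current) are invented for illustration and are not the validated model of the paper.

        # Illustrative first-order Thevenin parameters (assumed, not from the paper)
        R0, R1, C1 = 0.05, 0.02, 2000.0      # series resistance [ohm], RC pair [ohm, F]
        Q = 40.0 * 3600.0                    # capacity: 40 Ah in coulombs

        def ocv(soc):
            # Crude linear open-circuit-voltage curve (assumption)
            return 3.0 + 1.2 * soc

        dt, current = 1.0, 10.0              # 1 s steps, constant 10 A discharge
        soc, v1 = 0.9, 0.0
        for _ in range(3600):                # simulate one hour
            v1 += dt * (-v1 / (R1 * C1) + current / C1)   # RC-branch dynamics
            soc -= dt * current / Q
        v_term = ocv(soc) - current * R0 - v1
        print(f"after 1 h: SOC = {soc:.3f}, terminal voltage = {v_term:.3f} V")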

  14. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  15. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Based on the fact that a computer can be infected by both infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus capability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spread of the computer virus on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computer population disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, this model has a unique viral equilibrium P*, which means that the virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
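
    The threshold behaviour described above can be illustrated with a deliberately simplified SIS-type sketch (susceptible and infected fractions only); the rates beta and gamma are invented and the equations are a simplification rather than the authors' model, but the R0 threshold logic is the same.

        import numpy as np
        from scipy.integrate import odeint

        beta, gamma = 0.5, 0.2               # illustrative infection / cure rates

        def rhs(y, t):
            s, i = y
            flow = beta * s * i - gamma * i  # net flow from susceptible to infected
            return [-flow, flow]

        t = np.linspace(0.0, 200.0, 2001)
        i_final = odeint(rhs, [0.99, 0.01], t)[-1, 1]

        R0 = beta / gamma                    # threshold of this toy model
        print(f"R0 = {R0:.2f}; endemic level: simulated {i_final:.3f}, "
              f"theory {max(0.0, 1.0 - 1.0 / R0):.3f}")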

  16. Cognitive impairment and computer tomography image in patients with arterial hypertension -preliminary results

    International Nuclear Information System (INIS)

    Yaneva-Sirakova, T.; Tarnovska-Kadreva, R.; Traykov, L.; Zlatareva, D.

    2012-01-01

    Arterial hypertension is the leading risk factor for cognitive impairment, but impairment develops only in some of the patients with poor control. On the other hand, not all of the patients with white matter changes have cognitive deficits. There may be a variety of reasons for this: the accuracy of methods for blood pressure measurement, the specific brain localization, or some other reason. Here are the preliminary results of a study on the potential correlation between self-measured, office- and ambulatory-monitored blood pressure, central aortic blood pressure, minimal cognitive impairment and the specific brain image on contrast computed tomography. We expect to answer the question whether central aortic or self-measured blood pressure has the leading role in the development of cognitive impairment in the presence of a specific neuroimaging finding, as well as what the prerequisite is for the clinical manifestation of cognitive dysfunction in patients with computed tomography pathology. (authors)

  17. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    International Nuclear Information System (INIS)

    Broekema, P.C.; Nieuwpoort, R.V. van; Bal, H.E.

    2015-01-01

    The Square Kilometre Array is a next-generation radio telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state of the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints on the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload

  18. The IceCube Computing Infrastructure Model

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online that need to define new computing models to meet their demands on processing and storage. We present the hybrid computing model of IceCube, which leverages GRID models with a more flexible direct user model, as an example of a possible solution. In IceCube, a central datacenter at UW-Madison serves as Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  19. Review of SFR Design Safety using Preliminary Regulatory PSA Model

    International Nuclear Information System (INIS)

    Na, Hyun Ju; Lee, Yong Suk; Shin, Andong; Suh, Nam Duk

    2013-01-01

    The major objective of this research is to develop a risk model for regulatory verification of the SFR design and thereby make sure that the SFR design is adequate from a risk perspective. In this paper, the development and quantification results of a preliminary regulatory PSA model of the SFR are discussed. It was confirmed that the importance of the PDRC and ADRC dampers is significant, as stated in the results of the KAERI PSA model. However, this importance can change significantly depending on the assumptions made for the CCCG and the CCF factor of the PDRC and ADRC dampers. The SFR (sodium-cooled fast reactor), a Gen-IV nuclear energy system, is designed to accord with the concepts of stability, sustainability and proliferation resistance. KALIMER-600, which is under development in Korea, includes passive safety systems (e.g., passive reactor shutdown, passive residual heat removal, etc.) as well as active safety systems. Risk analysis from a regulatory perspective is needed to support the regulatory body in its safety and licensing review for the SFR (KALIMER-600). Safety issues should be identified in the early design phase in order to prevent the unexpected cost increases and delays to the SFR licensing schedule that may otherwise be caused

  20. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics methods and describes the technologies with an emphasis on how they work and their key benefits.

  1. Finite-element model evaluation of barrier configurations to reduce infiltration into waste-disposal structures: preliminary results and design considerations

    International Nuclear Information System (INIS)

    Lu, A.H.; Phillips, S.J.; Adams, M.R.

    1982-09-01

    Barriers to reduce infiltration into waste burial disposal structures (trenches, pits, etc.) may be required to provide adequate waste confinement. The preliminary engineering design of these barriers should consider interrelated barrier performance factors. This paper summarizes preliminary computer simulation activities to further engineering barrier design efforts. Several barrier configurations were conceived and evaluated. Models were simulated for each barrier configuration using a finite element computer code. Results of this preliminary evaluation indicate that barrier configurations, depending on their morphology and materials, may significantly influence infiltration, flux, drainage, and storage of water through and within waste disposal structures. 9 figures

  2. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever-increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser's original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown's (1997) terms, 'invisible'... ...the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one...

  3. Methodology and preliminary models for analyzing nuclear safeguards decisions

    International Nuclear Information System (INIS)

    1978-11-01

    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis--a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material, demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria), and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  4. Methodology and preliminary models for analyzing nuclear-safeguards decisions

    International Nuclear Information System (INIS)

    Judd, B.R.; Weissenberger, S.

    1978-11-01

    This report describes a general analytical tool designed at Lawrence Livermore Laboratory to assist the Nuclear Regulatory Commission in making nuclear safeguards decisions. The approach is based on decision analysis - a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  5. Preliminary Computational Fluid Dynamics (CFD) Simulation of EIIB Push Barge in Shallow Water

    Science.gov (United States)

    Beneš, Petr; Kollárik, Róbert

    2011-12-01

    This study presents a preliminary CFD simulation of the EIIb push barge in inland conditions using the CFD software Ansys Fluent. RANSE (Reynolds-Averaged Navier-Stokes Equation) methods are used for the viscous solution of the turbulent flow around the ship hull. Different RANSE methods are compared by their results in ship resistance calculations, in order to select appropriate methods and discard inappropriate ones. The study further describes the creation of a geometrical model that considers the exact water-depth-to-vessel-draft ratio in shallow water conditions, the grid generation, the setup of the mathematical model in Fluent, and the evaluation of the simulation results.

  6. A preliminary model to identify low-risk MBA applicants

    Directory of Open Access Journals (Sweden)

    CA Bisschoff

    2014-08-01

    The reliability of the discriminant function rates favourably, with 71% (MBA in 3 years), 62% (MBA in 4 years) and 83% (dropping out of the programme) being categorised correctly by the respective discriminant functions. Being a preliminary model, its predictive capabilities need to be verified in practice before it can be implemented as a tool to render assistance in MBA admissions. The value of this research lies in the fact that it constitutes a model that could be employed and improved as a predictive tool in an environment where very limited predictive tools exist. Therefore, although it is by no means a tried and tested model, it sets the scene by supplying a scientific base from which incremental improvements could result.
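
    A minimal sketch of the discriminant-analysis workflow described above, applied to synthetic applicant data with scikit-learn; the features, labels, and any resulting rates are invented for illustration and do not reproduce the study's 71%/62%/83% figures.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)

        # Synthetic applicants: e.g. admission-test score, undergraduate mark, age
        n = 300
        X = rng.normal(size=(n, 3))
        # Hypothetical outcomes: 0 = MBA in 3 years, 1 = MBA in 4 years, 2 = dropout
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 0).astype(int)
        y[X[:, 2] < -1.2] = 2

        lda = LinearDiscriminantAnalysis()
        rate = cross_val_score(lda, X, y, cv=5).mean()
        print(f"cross-validated classification rate ~= {rate:.2f}")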

  7. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  8. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's

  9. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling

  10. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998

  11. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  12. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    where x increases from zero to N, the saturation value. Box 1. Matrix Methods ... such as Laplace transforms and non-linear differential equations with ... atomic bomb project in the US in the early ... his work on game theory and computers.

  13. Rayleigh to Compton ratio scatter tomography applied to breast cancer diagnosis: A preliminary computational study

    International Nuclear Information System (INIS)

    Antoniassi, M.; Conceição, A.L.C.; Poletti, M.E.

    2014-01-01

    In the present work, a tomographic technique based on the Rayleigh to Compton scattering ratio (R/C) was studied using computational simulation in order to assess its application to breast cancer diagnosis. In this preliminary study, some parameters that affect the image quality were evaluated, such as: (i) beam energy, (ii) size and glandularity of the breast, and (iii) statistical count noise. The results showed that the R/C contrast increases with increasing photon energy and decreases with increasing glandularity of the sample. The statistical noise proved to be a significant parameter, although the quality of the obtained images was acceptable for a considerable range of noise levels. The preliminary results suggest that the R/C tomographic technique has potential as a complementary tool in breast cancer diagnosis. - Highlights: ► A tomographic technique based on the Rayleigh to Compton scattering ratio is proposed in order to study breast tissues. ► The Rayleigh to Compton scattering ratio technique is compared with the conventional transmission technique. ► The influence of experimental parameters (energy, sample, detection system) is studied

  14. Noise and dose modeling for pediatric CT optimization: preliminary results

    International Nuclear Information System (INIS)

    Miller Clemente, Rafael A.; Perez Diaz, Marlen; Mora Reyes, Yudel; Rodriguez Garlobo, Maikel; Castillo Salazar, Rafael

    2008-01-01

    Full text: A multiple linear regression model was developed to predict noise and dose in pediatric computed tomography imaging for head and abdominal examinations. Relative values of noise and of the volumetric computed tomography dose index were used to estimate the respective models. 54 images of physical phantoms were acquired. The independent variables considered included: phantom diameter, tube current and kilovoltage, x-ray beam collimation, reconstruction diameter and the equipment's post-processing filters. Predicted values show good agreement with measurements, which was better for the noise model (adjusted R2 = 0.953) than for the dose model (adjusted R2 = 0.744). Tube current, object diameter, beam collimation and reconstruction filter were identified as the most influential factors in the models. (author)
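
    A minimal sketch of fitting such a multiple linear regression and reporting the adjusted R-squared; the synthetic predictors stand in for tube current, phantom diameter, collimation and filter, and all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        n, p = 54, 4                         # 54 phantom images, 4 predictors
        X = rng.normal(size=(n, p))
        y = X @ np.array([2.0, -1.0, 0.5, 0.3]) + rng.normal(scale=0.5, size=n)

        # Ordinary least squares with an intercept column
        A = np.column_stack([np.ones(n), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef

        r2 = 1.0 - resid.var() / y.var()
        r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
        print(f"R2 = {r2:.3f}, adjusted R2 = {r2_adj:.3f}")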

  15. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s and a meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references
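
    The quoted mass, velocity, and energy are mutually consistent, as a quick check shows:

        E_k = \tfrac{1}{2} m v^2
            = \tfrac{1}{2} (1.67 \times 10^{8}\,\mathrm{kg}) (1.5 \times 10^{4}\,\mathrm{m/s})^2
            \approx 1.88 \times 10^{16}\,\mathrm{J} \approx 4.5~\mathrm{Mt},

    taking 1 Mt TNT = 4.184 x 10^15 J.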

  16. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    International Nuclear Information System (INIS)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-04-01

    A computational approach used for subsurface explosion cratering has been extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for our first computer simulation because it was the most thoroughly studied. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Shoemaker estimates that the impact occurred about 20,000 to 30,000 years ago [Roddy (1977)]. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s and a meteorite mass of 1.67E+08 kg, with a corresponding kinetic energy of 1.88E+16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation a Tillotson equation-of-state description for iron and limestone was used with no shear strength. A color movie based on this calculation was produced using computer-generated graphics. Results obtained for this preliminary calculation of the formation of Meteor Crater, Arizona, are in good agreement with Meteor Crater measurements

  17. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s and a meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references.

  18. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  19. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, Victor; Bacigalupo, David; Wills, Gary; De Roure, David

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  20. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  1. Creation of 'Ukrytie' objects computer model

    International Nuclear Information System (INIS)

    Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.

    1999-01-01

    A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for the work related to stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, and for analyzing, forecasting and controlling the processes occurring in the 'Ukrytie' object. Elements and structures of the 'Ukrytie' object were designed and input into the model

  2. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view of scientific modeling.

  3. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on learning about seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
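
    To make the inverse-theory step concrete, the following is a minimal sketch of straight-ray travel-time tomography posed as a damped linear least-squares problem t = L s (ray path lengths times cell slownesses); the grid, ray geometry, noise level, and damping are invented for illustration and are unrelated to Modellus itself.

        import numpy as np

        rng = np.random.default_rng(3)

        # Unknown slowness field on a 4x4 grid, flattened to a vector
        true_s = np.full(16, 0.25)
        true_s[5] = 0.40                     # a slow anomaly in one cell

        # Each ray is a row of path lengths through the cells (rows and columns)
        L = np.zeros((8, 16))
        for r in range(4):
            L[r, 4 * r:4 * r + 4] = 1.0      # horizontal rays
            L[4 + r, r::4] = 1.0             # vertical rays

        t = L @ true_s + rng.normal(scale=1e-3, size=8)   # noisy travel times

        # With only 8 rays the system is under-determined; Tikhonov damping
        # selects a small-norm solution, so the anomaly appears smeared.
        lam = 1e-2
        s_hat = np.linalg.solve(L.T @ L + lam * np.eye(16), L.T @ t)
        print("recovered slowness grid:\n", s_hat.reshape(4, 4).round(3))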

  4. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows animal experimentation to be reduced, refined and replaced, as well as the translation of findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  5. Ranked retrieval of Computational Biology models.

    Science.gov (United States)

    Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar

    2010-08-11

    The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding on potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to decide on the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
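
    A minimal sketch of ranked model retrieval with TF-IDF and cosine similarity, in the spirit of the approach above; the toy model names and descriptions are invented, and this is not the BioModels Database implementation.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy corpus: one free-text description per model (names are invented)
        models = {
            "MODEL_A": "calcium oscillations in hepatocytes, MIRIAM-annotated",
            "MODEL_B": "glycolysis in yeast with kinetic rate laws",
            "MODEL_C": "calcium signaling and IP3 receptor dynamics",
        }

        vec = TfidfVectorizer()
        docs = vec.fit_transform(models.values())

        query = vec.transform(["calcium signaling model"])
        scores = cosine_similarity(query, docs).ravel()

        # Relevance ranking of the models for the query
        for name, score in sorted(zip(models, scores), key=lambda x: -x[1]):
            print(f"{score:.3f}  {name}")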

  6. A preliminary model for estimating the first wall lifetime of a fusion reactor

    International Nuclear Information System (INIS)

    Daenner, W.

    1975-02-01

    The estimation of the first wall lifetime is a necessary basis for predicting the availability of a fusion power plant. To this end, an analytical model was prepared and programmed for the computer, which calculates the temperature and stress load of the first wall from the principal design parameters and compares them with the relevant material properties. Neither the analytical model nor the information about the material performance is yet complete, so the answers obtained from the program are very preliminary. This situation is underlined by the results of sample calculations performed for the CTRD blanket module cell. The results obtained for vanadium and vanadium alloys show a strong dependence of the lifetime on the irradiation creep and the ductility of these materials. Completion of this model is envisaged as soon as the missing information becomes available. (orig.)

  7. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  8. Preliminary deformation model for National Seismic Hazard map of Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z. [Geodesy Research Division, Faculty of Earth Science and Technology, Institute of Technology Bandung (Indonesia)]; Susilo; Efendi, Joni [Agency for Geospatial Information (BIG) (Indonesia)]

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth, or in combination with creeping in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in narrow areas near subduction zones and active faults where significant deformation reaches 25 mm/year.
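
    A minimal sketch of the rigid-block component of such a model: the surface velocity predicted by an Euler rotation pole, v = omega x r; the pole position, rotation rate, and site location are invented and are not the values estimated for the Indonesian blocks.

        import numpy as np

        R_EARTH = 6.371e6                    # metres

        def unit_vec(lat_deg, lon_deg):
            lat, lon = np.radians([lat_deg, lon_deg])
            return np.array([np.cos(lat) * np.cos(lon),
                             np.cos(lat) * np.sin(lon),
                             np.sin(lat)])

        # Hypothetical Euler pole at 30N, 120E rotating at 0.6 deg/Myr
        omega = (np.radians(0.6) / 1e6) * unit_vec(30.0, 120.0)   # rad/yr

        r = R_EARTH * unit_vec(-2.0, 106.0)  # an illustrative site near Java
        v = np.cross(omega, r)               # surface velocity, m/yr (ECEF frame)
        print(f"predicted block-motion speed ~= {np.linalg.norm(v) * 1000:.1f} mm/yr")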

  9. Utility of Social Modeling for Proliferation Assessment - Preliminary Findings

    International Nuclear Information System (INIS)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-01-01

    Often the methodologies for assessing proliferation risk are focused on the inherent vulnerability of nuclear energy systems and the associated safeguards. For example, an accepted approach involves ways to measure the intrinsic and extrinsic barriers to potential proliferation. This paper describes a preliminary investigation into the non-traditional use of social and cultural information to improve proliferation assessment and advance the approach to assessing nuclear material diversion. Proliferation resistance assessments, safeguards assessments and related studies typically create technical information about the vulnerability of a nuclear energy system to diversion of nuclear material. The purpose of this research project is to find ways to integrate social information with technical information by explicitly considering the role of culture, groups and/or individuals in factors that impact the possibility of proliferation. When final, this work is expected to describe and demonstrate the utility of social science modeling in proliferation and proliferation risk assessments.

  10. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  11. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram

    2016-01-01

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
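
    A minimal sketch of the aspect-combination idea described above: per-aspect similarities (here, Jaccard overlap of annotation sets and of network edges) blended with problem-specific weights; the aspects, toy models, and weight values are invented for illustration.

        def jaccard(a, b):
            a, b = set(a), set(b)
            return len(a & b) / len(a | b) if a | b else 1.0

        # Toy models described by two aspects (content invented)
        model1 = {"annotations": {"GO:0006096", "CHEBI:17234"},
                  "edges": {("glc", "g6p"), ("g6p", "f6p")}}
        model2 = {"annotations": {"GO:0006096", "CHEBI:15377"},
                  "edges": {("glc", "g6p"), ("f6p", "fbp")}}

        weights = {"annotations": 0.6, "edges": 0.4}   # problem-specific choice

        similarity = sum(w * jaccard(model1[k], model2[k])
                         for k, w in weights.items())
        print(f"combined similarity = {similarity:.2f}")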

  12. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  13. Gravity model development for precise orbit computations for satellite altimetry

    Science.gov (United States)

    Marsh, James G.; Lerch, Francis, J.; Smith, David E.; Klosko, Steven M.; Pavlis, Erricos

    1986-01-01

    Two preliminary gravity models developed as a first step in reaching the TOPEX/Poseidon modeling goals are discussed. They were obtained by NASA-Goddard from an analysis based exclusively on satellite tracking observations. With the new Preliminary Gravity Solution-T2 model, an improved global estimate of the field is achieved, with an improved description of the geoid.

  14. Sierra toolkit computational mesh conceptual model

    International Nuclear Information System (INIS)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-01-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  15. The 3D Reference Earth Model: Status and Preliminary Results

    Science.gov (United States)

    Moulik, P.; Lekic, V.; Romanowicz, B. A.

    2017-12-01

    In the 20th century, seismologists constructed models of how average physical properties (e.g. density, rigidity, compressibility, anisotropy) vary with depth in the Earth's interior. These one-dimensional (1D) reference Earth models (e.g. PREM) have proven indispensable in earthquake location, imaging of interior structure, understanding material properties under extreme conditions, and as a reference in other fields, such as particle physics and astronomy. Over the past three decades, new datasets motivated more sophisticated efforts that yielded models of how properties vary both laterally and with depth in the Earth's interior. Though these three-dimensional (3D) models exhibit compelling similarities at large scales, differences in the methodology, representation of structure, and dataset upon which they are based have prevented the creation of 3D community reference models. As part of the REM-3D project, we are compiling and reconciling reference seismic datasets of body wave travel-time measurements, fundamental mode and overtone surface wave dispersion measurements, and normal mode frequencies and splitting functions. These reference datasets are being inverted for a long-wavelength, 3D reference Earth model that describes the robust long-wavelength features of mantle heterogeneity. As a community reference model with fully quantified uncertainties and tradeoffs and an associated publicly available dataset, REM-3D will facilitate Earth imaging studies, earthquake characterization, inferences on temperature and composition in the deep interior, and be of improved utility to emerging scientific endeavors, such as neutrino geoscience. Here, we summarize progress made in the construction of the reference long period dataset and present a preliminary version of REM-3D in the upper-mantle. In order to determine the level of detail warranted for inclusion in REM-3D, we analyze the spectrum of discrepancies between models inverted with different subsets of the

  16. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

    A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...
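
    To make the hopping picture concrete, here is a minimal Monte Carlo sketch of the symmetric hopping idea (plain Python/NumPy; all parameters are illustrative and this is not the authors' simulation code).

        import numpy as np

        # Minimal sketch of the random barrier (symmetric hopping) model: a single
        # particle on a 1D ring jumps over quenched random barriers with activated
        # acceptance probability exp(-E/kT). Parameters are illustrative only.
        rng = np.random.default_rng(1)
        N, kT, attempts = 200, 0.3, 100_000
        barriers = rng.uniform(0.0, 1.0, N)         # barrier between site i and i+1
        site, disp = 0, 0                           # ring position, unwrapped displacement
        for _ in range(attempts):
            step = 1 if rng.random() < 0.5 else -1  # symmetric jump attempt
            bond = site if step == 1 else (site - 1) % N
            if rng.random() < np.exp(-barriers[bond] / kT):
                site = (site + step) % N
                disp += step
        print("squared displacement after", attempts, "attempts:", disp ** 2)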

  17. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,

  18. Computational aspects of premixing modelling

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, D.F. [Sydney Univ., NSW (Australia). Dept. of Chemical Engineering; Witt, P.J.

    1998-01-01

    In the steam explosion research field there is currently considerable effort being devoted to the modelling of premixing. Practically all models are based on the multiphase flow equations which treat the mixture as an interpenetrating continuum. Solution of these equations is non-trivial and a wide range of solution procedures are in use. This paper addresses some numerical aspects of this problem. In particular, we examine the effect of the differencing scheme for the convective terms and show that use of hybrid differencing can cause qualitatively wrong solutions in some situations. Calculations are performed for the Oxford tests, the BNL tests, a MAGICO test and to investigate various sensitivities of the solution. In addition, we show that use of a staggered grid can result in a significant error which leads to poor predictions of 'melt' front motion. A correction is given which leads to excellent convergence to the analytic solution. Finally, we discuss the issues facing premixing model developers and highlight the fact that model validation is hampered more by the complexity of the process than by numerical issues. (author)

  19. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    particle flow, and numerical techniques allowing the modeling of particles suspended in a fluid. The general concept behind each family of techniques is described. Pros and cons for each technique are given along with examples and references to applications to fresh cementitious materials....

  20. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.

  1. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    Full Text Available The paper analyses the Minority Game and focuses on analysis and computational modelling of several variants (variable payoff, coalition-based, and ternary voting) of the Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for formal specification of software gamification, and the UAREI visual modelling language is used for graphical representation of game mechanics. The UAREI model also provides an embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.
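
    For readers unfamiliar with the underlying game, here is a minimal sketch of the basic Minority Game (plain Python; it implements the classic rules only, with none of the UAREI-specific variants, and all parameters are illustrative).

        import random

        # Minimal Minority Game: an odd number N of agents each hold S lookup-table
        # strategies over the last M outcomes; whoever is in the minority wins.
        N, S, M, ROUNDS = 101, 2, 3, 200
        histories = [tuple((i >> b) & 1 for b in range(M)) for i in range(2 ** M)]
        agents = [[{h: random.choice((0, 1)) for h in histories} for _ in range(S)]
                  for _ in range(N)]
        scores = [[0] * S for _ in range(N)]
        history = tuple(random.choice((0, 1)) for _ in range(M))

        for _ in range(ROUNDS):
            # Each agent plays its currently best-scoring strategy.
            choices = [agents[a][max(range(S), key=lambda s: scores[a][s])][history]
                       for a in range(N)]
            minority = 0 if sum(choices) > N / 2 else 1
            for a in range(N):  # virtually score every strategy against the outcome
                for s in range(S):
                    scores[a][s] += 1 if agents[a][s][history] == minority else -1
            history = history[1:] + (minority,)

        print("attendance of side 1 in the last round:", sum(choices))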

  2. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    Full Text Available In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution’s physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received on a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a “social cloud”, which draws on all cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  3. Computational modeling of epiphany learning.

    Science.gov (United States)

    Chen, Wei James; Krajbich, Ian

    2017-05-02

    Models of reinforcement learning (RL) are prevalent in the decision-making literature, but not all behavior seems to conform to the gradual convergence that is a central feature of RL. In some cases learning seems to happen all at once. Limited prior research on these "epiphanies" has shown evidence of sudden changes in behavior, but it remains unclear how such epiphanies occur. We propose a sequential-sampling model of epiphany learning (EL) and test it using an eye-tracking experiment. In the experiment, subjects repeatedly play a strategic game that has an optimal strategy. Subjects can learn over time from feedback but are also allowed to commit to a strategy at any time, eliminating all other options and opportunities to learn. We find that the EL model is consistent with the choices, eye movements, and pupillary responses of subjects who commit to the optimal strategy (correct epiphany) but not always of those who commit to a suboptimal strategy or who do not commit at all. Our findings suggest that EL is driven by a latent evidence accumulation process that can be revealed with eye-tracking data.
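
    A minimal sketch of the sequential-sampling intuition follows (Python; the drift, noise, and threshold values are illustrative, not the paper's fitted parameters): noisy evidence about the optimal strategy accumulates until it crosses a commitment threshold, producing a sudden, epiphany-like switch.

        import numpy as np

        # Evidence-accumulation process of the kind an EL model posits.
        rng = np.random.default_rng(42)
        drift, noise, threshold, dt = 0.05, 0.3, 3.0, 1.0
        evidence, trial = 0.0, 0
        while abs(evidence) < threshold:
            # Drift toward the optimal strategy plus Gaussian sampling noise.
            evidence += drift * dt + noise * np.sqrt(dt) * rng.normal()
            trial += 1
        side = "optimal" if evidence > 0 else "suboptimal"
        print(f"commitment (epiphany) on trial {trial}, to the {side} strategy")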

  4. Preliminary modelling of the 2010 MAM survey data

    International Nuclear Information System (INIS)

    Ahokas, T.

    2010-10-01

    Posiva Oy prepares for disposal of spent nuclear fuel into the bedrock at Olkiluoto, Eurajoki. This is in accordance with the Decision-in-Principle of the State Council in 2000, ratified by the Parliament in 2001. The ONKALO underground characterization premises have been under construction since 2004. Posiva Oy aims to submit the construction licence application in 2012. To support the compilation of the safety case and the design and construction of the repository and the ONKALO underground characterisation facility, a series of Olkiluoto Site Descriptive Models, comprising six parts (the surface system, geology, rock mechanics, hydrogeology, hydrogeochemistry, and migration), have been compiled. To support the next update of the Olkiluoto Site Description, and especially the geological and hydrogeological sub-models, preliminary modelling of the recent mise-a-la-masse (MAM) surveys has been carried out. This report discusses the MAM surveys carried out in the Olkiluoto area in 2010 and aims to determine the continuation of some electrically conductive zones intersected by drillholes OL-KR49 ...OL-KR53 in the eastern part of Olkiluoto island. Several electrically conductive zones were modelled from the examined data; many of them coincide with the brittle deformation zones presented in the geological model, but indications of some so far unknown zones were also detected. The complexity and extent of the group of zones that includes the hydraulically conductive zones HZ19A, HZ19B and HZ19C (Vaittinen et al. 2009) emerged from the data during this work. Modelling this group in detail needs more information from both geological and hydrogeological investigations. (orig.)

  5. Validation of computational model ALDERSON/EGSnrc for chest radiography

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Santos, André L. dos; Menezes, Claudio J.M.

    2017-01-01

    To perform dose studies in situations of exposure to radiation without exposing individuals, numerical dosimetry uses Computational Exposure Models (ECMs). Composed essentially of a radioactive-source simulation algorithm, a voxel phantom representing the human anatomy, and a Monte Carlo code, ECMs must be validated to determine the reliability of the representation of the physical arrangement. The objective of this work is to validate the ALDERSON/EGSnrc ECM through comparisons between the experimental measurements obtained with the ionization chamber and virtual simulations using the Monte Carlo method to determine the ratio of the input and output radiation dose. Preliminary results of these comparisons showed that the ECM reproduced the results of the experimental measurements performed with the physical phantom with a relative error of less than 10%, validating the use of this model for simulations of chest radiographs and estimates of radiation doses in tissues in the irradiated structures

  6. Preliminary report on NTS spectral gamma logging and calibration models

    International Nuclear Information System (INIS)

    Mathews, M.A.; Warren, R.G.; Garcia, S.R.; Lavelle, M.J.

    1985-01-01

    Facilities are now available at the Nevada Test Site (NTS) in Building 2201 to calibrate spectral gamma logging equipment in environments of low radioactivity. Such environments are routinely encountered during logging of holes at the NTS. Four calibration models were delivered to Building 2201 in January 1985. Each model, or test pit, consists of a stone block with a 12-inch diameter cored borehole. Preliminary radioelement values from the core for the test pits range from 0.58 to 3.83% potassium (K), 0.48 to 29.11 ppm thorium (Th), and 0.62 to 40.42 ppm uranium (U). Two satellite holes, U19ab #2 and U19ab #3, were logged during the winter of 1984-1985. The response of these logs correlates with contents of the naturally radioactive elements K, Th, and U determined in samples from petrologic zones that occur within these holes. Based on these comparisons, the spectral gamma log aids in the recognition and mapping of subsurface stratigraphic units and alteration features associated with unusual concentrations of these radioactive elements, such as clay-rich zones

  7. Computational models of airway branching morphogenesis.

    Science.gov (United States)

    Varner, Victor D; Nelson, Celeste M

    2017-07-01

    The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range off software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  9. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  10. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  11. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the rapidly growing number of cloud virtual machines (VMs) makes VM security and management an enormous challenge. In order to address the security issues of the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on deployment, state migration and scheduling, and studies the corresponding virtual machine security architecture, based on AHP (Analytic Hierarchy Process) virtual machine de...

  12. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues

  13. Light reflection models for computer graphics.

    Science.gov (United States)

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
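
    As a concrete example of the early direct-lighting models mentioned above, the sketch below (Python/NumPy; the coefficients and vectors are illustrative assumptions) evaluates Lambertian diffuse reflection plus a Phong specular lobe for a single point light.

        import numpy as np

        def normalize(v):
            return v / np.linalg.norm(v)

        def phong(normal, to_light, to_eye, kd=0.7, ks=0.3, shininess=32):
            """Direct lighting only: Lambertian diffuse + Phong specular, one light."""
            n, l, e = map(normalize, (normal, to_light, to_eye))
            diffuse = kd * max(np.dot(n, l), 0.0)             # Lambert's cosine law
            r = 2.0 * np.dot(n, l) * n - l                    # mirror direction of l
            specular = ks * max(np.dot(r, e), 0.0) ** shininess
            return diffuse + specular

        print(phong(np.array([0, 0, 1.0]), np.array([1, 1, 1.0]), np.array([0, 0, 1.0])))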

  14. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.
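
    The book's running example is the decay ODE u' = -au; the sketch below (Python; function and parameter names are mine, not the book's) shows the standard theta-rule discretization, which contains Forward Euler, Crank-Nicolson, and Backward Euler as special cases.

        import numpy as np

        def theta_rule_decay(I, a, T, dt, theta):
            """Solve u' = -a*u, u(0) = I, on (0, T] with the theta-rule.

            theta = 0, 0.5, 1 give Forward Euler, Crank-Nicolson, Backward Euler.
            """
            Nt = int(round(T / dt))
            u = np.zeros(Nt + 1)
            u[0] = I
            for n in range(Nt):
                # (1 + theta*a*dt) u[n+1] = (1 - (1-theta)*a*dt) u[n]
                u[n + 1] = u[n] * (1 - (1 - theta) * a * dt) / (1 + theta * a * dt)
            return u

        u = theta_rule_decay(I=1.0, a=2.0, T=4.0, dt=0.1, theta=0.5)
        print(abs(u[-1] - np.exp(-2.0 * 4.0)))  # error vs. the exact solution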

  15. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  16. Non-invasive coronary angiography with multislice computed tomography. Technology, methods, preliminary experience and prospects.

    Science.gov (United States)

    Traversi, Egidio; Bertoli, Giuseppe; Barazzoni, Giancarlo; Baldi, Maurizia; Tramarin, Roberto

    2004-02-01

    The recent technical developments in multislice computed tomography (MSCT), with ECG retro-gated image reconstruction, have elicited great interest in the possibility of accurate non-invasive imaging of the coronary arteries. The latest generation of MSCT systems with 8-16 rows of detectors permits acquisition of the whole cardiac volume during a single 15-20 s breath-hold with a submillimetric definition of the images and an outstanding signal-to-noise ratio. Thus the race among MSCT, electron beam computed tomography and cardiac magnetic resonance imaging to provide routine and reliable imaging of the coronary arteries in clinical practice has recommenced. Currently available MSCT systems offer different options for both cardiac image acquisition and reconstruction, including multiplanar and curved multiplanar reconstruction, three-dimensional volume rendering, maximum intensity projection, and virtual angioscopy. In our preliminary experience including 176 patients suffering from known or suspected coronary artery disease, MSCT was feasible in 161 (91.5%) and showed a sensitivity of 80.4% and a specificity of 80.3%, with respect to standard coronary angiography, in detecting critical stenosis in coronary arteries and artery or venous bypass grafts. These results correspond to a positive predictive value of 58.6% and a negative predictive value of 92.2%. The true role that MSCT is likely to play in the future in non-invasive coronary imaging is still to be defined. Nevertheless, the huge amount of data obtainable by MSCT along with the rapid technological advances, shorter acquisition times and reconstruction algorithm developments will make the technique stronger, and possible applications are expected not only for non-invasive coronary angiography, but also for cardiac function and myocardial perfusion evaluation, as an all-in-one examination.

  17. Quantum Vertex Model for Reversible Classical Computing

    Science.gov (United States)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic with one direction corresponding to computational time, and with transverse boundaries storing the computation's input and output. The model displays no finite temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, "annealing with learning", to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  18. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational Disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  19. Preliminary tests of a high speed vertical axis windmill model

    Energy Technology Data Exchange (ETDEWEB)

    South, P; Rangi, R S

    1971-01-01

    This report discusses a fixed-pitch vertical axis windmill that combines the inherent simplicity of this type of machine with a high aerodynamic efficiency and a high relative velocity. A three-bladed rotor was selected as the basic design, having constant chord symmetric airfoil blades configured in a catenary curve such that the rotor diameter is equal to the rotor height. In wind tunnel tests using a 30 inch scale model, it was found that once this rotor was given a very low rotational speed, it picked up speed and ran at a rotor tip velocity/wind speed ratio greater than 1. The number of blades was varied in the testing. A maximum power coefficient of 0.67 was achieved at 17 ft/s wind speed at a tip speed/wind speed ratio of 7.25 for a 2-bladed rotor. Increasing the number of blades above 3 did not result in higher power. The rotor could operate in gusts which double the mean wind velocity. From an examination of Reynolds number effects, taking into account the scale of the model, it was concluded that a full-scale windmill could run at lower velocity ratios than those predicted by the model tests, and that it could self-start under no-load conditions if the cut-in rpm is at least half the rpm for maximum power at the prevailing wind speed. Preliminary estimates show that a 15 ft diameter windmill of this design, designed to operate with a safety factor of 2.5 up to a maximum wind speed of 60 ft/s, would weigh ca 150 lb and could be marketed for ca $60.00, excluding the driven unit, if sufficient quantities were produced to make tooling costs negligible. Similarly, a 30 ft windmill would weigh ca 1000 lb and cost ca $400.00. 2 refs., 6 figs.

  20. Towards The Deep Model : Understanding Visual Recognition Through Computational Models

    OpenAIRE

    Wang, Panqu

    2017-01-01

    Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks. I first describe how a neurocomputat...

  1. Hybrid computer modelling in plasma physics

    International Nuclear Information System (INIS)

    Hromadka, J; Ibehej, T; Hrach, R

    2016-01-01

    Our contribution is devoted to the development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low temperature argon plasma of different pressures by means of particle and fluid computer models. We discuss the differences in results obtained by these methods and try to propose a way to improve the results of fluid models in the low pressure area. There is a possibility to employ the Chapman-Enskog method to find appropriate closure relations for the fluid equations in cases where the particle distribution function is not Maxwellian. We follow this route to enhance the fluid model and then use it further in a hybrid plasma model. (paper)

  2. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  3. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease. The book will be of interest to researchers, Ph.D students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  4. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational Algebraic Geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension, and Hilbert polynomials. These computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, the changes in the Groebner basis, the changes in Hilbert dimension, and the changes in Hilbert polynomials. It is hoped that the results obtained in this paper will prove important for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.
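
    The analyses were done in Maple; as a rough illustration of the same idea in Python/SymPy (the SIR-type model and parameter names here are generic stand-ins, not the paper's Schistosomiasis or Dengue equations), one can triangularize the steady-state equations with a Groebner basis and read off R0.

        import sympy as sp

        # SIR-type model with birth/death rate mu; R decouples and is omitted.
        S, I, beta, gamma, mu = sp.symbols("S I beta gamma mu", positive=True)
        eqs = [mu - beta * S * I - mu * S,          # dS/dt = 0
               beta * S * I - (gamma + mu) * I]     # dI/dt = 0

        G = sp.groebner(eqs, S, I, order="lex")     # lex order triangularizes the system
        print(G.exprs)
        print(sp.solve(eqs, [S, I], dict=True))     # disease-free and endemic states
        R0 = beta / (gamma + mu)                    # basic reproductive number here
        print(sp.simplify(R0))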

  5. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques employed in predicting energy consumption by such systems have focused on temperature bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools model equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is limited. The paper will present the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper will be on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, will also be presented

  6. Preliminary experimentally-validated forced and mixed convection computational simulations of the Rotatable Buoyancy Tunnel

    International Nuclear Information System (INIS)

    Clifford, Corey E.; Kimber, Mark L.

    2015-01-01

    Although computational fluid dynamics (CFD) has not been directly utilized to perform safety analyses of nuclear reactors in the United States, several vendors are considering adopting commercial numerical packages for current and future projects. To ensure the accuracy of these computational models, it is imperative to validate the assumptions and approximations built into commercial CFD codes against physical data from flows analogous to those in modern nuclear reactors. To this end, researchers at Utah State University (USU) have constructed the Rotatable Buoyancy Tunnel (RoBuT) test facility, which is designed to provide flow and thermal validation data for CFD simulations of forced and mixed convection scenarios. In order to evaluate the ability of current CFD codes to capture the complex physics associated with these types of flows, a computational model of the RoBuT test facility is created using the ANSYS Fluent commercial CFD code. The numerical RoBuT model is analyzed at identical conditions to several experimental trials undertaken at USU. Each experiment is reconstructed numerically and evaluated with the second-order Reynolds stress model (RSM). Two different thermal boundary conditions at the heated surface of the RoBuT test section are investigated: constant temperature (isothermal) and constant surface heat flux (isoflux). Additionally, the fluid velocity at the inlet of the test section is varied in an effort to modify the relative importance of natural convection heat transfer from the heated wall of the RoBuT. Mean velocity, both in the streamwise and transverse directions, as well as components of the Reynolds stress tensor at three points downstream of the RoBuT test section inlet are compared to results obtained from experimental trials. Early computational results obtained from this research initiative are in good agreement with experimental data obtained from the RoBuT facility and both the experimental data and numerical method can be used

  7. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    Full Text Available In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested with respect to a given configuration of the aircraft and airbrake, and the results were compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  8. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  9. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed, describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives

  10. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics that have a direct bearing on the model input process are reviewed in this paper, and reasons are given for using probability-based modeling of the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present

  11. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  12. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
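
    To illustrate the kind of model students build in such a platform, here is a minimal synchronous Boolean network sketch (Python; the three-node network and its update rules are invented for illustration, and Cell Collective's actual models are far richer).

        # A tiny logical (Boolean) network: an external signal drives an activator,
        # which drives an inhibitor that feeds back negatively on the activator.
        rules = {
            "signal":    lambda s: s["signal"],                 # constant external input
            "activator": lambda s: s["signal"] and not s["inhibitor"],
            "inhibitor": lambda s: s["activator"],              # negative feedback
        }

        def step(state):
            """Synchronous update: every node evaluates its rule on the old state."""
            return {node: bool(rule(state)) for node, rule in rules.items()}

        state = {"signal": True, "activator": False, "inhibitor": False}
        for t in range(6):
            print(t, state)
            state = step(state)   # the activator/inhibitor pair oscillates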

  13. An explorative study of the technology transfer coach as a preliminary for the design of a computer aid

    OpenAIRE

    Jönsson, Oscar

    2014-01-01

    The university technology transfer coach has an important role in supporting the commercialization of research results. This thesis has studied the technology transfer coach and their needs in the coaching process. The goal has been to investigate information needs of the technology transfer coach as a preliminary for the design of computer aids. Using a grounded theory approach, we interviewed 17 coaches working in the Swedish technology transfer environment. Extracted quotes from interviews ...

  14. Computer Modelling of Photochemical Smog Formation

    Science.gov (United States)

    Huebert, Barry J.

    1974-01-01

    Discusses a computer program that has been used in environmental chemistry courses as an example of modelling as a vehicle for teaching chemical dynamics, and as a demonstration of some of the factors which affect the production of smog. (Author/GS)

  15. A Computational Model of Fraction Arithmetic

    Science.gov (United States)

    Braithwaite, David W.; Pyke, Aryn A.; Siegler, Robert S.

    2017-01-01

    Many children fail to master fraction arithmetic even after years of instruction, a failure that hinders their learning of more advanced mathematics as well as their occupational success. To test hypotheses about why children have so many difficulties in this area, we created a computational model of fraction arithmetic learning and presented it…

  16. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article. Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  17. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  18. Computer Modeling of Platinum Reforming Reactors | Momoh ...

    African Journals Online (AJOL)

    This paper considers a computer model, instead of a theoretical approach, as a means of assessing the reformate composition for three-stage fixed bed reactors in a platforming unit. This is done by identifying the many possible hydrocarbon transformation reactions that are peculiar to the process unit, identify the ...

  19. Particle modeling of plasmas computational plasma physics

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1991-01-01

    Recently, through the development of supercomputers, a powerful new method for exploring plasmas has emerged: computer modeling of plasmas. Such modeling can duplicate many of the complex processes that go on in a plasma and allow scientists to understand what the important processes are. It helps scientists gain an intuition about this complex state of matter. It allows scientists and engineers to explore new ideas on how to use plasma before building costly experiments; it allows them to determine if they are on the right track. It can duplicate the operation of devices and thus reduce the need to build complex and expensive devices for research and development. This is an exciting new endeavor that is in its infancy, but which can play an important role in the scientific and technological competitiveness of the US. There is a wide range of plasma models in use: particle models, fluid models, and hybrid particle-fluid models. These can come in many forms, such as explicit models, implicit models, reduced dimensional models, electrostatic models, magnetostatic models, electromagnetic models, and almost an endless variety of other models. Here the author discusses only particle models. He gives a few examples of the use of such models, taken from work done by the Plasma Modeling Group at UCLA because he is most familiar with that work. However, this gives only a small view of the wide range of work being done around the US, or for that matter around the world
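
    A minimal sketch of the particle-modeling idea follows (Python/NumPy; a 1D electrostatic particle-in-cell step in normalized units, with grid size, particle count, and time step chosen purely for illustration, nothing like the production codes discussed here).

        import numpy as np

        L, NG, NP, dt = 2 * np.pi, 64, 10000, 0.1
        dx = L / NG
        rng = np.random.default_rng(0)
        x = rng.uniform(0, L, NP)                     # particle positions
        v = rng.normal(0, 1, NP)                      # particle velocities

        def fields(x):
            """Deposit charge (cloud-in-cell), solve Poisson by FFT, return grid E."""
            g = x / dx
            i = np.floor(g).astype(int) % NG
            w = g - np.floor(g)
            rho = np.bincount(i, 1 - w, NG) + np.bincount((i + 1) % NG, w, NG)
            rho = rho / (NP / NG) - 1.0               # neutralizing ion background
            k = np.fft.fftfreq(NG, d=dx) * 2 * np.pi
            k[0] = 1.0                                # avoid divide-by-zero; mean E = 0
            phi_k = np.fft.fft(rho) / k ** 2
            phi_k[0] = 0.0
            E = np.real(np.fft.ifft(-1j * k * phi_k))
            return E, i, w

        for _ in range(100):                          # simple kick-drift time stepping
            E, i, w = fields(x)
            Ep = (1 - w) * E[i] + w * E[(i + 1) % NG] # gather field to particles
            v -= Ep * dt                              # electrons: charge -1, mass 1
            x = (x + v * dt) % L

        print("mean kinetic energy:", 0.5 * np.mean(v ** 2))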

  20. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies, and standardized annotations; and (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  1. Computational fluid dynamics in three dimensional angiography: Preliminary hemodynamic results of various proximal geometry

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ha Youn; Park, Sung Tae; Bae, Won Kyoung; Goo, Dong Erk [Dept. of Radiology, Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2014-12-15

    We studied the influence of proximal geometry on the results of computational fluid dynamics (CFD). We made five models of different proximal geometry from the three-dimensional angiography of a 63-year-old woman with an intracranial aneurysm. CFD results were analyzed as peak systolic velocity (PSV) at the inlet and outlet, as well as the flow velocity profile at the proximal level of the internal carotid artery (ICA) aneurysm. The modified cavernous model with proximal tubing showed a faster PSV at the outlet than at the inlet. The PSVs at the outlets of the other models were slower than those at the inlets. The flow velocity profiles immediately proximal to the ICA aneurysm showed similar patterns in all models, suggesting that proximal vessel geometry can affect CFD results.

  2. Computer modeling of ground-water flow at the Savannah River Plant

    International Nuclear Information System (INIS)

    Root, R.W. Jr.

    1979-01-01

    Mathematical equations describing ground-water flow are used in a computer model being developed to predict the space-time distribution of hydraulic head beneath a part of the Savannah River Plant site. These equations are solved by a three-dimensional finite-difference scheme. Preliminary calibration of the hydraulic head model has been completed and calculated results compare well with water-level changes observed in the field. 10 figures, 1 table

  3. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science, outstanding results are yielded by advanced simulation methods, based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environmental changes.

  4. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg/kg

  5. Computer-Aided Diagnosis Based on Convolutional Neural Network System for Colorectal Polyp Classification: Preliminary Experience.

    Science.gov (United States)

    Komeda, Yoriaki; Handa, Hisashi; Watanabe, Tomohiro; Nomura, Takanobu; Kitahashi, Misaki; Sakurai, Toshiharu; Okamoto, Ayana; Minami, Tomohiro; Kono, Masashi; Arizumi, Tadaaki; Takenaka, Mamoru; Hagiwara, Satoru; Matsui, Shigenaga; Nishida, Naoshi; Kashida, Hiroshi; Kudo, Masatoshi

    2017-01-01

    Computer-aided diagnosis (CAD) is becoming a next-generation tool for the diagnosis of human disease. CAD for colon polyps has been suggested as a particularly useful tool for trainee colonoscopists, as the use of a CAD system avoids the complications associated with endoscopic resections. In addition to conventional CAD, convolutional neural network (CNN) systems utilizing artificial intelligence (AI) have been developing rapidly over the past 5 years. We attempted to generate a unique CNN-CAD system with an AI function that studied endoscopic images extracted from movies obtained with colonoscopes used in routine examinations. Here, we report our preliminary results of this novel CNN-CAD system for the diagnosis of colon polyps. A total of 1,200 images from cases of colonoscopy performed between January 2010 and December 2016 at Kindai University Hospital were used. These images were extracted from the video of actual endoscopic examinations. Additional video images from 10 cases of unlearned processes were retrospectively assessed in a pilot study. They were simply diagnosed as either an adenomatous or nonadenomatous polyp. The number of images used by the AI to learn to distinguish adenomatous from nonadenomatous was 1,200:600. The size of each image was adjusted to 256 × 256 pixels. A 10-fold cross-validation was carried out. The accuracy of the 10-fold cross-validation is 0.751, where the accuracy is the ratio of the number of correct answers over the number of all the answers produced by the CNN. The decisions by the CNN were correct in 7 of 10 cases. A CNN-CAD system using routine colonoscopy might be useful for the rapid diagnosis of colorectal polyp classification. Further prospective studies in an in vivo setting are required to confirm the effectiveness of a CNN-CAD system in routine colonoscopy. © 2017 S. Karger AG, Basel.
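
    The evaluation protocol can be sketched as follows (Python/NumPy; `train` and `predict` are hypothetical stand-ins, since the paper's CNN is not reproduced here): each tenth of the data serves once as the test fold, and accuracy is the ratio of correct answers to all answers, averaged over folds.

        import numpy as np

        def kfold_accuracy(images, labels, train, predict, k=10, seed=0):
            """k-fold cross-validation accuracy for any train/predict pair."""
            idx = np.random.default_rng(seed).permutation(len(images))
            folds = np.array_split(idx, k)
            accs = []
            for i in range(k):
                test = folds[i]
                rest = np.concatenate([f for j, f in enumerate(folds) if j != i])
                model = train(images[rest], labels[rest])
                correct = predict(model, images[test]) == labels[test]
                accs.append(np.mean(correct))   # correct answers / all answers
            return float(np.mean(accs))

        # Dummy majority-class "model" just to show the call signature.
        train = lambda X, y: int(round(float(np.mean(y))))
        predict = lambda m, X: np.full(len(X), m)
        X, y = np.zeros((100, 8, 8)), np.array([0, 1] * 50)
        print(kfold_accuracy(X, y, train, predict))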

  6. Multidirectional Networks of Government Transparency: A Preliminary Model

    Directory of Open Access Journals (Sweden)

    Ahmad Subhan

    2016-11-01

    Full Text Available This article reviews literature at the theoretical level regarding two concepts, governance networks and government transparency, in order to search for theoretical linkages and to build an alternative framework that can support the implementation of public disclosure. The transparency agenda has been implemented in various forms at the international, national, and local levels. Indonesia followed with the implementation of the Public Information Disclosure Law since 2008. This enthusiasm is quite reasonable because transparency is believed to be one of the human rights principles, as well as a key to better governance that can help consolidate democracy, prevent corruption, strengthen legitimacy and improve efficiency. In order to maximize transparency, the government can use a network approach, because changes such as democratization, decentralization, and liberalization have placed the government in a position where no single actor manages state power without stakeholder participation. In this context, the government needs to build synergies with other institutions in reciprocal relationships with all stakeholders. Therefore, adopting the theory of governance networks can be one of the strategies to strengthen government transparency. The findings of this article indicate that the application of government transparency needs to develop networks in all directions: intragovernmental, intergovernmental and collaborative networks. These three types of network contrast with the popular belief that government transparency is only a procedural activity directed at outside parties. The preliminary model in this article gives a more comprehensive overview of the arena of government transparency with multidirectional networks.

  7. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
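
    GRESS instruments FORTRAN source so that derivative values are propagated alongside function values. As an illustrative sketch only (GRESS itself works on FORTRAN and also supports the adjoint mode described above), the core forward-mode idea can be shown with a tiny "dual number" class:

        class Dual:
            """Carry a value and its derivative together through arithmetic."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                # product rule: (uv)' = u'v + uv'
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def model(x):               # any code built from + and * works unchanged
            return 3.0 * x * x + 2.0 * x + 1.0

        x = Dual(2.0, 1.0)          # seed dx/dx = 1
        y = model(x)
        print(y.val, y.der)         # 17.0 and dy/dx = 6x + 2 = 14.0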

  8. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  9. Grid computing in large pharmaceutical molecular modeling.

    Science.gov (United States)

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.

  10. Attacker Modelling in Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Papini, Davide

    in with our everyday life. This future is visible to everyone nowadays: terms like smartphone, cloud, sensor, network etc. are widely known and used in our everyday life. But what about the security of such systems? Ubiquitous computing devices can be limited in terms of energy, computing power and memory...... attacker remain somehow undefined and still under extensive investigation. This Thesis explores the nature of the ubiquitous attacker with a focus on how she interacts with the physical world and it defines a model that captures the abilities of the attacker. Furthermore a quantitative implementation

  11. Nursing opinion leadership: a preliminary model derived from philosophic theories of rational belief.

    Science.gov (United States)

    Anderson, Christine A; Whall, Ann L

    2013-10-01

    Opinion leaders are informal leaders who have the ability to influence others' decisions about adopting new products, practices or ideas. In the healthcare setting, the importance of translating new research evidence into practice has led to interest in understanding how opinion leaders could be used to speed this process. Despite continued interest, gaps in understanding opinion leadership remain. Agent-based models are computer models that have proven to be useful for representing dynamic and contextual phenomena such as opinion leadership. The purpose of this paper is to describe the work conducted in preparation for the development of an agent-based model of nursing opinion leadership. The aim of this phase of the model development project was to clarify basic assumptions about opinions, the individual attributes of opinion leaders and characteristics of the context in which they are effective. The process used to clarify these assumptions was the construction of a preliminary nursing opinion leader model, derived from philosophical theories about belief formation. © 2013 John Wiley & Sons Ltd.
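
    As a minimal, hedged sketch of the kind of agent-based model this preliminary work prepares for (the adoption probabilities, contact scheme, and leader boost below are illustrative assumptions, not the authors' specification), opinion-leader influence on practice adoption can be simulated in a few lines:

        import random

        def simulate(n=100, leaders=5, steps=50, base=0.01, peer=0.05,
                     boost=0.15, seed=1):
            rng = random.Random(seed)
            adopted = [False] * n          # agents 0..leaders-1 are opinion leaders
            adopted[0] = True              # seed adopter (a leader)
            for _ in range(steps):
                for i in range(n):
                    if adopted[i]:
                        continue
                    j = rng.randrange(n)   # one random contact per step
                    p = base
                    if adopted[j]:
                        p += boost if j < leaders else peer
                    if rng.random() < p:
                        adopted[i] = True
            return sum(adopted) / n

        print(simulate())   # fraction of agents who adopted the new practice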

  12. Climate models on massively parallel computers

    International Nuclear Information System (INIS)

    Vitart, F.; Rouvillois, P.

    1993-01-01

    First results obtained on massively parallel computers (Multiple Instruction Multiple Data and Single Instruction Multiple Data) make it possible to consider building coupled models with high resolutions. This would make possible the simulation of thermohaline circulation and other interaction phenomena between atmosphere and ocean. The increase in computing power, and with it the improvement in resolution, will lead us to revise our approximations. The hydrostatic approximation (in ocean circulation) will no longer be valid when the grid mesh reaches dimensions below a few kilometers: we shall have to find other models. The expertise gained in numerical analysis at the Center of Limeil-Valenton (CEL-V) will be used again to devise global models taking into account atmosphere, ocean, ice floe and biosphere, allowing climate simulation down to a regional scale.

  13. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research, owing to the richness of the huge amounts of medical information about the symptoms of diseases and how to distinguish between them to reach a correct diagnosis. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts make treatment decisions. This paper introduces four hybrid Rough - Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithm and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different knowledge discovery techniques for data pre-processing, reduction, and data mining supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology for knowledge extraction, according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the framework of KDD processes for supervised learning using the Granular Computing methodology.

  14. 40 CFR 194.23 - Models and computer codes.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Models and computer codes. 194.23... General Requirements § 194.23 Models and computer codes. (a) Any compliance application shall include: (1... obtain stable solutions; (iv) Computer models accurately implement the numerical models; i.e., computer...

  15. Computational Aerodynamic Modeling of Small Quadcopter Vehicles

    Science.gov (United States)

    Yoon, Seokkwan; Ventura Diaz, Patricia; Boyd, D. Douglas; Chan, William M.; Theodore, Colin R.

    2017-01-01

    High-fidelity computational simulations have been performed which focus on rotor-fuselage and rotor-rotor aerodynamic interactions of small quad-rotor vehicle systems. The three-dimensional unsteady Navier-Stokes equations are solved on overset grids using high-order accurate schemes, dual-time stepping, low Mach number preconditioning, and hybrid turbulence modeling. Computational results for isolated rotors are shown to compare well with available experimental data. Computational results in hover reveal the differences between a conventional configuration where the rotors are mounted above the fuselage and an unconventional configuration where the rotors are mounted below the fuselage. Complex flow physics in forward flight is investigated. The goal of this work is to demonstrate that understanding of interactional aerodynamics can be an important factor in design decisions regarding rotor and fuselage placement for next-generation multi-rotor drones.

  16. Synopsis of some preliminary computational studies related to unsaturated zone transport at Area G

    International Nuclear Information System (INIS)

    Vold, E.

    1998-03-01

    Computational transport models are described with applications in three problem areas related to unsaturated zone moisture movement beneath Area G. These studies may be used to support the ongoing maintenance of the site Performance Assessment. The three areas include: a 1-D transient analysis with average tuff hydraulic properties in the near surface region, with computed results compared to field data; the influence on near surface transient moisture percolation of realistic distributions in hydraulic properties, derived statistically from the observed variance in the field data; and the west-to-east moisture flow in a 2-D steady geometry approximation of the Pajarito Plateau. Results indicate that a simple transient model for transport of moisture volume fraction fits field data well when compared to a moisture pulse observed in the active disposal unit, pit 37. Using realistic infiltration boundary conditions for summer showers and for spring snow melt conditions, the computed moisture pulses show significant propagation to less than 10-ft depth. Next, the hydraulic properties were varied on a 2-D grid using statistical distributions based on the field data means and variances for the hydraulic parameters. Near surface transient percolation under these conditions shows a qualitatively realistic percolation with a spatially variable wave front moving into the tuff; however, the flow does not channel into preferred paths, suggesting there is no formation of fast paths which could enhance transport of contaminants. Finally, moisture transport is modeled through an unsaturated 2-D slice representing the upper stratigraphic layers beneath Area G along a west-to-east cut of several miles, to examine possible lateral movement from the west, where percolation is assumed to be greater than at Area G. Results show some west-to-east moisture flux consistent with the assumed profile for the percolation boundary conditions.
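
    The 1-D transient analysis described above can be sketched as an explicit finite-difference march of the moisture volume fraction theta(z, t) under a pulsed infiltration boundary. This is a hedged illustration with a constant stand-in diffusivity D; the report's actual tuff hydraulic properties and governing equations are not reproduced here:

        def moisture_profile(nz=50, dz=0.1, dt=0.01, steps=2000, D=0.02,
                             theta0=0.05, theta_surface=0.30, pulse_steps=500):
            theta = [theta0] * nz
            for n in range(steps):
                # infiltration pulse at the surface (summer shower / snow melt)
                theta[0] = theta_surface if n < pulse_steps else theta0
                new = theta[:]
                for i in range(1, nz - 1):
                    new[i] = theta[i] + D * dt / dz ** 2 * (
                        theta[i + 1] - 2 * theta[i] + theta[i - 1])
                theta = new
            return theta  # the pulse decays with depth, as in the <10-ft result

        profile = moisture_profile()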

  17. Preliminary development of the Active Colonoscopy Training Model

    Directory of Open Access Journals (Sweden)

    Choi J

    2011-06-01

    Full Text Available JungHun Choi,1 Kale Ravindra,1 Randolph Robert,1 David Drozek2 (1Mechanical Engineering, Ohio University, Athens, OH, USA; 2College of Osteopathic Medicine, Ohio University, Athens, OH, USA). Abstract: Formal colonoscopy training requires a significant amount of time and effort. In particular, it requires actual patients for a realistic learning experience. The quality of colonoscopy training varies, and includes didactic courses and procedures proctored by skilled surgeons. A colonoscopy training model is occasionally used as part of the training method, but the effects are small due to the simple and tedious training procedures. To enhance the educational effect of the colonoscopy training model, the Active Colonoscopy Training Model (ACTM) has been developed. ACTM is an interactive colonoscopy training device which recreates the environment of a real colonoscopy procedure as closely as possible. It comprises a configurable rubber colon, a human torso, sensors, a display, and the control part. The ACTM provides audio and visual interaction to the trainee by monitoring important factors, such as the forces caused by the distal tip and the shaft of the colonoscope, the pressure needed to open up the lumen, and the localization of the distal tip. On the computer screen, the trainee can easily monitor the status of the colonoscopy, which includes the localization of the distal tip, maximum forces, pressure inside the colon, and surgery time. The forces between the rubber colon and the constraints inside the ACTM are measured and the real-time display shows the results to the trainee. The pressure sensors check the pressure at different parts of the colon. The real-time localization of the distal tip gives the colonoscopy trainee easier and more confident operation without introducing an additional device into the colonoscope. With the current need for colonoscopists and physicians, the ACTM can play an essential role in resolving the problems of the current

  18. Computer-assisted learning in anatomy at the international medical school in Debrecen, Hungary: a preliminary report.

    Science.gov (United States)

    Kish, Gary; Cook, Samuel A; Kis, Gréta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an introduction to anatomical digital images along with clinical cases. This low-budget course has a large visual component using images from magnetic resonance imaging and computed axial tomography scans, ultrasound clinical studies, and readily available anatomy software that presents topics which run in parallel to the university's core anatomy curriculum. From the combined computer images and CHA lecture information, students are asked to solve computer-based clinical anatomy problems in the CHA computer laboratory. A statistical comparison was undertaken of core anatomy oral examination performances of English program first-year medical students who took the elective CHA course and those who did not in the three academic years 2007-2008, 2008-2009, and 2009-2010. The results of this study indicate that the CHA-enrolled students significantly improved their performance on required anatomy core curriculum oral examinations, suggesting that computer-assisted learning may play an active role in anatomy curriculum improvement. These preliminary results have prompted ongoing evaluation of what specific aspects of CHA are valuable and which students benefit from computer-assisted learning in a multilingual and diverse cultural environment. Copyright © 2012 American Association of Anatomists.

  19. In Vitro Studies and Preliminary Mathematical Model for Jet Fuel and Noise Induced Auditory Impairment

    Science.gov (United States)

    2015-06-01

    [Report-form and reference-list residue; recoverable information:] Report AFRL-RH-WP-TR-2015-0084; title: In Vitro Studies and Preliminary Mathematical Model for Jet Fuel and Noise Induced Auditory Impairment; reporting period April 2014 - September 2014. Cited reference fragment: "... of JP-8 and a Fischer-Tropsch synthetic jet fuel following subacute inhalation exposure in rats," Toxicol Sci 116(1): 239-248.

  20. Ten Years toward Equity: Preliminary Results from a Follow-Up Case Study of Academic Computing Culture

    Directory of Open Access Journals (Sweden)

    Tanya L. Crenshaw

    2017-05-01

    Full Text Available Just over 10 years ago, we conducted a culture study of the Computer Science Department at the flagship University of Illinois at Urbana-Champaign, one of the top five computing departments in the country. The study found that while the department placed an emphasis on research, it did so in a way that, in conjunction with a lack of communication and transparency, devalued teaching and mentoring, and negatively impacted the professional development, education, and sense of belonging of the students. As one part of a multi-phase case study spanning over a decade, this manuscript presents preliminary findings from our latest work at the university. We detail early comparisons between data gathered at the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2005 and our most recent pilot case study, a follow-up research project completed in 2016. Though we have not yet completed the full data collection, we find it worthwhile to reflect on the pilot case study data we have collected thus far. Our data reveals improvements in the perceptions of undergraduate teaching quality and undergraduate peer mentoring networks. However, we also found evidence of continuing feelings of isolation, incidents of bias, policy opacity, and uneven policy implementation that are areas of concern, particularly with respect to historically underrepresented groups. We discuss these preliminary follow-up findings, offer research and methodological reflections, and share next steps for applied research that aims to create positive cultural change in computing.

  1. A preliminary geodetic data model for geographic information systems

    Science.gov (United States)

    Kelly, K. M.

    2009-12-01

    geophysical datasets, thus facilitating creation of multi-tiered models. The Geodetic Data Model encourages data assimilation and analysis and facilitates data interoperability, coordination and integration in earth system modeling. It offers a basic set of data structures organized in a simple and homogeneous way and can streamline access to and processing of geodetic data. It can aid knowledge discovery through the use of GIS technology to enable identification and understanding of relationships, and provides well-established tools and methods to communicate complex technical knowledge to non-specialist audiences. The Geodetic Data Model comprises the base classes for using workflow driven ontology (WDO) techniques for specifying the computation of complex geodetic products along with the ability to capture provenance information. While we do not specify WDO for any given geodetic product, we recognize that structured geodetic data is essential for generating any geodetic WDO, a task that can be streamlined in some GIS software.

  2. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Whereas traditional research methodologies for the human cardiovascular system are challenging due to their invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point for each of the individual disciplines involved and attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  3. Computer model for harmonic ultrasound imaging.

    Science.gov (United States)

    Li, Y; Zagzebski, J A

    2000-01-01

    Harmonic ultrasound imaging has received great attention from ultrasound scanner manufacturers and researchers. In this paper, we present a computer model that can generate realistic harmonic images. In this model, the incident ultrasound is modeled after the "KZK" equation, and the echo signal is modeled using linear propagation theory because the echo signal is much weaker than the incident pulse. Both time domain and frequency domain numerical solutions to the "KZK" equation were studied. Realistic harmonic images of spherical lesion phantoms were generated for scans by a circular transducer. This model can be a very useful tool for studying the harmonic buildup and dissipation processes in a nonlinear medium, and it can be used to investigate a wide variety of topics related to B-mode harmonic imaging.
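
    For reference, the KZK (Khokhlov-Zabolotskaya-Kuznetsov) equation used for the incident beam is commonly written in textbook notation as follows (the paper's own symbol conventions may differ):

        \frac{\partial^2 p}{\partial z\,\partial\tau}
          = \frac{c_0}{2}\,\nabla_{\perp}^{2} p
          + \frac{\delta}{2c_0^{3}}\,\frac{\partial^{3} p}{\partial\tau^{3}}
          + \frac{\beta}{2\rho_0 c_0^{3}}\,\frac{\partial^{2} (p^{2})}{\partial\tau^{2}}

    where p is the acoustic pressure, z the axial distance, tau the retarded time, c_0 the small-signal sound speed, delta the sound diffusivity (absorption), beta the nonlinearity coefficient, and rho_0 the ambient density; the three right-hand terms model diffraction, absorption, and nonlinearity respectively.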

  4. Computer modelling of superconductive fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Weller, R.A.; Campbell, A.M.; Coombs, T.A.; Cardwell, D.A.; Storey, R.J. [Cambridge Univ. (United Kingdom). Interdisciplinary Research Centre in Superconductivity (IRC); Hancox, J. [Rolls Royce, Applied Science Division, Derby (United Kingdom)

    1998-05-01

    Investigations are being carried out on the use of superconductors for fault current limiting applications. A number of computer programs are being developed to predict the behavior of different 'resistive' fault current limiter designs under a variety of fault conditions. The programs achieve solution by iterative methods based around real measured data rather than theoretical models in order to achieve accuracy at high current densities. (orig.) 5 refs.

  5. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. For permission

  6. Prediction of the behavior of pedestrian bridges using computer models

    Directory of Open Access Journals (Sweden)

    Jonathan José Cala Monroy

    2017-07-01

    Full Text Available Introduction: This article presents a brief introduction to the issues related to low-frequency vibrations, identifying human walking as the relevant source that affects footbridge structures and becomes an inconvenience to pedestrian traffic. Objective: The main objective of this paper is to explain the methods most commonly used by engineers for the evaluation of vibrations and their effects, as well as their limitations; furthermore, a computer modeling technique was developed in order to bring the analysis closer to the real phenomenon of vibrations in pedestrian bridges. Methodology: The work was divided into two main phases: the first was a conceptual bibliographical review of the subject of floor vibrations, focusing on the use of Design Guide No. 11 of the American Institute of Steel Construction; the second was the development of a computer model, which included the definition of variables, the elaboration of a dynamic model of the structure, the calibration of the model, the evaluation of the parameters under study, and the analysis of results and conclusions. Results: In accordance with the preliminary stages, accelerations were obtained for different frequencies and different degrees of damping, showing that the chosen sample was potentially susceptible in the four to eight Hz range; hence, when resonance took place, the structure presented a peak acceleration above the comfort threshold recommended for pedestrian bridges. Conclusions: With appropriate modeling techniques and finite elements, convenient and reliable results can be accomplished to guide the design process of structures such as pedestrian bridges.

  7. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
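
    Two staple results of the kind such a book builds on, quoted here from standard queueing theory for orientation rather than from the book itself, are Little's law and the M/M/1 mean response time, for arrival rate \lambda and service rate \mu:

        L = \lambda W, \qquad W_{\mathrm{M/M/1}} = \frac{1}{\mu - \lambda}, \quad (\lambda < \mu)

    For example, a server completing \mu = 100 requests/s under a load of \lambda = 80 requests/s has a mean response time of 1/(100 - 80) s = 50 ms and, by Little's law, holds L = 80 * 0.05 = 4 requests in the system on average.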

  8. The deterministic computational modelling of radioactivity

    International Nuclear Information System (INIS)

    Damasceno, Ralf M.; Barros, Ricardo C.

    2009-01-01

    This paper describes a computational application (software) that models simple radioactive decay, decay to stable nuclei, and chain decay with up to thirteen directly coupled radioactive decays, together with an internal data bank holding the decay constants of the various known decays. This considerably eases use of the program by people who are not connected to the nuclear area, making it accessible to those without specialist knowledge of the field. The paper presents numerical results for typical model problems.
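
    As a hedged sketch of the chain-decay calculation such a program performs (illustrative decay constants and a simple explicit Euler integration; the applicative's own numerical method is not described in the record), the governing Bateman system dN_k/dt = lambda_{k-1} N_{k-1} - lambda_k N_k can be integrated directly:

        def decay_chain(lambdas, n0, dt=1.0, steps=10000):
            """March the chain A -> B -> ... forward in time by explicit Euler."""
            n = n0[:]                          # atoms of each chain member
            for _ in range(steps):
                flows = [lam * nk * dt for lam, nk in zip(lambdas, n)]
                for k in range(len(n)):
                    n[k] -= flows[k]           # decay out of member k
                    if k > 0:
                        n[k] += flows[k - 1]   # feed from parent k-1
            return n

        # three-member chain A -> B -> C (C stable, so lambda_C = 0)
        print(decay_chain([1e-3, 5e-4, 0.0], [1e6, 0.0, 0.0]))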

  9. Cloud Computing, Tieto Cloud Server Model

    OpenAIRE

    Suikkanen, Saara

    2013-01-01

    The purpose of this study is to find out what cloud computing is. To be able to make wise decisions when moving to the cloud or considering it, companies need to understand what the cloud consists of: which model suits the company best, what should be taken into account before moving to the cloud, what the cloud broker's role is, and a SWOT analysis of the cloud. To be able to answer customer requirements and business demands, IT companies should develop and produce new service models. IT house T...

  10. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times that of the reference model for each response of interest; determining these derivatives by parameter perturbations would instead require on the order of 3000 model runs for a single response. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.

  11. ADGEN: ADjoint GENerator for computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.

    1989-05-01

    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 times that of the reference model for each response of interest; determining these derivatives by parameter perturbations would instead require on the order of 3000 model runs for a single response. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 2 figs.

  12. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field in order to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches we have developed and studied over recent years. The outcome is new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided both by responsibility towards processes and by the consequences they initiate.

  13. Measurement of single kidney contrast media clearance by multiphasic spiral computed tomography: preliminary results

    International Nuclear Information System (INIS)

    Hackstein, N.; Puille, M.F.; Bak, Benjamin H.; Scharwat, Oliver; Rau, W.S.

    2001-01-01

    Objective. We present preliminary results of a new method (hereinafter called 'CT-clearance') to measure single kidney contrast media clearance by performing multiphasic helical CT of the kidneys. The CT-clearance was calculated according to an extension of the Patlak plot. In contrast to prior investigators, who repeatedly measured a single slice, this method makes it possible to calculate single kidney clearance from at least three spiral CTs, utilizing the whole kidney volume. Methods. Spiral CT of the kidneys was performed unenhanced and about 30 and 100 s after administration of about 120 ml iopromide. The summed density of the whole kidneys and the aortic density were calculated from these data, from which the renal clearance of contrast media was calculated by CT-clearance in 29 patients. As a reference, serum clearance was calculated in 24 patients by application of a modified one-exponential slope model. Information on the relative kidney function was gained by renal scintigraphy with Tc99m-MAG-3 or Tc99m-DMSA in 29 patients. Results. Linear regression analysis revealed a correlation coefficient of CT-clearance with serum clearance of r=0.78 with Cl(CT) [ml/min]=22.2+1.03 * Cl(serum), n=24. Linear regression of the relative kidney function (rkf) of the right kidney calculated by CT-clearance compared to scintigraphy results provided a correlation coefficient r=0.89 with rkf(CT)[%]=18.6+0.58 * rkf(scintigraphy), n=29. Conclusion. The results of contrast media clearance measured by CT-clearance are in the physiological range of the parameter. Future studies should be performed to improve the methodology with the aim of higher accuracy. More specifically, better determination of the aortic density curve might improve the accuracy.
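
    For orientation, the underlying Patlak-plot relation is commonly written as follows (general textbook form; the paper's whole-volume extension for multiphasic spiral CT is not reproduced here). With R(t) the amount of contrast medium in the kidney and C_a(t) the aortic input concentration,

        \frac{R(t)}{C_a(t)} \;=\; K \cdot \frac{\int_0^t C_a(\tau)\,d\tau}{C_a(t)} \;+\; V_0

    where the slope K estimates the clearance and the intercept V_0 the initial distribution volume; plotting R(t)/C_a(t) against the normalized integral turns the clearance estimate into a linear fit over the (at least three) CT acquisitions.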

  14. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depends on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous, in vitro, microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  15. Computational Fluid Dynamics Modeling of Bacillus anthracis ...

    Science.gov (United States)

    Journal Article Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. Four different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Despite the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways of the human at the same air concentration of anthrax spores. This greater deposition of spores in the upper airways in the human resulted in lower penetration and deposition in the tracheobronchial airways and the deep lung than that predict

  16. Computer-based attention training in the schools for children with attention deficit/hyperactivity disorder: a preliminary trial.

    Science.gov (United States)

    Steiner, Naomi J; Sheldrick, Radley Christopher; Gotthelf, David; Perrin, Ellen C

    2011-07-01

    Objective. This study examined the efficacy of 2 computer-based training systems to teach children with attention deficit/hyperactivity disorder (ADHD) to attend more effectively. Design/methods. A total of 41 children with ADHD from 2 middle schools were randomly assigned to receive 2 sessions a week at school of either neurofeedback (NF) or attention training through a standard computer format (SCF), either immediately or after a 6-month wait (waitlist control group). Parents, children, and teachers completed questionnaires pre- and postintervention. Results. Primary parents in the NF condition reported significant improvements on the ADHD index, the BASC Attention Problems Scale, and the Behavioral Rating Inventory of Executive Functioning (BRIEF). Conclusion. This randomized control trial provides preliminary evidence of the effectiveness of computer-based interventions for ADHD and supports the feasibility of offering them in a school setting.

  17. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  18. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of the interaction of ligands with this receptor, is a subject of increasing interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from among recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from the docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER, good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% of residues in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus allows a reliable model of DOR to be suggested. The newly generated model of the DOR receptor could be used further for in silico experiments and will enable faster and more correct design of selective and effective ligands for the δ-opioid receptor.
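
    The reported Pearson correlation between docking scores and in vitro efficacies can be sketched as follows (the numbers below are placeholders, not the study's data):

        import math

        def pearson_r(xs, ys):
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
            sy = math.sqrt(sum((y - my) ** 2 for y in ys))
            return cov / (sx * sy)

        fitness = [52.1, 48.3, 60.7, 55.2, 45.9]   # hypothetical docking scores
        erel    = [0.80, 0.95, 0.40, 0.65, 1.00]   # hypothetical in vitro efficacies
        print(pearson_r(fitness, erel))            # negative, as in the reported r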

  19. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs

  20. Computer models for optimizing radiation therapy

    International Nuclear Information System (INIS)

    Duechting, W.

    1998-01-01

    The aim of this contribution is to outline how methods of system analysis, control theory and modelling can be applied to simulate normal and malignant cell growth and to optimize cancer treatment such as radiation therapy. Based on biological observations and cell kinetic data, several types of models have been developed describing the growth of tumor spheroids and the cell renewal of normal tissue. The irradiation model is represented by the so-called linear-quadratic model describing the survival fraction as a function of the dose. Based thereon, numerous simulation runs for different treatment schemes can be performed. Thus, it is possible to study the radiation effect on tumor and normal tissue separately. Finally, this method enables a computer-assisted recommendation for an optimal patient-specific treatment schedule prior to clinical therapy. (orig.)
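
    The linear-quadratic model referred to above gives the surviving fraction S after a single dose D, with tissue-specific radiosensitivity parameters \alpha and \beta:

        S(D) = \exp\!\left(-\alpha D - \beta D^{2}\right)

    Under the usual assumption of complete repair between fractions, the exponents of the individual fractions add, which is why simulation runs over different treatment schemes can compare tumor and normal-tissue response within the same framework.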

  1. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.

    2010-08-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models coupled to fire behavior models to simulate fire behavior. NWP models are capable of modeling very high resolution (< 100 m) atmospheric flows. The wildland fire component is based upon semi-empirical formulas for fireline rate of spread, post-frontal heat release, and a canopy fire. The fire behavior is coupled to the atmospheric model such that low level winds drive the spread of the surface fire, which in turn releases sensible heat, latent heat, and smoke fluxes into the lower atmosphere, feeding back to affect the winds directing the fire. These coupled dynamic models capture the rapid spread downwind, flank runs up canyons, bifurcations of the fire into two heads, and rough agreement in area, shape, and direction of spread at periods for which fire location data is available. Yet, intriguing computational science questions arise in applying such models in a predictive manner, including physical processes that span a vast range of scales, processes such as spotting that cannot be modeled deterministically, estimating the consequences of uncertainty, the efforts to steer simulations with field data ("data assimilation"), lingering issues with short term forecasting of weather that may show skill only on the order of a few hours, and the difficulty of gathering pertinent data for verification and initialization in a dangerous environment. © 2010 IEEE.

  2. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.
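
    The evolutionary heuristic of the first stage can be sketched as a search over glovebox orderings minimizing total weighted travel between stations. This is a hedged, generic illustration (toy flow matrix, swap mutation), not the Los Alamos formulation:

        import random

        FLOW = [[0, 8, 2, 1],   # FLOW[i][j]: material movements between boxes i, j
                [8, 0, 5, 1],
                [2, 5, 0, 6],
                [1, 1, 6, 0]]

        def cost(order):
            # slots lie along a line; distance = separation of slot indices
            pos = {g: slot for slot, g in enumerate(order)}
            return sum(FLOW[i][j] * abs(pos[i] - pos[j])
                       for i in range(4) for j in range(i + 1, 4))

        def evolve(generations=200, pop_size=20, seed=3):
            rng = random.Random(seed)
            pop = [rng.sample(range(4), 4) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=cost)
                survivors = pop[:pop_size // 2]     # keep the fitter half
                children = []
                for parent in survivors:            # mutate: swap two slots
                    child = parent[:]
                    a, b = rng.sample(range(4), 2)
                    child[a], child[b] = child[b], child[a]
                    children.append(child)
                pop = survivors + children
            return min(pop, key=cost)

        best = evolve()
        print(best, cost(best))   # the layout then feeds the stage-two simulation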

  3. Computer modeling for optimal placement of gloveboxes

    International Nuclear Information System (INIS)

    Hench, K.W.; Olivas, J.D.; Finch, P.R.

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  4. Preparing computers for affective communication: a psychophysiological concept and preliminary results.

    Science.gov (United States)

    Whang, Min Cheol; Lim, Joa Sang; Boucsein, Wolfram

    Despite rapid advances in technology, computers remain incapable of responding to human emotions. An exploratory study was conducted to find out what physiological parameters might be useful to differentiate among 4 emotional states, based on 2 dimensions: pleasantness versus unpleasantness and arousal versus relaxation. The 4 emotions were induced by exposing 26 undergraduate students to different combinations of olfactory and auditory stimuli, selected in a pretest from 12 stimuli by subjective ratings of arousal and valence. Changes in electroencephalographic (EEG), heart rate variability, and electrodermal measures were used to differentiate the 4 emotions. EEG activity separates pleasantness from unpleasantness only in the aroused but not in the relaxed domain, where electrodermal parameters are the differentiating ones. All three classes of parameters contribute to a separation between arousal and relaxation in the positive valence domain, whereas the latency of the electrodermal response is the only differentiating parameter in the negative domain. We discuss how such a psychophysiological approach may be incorporated into a systemic model of a computer responsive to affective communication from the user.

  5. Computer models in the design of FXR

    International Nuclear Information System (INIS)

    Vogtlin, G.; Kuenning, R.

    1980-01-01

    Lawrence Livermore National Laboratory is developing a 15 to 20 MeV electron accelerator with a beam current goal of 4 kA. This accelerator will be used for flash radiography and has a requirement of high reliability. Components being developed include spark gaps, Marx generators, water Blumleins and oil insulation systems. A SCEPTRE model was developed that takes into consideration the non-linearity of the ferrite and the time dependency of the emission from a field emitter cathode. This model was used to predict an optimum charge time to obtain maximum magnetic flux change from the ferrite. This model and its application will be discussed. JASON was used extensively to determine optimum locations and shapes of supports and insulators. It was also used to determine stress within bubbles adjacent to walls in oil. Computer results will be shown and bubble breakdown will be related to bubble size

  6. Computational modeling of a forward lunge

    DEFF Research Database (Denmark)

    Alkjær, Tine; Wieland, Maja Rose; Andersen, Michael Skipper

    2012-01-01

    during forward lunging. Thus, the purpose of the present study was to establish a musculoskeletal model of the forward lunge to computationally investigate the complete mechanical force equilibrium of the tibia during the movement to examine the loading pattern of the cruciate ligaments. A healthy female...... was selected from a group of healthy subjects who all performed a forward lunge on a force platform, targeting a knee flexion angle of 90°. Skin-markers were placed on anatomical landmarks on the subject and the movement was recorded by five video cameras. The three-dimensional kinematic data describing...... the forward lunge movement were extracted and used to develop a biomechanical model of the lunge movement. The model comprised two legs including femur, crus, rigid foot segments and the pelvis. Each leg had 35 independent muscle units, which were recruited according to a minimum fatigue criterion...

  7. Computational fluid dynamic modelling of cavitation

    Science.gov (United States)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  8. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since then, significant advances have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  9. Surface complexation modelling: Experiments on sorption of nickel on quartz, goethite and kaolinite and preliminary tests on sorption of thorium on quartz

    Energy Technology Data Exchange (ETDEWEB)

    Puukko, E.; Hakanen, M. [Univ. of Helsinki (Finland). Dept. of Chemistry. Lab. of Radiochemistry

    1997-09-01

    The aim of the work was to study the sorption behaviour of Ni on quartz, goethite and kaolinite at different pH levels and in electrolyte solutions of different strength. In addition, preliminary experiments were made to study the sorption of thorium on quartz. The MUS quartz and Nilsiae quartz were analysed for MnO{sub 2} by neutron activation analysis (NAA) and the experimental results were modelled with the HYDRAQL computer model. 9 refs.

  10. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t
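
    The approach described above lends itself to a small worked example. As an illustration (not one of the book's case studies), the following minimal Python sketch uses the PuLP library to pose a toy spatial-architecture scheduling problem as a MILP: assign computation nodes to tiles so that the worst tile load is minimized. All names and cost values are invented for illustration.

        import pulp

        # Toy instance (hypothetical data): place 4 computation nodes on 2 tiles,
        # minimizing the load of the most heavily loaded tile.
        nodes = ["a", "b", "c", "d"]
        tiles = [0, 1]
        cost = {"a": 3, "b": 2, "c": 4, "d": 1}    # per-node execution cost

        prob = pulp.LpProblem("tile_schedule", pulp.LpMinimize)
        x = pulp.LpVariable.dicts("x", (nodes, tiles), cat="Binary")
        makespan = pulp.LpVariable("makespan", lowBound=0)
        prob += makespan                           # objective: minimize worst tile load

        for n in nodes:                            # each node is placed exactly once
            prob += pulp.lpSum(x[n][t] for t in tiles) == 1
        for t in tiles:                            # every tile load is bounded by makespan
            prob += pulp.lpSum(cost[n] * x[n][t] for n in nodes) <= makespan

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print(pulp.LpStatus[prob.status], pulp.value(makespan))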

  11. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Nowadays, digital computer systems and networks are the main engineering tools, being used in planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life maintaining devices. Consequently, computer viruses became one of the most important sources of uncertainty, contributing to decrease the reliability of vital activities. A lot of antivirus programs have been developed, but they are limited to detecting and removing infections, based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial at modeling computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses were collected on the Internet and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network.
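
    As a rough illustration of the identification step described above (not the authors' code), the sketch below fits an autoregressive model to an infection time series by ordinary least squares and extrapolates it. The series and the model order p are invented for illustration.

        import numpy as np

        def fit_ar(series, p=3):
            """Least-squares fit of an AR(p) model: y[t] = sum_k a[k] * y[t-k-1]."""
            y = np.asarray(series, dtype=float)
            X = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
            coeffs, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
            return coeffs

        def forecast(series, coeffs, steps=5):
            hist = list(series)
            for _ in range(steps):
                # most recent values first, matching the coefficient ordering
                hist.append(float(np.dot(coeffs, hist[-1 : -len(coeffs) - 1 : -1])))
            return hist[-steps:]

        counts = [2, 5, 11, 20, 34, 50, 66, 79, 88, 93]   # synthetic infection counts
        a = fit_ar(counts, p=3)
        print(forecast(counts, a))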

  12. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model of group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with the abstract agents, which permit the model to include social concepts (gang) or institutional concepts (school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkit to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to different scenario development for inner-city gang recruitment.

  13. Preliminary Investigation of Time Remaining Display on the Computer-based Emergency Operating Procedure

    Science.gov (United States)

    Suryono, T. J.; Gofuku, A.

    2018-02-01

    One of the important things in the mitigation of nuclear power plant accidents is time management. Accidents should be resolved as soon as possible in order to prevent core melting and the release of radioactive material to the environment. In this case, operators should follow the emergency operating procedure related to the accident, step by step and within the allowable time. Nowadays, advanced main control rooms are equipped with computer-based procedures (CBPs), which make it easier for operators to do their tasks of monitoring and controlling the reactor. However, most CBPs do not include a time-remaining display feature, which would inform operators of the time available for them to execute procedure steps and warn them if they reach the time limit. Furthermore, such a feature would increase operators' awareness of their current situation in the procedure. This paper investigates this issue. The simplified emergency operating procedure (EOP) for a steam generator tube rupture (SGTR) accident of a PWR plant is applied. In addition, the sequence of actions in each step of the procedure is modelled using multilevel flow modelling (MFM) and influence propagation rules. The prediction of the action time for each step is acquired based on similar past accidents and Support Vector Regression. The derived time will be processed and then displayed on a CBP user interface.
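
    The step-time prediction described above can be sketched with scikit-learn's SVR. The features, training values, and hyperparameters below are invented for illustration and are not taken from the paper.

        import numpy as np
        from sklearn.svm import SVR

        # Hypothetical training data: each row describes one procedure step from a
        # past SGTR-like event (number of actions, number of checks); the target is
        # the time the crew actually needed for that step, in seconds.
        X_train = np.array([[2, 1], [4, 2], [6, 3], [3, 3], [5, 1], [8, 4]])
        y_train = np.array([40.0, 95.0, 150.0, 90.0, 100.0, 210.0])

        model = SVR(kernel="rbf", C=100.0, epsilon=5.0)
        model.fit(X_train, y_train)

        # Predict the duration of an upcoming step, to drive a time-remaining display.
        predicted = model.predict(np.array([[5, 2]]))[0]
        print(f"predicted step duration: {predicted:.0f} s")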

  14. Getting computer models to communicate; Faire communiquer les modeles numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Caremoli, Ch. [Electricite de France (EDF), 75 - Paris (France). Dept. Mecanique et Modeles Numeriques; Erhard, P. [Electricite de France (EDF), 75 - Paris (France). Dept. Physique des Reacteurs

    1999-07-01

    Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution; to couple the existing validated numerical models together so that they work as one. (authors)

  15. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  16. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
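
    As a hedged sketch of this class of model (an SIR-type system with computers joining and leaving the network and antivirus software providing recovery; the paper's actual equations and parameters may differ), the following integrates such a system and reports the epidemic threshold that separates the disease-free from the endemic regime.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical parameters: b = rate computers join, d = rate they leave,
        # beta = infection rate, g = cleanup rate by antivirus software.
        b, d, beta, g = 0.02, 0.02, 0.45, 0.25

        def rhs(t, y):
            S, I, R = y
            N = S + I + R
            return [b * N - beta * S * I / N - d * S,
                    beta * S * I / N - g * I - d * I,
                    g * I - d * R]

        R0 = beta / (g + d)     # threshold: an endemic equilibrium exists if R0 > 1
        sol = solve_ivp(rhs, (0.0, 200.0), [990.0, 10.0, 0.0])
        print(f"R0 = {R0:.2f}, infected at t=200: {sol.y[1, -1]:.1f}")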

  17. Gas turbine designer computer program - a study of using a computer for preliminary design of gas turbines

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Rickard

    1995-11-01

    This thesis presents calculation schemes and theories for the preliminary design of the fan, high pressure compressor and turbine of a gas turbine. The calculations are presented step by step, making them easier to implement in other applications. The calculation schemes have been implemented as a subroutine in a thermodynamic program. The combination of the thermodynamic cycle calculation and the design calculation turned out to give quite relevant results when predicting the geometry and performance of an existing aero engine. The program developed is able to handle several different gas turbines, including those in which the flow is split (i.e. turbofan engines). The design process is limited to the fan, compressor and turbine of the gas turbine; the other components have not been considered. Outputs from the program are the main geometry, presented both numerically and as a scale plot, component efficiencies, stresses at critical points and a simple prediction of turbine blade temperatures. 11 refs, 21 figs, 1 tab
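
    One typical step in such a preliminary design calculation can be sketched as follows: under standard compressible-flow relations, the compressor exit temperature follows from the pressure ratio and polytropic efficiency, and an assumed per-stage temperature rise then gives a stage count. All numbers are hypothetical and not taken from the thesis.

        # Rough compressor sizing step (standard relations; values hypothetical).
        T1 = 288.15        # inlet total temperature, K
        PR = 16.0          # overall pressure ratio
        eta_p = 0.90       # polytropic efficiency
        gamma = 1.4
        dT_stage = 40.0    # assumed achievable temperature rise per stage, K

        T2 = T1 * PR ** ((gamma - 1.0) / (gamma * eta_p))
        n_stages = -(-(T2 - T1) // dT_stage)       # ceiling division
        print(f"exit temperature {T2:.0f} K, about {int(n_stages)} stages")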

  18. Modeling Reality: How Computers Mirror Life

    International Nuclear Information System (INIS)

    Inoue, J-I

    2005-01-01

    Modeling Reality: How Computers Mirror Life covers a wide range of modern subjects in complex systems, suitable not only for undergraduate students who want to learn about modelling 'reality' by using computer simulations, but also for researchers who want to learn something about subjects outside of their majors and need a simple guide. Readers are not required to have specialized training before they start the book. Each chapter is organized so as to train the reader to grasp the essential idea of simulating phenomena and guide him/her towards more advanced areas. The topics presented in this textbook fall into two categories. The first is at graduate level, namely probability, statistics, information theory, graph theory, and the Turing machine, which are standard topics in the course of information science and information engineering departments. The second addresses more advanced topics, namely cellular automata, deterministic chaos, fractals, game theory, neural networks, and genetic algorithms. Several topics included here (neural networks, game theory, information processing, etc) are now some of the main subjects of statistical mechanics, and many papers related to these interdisciplinary fields are published in Journal of Physics A: Mathematical and General, so readers of this journal will be familiar with the subject areas of this book. However, each area is restricted to an elementary level and if readers wish to know more about the topics they are interested in, they will need more advanced books. For example, on neural networks, the text deals with the back-propagation algorithm for perceptron learning. Nowadays, however, this is a rather old topic, so the reader might well choose, for example, Introduction to the Theory of Neural Computation by J Hertz et al (Perseus books, 1991) or Statistical Physics of Spin Glasses and Information Processing by H Nishimori (Oxford University Press, 2001) for further reading. Nevertheless, this book is worthwhile

  19. A COMPUTATIONAL MODEL OF MOTOR NEURON DEGENERATION

    Science.gov (United States)

    Le Masson, Gwendal; Przedborski, Serge; Abbott, L.F.

    2014-01-01

    To explore the link between bioenergetics and motor neuron degeneration, we used a computational model in which detailed morphology and ion conductance are paired with intracellular ATP production and consumption. We found that reduced ATP availability increases the metabolic cost of a single action potential and disrupts K+/Na+ homeostasis, resulting in a chronic depolarization. The magnitude of the ATP shortage at which this ionic instability occurs depends on the morphology and intrinsic conductance characteristic of the neuron. If ATP shortage is confined to the distal part of the axon, the ensuing local ionic instability eventually spreads to the whole neuron and involves fasciculation-like spiking events. A shortage of ATP also causes a rise in intracellular calcium. Our modeling work supports the notion that mitochondrial dysfunction can account for salient features of the paralytic disorder amyotrophic lateral sclerosis, including motor neuron hyperexcitability, fasciculation, and differential vulnerability of motor neuron subpopulations. PMID:25088365

  20. A computational model of motor neuron degeneration.

    Science.gov (United States)

    Le Masson, Gwendal; Przedborski, Serge; Abbott, L F

    2014-08-20

    To explore the link between bioenergetics and motor neuron degeneration, we used a computational model in which detailed morphology and ion conductance are paired with intracellular ATP production and consumption. We found that reduced ATP availability increases the metabolic cost of a single action potential and disrupts K+/Na+ homeostasis, resulting in a chronic depolarization. The magnitude of the ATP shortage at which this ionic instability occurs depends on the morphology and intrinsic conductance characteristic of the neuron. If ATP shortage is confined to the distal part of the axon, the ensuing local ionic instability eventually spreads to the whole neuron and involves fasciculation-like spiking events. A shortage of ATP also causes a rise in intracellular calcium. Our modeling work supports the notion that mitochondrial dysfunction can account for salient features of the paralytic disorder amyotrophic lateral sclerosis, including motor neuron hyperexcitability, fasciculation, and differential vulnerability of motor neuron subpopulations. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Computational models of intergroup competition and warfare.

    Energy Technology Data Exchange (ETDEWEB)

    Letendre, Kenneth (University of New Mexico); Abbott, Robert G.

    2011-11-01

    This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease importantly predicts the frequency of civil conflict and tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.

  2. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  3. Development and preliminary analyses of material balance evaluation model in nuclear fuel cycle

    International Nuclear Information System (INIS)

    Matsumura, Tetsuo

    1994-01-01

    A material balance evaluation model for the nuclear fuel cycle has been developed using the ORIGEN-2 code as its basic engine. The model can treat more than 1000 nuclides, including minor actinides and fission products, and offers flexible modelling and graph output on an engineering workstation. A preliminary calculation was made of the effect of high LWR fuel burnup (reloading fuel average burnup of 60 GWd/t) on the nuclear fuel cycle. The preliminary calculation shows that high LWR fuel burnup has a large effect on the Japanese Pu balance problem. (author)

  4. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method further comprises determining a first connection element of the first construction element and a second connection element of the second construction element located in a predetermined proximity of each other; and retrieving connectivity information of the corresponding connection types of the first...
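
    The data structures described in this record can be sketched directly. In the Python sketch below, the class and field names, the connection types, and the compatibility table are all invented placeholders for the record's abstract description of typed connection elements matched by proximity.

        from dataclasses import dataclass
        from itertools import product
        from math import dist

        @dataclass(frozen=True)
        class Connector:
            position: tuple      # (x, y, z) in a common world frame
            ctype: str           # predetermined connection type, e.g. "stud"/"tube"

        @dataclass
        class Element:
            name: str
            connectors: list

        # Connectivity information: which connection types may join.
        COMPATIBLE = {("stud", "tube"), ("tube", "stud")}

        def find_connections(a: Element, b: Element, proximity=0.5):
            """Connector pairs within the proximity threshold with compatible types."""
            return [(ca, cb)
                    for ca, cb in product(a.connectors, b.connectors)
                    if dist(ca.position, cb.position) <= proximity
                    and (ca.ctype, cb.ctype) in COMPATIBLE]

        brick = Element("brick", [Connector((0, 0, 1.0), "stud")])
        plate = Element("plate", [Connector((0, 0, 1.2), "tube")])
        print(find_connections(brick, plate))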

  5. Direct modeling for computational fluid dynamics

    Science.gov (United States)

    Xu, Kun

    2015-06-01

    All fluid dynamic equations are valid under their modeling scales, such as the particle mean free path and mean collision time scale of the Boltzmann equation and the hydrodynamic scale of the Navier-Stokes (NS) equations. The current computational fluid dynamics (CFD) focuses on the numerical solution of partial differential equations (PDEs), and its aim is to get the accurate solution of these governing equations. Under such a CFD practice, it is hard to develop a unified scheme that covers flow physics from kinetic to hydrodynamic scales continuously because there is no such governing equation which could make a smooth transition from the Boltzmann to the NS modeling. The study of fluid dynamics needs to go beyond the traditional numerical partial differential equations. The emerging engineering applications, such as air-vehicle design for near-space flight and flow and heat transfer in micro-devices, do require further expansion of the concept of gas dynamics to a larger domain of physical reality, rather than the traditional distinguishable governing equations. At the current stage, the non-equilibrium flow physics has not yet been well explored or clearly understood due to the lack of appropriate tools. Unfortunately, under the current numerical PDE approach, it is hard to develop such a meaningful tool due to the absence of valid PDEs. In order to construct multiscale and multiphysics simulation methods similar to the modeling process of constructing the Boltzmann or the NS governing equations, the development of a numerical algorithm should be based on the first principle of physical modeling. In this paper, instead of following the traditional numerical PDE path, we introduce direct modeling as a principle for CFD algorithm development. Since all computations are conducted in a discretized space with limited cell resolution, the flow physics to be modeled has to be done in the mesh size and time step scales. Here, the CFD is more or less a direct

  6. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  7. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no Agency has produced directed guidance on models that must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  8. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no Agency has produced directed guidance on models that must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study

  9. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer-based attacks. This model consists of four stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...
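
    The four stages named above map naturally onto a small data model. The sketch below (illustrative only, not from the paper) encodes the stage ordering and checks that a sequence of timestamped events is consistent with the model's temporal ordering; the event data is invented.

        from enum import Enum

        class Stage(Enum):           # stages named in the temporal attack model
            TARGET_IDENTIFICATION = 1
            RECONNAISSANCE = 2
            ATTACK = 3
            POST_ATTACK_RECONNAISSANCE = 4

        # Hypothetical observed events: (hours since first activity, stage).
        events = [(0, Stage.TARGET_IDENTIFICATION), (5, Stage.RECONNAISSANCE),
                  (30, Stage.ATTACK), (31, Stage.ATTACK),
                  (40, Stage.POST_ATTACK_RECONNAISSANCE)]

        def is_temporally_consistent(evts):
            """Under the model, stages must be non-decreasing over time."""
            ordered = sorted(evts, key=lambda e: e[0])
            return all(a[1].value <= b[1].value
                       for a, b in zip(ordered, ordered[1:]))

        print(is_temporally_consistent(events))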

  10. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  11. Agent-Based Computational Modeling of Cell Culture ...

    Science.gov (United States)

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software CompuCell3D (CC3D) to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assumed a "fried egg" shape but became increasingly cuboidal with increasing confluency. The surface area presented by each cell to the overlying medium varies from cell to cell and is a determinant of the diffusional flux of toxicant from the medium into the cell. Thus, dose varies among cells for a given concentration of toxicant in the medium. Computer code describing diffusion of H2O2 from medium into each cell and clearance of H2O2 was calibrated against H2O2 time-course data (25, 50, or 75 µM H2O2 for 60 min) obtained with the Amplex Red assay for the medium and the H2O2-sensitive fluorescent reporter, HyPer, for cytosol. Cellular H2O2 concentrations peaked at about 5 min and were near baseline by 10 min. The model predicted a skewed distribution of surface areas, with between-cell variation usually 2-fold or less. Predicted variability in cellular dose was in rough agreement with the variation in the HyPer data. These results are preliminary, as the model was not calibrated to the morphology of a specific cell type. Future work will involve morphology model calibration against human bronchial epithelial (BEAS-2B) cells. Our results show, however, the potential of agent-based modeling

  12. MININR: a geochemical computer program for inclusion in water flow models - an application study

    Energy Technology Data Exchange (ETDEWEB)

    Felmy, A.R.; Reisenauer, A.E.; Zachara, J.M.; Gee, G.W.

    1984-02-01

    MININR is a reduced form of the computer program MINTEQ which calculates equilibrium precipitation/dissolution of solid phases, aqueous speciation, adsorption, and gas phase equilibrium. The user-oriented features in MINTEQ were removed to reduce the size and increase the computational speed. MININR closely resembles the MINEQL computer program developed by Westall (1976). The main differences between MININR and MINEQL involve modifications to accept an initial starting mass of solid and necessary changes for linking with a water flow model. MININR in combination with a simple water flow model which considers only dilution was applied to a laboratory column packed with retorted oil shale and percolated with distilled water. Experimental and preliminary model simulation results are presented for the constituents K⁺, Na⁺, SO₄²⁻, Mg²⁺, Ca²⁺, CO₃²⁻ and pH.
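
    The coupling described above, a dilution-only flow model feeding a chemistry step, can be caricatured in a few lines. The sketch below is a toy single mixing cell with a first-order dissolution step; MININR itself solves full equilibrium speciation, and all values here are hypothetical.

        # Toy mixing-cell coupling: each step, inflow dilutes the cell, then a
        # crude "chemistry" step dissolves solid toward a solubility limit.
        V, Q, dt = 1.0, 0.05, 1.0           # cell volume (L), flow (L/step), step
        c, solid = 0.0, 5.0                 # dissolved (mmol/L) and solid (mmol)
        c_sat, k = 2.0, 0.3                 # solubility limit, dissolution rate

        for step in range(50):
            c *= (1.0 - Q * dt / V)                               # dilution by inflow
            dissolved = min(solid, k * dt * max(c_sat - c, 0.0))  # chemistry step
            solid -= dissolved
            c += dissolved / V
        print(f"after 50 steps: c = {c:.2f} mmol/L, solid left = {solid:.2f} mmol")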

  13. Integrated multiscale modeling of molecular computing devices

    International Nuclear Information System (INIS)

    Cummings, Peter T; Leng Yongsheng

    2005-01-01

    Molecular electronics, in which single organic molecules are designed to perform the functions of transistors, diodes, switches and other circuit elements used in current silicon-based microelectronics, is drawing wide interest as a potential replacement technology for conventional silicon-based lithographically etched microelectronic devices. In addition to their nanoscopic scale, the additional advantage of molecular electronics devices compared to silicon-based lithographically etched devices is the promise of being able to produce them cheaply on an industrial scale using wet chemistry methods (i.e., self-assembly from solution). The design of molecular electronics devices, and the processes to make them on an industrial scale, will require a thorough theoretical understanding of the molecular and higher level processes involved. Hence, the development of modeling techniques for molecular electronics devices is a high priority from both a basic science point of view (to understand the experimental studies in this field) and from an applied nanotechnology (manufacturing) point of view. Modeling molecular electronics devices requires computational methods at all length scales: electronic structure methods for calculating electron transport through organic molecules bonded to inorganic surfaces, molecular simulation methods for determining the structure of self-assembled films of organic molecules on inorganic surfaces, mesoscale methods to understand and predict the formation of mesoscale patterns on surfaces (including interconnect architecture), and macroscopic scale methods (including finite element methods) for simulating the behavior of molecular electronic circuit elements in a larger integrated device. Here we describe a large Department of Energy project involving six universities and one national laboratory aimed at developing integrated multiscale methods for modeling molecular electronics devices. The project is funded equally by the Office of Basic

  14. Computational modeling of intraocular gas dynamics

    International Nuclear Information System (INIS)

    Noohi, P; Abdekhodaie, M J; Cheng, Y L

    2015-01-01

    The purpose of this study was to develop a computational model to simulate the dynamics of intraocular gas behavior in the pneumatic retinopexy (PR) procedure. The presented model predicted intraocular gas volume at any time and determined the tolerance angle within which a patient can maneuver while the gas still completely covers the tear(s). Computational fluid dynamics calculations were conducted to describe the PR procedure. The geometrical model was constructed based on the rabbit and human eye dimensions. SF₆, both pure and diluted with air, was considered as the injected gas. The presented results indicated that the composition of the injected gas affected the gas absorption rate and gas volume. After injection of pure SF₆, the bubble expanded to 2.3 times its initial volume during the first 23 h, but when diluted SF₆ was used, no significant expansion was observed. Also, head positioning for the treatment of the retinal tear influenced the rate of gas absorption. Moreover, the determined tolerance angle depended on the bubble and tear size. Greater bubble expansion and a smaller retinal tear gave a greater tolerance angle. For example, after 23 h, for a tear size of 2 mm the tolerance angle of using pure SF₆ is 1.4 times larger than that of using SF₆ diluted with 80% air. The composition of the injected gas and the condition of the tear in PR may dramatically affect the gas absorption rate and gas volume. Quantifying these effects helps to predict the tolerance angle and improve treatment efficiency. (paper)

  15. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    Science.gov (United States)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  16. Parallel Computing for Terrestrial Ecosystem Carbon Modeling

    International Nuclear Information System (INIS)

    Wang, Dali; Post, Wilfred M.; Ricciuto, Daniel M.; Berry, Michael

    2011-01-01

    Terrestrial ecosystems are a primary component of research on global environmental change. Observational and modeling research on terrestrial ecosystems at the global scale, however, has lagged behind its counterparts for oceanic and atmospheric systems, largely because of the unique challenges associated with the tremendous diversity and complexity of terrestrial ecosystems. There are 8 major types of terrestrial ecosystem: tropical rain forest, savannas, deserts, temperate grassland, deciduous forest, coniferous forest, tundra, and chaparral. The carbon cycle is an important mechanism in the coupling of terrestrial ecosystems with climate through biological fluxes of CO₂. The influence of terrestrial ecosystems on atmospheric CO₂ can be modeled via several means at different timescales. Important processes include plant dynamics, change in land use, as well as ecosystem biogeography. Over the past several decades, many terrestrial ecosystem carbon models (TECMs; see the 'Model developments' section) have been developed to understand the interactions between terrestrial carbon storage and CO₂ concentration in the atmosphere, as well as the consequences of these interactions. Early TECMs generally adopted simple box-flow exchange models, in which photosynthetic CO₂ uptake and respiratory CO₂ release are simulated in an empirical manner with a small number of vegetation and soil carbon pools. Demands on the kinds and amount of information required from global TECMs have grown. Recently, along with the rapid development of parallel computing, spatially explicit TECMs with detailed process-based representations of carbon dynamics have become attractive, because those models can readily incorporate a variety of additional ecosystem processes (such as dispersal, establishment, growth, mortality etc.) and environmental factors (such as landscape position, pest populations, disturbances, resource manipulations, etc.), and provide information to frame policy options for climate change

  17. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment in a given situation, i.e. the results of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability. Unfortunately, few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, incorporating significant cognitive factors, uses a Bayesian belief network (BBN) as its model architecture. It is believed that communication between nuclear power plant operators affects their situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior; this paper shows the details.
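
    The BBN machinery can be illustrated with a single Bayes update. The sketch below (not the paper's network) combines two pieces of evidence, an alarm indication and a co-operator's verbal report, under a naive conditional-independence assumption; the states and probability tables are invented.

        import numpy as np

        states = ["normal", "sgtr", "loca"]          # candidate plant states
        prior = np.array([0.90, 0.05, 0.05])

        # P(observation | state): hypothetical conditional probability tables.
        p_alarm = np.array([0.05, 0.90, 0.80])       # radiation alarm seen
        p_report = np.array([0.10, 0.85, 0.30])      # co-operator reports SG level drop

        # Naive-Bayes combination (assumes the two observations are independent
        # given the plant state), then normalization.
        posterior = prior * p_alarm * p_report
        posterior /= posterior.sum()
        for s, p in zip(states, posterior):
            print(f"P({s} | alarm, report) = {p:.3f}")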

  18. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. Studying computer modeling is necessary because current trends toward strengthening the general-education and worldview functions of computer science call for additional research into the…

  19. How Well Can a Computer Program Teach German Culture? Some Preliminary Findings from EthnoDeutsch.

    Science.gov (United States)

    Ashby, Wendy; Ostertag, Veronica

    2002-01-01

    Investigates the effectiveness of an interactive, computer-mediated instructional segment designed to educate students about ethnicity in German-speaking countries. Fifty-two intermediate German students worked with computer-mediated segments and rated the segments' effectiveness on a Likert-scale questionnaire. (AS)

  20. Model to Implement Virtual Computing Labs via Cloud Computing Services

    OpenAIRE

    Washington Luna Encalada; José Luis Castillo Sequera

    2017-01-01

    In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the...

  1. Modelled and Observed Diurnal SST Signals: "SSTDV:R.EX.-IM.A.M." Project Preliminary Results

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob; LeBorgne, Pierre

    2013-01-01

    This study presents some of the preliminary results from the ESA Support To Science Element (STSE) funded project on the Diurnal Variability of the Sea Surface Temperature, regarding its Regional Extent and Implications in Atmospheric Modelling (SSTDV:R.EX.–IM.A.M.). During this phase of the project...

  2. Computer modelling of eddy current probes

    International Nuclear Information System (INIS)

    Sullivan, S.P.

    1992-01-01

    Computer programs have been developed for modelling impedance and transmit-receive eddy current probes in two-dimensional axis-symmetric configurations. These programs, which are based on analytic equations, simulate bobbin probes in infinitely long tubes and surface probes on plates. They calculate probe signal due to uniform variations in conductor thickness, resistivity and permeability. These signals depend on probe design and frequency. A finite element numerical program has been procured to calculate magnetic permeability in non-linear ferromagnetic materials. Permeability values from these calculations can be incorporated into the above analytic programs to predict signals from eddy current probes with permanent magnets in ferromagnetic tubes. These programs were used to test various probe designs for new testing applications. Measurements of magnetic permeability in magnetically biased ferromagnetic materials have been performed by superimposing experimental signals, from special laboratory ET probes, on impedance plane diagrams calculated using these programs. (author). 3 refs., 2 figs

  3. The MESORAD dose assessment model: Computer code

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Bander, T.J.; Scherpelz, R.I.

    1988-10-01

    MESORAD is a dose equivalent model for emergency response applications that is designed to be run on minicomputers. It has been developed by the Pacific Northwest Laboratory for use as part of the Intermediate Dose Assessment System in the US Nuclear Regulatory Commission Operations Center in Washington, DC, and the Emergency Management System in the US Department of Energy Unified Dose Assessment Center in Richland, Washington. This volume describes the MESORAD computer code and contains a listing of the code. The technical basis for MESORAD is described in the first volume of this report (Scherpelz et al. 1986). A third volume of the documentation is planned. That volume will contain utility programs and input and output files that can be used to check the implementation of MESORAD. 18 figs., 4 tabs

  4. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  5. Modeling Techniques for a Computational Efficient Dynamic Turbofan Engine Model

    Directory of Open Access Journals (Sweden)

    Rory A. Roberts

    2014-01-01

    A transient two-stream engine model has been developed. Individual component models developed exclusively in MATLAB/Simulink, including the fan, high pressure compressor, combustor, high pressure turbine, low pressure turbine, plenum volumes, and exit nozzle, have been combined to investigate the behavior of a turbofan two-stream engine. Special attention has been paid to developing transient capabilities throughout the model, increasing the physical fidelity of the model, eliminating algebraic constraints, and reducing simulation time through enabling the use of advanced numerical solvers. The lessening of computation time is paramount for conducting future aircraft system-level design trade studies and optimization. The new engine model is simulated for a fuel perturbation and a specified mission while tracking critical parameters. These results, as well as the simulation times, are presented. The new approach significantly reduces the simulation time.
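
    The role of the plenum volumes mentioned above is to replace the algebraic constraint that component mass flows match with a pressure state that integrates the imbalance. A minimal sketch of that standard device follows; the flow law and all values are hypothetical, not taken from the engine model.

        # Plenum volume between components: pressure integrates the mass-flow
        # imbalance, removing the algebraic constraint m_in == m_out.
        # dP/dt = (gamma * R * T / V) * (m_in - m_out); values hypothetical.
        gamma, R, T, V = 1.4, 287.0, 500.0, 0.05     # SI units
        P = 300e3                                     # initial plenum pressure, Pa
        dt = 1e-4                                     # explicit Euler time step, s

        def m_out(P):                                 # toy downstream flow law
            return 2.0e-6 * (P - 101e3)

        for _ in range(10000):                        # 1 s of simulated time
            m_in = 0.40                               # upstream mass flow, kg/s
            P += dt * gamma * R * T / V * (m_in - m_out(P))
        print(f"plenum pressure after 1 s: {P/1e3:.1f} kPa")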

  6. Preliminary corrosion models for BWIP [Basalt Waste Isolation Project] canister materials

    International Nuclear Information System (INIS)

    Fish, R.L.; Anantatmula, R.P.

    1983-01-01

    Waste package development for the Basalt Waste Isolation Project (BWIP) requires the generation of materials degradation data under repository-relevant conditions. These data are used to develop predictive models for the behavior of each component of the waste package. The component models are exercised in performance analyses to optimize the waste package design. This document presents all repository-relevant canister materials corrosion data that the BWIP and others have developed to date, describes the methodology used to develop preliminary corrosion models, and provides the mathematical description of the models for both low carbon steel and Fe9Cr1Mo steel. Example environment/temperature history and model application calculations are presented to aid in understanding the models. The models are preliminary in nature and will be updated as additional corrosion data become available. 6 refs., 5 tabs.
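
    A corrosion model of this kind is typically an attack rate integrated over an environment/temperature history. The sketch below shows the generic pattern with an Arrhenius-type rate; the constants and the history are invented and are not the BWIP models.

        import math

        # Generic corrosion-depth sketch: an Arrhenius rate integrated over a
        # piecewise-constant temperature history (all constants hypothetical).
        A = 5.0e3            # pre-exponential factor, um/year
        Q = 40e3             # activation energy, J/mol
        Rgas = 8.314         # gas constant, J/(mol K)

        # (temperature in K, duration in years) segments of the history
        history = [(150 + 273.15, 10), (90 + 273.15, 90), (60 + 273.15, 900)]

        depth = 0.0
        for T, years in history:
            depth += A * math.exp(-Q / (Rgas * T)) * years
        print(f"estimated corrosion depth: {depth:.1f} um")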

  7. Challenge for knowledge information processing systems (preliminary report on Fifth Generation Computer Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Moto-oka, T

    1982-01-01

    The author explains the reasons, aims and strategies for the Fifth Generation Computer Project in Japan. The project aims to introduce a radical new breed of computer by 1990. This article outlines the economic and social reasons for the project. It describes the impacts and effects that these computers are expected to have. The areas of technology which will form the contents of the research and development are highlighted. These are areas such as VLSI technology, speech and image understanding systems, artificial intelligence and advanced architecture design. Finally a schedule for completion of research is given which aims for a completed project by 1990.

  8. TEVA-SPOT-GUI - Containing Preliminary Flow Model

    Data.gov (United States)

    U.S. Environmental Protection Agency — This ZIP file contains the developmental, test version of TEVA-SPOT-GUI's Flow Model. The Flow Model is a new, event based water quality algorithm for EPANET. The...

  9. Preliminary Uncorrelated Encounter Model of the National Airspace System

    National Research Council Canada - National Science Library

    Kochenderfer, M. J; Kuchar, J. K; Espindle, L. P; Gertz, J. L

    2008-01-01

    ...) and which may not be in contact with air traffic control. In response to the need to develop a model of these types of encounters, Lincoln Laboratory undertook an extensive radar data collection and modeling effort...

  10. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion.

    Science.gov (United States)

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao; Lu, Jun-Ying; Zeng, Yan-Hong; Meng, Fan-Jie; Cao, Bin; Zi, Xue-Rong; Han, Shu-Ming; Zhang, Yu-Huan

    2013-09-01

    Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precisely measuring the volume of a pleural effusion still involves many challenges, and there is currently no recognized, accurate measurement method. To explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and to then analyze the correlation between the volume of the free pleural effusion and the different diameters of the pleural effusion. The 64-slice CT volume-rendering technique was used to measure and analyze three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extract. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameter of the effusion, which was then used to calculate the regression equation. After using the 64-slice CT volume-rendering technique to measure the fluid volume of the self-made thoracic model, the results were compared with the actual injection volume. No significant differences were found, P = 0.836. For the 25 patients with drained pleural effusions, the comparison of the reduction volume with the actual volume of the liquid extract revealed no significant differences, P = 0.989. The following linear regression equation was used to compare the pleural effusion volume (V) (measured by the CT volume-rendering technique) with the pleural effusion greatest depth (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). The following linear regression was used to compare the volume with the product of the pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0.000). The 64-slice CT volume-rendering technique can
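
    The two regression equations reported in this abstract can be wrapped directly as functions. Units are not stated in the excerpt (depths presumably in cm and volumes in mL), so treat the example values as illustrative.

        # The two regressions reported above; d = greatest effusion depth,
        # l and h = the other two measured diameters (units as in the paper).
        def volume_from_depth(d):
            return 158.16 * d - 116.01            # r = 0.91

        def volume_from_diameters(l, h, d):
            return 0.56 * (l * h * d) + 39.44     # r = 0.92

        print(volume_from_depth(3.0), volume_from_diameters(10.0, 8.0, 3.0))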

  11. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion

    International Nuclear Information System (INIS)

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao

    2013-01-01

    Background: Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precisely measuring the volume of a pleural effusion still involves many challenges, and there is currently no recognized, accurate measurement method. Purpose: To explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and to then analyze the correlation between the volume of the free pleural effusion and the different diameters of the pleural effusion. Material and Methods: The 64-slice CT volume-rendering technique was used to measure and analyze three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extract. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameter of the effusion, which was then used to calculate the regression equation. Results: After using the 64-slice CT volume-rendering technique to measure the fluid volume of the self-made thoracic model, the results were compared with the actual injection volume. No significant differences were found, P = 0.836. For the 25 patients with drained pleural effusions, the comparison of the reduction volume with the actual volume of the liquid extract revealed no significant differences, P = 0.989. The following linear regression equation was used to compare the pleural effusion volume (V) (measured by the CT volume-rendering technique) with the pleural effusion greatest depth (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). The following linear regression was used to compare the volume with the product of the pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0

  12. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Zhi-Jun [Dept. of Radiology, North China Petroleum Bureau General Hospital, Renqiu, Hebei (China)], e-mail: Gzj3@163.com; Lin, Qiang [Dept. of Oncology, North China Petroleum Bureau General Hospital, Renqiu, Hebei (China); Liu, Hai-Tao [Dept. of General Surgery, North China Petroleum Bureau General Hospital, Renqiu, Hebei (China)] [and others]

    2013-09-15

    Background: Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precisely measuring the volume of a pleural effusion still involves many challenges, and there is currently no recognized, accurate measurement method. Purpose: To explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and to then analyze the correlation between the volume of the free pleural effusion and the different diameters of the pleural effusion. Material and Methods: The 64-slice CT volume-rendering technique was used to measure and analyze three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extract. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameter of the effusion, which was then used to calculate the regression equation. Results: After using the 64-slice CT volume-rendering technique to measure the fluid volume of the self-made thoracic model, the results were compared with the actual injection volume. No significant differences were found, P = 0.836. For the 25 patients with drained pleural effusions, the comparison of the reduction volume with the actual volume of the liquid extract revealed no significant differences, P = 0.989. The following linear regression equation was used to compare the pleural effusion volume (V) (measured by the CT volume-rendering technique) with the pleural effusion greatest depth (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). The following linear regression was used to compare the volume with the product of the pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0

  13. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using a blackboard architecture and a qualitative model, an expert system was developed to assist the user in defining computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer, or compatible, with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs

  14. COGMIR: A computer model for knowledge integration

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z.X.

    1988-01-01

    This dissertation explores some aspects of knowledge integration, namely, the accumulation of scientific knowledge and the performance of analogical reasoning on the acquired knowledge. Knowledge to be integrated is conveyed by paragraph-like pieces referred to as documents. By incorporating some results from cognitive science, the Deutsch-Kraft model of information retrieval is extended to a model for knowledge engineering, which integrates acquired knowledge and performs intelligent retrieval. The resulting computer model is termed COGMIR, which stands for a COGnitive Model for Intelligent Retrieval. A scheme named query invoked memory reorganization is used in COGMIR for knowledge integration. Unlike some other schemes, which realize knowledge integration through subjective understanding by representing new knowledge in terms of existing knowledge, the proposed scheme records at storage time only the possible connections among knowledge acquired from different documents. The actual binding of the knowledge acquired from different documents is deferred to query time. There is only one way to store knowledge and numerous ways to utilize it. Each document can be represented both as a whole and in terms of its meaning. In addition, since facts are constructed from the documents, document retrieval and fact retrieval are treated in a unified way. When the requested knowledge is not available, query invoked memory reorganization can generate suggestions based on available knowledge through analogical reasoning. This is done by revising the algorithms developed for document retrieval and fact retrieval, and by incorporating Gentner's structure mapping theory. Analogical reasoning is treated as a natural extension of intelligent retrieval, so that two previously separate research areas are combined. A case study is provided. All the components are implemented as list structures similar to relational databases.
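
    The deferred-binding idea above (record only candidate connections at storage time, bind knowledge across documents at query time) can be made concrete with a toy index. Everything here, including the class and method names, is our hypothetical illustration, not COGMIR's actual implementation.

    # Toy sketch of "query invoked memory reorganization": storage records
    # candidate links only; binding across documents happens at query time.

    from collections import defaultdict

    class KnowledgeStore:
        def __init__(self):
            self.documents = {}                  # doc_id -> text
            self.term_index = defaultdict(set)   # term -> doc_ids (candidate links)

        def store(self, doc_id, text):
            """Storage time: keep the document and its possible connections only."""
            self.documents[doc_id] = text
            for term in set(text.lower().split()):
                self.term_index[term].add(doc_id)

        def query(self, *terms):
            """Query time: bind knowledge across documents sharing the terms."""
            hits = [self.term_index[t.lower()] for t in terms]
            bound = set.intersection(*hits) if hits else set()
            return [self.documents[d] for d in sorted(bound)]

    store = KnowledgeStore()
    store.store("d1", "electrons orbit the nucleus")
    store.store("d2", "planets orbit the sun")
    print(store.query("orbit"))   # the two documents are bound only now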

  15. Preliminary Evaluation of the Computer-Based Tactics Certification Course--Principles of War Module

    National Research Council Canada - National Science Library

    Pleban, Robert

    1997-01-01

    This report describes a portion of the U.S. Army Research Institute for the Behavioral and Social Sciences Infantry Forces Research Unit's work in the formative evaluation of the computer-based Tactics Certification Course (TCC...

  16. QSAR models for anti-androgenic effect - a preliminary study

    DEFF Research Database (Denmark)

    Jensen, Gunde Egeskov; Nikolov, Nikolai Georgiev; Wedebye, Eva Bay

    2011-01-01

    Three modelling systems (MultiCase (R), LeadScope (R) and MDL (R) QSAR) were used for construction of androgenic receptor antagonist models. There were 923-942 chemicals in the training sets. The models were cross-validated (leave-groups-out) with concordances of 77-81%, specificity of 78...... of the model for a particular application, balance of training sets, domain definition, and cut-offs for prediction interpretation should also be taken into account. Different descriptors in the modelling systems are illustrated with hydroxyflutamide and dexamethasone as examples (a non-steroid and a steroid...

  17. Development of Graphical Solution for Computer-Assisted Fault Diagnosis: Preliminary Study

    International Nuclear Information System (INIS)

    Yoon, Han Bean; Yun, Seung Man; Han, Jong Chul

    2009-01-01

    We have developed software for converting the volumetric voxel data obtained from X-ray computed tomography (CT) into computer-aided design (CAD) data. The developed software can be used for non-destructive testing and evaluation, reverse engineering, rapid prototyping, and other applications. The main algorithms employed in the software are image reconstruction, volume rendering, segmentation, and mesh data generation. The feasibility of the developed software is demonstrated with the CT data of human maxilla and mandible bones

  18. The use of conduction model in laser weld profile computation

    Science.gov (United States)

    Grabas, Bogusław

    2007-02-01

    Profiles of joints resulting from deep penetration laser beam welding of a flat workpiece of carbon steel were computed. A semi-analytical conduction model solved with the Green's function method was used in the computations. In the model, the moving heat source was attenuated exponentially in accordance with the Beer-Lambert law. Computational results were compared with experimental results.
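
    For flavour, the sketch below evaluates the classical Rosenthal moving point-source conduction solution for a semi-infinite solid. The cited model is richer (a Green's-function solution with a Beer-Lambert attenuated source), so this is only a simplified stand-in, and all parameter values are invented.

    # Quasi-steady temperature field around a moving point heat source
    # (Rosenthal solution); a simplified illustration, not the paper's model.

    import math

    def rosenthal_temperature(x, y, z, q=2000.0, v=0.01, k=45.0,
                              alpha=1.2e-5, t0=293.0):
        """x, y, z: coordinates in the source-fixed frame [m], x along travel.
        q: absorbed power [W]; k: conductivity [W/m/K]; alpha: diffusivity
        [m^2/s]. Parameter values are illustrative, not from the study."""
        r = math.sqrt(x * x + y * y + z * z)
        if r == 0.0:
            return float("inf")   # singular at the source point
        return t0 + q / (2.0 * math.pi * k * r) * math.exp(
            -v * (r + x) / (2.0 * alpha))

    # Temperature 1 mm behind the source, on the surface:
    print(rosenthal_temperature(-1e-3, 0.0, 0.0))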

  19. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...

  20. Computational and Organotypic Modeling of Microcephaly ...

    Science.gov (United States)

    Microcephaly is associated with reduced cortical surface area and ventricular dilations. Many genetic and environmental factors precipitate this malformation, including prenatal alcohol exposure and maternal Zika infection. This complexity motivates the engineering of computational and experimental models to probe the underlying molecular targets, cellular consequences, and biological processes. We describe an Adverse Outcome Pathway (AOP) framework for microcephaly derived from literature on all gene, chemical, or viral effects on brain development. Overlap with NTDs is likely, although the AOP connections identified here focused on microcephaly as the adverse outcome. A query of the Mammalian Phenotype Browser database for 'microcephaly' (MP:0000433) returned 85 gene associations; several function in microtubule assembly and the centrosome cycle regulated by microcephalin (MCPH1), a gene for primary microcephaly in humans. The developing ventricular zone is the likely target. In this zone, neuroprogenitor cells (NPCs) self-replicate during the first trimester, setting brain size, followed by neural differentiation of the neocortex. Recent studies with human NPCs confirmed infectivity with Zika virions invoking critical cell loss (apoptosis) of precursor NPCs; similar findings have been shown with fetal alcohol or methylmercury exposure in rodent studies, leading to mathematical models of NPC dynamics in size determination of the ventricular zone. A key event

  1. Computer modeling of the Cabriolet Event

    International Nuclear Information System (INIS)

    Kamegai, M.

    1979-01-01

    Computer modeling techniques are described for calculating the results of underground nuclear explosions at depths shallow enough to produce cratering. The techniques are applied to the Cabriolet Event, a well-documented nuclear excavation experiment, and the calculations give good agreement with the experimental results. It is concluded that, given data obtainable by outside observers, these modeling techniques are capable of verifying the yield and depth of underground nuclear cratering explosions, and that they could thus be useful in monitoring another country's compliance with treaty agreements on nuclear testing limitations. Several important facts emerge from the study: (1) seismic energy is produced by only a fraction of the nuclear yield, a fraction depending strongly on the depth of shot and the mechanical properties of the surrounding rock; (2) temperature of the vented gas can be predicted accurately only if good equations of state are available for the rock in the detonation zone; and (3) temperature of the vented gas is strongly dependent on the cooling effect, before venting, of mixing with melted rock in the expanding cavity and, to a lesser extent, on the cooling effect of water in the rock

  2. Random matrix model of adiabatic quantum computing

    International Nuclear Information System (INIS)

    Mitchell, David R.; Adami, Christoph; Lue, Waynn; Williams, Colin P.

    2005-01-01

    We present an analysis of the quantum adiabatic algorithm for solving hard instances of 3-SAT (an NP-complete problem) in terms of random matrix theory (RMT). We determine the global regularity of the spectral fluctuations of the instantaneous Hamiltonians encountered during the interpolation between the starting Hamiltonians and the ones whose ground states encode the solutions to the computational problems of interest. At each interpolation point, we quantify the degree of regularity of the average spectral distribution via its Brody parameter, a measure that distinguishes regular (i.e., Poissonian) from chaotic (i.e., Wigner-type) distributions of normalized nearest-neighbor spacings. We find that for hard problem instances - i.e., those having a critical ratio of clauses to variables - the spectral fluctuations typically become irregular across a contiguous region of the interpolation parameter, while the spectrum is regular for easy instances. Within the hard region, RMT may be applied to obtain a mathematical model of the probability of avoided level crossings and concomitant failure rate of the adiabatic algorithm due to nonadiabatic Landau-Zener-type transitions. Our model predicts that if the interpolation is performed at a uniform rate, the average failure rate of the quantum adiabatic algorithm, when averaged over hard problem instances, scales exponentially with increasing problem size
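
    The Brody parameter mentioned above interpolates between Poissonian (regular) and Wigner-type (chaotic) nearest-neighbor spacing statistics. Below is a minimal sketch of the standard normalized Brody distribution; it is illustrative, not the authors' code.

    # Brody distribution P(s) = (beta+1) * b * s**beta * exp(-b * s**(beta+1)),
    # with b = Gamma((beta+2)/(beta+1)) ** (beta+1), for normalized spacings s.

    import math

    def brody_pdf(s, beta):
        """beta = 0 recovers Poisson exp(-s); beta = 1 the Wigner surmise."""
        b = math.gamma((beta + 2.0) / (beta + 1.0)) ** (beta + 1.0)
        return (beta + 1.0) * b * s ** beta * math.exp(-b * s ** (beta + 1.0))

    print(brody_pdf(1.0, 0.0))  # ~0.368 = exp(-1), the Poisson value
    print(brody_pdf(1.0, 1.0))  # ~0.716 = (pi/2) * exp(-pi/4), the Wigner value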

  3. Computational Modeling of Biological Systems From Molecules to Pathways

    CERN Document Server

    2012-01-01

    Computational modeling is emerging as a powerful new approach for studying and manipulating biological systems. Many diverse methods have been developed to model, visualize, and rationally alter these systems at various length scales, from atomic resolution to the level of cellular pathways. Processes taking place at larger time and length scales, such as molecular evolution, have also greatly benefited from new breeds of computational approaches. Computational Modeling of Biological Systems: From Molecules to Pathways provides an overview of established computational methods for the modeling of biologically and medically relevant systems. It is suitable for researchers and professionals working in the fields of biophysics, computational biology, systems biology, and molecular medicine.

  4. Environment modelling in near Earth space: Preliminary LDEF results

    Science.gov (United States)

    Coombs, C. R.; Atkinson, D. R.; Wagner, J. D.; Crowell, L. B.; Allbrooks, M.; Watts, A. J.

    1992-01-01

    Hypervelocity impacts by space debris cause not only local cratering or penetrations but also large areas of damage in coated, painted, or laminated surfaces. Features examined in these analyses display interesting morphological characteristics, commonly exhibiting a concentric ringed appearance. Virtually all features greater than 0.2 mm in diameter possess a spall zone in which all of the paint was removed from the aluminum surface. These spall zones vary in size from approximately 2-5 crater diameters. The actual craters in the aluminum substrate vary from central pits without raised rims to, for the larger features, morphologies more typical of craters formed in aluminum under hypervelocity laboratory conditions. Most features also possess what is referred to as a 'shock zone'. These zones vary in size from approximately 1-20 crater diameters. In most cases, only the outer-most layer of paint was affected by this impact-related phenomenon. Several impacts possess ridge-like structures encircling the area in which this outer-most paint layer was removed. In many ways, such features resemble the lunar impact basins, but on an extremely reduced scale. Overall, there were no noticeable penetrations, bulges, or spallation features on the backside of the tray. On Row 12, approximately 85 degrees from the leading edge (RAM direction), there was approximately one impact per 15 cm². On the trailing edge, there was approximately one impact per 72 cm². Currently, craters on four aluminum experiment trays from Bay E09, directly on the leading edge, are being measured and analyzed. Preliminary results have produced more than 2200 craters on approximately 1500 cm², or approximately one impact per 0.7 cm².

  5. Engineering a thalamo-cortico-thalamic circuit on SpiNNaker: a preliminary study towards modelling sleep and wakefulness

    Directory of Open Access Journals (Sweden)

    Basabdatta Sen Bhattacharya

    2014-05-01

    Full Text Available We present a preliminary study of a thalamo-cortico-thalamic (TCT) implementation on SpiNNaker (Spiking Neural Network architecture), a brain-inspired hardware platform designed to incorporate the inherent biological properties of parallelism, fault tolerance and energy efficiency. These attributes make SpiNNaker an ideal platform for simulating biologically plausible computational models. Our focus in this work is to design a TCT framework that can be simulated on SpiNNaker to mimic dynamical behaviour similar to Electroencephalogram (EEG) time and power-spectra signatures in the sleep-wake transition. The scale of the model is minimised for simplicity in this proof-of-concept study; thus the total number of spiking neurons is approximately 1000 and represents a 'mini-column' of the thalamocortical tissue. All data on model structure, synaptic layout and parameters are inspired by previous studies and abstracted at a level that is appropriate to the aims of the current study as well as computationally suitable for model simulation on a small 4-chip SpiNNaker system. The initial results from selective deletion of synaptic connectivity parameters in the model show similarity with EEG time series characteristics of sleep and wakefulness. These observations provide a positive perspective and a basis for future implementation of a very large scale biologically plausible model of thalamo-cortico-thalamic interactivity---the essential brain circuit that regulates the biological sleep-wake cycle and associated EEG rhythms.

  6. Preliminary validation of a Monte Carlo model for IMRT fields

    International Nuclear Information System (INIS)

    Wright, Tracy; Lye, Jessica; Mohammadi, Mohammad

    2011-01-01

    Full text: A Monte Carlo model of an Elekta linac, validated for medium to large (10-30 cm) symmetric fields, has been investigated for small, irregular and asymmetric fields suitable for IMRT treatments. The model has been validated with field segments using radiochromic film in solid water. The modelled positions of the multileaf collimator (MLC) leaves have been validated using EBT film. In the model, electrons with a narrow energy spectrum are incident on the target, and all components of the linac head are included. The MLC is modelled using the EGSnrc MLCE component module. For the validation, a number of single complex IMRT segments with dimensions of approximately 1-8 cm were delivered to film in solid water (see Fig. 1). The same segments were modelled using EGSnrc by adjusting the MLC leaf positions in the model validated for 10 cm symmetric fields. Dose distributions along the centre of each MLC leaf as determined by both methods were compared. A picket fence test was also performed to confirm the MLC leaf positions. 95% of the points in the modelled dose distribution along the leaf axis agree with the film measurement to within 1%/1 mm for dose difference and distance to agreement. Areas of most deviation occur in the penumbra region. A system has been developed to calculate the MLC leaf positions in the model for any planned field size.
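
    The 1%/1 mm agreement criterion quoted above can be illustrated with a toy one-dimensional check: a measured point passes if the calculated profile matches within 1% dose at the same position, or within 1% at some position no more than 1 mm away. This is a simplification of a full gamma-type analysis; the profiles and function names below are invented.

    # Toy 1-D dose-difference / distance-to-agreement test, dd = 1%, dta = 1 mm.

    def point_passes(x_m, d_m, xs_c, ds_c, dd=0.01, dta=1.0):
        """True if measured point (x_m [mm], dose d_m) agrees with the
        calculated profile (xs_c, ds_c) under either criterion."""
        x_n, d_n = min(zip(xs_c, ds_c), key=lambda p: abs(p[0] - x_m))
        if abs(d_n - d_m) <= dd * max(d_m, 1e-9):      # dose-difference test
            return True
        return any(abs(x_c - x_m) <= dta and abs(d_c - d_m) <= dd * max(d_m, 1e-9)
                   for x_c, d_c in zip(xs_c, ds_c))    # distance-to-agreement test

    xs = [float(i) for i in range(11)]          # positions along a leaf, mm
    calc = [1.0] * 5 + [0.50] + [0.02] * 5      # calculated profile (penumbra at 5 mm)
    meas = [1.0] * 5 + [0.48] + [0.02] * 5      # film measurement, off in the penumbra

    passed = [point_passes(x, m, xs, calc) for x, m in zip(xs, meas)]
    print(f"{sum(passed)}/{len(passed)} points pass")   # the penumbra point fails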

  7. Reconstructing Holocene climate using a climate model: Model strategy and preliminary results

    Science.gov (United States)

    Haberkorn, K.; Blender, R.; Lunkeit, F.; Fraedrich, K.

    2009-04-01

    An Earth system model of intermediate complexity (Planet Simulator; PlaSim) is used to reconstruct Holocene climate based on proxy data. The Planet Simulator is a user-friendly general circulation model (GCM) suitable for palaeoclimate research. Its easy handling and modular structure allow for fast and problem-dependent simulations. The spectral model is based on the moist primitive equations conserving momentum, mass, energy and moisture. Besides the atmospheric part, a mixed-layer ocean with sea ice and a land surface with biosphere are included. The present-day climate of PlaSim, based on an AMIP II control run (T21/10L resolution), shows reasonable agreement with ERA-40 reanalysis data. Combining PlaSim with a socio-technological model (GLUES; DFG priority project INTERDYNAMIK) provides improved knowledge on the shift from hunting-gathering to agropastoral subsistence societies. This is achieved by a data assimilation approach, incorporating proxy time series into PlaSim to initialize palaeoclimate simulations during the Holocene. For this, the following strategy is applied: the sensitivities of the terrestrial PlaSim climate are determined with respect to sea surface temperature (SST) anomalies. Here, the focus is on the impact of regionally varying SST both in the tropics and in the Northern Hemisphere mid-latitudes. The inverse of these sensitivities is used to determine the SST conditions necessary for the nudging of land and coastal proxy climates. Preliminary results indicate the potential, the uncertainty and the limitations of the method.
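
    If the terrestrial response to regional SST anomalies is approximately linear, the inverse-sensitivity step described above reduces to a least-squares solve. The sketch below shows that step with an invented sensitivity matrix and invented proxy anomalies; it is our illustration of the strategy, not the authors' code.

    # Recover the SST forcing that best reproduces proxy climates, assuming
    # an approximately linear response T ~= S @ sst. Values are invented.

    import numpy as np

    S = np.array([[0.8, 0.1],    # sensitivity of 3 land/coastal proxy sites
                  [0.2, 0.6],    # to 2 regional SST anomalies (K per K)
                  [0.4, 0.3]])

    t_proxy = np.array([0.5, 0.3, 0.25])   # proxy-derived anomalies (K)

    # SST anomalies that best match the proxies, in the least-squares sense
    sst, residuals, rank, _ = np.linalg.lstsq(S, t_proxy, rcond=None)
    print("required SST anomalies (K):", sst)
    print("reproduced proxy anomalies:", S @ sst)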

  8. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel; Buse, Gerrit; Pfluger, Dirk

    2012-01-01

    of the input parameters. Such an exploration process is however not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as substitute

  9. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data is a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  10. Computer models for kinetic equations of magnetically confined plasmas

    International Nuclear Information System (INIS)

    Killeen, J.; Kerbel, G.D.; McCoy, M.G.; Mirin, A.A.; Horowitz, E.J.; Shumaker, D.E.

    1987-01-01

    This paper presents four working computer models developed by the computational physics group of the National Magnetic Fusion Energy Computer Center. All of the models employ a kinetic description of plasma species. Three of the models are collisional, i.e., they include the solution of the Fokker-Planck equation in velocity space. The fourth model is collisionless and treats the plasma ions by a fully three-dimensional particle-in-cell method

  11. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Bae, Jun Ho; Park, Joo Hwan

    2010-01-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, the bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although the recent progress of CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impractical to represent the detailed shape of the rod bundle in a numerical computation because of the large computing mesh and memory capacity required. Hence, previous studies conducted numerical computations for smooth channels without considering the spacers and bearing pads. However, it is well known that these components are an important factor in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to handle complex geometries such as a fuel rod bundle. Before applying the method to the 37-rod bundle problem, its validity and accuracy are tested by applying it to a simple geometry. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work

  12. A Preliminary Field Test of an Employee Work Passion Model

    Science.gov (United States)

    Zigarmi, Drea; Nimon, Kim; Houson, Dobie; Witt, David; Diehl, Jim

    2011-01-01

    Four dimensions of a process model for the formulation of employee work passion, derived from Zigarmi, Nimon, Houson, Witt, and Diehl (2009), were tested in a field setting. A total of 447 employees completed questionnaires that assessed the internal elements of the model in a corporate work environment. Data from the measurements of work affect,…

  13. A Preliminary Model of Insider Theft of Intellectual Property

    Science.gov (United States)

    2011-06-01

    insider IT sabotage [Moore 2008] [Cappelli 2006]. The primary personality model used in CWB research is the Five Factor Model ( FFM ). The FFM includes... FFM dimensions and CWBs, Salgado found 44 studies conducted between 1990 and 1999 that examine the relationship between the FFM dimensions and deviant

  14. Reproduction of the Yucca Mountain Project TSPA-LA Uncertainty and Sensitivity Analyses and Preliminary Upgrade of Models

    Energy Technology Data Exchange (ETDEWEB)

    Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis; Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis

    2016-09-01

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and knowledge capability for TSPA-type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped onto the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest Version 11.1. All the TSPA-LA uncertainty and sensitivity analysis modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling case output generated in FY15 based on GoldSim Version 9.60.300, documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  15. Editorial: Modelling and computational challenges in granular materials

    OpenAIRE

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss the current progress and latest advancements in the field of advanced numerical methods and modelling of granular materials. The focus will be on computational methods, improved algorithms and the m...

  16. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
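
    The fill-in-the-routine pattern described above can be reduced to a toy: the framework owns the (in reality parallel and distributed) simulation loop and calls back into user-supplied routine bodies. The names below are hypothetical and do not reflect Biocellion's actual C++ API; this only illustrates the calling convention.

    # Toy framework-owned loop with user-filled model routines (hypothetical names).

    class ModelRoutines:
        """Modelers subclass this and fill in the pre-defined hooks."""
        def init_cells(self):            raise NotImplementedError
        def update_cell(self, cell, dt): raise NotImplementedError

    class Framework:
        """Owns the time-stepping loop; parallelism is elided in this toy."""
        def __init__(self, routines): self.routines = routines
        def run(self, steps, dt=1.0):
            cells = self.routines.init_cells()
            for _ in range(steps):
                for cell in cells:            # distributed over ranks in reality
                    self.routines.update_cell(cell, dt)
            return cells

    class MyModel(ModelRoutines):
        def init_cells(self):            return [{"volume": 1.0} for _ in range(4)]
        def update_cell(self, cell, dt): cell["volume"] *= 1.0 + 0.1 * dt  # growth

    print(Framework(MyModel()).run(steps=10))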

  17. The Preliminary Study for Numerical Computation of 37 Rod Bundle in CANDU Reactor

    International Nuclear Information System (INIS)

    Jeon, Yu Mi; Park, Joo Hwan

    2010-09-01

    A typical CANDU 6 fuel bundle consists of 37 fuel rods supported by two endplates and separated by spacer pads at various locations. In addition, the bearing pads are brazed to each outer fuel rod with the aim of reducing the contact area between the fuel bundle and the pressure tube. Although the recent progress of CFD methods has provided opportunities for computing the thermal-hydraulic phenomena inside a fuel channel, it is still impractical to include the detailed shape of the rod bundle in numerical computations due to the computing mesh and memory capacity required. Hence, previous studies conducted numerical computations for smooth channels without considering spacers and bearing pads. However, it is well known that these components are an important factor in predicting the pressure drop and heat transfer rate in a channel. In this study, a new computational method is proposed to solve complex geometry such as a fuel rod bundle. Before applying the method to the 37-rod bundle problem, its validity and accuracy are tested by applying it to a simple geometry. The split channel method has been proposed with the aim of computing the fully shaped CANDU fuel channel with detailed components. The validity was tested by applying the method to the single channel problem. The average temperatures have similar values for the two methods considered, while the local temperature shows a slight difference due to the effect of conduction heat transfer in the solid region of a rod. Based on the present results, the calculation for the fully shaped 37-rod bundle is scheduled for future work

  18. Elements of matrix modeling and computing with Matlab

    CERN Document Server

    White, Robert E

    2006-01-01

    As discrete models and computing have become more common, there is a need to study matrix computation and numerical linear algebra. Encompassing a diverse mathematical core, Elements of Matrix Modeling and Computing with MATLAB examines a variety of applications and their modeling processes, showing you how to develop matrix models and solve algebraic systems. Emphasizing practical skills, it creates a bridge from problems with two and three variables to more realistic problems that have additional variables. Elements of Matrix Modeling and Computing with MATLAB focuses on seven basic applicat

  19. Vehicle - Bridge interaction, comparison of two computing models

    Science.gov (United States)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants. One computing model represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second represents the bridge as a lumped-mass model with 1 degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models. The results are mutually compared and quantitatively evaluated.
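
    The simpler of the two bridge models can be sketched directly: a single-degree-of-freedom oscillator forced by a constant vehicle load projected onto the first mode shape while the vehicle crosses the span. All parameter values below are invented for illustration and are not from the paper.

    # SDOF bridge model under a moving load; semi-implicit Euler integration.

    import math

    L, v, P = 30.0, 20.0, 1.0e5         # span [m], speed [m/s], axle load [N]
    m, f1, zeta = 1.5e5, 4.0, 0.02      # modal mass [kg], frequency [Hz], damping
    w = 2.0 * math.pi * f1
    k, c = m * w * w, 2.0 * zeta * m * w

    def force(t):
        """Constant load projected onto the first mode while on the span."""
        return P * math.sin(math.pi * v * t / L) if 0.0 <= t <= L / v else 0.0

    y = yd = t = y_max = 0.0
    dt, t_end = 1.0e-4, 2.0 * L / v     # integrate past the crossing time
    while t < t_end:
        ydd = (force(t) - c * yd - k * y) / m
        yd += ydd * dt
        y += yd * dt
        y_max = max(y_max, abs(y))
        t += dt

    print(f"peak mid-span deflection ~ {1000 * y_max:.2f} mm")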

  20. Preliminary study on the modelling of negative leader discharges

    International Nuclear Information System (INIS)

    Arevalo, L; Cooray, V

    2011-01-01

    Nowadays, there is considerable interest in understanding the physics underlying positive and negative discharges because of the importance of improving lightning protection systems and of coordinating the insulation for high voltages. Numerical simulations of positive switching impulses made in long spark gaps in a laboratory are achievable because the physics of the process is reasonably well understood and because of the availability of powerful computational methods. However, the existing work on the simulation of negative switching discharges has been held up by a lack of experimental data and the absence of a full understanding of the physics involved. In the scientific community, it is well known that most of the lightning discharges that occur in nature are of negative polarity, and because of their complexity, the only way to understand them is to generate the discharges in laboratories under controlled conditions. The voltage impulse waveshape used in laboratories is a negative switching impulse. With the aim of applying the available information to a self-consistent physical method, an electrostatic approximation of the negative leader discharge process is presented here. The simulation procedure takes into consideration the physics of positive and negative discharges, considering that the negative leader propagates towards a grounded electrode and the positive leader towards a rod electrode. The simulation treats the leader channel thermodynamically, and assumes that the conditions required to generate a thermal channel are the same for positive and negative leaders. However, the magnitude of the electrical charge necessary to reproduce their propagation and thermalization is different, and both values are based on experimental data. The positive and negative streamer development is based on the constant electric field characteristics of these discharges, as found during experimental measurements made by different authors. As a computational tool

  1. A cost modelling system for cloud computing

    OpenAIRE

    Ajeh, Daniel; Ellman, Jeremy; Keogh, Shelagh

    2014-01-01

    An advance in technology unlocks new opportunities for organizations to increase their productivity, efficiency and process automation while reducing the cost of doing business as well. The emergence of cloud computing addresses these prospects through the provision of agile systems that are scalable, flexible and reliable as well as cost effective. Cloud computing has made hosting and deployment of computing resources cheaper and easier with no up-front charges but pay per-use flexible payme...

  2. International Nuclear Model personal computer (PCINM): Model documentation

    International Nuclear Information System (INIS)

    1992-08-01

    The International Nuclear Model (INM) was developed to assist the Energy Information Administration (EIA), U.S. Department of Energy (DOE) in producing worldwide projections of electricity generation, fuel cycle requirements, capacities, and spent fuel discharges from commercial nuclear reactors. The original INM was developed, maintained, and operated on a mainframe computer system. In spring 1992, a streamlined version of INM was created for use on a microcomputer utilizing CLIPPER and PCSAS software. This new version is known as PCINM. This documentation is based on the new PCINM version. This document is designed to satisfy the requirements of several categories of users of the PCINM system including technical analysts, theoretical modelers, and industry observers. This document assumes the reader is familiar with the nuclear fuel cycle and each of its components. This model documentation contains four chapters and seven appendices. Chapter Two presents the model overview containing the PCINM structure and process flow, the areas for which projections are made, and input data and output reports. Chapter Three presents the model technical specifications showing all model equations, algorithms, and units of measure. Chapter Four presents an overview of all parameters, variables, and assumptions used in PCINM. The appendices present the following detailed information: variable and parameter listings, variable and equation cross reference tables, source code listings, file layouts, sample report outputs, and model run procedures. 2 figs

  3. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution of partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  4. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... by creating Matlab files for five different computer ... Keywords: virus lifecycle, Petri nets, modeling, simulation.

  5. Regenerating computer model of the thymus

    International Nuclear Information System (INIS)

    Lumb, J.R.

    1975-01-01

    This computer model simulates the cell population kinetics of the development and later degeneration of the thymus. Nutritional factors are taken into account by the growth of blood vessels in the simulated thymus. The stem cell population is kept at its maximum by allowing some stem cells to divide into two stem cells until the population reaches its maximum, thus regenerating the thymus after an insult such as irradiation. After a given number of population doublings, the maximum allowed stem cell population is gradually decreased in order to simulate the degeneration of the thymus. Results show that the simulated thymus develops and degenerates in a pattern similar to that of the natural thymus. This simulation is used to evaluate cellular kinetic data for the thymus. The results from testing the internal consistency of available data are reported. The number of generations that best represents the natural thymus includes seven dividing generations of lymphocytes and one mature, nondividing generation of small lymphocytes. The size of the resulting developed thymus can be controlled without affecting other variables by changing the maximum allowed stem cell population. In addition, recovery from irradiation is simulated
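
    The kinetics described above (a stem-cell pool that grows to a ceiling which is later lowered, feeding seven dividing lymphocyte generations plus one mature, nondividing generation) can be caricatured in a few lines. All rates and sizes below are invented; this is a sketch of the mechanism, not the dissertation's model.

    # Toy thymus population kinetics: growth to a ceiling, then degeneration.

    def simulate(steps=60, stem_max0=100.0, decay_start=30, decay=0.95):
        stem, gens = 1.0, [0.0] * 8        # 7 dividing generations + 1 mature
        stem_max, history = stem_max0, []
        for t in range(steps):
            if t >= decay_start:
                stem_max *= decay          # degeneration: the ceiling shrinks
            stem = min(2.0 * stem, stem_max)   # self-renewal up to the ceiling
            # half the stem pool differentiates; each dividing generation doubles
            new_gens = [0.5 * stem] + [2.0 * g for g in gens[:-1]]
            new_gens[-1] += 0.9 * gens[-1]     # mature cells persist, slowly die
            gens = new_gens
            history.append(stem + sum(gens))
        return history

    h = simulate()
    print(f"peak size {max(h):.0f} at step {h.index(max(h))}; final {h[-1]:.0f}")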

  6. Computational modeling of epidural cortical stimulation

    Science.gov (United States)

    Wongsarnpigoon, Amorn; Grill, Warren M.

    2008-12-01

    Epidural cortical stimulation (ECS) is a developing therapy to treat neurological disorders. However, it is not clear how the cortical anatomy or the polarity and position of the electrode affects current flow and neural activation in the cortex. We developed a 3D computational model simulating ECS over the precentral gyrus. With the electrode placed directly above the gyrus, about half of the stimulus current flowed through the crown of the gyrus while current density was low along the banks deep in the sulci. Beneath the electrode, neurons oriented perpendicular to the cortical surface were depolarized by anodic stimulation, and neurons oriented parallel to the boundary were depolarized by cathodic stimulation. Activation was localized to the crown of the gyrus, and neurons on the banks deep in the sulci were not polarized. During regulated voltage stimulation, the magnitude of the activating function was inversely proportional to the thickness of the CSF and dura. During regulated current stimulation, the activating function was not sensitive to the thickness of the dura but was slightly more sensitive than during regulated voltage stimulation to the thickness of the CSF. Varying the width of the gyrus and the position of the electrode altered the distribution of the activating function due to changes in the orientation of the neurons beneath the electrode. Bipolar stimulation, although often used in clinical practice, reduced spatial selectivity as well as selectivity for neuron orientation.

  7. Geometric modeling for computer aided design

    Science.gov (United States)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

    The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies. These include two major software systems currently used in the conceptual-level design of aerospace vehicles. These tools are SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer-aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  8. Review of computational thermal-hydraulic modeling

    International Nuclear Information System (INIS)

    Keefer, R.H.; Keeton, L.W.

    1995-01-01

    Corrosion of heat transfer tubing in nuclear steam generators has been a persistent problem in the power generation industry, assuming many different forms over the years depending on chemistry and operating conditions. Whatever the corrosion mechanism, a fundamental understanding of the process is essential to establish effective management strategies. To gain this fundamental understanding requires an integrated investigative approach that merges technology from many diverse scientific disciplines. An important aspect of an integrated approach is characterization of the corrosive environment at high temperature. This begins with a thorough understanding of local thermal-hydraulic conditions, since they affect deposit formation, chemical concentration, and ultimately corrosion. Computational Fluid Dynamics (CFD) can and should play an important role in characterizing the thermal-hydraulic environment and in predicting the consequences of that environment. The evolution of CFD technology now allows accurate calculation of steam generator thermal-hydraulic conditions and the resulting sludge deposit profiles. Similar calculations are also possible for model boilers, so that tests can be designed to be prototypic of the heat exchanger environment they are supposed to simulate. This paper illustrates the utility of CFD technology by way of examples in each of these two areas. This technology can be further extended to produce more detailed local calculations of the chemical environment in support plate crevices, beneath thick deposits on tubes, and deep in tubesheet sludge piles. Knowledge of this local chemical environment will provide the foundation for development of mechanistic corrosion models, which can be used to optimize inspection and cleaning schedules and focus the search for a viable fix

  9. Computer aided instruction. Preliminary experience in the Radiological Sciences Institute of the University of Milan

    International Nuclear Information System (INIS)

    Gardani, G.; Bertoli, M.A.; Bellomi, M.

    1987-01-01

    Computerised instruction means teaching by computer using a program that alternates information with self-checking multiple-choice questions. This system was used to create a fully computerised lesson on the diagnosis and treatment of breast cancer, which was then tested on a small group of medical students attending the Radiology School of the Milan University Institute of Radiological Sciences. At the end of the test, the students were asked to complete a questionnaire, which was then analysed. The computer lesson consisted of 66 text messages and 21 self-checking questions. It aroused considerable interest, though the most common reason was curiosity about a novel system. The degree of fatigue caused was modest, despite the fact that the computer lesson was at least as demanding as a traditional lesson, if not more so. The level of learning was considered high and was optimised by the use of self-checking questions, which were considered an essential element. However, no student agreed to sit an official examination, even interactively, using the computer

  10. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mathematics]; Anderson, D.R. [Sandia National Labs., Albuquerque, NM (United States). WIPP Performance Assessments Department]; Baker, B.L. [Technadyne Engineering Consultants, Albuquerque, NM (United States)] [and others]

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
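
    The closing remark, that each compliance probability is conceptually a large numerical integration, can be illustrated with a toy Monte Carlo estimate over uncertain inputs. The performance function, distributions, and the 'activity effect' scaling below are invented placeholders, not WIPP models.

    # Toy Monte Carlo estimate of a compliance probability for an activity set.

    import random

    def compliance_probability(activity_effect, n_samples=100_000,
                               threshold=1.0, seed=1):
        """P(release < threshold), where 'release' is a toy function of two
        uncertain parameters; activity_effect scales release downward,
        mimicking the benefit of an experimental program or design change."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_samples):
            permeability = rng.lognormvariate(0.0, 0.5)   # uncertain inputs
            solubility = rng.uniform(0.5, 1.5)
            release = activity_effect * permeability * solubility
            hits += release < threshold
        return hits / n_samples

    for effect in (1.0, 0.8, 0.5):         # three candidate activity sets
        print(effect, compliance_probability(effect))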

  11. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs

  12. A Solar Powered Wireless Computer Mouse: Design, Assembly and Preliminary Testing of 15 Prototypes

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.; Reich, N.H.; Alsema, E.A.; Netten, M.P.; Veefkind, M.; Silvester, S.; Elzen, B.; Verwaal, M.

    2007-01-01

    The concept and design of a solar powered wireless computer mouse has been completed, and 15 prototypes have been successfully assembled. After necessary cutting, the crystalline silicon cells show satisfactory efficiency: up to 14% when implemented into the mouse device. The implemented voltage

  13. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. IEA-ETSAP TIMES models in Denmark. Preliminary edition

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, P.E.

    2011-03-15

    This report presents the project 'Danish participation in IEA-ETSAP, Annex XI, 2008-2010', which continued the Danish participation in ETSAP under Annex XI 'JOint STudies for New And Mitigated Energy Systems (JOSTNAMES): Climate friendly, Secure and Productive Energy Systems'. The main activity has been semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). Contributions to these workshops have been based on various collaborative projects within the EU research programmes and the Danish Centre for Environment, Energy and Health (CEEH). In addition, the DTU Climate Centre at Risoe, which was founded in the autumn of 2008, has taken part in the ETSAP workshops and used the ETSAP model tools for projects, papers, and presentations, as well as for a Ph.D. project. (Author)

  15. RHF RELAP5 model and preliminary loss-of-offsite-power simulation results for LEU conversion

    Energy Technology Data Exchange (ETDEWEB)

    Licht, J. R. [Argonne National Laboratory (ANL), Argonne, IL (United States). Nuclear Engineering Div.; Bergeron, A. [Argonne National Laboratory (ANL), Argonne, IL (United States). Nuclear Engineering Div.; Dionne, B. [Argonne National Laboratory (ANL), Argonne, IL (United States). Nuclear Engineering Div.; Thomas, F. [Institut Laue-Langevin (ILL), Grenoble (France). RHF Reactor Dept.

    2014-08-01

    The purpose of this document is to describe the current state of the RELAP5 model for the Institut Laue-Langevin High Flux Reactor (RHF) located in Grenoble, France, and provide an update to the key information required to complete, for example, simulations for a loss of offsite power (LOOP) accident. A previous status report identified a list of 22 items to be resolved in order to complete the RELAP5 model. Most of these items have been resolved by ANL and the RHF team. Enough information was available to perform preliminary safety analyses and define the key items that are still required. Section 2 of this document describes the RELAP5 model of RHF. The final part of this section briefly summarizes previous model issues and resolutions. Section 3 of this document describes preliminary LOOP simulations for both HEU and LEU fuel at beginning of cycle conditions.

  16. Computational Intelligence Agent-Oriented Modelling

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2006-01-01

    Roč. 5, č. 2 (2006), s. 430-433 ISSN 1109-2777 R&D Projects: GA MŠk 1M0567 Institutional research plan: CEZ:AV0Z10300504 Keywords : multi-agent systems * adaptive agents * computational intelligence Subject RIV: IN - Informatics, Computer Science

  17. Computer modelling the potential benefits of amines in NPP Bohunice secondary circuit

    International Nuclear Information System (INIS)

    Fountain, M.J.; Smiesko, I.

    1998-01-01

    The use of computer modelling of PWR and WWER secondary circuit chemistry has already been demonstrated in the past. The model was used to illustrate the technical and economic advantages, compared with ammonia, of using 'advanced', high-basicity, low-volatility amines to raise the liquid-phase pH(T) in the moisture separator and other areas swept by wet steam. Since 1995, this technique has been successfully applied to a number of power plants and the computer model has been progressively developed. This paper describes the preliminary results of an ongoing assessment being carried out for the VVER 440 plants at Bohunice. The work for Bohunice is being funded by the 'Know How Fund', a department in the British Government's Foreign and Commonwealth Office. (J.P.N.)

  18. Preliminary Design and Computational Fluid Dynamics Analysis of Supercritical Carbon Dioxide Turbine Blade

    International Nuclear Information System (INIS)

    Jeong, Wi S.; Kim, Tae W.; Suh, Kune Y.

    2007-01-01

    The supercritical gas turbine Brayton cycle has been adopted in the secondary loop of Generation IV Nuclear Energy Systems, and is planned for the power conversion cycles of nuclear fusion reactors as well. Supercritical carbon dioxide (SCO2) is one of the most widely considered fluids for this concept. The potential beneficiaries include the Secure Transportable Autonomous Reactor-Liquid Metal (STAR-LM), the Korea Advanced Liquid Metal Reactor (KALIMER) and the Battery Omnibus Reactor Integral System (BORIS), which is being developed at Seoul National University. The reason for these welcome applications is that the SCO2 Brayton cycle can achieve higher overall energy conversion efficiency than the steam turbine Rankine cycle. Seoul National University has recently been working on the SCO2-based Modular Optimized Brayton Integral System (MOBIS). The MOBIS design power conversion efficiency is about 45%. Gas turbine design is a crucial part of achieving this high efficiency. In this paper, a preliminary analysis of the first gas turbine stage was performed using CFX as the solver
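
    As a back-of-envelope check on the quoted ~45% target, the sketch below evaluates a recuperated Brayton cycle with an ideal-gas approximation. Real supercritical CO2 departs strongly from ideal-gas behaviour near the critical point, so this is only a rough illustration; every value below is assumed, not MOBIS data.

    # Ideal-gas, perfectly recuperated Brayton estimate. With a perfect
    # recuperator (turbine exit hotter than compressor exit, true here),
    # external heat equals the turbine enthalpy drop, so
    # eta = 1 - w_compressor / w_turbine.

    def recuperated_brayton_eta(pr, gamma=1.29, t_min=305.0, t_max=823.0,
                                eta_t=0.90, eta_c=0.85):
        phi = pr ** ((gamma - 1.0) / gamma)
        w_t = eta_t * t_max * (1.0 - 1.0 / phi)   # turbine work per unit cp
        w_c = t_min * (phi - 1.0) / eta_c         # compressor work per unit cp
        return 1.0 - w_c / w_t

    print(f"{recuperated_brayton_eta(pr=1.8):.1%}")   # ~45% in this toy setting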

  19. Understanding Creative Design Processes by Integrating Sketching and CAD Modelling Design Environments: A Preliminary Protocol Result from Architectural Designers

    Directory of Open Access Journals (Sweden)

    Yi Teng Shih

    2015-11-01

    Full Text Available This paper presents the results of a preliminary protocol study of the cognitive behaviour of architectural designers during the design process. The aim is to better understand the similarities and differences in cognitive behaviour using Sequential Mixed Media (SMM) and Alternative Mixed Media (AMM) approaches, and how switching between media may impact design processes. Two participants, each with at least one year's professional design experience, a Bachelor of Design degree, and competence in both sketching and computer-aided design (CAD) modelling, took part in the study. Video recordings of the participants working on different projects were coded using the Function-Behaviour-Structure (FBS) coding scheme. Participants were also interviewed, and their explanations of their switching behaviours were categorised into three types: S→C, S/C↹R and C→S. Preliminary results indicate that switching between media may influence how designers identify problems and develop solutions. In particular, two design issues were identified. These relate to the FBS coding scheme, where structure (S) and behaviour derived from structure (Bs) change to documentation (D) after switching from sketching to CAD modelling (S→C). These switches make it possible for designers to integrate both approaches into one design medium and facilitate their design processes in AMM design environments.

  20. Business Model Innovation in European SMEs: some preliminary findings

    NARCIS (Netherlands)

    Bouwman, W.A.G.A.; Molina Castillo, F.J.; de Reuver, G.A.

    2016-01-01

    Business models have been on the research agenda since the emergence of e-commerce and e-business late last century. Although a lot of attention has been paid to the concept, ontologies, taxonomies and approaches in the fields of strategic management, information systems, digital business and

  1. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation, and a language based on the model, for bit-level operations are useful for compositionally developing asynchronous and concurrent programs that frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and supports compositional development. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for control and operation can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have at most three terminals and four ordered rules, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can concisely express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it well suited to compositional bit-level computation, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our work may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.
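
    The abstract only sketches APEC, but the underlying dataflow idea (primitives with FIFO buffers on their terminals, firing when input bits are available) is easy to illustrate. The following Python sketch is an illustration of that general idea, not the A-BITS/APEC implementation; the two-primitive network computing NOT(a) AND b is an assumed example.

    from collections import deque

    class Primitive:
        """A process with input FIFOs, one output FIFO and a bit-level rule."""
        def __init__(self, rule, inputs, output):
            self.rule, self.inputs, self.output = rule, inputs, output

        def step(self):
            # Fire only when every input FIFO holds at least one bit.
            if all(self.inputs):
                bits = [q.popleft() for q in self.inputs]
                self.output.append(self.rule(*bits))

    # Wire NOT(a) AND b as two connected primitives.
    a, b = deque([1, 0, 1]), deque([1, 1, 0])
    mid, out = deque(), deque()
    network = [Primitive(lambda x: 1 - x, [a], mid),
               Primitive(lambda x, y: x & y, [mid, b], out)]
    for _ in range(6):              # enough rounds to drain the input streams
        for p in network:
            p.step()
    print(list(out))                # -> [0, 1, 0]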

  2. Deployment Models: Towards Eliminating Security Concerns From Cloud Computing

    OpenAIRE

    Zhao, Gansen; Chunming, Rong; Jaatun, Martin Gilje; Sandnes, Frode Eika

    2010-01-01

    Cloud computing has become a popular choice as an alternative to investing in new IT systems. When making decisions on adopting cloud computing related solutions, security has always been a major concern. This article summarizes security concerns in cloud computing and proposes five service deployment models to ease these concerns. The proposed models provide different security related features to address different requirements and scenarios and can serve as reference models for deployment. D...

  3. Cloud Computing Adoption Business Model Factors: Does Enterprise Size Matter?

    OpenAIRE

    Bogataj Habjan, Kristina; Pucihar, Andreja

    2017-01-01

    This paper presents the results of research investigating the impact of business model factors on cloud computing adoption. The introduced research model consists of 40 cloud computing business model factors, grouped into eight factor groups. Their impact and importance for cloud computing adoption were investigated among enterprises in Slovenia. Furthermore, differences in opinion according to enterprise size were investigated. Research results show no statistically significant impacts of in...

  4. Cone-Beam Computed Tomography Evaluation of Mental Foramen Variations: A Preliminary Study

    International Nuclear Information System (INIS)

    Sheikhi, Mahnaz; Karbasi Kheir, Mitra; Hekmatian, Ehsan

    2015-01-01

    Background. The mental foramen is important in surgical operations on premolars because it transmits the mental nerve and vessels. This study evaluated the variations of the mental foramen by cone-beam computed tomography among a selected Iranian population. Materials and Methods. A total number of 180 cone-beam computed tomography projections were analyzed in terms of shape, size, direction, and horizontal and vertical positions of the mental foramen on the right and left sides. Results. The most common shape was oval, the most common opening direction was posterior-superior, the most common horizontal position was in line with the second premolar, and the most common vertical position was apical to the adjacent dental root. The mean foramen diameter was 3.59 mm. Conclusion. In addition to the most common types of mental foramen, other variations exist, too. Hence, this reflects the significance of preoperative radiographic examinations, especially 3-dimensional images, to prevent nerve damage

  5. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    International Nuclear Information System (INIS)

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N.

    1991-01-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy

  6. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  7. Computational Models for Nonlinear Aeroelastic Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  8. A preliminary model for posttraumatic brain injury depression.

    Science.gov (United States)

    Malec, James F; Brown, Allen W; Moessner, Anne M; Stump, Timothy E; Monahan, Patrick

    2010-07-01

    To develop, based on previous research, and evaluate a model for depression after traumatic brain injury (TBI). Cross-sectional structural equation modeling (SEM) of data from consecutively recruited patients. Acute hospital and inpatient rehabilitation units. Adult patients (N=158) after hospital admission for moderate to severe TBI. Not applicable. External appraisal of ability in participants was measured by the Mayo-Portland Adaptability Inventory (MPAI-4) Ability Index completed by a TBI clinical nurse specialist. Patient self-appraisal of post-TBI ability and depression were measured by the Awareness Questionnaire and Beck Depression Inventory-II. Functional outcome 1 year after injury was assessed with the MPAI-4 Participation Index. Successive SEM resulted in a parsimonious model with excellent fit. Consistent with prior research, a moderately strong association between self-appraisal of post-TBI ability and depression was found. Injury severity, as measured by the duration of posttraumatic amnesia (PTA), was not significantly associated with post-TBI depression. The 1-year functional outcome was associated with depression and TBI severity. The strong association between self-appraisal of post-TBI ability and depression is consistent with the cognitive-behavioral model of depression and recommends consideration and further study of cognitive-behavioral therapy for post-TBI depression. The lack of association between TBI severity and depression may represent the indirect and proxy nature of current measures of TBI severity such as PTA. Emerging neuroimaging techniques (eg, diffusion tensor imaging, magnetic resonance imaging spectroscopy) may provide the more direct measures of disruption of brain function after TBI that are needed to advance this line of research. Copyright 2010 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  9. A Preliminary Model of Insider Theft of Intellectual Property

    Science.gov (United States)

    2011-01-01

    …social, and technical factors. We expect future work to use modeling and simulation to identify and evaluate the effectiveness of deterrent measures in…

  10. Vascularized anal autotransplantation model in rats: preliminary report.

    Science.gov (United States)

    Araki, J; Mihara, M; Narushima, M; Iida, T; Sato, T; Koshima, I

    2011-11-01

    Ostomy has served as an effective surgery for various anorectal dysfunctions. However, patients suffer greatly from the stresses caused by their stomas. Many alternative therapies have been developed, but none have solved this critical issue. Meanwhile, owing to improvements in operative methods and immunosuppressive therapy, allotransplantation has gained great popularity in recent years. Therefore, we began development of an anal transplantation model. The operation was performed in six adult Wistar rats that were divided into two groups. Group 1 underwent vascular anastomoses, while group 2 did not. Group 1 grafts survived, fully recovering anal function. However, many of the group 2 grafts did not survive; those that did survive showed major defects in their anus, never recovering anal function. We succeeded in establishing a rat anal transplantation model utilizing super-microsurgery. While research in anal transplantation lags behind that in other fields, we hope that this model will bring significant possibilities for the future. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Preliminary results of very fast computation of Moment Magnitude and focal mechanism in the context of tsunami warning

    Science.gov (United States)

    Schindelé, François; Roch, Julien; Rivera, Luis

    2015-04-01

    Various methodologies have recently been developed to compute the moment magnitude and the focal mechanism, thanks to real-time access to numerous broad-band seismic data. Several methods were implemented at the CENALT, in particular the W-Phase method developed by H. Kanamori and L. Rivera. For earthquakes of magnitudes in the range 6.5-9.0, this method provides accurate results in less than 40 minutes. In the context of tsunami warning in the Mediterranean, a small basin that can be impacted in less than one hour by small sources, some with high tsunami potential (Boumerdes 2003), a comprehensive tsunami warning system should include very fast computation of the seismic parameters. The Mw value, the focal depth and the type of fault (reverse, normal, strike-slip) are the most relevant parameters for tsunami warning. Preliminary results will be presented using data from the North-East Atlantic and Mediterranean region for the recent period 2010-2014. This work is funded by the project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), FP7-ENV2013 6.4-3, Grant 603839
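
    W-Phase inversion yields the scalar seismic moment M0, from which Mw follows via the standard moment-magnitude relation. A minimal sketch in Python (an illustration of the standard formula, not the CENALT implementation):

    import math

    def moment_magnitude(m0: float) -> float:
        """Moment magnitude Mw from scalar seismic moment m0 in N*m
        (standard IASPEI/Kanamori relation)."""
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    print(round(moment_magnitude(1.1e21), 1))   # M0 = 1.1e21 N*m -> Mw = 8.0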

  12. Editorial: Modelling and computational challenges in granular materials

    NARCIS (Netherlands)

    Weinhart, Thomas; Thornton, Anthony Richard; Einav, Itai

    2015-01-01

    This is the editorial for the special issue on “Modelling and computational challenges in granular materials” in the journal on Computational Particle Mechanics (CPM). The issue aims to provide an opportunity for physicists, engineers, applied mathematicians and computational scientists to discuss

  13. Security Issues Model on Cloud Computing: A Case of Malaysia

    OpenAIRE

    Komeil Raisian; Jamaiah Yahaya

    2015-01-01

    With the development of cloud computing, many people's views on infrastructure architectures, software distribution and improvement models have changed significantly. Cloud computing is associated with a pioneering deployment architecture, which builds on grid computing, utility computing and autonomic computing. The fast transition towards it has increased worries regarding security, a critical issue for the effective adoption of cloud computing. From the security v...

  14. Computing Models of M-type Host Stars and their Panchromatic Spectral Output

    Science.gov (United States)

    Linsky, Jeffrey; Tilipman, Dennis; France, Kevin

    2018-06-01

    We have begun a program of computing state-of-the-art model atmospheres, from the photospheres to the coronae, of M stars that are the host stars of known exoplanets. For each model we compute the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass loss from exoplanet atmospheres. In particular, we compute the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.

  15. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  16. A Comparison between the Occurrence of Pauses, Repetitions and Recasts under Conditions of Face-to-Face and Computer-Mediated Communication: A Preliminary Study

    Science.gov (United States)

    Cabaroglu, Nese; Basaran, Suleyman; Roberts, Jon

    2010-01-01

    This study compares pauses, repetitions and recasts in matched task interactions under face-to-face and computer-mediated conditions. Six first-year English undergraduates at a Turkish University took part in Skype-based voice chat with a native speaker and face-to-face with their instructor. Preliminary quantitative analysis of transcripts showed…

  17. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and the increasing time needed to analyse them are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared them with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  18. Preliminary study on enhancing waste management best practice model in Malaysia construction industry

    Science.gov (United States)

    Jamaludin, Amril Hadri; Karim, Nurulzatushima Abdul; Noor, Raja Nor Husna Raja Mohd; Othman, Nurulhidayah; Malik, Sulaiman Abdul

    2017-08-01

    Construction waste management (CWM) is the practice of minimizing and diverting construction waste, demolition debris, and land-clearing debris from disposal and redirecting recyclable resources back into the construction process. A best practice model is the best choice from a collection of practices built for the purpose of construction waste management. The practice model can help contractors minimize waste before construction activities start. Minimizing wastage has a direct impact on the time, cost and quality of a construction project. This paper focuses on a preliminary study to determine the factors of waste generation on construction sites and to identify the effectiveness of existing construction waste management practices in Malaysia. The paper also covers the preliminary work of planning the research location and the data collection method, with the analysis to be done using the Analytical Hierarchy Process (AHP) to help develop a suitable waste management best practice model for the country.
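
    AHP derives priority weights for candidate practices from a pairwise comparison matrix and checks the judgements for consistency. A minimal Python sketch; the 3x3 matrix below is an illustrative assumption, not data from the study:

    import numpy as np

    # Hypothetical pairwise comparison matrix for three candidate
    # waste-minimisation practices (illustrative values, not study data).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # AHP priority weights: principal eigenvector, normalised to sum to 1.
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()

    # Saaty consistency ratio; RI = 0.58 is the random index for n = 3.
    n, RI = 3, 0.58
    CR = ((vals.real[k] - n) / (n - 1)) / RI
    print(w.round(3), round(CR, 3))   # weights; CR < 0.1 is conventionally acceptable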

  19. Preliminary conceptual model for mineral evolution in Yucca Mountain

    International Nuclear Information System (INIS)

    Duffy, C.J.

    1993-12-01

    A model is presented for mineral alteration in Yucca Mountain, Nevada, that suggests that the mineral transformations observed there are primarily controlled by the activity of aqueous silica. The rate of these reactions is related to the rate of evolution of the metastable silica polymorphs opal-CT and cristobalite, assuming that the activity a(SiO2(aq)) is fixed at the equilibrium solubility of the most soluble silica polymorph present. The rate equations accurately predict the present depths of disappearance of opal-CT and cristobalite. The rate equations have also been used to predict the extent of future mineral alteration that may result from emplacement of a high-level nuclear waste repository in Yucca Mountain. Relatively small changes in mineralogy are predicted, but these predictions are based on the assumption that emplacement of a repository would not increase the pH of water in Yucca Mountain nor increase its carbonate content. Such changes may significantly increase mineral alteration. Some of the reactions currently occurring in Yucca Mountain consume H+ and CO3^2-. Combining reaction rate models for these reactions with water chemistry data may make it possible to estimate water flux through the basal vitrophyre of the Topopah Spring Member and to help confirm the direction and rate of flow of groundwater in Yucca Mountain

  20. Preliminary time-phased TWRS process model results

    International Nuclear Information System (INIS)

    Orme, R.M.

    1995-01-01

    This report documents the first phase of efforts to model the retrieval and processing of Hanford tank waste within the constraints of an assumed tank farm configuration. This time-phased approach simulates a first try at a retrieval sequence, the batching of waste through retrieval facilities, the batching of retrieved waste through enhanced sludge washing, the batching of liquids through pretreatment and low-level waste (LLW) vitrification, and the batching of pretreated solids through high-level waste (HLW) vitrification. The results reflect the outcome of an assumed retrieval sequence that has not been tailored with respect to accepted measures of performance. The batch data, composition variability, and final waste volume projections in this report should be regarded as tentative. Nevertheless, the results provide interesting insights into time-phased processing of the tank waste. Inspection of the composition variability, for example, suggests modifications to the retrieval sequence that will further improve the uniformity of feed to the vitrification facilities. This model will be a valuable tool for evaluating suggested retrieval sequences and establishing a time-phased processing baseline. An official recommendation on the tank retrieval sequence will be made in September 1995

  1. Feature Extraction on Brain Computer Interfaces using Discrete Dyadic Wavelet Transform: Preliminary Results

    International Nuclear Information System (INIS)

    Gareis, I; Gentiletti, G; Acevedo, R; Rufiner, L

    2011-01-01

    The purpose of this work is to evaluate different feature extraction alternatives for detecting event-related evoked potential signals in brain computer interfaces, trying to minimize the time employed and the classification error, in terms of the sensitivity and specificity of the method, while looking for alternatives to coherent averaging. In this context, the results obtained by performing the feature extraction with the discrete dyadic wavelet transform using different mother wavelets are presented. For the classification a single-layer perceptron was used. The results obtained with and without the wavelet decomposition were compared, showing an improvement in the classification rate, the specificity and the sensitivity for the feature vectors obtained using some mother wavelets.
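
    The described pipeline (dyadic wavelet decomposition of each epoch, summary features per subband, then a simple linear classifier) can be sketched with PyWavelets; the wavelet, decomposition level and per-subband features below are assumptions for illustration, not the paper's exact choices:

    import numpy as np
    import pywt

    def dwt_features(epoch, wavelet="db4", level=4):
        """Feature vector for one EEG epoch: mean and standard deviation of
        each subband of a discrete dyadic wavelet decomposition."""
        coeffs = pywt.wavedec(epoch, wavelet, level=level)
        return np.array([f(c) for c in coeffs for f in (np.mean, np.std)])

    epoch = np.random.randn(512)        # one simulated 512-sample epoch
    print(dwt_features(epoch).shape)    # 2 features x (level + 1) subbands -> (10,)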

  2. The emerging role of cloud computing in molecular modelling.

    Science.gov (United States)

    Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W

    2013-07-01

    There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Reduced order methods for modeling and computational reduction

    CERN Document Server

    Rozza, Gianluigi

    2014-01-01

    This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics.  Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This...

  4. Preliminary analysis of a 1:4 scale prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Luk, V.K.; Hessheimer, M.F.

    1997-01-01

    Sandia National Laboratories is conducting a research program to investigate the integrity of nuclear containment structures. As part of the program Sandia will construct an instrumented 1:4 scale model of a prestressed concrete containment vessel (PCCV) for pressurized water reactors (PWR), which will be pressure tested up to its ultimate capacity. One of the key program objectives is to develop validated methods to predict the structural performance of containment vessels when subjected to beyond design basis loadings. Analytical prediction of structural performance requires a stepwise, systematic approach that addresses all potential failure modes. The analysis effort includes two and three-dimensional nonlinear finite element analyses of the PCCV test model to evaluate its structural performance under very high internal pressurization. Such analyses have been performed using the nonlinear concrete constitutive model, ANACAP-U, in conjunction with the ABAQUS general purpose finite element code. The analysis effort is carried out in three phases: preliminary analysis; pretest prediction; and post-test data interpretation and analysis evaluation. The preliminary analysis phase serves to provide instrumentation support and identify candidate failure modes. The associated tasks include the preliminary prediction of failure pressure and probable failure locations and the development of models to be used in the detailed failure analyses. This paper describes the modeling approaches and some of the results obtained in the first phase of the analysis effort

  5. Dynamic density functional theory of solid tumor growth: Preliminary models

    Directory of Open Access Journals (Sweden)

    Arnaud Chauviere

    2012-03-01

    Full Text Available Cancer is a disease that can be seen as a complex system whose dynamics and growth result from nonlinear processes coupled across wide ranges of spatio-temporal scales. The current mathematical modeling literature addresses issues at various scales, but the development of theoretical methodologies capable of bridging gaps across scales needs further study. We present a new theoretical framework based on Dynamic Density Functional Theory (DDFT), extended for the first time to the dynamics of living tissues by accounting for cell density correlations, different cell types, phenotypes and cell birth/death processes, in order to provide a biophysically consistent description of processes across the scales. We present an application of this approach to tumor growth.
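
    For reference, the standard single-species DDFT evolution equation, which the paper extends with multiple cell types and birth/death source terms (the extended form is not reproduced in the abstract), reads, in LaTeX notation:

        \frac{\partial \rho(\mathbf{r},t)}{\partial t}
          = \Gamma \, \nabla \cdot \left[ \rho(\mathbf{r},t) \,
            \nabla \frac{\delta F[\rho]}{\delta \rho(\mathbf{r},t)} \right]

    where \rho is the cell density, \Gamma a mobility coefficient, and F[\rho] a free-energy functional encoding the density correlations.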

  6. Reconstruction for interior region-of-interest inverse geometry computed tomography: preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Su; Kim, Tae Ho; Kim, Kyeong Hyeon; Yoon, Do Kun; Suh, Tae Suk [Dept. of Biomedical Engineering, Research Institute of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul (Korea, Republic of); Kang, Seong Hee [Dept. of Radiation Oncology, Seoul National University Hospital, Seoul (Korea, Republic of); Cho, Min Seok [Dept. of Radiation Oncology, Asan Medical Center, Seoul (Korea, Republic of); Noh, Yu Yoon [Dept. of Radiation Oncology, Eulji University Hospital, Daejeon (Korea, Republic of)

    2017-04-15

    The inverse geometry computed tomography (IGCT) system, composed of multiple sources and a small detector, has several merits compared to conventional cone-beam computed tomography (CBCT), such as a reduced scatter effect and large-volume imaging within one rotation without cone-beam artifacts. Using this multi-source characteristic, we present a selective, multiple interior region-of-interest (ROI) imaging method based on a designed source on-off sequence for IGCT. ROI-IGCT showed comparable image quality and can provide multiple ROI images within a single rotation. Because ROI-IGCT projection is performed by selective irradiation, unnecessary imaging dose to regions outside the ROIs can be reduced. In this regard, it appears useful for diagnostics and for image guidance in radiotherapy.

  7. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  8. Use of Computational Modeling to Evaluate Hypotheses About the Molecular and Cellular Mechanisms of Bystander Effects

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Yuchao; Conolly, Rory B; Andersen, Melvin E.

    2006-11-21

    This report describes the development of a computational systems biology approach to evaluate the hypotheses of molecular and cellular mechanisms of adaptive response to low dose ionizing radiation. Our concept is that computational models of signaling pathways can be developed and linked to biologically based dose response models to evaluate the underlying molecular mechanisms which lead to adaptive response. For development of quantitatively accurate, predictive models, it will be necessary to describe tissues consisting of multiple cell types where the different types each contribute in their own way to the overall function of the tissue. Such a model will probably need to incorporate not only cell type-specific data but also spatial information on the architecture of the tissue and on intercellular signaling. The scope of the current model was more limited. Data obtained in a number of different biological systems were synthesized to describe a chimeric, “average” population cell. Biochemical signaling pathways involved in sensing of DNA damage and in the activation of cell cycle checkpoint controls and the apoptotic path were also included. As with any computational modeling effort, it was necessary to develop these simplified initial descriptions (models) that can be iteratively refined. This preliminary model is a starting point which, with time, can evolve to a level of refinement where large amounts of detailed biological information are synthesized and a capability for robust predictions of dose- and time-response behaviors is obtained.

  9. Computing ordinary least-squares parameter estimates for the National Descriptive Model of Mercury in Fish

    Science.gov (United States)

    Donato, David I.

    2013-01-01

    A specialized technique is used to compute weighted ordinary least-squares (OLS) estimates of the parameters of the National Descriptive Model of Mercury in Fish (NDMMF) in less time using less computer memory than general methods. The characteristics of the NDMMF allow the two products X'X and X'y in the normal equations to be filled out in a second or two of computer time during a single pass through the N data observations. As a result, the matrix X does not have to be stored in computer memory and the computationally expensive matrix multiplications generally required to produce X'X and X'y do not have to be carried out. The normal equations may then be solved to determine the best-fit parameters in the OLS sense. The computational solution based on this specialized technique requires O(8p^2 + 16p) bytes of computer memory for p parameters on a machine with 8-byte double-precision numbers. This publication includes a reference implementation of this technique and a Gaussian-elimination solver in preliminary custom software.
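
    The technique described (accumulating X'X and X'y in a single pass so X is never stored, then solving the normal equations) is straightforward to sketch. This Python version is an illustration, not the report's reference implementation, and omits the observation weights of the weighted fit:

    import numpy as np

    def ols_single_pass(rows, p):
        """Accumulate X'X and X'y in one pass so X is never stored, then
        solve the normal equations for the best-fit parameters."""
        xtx = np.zeros((p, p))             # O(p^2) memory, independent of N
        xty = np.zeros(p)
        for x, y in rows:                  # rows yields (features, response) pairs
            xtx += np.outer(x, x)
            xty += x * y
        return np.linalg.solve(xtx, xty)   # Gaussian elimination under the hood

    # Usage on simulated data: y = 2*x0 - x1 + small noise.
    rng = np.random.default_rng(0)
    data = [(x, 2 * x[0] - x[1] + 0.01 * rng.standard_normal())
            for x in rng.standard_normal((1000, 2))]
    print(ols_single_pass(data, p=2))      # approximately [ 2. -1.]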

  10. Single photon emission computed tomography study of human pulmonary perfusion: preliminary findings

    Energy Technology Data Exchange (ETDEWEB)

    Carratu, L; Sofia, M [Naples Univ. (Italy). Facolta di Medicina e Chirurgia; Salvatore, M; Muto, P; Ariemma, G [Istituto Nazionale per la Prevenzione, Lo Studio e La Cura dei Tumori Fondazione Pascale, Naples (Italy); Lopez-Majano, V [Cook County Hospital, Chicago, IL (USA). Nuclear Medicine Div.

    1984-02-01

    Single photon emission computed tomography (SPECT) was performed with 99mTc-albumin macroaggregates to study human pulmonary perfusion in healthy subjects and patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) and lung neoplasms. The reconstructed SPECT data were displayed in coronal, transverse and sagittal plane sections and compared to conventional perfusion scans. The SPECT data gave more complete anatomical information about the extent of damage and the morphology of the pulmonary vascular bed. In healthy subjects and COPD patients, qualitative and quantitative assessment of pulmonary perfusion could be obtained from serial SPECT scans with respect to the distribution and relative concentration of the injected radiopharmaceutical. Furthermore, SPECT of pulmonary perfusion has been useful in detecting the extent of damage to the pulmonary circulation. This is useful for the preoperative evaluation and staging of lung cancer.

  11. Skin lesions diagnostics by on diffuse reflection spectres using computational algorithms: a preliminary study

    International Nuclear Information System (INIS)

    Orozco-Guillen, E.E.; Delgado-Atencio, J.A.; Vazquez-Montiel, S.; Castro-Ramos, J.; Villanueva-Luna, E.; Gutierrez-Delgado, F.

    2009-01-01

    The determination of the diffuse reflection spectrum of human skin in the spectral range 400-1000 nm using an optical fiber spectrometer is a non-invasive technique widely used to study the optical parameters of this tissue; it provides information about the absorption and scattering properties of light that can be employed to study the morphology and physiology of the tissue and to detect and diagnose skin diseases at early stages. In this paper, a computational algorithm is presented for the selection of the most important attributes of diffuse reflection spectra of human skin obtained with an experimental system that basically consists of a spectrometer, a white light source and a bifurcated fiber optic probe that sends and collects light. To classify the spectral signals, a Matlab 2006 graphical interface was designed that uses support vector machines together with the attribute selection algorithm, achieving a sensitivity and specificity exceeding 80% and an accuracy of 85% in the classification. (Author)
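
    The reported pipeline (attribute selection followed by a support vector machine) was built in Matlab; a rough Python equivalent on simulated spectra, where the array shapes, the number of selected attributes and the kernel are all assumptions, might look like:

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # Simulated diffuse-reflectance spectra: 60 samples x 300 wavelength bins.
    rng = np.random.default_rng(1)
    X = rng.random((60, 300))
    y = rng.integers(0, 2, 60)              # 0 = benign, 1 = suspicious (simulated)

    # Attribute selection followed by an SVM classifier.
    clf = make_pipeline(SelectKBest(f_classif, k=20), SVC(kernel="rbf"))
    clf.fit(X[:40], y[:40])
    print(clf.score(X[40:], y[40:]))        # held-out accuracy on simulated data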

  12. Ocean Modeling and Visualization on Massively Parallel Computer

    Science.gov (United States)

    Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.

    1997-01-01

    Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.

  13. Chess games: a model for RNA based computation.

    Science.gov (United States)

    Cukras, A R; Faulhammer, D; Lipton, R J; Landweber, L F

    1999-10-01

    Here we develop the theory of RNA computing and a method for solving the 'knight problem' as an instance of a satisfiability (SAT) problem. Using only biological molecules and enzymes as tools, we developed an algorithm for solving the knight problem (3 x 3 chess board) using a 10-bit combinatorial pool and sequential RNase H digestions. The results of preliminary experiments presented here reveal that the protocol recovers far more correct solutions than expected at random, but the persistence of errors still presents the greatest challenge.
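
    The 3 x 3 knight problem the RNA protocol solves can also be checked in silico by brute force: enumerate all knight placements and keep those in which no knight attacks another. A small Python sketch; the row-major board indexing is an illustrative convention:

    # Knight moves on a 3x3 board, squares indexed 0..8 row-major.
    ATTACKS = {0: (5, 7), 1: (6, 8), 2: (3, 7), 3: (2, 8), 4: (),
               5: (0, 6), 6: (1, 5), 7: (0, 2), 8: (1, 3)}

    # Keep every placement in which no occupied square attacks another.
    solutions = [cfg for cfg in range(2 ** 9)
                 if all(not (cfg >> sq & 1 and cfg >> a & 1)
                        for sq in range(9) for a in ATTACKS[sq])]
    print(len(solutions))   # 94 legal configurations, the empty board included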

  14. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  15. Airfoil Computations using the γ-Reθ Model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.

    computations. Based on this, an estimate of the error in the computations is determined to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64- 018, NACA64-218, NACA64-418 and NACA64-618 and the results...

  16. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
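
    As a flavour of the approach, here is a compact multiphase (tandem) queueing simulation in Python, written as a serial illustration rather than the parallel implementation the paper develops; the arrival and service rates are arbitrary:

    import random

    def tandem_queue(n_jobs=10000, arrival_rate=1.0, service_rates=(1.2, 1.5)):
        """Mean sojourn time in a multiphase (tandem) queue with Poisson
        arrivals and exponential service at each phase, FIFO discipline."""
        random.seed(42)
        arrival, phase_free, total = 0.0, [0.0] * len(service_rates), 0.0
        for _ in range(n_jobs):
            arrival += random.expovariate(arrival_rate)
            t = arrival
            for i, mu in enumerate(service_rates):   # pass through each phase
                start = max(t, phase_free[i])        # wait for the server
                phase_free[i] = start + random.expovariate(mu)
                t = phase_free[i]
            total += t - arrival
        return total / n_jobs

    print(tandem_queue())   # simulated mean time in system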

  17. Computational intelligence applications in modeling and control

    CERN Document Server

    Vaidyanathan, Sundarapandian

    2015-01-01

    The development of computational intelligence (CI) systems was inspired by observable and imitable aspects of the intelligent activity of human beings and nature. The essence of systems based on computational intelligence is to process and interpret data of various natures, so CI is strictly connected with the increase of available data as well as the capabilities for processing them, mutually supportive factors. Developed theories of computational intelligence were quickly applied in many fields of engineering, data analysis, forecasting, biomedicine and others. They are used in image and sound processing and identification, signal processing, multidimensional data visualization, steering of objects, analysis of lexicographic data, request systems in banking, diagnostic systems, expert systems and many other practical implementations. This book consists of 16 contributed chapters by subject experts who specialize in the various topics addressed in this book. The special chapters have been brought ...

  18. COMPUTATIONAL MODELING OF AIRFLOW IN NONREGULAR SHAPED CHANNELS

    Directory of Open Access Journals (Sweden)

    A. A. Voronin

    2013-05-01

    Full Text Available The basic approaches to computational modeling of airflow in the human nasal cavity are analyzed. Different models of turbulent flow which may be used to calculate air velocity and pressure are discussed. Experimental measurements of airflow temperature are illustrated. A geometrical model of the human nasal cavity reconstructed from computed tomography scans and numerical simulation results of airflow inside this model are also given. Spatial distributions of velocity and temperature for inhaled and exhaled air are shown.

  19. MSFC Stream Model Preliminary Results: Modeling Recent Leonid and Perseid Encounters

    Science.gov (United States)

    Cooke, William J.; Moser, Danielle E.

    2004-01-01

    The cometary meteoroid ejection model of Jones and Brown (1996b) was used to simulate ejection from comets 55P/Tempel-Tuttle during the last 12 revolutions, and 109P/Swift-Tuttle during the last 9 apparitions. Using cometary ephemerides generated by the Jet Propulsion Laboratory's (JPL) HORIZONS Solar System Data and Ephemeris Computation Service, two independent ejection schemes were simulated. In the first case, ejection was simulated in 1-hour time steps along the comet's orbit while it was within 2.5 AU of the Sun. In the second case, ejection was simulated to occur at the hour the comet reached perihelion. A 4th-order variable step-size Runge-Kutta integrator was then used to integrate meteoroid position and velocity forward in time, accounting for the effects of radiation pressure, Poynting-Robertson drag, and the gravitational forces of the planets, which were computed using JPL's DE406 planetary ephemerides. An impact parameter was computed for each particle approaching the Earth to create a flux profile, and the results were compared to observations of the 1998 and 1999 Leonid showers, and the 1993 and 2004 Perseids.
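
    The force model (solar gravity reduced by radiation pressure, plus Poynting-Robertson drag) and a classical 4th-order Runge-Kutta step can be sketched as follows. This fixed-step Python illustration omits the planetary perturbations and the variable step-size control of the actual MSFC integrator, and the grain's beta value is an assumption:

    import numpy as np

    MU_SUN = 1.32712440018e20     # GM of the Sun [m^3/s^2]
    C = 2.99792458e8              # speed of light [m/s]

    def accel(r, v, beta):
        """Solar gravity reduced by radiation pressure (factor 1 - beta),
        plus the standard Poynting-Robertson drag term."""
        rn = np.linalg.norm(r)
        rhat = r / rn
        grav = -MU_SUN * (1.0 - beta) * rhat / rn**2
        pr = -(beta * MU_SUN / rn**2) * ((np.dot(v, rhat) / C) * rhat + v / C)
        return grav + pr

    def rk4_step(r, v, dt, beta):
        """One classical 4th-order Runge-Kutta step of the meteoroid state."""
        def f(s):
            return np.concatenate([s[3:], accel(s[:3], s[3:], beta)])
        s = np.concatenate([r, v])
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        return s[:3], s[3:]

    # Grain released near 1 AU with circular speed; beta = 0.01 is assumed.
    r, v = np.array([1.496e11, 0.0, 0.0]), np.array([0.0, 29.78e3, 0.0])
    for _ in range(240):                       # ten days of 1-hour steps
        r, v = rk4_step(r, v, 3600.0, beta=0.01)
    print(np.linalg.norm(r) / 1.496e11)        # heliocentric distance [AU]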

  20. Assessing Internet addiction using the parsimonious Internet addiction components model - a preliminary study [forthcoming]

    OpenAIRE

    Kuss, DJ; Shorter, GW; Van Rooij, AJ; Griffiths, MD; Schoenmakers, T

    2014-01-01

    Internet usage has grown exponentially over the last decade. Research indicates that excessive Internet use can lead to symptoms associated with addiction. To date, assessment of potential Internet addiction has varied regarding populations studied and instruments used, making reliable prevalence estimations difficult. To overcome the present problems a preliminary study was conducted testing a parsimonious Internet addiction components model based on Griffiths’ addiction components (2005), i...

  1. Model Checking Quantified Computation Tree Logic

    NARCIS (Netherlands)

    Rensink, Arend; Baier, C; Hermanns, H.

    2006-01-01

    Propositional temporal logic is not suitable for expressing properties on the evolution of dynamically allocated entities over time. In particular, it is not possible to trace such entities through computation steps, since this requires the ability to freely mix quantification and temporal

  2. Computational compliance criteria in water hammer modelling

    Directory of Open Access Journals (Sweden)

    Urbanowicz Kamil

    2017-01-01

    Full Text Available Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most appreciated. With its help, it is possible to examine the effect of the numerical discretisation carried out over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, indicate also that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to the authors' own and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
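
    The CFL condition discussed here ties the MOC time step to the reach length and the pressure-wave speed. A minimal sketch in Python, with illustrative numbers:

    def cfl_number(wave_speed, dt, dx):
        """Courant-Friedrichs-Lewy number on an MOC grid."""
        return wave_speed * dt / dx

    # A 1000 m pipe split into the recommended minimum of 10 reaches,
    # with an assumed pressure-wave speed a = 1200 m/s.
    a, L, reaches = 1200.0, 1000.0, 10
    dx = L / reaches
    dt = dx / a                   # choose the time step so that CFL = 1
    print(cfl_number(a, dt, dx))  # -> 1.0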

  3. Computational compliance criteria in water hammer modelling

    Science.gov (United States)

    Urbanowicz, Kamil

    2017-10-01

    Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most appreciated. With its help, it is possible to examine the effect of the numerical discretisation carried out over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, indicate also that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to the authors' own and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.

  4. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  5. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
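
    The reported distribution families can be fitted by maximum likelihood and reused as Bayesian priors. A scipy sketch on simulated data, since the operational data set is not included here; all parameters are illustrative:

    import numpy as np
    from scipy import stats

    # Simulated arrival/deletion samples (illustrative parameters only).
    rng = np.random.default_rng(7)
    arrivals = rng.lognormal(mean=2.0, sigma=0.8, size=500)
    deletions = rng.exponential(scale=5.0, size=500)

    # Maximum-likelihood fits of the two reported families.
    sigma, _, median = stats.lognorm.fit(arrivals, floc=0)
    _, mean_deletion = stats.expon.fit(deletions, floc=0)
    print(f"log-normal: sigma={sigma:.2f}, median={median:.2f}")
    print(f"exponential: mean={mean_deletion:.2f}")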

  6. Application of computed tomography virtual noncontrast spectral imaging in evaluation of hepatic metastases: a preliminary study.

    Science.gov (United States)

    Tian, Shi-Feng; Liu, Ai-Lian; Liu, Jing-Hong; Sun, Mei-Yu; Wang, He-Qing; Liu, Yi-Jun

    2015-03-05

    The objective was to qualitatively and quantitatively evaluate hepatic metastases using computed tomography (CT) virtual noncontrast (VNC) spectral imaging in a retrospective analysis. Forty hepatic metastases patients underwent CT scans including the conventional true noncontrast (TNC) and the tri-phasic contrast-enhanced dual energy spectral scans in the hepatic arterial, portal venous, and equilibrium phases. The tri-phasic spectral CT images were used to obtain three groups of VNC images, in the arterial (VNCa), venous (VNCv), and equilibrium (VNCe) phases, by the material decomposition process using water and iodine as a base material pair. The image quality and the contrast-to-noise ratio (CNR) of metastasis of the four groups were compared with ANOVA analysis. The metastasis detection rates with the four nonenhanced image groups were calculated and compared using the Chi-square test. There were no significant differences in image quality among TNC, VNCa and VNCv images (P > 0.05). The quality of VNCe images was significantly worse than that of the other three groups (P < 0.05). The metastasis detection rates of the four nonenhanced groups showed no statistically significant difference (P > 0.05). The quality of VNCa and VNCv images is identical to that of TNC images, and the metastasis detection rate in VNC images is similar to that in TNC images. VNC images obtained from the arterial phase show metastases more clearly. Thus, VNCa imaging may be a surrogate to TNC imaging in hepatic metastasis diagnosis.

  7. Application of Computed Tomography Virtual Noncontrast Spectral Imaging in Evaluation of Hepatic Metastases: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Shi-Feng Tian

    2015-01-01

    Full Text Available Objective: The objective was to qualitatively and quantitatively evaluate hepatic metastases using computed tomography (CT) virtual noncontrast (VNC) spectral imaging in a retrospective analysis. Methods: Forty hepatic metastases patients underwent CT scans including the conventional true noncontrast (TNC) and the tri-phasic contrast-enhanced dual energy spectral scans in the hepatic arterial, portal venous, and equilibrium phases. The tri-phasic spectral CT images were used to obtain three groups of VNC images, in the arterial (VNCa), venous (VNCv), and equilibrium (VNCe) phases, by the material decomposition process using water and iodine as a base material pair. The image quality and the contrast-to-noise ratio (CNR) of metastasis of the four groups were compared with ANOVA analysis. The metastasis detection rates with the four nonenhanced image groups were calculated and compared using the Chi-square test. Results: There were no significant differences in image quality among TNC, VNCa and VNCv images (P > 0.05). The quality of VNCe images was significantly worse than that of the other three groups (P < 0.05). The metastasis detection rates of the four nonenhanced groups showed no statistically significant difference (P > 0.05). Conclusions: The quality of VNCa and VNCv images is identical to that of TNC images, and the metastasis detection rate in VNC images is similar to that in TNC images. VNC images obtained from the arterial phase show metastases more clearly. Thus, VNCa imaging may be a surrogate to TNC imaging in hepatic metastasis diagnosis.

  8. Use of ultrafast computed tomography to quantitate regional myocardial perfusion: a preliminary report

    International Nuclear Information System (INIS)

    Rumberger, J.A.; Feiring, A.J.; Lipton, M.J.; Higgins, C.B.; Ell, S.R.; Marcus, M.L.

    1987-01-01

    The purpose of this study was to assess the potential for rapid acquisition computed axial tomography (Imatron C-100) to quantify regional myocardial perfusion. Myocardial and left ventricular cavity contrast clearance curves were constructed after injecting nonionic contrast (1 ml/kg over 2 to 3 seconds) into the inferior vena cava of six anesthetized, closed chest dogs (n = 14). Independent myocardial perfusion measurements were obtained by coincident injection of radiolabeled microspheres into the left atrium during control, intermediate and maximal myocardial vasodilation with adenosine (0.5 to 1.0 mg/kg per min, intravenously, respectively). At each flow state, 40 serial short-axis scans of the left ventricle were taken near end-diastole at the midpapillary muscle level. Contrast clearance curves were generated and analyzed from the left ventricular cavity and posterior papillary muscle regions after excluding contrast recirculation and minimizing partial volume effects. The area under the curve (gamma variate function) was determined for a region of interest placed within the left ventricular cavity. Characteristics of contrast clearance data from the posterior papillary muscle region that were evaluated included the peak myocardial opacification, area under the contrast clearance curve and a contrast clearance time defined by the full width/half maximal extent of the clearance curve. Myocardial perfusion (microspheres) ranged from 35 to 450 ml/100 g per min (mean 167 +/- 125)
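
    The contrast-clearance analysis described here (fit a gamma-variate to the first-pass curve, then take its area and width) can be sketched as follows; the time base and parameter values are simulated, not the study's data:

    import numpy as np
    from scipy.optimize import curve_fit

    def gamma_variate(t, A, t0, alpha, beta):
        """Gamma-variate indicator-dilution curve, zero before arrival time t0."""
        dt = np.clip(t - t0, 0.0, None)
        return A * dt**alpha * np.exp(-dt / beta)

    # Fit simulated first-pass contrast-clearance data (illustrative values).
    t = np.linspace(0.0, 30.0, 60)
    noisy = gamma_variate(t, 50.0, 4.0, 2.0, 3.0) \
            + np.random.default_rng(3).normal(0.0, 1.0, t.size)
    popt, _ = curve_fit(gamma_variate, t, noisy, p0=(40.0, 3.0, 1.5, 2.5),
                        bounds=([0, 0, 0.1, 0.1], [200, 10, 5, 10]))
    area = np.trapz(gamma_variate(t, *popt), t)   # area under the fitted curve
    print(popt, area)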

  9. Preliminary integrated calculation of radionuclide cation and anion transport at Yucca Mountain using a geochemical model

    International Nuclear Information System (INIS)

    Birdsell, K.H.; Campbell, K.; Eggert, K.G.; Travis, B.J.

    1989-01-01

    This paper presents preliminary transport calculations for radionuclide movement at Yucca Mountain using preliminary data for mineral distributions, retardation parameter distributions, and hypothetical recharge scenarios. These calculations are not performance assessments, but are used to study the effectiveness of the geochemical barriers at the site at a mechanistic level. The preliminary calculations presented have many shortcomings and should be viewed only as a demonstration of the modeling methodology. The simulations were run with TRACRN, a finite-difference porous flow and radionuclide transport code developed for the Yucca Mountain Project. Approximately 30,000 finite-difference nodes are used to represent the unsaturated and saturated zones underlying the repository in three dimensions. Sorption ratios for the radionuclides modeled are assumed to be functions of the mineralogic assemblages of the underlying rock. These transport calculations consider a representative radionuclide cation, 135Cs, and anion, 99Tc. The effects on transport of many of the processes thought to be active at Yucca Mountain may be examined using this approach. The model provides a method for examining the integration of flow scenarios, transport, and retardation processes as currently understood for the site. It will also form the basis for estimates of the sensitivity of transport calculations to retardation processes. 11 refs., 17 figs., 1 tab
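
    The abstract ties sorption ratios to mineralogy but does not reproduce TRACRN's equations. For linear equilibrium sorption, the standard relation between a sorption ratio (Kd) and the retardation factor is easy to state; the values below are hypothetical, chosen only to contrast a strongly sorbing cation such as 135Cs with a weakly sorbing anion such as 99Tc.

    ```python
    def retardation_factor(kd_ml_per_g, bulk_density_g_cm3, porosity):
        """Linear-sorption retardation: R = 1 + (rho_b / theta) * Kd.

        Kd is the sorption ratio (mL/g), rho_b the dry bulk density
        (g/cm^3) and theta the water-filled porosity; R is the factor
        by which the nuclide travels slower than the water itself.
        """
        return 1.0 + bulk_density_g_cm3 * kd_ml_per_g / porosity

    # Hypothetical tuff properties: the cation sorbs, the anion barely does
    for nuclide, kd in (("135Cs", 100.0), ("99Tc", 0.1)):
        r = retardation_factor(kd, bulk_density_g_cm3=1.8, porosity=0.2)
        print(f"{nuclide}: R = {r:.1f} (travel time ~{r:.0f}x that of water)")
    ```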

  10. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to free-form curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. … -Computer-Aided Design, 42, 2010 … Many books concentrate on computer programming and soon beco

  11. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services introduced in the paper will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  12. SmartShadow models and methods for pervasive computing

    CERN Document Server

    Wu, Zhaohui

    2013-01-01

    SmartShadow: Models and Methods for Pervasive Computing offers a new perspective on pervasive computing with SmartShadow, which is designed to model a user as a personality "shadow" and to model pervasive computing environments as user-centric dynamic virtual personal spaces. Just like human beings' shadows in the physical world, it follows people wherever they go, providing them with pervasive services. The model, methods, and software infrastructure for SmartShadow are presented, and an application for smart cars is also introduced. The book can serve as a valuable reference work for resea

  13. Computer modeling of ORNL storage tank sludge mobilization and mixing

    International Nuclear Information System (INIS)

    Terrones, G.; Eyler, L.L.

    1993-09-01

    This report presents and analyzes the results of the computer modeling of mixing and mobilization of sludge in horizontal, cylindrical storage tanks using submerged liquid jets. The computer modeling uses the TEMPEST computational fluid dynamics computer program. The horizontal, cylindrical storage tank configuration is similar to the Melton Valley Storage Tanks (MVST) at Oak Ridge National Laboratory (ORNL). The MVST tank contents exhibit non-homogeneous, non-Newtonian rheology characteristics. The eventual goals of the simulations are to determine under what conditions sludge mobilization using submerged liquid jets is feasible in tanks of this configuration, and to estimate the mixing times required to approach homogeneity of the contents of the tanks.

  14. Preliminary results of BRAVO project: brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks.

    Science.gov (United States)

    Bergamasco, Massimo; Frisoli, Antonio; Fontana, Marco; Loconsole, Claudio; Leonardis, Daniele; Troncossi, Marco; Foumashi, Mohammad Mozaffari; Parenti-Castelli, Vincenzo

    2011-01-01

    This paper presents the preliminary results of the project BRAVO (Brain computer interfaces for Robotic enhanced Action in Visuo-motOr tasks). The objective of this project is to define a new approach to the development of assistive and rehabilitative robots for motor impaired users to perform complex visuomotor tasks that require a sequence of reaches, grasps and manipulations of objects. BRAVO aims at developing new robotic interfaces and HW/SW architectures for rehabilitation and regain/restoration of motor function in patients with upper limb sensorimotor impairment through extensive rehabilitation therapy and active assistance in the execution of Activities of Daily Living. The final system developed within this project will include a robotic arm exoskeleton and a hand orthosis that will be integrated together for providing force assistance. The main novelty that BRAVO introduces is the control of the robotic assistive device through the active prediction of intention/action. The system will actually integrate the information about the movement carried out by the user with a prediction of the performed action through an interpretation of current gaze of the user (measured through eye-tracking), brain activation (measured through BCI) and force sensor measurements. © 2011 IEEE

  15. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.; Douglas, Craig C.

    2010-01-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models

  16. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e., regression), theoretical (i.e., analytical) and simulation (i.e., computer) modeling. The common assumption of these approaches is that a tree can be regarded as a fractal object, i.e., a collection of self-similar parts that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model against experimental data. In the paper, different stages of the above-mentioned approaches are described. The experimental data for spruce, a description of the computer system for modeling, and a variant of the computer model are presented. (author). 9 refs, 4 figs
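
    The abstract invokes a "fractal measure of crown" without defining the estimator. Box counting is the textbook way to estimate such a measure; the sketch below applies it to a hypothetical two-dimensional crown silhouette (a filled region gives a dimension near 2, a sparse branching skeleton a fractional value).

    ```python
    import numpy as np

    def box_counting_dimension(points, sizes):
        """Estimate the box-counting (fractal) dimension of a point set.

        Counts occupied boxes N(s) for each box size s and fits
        log N(s) = -D log s + c; the slope gives the dimension D.
        """
        counts = []
        for s in sizes:
            occupied = {tuple(np.floor(p / s).astype(int)) for p in points}
            counts.append(len(occupied))
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        return -slope

    # Hypothetical 2D crown silhouette: points under a triangular outline
    rng = np.random.default_rng(2)
    pts = rng.random((5000, 2))
    pts = pts[pts[:, 1] < 1.0 - 2.0 * abs(pts[:, 0] - 0.5)]
    print(f"D ~ {box_counting_dimension(pts, [0.5, 0.25, 0.125, 0.0625]):.2f}")
    ```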

  17. Models of parallel computation :a survey and classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yunquan; CHEN Guoliang; SUN Guangzhong; MIAO Qiankun

    2007-01-01

    In this paper, the state-of-the-art parallel computational model research is reviewed. We introduce various models that were developed during the past decades. According to their target architecture features, especially memory organization, we classify these parallel computational models into three generations. These models and their characteristics are discussed based on this three-generation classification. We believe that with the ever-increasing speed gap between the CPU and memory systems, incorporating non-uniform memory hierarchies into computational models will become unavoidable. With the emergence of multi-core CPUs, the parallelism hierarchy of current computing platforms becomes more and more complicated. Describing this complicated parallelism hierarchy in future computational models becomes more and more important. A semi-automatic toolkit that can extract model parameters and their values on real computers can reduce the model analysis complexity, thus allowing more complicated models with more parameters to be adopted. Hierarchical memory and hierarchical parallelism will be two very important features that should be considered in future model design and research.

  18. Patentability aspects of computational cancer models

    Science.gov (United States)

    Lishchuk, Iryna

    2017-07-01

    Multiscale cancer models, implemented in silico, simulate tumor progression at various spatial and temporal scales. Given their innovative substance and their potential to be applied as decision support tools in clinical practice, patenting and obtaining patent rights in cancer models seems prima facie possible. In this paper we inquire what legal hurdles cancer models need to overcome in order to be patented.

  19. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  20. Computational model of cellular metabolic dynamics

    DEFF Research Database (Denmark)

    Li, Yanjun; Solomon, Thomas; Haus, Jacob M

    2010-01-01

    … of the cytosol and mitochondria. The model simulated skeletal muscle metabolic responses to insulin corresponding to human hyperinsulinemic-euglycemic clamp studies. Insulin-mediated rate of glucose disposal was the primary model input. For model validation, simulations were compared with experimental data: intracellular metabolite concentrations and patterns of glucose disposal. Model variations were simulated to investigate three alternative mechanisms to explain insulin enhancements: Model 1 (M.1), simple mass action; M.2, insulin-mediated activation of key metabolic enzymes (i.e., hexokinase, glycogen synthase …) … By application of mechanism M.3, the model predicts metabolite concentration changes and glucose partitioning patterns consistent with experimental data. The reaction rate fluxes quantified by this detailed model of insulin/glucose metabolism provide information that can be used to evaluate the development …

  1. High burnup models in computer code fair

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, B K; Swami Prasad, P; Kushwaha, H S; Mahajan, S C; Kakodar, A [Bhabha Atomic Research Centre, Bombay (India)]

    1997-08-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins of both collapsible clad, as in PHWR, and free standing clad, as in LWR. The main emphasis in the development of this code is on evaluating the fuel performance at extended burnups and modelling of the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling the fission gas release, three different models are implemented, namely a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled by using the model RADAR. For modelling pellet clad mechanical interaction (PCMI)/stress corrosion cracking (SCC) induced failure of sheath, necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of the EPRI project "Light water reactor fuel rod modelling code evaluation" and also the analytical simulation of threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded through these case studies. (author). 12 refs, 5 figs.
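
    The record lists several fission-gas-release options (mechanistic, ANS 5.4, Halden) without reproducing their equations. As an illustrative stand-in only, not FAIR's actual implementation, the classical Booth equivalent-sphere short-time approximation gives the flavor of such models:

    ```python
    import math

    def booth_release_fraction(d_prime, t):
        """Booth equivalent-sphere estimate of fission gas release.

        Short-time solution for gas diffusing out of an idealized fuel
        sphere: f = 6*sqrt(D't/pi) - 3*D't, with D' = D/a^2 the
        diffusivity scaled by the sphere radius. Reasonable while the
        released fraction stays well below 1.
        """
        x = d_prime * t
        return min(6.0 * math.sqrt(x / math.pi) - 3.0 * x, 1.0)

    # Hypothetical scaled diffusivity (1/s) over roughly one year
    print(f"released fraction = {booth_release_fraction(1e-10, 3.15e7):.1%}")
    ```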

  2. High burnup models in computer code fair

    International Nuclear Information System (INIS)

    Dutta, B.K.; Swami Prasad, P.; Kushwaha, H.S.; Mahajan, S.C.; Kakodar, A.

    1997-01-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins of both collapsible clad, as in PHWR, and free standing clad, as in LWR. The main emphasis in the development of this code is on evaluating the fuel performance at extended burnups and modelling of the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling the fission gas release, three different models are implemented, namely a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled by using the model RADAR. For modelling pellet clad mechanical interaction (PCMI)/stress corrosion cracking (SCC) induced failure of sheath, necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of the EPRI project "Light water reactor fuel rod modelling code evaluation" and also the analytical simulation of threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded through these case studies. (author). 12 refs, 5 figs

  3. Computational neurorehabilitation: modeling plasticity and learning to predict recovery.

    Science.gov (United States)

    Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas

    2016-04-30

    Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.

  4. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020

  5. Establishing a Cloud Computing Success Model for Hospitals in Taiwan.

    Science.gov (United States)

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  6. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    Directory of Open Access Journals (Sweden)

    Jiunn-Woei Lian PhD

    2017-01-01

    Full Text Available The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services.

  7. Aespoe Pillar Stability Experiment. Final coupled 3D thermo-mechanical modeling. Preliminary particle mechanical modeling

    International Nuclear Information System (INIS)

    Wanne, Toivo; Johansson, Erik; Potyondy, David

    2004-02-01

    SKB is planning to perform a large-scale pillar stability experiment called APSE (Aespoe Pillar Stability Experiment) at Aespoe HRL. The study is focused on understanding and controlling progressive rock failure in hard crystalline rock and damage caused by high stresses. The elastic thermo-mechanical modeling was carried out in three dimensions, because of the complex test geometry and in-situ stress tensor, using the finite-difference modeling software FLAC3D. Cracking and damage formation were modeled in the area of interest (the pillar between two large-scale holes) in two dimensions using the Particle Flow Code (PFC), which is based on particle mechanics. FLAC and PFC were coupled to minimize the computer resources and the computing time. According to the modeling, the initial temperature rises from 15 deg C to about 65 deg C in the pillar area during the heating period of 120 days. The rising temperature induces stresses in the pillar area through thermal expansion; after 120 days of heating, the stresses have increased by about 33%, from the excavation-induced maximum stress of 150 MPa to 200 MPa at the end of the heating period. The FLAC3D model identified the regions where the crack initiation stress was exceeded; these extended to about two meters down the hole wall and can be considered the areas where damage may occur during the in-situ test. When the other hole is pressurized with a 0.8 MPa confining pressure, about 5 MPa more stress is needed to damage the rock than without confining pressure, which makes the damaged area somewhat smaller. High compressive stresses, in addition to some tensile stresses, might induce acoustic emission (AE) activity in the upper part of the hole from the very beginning of the test; these are thus potential areas where AE activity may be detected. Acoustic emissions will be monitored during the test execution. The 2D coupled PFC-FLAC modeling indicated that

  8. Aespoe Pillar Stability Experiment. Final coupled 3D thermo-mechanical modeling. Preliminary particle mechanical modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wanne, Toivo; Johansson, Erik; Potyondy, David [Saanio and Riekkola Oy, Helsinki (Finland)

    2004-02-01

    SKB is planning to perform a large-scale pillar stability experiment called APSE (Aespoe Pillar Stability Experiment) at Aespoe HRL. The study is focused on understanding and controlling progressive rock failure in hard crystalline rock and damage caused by high stresses. The elastic thermo-mechanical modeling was carried out in three dimensions, because of the complex test geometry and in-situ stress tensor, using the finite-difference modeling software FLAC3D. Cracking and damage formation were modeled in the area of interest (the pillar between two large-scale holes) in two dimensions using the Particle Flow Code (PFC), which is based on particle mechanics. FLAC and PFC were coupled to minimize the computer resources and the computing time. According to the modeling, the initial temperature rises from 15 deg C to about 65 deg C in the pillar area during the heating period of 120 days. The rising temperature induces stresses in the pillar area through thermal expansion; after 120 days of heating, the stresses have increased by about 33%, from the excavation-induced maximum stress of 150 MPa to 200 MPa at the end of the heating period. The FLAC3D model identified the regions where the crack initiation stress was exceeded; these extended to about two meters down the hole wall and can be considered the areas where damage may occur during the in-situ test. When the other hole is pressurized with a 0.8 MPa confining pressure, about 5 MPa more stress is needed to damage the rock than without confining pressure, which makes the damaged area somewhat smaller. High compressive stresses, in addition to some tensile stresses, might induce acoustic emission (AE) activity in the upper part of the hole from the very beginning of the test; these are thus potential areas where AE activity may be detected. Acoustic emissions will be monitored during the test execution. The 2D coupled PFC-FLAC modeling indicated that

  9. Computer modeling of Cannabinoid receptor type 1

    Directory of Open Access Journals (Sweden)

    Sapundzhi Fatima

    2018-01-01

    Full Text Available Cannabinoid receptors are an important class of receptors, as they are involved in various physiological processes such as appetite, pain sensation, mood, and memory. It is important to design receptor-selective ligands in order to treat a particular disorder. The aim of the present study is to model the structure of the cannabinoid receptor CB1 and to perform docking between the obtained models and known ligands. Two models of CB1 were prepared with two different methods (the Modeller interface of Chimera, and MOE) and were used for docking with GOLD 5.2. A high correlation was established between the inhibitory constant Ki of CB1 cannabinoid ligands and the ChemScore scoring function of GOLD for both models. This suggests that the CB1 receptor models obtained could be used for docking studies and in the further investigation and design of new potent, selective and active cannabinoids with the desired effects.

  10. A preliminary three-dimensional geological framework model for Yucca Mountain

    International Nuclear Information System (INIS)

    Stirewalt, G.L.; Henderson, D.B.

    1995-01-01

    A preliminary three-dimensional geological framework model has been developed for the potential high-level radioactive waste disposal site at Yucca Mountain. The model is based on field data and was constructed using EarthVision (Version 2.0) software. It provides the basic geological framework in which variations in geological parameters and features in and adjacent to the repository block can be illustrated and analyzed. With further refinement and modification of the model through incorporation of additional data, it can be used by Nuclear Regulatory Commission (NRC) staff to determine whether representation of subsurface geological features in Department of Energy models is reasonable. Consequently, NRC staff will be able to use the model during pre-licensing and licensing phases to assess models for analyses of site suitability, design considerations, and repository performance

  11. Establishing a Cloud Computing Success Model for Hospitals in Taiwan

    OpenAIRE

    Lian, Jiunn-Woei

    2017-01-01

    The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality...

  12. Computational modeling of shallow geothermal systems

    CERN Document Server

    Al-Khoury, Rafid

    2011-01-01

    A Step-by-step Guide to Developing Innovative Computational Tools for Shallow Geothermal Systems Geothermal heat is a viable source of energy and its environmental impact in terms of CO2 emissions is significantly lower than conventional fossil fuels. Shallow geothermal systems are increasingly utilized for heating and cooling of buildings and greenhouses. However, their utilization is inconsistent with the enormous amount of energy available underneath the surface of the earth. Projects of this nature are not getting the public support they deserve because of the uncertainties associated with

  13. A preliminary concept of stochastic model of the tritium cycle in a fusion reactor

    International Nuclear Information System (INIS)

    Taczanowski, S.

    1988-01-01

    A preliminary concept of a stochastic model of tritium circulation in a fusion reactor was elaborated for the purpose of determining the necessary minimum and current tritium inventory under realistic circumstances. A random character of reactor operation was assumed, which is especially valid in the starting phase, when the reliability of the assembly is particularly low. A system of differential equations with random initial conditions describing the tritium cycle was solved for both the operation and break states of the reactor. The distribution of the moments and of the number of breaks in reactor operation is discussed, and possibilities for further development of the present model are indicated. 5 refs., 2 figs. (author)

  14. Frictional sliding in layered rock model: Preliminary experiments. Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Perry, K.E. Jr.; Buescher, B.J.; Anderson, D.; Epstein, J.S.

    1995-09-01

    An important aspect of determining the suitability of Yucca Mountain as a possible nuclear waste repository is understanding the mechanical behavior of jointed rock masses. To this end we have studied the frictional sliding between simulated rock joints in the laboratory using the technique of phase-shifting moire interferometry. The models were made from stacks of Lexan plates and contained a central hole to induce slip between the plates when the models were loaded in compression. These preliminary results confirm the feasibility of the approach and show a clear evolution of slip as a function of load.

  15. Computational Psychometrics for Modeling System Dynamics during Stressful Disasters

    Directory of Open Access Journals (Sweden)

    Pietro Cipresso

    2017-08-01

    Full Text Available Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.

  16. Oil Spill Detection and Modelling: Preliminary Results for the Cercal Accident

    Science.gov (United States)

    da Costa, R. T.; Azevedo, A.; da Silva, J. C. B.; Oliveira, A.

    2013-03-01

    two-dimensional surface plume transport model VOILS [1] with the oil spreading formulation enabled. The remaining oil weathering processes (evaporation, emulsification, dispersion and dissolution in the water column) and shoreline retention were disregarded. The computational structure of the model is based on Eulerian-Lagrangian formulations, horizontal unstructured mesh discretization and it is soft-coupled with the tri-dimensional hydrodynamic model SELFE - Semi-Implicit Eulerian Lagrangian Finite Element [15] that uses hybrid sigma-Z coordinates in the vertical. The preliminary results of this hindcast simulation for the Cercal oil spill are presented and compared with available satellite SAR images. The forcings used play an important role in the final results. During the late stage spreading phases of the oil, about one month after the spill, the Douro River outflow is best seen in the SAR images. The morphology of the river outflow is discussed according to traditional coastal dynamics, and compared with model results. In addition to several interesting physical features that were identified, we report on the generation of Internal Solitary Waves (ISW) in the vicinity of the Douro River Plume (DRP). It is well known that trains of short-period internal waves can be generated by river plumes (such as the Columbia River). The internal structure of the observed internal waves (elevation waves or mode-2 versus mode-1 internal waves) is discussed based on the SAR signatures and available stratification. The present work has been conducted under an FCT - Fundação para a Ciência e a Tecnologia / MCTES - Ministério da Ciência, Tecnologia e Ensino Superior (PIDDAC - Programa de Investimentos e Despesas de Desenvolvimento da Administração Central) Portuguese funded project entitled PAC:MAN Pollution accidents in coastal areas: a Risk management system (PTDC/AACAMB/113469/2008).

  17. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil, with ~6000 km² of wavy relief and heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example in the study area show the local geoid model computed by the GRAVTool package using 1377 terrestrial gravity data, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ±0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ±0.073 m) of 21 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
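
    The RCR recipe summarized above decomposes the geoid into long-, medium-, and short-wavelength parts. A schematic sketch of the three steps follows, assuming hypothetical array inputs and a caller-supplied Stokes integration routine (GRAVTool's internals are not given in the abstract).

    ```python
    import numpy as np

    def remove_compute_restore(free_air, ggm_anom, rtm_anom,
                               ggm_geoid, rtm_geoid, stokes):
        """Schematic Remove-Compute-Restore geoid computation.

        Remove: subtract the global geopotential model (long wavelengths)
        and residual-terrain (short wavelengths) gravity signals.
        Compute: turn residual anomalies into a residual geoid via a
        caller-supplied Stokes integration routine.
        Restore: add back the model and terrain geoid contributions.
        """
        residual_anom = free_air - ggm_anom - rtm_anom
        return ggm_geoid + rtm_geoid + stokes(residual_anom)

    # Toy demo: a placeholder linear "Stokes" operator on made-up anomalies
    g = np.array([12.3, 15.1, 9.8])                        # mGal
    n = remove_compute_restore(g, 0.8 * g, 0.05 * g,
                               ggm_geoid=np.array([-15.2, -15.0, -14.8]),
                               rtm_geoid=np.array([0.02, 0.03, 0.01]),
                               stokes=lambda r: 0.01 * r)  # not a real kernel
    print(n)  # geoid heights (m), illustrative only
    ```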

  18. Development and preliminary user testing of the DCIDA (Dynamic computer interactive decision application) for 'nudging' patients towards high quality decisions.

    Science.gov (United States)

    Bansback, Nick; Li, Linda C; Lynd, Larry; Bryan, Stirling

    2014-08-01

    Patient decision aids (PtDAs) are developed to facilitate informed, value-based decisions about health. Research suggests that even when informed with the necessary evidence and information, cognitive errors can prevent patients from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA and subsequently used the DCIDA version. User testing assessed whether respondents found the software both usable, evaluated using a) eye-tracking, b) the system usability scale, and c) user verbal responses from a 'think aloud' protocol; and useful, evaluated using a) eye-tracking, b) whether preferences for options changed, and c) the decisional conflict scale. Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68-85), indicated a reasonable degree of usability for the DCIDA. The think-aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants, in that subjects focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature). Seven subjects (25%) changed their

  19. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper working. This is a very wide problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources, and queuing theory. The analytical results obtained are related to a practical experiment, showing interesting and valuable results.
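
    The abstract names a modified stretched exponential without giving its form. The baseline Kohlrausch function, which the modification presumably extends, can be fitted as below; the data are synthetic and the parameters hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def stretched_exponential(t, tau, beta):
        """Kohlrausch stretched exponential: exp(-(t/tau)**beta).

        beta = 1 recovers an ordinary exponential; beta < 1 yields the
        heavy tails associated with long-range dependence.
        """
        return np.exp(-(t / tau) ** beta)

    # Synthetic survival-like data (e.g., cache entry lifetimes), illustrative
    t = np.linspace(0.1, 50.0, 100)
    rng = np.random.default_rng(3)
    data = stretched_exponential(t, 6.0, 0.55) + rng.normal(0.0, 0.01, t.size)

    (tau, beta), _ = curve_fit(stretched_exponential, t, data, p0=(1.0, 1.0),
                               bounds=((1e-3, 1e-3), (np.inf, 2.0)))
    print(f"tau = {tau:.2f}, beta = {beta:.2f}")
    ```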

  20. The DEBOT Model, a New Global Barotropic Ocean Tidal Model: Test Computations and an Application in Related Geophysical Disciplines

    Science.gov (United States)

    Einspigel, D.; Sachl, L.; Martinec, Z.

    2014-12-01

    We present the DEBOT model, which is a new global barotropic ocean model. The DEBOT model is primarily designed for modelling ocean flow generated by the tidal attraction of the Moon and the Sun; however, it can be used for other ocean applications where a barotropic model is sufficient, for instance, tsunami wave propagation. The model has been thoroughly tested by several different methods: 1) a synthetic example involving tsunami-like propagation of an initial Gaussian depression, with testing of the conservation of integral invariants; 2) a benchmark study against another barotropic model, the LSGbt model; and 3) comparison of realistic simulations with data from tide gauge measurements around the world. The test computations prove the validity of the numerical code and demonstrate the ability of the DEBOT model to simulate realistic ocean tides. The DEBOT model will be principally applied in related geophysical disciplines, for instance, in investigations of the influence of the ocean tides on the geomagnetic field or the Earth's rotation. A module for modelling the secondary poloidal magnetic field generated by an ocean flow is already implemented in the DEBOT model and preliminary results will be presented. The future aim is to assimilate magnetic data provided by the Swarm satellite mission into the ocean flow model.

  1. Computational Modeling of Turbulent Spray Combustion

    NARCIS (Netherlands)

    Ma, L.

    2016-01-01

    The objective of the research presented in this thesis is the development and validation of predictive models or modeling approaches for liquid fuel combustion (spray combustion) in hot-diluted environments, known as flameless combustion or MILD combustion. The goal is to combine good physical insight,

  2. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    -based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability...

  3. Computing broadband accelerograms using kinematic rupture modeling

    International Nuclear Information System (INIS)

    Ruiz Paredes, J.A.

    2007-05-01

    In order to make broadband kinematic rupture modeling more realistic with respect to dynamic modeling, physical constraints are added to the rupture parameters. To improve the modeling of the slip velocity function (SVF), an evolution of the k^-2 source model is proposed, which consists in decomposing the slip as a sum of sub-events by bands of k. This model yields SVFs close to the solution proposed by Kostrov for a crack, while preserving the spectral characteristics of the radiated wave field, i.e., an omega-squared model with spectral amplitudes at high frequency scaled to the directivity coefficient Cd. To better control the directivity effects, a composite source description is combined with a scaling law defining the extent of the nucleation area for each sub-event. The resulting model makes it possible to reduce the apparent directivity coefficient to a fraction of Cd, as well as to reproduce the standard deviation of the new empirical attenuation relationships proposed for Japan. To make source models more realistic, a variable rupture velocity in agreement with the physics of the rupture must be considered. The approach followed, based on an analytical relation between fracture energy, slip and rupture velocity, leads to higher values of the peak ground acceleration in the vicinity of the fault. Finally, to better account for the interaction of the wave field with the geological medium, a semi-empirical methodology is developed combining a composite source model with empirical Green functions, and is applied to the Yamaguchi Mw 5.9 earthquake. The modeled synthetics reproduce satisfactorily the observed main characteristics of ground motions. (author)
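
    The k^-2 spectral decay that this source description preserves can be illustrated by generating a random slip field whose amplitude spectrum is flat below a corner wavenumber and falls off as k^-2 above it (in the spirit of the Herrero-Bernard construction); grid dimensions and parameters below are hypothetical.

    ```python
    import numpy as np

    def k_squared_slip(nx, nz, dx, mean_slip, corner_k, seed=0):
        """Random slip field with a k^-2 spectral falloff beyond a corner
        wavenumber; illustrative sketch only, not the paper's exact recipe.
        """
        kx = np.fft.fftfreq(nx, dx)
        kz = np.fft.fftfreq(nz, dx)
        k = np.hypot(*np.meshgrid(kx, kz, indexing="ij"))
        amp = 1.0 / (1.0 + (k / corner_k) ** 2)          # flat, then k^-2
        phase = np.random.default_rng(seed).uniform(0.0, 2.0 * np.pi, k.shape)
        slip = np.fft.ifft2(amp * np.exp(1j * phase)).real
        slip -= slip.min()                               # keep slip >= 0
        return slip * mean_slip / slip.mean()            # impose mean slip

    # Hypothetical 32 km x 16 km fault, 0.5 km grid, 1.2 m mean slip
    slip = k_squared_slip(nx=64, nz=32, dx=0.5, mean_slip=1.2, corner_k=0.05)
    print(slip.shape, f"mean slip = {slip.mean():.2f} m")
    ```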

  4. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  5. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  6. The European computer model for optronic system performance prediction (ECOMOS)

    NARCIS (Netherlands)

    Kessler, S.; Bijl, P.; Labarre, L.; Repasi, E.; Wittenstein, W.; Bürsing, H.

    2017-01-01

    ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The

  7. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have been of great applicability in the study of the biological sciences and other exact science fields like agriculture, mathematics, computer science and the like. In this write-up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  8. Methods for teaching geometric modelling and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Rotkov, S.I.; Faitel'son, Yu. Ts.

    1992-05-01

    This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.

  9. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance

  10. Fish-Friendly Hydropower Turbine Development & Deployment: Alden Turbine Preliminary Engineering and Model Testing

    Energy Technology Data Exchange (ETDEWEB)

    Foust, J. [Voith Hydro, Inc., York, PA (USA); Hecker, G. [Alden Research Laboratory, Inc., Holden, MA (USA); Li, S. [Alden Research Laboratory, Inc., Holden, MA (USA); Allen, G. [Alden Research Laboratory, Inc., Holden, MA (USA)

    2011-10-01

    The Alden turbine was developed through the U.S. Department of Energy's (DOE's) former Advanced Hydro Turbine Systems Program (1994-2006) and, more recently, through the Electric Power Research Institute (EPRI) and the DOE's Wind & Water Power Program. The primary goal of the engineering study described here was to provide a commercially competitive turbine design that would yield fish passage survival rates comparable to or better than the survival rates of bypassing or spilling flow. Although the turbine design was performed for site conditions corresponding to 92 ft (28 m) net head and a discharge of 1500 cfs (42.5 cms), the design can be modified for additional sites with differing operating conditions. During the turbine development, design modifications were identified for the spiral case, distributor (stay vanes and wicket gates), runner, and draft tube to improve turbine performance while maintaining features for high fish passage survival. Computational results for pressure change rates and shear within the runner passage were similar in the original and final turbine geometries, while predicted minimum pressures were higher for the final turbine. The final turbine geometry and resulting flow environments are expected to further enhance the fish passage characteristics of the turbine. Computational results for the final design were shown to improve turbine efficiencies by over 6% at the selected operating condition when compared to the original concept. Prior to the release of the hydraulic components for model fabrication, finite element analysis calculations were conducted for the stay vanes, wicket gates, and runner to verify that structural design criteria for stress and deflections were met. A physical model of the turbine was manufactured and tested with data collected for power and efficiency, cavitation limits, runaway speed, axial and radial thrust, pressure pulsations, and wicket gate torque. All parameters were observed to fall

  11. Computer-aided polymer design using group contribution plus property models

    DEFF Research Database (Denmark)

    Satyanarayana, Kavitha Chelakara; Abildskov, Jens; Gani, Rafiqul

    2009-01-01

    The preliminary step for polymer product design is to identify the basic repeat unit structure of the polymer that matches the target properties. Computer-aided molecular design (CAMD) approaches can be applied for generating the polymer repeat unit structures that match the required constraints. Polymer repeat unit property prediction models are required to calculate the properties of the generated repeat units. A systematic framework incorporating recently developed group contribution plus (GC+) models and an extended CAMD technique to include design of polymer repeat units is highlighted in this paper. The advantage of a GC+ model in CAMD applications is that a very large number of polymer structures can be considered even though some of the group parameters may not be available. A number of case studies involving different polymer design problems have been solved through the developed …
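
    The core of a first-order group-contribution property model is just a sum of group increments over the repeat unit. The sketch below shows the arithmetic with placeholder increments; the published GC+ parameter tables and any higher-order corrections are not reproduced here.

    ```python
    # First-order group-contribution estimate: a repeat-unit property is a
    # (possibly transformed) sum of group increments. The increments below
    # are placeholders, NOT the published GC+ parameter tables.
    INCREMENTS = {"CH2": 2.0, "CH(CH3)": 12.0, "C6H4": 55.0}

    def estimate_property(groups):
        """Sum group increments for a repeat unit given as {group: count}."""
        return sum(INCREMENTS[g] * n for g, n in groups.items())

    # Hypothetical repeat unit: two CH2 groups plus one aromatic ring
    print(estimate_property({"CH2": 2, "C6H4": 1}))  # -> 59.0, arbitrary units
    ```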

  12. A preliminary assessment of selected atmospheric dispersion, food-chain transport, and dose-to-man computer codes for use by the DOE Office of Civilian Radioactive Waste Management

    International Nuclear Information System (INIS)

    Riggle, K.J.; Roddy, J.W.

    1989-02-01

    This work is part of the ongoing Systems Modeling Program at Oak Ridge National Laboratory, which is assisting the DOE Office of Civilian Radioactive Waste Management in selecting appropriate computer codes for the process of licensing a high-level radioactive waste repository or a monitored retrievable storage facility. A preliminary study of codes for predicting dose to man following airborne releases of radionuclides is described. These codes use models for estimating atmospheric dispersion of activity and deposition onto the ground surface, exposures via external irradiation, inhalation of airborne activity, and ingestion following transport through terrestrial food chains, and the dose per unit exposure for each exposure mode. A set of criteria is given for use in choosing codes for further examination. From a list of over 150 computer codes, five were selected for review

  13. Modelling lung cancer due to radon and smoking in WISMUT miners: Preliminary results

    International Nuclear Information System (INIS)

    Bijwaard, H.; Dekkers, F.; Van Dillen, T.

    2011-01-01

    A mechanistic two-stage carcinogenesis model has been applied to model lung-cancer mortality in the largest uranium-miner cohort available. Models with and without smoking action both fit the data well. As smoking information is largely missing from the cohort data, a method has been devised to project this information from a case-control study onto the cohort. Model calculations using 256 projections show that the method works well. Preliminary results show that if an explicit smoking action is absent in the model, this is compensated by the values of the baseline parameters. This indicates that in earlier studies performed without smoking information, the results obtained for the radiation parameters are still valid. More importantly, the inclusion of smoking-related parameters shows that these mainly influence the later stages of lung-cancer development. (authors)

  14. Alternative conceptual models and codes for unsaturated flow in fractured tuff: Preliminary assessments for GWTT-95

    International Nuclear Information System (INIS)

    Ho, C.K.; Altman, S.J.; Arnold, B.W.

    1995-09-01

    Groundwater travel time (GWTT) calculations will play an important role in addressing site-suitability criteria for the potential high-level nuclear waste repository at Yucca Mountain, Nevada. In support of these calculations, preliminary assessments of the candidate codes and models are presented in this report. A series of benchmark studies has been designed to address important aspects of modeling flow through fractured media representative of flow at Yucca Mountain. Three codes (DUAL, FEHMN, and TOUGH2) are compared in these benchmark studies. DUAL is a single-phase, isothermal, two-dimensional flow simulator based on the dual mixed finite element method. FEHMN is a nonisothermal, multiphase, multidimensional simulator based primarily on the finite element method. TOUGH2 is a nonisothermal, multiphase, multidimensional simulator based on the integral finite difference method. Alternative conceptual models of fracture flow, consisting of the equivalent continuum model (ECM) and the dual permeability (DK) model, are used in the different codes.

  15. Transport modeling and advanced computer techniques

    International Nuclear Information System (INIS)

    Wiley, J.C.; Ross, D.W.; Miner, W.H. Jr.

    1988-11-01

    A workshop was held at the University of Texas in June 1988 to consider the current state of transport codes and whether improved user interfaces would make the codes more usable and accessible to the fusion community. Also considered was the possibility that a software standard could be devised to ease the exchange of routines between groups. It was noted that two of the major obstacles to exchanging routines now are the variety of geometrical representations and choices of units. While the workshop formulated no standards, it was generally agreed that good software engineering would aid in the exchange of routines, and that a continued exchange of ideas between groups would be worthwhile. It seems that before we begin to discuss software standards we should review the current state of computer technology --- both hardware and software --- to see what influence recent advances might have on our software goals. This is done in this paper.

  16. Predictive Models and Computational Toxicology (II IBAMTOX)

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  17. Computational Models of Human Organizational Dynamics

    National Research Council Canada - National Science Library

    Courand, Gregg

    2000-01-01

    .... This is the final report for our Phase II SBIR project, conducted over three years. Our research program has contributed theory, methodology, and technology for organizational modeling and analysis...

  18. Computational Model for Spacecraft/Habitat Volume

    Data.gov (United States)

    National Aeronautics and Space Administration — Please note that funding to Dr. Simon Hsiang, a critical co-investigator for the development of the Spacecraft Optimization Layout and Volume (SOLV) model, was...

  19. Computational modeling and engineering in pediatric and congenital heart disease.

    Science.gov (United States)

    Marsden, Alison L; Feinstein, Jeffrey A

    2015-10-01

    Recent methodological advances in computational simulations are enabling increasingly realistic simulations of hemodynamics and physiology, driving increased clinical utility. We review recent developments in the use of computational simulations in pediatric and congenital heart disease, describe the clinical impact in modeling in single-ventricle patients, and provide an overview of emerging areas. Multiscale modeling combining patient-specific hemodynamics with reduced order (i.e., mathematically and computationally simplified) circulatory models has become the de-facto standard for modeling local hemodynamics and 'global' circulatory physiology. We review recent advances that have enabled faster solutions, discuss new methods (e.g., fluid structure interaction and uncertainty quantification), which lend realism both computationally and clinically to results, highlight novel computationally derived surgical methods for single-ventricle patients, and discuss areas in which modeling has begun to exert its influence including Kawasaki disease, fetal circulation, tetralogy of Fallot (and pulmonary tree), and circulatory support. Computational modeling is emerging as a crucial tool for clinical decision-making and evaluation of novel surgical methods and interventions in pediatric cardiology and beyond. Continued development of modeling methods, with an eye towards clinical needs, will enable clinical adoption in a wide range of pediatric and congenital heart diseases.

  20. Computational numerical modelling of plasma focus

    International Nuclear Information System (INIS)

    Brollo, Fabricio

    2005-01-01

    Several models for calculating the dynamics of the plasma focus have been developed. All of them start from the same physical principle: the current sheet runs down the anode length, ionizing and collecting the gas that it finds in its way. This is known as the snow-plow model. For the pinch compression, an MHD model is proposed: the plasma is treated as a fluid, in particular as a highly ionized gas. However, there are not many models that, taking into account thermal equilibrium inside the plasma, make approximate calculations of the maximum temperatures reached in the pinch. Moreover, there are no models that use those temperatures to estimate the fusion neutron yield for the deuterium or deuterium-tritium gas-filled cases. In the PLADEMA network (Dense Magnetized Plasmas), a code was developed with the objective of describing the plasma focus dynamics at a conceptual engineering stage. The code calculates the principal variables (currents, time to focus, etc.) and estimates the neutron yield in deuterium-filled plasma focus devices. The code's experimental validation, in its axial and radial stages, was very successful. However, it was accepted that the compression stage should be reformulated, to find a solution for the large variation of a parameter related to the velocity profiles of the particles trapped inside the pinch. The objectives of this work can be stated as follows: check the hypotheses of the compression model and develop a new model; implement the new model in the code; and compare results against experimental data from plasma focus devices around the world. [es]
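
    The snow-plow picture described above lends itself to a compact numerical sketch: the axial momentum balance d/dt(m(z)v) = F is integrated for a current sheath that sweeps up the fill gas between coaxial electrodes. Geometry, fill density and the current waveform below are assumed toy values, not those of any particular device or of the PLADEMA code.

    ```python
    # Minimal sketch of the axial "snow-plow" phase of a plasma focus: the sheath
    # sweeps up the gas it encounters, so d/dt(m(z)*v) equals the magnetic force.
    import numpy as np
    from scipy.integrate import solve_ivp

    mu0 = 4e-7 * np.pi
    a, b = 0.01, 0.03          # anode/cathode radii (m), assumed
    rho = 4e-4                 # fill gas mass density (kg/m^3), assumed
    I0, T4 = 2e5, 3e-6         # peak current (A) and quarter period (s), assumed

    def rhs(t, y):
        z, v = y
        I = I0 * np.sin(0.5 * np.pi * t / T4)           # rising current waveform
        dmdz = rho * np.pi * (b**2 - a**2)              # mass swept up per meter
        m = dmdz * max(z, 1e-6)                         # total swept-up mass
        F = (mu0 * I**2 / (4 * np.pi)) * np.log(b / a)  # axial J x B force
        # m dv/dt = F - v * dm/dt  (momentum given to newly swept gas)
        return [v, (F - v**2 * dmdz) / m]

    sol = solve_ivp(rhs, (0, T4), [5e-3, 0.0], max_step=1e-8)
    print(f"Sheath reaches z = {sol.y[0, -1]*100:.1f} cm "
          f"at v = {sol.y[1, -1]/1e3:.0f} km/s")
    ```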

  1. Computer models and output, Spartan REM: Appendix B

    Science.gov (United States)

    Marlowe, D. S.; West, E. J.

    1984-01-01

    A computer model of the Spartan Release Engagement Mechanism (REM) is presented in a series of numerical charts and engineering drawings. A crack growth analysis code is used to predict the fracture mechanics of critical components.

  2. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  3. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D

    2012-01-01

    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  4. Model and Computing Experiment for Research and Aerosols Usage Management

    Directory of Open Access Journals (Sweden)

    Daler K. Sharipov

    2012-09-01

    Full Text Available The article presents a mathematical model for the study and management of aerosols released into the atmosphere, together with a numerical algorithm implemented as hardware and software systems for conducting computing experiments.

  5. Computational Models for Nonlinear Aeroelastic Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  6. Computational modelling of the impact of AIDS on business.

    Science.gov (United States)

    Matthews, Alan P

    2007-07-01

    An overview of computational modelling of the impact of AIDS on business in South Africa, with a detailed description of the AIDS Projection Model (APM) for companies, developed by the author, and suggestions for further work. Computational modelling of the impact of AIDS on business in South Africa requires modelling of the epidemic as a whole, and of its impact on a company. This paper gives an overview of epidemiological modelling, with an introduction to the Actuarial Society of South Africa (ASSA) model, the most widely used such model for South Africa. The APM produces projections of HIV prevalence, new infections, and AIDS mortality for a company, based on the anonymous HIV testing of company employees and projections from the ASSA model. A smoothed statistical model of the prevalence test data is computed, and the ASSA model projection for each category of employees is then adjusted so that it matches the measured prevalence in the year of testing. FURTHER WORK: Further techniques that could be developed are microsimulation (representing individuals in the computer), scenario planning for testing strategies, and models for the business environment, such as models of entire sectors, and mapping of HIV prevalence in time and space, based on workplace and community data.

  7. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
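
    The two-step procedure described above is easy to prototype: fit the focal coefficient under every subset of candidate controls and examine the distribution of estimates. The sketch below does this with ordinary least squares on synthetic data; it illustrates the idea only and is not the authors' software.

    ```python
    # Minimal sketch of computational multimodel analysis: estimate the focal
    # coefficient under every subset of control variables and summarize the
    # distribution of estimates. Data are synthetic, values are assumed.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    controls = rng.normal(size=(n, 4))                 # candidate control variables
    x = rng.normal(size=n) + 0.5 * controls[:, 0]      # focal regressor
    y = 1.0 + 2.0 * x + controls @ [0.8, -0.5, 0.0, 0.3] + rng.normal(size=n)

    estimates = []
    for k in range(5):
        for subset in itertools.combinations(range(4), k):
            X = np.column_stack([np.ones(n), x] + [controls[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            estimates.append(beta[1])                  # coefficient on x

    estimates = np.array(estimates)
    print(f"{len(estimates)} models: focal estimate ranges "
          f"{estimates.min():.2f} to {estimates.max():.2f}, "
          f"mean {estimates.mean():.2f}")
    ```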

  8. Computational quench model applicable to the SMES/CICC

    Science.gov (United States)

    Luongo, Cesar A.; Chang, Chih-Lien; Partain, Kenneth D.

    1994-07-01

    A computational quench model accounting for the hydraulic peculiarities of the 200 kA SMES cable-in-conduit conductor has been developed. The model is presented and used to simulate the quench on the SMES-ETM. Conclusions are drawn concerning quench detection and protection. A plan for quench model validation is presented.

  9. Petri Net Modeling of Computer Virus Life Cycle | Ikekonwu ...

    African Journals Online (AJOL)

    Virus life cycle, which refers to the stages of development of a computer virus, is presented as a suitable area for the application of Petri nets. Petri nets, a powerful modeling tool in the field of dynamic system analysis, are applied to model the virus life cycle. Simulation of the derived model is also presented. The intention of ...
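
    A Petri net is a bipartite graph of places (holding tokens) and transitions (which fire when their input places hold enough tokens). The sketch below simulates a toy net with invented dormant/propagating/triggered stages; the places, transitions and firing rule are illustrative assumptions, not the net derived in the paper.

    ```python
    # Minimal sketch of a Petri net simulation for a virus life cycle.
    # Places, transitions and stages below are illustrative assumptions.
    import random

    places = {"dormant": 1, "propagating": 0, "triggered": 0, "removed": 0}
    # transition name: (input places/tokens, output places/tokens)
    transitions = {
        "activate":  ({"dormant": 1},     {"propagating": 1}),
        "replicate": ({"propagating": 1}, {"propagating": 2}),
        "payload":   ({"propagating": 1}, {"triggered": 1}),
        "disinfect": ({"triggered": 1},   {"removed": 1}),
    }

    def enabled(t):
        ins, _ = transitions[t]
        return all(places[p] >= n for p, n in ins.items())

    random.seed(1)
    for step in range(20):
        candidates = [t for t in transitions if enabled(t)]
        if not candidates:
            break
        t = random.choice(candidates)      # random firing among enabled transitions
        ins, outs = transitions[t]
        for p, n in ins.items():
            places[p] -= n
        for p, n in outs.items():
            places[p] += n
    print(places)
    ```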

  10. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

    This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  11. Computational model of miniature pulsating heat pipes

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, Mario J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Givler, Richard C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-01-01

    The modeling work described herein represents Sandia National Laboratories' (SNL) portion of a collaborative three-year project with Northrop Grumman Electronic Systems (NGES) and the University of Missouri to develop an advanced thermal ground plane (TGP): a device of planar configuration that delivers heat from a source to an ambient environment with high efficiency. Work at all three institutions was funded by DARPA/MTO; Sandia was funded under DARPA/MTO project number 015070924. This is the final report on this project for SNL. This report presents a numerical model of a pulsating heat pipe, a device employing a two-phase (liquid and its vapor) working fluid confined in a closed-loop channel etched/milled into a serpentine configuration in a solid metal plate. The device delivers heat from an evaporator (hot zone) to a condenser (cold zone). This new model includes key physical processes important to the operation of flat-plate pulsating heat pipes (e.g. dynamic bubble nucleation, evaporation and condensation), together with conjugate heat transfer with the solid portion of the device. The model qualitatively and quantitatively predicts performance characteristics and metrics, as demonstrated by favorable comparisons with experimental results on similar configurations. Application of the model also corroborated many previous performance observations with respect to key parameters such as heat load, fill ratio and orientation.

  12. Dual-energy bone removal computed tomography (BRCT): preliminary report of efficacy of acute intracranial hemorrhage detection.

    Science.gov (United States)

    Naruto, Norihito; Tannai, Hidenori; Nishikawa, Kazuma; Yamagishi, Kentaro; Hashimoto, Masahiko; Kawabe, Hideto; Kamisaki, Yuichi; Sumiya, Hisashi; Kuroda, Satoshi; Noguchi, Kyo

    2018-02-01

    One of the major applications of dual-energy computed tomography (DECT) is automated bone removal (BR). We hypothesized that the visualization of acute intracranial hemorrhage could be improved on BRCT by removing bone, as it is the highest-density tissue in the head. This preliminary study evaluated the efficacy of a DE BR algorithm for the head CT of trauma patients. Sixteen patients with acute intracranial hemorrhage within 1 day after head trauma were enrolled in this study. All CT examinations were performed on a dual-source dual-energy CT scanner. BRCT images were generated using the Bone Removal Application. Simulated standard CT and BRCT images were visually reviewed in terms of the detectability (presence or absence) of acute hemorrhagic lesions. DECT depicted 28 epidural/subdural hemorrhages, 17 contusional hemorrhages, and 7 subarachnoid hemorrhages. In detecting epidural/subdural hemorrhage, BRCT [28/28 (100%)] was significantly superior to simulated standard CT [17/28 (61%)] (p = .001). In detecting contusional hemorrhage, BRCT [17/17 (100%)] was also significantly superior to simulated standard CT [11/17 (65%)] (p = .0092). BRCT was superior to simulated standard CT in detecting acute intracranial hemorrhage. BRCT could improve the detection of small intracranial hemorrhages, particularly those adjacent to bone, by removing bone that can interfere with the visualization of small acute hemorrhages. In an emergency such as head trauma, BRCT can be used as supporting imaging in combination with simulated standard CT and bone-scale CT, although BRCT cannot replace simulated standard CT.

  13. Computer Models and Automata Theory in Biology and Medicine

    CERN Document Server

    Baianu, I C

    2004-01-01

    The applications of computers to biological and biomedical problem solving go back to the very beginnings of computer science, automata theory [1], and mathematical biology [2]. With the advent of more versatile and powerful computers, biological and biomedical applications of computers have proliferated so rapidly that it would be virtually impossible to compile a comprehensive review of all developments in this field. Limitations of computer simulations in biology have also come under close scrutiny, and claims have been made that biological systems have limited information processing power [3]. Such general conjectures do not, however, deter biologists and biomedical researchers from developing new computer applications in biology and medicine. Microprocessors are being widely employed in biological laboratories both for automatic data acquisition/processing and modeling; one particular area, which is of great biomedical interest, involves fast digital image processing and is already established for rout...

  14. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  15. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

    been explored in this thesis by considering them as epidemic-like processes. A mathematical model has been developed based on differential equations, which studies the dynamics of the issues from the very beginning until the issues cease. This study extends classical models of the spread of epidemics...... to describe the phenomenon of contagious public outrage, which eventually leads to the spread of violence following a disclosure of some unpopular political decisions and/or activity. The results shed new light on terror activity and provide some hints on how to curb the spreading of violence within...

  16. Model Infrastruktur dan Manajemen Platform Server Berbasis Cloud Computing

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Full Text Available Cloud computing is a new technology that is still growing very rapidly. This technology makes the Internet the main medium for the remote management of data and applications. Cloud computing allows users to run an application without having to think about infrastructure and platforms. Other technical aspects, such as memory, storage, and backup and restore, can be handled very easily. This research models the infrastructure and management of the computing platform in the computer network of the Faculty of Engineering, University of Jenderal Soedirman. The first stage of the research is a literature study to identify implementation models used in previous research. The results are then combined with a new approach to the existing resources and implemented directly on the existing server network. The results show that the implemented cloud computing technology is able to replace the existing platform network.

  17. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  18. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation an ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], which models two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story and retrieved cases that had been learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of the MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption - different similarities are involved at the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.

  19. Preliminary evaluation of the Community Multiscale Air Quality model for 2002 over the Southeastern United States.

    Science.gov (United States)

    Morris, Ralph E; McNally, Dennis E; Tesche, Thomas W; Tonnesen, Gail; Boylan, James W; Brewer, Patricia

    2005-11-01

    The Visibility Improvement State and Tribal Association of the Southeast (VISTAS) is one of five Regional Planning Organizations charged with the management of haze, visibility, and other regional air quality issues in the United States. The VISTAS Phase I work effort modeled three episodes (January 2002, July 1999, and July 2001) to identify the optimal model configuration(s) to be used for the 2002 annual modeling in Phase II. Using model configurations recommended in the Phase I analysis, 2002 annual meteorological (Mesoscale Meteorological Model [MM5]), emissions (Sparse Matrix Operator Kernel Emissions [SMOKE]), and air quality (Community Multiscale Air Quality [CMAQ]) simulations were performed on a 36-km grid covering the continental United States and a 12-km grid covering the Eastern United States. Model estimates were then compared against observations. This paper presents the results of the preliminary CMAQ model performance evaluation for the initial 2002 annual base case simulation. Model performance is presented for the Eastern United States using speciated fine particle concentration and wet deposition measurements from several monitoring networks. Initial results indicate fairly good performance for sulfate, with fractional bias values generally within +/-20%. Nitrate is overestimated in the winter by approximately +50% and underestimated in the summer by more than -100%. Organic carbon exhibits a large summer underestimation bias of approximately -100%, with much improved performance in the winter, where the bias is near zero. Performance for elemental carbon is reasonable, with fractional bias values within +/-40%. Other fine particulate matter (soil) and coarse particulate matter exhibit large (80-150%) overestimation in the winter but improved performance in the summer. The preliminary 2002 CMAQ runs identified several areas of enhancement to improve model performance, including revised temporal allocation factors for ammonia emissions to improve

  20. Global Stability of an Epidemic Model of Computer Virus

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2014-01-01

    Full Text Available With the rapid popularization of the Internet, computers can enter or leave the Internet increasingly frequently. In fact, no antivirus software can detect and remove all sorts of computer viruses. This implies that viruses will persist on the Internet. To better understand the spread of computer viruses in these situations, a new propagation model is established and analyzed. The unique equilibrium of the model is globally asymptotically stable, in accordance with reality. A parameter analysis of the equilibrium is also conducted.
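
    The sketch below integrates a generic SIS-style virus model with computers joining and leaving the network, the kind of system for which such equilibrium results are proved. The equations and rates are illustrative assumptions, not the paper's exact model.

    ```python
    # Minimal sketch of a computer-virus propagation model with node turnover,
    # integrated to its (here globally stable) endemic equilibrium. Assumed values.
    import numpy as np
    from scipy.integrate import solve_ivp

    mu = 0.1      # rate of computers entering/leaving the Internet (assumed)
    beta = 0.5    # infection rate (assumed)
    gamma = 0.2   # cure rate (assumed)

    def rhs(t, y):
        S, I = y
        return [mu - beta * S * I + gamma * I - mu * S,
                beta * S * I - gamma * I - mu * I]

    sol = solve_ivp(rhs, (0, 200), [0.99, 0.01], rtol=1e-8)
    S_eq, I_eq = sol.y[:, -1]
    R0 = beta / (gamma + mu)
    print(f"R0 = {R0:.2f}, equilibrium infected fraction = {I_eq:.3f} "
          f"(theory: {1 - 1/R0:.3f})")
    ```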

  1. Predictive models applied to groundwater level forecasting: a preliminary experience on the alluvial aquifer of the Magra River (Italy).

    Science.gov (United States)

    Brozzo, Gianpiero; Doveri, Marco; Lelli, Matteo; Scozzari, Andrea

    2010-05-01

    Computer-based decision support systems are attracting growing interest from water management authorities and water distribution companies. This work discusses a preliminary experience in the application of computational intelligence in a hydrological modeling framework for the study area of the alluvial aquifer of the Magra River (Italy). Two sites in the studied area, corresponding to two distinct groups of wells (Battifollo and Fornola), are managed by the local drinking water distribution company (ACAM Acque), which serves the area of La Spezia, on the Ligurian coast. Battifollo has 9 wells with a total extraction rate of about 240 liters per second, while Fornola has 44 wells with an extraction rate of about 900 liters per second. The objective of this work is to make use of time series coming from long-term monitoring activities in order to assess the trend of the groundwater level with respect to a set of environmental and exploitation parameters; this is accomplished by experimenting with a suitable model that is eligible to be used as a predictor. This activity starts from the modeling of the system behavior based on a set of input/output data, in order to characterize it without necessarily requiring prior knowledge of any deterministic mechanism (system identification). In this context, data series collected by continuous hydrological monitoring instrumentation installed at the studied sites, together with meteorological and water extraction data, have been analyzed in order to assess the applicability and performance of a predictive model of the groundwater level. A mixed approach (both data-driven and process-based) has been tested on the whole dataset relating to the last ten years of continuous monitoring activity. The system identification approach presented here is based on the integration of an adaptive technique based on Artificial Neural Networks (ANNs) and a blind deterministic identification approach. According to this concept, the behavior of
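
    A minimal sketch of the data-driven side of such system identification is given below: an autoregressive neural network maps recent rainfall, extraction rates and past levels to the next groundwater level. The data are synthetic stand-ins (the monitoring series are not included in this record), and the network architecture is an assumption for illustration.

    ```python
    # Minimal sketch of ANN-based system identification for groundwater level.
    # All data are synthetic; the "true" aquifer dynamics below are invented.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    n = 2000
    rain = rng.gamma(shape=0.5, scale=4.0, size=n)               # daily rainfall (mm)
    pumping = 900 + 50 * np.sin(2 * np.pi * np.arange(n) / 365)  # extraction (L/s)

    # Toy "true" aquifer: level integrates recharge minus pumping stress
    level = np.zeros(n)
    for t in range(1, n):
        level[t] = (0.95 * level[t - 1] + 0.02 * rain[t]
                    - 0.0005 * (pumping[t] - 900))
    level += rng.normal(scale=0.01, size=n)

    lag = 5   # previous 5 days of each series as input features
    X = np.array([np.concatenate([rain[t - lag:t], pumping[t - lag:t],
                                  level[t - lag:t]]) for t in range(lag, n)])
    y = level[lag:]

    split = int(0.8 * len(y))
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32,),
                                       max_iter=3000, random_state=0))
    model.fit(X[:split], y[:split])
    print(f"held-out R^2 = {model.score(X[split:], y[split:]):.3f}")
    ```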

  2. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  3. Computational Modeling Develops Ultra-Hard Steel

    Science.gov (United States)

    2007-01-01

    Glenn Research Center's Mechanical Components Branch developed a spiral bevel or face gear test rig for testing thermal behavior, surface fatigue, strain, vibration, and noise; a full-scale, 500-horsepower helicopter main-rotor transmission testing stand; a gear rig that allows fundamental studies of the dynamic behavior of gear systems and gear noise; and a high-speed helical gear test for analyzing thermal behavior for rotorcraft. The test rig provides accelerated fatigue life testing for standard spur gears at speeds of up to 10,000 rotations per minute. The test rig enables engineers to investigate the effects of materials, heat treat, shot peen, lubricants, and other factors on the gear's performance. QuesTek Innovations LLC, based in Evanston, Illinois, recently developed a carburized, martensitic gear steel with an ultra-hard case using its computational design methodology, but needed to verify surface fatigue, lifecycle performance, and overall reliability. The Battelle Memorial Institute introduced the company to researchers at Glenn's Mechanical Components Branch and facilitated a partnership allowing researchers at the NASA Center to conduct spur gear fatigue testing for the company. Testing revealed that QuesTek's gear steel outperforms the current state-of-the-art alloys used for aviation gears in contact fatigue by almost 300 percent. With the confidence and credibility provided by the NASA testing, QuesTek is commercializing two new steel alloys. Uses for this new class of steel are limitless in areas that demand exceptional strength for high throughput applications.

  4. A Computational Model of Spatial Development

    Science.gov (United States)

    Hiraki, Kazuo; Sashima, Akio; Phillips, Steven

    Psychological experiments on children's development of spatial knowledge suggest experience at self-locomotion with visual tracking as important factors. Yet, the mechanism underlying development is unknown. We propose a robot that learns to mentally track a target object (i.e., maintaining a representation of an object's position when outside the field-of-view) as a model for spatial development. Mental tracking is considered as prediction of an object's position given the previous environmental state and motor commands, and the current environment state resulting from movement. Following Jordan & Rumelhart's (1992) forward modeling architecture, the system consists of two components: an inverse model of sensory input to desired motor commands; and a forward model of motor commands to desired sensory input (goals). The robot was tested on the `three cups' paradigm (where children are required to select the cup containing the hidden object under various movement conditions). Consistent with child development, without the capacity for self-locomotion the robot's errors are egocentric (self-centered). When given the ability of self-locomotion, the robot responds allocentrically.

  5. Electricity load modelling using computational intelligence

    NARCIS (Netherlands)

    Ter Borg, R.W.

    2005-01-01

    As a consequence of the liberalisation of the electricity markets in Europe, market players have to continuously adapt their future supply to match their customers' demands. This poses the challenge of obtaining a predictive model that accurately describes electricity loads, which is the topic of this thesis.

  6. Computational Modeling of Fluorescence Loss in Photobleaching

    DEFF Research Database (Denmark)

    Hansen, Christian Valdemar; Schroll, Achim; Wüstner, Daniel

    2015-01-01

    sequences as reaction–diffusion systems on segmented cell images. The cell geometry is extracted from microscopy images using the Chan–Vese active contours algorithm [8]. The PDE model is subsequently solved by the automated Finite Element software package FEniCS [20]. The flexibility of FEniCS allows...

  7. Radiation enhanced conduction in insulators: computer modelling

    International Nuclear Information System (INIS)

    Fisher, A.J.

    1986-10-01

    The report describes the implementation of the Klaffky-Rose-Goland-Dienes [Phys. Rev. B 21, 3610, 1980] model of radiation-enhanced conduction and describes the codes used. The approach is demonstrated on the alumina data of Pells, Buckley, Hill and Murphy [AERE R.11715, 1985]. (author)

  8. GPSS and Modeling of Computer Communication Networks.

    Science.gov (United States)

    1982-04-01

    action block in a flowchart of the system being modeled. For instance, the process of capturing a facility for some length of time and then...because of the abundance of tutorial material available; whereas, far less complete tutorial material is available to beginners learning SIMSCRIPT

  9. Life system modeling and intelligent computing. Pt. I. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation

    2010-07-01

    This book is part I of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010 and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The 55 papers in this volume are organized in topical sections on intelligent modeling, monitoring, and control of complex nonlinear systems; autonomy-oriented computing and intelligent agents; advanced theory and methodology in fuzzy systems and soft computing; computational intelligence in utilization of clean and renewable energy resources; intelligent modeling, control and supervision for energy saving and pollution reduction; intelligent methods in developing vehicles, engines and equipments; computational methods and intelligence in modeling genetic and biochemical networks and regulation. (orig.)

  10. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in brain research and the current stage of development of software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science, and Internet of Things (IoT) devices. The proposed model of the human brain assumes a strong similarity between human intelligence and the thinking process in the game of chess. Tactical and strategic reasoning and the need to follow the rules of the game are all very similar to the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving, and eliminating adversaries. The brain pursues these goals; moreover, a being's movement, actions, and speech are sustained by the five vital senses and by equilibrium. The chess game strategy helps us understand the human brain better and replicate it more easily in the proposed 'Software and Hardware' (SAH) model.

  11. Computational Modeling of Supercritical and Transcritical Flows

    Science.gov (United States)

    2017-01-09

    Acentric factor. I. Introduction: Liquid rocket and gas turbine engines operate at high pressures. For gas turbines, the combustor pressure can be 60 − 100...equation of state for several reduced pressures. The model captures the high density at very low temperatures and the supercritical behavior at high reduced...physical meaning. The temperature range over which the three roots are present is bounded by TL on the low side and TH on the high side.

  12. Computational Modeling of Lipid Metabolism in Yeast

    Directory of Open Access Journals (Sweden)

    Vera Schützhold

    2016-09-01

    Full Text Available Lipid metabolism is essential for all major cell functions and has recently gained increasing attention in research and health studies. However, mathematical modeling by means of classical approaches such as stoichiometric networks and ordinary differential equation systems has not yet provided satisfactory insights, due to the complexity of lipid metabolism, which is characterized by many different species with only slight differences and by promiscuous multifunctional enzymes. Here, we present an object-oriented stochastic model approach as a way to cope with the complex lipid metabolic network. While all lipid species are treated as objects in the model, they can be modified by the respective converting reactions based on reaction rules, a hybrid method that integrates benefits of agent-based and classical stochastic simulation. This approach allows us to follow the dynamics of all lipid species with different fatty acids, different degrees of saturation and different headgroups over time, and to analyze the effect of parameter changes, potential mutations in the catalyzing enzymes, or the provision of different precursors. Applied to yeast metabolism during one cell cycle period, we could analyze the distribution of all lipids to the various membranes in a time-dependent manner. The presented approach makes it possible to efficiently treat the complexity of cellular lipid metabolism and to derive conclusions on the time- and location-dependent distributions of lipid species and their properties, such as saturation. It is widely applicable, easily extendable and will provide further insights into healthy and diseased states of cell metabolism.
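
    The rule-based, object-oriented idea is easy to see in miniature: each lipid is an object, and stochastic rules modify randomly chosen objects. The rules, rates and species below are invented for illustration and are far simpler than the yeast network in the paper.

    ```python
    # Minimal sketch of an object-oriented, rule-based stochastic lipid model.
    # Each lipid is an object; rules fire on randomly chosen objects. All rules
    # and rates are invented toy values.
    import random

    random.seed(0)
    # each lipid: headgroup, acyl-chain lengths, number of double bonds
    pool = [{"head": "PA", "chains": [16, 16], "unsat": 0} for _ in range(200)]

    def elongate(lipid):      # C16 -> C18 on a random chain
        i = random.randrange(len(lipid["chains"]))
        if lipid["chains"][i] < 18:
            lipid["chains"][i] += 2

    def desaturate(lipid):    # introduce one double bond
        if lipid["unsat"] < len(lipid["chains"]):
            lipid["unsat"] += 1

    def convert_head(lipid):  # PA -> PC headgroup swap
        if lipid["head"] == "PA":
            lipid["head"] = "PC"

    rules = [(elongate, 0.5), (desaturate, 0.3), (convert_head, 0.2)]

    for step in range(5000):
        lipid = random.choice(pool)
        rule = random.choices([r for r, _ in rules],
                              weights=[w for _, w in rules])[0]
        rule(lipid)

    pc = sum(1 for l in pool if l["head"] == "PC")
    print(f"PC fraction: {pc/len(pool):.2f}, mean unsaturation: "
          f"{sum(l['unsat'] for l in pool)/len(pool):.2f}")
    ```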

  13. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  14. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
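
    The article's examples use MATLAB and R; the sketch below shows the same embarrassingly parallel pattern in Python, farming independent replications of a toy Monte Carlo loss model out to a process pool. The loss model and its parameters are invented for illustration.

    ```python
    # Minimal sketch of embarrassingly parallel Monte Carlo: replications are
    # independent, so they are mapped onto a pool of worker processes.
    import multiprocessing as mp
    import random

    def one_replication(seed):
        """One Monte Carlo replication of a toy loss model (assumed rates)."""
        rng = random.Random(seed)
        events = rng.randint(0, 20)                   # number of loss events
        return sum(rng.lognormvariate(10, 1) for _ in range(events))

    if __name__ == "__main__":
        n_reps = 10_000
        with mp.Pool() as pool:                       # one worker per core
            losses = pool.map(one_replication, range(n_reps))
        losses.sort()
        print(f"mean loss = {sum(losses)/n_reps:,.0f}, "
              f"95th percentile = {losses[int(0.95*n_reps)]:,.0f}")
    ```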

  15. Airfoil computations using the gamma-Retheta model; Wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, Niels N.

    2009-05-15

    The present work addresses the validation of the implementation of the Menter, Langtry et al. gamma-Retheta correlation-based transition model [1, 2, 3] in the EllipSys2D code. First, the second-order accuracy of the code is verified using a grid refinement study for laminar, turbulent and transitional computations. Based on this, the error in the computations is estimated to be approximately one percent in the attached region. Following the verification of the implemented model, the model is applied to four airfoils, NACA64-018, NACA64-218, NACA64-418 and NACA64-618, and the results are compared to measurements [4] and to computations using the Xfoil code by Drela et al. [5]. In the linear pre-stall region good agreement is observed both for lift and drag, while differences from both the measurements and the Xfoil computations are observed in stalled conditions. (au)
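
    The grid-refinement verification mentioned above has a standard arithmetic core: with solutions on three systematically refined grids, the observed order of accuracy follows from the ratio of successive differences. The sketch below applies Richardson extrapolation to three invented lift values; they are placeholders, not EllipSys2D output.

    ```python
    # Minimal sketch of grid-refinement verification: with solutions f1 (fine),
    # f2 (medium), f3 (coarse) on grids refined by a constant ratio r, the
    # observed order is p = ln((f3-f2)/(f2-f1)) / ln(r). Values are invented.
    import math

    f1, f2, f3 = 1.08210, 1.08345, 1.08890   # fine, medium, coarse (assumed)
    r = 2.0                                   # grid refinement ratio

    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)
    f_exact = f1 + (f1 - f2) / (r**p - 1)     # Richardson-extrapolated estimate
    print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.5f}")
    print(f"estimated error on fine grid: {abs(f1 - f_exact)/f_exact:.2%}")
    ```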

  16. Assessment of weld thickness loss in offshore pipelines using computed radiography and computational modeling

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Oliveira, D.F.; Silva, A.X.; Lopes, R.T.; Marinho, C.; Camerini, C.S.

    2009-01-01

    In order to guarantee the structural integrity of oil plants it is crucial to monitor the amount of weld thickness loss in offshore pipelines. However, in spite of its relevance, this parameter is very difficult to determine, due to both the large diameter of most pipes and the complexity of the multi-variable system involved. In this study, computational modeling based on the Monte Carlo MCNPX code is combined with computed radiography to estimate the weld thickness loss in large-diameter offshore pipelines. Results show that computational modeling is a powerful tool for estimating intensity variations in radiographic images generated by weld thickness variations, and it can be combined with computed radiography to assess weld thickness loss in offshore and subsea pipelines.
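
    The physics linking radiographic intensity to thickness is, to first order, Beer-Lambert attenuation: I = I0*exp(-mu*t), so a local wall-thickness loss can be estimated from the intensity ratio between a defect region and sound metal. The sketch below shows this first-order estimate with assumed values; a Monte Carlo code such as MCNPX is needed when scatter and geometry matter, as in the paper.

    ```python
    # Minimal sketch: narrow-beam Beer-Lambert attenuation, I = I0*exp(-mu*t),
    # inverted to estimate wall-thickness loss. All numbers are assumed.
    import math

    mu = 0.55            # linear attenuation coefficient of steel (1/cm), assumed
    I_sound = 1200.0     # detector intensity over nominal wall (arbitrary units)
    I_defect = 1520.0    # intensity over the thinned region

    # Higher intensity => less metal in the beam path:
    loss_cm = math.log(I_defect / I_sound) / mu
    print(f"estimated wall loss: {loss_cm*10:.2f} mm")
    ```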

  17. Modeling Strategic Use of Human Computer Interfaces with Novel Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Laura Jane Mariano

    2015-07-01

    Full Text Available Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game’s functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic

  18. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences gained in three projects that used computer models, from both a participatory and a risk-management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes, and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models best serve the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  19. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations compare the effects of a moving load applied according to the recommendations of two standards, SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) is identical to the one modelled in the Tower environment. Important information for the selection of a computer application is that Bridge Designer 2016 (2nd Edition) is unable to treat the moving-load model under the national standard V600.

  20. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
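
    Of the four approaches listed, discrete-event simulation is perhaps the most natural fit for patient flow. The sketch below simulates a single-stage emergency department as an M/M/c queue with a fixed number of beds; the arrival rate, service rate and bed count are assumed toy values, not from the consensus conference.

    ```python
    # Minimal sketch of discrete-event simulation of an ED: Poisson arrivals,
    # exponential service, fixed number of beds. All rates are assumed values.
    import heapq
    import random

    random.seed(7)
    ARRIVAL_RATE, SERVICE_RATE, BEDS = 8.0, 1.5, 6   # per hour, assumed

    t, free_beds, queue, waits = 0.0, BEDS, [], []
    events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
    while events and t < 1000.0:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            heapq.heappush(events,
                           (t + random.expovariate(ARRIVAL_RATE), "arrival"))
            if free_beds > 0:
                free_beds -= 1
                waits.append(0.0)
                heapq.heappush(events,
                               (t + random.expovariate(SERVICE_RATE), "departure"))
            else:
                queue.append(t)                   # patient waits for a bed
        else:  # departure
            if queue:
                waits.append(t - queue.pop(0))    # next patient takes the bed
                heapq.heappush(events,
                               (t + random.expovariate(SERVICE_RATE), "departure"))
            else:
                free_beds += 1

    print(f"{len(waits)} patients, mean wait {60*sum(waits)/len(waits):.1f} min")
    ```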

  1. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  3. Preliminary sensitivity analyses of corrosion models for BWIP [Basalt Waste Isolation Project] container materials

    International Nuclear Information System (INIS)

    Anantatmula, R.P.

    1984-01-01

    A preliminary sensitivity analysis was performed for the corrosion models developed for Basalt Waste Isolation Project container materials. The models describe the corrosion behavior of the candidate container materials (low carbon steel and Fe9Cr1Mo), in the various environments expected in the vicinity of the waste package, by separate equations. The present sensitivity analysis yields an uncertainty in total uniform corrosion on the basis of assumed uncertainties in the parameters comprising the corrosion equations. Based on the sample scenario and the preliminary corrosion models, the uncertainties in total uniform corrosion of low carbon steel and Fe9Cr1Mo for the 1000 yr containment period are 20% and 15%, respectively. For containment periods ≥ 1000 yr, the uncertainty in corrosion during the post-closure aqueous periods controls the uncertainty in total uniform corrosion for both low carbon steel and Fe9Cr1Mo. The key parameters controlling the corrosion behavior of candidate container materials are temperature, radiation, groundwater species, etc. Tests are planned in the Basalt Waste Isolation Project containment materials test program to determine in detail the sensitivity of corrosion to these parameters. We also plan to expand the sensitivity analysis to include sensitivity coefficients and other parameters in future studies. 6 refs., 3 figs., 9 tabs

  4. Modeling of Particle Acceleration at Multiple Shocks Via Diffusive Shock Acceleration: Preliminary Results

    Science.gov (United States)

    Parker, L. N.; Zank, G. P.

    2013-12-01

    Successful forecasting of energetic particle events in space weather models requires algorithms for correctly predicting the spectrum of ions accelerated from a background population of charged particles. We present preliminary results from a model that diffusively accelerates particles at multiple shocks. Our basic approach is related to box models (Protheroe and Stanev, 1998; Moraal and Axford, 1983; Ball and Kirk, 1992; Drury et al., 1999) in which a distribution of particles is diffusively accelerated inside the box while simultaneously experiencing decompression through adiabatic expansion and losses from the convection and diffusion of particles outside the box (Melrose and Pope, 1993; Zank et al., 2000). We adiabatically decompress the accelerated particle distribution between each shock by either the method explored in Melrose and Pope (1993) and Pope and Melrose (1994) or by the approach set forth in Zank et al. (2000), where we solve the transport equation by a method analogous to operator splitting. The second method incorporates the additional loss terms of convection and diffusion and allows for the use of a variable time between shocks. We use a maximum injection energy (Emax) appropriate for quasi-parallel and quasi-perpendicular shocks (Zank et al., 2000, 2006; Dosch and Shalchi, 2010) and provide a preliminary application of the diffusive acceleration of particles by multiple shocks with frequencies appropriate for solar maximum (i.e., a non-Markovian process).
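
    A minimal numerical sketch of the box-model idea is given below: each shock maps the particle spectrum through the standard test-particle acceleration integral with index q = 3r/(r-1), followed by adiabatic decompression between shocks. The compression ratio, momentum grid and seed spectrum are assumed values, and the scheme is a strong simplification of the models cited above.

    ```python
    # Minimal sketch of test-particle diffusive shock acceleration at multiple
    # identical shocks: each shock applies
    #   f_out(p) = q * p**(-q) * integral_0^p f_in(p') p'**(q-1) dp',
    # with q = 3r/(r-1), then the spectrum is decompressed (p -> p * r**(-1/3)).
    import numpy as np

    r = 4.0                       # shock compression ratio (strong shock), assumed
    q = 3 * r / (r - 1)           # single-shock spectral index (= 4 here)
    p = np.logspace(0, 4, 400)    # momentum grid (arbitrary units)

    f = np.exp(-np.log(p) ** 2)   # narrow seed population near p = 1

    for shock in range(5):
        # accelerate: cumulative trapezoidal integral of f * p**(q-1)
        integrand = f * p ** (q - 1)
        cumint = np.concatenate([[0.0], np.cumsum(
            0.5 * (integrand[1:] + integrand[:-1]) * np.diff(p))])
        f = q * p ** (-q) * cumint
        # decompress adiabatically between shocks (Liouville-conserving shift)
        f = np.interp(p, p * r ** (-1.0 / 3.0), f)

    # local slope d(ln f)/d(ln p); repeated shocks flatten it from -q toward -3
    slope = np.gradient(np.log(f[200:300]), np.log(p[200:300])).mean()
    print(f"spectral slope after 5 shocks: {slope:.2f} (single shock: {-q:.1f})")
    ```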

  5. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. Most of these experiments adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method by which KM3NeT users can utilize the EGI computing resources in a simulation-driven use-case.

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  7. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  8. Modeling centrifugal cell washers using computational fluid dynamics.

    Science.gov (United States)

    Kellet, Beth E; Han, Binbing; Dandy, David S; Wickramasinghe, S Ranil

    2004-11-01

    Reinfusion of shed blood during surgery could avoid the need for blood transfusions. Prior to reinfusion of the red blood cells, the shed blood must be washed in order to remove leukocytes, platelets, and other contaminants. Further, the hematocrit of the washed blood must be increased. The feasibility of using computational fluid dynamics (CFD) to guide the design of better centrifuges for processing shed blood is explored here. The velocity field within a centrifuge bowl and the rate of protein removal from the shed blood has been studied. The results obtained indicate that CFD could help screen preliminary centrifuge bowl designs, thus reducing the number of initial experimental tests required when developing new centrifuge bowls. Although the focus of this work is on washing shed blood, the methods developed here are applicable to the design of centrifuge bowls for other blood-processing applications.

  9. Computational needs for modelling accelerator components

    International Nuclear Information System (INIS)

    Hanerfeld, H.

    1985-06-01

    The particle-in-cell code MASK is being used to model several different electron accelerator components. These studies are being used both to design new devices and to understand particle behavior within existing structures. Studies include the injector for the Stanford Linear Collider and the 50 megawatt klystron currently being built at SLAC. MASK is a 2D electromagnetic code used by SLAC both on our own IBM 3081 and on the CRAY X-MP at the NMFECC. Our experience with running MASK illustrates the need for supercomputers to continue work of the kind described. 3 refs., 2 figs.
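
    As background on the technique, a particle-in-cell code advances particles and fields in alternation: deposit charge on a grid, solve for the field, interpolate the field back to the particles, and push them. Below is a minimal one-dimensional electrostatic sketch of that cycle; nothing here is taken from MASK itself, which is a 2D electromagnetic code:

```python
import numpy as np

# Toy 1D electrostatic PIC step on a periodic grid (normalized units, L = 1)
NG, NP, L, DT = 64, 1000, 1.0, 0.01
dx = L / NG
x = np.random.rand(NP) * L            # particle positions
v = np.random.randn(NP) * 0.1         # particle velocities

def pic_step(x, v):
    # 1. Deposit charge on the grid (nearest-grid-point weighting),
    #    with a uniform neutralizing background subtracted
    idx = (x / dx).astype(int) % NG
    rho = np.bincount(idx, minlength=NG) / (NP * dx) - 1.0
    # 2. Solve Poisson's equation in Fourier space: -k^2 phi_k = -rho_k
    k = 2 * np.pi * np.fft.fftfreq(NG, d=dx)
    k[0] = 1.0                        # avoid division by zero; mean mode is zeroed
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = -np.real(np.fft.ifft(1j * k * phi_k))   # E = -dphi/dx on the grid
    # 3. Interpolate the field to the particles and push them
    v_new = v - E[idx] * DT           # electrons: charge -1, mass 1
    x_new = (x + v_new * DT) % L
    return x_new, v_new

x, v = pic_step(x, v)
```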

  10. Paradox of integration-A computational model

    Science.gov (United States)

    Krawczyk, Małgorzata J.; Kułakowski, Krzysztof

    2017-02-01

    The paradoxical aspect of integration of a social group has been highlighted by Blau (1964). During the integration process, the group members simultaneously compete for social status and play the role of the audience. Here we show that when competition prevails over the desire for approval, a sharp transition breaks all friendly relations. However, as described by Blau, people with high status are more inclined to seek the acceptance of others; this is achieved by praising others and revealing one's own weak points. In our model, this action smooths the transition and improves interpersonal relations.

  11. The Agricultural Model Intercomparison and Improvement Project (AgMIP): Progress and Preliminary Results

    Science.gov (United States)

    Rosenzweig, C.

    2011-12-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a distributed climate-scenario simulation exercise for historical model intercomparison and future climate change conditions, with participation from multiple crop and agricultural trade modeling groups around the world. The goals of AgMIP are to substantially improve the characterization of the risk of hunger and world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. Recent progress and the current status of AgMIP will be presented, highlighting three areas of activity: preliminary results from crop pilot studies, outcomes from regional workshops, and emerging scientific challenges. AgMIP crop modeling efforts are being led by pilot studies, which have been established for wheat, maize, rice, and sugarcane. These crop-specific initiatives have proven instrumental in testing and contributing to AgMIP protocols, as well as creating preliminary results for aggregation and input to agricultural trade models. Regional workshops are being held to encourage collaborations and set research activities in motion for key agricultural areas. The first of these workshops was hosted by Embrapa and UNICAMP and held in Campinas, Brazil. Outcomes from this meeting have informed crop modeling research activities within South America, AgMIP protocols, and future regional workshops. Several scientific challenges have emerged and are currently being addressed by AgMIP researchers. Areas of particular interest include geospatial weather generation, ensemble methods for climate scenarios and crop models, spatial aggregation of field-scale yields to regional and global production, and characterization of future changes in climate variability.

  12. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over the network, and the widespread use of software for design and pre-production in mechanical engineering have led large industrial enterprises and small engineering companies alike to implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key subjects of research, but the system-wide problems of efficiently distributing (balancing) the computational load and of accommodating input, intermediate, and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes, and the selection of a node to which a user's request is routed in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing optimal schedules in a distributed system whose infrastructure changes dynamically is an important task.
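
    A minimal illustration of the node-selection step described above (class names and the load metric are hypothetical, not from the paper): each incoming request is routed to the monitored node currently reporting the lowest load, and nodes may join or leave the pool as the infrastructure changes.

```python
class Node:
    """A compute node that periodically reports its load (0.0 to 1.0)."""
    def __init__(self, name: str):
        self.name = name
        self.load = 0.0

class LeastLoadedBalancer:
    """Route each request to the node with the lowest reported load."""
    def __init__(self):
        self.nodes = []

    def add_node(self, node: Node):     # infrastructure can grow dynamically...
        self.nodes.append(node)

    def remove_node(self, name: str):   # ...or shrink
        self.nodes = [n for n in self.nodes if n.name != name]

    def dispatch(self, request_id: int):
        node = min(self.nodes, key=lambda n: n.load)
        node.load += 0.1                # toy cost model: fixed load per request
        print(f"request {request_id} -> {node.name} (load now {node.load:.1f})")

balancer = LeastLoadedBalancer()
for i in range(3):
    balancer.add_node(Node(f"node-{i}"))
for req in range(6):
    balancer.dispatch(req)
```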

  13. Preliminary economic analysis of poultry litter gasification option with a simple transportation model.

    Science.gov (United States)

    Sheth, Atul C; English, Jennifer

    2005-04-01

    Several environmental issues are related to the disposal of poultry litter. In an effort to provide a more environmentally friendly alternative to landfill disposal or spreading as a soil amendment, work has been carried out previously at the University of Tennessee Space Institute (UTSI). This past UTSI work concentrated on developing a catalytic steam gasification concept to produce energy from poultry litter. In the past UTSI studies, a preliminary design and economics for a stationary, centralized gasification plant capable of processing approximately 100 ton/day of poultry litter were developed. However, this preliminary design did not address the economic impact of transporting litter to a centralized gasification plant location. To determine the preliminary impact of transporting the poultry litter on the overall economics of this energy conversion plant design, a simple transportation model was developed. This model was used in conjunction with the earlier plant design prepared at UTSI to determine the economic feasibility of a centralized, stationary poultry litter gasification plant. To do so, major variables such as traveling distance, plant feed rate (or capacity), fluctuations in the sales price of the product gas (that is, the value of the energy), population density of poultry farms, impact of tipping fees, and cost of litter were varied. The study showed that for a plant with a capacity of 1000 ton/day to withstand several changes in economic conditions and sustain itself, the poultry farm density would need to be approximately 0.3 houses/mi². Smaller plants would need either a higher energy price or some kind of subsidy to be economically feasible.
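
    The core geometric reasoning of a simple transportation model of this kind can be sketched as follows. All function names and numerical inputs below are illustrative assumptions, not values from the UTSI study: the plant's feed rate and the farm density fix the collection radius, and transport cost scales with the average haul distance.

```python
import math

def collection_radius_mi(plant_feed_ton_day, farm_density_per_mi2,
                         litter_ton_day_per_house):
    """Radius (mi) of the circular collection area needed to feed the plant."""
    houses_needed = plant_feed_ton_day / litter_ton_day_per_house
    area_mi2 = houses_needed / farm_density_per_mi2
    return math.sqrt(area_mi2 / math.pi)

def daily_transport_cost(plant_feed_ton_day, radius_mi, cost_per_ton_mi):
    """Approximate cost: the mean haul distance over a uniform disc is 2R/3."""
    avg_haul_mi = 2.0 * radius_mi / 3.0
    return plant_feed_ton_day * avg_haul_mi * cost_per_ton_mi

# Hypothetical inputs loosely matching the 1000 ton/day scenario in the abstract
radius = collection_radius_mi(plant_feed_ton_day=1000,
                              farm_density_per_mi2=0.3,
                              litter_ton_day_per_house=0.5)
cost = daily_transport_cost(1000, radius, cost_per_ton_mi=0.25)
print(f"collection radius ~{radius:.0f} mi, transport cost ~${cost:,.0f}/day")
```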

  14. Preliminary assessment of PWR Steam Generator modelling in RELAP5/MOD3

    International Nuclear Information System (INIS)

    Preece, R.J.; Putney, J.M.

    1993-07-01

    A preliminary assessment of Steam Generator (SG) modelling in the PWR thermal-hydraulic code RELAP5/MOD3 is presented. The study is based on calculations against a series of steady-state commissioning tests carried out on the Wolf Creek PWR over a range of load conditions. Data from the tests are used to assess the modelling of primary to secondary side heat transfer and, in particular, to examine the effect of reverting to the standard form of the Chen heat transfer correlation in place of the modified form applied in RELAP5/MOD2. Comparisons between the two versions of the code are also used to show how the new interphase drag model in RELAP5/MOD3 affects the calculation of SG liquid inventory and the void fraction profile in the riser.
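
    For reference, the standard Chen correlation mentioned above superposes a forced-convection (macroconvective) term and a nucleate-boiling (microconvective) term. This is the textbook form; the exact implementation in RELAP5 may differ:

    $$ h = F\,h_{\mathrm{DB}} + S\,h_{\mathrm{FZ}}, \qquad h_{\mathrm{DB}} = 0.023\,\mathrm{Re}_l^{0.8}\,\mathrm{Pr}_l^{0.4}\,\frac{k_l}{D}, $$

    $$ h_{\mathrm{FZ}} = 0.00122\,\frac{k_l^{0.79}\,c_{p,l}^{0.45}\,\rho_l^{0.49}}{\sigma^{0.5}\,\mu_l^{0.29}\,h_{fg}^{0.24}\,\rho_v^{0.24}}\,\Delta T_{\mathrm{sat}}^{0.24}\,\Delta p_{\mathrm{sat}}^{0.75}, $$

    where $F \geq 1$ enhances the single-phase (Dittus-Boelter) term for two-phase flow and $S \leq 1$ suppresses the pool-boiling (Forster-Zuber) term.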

  15. Preliminary analysis on hybrid Box-Jenkins - GARCH modeling in forecasting gold price

    Science.gov (United States)

    Yaziz, Siti Roslindar; Azizan, Noor Azlinna; Ahmad, Maizah Hura; Zakaria, Roslinazairimah; Agrawal, Manju; Boland, John

    2015-02-01

    Gold has been regarded as a valuable precious metal and the most popular commodity for healthy-return investment. Hence, the analysis and prediction of the gold price are very significant to investors. This study is a preliminary analysis of the gold price and its volatility that focuses on the performance of hybrid Box-Jenkins models together with GARCH in analyzing and forecasting the gold price. The Box-Cox formula is used as the data transformation method due to its potential in normalizing data, stabilizing variance and reducing heteroscedasticity; the analysis uses a 41-year daily gold price series starting 2nd January 1973. Our study indicates that the proposed hybrid ARIMA-GARCH model with t-innovations can be a new potential approach to forecasting the gold price. This finding demonstrates the strength of GARCH in handling volatility in the gold price as well as in overcoming the non-linearity limitation of Box-Jenkins modeling.
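
    A minimal sketch of the hybrid procedure described above, using the statsmodels and arch packages. The synthetic series and the orders (1,1,1) and (1,1) are placeholders; the paper selects its own orders from the actual 41-year gold price data:

```python
import numpy as np
import pandas as pd
from scipy.stats import boxcox
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

# Stand-in for the daily gold price series (positive by construction)
gold = pd.Series(np.exp(np.cumsum(np.random.normal(0, 0.01, 500))) * 100)

# 1. Box-Cox transform to stabilize variance
y, lam = boxcox(gold.values)

# 2. Box-Jenkins (ARIMA) model for the conditional mean
arima = ARIMA(y, order=(1, 1, 1)).fit()

# 3. GARCH(1,1) with Student-t innovations fitted to the ARIMA residuals
garch = arch_model(arima.resid, vol="Garch", p=1, q=1, dist="t").fit(disp="off")

# One-step-ahead forecasts: mean from ARIMA, variance from GARCH
# (apply scipy.special.inv_boxcox with lam to return to the price scale)
mean_fc = arima.forecast(steps=1)
var_fc = garch.forecast(horizon=1).variance.values[-1, 0]
print(mean_fc, var_fc)
```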

  16. COMPUTER MODELLING OF ENERGY SAVING EFFECTS

    Directory of Open Access Journals (Sweden)

    Marian JANCZAREK

    2016-09-01

    Full Text Available The paper presents an analysis of the dynamics of heat transfer through the outer wall of thermal-technical spaces, taking into account the sinusoidal character of changes in atmospheric temperature. These periodic variations at the outer surface of the chamber wall appear at the inner surface as a sinusoidal change as well, but suitably damped and shifted in phase. A properly selected phase shift is clearly important for saving the energy used to maintain a specific thermal regime inside the chamber. Laboratory tests of the model and of the actual object allowed an optimal design of the chamber with respect to the structure of the partition as well as to the geographical orientation of the chamber's location.
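
    The damping and phase shift described above follow the classical one-dimensional solution for a periodic temperature wave penetrating a homogeneous wall (a simplified textbook form, not the paper's full model). For a material of thermal diffusivity $\alpha$ driven at the outer surface by $T(0,t) = T_m + \Delta T \sin(\omega t)$, the temperature at depth $x$ is

    $$ T(x,t) = T_m + \Delta T\,e^{-x/\delta}\,\sin\!\left(\omega t - \frac{x}{\delta}\right), \qquad \delta = \sqrt{\frac{2\alpha}{\omega}}, $$

    so the amplitude at the inner surface is attenuated by the factor $e^{-x/\delta}$ and lags the outer surface by a time $x/(\delta\omega)$.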

  17. Computational modeling of Metal-Organic Frameworks

    Science.gov (United States)

    Sung, Jeffrey Chuen-Fai

    In this work, the metal-organic frameworks MIL-53(Cr), DMOF-2,3-NH2Cl, DMOF-2,5-NH2Cl, and HKUST-1 were modeled using molecular mechanics and electronic structure methods. The effect of electronic polarization on the adsorption of water in MIL-53(Cr) was studied using molecular dynamics simulations of water-loaded MIL-53 systems with both polarizable and non-polarizable force fields. Molecular dynamics simulations of the full systems and DFT calculations on representative framework clusters were utilized to study the difference in nitrogen adsorption between DMOF-2,3-NH2Cl and DMOF-2,5-NH2Cl. Finally, the control of proton conduction in HKUST-1 by complexation of molecules to the Cu open metal site was investigated using the MS-EVB methodology.

  18. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital-based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence is no longer appropriate in some circumstances. In cases such as child abduction, pedophilia, and missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles and does not preclude transporting the system(s)/storage media back to a lab environment for a more thorough examination and analysis once the initial field triage is concluded. The CFFTPM has been successfully used in various real-world cases, and its investigative importance and pragmatic approach have been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model’s forensic soundness, investigative support capabilities and practical considerations.

  19. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  20. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.