WorldWideScience

Sample records for models microcomputer techniques

  1. The single chip microcomputer technique in an intelligent nuclear instrument

    International Nuclear Information System (INIS)

    Wang Tieliu; Sun Punan; Wang Ying

    1995-01-01

The authors describe how to acquire and process the output signals from a nuclear detector using the single-chip microcomputer technique, including the working principles and the design of the instrument's software and hardware.
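The acquisition loop such an instrument runs can be sketched in a few lines. The sketch below is purely illustrative and not from the paper: `read_adc`, the ADC width, and the simulated peak position are all assumptions. The firmware repeatedly digitizes detector pulses and accumulates a pulse-height histogram.

```python
import random

ADC_BITS = 8                 # hypothetical 8-bit ADC on the single-chip microcomputer
N_CHANNELS = 1 << ADC_BITS   # one histogram bin per ADC code

def read_adc():
    """Stand-in for the MCU's ADC read; returns one pulse-height code.
    Here we simulate a detector line near channel 120."""
    return max(0, min(N_CHANNELS - 1, int(random.gauss(120, 8))))

def acquire(n_pulses):
    """Poll the ADC n_pulses times and accumulate a pulse-height
    histogram, as the instrument firmware would in its acquisition loop."""
    histogram = [0] * N_CHANNELS
    for _ in range(n_pulses):
        histogram[read_adc()] += 1
    return histogram

random.seed(1)
hist = acquire(10_000)
peak_channel = max(range(N_CHANNELS), key=hist.__getitem__)
```

In a real instrument the loop would be interrupt-driven rather than polled, but the histogram-accumulation idea is the same.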

  2. Microcomputers, Model Rockets, and Race Cars.

    Science.gov (United States)

    Mirus, Edward A., Jr.

    1985-01-01

    The industrial education orientation program at Wisconsin School for the Deaf (WSD) presents problem-solving situations to all seventh- and eighth-grade hearing-impaired students. WSD developed user-friendly microcomputer software to guide students individually through complex computations involving model race cars and rockets while freeing…

  3. Activities and trends in physical protection modeling with microcomputers

    International Nuclear Information System (INIS)

    Chapman, L.D.; Harlan, C.P.

    1985-01-01

Sandia National Laboratories developed several models in the mid-to-late 1970s, including the Safeguards Automated Facility Evaluation (SAFE) method, the Estimate of Adversary Sequence Interruption (EASI), the Safeguards Network Analysis Procedure (SNAP), the Brief Adversary Threat Loss Estimator (BATLE), and others. These models were implemented on large computers such as the VAX 11/780 and the CDC machines. With the recent development and widespread use of the IBM PC and other microcomputers, it has become evident that several physical protection models should be made available for use on these microcomputers. Currently, there are programs under way to convert the EASI, SNAP and BATLE models to the IBM PC. Input and analysis with the EASI model have been designed to be very user friendly through menu-driven options. The SNAP modeling technique will be converted to an IBM PC/AT with many enhancements to user friendliness; graphical assistance for entering the model and reviewing traces of the simulated output is planned. The BATLE model is being converted to the IBM PC while preserving its interactive nature. The current status of these developments is reported in this paper
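The EASI calculation mentioned above is, at heart, a small probability model. The sketch below is a hedged illustration of an EASI-style estimate, not Sandia's actual code: the sensor probabilities, times, and step-function treatment of response timeliness are all invented for the example.

```python
# Hypothetical adversary path: detection probability at each sensor, and
# the adversary's remaining task time (s) after passing that sensor.
p_detect       = [0.5, 0.7, 0.9]
remaining_time = [300.0, 120.0, 40.0]
response_time  = 100.0    # guard force response time (s)
p_communicate  = 0.95     # alarm communication reliability

def p_interruption(p_detect, remaining_time, response_time, p_comm):
    """EASI-style estimate: sum over sensors of P(first detection there)
    x P(alarm communicated) x P(response arrives before task completion).
    Response timeliness is treated as a step function for simplicity."""
    p_not_yet = 1.0   # probability the adversary is still undetected
    total = 0.0
    for pd, t_left in zip(p_detect, remaining_time):
        timely = 1.0 if response_time < t_left else 0.0
        total += p_not_yet * pd * p_comm * timely
        p_not_yet *= 1.0 - pd
    return total

pi = p_interruption(p_detect, remaining_time, response_time, p_communicate)
```

The full EASI model integrates over response-time distributions rather than using a step function, but the first-detection decomposition shown here is the core of the method.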

  4. Model H-90A gamma-ray spectrometer with microcomputer

    International Nuclear Information System (INIS)

    Zhang Biao; Dong Chen; Zhao Zhiming

    1994-11-01

Model H-90A is a 4-channel differential gamma-ray spectrometer with a microcomputer. It consists of a console and a NaI(Tl) crystal detector of φ75 mm x 75 mm. The instrument offers automatic spectrum stabilization, automatic timed measurement, and automatic calculation of uranium, thorium and potassium contents and their ratios. Original data can be stored manually or automatically. The instrument is protected against power failure; readings can be recalled, or further processed through the RS-232 interface when connected to a computer. Working commands are entered by 'soft keys' and executed automatically by the single-chip microcomputer under software control. It can be used not only in radioactive geological mapping, geochemical research and rapid field assay of radioactive elements in mineral and rock samples, but also in exploration and reconnaissance surveys for uranium, thorium, potassium and gold, as well as in environmental monitoring
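The "automatic calculation of uranium, thorium and potassium contents" in such spectrometers typically amounts to inverting a calibration (stripping) matrix that maps window count rates to concentrations. A minimal sketch with an entirely hypothetical sensitivity matrix (real constants come from calibration-pad measurements):

```python
# Hypothetical sensitivity matrix: count rate in each spectral window
# (K, U, Th rows) per unit concentration of each element (K, U, Th columns).
S = [[3.0, 1.2, 0.8],
     [0.2, 2.5, 1.0],
     [0.0, 0.3, 1.8]]

def det3(m):
    """Determinant of a 3x3 matrix."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def concentrations(counts):
    """Invert window count rates into K, U, Th concentrations by solving
    S x = counts with Cramer's rule -- the 'automatic calculation' step
    such instruments perform in firmware."""
    D = det3(S)
    result = []
    for j in range(3):
        M = [row[:] for row in S]
        for i in range(3):
            M[i][j] = counts[i]
        result.append(det3(M) / D)
    return result

true_conc = [2.0, 3.0, 10.0]   # illustrative %K, ppm U, ppm Th
counts = [sum(S[i][j] * true_conc[j] for j in range(3)) for i in range(3)]
est = concentrations(counts)
```

The round trip (synthesize counts from known concentrations, then invert) is the standard way such a stripping routine is checked.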

  5. Assembly language program design used in model DD80 multifunction microcomputer multichannel analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Yiziang, Wei; Ying, Chen; Xide, Zhao

    1985-05-01

This paper describes the structure, features, flowcharts and design considerations of the assembly language program used in the Model DD80 (FH1920) multifunction microcomputer multichannel analyzer. On a Model TRS-80 (I) microcomputer with the DD80 multifunction interface, the program can be used for spectrum data acquisition, live spectrum display and some spectrum data processing.

  6. The assembly language program design used in model DD80 multifunction microcomputer multichannel analyzer

    International Nuclear Information System (INIS)

    Wei Yiziang; Chen Ying; Zhao Xide

    1985-01-01

This paper describes the structure, features, flowcharts and design considerations of the assembly language program used in the Model DD80 (FH1920) multifunction microcomputer multichannel analyzer. On a Model TRS-80 (I) microcomputer with the DD80 multifunction interface, the program can be used for spectrum data acquisition, live spectrum display and some spectrum data processing.

  7. Interactive display of molecular models using a microcomputer system

    Science.gov (United States)

    Egan, J. T.; Macelroy, R. D.

    1980-01-01

A simple, microcomputer-based, interactive graphics display system has been developed for presenting perspective views of wire-frame molecular models. The display system is based on a TERAK 8510a graphics computer with a display unit consisting of microprocessor, television display and keyboard subsystems. The operating system includes a screen editor, a file manager, PASCAL and BASIC compilers, and command options for linking and executing programs. The graphics program, written in UCSD PASCAL, centers the coordinate system, transforms the centered model coordinates into homogeneous coordinates, constructs a viewing transformation matrix to operate on the coordinates, clips invisible points, applies a perspective transformation and scales to screen coordinates; available commands include ZOOM, ROTATE, RESET, and CHANGEVIEW. The data file structure was chosen to minimize disk storage. Despite the inherent slowness of the system, its low cost and flexibility suggest general applicability.
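The pipeline described (centering, viewing transform, perspective division, scaling to screen coordinates) can be sketched compactly. The following is an illustrative reimplementation in Python, not the original UCSD PASCAL code; the viewer distance and screen scale are arbitrary:

```python
import math

def center(points):
    """Translate the model so its centroid sits at the origin."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    return [(x - cx, y - cy, z - cz) for x, y, z in points]

def rotate_y(points, deg):
    """Rotate the model about the y axis (a minimal viewing transform)."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    return [(c*x + s*z, y, -s*x + c*z) for x, y, z in points]

def project(points, viewer_dist=10.0, screen_scale=100.0):
    """Perspective-divide each point as seen from distance viewer_dist
    along -z, then scale to screen coordinates."""
    out = []
    for x, y, z in points:
        w = z + viewer_dist              # homogeneous divide term
        out.append((screen_scale * x / w, screen_scale * y / w))
    return out

atoms = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # toy "molecule"
screen = project(rotate_y(center(atoms), 30.0))
```

A full wire-frame renderer would also clip points behind the viewer and draw bonds between projected atom pairs; this sketch shows only the coordinate pipeline.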

  8. Micro-computed tomography and bond strength analysis of different root canal filling techniques

    Directory of Open Access Journals (Sweden)

    Juliane Nhata

    2014-01-01

Introduction: The aim of this study was to evaluate the quality and bond strength of three root filling techniques (lateral compaction, continuous wave of condensation, and Tagger's hybrid technique [THT]) using micro-computed tomography (micro-CT) images and push-out tests, respectively. Materials and Methods: Thirty mandibular incisors were prepared using the same protocol and randomly divided into three groups (n = 10): lateral condensation technique (LCT), continuous wave of condensation technique (CWCT), and THT. All specimens were filled with gutta-percha (GP) cones and AH Plus sealer. Five specimens of each group were randomly chosen for micro-CT analysis, and all specimens were sectioned into 1 mm slices and subjected to push-out tests. Results: Micro-CT analysis revealed fewer empty spaces when GP was heated within the root canals in CWCT and THT than in LCT. Push-out tests showed that LCT and THT had significantly higher displacement resistance (P < 0.05) than CWCT. Bond strength was lower in the apical and middle thirds than in the coronal third. Conclusions: LCT and THT were associated with higher bond strengths to intraradicular dentine than CWCT. However, LCT was associated with more empty voids than the other techniques.
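The micro-CT "empty spaces" comparison reduces to computing the percentage volume of voids in a segmented binary volume. A minimal sketch (the toy volume and its 5% void rate are invented for illustration):

```python
import random

def void_fraction(volume):
    """Percentage of voxels flagged as void (True) in a binary volume --
    the quantity micro-CT filling studies report as 'percentage volume
    of voids'."""
    total = voids = 0
    for slab in volume:
        for row in slab:
            for voxel in row:
                total += 1
                voids += bool(voxel)
    return 100.0 * voids / total

# Toy 20x20x20 segmented volume with ~5% of voxels marked as void.
random.seed(0)
vol = [[[random.random() < 0.05 for _ in range(20)]
        for _ in range(20)]
       for _ in range(20)]
pct = void_fraction(vol)
```

Real studies compute this per canal third after thresholding the reconstructed scan, but the voxel-counting step is as simple as shown.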

  9. The microcomputer scientific software series 2: general linear model--regression.

    Science.gov (United States)

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...
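The core of such a GLM regression program, for the simple one-predictor case, can be sketched as ordinary least squares plus the ANOVA decomposition the abstract mentions. The data below are invented for illustration; this is not the GLMR program itself:

```python
def ols_anova(x, y):
    """Ordinary least squares for y = b0 + b1*x, plus the ANOVA
    decomposition (regression SS, residual SS, F statistic)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx)**2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx                       # slope estimate
    b0 = my - b1 * mx                    # intercept estimate
    yhat = [b0 + b1 * xi for xi in x]
    ss_total = sum((yi - my)**2 for yi in y)
    ss_resid = sum((yi - yh)**2 for yi, yh in zip(y, yhat))
    ss_reg = ss_total - ss_resid
    # F = MS(regression) / MS(residual) with 1 and n-2 df
    f_stat = (ss_reg / 1) / (ss_resid / (n - 2)) if ss_resid > 0 else float("inf")
    return b0, b1, ss_reg, ss_resid, f_stat

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.2, 5.9, 8.1, 9.9]            # roughly y = 2x
b0, b1, ss_reg, ss_resid, f = ols_anova(x, y)
```

The multicollinearity check and coefficient confidence intervals of the full program build on the same sums of squares.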

  10. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    International Nuclear Information System (INIS)

    Gerhard, M.A.; Sommer, S.C.

    1995-04-01

AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  11. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    Energy Technology Data Exchange (ETDEWEB)

Gerhard, M.A.; Sommer, S.C. [Lawrence Livermore National Lab., CA (United States)]

    1995-04-01

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.

  12. The composing technique of fast and large scale nuclear data acquisition and control system with single chip microcomputers and PC computers

    International Nuclear Information System (INIS)

    Xu Zurun; Wu Shiying; Liu Haitao; Yao Yangsen; Wang Yingguan; Yang Chaowen

    1998-01-01

The technique of employing single-chip microcomputers and PC computers to compose a fast, large-scale nuclear data acquisition and control system was discussed in detail. The optimum composition mode of this kind of system, the acquisition and control circuit unit based on single-chip microcomputers, the real-time communication methods and the software composition under Windows 3.2 were also described. One-, two- and three-dimensional spectra measured by this system were demonstrated.

  13. The composing technique of fast and large scale nuclear data acquisition and control system with single chip microcomputers and PC computers

    International Nuclear Information System (INIS)

    Xu Zurun; Wu Shiying; Liu Haitao; Yao Yangsen; Wang Yingguan; Yang Chaowen

    1997-01-01

The technique of employing single-chip microcomputers and PC computers to compose a fast, large-scale nuclear data acquisition and control system was discussed in detail. The optimum composition mode of this kind of system, the acquisition and control circuit unit based on single-chip microcomputers, the real-time communication methods and the software composition under Windows 3.2 were also described. One-, two- and three-dimensional spectra measured by this system were demonstrated.

  14. Doing Physics with Microcomputers.

    Science.gov (United States)

    Bak, Per

    1983-01-01

    Describes how microcomputers can perform very demanding/large-scale physics calculations at speeds not much slower than those of modern, full-size computers. Among the examples provided are a Monte Carlo simulation of the three-dimensional Ising model and a program (for the Apple microcomputer) using the time-independent Schrodinger Equation. (JN)

  15. A standard library for modeling satellite orbits on a microcomputer

    Science.gov (United States)

    Beutel, Kenneth L.

    1988-03-01

Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling earth-orbiting satellites. The library is used as a basis for implementing a satellite motion simulator that can demonstrate orbital phenomena in the classroom. The thesis surveys the equations for orbital elements, coordinate systems and analytic formulas, and codifies them into a standard method for modeling earth-orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable between a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
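One routine such a library must contain is a solver for Kepler's equation, which links mean anomaly to position on the orbit. The sketch below is in Python rather than the thesis's C, and is an independent illustration of the standard algorithm, not code from the library:

```python
import math

def kepler_E(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric
    anomaly E by Newton's method (M in radians, 0 <= e < 1)."""
    E = M if e < 0.8 else math.pi      # standard starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def orbit_position(a, e, M):
    """Perifocal-plane position (x, y) for semi-major axis a,
    eccentricity e and mean anomaly M."""
    E = kepler_E(M, e)
    x = a * (math.cos(E) - e)
    y = a * math.sqrt(1.0 - e * e) * math.sin(E)
    return x, y

E = kepler_E(math.radians(90.0), 0.1)
x, y = orbit_position(7000.0, 0.1, math.radians(90.0))   # a in km, say
```

A full library would add the rotation from the perifocal frame into earth-centered inertial coordinates via the remaining orbital elements.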

  16. A microcomputer-controlled modulation technique for the detection of transient species in UV photoelectron spectroscopy

    International Nuclear Information System (INIS)

    Lonkhuyzen, H. van; Muller, H.G.; Lange, C.A. de

    1980-01-01

A microcomputer-controlled modulation method is described for measuring UV photoelectron spectra of transient species generated in a microwave discharge. Spectra at low and high microwave power levels are recorded simultaneously and afterwards linearly combined in order to remove parent-compound signals. The method is applied to discharged oxygen, where the transition O2+(2Πu) ← O2(1Δg) becomes visible without interference from the parent molecule O2(3Σg-), and to discharged sulphur dioxide, where SO(3Σ-) and S(3P) photoelectron spectra are obtained free from SO2 bands. Finally the build-up of transient bands as a function of time is recorded. (orig.)
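The linear combination step can be illustrated with toy spectra: scale the low-power spectrum so the parent bands cancel, then subtract. The reference-channel choice and the spectra below are invented for the example, not taken from the paper:

```python
def remove_parent(high_power, low_power):
    """Linearly combine spectra recorded at high and low microwave power
    so the parent-compound bands cancel. The scale factor is estimated
    from a channel range (here channels 0-9) assumed to contain only
    parent signal."""
    ref = slice(0, 10)
    scale = sum(high_power[ref]) / sum(low_power[ref])
    return [h - scale * l for h, l in zip(high_power, low_power)]

# Toy spectra: parent band in channels 0-9, transient band in 20-29.
low  = [100.0] * 10 + [0.0] * 10 + [5.0] * 10
high = [60.0] * 10  + [0.0] * 10 + [40.0] * 10
diff = remove_parent(high, low)
```

After the subtraction the parent band nets to zero while the transient band survives, which is the effect the modulation technique achieves in hardware.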

  17. A Micro-Computed Tomography Technique to Study the Quality of Fibre Optics Embedded in Composite Materials

    Directory of Open Access Journals (Sweden)

    Gabriele Chiesura

    2015-05-01

The quality of embedment of optical fibre sensors in carbon fibre-reinforced polymers plays an important role in the resultant properties of the composite, as well as in correct monitoring of the structure. A tool able to check the optical fibre sensor-composite interaction therefore becomes essential. High-resolution 3D X-ray Micro-Computed Tomography, or Micro-CT, is a relatively new non-destructive inspection technique which enables investigation of the internal structure of a sample without compromising its integrity. In this work, the feasibility of inspecting the position, the orientation and, more generally, the quality of the embedment of an optical fibre sensor in a carbon fibre-reinforced laminate at unit-cell level has been proven.

  18. Real-time interferometer phase detection using an LSI-11 microcomputer and high-speed digital techniques

    International Nuclear Information System (INIS)

    Mendell, D.S.

    1978-01-01

This paper describes the basic design and philosophy of a real-time interferometer phase-detection system used on the 2XIIB and TMX magnetic-fusion experiments at the Lawrence Livermore Laboratory. This diagnostic system is now a satellite to a host computer and uses high-speed, emitter-coupled-logic techniques to derive real-time phase data. The system's input signals can be derived from interferometer outputs over a wide range of reference frequencies. An LSI-11 microcomputer is the interface between the high-speed phase-detection logic, buffer memory, human interaction, and the host computer. Phase data is displayed on a storage CRT immediately after each experimental fusion shot. An operator can interrogate the phase data more closely from an interactive control panel while the host computer simultaneously examines the system's buffer memory or arms the system for the next shot

  19. A simple dynamic model and transient simulation of the nuclear power reactor on microcomputers

    Energy Technology Data Exchange (ETDEWEB)

    Han, Yang Gee; Park, Cheol [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

A simple dynamic model is developed for transient simulation of a nuclear power reactor. The dynamic model includes a normalized neutron kinetics model with reactivity feedback effects and a core thermal-hydraulics model. The main objective of this paper is to demonstrate the capability of the developed dynamic model to simulate various important variables of interest during a nuclear power reactor transient. Representative transient simulations show the expected trends in all cases, even though no data are available for comparison. In this work, transient simulations are performed on a microcomputer using the DESIRE/N96T continuous system simulation language, which is applicable to nuclear power reactor transient analysis. 3 refs., 9 figs. (Author)
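A normalized neutron kinetics model of the kind described can be sketched with one delayed-neutron group. The sketch below uses explicit Euler integration and illustrative parameter values (beta, Lambda, lam are typical textbook numbers, not the paper's); it is not the DESIRE/N96T model:

```python
def point_kinetics(rho, beta=0.0065, Lambda=1e-4, lam=0.08,
                   n0=1.0, dt=1e-4, t_end=1.0):
    """Integrate the one-delayed-group point kinetics equations
        dn/dt = (rho - beta)/Lambda * n + lam * C
        dC/dt = beta/Lambda * n - lam * C
    with explicit Euler, starting from equilibrium, and return the
    normalized power n at t_end."""
    n = n0
    C = beta * n0 / (lam * Lambda)   # equilibrium precursor level
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / Lambda * n + lam * C) * dt
        dC = (beta / Lambda * n - lam * C) * dt
        n += dn
        C += dC
        t += dt
    return n

n_crit = point_kinetics(rho=0.0)     # zero reactivity: power stays flat
n_up   = point_kinetics(rho=0.001)   # +100 pcm step: prompt jump, then rise
```

Reactivity feedback, as in the paper's model, would make rho a function of the thermal-hydraulic state instead of a constant.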

  20. A simple dynamic model and transient simulation of the nuclear power reactor on microcomputers

    Energy Technology Data Exchange (ETDEWEB)

    Han, Yang Gee; Park, Cheol [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

A simple dynamic model is developed for transient simulation of a nuclear power reactor. The dynamic model includes a normalized neutron kinetics model with reactivity feedback effects and a core thermal-hydraulics model. The main objective of this paper is to demonstrate the capability of the developed dynamic model to simulate various important variables of interest during a nuclear power reactor transient. Representative transient simulations show the expected trends in all cases, even though no data are available for comparison. In this work, transient simulations are performed on a microcomputer using the DESIRE/N96T continuous system simulation language, which is applicable to nuclear power reactor transient analysis. 3 refs., 9 figs. (Author)

  1. Microcomputer Competencies for Vocational Teachers.

    Science.gov (United States)

    Roth, Gene L.; Tesolowski, Dennis G.

    1984-01-01

    This joint research and development project of two state departments of education used the DACUM (Developing a Curriculum) process to identify microcomputer competencies for vocational instructors. Brainstorming techniques were used to identify five categories of microcomputer applications and to determine which competencies belonged in each…

  2. Construction of three-dimensional tooth model by micro-computed tomography and application for data sharing.

    Science.gov (United States)

    Kato, A; Ohno, N

    2009-03-01

The study of dental morphology is essential in terms of phylogeny. Advances in three-dimensional (3D) measurement devices have enabled us to make 3D images of teeth without destruction of samples. However, acquiring raw fundamental data on tooth shape requires complex equipment and techniques, so an online database of 3D tooth models is indispensable. We aimed to explore a basic methodology for constructing 3D tooth models, with application to data sharing. Geometric information on a human permanent upper left incisor was obtained using micro-computed tomography (micro-CT). Enamel, dentine, and pulp were segmented by thresholding of different gray-scale intensities. Segmented data were separately exported in STereoLithography Interface Format (STL). STL data were converted to Wavefront OBJ (OBJect), as many 3D computer graphics programs support the Wavefront OBJ format. Data were also converted to QuickTime Virtual Reality (QTVR) format, which allows the image to be viewed from any direction. In addition to the Wavefront OBJ and QTVR data, the original CT series were provided as 16-bit Tag Image File Format (TIFF) images on the website. In conclusion, 3D tooth models were constructed in general-purpose data formats, using micro-CT and commercially available programs. Tooth models that can be used widely would benefit all those who study dental morphology.
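The thresholding segmentation step can be sketched directly: each voxel is labeled by comparing its gray value against cutoffs separating pulp/background, dentine, and enamel. The threshold values and toy slice below are invented; real cutoffs come from the scan's gray-value histogram.

```python
def segment(image, thresholds):
    """Label each gray value by thresholding: 0 = background/pulp,
    1 = dentine, 2 = enamel."""
    t_dentine, t_enamel = thresholds
    labels = []
    for row in image:
        labels.append([2 if v >= t_enamel else 1 if v >= t_dentine else 0
                       for v in row])
    return labels

# Toy 2D "slice": low values (pulp), mid (dentine), high (enamel).
slice_ = [[10, 90, 200],
          [15, 110, 210],
          [12, 95, 220]]
labels = segment(slice_, thresholds=(50, 150))
```

Applied slice by slice to the CT stack, the per-label voxel sets are what gets exported as separate STL surfaces.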

  3. Microcomputed tomography and microfinite element modeling for evaluating polymer scaffolds architecture and their mechanical properties.

    Science.gov (United States)

    Alberich-Bayarri, Angel; Moratal, David; Ivirico, Jorge L Escobar; Rodríguez Hernández, José C; Vallés-Lluch, Ana; Martí-Bonmatí, Luis; Estellés, Jorge Más; Mano, Joao F; Pradas, Manuel Monleón; Ribelles, José L Gómez; Salmerón-Sánchez, Manuel

    2009-10-01

    Detailed knowledge of the porous architecture of synthetic scaffolds for tissue engineering, their mechanical properties, and their interrelationship was obtained in a nondestructive manner. Image analysis of microcomputed tomography (microCT) sections of different scaffolds was done. The three-dimensional (3D) reconstruction of the scaffold allows one to quantify scaffold porosity, including pore size, pore distribution, and struts' thickness. The porous morphology and porosity as calculated from microCT by image analysis agrees with that obtained experimentally by scanning electron microscopy and physically measured porosity, respectively. Furthermore, the mechanical properties of the scaffold were evaluated by making use of finite element modeling (FEM) in which the compression stress-strain test is simulated on the 3D structure reconstructed from the microCT sections. Elastic modulus as calculated from FEM is in agreement with those obtained from the stress-strain experimental test. The method was applied on qualitatively different porous structures (interconnected channels and spheres) with different chemical compositions (that lead to different elastic modulus of the base material) suitable for tissue regeneration. The elastic properties of the constructs are explained on the basis of the FEM model that supports the main mechanical conclusion of the experimental results: the elastic modulus does not depend on the geometric characteristics of the pore (pore size, interconnection throat size) but only on the total porosity of the scaffold. (c) 2009 Wiley Periodicals, Inc.

  4. Micro-computer control for super-critical He generation

    International Nuclear Information System (INIS)

    Tamada, Noriharu; Sekine, Takehiro; Tomiyama, Sakutaro

    1979-01-01

The development of large-scale refrigeration systems is being stimulated by new superconducting techniques, represented by superconducting power cables and magnets. For practical operation of such a large system, an automatic control system with a computer is required, because it can attain effective and systematic operation. For this reason, we examined and developed micro-computer control techniques for supercritical He generation as a simplified control model of the refrigeration system. The experimental results showed that the computer control system can attain fine controllability even if the control element is only one magnetic valve, but the BASIC programming language of the micro-computer, although convenient and generally used, is not adequate for controlling a more complicated system because of its low calculating speed. We conclude that a more effective programming language for micro-computers must be developed to realize practical refrigeration control. (author)

  5. X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research

    Science.gov (United States)

    Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.

    2014-01-01

X-ray micro-computed tomography (micro-CT) is a technique that has been used to research meteorites for some time, and recently it is becoming a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process as it can provide textural and compositional information at small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. The samples were micro-CT scanned on the Nikon HMXST 225 System at the Natural History Museum in London. Scans were made at 205-220 kV and 135-160 microamps beam current, with an effective voxel size of 21-44 microns. Results: Initial examination of the data identifies a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernible. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space, e.g., fractures and voids, is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias and better characterize previously described clasts or igneous samples. For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass

  6. Nuclear spectrum analysis by using microcomputer

    International Nuclear Information System (INIS)

    Sanyal, M.K.; Mukhopadhyay, P.K.; Rao, A.D.; Pethe, V.A.

    1984-01-01

A method is presented for the analysis of nuclear spectra using a microcomputer. A nonlinear least-squares fit of a mathematical model to the observed spectrum is performed with the variable metric method. The linear search procedure of the variable metric method has been modified so that the algorithm needs less program space and computational time, both of which are important for microcomputer implementation. This widely used peak analysis method can now be made available in microcomputer-based multichannel analysers. (author)
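The fitting step can be illustrated with a single Gaussian peak. The sketch below uses Gauss-Newton rather than the paper's modified variable metric method (both are iterative least-squares minimizers), on synthetic noiseless data invented for the example:

```python
import math

def gauss(x, A, mu, sigma):
    """Gaussian peak model for one spectral line."""
    return A * math.exp(-0.5 * ((x - mu) / sigma)**2)

def solve3(M, v):
    """Solve the 3x3 system M p = v by Cramer's rule."""
    def det(m):
        return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    D = det(M)
    out = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = v[i]
        out.append(det(Mj) / D)
    return out

def fit_peak(xs, ys, A, mu, sigma, iters=20):
    """Nonlinear least-squares fit of one Gaussian by Gauss-Newton:
    at each iteration solve (J^T J) dp = J^T r for the parameter step."""
    for _ in range(iters):
        JTJ = [[0.0]*3 for _ in range(3)]
        JTr = [0.0]*3
        for x, y in zip(xs, ys):
            f = gauss(x, A, mu, sigma)
            r = y - f
            g = [f/A, f*(x-mu)/sigma**2, f*(x-mu)**2/sigma**3]  # df/d(A,mu,sigma)
            for i in range(3):
                JTr[i] += g[i] * r
                for j in range(3):
                    JTJ[i][j] += g[i] * g[j]
        dA, dmu, ds = solve3(JTJ, JTr)
        A, mu, sigma = A + dA, mu + dmu, sigma + ds
    return A, mu, sigma

xs = list(range(100))
ys = [gauss(x, 500.0, 40.0, 5.0) for x in xs]          # synthetic noiseless peak
A, mu, sigma = fit_peak(xs, ys, A=400.0, mu=40.0, sigma=6.0)
```

On real spectra one would add a background term and statistical weights; the iteration structure stays the same.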

  7. Microcomputer simulation model for facility performance assessment: a case study of nuclear spent fuel handling facility operations

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.; Otis, P.T.

    1985-10-01

A microcomputer-based simulation model was recently developed at the Pacific Northwest Laboratory (PNL) to assist in the evaluation of design alternatives for a proposed facility to receive, consolidate and store nuclear spent fuel from US commercial power plants. Previous performance assessments were limited to deterministic calculations and Gantt chart representations of the facility operations. To ensure that the design of the facility will be adequate to meet the specified throughput requirements, the simulation model was used to analyze such factors as material flow, equipment capability and the interface between the MRS facility and the nuclear waste transportation system. The simulation analysis model was based on commercially available software and application programs designed to represent the MRS waste handling facility operations. The results of the evaluation were used by the design review team at PNL to identify areas where design modifications should be considered. 4 figs

  8. Research on single-chip microcomputer controlled rotating magnetic field mineralization model

    Science.gov (United States)

    Li, Yang; Qi, Yulin; Yang, Junxiao; Li, Na

    2017-08-01

As a method of ore separation, magnetic separation has the advantages of stable operation, a simple process flow, high beneficiation efficiency, and no chemical pollution of the environment. However, existing magnetic separators are largely mechanical: their operation is inflexible, and the magnetic field parameters cannot be changed according to the required ore grade. Taking a single-chip-microcomputer-controlled rotating magnetic field as the research object, a rotating magnetic field processing prototype was designed and trialled, in which the PWM pulse output of the single-chip microcomputer controls the strength and rotation speed of the magnetic field. This method of using purely software-generated PWM pulses to control rotating-field beneficiation offers higher flexibility and accuracy at lower cost, and can give full play to the performance of the single-chip microcomputer.
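The PWM control idea can be sketched as a mapping from requested field strength to timer settings. The timer clock, PWM frequency, and register model below are hypothetical illustrations, not details from the paper:

```python
def pwm_settings(field_fraction, timer_clock_hz=1_000_000, pwm_freq_hz=1_000):
    """Map a requested field strength (0..1 of maximum) to the period and
    compare registers of a hypothetical MCU timer generating the PWM pulse."""
    period = timer_clock_hz // pwm_freq_hz      # timer counts per PWM cycle
    compare = round(field_fraction * period)    # high-time in counts
    return period, compare

def waveform(period, compare):
    """One PWM cycle as a list of 0/1 samples -- a software model of the
    output pin driving the field coil amplifier."""
    return [1 if i < compare else 0 for i in range(period)]

period, compare = pwm_settings(0.25)            # request 25% field strength
cycle = waveform(period, compare)
duty = sum(cycle) / len(cycle)
```

Rotation speed would be set analogously, by phasing several such PWM channels across the coils; only the single-channel duty-cycle mapping is shown here.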

  9. Detectability of Middle Mesial Root Canal Orifices by Troughing Technique in Mandibular Molars: A Micro-computed Tomographic Study.

    Science.gov (United States)

    Keleş, Ali; Keskin, Cangül

    2017-08-01

The objective of the present study was to measure the orifice depth of middle mesial canals (MMCs) and to evaluate the detectability of orifices using troughing preparation. For this study, 85 mandibular molar teeth with MMCs were selected from scanned micro-computed tomographic images. The MMCs were categorized, and the distances between the MMC orifices and the cementoenamel junction (CEJ) were recorded as the orifice depth. Data were evaluated with frequency analysis and a chi-square test using SPSS (SPSS Inc, Chicago, IL), with the significance level set at 5%. It was found that 77.41% of the MMC orifices were at the CEJ level, whereas 5.38% and 9.69% of the MMC orifices were detectable within 1-mm and 2-mm depths from the CEJ, respectively. Of the specimens, 7.52% had MMC orifices deeper than 2 mm from the CEJ. Confluent anatomy was the most frequent configuration. No significant relation was detected between orifice depth and MMC configuration (P > .05). It was concluded that 77.41% of the specimens did not require troughing preparation, the remaining 15.07% would require troughing, and 7.52% could not be accessed even with troughing preparation. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. RAMS Model for Terrestrial Pathways Version 3. 0 (for microcomputers). Model-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Niebla, E.

    1989-01-01

The RAMS Model for Terrestrial Pathways is a computer program for calculating numeric criteria for land application and for distribution and marketing of sludges under the sewage-sludge regulations at 40 CFR Part 503. The risk-assessment models covered assume that municipal sludge with specified characteristics is spread across a defined area of ground at a known rate once each year for a given number of years. Risks associated with direct land application and with sludge applied after distribution and marketing are both calculated. The computer program calculates the maximum annual loading of contaminants that can be land-applied and still meet the risk criteria specified as input. Software Description: The program is written in the Turbo Basic programming language for IBM PC/AT or compatible machines using the DOS 3.0 or higher operating system. Minimum core storage is 512K.

  11. Bubble Chamber Research Group Microcomputer Unit

    International Nuclear Information System (INIS)

    Bairstow, R.; Barlow, J.; Mace, P.R.; Seller, P.; Waters, M.; Watson, J.G.

    1982-05-01

A distributed data acquisition system has been developed by the Bubble Chamber Research Group at the Rutherford Appleton Laboratory for use with their film measuring machines. The system is based upon a set of microcomputers linked to a VAX 11/780 computer in a local area network of the star type that uses a packet-switching technique. Each film measuring machine is equipped with a microcomputer which controls the functions of the table, buffers data and enhances the interface between operators and machines. This paper provides a detailed description of each microcomputer and can be used as a reference manual for these computers. (author)

  12. Compression-recovery model of absorptive glass mat (AGM) separator guided by X-ray micro-computed tomography analysis

    Science.gov (United States)

    Kameswara Rao, P. V.; Rawal, Amit; Kumar, Vijay; Rajput, Krishn Gopal

    2017-10-01

    Absorptive glass mat (AGM) separators play a key role in enhancing the cycle life of the valve regulated lead acid (VRLA) batteries by maintaining the elastic characteristics under a defined level of compression force with the plates of the electrodes. Inevitably, there are inherent challenges to maintain the required level of compression characteristics of AGM separators during the charge and discharge of the battery. Herein, we report a three-dimensional (3D) analytical model for predicting the compression-recovery behavior of AGM separators by formulating a direct relationship with the constituent fiber and structural parameters. The analytical model of compression-recovery behavior of AGM separators has successfully included the fiber slippage criterion and internal friction losses. The presented work uses, for the first time, 3D data of fiber orientation from X-ray micro-computed tomography, for predicting the compression-recovery behavior of AGM separators. A comparison has been made between the theoretical and experimental results of compression-recovery behavior of AGM samples with defined fiber orientation characteristics. In general, the theory agreed reasonably well with the experimental results of AGM samples in both dry and wet states. Through theoretical modeling, fiber volume fraction was established as one of the key structural parameters that modulates the compression hysteresis of an AGM separator.

  13. Effect of Post Space Preparation on Apical Obturation Quality of Teeth Obturated with Different Techniques: A Micro-computed Tomographic Study.

    Science.gov (United States)

    Küçükkaya Eren, Selen; Askerbeyli Örs, Sevinc; Yılmaz, Zeliha

    2017-07-01

    The purpose of this study was to evaluate the obturation quality of root canals filled with different techniques and to determine whether post space preparation had an effect on the quality of apical obturation using micro-computed tomographic (micro-CT) imaging. The root canals of 30 human mandibular premolar teeth were instrumented, and the specimens were divided into 3 groups according to the obturation technique used: cold lateral compaction (CLC), warm vertical compaction (WVC), or single-cone (SC) techniques. The specimens were stored at 37°C and 100% humidity for 1 week. Then, the coronal root filling material was removed in order to create a post space. Micro-CT scans were performed before and after post space preparation for the volumetric analysis of voids and filling materials. Data were analyzed using repeated-measures analysis of variance and Bonferroni tests. The CLC and SC groups showed a significantly greater percentage volume of voids than the WVC group (P < .05); the difference between the CLC and SC groups was not significant before post space preparation (P > .05). The post space preparation caused a significant increase in the percentage volume of voids in the CLC and SC groups (P < .05), whereas the WVC group showed no significant change after post space preparation (P > .05). No root fillings were void free. The WVC group presented the best obturation quality. The post space preparation negatively influenced the apical integrity of the filling materials in the CLC and SC groups, whereas it had no significant effect in the WVC group. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. Trend Analysis Using Microcomputers.

    Science.gov (United States)

    Berger, Carl F.

    A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…

  15. Sketching with a microcomputer

    DEFF Research Database (Denmark)

    Jacobi, P.

    This report describes the use of a microcomputer as a tool for the sketch design phase of the building process. A housing development scheme comprising 175 dwellings is chosen for illustrating the procedures. Here the microcomputer is utilized for analysing the landscape, for the three-dimensional...

  16. Microcomputers "Goto" School.

    Science.gov (United States)

    Piele, Donald T.

    This paper is a report of a pilot project in which a microcomputer was placed in a sixth grade classroom for eight weeks for the purpose of developing logical thinking skills. Students were first given instruction on how to program the APPLE II microcomputer to draw color graphics designs; they were then given similar problems to solve using the…

  17. Using Micro-Computed Tomography to Evaluate the Dynamics of Orthodontically Induced Root Resorption Repair in a Rat Model.

    Directory of Open Access Journals (Sweden)

    Xiaolin Xu

    Full Text Available To observe dynamic changes in root resorption repair, tooth movement relapse and alveolar bone microstructure following the application of orthodontic force. Forces of 20 g, 50 g or 100 g were delivered to the left maxillary first molars of fifteen 10-week-old rats for 14 days. Each rat was subjected to micro-computed tomography scanning at 0, 3, 7, 10, 14, 28 and 42 days after force removal. The root resorption crater volume, tooth movement relapse and alveolar bone microarchitecture were measured at each time point. From day 3 to day 14, the root resorption volume decreased significantly in each group. In the 20-g force group, the root resorption volume gradually stabilized after 14 days, whereas in the 50-g and 100-g force groups, it stabilized after 28 days. In all groups, tooth movement relapsed significantly from day 0 to day 14 and then remained stable. From day 3 to day 10, the 20-g group exhibited faster relapse than the 50-g and 100-g groups. In all groups, the structure model index and trabecular separation decreased slowly from day 0 to day 10 and eventually stabilized. Trabecular number increased slowly from day 0 to day 7 and then stabilized. The initial stage of root resorption repair did not change significantly and was followed by a dramatic repair period before stabilizing. The most serious tooth movement relapse occurred immediately after the appliance was removed, and then the tooth completely returned to the original position.

  18. Assessment of three root canal preparation techniques on root canal geometry using micro-computed tomography: In vitro study

    Directory of Open Access Journals (Sweden)

    Shaikha M Al-Ali

    2012-01-01

    Full Text Available Aim: To assess the effects of three root canal preparation techniques on canal volume and surface area using three-dimensionally reconstructed root canals in extracted human maxillary molars. Materials and Methods: Thirty extracted human maxillary molars having three separate roots and similar root shape were randomly selected from a pool of extracted teeth for this study and stored in normal saline solution until used. A computed tomography scanner (Philips Brilliance CT 64-slice) was used to analyze root canals in extracted maxillary molars. Specimens were scanned before and after canals were prepared using stainless steel K-Files, Ni-Ti rotary ProTaper and rotary SafeSiders instruments. Differences in dentin volume removed, the surface area, the proportion of unchanged area and canal transportation were calculated using specially developed software. Results: Instrumentation of canals increased volume and surface area. Statistical analysis found a statistically significant difference among the 3 groups in total change in volume (P = 0.001) and total change in surface area (P = 0.13). Significant differences were found when testing both groups with group III (SafeSiders). Significant differences in change of volume were noted when grouping was made with respect to canal type (in MB and DB) (P < 0.05). Conclusion: The current study used computed tomography, an innovative and nondestructive technique, to illustrate changes in canal geometry. Overall, there were few statistically significant differences between the three instrumentation techniques used. SafeSiders stainless steel 40/0.02 instruments exhibit a greater cutting efficiency on dentin than K-Files and ProTaper. CT is a new and valuable tool to study root canal geometry and changes after preparation in great detail. Further studies with 3D-techniques are required to fully understand the biomechanical aspects of root canal preparation.

  19. The effect of a manual instrumentation technique on five types of premolar root canal geometry assessed by microcomputed tomography and three-dimensional reconstruction

    International Nuclear Information System (INIS)

    Li, Ke-Zeng; Gao, Yuan; Zhang, Ru; Hu, Tao; Guo, Bin

    2011-01-01

    Together with diagnosis and treatment planning, a good knowledge of the root canal system and its frequent variations is a necessity for successful root canal therapy. The selection of instrumentation techniques for variants in internal anatomy of teeth has significant effects on the shaping ability and cleaning effectiveness. The aim of this study was to reveal the differences made by including variations in the internal anatomy of premolars into the study protocol for investigation of a single instrumentation technique (hand ProTaper instruments) assessed by microcomputed tomography and three-dimensional reconstruction. Five single-root premolars, whose root canal systems were classified into one of five types, were scanned with micro-CT before and after preparation with a hand ProTaper instrument. Instrumentation characteristics were measured quantitatively in 3-D using a customized application framework based on MeVisLab. Numeric values were obtained for canal surface area, volume, volume changes, percentage of untouched surface, dentin wall thickness, and the thickness of dentin removed. Preparation errors were also evaluated using a color-coded reconstruction. Canal volumes and surface areas were increased after instrumentation. Prepared canals of all five types were straightened, with transportation toward the inner aspects of S-shaped or multiple curves. However, a ledge was formed at the apical third curve of the type II canal system and a wide range in the percentage of unchanged canal surfaces (27.4-83.0%) was recorded. The dentin walls were more than 0.3 mm thick except in a 1 mm zone from the apical surface and the hazardous area of the type II canal system after preparation with an F3 instrument. The 3-D color-coded images showed different morphological changes in the five types of root canal systems shaped with the same hand instrumentation technique. 
Premolars are among the most complex teeth for root canal treatment and instrumentation techniques

  20. The effect of a manual instrumentation technique on five types of premolar root canal geometry assessed by microcomputed tomography and three-dimensional reconstruction

    Directory of Open Access Journals (Sweden)

    Hu Tao

    2011-06-01

    Full Text Available Abstract Background Together with diagnosis and treatment planning, a good knowledge of the root canal system and its frequent variations is a necessity for successful root canal therapy. The selection of instrumentation techniques for variants in internal anatomy of teeth has significant effects on the shaping ability and cleaning effectiveness. The aim of this study was to reveal the differences made by including variations in the internal anatomy of premolars into the study protocol for investigation of a single instrumentation technique (hand ProTaper instruments assessed by microcomputed tomography and three-dimensional reconstruction. Methods Five single-root premolars, whose root canal systems were classified into one of five types, were scanned with micro-CT before and after preparation with a hand ProTaper instrument. Instrumentation characteristics were measured quantitatively in 3-D using a customized application framework based on MeVisLab. Numeric values were obtained for canal surface area, volume, volume changes, percentage of untouched surface, dentin wall thickness, and the thickness of dentin removed. Preparation errors were also evaluated using a color-coded reconstruction. Results Canal volumes and surface areas were increased after instrumentation. Prepared canals of all five types were straightened, with transportation toward the inner aspects of S-shaped or multiple curves. However, a ledge was formed at the apical third curve of the type II canal system and a wide range in the percentage of unchanged canal surfaces (27.4-83.0% was recorded. The dentin walls were more than 0.3 mm thick except in a 1 mm zone from the apical surface and the hazardous area of the type II canal system after preparation with an F3 instrument. Conclusions The 3-D color-coded images showed different morphological changes in the five types of root canal systems shaped with the same hand instrumentation technique. Premolars are among the most

  1. Comparison of three retreatment techniques with ultrasonic activation in flattened canals using micro-computed tomography and scanning electron microscopy.

    Science.gov (United States)

    Bernardes, R A; Duarte, M A H; Vivan, R R; Alcalde, M P; Vasconcelos, B C; Bramante, C M

    2015-08-17

    To use micro-CT to quantitatively evaluate the amount of residual filling material after using several techniques to remove root fillings with and without ultrasonic activation and to analyse the cleanliness of the root canal walls and dentine tubules with scanning electron microscopy (SEM). The root canals of one hundred and eight human mandibular incisors were selected and instrumented with rotary files using the BioRace system up to file size 40, .04 taper. After instrumentation, the teeth were filled using a hybrid technique with gutta-percha and sealer then divided into three groups according to the method used for removing the root filling: G1-Reciproc (using only instrument R50), G2-ProTaper Universal retreatment system and G3-Manual (hand files and Gates-Glidden burs). All groups were divided into two subgroups depending on whether ultrasonic agitation was used with the irrigants. Micro-CT scans were taken before and after removal of the filling material to detect residual material in the canal. After micro-CT analysis, the roots were cut in half, imaged by SEM and scored based on the amount of surface covered by root filling remnants. The data were analysed statistically using a significance level of 5%. All groups had retained material in the root canals after instrumentation. The Reciproc method was associated with less retained material than the ProTaper and Manual methods. Ultrasonic activation significantly reduced the amount of residual root filling in all groups (P < 0.05), although no technique completely removed the filling material. Ultrasonic activation improved the removal of root filling material in all groups. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  2. A Microcomputer Program that Simulates the Baumol-Tobin Transactions Demand for Money.

    Science.gov (United States)

    Beckman, Steven

    1987-01-01

    This article describes an economic model dealing with the demand for money and a microcomputer program which enables students to experiment with cash management techniques. By simulating personal experiences, the program teaches how changes in income, interest rates, and charges for exchanging bonds and cash affect money demand. (Author/JDH)
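The cash-management rule such a simulation rests on is the Baumol-Tobin square-root formula: the optimal withdrawal size is C* = sqrt(2bY/i) and average money holding is C*/2, where Y is spending per period, i the interest rate, and b the cost per bond-cash exchange. A minimal sketch of that formula in Python follows; it is a generic illustration of the model, not the program described in the article, and the function name and parameterization are assumptions:

```python
import math

def baumol_tobin(income, interest_rate, conversion_cost):
    """Baumol-Tobin square-root rule.

    income: total spending per period (Y)
    interest_rate: opportunity cost of holding cash per period (i)
    conversion_cost: fixed charge per bond-cash exchange (b)
    Returns (optimal withdrawal C*, average money demand C*/2, trips per period).
    """
    optimal_withdrawal = math.sqrt(2 * conversion_cost * income / interest_rate)
    avg_money_demand = optimal_withdrawal / 2
    trips = income / optimal_withdrawal
    return optimal_withdrawal, avg_money_demand, trips
```

The rule reproduces the comparative statics the program teaches: demand for money rises with the square root of income and of the exchange charge, and falls with the square root of the interest rate.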

  3. Tevatron extraction microcomputer

    International Nuclear Information System (INIS)

    Chapman, L.; Finley, D.A.; Harrison, M.; Merz, W. (Fermi National Accelerator Lab., Batavia, IL)

    1985-01-01

    Extraction in the Fermilab Tevatron is controlled by a multi-processor Multibus microcomputer system called QXR (Quad eXtraction Regulator). QXR monitors several analog beam signals and controls three sets of power supplies: the ''bucker'' and ''pulse'' magnets at a rate of 5760 Hz, and the ''QXR'' magnets at 720 Hz. QXR supports multiple slow spills (up to a total of 35 seconds) with multiple fast pulses intermixed. It linearizes the slow spill and bucks out the high frequency components. Fast extraction is done by outputting a variable pulse waveform. Closed loop learning techniques are used to improve performance from cycle to cycle for both slow and fast extraction. The system is connected to the Tevatron clock system so that it can track the machine cycle. QXR is also connected to the rest of the Fermilab control system, ACNET. Through ACNET, human operators and central computers can monitor and control extraction through communications with QXR. The controls hardware and software both employ some standard and some specialized components. This paper gives an overview of QXR as a control system; another paper (1) summarizes performance

  4. Tevatron extraction microcomputer

    International Nuclear Information System (INIS)

    Chapman, L.; Finley, D.A.; Harrison, M.; Merz, W.

    1985-06-01

    Extraction in the Fermilab Tevatron is controlled by a multi-processor Multibus microcomputer system called QXR (Quad eXtraction Regulator). QXR monitors several analog beam signals and controls three sets of power supplies: the ''bucker'' and ''pulse'' magnets at a rate of 5760 Hz, and the ''QXR'' magnets at 720 Hz. QXR supports multiple slow spills (up to a total of 35 seconds) with multiple fast pulses intermixed. It linearizes the slow spill and bucks out the high frequency components. Fast extraction is done by outputting a variable pulse waveform. Closed loop learning techniques are used to improve performance from cycle to cycle for both slow and fast extraction. The system is connected to the Tevatron clock system so that it can track the machine cycle. QXR is also connected to the rest of the Fermilab control system, ACNET. Through ACNET, human operators and central computers can monitor and control extraction through communications with QXR. The controls hardware and software both employ some standard and some specialized components. This paper gives an overview of QXR as a control system; another paper summarizes performance

  5. Microcomputer model for an analysis of the financial feasibility of a mining project

    International Nuclear Information System (INIS)

    Ciruelos, J.; Duchene, M.

    1983-01-01

    The model presented permits a simulation of the predicted profitability of a mining project at the stage of feasibility studies by making use of a simple individual computer, the Apple II. The model can be used to treat the following three areas: definition of the mode of financing the project and calculation of the financial flows which make it possible to evaluate the profitability of this project; analysis of sensitivity, which makes it possible to determine the most critical variables for the future of the project; and analysis of the risk. [French]

  6. Mathematical modelling techniques

    CERN Document Server

    Aris, Rutherford

    1995-01-01

    ""Engaging, elegantly written."" - Applied Mathematical ModellingMathematical modelling is a highly useful methodology designed to enable mathematicians, physicists and other scientists to formulate equations from a given nonmathematical situation. In this elegantly written volume, a distinguished theoretical chemist and engineer sets down helpful rules not only for setting up models but also for solving the mathematical problems they pose and for evaluating models.The author begins with a discussion of the term ""model,"" followed by clearly presented examples of the different types of mode

  7. Radiological and micro-computed tomography analysis of the bone at dental implants inserted 2, 3 and 4 mm apart in a minipig model with platform switching incorporated.

    Science.gov (United States)

    Elian, Nicolas; Bloom, Mitchell; Dard, Michel; Cho, Sang-Choon; Trushkowsky, Richard D; Tarnow, Dennis

    2014-02-01

    The purpose of this study was to assess the effect of inter-implant distance on interproximal bone utilizing platform switching. Analysis of interproximal bone usually depends on traditional two-dimensional radiographic assessment. Although there has been increased reliability of current techniques, there has been an inability to track bone level changes over time and in three dimensions. Micro-CT has provided three-dimensional imaging that can be used in conjunction with traditional two-dimensional radiographic techniques. This study was performed on 24 female minipigs. Twelve animals received three implants with an inter-implant distance of 3 mm on one side of the mandible and another three implants on the contra-lateral side, where the implants were placed 2 mm apart, creating a split-mouth design. Twelve other animals received three implants with an inter-implant distance of 3 mm on one side of the mandible and another three implants on the contra-lateral side, where the implants were placed 4 mm apart, again creating a split-mouth design. The quantitative evaluation was performed comparatively on radiographs taken at t = 0 (immediately after implantation) and at t = 8 weeks (after termination). The samples were scanned by micro-computed tomography (μCT) to quantify the first bone to implant contact (fBIC) and bone volume/total volume (BV/TV). Mixed model regressions using the nonparametric Brunner-Langer method were used to determine the effect of inter-implant distance on the measured outcomes. The change in bone level was determined using radiography; its mean was 0.05 mm for an inter-implant distance of 3 mm and 0.00 mm for a 2 mm distance (P = 0.7268). The mean of this outcome was 0.18 mm for both the 3 mm and 4 mm inter-implant distances (P = 0.9500). Micro-computed tomography showed that the fBIC was always located above the reference, 0.27 and 0.20 mm for the comparison of 2-3 mm (P = 0.4622) and 0.49 and 0.34 mm for the inter-implant distance of 3 and 4 mm (P

  8. Boundary representation modelling techniques

    CERN Document Server

    2006-01-01

    Provides the most complete presentation of boundary representation solid modelling yet publishedOffers basic reference information for software developers, application developers and users Includes a historical perspective as well as giving a background for modern research.

  9. Micro-computed tomography (CT) based assessment of dental regenerative therapy in the canine mandible model

    Science.gov (United States)

    Khobragade, P.; Jain, A.; Setlur Nagesh, S. V.; Andreana, S.; Dziak, R.; Sunkara, S. K.; Sunkara, S.; Bednarek, D. R.; Rudin, S.; Ionita, C. N.

    2015-03-01

    High-resolution 3D bone-tissue structure measurements may provide information critical to the understanding of the bone regeneration processes and to the bone strength assessment. Tissue engineering studies rely on such nondestructive measurements to monitor the bone graft regeneration area. In this study, we measured bone yield, fractal dimension and trabecular thickness through micro-CT slices for different grafts and controls. Eight canines underwent surgery to remove a bone volume (defect) in the canine's jaw at a total of 44 different locations. We kept 11 defects empty for control and filled the remaining ones with three regenerative materials: NanoGen (NG), an FDA-approved material (n=11), a novel NanoCalcium Sulfate (NCS) material (n=11) and an NCS alginate (NCS+alg) material (n=11). After a minimum of four and eight weeks, the canines were sacrificed and the jaw samples were extracted. We used a custom-built micro-CT system to acquire the data volume and developed software to measure the bone yield, fractal dimension and trabecular thickness. The software used a segmentation algorithm based on histograms derived from volumes of interest indicated by the operator. Using bone yield and fractal dimension as indices, we are able to differentiate between the control and regenerative materials (p<0.005). Regenerative material NCS showed an average 63.15% bone yield improvement over the control sample, NCS+alg showed 55.55% and NanoGen showed 37.5%. The bone regeneration process and quality of bone were dependent upon the position of the defect and the time period of healing. This study presents one of the first quantitative comparisons using nondestructive micro-CT analysis for bone regenerative materials in a large animal with a critical defect model. Our results indicate that micro-CT measurement could be used to monitor in vivo bone regeneration studies for greater understanding of the regenerative process.
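The histogram-based segmentation described in the abstract reduces, at its core, to thresholding the operator's volume of interest and counting bone voxels. The sketch below is a generic illustration of that idea, not the authors' software; the function names and the crude two-peak threshold pick are assumptions:

```python
import numpy as np

def bone_yield(voi, threshold):
    """Fraction of voxels in the volume of interest (VOI) whose
    attenuation is at or above the bone threshold."""
    return float((np.asarray(voi) >= threshold).mean())

def histogram_threshold(voi, bins=256):
    """Crude threshold pick: midpoint between the dominant peaks of the
    lower (background) and upper (mineralized tissue) histogram halves."""
    counts, edges = np.histogram(voi, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    lo_peak = centers[counts[: bins // 2].argmax()]
    hi_peak = centers[bins // 2 + counts[bins // 2 :].argmax()]
    return (lo_peak + hi_peak) / 2
```

In practice the threshold would be reviewed per VOI, since graft material and regenerating bone can overlap in attenuation.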

  10. New techniques for subdivision modelling

    OpenAIRE

    BEETS, Koen

    2006-01-01

    In this dissertation, several tools and techniques for modelling with subdivision surfaces are presented. Based on the huge amount of theoretical knowledge about subdivision surfaces, we present techniques to facilitate practical 3D modelling which make subdivision surfaces even more useful. Subdivision surfaces reclaimed attention several years ago after their application in full-featured 3D animation movies, such as Toy Story. Since then, owing to their attractive properties, an ever i...

  11. Survey of semantic modeling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C.L.

    1975-07-01

    The analysis of the semantics of programing languages was attempted with numerous modeling techniques. By providing a brief survey of these techniques together with an analysis of their applicability for answering semantic issues, this report attempts to illuminate the state-of-the-art in this area. The intent is to be illustrative rather than thorough in the coverage of semantic models. A bibliography is included for the reader who is interested in pursuing this area of research in more detail.

  12. A microcomputer-based model for identifying urban and suburban roadways with critical large truck accident rates

    International Nuclear Information System (INIS)

    Brogan, J.D.; Cashwell, J.W.

    1992-01-01

    This paper presents an overview of techniques for merging highway accident record and roadway inventory files and employing the combined data set to identify spots or sections on highway facilities in urban and suburban areas with unusually high large truck accident rates. A statistical technique, the rate/quality control method, is used to calculate a critical rate for each location of interest. This critical rate may then be compared to the location's actual accident rate to identify locations for further study. Model enhancements and modifications are described to enable the technique to be employed in the evaluation of routing alternatives for the transport of radioactive material
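The rate/quality control method referenced above is standard in traffic safety engineering: a location's critical rate is the average rate for similar locations plus a confidence term that shrinks as exposure grows, and a location is flagged when its actual rate exceeds that critical rate. A minimal sketch follows; the function names and units are assumptions, since the abstract does not show the model's actual implementation:

```python
import math

def critical_rate(avg_rate, exposure_mvm, k=1.645):
    """Rate/quality control critical accident rate.

    avg_rate: average accident rate for similar locations
              (accidents per million vehicle-miles)
    exposure_mvm: exposure at this location, in million vehicle-miles
    k: normal deviate for the chosen confidence level (1.645 ~ 95%)
    """
    return avg_rate + k * math.sqrt(avg_rate / exposure_mvm) + 1.0 / (2.0 * exposure_mvm)

def flag_location(actual_rate, avg_rate, exposure_mvm, k=1.645):
    """Flag a spot or section whose actual rate exceeds its critical rate."""
    return actual_rate > critical_rate(avg_rate, exposure_mvm, k)
```

Because the confidence term depends on exposure, a low-volume section must exceed the average by a wide margin before it is flagged, while a heavily travelled section is flagged for a much smaller excess.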

  13. Microcomputer interfacing and applications

    CERN Document Server

    Mustafa, M A

    1990-01-01

    This is the applications guide to interfacing microcomputers. It offers practical non-mathematical solutions to interfacing problems in many applications including data acquisition and control. Emphasis is given to the definition of the objectives of the interface, then comparing possible solutions and producing the best interface for every situation. Dr Mustafa A Mustafa is a senior designer of control equipment and has written many technical articles and papers on the subject of computers and their application to control engineering.

  14. Microcomputers and computer networks

    International Nuclear Information System (INIS)

    Owens, J.L.

    1976-01-01

    Computers, for all their speed and efficiency, have their foibles and failings. Until the advent of minicomputers, users often had to supervise their programs personally to make sure they executed correctly. Minicomputers could take over some of these chores, but they were too expensive to be dedicated to any but the most vital services. Inexpensive, easily programmed microcomputers are easing this limitation, and permitting a flood of new applications. 3 figures

  15. Microcomputer-controlled world time display for public area viewing

    Science.gov (United States)

    Yep, S.; Rashidian, M.

    1982-05-01

    The design, development, and implementation of a microcomputer-controlled world clock is discussed. The system, designated International Time Display System (ITDS), integrates a Geochron Calendar Map and a microcomputer-based digital display to automatically compensate for daylight saving time, leap year, and time zone differences. An in-depth technical description of the design and development of the electronic hardware, firmware, and software systems is provided. Reference material on the time zones, fabrication techniques, and electronic subsystems is also provided.

  16. Advanced Atmospheric Ensemble Modeling Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Chiswell, S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kurzeja, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Maze, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Viner, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Werth, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-29

    Ensemble modeling (EM), the creation of multiple atmospheric simulations for a given time period, has become an essential tool for characterizing uncertainties in model predictions. We explore two novel ensemble modeling techniques: (1) perturbation of model parameters (Adaptive Programming, AP), and (2) data assimilation (Ensemble Kalman Filter, EnKF). The current research is an extension of work from last year and examines transport on a small spatial scale (<100 km) in complex terrain, for more rigorous testing of the ensemble technique. Two different release cases were studied, a coastal release (SF6) and an inland release (Freon) which consisted of two release times. Observations of tracer concentration and meteorology are used to judge the ensemble results. In addition, adaptive grid techniques have been developed to reduce required computing resources for transport calculations. Using a 20-member ensemble, the standard approach generated downwind transport that was quantitatively good for both releases; however, the EnKF method produced additional improvement for the coastal release where the spatial and temporal differences due to interior valley heating led to the inland movement of the plume. The AP technique showed improvements for both release cases, with more improvement shown in the inland release. This research demonstrated that transport accuracy can be improved when models are adapted to a particular location/time or when important local data is assimilated into the simulation and enhances SRNL's capability in atmospheric transport modeling in support of its current customer base and local site missions, as well as our ability to attract new customers within the intelligence community.
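The Ensemble Kalman Filter assimilation step mentioned above can be illustrated with a minimal perturbed-observation analysis update: each ensemble member is nudged toward a randomly perturbed copy of the observations using a gain built from the ensemble's own sample covariances. This is a textbook sketch, not SRNL's implementation; the array shapes, names, and diagonal observation-error assumption are illustrative:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_err_var, rng):
    """Perturbed-observation EnKF analysis step.

    ensemble: (n_members, n_state) forecast states
    obs: (n_obs,) observation vector
    obs_operator: (n_obs, n_state) linear observation operator H
    obs_err_var: observation error variance (scalar; diagonal R assumed)
    """
    n = ensemble.shape[0]
    X = ensemble
    A = X - X.mean(axis=0)            # state anomalies
    HX = X @ obs_operator.T           # ensemble mapped to observation space
    HA = HX - HX.mean(axis=0)
    # Sample covariances and Kalman gain K = P H^T (H P H^T + R)^-1
    P_HH = HA.T @ HA / (n - 1) + obs_err_var * np.eye(len(obs))
    P_xH = A.T @ HA / (n - 1)
    K = P_xH @ np.linalg.inv(P_HH)
    # Perturb observations so the analysis spread is statistically consistent
    obs_pert = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=(n, len(obs)))
    return X + (obs_pert - HX) @ K.T
```

After the update, the ensemble mean is pulled toward the observations and the spread of the observed components shrinks, which is the mechanism by which assimilating local tracer and meteorological data improves the transport forecast.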

  17. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and were respectively instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU) to #25/0.08 following the recommended protocol. Time for root canal instrumentation (accumulation of time for every single file) was recorded. The 0-3 mm root surface from the apex was observed under an optical stereomicroscope at 25× magnification. The presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional shape images of canals were reconstructed, calculated and evaluated. The amount of canal central transportation in the two groups was calculated and compared. A shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P < 0.05). The TF group also showed less canal central transportation than the PU group [(0.097 ± 0.084) mm in the PU group, P < 0.05]. No instrument separation was observed in either group. Cracks were not found in either group, based on micro-CT images or observation under an optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation and exhibited better shaping ability and less canal transportation.

  18. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    There are now many business process modelling techniques, and this article investigates the differences among them. For each technique, the definition and structure are explained. The paper presents a comparative analysis of several popular business process modelling techniques, using a comparative framework based on two criteria: the notation, and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use, and serves as a basis for evaluating further modelling techniques.

  19. Microcomputer Typewriting in Business Education.

    Science.gov (United States)

    Schmidt, B. June; Stewart, Jeffrey R.

    1983-01-01

    Describes a research project on the role of the instructor in managing microcomputer typewriting instruction. The teachers selected software, familiarized students with the equipment, provided support, monitored progress, helped students establish goals, and provided instructional activities. (JOW)

  20. Microcomputer Applications for Teaching Microeconomic Concepts: Some Old and New Approaches.

    Science.gov (United States)

    Smith, L. Murphy; Smith, L. C., Jr.

    1989-01-01

    Presents microcomputer programs and programing techniques and demonstrates how these programs can be used by teachers to explain economics concepts and to help students make judgments. Each microcomputer application is supplemented by traditional graphic and mathematical analysis. Discusses applications dealing with supply, demand, elasticity,…

  1. Use of microcomputers for planning and managing silviculture habitat relationships.

    Science.gov (United States)

    B.G. Marcot; R.S. McNay; R.E. Page

    1988-01-01

    Microcomputers aid in monitoring, modeling, and decision support for integrating objectives of silviculture and wildlife habitat management. Spreadsheets, data bases, statistics, and graphics programs are described for use in monitoring. Stand growth models, modeling languages, area and geobased information systems, and optimization models are discussed for use in...

  2. Three-dimensional quantification of orthodontic root resorption with time-lapsed imaging of micro-computed tomography in a rodent model.

    Science.gov (United States)

    Yang, Chongshi; Zhang, Yuanyuan; Zhang, Yan; Fan, Yubo; Deng, Feng

    2015-01-01

    Although various X-ray approaches have been widely used to monitor root resorption after orthodontic treatment, a non-invasive and accurate method is highly desirable for long-term follow-up. The aim of this study was to build a non-invasive method to quantify longitudinal orthodontic root resorption with time-lapsed images of micro-computed tomography (micro-CT) in a rodent model. Twenty male Sprague Dawley (SD) rats (aged 6-8 weeks, weighing 180-220 g) were used in this study. A 25 g orthodontic force generated by a nickel-titanium coil spring was applied to the right maxillary first molar of each rat, while the contralateral first molar served as a control. Micro-CT scans were performed at day 0 (before orthodontic load) and days 3, 7, 14, and 28 after orthodontic load. Resorption of the mesial root of the maxillary first molars on both sides was calculated from the micro-CT images with a registration algorithm, via reconstruction, superimposition and partition operations. Obvious resorption of the mesial root of the maxillary first molar could be detected at day 14 and day 28 on the orthodontic side. Most of the resorption occurred in the apical region on the distal side and the cervical region on the mesiolingual side. Normal development of the molar root, concentrated in the apical region, was identified from day 0 to day 28 on the control side. This non-invasive 3D quantification method with a registration algorithm can be used in longitudinal studies of root resorption. Obvious root resorption in rat molars can be observed three-dimensionally at day 14 and day 28 after orthodontic load, indicating that a registration algorithm combined with time-lapsed images has potential clinical application in the detection and quantification of root contour.
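
    After superimposition (registration) of the day-N scan onto the day-0 scan, the quantification step reduces to voxel-wise subtraction of binarized root masks; a minimal sketch, assuming the registration has already been done and using an illustrative voxel size:

```python
import numpy as np

def resorption_volume(root_day0, root_dayN, voxel_size_mm=0.010):
    """Resorbed root volume (mm^3) between two binarized micro-CT masks.

    Assumes the day-N mask has already been rigidly registered
    (superimposed) onto the day-0 mask; the voxel size is illustrative.
    """
    lost = root_day0 & ~root_dayN   # mineralized at day 0, gone at day N
    return lost.sum() * voxel_size_mm ** 3

# toy volumes: a cubic "root" that loses a 2x2x2-voxel apical patch
day0 = np.zeros((20, 20, 20), dtype=bool)
day0[5:15, 5:15, 5:15] = True
dayN = day0.copy()
dayN[5:7, 5:7, 5:7] = False
v = resorption_volume(day0, dayN)   # 8 voxels -> 8e-6 mm^3
```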

  3. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  4. Microcomputer data acquisition and control.

    Science.gov (United States)

    East, T D

    1986-01-01

    In medicine and biology there are many tasks that involve routine, well-defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down, the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming, and sadly most computer salespersons are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with the common microcomputers. The chapter covers the following issues necessary to establish a real-time data acquisition and control system: analysis of the research problem (definition of the problem; description of data and sampling requirements; cost/benefit analysis); choice of microcomputer hardware and software (choice of microprocessor and bus structure; choice of operating system; choice of layered software); digital data acquisition (parallel data transmission; serial data transmission; hardware and software available); analog data acquisition (description of amplitude and frequency characteristics of the input signals; the sampling theorem; specification of the analog-to-digital converter; hardware and software available; interface to the microcomputer); microcomputer control (analog output; digital output; closed-loop control); and microcomputer data acquisition and control in the 21st century (high-speed digital medical equipment networks; medical decision making and artificial intelligence).
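
    The sampling-theorem and ADC-specification items in the outline above amount to a short back-of-envelope calculation; a sketch under assumed values (the 5x sampling margin and the example signal are illustrative, not from the chapter):

```python
def adc_plan(signal_bandwidth_hz, full_scale_v, bits, margin=5):
    """Back-of-envelope specification for one data-acquisition channel.

    The sampling theorem requires fs > 2 * bandwidth; the extra margin
    (5x here, an assumption) eases anti-aliasing filter design.
    Returns (sampling rate in samples/s, resolution in volts per count).
    """
    fs = margin * 2 * signal_bandwidth_hz
    lsb = full_scale_v / 2 ** bits
    return fs, lsb

# e.g. a physiological pressure waveform with ~25 Hz of useful bandwidth
fs, lsb = adc_plan(signal_bandwidth_hz=25, full_scale_v=10.0, bits=12)
# fs = 250 samples/s; lsb is about 2.44 mV per count
```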

  5. Microcomputed tomography-based assessment of retrieved dental implants

    NARCIS (Netherlands)

    Narra, N.; Antalainen, A.K.; Zipprich, H.; Sándor, G.K.; Wolff, J.

    2015-01-01

    Purpose: The aim of this study was to demonstrate the potential of microcomputed tomography (micro-CT) technology in the assessment of retrieved dental implants. Cases are presented to illustrate the value of micro-CT imaging techniques in determining possible mechanical causes for dental implant

  6. Reactor physics using a microcomputer

    International Nuclear Information System (INIS)

    Murray, R.L.

    1983-01-01

    The object of the work reported is to develop educational computer modules for all aspects of reactor physics. The modules consist of a description of the theory, the mathematical method, a computer program listing, sample calculations, and problems for the student, along with a card deck. Modules were first written in FORTRAN for an IBM 360/75, and later in BASIC for microcomputers. Problems include: limitations of equipment, choice of format for the program, the variety of dialects of BASIC used by different microcomputer and peripheral brands, and knowing when to quit in the process of developing a program

  7. A full automatic system controlled with IBM-PC/XT micro-computer for neutron activation analysis

    International Nuclear Information System (INIS)

    Song Quanxun

    1992-01-01

    A fully automatic system controlled with micro-computers for NAA is described. All processes are completed automatically with an IBM-PC/XT micro-computer. The device is stable, reliable, flexible and convenient to use, and has many functions and applications in the automatic analysis of long-, middle- and short-lived nuclides. Due to the high working efficiency of the instrument and micro-computers, both time and power can be saved. The method can be applied to other nuclear analysis techniques

  8. Bisphosphonate effects in rat unloaded hindlimb bone loss model: three-dimensional microcomputed tomographic, histomorphometric, and densitometric analyses.

    Science.gov (United States)

    Barou, O; Lafage-Proust, M H; Martel, C; Thomas, T; Tirode, F; Laroche, N; Barbier, A; Alexandre, C; Vico, L

    1999-10-01

    The effects of antiresorptive drugs on bone loss remain unclear. Using three-dimensional microtomography, dual X-ray densitometry, and histomorphometry, we evaluated tiludronate effects in the bone loss model of immobilization in tail-suspended rats after 7, 13, and 23 days. Seventy-eight 12-week-old Wistar male rats were assigned to 13 groups: 1 baseline group and, for each time point, 1 control group treated with vehicle and 3 tail-suspended groups treated with either tiludronate (0.5 or 5 mg/kg) or vehicle, administered s.c. every other day during the last week before sacrifice. In primary spongiosa (ISP), immobilization-induced bone loss plateaued after day 7 and was prevented by tiludronate. In secondary spongiosa (IISP), bone loss appeared at day 13 with a decrease in trabecular thickness and trabecular number (Tb.N) as assessed by three-dimensional microtomography. Osteoclastic parameters did not differ in tail-suspended rats versus control rats, whereas bone formation showed a biphasic pattern: after a marked decrease at day 7, osteoblastic activity and recruitment normalized at days 13 and 23, respectively. At day 23, the 80% decrease in bone mass was fully prevented by high-dose tiludronate with an increase in Tb.N, without preventing trabecular thinning. In summary, at day 7, tiludronate prevented bone loss in ISP. After day 13, tiludronate prevented bone loss in ISP and IISP despite a further decrease in bone formation. Thus, the preventive effects of tiludronate in this model may be related to the alteration in bone modeling, with an increase in Tb.N in ISP and subsequently in IISP.

  9. Application of in vivo micro-computed tomography in the temporal characterisation of subchondral bone architecture in a rat model of low-dose monosodium iodoacetate-induced osteoarthritis

    Science.gov (United States)

    2011-01-01

    Introduction: Osteoarthritis (OA) is a complex, multifactorial joint disease affecting both the cartilage and the subchondral bone. Animal models of OA aid in understanding the pathogenesis of OA and in testing suitable drugs for OA treatment. In this study we characterized the temporal changes in the tibial subchondral bone architecture in a rat model of low-dose monosodium iodoacetate (MIA)-induced OA using in vivo micro-computed tomography (micro-CT). Methods: Male Wistar rats received a single intra-articular injection of low-dose MIA (0.2 mg) in the right knee joint and sterile saline in the left knee joint. The animals were scanned in vivo by micro-CT at two, six, and ten weeks post-injection, analogous to early, intermediate, and advanced stages of OA, to assess architectural changes in the tibial subchondral bone. The articular cartilage changes in the tibiae were assessed macroscopically and histologically at ten weeks post-injection. Results: Interestingly, tibiae of the MIA-injected knees showed significant bone loss at two weeks, followed by increased trabecular thickness and separation at six and ten weeks. The trabecular number was decreased at all time points compared to control tibiae. The tibial subchondral plate thickness of the MIA-injected knee was increased at two and six weeks, and the plate porosity was increased at all time points compared to control. At ten weeks, histology revealed loss of proteoglycans, chondrocyte necrosis, chondrocyte clusters, cartilage fibrillation, and delamination in the MIA-injected tibiae, whereas the control tibiae showed no changes. Micro-CT images and histology showed the presence of subchondral bone sclerosis, cysts, and osteophytes. Conclusions: These findings demonstrate that the low-dose MIA rat model closely mimics the pathological features of progressive human OA and is therefore suitable for studying the effect of therapeutic drugs on cartilage and bone in a non-trauma model of OA.

  10. Assessing Functional Vision Using Microcomputers.

    Science.gov (United States)

    Spencer, Simon; Ross, Malcolm

    1989-01-01

    The paper describes a software system which uses microcomputers to aid in the assessment of functional vision in visually impaired students. The software also aims to be visually stimulating and to develop hand-eye coordination, visual memory, and cognitive abilities. (DB)

  11. Microcomputers in the Introductory Laboratory.

    Science.gov (United States)

    Bare, John K.

    1982-01-01

    A microcomputer was used successfully to replicate Sternberg's 1966 study of retrieval from short-term memory and Sperling's 1960 study on sensory or iconic memory. Computers with a capacity for measuring reaction time are useful in the laboratory for introductory psychology courses. (SR)

  12. Reinforcement and Drill by Microcomputer.

    Science.gov (United States)

    Balajthy, Ernest

    1984-01-01

    Points out why drill work has a role in the language arts classroom, explores the possibilities of using a microcomputer to give children drill work, and discusses the characteristics of a good software program, along with faults found in many software programs. (FL)

  13. Portable microcomputer controlled radiation counter

    International Nuclear Information System (INIS)

    Mason, E.W.; Weber, J.M.

    1984-01-01

    A portable microcomputer-controlled counter for use as a radiation counter is described. The counter uses digital processing of input pulses from a radiation detector. The number of counts received by the microcomputer per unit time is used to calculate a value for display, using a calibration factor obtained during physical calibration of the instrument with a radiation source or with a pulse generator. The keyboard is used to enter calibration points; the number of calibration points which may be entered depends on the degree of accuracy desired by the user. The high-voltage generator which drives the detector is triggered by pulses from the microcomputer in relation to the count rate. After processing the count, the resulting count rate or dose rate is displayed on the liquid crystal display. The counter is autoranging: the decimal point is shifted as necessary by the microcomputer, and the units displayed are determined by the user by means of a multiposition switch. Low-battery and overrange conditions are displayed. An interface is provided via a connector to allow parallel transmission of data to peripheral devices. Battery power consumption is low, and the counter is capable of providing more accurate readings than currently available counters
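
    The conversion from count rate to displayed dose rate via keyboard-entered calibration points can be sketched as a simple interpolation table; the linear interpolation and the calibration values below are assumptions, not the instrument's documented algorithm:

```python
import bisect

def dose_rate(count_rate, cal_points):
    """Map a count rate to a dose rate via a table of calibration points.

    cal_points: list of (count_rate, dose_rate) pairs sorted by count rate,
    as entered from the keyboard during calibration. Values outside the
    table clamp to the nearest calibration point.
    """
    rates = [c for c, _ in cal_points]
    if count_rate <= rates[0]:
        return cal_points[0][1]
    if count_rate >= rates[-1]:
        return cal_points[-1][1]
    i = bisect.bisect_right(rates, count_rate)
    (c0, d0), (c1, d1) = cal_points[i - 1], cal_points[i]
    return d0 + (d1 - d0) * (count_rate - c0) / (c1 - c0)

# hypothetical calibration against a check source: (counts/s, uSv/h)
cal = [(0, 0.0), (100, 1.0), (1000, 9.5)]
r = dose_rate(550, cal)   # halfway between 100 and 1000 cps -> 5.25 uSv/h
```

    With more calibration points entered, the piecewise-linear table approximates the detector's response curve more closely, which matches the abstract's note that accuracy depends on the number of points entered.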

  14. History Microcomputer Games: Update 2.

    Science.gov (United States)

    Sargent, James E.

    1985-01-01

    Provides full narrative reviews of B-1 Nuclear Bomber (Avalon, 1982); American History Adventure (Social Science Microcomputer Review Software, 1985); Government Simulations (Prentice-Hall, 1985); and The Great War, FDR and the New Deal, and Hitler's War, all from New Worlds Software, 1985. Lists additional information on five other history and…

  15. Microcomputers: An Interlibrary Loan Application.

    Science.gov (United States)

    Evans, Elizabeth A.

    1984-01-01

    Description of a microcomputer-based system for local processing of interlibrary loan (ILL) requests developed at Environmental Protection Agency library discusses database management systems, hardware, databases, and command files. Subsequent changes resulting from system's implementation at East Carolina University Health Sciences Library are…

  16. Micro-computed tomography derived anisotropy detects tumor provoked deviations in bone in an orthotopic osteosarcoma murine model.

    Directory of Open Access Journals (Sweden)

    Heather A Cole

    Radiographic imaging plays a crucial role in the diagnosis of osteosarcoma. Currently, computed tomography (CT) is used to measure tumor-induced osteolysis as a marker for tumor growth by monitoring the bone fractional volume. As most tumors primarily induce osteolysis, lower bone fractional volume has been found to correlate with tumor aggressiveness. However, osteosarcoma is an exception, as it induces osteolysis and produces mineralized osteoid simultaneously. Given that competent bone is highly anisotropic (systematic variance in its architectural order renders its physical properties dependent on the direction of load) and that tumor-induced osteolysis and osteogenesis are structurally disorganized relative to competent bone, we hypothesized that μCT-derived measures of anisotropy could be used to qualitatively and quantitatively detect osteosarcoma-provoked deviations in bone, both osteolysis and osteogenesis, in vivo. We tested this hypothesis in a murine model of osteosarcoma cells orthotopically injected into the tibia. We demonstrate that, in addition to bone fractional volume, the μCT-derived measure of anisotropy is a complete and accurate method to monitor osteosarcoma-induced osteolysis. Additionally, we found that unlike bone fractional volume, anisotropy could also detect tumor-induced osteogenesis. These findings suggest that monitoring tumor-induced changes in the structural property isotropy of the invaded bone may represent a novel means of diagnosing primary and metastatic bone tumors.
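
    A common way to obtain a μCT-derived anisotropy measure is the degree of anisotropy (DA) computed from the eigenvalues of a fabric tensor (e.g. from the mean-intercept-length method); the study does not state its exact formula, so this is a hedged sketch with illustrative tensors:

```python
import numpy as np

def degree_of_anisotropy(fabric):
    """Degree of anisotropy from a 3x3 fabric tensor.

    DA = 1 - (smallest / largest eigenvalue): 0 for isotropic material,
    approaching 1 for strongly oriented trabecular architecture.
    """
    w = np.linalg.eigvalsh(fabric)   # eigenvalues in ascending order
    return 1.0 - w[0] / w[-1]

# illustrative fabric tensors (not data from the study)
trabecular = np.diag([0.2, 0.3, 0.5])        # organized, load-aligned bone
tumor = np.diag([0.32, 0.33, 0.35])          # disorganized tumor-invaded region
da_bone = degree_of_anisotropy(trabecular)   # 0.6
da_tumor = degree_of_anisotropy(tumor)       # ~0.09, closer to isotropic
```

    The drop in DA in the tumor-invaded region is the kind of deviation the study proposes to detect, for both lytic and osteogenic lesions.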

  17. A multi-channel microcomputer data acquisition system

    International Nuclear Information System (INIS)

    Loureiro, J.S.

    1987-01-01

    A data acquisition system was developed in order to transfer automatically to a 64 kb microcomputer the data generated by a nuclear spectroscopy system in a multichannel analyser. The data in the memory are stored in a floppy disk and will be further used as data entry for any spectrum analysis program, eliminating the tedious work of manually digitizing the spectrum and the possible mistakes associated with it. The developed system connected a POLYMAX 201 DP microcomputer, under CP/M operational system, to a NUCLEAR DATA MODEL ND-65 multichannel analyser and was planned for either local spectrum analysis in the microcomputer using a simplified program, or remote analysis in a mainframe using the sophisticated analysis program SAMPO. With the present system, the time spent between printing out of the 4096 channels with the multichannel analyser printer and its corresponding introduction in the analysis program has been reduced from about 6 hours to less than 2 minutes. (author)

  18. Imaging techniques for visualizing and phenotyping congenital heart defects in murine models.

    Science.gov (United States)

    Liu, Xiaoqin; Tobita, Kimimasa; Francis, Richard J B; Lo, Cecilia W

    2013-06-01

    The mouse model is ideal for investigating the genetic and developmental etiology of congenital heart disease. However, cardiovascular phenotyping for the precise diagnosis of structural heart defects in mice remains challenging. With rapid advances in imaging techniques, high-throughput phenotyping tools are now available for the diagnosis of structural heart defects. In this review, we discuss the efficacy of four different imaging modalities for congenital heart disease diagnosis in fetal/neonatal mice: noninvasive fetal echocardiography, micro-computed tomography (micro-CT), micro-magnetic resonance imaging (micro-MRI), and episcopic fluorescence image capture (EFIC) histopathology. The experience we have gained in the use of these imaging modalities in a large-scale mouse mutagenesis screen has validated their efficacy for congenital heart defect diagnosis in the tiny hearts of fetal and newborn mice. These cutting-edge phenotyping tools will be invaluable for furthering our understanding of the developmental etiology of congenital heart disease. Copyright © 2013 Wiley Periodicals, Inc.

  19. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  20. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking
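
    One core computation in model checking Markov chains is the probability of eventually reaching a set of target states in a discrete-time Markov chain, obtainable by value iteration; a minimal generic sketch (not the tools developed in the thesis):

```python
import numpy as np

def reach_prob(P, target, iters=1000):
    """Probability of eventually reaching `target` states in a DTMC.

    P: (n, n) row-stochastic transition matrix. Value iteration on
    x <- P @ x, with x held at 1 on the target states.
    """
    n = P.shape[0]
    x = np.zeros(n)
    idx = list(target)
    x[idx] = 1.0
    for _ in range(iters):
        x = P @ x
        x[idx] = 1.0
    return x

# 3-state chain: from state 0, reach the absorbing target state 1 w.p. 0.5,
# or fall into the absorbing failure state 2 w.p. 0.5
P = np.array([[0.0, 0.5, 0.5],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
probs = reach_prob(P, target={1})   # probs[0] == 0.5
```

    The state-space reduction techniques the dissertation studies aim to shrink `P` before iterations like this one are run.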

  1. Advanced structural equation modeling issues and techniques

    CERN Document Server

    Marcoulides, George A

    2013-01-01

    By focusing primarily on the application of structural equation modeling (SEM) techniques in example cases and situations, this book provides an understanding and working knowledge of advanced SEM techniques with a minimum of mathematical derivations. The book was written for a broad audience crossing many disciplines, assumes an understanding of graduate level multivariate statistics, including an introduction to SEM.

  2. A microcomputer for a packet switched network

    International Nuclear Information System (INIS)

    Seller, P.; Bairstow, R.; Barlow, J.; Waters, M.

    1982-12-01

    The Bubble Chamber Research Group of the Rutherford and Appleton Laboratory has a large film analysis facility. This comprises 16 digitising tables used for the measurement of bubble chamber film. Each of these tables has an associated microcomputer. These microcomputers are linked by a star structured packet switched local area network (LAN) to a VAX 11/780. The LAN, and in particular a microcomputer of novel architecture designed to act as the central switch of the network, is described. (author)

  3. Verification of Orthogrid Finite Element Modeling Techniques

    Science.gov (United States)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically those with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process while still adequately capturing the actual hardware behavior. The accuracy of such 'short cuts' is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam model, a shell model, and a mixed beam-and-shell element model. Results show that the shell element model performs best, but that the simpler beam and mixed beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  4. A Strategy Modelling Technique for Financial Services

    OpenAIRE

    Heinrich, Bernd; Winter, Robert

    2004-01-01

    Strategy planning processes often suffer from a lack of conceptual models that can be used to represent business strategies in a structured and standardized form. If natural language is replaced by an at least semi-formal model, the completeness, consistency, and clarity of strategy descriptions can be drastically improved. A strategy modelling technique is proposed that is based on an analysis of modelling requirements, a discussion of related work and a critical analysis of generic approach...

  5. Probabilistic safety analysis using microcomputer

    International Nuclear Information System (INIS)

    Futuro Filho, F.L.F.; Mendes, J.E.S.; Santos, M.J.P. dos

    1990-01-01

    The main steps in the execution of a Probabilistic Safety Assessment (PSA) are presented in this report, such as the study of the system description, the construction of event trees and fault trees, and the calculation of the overall unavailability of the systems. The use of microcomputers to perform some of these tasks is also presented, highlighting the main characteristics of software needed to perform the job adequately. A sample case of fault tree construction and calculation is presented, using the PSAPACK software, distributed by the IAEA (International Atomic Energy Agency) for training purposes. (author)
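
    The fault tree calculation step can be sketched with the standard AND/OR gate probability formulas for independent basic events; the two-train system and failure probabilities below are hypothetical, not taken from PSAPACK:

```python
def or_gate(*p):
    """Probability that at least one independent basic event occurs."""
    q = 1.0
    for x in p:
        q *= (1.0 - x)
    return 1.0 - q

def and_gate(*p):
    """Probability that all independent basic events occur."""
    q = 1.0
    for x in p:
        q *= x
    return q

# hypothetical two-train system: the top event (loss of function) requires
# both trains to fail; a train fails if its pump OR its valve fails
pump, valve = 1e-3, 5e-4
train = or_gate(pump, valve)   # per-train failure probability, ~1.5e-3
top = and_gate(train, train)   # system unavailability, ~2.25e-6
```

    Real PSA codes evaluate minimal cut sets rather than composing gates directly, but the arithmetic per gate is the same.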

  6. Use of microcomputers in health and social service applications in developing nations.

    Science.gov (United States)

    Bertrand, W E

    1987-01-01

    The microcomputer is creating something of a revolution in many developing nations where historically there has been a lack of access to computer power at all levels of the health sector. For the first time, practitioners and researchers, often trained in computer techniques for developing countries, have access through microcomputers to data and information manipulation in their local workplace. While the history of microcomputers in such settings is short, this article presents early evidence from several countries which indicates the usefulness of various applications. The majority of the applications reported in the literature from clinical and research laboratories is made up of national data base systems and special studies of morbidity and mortality. Secondary applications, including assistance in biographical searches and word and graphics processing, are also reviewed in this article. A summary of the most utilized microcomputer hardware configurations completes the review.

  7. Microcomputer Database Management Systems for Bibliographic Data.

    Science.gov (United States)

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  8. Automating Relational Database Design for Microcomputer Users.

    Science.gov (United States)

    Pu, Hao-Che

    1991-01-01

    Discusses issues involved in automating the relational database design process for microcomputer users and presents a prototype of a microcomputer-based system (RA, Relation Assistant) that is based on expert systems technology and helps avoid database maintenance problems. Relational database design is explained and the importance of easy input…

  9. Microcomputer-based stepping-motor controller

    International Nuclear Information System (INIS)

    Johnson, K.

    1983-04-01

    A microcomputer-controlled stepping motor is described. A Motorola MC68701 microcomputer unit is interfaced to a Cybernetic CY500 stored-program controller that outputs through Motorola input/output isolation modules to the stepping motor. A complex multifunction controller with enhanced capabilities is thus available with a minimum number of parts

  10. Integrating Mainframe Data Bases on a Microcomputer

    OpenAIRE

    Marciniak, Thomas A.

    1985-01-01

    Microcomputers support user-friendly software for interrogating their resident data bases. Many medical data bases currently consist of files on less accessible mainframe computers with more limited inquiry capabilities. We discuss the transferring and integrating of mainframe data into microcomputer data base systems in one medical environment.

  11. Microcomputers in Education. Report No. 4798.

    Science.gov (United States)

    Feurzeig, W.; And Others

    A brief review of the history of computer-assisted instruction and discussion of the current and potential roles of microcomputers in education introduce this review of the capabilities of state-of-the-art microcomputers and currently available software for them, and some speculations about future trends and developments. A survey of current…

  12. Five Basic Microcomputer Applications for Marketing Educators.

    Science.gov (United States)

    James, Richard F.

    The microcomputer has five basic applications in marketing education--a remedial/tutorial application, instructional purposes, simulation, the project data base, and classroom management. Examples of word processing applications of a microcomputer are updating annual training plans and producing letters to advisory committee members, parents, and…

  13. Model techniques for testing heated concrete structures

    International Nuclear Information System (INIS)

    Stefanou, G.D.

    1983-01-01

    Experimental techniques are described which may be used in the laboratory to measure strains of model concrete structures representing to scale actual structures of any shape or geometry, operating at elevated temperatures, for which time-dependent creep and shrinkage strains are dominant. These strains could be used to assess the distribution of stress in the scaled structure and hence to predict the actual behaviour of concrete structures used in nuclear power stations. Similar techniques have been employed in an investigation to measure elastic, thermal, creep and shrinkage strains in heated concrete models representing to scale parts of prestressed concrete pressure vessels for nuclear reactors. (author)

  14. Possible Radiation-Induced Damage to the Molecular Structure of Wooden Artifacts Due to Micro-Computed Tomography, Handheld X-Ray Fluorescence, and X-Ray Photoelectron Spectroscopic Techniques

    Directory of Open Access Journals (Sweden)

    Madalena Kozachuk

    2016-05-01

    This study was undertaken to ascertain whether radiation produced by X-ray photoelectron spectroscopy (XPS), micro-computed tomography (μCT) and/or portable handheld X-ray fluorescence (XRF) equipment might damage wood artifacts during analysis. Changes at the molecular level were monitored by Fourier transform infrared (FTIR) analysis. No significant changes in FTIR spectra were observed as a result of μCT or handheld XRF analysis. No substantial changes in the collected FTIR spectra were observed when XPS analytical times on the order of minutes were used. However, XPS analysis collected over tens of hours did produce significant changes in the FTIR spectra.

  15. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to identify the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found; these can be improved through a 3D model. A 3D model was therefore found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open-channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on the causes and effects of flooding.

  16. "Hack" Is Not A Dirty Word--The Tenth Anniversary of Patron Access Microcomputer Centers in Libraries.

    Science.gov (United States)

    Dewey, Patrick R.

    1986-01-01

    The history of patron access microcomputers in libraries is described as carrying on a tradition that information and computer power should be shared. Questions that all types of libraries need to ask in planning microcomputer centers are considered and several model centers are described. (EM)

  17. Microcomputer generated pipe support calculations

    International Nuclear Information System (INIS)

    Hankinson, R.F.; Czarnowski, P.; Roemer, R.E.

    1991-01-01

    The cost and complexity of pipe support design has been a continuing challenge to the construction and modification of commercial nuclear facilities. Typically, pipe support design or qualification projects have required large numbers of engineers centrally located with access to mainframe computer facilities. Much engineering time has been spent repetitively performing a sequence of tasks to address complex design criteria and consolidating the results of calculations into documentation packages in accordance with strict quality requirements. The continuing challenges of cost and quality, the need for support engineering services at operating plant sites, and the substantial recent advances in microcomputer systems suggested that a stand-alone microcomputer pipe support calculation generator was feasible and had become a necessity for providing cost-effective and high quality pipe support engineering services to the industry. This paper outlines the preparation for, and the development of, an integrated pipe support design/evaluation software system which maintains all computer programs in the same environment, minimizes manual performance of standard or repetitive tasks, and generates a high quality calculation which is consistent and easily followed

  18. Periodic precipitation: a microcomputer analysis of transport and reaction processes in diffusion media, with software development

    CERN Document Server

    Henisch, H K

    1991-01-01

    Containing illustrations, worked examples, graphs and tables, this book deals with periodic precipitation (also known as Liesegang Ring formation) in terms of mathematical models and their logical consequences, and is entirely concerned with microcomputer analysis and software development. Three distinctive periodic precipitation mechanisms are included: binary diffusion-reaction; solubility modulation, and competitive particle growth. The book provides didactic illustrations of a valuable investigational procedure, in the form of hypothetical experimentation by microcomputer. The development
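
    The binary diffusion-reaction mechanism named above lends itself to exactly the kind of hypothetical experimentation by microcomputer that the book describes. The sketch below is a deliberately minimal 1D finite-difference toy (all coefficients are our own, hypothetical choices): an outer electrolyte diffuses into a column holding an inner electrolyte, and precipitate forms wherever the local concentration product exceeds a threshold. It illustrates only the transport-plus-threshold idea and makes no claim to reproduce Liesegang band-spacing laws.

```python
def simulate_liesegang(nx=120, steps=3000, da=0.4, db=0.04, threshold=0.02):
    """Explicit finite-difference sketch: outer electrolyte A diffuses in
    from the left into a column of inner electrolyte B; precipitate P forms
    where the product a*b exceeds a supersaturation threshold.
    da and db are diffusion numbers D*dt/dx^2 (kept below 0.5 for stability)."""
    a = [0.0] * nx          # outer electrolyte concentration
    b = [0.2] * nx          # inner electrolyte, initially uniform in the gel
    p = [0.0] * nx          # accumulated precipitate
    for _ in range(steps):
        a[0] = 1.0          # fixed reservoir of A at the left boundary
        a2, b2 = a[:], b[:]
        for i in range(1, nx - 1):
            a2[i] = a[i] + da * (a[i + 1] - 2 * a[i] + a[i - 1])
            b2[i] = b[i] + db * (b[i + 1] - 2 * b[i] + b[i - 1])
        a, b = a2, b2
        for i in range(nx):
            if a[i] * b[i] > threshold:   # local supersaturation -> precipitate
                react = min(a[i], b[i])
                p[i] += react
                a[i] -= react
                b[i] -= react
    return p
```

    Running `simulate_liesegang()` yields precipitate concentrated behind the advancing diffusion front, with the far end of the column still untouched.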

  19. Structural Modeling Using "Scanning and Mapping" Technique

    Science.gov (United States)

    Amos, Courtney L.; Dash, Gerald S.; Shen, J. Y.; Ferguson, Frederick; Noga, Donald F. (Technical Monitor)

    2000-01-01

    Supported by NASA Glenn Center, we are in the process developing a structural damage diagnostic and monitoring system for rocket engines, which consists of five modules: Structural Modeling, Measurement Data Pre-Processor, Structural System Identification, Damage Detection Criterion, and Computer Visualization. The function of the system is to detect damage as it is incurred by the engine structures. The scientific principle to identify damage is to utilize the changes in the vibrational properties between the pre-damaged and post-damaged structures. The vibrational properties of the pre-damaged structure can be obtained based on an analytic computer model of the structure. Thus, as the first stage of the whole research plan, we currently focus on the first module - Structural Modeling. Three computer software packages are selected, and will be integrated for this purpose. They are PhotoModeler-Pro, AutoCAD-R14, and MSC/NASTRAN. AutoCAD is the most popular PC-CAD system currently available in the market. For our purpose, it plays like an interface to generate structural models of any particular engine parts or assembly, which is then passed to MSC/NASTRAN for extracting structural dynamic properties. Although AutoCAD is a powerful structural modeling tool, the complexity of engine components requires a further improvement in structural modeling techniques. We are working on a so-called "scanning and mapping" technique, which is a relatively new technique. The basic idea is to producing a full and accurate 3D structural model by tracing on multiple overlapping photographs taken from different angles. There is no need to input point positions, angles, distances or axes. Photographs can be taken by any types of cameras with different lenses. With the integration of such a modeling technique, the capability of structural modeling will be enhanced. The prototypes of any complex structural components will be produced by PhotoModeler first based on existing similar

  20. Peri-implant osseointegration after low-level laser therapy: micro-computed tomography and resonance frequency analysis in an animal model.

    Science.gov (United States)

    Mayer, Luciano; Gomes, Fernando Vacilotto; de Oliveira, Marília Gerhardt; de Moraes, João Feliz Duarte; Carlsson, Lennart

    2016-12-01

    The purpose of the present study was to evaluate the effects of low-level laser therapy on the osseointegration process by comparing resonance frequency analysis measurements performed at implant placement and after 30 days, together with micro-computed tomography images, in irradiated vs nonirradiated rabbits. Fourteen male New Zealand rabbits were randomly divided into two groups of seven animals each: one control group (nonirradiated animals) and one experimental group that received low-level laser therapy (Thera Lase®, aluminum-gallium-arsenide laser diode, 10 J per spot, two spots per session, seven sessions, 830 nm, 50 mW, CW, Ø 0.0028 cm²). The mandibular left incisor was surgically extracted in all animals, and one osseointegrated implant was placed immediately afterward (3.25ø × 11.5 mm; NanoTite, BIOMET 3i). Resonance frequency analysis was performed with the Osstell® device at implant placement and at 30 days (immediately before euthanasia). Micro-computed tomography analyses were then conducted using a high-resolution scanner (SkyScan 1172 X-ray Micro-CT) to evaluate the amount of newly formed bone around the implants. Irradiated animals showed significantly higher implant stability quotients at 30 days (64.286 ± 1.596; 95% confidence interval (CI) 60.808-67.764) than controls (56.357 ± 1.596; 95% CI 52.879-59.835) (P = .000). The percentage of newly formed bone around the implants was also significantly higher in irradiated animals (75.523 ± 8.510; 95% CI 61.893-89.155) than in controls (55.012 ± 19.840; 95% CI 41.380-68.643) (P = .027). Laser therapy, based on the irradiation protocol used in this study, provided greater implant stability and increased the volume of peri-implant newly formed bone, indicating that laser irradiation improved the osseointegration process.
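
    As a quick arithmetic plausibility check (not part of the original paper), the reported 95% confidence intervals are consistent with mean ± t·SE using Student's t at 12 degrees of freedom, i.e. a pooled-variance analysis across the 14 animals in 2 groups; the choice of 12 df is our assumption about how the interval was computed.

```python
# Reproduce the published interval for the irradiated group's implant
# stability quotient from its mean and standard error. The 12-df t value
# (14 animals minus 2 groups) is our assumption about the analysis.
mean, se = 64.286, 1.596
t_crit_12df = 2.179          # two-sided 95% critical value of Student's t, 12 df
lo, hi = mean - t_crit_12df * se, mean + t_crit_12df * se
# rounding (lo, hi) to 3 decimals recovers the reported 60.808-67.764
```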

  1. The Microcomputer in the Clinical Nursing Research Unit

    Science.gov (United States)

    Schwirian, Patricia M.; Byers, Sandra R.

    1982-01-01

    This paper discusses the microcomputer in clinical nursing research. There are six general areas in which computers have been useful to nurses: nursing notes and charting; patient care plans; automated monitoring of high-tech nursing units; HIS and MIS systems; personnel distribution systems; and education. Three alternative models for the conduct of clinical nursing research in a hospital are described. The first is a centralized model relying on the bureaucratic structure of the hospital. Second is a decentralized network of professional nurses and research support personnel woven together by a Clinical Nurse Researcher, and third is a dedicated clinical nursing research unit. Microcomputers have five characteristics which make them vital tools for nurse researchers: user-friendliness; environment friendliness; low cost; ease of interface with other information systems; and range and quality of software.

  2. Influence of low-intensity pulsed ultrasound on osteogenic tissue regeneration in a periodontal injury model: X-ray image alterations assessed by micro-computed tomography.

    Science.gov (United States)

    Wang, Yunji; Chai, Zhaowu; Zhang, Yuanyuan; Deng, Feng; Wang, Zhibiao; Song, Jinlin

    2014-08-01

    This study was conducted to evaluate, with micro-computed tomography, the influence of low-intensity pulsed ultrasound on wound healing in periodontal tissues. Periodontal disease with Class II furcation involvement was surgically produced at the bilateral mandibular premolars in 8 adult male beagle dogs. Twenty-four teeth were randomly assigned among 4 groups (G): G1, periodontal flap surgery; G2, periodontal flap surgery plus low-intensity pulsed ultrasound (LIPUS); G3, guided tissue regeneration (GTR) surgery; G4, GTR surgery plus LIPUS. The affected area in the experimental groups was exposed to LIPUS. At 6 and 8 weeks, X-ray images of the regenerated teeth were obtained by micro-CT scanning for 3-D measurement. Bone volume (BV), bone surface (BS), and number of trabeculae (Tb) in G2 and G4 were higher than in G1 and G3 (p < 0.05). LIPUS irradiation increased the number, volume, and area of new alveolar bone trabeculae. LIPUS has the potential to promote the repair of periodontal tissue, and may work effectively if combined with GTR. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. 275 C Downhole Microcomputer System

    Energy Technology Data Exchange (ETDEWEB)

    Chris Hutchens; Hooi Miin Soo

    2008-08-31

    An HC11 controller IC, together with a serial SRAM and ROM support IC chip set, was developed to support data acquisition and control under extreme temperature/harsh environment conditions greater than 275 C. The 68HC11 microprocessor is widely used in well logging tools for control, data acquisition, and signal processing applications and was the logical choice for a downhole controller. This extreme-temperature version of the 68HC11 enables new high-temperature designs and additionally allows 68HC11-based well logging and MWD tools to be upgraded for high-temperature operation in deep gas reservoirs. The microcomputer chip consists of the microprocessor ALU, a small boot ROM, 4 kbytes of data RAM, a counter/timer unit, a serial peripheral interface (SPI), an asynchronous serial interface (SCI), and the A, B, C, and D parallel ports. The chip is code compatible with the single-chip-mode commercial 68HC11 except for the absence of the analog-to-digital converter system. To avoid a mask-programmed internal ROM, a boot program is used to load the microcomputer program from an external mask SPI ROM. A SPI RAM IC completes the chip set and allows data RAM to be added in 4-kbyte increments. The HC11 controller IC chip set is implemented in the Peregrine Semiconductor 0.5-micron Silicon-on-Sapphire (SOS) process using a custom high-temperature cell library developed at Oklahoma State University. Yield data is presented for the HC11, SPI RAM, and ROM. The lessons learned in this project were extended to the successful development of two high-temperature versions of the LEON3 and a companion 8-kbyte SRAM: a 200 C version for the Navy and a 275 C version for the gas industry.
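
    The boot sequence described above — a boot ROM streaming the application image out of an external SPI ROM into on-chip RAM before execution — can be sketched abstractly. The toy model below is in Python; the class and method names are ours, and no real 68HC11 register-level detail is represented.

```python
class SpiRom:
    """Simulated external SPI ROM: a READ returns sequential stored bytes."""
    def __init__(self, image):
        self.image = image
    def read(self, addr, length):
        return self.image[addr:addr + length]

def boot_load(rom, ram_size=4096, chunk=32):
    """Toy boot loader: copy the stored program image, chunk by chunk,
    into a 4-kbyte RAM array, mirroring the copy-then-execute scheme
    used to avoid a mask-programmed internal program ROM."""
    ram = bytearray(ram_size)
    addr = 0
    while addr < len(rom.image):
        block = rom.read(addr, chunk)   # one SPI read transaction
        ram[addr:addr + len(block)] = block
        addr += len(block)
    return ram
```

    After `boot_load(SpiRom(image))`, the RAM array holds a byte-exact copy of the image, which is where a real boot ROM would transfer control.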

  5. Modeling techniques for quantum cascade lasers

    Energy Technology Data Exchange (ETDEWEB)

    Jirauschek, Christian [Institute for Nanoelectronics, Technische Universität München, D-80333 Munich (Germany); Kubis, Tillmann [Network for Computational Nanotechnology, Purdue University, 207 S Martin Jischke Drive, West Lafayette, Indiana 47907 (United States)

    2014-03-15

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.
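
    One of the techniques named above — finite-difference solution of the one-dimensional Schrödinger equation — can be illustrated in a few lines. The sketch below uses a shooting method with bisection on the dimensionless infinite square well (-ψ'' = Eψ on [0, 1] with ψ(0) = ψ(1) = 0), whose exact ground state is E₁ = π²; actual quantum cascade laser work solves the same equation with the heterostructure conduction-band potential, often coupled self-consistently to Poisson's equation.

```python
def shoot(e, n=2000):
    """Finite-difference integration of psi'' = -e * psi from x = 0 with
    psi(0) = 0 and a small arbitrary initial slope; returns psi(1)."""
    h = 1.0 / n
    psi_prev, psi = 0.0, h
    for _ in range(n - 1):
        psi_next = 2 * psi - psi_prev - e * h * h * psi
        psi_prev, psi = psi, psi_next
    return psi

def ground_state(lo=1.0, hi=20.0, tol=1e-10):
    """Bisect on E between a bracket where psi(1) changes sign; the first
    such zero is the ground-state eigenvalue (pi^2 ~ 9.8696 here)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if shoot(lo) * shoot(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

    The transfer matrix method mentioned in the review reaches the same eigenvalues by propagating (ψ, ψ') analytically across piecewise-constant potential slices instead of stepping on a grid.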

  7. Reading Diagnosis via the Microcomputer (The Printout).

    Science.gov (United States)

    Weisberg, Renee; Balajthy, Ernest

    1989-01-01

    Examines and evaluates microcomputer software designed to assist in diagnosing students' reading abilities and making instructional decisions. Claims that existing software shows valuable potential when used sensibly and critically by trained reading clinicians. (MM)

  8. Microcomputers in a Beginning Tertiary Physics Course.

    Science.gov (United States)

    Pearce, J. M.; O'Brien, R.

    1986-01-01

    Describes a college-level physics course which focuses on both physics knowledge/skills and use of microcomputers. Types of experiments done with the computers and how students use the computers to treat data are considered. (JN)

  9. Microcomputer Simulated CAD for Engineering Graphics.

    Science.gov (United States)

    Huggins, David L.; Myers, Roy E.

    1983-01-01

    Describes a simulated computer-aided-graphics (CAD) program at The Pennsylvania State University. Rationale for the program, facilities, microcomputer equipment (Apple) used, and development of a software package for simulating applied engineering graphics are considered. (JN)

  10. Rabbit tissue model (RTM) harvesting technique.

    Science.gov (United States)

    Medina, Marelyn

    2002-01-01

    A method for creating a tissue model using a female rabbit for laparoscopic simulation exercises is described. The specimen is called a Rabbit Tissue Model (RTM). Dissection techniques are described for transforming the rabbit carcass into a small, compact unit that can be used for multiple training sessions. Preservation is accomplished by using saline and refrigeration. Only the animal trunk is used, with the rest of the animal carcass being discarded. Practice exercises are provided for using the preserved organs. Basic surgical skills, such as dissection, suturing, and knot tying, can be practiced on this model. In addition, the RTM can be used with any pelvic trainer that permits placement of larger practice specimens within its confines.

  11. Evaluation of a polyetheretherketone (PEEK) titanium composite interbody spacer in an ovine lumbar interbody fusion model: biomechanical, microcomputed tomographic, and histologic analyses.

    Science.gov (United States)

    McGilvray, Kirk C; Waldorff, Erik I; Easley, Jeremiah; Seim, Howard B; Zhang, Nianli; Linovitz, Raymond J; Ryaby, James T; Puttlitz, Christian M

    2017-12-01

    The most commonly used materials for interbody cages are titanium metal and the polymer polyetheretherketone (PEEK). Both of these materials have demonstrated good biocompatibility. A major disadvantage associated with solid titanium cages is their radiopacity, which limits the postoperative monitoring of spinal fusion via standard imaging modalities. PEEK, however, is radiolucent, allowing clinicians a temporal assessment of the fusion mass. On the other hand, PEEK is hydrophobic, which can limit bony ingrowth. Although both PEEK and titanium have demonstrated clinical success in obtaining a solid spinal fusion, innovations are being developed to improve fusion rates and to create stronger constructs using hybrid additive manufacturing approaches that incorporate both materials into a single interbody device. The purpose of this study was to examine the interbody fusion characteristics of a PEEK Titanium Composite (PTC) cage for use in lumbar fusion. Thirty-four mature female sheep underwent two-level (L2-L3 and L4-L5) interbody fusion using either a PEEK or a PTC cage (one of each per animal). Animals were sacrificed at 0, 8, 12, and 18 weeks post surgery. After sacrifice, each surgically treated functional spinal unit underwent non-destructive kinematic testing, microcomputed tomography scanning, and histomorphometric analyses. Relative to the standard PEEK cages, the PTC constructs demonstrated significant reductions in ranges of motion and a significant increase in stiffness. These biomechanical findings were reinforced by the presence of significantly more bone at the fusion site as well as ingrowth into the porous end plates. Overall, the results indicate that PTC interbody devices could potentially lead to a more robust intervertebral fusion relative to a standard PEEK device in a clinical setting. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  12. Structural design of SBWR reactor building complex using microcomputers

    International Nuclear Information System (INIS)

    Mandagi, K.; Rajagopal, R.S.; Sawhney, P.S.; Gou, P.F.

    1993-01-01

    The design concept of Simplified Boiling Water Reactor (SBWR) plant is based on simplicity and passive features to enhance safety and reliability, improve performance, and increase economic viability. The SBWR utilizes passive systems such as Gravity Driven Core-Cooling System (GDCS) and Passive Containment Cooling System (PCCS). To suit these design features the Reactor Building (RB) complex of the SBWR is configured as an integrated structure consisting of a cylindrical Reinforced Concrete Containment Vessel (RCCV) surrounded by square reinforced concrete safety envelope and outer box structures, all sharing a common reinforced concrete basemat. This paper describes the structural analysis and design aspects of the RB complex. A 3D STARDYNE finite element model has been developed for the structural analysis of the complex using a PC Compaq 486/33L microcomputer. The structural analysis is performed for service and factored load conditions for the applicable loading combinations. The dynamic responses of containment structures due to pool hydrodynamic loads have been calculated by an axisymmetric shell model using COSMOS/M program. The RCCV is designed in accordance with ASME Section 3, Division 2 Code. The rest of the RB which is classified as Seismic Category 1 structure is designed in accordance with the ACI 349 Code. This paper shows that microcomputers can be efficiently used for the analysis and design of large and complex structures such as RCCV and Reactor Building complex. The use of microcomputers can result in significant savings in the computational cost compared with that of mainframe computers

  13. A Study of Acute and Chronic Tissue Changes in Surgical and Traumatically-Induced Experimental Models of Knee Joint Injury Using Magnetic Resonance Imaging and Micro-Computed Tomography

    Science.gov (United States)

    Fischenich, Kristine M.; Pauly, Hannah M.; Button, Keith D.; Fajardo, Ryan S.; DeCamp, Charles E.; Haut, Roger C.; Haut Donahue, Tammy L.

    2016-01-01

    Objective The objective of this study was to monitor the progression of joint damage in two animal models of knee joint trauma using two non-invasive, clinically available imaging modalities. Methods A 3-T clinical magnet and micro-computed tomography (mCT) were used to document changes immediately following injury (acute) and post-injury (chronic) at time points of 4, 8, or 12 weeks. Joint damage was recorded at dissection and compared to the chronic magnetic resonance imaging (MRI) record. Fifteen Flemish Giant rabbits were subjected to a single tibiofemoral compressive impact (ACLF), and 18 underwent a combination of anterior cruciate ligament (ACL) and meniscal transection (mACLT). Results All ACLF animals experienced ACL rupture, and 13 also experienced acute meniscal damage. All ACLF and mACLT animals showed meniscal and articular cartilage damage at dissection. Meniscal damage was documented as early as 4 weeks and worsened in 87% of the ACLF animals and 71% of the mACLT animals. Acute cartilage damage also developed further and increased in occurrence with time in both models. A progressive decrease in bone quantity and quality was documented in both models. The MRI data closely aligned with the dissection notes, suggesting this clinical tool may be a non-invasive method for documenting joint damage in lapine models of knee joint trauma. Conclusions The study investigates the acute-to-chronic progression of meniscal and cartilage damage at various time points, and chronic changes to the underlying bone, in two models of posttraumatic osteoarthritis (PTOA), and highlights how the location, type, and progression of damage over time depend on the model. PMID:27756698

  14. Small, microcomputer-based CAMAC controller

    International Nuclear Information System (INIS)

    Juras, R.C.

    1979-01-01

    The beam buncher necessary to condition the beam from the Oak Ridge National Laboratory 25 MV tandem accelerator for post-acceleration by the Oak Ridge Isochronous Cyclotron is CAMAC-based and will be controlled via one of the serial highways of the accelerator control system. However, prior to integration into the accelerator system, the buncher requires testing, including runs on the model EN tandem at Oak Ridge. In order to facilitate testing and initial operation of the buncher, a microcomputer-based controller was assembled. The controller consists of a CAMAC crate, several CAMAC modules, a touch panel display, a controller box, and software. The controller box contains one shaft encoder and two switches. One of the switches is a coarse/fine selector. The other switch is assignable via the touch panel display and is used, for example, to turn devices on and off. Operation of the controller is described. It can be quickly assembled to control any small CAMAC-based system. 2 figures

  15. Improved modeling techniques for turbomachinery flow fields

    Energy Technology Data Exchange (ETDEWEB)

    Lakshminarayana, B. [Pennsylvania State Univ., University Park, PA (United States); Fagan, J.R. Jr. [Allison Engine Company, Indianapolis, IN (United States)

    1995-10-01

    This program has the objective of developing an improved methodology for modeling turbomachinery flow fields, including the prediction of losses and efficiency. Specifically, the program addresses the treatment of the mixing stress tensor terms attributed to deterministic flow field mechanisms required in steady-state Computational Fluid Dynamics (CFD) models of turbomachinery flow fields. These mixing stress tensors arise from spatial and temporal fluctuations (in an absolute frame of reference) caused by rotor-stator interaction among the various blade rows and by blade-to-blade variation of flow properties. The program's tasks include the acquisition of previously unavailable experimental data in a high-speed turbomachinery environment, the use of advanced techniques to analyze the data, and the development of a methodology to treat the deterministic component of the mixing stress tensor. Penn State will lead the effort to make direct measurements of the momentum and thermal mixing stress tensors in a high-speed multistage compressor flow field in the turbomachinery laboratory at Penn State. They will also process the data by both conventional and conditional spectrum analysis to derive the momentum and thermal mixing stress tensors due to blade-to-blade periodic and aperiodic components; revolution periodic and aperiodic components arising from the various blade rows; and non-deterministic (including random) correlations. The modeling results from this program will be publicly available and generally applicable to steady-state Navier-Stokes solvers used for turbomachinery component (compressor or turbine) flow field predictions. These models will lead to improved methodology, including loss and efficiency prediction, for the design of high-efficiency turbomachinery and will drastically reduce the time required for the design and development cycle of turbomachinery.
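
    The deterministic part of the mixing stress comes from correlating the periodic (e.g. blade-passing) fluctuations that survive phase-locked ensemble averaging. The sketch below demonstrates that decomposition on synthetic signals; the signal shapes, amplitudes, and sampling are our own choices, purely for illustration of the averaging procedure.

```python
import math

def deterministic_stress(u, v, samples_per_period):
    """Deterministic correlation <u~ v~>: phase-lock average each signal over
    many periods, subtract the time mean, and correlate the residuals."""
    n_periods = len(u) // samples_per_period
    def phase_avg(x):
        # ensemble average at each phase of the (blade-passing) period
        return [sum(x[p + k * samples_per_period] for k in range(n_periods)) / n_periods
                for p in range(samples_per_period)]
    u_pl, v_pl = phase_avg(u), phase_avg(v)
    u_mean = sum(u_pl) / len(u_pl)
    v_mean = sum(v_pl) / len(v_pl)
    ut = [x - u_mean for x in u_pl]   # deterministic fluctuation of u
    vt = [x - v_mean for x in v_pl]   # deterministic fluctuation of v
    return sum(a * b for a, b in zip(ut, vt)) / len(ut)

# Synthetic blade-passing signals: in-phase sinusoids, 64 samples per period,
# 50 periods; the correlation of amplitudes 1.0 and 0.5 averages to 0.25.
N, P = 64 * 50, 64
u = [1.0 * math.sin(2 * math.pi * i / P) for i in range(N)]
v = [0.5 * math.sin(2 * math.pi * i / P) for i in range(N)]
stress = deterministic_stress(u, v, P)
```

    With measured (noisy) data, the ensemble averaging is what separates this deterministic correlation from the random turbulent stress, which is estimated from the residual about the phase-locked average instead.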

  16. Final report : evaluation of microcomputer applications in transportation engineering.

    Science.gov (United States)

    1984-01-01

    This study investigated areas where microcomputers can aid in the effectiveness of transportation engineering at state and local levels. A survey of the microcomputer needs of transportation professionals in state and local agencies in Virginia was c...

  17. A microcomputer controlled thermoluminescence dosimetry system

    International Nuclear Information System (INIS)

    Huyskens, C.J.; Kicken, P.J.H.

    1980-01-01

    Using a microcomputer, an automatic thermoluminescence dosimetry system for personal dosimetry and thermoluminescence detector (TLD) research was developed. Process automation, statistical computation and dose calculation are provided by this microcomputer. Recording of measurement data, as well as dose record keeping for radiological workers is carried out with floppy disk. The microcomputer also provides a human/system interface by means of a video display and a printer. The main features of this dosimetry system are its low cost, high degree of flexibility, high degree of automation and the feasibility for use in routine dosimetry as well as in TLD research. The system is in use for personal dosimetry, environmental dosimetry and for TL-research work. Because of its modular set-up several components of the system are in use for other applications, too. The system seems suited for medium sized health physics groups. (author)

  18. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered
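
    The kind of throughput-under-uncertainty question described above can be sketched with a few lines of Monte Carlo simulation. Everything below — the step names, durations, and operating hours — is hypothetical, chosen only to show the approach of propagating uncertain process durations into a facility throughput estimate.

```python
import random

def simulate_year(step_hours, hours_per_year=2000, seed=None):
    """Serial-process throughput sketch: each item takes the sum of its
    step durations, with each duration drawn uniformly within +/-20% of
    its nominal value; count items completed in one operating year."""
    rng = random.Random(seed)
    clock, handled = 0.0, 0
    while True:
        total = sum(rng.uniform(0.8 * t, 1.2 * t) for t in step_hours)
        if clock + total > hours_per_year:
            return handled
        clock += total
        handled += 1

# Hypothetical nominal step durations (hours): receive, transfer, store.
steps = [1.5, 2.0, 0.5]
n = simulate_year(steps, seed=42)   # close to 2000 h / 4 h = 500 items
```

    Repeating the run over many seeds gives a distribution of annual throughput, which is the feedback a design review would compare against the specified requirement.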

  19. Economical motor protection using microcomputer technology

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, N.

    1983-09-01

    A trend toward designing new motors closer to their design limits and the high cost of plant shutdown have increased the need for better protection of smaller three-phase motors. A single-chip microcomputer relay can replace thermal overloads, which are of limited effectiveness on low and medium voltage machines, with comprehensive, economical motor protection. The requirements for the different protection features, and how they are achieved, are presented. All the protection features discussed are commercially available in a compact unit that uses a single-chip microcomputer.
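
    A common software replacement for a bimetallic thermal overload is a first-order thermal image driven by the square of per-unit current. The sketch below shows only that idea; the time constant, sampling interval, and trip level are hypothetical and not taken from any particular relay.

```python
def thermal_trip(currents_pu, dt=1.0, tau=600.0, trip_level=1.0):
    """First-order thermal image: heating proportional to the square of
    per-unit current, exponential cooling with time constant tau (s).
    Scaled so that sustained 1.0 pu current settles exactly at the trip
    level. Returns the trip time in seconds, or None if no trip occurs."""
    theta = 0.0
    for k, i_pu in enumerate(currents_pu):
        theta += (dt / tau) * (i_pu ** 2 - theta)
        if theta >= trip_level:
            return (k + 1) * dt
    return None

# Sustained 2x overload trips after roughly three minutes with these values;
# rated current never does, because the model settles just below trip level.
t_overload = thermal_trip([2.0] * 600)
t_rated = thermal_trip([1.0] * 5000)
```

    Unlike a bimetal element, the software model can be trimmed per motor (tau, trip level) and combined in the same unit with unbalance, stall, and earth-fault checks.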

  20. Dictionary of microelectronics and microcomputer technology

    International Nuclear Information System (INIS)

    Attiyate, Y.H.; Shah, R.R.

    1984-01-01

    This bilingual dictionary (German-English and English-German) is to give the general public a clearer idea of the terminology of microelectronics, microcomputers, data processing, and computer science. Each part contains about 7500 terms frequently encountered in practice, about 2000 of which are supplemented by precise explanations. (orig./HP) [de

  1. Dose calculation in brachytherapy with microcomputers

    International Nuclear Information System (INIS)

    Elbern, A.W.

    1989-01-01

    Computer algorithms that allow the calculation of brachytherapy doses and their graphic representation for implants, using programs developed for PC microcomputers, are presented. These algorithms localize the sources in space from their projections in radiographic images and trace isodose contours. (C.G.C.) [pt

  2. Microcomputer Applications in Local Assessment Systems.

    Science.gov (United States)

    Harnisch, Delwyn L.; And Others

    The capabilities and hardware requirements of four microcomputer software packages produced by the Office of Educational Testing, Research and Service at the University of Illinois at Urbana-Champaign are described. These programs are: (1) the Scan-Tron Forms Analysis Package Version 2.0, an interface between an IBM-compatible and a Scan-Tron…

  3. A Laboratory Application of Microcomputer Graphics.

    Science.gov (United States)

    Gehring, Kalle B.; Moore, John W.

    1983-01-01

    A PASCAL graphics and instrument interface program for a Z80/S-100 based microcomputer was developed. The computer interfaces to a stopped-flow spectrophotometer replacing a storage oscilloscope and polaroid camera. Applications of this system are discussed, indicating that graphics and analog-to-digital boards have transformed the computer into…

  4. Microcomputer relay regulator in the CAMAC standard

    International Nuclear Information System (INIS)

    Nikolaev, V.P.

    1984-01-01

    The digital relay regulator is developed on the basis of the KM001 microcomputer and KK06 controller for automatic control objects with transfer functions describing a broad class of systems using actuating motors (stabilization, follow-up systems). The CAMAC relay unit realizes the regulation law and provides the possibility to control analog values over 8 channels

  5. Micro-Computers in Biology Inquiry.

    Science.gov (United States)

    Barnato, Carolyn; Barrett, Kathy

    1981-01-01

    Describes the modification of computer programs (BISON and POLLUT) to accommodate species and areas indigenous to the Pacific Coast area. Suggests that these programs, suitable for PET microcomputers, may foster a long-term, ongoing, inquiry-directed approach in biology. (DS)

  6. Evaluation of Five Microcomputer CAD Packages.

    Science.gov (United States)

    Leach, James A.

    1987-01-01

    Discusses the similarities, differences, advanced features, applications and number of users of five microcomputer computer-aided design (CAD) packages. Included are: "AutoCAD (V.2.17)"; "CADKEY (V.2.0)"; "CADVANCE (V.1.0)"; "Super MicroCAD"; and "VersaCAD Advanced (V.4.00)." Describes the…

  7. Print Station Operation. Microcomputing Working Paper Series.

    Science.gov (United States)

    Wozny, Lucy Anne

    During the academic year 1983-84, Drexel University instituted a new policy requiring all incoming students to have access to a microcomputer. The computer chosen to fulfill this requirement was the Macintosh from Apple Computer, Inc. Although this requirement put an additional financial burden on the Drexel student, the university administration…

  8. Machine Distribution. Microcomputing Working Papers Series.

    Science.gov (United States)

    Drexel Univ., Philadelphia, PA. Microcomputing Program.

    During the academic year 1983-84, Drexel University instituted a new policy requiring all incoming students to have access to a microcomputer. The computer chosen to fulfill this requirement was the Macintosh from Apple Computer, Inc. This paper provides a brief description of the process undertaken to select the appropriate computer (i.e.,…

  9. Fundamental image quality limits for microcomputed tomography in small animals

    International Nuclear Information System (INIS)

    Ford, N.L.; Thornton, M.M.; Holdsworth, D.W.

    2003-01-01

    Small-animal imaging has become increasingly important as transgenic and knockout mice are produced to model human diseases. One imaging technique that has emerged is microcomputed tomography (micro-CT). For live-animal imaging, the precision in the images will be determined by the x-ray dose given to the animal. As a result, we propose a simple method to predict the noise performance of an x-ray micro-CT system as a function of dose and image resolution. An ideal, quantum-noise limited micro-CT scanner, assumed to have perfect resolution and ideal efficiency, was modeled. Using a simplified model, the coefficient of variation (COV) of the linear attenuation coefficient was calculated for a range of entrance doses and isotropic voxel sizes. COV calculations were performed for the ideal case and with simulated imperfections in efficiency and resolution. Our model was validated in phantom studies and mouse images were acquired with a specimen scanner to illustrate the results. A simplified model of noise propagation in the case of isotropic resolution indicates that the COV in the linear attenuation coefficient is proportional to (dose)^(-1/2) and to (isotropic voxel size)^(-2) in the reconstructed volume. Therefore an improvement in the precision can be achieved only by increasing the isotropic voxel size (thereby decreasing the resolution of the image) or by increasing the x-ray dose. For the ideal scanner, a COV of 1% in the linear attenuation coefficient for an image of a mouse exposed to 0.25 Gy is obtained with a minimum isotropic voxel size of 135 μm. However, the same COV is achieved at a dose of 5.0 Gy with a 65 μm isotropic voxel size. Conversely, for a 68 mm diameter rat, a COV of 1% obtained from an image at 5.0 Gy would require an isotropic voxel size of 100 μm. These results indicate that short-term, potentially lethal, effects of ionizing radiation will limit high-resolution live animal imaging. As improvements in detector technology allow the
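    The abstract's scaling law (COV proportional to dose^(-1/2) and to voxel^(-2)) implies that, at constant COV, the achievable voxel size shrinks only as dose^(-1/4). A minimal sketch of that trade-off; the function name and structure are ours, while the reference point of 135 μm at 0.25 Gy is taken from the abstract:

```python
# Dose/voxel-size trade-off implied by the abstract's scaling law:
# COV ~ dose^(-1/2) * voxel^(-2), so at constant COV the voxel size
# scales as dose^(-1/4).
def voxel_for_same_cov(dose_gy, ref_dose_gy=0.25, ref_voxel_um=135.0):
    """Voxel size giving the same COV as the reference operating point."""
    return ref_voxel_um * (ref_dose_gy / dose_gy) ** 0.25

print(round(voxel_for_same_cov(5.0)))  # 64, close to the 65 um quoted
```

    A 20-fold dose increase thus buys only a roughly 2-fold resolution improvement, which is why the abstract concludes that radiation effects limit high-resolution live-animal imaging.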

  10. Structural modeling techniques by finite element method

    International Nuclear Information System (INIS)

    Kang, Yeong Jin; Kim, Geung Hwan; Ju, Gwan Jeong

    1991-01-01

    This book includes an introduction and table of contents; Chapter 1, Finite element idealization: introduction, summary of the finite element method, equilibrium and compatibility in the finite element solution, degrees of freedom, symmetry and antisymmetry, modeling guidelines, local analysis, example, and references; Chapter 2, Static analysis: structural geometry, finite element models, analysis procedure, modeling guidelines, and references; Chapter 3, Dynamic analysis: models for dynamic analysis, dynamic analysis procedures, and modeling guidelines.

  11. Application of a microcomputer-based system to control and monitor bacterial growth.

    Science.gov (United States)

    Titus, J A; Luli, G W; Dekleva, M L; Strohl, W R

    1984-02-01

    A modular microcomputer-based system was developed to control and monitor various modes of bacterial growth. The control system was composed of an Apple II Plus microcomputer with 64-kilobyte random-access memory; a Cyborg ISAAC model 91A multichannel analog-to-digital and digital-to-analog converter; paired MRR-1 pH, pO(2), and foam control units; and in-house-designed relay, servo control, and turbidimetry systems. To demonstrate the flexibility of the system, we grew bacteria under various computer-controlled and monitored modes of growth, including batch, turbidostat, and chemostat systems. The Apple-ISAAC system was programmed in Labsoft BASIC (extended Applesoft) with an average control program using ca. 6 to 8 kilobytes of memory and up to 30 kilobytes for datum arrays. This modular microcomputer-based control system was easily coupled to laboratory scale fermentors for a variety of fermentations.
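    The turbidostat mode mentioned above can be sketched as a simple bang-bang rule: run the dilution pump whenever measured turbidity exceeds a setpoint. The function name, setpoint and deadband below are illustrative assumptions, not details from the paper:

```python
# Bang-bang sketch of a turbidostat control cycle: dilute whenever
# turbidity exceeds setpoint plus a deadband. Values are illustrative.
def pump_on(turbidity, setpoint=0.6, deadband=0.02):
    """Return True if the dilution pump should run this control cycle."""
    return turbidity > setpoint + deadband

readings = [0.55, 0.59, 0.63, 0.61, 0.66]
actions = [pump_on(t) for t in readings]
print(actions)  # [False, False, True, False, True]
```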

  12. BIOMECHANICAL MODEL OF THE GOLF SWING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Milan Čoh

    2011-08-01

    Full Text Available Golf is an extremely complex game which depends on a number of interconnected factors. One of the most important elements is undoubtedly the golf swing technique. High performance of the golf swing technique is generated by: the level of motor abilities, a high degree of movement control, the level of movement structure stabilisation, morphological characteristics, inter- and intra-muscular coordination, motivation, and concentration. The golf swing technique was investigated using the biomechanical analysis method. Kinematic parameters were registered using two synchronised high-speed cameras at a frequency of 2,000 Hz. The sample of subjects consisted of three professional golf players. The study results showed a relatively high variability of the swing technique. The maximum velocity of the ball after a wood swing ranged from 227 to 233 km/h. The velocity of the ball after an iron swing was lower by 10 km/h on average. The elevation angle of the ball ranged from 11.7 to 15.3 degrees. In the final phase of the golf swing, i.e. the downswing, the trunk rotators play the key role.

  13. Respirometry techniques and activated sludge models

    NARCIS (Netherlands)

    Benes, O.; Spanjers, H.; Holba, M.

    2002-01-01

    This paper aims to explain results of respirometry experiments using Activated Sludge Model No. 1. In cases of insufficient fit of ASM No. 1, further modifications to the model were carried out and the so-called "Enzymatic model" was developed. The best-fit method was used to determine the effect of

  14. Enlist micros: Training science teachers to use microcomputers

    Science.gov (United States)

    Baird, William E.; Ellis, James D.; Kuerbis, Paul J.

    A National Science Foundation grant to the Biological Sciences Curriculum Study (BSCS) at The Colorado College supported the design and production of training materials to encourage literacy of science teachers in the use of microcomputers. ENLIST Micros is based on results of a national needs assessment that identified 22 competencies needed by K-12 science teachers to use microcomputers for instruction. A writing team developed the 16-hour training program in the summer of 1985, and field-test coordinators tested it with 18 preservice or in-service groups during the 1985-86 academic year at 15 sites within the United States. The training materials consist of video programs, interactive computer disks for the Apple II series microcomputer, a training manual for participants, and a guide for the group leader. The experimental materials address major areas of educational computing: awareness, applications, implementation, evaluation, and resources. Each chapter contains activities developed for this program, such as viewing video segments of science teachers who are using computers effectively and running commercial science and training courseware. Role playing and small-group interaction help the teachers overcome their reluctance to use computers and plan for effective implementation of microcomputers in the school. This study examines the implementation of educational computing among 47 science teachers who completed the ENLIST Micros training at a southern university. We present results of formative evaluation for that site. Results indicate that both elementary and secondary teachers benefit from the training program and demonstrate gains in attitudes toward computer use. Participating teachers said that the program met its stated objectives and helped them obtain needed skills. Only 33 percent of these teachers, however, reported using computers one year after the training. In June 1986, the BSCS initiated a follow-up to the ENLIST Micros curriculum to

  15. Three-dimensional analysis of the pulp cavity on surface models of molar teeth, using X-ray micro-computed tomography

    DEFF Research Database (Denmark)

    Markvart, Merete; Bjørndal, Lars; Darvann, Tron Andre

    2012-01-01

    In summary, three-dimensional surface models were made with high precision; an increased accumulation of mineral deposits was noted in molars with small pulp chambers and, combined with the consistent pattern of intra-radicular connections, the potential endodontic treatment complexity is underlined...

  16. UAV State Estimation Modeling Techniques in AHRS

    Science.gov (United States)

    Razali, Shikin; Zhahir, Amzari

    2017-11-01

    Autonomous unmanned aerial vehicle (UAV) systems depend on state estimation feedback to control flight operation. Estimating the correct state improves navigation accuracy and helps achieve the flight mission safely. One sensor configuration used in UAV state estimation is the Attitude Heading and Reference System (AHRS) with application of an Extended Kalman Filter (EKF) or a feedback controller. The results of these two different techniques for estimating UAV states in the AHRS configuration are displayed through position and attitude graphs.
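    A full EKF is too involved for a short sketch; a complementary filter, a common lighter-weight alternative in AHRS work, illustrates the same fusion idea for a single pitch axis. All names and gains here are our illustrative assumptions:

```python
import math

# Complementary filter, used here as a simpler stand-in for the EKF
# named in the abstract: integrate the gyro rate, then correct drift
# with the pitch implied by the accelerometer's gravity vector.
def pitch_update(pitch, gyro_rate, ax, az, dt, alpha=0.98):
    accel_pitch = math.atan2(-ax, az)  # pitch from gravity (rad)
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Level, non-rotating vehicle: an initial 0.5 rad error decays to ~0.
p = 0.5
for _ in range(200):
    p = pitch_update(p, gyro_rate=0.0, ax=0.0, az=9.81, dt=0.01)
print(round(p, 3))  # 0.009
```

    The blend factor alpha trades gyro smoothness against accelerometer drift correction, the same trade the EKF makes through its noise covariances.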

  17. Investigation of a pre-clinical mandibular bone notch defect model in miniature pigs: clinical computed tomography, micro-computed tomography, and histological evaluation.

    Science.gov (United States)

    Carlisle, Patricia L; Guda, Teja; Silliman, David T; Lien, Wen; Hale, Robert G; Brown Baer, Pamela R

    2016-02-01

    To validate a critical-size mandibular bone defect model in miniature pigs. Bilateral notch defects were produced in the mandible of dentally mature miniature pigs. The right mandibular defect remained untreated while the left defect received an autograft. Bone healing was evaluated by computed tomography (CT) at 4 and 16 weeks, and by micro-CT and non-decalcified histology at 16 weeks. In both the untreated and autograft treated groups, mineralized tissue volume was reduced significantly at 4 weeks post-surgery, but was comparable to the pre-surgery levels after 16 weeks. After 16 weeks, CT analysis indicated that significantly greater bone was regenerated in the autograft treated defect than in the untreated defect (P=0.013). Regardless of the treatment, the cortical bone was superior to the defect remodeled over 16 weeks to compensate for the notch defect. The presence of considerable bone healing in both treated and untreated groups suggests that this model is inadequate as a critical-size defect. Despite healing and adaptation, the original bone geometry and quality of the pre-injured mandible was not obtained. On the other hand, this model is justified for evaluating accelerated healing and mitigating the bone remodeling response, which are both important considerations for dental implant restorations.

  18. A Low Cost Microcomputer System for Process Dynamics and Control Simulations.

    Science.gov (United States)

    Crowl, D. A.; Durisin, M. J.

    1983-01-01

    Discusses a video simulator microcomputer system used to provide real-time demonstrations to strengthen students' understanding of process dynamics and control. Also discusses hardware/software and simulations developed using the system. The four simulations model various configurations of a process liquid level tank system. (JN)

  19. Atomic absorption spectrometer readout and data reduction using the LSI-11 microcomputer

    International Nuclear Information System (INIS)

    Allen, M.J.; Wikkerink, R.W.

    1978-01-01

    Some common instruments found in the chemistry laboratory have analog chart recorder output as their primary data readout media. Data reduction from this medium is slow and relatively inaccurate. This paper describes how to interface a single LSI-11 microcomputer to PERKIN-ELMER models 603 and 303 Atomic Absorption Spectrophotometers

  20. Versatile microcomputer-based temperature controller

    International Nuclear Information System (INIS)

    Yarberry, V.R.

    1980-09-01

    The wide range of thermal responses required in laboratory and scientific equipment requires a temperature controller with a great deal of flexibility. While a number of analog temperature controllers are commercially available, they have certain limitations, such as inflexible parameter control or insufficient precision. Most lack digital interface capabilities--a necessity when the temperature controller is part of a computer-controlled automatic data acquisition system. We have developed an extremely versatile microcomputer-based temperature controller to fulfill this need in a variety of equipment. The control algorithm used allows optimal tailoring of parameters to control overshoot, response time, and accuracy. This microcomputer-based temperature controller can be used as a standalone instrument (with a teletype used to enter parameters), or it can be integrated into a data acquisition system
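    The abstract does not name the control law; a discrete PID loop is one plausible sketch of how overshoot, response time and accuracy can be traded off through parameters. The gains and the toy plant below are illustrative assumptions, not the report's algorithm:

```python
# Discrete PID sketch: kp/ki/kd tune response time, steady-state
# accuracy and damping of overshoot. All values are illustrative.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a toy first-order thermal plant from 20 toward a 100-degree setpoint.
pid, temp = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1), 20.0
for _ in range(500):
    power = pid.update(100.0, temp)
    temp += (power - 0.1 * (temp - 20.0)) * 0.1  # heat in minus losses
print(round(temp))  # 100
```

    The integral term is what removes the steady-state offset an analog proportional-only controller would leave.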

  1. Scheduling nursing personnel on a microcomputer.

    Science.gov (United States)

    Liao, C J; Kao, C Y

    1997-01-01

    Suggests that with the shortage of nursing personnel, hospital administrators have to pay more attention to the needs of nurses to retain and recruit them. Also asserts that improving nurses' schedules is one of the most economic ways for the hospital administration to create a better working environment for nurses. Develops an algorithm for scheduling nursing personnel. Contrary to the current hospital approach, which schedules nurses on a person-by-person basis, the proposed algorithm constructs schedules on a day-by-day basis. The algorithm has inherent flexibility in handling a variety of possible constraints and goals, similar to other non-cyclical approaches. But, unlike most other non-cyclical approaches, it can also generate a quality schedule in a short time on a microcomputer. The algorithm was coded in C language and run on a microcomputer. The developed software is currently implemented at a leading hospital in Taiwan. The response to the initial implementation is quite promising.
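    The day-by-day idea described above can be sketched as a greedy rule: each day, staff the shift with the nurses who have worked the fewest days so far. This is an illustrative simplification in the spirit of the abstract; the paper's algorithm handles far richer constraints and goals:

```python
# Day-by-day greedy sketch: balance workload by always picking the
# least-worked nurses for each day's shift. Names are illustrative.
def schedule(nurses, days, needed_per_day):
    worked = {n: 0 for n in nurses}
    plan = []
    for _ in range(days):
        todays = sorted(nurses, key=lambda n: worked[n])[:needed_per_day]
        for n in todays:
            worked[n] += 1
        plan.append(todays)
    return plan, worked

plan, load = schedule(["Ann", "Bea", "Cal", "Dee"], days=4, needed_per_day=2)
print(load)  # each nurse works exactly 2 of the 4 days
```

    Building the roster one day at a time, as here, is what lets such an approach absorb day-specific constraints that person-by-person cyclic schedules cannot.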

  2. MicroComputed Tomography: Methodology and Applications

    International Nuclear Information System (INIS)

    Stock, Stuart R.

    2009-01-01

    Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies of microcomputed tomography or microCT have increased exponentially. MicroComputed Tomography provides a complete introduction to the technology, describing how to use it effectively and understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop a relative understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  3. Microcomputer system for controlling fuel rod length

    International Nuclear Information System (INIS)

    Meyer, E.R.; Bouldin, D.W.; Bolfing, B.J.

    1979-01-01

    A system is being developed at the Oak Ridge National Laboratory (ORNL) to automatically measure and control the length of fuel rods for use in a high temperature gas-cooled reactor (HTGR). The system utilizes an LSI-11 microcomputer for monitoring fuel rod length and for adjusting the primary factor affecting length. Preliminary results indicate that the automated system can maintain fuel rod length within the specified limits of 1.940 ± 0.040 in. This system provides quality control documentation and eliminates the dependence of the current fuel rod molding process on manual length control. In addition, the microcomputer system is compatible with planned efforts to extend control to fuel rod fissile and fertile material contents
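    A hedged sketch of the length-control idea: check each measured rod against the 1.940 ± 0.040 in. specification from the abstract, and correct the length-controlling process factor proportionally. The gain, function names and the notion of a scalar "factor" are our illustrative assumptions:

```python
# Spec check plus proportional correction of the process factor that
# controls rod length. TARGET/TOL are from the abstract; gain is not.
TARGET, TOL = 1.940, 0.040

def in_spec(length_in):
    return abs(length_in - TARGET) <= TOL

def adjust(factor, measured, gain=0.5):
    """Proportional correction of the primary factor affecting length."""
    return factor - gain * (measured - TARGET)

print(in_spec(1.970), in_spec(1.985))  # True False
```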

  4. Teaching Molecular Biology with Microcomputers.

    Science.gov (United States)

    Reiss, Rebecca; Jameson, David

    1984-01-01

    Describes a series of computer programs that use simulation and gaming techniques to present the basic principles of the central dogma of molecular genetics, mutation, and the genetic code. A history of discoveries in molecular biology is presented and the evolution of these computer assisted instructional programs is described. (MBR)

  5. Microcomputer control of automated TLD reader

    International Nuclear Information System (INIS)

    Bjarland, Bert.

    1979-10-01

    The interfacing electronics, the control algorithms and the developed programs of a 6800 microcomputer controlled automated TLD reader are described. The TL reading system is implemented with a photomultiplier tube and a charge-to-pulse converter. The gain of the TL reading system is controlled through the use of a temperature compensated LED reference light source. Automatic compensation of PM tube dark current is optional. The short term stability of TL readings is better than 3 %. (author)

  6. Accessing remote data bases using microcomputers

    OpenAIRE

    Saul, Peter D.

    1985-01-01

    General practitioners' access to remote data bases using microcomputers is increasing, making even the most obscure information readily available. Some of the systems available to general practitioners in the UK are described and the methods of access are outlined. General practitioners should be aware of the advances in technology; data bases are increasing in size, the cost of access is falling and their use is becoming easier.

  7. Multiprogrammation fast branch driver for microcomputer MICRAL

    International Nuclear Information System (INIS)

    Kaiser, Josef; Lacroix, Jean.

    1975-01-01

    This branch driver allows, in association with the FIFO memories of the microcomputer Micral, very fast exchanges with the 7 crates of a CAMAC branch. A CAMAC program (command, test, read, write) is loaded into the 1K FIFO buffer of the Micral before execution time and executed in sequence at a rate of 1.5 μs per CAMAC command. After program execution, data may be transferred directly onto magnetic tape [fr

  8. Moving objects management models, techniques and applications

    CERN Document Server

    Meng, Xiaofeng; Xu, Jiajie

    2014-01-01

    This book describes the topics of moving objects modeling and location tracking, indexing and querying, clustering, location uncertainty, traffic aware navigation and privacy issues as well as the application to intelligent transportation systems.

  9. Materials and techniques for model construction

    Science.gov (United States)

    Wigley, D. A.

    1985-01-01

    The problems confronting the designer of models for cryogenic wind tunnel models are discussed with particular reference to the difficulties in obtaining appropriate data on the mechanical and physical properties of candidate materials and their fabrication technologies. The relationship between strength and toughness of alloys is discussed in the context of maximizing both and avoiding the problem of dimensional and microstructural instability. All major classes of materials used in model construction are considered in some detail and in the Appendix selected numerical data is given for the most relevant materials. The stepped-specimen program to investigate stress-induced dimensional changes in alloys is discussed in detail together with interpretation of the initial results. The methods used to bond model components are considered with particular reference to the selection of filler alloys and temperature cycles to avoid microstructural degradation and loss of mechanical properties.

  10. Microcomputer-based monitoring and control system

    International Nuclear Information System (INIS)

    Talaska, D.

    1979-03-01

    This report describes a microcomputer-based monitoring and control system devised within, and used by, the Cryogenic Operations group at SLAC. Presently, a version of it is operating at the one meter liquid hydrogen bubble chamber, augmenting the conventional pneumatic and human feedback system. Its use has greatly improved the controlled tolerances of temperature and pulse shape, and it has nearly eliminated the need for operating personnel to adjust the conventional pneumatic control system. The latter is most important since the rapid cycling machine can demand attention beyond the operator's skill. Similar microcomputer systems are being prepared to monitor and control cryogenic devices situated in regions of radiation which preclude human entry and at diverse locations which defy the dexterity of the few operators assigned to maintain them. An IMSAI 8080 microcomputer is basic to the system. The key to the use of the IMSAI 8080 in this system was in the development of unique interface circuitry, and the report is mostly concerned with this

  11. Microcomputer control of a residential photovoltaic power conditioning system

    Energy Technology Data Exchange (ETDEWEB)

    Bose, B.K.; Steigerwald, R.L.; Szczesny, P.M.

    1984-01-01

    Microcomputer-based control of a residential photovoltaic power conditioning system is described. The microcomputer is responsible for array current feedback control, maximum power tracking control, array safe zone steering control, phase-locked reference wave synthesis, sequencing control, and some diagnostics. The control functions are implemented using Intel 8751 single-chip microcomputer-based hardware and software. The controller has been tested in the laboratory with the prototype power conditioner and shows excellent performance.

  13. Model measurements for new accelerating techniques

    International Nuclear Information System (INIS)

    Aronson, S.; Haseroth, H.; Knott, J.; Willis, W.

    1988-06-01

    We summarize the work carried out for the past two years, concerning some different ways for achieving high-field gradients, particularly in view of future linear lepton colliders. These studies and measurements on low power models concern the switched power principle and multifrequency excitation of resonant cavities. 15 refs., 12 figs

  14. Serial network simplifies the design of multiple microcomputer systems

    Energy Technology Data Exchange (ETDEWEB)

    Folkes, D.

    1981-01-01

    Recently there has been a lot of interest in developing network communication schemes for carrying digital data between locally distributed computing stations. Many of these schemes have focused on distributed networking techniques for data processing applications. These applications suggest the use of a serial, multipoint bus, where a number of remote intelligent units act as slaves to a central or host computer. Each slave would be serially addressable from the host and would perform required operations upon being addressed by the host. Based on an MK3873 single-chip microcomputer, the SCU 20 is designed to be such a remote slave device. The capabilities of the SCU 20 and its use in systems applications are examined.

  15. Practical application of Integrated National Energy Planning (INEP) using microcomputers

    International Nuclear Information System (INIS)

    Munasinghe, M.

    1989-01-01

    The paper describes the use of a practical microcomputer-based, hierarchical modelling framework for Integrated National Energy Planning (INEP), and policy analysis. The rationale for the concept and the development of the methodology are traced, following the energy crises of the 1970s. Details of the INEP process, which includes analysis at three hierarchical levels (the energy-microeconomic, energy sector and energy subsector) are given. A description of the various models, the scenarios and assumptions used in the analysis, as well as the linkages and interactions, is provided. The Sri Lanka energy situation is summarized, and the principal energy issues and options derived from the modelling are used to synthesize a national energy strategy. (author). 11 refs, 8 figs, 11 tabs

  16. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  17. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
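    The simplest of the condensation methods compared in the book, static (Guyan) condensation, can be shown in a few lines: slave degrees of freedom are eliminated from the stiffness matrix, leaving K_red = K_mm - K_ms K_ss^(-1) K_sm on the retained masters. The function name and the small example are ours:

```python
import numpy as np

# Static (Guyan) condensation: eliminate slave DOFs from the stiffness
# matrix, leaving the exact static stiffness on the master DOFs.
def guyan_reduce(K, masters):
    slaves = [i for i in range(K.shape[0]) if i not in masters]
    Kmm = K[np.ix_(masters, masters)]
    Kms = K[np.ix_(masters, slaves)]
    Ksm = K[np.ix_(slaves, masters)]
    Kss = K[np.ix_(slaves, slaves)]
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# Chain of three unit springs (two free nodes between fixed walls);
# condensing out node 1 leaves the static stiffness seen at node 0.
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(guyan_reduce(K, masters=[0]))  # [[1.5]]
```

    The reduction is exact for static loads; for dynamics it is only an approximation, which is what motivates the exact, dynamic, SEREP and iterative-dynamic variants the book compares.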

  18. EXCHANGE-RATES FORECASTING: EXPONENTIAL SMOOTHING TECHNIQUES AND ARIMA MODELS

    Directory of Open Access Journals (Sweden)

    Dezsi Eva

    2011-07-01

    Full Text Available Exchange rates forecasting is, and has been, a challenging task in finance. Statistical and econometric models are widely used in the analysis and forecasting of foreign exchange rates. This paper investigates the behavior of daily exchange rates of the Romanian Leu against the Euro, United States Dollar, British Pound, Japanese Yen, Chinese Renminbi and the Russian Ruble. Smoothing techniques are generated and compared with each other. These models include the Simple Exponential Smoothing technique, the Double Exponential Smoothing technique, the Simple Holt-Winters and the Additive Holt-Winters techniques, as well as the Autoregressive Integrated Moving Average model.
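    Minimal versions of two of the techniques named above: simple exponential smoothing and Brown's double exponential smoothing. The alpha value and the sample exchange-rate series are illustrative, not data from the paper:

```python
# Simple exponential smoothing: the forecast is a recursively updated
# level; alpha weights recent observations against the running level.
def ses_forecast(series, alpha=0.5):
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level  # one-step-ahead forecast

# Brown's double exponential smoothing: smooth the smoothed series
# again, then extrapolate level plus one step of trend.
def double_ses_forecast(series, alpha=0.5):
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1
        s2 = alpha * s1 + (1 - alpha) * s2
    return 2 * s1 - s2 + (alpha / (1 - alpha)) * (s1 - s2)

rates = [4.10, 4.12, 4.15, 4.20, 4.26]
print(round(ses_forecast(rates), 4))  # 4.2125
```

    On a trending series like this one, the double variant forecasts above the simple one because it extrapolates the estimated trend.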

  19. The quantitative assessment of peri-implant bone responses using histomorphometry and micro-computed tomography.

    Science.gov (United States)

    Schouten, Corinne; Meijer, Gert J; van den Beucken, Jeroen J J P; Spauwen, Paul H M; Jansen, John A

    2009-09-01

    In the present study, the effects of implant design and surface properties on the peri-implant bone response were evaluated with both conventional histomorphometry and micro-computed tomography (micro-CT), using two geometrically different dental implants (screw-type, St; push-in, Pi), with or without surface modification (non-coated, CaP-coated, or CaP-coated + TGF-beta1). After 12 weeks of implantation in a goat femoral condyle model, the peri-implant bone response was evaluated in three different zones (inner: 0-500 µm; middle: 500-1000 µm; outer: 1000-1500 µm) around the implant. The results indicated the superiority of conventional histomorphometry over micro-CT, as the latter is hampered by deficits in discrimination at the implant/tissue interface. Beyond this interface, both analysis techniques can be regarded as complementary. Histomorphometrical analysis showed an overall higher bone volume around St than around Pi implants, but no effects of surface modification were observed. St implants showed the lowest bone volumes in the outer zone, whereas for Pi implants the inner zone was lowest. These results imply that for Pi implants bone formation started from two different directions (contact and distance osteogenesis). For St implants it was concluded that the undersized implantation technique and the loosening of bone fragments compress the zones for contact and distance osteogenesis, thereby significantly improving bone volume at the interface.

  20. On a Graphical Technique for Evaluating Some Rational Expectations Models

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders R.

    2011-01-01

    Campbell and Shiller (1987) proposed a graphical technique for the present value model, which consists of plotting estimates of the spread and the theoretical spread as calculated from the cointegrated vector autoregressive model without imposing the restrictions implied by the present value model. In addition to giving a visual impression of the fit of the model, the purpose is to see whether the two spreads are nevertheless similar as measured by correlation, variance ratio, and noise ratio. We extend these techniques to a number of rational expectations models and give a general definition of spread...

  1. Conversion and distribution of bibliographic information for further use on microcomputers with database software such as CDS/ISIS

    International Nuclear Information System (INIS)

    Nieuwenhuysen, P.; Besemer, H.

    1990-05-01

    This paper describes methods to work on microcomputers with data obtained from bibliographic and related databases distributed by online data banks, on CD-ROM or on tape. Also, we mention some user reactions to this technique. We list the different types of software needed to perform these services. Afterwards, we report about our development of software, to convert data so that they can be entered into UNESCO's program named CDS/ISIS (Version 2.3) for local database management on IBM microcomputers or compatibles; this software allows the preservation of the structure of the source data in records, fields, subfields and field occurrences. (author). 10 refs, 1 fig

  2. Evaluation of autoradiograms using a microcomputer

    International Nuclear Information System (INIS)

    Birkholz, W.; Steinert, M.

    1983-01-01

    An instrument, AURAS, for the evaluation of autoradiograms has been developed. It consists of a digital photometer and the densitron system (a television scanner) for the digitization of pictures, and a microprocessor for picture processing and data storage. The digital photometer permits precise but time-consuming scanning. The densitron system works quickly, with few density classes and pseudo-colouring of pictures. For the evaluation of autoradiograms with the microcomputer, a program system, MARAUS, was written. It works in a dialogue regime. The possibilities of using the equipment for the evaluation of autoradiograms are demonstrated. (author)

  3. Microcomputer simulation of PWR power plant pressurizer

    International Nuclear Information System (INIS)

    Araujo, L.R.A. de; Calixto Neto, J.; Martinez, A.S.; Schirru, R.

    1990-01-01

    A method for simulating the behavior of the pressurizer of a PWR power plant is presented. The method was implemented on a microcomputer, and it considers all the devices for pressure control (spray and relief valves, heaters, controller, etc.). The physical phenomena and the PID (Proportional + Integral + Derivative) controller were represented mathematically by linear, uncoupled relations, discretized in time. Three different algorithms take into account the non-linear effects introduced by the variation of the physical properties with temperature and pressure, as well as the mutual effects between the physical phenomena and the PID controller. (author)
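
    The time-discretized PID control law mentioned above can be sketched as follows. This is a generic textbook discretization, not the simulator's actual code; the gains, sampling step and pressure values are invented for illustration.

```python
class DiscretePID:
    """Positional PID with time-discretized integral and derivative terms."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # rectangle-rule integral
        derivative = (error - self.prev_error) / self.dt  # backward difference
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. pressure setpoint 15.5 MPa, measured 15.2 MPa (invented values)
pid = DiscretePID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
u = pid.step(15.5, 15.2)  # controller output driving heaters/spray
```

    In a simulation like the one described, `step` would be called once per time step, with the output fed back into the linearized pressurizer model.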

  4. Microcomputer Network for Computerized Adaptive Testing (CAT)

    Science.gov (United States)

    1984-03-01

    NPRDC TR 84-33. Microcomputer Network for Computerized Adaptive Testing (CAT). Baldwin Quan, Thomas A. Park, Gary Sandahl, John H. ... Keywords: computerized adaptive testing (CAT); Bayesian sequential testing.

  5. Switching from computer to microcomputer architecture education

    Science.gov (United States)

    Bolanakis, Dimosthenis E.; Kotsis, Konstantinos T.; Laopoulos, Theodore

    2010-03-01

    In the last decades, the technological and scientific evolution of the computing discipline has been widely affecting research in software engineering education, which nowadays advocates more enlightened and liberal ideas. This article reviews cross-disciplinary research on a computer architecture class in consideration of its switching to microcomputer architecture. The authors present their strategies towards a successful crossing of boundaries between engineering disciplines. This communication aims at providing a different aspect on professional courses that are, nowadays, addressed at the expense of traditional courses.

  6. Single-chip microcomputer application in nuclear radiation monitoring instruments

    International Nuclear Information System (INIS)

    Zhang Songshou

    1994-01-01

    The single-chip microcomputer has advantages in many respects, i.e. multiple functions, small size, low power consumption, reliability, etc. It is now widely used in industry, instrumentation, communication and machinery. The author introduces the use of single-chip microcomputers in nuclear radiation monitoring instruments for control, linear compensation, calculation, changeable parameter presetting and military training.

  7. User's manual for levelized power generation cost using a microcomputer

    International Nuclear Information System (INIS)

    Fuller, L.C.

    1984-08-01

    Microcomputer programs for the estimation of levelized electrical power generation costs are described. Procedures for light-water reactor plants and coal-fired plants include capital investment cost, operation and maintenance cost, fuel cycle cost, nuclear decommissioning cost, and levelized total generation cost. Programs are written in Pascal and are run on an Apple II Plus microcomputer
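
    The levelized-cost idea behind such programs can be sketched as discounted lifetime costs divided by discounted lifetime generation. This sketch is in Python rather than the report's Pascal, and all figures are invented placeholders, not values from the manual.

```python
def levelized_cost(capital, annual_om, annual_fuel, annual_mwh,
                   years, discount_rate):
    """Levelized generation cost in $/MWh: discounted lifetime costs
    divided by discounted lifetime electricity generation."""
    costs = capital          # overnight capital cost, paid up front
    energy = 0.0
    for t in range(1, years + 1):
        df = (1.0 + discount_rate) ** -t     # discount factor for year t
        costs += (annual_om + annual_fuel) * df
        energy += annual_mwh * df
    return costs / energy

# illustrative plant: $2e9 capital, $50M O&M/yr, $40M fuel/yr, 8 TWh/yr, 30 yr
lcoe = levelized_cost(2.0e9, 5.0e7, 4.0e7, 8.0e6, 30, 0.05)
```

    The real programs break the cost categories down much further (fuel cycle, decommissioning, escalation), but each category ultimately enters a ratio of this form.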

  8. Microcomputers and Informatics Education at the University Level.

    Science.gov (United States)

    Boyanov, Todor

    1984-01-01

    Because of the widespread use of microcomputers in Bulgaria, informatics education for all college students is considered both possible and necessary. Uses of microcomputers in various disciplines are described, including those in mathematics/mechanics, the experimental sciences, and humanities. Brief comments on computer-assisted-learning and…

  9. Microcomputer based test system for charge coupled devices

    International Nuclear Information System (INIS)

    Sidman, S.

    1981-02-01

    A microcomputer based system for testing analog charge coupled integrated circuits has been developed. It measures device performance for three parameters: dynamic range, baseline shift due to leakage current, and transfer efficiency. A companion board tester has also been developed. The software consists of a collection of BASIC and assembly language routines developed on the test system microcomputer

  10. The Design and Development of Educational Materials Using Microcomputer Technology in Distance Teaching Institutions: Some Issues for Consideration.

    Science.gov (United States)

    Yates, Christopher

    Perhaps the most significant development in microcomputer technology over the last two years has been the development of desktop publishing techniques. This technology promises to offer some significant advantages to institutions developing instructional materials in less developed countries, particularly in terms of control, cost effectiveness,…

  11. Virtual 3d City Modeling: Techniques and Applications

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2013-08-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. Various terms are used for 3D city models, such as "Cybertown", "Cybercity", "Virtual City", or "Digital City". 3D city models are basically computerized or digital models of a city containing graphic representations of buildings and other objects in 2.5D or 3D. Generally, three main Geomatics approaches are used for generating virtual 3D city models: in the first approach, researchers use conventional techniques such as vector map data, DEMs and aerial images; the second approach is based on high-resolution satellite images with laser scanning; and in the third, many researchers use terrestrial images, applying close-range photogrammetry with DSM and texture mapping. We start this paper with an introduction to the various Geomatics techniques for 3D city modeling. These techniques are divided into two main categories: one based on the degree of automation (automatic, semi-automatic and manual methods), and another based on data input techniques (photogrammetry and laser techniques). After a detailed study of these, we present the conclusions of this research, together with a short justification and analysis and the present trend in 3D city modeling. This paper gives an overview of the techniques related to the generation of virtual 3D city models using Geomatics techniques, and of the applications of virtual 3D city models. Photogrammetry (close-range, aerial, satellite), lasergrammetry, GPS, or a combination of these modern Geomatics techniques play a major role in creating a virtual 3D city model. Each technique and method has some advantages and some drawbacks. The point cloud model is a modern trend for virtual 3D city models.
Photo-realistic, Scalable, Geo-referenced virtual 3

  12. A microcomputer network for the control of digitising machines

    International Nuclear Information System (INIS)

    Seller, P.

    1981-01-01

    A distributed microcomputer network operates in the Bubble Chamber Research Group scanning laboratory at the Rutherford and Appleton Laboratories. A microcomputer at each digitising table buffers information, controls the functioning of the table and enhances the machine/operator interface. The system consists of fourteen microcomputers together with a VAX 11/780 computer used for data analysis, inter-connected via a packet-switched network. This paper describes the features of the combined system, including the distributed computing architecture and the packet-switched method of communication. It also describes in detail a high-speed packet-switching controller used as the central node of the network. This controller is a multiprocessor microcomputer system with eighteen central processor units, thirty-four direct memory access channels and thirty-four prioritised and vectored interrupt channels. This microcomputer is of general interest as a communications controller due to its totally programmable nature. (orig.)

  13. Circuit oriented electromagnetic modeling using the PEEC techniques

    CERN Document Server

    Ruehli, Albert; Jiang, Lijun

    2017-01-01

    This book provides intuitive solutions to electromagnetic problems by using the Partial Element Equivalent Circuit (PEEC) method. It begins with an introduction to circuit analysis techniques, laws, and frequency- and time-domain analyses. The authors also treat Maxwell's equations, capacitance computations, and inductance computations through the lens of the PEEC method. Next, readers learn to build PEEC models in various forms: equivalent circuit models, non-orthogonal PEEC models, skin-effect models, PEEC models for dielectrics, incident and radiated field models, and scattering PEEC models. The book concludes by considering issues such as stability and passivity, and includes five appendices, some with formulas for partial elements.

  14. [Intestinal lengthening techniques: an experimental model in dogs].

    Science.gov (United States)

    Garibay González, Francisco; Díaz Martínez, Daniel Alberto; Valencia Flores, Alejandro; González Hernández, Miguel Angel

    2005-01-01

    To compare two intestinal lengthening procedures in an experimental dog model. Intestinal lengthening is one of the methods of gastrointestinal reconstruction used in the treatment of short bowel syndrome. The modification of Bianchi's technique is an alternative: it decreases the number of anastomoses to a single one, thus reducing the risk of leaks and strictures. To our knowledge, no clinical or experimental report has studied both techniques, hence the present study. Twelve creole dogs were operated on with the Bianchi technique for intestinal lengthening (group A), and another 12 creole dogs of the same breed and weight were operated on with the modified technique (group B). The two groups were compared with respect to operating time, technical difficulty, cost, intestinal lengthening and anastomosis diameter. There was no statistical difference in anastomosis diameter (A = 9.0 mm vs. B = 8.5 mm, p = 0.3846). Operating time (142 min vs. 63 min), cost and technical difficulty were lower in group B. The anastomoses of group B and the intestinal segments had a good blood supply and were patent along their full length. The Bianchi technique and the modified technique are two good, reliable alternatives for the treatment of short bowel syndrome. The modified technique improved operating time, cost and technical issues.

  15. Efficiency determination of whole-body counter by Monte Carlo method, using a microcomputer

    International Nuclear Information System (INIS)

    Fernandes Neto, Jose Maria

    1986-01-01

    The purpose of this investigation was the development of an analytical microcomputer model to evaluate whole-body counter efficiency. The model is based on a modified Snyder model. A stretcher-type geometry was used, along with the Monte Carlo method and a Sinclair-type microcomputer. Experimental measurements were performed using two phantoms, one representing an adult and the other a 5-year-old child. The phantoms were made of acrylic, and 99mTc, 131I and 42K were the radioisotopes utilized. Results showed a close relationship between experimental and predicted data for energies ranging from 250 keV to 2 MeV, but some discrepancies were found at lower energies. (author)
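
    The Monte Carlo idea can be illustrated with a much-simplified geometry: sample isotropic photon directions from a point source and count those intercepting a circular detector face. The function name, dimensions and sample count below are invented; the study's actual model (a modified Snyder phantom with attenuation) is far more detailed.

```python
import math, random

def geometric_efficiency(source_height, detector_radius, n=200_000, seed=1):
    """Fraction of isotropically emitted photons that strike a disk detector
    of the given radius, centred at the given height above a point source."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cos_t = rng.uniform(-1.0, 1.0)   # isotropic emission: cos(theta) uniform
        if cos_t <= 0.0:
            continue                     # emitted away from the detector plane
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        if source_height * sin_t / cos_t <= detector_radius:
            hits += 1                    # ray crosses the detector face
    return hits / n

eff = geometric_efficiency(source_height=10.0, detector_radius=3.8)
```

    For this geometry the analytic solid-angle fraction is (1 - h/sqrt(h^2 + R^2))/2, about 0.033 here, which the estimate should approach as n grows.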

  16. Modelling Technique for Demonstrating Gravity Collapse Structures in Jointed Rock.

    Science.gov (United States)

    Stimpson, B.

    1979-01-01

    Described is a base-friction modeling technique for studying the development of collapse structures in jointed rocks. A moving belt beneath weak material is designed to simulate gravity. A description is given of the model frame construction. (Author/SA)

  17. A pilot modeling technique for handling-qualities research

    Science.gov (United States)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  18. Summary on several key techniques in 3D geological modeling.

    Science.gov (United States)

    Mei, Gang

    2014-01-01

    Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized.
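
    One of the generic techniques named above, spatial interpolation, can be sketched with inverse-distance weighting (IDW), a common choice for estimating the elevation of a geological interface at unsampled points. The borehole picks below are invented for illustration; the paper surveys several interpolation algorithms rather than prescribing this one.

```python
def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate of z at (x, y) from (x, y, z) samples."""
    num = den = 0.0
    for sx, sy, sz in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return sz                    # exactly on a sample point
        w = d2 ** (-power / 2.0)         # weight = 1 / distance**power
        num += w * sz
        den += w
    return num / den

# invented borehole picks of a stratum top: (easting, northing, elevation in m)
picks = [(0, 0, 120.0), (100, 0, 110.0), (0, 100, 130.0), (100, 100, 125.0)]
z_centre = idw(picks, 50, 50)
```

    Evaluating such an interpolant over the nodes of a planar mesh yields the discrete geometric surface that the subsequent surface-intersection step operates on.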

  19. 3D Modeling Techniques for Print and Digital Media

    Science.gov (United States)

    Stephens, Megan Ashley

    In developing my thesis, I looked to gain skills using ZBrush to create 3D models, 3D scanning, and 3D printing. The models created compared the hearts of several vertebrates and were intended for students attending Comparative Vertebrate Anatomy. I used several resources to create a model of the human heart and was able to work from life while creating heart models from other vertebrates. I successfully learned ZBrush and 3D scanning, and successfully printed 3D heart models. ZBrush allowed me to create several intricate models for use in both animation and print media. The 3D scanning technique did not fit my needs for the project, but may be of use for later projects. I was able to 3D print using two different techniques as well.

  20. A New ABCD Technique to Analyze Business Models & Concepts

    OpenAIRE

    Aithal P. S.; Shailasri V. T.; Suresh Kumar P. M.

    2015-01-01

    Various techniques are used to analyze individual characteristics or organizational effectiveness, such as SWOT analysis, SWOC analysis, PEST analysis, etc. These techniques provide an easy and systematic way of identifying the various issues affecting a system and provide an opportunity for further development. Whereas they provide a broad-based assessment of individual institutions and systems, they suffer limitations when applied to a business context. The success of any business model depends on ...

  1. A Method to Test Model Calibration Techniques: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-09-01

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique, 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
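
    The testing idea can be sketched with a deliberately tiny surrogate: a two-parameter "building model" generates synthetic utility bills from known "true" parameters, a calibration routine must recover them, and both are then asked to predict savings for a retrofit. The model form (monthly energy as base load plus a heating-degree-day slope) and all numbers are invented for illustration, not taken from the paper.

```python
def energy_use(base, slope, hdd):
    """Monthly energy: base load plus heating-degree-day-driven heating."""
    return [base + slope * h for h in hdd]

def calibrate(bills, hdd):
    """Closed-form ordinary least squares for the two model parameters."""
    n = len(hdd)
    mean_h = sum(hdd) / n
    mean_e = sum(bills) / n
    sxx = sum((h - mean_h) ** 2 for h in hdd)
    sxy = sum((h - mean_h) * (e - mean_e) for h, e in zip(hdd, bills))
    slope = sxy / sxx
    return mean_e - slope * mean_h, slope

hdd = [600, 500, 300, 100, 50, 0, 0, 10, 80, 250, 450, 580]   # monthly HDD
true_base, true_slope = 400.0, 1.8
bills = energy_use(true_base, true_slope, hdd)   # surrogate "utility bill" data

base, slope = calibrate(bills, hdd)
# figures of merit: closure on the true parameters, and accuracy of the
# predicted saving for a hypothetical retrofit cutting the heating slope 20%
pred_saving = (sum(energy_use(base, slope, hdd))
               - sum(energy_use(base, 0.8 * slope, hdd)))
true_saving = (sum(energy_use(true_base, true_slope, hdd))
               - sum(energy_use(true_base, 0.8 * true_slope, hdd)))
```

    In the noiseless case the calibration closes exactly; the interesting tests arise when noise or model-form error is injected into the surrogate bills, since a good fit to the bills then no longer guarantees an accurate savings prediction.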

  2. Modeling Techniques for a Computational Efficient Dynamic Turbofan Engine Model

    Directory of Open Access Journals (Sweden)

    Rory A. Roberts

    2014-01-01

    Full Text Available A transient two-stream engine model has been developed. Individual component models developed exclusively in MATLAB/Simulink, including the fan, high-pressure compressor, combustor, high-pressure turbine, low-pressure turbine, plenum volumes, and exit nozzle, have been combined to investigate the behavior of a two-stream turbofan engine. Special attention has been paid to developing transient capabilities throughout the model, increasing the physical fidelity of the model, eliminating algebraic constraints, and reducing simulation time by enabling the use of advanced numerical solvers. Reducing computation time is paramount for conducting future aircraft system-level design trade studies and optimization. The new engine model is simulated for a fuel perturbation and a specified mission while tracking critical parameters. These results, as well as the simulation times, are presented. The new approach significantly reduces the simulation time.

  3. Gamma spectra analysis from a NaI(Tl) scintillation detector using a micro-computer

    International Nuclear Information System (INIS)

    Levinson, S.

    1990-01-01

    A software package was developed for the qualitative and quantitative evaluation of gamma-ray spectra obtained from a NaI(Tl) scintillation counter by means of a micro-computer. The programs can easily be adapted for use with a Ge(Li) detector. The various algorithms enable automatic analysis of a spectrum, as well as interactive or manual modes. The graphics programs display the measured spectrum together with the spectra of standard radionuclides, which helps in the identification of peaks and the related radionuclides in the spectrum. The peak search is carried out on a smoothed spectrum and is done by checking the behaviour of the second and third derivatives. The algorithm solves the problem of overlapping peaks and performs gaussian fitting, if necessary. Determination of the various radionuclides in the spectrum is done by linear least-squares techniques. An overall analysis of the radionuclide activities in the spectrum is obtained for samples of various counting geometries. In addition, a model was developed for the efficiency calibration of flat 3×3 inch NaI(Tl) detectors for different samples measured in various counting geometries. It is based on fitting an experimental point-source efficiency curve. (author)
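
    The derivative-based peak search described above can be sketched as follows: smooth the spectrum, then flag channels where the numerical second derivative has a sufficiently negative local minimum (the concave dip of a photopeak). The synthetic spectrum and threshold are invented; the original algorithm also checks the third derivative and performs gaussian fitting.

```python
import math

def smooth(counts, passes=2):
    """Simple 1-2-1 binomial smoothing, endpoints kept fixed."""
    for _ in range(passes):
        counts = ([counts[0]]
                  + [(counts[i - 1] + 2 * counts[i] + counts[i + 1]) / 4.0
                     for i in range(1, len(counts) - 1)]
                  + [counts[-1]])
    return counts

def find_peaks(counts, threshold):
    s = smooth(counts)
    # numerical second derivative; d2[k] corresponds to channel k + 1
    d2 = [s[i - 1] - 2.0 * s[i] + s[i + 1] for i in range(1, len(s) - 1)]
    peaks = []
    for i in range(1, len(d2) - 1):
        # a photopeak shows up as a strongly negative local minimum of d2
        if d2[i] < -threshold and d2[i] <= d2[i - 1] and d2[i] <= d2[i + 1]:
            peaks.append(i + 1)
    return peaks

# synthetic spectrum: flat background plus a gaussian photopeak at channel 30
spectrum = [50.0 + 400.0 * math.exp(-((ch - 30) ** 2) / 18.0) for ch in range(60)]
peak_channels = find_peaks(spectrum, threshold=5.0)
```

    Working on the second derivative rather than the raw counts is what lets such a search reject a slowly varying background while still responding to the sharp curvature of a peak.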

  4. Visualization of haemophilic arthropathy in F8(-/-) rats by ultrasonography and micro-computed tomography

    DEFF Research Database (Denmark)

    Christensen, K R; Roepstorff, K; Petersen, M

    2017-01-01

    opportunities. Recently, a F8(-/-) rat model of HA was developed. The size of the rat allows for convenient and high-resolution imaging of the joints, which could enable in vivo studies of HA development. AIM: To determine whether HA in the F8(-/-) rat can be visualized using ultrasonography (US) and micro-computed tomography (μCT). METHODS: Sixty F8(-/-) and 20 wild-type rats were subjected to a single or two induced knee bleeds. F8(-/-) rats were treated with either recombinant human FVIII (rhFVIII) or vehicle before the induction of knee bleeds. Haemophilic arthropathy was visualized using in vivo US and ex vivo μCT, and the observations correlated with histological evaluation. RESULTS: US and μCT detected pathologies in the knee related to HA. There was a strong correlation between disease severity determined by μCT and histopathology. rhFVIII treatment reduced the pathology identified with both imaging techniques. CONCLUSION: US...

  5. Numerical model updating technique for structures using firefly algorithm

    Science.gov (United States)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique for updating existing numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc. The basic concept behind this technique is to update the numerical model so that it closely matches experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show a close relationship between the experimental and numerical models.
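
    A minimal sketch of the firefly algorithm as an optimizer of this kind is given below: fireflies move toward brighter (lower-cost) ones, with attractiveness decaying with distance, plus a decaying random walk. The objective here is a stand-in quadratic discrepancy with invented target values; in model updating it would measure the mismatch between numerical and experimental responses, and all algorithm parameters are illustrative.

```python
import math, random

def firefly_minimize(objective, dim, bounds, n_fireflies=15, n_iter=80,
                     beta0=1.0, gamma=0.05, alpha=0.1, seed=3):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_fireflies)]
    cost = [objective(p) for p in pop]
    for it in range(n_iter):
        step = alpha * (0.95 ** it) * (hi - lo)  # decaying random-walk amplitude
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:  # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness
                    pop[i] = [min(hi, max(lo, a + beta * (b - a)
                                          + step * (rng.random() - 0.5)))
                              for a, b in zip(pop[i], pop[j])]
                    cost[i] = objective(pop[i])
    best = min(range(n_fireflies), key=cost.__getitem__)
    return pop[best], cost[best]

# stand-in objective: squared discrepancy from an invented "measured" target
objective = lambda p: (p[0] - 1.2) ** 2 + (p[1] + 0.7) ** 2
params, err = firefly_minimize(objective, dim=2, bounds=(-5.0, 5.0))
```

    In an updating study, `params` would be the material or geometric properties being tuned and the objective would compare, say, predicted and measured tip deflection or natural frequencies.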

  6. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments, using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and show how these UQ techniques can be used in model selection and in assessing the quality of calibrated physical parameters.

  7. Models and Techniques for Proving Data Structure Lower Bounds

    DEFF Research Database (Denmark)

    Larsen, Kasper Green

    In this dissertation, we present a number of new techniques and tools for proving lower bounds on the operational time of data structures. These techniques provide new lines of attack for proving lower bounds in the cell probe model, the group model, the pointer machine model and the I/O-model. In all cases, we push the frontiers further by proving lower bounds higher than what could possibly be proved using previously known techniques. For the cell probe model, our results have the following consequences: the first Ω(lg n) query time lower bound for linear-space static data structures ... a lower bound of tu·tq = Ω(lg^(d-1) n). For ball range searching, we get a lower bound of tu·tq = Ω(n^(1-1/d)). The highest previous lower bound proved in the group model does not exceed Ω((lg n / lg lg n)^2) on the maximum of tu and tq. Finally, we present a new technique for proving lower bounds...

  8. Single-chip microcomputer based protection, diagnostic and recording system for longwall shearers

    Energy Technology Data Exchange (ETDEWEB)

    Heyduk, A.; Krasucki, F. (Politechnika Slaska, Gliwice (Poland). Katedra Elektryfikacji i Automatyzacji Gornictwa)

    1993-05-01

    Presents a concept of microcomputer-aided operation, protection, diagnostics and recording for shearer loaders. A two-stage mathematical model is suggested and explained. The model represents the thermal processes that determine the overcurrent protection of the drive motors. Circuits for monitoring fuses, supply voltages, contacts, relays, contactors and electro-hydraulic distributors with the use of optocouplers (transoptors) are shown. Recording the characteristic operating parameters of a shearer loader during the 5 minutes before a failure is proposed. Protection, diagnosis and control functions are suggested as additional functions of the microcomputer-aided shearer loader control system being developed at the Silesian Technical University. The system is based on the NEC μPD 78310 microprocessor. 10 refs.
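
    A two-stage thermal model of the kind described can be sketched as two coupled first-order stages: a fast winding stage fed by I²-proportional copper losses, and a slower motor-body stage coupled to ambient. All constants, thresholds and currents below are invented for illustration, not shearer data.

```python
def simulate_trip(current, dt=1.0, t_max=3600.0):
    """Return the time (s) at which the modelled winding temperature reaches
    the trip threshold, or None if the motor survives the interval."""
    # illustrative constants: winding (fast) and body (slow) thermal stages
    R1, C1 = 0.5, 200.0      # winding-to-body resistance (K/W), capacity (J/K)
    R2, C2 = 1.0, 2000.0     # body-to-ambient resistance, capacity
    k_loss = 0.05            # copper-loss coefficient (W per A^2)
    ambient, trip_temp = 40.0, 120.0
    theta_w = theta_b = ambient
    t = 0.0
    while t < t_max:
        p_loss = k_loss * current ** 2                    # I^2 heating
        dthw = (p_loss - (theta_w - theta_b) / R1) / C1   # winding stage
        dthb = ((theta_w - theta_b) / R1 - (theta_b - ambient) / R2) / C2
        theta_w += dthw * dt
        theta_b += dthb * dt
        t += dt
        if theta_w >= trip_temp:
            return t
    return None
```

    The two time constants give the characteristic overcurrent behaviour: a heavy overload trips quickly on the winding stage, a moderate one trips slowly as the body warms, and a small current never trips at all.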

  9. A collection and information analysis of the experiment with microcomputer

    International Nuclear Information System (INIS)

    Mohd Ariffin bin Aton; Ler Leong Tat

    1985-01-01

    A microcomputer-based system for the continuous collection and analysis of data from a fermentor is described. The system was designed around commercially available hardware and interfaces, with software packages written for microcomputers. Additional programs were written in BASIC to allow the results to be printed in a specific format. The data read from the fermentor were automatically stored on a floppy disc, and analysis of the data can be performed at any convenient time. Such a method of data collection is not limited to a bioreactor, however: instruments that require continuous, accurate readings, such as GLC, HPLC, etc., could be coupled to a microcomputer system. (author)

  10. Microcomputer control of automated TL reader

    International Nuclear Information System (INIS)

    Bjarland, B.

    1980-01-01

    An automatic TL reader has been developed for use within a TLD based personal monitoring service. A 6800 based microcomputer is used for system control, operator communication, calibration and checking of reader operation, and for output of data. The dosimeter identity code is printed in human readable characters on the dosimeter card, and is read by using an optical character recognition unit. The code may include individual sensitivity correction coefficients for the TL chips on the card. The chips are heated with hot nitrogen gas and the thermoluminescence is recorded by a photomultiplier tube circuit, the gain and offset of which are continuously monitored and, when necessary, adjusted, to maintain calibration. The reader may operate in any of seven modes, i.e. reading modes for three types of dosimeters, semiautomatic modes for production of the three types of dosimeters, and a monitor mode. (Auth.)

  11. VME bus based microcomputer system boards

    International Nuclear Information System (INIS)

    Chandra, A.K.; Ganesh, G.; Mayya, Anuradha; Chachondia, A.S.; Premraj, M.K.

    1991-01-01

    Several operator information systems for nuclear plants have been developed in the Division, and these have involved extensive use of microcomputer boards for achieving various functions. Standard VME bus based boards have been developed to provide the most commonly used functions. These boards have been fabricated, tested and used in several systems, including Channel Temperature Monitoring systems and Disturbance Recording Systems, and are also proposed for use in additional systems under development. The use of standard buses and boards provides considerable savings in engineering time, prototyping, testing and evaluation costs, and maintenance support. This report describes the various boards developed and the functions available on each. (author). 4 refs., 11 figs., 3 appendixes

  12. 'Micro-8' micro-computer system

    International Nuclear Information System (INIS)

    Yagi, Hideyuki; Nakahara, Yoshinori; Yamada, Takayuki; Takeuchi, Norio; Koyama, Kinji

    1978-08-01

    The micro-computer Micro-8 system has been developed to organize a data exchange network between various instruments and a computer group including a large computer system. Used for packet exchangers and terminal controllers, the system consists of ten kinds of standard boards including a CPU board with INTEL-8080 one-chip-processor. CPU architecture, BUS architecture, interrupt control, and standard-boards function are explained in circuit block diagrams. Operations of the basic I/O device, digital I/O board and communication adapter are described with definitions of the interrupt ramp status, I/O command, I/O mask, data register, etc. In the appendixes are circuit drawings, INTEL-8080 micro-processor specifications, BUS connections, I/O address mappings, jumper connections of address selection, and interface connections. (author)

  13. Microcomputer-controlled ultrasonic data acquisition system

    International Nuclear Information System (INIS)

    Simpson, W.A. Jr.

    1978-11-01

    The large volume of ultrasonic data generated by computer-aided test procedures has necessitated the development of a mobile, high-speed data acquisition and storage system. This approach offers the decided advantage of on-site data collection and remote data processing. It also utilizes standard, commercially available ultrasonic instrumentation. This system is controlled by an Intel 8080A microprocessor. The MCS80-SDK microcomputer board was chosen, and magnetic tape is used as the storage medium. A detailed description is provided of both the hardware and software developed to interface the magnetic tape storage subsystem to Biomation 8100 and Biomation 805 waveform recorders. A boxcar integrator acquisition system is also described for use when signal averaging becomes necessary. Both assembly language and machine language listings are provided for the software

  14. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  15. Plants status monitor: Modelling techniques and inherent benefits

    International Nuclear Information System (INIS)

    Breeding, R.J.; Lainoff, S.M.; Rees, D.C.; Prather, W.A.; Fickiessen, K.O.E.

    1987-01-01

    The Plant Status Monitor (PSM) is designed to provide plant personnel with information on the operational status of the plant and compliance with the plant technical specifications. The PSM software evaluates system models using a 'distributed processing' technique in which detailed models of individual systems are processed rather than a single, plant-level model. In addition, development of the system models for PSM provides inherent benefits to the plant by forcing detailed reviews of the technical specifications, system design and operating procedures, and plant documentation. (orig.)

  16. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
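    The reliance on existing code runs, rather than a fitted response surface, can be illustrated with a small importance-reweighting sketch (the stand-in model and all numbers below are invented for illustration, not taken from the SPARC study): one seeded set of "expensive" runs is reused to estimate how the output mean shifts when the input distribution assumption changes.

    ```python
    import math
    import random

    def model(x):
        """Stand-in for an expensive source-term code run (illustrative only)."""
        return math.exp(0.5 * x)

    def normal_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

    random.seed(1)
    # one (and only one) set of "expensive" runs, drawn under the base input N(0, 1)
    xs = [random.gauss(0.0, 1.0) for _ in range(20000)]
    ys = [model(x) for x in xs]
    base_mean = sum(ys) / len(ys)

    # sensitivity to an alternative input assumption N(0.3, 1): reweight the
    # existing runs by the likelihood ratio -- no new model evaluations needed
    w = [normal_pdf(x, 0.3, 1.0) / normal_pdf(x, 0.0, 1.0) for x in xs]
    alt_mean = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    ```

    Shifting the input mean upward raises the weighted output mean without rerunning the model, which is the kind of direct-reuse sensitivity the abstract describes.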

  17. Selection of productivity improvement techniques via mathematical modeling

    Directory of Open Access Journals (Sweden)

    Mahassan M. Khater

    2011-07-01

    This paper presents a new mathematical model to select an optimal combination of productivity improvement techniques. The proposed model considers a four-stage productivity cycle, and productivity is assumed to be a linear function of fifty-four improvement techniques. The model is implemented for a real-world case study of a manufacturing plant. The resulting problem is formulated as a mixed integer program which can be solved to optimality using traditional methods. The preliminary results of the implementation indicate that productivity can be improved through changes in equipment, and the model can easily be applied to both manufacturing and service industries.
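    For a toy instance, the flavour of such a selection problem can be sketched by brute-force enumeration of the binary choices under a budget constraint (the gains, costs and budget below are invented; the paper's actual model covers fifty-four techniques and is solved as a proper mixed integer program):

    ```python
    from itertools import combinations

    # invented gains and costs for five candidate improvement techniques
    gains = [4.0, 2.5, 1.5, 3.0, 2.0]
    costs = [5.0, 3.0, 1.0, 4.0, 2.5]
    budget = 8.0

    # enumerate every subset of techniques and keep the best feasible one
    best_gain, best_set = 0.0, ()
    for r in range(len(gains) + 1):
        for subset in combinations(range(len(gains)), r):
            cost = sum(costs[i] for i in subset)
            if cost <= budget:
                gain = sum(gains[i] for i in subset)
                if gain > best_gain:
                    best_gain, best_set = gain, subset
    ```

    Enumeration is only viable for a handful of techniques; with fifty-four binary variables a MIP solver with branch-and-bound is the traditional method the abstract alludes to.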

  18. Constructing canine carotid artery stenosis model by endovascular technique

    International Nuclear Information System (INIS)

    Cheng Guangsen; Liu Yizhi

    2005-01-01

    Objective: To establish a carotid artery stenosis model by an endovascular technique suitable for neuro-interventional therapy. Methods: Twelve dogs were anesthetized, and the tunica media and intima of unilateral segments of the carotid arteries were damaged with a home-made corneous guide wire. Twenty-four carotid artery stenosis models were thus created. DSA examination was performed at postprocedural weeks 2, 4, 8 and 10 to assess the changes in the stenotic carotid arteries. Results: Twenty-four carotid artery stenosis models were successfully created in twelve dogs. Conclusions: Canine carotid artery stenosis models can be created with the endovascular method, with variations of pathological character and hemodynamic changes similar to those in humans. This is useful for further research on new techniques and new materials for interventional treatment. (authors)

  19. Modeling and Simulation Techniques for Large-Scale Communications Modeling

    National Research Council Canada - National Science Library

    Webb, Steve

    1997-01-01

    .... Tests of random number generators were also developed and applied to CECOM models. It was found that synchronization of random number strings in simulations is easy to implement and can provide significant savings for making comparative studies. If synchronization is in place, then statistical experiment design can be used to provide information on the sensitivity of the output to input parameters. The report concludes with recommendations and an implementation plan.

  20. Modeling and design techniques for RF power amplifiers

    CERN Document Server

    Raghavan, Arvind; Laskar, Joy

    2008-01-01

    The book covers RF power amplifier design, from device and modeling considerations to advanced circuit design architectures and techniques. It focuses on recent developments and advanced topics in this area, including numerous practical designs to back the theoretical considerations. It presents the challenges in designing power amplifiers in silicon and helps the reader improve the efficiency of linear power amplifiers, and design more accurate compact device models, with faster extraction routines, to create cost effective and reliable circuits.

  1. Techniques for discrimination-free predictive models (Chapter 12)

    NARCIS (Netherlands)

    Kamiran, F.; Calders, T.G.K.; Pechenizkiy, M.; Custers, B.H.M.; Calders, T.G.K.; Schermer, B.W.; Zarsky, T.Z.

    2013-01-01

    In this chapter, we give an overview of the techniques we developed for constructing discrimination-free classifiers. In discrimination-free classification the goal is to learn a predictive model that classifies future data objects as accurately as possible, yet the predicted labels should be

  2. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of cognitive levels most often represent some modification of the original Bloom's taxonomy. There are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive level validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, and so improvements are needed in describing the cognitive skills measured by items.

  3. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co(III) porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  4. Air quality modelling using chemometric techniques | Azid | Journal ...

    African Journals Online (AJOL)

    This study shows that chemometric techniques and modelling are an excellent tool for API assessment and for air pollution source identification and apportionment, and can help overcome setbacks in designing an API monitoring network for effective air pollution resource management. Keywords: air pollutant index; chemometric; ANN; ...

  5. Tractor performance monitor based on a single-chip microcomputer

    Energy Technology Data Exchange (ETDEWEB)

    Bedri, A.R.; Marley, S.J.; Buchelle, W.F.; Smay, T.A.

    1981-01-01

    A tractor performance monitor based on a single-chip microcomputer was developed to measure ground speed, slip, fuel consumption (rate and total), total area, theoretical time, and total time. Transducers used are presented in detail. 5 refs.

  6. The artificial satellite observation chronograph controlled by single chip microcomputer.

    Science.gov (United States)

    Pan, Guangrong; Tan, Jufan; Ding, Yuanjun

    1991-06-01

    The instrument specifications, hardware structure, software design, and other characteristics of the chronograph mounting on a theodolite used for artificial satellite observation are presented. The instrument is a real time control system with a single chip microcomputer.

  7. Laser-based measuring equipment controlled by microcomputer

    International Nuclear Information System (INIS)

    Miron, N.; Sporea, D.; Velculescu, V.G.; Petre, M.

    1988-03-01

    Laser-based measuring equipment controlled by microcomputer, developed for industrial and scientific purposes, is described. This equipment is intended for the verification of dial indicators, the measurement of graduated rules, and very accurate measurement of the gravitational constant. (authors)

  8. Investigating Electromagnetic Induction through a Microcomputer-Based Laboratory.

    Science.gov (United States)

    Trumper, Ricardo; Gelbman, Moshe

    2000-01-01

    Describes a microcomputer-based laboratory experiment designed for high school students that very accurately analyzes Faraday's law of electromagnetic induction, addressing each variable separately while the others are kept constant. (Author/CCM)

  9. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, D.; Alfonsi, A.; Talbot, P.; Wang, C.; Maljovec, D.; Smith, C.; Rabiti, C.; Cogliati, J.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also the temporal evolution of external events and component/system ageing. Thus, the problem being addressed is not only multi-physics but also multi-scale (both spatially, µm-mm-m, and temporally, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable in certain cases. A solution being evaluated to address this computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the computational cost of a RISMC analysis by decreasing the number of simulation runs; for this improvement we used surrogate models instead of the actual simulation codes. This article focuses on reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (microseconds instead of hours/days).
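    A minimal sketch of the surrogate-model idea: fit a cheap quadratic response surface by least squares to a handful of "expensive" runs, then evaluate the surrogate instead of the code (the stand-in simulation below is invented; RISMC uses far richer surrogates):

    ```python
    import math

    def expensive_sim(x):
        """Stand-in for a long-running simulation code (illustrative only)."""
        return math.sin(x) + 0.1 * x * x

    # a small design of "expensive" runs over the input range of interest
    xs = [i * 0.25 for i in range(9)]          # 0.0 .. 2.0
    ys = [expensive_sim(x) for x in xs]

    def fit_quadratic(xs, ys):
        """Least-squares fit of y ~ a + b*x + c*x^2 via the 3x3 normal equations,
        solved with plain Gaussian elimination (partial pivoting)."""
        A = [[0.0] * 3 for _ in range(3)]
        b = [0.0] * 3
        for x, y in zip(xs, ys):
            phi = [1.0, x, x * x]
            for i in range(3):
                b[i] += phi[i] * y
                for j in range(3):
                    A[i][j] += phi[i] * phi[j]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, 3):
                f = A[r][col] / A[col][col]
                for j in range(col, 3):
                    A[r][j] -= f * A[col][j]
                b[r] -= f * b[col]
        coef = [0.0] * 3
        for r in (2, 1, 0):
            coef[r] = (b[r] - sum(A[r][j] * coef[j] for j in range(r + 1, 3))) / A[r][r]
        return coef

    a, b_, c = fit_quadratic(xs, ys)
    surrogate = lambda x: a + b_ * x + c * x * x   # microseconds per call
    ```

    After the nine training runs, any number of surrogate evaluations cost essentially nothing, which is the trade the abstract describes.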

  10. Application of single-chip microcomputer in radiation detection

    International Nuclear Information System (INIS)

    Zhang Songshou

    1993-01-01

    The single-chip microcomputer has advantages in many respects, for example powerful functionality, small size, low power consumption, and rugged reliability. It is used widely in the control of industry, instruments, communication, machinery, etc. The paper introduces the use of the single-chip microcomputer in radiation detection, mainly covering control, linearity, compensation, calculation, prefabricated change, precision improvement and training

  11. A microcomputer-based waveform generator for Moessbauer spectrometers

    International Nuclear Information System (INIS)

    Huang Jianping; Chen Xiaomei

    1995-01-01

    A waveform generator for Moessbauer spectrometers based on an 8751 single-chip microcomputer is described. The reference waveform, with high linearity, is generated with a 12-bit DAC, and its amplitude is controlled with an 8-bit DAC. Because the channel-advance and synchronization signals can be delayed arbitrarily, excellent folded spectra can be acquired. The waveform generator can be controlled via DIP switches on the front panel or via the serial interface of an IBM-PC microcomputer

  12. X-ray diffraction identification of clay minerals by microcomputer

    International Nuclear Information System (INIS)

    Rodrigues, S.; Imasava, F.J.

    1988-01-01

    Clay minerals are identified by X-ray powder diffraction by searching a file of standard X-ray diffraction patterns for a match with an unknown pattern. Done by hand, this search takes a long time. This paper presents a program in the ''Basic'' language, to be run on microcomputers, that matches the unknown pattern by exploiting the high comparison speed of the microcomputer. The match takes only a few minutes. (author) [pt
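    The matching idea translates directly from BASIC to any modern language. A hypothetical sketch (the standard library and d-spacing values below are invented for illustration) scores each standard pattern by the fraction of its lines found in the unknown pattern within a tolerance:

    ```python
    # hypothetical mini-library of standard d-spacings (angstroms) per mineral
    standards = {
        "kaolinite":       [7.15, 3.57, 2.38],
        "illite":          [10.0, 5.0, 3.33],
        "montmorillonite": [15.0, 5.01, 4.50],
    }

    def match_pattern(unknown, tolerance=0.05):
        """Score each standard by the fraction of its lines present in the
        unknown pattern, then return the best-scoring mineral."""
        scores = {}
        for name, lines in standards.items():
            hits = sum(1 for d in lines
                       if any(abs(d - u) <= tolerance for u in unknown))
            scores[name] = hits / len(lines)
        return max(scores, key=scores.get), scores

    unknown = [7.13, 4.46, 3.58, 2.37]      # measured d-spacings (invented)
    best, scores = match_pattern(unknown)
    ```

    A real identification would also weight line intensities and handle mixtures, but the exhaustive compare-against-every-standard loop is exactly what the microcomputer's comparison speed buys.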

  13. In vivo quantitative assessment of myocardial structure, function, perfusion and viability using cardiac micro-computed tomography

    NARCIS (Netherlands)

    E.D. van Deel (Elza); Y. Ridwan (Yanto); van Vliet, J.N. (J. Nicole); Belenkov, S. (Sasha); J. Essers (Jeroen)

    2016-01-01

    textabstractThe use of Micro-Computed Tomography (MicroCT) for in vivo studies of small animals as models of human disease has risen tremendously due to the fact that MicroCT provides quantitative high-resolution three-dimensional (3D) anatomical data non-destructively and longitudinally. Most

  14. Spatial Modeling of Geometallurgical Properties: Techniques and a Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Deutsch, Jared L., E-mail: jdeutsch@ualberta.ca [University of Alberta, School of Mining and Petroleum Engineering, Department of Civil and Environmental Engineering (Canada); Palmer, Kevin [Teck Resources Limited (Canada); Deutsch, Clayton V.; Szymanski, Jozef [University of Alberta, School of Mining and Petroleum Engineering, Department of Civil and Environmental Engineering (Canada); Etsell, Thomas H. [University of Alberta, Department of Chemical and Materials Engineering (Canada)

    2016-06-15

    High-resolution spatial numerical models of metallurgical properties constrained by geological controls and more extensively by measured grade and geomechanical properties constitute an important part of geometallurgy. Geostatistical and other numerical techniques are adapted and developed to construct these high-resolution models accounting for all available data. Important issues that must be addressed include unequal sampling of the metallurgical properties versus grade assays, measurements at different scale, and complex nonlinear averaging of many metallurgical parameters. This paper establishes techniques to address each of these issues with the required implementation details and also demonstrates geometallurgical mineral deposit characterization for a copper–molybdenum deposit in South America. High-resolution models of grades and comminution indices are constructed, checked, and are rigorously validated. The workflow demonstrated in this case study is applicable to many other deposit types.
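    One standard way to handle the complex nonlinear averaging of metallurgical parameters mentioned above is power-law averaging, where a single exponent controls how samples combine when upscaling to a block (a minimal sketch; the sample values are hypothetical, not from the case study):

    ```python
    import math

    def power_average(values, omega):
        """Power-law average: omega = 1 gives the arithmetic mean, omega = -1
        the harmonic mean, and omega -> 0 the geometric mean (limit case)."""
        n = len(values)
        if omega == 0:
            return math.exp(sum(math.log(v) for v in values) / n)
        return (sum(v ** omega for v in values) / n) ** (1.0 / omega)

    # hypothetical comminution indices (kWh/t) of four samples forming one block
    samples = [12.0, 14.5, 9.8, 16.2]
    arithmetic = power_average(samples, 1)
    harmonic = power_average(samples, -1)
    ```

    The appropriate exponent is a property of the metallurgical variable and must be calibrated from data at multiple scales; using a simple arithmetic mean for a nonlinearly averaging index biases the block model.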

  15. A fermionic molecular dynamics technique to model nuclear matter

    International Nuclear Information System (INIS)

    Vantournhout, K.; Jachowicz, N.; Ryckebusch, J.

    2009-01-01

    Full text: At sub-nuclear densities of about 10 14 g/cm 3 , nuclear matter arranges itself in a variety of complex shapes. This can be the case in the crust of neutron stars and in core-collapse supernovae. These slab like and rod like structures, designated as nuclear pasta, have been modelled with classical molecular dynamics techniques. We present a technique, based on fermionic molecular dynamics, to model nuclear matter at sub-nuclear densities in a semi classical framework. The dynamical evolution of an antisymmetric ground state is described making the assumption of periodic boundary conditions. Adding the concepts of antisymmetry, spin and probability distributions to classical molecular dynamics, brings the dynamical description of nuclear matter to a quantum mechanical level. Applications of this model vary from investigation of macroscopic observables and the equation of state to the study of fundamental interactions on the microscopic structure of the matter. (author)

  16. Model technique for aerodynamic study of boiler furnace

    Energy Technology Data Exchange (ETDEWEB)

    1966-02-01

    The help of the Division was recently sought to improve the heat transfer and reduce the exit gas temperature in a pulverized-fuel-fired boiler at an Australian power station. One approach adopted was to construct from Perspex a 1:20 scale cold-air model of the boiler furnace and to use a flow-visualization technique to study the aerodynamic patterns established when air was introduced through the p.f. burners of the model. The work established good correlations between the behaviour of the model and of the boiler furnace.

  17. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions...... into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
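    Although the abstract does not give the estimator, a model-based least-squares identification of a resistive-inductive line from sampled voltage and current can be sketched as follows (synthetic, noise-free data; the authors' quasi-passive method differs in detail):

    ```python
    import math

    # synthetic samples for a line with R = 0.4 ohm, L = 1.2 mH (assumed values)
    R_true, L_true = 0.4, 1.2e-3
    dt = 1e-4
    i_s = [10.0 * math.sin(2 * math.pi * 50 * k * dt) for k in range(200)]

    # central-difference estimate of di/dt at the interior samples
    didt = [(i_s[k + 1] - i_s[k - 1]) / (2 * dt) for k in range(1, len(i_s) - 1)]
    i_mid = i_s[1:-1]
    v_s = [R_true * ii + L_true * dd for ii, dd in zip(i_mid, didt)]

    # least-squares fit of v = R*i + L*di/dt via the 2x2 normal equations
    a11 = sum(x * x for x in i_mid)
    a12 = sum(x * y for x, y in zip(i_mid, didt))
    a22 = sum(y * y for y in didt)
    b1 = sum(x * v for x, v in zip(i_mid, v_s))
    b2 = sum(y * v for y, v in zip(didt, v_s))
    det = a11 * a22 - a12 * a12
    R_est = (b1 * a22 - b2 * a12) / det
    L_est = (a11 * b2 - a12 * b1) / det
    ```

    With noise-free data the fit recovers R and L exactly; on a real feeder the same regression is applied to measured waveforms, where differentiation noise and harmonics dominate the estimation error.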

  18. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
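    The cumulative-residual idea can be sketched quickly; here random reorderings of the residuals serve as a simple stand-in for the paper's simulated zero-mean Gaussian process realizations, and all data are simulated:

    ```python
    import random

    random.seed(7)
    n = 200
    x = [random.uniform(0.0, 1.0) for _ in range(n)]
    y = [2.0 * xi + random.gauss(0.0, 0.1) for xi in x]   # generated from the fitted form

    # fit y = b*x by least squares and form raw residuals
    b = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
    resid = [yi - b * xi for xi, yi in zip(x, y)]

    def max_abs_cumsum(res, order):
        """Supremum of |cumulative sum of residuals| taken in the given order."""
        s, m = 0.0, 0.0
        for idx in order:
            s += res[idx]
            m = max(m, abs(s))
        return m

    # observed process: residuals cumulated over the covariate x
    order = sorted(range(n), key=lambda k: x[k])
    observed = max_abs_cumsum(resid, order)

    # reference realizations under "no structure in x"
    null = []
    for _ in range(500):
        perm = list(range(n))
        random.shuffle(perm)
        null.append(max_abs_cumsum(resid, perm))
    p_value = sum(1 for v in null if v >= observed) / len(null)
    ```

    A large observed supremum relative to the reference realizations flags a systematic trend in the residuals over the covariate, i.e. a misspecified functional form, rather than natural variation.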

  19. ENPEP and the microcomputer version of WASP-III: Overview and recent experience

    International Nuclear Information System (INIS)

    Buehring, W.A.; Wolsko, T.D.

    1987-01-01

    Argonne National Laboratory (ANL) has developed a microcomputer-based energy planning package entitled ENergy and Power Evaluation Program (ENPEP). It consists of seven technical modules, four commercial software packages, and an executive system that conveniently integrates the many options associated with performing energy studies. The seven technical modules and their functions are as follows: MACRO allows the user to specify macroeconomic growth (global or sectoral) that drives energy demand; DEMAND projects energy demand based upon the macroeconomic growth information supplied in MACRO; PLANTDATA provides a library of technical data on electric generating plants that is used by BALANCE and ELECTRIC; BALANCE computes marketplace energy supply/demand balances over the study period; LOAD computes detailed electric load forecast information for use in ELECTRIC; ELECTRIC, the microcomputer (PC) version of WASP-III, calculates a minimum-cost electric supply system to meet electric demand and reliability goals; and IMPACTS calculates environmental impacts and resource requirements associated with energy supply system options. ENPEP gives energy planners in developing countries the potential to carry out important studies without access to inconvenient and/or expensive mainframe computers. The ELECTRIC module of ENPEP gives electric system planners the opportunity to use the WASP-III model for expansion planning of electrical generating systems. Extensive efforts have been made in converting WASP-III to the microcomputer to provide user-friendly data entry forms and operation options. (author). 3 refs, 20 figs

  20. Use of hydrological modelling and isotope techniques in Guvenc basin

    International Nuclear Information System (INIS)

    Altinbilek, D.

    1991-07-01

    The study covers the work performed under Project No. 335-RC-TUR-5145, entitled ''Use of Hydrologic Modelling and Isotope Techniques in Guvenc Basin'', and is an initial part of a program for estimating runoff from Central Anatolian watersheds. The study presented herein consists of three main parts: 1) the acquisition of a library of rainfall excess, direct runoff and isotope data for the Guvenc basin; 2) the modification of the SCS model to be applied first to the Guvenc basin and then to other basins of Central Anatolia for predicting surface runoff from gaged and ungaged watersheds; and 3) the use of environmental isotope techniques to define the basin components of streamflow in the Guvenc basin. 31 refs, figs and tabs
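    The SCS model adapted in part 2 rests on the standard curve-number runoff relation, Q = (P - Ia)^2 / (P - Ia + S) with S = 25400/CN - 254 and Ia = 0.2 S (depths in mm); a direct transcription:

    ```python
    def scs_runoff(p_mm, cn):
        """SCS curve-number direct runoff; all depths in millimetres."""
        s = 25400.0 / cn - 254.0      # potential maximum retention
        ia = 0.2 * s                  # initial abstraction
        if p_mm <= ia:
            return 0.0                # all rainfall absorbed before runoff starts
        return (p_mm - ia) ** 2 / (p_mm - ia + s)
    ```

    For CN = 100 (fully impervious) every millimetre of rain runs off; lower curve numbers first withhold the initial abstraction, which is the behaviour the basin-specific modification would tune against the Guvenc data.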

  1. Construct canine intracranial aneurysm model by endovascular technique

    International Nuclear Information System (INIS)

    Liang Xiaodong; Liu Yizhi; Ni Caifang; Ding Yi

    2004-01-01

    Objective: To construct canine bifurcation aneurysms suitable for evaluating endovascular devices for interventional therapy. Methods: The right common carotid artery of six dogs was expanded with a pliable balloon by means of an endovascular technique, and embolization with a detachable balloon was then performed at its origin. DSA examinations were performed 1, 2 and 3 days after the procedure. Results: Six aneurysm models were successfully created in the six dogs, with the mean width and height of the aneurysms decreasing over the 3 days. Conclusions: This canine aneurysm model reproduces the size and shape of human cerebral bifurcation saccular aneurysms on DSA images and is suitable for exploring endovascular devices for aneurysmal therapy. The procedure is quick, reliable and reproducible. (authors)

  2. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  3. Microcomputer applications for NSSS data package review

    International Nuclear Information System (INIS)

    Eng, D.J.; Smith, F.M.

    1984-01-01

    Associated with each component of a Nuclear Steam Supply System are the necessary supporting documents which demonstrate compliance with applicable codes, specifications and design criteria. The documents are grouped into individual data packages, and may exceed 800 in number for a major installation. A complete review, initiated by a utility in response to federal code regulations (10 CFR 50), can involve a tremendous number of review transactions. A data management system to assist the reviewer in the tracking and resolution of discrepancy items is currently under development and use. The system uses microcomputer-based relational database management software and provides complete and flexible capabilities to process database components and attributes based on user specified criteria. The completed database is a ''portable'' system, and can be utilized on an as-needed basis after the review is completed. A discrepancy analysis is performed to identify relations between manufacturer and discrepancy occurrence, part type vs. discrepancy type, etc. These results may prove useful in subsequent reviews or application to existing QA procedures

  4. [Preparation of simulate craniocerebral models via three dimensional printing technique].

    Science.gov (United States)

    Lan, Q; Chen, A L; Zhang, T; Zhu, Q; Xu, T

    2016-08-09

    Three-dimensional (3D) printing was used to prepare simulated craniocerebral models, which were applied to preoperative planning and surgical simulation. The image data were collected from a PACS system. Image data of skull bone, brain tissue and tumors, cerebral arteries and aneurysms, and functional regions and related neural tracts of the brain were extracted from thin-slice scans (slice thickness 0.5 mm) of computed tomography (CT), magnetic resonance imaging (MRI, slice thickness 1 mm), computed tomography angiography (CTA), and functional magnetic resonance imaging (fMRI) data, respectively. MIMICS software was applied to reconstruct colored virtual models by identifying and differentiating tissues according to their gray scales. The colored virtual models were then submitted to a 3D printer, which produced life-sized craniocerebral models for surgical planning and surgical simulation. 3D-printed craniocerebral models allowed neurosurgeons to perform complex procedures in specific clinical cases through detailed surgical planning. They offered great convenience for evaluating the size of the spatial fissure of the sellar region before surgery, which helped to optimize surgical approach planning. These 3D models also provided detailed information about the location of aneurysms and their parent arteries, which helped surgeons to choose appropriate aneurysm clips, as well as perform surgical simulation. The models further gave clear indications of the depth and extent of tumors and their relationship to eloquent cortical areas and adjacent neural tracts, helping to avoid surgical damage to important neural structures. As a novel and promising technique, the application of 3D-printed craniocerebral models can improve surgical planning by converting virtual visualization into real life-sized models. It also contributes to the study of functional anatomy.

  5. Skin fluorescence model based on the Monte Carlo technique

    Science.gov (United States)

    Churmakov, Dmitry Y.; Meglinski, Igor V.; Piletsky, Sergey A.; Greenhalgh, Douglas A.

    2003-10-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which follows the packing of collagen fibers, whereas in the epidermis and stratum corneum the distribution of fluorophores is assumed to be homogeneous. The results of simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the NIR spectral region, while the fluorescence of a sensor layer embedded in the epidermis is localized at the adjusted depth. The model is also able to simulate the skin fluorescence spectra.
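
    The sampling step at the heart of such a simulation can be sketched in a few lines. The toy Monte Carlo below is not the authors' layered skin model: the single-slab geometry, the absorption coefficients, and all names are illustrative assumptions. It draws excitation-absorption depths from a Beer-Lambert exponential and weights each fluorescence event by the attenuation of the emitted light on its way back to the surface:

```python
import math
import random

def simulate_fluorescence_depths(n_photons, mu_a_ex, mu_a_em, depth_cm,
                                 n_bins=10, seed=1):
    """Toy Monte Carlo: sample excitation absorption depths from an
    exponential (Beer-Lambert) law, then weight each fluorescence event
    by the probability that the emitted photon escapes to the surface.
    Returns a histogram of detected fluorescence versus depth."""
    rng = random.Random(seed)
    hist = [0.0] * n_bins
    for _ in range(n_photons):
        # Depth of absorption: p(z) = mu_a_ex * exp(-mu_a_ex * z)
        z = -math.log(rng.random()) / mu_a_ex
        if z > depth_cm:
            continue  # excitation photon passes through the slab
        escape = math.exp(-mu_a_em * z)  # attenuation of emitted light
        hist[min(int(z / depth_cm * n_bins), n_bins - 1)] += escape
    return hist
```

    With a strongly absorbed emission band the detected fluorescence concentrates near the surface, which is the qualitative suppression effect the abstract describes.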

  6. System health monitoring using multiple-model adaptive estimation techniques

    Science.gov (United States)

    Sifford, Stanley Ryan

    Monitoring system health for fault detection and diagnosis by tracking system parameters concurrently with state estimates is approached using a new multiple-model adaptive estimation (MMAE) method. This novel method is called GRid-based Adaptive Parameter Estimation (GRAPE). GRAPE expands existing MMAE methods with new techniques for sampling the parameter space, based on the hypothesis that sample models can be applied and resampled without relying on a predefined set of models. GRAPE is initially implemented in a linear framework using Kalman filter models. A more generalized GRAPE formulation is presented using extended Kalman filter (EKF) models to represent nonlinear systems. GRAPE can handle both time-invariant and time-varying systems, as it is designed to track parameter changes. Two techniques are presented to generate parameter samples for the parallel filter models. The first approach is called selected grid-based stratification (SGBS). SGBS divides the parameter space into equally spaced strata. The second approach uses Latin Hypercube Sampling (LHS) to determine the parameter locations and minimize the total number of required models. LHS is particularly useful as the parameter dimensionality grows: adding more parameters does not require the model count to increase. Each resample is independent of the prior sample set other than the location of the parameter estimate. SGBS and LHS can be used for both the initial sample and subsequent resamples. Furthermore, resamples are not required to use the same technique. Both techniques are demonstrated for both linear and nonlinear frameworks. The GRAPE framework further formalizes the parameter tracking process through a general approach for nonlinear systems. These additional methods allow GRAPE to either narrow the focus to converged values within a parameter range or expand the range in the appropriate direction to track parameters outside the current parameter range boundary.
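
    The Latin hypercube sampling the abstract leans on can be illustrated on its own, independently of GRAPE. This is a minimal sketch over the unit cube; the function name and the per-dimension stratification scheme are illustrative assumptions, not taken from the dissertation:

```python
import random

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin hypercube sample in the unit cube: each dimension is split
    into n_samples equal strata, one point is drawn per stratum, and the
    strata are permuted independently per dimension, so the projection
    onto every axis stays evenly stratified."""
    rng = rng or random.Random(0)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        order = list(range(n_samples))
        rng.shuffle(order)
        for i, stratum in enumerate(order):
            samples[i][d] = (stratum + rng.random()) / n_samples
    return samples
```

    Because each dimension is stratified independently, raising the number of parameter dimensions does not force the number of samples (and hence parallel filter models) to grow, which is the property highlighted above.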

  7. Application of object modeling technique to medical image retrieval system

    International Nuclear Information System (INIS)

    Teshima, Fumiaki; Abe, Takeshi

    1993-01-01

    This report describes the results of discussions on the object-oriented analysis methodology, which is one of the object-oriented paradigms. In particular, we considered application of the object modeling technique (OMT) to the analysis of a medical image retrieval system. The object-oriented methodology places emphasis on the construction of an abstract model from real-world entities. The effectiveness of and future improvements to OMT are discussed from the standpoint of the system's expandability. These discussions have elucidated that the methodology is sufficiently well-organized and practical to be applied to commercial products, provided that it is applied to the appropriate problem domain. (author)

  8. Level-set techniques for facies identification in reservoir modeling

    Science.gov (United States)

    Iglesias, Marco A.; McLaughlin, Dennis

    2011-03-01

    In this paper we investigate the application of level-set techniques for facies identification in reservoir models. The identification of facies is a geometrical inverse ill-posed problem that we formulate in terms of shape optimization. The goal is to find a region (a geologic facies) that minimizes the misfit between predicted and measured data from an oil-water reservoir. In order to address the shape optimization problem, we present a novel application of the level-set iterative framework developed by Burger in (2002 Interfaces Free Bound. 5 301-29; 2004 Inverse Problems 20 259-82) for inverse obstacle problems. The optimization is constrained by (the reservoir model) a nonlinear large-scale system of PDEs that describes the reservoir dynamics. We reformulate this reservoir model in a weak (integral) form whose shape derivative can be formally computed from standard results of shape calculus. At each iteration of the scheme, the current estimate of the shape derivative is utilized to define a velocity in the level-set equation. The proper selection of this velocity ensures that the new shape decreases the cost functional. We present results of facies identification where the velocity is computed with the gradient-based (GB) approach of Burger (2002) and the Levenberg-Marquardt (LM) technique of Burger (2004). While an adjoint formulation allows the straightforward application of the GB approach, the LM technique requires the computation of the large-scale Karush-Kuhn-Tucker system that arises at each iteration of the scheme. We efficiently solve this system by means of the representer method. We present some synthetic experiments to show and compare the capabilities and limitations of the proposed implementations of level-set techniques for the identification of geologic facies.
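
    The mechanism of moving a shape by advecting a level-set function can be sketched in one dimension. The upwind step below is generic textbook machinery, not the reservoir-constrained scheme of the paper; the grid, constant velocity, and boundary treatment are illustrative assumptions:

```python
def evolve_level_set(phi, velocity, dx, dt, steps):
    """Advance a 1D level-set function with a uniform positive speed via
    first-order upwind differencing of  phi_t + velocity * phi_x = 0;
    the zero contour of phi (the 'facies boundary') moves right."""
    phi = list(phi)
    for _ in range(steps):
        new = phi[:]
        for i in range(1, len(phi)):
            new[i] = phi[i] - velocity * dt / dx * (phi[i] - phi[i - 1])
        new[0] = new[1]  # crude outflow condition at the left edge
        phi = new
    return phi

def zero_crossing(phi, dx):
    """Locate the interface (sign change of phi) by linear interpolation."""
    for i in range(len(phi) - 1):
        if phi[i] <= 0.0 < phi[i + 1] or phi[i] >= 0.0 > phi[i + 1]:
            t = phi[i] / (phi[i] - phi[i + 1])
            return (i + t) * dx
    return None
```

    In the paper the velocity is not a constant but is rebuilt at each iteration from the shape derivative of the misfit functional; the transport of the zero contour works exactly as in this sketch.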

  9. Level-set techniques for facies identification in reservoir modeling

    International Nuclear Information System (INIS)

    Iglesias, Marco A; McLaughlin, Dennis

    2011-01-01

    In this paper we investigate the application of level-set techniques for facies identification in reservoir models. The identification of facies is a geometrical inverse ill-posed problem that we formulate in terms of shape optimization. The goal is to find a region (a geologic facies) that minimizes the misfit between predicted and measured data from an oil–water reservoir. In order to address the shape optimization problem, we present a novel application of the level-set iterative framework developed by Burger in (2002 Interfaces Free Bound. 5 301–29; 2004 Inverse Problems 20 259–82) for inverse obstacle problems. The optimization is constrained by (the reservoir model) a nonlinear large-scale system of PDEs that describes the reservoir dynamics. We reformulate this reservoir model in a weak (integral) form whose shape derivative can be formally computed from standard results of shape calculus. At each iteration of the scheme, the current estimate of the shape derivative is utilized to define a velocity in the level-set equation. The proper selection of this velocity ensures that the new shape decreases the cost functional. We present results of facies identification where the velocity is computed with the gradient-based (GB) approach of Burger (2002) and the Levenberg–Marquardt (LM) technique of Burger (2004). While an adjoint formulation allows the straightforward application of the GB approach, the LM technique requires the computation of the large-scale Karush–Kuhn–Tucker system that arises at each iteration of the scheme. We efficiently solve this system by means of the representer method. We present some synthetic experiments to show and compare the capabilities and limitations of the proposed implementations of level-set techniques for the identification of geologic facies

  10. Validation of transport models using additive flux minimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States); Groebner, R. J. [General Atomics, San Diego, California 92121 (United States); Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States); Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
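
    The optimization loop can be caricatured with a deliberately simplified steady-state profile model (this sketch stands in for the FACETS::Core/DAKOTA machinery; the constant-flux profile, parameter ranges, and names are illustrative assumptions). A scalar additional diffusivity is scanned, and the value minimizing the misfit to an "experimental" profile is kept:

```python
def predicted_profile(x_grid, flux, d_model, d_add, n_edge):
    """Steady-state density for a constant outward flux:
    dn/dx = -flux / (d_model + d_add), integrated inward from the edge."""
    grad = flux / (d_model + d_add)
    edge = x_grid[-1]
    return [n_edge + grad * (edge - x) for x in x_grid]

def fit_additional_diffusivity(x_grid, n_exp, flux, d_model, n_edge,
                               d_max=5.0, n_scan=5001):
    """Brute-force scan of the additional diffusivity, keeping the value
    that minimizes the squared misfit to the experimental profile."""
    best = (float("inf"), 0.0)
    for k in range(n_scan):
        d_add = d_max * k / (n_scan - 1)
        n_pred = predicted_profile(x_grid, flux, d_model, d_add, n_edge)
        misfit = sum((a - b) ** 2 for a, b in zip(n_pred, n_exp))
        best = min(best, (misfit, d_add))
    return best[1]
```

    A recovered additional diffusivity near zero then indicates that the transport model alone accounts for the measured profile, which is the criterion used in the study; in the actual implementation DAKOTA's optimizers take the place of the brute-force scan.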

  11. Validation of transport models using additive flux minimization technique

    International Nuclear Information System (INIS)

    Pankin, A. Y.; Kruger, S. E.; Groebner, R. J.; Hakim, A.; Kritz, A. H.; Rafiq, T.

    2013-01-01

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile

  12. Inside marginal adaptation of crowns by X-ray micro-computed tomography

    International Nuclear Information System (INIS)

    Dos Santos, T. M.; Lima, I.; Lopes, R. T.; Author, S. B. Jr.

    2015-01-01

    The objective of this work was to assess the dental arcade using X-ray micro-computed tomography. For this purpose, a high-resolution system was used and three groups were studied: the Zirkonzahn CAD-CAM system, IPS e.max Press, and metal ceramic. The three systems assessed in this study showed marginal and discrepancy gaps within clinically accepted limits. The 2D and 3D evaluations showed that the technique is a powerful method for investigating quantitative characteristics of the dental arcade. (authors)

  13. Inside marginal adaptation of crowns by X-ray micro-computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Dos Santos, T. M.; Lima, I.; Lopes, R. T. [Nuclear Instrumentation Laboratory, Nuclear Engineering Program, Federal University of Rio de Janeiro, RJ, (Brazil); Author, S. B. Jr. [Department of Physics, Colorado State University, Ft. Collins, CO 80523, (United States)

    2015-07-01

    The objective of this work was to assess the dental arcade using X-ray micro-computed tomography. For this purpose, a high-resolution system was used and three groups were studied: the Zirkonzahn CAD-CAM system, IPS e.max Press, and metal ceramic. The three systems assessed in this study showed marginal and discrepancy gaps within clinically accepted limits. The 2D and 3D evaluations showed that the technique is a powerful method for investigating quantitative characteristics of the dental arcade. (authors)

  14. An automated microcomputer-controlled system for neutron activation and gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Edward, J.B.; Bennett, L.G.I.

    1990-01-01

    An automated instrumental neutron activation analysis (INAA) system has been constructed at the SLOWPOKE-2 reactor at the Royal Military College of Canada (RMC). Its pneumatic transfer system is controlled by an Apple IIe computer, linked in turn to an MS-DOS-compatible microcomputer which controls data acquisition. Custom software has been created for these computers and for off-line spectral analysis using programs that incorporate either peak boundary or Gaussian peak fitting methods of analysis. This system provides the gamut of INAA techniques for the analyst. The design and performance of the hardware and software are discussed. (orig.)

  15. Improved ceramic slip casting technique. [application to aircraft model fabrication]

    Science.gov (United States)

    Buck, Gregory M. (Inventor); Vasquez, Peter (Inventor)

    1993-01-01

    A primary concern in modern fluid dynamics research is the experimental verification of computational aerothermodynamic codes. This research requires high precision and detail in the test model employed. Ceramic materials are used for these models because of their low heat conductivity and their survivability at high temperatures. To fabricate such models, slip casting techniques were developed to provide net-form, precision casting capability for high-purity ceramic materials in aqueous solutions. In previous slip casting techniques, block, or flask, molds made of plaster-of-paris were used to draw liquid from the slip material. Upon setting, parts were removed from the flask mold and cured in a kiln at high temperatures. Casting detail was usually limited with this technique: detailed parts were frequently damaged upon separation from the flask mold, as the molded parts are extremely delicate in the uncured state and the flask mold is inflexible. Ceramic surfaces were also marred by 'parting lines' caused by mold separation, which adversely affected the aerodynamic surface quality of the model. (Parting lines are invariably necessary on or near the leading edges of wings, nosetips, and fins for mold separation. These areas are also critical for flow boundary layer control.) Parting agents used in the casting process also affected surface quality. These agents eventually soaked into the mold or the model, or flaked off when releasing the cast model. Different materials were tried, such as oils, paraffin, and even an algae. The algae released best, but some of it remained on the model and imparted an uneven texture and discoloration on the model surface when cured. According to the present invention, a wax pattern for a shell mold is provided, and an aqueous mixture of a calcium sulfate-bonded investment material is applied as a coating to the wax pattern. The coated wax pattern is then dried, followed by curing to vaporize the wax pattern and leave a shell

  16. Numerical and modeling techniques used in the EPIC code

    International Nuclear Information System (INIS)

    Pizzica, P.A.; Abramson, P.B.

    1977-01-01

    EPIC models fuel and coolant motion which result from internal fuel pin pressure (from fission gas or fuel vapor) and/or from the generation of sodium vapor pressures in the coolant channel subsequent to pin failure in an LMFBR. The modeling includes the ejection of molten fuel from the pin into a coolant channel with any amount of voiding through a clad rip which may be of any length or which may expand with time. One-dimensional Eulerian hydrodynamics is used to model both the motion of fuel and fission gas inside a molten fuel cavity and the mixture of two-phase sodium and fission gas in the channel. Motion of molten fuel particles in the coolant channel is tracked with a particle-in-cell technique
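
    The particle-in-cell idea of tracking fuel particles through an Eulerian channel can be sketched generically (this is not the EPIC code; the nearest-cell velocity gather and the clamped channel boundaries are illustrative simplifications):

```python
def pic_step(positions, grid_velocity, dx, dt, length):
    """Advance Lagrangian marker particles through a fixed Eulerian grid:
    each particle moves with the velocity of the cell it currently
    occupies (nearest-cell gather, the simplest PIC interpolation)."""
    n_cells = len(grid_velocity)
    new_positions = []
    for x in positions:
        cell = min(int(x / dx), n_cells - 1)
        x_new = x + grid_velocity[cell] * dt
        new_positions.append(min(max(x_new, 0.0), length))  # stay in channel
    return new_positions
```

    In a full PIC hydrodynamics code the particles also deposit mass and momentum back onto the grid each step; only the transport half of that cycle is shown here.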

  17. Soft computing techniques toward modeling the water supplies of Cyprus.

    Science.gov (United States)

    Iliadis, L; Maris, F; Tachos, S

    2011-10-01

    This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-regression support vector machine (ε-RSVM) and fuzzy weighted ε-RSVM models have been developed that accept five input parameters. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross-validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, fuzzy weighted support vector regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
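
    The 5-fold cross-validation step is easy to make concrete. A minimal index-splitting sketch (the strided fold assignment is an illustrative choice; shuffled folds are equally common):

```python
def k_fold_indices(n_samples, k=5):
    """Split sample indices into k disjoint validation folds; the
    complement of each fold is the corresponding training set."""
    splits = []
    for i in range(k):
        val = [j for j in range(n_samples) if j % k == i]
        train = [j for j in range(n_samples) if j % k != i]
        splits.append((train, val))
    return splits
```

    Each candidate model (SVR variant or neural network) is then trained k times, once per split, and the validation errors are averaged; this averaging is what screens out the "bad local behaviors" mentioned above.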

  18. [Hierarchy structuring for mammography technique by interpretive structural modeling method].

    Science.gov (United States)

    Kudo, Nozomi; Kurowarabi, Kunio; Terashita, Takayoshi; Nishimoto, Naoki; Ogasawara, Katsuhiko

    2009-10-20

    Participation in screening mammography is currently desired in Japan because of the increase in breast cancer morbidity. However, the pain and discomfort of mammography are recognized as a significant deterrent for women considering this examination. Thus quick procedures, sufficient experience, and advanced skills are required of radiologic technologists. The aim of this study was to make the key points of the imaging technique explicit and to help in understanding this complicated procedure. We interviewed 3 technologists who were highly skilled in mammography, and 14 factors were retrieved using brainstorming and the KJ method. We then applied Interpretive Structural Modeling (ISM) to the factors and developed a hierarchical concept structure. The result showed a six-layer hierarchy whose top node was explanation of the entire mammography procedure. Male technologists were identified as a negative factor. Factors concerned with explanation were at the upper nodes. We gave particular attention to X-ray techniques and considerations. The findings will help beginners improve their skills.
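
    The ISM machinery behind such a hierarchy (a reachability matrix followed by level partitioning) can be sketched generically. The three-factor adjacency matrix in the example is illustrative, not the 14-factor mammography matrix of the study:

```python
def ism_levels(adj):
    """Interpretive Structural Modeling level partition: build the
    reachability matrix (transitive closure, with every element reaching
    itself), then repeatedly peel off the elements whose reachability
    set is contained in their antecedent set."""
    n = len(adj)
    reach = [[bool(adj[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):  # Warshall transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    remaining, levels = set(range(n)), []
    while remaining:
        level = []
        for i in remaining:
            r = {j for j in remaining if reach[i][j]}  # reachability set
            a = {j for j in remaining if reach[j][i]}  # antecedent set
            if r <= a:
                level.append(i)
        levels.append(sorted(level))
        remaining -= set(level)
    return levels
```

    For a simple influence chain 0 → 1 → 2 the partition peels factor 2 first, then 1, then 0, producing the layered structure that is drawn as the ISM digraph.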

  19. Teaching scientific concepts through simple models and social communication techniques

    International Nuclear Information System (INIS)

    Tilakaratne, K.

    2011-01-01

    For science education, it is important to demonstrate to students the relevance of scientific concepts in everyday life experiences. Although there are methods available for achieving this goal, teaching is more effective if cultural flavor is also added to the techniques, so that the teacher and students can easily relate the subject matter to their surroundings. Furthermore, this bridges the gap between science and day-to-day experiences in an effective manner. It can also help students to use science as a tool to solve problems they face, so that they come to feel that science is a part of their lives. In this paper, it is described how simple models and cultural communication techniques can be used effectively to demonstrate important scientific concepts to students at the secondary and higher secondary levels, using two consecutive activities carried out at the Institute of Fundamental Studies (IFS), Sri Lanka. (author)

  20. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.
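
    The kind of qualitative analysis described can be illustrated on the classic Lotka-Volterra predator-prey system, which is simpler than the ratio-dependent model discussed in the paper; the parameter values and function names here are arbitrary illustrative choices:

```python
def lotka_volterra_rhs(state, a=1.0, b=0.5, c=0.75, d=0.25):
    """Classic predator-prey vector field:
       prey'     =  a*prey - b*prey*pred
       predator' = -c*pred + d*prey*pred"""
    prey, pred = state
    return (a * prey - b * prey * pred, -c * pred + d * prey * pred)

def interior_equilibrium(a=1.0, b=0.5, c=0.75, d=0.25):
    """Coexistence point where both growth rates vanish."""
    return (c / d, a / b)

def rk4_step(f, state, h):
    """One fourth-order Runge-Kutta step, the kind of integration a
    dynamical-systems package performs when tracing trajectories."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + h / 6.0 * (p + 2 * q + 2 * r + w)
                 for s, p, q, r, w in zip(state, k1, k2, k3, k4))
```

    The software packages surveyed in the paper automate exactly these steps: locating equilibria, integrating trajectories, and then continuing equilibria and cycles as parameters vary.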

  1. A new cerebral vasospasm model established with endovascular puncture technique

    International Nuclear Information System (INIS)

    Tu Jianfei; Liu Yizhi; Ji Jiansong; Zhao Zhongwei

    2011-01-01

    Objective: To investigate the method of establishing cerebral vasospasm (CVS) models in rabbits by using the endovascular puncture technique. Methods: The endovascular puncture procedure was performed in 78 New Zealand white rabbits to produce subarachnoid hemorrhage (SAH). The surviving rabbits were randomly divided into seven groups (3 h, 12 h, 1 d, 2 d, 3 d, 7 d and 14 d), with five rabbits in each group for both the study group (SAH group) and the control group. Cerebral CT scanning was carried out in all rabbits both before and after the operation. The inner diameter and the wall thickness of both the posterior communicating artery (PcoA) and the basilar artery (BA) were determined after the animals were sacrificed, and the results were analyzed. Results: Of the 78 experimental rabbits, the CVS model was successfully established in 45, including 35 in the SAH group and 10 in the control group. The technical success rate was 57.7%. Twelve hours after the procedure, the inner diameters of the PcoA and BA in the SAH group were decreased by 45.6% and 52.3%, respectively, compared with those in the control group. The vascular narrowing showed biphasic changes: the inner diameter markedly decreased again at the 7th day, when the decrease reached its peak of 31.2% and 48.6%, respectively. Conclusion: The endovascular puncture technique is an effective method for establishing CVS models in rabbits. The death rate of experimental animals can be decreased if new interventional material is used and the manipulation is carefully performed. (authors)

  2. Use of advanced modeling techniques to optimize thermal packaging designs.

    Science.gov (United States)

    Formato, Richard M; Potami, Raffaele; Ahmed, Iftekhar

    2010-01-01

    Through a detailed case study the authors demonstrate, for the first time, the capability of using advanced modeling techniques to correctly simulate the transient temperature response of a convective flow-based thermal shipper design. The objective of this case study was to demonstrate that simulation could be utilized to design a 2-inch-wall polyurethane (PUR) shipper to hold its product box temperature between 2 and 8 °C over the prescribed 96-h summer profile (product box is the portion of the shipper that is occupied by the payload). Results obtained from numerical simulation are in excellent agreement with empirical chamber data (within ±1 °C at all times), and geometrical locations of simulation maximum and minimum temperature match well with the corresponding chamber temperature measurements. Furthermore, a control simulation test case was run (results taken from identical product box locations) to compare the coupled conduction-convection model with a conduction-only model, which to date has been the state-of-the-art method. For the conduction-only simulation, all fluid elements were replaced with "solid" elements of identical size and assigned thermal properties of air. While results from the coupled thermal/fluid model closely correlated with the empirical data (±1 °C), the conduction-only model was unable to correctly capture the payload temperature trends, showing a sizeable error compared to empirical values (ΔT > 6 °C). A modeling technique capable of correctly capturing the thermal behavior of passively refrigerated shippers can be used to quickly evaluate and optimize new packaging designs. Such a capability provides a means to reduce the cost and required design time of shippers while simultaneously improving their performance. Another advantage comes from using thermal modeling (assuming a validated model is available) to predict the temperature distribution in a shipper that is exposed to ambient temperatures which were not bracketed
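
    The conduction-only half of the comparison is easy to make concrete. Below is a minimal 1D explicit heat-equation step (the grid and material properties are illustrative; the actual shipper model is three-dimensional, and the coupled model adds convective flow in the air spaces):

```python
def conduction_step(temps, alpha, dx, dt):
    """One explicit finite-difference step of the 1D heat equation
    T_t = alpha * T_xx with fixed (Dirichlet) end temperatures.  In a
    conduction-only shipper model, air regions are treated this way too,
    as 'solids' assigned air's thermal properties.
    Stable for alpha * dt / dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + r * (temps[i + 1] - 2 * temps[i] + temps[i - 1])
    return new
```

    Dropping the convective terms is a plausible source of the large payload-temperature errors reported above, since buoyant circulation in the air spaces can move heat much faster than conduction through still air.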

  3. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    Energy Technology Data Exchange (ETDEWEB)

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and "smart" wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow

  4. Monte Carlo technique for very large Ising models

    Science.gov (United States)

    Kalle, C.; Winkelmann, V.

    1982-08-01

    Rebbi's multispin coding technique is improved and applied to the kinetic Ising model with size 600×600×600. We give the central part of our computer program (for a CDC Cyber 76), which will be helpful also in a simulation of smaller systems, and describe the other tricks necessary to go to large lattices. The magnetization M at T = 1.4 Tc is found to decay asymptotically as exp(-t/2.90) if t is measured in Monte Carlo steps per spin, and M(t = 0) = 1 initially.
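
    For orientation, the textbook single-spin-flip Metropolis update for a 2D Ising lattice is sketched below; it is not the bit-packed multispin-coded update the paper optimizes, and it uses a small lattice rather than 600×600×600. J = 1 and all sizes are illustrative:

```python
import math
import random

def metropolis_sweep(spins, size, beta, rng):
    """One Metropolis sweep of a 2D Ising lattice (J = 1, periodic
    boundaries): attempt size*size random single-spin flips, accepting
    each with probability min(1, exp(-beta * dE))."""
    for _ in range(size * size):
        i, j = rng.randrange(size), rng.randrange(size)
        nb = (spins[(i + 1) % size][j] + spins[(i - 1) % size][j]
              + spins[i][(j + 1) % size] + spins[i][(j - 1) % size])
        dE = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1

def magnetization(spins):
    return sum(sum(row) for row in spins) / float(len(spins) ** 2)
```

    Multispin coding accelerates exactly this inner loop by packing many spins into a single machine word and updating them in parallel with bitwise operations, which is what makes lattices as large as 600×600×600 affordable.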

  5. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  6. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  7. Evaluation of Root Canal Preparation Using Rotary System and Hand Instruments Assessed by Micro-Computed Tomography.

    Science.gov (United States)

    Stavileci, Miranda; Hoxha, Veton; Görduysus, Ömer; Tatar, Ilkan; Laperre, Kjell; Hostens, Jeroen; Küçükkaya, Selen; Muhaxheri, Edmond

    2015-06-20

    Complete mechanical preparation of the root canal system is rarely achieved. Therefore, the purpose of this study was to evaluate and compare the root canal shaping efficacy of ProTaper rotary files and standard stainless steel K-files using micro-computed tomography. Sixty extracted upper second premolars were selected and divided into 2 groups of 30 teeth each. Before preparation, all samples were scanned by micro-computed tomography. Thirty teeth were prepared with the ProTaper system and the other 30 with stainless steel files. After preparation, the untouched surface and root canal straightening were evaluated with micro-computed tomography. The percentage of untouched root canal surface was calculated in the coronal, middle, and apical parts of the canal. We also calculated straightening of the canal after root canal preparation. Results from the 2 groups were statistically compared using the Minitab statistical package. ProTaper rotary files left less untouched root canal surface compared with manual preparation in the coronal, middle, and apical sectors (p<0.001). Similarly, there was a statistically significant difference in root canal straightening after preparation between the techniques (p<0.001). Neither the manual nor the rotary technique completely prepared the root canal, and both techniques caused slight straightening of the root canal.
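
    The untouched-surface metric the study reports can be illustrated with registered pre- and post-preparation voxel masks of the canal wall; the short 1-D lists below are a toy stand-in for real micro-CT data, not the study's pipeline.

    ```python
    def untouched_surface_percent(wall_before, wall_after):
        """Percentage of pre-instrumentation canal-wall voxels still present
        (i.e. never touched by a file) after preparation."""
        untouched = sum(b and a for b, a in zip(wall_before, wall_after))
        return 100.0 * untouched / sum(wall_before)

    # Toy 1-D stand-in for registered micro-CT voxel masks of the canal wall.
    before = [True] * 10
    after = [True, True, False, False, True, True, False, True, True, True]
    pct = untouched_surface_percent(before, after)  # 3 of 10 voxels removed -> 70.0
    ```

    In practice the same ratio would be computed per region (coronal, middle, apical) on co-registered 3-D scans.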

  8. Laparoscopic anterior resection: new anastomosis technique in a pig model.

    Science.gov (United States)

    Bedirli, Abdulkadir; Yucel, Deniz; Ekim, Burcu

    2014-01-01

    Bowel anastomosis after anterior resection is one of the most difficult tasks to perform during laparoscopic colorectal surgery. This study aims to evaluate a new feasible and safe intracorporeal anastomosis technique after laparoscopic left-sided colon or rectum resection in a pig model. The technique was evaluated in 5 pigs. The OrVil device (Covidien, Mansfield, Massachusetts) was inserted into the anus and advanced proximally to the rectum. A 0.5-cm incision was made in the sigmoid colon, and the 2 sutures attached to its delivery tube were cut. After the delivery tube was evacuated through the anus, the tip of the anvil was removed through the perforation. The sigmoid colon was transected just distal to the perforation with an endoscopic linear stapler. The rectosigmoid segment to be resected was removed through the anus with a grasper, and distal transection was performed. A 25-mm circular stapler was inserted and combined with the anvil, and end-to-side intracorporeal anastomosis was then performed. We performed the technique in 5 pigs. Anastomosis required an average of 12 minutes. We observed that the proximal and distal donuts were completely removed in all pigs. No anastomotic air leakage was observed in any of the animals. This study shows the efficacy and safety of intracorporeal anastomosis with the OrVil device after laparoscopic anterior resection.

  9. A Continuous Dynamic Traffic Assignment Model From Plate Scanning Technique

    Energy Technology Data Exchange (ETDEWEB)

    Rivas, A.; Gallego, I.; Sanchez-Cambronero, S.; Ruiz-Ripoll, L.; Barba, R.M.

    2016-07-01

    This paper presents a methodology for the dynamic estimation of traffic flows on all links of a network from observable field data, assuming the first-in-first-out (FIFO) hypothesis. The traffic flow intensities recorded at the exit of the scanned links are propagated to obtain the flow waves on unscanned links. To do so, the model calculates the flow-cost functions from information registered with the plate scanning technique. The model also addresses the concern about the quality of the flow-cost function parameters for replicating real traffic flow behaviour: it includes a new algorithm that adjusts the parameter values to link characteristics when their quality is questionable. This requires an a priori study of the location of the scanning devices so that all path flows can be identified and travel times measured on all links. A synthetic network is used to illustrate the proposed method and to prove its usefulness and feasibility. (Author)
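
    A minimal sketch of FIFO propagation of a recorded flow wave onto a downstream link, assuming a BPR-style flow-cost function (a common choice used here for illustration; the paper's calibrated flow-cost functions may differ):

    ```python
    def bpr_travel_time(flow, t0=60.0, capacity=1800.0, alpha=0.15, beta=4.0):
        """BPR-style flow-cost function: free-flow time t0 (seconds) inflated
        as the flow (veh/h) approaches link capacity."""
        return t0 * (1.0 + alpha * (flow / capacity) ** beta)

    def propagate_fifo(entry_profile):
        """entry_profile: (entry_time_s, flow_veh_h) pairs recorded at the link
        entrance, e.g. from plate scanning. Under FIFO each flow packet exits at
        entry time plus travel time, and exit order preserves entry order."""
        exits = [(t + bpr_travel_time(q), q) for t, q in entry_profile]
        for (t1, _), (t2, _) in zip(exits, exits[1:]):
            assert t1 <= t2, "FIFO violated: a later packet would overtake"
        return exits

    profile = [(0.0, 900.0), (30.0, 1200.0), (60.0, 1500.0)]
    exit_profile = propagate_fifo(profile)
    ```

    Chaining this step link by link reproduces the flow wave on unscanned links downstream of a scanned one.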

  10. Reactor protection system design using micro-computers

    International Nuclear Information System (INIS)

    Fairbrother, D.B.

    1976-01-01

    Reactor protection systems for nuclear power plants have traditionally been built using analog hardware. This hardware works quite well for single parameter trip functions; however, optimum protection against DNBR and kW/ft limits requires more complex trip functions than can easily be handled with analog hardware. For this reason, Babcock and Wilcox has introduced a Reactor Protection System, called the RPS-II, that utilizes a micro-computer to handle the more complex trip functions. The paper describes the design of the RPS-II and the operation of the micro-computer within the Reactor Protection System
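
    The kind of digital trip function such a system evaluates can be sketched as below. The setpoints and the 2-of-4 coincidence voting are illustrative assumptions (coincidence voting is typical of redundant protection channels), not RPS-II design values.

    ```python
    def dnbr_trip(dnbr, kw_per_ft, dnbr_limit=1.3, kw_ft_limit=18.0):
        """Single-channel trip logic: trip when either thermal limit is
        violated. The setpoints are placeholders, not B&W design values."""
        return dnbr < dnbr_limit or kw_per_ft > kw_ft_limit

    def two_of_four(channel_trips):
        """Coincidence voting: actuate protection when at least 2 of the 4
        redundant channels agree, tolerating a single failed channel."""
        return sum(channel_trips) >= 2

    channels = [dnbr_trip(1.25, 12.0), dnbr_trip(1.28, 12.5),
                dnbr_trip(1.45, 11.0), dnbr_trip(1.50, 10.5)]
    actuate = two_of_four(channels)  # two channels see low DNBR -> actuate
    ```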

  11. Reactor protection system design using micro-computers

    International Nuclear Information System (INIS)

    Fairbrother, D.B.

    1977-01-01

    Reactor Protection Systems for Nuclear Power Plants have traditionally been built using analog hardware. This hardware works quite well for single parameter trip functions; however, optimum protection against DNBR and kW/ft limits requires more complex trip functions than can easily be handled with analog hardware. For this reason, Babcock and Wilcox has introduced a Reactor Protection System, called the RPS-II, that utilizes a micro-computer to handle the more complex trip functions. This paper describes the design of the RPS-II and the operation of the micro-computer within the Reactor Protection System

  12. Design and manufacture of TL analyser by using the microcomputer

    International Nuclear Information System (INIS)

    Doh, Sih Hong; Woo, Chong Ho

    1986-01-01

    This paper describes the design of a thermoluminescence analyser using a microcomputer. The TL analyser is designed to perform a three-step heat treatment: pre-read heating, the readout procedure, and a post-heating (or pre-irradiation) anneal. We used a 12-bit A/D converter for precise measurement and the phase control method to control the heating temperature. Since the Apple II microcomputer used is cheap and popular, an economical system can be designed. Experimental results showed successful operation with flexibility. The error of temperature control was less than ± 0.2% of the expected value. (Author)
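
    Scaling a 12-bit A/D reading to a temperature is the core conversion such an analyser performs before comparing against its heat-treatment setpoints. The reference voltage and sensor gain below are illustrative assumptions, not the paper's calibration:

    ```python
    def adc_to_temperature(raw, vref=5.0, gain_c_per_volt=100.0):
        """Convert a 12-bit A/D reading (0..4095) to degrees Celsius.
        vref and the sensor gain are illustrative values only."""
        if not 0 <= raw <= 4095:
            raise ValueError("12-bit reading out of range")
        volts = raw * vref / 4095.0
        return volts * gain_c_per_volt

    temp = adc_to_temperature(2048)  # mid-scale reading, ~250 degrees C here
    ```

    With 12 bits the quantisation step is vref/4095, which is what makes a ± 0.2% control error achievable in principle.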

  13. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    International Nuclear Information System (INIS)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-01-01

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. 
The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  14. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-06-17

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. 
The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  15. Improving default risk prediction using Bayesian model uncertainty techniques.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
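
    One simple instance of combining agency estimates weighted by historical accuracy is precision weighting in log-odds space; this is a hypothetical stand-in for the article's full Bayesian framework, shown only to make the idea concrete.

    ```python
    import math

    def combine_agency_estimates(estimates, accuracies):
        """Pool default-probability estimates in log-odds space, weighting
        each agency by its historical accuracy; a simple stand-in for the
        article's full Bayesian treatment of model uncertainty."""
        logits = [math.log(p / (1.0 - p)) for p in estimates]
        total = sum(accuracies)
        pooled = sum(w * l for w, l in zip(accuracies, logits)) / total
        return 1.0 / (1.0 + math.exp(-pooled))

    # Two agencies disagree; the historically more accurate one dominates.
    p = combine_agency_estimates([0.02, 0.05], accuracies=[0.8, 0.2])
    ```

    The pooled estimate always lies between the individual estimates, pulled toward the more accurate source.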

  16. Mechanical Properties of Nanostructured Materials Determined Through Molecular Modeling Techniques

    Science.gov (United States)

    Clancy, Thomas C.; Gates, Thomas S.

    2005-01-01

    The potential for gains in material properties over conventional materials has motivated an effort to develop novel nanostructured materials for aerospace applications. These novel materials typically consist of a polymer matrix reinforced with particles on the nanometer length scale. In this study, molecular modeling is used to construct fully atomistic models of a carbon nanotube embedded in an epoxy polymer matrix. Functionalization of the nanotube, which consists of the introduction of direct chemical bonding between the polymer matrix and the nanotube, hence providing a load transfer mechanism, is systematically varied. The relative effectiveness of functionalization in a nanostructured material may depend on a variety of factors related to the details of the chemical bonding and the polymer structure at the nanotube-polymer interface. The objective of this modeling is to determine what influence the details of functionalization of the carbon nanotube with the polymer matrix have on the resulting mechanical properties. By considering a range of degrees of functionalization, the structure-property relationships of these materials are examined and the mechanical properties of these models are calculated using standard techniques.

  17. Characterising and modelling regolith stratigraphy using multiple geophysical techniques

    Science.gov (United States)

    Thomas, M.; Cremasco, D.; Fotheringham, T.; Hatch, M. A.; Triantifillis, J.; Wilford, J.

    2013-12-01

    Regolith is the weathered, typically mineral-rich layer from fresh bedrock to the land surface. It encompasses soil (A, E and B horizons) that has undergone pedogenesis. Below is the weathered C horizon, which retains at least some of the original rocky fabric and structure. At the base of this is the lower regolith boundary of continuous hard bedrock (the R horizon). Regolith may be absent, e.g. at rocky outcrops, or may be many tens of metres deep. Comparatively little is known about regolith, and critical questions remain regarding its composition and characteristics - especially at depth, where the challenge of collecting reliable data increases. In Australia, research is underway to characterise and map regolith using consistent methods at scales ranging from local (e.g. hillslope) to continental. These efforts are driven by many research needs, including Critical Zone modelling and simulation. Pilot research in South Australia using digitally based environmental correlation techniques modelled the depth to bedrock to 9 m for an upland area of 128 000 ha. One finding was the inability to reliably model local-scale depth variations over horizontal distances of 2 - 3 m and vertical distances of 1 - 2 m. The need to better characterise variations in regolith to strengthen models at these fine scales was discussed. Addressing this need, we describe high intensity, ground-based multi-sensor geophysical profiling of three hillslope transects in different regolith-landscape settings to characterise fine-resolution variation using several techniques: multiple frequency, multiple coil electromagnetic induction; and high resolution resistivity. These were accompanied by georeferenced, closely spaced deep cores to 9 m - or to core refusal. The intact cores were sub-sampled to standard depths and analysed for regolith properties to compile core datasets consisting of: water content; texture; electrical conductivity; and weathered state.
After preprocessing (filtering, geo

  18. Evaluation of Root Canal Preparation Using Rotary System and Hand Instruments Assessed by Micro-Computed Tomography

    Science.gov (United States)

    Stavileci, Miranda; Hoxha, Veton; Görduysus, Ömer; Tatar, Ilkan; Laperre, Kjell; Hostens, Jeroen; Küçükkaya, Selen; Muhaxheri, Edmond

    2015-01-01

    Background Complete mechanical preparation of the root canal system is rarely achieved. Therefore, the purpose of this study was to evaluate and compare the root canal shaping efficacy of ProTaper rotary files and standard stainless steel K-files using micro-computed tomography. Material/Methods Sixty extracted upper second premolars were selected and divided into 2 groups of 30 teeth each. Before preparation, all samples were scanned by micro-computed tomography. Thirty teeth were prepared with the ProTaper system and the other 30 with stainless steel files. After preparation, the untouched surface and root canal straightening were evaluated with micro-computed tomography. The percentage of untouched root canal surface was calculated in the coronal, middle, and apical parts of the canal. We also calculated straightening of the canal after root canal preparation. Results from the 2 groups were statistically compared using the Minitab statistical package. Results ProTaper rotary files left less untouched root canal surface compared with manual preparation in the coronal, middle, and apical sectors (p<0.001). Similarly, there was a statistically significant difference in root canal straightening after preparation between the techniques (p<0.001). Conclusions Neither the manual nor the rotary technique completely prepared the root canal, and both techniques caused slight straightening of the root canal. PMID:26092929

  19. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Metrics include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as rank correlation, were also explored, but removed when their information was found to be generally duplicative of other metrics. While equal weights are applied here, the weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context and will be briefly reported.
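
    The weighted tally can be sketched as ranking models within each metric and summing the weighted ranks; the metric names and scores below are hypothetical, and the sketch assumes lower-is-better (error-style) metrics throughout.

    ```python
    def rank_models(scores, weights=None):
        """Weighted rank tally: rank models within each metric (1 = best,
        assuming lower scores are better), sum the weighted ranks, and order
        models by the total; the lowest total wins."""
        metrics = list(scores)
        models = list(next(iter(scores.values())))
        if weights is None:                  # equal weights, as in the abstract
            weights = {m: 1.0 for m in metrics}
        totals = {model: 0.0 for model in models}
        for metric in metrics:
            ordered = sorted(models, key=lambda mod: scores[metric][mod])
            for rank, mod in enumerate(ordered, start=1):
                totals[mod] += weights[metric] * rank
        return sorted(totals, key=totals.get)

    # Hypothetical error-style scores (lower is better) for three models.
    scores = {
        "abs_error": {"A": 1.2, "B": 0.9, "C": 1.5},
        "bias":      {"A": 0.1, "B": 0.2, "C": 0.3},
        "rmsd":      {"A": 2.0, "B": 1.8, "C": 2.4},
    }
    ranking = rank_models(scores)  # best model first
    ```

    Higher-is-better metrics (e.g. correlations) would be negated or reverse-ranked before entering the tally.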

  20. Micro-Computed Tomography Evaluation of Human Fat Grafts in Nude Mice

    Science.gov (United States)

    Chung, Michael T.; Hyun, Jeong S.; Lo, David D.; Montoro, Daniel T.; Hasegawa, Masakazu; Levi, Benjamin; Januszyk, Michael; Longaker, Michael T.

    2013-01-01

    Background Although autologous fat grafting has revolutionized the field of soft tissue reconstruction and augmentation, long-term maintenance of fat grafts is unpredictable. Recent studies have reported survival rates of fat grafts to vary anywhere between 10% and 80% over time. The present study evaluated the long-term viability of human fat grafts in a murine model using a novel imaging technique allowing for in vivo volumetric analysis. Methods Human fat grafts were prepared from lipoaspirate samples using the Coleman technique. Fat was injected subcutaneously into the scalp of 10 adult Crl:NU-Foxn1nu CD-1 male mice. Micro-computed tomography (CT) was performed immediately following injection and then weekly thereafter. Fat volume was rendered by reconstructing a three-dimensional (3D) surface through cubic-spline interpolation. Specimens were also harvested at various time points and sections were prepared and stained with hematoxylin and eosin (H&E), for macrophages using CD68 and for the cannabinoid receptor 1 (CB1). Finally, samples were explanted at 8- and 12-week time points to validate calculated micro-CT volumes. Results Weekly CT scanning demonstrated progressive volume loss over the time course. However, volumetric analysis at the 8- and 12-week time points stabilized, showing an average of 62.2% and 60.9% survival, respectively. Gross analysis showed the fat graft to be healthy and vascularized. H&E analysis and staining for CD68 showed minimal inflammatory reaction with viable adipocytes. Immunohistochemical staining with anti-human CB1 antibodies confirmed human origin of the adipocytes. Conclusions Studies assessing the fate of autologous fat grafts in animals have focused on nonimaging modalities, including histological and biochemical analyses, which require euthanasia of the animals. In this study, we have demonstrated the ability to employ micro-CT for 3D reconstruction and volumetric analysis of human fat grafts in a mouse model. 
Importantly

  1. VLF surface-impedance modelling techniques for coal exploration

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.; Thiel, D.; O'Keefe, S. [Central Queensland University, Rockhampton, Qld. (Australia). Faculty of Engineering and Physical Systems

    2000-10-01

    New and efficient computational techniques are required for geophysical investigations of coal. This will allow automated inverse analysis procedures to be used for interpretation of field data. In this paper, a number of methods of modelling electromagnetic surface impedance measurements are reviewed, particularly as applied to typical coal seam geology found in the Bowen Basin. At present, the Impedance method and the finite-difference time-domain (FDTD) method appear to offer viable solutions although both have problems. The Impedance method is currently slightly inaccurate, and the FDTD method has large computational demands. In this paper both methods are described and results are presented for a number of geological targets. 17 refs., 14 figs.

  2. Demand Management Based on Model Predictive Control Techniques

    Directory of Open Access Journals (Sweden)

    Yasser A. Davizón

    2014-01-01

    Full Text Available Demand management (DM) is the process that helps companies sell the right product to the right customer, at the right time, and for the right price. The challenge for any company is therefore to determine how much to sell, at what price, and to which market segment, while maximizing its profits. DM also helps managers efficiently allocate undifferentiated units of capacity to the available demand with the goal of maximizing revenue. This paper introduces a control-system approach to demand management with dynamic pricing (DP) using the model predictive control (MPC) technique. In addition, we present a proper dynamical system analogy based on active suspension, and a stability analysis is provided via the Lyapunov direct method.
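
    The receding-horizon idea behind MPC-based demand management can be sketched with a toy linear demand curve and an inventory-tracking cost; every parameter below is an assumption for illustration, not from the paper.

    ```python
    import itertools

    def mpc_price(inventory, target, horizon=3, prices=(8.0, 10.0, 12.0),
                  replenish=40.0):
        """One receding-horizon step of MPC demand management: enumerate price
        sequences over the horizon, simulate a toy inventory balance, and
        apply only the first price of the best sequence."""
        def demand(p):                 # assumed linear demand curve (toy model)
            return 100.0 - 5.0 * p

        best_cost, best_first = float("inf"), prices[0]
        for seq in itertools.product(prices, repeat=horizon):
            inv, cost = inventory, 0.0
            for p in seq:
                inv += replenish - demand(p)   # inventory dynamics
                cost += (inv - target) ** 2    # penalise deviation from target
            if cost < best_cost:
                best_cost, best_first = cost, seq[0]
        return best_first

    # Inventory above target: the controller picks the low price to sell it down.
    price = mpc_price(inventory=200.0, target=150.0)
    ```

    Re-running this optimisation at every step, with updated state, is what makes the scheme "model predictive" rather than open-loop.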

  3. Microcomputer-assisted determination of local blood flow by 133Xe

    International Nuclear Information System (INIS)

    Hoffschir, D.; Fayart, G.; Daburon, F.

    1990-02-01

    Acute local irradiations may induce extensive and delayed cutaneous and muscular ulcerations. We intended to study the prevalence of vascular factors in their pathogenesis in an experimental model developed in pigs. Local clearance curves were assessed after intra-arterial administration of 133Xe and transferred to a micro-computer. A program for poly-exponential curve fitting, written in the Asyst language, has been developed. An iterative curve stripping is performed conventionally, and then the coefficients are determined more precisely using a Gauss-Newton method. This program has been tested on experimental data obtained after a 40 Gy local over-exposure [fr
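
    The Gauss-Newton refinement of a poly-exponential fit can be sketched as follows, here for a bi-exponential clearance curve on synthetic data. The damped step is an added safeguard for the sketch, not necessarily part of the original Asyst program.

    ```python
    import numpy as np

    def biexp(t, p):
        """Bi-exponential clearance model: a1*exp(-k1*t) + a2*exp(-k2*t)."""
        a1, k1, a2, k2 = p
        return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

    def gauss_newton_biexp(t, y, p0, iters=100):
        """Refine bi-exponential parameters (a1, k1, a2, k2) by Gauss-Newton,
        starting from curve-stripping-style initial values."""
        p = np.asarray(p0, dtype=float)
        for _ in range(iters):
            a1, k1, a2, k2 = p
            e1, e2 = np.exp(-k1 * t), np.exp(-k2 * t)
            # Jacobian of the model w.r.t. (a1, k1, a2, k2)
            J = np.column_stack([e1, -a1 * t * e1, e2, -a2 * t * e2])
            r = y - biexp(t, p)
            step, *_ = np.linalg.lstsq(J, r, rcond=None)
            while np.sum((y - biexp(t, p + step)) ** 2) > np.sum(r ** 2) and \
                    np.max(np.abs(step)) > 1e-12:
                step *= 0.5            # damp steps that would overshoot
            p = p + step
        return p

    t = np.linspace(0.0, 10.0, 50)
    true_p = (5.0, 1.0, 2.0, 0.1)
    y = biexp(t, true_p)                      # noiseless synthetic clearance curve
    p_fit = gauss_newton_biexp(t, y, p0=(4.5, 0.9, 2.5, 0.12))
    ```

    Curve stripping supplies the starting values p0; Gauss-Newton then converges rapidly because the residual near the optimum is small.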

  4. A multi-microcomputer system for Monte Carlo calculations

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Berg, B.; Krasemann, H.

    1981-01-01

    We propose a microcomputer system which allows parallel processing for Monte Carlo calculations in lattice gauge theories, simulations of high energy physics experiments, and presumably many other fields of current interest. The master-n-slave multiprocessor system is based on the Motorola MC 68000 microprocessor. One attraction of this processor is that it allows up to 16 MByte of random access memory. (orig.)

  5. Serials Management by Microcomputer: The Potential of DBMS.

    Science.gov (United States)

    Vogel, J. Thomas; Burns, Lynn W.

    1984-01-01

    Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programing with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…

  6. Microcomputers in Vocational Home Economics Classrooms in USD #512.

    Science.gov (United States)

    Shawnee Mission Public Schools, KS.

    A project was conducted to identify software suitable for use in home economics classes and to train home economics teachers to use that software with an Apple II Plus microcomputer. During the project, home economics software was identified, evaluated, and catalogued. Teaching strategies were adapted to include using the computer in the…

  7. CC80-A microcomputer crate controller for CAMAC

    International Nuclear Information System (INIS)

    Walz, H.V.

    1976-11-01

    A microcomputer crate controller has been developed for use in CAMAC based instrumentation, control, and data processing systems at the Stanford Linear Accelerator Center. This unit may be used in multicrate branch highway systems to provide local processing capability within each crate, or in a single crate to form a stand-alone CAMAC system

  8. Microcomputer Activities Which Encourage the Reading-Writing Connection.

    Science.gov (United States)

    Balajthy, Ernest

    Many reading teachers, cognizant of the creative opportunities for skill development allowed by new reading-writing software, are choosing to use microcomputers in their classrooms full-time. Adventure story creation programs capitalize on reading-writing integration by allowing children, with appropriate assistance, to create their own…

  9. A Preserved Context Indexing System for Microcomputers: PERMDEX.

    Science.gov (United States)

    Yerkey, A. Neil

    1983-01-01

    Following a discussion of derivative versus assignment indexing, use of roles, and concept behind Preserved Concept Indexing System, features of PERMDEX (microcomputer program to assist in creation of permuted printed index) are described including indexer input and prompts, the shunting algorithm, and sorting and printing routines. Fourteen…

  10. Microcomputers in the Classroom: Trojan Horse or Teacher's Pet?

    Science.gov (United States)

    Olson, John K.

    Arguing that curriculum developers need to seek a better understanding of existing classroom orders before advising reform through new technology, this paper presents a review of research on the effects of microcomputers on the stabilities of classroom practice and describes a pilot study currently underway at Queens University in Ontario to…

  11. Microcomputer-Based Digital Signal Processing Laboratory Experiments.

    Science.gov (United States)

    Tinari, Jr., Rocco; Rao, S. Sathyanarayan

    1985-01-01

    Describes a system (Apple II microcomputer interfaced to flexible, custom-designed digital hardware) which can provide: (1) Fast Fourier Transform (FFT) computation on real-time data with a video display of spectrum; (2) frequency synthesis experiments using the inverse FFT; and (3) real-time digital filtering experiments. (JN)
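
    The FFT spectrum computation at the heart of such an experiment can be sketched as below, with NumPy standing in for the custom digital hardware:

    ```python
    import numpy as np

    def spectrum(signal, fs):
        """One-sided magnitude spectrum of a real signal, the quantity an FFT
        lab experiment computes for the video display."""
        mag = np.abs(np.fft.rfft(signal)) / len(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return freqs, mag

    fs = 1024.0                         # sampling rate, Hz
    t = np.arange(1024) / fs
    sig = np.sin(2 * np.pi * 50.0 * t)  # 50 Hz test tone
    freqs, mag = spectrum(sig, fs)
    peak_hz = freqs[np.argmax(mag)]     # spectral peak at the tone frequency
    ```

    The inverse FFT (`np.fft.irfft`) supports the frequency-synthesis experiments mentioned in the abstract: build a spectrum, invert it, and play back the time-domain signal.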

  12. Application of microcomputer to X-ray radiometric ore separation

    International Nuclear Information System (INIS)

    Neverov, A.D.; Aleksandrov, P.S.; Kotler, N.I.

    1988-01-01

    The practical use of microcomputers as universal means for converting information for solving applied problems of X-ray radiometric ore separation method is considered. Laboratory tests of two metals - tungsten and tin manifested high efficiency of the developed system. X-ray radiometric separator software is developed

  13. Microcomputers in Schools as a Teaching and Learning Aid.

    Science.gov (United States)

    Trotman-Dickenson, D. I.

    1986-01-01

    Presents the findings of a survey of comprehensive and independent schools' use of microcomputers as teaching and learning aids in economics. Results suggest that use is wide spread but not intensive. Teachers allocate few hours to computer programs per year, have difficulty finding suitable software, and fail to encourage use by girls. (JDH)

  14. Microcomputers as on-line catalogs in special libraries.

    Science.gov (United States)

    Faust, J B

    1986-01-01

    This article discusses the rationale for the conversion of a card catalog to an on-line system in a special library owning approximately 4000 titles. Equipment, software, and procedures are described. Pros and cons of the use of a microcomputer for such a project, as well as costs and personnel needs, are outlined.

  15. Graphic terminal based on storage tube display with microcomputer

    International Nuclear Information System (INIS)

    Leich, H.; Levchanovsky, F.; Nikulnikov, A.; Polyntsev, A.; Prikhodko, V.

    1981-01-01

    This paper describes a graphic terminal where a microcomputer realizes functions like the generation of picture elements (points, symbols, vectors), display control, processing of data received from keyboard and trackball, communication with a host computer and others. The terminal has been designed for operating in a local network as well as in autonomous control systems for data acquisition and processing in physical experiments [ru

  16. Application of Minicomputers and Microcomputers to Information Handling.

    Science.gov (United States)

    Griffiths, Jose-Marie

    This study assesses the application of both minicomputers and microcomputers to information-handling procedures and makes recommendations for automating such procedures, particularly in developing nations. The report is based on a survey of existing uses of small computing equipment in libraries, archives, and information centers which was…

  17. Microcomputer Calculation of Thermodynamic Properties from Molecular Parameters of Gases.

    Science.gov (United States)

    Venugopalan, Mundiyath

    1990-01-01

    Described in this article is a problem-solving activity which integrates the application of microcomputers with the learning of physical chemistry. Students use the program with spectroscopic data to calculate the thermodynamic properties and compare them with the values from the thermochemical tables. (Author/KR)
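
    A representative calculation of this kind derives the harmonic-oscillator (Einstein) vibrational contribution to the molar heat capacity from a spectroscopic wavenumber; the sketch below illustrates the approach, not the article's actual program.

    ```python
    import math

    K_B = 1.380649e-23       # Boltzmann constant, J/K
    H = 6.62607015e-34       # Planck constant, J s
    C_CM = 2.99792458e10     # speed of light, cm/s
    R = 8.314462618          # gas constant, J/(mol K)

    def vib_heat_capacity(wavenumber_cm, temp_k):
        """Harmonic-oscillator vibrational contribution to molar heat
        capacity from a spectroscopic wavenumber (Einstein function)."""
        x = H * C_CM * wavenumber_cm / (K_B * temp_k)   # h*nu / (k*T)
        return R * x ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

    cv_n2 = vib_heat_capacity(2359.0, 298.15)  # N2 stretch: nearly frozen at 298 K
    ```

    At low wavenumbers the function approaches the classical limit R, and at high wavenumbers it vanishes, which is exactly what students compare against thermochemical tables.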

  18. Extending the Online Public Access Catalog into the Microcomputer Environment.

    Science.gov (United States)

    Sutton, Brett

    1990-01-01

    Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…

  19. Microcomputer-aided monitor for liquid hydrogen target system

    International Nuclear Information System (INIS)

    Kitami, T.; Watanabe, K.

    1983-03-01

    A microcomputer-aided monitor for a liquid hydrogen target system has been designed and tested. Various kinds of input data, such as temperature, pressure, and vacuum, are scanned at a given time interval. The variation with time of any four items can be displayed on a CRT and, if necessary, printed out on recording paper. (author)

  20. Learning Together: Micro-Computers in Crosby, Texas, Schools.

    Science.gov (United States)

    McNeil, Linda M.

    The Crosby Independent School District near Houston, Texas, planned to introduce microcomputer instruction to its nearly 3,000 students in slow stages that had teachers and students learning at the same time. The initial impetus for computers came from an administrator who found useful information at a Regional Service Center of the State…

  1. Machine Check-Out. Microcomputing Working Paper Series.

    Science.gov (United States)

    Drexel Univ., Philadelphia, PA. Microcomputing Program.

    During the academic year 1983-84, Drexel University instituted a new policy requiring all incoming students to have access to a microcomputer. The computer chosen to fulfill this requirement was the Macintosh from Apple Computer, Inc. Because Drexel University received one of the first large shipments of this new product, the degree to which these…

  2. Getting Started with Microcomputers--A Practical Beginner's Guide.

    Science.gov (United States)

    Davies, Norman F.

    1985-01-01

    Discusses the results of a questionnaire sent to experts in the field of computer assisted language learning. Covers such topics as: 1) points to consider before buying a microcomputer; 2) recommended brands and peripheral equipment; 3) software; 4) utilizing programming languages; and 5) literature and contact organizations. (SED)

  3. Enhancing photogrammetric 3d city models with procedural modeling techniques for urban planning support

    International Nuclear Information System (INIS)

    Schubiger-Banz, S; Arisona, S M; Zhong, C

    2014-01-01

    This paper presents a workflow to increase the level of detail of reality-based 3D urban models. It combines the established workflows from photogrammetry and procedural modeling in order to exploit distinct advantages of both approaches. The combination has advantages over purely automatic acquisition in terms of visual quality, accuracy and model semantics. Compared to manual modeling, procedural techniques can be much more time effective while maintaining the qualitative properties of the modeled environment. In addition, our method includes processes for procedurally adding additional features such as road and rail networks. The resulting models meet the increasing needs in urban environments for planning, inventory, and analysis

  4. A numerical integration approach suitable for simulating PWR dynamics using a microcomputer system

    International Nuclear Information System (INIS)

    Zhiwei, L.; Kerlin, T.W.

    1983-01-01

    It is attractive to use microcomputer systems to simulate nuclear power plant dynamics for teaching and/or control system design. An analysis and feasibility comparison of existing numerical integration methods has been made. Criteria for choosing the integration step with various numerical integration methods, including the matrix exponential method, are derived. To speed up the simulation, an approach using the Newton recursion calculus is presented which avoids convergence limitations when choosing the integration step size; accuracy considerations then dominate the choice of step. The advantages of this method have been demonstrated through a case study using a CBM model 8032 microcomputer to simulate a reduced-order linear PWR model under various perturbations. It has been shown theoretically and practically that the Runge-Kutta and Adams-Moulton methods are not feasible. The matrix exponential method is good in accuracy and fairly good in speed. The Newton recursion method can save 3/4 to 4/5 of the computation time compared to the matrix exponential method with reasonable accuracy. This method can be extended to nonlinear nuclear power plant models and higher-order models as well
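
    The matrix exponential stepping discussed in this abstract can be sketched for a generic linear state-space model x' = Ax + Bu. The system matrices below are illustrative placeholders, not the paper's PWR model; the augmented-matrix construction is a standard way to obtain both the state-transition matrix and the input integral in one call.

```python
import numpy as np
from scipy.linalg import expm

def step_matrix_exponential(A, B, x, u, dt):
    """Advance x' = A x + B u by one step of size dt with u held constant
    (zero-order hold). Exact for linear time-invariant systems."""
    n, m = A.shape[0], B.shape[1]
    # Augmented-matrix trick: expm of [[A, B], [0, 0]] * dt yields the
    # state-transition matrix Phi and the input matrix Gamma together.
    M = np.zeros((n + m, n + m))
    M[:n, :n] = A
    M[:n, n:] = B
    E = expm(M * dt)
    Phi, Gamma = E[:n, :n], E[:n, n:]
    return Phi @ x + Gamma @ u

# Toy 2-state stable linear model (illustrative values only)
A = np.array([[-1.0, 0.5], [0.0, -2.0]])
B = np.array([[0.0], [1.0]])
x = np.array([1.0, 0.0])
u = np.array([0.1])
for _ in range(100):                       # simulate t = 0 .. 5
    x = step_matrix_exponential(A, B, x, u, 0.05)
```

Because each step is exact for the linear model, the step size is limited only by how often outputs are needed, which is the property the abstract exploits.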

  5. The phase field technique for modeling multiphase materials

    Science.gov (United States)

    Singer-Loginova, I.; Singer, H. M.

    2008-10-01

    This paper reviews methods and applications of the phase field technique, one of the fastest growing areas in computational materials science. The phase field method is used as a theory and computational tool for predictions of the evolution of arbitrarily shaped morphologies and complex microstructures in materials. In this method, the interface between two phases (e.g. solid and liquid) is treated as a region of finite width having a gradual variation of different physical quantities, i.e. it is a diffuse interface model. An auxiliary variable, the phase field or order parameter φ(x), is introduced, which distinguishes one phase from the other. Interfaces are identified by the variation of the phase field. We begin by presenting the physical background of the phase field method and give a detailed thermodynamical derivation of the phase field equations. We demonstrate how equilibrium and non-equilibrium physical phenomena at the phase interface are incorporated into the phase field methods. Then we address in detail dendritic and directional solidification of pure and multicomponent alloys, effects of natural convection and forced flow, grain growth, nucleation, solid-solid phase transformation and highlight other applications of the phase field methods. In particular, we review the novel phase field crystal model, which combines atomistic length scales with diffusive time scales. We also discuss aspects of quantitative phase field modeling such as thin interface asymptotic analysis and coupling to thermodynamic databases. The phase field methods result in a set of partial differential equations whose solution requires time-consuming large-scale computations that often limit the applicability of the method. Subsequently, we review numerical approaches to solve the phase field equations and present a finite difference discretization of the anisotropic Laplacian operator.
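
    As a minimal illustration of the diffuse-interface idea (not the anisotropic or phase-field-crystal formulations the review covers), an explicit finite-difference update of the Allen-Cahn equation on a periodic grid might look as follows; grid size, time step, and interface width are invented for the sketch.

```python
import numpy as np

def laplacian(phi, h):
    """Five-point finite-difference Laplacian with periodic boundaries."""
    return (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
            np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi) / h**2

# Allen-Cahn dynamics: d(phi)/dt = eps^2 * Lap(phi) + phi - phi^3.
# phi ~ +1 in one phase, -1 in the other; the interface is diffuse.
n, h, dt, eps = 64, 1.0, 0.1, 1.0          # dt respects dt <= h^2 / (4 eps^2)
rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal((n, n))    # small random initial mixture
for _ in range(200):                       # explicit Euler time stepping
    phi = phi + dt * (eps**2 * laplacian(phi, h) + phi - phi**3)
```

Starting from noise, the field coarsens into domains near ±1 separated by smooth interfaces, which is the qualitative behavior the review describes.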

  6. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    In experimental musculoskeletal oncology, there remains a need for animal models that can be used to assess the efficacy of new and innovative treatment methodologies for bone tumors. The rat plays a very important role in the bone field, especially in the evaluation of metabolic bone diseases. The objective of this study was to develop a rat osteosarcoma model for evaluation of new surgical and molecular methods of treatment for extremity sarcoma. One hundred male SD rats weighing 125.45 ± 8.19 g were divided into 5 groups and anesthetized intraperitoneally with 10% chloral hydrate. Orthotopic implantation models of rat osteosarcoma were created by injecting tumor cells directly into the femur of SD rats with an inoculation needle. In the first step of the experiment, 2×10^5 to 1×10^6 UMR106 cells in 50 microliters were injected intraosseously into the median or distal part of the femoral shaft and the tumor take rate was determined. The second stage consisted of determining tumor volume, correlating findings from ultrasound with findings from necropsy, and determining time of survival. In the third stage, the orthotopically implanted tumors and lung nodules were resected entirely, sectioned, and then counterstained with hematoxylin and eosin for histopathologic evaluation. The tumor take rate was 100% for implants with 8×10^5 tumor cells or more, which was much less than the amount required for subcutaneous implantation, with a high lung metastasis rate of 93.0%. Ultrasound and necropsy findings matched closely (r=0.942; p<0.01), which demonstrated that Doppler ultrasonography is a convenient and reliable technique for measuring cancer at any stage. The tumor growth curve showed that orthotopically implanted tumors expanded vigorously over time, especially in the first 3 weeks. The median time of survival was 38 days and surgical mortality was 0%. The UMR106 cell line has strong carcinogenic capability and high lung metastasis frequency. The present rat

  7. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate them through the model so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. 
We employ this simple heat model to illustrate verification
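
    As a toy illustration of the parameter-selection idea (not the dissertation's verification framework), one can rank parameters by scaled local sensitivities and flag those with negligible impact on the model response. The three-parameter model below is invented for demonstration; its third parameter is deliberately unidentifiable.

```python
import numpy as np

def model(params, t):
    """Toy exponential-decay response standing in for a dynamic model output."""
    a, b, c = params
    # c is deliberately unused: it stands in for an unidentifiable parameter
    return a * np.exp(-b * t)

def sensitivity_ranking(params, t, rel_step=1e-4):
    """Rank parameters by the norm of scaled finite-difference sensitivities.
    Parameters with near-zero scores are candidates for removal."""
    base = model(params, t)
    scores = []
    for i, p in enumerate(params):
        dp = rel_step * max(abs(p), 1.0)
        bumped = list(params)
        bumped[i] = p + dp
        dy = (model(bumped, t) - base) / dp      # finite-difference sensitivity
        scores.append(np.linalg.norm(p * dy))    # scale by nominal value
    return np.argsort(scores)[::-1], scores      # most influential first

t = np.linspace(0.0, 5.0, 50)
order, scores = sensitivity_ranking([2.0, 0.5, 3.0], t)
```

In this sketch the unused parameter gets an exactly zero score and is ranked last, mimicking how parameter selection separates influential from non-influential inputs.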

  8. Modelling of 3D fractured geological systems - technique and application

    Science.gov (United States)

    Cacace, M.; Scheck-Wenderoth, M.; Cherubini, Y.; Kaiser, B. O.; Bloecher, G.

    2011-12-01

    All rocks in the earth's crust are fractured to some extent. Faults and fractures are important in different scientific and industry fields comprising engineering, geotechnical and hydrogeological applications. Many petroleum, gas, geothermal, and water-supply reservoirs form in faulted and fractured geological systems. Additionally, faults and fractures may control the transport of chemical contaminants into and through the subsurface. Depending on their origin and orientation with respect to the recent and palaeo stress field, as well as on the overall kinematics of chemical processes occurring within them, faults and fractures can act either as hydraulic conductors providing preferential pathways for fluid flow or as barriers preventing flow across them. The main challenge in modelling processes occurring in fractured rocks is related to the way of describing the heterogeneities of such geological systems. Flow paths are controlled by the geometry of faults and their open void space. To correctly simulate these processes an adequate 3D mesh is a basic requirement. Unfortunately, the representation of realistic 3D geological environments is limited by the complexity of embedded fracture networks, often resulting in oversimplified models of the natural system. A technical description of an improved method to integrate generic dipping structures (representing faults and fractures) into a 3D porous medium is put forward. The automated mesh generation algorithm is composed of various existing routines from computational geometry (e.g. 2D-3D projection, interpolation, intersection, convex hull calculation) and meshing (e.g. triangulation in 2D and tetrahedralization in 3D). All routines have been combined in an automated software framework and the robustness of the approach has been tested and verified. 
These techniques and methods can be applied for fractured porous media including fault systems and therefore found wide applications in different geo-energy related
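
    The meshing primitives named in the abstract (2D triangulation, 3D tetrahedralization, convex hull) are available in standard computational-geometry libraries. A minimal sketch using SciPy's Qhull bindings on random placeholder points, standing in for nodes of a fault surface and a porous-medium block:

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

# Random points standing in for mesh nodes (illustrative only).
rng = np.random.default_rng(1)
pts2d = rng.random((30, 2))       # e.g. a projected fault surface
pts3d = rng.random((40, 3))       # e.g. a porous-medium volume

tri = Delaunay(pts2d)             # 2D triangulation of the surface
tet = Delaunay(pts3d)             # 3D tetrahedralization of the volume
hull = ConvexHull(pts3d)          # convex hull, e.g. for domain bounds

# Each simplex row indexes into the original point array.
n_triangles = tri.simplices.shape[0]
n_tets = tet.simplices.shape[0]
```

A production workflow like the one described would chain such routines with projection and intersection steps so that dipping fault surfaces become internal boundaries of the volume mesh.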

  9. Desk-top microcomputer for lab-scale process control

    International Nuclear Information System (INIS)

    Overman, R.F.; Byrd, J.S.; Goosey, M.H.; Sand, R.J.

    1981-01-01

    A desk-top microcomputer was programmed to acquire the data from various process control sensors installed in a laboratory scale liquid-liquid extraction, pulse column facility. The parameters monitored included valve positions, gamma spectra, alpha radioactivity, temperature, pH, density, and flow rates. The program for the microcomputer is written in BASIC and requires about 31000 8-bit bytes of memory. All data is stored on floppy discs, and can be displayed or printed. Unexpected data values are brought to the process operator's attention via CRT display or print-out. The general organization of the program and a few subroutines unique to polling instruments are explained. Some of the data acquisition devices were designed and built at the Savannah River Laboratory. These include a pulse height analyzer, a data multiplexer, and a data acquisition instrument. A general description of the electronics design of these instruments is also given with emphasis placed on data formatting and bus addressing

  10. MINDS: A microcomputer interactive data system for 8086-based controllers

    Science.gov (United States)

    Soeder, J. F.

    1985-01-01

    A microcomputer interactive data system (MINDS) software package for the 8086 family of microcomputers is described. To enhance program understandability and ease of code maintenance, the software is written in PL/M-86, Intel Corporation's high-level system implementation language. The MINDS software is intended to run in residence with real-time digital control software to provide displays of steady-state and transient data. In addition, the MINDS package provides classic monitor capabilities along with extended provisions for debugging an executing control system. The software uses the CP/M-86 operating system developed by Digital Research, Inc., to provide program load capabilities along with a uniform file structure for data and table storage. Finally, a library of input and output subroutines to be used with consoles equipped with PL/M-86 and assembly language is described.

  11. Microcomputer-based pneumatic controller for neutron activation analysis

    International Nuclear Information System (INIS)

    Byrd, J.S.; Sand, R.J.

    1976-10-01

    A microcomputer-based pneumatic controller for neutron activation analysis was designed and built at the Savannah River Laboratory for analysis of large numbers of geologic samples for locating potential supplies of uranium ore for the National Uranium Resource Evaluation program. In this system, commercially available microcomputer logic modules are used to transport sample capsules through a network of pressurized air lines. The logic modules are interfaced to pneumatic valves, solenoids, and photo-optical detectors. The system operates from programs stored in firmware (permanent software). It also commands a minicomputer and a hard-wired pulse height analyzer for data collection and bookkeeping tasks. The advantage of the system is that major system changes can be implemented in the firmware with no hardware changes. This report describes the hardware, firmware, and software for the electronics system

  12. [Microcomputer control of a LED stimulus display device].

    Science.gov (United States)

    Ohmoto, S; Kikuchi, T; Kumada, T

    1987-02-01

    A visual stimulus display system controlled by a microcomputer was constructed at low cost. The system consists of a LED stimulus display device, a microcomputer, two interface boards, a pointing device (a "mouse") and two kinds of software. The first software package is written in BASIC. Its functions are: to construct stimulus patterns using the mouse, to construct letter patterns (alphabet, digit, symbols and Japanese letters--kanji, hiragana, katakana), to modify the patterns, to store the patterns on a floppy disc, to translate the patterns into integer data which are used to display the patterns in the second software. The second software package, written in BASIC and machine language, controls display of a sequence of stimulus patterns in predetermined time schedules in visual experiments.

  13. Biocompatibility property of 100% strontium-substituted SiO2-Al2O3-P2O5-CaO-CaF2 glass ceramics over 26 weeks implantation in rabbit model: Histology and micro-Computed Tomography analysis.

    Science.gov (United States)

    Basu, Bikramjit; Sabareeswaran, A; Shenoy, S J

    2015-08-01

    One of the desired properties for any new biomaterial composition is its long-term stability in a suitable animal model, and such a property cannot be appropriately assessed by short-term implantation studies. While hydroxyapatite (HA)- or bioglass-coated metallic biomaterials are being investigated for in vivo biocompatibility, such studies have not been extensively pursued for bulk glass ceramics. In view of their inherently brittle nature, implant stability as well as the impact of long-term release of metallic ions on bone regeneration have been major concerns. In this perspective, the present article reports the results of in vivo implantation experiments carried out using 100% strontium (Sr)-substituted glass ceramics with the nominal composition 4.5SiO2-3Al2O3-1.5P2O5-3SrO-2SrF2 for 26 weeks in cylindrical bone defects in a rabbit model. The combination of histological and micro-computed tomography analysis provided a qualitative and quantitative understanding of bone regeneration around the glass ceramic implants in comparison to highly bioactive HA bioglass implants (control). Sequential polychrome labeling of bone during in vivo osseointegration using three fluorochromes, followed by fluorescence microscopy, confirmed homogeneous bone formation around the test implants. The results of the present study unequivocally confirm the long-term implant stability as well as the osteoconductive property of 100% Sr-substituted glass ceramics, which is comparable to that of a known bioactive implant, that is, HA-based bioglass. © 2014 Wiley Periodicals, Inc.

  14. A Rutherford Scattering Simulation with Microcomputer Graphics.

    Science.gov (United States)

    Calle, Carlos I.; Wright, Lavonia F.

    1989-01-01

    Lists a program for a simulation of Rutherford's gold foil experiment in BASIC for both Apple II and IBM compatible computers. Compares Rutherford's model of the atom with Thompson's plum pudding model of the atom. (MVL)

  15. Ex Vivo Methods for Informing Computational Models of the Mitral Valve

    OpenAIRE

    Bloodworth, Charles H.; Pierce, Eric L.; Easley, Thomas F.; Drach, Andrew; Khalighi, Amir H.; Toma, Milan; Jensen, Morten O.; Sacks, Michael S.; Yoganathan, Ajit P.

    2016-01-01

    Computational modeling of the mitral valve (MV) has potential applications for determining optimal MV repair techniques and risk of recurrent mitral regurgitation. Two key concerns for informing these models are (1) sensitivity of model performance to the accuracy of the input geometry, and, (2) acquisition of comprehensive data sets against which the simulation can be validated across clinically relevant geometries. Addressing the first concern, ex vivo micro-computed tomography (microCT) wa...

  16. A MICROCOMPUTER LINEAR PROGRAMMING PACKAGE: AN ALTERNATIVE TO MAINFRAMES

    OpenAIRE

    Laughlin, David H.

    1984-01-01

    This paper presents the capabilities and limitations of a microcomputer linear programming package. The solution algorithm is a version of the revised simplex. Rapid problem entry, user ease of operation, sensitivity analyses on objective function and right hand sides are advantages. A problem size of 150 activities and 64 constraints can be solved in present form. Due to problem size, limitations and lack of parametric and integer programming routines, this package is thought to have the mos...
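
    The kind of small linear program such a package handled (well under its 150-activity, 64-constraint limit) can be reproduced today in a few lines of SciPy. The toy objective and constraints below are invented for illustration, not taken from the package's documentation:

```python
from scipy.optimize import linprog

# Maximize 3x + 2y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-3.0, -2.0]
A_ub = [[1.0, 1.0],
        [1.0, 3.0]]
b_ub = [4.0, 6.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
x_opt, objective = res.x, -res.fun   # optimal activities and objective value
```

Sensitivity information of the sort the package offered (objective-function and right-hand-side ranging) corresponds to the dual values exposed by modern solvers.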

  17. Simulation of Heat Transfer and Electromagnetic Fields of Protected Microcomputers

    Directory of Open Access Journals (Sweden)

    Josef Lakatos

    2006-01-01

    Full Text Available The paper presents results of a collaboration between the Department of Mechatronics and Electronics at the University of Žilina and VÚVT Engineering a.s. Žilina in the area of heat-transfer simulation and simulation of disturbing electromagnetic radiation in computer construction. The simulation results were used in the development of protected microcomputer prototypes within the framework of applied research at both workplaces.

  18. MultiSimPC: a multilevel logic simulator for microcomputers

    OpenAIRE

    Kelly, John S.

    1986-01-01

    Approved for public release; distribution is unlimited. This thesis describes extensions to a multilevel VLSI logic simulator named MultiSim. Originally developed by Dr. Ausif Mahmood of Washington State University for large minicomputers such as the VAX-11/780, MultiSim is now operational on desktop microcomputers costing only a few thousand dollars. In addition, MultiSim has been expanded to include provisions for adding user-defined primitive cells to the circuit library, true mu...

  19. Use of Data Base Microcomputer Software in Descriptive Nursing Research

    OpenAIRE

    Chapman, Judy Jean

    1985-01-01

    Data base microcomputer software was used to design a file for data storage and retrieval in a qualitative nursing research project. The needs of 50 breast feeding mothers from birth to four months were studied. One thousand records with descriptive nursing data were entered into the file. The search and retrieval capability of data base software facilitated this qualitative research. The findings will be discussed in three areas: (1) infant concerns, (2) postpartum concerns, and (3) breast c...

  20. Microcomputer control system for the SuperHILAC third injector

    International Nuclear Information System (INIS)

    Lancaster, H.D.; Magyary, S.B.; Glatz, J.; Selph, F.B.; Fahmie, M.P.; Ritchie, A.L.; Keith, S.R.; Stover, G.R.; Besse, L.J.

    1979-09-01

    A new control system using the latest technology in microcomputers will be used on the third injector at the SuperHILAC. It incorporates some new and progressive ideas in both hardware and software design. These ideas were inspired by the revolution in microprocessors. The third injector project consists of a high voltage pre-injector, a Wideroe type linear accelerator, and connecting beam lines, requiring control of 80 analog and 300 boolean devices. To solve this problem, emphasizing inexpensive, commercially available hardware, we designed a control system consisting of 20 microcomputer boards with a total of 700 kilobytes of memory. Each computer board using a 16-bit microprocessor has the computing power of a typical minicomputer. With these microcomputers operating in parallel, the programming can be greatly simplified, literally replacing software with hardware. This improves system response speed and cuts costs dramatically. An easy to use interpretive language, similar to BASIC, will allow operations personnel to write special purpose programs in addition to the compiled procedures

  1. The microcomputer: A tool for personal language learning

    Directory of Open Access Journals (Sweden)

    David H. Wyatt

    2013-02-01

    Full Text Available Computer-assisted methods of teaching and learning languages have been surrounded by controversy and debate for over a decade. In 1979, however, microcomputers began to appear in a form suitable for educational applications, offering for the first time an alternative to both the cost and the approach of large computer systems. The impact of the microcomputer has been limited by a number of factors, and microcomputer-assisted learning is still in a relative state of infancy. The main implications for language teaching and learning are only now beginning to be understood, just as the limiting factors are starting to disappear. This paper will assess the present situation and outline some likely future developments in the use of microcomputers in language learning. Computer-assisted methods in the teaching and learning of languages have been surrounded by disagreement for more than a decade. In 1979, however, microcomputers began to appear in a form suitable for educational purposes. For the first time there was an alternative to the cost of, and the approach of, large computer systems. The impact of the microcomputer has been curtailed by a number of factors, and computer-assisted instruction is still in its infancy. The most important implications for language learning and teaching are only now beginning to be appreciated as the limiting factors start to disappear. This article assesses the current situation and outlines possible future developments in the use of microcomputers in language learning.

  2. Graphics of (X,Y) spectrum for microcomputer

    International Nuclear Information System (INIS)

    Macias B, L.R.

    1991-08-01

    When carrying out diffraction work, it is frequently necessary to visualize the spectra of the acquired data in order to analyze them. The microcomputer-based data-acquisition design for the neutron diffractometer allows the data to be stored in a file and transferred to the CYBER system, whose utilities allow the spectrum to be viewed as a graph. Diffraction work studies crystalline materials through the Bragg law: the sample mounted on the diffractometer is scanned with radiation of a known wavelength, and by varying the angles the corresponding interplanar distances are determined. The main objective of this work is, starting from a data set generated by the diffractometer, to generate a graph of the corresponding (X,Y) spectrum on the screen of a microcomputer and, if required, to obtain the graph in printed form by means of the same microcomputer program. (Author)
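
    The interplanar-distance computation implied by the Bragg law (nλ = 2d sin θ) is a one-liner once peak positions are read off the plotted spectrum. The wavelength and peak angles below are hypothetical values chosen for illustration only:

```python
import numpy as np

def interplanar_spacing(two_theta_deg, wavelength, order=1):
    """Bragg's law n*lambda = 2*d*sin(theta): return d for measured
    scattering angles 2-theta (degrees) and a known wavelength (angstroms)."""
    theta = np.radians(np.asarray(two_theta_deg) / 2.0)
    return order * wavelength / (2.0 * np.sin(theta))

# Hypothetical wavelength and diffraction peak positions, for illustration.
wavelength = 1.54                       # angstroms
peaks_2theta = [20.0, 40.0, 60.0]       # degrees, read from the spectrum
d_spacings = interplanar_spacing(peaks_2theta, wavelength)
```

Higher-angle peaks map to smaller interplanar distances, which is why the spacings come out in decreasing order.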

  3. Microcomputer-assisted transmission of disaster data by cellular telephone.

    Science.gov (United States)

    Wigder, H N; Fligner, D J; Rivers, D; Hotch, D

    1989-01-01

    Voice communication of information during disasters is often inadequate. In particular, simultaneous transmission by multiple callers on the same frequency can result in blocked transmissions and miscommunications. In contrast, nonvoice transmission of data requires less time than does voice communication of the same data, and may be more accurate. We conducted a pilot study to test the feasibility of a microcomputer assisted communication (MAC) network linking the disaster scene and the command hospital. The radio chosen to transmit data from the field disaster site to the command hospital was a cellular telephone connected to the microcomputer by modem. Typed communications between the microcomputer operators enabled dialogue between the disaster site and the hospitals. A computer program using commercially available software (Symphony by Lotus, Inc.) was written to allow for data entry, data transmission, and reports. Patient data, including age, sex, severity of injury, identification number, major injuries, and hospital destination were successfully transmitted from the disaster site command post to the command hospital. This pilot test demonstrated the potential applicability of MAC for facilitating transmission of patient data during a disaster.

  4. Software development assurance for special control computer sets on the base of the Electronica-60M microcomputer in the stations of fuel element nondestructive testing

    International Nuclear Information System (INIS)

    Vasil'kova, A.D.; Grachev, S.N.; Salakatova, L.S.

    1987-01-01

    A special computational control complex (SCCC) is used for the development of software for nondestructive-testing posts. It includes the ''Elektronika-60M'' microcomputer, a device for communication with the object (DCO) based on the standard microprocessor set (TMS) built from LIUS-2 KTS components, and an adapter developed to connect the ''Elektronika-60M'' bus to the IK1 TMS bus. To increase programmer productivity, an instrumental SCCC is proposed that includes an SM-4 minicomputer, TMS units serving as the DCO, an adapter developed to connect the SM-4 common bus to the IK1 TMS bus, and devices simulating the operation of the facility. The SM-4 makes it possible to develop programs in FORTRAN and to debug them effectively. A subroutine library for communication with the TMS units has been compiled, and techniques have been developed for testing FORTRAN programs in the TMS programmable read-only memory and for starting these programs

  5. Automatic video segmentation employing object/camera modeling techniques

    NARCIS (Netherlands)

    Farin, D.S.

    2005-01-01

    Practically established video compression and storage techniques still process video sequences as rectangular images without further semantic structure. However, humans watching a video sequence immediately recognize acting objects as semantic units. This semantic object separation is currently not

  6. A Titration Technique for Demonstrating a Magma Replenishment Model.

    Science.gov (United States)

    Hodder, A. P. W.

    1983-01-01

    Conductiometric titrations can be used to simulate subduction-setting volcanism. Suggestions are made as to the use of this technique in teaching volcanic mechanisms and geochemical indications of tectonic settings. (JN)

  7. Modelling skin penetration using the Laplace transform technique.

    Science.gov (United States)

    Anissimov, Y G; Watkinson, A

    2013-01-01

    The Laplace transform is a convenient mathematical tool for solving ordinary and partial differential equations. The application of this technique to problems arising in drug penetration through the skin is reviewed in this paper. © 2013 S. Karger AG, Basel.
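
    As a sketch of the approach, the Laplace-domain solution for diffusion into a semi-infinite membrane with constant surface concentration can be inverted numerically and checked against the known closed-form erfc profile. The parameter values below are illustrative, not taken from the paper:

```python
import mpmath as mp

# Semi-infinite membrane, constant donor concentration C0 at the surface.
# The Laplace-domain solution is C(x, s) = (C0 / s) * exp(-x * sqrt(s / D));
# the exact time-domain profile is C0 * erfc(x / (2 * sqrt(D * t))).
C0, D, x = 1.0, 1e-3, 0.05      # illustrative units

def C_laplace(s):
    return C0 / s * mp.exp(-x * mp.sqrt(s / D))

t = 2.0
c_numeric = mp.invertlaplace(C_laplace, t, method='talbot')  # numerical inversion
c_exact = C0 * mp.erfc(x / (2 * mp.sqrt(D * t)))             # closed form
```

Numerical inversion of this kind is useful precisely when the Laplace-domain solution is easy to write down but the time-domain inverse has no closed form, which is the common situation in multilayer skin-penetration models.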

  8. Derandomizing buffer and microcomputer memory for the use of a fast MWPC image memory

    International Nuclear Information System (INIS)

    Skvaril, J.

    1986-01-01

    We have developed a special derandomizing buffer memory for MWPC imaging which makes it possible to use a microcomputer memory in DMA mode. The buffer regulates the input data stream (the X and Y coordinates of an event) into the microcomputer memory with practically no data losses. The advantages of this approach are (a) no special histogramming memory is needed and (b) the resultant image resides in the memory space of the microcomputer used and can be processed immediately. (orig.)
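The histogramming idea can be sketched in a few lines; this is only an illustration of accumulating the image directly in host memory, not the original hardware logic:

```python
def histogram_events(events, nx, ny):
    """Accumulate an MWPC image directly in host memory: each (x, y)
    event increments one image cell, so no separate histogramming
    memory is needed and the image is immediately available."""
    image = [[0] * nx for _ in range(ny)]
    for x, y in events:
        image[y][x] += 1
    return image
```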

  9. Microcomputer-based system for registration of oxygen tension in peripheral muscle.

    Science.gov (United States)

    Odman, S; Bratt, H; Erlandsson, I; Sjögren, L

    1986-01-01

    For registration of oxygen tension fields in peripheral muscle, a microcomputer-based system was designed around the M6800 microprocessor. The system records the signals from a multiwire oxygen electrode (MDO), a multiwire electrode for measuring oxygen on the surface of an organ. The system contains a patient-safety isolation unit built on optocouplers, and its upper frequency limit is 0.64 Hz. Collected data were corrected for drift and temperature changes during the measurement by using pre- and post-calibrations and a linear compensation technique; the drift of the electrodes proved to be linear and could therefore be compensated for. The system was tested in an experiment on a pig. To study the distribution of oxygen, the statistical mean, standard deviation, skewness and kurtosis were calculated. To detect changes or differences between histograms, a Kolmogorov-Smirnov test was used.
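The linear drift compensation can be sketched as follows; the function name and arguments are illustrative, not taken from the paper:

```python
def correct_drift(readings, times, pre_cal, post_cal, t_start, t_end):
    """Compensate a linear electrode drift between pre- and post-calibration.

    pre_cal and post_cal are readings taken in the same calibration
    medium before and after the measurement; the drift is assumed
    linear in time and is subtracted from each sample.
    """
    drift_rate = (post_cal - pre_cal) / (t_end - t_start)
    return [r - drift_rate * (t - t_start) for r, t in zip(readings, times)]
```

With a pre-calibration of 100 and a post-calibration of 102 over a 10-unit measurement window, every sample is corrected by 0.2 per time unit.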

  10. Micro-computed tomography imaging and analysis in developmental biology and toxicology.

    Science.gov (United States)

    Wise, L David; Winkelmann, Christopher T; Dogdas, Belma; Bagchi, Ansuman

    2013-06-01

    Micro-computed tomography (micro-CT) is a high resolution imaging technique that has expanded and strengthened in use since it was last reviewed in this journal in 2004. The technology has expanded to include more detailed analysis of bone, as well as soft tissues, by use of various contrast agents. It is increasingly applied to questions in developmental biology and developmental toxicology. Relatively high-throughput protocols now provide a powerful and efficient means to evaluate embryos and fetuses subjected to genetic manipulations or chemical exposures. This review provides an overview of the technology, including scanning, reconstruction, visualization, segmentation, and analysis of micro-CT generated images. This is followed by a review of more recent applications of the technology in some common laboratory species that highlight the diverse issues that can be addressed. Copyright © 2013 Wiley Periodicals, Inc.

  11. Application of micro-computed tomography to microstructure studies of the medicinal fungus Hericium coralloides.

    Science.gov (United States)

    Pallua, Johannes D; Kuhn, Volker; Pallua, Anton F; Pfaller, Kristian; Pallua, Anton K; Recheis, Wolfgang; Pöder, Reinhold

    2015-01-01

    The potential of 3-D nondestructive imaging techniques such as micro-computed tomography (micro-CT) was evaluated to study morphological patterns of the potential medicinal fungus Hericium coralloides (Basidiomycota). Micro-CT results were correlated with histological information gained from scanning electron microscopy (SEM) and light microscopy (LM). It is demonstrated that the combination of these imaging methods results in a more distinct picture of the morphology of the edible and potentially medicinal Hericium coralloides basidiomata. In addition we have created 3-D reconstructions and visualizations based on micro-CT imagery from a randomly selected part of the upper region of a fresh H. coralloides basidioma: Analyses for the first time allowed an approximation of the evolutionary effectiveness of this bizarrely formed basidioma type in terms of the investment of tissue biomass and its reproductive output (production of basidiospores). © 2015 by The Mycological Society of America.

  12. Operation of commercially-based microcomputer technology in a space radiation environment

    Science.gov (United States)

    Yelverton, J. N.

    This paper focuses on detection and recovery techniques that should enable the reliable operation of commercially-based microprocessor technology in the harsh radiation environment of space and at high altitudes. This approach is especially significant in light of the current cost-driven shift in emphasis from space-hardened Class-S parts qualification to a more direct use of commercial parts. The method should offset some of the concern over whether the newer high-density state-of-the-art RISC and CISC microprocessors can be used in future space applications. Commercial aviation should also benefit, since radiation-induced transients are a new issue arising from the increased quantities of microcomputers used in aircraft avionics.

  13. Review of air quality modeling techniques. Volume 8

    International Nuclear Information System (INIS)

    Rosen, L.C.

    1977-01-01

    Air transport and diffusion models which are applicable to the assessment of the environmental effects of nuclear, geothermal, and fossil-fuel electric generation are reviewed. The general classification of models and model inputs are discussed. A detailed examination of the statistical, Gaussian plume, Gaussian puff, one-box and species-conservation-of-mass models is given. Representative models are discussed with attention given to the assumptions, input data requirement, advantages, disadvantages and applicability of each
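As an illustration of the reviewed model class, the standard textbook ground-reflected Gaussian plume formula can be coded directly (this is the generic formula, not code from the report):

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration (g/m^3) from a continuous point source with ground
    reflection. q: emission rate (g/s), u: wind speed (m/s), h: effective
    release height (m), sigma_y/sigma_z: dispersion coefficients (m)
    evaluated at the downwind distance of interest."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

The second vertical term is the "image source" that models reflection of the plume at the ground.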

  14. Electromagnetic interference modeling and suppression techniques in variable-frequency drive systems

    Science.gov (United States)

    Yang, Le; Wang, Shuo; Feng, Jianghua

    2017-11-01

    Electromagnetic interference (EMI) causes electromechanical damage to motors and degrades the reliability of variable-frequency drive (VFD) systems. Unlike the fundamental-frequency components in motor drive systems, high-frequency EMI noise, which couples through the parasitic parameters of the system, is difficult to analyze and reduce. In this article, EMI modeling techniques for the different functional units of a VFD system, including induction motors, motor bearings, and rectifier-inverters, are reviewed and evaluated in terms of applied frequency range, model parameterization, and model accuracy. The EMI models for the motors are categorized based on modeling techniques and model topologies. Motor bearing and shaft models are also reviewed, and techniques used to eliminate bearing currents are evaluated. Modeling techniques for conventional rectifier-inverter systems are also summarized. EMI noise suppression techniques, including passive filters, Wheatstone bridge balance, active filters, and optimized modulation, are reviewed and compared based on the VFD system models.

  15. The development and application of a coincidence measurement apparatus with micro-computer system

    International Nuclear Information System (INIS)

    Du Hongshan; Zhou Youpu; Gao Junlin; Qin Deming; Cao Yunzheng; Zhao Shiping

    1987-01-01

    A coincidence measurement apparatus with a micro-computer system has been developed, achieving automatic data acquisition and processing. Results of its application to radioactivity measurements are satisfactory

  16. Examining Interior Grid Nudging Techniques Using Two-Way Nesting in the WRF Model for Regional Climate Modeling

    Science.gov (United States)

    This study evaluates interior nudging techniques using the Weather Research and Forecasting (WRF) model for regional climate modeling over the conterminous United States (CONUS) using a two-way nested configuration. NCEP–Department of Energy Atmospheric Model Intercomparison Pro...

  17. Modelling tick abundance using machine learning techniques and satellite imagery

    DEFF Research Database (Denmark)

    Kjær, Lene Jung; Korslund, L.; Kjelland, V.

    satellite images to run Boosted Regression Tree machine learning algorithms to predict overall distribution (presence/absence of ticks) and relative tick abundance of nymphs and larvae in southern Scandinavia. For nymphs, the predicted abundance had a positive correlation with observed abundance...... the predicted distribution of larvae was mostly even throughout Denmark, it was primarily around the coastlines in Norway and Sweden. Abundance was fairly low overall except in some fragmented patches corresponding to forested habitats in the region. Machine learning techniques allow us to predict for larger...... the collected ticks for pathogens and using the same machine learning techniques to develop prevalence maps of the ScandTick region....

  18. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    Science.gov (United States)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  19. Application of integrated modeling technique for data services ...

    African Journals Online (AJOL)

    This paper, therefore, describes the application of the integrated simulation technique for deriving the optimum resources required for data services in an asynchronous transfer mode (ATM) based private wide area network (WAN) to guarantee specific QoS requirement. The simulation tool drastically cuts the simulation ...

  20. Prescribed wind shear modelling with the actuator line technique

    DEFF Research Database (Denmark)

    Mikkelsen, Robert Flemming; Sørensen, Jens Nørkær; Troldborg, Niels

    2007-01-01

    A method for prescribing arbitrary steady atmospheric wind shear profiles combined with CFD is presented. The method is furthermore combined with the actuator line technique governing the aerodynamic loads on a wind turbine. Computation are carried out on a wind turbine exposed to a representative...

  1. On a Numerical and Graphical Technique for Evaluating some Models Involving Rational Expectations

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders Rygh

    Campbell and Shiller (1987) proposed a graphical technique for the present value model which consists of plotting the spread and theoretical spread as calculated from the cointegrated vector autoregressive model. We extend these techniques to a number of rational expectation models and give...

  2. On a numerical and graphical technique for evaluating some models involving rational expectations

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders Rygh

    Campbell and Shiller (1987) proposed a graphical technique for the present value model which consists of plotting the spread and theoretical spread as calculated from the cointegrated vector autoregressive model. We extend these techniques to a number of rational expectation models and give...

  3. Radiation treatment planning using a microcomputer

    International Nuclear Information System (INIS)

    Lunsqui, A.R.; Calil, S.J.; Rocha, J.R.O.; Alexandre, A.C.

    1990-01-01

    Radiation treatment planning requires lengthy manipulation of data from isodose charts to obtain the best irradiation technique. Over the past 25 years this tedious operation has been replaced by computerized methods, which can reduce the working time by at least 20 times. A software package that generates a polychromatic image of the dose distribution is being developed at the Biomedical Engineering Center. By means of a digitizing board, the patient contour and the beam data are transferred to the computer and stored as polynomial and Fourier series, respectively. To calculate the dose distribution, the irradiated region is represented by a variable-size two-dimensional dot matrix. The dose at each point is calculated by correcting and adding the stored data for each beam. An algorithm for color definition according to dose intensity was developed to display the resultant matrix on a computer monitor. A hard copy can be obtained by means of a six-color plotter. (author)
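The color-definition step reduces to a dose-to-color lookup; the bands and palette below are hypothetical, not those of the authors' software:

```python
# Hypothetical bands in percent of the prescribed dose, highest first.
ISODOSE_BANDS = [(100, "red"), (90, "orange"), (70, "yellow"),
                 (50, "green"), (30, "blue"), (0, "violet")]

def dose_color(dose_percent):
    """Map a dose value to the first (highest) band it reaches."""
    for threshold, color in ISODOSE_BANDS:
        if dose_percent >= threshold:
            return color
```

Applying `dose_color` to every cell of the dose matrix yields the polychromatic image described.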

  4. The application of neural networks with artificial intelligence technique in the modeling of industrial processes

    International Nuclear Information System (INIS)

    Saini, K. K.; Saini, Sanju

    2008-01-01

    Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can 'learn', automatically, complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural networks with artificial intelligence technique in the modeling of industrial processes.

  5. An eigenexpansion technique for modelling plasma start-up

    International Nuclear Information System (INIS)

    Pillsbury, R.D.

    1989-01-01

    An algorithm has been developed and implemented in a computer program that allows the estimation of PF coil voltages required to start-up an axisymmetric plasma in a tokamak in the presence of eddy currents in toroidally continuous conducting structures. The algorithm makes use of an eigen-expansion technique to solve the lumped parameter circuit loop voltage equations associated with the PF coils and passive (conducting) structures. An example of start-up for CIT (Compact Ignition Tokamak) is included
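A minimal sketch of the eigen-expansion idea for lumped-parameter loop equations, using a hypothetical two-loop system L dI/dt + R I = V with constant voltages (the matrix values are illustrative, not CIT data):

```python
import numpy as np

# Hypothetical two-loop circuit L dI/dt + R I = V, constant voltages.
L = np.array([[1.0, 0.2],
              [0.2, 0.8]])          # inductance matrix (H)
R = np.diag([0.5, 0.3])             # resistance matrix (ohm)
V = np.array([1.0, 0.0])            # applied loop voltages (V)

A = -np.linalg.solve(L, R)          # dI/dt = A I + L^{-1} V
w, P = np.linalg.eig(A)             # eigen-expansion of the dynamics

def currents(t, I0=np.zeros(2)):
    """I(t) = I_ss + P diag(e^{w t}) P^{-1} (I0 - I_ss)."""
    I_ss = np.linalg.solve(R, V)    # steady state: R I = V
    c = np.linalg.solve(P, I0 - I_ss)
    return (P @ (np.exp(w * t) * c)).real + I_ss
```

Each eigenvalue in `w` is a decay rate of one circuit mode; the solution is a sum of exponentially decaying eigenmodes approaching the resistive steady state.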

  6. Wave propagation in fluids models and numerical techniques

    CERN Document Server

    Guinot, Vincent

    2012-01-01

    This second edition with four additional chapters presents the physical principles and solution techniques for transient propagation in fluid mechanics and hydraulics. The application domains vary including contaminant transport with or without sorption, the motion of immiscible hydrocarbons in aquifers, pipe transients, open channel and shallow water flow, and compressible gas dynamics. The mathematical formulation is covered from the angle of conservation laws, with an emphasis on multidimensional problems and discontinuous flows, such as steep fronts and shock waves. Finite

  7. A vortex model for Darrieus turbine using finite element techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ponta, Fernando L. [Universidad de Buenos Aires, Dept. de Electrotecnia, Grupo ISEP, Buenos Aires (Argentina); Jacovkis, Pablo M. [Universidad de Buenos Aires, Dept. de Computacion and Inst. de Calculo, Buenos Aires (Argentina)

    2001-09-01

    Since 1970 several aerodynamic prediction models have been formulated for the Darrieus turbine. We can identify two families of models: stream-tube and vortex. The former needs much less computation time but the latter is more accurate. The purpose of this paper is to show a new option for modelling the aerodynamic behaviour of Darrieus turbines. The idea is to combine a classic free vortex model with a finite element analysis of the flow in the surroundings of the blades. This avoids some of the remaining deficiencies in classic vortex models. The agreement between analysis and experiment when predicting instantaneous blade forces and near wake flow behind the rotor is better than the one obtained in previous models. (Author)

  8. Artificial intelligence techniques for modeling database user behavior

    Science.gov (United States)

    Tanner, Steve; Graves, Sara J.

    1990-01-01

    The design and development of the adaptive modeling system is described. This system models how a user accesses a relational database management system in order to improve its performance by discovering use access patterns. In the current system, these patterns are used to improve the user interface and may be used to speed data retrieval, support query optimization and support a more flexible data representation. The system models both syntactic and semantic information about the user's access and employs both procedural and rule-based logic to manipulate the model.

  9. Reduction of thermal models of buildings: improvement of techniques using meteorological influence models; Reduction de modeles thermiques de batiments: amelioration des techniques par modelisation des sollicitations meteorologiques

    Energy Technology Data Exchange (ETDEWEB)

    Dautin, S.

    1997-04-01

    This work concerns the modeling of thermal phenomena inside buildings, for evaluating the energy operating costs of thermal installations and for modeling transient thermal and airflow phenomena. The thesis comprises 7 chapters dealing with: (1) thermal phenomena inside buildings and the CLIM2000 calculation code; (2) the ETNA and GENEC experimental cells and their modeling; (3) the model reduction techniques tested (Marshall's truncation, Michailesco's aggregation method and Moore's truncation), with their algorithms and their encoding in the MATRED software; (4) the application of the model reduction methods to the GENEC and ETNA cells and to a medium-size dual-zone building; (5) the modeling of the meteorological influences classically applied to buildings (external temperature and solar flux); (6) the analytical expression of these modeled meteorological influences. The last chapter presents the results of these improved methods on the GENEC and ETNA cells and on a lower-inertia building, and compares the new methods to the classical ones. (J.S.) 69 refs.

  10. Multiparous Ewe as a Model for Teaching Vaginal Hysterectomy Techniques.

    Science.gov (United States)

    Kerbage, Yohan; Cosson, Michel; Hubert, Thomas; Giraudet, Géraldine

    2017-12-01

    Despite being linked to improving patient outcomes and limiting costs, the use of vaginal hysterectomy is on the wane. Although a combination of reasons might explain this trend, one cause is a lack of practical training. An appropriate teaching model must therefore be devised. Currently, only low-fidelity simulators exist. Ewes provide an appropriate model for pelvic anatomy and are well-suited for testing vaginal mesh properties. This article sets out a vaginal hysterectomy procedure for use as an education and training model. A multiparous ewe was the model. Surgery was performed under general anesthesia. The ewe was in a lithotomy position resembling that assumed by women on the operating table. Two vaginal hysterectomies were performed on two ewes, following every step precisely as if the model were human. Each surgical step of vaginal hysterectomy performed on the ewe and on a woman were compared side by side. We identified that all surgical steps were particularly similar. The main limitations of this model are costs ($500/procedure), logistic problems (housing large animals), and public opposition to animal training models. The ewe appears to be an appropriate model for teaching and training of vaginal hysterectomy.

  11. Modeling rainfall-runoff process using soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal; Tombul, Mustafa

    2013-02-01

    Rainfall-runoff process was modeled for a small catchment in Turkey, using 4 years (1987-1991) of measurements of independent variables of rainfall and runoff values. The models used in the study were Artificial Neural Networks (ANNs), Adaptive Neuro-Fuzzy Inference System (ANFIS) and Gene Expression Programming (GEP) which are Artificial Intelligence (AI) approaches. The applied models were trained and tested using various combinations of the independent variables. The goodness of fit for the model was evaluated in terms of the coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), coefficient of efficiency (CE) and scatter index (SI). A comparison was also made between these models and traditional Multi Linear Regression (MLR) model. The study provides evidence that GEP (with RMSE=17.82 l/s, MAE=6.61 l/s, CE=0.72 and R2=0.978) is capable of modeling rainfall-runoff process and is a viable alternative to other applied artificial intelligence and MLR time-series methods.
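The goodness-of-fit statistics named above have standard definitions that can be computed directly; a pure-Python sketch (definitions assumed standard rather than taken from the paper):

```python
import math

def fit_metrics(obs, sim):
    """RMSE, MAE, Nash-Sutcliffe coefficient of efficiency (CE) and
    squared correlation (R^2), using their standard definitions."""
    n = len(obs)
    mo = sum(obs) / n
    ms = sum(sim) / n
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)
    mae = sum(abs(o - s) for o, s in zip(obs, sim)) / n
    ce = 1 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
              / sum((o - mo) ** 2 for o in obs))
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    r2 = cov ** 2 / (sum((o - mo) ** 2 for o in obs)
                     * sum((s - ms) ** 2 for s in sim))
    return rmse, mae, ce, r2
```

A perfect simulation gives RMSE = MAE = 0 and CE = R^2 = 1; CE can go negative when the model is worse than simply predicting the observed mean.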

  12. Experiments with microcomputer-based artificial intelligence environments

    Science.gov (United States)

    Summers, E.G.; MacDonald, R.A.

    1988-01-01

    The U.S. Geological Survey (USGS) has been experimenting with the use of relatively inexpensive microcomputers as artificial intelligence (AI) development environments. Several AI languages are available that perform fairly well on desk-top personal computers, as are low-to-medium cost expert system packages. Although performance of these systems is respectable, their speed and capacity limitations are questionable for serious earth science applications foreseen by the USGS. The most capable artificial intelligence applications currently are concentrated on what is known as the "artificial intelligence computer," and include Xerox D-series, Tektronix 4400 series, Symbolics 3600, VAX, LMI, and Texas Instruments Explorer. The artificial intelligence computer runs expert system shells and Lisp, Prolog, and Smalltalk programming languages. However, these AI environments are expensive. Recently, inexpensive 32-bit hardware has become available for the IBM/AT microcomputer. USGS has acquired and recently completed Beta-testing of the Gold Hill Systems 80386 Hummingboard, which runs Common Lisp on an IBM/AT microcomputer. Hummingboard appears to have the potential to overcome many of the speed/capacity limitations observed with AI-applications on standard personal computers. USGS is a Beta-test site for the Gold Hill Systems GoldWorks expert system. GoldWorks combines some high-end expert system shell capabilities in a medium-cost package. This shell is developed in Common Lisp, runs on the 80386 Hummingboard, and provides some expert system features formerly available only on AI-computers including frame and rule-based reasoning, on-line tutorial, multiple inheritance, and object-programming. ?? 1988 International Association for Mathematical Geology.

  13. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows longitudinal image-based measurements in animal models of lung disease. Combining repetitive high-resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice, an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase, n = 9; 0.5 U elastase, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
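At its core, an aerated-volume measurement reduces to thresholding CT values inside a lung mask; a hedged sketch (the threshold and function name are illustrative, not the authors' algorithm):

```python
import numpy as np

def aerated_lung_volume(ct_hu, lung_mask, voxel_mm3, threshold_hu=-300):
    """Aerated volume = number of voxels inside the lung mask whose CT
    value (Hounsfield units) falls below an air/tissue threshold,
    multiplied by the voxel volume. The threshold is illustrative."""
    aerated = (ct_hu < threshold_hu) & lung_mask
    return aerated.sum() * voxel_mm3
```

Fibrosis replaces aerated tissue with denser tissue, lowering this measure, while emphysema increases it, which is the contrast exploited in the study.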

  14. Validation techniques of agent based modelling for geospatial simulations

    OpenAIRE

    Darvishi, M.; Ahmadi, G.

    2014-01-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that operate at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent...

  15. Use of machine learning techniques for modeling of snow depth

    Directory of Open Access Journals (Sweden)

    G. V. Ayzel

    2017-01-01

    Full Text Available Snow exerts a significant regulating effect on the land hydrological cycle, since it controls the intensity of heat and water exchange between the soil-vegetation cover and the atmosphere. Estimating spring flood runoff or rain floods on mountainous rivers requires understanding of the snow cover dynamics on a watershed. In our work, the problem of snow depth modeling is addressed using both available databases of hydro-meteorological observations and easily accessible scientific software that allows complete reproduction of the results and further development of this theme by the scientific community. We used daily observations of snow cover and surface meteorological parameters obtained at three stations situated in different geographical regions: Col de Porte (France), Sodankylä (Finland), and Snoqualmie Pass (USA). Statistical modeling of snow depth is based on a set of freely distributed present-day machine learning models: decision trees, adaptive boosting, and gradient boosting. It is demonstrated that combining modern machine learning methods with available meteorological data provides good accuracy in snow cover modeling. The best results for every investigated site were obtained with the ensemble method of gradient boosting over decision trees; this model reproduces well both the periods of snow accumulation and of melting. The purposeful character of the learning process in gradient boosting models, their ensemble character, and the use of the redundancy of a test sample in the learning procedure make models of this type a good and sustainable research tool. The results can be used for estimating snow cover characteristics in river basins where hydro-meteorological information is absent or insufficient.
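A minimal, one-feature sketch of gradient boosting over decision trees, the best-performing model class reported (the actual study used full meteorological predictor sets and established libraries; this only illustrates the mechanism):

```python
def fit_stump(xs, residuals):
    """Best single-split (depth-1) regression tree on one feature."""
    best = None
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    for cut in range(1, len(xs)):
        thr = xs[order[cut]]
        left = [residuals[i] for i in order[:cut]]
        right = [residuals[i] for i in order[cut:]]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((v - lm) ** 2 for v in left)
               + sum((v - rm) ** 2 for v in right))
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda x: lm if x < thr else rm

def gradient_boost(xs, ys, n_rounds=50, lr=0.3):
    """Least-squares boosting: each stump is fitted to the residuals
    of the ensemble built so far, then added with a learning rate."""
    base = sum(ys) / len(ys)
    pred = [base] * len(xs)
    stumps = []
    for _ in range(n_rounds):
        resid = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, resid)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)
```

Each round reduces the remaining residual by a constant factor, which is why the ensemble can reproduce both the accumulation and melting phases of a snow-depth series.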

  16. Sand and gravel mine operations and reclamation planning using microcomputers

    Energy Technology Data Exchange (ETDEWEB)

    Ariffin, J.B.

    1990-02-01

    The purpose of this study is to focus on the application of microcomputers, also known as personal computers, in planning for sand and gravel mine operations and reclamation at a site in Story County, Iowa. This site, called the Arrasmith Pit, is operated by Martin Marietta Aggregates, Inc. The Arrasmith site, which encompasses an area of about 25 acres, is a relatively small site for aggregate mining. However, planning for the concurrent mine operation and reclamation program at this site is just as critical as with larger sites and the planning process is the same.

  17. Microcomputer package for statistical analysis of microbial populations.

    Science.gov (United States)

    Lacroix, J M; Lavoie, M C

    1987-11-01

    We have developed a Pascal system to compare microbial populations from different ecological sites using microcomputers. The values calculated are: the coverage value and its standard error, the minimum similarity and the geometric similarity between two biological samples, and the Lambda test consisting of calculating the ratio of the mean similarity between two subsets by the mean similarity within subsets. This system is written for Apple II, IBM or compatible computers, but it can work for any computer which can use CP/M, if the programs are recompiled for such a system.
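The quantities named can be sketched with their commonly used definitions (assumed here; the package's exact formulas may differ):

```python
def goods_coverage(counts):
    """Good's coverage estimate: 1 - (number of species seen once / N)."""
    n = sum(counts)
    singletons = sum(1 for c in counts if c == 1)
    return 1 - singletons / n

def proportional_similarity(counts_a, counts_b):
    """Overlap of the relative-abundance profiles of two samples over the
    same species list (1 = identical composition, 0 = nothing shared)."""
    na, nb = sum(counts_a), sum(counts_b)
    return sum(min(a / na, b / nb) for a, b in zip(counts_a, counts_b))
```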

  18. A local-area-network based radiation oncology microcomputer system

    International Nuclear Information System (INIS)

    Chu, W.K.; Taylor, T.K.; Kumar, P.P.; Imray, T.J.

    1985-01-01

    The application of computerized technology in the medical specialty of radiation oncology has gained wide acceptance in the past decade. Recognizing that most radiation oncology department personnel are familiar with computer operations and terminology, it appears reasonable to expand the computer's applications to other departmental activities, such as scheduling, record keeping, billing, and treatment regimen and status. Instead of sharing the processing capability available on the existing treatment minicomputer, the radiation oncology computer system is based upon a microcomputer local area network (LAN). The system was conceptualized in 1984 and completed in March 1985. This article outlines the LAN-based radiation oncology computer system

  19. Program pseudo-random number generator for microcomputers

    International Nuclear Information System (INIS)

    Ososkov, G.A.

    1980-01-01

    Program pseudo-random number generators (PNG) intended for testing control equipment and communication channels are considered. On 8-bit microcomputers it is necessary to assign 4 words of storage to hold one random number. The proposed economical algorithms for random number generation are based on the idea of ''mixing'' the quarters of the preceding random number to obtain the next one. Test results are displayed for two such generators. A FORTRAN variant of the PNG is presented, along with a program realizing the PNG written in INTEL-8080 autocode
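The "mixing of quarters" idea might look like the following hypothetical illustration: permute the four quarters (bytes) of the previous 32-bit number, then stir. This is not Ososkov's algorithm, merely a sketch of the concept:

```python
def mix_prng(seed, n):
    """Hypothetical 32-bit generator in the spirit of 'mixing quarters':
    the four bytes of the previous number are permuted, then stirred
    with a multiplicative step. Illustrative only; not the algorithm
    of the paper."""
    x = seed & 0xFFFFFFFF
    out = []
    for _ in range(n):
        b = [(x >> shift) & 0xFF for shift in (0, 8, 16, 24)]
        x = (b[2] << 24) | (b[0] << 16) | (b[3] << 8) | b[1]  # permute quarters
        x = (x * 2654435761 + 1) & 0xFFFFFFFF                 # stir
        out.append(x)
    return out
```

On an 8-bit machine the byte permutation is cheap register moves, which is the appeal of quarter-mixing schemes for such hardware.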

  20. Hardware design of a microcomputer controlled diagnostic vacuum controller

    International Nuclear Information System (INIS)

    Marsala, R.J.

    1983-01-01

    The TFTR diagnostic vacuum controller (DVC) has been designed and built to control and monitor the pumps, valves and gauges which comprise a diagnostic vacuum system. The DVC is a microcomputer based self-contained controller with battery backup which may be controlled manually from front panel controls or remotely via CICADA. The DVC implements all pump and valve sequencing and provides protection against incorrect operation. There are presently two versions of the DVC operating on TFTR and a third version being used on the S-1 machine

  1. Microcomputer simulation of nuclear magnetic resonance imaging contrasts

    International Nuclear Information System (INIS)

    Le Bihan, D.

    1985-01-01

    The high information content of magnetic resonance images is due to the multiplicity of its parameters. However, this advantage introduces a difficulty in the interpretation of the contrast: an image is strongly modified according to the visualised parameters. The author proposes a micro-computer simulation program. After recalling the main intrinsic and extrinsic parameters, he shows how the program works and its interest as a pedagogic tool and as an aid for contrast optimisation of images as a function of the suspected pathology [fr

  2. A microcomputer program for coupled cycle burnup calculations

    International Nuclear Information System (INIS)

    Driscoll, M.J.; Downar, T.J.; Taylor, E.L.

    1986-01-01

    A program, designated BRACC (Burnup, Reactivity, And Cycle Coupling), has been developed for fuel management scoping calculations, and coded in the BASIC language in an interactive format for use with microcomputers. BRACC estimates batch and cycle burnups for sequential reloads for a variety of initial core conditions, and permits the user to specify either reload batch properties (enrichment, burnable poison reactivity) or the target cycle burnup. Most important fuel management tactics (out-in or low-leakage loading, coastdown, variation in number of assemblies charged) can be simulated
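
BRACC's internal correlations are not given in the abstract; the linear reactivity model commonly used for this kind of fuel management scoping estimate (a hedged sketch of the standard approximation, not BRACC's actual code) relates cycle and discharge burnup to the number of reload batches:

```python
def equilibrium_burnups(rho0, A, n_batches):
    """Equilibrium cycle and discharge burnup under the linear reactivity
    model rho(B) = rho0 - A*B with an n-batch core (standard scoping
    approximation; BRACC's actual correlations are not reproduced here).

    At end of an equilibrium cycle, batch i (i = 1..n) has burnup i*Bc,
    and the core-average reactivity reaches zero: rho0 = A*Bc*(n+1)/2.
    """
    Bc = 2.0 * rho0 / (A * (n_batches + 1))   # cycle burnup
    Bd = n_batches * Bc                        # discharge (batch) burnup
    return Bc, Bd
```

For example, with an initial reactivity of 0.25 depleting at 0.01 per GWd/t, a three-batch core gives a cycle burnup of 12.5 GWd/t and a discharge burnup of 37.5 GWd/t.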

  3. Reactor core protection system using a 4-channel microcomputer

    International Nuclear Information System (INIS)

    Mertens, U.

    1982-12-01

    A four-channel microcomputer system was fitted in Grafenrheinfeld NPP for local core protection. This system performs continuous on-line monitoring of peak power density, departure from nucleate boiling ratio and fuel duty. The system implements limitation functions with more sophisticated criteria and improved accuracy. The Grafenrheinfeld system points the way to the employment of computer-based limitation systems, particularly in the fields of programming language, demarcation of tasks, commissioning and documentation aids, streamlining of qualification and structuring of the system. (orig.) [de

  4. Continuous thickness control of extruded pipes with assistance of microcomputers

    International Nuclear Information System (INIS)

    Breil, J.

    1983-06-01

    For economic and quality-assurance reasons, a constant wall thickness of extruded pipes, both around the circumference and along the extrusion direction, is an important production aim. Therefore a microcomputer-controlled system was developed, which controls die centering with electric motors. The control of wall thickness distribution was realized with two concepts: a dead-time-subjected control with a rotating on-line wall thickness measuring instrument, and an adaptive control with sensors in the pipe die. With a PI algorithm, eccentricities of 30% of the wall thickness could be controlled to below a trigger level of 2% within three dead times. (orig.) [de

  5. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control is difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi–Sugeno (TS) model. Of these two, the Takagi–Sugeno model has attracted the most attention. The fuzzy controller's application is limited to static processes due to its feedforward structure. However, most real-time processes are dynamic and require the history of input/output data. In order to store the past values a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
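
As an illustration of the Takagi–Sugeno idea mentioned in the abstract, a single-input first-order TS model computes a membership-weighted average of linear rule consequents. This is a generic sketch of TS inference (the rule shapes and parameters are illustrative assumptions, not the controller of the cited study):

```python
import math

def ts_inference(x, rules):
    """First-order Takagi-Sugeno inference for one input.

    Each rule is (centre, width, a, b): a Gaussian membership
    mu = exp(-((x - centre)/width)**2) with linear consequent y = a*x + b.
    The crisp output is the membership-weighted average of consequents.
    """
    weights = [math.exp(-((x - c) / w) ** 2) for c, w, _, _ in rules]
    outputs = [a * x + b for _, _, a, b in rules]
    return sum(wt * y for wt, y in zip(weights, outputs)) / sum(weights)
```

With a "low" rule centred at x = 0 (consequent 0.5x + 1) and a "high" rule at x = 10 (consequent 2x - 3), the output smoothly interpolates between the two local linear models as x moves across the input range.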

  6. A study on the modeling techniques using LS-INGRID

    Energy Technology Data Exchange (ETDEWEB)

    Ku, J. H.; Park, S. W

    2001-03-01

    For the development of radioactive material transport packages, the verification of structural safety of a package against the free drop impact accident should be carried out. The use of LS-DYNA, a code specially developed for impact analysis, is essential for impact analysis of the package. LS-INGRID is a pre-processor for LS-DYNA with considerable capability to deal with complex geometries, and it allows for parametric modeling. LS-INGRID is most effective in combination with the LS-DYNA code. Although the usage of LS-INGRID seems difficult relative to many commercial mesh generators, the productivity of users performing parametric modeling tasks with LS-INGRID can be much higher in some cases. This report presents basic explanations of the structure and commands of LS-INGRID, together with basic and advanced modelling examples, for use in the impact analysis of various packages. By studying the basic examples presented in this report, from modelling through loading and constraint conditions, new users can build complex models easily.

  7. Computational modelling of the HyperVapotron cooling technique

    Energy Technology Data Exchange (ETDEWEB)

    Milnes, Joseph, E-mail: Joe.Milnes@ccfe.ac.uk [Euratom/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon, OX14 3DB (United Kingdom); Burns, Alan [School of Process Material and Environmental Engineering, CFD Centre, University of Leeds, Leeds, LS2 9JT (United Kingdom); ANSYS UK, Milton Park, Oxfordshire (United Kingdom); Drikakis, Dimitris [Department of Engineering Physics, Cranfield University, Cranfield, MK43 0AL (United Kingdom)

    2012-09-15

    Highlights: ► The heat transfer mechanisms within a HyperVapotron are examined. ► A multiphase CFD model is developed. ► Modelling choices for turbulence and wall boiling are evaluated. ► Considerable improvements in accuracy are found compared to standard boiling models. ► The model should enable significant virtual prototyping to be performed. - Abstract: Efficient heat transfer technologies are essential for magnetically confined fusion reactors; this applies to both the current generation of experimental reactors and future power plants. A number of High Heat Flux devices have therefore been developed specifically for this application. One of the most promising candidates is the HyperVapotron, a water-cooled device which relies on internal fins and boiling heat transfer to maximise the heat transfer capability. Over the past 30 years, numerous variations of the HyperVapotron have been built and tested at fusion research centres around the globe, resulting in devices that can now sustain heat fluxes in the region of 20-30 MW/m² in steady state. Until recently, there had been few attempts to model or understand the internal heat transfer mechanisms responsible for this exceptional performance, with the result that design improvements have traditionally been sought experimentally, which is both inefficient and costly. This paper presents the successful attempt to develop an engineering model of the HyperVapotron device through customisation of commercial Computational Fluid Dynamics software. To establish the most appropriate modelling choices, in-depth studies were performed examining the turbulence models (within the Reynolds Averaged Navier Stokes framework), near-wall methods, grid resolution and boiling submodels. Comparing the CFD solutions with HyperVapotron experimental data suggests that a RANS-based, multiphase

  8. Universal microcomputer ABM 80 for data processing and measurement devices control

    International Nuclear Information System (INIS)

    Jagiello, S.; Kozminski, A.; Plominski, M.; Rzymkowski, K.

    1983-01-01

    The main features of the universal microcomputer ABM-80 with microprocessor INTEL 8080 are described. ABM-80 works with the measurer AZAR and the register ERD-102 or with other equipment with similar input/output parameters. Monitor program is an integral part of the microcomputer. (author)

  9. Multiple-User Microcomputer Technology and Its Application to the Library Environment.

    Science.gov (United States)

    McCarthy, Cathleen D.

    1987-01-01

    Demonstrates the ways in which multiuser and multitasking microcomputer systems can be used for the automation of small- to medium-sized library operations. The possibilities afforded by the IBM-PC AT microcomputer are discussed and a sample configuration with estimated cost projections is provided. (EM)

  10. Territories typification technique with use of statistical models

    Science.gov (United States)

    Galkin, V. I.; Rastegaev, A. V.; Seredin, V. V.; Andrianov, A. V.

    2018-05-01

    Territories typification is required for the solution of many problems. The results of geological zoning received by means of various methods do not always agree. That is why the main goal of this research is to develop a technique for obtaining a multidimensional standard classified indicator for geological zoning. In the course of the research, the probabilistic approach was used. In order to increase the reliability of geological information classification, the authors suggest using the complex multidimensional probabilistic indicator P_K as a criterion of the classification. The second criterion chosen is the multidimensional standard classified indicator Z. These can serve as characteristics of classification in geological-engineering zoning. The above-mentioned indicators P_K and Z are in good correlation. Correlation coefficient values for the entire territory, regardless of structural solidity, equal r = 0.95, so each indicator can be used in geological-engineering zoning. The method suggested has been tested and a schematic map of zoning has been drawn.

  11. Nuclear-fuel-cycle optimization: methods and modelling techniques

    International Nuclear Information System (INIS)

    Silvennoinen, P.

    1982-01-01

    This book presents methods applicable to analyzing fuel-cycle logistics and optimization as well as to evaluating the economics of different reactor strategies. After an introduction to the phases of a fuel cycle, uranium cost trends are assessed in a global perspective. Subsequent chapters deal with the fuel-cycle problems faced by a power utility. The fuel-cycle models cover the entire cycle from the supply of uranium to the disposition of spent fuel. The chapter headings are: Nuclear Fuel Cycle, Uranium Supply and Demand, Basic Model of the LWR (light water reactor) Fuel Cycle, Resolution of Uncertainties, Assessment of Proliferation Risks, Multigoal Optimization, Generalized Fuel-Cycle Models, Reactor Strategy Calculations, and Interface with Energy Strategies. 47 references, 34 figures, 25 tables

  12. Simplified Model Surgery Technique for Segmental Maxillary Surgeries

    Directory of Open Access Journals (Sweden)

    Namit Nagar

    2011-01-01

    Full Text Available Model surgery is the dental cast version of cephalometric prediction of surgical results. Patients having vertical maxillary excess with prognathism invariably require Le Fort I osteotomy with maxillary segmentation and maxillary first premolar extractions during surgery. Traditionally, model surgeries in these cases have been done by sawing the model through the first premolar interproximal area and removing that segment. This clinical innovation employed the use of X-ray film strips as separators in the maxillary first premolar interproximal area. The method advocated is a time-saving procedure in which no special clinical or laboratory tools, such as a plaster saw (with its accompanying plaster dust), are required, and reusable separators are made from old and discarded X-ray films.

  13. Coronary artery wall imaging in mice using osmium tetroxide and micro-computed tomography (micro-CT)

    International Nuclear Information System (INIS)

    Pai, Vinay M.; Kozlowski, Megan; Donahue, Danielle; Miller, Elishiah; Xiao, Xianghui; Chen, Marcus Y.; Yu, Zu-Xi; Connelly, Patricia; Jeffries, Kenneth; Wen, Han

    2012-01-01

    The high spatial resolution of micro-computed tomography (micro-CT) is ideal for 3D imaging of coronary arteries in intact mouse heart specimens. Previously, micro-CT of mouse heart specimens utilized intravascular contrast agents that hardened within the vessel lumen and allowed a vascular cast to be made. However, for mouse coronary artery disease models, it is highly desirable to image coronary artery walls and highlight plaques. For this purpose, we describe an ex vivo contrast-enhanced micro-CT imaging technique based on tissue staining with osmium tetroxide (OsO4) solution. As a tissue-staining contrast agent, OsO4 is retained in the vessel wall and surrounding tissue during the fixation process and cleared from the vessel lumens. Its high X-ray attenuation makes the artery wall visible in CT. Additionally, since OsO4 preferentially binds to lipids, it highlights lipid deposition in the artery wall. We performed micro-CT of heart specimens of 5- to 25-week-old C57BL/6 wild-type mice and 5- to 13-week-old apolipoprotein E knockout (apoE-/-) mice at 10 μm resolution. The results show that walls of coronary arteries as small as 45 μm in diameter are visible using a table-top micro-CT scanner. Similar image clarity was achieved with 1/2000th the scan time using a synchrotron CT scanner. In 13-week-old apoE-/- mice, lipid-rich plaques are visible in the aorta. Our study shows that the combination of OsO4 and micro-CT permits the visualization of the coronary artery wall in intact mouse hearts.

  14. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  15. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    Energy Technology Data Exchange (ETDEWEB)

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  16. Determining Plutonium Mass in Spent Fuel with Nondestructive Assay Techniques -- Preliminary Modeling Results Emphasizing Integration among Techniques

    International Nuclear Information System (INIS)

    Tobin, S.J.; Fensin, M.L.; Ludewigt, B.A.; Menlove, H.O.; Quiter, B.J.; Sandoval, N.P.; Swinhoe, M.T.; Thompson, S.J.

    2009-01-01

    There are a variety of motivations for quantifying Pu in spent (used) fuel assemblies by means of nondestructive assay (NDA), including the following: strengthening the capability of the International Atomic Energy Agency to safeguard nuclear facilities, quantifying shipper/receiver difference, determining the input accountability value at reprocessing facilities, and providing quantitative input to burnup credit determination for repositories. For the purpose of determining the Pu mass in spent fuel assemblies, twelve NDA techniques were identified that provide information about the composition of an assembly. A key point motivating the present research path is the realization that none of these techniques, in isolation, is capable of both (1) quantifying the elemental Pu mass of an assembly and (2) detecting the diversion of a significant number of pins. As such, the focus of this work is determining how best to integrate 2 or 3 techniques into a system that can quantify elemental Pu, and assessing how well this system can detect material diversion. Furthermore, it is important economically to down-select among the various techniques before advancing to the experimental phase. In order to achieve this dual goal of integration and down-selection, a Monte Carlo library of PWR assemblies was created and is described in another paper at Global 2009 (Fensin et al.). The research presented here emphasizes integration among techniques. An overview of a five-year research plan starting in 2009 is given. Preliminary modeling results for the Monte Carlo assembly library are presented for 3 NDA techniques: Delayed Neutrons, Differential Die-Away, and Nuclear Resonance Fluorescence. As part of the focus on integration, the concept of 'Pu isotopic correlation' is discussed, as is the role of cooling time determination.

  17. Parameter estimation in stochastic mammogram model by heuristic optimization techniques.

    NARCIS (Netherlands)

    Selvan, S.E.; Xavier, C.C.; Karssemeijer, N.; Sequeira, J.; Cherian, R.A.; Dhala, B.Y.

    2006-01-01

    The appearance of disproportionately large amounts of high-density breast parenchyma in mammograms has been found to be a strong indicator of the risk of developing breast cancer. Hence, the breast density model is popular for risk estimation or for monitoring breast density change in prevention or

  18. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  19. Teaching Behavioral Modeling and Simulation Techniques for Power Electronics Courses

    Science.gov (United States)

    Abramovitz, A.

    2011-01-01

    This paper suggests a pedagogical approach to teaching the subject of behavioral modeling of switch-mode power electronics systems through simulation by general-purpose electronic circuit simulators. The methodology is oriented toward electrical engineering (EE) students at the undergraduate level, enrolled in courses such as "Power…

  20. Biliary System Architecture: Experimental Models and Visualization Techniques

    Czech Academy of Sciences Publication Activity Database

    Sarnová, Lenka; Gregor, Martin

    2017-01-01

    Roč. 66, č. 3 (2017), s. 383-390 ISSN 0862-8408 R&D Projects: GA MŠk(CZ) LQ1604; GA ČR GA15-23858S Institutional support: RVO:68378050 Keywords : Biliary system * Mouse model * Cholestasis * Visualisation * Morphology Subject RIV: EB - Genetics ; Molecular Biology OBOR OECD: Cell biology Impact factor: 1.461, year: 2016

  1. Testing Model with "Check Technique" for Physics Education

    Science.gov (United States)

    Demir, Cihat

    2016-01-01

    As the number, date and form of the written tests are structured and teacher-oriented, it is considered that it creates fear and anxiety among the students. It has been found necessary and important to form a testing model which will keep the students away from the test anxiety and allows them to learn only about the lesson. For this study,…

  2. Data assimilation techniques and modelling uncertainty in geosciences

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available "You cannot step into the same river twice". Perhaps this ancient quote is the best phrase to describe the dynamic nature of the earth system. If we regard the earth as several mixed systems, we want to know the state of the system at any time. The state could be time-evolving and complex (such as the atmosphere) or simple, and finding the current state requires complete knowledge of all aspects of the system. On one hand, measurements (in situ and satellite data) are often incomplete and affected by errors. On the other hand, the modelling cannot be exact; therefore, the optimal combination of the measurements with the model information is the best choice to estimate the true state of the system. Data assimilation (DA) methods are powerful tools to combine observations and a numerical model. In fact, DA is an interaction between uncertainty analysis, physical modelling and mathematical algorithms. DA improves knowledge of the past, present or future system states. DA provides a forecast of the state of complex systems and a better scientific understanding of calibration, validation, data errors and their probability distributions. Nowadays, the high performance and capabilities of DA have led to its extensive use in different sciences such as meteorology, oceanography, hydrology and nuclear cores. In this paper, after a brief overview of the history of DA and a comparison with conventional statistical methods, the accuracy and computational efficiency of two main classical DA algorithms, stochastic DA (BLUE and the Kalman filter) and variational DA (3D- and 4D-Var), are investigated, and the quantification and modelling of the errors are evaluated. Finally, some DA applications in geosciences and the challenges facing DA are discussed.
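
In the scalar case, the BLUE/Kalman analysis step mentioned in the abstract reduces to an inverse-variance-weighted combination of the model background and the observation. A minimal sketch of that update (generic textbook form, not the paper's implementation):

```python
def blue_update(xb, var_b, y, var_o):
    """BLUE / scalar Kalman analysis step: optimally combine a background
    (model) estimate xb with an observation y, weighting each by the
    inverse of its error variance. Returns the analysis state and its
    (reduced) error variance."""
    K = var_b / (var_b + var_o)          # Kalman gain
    xa = xb + K * (y - xb)               # analysis state
    var_a = (1.0 - K) * var_b            # analysis error variance
    return xa, var_a
```

With equal background and observation error variances the analysis falls midway between model and data, and its variance is half that of either source, which is the essential benefit of assimilation.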

  3. The impact of applying product-modelling techniques in configurator projects

    DEFF Research Database (Denmark)

    Hvam, Lars; Kristjansdottir, Katrin; Shafiee, Sara

    2018-01-01

    This paper aims to increase understanding of the impact of using product-modelling techniques to structure and formalise knowledge in configurator projects. Companies that provide customised products increasingly apply configurators in support of sales and design activities, reaping benefits...... that include shorter lead times, improved quality of specifications and products, and lower overall product costs. The design and implementation of configurators are a challenging task that calls for scientifically based modelling techniques to support the formal representation of configurator knowledge. Even...... the phenomenon model and information model are considered visually, (2) non-UML-based modelling techniques, in which only the phenomenon model is considered and (3) non-formal modelling techniques. This study analyses the impact to companies from increased availability of product knowledge and improved control...

  4. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    Science.gov (United States)

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model, with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 Kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132-column printer. A graphics adapter and color display are optional. © 1991.
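
The analytic approach can be illustrated with the closed-form moments of the triangular distribution and the complete-independence aggregation case, where means add and variances add. This is a sketch of the principle only, not the TRIAGG code (which also handles fractiles and correlated cases):

```python
import math

def triangular_stats(a, m, b):
    """Mean and standard deviation of a triangular distribution with
    minimum a, mode m, and maximum b (closed-form moments)."""
    mean = (a + m + b) / 3.0
    var = (a*a + m*m + b*b - a*m - a*b - m*b) / 18.0
    return mean, math.sqrt(var)

def aggregate_independent(components):
    """Analytic aggregation of independent components (each a (min, mode,
    max) triple): the aggregate mean is the sum of means, and the
    aggregate variance is the sum of variances."""
    means, sds = zip(*(triangular_stats(a, m, b) for a, m, b in components))
    return sum(means), math.sqrt(sum(s * s for s in sds))
```

Under perfect positive correlation the standard deviations would instead add directly, which brackets the independent case from above.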

  5. Ionospheric scintillation forecasting model based on NN-PSO technique

    Science.gov (United States)

    Sridhar, M.; Venkata Ratnam, D.; Padma Raju, K.; Sai Praharsha, D.; Saathvika, K.

    2017-09-01

    The forecasting and modeling of ionospheric scintillation effects are crucial for precise satellite positioning and navigation applications. In this paper, a Neural Network model, trained using the Particle Swarm Optimization (PSO) algorithm, has been implemented for the prediction of amplitude scintillation index (S4) observations. The Global Positioning System (GPS) and Ionosonde data available at Darwin, Australia (12.4634° S, 130.8456° E) during 2013 have been considered. A correlation analysis between GPS S4 and Ionosonde drift velocities (hmf2 and fof2) data has been conducted for forecasting the S4 values. The results indicate that forecasted S4 values closely follow the measured S4 values for both quiet and disturbed conditions. The outcome of this work will be useful for understanding ionospheric scintillation phenomena over low-latitude regions.
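
A generic particle swarm optimiser of the kind used to tune such network weights can be sketched as follows (illustrative only; the inertia/acceleration parameters, bounds, and objective are assumptions, not those of the cited study):

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100, seed=1):
    """Minimal particle swarm optimiser: each particle's velocity is pulled
    toward its personal best and the swarm's global best, with inertia.
    Returns the best position found and its objective value."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                       # inertia, cognitive, social
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:                   # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:                  # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

For network training, `f` would be the prediction error of the network as a function of its flattened weight vector; here any smooth objective serves to demonstrate the search.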

  6. Techniques for studies of unbinned model independent CP violation

    Energy Technology Data Exchange (ETDEWEB)

    Bedford, Nicholas; Weisser, Constantin; Parkes, Chris; Gersabeck, Marco; Brodzicka, Jolanta; Chen, Shanzhen [University of Manchester (United Kingdom)

    2016-07-01

    Charge-Parity (CP) violation is a known part of the Standard Model and has been observed and measured in both the B and K meson systems. The observed levels, however, are insufficient to explain the observed matter-antimatter asymmetry in the Universe, and so other sources need to be found. One area of current investigation is the D meson system, where predicted levels of CP violation are much lower than in the B and K meson systems. This means that more sensitive methods are required when searching for CP violation in this system. Several unbinned model independent methods have been proposed for this purpose, all of which need to be optimised and their sensitivities compared.

  7. SUPER CAVIAR: Memory mapping the general-purpose microcomputer

    International Nuclear Information System (INIS)

    Cittolin, S.; Taylor, B.G.

    1981-01-01

    Over the past 3 years, CAVIAR (CAMAC Video Autonomous Read-out) microcomputers have been applied in growing numbers at CERN and related institutes. As typical user programs expanded in size, and the incorporated firmware libraries were enlarged also, the microprocessor addressing limit of 64 Kbytes became a serious constraint. An enhanced microcomputer, SUPER CAVIAR, has now been created by the incorporation of memory mapping to expand the physical address space to 344 Kbytes. The new facility provides independent firmware and RAM maps, dynamic allocation of common RAM, automatic inter-page transfer modes, and a RAM/EPROM overlay. A memory-based file system has been implemented, and control and data can be interchanged between separate programs in different RAM maps. 84 Kbytes of EPROM are incorporated on the mapper card itself, as well as an ADLC serial data link. In addition to providing more space for consolidated user programs and data, SUPER CAVIAR has allowed the introduction of several improvements to the BAMBI interpreter and extensions to the CAVIAR libraries. A context editor and enhanced debug monitor have been added, as well as new data types and extended array-handling and graphics routines, including isoline plotting, line-fitting and FFT operations. A SUPER CAVIAR converter has been developed which allows a standard CAVIAR to be upgraded to incorporate the new facilities without loss of the existing investment. (orig.)

  8. A titration model for evaluating calcium hydroxide removal techniques

    Directory of Open Access Journals (Sweden)

    Mark PHILLIPS

    2015-02-01

    Full Text Available Objective Calcium hydroxide (Ca(OH)2) has been used in endodontics as an intracanal medicament due to its antimicrobial effects and its ability to inactivate bacterial endotoxin. The inability to totally remove this intracanal medicament from the root canal system, however, may interfere with the setting of eugenol-based sealers or inhibit bonding of resin to dentin, thus presenting clinical challenges with endodontic treatment. This study used a chemical titration method to measure residual Ca(OH)2 left after different endodontic irrigation methods. Material and Methods Eighty-six human canine roots were prepared for obturation. Thirty teeth were filled with known but different amounts of Ca(OH)2 for 7 days, which were dissolved out and titrated to quantitate the residual Ca(OH)2 recovered from each root to produce a standard curve. Forty-eight of the remaining teeth were filled with equal amounts of Ca(OH)2 followed by gross Ca(OH)2 removal using hand files and randomized treatment of either: (1) syringe irrigation; (2) syringe irrigation with use of an apical file; (3) syringe irrigation with added 30 s of passive ultrasonic irrigation (PUI); or (4) syringe irrigation with apical file and PUI (n=12/group). Residual Ca(OH)2 was dissolved with glycerin and titrated to measure residual Ca(OH)2 left in the root. Results No method completely removed all residual Ca(OH)2. The addition of 30 s PUI with or without apical file use removed Ca(OH)2 significantly better than irrigation alone. Conclusions This technique allowed quantification of residual Ca(OH)2. The use of PUI (with or without apical file) resulted in significantly lower Ca(OH)2 residue compared to irrigation alone.
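
The back-calculation behind such a titration can be illustrated assuming neutralisation with a strong monoprotic acid such as HCl, where two moles of acid neutralise one mole of Ca(OH)2 (the study's actual titrant and volumes are not specified in the abstract):

```python
def residual_caoh2_mg(titrant_molarity, titrant_ml):
    """Back-calculate residual Ca(OH)2 mass (mg) from an acid titration,
    assuming full neutralisation: Ca(OH)2 + 2 HCl -> CaCl2 + 2 H2O.
    Illustrative of the titration principle only."""
    MOLAR_MASS = 74.09                                # g/mol for Ca(OH)2
    moles_acid = titrant_molarity * titrant_ml / 1000.0
    return (moles_acid / 2.0) * MOLAR_MASS * 1000.0   # mg of Ca(OH)2
```

For example, 2.0 mL of 0.1 M acid consumed at the endpoint corresponds to about 7.4 mg of residual Ca(OH)2; the standard curve in the study maps such titration readings back to recovered mass.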

  9. Modern EMC analysis techniques II models and applications

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of modern real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, numerical investigations delve into printed circuit boards, monolithic microwave integrated circuits, radio frequency microelectro

  10. Detecting and visualizing internal 3D oleoresin in agarwood by means of micro-computed tomography

    International Nuclear Information System (INIS)

    Khairiah Yazid; Roslan Yahya; Mat Rosol Awang

    2012-01-01

    Detection and analysis of oleoresin is particularly significant since the commercial value of agarwood is related to the quantity of oleoresins present. Modern non-destructive techniques can reach the interior region of the wood. Currently, tomographic image data in particular is most commonly visualized in three dimensions using volume rendering. The aim of this paper is to explore the potential of a high-resolution non-destructive 3D visualization technique, X-ray micro-computed tomography, as an imaging tool to visualize the micro-structure of oleoresin in agarwood. Investigations involving a desktop X-ray micro-tomography system on a high-grade agarwood sample, performed at the Centre of Tomography in Nuclear Malaysia, demonstrate the applicability of the method. Prior to the experiments, a reference test was conducted to simulate the attenuation of oleoresin in agarwood. Based on the experimental results, micro-CT imaging with a voxel size of 7.0 μm is capable of detecting oleoresin and pores in agarwood. This imaging technique, although sophisticated, can be used for standards development, especially in the grading of agarwood for commercial activities. (author)

  11. Application of nonlinear reduction techniques in chemical process modeling: a review

    International Nuclear Information System (INIS)

    Muhaimin, Z; Aziz, N.; Abd Shukor, S.R.

    2006-01-01

    Model reduction techniques have been used widely in engineering fields for electrical, mechanical as well as chemical engineering. The basic idea of a reduction technique is to replace the original system by an approximating system with a much smaller state-space dimension. A reduced order model is more beneficial to process and industrial fields in terms of control purposes. This paper provides a review of the application of nonlinear reduction techniques in chemical processes. The advantages and disadvantages of each technique reviewed are also highlighted.

  12. Nuclear fuel cycle optimization - methods and modelling techniques

    International Nuclear Information System (INIS)

    Silvennoinen, P.

    1982-01-01

    This book is aimed at presenting methods applicable in the analysis of fuel cycle logistics and optimization as well as in evaluating the economics of different reactor strategies. After a succinct introduction to the phases of a fuel cycle, uranium cost trends are assessed in a global perspective and subsequent chapters deal with the fuel cycle problems faced by a power utility. A fundamental material flow model is introduced first in the context of light water reactor fuel cycles. Besides the minimum cost criterion, the text also deals with other objectives providing for a treatment of cost uncertainties and of the risk of proliferation of nuclear weapons. Methods to assess mixed reactor strategies, comprising also other reactor types than the light water reactor, are confined to cost minimization. In the final Chapter, the integration of nuclear capacity within a generating system is examined. (author)

  13. Application of nonlinear forecasting techniques for meteorological modeling

    Directory of Open Access Journals (Sweden)

    V. Pérez-Muñuzuri

    2000-10-01

    Full Text Available A nonlinear forecasting method was used to predict the behavior of a cloud coverage time series several hours in advance. The method is based on the reconstruction of a chaotic strange attractor using four years of cloud absorption data obtained from half-hourly Meteosat infrared images from Northwestern Spain. An exhaustive nonlinear analysis of the time series was carried out to reconstruct the phase space of the underlying chaotic attractor. The forecast values are used by a non-hydrostatic meteorological model ARPS for daily weather prediction and their results compared with surface temperature measurements from a meteorological station and a vertical sounding. The effect of noise in the time series is analyzed in terms of the prediction results. Key words: Meteorology and atmospheric dynamics (mesoscale meteorology; general) – General (new fields)
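
The phase-space reconstruction described in this abstract can be sketched with delay-coordinate embedding plus analog (nearest-neighbour) forecasting. The embedding dimension, delay, neighbour count, and toy signal below are illustrative assumptions, not the authors' settings for the Meteosat data.

```python
import math

def embed(series, dim, tau):
    """Delay-coordinate embedding: map a scalar series into dim-dimensional
    state vectors x_t = (s_t, s_{t-tau}, ..., s_{t-(dim-1)*tau})."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

def forecast(series, dim=3, tau=1, k=3, horizon=1):
    """Analog forecasting: predict series[t + horizon] by averaging the
    futures of the k nearest neighbours of the current reconstructed state."""
    states = embed(series, dim, tau)
    current = states[-1]
    offset = (dim - 1) * tau            # states[i] corresponds to series[offset + i]
    candidates = len(states) - horizon  # neighbours must have a known future
    nearest = sorted(range(candidates),
                     key=lambda i: math.dist(states[i], current))[:k]
    return sum(series[offset + i + horizon] for i in nearest) / k

# toy example: a clean periodic "cloud absorption" signal
series = [math.sin(0.3 * t) for t in range(200)]
pred = forecast(series, dim=3, tau=2, k=4, horizon=1)
```

With a clean periodic signal the averaged futures of nearby states track the true continuation closely; on real, noisy cloud-cover data the neighbour count and embedding parameters would need tuning, as the abstract's noise analysis suggests.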

  14. Application of nonlinear forecasting techniques for meteorological modeling

    Directory of Open Access Journals (Sweden)

    V. Pérez-Muñuzuri

    Full Text Available A nonlinear forecasting method was used to predict the behavior of a cloud coverage time series several hours in advance. The method is based on the reconstruction of a chaotic strange attractor using four years of cloud absorption data obtained from half-hourly Meteosat infrared images from Northwestern Spain. An exhaustive nonlinear analysis of the time series was carried out to reconstruct the phase space of the underlying chaotic attractor. The forecast values are used by a non-hydrostatic meteorological model ARPS for daily weather prediction and their results compared with surface temperature measurements from a meteorological station and a vertical sounding. The effect of noise in the time series is analyzed in terms of the prediction results.

    Key words: Meteorology and atmospheric dynamics (mesoscale meteorology; general) – General (new fields)

  15. Fuel element transfer cask modelling using MCNP technique

    International Nuclear Information System (INIS)

    Rosli Darmawan

    2009-01-01

    Full text: After operating for more than 25 years, some of the Reaktor TRIGA PUSPATI (RTP) fuel elements would have been depleted. A few fuel addition and reconfiguration exercises have to be conducted in order to maintain RTP capacity. Presently, RTP spent fuels are stored at the storage area inside the RTP tank. The need to transfer fuel elements out of the RTP tank may become pressing in the near future, so preparation should start now. A fuel element transfer cask has been designed according to the recommendations of the fuel manufacturer and the experience of other countries. Modelling using the MCNP code has been conducted to analyse the design. The result shows that the fuel element transfer cask design is safe for handling outside the RTP tank according to current regulatory requirements. (author)

  16. Fuel Element Transfer Cask Modelling Using MCNP Technique

    International Nuclear Information System (INIS)

    Darmawan, Rosli; Topah, Budiman Naim

    2010-01-01

    After operating for more than 25 years, some of the Reaktor TRIGA Puspati (RTP) fuel elements would have been depleted. A few fuel addition and reconfiguration exercises have to be conducted in order to maintain RTP capacity. Presently, RTP spent fuels are stored at the storage area inside the RTP tank. The need to transfer fuel elements out of the RTP tank may become pressing in the near future, so preparation should start now. A fuel element transfer cask has been designed according to the recommendations of the fuel manufacturer and the experience of other countries. Modelling using the MCNP code has been conducted to analyse the design. The result shows that the fuel element transfer cask design is safe for handling outside the RTP tank according to current regulatory requirements.

  17. Advancing botnet modeling techniques for military and security simulations

    Science.gov (United States)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially more dangerous than any other type of malware. A botnet's power derives from several capabilities including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, a bot army can be extremely large: it can comprise tens of thousands, if not millions, of compromised computers, or it can be as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
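
A minimal sketch of the biologically inspired compartment approach mentioned above, recast for botnet propagation. The compartment interpretations and rate constants are hypothetical, and the jump-diffusion coupling the authors propose is omitted.

```python
def mseir_step(state, params, dt):
    """One explicit-Euler step of an MSEIR compartment model recast for
    bot armies.  M: protected hosts, S: susceptible hosts, E: compromised
    but dormant, I: actively spreading bots, R: cleaned/patched hosts."""
    M, S, E, I, R = state
    delta, beta, eps, gamma = params  # protection loss, contact, activation, cleanup rates
    N = M + S + E + I + R
    dM = -delta * M
    dS = delta * M - beta * S * I / N
    dE = beta * S * I / N - eps * E
    dI = eps * E - gamma * I
    dR = gamma * I
    return tuple(x + dt * d for x, d in zip(state, (dM, dS, dE, dI, dR)))

# hypothetical population: 1000 protected hosts, 8000 susceptible, 10 active bots
state = (1000.0, 8000.0, 0.0, 10.0, 0.0)
params = (0.01, 0.5, 0.2, 0.05)
for _ in range(1000):              # integrate to t = 100
    state = mseir_step(state, params, dt=0.1)
```

The derivatives sum to zero, so the total host count is conserved; with a contact rate well above the cleanup rate, the dormant and active compartments surge and the cleaned compartment grows steadily, the qualitative behaviour a botnet simulation would want to reproduce.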

  18. Multivariate moment closure techniques for stochastic kinetic models

    International Nuclear Information System (INIS)

    Lakatos, Eszter; Ale, Angelique; Kirk, Paul D. W.; Stumpf, Michael P. H.

    2015-01-01

    Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive, and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, an interplay arises between the nonlinearities and the stochastic dynamics, which is much harder to capture correctly by such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closures and illustrate their use in the context of two models that have proved challenging to previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinase signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.
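
As a concrete illustration of moment closure (a textbook example, not the p53/Hes1 systems studied in the paper), consider the dimerisation reaction 2X → ∅: the equation for the second moment involves the third, so the hierarchy never closes on its own, and a Gaussian closure truncates it. The rate constant, time step, and initial condition below are arbitrary.

```python
def dimerisation_moments(m1, m2, c, dt, steps):
    """Euler-integrate the first two moments of X for the reaction
    2X -> 0 with propensity a(x) = c*x*(x-1)/2.  Exact moment equations:
        dm1/dt = -c*(m2 - m1)
        dm2/dt = -2c*(m3 - 2*m2 + m1)
    The third moment m3 is unknown, so we substitute the Gaussian
    closure m3 = 3*m2*m1 - 2*m1**3 to truncate the hierarchy."""
    for _ in range(steps):
        m3 = 3.0 * m2 * m1 - 2.0 * m1 ** 3      # Gaussian closure
        dm1 = -c * (m2 - m1)
        dm2 = -2.0 * c * (m3 - 2.0 * m2 + m1)
        m1, m2 = m1 + dt * dm1, m2 + dt * dm2
    return m1, m2

# 100 molecules initially, essentially deterministic start (tiny variance)
mean, second = dimerisation_moments(100.0, 100.0 ** 2 + 0.01, c=0.01, dt=0.001, steps=5000)
variance = second - mean ** 2
```

Because the propensity depends on x(x-1) rather than x alone, the mean equation already couples to the second moment, which is exactly why closures matter even for this single-species system.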

  19. Multivariate moment closure techniques for stochastic kinetic models

    Energy Technology Data Exchange (ETDEWEB)

    Lakatos, Eszter, E-mail: e.lakatos13@imperial.ac.uk; Ale, Angelique; Kirk, Paul D. W.; Stumpf, Michael P. H., E-mail: m.stumpf@imperial.ac.uk [Department of Life Sciences, Centre for Integrative Systems Biology and Bioinformatics, Imperial College London, London SW7 2AZ (United Kingdom)

    2015-09-07

    Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive, and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, an interplay arises between the nonlinearities and the stochastic dynamics, which is much harder to capture correctly by such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closures and illustrate their use in the context of two models that have proved challenging to previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinase signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.

  20. A Decision Model for Selection of Microcomputers and Operating Systems.

    Science.gov (United States)

    1984-06-01

    is resulting in application software (for microcomputers) being developed almost exclusively for the IBM PC and compatible systems. NAVDAC felt that...location can be independently accessed. RAM memory is also often called read/write memory, because new information can be written into and read from...when power is lost; this is also read/write memory. Bubble memory, however, has significantly slower access times than RAM or ROM and also is not preva

  1. Evaluation of data assimilation techniques for a mesoscale meteorological model and their effects on air quality model results

    Energy Technology Data Exchange (ETDEWEB)

    Amicarelli, A; Pelliccioni, A [ISPESL - Dipartimento Insediamenti Produttivi e Interazione con l' Ambiente, Via Fontana Candida, 1 00040 Monteporzio Catone (RM) Italy (Italy); Finardi, S; Silibello, C [ARIANET, via Gilino 9, 20128 Milano (Italy); Gariazzo, C

    2008-05-01

    Data assimilation techniques are methods to limit the growth of errors in a dynamical model by allowing observations distributed in space and time to force (nudge) model solutions. They have become common for meteorological model applications in recent years, especially to enhance weather forecasting and to support air-quality studies. In order to investigate the influence of different data assimilation techniques on the meteorological fields produced by the RAMS model, and to evaluate their effects on the ozone and PM10 concentrations predicted by the FARM model, several numerical experiments were conducted over the urban area of Rome, Italy, during a summer episode.
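
The nudging idea described above, relaxing the model state toward observations as it integrates forward, can be sketched for a scalar toy model. The tendency function, observation, and relaxation gain are illustrative assumptions, not RAMS settings.

```python
def nudge_step(x, tendency, obs, gain, dt):
    """One explicit-Euler step with Newtonian relaxation (nudging): the
    model tendency is augmented by a term pulling the state toward the
    observation with strength 'gain'."""
    return x + dt * (tendency(x) + gain * (obs - x))

# toy "model": relaxation toward a 20-degree background temperature,
# while a (fixed) observation of 25 degrees nudges the state upward
tendency = lambda x: -0.1 * (x - 20.0)
x = 10.0
for _ in range(500):
    x = nudge_step(x, tendency, obs=25.0, gain=0.5, dt=0.1)
```

The state settles between the model's own attractor (20) and the observation (25), weighted by the ratio of the nudging gain to the model's relaxation rate; choosing that gain is exactly the kind of trade-off the numerical experiments in these papers evaluate.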

  2. Evaluation of data assimilation techniques for a mesoscale meteorological model and their effects on air quality model results

    Science.gov (United States)

    Amicarelli, A.; Gariazzo, C.; Finardi, S.; Pelliccioni, A.; Silibello, C.

    2008-05-01

    Data assimilation techniques are methods to limit the growth of errors in a dynamical model by allowing observations distributed in space and time to force (nudge) model solutions. They have become common for meteorological model applications in recent years, especially to enhance weather forecasting and to support air-quality studies. In order to investigate the influence of different data assimilation techniques on the meteorological fields produced by the RAMS model, and to evaluate their effects on the ozone and PM10 concentrations predicted by the FARM model, several numerical experiments were conducted over the urban area of Rome, Italy, during a summer episode.

  3. Evaluation of data assimilation techniques for a mesoscale meteorological model and their effects on air quality model results

    International Nuclear Information System (INIS)

    Amicarelli, A; Pelliccioni, A; Finardi, S; Silibello, C; Gariazzo, C

    2008-01-01

    Data assimilation techniques are methods to limit the growth of errors in a dynamical model by allowing observations distributed in space and time to force (nudge) model solutions. They have become common for meteorological model applications in recent years, especially to enhance weather forecasting and to support air-quality studies. In order to investigate the influence of different data assimilation techniques on the meteorological fields produced by the RAMS model, and to evaluate their effects on the ozone and PM10 concentrations predicted by the FARM model, several numerical experiments were conducted over the urban area of Rome, Italy, during a summer episode.

  4. Development of Reservoir Characterization Techniques and Production Models for Exploiting Naturally Fractured Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Wiggins, Michael L.; Brown, Raymon L.; Civan, Frauk; Hughes, Richard G.

    2001-08-15

    Research continues on characterizing and modeling the behavior of naturally fractured reservoir systems. Work has progressed on developing techniques for estimating fracture properties from seismic and well log data, developing naturally fractured wellbore models, and developing a model to characterize the transfer of fluid from the matrix to the fracture system for use in the naturally fractured reservoir simulator.

  5. Reliability analysis of microcomputer boards and computer based systems important to safety of nuclear plants

    International Nuclear Information System (INIS)

    Shrikhande, S.V.; Patil, V.K.; Ganesh, G.; Biswas, B.; Patil, R.K.

    2010-01-01

    Computer Based Systems (CBS) are employed in Indian nuclear plants for protection, control and monitoring purpose. For forthcoming CBS, Reactor Control Division has designed and developed a new standardized family of microcomputer boards qualified to stringent requirements of nuclear industry. These boards form the basic building blocks of CBS. Reliability analysis of these boards is being carried out using analysis package based on MIL-STD-217Plus methodology. The estimated failure rate values of these standardized microcomputer boards will be useful for reliability assessment of these systems. The paper presents reliability analysis of microcomputer boards and case study of a CBS system built using these boards. (author)

  6. Data flow methods for dynamic system simulation - A CSSL-IV microcomputer network interface

    Science.gov (United States)

    Makoui, A.; Karplus, W. J.

    1983-01-01

    A major problem in employing networks of microcomputers for the real-time simulation of complex systems is to allocate computational tasks to the various microcomputers in such a way that idle time and time lost in interprocess communication are minimized. The research reported in this paper is directed to the development of a software interface between a higher-level simulation language and a network of microcomputers. A CSSL-IV source program is translated to a data flow graph. This graph is then analyzed automatically so as to allocate computing tasks to the various processors.
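
One way to sketch the allocation step, assigning data-flow graph tasks to processors while honouring precedence, is greedy list scheduling. The graph, costs, and tie-breaking below are illustrative; the CSSL-IV translator's actual algorithm is not described in the abstract.

```python
def list_schedule(tasks, deps, cost, workers):
    """Greedy list scheduling: repeatedly take a ready task (all data-flow
    predecessors finished) and place it on the earliest-free processor,
    starting no earlier than its inputs are available."""
    preds = {t: [] for t in tasks}
    succ = {t: [] for t in tasks}
    for a, b in deps:                      # edge a -> b: b consumes a's output
        preds[b].append(a)
        succ[a].append(b)
    pending = {t: len(preds[t]) for t in tasks}
    ready = [t for t in tasks if pending[t] == 0]
    free_at = [0.0] * workers
    finish = {}
    schedule = []
    while ready:
        t = ready.pop(0)
        w = min(range(workers), key=lambda i: free_at[i])
        start = max([free_at[w]] + [finish[p] for p in preds[t]])
        finish[t] = start + cost[t]
        free_at[w] = finish[t]
        schedule.append((t, w, start))
        for s in succ[t]:
            pending[s] -= 1
            if pending[s] == 0:
                ready.append(s)
    return schedule, max(finish.values())

# diamond-shaped data flow graph: A feeds B and C, which both feed D
tasks = ["A", "B", "C", "D"]
deps = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
schedule, makespan = list_schedule(tasks, deps, {t: 1.0 for t in tasks}, workers=2)
```

On the diamond graph with two processors, B and C run in parallel after A, giving a makespan of 3 time units instead of the 4 a single processor would need; a real allocator would also charge for interprocess communication on cross-processor edges.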

  7. Microcomputer-based real-time optical signal processing system

    Science.gov (United States)

    Yu, F. T. S.; Cao, M. F.; Ludman, J. E.

    1986-01-01

    A microcomputer-based real-time programmable optical signal processing system utilizing a Magneto-Optic Spatial Light Modulator (MOSLM) and a Liquid Crystal Light Valve (LCLV) is described. This system can perform a myriad of complicated optical operations, such as image correlation, image subtraction, matrix multiplication and many others. The important assets of the proposed system are its programmability and its real-time addressing capability. The design specification and the progress toward practical implementation of this proposed system are discussed. Some preliminary experimental demonstrations are conducted. The feasible applications of this proposed system to image correlation for optical pattern recognition, image subtraction for IC chip inspection and matrix multiplication for optical computing are demonstrated.

  8. Microcomputer-controlled ultrasonic data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, W.A. Jr.

    1978-11-01

    The large volume of ultrasonic data generated by computer-aided test procedures has necessitated the development of a mobile, high-speed data acquisition and storage system. This approach offers the decided advantage of on-site data collection and remote data processing. It also utilizes standard, commercially available ultrasonic instrumentation. This system is controlled by an Intel 8080A microprocessor. The MCS80-SDK microcomputer board was chosen, and magnetic tape is used as the storage medium. A detailed description is provided of both the hardware and software developed to interface the magnetic tape storage subsystem to Biomation 8100 and Biomation 805 waveform recorders. A boxcar integrator acquisition system is also described for use when signal averaging becomes necessary. Both assembly language and machine language listings are provided for the software.

  9. Page: a program for gamma spectra analysis in PC microcomputers

    International Nuclear Information System (INIS)

    Goncalves, M.A.; Yamaura, M.; Costa, G.J.C.; Carvalho, E.I. de; Matsuda, H.T.; Araujo, B.F. de.

    1991-04-01

    PAGE is a software package, written in BASIC language, to perform gamma spectra analysis. It was developed to be used in a high-purity intrinsic germanium detector-multichannel analyser-PC microcomputer system. The analysis program of PAGE package accomplishes functions as follows: peak location; gamma nuclides identification; activity determination. Standard nuclides sources were used to calibrate the system. To perform the efficiency x energy calibration a logarithmic fit was applied. Analysis of nuclides with overlapping peaks is allowed by PAGE program. PAGE has additional auxiliary programs for: building and list of isotopic nuclear data libraries; data acquisition from multichannel analyser; spectrum display with automatic area and FWHM determinations. This software is to be applied in analytical process control where time response is a very important parameter. PAGE takes ca. 1.5 minutes to analyse a complex spectrum from a 4096 channels MCA. (author)
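
A peak-location pass like the one PAGE performs can be sketched as a local-maximum search with a Poisson significance test. The window width, threshold, and synthetic spectrum below are assumptions for illustration, not PAGE's actual algorithm.

```python
def find_peaks(counts, window=3, sigma=3.0):
    """Locate candidate photopeaks: a channel must be the local maximum of
    its neighbourhood and exceed the local background estimate by 'sigma'
    Poisson standard deviations (std ~ sqrt(background))."""
    peaks = []
    for ch in range(window, len(counts) - window):
        region = counts[ch - window : ch + window + 1]
        if counts[ch] != max(region):
            continue
        # crude background: average of the window edges
        background = (counts[ch - window] + counts[ch + window]) / 2.0
        if counts[ch] - background > sigma * max(background, 1.0) ** 0.5:
            peaks.append(ch)
    return peaks

# synthetic 100-channel spectrum: flat background plus two photopeaks
spectrum = [50] * 100
for centre, height in ((30, 400), (70, 250)):
    for k in range(-2, 3):
        spectrum[centre + k] += height // (1 + abs(k))

peaks = find_peaks(spectrum)  # -> [30, 70]
```

A production analyser would follow peak location with centroid and FWHM fits and an energy calibration, as the abstract describes; this sketch stops at finding statistically significant local maxima.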

  10. Microcomputer-controlled thermoluminescent analyser IJS MR-200

    International Nuclear Information System (INIS)

    Mihelic, M.; Miklavzic, U.; Rupnik, Z.; Satalic, P.; Spreizer, F.; Zerovnik, I.

    1985-01-01

    Performances and concept of the multipurpose, microcomputer-controlled thermoluminescent analyser, designed for laboratory work with TL dosemeters as well as for routine dose readings in the range from ecological to accident doses, are described. The main features of the analyser are: time-linear sampling, digitalisation, storing, and subsequent displaying on the monitor time scale of the glow and temperature curves of the TL material; digital stabilization, control and diagnostics of the analog unit; ability to store 7 different 8-parametric heating programs; ability to store 15 evaluation programs defined by 2 or 4 parameters and 3 different algorithms (altogether 5 types of evaluations). The analyser has several features intended for routine work: 9 function keys and possibilities of file forming on cassette or disc, of dose calculation and averaging, of printing reports with names, and possibility of additional programming in Basic. (author)

  11. Storage and analysis of radioisotope scan data using a microcomputer

    Energy Technology Data Exchange (ETDEWEB)

    Crawshaw, I P; Diffey, B L [Dryburn Hospital, Durham (UK)

    1981-08-01

    A data storage system has been created for recording clinical radioisotope scan data on a microcomputer system, located and readily available for use in an imaging department. The input of patient data from the request cards and the results sheets is straightforward as menus and code numbers are used throughout a logical sequence of steps in the program. The questions fall into four categories; patient information, referring centre information, diagnosis and symptoms and results of the investigation. The main advantage of the analysis program is its flexibility in that it follows the same format as the input program and any combination of criteria required for analysis may be selected. The menus may readily be altered and the programs adapted for use in other hospital departments.

  12. Storage and analysis of radioisotope scan data using a microcomputer

    International Nuclear Information System (INIS)

    Crawshaw, I.P.; Diffey, B.L.

    1981-01-01

    A data storage system has been created for recording clinical radioisotope scan data on a microcomputer system, located and readily available for use in an imaging department. The input of patient data from the request cards and the results sheets is straightforward as menus and code numbers are used throughout a logical sequence of steps in the program. The questions fall into four categories; patient information, referring centre information, diagnosis and symptoms and results of the investigation. The main advantage of the analysis program is its flexibility in that it follows the same format as the input program and any combination of criteria required for analysis may be selected. The menus may readily be altered and the programs adapted for use in other hospital departments. (U.K.)

  13. A microcomputer-based daily living activity recording system.

    Science.gov (United States)

    Matsuoka, Shingo; Yonezawa, Yoshiharu; Maki, Hiromichi; Ogawa, Hidekuni; Hahn, Allen W; Thayer, Julian F; Caldwell, W Morton

    2003-01-01

    A new daily living activity recording system has been developed for monitoring health conditions and living patterns, such as respiration, posture, activity/rest ratios and general activity level. The system employs a piezoelectric sensor, a dual axis accelerometer, two low-power active filters, a low-power 8-bit single chip microcomputer and a 128 MB compact flash memory. The piezoelectric sensor, whose electrical polarization voltage is produced by mechanical strain, detects body movements. Its high-frequency output components reflect body movements produced by walking and running activities, while the low frequency components are mainly respiratory. The dual axis accelerometer detects, from body X and Y tilt angles, whether the patient is standing, sitting or lying down (prone, supine, left side or right side). The detected respiratory, behavior and posture signals are stored by the compact flash memory. After recording, these data are downloaded to a desktop computer and analyzed.
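
The posture logic described, deriving standing/sitting versus lying (prone, supine, left side, right side) from X and Y tilt angles, might look like the following. The threshold and sign conventions are assumptions for illustration, not taken from the paper.

```python
def classify_posture(ax_deg, ay_deg, threshold=45.0):
    """Classify posture from dual-axis tilt angles (degrees): upright when
    both tilts are small, otherwise lying, with the dominant axis and its
    sign selecting prone/supine/left/right.  Sign conventions assumed:
    +ax tilts right, +ay tilts forward."""
    if abs(ax_deg) < threshold and abs(ay_deg) < threshold:
        return "upright"        # standing or sitting
    if abs(ax_deg) >= abs(ay_deg):
        return "lying on right side" if ax_deg > 0 else "lying on left side"
    return "prone" if ay_deg > 0 else "supine"

posture = classify_posture(5.0, -70.0)  # strong backward tilt -> "supine"
```

Distinguishing standing from sitting would need the accelerometer's activity signal (or the piezoelectric channel) on top of the tilt angles, which is presumably why the recorder logs both.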

  14. The Medical Gopher — A Microcomputer Based Physician Work Station

    Science.gov (United States)

    McDonald, Clement J.

    1984-01-01

    We've developed a microcomputer medical work station intended to reduce the physician's “gopher” work of fetching, reviewing, organizing and writing that consumes his day. The system requires extensive physician interaction; so we have developed a fast and consistent menu-oriented user interface. It provides facilities for entering prescriptions, orders, problems and other medical record information and for generating flowsheets, executing reminder rules, providing ad hoc retrievals and reporting facts about drugs, tests and differential diagnoses. Each work station is connected to a central server (currently a VAX 117/80) in a network configuration, but carries all of its own programs, tables and medical records for a few hundred patients, locally. This system is tested but not yet tried. Questions remain about physician's acceptance and the true usefullness of such a work station.

  15. Microcomputer-based workforce scheduling for hospital porters.

    Science.gov (United States)

    Lin, C K

    1999-01-01

    This paper focuses on labour scheduling for hospital porters who are the major workforce providing routine cleansing of wards, transportation and messenger services. Generating an equitable monthly roster for porters while meeting the daily minimum demand is a tedious task scheduled manually by a supervisor. In considering a variety of constraints and goals, a manual schedule was usually produced in seven to ten days. To be in line with the strategic goal of scientific management of an acute care regional hospital in Hong Kong, a microcomputer-based algorithm was developed to schedule the monthly roster. The algorithm, coded in Digital Visual Fortran 5.0 Professional, could generate a monthly roster in seconds. Implementation has been carried out since September 1998 and the results proved to be useful to hospital administrators and porters. This paper discusses both the technical and human issues involved during the computerization process.
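
A greedy sketch of the equitable-roster idea: each day, put on duty the required number of porters with the fewest accumulated shifts, subject to a cap on consecutive working days. The team size, daily demand, and limits are hypothetical, and the paper's Fortran algorithm likely differs.

```python
def build_roster(porters, days, demand, max_consecutive=6):
    """Greedy equitable roster: each day assign the 'demand' porters with
    the fewest accumulated shifts, skipping anyone who has already worked
    'max_consecutive' days in a row.  Ties broken by name for determinism."""
    shifts = {p: 0 for p in porters}
    streak = {p: 0 for p in porters}
    roster = []
    for _ in range(days):
        eligible = [p for p in porters if streak[p] < max_consecutive]
        on_duty = sorted(eligible, key=lambda p: (shifts[p], p))[:demand]
        for p in porters:
            if p in on_duty:
                shifts[p] += 1
                streak[p] += 1
            else:
                streak[p] = 0
        roster.append(on_duty)
    return roster, shifts

# hypothetical ward: 10 porters, 6 needed per day, one-month roster
roster, shifts = build_roster([f"P{i}" for i in range(10)], days=30, demand=6)
```

Even this naive rule spreads the 180 monthly shifts almost perfectly evenly; the real scheduler additionally juggles shift types, leave requests, and other soft goals, which is what made manual rostering take days.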

  16. Interpretive Reporting of Protein Electrophoresis Data by Microcomputer

    Science.gov (United States)

    Talamo, Thomas S.; Losos, Frank J.; Kessler, G. Frederick

    1982-01-01

    A microcomputer based system for interpretive reporting of protein electrophoretic data has been developed. Data for serum, urine and cerebrospinal fluid protein electrophoreses as well as immunoelectrophoresis can be entered. Patient demographic information is entered through the keyboard followed by manual entry of total and fractionated protein levels obtained after densitometer scanning of the electrophoretic strip. The patterns are then coded, interpreted, and final reports generated. In most cases interpretation time is less than one second. Misinterpretation by computer is uncommon and can be corrected by edit functions within the system. These discrepancies between computer and pathologist interpretation are automatically stored in a data file for later review and possible program modification. Any or all previous tests on a patient may be reviewed with graphic display of the electrophoretic pattern. The system has been in use for several months and is presently well accepted by both laboratory and clinical staff. It also allows rapid storage, retrieval and analysis of protein electrophoretic data.

  17. Fault tolerant microcomputer based alarm annunciator for Dhruva reactor

    International Nuclear Information System (INIS)

    Chandra, A.K.

    1988-01-01

    The Dhruva alarm annunciator displays the status of 624 alarm points on an array of display windows using the standard ringback sequence. Recognizing the need for very high availability, the system is implemented as a fault tolerant configuration. The annunciator is partitioned into three identical units; each unit is implemented using two microcomputers wired in a hot standby mode. In the event of one computer malfunctioning, the standby computer takes over control in a bouncefree transfer. The use of microprocessors has helped build flexibility into the system. The system also provides built-in capability to resolve the sequence of occurrence of events and conveys this information to another system for display on a CRT. This report describes the system features, the fault tolerant organisation used and the hardware and software developed for the annunciation function. (author). 8 figs
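
The ringback sequence mentioned above can be modelled as a small state machine per window. This encoding (including the behaviour when an alarm clears before acknowledgement) is one plausible variant, not necessarily the Dhruva implementation.

```python
# One annunciator window in a ringback-style sequence:
#   NORMAL --alarm--> ALERT (fast flash + audible)
#   ALERT  --ack--->  ACKED (steady on)
#   ACKED  --clear->  RINGBACK (slow flash) --reset--> NORMAL
NORMAL, ALERT, ACKED, RINGBACK = "normal", "alert", "acked", "ringback"

TRANSITIONS = {
    (NORMAL, "alarm"): ALERT,
    (ALERT, "ack"): ACKED,
    (ALERT, "clear"): RINGBACK,     # alarm cleared before acknowledgement
    (ACKED, "clear"): RINGBACK,
    (RINGBACK, "alarm"): ACKED,     # alarm returns while ringing back
    (RINGBACK, "reset"): NORMAL,
}

def step(state, event):
    """Advance one window; events invalid in the current state are ignored."""
    return TRANSITIONS.get((state, event), state)

trace = []
state = NORMAL
for event in ("alarm", "ack", "clear", "reset"):
    state = step(state, event)
    trace.append(state)
```

Driving 624 such windows from a transition table keeps the annunciation logic data-driven, which is one way microprocessors add the flexibility the report mentions.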

  18. Microcomputer network for technological equipment monitoring and control

    International Nuclear Information System (INIS)

    Segec, O.

    1990-01-01

    The properties and purpose are characterized of a microcomputer network developed for monitoring and controlling the nuclear power plant chemistry. In the development, emphasis was put on simplicity of the components, reliability, ease of operation and availability of the components on the domestic market. So far, these criteria are only met by the DIAMO L(S) system equipped with an MH 8080 (Z80) processor. Its assets include simplicity and ruggedness, owing to which it is well suited to heavy-duty performance, whereas its drawbacks comprise a narrow extent of addressable memory and absence of any supporting software. Until now, 5 types of automated stations have been developed and submitted for test operation at the Bohunice V-2 nuclear power plant. Virtually any personal computer can be attached to the network. The system can also be installed in conventional power plants as well as beyond the power generation field. (Z.M.)

  19. Microcomputers take over in off-shore process

    Energy Technology Data Exchange (ETDEWEB)

    Claricoates, J

    1978-10-01

    The special capabilities and adaptability of microcomputers are typically demonstrated by their use on Mobil Exploration, Norway, Inc.'s Statfjord A Condeep platform. One of the unique features of offshore oil production is that so many different operations are conducted in a very confined space. On Statfjord A drilling, production, storage of crude oil, gas compression for sale or reinjection, custody transfer, generation and distribution of a significant amount of electric power, a hotel facility, and flight operations all have to be accommodated and controlled. Superimposed on these are comprehensive fire detection and shutdown systems. In such a highly complex, densely packed, and highly interactive environment, a supervisory control and data acquisition (SCADA) system, by providing coordinated and easily read displays, better records for alarm analysis, and possibly suggesting corrective action when a fault does occur, is an essential aid to an operator, whose decisions must always be correct.

  20. Management of quality assurance in diagnostic radiology by microcomputer

    International Nuclear Information System (INIS)

    Evans, S.H.

    1985-01-01

    Software has been written for the calculation, interpretation and presentation of quality control measurements of X-ray machines. The programs run on a Z80 based microcomputer system comprising a Comart Communicator CP500 interfaced to a Volker Craig 4404 visual display unit (VDU) and an EpsonMX80F/T printer. The software has been written in the dbaseII database language (Ashton Tate 1975) and it runs under the CP/M operating system. The programs guide the user through each routine required and can be operated without knowledge of computers. The programs calculate the results from the raw data obtained from the test equipment and these results are then stored and analysed. (U.K.)

  1. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  2. Assessing sequential data assimilation techniques for integrating GRACE data into a hydrological model

    KAUST Repository

    Khaki, M.; Hoteit, Ibrahim; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A.; Schumacher, M.; Pattiaratchi, C.

    2017-01-01

    The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques.

  3. Limitations of using micro-computed tomography to predict bone-implant contact and mechanical fixation.

    Science.gov (United States)

    Liu, S; Broucek, J; Virdi, A S; Sumner, D R

    2012-01-01

    Fixation of metallic implants to bone through osseointegration is important in orthopaedics and dentistry. Model systems for studying this phenomenon would benefit from a non-destructive imaging modality so that mechanical and morphological endpoints can more readily be examined in the same specimens. The purpose of this study was to assess the utility of an automated microcomputed tomography (μCT) program for predicting bone-implant contact (BIC) and mechanical fixation strength in a rat model. Femurs in which 1.5-mm-diameter titanium implants had been in place for 4 weeks were either embedded in polymethylmethacrylate (PMMA) for preparation of 1-mm-thick cross-sectional slabs (16 femurs: 32 slabs) or were used for mechanical implant pull-out testing (n = 18 femurs). All samples were scanned by μCT at 70 kVp with 16 μm voxels and assessed with the manufacturer's software for 'osseointegration volume per total volume' (OV/TV). OV/TV measures bone volume per total volume (BV/TV) in a 3-voxel-thick ring that by default excludes the 3 voxels immediately adjacent to the implant to avoid metal-induced artefacts. The plastic-embedded samples were also analysed by backscatter scanning electron microscopy (bSEM) to provide a direct comparison of OV/TV with a well-accepted technique for BIC. In μCT images in which the implant was directly embedded within PMMA, there was a zone of elevated attenuation (>50% of the attenuation value used to segment bone from marrow) which extended 48 μm away from the implant surface. Comparison of the bSEM and μCT images showed high correlations for BV/TV measurements in areas not affected by metal-induced artefacts. In addition, for bSEM images, we found that there were high correlations between peri-implant BV/TV within 12 μm of the implant surface and BIC (correlation coefficients ≥0.8), and weaker correlations with implant pull-out strength (r = 0.401, p = 0.049) and energy to failure (r = 0.435, p = 0.035). Thus, the need for the 48-μm-thick exclusion

  4. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This paper explores and compares different modeling and analysis techniques, and then examines the model order reduction approach and its significance. The traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. Bond Graphs are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a technique for generating a Genetic design from the tree-structured transfer function obtained from a Bond Graph. The work combines Bond Graphs for model representation with Genetic programming for exploring the design space; the tree-structured transfer function results from replacing each typical Bond Graph element with its impedance equivalent, specifying impedance laws for the Bond Graph multiport. The tree-structured form thus obtained from the Bond Graph is then used to generate the Genetic tree. Application studies identify key issues important for advancing this approach towards an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function obtained by the conventional and Bond Graph methods are analyzed, and an approach towards model order reduction is developed. The suggested algorithm and other known modern model order reduction techniques are applied, with different approaches, to an 11th order high pass filter [1]. The model order reduction technique developed in this paper has the least reduction errors, and the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and by the Bond Graph method are compared and

  5. Sabots, Obturator and Gas-In-Launch Tube Techniques for Heat Flux Models in Ballistic Ranges

    Science.gov (United States)

    Bogdanoff, David W.; Wilder, Michael C.

    2013-01-01

    For thermal protection system (heat shield) design for space vehicle entry into earth and other planetary atmospheres, it is essential to know the augmentation of the heat flux due to vehicle surface roughness. At the NASA Ames Hypervelocity Free Flight Aerodynamic Facility (HFFAF) ballistic range, a campaign of heat flux studies on rough models, using infrared camera techniques, has been initiated. Several phenomena can interfere with obtaining good heat flux data when using this measuring technique. These include leakage of the hot drive gas in the gun barrel through joints in the sabot (model carrier) to create spurious thermal imprints on the model forebody, deposition of sabot material on the model forebody, thereby changing the thermal properties of the model surface and unknown in-barrel heating of the model. This report presents developments in launch techniques to greatly reduce or eliminate these problems. The techniques include the use of obturator cups behind the launch package, enclosed versus open front sabot designs and the use of hydrogen gas in the launch tube. Attention also had to be paid to the problem of the obturator drafting behind the model and impacting the model. Of the techniques presented, the obturator cups and hydrogen in the launch tube were successful when properly implemented

  6. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    Directory of Open Access Journals (Sweden)

    L Potgieter

    2012-12-01

    Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E.saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified and according to which E.saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E.saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
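    The deterministic difference-equation structure described in the abstract can be illustrated with a minimal sketch. This is not the authors' E. saccharina model: the single life stage, logistic growth term, parameter values and the fixed release ratio below are all illustrative assumptions, intended only to show how sterile releases dilute fertile matings generation by generation.

```python
# Minimal sketch (illustrative assumptions, not the published model):
# one-stage difference-equation dynamics for a pest population under
# releases of sterile insects at a fixed release ratio.

def step(fertile, release_ratio=5.0, growth=2.0, capacity=1e4):
    """One generation: sterile releases dilute fertile matings."""
    sterile = release_ratio * fertile           # sterile insects released
    # Probability that a mating involves a fertile partner:
    p_fertile = fertile / (fertile + sterile)
    # Logistic-style growth applied to the fertile fraction of matings:
    next_pop = growth * fertile * p_fertile * (1.0 - fertile / capacity)
    return max(next_pop, 0.0)

pop = 1000.0
history = [pop]
for _ in range(20):
    pop = step(pop)
    history.append(pop)

print(history[-1])  # population is driven toward zero when the release
                    # ratio outweighs the intrinsic growth rate
```

    Varying `release_ratio` in such a sketch is the kind of experiment the abstract refers to when it mentions decision support on release ratios and frequencies.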

  7. Techniques to maximize software reliability in radiation fields

    International Nuclear Information System (INIS)

    Eichhorn, G.; Piercey, R.B.

    1986-01-01

    Microprocessor system failures due to memory corruption by single event upsets (SEUs) and/or latch-up in RAM or ROM memory are common in environments where there is high radiation flux. Traditional methods to harden microcomputer systems against SEUs and memory latch-up have usually involved expensive large scale hardware redundancy. Such systems offer higher reliability, but they tend to be more complex and non-standard. At the Space Astronomy Laboratory the authors have developed general programming techniques for producing software which is resistant to such memory failures. These techniques, which may be applied to standard off-the-shelf hardware as well as custom designs, include an implementation of the Maximally Redundant Software (MRS) model, error detection algorithms, and memory verification and management

  8. The Integrated Use of Enterprise and System Dynamics Modelling Techniques in Support of Business Decisions

    Directory of Open Access Journals (Sweden)

    K. Agyapong-Kodua

    2012-01-01

    Full Text Available Enterprise modelling techniques support business process (re)engineering by capturing existing processes and, based on perceived outputs, support the design of future process models capable of meeting enterprise requirements. System dynamics modelling tools, on the other hand, are used extensively for policy analysis and for modelling aspects of dynamics which impact on businesses. In this paper, the use of enterprise and system dynamics modelling techniques has been integrated to facilitate qualitative and quantitative reasoning about the structures and behaviours of processes and resource systems used by a Manufacturing Enterprise during the production of composite bearings. The case study testing reported has led to the specification of a new modelling methodology for analysing and managing dynamics and complexities in production systems. This methodology is based on a systematic transformation process, which synergises the use of a selection of public domain enterprise modelling, causal loop and continuous simulation modelling techniques. The success of the modelling process defined relies on the creation of useful CIMOSA process models which are then converted to causal loops. The causal loop models are then structured and translated to equivalent dynamic simulation models using the proprietary continuous simulation modelling tool iThink.

  9. Modelling of ground penetrating radar data in stratified media using the reflectivity technique

    International Nuclear Information System (INIS)

    Sena, Armando R; Sen, Mrinal K; Stoffa, Paul L

    2008-01-01

    Horizontally layered media are often encountered in shallow exploration geophysics. Ground penetrating radar (GPR) data in these environments can be modelled by techniques that are more efficient than finite difference (FD) or finite element (FE) schemes because the lateral homogeneity of the media allows us to reduce the dependence on the horizontal spatial variables through Fourier transforms on these coordinates. We adapt and implement the invariant embedding or reflectivity technique used to model elastic waves in layered media to model GPR data. The results obtained with the reflectivity and FDTD modelling techniques are in excellent agreement and the effects of the air–soil interface on the radiation pattern are correctly taken into account by the reflectivity technique. Comparison with real wide-angle GPR data shows that the reflectivity technique can satisfactorily reproduce the real GPR data. These results and the computationally efficient characteristics of the reflectivity technique (compared to FD or FE) demonstrate its usefulness in interpretation and possible model-based inversion schemes of GPR data in stratified media
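    A much-reduced illustration of the recursive ("invariant embedding") idea behind the reflectivity method is sketched below: normal-incidence plane waves on a stack of lossless dielectric layers, folded bottom-up into a single input impedance. The layer properties and frequency are assumptions for illustration; the paper's full method additionally handles oblique incidence, losses and the air–soil radiation pattern.

```python
import numpy as np

# Normal-incidence reflectivity of a layered dielectric stack,
# computed by the standard transmission-line recursion
# (illustrative sketch, not the full reflectivity method of the paper).

c0 = 3e8            # speed of light in vacuum (m/s)
eta0 = 376.73       # impedance of free space (ohm)

def input_impedance(freq, eps_r, thickness):
    """Fold layers bottom-up into one input impedance seen from above.

    eps_r: relative permittivities, top layer first, bottom half-space last.
    thickness: thicknesses of the finite layers (one fewer than eps_r).
    """
    omega = 2 * np.pi * freq
    Z = eta0 / np.sqrt(eps_r[-1])            # half-space below the stack
    for er, d in zip(reversed(eps_r[:-1]), reversed(thickness)):
        eta = eta0 / np.sqrt(er)             # intrinsic impedance of layer
        k = omega * np.sqrt(er) / c0         # wavenumber in layer
        t = 1j * np.tan(k * d)
        Z = eta * (Z + eta * t) / (eta + Z * t)
    return Z

# Air over 0.3 m of soil (eps_r = 9) over a wetter half-space (eps_r = 25):
Z = input_impedance(100e6, eps_r=[9.0, 25.0], thickness=[0.3])
r = (Z - eta0) / (Z + eta0)                  # reflection seen from the air
print(abs(r))
```

    The recursion never meshes the medium, which is the efficiency advantage over FD or FE schemes that the abstract emphasises.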

  10. The microcomputer workstation - An alternate hardware architecture for remotely sensed image analysis

    Science.gov (United States)

    Erickson, W. K.; Hofman, L. B.; Donovan, W. E.

    1984-01-01

    Difficulties regarding the digital image analysis of remotely sensed imagery can arise in connection with the extensive calculations required. In the past, an expensive large to medium mainframe computer system was needed for performing these calculations. For image-processing applications smaller minicomputer-based systems are now used by many organizations. The costs for such systems are still in the range from $100K to $300K. Recently, as a result of new developments, the use of low-cost microcomputers for image processing and display systems appeared to have become feasible. These developments are related to the advent of the 16-bit microprocessor and the concept of the microcomputer workstation. Earlier 8-bit microcomputer-based image processing systems are briefly examined, and a computer workstation architecture is discussed. Attention is given to a microcomputer workstation developed by Stanford University, and the design and implementation of a workstation network.

  11. Software in windows for staple compounding system of microcomputer nuclear mass scale

    International Nuclear Information System (INIS)

    Wang Yanting; Zhang Yongming; Wang Yu; Jin Dongping

    1998-01-01

    The software, written for Windows, for the staple compounding system of the microcomputer nuclear mass scale is described. The staple compounding system is briefly outlined, and the software structure and its implementation method are given

  12. A nuclear pulse amplitude acquisition system based on 80C31 single-chip microcomputer

    International Nuclear Information System (INIS)

    Zhao Xiuliang; Qu Guopu; Guo Lanying; Zhang Songbai

    1999-01-01

    A kind of multichannel nuclear pulse amplitude signal acquisition system is described, which is composed of pulse peak detector, integrated S/H circuit, A/D converter and 80C31 single-chip microcomputer

  13. The development of the time-keeping clock with TS-1 single chip microcomputer.

    Science.gov (United States)

    Zhou, Jiguang; Li, Yongan

    The authors have developed a time-keeping clock with Intel 8751 single chip microcomputer that has been successfully used in time-keeping station. The hard-soft ware design and performance of the clock are introduced.

  14. Bringing the Microcomputer into the Junior High: A Success Story from Florida.

    Science.gov (United States)

    Miller, Benjamin S.

    1982-01-01

    Describes the introduction of an Apple II microcomputer into Miami Lakes (Florida) Junior High School and its success in generating enthusiasm among teachers, students, parents, and the community. (Author/RW)

  15. Medical and administrative management of a nuclear medicine department with a microcomputer

    International Nuclear Information System (INIS)

    Legras, B.; Kohler, F.

    1984-01-01

    The use of a microcomputer for data management in a department of Nuclear Medicine has considerably reduced office work and supplies the physicians with very useful statistics on the investigations carried out [fr]

  16. Digital TV-echelle spectrograph for simultaneous multielemental analysis using microcomputer control

    International Nuclear Information System (INIS)

    Davidson, J.B.; Case, A.L.

    1980-12-01

    A digital TV-echelle spectrograph with microcomputer control was developed for simultaneous multielemental analysis. The optical system is a commercially available unit originally equipped for film and photomultiplier (single element) readout. The film port was adapted for the intensifier camera. The camera output is digitized and stored in a microcomputer-controlled, 512 x 512 x 12 bit memory and image processor. Multiple spectra over the range of 200 to 800 nm are recorded in a single exposure. Spectra lasting from nanoseconds to seconds are digitized and stored in 0.033 s and displayed on a TV monitor. An inexpensive microcomputer controls the exposure, reads and displays the intensity of predetermined spectral lines, and calculates wavelengths of unknown lines. The digital addresses of unknown lines are determined by superimposing a cursor on the TV display. The microcomputer also writes into memory wavelength fiducial marks for alignment of the TV camera

  17. Microcomputer Decisions for the 1990s [and] Apple's Macintosh: A Viable Choice.

    Science.gov (United States)

    Grosch, Audrey N.

    1989-01-01

    Discussion of the factors that should be considered when purchasing or upgrading a microcomputer focuses on the MS-DOS and OS/2 operating systems. Macintosh purchasing decisions are discussed in a sidebar. A glossary is provided. (CLB)

  18. Integrating Micro-computers with a Centralized DBMS: ORACLE, SEED AND INGRES

    Science.gov (United States)

    Hoerger, J.

    1984-01-01

    Users of ADABAS, a relational-like database management system with its database programming language NATURAL, are acquiring microcomputers in hopes of solving their individual word processing, office automation, decision support, and simple data processing problems. As processor speeds, memory sizes, and disk storage capacities increase, individual departments begin to maintain "their own" database on "their own" microcomputer. This situation can adversely affect several of the primary goals set for implementing a centralized DBMS. To avoid this potential problem, these microcomputers must be integrated with the centralized DBMS, and an easy-to-use, flexible means of transferring logical database files between the central database machine and the microcomputers must be provided. Some of the problems encountered in accomplishing this integration, and possible solutions, are discussed.

  19. Application of single-chip microcomputer to portable radon and radon daughters monitor

    International Nuclear Information System (INIS)

    Meng Yecheng; Huang Zhanyun; She Chengye

    1992-01-01

    The application of a single-chip microcomputer in a portable radon and radon daughters monitor is introduced in this paper. With the single-chip microcomputer, the process from sampling to measurement of radon and radon daughters is automated. The concentrations of radon and radon daughters can be shown directly when the conversion coefficients are preset before the measurement. Moreover, the principle and design are briefly discussed in the light of the characteristics of the monitor

  20. Application of microcomputer in automating microscope measurements in nuclear emulsion viewing

    International Nuclear Information System (INIS)

    Blaho, D.

    1985-01-01

    Microcomputer system MPS 8010 is described as applied to the automation of data collection and control of a microscope. The data on the measured coordinates of the microscope are recorded on paper tape and listed on a typewriter. The microcomputer system also makes possible automatic control of the microscope position by means of stepping motors according to the value read-out of the paper tape. (author)

  1. Direct microcomputer controlled determination of zinc in human serum by flow injection atomic absorption spectrometry

    DEFF Research Database (Denmark)

    Simonsen, Kirsten Wiese; Nielsen, Bent; Jensen, Arne

    1986-01-01

    A procedure is described for the direct determination of zinc in human serum by fully automated, microcomputer-controlled flow injection atomic absorption spectrometry (FI-AAS). The FI system is pumpless, using the negative pressure created by the nebuliser. It consists only of a three-way valve, programmable from the microcomputer, to control the sample volume. No pre-treatment of the samples is necessary. The limit of detection is 0.14 mg l–1, and only small amounts of serum (

  2. Modelling techniques for predicting the long term consequences of radiation on natural aquatic populations

    International Nuclear Information System (INIS)

    Wallis, I.G.

    1978-01-01

    The purpose of this working paper is to describe modelling techniques for predicting the long term consequences of radiation on natural aquatic populations. Ideally, it would be possible to use aquatic population models: (1) to predict changes in the health and well-being of all aquatic populations as a result of changing the composition, amount and location of radionuclide discharges; (2) to compare the effects of steady, fluctuating and accidental releases of radionuclides; and (3) to evaluate the combined impact of the discharge of radionuclides and other wastes, and natural environmental stresses, on aquatic populations. At the outset it should be stated that there is no existing model which can achieve this ideal performance. However, modelling skills and techniques are available to develop useful aquatic population models. This paper discusses the considerations involved in developing these models and briefly describes the various types of population models which have been developed to date

  3. A hybrid SEA/modal technique for modeling structural-acoustic interior noise in rotorcraft.

    Science.gov (United States)

    Jayachandran, V; Bonilha, M W

    2003-03-01

    This paper describes a hybrid technique that combines Statistical Energy Analysis (SEA) predictions for structural vibration with acoustic modal summation techniques to predict interior noise levels in rotorcraft. The method was applied for predicting the sound field inside a mock-up of the interior panel system of the Sikorsky S-92 helicopter. The vibration amplitudes of the frame and panel systems were predicted using a detailed SEA model and these were used as inputs to the model of the interior acoustic space. The spatial distribution of the vibration field on individual panels, and their coupling to the acoustic space were modeled using stochastic techniques. Leakage and nonresonant transmission components were accounted for using space-averaged values obtained from a SEA model of the complete structural-acoustic system. Since the cabin geometry was quite simple, the modeling of the interior acoustic space was performed using a standard modal summation technique. Sound pressure levels predicted by this approach at specific microphone locations were compared with measured data. Agreement within 3 dB in one-third octave bands above 40 Hz was observed. A large discrepancy in the one-third octave band in which the first acoustic mode is resonant (31.5 Hz) was observed. Reasons for such a discrepancy are discussed in the paper. The developed technique provides a method for modeling helicopter cabin interior noise in the frequency mid-range where neither FEA nor SEA is individually effective or accurate.

  4. Application of separable parameter space techniques to multi-tracer PET compartment modeling

    International Nuclear Information System (INIS)

    Zhang, Jeff L; Michael Morey, A; Kadrmas, Dan J

    2016-01-01

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg–Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models. (paper)
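    The separable-parameter-space idea the abstract describes can be sketched on a toy problem. The model below is a generic sum of two exponentials, not a specific PET compartment model: the amplitudes enter linearly, so for every trial pair of decay rates they are recovered by linear least squares, and only the (low-dimensional) nonlinear rates need an exhaustive search, mirroring the exhaustive-search fits mentioned in the abstract. All names and values are illustrative.

```python
import numpy as np

# Separable (variable projection) least squares on a toy two-exponential
# model y(t) = a1*exp(-k1 t) + a2*exp(-k2 t): solve for the linear
# amplitudes a in closed form, search only over the nonlinear rates k.

rng = np.random.default_rng(0)
t = np.linspace(0.1, 10, 50)
true_k = (0.3, 2.0)
true_a = (1.0, 3.0)
y = true_a[0]*np.exp(-true_k[0]*t) + true_a[1]*np.exp(-true_k[1]*t)
y += 0.001 * rng.standard_normal(t.size)          # small measurement noise

def residual(k1, k2):
    """Project out the linear amplitudes; return SSE and amplitudes."""
    A = np.column_stack([np.exp(-k1*t), np.exp(-k2*t)])
    a, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ a
    return r @ r, a

# Exhaustive search over the reduced (2-D) nonlinear parameter space:
grid = np.linspace(0.05, 3.0, 60)
sse, k1, k2 = min((residual(k1, k2)[0], k1, k2)
                  for k1 in grid for k2 in grid if k1 < k2)
print(k1, k2)   # close to the true decay rates 0.3 and 2.0
```

    Reducing the search to the nonlinear parameters alone is what makes the exhaustive search, with its guarantee of the global minimum to search precision, affordable.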

  5. Techniques to extract physical modes in model-independent analysis of rings

    International Nuclear Information System (INIS)

    Wang, C.-X.

    2004-01-01

    A basic goal of Model-Independent Analysis is to extract the physical modes underlying the beam histories collected at a large number of beam position monitors so that beam dynamics and machine properties can be deduced independent of specific machine models. Here we discuss techniques to achieve this goal, especially the Principal Component Analysis and the Independent Component Analysis.
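    One of the two techniques named, Principal Component Analysis, reduces to a singular value decomposition of the mean-centred history matrix. The sketch below applies it to synthetic beam position monitor (BPM) data; the tune, noise level and dimensions are illustrative assumptions, not taken from a real machine.

```python
import numpy as np

# PCA mode extraction from synthetic BPM turn-by-turn histories:
# 40 monitors, 200 turns of one betatron-like oscillation plus noise.

rng = np.random.default_rng(1)
n_turns, n_bpm = 200, 40
turn = np.arange(n_turns)
phase = np.linspace(0, 2*np.pi, n_bpm, endpoint=False)

tune = 0.31   # fractional betatron tune (illustrative)
signal = np.cos(2*np.pi*tune*turn[:, None] + phase[None, :])
data = signal + 0.05*rng.standard_normal((n_turns, n_bpm))

# PCA = SVD of the mean-centred history matrix; one oscillation mode
# shows up as a pair of dominant singular values (sine/cosine parts).
centred = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)

dominant_fraction = (s[0]**2 + s[1]**2) / (s**2).sum()
print(dominant_fraction)   # leading pair captures nearly all the variance
```

    Independent Component Analysis goes further by rotating such principal components into statistically independent modes, which helps when two physical modes have similar strength.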

  6. Using Game Theory Techniques and Concepts to Develop Proprietary Models for Use in Intelligent Games

    Science.gov (United States)

    Christopher, Timothy Van

    2011-01-01

    This work is about analyzing games as models of systems. The goal is to understand the techniques that have been used by game designers in the past, and to compare them to the study of mathematical game theory. Through the study of a system or concept a model often emerges that can effectively educate students about making intelligent decisions…

  7. Application of Soft Computing Techniques and Multiple Regression Models for CBR prediction of Soils

    Directory of Open Access Journals (Sweden)

    Fatimah Khaleel Ibrahim

    2017-08-01

    Full Text Available Soft computing techniques such as Artificial Neural Networks (ANN) have improved predictive capability and have found application in geotechnical engineering. The aim of this research is to utilize a soft computing technique and Multiple Regression Models (MLR) for forecasting the California Bearing Ratio (CBR) of soil from its index properties. The CBR of a soil can be predicted from various soil-characterizing parameters with the aid of MLR and ANN methods. The database was collected in the laboratory by conducting tests on 86 soil samples gathered from different projects in Basrah districts. Data gained from the experimental results were used in the regression models and in the soft computing technique using an artificial neural network. The liquid limit, plasticity index, modified compaction test and the CBR test were determined. In this work, different ANN and MLR models were formulated with different collections of inputs in order to recognize their significance in the prediction of CBR. The strengths of the models that were developed were examined in terms of regression coefficient (R2), relative error (RE%) and mean square error (MSE) values. From the results of this paper, it was noticed that all the proposed ANN models perform better than the MLR model. In particular, the ANN model with all input parameters reveals better outcomes than the other ANN models.
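    The MLR half of such a study fits a linear combination of index properties by ordinary least squares and reports R2 and MSE. The sketch below uses synthetic stand-in data, not the 86 Basrah samples, and the chosen predictors and coefficients are illustrative assumptions.

```python
import numpy as np

# Multiple linear regression for CBR from index properties
# (synthetic stand-in data; predictors and coefficients are illustrative).

rng = np.random.default_rng(42)
n = 86
LL  = rng.uniform(25, 60, n)       # liquid limit (%)
PI  = rng.uniform(5, 30, n)        # plasticity index (%)
MDD = rng.uniform(1.6, 2.1, n)     # maximum dry density (g/cm^3)
cbr = 20 - 0.15*LL - 0.25*PI + 8.0*MDD + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), LL, PI, MDD])   # design matrix + intercept
beta, *_ = np.linalg.lstsq(X, cbr, rcond=None)
pred = X @ beta

mse = np.mean((cbr - pred)**2)
r2 = 1 - np.sum((cbr - pred)**2) / np.sum((cbr - cbr.mean())**2)
print(round(r2, 3), round(mse, 3))
```

    An ANN replaces the fixed linear form `X @ beta` with a learned nonlinear mapping, which is why it can outperform MLR when the property–CBR relationship is not linear.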

  8. A fully blanketed early B star LTE model atmosphere using an opacity sampling technique

    International Nuclear Information System (INIS)

    Phillips, A.P.; Wright, S.L.

    1980-01-01

    A fully blanketed LTE model of a stellar atmosphere with T_e = 21914 K (θ_e = 0.23), log g = 4 is presented. The model includes an explicit representation of the opacity due to the strongest lines, and uses a statistical opacity sampling technique to represent the weaker line opacity. The sampling technique is subjected to several tests and the model is compared with an atmosphere calculated using the line-distribution function method. The limitations of the distribution function method and the particular opacity sampling method used here are discussed in the light of the results obtained. (author)

  9. Identification techniques for phenomenological models of hysteresis based on the conjugate gradient method

    International Nuclear Information System (INIS)

    Andrei, Petru; Oniciuc, Liviu; Stancu, Alexandru; Stoleriu, Laurentiu

    2007-01-01

    An identification technique for the parameters of phenomenological models of hysteresis is presented. The basic idea of the technique is to set up a system of equations for the parameters of the model as a function of known quantities on the major or minor hysteresis loops (e.g. coercive force, susceptibilities at various points, remanence), or other magnetization curves. This system of equations may be either over- or under-determined and is solved using the conjugate gradient method. Numerical results related to the identification of parameters in the Energetic, Jiles-Atherton, and Preisach models are presented
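    The numerical core of such an approach, solving an over-determined parameter system in the least-squares sense with conjugate gradients, can be sketched generically. The measurement matrix and "model parameters" below are illustrative placeholders, not a specific hysteresis model.

```python
import numpy as np

# Plain conjugate gradient solver applied to the normal equations of an
# over-determined linear parameter-identification system (illustrative).

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Solve the symmetric positive-definite system A x = b by CG."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(3)
M = rng.standard_normal((50, 4))     # 50 measurements vs. 4 parameters
true = np.array([2.0, -1.0, 0.5, 3.0])
y = M @ true + 1e-6*rng.standard_normal(50)

params = conjugate_gradient(M.T @ M, M.T @ y)
print(params)   # approximately [2.0, -1.0, 0.5, 3.0]
```

    In the hysteresis setting the equations are generally nonlinear in the parameters, so CG iterations of this kind would appear inside an outer linearization loop.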

  10. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    Science.gov (United States)

    Makahinda, T.

    2018-02-01

    The purpose of this research is to find out the effect of a technology-based learning model and assessment technique on thermodynamics learning achievement, controlling for students' intelligence. This is an experimental study. The sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that the thermodynamics achievement of students taught with the environmental-utilization learning model is higher than that of students taught with animated simulation, after controlling for student intelligence. There is an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, lectures should use the environmental learning model of thermodynamics together with the project assessment technique.

  11. Experimental investigation of the predictive capabilities of data driven modeling techniques in hydrology - Part 2: Application

    Directory of Open Access Journals (Sweden)

    A. Elshorbagy

    2010-10-01

    Full Text Available In this second part of the two-part paper, the data driven modeling (DDM) experiment, presented and explained in the first part, is implemented. Inputs for the five case studies (half-hourly actual evapotranspiration, daily peat soil moisture, daily till soil moisture, and two daily rainfall-runoff datasets) are identified, either based on previous studies or using the mutual information content. Twelve groups (realizations) were randomly generated from each dataset by randomly sampling without replacement from the original dataset. Neural networks (ANNs), genetic programming (GP), evolutionary polynomial regression (EPR), support vector machines (SVM), M5 model trees (M5), K-nearest neighbors (K-nn), and multiple linear regression (MLR) techniques are implemented and applied to each of the 12 realizations of each case study. The predictive accuracy and uncertainties of the various techniques are assessed using multiple average overall error measures, scatter plots, frequency distribution of model residuals, and the deterioration rate of prediction performance during the testing phase. The Gamma test is used as a guide to assist in selecting the appropriate modeling technique. Unlike the two nonlinear soil moisture case studies, the results of the experiment conducted in this research study show that ANNs were a sub-optimal choice for the actual evapotranspiration and the two rainfall-runoff case studies. GP is the most successful technique due to its ability to adapt the model complexity to the modeled data. EPR performance could be close to GP with datasets that are more linear than nonlinear. SVM is sensitive to the kernel choice and, if appropriately selected, the performance of SVM can improve. M5 performs very well with linear and semi-linear data, which cover a wide range of hydrological situations. In highly nonlinear case studies, ANNs, K-nn, and GP could be more successful than other modeling techniques. K-nn is also successful in linear situations, and it

  12. Experimental investigation of the predictive capabilities of data driven modeling techniques in hydrology - Part 2: Application

    Science.gov (United States)

    Elshorbagy, A.; Corzo, G.; Srinivasulu, S.; Solomatine, D. P.

    2010-10-01

    In this second part of the two-part paper, the data driven modeling (DDM) experiment, presented and explained in the first part, is implemented. Inputs for the five case studies (half-hourly actual evapotranspiration, daily peat soil moisture, daily till soil moisture, and two daily rainfall-runoff datasets) are identified, either based on previous studies or using the mutual information content. Twelve groups (realizations) were randomly generated from each dataset by randomly sampling without replacement from the original dataset. Neural networks (ANNs), genetic programming (GP), evolutionary polynomial regression (EPR), Support vector machines (SVM), M5 model trees (M5), K-nearest neighbors (K-nn), and multiple linear regression (MLR) techniques are implemented and applied to each of the 12 realizations of each case study. The predictive accuracy and uncertainties of the various techniques are assessed using multiple average overall error measures, scatter plots, frequency distribution of model residuals, and the deterioration rate of prediction performance during the testing phase. Gamma test is used as a guide to assist in selecting the appropriate modeling technique. Unlike two nonlinear soil moisture case studies, the results of the experiment conducted in this research study show that ANNs were a sub-optimal choice for the actual evapotranspiration and the two rainfall-runoff case studies. GP is the most successful technique due to its ability to adapt the model complexity to the modeled data. EPR performance could be close to GP with datasets that are more linear than nonlinear. SVM is sensitive to the kernel choice and if appropriately selected, the performance of SVM can improve. M5 performs very well with linear and semi linear data, which cover wide range of hydrological situations. In highly nonlinear case studies, ANNs, K-nn, and GP could be more successful than other modeling techniques. K-nn is also successful in linear situations, and it should
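
    The head-to-head flavor of the experiment can be illustrated with a toy comparison of K-nn against MLR on synthetic nonlinear data (stand-in data, not the paper's case studies or its evaluation protocol):

```python
import numpy as np

# Synthetic nonlinear record: 2 inputs, strongly nonlinear target.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (400, 2))
y = np.sin(6.0 * X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0.0, 0.05, 400)
Xtr, Xte, ytr, yte = X[:300], X[300:], y[:300], y[300:]

def knn_predict(Xtr, ytr, Xq, k=5):
    # distance of every query point to every training point
    d = np.linalg.norm(Xq[:, None, :] - Xtr[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    return ytr[idx].mean(axis=1)

# multiple linear regression via least squares with an intercept column
A = np.c_[np.ones(len(Xtr)), Xtr]
coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
mlr_pred = np.c_[np.ones(len(Xte)), Xte] @ coef

def rmse(pred):
    return float(np.sqrt(np.mean((pred - yte) ** 2)))

knn_rmse, mlr_rmse = rmse(knn_predict(Xtr, ytr, Xte)), rmse(mlr_pred)
print(f"K-nn RMSE {knn_rmse:.3f} vs MLR RMSE {mlr_rmse:.3f}")
```

    On data this nonlinear the local K-nn estimate beats the global linear fit, mirroring the paper's observation that nonlinear techniques win in highly nonlinear case studies.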

  13. Towards a Business Process Modeling Technique for Agile Development of Case Management Systems

    Directory of Open Access Journals (Sweden)

    Ilia Bider

    2017-12-01

    Full Text Available A modern organization needs to adapt its behavior to changes in the business environment by changing its Business Processes (BP) and the corresponding Business Process Support (BPS) systems. One way of achieving such adaptability is via separation of the system code from the process description/model by applying the concept of executable process models. Furthermore, to ease the introduction of changes, such a process model should separate different perspectives, for example the control-flow, human resources, and data perspectives, from each other. In addition, for developing a completely new process, it should be possible to start with a reduced process model to get a BPS system quickly running, and then continue to develop it in an agile manner. This article consists of two parts: the first sets requirements on modeling techniques that could be used in tools that support agile development of BPs and BPS systems. The second part suggests a business process modeling technique that allows modeling to start with the data/information perspective, which would be appropriate for processes supported by Case or Adaptive Case Management (CM/ACM) systems. In a model produced by this technique, called a data-centric business process model, a process instance/case is defined as a sequence of states in a specially designed instance database, while the process model is defined as a set of rules that set restrictions on allowed states and on transitions between them. The article details the background of the project of developing the data-centric process modeling technique, presents an outline of the structure of the model, and gives formal definitions for a substantial part of the model.
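
    A hedged sketch of the data-centric idea: a case is a sequence of states, and the process model is a set of rules restricting allowed states and transitions. The rule names and state fields below are invented for illustration, not taken from the article:

```python
# Illustrative rules: state rules constrain a single state; transition
# rules constrain which state may follow which.
state_rules = [
    lambda s: s["stage"] in {"received", "assessed", "decided", "closed"},
    lambda s: s["stage"] != "decided" or s.get("assessment") is not None,
]
transition_rules = [
    lambda old, new: (old["stage"], new["stage"]) in {
        ("received", "assessed"), ("assessed", "decided"), ("decided", "closed"),
    },
]

def apply_transition(case_history, new_state):
    """Append new_state to the case only if all state and transition rules hold."""
    current = case_history[-1]
    if not all(rule(new_state) for rule in state_rules):
        raise ValueError("state rule violated")
    if not all(rule(current, new_state) for rule in transition_rules):
        raise ValueError("transition rule violated")
    case_history.append(new_state)

case = [{"stage": "received", "assessment": None}]
apply_transition(case, {"stage": "assessed", "assessment": "low risk"})
apply_transition(case, {"stage": "decided", "assessment": "low risk"})
print([s["stage"] for s in case])  # ['received', 'assessed', 'decided']
```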

  14. Application of data assimilation technique for flow field simulation for Kaiga site using TAPM model

    International Nuclear Information System (INIS)

    Shrivastava, R.; Oza, R.B.; Puranik, V.D.; Hegde, M.N.; Kushwaha, H.S.

    2008-01-01

    Data assimilation techniques are becoming popular nowadays for obtaining a realistic flow field simulation for the site under consideration. The present paper describes a data assimilation technique for flow field simulation for the Kaiga site using the air pollution model (TAPM) developed by CSIRO, Australia. In this work, the TAPM model was run for the Kaiga site for a period of one month (Nov. 2004) using the analysed meteorological data supplied with the model for the Central Asian (CAS) region, and the model solutions were nudged with the observed wind speed and wind direction data available for the site. The model was run with 4 nested grids, with grid spacings of 30 km, 10 km, 3 km and 1 km, respectively. The model results generated with and without nudging are statistically compared with the observations. (author)
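
    The nudging step itself is essentially Newtonian relaxation of a model field toward the observation; a minimal sketch (illustrative relaxation time and wind values, not TAPM's implementation):

```python
def nudge(u_model, u_obs, tau=3600.0, dt=300.0, steps=48):
    """Relax a model value toward an observed value with time scale tau.
    Each step adds (dt/tau) * (obs - model), the Newtonian-relaxation term."""
    u = u_model
    for _ in range(steps):
        u += (dt / tau) * (u_obs - u)
    return u

# Model wind 8 m/s, observed 5 m/s: after 4 hours the nudged value
# has converged close to the observation.
u = nudge(u_model=8.0, u_obs=5.0)
print(round(u, 2))  # converges toward 5 m/s
```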

  15. Reliability modeling of digital component in plant protection system with various fault-tolerant techniques

    International Nuclear Information System (INIS)

    Kim, Bo Gyung; Kang, Hyun Gook; Kim, Hee Eun; Lee, Seung Jun; Seong, Poong Hyun

    2013-01-01

    Highlights: • Integrated fault coverage is introduced to reflect the characteristics of fault-tolerant techniques in the reliability model of the digital protection system in NPPs. • The integrated fault coverage considers the process of fault-tolerant techniques from detection through the fail-safe generation process. • With integrated fault coverage, the unavailability of a repairable component of the DPS can be estimated. • The newly developed reliability model can reveal the effects of fault-tolerant techniques explicitly for risk analysis. • The reliability model makes it possible to confirm changes in unavailability according to the variation of diverse factors. - Abstract: With the improvement of digital technologies, the digital protection system (DPS) incorporates multiple sophisticated fault-tolerant techniques (FTTs), in order to increase fault detection and to help the system safely perform the required functions in spite of the possible presence of faults. Fault detection coverage is a vital factor of an FTT's contribution to reliability. However, fault detection coverage alone is insufficient to reflect the effects of the various FTTs in a reliability model. To reflect the characteristics of FTTs in the reliability model, integrated fault coverage is introduced. The integrated fault coverage considers the process of an FTT from detection through the fail-safe generation process. A model has been developed to estimate the unavailability of a repairable component of the DPS using the integrated fault coverage. The newly developed model can quantify unavailability according to a diversity of conditions. Sensitivity studies are performed to ascertain the important variables which affect the integrated fault coverage and unavailability.
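
    As a hedged illustration of how a coverage factor enters a repairable component's unavailability, the textbook PSA approximation below (not the article's model) splits faults into those the fault-tolerant technique catches, which are repaired with mean time MTTR, and uncovered faults that stay latent until the periodic test with interval T:

```python
def unavailability(lam, c, mttr, T):
    """Textbook approximation: lam = failure rate (/h), c = fault coverage,
    mttr = mean repair time (h), T = surveillance test interval (h)."""
    detected = lam * c * mttr          # covered faults: detected and repaired quickly
    latent = lam * (1 - c) * T / 2.0   # uncovered faults: mean latency T/2
    return detected + latent

# Illustrative numbers only.
q = unavailability(lam=1e-5, c=0.9, mttr=8.0, T=720.0)
print(f"{q:.2e}")
```

    Raising the coverage shifts contribution from the large latent term to the small repair term, which is why coverage is such a sensitive variable.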

  16. Prototype particulate stack sampler with single-cut nozzle and microcomputer calculating/display system

    International Nuclear Information System (INIS)

    Eler, J.C.; Littlefield, L.G.; Tillery, M.I.

    1979-01-01

    A prototype particulate stack sampler (PPSS) has been developed to improve on the existing EPA Method 5 sampling apparatus. Its primary features are (1) a higher sampling rate (56 L/min); (2) display (on demand) of all required variables and calculated values by a microcomputer-based calculating and display system; (3) continuous stack gas moisture determination; (4) a virtual impactor nozzle with 3 μm mass median diameter cutpoint which collects fine and coarse particle fractions on separate glass fiber filters; (5) a variable-area inlet to maintain isokinetic sampling conditions; and (6) conversion to stainless steel components from the glass specified by EPA Method 5. The basic sampling techniques of EPA Method 5 have been retained; however, versatility in the form of optional in-stack filters and a general modernization of the stack sampler have been provided in the prototype design. Laboratory testing with monodisperse dye aerosols has shown the present variable-inlet, virtual impactor nozzle to have a collection efficiency of less than 77% and significant wall losses. This is primarily due to the lack of symmetry in this rectangular jet impactor and the short transition lengths dictated by physical design constraints (required passage of the nozzle through a 7.6 cm (3 in) diameter stack port). Electronic components have shown acceptable service in laboratory testing, although no field testing of the prototype under a broad range of temperature, humidity, and SO2 concentration has been undertaken.

  17. Biomaterial porosity determined by fractal dimensions, succolarity and lacunarity on microcomputed tomographic images

    International Nuclear Information System (INIS)

    N'Diaye, Mambaye; Degeratu, Cristinel; Bouler, Jean-Michel; Chappard, Daniel

    2013-01-01

    Porous structures are becoming more and more important in biology and material science because they help in reducing the density of the grafted material. For biomaterials, porosity also increases the accessibility of cells and vessels inside the grafted area. However, descriptors of porosity are scanty. We have used a series of biomaterials with different types of porosity (created by various porogens: fibers, beads …). Blocks were studied by microcomputed tomography for the measurement of 3D porosity. 2D sections were re-sliced to analyze the microarchitecture of the pores and were transferred to image analysis programs: star volumes, interconnectivity index, Minkowski–Bouligand and Kolmogorov fractal dimensions were determined. Lacunarity and succolarity, two recently described fractal dimensions, were also computed. These parameters provided a precise description of porosity and pores' characteristics. Non-linear relationships were found between several descriptors e.g. succolarity and star volume of the material. A linear correlation was found between lacunarity and succolarity. These techniques appear suitable in the study of biomaterials usable as bone substitutes. Highlights: ► Interconnected porosity is important in the development of bone substitutes. ► Porosity was evaluated by 2D and 3D morphometry on microCT images. ► Euclidean and fractal descriptors measure interconnectivity on 2D microCT images. ► Lacunarity and succolarity were evaluated on a series of porous biomaterials
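
    As an illustration of one of the descriptors, a minimal box-counting (Minkowski–Bouligand) dimension estimate on a binary 2D image (a generic sketch, not the authors' image-analysis pipeline):

```python
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16)):
    """Minkowski–Bouligand (box-counting) dimension of a binary 2D image."""
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        # count boxes of side s containing at least one foreground pixel
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    # slope of log N(s) versus log(1/s) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a completely filled square has dimension 2.
dim = box_count_dimension(np.ones((64, 64), dtype=bool))
print(round(dim, 2))  # 2.0
```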

  18. Micro-computed tomography characterization of tissue engineering scaffolds: effects of pixel size and rotation step.

    Science.gov (United States)

    Cengiz, Ibrahim Fatih; Oliveira, Joaquim Miguel; Reis, Rui L

    2017-08-01

    Quantitative assessment of the micro-structure of materials is of key importance in many fields including tissue engineering, biology, and dentistry. Micro-computed tomography (µ-CT) is an intensively used non-destructive technique. However, acquisition parameters such as pixel size and rotation step may have significant effects on the obtained results. In this study, a set of tissue engineering scaffolds including examples of natural and synthetic polymers, and ceramics, were analyzed. We comprehensively compared the quantitative results of µ-CT characterization using 15 acquisition scenarios that differ in the combination of pixel size and rotation step. The results showed that the acquisition parameters could statistically significantly affect the quantified mean porosity, mean pore size, and mean wall thickness of the scaffolds. The effects are also practically important, since the differences can be as high as 24% in the mean porosity on average, and as high as 19.5 h and 166 GB in the characterization time and data storage per sample with a relatively small volume. This study showed in a quantitative manner the effects of such a wide range of acquisition scenarios on the final data, as well as on the characterization time and data storage per sample. Herein, a clear picture of the effects of the pixel size and rotation step on the results is provided, which can notably be useful to refine the practice of µ-CT characterization of scaffolds and economize the related resources.

  19. Transfer of physics detector models into CAD systems using modern techniques

    International Nuclear Information System (INIS)

    Dach, M.; Vuoskoski, J.

    1996-01-01

    Designing high energy physics detectors for future experiments requires sophisticated computer aided design and simulation tools. In order to satisfy the future demands in this domain, modern techniques, methods, and standards have to be applied. We present an interface application, designed and implemented using object-oriented techniques, for the widely used GEANT physics simulation package. It converts GEANT detector models into the future industrial standard, STEP. (orig.)

  20. Comparison of Analysis and Spectral Nudging Techniques for Dynamical Downscaling with the WRF Model over China

    OpenAIRE

    Ma, Yuanyuan; Yang, Yi; Mai, Xiaoping; Qiu, Chongjian; Long, Xiao; Wang, Chenghai

    2016-01-01

    To overcome the problem that the horizontal resolution of global climate models may be too low to resolve features which are important at the regional or local scales, dynamical downscaling has been extensively used. However, dynamical downscaling results generally drift away from large-scale driving fields. The nudging technique can be used to balance the performance of dynamical downscaling at large and small scales, but the performances of the two nudging techniques (analysis nudging and s...

  1. Assessment of Venous Thrombosis in Animal Models.

    Science.gov (United States)

    Grover, Steven P; Evans, Colin E; Patel, Ashish S; Modarai, Bijan; Saha, Prakash; Smith, Alberto

    2016-02-01

    Deep vein thrombosis and common complications, including pulmonary embolism and post-thrombotic syndrome, represent a major source of morbidity and mortality worldwide. Experimental models of venous thrombosis have provided considerable insight into the cellular and molecular mechanisms that regulate thrombus formation and subsequent resolution. Here, we critically appraise the ex vivo and in vivo techniques used to assess venous thrombosis in these models. Particular attention is paid to imaging modalities, including magnetic resonance imaging, micro-computed tomography, and high-frequency ultrasound that facilitate longitudinal assessment of thrombus size and composition. © 2015 American Heart Association, Inc.

  2. Construction of Eh-pH and other stability diagrams of uranium in a multicomponent system with a microcomputer

    International Nuclear Information System (INIS)

    Haung, H.; Cuentas, L.

    1989-01-01

    Stability diagrams for a multicomponent system in aqueous chemistry provide important information for hydrometallurgy, corrosion science, geochemistry and environmental science. Two distinct types of diagrams constructed are predominance diagrams and distribution diagrams. The ability to construct stability diagrams easily, quickly and accurately is most helpful in research and development and in academic programs. The use of a microcomputer is handicapped by slow speed and limited memory. Developing program methods that promote easy calculation and plot the diagram directly on a CRT or a plotter is a primary concern. As presented in this paper, the calculation of equilibrium and boundary constraints, combined with isolation of stability areas, works well for constructing predominance diagrams. Equilibrium constraints can be obtained based on free energies of formation. Boundary constraints for the ligand component are the boundary of the diagram, and constraints for the main component are the surrounding lines of each dominant ligand. Other considerations regarding the chemical model, mathematics computation and the use of microcomputers pertaining to diagram construction are discussed. The uranium in a multicomponent system is used for demonstration
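
    The boundary lines of a predominance (Eh-pH) diagram follow from free energies via the Nernst equation: for a half-reaction with n electrons and m protons, Eh = E0 - (0.0592·m/n)·pH at 25 °C with unit activities (0.0592 V being 2.303RT/F). A minimal sketch, using the well-known water oxidation line rather than a uranium couple:

```python
def eh_boundary(E0, n_electrons, n_protons, pH):
    """Eh (V) of a half-reaction boundary line at 25 degC, unit activities."""
    return E0 - (0.0592 * n_protons / n_electrons) * pH

# Example: the O2/H2O line (E0 = 1.229 V, 4 electrons, 4 protons),
# i.e. the upper stability limit of water on an Eh-pH diagram.
for pH in (0, 7, 14):
    print(pH, round(eh_boundary(1.229, 4, 4, pH), 3))
```

    A full diagram program evaluates such lines for every couple, then isolates the region where each species predominates.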

  3. Influence of Heat Treatment of Nickel-Titanium Rotary Endodontic Instruments on Apical Preparation: A Micro-Computed Tomographic Study.

    Science.gov (United States)

    de Almeida, Bernardo Corrêa; Ormiga, Fabíola; de Araújo, Marcos César Pimenta; Lopes, Ricardo Tadeu; Lima, Inayá Corrêa Barbosa; dos Santos, Bernardo Camargo; Gusman, Heloisa

    2015-12-01

    The aim of this study was to make a 3-dimensional comparison of the canal transportation and changes in apical geometry using micro-computed tomographic imaging after canal preparation with the K3 (SybronEndo, Orange, CA) and K3XF (SybronEndo) file systems. Twenty-eight mandibular molars were randomly divided into 2 groups according to the rotary system used in instrumentation: K3 or K3XF. The specimens were scanned by micro-computed tomographic imaging before and after instrumentation. Images before and after instrumentation from each group were compared with regard to canal volume, surface area, and structure model index (SMI) (paired t test, P < .05). After instrumentation, the canals from each group were compared regarding the changes in volume, surface area, SMI, and canal transportation in the last 4 apical mm (t test, P < .05). Instrumentation with the 2 rotary systems significantly changed the canal volume, surface area, and SMI (P < .05), with no significant differences between instrument types concerning these parameters (P > .05). There were no significant differences between the 2 groups with regard to canal transportation in the last 4 apical mm (P > .05). Both rotary systems showed adequate canal preparations with reduced values of canal transportation. Heat treatment did not influence changes in root canal geometry in the apical region. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  4. An experimental technique for the modelling of air flow movements in nuclear plant

    International Nuclear Information System (INIS)

    Ainsworth, R.W.; Hallas, N.J.

    1986-01-01

    This paper describes an experimental technique developed at Harwell to model ventilation flows in plant at 1/5th scale. The technique achieves dynamic similarity not only for forced convection imposed by the plant ventilation system, but also for the interaction between natural convection (from heated objects) and forced convection. The use of a scale model to study flow of fluids is a well established technique, relying upon various criteria, expressed in terms of dimensionless numbers, to achieve dynamic similarity. For forced convective flows, simulation of Reynolds number is sufficient, but to model natural convection and its interaction with forced convection, the Rayleigh, Grashof and Prandtl numbers must be simulated at the same time. This paper describes such a technique, used in experiments on a hypothetical glove box cell to study the interaction between forced and natural convection. The model contained features typically present in a cell, such as a man, motor, stairs, glove box, etc. The aim of the experiment was to study the overall flow patterns, especially around the model man 'working' at the glove box. The cell ventilation was theoretically designed to produce a downward flow over the face of the man working at the glove box. However, the results have shown that the flow velocities produced an upwards flow over the face of the man. The work has indicated the viability of modelling simultaneously the forced and natural convection processes in a cell. It has also demonstrated that simplistic assumptions cannot be made about ventilation flow patterns. (author)
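
    The dimensionless groups behind the similarity argument can be checked numerically; a sketch with illustrative air properties and cell dimensions (not the Harwell rig's actual values):

```python
# Approximate properties of air near 20 degC (assumed values).
NU = 1.5e-5    # kinematic viscosity, m^2/s
T_REF = 293.0  # reference temperature, K
G = 9.81       # gravitational acceleration, m/s^2

def reynolds(u, L, nu=NU):
    """Forced-convection group: inertia / viscous forces."""
    return u * L / nu

def grashof(dT, L, T_ref=T_REF, g=G, nu=NU):
    """Natural-convection group, with Boussinesq beta = 1/T_ref."""
    return g * dT * L**3 / (T_ref * nu**2)

def richardson(dT, u, L):
    """Gr / Re^2: near 1, natural and forced convection both matter."""
    return grashof(dT, L) / reynolds(u, L) ** 2

# Illustrative case: 10 K surface excess, 0.5 m/s draught, 1.8 m scale.
ri = richardson(dT=10.0, u=0.5, L=1.8)
print(f"Ri = {ri:.2f}")
```

    A Richardson number of order one is exactly the mixed-convection regime the experiment had to reproduce at 1/5th scale, which is why Reynolds similarity alone is not sufficient.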

  5. Measuring the efficacy of flunixin meglumine and meloxicam for lame sows using a GAITFour pressure mat and an embedded microcomputer-based force plate system.

    Science.gov (United States)

    Pairis-Garcia, M D; Johnson, A K; Abell, C A; Coetzee, J F; Karriker, L A; Millman, S T; Stalder, K J

    2015-05-01

    Pain associated with lameness on farm is a negative affective state and has a detrimental impact on individual farm animal welfare. Animal pain can be managed utilizing husbandry tools and through pharmacological approaches. Nonsteroidal anti-inflammatory drugs including meloxicam and flunixin meglumine are compounds used in many species for pain management because they are easy to administer, long lasting, and cost-effective. Assessing an animal's biomechanical parameters with tools such as the embedded microcomputer-based force plate system and the GAITFour pressure mat gait analysis walkway system provides an objective, sensitive, and precise means to detect animals in lame states. The objectives of this study were to determine the efficacy of meloxicam and flunixin meglumine for pain mitigation in lame sows using the embedded microcomputer-based force plate system and the GAITFour pressure mat gait analysis walkway system. Lameness was induced in 24 mature mixed-parity sows using a chemical synovitis model, and 3 treatments were compared: meloxicam (1.0 mg/kg per os), flunixin meglumine (2.2 mg/kg intramuscular) and sterile saline (intramuscular). Weight distribution (kg) for each foot was collected twice per second for a total of 5 min for each time point using the embedded microcomputer-based force plate system. Stride time, stride length, maximum pressure, activated sensors, and stance time were collected using 3 quality walks (readings) for each time point using the GAITFour pressure mat gait analysis walkway system. Sows administered flunixin meglumine or meloxicam tolerated more weight on their lame leg compared with saline-treated sows (P < .05), as measured with the embedded microcomputer-based force plate system and the GAITFour pressure mat gait analysis walkway system. Analgesic drugs may be a key tool to manage the negative pain affective states associated with lameness.

  6. Three-Dimensional Nonlinear Finite Element Analysis and Microcomputed Tomography Evaluation of Microgap Formation in a Dental Implant Under Oblique Loading.

    Science.gov (United States)

    Jörn, Daniela; Kohorst, Philipp; Besdo, Silke; Borchers, Lothar; Stiesch, Meike

    2016-01-01

    Since bacterial leakage along the implant-abutment interface may be responsible for peri-implant infections, a realistic estimation of the interface gap width during function is important for risk assessment. The purpose of this study was to compare two methods for investigating microgap formation in a loaded dental implant, namely, microcomputed tomography (micro-CT) and three-dimensional (3D) nonlinear finite element analysis (FEA); additionally, stresses to be expected during loading were also evaluated by FEA. An implant-abutment complex was inspected for microgaps between the abutment and implant in a micro-CT scanner under an oblique load of 200 N. A numerical model of the situation was constructed; boundary conditions and external load were defined according to the experiment. The model was refined stepwise until its load-displacement behavior corresponded sufficiently to data from previous load experiments. FEA of the final, validated model was used to determine microgap widths. These were compared with the widths as measured in micro-CT inspection. Finally, stress distributions were evaluated in selected regions. No microgaps wider than 13 μm could be detected by micro-CT for the loaded implant. FEA revealed gap widths up to 10 μm between the implant and abutment at the side of load application. Furthermore, FEA predicted plastic deformation in a limited area at the implant collar. FEA proved to be an adequate method for studying microgap formation in dental implant-abutment complexes. FEA is not limited in gap width resolution as are radiologic techniques and can also provide insight into stress distributions within the loaded complex.

  7. Optimizing Availability of a Framework in Series Configuration Utilizing Markov Model and Monte Carlo Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-06-01

    Full Text Available This research work is aimed at optimizing the availability of a framework comprising two units linked together in series configuration, utilizing Markov Model and Monte Carlo (MC) Simulation techniques. In this article, effort has been made to develop a maintenance model that incorporates three distinct states for each unit, while taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, both with and without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e., the Markov Model. Validation of the results achieved is later carried out with the help of MC Simulation. In addition, MC Simulation based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov Model.
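
    A minimal Monte Carlo sketch of series-configuration availability, illustrating only the simulation-versus-analytical cross-check (exponential rates, two-state units and an as-good-as-new renewal assumption are simplifications; this is not the article's three-state deterioration model):

```python
import random

def simulate_availability(lam, mu, horizon=100000.0, seed=1):
    """Renewal-type Monte Carlo: the series system is up until the first
    of the two units fails, down while repair lasts, then as good as new."""
    rng = random.Random(seed)
    up_time, total = 0.0, 0.0
    while total < horizon:
        ttf = min(rng.expovariate(lam) for _ in range(2))  # first failure of either unit
        ttr = rng.expovariate(mu)                          # repair duration
        up_time += ttf
        total += ttf + ttr
    return up_time / total

a_sim = simulate_availability(lam=0.01, mu=0.1)
a_markov = 0.1 / (0.1 + 2 * 0.01)   # matching analytical result mu / (mu + 2*lam)
print(round(a_sim, 3), round(a_markov, 3))
```

    Because both unit failure clocks run together, the system's up time is exponential with rate 2λ, giving the closed-form comparison value; the MC estimate should agree within sampling noise.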

  8. Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling

    DEFF Research Database (Denmark)

    Dotto, C. B.; Mannina, G.; Kleidorfer, M.

    2012-01-01

    -UA), an approach based on a multi-objective auto-calibration (a multialgorithm, genetically adaptive multiobjective method, AMALGAM) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty … techniques, common criteria have been set for the likelihood formulation, defining the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were implemented for the same case study, in which the same stormwater quantity and quality model was used alongside … the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters, and model prediction intervals. For the ill-posed water quality model the differences between the results were much wider; and the paper

  9. Volumetric quantification of bone-implant contact using micro-computed tomography analysis based on region-based segmentation.

    Science.gov (United States)

    Kang, Sung-Won; Lee, Woo-Jin; Choi, Soon-Chul; Lee, Sam-Sun; Heo, Min-Suk; Huh, Kyung-Hoe; Kim, Tae-Il; Yi, Won-Jin

    2015-03-01

    We have developed a new method of segmenting the areas of absorbable implants and bone using region-based segmentation of micro-computed tomography (micro-CT) images, which allowed us to quantify volumetric bone-implant contact (VBIC) and volumetric absorption (VA). The simple threshold technique generally used in micro-CT analysis cannot be used to segment the areas of absorbable implants and bone. Instead, a region-based segmentation method, a region-labeling method, and subsequent morphological operations were successively applied to the micro-CT images. The three-dimensional VBIC and VA of the absorbable implant were then calculated over the entire volume of the implant. Two-dimensional (2D) bone-implant contact (BIC) and bone area (BA) were also measured based on the conventional histomorphometric method. VA and VBIC increased significantly as the healing period increased (p < .05), demonstrating that bone-implant contact of absorbable implants can be quantified volumetrically by micro-CT analysis using a region-based segmentation method.
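
    The region-labeling step can be sketched with a small flood-fill connected-component labeler on a binary slice, followed by a size filter (a generic stand-in for the authors' pipeline, not their algorithm):

```python
import numpy as np

def label_regions(img):
    """Label 4-connected foreground regions of a binary 2D array.
    Returns the label image and the number of regions found."""
    labels = np.zeros(img.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(img)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]                      # iterative flood fill
        while stack:
            r, c = stack.pop()
            if not (0 <= r < img.shape[0] and 0 <= c < img.shape[1]):
                continue
            if img[r, c] and labels[r, c] == 0:
                labels[r, c] = current
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, current

img = np.zeros((8, 8), dtype=bool)
img[1:4, 1:4] = True   # one 9-pixel region
img[6, 6] = True       # one single-pixel region
labels, n = label_regions(img)
sizes = [(labels == i).sum() for i in range(1, n + 1)]
print(n, sizes)  # 2 regions, sizes [9, 1]
```

    In a real pipeline, regions would then be classified (e.g. implant versus bone) and small spurious regions removed by morphological operations before computing volumetric contact.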

  10. Monte Carlo calculations of electron transport on microcomputers

    International Nuclear Information System (INIS)

    Chung, Manho; Jester, W.A.; Levine, S.H.; Foderaro, A.H.

    1990-01-01

    In the work described in this paper, the Monte Carlo program ZEBRA, developed by Berber and Buxton, was converted to run on the Macintosh computer using Microsoft BASIC to reduce the cost of Monte Carlo calculations using microcomputers. Then the Eltran2 program was transferred to an IBM-compatible computer. Turbo BASIC and Microsoft Quick BASIC have been used on the IBM-compatible Tandy 4000SX computer. The paper shows the running speed of the Monte Carlo programs on the different computers, normalized to one for Eltran2 on the Macintosh-SE or Macintosh-Plus computer. Higher values refer proportionally to faster running times. Since Eltran2 is a one-dimensional program, it calculates the energy deposited in a semi-infinite multilayer slab. Eltran2 has been modified into a two-dimensional program called Eltran3 to compute more accurately the case with a point source, a small detector, and a short source-to-detector distance. The running time of Eltran3 is about twice as long as that of Eltran2 for a similar case.

  11. Practical use of a microcomputer for ventilation calculations

    International Nuclear Information System (INIS)

    D'Albrand, N.; Froger, C.; Josien, J.P.

    1983-01-01

    Calculations of ventilation networks are necessary to elaborate projects and to handle the gradual development of a ventilation network. The means which have been used up to the present to tackle these problems are the simulator and the computer, each with its own advantages and disadvantages. These means can be improved by considering the following needs of the user: short response time in the calculation of the state of a network; easy data input and presentation of the results (diagram, visual display); and keeping records of the results. Cerchar has developed a program for the calculation of ventilation networks for use with a psi 80 (Kontron) microcomputer, which is built around a Z80 microprocessor. The data of the network to be calculated are entered by keyboard and stored on a small disc; the data and the results can be displayed on a screen or produced as a print-out. This program is suitable for the calculation of networks composed of 400 branches and 300 nodes and comprising up to 9 different characteristic curves for the mine fans. Cerchar is at present investigating the visual display of a network on an interactive graphic terminal which, when perfected, will give mine operators a tool usable by people other than data-processing experts, enabling them to keep their network permanently on site for immediate consultation or new calculations. [fr]
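
    Ventilation-network programs of this kind typically balance loop flows iteratively; a sketch using the classic Hardy Cross correction on two parallel airways with square-law resistance p = R·Q² (the article does not name its algorithm, and the resistances here are invented):

```python
def balance_parallel(R1, R2, Q_total, iterations=50):
    """Split a fixed total flow between two parallel airways so their
    pressure drops R*Q*|Q| balance, via Hardy Cross loop corrections."""
    Q1 = Q_total / 2.0
    for _ in range(iterations):
        Q2 = Q_total - Q1
        residual = R1 * Q1 * abs(Q1) - R2 * Q2 * abs(Q2)   # loop pressure imbalance
        Q1 -= residual / (2 * (R1 * abs(Q1) + R2 * abs(Q2)))  # Hardy Cross correction
    return Q1, Q_total - Q1

# 60 m^3/s of air, branch resistances 0.5 and 2.0 (arbitrary units):
# the low-resistance branch should carry twice the flow (sqrt(R2/R1) = 2).
Q1, Q2 = balance_parallel(R1=0.5, R2=2.0, Q_total=60.0)
print(round(Q1, 1), round(Q2, 1))  # 40.0 20.0
```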

  12. Inexpensive remote video surveillance system with microcomputer and solar cells

    International Nuclear Information System (INIS)

    Guevara Betancourt, Edder

    2013-01-01

    A low-cost prototype for remote video surveillance is developed around an RPI (Raspberry Pi) board. Additionally, the theoretical basis for making the module energy-independent through solar cells and a battery bank is developed. Some existing commercial monitoring systems are studied and analyzed, covering components such as cameras, communication devices (WiFi and 3G), free software packages for video surveillance, control mechanisms, and the theory of remote photovoltaic systems. A series of steps is set out to implement the module and to install, configure and test each of the hardware and software elements that make it up, exploring the feasibility of adding intelligence to the system with the chosen software. Events generated by motion detection can be viewed, archived and extracted in a simple, intuitive way. Implementing the video surveillance module with a microcomputer and motion detection software (ZoneMinder) has proved an option with a great deal of potential: as a platform for monitoring and recording data, it provides all the tools needed to build a robust and secure surveillance system. (author) [es]

  13. Microcomputer-controlled flow meter used on a water loop

    International Nuclear Information System (INIS)

    Haniger, L.

    1982-01-01

    The report describes a microcomputer-controlled instrument intended for operational measurement on an experimental water loop. From the pressure and temperature input signals the instrument calculates the specific weight, and for ten operator-selectable measuring channels it calculates the mass flow G (kp/s) or the volumetric flow Q (m³/h). On pressing the appropriate push-buttons, the built-in display indicates the values of pressure (p) and temperature (t), as well as the specific weight γ calculated from them. For the ten individually selectable channels the instrument displays either the pressure differences of the measuring throttling elements (√Δp_i) or the calculated values of G_i or Q_i. In addition, on pressing the Σ push-button it sums the values of G_i and Q_i over the selected channels. The device is controlled by an 8085 microprocessor, with the analog unit MP 6812 used as the A/D converter. The instrument's algorithm flags some possible errors, which may concern faulty input signals or mistakes in calculation. (author)
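
    The per-channel calculation such an instrument performs can be sketched as follows; the channel constant and the loop conditions are hypothetical stand-ins for the calibration data a real instrument would store:

```python
import math

def flow_from_dp(dp_pa, rho, k_channel):
    """Mass flow G (kg/s) and volumetric flow Q (m^3/h) from the differential
    pressure dp (Pa) across a throttling element.

    Square-root law for orifice-type elements: G is proportional to
    sqrt(dp * rho). k_channel is a hypothetical per-channel calibration
    constant, and rho (kg/m^3) is derived from the measured p and t.
    """
    g = k_channel * math.sqrt(dp_pa * rho)  # mass flow, kg/s
    q = g / rho * 3600.0                    # volumetric flow, m^3/h
    return g, q

# Example: 20 kPa across the orifice, hot loop water at ~958 kg/m^3
g, q = flow_from_dp(20e3, 958.0, 1e-3)
```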

  14. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA (TEcnologia de Reatores Rapidos Avancados, Technology for Advanced Fast Reactors) project, aimed at a space reactor application. In this work, in support of the TERRA project, the NALAP code adapted to a microcomputer compiler, Compaq Visual Fortran (Version 6.6), is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses of sodium-cooled fast reactors. The strategy to run the code on a PC was divided into steps, mainly to remove unnecessary routines, to eliminate old statements, to introduce new ones, and to include an extended precision mode. The source program was able to solve three sample cases under protected transient conditions suggested in the literature: a normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; a reactor scram following a loss-of-flow transient; and protected overpower transients. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 1980s; all the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the use of gas as the reactor coolant, allowing a Closed Brayton Cycle Loop (CBCL) to be used as a heat/electric converter. (author)

  16. A Review of Domain Modelling and Domain Imaging Techniques in Ferroelectric Crystals

    Directory of Open Access Journals (Sweden)

    John E. Huber

    2011-02-01

    The present paper reviews models of domain structure in ferroelectric crystals, thin films and bulk materials. Common crystal structures in ferroelectric materials are described and the theory of compatible domain patterns is introduced. Applications to multi-rank laminates are presented. Alternative models employing phase-field and related techniques are reviewed. The paper then presents methods of observing ferroelectric domain structure, including optical and polarized-light microscopy, scanning electron microscopy, X-ray and neutron diffraction, atomic force microscopy and piezo-force microscopy. Use of more than one technique for unambiguous identification of the domain structure is also described.

  17. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data, or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived, respectively, from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data, and global geopotential coefficients. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by integrating the different wavelengths, and that adjust these models to a local vertical datum. This research presents a package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil, covering ~6000 km² of wavy relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the study area show the local geoid model computed by the GRAVTool package (Figure), using 1377 terrestrial gravity data, SRTM data with 3 arc second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m, minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was determined by geometrical levelling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m, minimum = -0.040 m).
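
    The remove-compute-restore bookkeeping can be sketched in a few lines; all numbers below are hypothetical point values, and the Stokes integration step is stubbed out, since that is the mathematically heavy part a package like GRAVTool actually implements:

```python
import numpy as np

# "Remove": subtract the long-wavelength (global geopotential model, GGM)
# and short-wavelength (terrain) contributions from the observed anomalies.
dg_obs = np.array([23.1, 18.4, 30.2])   # observed gravity anomalies (mGal)
dg_ggm = np.array([20.0, 17.0, 27.5])   # GGM contribution
dg_rtm = np.array([1.5, -0.3, 2.0])     # terrain (residual terrain model) part

dg_res = dg_obs - dg_ggm - dg_rtm       # residual anomalies to be integrated

# "Compute": Stokes integration of dg_res would yield residual geoid
# heights N_res; a placeholder linear stub is used here, not Stokes' formula.
n_res = 0.01 * dg_res

# "Restore": add back the GGM geoid and the terrain indirect effect (metres)
n_ggm = np.array([12.30, 12.31, 12.28])
n_rtm = np.array([0.02, 0.01, 0.03])
n = n_ggm + n_res + n_rtm               # final geoid heights (m)
```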

  18. Integrated approach to model decomposed flow hydrograph using artificial neural network and conceptual techniques

    Science.gov (United States)

    Jain, Ashu; Srinivasulu, Sanaga

    2006-02-01

    This paper presents the findings of a study aimed at decomposing a flow hydrograph into different segments based on physical concepts in a catchment, and modelling the different segments using different techniques, viz. conceptual techniques and artificial neural networks (ANNs). An integrated modelling framework is proposed, capable of modelling infiltration, base flow, evapotranspiration, soil moisture accounting, and certain segments of the decomposed flow hydrograph using conceptual techniques, and the complex, non-linear, and dynamic rainfall-runoff process using the ANN technique. Specifically, five different multi-layer perceptron (MLP) and two self-organizing map (SOM) models have been developed. The rainfall and streamflow data derived from the Kentucky River catchment were employed to test the proposed methodology and develop all the models. The performance of all the models was evaluated using seven different standard statistical measures. The results obtained in this study indicate that (a) the rainfall-runoff relationship in a large catchment consists of at least three or four different mappings corresponding to different dynamics of the underlying physical processes, (b) an integrated approach that models the different segments of the decomposed flow hydrograph using different techniques is better than a single ANN in modelling the complex, dynamic, non-linear, and fragmented rainfall-runoff process, (c) a simple model based on the concept of flow recession is better than an ANN at modelling the falling limb of a flow hydrograph, and (d) decomposing a flow hydrograph into segments corresponding to the different dynamics based on physical concepts is better than the soft decomposition provided by the SOM.
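
    The flow-recession concept used for the falling limb is simple enough to state directly: a linear reservoir drains as Q_t = Q_0·k^t. A minimal sketch, with a hypothetical recession constant rather than one calibrated on the Kentucky River data:

```python
def recession_limb(q0, k, steps):
    """Falling-limb flows from the linear-reservoir recession Q_t = q0 * k**t.

    q0 is the flow at the start of recession and k (0 < k < 1) is a
    hypothetical recession constant; in practice k is fitted to observed
    recession segments of the hydrograph.
    """
    return [q0 * k ** t for t in range(steps)]

flows = recession_limb(100.0, 0.9, 4)  # roughly 100, 90, 81, 72.9
```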

  19. BREEDER: a microcomputer program for financial analysis of a large-scale prototype breeder reactor

    International Nuclear Information System (INIS)

    Giese, R.F.

    1984-04-01

    This report describes a microcomputer-based, single-project financial analysis program: BREEDER. BREEDER is a user-friendly model designed to facilitate frequent and rapid analyses of the financial implications associated with alternative design and financing strategies for electric generating plants and large-scale prototype breeder (LSPB) reactors in particular. The model has proved to be a useful tool in establishing cost goals for LSPB reactors. The program is available on floppy disks for use on an IBM personal computer (or IBM look-a-like) running under PC-DOS or a Kaypro II transportable computer running under CP/M (and many other CP/M machines). The report documents version 1.5 of BREEDER and contains a user's guide. The report also includes a general overview of BREEDER, a summary of hardware requirements, a definition of all required program inputs, a description of all algorithms used in performing the construction-period and operation-period analyses, and a summary of all available reports. The appendixes contain a complete source-code listing, a cross-reference table, a sample interactive session, several sample runs, and additional documentation of the net-equity program option

  20. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural networks (ANN) and fuzzy logic approaches have proven efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining the two approaches, and as a result neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of the neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series, and the model performed well in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation, etc. It was observed that the ANFIS model fully preserves the potential of the ANN approach and eases the model-building process.

  1. Forecasting performances of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2014-01-01

    In this work we consider the forecasting of macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feed-forward autoregressive neural network models. What makes these models interesting in the present context is the fact that they form a class of universal approximators and may be expected to work well during exceptional periods such as major economic crises. Neural network models are often difficult to estimate, and we follow the idea of White (2006) of transforming the specification and nonlinear estimation problem ... The performances of these three model selectors are compared by looking at the accuracy of the forecasts of the estimated neural network models. We apply the neural network model and the three modelling techniques to monthly industrial production and unemployment series from the G7 countries and the four ...

  2. Machine Learning Techniques for Modelling Short Term Land-Use Change

    Directory of Open Access Journals (Sweden)

    Mileva Samardžić-Petrović

    2017-11-01

    The representation of land use change (LUC) is often achieved by using data-driven methods that include machine learning (ML) techniques. The main objectives of this research study are to implement three ML techniques, Decision Trees (DT), Neural Networks (NN), and Support Vector Machines (SVM), for LUC modeling, to compare the three techniques, and to find an appropriate data representation. The ML techniques are applied to a case study of LUC in three municipalities of the City of Belgrade, Republic of Serbia, using historical geospatial data sets and considering nine land use classes. The ML models were built and assessed using two different time intervals. The information gain ranking technique and the recursive attribute elimination procedure were implemented to find the most informative attributes related to LUC in the study area. The results indicate that all three ML techniques can be used effectively for short-term forecasting of LUC, but the SVM achieved the highest agreement of predicted changes.
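
    A comparison of this kind reduces to training each classifier on attribute vectors describing land-use cells and scoring it on held-out cells. A minimal sketch with scikit-learn, using random stand-in data rather than the Belgrade data set (a neural network would be compared the same way):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((300, 6))                    # e.g. slope, distance to roads, ...
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # synthetic change / no-change label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scores = {}
for model in (DecisionTreeClassifier(random_state=0), SVC()):
    model.fit(X_tr, y_tr)
    scores[type(model).__name__] = model.score(X_te, y_te)  # held-out accuracy
```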

  3. Electricity market price spike analysis by a hybrid data model and feature selection technique

    International Nuclear Information System (INIS)

    Amjady, Nima; Keynia, Farshid

    2010-01-01

    In a competitive electricity market, energy price forecasting is an important activity for both suppliers and consumers. For this reason, many techniques have been proposed in recent years to predict electricity market prices. However, the electricity price is a complex, volatile signal with many spikes. Most electricity price forecast techniques focus on normal price prediction, while price spike forecasting is a different and more complex prediction process. Price spike forecasting has two main aspects: prediction of price spike occurrence and of its value. In this paper, a novel technique for price spike occurrence prediction is presented, composed of a new hybrid data model, a novel feature selection technique, and an efficient forecast engine. The hybrid data model includes both wavelet and time domain variables as well as calendar indicators, comprising a large candidate input set. The set is refined by the proposed feature selection technique, which evaluates both the relevancy and the redundancy of the candidate inputs. The forecast engine is a probabilistic neural network, which is fed with the candidate inputs selected by the feature selection technique and predicts price spike occurrence. The efficiency of the whole proposed method for price spike occurrence forecasting is evaluated on real data from the Queensland and PJM electricity markets. (author)
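
    The relevancy/redundancy idea can be sketched with a greedy, correlation-based ranking. This is a simplification: the paper defines its own criterion, and absolute Pearson correlation here merely stands in for the relevancy and redundancy measures:

```python
import numpy as np

def greedy_select(X, y, n_keep):
    """Greedy relevancy-minus-redundancy feature ranking (an mRMR-style sketch).

    Relevancy (to the target y) and redundancy (to already-selected features)
    are both scored with absolute Pearson correlation.
    """
    n_feat = X.shape[1]
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_feat)]
    )
    selected = [int(np.argmax(relevance))]
    while len(selected) < n_keep:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Feature 1 is an exact copy of feature 0, so after the most relevant
# feature is taken, the redundancy penalty steers selection to feature 2.
rng = np.random.default_rng(0)
a, b = rng.normal(size=300), rng.normal(size=300)
y = a + b
X = np.column_stack([a, a, b])
picked = greedy_select(X, y, 2)  # selects features {0, 2}, never the duplicate
```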

  4. Low level waste management: a compilation of models and monitoring techniques. Volume 1

    International Nuclear Information System (INIS)

    Mosier, J.E.; Fowler, J.R.; Barton, C.J.

    1980-04-01

    In support of the National Low-Level Waste (LLW) Management Research and Development Program being carried out at Oak Ridge National Laboratory, Science Applications, Inc., conducted a survey of models and monitoring techniques associated with the transport of radionuclides and other chemical species from LLW burial sites. As a result of this survey, approximately 350 models were identified. For each model the purpose and a brief description are presented. To the extent possible, a point of contact and reference material are identified. The models are organized into six technical categories: atmospheric transport, dosimetry, food chain, groundwater transport, soil transport, and surface water transport. About 4% of the models identified covered other aspects of LLW management and are placed in a miscellaneous category. A preliminary assessment of all these models was performed to determine their ability to analyze the transport of other chemical species. The models that appeared to be applicable are identified. A brief survey of the state-of-the-art techniques employed to monitor LLW burial sites is also presented, along with a very brief discussion of up-to-date burial techniques

  6. Evaluation of mesh morphing and mapping techniques in patient specific modeling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2013-01-01

    Robust generation of pelvic finite element models is necessary to understand the variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmark-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis and their strain distributions evaluated. Morphing and mapping techniques were effectively applied to generate good-quality, geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.
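
    The node-shifting step described above can be sketched as follows; the nearest-vertex matching is a simplification (the actual pipeline is landmark-based), and the geometry is a toy stand-in for a pelvic surface mesh:

```python
import numpy as np

def morph_nodes(source_nodes, source_normals, target_vertices):
    """Shift exterior source-mesh nodes toward a target surface.

    Each node is matched to its nearest target vertex, and the shift is
    constrained along the node's unit outward normal, i.e. only the normal
    component of the displacement toward the matched vertex is applied.
    """
    morphed = np.empty_like(source_nodes)
    for i, (p, n) in enumerate(zip(source_nodes, source_normals)):
        d = target_vertices - p
        nearest = target_vertices[np.argmin(np.einsum('ij,ij->i', d, d))]
        morphed[i] = p + np.dot(nearest - p, n) * n  # move along the normal only
    return morphed

# Toy example: two nodes, unit normals, three candidate target vertices
nodes = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])
targets = np.array([[0.0, 0.0, 1.0], [2.0, 0.0, 0.0], [9.0, 9.0, 9.0]])
moved = morph_nodes(nodes, normals, targets)
```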

  7. Evaluation of mesh morphing and mapping techniques in patient specific modelling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2012-08-01

    Robust generation of pelvic finite element models is necessary to understand variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmark-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity-based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis, and their strain distributions were evaluated. Morphing and mapping techniques were effectively applied to generate good quality and geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.

  8. Control System Design for Cylindrical Tank Process Using Neural Model Predictive Control Technique

    Directory of Open Access Journals (Sweden)

    M. Sridevi

    2010-10-01

    Chemical manufacturing and the process industry require innovative technologies for process identification. This paper deals with model identification and control of a cylindrical tank process. Model identification of the process was done using the ARMAX technique, and a neural model predictive controller (NMPC) was designed for the identified model. The performance of the controllers was evaluated using MATLAB software. The performance of the NMPC controller was compared with a Smith predictor controller and an IMC controller on the basis of rise time, settling time, overshoot and ISE, and it was found that the NMPC controller is better suited to this process.

  9. Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox

    DEFF Research Database (Denmark)

    Nonejad, Nima

    This paper details Particle Markov chain Monte Carlo (PMCMC) techniques for the analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall, PMCMC provides a very compelling, computationally fast and efficient framework for estimation. These advantages are used, for instance, to estimate stochastic volatility models with leverage effects or with Student-t distributed errors. We also model changing time series characteristics of the US inflation rate by considering a heteroskedastic ARFIMA model where...

  10. Fuzzy modeling and control of rotary inverted pendulum system using LQR technique

    International Nuclear Information System (INIS)

    Fairus, M A; Mohamed, Z; Ahmad, M N

    2013-01-01

    The rotary inverted pendulum (RIP) system is a nonlinear, non-minimum phase, unstable and underactuated system. Controlling such a system can be a challenge and is considered a benchmark problem in control theory. Prior to designing a controller, equations that represent the behaviour of the RIP system must be developed as accurately as possible without making the equations overly complex. Through the Takagi-Sugeno (T-S) fuzzy modeling technique, the nonlinear system model is transformed into several local linear time-invariant models, which are then blended together to reproduce, or approximate, the nonlinear system model within a local region. A parallel distributed compensation (PDC) based fuzzy controller using the linear quadratic regulator (LQR) technique is designed to control the RIP system. The results show that the designed controller is able to balance the RIP system.
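
    The LQR step itself is standard: solve the continuous algebraic Riccati equation for each local linear model and form the state-feedback gain. A sketch on a double integrator (a hypothetical stand-in, not the linearized RIP dynamics) using SciPy:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Local linear model x' = A x + B u (double integrator stand-in)
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)           # state weighting
R = np.array([[1.0]])   # input weighting

P = solve_continuous_are(A, B, Q, R)   # Riccati solution
K = np.linalg.solve(R, B.T @ P)        # u = -K x; here K = [1, sqrt(3)]

poles = np.linalg.eigvals(A - B @ K)   # closed-loop poles, all in the LHP
```

    In the PDC scheme, one such gain is computed per T-S local model and the local control laws are blended with the same fuzzy membership weights.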

  11. Advanced particle-in-cell simulation techniques for modeling the Lockheed Martin Compact Fusion Reactor

    Science.gov (United States)

    Welch, Dale; Font, Gabriel; Mitchell, Robert; Rose, David

    2017-10-01

    We report on particle-in-cell developments for the study of the Compact Fusion Reactor. Millisecond, two- and three-dimensional simulations (cubic-metre volume) of confinement and neutral beam heating of the magnetic confinement device require accurate representation of the complex orbits, near-perfect energy conservation, and significant computational power. In order to determine the initial plasma fill and neutral beam heating, these simulations include ionization, elastic, and charge-exchange hydrogen reactions. To this end, we are pursuing fast electromagnetic kinetic modeling algorithms, including two implicit techniques and a hybrid quasi-neutral algorithm with kinetic ions. The kinetic modeling includes use of the Poisson-corrected direct implicit, magnetic implicit, and second-order cloud-in-cell techniques. The hybrid algorithm, which ignores electron inertial effects, is two orders of magnitude faster than the kinetic approach but not as accurate with respect to confinement. The advantages and disadvantages of these techniques will be presented. Funded by Lockheed Martin.

  12. Analysis of Multipath Mitigation Techniques with Land Mobile Satellite Channel Model

    Directory of Open Access Journals (Sweden)

    M. Z. H. Bhuiyan; J. Zhang

    2012-12-01

    Multipath is undesirable for Global Navigation Satellite System (GNSS) receivers, since the reception of multipath can create significant distortion to the shape of the correlation function, leading to an error in the receiver's position estimate. Many multipath mitigation techniques exist in the literature to deal with the multipath propagation problem in the context of GNSS. The multipath studies in the literature are often based on optimistic assumptions, for example, assuming a static two-path channel or a fading channel with a Rayleigh or a Nakagami distribution. But, in reality, there are a lot of channel modeling issues, for example, satellite-to-user geometry, a variable number of paths, variable path delays and gains, Non-Line-Of-Sight (NLOS) path conditions, receiver movements, etc., that are kept out of consideration when analyzing the performance of these techniques. Therefore, it is of utmost importance to analyze the performance of different multipath mitigation techniques in realistic measurement-based channel models, for example, the Land Mobile Satellite (LMS) channel model.

  13. Robotic and endoscopic transoral thyroidectomy: feasibility and description of the technique in the cadaveric model.

    Science.gov (United States)

    Kahramangil, Bora; Mohsin, Khuzema; Alzahrani, Hassan; Bu Ali, Daniah; Tausif, Syed; Kang, Sang-Wook; Kandil, Emad; Berber, Eren

    2017-12-01

    Numerous new approaches have been described over the years to improve the cosmetic outcomes of thyroid surgery. The transoral approach is a new technique that aims to achieve superior cosmetic outcomes by concealing the incision in the oral cavity. Transoral thyroidectomy through the vestibular approach was performed on cadaveric models at two institutions: the procedure was performed endoscopically at one institution, while the robotic technique was utilized at the other. Transoral thyroidectomy was successfully performed at both institutions with the robotic and endoscopic techniques, and all vital structures were identified and preserved. Transoral thyroidectomy has been performed in animal and cadaveric models, as well as in some clinical studies. Our initial experience indicates the feasibility of this approach; more clinical studies are required to elucidate its full utility.

  14. Modeling of high-pressure generation using the laser colliding foil technique

    Energy Technology Data Exchange (ETDEWEB)

    Fabbro, R.; Faral, B.; Virmont, J.; Cottet, F.; Romain, J.P.

    1989-03-01

    An analytical model describing the collision of two foils is presented and applied to the collision of laser-accelerated foils. Numerical simulations have been made to verify this model and to compare its results in the case of laser-accelerated foils. Scaling laws relating the different parameters (shock pressure, laser intensity, target material, etc.) have been established. The application of this technique to high-pressure equation of state experiments is then discussed.

  15. Modeling of high-pressure generation using the laser colliding foil technique

    International Nuclear Information System (INIS)

    Fabbro, R.; Faral, B.; Virmont, J.; Cottet, F.; Romain, J.P.

    1989-01-01

    An analytical model describing the collision of two foils is presented and applied to the collision of laser-accelerated foils. Numerical simulations have been made to verify this model and to compare its results in the case of laser-accelerated foils. Scaling laws relating the different parameters (shock pressure, laser intensity, target material, etc.) have been established. The application of this technique to high-pressure equation of state experiments is then discussed.

  16. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    Full Text Available The idea of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach of corporate governance. We use cognitive maps to visualize these mental models and to show the ways of thinking about, and conceptualization of, the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model, using a cognitive mapping technique.

  17. The need for novel model order reduction techniques in the electronics industry (Chapter 1)

    NARCIS (Netherlands)

    Schilders, W.H.A.; Benner, P.; Hinze, M.; Maten, ter E.J.W.

    2011-01-01

    In this paper, we discuss the present and future needs of the electronics industry with regard to model order reduction. The industry has always been one of the main motivating fields for the development of MOR techniques, and continues to play this role. We discuss the search for provably passive

  18. Combined rock-physical modelling and seismic inversion techniques for characterisation of stacked sandstone reservoir

    NARCIS (Netherlands)

    Justiniano, A.; Jaya, Y.; Diephuis, G.; Veenhof, R.; Pringle, T.

    2015-01-01

    The objective of the study is to characterise the Triassic massive stacked sandstone deposits of the Main Buntsandstein Subgroup at Block Q16 located in the West Netherlands Basin. The characterisation was carried out through combining rock-physics modelling and seismic inversion techniques. The

  19. Application of modelling techniques in the food industry: determination of shelf-life for chilled foods

    NARCIS (Netherlands)

    Membré, J.M.; Johnston, M.D.; Bassett, J.; Naaktgeboren, G.; Blackburn, W.; Gorris, L.G.M.

    2005-01-01

    Microbiological modelling techniques (predictive microbiology, the Bayesian Markov Chain Monte Carlo method and a probability risk assessment approach) were combined to assess the shelf-life of an in-pack heat-treated, low-acid sauce intended to be marketed under chilled conditions. From a safety

  20. Extending the reach of strong-coupling: an iterative technique for Hamiltonian lattice models

    International Nuclear Information System (INIS)

    Alberty, J.; Greensite, J.; Patkos, A.

    1983-12-01

    The authors propose an iterative method for doing lattice strong-coupling-like calculations in a range of medium to weak couplings. The method is a modified Lanczos scheme, with greatly improved convergence properties. The technique is tested on the Mathieu equation and on a Hamiltonian finite-chain XY model, with excellent results. (Auth.)
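
    The record above describes a modified Lanczos scheme, but the specific modification is not given in the abstract. The core idea — extracting low-lying eigenvalues from a small Krylov-space projection — can be sketched with a plain Lanczos iteration (full reorthogonalisation added for numerical robustness) applied to a discretised Mathieu-type operator. The grid size, coupling q, and iteration count below are illustrative choices, not values from the paper.

    ```python
    import numpy as np

    def lanczos_lowest(H, v0, m):
        """Project symmetric H onto an m-step Krylov space and return the
        lowest Ritz value. Full reorthogonalisation keeps the basis clean."""
        n = H.shape[0]
        V = np.zeros((n, m))
        alpha, beta = np.zeros(m), np.zeros(max(m - 1, 1))
        V[:, 0] = v0 / np.linalg.norm(v0)
        w = H @ V[:, 0]
        alpha[0] = V[:, 0] @ w
        w = w - alpha[0] * V[:, 0]
        for j in range(1, m):
            beta[j - 1] = np.linalg.norm(w)
            if beta[j - 1] < 1e-12:          # invariant subspace found
                m = j
                break
            V[:, j] = w / beta[j - 1]
            w = H @ V[:, j]
            alpha[j] = V[:, j] @ w
            w = w - alpha[j] * V[:, j] - beta[j - 1] * V[:, j - 1]
            w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)  # reorthogonalise
        T = np.diag(alpha[:m]) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
        return np.linalg.eigvalsh(T)[0]

    # Periodic finite-difference Mathieu-type Hamiltonian: -d2/dx2 + 2q cos(2x)
    n, q = 80, 1.0
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    h = 2 * np.pi / n
    H = np.diag(2.0 / h**2 + 2 * q * np.cos(2 * x))
    H -= (np.eye(n, k=1) + np.eye(n, k=-1)) / h**2
    H[0, -1] -= 1.0 / h**2
    H[-1, 0] -= 1.0 / h**2

    e0 = lanczos_lowest(H, np.random.default_rng(0).normal(size=n), m=n)
    print(e0)  # agrees with np.linalg.eigvalsh(H)[0]
    ```

    Extremal Ritz values converge first, which is why Lanczos-type schemes suit ground-state estimates of lattice Hamiltonians; the paper's convergence-improving modification is beyond this sketch.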

  1. New model reduction technique for a class of parabolic partial differential equations

    NARCIS (Netherlands)

    Vajta, Miklos

    1991-01-01

    A model reduction (or lumping) technique for a class of parabolic-type partial differential equations is given, and its application is discussed. The frequency response of the temperature distribution in any multilayer solid is developed and given by a matrix expression. The distributed transfer

  2. Prediction of Monthly Summer Monsoon Rainfall Using Global Climate Models Through Artificial Neural Network Technique

    Science.gov (United States)

    Nair, Archana; Singh, Gurjeet; Mohanty, U. C.

    2018-01-01

    The monthly prediction of summer monsoon rainfall is very challenging because of its complex and chaotic nature. In this study, a non-linear technique known as Artificial Neural Network (ANN) has been employed on the outputs of Global Climate Models (GCMs) to bring out the vagaries inherent in monthly rainfall prediction. The GCMs that are considered in the study are from the International Research Institute (IRI) (2-tier CCM3v6) and the National Centre for Environmental Prediction (Coupled-CFSv2). The ANN technique is applied on different ensemble members of the individual GCMs to obtain monthly scale prediction over India as a whole and over its spatial grid points. In the present study, a double-cross-validation and simple randomization technique was used to avoid the over-fitting during training process of the ANN model. The performance of the ANN-predicted rainfall from GCMs is judged by analysing the absolute error, box plots, percentile and difference in linear error in probability space. Results suggest that there is significant improvement in prediction skill of these GCMs after applying the ANN technique. The performance analysis reveals that the ANN model is able to capture the year to year variations in monsoon months with fairly good accuracy in extreme years as well. ANN model is also able to simulate the correct signs of rainfall anomalies over different spatial points of the Indian domain.
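
    As a rough sketch of the post-processing described above (an ANN applied to GCM outputs), the following one-hidden-layer network trained by full-batch gradient descent maps synthetic "ensemble member" predictors to a target series. The data, network size, and learning rate are invented for illustration; the study's actual inputs (IRI CCM3v6 and NCEP CFSv2 ensembles) and its double cross-validation scheme are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic stand-in data: 5 "ensemble member" forecasts per month (inputs)
    # and an observed monthly rainfall anomaly (target). Purely illustrative.
    X = rng.normal(size=(200, 5))
    y = np.tanh(X @ rng.normal(size=5)) + 0.1 * rng.normal(size=200)

    # One-hidden-layer network trained by gradient descent on the MSE loss.
    n_hidden, lr = 8, 0.05
    W1 = rng.normal(scale=0.5, size=(5, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0

    losses = []
    for _ in range(500):
        hdn = np.tanh(X @ W1 + b1)           # forward pass
        pred = hdn @ W2 + b2
        err = pred - y
        losses.append(float(np.mean(err ** 2)))
        gW2 = hdn.T @ err / len(X)           # backpropagation
        gb2 = err.mean()
        dh = np.outer(err, W2) * (1 - hdn ** 2)
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    print(losses[0], losses[-1])  # training error drops substantially
    ```

    In practice the held-out validation used in the paper matters more than the training fit shown here, since over-fitting is the main hazard with small monsoon records.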

  3. The commercial use of segmentation and predictive modeling techniques for database marketing in the Netherlands

    NARCIS (Netherlands)

    Verhoef, PC; Spring, PN; Hoekstra, JC; Leeflang, PSH

    Although the application of segmentation and predictive modeling is an important topic in the database marketing (DBM) literature, no study has yet investigated the extent of adoption of these techniques. We present the results of a Dutch survey involving 228 database marketing companies. We find

  4. Evaluation of inverse modeling techniques for pinpointing water leakages at building constructions

    NARCIS (Netherlands)

    Schijndel, van A.W.M.

    2015-01-01

    The location and nature of the moisture leakages are sometimes difficult to detect. Moreover, the relation between observed inside surface moisture patterns and where the moisture enters the construction is often not clear. The objective of this paper is to investigate inverse modeling techniques as

  5. Numerical modelling of radon-222 entry into houses: An outline of techniques and results

    DEFF Research Database (Denmark)

    Andersen, C.E.

    2001-01-01

    Numerical modelling is a powerful tool for studies of soil gas and radon-222 entry into houses. It is the purpose of this paper to review some main techniques and results. In the past, modelling has focused on Darcy flow of soil gas (driven by indoor–outdoor pressure differences) and combined...... diffusive and advective transport of radon. Models of different complexity have been used. The simpler ones are finite-difference models with one or two spatial dimensions. The more complex models allow for full three-dimensional and time dependency. Advanced features include: soil heterogeneity, anisotropy......, fractures, moisture, non-uniform soil temperature, non-Darcy flow of gas, and flow caused by changes in the atmospheric pressure. Numerical models can be used to estimate the importance of specific factors for radon entry. Models are also helpful when results obtained in special laboratory or test structure...
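
    As a minimal illustration of the finite-difference style of modelling reviewed above, the sketch below solves a steady-state one-dimensional diffusion-decay-generation balance for radon in a soil column. All parameter values are illustrative order-of-magnitude choices, not values from the reviewed models, and advection (Darcy flow of soil gas) is deliberately left out.

    ```python
    import numpy as np

    # Steady-state 1-D radon balance in a soil column:
    #   D * d2C/dx2 - lam * C + G = 0,  C(0) = 0 (open surface), dC/dx(L) = 0
    D = 2.0e-6      # effective diffusivity of radon in soil, m^2/s
    lam = 2.1e-6    # Rn-222 decay constant, 1/s
    G = 1.0e-2      # radon generation rate, Bq/(m^3 s)
    L, n = 5.0, 200
    h = L / (n - 1)

    A = np.zeros((n, n))
    b = np.full(n, -G)
    A[0, 0], b[0] = 1.0, 0.0                      # Dirichlet: C = 0 at surface
    for i in range(1, n - 1):                     # interior finite-difference rows
        A[i, i - 1] = D / h**2
        A[i, i] = -2.0 * D / h**2 - lam
        A[i, i + 1] = D / h**2
    A[-1, -2], A[-1, -1], b[-1] = -1.0, 1.0, 0.0  # Neumann: zero flux at depth L

    C = np.linalg.solve(A, b)
    print(C[-1], G / lam)  # deep-soil concentration approaches the G/lam limit
    ```

    The two- and three-dimensional models discussed in the paper extend exactly this balance with pressure-driven advection, soil heterogeneity, and time dependence.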

  6. Nonlinear modelling of polymer electrolyte membrane fuel cell stack using nonlinear cancellation technique

    Energy Technology Data Exchange (ETDEWEB)

    Barus, R. P. P., E-mail: rismawan.ppb@gmail.com [Engineering Physics, Faculty of Industrial Technology, Institut Teknologi Bandung, Jalan Ganesa 10 Bandung and Centre for Material and Technical Product, Jalan Sangkuriang No. 14 Bandung (Indonesia); Tjokronegoro, H. A.; Leksono, E. [Engineering Physics, Faculty of Industrial Technology, Institut Teknologi Bandung, Jalan Ganesa 10 Bandung (Indonesia); Ismunandar [Chemistry Study, Faculty of Mathematics and Science, Institut Teknologi Bandung, Jalan Ganesa 10 Bandung (Indonesia)

    2014-09-25

    Fuel cells are promising new energy conversion devices that are friendly to the environment. A set of control systems is required in order to operate a fuel cell based power plant system optimally. For the purpose of control system design, an accurate fuel cell stack model describing the dynamics of the real system is needed. Currently, linear models are widely used for fuel cell stack control purposes, but they are limited to a narrow operation range, while nonlinear models lead to nonlinear control implementations that are more complex and computationally demanding. In this research, a nonlinear cancellation technique is used to transform a nonlinear model into a linear form while maintaining the nonlinear characteristics. The transformation is done by replacing the input of the original model by a certain virtual input that has a nonlinear relationship with the original input. The equivalence of the two models is then tested by running a series of simulations. Input variations of H2, O2 and H2O as well as the disturbance input I (current load) are studied by simulation. The error between the proposed model and the original nonlinear model is less than 1%. Thus we conclude that the nonlinear cancellation technique can be used to represent a fuel cell nonlinear model in a simple linear form while maintaining the nonlinear characteristics and therefore retaining the wide operation range.

  7. Nonlinear modelling of polymer electrolyte membrane fuel cell stack using nonlinear cancellation technique

    International Nuclear Information System (INIS)

    Barus, R. P. P.; Tjokronegoro, H. A.; Leksono, E.; Ismunandar

    2014-01-01

    Fuel cells are promising new energy conversion devices that are friendly to the environment. A set of control systems is required in order to operate a fuel cell based power plant system optimally. For the purpose of control system design, an accurate fuel cell stack model describing the dynamics of the real system is needed. Currently, linear models are widely used for fuel cell stack control purposes, but they are limited to a narrow operation range, while nonlinear models lead to nonlinear control implementations that are more complex and computationally demanding. In this research, a nonlinear cancellation technique is used to transform a nonlinear model into a linear form while maintaining the nonlinear characteristics. The transformation is done by replacing the input of the original model by a certain virtual input that has a nonlinear relationship with the original input. The equivalence of the two models is then tested by running a series of simulations. Input variations of H2, O2 and H2O as well as the disturbance input I (current load) are studied by simulation. The error between the proposed model and the original nonlinear model is less than 1%. Thus we conclude that the nonlinear cancellation technique can be used to represent a fuel cell nonlinear model in a simple linear form while maintaining the nonlinear characteristics and therefore retaining the wide operation range.
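
    The input-replacement idea in the abstract can be illustrated on a toy scalar system (not the actual fuel cell stack model): the real input u is computed from a virtual input v so that the closed dynamics become exactly linear in v. All coefficients below are invented for the sketch.

    ```python
    # Toy scalar plant:  dx/dt = -a*x**2 + b*u
    # Nonlinear cancellation: drive it with u computed from a virtual input v,
    #   u = (a*x**2 - x + v) / b   =>   dx/dt = -x + v   (linear in v)
    a, b, dt = 0.5, 2.0, 1e-3

    x_nl = x_lin = 1.0          # same initial state for both models
    v = 0.8                     # constant virtual (linear-domain) input
    for _ in range(5000):       # 5 s of forward-Euler simulation
        u = (a * x_nl**2 - x_nl + v) / b      # cancelling input transformation
        x_nl += dt * (-a * x_nl**2 + b * u)   # nonlinear plant step
        x_lin += dt * (-x_lin + v)            # equivalent linear model step

    print(abs(x_nl - x_lin))  # the two trajectories coincide
    ```

    Because u is recomputed from the current state at every step, the nonlinear term is cancelled exactly, which is why the paper can report sub-1% agreement while keeping the full operating range.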

  8. Experience with the Large Eddy Simulation (LES) Technique for the Modelling of Premixed and Non-premixed Combustion

    OpenAIRE

    Malalasekera, W; Ibrahim, SS; Masri, AR; Gubba, SR; Sadasivuni, SK

    2013-01-01

    Compared to RANS based combustion modelling, the Large Eddy Simulation (LES) technique has recently emerged as a more accurate and very adaptable technique in terms of handling complex turbulent interactions in combustion modelling problems. In this paper application of LES based combustion modelling technique and the validation of models in non-premixed and premixed situations are considered. Two well defined experimental configurations where high quality data are available for validation is...

  9. Root canal morphology of primary molars: a micro-computed tomography study.

    Science.gov (United States)

    Fumes, A C; Sousa-Neto, M D; Leoni, G B; Versiani, M A; da Silva, L A B; da Silva, R A B; Consolaro, A

    2014-10-01

    The aim was to investigate the root canal morphology of primary molar teeth using micro-computed tomography. Primary maxillary (n = 20) and mandibular (n = 20) molars were scanned at a resolution of 16.7 μm and analysed regarding the number, location, volume, area, structure model index (SMI), roundness, diameters, and length of canals, as well as the thickness of dentine in the apical third. Data were statistically compared by using the paired-sample t test, independent-sample t test, and one-way analysis of variance, with the significance level set at 5%. Overall, no statistical differences were found between the canals with respect to length, SMI, dentine thickness, area, roundness, and diameter (p > 0.05). A double canal system was observed in the mesial and mesio-buccal roots of the mandibular and maxillary molars, respectively. The dentine thickness in the internal aspect of the roots was lower than in the external aspect. Cross-sectional evaluation of the roots in the apical third showed flat-shaped canals in the mandibular molars and ribbon- and oval-shaped canals in the maxillary molars. The external and internal anatomy of the primary first molars closely resembles that of the primary second molars. The reported data may help clinicians to obtain a thorough understanding of the morphological variations of root canals in primary molars to overcome problems related to shaping and cleaning procedures, allowing appropriate management strategies for root canal treatment.

  10. Finite element analysis of the mechanical properties of cellular aluminium based on micro-computed tomography

    International Nuclear Information System (INIS)

    Veyhl, C.; Belova, I.V.; Murch, G.E.; Fiedler, T.

    2011-01-01

    Research highlights: elastic and plastic anisotropy is observed for both materials; both show qualitatively similar characteristics with quantitative differences; distinctly higher mechanical properties for the closed-cell foam; the 'big' and 'small' models show good agreement for the closed-cell foam. Abstract: In the present paper, the macroscopic mechanical properties of open-cell M-Pore sponge (porosity of 91-93%) and closed-cell Alporas foam (porosity of 80-86%) are investigated. The complex geometry of these cellular materials is scanned by micro-computed tomography and used in finite element (FE) analysis. The mechanical properties are determined by uni-axial compression simulations in three perpendicular directions (x-, y- and z-directions). M-Pore and Alporas exhibit the same qualitative mechanical characteristics but with quantitative differences. In both cases, strong anisotropy is observed for Young's modulus and the 0.002 offset yield stress. Furthermore, for the investigated relative density range a linear dependence between relative density and mechanical properties is found. Finally, a distinctly higher Young's modulus and 0.002 offset yield stress are observed for Alporas.

  11. In vivo microcomputed tomography evaluation of rat alveolar bone and root resorption during orthodontic tooth movement.

    Science.gov (United States)

    Ru, Nan; Liu, Sean Shih-Yao; Zhuang, Li; Li, Song; Bai, Yuxing

    2013-05-01

    To observe the real-time microarchitecture changes of the alveolar bone and root resorption during orthodontic treatment. A 10 g force was delivered to move the maxillary left first molars mesially in twenty 10-week-old rats for 14 days. The first molar and adjacent alveolar bone were scanned using in vivo microcomputed tomography at the following time points: days 0, 3, 7, and 14. Microarchitecture parameters, including bone volume fraction, structure model index, trabecular thickness, trabecular number, and trabecular separation of alveolar bone, were measured on the compression and tension sides. The total root volume was measured, and the resorption crater volume at each time point was calculated. Univariate repeated measures analysis of variance with Bonferroni corrections was performed to compare the differences in each parameter between time points, with the significance level set at P < .05. Root resorption volume of the mesial root increased significantly on day 7 of orthodontic loading. Real-time root and bone resorption during orthodontic movement can be observed in 3 dimensions using in vivo micro-CT. Alveolar bone resorption and root resorption were observed mostly in the apical third on day 7 on the compression side; bone formation was observed on day 14 on the tension side during orthodontic tooth movement.

  12. Cochlear implant-related three-dimensional characteristics determined by micro-computed tomography reconstruction.

    Science.gov (United States)

    Ni, Yusu; Dai, Peidong; Dai, Chunfu; Li, Huawei

    2017-01-01

    To explore the structural characteristics of the cochlea in three-dimensional (3D) detail using 3D micro-computed tomography (mCT) image reconstruction of the osseous labyrinth, with the aim of improving the structural design of electrodes, the selection of stimulation sites, and the effectiveness of cochlear implantation. Three temporal bones were selected from among adult donors' temporal bone specimens. A micro-CT apparatus (GE eXplore) was used to scan three specimens with a voxel resolution of 45 μm. We obtained about 460 slices/specimen, which produced abundant data. The osseous labyrinth images of three specimens were reconstructed from mCT. The cochlea and its spiral characteristics were measured precisely using Able Software 3D-DOCTOR. The 3D images of the osseous labyrinth, including the cochlea, vestibule, and semicircular canals, were reconstructed. The 3D models of the cochlea showed the spatial relationships and surface structural characteristics. Quantitative data concerning the cochlea and its spiral structural characteristics were analyzed with regard to cochlear implantation. The 3D reconstruction of mCT images clearly displayed the detailed spiral structural characteristics of the osseous labyrinth. Quantitative data regarding the cochlea and its spiral structural characteristics could help to improve electrode structural design, signal processing, and the effectiveness of cochlear implantation. Clin. Anat. 30:39-43, 2017. © 2016 Wiley Periodicals, Inc.

  13. A note on the multi model super ensemble technique for reducing forecast errors

    International Nuclear Information System (INIS)

    Kantha, L.; Carniel, S.; Sclavo, M.

    2008-01-01

    The multi model super ensemble (SE) technique has been used with considerable success to improve meteorological forecasts and is now being applied to ocean models. Although the technique has been shown to produce deterministic forecasts that can be superior to the individual models in the ensemble or a simple multi model ensemble forecast, there is a clear need to understand its strengths and limitations. This paper is an attempt to do so in simple, easily understood contexts. The results demonstrate that the SE forecast is almost always better than the simple ensemble forecast, the degree of improvement depending on the properties of the models in the ensemble. However, the skill of the SE forecast with respect to the true forecast depends on a number of factors, principal among which is the skill of the models in the ensemble. As can be expected, if the ensemble consists of models with poor skill, the SE forecast will also be poor, although better than the ensemble forecast. On the other hand, the inclusion of even a single skillful model in the ensemble increases the forecast skill significantly.
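
    A minimal sketch of the SE idea: regress the truth on the individual model forecasts over a training period, then apply the fitted weights in the forecast period and compare against the simple (unweighted) ensemble mean. The synthetic "truth" and model biases below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "truth" and three biased, noisy model forecasts over 200 days.
    t = np.arange(200)
    truth = 20 + 5 * np.sin(2 * np.pi * t / 50)
    models = np.stack([
        0.8 * truth + 3 + rng.normal(0, 1.0, t.size),   # biased but skilful
        1.1 * truth - 5 + rng.normal(0, 1.5, t.size),
        truth.mean() + rng.normal(0, 4.0, t.size),      # low-skill model
    ], axis=1)

    train, test = slice(0, 120), slice(120, 200)

    # Superensemble: least-squares weights (plus intercept) from training data
    X = np.column_stack([np.ones(120), models[train]])
    wts, *_ = np.linalg.lstsq(X, truth[train], rcond=None)

    Xt = np.column_stack([np.ones(80), models[test]])
    se_fc = Xt @ wts                      # superensemble forecast
    ens_fc = models[test].mean(axis=1)    # simple ensemble mean

    def rmse(f):
        return float(np.sqrt(np.mean((f - truth[test]) ** 2)))

    print(rmse(se_fc), rmse(ens_fc))  # SE error is below the plain ensemble's
    ```

    The regression removes each model's systematic bias and down-weights the low-skill member, which is exactly the mechanism behind the paper's conclusion that the SE forecast is almost always better than the simple ensemble mean.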

  14. A characteristic study of CCF modeling techniques and optimization of CCF defense strategies

    International Nuclear Information System (INIS)

    Kim, Min Chull

    2000-02-01

    Common Cause Failures (CCFs) are among the major contributors to risk and core damage frequency (CDF) from operating nuclear power plants (NPPs). Our study on CCF focused on the following aspects: 1) a characteristic study on the CCF modeling techniques and 2) development of the optimal CCF defense strategy. Firstly, the characteristics of CCF modeling techniques were studied through a sensitivity study of CCF occurrence probability upon system redundancy. The modeling techniques considered in this study include those most widely used worldwide, i.e., the beta factor, MGL, alpha factor, and binomial failure rate models. We found that the MGL and alpha factor models are essentially identical in terms of the CCF probability. Secondly, in the study of CCF defense, the various methods identified in previous studies for defending against CCF were classified into five different categories. Based on these categories, we developed a generic method by which the optimal CCF defense strategy can be selected. The method is not only qualitative but also quantitative in nature: the selection of the optimal strategy among candidates is based on the use of the analytic hierarchy process (AHP). We applied this method to two motor-driven valves for containment sump isolation in the Ulchin 3 and 4 nuclear power plants. The result indicates that the method for developing an optimal CCF defense strategy is effective.
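
    The AHP-based selection step can be sketched as follows: candidate strategies are compared pairwise on Saaty's 1-9 scale, the priority weights are taken as the principal eigenvector of the comparison matrix, and a consistency ratio checks the judgements. The comparison matrix below is hypothetical, not taken from the Ulchin 3 and 4 application.

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix for three candidate CCF defence
    # strategies on Saaty's 1-9 scale (entries invented for illustration).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 3.0],
        [1 / 5, 1 / 3, 1.0],
    ])
    n = A.shape[0]

    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                      # priority weights, summing to 1

    lam_max = eigvals[k].real
    CI = (lam_max - n) / (n - 1)         # consistency index
    CR = CI / 0.58                       # random index RI = 0.58 for n = 3
    print(w, CR)  # strategy 1 ranks first; CR < 0.1 means acceptable consistency
    ```

    In a full application each criterion (e.g. cost, effectiveness, ease of implementation) gets its own comparison matrix, and the criterion-level weights combine the strategy scores into one ranking.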

  15. HIGHLY-ACCURATE MODEL ORDER REDUCTION TECHNIQUE ON A DISCRETE DOMAIN

    Directory of Open Access Journals (Sweden)

    L. D. Ribeiro

    2015-09-01

    Full Text Available In this work, we present a highly-accurate technique of model order reduction applied to staged processes. The proposed method reduces the dimension of the original system based on null values of moment-weighted sums of heat and mass balance residuals on real stages. To compute these sums of weighted residuals, a discrete form of Gauss-Lobatto quadrature was developed, allowing a high degree of accuracy in these calculations. The locations where the residuals are cancelled vary with time and operating conditions, characterizing a desirable adaptive nature of this technique. Balances related to upstream and downstream devices (such as the condenser, reboiler, and feed tray of a distillation column) are considered as boundary conditions of the corresponding difference-differential equation system. The chosen number of moments is the dimension of the reduced model, which is much lower than the dimension of the complete model and does not depend on the size of the original model. Scaling of the discrete independent variable related to the stages was crucial for the computational implementation of the proposed method, avoiding accumulation of round-off errors present even in low-degree polynomial approximations in the original discrete variable. Dynamical simulations of distillation columns were carried out to check the performance of the proposed model order reduction technique. The obtained results show the superiority of the proposed procedure in comparison with the orthogonal collocation method.

  16. Dynamic model reduction: An overview of available techniques with application to power systems

    Directory of Open Access Journals (Sweden)

    Đukić Savo D.

    2012-01-01

    Full Text Available This paper summarises the model reduction techniques used for the reduction of large-scale linear and nonlinear dynamic models, described by the differential and algebraic equations that are commonly used in control theory. The groups of methods discussed in this paper for reduction of the linear dynamic model are based on singular perturbation analysis, modal analysis, singular value decomposition, moment matching and methods based on a combination of singular value decomposition and moment matching. Among the nonlinear dynamic model reduction methods, proper orthogonal decomposition, the trajectory piecewise linear method, balancing-based methods, reduction by optimising system matrices and projection from a linearised model are described. Part of the paper is devoted to the techniques commonly used for reduction (equivalencing) of large-scale power systems, which are based on coherency, synchrony, singular perturbation analysis, modal analysis and identification. Two of the described techniques (the most interesting ones) are applied to the reduction of the commonly used New England 10-generator, 39-bus test power system.
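
    Of the linear techniques listed, modal truncation is the simplest to sketch: diagonalise the state matrix, keep the slowest (dominant) modes, and project the input and output maps accordingly. The fourth-order test system below is invented for illustration; steady-state (DC) gain is used as a crude accuracy check.

    ```python
    import numpy as np

    # Invented stable 4th-order system with well-separated slow and fast modes:
    #   dx/dt = A x + B u,  y = C x
    A = np.diag([-0.5, -1.0, -20.0, -50.0]) + 0.1 * np.ones((4, 4))
    B = np.ones((4, 1))
    C = np.array([[1.0, 0.5, 0.2, 0.1]])

    lam, V = np.linalg.eig(A)
    order = np.argsort(-lam.real)        # slowest (dominant) modes first
    lam, V = lam[order], V[:, order]
    Vinv = np.linalg.inv(V)

    r = 2                                # keep the two slowest modes
    Ar = np.diag(lam[:r])
    Br = (Vinv @ B)[:r]
    Cr = (C @ V)[:, :r]

    dc_full = (-C @ np.linalg.inv(A) @ B).real.item()
    dc_reduced = (-Cr @ np.linalg.inv(Ar) @ Br).real.item()
    print(dc_full, dc_reduced)  # slow modes carry almost all the DC gain
    ```

    Balanced truncation and moment matching refine the same "project onto a dominant subspace" idea, replacing eigenmodes with subspaces chosen for controllability/observability or transfer-function moments.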

  17. A New Profile Learning Model for Recommendation System based on Machine Learning Technique

    Directory of Open Access Journals (Sweden)

    Shereen H. Ali

    2016-03-01

    Full Text Available Recommender systems (RSs have been used to successfully address the information overload problem by providing personalized and targeted recommendations to the end users. RSs are software tools and techniques providing suggestions for items to be of use to a user, hence, they typically apply techniques and methodologies from Data Mining. The main contribution of this paper is to introduce a new user profile learning model to promote the recommendation accuracy of vertical recommendation systems. The proposed profile learning model employs the vertical classifier that has been used in multi classification module of the Intelligent Adaptive Vertical Recommendation (IAVR system to discover the user’s area of interest, and then build the user’s profile accordingly. Experimental results have proven the effectiveness of the proposed profile learning model, which accordingly will promote the recommendation accuracy.

  18. Effects of pump recycling technique on stimulated Brillouin scattering threshold: a theoretical model.

    Science.gov (United States)

    Al-Asadi, H A; Al-Mansoori, M H; Ajiya, M; Hitam, S; Saripan, M I; Mahdi, M A

    2010-10-11

    We develop a theoretical model that can be used to predict stimulated Brillouin scattering (SBS) threshold in optical fibers that arises through the effect of Brillouin pump recycling technique. Obtained simulation results from our model are in close agreement with our experimental results. The developed model utilizes single mode optical fiber of different lengths as the Brillouin gain media. For 5-km long single mode fiber, the calculated threshold power for SBS is about 16 mW for conventional technique. This value is reduced to about 8 mW when the residual Brillouin pump is recycled at the end of the fiber. The decrement of SBS threshold is due to longer interaction lengths between Brillouin pump and Stokes wave.
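
    For context, the widely used order-of-magnitude estimate of the SBS threshold (Smith's criterion) lands close to the ~16 mW figure quoted above for 5 km of fibre under the conventional scheme, using typical textbook single-mode-fibre parameters; the paper's model of the pump-recycling effect is more detailed than this formula, and the numbers below are not taken from it.

    ```python
    import numpy as np

    # Smith's classic criterion for the SBS threshold:
    #   P_th ~ 21 * K * A_eff / (g_B * L_eff),  L_eff = (1 - exp(-alpha L)) / alpha
    # Typical textbook values for standard single-mode fibre (illustrative):
    K = 2.0          # polarisation factor, between 1 and 2
    A_eff = 80e-12   # effective mode area, m^2
    g_B = 5e-11      # peak Brillouin gain coefficient, m/W
    alpha = 0.2 * np.log(10) / 10 / 1e3   # 0.2 dB/km as attenuation in 1/m

    L = 5e3                                # 5 km of fibre, as in the abstract
    L_eff = (1 - np.exp(-alpha * L)) / alpha
    P_th = 21 * K * A_eff / (g_B * L_eff)
    print(P_th * 1e3)  # threshold in mW, same order as the quoted ~16 mW
    ```

    Recycling the residual pump effectively increases the Brillouin gain seen by the Stokes wave, which is how the paper accounts for the roughly halved threshold.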

  19. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned better to reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data is used for the updating process. Following an introduction a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies into a sound statistical basis; and • response surface methods and expectation m...

  20. Optimization models and techniques for implementation and pricing of electricity markets

    International Nuclear Information System (INIS)

    Madrigal Martinez, M.

    2001-01-01

    The operation and planning of vertically integrated electric power systems can be optimized using models that simulate solutions to problems. As the electric power industry is going through a period of restructuring, there is a need for new optimization tools. This thesis describes the importance of optimization tools and presents techniques for implementing them. It also presents methods for pricing primary electricity markets. Three modeling groups are studied. The first considers a simplified continuous and discrete model for power pool auctions. The second considers the unit commitment problem, and the third makes use of a new type of linear network-constrained clearing system model for daily markets for power and spinning reserve. The newly proposed model considers bids for supply and demand and bilateral contracts. It is a direct current model for the transmission network
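
    A single-period power pool auction of the simplified kind mentioned above can be sketched as a merit-order clearing: stack supply bids cheapest-first until demand is met, with the marginal (last accepted) bid setting the uniform clearing price. The bid figures are illustrative; the thesis's models additionally handle unit commitment, spinning reserve, bilateral contracts, and network constraints.

    ```python
    # Minimal single-period merit-order clearing of a power pool auction.
    def clear_market(supply_bids, demand):
        """supply_bids: list of (price $/MWh, quantity MW). Returns the uniform
        clearing price and the accepted (price, dispatched MW) schedule."""
        accepted, price, remaining = [], None, demand
        for p, q in sorted(supply_bids):      # cheapest bids first
            if remaining <= 0:
                break
            take = min(q, remaining)
            accepted.append((p, take))
            price = p                          # marginal bid sets the price
            remaining -= take
        if remaining > 0:
            raise ValueError("insufficient supply to meet demand")
        return price, accepted

    bids = [(35.0, 50), (20.0, 100), (28.0, 80), (45.0, 60)]  # ($/MWh, MW)
    price, dispatch = clear_market(bids, demand=200)
    print(price, dispatch)  # 35.0 [(20.0, 100), (28.0, 80), (35.0, 20)]
    ```

    Adding transmission limits turns this greedy stack into the linear network-constrained clearing problem the thesis formulates, where prices can differ by location.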