WorldWideScience

Sample records for preliminary computational analysis

  1. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk from criminal hackers. Presently, most vulnerability research uses data from software vendors and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data are always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
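
    The reported distribution fits can be sketched with standard tools. A minimal illustration, assuming synthetic stand-in data rather than the authors' vulnerability feed:

    ```python
    # Hypothetical sketch: fit the two distributions the paper reports as
    # best-fitting for arrival and deletion rates; data are made-up stand-ins.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    arrivals = rng.lognormal(mean=2.0, sigma=0.5, size=500)   # stand-in arrival rates (> 0)
    deletions = rng.exponential(scale=5.0, size=500)          # stand-in deletion rates

    # Log-normal fit for arrivals (location pinned at 0, since rates are positive)
    shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
    print(f"arrivals ~ LogNormal(mu={np.log(scale):.2f}, sigma={shape:.2f})")

    # Exponential fit for deletions; the >22% zero-valued deletion rates noted
    # in the paper would need explicit handling (e.g., a zero-inflated model)
    loc_e, scale_e = stats.expon.fit(deletions, floc=0)
    print(f"deletions ~ Exponential(rate={1 / scale_e:.2f})")
    ```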

  2. Preliminary analysis of the MER magnetic properties experiment using a computational fluid dynamics model

    DEFF Research Database (Denmark)

    Kinch, K.M.; Merrison, J.P.; Gunnlaugsson, H.P.;

    2006-01-01

    Motivated by questions raised by the magnetic properties experiments on the NASA Mars Pathfinder and Mars Exploration Rover (MER) missions, we have studied in detail the capture of airborne magnetic dust by permanent magnets using a computational fluid dynamics (CFD) model supported by laboratory...

  3. Computer Aided Morphological Analysis for maxillo-facial diagnostic: a preliminary study

    OpenAIRE

    2008-01-01

    This article compares most of the three-dimensional (3D) morphometric methods currently proposed in the technical literature to evaluate their morphological informative value, while applying them to a case study of five patients affected by malocclusion. The compared methods are: conventional cephalometric analysis (CCA), generalised Procrustes superimposition (GPS) with principal-components analysis (PCA), thin-plate spline analysis (TPS), multisectional spline (MS) and clearan...

  4. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. A Feasibility and Preliminary Design Study. Interim Report.

    Science.gov (United States)

    Computation Planning, Inc., Bethesda, MD.

    A feasibility analysis of a single integrated central computer system for secondary schools and junior colleges finds that a central computing facility capable of serving 50 schools with a total enrollment of 100,000 students is feasible at a cost of $18 per student per year. The recommended system is a multiprogrammed-batch operation. Preliminary…

  5. UVISS preliminary visibility analysis

    DEFF Research Database (Denmark)

    Betto, Maurizio

    1998-01-01

    The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part of the work...

  6. Age estimation by pulp-to-tooth area ratio using cone-beam computed tomography: A preliminary analysis

    Science.gov (United States)

    Rai, Arpita; Acharya, Ashith B.; Naikmasur, Venkatesh G.

    2016-01-01

    Background: Age estimation of living or deceased individuals is an important aspect of forensic sciences. Conventionally, the pulp-to-tooth area ratio (PTR) measured from periapical radiographs has been utilized as a nondestructive method of age estimation. Cone-beam computed tomography (CBCT) is a new method to acquire three-dimensional images of the teeth in living individuals. Aims: The present study investigated age estimation based on PTR of the maxillary canines measured in three planes obtained from CBCT image data. Settings and Design: Sixty subjects aged 20–85 years were included in the study. Materials and Methods: For each tooth, mid-sagittal and mid-coronal sections and three axial sections, at the cementoenamel junction (CEJ), at one-fourth root level from the CEJ, and at mid-root, were assessed. PTR was calculated using AutoCAD software after outlining the pulp and tooth. Statistical Analysis Used: All statistical analyses were performed using SPSS 17.0. Results and Conclusions: Linear regression analysis showed that only PTR in the axial plane at the CEJ had a significant age correlation (r = 0.32; P < 0.05). This is probably because of the clearer demarcation of the pulp and tooth outlines at this level. PMID:28123269
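
    The regression step is straightforward to reproduce in outline; a minimal sketch with invented PTR/age pairs standing in for the study's measurements:

    ```python
    # Hypothetical sketch of the reported analysis: simple linear regression of
    # age on the pulp-to-tooth area ratio (PTR) in the axial plane at the CEJ.
    import numpy as np
    from scipy import stats

    ptr_cej = np.array([0.18, 0.15, 0.12, 0.10, 0.08, 0.07])  # placeholder PTR values
    age = np.array([25.0, 34.0, 46.0, 58.0, 67.0, 79.0])      # placeholder ages

    res = stats.linregress(ptr_cej, age)
    print(f"r = {res.rvalue:.2f}, p = {res.pvalue:.3f}")
    print(f"age estimate = {res.intercept:.1f} + ({res.slope:.1f}) * PTR")
    ```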

  7. Computer code and users' guide for the preliminary analysis of dual-mode space nuclear fission solid core power and propulsion systems, NUROC3A. AMS report No. 1239b

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, R.A.; Smith, W.W.

    1976-06-30

    The three-volume report describes a dual-mode nuclear space power and propulsion system concept that employs an advanced solid-core nuclear fission reactor coupled via heat pipes to one of several electric power conversion systems. The second volume describes the computer code and users' guide for the preliminary analysis of the system.

  8. Preliminary 3D computational analysis of the relationship between aortic displacement force and direction of endograft movement.

    Science.gov (United States)

    Figueroa, C Alberto; Taylor, Charles A; Yeh, Victoria; Chiou, Allen J; Gorrepati, Madhu L; Zarins, Christopher K

    2010-06-01

    Endograft migration is usually described as a downward displacement of the endograft with respect to the renal arteries. However, change in endograft position is actually a complex process in three-dimensional (3D) space. Currently, there are no established techniques to define such positional changes over time. The purpose of this study is to determine whether the direction of aortic endograft movement as observed in follow-up computed tomography (CT) scans is related to the directional displacement force acting on the endograft. We quantitated the 3D positional change over time of five abdominal endografts by determining the endograft centroid at baseline (postoperative scan) and on follow-up CT scans. The time interval between CT scans for the 5 patients ranged from 8 months to 8 years. We then used 3D image segmentation and computational fluid dynamics (CFD) techniques to quantitate the pulsatile displacement force (in Newtons [N]) acting on the endografts in the postoperative configurations. Finally, we calculated a correlation metric between the direction of the displacement force vector and the endograft movement by computing the cosine of the angle of these two vectors. The average 3D movement of the endograft centroid was 18 mm (range, 9-29 mm) with greater movement in patients with longer follow-up times. In all cases, the movement of the endograft had significant components in all three spatial directions: Two of the endografts had the largest component of movement in the transverse direction, whereas three endografts had the largest component of movement in the axial direction. The magnitude and orientation of the endograft displacement force varied depending on aortic angulation and hemodynamic conditions. The average magnitude of displacement force for all endografts was 5.8 N (range, 3.7-9.5 N). The orientation of displacement force was in general perpendicular to the greatest curvature of the endograft. The average correlation metric, defined as the
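
    The correlation metric described above is the cosine of the angle between the force and movement vectors; a minimal sketch with made-up 3D components:

    ```python
    # Cosine of the angle between the displacement-force vector and the
    # endograft movement vector; the two vectors below are invented examples.
    import numpy as np

    force = np.array([3.1, -1.2, 4.4])      # displacement force components, N (stand-in)
    movement = np.array([6.0, -2.0, 10.0])  # centroid movement components, mm (stand-in)

    cosine = np.dot(force, movement) / (np.linalg.norm(force) * np.linalg.norm(movement))
    print(f"correlation metric = {cosine:.2f}")  # +1 = same direction, -1 = opposite
    ```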

  9. Motion analysis of total cervical disc replacements using computed tomography: Preliminary experience with nine patients and a model

    Energy Technology Data Exchange (ETDEWEB)

    Svedmark, Per (Div. of Orthopedics, Dept. of Molecular Medicine and Surgery, Karolinska Institutet, Stockholm (Sweden); Stockholm Spine Center, Lowenstromska Hospital, Stockholm (Sweden)), email: per.svedmark@spinecenter.se; Lundh, Fredrik; Olivecrona, Henrik (Div. of Orthopedics, Dept. of Molecular Medicine and Surgery, Karolinska Institutet, Stockholm (Sweden)); Nemeth, Gunnar (Capio group, Stockholm (Sweden)); Noz, Marilyn E. (Dept. of Radiology, New York Univ. School of Medicine, New York (United States)); Maguire Jr, Gerald Q. (School of Information and Communication Technology, Royal Inst. of Technology, Kista (Sweden)); Zeleznik, Michael P. (Saya Systems Inc., Salt Lake City (United States))

    2011-12-15

    Background. Cervical total disc replacement (CTDR) is an alternative to anterior fusion. Therefore, it is desirable to have an accurate in vivo measurement of prosthetic kinematics and assessment of implant stability relative to the adjacent vertebrae. Purpose. To devise an in vivo CT-based method to analyze the kinematics of cervical total disc replacements (CTDR), specifically of two prosthetic components between two CT scans obtained under different conditions. Material and Methods. Nine patients with CTDR were scanned in flexion and extension of the cervical spine using a clinical CT scanner with a routine low-dose protocol. The flexion and extension CT volume data were spatially registered, and the prosthetic kinematics of two prosthetic components, an upper and a lower, was calculated and expressed in Euler angles and orthogonal linear translations relative to the upper component. For accuracy analysis, a cervical spine model incorporating the same disc replacement as used in the patients was also scanned and processed in the same manner. Results. Analysis of both the model and patients showed good repeatability, i.e. within 2 standard deviations of the mean using the 95% limits of agreement with no overlapping confidence intervals. The accuracy analysis showed that the median error was close to zero. Conclusion. The mobility of the cervical spine after total disc replacement can be effectively measured in vivo using CT. This method requires appropriate patient positioning and scan parameters to achieve suitable image quality.
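
    The kinematic decomposition described above (Euler angles plus translations from a registration transform) can be sketched as follows; the 4x4 transform is a made-up example, not patient data:

    ```python
    # Extract Euler angles and translation from a hypothetical rigid-body
    # transform relating a prosthetic component between two registered CT scans.
    import numpy as np
    from scipy.spatial.transform import Rotation

    T = np.array([[0.9962, -0.0872, 0.0,  1.2],   # invented transform: 5-degree
                  [0.0872,  0.9962, 0.0, -0.4],   # rotation about z, plus a
                  [0.0,     0.0,    1.0,  0.7],   # small translation (mm)
                  [0.0,     0.0,    0.0,  1.0]])

    angles = Rotation.from_matrix(T[:3, :3]).as_euler("xyz", degrees=True)
    print("Euler angles (deg):", np.round(angles, 2))
    print("translation (mm):", T[:3, 3])
    ```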

  10. Concept Overview & Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, Mark

    2017-07-12

    'H2@Scale' is an opportunity for wide-scale use of hydrogen as an intermediate that carries energy from various production options to multiple uses. It is based on identifying and developing opportunities for low-cost hydrogen production and investigating opportunities for using that hydrogen across the electricity, industrial, and transportation sectors. One of the key production opportunities is use of low-cost electricity that may be generated under high penetrations of variable renewable generators such as wind and solar photovoltaics. The technical potential demand for hydrogen across the sectors is 60 million metric tons per year. The U.S. has sufficient domestic renewable resources that each resource type alone could meet that demand, and it could readily do so using a portfolio of generation options. This presentation provides an overview of the concept and the technical potential demand and resources. It also motivates analysis and research on H2@Scale.

  11. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in...
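
    The governing equation named above is standard; for reference, the Landau-Lifshitz-Gilbert equation in its usual Gilbert form (conventional symbols, not quoted from the report):

    ```latex
    % Magnetization dynamics M(r, t): gamma = gyromagnetic ratio, alpha = Gilbert
    % damping, M_s = saturation magnetization, H_eff = effective field
    \frac{\partial \mathbf{M}}{\partial t}
      = -\gamma \, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}}
      + \frac{\alpha}{M_s} \, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}
    ```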

  12. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  13. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi...

  14. Text analysis and computers

    OpenAIRE

    1995-01-01

    Content: Erhard Mergenthaler: Computer-assisted content analysis (3-32); Udo Kelle: Computer-aided qualitative data analysis: an overview (33-63); Christian Mair: Machine-readable text corpora and the linguistic description of languages (64-75); Jürgen Krause: Principles of content analysis for information retrieval systems (76-99); Conference Abstracts (100-131).

  15. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  16. Modeling the complete Otto cycle: Preliminary version. [computer programming]

    Science.gov (United States)

    Zeleznik, F. J.; Mcbride, B. J.

    1977-01-01

    A description is given of the equations and the computer program being developed to model the complete Otto cycle. The program incorporates such important features as: (1) heat transfer, (2) finite combustion rates, (3) complete chemical kinetics in the burned gas, (4) exhaust gas recirculation, and (5) manifold vacuum or supercharging. Changes in thermodynamic, kinetic and transport data as well as model parameters can be made without reprogramming. Preliminary calculations indicate that: (1) chemistry and heat transfer significantly affect composition and performance, (2) there seems to be a strong interaction among model parameters, and (3) a number of cycles must be calculated in order to obtain steady-state conditions.
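
    For orientation, the air-standard idealization that such detailed cycle models refine can be stated in one line; this is the textbook formula, not the report's model:

    ```latex
    % Ideal Otto-cycle thermal efficiency for compression ratio r and
    % specific-heat ratio gamma; e.g., r = 8, gamma = 1.4 gives eta ~ 0.565
    \eta_{\mathrm{Otto}} = 1 - \frac{1}{r^{\gamma - 1}}
    ```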

  17. Computer-aided hepatic tumour ablation requirements and preliminary results

    CERN Document Server

    Voirin, D; Amavizca, M; Letoublon, C; Troccaz, J; Voirin, David; Payan, Yohan; Amavizca, Miriam; Letoublon, Christian; Troccaz, Jocelyne

    2002-01-01

    Surgical resection of hepatic tumours is not always possible, since it depends on several factors, among them the tumour's location within the liver's functional segments. Alternative techniques consist of the local use of chemical or physical agents to destroy the tumour. Radio frequency and cryosurgical ablations are examples of such alternative techniques that may be performed percutaneously. This requires a precise localisation of the tumour during ablation. Computer-assisted surgery tools may be used in conjunction with these new ablation techniques to improve the therapeutic efficiency, whilst they benefit from minimal invasiveness. This paper introduces the principles of a system for computer-assisted hepatic tumour ablation and describes preliminary experiments focusing on data registration evaluation. To keep close to conventional protocols, we consider registration of pre-operative CT or MRI data to intra-operative echographic data.

  18. Analysis of vector models in quantification of artifacts produced by standard prosthetic inlays in Cone-Beam Computed Tomography (CBCT)--a preliminary study.

    Science.gov (United States)

    Różyło-Kalinowska, Ingrid; Miechowicz, Sławomir; Sarna-Boś, Katarzyna; Borowicz, Janusz; Kalinowski, Paweł

    2014-11-17

    Cone-beam computed tomography (CBCT) is a relatively new, but highly efficient imaging method applied first in dentistry in 1998. However, the quality of the obtained slices depends among other things on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). Ceramic inlay with zirconium dioxide (Cera Post) as well as epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. Carbon fiber inlay did not considerably affect the image quality.
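
    The ROI definition above (voxels within a fixed Hounsfield-unit range) is a simple threshold; a minimal sketch on a made-up volume, assuming real data would come from a DICOM reader:

    ```python
    # Build the ROI mask for voxels in the 109-3071 HU range cited above;
    # the volume here is random stand-in data, not a CBCT scan.
    import numpy as np

    volume = np.random.default_rng(1).integers(-1000, 3072, size=(64, 64, 64))
    roi_mask = (volume >= 109) & (volume <= 3071)   # inlay plus artifact voxels

    print(f"ROI voxels: {roi_mask.sum()} of {volume.size}")
    ```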

  19. Analysis of Vector Models in Quantification of Artifacts Produced by Standard Prosthetic Inlays in Cone-Beam Computed Tomography (CBCT) – a Preliminary Study

    Directory of Open Access Journals (Sweden)

    Ingrid Różyło-Kalinowska

    2014-11-01

    Cone-beam computed tomography (CBCT) is a relatively new, but highly efficient imaging method applied first in dentistry in 1998. However, the quality of the obtained slices depends among other things on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). Ceramic inlay with zirconium dioxide (Cera Post) as well as epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. Carbon fiber inlay did not considerably affect the image quality.

  1. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report presents a preliminary safety analysis methodology for the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) with funding from the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against the licensing practices of the PWRs operating in Korea and the KNGR (Korean Next Generation Reactor) under construction. A detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary-to-secondary pipe break and the small-break loss-of-coolant accident. The preliminary SMART safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for review of the SMART licensing safety analysis methodology. It is therefore recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  2. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern recognition, machine learning, topology, algebra and signal processing. Many of the methods described draw... The book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines, and a state-of-the-art reference for practitioners in the music technology industry.

  3. Computational Analysis of Behavior.

    Science.gov (United States)

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  4. Preliminary Analysis of Google+'s Privacy

    OpenAIRE

    2011-01-01

    In this paper we provide a preliminary analysis of Google+ privacy. We identified that Google+ shares photo metadata with users who can access the photograph and discuss its potential impact on privacy. We also identified that Google+ encourages the provision of other names including maiden name, which may help criminals performing identity theft. We show that Facebook lists are a superset of Google+ circles, both functionally and logically, even though Google+ provides a better user interfac...

  5. Trenton ICES: demonstration of a grid connected integrated community energy system. Phase II. Volume 3. Preliminary design of ICES system and analysis of community ownership: computer printouts

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-22

    This volume supplements Vol. 2 and consists entirely of computer printouts. The report consists of three parts: (1) hourly log of plant simulation based on 1982 ICES Community, with thermal storage, on-peak and off-peak electric generation, and 80% maximum kW trip-off; (2) same as (1) except without thermal storage; and (3) hourly load and demand profiles--1979, 1980, and 1982 ICES communities.

  6. Simulation of the preliminary General Electric SP-100 space reactor concept using the ATHENA computer code

    Science.gov (United States)

    Fletcher, C. D.

    The capability to perform thermal-hydraulic analyses of a space reactor using the ATHENA computer code is demonstrated. The fast reactor, liquid-lithium coolant loops, and lithium-filled heat pipes of the preliminary General Electric SP-100 design were modeled with ATHENA. Two demonstration transient calculations were performed simulating accident conditions. Calculated results are available for display using the Nuclear Plant Analyzer color graphics analysis tool in addition to traditional plots. ATHENA-calculated results appear reasonable, both for steady state full power conditions, and for the two transients. This analysis represents the first known transient thermal-hydraulic simulation using an integral space reactor system model incorporating heat pipes.

  7. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  8. Repository Subsurface Preliminary Fire Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  9. Analyzing high energy physics data using database computing: Preliminary report

    Science.gov (United States)

    Baden, Andrew; Day, Chris; Grossman, Robert; Lifka, Dave; Lusk, Ewing; May, Edward; Price, Larry

    1991-01-01

    A proof of concept system is described for analyzing high energy physics (HEP) data using database computing. The system is designed to scale up to the size required for HEP experiments at the Superconducting SuperCollider (SSC) lab. These experiments will require collecting and analyzing approximately 10 to 100 million 'events' per year during proton colliding beam collisions. Each 'event' consists of a set of vectors with a total length of approx. one megabyte. This represents an increase of approx. 2 to 3 orders of magnitude in the amount of data accumulated by present HEP experiments. The system is called the HEPDBC System (High Energy Physics Database Computing System). At present, the Mark 0 HEPDBC System is complete and can produce analysis of HEP experimental data approx. an order of magnitude faster than current production software on data sets of approx. 1 GB. The Mark 1 HEPDBC System is currently undergoing testing and is designed to analyze data sets 10 to 100 times larger.

  10. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    Science.gov (United States)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  11. BASE Flexible Array Preliminary Lithospheric Structure Analysis

    Science.gov (United States)

    Yeck, W. L.; Sheehan, A. F.; Anderson, M. L.; Siddoway, C. S.; Erslev, E.; Harder, S. H.; Miller, K. C.

    2009-12-01

    The Bighorns Arch Seismic Experiment (BASE) is a Flexible Array experiment integrated with EarthScope. The goal of BASE is to develop a better understanding of how basement-involved foreland arches form and what their link is to plate tectonic processes. To achieve this goal, the crustal structure under the Bighorn Mountain range, Bighorn Basin, and Powder River Basin of northern Wyoming and southern Montana is investigated through the deployment of 35 broadband seismometers, 200 short period seismometers, 1600 “Texan” instruments using active sources and 800 “Texan” instruments monitoring passive sources, together with field structural analysis of brittle structures. The novel combination of these approaches and anticipated simultaneous data inversion will give a detailed structural crustal image of the Bighorn region at all levels of the crust. Four models have been proposed for the formation of the Bighorn foreland arch: subhorizontal detachment within the crust, lithospheric buckling, pure shear lithospheric thickening, and fault blocks defined by lithosphere-penetrating thrust faults. During the summer of 2009, we deployed 35 broadband instruments, which have already recorded several magnitude 7+ teleseismic events. Through P wave receiver function analysis of these 35 stations folded in with many EarthScope Transportable Array stations in the region, we present a preliminary map of the Mohorovicic discontinuity. This crustal map is our first test of how the unique Moho geometries predicted by the four hypothesized models of basement involved arches fit seismic observations for the Bighorn Mountains. In addition, shear-wave splitting analysis for our first few recorded teleseisms helps us determine if strong lithospheric deformation is preserved under the range. These analyses help lead us to our final goal, a complete 4D (3D spatial plus temporal) lithospheric-scale model of arch formation which will advance our understanding of the mechanisms...

  12. Computer analysis of railcar vibrations

    Science.gov (United States)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.

  13. Dual-fuel, dual-throat engine preliminary analysis

    Science.gov (United States)

    Obrien, C. J.

    1979-01-01

    A propulsion system analysis of the dual fuel, dual throat engine for launch vehicle applications was conducted. Basic dual throat engine characterization data were obtained to allow vehicle optimization studies to be conducted. A preliminary baseline engine system was defined.

  14. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...
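
    As a flavor of the kind of worked example mentioned in the blurb, the stationary distribution of a discrete-time Markov chain can be computed in a few lines; the transition matrix here is invented:

    ```python
    # Stationary distribution pi of a 3-state discrete-time Markov chain:
    # pi solves pi P = pi, i.e. pi is the eigenvector of P^T for eigenvalue 1.
    import numpy as np

    P = np.array([[0.9, 0.1, 0.0],    # made-up row-stochastic transition matrix
                  [0.2, 0.7, 0.1],
                  [0.0, 0.3, 0.7]])

    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    pi /= pi.sum()
    print("stationary distribution:", np.round(pi, 3))
    ```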

  15. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news...

  16. [Tuscan Chronic Care Model: a preliminary analysis].

    Science.gov (United States)

    Barbato, Angelo; Meggiolaro, Angela; Rossi, Luigi; Fioravanti, C; Palermita, F; La Torre, Giuseppe

    2015-01-01

    The aim of this study is to present a preliminary analysis of the efficacy and effectiveness of a model of care for the chronically ill (Chronic Care Model, CCM). The analysis took into account 106 territorial modules, 1,016 general practitioners and 1,228,595 patients. The diagnostic and therapeutic pathways activated (PDTA) involved four chronic conditions, selected according to their prevalence and incidence in the Tuscany Region: Diabetes Mellitus (DM), Heart Failure (SC), Chronic Obstructive Pulmonary Disease (COPD) and stroke. Six epidemiological indicators of process and output were selected in order to measure the model of care performed, before and after its application: adherence to the pathology-specific follow-up (use of clinical and laboratory indicators), average annual per capita expenditure (in euros) for laboratory and instrumental diagnostic tests, average annual per capita expenditure for specialist visits, hospitalization rate for diseases related to the main pathology, hospitalization rate for long-term complications, and rate of access to the emergency department (ED). Data were collected through the database; the differences before and after the intervention, and between exposed and unexposed, were analyzed by the "Before-After (Controlled and Uncontrolled) Studies" method. The impact of the intervention was calculated as DD (difference of the differences). DM management showed increased adherence to follow-up (DD: +8.1%) and use of laboratory diagnostics (DD: +4.9 €/year/pc), less hospitalization for long-term complications and for endocrine-related diseases (DD: 5.8/1000 and +1.2/1000, respectively), and finally a smaller increase in access to the ED (DD: -1.6/1000), despite a slight increase in specialist visits (DD: +0.38 €/year/pc). The management of SC initially showed rising adherence to follow-up (DD: +2.3%), a decrease in specialist visits (DD: 1.03 €/year/pc), and less hospitalization and access to the ED for exacerbations (DD: -4.4/1000 and DD: -6...
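
    The impact measure used above is a difference of differences; a toy computation, with invented before/after values chosen only to show the arithmetic:

    ```python
    # DD = (post_exposed - pre_exposed) - (post_control - pre_control)
    def difference_of_differences(pre_exposed, post_exposed, pre_control, post_control):
        return (post_exposed - pre_exposed) - (post_control - pre_control)

    # e.g., follow-up adherence (%) before/after activating a care pathway
    # (numbers invented for illustration):
    print(round(difference_of_differences(55.0, 65.0, 54.0, 55.9), 1))  # -> 8.1
    ```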

  17. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  18. Computational Aeroacoustic Analysis System Development

    Science.gov (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed
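
    The acoustic solver mentioned above is governed by the linearized Euler equations; for reference, their one-dimensional form about a uniform mean flow (standard textbook form, not quoted from the paper):

    ```latex
    % Perturbations p', u' about a mean flow with speed u_0, density rho_0,
    % and sound speed c:
    \frac{\partial p'}{\partial t} + u_0 \frac{\partial p'}{\partial x}
      + \rho_0 c^2 \frac{\partial u'}{\partial x} = 0, \qquad
    \frac{\partial u'}{\partial t} + u_0 \frac{\partial u'}{\partial x}
      + \frac{1}{\rho_0} \frac{\partial p'}{\partial x} = 0
    ```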

  19. Developing ontological model of computational linear algebra - preliminary considerations

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for application of ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware, specifically, help the user to choose optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge about the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" if her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without required technical knowledge about the Grid middleware. To achieve the mentioned goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriterial decision making. As a starting point, an area of computational linear algebra was selected to be modeled; however, the paper presents a general approach that should be easily extendable to other domains.

  20. Preliminary Economic Analysis of the TRIFOOD System.

    Science.gov (United States)

    1984-11-20

  1. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    Science.gov (United States)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Resilience of coastal utilities after earthquakes and tsunamis is of major importance for efficient and proper rescue and recovery operations soon after the disaster. Vulnerability assessment of coastal areas under extreme events is likewise essential for preparedness and the development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities such as ports, small craft harbors, terminals for maritime transportation, waterfront roads and business centers, mainly on the north coast of the Marmara Sea in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for the Haydarpasa port in Istanbul have been presented previously by Cankaya et al. (2015) and Aytore et al. (2015) in the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Marmara Sea from 28.77°E to 28.89°E. The high-resolution spatial dataset of the Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructures in the district are collated and utilized for tsunami numerical modeling and the subsequent vulnerability analysis. The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability and resilience assessment parameters for the district are defined and scored by implementing a GIS-based TVA with appropriate MCDA methods. The risk level is computed using tsunami intensity (level of flow depth from simulations) and TVA results at every location in Bakirkoy district. The preliminary results are presented and discussed.
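
    The scoring step combines several criteria into one number; a minimal weighted-sum MCDA sketch, in which the criteria, weights, and scores are invented placeholders rather than the study's scheme:

    ```python
    # Weighted-sum multi-criteria score per location; all values are made up.
    import numpy as np

    weights = np.array([0.5, 0.3, 0.2])    # e.g., flow depth, building density, road access
    scores = np.array([[0.9, 0.6, 0.4],    # normalized 0-1 criterion scores,
                       [0.3, 0.8, 0.7]])   # one row per location

    vulnerability = scores @ weights       # higher = more vulnerable
    print(np.round(vulnerability, 2))
    ```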

  2. Preliminary safety design analysis of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Soo Dong; Kwon, Y. M.; Kim, K. D. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-03-01

    The national long-term R and D program updated in 1997 requires the Korea Atomic Energy Research Institute (KAERI) to complete by the year 2006 the basic design of the Korea Advanced Liquid Metal Reactor (KALIMER), along with supporting R and D work, with the capability of resolving the issue of spent fuel storage as well as with significantly enhanced safety. KALIMER is a 150 MWe pool-type sodium-cooled prototype reactor that uses metallic fuel. The conceptual design is currently under way to establish a self-consistent design meeting a set of the major safety design requirements for accident prevention. Current emphases include inherent and passive means of negative reactivity insertion and decay heat removal, high shutdown reliability, prevention of and protection from sodium chemical reaction, and high seismic margin, among others. All of these requirements affect the reactor design significantly and involve supporting R and D programs of substance. This document first introduces a set of safety design requirements and accident evaluation criteria established for the conceptual design of KALIMER and then summarizes some of the preliminary results of engineering and design analyses performed for the safety of KALIMER. 19 refs., 19 figs., 6 tabs. (Author)

  3. Preliminary analysis of patent trends for magnetic fusion technology

    Energy Technology Data Exchange (ETDEWEB)

    Levine, L.O.; Ashton, W.B.; Campbell, R.S.

    1984-02-01

    This study presents a preliminary analysis of development trends in magnetic fusion technology based on data from US patents. The research is limited to identification and description of general patent activity and ownership characteristics for 373 patents. The results suggest that more detailed studies of fusion patents could provide useful R and D planning information.

  4. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  5. Preliminary analysis of alternative fuel cycles for proliferation evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Steindler, M. J.; Ripfel, H. C.F.; Rainey, R. H.

    1977-01-01

    The ERDA Division of Nuclear Research and Applications proposed 67 nuclear fuel cycles for assessment as to their nonproliferation potential. The object of the assessment was to determine which fuel cycles pose inherently low risk for nuclear weapon proliferation while retaining the major benefits of nuclear energy. This report is a preliminary analysis of these fuel cycles to develop the fuel-recycle data that will complement reactor data, environmental data, and political considerations, which must be included in the overall evaluation. This report presents the preliminary evaluations from ANL, HEDL, ORNL, and SRL and is the basis for a continuing in-depth study. (DLC)

  6. Preliminary Integrated Safety Analysis Status Report

    Energy Technology Data Exchange (ETDEWEB)

    D. Gwyn

    2001-04-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001.

  7. Introduction to numerical analysis and scientific computing

    CERN Document Server

    Nassif, Nabil

    2013-01-01

    Computer Number Systems and Floating Point Arithmetic: Introduction; Conversion from Base 10 to Base 2; Conversion from Base 2 to Base 10; Normalized Floating Point Systems; Floating Point Operations; Computing in a Floating Point System. Finding Roots of Real Single-Valued Functions: Introduction; How to Locate the Roots of a Function; The Bisection Method; Newton's Method; The Secant Method. Solving Systems of Linear Equations by Gaussian Elimination: Mathematical Preliminaries; Computer Storage for Matrices (Data Structures); Back Substitution for Upper Triangular Systems; Gauss Reduction; LU Decomposition. Polynomia...
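
    As a taste of the root-finding chapter, the bisection method listed in the contents fits in a few lines; a generic sketch, not the book's code:

    ```python
    # Bisection: halve a bracketing interval [a, b] with f(a) * f(b) < 0 until
    # it is smaller than tol; f must be continuous on [a, b].
    def bisect(f, a, b, tol=1e-10, max_iter=200):
        fa, fb = f(a), f(b)
        if fa * fb > 0:
            raise ValueError("f(a) and f(b) must have opposite signs")
        for _ in range(max_iter):
            m = (a + b) / 2.0
            fm = f(m)
            if fm == 0.0 or (b - a) / 2.0 < tol:
                return m
            if fa * fm < 0:
                b, fb = m, fm
            else:
                a, fa = m, fm
        return (a + b) / 2.0

    print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))  # ~1.41421356 (sqrt 2)
    ```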

  8. Preliminary analysis of GASIS user needs

    Energy Technology Data Exchange (ETDEWEB)

    Vidas, E.H.; Hugman, R.H.

    1993-12-31

    The GASIS (Gas Information System) project is a three-year effort to develop a personal computer-based (CD-ROM) natural gas database and information system for the United States. GASIS will have two components: a "Source Directory" documenting natural gas supply-related databases and information centers and a "Reservoir Data System" of information for individual gas reservoirs. The Source Directory will document the location, characteristics, and accessibility of natural gas supply information sources, such as bibliographic databases, engineering and/or geological data compilations, and natural gas information centers. The Data System will be the largest portion of GASIS and will contain geological and engineering data at the reservoir level. The GASIS project will involve the compilation of existing public domain data, excerpts from Dwight's databases, and the collection of new reservoir data. Data assembly and collection will be prioritized by the User Needs study. A "User Needs" assessment for the planned GASIS data system has been underway since September of this year. It is designed to cover all major segments of the gas industry, including major and independent producers, state and federal agencies, pipelines, research organizations, banks, and service companies. The objectives of the evaluation are: to design GASIS to meet the needs of industry and the research community; to determine potential applications for GASIS in order to better design the database; to prioritize data categories and specific data collection activities; and to evaluate industry software and data exchange requirements.

  9. Fort Drum Preliminary Fiscal Impact Analysis.

    Science.gov (United States)

    1986-02-01

    Surviving fragments of this report describe fiscal histories, projections, and impacts for counties, cities, towns, villages, school districts, and the state, and the distribution of the inmigrating population within the three counties; an accurate forecast of the expected distribution of the inmigrating population is therefore central to the analysis. The distribution of inmigration to the school districts was made using the analysis explained in Chapter 3. Children associated with 800 new on-post

  10. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment of PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.
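
    To make the modeling style concrete, the sketch below shows what a small power-law (biochemical-systems-theory) model of presynaptic dopamine balance can look like; the two pools, the exponents and the rate constants are hypothetical placeholders for illustration, not the species or parameter values of the published model.

        # A minimal power-law (S-system style) sketch of presynaptic dopamine
        # balance. All species, exponents and rate constants are hypothetical
        # placeholders, not taken from the published model.
        import numpy as np
        from scipy.integrate import odeint

        def dopamine_ode(y, t, k_syn, k_pack, k_rel, g, h):
            cytosolic, vesicular = y
            synthesis = k_syn                       # production from precursor (assumed constant)
            packaging = k_pack * cytosolic ** g     # vesicular packaging (power law)
            release   = k_rel * vesicular ** h      # release plus reuptake loss (power law)
            return [synthesis - packaging, packaging - release]

        t = np.linspace(0.0, 50.0, 500)
        y0 = [1.0, 10.0]                            # illustrative initial concentrations
        sol = odeint(dopamine_ode, y0, t, args=(1.0, 0.5, 0.05, 0.8, 1.0))
        print(sol[-1])                              # near-steady-state pool sizes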

  11. A Petaflops Era Computing Analysis

    Science.gov (United States)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10^15 floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflop system is technically feasible, but not with today's state-of-the-art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, chip feature sizes are projected to reach the known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  12. Preliminary analysis of turbochargers rotors dynamic behaviour

    Science.gov (United States)

    Monoranu, R.; Ştirbu, C.; Bujoreanu, C.

    2016-08-01

    Turbocharger rotors for spark- and compression-ignition engines are manufactured from heat-resistant steels in order to withstand exhaust gas temperatures exceeding 1200 K. The mechanical stress is not large, as the power consumption of these systems is up to 10 kW, but the operating speeds are high, ranging between 30,000 and 250,000 rpm. Therefore, correct turbocharger functioning requires, even from the design stage, accurate evaluation of the temperature effects, of the turbine torque due to the engine exhaust gases, and of the vibration behaviour caused by the very high operating speeds. In addition, turbocharger lubrication complicates the model, because classical hydrodynamic theory cannot be applied to evaluate the floating bush bearings. The paper proposes a FEM study using the CATIA environment, both as the modeling medium and as the tool for the numerical analysis, in order to highlight the turbocharger's complex behaviour. An accurate design may prevent major issues which can occur during operation.

  13. Preliminary Analysis of Helicopter Options to Support Tunisian Counterterrorism Operations

    Science.gov (United States)

    2016-04-27

    A key difference between the results of the current analysis and those in Mouton et al., 2015, is the relative cost-effectiveness between the CH-47D and the Mi-17v5. Tunisia previously acquired helicopters from Sikorsky to fulfill a number of roles in counterterrorism operations; rising costs and delays in delivery raised the question of whether other cost-effective options exist to meet Tunisia's helicopter requirement. Our team conducted a preliminary assessment of

  14. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for the analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air-scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated, so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run; it is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vector has been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
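
    The solution-vector scheme described above can be illustrated with a short sketch; the Component class, the toy relaxation update and the three-component chain are illustrative assumptions, not PCTAP's actual data structures.

        # Sketch of the solution-vector idea: components are ordered by inlet
        # dependency, then updated in that order at every time step. The class
        # and update rule are illustrative, not PCTAP's actual code.
        class Component:
            def __init__(self, name, upstream=None):
                self.name = name
                self.upstream = upstream      # component feeding this one's inlet
                self.outlet_temp = 20.0       # degrees C, illustrative initial state

            def execute_outlet(self, dt):
                # Relax toward the upstream outlet temperature (toy transport model).
                if self.upstream is not None:
                    self.outlet_temp += 0.1 * dt * (self.upstream.outlet_temp - self.outlet_temp)

        # Build the solution vector in inlet-dependency order:
        # tube feeds cold plate feeds heat exchanger.
        tube = Component("tube")
        tube.outlet_temp = 60.0
        cold_plate = Component("cold_plate", upstream=tube)
        heat_exchanger = Component("heat_exchanger", upstream=cold_plate)
        solution_vector = [tube, cold_plate, heat_exchanger]

        for step in range(100):                  # time-march the transient
            for component in solution_vector:    # update in dependency order
                component.execute_outlet(dt=1.0)

        print([round(c.outlet_temp, 2) for c in solution_vector])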

  15. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    Science.gov (United States)

    Broekema, P. C.; van Nieuwpoort, R. V.; Bal, H. E.

    2015-07-01

    The Square Kilometre Array is a next-generation radio telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state of the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints on the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload.

  16. In vivo bioprinting for computer- and robotic-assisted medical intervention: preliminary study in mice.

    Science.gov (United States)

    Keriquel, Virginie; Guillemot, Fabien; Arnault, Isabelle; Guillotin, Bertrand; Miraux, Sylvain; Amédée, Joëlle; Fricain, Jean-Christophe; Catros, Sylvain

    2010-03-01

    We present the first attempt to apply bioprinting technologies in the perspective of computer-assisted medical interventions. A workstation dedicated to high-throughput biological laser printing has been designed. Nano-hydroxyapatite (n-HA) was printed in the mouse calvaria defect model in vivo. Critical size bone defects were performed in OF-1 male mice calvaria with a 4 mm diameter trephine. Prior to laser printing experiments, the absence of inflammation due to laser irradiation onto mice dura mater was shown by means of magnetic resonance imaging. Procedures for in vivo bioprinting and results obtained using decalcified sections and x-ray microtomography are discussed. Although heterogeneous, these preliminary results demonstrate that in vivo bioprinting is possible. Bioprinting may prove to be helpful in the future for medical robotics and computer-assisted medical interventions.

  17. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  19. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    Science.gov (United States)

    Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.

    2012-01-01

    MapReduce is an approach to high-performance analytics that may be useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data-intensive analytic workflows.
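
    A minimal sketch of the canonical averaging operation in MapReduce style follows; plain Python stands in for a real cluster framework, and the record layout (month, latitude, longitude, temperature) is an assumption for illustration, not the MERRA format.

        # MapReduce-style temporal averaging in plain Python. The record layout
        # (month, lat, lon, temperature in K) is an illustrative assumption.
        from collections import defaultdict

        records = [
            ("2020-01", 40.0, -75.0, 274.1),
            ("2020-01", 41.0, -74.0, 272.8),
            ("2020-02", 40.0, -75.0, 276.5),
        ]

        def map_phase(record):
            month, lat, lon, temp = record
            yield month, (temp, 1)               # key by month for a temporal average

        def reduce_phase(key, values):
            total = sum(v[0] for v in values)
            count = sum(v[1] for v in values)
            return key, total / count

        groups = defaultdict(list)               # the shuffle: group values by key
        for rec in records:
            for key, value in map_phase(rec):
                groups[key].append(value)

        print([reduce_phase(k, v) for k, v in groups.items()])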

  20. Preliminary CFD Analysis for HVAC System Design of a Containment Building

    Energy Technology Data Exchange (ETDEWEB)

    Son, Sung Man; Choi, Choengryul [ELSOLTEC, Yongin (Korea, Republic of); Choo, Jae Ho; Hong, Moonpyo; Kim, Hyungseok [KEPCO Engineering and Construction, Gimcheon (Korea, Republic of)

    2016-10-15

    HVAC (Heating, Ventilation, Air Conditioning) systems have mainly been designed based on overall heat balances and averaging concepts, which are simple and useful for designing the overall system. However, such a method cannot predict the local flow and temperature distributions in a containment building. In this study, a preliminary CFD (Computational Fluid Dynamics) analysis was carried out to obtain detailed flow and temperature distributions in a containment building and to confirm that such information can be obtained via CFD analysis. This approach can also be useful for hydrogen analysis in an accident in which hydrogen is released into a containment building. We confirmed that CFD analysis can offer sufficiently detailed information about flow patterns and temperature fields and that the CFD technique is a useful tool for the HVAC design of nuclear power plants.

  1. Computer Aided Data Analysis in Sociometry

    Science.gov (United States)

    Langeheine, Rolf

    1978-01-01

    A computer program which analyzes sociometric data is presented. The SDAS program provides classical sociometric analysis. Multi-dimensional scaling and cluster analysis techniques may be combined with the MSP program. (JKS)

  2. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  4. NRT Rotor Structural / Aeroelastic Analysis for the Preliminary Design Review

    Energy Technology Data Exchange (ETDEWEB)

    Ennis, Brandon Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Paquette, Joshua A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    This document describes the initial structural design for the National Rotor Testbed blade as presented during the preliminary design review at Sandia National Laboratories on October 28-29, 2015. The document summarizes the structural and aeroelastic requirements placed on the NRT rotor for satisfactory deployment at the DOE/SNL SWiFT experimental facility to produce high-quality datasets for wind turbine model validation. The method and results of the NRT blade structural optimization are also presented within this report, along with an analysis of its satisfaction of the design requirements.

  5. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs show a continuous trend defined by Classical Cepheids after the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars difficult based on light-curve information alone.
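
    The Fourier quantification used above can be sketched as a least-squares fit of a low-order Fourier series to a folded light curve; the synthetic light curve and fitting order below are illustrative, while the amplitude ratio R21 = A2/A1 and phase difference phi21 = phi2 - 2*phi1 are the usual low-order parameters.

        # Least-squares Fourier decomposition of a folded light curve. The
        # synthetic curve and order are illustrative; real work fits observations.
        import numpy as np

        phase = np.linspace(0.0, 1.0, 200, endpoint=False)
        mag = 12.0 + 0.3 * np.cos(2 * np.pi * phase) + 0.1 * np.cos(4 * np.pi * phase + 0.5)

        order = 4
        # Design matrix: 1, cos(2*pi*k*phase), sin(2*pi*k*phase) for k = 1..order
        columns = [np.ones_like(phase)]
        for k in range(1, order + 1):
            columns += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
        coeffs, *_ = np.linalg.lstsq(np.column_stack(columns), mag, rcond=None)

        a = coeffs[1::2]                         # cosine coefficients
        b = coeffs[2::2]                         # sine coefficients
        A = np.hypot(a, b)                       # Fourier amplitudes A_k
        phi = np.arctan2(-b, a)                  # Fourier phases
        print("R21 =", A[1] / A[0], "phi21 =", phi[1] - 2 * phi[0])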

  6. Determinants of Trade Credit: A Preliminary Analysis on Construction Sector

    Directory of Open Access Journals (Sweden)

    Nicoleta Barbuta-Misu

    2016-07-01

    This paper introduces a preliminary analysis of the correlations between trade credit and selected measures of financial performance for a sample of 958 firms acting in the construction sector. The examined period covers 2004-2013. The sample, derived from the Amadeus database, contains firms that have both sold and bought on credit. Results showed that larger firms offered and used more credit than their counterparties. Firms offered and used credit at the same time, but not at the same level. Firms with a higher return on assets and profit margin used less credit from suppliers and offered less credit to clients. Moreover, more liquid firms used fewer trade payables.

  7. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  8. Computational Advances for and from Bayesian Analysis

    OpenAIRE

    Andrieu, C.; Doucet, A.; Robert, C. P.

    2004-01-01

    The emergence in the past years of Bayesian analysis in many methodological and applied fields as the solution to the modeling of complex problems cannot be dissociated from major changes in its computational implementation. We show in this review how the advances in Bayesian analysis and statistical computation are intermingled.

  9. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical methodological discussion. It could be verified that the models have differences related to the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  10. Preliminary efficacy of a computer-delivered HIV prevention intervention for African American teenage females.

    Science.gov (United States)

    Klein, Charles H; Card, Josefina J

    2011-12-01

    This study translated SiHLE (Sisters Informing, Healing, Living, and Empowering), a 12-hour Centers for Disease Control and Prevention evidence-based group-level intervention for African American females 14-18 years of age, into a 2-hour computer-delivered individual-level intervention. A randomized controlled trial (n = 178) was conducted to examine the efficacy of the new Multimedia SiHLE intervention. Average condom-protected sex acts (proportion of vaginal sex acts with condoms, last 90 days) for sexually active participants receiving Multimedia SiHLE rose from M = 51% at baseline to M = 71% at 3-month follow-up (t = 2.06, p = .05); no statistically significant difference was found in the control group. Non-sexually-active intervention group participants reported a significant increase in condom self-efficacy (t = 2.36, p = .02); no statistically significant difference was found in the control group. The study provides preliminary support for the efficacy of a computer-delivered adaptation of a proven HIV prevention program for African American teenage women. This is consistent with meta-analyses showing that computer-delivered interventions, which can often be disseminated at lower per-capita cost than human-delivered interventions, can influence HIV risk behaviors in a positive fashion.

  11. Two-dimensional computer simulation of hypervelocity impact cratering: some preliminary results for Meteor Crater, Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, J.B.; Burton, D.E.; Cunningham, M.E.; Lettis, L.A. Jr.

    1978-06-01

    A computational approach used for subsurface explosion cratering was extended to hypervelocity impact cratering. Meteor (Barringer) Crater, Arizona, was selected for the first computer simulation because it is one of the most thoroughly studied craters. It is also an excellent example of a simple, bowl-shaped crater and is one of the youngest terrestrial impact craters. Initial conditions for this calculation included a meteorite impact velocity of 15 km/s and a meteorite mass of 1.67 x 10^8 kg, with a corresponding kinetic energy of 1.88 x 10^16 J (4.5 megatons). A two-dimensional Eulerian finite difference code called SOIL was used for this simulation of a cylindrical iron projectile impacting at normal incidence into a limestone target. For this initial calculation, a Tillotson equation-of-state description for iron and limestone was used with no shear strength. Results obtained for this preliminary calculation of the formation of Meteor Crater are in good agreement with field measurements. A color movie based on this calculation was produced using computer-generated graphics. 19 figures, 5 tables, 63 references.
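
    The stated initial conditions are internally consistent, as a quick check of E = m v^2 / 2 with the quoted mass and velocity shows:

        # Consistency check of the abstract's initial conditions.
        mass = 1.67e8             # kg
        velocity = 15.0e3         # m/s
        energy_j = 0.5 * mass * velocity ** 2
        print(energy_j)                     # ~1.88e16 J, matching the abstract
        print(energy_j / 4.184e15)          # ~4.5 megatons (1 Mt TNT = 4.184e15 J)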

  12. Adjustment computations spatial data analysis

    CERN Document Server

    Ghilani, Charles D

    2011-01-01

    The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustments, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solu

  13. Enhanced Accident Tolerant Fuels for LWRs - A Preliminary Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at the Fukushima Daiichi nuclear plants illustrates the need for continuous improvement through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident-tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation and on the back end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final versions.

  14. Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis

    Science.gov (United States)

    2017-04-12

    Donald E. Jarvis, "Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis", NRL Memorandum Report, Advanced Techniques Branch, 12-04-2017. 1. INTRODUCTION: HDF5 technology [Folk] has been

  15. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis; in addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision incorporates the recommendations provided in the original hazards analysis as well. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for the safety classification of the thermal stabilization equipment.

  16. Analysis preliminary phytochemical raw extract of leaves Nephrolepis pectinata

    Directory of Open Access Journals (Sweden)

    Natally Marreiros Gomes

    2017-06-01

    Nephrolepis pectinata, popularly known as the paulista fern, ladder-heaven, or cat tail, belongs to the family Davalliaceae. Because of the beauty of their leaf arrangements, ferns are widely commercialized in Brazil; however, studies on their pharmacological potential have not been described in the literature. Thus, the objective of this research was to analyze the phytochemical properties of the crude extract of the leaves of Nephrolepis pectinata. The phytochemical analysis began with collection of the plant material, preparation of a voucher specimen, washing, drying and grinding, followed by extraction by the percolation method and, finally, the phytochemical analysis itself. Preliminary phytochemical results for the crude extract of the leaves of Nephrolepis pectinata tested positive for reducing sugars, phenols/tannins (catechin tannins) and catechins.

  17. Error Analysis In Computational Elastodynamics

    Science.gov (United States)

    Mukherjee, Somenath; Jafarali, P.; Prathap, Gangan

    The Finite Element Method (FEM) is the mathematical tool of the engineers and scientists to determine approximate solutions, in a discretised sense, of the concerned differential equations, which are not always amenable to closed form solutions. In this presentation, the mathematical aspects of this powerful computational tool as applied to the field of elastodynamics have been highlighted, using the first principles of virtual work and energy conservation.

  18. Cluster analysis for computer workload evaluation

    CERN Document Server

    Landau, K

    1976-01-01

    An introduction to computer workload analysis is given, showing its range of application in computer centre management and in system and application programming. Cluster methods which can be used in conjunction with workload data are discussed, and cluster algorithms are adapted to the specific problem. Several samples of CDC 7600 accounting data, collected at CERN, the European Organization for Nuclear Research, underwent a cluster analysis to determine job groups. The conclusions from the resource usage of typical job groups in relation to computer workload analysis are discussed. (17 refs).
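
    As a sketch of the kind of grouping involved, the snippet below clusters synthetic accounting records (CPU time, memory, I/O volume per job) with a small k-means routine; the features, data and choice of k are illustrative assumptions, not the adapted algorithms of the paper.

        # k-means clustering of synthetic job accounting records.
        # Features per job: CPU seconds, memory (MB), I/O volume (MB).
        import numpy as np

        rng = np.random.default_rng(0)
        jobs = np.vstack([rng.normal([10, 1, 5], 1.0, (50, 3)),      # short, small jobs
                          rng.normal([200, 8, 50], 5.0, (50, 3))])   # long, large jobs

        def kmeans(x, k, iters=20):
            centers = x[rng.choice(len(x), k, replace=False)]
            for _ in range(iters):
                # assign each job to its nearest center, then recompute centers
                labels = np.argmin(((x[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
                centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
            return labels, centers

        labels, centers = kmeans(jobs, k=2)
        print(centers)          # typical resource usage of each job group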

  19. Preliminary Research on Applying Computational Fluid Dynamics to Subchannel Analysis

    Institute of Scientific and Technical Information of China (English)

    臧金光; 闫晓; 黄善仿; 曾小康; 黄彦平

    2014-01-01

    Multi-scale coupled simulation has become a trend in performing nuclear reactor safety analysis. The routine method is to combine different codes through a common platform for data exchange and delivery. In this paper, a new approach is proposed in which an analysis technique developed at a relatively small scale is extended to a larger scale; it is verified by applying the CFD method to perform subchannel-scale analysis. The result confirms the reasonability of the approach in a sense; however, it needs further improvement, since it does not properly account for the turbulent mixing between adjacent subchannels.

  20. 26 CFR 1.818-4 - Election with respect to life insurance reserves computed on preliminary term basis.

    Science.gov (United States)

    2010-04-01

    § 1.818-4 Election with respect to life insurance reserves computed on preliminary term basis. (a) In general. Section 818(c) permits a life insurance company issuing contracts with respect to which the life

  1. A Preliminary MANPRINT Evaluation of the All Source Analysis (ASAS)

    Science.gov (United States)

    1988-11-01

    Surviving fragments of this report include table-of-contents entries for the Rear (CEWI) FSIC, the CEWI (TCAE) AIM(6), and the DTOC AIM(6); references to sensors and the MI Battalion TCAE; and ratings of understanding of tasks required at the completion of training. The report notes that requirements are formatted for transmission to the sensors and jammers, and that the CEWI Tactical Control and Analysis Element (TCAE) AIM module consists of a VAX 750R computer

  2. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  3. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  4. Preliminary systems-interaction results from the Digraph Matrix Analysis of the Watts Bar Nuclear Power Plant safety-injection systems

    Energy Technology Data Exchange (ETDEWEB)

    Sacks, I.J.; Ashmore, B.C.; Champney, J.M.; Alesso, H.P.

    1983-06-01

    This report provides preliminary results generated by a Digraph Matrix Analysis (DMA) for a Systems Interaction analysis performed on the Safety Injection System of the Tennessee Valley Authority Watts Bar Nuclear Power Plant. An overview of DMA is provided along with a brief description of the computer codes used in DMA.

  6. Solar Stirling power generation - Systems analysis and preliminary tests

    Science.gov (United States)

    Selcuk, M. K.; Wu, Y.-C.; Moynihan, P. I.; Day, F. D., III

    1977-01-01

    The feasibility of an electric power generation system utilizing a sun-tracking parabolic concentrator and a Stirling engine/linear alternator is being evaluated. Performance predictions and cost analysis of a proposed large distributed system are discussed. Design details and preliminary test results are presented for a 9.5-ft-diameter parabolic dish at the Jet Propulsion Laboratory (Caltech) Table Mountain Test Facility. Low-temperature calorimetric measurements were conducted to evaluate the concentrator performance, and a helium flow system is being used to test the solar receiver at anticipated working fluid temperatures (up to 650 or 1200 °C) to evaluate the receiver thermal performance. The receiver body is designed to adapt to a free-piston Stirling engine which powers a linear alternator assembly for direct electric power generation. During the next phase of the program, experiments with an engine and receiver integrated into the concentrator assembly are planned.

  7. Primate phylogeny studied by comparative determinant analysis. A preliminary report.

    Science.gov (United States)

    Bauer, K

    1993-01-01

    In this preliminary report the divergence times for the major primate groups are given, calculated from a study by comparative determinant analysis of 69 proteins (equaling 0.1% of the whole genetic information). With an origin of the primate order set at 80 million years before present, the ages of the last common ancestors (LCAs) of man and the major primate groups obtained this way are as follows: Pan troglodytes 5.2; Gorilla gorilla 7.4; Pongo pygmaeus 19.2; Hylobates lar 20.3; Old World monkeys 31.4; Lagothrix lagotricha 46.0; Cebus albifrons 59.5; three lemur species 67.0, and Galago crassicaudatus 73.3 million years. The LCA results and the approach are shortly discussed. A full account of this extended investigation including results on nonprimate mammals and on the determinant structures and the immunologically derived evolutionary rates of the proteins analyzed will be published elsewhere.

  8. PRELIMINARY PHYTOCHEMICAL ANALYSIS OF ACTINIOPTERIS RADIATA (SWARTZ) LINK.

    Directory of Open Access Journals (Sweden)

    R. Manonmani

    2013-06-01

    The objective of the present study was to detect the presence of preliminary phytochemicals in six different solvent extracts of Actiniopteris radiata (Swartz) Link by qualitative screening methods. The solvents used for the extraction of leaf and rhizome powder were ethanol, petroleum ether, chloroform, acetone, DMSO and water. Secondary metabolites such as steroids, triterpenoids, reducing sugars, sugars, alkaloids, phenolic compounds, catechins, flavonoids, saponins, tannins, anthraquinones and amino acids were screened using standard methods. The phytochemical analysis of the ethanolic extracts of both leaf and rhizome revealed the presence of more active constituents than the other solvents. The ethanolic rhizome extracts of Actiniopteris radiata showed a higher amount of phytochemicals when compared with the ethanolic leaf extracts.

  9. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    Yi DAI; Zhao-jun WANG; Chang-liang ZOU

    2007-01-01

    To detect and estimate a shift in the mean, the standard deviation, or both in preliminary analysis, the most popular statistical process control (SPC) tool is the control chart based on the likelihood ratio test (LRT). Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ²(2) when the sample sizes n, n1 and n2 are very large, with n1 = 2, 3, ..., n-2 and n2 = n-n1; it is therefore inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limit distribution are obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains considerable information, and the cumulative sum (CUSUM) control chart can exploit it. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations: one focuses on detecting shifts in location in the historical data, and the other is more general, detecting a shift in location, scale, or both. Moreover, the simulated results show that the two proposed control charts are, respectively, superior to their competitors not only in the detection of sustained shifts but also in the detection of some other out-of-control situations considered in this paper.
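
    For orientation, the snippet below sketches the standard two-sided tabular CUSUM on individual observations, the classical counterpart of the likelihood-ratio-based charts proposed above; it is not the authors' slr-based statistic, and the reference value k and decision limit h are conventional illustrative choices.

        # Two-sided tabular CUSUM on individual observations (classical form,
        # not the paper's slr-based charts). k and h are in units of sigma.
        import numpy as np

        def cusum(x, target, k=0.5, h=5.0):
            """Return indices where the upper or lower CUSUM exceeds the limit h."""
            sigma = x.std(ddof=1)
            s_hi = s_lo = 0.0
            alarms = []
            for i, xi in enumerate(x):
                z = (xi - target) / sigma
                s_hi = max(0.0, s_hi + z - k)    # accumulates upward shifts
                s_lo = max(0.0, s_lo - z - k)    # accumulates downward shifts
                if s_hi > h or s_lo > h:
                    alarms.append(i)
            return alarms

        rng = np.random.default_rng(1)
        data = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.0, 1, 50)])  # shift at i=50
        print(cusum(data, target=0.0))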

  11. DFT computational analysis of piracetam

    Science.gov (United States)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground-state molecular geometries. The first-order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method within the finite-field approach. The stability of the molecule has been analyzed by NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is attractive for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of the atomic charges is also reported. From the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.
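
    For reference, a common convention for obtaining the scalar quantities quoted above from the calculated tensor components is the following (standard textbook definitions, not expressions reproduced from this paper):

        \alpha_0 = \tfrac{1}{3}\,(\alpha_{xx} + \alpha_{yy} + \alpha_{zz})

        \Delta\alpha = \tfrac{1}{\sqrt{2}}\left[(\alpha_{xx}-\alpha_{yy})^2 + (\alpha_{yy}-\alpha_{zz})^2 + (\alpha_{zz}-\alpha_{xx})^2\right]^{1/2}

        \beta_0 = \left(\beta_x^2 + \beta_y^2 + \beta_z^2\right)^{1/2}, \quad \beta_x = \beta_{xxx} + \beta_{xyy} + \beta_{xzz}\ \text{(and cyclically for } \beta_y,\ \beta_z\text{)}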

  12. Using computational simulation to aid in the prediction of socket fit: a preliminary study.

    Science.gov (United States)

    Lee, Winson C C; Zhang, Ming

    2007-10-01

    This study illustrates the use of computational analysis to predict prosthetic socket fit. A simple indentation test is performed by applying force to the residual limb of a trans-tibial amputee through an indenter until the subject perceives the onset of pain. Computational finite element (FE) analysis is then applied to evaluate the magnitude of pressure underlying the indenter that initiates pain (the pain threshold pressure) and the pressure at the prosthetic socket-residual limb interface. Socket fit is assessed by examining whether or not the socket-limb interface pressure exceeds the pain threshold pressure of the limb. Based on the computer-aided assessment, a new prosthetic socket is then fabricated and fitted to the amputee subject. Successful socket fit was achieved at the end of this process. The approach of using computational analysis to aid in assessing socket fit allows a more efficient evaluation and re-design of the socket even before the actual fabrication and fitting of the prosthetic socket. However, more thorough investigations are required before this approach can be widely used. A subsequent part of this paper discusses the limitations and suggests future research directions in this area.

  13. Computer Graphics for System Effectiveness Analysis.

    Science.gov (United States)

    1986-05-01

    Only fragments of this report are available: a citation of Chapra, Steven C., and Raymond P. Canale (1985), Numerical Methods for Engineers with Personal Computer Applications, New York; portions of the outline (1.2 Outline of Thesis; Chapter II, Method of Analysis); a note that Chapter VII summarizes the results and gives recommendations for future research; and the opening of the Method of Analysis chapter (2.1 Introduction) on systems effectiveness.

  14. Synthesis, preliminary bioevaluation and computational analysis of caffeic acid analogues.

    Science.gov (United States)

    Liu, Zhiqian; Fu, Jianjun; Shan, Lei; Sun, Qingyan; Zhang, Weidong

    2014-05-16

    A series of caffeic acid amides were designed, synthesized and evaluated for anti-inflammatory activity. Most of them exhibited promising anti-inflammatory activity against nitric oxide (NO) generation in murine macrophage RAW264.7 cells. A 3D pharmacophore model was created based on the biological results for further structural optimization. Moreover, prediction of the potential targets was also carried out with the PharmMapper server. These amide analogues represent a promising class of anti-inflammatory scaffolds for further exploration and target identification.

  15. Computer Auxiliary Analysis for Stochasticity of Chaos

    Institute of Scientific and Technical Information of China (English)

    ZHAO Geng; FANG Jin-qing

    2003-01-01

    In this work, we propose a mathematical-physical statistical analysis method for stochastic processes of chaos, based on stochastic testing via a combined measurement of Monobit and Runs. Computer-aided analysis shows that the stochastic numbers produced by the chaotic circuit are indeed stochastic. Our software is written in VB and C++; the latter can be tested by the former, and at the same time it is verified against stochastic numbers produced by the computer, so the data treatment results are reliable.
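
    A minimal sketch of the Monobit and Runs checks named above follows, applied to a bit stream from the logistic map as a stand-in for the chaotic circuit; the normal-approximation z-scores are the textbook forms, not the paper's VB/C++ implementation.

        # Monobit and Runs checks on a chaotic bit stream (logistic map as a
        # stand-in for the circuit). z-scores use the usual normal approximations.
        import math

        def logistic_bits(n, x=0.4, r=4.0):
            bits = []
            for _ in range(n):
                x = r * x * (1.0 - x)
                bits.append(1 if x > 0.5 else 0)
            return bits

        def monobit_z(bits):
            # z-score of the ones count against the fair-coin expectation n/2
            n, ones = len(bits), sum(bits)
            return (2 * ones - n) / math.sqrt(n)

        def runs_z(bits):
            # z-score of the number of runs against its expectation for i.i.d. bits
            n, ones = len(bits), sum(bits)
            pi = ones / n
            runs = 1 + sum(b1 != b2 for b1, b2 in zip(bits, bits[1:]))
            expected = 2 * n * pi * (1 - pi) + 1
            var = 2 * n * pi * (1 - pi) * (2 * pi * (1 - pi) * n - 1) / (n - 1)
            return (runs - expected) / math.sqrt(var)

        bits = logistic_bits(10000)
        print(monobit_z(bits), runs_z(bits))   # |z| below ~1.96 suggests no gross deviation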

  16. Computational Analysis of PTEN Gene Mutation

    Directory of Open Access Journals (Sweden)

    Siew-Kien Mah

    2012-01-01

    Post-genomic data can be efficiently analyzed using computational tools, which have the advantage over biochemical and biophysical methods in terms of higher coverage. In this research, we adopted a computational analysis of PTEN gene mutation. Mutation in PTEN is responsible for many human diseases. The results of this research provide insights into the protein domains of PTEN and the distribution of mutations.

  17. Computer Language Efficiency via Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Andrea Ellero

    2011-01-01

    The selection of the computer language to adopt is usually driven by intuition and expertise, since it is very difficult to compare languages taking into account all their characteristics. In this paper, we analyze the efficiency of programming languages through Data Envelopment Analysis. We collected the input data from The Computer Language Benchmarks Game: we consider a large set of languages in terms of computational time, memory usage, and source code size. Various benchmark problems are tackled. We analyze the results, first of all considering programming languages individually; then, we evaluate families of them sharing some characteristics, for example, being compiled or interpreted.
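
    A minimal sketch of an input-oriented CCR DEA model, the standard formulation behind such efficiency scores, solved as a linear program follows; the two inputs (time, memory) and the unit output per language are illustrative stand-ins for the benchmark-game data.

        # Input-oriented CCR DEA model solved with scipy's linprog. The data are
        # illustrative, not the benchmark-game numbers.
        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[1.0, 2.0, 4.0],      # input 1 (e.g., CPU time) per language
                      [3.0, 1.0, 5.0]])     # input 2 (e.g., memory) per language
        Y = np.array([[1.0, 1.0, 1.0]])     # one unit of output per benchmark run

        def ccr_efficiency(o):
            m, n = X.shape
            s = Y.shape[0]
            c = np.zeros(1 + n); c[0] = 1.0                  # minimize theta
            A_in = np.hstack([-X[:, [o]], X])                # sum lambda*x <= theta*x_o
            A_out = np.hstack([np.zeros((s, 1)), -Y])        # sum lambda*y >= y_o
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.fun                                   # efficiency score in (0, 1]

        print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])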

  18. Preliminary analysis of distributed in situ soil moisture measurements

    Directory of Open Access Journals (Sweden)

    L. Brocca

    2005-01-01

    Surface soil moisture content is highly variable in both space and time. Remote sensing can provide an effective methodology for mapping surface moisture content over large areas, but ground-based measurements are required to test its reliability and to calibrate retrieval algorithms. Recently, we had the opportunity to design and perform an experiment aimed at jointly acquiring measurements of surface soil water content at various locations and remotely sensed hyperspectral data. The area selected for the experiment is located in central Umbria and extends for 90 km2. Detailed lithological and multi-temporal landslide inventory maps were available for the area. We identified eight plots where measurements of soil water content were made using a Time Domain Reflectometer (TDR). The plots range in size from 100 m2 to 600 m2 and cover a variety of topographic and morphological settings. The TDR measurements were conducted during four days, on 5 April, 15 April, 2 May and 3 May 2004. On 3 May the NERC airborne CASI 2 acquired the hyperspectral data. A preliminary analysis concerning the match between the landslides and the soil moisture is reported, and statistical and geostatistical analyses investigating the spatial-temporal soil moisture distribution were performed. These results will be compared with the surface temperature data obtained from the remotely sensed hyperspectral sensor.

  19. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.
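
    The kind of probabilistic LCOE calculation described can be sketched as a Monte Carlo simulation; the distributions and the simplified fixed-charge-rate LCOE formula below are illustrative assumptions, not PERI's spreadsheet model.

        # Monte Carlo sketch of a probabilistic LCOE metric. Distributions and
        # the fixed-charge-rate formula are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        capex = rng.triangular(3000, 4000, 6000, n)   # $/kW installed
        om = rng.normal(100, 15, n)                   # $/kW-yr O&M
        cap_factor = rng.uniform(0.85, 0.95, n)       # plant capacity factor
        fcr = 0.10                                    # fixed charge rate (assumed)

        mwh_per_kw = cap_factor * 8760 / 1000         # MWh produced per kW-yr
        lcoe = (capex * fcr + om) / mwh_per_kw        # $/MWh

        print(np.percentile(lcoe, [5, 50, 95]))       # risk range for the LCOE metric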

  20. A preliminary study on the short-term efficacy of chairside computer-aided design/computer-assisted manufacturing- generated posterior lithium disilicate crowns.

    Science.gov (United States)

    Reich, Sven; Fischer, Sören; Sobotta, Bernhard; Klapper, Horst-Uwe; Gozdowski, Stephan

    2010-01-01

    The purpose of this preliminary study was to evaluate the clinical performance of chairside-generated crowns over a preliminary time period of 24 months. Forty-one posterior crowns made of a machinable lithium disilicate ceramic for full-contour crowns were inserted in 34 patients using a chairside computer-aided design/computer-assisted manufacturing technique. The crowns were evaluated at baseline and after 6, 12, and 24 months according to modified United States Public Health Service criteria. After 2 years, all reexamined crowns (n = 39) were in situ; one abutment exhibited secondary caries and two abutments received root canal treatment. Within the limited observation period, the crowns revealed clinically satisfying results.

  1. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  2. The computer in shell stability analysis

    Science.gov (United States)

    Almroth, B. O.; Starnes, J. H., Jr.

    1975-01-01

    Some examples in which the high-speed computer has been used to improve the static stability analysis capability for general shells are examined. The fundamental concepts of static stability are reviewed with emphasis on the differences between linear bifurcation buckling and nonlinear collapse. The analysis is limited to the stability of conservative systems. Three examples are considered. The problem of cylinders subjected to bending loads is used as an example to illustrate that a simple structure can have a sufficiently complicated nonlinear behavior to require a computer analysis for accurate results. An analysis of the problems involved in the modeling of stiffening elements in plate and shell structures illustrates the necessity that the analyst recognizes all important deformation modes. The stability analysis of the Skylab structure indicates the size of problems that can be solved with current state-of-the-art capability.

  3. Preliminary performance analysis of the advanced pulse compression noise radar waveform

    Science.gov (United States)

    Govoni, Mark A.; Moyer, Lee R.

    2012-06-01

    Noise radar systems encounter target fluctuation behavior similar to that of conventional systems. For noise radar systems, however, the fluctuations are not only dictated by target composition and geometry, but also by the non-uniform power envelope of their random transmit signals. This third dependency is of interest and serves as the basis for the preliminary analysis conducted in this manuscript. General conclusions are drawn on the implications of having a random power envelope and the impacts it could have on both the transmit and receive processes. Using an advanced pulse compression noise (APCN) radar waveform as the constituent signal, a computer simulation aids in quantifying potential losses and the impacts they might have on the detection performance of a real radar system.
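
    The dependence on the random power envelope can be illustrated with a short simulation: each transmitted pulse below is plain complex Gaussian noise, a stand-in for the APCN waveform itself, and the matched-filter peak, equal to the pulse energy, fluctuates from pulse to pulse.

        # Pulse-to-pulse energy fluctuation of a random (noise) transmit signal.
        # Complex Gaussian noise stands in for the APCN waveform.
        import numpy as np

        rng = np.random.default_rng(7)
        n_pulses, n_samples = 1000, 256
        energies = []
        for _ in range(n_pulses):
            # one random transmit pulse: unit-variance complex Gaussian noise
            s = (rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples)) / np.sqrt(2)
            # matched-filter peak at zero lag equals the pulse energy sum(|s|^2),
            # which varies from pulse to pulse because the power envelope is random
            energies.append(np.sum(np.abs(s) ** 2))

        energies = np.array(energies)
        print(energies.std() / energies.mean())   # relative fluctuation, ~1/sqrt(n_samples)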

  4. Grid-connected ICES: preliminary feasibility analysis and evaluation. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    The HEAL Complex in New Orleans will serve as a Demonstration Community for which the ICES Demonstration System will be designed. The complex is a group of hospitals, clinics, research facilities, and medical educational facilities. The five tasks reported on are: preliminary energy analysis; preliminary institutional assessment; conceptual design; firming-up of commitments; and detailed work management plan.

  5. Purification, crystallization and preliminary X-ray analysis of struthiocalcin 1 from ostrich (Struthio camelus) eggshell

    Energy Technology Data Exchange (ETDEWEB)

    Reyes-Grajeda, Juan Pablo [Unidad de Proteómica Médica, Instituto Nacional de Medicina Genómica, Mexico City (Mexico); Marín-García, Liliana [Instituto de Química, Universidad Nacional Autónoma de México (Mexico); Stojanoff, Vivian [Brookhaven National Laboratories, NSLS, Upton, New York (United States); Moreno, Abel, E-mail: carcamo@servidor.unam.mx [Instituto de Química, Universidad Nacional Autónoma de México (Mexico); Unidad de Proteómica Médica, Instituto Nacional de Medicina Genómica, Mexico City (Mexico)

    2007-11-01

    The purification, crystallization and preliminary X-ray diffraction analysis of struthiocalcin 1 (SCA-1), a protein obtained from the intramineral part of ostrich (Struthio camelus) eggshell, are reported.

  6. Investigation of Sorption and Diffusion Mechanisms, and Preliminary Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bhave, Ramesh R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jubin, Robert Thomas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Spencer, Barry B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nair, Sankar [Georgia Inst. of Technology, Atlanta, GA (United States)

    2017-02-01

    This report describes the synthesis and evaluation of molecular sieve zeolite membranes to separate and concentrate tritiated water (HTO) from dilute HTO-bearing aqueous streams. Several monovalent and divalent cation exchanged silico alumino phosphate (SAPO-34) molecular sieve zeolite membranes were synthesized on disk supports and characterized with gas and vapor permeation measurements. The pervaporation process performance was evaluated for the separation and concentration of tritiated water. Experiments were performed using tritiated water feed solution containing tritium at the high end of the range (1 mCi/mL) anticipated in a nuclear fuel processing system that includes both acid and water streams recycling. The tritium concentration was about 0.1 ppm. The permeate was recovered under vacuum. The HTO/H2O selectivity and separation factor calculated from the measured tritium concentrations ranged from 0.99 to 1.23, and 0.83-0.98, respectively. Although the membrane performance for HTO separation was lower than expected, several encouraging observations including molecular sieving and high vapor permeance are reported. Additionally, several new approaches are proposed, such as tuning the sorption and diffusion properties offered by small pore LTA zeolite materials, and cation exchanged aluminosilicates with high metal loading. It is hypothesized that substantially improved preferential transport of tritium (HTO) resulting in a more concentrated permeate can be achieved. Preliminary economic analysis for the membrane-based process to concentrate tritiated water is also discussed.

  7. Crystallization and preliminary crystallographic analysis of recombinant human galectin-1

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Stacy A. [Institute for Glycomics, Gold Coast Campus, Griffith University, Queensland 4222 (Australia); Scott, Ken [School of Biological Sciences, University of Auckland, Auckland (New Zealand); Blanchard, Helen, E-mail: h.blanchard@griffith.edu.au [Institute for Glycomics, Gold Coast Campus, Griffith University, Queensland 4222 (Australia)

    2007-11-01

    Human galectin-1 has been cloned, expressed in E. coli, purified and crystallized in the presence of both lactose (ligand) and β-mercaptoethanol under six different conditions. The X-ray diffraction data obtained have enabled the assignment of unit-cell parameters for two novel crystal forms of human galectin-1. Galectin-1 is considered to be a regulator protein as it is ubiquitously expressed throughout the adult body and is responsible for a broad range of cellular regulatory functions. Interest in galectin-1 from a drug-design perspective is founded on evidence of its overexpression by many cancers and its immunomodulatory properties. The development of galectin-1-specific inhibitors is a rational approach to the fight against cancer because although galectin-1 induces a plethora of effects, null mice appear normal. X-ray crystallographic structure determination will aid the structure-based design of galectin-1 inhibitors. Here, the crystallization and preliminary diffraction analysis of human galectin-1 crystals generated under six different conditions is reported. X-ray diffraction data enabled the assignment of unit-cell parameters for crystals grown under two conditions; one belongs to a tetragonal crystal system and the other was determined as monoclinic P2₁, representing two new crystal forms of human galectin-1.

  8. Preliminary radiation criteria and nuclear analysis for ETF

    Energy Technology Data Exchange (ETDEWEB)

    Engholm, B.A.

    1980-09-01

    Preliminary biological and materials radiation dose criteria for the Engineering Test Facility are described and tabulated. In keeping with the ETF Mission Statement, a key biological dose criterion is a 24-hour shutdown dose rate of 2 mrem/hr on the surface of the outboard bulk shield. Materials dose criteria, which primarily govern the inboard shield design, include 10⁹ rads exposure limit to epoxy insulation, 3 × 10⁻⁴ dpa damage to the TF coil copper stabilizer, and a total nuclear heating rate of 5 kW in the inboard TF coils. Nuclear analysis performed during FY 80 was directed primarily at the inboard and outboard bulk shielding, and at radiation streaming in the neutral beam drift ducts. Inboard and outboard shield thicknesses to achieve the biological and materials radiation criteria are 75 cm inboard and 125 cm outboard, the configuration consisting of alternating layers of stainless steel and borated water. The outboard shield also includes a 5 cm layer of lead. NBI duct streaming analyses performed by ORNL and LASL will play a key role in the design of the duct and NBI shielding in FY 81. The NBI aluminum cryopanel nuclear heating rate during the heating cycle is about 1 milliwatt/cm³, which is far less than the permissible limit.

  9. Preliminary analysis of aerial hyperspectral data on shallow lacustrine waters

    Science.gov (United States)

    Bianchi, Remo; Castagnoli, A.; Cavalli, Rosa M.; Marino, Carlo M.; Pignatti, Stefano; Zilioli, Eugenio

    1995-11-01

    The availability of MIVIS hyperspectral data, deriving from an aerial survey recently performed over a test site in Lake Garda, Italy, provided a preliminary new insight into specific applications of remote sensing to shallow water analysis. The spectrometers in the visible and in the thermal infrared were explored in particular, giving access to helpful information for the detection of bio-physical indicators of water quality, related either to the surface/sub-surface of the waters or to the bottom of the lake, since the study area presents very shallow waters, never exceeding a 6-meter depth. Of primary interest was the detection of man-induced activities along the margins, such as sewage effects, sedimentary structures on the bottom, or algal bloom. Secondly, a correlation between absorptivity coefficients in the visible bands and bathymetric contour lines in the proximity of the marginal zone of the lake was accomplished by means of two indicative spectroradiometric transects.

  10. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  11. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis, containing hands-on problem sets that can be worked out in MS Excel or ArcGIS, as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data; Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results; Construct testable hypotheses that require inferential statistical analysis; Process spatial data, extract explanatory variables, conduct statisti...

  12. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  13. Conversion Preliminary Safety Analysis Report for the NIST Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Diamond, D. J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, J. S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Hanson, A. L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Cheng, L-Y [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, N. [Brookhaven National Lab. (BNL), Upton, NY (United States); Cuadra, A. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-01-30

    The NIST Center for Neutron Research (NCNR) is a reactor-laboratory complex providing the National Institute of Standards and Technology (NIST) and the nation with a world-class facility for the performance of neutron-based research. The heart of this facility is the NIST research reactor (aka NBSR); a heavy water moderated and cooled reactor operating at 20 MW. It is fueled with high-enriched uranium (HEU) fuel elements. A Global Threat Reduction Initiative (GTRI) program is underway to convert the reactor to low-enriched uranium (LEU) fuel. This program includes the qualification of the proposed fuel, uranium and molybdenum alloy foil clad in an aluminum alloy, and the development of the fabrication techniques. This report is a preliminary version of the Safety Analysis Report (SAR) that would be submitted to the U.S. Nuclear Regulatory Commission (NRC) for approval prior to conversion. The report follows the recommended format and content from the NRC codified in NUREG-1537, “Guidelines for Preparing and Reviewing Applications for the Licensing of Non-power Reactors,” Chapter 18, “Highly Enriched to Low-Enriched Uranium Conversions.” The emphasis in any conversion SAR is to explain the differences between the LEU and HEU cores and to show the acceptability of the new design; there is no need to repeat information regarding the current reactor that will not change upon conversion. Hence, as seen in the report, the bulk of the SAR is devoted to Chapter 4, Reactor Description, and Chapter 13, Safety Analysis.

  14. Preliminary Analysis of Remote Monitoring & Robotic Concepts for Performance Confirmation

    Energy Technology Data Exchange (ETDEWEB)

    D.A. McAffee

    1997-02-18

    (2) Identify and discuss the main Performance Confirmation monitoring needs and requirements during the post-emplacement preclosure period. This includes radiological, non-radiological, host rock, and infrastructure performance monitoring needs. It also includes monitoring for possible off-normal events. (Presented in Section 7.3). (3) Identify general approaches and methods for obtaining performance information from within the emplacement drifts for Performance Confirmation. (Presented in Section 7.4). (4) Review and discuss available technologies and design strategies that may permit the use of remotely operated systems within the hostile thermal and radiation environment expected within the emplacement drifts. (Presented in Section 7.5). (5) Based on Performance Confirmation monitoring needs and available technologies, identify potential application areas for remote systems and robotics for post-emplacement preclosure Performance Confirmation activities. (Presented in Section 7.6). (6) Develop preliminary remote monitoring and robotic concepts for post-emplacement, preclosure Performance Confirmation activities. (Presented in Section 7.7). This analysis is being performed very early in the systems engineering cycle, even as issues related to the Performance Confirmation program planning phase are being formulated and while the associated needs, constraints and objectives are yet to be fully determined and defined. This analysis is part of an issue formulation effort and is primarily concerned with identification and description of key issues related to remotely monitoring repository performance for Performance Confirmation. One of the purposes of this analysis is to provide an early investigation of potential design challenges that may have a high impact on future design concepts. This analysis can be used to guide future concept development and help assess what is feasible and achievable by application of remote systems technology. Future design and systems engineering

  15. First fungal genome sequence from Africa: A preliminary analysis

    Directory of Open Access Journals (Sweden)

    Rene Sutherland

    2012-01-01

    Full Text Available Some of the most significant breakthroughs in the biological sciences this century will emerge from the development of next generation sequencing technologies. The ease of availability of DNA sequence made possible through these new technologies has given researchers opportunities to study organisms in a manner that was not possible with Sanger sequencing. Scientists will, therefore, need to embrace genomics, as well as develop and nurture the human capacity to sequence genomes and utilise the 'tsunami' of data that emerges from genome sequencing. In response to these challenges, we sequenced the genome of Fusarium circinatum, a fungal pathogen of pine that causes pitch canker, a disease of great concern to the South African forestry industry. The sequencing work was conducted in South Africa, making F. circinatum the first eukaryotic organism for which the complete genome has been sequenced locally. Here we report on the process that was followed to sequence, assemble and perform a preliminary characterisation of the genome. Furthermore, details of the computer annotation and manual curation of this genome are presented. The F. circinatum genome was found to be nearly 44 million bases in size, which is similar to that of four other Fusarium genomes that have been sequenced elsewhere. The genome contains just over 15 000 open reading frames, which is less than that of the related species Fusarium oxysporum, but more than that of Fusarium verticillioides. Amongst the various putative gene clusters identified in F. circinatum, those encoding the secondary metabolites fumonisin and fusarin appeared to harbour evidence of gene translocation. It is anticipated that similar comparisons of other loci will provide insights into the genetic basis for pathogenicity of the pitch canker pathogen. Perhaps more importantly, this project has engaged a relatively large group of scientists

  16. Computational intelligent data analysis for sustainable development computational intelligent data analysis for sustainable development

    CERN Document Server

    Yu, Ting; Simoff, Simeon

    2016-01-01

    Computational Intelligent Data Analysis for Sustainable Development: An Introduction and Overview Ting Yu, Nitesh Chawla, and Simeon SimoffIntegrated Sustainability AnalysisTracing Embodied CO2 in Trade Using High-Resolution Input-Output Tables Daniel Moran and Arne GeschkeAggregation Effects in Carbon Footprint Accounting Using Multi-Region Input-Output Analysis Xin Zhou, Hiroaki Shirakawa, and Manfred LenzenComputational Intelligent Data Analysis for Climate ChangeClimate InformaticsClaire Monteleoni, Gavin A. Schmidt, Francis Alexander, Alexandru Niculescu-Mizil, Karsten Steinhaeuser, Michael Tippett, Arindam Banerjee, M. Benno Blumenthal, Auroop R. Ganguly, Jason E. Smerdon, and Marco TedescoComputational Data Sciences for Actionable Insights on Climate Extremes and Uncertainty Auroop R. Ganguly, Evan Kodra, Snigdhansu Chatterjee, Arindam Banerjee, and Habib N. NajmComputational Intelligent Data Analysis for Biodiversity and Species ConservationMathematical Programming Applications to Land Conservation an...

  17. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based on information obtained from software profiling and the resulting design is validated through cosimulation. The achieved speed-up is estimated based on an analysis of profiling information from different sets of input data and various architectural options.

  18. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  19. Preliminary Core Analysis of a Micro Modular Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Chang Keun; Chang, Jongwa [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Venneri, Francesco [Ultra Safe Nuclear Corporation, Los Alamos (United States); Hawari, Ayman [NC State Univ., Raleigh (United States)

    2014-05-15

    The Micro Modular Reactor (MMR) will be 'melt-down proof' (MDP) under all circumstances, including the complete loss of coolant; it will be easily transportable and retrievable, and suitable for use with very little site preparation and Balance of Plant (BOP) requirements for a variety of applications, from power generation and process heat applications in remote areas to grid-unattached locations, including ship propulsion. The Micro Modular Reactor design proposed in this paper has a 3 meter diameter core (2 meter active core), which is suitable for 'factory manufacture' and has a service life of a few tens of years for remote deployment. We confirmed the feasibility of a long service life by a preliminary neutronic analysis in terms of the excess reactivity, the temperature feedback coefficient, and the control margins. We are able to achieve a reasonably long core lifetime of 5-10 years under the typical thermal-hydraulic conditions of a helium-cooled reactor. However, in situations where a longer service period and safety are important, we can reduce the power density to the level of a typical pebble bed reactor. In this case we can design a 10 MWt MMR with this core diameter for a 10-40 year core lifetime without much loss in economics. Several burnable poisons were studied, and erbia mixed into the compact matrix was found to be a reasonably good poison. The temperature feedback coefficients remained negative throughout the lifetime. Drum-type control rods in the reflector region and a few control rods inside the core region are sufficient to control the reactivity during operation and to achieve a safe cold-shutdown state.

  20. Preliminary evaluation of diabatic heating distribution from FGGE level 3b analysis data

    Science.gov (United States)

    Kasahara, A.; Mizzi, A. P.

    1985-01-01

    A method is presented for calculating the global distribution of the diabatic heating rate. Preliminary results for the global heating rate evaluated from the European Centre for Medium-Range Weather Forecasts Level IIIb analysis data are also presented.

  1. The Organic Food Market and Marketing Initiatives in Europe: a Preliminary Analysis

    DEFF Research Database (Denmark)

    Kristensen, Niels Heine; Nielsen, Thorkild; Bruselius-Jensen, Maria Louisa

    2003-01-01

    Kristensen NH, Nielsen T, Bruselius-Jensen M, Scheperlen-Bøgh P, Beckie M, Foster C, Midmore P, Padel S (2002): The Organic Food Market and Marketing Initiatives in Europe: a Preliminary Analysis. Final Report to the EU Commission.

  2. Sammon mapping for preliminary analysis in Hyperspectral Imagery

    Directory of Open Access Journals (Sweden)

    Nicolae APOSTOLESCU

    2016-03-01

    Full Text Available The main goal of this paper is to present the implementation of the Sammon algorithm developed for finding N points in a lower m-dimensional subspace, where the original points are from a high n-dimensional space. This mapping is done so that interpoint Euclidean distances in the m-space correspond to the distances measured in the n-dimensional space. This method, known as a non-linear projection method or multidimensional scaling (MDS), aims to preserve the global properties of the points. The method is based on the idea of transforming the original, n-dimensional input space into a reduced, m-dimensional one, where m < n. Principal Component Analysis (PCA) may be applied as a pre-processing procedure in order to obtain the N starting points in the lower subspace. The algorithm was tested on hyperspectral data with spectra of various lengths. Depending on the size of the input data (number of points), the number of learning iterations and the computational facilities available, Sammon mapping might be computationally expensive.
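
    A compact version of the idea is easy to sketch. The snippet below is an assumption-laden illustration, not the authors' implementation (which, per the abstract, handles hyperspectral spectra and uses a PCA start): it minimizes the Sammon stress by plain gradient descent, whereas Sammon's original scheme uses a pseudo-Newton update, and the step size here may need tuning:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def sammon(X, m=2, n_iter=500, lr=0.3, eps=1e-12, seed=0):
    """Map points X (N x n) to Y (N x m) so that interpoint Euclidean
    distances in m-space approximate those measured in n-space."""
    rng = np.random.default_rng(seed)
    N = len(X)
    D = squareform(pdist(X)) + np.eye(N)      # original distances; eye avoids /0
    c = D[np.triu_indices(N, k=1)].sum()      # normalizing constant of the stress
    Y = rng.normal(scale=1e-2, size=(N, m))   # (PCA scores would be a better start)
    for _ in range(n_iter):
        d = squareform(pdist(Y)) + np.eye(N)
        ratio = (D - d) / (D * d + eps)       # per-pair stress weights
        np.fill_diagonal(ratio, 0.0)
        # gradient of the Sammon stress with respect to each mapped point
        grad = -2.0 / c * (ratio[:, :, None]
                           * (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
        Y -= lr * grad
    return Y

# Three well-separated clusters in 5-D stay separated in the 2-D map:
X = np.vstack([np.random.default_rng(i).normal(3 * i, 0.1, (10, 5)) for i in range(3)])
print(sammon(X).round(2))
```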

  3. Preliminary analysis of knee stress in Full Extension Landing

    Directory of Open Access Journals (Sweden)

    Majid Davoodi Makinejad

    2013-09-01

    Full Text Available OBJECTIVE: This study provides an experimental and finite element analysis of knee-joint structure during extended-knee landing based on the extracted impact force, and it numerically identifies the contact pressure, stress distribution and possibility of bone-to-bone contact when a subject lands from a safe height. METHODS: The impact time and loads were measured via inverse dynamic analysis of free landing without knee flexion from three different heights (25, 50 and 75 cm), using five subjects with an average body mass index of 18.8. Three-dimensional data were developed from computed tomography scans and were reprocessed with modeling software before being imported and analyzed by finite element analysis software. The whole leg was considered to be a fixed middle-hinged structure, while impact loads were applied to the femur in an upward direction. RESULTS: Straight landing exerted an enormous amount of pressure on the knee joint as a result of the body's inability to utilize the lower extremity muscles, thereby maximizing the threat of injury when the load exceeds the height-safety threshold. CONCLUSIONS: The researchers conclude that extended-knee landing results in serious deformation of the meniscus and cartilage and increases the risk of bone-to-bone contact and serious knee injury when the load exceeds the threshold safety height. This risk is considerably greater than the risk of injury associated with walking downhill or flexion landing activities.

  4. Analysis of rainfall-induced shallow landslides in Jamne and Jaszcze stream valleys (Polish Carpathians – preliminary results

    Directory of Open Access Journals (Sweden)

    Zydroń Tymoteusz

    2016-03-01

    Full Text Available Analysis of rainfall-induced shallow landslides in Jamne and Jaszcze stream valleys (Polish Carpathians) - preliminary results. Preliminary shallow landslide susceptibility mapping of the Jamne and Jaszcze stream valleys, located in the Polish Flysch Carpathians, is presented in the paper. For the purpose of mapping, the SINMAP and Iverson models, which integrate infiltration and slope stability calculations, were used. The calibration of the model parameters, obtained from limited field and laboratory tests, was performed using data from 8-9 July 1997, when, as a consequence of a very intense rainfall, 94 shallow landslides were observed on meadows and arable lands. A comparison of the slope stability calculation results and the locations of the observed shallow landslides showed satisfactory agreement between the locations of the observed and computed unstable areas. However, it was concluded that better simulation results were obtained using Iverson's model.

  5. Preliminary study on washability and composition analysis of highsulfur coal in some mining areas in Guizhou

    Institute of Scientific and Technical Information of China (English)

    QIU Yue-qin; MAO Song; ZHANG Qin; TIAN Ye; LIU Zhi-hong

    2011-01-01

    Preliminary sink-float experiments on high-sulfur coal from some mining areas were performed, and elemental analysis, proximate analysis, and ash content analysis were carried out. Through the experiments, middlings and gangue were defined, and a phase analysis of the sulfur was carried out, by which a good understanding of the sulfur characteristics in the raw coal was achieved.

  6. Cusum charts for preliminary analysis of individual observations

    NARCIS (Netherlands)

    A.J. Koning (Alex); R.J.M.M. Does (Ronald)

    1997-01-01

    A preliminary Cusum chart based on individual observations is developed from the uniformly most powerful test for the detection of linear trends. This Cusum chart is compared with several of its competitors which are based on the likelihood ratio test and on transformations of standardized recursive residuals, on which for instance the Q-chart methodology is based.
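
    For reference, a generic two-sided tabular Cusum for individual observations looks like the sketch below (a textbook scheme with assumed slack k and decision interval h, not the specific UMP-test-derived chart developed in the paper):

```python
import numpy as np

def cusum_individuals(x, target=None, k=0.5, h=5.0):
    """Two-sided tabular Cusum on individual observations.

    x      : 1-D array of individual observations
    target : reference level (defaults to the sample mean)
    k, h   : slack value and decision interval, in units of the sample
             standard deviation
    Returns the upper/lower Cusum paths and the indices that signal.
    """
    x = np.asarray(x, dtype=float)
    mu = x.mean() if target is None else target
    z = (x - mu) / x.std(ddof=1)
    c_plus = np.zeros(x.size)
    c_minus = np.zeros(x.size)
    for i in range(x.size):
        prev_p = c_plus[i - 1] if i else 0.0
        prev_m = c_minus[i - 1] if i else 0.0
        c_plus[i] = max(0.0, prev_p + z[i] - k)   # accumulates upward drift
        c_minus[i] = min(0.0, prev_m + z[i] + k)  # accumulates downward drift
    signals = np.where((c_plus > h) | (c_minus < -h))[0]
    return c_plus, c_minus, signals

# A linear trend buried in noise eventually triggers the chart:
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 60) + 0.1 * np.arange(60)
print(cusum_individuals(data)[2])  # indices where the chart signals
```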

  7. Synopsis of some preliminary computational studies related to unsaturated zone transport at Area G

    Energy Technology Data Exchange (ETDEWEB)

    Vold, E.

    1998-03-01

    Computational transport models are described with applications in three problem areas related to unsaturated zone moisture movement beneath Area G. These studies may be used to support the ongoing maintenance of the site Performance Assessment. The three areas include: a 1-D transient analysis with average tuff hydraulic properties in the near-surface region, with computed results compared to field data; the influence on near-surface transient moisture percolation of realistic distributions in hydraulic properties, derived statistically from the observed variance in the field data; and the west-to-east moisture flow in a 2-D steady geometry approximation of the Pajarito Plateau. Results indicate that a simple transient model for the transport of moisture volume fraction fits well the field data for a moisture pulse observed in the active disposal unit, pit 37. Using realistic infiltration boundary conditions for summer showers and for spring snow-melt conditions, the computed moisture pulses show significant propagation to less than 10-ft depth. Next, the hydraulic properties were varied on a 2-D grid using statistical distributions based on the field data means and variances for the hydraulic parameters. Near-surface transient percolation in these conditions shows a qualitatively realistic percolation with a spatially variable wave front moving into the tuff; however, the flow does not channel into preferred paths, suggesting there is no formation of fast paths which could enhance the transport of contaminants. Finally, moisture transport is modeled through an unsaturated 2-D slice representing the upper stratigraphic layers beneath Area G and a west-to-east cut of several miles, to examine possible lateral movement from the west, where percolation is assumed to be greater than at Area G. Results show some west-to-east moisture flux consistent with the assumed profile for the percolation boundary conditions.
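
    The first of these calculations can be caricatured in a few lines. The sketch below is a bare-bones explicit finite-difference model of 1-D transient moisture movement with a constant effective diffusivity; the diffusivity, grid, and infiltration pulse are invented for illustration, whereas the report's model uses actual tuff hydraulic properties:

```python
import numpy as np

D = 1.0e-7            # assumed effective moisture diffusivity [m^2/s]
dz, dt = 0.05, 600.0  # grid spacing [m] and time step [s]
z = np.arange(0.0, 5.0 + dz, dz)
theta = np.full(z.size, 0.05)       # initial volumetric moisture content

assert D * dt / dz**2 < 0.5         # explicit-scheme stability criterion
for step in range(int(30 * 86400 / dt)):            # march 30 days
    # surface boundary: a 2-day infiltration pulse, then dry again
    theta[0] = 0.25 if step * dt < 2 * 86400 else 0.05
    theta[1:-1] += D * dt / dz**2 * (theta[2:] - 2 * theta[1:-1] + theta[:-2])

print(f"moisture at 1 m depth after 30 days: {theta[int(1.0 / dz)]:.3f}")
# The pulse decays within roughly the top meter, qualitatively consistent
# with the report's finding of propagation to less than 10-ft depth.
```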

  8. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS TM , to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  9. Multiresolution analysis over simple graphs for brain computer interfaces

    Science.gov (United States)

    Asensio-Cubero, J.; Gan, J. Q.; Palaniappan, R.

    2013-08-01

    Objective. Multiresolution analysis (MRA) offers a useful framework for signal analysis in the temporal and spectral domains, although commonly employed MRA methods may not be the best approach for brain computer interface (BCI) applications. This study aims to develop a new MRA system for extracting tempo-spatial-spectral features for BCI applications based on wavelet lifting over graphs. Approach. This paper proposes a new graph-based transform for wavelet lifting and a tailored simple graph representation for electroencephalography (EEG) data, which results in an MRA system where temporal, spectral and spatial characteristics are used to extract motor imagery features from EEG data. The transformed data is processed within a simple experimental framework to test the classification performance of the new method. Main Results. The proposed method can significantly improve the classification results obtained by various wavelet families using the same methodology. Preliminary results using common spatial patterns as feature extraction method show that we can achieve comparable classification accuracy to more sophisticated methodologies. From the analysis of the results we can obtain insights into the pattern development in the EEG data, which provide useful information for feature basis selection and thus for improving classification performance. Significance. Applying wavelet lifting over graphs is a new approach for handling BCI data. The inherent flexibility of the lifting scheme could lead to new approaches based on the hereby proposed method for further classification performance improvement.
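
    The predict/update idea behind lifting is compact enough to show in full for the classical (non-graph) case. The sketch below performs one level of a Haar-style lifting transform on a 1-D signal; it is a generic illustration of the scheme the paper generalizes to neighborhoods on a graph, not the authors' graph construction:

```python
import numpy as np

def lifting_step(x):
    """One level of a Haar-style lifting transform on a 1-D signal.

    predict: estimate each odd sample from its even neighbor and keep
             the prediction error as the detail coefficients;
    update:  adjust the even samples so the coarse approximation
             preserves the local average.
    """
    even, odd = x[0::2], x[1::2]
    detail = odd - even            # predict step
    approx = even + detail / 2.0   # update step (pairwise means)
    return approx, detail

x = np.array([2.0, 4.0, 6.0, 8.0, 5.0, 7.0])
approx, detail = lifting_step(x)
print(approx)  # -> [3. 7. 6.]  local means (coarse approximation)
print(detail)  # -> [2. 2. 2.]  local differences (detail)
```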

  10. Computational analysis of aircraft pressure relief doors

    Science.gov (United States)

    Schott, Tyler

    Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRDs to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. Further, the study provides aircraft

  11. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.

  12. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    Science.gov (United States)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and fluent communication in the disaster areas. Ports and harbors are the main transportation hubs, which must perform properly at all times, especially after disasters. The resilience of coastal utilities after earthquakes and tsunamis is of major importance for efficient and proper rescue and recovery operations soon after the disasters. Istanbul is a megacity with its various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization which are vulnerable to marine disasters. Therefore, the vulnerability of the Yenikapi region of Istanbul to tsunamis and other marine hazards is an important issue. In this study, a methodology for vulnerability analysis under tsunami attack is proposed, with application to the Yenikapi region. In the study, the high-resolution (1 m) GIS database of the Istanbul Metropolitan Municipality (IMM) is used and analyzed by means of GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructures in the study area are obtained for tsunami numerical modeling of the study area. A GIS-based tsunami vulnerability assessment is conducted by applying Multi-criteria Decision Making Analysis (MCDA). The tsunami parameters for deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region are due to two different classifications: i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation
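
    The MCDA step can be pictured as a weighted overlay of normalized criterion layers. The toy example below is a hedged schematic only; the weights, layers, and values are invented, whereas the study derives its criteria from the IMM database and NAMI DANCE simulations:

```python
import numpy as np

# Invented 2 x 2 criterion rasters (one value per grid cell):
flow_depth = np.array([[0.0, 1.2], [2.5, 4.0]])            # tsunami flow depth [m]
fragility = np.array([[0.2, 0.5], [0.7, 0.9]])             # structural class score
dist_to_shore = np.array([[900.0, 400.0], [150.0, 30.0]])  # distance to shore [m]

def normalize(layer, invert=False):
    """Rescale a layer to [0, 1]; invert when small values mean high risk."""
    z = (layer - layer.min()) / (layer.max() - layer.min())
    return 1.0 - z if invert else z

weights = {"depth": 0.5, "fragility": 0.3, "proximity": 0.2}  # assumed weights
score = (weights["depth"] * normalize(flow_depth)
         + weights["fragility"] * normalize(fragility)
         + weights["proximity"] * normalize(dist_to_shore, invert=True))
print(np.round(score, 2))  # higher score = more vulnerable cell
```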

  13. Current Mooring Design in Partner WECs and Candidates for Preliminary Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Ferri, Francesco; Kofoed, Jens Peter

    This report is the combined report of Commercial Milestone "CM1: Design and Cost of Current Mooring Solutions of Partner WECs" and Milestone "M3: Mooring Solutions for Preliminary Analysis" of the EUDP project "Mooring Solutions for Large Wave Energy Converters". The report covers a description of the current mooring designs of the partner Wave Energy Converter (WEC) developers in the project, together with a preliminary cost estimate of the systems.

  14. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    Science.gov (United States)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  15. Preliminary Dynamic Soil-Structure Interaction Analysis for the Waste Handling Building

    Energy Technology Data Exchange (ETDEWEB)

    G. Wagenblast

    2000-05-01

    The objective of this analysis package is to document a preliminary dynamic seismic evaluation of a simplified design concept of the Waste Handling Building (WHB). Preliminary seismic ground motions and soil data will be used. The loading criteria of the WHB System Design Description will be used. Detailed design of structural members will not be performed. The results of the analysis will be used to determine preliminary sizes of structural concrete and steel members and to determine whether the seismic response of the structure is within an acceptable level for future License Application design of safety-related facilities. In order to complete this preliminary dynamic evaluation to meet the Site Recommendation (SR) schedule, the building configuration was ''frozen in time'' as the conceptual design existed in October 1999. Modular design features and dry or wet waste storage features were intentionally excluded from this preliminary dynamic seismic evaluation. The document was prepared in accordance with the Development Plan for the ''Preliminary Dynamic Soil-Structure Interaction Analysis for the Waste Handling Building'' (CRWMS M&O 2000b), which was completed in accordance with AP-2.13Q, ''Technical Product Development Planning''.

  16. Computer-Based Reading Programs: A Preliminary Investigation of Two Parent Implemented Programs with Students At-Risk for Reading Failure

    Science.gov (United States)

    Pindiprolu, Sekhar S.; Forbush, David

    2009-01-01

    In 2000, National Reading Panelists (NRP) reported that computer delivered reading instruction has potential for promoting the reading skills of students at-risk for reading failure. However, panelists also noted a scarcity of data present in the literature on the effects of computer-based reading instruction. This preliminary investigation…

  17. Thermal Hydraulic Analysis of K-DEMO Single Blanket Module for Preliminary Accident Analysis using MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Sung Bo; Bang, In Cheol [UNIST, Ulsan (Korea, Republic of)

    2016-05-15

    To develop a Korean fusion commercial reactor, a preliminary design concept for K-DEMO (Korean fusion demonstration reactor) has been announced by NFRI (National Fusion Research Institute). This pre-conceptual study of K-DEMO was introduced to identify the technical details of a fusion power plant for the future commercialization of fusion reactors in Korea. Before K-DEMO can be built, however, accident analysis is essential. Since the Fukushima accident, a severe accident caused by an unexpected disaster, the safety analysis of nuclear power plants has become increasingly important. The safety analysis of both fission and fusion reactors is deemed crucial in demonstrating the low radiological effect of these reactors on the environment during severe accidents. A risk analysis of K-DEMO should be performed as a prerequisite for the construction of a fusion reactor. In this research, a thermal-hydraulic analysis of a single blanket module of K-DEMO is conducted as a preliminary accident analysis for K-DEMO, together with a further study of the effect of the flow distributor. The normal K-DEMO operating condition is applied as the boundary condition and simulated to verify the material temperature limit using MELCOR. MELCOR is a fully integrated, relatively fast-running code developed by Sandia National Laboratories. MELCOR had been used for Light Water Reactors, and a fusion-reactor version of MELCOR was developed for ITER accident analysis. This study shows the results of a thermal-hydraulic simulation of a single blanket module with MELCOR, a severe accident code used for nuclear fusion safety analysis. The difference in mass flow rate for each coolant channel with and without the flow distributor is presented. With the flow distributor, a more even temperature distribution in the K-DEMO blanket module and an increased mass flow toward the first wall are obtained, which can enhance the safety of the K-DEMO blanket module. A maximum temperature difference of 13 °C in the blanket module is obtained.

  18. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
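
    The computational core of such a program is the hazard integral: the annual rate at which a ground-motion level is exceeded, combining a magnitude recurrence law with an attenuation model. The sketch below is a generic single-source illustration with invented constants, not McGuire's FORTRAN:

```python
import numpy as np
from scipy.stats import norm

nu, beta, m_min, m_max = 0.05, 2.0, 5.0, 7.5   # source activity rate and G-R slope
r_km, sigma_ln = 20.0, 0.6                     # site distance and aleatory scatter

def ln_median_pga(m, r):
    """Toy attenuation law: ln of the median PGA in g."""
    return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

m = np.linspace(m_min, m_max, 500)
# truncated exponential (Gutenberg-Richter) magnitude density
pdf_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

def exceedance_rate(a):
    """Annual rate of PGA > a [g] from this single source."""
    p_exceed = norm.sf((np.log(a) - ln_median_pga(m, r_km)) / sigma_ln)
    return nu * np.trapz(p_exceed * pdf_m, m)

for a in (0.05, 0.1, 0.2):
    print(f"annual rate of PGA > {a:.2f} g: {exceedance_rate(a):.2e}")
```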

  19. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classifications in the contents of social sciences. It also covers various real-life examples such as t

  20. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the addition of new computers to the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. A computer virus model is established. Through the analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate the theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
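
    A minimal model of this type is easy to simulate. The SIR-style sketch below, with recruitment of new computers and removal of old ones, is a generic stand-in under assumed parameters, not the paper's exact equations:

```python
import numpy as np
from scipy.integrate import odeint

def virus_model(y, t, b, mu, beta, gamma):
    """SIR-style computer-virus dynamics with recruitment b of new
    computers and removal rate mu of old ones; beta is the infection
    rate and gamma the cure rate due to antivirus software."""
    S, I, R = y
    dS = b - beta * S * I - mu * S
    dI = beta * S * I - (gamma + mu) * I
    dR = gamma * I - mu * R
    return [dS, dI, dR]

b, mu, beta, gamma = 0.02, 0.02, 0.5, 0.1
t = np.linspace(0.0, 200.0, 2001)
S, I, R = odeint(virus_model, [0.95, 0.05, 0.0], t, args=(b, mu, beta, gamma)).T

# The endemic equilibrium exists when the basic reproduction number
# R0 = beta * S0 / (gamma + mu) exceeds 1, where S0 = b / mu is the
# susceptible population at the disease-free equilibrium.
R0 = beta * (b / mu) / (gamma + mu)
print(f"R0 = {R0:.2f}, infected fraction at t = 200: {I[-1]:.3f}")
```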

  1. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  2. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interfacing of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
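
    The core symbolic step of the stochastic perturbation technique fits in a few lines. The sketch below uses sympy as a stand-in for MAPLE (an assumption, since the paper works in MAPLE): Taylor-expand a response function f(b) about the mean b0 of a random parameter b and take expectations term by term, using E[b - b0] = 0 and E[(b - b0)^2] = sigma^2:

```python
import sympy as sp

b, b0, sigma = sp.symbols('b b0 sigma', positive=True)

# Example response function; any smooth f(b) works the same way.
f = sp.exp(-b)

# Second-order expectation: E[f(b)] ~ f(b0) + f''(b0) * sigma**2 / 2,
# because the first-order term vanishes (E[b - b0] = 0).
mean_f = f.subs(b, b0) + sp.Rational(1, 2) * sp.diff(f, b, 2).subs(b, b0) * sigma**2
print(sp.simplify(mean_f))   # -> exp(-b0)*(sigma**2 + 2)/2
```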

  3. Preliminary In-Flight Loads Analysis of In-Line Launch Vehicles using the VLOADS 1.4 Program

    Science.gov (United States)

    Graham, J. B.; Luz, P. L.

    1998-01-01

    To calculate the structural loads of in-line launch vehicles for preliminary design, a very useful computer program is VLOADS 1.4. This software may also be used to calculate structural loads for upper stages and planetary transfer vehicles. Launch vehicle inputs such as aerodynamic coefficients, mass properties, propellants, engine thrusts, and performance data are compiled and analyzed by VLOADS to produce distributed shear loads, bending moments, axial forces, and vehicle line loads as a function of X-station along the vehicle's length. Interface loads, if any, and translational accelerations are also computed. The major strength of the software is that it enables quick-turnaround analysis of structural loads for launch vehicles during the preliminary design stage of development. This represents a significant improvement over the alternative: the time-consuming and expensive chore of developing finite element models. VLOADS was developed as a Visual BASIC macro in a Microsoft Excel 5.0 workbook on a Macintosh. VLOADS has also been implemented on a PC using Microsoft Excel 7.0a for Windows 95. VLOADS was developed in 1996, and the current version was released to COSMIC, NASA's Software Technology Transfer Center, in 1997. The program is a copyrighted work with all copyright vested in NASA.
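
    The underlying bookkeeping is beam mechanics: integrating a distributed line load along the X-station gives shear, and integrating shear gives bending moment. The sketch below is a hedged illustration with an invented, self-equilibrated load (not VLOADS source code, which also handles interface loads and accelerations):

```python
import numpy as np

L = 30.0                                  # vehicle length [m]
x = np.linspace(0.0, L, 301)              # X-station along the vehicle
w = 2.0e3 * np.cos(2.0 * np.pi * x / L)   # line load [N/m]; integrates to zero
                                          # net force and zero net moment

def cumtrap(y, x):
    """Cumulative trapezoidal integral, starting at zero."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))))

shear = cumtrap(w, x)        # V(x) = integral of w
moment = cumtrap(shear, x)   # M(x) = integral of V

print(f"max shear  = {shear.max():9.0f} N")
print(f"max moment = {moment.max():9.0f} N*m")
# For a trimmed (free-free) vehicle both return to ~0 at the aft end:
print(round(shear[-1], 3), round(moment[-1], 3))
```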

  4. Pooled shRNA screenings: computational analysis.

    Science.gov (United States)

    Yu, Jiyang; Putcha, Preeti; Califano, Andrea; Silva, Jose M

    2013-01-01

    Genome-wide RNA interference screening has emerged as a powerful tool for functional genomic studies of disease-related phenotypes and the discovery of molecular therapeutic targets for human diseases. Commercial short hairpin RNA (shRNA) libraries are commonly used in this area, and state-of-the-art technologies including microarray and next-generation sequencing have emerged as powerful methods to analyze shRNA-triggered phenotypes. However, computational analysis of this complex data remains challenging due to noise and small sample size from such large-scaled experiments. In this chapter we discuss the pipelines and statistical methods of processing, quality assessment, and post-analysis for both microarray- and sequencing-based screening data.

  5. Cusum charts for preliminary analysis of individual observations

    OpenAIRE

    1997-01-01

    A preliminary Cusum chart based on individual observations is developed from the uniformly most powerful test for the detection of linear trends. This Cusum chart is compared with several of its competitors which are based on the likelihood ratio test and on transformations of standardized recursive residuals on which for instance the Q-chart methodology is based. It turns out that the new proposed Cusum chart is not only superior in the detection of linear trend out-of-control co...

  6. Volume analysis of heat-induced cracks in human molars: A preliminary study

    Directory of Open Access Journals (Sweden)

    Michael A. Sandholzer

    2014-01-01

    Full Text Available Context: Only a few methods have been published dealing with the visualization of heat-induced cracks inside bones and teeth. Aims: As a novel approach this study used nondestructive X-ray microtomography (micro-CT) for volume analysis of heat-induced cracks to observe the reaction of human molars to various levels of thermal stress. Materials and Methods: Eighteen clinically extracted third molars were rehydrated and burned under controlled temperatures (400, 650, and 800°C) using an electric furnace adjusted with a 25°C increase/min. The subsequent high-resolution scans (voxel size 17.7 μm) were made with a compact micro-CT scanner (SkyScan 1174). In total, 14 scans were automatically segmented with Definiens XD Developer 1.2 and three-dimensional (3D) models were computed with Visage Imaging Amira 5.2.2. The results of the automated segmentation were analyzed with an analysis of variance (ANOVA) and uncorrected post hoc least significant difference (LSD) tests using the Statistical Package for Social Sciences (SPSS 17). A probability level of P < 0.05 was used as an index of statistical significance. Results: A temperature-dependent increase of heat-induced cracks was observed between the three temperature groups (P < 0.05, ANOVA post hoc LSD). In addition, the distributions and shape of the heat-induced changes could be classified using the computed 3D models. Conclusion: The macroscopic heat-induced changes observed in this preliminary study correspond with previous observations of unrestored human teeth, yet the current observations also take into account the entire microscopic 3D expansion of heat-induced cracks within the dental hard tissues. Using the same experimental conditions proposed in the literature, this study confirms previous results, adds new observations, and offers new perspectives in the investigation of forensic evidence.
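
    The statistical step described above is straightforward to re-create. The sketch below (the paper used SPSS 17; the crack-volume numbers here are invented) runs a one-way ANOVA across the three temperature groups, then uncorrected pairwise t-tests standing in for the LSD procedure, which would strictly pool the within-group variance across all groups:

```python
import numpy as np
from scipy import stats

# Hypothetical crack volumes per temperature group (arbitrary units):
g400 = np.array([0.8, 1.1, 0.9, 1.0])
g650 = np.array([2.0, 2.4, 1.9, 2.2])
g800 = np.array([3.5, 3.9, 3.6, 3.8])

f_stat, p_val = stats.f_oneway(g400, g650, g800)
print(f"ANOVA: F = {f_stat:.1f}, p = {p_val:.2g}")

for name, a, b in [("400 vs 650", g400, g650),
                   ("650 vs 800", g650, g800),
                   ("400 vs 800", g400, g800)]:
    t, p = stats.ttest_ind(a, b)
    print(f"{name}: p = {p:.2g}", "significant" if p < 0.05 else "n.s.")
```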

  7. Grid-connected ICES preliminary feasibility analysis and evaluation. Final report. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-30

    A group of hospitals, clinics, research facilities, and medical education facilities, known as the HEAL Complex, was chosen as the site (in New Orleans) for the demonstration of a Grid-Connected Integrated Community Energy System (ICES). The contract work included a preliminary energy supply/demand assessment of the Demonstration Community, a preliminary feasibility analysis and conceptual design of a candidate Demonstration System, preliminary assessment of institutional factors, preparation of a detailed work management plan for subsequent phases of the demonstration program, firming-up of commitments from participating parties, and reporting thereon. This Phase I study has indicated that a central ICES plant producing steam, chilled water, and by-product electricity to serve the HEAL Complex is technically and economically feasible to the extent that Phase II, Detailed Feasibility and Preliminary Design, should be implemented. (MCW)

  8. Application of Computed Tomography Virtual Noncontrast Spectral Imaging in Evaluation of Hepatic Metastases: A Preliminary Study

    Institute of Scientific and Technical Information of China (English)

    Shi-Feng Tian; Ai-Lian Liu; Jing-Hong Liu; Mei-Yu Sun; He-Qing Wang; Yi-Jun Liu

    2015-01-01

    Objective: The objective was to qualitatively and quantitatively evaluate hepatic metastases using computed tomography (CT) virtual noncontrast (VNC) spectral imaging in a retrospective analysis. Methods: Forty hepatic metastases patients underwent CT scans including the conventional true noncontrast (TNC) scan and tri-phasic contrast-enhanced dual-energy spectral scans in the hepatic arterial, portal venous, and equilibrium phases. The tri-phasic spectral CT images were used to obtain three groups of VNC images, in the arterial (VNCa), venous (VNCv), and equilibrium (VNCe) phases, by the material decomposition process using water and iodine as a base material pair. The image quality and the contrast-to-noise ratio (CNR) of metastases in the four groups were compared with ANOVA analysis. The metastasis detection rates with the four nonenhanced image groups were calculated and compared using the Chi-square test. Results: There were no significant differences in image quality among the TNC, VNCa and VNCv images (P > 0.05). The quality of the VNCe images was significantly worse than that of the other three groups (P < 0.05). The mean CNR of metastases in the TNC, VNCa, VNCv, and VNCe images was 1.86, 2.42, 1.92, and 1.94, respectively; the mean CNR of metastases in the VNCa images was significantly higher than that in the other three groups (P < 0.05), while no statistically significant difference was observed among the VNCv, VNCe and TNC images (P > 0.05). The metastasis detection rates of the four nonenhanced groups showed no statistically significant difference (P > 0.05). Conclusions: The quality of VNCa and VNCv images is identical to that of TNC images, and the metastasis detection rate in VNC images is similar to that in TNC images. VNC images obtained from the arterial phase show metastases more clearly. Thus, VNCa imaging may be a surrogate for TNC imaging in hepatic metastasis diagnosis.
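
    For readers unfamiliar with the metric, the contrast-to-noise ratio can be computed from region-of-interest (ROI) statistics as below; this is one common definition and the HU samples are invented, since the abstract does not spell out the exact formula used:

```python
import numpy as np

def cnr(roi_lesion, roi_liver):
    """Contrast-to-noise ratio of a metastasis against liver parenchyma:
    |mean difference| divided by the background noise (standard deviation
    of the liver ROI). Inputs are arrays of CT numbers (HU)."""
    roi_lesion = np.asarray(roi_lesion, dtype=float)
    roi_liver = np.asarray(roi_liver, dtype=float)
    return abs(roi_lesion.mean() - roi_liver.mean()) / roi_liver.std(ddof=1)

# Hypothetical ROI samples (HU):
print(round(cnr([45, 48, 50, 44], [70, 72, 69, 71, 68]), 2))
```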

  9. Dynamic contrast-enhanced computed tomography as a potential biomarker in patients with metastatic renal cell carcinoma: preliminary results from the Danish Renal Cancer Group Study-1

    DEFF Research Database (Denmark)

    Mains, Jill Rachel; Donskov, Frede; Pedersen, Erik Morre

    2014-01-01

    OBJECTIVES: The aim of this study was to explore the impact of dynamic contrast-enhanced (DCE) computer tomography (CT) as a biomarker in metastatic renal cell carcinoma (mRCC). MATERIALS AND METHODS: Twelve patients with favorable or intermediate Memorial Sloan Kettering Cancer Center risk group and clear cell mRCC participating in an ongoing prospective randomized phase II trial comprising interleukin-2-based immunotherapy and bevacizumab were included in this preliminary analysis. All patients had a follow-up time of at least 2 years. Interpretation of DCE-CT (max slope method) was performed ... not reached, P = 0.031). CONCLUSIONS: Dynamic contrast-enhanced CT is a potential biomarker in patients with mRCC. High baseline BF and reductions in BF and BV during early treatment are associated with improved outcome. Large-scale studies are required.

  10. Turbine Fuels from Tar Sands Bitumen and Heavy Oil. Phase I. Preliminary Process Analysis.

    Science.gov (United States)

    1985-04-09

    Process Analysis A. F. Talbot, V. Elanchenny, L. H. Finkel, A. Macris and J. P. Schwedock Sun Tech, Inc., A Subsidiary of Sun Co. P. O. Box 1135 Marcus Hook...investigation be carried out in three discrete phases, as described below: Phase I - Preliminary process analysis includes an evaluation of the potential of

  11. A Preliminary Study on Gender Differences in Studying Systems Analysis and Design

    Science.gov (United States)

    Lee, Fion S. L.; Wong, Kelvin C. K.

    2017-01-01

    Systems analysis and design is a crucial task in system development and is included in a typical information systems programme as a core course. This paper presented a preliminary study on gender differences in studying a systems analysis and design course of an undergraduate programme. Results indicated that male students outperformed female…

  12. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming the blood flow to be laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software package, coupled with Solidworks, a modeling software package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angle-shaped vessels, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow, accounting for the effect of body forces with a compliant boundary, was also performed.

  13. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  14. Preliminary Computational Fluid Dynamics (CFD) Simulation of EIIB Push Barge in Shallow Water

    Science.gov (United States)

    Beneš, Petr; Kollárik, Róbert

    2011-12-01

    This study presents a preliminary CFD simulation of an EIIb push barge in inland conditions using the CFD software Ansys Fluent. RANSE (Reynolds-Averaged Navier-Stokes Equation) methods are used for the viscous solution of the turbulent flow around the ship hull. Different RANSE methods are compared on ship resistance calculations in order to select the appropriate methods and discard the inappropriate ones. This study further describes the creation of the geometrical model, which considers the exact water-depth-to-vessel-draft ratio in shallow water conditions, grid generation, the setup of the mathematical model in Fluent, and the evaluation of the simulation results.

  15. Preliminary Axial Flow Turbine Design and Off-Design Performance Analysis Methods for Rotary Wing Aircraft Engines. Part 1; Validation

    Science.gov (United States)

    Chen, Shu-cheng, S.

    2009-01-01

    For the preliminary design and the off-design performance analysis of axial flow turbines, a pair of intermediate level-of-fidelity computer codes, TD2-2 (design; reference 1) and AXOD (off-design; reference 2), are being evaluated for use in turbine design and performance prediction for modern high performance aircraft engines. TD2-2 employs a streamline curvature method for design, while AXOD approaches the flow analysis with an equal radius-height domain decomposition strategy. Both methods resolve only the flows in the annulus region while modeling the impact introduced by the blade rows. The mathematical formulations and derivations involved in both methods are documented in references 3 and 4 (for TD2-2) and in reference 5 (for AXOD). The focus of this paper is to discuss the fundamental issues of applicability and compatibility of the two codes as a pair of companion pieces for performing preliminary design and off-design analysis of modern aircraft engine turbines. Two validation cases for design and off-design prediction using TD2-2 and AXOD, conducted on two existing high efficiency turbines developed and tested in the NASA/GE Energy Efficient Engine (GE-E3) Program, namely the High Pressure Turbine (HPT; two stages, air cooled) and the Low Pressure Turbine (LPT; five stages, un-cooled), are provided in support of the analysis and discussion presented in this paper.

  16. Computational analysis of unmanned aerial vehicle (UAV)

    Science.gov (United States)

    Abudarag, Sakhr; Yagoub, Rashid; Elfatih, Hassan; Filipovic, Zoran

    2017-01-01

    A computational analysis has been performed to verify the aerodynamic properties of an Unmanned Aerial Vehicle (UAV). The UAV-SUST was designed and fabricated at the Department of Aeronautical Engineering at Sudan University of Science and Technology in order to meet the specifications required for surveillance and reconnaissance missions. It is classified as a medium-range and medium-endurance UAV. A commercial CFD solver is used to simulate the steady and unsteady aerodynamic characteristics of the entire UAV. In addition to the Lift Coefficient (CL), Drag Coefficient (CD), Pitching Moment Coefficient (CM) and Yawing Moment Coefficient (CN), the pressure and velocity contours are illustrated. The aerodynamic parameters show very good agreement with the design considerations at angles of attack ranging from zero to 26 degrees. Moreover, the visualization of the velocity field and static pressure contours indicates satisfactory agreement with the proposed design. Turbulence is predicted using the k-ω SST turbulence model within the computational fluid dynamics code.
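
    The coefficients named above follow their standard nondimensional definitions. A minimal sketch is given below; the values are illustrative, not UAV-SUST data.

        def aero_coefficients(lift, drag, rho, v, s_ref):
            # CL = L / (q*S) and CD = D / (q*S), with dynamic pressure q = 0.5*rho*V^2
            q = 0.5 * rho * v**2
            return lift / (q * s_ref), drag / (q * s_ref)

        cl, cd = aero_coefficients(lift=900.0, drag=75.0, rho=1.225, v=30.0, s_ref=1.8)
        print(f"CL = {cl:.3f}, CD = {cd:.3f}, L/D = {cl/cd:.1f}")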

  17. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is a major cause of death and disability world-wide. It affects lung function through destruction of lung tissue, known as emphysema, and inflammation of the airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging...... have become the standard with which to assess emphysema extent but airway abnormalities have so far been more challenging to quantify. Automated methods for analysis are indispensable as the visible airway tree in a CT scan can include several hundreds of individual branches. However, automation...... of scan on airway dimensions in subjects with and without COPD. The results show measured airway dimensions to be affected by differences in the level of inspiration, and this dependency is in turn influenced by COPD. Inspiration level should therefore be accounted for when measuring airways, and airway

  18. Children's strategies to solving additive inverse problems: a preliminary analysis

    Science.gov (United States)

    Ding, Meixia; Auxter, Abbey E.

    2017-03-01

    Prior studies show that elementary school children generally "lack" formal understanding of inverse relations. This study goes beyond lack to explore what children might "have" in their existing conception. A total of 281 students, kindergarten to third grade, were recruited to respond to a questionnaire that involved both contextual and non-contextual tasks on inverse relations, requiring both computational and explanatory skills. Results showed that children demonstrated better performance in computation than explanation. However, many students' explanations indicated that they did not necessarily utilize inverse relations for computation. Rather, they appeared to possess partial understanding, as evidenced by their use of part-whole structure, which is a key to understanding inverse relations. A close inspection of children's solution strategies further revealed that the sophistication of children's conception of part-whole structure varied in representation use and unknown quantity recognition, which suggests rich opportunities to develop students' understanding of inverse relations in lower elementary classrooms.

  20. Computational electromagnetic analysis of plasmonic effects in interdigital photodetectors

    Science.gov (United States)

    Hill, Avery M.; Nusir, Ahmad I.; Nguyen, Paul V.; Manasreh, Omar M.; Herzog, Joseph B.

    2014-09-01

    Plasmonic nanostructures have been shown to act as optical antennas that enhance optical devices. This study focuses on computational electromagnetic (CEM) analysis of GaAs photodetectors with gold interdigital electrodes. Experiments have shown that the photoresponse of the devices depends greatly on the electrode spacing and the polarization of the incident light: smaller electrode spacing and transverse polarization give rise to a larger photoresponse. This computational study simulates the optical properties of these devices to determine what plasmonic properties and optical enhancement they may have. The models solve Maxwell's equations with a finite element method (FEM) algorithm provided by the software COMSOL Multiphysics 4.4. The preliminary results gathered from the simulations follow the same trends seen in the experimental data: the spectral response increases as the electrode spacing decreases. The simulations also show that incident light with the electric field polarized transversely across the electrodes produces a larger photocurrent than longitudinally polarized light, a dependency similar to that of other plasmonic devices. The simulation results compare well with the experimental data. This work will also model enhancement effects in nanostructure devices with dimensions smaller than those of the current samples, to lead the way for future nanoscale devices. Understanding the potential effects of decreased spacing opens the door to a new set of smaller-scale devices, potentially with a higher level of enhancement. In addition, precise modeling and understanding of the effects of the parameters provides avenues to optimize the enhancement of these structures, making more efficient photodetectors. Similar structures could also potentially be used for enhanced photovoltaics.

  1. The difference between playing games with and without the computer: a preliminary view.

    Science.gov (United States)

    Antonietti, Alessandro; Mellone, Rosa

    2003-03-01

    The authors address the question of whether associations between video games and cognitive and metacognitive variables depend either on the features of the computer or on the content of the game that the computer allows one to play. An experiment to separate these two kinds of effects was carried out by using a traditional version and a computer-supported version of Pegopolis, a solitaire game. The two versions were exactly the same except that they were played by moving pieces either on a real board or on a virtual computer-presented board. The performance levels and strategies followed during the game by the 40 undergraduates who took part in the experiment were not significantly different in the real and virtual conditions. None of the participants transferred playing strategies or practice from one version of the game to the other. Scores were not affected by gender or by the studies pursued by participants, the habit of playing games in the traditional manner or playing video games, or intelligence. Retrospective reports did not support differences in the subjective experience between the two versions. Results showed that video games, when they do not make much use of the computer's special features, produce effects because of the situations they simulate rather than because of features of the computer itself.

  2. Investigating the role of combined acoustic-visual feedback in one-dimensional synchronous brain computer interfaces, a preliminary study

    Science.gov (United States)

    Gargiulo, Gaetano D; Mohamed, Armin; McEwan, Alistair L; Bifulco, Paolo; Cesarelli, Mario; Jin, Craig T; Ruffo, Mariano; Tapson, Jonathan; van Schaik, André

    2012-01-01

    Feedback plays an important role when learning to use a brain computer interface (BCI), particularly in the case of synchronous feedback that relies on interaction with the subject. In this preliminary study, we investigate the role of combined auditory-visual feedback during synchronous μ rhythm-based BCI sessions in helping the subject remain focused on the selected imaginary task. This new combined feedback, now integrated within the general purpose BCI2000 software, has been tested on eight untrained and three trained subjects during a monodimensional left-right control task. In order to reduce the setup burden and maximize subject comfort, an electroencephalographic device suitable for dry electrodes that required no skin preparation was used. Quality and degree of improvement were evaluated based on a personal self-assessment questionnaire from each subject and on quantitative data based on subject performance. Results for this preliminary study show that the combined feedback was well tolerated by the subjects and improved performance in 75% of the naïve subjects compared with visual feedback alone. PMID:23152713

  3. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared to traditional hosting, cloud computing is at a disadvantage in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
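
    A minimal sketch of the two-part framework described above: stochastic demand (exponential interarrivals) against a fixed pool of servers, simulated with a simple event queue. The structure, rates, and server count are illustrative placeholders, not the study's model.

        import heapq, random

        random.seed(1)
        N_SERVERS, SIM_TIME = 4, 10_000.0
        ARRIVAL_RATE, SERVICE_RATE = 0.5, 0.15  # requests/s; completions/s per server

        t, busy, queue = 0.0, 0, 0
        arrivals = queued = completed = 0
        events = [(random.expovariate(ARRIVAL_RATE), "arrival")]

        while events and t < SIM_TIME:
            t, kind = heapq.heappop(events)
            if kind == "arrival":
                arrivals += 1
                heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
                if busy < N_SERVERS:  # a server is free: start service immediately
                    busy += 1
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
                else:                 # all servers busy: request must wait
                    queue += 1
                    queued += 1
            else:  # departure
                completed += 1
                if queue > 0:         # pull the next waiting request into service
                    queue -= 1
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
                else:
                    busy -= 1

        print(f"arrivals={arrivals}, completed={completed}, P(wait)~{queued/arrivals:.2f}")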

  4. Thick Concrete Specimen Construction, Testing, and Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoegh, Kyle [Univ. of Minnesota, Minneapolis, MN (United States); Khazanovich, Lev [Univ. of Minnesota, Minneapolis, MN (United States)

    2015-03-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. A preliminary report detailed some of the challenges associated with thick reinforced concrete sections and prioritized conceptual designs of specimens that could be fabricated to represent NPP concrete structures for use in NDE evaluation comparisons. This led to the construction of the concrete specimen presented in this report, which has sufficient reinforcement density and cross-sectional size to represent an NPP containment wall. Details on how a suitably thick concrete specimen was constructed are presented, including the construction materials, final nominal design schematic, as well as formwork and rigging required to safely meet the desired dimensions of the concrete structure. The report also details the type and methods of forming the concrete specimen as well as information on how the rebar and simulated defects were embedded. Details on how the resulting specimen was transported, safely anchored, and marked to allow access for systematic comparative NDE testing of defects in a representative NPP containment wall concrete specimen are also given. Data collection using the MIRA Ultrasonic NDE equipment and

  5. Psychological underpinnings of intrafamilial computer-mediated communication: a preliminary exploration of CMC uptake with parents and siblings.

    Science.gov (United States)

    Goby, Valerie Priscilla

    2011-06-01

    This preliminary study investigates the uptake of computer-mediated communication (CMC) with parents and siblings, an area on which no research appears to have been conducted. Given the lack of relevant literature, grounded theory methodology was used and online focus group discussions were conducted in an attempt to generate suitable hypotheses for further empirical studies. Codification of the discussion data revealed various categories of meaning, namely: a perceived inappropriateness of CMC with members of family of origin; issues relating to the family generational gap; the nature of the offline sibling/parent relationship; the non-viability of online affordances such as planned self-disclosure, deception, identity construction; and disinhibition in interactions with family-of-origin members. These themes could be molded into hypotheses to assess the psychosocial limitations of CMC and to determine if it can indeed become a ubiquitous alternative to traditional communication modes as some scholars have claimed.

  6. Preliminary Computational Study for Future Tests in the NASA Ames 9 foot x 7 foot Wind Tunnel

    Science.gov (United States)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

    The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  7. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  8. Investigating the role of combined acoustic-visual feedback in one-dimensional synchronous brain computer interfaces, a preliminary study

    Directory of Open Access Journals (Sweden)

    Gargiulo GD

    2012-09-01

    Full Text Available Gaetano D Gargiulo,1–3 Armin Mohamed,1 Alistair L McEwan,1 Paolo Bifulco,2 Mario Cesarelli,2 Craig T Jin,1 Mariano Ruffo,2 Jonathan Tapson,3 André van Schaik3; 1School of Electrical and Information Engineering, The University of Sydney, New South Wales, Australia; 2Dipartimento di Ingegneria Elettronica e delle Telecomunicazioni, "Federico II" University of Naples, Naples, Italy; 3BENS Laboratory, MARCS Institute, The University of Western Sydney, New South Wales, Australia. Abstract: Feedback plays an important role when learning to use a brain computer interface (BCI), particularly in the case of synchronous feedback that relies on interaction with the subject. In this preliminary study, we investigate the role of combined auditory-visual feedback during synchronous µ rhythm-based BCI sessions in helping the subject remain focused on the selected imaginary task. This new combined feedback, now integrated within the general purpose BCI2000 software, has been tested on eight untrained and three trained subjects during a monodimensional left-right control task. In order to reduce the setup burden and maximize subject comfort, an electroencephalographic device suitable for dry electrodes that required no skin preparation was used. Quality and degree of improvement were evaluated based on a personal self-assessment questionnaire from each subject and on quantitative data based on subject performance. Results for this preliminary study show that the combined feedback was well tolerated by the subjects and improved performance in 75% of the naïve subjects compared with visual feedback alone. Keywords: brain computer interface, dry electrodes, subject feedback

  9. Preliminary Analysis of Species Partitioning in the DWPF Melter

    Energy Technology Data Exchange (ETDEWEB)

    Choi, A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kesterson, M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Johnson, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-07-15

    The work described in this report is preliminary in nature, since its goal was to demonstrate the feasibility of estimating the off-gas entrainment rates from the Defense Waste Processing Facility (DWPF) melter based on a simple mass balance, using measured feed and glass pour stream compositions and time-averaged melter operating data over the duration of one canister-filling cycle. The only case considered in this study involved the SB6 pour stream sample taken while Canister #3472 was being filled over a 20-hour period on 12/20/2010, approximately three months after the bubblers were installed. The analytical results for that pour stream sample provided the necessary glass composition data for the mass balance calculations. To estimate the "matching" feed composition, which is not necessarily the same as that of the Melter Feed Tank (MFT) batch being fed at the time of pour stream sampling, a mixing model was developed involving the three preceding MFT batches as well as the one being fed at that time, based on the assumption of perfect mixing in the glass pool but with an induction period to account for the process delays involved in the calcination/fusion step in the cold cap and the melter turnover.
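
    A minimal sketch of a perfectly-mixed feed blending model of the kind described above: the glass composition is taken as a weighted blend of recent feed batches, with an assumed exponential turnover of the glass pool. The batch compositions, masses, and turnover constant are illustrative placeholders, not DWPF data or the report's actual model.

        import numpy as np

        def blended_composition(batch_comps, batch_masses, turnover=0.5):
            # batch_comps: (n_batches, n_species) feed compositions, oldest batch first
            # turnover: assumed fraction of the pool replaced per batch interval
            n = len(batch_masses)
            # Older batches are progressively diluted out of a perfectly mixed pool
            weights = np.array([(1.0 - turnover) ** (n - 1 - i) for i in range(n)])
            weights = weights * np.asarray(batch_masses)
            weights /= weights.sum()
            return weights @ np.asarray(batch_comps)

        comps = [[0.50, 0.10], [0.48, 0.12], [0.52, 0.09], [0.49, 0.11]]  # e.g. two oxides
        print(blended_composition(comps, batch_masses=[1.0, 1.0, 1.2, 0.9]))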

  10. Preliminary Coupling of MATRA Code for Multi-physics Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seongjin; Choi, Jinyoung; Yang, Yongsik; Kwon, Hyouk; Hwang, Daehyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    The boundary conditions, such as the inlet temperature, mass flux, averaged heat flux, power distributions of the rods, and core geometry, are given as constant values or functions of time. These conditions are separately calculated and provided to the MATRA code by other codes, such as neutronics or system codes. In addition, this work focuses on and implements the coupling of several codes from different physics fields. In this study, multiphysics coupling methods were developed for a subchannel code (MATRA) with neutronics codes (MASTER, DeCART) and a fuel performance code (FRAPCON-3). Preliminary evaluation results for representative sample cases are presented. The MASTER and DeCART codes provide the power distribution of the rods in the core to the MATRA code. In the case of the FRAPCON-3 code, the variation of the rod diameter induced by thermal expansion is calculated and provided. The MATRA code transfers back the thermal-hydraulic conditions that each code needs. The coupling method with each code is described.
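
    A minimal sketch of the kind of operator-split coupling described above: a fixed-point (Picard) loop exchanging fields between a stand-in "neutronics" solver (power with negative coolant-temperature feedback) and a stand-in "thermal-hydraulics" solver. Both models are illustrative placeholders, not MATRA/MASTER/DeCART physics.

        def neutronics(t_coolant):
            # Rod power falls with coolant temperature (negative feedback)
            return [100.0 - 0.05 * (t - 300.0) for t in t_coolant]

        def thermal_hydraulics(power):
            # Coolant temperature rises in proportion to local rod power
            return [300.0 + 0.8 * q for q in power]

        t_cool = [300.0] * 3  # three illustrative rods
        for it in range(50):
            q = neutronics(t_cool)
            t_new = thermal_hydraulics(q)
            if max(abs(a - b) for a, b in zip(t_new, t_cool)) < 1e-6:
                break  # fixed point reached: fields are mutually consistent
            t_cool = t_new
        print(f"converged in {it} iterations: power={q}, T={t_cool}")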

  11. Laboratory Investigations on Estuary Salinity Mixing: Preliminary Analysis

    Directory of Open Access Journals (Sweden)

    F. H. Nuryazmeen

    2014-05-01

    Full Text Available Estuaries are bodies of water along the coasts that are formed when fresh water from rivers flows into and mixes with salt water from the ocean. Estuaries serve as a habitat for aquatic life, including mangroves. Human-induced activities, such as dredging of shipping lanes along the estuarine bottom, disposal of industrial wastes into the water system, and shoreline development, influence estuarine dynamics, including the mixing process. These activities might contribute to salinity changes and further adversely affect the estuarine ecosystem. In order to study the characteristics of the mixing between salt water (estuary) and fresh water (river), a preliminary investigation was carried out in the laboratory. Fresh water was released from one end of a flume and overflowed a weir at the other end, while salt water, represented by a red dye tracer, was released through the weir and intruded upstream as a gravity current. Isohalines were plotted to examine the salinity patterns, and graphs were plotted to examine the spatial and temporal salinity profiles measured during the experiments. The results show changes in salinity level along the flume due to mixing between fresh water and salt water, exhibiting typical salt-wedge estuary characteristics.

  12. A computational design system for rapid CFD analysis

    Science.gov (United States)

    Ascoli, E. P.; Barson, S. L.; Decroix, M. E.; Sindir, Munir M.

    1992-01-01

    A computational design system (CDS) is described in which the tools needed for CFD analysis are integrated in a modular fashion. This CDS ties together four key areas of computational analysis: description of geometry; grid generation; computational codes; and postprocessing. Integration of improved computational fluid dynamics (CFD) analysis tools through the CDS has made a significant positive impact on the use of CFD for engineering design problems. Complex geometries are now analyzed on a frequent basis and with greater ease.

  13. Computer-mediated communication and the Gallaudet University community: a preliminary report.

    Science.gov (United States)

    Hogg, Nanette M; Lomicky, Carol S; Weiner, Stephen F

    2008-01-01

    The study examined the use of computer-mediated communication (CMC) among individuals involved in a conflict sparked by the appointment of an administrator as president-designate of Gallaudet University in 2006. CMC was defined as forms of communication used for transmitting (sharing) information through networks with digital devices. There were 662 survey respondents. Respondents reported overwhelmingly (98%) that they used CMC to communicate. Students and alumni reported CMC use in larger proportions than any other group. The favorite devices among all respondents were Sidekicks, stationary computers, and laptops. Half of all respondents also reported using some form of video device. Nearly all reported using e-mail; respondents also identified Web surfing, text messaging, and blogging as popular CMC activities. The authors plan another article reporting on computer and electronic technology use as a mechanism connecting collective identity to social movements.

  14. Schlieren sequence analysis using computer vision

    Science.gov (United States)

    Smith, Nathanial Timothy

    Computer vision-based methods are proposed for extraction and measurement of flow structures of interest in schlieren video. As schlieren data has increased with faster frame rates, we are faced with thousands of images to analyze. This presents an opportunity to study global flow structures over time that may not be evident from surface measurements. A degree of automation is desirable to extract flow structures and features to give information on their behavior through the sequence. Using an interdisciplinary approach, the analysis of large schlieren data is recast as a computer vision problem. The double-cone schlieren sequence is used as a testbed for the methodology; it is unique in that it contains 5,000 images, complex phenomena, and is feature rich. Oblique structures such as shock waves and shear layers are common in schlieren images. A vision-based methodology is used to provide an estimate of oblique structure angles through the unsteady sequence. The methodology has been applied to a complex flowfield with multiple shocks. A converged detection success rate between 94% and 97% for these structures is obtained. The modified curvature scale space is used to define features at salient points on shock contours. A challenge in developing methods for feature extraction in schlieren images is the reconciliation of existing techniques with features of interest to an aerodynamicist. Domain-specific knowledge of physics must therefore be incorporated into the definition and detection phases. Known location and physically possible structure representations form a knowledge base that provides a unique feature definition and extraction. Model tip location and the motion of a shock intersection across several thousand frames are identified, localized, and tracked. Images are parsed into physically meaningful labels using segmentation. Using this representation, it is shown that in the double-cone flowfield, the dominant unsteady motion is associated with large scale
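
    One generic way to estimate oblique-structure angles of the kind discussed above is edge detection followed by a probabilistic Hough transform. The sketch below (using OpenCV) is an assumed, simplified stand-in for the dissertation's methodology, with illustrative parameter values.

        import numpy as np
        import cv2  # OpenCV

        def oblique_angles(frame_gray):
            # Detect edges, then fit line segments to candidate oblique structures
            edges = cv2.Canny(frame_gray, 50, 150)
            lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                    threshold=80, minLineLength=60, maxLineGap=5)
            if lines is None:
                return []
            return [np.degrees(np.arctan2(y2 - y1, x2 - x1))
                    for x1, y1, x2, y2 in lines[:, 0]]

    Applied frame by frame, the returned angles can then be tracked through the sequence to follow shock motion over time.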

  15. SIFT - A preliminary evaluation. [Software Implemented Fault Tolerant computer for aircraft control

    Science.gov (United States)

    Palumbo, D. L.; Butler, R. W.

    1983-01-01

    This paper presents the results of a performance evaluation of the SIFT computer system conducted in the NASA AIRLAB facility. The essential system functions are described and compared to both earlier design proposals and subsequent design improvements. The functions supporting fault tolerance are found to consume significant computing resources. With SIFT's specimen task load, scheduled at a 30-Hz rate, the executive tasks such as reconfiguration, clock synchronization and interactive consistency, require 55 percent of the available task slots. Other system overhead (e.g., voting and scheduling) use an average of 50 percent of each remaining task slot.

  16. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  17. Preliminary Analysis of a Novel SAR Based Emergency System for Earth Orbit Satellites using Galileo

    NARCIS (Netherlands)

    Gill, E.K.A.; Helderweirt, A.

    2010-01-01

    This paper presents a preliminary analysis of a novel Search and Rescue (SAR) based emergency system for Low Earth Orbit (LEO) satellites using the Galileo Global Navigation Satellite System (GNSS). It starts with a description of the space user SAR system including a concept description, mission ar

  18. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  19. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  20. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems, mostly drawn from real situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques, as well as some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  1. Integrating Computer Algebra Systems in Post-Secondary Mathematics Education: Preliminary Results of a Literature Review

    Science.gov (United States)

    Buteau, Chantal; Marshall, Neil; Jarvis, Daniel; Lavicza, Zsolt

    2010-01-01

    We present results of a literature review pilot study (326 papers) regarding the use of Computer Algebra Systems (CAS) in tertiary mathematics education. Several themes that have emerged from the review are discussed: diverse uses of CAS, benefits to student learning, issues of integration and mathematics learning, common and innovative usage of…

  2. A Solar Powered Wireless Computer Mouse: Design, Assembly and Preliminary Testing of 15 Prototypes

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.; Reich, N.H.; Alsema, E.A.; Netten, M.P.; Veefkind, M.; Silvester, S.; Elzen, B.; Verwaal, M.

    2007-01-01

    The concept and design of a solar powered wireless computer mouse has been completed, and 15 prototypes have been successfully assembled. After necessary cutting, the crystalline silicon cells show satisfactory efficiency: up to 14% when implemented into the mouse device. The implemented voltage

  3. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mathematics; Anderson, D.R. [Sandia National Labs., Albuquerque, NM (United States). WIPP Performance Assessments Departments; Baker, B.L. [Technadyne Engineering Consultants, Albuquerque, NM (United States)] [and others

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
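
    A minimal sketch of the "compliance probability as numerical integration" idea above: Monte Carlo sampling of uncertain parameters through a stand-in performance model, counting the fraction of outcomes that meet a limit. The distributions, model, and limit are all illustrative placeholders, not WIPP quantities.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        permeability = rng.lognormal(mean=-30.0, sigma=1.0, size=N)  # m^2
        solubility = rng.uniform(1e-9, 1e-6, size=N)                 # mol/L

        # Placeholder performance measure: normalized cumulative release
        release = 1e12 * permeability * solubility

        compliance_prob = np.mean(release < 1.0)  # fraction below the limit
        print(f"P(compliance) ~ {compliance_prob:.3f}")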

  5. Benefits of texture analysis of dual energy CT for Computer-Aided pulmonary embolism detection.

    Science.gov (United States)

    Foncubierta-Rodríguez, Antonio; Jiménez del Toro, Óscar Alfonso; Platon, Alexandra; Poletti, Pierre-Alexandre; Müller, Henning; Depeursinge, Adrien

    2013-01-01

    Pulmonary embolism is an avoidable cause of death if treated immediately, but delays in diagnosis and treatment lead to increased risk. Computer-assisted image analysis of both unenhanced and contrast-enhanced computed tomography (CT) has proven useful for the diagnosis of pulmonary embolism. Dual energy CT provides additional information over the standard single energy scan by generating four-dimensional (4D) data, in our case with 11 energy levels in 3D. In this paper, a 4D texture analysis method capable of detecting pulmonary embolism in dual energy CT is presented. The method uses wavelet-based visual words together with an automatic geodesic-based region of interest detection algorithm to characterize the texture properties of each lung lobe. Results show an increase in performance with respect to single energy CT analysis, as well as an accuracy gain compared to preliminary work on a small dataset.
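
    A much-simplified stand-in for the wavelet texture features named above: single-level Haar wavelet band energies of a 2D region of interest. Extending such features over slices and the 11 energy levels would give the 4D analogue; the decomposition below is an assumption for illustration, not the paper's visual-words pipeline.

        import numpy as np

        def haar_energies(roi):
            # One-level 2D Haar decomposition over 2x2 blocks of an even-sized ROI
            a = roi[0::2, 0::2]; b = roi[0::2, 1::2]
            c = roi[1::2, 0::2]; d = roi[1::2, 1::2]
            ll = (a + b + c + d) / 4.0  # approximation band
            lh = (a - b + c - d) / 4.0  # horizontal detail
            hl = (a + b - c - d) / 4.0  # vertical detail
            hh = (a - b - c + d) / 4.0  # diagonal detail
            return [float(np.mean(band ** 2)) for band in (ll, lh, hl, hh)]

        roi = np.random.default_rng(0).normal(size=(64, 64))
        print(haar_energies(roi))  # four band energies usable as a texture descriptor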

  6. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  7. Preliminary Analysis of the Oklahoma Wavefields Demonstration Dataset

    Science.gov (United States)

    Anderson, K. R.; Sweet, J. R.; Woodward, R.; Karplus, M. S.; DeShon, H. R.; Magnani, M. B.; Hayward, C.; Langston, C. A.

    2016-12-01

    In June 2016, a field crew of 50 students, faculty, industry personnel and IRIS staff deployed a total of 390 stations as part of a community seismic experiment above an active seismic lineament in north-central Oklahoma. The goals of the experiment were to test new instrumentation and deployment strategies that record the full wavefield, and to advance understanding of earthquake source processes and regional lithospheric structure. The crew deployed 363 3C 4.5Hz Generation 2 Fairfield Z-Land nodes along three seismic lines and in a seven-layer nested gradiometer array. The seismic lines spanned a region 13 km long by 5 km wide. The nested gradiometer was designed to measure the full seismic wavefield using standard frequency-wavenumber techniques and spatial wave gradients. A broadband, 18 station "Golay 3x6" array was deployed around the gradiometer and seismic lines with an aperture of approximately 5 km to collect waveform data from local and regional events. In addition, 9 infrasound stations were deployed in order to capture and identify acoustic events that might be recorded by the seismic arrays and to quantify the wind acoustic noise effect on co-located broadband stations. The variety of instrumentation used in this deployment was chosen to capture the full seismic wavefield generated by the local and regional seismicity beneath the array and the surrounding region. We present preliminary results from the data collected during the experiment. We analyze the level of signal coherence observed across the nested gradiometer and Golay array as well as array design fidelity. We report on data quality, including completeness and noise levels, for the various types of instrumentation. We also examine the performance of co-located surface and buried nodes to determine the benefits of each installation type. Finally, we present performance comparisons between co-located nodes and broadband stations and compare these results to prior wavefield/large-N deployments

  8. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    Science.gov (United States)

    Graham, J.

    1993-03-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.
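
    As one example of the closed-form checks such reports collect, the classical elastic critical axial buckling stress of an unstiffened thin cylinder is sigma_cr = E*t / (R*sqrt(3*(1 - nu^2))). The sketch below evaluates it with illustrative inputs; this textbook formula is offered as an assumed example, not necessarily one of the report's specific methods, and real designs apply empirical knockdown factors.

        import math

        def axial_buckling_stress(E, t, R, nu=0.3):
            # Classical elastic critical axial stress for an unstiffened thin cylinder
            return E * t / (R * math.sqrt(3.0 * (1.0 - nu ** 2)))

        sigma_cr = axial_buckling_stress(E=71e9, t=0.002, R=1.5)  # aluminum, illustrative
        print(f"sigma_cr ~ {sigma_cr / 1e6:.0f} MPa (before knockdown factors)")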

  9. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    Full Text Available The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method was introduced to pharmacokinetics with the aim of contributing to the knowledge base by enabling researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of successful use of the modeling method in pharmacokinetics can be found in full-text articles available free of charge at the website of the author, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, under the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.

  10. Computer assistance in clinical functional analysis.

    Science.gov (United States)

    Ahlers, M O; Jakstat, H A

    2002-10-01

    The use of computers in dental practice has been primarily restricted to the acquisition of billing data. Additional possibilities for the use of PCs exist in diagnostic data acquisition and evaluation; clinical functional analysis seems a particularly suitable application. Such software is now available: CMDfact. Dentally, it is based on a previously developed and published examination and documentation system, whose graphic user interface is used in the newly developed software. After the examination data have been acquired by mouse click or numerical entry, they are available for evaluation. A special function, the "Diagnosis pilot", is integrated to support the user. It helps in the assignment of the appropriate initial diagnoses, since it brings together the individually present principal symptoms and the diagnoses suitable for the initial diagnosis in question, and also states which diagnoses "would be appropriate" but are not available. With 3D animation, the software also helps the dentist explain aspects of CMD to patients. The software further assists the dentist with a detailed multimedia help system, which provides context-sensitive help for every examination step. These help functions explain the purpose of each examination, its performance, and its evaluation in the form of short texts, explanatory photographs, and videos.

  11. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    Energy Technology Data Exchange (ETDEWEB)

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O' Connell, R.A.; Luck, D.; Camli, U.; King, L.N. (St. Vincent' s Medical Center, New York, NY (USA))

    1991-08-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy.

  12. Cone-Beam Computed Tomography Evaluation of Mental Foramen Variations: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Mahnaz Sheikhi

    2015-01-01

    Full Text Available Background. The mental foramen is important in surgical operations on premolars because it transmits the mental nerves and vessels. This study evaluated variations of the mental foramen by cone-beam computed tomography in a selected Iranian population. Materials and Methods. A total of 180 cone-beam computed tomography projections were analyzed in terms of shape, size, direction, and horizontal and vertical position of the mental foramen on the right and left sides. Results. The most common shape was oval, the most common opening direction was posterior-superior, the most common horizontal position was in line with the second premolar, and the most common vertical position was apical to the adjacent dental root. The mean foramen diameter was 3.59 mm. Conclusion. In addition to the most common types of mental foramen, other variations exist, too. This reflects the significance of preoperative radiographic examinations, especially 3-dimensional images, to prevent nerve damage.

  13. Preliminary assessment of Tongue Drive System in medium term usage for computer access and wheelchair control.

    Science.gov (United States)

    Yousefi, Behnaz; Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    The Tongue Drive System (TDS) is a wireless, wearable assistive technology that enables individuals with severe motor impairments to access computers, drive wheelchairs, and control their environments using tongue motion. In this paper, we have evaluated TDS performance as a computer input device using ISO 9241-9 standard tasks for pointing and selecting, based on the well-known Fitts' law, and as a powered wheelchair controller through an obstacle course navigation task. Nine able-bodied subjects who already had tongue piercings participated in this trial over 5 sessions during 5 weeks, allowing us to study the TDS learning process and its current limiting factors. Subjects wore tongue rings made of titanium in the form of a barbell, with a small rare earth magnetic tracer hermetically sealed inside the upper ball. Comparing the results between the 1st and 5th sessions showed that subjects' performance improved in all measures through the 5 sessions, demonstrating the effects of learning.
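
    The ISO 9241-9 pointing tasks named above are conventionally scored with Fitts' law throughput. A minimal sketch using the Shannon formulation follows; the numbers are illustrative, not measured TDS data.

        import math

        def fitts_throughput(distance, width, movement_time):
            # Index of difficulty ID = log2(D/W + 1) bits; throughput TP = ID / MT (bit/s)
            index_of_difficulty = math.log2(distance / width + 1.0)
            return index_of_difficulty / movement_time

        print(f"TP = {fitts_throughput(distance=400, width=40, movement_time=1.8):.2f} bit/s")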

  14. The Influencing Factors of Computer Adoption in Agribusiness: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Sudaryanto

    2011-08-01

    Full Text Available This research aims to investigate the factors that influence the intention to adopt computers for business purposes, and the implications for managerial development. Semi-structured interviews and a courier-mailed survey were employed to collect the data. A conceptual framework and theoretical insights are presented based on a literature review and primary data collected from various East Java agribusinesses. Cross tabulation was employed to develop qualitative information on sample characteristics, and logistic regression was used to test the research hypotheses. The findings show that the intention to adopt computers in East Java agribusiness is strongly influenced by manager age (41+), education (TAFE/D3), and sales volume. This research has a direct implication for agribusiness development across East Java and other provinces in Indonesia, and it is expected to encourage other researchers to conduct similar research benchmarking against other developing countries. The complexity and wide range of the term agribusiness made the research methodology complicated.
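
    A minimal sketch of the hypothesis-testing setup described above: a logistic regression of a binary adoption outcome on manager and firm attributes, here with synthetic stand-in data (using scikit-learn, an assumed dependency; the predictors and effect sizes are illustrative, not the survey's).

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 200
        age_41plus = rng.integers(0, 2, n)    # manager aged 41+
        diploma = rng.integers(0, 2, n)       # TAFE/D3 education
        log_sales = rng.normal(10.0, 1.0, n)  # log of sales volume

        # Synthetic adoption outcome with assumed positive effects
        logit = -8.0 + 0.9 * age_41plus + 0.7 * diploma + 0.6 * log_sales
        adopt = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([age_41plus, diploma, log_sales])
        model = LogisticRegression().fit(X, adopt)
        print(model.coef_, model.intercept_)  # estimated effects on adoption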

  15. Preliminary Design and Analysis of ITER In-Wall Shielding

    Institute of Scientific and Technical Information of China (English)

    LIU Changle; YU Jie; WU Songtao; CAI Yingxiang; PAN Wanjiang

    2007-01-01

    ITER in-wall shielding (IIS) is situated between the double shells of the ITER Vacuum Vessel (IVV). Its main functions are neutron and gamma-ray shielding and toroidal field ripple reduction. The structure of the IIS has been modelled according to the IVV design criteria, which have been updated by the ITER team (IT). Static analysis and thermal expansion analysis were performed for the structure, and thermal-hydraulic analysis verified the heat removal capability and the resulting temperature, pressure, and velocity changes in the coolant flow. Consequently, this design work may serve as a reference for IT's updated or final design in its next step.

  16. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).
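
    The LCOE metric named above is discounted lifetime cost divided by discounted lifetime generation. A minimal sketch follows, with illustrative placeholder inputs (not Program values):

        def lcoe(capex, opex_per_year, energy_per_year, years, discount_rate):
            # Present-value cost per present-value kWh over the plant life
            disc = [(1.0 + discount_rate) ** -t for t in range(1, years + 1)]
            costs = capex + sum(opex_per_year * d for d in disc)
            energy = sum(energy_per_year * d for d in disc)
            return costs / energy  # $ per kWh

        print(f"LCOE ~ ${lcoe(4e8, 1.2e7, 4.2e8, 30, 0.07):.3f}/kWh")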

  17. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  18. The Combination of Lecture-Based Education and Computer-Assisted Learning (CAL) in the Preliminary Hospital Pharmacy Internship Course

    Directory of Open Access Journals (Sweden)

    Mohammad Charkhpour

    2014-12-01

    Full Text Available Introduction: Developments in the field of information technology has profoundly affected our educational system. The efficacy of Computer-Assisted Learning (CAL has already been evaluated in medical education, but in this study, we examined the efficacy of CAL in combination with Lecture-Based Education.Methods: This quasi-experimental before and after study included 33 senior-year pharmacy students who had passed the preliminary hospital pharmacy internship course. Pre-test questionnaires were given to the students in order to examine their knowledge and attitudes. Then, three chemotherapy prescriptions were given to them. Pharmacology recourses also were available virtually. At the end, students were asked to answer post-test questionnaires with questions based upon knowledge and attitude.Results: The mean score of their knowledge was 3.48±2.04 of 20 before intervention and 17.82±2.31 of 20 after intervention. There was a statistically significant difference between the pre-test and post-testing scores (p<0.001. The mean attitude score of students before intervention was 42.48±15.59 (medium and their score after intervention was 75.97±21.03 (high. There was a statistically significant difference between pre-test and post-test results (p<0.000.Conclusion: The combination of Lecture-Based Education and Computer-Assisted Learning improved senior pharmacy students’ knowledge and attitude in hospital pharmacy internship course.

  19. Performance analysis tool (PATO): Development and preliminary validation

    National Research Council Canada - National Science Library

    Fernando Martins; Filipe Clemente; Frutuoso Silva

    2017-01-01

    .... The Performance Analysis Tool (PATO) software was built with the aim of quickly codifying relationships between players and building the adjacency matrices that can be used to test the network measures...
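
    A minimal sketch of the adjacency-matrix idea above: codify pass events between players into a weighted directed matrix, then derive a simple network measure. The events and the out-degree centrality shown are illustrative, not PATO's implementation.

        import numpy as np

        passes = [(0, 3), (3, 5), (5, 3), (3, 0), (0, 5), (5, 8), (8, 3)]  # (sender, receiver)
        n_players = 11

        A = np.zeros((n_players, n_players))
        for sender, receiver in passes:
            A[sender, receiver] += 1.0  # weighted directed adjacency matrix

        out_degree = A.sum(axis=1)  # passes made by each player
        print(out_degree / max(out_degree.sum(), 1.0))  # normalized centrality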

  20. Integrated transcriptome and methylome analysis in youth at high risk for bipolar disorder: a preliminary analysis.

    Science.gov (United States)

    Fries, G R; Quevedo, J; Zeni, C P; Kazimi, I F; Zunta-Soares, G; Spiker, D E; Bowden, C L; Walss-Bass, C; Soares, J C

    2017-03-14

    First-degree relatives of patients with bipolar disorder (BD), particularly their offspring, have a higher risk of developing BD and other mental illnesses than the general population. However, the biological mechanisms underlying this increased risk are still unknown, particularly because most of the studies so far have been conducted in chronically ill adults and not in unaffected youth at high risk. In this preliminary study we analyzed genome-wide expression and methylation levels in peripheral blood mononuclear cells from children and adolescents from three matched groups: BD patients, unaffected offspring of bipolar parents (high risk) and controls (low risk). By integrating gene expression and DNA methylation and comparing the lists of differentially expressed genes and differentially methylated probes between groups, we were able to identify 43 risk genes that discriminate patients and high-risk youth from controls. Pathway analysis showed an enrichment of the glucocorticoid receptor (GR) pathway with the genes MED1, HSPA1L, GTF2A1 and TAF15, which might underlie the previously reported role of stress response in the risk for BD in vulnerable populations. Cell-based assays indicate a GR hyporesponsiveness in cells from adult BD patients compared to controls and suggest that these GR-related genes can be modulated by DNA methylation, which poses the theoretical possibility of manipulating their expression as a means to counteract the familial risk presented by those subjects. Although preliminary, our results suggest the utility of peripheral measures in the identification of biomarkers of risk in high-risk populations and further emphasize the potential role of stress and DNA methylation in the risk for BD in youth.

  1. X-ray phase computed tomography for nanoparticulated imaging probes and therapeutics: preliminary feasibility study

    Science.gov (United States)

    Tang, Xiangyang; Yang, Yi; Tang, Shaojie

    2011-03-01

    With scientific progress in cancer biology, pharmacology and biomedical engineering, nano-biotechnology-based imaging probes and therapeutic agents (namely probes/agents) - a form of theranostics - are among the strategic solutions bearing hope for the cure of cancer. The key feature distinguishing nanoparticulated probes/agents from their conventional counterparts is their targeting capability. A large surface-to-volume ratio in nanoparticulated probes/agents enables the accommodation of multiple targeting, imaging and therapeutic components to cope with intra- and inter-tumor heterogeneity. Most nanoparticulated probes/agents are synthesized with low-atomic-number materials, and thus their x-ray attenuation is very similar to that of biological tissues. However, their microscopic structures are very different, which may result in significant differences in their refractive properties. Recently, investigation of x-ray grating-based differential phase contrast (DPC) CT has demonstrated its advantages over conventional attenuation-based CT in differentiating low-atomic-number materials. We believe that a synergy of x-ray grating-based DPC CT and nanoparticulated imaging probes and therapeutic agents may play a significant role in extensive preclinical and clinical applications, or even become a modality for molecular imaging. Hence, we propose to image the refractive properties of nanoparticulated imaging probes and therapeutic agents using x-ray grating-based DPC CT. In this work, we conduct a preliminary feasibility study focused on characterizing the contrast-to-noise ratio (CNR) and contrast-detail behavior of x-ray grating-based DPC CT. The obtained data may be instructive to the architecture design and performance optimization of x-ray grating-based DPC CT for imaging biomarker-targeted imaging probes and therapeutic agents, and even informative to the translation of preclinical research in theranostics into clinical applications.
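
    As a pointer to how the CNR figure of merit is conventionally computed (the study's own pipeline is not described here), a minimal sketch over two synthetic regions of interest:

```python
# CNR = |mean(object ROI) - mean(background ROI)| / std(background ROI).
# The ROI arrays below are synthetic stand-ins, not DPC CT data.
import numpy as np

def cnr(roi_object, roi_background):
    """Contrast-to-noise ratio between two regions of interest."""
    return abs(roi_object.mean() - roi_background.mean()) / roi_background.std()

rng = np.random.default_rng(0)
obj = rng.normal(1.05, 0.02, size=1000)   # refractive-contrast region
bkg = rng.normal(1.00, 0.02, size=1000)   # surrounding tissue-like region
print(f"CNR = {cnr(obj, bkg):.2f}")
```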

  2. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  3. Dynamic Stall Analysis Utilizing Interactive Computer Graphics

    Science.gov (United States)

    1988-03-01

    Blade-Vortex Interaction (BVI) studies. Solves the two-dimensional, unsteady, compressible Euler and Navier-Stokes equations in strong conservation...requirements, interactive computer graphics workstations have evolved to complement the supercomputer. Workstation capabilities, in terms of

  4. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous, as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. Like only a few other researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  5. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in recent decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact of uninterrupted work in front of computer screens or displays on visual health have attracted researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great effort, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of these are blurred vision, visual fatigue and Dry Eye Syndrome (DES), due to inadequate lubrication of the ocular surface as blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the differences in temperature variation of healthy ocular surfaces.

  6. Bioelectrical impedance analysis for bovine milk: Preliminary results

    Science.gov (United States)

    Bertemes-Filho, P.; Valicheski, R.; Pereira, R. M.; Paterno, A. S.

    2010-04-01

    This work reports the investigation and analysis of bovine milk quality using bioimpedance measurements by electrical impedance spectroscopy (EIS). The samples were first characterized by chemical analysis using Fourier transform mid-infrared spectroscopy (FTIR) and flow cytometry. A set of milk samples (100 ml each) obtained from 17 different cows in lactation, with and without mastitis, was analyzed with the proposed EIS technique. The samples were adulterated by adding distilled water and hydrogen peroxide in a controlled manner. FTIR spectroscopy and flow cytometry were performed, and impedance measurements were made in a frequency range from 500 Hz up to 1 MHz with the implemented EIS system. The system's phase shift was compensated by measuring saline solutions. The results show that the Bioelectrical Impedance Analysis (BIA) technique may detect changes in the milk caused by mastitis, as well as the presence of water and hydrogen peroxide in the bovine milk.

  7. In-tank fluid sloshing effects during earthquakes: A preliminary computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, J.E.; Rezvani, M.A.

    1995-04-01

    Hundreds of underground radioactive waste storage tanks are located at Department of Energy (DOE) sites. At present, no technique for evaluating the pressure loads due to the impact of earthquake generated waves on the side walls and dome of the tanks is known if the wave breaks back on itself. This paper presents the results of two-dimensional Computational Fluid Dynamics (CFD) calculations of the motion of waves in a generic rectangular tank as the result of accelerations recorded during an earthquake. The advantages and limitations of this technique and methods for avoiding the limitations will be discussed.
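
    The paper solves the full two-dimensional CFD problem; as a far simpler illustration of the underlying physics, the sketch below treats the first sloshing mode of a rectangular tank as a damped oscillator driven by a ground-acceleration record, using the standard linear dispersion relation for the mode frequency. Tank dimensions, damping, and the accelerogram are invented assumptions, not values from the study.

```python
# One-mode linear sloshing analogy (not the paper's CFD model): the first
# mode of a rectangular tank (length L, fill depth h) obeys
#   x'' + 2*zeta*omega*x' + omega^2*x = -a_g(t),
# with omega^2 = g*(pi/L)*tanh(pi*h/L) from linear wave theory.
import numpy as np

L, h, g, zeta = 10.0, 5.0, 9.81, 0.005                     # assumed tank/damping
omega = np.sqrt(g * (np.pi / L) * np.tanh(np.pi * h / L))  # 1st-mode frequency

dt = 0.005
t = np.arange(0, 30, dt)
a_g = 0.3 * g * np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.1 * t)  # toy record

x, v, peak = 0.0, 0.0, 0.0
for a in a_g:                                   # semi-implicit Euler steps
    v += dt * (-2 * zeta * omega * v - omega**2 * x - a)
    x += dt * v
    peak = max(peak, abs(x))
print(f"peak modal amplitude: {peak:.3f} m")
```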

  8. Applications of computer assisted surgery and medical robotics at the ISSSTE, México: preliminary results.

    Science.gov (United States)

    Mosso, José Luis; Pohl, Mauricio; Jimenez, Juan Ramon; Valdes, Raquel; Yañez, Oscar; Medina, Veronica; Arambula, Fernando; Padilla, Miguel Angel; Marquez, Jorge; Gastelum, Alfonso; Mosso, Alejo; Frausto, Juan

    2007-01-01

    We present the first results of four projects in the second phase of the Mexican project on Computer Assisted Surgery and Medical Robotics, supported by the Mexican Science and Technology National Council (Consejo Nacional de Ciencia y Tecnología) under grant SALUD-2002-C01-8181. The projects are being developed by three universities (UNAM, UAM, ITESM), and the goal is to set up a laboratory in a hospital of the ISSSTE to serve endoscopic surgeons, urologists, gastrointestinal endoscopists and neurosurgeons.

  9. Fault-Tolerant Postselected Quantum Computation: Threshold Analysis

    CERN Document Server

    Knill, E

    2004-01-01

    The schemes for fault-tolerant postselected quantum computation given in [Knill, Fault-Tolerant Postselected Quantum Computation: Schemes, http://arxiv.org/abs/quant-ph/0402171] are analyzed to determine their error-tolerance. The analysis is based on computer-assisted heuristics. It indicates that if classical and quantum communication delays are negligible, then scalable qubit-based quantum computation is possible with errors above 1% per elementary quantum gate.

  10. A Preliminary Analysis of a Behavioral Classrooms Needs Assessment

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McCray, Cynthia; Lamkins, Carol; Taubman, Mitchell; McEachin, John; Cihon, Joseph H.

    2016-01-01

    Today many special education classrooms implement procedures based upon the principles of Applied Behavior Analysis (ABA) to establish educationally relevant skills and decrease aberrant behaviors. However, it is difficult for school staff and consultants to evaluate the implementation of various components of ABA and general classroom set up. In…

  11. Group training with healthy computing practices to prevent repetitive strain injury (RSI): a preliminary study.

    Science.gov (United States)

    Peper, Erik; Gibney, Katherine H; Wilson, Vietta E

    2004-12-01

    This pilot study investigated whether group training, in which participants become role models and coaches, would reduce discomfort as compared to a non-treatment Control Group. Sixteen experimental participants attended six weekly 2-hr group sessions of a Healthy Computing program, whereas 12 control participants received no training. None of the participants had reported symptoms to their supervisors, nor were they receiving medical treatment for repetitive strain injury prior to the program. The program included training in ergonomic principles, psychophysiological awareness and control, sEMG practice at the workstation, and coaching coworkers. Using two-tailed t tests to analyze the data, the Experimental Group reported (1) a significant overall reduction in most body symptoms as compared to the Control Group and (2) a significant increase in positive work-style habits, such as taking breaks at the computer, as compared to the Control Group. This study suggests that employees could improve health and work-style patterns through a holistic training program delivered in a group format followed by individual practice.

  12. Preparing computers for affective communication: a psychophysiological concept and preliminary results.

    Science.gov (United States)

    Whang, Min Cheol; Lim, Joa Sang; Boucsein, Wolfram

    Despite rapid advances in technology, computers remain incapable of responding to human emotions. An exploratory study was conducted to find out what physiological parameters might be useful to differentiate among 4 emotional states, based on 2 dimensions: pleasantness versus unpleasantness and arousal versus relaxation. The 4 emotions were induced by exposing 26 undergraduate students to different combinations of olfactory and auditory stimuli, selected in a pretest from 12 stimuli by subjective ratings of arousal and valence. Changes in electroencephalographic (EEG), heart rate variability, and electrodermal measures were used to differentiate the 4 emotions. EEG activity separates pleasantness from unpleasantness only in the aroused but not in the relaxed domain, where electrodermal parameters are the differentiating ones. All three classes of parameters contribute to a separation between arousal and relaxation in the positive valence domain, whereas the latency of the electrodermal response is the only differentiating parameter in the negative domain. We discuss how such a psychophysiological approach may be incorporated into a systemic model of a computer responsive to affective communication from the user.

  13. Preliminary survivability analysis of manned spacecraft following orbital debris penetration

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong; HAN ZengYao; LI Ming; ZHENG ShiGui

    2009-01-01

    Meteoroid and orbital debris (M/OD) may cause severe damage or even catastrophic failures for long-term manned spacecraft in orbit due to hypervelocity impact (HVI) destruction. It is essential to quantitatively assess the M/OD risk of manned spacecraft. In this paper, the catastrophic failure assessment function is successfully integrated into the Meteoroid & Orbital Debris Assessment and Optimization System Tools (MODAOST), the M/OD risk assessment system developed by the China Academy of Space Technology. The survivability assessment for the US Lab by MODAOST was consistent with that of the Manned Spacecraft Crew Survivability computer code (MSCSurv). Meanwhile, the simulation process showed that this function was more effective than MSCSurv because it applies the standard M/OD risk assessment methodology instead of a Monte Carlo model. This function expands the ability of MODAOST to predict the survivability of typical catastrophic failure modes such as crew hypoxia and critical cracking.

  15. Preliminary shielding analysis for the CSNS target station monolith

    Institute of Scientific and Technical Information of China (English)

    张斌; 陈义学; 杨寿海; 吴军; 殷雯; 梁天骄; 贾学军

    2010-01-01

    The construction of the China Spallation Neutron Source (CSNS) has been initiated at Dongguan, Guangdong, China. In spallation neutron sources the target station monolith is contaminated by a large number of fast neutrons whose energies can be as large as those of the protons of the proton beam directed towards the tungsten target. A detailed radiation transport analysis of the target station monolith is important for the construction of the CSNS. The analysis is performed using the coupled Monte Carlo and multi-dimensional discrete ordinates method. Successful elimination of the primary ray effects via the two-dimensional uncollided flux and first collision source methodology is also illustrated. The dose at the edge of the monolith is calculated. The results demonstrate that the doses received by the hall staff members are below the required standard limit.

  16. Preliminary analysis of productivity of fruiting fungi on Strzeleckie meadows

    Directory of Open Access Journals (Sweden)

    Barbara Sadowska

    2014-11-01

    Full Text Available Analysis demonstrated that the fresh and dry weight as well as the ash content of fungal fruit bodies collected on a forest-surrounded unmown meadow (Stellario-Deschampsietum Freitag 1957 and Caricetum elatae W. Koch 1926) were lower than the same values for a plot of exploited mown meadow and higher than on an exploited unmown meadow (Arrhenatheretum medioeuropaeum (Br.-Bl.) Oberd. 1952).

  17. Computer analysis of slow vital capacity spirograms.

    Science.gov (United States)

    Primiano, F P; Bacevice, A E; Lough, M D; Doershuk, C F

    1982-01-01

    We have developed a digital computer program which evaluates the vital capacity and its subdivisions, expiratory reserve volume and inspiratory capacity. The algorithm examines the multibreath spirogram, a continuous record of quiet breathing interspersed among repeated slow, large volume maneuvers. Quiet breaths are recognized by comparing features of each breath to the respective average and variation of these features for all breaths. A self-scaling, iterative procedure is used to identify those end-tidal points that most likely represent the subject's functional residual capacity. A least-squared error baseline is then fit through these points to partition the vital capacity. Twenty-three spirograms from patients with documented pulmonary disease were independently analyzed by the computer, a pulmonary function technician, and the laboratory supervisor. No practical differences were found among the results. However, the computer's values, in contrast to those of the technician, were reproducible on repeated trials and free of computational and transcriptional errors.
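
    A hedged sketch of the baseline-fitting idea described above: pick end-tidal minima from a volume trace, retain those near the common end-tidal level, and fit a least-squared-error baseline through them. The signal and the retention rule are synthetic stand-ins under stated assumptions, not the authors' algorithm.

```python
import numpy as np

dt = 0.02
t = np.arange(0, 60, dt)
# Toy quiet-breathing volume trace with slow drift.
volume = 0.25 * np.sin(2 * np.pi * 0.25 * t) + 0.002 * t

# Candidate end-tidal points: local minima of the volume trace.
minima = [i for i in range(1, len(volume) - 1)
          if volume[i] < volume[i - 1] and volume[i] < volume[i + 1]]

# Keep minima close to the median end-tidal level (a crude self-scaling test).
levels = volume[minima]
keep = [i for i in minima if abs(volume[i] - np.median(levels)) < levels.std()]

# Least-squared-error baseline through the retained end-tidal points.
slope, intercept = np.polyfit(t[keep], volume[keep], 1)
print(f"FRC baseline: V(t) = {intercept:.3f} + {slope:.5f} * t")
```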

  18. Behavior computing modeling, analysis, mining and decision

    CERN Document Server

    2012-01-01

    Includes six case studies on behavior applications. Presents new techniques for capturing behavior characteristics in social media. The first dedicated source of references for the theory and applications of behavior informatics and behavior computing.

  19. Granular computing analysis and design of intelligent systems

    CERN Document Server

    Pedrycz, Witold

    2013-01-01

    Information granules, as encountered in natural language, are implicit in nature. To make them fully operational so they can be effectively used to analyze and design intelligent systems, information granules need to be made explicit. An emerging discipline, granular computing focuses on formalizing information granules and unifying them to create a coherent methodological and developmental environment for intelligent system design and analysis. Granular Computing: Analysis and Design of Intelligent Systems presents the unified principles of granular computing along with its comprehensive algo

  20. Computational fluid dynamics in three dimensional angiography: Preliminary hemodynamic results of various proximal geometry

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ha Youn; Park, Sung Tae; Bae, Won Kyoung; Goo, Dong Erk [Dept. of Radiology, Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2014-12-15

    We studied the influence of proximal geometry on the results of computational fluid dynamics (CFD). We built five models with different proximal geometries from the three-dimensional angiography of a 63-year-old woman with an intracranial aneurysm. CFD results were analyzed in terms of peak systolic velocity (PSV) at the inlet and outlet, as well as the flow velocity profile immediately proximal to the internal carotid artery (ICA) aneurysm. The modified cavernous model with proximal tubing showed a faster PSV at the outlet than at the inlet, whereas in the other models the outlet PSV was slower than the inlet PSV. The flow velocity profiles immediately proximal to the ICA aneurysm showed similar patterns in all models, suggesting that proximal vessel geometry could affect CFD results.

  1. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
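
    One standard way to estimate a fractional betatron tune (not necessarily the algorithm of this report, which works on Schottky spectra) is to locate the betatron line in the spectrum of a transverse signal; below is a sketch on a synthetic turn-by-turn-like signal with an assumed tune and noise level.

```python
import numpy as np

n_turns, true_tune = 4096, 0.31          # assumed values for illustration
turns = np.arange(n_turns)
rng = np.random.default_rng(1)
signal = np.cos(2 * np.pi * true_tune * turns) + 0.5 * rng.standard_normal(n_turns)

# Spectrum of the mean-subtracted signal; frequencies are in units of the
# revolution frequency, so the peak location is the fractional tune.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(n_turns)
print(f"estimated fractional tune: {freqs[np.argmax(spectrum)]:.4f}")
```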

  2. Computational Notes on the Numerical Analysis of Galactic Rotation Curves

    CERN Document Server

    Scelza, G

    2014-01-01

    In this paper we present a brief discussion of the salient points of the computational analysis that is at the basis of the paper [StSc]. The computational and data analyses were carried out with the Mathematica® software and presented at the Mathematica Italia User Group Meeting 2011.

  3. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta), a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van Keulen).

  5. Preliminary analysis of the mitochondrial genome evolutionary pattern in primates.

    Science.gov (United States)

    Zhao, Liang; Zhang, Xingtao; Tao, Xingkui; Wang, Weiwei; Li, Ming

    2012-08-01

    Since the birth of molecular evolutionary analysis, primates have been a central focus of study and mitochondrial DNA is well suited to these endeavors because of its unique features. Surprisingly, to date no comprehensive evaluation of the nucleotide substitution patterns has been conducted on the mitochondrial genome of primates. Here, we analyzed the evolutionary patterns and evaluated selection and recombination in the mitochondrial genomes of 44 Primates species downloaded from GenBank. The results revealed that a strong rate heterogeneity occurred among sites and genes in all comparisons. Likewise, an obvious decline in primate nucleotide diversity was noted in the subunit rRNAs and tRNAs as compared to the protein-coding genes. Within 13 protein-coding genes, the pattern of nonsynonymous divergence was similar to that of overall nucleotide divergence, while synonymous changes differed only for individual genes, indicating that the rate heterogeneity may result from the rate of change at nonsynonymous sites. Codon usage analysis revealed that there was intermediate codon usage bias in primate protein-coding genes, and supported the idea that GC mutation pressure might determine codon usage and that positive selection is not the driving force for the codon usage bias. Neutrality tests using site-specific positive selection from a Bayesian framework indicated no sites were under positive selection for any gene, consistent with near neutrality. Recombination tests based on the pairwise homoplasy test statistic supported complete linkage even for much older divergent primate species. Thus, with the exception of rate heterogeneity among mitochondrial genes, evaluating the validity assumed complete linkage and selective neutrality in primates prior to phylogenetic or phylogeographic analysis seems unnecessary.

  6. Preliminary safety analysis for key design features of KALIMER-600

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. B.; Chang, W. P.; Suk, S. D.; Ha, K. S.; Jeong, H. Y.; Heo, S

    2004-03-01

    KAERI is developing the conceptual design of a liquid metal reactor, KALIMER-600 (Korea Advanced LIquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER-600 addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, key safety design features are described and safety analysis results for typical ATWS accidents in the KALIMER design with a breakeven core are presented. First, the basic approach to achieving the safety goal is introduced in Chapter 1, and the event categorization and acceptance criteria for the KALIMER-600 safety analysis are described in Chapter 2. In Chapter 3, results of inherent safety evaluations for the KALIMER-600 conceptual design are presented. The KALIMER-600 core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated Anticipated Transient Without Scram (ATWS) have been performed using the SSC-K code to investigate the KALIMER-600 system response to the events. They are categorized as Bounding Events (BEs) because of their low probability of occurrence. Chapter 4 presents the analysis of flow blockage for KALIMER-600 with the MATRA-LMR-FB code, which has been developed for internal flow blockage in an LMR subassembly; cases with blockage of 6, 24, and 54 subchannels are analyzed. The performance analysis of the KALIMER-600 containment and evaluations of the behavior during a HCDA will be performed later.

  7. Preliminary safety analysis for key design features of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

    KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R and D program. In this report, descriptions of the KALIMER safety design features and safety analysis results for selected ATWS accidents are presented. First, the basic approach to achieving the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome are compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced in chapter 5 to investigate the core kinetics and hydraulic behavior during a HCDA. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  8. Preliminary RAMI analysis of DFLL TBS for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dagui [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); University of Science and Technology of China, Hefei, Anhui, 230031 (China); Yuan, Run [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Wang, Jiaqun, E-mail: jiaqun.wang@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Wang, Fang; Wang, Jin [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China)

    2016-11-15

    Highlights: • We performed a functional analysis of the DFLL TBS. • We performed a failure mode analysis of the DFLL TBS. • We estimated the reliability and availability of the DFLL TBS. • The ITER RAMI approach was applied to the DFLL TBS for technical risk control in the design phase. - Abstract: ITER is the first fusion machine fully designed to prove the physics and technological basis for the next fusion power plants. Among the main technical objectives of ITER is to test and validate design concepts of tritium breeding blankets relevant to fusion power plants. To achieve this goal, China has proposed the dual functional lithium-lead test blanket module (DFLL TBM) concept design. The DFLL TBM and its associated ancillary systems are collectively called the DFLL TBS, which will play a key role in the next fusion reactor; a risk control project has therefore been scheduled to ensure that the DFLL TBS is reliable and available. As part of the ITER technical risk control policy, the RAMI (Reliability, Availability, Maintainability, Inspectability) approach is used to control ITER's technical risk. In this paper, the RAMI approach was applied to the conceptual design of the DFLL TBS. A functional breakdown was prepared for the DFLL TBS, and the system was divided into 3 main functions and 72 basic functions. Based on the functional breakdown, reliability block diagrams were prepared to estimate the reliability and availability of each function under the stipulated operating conditions. The inherent availability of the DFLL TBS expected after implementation of mitigation actions was calculated to be 98.57% over 2 years, based on the ITER reliability database. A Failure Modes, Effects and Criticality Analysis (FMECA) was performed, with criticality charts highlighting the risk level of the different failure modes with regard to their probability of occurrence and their effects on availability.
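
    To make the availability arithmetic concrete, the sketch below applies the usual reliability-block-diagram rule: inherent availability A = MTBF / (MTBF + MTTR) per block, multiplied along a series chain. The blocks and their MTBF/MTTR values are invented, not taken from the DFLL TBS analysis.

```python
# Series reliability-block-diagram availability with invented figures.

def availability(mtbf_h, mttr_h):
    """Inherent availability A = MTBF / (MTBF + MTTR), both in hours."""
    return mtbf_h / (mtbf_h + mttr_h)

# Hypothetical series blocks of an ancillary loop: pump, heat exchanger,
# instrumentation, as (MTBF, MTTR) pairs in hours.
blocks = [(20_000, 48), (50_000, 120), (8_000, 8)]

a_system = 1.0
for mtbf, mttr in blocks:
    a_system *= availability(mtbf, mttr)   # series blocks multiply
print(f"series-system inherent availability: {a_system:.4%}")
```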

  9. Macroalgae as a Biomass Feedstock: A Preliminary Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roesijadi, Guritno; Jones, Susanne B.; Snowden-Swan, Lesley J.; Zhu, Yunhua

    2010-09-26

    A thorough analysis of macroalgae as a biofuel feedstock is warranted given the size of this biomass resource and the need to consider all potential feedstock sources to meet current biomass production goals. Understanding how to harness this untapped biomass resource will require additional research and development. A detailed assessment of environmental resources, cultivation and harvesting technology, conversion to fuels, connectivity with existing energy supply chains, and the associated economic and life cycle analyses will facilitate evaluation of this potentially important biomass resource.

  10. Statistical Analysis of Time Series Data (STATS). Users Manual (Preliminary)

    Science.gov (United States)

    1987-05-01

    (OCR residue from the manual's record-format tables; the recoverable content indicates that period lengths of 15, 30, 60, 90, 120, and 183 days are presently used, and describes input-record fields such as NPRDS, the actual number of periods for the event on the following records; JEND, the order number of the last period in the time series to select for analysis, with the last period assumed if blank; JPPF, a plotting field; and the IN record, which carries the time series data.)

  11. A Preliminary Genetic Analysis of Complement 3 Gene and Schizophrenia.

    Directory of Open Access Journals (Sweden)

    Jianliang Ni

    Full Text Available Complement pathway activation was found to occur frequently in schizophrenia, and complement 3 (C3) plays a major role in this process. Previous studies have provided evidence for a possible role of C3 in the development of schizophrenia. In this study, we hypothesized that the gene encoding C3 (C3) may confer susceptibility to schizophrenia in Han Chinese. We analyzed 7 common single nucleotide polymorphisms (SNPs) of C3 in 647 schizophrenia patients and 687 healthy controls. Peripheral C3 mRNA expression level was measured in 23 drug-naïve patients with schizophrenia and 24 controls. Two SNPs (rs1047286 and rs2250656) that deviated from Hardy-Weinberg equilibrium were excluded from further analysis. Among the remaining 5 SNPs, there was no significant difference in allele and genotype frequencies between the patient and control groups. Logistic regression analysis showed no significant SNP-gender interaction in either the dominant or the recessive model. There was no significant difference in the level of peripheral C3 expression between the drug-naïve schizophrenia patients and healthy controls. In conclusion, the results of this study do not support C3 as a major genetic susceptibility factor in schizophrenia. Other factors in AP may have critical roles in schizophrenia and be worthy of further investigation.

  12. The opinions of the kindergarten teachers in relation to the introduction of computers to nursery schools: Preliminary approach

    Directory of Open Access Journals (Sweden)

    Irene Sivropoulou

    2009-03-01

    Full Text Available Computers were introduced into Greek kindergartens with the new kindergarten curricula (Inter-disciplinary Integrated Framework of Study Programs, Official Journal of the Hellenic Republic 376΄t.B/18-10-2001, article 6) in order to contribute to the all-round development of children and to extend their learning. In other words, the computer is intended to increase interest and motivation for learning, to encourage active learning, to strengthen the dynamics of visualization, the importance of feedback, and the possibility of monitoring, and to connect school activities with extra-curricular activities so as to strengthen the social and cultural dimension of kindergarten. Nevertheless, technology cannot, by itself, bring about the sought-after change in preschool education. Kindergarten teachers are the key to the successful use of computers in kindergarten. However, while kindergarten teachers in certain countries approve of the introduction and use of computers and believe that education with computers is developmentally suitable for small children, in other countries the attitude of kindergarten teachers towards computers is rather negative. Is this negative attitude related to the teachers' knowledge of computers and how often they use them, or is it related to cultural factors and the prevailing educational philosophies? These questions led us to investigate the opinions of kindergarten teachers in Thessaloniki regarding the introduction of new technologies in kindergarten. The research is made up of three interactive parts: it begins with a theoretical discussion of the introduction of computers in kindergarten, continues with an investigation of the opinions of 122 kindergarten teachers using a questionnaire of 33 questions, and ends with the interpretative analysis.

  13. Preliminary Rock Physics Analysis on Lodgepole Formation in Manitoba, Canada

    Science.gov (United States)

    Kim, N.; Keehm, Y.

    2012-12-01

    We present rock physics analysis results for the Lodgepole Formation, a carbonate reservoir in the Daly Field, Manitoba, Canada. Using eight well logs and previous work, we confirmed that the Lodgepole Formation can be divided into six units in the study area: Basal Limestone, Cromer Shale, Cruickshank Crinoidal, Cruickshank Shale, Daly member and Flossie Lake member, from the bottom. We then performed rock physics analyses on four carbonate units (Basal Limestone, Cruickshank Crinoidal, Daly and Flossie Lake), including Vp-porosity and AI-porosity analyses, DEM (differential effective medium) modeling, and fluid substitution analysis. In the Vp-porosity domain, the top unit, the Flossie Lake member, has lower porosity and higher velocity, while the other units show similar porosity and velocity. We attribute this to diagenesis of the Flossie Lake member, since it is bounded by an unconformity. However, the four units show very similar trends in the Vp-porosity domain, so we can report a single Vp-porosity relation for all carbonate units of the Lodgepole Formation. The AI-porosity analysis shows that the acoustic impedance varies by more than 10% from the low porosity zone (3-6%) to the high porosity zone (9-12%), so high porosity zones can be delineated from seismic impedance data. DEM modeling showed that the Flossie Lake member has a relatively lower pore aspect ratio than the others, which again implies that the top unit has been influenced by diagenesis. To determine the fluid sensitivity of the carbonate units, we conducted fluid substitution on the four units from 100% water to 100% oil. The top unit, Flossie Lake, showed a slight increase in Vp, which appears to be a density effect; the others showed a small but insignificant decrease in Vp. Observing Vp/Vs rather than Vp increases the sensitivity, but fluid discrimination would still be difficult because of the high stiffness of the rock frame. In summary, the three lower carbonate units of the Lodgepole Formation would be prospective and high porosity zones can be delineated
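
    The water-to-oil substitution described above is conventionally done with Gassmann's equation; a minimal sketch follows, with generic calcite-like moduli and fluid properties standing in for the Lodgepole values, which are not given here.

```python
# Gassmann fluid substitution: saturated bulk modulus from dry-rock,
# mineral, and fluid moduli (shear modulus is unaffected by the fluid).
# All moduli below are generic placeholders in GPa, not Lodgepole data.

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """K_sat = K_dry + (1 - K_dry/K_min)^2 /
                 (phi/K_fl + (1 - phi)/K_min - K_dry/K_min^2)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2
    return k_dry + num / den

k_min, k_dry, phi = 76.8, 35.0, 0.10           # calcite-like mineral, 10% porosity
for name, k_fl in (("brine", 2.8), ("oil", 1.0)):
    print(name, round(gassmann_ksat(k_dry, k_min, k_fl, phi), 2), "GPa")
```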

  14. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    Science.gov (United States)

    Duffy, D. Q.; Schnase, J. L.; Clune, T. L.; Kim, E. J.; Freeman, S. M.; Thompson, J. H.; Hunter, K. A.; Theriot, M. E.

    2011-12-01

    Data intensive analytic workflows bridge between the largely unstructured mass of stored scientific data and the highly structured, tailored, reduced, and refined products used by scientists in their research. In general, the initial steps of an analysis, those operations that first interact with a data repository, tend to be the most general, while data manipulations closer to the client tend to be the most specialized to the individual, to the domain, or to the science question under study. The amount of data being operated on also tends to be larger on the repository-side of the workflow, smaller toward the client-side end products. We are using MapReduce to exploit this natural stratification, optimize efficiencies along the workflow chain, and provide a preliminary qualitative and quantitative assessment of MapReduce as a means of enabling server-side, distributed climate data analysis. MapReduce is a model for distributed storage and computation that seeks to improve efficiencies of the near-archive operations that initiate workflows. Simply put, MapReduce stores chunked data on disks with associated processors in such a way that operations on the chunked data can occur in parallel and return meaningfully aggregated results. While MapReduce has proven effective for large repositories of textual data, its use in data intensive science applications has been limited, because many scientific data sets are inherently complex, have high dimensionality, and use binary formats. We are using Apache's open-source Hadoop software implementation of MapReduce on top of the Hadoop Filesystem in our evaluation. Our analyses focus on soil moisture, precipitation, and atmospheric water-vapor, important classes of observation- and simulation-derived data products. The specific data sets being used in the evaluation include MERRA monthly precipitation and soil moisture products; the MODIS Atmospheres, 8-day global water-vapor product; and the SMOS 3-day global soil moisture
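
    A toy illustration of the MapReduce pattern described above, outside of Hadoop: each chunk is mapped to partial (sum, count) pairs per variable, and a reduce step merges them into aggregated means. The chunk contents are synthetic stand-ins for chunked climate records.

```python
from collections import defaultdict

chunks = [  # (variable, value) records as they might sit on separate nodes
    [("precip", 2.1), ("soil_moist", 0.30), ("precip", 0.4)],
    [("precip", 1.7), ("soil_moist", 0.28)],
]

def map_chunk(chunk):
    """Map: reduce one chunk to partial (sum, count) pairs per key."""
    partial = defaultdict(lambda: [0.0, 0])
    for key, value in chunk:
        partial[key][0] += value
        partial[key][1] += 1
    return partial

def reduce_partials(partials):
    """Reduce: merge partial sums/counts and return per-key means."""
    total = defaultdict(lambda: [0.0, 0])
    for p in partials:
        for key, (s, n) in p.items():
            total[key][0] += s
            total[key][1] += n
    return {key: s / n for key, (s, n) in total.items()}

print(reduce_partials([map_chunk(c) for c in chunks]))  # per-variable means
```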

  15. Brain hemisphere dominance and vocational preference: a preliminary analysis.

    Science.gov (United States)

    Szirony, Gary Michael; Pearson, L Carolyn; Burgin, John S; Murray, Gerald C; Elrod, Lisa Marie

    2007-01-01

    Recent developments in split-brain theory add support to the concept of specialization within brain hemispheres. Holland's vocational personality theory may overlap with Human Information Processing (HIP) characteristics. Holland's six RIASEC codes were developed to identify vocational personality characteristics, and HIP scales were designed to measure hemispheric laterality. Relationships between the two scales were evaluated through canonical correlation with some significant results; however, not all Holland scale scores correlated with left, right, or integrated hemispheric preference. Additional findings related to participants' self-perception of music and math ability were also correlated. This additional analysis revealed a high correlation between perceived musical ability and right-brain function, but not between perceived mathematical ability and left-brain function alone. Implications regarding vocational choice and work are discussed.

  16. City of Hoboken Energy Surety Analysis: Preliminary Design Summary

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Baca, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Schenkman, Benjamin L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military and Energy Systems Analysis Dept.; Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Readiness and Sustainment Technology Dept.; Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Readiness and Sustainment Technology Dept.; Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electric Power Systems Research Dept.; Henry, Jordan M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Critical Infrastructure Systems Dept.; Jensen, Richard Pearson [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geomechanics Dept.

    2014-09-01

    In 2012, Hurricane Sandy devastated much of the U.S. northeast coastal areas. Among those hardest hit was the small community of Hoboken, New Jersey, located on the banks of the Hudson River across from Manhattan. This report describes a city-wide electrical infrastructure design that uses microgrids and other infrastructure to ensure the city retains functionality should such an event occur in the future. The designs ensure that up to 55 critical buildings will retain power during blackout or flooded conditions and include analysis for microgrid architectures, performance parameters, system control, renewable energy integration, and financial opportunities (while grid connected). The results presented here are not binding and are subject to change based on input from the Hoboken stakeholders, the integrator selected to manage and implement the microgrid, or other subject matter experts during the detailed (final) phase of the design effort.

  17. Analysis of organochlorine pesticides in human milk: preliminary results.

    Science.gov (United States)

    Campoy, C; Jiménez, M; Olea-Serrano, M F; Moreno-Frías, M; Cañabate, F; Olea, N; Bayés, R; Molina-Font, J A

    2001-11-01

    In the face of evidence of human milk contamination by organochlorine pesticides, an analysis was performed on samples of milk obtained from healthy lactating women in the provinces of Granada and Almeria in Southern Spain. The samples were obtained by the Neonate Section of the Department of Pediatrics of Granada University Hospital (Neonatology Division) and by the Neonatal Service of Poniente Hospital in El Ejido, Almería. A liquid-liquid extraction procedure was performed, and the samples were cleaned up on silica Sep-Pak cartridges before gas chromatography-mass spectrometry (GC-MS). Among other pesticides, aldrin, dieldrin, DDT and its metabolites, lindane, methoxychlor and endosulfan were identified, and their presence was confirmed by mass spectrometry. The identification and quantification of these organochlorine molecules is important because they have estrogenic effects.

  18. The Analysis of Some Contemporary Computer Mikrosystems

    Directory of Open Access Journals (Sweden)

    Angelė Kaulakienė

    2011-04-01

    Full Text Available In every language a twofold process can be observed: 1) a huge surge of new terms, and 2) a large part of these new terms making their way into the common language. The nucleus of the vocabulary and the grammatical system of the common language make up the essence of a language and its national originality. Because of such intensive development, terminological lexis may in the future become a basis of the common language, and it ought to be not a spontaneously formed sum of terminological lexis but an entirety of consciously created terms which meet the requirements of language, logic and terminology. Computer terminology, by comparison with the terminology of other fields, is being created in a slightly unusual way. The first computation institutions in Lithuania were established in the early sixties, and a decade later there were a few computation centres and a number of key-operated and punch machines working. Together with the new computational technology, many new devices, units, parts, phenomena and characteristics appeared which needed naming. Specialists faced an obvious shortage of Lithuanian terms for computing equipment. In 1971 this gap was partly filled by „Rusų-lietuvių-anglų kalbų skaičiavimo technikos žodynas“ (the Russian-Lithuanian-English dictionary of computing equipment), which for a long time (more than 20 years) was the only terminological dictionary of this field. Only during the nineties did a few dictionaries of different scope appear. Computer terminology from the ten dictionaries presently available shows that this 35-year period of computer terminology is a stage of its creation, the main features of which are reasonable synonymy (when both international and Lithuanian terms are being used to name the concept) and variability. Such a state of Lithuanian computer terminology is predetermined by linguistic, interlinguistic and sociolinguistic factors. At present in Lithuania terminological dictionaries of various fields are being given to

  19. Preliminary Analysis of a Fully Solid State Magnetocaloric Refrigeration

    Energy Technology Data Exchange (ETDEWEB)

    Abdelaziz, Omar [ORNL

    2016-01-01

    Magnetocaloric refrigeration is an alternative refrigeration technology with significant potential energy savings compared to conventional vapor compression refrigeration technology. Most reported active magnetic regenerator (AMR) systems that operate on the magnetocaloric effect use a heat transfer fluid to exchange heat, which requires complicated mechanical subsystems and components such as rotating valves and hydraulic pumps. In this paper, we propose an alternative mechanism for heat transfer between the AMR and the heat source/sink: high-conductivity moving rods/sheets (e.g. copper, brass, iron, graphite, aluminum, or composite structures of these) are used instead of a heat transfer fluid, significantly enhancing the heat transfer rate and hence the cooling/heating capacity. A one-dimensional model is developed to study the solid state AMR. In this model, the heat exchange between the solid-solid interfaces is modeled via a contact conductance, which depends on the interface apparent pressure, material hardness, thermal conductivity, surface roughness, the surface slope between the interfaces, and the material filling the gap between the interfaces. Because of the strong influence of this heat exchange on AMR cycle performance, a sensitivity analysis is conducted using a response surface method, in which the apparent pressure, effective surface roughness, and grease thermal conductivity are the uncertain factors. COP and refrigeration capacity are taken as the responses in the sensitivity analysis to reveal the important factors influencing the fully solid state AMR and to optimize its efficiency. The performance of the fully solid state AMR and a traditional AMR are also compared and discussed in the present work. The results of this study provide general guidelines for designing high performance solid state AMR systems.
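
    As a generic illustration of the response-surface idea (the authors' AMR model is not reproduced here), the sketch below fits a quadratic surface to sampled factor-response points and reads off linear sensitivities. The stand-in response function, factor ordering, and ranges are invented assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Factors scaled to [-1, 1]: apparent pressure, surface roughness,
# grease thermal conductivity (ordering assumed for illustration).
X = rng.uniform(-1, 1, size=(60, 3))

def toy_response(x):
    """Invented stand-in for the AMR model's COP prediction."""
    p, r, k = x
    return 2.0 + 0.8 * p - 0.5 * r + 0.3 * k - 0.2 * p * r + 0.1 * p ** 2

y = np.apply_along_axis(toy_response, 1, X)

# Quadratic response surface: intercept, linear, and pure quadratic terms.
A = np.column_stack([np.ones(len(X)), X, X ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("linear sensitivities (pressure, roughness, conductivity):", coef[1:4])
```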

  20. Preliminary analysis of cerebrospinal fluid proteome in patients with neurocysticercosis

    Institute of Scientific and Technical Information of China (English)

    TIAN Xiao-jun; LI Jing-yi; HUANG Yong; XUE Yan-ping

    2009-01-01

    Background: Neurocysticercosis is an infection of the nervous system by the larvae of Taenia solium (T. solium). Despite continuous effort, the experimental diagnosis of neurocysticercosis remains unresolved. Since the cerebrospinal fluid (CSF) is in contact with the brain, dynamic information about pathological processes of the brain is likely to be reflected in the CSF, which may therefore serve as a rich source of putative biomarkers related to neurocysticercosis. Comparative proteomic analysis of the CSF of neurocysticercosis patients and control subjects may identify differentially expressed proteins. Methods: Two-dimensional difference in-gel electrophoresis (2D-DIGE) was used to investigate differentially expressed proteins in the CSF of patients with neurocysticercosis by comparing the protein profile of CSF from neurocysticercosis patients with that from control subjects. The differentially expressed spots/proteins were identified with matrix-assisted laser desorption/ionization time-of-flight/time-of-flight (MALDI-TOF-TOF) mass spectrometry. Results: Forty-four enzyme-digested peptides were obtained from 4 neurocysticercotic patients. Twenty-three proteins were identified through a search of the NCBI protein database with Mascot software, 19 up-expressed and 4 down-expressed. Of these proteins, the 26S proteasome, related to ATP- and ubiquitin-dependent degradation of proteins, and lipocalin-type prostaglandin D synthase, involved in PGD2 synthesis and extracellular transporter activities, were up-expressed, while transferrin, related to iron metabolism within the brain, was down-expressed. Conclusions: This study established the proteomic profile of pooled CSF from 4 patients with neurocysticercosis, suggesting the potential value of proteomic analysis for the study of candidate biomarkers involved in the diagnosis or pathogenesis of neurocysticercosis.

  1. Preliminary Design and Analysis of the GIFTS Instrument Pointing System

    Science.gov (United States)

    Zomkowski, Paul P.

    2003-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Instrument is the next generation spectrometer for remote sensing weather satellites. The GIFTS instrument will scan the Earth's atmosphere by assembling a series of fields of view (FOVs) into a larger pattern, realized by step-scanning the instrument FOV in a contiguous fashion across any desired portion of the visible Earth. A 2.3 arc-second pointing stability, with respect to the scanning instrument, must be maintained for the duration of the FOV scan. A star tracker producing attitude data at a 100 Hz rate will be used by the autonomous pointing algorithm to precisely track target FOVs on the surface of the Earth. The main objective is to validate the pointing algorithm in the presence of spacecraft disturbances and determine acceptable disturbance limits for expected noise sources. Proof-of-concept validation of the pointing system algorithm is carried out with a full system simulation developed in Matlab Simulink. Models for the following components function within the full system simulation: inertial reference unit (IRU), attitude control system (ACS), reaction wheels, star tracker, and mirror controller. With the spacecraft orbital position and attitude maintained within specified limits, the pointing algorithm receives quaternion, ephemeris, and initialization data that are used to construct the required mirror pointing commands at a 100 Hz rate. This comprehensive simulation will also aid in obtaining a thorough understanding of spacecraft disturbances and other sources of pointing system error. Parameter sensitivity studies and disturbance analysis will be used to obtain limits of operability for the GIFTS instrument. The culmination of this simulation development and analysis will be used to validate the specified performance requirements outlined for this instrument.
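
    One building block of such a pointing simulation is the angular error between a commanded and a measured attitude quaternion, to be compared against the 2.3 arc-second stability budget; below is a hedged sketch of that computation, with arbitrary example quaternions rather than GIFTS data.

```python
import numpy as np

def quat_mul(q, p):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def pointing_error_arcsec(q_cmd, q_meas):
    """Rotation angle between commanded and measured attitudes, in arcsec."""
    q_conj = np.array([q_cmd[0], -q_cmd[1], -q_cmd[2], -q_cmd[3]])
    q_err = quat_mul(q_conj, q_meas)
    angle = 2.0 * np.arccos(min(abs(q_err[0]), 1.0))
    return np.degrees(angle) * 3600.0

q_cmd = np.array([1.0, 0.0, 0.0, 0.0])       # arbitrary commanded attitude
half = np.radians(2.3 / 3600.0) / 2.0         # half-angle of 2.3 arcsec
q_meas = np.array([np.cos(half), np.sin(half), 0.0, 0.0])
print(pointing_error_arcsec(q_cmd, q_meas))   # ~2.3 arcsec
```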

  2. Computational Analysis of LDDMM for Brain Mapping

    Directory of Open Access Journals (Sweden)

    Can Ceritoglu

    2013-08-01

    Full Text Available One goal of computational anatomy is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First, we report the application of a multi-atlas segmentation approach to define basal ganglia structures in the brains of healthy and diseased children. The segmentation accuracy of the multi-atlas approach is compared with the single-atlas LDDMM implementation and two state-of-the-art segmentation algorithms - Freesurfer and FSL - by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increased computational complexity, because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. The single-atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity by as much as five times while maintaining reliable segmentations.
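
    Overlap between automatic and manual segmentations is conventionally scored with the Dice coefficient (the paper's exact error metric may differ); a minimal sketch on tiny synthetic masks:

```python
# Dice coefficient between two binary label masks; overlap error = 1 - Dice.
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

auto = np.zeros((8, 8), dtype=int)
auto[2:6, 2:6] = 1      # synthetic "automatic" segmentation
manual = np.zeros((8, 8), dtype=int)
manual[3:7, 2:6] = 1    # synthetic "manual" segmentation, shifted by one row

d = dice(auto, manual)
print(f"Dice = {d:.3f}, overlap error = {1 - d:.3f}")
```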

  3. Preliminary analysis of the use of smartwatches for longitudinal health monitoring.

    Science.gov (United States)

    Jovanov, Emil

    2015-08-01

    New generations of smartwatches feature continuous measurement of physiological parameters, such as heart rate, galvanic skin resistance (GSR), and temperature. In this paper we present the results of a preliminary analysis of the use of the Basis Peak smartwatch for longitudinal health monitoring over a 4 month period. Physiological measurements during sleep are validated using a Zephyr Bioharness 3 monitor and the SOMNOscreen+ polysomnographic monitoring system from SOMNOmedics. The average duration of sequences with no missed data was 49.9 minutes, with a maximum length of 17 hours, and such sequences represent 88.88% of recording time. The average duration of a charging event was 221.9 min, and the average time between charges was 54 hours, with a maximum charging event duration of 16.3 hours. Preliminary results indicate that existing smartwatches provide sufficient physiological monitoring performance for longitudinal monitoring of health status and analysis of health and wellness trends.

  4. Relative risk analysis in regulating the use of radiation-emitting medical devices. A preliminary application

    Energy Technology Data Exchange (ETDEWEB)

    Jones, E.D.; Banks, W.W.; Altenbach, T.J.; Fischer, L.E. [Lawrence Livermore National Lab., CA (United States)

    1995-09-01

    This report describes a preliminary application of an analysis approach for assessing relative risks in the use of radiation-emitting medical devices. Results are presented on human-initiated actions and failure modes that are most likely to occur in the use of the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step in a US Nuclear Regulatory Commission (NRC) plan to evaluate the potential role of risk analysis in regulating the use of nuclear medical devices. For this preliminary application of risk assessment, the focus was to develop a basic process using existing techniques for identifying the most likely risk contributors and their relative importance. The approach taken developed relative risk rankings and profiles that incorporated the type and quality of data available and could present results in an easily understood form. This work was performed by the Lawrence Livermore National Laboratory for the NRC.

  5. A simplified procedure of linear regression in a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Silvia Facchinetti

    2013-05-01

    Full Text Available The analysis of a large statistical data-set can be guided by the study of a particularly interesting variable Y (the regressand) and an explanatory variable X, chosen among the remaining jointly observed variables. The study gives a simplified procedure for obtaining the functional link y = y(x) between the variables by partitioning the data-set into m subsets, in which the observations are synthesized by location indices (mean or median) of X and Y. Polynomial models y(x) of order r are considered to verify the characteristics of the given procedure; in particular we assume r = 1 and 2. The distributions of the parameter estimators are obtained by simulation, when the fitting is done for m = r + 1. Comparisons of the results, in terms of distribution and efficiency, are made with the results obtained by the ordinary least squares method. The study also gives some considerations on the consistency of the estimated parameters obtained by the given procedure.
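
    The following is a minimal sketch of the partition procedure as described: order the data by X, split them into m = r + 1 subsets, synthesize each subset by its mean (one of the two location indices mentioned), and fit the degree-r polynomial through the resulting points. The simulated data, the equal subset sizes, and the use of numpy are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 10.0, 300)
        y = 2.0 + 1.5 * x + rng.normal(0.0, 1.0, 300)   # linear model, r = 1

        def partition_fit(x, y, r):
            """Fit a degree-r polynomial through the means of m = r + 1 subsets."""
            m = r + 1
            order = np.argsort(x)
            groups = np.array_split(order, m)              # partition along x
            gx = np.array([x[g].mean() for g in groups])   # location index of X
            gy = np.array([y[g].mean() for g in groups])   # location index of Y
            return np.polyfit(gx, gy, r)                   # exact fit: m points, degree r

        print(partition_fit(x, y, 1))    # compare with ordinary least squares:
        print(np.polyfit(x, y, 1))

    With m = r + 1 points the polynomial interpolates the subset means exactly, which is what makes the procedure so cheap relative to a full least-squares fit.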

  6. Preliminary Analysis of Slope Stability in Kuok and Surrounding Areas

    Directory of Open Access Journals (Sweden)

    Dewandra Bagus Eka Putra

    2016-12-01

    Full Text Available The level of slope is influenced by the condition of the rocks beneath the surface. On steep slopes, the amount of surface runoff and the water transport energy are also enlarged, because the gravitational pull along the surface grows with the tilt from the horizontal plane. In other words, more and more topsoil is eroded: when the slope becomes twice as steep, the amount of erosion per unit area becomes 2.0-2.5 times greater. Kuok and the surrounding area carry the road access between West Sumatra and Riau, which plays an important role in the economies of both provinces. The purpose of this study is to map the locations that have fairly steep slopes and the potential modes of landslides. Based on the SRTM data obtained, the roads in the Kuok area have a minimum elevation of +33 m and a maximum of +217.329 m. Rugged road conditions with slopes ranging from 24.08° to 44.68° cause this area to have frequent landslides. The result of slope stability analysis on a slope near the Koto Panjang water power plant indicated that the active failure mode is toppling or rock fall, and that the potential failure zone is in the center part of the slope.

  7. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool for publishing information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers see weblogs as an appropriate medium to initiate and expand a business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs, and frequent communication with influential webloggers is one of the ways to keep surviving as a DPOW. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how information is exchanged among members and enhance survivability among DPOWs. 30 DPOWs were involved in this study. Degree centrality and betweenness centrality measurements in Social Network Analysis (SNA) were used to examine the strength of relations and identify influential webloggers within the network. Webloggers with the highest values on these measurements are considered the most influential webloggers in the network.
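
    As a hedged illustration of the two measurements named above, the sketch below computes degree centrality and betweenness centrality on a toy graph with networkx. The edge list is hypothetical; the study's actual 30-weblogger network is not reproduced here.

        import networkx as nx

        # Toy interaction network among webloggers (hypothetical edges).
        G = nx.Graph()
        G.add_edges_from([
            ("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"),
            ("D", "E"), ("E", "F"), ("C", "F"), ("A", "F"),
        ])

        degree = nx.degree_centrality(G)             # share of direct ties
        betweenness = nx.betweenness_centrality(G)   # brokerage along shortest paths

        # Webloggers with the highest values are the most influential in the network.
        for node in sorted(G, key=lambda n: (degree[n], betweenness[n]), reverse=True):
            print(f"{node}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")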

  8. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    Energy Technology Data Exchange (ETDEWEB)

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' Hazards Analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operation sequence steps for the mechanical handling operations that prepare the Sludge Transport and Storage Container (STSC), disconnect the STSC, and prepare the STSC and Sludge Transport System (STS) for shipping.

  9. Preliminary Failure Modes and Effects Analysis of the US Massive Gas Injection Disruption Mitigation System Design

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2013-10-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a candidate design for the ITER Disruption Mitigation System. This candidate is the Massive Gas Injection System that provides machine protection in a plasma disruption event. The FMEA was quantified with “generic” component failure rate data as well as some data calculated from operating facilities, and the failure events were ranked for their criticality to system operation.
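
    A minimal sketch of how quantified FMEA results can be ranked for criticality is shown below. The failure modes, rates, and severities are illustrative stand-ins rather than data from the report, and the simple rate-times-severity criticality number is one common convention, not necessarily the report's ranking method.

        # Each entry: (failure mode, failure rate per hour, severity 1-10).
        # Rates and severities are illustrative stand-ins for the "generic" data
        # described in the report, not values taken from it.
        failure_modes = [
            ("Fast-acting valve fails to open", 1.0e-5, 9),
            ("Gas reservoir leak",              3.0e-6, 6),
            ("Rupture disk premature burst",    5.0e-6, 7),
            ("Controller loses trigger signal", 2.0e-5, 8),
        ]

        # Rank by a simple criticality number: failure rate x severity.
        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2], reverse=True)
        for mode, rate, severity in ranked:
            print(f"{mode:34s} rate={rate:.1e}/h severity={severity} crit={rate*severity:.1e}")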

  10. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2007-08-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  11. Preliminary Failure Modes and Effects Analysis of the US DCLL Test Blanket Module

    Energy Technology Data Exchange (ETDEWEB)

    Lee C. Cadwallader

    2010-06-01

    This report presents the results of a preliminary failure modes and effects analysis (FMEA) of a small tritium-breeding test blanket module design for the International Thermonuclear Experimental Reactor. The FMEA was quantified with “generic” component failure rate data, and the failure events are binned into postulated initiating event families and frequency categories for safety assessment. An appendix to this report contains repair time data to support an occupational radiation exposure assessment for test blanket module maintenance.

  12. Adapting computational text analysis to social science (and vice versa)

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models trained on labeled data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  13. Cloud Computing for Rigorous Coupled-Wave Analysis

    Directory of Open Access Journals (Sweden)

    N. L. Kazanskiy

    2012-01-01

    Full Text Available Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA is adapted to cloud computing environment. In order to accomplish this, data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.
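
    The conversion described, from a CPU-intensive per-parameter loop to data-intensive map and reduce phases, can be sketched as follows. The solve_rcwa function is a placeholder with toy physics, not an actual RCWA solver, and the key-grouping reducer only mimics the shuffle step of a real MapReduce framework.

        from functools import reduce

        # Illustrative stand-in for one CPU-intensive RCWA solve: given a grating
        # parameter set, return (wavelength, diffraction efficiency).
        def solve_rcwa(params):
            wavelength, period, depth = params
            efficiency = depth / (depth + wavelength / period)  # placeholder physics
            return (wavelength, efficiency)

        # Map phase: independent solves over the parameter grid (parallelizable).
        grid = [(w, 1.5, d) for w in (0.4, 0.5, 0.6) for d in (0.1, 0.2)]
        mapped = map(solve_rcwa, grid)

        # Reduce phase: aggregate per-wavelength results, as MapReduce would shuffle
        # records sharing a key to one reducer.
        def collect(acc, kv):
            acc.setdefault(kv[0], []).append(kv[1])
            return acc

        print(reduce(collect, mapped, {}))

    Because each parameter set is solved independently, the map phase scales out across cloud nodes, which is the scalability property the paper exploits.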

  14. PRELIMINARY PHYTOCHEMICAL ANALYSIS AND ACUTE ORAL TOXICITY STUDY OF CLITORIA TERNATEA LINN. ROOTS IN ALBINO MICE

    Directory of Open Access Journals (Sweden)

    Deka Manalisha

    2011-12-01

    Full Text Available Clitoria ternatea has been used since ancient times for its medicinal values. Almost all parts of the plant have medicinal properties. The root of the plant is reported to have anti-diarrheal, antihistaminic and cholinergic activity. Traditionally the root has been used for the treatment of many conditions, such as leucorrhoea, diarrhea, urinary problems, impotency and stomach trouble, and as a diuretic. The present study was designed to investigate the preliminary phytochemical composition and acute oral toxicity of the root of the plant. The shade-dried materials were ground and used in the study. The preliminary phytochemical analysis was done following standard protocols. For the acute oral toxicity study, a methanolic extract of the root, prepared by a standard protocol, was used. The preliminary phytochemical analysis showed the presence of proteins, carbohydrates, glycosides, resins, saponins, flavonoids, alkaloids, steroids and phenols. The acute oral toxicity study showed no mortality up to a dose of 3000 mg per kg body weight. The presence of these phytochemicals reveals the medicinal value of the plant, and its non-toxic character indicates its value as a medicine. Thus we can conclude that the root of the plant can be used as a safe drug against many diseases.

  15. PRELIMINARY PHYTOCHEMICAL ANALYSIS AND ACUTE ORAL TOXICITY STUDY OF MUCUNA PRURIENS LINN. IN ALBINO MICE

    Directory of Open Access Journals (Sweden)

    Deka Manalisha

    2012-02-01

    Full Text Available Mucuna pruriens Linn. is an annual climbing shrub that has held an important place among aphrodisiac herbs in India since ancient times. The plant has been used traditionally for many medicinal purposes, such as treating infertility, Parkinson's disease and loss of libido, and as an antioxidant, anti-venom and antimicrobial agent. The present study was carried out to investigate the preliminary phytochemical composition and acute oral toxicity of the seeds of M. pruriens in albino mice. Matured seeds of M. pruriens were dried in shade and ground in a mechanical grinder. The preliminary phytochemical analysis was done following standard protocols. For the acute oral toxicity study, a methanolic extract of the seeds, prepared in a Soxhlet apparatus, was used. The preliminary phytochemical analysis showed the presence of proteins, carbohydrates, glycosides, alkaloids, steroids, flavonoids, phenols and tannins. The acute oral toxicity study showed no mortality up to a dose of 4000 mg per kg body weight. The presence of these phytochemicals reveals the medicinal value of the plant, and its non-toxic character indicates its value as a medicine. Thus, we can conclude that the seed of the plant can be used as a safe drug against many diseases.

  16. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as GAMESS and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps this takes: a plot of computation step N versus energy (kcal/mol) shows that the energy of the titania converges faster, at the 7th iteration of the calculation, whereas the silica converges at the 9th iteration.

  17. Conversation Analysis of Computer-Mediated Communication

    Science.gov (United States)

    Gonzalez-Lloret, Marta

    2011-01-01

    The potential of computer-mediated communication (CMC) for language learning resides mainly in the possibility that learners have to engage with other speakers of the language, including L1 speakers. The inclusion of CMC in the L2 classroom provides an opportunity for students to utilize authentic language in real interaction, rather than the more…

  18. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation.

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  20. Computational analysis of ozonation in bubble columns

    Energy Technology Data Exchange (ETDEWEB)

    Quinones-Bolanos, E. [Univ. of Guelph, School of Engineering, Guelph, Ontario (Canada); Univ. de Cartagena, Facultad de Ciencias e Ingenieria, Cartagena de Indias (Colombia)]; Zhou, H.; Otten, L. [Univ. of Guelph, School of Engineering, Guelph, Ontario (Canada)]. E-mail: hzhou@uoguelph.ca

    2002-06-15

    This paper presents a new computational ozonation model based on the principles of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow, and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot-scale fine bubble column. One distinct advantage of this approach is that it does not require prerequisite assumptions such as plug flow conditions, perfect mixing, tanks-in-series, or uniform radial or longitudinal dispersion when predicting the performance of disinfection contactors, and it avoids carrying out expensive and tedious tracer studies. (author)
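
    A heavily reduced illustration of the two coupled processes the transport equations track, first-order ozone decay and Chick-Watson-type microbial inactivation, is sketched below as a zero-dimensional time integration. The rate constants and initial values are assumptions for illustration; the paper's actual model couples these kinetics to a two-phase CFD flow field rather than a single contact-time integral.

        import numpy as np

        # First-order ozone decay and Chick-Watson-type microbial inactivation,
        # integrated over contact time with a simple Euler scheme. Rate constants
        # are illustrative, not taken from the paper.
        k_decay = 0.05        # ozone decay rate, 1/min
        k_inact = 1.2         # inactivation rate, L/(mg*min)

        dt, t_end = 0.01, 10.0
        C, N = 2.0, 1.0       # ozone (mg/L) and normalized microorganism count
        for _ in range(int(t_end / dt)):
            C -= k_decay * C * dt         # d[O3]/dt = -k_d [O3]
            N -= k_inact * C * N * dt     # dN/dt = -k_i [O3] N

        print(f"residual ozone: {C:.3f} mg/L, log inactivation: {-np.log10(N):.2f}")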

  1. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  2. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We extracted the author rank in the field of physics utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.

  3. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biolo

  4. GC-MS analysis, preliminary phytochemical screening, physicochemical analysis and anti-diabetic activity of ethanol extract of Jasminum cuspidatum leaves

    National Research Council Canada - National Science Library

    Singumsetty Vinay; Shaik Karimulla; Devarajan Saravanan

    2014-01-01

    The purpose of the present study was to investigate the GC-MS analysis, preliminary phytochemical screening, physicochemical analysis and anti-diabetic activity of the ethanol extract of the leaves of Jasminum cuspidatum...

  5. ANSI/ASHRAE/IESNA Standard 90.1-2010 Preliminary Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Rosenberg, Michael I.

    2010-11-01

    The United States (U.S.) Department of Energy (DOE) conducted a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2010 (ASHRAE Standard 90.1-2010, Standard 90.1-2010, or 2010 edition) would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2007 (ASHRAE Standard 90.1-2007, Standard 90.1-2007, or 2007 edition). The preliminary analysis considered each of the 109 addenda to ASHRAE Standard 90.1-2007 that were included in ASHRAE Standard 90.1-2010. All 109 addenda processed by ASHRAE in the creation of Standard 90.1-2010 from Standard 90.1-2007 were reviewed by DOE, and their combined impact on a suite of 16 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s preliminary determination. However, out of the 109 addenda, 34 were preliminarily determined to have measurable and quantifiable impact.

  6. Preliminary phytochemical analysis and DPPH free radical scavenging activity of Trewia nudiflora Linn. roots and leaves.

    Science.gov (United States)

    Balakrishnan, N; Srivastava, Mayank; Tiwari, Pallavi

    2013-11-01

    Oxidative stress is one of the major causative factors of many chronic and degenerative diseases. Plants have been used in traditional medicine in different parts of the world for thousands of years and continue to provide new remedies for humankind. The present study investigated the preliminary phytochemical composition of various extracts of the roots and leaves of Trewia nudiflora (Euphorbiaceae) and their antioxidant activity by the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging method. The preliminary phytochemical screening showed the presence of several phytochemicals, including alkaloids, glycosides, flavonoids, steroids, phenolic compounds and tannins. The ethanol and aqueous extracts of the roots and leaves of Trewia nudiflora showed significant antioxidant activity compared to the standard drug ascorbic acid.

  7. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
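
    The computational independence of permutation iterations, which the article exploits on clusters and cloud instances, can be sketched locally as follows. The data, group sizes, and use of Python's ProcessPoolExecutor are illustrative assumptions; the same decomposition transfers to distributed back-ends.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        rng = np.random.default_rng(1)
        group_a = rng.normal(0.0, 1.0, 50)
        group_b = rng.normal(0.4, 1.0, 50)
        observed = group_a.mean() - group_b.mean()
        pooled = np.concatenate([group_a, group_b])

        def permuted_stat(seed):
            """One permutation iteration; independent, hence trivially parallel."""
            r = np.random.default_rng(seed)
            shuffled = r.permutation(pooled)
            return shuffled[:50].mean() - shuffled[50:].mean()

        if __name__ == "__main__":
            # Iterations are computationally independent, so they can be spread
            # over cluster nodes or cloud instances just as over local cores.
            with ProcessPoolExecutor() as pool:
                null = list(pool.map(permuted_stat, range(10000)))
            p = np.mean(np.abs(null) >= abs(observed))
            print(f"permutation p-value: {p:.4f}")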

  8. Computer-assisted quantification of interstitial lung disease associated with rheumatoid arthritis: Preliminary technical validation

    Energy Technology Data Exchange (ETDEWEB)

    Marten, K. [Department of Radiology, Georg August University of Goettingen, Robert-Koch-Strasse 40, 37075 Goettingen (Germany); Dicken, V. [MeVis Research GmbH, Universitaetsallee 29, 28359 Bremen (Germany); Kneitz, C. [Department of Rheumatology and Clinical Immunology, Medizinische Klinik and Poliklinik, University Hospital of Wuerzburg, Klinikstrasse 6, 97070 Wuerzburg (Germany); Hoehmann, M.; Kenn, W.; Hahn, D. [Department of Radiology, University Hospital of Wuerzburg, Josef-Schneider-Strasse 2, 97080 Wuerzburg (Germany); Engelke, C. [Department of Radiology, Georg August University of Goettingen, Robert-Koch-Strasse 40, 37075 Goettingen (Germany)], E-mail: c.engelke@med.uni-goettingen.de

    2009-11-15

    Purpose: To validate a threshold-based prototype software application (MeVis PULMO 3D) for quantification of chronic interstitial lung disease (ILD) in patients with rheumatoid arthritis (RA) using variable threshold settings for segmentation of diseased lung areas. Methods: Twenty-two patients with rheumatoid arthritis were included and underwent thin-section CT (4 x 1.25 mm collimation). CT scans were assessed by two observers for extent of ILD (EoILD), and twice by MeVis PULMO 3D for each protocol. MeVis PULMO 3D used four segmentation threshold (ST) settings (ST = -740, -780, -800 and -840 HU). Pulmonary function tests were obtained in all patients. Statistical evaluation used 95% limits of agreement (LoA) and linear regression analysis. Results: There was total concordance between the software measurements. Interobserver agreement was good (LoA = -28.36 to 17.58%). EoILD by readers correlated strongly with DLCO (r = -0.702, p < 0.0001) and moderately with FVC (r = -0.523, p = 0.018). There was close correlation between readers and MeVis PULMO 3D, with best results for ST below -780 HU (EoILD vs. MeVis PULMO 3D: r = 0.650 for ST = -800 and -840 HU, respectively; p = 0.002). MeVis PULMO 3D correlated best with DLCO at an ST of -800 HU (r = -0.44, -0.49, -0.58 and -0.57 for ST = -740, -780, -800 and -840, respectively; p = 0.007-0.05) and moderately with FVC (r = -0.44, -0.51, -0.59 and -0.45 for ST = -740, -780, -800 and -840, respectively; p = 0.007-0.05). Conclusion: The MeVis PULMO 3D system holds promise to become a valuable instrument for quantification of chronic ILD in patients with RA when using the threshold value of -800 HU, which showed the closest correlations both with human observers and with physiologic impairment.

  9. CONSTRUCTION OF THE CHINESE LEARNERS' PARALLEL CORPUS OF JAPANESE AND ITS PRELIMINARY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Masatake Dantsuji

    2004-01-01

    Full Text Available This study aims to introduce the project to construct the Chinese learners' corpus (LC) of Japanese at Dalian University of Technology (DUT), and details the LC construction, the development of the DUT Corpus Linguistics Tools, and the contribution to the education of Japanese as a second language. The outstanding characteristic of the LC is its parallel form, with learners' Japanese texts and their Chinese translations, which enables us to make a comprehensive analysis of the influence of Chinese (L1) on Japanese (L2). We have made a preliminary analysis of the errors contained.

  10. Preliminary Design and Analysis of the ARES Atmospheric Flight Vehicle Thermal Control System

    Science.gov (United States)

    Gasbarre, J. F.; Dillman, R. A.

    2003-01-01

    The Aerial Regional-scale Environmental Survey (ARES) is a proposed 2007 Mars Scout Mission that will be the first mission to deploy an atmospheric flight vehicle (AFV) on another planet. This paper will describe the preliminary design and analysis of the AFV thermal control system for its flight through the Martian atmosphere and also present other analyses broadening the scope of that design to include other phases of the ARES mission. Initial analyses are discussed and results of trade studies are presented which detail the design process for AFV thermal control. Finally, results of the most recent AFV thermal analysis are shown and the plans for future work are discussed.

  11. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  12. Error computation for adaptive finite element analysis

    CERN Document Server

    Khan, A A; Memon, I R; Ming, X Y

    2002-01-01

    The paper gives a simple numerical procedure for computing the errors generated by the discretisation process of the finite element method. The procedure is based on the ZZ error estimator, which is believed to be reasonably accurate and thus can be readily implemented in any existing finite element code. The devised procedure not only estimates the global energy norm error but also evaluates the local errors in individual elements. In the example, the given procedure is combined with an adaptive refinement procedure, which provides guidance for optimal mesh design and allows the user to obtain a desired accuracy with a limited number of iterations. (author)
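
    A minimal one-dimensional sketch of the ZZ (Zienkiewicz-Zhu) idea, comparing the raw element gradients of a finite element solution with a smoothed, nodally recovered gradient field, is given below. The mesh, the stand-in nodal solution, and the midpoint quadrature are illustrative simplifications of the paper's procedure.

        import numpy as np

        # 1D Zienkiewicz-Zhu-style indicator: compare the piecewise-constant raw
        # gradient of a finite element solution with a smoothed (nodally recovered)
        # gradient; large differences flag elements for refinement.
        nodes = np.linspace(0.0, 1.0, 11)
        u = nodes**3                      # stand-in for a computed FE solution

        h = np.diff(nodes)
        raw_grad = np.diff(u) / h         # constant gradient per element

        # Recover nodal gradients by averaging adjacent element gradients.
        nodal_grad = np.empty_like(nodes)
        nodal_grad[0], nodal_grad[-1] = raw_grad[0], raw_grad[-1]
        nodal_grad[1:-1] = 0.5 * (raw_grad[:-1] + raw_grad[1:])

        # Element error indicator: difference between recovered and raw gradient,
        # using the element-midpoint value of the recovered field.
        recovered_mid = 0.5 * (nodal_grad[:-1] + nodal_grad[1:])
        eta = np.sqrt(h) * np.abs(recovered_mid - raw_grad)
        print("element indicators:", np.round(eta, 4))

    In an adaptive loop, elements with the largest indicators would be refined first, which is the guidance for mesh design mentioned in the abstract.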

  13. Computer-aided Analysis of Physiological Systems

    Directory of Open Access Journals (Sweden)

    Balázs Benyó

    2007-12-01

    Full Text Available This paper presents the recent biomedical engineering research activity of the Medical Informatics Laboratory at the Budapest University of Technology and Economics. The research projects are carried out in the fields as follows: Computer aided identification of physiological systems; Diabetic management and blood glucose control; Remote patient monitoring and diagnostic system; Automated system for analyzing cardiac ultrasound images; Single-channel hybrid ECG segmentation; Event recognition and state classification to detect brain ischemia by means of EEG signal processing; Detection of breathing disorders like apnea and hypopnea; Molecular biology studies with DNA-chips; Evaluation of the cry of normal hearing and hard of hearing infants.

  14. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
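
    As a hedged sketch of anomaly detection based on the correlation coefficient matrix, the code below compares each window's feature-correlation matrix against a baseline using a Frobenius-norm distance. The traffic features, covariance values, and the specific distance measure are assumptions for illustration, not the paper's exact method.

        import numpy as np

        rng = np.random.default_rng(2)

        # Rows = time windows, columns = traffic features (e.g. packet count,
        # byte count, flow count); values are illustrative.
        normal = rng.multivariate_normal(
            [100, 5000, 40],
            [[25, 900, 5], [900, 40000, 150], [5, 150, 4]],
            size=200)
        baseline_corr = np.corrcoef(normal, rowvar=False)

        def anomaly_score(window):
            """Deviation of a window's correlation matrix from the baseline."""
            corr = np.corrcoef(window, rowvar=False)
            return np.linalg.norm(corr - baseline_corr, ord="fro")

        print("normal :", anomaly_score(normal[-50:]))
        attack = normal[-50:].copy()
        # Decouple packet counts from byte/flow counts, a DDoS-like distortion
        # of the correlation structure.
        attack[:, 0] = attack[:, 0].mean() + rng.normal(0.0, 50.0, 50)
        print("attack :", anomaly_score(attack))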

  15. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. <1 h/day, and computer/typewriter use ≥4 vs. <4 h/day were inversely or not associated with CTS. In studies that compared computer workers with each other, CTS was associated with computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for either type of study. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage, might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study, are needed.
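
    For readers unfamiliar with how such pooled odds ratios arise, the sketch below implements DerSimonian-Laird random-effects pooling on the log-OR scale. The per-study odds ratios and confidence intervals are invented inputs for illustration, not the studies pooled in this meta-analysis.

        import numpy as np

        # Illustrative per-study (OR, CI lower, CI upper); not the actual studies.
        or_ci = [(1.5, 1.1, 2.1), (1.2, 0.8, 1.8), (1.9, 1.2, 3.0), (1.4, 1.0, 2.0)]

        log_or = np.log([r[0] for r in or_ci])
        # Standard error recovered from the 95% CI width on the log scale.
        se = (np.log([r[2] for r in or_ci]) - np.log([r[1] for r in or_ci])) / (2 * 1.96)
        w = 1.0 / se**2                                   # fixed-effect weights

        # Between-study variance tau^2 (method-of-moments estimate).
        q = np.sum(w * (log_or - np.sum(w * log_or) / w.sum())**2)
        tau2 = max(0.0, (q - (len(w) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

        w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
        pooled = np.sum(w_re * log_or) / w_re.sum()
        se_pooled = np.sqrt(1.0 / w_re.sum())
        print(f"pooled OR = {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
              f"{np.exp(pooled + 1.96 * se_pooled):.2f})")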

  16. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  17. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  18. Computer analysis of digital well logs

    Science.gov (United States)

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  19. PROGTEST: A Computer System for the Analysis of Computational Computer Programs.

    Science.gov (United States)

    1980-04-01


  20. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA]; Riha, David S. [San Antonio, TX]; Thacker, Ben H. [San Antonio, TX]

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
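
    A minimal sketch of the perturbation idea, displacement vectors defined per node of a region of interest and scaled by a random variable to produce geometry realizations, is given below. The mesh, the radial choice of displacement directions, and the normal amplitude model are hypothetical; the patent derives its directions from mean-value coordinate calculations, which are not reproduced here.

        import numpy as np

        # Nominal node coordinates of a small region of interest (illustrative mesh).
        nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])

        # Unit displacement vector per node defining the perturbation direction
        # (here: radial from the region centroid, a hypothetical choice).
        centroid = nodes.mean(axis=0)
        directions = nodes - centroid
        norms = np.linalg.norm(directions, axis=1, keepdims=True)
        directions = np.divide(directions, norms, out=np.zeros_like(directions),
                               where=norms > 0)

        # One probabilistic realization: scale displacements by a random variable
        # representing geometric uncertainty, then perturb the nominal geometry.
        rng = np.random.default_rng(3)
        amplitude = rng.normal(0.0, 0.02)     # e.g. a tolerance standard deviation
        perturbed = nodes + amplitude * directions
        print(perturbed)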

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  2. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  3. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science considers conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  4. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    ... various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks ... classification (as by using transport layer port numbers, Deep Packet Inspection (DPI), statistical classification) and assessed their usefulness in particular areas. We found that the classification techniques based on port numbers are not accurate anymore as most applications use dynamic port numbers, while ... DPI is relatively slow, requires a lot of processing power, and causes a lot of privacy concerns. Statistical classifiers based on Machine Learning Algorithms (MLAs) were shown to be fast and accurate. At the same time, they do not consume a lot of resources and do not cause privacy concerns. However ...
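
    As a hedged sketch of the statistical, MLA-based approach the thesis favors, the code below trains a decision tree on per-flow statistics. The features, flows, and labels are hypothetical stand-ins for ground-truth data such as that collected with a tool like VBS.

        from sklearn.tree import DecisionTreeClassifier

        # Per-flow statistics (hypothetical features: mean packet size in bytes,
        # mean inter-arrival time in ms, flow duration in s) with application labels.
        X = [[1400, 2, 300], [1350, 3, 250], [90, 40, 5],
             [100, 35, 8], [600, 10, 60], [650, 12, 80]]
        y = ["video", "video", "web", "web", "voip", "voip"]

        clf = DecisionTreeClassifier(max_depth=3).fit(X, y)

        # Classify an unseen flow from its statistics alone: no port numbers
        # (unreliable with dynamic ports) and no payload inspection (slow, invasive).
        print(clf.predict([[1380, 2, 280]]))     # expected: video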

  5. Computer teaching process optimization strategy analysis of thinking ability

    Directory of Open Access Journals (Sweden)

    Luo Liang

    2016-01-01

    Full Text Available As is well known, the computer course is one of the basic university courses for college students, laying a theoretical foundation for their subsequent professional learning. At the same time, in recent years countries and universities have attached great importance to computer teaching for young college students, with the purpose of improving students' thinking ability and ultimately promoting their ability to use computational thinking to solve and analyze the problems of daily life. Therefore, this article further discusses and analyzes how to cultivate college students' computational thinking ability in the process of computer teaching, and then explores strategies and methods to promote the cultivation of thinking ability and optimize the computer teaching process.

  6. Waste Feed Delivery System Phase 1 Preliminary RAM Analysis [SEC 1 and 2]

    Energy Technology Data Exchange (ETDEWEB)

    DYKES, A.A.

    2000-10-11

    This report presents the updated results of the preliminary reliability, availability, and maintainability (RAM) analysis of selected waste feed delivery (WFD) operations to be performed by the Tank Farm Contractor (TFC) during Phase 1 activities in support of the Waste Treatment and Immobilization Plant (WTP). For planning purposes, waste feed tanks are being divided into five classes in accordance with the type of waste in each tank and the activities required to retrieve, qualify, and transfer waste feed. This report reflects the baseline design and operating concept, as of the beginning of Fiscal Year 2000, for the delivery of feed from three of these classes, represented by source tanks 241-AN-102, 241-AZ-101 and 241-AN-105. The preliminary RAM analysis quantifies the potential schedule delay associated with operations and maintenance (O&M) field activities needed to accomplish these operations. The RAM analysis is preliminary because the system design, process definition, and activity planning are in a state of evolution. The results are being used to support the continuing development of an O&M concept tailored to the unique requirements of the WFD Program, which is being documented in various volumes of the Waste Feed Delivery Technical Basis (Carlson 1999, Rasmussen 1999, and Orme 2000). The waste feed provided to the WTP must: (1) meet limits for chemical and radioactive constituents based on pre-established compositional envelopes (i.e., feed quality); (2) be in acceptable quantities within a prescribed sequence to meet feed quantity requirements; and (3) meet schedule requirements (i.e., feed timing). In the absence of new criteria related to acceptable schedule performance due to the termination of the TWRS Privatization Contract, the original criteria from the Tank Waste Remediation System (TWRS) Privatization Contract (DOE 1998) will continue to be used for this analysis.

  7. Data Analysis through a Generalized Interactive Computer Animation Method (DATICAM)

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, J.N.; Schweider, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process.

  8. Data analysis through interactive computer animation method (DATICAM)

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process.

  9. COMPUTER DATA ANALYSIS AND MODELING: COMPLEX STOCHASTIC DATA AND SYSTEMS

    OpenAIRE

    2010-01-01

    This collection of papers includes proceedings of the Ninth International Conference “Computer Data Analysis and Modeling: Complex Stochastic Data and Systems” organized by the Belarusian State University and held in September 2010 in Minsk. The papers are devoted to the topical problems: robust and nonparametric data analysis; statistical analysis of time series and forecasting; multivariate data analysis; design of experiments; statistical signal and image processing...

  10. Acoustic analysis of a computer cooling fan

    Science.gov (United States)

    Huang, Lixi; Wang, Jian

    2005-10-01

    Noise radiated by a typical computer cooling fan is investigated experimentally and analyzed within the framework of rotor-stator interaction noise using point source formulation. The fan is 9 cm in rotor casing diameter and its design speed is 3000 rpm. The main noise sources are found and quantified; they are (a) the inlet flow distortion caused by the sharp edges of the incomplete bellmouth due to the square outer framework, (b) the interaction of rotor blades with the downstream struts which hold the motor, and (c) the extra size of one strut carrying electrical wiring. Methods are devised to extract the rotor-strut interaction noise, (b) and (c), radiated by the component forces of drag and thrust at the leading and higher order spinning pressure modes, as well as the leading edge noise generated by (a). By re-installing the original fan rotor in various casings, the noises radiated by the three features of the original fan are separated, and details of the directivity are interpreted. It is found that the inlet flow distortion and the unequal set of four struts make about the same amount of noise. Their corrections show a potential of around 10-dB sound power reduction.

  11. Computational Analysis of Safety Injection Tank Performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Oan; Nietiadia, Yohanes Setiawan; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of); Addad, Yacine; Yoon, Ho Joon [Khalifa University of Science Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-10-15

    The APR 1400 is a large pressurized water reactor (PWR). Just like many other water reactors, it has an emergency core cooling system (ECCS). One of the most important components in the ECCS is the safety injection tank (SIT). Inside the SIT, a fluidic device is installed, which passively controls the mass flow of the safety injection and eliminates the need for low pressure safety injection pumps. As more passive safety mechanisms are being pursued, it has become more important to understand the flow structure and the loss mechanism within the fluidic device. Current computational fluid dynamics (CFD) calculations have had limited success in predicting the fluid flow accurately. This study proposes to find a more exact result using CFD and more realistic modeling. The SIT of the APR1400 was analyzed using MARS and CFD. A CFD calculation was executed first to obtain the form loss factor. Using the two form loss factors, one from the vendor and one from the CFD calculation, MARS calculations were performed and compared with experiment. The accumulator model in MARS was quite accurate in predicting the water level. The pipe model showed some difference from the experimental data in the water level.

  12. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  13. Local spatial frequency analysis for computer vision

    Science.gov (United States)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.

  14. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Full Text Available Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed by running penetration testing, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, we could examine 20, and identified that 5% of the repositories have critical vulnerabilities, 85% high, 25% medium and 100% low. Conclusions. This demonstrates the necessity of adopting actions for these environments that promote information security, minimizing the incidence of external and/or internal attacks on these systems.

  15. Computational Understanding: Analysis of Sentences and Context

    Science.gov (United States)

    1974-05-01

to take English texts, disambiguate the words and semantic relationships involved, and settle questions like anaphoric reference, to the point... rather than what the word in isolation might mean. The theory of text analysis also stresses binding by predictions. To assume that a word is... cluster is basically the bundle of predictions and structures, knowledge that can bind a text into a unit. The cluster has much the same theoretical...

  16. Preliminary analysis of Alvito-Odivelas reservoir system operation under climate change scenarios

    OpenAIRE

    2008-01-01

The present study provides a preliminary analysis of the impact of climate change on a water resources system in the Alentejo region in the south of Portugal. The regional climate model HadRM3P, forced by the global circulation model HadAM3P A2 of the Hadley Centre, is used to derive temperature and precipitation data, which in turn are used as input to the hydrological model SHETRAN for simulation of future streamflow. Dynamic programming based models are used for the operation of the reservoir system in order ...

  17. Stock assessment of Haliporoides triarthrus (Fam. Solenoceridae) off Mozambique: a preliminary analysis

    OpenAIRE

    Torstensen, E.; Pacule, H.

    1992-01-01

    The pink shrimp, Haliporoides triarthrus, is an important species in the deep-water shrimp fishery in Mozambique. Total catches are in the range of 1,500 to 2,700 tons, with the pink shrimp accounting for 70-90%. Estimates of growth parameters and of natural mortality are used for a preliminary assessment of the fishery, based on length-structured virtual population analysis and yield-per-recruit analyses. With an arbitrarily chosen terminal fishing mortality F, the results indicate a situati...

  18. Preliminary Analysis of Liquid Metal MHD Pressure Drop in the Blanket for the FDS

    Institute of Scientific and Technical Information of China (English)

    王红艳; 吴宜灿; 何晓雄

    2002-01-01

Preliminary analysis and calculation of the liquid metal Li17Pb83 magnetohydrodynamic (MHD) pressure drop in the blanket for the FDS have been performed to evaluate the significance of MHD effects on the thermal-hydraulic design of the blanket. To decrease the liquid metal MHD pressure drop, Al2O3 is applied as an electrically insulating coating on the inner surface of the ducts. The requirement for the insulating coating to reduce the additional leakage pressure drop caused by coating imperfections has been analyzed. Finally, the total liquid metal MHD pressure drop and magnetic pump power in the FDS blanket are given.

  19. Preliminary performance analysis of a transverse flow spectrally selective two-slab packed bed volumetric receiver

    CSIR Research Space (South Africa)

    Roos, TH

    2016-05-01

Full Text Available 21st SolarPACES International Conference (SolarPACES 2015), 13-16 October 2015. Preliminary Performance Analysis of a Transverse Flow Spectrally Selective Two-slab Packed Bed Volumetric Receiver, Thomas H. Roos and Thomas M. Harms.

  20. Preliminary Report: Analysis of the baseline study on the prevalence of Salmonella in laying hen flocks of Gallus gallus

    DEFF Research Database (Denmark)

    Hald, Tine

This is a preliminary report on the analysis of the Community-wide baseline study to estimate the prevalence of Salmonella in laying hen flocks. It is being published pending the full analysis of the entire dataset from the baseline study. The report contains the elements necessary for the establ...

  1. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

... kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.
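
    The fusion idea can be illustrated independently of the authors' network. Below is a minimal sketch assuming a single discrete hidden state (the Shape quality), Markov dynamics in time, and conditionally independent feature streams across the body; all state names, matrices and probabilities are invented for illustration:

```python
import numpy as np

# Hypothetical setup: 3 hidden Shape qualities, two observed feature
# streams (e.g., torso and arm descriptors), each quantized into
# discrete symbols. All probabilities below are illustrative.
n_states = 3
T = np.array([[0.8, 0.1, 0.1],       # state transition matrix
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
E_torso = np.array([[0.7, 0.2, 0.1],  # P(torso symbol | state)
                    [0.2, 0.6, 0.2],
                    [0.1, 0.2, 0.7]])
E_arm = np.array([[0.6, 0.3, 0.1],    # P(arm symbol | state)
                  [0.3, 0.5, 0.2],
                  [0.1, 0.3, 0.6]])

def forward_filter(torso_obs, arm_obs):
    """Per-frame posterior over the hidden quality, fusing both streams."""
    belief = np.full(n_states, 1.0 / n_states)
    posteriors = []
    for ot, oa in zip(torso_obs, arm_obs):
        belief = T.T @ belief                      # predict one step
        belief *= E_torso[:, ot] * E_arm[:, oa]    # fuse both observations
        belief /= belief.sum()                     # normalize
        posteriors.append(belief.copy())
    return np.array(posteriors)

print(forward_filter([0, 0, 1, 2], [0, 1, 1, 2]).round(3))
```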

  2. Computing and Visualizing Log-linear Analysis Interactively

    Directory of Open Access Journals (Sweden)

    Pedro M. Valero-Mora

    2002-09-01

Full Text Available The purpose of this paper is to describe a simple program for computing log-linear analysis based on a direct manipulation interface that emphasizes the use of plots for guiding the analysis and evaluating the results obtained. The program described here works as a plugin for ViSta (Young 1997) and receives the name of LoginViSta (for Log-linear analysis in ViSta). ViSta is a statistical package based on Lisp-Stat, a statistical programming environment developed by Luke Tierney (1990) that features an object-oriented approach for statistical computing and one that allows for interactive and dynamic graphs.
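
    The computation behind such a program is standard: a log-linear model for a contingency table is equivalent to a Poisson regression on the cell counts. A minimal Python sketch (not the LoginViSta/Lisp-Stat code; the table values are invented) fits the independence model and reports its deviance:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative 2x2 contingency table, flattened to one count per cell.
df = pd.DataFrame({
    "row": ["a", "a", "b", "b"],
    "col": ["x", "y", "x", "y"],
    "count": [30, 10, 20, 40],
})

# Independence model log(mu) = intercept + row + col; the residual
# deviance measures how badly independence fits the observed counts.
model = smf.glm("count ~ row + col", data=df,
                family=sm.families.Poisson()).fit()
print(model.summary())
print("residual deviance:", round(model.deviance, 3))
```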

  3. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the software resource manager SLURM and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs able to reach a computing power of 300 GigaFlops (300×10⁹ floating point operations per second), with 120 GB of RAM and 7.5 Terabytes (TB) of storage memory in UFS configuration plus 6 TB for the user area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB), which increases every year. The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained by a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storing. The AVES software package consists of about 50 specific programs. The whole computing time, compared to that of a personal computer with a single processor, has thus been improved by up to a factor of 70.
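
    The distribution pattern described, splitting N independent serial jobs across N cores when the underlying package cannot be parallelized internally, is easy to sketch. The following Python fragment is illustrative only; run_osa_job is a hypothetical stand-in for one serial analysis task:

```python
from multiprocessing import Pool

def run_osa_job(chunk):
    """Placeholder for one independent analysis job (e.g., one science
    window processed end-to-end by a serial OSA-like pipeline)."""
    return sum(chunk)  # stand-in for the real per-chunk analysis

def split(data, n_jobs):
    """Divide the workload into n_jobs roughly equal chunks."""
    k, r = divmod(len(data), n_jobs)
    chunks, start = [], 0
    for i in range(n_jobs):
        end = start + k + (1 if i < r else 0)
        chunks.append(data[start:end])
        start = end
    return chunks

if __name__ == "__main__":
    data = list(range(1000))
    with Pool(processes=4) as pool:        # N cores -> N parallel jobs
        results = pool.map(run_osa_job, split(data, 4))
    print(results)
```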

  4. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  5. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  6. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  7. Preliminary phytochemical screening, Antibacterial potential and GC-MS analysis of two medicinal plant extracts.

    Science.gov (United States)

    Vijayaram, Seerangaraj; Kannan, Suruli; Saravanan, Konda Mani; Vasantharaj, Seerangaraj; Sathiyavimal, Selvam; P, Palanisamy Senthilkumar

    2016-05-01

The present study aimed to catalogue the primary metabolites, confirm them using GC-MS analysis, and assess the antibacterial potential of leaf extracts of two important medicinal plants, viz. Eucalyptus and Azadirachta indica. The antibacterial potential of the methanol leaf extract of the studied species was tested against Escherichia coli, Pseudomonas aeruginosa, Klebsiella pneumoniae, Streptococcus pyogenes and Staphylococcus aureus using the agar well diffusion method. The highest zone of inhibition (16 mm) was observed against the bacterium Pseudomonas aeruginosa at 100 μl concentration of the methanol leaf extract. Preliminary phytochemical analysis of the studied species shows the presence of phytochemical compounds such as steroids, phenolic compounds and flavonoids. GC-MS analysis confirms the occurrence of 20 different compounds in the methanol leaf extracts of both studied species.

  8. 1972 preliminary safety analysis report based on a conceptual design of a proposed repository in Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Blomeke, J.O.

    1977-08-01

    This preliminary safety analysis report is based on a proposed Federal Repository at Lyons, Kansas, for receiving, handling, and depositing radioactive solid wastes in bedded salt during the remainder of this century. The safety analysis applies to a hypothetical site in central Kansas identical to the Lyons site, except that it is free of nearby salt solution-mining operations and bore holes that cannot be plugged to Repository specifications. This PSAR contains much information that also appears in the conceptual design report. Much of the geological-hydrological information was gathered in the Lyons area. This report is organized in 16 sections: considerations leading to the proposed Repository, design requirements and criteria, a description of the Lyons site and its environs, land improvements, support facilities, utilities, different impacts of Repository operations, safety analysis, design confirmation program, operational management, requirements for eventually decommissioning the facility, design criteria for protection from severe natural events, and the proposed program of experimental investigations. (DLC)

  9. Preliminary Cluster Analysis For Several Representatives Of Genus Kerivoula (Chiroptera: Vespertilionidae) in Borneo

    Science.gov (United States)

    Hasan, Noor Haliza; Abdullah, M. T.

    2008-01-01

The aim of the study is to use cluster analysis on morphometric parameters within the genus Kerivoula to produce a dendrogram and to determine the suitability of this method for describing the relationships among species within this genus. A total of 15 adult male individuals from the genus Kerivoula, taken from sampling trips around Borneo and from specimens kept at the zoological museum of Universiti Malaysia Sarawak, were examined. A total of 27 characters based on dental, skull and external body measurements were recorded. Clustering analysis illustrated the grouping and morphometric relationships between the species of this genus. It clearly separated each species from the others despite the overlap in measurements of some species within the genus. Cluster analysis provides an alternative approach for making a preliminary identification of a species.
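
    As a sketch of the method (not the study's data or settings), hierarchical clustering of a specimens-by-characters matrix and extraction of the dendrogram leaf order can be done as follows; the measurement matrix, specimen labels, distance metric and linkage choice below are all illustrative:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Illustrative stand-in: rows = individuals, columns = standardized
# morphometric characters (the study used 27 dental/skull/body measures).
rng = np.random.default_rng(0)
measurements = rng.normal(size=(15, 27))
labels = [f"Kerivoula_{i}" for i in range(15)]  # hypothetical specimen IDs

# UPGMA (average linkage) on Euclidean distances is a common choice for
# morphometric cluster analysis; the study's exact settings may differ.
Z = linkage(pdist(measurements, metric="euclidean"), method="average")
tree = dendrogram(Z, labels=labels, no_plot=True)
print(tree["ivl"])  # leaf order of the dendrogram
```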

  10. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

Full Text Available Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided qualitative data analysis (QDA). Based on several standard Word operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to gain first experience with computer-aided analysis before investing time and money in a specialized QDA program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  11. A computer analysis of the Schreber Memoirs.

    Science.gov (United States)

    Klein, R H

    1976-06-01

With the aid of a computerized system for content analysis, WORDS, the complete Schreber Memoirs was subjected to various multivariate reduction techniques in order to investigate the major content themes of this document. The findings included the prevalence of somatic concerns throughout the Memoirs, clear references to persecutory ideas and to Schreber's assumption of a redemptive role, complex encapsulated concerns about Schreber's relationship with God, a lack of any close relationship of sexuality and sexual transformation to themes of either castration or procreation, and the fact that neither sun, God, nor Flechsig was significantly associated with clusters concerning gender, sexuality, or castration. These findings are discussed in relation to psychodynamic interpretations furnished by prior investigators who employed different research methods.

  12. Computational Models for Analysis of Illicit Activities

    DEFF Research Database (Denmark)

    Nizamani, Sarwat

Numerous illicit activities happen in our society, which from time to time affect the population by harming individuals directly or indirectly. Researchers from different disciplines have contributed to developing strategies to analyze such activities in order to help law enforcement agents. ... These models include a model for analyzing the evolution of terrorist networks; a text classification model for detecting suspicious text and identifying suspected authors of anonymous emails; and a semantic analysis model for news reports, which may help analyze illicit activities in a certain area ... with location and temporal information. For the network evolution, the hierarchical agglomerative clustering approach has been applied to terrorist networks as case studies. The networks' evolutions show how individual actors who are initially isolated from each other are converted into small groups, which ...

  13. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twin fingerprints are discriminable, with a 1.5%-1.7% higher EER than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.

  14. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  15. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.
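
    The detection of coherent components by array stacking can be illustrated with a delay-and-sum beamformer, the simplest form of the idea: traces are time-shifted according to a trial slowness and stacked, and a coherent arrival maximizes the stacked power. The sketch below uses synthetic data and assumes a linear array with a plane-wave arrival, a much simpler geometry than the 3-D scanning described above:

```python
import numpy as np

def beam_power(records, positions, slowness, dt):
    """Delay-and-sum beam power for a linear array.

    records:   (n_sensors, n_samples) array of traces
    positions: sensor coordinates along the line (m)
    slowness:  trial horizontal slowness (s/m)
    """
    n = records.shape[0]
    stacked = np.zeros(records.shape[1])
    for trace, x in zip(records, positions):
        shift = int(round(slowness * x / dt))
        # circular shift; acceptable here because the test signal is periodic
        stacked += np.roll(trace, -shift)
    stacked /= n
    return np.mean(stacked**2)

# Synthetic coherent arrival with slowness 2e-4 s/m buried in noise.
dt, pos = 0.01, np.arange(8) * 100.0
t = np.arange(1000) * dt
rng = np.random.default_rng(1)
rec = np.array([np.sin(2 * np.pi * 2.0 * (t - 2e-4 * x))
                + 0.5 * rng.normal(size=t.size) for x in pos])
trials = np.linspace(-5e-4, 5e-4, 41)
best = trials[np.argmax([beam_power(rec, pos, s, dt) for s in trials])]
print("estimated slowness:", best)
```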

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. The purification, crystallization and preliminary X-ray diffraction analysis of dihydrodipicolinate synthase from Clostridium botulinum

    Energy Technology Data Exchange (ETDEWEB)

    Dobson, Renwick C. J., E-mail: rdobson@unimelb.edu.au; Atkinson, Sarah C. [Department of Biochemistry and Molecular Biology, University of Melbourne, Parkville, Victoria 3010 (Australia); Bio21 Molecular Science and Biotechnology Institute, 30 Flemington Road, University of Melbourne, Parkville, Victoria 3010 (Australia); Gorman, Michael A. [St Vincents Institute, 9 Princes Street, Fitzroy, Victoria 3065 (Australia); Newman, Janet M. [CSIRO Division of Molecular and Health Technologies, 343 Royal Parade, Parkville, Victoria 3052 (Australia); Parker, Michael W. [Department of Biochemistry and Molecular Biology, University of Melbourne, Parkville, Victoria 3010 (Australia); Bio21 Molecular Science and Biotechnology Institute, 30 Flemington Road, University of Melbourne, Parkville, Victoria 3010 (Australia); St Vincents Institute, 9 Princes Street, Fitzroy, Victoria 3065 (Australia); Perugini, Matthew A. [Department of Biochemistry and Molecular Biology, University of Melbourne, Parkville, Victoria 3010 (Australia); Bio21 Molecular Science and Biotechnology Institute, 30 Flemington Road, University of Melbourne, Parkville, Victoria 3010 (Australia)

    2008-03-01

Dihydrodipicolinate synthase (DHDPS), an enzyme in the lysine-biosynthetic pathway, is a promising target for antibiotic development against pathogenic bacteria. Here, the expression, purification, crystallization and preliminary diffraction analysis of DHDPS from C. botulinum are reported. In recent years, dihydrodipicolinate synthase (DHDPS; EC 4.2.1.52) has received considerable attention from both mechanistic and structural viewpoints. This enzyme, which is part of the diaminopimelate pathway leading to lysine, couples (S)-aspartate-β-semialdehyde with pyruvate via a Schiff base to a conserved active-site lysine. In this paper, the expression, purification, crystallization and preliminary X-ray diffraction analysis of DHDPS from Clostridium botulinum, an important bacterial pathogen, are presented. The enzyme was crystallized in a number of forms, predominantly using PEG precipitants, with the best crystal diffracting to beyond 1.9 Å resolution and displaying P4₂2₁2 symmetry. The unit-cell parameters were a = b = 92.9, c = 60.4 Å. The crystal volume per protein weight (V_M) was 2.07 Å³ Da⁻¹, with an estimated solvent content of 41%. The structure of the enzyme will help guide the design of novel therapeutics against the C. botulinum pathogen.

  18. A Preliminary Analysis of Reactor Performance Test (LOEP) for a Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonil; Park, Su-Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

The final phase of commissioning is the reactor performance test, which is to prove the integrated performance and safety of the research reactor at full power with fuel loaded, through tests such as neutron power calibration, Control Absorber Rod/Second Shutdown Rod drop time, InC function tests, criticality, rod worth, core heat removal by natural mechanisms, and so forth. The last test is a safety-related one, to confirm that the results of the safety analysis of the research reactor have enough margin to assure nuclear safety, by showing that the reactor satisfies the acceptance criteria for the safety functions such as reactivity control, maintenance of auxiliaries, reactor pool water inventory control, core heat removal, and confinement isolation. Finally, fuel integrity is ensured by verifying that there is no meaningful change in radiation levels. To confirm the performance of safety equipment, loss of normal electric power (LOEP), possibly categorized as an Anticipated Operational Occurrence (AOO), is selected as a key experiment to determine how safe the research reactor is before turning it over to the owner. This paper presents a preliminary analysis of the reactor performance test (LOEP) for a research reactor. The results show how the transient differs between a conservative estimate and a best estimate. The preliminary analyses cover the thermal-hydraulic transient behavior of importance, such as the opening of the flap valve, the minimum critical heat flux ratio, the change of flow direction, and important values of thermal-hydraulic parameters.

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  20. Preliminary design and thermal analysis of device for finish cooling Jaffa biscuits in a.d. 'Jaffa'- Crvenka

    Directory of Open Access Journals (Sweden)

    Salemović Duško R.

    2015-01-01

Full Text Available In this paper, a preliminary design of a device for finish cooling of the chocolate topping of biscuits in A.D. 'Jaffa', Crvenka, is presented. The proposed preliminary design follows the required technological process of finish cooling of the biscuits and the required process parameters, which formed part of the project task. A thermal analysis was carried out, and the percentage error between the air-chocolate topping contact surface obtained from the heat balance and that obtained from the geometry of the proposed preliminary design was not more than 0.67%. The preliminary design is thus fully justified: with the required belt conveyor length, the chocolate topping reaches the required temperature at the end of the cooling process.

  1. Parallel computation of seismic analysis of high arch dam

    Institute of Scientific and Technical Information of China (English)

    Chen Houqun; Ma Huaifa; Tu Jin; Cheng Guangqing; Tang Juzhen

    2008-01-01

Parallel computation programs are developed for three-dimensional meso-mechanics analysis of fully-graded dam concrete and seismic response analysis of high arch dams (ADs), based on the Parallel Finite Element Program Generator (PFEPG). The computational algorithms for the numerical simulation of the meso-structure of concrete specimens were studied. Taking into account damage evolution, static preload, strain rate effect, and the heterogeneity of the meso-structure of dam concrete, the fracture processes of damage evolution and the configuration of the cracks can be directly simulated. In the seismic response analysis of ADs, the following factors are all involved: the nonlinear contact due to the opening and slipping of the contraction joints, energy dispersion of the far-field foundation, dynamic interactions of the dam-foundation-reservoir system, and the combined effects of seismic action with all static loads. The correctness, reliability and efficiency of the two parallel computational programs are verified with practical illustrations.

  2. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  3. Mini-DIAL system measurements coupled with multivariate data analysis to identify TIC and TIM simulants: preliminary absorption database analysis.

    Science.gov (United States)

    Gaudio, P.; Malizia, A.; Gelfusa, M.; Martinelli, E.; Di Natale, C.; Poggi, L. A.; Bellecci, C.

    2017-01-01

Nowadays, Toxic Industrial Components (TICs) and Toxic Industrial Materials (TIMs) are among the most dangerous and diffuse vehicles of contamination in urban and industrial areas. The academic world, together with the industrial and military ones, is working on innovative solutions to monitor the diffusion of such pollutants in the atmosphere. At present, the most common commercial sensors are based on "point detection" technology, but it is clear that such instruments cannot satisfy the needs of smart cities. The new challenge is developing stand-off systems to continuously monitor the atmosphere. The Quantum Electronics and Plasma Physics (QEP) research group has long experience in laser system development and has built two demonstrators based on DIAL (Differential Absorption of Light) technology that are able to identify chemical agents in the atmosphere. In this work the authors present one of those DIAL systems, the miniaturized one, together with the preliminary results of an experimental campaign conducted on TIC and TIM simulants in a cell, with the aim of using the absorption database for further atmospheric analysis with the same DIAL system. The experimental results are analysed with a standard multivariate data analysis technique, Principal Component Analysis (PCA), to develop a classification model aimed at identifying organic chemical compounds in the atmosphere. Preliminary results for the absorption coefficients of some chemical compounds are shown, together with the preliminary PCA analysis.
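
    As an illustration of the classification step (not the authors' data or code), PCA projects each measured absorption spectrum onto a few principal components, in which well-separated absorption signatures form distinct clusters; the synthetic spectra below stand in for a real absorption database:

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative stand-in for a DIAL absorption database: rows are
# noisy absorption spectra of two hypothetical compound classes.
rng = np.random.default_rng(0)
wavelengths = np.linspace(9.2, 10.8, 100)          # um, illustrative band
class_a = np.exp(-((wavelengths - 9.6) / 0.2) ** 2)
class_b = np.exp(-((wavelengths - 10.3) / 0.3) ** 2)
spectra = np.vstack([class_a + 0.05 * rng.normal(size=(20, 100)),
                     class_b + 0.05 * rng.normal(size=(20, 100))])

# Two principal components are often enough to separate well-spaced
# absorption signatures; a classifier can then operate in score space.
scores = PCA(n_components=2).fit_transform(spectra)
print("class A mean score:", scores[:20].mean(axis=0).round(3))
print("class B mean score:", scores[20:].mean(axis=0).round(3))
```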

  4. A preliminary uncertainty analysis of phenomenological inputs in TEXAS-V code

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. H.; Kim, H. D.; Ahn, K. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

Uncertainty analysis is an important step in the safety analysis of nuclear power plants. Best-estimate computer codes are increasingly used in place of conservative codes. These efforts aim at a more precise evaluation of safety margins and at determining how the code predictions change as one or more input parameters vary within their ranges of interest. From this point of view, a severe accident uncertainty analysis system, SAUNA, has been improved for TEXAS-V FCI uncertainty analysis. The main objective of this paper is to present the TEXAS FCI uncertainty analysis results implemented through the SAUNA code.

  5. Preliminary Nuclear Analysis for the HANARO Fuel Element with Burnable Absorber

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chul Gyo; Kim, So Young; In, Won Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

Burnable absorber is used for reducing reactivity swing and power peaking in high-performance research reactors. Development of a HANARO fuel element with burnable absorber was started in the U-Mo fuel development program at HANARO, but a detailed full-core analysis was not performed because the current HANARO fuel management system cannot reliably analyze the HANARO core with burnable absorber. A sophisticated reactor physics system is required to analyze the core. The McCARD code was selected, and detailed McCARD core models, based on the basic HANARO core model developed by one of the McCARD developers, are used in this study. The development of nuclear fuel requires a long time and a correct development direction, especially as guided by nuclear analysis. This paper presents a preliminary nuclear analysis to promote the fuel development. Based on the developed fuel, further nuclear analysis will improve reactor performance and safety. Basic nuclear analyses for the HANARO and the AHR were performed to find proper fuel elements with burnable absorber. Addition of 0.3-0.4% Cd to the fuel meat is promising for the current HANARO fuel element. A small addition of burnable absorber may not change any fuel characteristics of the HANARO fuel element, but various basic tests and irradiation tests at the HANARO core are required.

  6. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  8. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  9. Rigorous computer analysis of the Chow-Robbins game

    CERN Document Server

    Häggström, Olle

    2012-01-01

    Flip a coin repeatedly, and stop whenever you want. Your payoff is the proportion of heads, and you wish to maximize this payoff in expectation. This so-called Chow-Robbins game is amenable to computer analysis, but while simple-minded number crunching can show that it is best to continue in a given position, establishing rigorously that stopping is optimal seems at first sight to require "backward induction from infinity". We establish a simple upper bound on the expected payoff in a given position, allowing efficient and rigorous computer analysis of positions early in the game. In particular we confirm that with 5 heads and 3 tails, stopping is optimal.
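
    The "simple-minded number crunching" direction is easy to reproduce: backward induction on a truncated game gives a rigorous lower bound on the value of continuing, which is enough to prove that continuation is optimal in a given position (though not that stopping is, which is where the paper's upper bound is needed). A minimal sketch, with an arbitrary truncation horizon:

```python
from functools import lru_cache

N_MAX = 200  # truncation horizon: force stopping once h + t = N_MAX

@lru_cache(maxsize=None)
def value_lb(h, t):
    """Lower bound on the optimal expected payoff with h heads, t tails.

    Backward induction on a truncated game (stop at N_MAX tosses).
    Because stopping early is always allowed, truncation can only
    lower the value, so this is a rigorous lower bound.
    """
    stop = h / (h + t) if h + t > 0 else 0.0
    if h + t >= N_MAX:
        return stop
    cont = 0.5 * value_lb(h + 1, t) + 0.5 * value_lb(h, t + 1)
    return max(stop, cont)

# With 2 heads and 3 tails, continuing is provably better than the
# stopping payoff 0.4, since even the truncated continuation value wins.
h, t = 2, 3
cont_lb = 0.5 * value_lb(h + 1, t) + 0.5 * value_lb(h, t + 1)
print("stop payoff:", h / (h + t), " continuation lower bound:", round(cont_lb, 4))
```

    With 5 heads and 3 tails the same lower bound cannot beat the stopping payoff 0.625, consistent with stopping being optimal there; proving that optimality rigorously is exactly what requires an upper bound of the kind the paper establishes.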

  10. First Experiences with LHC Grid Computing and Distributed Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, Ian

    2010-12-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  11. Computational Analysis of the SRS Phase III Salt Disposition Alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Dimenna, R.A.

    1999-10-07

Completion of the Phase III evaluation and comparison of salt disposition alternatives was supported with enhanced computer models and analysis for each case on the "short list" of four options. SPEEDUP(TM) models and special-purpose models describing mass and energy balances and flow rates were developed and used to predict performance and production characteristics for each of the options. Results from the computational analysis were a key part of the input used to select a primary and an alternate salt disposition alternative.

  12. Treatment by gliding arc of epoxy resin: preliminary analysis of surface modifications

    Science.gov (United States)

    Faubert, F.; Wartel, M.; Pellerin, N.; Pellerin, S.; Cochet, V.; Regnier, E.; Hnatiuc, B.

    2016-12-01

Treatments with atmospheric-pressure non-thermal plasma are easy to implement and inexpensive. Among them, the gliding arc (GlidArc) remains rarely used in the surface treatment of polymers. However, it offers an economical and flexible way to treat large areas quickly. In addition, the choice of carrier gas makes it possible to bring in active species and other radicals, allowing different types of grafting and functionalization of the treated surfaces, for example for anti-biofouling prevention. This preliminary work includes analysis of the surface of epoxy resins by infrared spectroscopy: the different affected chemical bonds were studied as a function of the duration of treatment. The degree of oxidation (the C/O ratio) was obtained by X-ray microanalysis, and contact angle analyses were performed to determine the wettability of the treated surface. A spectroscopic study of the plasma allows determination of the possible active species in the different zones of the discharge.

  13. Los Alamos National Laboratory corrugated metal pipe saw facility preliminary safety analysis report. Volume I

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-09-19

This Preliminary Safety Analysis Report addresses site assessment, facility design and construction, and design operation of the processing systems in the Corrugated Metal Pipe Saw Facility with respect to normal and abnormal conditions. Potential hazards are identified, credible accidents relative to the operation of the facility and the process systems are analyzed, and the consequences of postulated accidents are presented. The risks associated with normal operations, abnormal operations, and natural phenomena are analyzed. The accident analysis presented shows that the impact of the facility will be acceptable for all foreseeable normal and abnormal conditions of operation. Specifically, under normal conditions the facility will have impacts within the limits set by applicable DOE guidelines, and in accident conditions the facility will similarly meet or exceed the requirements of all applicable standards. 16 figs., 6 tabs.

  14. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

The Korea Atomic Energy Research Institute (KAERI) developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplified the protection system by directly connecting the four RSPT signals to each core protection channel and eliminated the control element assembly calculator (CEAC) hardware. SCOMS adopted the DPCM3D method for synthesizing the core power distribution instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code and measured in-core detector signals. An overall uncertainty analysis methodology, which statistically combines the uncertainty components of the SMART core protection and monitoring system, was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core were evaluated by applying the newly developed uncertainty analysis method.
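
    "Statistically combining" independent uncertainty components commonly means a root-sum-square combination; the exact SMART procedure is not given here, so the following sketch only illustrates that generic combination, with invented component names and values:

```python
import numpy as np

# Hypothetical uncertainty components (in % of the monitored parameter),
# assumed independent and normally distributed; values illustrative only.
components = {
    "detector signal": 1.5,
    "power synthesis (DPCM3D)": 3.0,
    "calorimetric calibration": 2.0,
    "code model": 2.5,
}

# Root-sum-square combination of independent random components; a
# systematic bias term would instead be added linearly.
rss = float(np.sqrt(sum(v**2 for v in components.values())))
print(f"combined overall uncertainty factor: {rss:.2f} %")
```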

  15. Preliminary cladistic analysis of genera of the cestode order Trypanorhyncha Diesing, 1863.

    Science.gov (United States)

    Beveridge, I; Campbell, R A; Palm, H W

    1999-01-01

    A preliminary cladistic analysis was carried out on the 49 currently recognised genera of the order Trypanorhyncha. Forty-four characters were analysed; a functional outgroup was used for scolex and strobilar characters, while Nybelinia was utilised to polarise characters related to the rhyncheal system. Eight well-resolved clades were evident in the resultant cladogram, which is compared with existing phenetic classifications. An analysis of families resulted in a similar clustering of taxa to that observed in the case of the genera. The results suggest that two key characters used in existing classifications, namely the presence of sensory fossettes on the bothridia and the development of atypical heteroacanth and poeciloacanth armatures from typical heteroacanth armatures, have occurred on several occasions. Some clades provide support for the arrangements used in current classifications. Suggestions are made for future avenues of research which might provide more robust phylogenetic data for the Trypanorhyncha.

  16. Dimensions of Human-Work Domain Interaction: A Preliminary Analysis for the Design of a Corporate Digital Library.

    Science.gov (United States)

    Xie, Hong

    2003-01-01

    Applies the cognitive system engineering approach to investigate human-work interaction at a corporate setting. Reports preliminary analysis of data collected from diary analysis and interview of 20 subjects. Results identify three dimensions for each of four interactive activities involved in human-work interaction and their relationships.…

  17. Advances in computational design and analysis of airbreathing propulsion systems

    Science.gov (United States)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  18. History in the Computer Science Curriculum

    OpenAIRE

    1995-01-01

IFIP Working Group 9.7 (History of Computing) is charged with not only encouraging the preservation of computer artifacts, the recording of the memoirs of pioneers, and the analysis of the downstream impact of computer innovations, but also with the development of educational modules on the history of computing. This paper presents an initial report on the study of the history of computing and informatics and preliminary proposals for the inclusion of aspects of the history of computing and i...

  19. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

Full Text Available Computational intelligence approaches are a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
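
    The two indices are defined formally in the paper; as a rough illustration only, one can take the variation rate to be the fraction of solutions that changed in an iteration and the progress rate to be the relative improvement of the best objective value. The sketch below uses those assumed definitions on a toy population update:

```python
import numpy as np

def variation_rate(prev_pop, new_pop, tol=1e-12):
    """Assumed definition: fraction of solutions that changed between
    consecutive iterations (a proxy for population variety)."""
    changed = np.any(np.abs(prev_pop - new_pop) > tol, axis=1)
    return changed.mean()

def progress_rate(prev_best, new_best):
    """Assumed definition: relative improvement of the best objective
    value (a proxy for optimality of the solution set)."""
    if prev_best == 0:
        return 0.0
    return (prev_best - new_best) / abs(prev_best)

# Toy minimization step: jitter a population and keep improvements.
rng = np.random.default_rng(0)
f = lambda x: np.sum(x**2, axis=1)
pop = rng.normal(size=(30, 5))
trial = pop + 0.1 * rng.normal(size=pop.shape)
keep = f(trial) < f(pop)
new_pop = np.where(keep[:, None], trial, pop)
print("variation rate:", variation_rate(pop, new_pop))
print("progress rate:", progress_rate(f(pop).min(), f(new_pop).min()))
```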

  20. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  1. Preliminary failure modes and effects analysis on Korean HCCR TBS to be tested in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Mu-Young, E-mail: myahn74@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Yi-Hyun; Lee, Youngmin [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

Highlights: • Postulated initiating events are identified through failure modes and effects analysis on the current HCCR TBS design. • A set of postulated initiating events is selected for consideration in deterministic analysis. • Accident evolutions for the selected postulated initiating events are qualitatively described for deterministic analysis. - Abstract: The Korean Helium Cooled Ceramic Reflector (HCCR) Test Blanket System (TBS), which comprises the Test Blanket Module (TBM) and ancillary systems in various locations of the ITER building, is operated at high temperature and pressure with decay heat. Therefore, safety is the utmost concern in the design process, and it is required to demonstrate that the HCCR TBS is designed to comply with the safety requirements and guidelines of ITER. Due to the complexity of the system, with many interfaces with ITER, a systematic approach is necessary for safety analysis. This paper presents a preliminary failure modes and effects analysis (FMEA) study performed for the HCCR TBS. FMEA is a systematic methodology in which failure modes for components in the system and their consequences are studied from the bottom up. Over eighty failure modes have been investigated for the HCCR TBS. The failure modes that have similar consequences are grouped as postulated initiating events (PIEs), and a total of seven reference accident scenarios are derived from the FMEA study for deterministic accident analysis. Failure modes not covered here, due to the evolving design of the HCCR TBS and uncertainty in maintenance procedures, will be studied further in the near future.

  2. Expression, purification, crystallization and preliminary crystallographic analysis of the proliferation-associated protein Ebp1

    Energy Technology Data Exchange (ETDEWEB)

    Kowalinski, Eva; Bange, Gert; Wild, Klemens; Sinning, Irmgard, E-mail: irmi.sinning@bzh.uni-heidelberg.de [Heidelberg University Biochemistry Center, INF 328, D-69120 Heidelberg (Germany)

    2007-09-01

Preliminary X-ray analysis of the proliferation-associated protein Ebp1 from Homo sapiens is provided. ErbB-3-binding protein 1 (Ebp1) is a member of the family of proliferation-associated 2G4 proteins (PA2G4s) and plays a role in cellular growth and differentiation. Ligand-induced activation of the transmembrane receptor ErbB3 leads to dissociation of Ebp1 from the receptor in a phosphorylation-dependent manner. The non-associated protein is involved in transcriptional and translational regulation in the cell. Here, the overexpression, purification, crystallization and preliminary crystallographic studies of Ebp1 from Homo sapiens are reported. Initially observed crystals were improved by serial seeding to single crystals suitable for data collection. The optimized crystals belong to the tetragonal space group P4₁2₁2 or P4₃2₁2 and diffracted to a resolution of 1.6 Å.

  3. COMPUTING

    CERN Multimedia

M. Kasemann and P. McBride, edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  4. Computer analysis of shells of revolution using asymptotic results

    Science.gov (United States)

    Steele, C. R.; Ranjan, G. V.; Goto, C.; Pulliam, T. H.

    1979-01-01

It is suggested that asymptotic results for the behavior of thin shells can be incorporated in a general computer code for the analysis of a complex shell structure. The advantage when compared to existing finite difference or finite element codes is a substantial reduction in computational labor, with the capability of working to a specified level of accuracy. A reduction in user preparation time and dependence on user judgment is also gained, since mesh spacing can be internally generated. The general theory is described in this paper, as well as the implementation in the computer code FAST 1 (Functional Algorithm for Shell Theory) for the analysis of a general axisymmetric shell structure with axisymmetric loading.

  5. Finite element dynamic analysis on CDC STAR-100 computer

    Science.gov (United States)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
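
    The central difference scheme referred to above is the classic explicit update u_{n+1} = 2 u_n - u_{n-1} + dt^2 M^{-1} (f_n - K u_n); with a lumped (diagonal) mass matrix, each step is pure vector arithmetic, which is exactly the kind of regular work a pipelined machine processes efficiently. A minimal sketch on a two-degree-of-freedom toy problem, not the shell model of the paper:

```python
import numpy as np

def central_difference(M_diag, K, f, u0, v0, dt, steps):
    """Explicit central-difference integration of M u'' + K u = f(t).

    Assumes a lumped (diagonal) mass matrix, so each step reduces to
    elementwise vector operations.
    """
    a0 = (f(0.0) - K @ u0) / M_diag
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * a0   # fictitious step u_{-1}
    u = u0.copy()
    for n in range(steps):
        acc = (f(n * dt) - K @ u) / M_diag
        u_next = 2.0 * u - u_prev + dt**2 * acc
        u_prev, u = u, u_next
    return u

# Two-mass spring chain released from a stretched configuration.
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
u_final = central_difference(M_diag=np.ones(2), K=K,
                             f=lambda t: np.zeros(2),
                             u0=np.array([1.0, 0.0]), v0=np.zeros(2),
                             dt=0.01, steps=1000)
print("displacements after 10 s:", u_final.round(4))
```

    The explicit scheme is conditionally stable (dt must stay below 2 divided by the highest natural frequency), which is the usual trade-off against the unconditionally stable implicit Newmark scheme mentioned above.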

  6. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  7. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  9. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  10. Generalized Harmonic Analysis of Computed and Measured Magnetic Fields

    CERN Document Server

    Auchmann, B; Petrone, C; Russenschuck, S

    2016-01-01

    In this paper, we present a generalized approach for the harmonic analysis of the magnetic field in accelerator magnets. This analysis is based on the covariant components of the computed or measured magnetic flux density. The multipole coefficients obtained in this way can be used for magnet optimization and field reconstruction in the interior of circular and elliptical boundaries in the bore of straight magnets.
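
    As a hedged sketch of how multipole coefficients are commonly extracted from field samples on a reference circle (the classical formulation; the paper itself generalizes this to covariant components, and the radius and field below are invented), the coefficients follow from a discrete Fourier transform:

    import numpy as np

    K = 64                                # samples on the measurement circle
    theta = 2 * np.pi * np.arange(K) / K
    Rref = 0.017                          # reference radius in m (assumed value)

    # Synthetic field on |z| = Rref: a quadrupole plus a small sextupole error.
    z = Rref * np.exp(1j * theta)
    B = 10.0 * (z / Rref) ** 1 + 0.01 * (z / Rref) ** 2   # B_y + i B_x

    # B(theta) = sum_n C_n exp(i (n-1) theta)  =>  C_n = FFT(B)[n - 1] / K
    C = np.fft.fft(B) / K
    print(abs(C[1]), abs(C[2]))           # ~10.0 (quadrupole), ~0.01 (sextupole)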

  11. Computational fluid dynamics analysis of a mixed flow pump impeller

    African Journals Online (AJOL)

    ATHARVA

    From the results of the CFD analysis, the velocity and pressure at the outlet of the impeller are predicted. ... The numerical simulation can provide quite accurate information on the fluid ... of the computational domain the mass flow rate, the turbulence intensity, and a reference pressure are specified. ... Averaged velocity distribution.

  12. Computing support for advanced medical data analysis and imaging

    CERN Document Server

    Wiślicki, W; Białas, P; Czerwiński, E; Kapłon, Ł; Kochanowski, A; Korcyl, G; Kowal, J; Kowalski, P; Kozik, T; Krzemień, W; Molenda, M; Moskal, P; Niedźwiecki, S; Pałka, M; Pawlik, M; Raczyński, L; Rudy, Z; Salabura, P; Sharma, N G; Silarski, M; Słomski, A; Smyrski, J; Strzelecki, A; Wieczorek, A; Zieliński, M; Zoń, N

    2014-01-01

    We discuss computing issues for data analysis and image reconstruction in a PET-TOF medical scanner and other medical scanning devices producing large volumes of data. A service architecture based on the grid and cloud concepts for distributed processing is proposed and critically discussed.

  13. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. Forest Fire History... A Computer Method of Data Analysis

    Science.gov (United States)

    Romain M. Meese

    1973-01-01

    A series of computer programs is available to extract information from the individual Fire Reports (U.S. Forest Service Form 5100-29). The programs use a statistical technique to fit a continuous distribution to a set of sampled data. The goodness-of-fit program is applicable to data other than the fire history. Data summaries illustrate analysis of fire occurrence,...

  15. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  16. Peer-to-peer computing in health-promoting voluntary organizations: a system design analysis.

    Science.gov (United States)

    Irestig, Magnus; Hallberg, Niklas; Eriksson, Henrik; Timpka, Toomas

    2005-10-01

    A large part of the health promotion in today's society is performed as peer-to-peer empowerment in voluntary organisations such as sports clubs, charities, and trade unions. In order to prevent work-related illness and long-term sickness absence, the aim of this study is to explore computer network services for empowerment of employees by peer-to-peer communication. The 'technique trade-off' method was used for the analysis of the system design. A Critical Incident Technique questionnaire was distributed to a representative sample of trade union shop stewards (n = 386), and focus-group seminars were arranged where a preliminary set of requirements was discussed. Seven basic requirements were identified and matched to a set of 12 design issues for computer network services, allocating a subset of design issues to each requirement. The conclusion is that the systems design displays an inexpensive and potentially feasible method for peer-to-peer computing in voluntary health-promoting organisations.

  17. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  18. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
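
    For readers wanting to reproduce this kind of comparison, one plausible way to report a percentage difference with a 95% CI is a bootstrap over repeated runs (a generic sketch with invented timings; the paper does not state its exact CI method, so both the data and the definition of percentage difference here are assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical wall-clock times (hours) for repeated runs; illustrative only.
    emr = np.array([5.1, 5.4, 4.9, 5.6, 5.2])
    gce = np.array([3.3, 3.6, 3.1, 3.5, 3.4])

    def pct_diff(a, b):
        # percentage by which b is faster than a (one common convention)
        return 100.0 * (a.mean() - b.mean()) / a.mean()

    boot = [pct_diff(rng.choice(emr, emr.size), rng.choice(gce, gce.size))
            for _ in range(10_000)]
    ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
    print(f"difference: {pct_diff(emr, gce):.1f}% (95% CI: {ci_lo:.1f}-{ci_hi:.1f})")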

  19. Preliminary Analysis on the Relative Solution Space Sizes for MTSP with Genetic Algorithm

    Science.gov (United States)

    Hao, Junling

    It is well known that chromosome design is pivotal when solving the multiple traveling salesman problem (MTSP) with a genetic algorithm. A well-designed chromosome coding can eliminate or reduce redundant solutions. This paper first introduces the one-chromosome and two-chromosome design methods and a recently proposed two-part chromosome design. A preliminary quantitative comparison of the solution spaces of the three chromosome designs is then presented for the case where the number of cities is linear in the number of travelers. The concept of relative solution space is proposed in order to compare the relative sizes of the solution spaces; a sketch of the comparison is given after this paragraph. The solution space of the two-part chromosome design is much smaller than those of the traditional chromosome designs. The result given in this paper provides a good guideline for algorithmic design and engineering applications.
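
    A minimal sketch of that comparison, assuming the standard solution-space counts for an n-city, m-salesman MTSP (one-chromosome (n+m-1)!, two-chromosome n!*m^n, two-part n!*C(n-1, m-1), following Carter and Ragsdale-style analysis rather than this paper's exact derivation):

    from math import comb, factorial

    def one_chromosome(n, m):
        return factorial(n + m - 1)

    def two_chromosome(n, m):
        return factorial(n) * m ** n

    def two_part(n, m):
        return factorial(n) * comb(n - 1, m - 1)

    n, m = 10, 3
    # relative solution-space sizes, two-part design as the baseline
    print(one_chromosome(n, m) / two_part(n, m))
    print(two_chromosome(n, m) / two_part(n, m))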

  20. Crystallization and preliminary X-ray analysis of alginate importer from Sphingomonas sp. A1.

    Science.gov (United States)

    Maruyama, Yukie; Itoh, Takafumi; Nishitani, Yu; Mikami, Bunzo; Hashimoto, Wataru; Murata, Kousaku

    2012-03-01

    Sphingomonas sp. A1 directly incorporates alginate polysaccharides through a 'superchannel' comprising a pit on the cell surface, alginate-binding proteins in the periplasm and an ABC transporter (alginate importer) in the inner membrane. Alginate importer, consisting of four subunits, AlgM1, AlgM2 and two molecules of AlgS, was crystallized in the presence of the binding protein AlgQ2. Preliminary X-ray analysis showed that the crystal diffracted to 3.3 Å resolution and belonged to space group P2(1)2(1)2(1), with unit-cell parameters a = 72.5, b = 136.8, c = 273.3 Å, suggesting the presence of one complex in the asymmetric unit.

  1. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Hart, Reid; Athalye, Rahul A.; Rosenberg, Michael I.; Richman, Eric E.; Winiarski, David W.

    2014-03-01

    Section 304(b) of the Energy Conservation and Production Act (ECPA), as amended, requires the Secretary of Energy to make a determination each time a revised version of ASHRAE Standard 90.1 is published with respect to whether the revised standard would improve energy efficiency in commercial buildings. When the U.S. Department of Energy (DOE) issues an affirmative determination on Standard 90.1, states are statutorily required to certify within two years that they have reviewed and updated the commercial provisions of their building energy code, with respect to energy efficiency, to meet or exceed the revised standard. This report provides a preliminary qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition).

  2. Crystallization and preliminary X-ray analysis of Rv1674c from Mycobacterium tuberculosis

    Science.gov (United States)

    Li, Jincheng; Wang, Xudong; Gong, Weimin; Niu, Chunyan; Zhang, Min

    2015-01-01

    Adaptations to hypoxia play an important role in Mycobacterium tuberculosis pathogenesis. Rv0324, which contains an HTH DNA-binding domain and a rhodanese domain, is one of the key transcription regulators in response to hypoxia. M. tuberculosis Rv1674c is a homologue of Rv0324. To understand the interdomain interaction and regulation of the HTH domain and the rhodanese domain, recombinant Rv1674c protein was purified and crystallized by the vapour-diffusion method. The crystals diffracted to 2.25 Å resolution. Preliminary diffraction analysis suggests that the crystals belonged to space group P3121 or P3221, with unit-cell parameters a = b = 67.8, c = 174.5 Å, α = β = 90°, γ = 120°. The Matthews coefficient was calculated to be 2.44 Å³ Da⁻¹, assuming that the crystallographic asymmetric unit contains two protein molecules. PMID:25760714
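
    As a hedged numerical check of the reported Matthews coefficient (V_M = V_cell / (Z * M_W); the monomer mass below is an assumed value back-calculated for illustration, not taken from the paper):

    import math

    a, c = 67.8, 174.5                                # unit-cell lengths, Å
    v_cell = a * a * c * math.sin(math.radians(120))  # trigonal cell volume, Å^3
    z = 6 * 2                        # 6 asymmetric units/cell x 2 molecules/ASU
    mw = 23_700                      # assumed monomer mass, Da (illustrative)
    v_m = v_cell / (z * mw)
    solvent = 1.0 - 1.23 / v_m       # Matthews' solvent-content estimate
    print(f"V_M = {v_m:.2f} Å^3/Da, solvent ~ {solvent:.0%}")   # ~2.44, ~50%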

  3. Preliminary phytochemical analysis, antibacterial, antifungal and anticandidal activities of successive extracts of Crossandra infundibuliformis

    Institute of Scientific and Technical Information of China (English)

    Madhumitha G; Saral A M

    2011-01-01

    Objective: To investigate the phytochemical, antibacterial, antifungal and anticandidal activity of successive extracts of Crossandra infundibuliformis (Acanthaceae) leaves. Methods: Preliminary screening for the presence of alkaloids, saponins, phytosterols, phenolic compounds, flavanoids, tannins, carbohydrates, terpenoids, oils and fats was carried out by phytochemical analysis. The antibacterial, antifungal and anticandidal activities were assessed by the agar well diffusion technique. Results: The successive extracts contain an array of chemical constituents, and the MIC values for antibacterial activity range from 0.0078 to 0.0150 μg/mL. For the antifungal and anticandidal activities the MIC values were between 0.125 and 0.250 μg/mL. Conclusions: These findings demonstrate that the leaf extracts of C. infundibuliformis present excellent antimicrobial activities and thus have great potential as a source for natural health care products.

  4. Preliminary C3 Loading Analysis for Future High-Altitude Unmanned Aircraft in the NAS

    Science.gov (United States)

    Ho, Yan-Shek; Gheorghisor, Izabela; Box, Frank

    2006-01-01

    This document provides a preliminary assessment and summary of the command, control, and communications (C³) loading requirements of a generic future high-altitude, long-endurance unmanned aircraft (UA) operating in the National Airspace System. Two principal types of C³ traffic are considered in our analysis: communications links providing air traffic services (ATS) to the UA and its human pilot, and the command and control data links enabling the pilot to operate the UA remotely. We have quantified the loading requirements of both types of traffic for two different assumed levels of UA autonomy. Our results indicate that the potential use of UA-borne relays for the ATS links, and the degree of autonomy exercised by the UA during the departure and arrival phases of its flight, will be among the key drivers of C³ loading and bandwidth requirements.

  5. Visual Assessment on Coastal Cruise Tourism: A Preliminary Planning Using Importance Performance Analysis

    Science.gov (United States)

    Trisutomo, S.

    2017-07-01

    Importance-Performance Analysis (IPA) has been widely applied in many cases. In this research, IPA was applied to measure perceptions of coastal tourism objects and their potential for development as coastal cruise tourism in Makassar. Three objects, i.e. the Akkarena recreational site, the Losari public space at the waterfront, and the Paotere traditional Phinisi ship port, were selected and assessed visually from the water area by a group of purposively selected resource persons. The importance and performance of 10 attributes of each site were scored on a Likert scale from 1 to 5. Data were processed with SPSS-21, yielding a Cartesian graph in which the scores fall into four quadrants: Quadrant I, concentrate here; Quadrant II, keep up the good work; Quadrant III, low priority; and Quadrant IV, possible overkill. The attributes in each quadrant can be considered the platform for preliminary planning of coastal cruise tours in Makassar.
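
    A minimal sketch of the quadrant assignment at the heart of IPA (illustrative attribute scores, not the study's data; using the grand means as quadrant cut-offs is one common convention):

    # Each attribute maps to (importance, performance) on a 1-5 Likert scale.
    attributes = {
        "accessibility": (4.6, 3.1),
        "scenery":       (4.8, 4.4),
        "signage":       (2.9, 2.5),
        "souvenirs":     (2.7, 4.2),
    }
    imp_mean = sum(i for i, _ in attributes.values()) / len(attributes)
    perf_mean = sum(p for _, p in attributes.values()) / len(attributes)

    def quadrant(importance, performance):
        if importance >= imp_mean and performance < perf_mean:
            return "I: concentrate here"
        if importance >= imp_mean and performance >= perf_mean:
            return "II: keep up the good work"
        if importance < imp_mean and performance < perf_mean:
            return "III: low priority"
        return "IV: possible overkill"

    for name, (i, p) in attributes.items():
        print(f"{name}: {quadrant(i, p)}")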

  6. Preliminary control system design and analysis for the Space Station Furnace Facility thermal control system

    Science.gov (United States)

    Jackson, M. E.

    1995-01-01

    This report presents the Space Station Furnace Facility (SSFF) thermal control system (TCS) preliminary control system design and analysis. The SSFF provides the necessary core systems to operate various materials processing furnaces. The TCS is defined as one of the core systems, and its function is to collect excess heat from furnaces and to provide precise cold temperature control of components and of certain furnace zones. Physical interconnection of parallel thermal control subsystems through a common pump implies the description of the TCS by coupled nonlinear differential equations in pressure and flow. This report formulates the system equations and develops the controllers that cause the interconnected subsystems to satisfy flow rate tracking requirements. Extensive digital simulation results are presented to show the flow rate tracking performance.

  7. Acanthamoeba polyphaga mimivirus NDK: preliminary crystallographic analysis of the first viral nucleoside diphosphate kinase.

    Science.gov (United States)

    Jeudy, Sandra; Coutard, Bruno; Lebrun, Régine; Abergel, Chantal

    2005-06-01

    The complete sequence of the largest known double-stranded DNA virus, Acanthamoeba polyphaga mimivirus, has recently been determined [Raoult et al. (2004), Science, 306, 1344-1350] and revealed numerous genes not expected to be found in a virus. A comprehensive structural and functional study of these gene products was initiated [Abergel et al. (2005), Acta Cryst. F61, 212-215] both to better understand their role in the virus physiology and to obtain some clues to the origin of DNA viruses. Here, the preliminary crystallographic analysis of the viral nucleoside diphosphate kinase protein is reported. The crystal belongs to the cubic space group P2(1)3, with unit-cell parameter 99.425 Å. The self-rotation function confirms that there are two monomers per asymmetric unit related by a twofold non-crystallographic axis and that the unit cell thus contains four biological entities.

  8. Preliminary analysis of habitat utilization by woodland caribou in northwestern Ontario using satellite telemetry

    Directory of Open Access Journals (Sweden)

    T.L. Hillis

    1998-03-01

    Locational data collected over a one-year period from 10 female woodland caribou (Rangifer tarandus caribou) collared with Argos satellite collars in northwestern Ontario, Canada, were superimposed on supervised Landsat images using Geographical Information System (GIS) technology. Landscape parameters, land cover classifications, and drainage were utilized to create the basemap. Using ARCVIEW software, all digital fixes from collared caribou with information on date, time, and activity status were overlain on the basemap to facilitate a preliminary analysis of habitat use in this species. Results supported the conclusions (1) that woodland caribou in northwestern Ontario select habitats containing high to moderate conifer cover and avoid disturbed areas and shrub-rich habitats, (2) that seasonal changes in habitat utilization occur in females of this species, and (3) that satellite telemetry technology can be employed in the boreal forest ecosystem to assess habitat utilization by large ungulate species.

  9. Crystallization and preliminary X-ray analysis of Leishmania major glyoxalase I

    Energy Technology Data Exchange (ETDEWEB)

    Ariza, Antonio; Vickers, Tim J.; Greig, Neil; Fairlamb, Alan H.; Bond, Charles S., E-mail: c.s.bond@dundee.ac.uk [Division of Biological Chemistry and Molecular Microbiology, Wellcome Trust Biocentre, School of Life Sciences, University of Dundee, Dundee DD1 5EH, Scotland (United Kingdom)

    2005-08-01

    The detoxification enzyme glyoxalase I from L. major has been crystallized. Preliminary molecular-replacement calculations indicate the presence of three glyoxalase I dimers in the asymmetric unit. Glyoxalase I (GLO1) is a putative drug target for trypanosomatids, which are pathogenic protozoa that include the causative agents of leishmaniasis. Significant sequence and functional differences between Leishmania major and human GLO1 suggest that it may make a suitable template for rational inhibitor design. L. major GLO1 was crystallized in two forms: the first is extremely disordered and does not diffract, while the second, an orthorhombic form, produces diffraction to 2.0 Å. Molecular-replacement calculations indicate that there are three GLO1 dimers in the asymmetric unit, which take up a helical arrangement with their molecular dyads arranged approximately perpendicular to the c axis. Further analysis of these data is under way.

  10. Cognitive Work Analysis: Preliminary Data for a Model of Problem Solving Strategies

    Science.gov (United States)

    Rothmayer, Mark; Blue, Jennifer

    2007-10-01

    Investigations into problem solving strategies are part of the field of physics education research, where investigators seek to improve physics instruction by conducting basic research on problem-solving abilities among students, differences in knowledge representations between experts and novices, and how to transfer knowledge structures more effectively onto novices. We developed a new conceptual research tool in our laboratory, with which we could potentially map the step-by-step flow of problem solving strategies among experts and novices. This model is derived from the theory of Cognitive Work Analysis, which is grounded in ecological psychology, and as far as we know it has never been applied to a knowledge domain like physics. We collected survey data from 140 undergraduates enrolled in an algebra-based introductory physics course at Miami University as part of a larger study aimed at testing the validity of the model. Preliminary data will be presented and discussed.

  11. Preliminary risk analysis applied to the transmission of Creutzfeldt-Jakob disease.

    Science.gov (United States)

    Bertrand, E; Schlatter, J

    2011-01-01

    Transmissible spongiform encephalopathy (TSE) is a degenerative disease of the central nervous system. As yet, there is no human screening test and no effective treatment. This disease is invariably fatal. General preventive measures are therefore essential. The objective of this study is to analyze and address on a prioritized basis the risks relating to the transmission of Creutzfeldt-Jakob disease during surgical operations by means of a preliminary risk analysis (PRA). The PRA produces 63 scenarios with maximum risk relating to operational and legal dangers. The study recommends a number of courses of action, such as training and internal controls, in order to reduce the risks identified. A procedure has been drawn up and assessed for each action. This PRA makes it possible to target and significantly reduce the potential dangers for transmission of Creutzfeldt-Jakob disease through the use of medical instruments.

  12. Purification, crystallization and preliminary X-ray analysis of uridine phosphorylase from Salmonella typhimurium.

    Science.gov (United States)

    Dontsova, Mariya V; Savochkina, Yulia A; Gabdoulkhakov, Azat G; Baidakov, Sergey N; Lyashenko, Andrey V; Zolotukhina, Maria; Errais Lopes, Liubov; Garber, Mariya B; Morgunova, Ekaterina Yu; Nikonov, Stanislav V; Mironov, Alexandr S; Ealick, Steven E; Mikhailov, Al'bert M

    2004-04-01

    The structural udp gene encoding uridine phosphorylase (UPh) was cloned from the Salmonella typhimurium chromosome and overexpressed in Escherichia coli cells. S. typhimurium UPh (StUPh) was purified to apparent homogeneity and crystallized. The primary structure of StUPh has high homology to the UPh from E. coli, but the enzymes differ substantially in substrate specificity and sensitivity to the polarity of the medium. Single crystals of StUPh were grown using hanging-drop vapor diffusion with PEG 8000 as the precipitant. X-ray diffraction data were collected to 2.9 Å resolution. Preliminary analysis of the diffraction data indicated that the crystal belonged to space group P6(1) (or its enantiomorph P6(5)), with unit-cell parameters a = 92.3, c = 267.5 Å. The solvent content is 37.7% assuming the presence of one StUPh hexamer per asymmetric unit.

  13. Crystallization and preliminary X-ray diffraction analysis of West Nile virus

    Energy Technology Data Exchange (ETDEWEB)

    Kaufmann, Barbel; Plevka, Pavel; Kuhn, Richard J.; Rossmann, Michael G. (Purdue)

    2010-05-25

    West Nile virus, a human pathogen, is closely related to other medically important flaviviruses of global impact such as dengue virus. The infectious virus was purified from cell culture using polyethylene glycol (PEG) precipitation and density-gradient centrifugation. Thin amorphously shaped crystals of the lipid-enveloped virus were grown in quartz capillaries equilibrated by vapor diffusion. Crystal diffraction extended at best to a resolution of about 25 Å using synchrotron radiation. A preliminary analysis of the diffraction images indicated that the crystals had unit-cell parameters a ≈ b ≈ 480 Å, γ = 120°, suggesting a tight hexagonal packing of one virus particle per unit cell.

  14. Euler Technology Assessment for Preliminary Aircraft Design-Unstructured/Structured Grid NASTD Application for Aerodynamic Analysis of an Advanced Fighter/Tailless Configuration

    Science.gov (United States)

    Michal, Todd R.

    1998-01-01

    This study supports the NASA Langley-sponsored project aimed at determining the viability of using Euler technology for preliminary design use. The primary objective of this study was to assess the accuracy and efficiency of the Boeing, St. Louis unstructured grid flow field analysis system, consisting of the MACGS grid generation and NASTD flow solver codes. Euler solutions about the Aero Configuration/Weapons Fighter Technology (ACWFT) 1204 aircraft configuration were generated. Several variations of the geometry were investigated, including a standard wing, cambered wing, deflected elevon, and deflected body flap. A wide range of flow conditions, most of which were in the non-linear regimes of the flight envelope, including variations in speed (subsonic, transonic, supersonic), angle of attack, and sideslip, was investigated. Several flowfield non-linearities were present in these solutions, including shock waves, vortical flows and the resulting interactions. The accuracy of this method was evaluated by comparing solutions with test data and Navier-Stokes solutions. The ability to accurately predict lateral-directional characteristics and control effectiveness was investigated by computing solutions with sideslip, and with deflected control surfaces. Problem setup times and computational resource requirements were documented and used to evaluate the efficiency of this approach for use in the fast-paced preliminary design environment.

  15. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    Science.gov (United States)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15-year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  16. A Preliminary Analysis of Dose Rates Associated with ITER CVCS Equipment/Area Location

    Energy Technology Data Exchange (ETDEWEB)

    Blakeman, Edward D [ORNL; Ilas, Dan [ORNL; Petrov, Andrei Y [ORNL

    2012-03-01

    A preliminary analysis of the ITER Chemical and Volume Control System (CVCS) Area was performed to assess dose rates outside the walls and ceiling of the facility after 1.5 years of operation at shutdown, 2 days, and 10 days after shutdown. For this purpose a simplified Monte Carlo computer model was developed using the MCNP (MCNP5 Ver. 1.51) code. Two components are included: the smaller filter tank and the larger ion exchanger. These pieces of equipment are associated with the Integrated Blanket ELM Divertor Primary Heat Transfer System, which will have the largest dose rates associated with activated corrosion products during operation in comparison with other systems. The ion exchanger contained two source regions, a 1.2-m-thick resin bed above a 0.55-m-thick skirt, and a 0.8-m-thick water region. The filter constituted an additional source. Thus the model consisted of three sources (filter, resin, water), homogeneously distributed within the appropriate source regions. However, many of the results (those that address individual isotopes) are presented with the two sources in the ion exchanger combined. In these cases the sources are referred to as the 'ion exchanger source' and the 'filter source.' Dimensions for the facility and components, as well as source isotopes and strengths, and material densities, were supplied by US ITER. Because of its simplification, the model does not contain pipes. Consequently, radiation streaming through pipe penetrations, radiation emanating from the pipes, and shielding from the pipes were not considered in this analysis. Dose rates on the outside of two walls and the ceiling were calculated. The two walls are labeled as the 'long' wall (aligned with the X-axis) and the 'short' wall (aligned with the Y-axis). These walls and ceiling were nominally set to 30-cm-thick concrete. In the original analysis, standard concrete (2.3 g/cc density) was used. In addition to the shielding walls/ceiling, a ...

  17. Computer Analysis Of ILO Standard Chest Radiographs Of Pneumoconiosis

    Science.gov (United States)

    Li, C. C.; Shu, David B. C.; Tai, H. T.; Hou, W.; Kunkle, G. A.; Wang, Y.; Hoy, R. J.

    1982-11-01

    This paper presents study of computer analysis of the 1980 ILO standard chest radiographs of pneumoconiosis. Algorithms developed for detection of individual small rounded and irregular opacities have been experimented and evaluated on these standard radiographs. The density, shape, and size distribution of the detected objects in the lung field, in spite of false positives, can be used as indicators for the beginning of pneumoconiosis. This approach is potentially useful in computer-assisted screening and early detection process where the annual chest radiograph of each worker is compared with his (her) own normal radiograph obtained previously.

  18. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system of computer codes developed as modules to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules, but the latter are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  19. EST analysis pipeline: use of distributed computing resources.

    Science.gov (United States)

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available for small research groups or institutes without bioinformatics support, an alternative will be described: the use of distributed computing resources (local grids and Amazon EC2).

  20. Computer Vision-Based Image Analysis of Bacteria.

    Science.gov (United States)

    Danielsen, Jonas; Nordenfelt, Pontus

    2017-01-01

    Microscopy is an essential tool for studying bacteria, but is today mostly used in a qualitative or possibly semi-quantitative manner often involving time-consuming manual analysis. It also makes it difficult to assess the importance of individual bacterial phenotypes, especially when there are only subtle differences in features such as shape, size, or signal intensity, which is typically very difficult for the human eye to discern. With computer vision-based image analysis - where computer algorithms interpret image data - it is possible to achieve an objective and reproducible quantification of images in an automated fashion. Besides being a much more efficient and consistent way to analyze images, this can also reveal important information that was previously hard to extract with traditional methods. Here, we present basic concepts of automated image processing, segmentation and analysis that can be relatively easy implemented for use with bacterial research.
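
    As one concrete example of the kind of automated segmentation described (Otsu's global threshold, a generic sketch rather than the chapter's specific pipeline):

    import numpy as np

    def otsu_threshold(img):
        """Return Otsu's threshold for a 2-D uint8 image by maximizing
        the between-class variance over all 256 candidate thresholds."""
        hist = np.bincount(img.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()
        omega = np.cumsum(p)                   # class-0 probability
        mu = np.cumsum(p * np.arange(256))     # cumulative intensity mean
        mu_t = mu[-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
        return int(np.nanargmax(sigma_b2))

    # mask = img > otsu_threshold(img)   # foreground (bacteria) vs. background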

  1. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    Science.gov (United States)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  2. A preliminary uncertainty analysis of phenomenological inputs employed in MAAP code using the SAUNA system

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. H.; Park, S. Y.; Kim, K. R.; Ahn, K. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    Uncertainty analysis is an essential element of the safety analysis of nuclear power plants, and it is increasingly used as an essential methodology in safety assessments performed with computer codes. Recently, these efforts have been stepped up to apply the uncertainty methodology in severe accident analysis and Level 2 PSA. From this point of view, a statistical sampling-based MAAP-specific platform for severe accident uncertainty analysis, SAUNA, is being developed at KAERI. Its main purpose is to execute the many simulations that uncertainty analysis requires. For efficient implementation, the SAUNA system is composed of three related modules: first, a module for preparing a statistical sampling matrix; second, a module for the dynamic linking between the code and the samples for code simulation; and third, a postprocessing module for further analysis of the code simulation results. The main objective of this paper is to introduce the main functions of the SAUNA system and an example of its implementation.

  3. CAR: A MATLAB Package to Compute Correspondence Analysis with Rotations

    Directory of Open Access Journals (Sweden)

    Urbano Lorenzo-Seva Rovira

    2009-09-01

    Correspondence analysis (CA) is a popular method that can be used to analyse relationships between categorical variables. Like principal component analysis, CA solutions can be rotated both orthogonally and obliquely to simple structure without affecting the total amount of explained inertia. We describe a MATLAB package for computing CA. The package includes orthogonal and oblique rotation of axes. It is designed not only for advanced users of MATLAB but also for beginners. Analysis can be done using a user-friendly interface, or by using command lines. We illustrate the use of CAR with one example.
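
    For orientation, the unrotated CA solution can be computed from the SVD of the standardized residuals (a textbook sketch in Python with an invented contingency table; CAR itself is a MATLAB package and adds the rotation step):

    import numpy as np

    N = np.array([[30., 10.,  5.],
                  [10., 40., 20.],
                  [ 5., 20., 35.]])                   # illustrative counts
    P = N / N.sum()                                   # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)               # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]       # principal row coordinates
    inertia = sv ** 2                                 # explained inertia per axis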

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  7. Fluctuation in measurements of pulmonary nodule under tidal volume ventilation on four-dimensional computed tomography: preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Tateishi, Ukihide [National Cancer Center Hospital, Division of Diagnostic Radiology, Chuo-ku, Tokyo (Japan); Tsukagoshi, Shinsuke; Inokawa, Hiroyasu; Okumura, Miwa [Toshiba Medical Systems Corporation, CT Systems Development, Otawara (Japan); Moriyama, Noriyuki [National Cancer Center, Division of Cancer Screening, Research Center for Cancer Prevention and Screening, Tokyo (Japan)

    2008-10-15

    The present study aimed to assess the feasibility of four-dimensional (4D) chest computed tomography (CT) under tidal volume ventilation and the impact of respiratory motion on quantitative analysis of CT measurements. Forty-four pulmonary nodules in patients with metastatic disease were evaluated. CT examinations were performed using a 256 multidetector-row CT (MDCT) unit. Volume data were obtained from the lower lung fields (128 mm) above the diaphragm during dynamic CT acquisition. The CT parameters used were 120 kV, 100 or 150 mA, 0.5 s⁻¹, and 0.5 mm collimation. Image data were reconstructed every 0.1 s during one respiratory cycle by a 180° reconstruction algorithm for four independent fractions of the respiratory cycle. Pulmonary nodules were measured along their longest and shortest axes using electronic calipers. Automated volumetry was assessed using commercially available software. The diameters of the long and short axes in each frame were 9.0-9.6 mm and 7.1-7.5 mm, respectively. There was fluctuation of the long-axis diameters in the third fraction. The mean volume in each fraction ranged from 365 to 394 mm³. Statistically significant fluctuation was also found in the third fraction. 4D-CT under tidal volume ventilation is feasible for determining the diameter or volume of a pulmonary nodule. (orig.)

  8. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per

    2016-01-01

    The system runs on a low-cost embedded computer with very limited computational resources compared to an ordinary PC. It succeeds in counting honeybees, identifying their position and measuring their in-and-out activity. Our algorithm uses a background subtraction method to segment the images; after the segmentation stage, the methods are primarily based on statistical analysis and inference. The regression statistics (i.e. R²) for the comparisons of system predictions and manual counts are 0.987 for counting honeybees, and 0.953 and 0.888 for measuring in-activity and out-activity, respectively. The experimental results demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state at the beehive entrance. The computation times also show that the Raspberry Pi is a viable platform for such a real-time video processing system.
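
    A minimal sketch of the background-subtraction step named above, using OpenCV 4.x's MOG2 model (the file name, subtractor parameters and size filter are illustrative assumptions; the paper's own implementation may differ):

    import cv2

    cap = cv2.VideoCapture("hive_entrance.mp4")   # hypothetical input video
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                         # foreground mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        bees = [c for c in contours if cv2.contourArea(c) > 50]  # size filter
        # ...track centroids across frames to classify in/out activity
    cap.release()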

  9. CFD Analysis and Design Optimization Using Parallel Computers

    Science.gov (United States)

    Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James

    1997-01-01

    A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.

  10. Preliminary Safety Analysis Report for the Transuranic Storage Area Retrieval Enclosure at the Idaho National Engineering Laboratory. Revision 8

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    This Transuranic Storage Area Retrieval Enclosure Preliminary Safety Analysis Report was completed as required by DOE Order 5480.23. The purpose of this document is to construct a safety basis that supports the design and permits construction of the facility. The facility has been designed to the requirements of a Radioactive Solid Waste Facility presented in DOE Order 6430.1A.

  11. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  13. Waste Feed Delivery System Phase 1 Preliminary Reliability and Availability and Maintainability Analysis [SEC 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    CARLSON, A.B.

    1999-11-11

    This document presents updated results of the preliminary reliability, availability, and maintainability analysis performed for delivery of waste feed from tanks 241-AZ-101 and 241-AN-105 to British Nuclear Fuels Limited, Inc. under the Tank Waste Remediation System Privatization Contract. The operational schedule delay risk is estimated and contributing factors are discussed.

  14. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    Science.gov (United States)

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community.

  15. Emerging Trends and Statistical Analysis in Computational Modeling in Agriculture

    Directory of Open Access Journals (Sweden)

    Sunil Kumar

    2015-03-01

    In this paper the authors describe emerging trends in computational modelling used in the sphere of agriculture. Agricultural computational modelling, with the use of intelligence techniques to compute agricultural output from minimal input data, is gaining momentum because it can cut down multi-locational field trials and thereby save time, labour and other inputs. Development of locally suitable integrated farming systems (IFS) is the utmost need of the day, particularly in India, where about 95% of farms are of small and marginal holding size. Optimization of the size and number of the various enterprises in an IFS model for a particular agro-climate is an essential component of research to sustain agricultural productivity, not only to feed the growing population of the country, but also to enhance nutritional security, farm returns and quality of life. A review of the literature pertaining to emerging trends in computational modelling applied to agriculture is presented to clarify its mechanisms, behaviour and applications. Computational modelling is increasingly effective for the design and analysis of systems; it is an important tool for analysing the effects of different scenarios of climate and management options on farming systems and their interactions. The authors also highlight applications of computational modelling in integrated farming systems, crops, weather, soil, climate, horticulture and statistics used in agriculture, which can show the agricultural researcher and the rural farming community a path toward replacing some traditional techniques.

  16. A preliminary assessment of using a white light confocal imaging profiler for cut mark analysis.

    Science.gov (United States)

    Schmidt, Christopher W; Moore, Christopher R; Leifheit, Randell

    2012-01-01

    White light confocal microscopy creates detailed 3D representations of microsurfaces that can be qualitatively and quantitatively analyzed. The study describes its application to the analysis of cut marks on bone, particularly when discerning cuts made by steel tools from those made by stone. The process described comes from a study where cuts were manually made on a cow rib with seven cutting tools, four stone (an unmodified chert flake, a chert biface, a bifacially ground slate fragment, and an unsharpened piece of slate) and three steel (a Swiss Army Knife, a serrate steak knife, and a serrate saw). Kerfs were magnified ×20 and 3D data clouds were generated using a Sensofar® White Light Confocal Profiler (WLCP). Kerf profiles and surface areas, volumes, mean depths, and maximum depths were calculated with proprietary software (SensoScan® and SolarMap®). For the most part, the stone tools make shallower and wider cuts. Kerf floors can be studied at higher magnifications; they were viewed at ×100. When comparing the kerf floors of the unsharpened slate and the serrate steak knife it was found that the slate floor was more uneven, but the serrate steak knife generated more overall relief. Although preliminary, the approach described here successfully distinguishes stone and steel tools; the authors conclude that the WLCP is a promising technology for cut mark analysis because of the very detailed 3D representations it creates and the numerous avenues of analysis it provides.

  17. Preliminary Drop Time Analysis of a Control Rod Using CFD Code

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Myoung Hwan; Park, Jin Seok; Lee, Won Jae [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Jun Hong [SEST Co., Seoul (Korea, Republic of)

    2010-05-15

    A control rod drive mechanism (CRDM) is a reactor regulating system which can insert and withdraw a control rod containing a neutron-absorbing material to control the reactivity of the reactor core. The latch-type CRDM is going to be used for the SMART (System-integrated Modular Advanced ReacTor). The drop time of the control rod is one of the important parameters for the safety analysis of the reactor at the design stage. When the control rod is falling into the core, it is retarded by various forces acting on it, such as fluid resistance, buoyancy and the mechanical friction caused by contact with the inner surface of the guide thimble. However, the complicated coupling of these forces makes it difficult to predict the drop behavior. This paper describes the development of a 3D CFD analysis model using the FLUENT code. The single control rod of the Westinghouse 17x17 type optimized fuel assembly (W-OFA) was considered for the verification of the CFD model. A preliminary drop time analysis for the SMART with the simulated control rod was performed.
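
    To illustrate the balance of forces described above, a lumped-parameter drop-time estimate can be integrated directly (all parameter values are invented placeholders rather than SMART design data, and a real analysis resolves the hydraulics with CFD as the paper does):

    # Toy model: m dv/dt = m g - F_b - c v|v|, lumping buoyancy and quadratic drag.
    m, g = 60.0, 9.81          # rod mass [kg], gravity [m/s^2] (assumed)
    F_b = 80.0                 # buoyancy force [N] (assumed)
    c = 250.0                  # lumped drag coefficient [N s^2/m^2] (assumed)
    stroke = 2.0               # drop height [m] (assumed)

    dt, t, x, v = 1e-4, 0.0, 0.0, 0.0
    while x < stroke:
        a = (m * g - F_b - c * v * abs(v)) / m
        v += a * dt
        x += v * dt
        t += dt
    print(f"estimated drop time: {t:.2f} s")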

  18. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology for Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues, but profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, and network usage, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When a large number of behavioral lines are visualized together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual approach is effective in identifying trends and anomalies in the systems.
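
    A minimal sketch of the similarity computation that such a layout could build on is given below: per-node multivariate profiles are z-normalized and compared with a Euclidean distance. The data, shapes, and choice of distance are assumptions for illustration; the paper's actual measures may differ.

```python
import numpy as np

# Hypothetical profile data: for each compute node, a multivariate time
# series of sampled metrics (CPU load, memory, network, ...), with shape
# (n_nodes, n_timesteps, n_metrics). Values here are synthetic.
rng = np.random.default_rng(0)
profiles = rng.random((8, 100, 4))

# z-normalize each metric globally so no single unit dominates the distance
mu = profiles.mean(axis=(0, 1), keepdims=True)
sd = profiles.std(axis=(0, 1), keepdims=True)
z = (profiles - mu) / sd

# pairwise dissimilarity between nodes: Euclidean distance over all samples
flat = z.reshape(z.shape[0], -1)
d = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)
print(np.round(d, 2))   # nodes with small mutual distance behave alike
```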

  19. Assessing computer waste generation in Chile using material flow analysis.

    Science.gov (United States)

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD monitors and laptops will increase more rapidly, replacing other e-waste, including CRT monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
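
    At its core, this kind of material flow analysis convolves historical sales with a product lifespan distribution to estimate future waste arisings. The sketch below shows that delay model with invented sales figures and an assumed discrete lifespan distribution; it is not the paper's calibrated model.

```python
import numpy as np

# Delay model: units sold in year t become waste in year t + tau with
# probability lifespan[tau]. Sales and lifespans below are illustrative.
years = np.arange(2000, 2021)
sales = np.linspace(100_000, 600_000, len(years))      # units sold per year

lifespan = {4: 0.1, 5: 0.2, 6: 0.3, 7: 0.25, 8: 0.15}  # P(discard after tau years)

waste = np.zeros_like(sales)
for i in range(len(years)):
    for tau, p in lifespan.items():
        if i + tau < len(years):
            waste[i + tau] += sales[i] * p             # shift sales into waste

for y, w in zip(years, waste):
    print(y, int(w))
```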

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet hardware reliability is often secondary in overall risk significance to the human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates, in a hybrid fashion, elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing the modeling uncertainty found in current plant risk models.

  1. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  2. A Preliminary Analysis of Precipitation Properties and Processes during NASA GPM IFloodS

    Science.gov (United States)

    Carey, Lawrence; Gatlin, Patrick; Petersen, Walt; Wingo, Matt; Lang, Timothy; Wolff, Dave

    2014-01-01

    The Iowa Flood Studies (IFloodS) is a NASA Global Precipitation Measurement (GPM) ground measurement campaign, which took place in eastern Iowa from May 1 to June 15, 2013. The goals of the field campaign were to collect detailed measurements of surface precipitation using ground instruments and advanced weather radars while simultaneously collecting data from satellites passing overhead. Data collected by the radars and other ground instruments, such as disdrometers and rain gauges, will be used to characterize precipitation properties throughout the vertical column, including the precipitation type (e.g., rain, graupel, hail, aggregates, ice crystals), precipitation amounts (e.g., rain rate), and the size and shape of raindrops. The impact of physical processes, such as aggregation, melting, breakup and coalescence, on the measured liquid and ice precipitation properties will be investigated. These ground observations will ultimately be used to improve rainfall estimates from satellites and in particular the algorithms that interpret raw data for the upcoming GPM mission's Core Observatory satellite, which launches in 2014. The various precipitation data collected will eventually be used as input to flood forecasting models in an effort to improve capabilities and test the utility and limitations of satellite precipitation data for flood forecasting. In this preliminary study, the focus will be on analysis of NASA NPOL (S-band, polarimetric) radar measurements (e.g., radar reflectivity, differential reflectivity, differential phase, correlation coefficient) and NASA 2D Video Disdrometer (2DVD) measurements. Quality control and processing of the radar and disdrometer data sets will be outlined. In analyzing preliminary cases, particular emphasis will be placed on 1) documenting the evolution of the rain drop size distribution (DSD) as a function of column melting processes and 2) assessing the impact of range on ground-based polarimetric radar estimates of DSD properties.
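
    As a worked example of the kind of quantity derived from 2DVD drop size distributions, the sketch below integrates a synthetic exponential DSD to a rain rate using the standard relation R = 6*pi*1e-4 * integral of v(D) D^3 N(D) dD and an Atlas-Ulbrich fall speed law. The DSD parameters are invented; real N(D) would come from the disdrometer.

```python
import numpy as np

# Rain rate from a drop size distribution: D in mm, N(D) in m^-3 mm^-1,
# fall speed v(D) = 3.78 D^0.67 m/s (Atlas-Ulbrich power law).
D = np.linspace(0.1, 6.0, 200)             # drop diameter (mm)
N0, Lam = 8000.0, 2.0                      # Marshall-Palmer-like parameters
N = N0 * np.exp(-Lam * D)                  # synthetic exponential DSD
v = 3.78 * D**0.67                         # terminal fall speed (m/s)

# R [mm/h] = 6*pi*1e-4 * integral of v(D) * D^3 * N(D) dD
R = 6 * np.pi * 1e-4 * np.trapz(v * D**3 * N, D)
print(f"rain rate: {R:.1f} mm/h")
```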

  3. Computer-Assisted Learning in Anatomy at the International Medical School in Debrecen, Hungary: A Preliminary Report

    Science.gov (United States)

    Kish, Gary; Cook, Samuel A.; Kis, Greta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an…

  4. Towards Advanced Data Analysis by Combining Soft Computing and Statistics

    CERN Document Server

    Gil, María; Sousa, João; Verleysen, Michel

    2013-01-01

    Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis. Soft computing focuses on obtaining working solutions quickly, accepting approximations and unconventional approaches. Its strength lies in its flexibility to create models that suit the needs arising in applications. In addition, it emphasizes the need for intuitive and interpretable models, which are tolerant to imprecision and uncertainty. Statistics is more rigorous and focuses on establishing objective conclusions based on experimental data by analyzing the possible situations and their (relative) likelihood. It emphasizes the need for mathematical methods and tools to assess solutions and guarantee performance. Combining the two fields enhances the robustness and generalizability of data analysis methods, while preserving the flexibility to solve real-world problems efficiently and intuitively.

  5. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  6. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    Science.gov (United States)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro- or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurate and efficient numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.
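
    The core idea CSP builds on, separating fast and slow modes via the eigenvalues of the drift Jacobian, can be illustrated in a few lines. The toy two-species system and rate constants below are invented; the sketch only exposes the time-scale gap that makes such systems stiff, not the SCSP algorithm itself.

```python
import numpy as np

# Toy two-species system with a fast and a slow reaction channel.
k1, k2 = 1000.0, 1.0                        # arbitrary fast/slow rates

def drift(y):
    a, b = y
    return np.array([-k1 * a + k2 * b, k1 * a - 2 * k2 * b])

def jacobian(y):
    # Jacobian of the drift; constant here because the system is linear.
    return np.array([[-k1, k2], [k1, -2 * k2]])

y = np.array([1.0, 0.5])
eigvals = np.linalg.eigvals(jacobian(y))
tau = np.sort(1.0 / np.abs(eigvals.real))   # characteristic time scales
print("time scales:", tau)                  # a wide gap flags stiffness;
# CSP-type methods exhaust the fast mode(s) and integrate only the slow ones.
```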

  7. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  8. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; Ramesh, N.; K. Vijayalakshmi; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of blindness. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. One of the biomarkers for diabetic retinopathy has been identified as the Vascular Endothelial Growth Factor (VEGF) gene by computational analysis. VEGF is a sub-family of growth factors, the platelet-derived growth factor family of cystine-knot growth factors...

  9. A Sensitivity Analysis on Component Reliability from Fatigue Life Computations

    Science.gov (United States)

    1992-02-01

    Report MTL TR 92-5 (DTIC accession AD-A247 430), by Donald M. Neal, William T. Matthews, Mark G. Vangel, and Trevor Rudalevige. Distributed by the Defense Technical Information Center, Cameron Station, Building 5, 5010 Duke Street, Alexandria, VA 22304-6145.

  10. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and X-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  11. Analysis of coupled heat and moisture transport on parallel computers

    Science.gov (United States)

    Koudelka, Tomáš; Krejčí, Tomáš

    2017-07-01

    Coupled analysis of heat and moisture transport in complicated structural elements or in whole structures deserves special attention because, after spatial discretization, a large number of degrees of freedom is needed. This paper describes a possible solution of such problems based on domain decomposition methods executed on parallel computers. The Schur complement method is used with respect to nonsymmetric systems of algebraic equations. The method described is an alternative to other methods, e.g., two- or multi-scale homogenization.
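
    The sketch below illustrates the Schur complement idea on a two-subdomain block system: each subdomain eliminates its interior unknowns independently (the parallel step), leaving a small interface problem. The random matrices are stand-ins for assembled finite element blocks and are assumptions, not data from the paper.

```python
import numpy as np

# Block system: [A1 0 B1; 0 A2 B2; C1 C2 D] [u1; u2; ug] = [f1; f2; fg],
# where ug are interface unknowns. Blocks are random but well-conditioned.
rng = np.random.default_rng(1)
n1, n2, ng = 6, 5, 3
A1 = rng.random((n1, n1)) + n1 * np.eye(n1)   # subdomain 1 interior
A2 = rng.random((n2, n2)) + n2 * np.eye(n2)   # subdomain 2 interior
B1, B2 = rng.random((n1, ng)), rng.random((n2, ng))
C1, C2 = rng.random((ng, n1)), rng.random((ng, n2))
D = rng.random((ng, ng)) + ng * np.eye(ng)
f1, f2, fg = rng.random(n1), rng.random(n2), rng.random(ng)

# Each subdomain eliminates its interior unknowns (done in parallel):
S = D - C1 @ np.linalg.solve(A1, B1) - C2 @ np.linalg.solve(A2, B2)
g = fg - C1 @ np.linalg.solve(A1, f1) - C2 @ np.linalg.solve(A2, f2)

ug = np.linalg.solve(S, g)                    # small global interface solve
u1 = np.linalg.solve(A1, f1 - B1 @ ug)        # local back-substitution
u2 = np.linalg.solve(A2, f2 - B2 @ ug)
```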

  12. Preliminary assessment of facial soft tissue thickness utilizing three-dimensional computed tomography models of living individuals.

    Science.gov (United States)

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2014-04-01

    Facial approximation is the technique of developing a representation of the face from the skull of an unknown individual. Facial approximation relies heavily on average craniofacial soft tissue depths. For more than a century, researchers have employed a broad array of tissue depth collection methodologies, a practice which has resulted in a lack of standardization in craniofacial soft tissue depth research. To combat such methodological inconsistencies, Stephan and Simpson 2008 [15] examined and synthesized a large number of previously published soft tissue depth studies. Their comprehensive meta-analysis produced a pooled dataset of averaged tissue depths and a simplified methodology, which the researchers suggest be utilized as a minimum standard protocol for future craniofacial soft tissue depth research. The authors of the present paper collected craniofacial soft tissue depths using three-dimensional models generated from computed tomography scans of living males and females of four self-identified ancestry groups from the United States ranging in age from 18 to 62 years. This paper assesses the differences between: (i) the pooled mean tissue depth values from the sample utilized in this paper and those published by Stephan 2012 [21] and (ii) the mean tissue depth values of two demographically similar subsets of the sample utilized in this paper and those published by Rhine and Moore 1984 [16]. Statistical test results indicate that the tissue depths collected from the sample evaluated in this paper are significantly and consistently larger than those published by Stephan 2012 [21]. Although a lack of published variance data by Rhine and Moore 1984 [16] precluded a direct statistical assessment, a substantive difference was also concluded. Further, the dataset presented in this study is representative of modern American adults and is, therefore, appropriate for use in constructing contemporary facial approximations.

  13. Approach of fuzzy logic in the preliminary risk analysis of the upstream and downstream lines of an offshore petroleum production unit

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Maia Neto, Luiz

    2009-07-01

    This work applies a qualitative risk assessment model based on fuzzy logic to judge the criticality of the accident scenarios identified through the technique of preliminary hazard analysis in the upstream and downstream lines of an offshore oil production unit already in operation. The fuzzy logic model acts as a substitute for the traditional risk matrix, which uses subjective categories for the expected severity and frequency of accidents. The structure of the model consists of 7 input variables, an internal variable, and an output variable, all linked in accordance with the analysis modules for each type of accident. The knowledge base that completes the expert system consists of membership functions developed for each of the variables and a set of 219 inference rules distributed over the 7 modules. This knowledge base, which incorporates the logical reasoning mechanisms of specialists, efficiently assists and guides the teams that carry out preliminary hazard analyses using a computer program with the routines inserted beforehand. The model incorporates into the program's knowledge base the concepts underlying the frequency and severity categories, in the form of membership functions of the linguistic variables and the rule set. Scales subdivided into ranges, defined on the basis of the guidance present in existing risk matrices, are thus used to define the actions to be taken for the analyzed accident scenarios. (author)
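
    To make the fuzzy alternative to the risk matrix concrete, the sketch below evaluates two invented max-min rules over triangular membership functions. The universes, shapes, rule weights, and inputs are all hypothetical; the actual system has 7 input variables and 219 rules.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

freq, sev = 0.55, 0.80            # normalized inputs from the PHA team (made up)

# Degrees of membership of the inputs in invented linguistic terms
freq_high = tri(freq, 0.4, 0.7, 1.0)
sev_high = tri(sev, 0.5, 0.8, 1.1)
sev_med = tri(sev, 0.2, 0.5, 0.8)

# Rules combined with min (AND) and aggregated with max; the crisp output
# is a weighted criticality score (a simplification of full defuzzification).
r1 = min(freq_high, sev_high)     # IF freq high AND sev high THEN critical
r2 = min(freq_high, sev_med)      # IF freq high AND sev medium THEN serious
criticality = max(1.0 * r1, 0.6 * r2)
print(f"criticality = {criticality:.2f}")
```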

  14. Computer Use, Confidence, Attitudes, and Knowledge: A Causal Analysis.

    Science.gov (United States)

    Levine, Tamar; Donitsa-Schmidt, Smadar

    1998-01-01

    Introduces a causal model which links measures of computer experience, computer-related attitudes, computer-related confidence, and perceived computer-based knowledge. The causal model suggests that computer use has a positive effect on perceived computer self-confidence, as well as on computer-related attitudes. Questionnaires were administered…

  15. Preliminary analysis of patent trends for sodium/sulfur battery technology

    Energy Technology Data Exchange (ETDEWEB)

    Triplett, M.B.; Winter, C.; Ashton, W.B.

    1985-07-01

    This document summarizes development trends in sodium/sulfur battery technology based on data from US patents. The purpose of the study was to use the activity, timing and ownership of 285 US patents to identify and describe broad patterns of change in sodium/sulfur battery technology. The analysis was conducted using newly developed statistical and computer graphic techniques for describing technology development trends from patent data. This analysis suggests that for some technologies, trends in patent data provide useful information for public and private R and D planning.

  16. Computer-assisted learning in anatomy at the international medical school in Debrecen, Hungary: a preliminary report.

    Science.gov (United States)

    Kish, Gary; Cook, Samuel A; Kis, Gréta

    2013-01-01

    The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an introduction to anatomical digital images along with clinical cases. This low-budget course has a large visual component using images from magnetic resonance imaging and computer axial tomogram scans, ultrasound clinical studies, and readily available anatomy software that presents topics which run in parallel to the university's core anatomy curriculum. From the combined computer images and CHA lecture information, students are asked to solve computer-based clinical anatomy problems in the CHA computer laboratory. A statistical comparison was undertaken of core anatomy oral examination performances of English program first-year medical students who took the elective CHA course and those who did not in the three academic years 2007-2008, 2008-2009, and 2009-2010. The results of this study indicate that the CHA-enrolled students improved their performance on the required anatomy core curriculum oral examinations, suggesting that students benefit from computer-assisted learning in a multilingual and diverse cultural environment.

  17. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    Full Text Available The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the process, and the real essence that propel this growth. This information can be used as the basis for the development of strategies and public policies to improve education and innovation services. Trend analysis is one of the steps in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is carried out to identify the main subjects being studied by these programs, both collectively and individually.

  18. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  19. Analysis and computation of microstructure in finite plasticity

    CERN Document Server

    Hackl, Klaus

    2015-01-01

    This book addresses the need for a fundamental understanding of the physical origin, the mathematical behavior, and the numerical treatment of models which include microstructure. Leading scientists present their efforts involving mathematical analysis, numerical analysis, computational mechanics, material modelling and experiment. The mathematical analyses are based on methods from the calculus of variations, while in the numerical implementation global optimization algorithms play a central role. The modeling covers all length scales, from the atomic structure up to macroscopic samples. The development of the models was guided by experiments on single crystals and polycrystals, and the results are checked against experimental data.

  1. Analysis of computational modeling techniques for complete rotorcraft configurations

    Science.gov (United States)

    O'Brien, David M., Jr.

    Computational fluid dynamics (CFD) provides the helicopter designer with a powerful tool for identifying problematic aerodynamics. Through the use of CFD, design concepts can be analyzed in a virtual wind tunnel long before a physical model is ever created. Traditional CFD analysis tends to be a time consuming process, where much of the effort is spent generating a high quality computational grid. Recent increases in computing power and memory have created renewed interest in alternative grid schemes such as unstructured grids, which facilitate rapid grid generation by relaxing restrictions on grid structure. Three rotor models have been incorporated into a popular fixed-wing unstructured CFD solver to increase its capability and facilitate availability to the rotorcraft community. The benefit of unstructured grid methods is demonstrated through rapid generation of high fidelity configuration models. The simplest rotor model is the steady state actuator disk approximation. By transforming the unsteady rotor problem into a steady state one, the actuator disk can provide rapid predictions of performance parameters such as lift and drag. The actuator blade and overset blade models provide a depiction of the unsteady rotor wake, but incur a larger computational cost than the actuator disk. The actuator blade model is convenient when the unsteady aerodynamic behavior needs to be investigated but the computational cost of the overset approach is too large. The overset or chimera method allows the blade loads to be computed from first principles and therefore provides the most accurate prediction of the rotor wake for the models investigated. The physics of the flow fields generated by these models for rotor/fuselage interactions are explored, along with the efficiencies and limitations of each method.
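
    The actuator disk approximation mentioned above rests on simple momentum theory, which already yields useful performance numbers. The sketch below evaluates the hover relations for an invented thrust and rotor radius; the values are illustrative and unrelated to the configurations studied in the thesis.

```python
import numpy as np

# Momentum-theory relations behind the actuator disk: the disk carries a
# pressure jump equivalent to the rotor thrust and induces a uniform inflow.
T = 50_000.0          # rotor thrust (N), hypothetical
R = 7.0               # rotor radius (m), hypothetical
rho = 1.225           # air density (kg/m^3)

A = np.pi * R**2                      # disk area
v_i = np.sqrt(T / (2 * rho * A))      # hover induced velocity
dp = T / A                            # pressure jump across the disk
P_ideal = T * v_i                     # ideal induced power
print(f"v_i = {v_i:.2f} m/s, dp = {dp:.1f} Pa, P = {P_ideal/1e3:.0f} kW")
```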

  2. Preliminary analysis of Stearoyl Co-A Desaturase gene transcripts in River buffalo

    Directory of Open Access Journals (Sweden)

    L. Ramunno

    2010-02-01

    Full Text Available Stearoyl-CoA desaturase (SCD) is a key enzyme in the biosynthesis of monounsaturated fatty acids (MUFAs). In cattle, the SCD gene extends over a DNA segment of ~17.0 kb and is organized into 6 exons and 5 introns. The SCD gene has been indicated as a candidate gene for changing the saturated/unsaturated fatty acid ratio and hence has been suggested as a gene influencing fat quality. In cattle, eight SNPs have been identified, and one of them (T→C at the 231st nt of the 5th exon) is responsible for a Val→Ala amino acid change. The C allele has been associated with a higher content of MUFAs in carcasses, and it is positively related to a higher index of desaturation (C18:0/C18:1 and C16:0/C16:1) in the milk. In this study, we report preliminary results of an analysis of transcripts of the SCD-encoding gene in river buffalo. The electrophoretic analysis of the RT-PCR products and the subsequent sequencing showed at least five different populations of mRNA. The most represented population is correctly assembled (~1300 bp), followed by one lacking ~750 bp, corresponding to the 3rd, 4th and 5th exons and partially to the 2nd and 6th exons.

  3. Biomarkers of Eating Disorders Using Support Vector Machine Analysis of Structural Neuroimaging Data: Preliminary Results.

    Science.gov (United States)

    Cerasa, Antonio; Castiglioni, Isabella; Salvatore, Christian; Funaro, Angela; Martino, Iolanda; Alfano, Stefania; Donzuso, Giulia; Perrotta, Paolo; Gioia, Maria Cecilia; Gilardi, Maria Carla; Quattrone, Aldo

    2015-01-01

    Presently, there are no valid biomarkers to identify individuals with eating disorders (ED). The aim of this work was to assess the feasibility of a machine learning method for extracting reliable neuroimaging features allowing individual categorization of patients with ED. The Support Vector Machine (SVM) technique, combined with a pattern recognition method, was employed utilizing structural magnetic resonance images. Seventeen females with ED (six with diagnosis of anorexia nervosa and 11 with bulimia nervosa) were compared against 17 body mass index-matched healthy controls (HC). Machine learning allowed individual diagnosis of ED versus HC with an Accuracy ≥ 0.80. Voxel-based pattern recognition analysis demonstrated that voxels influencing the classification Accuracy involved the occipital cortex, the posterior cerebellar lobule, precuneus, sensorimotor/premotor cortices, and the medial prefrontal cortex, all critical regions known to be strongly involved in the pathophysiological mechanisms of ED. Although these findings should be considered preliminary given the small sample size investigated, SVM analysis highlights the role of well-known brain regions as possible biomarkers to distinguish ED from HC at an individual level, thus encouraging the translational implementation of this new multivariate approach in clinical practice.
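
    A sketch of this classification setup, a linear SVM with leave-one-out cross-validation on per-subject feature vectors, is given below using scikit-learn. The random arrays stand in for features extracted from the structural MR images; the sample sizes mirror the study, everything else is an assumption.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: one feature vector per subject, labels ED vs HC.
rng = np.random.default_rng(42)
X = rng.normal(size=(34, 500))        # 17 ED + 17 HC subjects, 500 features
y = np.array([1] * 17 + [0] * 17)     # 1 = ED, 0 = HC

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")   # ~0.5 on random data, as expected
```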

  4. Biomarkers of Eating Disorders Using Support Vector Machine Analysis of Structural Neuroimaging Data: Preliminary Results

    Directory of Open Access Journals (Sweden)

    Antonio Cerasa

    2015-01-01

    Full Text Available Presently, there are no valid biomarkers to identify individuals with eating disorders (ED). The aim of this work was to assess the feasibility of a machine learning method for extracting reliable neuroimaging features allowing individual categorization of patients with ED. The Support Vector Machine (SVM) technique, combined with a pattern recognition method, was employed utilizing structural magnetic resonance images. Seventeen females with ED (six with diagnosis of anorexia nervosa and 11 with bulimia nervosa) were compared against 17 body mass index-matched healthy controls (HC). Machine learning allowed individual diagnosis of ED versus HC with an Accuracy ≥ 0.80. Voxel-based pattern recognition analysis demonstrated that voxels influencing the classification Accuracy involved the occipital cortex, the posterior cerebellar lobule, precuneus, sensorimotor/premotor cortices, and the medial prefrontal cortex, all critical regions known to be strongly involved in the pathophysiological mechanisms of ED. Although these findings should be considered preliminary given the small sample size investigated, SVM analysis highlights the role of well-known brain regions as possible biomarkers to distinguish ED from HC at an individual level, thus encouraging the translational implementation of this new multivariate approach in clinical practice.

  5. The 6 April 2009 earthquake at L'Aquila: a preliminary analysis of magnetic field measurements

    Directory of Open Access Journals (Sweden)

    U. Villante

    2010-02-01

    Full Text Available Several investigations have reported the possible identification of anomalous geomagnetic field signals prior to earthquake occurrence. In the ULF frequency range, candidates for precursory signatures have been proposed in the increase of the noise background and of the polarization parameter (i.e. the ratio between the amplitude/power of the vertical component and that of the horizontal component), in the changing characteristics of the slope of the power spectrum and of the fractal dimension, and in the possible occurrence of short-duration pulses. We conducted, with conventional techniques of data processing, a preliminary analysis of the magnetic field observations performed at L'Aquila during the three months preceding the 6 April 2009 earthquake, focusing attention on the possible occurrence of features similar to those identified in previous events. Within the limits of this analysis, we do not find compelling evidence for any of the features which have been proposed as earthquake precursors: indeed, most aspects of our observations (which, in some cases, appear consistent with previous findings) might be interpreted in terms of the general magnetospheric conditions and/or of different sources.
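
    The polarization parameter mentioned above is simply a band-limited spectral power ratio between the vertical and horizontal components. The sketch below estimates it with Welch periodograms on synthetic 1 Hz series; the sampling rate, band, and data are assumptions, not the L'Aquila observatory settings.

```python
import numpy as np
from scipy.signal import welch

# Synthetic noise standing in for one day of fluxgate magnetometer data.
fs = 1.0                                   # sampling frequency (Hz)
rng = np.random.default_rng(7)
z = rng.normal(size=86400)                 # vertical component (nT)
h = rng.normal(size=86400)                 # horizontal component (nT)

f, Pz = welch(z, fs=fs, nperseg=3600)      # power spectral densities
_, Ph = welch(h, fs=fs, nperseg=3600)

band = (f >= 0.01) & (f <= 0.05)           # an assumed ULF band (20-100 s)
polarization = Pz[band].sum() / Ph[band].sum()
print(f"Z/H polarization parameter: {polarization:.2f}")
```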

  6. Crystallization and preliminary X-ray diffraction analysis of human adenovirus

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, V.S.; Natchiar, S.K.; Gritton, L.; Mullen, T.-M.; Stewart, P.L.; Nemerow, G.R. (Scripps); (Vanderbilt)

    2010-07-22

    Replication-defective and conditionally replicating adenovirus (AdV) vectors are currently being utilized in ~25% of human gene transfer clinical trials. Unfortunately, progress in vector development has been hindered by a lack of accurate structural information. Here we describe the crystallization and preliminary X-ray diffraction analysis of a HAdV5 vector that displays a short flexible fiber derived from HAdV35. Crystals of Ad35F were grown in 100 mM HEPES pH 7.0, 200 mM Ca(OAc)2, 14% PEG 550 MME, 15% glycerol in 100 mM Tris-HCl pH 8.5. Freshly grown crystals diffracted well to 4.5 Å resolution and weakly to 3.5 Å at synchrotron sources. HAdV crystals belong to space group P1 with unit cell parameters a = 854.03 Å, b = 855.17 Å, c = 865.24 Å, α = 119.57°, β = 91.71°, γ = 118.08°, with a single particle in the unit cell. Self-rotation and locked-rotation function analysis allowed the determination of the particle orientation. Molecular replacement, density modification and phase-extension procedures are being employed for structure determination.

  7. Ocean thermal energy conversion cold water pipe preliminary design project. Task 2. Analysis for concept selection

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-04-01

    The successful performance of the CWP is of crucial importance to the overall OTEC system; the pipe itself is considered the most critical part of the entire operation. Because of the importance of the CWP, a project for the analysis and design of CWPs was begun in the fall of 1978. The goals of this project were to study a variety of concepts for delivering cold water to an OTEC plant, to analyze and rank these concepts based on their relative cost and risk, and to develop preliminary designs for those concepts which seemed most promising. Two representative platforms and sites were chosen: a spar buoy of a Gibbs and Cox design to be moored at a site off Punta Tuna, Puerto Rico, and a barge designed by APL/Johns Hopkins University, grazing about a site approximately 200 miles east of the coast of Brazil. The approach was to concentrate on the most promising concepts and on those which were either of general interest or espoused by others (e.g., steel and concrete concepts). Much of the overall attention, therefore, focused on analyzing rigid and compliant wall designs, while stockade (except for the special case of the FRP stockade) and bottom-mounted concepts received less attention. A total of 67 CWP concepts were initially generated and subjected to a screening process. Of these, 16 were carried through design analysis, costing, and ranking. Study results are presented in detail. (WHK)

  8. Computer-aided fiber analysis for crime scene forensics

    Science.gov (United States)

    Hildebrandt, Mario; Arndt, Christian; Makrushin, Andrey; Dittmann, Jana

    2012-03-01

    The forensic analysis of fibers is currently completely manual and therefore time consuming. The automation of analysis steps can significantly support forensic experts and reduce the time required for the investigation. Moreover, subjective expert belief is extended by objective machine estimation. This work proposes a pattern recognition pipeline containing the digital acquisition of a fiber medium, the pre-processing for fiber segmentation, and the extraction of the distinctive characteristics of fibers. Currently, basic geometrical features such as the width, height, and area of optically dominant fibers are investigated. In order to support the automatic classification of fibers, supervised machine learning algorithms are evaluated. The experimental setup includes a car seat and two pieces of clothing of different fabrics. As preliminary work, acrylic (a synthetic fiber) and sheep wool (a natural fiber) were chosen for classification. While sitting on the seat, a test person leaves textile fibers. The test aims at automatically distinguishing the clothes through the fiber traces gained from the seat with the help of adhesive tape. The digitization of fiber samples is provided by a contactless chromatic white light sensor. First test results showed that two optically very different fibers can be properly assigned to their corresponding fiber types. The best classifier achieved an accuracy of 75 percent correctly classified samples with the suggested features.

  9. Engineering computer graphics in gas turbine engine design, analysis and manufacture

    Science.gov (United States)

    Lopatka, R. S.

    1975-01-01

    A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development from preliminary design through manufacture. Highlights of an operating system stylized for interactive engineering graphics are described.

  10. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: Motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler Lagrange equations, unconstrained de...

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  13. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    Science.gov (United States)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, the development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
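
    The Faraday's-law coupling mentioned above reduces, in its simplest lumped form, to marching species inventories forward in time at a rate set by the cell current. The sketch below shows that bookkeeping for a constant-current discharge; the parameter values are invented, and the sketch omits the potentials, heat balance, and property updates of the full model.

```python
# Lumped Faraday's-law bookkeeping: dn/dt = -I / (z F) for the reacting species.
F = 96485.0            # Faraday constant (C/mol)
z = 1                  # electrons transferred per Na ion
I = 20.0               # discharge current (A), hypothetical
n_Na = 5.0             # initial sodium inventory (mol), hypothetical

dt, t = 1.0, 0.0       # time step and elapsed time (s)
while n_Na > 0.0:
    n_Na -= I / (z * F) * dt     # sodium consumed by the current
    t += dt
print(f"sodium depleted after {t / 3600:.1f} h")   # analytically z*F*n/I seconds
```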

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and its components are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  15. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data, we are able to choose and characterize the tissue movement map that best matches the experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.

  16. Reactive Aggression and Suicide-Related Behaviors in Children and Adolescents: A Review and Preliminary Meta-Analysis.

    Science.gov (United States)

    Hartley, Chelsey M; Pettit, Jeremy W; Castellanos, Daniel

    2017-01-03

    The empirical literature on the association between reactive aggression and suicide-related behaviors in children and adolescents was reviewed. A narrative review of seven studies that met inclusion/exclusion criteria is followed by a preliminary meta-analysis to provide insight into the strength of the association between reactive aggression and suicide-related behaviors. Each of the seven studies reported a statistically significant association between reactive aggression and suicide-related behaviors, including suicide, nonfatal suicide attempt, and suicide ideation. Results from the meta-analysis indicated a consistent, medium-sized association (k = 7; N = 4,693; mean r = .25). The narrative review and results of the preliminary meta-analysis support the promise of pursuing future research on reactive aggression and suicide-related behaviors in children and adolescents. A theoretical model is proposed to guide the development of future research.
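
    A pooled correlation of this kind is commonly obtained by Fisher z-transforming each study's r, averaging with weights proportional to n - 3, and back-transforming; whether this review used exactly that estimator is not stated in the record. The sketch below shows the computation on seven invented (r, n) pairs.

```python
import numpy as np

# Fixed-effect pooling of correlations via Fisher's z transform.
# The (r, n) pairs are hypothetical; the study-level values are not
# given in this record.
studies = [(0.31, 400), (0.22, 950), (0.27, 610), (0.18, 780),
           (0.29, 520), (0.24, 900), (0.21, 533)]

z = np.array([np.arctanh(r) for r, _ in studies])   # Fisher z per study
w = np.array([n - 3 for _, n in studies])           # inverse-variance weights
r_bar = np.tanh((w * z).sum() / w.sum())            # back-transformed pooled r
print(f"pooled r = {r_bar:.2f}")
```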

  17. Report on preliminary analysis of state of nuclear criticality accident at JCO at Tokaimura, Ibaraki, Japan (I)

    Energy Technology Data Exchange (ETDEWEB)

    Ha, J.J.; Park, J.H.; Chang, J.H. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-10-01

    This preliminary report was prepared by the Special Task Force Team of KAERI in order to analyze the status of the nuclear criticality accident that broke out at 10:35 on September 30, 1999 at the JCO nuclear conversion test facility located at Tokaimura, Ibaraki, Japan. The report consists of a summary of the accident, its causes, and the response by relevant organizations, together with a preliminary technical analysis covering the radiation exposure of JCO workers, the cause of the accident, and an accident assessment with preventive actions against criticality accidents. It is expected that the JCO accident, Japan's first nuclear criticality accident, will have significant effects on Japan's nuclear policy and will also serve as a useful reference for Korea's future actions in the use and development of nuclear energy. 63 refs., 3 figs., 1 tab. (Author)

  18. Interim Progress Report on the Application of an Independent Components Analysis-based Spectral Unmixing Algorithm to Beowulf Computers

    Science.gov (United States)

    Lemeshewsky, George

    2003-01-01

    This report describes work done to implement an independent-components-analysis (ICA) -based blind unmixing algorithm on the Eastern Region Geography (ERG) Beowulf computer cluster. It gives a brief description of blind spectral unmixing using ICA-based techniques and a preliminary example of unmixing results for Landsat-7 Thematic Mapper multispectral imagery using a recently reported unmixing algorithm [1,2,3]. Also included are computer performance data. The final phase of this work, the actual implementation of the unmixing algorithm on the Beowulf cluster, was not completed this fiscal year and is addressed elsewhere. It is noted that study of this algorithm and its application to land-cover mapping will continue under another research project in the Land Remote Sensing theme into fiscal year 2004.
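
    The sketch below illustrates the blind unmixing idea with scikit-learn's FastICA rather than the report's specific algorithm: pixel spectra are modeled as mixtures of statistically independent sources. The endmember spectra, abundances, and band count are synthetic placeholders for Landsat-7 TM data.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic "image": 1000 pixels, 6 bands, mixing two made-up endmembers.
# Real input would be TM bands reshaped to (pixels, bands).
rng = np.random.default_rng(3)
bands = 6
end1 = np.linspace(0.1, 0.9, bands)            # fake endmember spectrum 1
end2 = np.linspace(0.8, 0.2, bands)            # fake endmember spectrum 2
abund = rng.random((1000, 2))                  # per-pixel abundances
X = abund @ np.vstack([end1, end2])            # mixed pixel spectra
X += 0.01 * rng.normal(size=X.shape)           # sensor noise

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X)                 # per-pixel source signals
print(ica.mixing_.shape)                       # (bands, components)
```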

  19. Computational Stability Analysis of Lotka-Volterra Systems

    Directory of Open Access Journals (Sweden)

    Polcz Péter

    2016-12-01

    Full Text Available This paper concerns the computational stability analysis of locally stable Lotka-Volterra (LV) systems by searching for appropriate Lyapunov functions in a general quadratic form composed of higher-order monomial terms. The Lyapunov conditions are ensured through the solution of linear matrix inequalities. The stability region is estimated by determining the level set of the Lyapunov function within a suitable convex domain. The paper includes interesting computational results and a discussion on the stability regions of higher-dimensional (3D and 4D) LV models as well as on the monomial selection for constructing the Lyapunov functions. Finally, the stability region of an uncertain 2D LV system with an uncertain, locally stable interior equilibrium point is estimated.
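
    The LMI step is easy to sketch for the simplest case, a plain quadratic Lyapunov function V(x) = x^T P x for a linearization with Jacobian A: find P > 0 with A^T P + P A < 0. The cvxpy formulation below uses an arbitrary stable 2x2 matrix as a stand-in; the paper's monomial-augmented quadratic forms generalize this same feasibility problem.

```python
import cvxpy as cp
import numpy as np

# Arbitrary stable matrix standing in for a linearized LV Jacobian.
A = np.array([[-0.5, -1.0],
              [ 0.8, -0.3]])

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6   # margin to approximate strict inequalities
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
cp.Problem(cp.Minimize(0), constraints).solve()   # pure feasibility problem
print("P =\n", P.value)   # level sets of x^T P x estimate the stability region
```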

  20. Computational analysis of RNA structures with chemical probing data.

    Science.gov (United States)

    Ge, Ping; Zhang, Shaojie

    2015-06-01

    RNAs play various roles, not only as the genetic codes to synthesize proteins, but also as the direct participants of biological functions determined by their underlying high-order structures. Although many computational methods have been proposed for analyzing RNA structures, their accuracy and efficiency are limited, especially when applied to the large RNAs and the genome-wide data sets. Recently, advances in parallel sequencing and high-throughput chemical probing technologies have prompted the development of numerous new algorithms, which can incorporate the auxiliary structural information obtained from those experiments. Their potential has been revealed by the secondary structure prediction of ribosomal RNAs and the genome-wide ncRNA function annotation. In this review, the existing probing-directed computational methods for RNA secondary and tertiary structure analysis are discussed.
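
    One concrete way probing data enter such methods is as pseudo-free-energy terms added to the folding model. The sketch below applies the commonly cited Deigan-style conversion dG = m * ln(reactivity + 1) + b to a few invented SHAPE reactivities; the slope and intercept are typical literature choices, not values taken from this review.

```python
import numpy as np

# Deigan-style SHAPE pseudo-free energies applied to paired nucleotides:
# low reactivity gives a pairing bonus, high reactivity a penalty.
m, b = 2.6, -0.8                       # kcal/mol, commonly cited values
reactivity = np.array([0.05, 1.9, 0.3, 0.0, 2.4])   # invented SHAPE data

dG = m * np.log(reactivity + 1.0) + b  # per-nucleotide pseudo-energy term
print(np.round(dG, 2))
```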

  1. Recent applications of the transonic wing analysis computer code, TWING

    Science.gov (United States)

    Subramanian, N. R.; Holst, T. L.; Thomas, S. D.

    1982-01-01

    An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of this code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.

  2. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  3. Preliminary results on the EUCLID NISP stray-light and ghost analysis

    Science.gov (United States)

    Geis, Norbert; Grupp, Frank; Prieto, Eric; Bender, Ralf

    2015-09-01

    The EUCLID mission within the European Space Agency's 2015-2025 Cosmic Vision framework addresses cosmological questions related to dark matter and dark energy. EUCLID is equipped with two instruments that simultaneously observe patches of > 0.5 square degree on the sky. The VIS visible-light high-spatial-resolution imager and the NISP near-infrared spectrometer and photometer are separated by a dichroic beam splitter. The large FoV (larger than the full moon disk), together with high demands on the optical performance and strong requirements on in-flight stability, leads to very challenging demands on alignment and on post-launch, post-cool-down optical element positions. In addition, the demanding requirements from spectroscopy and galaxy photometry call for highly demanding stray light and ghost control. With this paper we present a preliminary - PDR level - analysis of ghosting and stray light levels in the EUCLID NISP near-infrared spectrometer and photometer. The analysis presented focuses on the photometric channel, as this, together with the wide field of the instrument, exhibits most of the challenges and features of the instrument. As one requirement is to have a non-vignetting design, extensive baffling is not possible, and only secondary and higher-order light can be actively baffled. A comprehensive ZEMAX-based analysis is presented, showing in summary that baffles are only necessary due to the EUCLID fine guidance sensors' auxiliary fields of view. The total level of contaminating light is thereafter dominated by stray light from dust on the lenses. Ghosts play a minor role.

  4. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages prepared are thus highly versatile in nature and permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and computer software.
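
    As a sketch of the central quantity such packages evaluate, the del factor Del = ln(N0/N) is the time integral of a first-order thermal death rate k_d(T) over the batch temperature history. The example below uses an assumed Arrhenius rate law and an invented heating/holding/cooling profile; none of the numbers come from the paper.

```python
import numpy as np

# Illustrative sterilisation kinetics: first-order thermal death with an
# Arrhenius rate constant k_d = A * exp(-E / (R * T)). Values are assumed,
# of the order commonly quoted for bacterial spores.
A = 1.0e36      # pre-exponential factor, 1/s (assumed)
E = 2.83e5      # activation energy, J/mol (assumed)
R = 8.314       # gas constant, J/(mol K)

def del_factor(times, temps):
    """Del = ln(N0/N) = integral of k_d(T(t)) dt, by the trapezoidal rule."""
    k = A * np.exp(-E / (R * temps))
    return float(np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(times)))

# Invented heating / holding / cooling profile of a one-hour batch cycle
t = np.linspace(0.0, 3600.0, 601)                         # seconds
T = 303.0 + 90.0 * np.exp(-((t - 1800.0) / 700.0) ** 2)   # peak ~ 393 K

print(f"del factor over the cycle: {del_factor(t, T):.2f}")
```

    Sweeping the operating parameters (holding time, steam ratio, circulation rate) and recomputing Del in this way is exactly the kind of optimisation loop the CAD packages automate.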

  5. Crystallization and preliminary X-ray analysis of Chandipura virus glycoprotein G.

    Science.gov (United States)

    Baquero, Eduard; Buonocore, Linda; Rose, John K; Bressanelli, Stéphane; Gaudin, Yves; Albertini, Aurélie A

    2012-09-01

    Fusion in members of the Rhabdoviridae virus family is mediated by the G glycoprotein. At low pH, the G glycoprotein catalyzes fusion between viral and endosomal membranes by undergoing a major conformational change from a pre-fusion trimer to a post-fusion trimer. The structure of the G glycoprotein from vesicular stomatitis virus (VSV G), the prototype of Vesiculovirus, has recently been solved in its trimeric pre-fusion and post-fusion conformations; however, little is known about the structural details of the transition. In this work, a soluble form of the ectodomain of Chandipura virus G glycoprotein (CHAV G(th)) was purified using limited proteolysis of purified virus; this soluble ectodomain was also crystallized. This protein shares 41% amino-acid identity with VSV G and thus its structure could provide further clues about the structural transition of rhabdoviral glycoproteins induced by low pH. Crystals of CHAV G(th) obtained at pH 7.5 diffracted X-rays to 3.1 Å resolution. These crystals belonged to the orthorhombic space group P2(1)2(1)2, with unit-cell parameters a = 150.3, b = 228.2, c = 78.8 Å. Preliminary analysis of the data based on the space group and the self-rotation function indicated that there was no trimeric association of the protomers. This unusual oligomeric status could result from the presence of fusion intermediates in the crystal.

  6. Preliminary uncertainty analysis of OECD/UAM benchmark for the TMI-1 reactor

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Fabiano S.; Faria, Rochkhudson B.; Silva, Lucas M.C.; Pereira, Claubia; Fortini, Angela, E-mail: claubia@nuclear.ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2015-07-01

    Nowadays the demand from nuclear research centers for safety, regulation and best-estimate predictions provided with confidence bounds has been increasing. Accordingly, studies have pointed out that present uncertainties in the nuclear data should be significantly reduced to get the full benefit from the advanced modeling and simulation initiatives. The major outcome of the NEA/OECD (UAM) workshop, which took place in Italy in 2006, was the preparation of a benchmark work program with steps (exercises) that would be needed to define the uncertainty and modeling tasks. In that direction, this work was performed within the framework of UAM Exercise 1 (I-1), 'Cell Physics', to validate the study and to estimate the accuracy of the model. The objectives of this study were to make a preliminary analysis of the criticality values of the TMI-1 PWR and of the bias between the multiplication factors obtained from two different nuclear codes. The range of the bias was obtained using the deterministic codes NEWT (New ESC-based Weighting Transport code), the two-dimensional transport module that uses AMPX-formatted cross-sections processed by other SCALE modules, and the WIMSD5 (Winfrith Improved Multi-Group Scheme) code. The WIMSD5 system consists of a simplified geometric representation of heterogeneous space zones that are coupled with each other and with the boundaries, while the properties of each spacing element are obtained from the Carlson DSN method or the collision probability method. (author)

  7. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    Directory of Open Access Journals (Sweden)

    S. R. McNutt

    1996-06-01

    Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: (1) a section containing general information on volcanoes, (2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and (3) a section containing eruption information. The most abundant and reliable parameter, duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days, whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
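
    The two summary statistics used here, a geometric mean of durations and a correlation of long-swarm duration with log10 repose time, are straightforward to reproduce on any swarm table. The sketch below uses invented durations and repose times, since the database itself is not reproduced in this record.

```python
import numpy as np
from scipy import stats

# Invented swarm durations (days) and eruption repose times (years)
durations = np.array([2.0, 3.5, 5.0, 8.0, 12.0, 30.0, 150.0, 200.0, 400.0])
repose_yr = np.array([1.0, 2.0, 1.5, 10.0, 8.0, 40.0, 120.0, 150.0, 300.0])

# Geometric mean: exponential of the mean of the logs
geo_mean = np.exp(np.log(durations).mean())
print(f"geometric mean duration: {geo_mean:.1f} days")

# Second group of the study: swarms longer than 4 months (~120 days),
# correlated against log10 of the repose period
long_swarms = durations > 120.0
r, p = stats.pearsonr(durations[long_swarms], np.log10(repose_yr[long_swarms]))
print(f"long swarms: r = {r:.2f}, p = {p:.3f} (n = {long_swarms.sum()})")
```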

  8. In situ fluidization for peat bed rupture, and preliminary economic analysis.

    Science.gov (United States)

    Niven, R K; Khalili, N

    2002-11-01

    This study concerns in situ fluidization (ISF), a new remediation method with potential application to the remediation of NAPL and heavy-metal contaminants through their release from the fluidized zone generated by a water jet. The present study examines the effect of ISF on layers of peat, of significance owing to its role as an important NAPL and metal contaminant trap. Once trapped, such contaminants are not readily accessible by most remedial methods, due to the low permeability and diffusivity of the peat. A simple tank experiment is used to demonstrate rupture of a peat layer by ISF, with removal of the peat as elutriated fines and segregated peat chunks. The application of ISF in the field is then examined by three field trials in uncontaminated sands, in both saturated and unsaturated conditions. Fluidized depths of up to 1.9 m in the saturated zone (with refusal on a peat layer) and 2.5 m in the unsaturated zone (no refusal) were attained, using a 1.9-m-long, 50 mm diameter jet operated at 5-13 l s(-1). Pulses of dark turbidity and shell fragments in the effluent indicated the rupture of peat and shelly layers. The experiments demonstrate the hydraulic viability of ISF in the field, and its ability to remove peat-based contaminants. The issues of appropriate jet design and water generation during ISF are discussed, followed by a preliminary economic analysis of ISF relative to existing remediation methods.

  9. Cloning, expression, crystallization and preliminary X-ray data analysis of norcoclaurine synthase from Thalictrum flavum

    Energy Technology Data Exchange (ETDEWEB)

    Pasquo, Alessandra [ENEA Casaccia Research Centre, Dipartimento BIOTEC, Sezione Genetica e Genomica Vegetale, PO Box 2400, I-00100 Rome (Italy); Bonamore, Alessandra; Franceschini, Stefano; Macone, Alberto; Boffi, Alberto; Ilari, Andrea, E-mail: andrea.ilari@uniroma1.it [Istituto di Biologia e Patologia Molecolari, CNR (IBPM) and Department Of Biochemical Sciences, University of Roma ‘La Sapienza’, Piazza Aldo Moro 5, 00179 Roma (Italy); ENEA Casaccia Research Centre, Dipartimento BIOTEC, Sezione Genetica e Genomica Vegetale, PO Box 2400, I-00100 Rome (Italy)

    2008-04-01

    The cloning, expression, crystallization and preliminary X-ray data analysis of norcoclaurine synthase from T. flavum, a protein which catalyzes the first committed step in the biosynthesis of benzylisoquinoline alkaloids, are reported. Norcoclaurine synthase (NCS) catalyzes the condensation of 3,4-dihydroxyphenylethylamine (dopamine) and 4-hydroxyphenylacetaldehyde (4-HPAA) as the first committed step in the biosynthesis of benzylisoquinoline alkaloids in plants. The protein was cloned, expressed and purified. Crystals were obtained at 294 K by the hanging-drop vapour-diffusion method using ammonium sulfate and sodium chloride as precipitant agents and diffract to better than 3.0 Å resolution using a synchrotron-radiation source. The crystals belong to the trigonal space group P3(1)21, with unit-cell parameters a = b = 86.31, c = 118.36 Å. A selenomethionine derivative was overexpressed, purified and crystallized in the same space group. A complete MAD data set was collected at 2.7 Å resolution. The model is under construction.

  10. Acanthamoeba polyphaga mimivirus NDK: preliminary crystallographic analysis of the first viral nucleoside diphosphate kinase

    Energy Technology Data Exchange (ETDEWEB)

    Jeudy, Sandra [Information Génomique et Structurale, CNRS UPR 2589, 31 Chemin Joseph Aiguier, 13402 Marseille CEDEX 20 (France); Coutard, Bruno [Architecture et Fonction des Macromolecules Biologiques, CNRS UMR 6098, 31 Chemin Joseph Aiguier, 13402 Marseille CEDEX 20 (France); Lebrun, Régine [IBSM, 31 Chemin Joseph Aiguier, 13402 Marseille CEDEX 20 (France); Abergel, Chantal, E-mail: chantal.abergel@igs.cnrs-mrs.fr [Information Génomique et Structurale, CNRS UPR 2589, 31 Chemin Joseph Aiguier, 13402 Marseille CEDEX 20 (France)

    2005-06-01

    A. polyphaga mimivirus, the largest known double-stranded DNA virus, is the first virus to exhibit a nucleoside diphosphate kinase gene. The expression and crystallization of the viral NDK are reported. The complete sequence of the largest known double-stranded DNA virus, Acanthamoeba polyphaga mimivirus, has recently been determined [Raoult et al. (2004), Science, 306, 1344–1350] and revealed numerous genes not expected to be found in a virus. A comprehensive structural and functional study of these gene products was initiated [Abergel et al. (2005), Acta Cryst. F61, 212–215] both to better understand their role in the virus physiology and to obtain some clues to the origin of DNA viruses. Here, the preliminary crystallographic analysis of the viral nucleoside diphosphate kinase protein is reported. The crystal belongs to the cubic space group P2(1)3, with unit-cell parameter 99.425 Å. The self-rotation function confirms that there are two monomers per asymmetric unit related by a twofold non-crystallographic axis and that the unit cell thus contains four biological entities.

  11. N-Acetyltransferase 2 status and gastric cancer risk: a preliminary meta-analysis

    Directory of Open Access Journals (Sweden)

    Stefania Boccia

    2005-03-01

    In recent studies the N-acetyltransferase 2 (NAT2) genotype has been considered as a risk factor for developing gastric cancer, however with conflicting results among Asian and Caucasian populations. In order to clarify the influence of NAT2 slow-acetylation status on gastric cancer risk, a preliminary meta-analysis of published case-control studies was undertaken.

    The primary outcome measure was the odds ratio (OR) for the risk of gastric cancer associated with the NAT2 slow genotype, using a random-effects model. Pooling the results from the 5 studies identified (771 cases, 1083 controls), an overall OR for gastric cancer risk associated with the NAT2 slow genotype of 0.91 emerged (95% CI: 0.54-1.55).

    The result suggests that the NAT2 slow genotype probably has no effect on the risk of gastric cancer. Additional epidemiological studies, based on sample sizes that are commensurate with the detection of small genotypic risks, are required to confirm these findings. Future studies may also help to clarify whether geographic differences exist.
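
    A random-effects pooled OR with a confidence interval, as reported here, is conventionally computed with the DerSimonian-Laird estimator. The sketch below shows that standard calculation on invented per-study log odds ratios and variances; the five actual studies are not tabulated in this record.

```python
import numpy as np

# Invented per-study log odds ratios and variances (not the real studies)
y = np.array([0.10, -0.35, 0.25, -0.20, 0.05])   # log(OR) per study
v = np.array([0.09, 0.12, 0.15, 0.08, 0.20])     # var(log OR) per study

# DerSimonian-Laird random-effects pooling
w = 1.0 / v                                      # fixed-effect weights
y_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fe) ** 2)                  # Cochran heterogeneity statistic
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1.0 / (v + tau2)                          # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))

lo, hi = np.exp(y_re - 1.96 * se), np.exp(y_re + 1.96 * se)
print(f"pooled OR = {np.exp(y_re):.2f} (95% CI: {lo:.2f}-{hi:.2f}), tau2 = {tau2:.3f}")
```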

  12. Preliminary Analysis of Two Years of the Massive Collision Monitoring Activity

    Science.gov (United States)

    McKnight, Darren; Matney, Mark; Walbert, Kris; Behrend, Sophie; Casey, Patrick; Speaks, Seth

    2017-01-01

    It is hypothesized that the interactions between many of the most massive derelicts in low Earth orbit are more frequent than modeled by the traditional combination of the kinetic theory of gases and the Poisson probability distribution function. This is suggested by the fact that there are clusters of derelicts where members' inclinations are nearly identical and their apogees/perigees overlap significantly, resulting in periodic synchronization of the objects' orbits. In order to address this proposition, an experiment was designed and conducted over the last two years. Results from this monitoring and characterization experiment are presented with implications for proposed debris remediation strategies. Four separate clusters of massive derelicts were examined that are centered around 775 km, 850 km, 975 km, and 1500 km, respectively. In aggregate, the constituents of these clusters contain around 500 objects and about 800,000 kg of mass; this equates to a third of all derelict mass in LEO. Preliminary analysis indicates that encounter rates over this time period for these objects are greater than is estimated by traditional techniques. Hypothesized dependencies between latitude of encounter, relative velocity, frequency of encounters, inclination, and differential semi-major axis were established and verified. This experiment also identified specific repeatable cluster dynamics that may reduce the cost/risk and enhance the effectiveness of debris remediation activities and also enable new operational debris remediation options.
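
    The "traditional combination" referred to treats derelicts like gas molecules: an encounter rate lambda = n * sigma * v_rel * T feeds a Poisson law for the number of events. A minimal sketch of that baseline, with assumed order-of-magnitude inputs rather than the paper's data:

```python
import math

# Kinetic-theory baseline for encounters within a derelict cluster.
# Every number below is an assumed, order-of-magnitude illustration.
n_density = 1.0e-18   # spatial density of cluster members, objects / m^3
radius = 5.0          # combined hard-body radius of an object pair, m
v_rel = 1.0e3         # typical relative velocity within the cluster, m/s
years = 2.0           # length of the monitoring period

sigma = math.pi * radius ** 2                               # cross-section, m^2
lam = n_density * sigma * v_rel * years * 365.25 * 86400.0  # expected events

# Poisson probability of at least one encounter over the period
p_any = 1.0 - math.exp(-lam)
print(f"expected encounters = {lam:.3e}, P(>=1) = {p_any:.3e}")
```

    The paper's point is that synchronized, near-co-planar clusters violate the well-mixed assumption behind this estimate, so observed encounter rates can exceed it.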

  13. GC-MS Analysis and Preliminary Antimicrobial Activity of Albizia adianthifolia (Schumach) and Pterocarpus angolensis (DC)

    Directory of Open Access Journals (Sweden)

    Mustapha N. Abubakar

    2016-01-01

    The non-polar components of two Leguminosae species, Albizia adianthifolia (Schumach) and Pterocarpus angolensis (DC), were investigated. GC-MS analysis of the crude n-hexane and chloroform extracts, together with several chromatographic separation techniques, led to the identification and characterization (using NMR) of sixteen known compounds from the heartwood and stem bark of Albizia adianthifolia and Pterocarpus angolensis respectively. These constituents include n-hexadecanoic acid (palmitic acid) 1, oleic acid 2, chondrillasterol 3, stigmasterol 4, 24S 5α-stigmast-7-en-3β-ol 5, 9,12-octadecadienoic acid (Z,Z)-, methyl ester 6, trans-13-octadecanoic acid, methyl ester 7, tetradecanoic acid 8, hexadecanoic acid, methyl ester 9, octadecanoic acid 10, tetratriacontane 11, 7-dehydrodiosgenin 12, lupeol 13, stigmasta-3,5-diene-7-one 14, friedelan-3-one (friedelin) 15, and 1-octacosanol 16. Using the agar overlay method, a preliminary antimicrobial assay of the extracts was carried out against bacterial (E. coli, P. aeruginosa, B. subtilis, S. aureus) and fungal/yeast (C. albicans) strains. The n-hexane and chloroform extracts of A. adianthifolia showed the best activity against E. coli, with a minimum inhibition quantity (MIQ) of 1 µg each, while the remaining extracts exhibited moderate-to-weak activity against the test microorganisms.

  14. Satellite geological and geophysical remote sensing of Iceland: Preliminary results from analysis of MSS imagery

    Science.gov (United States)

    Williams, R. S., Jr.; Boedvarsson, A.; Fridriksson, S.; Palmason, G.; Rist, S.; Sigtryggsson, H.; Thorarinsson, S.; Thorsteinsson, I.

    1973-01-01

    A binational, multidisciplinary research effort in Iceland is directed at an analysis of MSS imagery from ERTS-1 to study a variety of geologic, hydrologic, oceanographic, and agricultural phenomena. A preliminary evaluation of available MSS imagery of Iceland has yielded several significant results, some of which may have direct importance to the Icelandic economy. Initial findings can be summarized as follows: (1) recent lava flows can be delineated from older flows at Askja and Hekla; (2) MSS imagery from ERTS-1 and VHRR visible and infrared imagery from NOAA-2 recorded the volcanic eruption on Heimaey, Vestmann Islands; (3) coastline changes, particularly changes in the position of bars and beaches along the south coast, are mappable; and (4) areas covered with new and residual snow can be mapped, and newly fallen snow on ERTS-1 MSS band 7 appears dark where it is melting. ERTS-1 imagery provides a means of updating various types of maps of Iceland and will permit the compilation of special maps specifically aimed at those dynamic environmental phenomena which impact the Icelandic economy.

  15. Analysis of the preliminary optical links between ARTEMIS and the Optical Ground Station

    Science.gov (United States)

    Reyes Garcia-Talavera, Marcos; Chueca, Sergio; Alonso, Angel; Viera, Teodora; Sodnik, Zoran

    2002-12-01

    In the frame of the SILEX project, the European Space Agency (ESA) has put into orbit two laser communication terminals to establish an experimental free-space optical communication link between a GEO satellite (ARTEMIS) and a LEO satellite (SPOT IV), to relay Earth-observation data. In order to perform In-Orbit Testing (IOT) of these and other optical communications systems, ESA and the Instituto de Astrofisica de Canarias (IAC) reached an agreement for building the Optical Ground Station (OGS) at the Teide Observatory of the IAC. With ARTEMIS placed in a circular parking orbit at about 31000 kilometres, its optical payload has been preliminarily tested with the OGS. First results and analysis are presented on the space-to-ground bi-directional link, including pointing, acquisition and tracking performance, Bit-Error Rate (BER) and transmitted-beam divergence effects related to atmospheric models and predictions. Future plans include deeper optical bi-directional communication tests of the OGS, not only with ARTEMIS but also with the OSCAR-40 (downlink) and SMART-1 (uplink) satellites, in order to fully characterise the performance of laser beam propagation through atmospheric turbulence and compare it with theoretical predictions.

  16. Marine Traffic Density Over Port Klang, Malaysia Using Statistical Analysis of AIS Data: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Masnawi MUSTAFFA

    2016-12-01

    Port Klang, Malaysia is the 13th busiest port in the world, and its capacity is expected to meet demand until 2018. It is one of the busiest ports in the world and also the busiest port in Malaysia. Although statistics published by the Port Klang Authority show that many ships use this port, that count is based only on ships entering Port Klang. Therefore, no study has been done to investigate how dense the traffic is in Port Klang, Malaysia and the surrounding sea, including the Strait of Malacca. This paper investigates traffic density over Port Klang, Malaysia and its surrounding sea using statistical analysis of AIS data. As a preliminary study, AIS data were collected for 7 days only, to represent weekly traffic on a daily basis. As a result, hourly numbers of vessels, daily numbers of vessels, vessel classifications and sizes, and traffic paths are plotted.
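
    The core of such a density study is aggregating decoded AIS position reports into per-hour and per-day counts of distinct vessels. A minimal sketch with pandas, assuming a CSV of already-decoded reports with mmsi and timestamp columns; the file and column names are illustrative, not from the paper.

```python
import pandas as pd

# Assumed input: one decoded AIS position report per row, with at least an
# 'mmsi' (vessel identifier) and a 'timestamp' column. File name is invented.
reports = pd.read_csv("ais_port_klang.csv", parse_dates=["timestamp"])

# Hourly traffic density: count *distinct* vessels per hour, so a ship that
# reports every few seconds is not counted hundreds of times.
hourly = (reports
          .set_index("timestamp")
          .groupby(pd.Grouper(freq="h"))["mmsi"]
          .nunique())

daily = reports.groupby(reports["timestamp"].dt.date)["mmsi"].nunique()

print(hourly.head(24))   # vessels per hour over the first day
print(daily)             # vessels per day across the 7-day sample
```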

  17. GC-MS Analysis and Preliminary Antimicrobial Activity of Albizia adianthifolia (Schumach) and Pterocarpus angolensis (DC).

    Science.gov (United States)

    Abubakar, Mustapha N; Majinda, Runner R T

    2016-01-28

    The non-polar components of two Leguminosae species, Albizia adianthifolia (Schumach) and Pterocarpus angolensis (DC), were investigated. GC-MS analysis of the crude n-hexane and chloroform extracts, together with several chromatographic separation techniques, led to the identification and characterization (using NMR) of sixteen known compounds from the heartwood and stem bark of Albizia adianthifolia and Pterocarpus angolensis respectively. These constituents include n-hexadecanoic acid (palmitic acid) 1, oleic acid 2, chondrillasterol 3, stigmasterol 4, 24S 5α-stigmast-7-en-3β-ol 5, 9,12-octadecadienoic acid (Z,Z)-, methyl ester 6, trans-13-octadecanoic acid, methyl ester 7, tetradecanoic acid 8, hexadecanoic acid, methyl ester 9, octadecanoic acid 10, tetratriacontane 11, 7-dehydrodiosgenin 12, lupeol 13, stigmasta-3,5-diene-7-one 14, friedelan-3-one (friedelin) 15, and 1-octacosanol 16. Using the agar overlay method, a preliminary antimicrobial assay of the extracts was carried out against bacterial (E. coli, P. aeruginosa, B. subtilis, S. aureus) and fungal/yeast (C. albicans) strains. The n-hexane and chloroform extracts of A. adianthifolia showed the best activity against E. coli, with a minimum inhibition quantity (MIQ) of 1 µg each, while the remaining extracts exhibited moderate-to-weak activity against the test microorganisms.

  18. Bioinformatics analysis of the gene expression profile of hepatocellular carcinoma: preliminary results

    Science.gov (United States)

    Li, Jia

    2016-01-01

    Aim of the study: To analyse the expression profile of hepatocellular carcinoma compared with normal liver by using bioinformatics methods. Material and methods: In this study, we analysed the microarray expression data of HCC and adjacent normal liver samples from the Gene Expression Omnibus (GEO) database to screen for differentially expressed genes. Then, functional analyses were performed using GenCLiP analysis, Gene Ontology categories, and aberrant pathway identification. In addition, we used the CMap database to identify small molecules that can induce HCC. Results: Overall, 2721 differentially expressed genes (DEGs) were identified. We found 180 metastasis-related genes and constructed co-occurrence networks. Several significant pathways, including the transforming growth factor β (TGF-β) signalling pathway, were identified as closely related to these DEGs. Some candidate small molecules (such as betahistine) were identified that might provide a basis for developing HCC treatments in the future. Conclusions: Although we functionally analysed the differences in the gene expression profiles of HCC and normal liver tissues, our study is essentially preliminary, and it may be premature to apply our results to clinical trials. Further research and experimental testing are required in future studies. PMID:27095935
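
    Screening for differentially expressed genes between tumour and normal arrays typically reduces to a per-gene test plus multiple-testing correction. The hedged sketch below runs a t-test with Benjamini-Hochberg FDR control on a synthetic expression matrix; the actual GEO series used in the paper is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic log2 expression matrix: 1000 genes x (10 HCC + 10 normal)
# samples, with 50 genes artificially up-regulated in the tumours.
n_genes, n_hcc, n_norm = 1000, 10, 10
X = rng.normal(8.0, 1.0, size=(n_genes, n_hcc + n_norm))
X[:50, :n_hcc] += 2.0

t, p = stats.ttest_ind(X[:, :n_hcc], X[:, n_hcc:], axis=1)

# Benjamini-Hochberg FDR correction
order = np.argsort(p)
q_sorted = p[order] * n_genes / np.arange(1, n_genes + 1)
q_sorted = np.minimum.accumulate(q_sorted[::-1])[::-1]   # enforce monotonicity
q = np.empty(n_genes)
q[order] = np.clip(q_sorted, 0.0, 1.0)

log2fc = X[:, :n_hcc].mean(axis=1) - X[:, n_hcc:].mean(axis=1)
deg = (q < 0.05) & (np.abs(log2fc) > 1.0)
print(f"{deg.sum()} DEGs at FDR < 0.05 and |log2FC| > 1")
```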

  19. Occupant evaluation of commercial office lighting: Volume 2, Preliminary data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marans, R.W.; Brown, M.A. (ed.)

    1987-11-01

    This report presents the preliminary results of a post-occupancy evaluation of office lighting environments. It explores the relationship between quantitative measures of lighting in occupied environments and qualitative measures related to occupant satisfaction. The report analyzes several types of data from more than 1000 occupied work stations: subjective data on attitudes and ratings of selected lighting and other characteristics; photometric and other direct environmental data, including illuminances, luminances, and contrast conditions; indirect environmental measures obtained from the architectural drawings and the work station photographs; and descriptive characteristics of the occupants. The work stations were sampled from thirteen office buildings located in various cities in the United States. Many tentative findings emerged from the analysis, including the following: (1) within the range of values examined here, there is a tendency for lighting satisfaction to decrease as lighting power density increases; (2) occupants who described their work station spaces as bright also tended to be satisfied with their work station lighting; (3) occupants who were most bothered by bright lights and glare were most likely to express dissatisfaction with the lighting at their work stations; (4) there is no relationship between work-related activities of employees and indicators of lighting quality. More research is needed before firm conclusions can be drawn and before guidance regarding lighting standards and other policy issues can be derived. 3 refs., 9 figs.

  20. Hydrothermal Liquefaction and Upgrading of Municipal Wastewater Treatment Plant Sludge: A Preliminary Techno-Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Snowden-Swan, Lesley J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhu, Yunhua [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jones, Susanne B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schmidt, Andrew J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hallen, Richard T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Billing, Justin M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hart, Todd R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fox, Samuel P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Maupin, Gary D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-08

    A preliminary process model and techno-economic analysis (TEA) was completed for fuel produced from hydrothermal liquefaction (HTL) of sludge waste from a municipal wastewater treatment plant (WWTP) and subsequent biocrude upgrading. The model is adapted from previous work by Jones et al. (2014) for algae HTL, using experimental data generated in fiscal year 2015 (FY15) bench-scale HTL testing of sludge waste streams. Testing was performed on sludge samples received from Metro Vancouver's Annacis Island WWTP (Vancouver, B.C.) as part of a collaborative project with the Water Environment and Reuse Foundation (WERF). The full set of sludge HTL testing data from this effort will be documented in a separate report to be issued by WERF. This analysis is based on limited testing data and therefore should be considered preliminary. Future refinements are necessary to improve the robustness of the model, including a cross-check of modeled biocrude components against the experimental GC-MS data and an investigation of the equipment costs most appropriate at the smaller scales used here. Environmental sustainability metrics analysis is also needed to understand the broader impact of this technology pathway. The base-case scenario for the analysis consists of 10 HTL plants, each processing 100 dry U.S. ton/day (92.4 ton/day on a dry, ash-free basis) of sludge waste and producing 234 barrels per stream day (BPSD) of biocrude, feeding into a centralized biocrude upgrading facility that produces 2,020 BPSD of final fuel. This scale was chosen based upon initial wastewater treatment plant data collected by the resource assessment team from the EPA's Clean Watersheds Needs Survey database (EPA 2015a) and a rough estimate of the potential sludge availability within a 100-mile radius. In addition, we received valuable feedback from the wastewater treatment industry as part of the WERF collaboration that helped form the basis for the selected HTL and upgrading plant scales.

  1. Brook trout (Salvelinus fontinalis) extinction in small boreal lakes revealed by ephippia pigmentation: a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Alexandre Bérubé Tellier

    2016-12-01

    Ephippium pigmentation is a plastic trait which can be related to a trade-off between visual predation pressure and better protection of cladoceran eggs against different types of stress. Experimental studies showed that planktivorous fish exert a greater predation pressure on individuals carrying darker ephippia, but little is known about the variation of ephippium pigmentation along gradients of fish predation pressure in natural conditions. For this study, our experimental design included four small boreal lakes with known fish assemblages. Two of the lakes have viable brook trout (Salvelinus fontinalis) populations, whereas the other two lakes experienced brook trout extinctions during the 20th century. Cladoceran ephippia were extracted from sediment cores at layers corresponding to the documented post-extinction phase (1990s) and from an older layer (1950s) for which the brook trout population status is not known precisely. Our first objective was to determine whether brook trout extinction has a direct effect on both ephippium pigmentation and size. Our second objective was to give a preliminary assessment of the status of brook trout populations in the 1950s by comparing the variation in ephippia traits measured from this layer to those measured in the 1990s, for which the extinction patterns are well known. Cost-effective image analysis was used to assess variation in pigmentation levels in ephippia. This approach provided a proxy for the amount of melanin invested in each ephippium analysed. Our study clearly shows that ephippium pigmentation may represent a better indicator of the presence of fish predators than ephippium size, a trait that showed a less clear pattern of variation between lakes with and without fish. For the 1990s period, ephippia from fishless lakes were darker and showed a slight tendency to be larger than ephippia from lakes with brook trout. However, no clear differences in either ephippium size or pigmentation

  2. Analysis and computational dissection of molecular signature multiplicity.

    Directory of Open Access Journals (Sweden)

    Alexander Statnikov

    2010-05-01

    Molecular signatures are computational or mathematical models created to diagnose disease and other phenotypes and to predict clinical outcomes and response to treatment. It is widely recognized that molecular signatures constitute one of the most important translational and basic science developments enabled by recent high-throughput molecular assays. A perplexing phenomenon that characterizes high-throughput data analysis is the ubiquitous multiplicity of molecular signatures. Multiplicity is a special form of data analysis instability in which different analysis methods used on the same data, or different samples from the same population lead to different but apparently maximally predictive signatures. This phenomenon has far-reaching implications for biological discovery and development of next generation patient diagnostics and personalized treatments. Currently the causes and interpretation of signature multiplicity are unknown, and several, often contradictory, conjectures have been made to explain it. We present a formal characterization of signature multiplicity and a new efficient algorithm that offers theoretical guarantees for extracting the set of maximally predictive and non-redundant signatures independent of distribution. The new algorithm identifies exactly the set of optimal signatures in controlled experiments and yields signatures with significantly better predictivity and reproducibility than previous algorithms in human microarray gene expression datasets. Our results shed light on the causes of signature multiplicity, provide computational tools for studying it empirically and introduce a framework for in silico bioequivalence of this important new class of diagnostic and personalized medicine modalities.

  3. Computing the surveillance error grid analysis: procedure and examples.

    Science.gov (United States)

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337 561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BGM data into 8 risk zones ranging from none to extreme and (2) present the data in a color-coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software we first present computer-simulated data stratified along error levels defined by ISO 15197:2013. This allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is done with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings.
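
    The automated step described, namely looking up each (reference, measured) pair in the clinician-rating matrix and then binning into eight zones with percentages, can be sketched as follows. The risk surface below is a smooth placeholder formula, not the real 337 561-entry SEG matrix.

```python
import numpy as np

# Placeholder risk surface on the 20-580 mg/dl grid (561 x 561). The real
# SEG uses clinician-assigned ratings; this formula is only an illustration.
bg = np.arange(20, 581, dtype=float)
risk = 2.0 * np.abs(np.log(bg[None, :] / bg[:, None]))   # rows: reference

def seg_zones(ref, meas, surface, edges=(0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5)):
    """Assign each (reference, measured) pair to one of 8 risk zones."""
    r = surface[np.clip(ref, 20, 580) - 20, np.clip(meas, 20, 580) - 20]
    return np.digitize(r, edges)          # zone 0 = none ... zone 7 = extreme

ref = np.array([100, 100, 250, 60, 400])
meas = np.array([102, 180, 240, 130, 90])
zones = seg_zones(ref, meas, risk)
for z in range(8):
    print(f"zone {z}: {100.0 * np.mean(zones == z):.0f}% of pairs")
```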

  4. Evaluation of CO2 migration and formation storage capacity in the Dalders formations, Baltic Sea - Preliminary analysis by means of models of increasing complexity

    Science.gov (United States)

    Niemi, Auli; Yang, Zhibing; Tian, Liang; Jung, Byeongju; Fagerlund, Fritjof; Joodaki, Saba; Pasquali, Riccardo; O'Neill, Nick; Vernon, Richard

    2014-05-01

    We present a preliminary data analysis and modeling of CO2 injection into selected parts of the Dalders Monocline and Dalders Structure, formations situated under the Baltic Sea and of potential interest for CO2 geological storage. The approach taken is to use models of successively increasing complexity, thereby increasing the confidence and reliability of the predictions. The objective is to obtain order-of-magnitude estimates of the behavior of the formations during a potential industrial-scale CO2 injection and the subsequent storage period. The focus has been on the regions with the best cap-rock characteristics, according to present knowledge. Data have been compiled from the various sources available, such as boreholes within the region. As a first approximation we use analytical solutions, in order to get an initial estimate of the CO2 injection rates that can be used without causing unacceptable pressure increases. These preliminary values are then used as the basis for more detailed numerical analyses with the TOUGH2/TOUGH2-MP simulator (e.g. Zhang et al, 2008) and vertical-equilibrium-based models (e.g. Gasda et al, 2009). With the numerical models, variations in material properties, formation thickness, etc., as well as additional processes such as CO2 dissolution, can also be taken into account. The presentation discusses results from these preliminary analyses in terms of estimated storage capacity, the CO2 and pressure plume extents caused by various injection scenarios, and the CO2 travel time after the end of injection. The effects of factors such as the number of injection wells and their positioning, the formation properties and the boundary conditions are discussed, as are the benefits and disadvantages of the various modeling approaches used. References: Gasda S.E. et al, 2009. Computational Geosciences 13, 469-481. Zhang et al, 2008. Report LBNL-315E, Lawrence Berkeley National Laboratory.
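
    A first-cut analytical screen of the kind described (what injection rate keeps the well overpressure acceptable) can come from the steady-state radial Darcy solution. The sketch below uses that textbook formula with assumed formation properties; none of the values are Dalders data.

```python
import math

# Steady-state radial Darcy flow: dP = Q * mu / (2 pi k H) * ln(re / rw).
# All property values are assumed for illustration, not Dalders data.
Q = 0.045         # volumetric injection rate, m^3/s (~1 Mt/yr of dense CO2)
mu = 6.0e-5       # CO2 viscosity at reservoir conditions, Pa s
k = 1.0e-13       # permeability, m^2 (about 100 mD)
H = 50.0          # formation thickness, m
rw = 0.1          # well radius, m
re = 5000.0       # assumed radius of influence, m

dP = Q * mu / (2.0 * math.pi * k * H) * math.log(re / rw)
print(f"steady-state overpressure at the well: {dP / 1e6:.2f} MPa")
```

    Inverting the same formula for the maximum tolerable overpressure gives the order-of-magnitude per-well injection rate, which is then refined with the numerical models.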

  5. Preliminary Analysis of Effects of Reduced Discharge onThermal Habitat of Pedersen Warm Springs Channel

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — A preliminary report to study the potential impacts of possible flow reductions in thermal spring systems located in the Warm Springs area of Moapa Valley NWR on the...

  6. Analysis of Network Performance for Computer Communication Systems with Benchmark

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper introduces a performance-evaluation approach for computer communication systems based on simulation and measurement technology, and discusses its evaluation models. The results of our experiment showed that the outcome of practical measurement on an Ethernet LAN fitted well with the theoretical analysis. The approach we present can be used to define various kinds of artificially simulated load models conveniently, to build all kinds of network application environments in a flexible way, and to exploit fully the widely-used, high-precision features of traditional simulation technology and the reality, reliability and adaptability features of measurement technology.

  7. Dynamical Analysis of a Computer Virus Model with Delays

    Directory of Open Access Journals (Sweden)

    Juan Liu

    2016-01-01

    An SIQR computer virus model with two delays is investigated in the present paper. The linear stability conditions are obtained by using the characteristic-root method, and the asymptotic analysis developed shows that a Hopf bifurcation occurs when the delay parameter reaches a critical value. Moreover, the direction of the Hopf bifurcation and the stability of the bifurcating periodic solutions are investigated by using the normal form theory and the center manifold theorem. Finally, numerical investigations are carried out to show the feasibility of the theoretical results.
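
    Numerical investigation of such delayed models usually relies on a history buffer so the right-hand side can read state values tau time units in the past. Below is a hedged sketch of a generic SIQR-style system with a single recovery delay, integrated with fixed-step Euler; the equations and parameters are illustrative assumptions, not the model analysed in the paper.

```python
import numpy as np

# Illustrative SIQR-style dynamics in which infected nodes recover a fixed
# delay tau after infection; not the paper's exact model or parameters.
beta, delta, gamma, tau = 0.5, 0.2, 0.3, 5.0
dt, T = 0.01, 200.0
n, lag = int(T / dt), int(tau / dt)

S, I, Q, R = (np.empty(n) for _ in range(4))
S[0], I[0], Q[0], R[0] = 0.99, 0.01, 0.0, 0.0

for k in range(n - 1):
    I_lag = I[k - lag] if k >= lag else I[0]    # constant history before t = 0
    dS = -beta * S[k] * I[k]
    dI = beta * S[k] * I[k] - delta * I[k] - gamma * I_lag
    dQ = delta * I[k] - delta * Q[k]
    dR = gamma * I_lag + delta * Q[k]
    S[k + 1] = S[k] + dt * dS
    I[k + 1] = I[k] + dt * dI
    Q[k + 1] = Q[k] + dt * dQ
    R[k + 1] = R[k] + dt * dR

print(f"final state: S={S[-1]:.3f} I={I[-1]:.3f} Q={Q[-1]:.3f} R={R[-1]:.3f}")
# Sweeping tau and watching for sustained oscillations in I(t) is the
# numerical counterpart of locating the Hopf bifurcation threshold.
```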

  8. Parameter estimation and error analysis in environmental modeling and computation

    Science.gov (United States)

    Kalmaz, E. E.

    1986-01-01

    A method for the estimation of parameters and error analysis in the development of nonlinear modeling for environmental impact assessment studies is presented. The modular computer program can interactively fit different nonlinear models to the same set of data, dynamically changing the error structure associated with observed values. Parameter estimation techniques and sequential estimation algorithms employed in parameter identification and model selection are first discussed. Then, least-square parameter estimation procedures are formulated, utilizing differential or integrated equations, and are used to define a model for association of error with experimentally observed data.
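
    The least-squares machinery described maps directly onto standard tooling. A minimal sketch that fits an assumed first-order-decay model to noisy observations and reports parameter standard errors from the estimated covariance matrix; the model and data are invented for illustration and are not from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented nonlinear model: first-order decay toward a background level.
def model(t, c0, k, b):
    return c0 * np.exp(-k * t) + b

rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 40)
y_obs = model(t, 5.0, 0.7, 1.0) + rng.normal(0.0, 0.15, t.size)

# Weighted least squares; 'sigma' carries the assumed error structure
# associated with the observed values.
popt, pcov = curve_fit(model, t, y_obs, p0=[4.0, 0.5, 0.5],
                       sigma=np.full(t.size, 0.15), absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))              # one-sigma parameter uncertainties

for name, est, err in zip(["c0", "k", "b"], popt, perr):
    print(f"{name} = {est:.3f} +/- {err:.3f}")
```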

  9. Introduction to Numerical Computation - analysis and Matlab illustrations

    DEFF Research Database (Denmark)

    Elden, Lars; Wittmeyer-Koch, Linde; Nielsen, Hans Bruun

    In a modern programming environment like e.g. MATLAB it is possible by simple commands to perform advanced calculations on a personal computer. In order to use such a powerful tool efficiently it is necessary to have an overview of available numerical methods and algorithms and to know about their properties. The book describes and analyses numerical methods for error analysis, differentiation, integration, interpolation and approximation, and the solution of nonlinear equations, linear systems of algebraic equations and systems of ordinary differential equations. Principles and algorithms...

  10. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  11. Vortex dominated flows. Analysis and computation for multiple scale phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Ting, L. [New York Univ., NY (United States). Courant Inst. of Mathematical Sciences; Klein, R. [Freie Univ. Berlin (Germany). Fachbereich Mathematik und Informatik; Knio, O.M. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Mechanical Engineering

    2007-07-01

    This monograph provides in-depth analyses of vortex dominated flows via matched and multiscale asymptotics, and demonstrates how insight gained through these analyses can be exploited in the construction of robust, efficient, and accurate numerical techniques. The book explores the dynamics of slender vortex filaments in detail, including fundamental derivations, compressible core structure, weakly non-linear limit regimes, and associated numerical methods. Similarly, the volume covers asymptotic analysis and computational techniques for weakly compressible flows involving vortex-generated sound and thermoacoustics. The book is addressed to both graduate students and researchers. (orig.)

  12. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it presents all the well-known algorithms in detail, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  13. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noises is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  14. Computational issue in the analysis of adaptive control systems

    Science.gov (United States)

    Kosut, Robert L.

    1989-01-01

    Adaptive systems under slow parameter adaptation can be analyzed by the method of averaging. This provides a means to assess stability (and instability) properties of most adaptive systems, either continuous-time or (more importantly for practice) discrete-time, as well as providing an estimate of the region of attraction. Although the method of averaging is conceptually straightforward, even simple examples are well beyond hand calculation. Specific software tools are proposed which can provide the basis for a user-friendly environment in which to perform the computations involved in the averaging analysis.

  15. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    Science.gov (United States)

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and pharmaceutical product manufacturing are long processes facing global competition. As technology evolves, the required levels of quality, safety and reliability increase simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing deep investigation of products, and it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed the suitability of CT for verification of integrity, measurements and defect detection in a non-destructive manner.

  16. Computational geometry assessment for morphometric analysis of the mandible.

    Science.gov (United States)

    Raith, Stefan; Varga, Viktoria; Steiner, Timm; Hölzle, Frank; Fischer, Horst

    2017-01-01

    This paper presents a fully automated algorithm for geometry assessment of the mandible. Anatomical landmarks could be reliably detected, and distances were statistically evaluated with principal component analysis. For the first time, the method allows the generation of a mean mandible shape, with statistically valid geometric variation, based on a large set of 497 CT scans of human mandibles. The data may be used in bioengineering for designing novel oral implants, for planning computer-guided surgery, and for improving biomechanical models, as it is shown that commercially available mandible replicas differ significantly from the mean of the investigated population.

  17. Pharmacognostic and Preliminary Phytochemical Analysis of Sauropus androgynus (L) Merr. Leaf

    OpenAIRE

    Ankad Gireesh; Hegde Harsha; Kholkute S.D; Hurkadale Pramod

    2013-01-01

    The leaves of Sauropus androgynus (L.) Merr. are used in traditional medicine to treat various disorders and are also eaten as a vegetable for their nutritive value. Despite its medicinal and nutritive value, the plant lacks pharmacognostical and preliminary phytochemical documentation. Hence the present work undertakes pharmacognostical and preliminary phytochemical studies, which will serve as quality-control parameters. The pharmacognostical parameters like transverse section of midrib, epiderm...

  18. Early detection of breast cancer using total biochemical analysis of peripheral blood components: a preliminary study.

    Science.gov (United States)

    Zelig, Udi; Barlev, Eyal; Bar, Omri; Gross, Itai; Flomen, Felix; Mordechai, Shaul; Kapelushnik, Joseph; Nathan, Ilana; Kashtan, Hanoch; Wasserberg, Nir; Madhala-Givon, Osnat

    2015-05-15

    Most blood tests aiming at breast cancer screening rely on quantification of a single biomarker or a few biomarkers. The aim of this study was to evaluate the feasibility of detecting breast cancer by analyzing the total biochemical composition of plasma as well as peripheral blood mononuclear cells (PBMCs) using infrared spectroscopy. Blood was collected from 29 patients with confirmed breast cancer and 30 controls with benign or no breast tumors, undergoing screening for breast cancer. PBMCs and plasma were isolated and dried on a zinc selenide slide and measured under a Fourier transform infrared (FTIR) microscope to obtain their infrared absorption spectra. Differences in the spectra of PBMCs and plasma between the groups were analyzed, as well as the specific influence of the relevant pathological characteristics of the cancer patients. Several bands in the FTIR spectra of both blood components significantly distinguished patients with and without cancer. Employing feature extraction with quadratic discriminant analysis, a sensitivity of ~90% and a specificity of ~80% for breast cancer detection were achieved. These results were confirmed by Monte Carlo cross-validation. Further analysis of the cancer group revealed an influence of several clinical parameters, such as the involvement of lymph nodes, on the infrared spectra, with each blood component affected by different parameters. The present preliminary study suggests that FTIR spectroscopy of PBMCs and plasma is a potentially feasible and efficient tool for the early detection of breast neoplasms. An important application of our study is the distinction between benign lesions (considered as part of the non-cancer group) and malignant tumors, thus reducing false-positive results at screening. Furthermore, the correlation of specific spectral changes with clinical parameters of cancer patients indicates a possible contribution to diagnosis and prognosis.
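
    The classification pipeline described, feature extraction followed by quadratic discriminant analysis validated by Monte Carlo (repeated random split) cross-validation, can be sketched as below. The "spectra" are synthetic stand-ins with an assumed class-dependent band shift; the real FTIR data are not reproduced in this record.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Synthetic stand-in for FTIR spectra: 59 subjects x 400 wavenumber bins,
# with a small class-dependent shift planted in a few bands.
n_cancer, n_control, n_bins = 29, 30, 400
X = rng.normal(0.0, 1.0, size=(n_cancer + n_control, n_bins))
X[:n_cancer, 100:110] += 0.8
y = np.r_[np.ones(n_cancer), np.zeros(n_control)]

# Feature extraction (here PCA) followed by QDA, scored with Monte Carlo
# cross-validation: many random 70/30 train-test splits.
clf = make_pipeline(PCA(n_components=5), QuadraticDiscriminantAnalysis())
cv = ShuffleSplit(n_splits=200, test_size=0.3, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"Monte Carlo CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```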

  19. Preliminary design of a small air loop for system analysis and validation of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Marchand, M.; Saez, M.; Tauveron, N.; Tenchine, D.; Germain, T.; Geffraye, G.; Ruby, G.P. [CEA Grenoble (DEN/DER/SSTH), 38 (France)

    2007-07-01

    The French Atomic Energy Commission (CEA) is carrying out the design of a Small Air Loop for System Analysis (SALSA), devoted to the study of gas-cooled nuclear reactor behaviour in normal and incidental/accidental operating conditions. The reduced size of the SALSA components compared with a full-scale reactor, and the use of air as the gaseous coolant instead of helium, will allow easy management of the loop. The main purpose of SALSA will be the validation of the associated thermal-hydraulic safety simulation codes, such as CATHARE. The main goal of this paper is to present the methodology used to define the characteristics of the loop. In a first step, the study focused on a direct-cycle system for the SALSA loop with few global constraints, using a similarity analysis to support the definition and design of the loop. Similarity requirements have been evaluated to determine the scale factors which have to be applied to the SALSA loop components. The preliminary conceptual design of the SALSA plant, with a definition of each component, has then been carried out. The whole plant has been modelled using the CATHARE code. Calculations of the SALSA steady state in nominal conditions and of different plant transients in direct cycle have been made. The first system results obtained on the global behaviour of the loop confirm that SALSA can be representative of a gas-cooled nuclear reactor with some minor design modifications. In a second step, the current prospects focus on the SALSA loop's capability to reproduce correctly the heat transfer occurring in specific incidental situations. Decay heat removal by natural convection is a crucial point of interest. The first results show that the behaviour and the efficiency of the loop are strongly influenced by the definition of the main parameters of each component. A complete definition of SALSA is under way. (authors)

  20. An evaluation of the Positive Emotional Experiences Scale: A preliminary analysis

    Directory of Open Access Journals (Sweden)

    Rene van Wyk

    2016-04-01

    Orientation: The positive organisational behaviour movement emphasises the advantages of psychological strengths in business. The psychological virtues of positive emotional experiences can potentially promote human strengths to the advantage of business functioning and the management of work conditions. This is supported by Fredrickson's broaden-and-build theory, which emphasises the broadening of reactive thought patterns through experiences of positive emotions. Research purpose: A preliminary psychometric evaluation of a positive measurement of dimensions of emotional experiences in the workplace, obtained by rephrasing the Kiefer and Barclay Toxic Emotional Experiences Scale. Motivation for the study: This quantitative exploratory factor analysis investigates the factorial structure and reliability of the Positive Emotional Experiences Scale, a positively rephrased version of the Toxic Emotional Experiences Scale. Research approach, design and method: The exploratory factor analysis indicates an acceptable three-factor model for the Positive Emotional Experiences Scale. The three factors are: (1) psychological recurrent positive state, (2) social connectedness and (3) physical refreshed energy, with strong Cronbach's alphas of 0.91, 0.91 and 0.94, respectively. Main findings: The three-factor model of the Positive Emotional Experiences Scale provides a valid measure in support of Fredrickson's theory of social, physical and psychological endured personal resources that build positive emotions. Practical/managerial implications: Knowledge gained on positive versus negative emotional experiences could be applied by management to promote endured personal resources that strengthen positive emotional experiences. Contribution/value-add: The rephrased Positive Emotional Experiences Scale provides a reliable measure of the social, physical and endured psychological and personal resources identified in Fredrickson's broaden-and-build theory.

  1. SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident

    Science.gov (United States)

    Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

    2014-06-01

    On March 11th 2011 a high-magnitude earthquake and the consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram started at all power stations affected by the earthquake, diesel generators began operation as designed until the tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station black-out conditions in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules to account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can be beneficial to nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

  2. Plans for a sensitivity analysis of bridge-scour computations

    Science.gov (United States)

    Dunn, David D.; Smith, Peter N.

    1993-01-01

    Plans for an analysis of the sensitivity of Level 2 bridge-scour computations are described. Cross-section data from 15 bridge sites in Texas are modified to reflect four levels of field effort ranging from no field surveys to complete surveys. Data from United States Geological Survey (USGS) topographic maps will be used to supplement incomplete field surveys. The cross sections are used to compute the water-surface profile through each bridge for several T-year recurrence-interval design discharges. The effect of determining the downstream energy grade-line slope from topographic maps is investigated by systematically varying the starting slope of each profile. The water-surface profile analyses are then used to compute potential scour resulting from each of the design discharges. The planned results will be presented in the form of exceedance-probability versus scour-depth plots with the maximum and minimum scour depths at each T-year discharge presented as error bars.

  3. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields, such as Web 2.0 for Web applications and product development in industries. However, some problems in SNA, such as finding a clique, N-clique, N-clan, N-club or K-plex, are NP-complete and are not easily solved on traditional computer architectures. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve three primary problems of social networks: N-clique, N-clan, and N-club. The accuracy and feasible time complexity of these approaches, discussed in the paper, demonstrate that DNA computing can facilitate the development of SNA.
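
    For contrast with the DNA-computing approach, a conventional brute-force clique search is shown below; it enumerates all k-subsets, so the search space grows combinatorially with network size, which is exactly what the massive parallelism of DNA computing is meant to absorb.

        from itertools import combinations

        def find_cliques(adj, k):
            # brute force: test every k-subset for pairwise adjacency
            return [c for c in combinations(adj, k)
                    if all(v in adj[u] for u, v in combinations(c, 2))]

        adj = {"a": {"b", "c"}, "b": {"a", "c", "d"},
               "c": {"a", "b", "d"}, "d": {"b", "c"}}
        print(find_cliques(adj, 3))   # [('a', 'b', 'c'), ('b', 'c', 'd')]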

  4. Computational analysis of light scattering from collagen fiber networks

    Science.gov (United States)

    Arifler, Dizem; Pavlova, Ina; Gillenwater, Ann; Richards-Kortum, Rebecca

    2007-07-01

    Neoplastic progression in epithelial tissues is accompanied by structural and morphological changes in the stromal collagen matrix. We used the Finite-Difference Time-Domain (FDTD) method, a popular computational technique for full-vector solution of complex problems in electromagnetics, to establish a relationship between the structural properties of collagen fiber networks and light scattering, and to analyze how neoplastic changes alter stromal scattering properties. To create realistic collagen network models, we acquired optical sections from the stroma of fresh normal and neoplastic oral cavity biopsies using fluorescence confocal microscopy. These optical sections were then processed to construct three-dimensional collagen networks of different sizes as FDTD model input. Image analysis revealed that the volume fraction of collagen fibers in the stroma decreases with neoplastic progression, and the computed statistical texture features suggest that fibers tend to be more disconnected in neoplastic stroma. The FDTD modeling results showed that neoplastic fiber networks have smaller scattering cross-sections than normal networks of the same size, whereas high-angle scattering probabilities tend to be higher for neoplastic networks. Characterization of stromal scattering is expected to provide a basis to better interpret spectroscopic optical signals and to develop more reliable computational models describing photon propagation in epithelial tissues.

  5. Crystallization and preliminary X-ray diffraction analysis of L-threonine dehydrogenase (TDH) from the hyperthermophilic archaeon Thermococcus kodakaraensis.

    Science.gov (United States)

    Bowyer, A; Mikolajek, H; Wright, J N; Coker, A; Erskine, P T; Cooper, J B; Bashir, Q; Rashid, N; Jamil, F; Akhtar, M

    2008-09-01

    The enzyme L-threonine dehydrogenase catalyses the NAD+-dependent conversion of L-threonine to 2-amino-3-ketobutyrate, which is the first reaction of a two-step biochemical pathway involved in the metabolism of threonine to glycine. Here, the crystallization and preliminary crystallographic analysis of L-threonine dehydrogenase (Tk-TDH) from the hyperthermophilic organism Thermococcus kodakaraensis KOD1 is reported. This threonine dehydrogenase consists of 350 amino acids, with a molecular weight of 38 kDa, and was prepared using an Escherichia coli expression system. The purified native protein was crystallized using the hanging-drop vapour-diffusion method and crystals grew in the tetragonal space group P4₃2₁2, with unit-cell parameters a = b = 124.5, c = 271.1 Å. Diffraction data were collected to 2.6 Å resolution and preliminary analysis indicates that there are four molecules in the asymmetric unit of the crystal.

  6. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices for capturing the image from a microscope, huge increases in computational power concurrent with an amazing reduction in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, the basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance on them ranges widely. This review describes the capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders; the outputs serve both the marketing of semen and, in research, the understanding of the diversity of sperm responses to changes in the microenvironment. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect the accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system, whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time; (3) software should enable grouping of individual sperm based on one or more attributes, so that outputs reflect subpopulations or clusters of similar sperm with unique properties; means or medians for the total population are insufficient; and (4) a field-use, portable CASA system for measuring one motion and two or three morphology attributes of individual sperm is needed for field use.
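
    Take-home message (3), reporting subpopulations rather than whole-population means, can be sketched with a generic clustering step. The kinematic values below are synthetic stand-ins for CASA outputs (VCL = curvilinear velocity, LIN = linearity); any real analysis would use the attributes exported by the particular system.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        vcl = np.concatenate([rng.normal(140, 15, 200), rng.normal(60, 10, 200)])   # um/s
        lin = np.concatenate([rng.normal(0.8, 0.05, 200), rng.normal(0.4, 0.05, 200)])
        X = np.column_stack([vcl, lin])

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        for k in range(2):
            sub = X[labels == k]
            print(f"cluster {k}: n={len(sub)}, "
                  f"mean VCL={sub[:, 0].mean():.0f} um/s, mean LIN={sub[:, 1].mean():.2f}")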

  7. Pre- and Syn-Eruptive Surface Movements of Azerbaijan Mud Volcanoes Detected Through InSAR Analysis: Preliminary Results

    Science.gov (United States)

    Antonielli, Benedetta; Monserrat, Oriol; Bonini, Marco; Righini, Gaia; Sani, Federico; Luzi, Guido; Feyzullayev, Akper; Aliyev, Chingiz

    2014-05-01

    Mud volcanism is a process that consists in the extrusion of mud, fragments or blocks of country rock, saline waters and gases, mostly methane. This mechanism is typically linked to deep hydrocarbon traps, and it builds up a variety of conical edifices with dimensions and morphology similar to those of magmatic volcanoes. Interferometric Synthetic Aperture Radar (InSAR) techniques have been commonly used to monitor and investigate the ground deformation connected to the eruptive phases of magmatic volcanoes. InSAR techniques have also been employed to explore the ground deformation associated with the LUSI mud volcano in Java (Indonesia). We aim to carry out a similar study of the paroxysmal activities of the Azerbaijan mud volcanoes, which are among the largest on Earth. In particular, the deformations of the mud volcanic systems were analyzed with the technique of satellite differential interferometry (DInSAR), thanks to the acquisition of 16 descending and 4 ascending Envisat images spanning about 4 years (October 2003-November 2007); these data were provided by the European Space Agency. The preliminary analysis of a set of 77 interferograms, and the unwrapping of those selected for the best coherence values, allowed the detection of significant deformation at the Ayaz-Akhtarma and Khara Zira Island mud volcanoes. This analysis identified relevant ground deformation of the volcanic systems in connection with the main eruptive events in 2005 and 2006, respectively, as recorded by the catalogue of Azerbaijan mud volcano eruptions up to 2007. The preliminary analysis of the interferograms of the Ayaz-Akhtarma and Khara Zira mud volcanoes shows that the whole volcano edifice, or part of it, is subject to ground displacement before or coincident with the eruption. Assuming that the movement is mainly vertical, we suppose that the deformation is due to bulging of the volcanic edifice.
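
    The conversion from unwrapped interferometric phase to displacement is standard and worth stating. The sketch below uses generic Envisat C-band constants (wavelength ~5.6 cm, incidence ~23 degrees) and one common sign convention, not values taken from this study.

        import numpy as np

        WAVELENGTH = 0.056            # m, Envisat ASAR C-band
        INCIDENCE = np.radians(23)    # typical Envisat incidence angle

        def phase_to_vertical(unwrapped_phase):
            los = -WAVELENGTH * unwrapped_phase / (4 * np.pi)   # line-of-sight motion
            return los / np.cos(INCIDENCE)   # assume purely vertical movement

        print(phase_to_vertical(np.array([0.0, np.pi, 2 * np.pi])))   # metres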

  8. Preliminary analysis of the forest health state based on multispectral images acquired by Unmanned Aerial Vehicle

    Directory of Open Access Journals (Sweden)

    Czapski Paweł

    2015-09-01

    Full Text Available The main purpose of this publication is to present the current progress of work associated with the use of lightweight unmanned platforms for various environmental studies. Current developments in information technology, electronics and sensor miniaturisation allow mounting multispectral cameras and scanners on an unmanned aerial vehicle (UAV) that previously could be used only on board aircraft and satellites. The Remote Sensing Division of the Institute of Aviation carries out innovative research using a multisensory platform and a lightweight unmanned vehicle to evaluate the health state of forests in Wielkopolska province. In this paper, the applicability of analysing multispectral images acquired several times during the growing season from low altitude (up to 800 m) is presented. We present remote sensing indicators computed by our software and common methods for assessing the health state of trees. The correctness of the applied methods is verified using analysis of satellite scenes acquired by the Landsat 8 OLI (Operational Land Imager) instrument.
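
    A representative indicator of the kind such software computes is the NDVI; the sketch below applies the standard formula to synthetic red and near-infrared bands standing in for the UAV imagery (the paper does not specify its exact indicator set).

        import numpy as np

        red = np.random.default_rng(1).uniform(0.05, 0.3, (100, 100))
        nir = np.random.default_rng(2).uniform(0.3, 0.7, (100, 100))

        # NDVI = (NIR - RED) / (NIR + RED); healthy vegetation tends toward 1
        ndvi = (nir - red) / (nir + red + 1e-9)
        print(f"mean NDVI: {ndvi.mean():.2f}")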

  9. Flow cytometric analysis of oil palm: a preliminary analysis for cultivars and genomic DNA alteration

    Directory of Open Access Journals (Sweden)

    Warawut Chuthammathat

    2005-12-01

    Full Text Available The DNA contents of oil palm (Elaeis guineensis Jacq.) cultivars were analyzed by flow cytometry using different external reference plant species. Analysis using corn (Zea mays) line CE-777 as a reference plant gave the highest DNA content for oil palm (4.72 ± 0.23 pg/2C), whereas the DNA content was found to be lower when using soybean (Glycine max cv. Polanka; 3.77 ± 0.09 pg/2C) or tomato (Lycopersicon esculentum cv. Stupicke; 4.25 ± 0.09 pg/2C) as a reference. The nuclear DNA contents of the Dura (D109), Pisifera (P168) and Tenera (T38) cultivars were 3.46 ± 0.04, 3.24 ± 0.03 and 3.76 ± 0.04 pg per 2C nucleus, respectively, using soybean as a reference. One haploid genome of oil palm therefore ranged from 1.56 to 1.81 × 10⁹ base pairs. The DNA contents of one-year-old calli and cell suspensions of oil palm were found to be significantly different from those of seedlings; it should thus be noted that genomic DNA alteration occurred in these cultured tissues. We therefore confirm that flow cytometric analysis can verify cultivars, DNA content and genomic DNA alteration of oil palm using soybean as an external reference standard.
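
    The underlying calculation is a peak-ratio rule of three: sample DNA content equals the reference 2C value scaled by the ratio of mean fluorescence peak positions. The peak channels below are hypothetical; the soybean reference value (2.50 pg/2C) and the ~0.978 × 10⁹ bp/pg conversion are widely used constants.

        REFERENCE_2C = 2.50                    # pg/2C, soybean reference standard
        ref_peak, sample_peak = 200.0, 277.0   # hypothetical G1 peak channels

        sample_2c = REFERENCE_2C * sample_peak / ref_peak
        genome_bp = sample_2c / 2 * 0.978e9    # 1 pg DNA ~ 0.978e9 base pairs
        print(f"{sample_2c:.2f} pg/2C, ~{genome_bp:.2e} bp per haploid genome")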

  10. Applications of Photogrammetry for Analysis of Forest Plantations. Preliminary study: Analysis of individual trees

    Science.gov (United States)

    Mora, R.; Barahona, A.; Aguilar, H.

    2015-04-01

    This paper presents a method for using high-detail volumetric information, captured with a land-based photogrammetric survey, to obtain information on individual trees. Applying LIDAR analysis techniques, it is possible to measure the diameter at breast height, the height at first branch (commercial height), the basal area and the volume of an individual tree. Given this information, it is possible to calculate how much of that tree can be exploited as wood. The main objective is to develop a methodology for successfully surveying one individual tree, capturing every side of the stem using a high-resolution digital camera and reference marks with GPS coordinates. The process is executed for several individuals of two species present in the metropolitan area of San Jose, Costa Rica, Delonix regia (Bojer) Raf. and Tabebuia rosea (Bertol.) DC., each one with different height, stem shape and crown area. Using a photogrammetry suite, all the pictures are aligned and geo-referenced, and a dense point cloud is generated with enough detail to perform the required measurements, together with a solid three-dimensional model for volume measurement. This research opens the way to developing a capture methodology with an airborne camera on close-range UAVs. An airborne platform will make it possible to capture every individual in a forest plantation; furthermore, if the analysis techniques applied in this research are automated, it will be possible to calculate the exploitation potential of a forest plantation with high precision and improve its management.
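
    One of the listed measurements, diameter at breast height (DBH), reduces to slicing the stem point cloud near 1.3 m and estimating the circle it traces. The cloud below is synthetic; a real one would come from the dense photogrammetric reconstruction.

        import numpy as np

        rng = np.random.default_rng(3)
        theta = rng.uniform(0, 2 * np.pi, 500)
        r = 0.25 + rng.normal(0, 0.005, 500)           # ~0.5 m diameter stem
        cloud = np.column_stack([r * np.cos(theta), r * np.sin(theta),
                                 rng.uniform(0, 3, 500)])   # x, y, z (m)

        slice_pts = cloud[np.abs(cloud[:, 2] - 1.3) < 0.1]   # thin slice at 1.3 m
        center = slice_pts[:, :2].mean(axis=0)
        dbh = 2 * np.linalg.norm(slice_pts[:, :2] - center, axis=1).mean()
        print(f"estimated DBH: {dbh:.3f} m")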

  11. Computer-monitored radionuclide tracking of three-dimensional mandibular movements. Part II: experimental setup and preliminary results - Posselt diagram

    Energy Technology Data Exchange (ETDEWEB)

    Salomon, J.A.; Waysenson, B.D.; Warshaw, B.D.

    1979-04-01

    This article describes a new method for tracking mandibular movements using a computer-assisted radionuclide kinematics technique. The usefulness of various image-enhancement techniques is discussed, and the reproduction of physiologic displacements is shown. Vertical, lateral, and protrusive envelopes of motion of a point on a tooth of a complete denture mounted on a semiadjustable articulator were measured. The validity of the approach is demonstrated by reproducing the motion of this dental point, which clearly traces the Posselt diagram.

  12. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    Five factors were extracted. The factors had different relationships (loadings) to the symptoms. Although the factors were gained only by computations, they seemed to express some modular features of the language disturbances. This phenomenon, that factors represent superior aspects of data, is well known in factor analysis. Factor I mediates the overall severity of the disturbance, factor II points to the expressive versus comprehensive character of the language disorder, factor III represents the granularity of the phonetic mistakes, factor IV accentuates the patient's awareness of his disease, and factor V exposes the deficits in communication. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but a factor itself does not represent a syndrome. It is assumed that this kind of data analysis offers a new approach to the understanding of language disturbances.
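
    Generic factor extraction of this kind is readily sketched. The example below reduces synthetic patient-by-symptom scores to latent factors and prints the loadings; it uses two factors for brevity (the study reports five) and is not the authors' actual computation.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(4)
        latent = rng.normal(size=(120, 2))             # hidden factors per patient
        mixing = rng.normal(size=(2, 8))               # true loadings
        symptoms = latent @ mixing + rng.normal(0, 0.3, (120, 8))

        fa = FactorAnalysis(n_components=2).fit(symptoms)
        print(np.round(fa.components_, 2))             # estimated factor loadings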

  13. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...
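
    The 'Motility Plot' reduction can be sketched as collapsing the image series into a position-versus-time map and reading the contraction frequency from its spectrum. The data below is a synthetic 3 cycles/min wave standing in for the MRI series; the actual program works on segmented stomach outlines.

        import numpy as np

        fs = 1.0                        # frames per second
        t = np.arange(0, 300, 1 / fs)   # 5-minute acquisition
        pos = np.arange(50)             # samples along the antrum
        diameter = 20 + 3 * np.sin(2 * np.pi * (0.05 * t[:, None] - 0.1 * pos))

        signal = diameter[:, 25] - diameter[:, 25].mean()
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(t), d=1 / fs)
        print(f"dominant contraction frequency: {freqs[spectrum.argmax()] * 60:.1f} /min")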

  14. Critical Data Analysis Precedes Soft Computing Of Medical Data

    DEFF Research Database (Denmark)

    Keyserlingk, Diedrich Graf von; Jantzen, Jan; Berks, G.

    2000-01-01

    Medical databases appear in general as collections of scarcely defined, uncomfortable feelings, disturbances and disabilities of patients encoded in medical terms and symptoms, often scarcely enriched with some ordinal and metric data. But, astonishingly enough, in many cases this is sufficient. Five factors were extracted, with different relationships (loadings) to the symptoms. Although the factors were gained only by computations, they seemed to express some modular features of the language disturbances; this phenomenon, that factors represent superior aspects of data, is well known in factor analysis. Sets of symptoms corresponding to the traditional symptoms in Broca and Wernicke aphasia may be represented in the factors, but a factor itself does not represent a syndrome. It is assumed that this kind of data analysis offers a new approach to the understanding of language disturbances.

  15. Dynamic analysis of spur gears using computer program DANST

    Science.gov (United States)

    Oswald, Fred B.; Lin, Hsiang H.; Liou, Chuen-Huei; Valco, Mark J.

    1993-06-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results.
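
    The kind of model such a program integrates can be hinted at with a single-degree-of-freedom gear-mesh sketch: a time-varying mesh stiffness excites dynamic tooth load above the static value. All parameters are illustrative; DANST's multi-tooth, variable-contact-ratio model is far more complete.

        import numpy as np
        from scipy.integrate import solve_ivp

        m, c, F = 0.5, 400.0, 1000.0          # kg, N s/m, N (static tooth force)
        k0, dk, f_mesh = 2e8, 4e7, 500.0      # N/m mean/variation, mesh freq (Hz)

        def rhs(t, y):
            k = k0 + dk * np.sign(np.sin(2 * np.pi * f_mesh * t))  # square-wave stiffness
            x, v = y
            return [v, (F - c * v - k * x) / m]

        sol = solve_ivp(rhs, (0, 0.05), [F / k0, 0.0], max_step=1e-5)
        k_t = k0 + dk * np.sign(np.sin(2 * np.pi * f_mesh * sol.t))
        print(f"dynamic load factor: {(k_t * sol.y[0]).max() / F:.2f}")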

  16. Analysis of CERN computing infrastructure and monitoring data

    Science.gov (United States)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments collect a large number of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group was created with the goal of bringing together data sources from different services and different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk, and effective job throughput. In this contribution we first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  17. Predicting outcomes in glioblastoma patients using computerized analysis of tumor shape: preliminary data

    Science.gov (United States)

    Mazurowski, Maciej A.; Czarnek, Nicholas M.; Collins, Leslie M.; Peters, Katherine B.; Clark, Kal

    2016-03-01

    Glioblastoma (GBM) is the most common primary brain tumor and is characterized by very poor survival. However, while some patients survive only a few months, some live for multiple years. Accurate prognosis of survival and stratification of patients allows for making more personalized treatment decisions and moves treatment of GBM one step closer toward the paradigm of precision medicine. While some molecular biomarkers are being investigated, medical imaging remains significantly underutilized for prognostication in GBM. In this study, we investigated whether computer analysis of tumor shape can contribute toward accurate prognosis of outcomes. Specifically, we applied computer algorithms to extract 5 shape features from magnetic resonance imaging (MRI) for 22 GBM patients. Then, we determined whether each of the features can accurately distinguish between patients with good and poor outcomes. We found that one of the 5 analyzed features showed prognostic value for survival. The prognostic feature describes how well the 3D tumor shape fills its minimum bounding ellipsoid. Specifically, for low values (less than or equal to the median) the proportion of patients that survived more than a year was 27%, while for high values (greater than the median) the proportion of patients with survival of more than 1 year was 82%. The difference was statistically significant (p < 0.05) even though the number of patients analyzed in this pilot study was low. We conclude that computerized 3D analysis of tumor shape in MRI may strongly contribute to accurate prognostication and stratification of patients for therapy in GBM.
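
    The reported feature can be approximated in a few lines: the ratio of tumor volume to the volume of its bounding ellipsoid. The sketch below estimates the ellipsoid from the mask's principal-axis extents rather than solving the exact minimum-volume enclosing ellipsoid, and runs on a toy mask instead of real MRI segmentations.

        import numpy as np

        rng = np.random.default_rng(5)
        z, y, x = np.mgrid[:40, :40, :40]
        blob = ((z - 20) / 12) ** 2 + ((y - 20) / 9) ** 2 + ((x - 20) / 6) ** 2 <= 1.0
        mask = blob & (rng.random(blob.shape) > 0.2)    # roughened toy tumor

        pts = np.argwhere(mask).astype(float)
        pts -= pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts, full_matrices=False)
        semi = np.abs(pts @ vt.T).max(axis=0)           # semi-axes along principal axes
        ellipsoid_vol = 4 / 3 * np.pi * np.prod(semi)
        print(f"fill ratio: {mask.sum() / ellipsoid_vol:.2f}")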

  18. Investigating in vitro angiogenesis by computer-assisted image analysis and computational simulation.

    Science.gov (United States)

    Guidolin, Diego; Fede, Caterina; Albertin, Giovanna; De Caro, Raffaele

    2015-01-01

    In vitro assays that stimulate the formation of capillary-like structures by endothelial cells (EC) have become increasingly popular, because they allow the study of the ECs' intrinsic ability to self-organize into vascular-like patterns. Here we describe a widely applied protocol involving the use of basement membrane matrix (Matrigel) as a suitable environment to induce an angiogenic phenotype in cultured EC. EC differentiation on basement membrane matrix is a highly specific process, which recapitulates many steps in blood vessel formation, and for this reason it is presently considered a reliable in vitro tool to identify factors with potential antiangiogenic or pro-angiogenic properties. The morphological features of the obtained cell patterns can also be accurately quantified by computer-assisted image analysis, and the main steps of such a procedure are outlined and discussed here. The dynamics of in vitro EC self-organization is a complex biological process, involving a network of interactions between a large number of cells. For this reason, the combined use of in vitro experiments and computational modeling can represent a key approach to unravel how mechanical and chemical signaling by EC coordinates their organization into capillary-like tubes. Thus, a particularly helpful approach to modeling is also briefly described, together with examples of its application.
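
    The quantification step typically reduces to skeletonizing a segmented image of the network and counting length and branch points. The mask below is a toy cross shape; real input would be a segmented image of EC on Matrigel, and thresholds would need tuning.

        import numpy as np
        from scipy.ndimage import convolve
        from skimage.morphology import skeletonize

        mask = np.zeros((64, 64), bool)
        mask[32, 5:60] = True
        mask[10:55, 32] = True                 # toy capillary-like "network"

        skel = skeletonize(mask)
        neighbors = convolve(skel.astype(int), np.ones((3, 3), int),
                             mode="constant") - skel
        branch_points = (skel & (neighbors >= 3)).sum()
        print(f"network length: {skel.sum()} px, branch points: {branch_points}")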

  19. Dynamic computed tomography based on spatio-temporal analysis in acute stroke: Preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ha Young; Pyeon, Do Yeong; Kim, Da Hye; Jung, Young Jin [Dongseo University, Busan (Korea, Republic of)

    2016-12-15

    Acute stroke is a common condition that requires fast diagnosis and treatment to save the patient's life; without a prompt procedure it may cause lifelong disability due to brain damage. To diagnose stroke, brain perfusion CT examination, with rapid 3D angiography where possible, has been widely used. However, a low-dose technique should be applied for the examination, since high radiation exposure may cause secondary damage to the patient. The resulting degradation of the measured CT images may interfere with clinical evaluation, in that blood vessel shapes on the CT image are significantly affected by Gaussian noise. In this study, we employed a spatio-temporal technique to analyze dynamic (brain perfusion) CT data and improve image quality for reliable clinical diagnosis. As a result, the proposed technique removed Gaussian noise successfully and demonstrated the possibility of a new image segmentation technique for CT angiography. Qualitative evaluation conducted by skilled radiological technologists indicated significant quality improvement of the dynamic CT images. The proposed technique will be a useful clinical tool for brain perfusion CT examination.
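
    The gist of spatio-temporal filtering is that vessel structure persists across frames while noise does not, so smoothing jointly over time and space suppresses noise that a purely spatial filter cannot. Below is a minimal sketch with synthetic data (a plain spatio-temporal Gaussian filter, not necessarily the paper's exact method).

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(6)
        frames = np.zeros((30, 64, 64))
        frames[:, 28:36, :] = 1.0              # a "vessel" persistent in time
        noisy = frames + rng.normal(0, 0.5, frames.shape)

        # sigma = (time, y, x); the temporal axis does the extra work
        denoised = gaussian_filter(noisy, sigma=(2.0, 1.0, 1.0))
        print(f"mean abs error: {np.abs(noisy - frames).mean():.3f} -> "
              f"{np.abs(denoised - frames).mean():.3f}")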

  20. An Interactive Computer Program for the Preliminary Design and Analysis of Marine Reduction Gears.

    Science.gov (United States)

    1982-03-01
