WorldWideScience

Sample records for analysis tool extensions

  1. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396). Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  2. An extensive (co-)expression analysis tool for the cytochrome P450 superfamily in Arabidopsis thaliana

    Directory of Open Access Journals (Sweden)

    Provart Nicholas J

    2008-04-01

    Abstract. Background: Sequencing of the first plant genomes has revealed that cytochromes P450 have evolved to become the largest family of enzymes in secondary metabolism. The proportion of P450 enzymes with characterized biochemical function(s) is, however, very small. If P450 diversification mirrors the evolution of chemical diversity, this points to an unexpectedly poor understanding of plant metabolism. We assumed that extensive analysis of gene expression might guide towards the function of P450 enzymes and highlight overlooked aspects of plant metabolism. Results: We have created a comprehensive database, 'CYPedia', describing P450 gene expression in four data sets: organs and tissues, stress response, hormone response, and mutants of Arabidopsis thaliana, based on public Affymetrix ATH1 microarray expression data. P450 expression was then combined with the expression of 4,130 re-annotated genes, predicted to act in plant metabolism, for co-expression analyses. Based on the annotation of co-expressed genes from diverse pathway annotation databases, co-expressed pathways were identified. Predictions were validated for most P450s with known functions. As examples, co-expression results for P450s related to plastidial functions/photosynthesis, and to phenylpropanoid, triterpenoid and jasmonate metabolism are highlighted here. Conclusion: The large-scale hypothesis generation tools presented here provide leads to new pathways, unexpected functions, and regulatory networks for many P450s in plant metabolism. These can now be exploited by the community to validate the proposed functions experimentally using reverse genetics, biochemistry, and metabolic profiling.
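
    The abstract does not describe the implementation, but the core of such a co-expression screen is a correlation ranking of candidate genes against a P450's expression profile. Below is a minimal Python sketch under that assumption; the function names and the 0.7 cutoff are illustrative, not taken from CYPedia.

        import numpy as np

        def coexpression_screen(p450_expr, candidate_expr, threshold=0.7):
            """Rank candidate metabolic genes by Pearson correlation with
            one P450's expression profile across samples.

            p450_expr      : (n_samples,) expression vector for the P450
            candidate_expr : (n_genes, n_samples) matrix of candidate genes
            Returns indices of genes correlating above `threshold`, plus
            the full vector of correlations.
            """
            # Center both profiles so dot products become covariances.
            x = p450_expr - p450_expr.mean()
            y = candidate_expr - candidate_expr.mean(axis=1, keepdims=True)
            r = (y @ x) / (np.linalg.norm(y, axis=1) * np.linalg.norm(x))
            return np.where(r >= threshold)[0], r

        # Toy usage: three candidate genes across five conditions.
        rng = np.random.default_rng(0)
        p450 = rng.normal(size=5)
        candidates = np.vstack([1.1 * p450 + 0.05, rng.normal(size=5), -p450])
        hits, scores = coexpression_screen(p450, candidates)
        print(hits, np.round(scores, 2))  # gene 0 correlates strongly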

  3. The integrated microbial genomes (IMG) system in 2007: data content and analysis tool extensions

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Szeto, Ernest; Palaniappan, Krishna; Grechkin, Yuri; Chu, Ken; Chen, I-Min A.; Dubchak, Inna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2007-08-01

    The Integrated Microbial Genomes (IMG) system is a data management, analysis and annotation platform for all publicly available genomes. IMG contains both draft and complete JGI microbial genomes integrated with all other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and annotating genomes, genes and functions, individually or in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through quarterly releases. IMG is provided by the DOE-Joint Genome Institute (JGI) and is available from http://img.jgi.doe.gov.

  4. BEAP: The BLAST Extension and Alignment Program - a tool for contig construction and analysis of preliminary genome sequence

    Directory of Open Access Journals (Sweden)

    Fritz Eric

    2009-01-01

    Abstract. Background: Fine-mapping projects require a high density of SNP markers and positional candidate gene sequences. In species with incomplete genomic sequence, the DNA sequences needed to generate markers for fine-mapping within a linkage analysis confidence interval may be available but may not have been assembled. To manually piece these sequences together is laborious and costly. Moreover, annotation and assembly of short, incomplete DNA sequences is time consuming and not always straightforward. Findings: We have created a tool called BEAP that combines BLAST and CAP3 to retrieve sequences and construct contigs for localized genomic regions in species with unfinished sequence drafts. The rationale is that a completed genome can be used as a template to query target genomic sequence for closing gaps or extending contig length in species whose genome is incomplete, provided that good homology exists. Each user must define what template sequence is appropriate based on comparative mapping data such as radiation hybrid (RH) maps or other evidence linking the gene sequence of the template species to the target species. Conclusion: The BEAP software creates contigs suitable for discovery of orthologous genes for positional cloning. The resulting sequence alignments can be viewed graphically with a Java graphical user interface (GUI), allowing users to evaluate contig sequence quality and predict SNPs. We demonstrate the successful use of BEAP to generate genomic template sequence for positional cloning of the Angus dwarfism mutation. The software is available for free online for use on UNIX systems at http://www.animalgenome.org/bioinfo/tools/beap/.
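
    The abstract names the two underlying programs but not how they are chained. A plausible Python sketch of the BLAST-then-CAP3 workflow is shown below; the file layout and e-value cutoff are assumptions, not BEAP's actual defaults.

        import subprocess

        def beap_like_pipeline(template_region_fa, target_db, workdir="."):
            """Sketch of a BLAST-then-CAP3 workflow in the spirit of BEAP:
            find target-species sequences homologous to a template region,
            then assemble them into contigs. Paths and cutoffs are illustrative."""
            hits_tsv = f"{workdir}/hits.tsv"
            # 1. Query the target sequence database with the template region
            #    (tabular output, one hit per line).
            subprocess.run(["blastn", "-query", template_region_fa,
                            "-db", target_db, "-outfmt", "6",
                            "-evalue", "1e-20", "-out", hits_tsv], check=True)
            # 2. Collect the IDs of matching sequences (column 2 of the
            #    tabular output) and fetch them with blastdbcmd.
            ids = sorted({ln.split("\t")[1] for ln in open(hits_tsv) if ln.strip()})
            ids_file, reads_fa = f"{workdir}/hit_ids.txt", f"{workdir}/hits.fasta"
            with open(ids_file, "w") as fh:
                fh.write("\n".join(ids) + "\n")
            subprocess.run(["blastdbcmd", "-db", target_db,
                            "-entry_batch", ids_file, "-out", reads_fa], check=True)
            # 3. Assemble the retrieved sequences with CAP3; the contigs are
            #    written to hits.fasta.cap.contigs.
            subprocess.run(["cap3", reads_fa], check=True)
            return f"{reads_fa}.cap.contigs"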

  5. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level ones, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.
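
    The event reconstruction stage described above is relativistic kinematics: assign a mass hypothesis to each track, build a 4-vector, and combine tracks. The short Python sketch below illustrates that step (natural units, c = 1); the momenta are arbitrary example values.

        import numpy as np

        def four_vector(p3, mass):
            """Build (E, px, py, pz) from a 3-momentum and an assigned
            mass hypothesis: E^2 = |p|^2 + m^2."""
            p3 = np.asarray(p3, dtype=float)
            E = np.sqrt(p3 @ p3 + mass**2)
            return np.concatenate(([E], p3))

        def invariant_mass(v1, v2):
            """Invariant mass of a two-track combination: m^2 = E^2 - |p|^2."""
            v = v1 + v2
            return np.sqrt(max(v[0]**2 - v[1:] @ v[1:], 0.0))

        # Two oppositely charged pions (m_pi ~ 0.1396 GeV), momenta in GeV:
        pi_plus  = four_vector([0.30, 0.10, 0.20], 0.1396)
        pi_minus = four_vector([-0.10, 0.05, 0.25], 0.1396)
        print(f"m(pi+ pi-) = {invariant_mass(pi_plus, pi_minus):.3f} GeV")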

  6. Development of the ITER upper port stub extension assembly tool and its mock-up

    Energy Technology Data Exchange (ETDEWEB)

    Nam, Kyoungo, E-mail: namko@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Park, Hyunki; Kim, Dongjin; Bae, Jinho; Ahn, Heejae [National Fusion Research Institute, Daejeon (Korea, Republic of); Kim, Kyoungkyu [Mecha T&S, Jinju (Korea, Republic of); Yoo, Yongsoo [SFA Co. Ltd., Asan (Korea, Republic of); Watson, Emma [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St. Paul Lez Durance Cedex (France)

    2015-10-15

    Highlights: • We present the development of the ITER upper port stub extension (UPSE) assembly tool and its mock-up. • The design of the UPSE assembly tool has been developed to meet the design requirements and assembly procedure from the ITER Organization. • A structural analysis of the UPSE assembly tool was carried out to verify its structural strength. In the analysis assessing the structural stability of the tool, considering its dead weight, the 15-ton UPSE, and a safety factor of 2.0, all stresses are below the allowable stress. • To verify the tool design and its feasibility, a mock-up of the UPSE assembly tool and its alignment system was fabricated and tested. - Abstract: The ITER upper port stub extension (UPSE) of the vacuum vessel (VV) is classified into two categories according to its location and time of installation. One is the central (even) UPSE, which is welded at the upper central position of the VV sector and delivered after being attached to the VV at the factory. The other is the lateral (odd) UPSE, which is welded at the upper lateral position of the VV sector; the significant difference from the central UPSE is that the lateral UPSE is welded after all VV sectors have been welded, because it must be installed between adjacent VVs in the Tokamak pit. The design of the dedicated assembly tool for the UPSE has been developed by the Korean domestic agency (KODA). The adjustment system of this assembly tool has also been designed to meet the functional requirements requested by the ITER Organization. For design verification of the UPSE assembly tool, a full-size mock-up was fabricated and tested according to the UPSE functional requirements for the assembly procedure. The structural analysis results of the current design are also presented.

  7. Extensive analysis of hydrogen costs

    Energy Technology Data Exchange (ETDEWEB)

    Guinea, D.M.; Martin, D.; Garcia-Alegre, M.C.; Guinea, D. [Consejo Superior de Investigaciones Cientificas, Arganda, Madrid (Spain). Inst. de Automatica Industrial; Agila, W.E. [Acciona Infraestructuras, Alcobendas, Madrid (Spain). Dept. I+D+i

    2010-07-01

    Cost is a key issue in the spreading of any technology. In this work, the cost of hydrogen obtained by electrolysis is analyzed and determined. Different contributing partial costs, such as energy and electrolyzer costs and taxes, are taken into account to calculate the final cost of hydrogen. Energy cost data are taken from official sources, while electrolyzer costs are obtained from commercial companies. The analysis is carried out under different hypotheses and for different countries: Germany, France, Austria, Switzerland, Spain and the Canadian region of Ontario. Finally, the obtained costs are compared to those of the most used fossil fuels, both in the automotive industry (gasoline and diesel) and in the residential sector (butane, coal, town gas and wood), and the possibilities of hydrogen competing against these fuels are discussed. According to this work, in the automotive industry, even neglecting subsidies, hydrogen can compete with fossil fuels. Hydrogen can also compete with gaseous domestic fuels. Electrolyzer prices were found to have the highest influence on hydrogen prices. (orig.)
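
    The paper's own figures are not reproduced here, but the structure of such a cost calculation is simple: an energy term plus an amortized electrolyzer capital term, both per kilogram of hydrogen. A back-of-the-envelope Python sketch, with every number a placeholder assumption:

        # Illustrative electrolytic-hydrogen cost estimate; all numbers
        # below are assumptions, not figures from the paper.
        SPECIFIC_ENERGY = 53.0     # kWh per kg H2 (typical electrolyzer)
        electricity_price = 0.12   # EUR per kWh (assumed)
        capex = 1200.0             # EUR per kW of electrolyzer capacity (assumed)
        lifetime_hours = 60_000    # stack lifetime at full load (assumed)

        energy_cost = SPECIFIC_ENERGY * electricity_price          # EUR/kg
        # One kW running for the full lifetime produces
        # lifetime_hours / SPECIFIC_ENERGY kilograms of H2.
        capital_cost = capex / (lifetime_hours / SPECIFIC_ENERGY)  # EUR/kg
        print(f"energy  : {energy_cost:5.2f} EUR/kg")
        print(f"capital : {capital_cost:5.2f} EUR/kg")
        print(f"total   : {energy_cost + capital_cost:5.2f} EUR/kg")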

  8. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
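
    The abstract above is a single patent-style sentence describing four cooperating components. A hypothetical object model (all names invented here, purely to make the architecture concrete) might look like this in Python:

        from dataclasses import dataclass, field

        @dataclass
        class BuildingComponent:           # entry in the component library
            name: str
            u_value: float                 # W/(m^2 K), illustrative property

        @dataclass
        class BuildingModel:               # output of the modeling tool
            spatial_data: dict
            components: list = field(default_factory=list)

        def baseline_energy(model):
            """Stand-in for the analysis engine: a real engine would run an
            energy simulation; here we just sum component U-values."""
            return sum(c.u_value for c in model.components)

        def recommend(model, alternatives):
            """Stand-in for the recommendation tool: suggest substitute
            components that lower the baseline score."""
            swaps = []
            for c in model.components:
                alt = alternatives.get(c.name)
                if alt is not None and alt.u_value < c.u_value:
                    swaps.append((c.name, alt.name))
            return swaps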

  9. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  10. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle-tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
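
    The sticking criterion mentioned above is commonly based on the surface residence time from the standard desorption relation tau = tau_0 * exp(E_a / (R T)). A minimal Python sketch of that test, with activation energy and pre-factor chosen purely for illustration (they are not values from the talk):

        import math

        R = 8.314  # J/(mol K)

        def residence_time(T_kelvin, E_a=4.0e4, tau_0=1e-13):
            """Surface residence time tau = tau_0 * exp(E_a / (R*T)).
            E_a (J/mol) and tau_0 (s) are illustrative values."""
            return tau_0 * math.exp(E_a / (R * T_kelvin))

        def sticks(T_kelvin, timescale_s, **kw):
            """Treat a molecule as stuck if it resides on the surface
            longer than the timescale of interest."""
            return residence_time(T_kelvin, **kw) > timescale_s

        for T in (150.0, 300.0):
            print(f"T = {T:5.1f} K  tau = {residence_time(T):.3e} s  "
                  f"sticks(1 s): {sticks(T, 1.0)}")
        # Cold surfaces trap molecules; warm ones re-emit them quickly.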

  11. Extension Procedures for Confirmatory Factor Analysis

    Science.gov (United States)

    Nagy, Gabriel; Brunner, Martin; Lüdtke, Oliver; Greiff, Samuel

    2017-01-01

    We present factor extension procedures for confirmatory factor analysis that provide estimates of the relations of common and unique factors with external variables that do not undergo factor analysis. We present identification strategies that build upon restrictions of the pattern of correlations between unique factors and external variables. The…

  12. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as Python code that accesses the simulation functions of the Siemens PSS/E planning tool. It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  13. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, culminating in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL), in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS), to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
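
    As background to the FRM mentioned above: frequency response is customarily reported in MW per 0.1 Hz, i.e. the MW imbalance divided by the frequency deviation expressed in tenths of a hertz. A minimal Python illustration of that convention (consult BAL-003-1 itself for the exact FRM definition and measurement points):

        def frequency_response_mw_per_0p1hz(delta_p_mw, f_initial_hz, f_settled_hz):
            """MW imbalance (e.g. lost generation) divided by the frequency
            deviation in tenths of a Hz -- the customary reporting units."""
            delta_f_tenths = (f_initial_hz - f_settled_hz) / 0.1
            return delta_p_mw / delta_f_tenths

        # Losing 1,000 MW drags frequency from 60.000 Hz to 59.950 Hz:
        print(frequency_response_mw_per_0p1hz(1000.0, 60.000, 59.950))
        # -> 2000.0 MW per 0.1 Hz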

  14. Versatile and Extensible, Continuous-Thrust Trajectory Optimization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop an innovative, versatile and extensible, continuous-thrust trajectory optimization tool for planetary mission design and optimization of...

  15. Producing Organic Cotton: A Toolkit - Crop Guide, Project Guide, Extension Tools

    OpenAIRE

    Eyhorn, Frank

    2005-01-01

    The CD compiles the following extension tools on organic cotton: Organic Cotton Crop Guide, Organic Cotton Training Manual, Soil Fertility Training Manual, Organic Cotton Project Guide, Record keeping tools, Video "Organic agriculture in the Nimar region", Photos for illustration.

  16. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convection systems. Basic functions include selection of area of

  17. An evaluation of the assessment tool used for extensive mini ...

    African Journals Online (AJOL)

    2013-03-06

    dissertations in the Master's Degree. 125 ... The adapted assessment tool addressed these areas, but identified a lack of training and experience in the assessment of ... identifying a problem, planning an intervention through data.

  18. A standardised knowledge test to measure the extent of knowledge of agricultural extension personnel on m-tools

    Directory of Open Access Journals (Sweden)

    Kusuma Kumari Nagam

    2016-11-01

    A standardised teacher-made test for assessing the extent of knowledge of agricultural extension personnel on m-tools was developed using the item analysis procedure. For this purpose, a teacher-made test consisting of 30 items was prepared and administered to 30 agricultural extension personnel. Based on the results of the study, 14 items having a difficulty index value ranging from 20 to 80 and a discrimination index value above 0.10 were selected to construct the knowledge test. This standardised test can be used to measure the knowledge level of extension personnel on m-tools.
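
    The abstract quotes the two classical item-analysis statistics but not how they were computed. A common implementation (the 27% upper/lower group rule used here is a textbook convention, not necessarily the study's exact method) looks like this in Python:

        import numpy as np

        def item_analysis(responses):
            """Classical item analysis on a 0/1 response matrix
            (rows = respondents, columns = items).

            difficulty     : percentage answering each item correctly
            discrimination : correct-rate gap between the top and bottom
                             27% of total scorers (a common convention)
            """
            responses = np.asarray(responses)
            n = responses.shape[0]
            difficulty = 100.0 * responses.mean(axis=0)
            order = np.argsort(responses.sum(axis=1))
            k = max(1, int(round(0.27 * n)))
            low, high = responses[order[:k]], responses[order[-k:]]
            discrimination = high.mean(axis=0) - low.mean(axis=0)
            # Selection rule reported in the abstract:
            keep = (difficulty >= 20) & (difficulty <= 80) & (discrimination > 0.10)
            return difficulty, discrimination, np.where(keep)[0]

        rng = np.random.default_rng(1)
        answers = (rng.random((30, 30)) > 0.4).astype(int)  # 30 people, 30 items
        d, disc, kept = item_analysis(answers)
        print(f"{kept.size} of 30 items retained")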

  19. Environmental extension as effective tool for sustainable natural ...

    African Journals Online (AJOL)

    Abstract. Environmental extension, which is the propagation of sustained natural resource use, involves the dissemination of products of interaction between an entity and its surroundings in a manner of mutual relationship among its ... This should reduce the threat to man's existence posed by the depletion of environmental resources.

  20. Mobile Phone as an extension tool among female agricultural ...

    African Journals Online (AJOL)

    The majority (97.69%) of the respondents owned and used a mobile phone for accessing market information, among other purposes. Also, 90.8% of respondents agreed that it is an efficient and effective facility for extension communication. The high costs of subscription, mobile phones, and accessories were the major constraints reported.

  1. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  2. eqtools: Modular, extensible, open-source, cross-machine Python tools for working with magnetic equilibria

    Science.gov (United States)

    Chilenski, M. A.; Faust, I. C.; Walk, J. R.

    2017-01-01

    As plasma physics research for fusion energy transitions to an increasing emphasis on cross-machine collaboration and numerical simulation, it becomes increasingly important that portable tools be developed to enable data from diverse sources to be analyzed in a consistent manner. This paper presents eqtools, a modular, extensible, open-source toolkit implemented in the Python programming language for handling magnetic equilibria and associated data from tokamaks. eqtools provides a single interface for working with magnetic equilibrium data, both for handling derived quantities and mapping between coordinate systems, extensible to function with data from different experiments, data formats, and magnetic reconstruction codes, replacing the diverse, non-portable solutions currently in use. Moreover, while the open-source Python programming language offers a number of advantages as a scripting language for research purposes, the lack of basic tokamak-specific functionality has impeded the adoption of the language for regular use. Implementing equilibrium-mapping tools in Python removes a substantial barrier to new development in and porting legacy code into Python. In this paper, we introduce the design of the eqtools package and detail the workflow for usage and expansion to additional devices. The implementation of a novel three-dimensional spline solution (in two spatial dimensions and in time) is also detailed. Finally, verification and benchmarking for accuracy and speed against existing tools are detailed. Wider deployment of these tools will enable efficient sharing of data and software between institutions and machines as well as self-consistent analysis of the shared data.
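
    The core task eqtools automates is mapping between machine coordinates (R, Z) and flux-based coordinates. The Python sketch below illustrates that mapping on a toy analytic equilibrium using plain scipy interpolation; it shows the concept only and does not use the eqtools API (see the package documentation for its actual interface).

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        R = np.linspace(0.4, 1.2, 65)        # major radius grid [m]
        Z = np.linspace(-0.6, 0.6, 65)       # height grid [m]
        RR, ZZ = np.meshgrid(R, Z, indexing="ij")
        psi = (RR - 0.8)**2 + 0.5 * ZZ**2    # toy concentric flux surfaces

        psi_axis, psi_bdry = 0.0, 0.16       # toy axis/boundary flux values
        psi_interp = RegularGridInterpolator((R, Z), psi)

        def rz2rho(r, z):
            """Map (R, Z) to the square root of normalized poloidal flux,
            a common flux-surface label."""
            psin = (psi_interp((r, z)) - psi_axis) / (psi_bdry - psi_axis)
            return float(np.sqrt(np.clip(psin, 0.0, 1.0)))

        print(rz2rho(0.8, 0.0), rz2rho(1.2, 0.0))  # 0 on axis, 1 at boundary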

  3. An evaluation of the assessment tool used for extensive mini ...

    African Journals Online (AJOL)

    Firstly, marks given by 15 assessors for four mini-dissertations using the current assessment tool were analysed quantitatively. In Phase 2, the regulation of the assessment bodies and the quantitative results of Phase 1 were discussed by assessors during a focus group interview, and data were analysed qualitatively.

  4. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The overarching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life-cycle costs, while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life-cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools (information is from the respective tool websites, tool developers, and author experience): (1) the Distributed Energy Resources Customer Adoption Model (DER-CAM); (2) the Microgrid Design Toolkit (MDT); (3) the Renewable Energy Optimization (REopt) Tool; and (4) the Hybrid Optimization Model for Electric Renewables (HOMER).

  5. Integrating Reliability Analysis with a Performance Tool

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  6. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  7. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles cover reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support making data Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  8. AvoPlot: An extensible scientific plotting tool based on matplotlib

    Directory of Open Access Journals (Sweden)

    Nial Peters

    2014-02-01

    AvoPlot is a simple-to-use graphical plotting program written in Python, making extensive use of the matplotlib plotting library. It can be found at http://code.google.com/p/avoplot/. In addition to providing a user-friendly interface to the powerful capabilities of the matplotlib library, it also offers users the possibility of extending its functionality by creating plug-ins. These can import specific types of data into the interface and also provide new tools for manipulating them. In this respect, AvoPlot is a convenient platform for researchers to build their own data analysis tools on top of, as well as being a useful standalone program.

  9. The Brazilian Experience with Agroecological Extension: A Critical Analysis of Reform in a Pluralistic Extension System

    Science.gov (United States)

    Diesel, Vivien; Miná Dias, Marcelo

    2016-01-01

    Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and to reflect on its wider theoretical implications for the extension reform literature. Design/methodology/approach: Using a critical public analysis, we characterize the evolution of Brazilian federal extension policy…

  10. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel

    2014-01-01

    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks that structure the inter-connections between these sites. Including contributions from a range of academic disciplines including Political Science, Media and Communication Studies, Economics, and Computer Science, this study showcases a new methodological approach that has been expressly designed to capture...

  11. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  12. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    Science.gov (United States)

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  13. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10,000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  14. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    Science.gov (United States)

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  15. VCAT: Visual Crosswalk Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  16. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  17. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  18. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  19. Organizing to Use Facebook Advertisements: A Planning Tool for Extension Professionals, Businesses, and Communities

    Science.gov (United States)

    Barnes, James

    2016-01-01

    The purpose of this article is to explain how Extension professionals, businesses, and communities can use Facebook advertisements effectively. The article is a planning tool that introduces Facebook's Advertiser Help Center, explains some applicable key concepts, and suggests best practices to apply before launching a Facebook advertising…

  20. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    of a pre-existing, open-source modeling and analysis framework known as Ptolemy II (http://ptolemy.org). The University of California, Berkeley ... worked with the Air Force Research Laboratory, Rome Research Site on adapting Ptolemy II for modeling and simulation of large-scale dynamics of Political ... capabilities were prototyped in Ptolemy II and delivered via version control and software releases. Each of these capabilities specifically supports one or

  1. A Meta-Analysis of Extensive Reading Research

    Science.gov (United States)

    Nakanishi, Takayuki

    2015-01-01

    The purposes of this study were to investigate the overall effectiveness of extensive reading, whether learners' age impacts learning, and whether the length of time second language learners engage in extensive reading influences test scores. The author conducted a meta-analysis to answer research questions and to identify future research…

  2. GenApp: Extensible Tool for Rapid Generation of Web and Native GUI Applications

    OpenAIRE

    Savelyev, Alexey; Brookes, Emre

    2017-01-01

    GenApp (Generalized Application Framework) is a universal and extensible tool for rapid deployment of scientific codes onto web platforms and generation of standalone GUI applications. Among the main unique features of GenApp are the minimal technical expertise requirement for the end user and an open-end design ensuring sustainability of generated applications. To produce fully functional applications GenApp weaves libraries of fragments and user defined modules as di...

  3. Income distribution: Boltzmann analysis and its extension

    Science.gov (United States)

    Yuqing, He

    2007-04-01

    The paper aims at describing income distribution in moderate-income regions. Dividing income behaviors into two parts, random and deterministic, and introducing an "instantaneous model" for theoretical derivations and a "cumulative model" for empirical tests, this paper applies the equilibrium approach of statistical mechanics to the study of the nonconserved individual income process. In the instantaneous model, the random income follows a stationary distribution similar to the Maxwell-Boltzmann distribution. Combining this result with marginal analysis, the probability distribution of the individual income process, composed of the random and deterministic income courses, approximately obeys a distribution law mixing an exponential function with a logarithmic prefactor. The distribution law has been tested using census and income survey data from the USA, UK, Japan, and New Zealand. The results show that it agrees very well with most of the empirical data. The discussion suggests that essentially different income processes may occur in moderate- and high-income regions.
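
    For concreteness, one plausible rendering of the "exponential with a logarithmic prefactor" law described in words above (the paper's exact functional form, normalization and parameters are not reproduced here) is:

        \[
          f(x) \;\propto\; \ln\!\left(\frac{x}{x_0}\right) e^{-x/T},
          \qquad x \ge x_0,
        \]

    with x the individual income, T an effective "income temperature" set by the random component, and x_0 a lower reference income; a pure Boltzmann-like exponential is recovered where the logarithmic factor varies slowly.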

  4. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of temporal maximum autocorrelation factor analysis to global monthly mean values of 1996-1997 sea surface temperature (SST) and sea surface height (SSH) data. This type of analysis can be considered an extension of traditional empirical orthogonal function (EOF) analysis, which provides a non-temporal analysis of one variable over time. The temporal extension proves its strength in separating the signals at different periods in an analysis of relevant oceanographic properties related to one of the largest El Niño events ever recorded.
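
    For reference, the standard temporal maximum autocorrelation factor (MAF) formulation (details may differ from the paper): with \Sigma the covariance of the field x(t) and \Sigma_\Delta the covariance of the temporal difference x(t+\Delta) - x(t), the factor weights w solve the generalized eigenproblem

        \[
          \Sigma_\Delta \, w = \lambda \, \Sigma \, w ,
          \qquad
          \rho(\Delta) = 1 - \tfrac{1}{2}\,\lambda ,
        \]

    so the smallest eigenvalues yield the most temporally coherent (slowly varying) modes, which is what distinguishes MAF from plain EOF analysis.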

  5. ASAP: An Extensible Platform for State Space Analysis

    DEFF Research Database (Denmark)

    Westergaard, Michael; Evangelista, Sami; Kristensen, Lars Michael

    2009-01-01

    The ASCoVeCo State space Analysis Platform (ASAP) is a tool for performing explicit state space analysis of coloured Petri nets (CPNs) and other formalisms. ASAP supports a wide range of state space reduction techniques and is intended to be easy to extend and to use, making it a suitable tool...

  6. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  7. Climate Data Analysis Tools (CDAT)

    Science.gov (United States)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces, including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System, or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System, or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor-intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop an intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL, will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and, as a result of its flexibility, PCMDI has integrated other popular software components, such as the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware
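
    As a flavor of the scripting style described above, here is a minimal sketch using CDAT's cdms2 (data access) and vcs (visualization) modules; the file name, variable name, and time range are placeholders:

        import cdms2
        import vcs

        # Open a netCDF dataset and read a time slab of one variable.
        f = cdms2.open("model_output.nc")
        tas = f("tas", time=("1996-1-1", "1997-12-31"))

        canvas = vcs.init()      # open a VCS canvas
        canvas.plot(tas[0])      # quick-look plot of the first time step
        f.close()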

  8. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  9. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  10. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  11. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.
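
    The amortization modeling mentioned in the list is the standard capital-recovery calculation. A minimal Python sketch (the dollar amount, rate, and term below are illustrative, not RTDP figures):

        def annualized_cost(principal, annual_rate, years):
            """Capital-recovery (amortization) formula:
            A = P * r / (1 - (1 + r)**-n)."""
            r = annual_rate
            return principal * r / (1.0 - (1.0 + r) ** -years)

        # Example: a $2.5M retrieval system amortized over 10 years at 7%:
        print(f"${annualized_cost(2_500_000, 0.07, 10):,.0f} per year")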

  12. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  13. The ePNK: An Extensible Petri Net Tool for PNML

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    ... all kinds of Petri nets. To this end, PNML introduced the concept of Petri Net Type Definitions. There are many tools supporting one form of PNML or another. In particular, there is the PNML Framework, which helps tool developers implementing an interface to PNML by providing a framework and an API ... tool needs to be regenerated. Moreover, the PNML Framework does not come with a graphical editor. The ePNK overcomes these limitations: it provides an extension-point so that new Petri net types can be plugged into the ePNK without touching the code of the ePNK. For defining a new Petri net type, the developer, basically, needs to give a class diagram defining the concepts of the new Petri net type, along with a mapping of these concepts to XML syntax. This type can then be plugged into the ePNK, and its graphical editor will be able to edit nets of this new type with all its features. This paper ...

  14. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    In recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  15. STUDENT PERFORMANCE PARAMETER ANALYSIS USING SPSS TOOL

    OpenAIRE

    Bhavesh Patel; Jyotindra Dharwa

    2017-01-01

    SPSS is a statistical analysis tool. It is used for analyzing large volumes of available data and extracting useful information and knowledge to support major decision-making processes. SPSS can be applied in the educational sector to improve student performance by identifying the parameters that most strongly affect it. This research study was carried out by collecting student performance parameters and their related dataset. In this research study we have col...

  16. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    Directory of Open Access Journals (Sweden)

    Brinkley James F

    2007-10-01

    Abstract. Background: Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality to general-purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open-source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results: We have developed a new open-source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization; Java and Java3D for true cross-platform portability; one-click installation and startup; integrated data management to help organize large studies; extensibility through plugins; transparent remote visualization; and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion: MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  17. Development Roadmap of an Evolvable and Extensible Multi-Mission Telecom Planning and Analysis Framework

    Science.gov (United States)

    Cheung, Kar-Ming; Tung, Ramona H.; Lee, Charles H.

    2003-01-01

    In this paper, we describe the development roadmap and discuss the various challenges of an evolvable and extensible multi-mission telecom planning and analysis framework. Our long-term goal is to develop a set of powerful, flexible telecommunications analysis tools that can be easily adapted to different missions while maintaining the common Deep Space Communication requirements. The ability to re-use the DSN ground models and the common software utilities in our adaptations has contributed significantly to our development efforts in terms of consistency, accuracy, and minimal effort redundancy, which translates into shorter development time and major cost savings for the individual missions. In our roadmap, we address the design principles, technical achievements and associated challenges for the following telecom analysis tools: (i) Telecom Forecaster Predictor - TFP; (ii) Unified Telecom Predictor - UTP; (iii) Generalized Telecom Predictor - GTP; (iv) Generic TFP; (v) Web-based TFP; (vi) Application Program Interface - API; (vii) Mars Relay Network Planning Tool - MRNPT.

  18. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    ... steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual-use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

  20. VALIDATION OF THE TOOL "ENTERPRISE ARCHITECTURE ANALYSIS TOOL"

    OpenAIRE

    Österlind, Magnus

    2011-01-01

    The Enterprise Architecture Analysis Tool, EAAT, is a software tool developed by the department of Industrial Information- and Control systems, ICS, at the Royal Institute of Technology, Stockholm, Sweden. EAAT is a modeling tool that combines Enterprise Architecture (EA) modeling with probabilistic relational modeling. Therefore EAAT makes it possible to design, describe and analyze the organizational structure, business processes, information systems and infrastructure within an enterprise....

  1. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  2. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus of the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  3. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
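
    As an illustration of the merging step in the procedure above, the sketch below chains pairwise rigid transforms (rotation plus translation) so that a point marked near one cube can be expressed in cube 1's global frame. The transforms and the point are invented for the example and are not taken from the NASA system.

        import numpy as np

        def compose(T_ab, T_bc):
            """Compose two rigid transforms given as (R, t) pairs, so that
            a point expressed in frame c is mapped into frame a."""
            R_ab, t_ab = T_ab
            R_bc, t_bc = T_bc
            return R_ab @ R_bc, R_ab @ t_bc + t_ab

        def to_global(chain):
            """Fold a chain of cube-to-cube transforms into transforms that
            map each cube's local frame into cube 1's (global) frame."""
            transforms = [(np.eye(3), np.zeros(3))]  # cube 1 is the reference
            for T in chain:
                transforms.append(compose(transforms[-1], T))
            return transforms

        # Invented example: cube 2 sits 1.5 m along x from cube 1, rotated
        # 90 degrees about the z axis.
        Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
        chain = [(Rz, np.array([1.5, 0.0, 0.0]))]
        R, t = to_global(chain)[1]
        p_local = np.array([0.1, 0.0, 0.0])  # a point marked near cube 2
        print(R @ p_local + t)               # the same point in cube 1's frame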

  4. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  5. Extension joints: a tool to infer the active stress field orientation (case study from southern Italy)

    Science.gov (United States)

    De Guidi, Giorgio; Caputo, Riccardo; Scudero, Salvatore; Perdicaro, Vincenzo

    2013-04-01

    An intense tectonic activity in eastern Sicily and southern Calabria is well documented by the differential uplift of Late Quaternary coastlines and by the record of strong historical earthquakes. The extensional belt that crosses this area is dominated by a well-established WNW-ESE-oriented extensional direction. However, this area largely lacks structural analyses able to define the tectonics at a more local scale. In an attempt to fill this gap in knowledge, we carried out a systematic analysis of extension joint sets. The systematic field collection of these extensional features, coupled with an appropriate inversion technique, makes it possible to determine the characteristics of the causative tectonic stress field. Joints are defined as outcrop-scale mechanical discontinuities showing no evidence of shear motion and originating as purely extensional fractures. Such tectonic features are among the most common deformational structures in every tectonic environment and are particularly abundant in the study area. A particular arrangement of joints, called a "fracture grid-lock system" and defined as an orthogonal joint system in which mutual abutting and crosscutting relationships characterize two geologically coeval joint sets, allows the direction and the magnitude of the tectonic stress field to be inferred. We performed the analyses of joints only on Pleistocene deposits of eastern Sicily and southern Calabria. Moreover, we investigated only calcarenite sediments and cemented deposits, avoiding clayish and loose matrix-supported clastic sediments where the deformation is generally accommodated in a distributed way through the relative motion between single particles. In the selection of the sites, we also took into account the possibility of clearly observing the geometric relationships among the joints. For this reason we chose curvilinear road cuts or cliffs, wide coastal erosional surfaces and quarries. The numerical inversions show a similar stress

  6. Cost Effectiveness Ratio: Evaluation Tool for Comparing the Effectiveness of Similar Extension Programs

    Science.gov (United States)

    Jayaratne, K. S. U.

    2015-01-01

    Extension educators have been challenged to be cost effective in their educational programming. The cost effectiveness ratio is a versatile evaluation indicator for Extension educators to compare the cost of achieving a unit of outcomes or educating a client in similar educational programs. This article describes the cost effectiveness ratio and…
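
    The indicator itself reduces to a simple division, sketched below with invented figures: of two similar programs, the one with the lower cost per client educated is the more cost-effective.

        def cost_effectiveness_ratio(total_cost, outcome_units):
            """Cost of achieving one unit of outcome, e.g. dollars per
            client educated."""
            return total_cost / outcome_units

        # Invented comparison: program A educates 400 clients for $12,000;
        # program B educates 250 clients for $9,000.
        print(cost_effectiveness_ratio(12_000, 400))  # 30.0 $/client
        print(cost_effectiveness_ratio(9_000, 250))   # 36.0 $/client, so A wins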

  7. Performance analysis of GYRO: a tool evaluation

    International Nuclear Information System (INIS)

    Worley, P; Candy, J; Carrington, L; Huck, K; Kaiser, T; Mahinthakumar, G; Malony, A; Moore, S; Reed, D; Roth, P; Shan, H; Shende, S; Snavely, A; Sreepathi, S; Wolf, F; Zhang, Y

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses

  8. Cauchy-Kovalevskaya extension theorem in fractional Clifford analysis

    OpenAIRE

    Vieira, Nelson

    2015-01-01

    In this paper, we establish the fractional Cauchy-Kovalevskaya extension (FCK-extension) theorem for fractional monogenic functions defined on R^d. Based on this extension principle, fractional Fueter polynomials, forming a basis of the space of fractional spherical monogenics, i.e. fractional homogeneous polynomials, are introduced. We study the connection between the FCK-extension of functions of the form x^\alpha P_l and the classical Gegenbauer polynomials. Finally we present two examp...

  9. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
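
    As background to what such an engine quantifies, the sketch below computes one classical divergence, the Kullback-Leibler divergence of an observed histogram from an "ideal" distribution. It is illustrative Python, not the engine's C++ interface.

        import numpy as np

        def kl_divergence(empirical, theoretical, eps=1e-12):
            """D(P || Q) = sum_i p_i * log(p_i / q_i), with both inputs
            given as counts or probabilities over the same bins."""
            p = np.asarray(empirical, float); p = p / p.sum()
            q = np.asarray(theoretical, float); q = q / q.sum()
            mask = p > 0
            return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

        # Invented data: observed timings in 4 bins vs. a uniform ideal.
        print(kl_divergence([50, 30, 15, 5], [1, 1, 1, 1]))  # ~0.24 nats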

  10. Tools for T-RFLP data analysis using Excel.

    Science.gov (United States)

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
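
    As a sketch of the first preprocessing steps described above (noise baseline threshold, analysis range, normalization), the following shows how a single T-RF profile might be treated. The peaks and thresholds are invented, and the actual template implements these steps as Excel macros rather than Python.

        import numpy as np

        def preprocess_profile(heights, noise_threshold, analysis_range):
            """Zero out peaks below the noise baseline or outside the
            analysis range, then normalize to relative abundances."""
            lo, hi = analysis_range
            sizes = np.arange(len(heights))
            keep = (heights >= noise_threshold) & (sizes >= lo) & (sizes <= hi)
            profile = np.where(keep, heights, 0.0)
            total = profile.sum()
            return profile / total if total > 0 else profile

        # Peak heights indexed by fragment size (bp); drop noise below 50
        # units and restrict the analysis range to 30-500 bp.
        raw = np.zeros(600)
        raw[[75, 142, 310, 560]] = [800, 45, 1200, 300]
        rel = preprocess_profile(raw, noise_threshold=50, analysis_range=(30, 500))
        print(rel[75], rel[310])  # 0.4 and 0.6 of the retained signal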

  11. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  12. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-01-01

    Work is in progress on interactive tools for linear and non-linear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries, were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties

  13. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    Science.gov (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  15. KAFE - A Flexible Image Analysis Tool

    Science.gov (United States)

    Burkutean, Sandra

    2017-11-01

    We present KAFE, the Keywords of Astronomical FITS-Images Explorer - a web-based FITS images post-processing analysis tool designed to be applicable in the radio to sub-mm wavelength domain. KAFE was developed to enrich selected FITS files with metadata based on a uniform image analysis approach as well as to provide advanced image diagnostic plots. It is ideally suited for data mining purposes and multi-wavelength/multi-instrument data samples that require uniform data diagnostic criteria.
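
    A minimal sketch of this kind of metadata enrichment, assuming the astropy library for FITS access; the keyword names and values are invented, and KAFE's own interface is not reproduced here.

        from astropy.io import fits  # assumed dependency, not KAFE's code

        def enrich_fits(path, keywords):
            """Append derived keywords (value, comment) to a FITS primary
            header, in the spirit of the enrichment described above."""
            with fits.open(path, mode="update") as hdul:
                header = hdul[0].header
                for key, (value, comment) in keywords.items():
                    header[key] = (value, comment)
                hdul.flush()

        # Invented keywords, assuming an existing file map.fits:
        enrich_fits("map.fits", {
            "RMSNOISE": (0.0021, "Image rms noise [Jy/beam], derived"),
            "DYNRANGE": (412.0, "Peak / rms dynamic range, derived"),
        })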

  16. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  17. Applying New Diabetes Teaching Tools in Health-Related Extension Programming

    Science.gov (United States)

    Grenci, Alexandra

    2010-01-01

    In response to the emerging global diabetes epidemic, health educators are searching for new and better education tools to help people make positive behavior changes to successfully prevent or manage diabetes. Conversation Maps[R] are new learner-driven education tools that have been developed to empower individuals to improve their health…

  18. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
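
    As an illustration of the event-tree arithmetic underlying the first tool, the sketch below propagates branch probabilities down a small tree to an expected loss; the tree and its numbers are invented, not values from the EXPLORIS project.

        def expected_loss(tree):
            """Expected loss of an event tree given as a list of
            (probability, subtree-or-terminal-loss) branches."""
            total = 0.0
            for prob, outcome in tree:
                if isinstance(outcome, list):
                    total += prob * expected_loss(outcome)
                else:
                    total += prob * outcome
            return total

        # Toy tree: P(unrest escalates to eruption) = 0.3; given eruption,
        # P(flow reaches town) = 0.2 with loss 100 (arbitrary units), else 5.
        tree = [(0.3, [(0.2, 100.0), (0.8, 5.0)]), (0.7, 0.0)]
        print(expected_loss(tree))  # 0.3 * (0.2*100 + 0.8*5) = 7.2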

  19. The proteios software environment: an extensible multiuser platform for management and analysis of proteomics data.

    Science.gov (United States)

    Häkkinen, Jari; Vincic, Gregory; Månsson, Olle; Wårell, Kristofer; Levander, Fredrik

    2009-06-01

    Proteome analysis involves many steps that generate large quantities of data in different formats. This creates a need for automatic data merging and extraction of important features from data. Furthermore, metadata need to be collected and reported to enable critical evaluation of results. Many data analysis tools are developed locally in research laboratories and are nontrivial to adapt for other laboratories, preventing optimal exploitation of generated data. The proteomics field would benefit from user-friendly analysis and data management platforms in which method developers can make their analysis tools available for the community. Here, we describe the Proteios Software Environment (ProSE) that is built around a Web-based local data repository for proteomics experiments. The application features sample tracking, project sharing between multiple users, and automated data merging and analysis. ProSE has built-in support for several quantitative proteomics workflows, and integrates searching in several search engines, automated combination of the search results with predetermined false discovery rates, annotation of proteins and submission of results to public repositories. ProSE also provides a programming interface to enable local extensions, as well as database access using Web services. ProSE provides an analysis platform for proteomics research and is targeted at multiuser projects with a need to share data, sample tracking, and analysis results. ProSE is open source software available at http://www.proteios.org .
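
    The abstract does not spell out how ProSE combines search results at a predetermined false discovery rate; the Benjamini-Hochberg step-up rule sketched below is one standard way of turning a list of p-values and a target FDR into a cutoff, shown here with invented p-values.

        import numpy as np

        def bh_threshold(pvalues, fdr=0.01):
            """Largest p-value satisfying the Benjamini-Hochberg criterion
            p_(k) <= (k / m) * fdr; returns 0.0 if nothing passes."""
            p = np.sort(np.asarray(pvalues, float))
            m = len(p)
            ranks = np.arange(1, m + 1)
            passing = p <= ranks / m * fdr
            return float(p[passing].max()) if passing.any() else 0.0

        pvals = [0.0002, 0.009, 0.0011, 0.04, 0.3]
        cut = bh_threshold(pvals, fdr=0.05)
        print(cut, [v for v in pvals if v <= cut])  # 0.04 and four hits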

  20. Analysis of issues related to organizational commitment of extension ...

    African Journals Online (AJOL)

    Organizational commitment of extension personnel in Oyo and Ogun States Agricultural Development Programmes was studied. Organizational commitment is the degree to which the organization members identify with the values and goals of their organization. A census of the extension personnel in both OYSADEP (312) ...

  1. Analysis of selected issues in Swaziland's agricultural extension

    African Journals Online (AJOL)

    This paper describes the development of agricultural extension in Swaziland with regard to its history; organizational philosophy, mission, goals and objectives; implementation delivery system and evaluation; policy framework; funding; linkages between agricultural extension (AE) and research; the planning of AE activities; ...

  2. Analysis of the work environment of extension personnel in Benue ...

    African Journals Online (AJOL)

    This investigation was undertaken to ascertain the environment in which extension workers transact their legitimate professional duties in Benue and Plateau States. The analyses of the results obtained indicate that extension workers operate in a less satisfying environment to the extent that about 67.3%, 36.9%, 88.2%, ...

  4. Use of Interactive Electronic Audience Response Tools (Clickers) to Evaluate Knowledge Gained in Extension Programming

    Science.gov (United States)

    Gunn, Patrick; Loy, Dan

    2015-01-01

    Effectively measuring short-term impact, particularly a change in knowledge resulting from Extension programming, can prove to be challenging. Clicker-based technology, when used properly, is one alternative that may allow educators to better evaluate this aspect of the logic model. While the potential interface between clicker technology and…

  5. Animal Agriculture in a Changing Climate Online Course: An Effective Tool for Creating Extension Competency

    Science.gov (United States)

    Whitefield, Elizabeth; Schmidt, David; Witt-Swanson, Lindsay; Smith, David; Pronto, Jennifer; Knox, Pam; Powers, Crystal

    2016-01-01

    There is a need to create competency among Extension professionals on the topic of climate change adaptation and mitigation in animal agriculture. The Animal Agriculture in a Changing Climate online course provides an easily accessible, user-friendly, free, and interactive experience for learning science-based information on a national and…

  6. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  7. Extension of an Object Oriented Multidisciplinary Analysis Optimization (MDAO) Environment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Multidisciplinary design, analysis, and optimization (MDAO) tools today cover only a limited set of disciplines and offer little high-fidelity modeling capability. These tools are...

  8. Extension of perceived arm length following tool-use: clues to plasticity of body metrics.

    Science.gov (United States)

    Sposito, Ambra; Bolognini, Nadia; Vallar, Giuseppe; Maravita, Angelo

    2012-07-01

    Humans hold a very accurate representation of the metrics of their body parts. Recent evidence shows that the spatial estimation of body part length, as assessed through a bisection task, is even more accurate than that of non-corporeal extrapersonal objects (Sposito, Bolognini, Vallar, Posteraro, & Maravita, 2009). In the present paper we show that human participants estimate the mid-point of their forearm, which was kept in a radial posture, to be more distal following a 15-min training with a 60 cm-long tool as compared to pre-tool-use. This outcome is compatible with an increased representation of the participants' forearm length. Control experiments show that this result was not due to a mere distal proprioceptive shift induced by tool-use, and was not replicated following the use of a 20 cm-long, functionally irrelevant tool. These results strongly support the view that, although the inner knowledge of one's own body metrics appears to be one of the more stable features of body representation, body-space interactions requiring the use of tools that extend the natural range of action entail measurable dynamic changes in the representation of body metrics. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. HOTRES: renewable energies in the hotels. An extensive technical tool for the hotel industry

    Energy Technology Data Exchange (ETDEWEB)

    Karagiorgas, Michaelis; Drosou, Vassiliki [Centre for Renewable Energy Sources (CRES), 19th km Marathonos Avenue, Pikermi GR-19009 (Greece); Tsoutsos, Theocharis [Environmental Engineering Department, Technical University of Crete (TUC), University Campus, Kounoupidiana (Greece); Pouffary, Stephane [Renewable Energies Department (ADEME), 500 route des Lucioles, 06560 Valbonne-Sophia-Antipolis (France); Pagano, Tulio [Azienda Municipale del Gas (AMG), via Ammiraglio Gravina 2/e-90139 Palermo (Italy); Lara, German Lopez [Sociedad para el Desarrollo Energetico de Andalucia (SODEAN), C/Isaac Newton, s/n 41092 Sevilla (Spain); Melim Mendes, Jose Manuel [Agencia Regional da Energia e Ambiente da Regiao Autonoma da Madeira (AREAM), Madeira Tecnopolo, 9000-390 Funchal NIPC 511058012 (Portugal)

    2006-06-15

    The HOTRES project aimed at the systematic implementation of conditions for future massive application of renewable energies in the tourism industry. Under the umbrella of this project, five renewable energy technologies were promoted (solar thermal, solar passive, solar PV, biomass and geothermal energy) in parallel in five EU regions (East Attica, Sicily, Alpes-Maritimes, Andalusia and Madeira) by the corresponding agencies and promotion centers, following an extensive and intensive work program composed of six elaboration phases. The purpose of this article is to assess the results achieved in the technical-economic field of this extensive technical support project in 200 hotels, as well as to validate the strategic methodology applied for the promotion of the renewable energy technologies (RETs) through technical assistance to the hotel SMEs. Finally, demonstrating the reliability and economic viability of RET applications in hotels, the largest European hotel installation with solar thermal is presented with technical and economic details.

  10. Information Technologies as a Tool for Agricultural Extension and Farmer-to-Farmer Exchange: Mobile-Phone Video Use in Mali and Burkina Faso

    Science.gov (United States)

    Sousa, Fernando; Nicolay, Gian; Home, Robert

    2016-01-01

    Mobile phones are widespread in the rural areas of Mali and Burkina Faso, but their potential as a tool for knowledge transfer by extension services in the region remains largely unexplored. The aim of this contribution is to evaluate the potential of video on mobile phones as a tool for farmer-to-farmer exchange and agricultural extension in…

  11. Old people's extensive traumatic cerebral infarction (analysis of 48 cases)

    International Nuclear Information System (INIS)

    Xu Wenhui

    2000-01-01

    Objective: To analyse the mechanism of occurrence, clinical characteristics and prognosis of extensive traumatic cerebral infarction in old people. Method: Forty-eight such cases were observed and analysed. Results: Extensive traumatic cerebral infarction in old people has its own characteristics: it occurred mostly in the territory supplied by large branch vessels and produced obvious neurological deficits. Conclusion: The condition carries more clinical complications and a poor prognosis; the death rate is high

  12. Miniaturized multiwavelength digital holography sensor for extensive in-machine tool measurement

    Science.gov (United States)

    Seyler, Tobias; Fratz, Markus; Beckmann, Tobias; Bertz, Alexander; Carl, Daniel

    2017-06-01

    In this paper we present a miniaturized digital holographic sensor (HoloCut) for operation inside a machine tool. With state-of-the-art 3D measurement systems, short-range structures such as tool marks cannot be resolved inside a machine tool chamber. Up to now, measurements had to be conducted outside the machine tool, and thus processing data were generated offline. The sensor presented here uses digital multiwavelength holography to obtain 3D shape information from the machined sample. By using three wavelengths, we obtain a large artificial wavelength with a large unambiguous measurement range of 0.5 mm and achieve micron repeatability even in the presence of laser speckles on rough surfaces. In addition, a digital refocusing algorithm based on phase noise is implemented to extend the measurement range beyond the limits of the artificial wavelength and the geometrical depth of focus. With complex wave field propagation, the focus plane can be shifted after the camera images have been taken, and a sharp image with extended depth of focus is then constructed. With a 20 mm x 20 mm field of view, the sensor enables measurement of both macro- and micro-structure (such as tool marks) with an axial resolution of 1 µm and a lateral resolution of 7 µm, and consequently allows processing data to be generated online, which in turn qualifies it as a machine tool control. To make HoloCut compact enough for operation inside a machining center, the beams are arranged in two planes: they are split into reference beam and object beam in the bottom plane and later combined onto the camera in the top plane. Using a mechanical standard interface according to DIN 69893, and with a very compact size of 235 mm x 140 mm x 215 mm (W x H x D) and a weight of 7.5 kg, HoloCut can be easily integrated into different machine tools and extends no further in height than a typical processing tool.
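
    The "artificial wavelength" mentioned above is the beat wavelength of two laser lines; under the usual convention that a reflection measurement is unambiguous over half the beat wavelength, a pair of closely spaced lines already illustrates the 0.5 mm range. The lines below are invented for illustration, not HoloCut's actual lasers.

        def synthetic_wavelength(l1, l2):
            """Beat ('artificial') wavelength of two laser lines (same units)."""
            return l1 * l2 / abs(l1 - l2)

        # Invented wavelengths in nm; 0.4 nm spacing gives a ~1 mm beat.
        lam = synthetic_wavelength(632.8, 632.4)
        print(lam / 1e6, "mm artificial wavelength")     # ~1.0 mm
        print(lam / 2e6, "mm unambiguous height range")  # ~0.5 mm (Lambda / 2)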

  13. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.
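
    The abstract does not reproduce the fitted model itself; a Preston-type relation (removal proportional to contact pressure, relative speed and dwell time) is the classical starting point for this kind of footprint prediction. The sketch below uses that relation with invented coefficients, not the paper's fitted values.

        def removal_depth(k_p, pressure, velocity, dwell_time):
            """Preston-type model: depth ~ k_p * pressure * speed * time.
            Units must be consistent (here SI: Pa, m/s, s -> metres)."""
            return k_p * pressure * velocity * dwell_time

        # Invented values; k_p would be fitted per abrasive/material pair.
        depth_m = removal_depth(k_p=5e-12, pressure=5e4, velocity=0.8, dwell_time=3.0)
        print(f"{depth_m * 1e6:.2f} um removed at the footprint centre")  # 0.60 um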

  14. Analysis of integrated plant upgrading/life extension programs

    International Nuclear Information System (INIS)

    McCutchan, D.A.; Massie, H.W. Jr.; McFetridge, R.H.

    1988-01-01

    A present-worth generating cost model has been developed and used to evaluate the economic value of integrated plant upgrading/life extension projects in nuclear power plants. This paper shows that integrated plant upgrading programs can be developed in which a mix of near-term availability, power rating, and heat rate improvements can be obtained in combination with life extension. All significant benefits and costs are evaluated from the viewpoint of the utility, as measured in discounted revenue requirement differentials between alternative plans which are equivalent in system generating capacity. The near-term upgrading benefits are shown to enhance the benefit picture substantially. In some cases the net benefit is positive even if the actual life extension proves to be shorter than expected
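
    As a sketch of the comparison described above, the snippet below discounts two streams of annual revenue requirements to present worth and takes their differential. All figures and the discount rate are invented for illustration and are not taken from the paper's model.

        def present_worth(cash_flows, rate):
            """Discount a stream of annual revenue requirements
            (years 1..n) to present value at the given rate."""
            return sum(cf / (1 + rate) ** yr
                       for yr, cf in enumerate(cash_flows, start=1))

        # Invented revenue requirements in M$/yr over a 10-year horizon:
        plan_a = [40] * 10             # upgrade and extend plant life
        plan_b = [25] * 5 + [70] * 5   # retire early, buy replacement power
        diff = present_worth(plan_a, 0.08) - present_worth(plan_b, 0.08)
        print(diff)  # negative here, so plan A has lower revenue requirements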

  15. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT was written to meet the need for an interactive graphical tool to explore longitudinal phase space. It is intended for quickly testing new ideas or new tricks. It is especially suitable for machine physicists as well as operations staff, both in the control room during machine studies and off-line to analyze data. The heart of the package contains a set of C routines to do the number crunching. The graphics part is wired together with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One usage of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation

  16. Analytic calculation of radio emission from parametrized extensive air showers: A tool to extract shower parameters

    Science.gov (United States)

    Scholten, O.; Trinh, T. N. G.; de Vries, K. D.; Hare, B. M.

    2018-01-01

    The radio intensity and polarization footprint of a cosmic-ray induced extensive air shower is determined by the time-dependent structure of the current distribution residing in the plasma cloud at the shower front. In turn, the time dependence of the integrated charge-current distribution in the plasma cloud, the longitudinal shower structure, is determined by interesting physics which one would like to extract, such as the location and multiplicity of the primary cosmic-ray collision or the values of electric fields in the atmosphere during thunderstorms. To extract the structure of a shower from its footprint requires solving a complicated inverse problem. For this purpose we have developed a code that semianalytically calculates the radio footprint of an extensive air shower given an arbitrary longitudinal structure. This code can be used in an optimization procedure to extract the optimal longitudinal shower structure given a radio footprint. On the basis of air-shower universality we propose a simple parametrization of the structure of the plasma cloud. This parametrization is based on the results of Monte Carlo shower simulations. Deriving the parametrization also teaches which aspects of the plasma cloud are important for understanding the features seen in the radio-emission footprint. The calculated radio footprints are compared with microscopic CoREAS simulations.

  17. Analysis of risk management practices among extension workers in ...

    African Journals Online (AJOL)

    The extension workers were exposed to hazards/risks such as accidents on the farm, car/motorcycle/bicycle accidents, injuries to the leg/hand, and encounters with ... practices among workers in government establishments include: lack of safety training, absence of a risk management policy and non-provision of safety gadgets. Keywords: ...

  18. Analysis of Training Needs of Extension Agents on Climate Change ...

    African Journals Online (AJOL)

    extension agents on climate change related issues were educating farmers on pest control (90.9%), rendering of technical advice to farmers (84.8%), establishment of SPAT to monitor climate change impacts (81.8%) and indigenous technology development to mitigate climate change impacts (81.8%). The training needs in ...

  19. Accessibility Analyst: an integrated GIS tool for accessibility analysis in urban transportation planning

    OpenAIRE

    Suxia Liu; Xuan Zhu

    2004-01-01

    The authors present an integrated GIS tool, Accessibility Analyst, for accessibility analysis in urban transportation planning, built as an extension to the desktop GIS software package, ArcView. Accessibility Analyst incorporates a number of accessibility measures, ranging from catchment profile analysis to cumulative-opportunity measures, gravity-type measures, and utility-based measures, contains several travel-impedance measurement tools for estimating the travel distance, time, or cost b...
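
    As an illustration of one measure in that range, a gravity-type accessibility score weights the opportunities in each destination zone by a decaying function of travel impedance. The zones, travel times and decay parameter below are invented.

        import numpy as np

        def gravity_accessibility(opportunities, travel_times, beta=0.1):
            """Gravity-type measure A_i = sum_j O_j * exp(-beta * t_ij),
            where O_j are opportunities in zone j and t_ij travel times."""
            O = np.asarray(opportunities, float)
            T = np.asarray(travel_times, float)
            return (O[None, :] * np.exp(-beta * T)).sum(axis=1)

        # Three zones with jobs as opportunities, travel times in minutes:
        jobs = [1000, 500, 2000]
        times = [[5, 20, 30],
                 [20, 5, 15],
                 [30, 15, 5]]
        print(gravity_accessibility(jobs, times))  # one score per origin zone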

  20. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of Artificial Intelligence to connect human and computer perceptions of how data and scientific techniques should be applied, while multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  1. Bibliological analysis: security tool in collections of rare books

    Directory of Open Access Journals (Sweden)

    Raphael Diego Greenhalgh

    2015-04-01

    The historical, cultural and economic value of rare books means that they are frequently the object of robbery and theft. Therefore, the institutions that hold this type of collection have to strengthen their security in order to inhibit such criminal practices. Through a literature review, this paper analyzes the possibility of using bibliological analysis as a security tool. With a detailed description of the copies it is possible to increase knowledge of an institution's collection and to attribute unequivocal ownership to rare specimens, which do not always receive stamps or other marks, making it difficult to return the books in cases of robbery and theft and to eventually recover them. Bibliological analysis therefore individualizes each copy, besides allowing extensive knowledge of it, both essential factors for the security of the books.

  2. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe how to compute the probability of reaching, and the value of associated rewards in, states of interest for a real-world example from a case company in the Danish baked goods industry. The developments are presented in a generalised fashion to make them relevant to the general problem of implementing quantitative probabilistic model checking of graph...
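
    As background to the kind of query such quantitative model checking answers, the sketch below estimates the probability that a small Markov-chain stand-in for a workflow reaches a state of interest. The chain and its numbers are invented; SBAT's own formalism is not reproduced here.

        import numpy as np

        def reach_probability(P, start, targets, steps=200):
            """Probability of reaching any target state within `steps`
            transitions of a row-stochastic Markov chain P."""
            p = np.zeros(P.shape[0])
            p[start] = 1.0
            reached = 0.0
            for _ in range(steps):
                reached += p[targets].sum()
                p[targets] = 0.0     # absorb mass that hit a target
                p = P.T @ p          # propagate one step
            return reached

        # Toy bakery workflow: 0 = mixing, 1 = baking, 2 = reject (target).
        P = np.array([[0.0, 0.9, 0.1],
                      [0.8, 0.0, 0.2],
                      [0.0, 0.0, 1.0]])
        print(reach_probability(P, start=0, targets=[2]))  # tends to 1.0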

  3. First evidence of an extensive Acheulean large cutting tool accumulation in Europe from Porto Maior (Galicia, Spain).

    Science.gov (United States)

    Méndez-Quintas, E; Santonja, M; Pérez-González, A; Duval, M; Demuro, M; Arnold, L J

    2018-02-15

    We describe a European Acheulean site characterised by an extensive accumulation of large cutting tools (LCT). This type of Lower Paleolithic assemblage, with dense LCT accumulations, has only been found on the African continent and in the Near East until now. The identification of a site with large accumulations of LCTs favours the hypothesis of an African origin for the Acheulean of Southwest Europe. The lithic tool-bearing deposits date back to 293-205 thousand years ago. Our chronological findings confirm temporal overlap between sites with clear "African" Acheulean affinities and Early Middle Paleolithic sites found elsewhere in the region. These complex technological patterns could be consistent with the potential coexistence of different human species in south-western Europe during the Middle Pleistocene.

  4. Barcode extension for analysis and reconstruction of structures

    Science.gov (United States)

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L.; Gootenberg, Jonathan S.; Yin, Peng

    2017-03-01

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures.
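
    As a sketch of the quantification step, counting barcode occurrences across sequencing reads yields each strand's relative abundance. The barcodes and reads below are invented, and real pipelines handle sequencing errors and read mapping far more carefully.

        from collections import Counter

        def barcode_abundances(reads, barcodes):
            """Count how often each strand-identifying barcode occurs in a
            collection of sequencing reads."""
            counts = Counter()
            for read in reads:
                for name, bc in barcodes.items():
                    if bc in read:
                        counts[name] += 1
            return counts

        barcodes = {"strand_01": "ACGTTGCA", "strand_02": "TTGCACGT"}
        reads = ["NNACGTTGCANN", "NNTTGCACGTNN", "NNACGTTGCANN"]
        print(barcode_abundances(reads, barcodes))  # strand_01: 2, strand_02: 1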

  5. SEAT: A strategic engagement analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.; Michelsen, C.; Morgeson, D.

    1988-01-01

    The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

  6. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program
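
    As an illustration of the Bayesian side of such tools, the snippet below turns a test's sensitivity and specificity plus a pre-test (prevalence) estimate into a post-test probability via Bayes' theorem; the numbers are invented.

        def post_test_probability(sensitivity, specificity, prevalence):
            """P(disease | positive test) = sens*prev /
            (sens*prev + (1 - spec)*(1 - prev))."""
            tp = sensitivity * prevalence
            fp = (1 - specificity) * (1 - prevalence)
            return tp / (tp + fp)

        # A sign with sensitivity 0.90 and specificity 0.80, applied where
        # the lesion's pre-test probability is 10%:
        print(round(post_test_probability(0.90, 0.80, 0.10), 3))  # 0.333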

  7. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises which are not yet convinced to work on improving their setups. The methodology was developed after research which identified the problem: companies still struggle with long setup times, and many of them do nothing to decrease these times. A long setup is, by itself, not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a workstation which is a bottleneck with many setups; the goal there is to convince management to begin actions concerning setup improvement. The last three steps are related to a specific setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis, FMEA and others were used.

  8. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes a considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can consequently be derived for efficient performance of the risk investigations. This approach was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the methodology of assessment and the criteria of acceptance. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, the risk problems could be solved without delaying the planning process by using this established communication method. (authors)

  9. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes grinding components accuracy and metrology shear stress in cutting cutting temperature and analysis chatter They also address non-traditional machining, such as: electrical discharge machining electrochemical machining laser and electron beam machining A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  10. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U. S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  11. Airborne LIDAR Data Processing and Analysis Tools

    Science.gov (United States)

    Zhang, K.

    2007-12-01

    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, LIDAR systems collect voluminous, irregularly spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageably sized tiles, typically 4 square kilometers in dimension; 3) filtering point data to separate measurements of the ground from those of non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster-based analysis (see the sketch below); and 5) converting the gridded data into standard GIS import formats. ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
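
    As a minimal illustration of task 4, irregularly spaced elevations can be interpolated onto a regular grid. This Python sketch uses synthetic points and scipy's griddata; it is not the ALDPAT C++ implementation.

        # Sketch of gridding irregularly spaced LIDAR elevations onto a
        # regular raster (synthetic stand-in data, not ALDPAT itself).
        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 100, 500), rng.uniform(0, 100, 500)   # point locations (m)
        z = np.sin(x / 20) + 0.01 * y                               # stand-in elevations

        # 1 m raster covering the tile
        gx, gy = np.meshgrid(np.arange(0, 100, 1.0), np.arange(0, 100, 1.0))
        dem = griddata((x, y), z, (gx, gy), method="linear")        # NaN outside hull
        print(dem.shape)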

  12. Extensions of positive definite functions applications and their harmonic analysis

    CERN Document Server

    Jorgensen, Palle E T; Tian, Feng

    2016-01-01

    This monograph deals with the mathematics of extending given partial data-sets obtained from experiments. Experimentalists frequently gather spectral data when the observed data is limited, e.g., by the precision of instruments or by other limiting external factors. Here the limited information is a restriction, and the extensions take the form of a full positive definite function on some prescribed group. It is therefore both an art and a science to produce solid conclusions from restricted or limited data. While the theory of positive definite functions is important in many areas of pure and applied mathematics, it is difficult for students and for the novice to the field to find accessible presentations which cover all relevant points of view, as well as stressing common ideas and interconnections. We have aimed at filling this gap, and we have stressed hands-on examples.

  13. Extensive pathological analysis of selected melanoma sentinel lymph nodes

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Sjoegren, Pia; Hamilton-Dutoit, Stephen Jacques

    2008-01-01

    BACKGROUND: Extensive pathological workup of sentinel lymph nodes (SLNs) in melanoma detects more patients with metastasis-positive SLNs than do routine protocols, but at the cost of high laboratory workloads. We aimed to design a protocol that reduced this workload without compromising metastasis...... detection. METHODS: We analyzed 920 SLNs from 321 consecutive patients with melanoma by complete step sectioning and immunohistochemistry. We designed different models to theoretically reduce the number of histological sections examined and compared the results from these simulations with results obtained...... with our extended protocol, with the restricted national Danish protocol, and with the protocol recommended by the European Organization for Research and Treatment of Cancer (EORTC). RESULTS: The extended protocol increased the metastasis detection rate by 22% (95% confidence interval, 11-34; 30.8% vs. 25...

  14. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....
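
    A minimal sketch of the canonical correlations analysis described above, applied to synthetic stand-ins for the SST and SSH fields (scikit-learn's CCA; the shared component plays the role of a common El Niño-like mode):

        # Canonical correlations analysis on two synthetic multivariate sets.
        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(1)
        months = 24                                    # e.g. 1996-1997 monthly means
        shared = rng.normal(size=(months, 1))          # common signal in both fields
        sst = shared @ rng.normal(size=(1, 50)) + 0.5 * rng.normal(size=(months, 50))
        ssh = shared @ rng.normal(size=(1, 40)) + 0.5 * rng.normal(size=(months, 40))

        cca = CCA(n_components=1)
        u, v = cca.fit_transform(sst, ssh)
        print(np.corrcoef(u[:, 0], v[:, 0])[0, 1])     # leading canonical correlation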

  15. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  16. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  17. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows: The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features. Note that SBGAT Core can be utilized independently from SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT’s main developer, which can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html; this documentation is constantly updated in order to reflect new functionalities. SBGAT’s user manual is available at https://github.com/bbercovici/SBGAT/wiki. This document contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented, so future work will consist of broadening SBGAT’s capabilities with the spherical harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  18. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 - 1985 a lecture series entitled ''Risk-benefit analysis'' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed reports on the uses of such models to this joint volume. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account the economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de

  19. Genetic analysis of presbycusis by arrayed primer extension.

    Science.gov (United States)

    Rodriguez-Paris, Juan; Ballay, Charles; Inserra, Michelle; Stidham, Katrina; Colen, Tahl; Roberson, Joseph; Gardner, Phyllis; Schrijver, Iris

    2008-01-01

    Using the Hereditary Hearing Loss arrayed primer extension (APEX) array, which contains 198 mutations across 8 hearing loss-associated genes (GJB2, GJB6, GJB3, GJA1, SLC26A4, SLC26A5, 12S-rRNA, and tRNA Ser), we compared the frequency of sequence variants in 94 individuals with early presbycusis to 50 unaffected controls and aimed to identify possible genetic contributors. This cross-sectional study was performed at Stanford University with presbycusis samples from the California Ear Institute. The patients were between ages 20 and 65 yr, with adult-onset sensorineural hearing loss of unknown etiology, and carried a clinical diagnosis of early presbycusis. Exclusion criteria comprised known causes of hearing loss such as significant noise exposure, trauma, ototoxic medication, neoplasm, and congenital infection or syndrome, as well as congenital or pediatric onset. Sequence changes were identified in 11.7% and 10% of presbycusis and control alleles, respectively. Among the presbycusis group, these solely occurred within the GJB2 and SLC26A4 genes. Homozygous and compound heterozygous pathogenic mutations were exclusively seen in affected individuals. We were unable to detect a statistically significant difference between our control and affected populations regarding the frequency of sequence variants detected with the APEX array. Individuals who carry two mild mutations in the GJB2 gene possibly have an increased risk of developing early presbycusis.

  20. Extensible Data Set Architecture for Systems Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The process of aircraft design requires the integration of data from individual analysis of aerodynamic, structural, thermal, and behavioral properties of a flight...

  1. EcoSys{trademark}: Supporting Green Design through an extensible life cycle analysis approach

    Energy Technology Data Exchange (ETDEWEB)

    Gockel, B.C.; Watkins, R.D.; Kleban, S.D.

    1995-07-01

    EcoSys is an environmental decision support tool to assist in the design of green products and processes. EcoSys consists of an information and expert system that contains input from experts in products, processes and the environment, as well as a flexible, goal-driven, rule-based decision model that can accommodate many environmental management perspectives, including allowing specific users to specify weighting factors for the impact decision model. This tool is extensible in that it can be utilized within the boundaries of a company and migrated to include suppliers and customers until full life cycles are assessed. We discussed the details and use of the environmental models available to the experts. We also showed how interviews with manufacturing experts led to the design of a goal-driven, rule-based reasoning system to support the problem solving. Finally, we offered a number of examples that detailed the types of analysis possible with EcoSys. Our ongoing work is to increase the precision of the environmental attributes database and to extend the product-process database to support a wider set of product analyses. Based on user feedback, we are also continuing to improve the X-Window user interface.

  2. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  3. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented and special features, motivated by industrial users, are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex...

  4. Extensions of the space trajectories error analysis programs

    Science.gov (United States)

    Adams, G. L.; Bradt, A. J.; Peterson, F. M.

    1971-01-01

    A generalized covariance analysis technique which permits the study of the sensitivity of linear estimation algorithms to errors in a priori statistics has been developed and programmed. Several sample cases are presented to illustrate the use of this technique. Modifications to the Simulated Trajectories Error Analysis Program (STEAP) to enable targeting a multiprobe mission of the Planetary Explorer type are discussed. The logic for the mini-probe targeting is presented. Finally, the initial phases of the conversion of the Viking mission Lander Trajectory Reconstruction (LTR) program for use on Venus missions are discussed. An integrator instability problem is discussed and a solution proposed.

  5. Transport network extensions for accessibility analysis in geographic information systems

    NARCIS (Netherlands)

    Jong, Tom de; Tillema, T.

    2005-01-01

    In many developed countries high quality digital transport networks are available for GIS based analysis. Partly this is due to the requirements of route planning software for internet and car navigation systems. Properties of these networks consist among others of road quality attributes,

  6. Practical extensions to the level of repair analysis

    NARCIS (Netherlands)

    Basten, Robertus Johannes Ida; van der Heijden, Matthijs C.; Schutten, Johannes M.J.

    2010-01-01

    The level of repair analysis (lora) gives answers to three questions that are posed when deciding on how to maintain capital goods: 1) which components to repair upon failure and which to discard, 2) at which locations in the repair network to perform each type of repairs, and 3) at which locations

  7. National University Extension Policy: analysis of the experience of the Institute of Health Sciences of UFPA

    Directory of Open Access Journals (Sweden)

    Durbens Martins Nascimento

    2017-11-01

    Full Text Available The current study addresses university extension, covering the concepts of university, organization, knowledge and extension. We sought to answer the following question: do the outcomes generated through the extension projects developed by ICS/UFPA actually fulfill the guidelines of the National University Extension Policy? The objective was a general analysis of the extension practices of the Institute of Health Sciences (ICS) at the Federal University of Pará (UFPA) in the light of the National University Extension Policy (NUEP), comprehending dialogical interaction, interdisciplinarity and interprofessionalism, teaching-research-extension inseparability, impact on student training, and the impact and social transformation envisaged within the Extension Policy of UFPA. The research methodology comprised a quantitative and qualitative approach supported by bibliographic and documentary sources. A collection of documents from several instances of UFPA was consulted, with emphasis on those concerning university extension in 2012. Eighty extension projects and 60 extension reports of ICS for the year 2012 were selected for analysis. The results revealed that the ICS projects fell far short of the NUEP guidelines with respect to interdisciplinarity and interprofessionalism, teaching-research-extension inseparability, and impact and social transformation. Furthermore, there was little participation by teachers, students and administrative technicians of ICS in such activity. It was concluded that the extension model of ICS is a welfare model, developed through service provision.

  8. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  9. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  10. Postan - A Package for Postoptimal Analysis (An Extension of Minos)

    OpenAIRE

    Dobrowolski, G.; Hajduk, K.; Korytowski, A.; Rys, T.

    1984-01-01

    This paper presents a new software package which has been developed in collaboration with IIASA. The new package, POSTAN, is designed for postoptimal analysis of linear programming problems, and is embedded in the well-known linear and nonlinear programming code MINOS. POSTAN is composed of a number of FORTRAN subroutines which may be called by adding some new keywords to the original list of MINOS specifications. The main function of POSTAN is to determine the ranges in which certain paramete...

  11. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    Science.gov (United States)

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.

  12. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    Science.gov (United States)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  13. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with predictions of bearing internal load distributions, stiffness, deflection and stresses.

  14. 16PF research into addiction: meta-analysis and extension.

    Science.gov (United States)

    Tuite, D R; Luiten, J W

    1986-03-01

    Meta-analysis of 34 studies on Cattell's 16PF test reveals ragged egos (C-), guilt (O), distrust (L), frustration (Q4), alienation (G-), vague identity (Q3-), alarm (H-), resentment (Q1), quasi-autism (M), scattered intellect (B-), grandiosity (E), autonomy (Q2), infantilism (I), avoidance (A-), and deviousness (N). The aberrant scores on E, G, I, Q1, and Q2 discriminate addicts from suicidals and the chronically ill or unemployed. We found nine types of addicts in our developmental study of 83 members of Alcoholics Anonymous. On the more stable second-order 16PF factors, 43% were highest on Autonomous, 37% on Desperate, 16% on Tough Poise, and 4% on Extravert. Profiles differed more by sexual preference than by gender. Recidivism was highest among homosexual men (38%) and the desperate (25%). Only the Fourth and Fifth Steps of the AA program seem crucial to recovery. Treatment programs based on these and tailored to sexual preference and the second-order personality types seem highly advisable.

  15. Statistical analysis of CSP plants by simulating extensive meteorological series

    Science.gov (United States)

    Pavón, Manuel; Fernández, Carlos M.; Silva, Manuel; Moreno, Sara; Guisado, María V.; Bernardos, Ana

    2017-06-01

    The feasibility analysis of any power plant project needs the estimation of the amount of energy it will be able to deliver to the grid during its lifetime. To achieve this, its feasibility study requires precise knowledge of the solar resource over a long-term period. In Concentrating Solar Power (CSP) projects, financing institutions typically require several statistical probability-of-exceedance scenarios of the expected electric energy output. Currently, the industry assumes a correlation between probabilities of exceedance of annual Direct Normal Irradiance (DNI) and energy yield. In this work, this assumption is tested by simulating the energy yield of CSP plants using as input a 34-year series of measured meteorological parameters and solar irradiance. The results of this work show that, even if some correspondence between the probabilities of exceedance of annual DNI values and energy yields is found, the intra-annual distribution of DNI may significantly affect this correlation. This result highlights the need for standardized procedures for the elaboration of DNI time series representative of a given probability of exceedance of annual DNI.
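
    A worked illustration of the probability-of-exceedance scenarios discussed above: P50/P90/P99 values computed from a multi-year series of annual DNI sums. The numbers are synthetic; a P90 value is the one exceeded in 90% of years, i.e. the 10th percentile.

        # Exceedance scenarios from a synthetic 34-year annual DNI series.
        import numpy as np

        rng = np.random.default_rng(7)
        annual_dni = rng.normal(2100, 120, size=34)    # kWh/m2/yr, invented values

        # P90 = value exceeded in 90% of years, i.e. the 10th percentile
        for p in (50, 90, 99):
            print(f"P{p}: {np.percentile(annual_dni, 100 - p):.0f} kWh/m2/yr")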

  16. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations, instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase the generalization capabilities of XCSF: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, least squares approaches appear to be best performing and most robust. Finally we show how XCSF can be extended to include polynomial approximations.
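
    The contrast between the Widrow-Hoff update and the recursive least squares alternative can be sketched on a single linear prediction problem. The snippet below is a generic illustration with synthetic data, not the XCSF implementation; it shows how slowly the Widrow-Hoff weights converge when the input range is wide.

        # Widrow-Hoff (LMS) versus recursive least squares on one linear
        # prediction task; synthetic data, not XCSF itself.
        import numpy as np

        rng = np.random.default_rng(3)
        w_true = np.array([2.0, -1.0, 0.5])
        X = rng.uniform(0, 10, size=(200, 2))                 # wide input range
        X = np.hstack([np.ones((200, 1)), X])                 # bias term x0
        y = X @ w_true

        w_lms = np.zeros(3)
        w_rls, P = np.zeros(3), np.eye(3) * 1000.0
        for x, t in zip(X, y):
            w_lms += 0.002 * (t - x @ w_lms) * x              # Widrow-Hoff, rate eta
            k = P @ x / (1.0 + x @ P @ x)                     # RLS gain
            w_rls += k * (t - x @ w_rls)
            P -= np.outer(k, x @ P)
        print(w_lms, w_rls)                                   # RLS converges far faster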

  17. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Full Text Available Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults’ collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We will describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  18. Extension of HCDA safety analysis to large PCRV containment structures

    International Nuclear Information System (INIS)

    Marchertas, A.H.; Fistedis, S.H.; Bazant, A.P.

    1977-01-01

    The prestressed concrete reactor vessel (PCRV) has thus far been applied to nuclear reactor containments involving steady-state loading. This paper, on the other hand, considers the PCRV for the LMFBR containment, which must sustain dynamic effects stemming from the Hypothetical Core Disruptive Accident (HCDA). To perform a feasibility study of such a PCRV, a specialized formulation for transient analysis of such structures was developed. The analytical model of a PCRV is represented in a finite element computer code involving an explicit time-integration procedure. The formulation is particularized for the axisymmetric geometry and includes simulation of tensile cracking of concrete, reinforcement and prestressing capability. The tensile cracking of concrete and the concrete reinforcement are both modeled in a single subroutine describing the element constitutive relations. The reinforcement and concrete stresses are computed separately and combined to give the overall stress state of the composite material. The reinforcement is assumed to be elastic-plastic; the concrete is taken to be elastic in tension with a tensile stress limit. Cracking of concrete is based on the principal stress: when the tensile criterion is exceeded, a crack is assumed to form normal to the direction of the maximum principal stress. The stress component in the direction normal to the crack is then set to zero. This procedure of cracking within an element is continued until three cracks have formed, at which time the material is assumed to have lost all its tensile carrying capacity. An existing crack is also permitted to close. This option is introduced so that a compressive load can still be sustained by the material subsequent to crack formation if compressive strains develop.
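
    A much-simplified sketch of the cracking criterion described above, in 2D plane stress rather than the axisymmetric formulation: the principal stresses are found, and a crack normal to the maximum principal direction is introduced when the tensile limit is exceeded. All values are illustrative.

        # Simplified smeared-crack check: crack forms normal to the maximum
        # principal stress once it exceeds the tensile limit (2D stand-in).
        import numpy as np

        def check_crack(stress, tensile_limit):
            """stress: 2x2 symmetric stress tensor (Pa). Returns crack normal or None."""
            principal, directions = np.linalg.eigh(stress)   # ascending eigenvalues
            if principal[-1] > tensile_limit:                # largest principal stress
                return directions[:, -1]                     # crack normal direction
            return None

        sigma = np.array([[4.0e6, 1.0e6],
                          [1.0e6, 1.0e6]])                   # Pa, illustrative state
        print(check_crack(sigma, tensile_limit=3.0e6))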

  19. Satellite Images and Gaussian Parameterization for an Extensive Analysis of Urban Heat Islands in Thailand

    Directory of Open Access Journals (Sweden)

    Chaiyapon Keeratikasikorn

    2018-04-01

    Full Text Available For the first time, an extensive study of the surface urban heat island (SUHI) in Thailand’s six major cities is reported, using 728 MODIS (MODerate Resolution Imaging Spectroradiometer) images for each city. The SUHI analysis was performed at three timescales: diurnal, seasonal, and multiyear. The diurnal variation is represented by the four MODIS passages (10:00, 14:00, 22:00, and 02:00 local time) and the seasonal variation by summer and winter maps, with images covering a 14-year interval (2003–2016). Also, 126 Landsat scenes were processed to classify and map land cover changes for each city. To analyze and compare the SUHI patterns, a least-squares Gaussian fitting method has been applied and the corresponding empirical metrics quantified. Such an approach represents, when applicable, an efficient quantitative tool for comparisons that a visual inspection of a great number of maps would not allow. Results point out that the SUHI does not show significant seasonal differences, while the daytime SUHI is more pronounced than the nighttime one, mainly due to solar forcing and intense human activities and traffic. Across the 14 years, the biggest city, Bangkok, shows the highest SUHI maximum intensities during daytime, with values ranging between 4 °C and 6 °C; during nighttime, the intensities are rather similar for all six cities, between 1 °C and 2 °C. However, these maximum intensities are not correlated with the urban growth over the years. For each city, the SUHI spatial extension represented by the Gaussian footprint is generally not affected by the urban area sprawl across the years, except for Bangkok and Chiang Mai, whose daytime SUHI footprints show a slight increase over the years. The orientation angle and central location of the fitted surface also provide information on the SUHI layout in relation to the land use of the urban texture.
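
    A minimal sketch of the least-squares Gaussian fitting used to quantify the SUHI footprint, on a synthetic land surface temperature anomaly field (the rotation-angle parameter used in the paper is omitted here for brevity):

        # Least-squares fit of a 2D Gaussian to a synthetic LST anomaly field.
        import numpy as np
        from scipy.optimize import curve_fit

        def gauss2d(xy, amp, x0, y0, sx, sy):
            x, y = xy
            return amp * np.exp(-((x - x0)**2 / (2*sx**2) + (y - y0)**2 / (2*sy**2)))

        x, y = np.meshgrid(np.arange(60), np.arange(60))
        lst = gauss2d((x, y), 5.0, 30, 28, 8.0, 12.0) \
              + 0.3 * np.random.default_rng(5).normal(size=x.shape)

        popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), lst.ravel(),
                            p0=(3.0, 25, 25, 5.0, 5.0))
        print(popt)   # amplitude ~ SUHI intensity; sx, sy ~ footprint extent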

  20. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a media for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, and to remove any contents that may have been hidden, and any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, that highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and make them available for the general public. The results presented in this work can also be seen as a useful

  1. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's properties. The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  2. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system

    International Nuclear Information System (INIS)

    Lee, J.Y.; Moon, H.J.; Kim, T.I.; Kim, H.W.; Han, M.Y.

    2013-01-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature, and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof (a worked example follows below). Therefore, extensive green roofs are an effective storm-water best-management practice and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. -- Highlights: •Urban extensive green-roof systems have a synergetic effect in mitigating urban runoff. •These systems improve runoff mitigation and decentralized urban water management. •These systems have a high water-retaining capacity in response to rainfall of less than 20 mm/h. •The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52. -- Extensive green roofs are an effective storm-water best-management practice and the proposed parameters can be applied to mitigate urban runoff
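
    A small worked example with the catchment efficiencies reported above; the roof area and rainfall depth are illustrative assumptions, not values from the study.

        # Runoff from a 15 mm event on a 500 m2 roof, using the reported
        # catchment efficiencies (roof size and rainfall are invented).
        roof_area_m2 = 500.0
        rainfall_m = 0.015                       # 15 mm in one hour

        for label, efficiency in [("concrete roof", 0.90), ("extensive green roof", 0.48)]:
            runoff_m3 = efficiency * rainfall_m * roof_area_m2
            print(f"{label}: {runoff_m3:.2f} m3 of runoff")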

  3. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
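
    The paper's solver uses spherical-coordinate FDTD equations derived from first principles; as a generic illustration of the scheme's core, a minimal one-dimensional Cartesian Yee update in free space looks as follows (all parameters are illustrative):

        # Minimal 1D Yee-scheme FDTD update in free space, with a Gaussian
        # (UWB-like) soft source; not the paper's spherical formulation.
        import numpy as np

        c, nz, nt = 3e8, 400, 600
        dz = 1e-3
        dt = 0.5 * dz / c                       # Courant-stable time step
        eps0, mu0 = 8.854e-12, 4e-7 * np.pi
        ez, hy = np.zeros(nz), np.zeros(nz - 1)

        for n in range(nt):
            hy += dt / (mu0 * dz) * (ez[1:] - ez[:-1])
            ez[1:-1] += dt / (eps0 * dz) * (hy[1:] - hy[:-1])
            ez[nz // 4] += np.exp(-((n - 60) / 20.0)**2)   # Gaussian pulse source
        print(ez.max())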

  4. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  5. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used...... e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  6. Buffer$--An Economic Analysis Tool

    Science.gov (United States)

    Gary Bentrup

    2007-01-01

    Buffer$ is an economic spreadsheet tool for analyzing the cost-benefits of conservation buffers by resource professionals. Conservation buffers are linear strips of vegetation managed for multiple landowner and societal objectives. The Microsoft Excel-based spreadsheet can calculate potential income derived from a buffer, including income from cost-share/incentive...

  7. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  8. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  9. Multivariate and 2D Extensions of Singular Spectrum Analysis with the Rssa Package

    Directory of Open Access Journals (Sweden)

    Nina Golyandina

    2015-10-01

    Full Text Available Implementation of multivariate and 2D extensions of singular spectrum analysis (SSA) by means of the R package Rssa is considered. The extensions include MSSA for simultaneous analysis and forecasting of several time series and 2D-SSA for analysis of digital images. A new extension of 2D-SSA called shaped 2D-SSA is introduced for the analysis of images of arbitrary shape, not necessarily rectangular. It is shown that the implementation of shaped 2D-SSA can serve as a basis for the implementation of MSSA and other generalizations. Efficient implementation of operations with Hankel and Hankel-block-Hankel matrices through the fast Fourier transform is suggested. Examples with code fragments in R, which explain the methodology and demonstrate the proper use of Rssa, are presented.
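
    The fast Hankel arithmetic mentioned above rests on the identity that a Hankel matrix-vector product is a cross-correlation, computable with FFT-based convolution. A small Python sketch of the identity (the package itself is in R; this only illustrates the trick):

        # H[i, j] = c[i + j], so (H @ v)[i] = sum_j c[i + j] * v[j], which is
        # a full convolution of c with the reversed v, restricted to m lags.
        import numpy as np
        from scipy.linalg import hankel
        from scipy.signal import fftconvolve

        m, n = 6, 5
        c = np.arange(m + n - 1, dtype=float)        # generating series
        H = hankel(c[:m], c[m - 1:])                 # dense m x n Hankel matrix
        v = np.random.default_rng(2).normal(size=n)

        fast = fftconvolve(c, v[::-1])[n - 1:n - 1 + m]   # O((m+n) log(m+n))
        print(np.allclose(H @ v, fast))              # True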

  10. Radiographic analysis of dental implant extensions using bone grafts on dogs.

    Science.gov (United States)

    Cardoso, Álida Lúcia; Lima, Cirilo Antônio de Paula; Montebello Filho, Agenor; Pereira, Adriano Alves

    2018-01-10

    Despite their wide use, dental implants can bring inconveniences: once osseointegration is reached, they can no longer be extended. Therefore, if a problem occurs regarding positioning, the options are substitution or burial of the implant. With implant substitution, there exists the risk of local bone loss and/or future loss of the new implant. This study proposes a new device (implant extender) for extending the dental implant. The feasibility of this technique is verified by installing dental implant extensions onto the humerus bone of dogs with autogenous bone grafts. Implants of 3.3 mm in diameter by 6 mm in length and implant extensions of 3.3 mm in diameter and 2.2 mm in length were installed onto the humeri of 4 healthy dogs, using an autogenous bone graft in a block made from the ilium. Biomechanical percussion tests were performed on the implant extensions and then the implant-extension sets were removed for radiographic analysis. In the biomechanical percussion tests, none of the extensions presented clinical mobility. As for the x-rays, these were analyzed by 20 professionals, who concluded that there was a 100% success rate with bone formation around the implants, 74.1% for bone neoformation of the implant extensions, and 80.1% for the adaptation of the implant extension. Bone formation occurred in every installed dental implant. In most cases, bone neoformation of the extensions and adaptation of the extension/implant set occurred, according to the x-ray analysis performed by the evaluators. An absence of clinical mobility in the extensions was also observed. Although the results are promising, this technique still needs to be researched in humans, as an alternative for reducing elongated prosthetic crowns or poorly positioned implants, as well as for the modification of the type of implants, among other applications. © 2018 Wiley Periodicals, Inc.

  11. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts, which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
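
    A hedged sketch of the data layout behind such comparisons: score pairs of profilometry traces by maximum normalized cross-correlation, then compare the sample of lab-vs-lab scores with the field-vs-lab scores. This is an illustration only, not the authors' likelihood ratio statistic.

        # Compare a field mark against lab marks via a simple similarity score.
        import numpy as np

        def similarity(a, b):
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return np.correlate(a, b, mode="full").max() / len(a)

        rng = np.random.default_rng(4)
        base = np.cumsum(rng.normal(size=300))             # stand-in tool signature
        lab = [base + 0.3 * rng.normal(size=300) for _ in range(5)]
        field = base + 0.3 * rng.normal(size=300)

        lab_scores = [similarity(lab[i], lab[j])
                      for i in range(5) for j in range(i + 1, 5)]
        field_scores = [similarity(field, t) for t in lab]
        print(np.mean(lab_scores), np.mean(field_scores))  # similar means suggest a match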

  12. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    Science.gov (United States)

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate DNA sequences, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biologists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information such as the total number of nucleotide bases, ATGC base contents along with their respective percentages, and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
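
    The Nussinov recurrence that RDNAnalyzer builds on maximizes the number of base pairs in a subsequence s[i..j]. A minimal sketch of the classic algorithm follows (DNA pairing rules; the minimum loop length is an assumed parameter, and the tool's own extensions are not reproduced here):

        # Classic Nussinov dynamic program: N[i][j] = max pairs in s[i..j].
        def nussinov_pairs(s, min_loop=3):
            pair = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
            n = len(s)
            N = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):          # fill by increasing span
                for i in range(n - span):
                    j = i + span
                    best = max(N[i + 1][j], N[i][j - 1]) # i or j unpaired
                    if (s[i], s[j]) in pair:
                        best = max(best, N[i + 1][j - 1] + 1)   # pair i with j
                    for k in range(i + 1, j):                    # bifurcation
                        best = max(best, N[i][k] + N[k + 1][j])
                    N[i][j] = best
            return N[0][n - 1]

        print(nussinov_pairs("GCATGCATTTTATGCATGC"))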

  13. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
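
    As a hedged illustration of a one-dimensional transmission line impedance model: for a rigid-backed channel of depth L, the normalized surface impedance reduces in the lossless case to -j·cot(kL). The actual tool's model accounts for further geometry and loss effects; channel depth and frequencies below are illustrative.

        # Normalized surface impedance of a rigid-backed channel, lossless case.
        import numpy as np

        c = 343.0                                   # speed of sound, m/s
        L = 0.038                                   # channel depth, m (illustrative)
        f = np.linspace(500, 3000, 6)               # Hz
        k = 2 * np.pi * f / c
        z_normalized = -1j / np.tan(k * L)          # reactance of a closed cavity
        for fi, zi in zip(f, z_normalized):
            print(f"{fi:6.0f} Hz  z = {zi.imag:+.2f}j")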

  14. An Integrated Tool for System Analysis of Sample Return Vehicles

    Science.gov (United States)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  15. KIT multi-physics tools for the analysis of design and beyond design basis accidents of light water reactors

    International Nuclear Information System (INIS)

    Sanchez, Victor Hugo; Miassoedov, Alexei; Steinbrueck, M.; Tromm, W.

    2016-01-01

    This paper describes the KIT numerical simulation tools under extension and validation for the analysis of design basis and beyond design basis accidents (DBA) of Light Water Reactors (LWR). The description of the complex thermal-hydraulic, neutron kinetic and chemo-physical phenomena occurring under off-normal conditions requires the development of multi-physics and multi-scale simulation tools, a development fostered by the rapid increase in computer power. The KIT numerical tools for DBA and beyond-DBA are validated using experimental data from KIT and elsewhere. The developments, extensions, coupling approaches and validation work performed at KIT are briefly outlined and discussed in this paper.

  16. KIT multi-physics tools for the analysis of design and beyond design basis accidents of light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Victor Hugo; Miassoedov, Alexei; Steinbrueck, M.; Tromm, W. [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany)

    2016-05-15

    This paper describes the KIT numerical simulation tools under extension and validation for the analysis of design basis and beyond design basis accidents (DBA) of Light Water Reactors (LWR). The description of the complex thermal-hydraulic, neutron kinetic and chemo-physical phenomena occurring under off-normal conditions requires the development of multi-physics and multi-scale simulation tools, a development fostered by the rapid increase in computer power. The KIT numerical tools for DBA and beyond-DBA are validated using experimental data from KIT and elsewhere. The developments, extensions, coupling approaches and validation work performed at KIT are briefly outlined and discussed in this paper.

  17. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90 Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90 Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90 Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90 Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expansion of radiochemical applications of the FIA methodology, with an ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements.

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  19. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. The aim of this review is to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion by either having ≥ 50 citations on Web of Science (as of 08/09/16) or their use being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  20. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
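
    A common form of such surrogate models is a log-linear regression of SSC on acoustic backscatter. A minimal sketch with synthetic numbers (not SAID itself):

    ```python
    # Fit log10(SSC) = a + b*B from paired backscatter/SSC observations.
    # All data values are hypothetical.
    import numpy as np

    backscatter = np.array([55.0, 60.0, 65.0, 70.0, 75.0])   # dB, hypothetical
    ssc = np.array([20.0, 45.0, 95.0, 210.0, 430.0])         # mg/L, hypothetical

    b, a = np.polyfit(backscatter, np.log10(ssc), 1)         # linear fit in log space

    def predict_ssc(B):
        return 10.0 ** (a + b * B)

    print(f"log10(SSC) = {a:.2f} + {b:.3f} * B")
    print(f"predicted SSC at 68 dB: {predict_ssc(68.0):.0f} mg/L")
    ```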

  1. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, thereby performing, together with Lamb, context-sensitive lexical analysis in lexically-ambiguous language specifications.
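
    A minimal sketch in the spirit of Lamb (not its implementation): enumerate every way an input splits into tokens from an ambiguous token table; the token table below is invented.

    ```python
    # Enumerate all tokenizations of the input given ambiguous lexemes.
    TOKENS = {
        "in":  ["KEYWORD_IN", "IDENT"],   # 'in' is both a keyword and a name
        "int": ["TYPE"],
        "x":   ["IDENT"],
    }

    def tokenizations(s, pos=0):
        if pos == len(s):
            yield []
            return
        for lexeme, kinds in TOKENS.items():
            if s.startswith(lexeme, pos):
                for kind in kinds:
                    for rest in tokenizations(s, pos + len(lexeme)):
                        yield [(kind, lexeme)] + rest

    for seq in tokenizations("in"):
        print(seq)   # two readings: [('KEYWORD_IN', 'in')] and [('IDENT', 'in')]
    for seq in tokenizations("intx"):
        print(seq)   # only [('TYPE', 'int'), ('IDENT', 'x')] survives
    ```

    A parser consuming the resulting graph would then discard any reading that cannot form a valid sentence, as the abstract describes.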

  2. Creating an anthropomorphic digital MR phantom—an extensible tool for comparing and evaluating quantitative imaging algorithms

    Science.gov (United States)

    Bosca, Ryan J.; Jackson, Edward F.

    2016-01-01

    Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.
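
    The general kinetic model named above has a standard closed form; a minimal sketch of synthesizing one voxel's concentration curve (the VIF and parameter values here are invented; a real DRO would use the published population VIF and per-voxel estimates):

    ```python
    # Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-kep * (t - tau)) dtau
    import numpy as np

    dt = 1.0                                   # s, temporal resolution
    t = np.arange(0, 300, dt)                  # 5-minute acquisition
    Cp = 5.0 * (t / 60.0) * np.exp(-t / 80.0)  # hypothetical vascular input function

    Ktrans, kep = 0.25 / 60.0, 0.60 / 60.0     # 1/s, hypothetical tumour voxel

    Ct = Ktrans * np.convolve(Cp, np.exp(-kep * t))[: len(t)] * dt
    print(f"peak tissue concentration: {Ct.max():.3f} (a.u.)")
    ```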

  3. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  4. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  5. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions.

  6. Does tool use extend peripersonal space? A review and re-analysis.

    Science.gov (United States)

    Holmes, Nicholas P

    2012-04-01

    The fascinating idea that tools become extensions of our body appears in artistic, literary, philosophical, and scientific works alike. In the last 15 years, this idea has been reframed into several related hypotheses, one of which states that tool use extends the neural representation of the multisensory space immediately surrounding the hands (variously termed peripersonal space, peri-hand space, peri-cutaneous space, action space, or near space). This and related hypotheses have been tested extensively in the cognitive neurosciences, with evidence from molecular, neurophysiological, neuroimaging, neuropsychological, and behavioural fields. Here, I briefly review the evidence for and against the hypothesis that tool use extends a neural representation of the space surrounding the hand, concentrating on neurophysiological, neuropsychological, and behavioural evidence. I then provide a re-analysis of data from six published and one unpublished experiments using the crossmodal congruency task to test this hypothesis. While the re-analysis broadly confirms the previously reported finding that tool use does not literally extend peripersonal space, the overall effect sizes are small and statistical power is low. I conclude by questioning whether the crossmodal congruency task can indeed be used to test the hypothesis that tool use modifies peripersonal space.

  7. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to improve their analyses.

  8. A Tool for the Concise Analysis of Patient Safety Incidents.

    Science.gov (United States)

    Pham, Julius Cuong; Hoffman, Carolyn; Popescu, Ioana; Ijagbemi, O Mayowa; Carson, Kathryn A

    2016-01-01

    Patient safety incidents, sometimes referred to as adverse events, incidents, or patient safety events, are too common an occurrence in health care. Most methods for incident analysis are time and labor intensive. Given the significant resource requirements of a root cause analysis, for example, there is a need for a more targeted and efficient method of analyzing a larger number of incidents. Although several concise incident analysis tools are in existence, there are no published studies regarding their usability or effectiveness. Building on previous efforts, a Concise Incident Analysis (CIA) methodology and tool were developed to facilitate analysis of no- or low-harm incidents. Staff from 11 hospitals in five countries-Australia, Canada, Hong Kong, India, and the United States-pilot tested the tool in two phases. The tool was evaluated and refined after each phase on the basis of user perceptions of usability and effectiveness. From September 2013 through January 2014, 52 patient safety incidents were analyzed. A broad variety of incident types were investigated, the most frequent being patient falls (25%). Incidents came from a variety of hospital work areas, the most frequent being from the medical ward (37%). Most incidents investigated resulted in temporary harm or no harm (94%). All or most sites found the tool "understandable" (100%), "easy to use" (89%), and "effective" (89%). Some 95% of participants planned to continue to use all or some parts of the tool after the pilot. Qualitative feedback suggested that the tool allowed analysis of incidents that were not currently being analyzed because of insufficient resources. The tool was described as simple to use, easy to document, and aligned with the flow of the incident analysis. A concise tool for the investigation of patient safety incidents with low or no harm was well accepted across a select group of hospitals from five countries.

  9. RNAmute: RNA secondary structure mutation analysis tool

    Directory of Open Access Journals (Sweden)

    Barash Danny

    2006-04-01

    Background: RNAmute is an interactive Java application that calculates the secondary structure of all single point mutations, given an RNA sequence, and organizes them into categories according to their similarity with respect to the predicted wild-type structure. The secondary structure predictions are performed using the Vienna RNA package. Several alternatives are used for the categorization of single point mutations: Vienna's RNAdistance based on dot-bracket representation, as well as tree edit distance and the second eigenvalue of the Laplacian matrix based on Shapiro's coarse-grain tree graph representation. Results: Selecting a category in each one of the processed tables lists all single point mutations belonging to that category. Selecting a mutation displays a graphical drawing of the single point mutation and the wild type, and includes basic information such as associated energies, representations and distances. RNAmute can be used successfully with very little previous experience and without choosing any parameter value besides the initial RNA sequence. The package runs under the Linux operating system. Conclusion: RNAmute is a user-friendly tool that can be used to predict single point mutations leading to conformational rearrangements in the secondary structure of RNAs. In several cases of substantial interest, notably in virology, a point mutation may lead to a loss of important functionality such as RNA virus replication and translation initiation because of a conformational rearrangement in the secondary structure.
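
    The enumeration step that RNAmute organizes is simple; a sketch (structure prediction itself would be delegated to the Vienna RNA package, not reimplemented here):

    ```python
    # Enumerate all 3*N single point mutants of an RNA sequence.
    def single_point_mutants(seq, alphabet="ACGU"):
        for i, base in enumerate(seq):
            for alt in alphabet:
                if alt != base:
                    yield f"{base}{i + 1}{alt}", seq[:i] + alt + seq[i + 1:]

    wild_type = "GCAUCG"
    mutants = list(single_point_mutants(wild_type))
    print(len(mutants), "mutants, e.g.", mutants[0])   # 18 mutants, e.g. ('G1A', 'ACAUCG')
    ```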

  10. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modeling uniquely. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during the last years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
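
    A toy version of the model-comparison idea, to make the evidence calculation concrete (grid integration instead of nested sampling, and invented data):

    ```python
    # Bayesian evidence Z = integral p(D|theta, M) p(theta|M) dtheta,
    # computed on a grid for two competing models.
    import numpy as np

    data = np.array([4.8, 5.1, 5.3, 4.9, 5.2])   # hypothetical measurements
    sigma = 0.2

    def log_like(mu):
        return (-0.5 * np.sum(((data - mu) / sigma) ** 2)
                - len(data) * np.log(sigma * np.sqrt(2 * np.pi)))

    # Model 1: mu fixed at 5.0 (no free parameter) -> Z1 is just the likelihood.
    Z1 = np.exp(log_like(5.0))

    # Model 2: mu free, uniform prior on [0, 10].
    mu_grid = np.linspace(0.0, 10.0, 10001)
    prior = 1.0 / 10.0
    Z2 = np.trapz(np.exp([log_like(m) for m in mu_grid]) * prior, mu_grid)

    print(f"Bayes factor Z1/Z2 = {Z1 / Z2:.2f}")  # >1 favours the simpler model
    ```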

  11. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    Science.gov (United States)

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  12. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  13. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  14. Failure Modes and Effects Analysis (FMEA) Assistant Tool

    Data.gov (United States)

    National Aeronautics and Space Administration — The FMEA Assistant tool offers a new and unique approach to assist hardware developers and safety analysts perform failure analysis by using model based systems...

  15. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  16. Constructing Social Networks From Secondary Storage With Bulk Analysis Tools

    Science.gov (United States)

    2016-06-01

    This Naval Postgraduate School thesis by Janina L. Green examines constructing social networks from secondary storage with bulk analysis tools. In one case study, the analysis displays more than one social network that needs to be separated: the owner of the drive in question was an administrator for an email server.

  17. Smartphone-based accelerometry is a valid tool for measuring dynamic changes in knee extension range of motion

    DEFF Research Database (Denmark)

    Støve, Morten Pallisgaard; Palsson, Thorvaldur Skuli; Hirata, Rogerio Pessoto

    2018-01-01

    This study examined the validity of smartphone-based accelerometry for measuring the dynamic range of motion of the knee joint during a passively executed extension movement. Materials and methods: Dynamic knee extension range of motion was examined three consecutive times in twenty-one healthy male subjects, utilising an isokinetic dynamometer (Biodex System 4 Pro) to passively generate the extension motion. Measurements of joint angles in dynamic knee extension were performed using two methods: (i) the isokinetic dynamometer (gold-standard method) and (ii) smartphone (iPhone 6, attached to the tibia) accelerometry data. Results: Tests of validity showed excellent correlation (rs = 0.899) between methods, with a low standard error of measurement of 0.62 deg. and limits of agreement ranging from -9.1 to 8.8 deg. Interclass correlation coefficients showed excellent between-measures reliability (ICC > 0.862) for both methods. Conclusions: Smartphone-based accelerometry is a valid tool for measuring dynamic changes in knee extension range of motion.

  18. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    Science.gov (United States)

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. © 2013 Elsevier Ltd. All rights reserved.
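
    A worked example using the reported catchment efficiencies (the roof area and event depth are invented):

    ```python
    # Runoff volume delivered to a harvesting tank from a 100 m2 roof
    # during a 15 mm event (below the 20 mm/h retention threshold).
    area_m2 = 100.0
    rain_m = 15.0 / 1000.0

    for roof, eff in [("extensive green roof", 0.48), ("concrete roof", 0.90)]:
        runoff_m3 = eff * rain_m * area_m2
        print(f"{roof}: {runoff_m3 * 1000:.0f} L of runoff")
    # green roof ~720 L vs concrete ~1350 L; the difference is retained water
    ```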

  19. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as "pipelines" of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open-source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted at petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, reassembled from the tool building blocks into a flexible, multi-user set of tools.

  20. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with these concepts.

  1. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of types.

  2. Risk analysis tools for force protection and infrastructure/asset protection

    International Nuclear Information System (INIS)

    Jaeger, C.D.; Duggan, R.A.; Paulus, W.K.

    1998-01-01

    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all components of the risk equation and integrates them into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework.
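
    A sketch of the risk equation in the form commonly used in this community, R = P_A x (1 - P_E) x C (attack likelihood, system effectiveness, consequence); the numbers are invented and this is not EnSURE itself:

    ```python
    # Compare residual risk before and after a protection upgrade.
    def risk(p_attack, p_effectiveness, consequence):
        return p_attack * (1.0 - p_effectiveness) * consequence

    baseline = risk(p_attack=0.1, p_effectiveness=0.80, consequence=100.0)
    upgraded = risk(p_attack=0.1, p_effectiveness=0.95, consequence=100.0)
    print(f"risk reduced from {baseline:.1f} to {upgraded:.1f}")
    # ranking such reductions per dollar spent is the kind of
    # cost-effectiveness question the tool is meant to support
    ```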

  3. Analysis of Multiple Genomic Sequence Alignments: A Web Resource, Online Tools, and Lessons Learned From Analysis of Mammalian SCL Loci

    Science.gov (United States)

    Chapman, Michael A.; Donaldson, Ian J.; Gilbert, James; Grafham, Darren; Rogers, Jane; Green, Anthony R.; Göttgens, Berthold

    2004-01-01

    Comparative analysis of genomic sequences is becoming a standard technique for studying gene regulation. However, only a limited number of tools are currently available for the analysis of multiple genomic sequences. An extensive data set for the testing and training of such tools is provided by the SCL gene locus. Here we have expanded the data set to eight vertebrate species by sequencing the dog SCL locus and by annotating the dog and rat SCL loci. To provide a resource for the bioinformatics community, all SCL sequences and functional annotations, comprising a collation of the extensive experimental evidence pertaining to SCL regulation, have been made available via a Web server. A Web interface to new tools specifically designed for the display and analysis of multiple sequence alignments was also implemented. The unique SCL data set and new sequence comparison tools allowed us to perform a rigorous examination of the true benefits of multiple sequence comparisons. We demonstrate that multiple sequence alignments are, overall, superior to pairwise alignments for identification of mammalian regulatory regions. In the search for individual transcription factor binding sites, multiple alignments markedly increase the signal-to-noise ratio compared to pairwise alignments. PMID:14718377

  4. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Qumica

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function, like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature on this research field imposed some limitations, which will be detailed in the Introduction; it should be noted in advance that these limitations mostly concern the period when the results were published. (author)

  5. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.

    2011-01-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function, like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature on this research field imposed some limitations, which will be detailed in the Introduction; it should be noted in advance that these limitations mostly concern the period when the results were published. (author)

  6. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  7. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  8. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented as an extension to the latent semantic analysis framework.
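
    A minimal sketch of ICA-based unsupervised separation on synthetic mixtures, using scikit-learn's FastICA (the thesis itself predates this library; all signals below are invented):

    ```python
    # Recover two latent signals from their linear mixtures with FastICA.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]   # two latent signals
    mixing = np.array([[1.0, 0.5], [0.5, 1.0]])
    observed = sources @ mixing.T + 0.02 * rng.standard_normal((2000, 2))

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(observed)      # estimated independent components
    print(recovered.shape)                       # (2000, 2)
    ```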

  9. RiboCAT: a new capillary electrophoresis data analysis tool for nucleic acid probing.

    Science.gov (United States)

    Cantara, William A; Hatterschide, Joshua; Wu, Weixin; Musier-Forsyth, Karin

    2017-02-01

    Chemical and enzymatic probing of RNA secondary structure and RNA/protein interactions provides the basis for understanding the functions of structured RNAs. However, the ability to rapidly perform such experiments using capillary electrophoresis has been hampered by relatively labor-intensive data analysis software. While these computationally robust programs have been shown to calculate residue-specific reactivities to a high degree of accuracy, they often require time-consuming manual intervention and lack the ability to be easily modified by users. To alleviate these issues, RiboCAT (Ribonucleic acid capillary-electrophoresis analysis tool) was developed as a user-friendly, Microsoft Excel-based tool that reduces the need for manual intervention, thereby significantly shortening the time required for data analysis. Features of this tool include (i) the use of an Excel platform, (ii) a method of intercapillary signal alignment using internal size standards, (iii) a peak-sharpening algorithm to more accurately identify peaks, and (iv) an open architecture allowing for simple user intervention. Furthermore, a complementary tool, RiboDOG (RiboCAT data output generator) was designed to facilitate the comparison of multiple data sets, highlighting potential inconsistencies and inaccuracies that may have occurred during analysis. Using these new tools, the secondary structure of the HIV-1 5' untranslated region (5'UTR) was determined using selective 2'-hydroxyl acylation analyzed by primer extension (SHAPE), matching the results of previous work. © 2017 Cantara et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  10. Extensive risk analysis of mechanical failure for an epiphyseal hip prosthesis: a combined numerical-experimental approach.

    Science.gov (United States)

    Martelli, S; Taddei, F; Cristofolini, L; Gill, H S; Viceconti, M

    2011-02-01

    There has been recent renewed interest in proximal femur epiphyseal replacement as an alternative to conventional total hip replacement. In many branches of engineering, risk analysis has proved to be an efficient tool for avoiding premature failures of innovative devices. An extensive risk analysis procedure has been developed for epiphyseal hip prostheses and the predictions of this method have been compared to the known clinical outcomes of a well-established contemporary design, namely hip resurfacing devices. Clinical scenarios leading to revision (i.e. loosening, neck fracture and failure of the prosthetic component) were associated with potential failure modes (i.e. overload, fatigue, wear, fibrotic tissue differentiation and bone remodelling). Driving parameters of the corresponding failure mode were identified together with their safe thresholds. For each failure mode, a failure criterion was identified and studied under the most relevant physiological loading conditions. All failure modes were investigated with the most suitable investigation tool, either numerical or experimental. Results showed a low risk for each failure scenario either in the immediate postoperative period or in the long term. These findings are in agreement with those reported by the majority of clinical studies for correctly implanted devices. Although further work is needed to confirm the predictions of this method, it was concluded that the proposed risk analysis procedure has the potential to increase the efficacy of preclinical validation protocols for new epiphyseal replacement devices.

  11. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets are supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  12. A computer aided tolerancing tool, II: tolerance analysis

    NARCIS (Netherlands)

    Salomons, O.W.; Haalboom, F.J.; Jonge poerink, H.J.; van Slooten, F.; van Slooten, F.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1996-01-01

    A computer aided tolerance analysis tool is presented that assists the designer in evaluating worst case quality of assembly after tolerances have been specified. In tolerance analysis calculations, sets of equations are generated. The number of equations can be restricted by using a minimum number

  13. Tools for analysis of Dirac structures on banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to the mathematical modeling of physical engineering systems. In this paper, connections between Dirac structures and well-known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework.

  14. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
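
    One of the listed checks, unreachable-code detection, reduces to reachability over the control-flow graph; a sketch on an invented module (not the tool's actual 8086/68000 front end):

    ```python
    # Find basic blocks never reachable from the entry point.
    cfg = {
        "entry":  ["init", "main"],
        "init":   ["main"],
        "main":   ["loop"],
        "loop":   ["loop", "exit"],
        "exit":   [],
        "orphan": ["exit"],      # never jumped to -> unreachable
    }

    def reachable(cfg, entry="entry"):
        seen, stack = set(), [entry]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(cfg[node])
        return seen

    print(sorted(set(cfg) - reachable(cfg)))   # ['orphan']
    ```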

  15. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  16. Using the MoodleReader as an Extensive Reading Tool and Its Effect on Iranian EFL Students' Incidental Vocabulary Learning

    Science.gov (United States)

    Alavi, Sepideh; Keyvanshekouh, Afsaneh

    2012-01-01

    The present study focused on using the MoodleReader to promote extensive reading (ER) in an Iranian EFL context, emphasizing its effect on students' incidental vocabulary acquisition. Thirty-eight Shiraz University sophomores were assigned to experimental and control groups. The experimental group used the MoodleReader for their ER program, while…

  17. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  18. Simplified Analysis Tool for Ship-Ship Collision

    DEFF Research Database (Denmark)

    Yamada, Yasuhira; Pedersen, Preben Terndrup

    2007-01-01

    The purpose of this paper is to develop a simplified ship collision analysis tool in order to rapidly estimate the structural damage and energy absorption of both striking and struck ships, as well as to predict rupture of the cargo oil tanks of struck tankers. The present tool calculates the external collision dynamics. It is applied to the collision scenario where a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large-scale FEA, and fairly good agreement is achieved. The applicability of the tool is also discussed.

  19. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were...... carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  20. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  1. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center (USC) at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off-line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC), in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers.

  2. Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool

    Science.gov (United States)

    2017-06-01

    This thesis conducts an analysis of the system requirements for the Logistics Analysis and Wargame Support Tool (LAWST) and studies the process by which those requirements were generated. (Naval Postgraduate School thesis by Jonathan M. Swan, June 2017.)

  3. Hidden Markov models for sequence analysis: extension and analysis of the basic method

    DEFF Research Database (Denmark)

    Hughey, Richard; Krogh, Anders Stærmose

    1996-01-01

    The expectation-maximization training procedure is relatively straightforward. In this paper, we review the mathematical extensions and heuristics that move the method from the theoretical to the practical. Then, we experimentally analyze the effectiveness of model regularization, dynamic model modification, and optimization strategies...

  4. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quantitative phylogenetic and functional compositions of metagenomes.

  5. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points/unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
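
    A sketch of one reported metric, the branching index (branch points per unit area), counted on a toy binary vessel skeleton; AngioTool's actual image pipeline is more involved, and the field-of-view area below is invented:

    ```python
    # A skeleton pixel with 3+ 4-connected skeleton neighbours is a branch point.
    import numpy as np
    from scipy.ndimage import convolve

    skeleton = np.zeros((7, 7), dtype=int)
    skeleton[3, :] = 1            # horizontal vessel
    skeleton[:, 3] = 1            # vertical vessel -> one branch point at (3, 3)

    kernel = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
    neighbours = convolve(skeleton, kernel, mode="constant")
    branch_points = int(np.sum((skeleton == 1) & (neighbours >= 3)))

    area_mm2 = 0.49               # hypothetical field of view
    print(f"branching index: {branch_points / area_mm2:.1f} branch points/mm^2")
    ```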

  6. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from a profit- and customer-driven business model to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the ways products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools from the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout the four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but in an unbalanced and non-holistic way. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  7. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    Science.gov (United States)

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain they were originally intended for. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to communities was also analyzed.

  8. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existing link between the digital design tools used by architects and designers and the analysis tools developed by and for engineers is considered. Such tools should provide the possibility for the designer to work both with the aesthetics as well as the technical aspects of architectural design.

  9. GEPAS, a web-based tool for microarray data analysis and interpretation

    Science.gov (United States)

    Tárraga, Joaquín; Medina, Ignacio; Carbonell, José; Huerta-Cepas, Jaime; Minguez, Pablo; Alloza, Eva; Al-Shahrour, Fátima; Vegas-Azcárate, Susana; Goetz, Stefan; Escobar, Pablo; Garcia-Garcia, Francisco; Conesa, Ana; Montaner, David; Dopazo, Joaquín

    2008-01-01

    Gene Expression Profile Analysis Suite (GEPAS) is one of the most complete and extensively used web-based packages for microarray data analysis. During its more than 5 years of activity it has continuously been updated to keep pace with the state-of-the-art in the changing microarray data analysis arena. GEPAS offers diverse analysis options that include well established as well as novel algorithms for normalization, gene selection, class prediction, clustering and functional profiling of the experiment. New options for time-course (or dose-response) experiments, microarray-based class prediction, new clustering methods and new tests for differential expression have been included. The new pipeliner module allows automating the execution of sequential analysis steps by means of a simple but powerful graphic interface. An extensive re-engineering of GEPAS has been carried out which includes the use of web services and Web 2.0 technology features, a new user interface with persistent sessions and a new extended database of gene identifiers. GEPAS is nowadays the most quoted web tool in its field and it is extensively used by researchers of many countries and its records indicate an average usage rate of 500 experiments per day. GEPAS, is available at http://www.gepas.org. PMID:18508806

  10. Medical image analysis of 3D CT images based on extensions of Haralick texture features

    Czech Academy of Sciences Publication Activity Database

    Tesař, Ludvík; Shimizu, A.; Smutek, D.; Kobatake, H.; Nawano, S.

    2008-01-01

    Vol. 32, no. 6 (2008), pp. 513-520. ISSN 0895-6111. R&D Projects: GA AV ČR 1ET101050403; GA MŠk 1M0572. Institutional research plan: CEZ:AV0Z10750506. Keywords: image segmentation * Gaussian mixture model * 3D image analysis. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.192, year: 2008. http://library.utia.cas.cz/separaty/2008/AS/tesar-medical image analysis of 3d ct image s based on extensions of haralick texture features.pdf

  11. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    Science.gov (United States)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
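
    A sketch of the aggregation step such a dashboard performs: collecting FastQC's per-module PASS/WARN/FAIL flags from each sample's summary.txt into one CSV. The qc/<sample>/summary.txt layout is assumed for illustration; FQC's own parsing and output formats may differ.

    ```python
    # Aggregate FastQC summary.txt flags across samples into a CSV table.
    import csv
    from pathlib import Path

    def collect(summary_files):
        rows = []
        for path in summary_files:
            for line in Path(path).read_text().splitlines():
                status, module, sample = line.split("\t")
                rows.append({"sample": sample, "module": module, "status": status})
        return rows

    rows = collect(Path("qc").glob("*/summary.txt"))   # hypothetical layout
    with open("qc_summary.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["sample", "module", "status"])
        writer.writeheader()
        writer.writerows(rows)
    ```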

  12. Analysis and Geometry in Metric Spaces: Sobolev Mappings, the Heisenberg Group, and the Whitney Extension Theorem

    Science.gov (United States)

    Zimmerman, Scott

    This thesis focuses on analysis in and the geometry of the Heisenberg group as well as geometric properties of Sobolev mappings. It begins with a detailed introduction to the Heisenberg group. After, we see a new and elementary proof for the structure of geodesics in the sub-Riemannian Heisenberg group. We also prove that the Carnot-Carathéodory metric is real analytic away from the center of the group. Next, we prove a version of the classical Whitney Extension Theorem for curves in the Heisenberg group. Given a real-valued function defined on a compact set in Euclidean space, the classical Whitney Extension Theorem from 1934 gives necessary and sufficient conditions for the existence of a $C^k$ extension defined on the entire space. We prove a version of the Whitney Extension Theorem for $C^1$ horizontal curves in the Heisenberg group. We then turn our attention to Sobolev mappings. In particular, given a Lipschitz map from a compact subset $Z$ of Euclidean space into a Lipschitz connected metric space, we construct a Sobolev extension defined on any bounded domain containing $Z$. Finally, we generalize a classical result of Dubovitskiǐ for smooth maps to the case of Sobolev mappings. In 1957, Dubovitskiǐ generalized Sard's classical theorem by establishing a bound on the Hausdorff dimension of the intersection of the critical set of a smooth map and almost every one of its level sets. We show that Dubovitskiǐ's theorem can be generalized to $W^{k,p}_{\mathrm{loc}}(\mathbb{R}^n,\mathbb{R}^m)$ mappings for all positive integers $k$ and $p > n$.
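
    For orientation, the classical $C^1$ case of the Whitney condition that the thesis adapts to horizontal curves can be stated as follows (a standard statement, not quoted from the thesis):

    ```latex
    % Whitney Extension Theorem, k = 1 case: for compact K \subset R^n,
    % f : K -> R, and a candidate gradient g : K -> R^n, suppose the
    % first-order Taylor remainder vanishes uniformly. Then a C^1
    % extension with the prescribed gradient exists on all of R^n.
    \[
      f(y) = f(x) + g(x)\cdot(y-x) + o(|y-x|)
      \ \text{uniformly for } x, y \in K
      \quad\Longrightarrow\quad
      \exists\, F \in C^1(\mathbb{R}^n):\ F|_K = f,\ \nabla F|_K = g.
    \]
    ```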

  13. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    Science.gov (United States)

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the use of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks(®)) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the somatosensorial primary cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat .

  14. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
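
    The speedup PVeStA exploits comes from statistical model checking being embarrassingly parallel: each Monte Carlo run of the system is independent, so runs can be farmed out to workers and only the verdict counts merged. A toy illustration in Python (the random-walk system and all numbers are invented; this is not PVeStA's Maude-based interface):

        # Illustration only: estimate P(property) by sampling independent runs
        # of a toy probabilistic system in parallel.
        import random
        from multiprocessing import Pool

        def one_run(seed: int) -> bool:
            # biased random walk; property = "reaches 10 within 100 steps"
            rng, x = random.Random(seed), 0
            for _ in range(100):
                x += 1 if rng.random() < 0.55 else -1
                if x >= 10:
                    return True
            return False

        if __name__ == "__main__":
            n = 20000
            with Pool() as pool:                      # one worker per CPU core
                hits = sum(pool.map(one_run, range(n)))
            print(f"P(reach 10 within 100 steps) ~ {hits / n:.3f}")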

  15. Conflict Resolution for Product Performance Requirements Based on Propagation Analysis in the Extension Theory

    Directory of Open Access Journals (Sweden)

    Yanwei Zhao

    2014-01-01

    Full Text Available Traditional product data mining methods are mainly focused on static data. Performance requirements are generally met as far as possible by retrieving similar cases and changing their structures. However, when a structural change satisfies one requirement, its side effects on the other requirements are not examined by analyzing the correlations; that is, design conflicts are neither identified nor resolved. An approach to resolving such conflict problems is proposed based on propagation analysis in Extension Theory. Firstly, the extension distance is improved to better evaluate the similarity among cases, and a case retrieval method is developed. Secondly, the transformations that can be made on selected cases are formulated by understanding the nature of the conflicts among the different performance requirements, which leads to an extension transformation strategy for coordinating conflicts using propagation analysis. Thirdly, the effects and levels of propagation are determined by analyzing the performance values before and after the transformations, and thus a coordination strategy for co-existing conflicts among multiple performances is developed. The method has been implemented in a working prototype system for supporting decision-making, and it has been demonstrated to be feasible and effective by resolving the conflicts of noise, exhaust, weight and intake pressure in screw air compressor performance design.
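
    The extension distance the authors improve has a simple textbook form in Extension Theory: for a value x and a requirement interval <a, b> it is negative inside the interval, zero on the boundary and positive outside, so it grades how well a case satisfies a requirement. A plain implementation of the classical form (the paper's improved variant is not reproduced here):

        # Classical extension distance of Extension Theory:
        #   rho(x, <a,b>) = |x - (a+b)/2| - (b-a)/2
        def extension_distance(x: float, a: float, b: float) -> float:
            return abs(x - (a + b) / 2.0) - (b - a) / 2.0

        # Example: a hypothetical noise requirement of <60, 75> dB(A)
        for x in (62.0, 75.0, 80.0):
            print(x, extension_distance(x, 60.0, 75.0))  # -> -2.0, 0.0, 5.0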

  16. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    Suich, J.E.; Honeck, H.C.

    1978-01-01

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS

  17. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    Suich, J.E.; Honeck, H.C.

    1977-01-01

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS

  18. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  19. An Online Image Analysis Tool for Science Education

    Science.gov (United States)

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  20. Tools of Audience Analysis in Contemporary Political Campaigns.

    Science.gov (United States)

    Friedenberg, Robert V.

    This paper examines two basic tools of audience analysis as they are used in contemporary political campaigning: public opinion polls and interpretations of voter statistics. The raw data used in the statistical analyses reported in this investigation come from national polls and voter statistics provided to Republican candidates running in local…

  1. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    ABSTRACT: Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a ...

  2. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  3. Rapid Benefit Indicators (RBI) Spatial Analysis Tools - Manual

    Science.gov (United States)

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  4. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility to relate histopathological data with neuropsychological and clinical variables. The aid of this interactive visualization tool brings us the possibility to find unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists’ productivity.

  5. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  6. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
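
    A common data-driven route to the temperature-to-error link discussed here (one standard technique, not necessarily the authors' model) is to regress the displacement error measured in a calibration cycle on a handful of easily monitored temperatures, then predict the error during production. A sketch on synthetic data, with all sensitivities invented:

        import numpy as np

        # synthetic calibration data: three temperature sensors -> measured error
        rng = np.random.default_rng(0)
        T = rng.uniform(20.0, 45.0, size=(200, 3))    # spindle/ambient/axis temps [degC]
        w_true = np.array([1.8, -0.4, 0.9])           # assumed sensitivities [um/degC]
        err = T @ w_true + rng.normal(0.0, 1.0, 200)  # measured displacement error [um]

        # least-squares fit of error = T.w + b
        X = np.column_stack([T, np.ones(len(T))])
        coef, *_ = np.linalg.lstsq(X, err, rcond=None)

        # predict the thermal error from temperatures read during production
        T_now = np.array([38.0, 22.5, 30.0, 1.0])     # trailing 1.0 = intercept term
        print("predicted thermal error [um]:", round(float(T_now @ coef), 2))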

  7. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  8. Statistical analysis of Geopotential Height (GH) timeseries based on Tsallis non-extensive statistical mechanics

    Science.gov (United States)

    Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.

    2018-02-01

    In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series are concerned with Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of month average values, during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis' q-triplet, namely {q_stat, q_sens, q_rel}, the reconstructed phase space, and the estimation of the correlation dimension and the Hurst exponent of rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior and long range dependences for all time series considered. Noticeable differences in the q-triplet estimates were also found between time series from distinct spatial or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimation of the Hurst exponent indicated multifractality, non-Gaussianity and persistence. The analysis provides significant information for identifying and characterizing the dynamical characteristics of Earth's climate.
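
    For reference, the two textbook objects the q-triplet analysis rests on, quoted in their standard form (not specific to this paper):

        % Tsallis entropy for a discrete distribution {p_i}, recovering the
        % Boltzmann-Gibbs entropy in the limit q -> 1:
        \[
          S_q = k \,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
          \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i .
        \]
        % Maximizing S_q under natural constraints yields the q-exponential,
        % whose Gaussian analogue (the q-Gaussian) has the heavy tails used
        % to characterize q_stat:
        \[
          e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}},
          \qquad p(x) \propto e_q\!\left(-\beta x^{2}\right).
        \]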

  9. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  10. Analysis for Non-Traditional Security Challenges: Methods and Tools

    Science.gov (United States)

    2006-11-20

    [Recoverable fragment of the report's acronym glossary] … Course of Action; COCOM: Combatant Commander; COI: Community of Interest; CPB: Cultural Preparation of the Battlefield; CPM: Critical Path Method; DARPA: Defense…; … Secretary of Defense (Program Analysis and Evaluation); PACOM: United States Pacific Command; PERT: Program Evaluation Review Technique; PMESII: Political…; the remainder of the scanned abstract is illegible.

  11. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
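
    One of the performance bounds such graph analysis yields is the critical path: with identical processors in abundance, no schedule can finish faster than the longest weighted path through the dataflow DAG. A minimal sketch over a toy task graph (task names and costs are illustrative):

        from functools import lru_cache

        tasks = {"read": 2, "fir": 5, "fft": 7, "ctrl": 3, "out": 1}   # execution times
        preds = {"read": [], "fir": ["read"], "fft": ["read"],
                 "ctrl": ["fir", "fft"], "out": ["ctrl"]}              # dataflow edges

        @lru_cache(maxsize=None)
        def finish(task: str) -> int:
            # earliest finish time = own cost plus the latest predecessor finish
            return tasks[task] + max((finish(p) for p in preds[task]), default=0)

        # lower bound on makespan with unlimited identical processors
        print(max(finish(t) for t in tasks))   # -> 13 (read -> fft -> ctrl -> out)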

  12. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    Full Text Available The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students’ acquisition of culture encoded in symbols and the way students’ sign consciousness, formed in the context of learning, affects their world cognition and interpersonal communication. The article introduces a set of tools that would enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of the younger generation’s language picture of the world.

  13. The Development of a Humanitarian Health Ethics Analysis Tool.

    Science.gov (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools have become increasingly prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  14. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated as to whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  15. An Analysis of a Hard Real-Time Execution Environment Extension for FreeRTOS

    Directory of Open Access Journals (Sweden)

    STANGACIU, C.

    2015-08-01

    Full Text Available FreeRTOS is a popular real-time operating system, which has attracted significant attention in recent years due to its main advantages: it is open source, portable, well documented and implemented on more than 30 architectures. The FreeRTOS execution environment is dynamic, preemptive and priority based, but it is not suitable for hard real-time tasks, because it provides task execution determinism only to a certain degree and cannot guarantee the absence of task execution jitter. As a solution to this problem, we propose a hard real-time execution extension to FreeRTOS in order to support a particular model of HRT tasks, called ModXs, which are executed with no jitter. This article presents a detailed analysis, in terms of scheduling, task execution and memory usage, of this hard real-time execution environment extension. The article concludes with the advantages this extension brings to the system, weighed against the small memory and timing overhead introduced.

  16. Gene Knockout Identification Using an Extension of Bees Hill Flux Balance Analysis

    Directory of Open Access Journals (Sweden)

    Yee Wen Choon

    2015-01-01

    Full Text Available Microbial strain optimisation for the overproduction of a desired phenotype has been a popular topic in recent years. Gene knockout is a genetic engineering technique that can modify the metabolism of microbial cells to obtain desirable phenotypes. Optimisation algorithms have been developed to identify the effects of gene knockout. However, the complexities of metabolic networks have made the process of identifying the effects of genetic modification on desirable phenotypes challenging. Furthermore, the vast number of reactions in cellular metabolism often leads to a combinatorial problem in obtaining optimal gene knockouts. The computational time increases exponentially as the size of the problem increases. This work reports an extension of Bees Hill Flux Balance Analysis (BHFBA) to identify optimal gene knockouts to maximise the production yield of desired phenotypes while sustaining the growth rate. The proposed method functions by integrating OptKnock into BHFBA for validating the results automatically. The results show that the extension of BHFBA is suitable, reliable, and applicable in predicting gene knockout. Through several experiments conducted on Escherichia coli, Bacillus subtilis, and Clostridium thermocellum as model organisms, the extension of BHFBA has shown better performance in terms of computational time, stability, growth rate, and production yield of desired phenotypes.
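
    The linear program underneath any flux balance analysis variant such as BHFBA is compact enough to sketch; here a toy three-reaction network (not one of the paper's organisms), with a knockout modelled by pinning one flux to zero:

        # Toy flux balance analysis: maximize a product flux subject to
        # steady state S.v = 0 and flux bounds; a "knockout" forces one
        # reaction's flux to zero and the LP is re-solved.
        import numpy as np
        from scipy.optimize import linprog

        S = np.array([[1, -1, 0],     # metabolite A: made by v1, consumed by v2
                      [0, 1, -1]])    # metabolite B: made by v2, exported by v3
        bounds = [(0, 10), (0, 10), (0, 10)]
        c = np.array([0.0, 0.0, -1.0])  # maximize v3 -> minimize -v3

        def max_product(knockout=None):
            b = list(bounds)
            if knockout is not None:
                b[knockout] = (0, 0)  # knocked-out reaction carries no flux
            res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=b, method="highs")
            return res.x

        print(max_product())             # ~[10, 10, 10]
        print(max_product(knockout=1))   # ~[0, 0, 0] -- v2 knockout kills production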

  17. Non-extensive statistical analysis on solar activity dependence of magnetospheric dynamics

    Science.gov (United States)

    Gopinath, Sumesh; Santhosh Kumar, G.; Prince, P. R.

    2018-01-01

    Tsallis q-Gaussian distributions and associated q-statistics have been used for the last couple of decades to describe non-equilibrium dynamical systems with varying degrees of complexity. In the present study, we use Tsallis non-extensive statistical analysis for a better understanding of magnetospheric dynamics and its relationship with solar activity. The Tsallis q-triplet, a set of indices (namely q_sens, q_stat and q_rel) used as empirical quantifiers of non-extensivity, has been estimated for magnetospheric proxies such as the auroral electrojet (AE) and disturbance storm time (Dst) indices, for the period 1985-2007. Our results indicate that the degree of non-extensivity of the AE index is quite different from that of the Dst index in its solar activity dependence. We have seen that the values of the q-triplets calculated from the Dst index are more solar activity dependent than those computed from the AE index. This shows that, other than solar wind forcing, certain complex phenomena of internal origin also modulate the dynamics of geomagnetic fluctuations in the auroral region.

  18. SMART: Statistical Metabolomics Analysis-An R Tool.

    Science.gov (United States)

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^(-4) in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^(-4) in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm .

  19. Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.

    Science.gov (United States)

    Chochlaki, Kalliopi; Vallianatos, Filippos

    2017-04-01

    Earth's seismicity is considered an extremely complicated process where long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, in order to analyze it, we use an innovative methodological approach, introduced by Tsallis (Tsallis, 1988; 2009), named Non Extensive Statistical Physics. This approach introduces a generalization of Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq, which, when maximized, leads to the so-called q-exponential function, the probability distribution function that maximizes Sq. In the present work, we utilize the concept of Non Extensive Statistical Physics in order to analyze the spatiotemporal properties of several aftershock series. Marekova (Marekova, 2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distribution of several aftershock sequences in different geographic regions by calculating non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distribution of the aftershocks as well as the frequency-magnitude distribution have been analyzed. The results support the applicability of Non Extensive Statistical Physics ideas in aftershock sequences, where a strong correlation exists along with memory effects. References: C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487, doi:10.1007/BF01016429. C. Tsallis, Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, 2009, doi:10.1007/978-0-387-85359-8. E. Marekova, Analysis of the spatial distribution between successive earthquakes in aftershocks series, Annals of Geophysics, 57, 5, doi:10.4401/ag-6556, 2014. F. Vallianatos, G
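
    A minimal version of the q-exponential fit such studies perform might look as follows (synthetic heavy-tailed data; the fitting recipe and parameter names are illustrative, not the authors' pipeline):

        import numpy as np
        from scipy.optimize import curve_fit

        def q_exp_sf(x, q, x0):
            # survival function e_q(-x/x0) = [1 + (q-1) x/x0]^(1/(1-q));
            # reduces to exp(-x/x0) as q -> 1
            base = np.maximum(1.0 + (q - 1.0) * x / x0, 1e-12)
            return np.power(base, 1.0 / (1.0 - q))

        rng = np.random.default_rng(1)
        d = rng.pareto(3.0, 5000) * 10.0                 # synthetic "inter-event distances"
        x = np.sort(d)
        sf = 1.0 - np.arange(1, x.size + 1) / x.size     # empirical survival function

        (q, x0), _ = curve_fit(q_exp_sf, x, sf, p0=(1.3, 10.0))
        print(f"q = {q:.2f}, x0 = {x0:.1f}")             # q > 1 signals heavy tails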

  20. Formal Analysis Tools for the Synchronous Aspect Language Larissa

    Directory of Open Access Journals (Sweden)

    Stauch David

    2008-01-01

    Full Text Available We present two tools for the formal analysis of the aspect language Larissa, which extends the simple synchronous language Argos. The first tool concerns the combination of design-by-contract with Larissa aspects, and shows how we can apply an aspect not only to a program, but to a specification of programs in form of a contract, and obtain a new contract. The second concerns aspect interferences, that is, aspects that influence each other in an unintended way if they are applied to the same program. We present a way to weave aspects in a less conflict-prone manner, and a means to detect remaining conflicts statically. These tools are quite powerful, compared to those available for other aspect languages.

  2. On the error analysis of the meshless FDM and its multipoint extension

    Science.gov (United States)

    Jaworska, Irena

    2018-01-01

    The error analysis for meshless methods, especially for the Meshless Finite Difference Method (MFDM), is discussed in the paper. Both a priori and a posteriori error estimations are considered. The experimental order of convergence confirms the theoretically developed a priori error bound. The higher order extension of the MFDM, the multipoint approach, may be used as a source of an improved reference solution, instead of the true analytical one, for the global and local estimation of the solution and residual errors. Several types of a posteriori error estimators are described. A variety of tests performed confirm the high quality of a posteriori error estimation based on the multipoint MFDM.

  3. Extension of the GeN-Foam neutronic solver to SP3 analysis and application to the CROCUS experimental reactor

    International Nuclear Information System (INIS)

    Fiorina, Carlo; Hursin, Mathieu; Pautz, Andreas

    2017-01-01

    Highlights: • Development and verification of an SP3 solver based on OpenFOAM. • Integration into the GeN-Foam multi-physics platform. • Application of the new GeN-Foam SP3 solver to the CROCUS reactor. - Abstract: The Laboratory for Reactor Physics and Systems Behaviour at the PSI and at the EPFL has been developing a multi-physics platform for coupled reactor analysis named GeN-Foam since 2013. The developed tool includes a solver for the eigenvalue and transient solution of multi-group neutron diffusion equations. Although frequently used in reactor analysis, diffusion theory shows some limitations for core configurations involving strong anisotropies, which is the case for the CROCUS research reactor at the EPFL. The use of an SP3 approximation to neutron transport can often lead to visible improvements in a code's predictive capabilities, especially for one-directional anisotropies, at an acceptable added computational cost compared with diffusion. Following some modelling issues for the CROCUS reactor, and in order to improve the GeN-Foam modelling capabilities, the GeN-Foam diffusion solver has been extended to allow for SP3 analyses. The present paper describes this extension and a preliminary verification using a mini-core PWR benchmark. The newly developed solver is then applied to the analysis of the CROCUS experimental reactor and results are compared to Monte Carlo calculations, as well as to the results of the diffusion solver.

  4. Extensible Stylesheet Language Formatting Objects (XSL-FO): a tool to transform patient data into attractive clinical reports.

    Science.gov (United States)

    Simonaitis, Linas; Belsito, Anne; Warvel, Jeff; Hui, Siu; McDonald, Clement J

    2006-01-01

    Clinicians at Wishard Hospital in Indianapolis print and carry clinical reports called "Pocket Rounds". This paper describes a new process we developed to improve these clinical reports. The heart of our new process is a World Wide Web Consortium standard: Extensible Stylesheet Language Formatting Objects (XSL-FO). Using XSL-FO stylesheets we generated Portable Document Format (PDF) and PostScript reports with complex formatting: columns, tables, borders, shading, indents, dividing lines. We observed patterns of clinical report printing during an eight-month study period on three Medicine wards. Usage statistics indicated that clinicians accepted the new system enthusiastically: 78% of 26,418 reports were printed using the new system. We surveyed 67 clinical users. Respondents gave the new reports a rating of 4.2 (on a 5 point scale); they gave the old reports a rating of 3.4. The primary complaint was that it took longer to print the new reports. We believe that XSL-FO is a promising way to transform text data into functional and attractive clinical reports: relatively easy to implement and modify.
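
    For readers unfamiliar with XSL-FO, a sketch of the kind of document such a generator emits; the layout values and the patient line are invented, and rendering to PDF or PostScript would be delegated to an FO processor such as Apache FOP:

        # Assemble a minimal two-column XSL-FO document (structure per the
        # XSL-FO spec; content and sizes purely illustrative).
        FO = """<?xml version="1.0" encoding="UTF-8"?>
        <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
          <fo:layout-master-set>
            <fo:simple-page-master master-name="pocket" page-width="8.5in"
                                   page-height="11in" margin="0.25in">
              <fo:region-body column-count="2" column-gap="0.2in"/>
            </fo:simple-page-master>
          </fo:layout-master-set>
          <fo:page-sequence master-reference="pocket">
            <fo:flow flow-name="xsl-region-body">
              <fo:block font-size="8pt" border-bottom="0.5pt solid black">
                DOE, JOHN  Ward 5B -- Na 140  K 4.1  Cr 1.0
              </fo:block>
            </fo:flow>
          </fo:page-sequence>
        </fo:root>
        """
        with open("pocket_rounds.fo", "w") as fh:
            fh.write(FO)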

  5. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way for the implementation of new features in the existing tools, and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various constructions and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also covers programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author

  6. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy

  7. Is motion analysis a valid tool for assessing laparoscopic skill?

    Science.gov (United States)

    Mason, John D; Ansell, James; Warren, Neil; Torkington, Jared

    2013-05-01

    The use of simulation for laparoscopic training has led to the development of objective tools for skills assessment. Motion analysis represents one area of focus. This study was designed to assess the evidence for the use of motion analysis as a valid tool for laparoscopic skills assessment. Embase, MEDLINE and PubMed were searched using the following domains: (1) motion analysis, (2) validation and (3) laparoscopy. Studies investigating motion analysis as a tool for assessment of laparoscopic skill in general surgery were included. Common endpoints in motion analysis metrics were compared between studies according to a modified form of the Oxford Centre for Evidence-Based Medicine levels of evidence and recommendation. Thirteen studies were included from 2,039 initial papers. Twelve (92.3 %) reported the construct validity of motion analysis across a range of laparoscopic tasks. Of these 12, 5 (41.7 %) evaluated the ProMIS Augmented Reality Simulator, 3 (25 %) the Imperial College Surgical Assessment Device (ICSAD), 2 (16.7 %) the Hiroshima University Endoscopic Surgical Assessment Device (HUESAD), 1 (8.33 %) the Advanced Dundee Endoscopic Psychomotor Tester (ADEPT) and 1 (8.33 %) the Robotic and Video Motion Analysis Software (ROVIMAS). Face validity was reported by 1 (7.7 %) study each for ADEPT and ICSAD. Concurrent validity was reported by 1 (7.7 %) study each for ADEPT, ICSAD and ProMIS. There was no evidence for predictive validity. Evidence exists to validate motion analysis for use in laparoscopic skills assessment. Valid parameters are time taken, path length and number of hand movements. Future work should concentrate on the conversion of motion data into competency-based scores for trainee feedback.
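
    The construct-valid metrics named above (time taken, path length, number of hand movements) are straightforward functions of tracked instrument coordinates; a sketch, with the sampling rate and movement threshold chosen purely for illustration:

        import numpy as np

        def motion_metrics(xyz: np.ndarray, hz: float = 30.0, thresh: float = 5.0):
            """xyz: (n_samples, 3) instrument-tip positions in mm."""
            steps = np.linalg.norm(np.diff(xyz, axis=0), axis=1)
            time_taken = len(xyz) / hz                    # s
            path_length = float(steps.sum())              # mm
            speed = steps * hz                            # mm/s
            # count a "movement" at each upward crossing of the speed threshold
            n_moves = int(np.sum((speed[1:] > thresh) & (speed[:-1] <= thresh)))
            return time_taken, path_length, n_moves

        track = np.cumsum(np.random.default_rng(2).normal(0, 1, (300, 3)), axis=0)
        print(motion_metrics(track))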

  8. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
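
    The physics behind the vortex-shedding concern reduces to the Strouhal relation f_s = St·U/D (St is roughly 0.2 for circular cylinders over a wide Reynolds-number range); trouble starts when f_s drifts near a lightly damped structural frequency and lock-in amplifies the response. A back-of-envelope sketch, with all numbers illustrative rather than any real vehicle's:

        ST, D = 0.2, 5.0                 # Strouhal number, vehicle diameter [m]
        f_mode = 0.8                     # first bending-mode frequency [Hz] (assumed)

        for U in range(2, 32, 2):        # mean wind speed [m/s]
            f_s = ST * U / D             # vortex shedding frequency [Hz]
            flag = "  <-- near lock-in" if abs(f_s - f_mode) / f_mode < 0.15 else ""
            print(f"U = {U:2d} m/s  f_s = {f_s:4.2f} Hz{flag}")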

  9. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    Science.gov (United States)

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-02

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose manual optimization is time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification. Copyright © 2015 Elsevier B.V. All rights reserved.
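
    At the core of the diagnostic is the orthogonal Procrustes problem: find the rotation R minimizing ||A - BR||_F between two sets of projection scores and treat the residual as a measure of projection stability. A bare-bones version via SVD (synthetic scores, not the authors' code):

        import numpy as np

        def procrustes_similarity(A: np.ndarray, B: np.ndarray) -> float:
            """Rotate B onto A; return relative residual (0 = identical projections)."""
            U, _, Vt = np.linalg.svd(B.T @ A)
            R = U @ Vt                                 # optimal rotation (Schonemann)
            return float(np.linalg.norm(A - B @ R) / np.linalg.norm(A))

        rng = np.random.default_rng(3)
        A = rng.normal(size=(50, 2))                   # PP scores, run 1
        B = A @ np.array([[0.0, -1.0], [1.0, 0.0]])    # run 2: same up to a 90-deg rotation
        print(procrustes_similarity(A, B))             # ~0 -> stable projection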

  10. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  11. Multi-Spacecraft Analysis with Generic Visualization Tools

    Science.gov (United States)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  12. Criterion-Related Validity of Sit-and-Reach Tests for Estimating Hamstring and Lumbar Extensibility: a Meta-Analysis.

    Science.gov (United States)

    Mayorga-Vega, Daniel; Merino-Marban, Rafael; Viciana, Jesús

    2014-01-01

    The main purpose of the present meta-analysis was to examine the scientific literature on the criterion-related validity of sit-and-reach tests for estimating hamstring and lumbar extensibility. For this purpose, relevant studies were searched from seven electronic databases dated up through December 2012. Primary outcomes of criterion-related validity were Pearson's zero-order correlation coefficients (r) between sit-and-reach tests and hamstring and/or lumbar extensibility criterion measures. Then, from the included studies, the Hunter-Schmidt psychometric meta-analysis approach was conducted to estimate the population criterion-related validity of sit-and-reach tests. Firstly, the corrected correlation mean (rp), unaffected by statistical artefacts (i.e., sampling error and measurement error), was calculated separately for each sit-and-reach test. Subsequently, the three potential moderator variables (sex of participants, age of participants, and level of hamstring extensibility) were examined by a partially hierarchical analysis. Of the 34 studies included in the present meta-analysis, 99 correlation values across eight sit-and-reach tests and 51 across seven sit-and-reach tests were retrieved for hamstring and lumbar extensibility, respectively. The overall results showed that all sit-and-reach tests had a moderate mean criterion-related validity for estimating hamstring extensibility (rp = 0.46-0.67), but a low mean for estimating lumbar extensibility (rp = 0.16-0.35). Generally, females, adults and participants with high levels of hamstring extensibility tended to have greater mean values of criterion-related validity for estimating hamstring extensibility. When the use of angular tests is limited, such as in a school setting or in large scale studies, scientists and practitioners could use the sit-and-reach tests as a useful alternative for hamstring extensibility estimation, but not for estimating lumbar extensibility.

  13. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved ... to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods...
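
    For context, the rank product statistic reduces each feature to the geometric mean of its per-replicate ranks; a sketch of one way to extend it to missing values, in the spirit the excerpt describes (synthetic data, illustrative NaN handling):

        import numpy as np

        def rank_product(fc: np.ndarray) -> np.ndarray:
            """fc: (n_features, n_replicates) fold changes, NaN = missing."""
            n, k = fc.shape
            ranks = np.full((n, k), np.nan)
            for j in range(k):
                col, ok = fc[:, j], ~np.isnan(fc[:, j])
                # rank 1 = strongest up-regulation within the replicate
                ranks[ok, j] = (-col[ok]).argsort().argsort() + 1
            # geometric mean over the replicates actually observed
            return np.exp(np.nanmean(np.log(ranks), axis=1))

        rng = np.random.default_rng(4)
        fc = rng.normal(0, 1, (100, 3))
        fc[0] += 3.0                                  # one truly changing feature
        fc[rng.random(fc.shape) < 0.1] = np.nan       # inject missing values
        print(np.argsort(rank_product(fc))[:5])       # feature 0 should appear early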

  14. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  15. An analysis of farm services centre (fsc) approach launched for agricultural extension in NWFP, pakistan

    International Nuclear Information System (INIS)

    Haq, I.; Ali, T.; Zafar, M.I.

    2009-01-01

    Agricultural extension services have a pivotal role in agricultural and rural development. They are the major source of technology dissemination and help farmers to rationalize the use of natural resources for sustainable agricultural development. Globally, the public-private partnership approach in agricultural extension is considered more effective, efficient, and responsive to different categories of farmers. In Pakistan, the government of North West Frontier Province (NWFP) has initiated a public-private partnership extension programme in the province, locally called the Farm Services Centre (FSC). This approach has an inbuilt mechanism of input delivery, market facilitation, exchange of experiences and diffusion of knowledge and technology. However, the extent to which this public-private partnership is instrumental in achieving the aforementioned objectives is yet to be established. The present study was an attempt to analyze this public-private partnership approach by measuring its strengths and weaknesses. For this purpose, out of the 24 districts of NWFP, two districts, namely Swabi and Lakimarwat, were selected randomly. From these two districts, 491 FSC member farmers were selected as respondents for interview on a random basis. The analysis showed that the most prominent strength of the FSC was farmers' empowerment, with mean 4.05 and SD 1.29, while that of the Agriculture Extension Department (AED) was effective message delivery. As per respondents, the major weakness of both systems (FSC and AED) was the lack of a marketing facility, with means 4.12 and 4.13 and SDs 1.22 and 1.01, respectively. It is essential that the government ensure the mandated activities at the FSC forum, particularly facilitation by the line agencies and NWFP Agricultural University, Peshawar. It should be a forum of technology dissemination, marketing of surplus agricultural produce and cooperative farming. The Agricultural Extension Department should provide more facilities to the staff involved in FSC

  16. Analysis of the extension and reduction method of qualified life used in equipment with ambiental qualification

    International Nuclear Information System (INIS)

    Serrano R., M.L., e-mail: mlserrano@cnsns.gob.mx

    2005-01-01

    With the purpose of reducing costs of acquisition, maintenance, design, man-hours, dose, production, etc., or due to changes in the service temperature, diverse Nuclear Plants (NP) in the world have carried out extensions and reductions of the life of equipment and/or components related to safety. The methods used are mainly type tests on equipment or components aged by accelerated aging, type tests on equipment or components aged naturally, and use of the Arrhenius model with monitoring of temperatures in place. The present article analyzes the Arrhenius model with monitoring of temperatures in place, which is presented in two variants, equations (1) and (2), since it is the one most used by NPs for the extension and reduction of qualified life. As background, a search was made of investigations, applied to diverse fields of science, that address the reliability of the results provided by Arrhenius for the aging case. The results of the analysis indicate: 1) that this method introduces uncertainties in some temperature intervals, and 2) which of the two variants provides more reliable data. (Author)
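
    For reference, the Arrhenius thermal-aging relation that such qualified-life calculations build on is standard in equipment qualification (generic form; symbols are not taken from the article):

        % Equal thermal aging is accumulated when
        \[
          \frac{t_{\mathrm{service}}}{t_{\mathrm{qual}}}
            = \exp\!\left[\frac{E_a}{k_B}
              \left(\frac{1}{T_{\mathrm{service}}} - \frac{1}{T_{\mathrm{qual}}}\right)\right],
        \]
        % with t the aging times, T the absolute temperatures [K], E_a the
        % activation energy and k_B Boltzmann's constant: service below the
        % qualification temperature extends qualified life; above it, shortens it.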

  17. Pathway-based analysis tools for complex diseases: a review.

    Science.gov (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

    2014-10-01

    Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which have rapidly accumulated in biomedical fields. This article is a comprehensive review of pathway-based analysis methods, powerful methods with the potential to uncover the biological depths of complex diseases. The general concepts and procedures for pathway-based analysis methods are introduced, and then a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  18. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  19. Analysis of the functionality of free CASE tools for database design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of CASE technologies for database design into the educational process requires significant costs for the purchase of software. A possible solution could be the use of free software counterparts. At the same time, this kind of substitution should be based on an even-handed comparison of the functional characteristics and operational features of these programs. The purpose of the article is a review of free and non-profit CASE tools for database design, as well as their classification on the basis of an analysis of their functionality. When writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE tools for database design was made empirically through direct work with the software products. The analysis of the tools' functionality distinguishes two categories of CASE tools for database design. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL code editor, and means of data export/import. CASE systems of the first category can be used to design and develop simple databases and manage data, as well as to administer database servers. A distinctive feature of the second category of CASE tools for database design (full-featured systems) is the presence of a visual designer, which allows the construction of a database model and the automatic creation of the database on the server based on this model. CASE systems of this category can be used for the design and development of databases of any structural complexity, as well as for database server administration. The article concluded that the

  20. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  1. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences of numerous natural, man-made and technological threats. The resulting software tool is intended for use by various decision makers and analysts who need such estimates quickly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
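
    E-CAT's "reduced form" idea, as described, is to run many CGE simulations per threat and regress the consequence output on key explanatory variables. A minimal sketch of that estimation step, with invented variable names and synthetic stand-in data rather than actual CGE output, might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for CGE simulation runs: each row is one run.
# Columns: threat magnitude, duration (days), resilience index.
X = rng.uniform([1.0, 1.0, 0.0], [9.0, 30.0, 1.0], size=(500, 3))
# Stand-in "CGE output": GDP loss with noise (purely illustrative).
y = 2.0 * X[:, 0] + 0.4 * X[:, 1] - 3.0 * X[:, 2] + rng.normal(0, 0.5, 500)

# Fit the single reduced-form regression by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", np.round(coef, 3))

# The reduced form can then score a new scenario instantly.
scenario = np.array([1.0, 6.5, 14.0, 0.3])  # leading 1.0 is the intercept term
print("predicted loss:", round(float(scenario @ coef), 2))
```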

  2. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using a micro end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating tool life, and optimizing the process. A numerical analysis and an experimental method are presented to investigate chatter stability in the micro end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills through time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation for micro end milling with straight-tooth and helical-tooth end mills is conducted, based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted and the simulation results for the different cutter geometries are compared in depth. The simulation results have important significance for the actual milling process.
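
    The cutting-force formula itself is not reproduced in the abstract, so the following is a generic single-flute time-domain sketch using the common linear force law F_t = K_t * a_p * h(phi), with chip thickness h = f_z * sin(phi) while the tooth is engaged; all coefficients are assumed placeholders, not the paper's values.

```python
import numpy as np

# Assumed placeholder parameters for a two-flute micro square end mill.
KT = 2.0e9        # tangential cutting coefficient, N/m^2
AP = 50e-6        # axial depth of cut, m
FZ = 2e-6         # feed per tooth, m
N_TEETH = 2
RPM = 60000.0

t = np.linspace(0.0, 4 * 60.0 / RPM, 2000)       # four spindle revolutions
omega = 2.0 * np.pi * RPM / 60.0                 # spindle speed, rad/s

force = np.zeros_like(t)
for j in range(N_TEETH):
    phi = omega * t + j * 2.0 * np.pi / N_TEETH  # angular position of tooth j
    in_cut = np.sin(phi) > 0.0                   # tooth engaged for 0 < phi < pi
    h = FZ * np.sin(phi) * in_cut                # instantaneous chip thickness
    force += KT * AP * h                         # tangential force contribution

print(f"peak tangential force: {force.max() * 1e3:.2f} mN")
```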

  3. Aerospace Power Systems Design and Analysis (APSDA) Tool

    Science.gov (United States)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operating parameters in the early stages of the design cycle. The tool has a user-friendly interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  4. Software Tools for Computing Experiment Aimed at Multivariate Analysis Implementation

    Directory of Open Access Journals (Sweden)

    A. V. Tyurin

    2015-09-01

    Full Text Available A concept for the organization and planning of computational experiments aimed at the implementation of multivariate analysis of complex multifactor models is proposed. It is based on the generation of a calculations tree. The logical and structural schemes of the tree are given, as well as software tools for the automation of work with it: calculation generation, carrying out calculations, and analysis of the obtained results. Computer modeling systems, and such special-purpose systems as RACS and PRADIS, do not solve the problems connected with the effective conduct of a computational experiment, consisting of its organization, planning, execution and analysis of the results. For the organization of the computational experiment, calculation data storage is proposed in the form of an input and output data tree. Each tree node has a reference to the calculation of the model step performed earlier. The calculations tree is stored in a specially organized directory structure. A software tool is proposed for creating and modifying the design scheme, which stores the structure of one branch of the calculation tree with a view to the effective planning of multivariate calculations. A set of special-purpose software tools makes possible the quick generation and modification of the tree and the addition of calculations with step-by-step changes in the model factors. To perform calculations, a software environment in the form of a graphical user interface for creating and modifying calculation scripts has been developed. This environment makes it possible to traverse the calculation tree in a certain order and to perform serial and parallel initiation of computational modules. To analyze the results, a software tool has been developed that operates on the basis of the tag tree. This is a special tree that stores the input and output data of the calculations in the form of change sets of the appropriate model factors. The tool enables selection of the factors and responses of the model at various steps
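
    The calculations tree described above, with each node referencing an earlier model step and persisted in a directory structure, can be sketched roughly as follows; the names and on-disk layout are assumptions for illustration, not the authors' implementation.

```python
import json
from pathlib import Path
from typing import Optional

class CalcNode:
    """One node of a calculations tree: factors for this step plus a parent link."""
    def __init__(self, root: Path, node_id: str, factors: dict, parent: Optional[str]):
        self.dir = root / node_id          # each node lives in its own directory
        self.factors = factors
        self.parent = parent               # reference to the earlier model step

    def save(self, outputs: dict):
        self.dir.mkdir(parents=True, exist_ok=True)
        (self.dir / "node.json").write_text(json.dumps(
            {"factors": self.factors, "parent": self.parent, "outputs": outputs}))

# Branch a new variant calculation off an existing step.
root = Path("calc_tree")
base = CalcNode(root, "step1", {"load": 1.0}, parent=None)
base.save(outputs={"stress": 12.3})
variant = CalcNode(root, "step1a", {"load": 1.5}, parent="step1")
variant.save(outputs={"stress": 18.1})
```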

  5. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  6. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software, offering the best compromise between "scalability" and "ease-of-use." The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task-farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
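
    UV-CDAT's support for Python scripts running under MPI, as described, follows the usual mpi4py pattern. The generic rank-parallel sketch below is not UV-CDAT code; the script name and the stand-in data are invented, and the array length is assumed to divide evenly across ranks.

```python
# Run with e.g.: mpiexec -n 4 python analyze_chunks.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank processes its own slice of a (stand-in) climate time series.
full = np.arange(1200, dtype=float) if rank == 0 else None
chunk = np.empty(1200 // size)
comm.Scatter(full, chunk, root=0)          # distribute work to all ranks

local_mean = chunk.mean()
means = comm.gather(local_mean, root=0)    # collect partial results
if rank == 0:
    print("per-rank means:", [round(m, 1) for m in means])
```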

  7. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  8. Introducing GIS to TransNav and its Extensive Maritime Application: An Innovative Tool for Intelligent Decision Making?

    Directory of Open Access Journals (Sweden)

    Angelica M. Baylon

    2013-12-01

    Full Text Available This paper aims to introduce GIS: its definition, principles, applications in any discipline, particularly maritime, its process, data sets and features, and its benefits to the maritime sector and universities. Specifically, the paper intends to provide an overview of its wide applications in maritime fields including but not limited to marine transportation, the marine environment, port management and operation, maritime education and training (MET) and maritime research. GIS's simplest task is mapping and visualization, but its most important function is spatial analysis. Spatial analysis takes into account the location, geometry, topology, and relationships of geographic data, which lend themselves to intelligent decision making. GIS is not just for researchers and students. GIS is especially useful for decision makers such as managers, administrators, and directors of large and small projects. Scenarios are “seen” and analyzed even before events happen. To planners and decision makers this is very important, because they can assess the impact of events or scenarios and may save a lot of time, effort, and money before implementing the actual project. An additional skill in GIS, when learned or taught, would certainly result in a technically competent maritime global workforce. The paper provides ideas on possible areas for collaboration among TransNav member institutions for data sharing, which may be processed and analyzed by a GIS specialist.

  9. Design and Application of the Exploration Maintainability Analysis Tool

    Science.gov (United States)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew
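
    EMAT is described as a probabilistic simulator of repair and maintenance demand. A toy Monte Carlo sketch of that general idea, with invented component failure rates and mission length, is given below; it is not the EMAT algorithm itself.

```python
import random

random.seed(42)

# Hypothetical components: (name, mean time between failures in days).
COMPONENTS = [("pump", 400.0), ("fan", 250.0), ("valve", 600.0)]
MISSION_DAYS = 900
TRIALS = 10000

spares_needed = []
for _ in range(TRIALS):
    failures = 0
    for _, mtbf in COMPONENTS:
        t = random.expovariate(1.0 / mtbf)   # first failure time
        while t < MISSION_DAYS:              # count failures within the mission
            failures += 1
            t += random.expovariate(1.0 / mtbf)
    spares_needed.append(failures)

spares_needed.sort()
print("mean spares per mission:", sum(spares_needed) / TRIALS)
print("95th-percentile spares:", spares_needed[int(0.95 * TRIALS)])
```

    Sizing logistics to a high percentile rather than the mean is what drives the mass and volume impacts the abstract highlights.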

  10. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  11. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  12. Criterion-Related Validity of Sit-and-Reach Tests for Estimating Hamstring and Lumbar Extensibility: a Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Daniel Mayorga-Vega

    2014-03-01

    Full Text Available The main purpose of the present meta-analysis was to examine the scientific literature on the criterion-related validity of sit-and-reach tests for estimating hamstring and lumbar extensibility. For this purpose relevant studies were searched from seven electronic databases dated up through December 2012. Primary outcomes of criterion-related validity were Pearson's zero-order correlation coefficients (r) between sit-and-reach tests and hamstring and/or lumbar extensibility criterion measures. Then, from the included studies, the Hunter-Schmidt psychometric meta-analysis approach was conducted to estimate the population criterion-related validity of sit-and-reach tests. Firstly, the corrected correlation mean (rp), unaffected by statistical artefacts (i.e., sampling error and measurement error), was calculated separately for each sit-and-reach test. Subsequently, three potential moderator variables (sex of participants, age of participants, and level of hamstring extensibility) were examined by a partially hierarchical analysis. Of the 34 studies included in the present meta-analysis, 99 correlation values across eight sit-and-reach tests and 51 across seven sit-and-reach tests were retrieved for hamstring and lumbar extensibility, respectively. The overall results showed that all sit-and-reach tests had a moderate mean criterion-related validity for estimating hamstring extensibility (rp = 0.46-0.67), but a low mean for estimating lumbar extensibility (rp = 0.16-0.35). Generally, females, adults and participants with high levels of hamstring extensibility tended to have greater mean values of criterion-related validity for estimating hamstring extensibility. When the use of angular tests is limited, such as in a school setting or in large-scale studies, scientists and practitioners could use the sit-and-reach tests as a useful alternative for estimating hamstring extensibility, but not lumbar extensibility.
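
    The Hunter-Schmidt step used above, computing a sample-size-weighted mean correlation and removing expected sampling-error variance, can be sketched as follows. The per-study numbers are invented, and the measurement-error (attenuation) correction is omitted for brevity.

```python
import numpy as np

# Hypothetical per-study results: observed r and sample size n.
r = np.array([0.52, 0.61, 0.47, 0.70, 0.58])
n = np.array([40, 65, 30, 120, 55])

r_bar = np.sum(n * r) / np.sum(n)                 # weighted mean correlation
var_obs = np.sum(n * (r - r_bar) ** 2) / np.sum(n)
var_err = (1 - r_bar ** 2) ** 2 / (n.mean() - 1)  # expected sampling-error variance
var_rho = max(var_obs - var_err, 0.0)             # residual "true" variance

print(f"sample-size-weighted mean r: {r_bar:.2f}")
print(f"residual variance after sampling error: {var_rho:.4f}")
```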

  13. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  14. FC-NIRS: A Functional Connectivity Analysis Tool for Near-Infrared Spectroscopy Data.

    Science.gov (United States)

    Xu, Jingping; Liu, Xiangyu; Zhang, Jinrui; Li, Zhen; Wang, Xindi; Fang, Fang; Niu, Haijing

    2015-01-01

    Functional near-infrared spectroscopy (fNIRS), a promising noninvasive imaging technique, has recently become an increasingly popular tool in resting-state brain functional connectivity (FC) studies. However, the corresponding software packages for FC analysis are still lacking. To facilitate fNIRS-based human functional connectome studies, we developed a MATLAB software package called "functional connectivity analysis tool for near-infrared spectroscopy data" (FC-NIRS). This package includes the main functions of fNIRS data preprocessing, quality control, FC calculation, and network analysis. Because this software has a friendly graphical user interface (GUI), FC-NIRS allows researchers to perform data analysis in an easy, flexible, and quick way. Furthermore, FC-NIRS can accomplish batch processing during data processing and analysis, thereby greatly reducing the time cost of addressing a large number of datasets. Extensive experimental results using real human brain imaging confirm the viability of the toolbox. This novel toolbox is expected to substantially facilitate fNIRS-data-based human functional connectome studies.
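
    The core FC computation in tools of this kind is a channel-by-channel correlation of the preprocessed time series. A bare-bones stand-in (not FC-NIRS code) using synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic resting-state data: 24 fNIRS channels x 1500 time points,
# with a shared slow component so some channels correlate.
shared = rng.normal(size=1500)
data = 0.6 * shared + rng.normal(size=(24, 1500))

fc = np.corrcoef(data)                  # 24 x 24 Pearson FC matrix
z = np.arctanh(fc * (1 - np.eye(24)))   # Fisher z-transform, diagonal zeroed

print("FC matrix shape:", fc.shape)
print("mean off-diagonal FC:", round((fc.sum() - 24) / (24 * 23), 3))
print("max off-diagonal z:", round(z.max(), 2))
```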

  15. Results of fractal analysis of the Kiel extensive air shower data

    International Nuclear Information System (INIS)

    Kempa, J.; Samorski, M.

    1998-01-01

    For years there has been a problem in cosmic ray studies of how to distinguish individual extensive air showers (EAS) originating from primary protons, heavy nuclei or primary photons. In this paper, results obtained from the fractal analysis of particle density distributions in individual EAS detected in the range of shower sizes Ne between 1.4x10^5 and 5x10^6 by the old Kiel experiment are presented. The Lipschitz-Hoelder exponent distributions of EAS detected by the Kiel experiment are discussed. Examples of EAS most probably originating from primary protons, heavy nuclei and high-energy gamma-rays are presented. The lateral distributions of charged particle densities at small distances, the angular and size spectra, and the mass composition of primary cosmic ray particles around the 'knee' of the energy spectrum are discussed. Monte Carlo simulation data illustrating the problem of interest are also shown. (author)

  16. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Jeremy [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Pesaran, Ahmad [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  17. An Analysis of the Priority Needs of Cooperative Extension at the County Level

    Science.gov (United States)

    Harder, Amy; Lamm, Alexa; Strong, Robert

    2009-01-01

    Cooperative Extension's role as a relevant provider of nonformal education is dependent upon its ability to improve and adjust in response to internal and external pressures. Periodically conducting needs assessments focused on the Extension organization can aid in Extension's efforts to deliver quality educational programs by pinpointing priority…

  18. Analysis of "In-Depth" Schools Conducted by Area Extension Agents.

    Science.gov (United States)

    McCormick, Robert W.

    Five educational programs were conducted during the fall and winter of 1965-66 at area extension centers established by the Ohio Cooperative Extension Service in January 1965. Aimed mainly at the commercial agricultural industry, specialized extension agents focused on educational problems of agricultural production and of such agribusiness…

  19. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    Science.gov (United States)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey, for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad
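
    The NESP power-laws referred to above are usually written with the q-exponential function, exp_q(x) = [1 + (1 − q)x]^{1/(1−q)}, so that, for example, the interevent-time survival function takes the form P(>Δt) = exp_q(−Δt/τ). A small sketch of evaluating and fitting that form to synthetic data follows; the data and the q, τ values are arbitrary illustration choices, not results from this study.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exp_survival(dt, q, tau):
    """Tsallis q-exponential survival function P(>dt) = exp_q(-dt/tau)."""
    base = 1.0 + (1.0 - q) * (-dt / tau)
    return np.clip(base, 0.0, None) ** (1.0 / (1.0 - q))  # clip guards the q<1 cutoff

# Synthetic heavy-tailed interevent times (Lomax draws are q-exponential).
dt = np.sort(np.random.default_rng(3).pareto(3.0, 2000) * 10.0)
surv = 1.0 - np.arange(1, len(dt) + 1) / len(dt)    # empirical survival function

(q_hat, tau_hat), _ = curve_fit(q_exp_survival, dt, surv, p0=(1.2, 5.0),
                                bounds=([1.01, 0.1], [2.0, 100.0]))
print(f"fitted q = {q_hat:.2f}, tau = {tau_hat:.1f}")
```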

  20. IQM: an extensible and portable open source application for image and signal analysis in Java.

    Science.gov (United States)

    Kainz, Philipp; Mayrhofer-Reinhartshuber, Michael; Ahammer, Helmut

    2015-01-01

    Image and signal analysis applications are substantial in scientific research. Both open source and commercial packages provide a wide range of functions for image and signal analysis, which are sometimes supported very well by the communities in the corresponding fields. Commercial software packages have the major drawback of being expensive and having undisclosed source code, which hampers extending the functionality if there is no plugin interface or similar option available. However, both variants cannot cover all possible use cases and sometimes custom developments are unavoidable, requiring open source applications. In this paper we describe IQM, a completely free, portable and open source (GNU GPLv3) image and signal analysis application written in pure Java. IQM does not depend on any natively installed libraries and is therefore runnable out-of-the-box. Currently, a continuously growing repertoire of 50 image and 16 signal analysis algorithms is provided. The modular functional architecture based on the three-tier model is described along with the most important functionality. Extensibility is achieved using operator plugins, and the development of more complex workflows is provided by a Groovy script interface to the JVM. We demonstrate IQM's image and signal processing capabilities in a proof-of-principle analysis and provide example implementations to illustrate the plugin framework and the scripting interface. IQM integrates with the popular ImageJ image processing software and aims at complementing functionality rather than competing with existing open source software. Machine learning can be integrated into more complex algorithms via the WEKA software package as well, enabling the development of transparent and robust methods for image and signal analysis.

  1. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have in recent years seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining a multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: a process that takes place to extract new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what happens, but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  2. Extension of explicit formulas in Poissonian white noise analysis using harmonic analysis on configuration spaces

    Directory of Open Access Journals (Sweden)

    Yu.G.Kondratiev

    2008-02-01

    Full Text Available Harmonic analysis on configuration spaces is used in order to extend explicit expressions for the images of creation, annihilation, and second quantization operators in L2-spaces with respect to Poisson point processes to a set of functions larger than the space obtained by directly using chaos expansion. This permits, in particular, the derivation of an explicit expression for the generator of the second quantization of a sub-Markovian contraction semigroup on a set of functions which forms a core of the generator.

  3. Network enrichment analysis: extension of gene-set enrichment analysis to gene networks

    Directory of Open Access Journals (Sweden)

    Alexeyenko Andrey

    2012-09-01

    Full Text Available Abstract Background Gene-set enrichment analyses (GEA or GSEA) are commonly used for biological characterization of an experimental gene-set. This is done by finding known functional categories, such as pathways or Gene Ontology terms, that are over-represented in the experimental set; the assessment is based on an overlap statistic. Rich biological information in terms of gene interaction networks is now widely available, but this topological information is not used by GEA, so there is a need for methods that exploit this type of information in high-throughput data analysis. Results We developed a method of network enrichment analysis (NEA) that extends the overlap statistic in GEA to network links between genes in the experimental set and those in the functional categories. For the crucial step in statistical inference, we developed a fast network randomization algorithm in order to obtain the distribution of any network statistic under the null hypothesis of no association between an experimental gene-set and a functional category. We illustrate the NEA method using gene and protein expression data from a lung cancer study. Conclusions The results indicate that the NEA method is more powerful than the traditional GEA, primarily because the relationships between gene sets were more strongly captured by network connectivity than by simple overlaps.
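
    The inference step described above, comparing the observed number of network links between an experimental gene set and a functional category against a randomized-network null, can be caricatured in a few lines. The edge list below is a toy example, and simple label shuffling stands in for NEA's fast degree-preserving randomization.

```python
import random

random.seed(7)

# Toy network as an edge list between gene labels.
edges = [("g1", "g4"), ("g1", "g5"), ("g2", "g5"), ("g3", "g6"),
         ("g2", "g6"), ("g4", "g6"), ("g1", "g6"), ("g3", "g5")]
experimental = {"g1", "g2", "g3"}
category = {"g5", "g6"}

def cross_links(edge_list):
    """Count edges connecting the experimental set to the functional category."""
    return sum((a in experimental and b in category) or
               (b in experimental and a in category) for a, b in edge_list)

observed = cross_links(edges)

# Null model: shuffle node labels while keeping the graph topology fixed.
nodes = sorted({n for e in edges for n in e})
null = []
for _ in range(5000):
    perm = dict(zip(nodes, random.sample(nodes, len(nodes))))
    null.append(cross_links([(perm[a], perm[b]) for a, b in edges]))

p = (1 + sum(x >= observed for x in null)) / (1 + len(null))
print(f"observed cross-links: {observed}, permutation p = {p:.3f}")
```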

  4. multiplierz: an extensible API based desktop environment for proteomics data analysis.

    Science.gov (United States)

    Parikh, Jignesh R; Askenazi, Manor; Ficarro, Scott B; Cashorali, Tanya; Webber, James T; Blank, Nathaniel C; Zhang, Yi; Marto, Jarrod A

    2009-10-29

    Efficient analysis of results from mass spectrometry-based proteomics experiments requires access to disparate data types, including native mass spectrometry files, output from algorithms that assign peptide sequence to MS/MS spectra, and annotation for proteins and pathways from various database sources. Moreover, proteomics technologies and experimental methods are not yet standardized; hence a high degree of flexibility is necessary for efficient support of high- and low-throughput data analytic tasks. Development of a desktop environment that is sufficiently robust for deployment in data analytic pipelines, and simultaneously supports customization for programmers and non-programmers alike, has proven to be a significant challenge. We describe multiplierz, a flexible and open-source desktop environment for comprehensive proteomics data analysis. We use this framework to expose a prototype version of our recently proposed common API (mzAPI) designed for direct access to proprietary mass spectrometry files. In addition to routine data analytic tasks, multiplierz supports generation of information rich, portable spreadsheet-based reports. Moreover, multiplierz is designed around a "zero infrastructure" philosophy, meaning that it can be deployed by end users with little or no system administration support. Finally, access to multiplierz functionality is provided via high-level Python scripts, resulting in a fully extensible data analytic environment for rapid development of custom algorithms and deployment of high-throughput data pipelines. Collectively, mzAPI and multiplierz facilitate a wide range of data analysis tasks, spanning technology development to biological annotation, for mass spectrometry-based proteomics research.

  5. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  6. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components such as heat exchangers, the Joule-Thomson valve, the turbo expander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum parameter values for maximizing the liquefaction yield of the plant under the constraints imposed by other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant
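
    For the Linde cycle analyzed here, the liquid yield follows from an energy balance around the heat exchanger, the Joule-Thomson valve and the separator: y = (h1 − h2)/(h1 − hf), where h1 is the low-pressure return-stream enthalpy at the warm end, h2 the high-pressure stream enthalpy at the warm end, and hf the saturated-liquid enthalpy. A back-of-the-envelope check with illustrative placeholder enthalpies (not HYSYS-derived property data):

```python
def linde_liquid_yield(h1, h2, hf):
    """Liquid fraction from the Linde-Hampson energy balance y = (h1-h2)/(h1-hf)."""
    return (h1 - h2) / (h1 - hf)

# Illustrative enthalpies for air in kJ/kg (placeholder values, not property data):
h1 = 462.0   # low-pressure gas leaving the warm end at ambient temperature
h2 = 432.0   # high-pressure gas entering the warm end
hf = 29.0    # saturated liquid at the separator pressure

y = linde_liquid_yield(h1, h2, hf)
print(f"liquid yield: {y:.3f} kg liquid per kg of compressed gas")
```

    A simulator such as Aspen HYSYS effectively automates this balance with rigorous property data while letting the operating pressure and temperatures be varied toward the maximum yield.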

  7. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  8. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    and extensions still remain, which are the focus of ongoing activities. Improvements include accounting for phase information to improve bounds, while the tool awaits extension to include flow-control models; both depend on the possibility of accounting for propagation delay. Since the current version

  9. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low-noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program 2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of the perception of noise on a community. ANOPP2's capability to incorporate medium-fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft, using medium-fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel, is presented. The results are in the form of community noise metrics and

  10. High resolution analysis of the human transcriptome: detection of extensive alternative splicing independent of transcriptional activity

    Directory of Open Access Journals (Sweden)

    Rouet Fabien

    2009-10-01

    Full Text Available Abstract Background Commercially available microarrays have been used in many settings to generate expression profiles for a variety of applications, including target selection for disease detection, classification, profiling for pharmacogenomic response to therapeutics, and potential disease staging. However, many commercially available microarray platforms fail to capture transcript diversity produced by alternative splicing, a major mechanism for driving proteomic diversity through transcript heterogeneity. Results The human Genome-Wide SpliceArray™ (GWSA), a novel microarray platform, utilizes an existing probe design concept to monitor such transcript diversity on a genome scale. The human GWSA allows the detection of alternatively spliced events within the human genome through the use of exon body and exon junction probes to provide a direct measure of each transcript, through simple calculations derived from expression data. This report focuses on the performance and validation of the array when measured against standards recently published by the Microarray Quality Control (MAQC) Project. The array was shown to be highly quantitative, and displayed greater than 85% correlation with the HG-U133 Plus 2.0 array at the gene level while providing more extensive coverage of each gene. Almost 60% of splice events among genes demonstrating differential expression of greater than 3 fold also contained extensive splicing alterations. Importantly, almost 10% of splice events within the gene set displaying constant overall expression values had evidence of transcript diversity. Two examples illustrate the types of events identified: LIM domain 7 showed no differential expression at the gene level, but demonstrated deregulation of an exon skip event, while erythrocyte membrane protein band 4.1-like 3 was differentially expressed and also displayed deregulation of a skipped exon isoform. Conclusion Significant changes were detected independent of
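
    The "simple calculations derived from expression data" mentioned above typically compare junction-probe signal with overall gene-level signal between conditions. A hypothetical splicing-index calculation of this general type is sketched below; the function name and intensities are invented.

```python
import math

def splicing_index(junc_a, gene_a, junc_b, gene_b):
    """Log2 change of junction signal after normalizing out gene-level expression."""
    return math.log2((junc_b / gene_b) / (junc_a / gene_a))

# Hypothetical probe intensities for an exon-skip junction in two conditions.
si = splicing_index(junc_a=850.0, gene_a=1200.0, junc_b=310.0, gene_b=1150.0)
print(f"splicing index: {si:.2f}")  # negative: the junction is used less in B
```

    A strongly non-zero index with a flat gene-level ratio is exactly the "transcript diversity without differential expression" pattern reported for LIM domain 7.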

  11. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc's Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set to run against: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  12. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.

  13. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  14. Software Tools for the Analysis of Functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mehdi Behroozi

    2012-09-01

    Full Text Available Functional magnetic resonance imaging (fMRI has become the most popular method for imaging of brain functions. Currently, there is a large variety of software packages for the analysis of fMRI data, each providing many features for users. Since there is no single package that can provide all the necessary analyses for the fMRI data, it is helpful to know the features of each software package. In this paper, several software tools have been introduced and they have been evaluated for comparison of their functionality and their features. The description of each program has been discussed and summarized.

  16. Smart Roadside Initiative macro benefit analysis: user's guide for the benefit-cost analysis tool.

    Science.gov (United States)

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State ...

  17. A non-extensive statistical physics analysis of the Hellenic subduction zone seismicity

    Science.gov (United States)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

    The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes in the Hellenic subduction zone, has been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems a suitable framework for studying complex systems [Vallianatos, 2011]. Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of the seismicity in California and Japan, and recently Darooneh & Dadashinia (2008) in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q of the magnitude distribution of earthquakes in the southern California earthquake catalogue. Using the external seismic zones of 36 seismic sources of shallow earthquakes in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset concerning the seismicity of shallow earthquakes (focal depth ≤ 60 km) of the subduction zone, based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12800 seismic events which correspond to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field which characterizes a subducting regime. For each event, the moment magnitude was calculated from ML according to the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the interevent times and the interevent distances, as well as the magnitude distribution for each seismic zone, have been estimated, presenting a variation of the q-triplet along the Hellenic subduction zone. The models used fit rather well to the observed

  18. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities, as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli. On the other hand, we demonstrate the capability to improve TFBS prediction in microbes. Finally, we highlight, by visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.

  19. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained in array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array CGH data. CGHPRO is a stand-alone Java application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
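    At its core, the dosage-estimation step reduces to per-clone log2 test/reference ratios that are normalized and then segmented or thresholded. A minimal Python sketch of that core follows, with invented data and fixed thresholds (CGHPRO itself uses CBS and HMM segmentation rather than fixed cut-offs).

```python
import numpy as np

def log2_ratios(test, reference):
    """Per-clone log2(test/reference) ratios, median-centred; a minimal
    stand-in for the normalization step described above."""
    r = np.log2(np.asarray(test, dtype=float) /
                np.asarray(reference, dtype=float))
    return r - np.median(r)

def call_dosage(ratios, gain=0.3, loss=-0.3):
    """Naive fixed-threshold calls; CGHPRO uses CBS/HMM segmentation."""
    return np.where(ratios > gain, "gain",
                    np.where(ratios < loss, "loss", "normal"))

test = [1520.0, 980.0, 2100.0, 450.0]
ref = [1500.0, 1010.0, 1050.0, 990.0]
r = log2_ratios(test, ref)
print(list(zip(np.round(r, 2), call_dosage(r))))
```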

  20. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
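    The idea of a triggering rule over a filtered log stream can be sketched compactly. The rule below (pattern plus threshold plus time window) is a hypothetical Python example; the actual Message Analyzer expresses such rules in its own domain-specific language.

```python
import re
from collections import deque
from time import time

# Hypothetical correlation rule (pattern + threshold + time window);
# the real Message Analyzer expresses such rules in its own DSL.
RULE = {"pattern": re.compile(r"buffer overflow on (\w+)"),
        "threshold": 3, "window_s": 60.0, "action": "raise DAQ alarm"}
hits = deque()

def process(line, now=None):
    """Feed one log line; fire the rule's action if enough matching
    messages arrive within the sliding time window."""
    now = time() if now is None else now
    if RULE["pattern"].search(line):
        hits.append(now)
        while hits and now - hits[0] > RULE["window_s"]:
            hits.popleft()
        if len(hits) >= RULE["threshold"]:
            print("TRIGGER:", RULE["action"])

for t in (0.0, 10.0, 20.0):
    process("ERROR buffer overflow on dcm-07", now=t)
```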

  1. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence has been set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design phase of the product development process.
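    The output-to-input data links can be pictured as a small dependency graph evaluated in order. The Python sketch below is an illustrative reduction of that idea (module names and payloads are invented), not MSAT's implementation.

```python
# Illustrative data-link idea: each module consumes the results of the
# modules it depends on and publishes its own (names are invented).
class Module:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, inputs

    def run(self, results):
        args = [results[dep] for dep in self.inputs]
        results[self.name] = self.func(*args)

modules = [
    Module("aero", lambda: {"thrust_kN": 120.0}),
    Module("thermal", lambda aero: {"t_max_K": 900.0 + aero["thrust_kN"]},
           inputs=("aero",)),
]

results = {}
for m in modules:            # modules listed in dependency order
    m.run(results)
print(results["thermal"])    # {'t_max_K': 1020.0}
```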

  2. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of the investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for the energy planners considering their electric power options in the future.

  3. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  4. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and into performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
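    Tools like GRESS augment existing code so that derivatives are propagated alongside values. The same idea in miniature is forward-mode automatic differentiation with dual numbers, sketched below in Python on an invented toy model (GRESS itself performs source transformation of FORTRAN).

```python
from dataclasses import dataclass

# Forward-mode automatic differentiation with dual numbers: the idea
# behind derivative-augmenting compilers such as GRESS, in miniature.
# The model function below is an invented toy, not an ORNL code.
@dataclass
class Dual:
    val: float
    der: float = 0.0

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(float(o))
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(float(o))
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    return 3.0 * k * k + 2.0 * k    # f(k) = 3k^2 + 2k

out = model(Dual(1.5, 1.0))         # seed dk/dk = 1
print(out.val, out.der)             # 9.75 and df/dk = 6k + 2 = 11.0
```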

  5. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we are looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to unprecedented growth in the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  6. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parameterised reward annotations. SBOAT allows the optimisation of these processes by specifying optimisation goals by means of probabilistic computation tree logic (PCTL). Optimisation is performed by means of an evolutionary algorithm where stochastic model checking, in the form of the PRISM model checker, is used to compute the fitness, i.e. the performance of a candidate in terms of the specified goals, of variants of a process. Our evolutionary algorithm approach uses a matrix representation of process models to allow mutation and crossover of a process model to be performed efficiently, allowing broad

  7. metaSNV: A tool for metagenomic strain level analysis.

    Directory of Open Access Journals (Sweden)

    Paul Igor Costea

    Full Text Available We present metaSNV, a tool for single nucleotide variant (SNV) analysis in metagenomic samples, capable of comparing populations of thousands of bacterial and archaeal species. The tool takes as input nucleotide sequence alignments to reference genomes in standard SAM/BAM format, performs SNV calling for individual samples and across the whole data set, and generates various statistics for individual species, including allele frequencies and nucleotide diversity per sample as well as distances and fixation indices across samples. Using published data from 676 metagenomic samples of different sites in the oral cavity, we show that the results of metaSNV are comparable to those of MIDAS, an alternative implementation for metagenomic SNV analysis, while data processing is faster and has a smaller storage footprint. Moreover, we implement a set of distance measures that allow the comparison of genomic variation across metagenomic samples and delineate sample-specific variants to enable the tracking of specific strain populations over time. The implementation of metaSNV is available at: http://metasnv.embl.de/.
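    Per-sample statistics of the kind metaSNV reports reduce, at a single site, to allele frequencies and expected heterozygosity. A toy Python sketch with invented counts follows; it illustrates the statistics only and is not metaSNV's implementation.

```python
import numpy as np

# Toy per-site statistics of the kind metaSNV reports: allele frequencies
# and nucleotide diversity at one position. Counts are invented.
counts = {"A": 37, "C": 3, "G": 0, "T": 0}

def allele_freqs(c):
    total = sum(c.values())
    return {base: n / total for base, n in c.items() if n}

def site_diversity(c):
    """Expected heterozygosity at one site: 1 - sum(p_i^2)."""
    p = np.array(list(allele_freqs(c).values()))
    return float(1.0 - np.sum(p ** 2))

print(allele_freqs(counts))          # {'A': 0.925, 'C': 0.075}
print(round(site_diversity(counts), 4))
```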

  8. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  9. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The new BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  10. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for the identification of driver genes and regulatory modules is much needed. Here, we present a software platform that seamlessly integrates network visualization with omics data analysis tools. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis, including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and a subsequent overlaying function, and management of custom interaction networks. The utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE is a valuable addition to network analysis software, supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
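    Of the analysis algorithms listed, over-representation analysis has a particularly compact core: a hypergeometric tail test of the overlap between a network module and a gene set. A Python sketch with invented counts follows (MONGKIE's own implementation is in Java).

```python
from scipy.stats import hypergeom

# Minimal over-representation test of a network module against one
# pathway, the statistical core named above. Counts are invented.
background = 20_000   # genes in the background
pathway = 150         # genes annotated to the pathway
module = 80           # genes in the network module
overlap = 12          # module genes that are in the pathway

# P(overlap >= 12) under the hypergeometric null
p_value = hypergeom.sf(overlap - 1, background, pathway, module)
print(f"ORA p-value: {p_value:.3e}")
```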

  11. A digital repository with an extensible data model for biobanking and genomic analysis management

    Science.gov (United States)

    2014-01-01

    Motivation Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research evolves into international multi-disciplinary collaborations and increasing data sharing among institutions. A single standardization is not feasible, and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. Results We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data are described by a set of user-defined metadata and may have one or more associated files. We integrated the model into a web-based digital repository with data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples of over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Conclusions Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata, for information
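    The two-entity model described above (processes grouping events, each event carrying user-defined metadata and optional files) is easy to picture as a JSON document. The Python sketch below is illustrative; field names are assumptions, not the repository's actual schema.

```python
import json

# Sketch of the process/event model described above; all field names
# and values are illustrative assumptions, not the actual schema.
record = {
    "process": {
        "id": "study-0042",
        "type": "research_study",
        "events": [
            {"id": "evt-1", "type": "sample_collection",
             "metadata": {"tissue": "blood", "volume_ml": 5},
             "files": []},
            {"id": "evt-2", "type": "microarray_analysis",
             "metadata": {"platform": "user-defined"},
             "files": ["grid://replica-1/array_0042.cel"]},
        ],
    }
}
print(json.dumps(record, indent=2))
```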

  12. Genetic analysis of extensively drug-resistant Mycobacterium tuberculosis strains in Lisbon, Portugal.

    Science.gov (United States)

    Perdigão, João; Macedo, Rita; Malaquias, Ana; Ferreira, Ana; Brum, Laura; Portugal, Isabel

    2010-02-01

    Extensively drug-resistant (XDR) tuberculosis (TB) threatens TB control worldwide. Lisbon has a high XDR-TB rate [50% of multidrug-resistant tuberculosis (MDR-TB) cases], which is mainly associated with Lisboa family strains. Few studies have addressed the identification of mutations associated with resistance to second-line injectable drugs, and the relative frequency of such mutations varies geographically. The aim of this study was to characterize the genetic changes associated with the high number of XDR-TB cases in Lisbon. In the present study we analysed 26 XDR-TB clinical isolates. The gyrA, tlyA and rrs genes were screened for mutations that could be responsible for resistance to fluoroquinolones and second-line injectable drugs. Moreover, the strains under analysis were also genotyped by MIRU-VNTR ('mycobacterial interspersed repetitive unit-variable number of tandem repeats'). The mutational analysis identified the most frequent mutations in the resistance-associated genes: S91P in gyrA (42.3%); A1401G in rrs (30.8%); and Ins755GT in tlyA (42.3%). The occurrence of mutations in rrs was associated with the non-occurrence of mutations in tlyA. The genotypic analysis revealed that the strains were highly clonal, belonging to one of two MIRU-VNTR clusters, the largest belonging to the Lisboa family. An association between mutations in gyrA and rrs or tlyA was verified. The association of specific mutations highlights the strains' high clonality and indicates recent XDR-TB transmission. In addition, the identification of the most frequent resistance-associated mutations will be invaluable in applying XDR-TB molecular detection tests in the region in the near future.

  13. multiplierz: an extensible API based desktop environment for proteomics data analysis

    Directory of Open Access Journals (Sweden)

    Webber James T

    2009-10-01

    Full Text Available Abstract Background Efficient analysis of results from mass spectrometry-based proteomics experiments requires access to disparate data types, including native mass spectrometry files, output from algorithms that assign peptide sequence to MS/MS spectra, and annotation for proteins and pathways from various database sources. Moreover, proteomics technologies and experimental methods are not yet standardized; hence a high degree of flexibility is necessary for efficient support of high- and low-throughput data analytic tasks. Development of a desktop environment that is sufficiently robust for deployment in data analytic pipelines, and simultaneously supports customization for programmers and non-programmers alike, has proven to be a significant challenge. Results We describe multiplierz, a flexible and open-source desktop environment for comprehensive proteomics data analysis. We use this framework to expose a prototype version of our recently proposed common API (mzAPI) designed for direct access to proprietary mass spectrometry files. In addition to routine data analytic tasks, multiplierz supports generation of information-rich, portable spreadsheet-based reports. Moreover, multiplierz is designed around a "zero infrastructure" philosophy, meaning that it can be deployed by end users with little or no system administration support. Finally, access to multiplierz functionality is provided via high-level Python scripts, resulting in a fully extensible data analytic environment for rapid development of custom algorithms and deployment of high-throughput data pipelines. Conclusion Collectively, mzAPI and multiplierz facilitate a wide range of data analysis tasks, spanning technology development to biological annotation, for mass spectrometry-based proteomics research.
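    Of the features described, the portable spreadsheet-style report is easy to picture. The Python sketch below writes peptide-spectrum matches to a CSV file under invented column names; it illustrates the idea only and is not multiplierz's actual reporting API.

```python
import csv

# Generic sketch of a portable spreadsheet-style report: collect
# peptide-spectrum matches into one CSV file. Column names and values
# are invented; this is not multiplierz's actual reporting API.
psms = [
    {"spectrum": "scan_00123", "peptide": "LDSTSIPVAK", "score": 54.2},
    {"spectrum": "scan_00187", "peptide": "VATVSLPR", "score": 38.9},
]

with open("report.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["spectrum", "peptide", "score"])
    writer.writeheader()
    writer.writerows(psms)
```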

  14. Reusable, extensible, and modifiable R scripts and Kepler workflows for comprehensive single set ChIP-seq analysis.

    Science.gov (United States)

    Cormier, Nathan; Kolisnik, Tyler; Bieda, Mark

    2016-07-05

    There has been an enormous expansion in the use of chromatin immunoprecipitation followed by sequencing (ChIP-seq) technologies. Analysis of large-scale ChIP-seq datasets involves a complex series of steps and the production of several specialized graphical outputs. A number of systems have emphasized custom development of ChIP-seq pipelines. These systems are primarily based on custom programming of a single, complex pipeline or supply libraries of modules, and do not produce the full range of outputs commonly produced for ChIP-seq datasets. It is desirable to have more comprehensive pipelines, in particular ones addressing common metadata tasks, such as pathway analysis, and pipelines producing standard complex graphical outputs. It is advantageous if these are highly modular systems, available as both turnkey pipelines and individual modules, that are easily comprehensible, modifiable and extensible, to allow rapid alteration in response to new analysis developments in this growing area. Furthermore, it is advantageous if these pipelines allow data provenance tracking. We present a set of 20 ChIP-seq analysis software modules implemented in the Kepler workflow system; most (18/20) were also implemented as standalone, fully functional R scripts. The set consists of four full turnkey pipelines and 16 component modules. The turnkey pipelines in Kepler allow data provenance tracking. Implementation emphasized the use of common R packages and widely used external tools (e.g., MACS for peak finding), along with custom programming. This software presents comprehensive solutions and easily repurposed code blocks for ChIP-seq analysis and pipeline creation. Tasks include mapping raw reads, peak finding via MACS, summary statistics, peak location statistics, summary plots centered on the transcription start site (TSS), gene ontology, pathway analysis, and de novo motif finding, among others. These pipelines range from those performing a single task to those performing full analyses of
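    One of the tasks listed above, the TSS-centred summary plot, has a simple numerical core: average coverage in a fixed window around each TSS. A Python sketch on synthetic arrays follows (the published modules do this in R, at genome scale).

```python
import numpy as np

# Toy TSS-centred summary profile: average read coverage in a +/-2 kb
# window around each TSS. Arrays are synthetic stand-ins for real data.
rng = np.random.default_rng(2)
coverage = rng.poisson(3.0, 1_000_000)            # per-base read depth
tss = np.array([10_500, 250_000, 612_300])        # TSS coordinates
half = 2_000

profiles = np.stack([coverage[t - half:t + half] for t in tss])
mean_profile = profiles.mean(axis=0)              # the plotted curve
print(mean_profile[half - 3:half + 3])            # values around the TSS
```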

  15. An Analysis of the North Carolina Cooperative Extension Service's Role in Bridging the Digital Divide

    Science.gov (United States)

    Alston, Antoine J.; Hilton, Lashawn; English, Chastity Warren; Elbert, Chanda; Wakefield, Dexter

    2011-01-01

    The study reported here sought to determine the perception of North Carolina County Cooperative Extension directors in regard to the North Carolina Cooperative Extension Service's role in bridging the digital divide. It was perceived by respondents that variables such as income, education, gender, disability status, race/ethnicity, age, and…

  16. Analysis of effects of extension teaching methods on farmers' level of ...

    African Journals Online (AJOL)

    This study analyzed the effects of extension teaching methods used by Ogun State (Nigeria) Agricultural Development Programme's extension agents on farmers' level of production in maize and cassava. The sample included 210 randomly selected farmers, comprising adopters and non-adopters of introduced agricultural ...

  17. Virtual Focus Groups in Extension: A Useful Approach to Audience Analysis

    Science.gov (United States)

    Warner, Laura A.

    2014-01-01

    As change agents, Extension educators may begin their program planning by identifying the audience's perceived barriers and benefits to adopting some behavior that will benefit the community. Extension professionals and researchers have used in-person focus groups to understand an audience, and they can also administer them as…

  18. Networked Learning for Agricultural Extension: A Framework for Analysis and Two Cases

    Science.gov (United States)

    Kelly, Nick; Bennett, John McLean; Starasts, Ann

    2017-01-01

    Purpose: This paper presents economic and pedagogical motivations for adopting information and communications technology (ICT)- mediated learning networks in agricultural education and extension. It proposes a framework for networked learning in agricultural extension and contributes a theoretical and case-based rationale for adopting the…

  19. Analysis of the role and level of job performance among extension ...

    African Journals Online (AJOL)

    The study analysed the role performance and job satisfaction of extension agents in technology delivery in Imo State. The multistage random sampling technique was adopted in the selection of farmers and simple random sampling for the selection of extension agents. The instruments for data collection were four sets of ...

  20. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. Filtering gains value when previous data are merged with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers, regardless of programming experience, to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  1. New Power Quality Analysis Method Based on Chaos Synchronization and Extension Neural Network

    Directory of Open Access Journals (Sweden)

    Meng-Hui Wang

    2014-10-01

    Full Text Available A hybrid method comprising a chaos synchronization (CS)-based detection scheme and an Extension Neural Network (ENN) classification algorithm is proposed for power quality monitoring and analysis. The new method can detect minor changes in power system signals. Likewise, prominent characteristics of system signal disturbances can be extracted by this technique. In the proposed approach, the CS-based detection method is used to extract three fundamental characteristics of the power system signal, and an ENN-based clustering scheme is then applied to detect the state of the signal, i.e., normal, voltage sag, voltage swell, interruption or harmonics. The validity of the proposed method is demonstrated by means of simulations using three different chaotic systems, namely Lorenz, New Lorenz and Sprott. The simulation results show that the proposed method achieves a high detection accuracy irrespective of the chaotic system used or the presence of noise. The proposed method not only achieves higher detection accuracy than existing methods, but also has low computational cost, improved robustness toward noise, and improved scalability. As a result, it provides an ideal solution for the future development of hand-held power quality analyzers and real-time detection devices.

  2. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Bekavac, Ivan; Garbin Praničević, Daniela

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  3. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem, including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types, including heliocentric orbiters, planetary orbiters, and surface operations. Its parametric design, along with user-programmable features, can reduce or even eliminate the need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users: power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
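    The essence of such a simulation is a power budget integrated into battery state of charge. The Python sketch below is a deliberately crude reduction of that idea; every parameter and the day/night model are illustrative assumptions, not MMPAT's models.

```python
# Minimal power-budget sketch in the spirit of the description above:
# solar-array input versus load draw, integrated into battery state of
# charge. All parameters are illustrative assumptions.
def simulate(hours, array_w, load_w, batt_wh, soc0=0.8, charge_eff=0.95):
    soc = soc0 * batt_wh
    history = []
    for h in range(hours):
        sun_w = array_w if 6 <= h % 24 < 18 else 0.0   # crude day/night
        net_wh = (sun_w - load_w) * 1.0                # one-hour step
        soc += net_wh * charge_eff if net_wh > 0 else net_wh
        soc = min(batt_wh, max(0.0, soc))
        history.append(soc / batt_wh)
    return history

soc = simulate(48, array_w=450.0, load_w=300.0, batt_wh=2000.0)
print([f"{s:.2f}" for s in soc[:8]])
```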

  4. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
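    A concordancer's core is the keyword-in-context (KWIC) view. A minimal Python sketch of that idea on a toy string follows; TranslatorBank's actual query system is search-engine-like and far richer.

```python
import re

def kwic(corpus, term, width=30):
    """Toy keyword-in-context (KWIC) concordance over a plain string."""
    for m in re.finditer(re.escape(term), corpus, re.IGNORECASE):
        left = corpus[max(0, m.start() - width):m.start()].rjust(width)
        right = corpus[m.end():m.end() + width].ljust(width)
        print(f"{left} [{m.group(0)}] {right}")

kwic("The turbine housing connects to the turbine inlet, and the "
     "turbine blades are inspected for wear.", "turbine")
```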

  5. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.

  6. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is captured by the sequence diagram. To address this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. Due to its specific structure, the UML sequence diagram is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to lay out UML sequence diagrams automatically and analyses them according to criteria required for diagram perception.

  7. Sensitivity analysis of an information fusion tool: OWA operator

    Science.gov (United States)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
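    The OWA operator itself is compact: inputs are sorted in descending order and the order weights apply to ranks, not to particular criteria. A Python sketch with illustrative weights summing to one:

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: sort the inputs in descending order,
    then weight by rank, so weights attach to positions, not criteria."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0) and v.shape == w.shape
    return float(v @ w)

# Weights skewed toward the top ranks model an optimistic decision maker.
print(owa([0.6, 0.9, 0.3], [0.5, 0.3, 0.2]))   # 0.9*0.5 + 0.6*0.3 + 0.3*0.2
```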

  8. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, as is very common in industrial environments. The PFA Toolbox can be used to address those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) it provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise; (b) it provides tools to easily plot the results as interval estimates or flux distributions; (c) it is composed of simple functions that MATLAB users can apply in flexible ways; (d) it includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty; and (e) it can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox that is able to perform Interval and Possibilistic MFA estimations.

  9. ASOURCE: Source Term Analysis Tool for Advanced Fuel Cycle

    International Nuclear Information System (INIS)

    Cho, Dong Keun; Kook, Dong Hak; Choi, Jong Won; Choi, Heui Joo; Jeong, Jong Tae

    2012-01-01

    In 2007, the 3rd Comprehensive Nuclear Energy Promotion Plan, passed at the 254th meeting of the Atomic Energy Commission, was announced as an R and D action plan for the development of an advanced fuel cycle adopting a sodium-cooled fast reactor (SFR) in connection with a pyroprocess, for a sustainable, stable energy supply and a reduction in the amount of spent fuel (SF). It is expected that this fuel cycle can greatly reduce the SF inventory through a recycling process in which transuranics (TRU) and long-lived nuclides are burned in the SFR and cesium and strontium are disposed of after sufficient interim storage. For the success of the R and D plan, there are several issues related to source term analysis. These concern the following: (a) generation of inflow and outflow source terms of mixed SF in each process for the design of the pyroprocess facility, (b) source terms of mixed radwaste in a canister for the design of storage and disposal systems, (c) overall inventory estimation of TRU and long-lived nuclides for the design of the SFR, and (d) best-estimate source terms for the practical design of the interim storage facility for SFs. A source term evaluation for a SF or radwaste with a single irradiation profile can be easily accomplished with a conventional computation tool. However, source term assessment for a batch of SFs or a mixture of radwastes generated from SFs with different irradiation profiles, a task that is essential to support the aforementioned activities, is not possible with the conventional tool. Therefore, a hybrid computing program for source term analysis to support the advanced fuel cycle was developed.

  10. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C for the IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2 MB of RAM, a Microsoft-compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K

  11. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and on the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also found to be invaluable at the design stage to influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermalhydraulic analysis and field data, fouling margins are calculated. Individual effects of primary- and secondary-side fouling are separated through analyses, which allows station operators to decide what type of maintenance activity to perform and when to perform the maintenance activity. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance

  12. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, stress analysis is carried out for a HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  13. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section has a quantitative focus, shifting from theory to an empirical approach, and presents output data from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes to highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  14. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behaviour of 9528 online gamblers who voluntarily used an RG tool was analysed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had high initial usage and low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes.

  15. The Comparative Analysis of a Novel Acetabular Component against Hemispherical Component in Case of Extensive Acetabular Bone Defects — A Study of Finite Element Analysis

    Directory of Open Access Journals (Sweden)

    Wenhui Ma

    2013-02-01

    Full Text Available The purpose of this study was to evaluate the design of a novel acetabular cup using the finite element method, to analyze the possible effects of postoperative joint loading, and to assess its initial mechanical stability, so as to direct its further optimization. Finite element (FE) models of the cup with three wings and of a hemispherical cup were created to calculate the stress patterns during a normal gait cycle. The stresses in the acetabular components were analyzed and compared. The FE analysis demonstrated that all of the acetabular components showed the same trend for stress and strain. The stress in the wings increased gradually from rim to root. The peak stress at the joint between the wing and the shell was significantly lower than the yield strength of the Co-Cr-Mo alloy. The graft portion near the acetabular component was subjected to higher stress conditions. The contact stresses were found to decrease with a reduced abduction angle of the wings. The cup with wings at an abduction angle of 15° had lower stresses compared with the other cups. The cup with wings is a reliable option for the reconstruction of the acetabulum with extensive acetabular bone defects. A reduced abduction angle of the wings helps to decrease the stress in the cup with wings. FE analysis is a useful tool with which to address these issues.

  16. Interaction tools for underwater shock analysis in naval platform design

    NARCIS (Netherlands)

    Aanhold, J.E.; Tuitman, J.T.; Trouwborst, W.; Vaders, J.A.A.

    2016-01-01

    In order to satisfy the need for good-quality UNDerwater EXplosion (UNDEX) response estimates of naval platforms, TNO developed two 3D simulation tools: the Simplified Interaction Tool (SIT) and the hydro/structural code 3DCAV. Both tools are add-ons to LS-DYNA. SIT is a module of user routines

  17. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models...

  18. Clinicopathologic Analysis of Microscopic Extension in Lung Adenocarcinoma: Defining Clinical Target Volume for Radiotherapy

    International Nuclear Information System (INIS)

    Grills, Inga S.; Fitch, Dwight L.; Goldstein, Neal S.; Yan Di; Chmielewski, Gary W.; Welsh, Robert J.; Kestin, Larry L.

    2007-01-01

    Purpose: To determine the gross tumor volume (GTV) to clinical target volume margin for non-small-cell lung cancer treatment planning. Methods: A total of 35 patients with Stage T1N0 adenocarcinoma underwent wedge resection plus immediate lobectomy. The gross tumor size and microscopic extension distance beyond the gross tumor were measured. The nuclear grade and percentage of bronchoalveolar features were analyzed for association with microscopic extension. The gross tumor dimensions were measured on a computed tomography (CT) scan (lung and mediastinal windows) and compared with the pathologic dimensions. The potential coverage of microscopic extension for two different lung stereotactic radiotherapy regimens was evaluated. Results: The mean microscopic extension distance beyond the gross tumor was 7.2 mm and varied according to grade (10.1, 7.0, and 3.5 mm for Grade 1 to 3, respectively, p < 0.01). The 90th percentile for microscopic extension was 12.0 mm (13.0, 9.7, and 4.4 mm for Grade 1 to 3, respectively). The CT lung windows correlated better with the pathologic size than did the mediastinal windows (gross pathologic size overestimated by a mean of 5.8 mm; composite size [gross plus microscopic extension] underestimated by a mean of 1.2 mm). For a GTV contoured on the CT lung windows, the margin required to cover microscopic extension for 90% of the cases would be 9 mm (9, 7, and 4 mm for Grade 1 to 3, respectively). The potential microscopic extension dosimetric coverage (55 Gy) varied substantially between the stereotactic radiotherapy schedules. Conclusion: For lung adenocarcinomas, the GTV should be contoured using CT lung windows. Although a GTV based on the CT lung windows would underestimate the gross tumor size plus microscopic extension by only 1.2 mm for the average case, the clinical target volume expansion required to cover the microscopic extension in 90% of cases could be as large as 9 mm, although considerably smaller for high-grade tumors

  19. Discourse Analysis: A Tool for Helping Educators to Teach Science

    Directory of Open Access Journals (Sweden)

    Katerina Plakitsi

    2016-11-01

    Full Text Available This article refers to a part of a collaborative action research project in three elementary science classrooms. The project aims at the transformation of the nature and type of teachers' discursive practices into more collaborative inquiries. The basic strategy is to give the teachers the opportunity to analyze their discourse using a three-dimensional context of analysis. The teachers analyzed their discursive repertoires when teaching science. They studied the companion meaning, i.e., the different layers of explicit and tacit messages they communicate about the Nature of Science (NoS), Nature of Teaching (NoT), and Nature of Language (NoL). The question investigated is the following: Could an action research program, which involves teachers in the analysis of their own discursive practices, lead to the transformation of discourse modes that take place in the science classrooms to better communicate aspects of NoS, NoT and NoL in a collaborative, inquiry-based context? Results indicate that the teachers' involvement in their discourse analysis led to a transformation in the discursive repertoires in their science classrooms. Gradually, the teachers' companion meanings that were created, implicitly/explicitly, from the dialogues taking place during science lessons were more appropriate for the establishment of a productive collaborative inquiry learning context. We argue that discourse analysis could be used for research purposes, as a training medium or as a reflective tool on how teachers communicate science. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs170168

  20. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo

    2017-06-01

    Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of object motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined as either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
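
    The record above combines three ingredients: density modelling of angle/speed features, IB clustering, and an entropy-based abnormality measure. The sketch below illustrates only the first and last of these under simplifying assumptions (hand-made trajectories, a pooled Gaussian KDE, mean negative log-density as the rarity score); the IB clustering step is omitted.

        import numpy as np
        from scipy.stats import gaussian_kde

        def features(traj):
            # Per-step heading angle and speed from an (n, 2) array of positions.
            d = np.diff(traj, axis=0)
            return np.vstack([np.arctan2(d[:, 1], d[:, 0]), np.hypot(d[:, 0], d[:, 1])])

        rng = np.random.default_rng(0)
        # "Normal" trajectories drift steadily to the right; the outlier wanders.
        normal = [np.cumsum(rng.normal([1.0, 0.1], 0.05, (50, 2)), axis=0) for _ in range(20)]
        outlier = np.cumsum(rng.normal(0.0, 1.0, (50, 2)), axis=0)

        # Fit one KDE over the pooled angle/speed features of the normal set.
        kde = gaussian_kde(np.hstack([features(t) for t in normal]))

        def anomaly_score(traj):
            # Mean negative log-density: high values flag infrequently observed motion.
            return -np.mean(np.log(kde(features(traj)) + 1e-12))

        print(anomaly_score(normal[0]), anomaly_score(outlier))  # outlier scores higher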

  1. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods.

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E; Re, Matteo

    2014-06-01

    In the context of "network medicine", gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different "informativeness" embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further enhance disease gene ranking results, by adopting both
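
    A minimal sketch of the classical random-walk-with-restart baseline the study evaluates (the kernelized score functions and the network integration step are not shown); the 5-gene adjacency matrix and seed genes are invented for illustration.

        import numpy as np

        def random_walk_with_restart(W, seeds, restart=0.3, tol=1e-8):
            # Steady-state visiting probabilities of a walk that restarts at the
            # known disease genes; higher probability = stronger candidate.
            P = W / W.sum(axis=0, keepdims=True)   # column-normalized transitions
            p0 = np.zeros(W.shape[0])
            p0[seeds] = 1.0 / len(seeds)
            p = p0.copy()
            while True:
                p_next = (1 - restart) * P @ p + restart * p0
                if np.abs(p_next - p).sum() < tol:
                    return p_next
                p = p_next

        # Toy 5-gene functional network; genes 0 and 1 are known disease genes.
        W = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 0, 0],
                      [1, 1, 0, 1, 0],
                      [0, 0, 1, 0, 1],
                      [0, 0, 0, 1, 0]], dtype=float)
        scores = random_walk_with_restart(W, seeds=[0, 1])
        print(np.argsort(-scores))  # unannotated genes ranked by score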

  2. Kinematic Analysis of a 3-dof Parallel Machine Tool with Large Workspace

    Directory of Open Access Journals (Sweden)

    Shi Yan

    2016-01-01

    Full Text Available Kinematics of a 3-dof (degree of freedom) parallel machine tool with large workspace was analyzed. The workspace volume and surface and the boundary posture angles of the 3-dof parallel machine tool are relatively large. Firstly, a three-dimensional simulation manipulator of the 3-dof parallel machine tool was constructed, and its joint distribution was described. Secondly, kinematic models of the 3-dof parallel machine tool were established, including displacement analysis, velocity analysis, and acceleration analysis. Finally, the kinematic models of the machine tool were verified by a numerical example. The results are of significance for the practical application of the parallel machine tool.
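
    For the velocity analysis mentioned above, the generic relation for a parallel mechanism is J(q) q̇ = ẋ; the snippet below solves the inverse velocity problem for an invented 3x3 Jacobian and is not derived from the machine tool studied in the paper.

        import numpy as np

        # Illustrative Jacobian relating actuated joint rates q_dot to the
        # platform velocity x_dot; the entries are placeholders.
        J = np.array([[1.0, 0.2, 0.0],
                      [0.1, 1.0, 0.3],
                      [0.0, 0.2, 1.0]])

        x_dot = np.array([0.05, 0.00, 0.10])   # desired platform velocity
        q_dot = np.linalg.solve(J, x_dot)      # inverse velocity analysis
        print(q_dot)

        # Singularity check: the mechanism degenerates where det(J) -> 0.
        print(np.linalg.det(J))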

  3. Mass balance re-analysis of Findelengletscher, Switzerland; benefits of extensive snow accumulation measurements

    Directory of Open Access Journals (Sweden)

    Leo eSold

    2016-02-01

    Full Text Available A re-analysis is presented here of a 10-year mass balance series at Findelengletscher, a temperate mountain glacier in Switzerland. Calculating glacier-wide mass balance from the set of glaciological point balance observations using conventional approaches, such as the profile or contour method, resulted in significant deviations from the reference value given by the geodetic mass change over a five-year period. This is attributed to the sparsity of observations at high elevations and to the inability of the evaluation schemes to adequately estimate accumulation in unmeasured areas. However, measurements of winter mass balance were available for large parts of the study period from snow probings and density pits. Complementary surveys by helicopter-borne ground-penetrating radar (GPR) were conducted in three consecutive years. The complete set of seasonal observations was assimilated using a distributed mass balance model. This model-based extrapolation revealed a substantial mass loss at Findelengletscher of -0.43 m w.e. a⁻¹ between 2004 and 2014, while the loss was less pronounced for its former tributary, Adlergletscher (-0.30 m w.e. a⁻¹). For both glaciers, the resulting time series were within the uncertainty bounds of the geodetic mass change. We show that the model benefited strongly from the ability to integrate seasonal observations. If no winter mass balance measurements were available and snow cover was represented by a linear precipitation gradient, the geodetic mass balance was not matched. If winter balance measurements by snow probings and snow density pits were taken into account, the model performance was substantially improved but still showed a significant bias relative to the geodetic mass change. Thus the excellent agreement of the model-based extrapolation with the geodetic mass change was due to an adequate representation of winter accumulation distribution by means of extensive GPR measurements.
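
    As a point of reference for the conventional evaluation schemes the re-analysis improves upon, the following sketch applies a profile-method-style extrapolation: fit a linear balance gradient to point observations and integrate it over the glacier's area-elevation distribution. Elevations, balances, and hypsometry are invented numbers.

        import numpy as np

        z_obs = np.array([2700, 2900, 3100, 3300])       # stake elevations (m)
        b_obs = np.array([-2.1, -1.2, -0.4, 0.3])        # point balances (m w.e.)

        grad, b0 = np.polyfit(z_obs, b_obs, 1)           # linear balance gradient

        z_bands = np.arange(2600, 3900, 100)             # elevation bands (m)
        area = np.exp(-((z_bands - 3200) / 300.0) ** 2)  # toy hypsometry (relative area)

        # Extrapolate band balances and take the area-weighted mean; the bias the
        # paper documents arises when the fit is unconstrained at high elevations.
        b_bands = b0 + grad * z_bands
        glacier_wide = np.sum(b_bands * area) / np.sum(area)
        print(f"glacier-wide balance: {glacier_wide:.2f} m w.e.")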

  4. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA
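
    The AIDA interfaces are defined in Java and C++; the fragment below merely restates the design idea in Python abstract base classes to show how reduced coupling works in practice. Component names follow the abstract; the method signatures are assumptions, not the actual AIDA API.

        from abc import ABC, abstractmethod

        class IHistogram(ABC):
            @abstractmethod
            def fill(self, value: float, weight: float = 1.0) -> None: ...
            @abstractmethod
            def bin_height(self, index: int) -> float: ...

        class IFitter(ABC):
            @abstractmethod
            def fit(self, histogram: IHistogram, function_name: str) -> dict: ...

        class IPlotter(ABC):
            @abstractmethod
            def plot(self, histogram: IHistogram) -> None: ...

        # A concrete histogram may come from any implementation (Anaphe/Lizard,
        # OpenScientist, Java Analysis Studio, ...) and still work with any
        # fitter or plotter, since callers depend only on the abstract interface.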

  5. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Pakarinen Jyri

    2010-01-01

    Full Text Available Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
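
    The toolkit itself is written in Matlab; as a language-neutral illustration of the kind of measurement it automates, the following Python sketch drives a toy memoryless nonlinearity with a sine and estimates total harmonic distortion from single FFT bins (an approximation, given window leakage).

        import numpy as np

        fs, f0, n = 48000, 1000.0, 48000
        t = np.arange(n) / fs
        x = 0.8 * np.sin(2 * np.pi * f0 * t)

        y = np.tanh(3.0 * x)   # toy memoryless distortion, a stand-in for a tube stage

        spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
        fund = spectrum[int(round(f0 * n / fs))]
        harmonics = [spectrum[int(round(k * f0 * n / fs))] for k in range(2, 6)]

        thd = np.sqrt(np.sum(np.square(harmonics))) / fund
        print(f"THD (2nd-5th harmonics): {100 * thd:.1f}%")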

  6. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Science.gov (United States)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  7. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost/risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the trade space of variables for pre-defined science, cost and risk goals, and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, first with inputs from the Knowledge Base, then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. TAT-C's current version includes uniform Walker constellations as well as ad-hoc constellations, and its cost model represents an aggregate model consisting of
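
    A toy rendering of the trade-space search idea, assuming invented coverage and cost models: enumerate Walker-style designs and keep the Pareto-efficient set. The real TAT-C iterator works with GMAT-derived coverage and far richer cost and risk modules.

        from itertools import product

        def coverage(n_sats, n_planes, altitude_km):
            # Placeholder coverage model, not TAT-C's.
            return min(1.0, 0.02 * n_sats * (altitude_km / 600.0))

        def cost(n_sats, n_planes, altitude_km):
            # Placeholder aggregate cost model, not TAT-C's.
            return 10.0 * n_sats + 5.0 * n_planes + 0.01 * altitude_km

        designs = [
            dict(n_sats=s, n_planes=p, altitude_km=a,
                 coverage=coverage(s, p, a), cost=cost(s, p, a))
            for s, p, a in product((4, 8, 12, 24), (1, 2, 3, 4), (500, 600, 700))
            if s % p == 0  # Walker pattern: equal satellites per plane
        ]

        # Keep designs not dominated in both cost and coverage by another design.
        pareto = [d for d in designs
                  if not any(o["cost"] <= d["cost"] and o["coverage"] >= d["coverage"]
                             and o != d for o in designs)]
        print(len(designs), "designs,", len(pareto), "on the Pareto front")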

  8. Actigraphy and motion analysis: new tools for psychiatry.

    Science.gov (United States)

    Teicher, M H

    1995-01-01

    Altered locomotor activity is a cardinal sign of several psychiatric disorders. With advances in technology, activity can now be measured precisely. Contemporary studies quantifying activity in psychiatric patients are reviewed. Studies were located by a Medline search (1965 to present; English language only) cross-referencing motor activity and major psychiatric disorders. The review focused on mood disorders and attention-deficit hyperactivity disorder (ADHD). Activity levels are elevated in mania, agitated depression, and ADHD and attenuated in bipolar depression and seasonal depression. The percentage of low-level daytime activity is directly related to severity of depression, and change in this parameter accurately mirrors recovery. Demanding cognitive tasks elicit fidgeting in children with ADHD, and precise measures of activity and attention may provide a sensitive and specific marker for this disorder. Circadian rhythm analysis enhances the sophistication of activity measures. Affective disorders in children and adolescents are characterized by an attenuated circadian rhythm and an enhanced 12-hour harmonic rhythm (diurnal variation). Circadian analysis may help to distinguish between the activity patterns of mania (dysregulated) and ADHD (intact or enhanced). Persistence of hyperactivity or circadian dysregulation in bipolar patients treated with lithium appears to predict rapid relapse once medication is discontinued. Activity monitoring is a valuable research tool, with the potential to aid clinicians in diagnosis and in prediction of treatment response.
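
    The circadian analysis described is commonly performed with cosinor-style regression on 24-hour and 12-hour harmonics; the sketch below fits such a model to simulated actigraphy and reports the two amplitudes (the 12-h/24-h ratio indexing diurnal variation). It is a generic illustration, not the authors' protocol.

        import numpy as np

        hours = np.arange(0, 24 * 7, 0.25)   # one week of 15-minute epochs
        rng = np.random.default_rng(1)
        activity = (50 + 30 * np.cos(2 * np.pi * (hours - 14) / 24)
                       + 10 * np.cos(2 * np.pi * hours / 12)
                       + rng.normal(0, 5, hours.size))   # simulated, not patient data

        # Least-squares fit of mean level plus 24-h and 12-h harmonics.
        X = np.column_stack([
            np.ones_like(hours),
            np.cos(2 * np.pi * hours / 24), np.sin(2 * np.pi * hours / 24),
            np.cos(2 * np.pi * hours / 12), np.sin(2 * np.pi * hours / 12),
        ])
        beta, *_ = np.linalg.lstsq(X, activity, rcond=None)
        amp24 = np.hypot(beta[1], beta[2])
        amp12 = np.hypot(beta[3], beta[4])
        print(f"24-h amplitude {amp24:.1f}, 12-h amplitude {amp12:.1f}")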

  9. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence (“BI”) is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  10. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year, the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies; the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences); the mark obtained in the entrance examination to the university; and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students were evaluated independently of the former group. These students were those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the tools of spatial analysis of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by their average geometric point in order to be correlated to their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  11. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which the PET is an estimate. The FAT-PET MRA is a meta regression analysis, on the data of the funnel, which jointly estimates the FAT and the PET. Ideal funnels are lean and symmetric. Empirical funnels are wide, and most have asymmetries biasing the plain average. Many asymmetries are due to censoring made during the research-publication process. The PET is tooled to correct...
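
    A minimal sketch of a FAT-PET regression as it is usually operationalized: regress the reported estimates on their standard errors with precision weights, so that the intercept estimates the underlying effect (PET) and the slope tests funnel asymmetry (FAT). The simulated censoring step is an invented stand-in for publication selection.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        se = rng.uniform(0.05, 0.5, 400)
        est = 0.2 + rng.normal(0, se)                        # true effect is 0.2
        keep = (est / se > 0.5) | (rng.random(400) < 0.5)    # crude censoring rule
        est, se = est[keep], se[keep]

        # FAT-PET MRA: estimate_i = b0 + b1 * SE_i, weighted by precision.
        X = sm.add_constant(se)
        fit = sm.WLS(est, X, weights=1.0 / se**2).fit()
        print(fit.params)    # [PET estimate, FAT coefficient]
        print(fit.pvalues)   # FAT p-value tests funnel asymmetry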

  12. Extending the XNAT archive tool for image and analysis management in ophthalmology research

    Science.gov (United States)

    Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.

    2013-03-01

    In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM isn't widely used yet, though, and frequently images are encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
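
    The upload-script idea can be sketched with plain HTTP against XNAT's REST interface; everything below (host, project, subject, session, and resource names, and the exact path layout) is an illustrative placeholder following the hierarchy described in the paper, and the real endpoints should be taken from the XNAT REST documentation.

        import requests

        # Placeholder host and names; the path mirrors XNAT's
        # project/subject/experiment/resource hierarchy.
        BASE = "https://xnat.example.edu"

        with requests.Session() as s:
            s.auth = ("username", "password")
            url = (BASE + "/data/projects/OCT_STUDY/subjects/SUBJ01"
                          "/experiments/SUBJ01_OCT1/resources/SEG/files/layers.csv")
            # Attach a derived result (e.g., a segmentation) to an existing session.
            with open("layers.csv", "rb") as fh:
                resp = s.put(url, data=fh)
            resp.raise_for_status()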

  13. Road safety risk evaluation and target setting using data envelopment analysis and its extensions.

    Science.gov (United States)

    Shen, Yongjun; Hermans, Elke; Brijs, Tom; Wets, Geert; Vanhoof, Koen

    2012-09-01

    Currently, comparison between countries in terms of their road safety performance is widely conducted in order to better understand one's own safety situation and to learn from those best-performing countries by indicating practical targets and formulating action programmes. In this respect, crash data such as the number of road fatalities and casualties are mostly investigated. However, the absolute numbers are not directly comparable between countries. Therefore, the concept of risk, which is defined as the ratio of road safety outcomes and some measure of exposure (e.g., the population size, the number of registered vehicles, or distance travelled), is often used in the context of benchmarking. Nevertheless, these risk indicators are not consistent in most cases. In other words, countries may have different evaluation results or ranking positions using different exposure information. In this study, data envelopment analysis (DEA) as a performance measurement technique is investigated to provide an overall perspective on a country's road safety situation, and further assess whether the road safety outcomes registered in a country correspond to the numbers that can be expected based on the level of exposure. In doing so, three model extensions are considered, which are the DEA based road safety model (DEA-RS), the cross-efficiency method, and the categorical DEA model. Using the measures of exposure to risk as the model's input and the number of road fatalities as output, an overall road safety efficiency score is computed for the 27 European Union (EU) countries based on the DEA-RS model, and the ranking of countries in accordance with their cross-efficiency scores is evaluated. Furthermore, after applying clustering analysis to group countries with inherent similarity in their practices, the categorical DEA-RS model is adopted to identify best-performing and underperforming countries in each cluster, as well as the reference sets or benchmarks for those
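
    For readers unfamiliar with DEA, the sketch below solves the textbook input-oriented CCR model with a linear program, using exposure measures as inputs and inverted fatality counts as a crude desirable output; the DEA-RS, cross-efficiency, and categorical variants in the paper refine this basic formulation. All figures are invented.

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_efficiency(X, Y, j0):
            # Input-oriented CCR efficiency of unit j0; decision variables are
            # [theta, lambda_1..lambda_n], minimizing theta.
            m, n = X.shape
            s = Y.shape[0]
            c = np.r_[1.0, np.zeros(n)]
            A_in = np.hstack([-X[:, [j0]], X])          # sum lam*x - theta*x0 <= 0
            A_out = np.hstack([np.zeros((s, 1)), -Y])   # -sum lam*y <= -y0
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                          bounds=[(0, None)] * (n + 1))
            return res.fun

        X = np.array([[10.0, 8.0, 12.0, 9.0],    # population (millions)
                      [5.0, 4.5, 7.0, 5.5]])     # vehicle-km (arbitrary units)
        Y = np.array([[1 / 120.0, 1 / 80.0, 1 / 150.0, 1 / 60.0]])  # 1/fatalities
        for j in range(4):
            print(f"country {j}: efficiency {dea_ccr_efficiency(X, Y, j):.3f}")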

  14. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    OpenAIRE

    Shang-Liang Chen; Yin-Ting Cheng; Chin-Fa Su

    2015-01-01

    Recently, intelligent systems have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the movement of the spindle unit is the most frequent and important action, notably in the automatic tool changer. The vibration detection system includes the development of hardware and software, such as ...

  15. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    OpenAIRE

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue has been the use of qualitative, non-diagnost...

  16. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as material science, biology, geochemistry and so on. Given the advantages of NAA, however, samples available only in small amounts, or precious samples, are the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, whereas carbonaceous materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by the NAA method. Elemental concentrations in air change by season. For example, crustal elements increase in the dry season, and sea salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during the winter season, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of some environmental samples, and source apportionment techniques are useful. (author)
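
    One common concentration-ratio technique of the kind mentioned at the end of the abstract is the crustal enrichment factor, EF = (X/Al)_aerosol / (X/Al)_crust, where values well above unity suggest an anthropogenic source. The numbers below are illustrative only.

        # Illustrative crustal abundances (ppm) and PM concentrations (ng/m3).
        crust = {"Al": 8.2e4, "Fe": 5.6e4, "Pb": 13.0}
        sample = {"Al": 1200.0, "Fe": 900.0, "Pb": 45.0}

        for elem in ("Fe", "Pb"):
            ef = (sample[elem] / sample["Al"]) / (crust[elem] / crust["Al"])
            flag = "likely anthropogenic" if ef > 10 else "likely crustal"
            print(f"{elem}: EF = {ef:.1f} ({flag})")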

  17. Clinicopathologic analysis of extracapsular extension in prostate cancer: Should the clinical target volume be expanded posterolaterally to account for microscopic extension?

    International Nuclear Information System (INIS)

    Chao, K. Kenneth; Goldstein, Neal S.; Yan Di; Vargas, Carlos E.; Ghilezan, Michel I.; Korman, Howard J.; Kernen, Kenneth M.; Hollander, Jay B.; Gonzalez, Jose A.; Martinez, Alvaro A.; Vicini, Frank A.; Kestin, Larry L.

    2006-01-01

    Purpose: We performed a complete pathologic analysis examining extracapsular extension (ECE) and microscopic spread of malignant cells beyond the prostate capsule to determine whether and when clinical target volume (CTV) expansion should be performed. Methods and Materials: A detailed pathologic analysis was performed for 371 prostatectomy specimens. All slides from each case were reviewed by a single pathologist (N.S.G.). The ECE status and ECE distance, defined as the maximal linear radial distance of malignant cells beyond the capsule, were recorded. Results: A total of 121 patients (33%) were found to have ECE (68 unilateral, 53 bilateral). Median ECE distance = 2.4 mm [range: 0.05-7.0 mm]. The 90th-percentile distance = 5.0 mm. Of the 121 cases with ECE, 55% had ECE distance ≥2 mm, 19% ≥4 mm, and 6% ≥6 mm. ECE occurred primarily posterolaterally along the neurovascular bundle in all cases. Pretreatment prostate-specific antigen (PSA), biopsy Gleason, pathologic Gleason, clinical stage, bilateral involvement, positive margins, percentage of gland involved, and maximal tumor dimension were associated with presence of ECE. Both PSA and Gleason score were associated with ECE distance. In all 371 patients, for those with either pretreatment PSA ≥10 or biopsy Gleason score ≥7, 21% had ECE ≥2 mm and 5% ≥4 mm beyond the capsule. For patients with both of these risk factors, 49% had ECE ≥2 mm and 21% ≥4 mm. Conclusions: For prostate cancer with ECE, the median linear distance of ECE was 2.4 mm and occurred primarily posterolaterally. Although only 5% of patients demonstrate ECE >4 to 5 mm beyond the capsule, this risk may exceed 20% in patients with PSA ≥10 ng/ml and biopsy Gleason score ≥7. As imaging techniques improve for prostate capsule delineation and as radiotherapy delivery techniques increase in accuracy, a posterolateral CTV expansion should be considered for patients at high risk

  18. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that

  19. Multivariate analysis of ultrasound-recorded dorsal strain sequences: Investigation of dynamic neck extensions in women with chronic whiplash associated disorders

    Science.gov (United States)

    Peolsson, Anneli; Peterson, Gunnel; Trygg, Johan; Nilsson, David

    2016-08-01

    Whiplash Associated Disorders (WAD) refers to the multifaceted and chronic burden that is common after a whiplash injury. Tools to assist in the diagnosis of WAD and an increased understanding of neck muscle behaviour are needed. We examined the multilayer dorsal neck muscle behaviour in nine women with chronic WAD versus healthy controls during the entire sequence of a dynamic low-loaded neck extension exercise, which was recorded using real-time ultrasound movies with high frame rates. Principal component analysis and orthogonal partial least squares were used to analyse mechanical muscle strain (deformation in elongation and shortening). The WAD group showed more shortening during the neck extension phase in the trapezius muscle and during both the neck extension and the return to neutral phase in the multifidus muscle. For the first time, a novel non-invasive method is presented that is capable of detecting altered dorsal muscle strain in women with WAD during an entire exercise sequence. This method may be a breakthrough for the future diagnosis and treatment of WAD.
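
    The PCA step of such an analysis can be sketched in a few lines: treat each subject's strain sequence as one observation vector and look for group separation in the leading components (the OPLS step is omitted). Data below are simulated, not the study's ultrasound recordings.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        t = np.linspace(0, 1, 200)
        # Simulated strain curves: the "WAD" group deforms with smaller amplitude.
        controls = np.array([np.sin(np.pi * t) * rng.uniform(0.8, 1.2)
                             + rng.normal(0, 0.05, t.size) for _ in range(9)])
        wad = np.array([np.sin(np.pi * t) * rng.uniform(0.3, 0.6)
                        + rng.normal(0, 0.05, t.size) for _ in range(9)])

        X = np.vstack([controls, wad])
        scores = PCA(n_components=2).fit_transform(X)
        print(scores[:9].mean(axis=0), scores[9:].mean(axis=0))  # separation on PC1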

  20. Analysis of the influence of tool dynamics in diamond turning

    Energy Technology Data Exchange (ETDEWEB)

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  1. Suspended Cell Culture ANalysis (SCAN) Tool to Enhance ISS On-Orbit Capabilities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences and partner, Draper Laboratory, propose to develop an on-orbit immuno-based label-free Suspension Cell Culture ANalysis tool, SCAN tool, which...

  2. Transient Side Load Analysis of Out-of-Round Film-Cooled Nozzle Extensions

    Science.gov (United States)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2012-01-01

    There was interest in understanding the impact of out-of-round nozzle extension on the nozzle side load during transient startup operations. The out-of-round nozzle extension could be the result of asymmetric internal stresses, deformation induced by previous tests, and asymmetric loads induced by hardware attached to the nozzle. The objective of this study was therefore to computationally investigate the effect of out-of-round nozzle extension on the nozzle side loads during an engine startup transient. The rocket engine studied encompasses a regeneratively cooled chamber and nozzle, along with a film cooled nozzle extension. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and transient inlet boundary flow properties derived from an engine system simulation. Six three-dimensional cases were performed with the out-of-roundness achieved by three different degrees of ovalization, elongated on lateral y and z axes: one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The results show that the separation line jump was the primary source of the peak side loads. Compared with the peak side load of the perfectly round nozzle, the peak side loads increased for the slightly and more ovalized nozzle extensions, and either increased or decreased for the two significantly ovalized nozzle extensions. A theory based on the counteraction of the flow destabilizing effect of an exacerbated asymmetrical flow caused by a lower degree of ovalization, and the flow stabilizing effect of a more symmetrical flow, created also by ovalization, is presented to explain the observations obtained in this effort.

  3. Bioelectrical impedance analysis: A new tool for assessing fish condition

    Science.gov (United States)

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
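
    A calibration of the kind discussed reduces to regressing laboratory-measured composition on BIA-derived predictors and checking the fit against the benchmark quoted above (R² near 0.8 with n = 60 and a wide fat range). The predictors and data below are simulated placeholders.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        n = 60
        resistance = rng.uniform(200, 800, n)   # BIA resistance (ohm)
        reactance = rng.uniform(20, 120, n)     # BIA reactance (ohm)
        length_cm = rng.uniform(15, 45, n)      # fish length

        # Simulated "lab" percent dry fat with a linear dependence plus noise.
        fat_pct = (40 - 0.03 * resistance + 0.05 * reactance
                   + 0.2 * length_cm + rng.normal(0, 2, n))

        X = np.column_stack([resistance, reactance, length_cm])
        model = LinearRegression().fit(X, fat_pct)
        print(f"R^2 = {model.score(X, fat_pct):.2f}")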

  4. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    Science.gov (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or supposed to be performed with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  5. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or supposed to be performed with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  6. VisIt: Interactive Parallel Visualization and Graphical Analysis Tool

    Science.gov (United States)

    Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI)

    2011-03-01

    VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. See the table below for more details about the tool’s features. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

  7. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
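
    Brent's method, mentioned among the improvements, is the standard tool for refining contact-window boundaries once a coarse scan has bracketed a sign change of the elevation-minus-mask-angle function. The sketch below uses a toy sinusoidal elevation profile in place of a real orbit propagator.

        import numpy as np
        from scipy.optimize import brentq

        def elevation_deg(t_min):
            # Toy elevation profile minus a 25-degree mask angle (95-minute "orbit").
            return 40.0 * np.sin(2 * np.pi * t_min / 95.0) - 25.0

        grid = np.arange(0.0, 200.0, 1.0)  # coarse 1-minute scan brackets sign changes
        events = [brentq(elevation_deg, a, b)
                  for a, b in zip(grid[:-1], grid[1:])
                  if elevation_deg(a) * elevation_deg(b) < 0]
        print([round(t, 2) for t in events])  # alternating rise/set times (minutes)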

  8. A Current Review of the Meniscus Imaging: Proposition of a Useful Tool for Its Radiologic Analysis

    Directory of Open Access Journals (Sweden)

    Nicolas Lefevre

    2016-01-01

    Full Text Available The main objective of this review was to present a synthesis of the current literature in order to provide clinicians with a useful tool for radiologic analysis of the meniscus. All anatomical descriptions were clearly illustrated by MRI, arthroscopy, and/or drawings. The value of standard radiography is extremely limited for the assessment of meniscal injuries but may be indicated to obtain a differential diagnosis such as osteoarthritis. Ultrasound is rarely used as a diagnostic tool for meniscal pathologies and its accuracy is operator-dependent. CT arthrography with multiplanar reconstructions can detect meniscus tears that are not visible on MRI. This technique is also useful in case of MRI contraindications, in postoperative assessment of meniscal sutures and the condition of cartilage covering the articular surfaces. MRI is the most accurate and least invasive method for diagnosing meniscal lesions. MRI makes it possible to confirm and characterize the meniscal lesion (type, extension, association with a cyst, meniscal extrusion) and to assess cartilage and subchondral bone. New 3D MRI with isotropic resolution allows the creation of multiplanar reformatted images, so that reconstructions in other spatial planes can be obtained from an acquisition in a single sectional plane. 3D MRI should further improve the diagnosis of meniscal tears.

  9. A Current Review of the Meniscus Imaging: Proposition of a Useful Tool for Its Radiologic Analysis

    Science.gov (United States)

    Lefevre, Nicolas; Naouri, Jean Francois; Herman, Serge; Gerometta, Antoine; Klouche, Shahnaz; Bohu, Yoann

    2016-01-01

    The main objective of this review was to present a synthesis of the current literature in order to provide clinicians with a useful tool for radiologic analysis of the meniscus. All anatomical descriptions were clearly illustrated by MRI, arthroscopy, and/or drawings. The value of standard radiography is extremely limited for the assessment of meniscal injuries but may be indicated to obtain a differential diagnosis such as osteoarthritis. Ultrasound is rarely used as a diagnostic tool for meniscal pathologies and its accuracy is operator-dependent. CT arthrography with multiplanar reconstructions can detect meniscus tears that are not visible on MRI. This technique is also useful in case of MRI contraindications, in postoperative assessment of meniscal sutures and the condition of cartilage covering the articular surfaces. MRI is the most accurate and least invasive method for diagnosing meniscal lesions. MRI makes it possible to confirm and characterize the meniscal lesion (type, extension, association with a cyst, meniscal extrusion) and to assess cartilage and subchondral bone. New 3D MRI with isotropic resolution allows the creation of multiplanar reformatted images, so that reconstructions in other spatial planes can be obtained from an acquisition in a single sectional plane. 3D MRI should further improve the diagnosis of meniscal tears. PMID:27057352

  10. Tools4miRs - one place to gather all the tools for miRNA analysis.

    Science.gov (United States)

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-09-01

    MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  11. Tools4miRs – one place to gather all the tools for miRNA analysis

    Science.gov (United States)

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  12. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct their actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  13. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    Science.gov (United States)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data, with the aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width half-max, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels, uncertainties, and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user
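
    A minimal Voigt-profile fit of the kind Spectacle performs can be written with astropy's modeling framework (least-squares only; the MCMC path, LSF kernels, and multi-component blending are omitted, and the spectrum is simulated):

        import numpy as np
        from astropy.modeling import models, fitting

        rng = np.random.default_rng(5)
        wave = np.linspace(1215, 1218, 400)   # wavelength grid (Angstrom)
        # Simulated normalized spectrum: flat continuum minus one Voigt absorber.
        true = models.Const1D(1.0) - models.Voigt1D(x_0=1216.5, amplitude_L=0.8,
                                                    fwhm_L=0.15, fwhm_G=0.25)
        flux = true(wave) + rng.normal(0, 0.02, wave.size)

        # Fit with the continuum held fixed and the Voigt parameters free.
        init = models.Const1D(1.0, fixed={"amplitude": True}) - \
               models.Voigt1D(x_0=1216.4, amplitude_L=0.5, fwhm_L=0.1, fwhm_G=0.2)
        fit = fitting.LevMarLSQFitter()(init, wave, flux)
        print(fit)

        # Non-parametric equivalent width over the fitted profile.
        ew = np.trapz(1.0 - fit(wave), wave)
        print(f"equivalent width: {ew:.3f} Angstrom")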

  14. The applications of neural networks in the core location analysis of extensive air showers

    International Nuclear Information System (INIS)

    Perrett, J.C.; Stekelenborg, J.T.P.M. van

    1991-01-01

    We describe the training and implementation of a neural network to estimate the core position and energy of extensive air showers recorded by the South Pole air shower experiment, SPASE. The neural network was found to be approximately 300 times faster than the traditional χ² minimization technique previously used, and equally accurate. (Author)
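
    A toy version of the regression task, assuming an invented 5x5 detector grid and lateral signal falloff; the SPASE network's actual inputs, architecture, and training set differ.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(6)
        detectors = np.array([(x, y) for x in range(0, 500, 100)
                                      for y in range(0, 500, 100)], dtype=float)

        def simulate(n):
            # Random core positions and a toy lateral distribution of signals.
            cores = rng.uniform(100, 400, (n, 2))
            d = np.linalg.norm(detectors[None, :, :] - cores[:, None, :], axis=2)
            signals = 1e3 / (1 + d) ** 2 * rng.lognormal(0, 0.1, d.shape)
            return signals, cores

        X_train, y_train = simulate(5000)
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500,
                           random_state=0).fit(X_train, y_train)

        X_test, y_test = simulate(200)
        err = np.linalg.norm(net.predict(X_test) - y_test, axis=1)
        print(f"median core error: {np.median(err):.1f} m")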

  15. Recommendations arising from an analysis of changes to the Australian agricultural research, development and extension system

    NARCIS (Netherlands)

    Hunt, Warren; Birch, Colin; Vanclay, Frank; Coutts, Jeff

    The business of agricultural research, development and extension (RD&E) has undergone considerable change in Australia since the late 1980s, moving from a domain largely dominated by government departments to a situation of multiple actors, and where rural industries now directly contribute funds

  16. Farmers' Participation in Extension Programs and Technology Adoption in Rural Nepal: A Logistic Regression Analysis

    Science.gov (United States)

    Suvedi, Murari; Ghimire, Raju; Kaplowitz, Michael

    2017-01-01

    Purpose: This paper examines the factors affecting farmers' participation in extension programs and adoption of improved seed varieties in the hills of rural Nepal. Methodology/approach: Cross-sectional farm-level data were collected during July and August 2014. A sample of 198 farm households was selected for interviewing by using a multistage,…
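
    The paper's core model is a binary logit of adoption on household covariates, including extension participation. A sketch with simulated placeholder variables:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 198
        extension = rng.integers(0, 2, n)    # participated in extension program
        education = rng.integers(0, 12, n)   # years of schooling
        farm_ha = rng.uniform(0.2, 2.0, n)   # farm size (hectares)

        # Simulated adoption outcome with a positive extension effect built in.
        logit_p = -2.0 + 1.2 * extension + 0.1 * education + 0.5 * farm_ha
        adopt = rng.random(n) < 1 / (1 + np.exp(-logit_p))

        X = sm.add_constant(np.column_stack([extension, education, farm_ha]))
        result = sm.Logit(adopt.astype(float), X).fit(disp=False)
        print(result.params)          # log-odds effects
        print(np.exp(result.params))  # odds ratios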

  17. Identifying gaps in the practices of rural health extension workers in Ethiopia : A task analysis study

    NARCIS (Netherlands)

    Desta, Firew Ayalew; Shifa, Girma Temam; Dagoye, Damtew WoldeMariam; Carr, Catherine; Van Roosmalen, Jos; Stekelenburg, Jelle; Nedi, Assefa Bulcha; Kols, Adrienne; Kim, Young Mi

    2017-01-01

    Background: Health extension workers (HEWs) are the frontline health workers for Ethiopia's primary health care system. The Federal Ministry of Health is seeking to upgrade and increase the number of HEWs, particularly in remote areas, and address concerns about HEWs' pre-service education and

  18. Using R-Project for Free Statistical Analysis in Extension Research

    Science.gov (United States)

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  19. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 11: PEP, cost, schedules, and work breakdown structure dictionary

    Science.gov (United States)

    1979-01-01

    Cost, scheduling, and funding data are presented for the reference design of the power extension package. Major schedule milestones are correlated with current Spacelab flight dates. Funding distributions provide for minimum expenditure during the first year of the project.

  20. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.

    2005-01-01

    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection, and for creating high local field gradients. Microfluidic structures are added to control flow, and positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology

  1. An Analysis of Teacher Selection Tools in Pennsylvania

    Science.gov (United States)

    Vitale, Tracy L.

    2009-01-01

    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  2. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  3. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James E. [Krell Institute, Ames, IA (United States); Miller, Barton P. [Univ. of Wisconsin, Madison, WI (United States). Computer Sciences Dept.; Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States). Computer Sciences Dept.; Roth, Philip [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Future Technologies Group, Computer Science and Math Division; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing (CASC)

    2013-12-19

    In this project we created a community tool infrastructure for program development tools targeting Petascale class machines and beyond. This includes tools for performance analysis, debugging, and correctness, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure, followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of the community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhanced the capabilities and reusability of project partner components, as specified in the original project proposal. The overall project team's work over the project funding cycle focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contributed to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  4. Automated patterning and probing with multiple nanoscale tools for single-cell analysis.

    Science.gov (United States)

    Li, Jiayao; Kim, Yeonuk; Liu, Boyin; Qin, Ruwen; Li, Jian; Fu, Jing

    2017-10-01

    The nano-manipulation approach that combines Focused Ion Beam (FIB) milling and various imaging and probing techniques enables researchers to investigate cellular structures in three dimensions. Such a fusion approach, however, requires extensive effort on locating and examining randomly distributed targets, due to the limited Field of View (FOV) when high magnification is desired. In the present study, we present a development that automates 'pattern and probe' particularly for single-cell analysis, achieved by computer-aided tools including feature recognition and geometric planning algorithms. Scheduling of serial FOVs for imaging and probing of multiple cells was treated as a rectangle covering problem, and optimal or near-optimal solutions were obtained with the heuristics developed. FIB milling was then employed automatically, followed by downstream analysis using Atomic Force Microscopy (AFM) to probe the cellular interior. Our strategy was applied to examine bacterial cells (Klebsiella pneumoniae) and achieved high efficiency with limited human interference. The developed algorithms can be easily adapted and integrated with different imaging platforms towards high-throughput imaging analysis of single cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
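    A toy version of the FOV-scheduling idea described above, framed as a rectangle covering problem: greedily place fixed-size fields of view so that each placement covers as many uncovered cell locations as possible. The coordinates, FOV size, and greedy heuristic are invented for the sketch, not the paper's algorithm.

```python
# Greedy rectangle-cover sketch: each FOV is a square window placed to
# cover as many still-uncovered cells as possible. All values invented.
import numpy as np

rng = np.random.default_rng(5)
cells = rng.uniform(0, 100, size=(40, 2))   # random cell positions, um
fov = 15.0                                  # square FOV edge length, um

uncovered = set(range(len(cells)))
placements = []
while uncovered:
    best_corner, best_hits = None, set()
    for i in uncovered:
        # candidate FOV with its lower-left corner at cell i
        x0, y0 = cells[i]
        hits = {j for j in uncovered
                if x0 <= cells[j, 0] <= x0 + fov
                and y0 <= cells[j, 1] <= y0 + fov}
        if len(hits) > len(best_hits):
            best_corner, best_hits = (x0, y0), hits
    placements.append(best_corner)
    uncovered -= best_hits

print(f"{len(placements)} FOVs cover all {len(cells)} cells")
```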

  5. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    Science.gov (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and basic statistical analysis requirements). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
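    To make "basic genetic analysis" concrete, here is a minimal sketch of one such calculation, a Hardy-Weinberg equilibrium chi-square test, written in Python rather than the add-in's VBA; the genotype counts are invented.

```python
# Hardy-Weinberg equilibrium test from observed genotype counts
# (AA, Aa, aa). Counts below are invented for illustration.
from scipy.stats import chi2

def hwe_chi_square(n_aa, n_ab, n_bb):
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)          # allele frequency of A
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_aa, n_ab, n_bb]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # 1 df: 3 genotype classes - 1 - 1 estimated allele frequency
    return stat, chi2.sf(stat, df=1)

print(hwe_chi_square(100, 80, 20))
```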

  6. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  7. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  8. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
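    The calibration step described here, minimizing squared differences between simulated and measured flows, can be sketched generically as a least-squares fit. The two-layer "forward model" below is a toy stand-in for MODFLOW/PEST, and all depths, flows, and conductivities are invented.

```python
# Toy parameter estimation: fit two layer conductivities so simulated
# cumulative wellbore flow matches (invented) measured flow.
import numpy as np
from scipy.optimize import least_squares

depths = np.linspace(0, 50, 11)
measured_flow = np.array([0, 2, 5, 9, 14, 20, 27, 35, 44, 54, 65.0])

def simulate_flow(k):
    # cumulative inflow proportional to each layer's conductivity
    upper = np.clip(depths, 0, 25) * k[0]
    lower = np.clip(depths - 25, 0, 25) * k[1]
    return upper + lower

def residuals(k):
    return simulate_flow(k) - measured_flow

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=(0, np.inf))
print("estimated layer conductivities:", fit.x)
```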

  9. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint

    Directory of Open Access Journals (Sweden)

    Hong-gen Du

    2016-05-01

    Full Text Available This study investigates the effect of a new Chinese massage technique named “press-extension” on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1–S1 lumbar 3D FE model, which integrated the anatomical structure based on spine CT and MRI data. Graphic techniques were then utilized to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. According to the actual press-extension experiments, mechanical parameters were collected to set the boundary conditions for the FE analysis. The results demonstrated that the press-extension technique produces a marked induction effect on the annuli fibrosi, pushing the central nucleus pulposus forward and increasing the pressure in the anterior part. The study concludes that finite element modelling of the lumbar spine is suitable for analysing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the disease mechanism of intervertebral disc herniation treated by the press-extension technique.

  10. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    Science.gov (United States)

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  11. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    International Nuclear Information System (INIS)

    Ourghanlian, Alain

    2015-01-01

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools.

  12. An Empirical Analysis and Extension of the Bartlett and Ghoshal Typology of Multinational Companies

    OpenAIRE

    Anne-Wil Harzing

    2000-01-01

    This study offers an empirical test and extension of the Bartlett and Ghoshal typology of multinational companies (MNCs). A three-fold typology of multinational companies (Global, Multidomestic and Transnational) is developed from the literature. This typology is tested using data from 166 subsidiaries of 37 MNCs, headquartered in 9 different countries. Subsidiaries in the three types of MNCs are shown to differ significantly in aspects of interdependence and local responsiveness. © 2000 JIBS.

  13. Assessing the Possibility of Implementing Tools of Technical Analysis for Real Estate Market Analysis

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2016-06-01

    Full Text Available Technical analysis (TA and its different aspects are widely used to study the capital market. In the traditional approach, this analysis is used to determine the probability of changes in current rates on the basis of their past changes, accounting for factors which had, have or may have an influence on shaping the supply and demand of a given asset. In the practical sense, TA is a set of techniques used for assessing the value of an asset based on the analysis of the asset's trajectories as well as statistical tools.
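    As a toy instance of the trajectory-based techniques that TA comprises, the sketch below computes a simple moving average of an invented price series; it illustrates the general idea only, not any method proposed in the paper.

```python
# Simple moving average of an asset's (invented) price trajectory,
# one of the most basic technical-analysis smoothing tools.
import numpy as np

prices = np.array([100, 102, 101, 105, 107, 106, 110, 112, 111, 115.0])
window = 3
sma = np.convolve(prices, np.ones(window) / window, mode="valid")
print("3-period moving average:", np.round(sma, 2))
```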

  14. Neutrons and magnetic structures: analysis methods and tools

    Science.gov (United States)

    Damay, Françoise

    2015-12-01

    After a short introduction on neutron diffraction and magnetic structures, this review focuses on the new computing tools available in magnetic crystallography nowadays. The appropriate neutron techniques and different steps required to determine a magnetic structure are also introduced.

  15. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  16. Scaffolding Assignments: Analysis of Assignmentor as a Tool to Support First Year Students' Academic Writing Skills

    Science.gov (United States)

    Silva, Pedro

    2017-01-01

    There are several technological tools which aim to support first year students' challenges, especially when it comes to academic writing. This paper analyses one of these tools, Wiley's AssignMentor. The Technological Pedagogical Content Knowledge framework was used to systematise this analysis. The paper showed an alignment between the tools'…

  17. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    Science.gov (United States)

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  18. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    Science.gov (United States)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently providers don't have ready access to or knowledge of reliable, trusted data sets, nor sound, scientifically accepted analysis techniques such that they can provide a rapid response to queries they receive. In order to support National Weather Service (NWS) local office forecasters with information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular, through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center
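    One of the analysis techniques listed, trend analysis, can be illustrated with a minimal least-squares fit to a time series; the station data below are synthetic stand-ins, not LCAT output or homogenized COOP records.

```python
# Least-squares trend fit to synthetic annual mean temperatures.
import numpy as np

years = np.arange(1980, 2011)
rng = np.random.default_rng(1)
temps = 12.0 + 0.03 * (years - years[0]) + rng.normal(0, 0.4, years.size)

slope, intercept = np.polyfit(years, temps, 1)
print(f"trend: {slope * 10:.2f} degC per decade")
```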

  19. Modal Analysis and Experimental Determination of Optimum Tool Shank Overhang of a Lathe Machine

    Directory of Open Access Journals (Sweden)

    Nabin SARDAR

    2008-12-01

    Full Text Available Vibration of the tool shank of a cutting tool has a large influence on the tolerances and surface finish of products. The frequency and amplitude of vibrations depend on the overhang of the shank of the cutting tool. In turning operations, when the tool overhang is about 2 times the tool height, the amplitude of the vibration is almost zero and the dimensional tolerances and surface finish of the product become high. In this paper, the above statement is verified, firstly by using a finite element analysis of the cutting tool with the ANSYS software package and, secondly, by experimental verification with a piezoelectric sensor.

  20. INCREASE OF THE LEVEL OF INTENSIVE AND EXTENSIVE USE OF FIXED ASSETS AND ANALYSIS OF ITS INFLUENCE ON OUTPUTS

    Directory of Open Access Journals (Sweden)

    R. T. Ismailov

    2014-01-01

    Full Text Available A set of organizational and technical measures is considered whose implementation raises the level of intensive and extensive use of the active part of fixed assets in the production process and thereby increases the volume of production. Methods for estimating the efficiency of the measures are offered; their use improves the adequacy of the analysis results and makes it possible to identify the most effective measures, i.e., those providing growth in production volume at minimum expense.

  1. A pigment analysis tool for hyperspectral images of cultural heritage artifacts

    Science.gov (United States)

    Bai, Di; Messinger, David W.; Howell, David

    2017-05-01

    The Gough Map, in the collection at the Bodleian Library, Oxford University, is one of the earliest surviving maps of Britain. Previous research concluded that it was likely created over the course of the 15th century and was extensively revised more than once afterwards. In 2015, the Gough Map was imaged using a hyperspectral imaging system at the Bodleian Library. The collection of the hyperspectral image (HSI) data was aimed at enhancing faded text for reading and at pigment analysis of the material diversity of its composition and, potentially, the timeline of its creation. In this research, we introduce several methods to analyze the green pigments in the Gough Map, especially the number and spatial distribution of distinct green pigments. One approach, called the Gram Matrix, has been used to estimate the material diversity in a scene (i.e., endmember selection and dimensionality estimation). Here, we use the Gram Matrix technique to study the within-material differences of pigments in the Gough Map that share a common visual color. We develop a pigment analysis tool that extracts visually common pixels, green pigments in this case, from the Gough Map and estimates its material diversity. It reveals that the Gough Map consists of at least six dominant green pigments. Both historical geographers and cartographic historians will benefit from this work to analyze the pigment diversity of cultural heritage artifacts using HSI.
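    A rough sketch of the Gram Matrix idea invoked above: build the Gram matrix of visually similar pixel spectra and read material diversity off its significant eigenvalues. The spectra, noise level, and eigenvalue threshold below are synthetic assumptions, not the paper's data or exact procedure.

```python
# Estimate the number of distinct spectral components among visually
# similar pixels from the eigenvalues of their Gram matrix.
import numpy as np

rng = np.random.default_rng(2)
bands, n_pixels = 120, 500
# two distinct "green" endmember spectra plus noise (all hypothetical)
e1, e2 = rng.random(bands), rng.random(bands)
abund = rng.random(n_pixels)
pixels = np.outer(abund, e1) + np.outer(1 - abund, e2)
pixels += rng.normal(0, 0.01, pixels.shape)

gram = pixels.T @ pixels / n_pixels          # band-by-band Gram matrix
eigvals = np.linalg.eigvalsh(gram)[::-1]     # eigenvalues, descending
n_materials = int((eigvals > 0.01 * eigvals[0]).sum())
print("estimated number of distinct spectral components:", n_materials)
```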

  2. Expressed sequence tags as a tool for phylogenetic analysis of placental mammal evolution.

    Directory of Open Access Journals (Sweden)

    Morgan Kullberg

    Full Text Available BACKGROUND: We investigate the usefulness of expressed sequence tags (ESTs) for establishing divergences within the tree of placental mammals. This is done on the example of the established relationships among primates (human), lagomorphs (rabbit), rodents (rat and mouse), artiodactyls (cow), carnivorans (dog) and proboscideans (elephant). METHODOLOGY/PRINCIPAL FINDINGS: We have produced 2000 ESTs (1.2 megabases) from a marsupial mouse and characterized the data for their use in phylogenetic analysis. The sequences were used to identify putative orthologous sequences from whole genome projects. Although most ESTs stem from single sequence reads, the frequency of potential sequencing errors was found to be lower than allelic variation. Most of the sequences represented slowly evolving housekeeping-type genes, with an average amino acid distance of 6.6% between human and mouse. Positive Darwinian selection was identified at only a few single sites. Phylogenetic analyses of the EST data yielded trees that were consistent with those established from whole genome projects. CONCLUSIONS: The general quality of EST sequences and the general absence of positive selection in these sequences make ESTs an attractive tool for phylogenetic analysis. The EST approach allows, at reasonable costs, a fast extension of data sampling from species outside the genome projects.

  3. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools).

    Science.gov (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2016-06-01

    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.
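    A highly simplified sketch of the suitability-mapping logic described above, outside ArcGIS: normalize environmental layers and combine them with weights into a 0-1 score. All layers, weights, and cutoffs here are invented assumptions, not the study's fitted values.

```python
# Combine normalized (synthetic) environmental rasters into a toy
# dengue-suitability score per grid cell.
import numpy as np

rng = np.random.default_rng(3)
shape = (100, 100)                           # toy raster extent
temperature = rng.uniform(10, 35, shape)     # deg C
rainfall = rng.uniform(0, 3000, shape)       # mm/yr
elevation = rng.uniform(0, 3000, shape)      # m

def scale(layer, lo, hi):
    return np.clip((layer - lo) / (hi - lo), 0, 1)

suitability = (
    0.4 * scale(temperature, 18, 32)
    + 0.4 * scale(rainfall, 500, 2000)
    + 0.2 * (1 - scale(elevation, 0, 2000))  # lower elevation: more suitable
)
print("cells above 0.7 suitability:", int((suitability > 0.7).sum()))
```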

  4. Electromyographic Analysis of the Hip Extension Pattern in Visually Impaired Athletes.

    Science.gov (United States)

    Halski, Tomasz; Żmijewski, Piotr; Cięszczyk, Paweł; Nowak, Barbara; Ptaszkowski, Kuba; Slupska, Lucyna; Dymarek, Robert; Taradaj, Jakub

    2015-11-22

    The objective of the study was to determine the order of muscle recruitment during the active hip joint extension in particular positions in young visually impaired athletes. The average recruitment time (ART) of the gluteus maximus (GM) and the hamstring muscle group (HMG) was assessed by the means of surface electromyography (sEMG). The sequence of muscle recruitment in the female and male group was also taken into consideration. This study followed a prospective, cross - sectional, randomised design, where 76 visually impaired athletes between the age of 18-25 years were enrolled into the research and selected on chosen inclusion and exclusion criteria. Finally, 64 young subjects (32 men and 32 women) were included in the study (age: 21.1 ± 1.05 years; body mass: 68.4 ± 12.4 kg; body height: 1.74 ± 0.09 m; BMI: 22.20 ± 2.25 kg/m2). All subjects were analysed for the ART of the GM and HMG during the active hip extension performed in two different positions, as well as resting and functional sEMG activity of each muscle. Between gender differences were comprised and the correlations between the ART of the GM and HMG with their functional sEMG activity during hip extension in both positions were shown. No significant differences between the ART of the GM and HMG were found (p>0.05). Furthermore, there was no significant difference of ART among both tested positions, as well in male as female subjects (p>0.05).

  5. Electromyographic Analysis of the Hip Extension Pattern in Visually Impaired Athletes

    Directory of Open Access Journals (Sweden)

    Halski Tomasz

    2015-12-01

    Full Text Available The objective of the study was to determine the order of muscle recruitment during the active hip joint extension in particular positions in young visually impaired athletes. The average recruitment time (ART of the gluteus maximus (GM and the hamstring muscle group (HMG was assessed by the means of surface electromyography (sEMG. The sequence of muscle recruitment in the female and male group was also taken into consideration. This study followed a prospective, cross – sectional, randomised design, where 76 visually impaired athletes between the age of 18–25 years were enrolled into the research and selected on chosen inclusion and exclusion criteria. Finally, 64 young subjects (32 men and 32 women were included in the study (age: 21.1 ± 1.05 years; body mass: 68.4 ± 12.4 kg; body height: 1.74 ± 0.09 m; BMI: 22.20 ± 2.25 kg/m2. All subjects were analysed for the ART of the GM and HMG during the active hip extension performed in two different positions, as well as resting and functional sEMG activity of each muscle. Between gender differences were comprised and the correlations between the ART of the GM and HMG with their functional sEMG activity during hip extension in both positions were shown. No significant differences between the ART of the GM and HMG were found (p>0.05. Furthermore, there was no significant difference of ART among both tested positions, as well in male as female subjects (p>0.05.

  6. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets of the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which are then mapped in the Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
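    The conversion step mentioned above, mapping latitude/longitude into the Mercator projection, follows the standard spherical-Mercator formulas; the sketch below is a generic illustration (with Mars's mean radius plugged in), not the MARTIAN source code.

```python
# Forward spherical-Mercator mapping: x = R*lambda,
# y = R*ln(tan(pi/4 + phi/2)), with angles in radians.
import math

def to_mercator(lat_deg, lon_deg, radius=3389.5):  # Mars mean radius, km
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = radius * lam
    y = radius * math.log(math.tan(math.pi / 4 + phi / 2))
    return x, y

print(to_mercator(45.0, 90.0))
```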

  7. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    Science.gov (United States)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
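    As a minimal example of the photometric comparison such tutorials teach, the magnitude difference between two stars follows directly from their flux ratio; the counts below are made up.

```python
# Standard magnitude relation: m1 - m2 = -2.5 * log10(F1 / F2).
import math

flux_star, flux_reference = 15400.0, 30800.0  # background-subtracted counts
delta_m = -2.5 * math.log10(flux_star / flux_reference)
print(f"star is {delta_m:+.2f} mag relative to the reference")  # fainter
```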

  8. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Recently, the intelligent systems of technology have become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the mobility act of the spindle unit determines the most frequent and important part such as automatic tool changer. The vibration detection system includes the development of hardware and software, such as vibration meter, signal acquisition card, data processing platform, and machine control program. Meanwhile, based on the difference between the mechanical configuration and the desired characteristics, it is difficult for a vibration detection system to directly choose the commercially available kits. For this reason, it was also selected as an item for self-development research, along with the exploration of a significant parametric study that is sufficient to represent the machine characteristics and states. However, we also launched the development of functional parts of the system simultaneously. Finally, we entered the conditions and the parameters generated from both the states and the characteristics into the developed system to verify its feasibility.

  9. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    Science.gov (United States)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
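    The bootstrap statistics mentioned above can be sketched generically: resample the data points, refit the regression, and collect the distribution of the y-intercept that the reliability criterion inspects. The data here are synthetic, not multispecimen measurements, and the procedure is illustrative rather than MSP-Tool's exact implementation.

```python
# Bootstrap confidence interval for a regression intercept.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.2, 1.0, 8)               # e.g., normalized field steps
y = -1.0 + 2.0 * x + rng.normal(0, 0.05, x.size)

intercepts = []
for _ in range(2000):
    idx = rng.integers(0, x.size, x.size)  # resample with replacement
    slope, intercept = np.polyfit(x[idx], y[idx], 1)
    intercepts.append(intercept)

lo, hi = np.percentile(intercepts, [2.5, 97.5])
print(f"95% bootstrap CI for intercept: [{lo:.3f}, {hi:.3f}]")
```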

  10. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    Science.gov (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
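    A minimal sketch of the post-processing step these tools perform, assuming the spring forces have already been read from the FE output: convert each spring force to an adhesive stress via its tributary area, then sort descending. The array names, forces, and areas are illustrative only.

```python
# Spring forces -> adhesive shear stresses, sorted descending.
import numpy as np

spring_ids = np.array([101, 102, 103, 104])
shear_force = np.array([120.0, 310.0, 95.0, 210.0])  # N, from FE output
tributary_area = np.array([25.0, 25.0, 12.5, 25.0])  # mm^2 per spring

shear_stress = shear_force / tributary_area          # MPa
order = np.argsort(shear_stress)[::-1]               # descending
for sid, s in zip(spring_ids[order], shear_stress[order]):
    print(f"spring {sid}: shear stress {s:.1f} MPa")
```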

  11. A Geographic and Functional Network Flow Analysis Tool

    Science.gov (United States)

    2014-06-01

    [Recoverable figure captions from the report: "Additional functionality added to the Arc Maker plugin" and "Work flow of running the flow model on a QGIS layer".] This work relies heavily on Quantum GIS (QGIS 2012) to model the simulated networks. QGIS is a free, open source Geographic Information System software suite.

  12. Lagrangian analysis. Modern tool of the dynamics of solids

    Science.gov (United States)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave.

  13. Enabling Collaborative Analysis: State Evaluation Groups, the Electronic State File, and Collaborative Analysis Tools

    International Nuclear Information System (INIS)

    Eldridge, C.; Gagne, D.; Wilson, B.; Murray, J.; Gazze, C.; Feldman, Y.; Rorif, F.

    2015-01-01

    The timely collection and analysis of all safeguards relevant information is the key to drawing and maintaining soundly-based safeguards conclusions. In this regard, the IAEA has made multidisciplinary State Evaluation Groups (SEGs) central to this process. To date, SEGs have been established for all States and tasked with developing State-level approaches (including the identification of technical objectives), drafting annual implementation plans specifying the field and headquarters activities necessary to meet technical objectives, updating the State evaluation on an ongoing basis to incorporate new information, preparing an annual evaluation summary, and recommending a safeguards conclusion to IAEA senior management. To accomplish these tasks, SEGs need to be staffed with relevant expertise and empowered with tools that allow for collaborative access to, and analysis of, disparate information sets. To ensure SEGs have the requisite expertise, members are drawn from across the Department of Safeguards based on their knowledge of relevant data sets (e.g., nuclear material accountancy, material balance evaluation, environmental sampling, satellite imagery, open source information, etc.) or their relevant technical (e.g., fuel cycle) expertise. SEG members also require access to all available safeguards relevant data on the State. To facilitate this, the IAEA is also developing a common, secure platform where all safeguards information can be electronically stored and made available for analysis (an electronic State file). The structure of this SharePoint-based system supports IAEA information collection processes, enables collaborative analysis by SEGs, and provides for management insight and review. In addition to this common platform, the Agency is developing, deploying, and/or testing sophisticated data analysis tools that can synthesize information from diverse information sources, analyze diverse datasets from multiple viewpoints (e.g., temporal, geospatial

  14. CIPHER: a flexible and extensive workflow platform for integrative next-generation sequencing data analysis and genomic regulatory element prediction.

    Science.gov (United States)

    Guzman, Carlos; D'Orso, Iván

    2017-08-08

    Next-generation sequencing (NGS) approaches are commonly used to identify key regulatory networks that drive transcriptional programs. Although these technologies are frequently used in biological studies, NGS data analysis remains a challenging, time-consuming, and often irreproducible process. Therefore, there is a need for a comprehensive and flexible workflow platform that can accelerate data processing and analysis so more time can be spent on functional studies. We have developed an integrative, stand-alone workflow platform, named CIPHER, for the systematic analysis of several commonly used NGS datasets, including ChIP-seq, RNA-seq, MNase-seq, DNase-seq, GRO-seq, and ATAC-seq data. CIPHER implements various open source software packages, in-house scripts, and Docker containers to analyze and process single-end and paired-end datasets. CIPHER's pipelines conduct extensive quality and contamination control checks, as well as comprehensive downstream analysis. A typical CIPHER workflow includes: (1) raw sequence evaluation, (2) read trimming and adapter removal, (3) read mapping and quality filtering, (4) visualization track generation, and (5) extensive quality control assessment. Furthermore, CIPHER conducts downstream analysis such as: narrow and broad peak calling, peak annotation, and motif identification for ChIP-seq, differential gene expression analysis for RNA-seq, nucleosome positioning for MNase-seq, DNase hypersensitive site mapping, site annotation and motif identification for DNase-seq, analysis of nascent transcription from Global Run-On (GRO-seq) data, and characterization of chromatin accessibility from ATAC-seq datasets. In addition, CIPHER contains an "analysis" mode that completes complex bioinformatics tasks such as enhancer discovery and provides functions to integrate various datasets together. Using public and simulated data, we demonstrate that CIPHER is an efficient and comprehensive workflow platform that can analyze several NGS
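    The five-step workflow enumerated above amounts to an ordered chain of commands that halts on failure. The skeleton below shows that control flow only; the echo commands are placeholders standing in for the real containerized tools, so treat every command string as an assumption.

```python
# Skeletal five-step pipeline; each placeholder command would be a
# real trimming/mapping/QC tool in an actual workflow.
import subprocess

STEPS = [
    ("raw sequence evaluation",  ["echo", "fastq quality report"]),
    ("read trimming",            ["echo", "trim adapters"]),
    ("read mapping + filtering", ["echo", "align and filter"]),
    ("track generation",         ["echo", "make signal tracks"]),
    ("quality control",          ["echo", "QC summary"]),
]

for name, cmd in STEPS:
    print(f"--- {name} ---")
    subprocess.run(cmd, check=True)  # stop the pipeline on any failure
```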

  15. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    Science.gov (United States)

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist the work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare different Metatool results from different networks. This set consists of: Compara and ComparaSub which compare network subsets providing outputs in different formats and ComparaEM that seeks for identical elementary modes in two metabolic networks. The suite T4M also includes one script that generates Metatool input: CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex basis' metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Stability analysis of multipoint tool equipped with metal cutting ceramics

    Science.gov (United States)

    Maksarov, V. V.; Khalimonenko, A. D.; Matrenichev, K. G.

    2017-10-01

    The article highlights the issues of determining the stability of the cutting process with a multipoint cutting tool equipped with cutting ceramics. Based on the research conducted, recommendations are offered on the choice of parameters of replaceable ceramic cutting plates for milling. It is proposed that ceramic plates for milling be selected on the basis of their electrical volume resistivity.

  17. “DRYPACK” - a calculation and analysis tool

    DEFF Research Database (Denmark)

    Andreasen, M.B.; Toftegaard, R.; Schneider, P.

    2013-01-01

    energy consumption reductions by using “DryPack” are calculated. With the “DryPack” calculation tool, it is possible to calculate four different unit operations with moist air (dehumidification of air, humidification of air, mixing of two air streams, and heating of air). In addition, a Mollier diagram...

  18. comparative analysis of diagnostic applications of autoscan tools

    African Journals Online (AJOL)

    changing the skills demanded of auto designers, engineers and production workers [1,5,6]. In automobile education, the use of autotronic simulators and demonstrators as teaching aids with computer software, auto scan tools for diagnosis, servicing and maintenance, auto-analyzers, solid work design and CAN-bus hard ...

  19. Fractography analysis of tool samples used for cold forging

    DEFF Research Database (Denmark)

    Dahl, K.V.

    2002-01-01

    Three fractured tool dies used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. Two of the specimens were produced using the traditional Böhler P/M steel grade s790, while the last specimen was a third generation P/M steel produced ... resistance towards abrasive wear compared with the traditional P/M steel.

  20. Analysis of Online Quizzes as a Teaching and Assessment Tool

    Science.gov (United States)

    Salas-Morera, Lorenzo; Arauzo-Azofra, Antonio; García-Hernández, Laura

    2012-01-01

    This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test quizzes effectiveness on student performance when used, not only as an…

  1. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey

    Science.gov (United States)

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web with the help of tools and reasoners which manage personalized information. The future of the semantic web lies in an ontology…

  2. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Science.gov (United States)

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  3. NREL Suite of Tools for PV and Storage Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Elgqvist, Emma M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Salasovich, James A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-03

    Many different factors such as the solar resource, technology costs and incentives, utility cost and consumption, space available, and financial parameters impact the technical and economic potential of a PV project. NREL has developed techno-economic modeling tools that can be used to evaluate PV projects at a site.

  4. Stakeholder Analysis of an Executable Achitecture Systems Engineering (EASE) Tool

    Science.gov (United States)

    2013-06-21

    The FCR tables and stakeholder feedback are then used as the foundation of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. Finally, the SWOT analysis and stakeholder feedback are translated into an EASE future development strategy; a series of recommendations regarding ...

  5. Extensive Analysis of Elastase-Induced Pulmonary Emphysema in Rats: ALP in the Lung, a New Biomarker for Disease Progression?

    Science.gov (United States)

    Inoue, Ken-ichiro; Koike, Eiko; Yanagisawa, Rie; Takano, Hirohisa

    2010-01-01

    It is accepted that pulmonary exposure of rodents to porcine pancreatic elastase (ELT) induces lesions that morphologically resemble human emphysema. Nonetheless, extensive analysis of this model has rarely been conducted. The present study was designed to extensively examine the effects of ELT on lung inflammation, cell damage, emphysematous change, and cholinergic reactivity in rats. Intratracheal administration of two doses of ELT induced 1) a proinflammatory response in the lung that was characterized by significant infiltration of macrophages and an increased level of interleukin-1β in lung homogenates, 2) lung cell damage as indicated by higher levels of total protein, lactate dehydrogenase, and alkaline phosphatase (ALP) in lung homogenates, 3) emphysema-related morphological changes including airspace enlargement and progressive destruction of alveolar wall structures, and 4) airway responsiveness to methacholine including an augmented Rn value. In addition, ELT at a high dose was more effective than that at a low dose. This is a novel study extensively analyzing ELT-induced lung emphysema, and the analysis might be applied to future investigations that evaluate new therapeutic agents or risk factors for pulmonary emphysema. In particular, ALP in lung homogenates might be a new biomarker for disease progression/exacerbation. PMID:20216950

  6. Interactive tool that empowers structural understanding and enables FEM analysis in a parametric design environment

    DEFF Research Database (Denmark)

    Christensen, Jesper Thøger; Parigi, Dario; Kirkegaard, Poul Henning

    2014-01-01

    This paper introduces an interactive tool developed to integrate structural analysis in the architectural design environment from the early conceptual design stage. The tool improves exchange of data between the design environment of Rhino Grasshopper and the FEM analysis of Autodesk Robot...

  7. Single nucleotide polymorphism analysis of ubiquitin extension protein genes (ubq) of gossypium arboreum and gossypium herbaceum in comparison with arabidopsis thaliana

    International Nuclear Information System (INIS)

    Shaheen, T.; Zafar, Y.; Rahman, M.

    2014-01-01

    Single nucleotide polymorphism analysis is an expedient way to study polymorphisms at the genomic level. In the present study, we explored the ubiquitin extension protein (UBQ) gene of G. arboreum (A2) and G. herbaceum (A1) cotton, which is a multiple-copy gene. We found SNPs at 16 positions in a 200 bp region within the A genome of cotton, indicating a SNP frequency of 1 per 13 bp. Both cotton sequences showed maximum similarity with UBQ5 and UBQ6 of Arabidopsis thaliana. The sequence obtained from G. arboreum showed SNPs at 28 positions in comparison with each of UBQ5 and UBQ6 of Arabidopsis thaliana, while the sequence obtained from G. herbaceum showed SNPs at 31 positions in comparison with each of UBQ5 and UBQ6. In conclusion, although the ubiquitin extension protein genes of both A-genome species have accumulated some mutations during evolution, most of their sequence remains similar. SNP analysis can prove a vital tool for identifying the gene type in the case of multicopy genes. (author)

  8. Subreflector extension for improved efficiencies in Cassegrain antennas - GTD/PO analysis. [Geometrical Theory of Diffraction/Physical Optics

    Science.gov (United States)

    Rahmat-Samii, Yahya

    1986-01-01

    Both offset and symmetric Cassegrain reflector antennas are used in satellite and ground communication systems. It is known that the subreflector diffraction can degrade the performance of these reflectors. A geometrical theory of diffraction/physical optics analysis technique is used to investigate the effects of the extended subreflector, beyond its optical rim, on the reflector efficiency and far-field patterns. Representative numerical results are shown for an offset Cassegrain reflector antenna with different feed illumination tapers and subreflector extensions. It is observed that for subreflector extensions as small as one wavelength, noticeable improvements in the overall efficiencies can be expected. Useful design data are generated for the efficiency curves and far-field patterns.

  9. Multiply controlled verbal operants: An analysis and extension to the picture exchange communication system

    OpenAIRE

    Bondy, Andy; Tincani, Matt; Frost, Lori

    2004-01-01

    This paper presents Skinner's (1957) analysis of verbal behavior as a framework for understanding language acquisition in children with autism. We describe Skinner's analysis of pure and impure verbal operants and illustrate how this analysis may be applied to the design of communication training programs. The picture exchange communication system (PECS) is a training program influenced by Skinner's framework. We describe the training sequence associated with PECS and illustrate how this sequ...

  10. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerated and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source Homepage:http://www.gcat.bio/
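
    Comma-freeness, one of the codon-set properties GCAT tests, follows directly from its definition: no out-of-frame reading of two concatenated codons from the set may itself belong to the set. A minimal Python sketch of that test (an illustration, not GCAT's own Java implementation):

        def is_comma_free(codons):
            """Return True if no out-of-frame reading of any two
            concatenated codons yields a codon from the set."""
            codon_set = set(codons)
            for a in codon_set:
                for b in codon_set:
                    pair = a + b  # six letters; frames +1 and +2 are out-of-frame
                    if pair[1:4] in codon_set or pair[2:5] in codon_set:
                        return False
            return True

        print(is_comma_free(["ACG", "TCG"]))  # True: no shifted triple re-enters the set
        print(is_comma_free(["AAA"]))         # False: AAA+AAA reads AAA in every frame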

  12. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams could predict chatter by providing graphical representations of the stable combinations of the axial depth of the cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the bearing preload influence on the system stability with uncertainty taken into account.
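
    The Monte Carlo step can be illustrated with a deliberately simplified stand-in for the paper's 5-degree-of-freedom spindle model: the classic single-degree-of-freedom regenerative-chatter limit a_lim = -1/(2*Ks*Re(G)), with the bearing-preload uncertainty represented as scatter in the effective stiffness. All numerical values below are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        Ks = 2.0e9                                   # cutting coefficient (illustrative)
        omega = np.linspace(100.0, 20000.0, 20000)   # frequency sweep, rad/s

        def min_stable_depth(k, m, c):
            """Absolute minimum of a_lim = -1/(2*Ks*Re(G)) over Re(G) < 0
            for a 1-DOF structure G(iw) = 1/(k - m*w**2 + i*c*w)."""
            G = 1.0 / (k - m * omega**2 + 1j * c * omega)
            neg = G.real < 0.0
            return np.min(-1.0 / (2.0 * Ks * G.real[neg]))

        # Propagate preload uncertainty as scatter in effective stiffness.
        depths = [min_stable_depth(k, m=0.5, c=200.0)
                  for k in rng.normal(2.0e7, 2.0e6, size=1000)]
        print(f"stability limit: {np.mean(depths)*1e3:.3f} "
              f"+/- {np.std(depths)*1e3:.3f} mm")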

  13. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  14. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    International Nuclear Information System (INIS)

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings
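
    Of these tools, FMEA is the most mechanical to demonstrate: each failure mode is scored for severity, occurrence, and detection (typically on 1-10 scales), and their product, the risk priority number (RPN), ranks where mitigation effort should go first. A short sketch with hypothetical radiotherapy failure modes and scores:

        # RPN = Severity x Occurrence x Detection, each scored 1-10.
        failure_modes = [
            # (description,             severity, occurrence, detection)
            ("wrong patient selected",         9,          2,         3),
            ("incorrect dose entered",        10,          3,         4),
            ("setup misalignment",             7,          5,         5),
        ]

        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3],
                        reverse=True)
        for desc, s, o, d in ranked:
            print(f"RPN={s*o*d:4d}  {desc}")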

  15. A Rigorous, Compositional, and Extensible Framework for Dynamic Fault Tree Analysis

    NARCIS (Netherlands)

    Boudali, H.; Sandhu, R.; Crouzen, Pepijn; Stoelinga, Mariëlle Ida Antoinette

    Fault trees (FTs) are among the most prominent formalisms for reliability analysis of technical systems. Dynamic fault trees (DFTs) extend FTs with support for expressing dynamic dependencies among components. The standard analysis vehicle for DFTs is state-based, and treats the model as a CTMC, a continuous-time

  16. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is...

  17. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an a...... provide the possibility for the designer to work both with the aesthetics as well as the technical aspects of architectural design....

  18. ProteinHistorian: tools for the comparative analysis of eukaryote protein origin.

    Directory of Open Access Journals (Sweden)

    John A Capra

    Full Text Available The evolutionary history of a protein reflects the functional history of its ancestors. Recent phylogenetic studies identified distinct evolutionary signatures that characterize proteins involved in cancer, Mendelian disease, and different ontogenic stages. Despite the potential to yield insight into the cellular functions and interactions of proteins, such comparative phylogenetic analyses are rarely performed, because they require custom algorithms. We developed ProteinHistorian to make tools for performing analyses of protein origins widely available. Given a list of proteins of interest, ProteinHistorian estimates the phylogenetic age of each protein, quantifies enrichment for proteins of specific ages, and compares variation in protein age with other protein attributes. ProteinHistorian allows flexibility in the definition of protein age by including several algorithms for estimating ages from different databases of evolutionary relationships. We illustrate the use of ProteinHistorian with three example analyses. First, we demonstrate that proteins with high expression in human, compared to chimpanzee and rhesus macaque, are significantly younger than those with human-specific low expression. Next, we show that human proteins with annotated regulatory functions are significantly younger than proteins with catalytic functions. Finally, we compare protein length and age in many eukaryotic species and, as expected from previous studies, find a positive, though often weak, correlation between protein age and length. ProteinHistorian is available through a web server with an intuitive interface and as a set of command line tools; this allows biologists and bioinformaticians alike to integrate these approaches into their analysis pipelines. ProteinHistorian's modular, extensible design facilitates the integration of new datasets and algorithms. The ProteinHistorian web server, source code, and pre-computed ages for 32 eukaryotic genomes are

  19. Pyrosequencing data analysis software: a useful tool for EGFR, KRAS, and BRAF mutation analysis

    Directory of Open Access Journals (Sweden)

    Shen Shanxiang

    2012-05-01

    Full Text Available Abstract Background Pyrosequencing is a new technology and can be used for mutation tests. However, its data analysis is a manual process and involves sophisticated algorithms. During this process, human errors may occur. A better way of analyzing pyrosequencing data is needed in the clinical diagnostic laboratory. Computer software is potentially useful for pyrosequencing data analysis. We have developed such software, which is able to perform pyrosequencing mutation data analysis for epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene homolog and v-raf murine sarcoma viral oncogene homolog B1. The input data for analysis includes the targeted nucleotide sequence, common mutations in the targeted sequence, pyrosequencing dispensing order, pyrogram peak order and peak heights. The output includes mutation type and percentage of mutant gene in the specimen. Results The data from 1375 pyrosequencing test results were analyzed using the software in parallel with manual analysis. The software was able to generate correct results for all 1375 cases. Conclusion The software developed is a useful molecular diagnostic tool for pyrosequencing mutation data analysis. This software can increase laboratory data analysis efficiency and reduce data analysis error rate. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/1348911657684292.
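
    The quantitative step such software automates can be reduced, in outline, to one ratio: at the dispensation that discriminates the variant, the mutant allele fraction is approximately the mutant peak height over the sum of mutant and wild-type peak heights. A deliberately simplified sketch (the real analysis also handles dispensation order and peak normalization):

        def mutant_fraction(mutant_peak, wildtype_peak):
            """Mutant allele fraction, assuming peak height is proportional
            to the number of nucleotides incorporated."""
            total = mutant_peak + wildtype_peak
            return mutant_peak / total if total > 0 else 0.0

        # Hypothetical peak heights at the discriminating dispensation:
        print(f"{mutant_fraction(1.8, 7.2):.0%} mutant")   # -> 20% mutant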

  20. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  1. Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide

    Science.gov (United States)

    Smittle, Aaron M.; Shoberg, Thomas G.

    2017-06-16

    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
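
    The outlier test described, flagging a station whose Bouguer anomaly deviates from nearby GDUS values by two or more standard deviations, can be sketched as follows. The search radius, minimum neighbour count, and brute-force distance scan are illustrative assumptions, not the tool's implementation:

        import numpy as np

        def flag_outliers(xy, bouguer, radius_m=5000.0, n_sigma=2.0):
            """Indices of stations whose anomaly deviates from the mean of
            neighbours within radius_m by n_sigma standard deviations or more."""
            xy = np.asarray(xy, float)
            bouguer = np.asarray(bouguer, float)
            flagged = []
            for i in range(len(bouguer)):
                d = np.hypot(*(xy - xy[i]).T)
                near = (d <= radius_m) & (d > 0)   # neighbours, excluding station i
                if near.sum() < 3:
                    continue                        # too few neighbours to judge
                mu, sigma = bouguer[near].mean(), bouguer[near].std()
                if sigma > 0 and abs(bouguer[i] - mu) >= n_sigma * sigma:
                    flagged.append(i)
            return flagged

        xy = [[0, 0], [1000, 0], [2000, 0], [1500, 500], [800, 300]]
        bg = [-20.1, -19.8, -20.3, -20.0, -35.0]
        print(flag_outliers(xy, bg))   # the -35 mGal station stands out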

  2. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  3. Propositional Analysis: A Tool for Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce

    1989-01-01

    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended. (32…

  4. Version VI of the ESTree db: an improved tool for peach transcriptome analysis

    Science.gov (United States)

    Lazzari, Barbara; Caprera, Andrea; Vecchietti, Alberto; Merelli, Ivan; Barale, Francesca; Milanesi, Luciano; Stella, Alessandra; Pozzi, Carlo

    2008-01-01

    Background The ESTree database (db) is a collection of Prunus persica and Prunus dulcis EST sequences that in its current version encompasses 75,404 sequences from 3 almond and 19 peach libraries. Nine peach genotypes and four peach tissues are represented, from four fruit developmental stages. The aim of this work was to implement the already existing ESTree db by adding new sequences and analysis programs. Particular care was given to the implementation of the web interface, that allows querying each of the database features. Results A Perl modular pipeline is the backbone of sequence analysis in the ESTree db project. Outputs obtained during the pipeline steps are automatically arrayed into the fields of a MySQL database. Apart from standard clustering and annotation analyses, version VI of the ESTree db encompasses new tools for tandem repeat identification, annotation against genomic Rosaceae sequences, and positioning on the database of oligomer sequences that were used in a peach microarray study. Furthermore, known protein patterns and motifs were identified by comparison to PROSITE. Based on data retrieved from sequence annotation against the UniProtKB database, a script was prepared to track positions of homologous hits on the GO tree and build statistics on the ontologies distribution in GO functional categories. EST mapping data were also integrated in the database. The PHP-based web interface was upgraded and extended. The aim of the authors was to enable querying the database according to all the biological aspects that can be investigated from the analysis of data available in the ESTree db. This is achieved by allowing multiple searches on logical subsets of sequences that represent different biological situations or features. Conclusions The version VI of ESTree db offers a broad overview on peach gene expression. Sequence analyses results contained in the database, extensively linked to external related resources, represent a large amount of

  5. An Extension of Fuzzy SWOT Analysis: An Application to Information Technology

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Taghavifard

    2018-02-01

    Full Text Available When considering today's uncertain atmosphere, many people and organizations believe that strategy has lost its meaning and position. When the future is predictable, common approaches to strategic planning are applicable; nonetheless, vague circumstances require different methods. Accordingly, a new approach that is compatible with uncertainty and unstable conditions is necessary. Fuzzy logic is a worldview compatible with today's complicated requirements. Given today's uncertain and vague atmosphere, there is an absolute requirement to fuzzify the tools and models of strategic planning, especially for dynamic and unclear environments. In this research, an extended version of the fuzzy Strengths, Weaknesses, Opportunities and Threats (SWOT) approach is presented for strategic planning based on fuzzy logic. It solves the key problems of traditional strategic planning, such as handling internal and external factors in an imprecise and ambiguous environment. The model was applied in an information technology corporation to demonstrate its capabilities in real-world cases.

  6. An extensive analysis of various texture feature extractors to detect Diabetes Mellitus using facial specific regions.

    Science.gov (United States)

    Shu, Ting; Zhang, Bob; Yan Tang, Yuan

    2017-04-01

    Researchers have recently discovered that Diabetes Mellitus can be detected through a non-invasive computerized method. However, the focus has been on facial block color features. In this paper, we extensively study the effects of texture features extracted from specific facial regions on detecting Diabetes Mellitus, using eight texture extractors. The eight methods are drawn from four texture feature families: (1) statistical texture features: Image Gray-scale Histogram, Gray-level Co-occurrence Matrix, and Local Binary Pattern; (2) structural texture features: Voronoi Tessellation; (3) signal processing based texture features: Gaussian, Steerable, and Gabor filters; and (4) model based texture features: Markov Random Field. In order to determine the most appropriate extractor with optimal parameter(s), various parameter settings of each extractor are tested experimentally. For each extractor, the same dataset (284 Diabetes Mellitus and 231 Healthy samples), classifiers (k-Nearest Neighbors and Support Vector Machines), and validation method (10-fold cross validation) are used. According to the experiments, the first and third families achieved a better outcome at detecting Diabetes Mellitus than the other two. The best texture feature extractor for Diabetes Mellitus detection is the Image Gray-scale Histogram with bin number=256, obtaining an accuracy of 99.02%, a sensitivity of 99.64%, and a specificity of 98.26% by using SVM. Copyright © 2017 Elsevier Ltd. All rights reserved.
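
    The best-performing configuration reported, a 256-bin gray-scale histogram classified by an SVM under 10-fold cross validation, is straightforward to reproduce in outline. The sketch below substitutes randomly generated images for the study's facial-region data, which are not public:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def histogram_features(image, bins=256):
            """Normalised 256-bin gray-scale histogram of one region image."""
            hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, 256))
            return hist / hist.sum()

        rng = np.random.default_rng(1)                      # stand-in data
        images = rng.integers(0, 256, size=(515, 64, 64))   # 284 DM + 231 Healthy
        X = np.array([histogram_features(img) for img in images])
        y = rng.integers(0, 2, size=515)                    # 1 = Diabetes Mellitus

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        scores = cross_val_score(clf, X, y, cv=10)          # 10-fold cross validation
        print(f"accuracy: {scores.mean():.3f}")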

  7. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak

    2018-01-01

    Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in analysis of disease dynamics. Disease forecasting methods by simulation models for plant diseases have a great potentiality in practical disease control strategies. Common mathematical tools such as monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through construction of box and whisker plots has been suggested. The probable applications of recent advanced tools of linear and non-linear mixed models like the linear mixed model, generalized linear model, and generalized linear mixed models have been presented. The most recent technologies such as micro-array analysis, though cost effective, provide estimates of gene expressions for thousands of genes simultaneously and need attention by the molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. The rice research scientists should take advantage of these new opportunities adequately in
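
    As one concrete item from the growth-curve toolbox above, a logistic disease-progress model can be fitted to severity assessments with a standard least-squares routine. The data points below are hypothetical:

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, r, t0):
            """Logistic disease-progress model: y(t) = K / (1 + exp(-r*(t - t0)))."""
            return K / (1.0 + np.exp(-r * (t - t0)))

        t = np.array([0, 7, 14, 21, 28, 35, 42], dtype=float)     # days
        y = np.array([0.01, 0.03, 0.10, 0.28, 0.55, 0.72, 0.80])  # severity

        (K, r, t0), _ = curve_fit(logistic, t, y, p0=[0.9, 0.2, 21.0])
        print(f"asymptote K={K:.2f}, apparent infection rate r={r:.3f}/day")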

  8. Adapted waveform analysis, wavelet packets, and local cosine libraries as a tool for image processing

    Science.gov (United States)

    Coifman, Ronald R.; Woog, Lionel J.

    1995-09-01

    Adapted waveform analysis refers to a collection of FFT-like adapted transform algorithms. Given an image, these methods provide special matched collections of templates (orthonormal bases) enabling an efficient coding of the image. Perhaps the closest well known example of such a coding method is provided by musical notation, where each segment of music is represented by a musical score made up of notes (templates) characterised by their duration, pitch, location and amplitude; our method corresponds to transcribing the music in as few notes as possible. The extension to images and video is straightforward: we describe the image by collections of oscillatory patterns (paint brush strokes) of various sizes, locations and amplitudes using a variety of orthogonal bases. These selected basis functions are chosen inside predefined libraries of oscillatory localized functions (trigonometric and wavelet-packet waveforms) so as to optimize the number of parameters needed to describe our object. These algorithms are of complexity N log N, opening the door for a large range of applications in signal and image processing, such as compression, feature extraction, denoising and enhancement. In particular we describe a class of special purpose compressions for fingerprint images, as well as denoising tools for texture and noise extraction. We start by relating traditional Fourier methods to wavelet and wavelet-packet based algorithms using a recent refinement of the windowed sine and cosine transforms. We will then derive an adapted local sine transform, show its relation to wavelet and wavelet-packet analysis, and describe an analysis toolkit illustrating the merits of different adaptive and nonadaptive schemes.
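
    A minimal sketch of the wavelet-packet side of such a library, using the PyWavelets package rather than the authors' toolkit: decompose a signal into a level-3 packet tree and report the energy captured by each of the eight frequency sub-bands.

        import numpy as np
        import pywt

        # Toy signal: two tones plus noise.
        t = np.linspace(0.0, 1.0, 1024, endpoint=False)
        x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 180 * t)
        x += 0.1 * np.random.default_rng(0).standard_normal(t.size)

        # Full wavelet-packet tree to level 3: eight frequency sub-bands.
        wp = pywt.WaveletPacket(data=x, wavelet="db4", mode="symmetric", maxlevel=3)
        for node in wp.get_level(3, order="freq"):
            print(f"band {node.path}: energy {np.sum(node.data ** 2):10.2f}")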

  9. The Hydrograph Analyst, an Arcview GIS Extension That Integrates Point, Spatial, and Temporal Data Provides A Graphical User Interface for Hydrograph Analysis

    International Nuclear Information System (INIS)

    Jones, M.L.; O'Brien, G.M.; Jones, M.L.

    2000-01-01

    The Hydrograph Analyst (HA) is an ArcView GIS 3.2 extension developed by the authors to analyze hydrographs from a network of ground-water wells and springs in a regional ground-water flow model. ArcView GIS integrates geographic, hydrologic, and descriptive information and provides the base functionality needed for hydrograph analysis. The HA extends ArcView's base functionality by automating data integration procedures and by adding capabilities to visualize and analyze hydrologic data. Data integration procedures were automated by adding functionality to the View document's Document Graphical User Interface (DocGUI). A menu allows the user to query a relational database and select sites which are displayed as a point theme in a View document. An "Identify One to Many" tool is provided within the View DocGUI to retrieve all hydrologic information for a selected site and display it in a simple and concise tabular format. For example, the display could contain various records from many tables storing data for one site. Another HA menu allows the user to generate a hydrograph for sites selected from the point theme. Hydrographs generated by the HA are added as hydrograph documents and accessed by the user with the Hydrograph DocGUI, which contains tools and buttons for hydrograph analysis. The Hydrograph DocGUI has a "Select By Polygon" tool used for isolating particular points on the hydrograph inside a user-drawn polygon, or the user could isolate the same points by constructing a logical expression with the ArcView GIS "Query Builder" dialog that is also accessible in the Hydrograph DocGUI. Other buttons can be selected to alter the query applied to the active hydrograph. The selected points on the active hydrograph can be attributed (or flagged) individually or as a group using the "Flag" tool found on the Hydrograph DocGUI. The "Flag" tool activates a dialog box that prompts the user to select an attribute and "methods" or "conditions" that qualify

  10. Operating system efficiency evaluation on the base of measurements analysis with the use of non-extensive statistics elements

    Directory of Open Access Journals (Sweden)

    Dymora Paweł

    2014-09-01

    Full Text Available The major goal of this article was to evaluate the efficiency of the Linux operating system using statistical self-similarity and multifractal analysis. In order to collect the necessary data, the tools available in Linux such as vmstat, top and iostat were used. The measurement data collected with those tools had to be converted into a format acceptable by applications which analyze statistical self-similarity and multifractal spectra. Measurements collected while using the MySQL database system in a host operating system were therefore analyzed for statistical self-similarity, which allowed the occurrence of long-range dependencies to be determined. Those dependencies were analyzed with the use of adequately graduated diagrams. Multifractal analysis was conducted with the help of the FracLab application. Two methods were applied to determine the multifractal spectra. The obtained spectra were analyzed in order to establish the multifractal dependencies.
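
    Statistical self-similarity of such measurement series is commonly summarized by the Hurst exponent, with H > 0.5 indicating long-range dependence. A sketch of the aggregated-variance estimator (one standard method; not necessarily the one used in the article):

        import numpy as np

        def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64, 128)):
            """Aggregated-variance Hurst estimate: for a self-similar series,
            the variance of block means scales as m**(2H - 2)."""
            x = np.asarray(x, float)
            log_m, log_v = [], []
            for m in block_sizes:
                n_blocks = len(x) // m
                means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
                log_m.append(np.log(m))
                log_v.append(np.log(means.var()))
            slope, _ = np.polyfit(log_m, log_v, 1)
            return 1.0 + slope / 2.0

        rng = np.random.default_rng(2)
        print(hurst_aggvar(rng.standard_normal(100_000)))  # ~0.5 for white noise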

  11. Comparative analysis on arthroscopic sutures of large and extensive rotator cuff injuries in relation to the degree of osteopenia

    Directory of Open Access Journals (Sweden)

    Alexandre Almeida

    2015-02-01

    Full Text Available OBJECTIVE: To analyze the results from arthroscopic suturing of large and extensive rotator cuff injuries, according to the patient's degree of osteopenia. METHOD: 138 patients who underwent arthroscopic suturing of large and extensive rotator cuff injuries between 2003 and 2011 were analyzed. Those operated from October 2008 onwards formed a prospective cohort, while the remainder formed a retrospective cohort. Also from October 2008 onwards, bone densitometry evaluation was requested at the time of the surgical treatment. For the patients operated before this date, densitometry examinations performed up to two years before or after the surgical treatment were investigated. The patients were divided into three groups. Those with osteoporosis formed group 1 (n = 16); those with osteopenia, group 2 (n = 33); and normal individuals, group 3 (n = 55). RESULTS: In analyzing the University of California at Los Angeles (UCLA) scores of group 3 and comparing them with group 2, no statistically significant difference was seen (p = 0.070). Analysis on group 3 in comparison with group 1 showed a statistically significant difference (p = 0.027). CONCLUSION: The results from arthroscopic suturing of large and extensive rotator cuff injuries seem to be influenced by the patient's bone mineral density, as assessed using bone densitometry.

  12. An auditory display tool for DNA sequence analysis.

    Science.gov (United States)

    Temple, Mark D

    2017-04-24

    DNA Sonification refers to the use of an auditory display to convey the information content of DNA sequence data. Six sonification algorithms are presented that each produce an auditory display. These algorithms are logically designed from the simple through to the more complex. Three of these parse individual nucleotides, nucleotide pairs or codons into musical notes to give rise to 4, 16 or 64 notes, respectively. Codons may also be parsed degenerately into 20 notes with respect to the genetic code. Lastly, nucleotide pairs can be parsed as two separate frames or codons can be parsed as three reading frames giving rise to multiple streams of audio. The most informative sonification algorithm reads the DNA sequence as codons in three reading frames to produce three concurrent streams of audio in an auditory display. This approach is advantageous since start and stop codons in either frame have a direct effect, starting or stopping the audio in that frame while leaving the other frames unaffected. Using these methods, DNA sequences such as open reading frames or repetitive DNA sequences can be distinguished from one another. These sonification tools are available through a webpage interface in which an input DNA sequence can be processed in real time to produce an auditory display playable directly within the browser. The potential of this approach as an analytical tool is discussed with reference to auditory displays derived from test sequences including simple nucleotide sequences, repetitive DNA sequences and coding or non-coding genes. This study presents a proof-of-concept that some properties of a DNA sequence can be identified through sonification alone and argues for their inclusion within the toolkit of DNA sequence browsers as an adjunct to existing visual and analytical tools.
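
    The three-reading-frame algorithm can be sketched in a few lines: each frame is parsed into codons, a start codon opens that frame's audio stream, and a stop codon silences it. The codon-to-MIDI-pitch mapping below is a hypothetical stand-in for the paper's musical mapping:

        from itertools import product

        CODONS = ["".join(p) for p in product("ACGT", repeat=3)]
        PITCH = {codon: 36 + i for i, codon in enumerate(CODONS)}  # MIDI 36-99
        STOPS = {"TAA", "TAG", "TGA"}

        def sonify(seq):
            """Yield (frame, notes) for the three reading frames; None = rest."""
            for frame in range(3):
                notes, active = [], False
                for i in range(frame, len(seq) - 2, 3):
                    codon = seq[i:i + 3]
                    if codon == "ATG":
                        active = True                  # start codon opens the stream
                    if active:
                        notes.append(None if codon in STOPS else PITCH[codon])
                    if codon in STOPS:
                        active = False                 # stop codon silences the frame
                yield frame, notes

        for frame, notes in sonify("ATGGCGTAAATGCCC"):
            print(frame, notes)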

  13. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    Directory of Open Access Journals (Sweden)

    Drechsel Marion

    2009-10-01

    Full Text Available Abstract Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Application, embedded in the Microsoft Office package. This add-in is an easy to use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and converts them into other formats. It is free software.

  14. Economic modeling for life extension decision making

    International Nuclear Information System (INIS)

    Farber, M.A.; Harrison, D.L.; Carlson, D.D.

    1986-01-01

    This paper presents a methodology for the economic and financial analysis of nuclear plant life extension under uncertainty and demonstrates its use in a case analysis. While the economic and financial evaluation of life extension does not require new analytical tools, such studies should be based on the following three premises. First, the methodology should examine effects at the level of the company or utility system, because the most important economic implications of life extension relate to the altered generation system expansion plan. Second, it should focus on the implications of uncertainty in order to understand the factors that most affect life extension benefits and identify risk management efforts. Third, the methodology should address multiple objectives, at a minimum, both economic and financial objectives

  15. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

    In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor, such as ITER. A code system was used, which was developed at ECN Petten. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100 neutron group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account as well as uncertainties due to uncertainties in energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating, only due to Fe cross-sections, amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. Therefore, it is suggested that emphasis is now put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. ((orig.))
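
    Sensitivity-uncertainty codes of this kind ultimately evaluate the "sandwich rule", folding a relative covariance matrix between sensitivity profiles. A toy three-group sketch with invented numbers:

        import numpy as np

        # Sandwich rule: (dR/R)**2 = S.T @ C @ S, where S holds the relative
        # sensitivities of response R and C the relative covariance matrix.
        S = np.array([0.12, -0.30, 0.05])            # illustrative 3-group profile
        C = np.array([[0.0025, 0.0010, 0.0000],
                      [0.0010, 0.0100, 0.0020],
                      [0.0000, 0.0020, 0.0040]])     # illustrative covariances

        rel_var = S @ C @ S
        print(f"relative uncertainty in response: {np.sqrt(rel_var):.2%}")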

  16. Practical Multi-Disciplinary Analysis Tools for Combustion Devices Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of multidisciplinary analysis (MDA) techniques for combustion device environment prediction, including complex fluid mixing phenomena, is now becoming...

  17. A Comparative Analysis of Cockpit Display Development Tools

    National Research Council Canada - National Science Library

    Gebhardt, Matthew

    2000-01-01

    ..., Virtual Application Prototyping System (VAPS) and Display Editor. The comparison exploits the analysis framework establishing the advantages and disadvantages of the three software development suites...

  18. Towards understanding the lifespan extension by reduced insulin signaling: bioinformatics analysis of DAF-16/FOXO direct targets in Caenorhabditis elegans.

    Science.gov (United States)

    Li, Yan-Hui; Zhang, Gai-Gai

    2016-04-12

    DAF-16, the C. elegans FOXO transcription factor, is an important determinant in aging and longevity. In this work, we manually curated FOXODB http://lyh.pkmu.cn/foxodb/, a database of FOXO direct targets. It now covers 208 genes. Bioinformatics analysis on 109 DAF-16 direct targets in C. elegans found interesting results. (i) DAF-16 and transcription factor PQM-1 co-regulate some targets. (ii) Seventeen targets directly regulate lifespan. (iii) Four targets are involved in lifespan extension induced by dietary restriction. And (iv) DAF-16 direct targets might play global roles in lifespan regulation.

  19. Data visualization and analysis tools for the MAVEN mission

    Science.gov (United States)

    Harter, B.; De Wolfe, A. W.; Putnam, B.; Brain, D.; Chaffin, M.

    2016-12-01

    The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. We have developed new software tools for exploring and analyzing the science data. Our open-source Python toolkit for working with data from MAVEN and other missions is based on the widely-used "tplot" IDL toolkit. We have replicated all of the basic tplot functionality in Python, and use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our software is available on Github. We have also been exploring 3D graphics as a way to better visualize the MAVEN science data and models. We have constructed a 3D visualization of MAVEN's orbit using the CesiumJS library, which not only allows viewing of MAVEN's orientation and position, but also allows the display of selected science data sets and their variation over time.
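
    The tplot idiom the toolkit replicates is a stack of time-aligned panels, typically a line plot above a spectrogram. A generic matplotlib sketch with synthetic data (not the MAVEN toolkit's own API):

        import numpy as np
        import matplotlib.pyplot as plt

        t = np.linspace(0, 3600, 2000)                 # seconds since start
        rng = np.random.default_rng(0)
        density = 1e4 * np.exp(-t / 1800) + 50 * rng.standard_normal(t.size)
        energies = np.logspace(1, 4, 64)               # eV bins
        flux = np.outer(np.exp(-t / 2400), 1.0 / energies)  # toy spectrum

        fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True, figsize=(8, 5))
        ax1.plot(t, density, lw=0.8)
        ax1.set_ylabel("density")
        mesh = ax2.pcolormesh(t, energies, flux.T, shading="auto")
        ax2.set_yscale("log")
        ax2.set_ylabel("energy [eV]")
        ax2.set_xlabel("time [s]")
        fig.colorbar(mesh, ax=ax2, label="flux")
        plt.show()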

  20. Cemented carbide cutting tool: Laser processing and thermal stress analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)]. E-mail: bsyilbas@kfupm.edu.sa; Arif, A.F.M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia); Karatas, C. [Engineering Faculty, Hacettepe University, Ankara (Turkey); Ahsan, M. [Mechanical Engineering Department, KFUPM, Box 1913, Dhahran 31261 (Saudi Arabia)

    2007-04-15

    Laser treatment of a cemented carbide tool surface consisting of W, C, TiC and TaC is examined, and the thermal stress developed due to temperature gradients in the laser-treated region is predicted numerically. Temperature rise in the substrate material is computed numerically using the Fourier heating model. Experiments are carried out to treat the tool surfaces using a CO{sub 2} laser, while SEM, XRD and EDS are carried out for morphological and structural characterization of the treated surface. The laser parameters selected include the laser output power, duty cycle, assisting gas pressure, scanning speed, and nominal focus setting of the focusing lens. It is found that the temperature gradient attains significantly high values below the surface, particularly for titanium and tantalum carbides, which in turn results in high thermal stress generation in this region. SEM examination of the laser-treated surface and its cross section reveals that cracks initiate below the surface and extend over the depth of the laser-treated region.

  1. Data Envelopment Analysis and Extensions for Decision Support and Management Planning.

    Science.gov (United States)

    1983-05-01

    ...input and multioutput case. DEA differs, however, from the other models reviewed in the purposes for which it has been employed. Many other models...efficiency of multiinput, multioutput public service organizations, and provided a basis for further development into other areas of analysis and
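
    Behind DEA sits one linear program per decision-making unit (DMU). A sketch of the input-oriented CCR multiplier form solved with scipy; the three-DMU data set is invented for illustration:

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of DMU o (multiplier form):
            max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0,  u, v >= 0."""
            n, m = X.shape                             # n DMUs, m inputs
            s = Y.shape[1]                             # s outputs
            c = np.concatenate([-Y[o], np.zeros(m)])   # maximise u.y_o
            A_ub = np.hstack([Y, -X])                  # u.y_j - v.x_j <= 0
            A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                          A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
            return -res.fun

        X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0]])   # inputs per DMU
        Y = np.array([[1.0], [1.0], [1.0]])                  # one unit of output
        for o in range(3):
            print(f"DMU {o}: efficiency {ccr_efficiency(X, Y, o):.3f}")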

  2. Comprehensive analysis of RNA-Seq data reveals extensive RNA editing in a human transcriptome

    DEFF Research Database (Denmark)

    Peng, Zhiyu; Cheng, Yanbing; Tan, Bertrand Chin-Ming

    2012-01-01

    RNA editing is a post-transcriptional event that recodes hereditary information. Here we describe a comprehensive profile of the RNA editome of a male Han Chinese individual based on analysis of ∼767 million sequencing reads from poly(A)(+), poly(A)(-) and small RNA samples. We developed...

  3. Practical extensions to a minimum cost flow model for level of repair analysis

    NARCIS (Netherlands)

    Basten, Robertus Johannes Ida; van der Heijden, Matthijs C.; Schutten, Johannes M.J.

    2011-01-01

    The level of repair analysis (LORA) gives answers to three questions that are posed when deciding on how to maintain capital goods: (1) which components to repair upon failure and which to discard, (2) at which locations in the repair network to perform each type of repairs, and (3) at which

  4. Extension problem for generalized multi-monogenic functions in Clifford analysis

    International Nuclear Information System (INIS)

    Tran Quyet Thang.

    1992-10-01

    The main purpose of this paper is to extend some properties of multi-monogenic functions, a generalization of monogenic functions in higher dimensions, to a class of functions satisfying Vekua-type generalized Cauchy-Riemann equations in Clifford Analysis. It is proved that the Hartogs theorem is valid for these functions. (author). 7 refs

  5. Mammographic findings predicting an extensive intraductal component in early stage invasive breast cancer : analysis on microcalcification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeong Ah; Kim, Mi Hye; Lee, Mi Kyung; Oh, Ki Keun [College of Medicine, Yonsei University, Seoul (Korea, Republic of); Kim, Eun Kyung [Pundang CHA General Hospital, College of Medicine, Pochon CHA University, Seoul (Korea, Republic of)

    2000-05-01

    To analyze the mammographic findings of extensive intraductal component (EIC)-positive early invasive breast carcinoma and to determine the mammographic features which predict EIC positivity in an invasive carcinoma. The mammographic and pathologic findings in 71 patients aged 34-79 (mean 50) years in whom stage I or II invasive breast carcinoma had been diagnosed were retrospectively analysed. The mammographic findings were assigned to one of three groups: mass, mass with microcalcification, or microcalcification only. The shape and distribution of a calcification were classified according to the BI-RADS lexicon, and its extent was classified as either more or less than 3 cm. To detect the presence or absence of EIC and the type of ductal carcinoma in situ (DCIS), the findings were re-examined by means of slide mappings. Twenty-eight of 71 patients (39%) showed EIC positivity. The mammographic findings of EIC-positive invasive cancer (n=28) were mass with microcalcification (n=14), microcalcification only (n=7) and mass only (n=7). The mammographic finding which predicted EIC positivity was mass with microcalcification (PPV: 0.67, NPV: 0.33, p=0.02). A mammographic finding of mass only (n=39) showed a significantly high negative predictive value for EIC positivity (PPV: 0.18, NPV: 0.82, p<0.01). A comparison of cases with or without calcification showed that those with microcalcifications (n=32) showed a significantly high PPV of 0.66 (NPV: 0.34, p<0.01) while those without calcification (n=39) showed a significantly high NPV of 0.82 (PPV: 0.18, p<0.01). There were no significant differences in positive predictive values for EIC between the shape, distribution and extent of calcifications. Whenever microcalcification with or without mass is seen on mammograms obtained during early breast cancer, we can predict EIC positivity, regardless of shape or distribution according to the BI-RADS lexicon. (author)

  6. LAF-Fabric : a data analysis tool for Linguistic Annotation Framework with an application to the Hebrew Bible

    NARCIS (Netherlands)

    Roorda, Dirk; Kalkman, Gino; Naaijer, Martijn; Cranenburgh, Andreas van

    2014-01-01

    The Linguistic Annotation Framework (LAF) provides a general, extensible stand-off markup system for corpora. This paper discusses LAF-Fabric, a new tool to analyse LAF resources in general with an extension to process the Hebrew Bible in particular. We first walk through the history of the Hebrew

  8. Integration of management control tools. Analysis of a case study

    Directory of Open Access Journals (Sweden)

    Raúl Comas Rodríguez

    2015-09-01

    Full Text Available The objective of this article is to design and implement a procedure that integrates process-focused management control tools in order to improve efficiency and efficacy. An experimental study was carried out in which a procedure based on the Balanced Scorecard was defined, integrating process management into strategic planning and its evaluation. As a result of this work, we define the key success factors associated with the four perspectives of the Balanced Scorecard, linked through cause-effect relations to obtain the strategic map that allows the enterprise strategy to be visualized and communicated. The indicators evaluate the key success factors, integrating the processes with the assistance of software. The implementation of the procedure in a commercialization enterprise contributed to integrating process definitions into strategic planning. The alignment was evaluated, and the efficiency and efficacy indicators improved the company's performance.

  9. Knee Arthroscopy Simulation: A Randomized Controlled Trial Evaluating the Effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) Tool.

    Science.gov (United States)

    Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M

    2017-10-04

    Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool that utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants, using the Likert rating scale, subjectively rated the tool. The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, resulting in an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); the score was determined as 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the

  10. Risk factors associated with default from multi- and extensively drug-resistant tuberculosis treatment, Uzbekistan: a retrospective cohort analysis.

    Science.gov (United States)

    Lalor, Maeve K; Greig, Jane; Allamuratova, Sholpan; Althomsons, Sandy; Tigay, Zinaida; Khaemraev, Atadjan; Braker, Kai; Telnov, Oleksander; du Cros, Philipp

    2013-01-01

    The Médecins Sans Frontières project of Uzbekistan has provided multidrug-resistant tuberculosis treatment in the Karakalpakstan region since 2003. Rates of default from treatment have been high, despite psychosocial support, increasing particularly since programme scale-up in 2007. We aimed to determine factors associated with default in multi- and extensively drug-resistant tuberculosis patients who started treatment between 2003 and 2008 and thus had finished approximately 2 years of treatment by the end of 2010. A retrospective cohort analysis of multi- and extensively drug-resistant tuberculosis patients enrolled in treatment between 2003 and 2008 compared baseline demographic characteristics and possible risk factors for default. Default was defined as missing ≥60 consecutive days of treatment (all drugs). Data were routinely collected during treatment and entered in a database. Potential risk factors for default were assessed in univariate analysis using chi-square test and in multivariate analysis with logistic regression. 20% (142/710) of patients defaulted after a median of 6 months treatment (IQR 2.6-9.9). Factors associated with default included severity of resistance patterns (pre-extensively drug-resistant/extensively drug-resistant tuberculosis adjusted odds ratio 0.52, 95%CI: 0.31-0.86), previous default (2.38, 1.09-5.24) and age >45 years (1.77, 1.10-2.87). The default rate was 14% (42/294) for patients enrolled 2003-2006 and 24% (100/416) for 2007-2008 enrolments (p = 0.001). Default from treatment was high and increased with programme scale-up. It is essential to ensure scale-up of treatment is accompanied with scale-up of staff and patient support. A successful first course of tuberculosis treatment is important; patients who had previously defaulted were at increased risk of default and death. The protective effect of severe resistance profiles suggests that understanding disease severity or fear may motivate against default. Targeted

  11. Molecular polymorphism as a tool for differentiating ground beetles (Carabus species): application of ubiquitin PCR/SSCP analysis.

    Science.gov (United States)

    Boge, A; Gerstmeier, R; Einspanier, R

    1994-11-01

    Differentiation between Carabus species (ground beetles) and subspecies is difficult, although there have been extensive studies. To address this problem we have applied PCR in combination with SSCP analysis, focussing on the evolutionarily conserved ubiquitin gene, to elaborate a new approach to molecular differentiation between species. We report that Carabidae possess a ubiquitin gene and that this gene has a multimeric structure. Differential SSCP analysis was performed with the monomeric form of the gene to generate a clear SSCP pattern. Such PCR/SSCP resulted in reproducible patterns throughout our experiments. Comparing different Carabus species (Carabus granulatus, C. irregularis, C. violaceus and C. auronitens) we could observe clear interspecies differences but no differences between genders. Some species showed remarkable differences between individuals. We suggest that the ubiquitin PCR-SSCP technique might be an additional tool for the differentiation of ground beetles.

  12. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations algorithm. New canonical variates are found that indicate that the highest correlation between ocean temperature and height is associated with the build-up of the El Niño during the last half of 1997.
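
    The linear EOF/principal-component step that the nonlinear ACE analysis builds on reduces to a singular value decomposition of the anomaly field. A generic sketch on synthetic data standing in for the gridded sea surface fields:

        import numpy as np

        def eof(data):
            """Classical EOF analysis of a (n_times, n_points) field via SVD."""
            anomalies = data - data.mean(axis=0)     # remove the temporal mean
            U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
            eofs = Vt                                # spatial patterns (rows)
            pcs = U * s                              # principal-component series
            explained = s**2 / np.sum(s**2)          # variance fraction per mode
            return eofs, pcs, explained

        rng = np.random.default_rng(3)
        field = rng.standard_normal((104, 500))      # e.g. weekly maps, 1996-1997
        eofs, pcs, explained = eof(field)
        print(f"mode 1 explains {explained[0]:.1%} of the variance")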

  13. Phylogenomic Analysis Reveals Extensive Phylogenetic Mosaicism in the Human GPCR Superfamily

    Directory of Open Access Journals (Sweden)

    Mathew Woodwark

    2007-01-01

    Full Text Available A novel high throughput phylogenomic analysis (HTP) was applied to the rhodopsin G-protein coupled receptor (GPCR) family. Instances of phylogenetic mosaicism between receptors were found to be frequent, often as instances of correlated mosaicism and repeated mosaicism. A null data set was constructed with the same phylogenetic topology as the rhodopsin GPCRs. Comparison of the two data sets revealed that mosaicism was found in GPCRs at a higher frequency than would be expected by homoplasy or the effects of topology alone. Various evolutionary models of differential conservation, recombination and homoplasy are explored which could result in the patterns observed in this analysis. We find that the results are most consistent with frequent recombination events. A complex evolutionary history is illustrated in which it is likely frequent recombination has endowed GPCRs with new functions. The pattern of mosaicism is shown to be informative for functional prediction for orphan receptors. HTP analysis is complementary to conventional phylogenomic analyses, revealing mosaicism that would not otherwise have been detectable through conventional phylogenetics.

  14. Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report

    Science.gov (United States)

    2013-01-01

    The report describes a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach provides insight to...used to measure the returns of scale for a given attribute. The MODA approach promotes buy-in from multiple stakeholders. The CPAT team held an SME

  15. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Science.gov (United States)

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to the complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of an MD simulation and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named the ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit, so users can trace the entire process of ion-binding state analysis step by step. With IBiSA_tools, this method for analyzing the ion conduction mechanisms of ion channels can be used with ease. The software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
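
    Since the ion-binding state graph is written in standard GML, it can be consumed by any graph library, not only Cytoscape. A minimal sketch using networkx; the file name and the per-node "count" attribute are hypothetical, so the actual IBiSA_tools output schema should be checked before use.

```python
import networkx as nx

# Hypothetical output file name; IBiSA_tools writes the graph in GML.
g = nx.read_gml("ion_binding_states.gml")
print(g.number_of_nodes(), "binding states,",
      g.number_of_edges(), "observed transitions")

# Example query: rank states by how often the trajectory visits them,
# assuming a per-node visit-count attribute exists.
top = sorted(g.nodes(data=True),
             key=lambda n: n[1].get("count", 0), reverse=True)[:5]
for name, attrs in top:
    print(name, attrs.get("count", 0))
```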

  16. The Digital Shoreline Analysis System (DSAS) Version 4.0 - An ArcGIS extension for calculating shoreline change

    Science.gov (United States)

    Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan

    2009-01-01

    The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.
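
    Two of the four DSAS rate-of-change statistics reduce to a few lines of arithmetic per transect. A minimal sketch, assuming shoreline positions have already been measured as distances from the baseline along one transect; the survey dates and distances below are hypothetical, and DSAS itself adds weighting, uncertainty handling, and the least-median-of-squares option not shown here.

```python
import numpy as np

years = np.array([1950.0, 1978.0, 1998.0, 2009.0])  # survey dates
dist = np.array([120.0, 112.5, 104.0, 99.5])         # m from baseline

# Endpoint rate: change between oldest and most recent shoreline only.
epr = (dist[-1] - dist[0]) / (years[-1] - years[0])

# Simple linear regression: least-squares slope through all positions.
slope, intercept = np.polyfit(years, dist, 1)

# Correlation coefficient, reported alongside the regression rate.
r = np.corrcoef(years, dist)[0, 1]

print(f"endpoint rate: {epr:.2f} m/yr")
print(f"regression rate: {slope:.2f} m/yr (r^2 = {r**2:.3f})")
```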

  17. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. Programming ArcObjects can reduce the amount of repetitive work, streamline the workflow, and even produce functionality that is not readily available in ArcGIS. We adopted Visual Basic for Applications (VBA) for programming ArcObjects; because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users. Our tool visualizes data obtained from the analysis of historical sources (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, as thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area for an overview of previous hazards in the region; after performing the analysis, a double click on the visualized infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralizes the documentation of natural hazards; additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a ...

  18. ATENA – A tool for engineering analysis of fracture in concrete

    Indian Academy of Sciences (India)

    Cervenka, Vladimir; Cervenka, Jan; Pukl, Radomir

    Advanced constitutive models implemented in the finite element system ATENA serve as rational tools to ...

  19. HAWCStab2 with super element foundations: A new tool for frequency analysis of offshore wind turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Hansen, Anders Melchior; Kragh, Knud Abildgaard

    2013-01-01

    HAWCStab2 is a linear frequency-domain aero-elastic tool, developed by DTU Wind Energy, suitable for frequency and stability analysis of horizontal-axis three-bladed wind turbines [1]. This tool has now been extended to also handle complex offshore foundation types, such as jacket structures...

  20. A review of ADM1 extensions, applications, and analysis 2002-2005

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Keller, J.; Steyer, J.-P.

    2006-01-01

    hydrogen production from carbohydrate-type waste. Critical analysis of the model has mainly focused on model structure reduction, hydrogen inhibition functions, and the default parameter set recommended in the STR. This default parameter set has largely been verified as a reasonable compromise, especially for wastewater sludge digestion. One criticism of note is that the ADM1 stoichiometry focuses on catabolism rather than anabolism, which means that inorganic carbon can be used unrealistically as a carbon source during some anabolic reactions. Advances and novel applications have also been made in the present...

  1. DEBRISK, a Tool for Re-Entry Risk Analysis

    Science.gov (United States)

    Omaly, P.; Spel, M.

    2012-01-01

    An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach, in which a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads; this altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after demise of the parent object, as in the case of an object B contained in the interior of an object A and thus not exposed to the atmosphere. Each object is defined by its shape, attitude and dimensions; its materials along with their physical properties; and its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring; a newborn object inherits the state vector of the parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached, and the mass of melted material is computed from the excess heat and the material properties. After each step the amount of ablated material is determined using the lumped-mass approach and is peeled off from the object, updating mass and shape of the ...
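
    The lumped-mass bookkeeping described above can be sketched in a few lines: heat the fragment until its melting temperature, then convert any excess heat into ablated mass. This is a minimal illustration of the idea only; the heating-rate model, material constants, and function shown are hypothetical placeholders, not the CNES implementation.

```python
def step_demise(mass, temp, q_dot, area, dt,
                cp=900.0,        # specific heat, J/(kg K)  (assumed)
                t_melt=870.0,    # melting temperature, K   (assumed)
                h_fusion=4.0e5): # heat of fusion, J/kg     (assumed)
    """Advance one time step; return updated (mass, temp)."""
    heat_in = q_dot * area * dt            # J absorbed this step
    if temp < t_melt:
        # Heat the lump; cap at the melting temperature and carry
        # any excess heat into melting.
        dT = heat_in / (mass * cp)
        if temp + dT <= t_melt:
            return mass, temp + dT
        heat_in -= (t_melt - temp) * mass * cp
        temp = t_melt
    # At the melting point: excess heat ablates ("peels off") mass.
    melted = heat_in / h_fusion
    return max(mass - melted, 0.0), temp

# Example step: 1 kg fragment near melting, 50 kW/m^2 over 0.01 m^2.
print(step_demise(mass=1.0, temp=860.0, q_dot=5.0e4, area=0.01, dt=1.0))
```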

  2. ThermoData Engine: Extension to Solvent Design and Multi-component Process Stream Property Calculations with Uncertainty Analysis

    DEFF Research Database (Denmark)

    Diky, Vladimir; Chirico, Robert D.; Muzny, Chris

    ThermoData Engine (TDE, NIST Standard Reference Databases 103a and 103b) is the first product that implements the concept of dynamic data evaluation in the fields of thermophysics and thermochemistry. It maintains a comprehensive and up-to-date database of experimentally measured property values and an expert system for data analysis and generation of recommended property values at specified conditions, along with uncertainties, on demand. The most recent extension of TDE covers solvent design and multi-component process stream property calculations with uncertainty analysis. Selection is made by best efficiency (depending on the task: solubility, selectivity, distribution coefficient, etc.) and by matching other requirements specified by the user. At the user's request, efficiency criteria are evaluated based on experimental data for binary mixtures or predictive models (UNIFAC...

  3. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to provide valuable information to decision makers facing a company's marketing problems. The literature stresses the need to avoid multicollinearity in multivariate analysis, and the ability of Principal Component Analysis (PCA) to reduce a set of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents, step by step, the process of applying PCA in marketing research when a large number of naturally collinear variables are used.
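
    A minimal sketch of the workflow the paper walks through: standardize a set of deliberately collinear "survey" variables, extract uncorrelated principal components, and inspect the variance each one explains. The data and variable count are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
# Three deliberately collinear "survey" variables plus noise.
X = np.hstack([base + 0.1 * rng.normal(size=(200, 1)) for _ in range(3)])

Z = StandardScaler().fit_transform(X)   # PCA on standardized data
pca = PCA().fit(Z)
scores = pca.transform(Z)               # uncorrelated component scores

print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
# Off-diagonal correlations of the scores are ~0 by construction.
print("component correlations:\n", np.corrcoef(scores, rowvar=False).round(3))
```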

  4. dada - a web-based 2D detector analysis tool

    Science.gov (United States)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend providing unified access to 2D pixel detector image data stored by different detectors, in different file formats, and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, from pixel binning over azimuthal integration to raster scan processing. Users commonly interact with dada through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
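
    Because every analysis is addressable by a URI, batch processing reduces to generating query strings. A sketch of that idea follows; the endpoint, parameter names, and values are invented for illustration and the actual dada deployment defines its own URI scheme.

```python
from urllib.parse import urlencode

# Hypothetical analysis parameters; the real scheme is dada's own.
params = {
    "instrument": "ginix",   # hypothetical instrument name
    "scan": 1234,            # hypothetical scan number
    "task": "azimuthal",     # e.g. azimuthal integration
    "binning": 4,            # pixel binning factor
}
uri = "https://example.org/dada?" + urlencode(params)
print(uri)  # such URIs can be generated in a loop for batch processing
```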

  5. SCALE 5: Powerful new criticality safety analysis tools

    International Nuclear Information System (INIS)

    Bowman, Stephen M.; Hollenbach, Daniel F.; Dehart, Mark D.; Rearden, Bradley T.; Gauld, Ian C.; Goluoglu, Sedat

    2003-01-01

    Version 5 of the SCALE computer software system developed at Oak Ridge National Laboratory, scheduled for release in December 2003, contains several significant new modules and sequences for criticality safety analysis and marks the most important update to SCALE in more than a decade. This paper highlights the capabilities of these new modules and sequences, including continuous energy flux spectra for processing multigroup problem-dependent cross sections; one- and three-dimensional sensitivity and uncertainty analyses for criticality safety evaluations; two-dimensional flexible mesh discrete ordinates code; automated burnup-credit analysis sequence; and one-dimensional material distribution optimization for criticality safety. (author)

  6. Piping analysis for the life extension of Heavy Water Plant, Kota

    International Nuclear Information System (INIS)

    Mishra, Rajesh; Soni, R.S.; Kushwaha, H.S.; Venkat Raj, V.

    2001-02-01

    Heavy water production in India has achieved many milestones in the past. One of the most successfully running heavy water plants, situated at Kota (Rajasthan), will complete its design life in the near future. Heavy Water Plant, Kota is a hydrogen-sulfide-based plant. Various exercises have been planned with the aim of assessing the fatigue usage of the various components of the plant in order to extend its life. Considering the process parameters and the past history of plant performance, 25 process-critical nozzle locations and the connected piping systems were identified. Analyses have been carried out for these critical piping systems for two main kinds of loading, viz. sustained loads and expansion loads. Static analysis has been carried out to find the induced stress levels due to sustained as well as thermal expansion loading as per the design code ANSI B31.3. Due consideration is given to the design corrosion allowance while evaluating the stresses due to sustained loads. At locations where the induced stresses (S_L) due to the sustained loads exceed the allowable limits (S_h), the analysis has been repeated with a reduced corrosion allowance; this strategy is justified by the fact that corrosion measurements carried out at site at various critical locations show a very low rate of corrosion. Where the system qualifies with the reduced corrosion allowance, it is recommended to keep that location under periodic monitoring. The strategy adopted for the thermal expansion loading is to qualify the system as per the code allowable value (S_a). Where the stresses exceed the allowable value, credit is taken for the liberal allowable value suggested in the code, i.e., with the addition of the term (S_h - S_L) to the allowable stress value (S_a). If at any location it is found that the problem of high thermal stress still persists, the ...

  7. Analysis of Piping Systems for Life Extension of Heavy Water Plants in India

    International Nuclear Information System (INIS)

    Mishra, Rajesh K.; Soni, R.S.; Kushwaha, H.S.; Raj, V. Venkat

    2002-01-01

    Heavy water production in India has achieved many milestones in the past. Two of the successfully running heavy water plants are on the verge of completing their design life in the near future. One of these two plants, situated at Kota, is a hydrogen-sulfide-based plant; the other, at Tuticorin, is an ammonia-based plant. Various exercises have been planned with the aim of assessing the fatigue usage of the various components of these plants in order to extend their life. Considering the process parameters and the past history of plant performance, critical piping systems and equipment have been identified. Analyses have been carried out for these critical piping systems for two main kinds of loading, viz. sustained loads and expansion loads. Static analysis has been carried out to find the induced stress levels due to sustained as well as thermal expansion loading as per the design code ANSI B31.3. Due consideration has been given to the design corrosion allowance while evaluating the stresses due to sustained loads. At the locations where the induced stresses (S_L) due to the sustained loads exceed the allowable limits (S_h), the analysis has been repeated with a reduced corrosion allowance. This strategy is adopted in view of the fact that the thickness measurements carried out at site at various critical locations show a very low rate of corrosion. It has been possible to qualify the system with reduced corrosion allowance values; however, it is recommended to keep such locations under periodic monitoring. The strategy adopted for the thermal expansion loading is to qualify the system as per the code allowable value (S_a). If the stresses are more than the allowable value, credit is taken for the liberal allowable value suggested in the code, i.e., with the addition of the term (S_h - S_L) to the term 0.25 S_h. However, if at any location it is found that the thermal stress is high, fatigue analysis has ...
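
    The qualification logic described in these two records reduces to two inequalities per piping location. A minimal sketch with hypothetical stress values in MPa; the actual B31.3 rules (stress-range reduction factors, occasional loads, etc.) are richer than what is shown here.

```python
def qualify(S_L, S_E, S_h, S_a, use_liberal_allowable=False):
    """Return (sustained_ok, expansion_ok) for one piping location.

    S_L: induced stress due to sustained loads
    S_E: thermal expansion stress range
    S_h: basic allowable stress at the hot condition
    S_a: allowable displacement (expansion) stress range
    """
    sustained_ok = S_L <= S_h
    # Liberal allowable: credit the unused sustained margin (S_h - S_L).
    allowable = S_a + max(S_h - S_L, 0.0) if use_liberal_allowable else S_a
    expansion_ok = S_E <= allowable
    return sustained_ok, expansion_ok

# Hypothetical location that fails the basic expansion check but
# passes once the liberal allowable is credited.
print(qualify(S_L=80.0, S_E=190.0, S_h=110.0, S_a=180.0))
print(qualify(S_L=80.0, S_E=190.0, S_h=110.0, S_a=180.0,
              use_liberal_allowable=True))
```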

  8. Analysis and Extension of the PCA Method, Estimating a Noise Curve from a Single Image

    Directory of Open Access Journals (Sweden)

    Miguel Colom

    2016-12-01

    Full Text Available In the article 'Image Noise Level Estimation by Principal Component Analysis', S. Pyatykh, J. Hesser, and L. Zheng propose a new method to estimate the variance of the noise in an image from the eigenvalues of the covariance matrix of the overlapping blocks of the noisy image. Instead of using all the patches of the noisy image, the authors propose an iterative strategy to adaptively choose the optimal set containing the patches with the lowest variance. Although the method measures uniform Gaussian noise, it can easily be adapted to deal with signal-dependent noise, which is realistic for the Poisson noise model produced by a CMOS or CCD device in a digital camera.
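
    The core idea can be demonstrated in a few lines: the smallest eigenvalues of the covariance matrix of overlapping patches approximate the noise variance, because the clean signal concentrates in the leading eigenvectors. This sketch omits the paper's iterative low-variance patch selection; the patch size and test image are arbitrary choices.

```python
import numpy as np

def noise_variance_pca(img, patch=5):
    h, w = img.shape
    # Collect all overlapping patch x patch blocks as row vectors.
    blocks = np.array([
        img[i:i + patch, j:j + patch].ravel()
        for i in range(h - patch + 1)
        for j in range(w - patch + 1)
    ])
    cov = np.cov(blocks, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)   # ascending order
    return eigvals[0]                   # smallest eigenvalue ~ sigma^2

rng = np.random.default_rng(1)
clean = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
noisy = clean + rng.normal(scale=0.05, size=clean.shape)
print(noise_variance_pca(noisy))        # close to 0.05**2 = 0.0025
```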

  9. Methods for Automating Analysis of Glacier Morphology for Regional Modelling: Centerlines, Extensions, and Elevation Bands

    Science.gov (United States)

    Viger, R. J.; Van Beusekom, A. E.

    2016-12-01

    The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically based, spatially distributed, daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a fully automated variation of logic previously presented in the literature for defining the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable; instead, a path is derived from a cost function. Although only a single path is presented in our results, the method can easily be modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows; more usefully, it can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps, including synthesizing the glacier centerline network with one developed from a traditional DEM analysis and ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
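
    A sketch of a cost-function centerline trace of the kind described above, as an alternative to following steepest descent on a possibly convex surface. The cost surface used here (inverted distance from the glacier margin, so travel is cheap near the medial axis) and the endpoints are illustrative choices, not the authors' formulation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.graph import route_through_array

mask = np.zeros((60, 40), dtype=bool)
mask[5:55, 10:30] = True                 # toy glacier outline

# Cheap to travel near the glacier's medial axis, expensive near edges.
dist = distance_transform_edt(mask)
cost = dist.max() - dist + 1.0
cost[~mask] = 1e9                        # effectively never leave the glacier

head, terminus = (5, 20), (54, 20)       # hypothetical endpoints
path, total_cost = route_through_array(cost, head, terminus,
                                       fully_connected=True, geometric=True)
print(len(path), "cells along the derived centerline")
```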

  10. Population genomic analysis of elongated skulls reveals extensive female-biased immigration in Early Medieval Bavaria.

    Science.gov (United States)

    Veeramah, Krishna R; Rott, Andreas; Groß, Melanie; van Dorp, Lucy; López, Saioa; Kirsanow, Karola; Sell, Christian; Blöcher, Jens; Wegmann, Daniel; Link, Vivian; Hofmanová, Zuzana; Peters, Joris; Trautmann, Bernd; Gairhos, Anja; Haberstroh, Jochen; Päffgen, Bernd; Hellenthal, Garrett; Haas-Gebhard, Brigitte; Harbeck, Michaela; Burger, Joachim

    2018-03-12

    Modern European genetic structure demonstrates strong correlations with geography, while genetic analysis of prehistoric humans has indicated at least two major waves of immigration from outside the continent during periods of cultural change. However, population-level genome data that could shed light on the demographic processes occurring during the intervening periods have been absent. Therefore, we generated genomic data from 41 individuals dating mostly to the late 5th/early 6th century AD from present-day Bavaria in southern Germany, including 11 whole genomes (mean depth 5.56×). In addition we developed a capture array to sequence neutral regions spanning a total of 5 Mb and 486 functional polymorphic sites to high depth (mean 72×) in all individuals. Our data indicate that while men generally had ancestry that closely resembles modern northern and central Europeans, women exhibit a very high genetic heterogeneity; this includes signals of genetic ancestry ranging from western Europe to East Asia. Particularly striking are women with artificial skull deformations; the analysis of their collective genetic ancestry suggests an origin in southeastern Europe. In addition, functional variants indicate that they also differed in visible characteristics. This example of female-biased migration indicates that complex demographic processes during the Early Medieval period may have contributed in an unexpected way to shape the modern European genetic landscape. Examination of the panel of functional loci also revealed that many alleles associated with recent positive selection were already at modern-like frequencies in European populations ∼1,500 years ago. Copyright © 2018 the Author(s). Published by PNAS.

  11. Cluster analysis as a prediction tool for pregnancy outcomes.

    Science.gov (United States)

    Banjari, Ines; Kenjerić, Daniela; Šolić, Krešimir; Mandić, Milena L

    2015-03-01

    Considering the specific physiological changes during gestation, and thinking of pregnancy as a "critical window", classification of pregnant women in early pregnancy can be considered crucial. The paper demonstrates the use of a method from intelligent data mining, cluster analysis: a statistical method that makes it possible to group individuals based on sets of identifying variables. The method was chosen to determine whether pregnant women can be classified in early pregnancy and to analyze unknown correlations between different variables so that certain outcomes can be predicted. 222 pregnant women from two general obstetric offices were recruited. The focus was set on three characteristics of these pregnant women: their age, pre-pregnancy body mass index (BMI) and haemoglobin value. Cluster analysis achieved a 94.1% classification accuracy rate, with three branches or groups of pregnant women showing statistically significant correlations with pregnancy outcomes. The results show that pregnant women of both older age and higher pre-pregnancy BMI have a significantly higher incidence of delivering babies of higher birth weight, yet gain significantly less weight during pregnancy. Their babies are also longer, and these women have a significantly higher probability of complications during pregnancy (gestosis) and of induced or caesarean delivery. We conclude that the cluster analysis method can appropriately classify pregnant women in early pregnancy to predict certain outcomes.
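
    A minimal sketch of the kind of clustering used here: grouping women on age, pre-pregnancy BMI, and haemoglobin into three groups. K-means stands in for the study's specific algorithm, and the 222 records below are synthetic, not the study data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Columns: age (years), pre-pregnancy BMI (kg/m^2), haemoglobin (g/L)
X = np.column_stack([
    rng.normal(29, 5, 222),
    rng.normal(24, 4, 222),
    rng.normal(125, 10, 222),
])

Z = StandardScaler().fit_transform(X)   # put variables on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)

# Per-cluster means, to be cross-tabulated against pregnancy outcomes.
for k in range(3):
    print(k, X[labels == k].mean(axis=0).round(1))
```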

  12. EMGTools, an adaptive and versatile tool for detailed EMG analysis

    DEFF Research Database (Denmark)

    Nikolic, M; Krarup, C

    2010-01-01

    We have developed an EMG decomposition system called EMGTools that can extract the constituent MUAPs and firing patterns for quantitative analysis from the EMG signal recorded at slight effort for clinical evaluation. The aim was to implement a robust system able to handle the challenges...

  13. Gipsy 3D: Analysis, Visualization and VO-Tools

    NARCIS (Netherlands)

    Ruiz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; Hulst, J. M. van der

    2009-01-01

    The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing

  14. Television as an Instructional Tool for Concept Analysis

    Science.gov (United States)

    Benwari, Nnenna Ngozi

    2015-01-01

    This is a study of teachers' perceptions of the use of television for concept analysis in the classroom. The population of the study comprises all 9,784 secondary school teachers in Bayelsa State of Nigeria, out of which 110 teachers were randomly selected using the proportional sampling method. The instrument is a questionnaire designed by the…

  15. Principal Component Analysis: Most Favourite Tool in Chemometrics

    Indian Academy of Sciences (India)

    differentiate olive oils from non-olive vegetable oils. Moreover, manual analysis of such a large volume of data is laborious and time-consuming, and may not provide any meaningful interpretation. [Figure 4: amount of variance captured by different principal components (PCs); the plot indicates that the first two PCs are sufficient to ...]

  16. Principal Component Analysis: Most Favourite Tool in Chemometrics

    Indian Academy of Sciences (India)

    Principal component analysis (PCA) is the most commonly used chemometric technique. It is an unsupervised pattern-recognition technique. PCA has found applications in chemistry, biology, medicine and economics. The present work attempts to understand how PCA works and how we can interpret its results.

  17. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of the mass of information, and to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, safeguarding communities, organizations, infrastructures, and investments. The collaborative intelligence analysis environment delivered by i2 is specifically designed to be scalable (supporting business needs as well as operational and end-user environments), modular (an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics), interoperable (integrating with existing environments and easing information sharing across partner agencies), and extendable (providing an open-source developer essential toolkit, examples, and documentation for custom requirements). i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster, informed decision making. (author)

  18. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.; Lester, P.B.

    1998-05-01

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model.

  19. Evaluation of fatigue damage in nuclear power plants: evolution and new tools of analysis

    International Nuclear Information System (INIS)

    Cicero, R.; Corchon, F.

    2011-01-01

    This paper presents the new fatigue mechanisms that require analysis, the tools developed for their evaluation, and the latest trends and studies currently under way in the nuclear field, which enable proper management of this degradation mechanism in the facilities concerned.

  20. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components, either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of this wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model, and to provide a quantitative assessment of the stress analysis.
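
    A sketch of the decomposition idea: reduce two full-field strain maps (model and experiment) to a few tens of descriptors and compare those instead of ~10^5 pixels. Low-order 2D Fourier coefficient magnitudes stand in here for the Zernike/Fourier descriptors named in the abstract, and the strain fields are synthetic.

```python
import numpy as np

def descriptors(field, n=8):
    """Magnitudes of the n x n lowest-frequency Fourier coefficients."""
    coeffs = np.fft.fft2(field)
    return np.abs(coeffs[:n, :n]).ravel()   # 64 numbers per field

y, x = np.mgrid[0:128, 0:128] / 128.0
model = np.sin(3 * np.pi * x) * y            # synthetic "strain" fields
experiment = model + np.random.default_rng(2).normal(0, 0.01, model.shape)

d_m, d_e = descriptors(model), descriptors(experiment)
# One simple quantitative agreement measure between the two sets.
print("relative difference:",
      np.linalg.norm(d_m - d_e) / np.linalg.norm(d_m))
```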