WorldWideScience

Sample records for identifying analysis tools

  1. Using the Contextual Hub Analysis Tool (CHAT) in Cytoscape to Identify Contextually Relevant Network Hubs.

    Science.gov (United States)

    Muetze, Tanja; Lynn, David J

    2017-09-13

    Highly connected nodes in biological networks are called network hubs. Hubs are topologically important to the structure of the network and have been shown to be preferentially associated with a range of phenotypes of interest. The relative importance of a hub node, however, can change depending on the biological context. Here, we provide a step-by-step protocol for using the Contextual Hub Analysis Tool (CHAT), an application within Cytoscape 3, which enables users to easily construct and visualize a network of interactions from a gene or protein list of interest, integrate contextual information, such as gene or protein expression data, and identify hub nodes that are more highly connected to contextual nodes than expected by chance. © 2017 by John Wiley & Sons, Inc.

  2. Contextual Hub Analysis Tool (CHAT): A Cytoscape app for identifying contextually relevant hubs in biological networks.

    Science.gov (United States)

    Muetze, Tanja; Goenawan, Ivan H; Wiencko, Heather L; Bernal-Llinares, Manuel; Bryan, Kenneth; Lynn, David J

    2016-01-01

    Highly connected nodes (hubs) in biological networks are topologically important to the structure of the network and have also been shown to be preferentially associated with a range of phenotypes of interest. The relative importance of a hub node, however, can change depending on the biological context. Here, we report a Cytoscape app, the Contextual Hub Analysis Tool (CHAT), which enables users to easily construct and visualize a network of interactions from a gene or protein list of interest, integrate contextual information, such as gene expression or mass spectrometry data, and identify hub nodes that are more highly connected to contextual nodes (e.g. genes or proteins that are differentially expressed) than expected by chance. In a case study, we use CHAT to construct a network of genes that are differentially expressed in Dengue fever, a viral infection. CHAT was used to identify and compare contextual and degree-based hubs in this network. The top 20 degree-based hubs were enriched in pathways related to the cell cycle and cancer, which is likely due to the fact that proteins involved in these processes tend to be highly connected in general. In comparison, the top 20 contextual hubs were enriched in pathways commonly observed in a viral infection including pathways related to the immune response to viral infection. This analysis shows that such contextual hubs are considerably more biologically relevant than degree-based hubs and that analyses which rely on the identification of hubs solely based on their connectivity may be biased towards nodes that are highly connected in general rather than in the specific context of interest. CHAT is available for Cytoscape 3.0+ and can be installed via the Cytoscape App Store (http://apps.cytoscape.org/apps/chat).
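
    The test described in this abstract (a node more highly connected to contextual nodes than expected by chance) can be sketched with a hypergeometric enrichment test. This is an illustrative sketch, not CHAT's source code; the function name and example numbers are hypothetical.

    ```python
    # Hedged sketch: score one node's contextual-hub enrichment with a
    # hypergeometric test over its neighborhood.
    from scipy.stats import hypergeom

    def contextual_hub_pvalue(degree, contextual_neighbors,
                              network_size, contextual_total):
        # P(X >= contextual_neighbors) when drawing `degree` neighbors at
        # random from a network containing `contextual_total` contextual nodes
        return hypergeom.sf(contextual_neighbors - 1, network_size,
                            contextual_total, degree)

    # Example: a node with 30 neighbors, 12 of them differentially expressed,
    # in a 2000-node network containing 200 contextual (DE) nodes.
    print(contextual_hub_pvalue(30, 12, 2000, 200))
    ```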

  3. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (JAVA code and executable file) was developed and tested to support an analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting. The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Their careful use, and the comparison of results, will be a valuable addition to the nuanced understanding of multimorbidity.
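
    The combination/permutation distinction the Tool computes can be sketched compactly, assuming each record is an onset-ordered list of diagnoses. The records below are invented placeholders, not data from the study.

    ```python
    # Hedged sketch: a "combination" ignores order of diagnosis while a
    # "permutation" preserves it, so the permutation count is never smaller.
    from collections import Counter

    records = [
        ["diabetes", "hypertension", "arthritis"],   # ordered by onset
        ["hypertension", "diabetes", "arthritis"],   # same set, new order
        ["copd", "hypertension"],
    ]

    combinations = Counter(frozenset(r) for r in records)
    permutations = Counter(tuple(r) for r in records)

    print(len(combinations), "unique combinations")  # -> 2
    print(len(permutations), "unique permutations")  # -> 3
    ```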

  4. Social Network Analysis: A Simple but Powerful Tool for Identifying Teacher Leaders

    Science.gov (United States)

    Smith, P. Sean; Trygstad, Peggy J.; Hayes, Meredith L.

    2018-01-01

    Instructional teacher leadership is central to a vision of distributed leadership. However, identifying instructional teacher leaders can be a daunting task, particularly for administrators who find themselves either newly appointed or faced with high staff turnover. This article describes the use of social network analysis (SNA), a simple but…

  5. Messina: a novel analysis tool to identify biologically relevant molecules in disease.

    Directory of Open Access Journals (Sweden)

    Mark Pinese

    BACKGROUND: Morphologically similar cancers display heterogeneous patterns of molecular aberrations and follow substantially different clinical courses. This diversity has become the basis for the definition of molecular phenotypes, with significant implications for therapy. Microarray or proteomic expression profiling is conventionally employed to identify disease-associated genes; however, traditional approaches for the analysis of profiling experiments may miss molecular aberrations which define biologically relevant subtypes. METHODOLOGY/PRINCIPAL FINDINGS: Here we present Messina, a method that can identify those genes that only sometimes show aberrant expression in cancer. We demonstrate with simulated data that Messina is highly sensitive and specific when used to identify genes which are aberrantly expressed in only a proportion of cancers, and compare Messina to contemporary analysis techniques. We illustrate Messina by using it to detect the aberrant expression of a gene that may play an important role in pancreatic cancer. CONCLUSIONS/SIGNIFICANCE: Messina allows the detection of genes with profiles typical of markers of molecular subtype, and complements existing methods to assist the identification of such markers. Messina is applicable to any global expression profiling data, and to allow its easy application has been packaged into a freely-available stand-alone software package.

  6. Power analysis as a tool to identify statistically informative indicators for monitoring coral reef disturbances.

    Science.gov (United States)

    Van Wynsberge, Simon; Gilbert, Antoine; Guillemot, Nicolas; Heintz, Tom; Tremblay-Boyer, Laura

    2017-07-01

    Extensive biological field surveys are costly and time consuming. To optimize sampling and ensure regular monitoring on the long term, identifying informative indicators of anthropogenic disturbances is a priority. In this study, we generated 1800 candidate indicators by combining metrics measured from coral, fish, and macro-invertebrate assemblages surveyed from 2006 to 2012 in the vicinity of an ongoing mining project in the Voh-Koné-Pouembout lagoon, New Caledonia. We performed a power analysis to identify a subset of indicators which would best discriminate temporal changes due to a simulated chronic anthropogenic impact. Only 4% of tested indicators were likely to detect a 10% annual decrease of values with sufficient power (>0.80). Corals generally provided higher statistical power than macro-invertebrates and fishes because of lower natural variability and higher occurrence. For the same reasons, higher taxonomic ranks provided higher power than lower taxonomic ranks. Nevertheless, a number of families of common sedentary or sessile macro-invertebrates and fishes also performed well in detecting changes: Echinometridae, Isognomidae, Muricidae, Tridacninae, Arcidae, and Turbinidae for macro-invertebrates and Pomacentridae, Labridae, and Chaetodontidae for fishes. Interestingly, these families did not provide high power in all geomorphological strata, suggesting that the ability of indicators to detect anthropogenic impacts was closely linked to reef geomorphology. This study provides a first operational step toward identifying statistically relevant indicators of anthropogenic disturbances in New Caledonia's coral reefs, which can be useful in similar tropical reef ecosystems where little information is available regarding the responses of ecological indicators to anthropogenic disturbances.
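
    The power-analysis logic described here can be illustrated with a small simulation: impose a 10% annual decline on an indicator with a given natural variability and count how often a trend test detects it. This is a hedged sketch with illustrative parameters, not the authors' code.

    ```python
    # Hedged sketch: simulation-based power to detect a 10% annual decline
    # in a noisy indicator via a log-linear trend test.
    import numpy as np
    from scipy import stats

    def power(years=7, cv=0.3, decline=0.10, alpha=0.05, nsim=2000, seed=1):
        rng = np.random.default_rng(seed)
        t = np.arange(years)
        mean = (1.0 - decline) ** t              # 10% decrease per year
        hits = 0
        for _ in range(nsim):
            y = mean * rng.lognormal(0.0, cv, size=years)  # natural variability
            slope, _, _, p, _ = stats.linregress(t, np.log(y))
            hits += (p < alpha) and (slope < 0)
        return hits / nsim

    print(power(cv=0.2), power(cv=0.6))  # lower variability -> higher power
    ```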

  7. Identifying biological landmarks using a novel cell measuring image analysis tool: Cell-o-Tape

    Directory of Open Access Journals (Sweden)

    French Andrew P

    2012-03-01

    Abstract Background The ability to quantify the geometry of plant organs at the cellular scale can provide novel insights into their structural organization. Hitherto, manual methods of measurement have provided only low-throughput, subjective solutions, and quantitative measurements are often neglected in favour of a simple cell count. Results We present a tool to count and measure individual neighbouring cells along a defined file in confocal laser scanning microscope images. The tool allows the user to extract this generic information in a flexible and intuitive manner, and builds on the raw data to detect a significant change in cell length along the file. This facility can be used, for example, to provide an estimate of the position of transition into the elongation zone of an Arabidopsis root, traditionally a location sensitive to the subjectivity of the experimenter. Conclusions Cell-o-Tape is shown to locate cell walls with a high degree of accuracy and estimate the location of the transition feature point in good agreement with human experts. The tool is an open source ImageJ/Fiji macro and is available online.
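
    The feature-point detection described (a significant change in cell length along the file) can be sketched outside ImageJ by scanning candidate split points with Welch's t-test. The function and the cell lengths below are hypothetical illustrations, not the Cell-o-Tape implementation.

    ```python
    # Hedged sketch: find the index along a file of cell lengths where the
    # mean length changes most significantly (e.g. a root's elongation zone).
    import numpy as np
    from scipy import stats

    def transition_index(lengths, min_cells=3):
        lengths = np.asarray(lengths, dtype=float)
        best_i, best_p = None, 1.0
        for i in range(min_cells, len(lengths) - min_cells):
            _, p = stats.ttest_ind(lengths[:i], lengths[i:], equal_var=False)
            if p < best_p:
                best_i, best_p = i, p
        return best_i, best_p

    cells = [8, 9, 8, 10, 9, 11, 25, 30, 32, 35, 38]  # lengths in µm, made up
    print(transition_index(cells))
    ```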

  8. SigTree: A Microbial Community Analysis Tool to Identify and Visualize Significantly Responsive Branches in a Phylogenetic Tree

    OpenAIRE

    Stevens, John R.; Jones, Todd R.; Lefevre, Michael; Ganesan, Balasubramanian; Weimer, Bart C.

    2017-01-01

    Microbial community analysis experiments to assess the effect of a treatment intervention (or environmental change) on the relative abundance levels of multiple related microbial species (or operational taxonomic units) simultaneously using high throughput genomics are becoming increasingly common. Within the framework of the evolutionary phylogeny of all species considered in the experiment, this translates to a statistical need to identify the phylogenetic branches that exhibit a significant...
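
    One standard way to test whether an entire phylogenetic branch responds significantly, in the spirit of this abstract, is to combine the tip-level p-values, e.g. by Fisher's method. This is a generic illustration and is not asserted to be SigTree's exact procedure.

    ```python
    # Hedged sketch: combine tip p-values for one branch via Fisher's method.
    import numpy as np
    from scipy import stats

    def branch_pvalue(tip_pvalues):
        # Under H0, -2 * sum(log p_i) ~ chi-square with 2k degrees of freedom
        stat = -2.0 * np.sum(np.log(tip_pvalues))
        return stats.chi2.sf(stat, 2 * len(tip_pvalues))

    print(branch_pvalue([0.04, 0.01, 0.30]))  # tips of one hypothetical branch
    ```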

  9. Identifying like-minded audiences for global warming public engagement campaigns: an audience segmentation analysis and tool development.

    Directory of Open Access Journals (Sweden)

    Edward W Maibach

    2011-03-01

    Achieving national reductions in greenhouse gas emissions will require public support for climate and energy policies and changes in population behaviors. Audience segmentation--a process of identifying coherent groups within a population--can be used to improve the effectiveness of public engagement campaigns. In Fall 2008, we conducted a nationally representative survey of American adults (n = 2,164) to identify audience segments for global warming public engagement campaigns. By subjecting multiple measures of global warming beliefs, behaviors, policy preferences, and issue engagement to latent class analysis, we identified six distinct segments ranging in size from 7 to 33% of the population. These six segments formed a continuum, from a segment of people who were highly worried, involved and supportive of policy responses (18%), to a segment of people who were completely unconcerned and strongly opposed to policy responses (7%). Three of the segments (totaling 70%) were to varying degrees concerned about global warming and supportive of policy responses, two (totaling 18%) were unsupportive, and one was largely disengaged (12%), having paid little attention to the issue. Certain behaviors and policy preferences varied greatly across these audiences, while others did not. Using discriminant analysis, we subsequently developed 36-item and 15-item instruments that can be used to categorize respondents with 91% and 84% accuracy, respectively. In late 2008, Americans supported a broad range of policies and personal actions to reduce global warming, although there was wide variation among the six identified audiences. To enhance the impact of campaigns, government agencies, non-profit organizations, and businesses seeking to engage the public can selectively target one or more of these audiences rather than address an undifferentiated general population. Our screening instruments are available to assist in that process.
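
    As a hedged sketch of the screening-instrument step (a discriminant model fitted on survey items so a short battery can assign respondents to previously identified segments): the data below are random placeholders, not the survey items, and the accuracy printed is meaningless for random labels.

    ```python
    # Hedged sketch: discriminant analysis as a segment-assignment screener.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    X = rng.integers(1, 5, size=(500, 15)).astype(float)  # 15 survey items
    segments = rng.integers(0, 6, size=500)               # 6 audience segments

    lda = LinearDiscriminantAnalysis().fit(X, segments)
    print(lda.predict(X[:5]))       # segment assignments for new respondents
    print(lda.score(X, segments))   # in-sample classification accuracy
    ```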

  10. Identifying like-minded audiences for global warming public engagement campaigns: an audience segmentation analysis and tool development.

    Science.gov (United States)

    Maibach, Edward W; Leiserowitz, Anthony; Roser-Renouf, Connie; Mertz, C K

    2011-03-10

    Achieving national reductions in greenhouse gas emissions will require public support for climate and energy policies and changes in population behaviors. Audience segmentation--a process of identifying coherent groups within a population--can be used to improve the effectiveness of public engagement campaigns. In Fall 2008, we conducted a nationally representative survey of American adults (n = 2,164) to identify audience segments for global warming public engagement campaigns. By subjecting multiple measures of global warming beliefs, behaviors, policy preferences, and issue engagement to latent class analysis, we identified six distinct segments ranging in size from 7 to 33% of the population. These six segments formed a continuum, from a segment of people who were highly worried, involved and supportive of policy responses (18%), to a segment of people who were completely unconcerned and strongly opposed to policy responses (7%). Three of the segments (totaling 70%) were to varying degrees concerned about global warming and supportive of policy responses, two (totaling 18%) were unsupportive, and one was largely disengaged (12%), having paid little attention to the issue. Certain behaviors and policy preferences varied greatly across these audiences, while others did not. Using discriminant analysis, we subsequently developed 36-item and 15-item instruments that can be used to categorize respondents with 91% and 84% accuracy, respectively. In late 2008, Americans supported a broad range of policies and personal actions to reduce global warming, although there was wide variation among the six identified audiences. To enhance the impact of campaigns, government agencies, non-profit organizations, and businesses seeking to engage the public can selectively target one or more of these audiences rather than address an undifferentiated general population. Our screening instruments are available to assist in that process.

  11. Identifying Like-Minded Audiences for Global Warming Public Engagement Campaigns: An Audience Segmentation Analysis and Tool Development

    Science.gov (United States)

    Maibach, Edward W.; Leiserowitz, Anthony; Roser-Renouf, Connie; Mertz, C. K.

    2011-01-01

    Background Achieving national reductions in greenhouse gas emissions will require public support for climate and energy policies and changes in population behaviors. Audience segmentation – a process of identifying coherent groups within a population – can be used to improve the effectiveness of public engagement campaigns. Methodology/Principal Findings In Fall 2008, we conducted a nationally representative survey of American adults (n = 2,164) to identify audience segments for global warming public engagement campaigns. By subjecting multiple measures of global warming beliefs, behaviors, policy preferences, and issue engagement to latent class analysis, we identified six distinct segments ranging in size from 7 to 33% of the population. These six segments formed a continuum, from a segment of people who were highly worried, involved and supportive of policy responses (18%), to a segment of people who were completely unconcerned and strongly opposed to policy responses (7%). Three of the segments (totaling 70%) were to varying degrees concerned about global warming and supportive of policy responses, two (totaling 18%) were unsupportive, and one was largely disengaged (12%), having paid little attention to the issue. Certain behaviors and policy preferences varied greatly across these audiences, while others did not. Using discriminant analysis, we subsequently developed 36-item and 15-item instruments that can be used to categorize respondents with 91% and 84% accuracy, respectively. Conclusions/Significance In late 2008, Americans supported a broad range of policies and personal actions to reduce global warming, although there was wide variation among the six identified audiences. To enhance the impact of campaigns, government agencies, non-profit organizations, and businesses seeking to engage the public can selectively target one or more of these audiences rather than address an undifferentiated general population. Our screening instruments are available to assist in that process.

  12. MelanomaDB: a Web Tool for Integrative Analysis of Melanoma Genomic Information to Identify Disease-Associated Molecular Pathways

    Directory of Open Access Journals (Sweden)

    Alexander Joseph Trevarton

    2013-07-01

    Despite on-going research, metastatic melanoma survival rates remain low and treatment options are limited. Researchers can now access a rapidly growing amount of molecular and clinical information about melanoma. This information is becoming difficult to assemble and interpret due to its dispersed nature, yet as it grows it becomes increasingly valuable for understanding melanoma. Integration of this information into a comprehensive resource to aid rational experimental design and patient stratification is needed. As an initial step in this direction, we have assembled a web-accessible melanoma database, MelanomaDB, which incorporates clinical and molecular data from publicly available sources, which will be regularly updated as new information becomes available. This database allows complex links to be drawn between many different aspects of melanoma biology: genetic changes (e.g. mutations) in individual melanomas revealed by DNA sequencing, associations between gene expression and patient survival, data concerning drug targets, biomarkers, druggability and clinical trials, as well as our own statistical analysis of relationships between molecular pathways and clinical parameters that have been produced using these data sets. The database is freely available at http://genesetdb.auckland.ac.nz/melanomadb/about.html. A subset of the information in the database can also be accessed through a freely available web application in the Illumina genomic cloud computing platform BaseSpace at http://www.biomatters.com/apps/melanoma-profiler-for-research. This illustrates dysregulation of specific signalling pathways, both across 310 exome-sequenced melanomas and in individual tumours, and identifies novel features about the distribution of somatic variants in melanoma. We suggest that this database can provide a context in which to interpret the tumour molecular profiles of individual melanoma patients relative to biological information and available

  13. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  14. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs the oscillation analysis and identifies modes of oscillations (frequency, damping, energy, and shape). The tool also does oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
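
    Mode identification of the kind OBAT reports (frequency and damping of an oscillation) can be sketched on a synthetic ringdown. The method below (FFT peak plus Hilbert-envelope decay fit) is one illustrative approach, not PNNL's implementation, and all parameters are invented.

    ```python
    # Hedged sketch: estimate modal frequency and damping from a ringdown.
    import numpy as np
    from scipy.signal import hilbert

    fs = 30.0                        # assumed PMU sampling rate, Hz
    t = np.arange(0, 20, 1 / fs)
    f0, zeta = 0.7, 0.05             # true mode: 0.7 Hz, 5% damping
    wn = 2 * np.pi * f0
    y = np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1 - zeta**2) * t)

    # frequency from the FFT peak (DC bin excluded)
    spec = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), 1 / fs)
    f_est = freqs[np.argmax(spec[1:]) + 1]

    # decay rate from a log-linear fit of the analytic-signal envelope
    env = np.abs(hilbert(y))
    sigma = -np.polyfit(t, np.log(env + 1e-12), 1)[0]
    zeta_est = sigma / np.sqrt(sigma**2 + (2 * np.pi * f_est) ** 2)
    print(f_est, zeta_est)           # should recover ~0.7 Hz and ~0.05
    ```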

  15. Construct validity of the Heart Failure Screening Tool (Heart-FaST) to identify heart failure patients at risk of poor self-care: Rasch analysis.

    Science.gov (United States)

    Reynolds, Nicholas A; Ski, Chantal F; McEvedy, Samantha M; Thompson, David R; Cameron, Jan

    2018-02-14

    The aim of this study was to psychometrically evaluate the Heart Failure Screening Tool (Heart-FaST) via: (1) examination of internal construct validity; (2) testing of scale function in accordance with design; and (3) recommendation of changes, where items are not well adjusted, to improve its psychometric credentials. Self-care is vital to the management of heart failure. The Heart-FaST may provide a prospective assessment of risk, regarding the likelihood that patients with heart failure will engage in self-care. Psychometric validation of the Heart-FaST using Rasch analysis. The Heart-FaST was administered to 135 patients (median age = 68, IQR = 59-78 years; 105 males) enrolled in a multidisciplinary heart failure management program. The Heart-FaST is a nurse-administered tool for screening patients with HF at risk of poor self-care. A Rasch analysis of responses was conducted which tested data against Rasch model expectations, including whether items serve as unbiased, non-redundant indicators of risk and measure a single construct and that rating scales operate as intended. The results showed that data met Rasch model expectations after rescoring or deleting items due to poor discrimination, disordered thresholds, differential item functioning, or response dependence. There was no evidence of multidimensionality which supports the use of total scores from Heart-FaST as indicators of risk. Aggregate scores from this modified screening tool rank heart failure patients according to their "risk of poor self-care" demonstrating that the Heart-FaST items constitute a meaningful scale to identify heart failure patients at risk of poor engagement in heart failure self-care. © 2018 John Wiley & Sons Ltd.
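
    For orientation, Rasch analysis tests responses against a latent-trait model. In its simplest dichotomous form (the Heart-FaST items are polytomous rating scales, so the fitted model is an extension of this), the probability that person n endorses item i is

    $$P(X_{ni}=1 \mid \theta_n, \delta_i) = \frac{e^{\theta_n-\delta_i}}{1+e^{\theta_n-\delta_i}},$$

    where $\theta_n$ is the person's location on the construct (here, risk of poor self-care) and $\delta_i$ is the item difficulty. Fit statistics, threshold ordering, and differential item functioning are all assessed against this model's expectations.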

  16. Development and analysis of a low-cost screening tool to identify and classify hearing loss in children: a proposal for developing countries

    Directory of Open Access Journals (Sweden)

    Alessandra Giannella Samelli

    2011-01-01

    OBJECTIVE: A lack of attention has been given to hearing health in primary care in developing countries. A strategy involving low-cost screening tools may fill the current gap in hearing health care provided to children. Therefore, it is necessary to establish and adopt lower-cost procedures that are accessible to underserved areas that lack other physical or human resources that would enable the identification of groups at risk for hearing loss. The aim of this study was to develop and analyze the efficacy of a low-cost screening tool to identify and classify hearing loss in children. METHODS: A total of 214 children aged 2 to 10 years participated in this study. The study was conducted by providing a questionnaire to the parents and comparing the answers with the results of a complete audiological assessment. Receiver operating characteristic (ROC) curves were constructed, and discriminant analysis techniques were used to classify each child based on the total score. RESULTS: We found conductive hearing loss in 39.3% of children, sensorineural hearing loss in 7.4% and normal hearing in 53.3%. The discriminant analysis technique provided the following classification rule for the total score on the questionnaire: 0 to 4 points - normal hearing; 5 to 7 points - conductive hearing loss; over 7 points - sensorineural hearing loss. CONCLUSION: Our results suggest that the questionnaire could be used as a screening tool to classify children as having normal hearing or hearing loss, and according to the type of hearing loss, based on the total questionnaire score.
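
    The classification rule reported in the Results maps directly to code; the sketch below simply encodes the published cut-offs (the function name is ours).

    ```python
    # Sketch of the published screening rule: map a parent-questionnaire
    # total score to a screening outcome.
    def classify_hearing(score: int) -> str:
        if score <= 4:
            return "normal hearing"               # 0 to 4 points
        if score <= 7:
            return "conductive hearing loss"      # 5 to 7 points
        return "sensorineural hearing loss"       # over 7 points

    print(classify_hearing(3), classify_hearing(6), classify_hearing(9))
    ```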

  17. Twitter as a Potential Disaster Risk Reduction Tool. Part II: Descriptive Analysis of Identified Twitter Activity during the 2013 Hattiesburg F4 Tornado.

    Science.gov (United States)

    Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo

    2015-06-29

    This article describes a novel triangulation methodological approach for identifying the Twitter activity of regional active Twitter users during the 2013 Hattiesburg EF-4 Tornado. A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hours pre- and post-tornado. The data were further validated using a six sigma approach utilizing GPS data. The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2,637 tweets with GPS coordinates. Tweet activity increased 5-fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2-fold. Tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool for the 2013 Hattiesburg EF-4 Tornado.

  18. BENCHMARKING - PRACTICAL TOOLS IDENTIFY KEY SUCCESS FACTORS

    Directory of Open Access Journals (Sweden)

    Olga Ju. Malinina

    2016-01-01

    The article gives a practical example of the application of benchmarking techniques. The selected object of study is the fashion store of «H&M Hennes & Mauritz», located in the shopping center «Gallery», Krasnodar. The purpose of this article is to identify the best ways to develop the fashionable clothing brand store Hennes & Mauritz on the basis of benchmarking techniques. On the basis of the conducted market research, a comparative analysis of the data is carried out from different perspectives. The result of the author's study is a generalization of the findings and the development of the key success factors that will allow successful trading activities to be planned in the future, based on the best experience of competitors.

  19. Clinical trial regulation in Argentina: overview and analysis of regulatory framework, use of existing tools, and researchers' perspectives to identify potential barriers.

    Science.gov (United States)

    White, Lauren; Ortiz, Zulma; Cuervo, Luis G; Reveiz, Ludovic

    2011-11-01

    To review and analyze the regulatory framework of clinical trial registration, use of existing tools (publicly accessible national/international registration databases), and users' perspectives to identify possible barriers to registration compliance by sponsors and researchers in Argentina. Internationally registered trials recruiting patients in Argentina were found through clinicaltrials.gov and the International Clinical Trial Registration Platform (ICTRP) and compared with publicly available clinical trials registered through the National Administration of Drugs, Foods, and Medical Devices (ANMAT). A questionnaire addressing hypothesized attitudinal, knowledge-related, idiomatic, technical, economic, and regulatory barriers that could discourage or impede registration of clinical trials was developed, and semi-structured, in-depth interviews were conducted with a purposively selected sample of researchers (investigators, sponsors, and monitors) in Argentina. A response rate of 74.3% (n = 29) was achieved, and 27 interviews were ultimately used for analysis. Results suggested that the high proportion of foreign-sponsored or multinational trials (64.8% of all protocols approved by ANMAT from 1994-2006) may contribute to a communication gap between locally based investigators and foreign-based administrative officials. A lack of knowledge about available international registration tools and limited awareness of the importance of registration were also identified as limiting factors for local investigators and sponsors. To increase compliance and promote clinical trial registration in Argentina, national health authorities, sponsors, and local investigators could take the following steps: implement a grassroots educational campaign to improve clinical trial regulation, support local investigator-sponsor-initiated clinical trials, and/or encourage local and regional scientific journal compliance with standards from the International Committee of Medical Journal Editors (ICMJE).

  20. Contextual Hub Analysis Tool (CHAT): A Cytoscape app for identifying contextually relevant hubs in biological networks [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Tanja Muetze

    2016-08-01

    Highly connected nodes (hubs) in biological networks are topologically important to the structure of the network and have also been shown to be preferentially associated with a range of phenotypes of interest. The relative importance of a hub node, however, can change depending on the biological context. Here, we report a Cytoscape app, the Contextual Hub Analysis Tool (CHAT), which enables users to easily construct and visualize a network of interactions from a gene or protein list of interest, integrate contextual information, such as gene expression or mass spectrometry data, and identify hub nodes that are more highly connected to contextual nodes (e.g. genes or proteins that are differentially expressed) than expected by chance. In a case study, we use CHAT to construct a network of genes that are differentially expressed in Dengue fever, a viral infection. CHAT was used to identify and compare contextual and degree-based hubs in this network. The top 20 degree-based hubs were enriched in pathways related to the cell cycle and cancer, which is likely due to the fact that proteins involved in these processes tend to be highly connected in general. In comparison, the top 20 contextual hubs were enriched in pathways commonly observed in a viral infection, including pathways related to the immune response to viral infection. This analysis shows that such contextual hubs are considerably more biologically relevant than degree-based hubs and that analyses which rely on the identification of hubs solely based on their connectivity may be biased towards nodes that are highly connected in general rather than in the specific context of interest. Availability: CHAT is available for Cytoscape 3.0+ and can be installed via the Cytoscape App Store (http://apps.cytoscape.org/apps/chat).

  1. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages; thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  2. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  3. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load in multiple SRS.ana files and extract pressure vs. time data; (3) C++ contamination simulation code: a 3D particle-tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine if molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  4. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  5. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]; Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.

  6. Draper Station Analysis Tool

    Science.gov (United States)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  7. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software and data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from the Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. They are globally-merged pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convection systems, etc. Basic functions include selection of area of

  8. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  9. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  10. In vivo functional analysis of L-rhamnose metabolic pathway in Aspergillus niger: a tool to identify the potential inducer of RhaR.

    Science.gov (United States)

    Khosravi, Claire; Kun, Roland Sándor; Visser, Jaap; Aguilar-Pontes, María Victoria; de Vries, Ronald P; Battaglia, Evy

    2017-11-06

    The genes of the non-phosphorylative L-rhamnose catabolic pathway have been identified for several yeast species. In Scheffersomyces stipitis, all L-rhamnose pathway genes are organized in a cluster, which is conserved in Aspergillus niger, except for the lra-4 ortholog (lraD). The A. niger cluster also contains the gene encoding the L-rhamnose responsive transcription factor (RhaR) that has been shown to control the expression of genes involved in L-rhamnose release and catabolism. In this paper, we confirmed the function of the first three putative L-rhamnose utilisation genes from A. niger through gene deletion. We explored the identity of the inducer of the pathway regulator (RhaR) through expression analysis of the deletion mutants grown in transfer experiments to L-rhamnose and L-rhamnonate. Reduced expression of L-rhamnose-induced genes on L-rhamnose in lraA and lraB deletion strains, but not on L-rhamnonate (the product of LraB), demonstrates that the inducer of the pathway is L-rhamnonate or a compound downstream of it. Reduced expression of these genes in the lraC deletion strain on L-rhamnonate shows that it is in fact a downstream product of L-rhamnonate. This work showed that the inducer of RhaR is beyond L-rhamnonate dehydratase (LraC) and is likely to be 2-keto-3-deoxy-L-rhamnonate.

  11. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Mathur, Shivani [Formerly NREL]

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools, with information drawn from the respective tool websites, tool developers, and author experience: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER).

  12. Tools for monitoring aquatic environments to identify anthropic effects.

    Science.gov (United States)

    da Rocha, Monyque Palagano; Dourado, Priscila Leocadia Rosa; Cardoso, Claudia Andrea Lima; Cândido, Liliam Silva; Pereira, Joelson Gonçalves; de Oliveira, Kelly Mari Pires; Grisolia, Alexeia Barufatti

    2018-01-05

    Anthropic activities are directly related to the contamination of aquatic ecosystems owing to the release of numerous chemicals from agricultural and urban waste. These contaminants cause environmental degradation and a decrease in the availability of quality water. The objective of this study was to evaluate the efficiency of physicochemical, chemical, and microbiological tests; extraction of chlorophyll a; and genetic parameters to identify the effects of anthropic activities and weather conditions on stream water quality, and the consequences of its use by the population. The physicochemical parameters were within the limits allowed by Brazilian law. However, contamination by metals (Cd 0.510 mg L -1 , Co 0.405 mg L -1 , and Ni 0.316 mg L -1 ) was found at various collection points at levels above the allowable values. The antibiotic oxytetracycline was detected in stream water in quantities of up to 89 μg L -1 . In relation to microbiological contamination, Escherichia coli and Pseudomonas spp. have been isolated. The averages of chlorophyll a were up to 0.15558 mg cm -2 . Genetic tools identified a greater number of micronuclei and more DNA damage in periods with lower rainfall rates and lower amounts of metals. The analysis used for monitoring was effective in verifying the interference that animal breeding and the planting of different cultures have caused on that stream. Thus, the continued use of this water for drinking, irrigation of vegetables, and recreational activities makes the population susceptible to contamination by bacteria and creates conditions for the development of genetic alterations in the long run.

  13. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  14. SitesIdentify: a protein functional site prediction tool

    Directory of Open Access Journals (Sweden)

    Doig Andrew J

    2009-11-01

    Abstract Background The rate of protein structures being deposited in the Protein Data Bank surpasses the capacity to experimentally characterise them and therefore computational methods to analyse these structures have become increasingly important. Identifying the region of the protein most likely to be involved in function is useful in order to gain information about its potential role. There are many available approaches to predict functional sites, but many are not made available via a publicly-accessible application. Results Here we present a functional site prediction tool (SitesIdentify), based on combining sequence conservation information with geometry-based cleft identification, that is freely available via a web-server. We have shown that SitesIdentify compares favourably to other functional site prediction tools in a comparison of seven methods on a non-redundant set of 237 enzymes with annotated active sites. Conclusion SitesIdentify is able to produce comparable accuracy in predicting functional sites to its closest available counterpart, but in addition achieves improved accuracy for proteins with few characterised homologues. SitesIdentify is available via a webserver at http://www.manchester.ac.uk/bioinformatics/sitesidentify/
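
    The combination of evidence SitesIdentify describes (sequence conservation plus geometry-based cleft identification) can be sketched generically as a weighted per-residue score. This is an illustration of the idea, not the tool's algorithm; all arrays, weights, and the function are invented.

    ```python
    # Hedged sketch: rank residues by a weighted blend of normalized
    # conservation and cleft-depth scores.
    import numpy as np

    def rank_residues(conservation, cleft_depth, w=0.5):
        c = (conservation - conservation.min()) / np.ptp(conservation)
        g = (cleft_depth - cleft_depth.min()) / np.ptp(cleft_depth)
        score = w * c + (1 - w) * g
        return np.argsort(score)[::-1]            # best candidates first

    cons = np.array([0.9, 0.2, 0.8, 0.4])         # per-residue conservation
    depth = np.array([5.0, 1.0, 4.5, 0.5])        # per-residue cleft depth, Å
    print(rank_residues(cons, depth))
    ```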

  15. Numerical simulation analysis as a tool to identify areas of weakness in a turbine wind-blade and solutions for their reinforcement

    OpenAIRE

    RAMAN, Venkadesh; DRISSI-HABTI, Monssef; GUILLAUMAT, Laurent; KHADHOUR, Aghihad

    2016-01-01

    Offshore wind energy is one of the main sources of renewable energy that can benefit from new-generation materials that exhibit good oxidation resistance and mechanical reliability. Composite materials are the best candidates for harsh environments and deep-sea wind turbine manufacturing. In this study, a numerical simulation was implemented to predict the stress distribution over a wind turbine blade and to determine areas with high stress concentration. Finite Element Analysis (FEA) was used...

  16. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  17. Identifying Pollutants in the Siret River Basin by Applying New Assessment Tools on Monitoring Data: the Correlation of Land Use and Physicochemical Parameter of Water Quality Analysis

    Directory of Open Access Journals (Sweden)

    Mănescu Andreea

    2014-10-01

    The Siret River is used as a raw water source for different municipal water supply systems, yet it is also used as a receiving body by some inhabitants and industry. In the study, the quality of the Siret River water was determined using a Water Quality Index (WQI). Results are presented from a field study performed on the Bistrita, Moldova, Suceava, Siret, Şomuzu Mare, Trotuş and tributary rivers in the study area of the Siret Basin, Romania. The main objective of this study was to find correlations between land use and physical-chemical indicators of water quality, and to investigate which pollution source is most responsible for river water quality. This is of interest not only in a research context, but also for supporting and facilitating the analyses postulated in the Water Framework Directive (WFD) (2000/60/CE) for the establishment of programmes of measures. For this purpose, pollution sources with a slight impact (municipal wastewater treatment and urban, forest, agriculture and mining land uses) were selected and intensively monitored during six years, January 2006 - December 2011; sampling was designed to meet the WFD standards for confidence in twenty-two different control sections of the Siret Basin. The main parameters measured to assess emissions to the Siret River were calcium, ammonium, sulfate, fixed residue (RF), sodium, chloride and free detergent, together with municipal wastewater treatment, concentrated on point emissions. The contribution of these parameters to diffuse pollution increased when a greater percentage of land was dedicated to industry and urban use and less to forest and mining.
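
    The abstract does not state which WQI variant was used; one common weighted-arithmetic formulation is sketched below purely to show the mechanics, with illustrative parameters and limits.

    ```python
    # Hedged sketch: weighted-arithmetic Water Quality Index.
    # q_i = 100 * C_i / S_i (measured vs. standard), WQI = sum(w_i*q_i)/sum(w_i),
    # with weights here taken inversely proportional to the standard S_i.
    def wqi(measured, standard, weights):
        q = [100.0 * c / s for c, s in zip(measured, standard)]
        return sum(w * qi for w, qi in zip(weights, q)) / sum(weights)

    # illustrative values for chloride, sulfate, ammonium (mg/L)
    print(wqi(measured=[120, 90, 0.4],
              standard=[250, 250, 0.5],
              weights=[1 / 250, 1 / 250, 1 / 0.5]))
    ```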

  18. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  19. Clinical assessment tools identify functional deficits in fragility fracture patients

    Directory of Open Access Journals (Sweden)

    Ames TD

    2016-05-01

    Full Text Available Tyler D Ames,1 Corinne E Wee,1 Khoi M Le,1 Tiffany L Wang,1 Julie Y Bishop,2 Laura S Phieffer,2 Carmen E Quatman2 1The Ohio State University College of Medicine, 2Department of Orthopaedics, The Ohio State University Wexner Medical Center, Columbus, OH, USA Purpose: To identify inexpensive, noninvasive, portable clinical assessment tools that can be used to assess functional performance measures that may put older patients at risk for falls, such as balance, handgrip strength, and lumbopelvic control. Patients and methods: Twenty fragility fracture patients and 21 healthy control subjects were evaluated using clinical assessment tools (the Nintendo Wii Balance Board [WBB], a handheld dynamometer, and an application for the Apple iPod Touch, the Level Belt) that measure functional performance during activities of daily living. The main outcome measurements were balance (WBB), handgrip strength (handheld dynamometer), and lumbopelvic control (iPod Touch Level Belt), which were compared between fragility fracture patients and healthy controls. Results: Fragility fracture patients had lower scores on the vertical component of the WBB Torso Twist task (P=0.042) and greater medial-lateral lumbopelvic sway during a 40 m walk (P=0.026) when compared to healthy controls. Unexpectedly, the fracture patients had significantly higher scores on the left leg (P=0.020) and total components (P=0.010) of the WBB Single Leg Stand task, as well as fewer faults during the left Single Leg Stand task (P=0.003). Conclusion: The clinical assessment tools utilized in this study are relatively inexpensive and portable tools of performance measures capable of detecting differences in postural sway between fragility fracture patients and controls. Keywords: fall risk, geriatric fracture, Nintendo Wii Balance Board, Level Belt, fragility fracture

  20. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    Science.gov (United States)

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry and focused on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full-spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two different normalization techniques. Results from matched peak data and subsequent detailed full-spectrum analysis identified only two different metabolic patterns - cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved to depend on cultivation time. Both whole-spectrum-based normalization techniques together with full-spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable. Both implemented software utilities are available
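
    The workflow described above, whole-spectrum normalization followed by full-spectrum PCA, can be sketched in a few lines. The synthetic spectra and the total-ion-current (TIC) normalization below are illustrative assumptions; the record's own pipeline lives in its ms-alone and multiMS-toolbox utilities.

```python
# Minimal sketch: normalize each spectrum by its total intensity, then PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# rows = spectra (samples), columns = m/z bins (synthetic stand-ins)
spectra = rng.gamma(shape=2.0, scale=1.0, size=(30, 2000))

# Whole-spectrum normalization: divide each spectrum by its total ion current
tic = spectra.sum(axis=1, keepdims=True)
normalized = spectra / tic

pca = PCA(n_components=2)
scores = pca.fit_transform(normalized)

print("explained variance ratio:", pca.explained_variance_ratio_)
# Loadings point to the m/z bins that drive the separation between patterns
top_bins = np.argsort(np.abs(pca.components_[0]))[::-1][:10]
print("most influential m/z bins on PC1:", top_bins)
```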

  1. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client-server software...

  2. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  3. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia (5), Canada (1), China (6), CERN (4), Europe (7), Japan (32), Taiwan (3) and the USA (11). The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  4. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    Science.gov (United States)

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with completely automated software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
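
    To give a toy flavor of the question DAISY automates (this sketch is not the DAISY algorithm itself), consider the hypothetical model dx/dt = -p1*x, y = p2*x with unknown initial state x0. Asking whether (p1, p2, x0) can be recovered from the output and its derivatives can be posed symbolically, for instance with sympy:

```python
# Toy structural-identifiability check via output derivatives at t = 0.
# Model (assumed for illustration): dx/dt = -p1*x,  y = p2*x,  x(0) = x0.
import sympy as sp

p1, p2, x0 = sp.symbols("p1 p2 x0", positive=True)
c0, c1 = sp.symbols("c0 c1")  # "observed" y(0) and y'(0)

# y(0) = p2*x0 and y'(0) = -p1*p2*x0; higher derivatives only multiply by -p1
eqs = [sp.Eq(p2 * x0, c0), sp.Eq(-p1 * p2 * x0, c1)]

sols = sp.solve(eqs, [p1, p2, x0], dict=True)
print(sols)
# expected: [{p1: -c1/c0, p2: c0/x0}]  (x0 remains free)
# p1 is uniquely determined (globally identifiable), while p2 and x0 enter
# only through the product p2*x0 and hence are not individually identifiable.
```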

  5. New genetic tools to identify and protect typical italian products

    Directory of Open Access Journals (Sweden)

    Sergio Lanteri

    2011-02-01

    Full Text Available During recent decades, the use of local varieties has been strongly reduced by the introduction of modern cultivars characterized by higher yields and bred for different traits of agronomic value. However, these cultivars do not always have the quality aspects found in old traditional and typical crops, which also depend on the know-how of traditional cultivation. Nowadays, the practice of intensive agriculture selects only a small number of species and varieties, with a consequent reduction of diversity in agro-ecosystems and the risk of losing important alleles characterizing genetic materials adapted to specific environments. The creation of quality marks of the European Union proved to be a successful system to protect typical products through the Denominations of Origin (PDO, Protected Denomination of Origin, and PGI, Protected Geographical Indication). However, the protection of quality needs efficient instruments to discriminate PDO or PGI varieties in the field and to trace them along the agro-food chain. DNA fingerprinting represents an excellent system to discriminate herbaceous and tree species as well as to quantify the amount of genetic variability present in germplasm collections. The paper describes several examples in which AFLP, SSR and minisatellite markers were successfully used to identify tomato, artichoke, grape, apple and walnut varieties, proving effective in discriminating even closely related genetic material. DNA fingerprinting based on SSRs is also a powerful tool to trace and authenticate raw plant materials in agro-food chains. The paper describes examples of variety traceability in the durum wheat, olive, apple and tomato food chains, pursued through the identification of SSR allelic profiles obtained from DNA isolated from complex, highly processed foods such as bread, olive oil, apple purée and nectar, and peeled tomato.
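
    The traceability step described above amounts to matching an unknown sample's SSR allelic profile against reference variety profiles. A minimal sketch of that matching logic follows; the marker names, allele sizes and variety names are invented for illustration, and real work would use validated marker panels.

```python
# Hypothetical SSR allelic-profile matching: each marker maps to a pair of
# allele sizes (base pairs); a match score is the fraction of shared markers
# whose allele pairs agree exactly.

REFERENCE_PROFILES = {
    "San Marzano": {"SSR_47": (167, 171), "SSR_63": (204, 204)},
    "Pachino":     {"SSR_47": (167, 175), "SSR_63": (198, 204)},
}

def match_score(sample, reference):
    """Fraction of markers shared by both profiles with identical allele pairs."""
    shared = [m for m in sample if m in reference]
    if not shared:
        return 0.0
    hits = sum(sorted(sample[m]) == sorted(reference[m]) for m in shared)
    return hits / len(shared)

unknown = {"SSR_47": (171, 167), "SSR_63": (204, 204)}  # e.g. from olive oil DNA
for variety, profile in REFERENCE_PROFILES.items():
    print(variety, match_score(unknown, profile))
# -> San Marzano 1.0, Pachino 0.0
```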

  6. Diagnostic tools for identifying sleepy drivers in the field.

    Science.gov (United States)

    2013-05-06

    The overarching goal of this project was to identify and evaluate cognitive and behavioral indices that are sensitive to sleep deprivation and may help identify commercial motor vehicle (CMV) drivers who are at risk for driving in a sleep-deprived ...

  7. Channel CAT: A Tactical Link Analysis Tool

    Science.gov (United States)

    1997-09-01

    NAVAL POSTGRADUATE SCHOOL, Monterey, California. Thesis: Channel CAT: A Tactical Link Analysis Tool, by Michael Glenn Coleman, September 1997 (Master's Thesis). This thesis produced an analysis tool, the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client

  8. PAPA: a flexible tool for identifying pleiotropic pathways using genome-wide association study summaries.

    Science.gov (United States)

    Wen, Yan; Wang, Wenyu; Guo, Xiong; Zhang, Feng

    2016-03-15

    Pleiotropy is common in the genetic architectures of complex diseases. To the best of our knowledge, no analysis tool had been developed to date for identifying pleiotropic pathways using multiple genome-wide association study (GWAS) summaries. Here, we present PAPA, a flexible tool for pleiotropic pathway analysis utilizing GWAS summary results. The performance of PAPA was validated using publicly available GWAS summaries of body mass index and waist-hip ratio from the GIANT datasets. PAPA identified a set of pleiotropic pathways, which have been demonstrated to be involved in the development of obesity. The PAPA program, documentation and an illustrative example are available at http://sourceforge.net/projects/papav1/files/. Contact: fzhxjtu@mail.xjtu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. A tool for identifying potential Eucalyptus nitens seed orchard sites ...

    African Journals Online (AJOL)

    Shy seed production in orchards of Eucalyptus nitens is a major barrier to the deployment of genetic gain in South African plantations. A machine learning method was used to identify optimal sites for the establishment of E. nitens seed orchards within the plantation forestry landscape of the summer rainfall region of South ...

  10. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  11. Human genetics as a tool to identify progranulin regulators.

    Science.gov (United States)

    Nicholson, Alexandra M; Finch, NiCole A; Rademakers, Rosa

    2011-11-01

    Frontotemporal lobar degeneration (FTLD) is a common neurodegenerative disorder that predominantly affects individuals under the age of 65. It is known that the most common pathological subtype is FTLD with TAR DNA-binding protein 43 inclusions (FTLD-TDP). FTLD has a strong genetic component with about 50% of cases having a positive family history. Mutations identified in the progranulin gene (GRN) have been shown to cause FTLD-TDP as a result of progranulin haploinsufficiency. These findings suggest a progranulin-dependent mechanism in this pathological FTLD subtype. Thus, identifying regulators of progranulin levels is essential for new therapies and treatments for FTLD and related disorders. In this review, we discuss the role of genetic studies in identifying progranulin regulators, beginning with the discovery of pathogenic GRN mutations and additional GRN risk variants. We also cover more recent genetic advances, including the detection of variants in the transmembrane protein 106 B gene that increase FTLD-TDP risk presumably by modulating progranulin levels and the identification of a potential progranulin receptor, sortilin. This review highlights the importance of genetic studies in the context of FTLD and further emphasizes the need for future genetic and cell biology research to continue the effort in finding a cure for progranulin-related diseases.

  12. Chimeric opioid peptides: Tools for identifying opioid receptor types

    International Nuclear Information System (INIS)

    Xie, G.; Miyajima, A.; Yokota, T.; Arai, K.; Goldstein, A.

    1990-01-01

    The authors synthesized several chimeric [125I]-labelled peptides in which the N-terminal nine residues of dynorphin-32, a peptide selective for the κ opioid receptor, were replaced by opioid peptides selective for other opioid receptor types. Each chimeric peptide retained the high affinity and type selectivity characteristic of its N-terminal sequence. The common C-terminal two-thirds of the chimeric peptides served as an epitope recognized by the same monoclonal antibody. When bound to receptors on a cell surface or membrane preparation, these peptides could still bind specifically to the monoclonal antibody. These chimeric peptides should be useful for isolating μ, δ, and κ opioid receptors and for identifying opioid receptors on transfected cells in expression cloning procedures. The general approach using chimeric peptides should be applicable to other peptide receptors.

  13. GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature

    Directory of Open Access Journals (Sweden)

    Ning Ye

    2015-01-01

    Full Text Available The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges for exploiting their biological meaning. When searching for coexpression genes, the data mining process is largely affected by the selection of algorithms. Thus, it is highly desirable to provide multiple options of algorithms in a user-friendly analytical toolkit for exploring gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify the coexpression genes, and enables the users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. Subsequently, the utility of this analytical toolkit is demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows for choosing multiple algorithms for analyzing gene expression signatures.

  14. History as a tool in identifying "new" old drugs.

    Science.gov (United States)

    Riddle, John M

    2002-01-01

    To trace the history of a natural product and its use, it is necessary to identify the correct plant among around a half-million species. One must also know how and when to harvest the plant, and the methods of collection and extraction. Within the same species, plant chemistry varies depending upon climatic and soil conditions, stage of maturity and even diurnal factors. To all of these variations must be added the diagnostic ability of physicians and native healers (to distinguish Hippocratically-trained Western physicians from healers whose knowledge is less formally taught). Seldom was a disease identified as we know it today, but the constellations of symptoms described, when studied carefully within the framework of the historical setting of the culture, can be related to modern medicine. It is essential to study historical usage data in the language in which those accounts were written. Translators are often philologists who are not sensitive to medical nuances. Modern readers of translated historical documents are often unaware of the precision with which the authors described medical afflictions and their treatments. Natural product drugs are truly products of human knowledge. Because so many modern pharmaceuticals are manufactured synthetically, we forget that once either the compound or its affinity had a home in a natural product. Over 2,500 years ago, man first used a drug obtained from white willow bark, the forerunner of aspirin (acetylsalicylic acid). Today's scientists continue to puzzle over just what aspirin's mechanisms of action are, discovering new modes of action and how they relate to medical diagnostics. Whatever the science of aspirin, an intelligent person today takes it just as our ancestors did for millennia. Throughout time, explanations have continued to vary, just as the purposes of administration have. Nevertheless, aspirin is perceived as being beneficial. Historical in-use data can also be a factor in judging a drug's safety, since

  15. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  16. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  17. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.
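
    As a rough illustration of the kind of calculation an amortization modeling tool performs (the report's actual models are not reproduced here), a capital cost can be converted into an equivalent annual cost with the capital recovery factor; the cost, rate and lifetime below are hypothetical.

```python
# Equivalent annual cost via the capital recovery factor (CRF).

def capital_recovery_factor(rate, years):
    """CRF = r(1+r)^n / ((1+r)^n - 1)."""
    g = (1.0 + rate) ** years
    return rate * g / (g - 1.0)

capital_cost = 12_000_000.0   # retrieval equipment cost, USD (hypothetical)
rate = 0.07                   # discount rate (assumed)
years = 15                    # service life (assumed)

annualized = capital_cost * capital_recovery_factor(rate, years)
print(f"Equivalent annual cost: ${annualized:,.0f}")
```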

  18. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity using grey-scale binning of the SEM image was developed. The image analysis of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
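
    To make the Newton-Raphson step concrete, here is a minimal sketch of solving a simplified energy balance q = cp(T)(T - T0) for an adiabatic flame temperature. The heat release and the linear cp(T) fit are made-up illustrative numbers, not the EERC spreadsheets' data.

```python
# Newton-Raphson iteration on a simplified adiabatic flame temperature balance.

def f(T, q=2.0e6, T0=298.15):
    cp = 1000.0 + 0.2 * T          # J/(kg*K), hypothetical linear cp fit
    return cp * (T - T0) - q       # energy balance residual, J/kg

def fprime(T, T0=298.15):
    return 1000.0 + 0.4 * T - 0.2 * T0   # analytical derivative of f

T = 1000.0                          # initial guess, K
for _ in range(20):
    step = f(T) / fprime(T)
    T -= step
    if abs(step) < 1e-6:
        break

print(f"adiabatic flame temperature ≈ {T:.1f} K")
```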

  19. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools received a grade II: the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST). The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  1. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
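
    The record above proposes a likelihood ratio test over samples of pairwise comparisons. As a rough sketch of that idea (not the paper's statistic), each pair of profilometry traces can be summarized by its maximum normalized cross-correlation, and the field-vs-lab score can then be contrasted with the distribution of lab-vs-lab (same-tool) scores. The data below are synthetic.

```python
# Compare a field tool mark against a sample of lab marks from a suspect tool.
import numpy as np

rng = np.random.default_rng(1)

def similarity(a, b):
    """Maximum normalized cross-correlation of two depth profiles."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode="full")) / len(a)

tool_signature = rng.normal(size=500)                       # "true" tool profile
lab_marks = [tool_signature + 0.3 * rng.normal(size=500) for _ in range(6)]
field_mark = tool_signature + 0.3 * rng.normal(size=500)

lab_scores = [similarity(lab_marks[i], lab_marks[j])
              for i in range(6) for j in range(i + 1, 6)]
field_score = np.mean([similarity(field_mark, m) for m in lab_marks])

mu, sd = np.mean(lab_scores), np.std(lab_scores)
z = (field_score - mu) / sd   # crude standardized comparison of the two samples
print(f"field score {field_score:.3f}, lab mean {mu:.3f} ± {sd:.3f}, z = {z:.2f}")
```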

  2. Evaluation of an online family history tool for identifying hereditary and familial colorectal cancer.

    Science.gov (United States)

    Kallenberg, F G J; Aalfs, C M; The, F O; Wientjes, C A; Depla, A C; Mundt, M W; Bossuyt, P M M; Dekker, E

    2017-09-21

    Identifying a hereditary colorectal cancer (CRC) syndrome or familial CRC (FCC) in a CRC patient may enable the patient and relatives to enroll in surveillance protocols. As these individuals are insufficiently recognized, we evaluated an online family history tool, consisting of a patient-administered family history questionnaire and an automated genetic referral recommendation, to facilitate the identification of patients with hereditary CRC or FCC. Between 2015 and 2016, all newly diagnosed CRC patients in five Dutch outpatient clinics were included, at their first clinic visit, in a trial with a stepped-wedge design. Each hospital continued standard procedures for identifying patients at risk (control strategy) and then, after a predetermined period, switched to offering the family history tool to included patients (intervention strategy). After considering the tool-based recommendation, the health care provider could decide on and arrange the referral. Primary outcome was the relative number of CRC patients who received screening or surveillance recommendations for themselves or relatives because of hereditary CRC or FCC, provided by genetic counseling. The intervention effect was evaluated using a logit-linear model. With the tool, 46/489 (9.4%) patients received a screening or surveillance recommendation, compared to 35/292 (12.0%) in the control group. In the intention-to-treat analysis, accounting for time trends and hospital effects, this difference was not statistically significant (p = 0.58). A family history tool does not necessarily assist in increasing the number of CRC patients and relatives enrolled in screening or surveillance recommendations for hereditary CRC or FCC. Other interventions should be considered.

  3. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    Full Text Available During the last years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  4. Developing a Clinician Friendly Tool to Identify Useful Clinical Practice Guidelines: G-TRUST.

    Science.gov (United States)

    Shaughnessy, Allen F; Vaswani, Akansha; Andrews, Bonnie K; Erlich, Deborah R; D'Amico, Frank; Lexchin, Joel; Cosgrove, Lisa

    2017-09-01

    Clinicians are faced with a plethora of guidelines. To rate guidelines, they can select from a number of evaluation tools, most of which are long and difficult to apply. The goal of this project was to develop a simple, easy-to-use checklist for clinicians to identify trustworthy, relevant, and useful practice guidelines: the Guideline Trustworthiness, Relevance, and Utility Scoring Tool (G-TRUST). A modified Delphi process was used to obtain consensus of experts and guideline developers regarding a checklist of items and their relative impact on guideline quality. We conducted 4 rounds of sampling to refine wording, add and subtract items, and develop a scoring system. Multiple attribute utility analysis was used to develop a weighted utility score for each item to determine scoring. Twenty-two experts in evidence-based medicine, 17 developers of high-quality guidelines, and 1 consumer representative participated. In rounds 1 and 2, items were rewritten or dropped, and 2 items were added. In round 3, weighted scores were calculated from rankings and relative weights assigned by the expert panel. In the last round, more than 75% of experts indicated 3 of the 8 checklist items to be major indicators of guideline usefulness and, using the AGREE tool as a reference standard, a scoring system was developed to classify guidelines as useful, possibly not useful, or not useful. The 8-item G-TRUST is potentially helpful as a tool for clinicians to identify useful guidelines. Further research will focus on its reliability when used by clinicians. © 2017 Annals of Family Medicine, Inc.

  5. Cellular signaling identifiability analysis: a case study.

    Science.gov (United States)

    Roper, Ryan T; Pia Saccomani, Maria; Vicini, Paolo

    2010-05-21

    Two primary purposes for mathematical modeling in cell biology are (1) simulation for making predictions of experimental outcomes and (2) parameter estimation for drawing inferences from experimental data about unobserved aspects of biological systems. While the former purpose has become common in the biological sciences, the latter is less common, particularly when studying cellular and subcellular phenomena such as signaling-the focus of the current study. Data are difficult to obtain at this level. Therefore, even models of only modest complexity can contain parameters for which the available data are insufficient for estimation. In the present study, we use a set of published cellular signaling models to address issues related to global parameter identifiability. That is, we address the following question: assuming known time courses for some model variables, which parameters is it theoretically impossible to estimate, even with continuous, noise-free data? Following an introduction to this problem and its relevance, we perform a full identifiability analysis on a set of cellular signaling models using DAISY (Differential Algebra for the Identifiability of SYstems). We use our analysis to bring to light important issues related to parameter identifiability in ordinary differential equation (ODE) models. We contend that this is, as of yet, an under-appreciated issue in biological modeling and, more particularly, cell biology. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  6. Identifying the cutting tool type used in excavations using neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Jonak, J.; Gajewski, J. [Lublin University of Technology, Lublin (Poland). Faculty of Mechanical Engineering

    2006-03-15

    The paper presents results of preliminary research on utilising neural networks to identify the type of cutting tool used in multi-tool excavating heads of mechanical coal miners. Such research is necessary to identify the rock excavating process with a given head and to construct adaptive systems for control of the excavating process with such a head.
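
    The record does not publish its network architecture or data, but the general approach, classifying tool type from measured features of the cutting process, can be sketched as below. The features, class labels and network size are invented for illustration.

```python
# Hypothetical sketch: small neural network classifying cutting-tool type.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 300
# invented features: cutting force statistics, torque, vibration RMS, etc.
X = rng.normal(size=(n, 6))
y = rng.integers(0, 3, size=n)   # three hypothetical tool types
X[y == 1] += 0.8                 # give the synthetic classes some structure
X[y == 2] -= 0.8

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```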

  7. Use of Photogrammetry and Biomechanical Gait analysis to Identify Individuals

    DEFF Research Database (Denmark)

    Larsen, Peter Kastmand; Simonsen, Erik Bruun; Lynnerup, Niels

    Photogrammetry and recognition of gait patterns are valuable tools to help identify perpetrators based on surveillance recordings. We have found that stature, but only few other measures, has a satisfying reproducibility for use in forensics. Several gait variables with high recognition rates were found. Especially the variables located in the frontal plane are interesting due to large inter-individual differences in time course patterns. The variables with high recognition rates seem preferable for use in forensic gait analysis and as input variables to waveform analysis techniques...

  8. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
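
    The attack-graph construction described above can be illustrated with a small sketch: nodes are attack states, weighted directed edges are attacker actions (weight modelling effort), and high-risk paths are the low-total-effort paths from entry to goal. The graph below is a made-up example, not the invention's attack-template format; networkx is assumed.

```python
# Toy attack graph: find the lowest-effort path and "epsilon optimal" paths.
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("internet", "dmz_web_shell", 3.0),        # exploit web server
    ("internet", "phished_workstation", 2.0),  # phishing campaign
    ("dmz_web_shell", "internal_db", 4.0),
    ("phished_workstation", "internal_db", 5.0),
    ("internal_db", "domain_admin", 6.0),
])

path = nx.shortest_path(g, "internet", "domain_admin", weight="weight")
cost = nx.shortest_path_length(g, "internet", "domain_admin", weight="weight")
print("lowest-effort attack path:", " -> ".join(path), f"(effort {cost})")

# Epsilon-optimal paths: all simple paths within epsilon of the best cost,
# i.e. the candidates best suited to countermeasure placement.
eps = 2.0
for p in nx.all_simple_paths(g, "internet", "domain_admin"):
    c = sum(g[u][v]["weight"] for u, v in zip(p, p[1:]))
    if c <= cost + eps:
        print("candidate:", " -> ".join(p), f"(effort {c})")
```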

  9. Affordances of agricultural systems analysis tools

    NARCIS (Netherlands)

    Ditzler, Lenora; Klerkx, Laurens; Chan-Dentoni, Jacqueline; Posthumus, Helena; Krupnik, Timothy J.; Ridaura, Santiago López; Andersson, Jens A.; Baudron, Frédéric; Groot, Jeroen C.J.

    2018-01-01

    The increasingly complex challenges facing agricultural systems require problem-solving processes and systems analysis (SA) tools that engage multiple actors across disciplines. In this article, we employ the theory of affordances to unravel what tools may furnish users, and how those affordances

  10. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Economic and Financial Analysis Tools: use these economic and financial analysis tools. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the

  11. Post-Flight Data Analysis Tool

    Science.gov (United States)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  12. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  13. Relating genes to function: identifying enriched transcription factors using the ENCODE ChIP-Seq significance tool.

    Science.gov (United States)

    Auerbach, Raymond K; Chen, Bin; Butte, Atul J

    2013-08-01

    Biological analysis has shifted from identifying genes and transcripts to mapping these genes and transcripts to biological functions. The ENCODE Project has generated hundreds of ChIP-Seq experiments spanning multiple transcription factors and cell lines for public use, but tools for a biomedical scientist to analyze these data are either non-existent or tailored to narrow biological questions. We present the ENCODE ChIP-Seq Significance Tool, a flexible web application leveraging public ENCODE data to identify enriched transcription factors in a gene or transcript list for comparative analyses. The ENCODE ChIP-Seq Significance Tool is written in JavaScript on the client side and has been tested on the Google Chrome, Apple Safari and Mozilla Firefox browsers. Server-side scripts are written in PHP and leverage R and a MySQL database. The tool is available at http://encodeqt.stanford.edu. Contact: abutte@stanford.edu. Supplementary material is available at Bioinformatics online.
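
    The record does not restate the statistic the web tool uses, but a common way to score transcription-factor enrichment in a gene list is a hypergeometric test on the overlap with the TF's ChIP-Seq target set; the counts below are hypothetical.

```python
# Hypergeometric enrichment of a TF's targets in a user gene list.
from scipy.stats import hypergeom

genome_genes = 20000   # background universe of genes (assumption)
tf_targets = 1500      # genes bound by the TF in ChIP-Seq (assumption)
gene_list = 300        # size of the user's gene list
overlap = 45           # list genes that are also TF targets

# P(X >= overlap) when drawing `gene_list` genes from the universe
p = hypergeom.sf(overlap - 1, genome_genes, tf_targets, gene_list)
print(f"enrichment p-value: {p:.3g}")
```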

  14. The RUBA Watchdog Video Analysis Tool

    DEFF Research Database (Denmark)

    Bahnsen, Chris Holmberg; Madsen, Tanja Kidholm Osmann; Jensen, Morten Bornø

    We have developed a watchdog video analysis tool called RUBA (Road User Behaviour Analysis) to use for processing of traffic video. This report provides an overview of the functions of RUBA and gives a brief introduction into how analyses can be made in RUBA.

  15. Risk assessment tools to identify women with increased risk of osteoporotic fracture. Complexity or simplicity?

    DEFF Research Database (Denmark)

    Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille

    2013-01-01

    A huge number of risk assessment tools have been developed. Far from all have been validated in external studies, many lack methodological and transparent evidence, and few are integrated in national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use, and lastly to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase and Cochrane databases for papers and evaluated these with respect to methodological quality using the QUADAS checklist. A total of 48 tools were identified, 20 had been externally validated, however only 6 tools had been tested more than once in a population-based setting with acceptable...

  16. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus of the Portuguese standardized test Teste Fonético-Fonológico (ALPE), produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  17. A spectroscopic tool for identifying sources of origin for materials of military interest

    Science.gov (United States)

    Miziolek, Andrzej W.; De Lucia, Frank C.

    2014-05-01

    There is a need to identify the source of origin for many items of military interest, including ammunition and weapons that may be circulated and traded in illicit markets. Both fieldable systems (man-portable or handheld) as well as benchtop systems in field and home base laboratories are desired for screening and attribution purposes. Laser Induced Breakdown Spectroscopy (LIBS) continues to show significant capability as a promising new tool for materials identification, matching, and provenance. With the use of the broadband, high resolution spectrometer systems, the LIBS devices can not only determine the elemental inventory of the sample, but they are also capable of elemental fingerprinting to signify sources of origin of various materials. We present the results of an initial study to differentiate and match spent cartridges from different manufacturers and countries. We have found that using Partial Least Squares Discriminant Analysis (PLS-DA) we are able to achieve on average 93.3% True Positives and 5.3% False Positives. These results add to the large body of publications that have demonstrated that LIBS is a particularly suitable tool for source of origin determinations.
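
    The PLS-DA classification reported above can be sketched with scikit-learn: PLS regression onto one-hot class labels, predicting the class with the largest response. The synthetic spectra below stand in for LIBS emission spectra; none of the record's data are reproduced.

```python
# Minimal PLS-DA sketch for spectral classification.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
n_per_class, n_channels = 40, 200
classes = 3                                    # e.g., three manufacturers

X, y = [], []
for c in range(classes):
    base = rng.normal(size=n_channels)         # class-specific "spectrum"
    for _ in range(n_per_class):
        X.append(base + 0.5 * rng.normal(size=n_channels))
        y.append(c)
X, y = np.array(X), np.array(y)
Y = np.eye(classes)[y]                         # one-hot targets for PLS

pls = PLSRegression(n_components=5)
pls.fit(X, Y)
pred = pls.predict(X).argmax(axis=1)           # discriminant step
print("training accuracy:", (pred == y).mean())
```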

  18. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube is as follows:
    1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system.
    2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system.
    3. This procedure is continued for all combinations of cubes.
    4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system.
    In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
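
    The coordinate-system merging step above amounts to estimating the rigid transform between point sets measured in two cubes' frames. One standard way to do this (an assumption here, not necessarily the system's own solver) is the Kabsch algorithm; the matched points below are synthetic.

```python
# Estimate the rigid transform (R, t) mapping cube 2's frame to cube 1's frame.
import numpy as np

def rigid_transform(src, dst):
    """Kabsch: find R, t with dst ≈ R @ src + t (both (N, 3) arrays)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = cd - R @ cs
    return R, t

rng = np.random.default_rng(2)
pts_cube2 = rng.normal(size=(8, 3))            # marked points, cube 2 frame
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
pts_cube1 = pts_cube2 @ R_true.T + np.array([1.0, 2.0, 0.5])

R, t = rigid_transform(pts_cube2, pts_cube1)
print("recovered translation:", np.round(t, 3))   # -> [1. 2. 0.5]
```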

  19. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for model behaviour improvement was finally obtained by optimisation techniques. This example of application shows how model parameter space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  20. Work Ability Index as tool to identify workers at risk of premature work exit.

    Science.gov (United States)

    Roelen, Corné A M; Heymans, Martijn W; Twisk, Jos W R; van der Klink, Jac J L; Groothoff, Johan W; van Rhenen, Willem

    2014-12-01

    To investigate the Work Ability Index (WAI) as tool for identifying workers at risk of premature work exit in terms of disability pension, unemployment, or early retirement. Prospective cohort study of 11,537 male construction workers (mean age 45.5 years), who completed the WAI at baseline and reported their work status (employed, unemployed, disability pension, or retired) after mean 2.3 years of follow-up. Associations between WAI scores and work status were investigated by multinomial logistic regression analysis. The ability of the WAI to discriminate between workers at high and low risk of premature work exit was analyzed by the area (AUC) under the receiver operating characteristic curve. 9,530 (83 %) construction workers had complete data for analysis. At follow-up, 336 (4 %) workers reported disability pension, 125 (1 %) unemployment, and 255 (3 %) retirement. WAI scores were prospectively associated with the risk of disability pension at follow-up, but not with the risk of unemployment and early retirement. The WAI showed fair discrimination to identify workers at risk of disability pension [AUC = 0.74; 95 % confidence interval (CI) 0.70-0.77]. The discriminative ability decreased with age from AUC = 0.78 in workers aged 30-39 years to AUC = 0.69 in workers ≥50 years of age. Discrimination failed for unemployment (AUC = 0.51; 95 % CI 0.47-0.55) and early retirement (AUC = 0.58; 95 % CI 0.53-0.61). The WAI can be used to identify construction workers <50 years of age at increased risk of disability pension and invite them for preventive interventions.
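
    The discrimination analysis above (ROC AUC of a screening score against a binary outcome) can be reproduced in outline with scikit-learn. The simulated WAI-like scores and the logistic link below are assumptions for the sketch, not the study's data.

```python
# AUC of a work-ability score for predicting a binary disability outcome.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 5000
wai = rng.normal(40, 5, size=n).clip(7, 49)          # WAI range is 7-49
# assumed link: lower work ability -> higher disability risk
p_disability = 1 / (1 + np.exp(0.35 * (wai - 30)))
outcome = rng.random(n) < p_disability

# AUC convention: higher score should mean higher risk, so flip the sign
print("AUC:", roc_auc_score(outcome, -wai))
```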

  1. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    Science.gov (United States)

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

    To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase the sensitivity values, the Youden index suggests at least 1·5 points to be the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. The fall risk factor was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness about the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the needs for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. Furthermore, inpatients with impaired balance and impaired elimination should be closely
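
    The cut-off selection mentioned above (the Youden index suggesting at least 1.5 points) works by maximizing J = sensitivity + specificity - 1 across the ROC curve. A small sketch with simulated screening scores follows; the score distributions are invented.

```python
# Choosing a screening cut-off with the Youden index.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(5)
score = np.concatenate([rng.normal(3.5, 1.2, 200),    # fallers (higher scores)
                        rng.normal(1.5, 1.2, 4000)])  # non-fallers
label = np.concatenate([np.ones(200), np.zeros(4000)])

fpr, tpr, thresholds = roc_curve(label, score)
j = tpr - fpr                        # Youden index at each threshold
best = np.argmax(j)
print(f"best cut-off ≈ {thresholds[best]:.2f}, "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```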

  2. The use of current risk analysis tools evaluated towards preventing external domino accidents

    NARCIS (Netherlands)

    Reniers, Genserik L L; Dullaert, W.; Ale, B. J.M.; Soudan, K.

    Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour with which all possible risks are identified and evaluated. The diversity in risk analysis procedures is such that

  3. Passive sampling as a tool for identifying micro-organic compounds in groundwater.

    Science.gov (United States)

    Mali, N; Cerar, S; Koroša, A; Auersperger, P

    2017-09-01

    The paper presents the use of a simple and cost-efficient passive sampling device with integrated activated carbon to test the possibility of determining the presence of micro-organic compounds (MOs) in groundwater and identifying the potential source of pollution, as well as the seasonal variability of contamination. An advantage of the passive sampler is that it covers a long sampling period by integrating the pollutant concentration over time; consequently, the analytical costs over the monitoring period can be reduced substantially. Passive samplers were installed in 15 boreholes in the Maribor City area in Slovenia, with two sampling campaigns covering a period of about one year. Across all sampling sites, a total of 103 compounds were detected in the first series, and 144 in the second series. Of all detected compounds, the 53 most frequently detected were selected for further analysis. These were classified into eight groups based on the type of their source: Pesticides, Halogenated solvents, Non-halogenated solvents, Domestic and personal, Plasticizers and additives, Other industrial, Sterols and Natural compounds. The most frequently detected MO compounds in groundwater were tetrachloroethene and trichloroethene from the Halogenated solvents group. The most frequently detected group of compounds was the pesticides. The analysis of frequency also showed significant differences between the two sampling series, with less frequent detections in the summer series. For the analysis to determine the origin of contamination, three groups of compounds were defined according to type of use: agriculture, urban and industry. The frequency of detection indicates mixed land use in the recharge areas of the sampling sites, which makes it difficult to specify the dominant origin of a compound. Passive sampling has proved to be a useful tool with which to identify MOs in groundwater and to assess groundwater quality. Copyright © 2017 Elsevier B.V. All rights reserved.
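
    The time-integrating property mentioned above is usually exploited by converting the mass accumulated on the sorbent into a time-weighted average (TWA) concentration, C_TWA = M / (R_s * t), where R_s is a compound-specific sampling rate. The values below are hypothetical; the record does not report sampling rates.

```python
# Time-weighted average concentration from a passive sampler (kinetic regime).
mass_ng = 850.0          # analyte mass accumulated on the sorbent, ng (assumed)
sampling_rate = 0.05     # compound-specific sampling rate, L/day (assumed)
days = 180               # deployment period, days

c_twa = mass_ng / (sampling_rate * days)   # ng/L
print(f"TWA concentration ≈ {c_twa:.1f} ng/L")
```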

  4. Identifying Pornographic Materials with Judgment Analysis

    Science.gov (United States)

    Houston, Judith A.; Houston, Samuel R.

    1974-01-01

    The primary purpose of this study was to determine if a policy-capturing methodology (JAN) which has been successfully utilized in military and educational research could be adapted for use as a procedure in identifying pornographic material. (Author)

  5. Validation of assessment tools for identifying trauma symptomatology in young children exposed to trauma

    DEFF Research Database (Denmark)

    Schandorph Løkkegaard, Sille; Elmose, Mette; Elklit, Ask

    There is a lack of Danish validated, developmentally sensitive assessment tools for preschool and young school children exposed to psychological trauma. Consequently, young traumatised children are at risk of not being identified. The purpose of this project is to validate three assessment tools...... that identify trauma symptomatology in young children; a caregiver interview called the Diagnostic Infant and Preschool Assessment (DIPA), a structured play test called the Odense Child Trauma Screening (OCTS), and a child questionnaire called the Darryl Cartoon Test. Three validity studies were conducted...

  6. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  7. Developing tools to identify marginal lands and assess their potential for bioenergy production

    Science.gov (United States)

    Galatsidas, Spyridon; Gounaris, Nikolaos; Dimitriadis, Elias; Rettenmaier, Nils; Schmidt, Tobias; Vlachaki, Despoina

    2017-04-01

    The term "marginal land" is currently intertwined in discussions about bioenergy although its definition is neither specific nor firm. The uncertainty arising from marginal land classification and quantification is one of the major constraining factors for its potential use. The clarification of political aims, i.e. "what should be supported?" is also an important constraining factor. Many approaches have been developed to identify marginal lands, based on various definitions according to the management goals. Concerns have been frequently raised regarding the impacts of marginal land use on environment, ecosystem services and sustainability. Current tools of soil quality and land potentials assessment fail to meet the needs of marginal land identification and exploitation for biomass production, due to the lack of comprehensive analysis of interrelated land functions and their quantitative evaluation. Land marginality is determined by dynamic characteristics in many cases and may therefore constitute a transitional state, which requires reassessment in due time. Also, marginal land should not be considered simply a dormant natural resource waiting to be used, since it may already provide multiple benefits and services to society relating to wildlife, biodiversity, carbon sequestration, etc. The consequences of cultivating such lands need to be fully addressed to present a balanced view of their sustainable potential for bioenergy. This framework is the basis for the development of the SEEMLA tools, which aim at supporting the identification, assessment, management of marginal lands in Europe and the decision-making for sustainable biomass production of them using appropriate bioenergy crops. The tools comprise two applications, a web-based one (independent of spatial data) and a GIS-based application (land regionalization on the basis of spatial data), which both incorporate: - Land resource characteristics, restricting the cultivation of agricultural crops but

  8. Identifiable Data Files - Medicare Provider Analysis and ...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Provider Analysis and Review (MEDPAR) File contains data from claims for services provided to beneficiaries admitted to Medicare certified inpatient...

  9. Identifying persistent and characteristic features in firearm tool marks on cartridge cases

    Science.gov (United States)

    Ott, Daniel; Soons, Johannes; Thompson, Robert; Song, John

    2017-12-01

    Recent concerns about subjectivity in forensic firearm identification have motivated the development of algorithms to compare firearm tool marks that are imparted on ammunition and to generate quantitative measures of similarity. In this paper, we describe an algorithm that identifies impressed tool marks on a cartridge case that are both consistent between firings and contribute strongly to a surface similarity metric. The result is a representation of the tool mark topography that emphasizes both significant and persistent features across firings. This characteristic surface map is useful for understanding the variability and persistence of the tool marks created by a firearm and can provide improved discrimination between the comparison scores of samples fired from the same firearm and the scores of samples fired from different firearms. The algorithm also provides a convenient method for visualizing areas of similarity that may be useful in providing quantitative support for visual comparisons by trained examiners.
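
    As a toy illustration of a surface-similarity metric of the kind discussed above, the sketch below computes the normalized cross-correlation of synthetic 1-D topography profiles; the actual algorithm operates on measured 3-D cartridge-case topographies and additionally isolates the persistent features.

        import numpy as np

        rng = np.random.default_rng(4)
        mark = rng.normal(size=400)                        # tool mark, firing 1
        same_gun = mark + rng.normal(scale=0.3, size=400)  # persistent features + noise
        other_gun = rng.normal(size=400)                   # unrelated firearm

        def ncc(a, b):
            # normalized cross-correlation at zero lag: 1.0 = identical shape
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(np.mean(a * b))

        print(f"same firearm:      {ncc(mark, same_gun):.2f}")
        print(f"different firearm: {ncc(mark, other_gun):.2f}")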

  10. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  11. Tool for identifying critical control points in embedded purchasing activities in SMEs

    NARCIS (Netherlands)

    Hagelaar, Geoffrey; Staal, Anne; Holman, Richard; Walhof, Gert

    2015-01-01

    This paper discusses risk and uncertainty aspects and proposes an assessment tool leading to identification of critical control points (CCPs) within purchasing-oriented activities of small and medium enterprises (SMEs). Identifying such CCPs is the basis for developing SME purchasing instruments to

  12. Work Ability Index as Tool to Identify Workers at Risk of Premature Work Exit

    NARCIS (Netherlands)

    Roelen, Corne A. M.; Heymans, Martijn W.; Twisk, Jos W. R.; van der Klink, Jac J. L.; Groothoff, Johan W.; van Rhenen, Willem

    2014-01-01

    Purpose To investigate the Work Ability Index (WAI) as tool for identifying workers at risk of premature work exit in terms of disability pension, unemployment, or early retirement. Methods Prospective cohort study of 11,537 male construction workers (mean age 45.5 years), who completed the WAI at

  13. Work ability index as tool to identify workers at risk of premature work exit

    NARCIS (Netherlands)

    Roelen, C.A.M.; Heymans, M.W.; Twisk, J.W.R.; van der Klink, J.J.L.; Groothoff, J.W.; van Rhenen, W.

    2014-01-01

    Purpose To investigate the Work Ability Index (WAI) as tool for identifying workers at risk of premature work exit in terms of disability pension, unemployment, or early retirement. Methods Prospective cohort study of 11,537 male construction workers (mean age 45.5 years), who completed the WAI at

  14. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  15. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  16. The CANDU alarm analysis tool (CAAT)

    Energy Technology Data Exchange (ETDEWEB)

    Davey, E C; Feher, M P; Lupton, L R [Control Centre Technology Branch, ON (Canada)]

    1997-09-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs.

  17. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers, based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  18. A Critical Analysis of Anesthesiology Podcasts: Identifying Determinants of Success.

    Science.gov (United States)

    Singh, Devin; Alam, Fahad; Matava, Clyde

    2016-08-17

    Audio and video podcasts have gained popularity in recent years. Increasingly, podcasts are being used in the field of medicine as a tool to disseminate information. This format has multiple advantages including highly accessible creation tools, low distribution costs, and portability for the user. However, despite its ongoing use in medical education, there are no data describing factors associated with the success or quality of podcasts. The goal of the study was to assess the landscape of anesthesia podcasts in Canada and develop a methodology for evaluating the quality of the podcast. To achieve our objective, we identified the scope of podcasts in anesthesia specifically, constructed an algorithmic model for measuring success, and identified factors linked to both successful podcasts and a peer-review process. Independent reviewers performed a systematic search of anesthesia-related podcasts on iTunes Canada. Data and metrics recorded for each podcast included the podcast's authorship, number posted, podcast series duration, target audience, topics, and social media presence. Descriptive statistics summarized mined data, and univariate analysis was used to identify factors associated with podcast success and a peer-review process. Twenty-two podcasts related to anesthesia were included in the final analysis. Less than a third (6/22, 27%) were still active. The median longevity of the podcast series was just 13 months (interquartile range: 1-39 months). Anesthesiologists were the target audience for 77% of podcast series, with clinical topics being most commonly addressed. We defined a novel algorithm for measuring success: the Podcast Success Index. Factors associated with a high Podcast Success Index included podcasts targeting fellows (Spearman R=0.434; P=.04), inclusion of professional topics (Spearman R=0.456-0.603; P=.01-.03), and the use of Twitter as a means of social media (Spearman R=0.453; P=.03). In addition, more than two-thirds (16/22, 73%) of podcasts ...
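
    The univariate step above boils down to rank correlations between podcast features and the success index. A minimal sketch with invented numbers (not the study's data):

        from scipy.stats import spearmanr

        success_index = [0.9, 0.4, 0.7, 0.2, 0.8, 0.3]   # hypothetical PSI values
        targets_fellows = [1, 0, 1, 0, 1, 0]             # binary podcast feature

        rho, p = spearmanr(success_index, targets_fellows)
        print(f"Spearman R = {rho:.3f}, P = {p:.3f}")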

  19. Identifying areas under potential risk of illegal construction and demolition waste dumping using GIS tools.

    Science.gov (United States)

    Seror, Nissim; Portnov, Boris A

    2018-05-01

    Construction and demolition (C&D) waste, dumped illegally in ravines and open areas, contaminates soil and can cause underground water pollution and forest fires. Yet effective monitoring of illegal C&D waste dumping and enforcing legislation against the offenders are often difficult tasks, due to the large size of the geographic areas that need to be monitored and the limited human and financial resources available to environmental law enforcement agencies. In this study, we use Geographic Information System (GIS) tools and geo-statistical modelling to identify the areas under potentially elevated risk of illegal C&D waste dumping in the Haifa district of Israel. As our analysis shows, locational factors significantly associated with the accumulated amount of waste in the existing illegal C&D waste sites include the distance to the nearest main road and the depth of the ravine present at the site; these factors were used to flag sites under potential risk of illegal C&D waste dumping for future monitoring. As we suggest, the proposed approach may be useful for environmental law enforcement authorities, by helping them to focus on specific sites for inspection, save resources, and act against the offenders more efficiently. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Identifying MMORPG Bots: A Traffic Analysis Approach

    Science.gov (United States)

    Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin

    2008-12-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
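
    The first distinguishing feature above, regularity in the release time of client commands, can be quantified as the coefficient of variation (CV) of inter-command times. The traffic and the 0.3 threshold below are synthetic illustrations, not the paper's parameters.

        import numpy as np

        rng = np.random.default_rng(1)
        human_times = np.cumsum(rng.exponential(0.8, 500))  # bursty, human-like
        bot_times = np.cumsum(rng.normal(0.5, 0.02, 500))   # timer-driven bot

        def regularity(timestamps):
            gaps = np.diff(timestamps)
            return gaps.std() / gaps.mean()   # coefficient of variation

        for name, ts in [("human", human_times), ("bot", bot_times)]:
            cv = regularity(ts)
            label = "bot" if cv < 0.3 else "human"   # illustrative threshold
            print(f"{name}: CV = {cv:.2f} -> classified as {label}")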

  1. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by (1) the regularity in the release time of client commands, (2) the trend and magnitude of traffic burstiness in multiple time scales, and (3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.

  2. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

  3. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality-control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modulated modularity clustering), basic statistical analysis methods (partial least squares-discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.

  4. Gene Unprediction with Spurio: A tool to identify spurious protein sequences.

    Science.gov (United States)

    Höps, Wolfram; Jeffryes, Matt; Bateman, Alex

    2018-01-01

    We now have access to the sequences of tens of millions of proteins. These protein sequences are essential for modern molecular biology and computational biology. The vast majority of protein sequences are derived from gene prediction tools and have no experimental evidence supporting their translation. Despite the increasing accuracy of gene prediction tools, there likely exist a large number of spurious protein predictions in the sequence databases. We have developed the Spurio tool to help identify spurious protein predictions in prokaryotes. Spurio searches the query protein sequence against a prokaryotic nucleotide database using tblastn and identifies homologous sequences. The tblastn matches are used to score the query sequence's likelihood of being a spurious protein prediction using a Gaussian process model. The most informative feature is the appearance of stop codons within the presumed translation of homologous DNA sequences. Benchmarking shows that the Spurio tool is able to distinguish spurious from true proteins. However, transposon proteins are prone to being predicted as spurious because of the frequency of degraded homologs found in the DNA sequence databases. Our initial experiments suggest that less than 1% of the proteins in the UniProtKB sequence database are likely to be spurious and that Spurio is able to identify over 60 times more spurious proteins than the AntiFam resource. The Spurio software and source code is available under an MIT license at the following URL: https://bitbucket.org/bateman-group/spurio.
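
    The key signal named above, stop codons inside the presumed translation of homologous DNA, is easy to sketch. The sequences and the averaging below are illustrative; Spurio's actual scoring uses tblastn alignments and a Gaussian process model.

        STOP = {"TAA", "TAG", "TGA"}

        def internal_stops(dna: str) -> int:
            # count in-frame stop codons, ignoring a legitimate terminal stop
            codons = [dna[i:i + 3] for i in range(0, len(dna) - len(dna) % 3, 3)]
            return sum(codon in STOP for codon in codons[:-1])

        homologs = [
            "ATGGCTTAAGGCCGATGA",   # stop codon mid-frame: evidence of spuriousness
            "ATGGCTGGCGGCCGATGA",   # clean reading frame
        ]
        score = sum(internal_stops(h) for h in homologs) / len(homologs)
        print(f"mean internal stops per homolog = {score:.1f}")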

  5. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  6. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  7. Identifying inaccuracy of MS Project using system analysis

    Science.gov (United States)

    Fachrurrazi; Husin, Saiful; Malahayati, Nurul; Irzaidi

    2018-05-01

    The problem encountered in the project owner's financial accounting reports is the difference between the total project cost computed by MS Project and that computed according to the Indonesian standard (SNI, the Cost Estimating Standard Book of Indonesia). This is one of the MS Project problems concerning cost accuracy, and it means the cost data cannot be used in an integrated way across all project components. This study focuses on finding the causes of the inaccuracy of MS Project. The operational aims of this study are: (i) identifying the cost analysis procedures of both the current method (SNI) and MS Project; (ii) identifying the cost bias in each element of the cost analysis procedure; and (iii) analysing the cost differences (cost bias) in each element to identify the cause of MS Project's inaccuracy relative to SNI. The method in this study is a comparative system analysis of MS Project and SNI. The results are: (i) the Work of Resources field in MS Project is limited to two decimal digits, which leads to inaccuracy; the Work of Resources (referred to as effort) in MS Project corresponds to the multiplication of the quantity of an activity by its resource requirement in SNI; (ii) MS Project and SNI differ in their costing (cost estimation) methods, in which SNI uses Quantity-Based Costing (QBC) while MS Project uses Time-Based Costing (TBC). Based on this research, we recommend that contractors who use SNI make an adjustment to the Work of Resources in MS Project (with a correction index) so that it can be integrated with the project owner's financial accounting system. Further research will be conducted to improve MS Project as an integrated tool for all project participants.
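
    The rounding issue in finding (i) can be shown with a few lines of arithmetic. All quantities, coefficients and rates below are invented for illustration:

        quantity = 137.5        # m2 of work in one activity, hypothetical
        coefficient = 0.0667    # worker-days per m2 (SNI-style requirement)
        rate = 250_000          # cost per worker-day (IDR), hypothetical

        qbc_cost = quantity * coefficient * rate    # SNI: quantity-based costing
        work = round(quantity * coefficient, 2)     # MS Project keeps 2 decimals
        tbc_cost = work * rate                      # time-based costing

        print(f"QBC cost: {qbc_cost:,.2f}")
        print(f"TBC cost: {tbc_cost:,.2f}")
        print(f"bias:     {tbc_cost - qbc_cost:,.2f}")

    Summed over hundreds of activities, such per-activity rounding biases can accumulate into the kind of total-cost difference the study describes.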

  8. Time Analysis: Still an Important Accountability Tool.

    Science.gov (United States)

    Fairchild, Thomas N.; Seeley, Tracey J.

    1994-01-01

    Reviews benefits to school counselors of conducting a time analysis. Describes time analysis system that authors have used, including case illustration of how authors used data to effect counseling program changes. System described followed process outlined by Fairchild: identifying services, devising coding system, keeping records, synthesizing…

  9. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations such as the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
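
    As a hand-rolled example of one of the evaluations listed above (closed-loop eigenvalues), the sketch below checks the stability of a state-feedback loop; the matrices are assumed for illustration and have nothing to do with MATRIXx itself.

        import numpy as np

        A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # plant dynamics (assumed)
        B = np.array([[0.0], [1.0]])
        K = np.array([[4.0, 2.0]])                 # state-feedback gain (assumed)

        closed_loop = A - B @ K                    # x' = (A - BK) x
        eigenvalues = np.linalg.eigvals(closed_loop)
        print("closed-loop eigenvalues:", eigenvalues)
        print("stable:", bool(np.all(eigenvalues.real < 0)))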

  10. Remote Sensing as a Landscape Epidemiologic Tool to Identify Villages at High Risk for Malaria Transmission

    Science.gov (United States)

    Beck, Louisa R.; Rodriquez, Mario H.; Dister, Sheri W.; Rodriquez, Americo D.; Rejmankova, Eliska; Ulloa, Armando; Meza, Rosa A.; Roberts, Donald R.; Paris, Jack F.; Spanner, Michael A.

    1994-01-01

    A landscape approach using remote sensing and Geographic Information System (GIS) technologies was developed to discriminate between villages at high and low risk for malaria transmission, as defined by adult Anopheles albimanus abundance. Satellite data for an area in southern Chiapas, Mexico were digitally processed to generate a map of landscape elements. The GIS processes were used to determine the proportion of mapped landscape elements surrounding 40 villages where An. albimanus data had been collected. The relationships between vector abundance and landscape element proportions were investigated using stepwise discriminant analysis and stepwise linear regression. Both analyses indicated that the most important landscape elements in terms of explaining vector abundance were transitional swamp and unmanaged pasture. Discriminant functions generated for these two elements were able to correctly distinguish between villages with high and low vector abundance, with an overall accuracy of 90%. Regression results found both transitional swamp and unmanaged pasture proportions to be predictive of vector abundance during the mid-to-late wet season. This approach, which integrates remotely sensed data and GIS capabilities to identify villages with high vector-human contact risk, provides a promising tool for malaria surveillance programs that depend on labor-intensive field techniques. This is particularly relevant in areas where the lack of accurate surveillance capabilities may result in no malaria control action when, in fact, directed action is necessary. In general, this landscape approach could be applied to other vector-borne diseases in areas where: (1) the landscape elements critical to vector survival are known, and (2) these elements can be detected at remote sensing scales.
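
    The discriminant step above can be sketched with synthetic landscape proportions; the class means and village counts below are invented, not the Chiapas data:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(3)
        # columns: fraction of transitional swamp, fraction of unmanaged pasture
        low = rng.normal([0.05, 0.10], 0.03, size=(20, 2))
        high = rng.normal([0.20, 0.30], 0.05, size=(20, 2))
        X = np.clip(np.vstack([low, high]), 0.0, 1.0)
        y = np.array([0] * 20 + [1] * 20)          # 1 = high vector abundance

        model = LinearDiscriminantAnalysis().fit(X, y)
        print(f"training accuracy: {model.score(X, y):.0%}")
        print("new village high-risk?", model.predict([[0.18, 0.28]]))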

  11. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Polishing is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters, in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.
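
    A minimal sketch of fitting such a material-removal model by least squares, assuming a simple linear form in pressure and dwell time with fabricated measurements (the paper's actual model form is not specified here):

        import numpy as np

        # columns: contact pressure (N), dwell time (s); target: depth removed (um)
        X = np.array([[10, 1], [10, 2], [20, 1], [20, 2], [30, 3]], dtype=float)
        y = np.array([2.1, 4.0, 3.9, 8.2, 17.5])

        design = np.column_stack([X, np.ones(len(X))])   # add an intercept term
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)
        print("fitted (pressure, time, intercept):", coef.round(2))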

  12. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component s functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  13. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  14. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT was written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is designed for quickly testing new ideas or tricks. It is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for analysing data. The heart of the package is a set of C routines that do the number crunching. The graphics part is wired together with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat-bunch creation.

  15. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
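
    The forward propagation FEAT performs can be sketched as a reachability search over the digraph. The system below is hypothetical, and the simple OR semantics ignores the redundancy (AND) relations a properly developed digraph encodes, so this sketch overstates failure spread:

        from collections import deque

        digraph = {                    # edge u -> v: failure of u propagates to v
            "power_bus": ["pump_A", "pump_B", "sensor_1"],
            "pump_A": ["coolant_loop"],
            "pump_B": ["coolant_loop"],
            "coolant_loop": ["reactor_temp"],
            "sensor_1": [],
            "reactor_temp": [],
        }

        def propagate(graph, failed):
            # breadth-first search from the initiating failure events
            seen, queue = set(failed), deque(failed)
            while queue:
                node = queue.popleft()
                for nxt in graph.get(node, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        print(propagate(digraph, {"power_bus"}))
        # power_bus failure reaches both pumps, the coolant loop and sensor_1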

  16. Developing tools for identifying employer and employee satisfaction of nursing new graduates in China.

    Science.gov (United States)

    Fan, Yuying; Li, Qiujie; Yang, Shufen; Guo, Ying; Yang, Libin; Zhao, Shibin

    2014-01-01

    Researchers developed evaluation tools measuring employment-relevant satisfaction for nursing new graduates. The evaluation tools were designed to be relevant to nursing managers who make employment decisions and to nursing new graduates who were just employed. In-depth interviews and an expert panel were used to review the activities that evaluate the employee and employer satisfaction of nursing new graduates. Based on individual interviews and a literature review, evaluation items were selected. A two-round Delphi study was then conducted from September 2008 to May 2009 with a panel of experts from a range of nursing colleges in China. The response rate was 100% and Kendall's W was 0.73 in the second round of the Delphi study. After two rounds of Delphi surveys, a list of 5 employee satisfaction items and 4 employer satisfaction items was identified for nursing new graduates. The findings of this study identified a different but multidimensional set of factors for employment-relevant satisfaction, which confirmed the importance of certain fundamental aspects of practice. We developed the evaluation tools to assess the employer and employee satisfaction of nursing new graduates, providing a database for further study.

  17. Social network analysis in identifying influential webloggers: A preliminary study

    Science.gov (United States)

    Hasmuni, Noraini; Sulaiman, Nor Intan Saniah; Zaibidi, Nerda Zura

    2014-12-01

    In recent years, second-generation internet-based services such as weblogs have become an effective communication tool for publishing information on the Web. Weblogs have unique characteristics that deserve users' attention. Some webloggers see weblogs as an appropriate medium to initiate and expand a business. These webloggers, also known as direct profit-oriented webloggers (DPOWs), communicate and share knowledge with each other through social interaction. However, survivability is the main issue among DPOWs, and frequent communication with influential webloggers is one way to survive as a DPOW. This paper aims to understand the network structure and identify influential webloggers within the network. Proper understanding of the network structure can assist us in knowing how information is exchanged among members and enhance survivability among DPOWs. 30 DPOWs were involved in this study. The degree centrality and betweenness centrality measurements of Social Network Analysis (SNA) were used to examine the strength of relations and identify influential webloggers within the network. Thus, webloggers with the highest values of these measurements are considered the most influential webloggers in the network.
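
    The two measurements named above are standard SNA quantities; the sketch below computes them with NetworkX on a toy network with hypothetical blogger names (not the study's 30 DPOWs):

        import networkx as nx

        edges = [("ana", "budi"), ("ana", "cita"), ("budi", "cita"),
                 ("cita", "dewi"), ("dewi", "eka"), ("dewi", "fitri")]
        g = nx.Graph(edges)

        degree = nx.degree_centrality(g)
        betweenness = nx.betweenness_centrality(g)

        for blogger in g.nodes:
            print(f"{blogger:6s} degree={degree[blogger]:.2f} "
                  f"betweenness={betweenness[blogger]:.2f}")
        # 'dewi' bridges the two clusters, so she scores highest on betweenness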

  18. Identifying and classifying quality-of-life tools for assessing pressure ulcers after spinal cord injury

    Science.gov (United States)

    Hitzig, Sander L.; Balioussis, Christina; Nussbaum, Ethne; McGillivray, Colleen F.; Catharine Craven, B.; Noreau, Luc

    2013-01-01

    Context: Although pressure ulcers may negatively influence quality of life (QoL) post-spinal cord injury (SCI), our understanding of how to assess their impact is confounded by conceptual and measurement issues. To ensure that descriptions of pressure ulcer impact are appropriately characterized, measures should be selected according to the domains that they evaluate and the population and pathologies for which they are designed. Objective: To conduct a systematic literature review to identify and classify outcome measures used to assess the impact of pressure ulcers on QoL after SCI. Methods: Electronic databases (Medline/PubMed, CINAHL, and PsycInfo) were searched for studies published between 1975 and 2011. Identified outcome measures were classified as being either subjective or objective using a QoL model. Results: Fourteen studies were identified. The majority of tools identified in these studies did not have psychometric evidence supporting their use in the SCI population, with the exception of two objective measures, the Short-Form 36 and the Craig Handicap Assessment and Reporting Technique, and two subjective measures, the Life Situation Questionnaire-Revised and the Ferrans and Powers Quality of Life Index SCI-Version. Conclusion: Many QoL outcome tools showed promise in being sensitive to the presence of pressure ulcers, but few of them have been validated for use with SCI. Prospective studies should employ more rigorous methods for collecting data on pressure ulcer severity and location to improve the quality of findings with regard to their impact on QoL. The Cardiff Wound Impact Schedule is a potential tool for assessing the impact of pressure ulcers post-SCI. PMID:24090238

  19. Affinity resins as new tools for identifying target proteins of ascorbic acid.

    Science.gov (United States)

    Iwaoka, Yuji; Nishino, Kohei; Ishikawa, Takahiro; Ito, Hideyuki; Sawa, Yoshihiro; Tai, Akihiro

    2018-02-12

    l-Ascorbic acid (AA) has diverse physiological functions, but little is known about the functional mechanisms of AA. In this study, we synthesized two types of affinity resin on which AA is immobilized in a stable form, in order to identify new AA-targeted proteins, which can provide important clues for elucidating unknown functional mechanisms of AA. To our knowledge, an affinity resin with AA immobilized as a ligand had not previously been prepared, because AA is very unstable and rapidly degraded in aqueous solution. Using the affinity resins, we identified cytochrome c (cyt c) as an AA-targeted protein and showed that oxidized cyt c exhibits specific affinity for AA. These results suggest that the two kinds of AA-affinity resin can be powerful tools for identifying new target proteins of AA.

  20. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework, developed in a collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible system that is user friendly, efficient and well documented. It is intended for the simulation of a wide range of nuclear physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. The user can create his or her own experimental setup through access to predefined detector geometries. Simulated data are made available in the same format as for the real experiment, allowing identical analysis of both experimental and simulated data. A significant time reduction is expected during experiment planning and data analysis. (authors)

  1. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

    Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients or linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which 2 genes' expression levels are clustered in different space locations under the control of a third gene's expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data based on an optimization procedure involving the usage of UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html.
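
    The UPGMA component named above corresponds to 'average' linkage in SciPy's hierarchical clustering. A stand-in sketch on random data, not xSyn's actual optimization procedure:

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(2)
        expression = rng.normal(size=(20, 6))     # 20 samples x 6 genes, random

        tree = linkage(expression, method="average")  # UPGMA on Euclidean distances
        clusters = fcluster(tree, t=2, criterion="maxclust")
        print("cluster labels:", clusters)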

  2. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of the application of data and scientific techniques, while processing multiple simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).

  3. Accuracy of Brief Screening Tools for Identifying Postpartum Depression Among Adolescent Mothers

    Science.gov (United States)

    Venkatesh, Kartik K.; Zlotnick, Caron; Triche, Elizabeth W.; Ware, Crystal

    2014-01-01

    OBJECTIVE: To evaluate the accuracy of the Edinburgh Postnatal Depression Scale (EPDS) and 3 subscales for identifying postpartum depression among primiparous adolescent mothers. METHODS: Mothers enrolled in a randomized controlled trial to prevent postpartum depression completed a psychiatric diagnostic interview and the 10-item EPDS at 6 weeks, 3 months, and 6 months postpartum. Three subscales of the EPDS were assessed as brief screening tools: 3-item anxiety subscale (EPDS-3), 7-item depressive symptoms subscale (EPDS-7), and 2-item subscale (EPDS-2) that resemble the Patient Health Questionnaire-2. Receiver operating characteristic curves and the areas under the curves for each tool were compared to assess accuracy. The sensitivities and specificities of each screening tool were calculated in comparison with diagnostic criteria for a major depressive disorder. Repeated-measures longitudinal analytical techniques were used. RESULTS: A total of 106 women contributed 289 postpartum visits; 18% of the women met criteria for incident postpartum depression by psychiatric diagnostic interview. When used as continuous measures, the full EPDS, EPDS-7, and EPDS-2 performed equally well (area under the curve >0.9). Optimal cutoff scores for a positive depression screen for the EPDS and EPDS-7 were lower (≥9 and ≥7, respectively) than currently recommended cutoff scores (≥10). At optimal cutoff scores, the EPDS and EPDS-7 both had sensitivities of 90% and specificities of >85%. CONCLUSIONS: The EPDS, EPDS-7, and EPDS-2 are highly accurate at identifying postpartum depression among adolescent mothers. In primary care pediatric settings, the EPDS and its shorter subscales have potential for use as effective depression screening tools. PMID:24344102

  4. A prognostic tool to identify adolescents at high risk of becoming daily smokers

    Directory of Open Access Journals (Sweden)

    Paradis Gilles

    2011-08-01

    Background: The American Academy of Pediatrics advocates that pediatricians should be involved in tobacco counseling and has developed guidelines for counseling. We present a prognostic tool for use by health care practitioners in both clinical and non-clinical settings to identify adolescents at risk of becoming daily smokers. Methods: Data were drawn from the Nicotine Dependence in Teens (NDIT) Study, a prospective investigation of 1293 adolescents, initially aged 12-13 years, recruited in 10 secondary schools in Montreal, Canada in 1999. Questionnaires were administered every three months for five years. The prognostic tool was developed using estimated coefficients from multivariable logistic models. Model overfitting was corrected using bootstrap cross-validation. Goodness-of-fit and predictive ability of the models were assessed by R2, the c-statistic, and the Hosmer-Lemeshow test. Results: The 1-year and 2-year probability of initiating daily smoking was a joint function of seven individual characteristics: age; ever smoked; ever felt like you needed a cigarette; parent(s) smoke; sibling(s) smoke; friend(s) smoke; and ever drank alcohol. The models were characterized by reasonably good fit and predictive ability. They were transformed into user-friendly tables such that the risk of daily smoking can be easily computed by summing points for responses to each item. The prognostic tool is also available on-line at http://episerve.chumontreal.qc.ca/calculation_risk/daily-risk/daily_smokingadd.php. Conclusions: The prognostic tool to identify youth at high risk of daily smoking may eventually be an important component of a comprehensive tobacco control system.
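
    The points-table idea above is a standard transformation of a logistic model: summing item points approximates the linear predictor. The coefficients below are invented for illustration and are not the NDIT tool's values:

        import math

        coefs = {                        # hypothetical log-odds per risk factor
            "ever_smoked": 1.2,
            "friend_smokes": 0.9,
            "parent_smokes": 0.6,
            "ever_drank_alcohol": 0.5,
        }
        intercept = -4.0

        def risk(profile: dict) -> float:
            # probability of initiating daily smoking under the logistic model
            lp = intercept + sum(coefs[k] for k, v in profile.items() if v)
            return 1 / (1 + math.exp(-lp))

        teen = {"ever_smoked": True, "friend_smokes": True,
                "parent_smokes": False, "ever_drank_alcohol": True}
        print(f"risk of daily smoking: {risk(teen):.0%}")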

  5. Identifying Social Trust in Cross-Country Analysis: Do We Really Measure the Same?

    Science.gov (United States)

    Torpe, Lars; Lolle, Henrik

    2011-01-01

    Many see trust as an important social resource for the welfare of individuals as well as nations. It is therefore important to be able to identify trust and explain its sources. Cross-country survey analysis has been an important tool in this respect, and often one single variable is used to identify social trust understood as trust in strangers,…

  6. Testing the woman abuse screening tool to identify intimate partner violence in Indonesia.

    Science.gov (United States)

    Iskandar, Livia; Braun, Kathryn L; Katz, Alan R

    2015-04-01

    Intimate Partner Violence (IPV) is a global public health problem. IPV prevalence in Indonesia has been estimated to be less than 1%, based on reported cases. It is likely that IPV prevalence is underreported in Indonesia, as it is in many other countries. Screening for IPV has been found to increase IPV identification, but no screening tools are in use in Indonesia. The aim of this study was to test the translated Woman Abuse Screening Tool (WAST) for detecting IPV in Indonesia. The WAST was tested against a diagnostic interview by a trained psychologist on 240 women attending two Primary Health Centers in Jakarta. IPV prevalence and the reliability, sensitivity, and specificity of the WAST were estimated. Prevalence of IPV by diagnostic interview was 36.3%, much higher than published estimates. The most common forms of IPV identified were psychological (85%) and physical abuse (24%). Internal reliability of the WAST was high (α = .801). A WAST score of 13 (out of 24) is the recommended cutoff for identifying IPV, but only 17% of the Indonesian sample scored 13 or higher. Test sensitivity of the WAST with a cutoff score of 13 was only 41.9%, with a specificity of 96.8%. With a cutoff score of 10, the sensitivity improved to 84.9%, while the specificity decreased to 61.0%. Use of the WAST with a cutoff score of 10 provides good sensitivity and reasonable specificity and would provide a much-needed screening tool for use in Indonesia. Although a lower cutoff would yield a greater proportion of false positives, most of the true cases would be identified, increasing the possibility that women experiencing abuse would receive needed assistance. © The Author(s) 2014.

  7. Tools to identify the men with prostate cancer most appropriate for active surveillance?

    Directory of Open Access Journals (Sweden)

    Robert H Getzenberg

    2014-02-01

    A great deal of effort is underway to identify those men with prostate cancer felicitous for active surveillance with greater precision than that afforded to us today. In the manuscript by Irshad et al., the authors evaluate a novel set of genes associated with senescence and aging as tools that can provide guidance regarding the indolent nature of an individual's prostate cancer, with validation using both mRNA and protein analyses. While additional studies are required to understand the full impact of these findings, the innovative approach taken enhances our understanding of distinct phenotypes of prostate cancer.

  8. Identifying avian sources of faecal contamination using sterol analysis.

    Science.gov (United States)

    Devane, Megan L; Wood, David; Chappell, Andrew; Robson, Beth; Webster-Brown, Jenny; Gilpin, Brent J

    2015-10-01

    Discrimination of the source of faecal pollution in water bodies is an important step in the assessment and mitigation of public health risk. One tool for faecal source tracking is the analysis of faecal sterols which are present in faeces of animals in a range of distinctive ratios. Published ratios are able to discriminate between human and herbivore mammal faecal inputs but are of less value for identifying pollution from wildfowl, which can be a common cause of elevated bacterial indicators in rivers and streams. In this study, the sterol profiles of 50 avian-derived faecal specimens (seagulls, ducks and chickens) were examined alongside those of 57 ruminant faeces and previously published sterol profiles of human wastewater, chicken effluent and animal meatwork effluent. Two novel sterol ratios were identified as specific to avian faecal scats, which, when incorporated into a decision tree with human and herbivore mammal indicative ratios, were able to identify sterols from avian-polluted waterways. For samples where the sterol profile was not consistent with herbivore mammal or human pollution, avian pollution is indicated when the ratio of 24-ethylcholestanol/(24-ethylcholestanol + 24-ethylcoprostanol + 24-ethylepicoprostanol) is ≥0.4 (avian ratio 1) and the ratio of cholestanol/(cholestanol + coprostanol + epicoprostanol) is ≥0.5 (avian ratio 2). When avian pollution is indicated, further confirmation by targeted PCR specific markers can be employed if greater confidence in the pollution source is required. A 66% concordance between sterol ratios and current avian PCR markers was achieved when 56 water samples from polluted waterways were analysed.
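    The two avian ratios lend themselves to direct implementation. The sketch below encodes only the avian branch of the decision tree, with the threshold values quoted above; the sample concentrations are placeholders assumed for illustration, and consistent units are assumed throughout.

```python
def is_avian_source(sterols: dict) -> bool:
    """Apply the two avian sterol ratios (ratio 1 >= 0.4 and ratio 2 >= 0.5).
    `sterols` maps sterol names to concentrations in consistent units.
    In the full decision tree, human and herbivore mammal ratios are
    checked before this avian branch."""
    ratio1 = sterols["24-ethylcholestanol"] / (
        sterols["24-ethylcholestanol"] + sterols["24-ethylcoprostanol"]
        + sterols["24-ethylepicoprostanol"])
    ratio2 = sterols["cholestanol"] / (
        sterols["cholestanol"] + sterols["coprostanol"]
        + sterols["epicoprostanol"])
    return ratio1 >= 0.4 and ratio2 >= 0.5

# Hypothetical sterol concentrations for a water sample.
sample = {"24-ethylcholestanol": 42.0, "24-ethylcoprostanol": 30.0,
          "24-ethylepicoprostanol": 10.0, "cholestanol": 55.0,
          "coprostanol": 35.0, "epicoprostanol": 8.0}
print("avian pollution indicated:", is_avian_source(sample))
```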

  9. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
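    Knickpoint Finder itself is an ArcGIS/Python tool, and its algorithm is not reproduced here. As a rough illustration of the underlying idea, the sketch below flags points along a DEM-extracted longitudinal stream profile where the local channel gradient departs sharply from the trend of the upstream reach; the profile data, window, and threshold factor are all hypothetical.

```python
import numpy as np

def find_knickpoints(distance, elevation, window=5, factor=2.0):
    """Flag profile points whose local downstream gradient exceeds `factor`
    times the median gradient of the preceding `window` segments.
    `distance` (m) and `elevation` (m) run from headwater to mouth."""
    dist = np.asarray(distance, dtype=float)
    elev = np.asarray(elevation, dtype=float)
    slope = -np.diff(elev) / np.diff(dist)   # positive on a descending profile
    knicks = []
    for i in range(window, len(slope)):
        background = np.median(slope[i - window:i])
        if background > 0 and slope[i] > factor * background:
            knicks.append(i)
    return knicks

# Hypothetical longitudinal profile with one slope break near 6 km.
distance = np.arange(0, 10000, 500)
elevation = np.where(distance < 6000,
                     300 - 0.01 * distance,            # gentle upper reach
                     240 - 0.04 * (distance - 6000))   # steeper lower reach
for i in find_knickpoints(distance, elevation):
    print(f"possible knickpoint at ~{distance[i]:.0f} m along the profile")
```

    Adjacent flags typically mark the same slope break and would be merged, and then inspected against geological structures, in practice.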

  10. A decision support tool for identifying abuse of controlled substances by ForwardHealth Medicaid members.

    Science.gov (United States)

    Mailloux, Allan T; Cummings, Stephen W; Mugdh, Mrinal

    2010-01-01

    Our objective was to use Wisconsin's Medicaid Evaluation and Decision Support (MEDS) data warehouse to develop and validate a decision support tool (DST) that (1) identifies Wisconsin Medicaid fee-for-service recipients who are abusing controlled substances, (2) effectively replicates clinical pharmacist recommendations for interventions intended to curb abuse of physician and pharmacy services, and (3) automates data extraction, profile generation and tracking of recommendations and interventions. From pharmacist manual reviews of medication profiles, seven measures of overutilization of controlled substances were developed, including (1-2) 6-month and 2-month "shopping" scores, (3-4) 6-month and 2-month forgery scores, (5) duplicate/same day prescriptions, (6) count of controlled substance claims, and the (7) shopping 6-month score for the individual therapeutic class with the highest score. The pattern analysis logic for the measures was encoded into SQL and applied to the medication profiles of 190 recipients who had already undergone manual review. The scores for each measure and numbers of providers were analyzed by exhaustive chi-squared automatic interaction detection (CHAID) to determine significant thresholds and combinations of predictors of pharmacist recommendations, resulting in a decision tree to classify recipients by pharmacist recommendations. The overall correct classification rate of the decision tree was 95.3%, with a 2.4% false positive rate and 4.0% false negative rate for lock-in versus prescriber-alert letter recommendations. Measures used by the decision tree include the 2-month and 6-month shopping scores, and the number of pharmacies and prescribers. The number of pharmacies was the best predictor of abuse of controlled substances. When a Medicaid recipient receives prescriptions for controlled substances at 8 or more pharmacies, the likelihood of a lock-in recommendation is 90%. The availability of the Wisconsin MEDS data warehouse has
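    A CHAID tree of this kind reduces to nested threshold comparisons in code. In the fragment below, only the 8-pharmacy lock-in rule comes from the study; the remaining cut points and measure combinations are invented for illustration.

```python
def recommend(n_pharmacies, n_prescribers, shop6, shop2):
    """Toy rules in the spirit of a CHAID decision tree for controlled-
    substance review. Only the 8-pharmacy threshold comes from the study;
    every other cut point here is hypothetical."""
    if n_pharmacies >= 8:
        return "lock-in"          # ~90% likelihood per the study
    if shop6 > 40 or (shop2 > 20 and n_prescribers >= 5):  # invented thresholds
        return "prescriber-alert letter"
    return "no action"

print(recommend(n_pharmacies=9, n_prescribers=4, shop6=12, shop2=3))   # lock-in
print(recommend(n_pharmacies=3, n_prescribers=6, shop6=15, shop2=25))  # alert letter
```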

  11. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research that identified the problem: companies still struggle with long setup times, yet many do nothing to reduce them, and a long setup alone is not a sufficient reason for companies to undertake any action towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to uncover problems. The proposed methodology can genuinely encourage management to decide on a SMED implementation, and this was verified in a production company. The setup analysis methodology consists of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a workstation that is a bottleneck with many setups; there, the goal is to convince management to begin actions concerning setup improvement. The last three steps relate to a particular setup, and there the goal is to reduce the setup time and the risk of problems that can appear during the setup. The tools used in this paper include SMED, Pareto analysis, statistical analysis and FMEA.

  12. BIBLIOGRAPHIC STUDY IN RISK MANAGEMENT AIMED TO IDENTIFY MORE REFERENCED TOOLS, METHODS AND RELATIONSHIPS

    Directory of Open Access Journals (Sweden)

    Alamir Costa Louro

    2015-06-01

    Full Text Available The objective of this paper is to identify and discuss trends in the tools and methods used in project risk management and their relationships to other subjects, using current scientific articles. The focus is not on understanding how these tools work in technical terms, but on considering possibilities for deepening academic study, including several suggestions for future research. Adjacent to the article there is a discussion of an alleged "one best way" normative approach. The following research question is answered: which subjects and theories are related to project risk management tools and methods? The first contribution concerns the importance of the academic Chris Chapman as the most published and most referenced author in the survey. Several further contributions cover various subjects, such as the perception that many papers are conceptual; papers on the construction industry; the problematization of contracts according to agency theory; and IT and ERP issues. Other contributions come from the bibliometric method, which yields consolidated information about terms, topics, authors, references, periods and, of course, the methods and tools of project risk management.

  13. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for the various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools, such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concepts of the ORAD program
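    Bayesian analysis in this diagnostic sense amounts to updating a pre-test probability with the test's sensitivity and specificity. A minimal worked sketch follows; the prevalence and accuracy figures are illustrative only.

```python
def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' theorem for a binary diagnostic test:
    P(D|+) = sens*p / (sens*p + (1-spec)*(1-p))
    P(D|-) = (1-sens)*p / ((1-sens)*p + spec*(1-p))"""
    p = pretest
    if positive:
        return sensitivity * p / (sensitivity * p + (1 - specificity) * (1 - p))
    return (1 - sensitivity) * p / ((1 - sensitivity) * p + specificity * (1 - p))

# Illustrative values: 10% pre-test probability, 85% sensitivity, 90% specificity.
print(f"P(disease | positive) = {post_test_probability(0.10, 0.85, 0.90):.2f}")
print(f"P(disease | negative) = {post_test_probability(0.10, 0.85, 0.90, positive=False):.3f}")
```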

  14. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    Directory of Open Access Journals (Sweden)

    Yuttachon Promworn

    Full Text Available Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5' ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools

  15. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    Science.gov (United States)

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima

    2017-01-01

    Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5' ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and Git
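    The core statistical step in ToNER (transforming nucleotide enrichment ratios so that they approximate a normal distribution, then calling outlying positions) can be sketched with SciPy's Box-Cox routine. This is a simplified stand-in for the published method, run on simulated counts.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated per-position read counts for enriched and unenriched libraries,
# with 50 spiked positions standing in for true transcript 5' ends.
unenriched = rng.poisson(lam=20, size=10000) + 1
enriched = rng.poisson(lam=20, size=10000) + 1
true_starts = rng.choice(10000, size=50, replace=False)
enriched[true_starts] += rng.poisson(lam=200, size=50)

ratio = enriched / unenriched                  # nucleotide enrichment ratio
transformed, lam = stats.boxcox(ratio)         # fit lambda, transform to ~normal
z = (transformed - transformed.mean()) / transformed.std()

# Call positions whose transformed enrichment is significantly high.
cutoff = stats.norm.ppf(1 - 0.001)             # one-sided p < 0.001
called = np.flatnonzero(z > cutoff)
print(f"Box-Cox lambda = {lam:.3f}; called {called.size} positions, "
      f"{np.isin(called, true_starts).sum()} of 50 spiked sites recovered")
```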

  16. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. A generic risk analysis for an entire industrial sector has proved helpful in minimizing the effort of such studies: standardised procedures can then be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development process of the generic risk analysis involved an intense discussion between industry and authorities about the assessment methodology and the acceptance criteria. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from the industry to the authorities and the public. A recent example is the Holland-Italy natural gas transmission pipeline, where this method was successfully employed. Although this pipeline traverses densely populated areas of Switzerland, the established communication method allowed the risk problems to be solved without delaying the planning process. (authors)

  17. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.
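    One common option for the rational-function step mentioned above is a least-squares fit of Roger's approximation, in which tabulated frequency-domain aerodynamic data are matched by a polynomial plus lag terms in the Laplace variable. The sketch below fits a single aerodynamic transfer function with two fixed lag roots; the data and lag roots are synthetic, and ISAC's own formulation may differ in detail.

```python
import numpy as np

def roger_fit(k, Q, lags):
    """Least-squares fit of Q(ik) ~ A0 + A1*(ik) + A2*(ik)^2
    + sum_j Aj*(ik)/(ik + b_j) at reduced frequencies k."""
    s = 1j * np.asarray(k)
    cols = [np.ones_like(s), s, s**2] + [s / (s + b) for b in lags]
    M = np.column_stack(cols)
    # Stack real and imaginary parts so the coefficients come out real.
    A = np.vstack([M.real, M.imag])
    rhs = np.concatenate([Q.real, Q.imag])
    coeffs, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return coeffs

# Synthetic "tabulated" aerodynamic data at a few reduced frequencies.
k = np.linspace(0.05, 1.5, 12)
s = 1j * k
Q_true = 0.8 + 1.2 * s + 0.1 * s**2 + 0.5 * s / (s + 0.2) + 0.3 * s / (s + 0.6)
coeffs = roger_fit(k, Q_true, lags=[0.2, 0.6])
print("recovered coefficients:", np.round(coeffs, 3))  # ~[0.8, 1.2, 0.1, 0.5, 0.3]
```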

  18. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    Science.gov (United States)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  19. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation on the power tool market, both in Russia and worldwide. It presents a comparative analysis of competitors, an analysis of the structure of the power tool market, and an assessment of the competitiveness of several major product lines. It also analyses the promotion methods used by companies selling tools and the competitiveness of the product range of Bosch, the leader in its segment of the power tool market in Russia.

  20. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  1. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  2. Polyphenolic profile as a useful tool to identify the wood used in wine aging.

    Science.gov (United States)

    Sanz, Miriam; Fernández de Simón, Brígida; Cadahía, Estrella; Esteruelas, Enrique; Muñoz, Angel Ma; Hernández, Ma Teresa; Estrella, Isabel

    2012-06-30

    Although oak wood is the main material used in cooperage, other species are being considered as possible sources of wood for the production of wines and their derived products. In this work we compared the phenolic composition of acacia (Robinia pseudoacacia), chestnut (Castanea sativa), cherry (Prunus avium) and ash (Fraxinus excelsior and F. americana) heartwoods, using HPLC-DAD/ESI-MS/MS (some of these data were shown in a previous paper), as well as the changes that toasting intensity at cooperage produces in each polyphenolic profile. Before toasting, each wood shows a different and specific polyphenolic profile, with both qualitative and quantitative differences among them. Toasting notably changed these profiles, in general proportionally to toasting intensity, and led to less differentiation among species in toasted woods, although we also found phenolic markers in toasted woods. Thus, methyl syringate, benzoic acid, methyl vanillate, p-hydroxybenzoic acid, 3,4,5-trimethylphenol and p-coumaric acid, condensed tannins of the procyanidin type, and the flavonoids naringenin, aromadendrin, isosakuranetin and taxifolin will be a good tool to identify cherry wood. In acacia wood the chemical markers will be the gallic and β-resorcylic aldehydes and two not fully identified hydroxycinnamic compounds, condensed tannins of the prorobinetin type, and, when using untoasted wood, dihydrorobinetin, and in toasted acacia wood, robinetin. In untoasted ash wood, the presence of secoiridoids, phenylethanoid glycosides, or di- and oligolignols will be a good tool, especially oleuropein, ligstroside and olivil, together with verbascoside and isoverbascoside in F. excelsior, and oleoside in F. americana. In toasted ash wood, tyrosol, syringaresinol, cyclolovil, verbascoside and olivil could be used to identify the botanical origin. In addition, in ash wood, seasoned and toasted, neither hydrolysable nor condensed tannins were detected. Lastly, in chestnut wood, gallic

  3. Identifying target processes for microbial electrosynthesis by elementary mode analysis.

    Science.gov (United States)

    Kracke, Frauke; Krömer, Jens O

    2014-12-30

    Microbial electrosynthesis and electro fermentation are techniques that aim to optimize microbial production of chemicals and fuels by regulating the cellular redox balance via interaction with electrodes. While the concept has been known for decades, major knowledge gaps remain, which make it hard to evaluate its biotechnological potential. Here we present an in silico approach to identify beneficial production processes for electro fermentation by elementary mode analysis. Since the fundamentals of electron transport between electrodes and microbes have not been fully uncovered yet, we propose different options and discuss their impact on biomass and product yields. For the first time, 20 different valuable products were screened for their potential to show increased yields during anaerobic electrically enhanced fermentation. Surprisingly, we found that an increase in product formation by electrical enhancement is not necessarily dependent on the degree of reduction of the product but rather on the metabolic pathway it is derived from. We present a variety of beneficial processes with product yield increases of up to 36% in reductive and 84% in oxidative fermentations, and final theoretical product yields up to 100%. This includes compounds that are already produced at industrial scale, such as succinic acid, lysine and diaminopentane, as well as potential novel bio-commodities, such as isoprene, para-hydroxybenzoic acid and para-aminobenzoic acid. Furthermore, it is shown that the way of electron transport has a major impact on achievable biomass and product yields. The coupling of electron transport to energy conservation could be identified as crucial for most processes. This study introduces a powerful tool to determine beneficial substrate and product combinations for electro-fermentation. It also highlights that the maximal yield achievable by bio electrochemical techniques depends strongly on the actual electron transport mechanisms. Therefore it is of great importance to
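    Enumerating elementary flux modes proper requires specialized software (e.g. efmtool), so no attempt is made to reproduce the paper's analysis here. As a simplified stand-in, the sketch below computes the maximum theoretical product yield of a toy stoichiometric network by linear programming, with an extra exchange reaction standing in for electrode-supplied reducing power; the network is invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites, columns: reactions).
# R1: glucose uptake            R2: glucose -> 2 P + 2 NADH
# R3: P + 2 NADH -> product     R4: electrode NADH supply
# R5: product export            R6: P overflow export
S = np.array([
    [1, -1,  0, 0,  0,  0],   # glucose
    [0,  2, -1, 0,  0, -1],   # P (intermediate)
    [0,  2, -2, 1,  0,  0],   # NADH
    [0,  0,  1, 0, -1,  0],   # product
], dtype=float)

def max_yield(electrode_on: bool) -> float:
    """Maximize product export (R5) with glucose uptake (R1) fixed to 1,
    subject to steady state S v = 0 and v >= 0."""
    c = np.zeros(S.shape[1]); c[4] = -1.0          # maximize v5 -> minimize -v5
    bounds = [(1, 1), (0, None), (0, None),
              (0, None if electrode_on else 0), (0, None), (0, None)]
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
    return res.x[4]

print(f"yield without electrode: {max_yield(False):.2f} product/glucose")
print(f"yield with electrode:    {max_yield(True):.2f} product/glucose")
```

    In this toy network the electrode relieves the NADH limitation and doubles the theoretical product yield, which mirrors the kind of effect the elementary mode screening quantifies.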

  4. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today's markets. The authors address traditional machining topics, such as: single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  5. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U. S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  6. Internal transcribed spacer 2 barcode: A good tool for identifying Acanthopanacis cortex

    Directory of Open Access Journals (Sweden)

    Sha eZhao

    2015-10-01

    Full Text Available Acanthopanacis cortex has been used in clinical applications for a long time. Considering some historical and geographical factors, Acanthopanacis cortex is easily confused with other herbs in medicine markets, thereby causing potential safety issues. In this study, we used the internal transcribed spacer 2 (ITS2) barcode to identify 69 samples belonging to six species, including Acanthopanacis cortex and its adulterants. The nearest distance, single-nucleotide polymorphism (SNP), and neighbor-joining (NJ) tree methods were used to evaluate the identification ability of the ITS2 barcode. According to the Kimura 2-parameter model, the intraspecific distance of Eleutherococcus nodiflorus ITS2 sequences ranged from 0 to 0.0132. The minimum interspecific distance between E. nodiflorus and E. giraldii was 0.0221, which was larger than the maximum intraspecific distance of E. nodiflorus. Three stable SNPs in ITS2 can be used to distinguish Acanthopanacis cortex and its closely related species. The NJ tree indicated that the Acanthopanacis cortex samples clustered into one clade, which can be distinguished clearly from the adulterants of this herb. The ITS2 secondary structure provided another dimension for identifying species. In conclusion, the ITS2 barcode effectively identifies Acanthopanacis cortex, and DNA barcoding is a convenient tool for medicine market supervision.
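    The Kimura 2-parameter distances quoted above separate transition from transversion differences. A minimal implementation for two aligned sequences is sketched below; the example sequences are invented.

```python
from math import log, sqrt

TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def k2p_distance(seq1: str, seq2: str) -> float:
    """Kimura 2-parameter distance between two aligned DNA sequences:
    d = -0.5 * ln((1 - 2P - Q) * sqrt(1 - 2Q)),
    where P and Q are the transition and transversion proportions."""
    pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
             if a in "ACGT" and b in "ACGT"]      # skip gaps/ambiguities
    n = len(pairs)
    p = sum((a, b) in TRANSITIONS for a, b in pairs) / n
    q = sum(a != b and (a, b) not in TRANSITIONS for a, b in pairs) / n
    return -0.5 * log((1 - 2 * p - q) * sqrt(1 - 2 * q))

# Two invented aligned sequences differing by two transitions.
print(f"{k2p_distance('ATCGATCGATCGATCGATCG', 'ATCGATCGATTGATCAATCG'):.4f}")
```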

  7. Benchmarking to Identify Practice Variation in Test Ordering: A Potential Tool for Utilization Management.

    Science.gov (United States)

    Signorelli, Heather; Straseski, Joely A; Genzen, Jonathan R; Walker, Brandon S; Jackson, Brian R; Schmidt, Robert L

    2015-01-01

    Appropriate test utilization is usually evaluated by adherence to published guidelines. In many cases, medical guidelines are not available. Benchmarking has been proposed as a method to identify practice variations that may represent inappropriate testing. This study investigated the use of benchmarking to identify sites with inappropriate utilization of testing for a particular analyte. We used a Web-based survey to compare 2 measures of vitamin D utilization: overall testing intensity (ratio of total vitamin D orders to blood-count orders) and relative testing intensity (ratio of 1,25(OH)2D to 25(OH)D test orders). A total of 81 facilities contributed data. The average overall testing intensity index was 0.165, or approximately 1 vitamin D test for every 6 blood-count tests. The average relative testing intensity index was 0.055, or one 1,25(OH)2D test for every 18 of the 25(OH)D tests. Both indexes varied considerably. Benchmarking can be used as a screening tool to identify outliers that may be associated with inappropriate test utilization. Copyright© by the American Society for Clinical Pathology (ASCP).

  8. Electronic monitoring of adherence to inhaled corticosteroids: an essential tool in identifying severe asthma in children.

    Science.gov (United States)

    Jochmann, Anja; Artusio, Luca; Jamalzadeh, Angela; Nagakumar, Prasad; Delgado-Eckert, Edgar; Saglani, Sejal; Bush, Andrew; Frey, Urs; Fleming, Louise J

    2017-12-01

    International guidelines recommend that severe asthma can only be diagnosed after contributory factors, including adherence, have been addressed. Accurate assessment of adherence is difficult in clinical practice. We hypothesised that electronic monitoring in children would identify nonadherence, thus delineating the small number with true severe asthma. Asthmatic children already prescribed inhaled corticosteroids were prospectively recruited and persistence of adherence assessed using electronic monitoring devices. Spirometry, airway inflammation and asthma control were measured at the start and end of the monitoring period. 93 children (62 male; median age 12.4 years) were monitored for a median of 92 days. Median (range) monitored adherence was 74% (21-99%). We identified four groups: 1) good adherence during monitoring with improved control, 24% (likely previous poor adherence); 2) good adherence with poor control, 18% (severe therapy-resistant asthma); 3) poor adherence with good control, 26% (likely overtreated); and 4) poor adherence with poor control, 32%. No clinical parameter prior to monitoring distinguished these groups. Electronic monitoring is a useful tool for identifying children in whom a step up in treatment is indicated. Different approaches are needed in those who are controlled when adherent or who are nonadherent. Electronic monitoring is essential in a paediatric severe asthma clinic. Copyright ©ERS 2017.

  9. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built-environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  10. Mumps Virus: Modification of the Identify-Isolate-Inform Tool for Frontline Healthcare Providers

    Directory of Open Access Journals (Sweden)

    Kristi L. Koenig

    2016-09-01

    Full Text Available Mumps is a highly contagious viral infection that became rare in most industrialized countries following the introduction of the measles-mumps-rubella (MMR) vaccine in 1967. The disease, however, has been re-emerging with several outbreaks over the past decade. Many clinicians have never seen a case of mumps. To assist frontline healthcare providers with detecting potential cases and initiating critical actions, investigators modified the “Identify-Isolate-Inform” tool for mumps infection. The tool is applicable to regions with rare incidences or local outbreaks, especially seen in college students, as well as globally in areas where vaccination is less common. Mumps begins with a prodrome of low-grade fever, myalgias and malaise/anorexia, followed by development of nonsuppurative parotitis, which is the pathognomonic finding associated with acute mumps infection. Orchitis and meningitis are the two most common serious complications, with hearing loss and infertility occurring rarely. Providers should consider mumps in patients with exposure to a known case or international travel to endemic regions who present with consistent signs and symptoms. If mumps is suspected, healthcare providers must immediately implement standard and droplet precautions and notify the local health department and hospital infection control personnel.

  11. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  12. Suppression subtractive hybridization as a tool to identify anthocyanin metabolism-related genes in apple skin.

    Science.gov (United States)

    Ban, Yusuke; Moriguchi, Takaya

    2010-01-01

    Anthocyanin pigmentation is one of the important determinants of consumer preference and marketability in horticultural crops such as fruits and flowers. To elucidate the mechanisms underlying the physiological process leading to anthocyanin pigmentation, identifying the genes differentially expressed in response to anthocyanin accumulation is a useful strategy. Microarrays are widely used to isolate differentially expressed genes, but their use is limited by the high cost of specialized apparatus and materials, so they are not yet in common use. Suppression subtractive hybridization (SSH) is an alternative tool that has been widely used to identify differentially expressed genes owing to its easy handling and relatively low cost. This chapter describes the procedures for SSH in apple skin, including RNA extraction from polysaccharide- and polyphenol-rich samples, poly(A)+ RNA purification, evaluation of subtraction efficiency, and differential screening using reverse northern blotting.

  13. PathScore: a web tool for identifying altered pathways in cancer data.

    Science.gov (United States)

    Gaffney, Stephen G; Townsend, Jeffrey P

    2016-12-01

    PathScore quantifies the level of enrichment of somatic mutations within curated pathways, applying a novel approach that identifies pathways enriched across patients. The application provides several user-friendly, interactive graphic interfaces for data exploration, including tools for comparing pathway effect sizes, significance, gene-set overlap and enrichment differences between projects. Web application available at pathscore.publichealth.yale.edu. Site implemented in Python and MySQL, with all major browsers supported. Source code available at: github.com/sggaffney/pathscore with a GPLv3 license. stephen.gaffney@yale.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
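    PathScore's patient-level enrichment model is its own contribution and is not reproduced here; the baseline idea of scoring mutation enrichment in a curated pathway can, however, be illustrated with a standard hypergeometric test, as sketched below with invented counts.

```python
from scipy.stats import hypergeom

def pathway_enrichment_p(n_genes_total, n_mutated_total,
                         n_genes_in_pathway, n_mutated_in_pathway):
    """P(observing >= n_mutated_in_pathway mutated genes in the pathway
    by chance), under a hypergeometric null."""
    return hypergeom.sf(n_mutated_in_pathway - 1, n_genes_total,
                        n_mutated_total, n_genes_in_pathway)

# Invented example: 20000 genes, 300 mutated overall; a 50-gene pathway with 8 hits.
p = pathway_enrichment_p(20000, 300, 50, 8)
print(f"hypergeometric enrichment p-value: {p:.2e}")
```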

  14. DRIS Analysis Identifies a Common Potassium Imbalance in Sweetgum Plantations

    Science.gov (United States)

    Mark D. Coleman; S.X. Chang; D.J. Robison

    2003-01-01

    DRIS (Diagnosis and Recommendation Integrated System) analysis was applied to fast-growing sweetgum (Liquidambar styraciflua L.) plantations in the southeast United States as a tool for nutrient diagnosis and fertilizer recommendations. First, standard foliar nutrient ratios for nitrogen (N), phosphorus (P), potassium (K), calcium (Ca), and...

  15. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification properties. The current status of SBGAT is as follows: The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features. Note that SBGAT Core can be utilized independently from SBGAT Gui. SBGAT is presently hosted in a public GitHub repository owned by SBGAT's main developer, which can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html. This code documentation is constantly updated to reflect new functionalities. SBGAT's user manual is available at https://github.com/bbercovici/SBGAT/wiki. This document contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented. Future work will therefore consist of broadening SBGAT's capabilities with the Spherical Harmonics Expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  16. Using a Systematic Approach to Identifying Organizational Factors in Root Cause Analysis

    International Nuclear Information System (INIS)

    Gallogly, Kay Wilde

    2011-01-01

    This presentation set the scene for the second discussion session. In her presentation, the author observed that: - Investigators do not see the connection between the analysis tools available and the identification of HOF. Most investigators use the tools in a cursory manner and so do not derive the full benefits of the tools. Some tools are used for presentation purposes as opposed to being used for analytical purposes e.g. event and causal factors charts. In some cases, the report will indicate that specific analytical tools were used in the investigation but the analysis is not in the body of the report. - Some investigators are documenting HOF causes but do not recognize them as such. This indicates a lack of understanding of HOF. - Others investigators focus on technical issues because of their own comfort level. - The culture of the Organisation will affect the depth of the investigation and therefore the use of the analytical tools to pursue HOF issues. - The author contends that if analysis tools are applied systematically to gather factually based data, then HOF issues can be identified. The use of factual information (without judgement and subjectivity) is important to maintain the credibility of the investigation especially when HOF issues are identified. - Systematic use of tools assists in better communication of the issues to foster greater understanding and acceptance by senior management. - Barrier Analysis, Change Analysis, and TWIN (Task Demands, Work Environment, Individual Capabilities, and Human Nature) all offer the opportunity to identify HOF issues if the analyst pursues this line of investigation. It was illustrated that many elements of the TWIN Error Precursors are themselves Organisational in nature. - The TWIN model applied to the Anatomy of an Event will help to distinguish those which are Organisational issues (Latent Organisational Weaknesses, Error Precursors and Flawed Defences) and those which are human factors (Active Errors

  17. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983 to 1985 a lecture series entitled 'Risk-benefit analysis' was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland contributed to this joint volume reporting on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de

  18. SMM-system: A mining tool to identify specific markers in Salmonella enterica.

    Science.gov (United States)

    Yu, Shuijing; Liu, Weibing; Shi, Chunlei; Wang, Dapeng; Dan, Xianlong; Li, Xiao; Shi, Xianming

    2011-03-01

    This report presents SMM-system, a software package that implements various personalized pre- and post-BLASTN tasks for mining specific markers of microbial pathogens. The main functionalities of SMM-system are summarized as follows: (i) converting multi-FASTA files, (ii) cutting genomic sequences of interest, (iii) automatic high-throughput BLASTN searches, and (iv) screening target sequences. The utility of SMM-system was demonstrated by using it to identify 214 Salmonella enterica-specific protein-coding sequences (CDSs). Eighteen primer pairs were designed based on eighteen S. enterica-specific CDSs. Seven of these primer pairs were validated with a PCR assay, which showed 100% inclusivity for the 101 S. enterica genomes and 100% exclusivity for the 30 non-S. enterica genomes. Three specific primer pairs were chosen to develop a multiplex PCR assay, which generated specific amplicons with sizes of 180 bp (SC1286), 238 bp (SC1598) and 405 bp (SC4361), respectively. This study demonstrates that SMM-system is a high-throughput specific marker generation tool that can be used to identify genus-, species-, serogroup- and even serovar-specific DNA sequences of microbial pathogens, which has a potential to be applied in food industries, diagnostics and taxonomic studies. SMM-system is freely available and can be downloaded from http://foodsafety.sjtu.edu.cn/SMM-system.html. Copyright © 2011 Elsevier B.V. All rights reserved.
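    The screening step, keeping candidates present in all target genomes and absent from all non-targets, reduces to an inclusivity/exclusivity filter. The sketch below uses naive substring matching on in-memory genome strings; SMM-system itself drives BLASTN for this, so treat it purely as an illustration of the filtering logic.

```python
def screen_markers(candidates, target_genomes, non_target_genomes):
    """Keep candidate sequences found in every target genome (100% inclusivity)
    and in no non-target genome (100% exclusivity)."""
    specific = []
    for marker in candidates:
        inclusive = all(marker in g for g in target_genomes)
        exclusive = not any(marker in g for g in non_target_genomes)
        if inclusive and exclusive:
            specific.append(marker)
    return specific

# Toy genomes and candidate markers for illustration.
targets = ["AAGGCTTACGGATT", "CCTTACGGATTAAG"]
non_targets = ["AAGGCTTTTTTGGG", "CCCCCCCCCCAAAA"]
print(screen_markers(["TACGGATT", "AAGGCTT"], targets, non_targets))
# -> ['TACGGATT']: present in both targets, absent from both non-targets
```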

  19. Genomic suppression subtractive hybridization as a tool to identify differences in mycorrhizal fungal genomes.

    Science.gov (United States)

    Murat, Claude; Zampieri, Elisa; Vallino, Marta; Daghino, Stefania; Perotto, Silvia; Bonfante, Paola

    2011-05-01

    Characterization of genomic variation among different microbial species, or different strains of the same species, is a field of significant interest with a wide range of potential applications. We have investigated the genomic variation in mycorrhizal fungal genomes through genomic suppressive subtractive hybridization. The comparison was between phylogenetically distant and close truffle species (Tuber spp.), and between isolates of the ericoid mycorrhizal fungus Oidiodendron maius featuring different degrees of metal tolerance. In the interspecies experiment, almost all the sequences that were identified in the Tuber melanosporum genome and absent in Tuber borchii and Tuber indicum corresponded to transposable elements. In the intraspecies comparison, some specific sequences corresponded to regions coding for enzymes, among them a glutathione synthetase known to be involved in metal tolerance. This approach is a quick and rather inexpensive tool to develop molecular markers for mycorrhizal fungi tracking and barcoding, to identify functional genes and to investigate the genome plasticity, adaptation and evolution. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  20. Nominal group technique: a brainstorming tool for identifying areas to improve pain management in hospitalized patients.

    Science.gov (United States)

    Peña, Adolfo; Estrada, Carlos A; Soniat, Debbie; Taylor, Benjamin; Burton, Michael

    2012-01-01

    Pain management in hospitalized patients remains a priority area for improvement; effective strategies for consensus development are needed to prioritize interventions. To identify challenges, barriers, and perspectives of healthcare providers in managing pain among hospitalized patients. Qualitative and quantitative group consensus using a brainstorming technique for quality improvement, the nominal group technique (NGT). One medical, 1 medical-surgical, and 1 surgical hospital unit at a large academic medical center. Nurses, resident physicians, patient care technicians, and unit clerks. Responses and rankings for the NGT question: "What causes uncontrolled pain in your unit?" Twenty-seven health workers generated a total of 94 ideas. The ideas perceived as contributing to suboptimal pain control were grouped as system factors (timeliness, n = 18 ideas; communication, n = 11; pain assessment, n = 8), human factors (knowledge and experience, n = 16; provider bias, n = 8; patient factors, n = 19), and interface of system and human factors (standardization, n = 14). Knowledge, timeliness, provider bias, and patient factors were the top ranked themes. Knowledge and timeliness are considered main priorities to improve pain control. NGT is an efficient tool for identifying general and context-specific priority areas for quality improvement; teams of healthcare providers should consider using NGT to address their own challenges and barriers. Copyright © 2011 Society of Hospital Medicine.

  1. NREL Analysis Identifies Where Commercial Customers Might Benefit from

    Science.gov (United States)

    NREL Analysis Identifies Where Commercial Customers Might Benefit from Battery Energy Storage. August 24, 2017. After upfront costs, batteries may reduce operating costs for customers paying demand charges. Commercial electricity customers who are

  2. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    Identifying Importance-Performance Matrix Analysis (IPMA) of intellectual capital and Islamic work ethics in Malaysian SMEs. ... capital and Islamic work ethics significantly influenced business performance.

  3. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state-of-the-art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  4. Diagnostic test accuracy of nutritional tools used to identify undernutrition in patients with colorectal cancer: a systematic review.

    Science.gov (United States)

    Håkonsen, Sasja Jul; Pedersen, Preben Ulrich; Bath-Hextall, Fiona; Kirkpatrick, Pamela

    2015-05-15

    Effective nutritional screening, nutritional care planning and nutritional support are essential in all settings, and there is no doubt that a health service seeking to increase safety and clinical effectiveness must take nutritional care seriously. Screening and early detection of malnutrition is crucial in identifying patients at nutritional risk. There is a high prevalence of malnutrition in hospitalized patients undergoing treatment for colorectal cancer. To synthesize the best available evidence regarding the diagnostic test accuracy of nutritional tools (sensitivity and specificity) used to identify malnutrition (specifically undernutrition) in patients with colorectal cancer (such as the Malnutrition Screening Tool and Nutritional Risk Index) compared to reference tests (such as the Subjective Global Assessment or Patient Generated Subjective Global Assessment). Patients with colorectal cancer requiring either (or all) surgery, chemotherapy and/or radiotherapy in secondary care. Focus of the review: The diagnostic test accuracy of validated assessment tools/instruments (such as the Malnutrition Screening Tool and Nutritional Risk Index) in the diagnosis of malnutrition (specifically under-nutrition) in patients with colorectal cancer, relative to reference tests (Subjective Global Assessment or Patient Generated Subjective Global Assessment). Types of studies: Diagnostic test accuracy studies regardless of study design. Studies published in English, German, Danish, Swedish and Norwegian were considered for inclusion in this review. Databases were searched from their inception to April 2014. Methodological quality was determined using the Quality Assessment of Diagnostic Accuracy Studies checklist. Data was collected using the data extraction form: the Standards for Reporting Studies of Diagnostic Accuracy checklist for the reporting of studies of diagnostic accuracy. The accuracy of diagnostic tests is presented in terms of sensitivity, specificity, positive

  5. Transposon activation mutagenesis as a screening tool for identifying resistance to cancer therapeutics

    International Nuclear Information System (INIS)

    Chen, Li; Schmidt, Emmett V; Stuart, Lynda; Ohsumi, Toshiro K; Burgess, Shawn; Varshney, Gaurav K; Dastur, Anahita; Borowsky, Mark; Benes, Cyril; Lacy-Hulbert, Adam

    2013-01-01

    The development of resistance to chemotherapies represents a significant barrier to successful cancer treatment. Resistance mechanisms are complex, can involve diverse and often unexpected cellular processes, and can vary with both the underlying genetic lesion and the origin or type of tumor. For these reasons, developing experimental strategies that could be used to understand, identify and predict mechanisms of resistance in different malignant cells would be a major advance. Here we describe a gain-of-function forward genetic approach for identifying mechanisms of resistance. This approach uses a modified piggyBac transposon to generate libraries of mutagenized cells, each containing transposon insertions that randomly activate nearby gene expression. Genes of interest are identified using next-gen high-throughput sequencing, and barcode multiplexing is used to reduce experimental cost. Using this approach, we successfully identified genes involved in paclitaxel resistance in a variety of cancer cell lines, including the multidrug transporter ABCB1, a previously identified major paclitaxel resistance gene. Analysis of co-occurring transposon integration sites in single-cell clones allows for the identification of genes that might act cooperatively to produce drug resistance, a level of information not accessible using RNAi or ORF expression screening approaches. We have developed a powerful pipeline to systematically discover drug resistance mechanisms in mammalian cells in vitro. This cost-effective approach can be readily applied to different cell lines, to identify canonical or context-specific resistance mechanisms. Its ability to probe complex genetic contexts and non-coding genomic elements, as well as cooperative resistance events, makes it a good complement to RNAi or ORF expression-based screens

  6. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB , which creates and displays high signal-to- noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
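    One classic temporal-processing trick for suppressing cloud noise in such time series is maximum-value compositing: each pixel keeps its highest vegetation-index value within a short window, since clouds depress the index. A minimal sketch on synthetic data follows; TSPT's actual processing chain is more elaborate and is not reproduced here.

```python
import numpy as np

def max_value_composite(vi_stack, window=8):
    """Collapse a (time, y, x) vegetation-index stack into composites by
    taking the per-pixel maximum over each `window`-day block; NaNs
    (flagged bad pixels) are ignored."""
    t = vi_stack.shape[0] - vi_stack.shape[0] % window
    blocks = vi_stack[:t].reshape(t // window, window, *vi_stack.shape[1:])
    return np.nanmax(blocks, axis=1)

# Synthetic daily NDVI for a 2x2-pixel area: true value ~0.7, random cloud dips.
rng = np.random.default_rng(1)
ndvi = np.full((32, 2, 2), 0.7) + rng.normal(0, 0.02, (32, 2, 2))
cloudy = rng.random((32, 2, 2)) < 0.4
ndvi[cloudy] -= 0.5                        # clouds depress apparent NDVI
composites = max_value_composite(ndvi, window=8)
print("composite NDVI per 8-day period:",
      np.round(composites.mean(axis=(1, 2)), 3))
```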

  7. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach that considers the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. It is found that the early tool wear state (i.e., flank wear under 40 µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning machining can be judged from the change in this residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
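
    As a hedged illustration of the residual-error idea, the following Python sketch fits an autoregressive (AR) model to a baseline vibration signal and reports the residual RMS on new signals. The signals are synthetic and the model is a generic least-squares AR fit, not the paper's exact time series model.

        # Residual-based wear monitoring sketch: a rising residual RMS relative
        # to the sharp-tool model flags a change in cutting dynamics.
        import numpy as np

        def fit_ar(x, order=8):
            # Least-squares AR coefficients from lagged samples.
            X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
            coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
            return coeffs

        def residual_rms(x, coeffs):
            order = len(coeffs)
            X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
            return np.sqrt(np.mean((x[order:] - X @ coeffs) ** 2))

        rng = np.random.default_rng(1)
        baseline = rng.standard_normal(2000)                     # sharp-tool vibration
        worn = baseline + 0.5 * rng.standard_normal(2000) ** 2   # added nonlinearity

        coeffs = fit_ar(baseline)
        print("baseline residual:", residual_rms(baseline, coeffs))
        print("worn-tool residual:", residual_rms(worn, coeffs))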

  8. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  9. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    Energy Technology Data Exchange (ETDEWEB)

    Plott, B. [Alion Science and Technology, MA and D Operation, 4949 Pearl E. Circle, 300, Boulder, CO 80301 (United States)

    2006-07-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  10. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (ACT), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression … can be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots …

  11. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with reference predictions of bearing internal load distributions, stiffness, deflection and stresses.

  12. Swarm-Aurora: A web-based tool for quickly identifying multi-instrument auroral events

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Knudsen, D. J.; Frey, H. U.; Kauristie, K.; Partamies, N.; Jackel, B. J.; Gillies, M.; Holmdahl Olsen, P. E.

    2016-12-01

    In recent years there has been a dramatic increase in ground-based auroral imaging systems. These include the continent-wide THEMIS-ASI network and imagers operated by other programs including GO-Canada, MIRACLE, AGO, OMTI, and more. In the near future, a new Canadian program called TREx will see the deployment of new narrow-band ASIs that will provide multi-wavelength imaging across Western Canada. At the same time, there is an unprecedented fleet of international spacecraft probing geospace at low and high altitudes. We are now in a position to simultaneously observe the magnetospheric drivers of aurora, observe in situ the waves, currents, and particles associated with MI coupling, and observe the conjugate aurora. Whereas a decade ago a single magnetic conjunction between one ASI and a low-altitude satellite was a relatively rare event, we now have a plethora of triple conjunctions between imagers, low-altitude spacecraft, and near-equatorial magnetospheric probes. But with these riches comes a new level of complexity. It is often difficult to identify the many useful conjunctions for a specific line of inquiry from the multitude of conjunctions where the geospace conditions are not relevant and/or the imaging is compromised by clouds, moonlight, or other factors. Swarm-Aurora was designed to facilitate and drive the use of Swarm in situ measurements in auroral science. The project seeks to build a bridge between the Swarm science community, Swarm data, and the complementary auroral data and community. Swarm-Aurora (http://swarm-aurora.phys.ucalgary.ca) incorporates a web-based tool which provides access to quick-look summary data for a large array of instruments, with Swarm in situ and ground-based ASI data as the primary focus. This web interface allows researchers to quickly and efficiently browse Swarm and ASI data to identify auroral events of interest to them, and to easily and quickly identify Swarm overflights of ASIs that

  13. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  14. Uniqueness plots: A simple graphical tool for identifying poor peak fits in X-ray photoelectron spectroscopy

    International Nuclear Information System (INIS)

    Singh, Bhupinder; Diwan, Anubhav; Jain, Varun; Herrera-Gomez, Alberto; Terry, Jeff; Linford, Matthew R.

    2016-01-01

    Highlights: • Uniqueness plots are introduced as a new tool for identifying poor XPS peak fits. • Uniqueness plots are demonstrated on real XPS data sets. • A horizontal line in a uniqueness plot indicates a poor fit, i.e., fit parameter correlation. • A parabolic shape in a uniqueness plot indicates that a fit may be appropriate. - Abstract: Peak fitting is an essential part of X-ray photoelectron spectroscopy (XPS) narrow scan analysis, and the literature contains both good and bad examples of peak fitting. A common cause of poor peak fitting is the inclusion of too many fit parameters, often without a sound chemical and/or physical basis for them, and/or the failure to reasonably constrain them. Under these conditions, fit parameters are often correlated, and therefore lacking in statistical meaning. Here we introduce the uniqueness plot as a simple graphical tool for identifying bad peak fits in XPS, i.e., fit parameter correlation. These plots are widely used in spectroscopic ellipsometry. We illustrate uniqueness plots with two data sets: a C 1s narrow scan from ozone-treated carbon nanotube forests and an Si 2p narrow scan from an air-oxidized silicon wafer. For each fit, we consider different numbers of parameters and constraints on them. As expected, the uniqueness plots are parabolic when fewer fit parameters and/or more constraints are applied. However, they fan out and eventually become horizontal lines as more unconstrained parameters are included in the fits. Uniqueness plots are generated by plotting the chi-squared (χ²) value for a fit vs. a systematically varied value of a parameter in the fit. The Abbe criterion is also considered as a figure of merit for uniqueness plots in the Supporting Information. We recommend that uniqueness plots be used by XPS practitioners for identifying inappropriate peak fits.
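
    A uniqueness plot as described above can be generated with a short script: pin one fit parameter at a series of values, re-optimize the remaining parameters at each value, and record χ². The following Python sketch does this for a synthetic Gaussian peak; it is illustrative only, not the authors' code.

        # Uniqueness-plot sketch: chi-squared vs a systematically varied parameter.
        import numpy as np
        from scipy.optimize import curve_fit

        def gauss(x, a, mu, sigma):
            return a * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

        x = np.linspace(-5, 5, 200)
        rng = np.random.default_rng(2)
        y = gauss(x, 1.0, 0.0, 1.0) + 0.02 * rng.standard_normal(x.size)

        mu_values = np.linspace(-0.5, 0.5, 21)
        chi2 = []
        for mu_fixed in mu_values:
            # Re-fit amplitude and width with the peak position pinned.
            popt, _ = curve_fit(lambda x, a, s: gauss(x, a, mu_fixed, s), x, y, p0=[1, 1])
            chi2.append(np.sum((y - gauss(x, popt[0], mu_fixed, popt[1])) ** 2))

        # A parabola in (mu_values, chi2) suggests a unique fit; a flat line
        # would suggest parameter correlation.
        for mu_fixed, c in zip(mu_values, chi2):
            print(f"mu = {mu_fixed:+.2f}  chi2 = {c:.4f}")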

  15. Uniqueness plots: A simple graphical tool for identifying poor peak fits in X-ray photoelectron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Bhupinder; Diwan, Anubhav; Jain, Varun [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT, 84606 (United States); Herrera-Gomez, Alberto [CINVESTAV-Unidad Queretaro, Queretaro, 76230 (Mexico); Terry, Jeff [Department of Physics, Illinois Institute of Technology, Chicago, IL, 60616 (United States); Linford, Matthew R., E-mail: mrlinford@chem.byu.edu [Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT, 84606 (United States)

    2016-11-30

    Highlights: • Uniqueness plots are introduced as a new tool for identifying poor XPS peak fits. • Uniqueness plots are demonstrated on real XPS data sets. • A horizontal line in a uniqueness plot indicates a poor fit, i.e., fit parameter correlation. • A parabolic shape in a uniqueness plot indicates that a fit may be appropriate. - Abstract: Peak fitting is an essential part of X-ray photoelectron spectroscopy (XPS) narrow scan analysis, and the literature contains both good and bad examples of peak fitting. A common cause of poor peak fitting is the inclusion of too many fit parameters, often without a sound chemical and/or physical basis for them, and/or the failure to reasonably constrain them. Under these conditions, fit parameters are often correlated, and therefore lacking in statistical meaning. Here we introduce the uniqueness plot as a simple graphical tool for identifying bad peak fits in XPS, i.e., fit parameter correlation. These plots are widely used in spectroscopic ellipsometry. We illustrate uniqueness plots with two data sets: a C 1s narrow scan from ozone-treated carbon nanotube forests and an Si 2p narrow scan from an air-oxidized silicon wafer. For each fit, we consider different numbers of parameters and constraints on them. As expected, the uniqueness plots are parabolic when fewer fit parameters and/or more constraints are applied. However, they fan out and eventually become horizontal lines as more unconstrained parameters are included in the fits. Uniqueness plots are generated by plotting the chi-squared (χ²) value for a fit vs. a systematically varied value of a parameter in the fit. The Abbe criterion is also considered as a figure of merit for uniqueness plots in the Supporting Information. We recommend that uniqueness plots be used by XPS practitioners for identifying inappropriate peak fits.

  16. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data.

  17. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, removing any contents that may have been hidden, and removing any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available to the general public. The results presented in this work can also be seen as a useful

  18. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system's … The list of such variables and functional relations constitutes the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid; in a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  19. Identifying the Risk Areas and Urban Growth by ArcGIS-Tools

    Directory of Open Access Journals (Sweden)

    Omar Hamdy

    2016-10-01

    Abouelreesh is one of the most at-risk areas in Aswan, Egypt; it suffers from storms, poor drainage, and flash flooding. These phenomena affect the urban areas and cause extensive damage to buildings and infrastructure. Moreover, the potential for dangerous situations increased as the urban areas of Abouelreesh extended towards the risk areas. In an effort to ameliorate the danger, two key issues for urban growth management were studied, namely: (i) estimation of the pace of urban sprawl, and (ii) identification of urban areas located in regions that would be affected by flash floods. Analyzing these phenomena requires a lot of data in order to obtain good results, but in our case official or field data were limited, so we obtained data from two kinds of free satellite data sources. First, we used ArcGIS tools to analyze digital elevation model (DEM) files in order to study the watershed and better identify the risk area. Second, we studied historical imagery in Google Earth to determine the age of each urban block. The urban growth rate in the risk areas had risen to 63.31% in 2001. Urban growth in the case study area has been influenced by house sizes, because most people were looking to live in bigger houses. This problem can be observed in the increase in average house size from 2001 until 2013: especially in risky areas, the average house size grew from 223 m² in 2001 to 318 m² in 2013. The findings from this study would be useful to urban planners and government officials in helping them to make informed decisions on urban development to benefit the community, especially those living in areas at risk of flash flooding from heavy rain events.

  20. Enterococcus phages as potential tool for identifying sewage inputs in the Great Lakes region

    Science.gov (United States)

    Vijayavel, K.; Byappanahalli, Muruleedhara N.; Whitman, Richard L.; Ebdon, J.; Taylor, H.; Kashian, D.R.

    2014-01-01

    Bacteriophages are viruses that infect bacteria and can be used as a tool to detect fecal contamination in surface waters around the world. However, the lack of a universal host strain makes them unsuitable for tracking fecal sources. We evaluated the suitability of two newly isolated Enterococcus host strains (ENT-49 and ENT-55) for identifying sewage contamination in impacted waters by targeting phages specific to these hosts. Both host strains were isolated from wastewater samples and identified as E. faecium by 16S rRNA gene sequencing. Occurrence of Enterococcus phages was evaluated in sewage samples (n = 15) from five wastewater treatment plants and in fecal samples from twenty-two species of wild and domesticated animals (individual samples; n = 22). Levels of Enterococcus phages, F+ coliphages, Escherichia coli and enterococci were examined in four rivers, four beaches, and three harbors. Enterococcus phages were enumerated at levels (mean = 6.72 log PFU/100 mL) similar to F+ coliphages in all wastewater samples, but were absent from all non-human fecal sources tested. The phages infecting Enterococcus spp. and F+ coliphages were not detected in the river samples (detection threshold < 10 PFU/100 mL), but were present in the beach and harbor samples (range = 1.83 to 2.86 log PFU/100 mL). Slightly higher concentrations (range = 3.22 to 3.69 log MPN/100 mL) of E. coli and enterococci, when compared to F+ coliphages and Enterococcus phages, were observed in the river, beach and harbor samples. Our findings suggest that the bacteriophages associated with these particular Enterococcus host strains offer potentially sensitive and human-source-specific indicators of enteric pathogen risk.

  1. Development of the ProPal-COPD tool to identify patients with COPD for proactive palliative care

    Directory of Open Access Journals (Sweden)

    Duenk RG

    2017-07-01

    Background: Our objective was to develop a tool to identify patients with COPD for proactive palliative care. Since palliative care needs increase during the disease course of COPD, the prediction of mortality within 1 year, measured during hospitalizations for acute exacerbation of COPD (AECOPD), was used as a proxy for the need for proactive palliative care. Patients and methods: Patients were recruited from three general hospitals in the Netherlands in 2014. Data on 11 potential predictors, selected a priori based on the literature, were collected during hospitalization for AECOPD. After 1 year, the medical files were examined for the date of death. An optimal prediction model was assessed by Lasso logistic regression, with 20-fold cross-validation for optimal shrinkage. Missing data were handled using complete case analysis. Results: Of 174 patients, 155 were included; of those, 30 (19.4%) died within 1 year. The optimal prediction model was internally validated and had good discriminating power (AUC = 0.82, 95% CI 0.81–0.82). This model relied on the following seven predictors: the surprise question, Medical Research Council dyspnea questionnaire (MRC dyspnea), Clinical COPD Questionnaire (CCQ), FEV1% of predicted value, body mass index, previous hospitalizations for AECOPD and specific comorbidities. To ensure minimal miss out of patients in need
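
    The modelling step described above can be sketched with scikit-learn, using its LogisticRegressionCV estimator as a stand-in for the paper's Lasso procedure; the data below are random placeholders with two informative predictors, not the study data.

        # L1-penalized (Lasso) logistic regression with cross-validated shrinkage.
        import numpy as np
        from sklearn.linear_model import LogisticRegressionCV

        rng = np.random.default_rng(3)
        X = rng.standard_normal((155, 11))               # 11 candidate predictors
        logits = 1.5 * X[:, 0] + 1.0 * X[:, 1] - 1.6     # two informative predictors
        y = rng.random(155) < 1 / (1 + np.exp(-logits))  # ~1-year mortality indicator

        model = LogisticRegressionCV(
            Cs=20, cv=20, penalty="l1", solver="liblinear", scoring="roc_auc"
        ).fit(X, y)

        # Predictors with non-zero coefficients survive the Lasso shrinkage.
        print("predictors retained:", np.flatnonzero(model.coef_[0]))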

  2. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
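
    As a loose illustration (not the NOvA implementation), the following Python sketch shows how regex-based recognition patterns and a time-window correlation rule might be combined: a rule fires only when all of its patterns have matched recent messages within the window. All rule names and messages are hypothetical.

        # Toy log-correlation engine: regex patterns plus a time-window rule.
        import re
        from collections import deque

        RULES = [
            {
                "name": "dcm-buffer-overflow",
                "patterns": [re.compile(r"buffer overflow on (dcm-\d+)"),
                             re.compile(r"event builder stalled")],
                "window_s": 5.0,
            },
        ]

        recent = deque(maxlen=1000)   # (timestamp, message) pairs

        def on_message(ts, msg):
            recent.append((ts, msg))
            for rule in RULES:
                hits = [p for p in rule["patterns"]
                        if any(p.search(m) for t, m in recent
                               if ts - t <= rule["window_s"])]
                if len(hits) == len(rule["patterns"]):
                    print(f"[{ts:.1f}] rule fired: {rule['name']}")

        on_message(0.0, "buffer overflow on dcm-12")
        on_message(2.5, "event builder stalled")   # both patterns in window -> fires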

  3. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    Lu Qiming; Biery, Kurt A; Kowalkowski, James B

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  4. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  5. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
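
    The paper derives spherical-coordinate FDTD equations from first principles; as a simpler taste of the method, here is a standard one-dimensional Cartesian FDTD update loop in normalized units. This is an illustrative sketch, not the paper's code.

        # Minimal 1D FDTD (Yee scheme) with a Gaussian pulse hard source.
        import numpy as np

        nz, nt = 400, 800
        ez = np.zeros(nz)          # electric field samples
        hy = np.zeros(nz - 1)      # magnetic field samples, staggered half-cell

        for n in range(nt):
            # Update H from the curl of E, then E from the curl of H
            # (Courant factor 0.5 keeps the scheme stable).
            hy += 0.5 * (ez[1:] - ez[:-1])
            ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])
            # Gaussian pulse injected at the grid centre.
            ez[nz // 2] += np.exp(-((n - 40) / 12.0) ** 2)

        print("peak |Ez| after propagation:", np.abs(ez).max())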

  6. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  7. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used...... e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  8. Identifying organizational deficiencies through root-cause analysis

    International Nuclear Information System (INIS)

    Tuli, R.W.; Apostolakis, G.E.

    1996-01-01

    All nuclear power plants incorporate root-cause analysis as an instrument to help identify and isolate key factors judged to be of significance following an incident or accident. Identifying the principal deficiencies can become very difficult when the event involves not only human and machine interaction, but possibly the underlying safety and quality culture of the organization. The current state of root-cause analysis is to conclude the investigation after identifying human and/or hardware failures. In this work, root-cause analysis is taken one step further by examining plant work processes and organizational factors. This extension is considered significant to the success of the analysis, especially when management deficiency is believed to contribute to the incident. The results of root-cause analysis can be most effectively implemented if the organization, as a whole, wishes to improve the overall operation of the plant by preventing similar incidents from occurring again. The study adds to existing root-cause analysis the ability to localize the causes of undesirable events and to focus on those problems hidden deeply within the work processes that are routinely followed in the operation and maintenance of the facility.

  9. Validation of the TAPS-1: A Four-Item Screening Tool to Identify Unhealthy Substance Use in Primary Care.

    Science.gov (United States)

    Gryczynski, Jan; McNeely, Jennifer; Wu, Li-Tzy; Subramaniam, Geetha A; Svikis, Dace S; Cathers, Lauretta A; Sharma, Gaurav; King, Jacqueline; Jelstrom, Eve; Nordeck, Courtney D; Sharma, Anjalee; Mitchell, Shannon G; O'Grady, Kevin E; Schwartz, Robert P

    2017-09-01

    Background: The Tobacco, Alcohol, Prescription Medication, and Other Substance use (TAPS) tool is a combined two-part screening and brief assessment developed for adult primary care patients. The tool's first-stage screening component (TAPS-1) consists of four items asking about past 12-month use for four substance categories, with response options of never, less than monthly, monthly, weekly, and daily or almost daily. Objective: To validate the TAPS-1 in primary care patients. Design: Participants completed the TAPS tool in self- and interviewer-administered formats, in random order. In this secondary analysis, the TAPS-1 was evaluated against DSM-5 substance use disorder (SUD) criteria to determine optimal cut-points for identifying unhealthy substance use at three severity levels (problem use, mild SUD, and moderate-to-severe SUD). Participants: Two thousand adult patients at five primary care sites. Measurements: DSM-5 SUD criteria were determined via the modified Composite International Diagnostic Interview. Oral fluid was used as a biomarker of recent drug use. Results: Optimal frequency-of-use cut-points on the self-administered TAPS-1 for identifying SUDs were ≥ monthly use for tobacco and alcohol (sensitivity = 0.92 and 0.71, specificity = 0.80 and 0.85, AUC = 0.86 and 0.78, respectively) and any reported use for illicit drugs and prescription medication misuse (sensitivity = 0.93 and 0.89, specificity = 0.85 and 0.91, AUC = 0.89 and 0.90, respectively). The performance of the interviewer-administered format was similar. When administered first, the self-administered format yielded higher disclosure rates for past 12-month alcohol use, illicit drug use, and prescription medication misuse. Frequency of use alone did not provide sufficient information to discriminate between gradations of substance use problem severity. Among those who denied drug use on the TAPS-1, less than 4% had a drug-positive biomarker. Conclusions: The TAPS-1 can identify unhealthy substance use in primary care patients with a high level of accuracy
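
    Evaluating a cut-point against a reference standard, as done above, reduces to a small calculation. The following Python sketch computes sensitivity, specificity, and AUC for a hypothetical "≥ monthly" cut-point on simulated data; the numbers it produces are not the study's results.

        # Cut-point evaluation sketch: frequency codes 0-4 map to
        # never ... daily or almost daily; SUD status is the reference standard.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        sud = rng.random(2000) < 0.15                        # reference standard
        freq = np.where(sud, rng.integers(1, 5, 2000), rng.integers(0, 3, 2000))

        screen_pos = freq >= 2                               # ">= monthly" cut-point
        sens = (screen_pos & sud).sum() / sud.sum()
        spec = (~screen_pos & ~sud).sum() / (~sud).sum()
        print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
              f"AUC={roc_auc_score(sud, freq):.2f}")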

  10. Using molecular tools to identify the geographical origin of a case of human brucellosis.

    Science.gov (United States)

    Muchowski, J K; Koylass, M S; Dainty, A C; Stack, J A; Perrett, L; Whatmore, A M; Perrier, C; Chircop, S; Demicoli, N; Gatt, A B; Caruana, P A; Gopaul, K K

    2015-10-01

    Although Malta is historically linked with the zoonosis brucellosis, there had not been a case of the disease in either the human or livestock population for several years. However, in July 2013 a case of human brucellosis was identified on the island. To determine whether this recent case originated in Malta, four isolates from this case were subjected to molecular analysis. Molecular profiles generated using multilocus sequence analysis and multilocus variable-number tandem repeat analysis for the recent human case isolates and 11 Brucella melitensis strains of known Maltese origin were compared with others held in in-house and global databases. While the 11 isolates of Maltese origin formed a distinct cluster, the recent human isolates were not associated with these strains but instead clustered with isolates originating from the Horn of Africa. These data were congruent with epidemiological trace-back, which showed that the individual had travelled to Malta from Eritrea. This work highlights the potential of using molecular typing data to aid epidemiological trace-back of Brucella isolations and to assist in monitoring the effectiveness of brucellosis control schemes.

  11. Completed Ensemble Empirical Mode Decomposition: a Robust Signal Processing Tool to Identify Sequence Strata

    Science.gov (United States)

    Purba, H.; Musu, J. T.; Diria, S. A.; Permono, W.; Sadjati, O.; Sopandi, I.; Ruzi, F.

    2018-03-01

    Well logging data provide a wealth of geological information, and their trends resemble nonlinear or non-stationary signals. As well log data are recorded, external factors can interfere with or degrade signal resolution. A sensitive signal analysis method is therefore required to improve the accuracy of logging interpretation, which in turn is important for determining sequence stratigraphy. Complete Ensemble Empirical Mode Decomposition (CEEMD) is a nonlinear and non-stationary signal analysis method which decomposes a complex signal into a series of intrinsic mode functions (IMFs). Gamma Ray and Spontaneous Potential well log parameters were decomposed into IMF-1 through IMF-10, and combinations and correlations of these modes were given physical interpretations. The method identifies the stratigraphy and cycle sequence and provides an effective signal treatment for locating sequence interfaces. This method was applied to BRK-30 and BRK-13 well logging data. The results show that the combined patterns of IMF-5, IMF-6, and IMF-7 represent short-term and middle-term sedimentation, while IMF-9 and IMF-10 represent long-term sedimentation; these describe distal front and delta front facies, and inter-distributary mouth bar facies, respectively. Thus, CEEMD can clearly distinguish different sedimentary layer interfaces and better identify cycles of the stratigraphic base level.
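
    A CEEMD-style decomposition can be reproduced with the third-party PyEMD package (an assumption; installed as EMD-signal), whose CEEMDAN class implements a complete-ensemble EMD variant. The sketch below decomposes a synthetic two-scale "well log" signal into IMFs; the IMF numbering then supports the short/middle/long-term interpretation described above.

        # CEEMD-style decomposition sketch (assumes: pip install EMD-signal).
        import numpy as np
        from PyEMD import CEEMDAN

        depth = np.linspace(0, 10, 1000)
        signal = (np.sin(2 * np.pi * 5 * depth)        # short-term cyclicity
                  + np.sin(2 * np.pi * 0.5 * depth)    # long-term trend
                  + 0.2 * np.random.default_rng(4).standard_normal(depth.size))

        imfs = CEEMDAN()(signal)                       # rows are IMF-1 ... IMF-n
        print("number of IMFs extracted:", imfs.shape[0])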

  12. BIOELECTRICAL IMPEDANCE VECTOR ANALYSIS IDENTIFIES SARCOPENIA IN NURSING HOME RESIDENTS

    Science.gov (United States)

    Loss of muscle mass and water shifts between body compartments are contributing factors to frailty in the elderly. The body composition changes are especially pronounced in institutionalized elderly. We investigated the ability of single-frequency bioelectrical impedance analysis (BIA) to identify b...

  13. Identifying Students’ Misconceptions on Basic Algorithmic Concepts Through Flowchart Analysis

    NARCIS (Netherlands)

    Rahimi, E.; Barendsen, E.; Henze, I.; Dagienė, V.; Hellas, A.

    2017-01-01

    In this paper, a flowchart-based approach to identifying secondary school students’ misconceptions (in a broad sense) on basic algorithm concepts is introduced. This approach uses student-generated flowcharts as the units of analysis and examines them against plan composition and construct-based

  14. INTERFACING INTERACTIVE DATA ANALYSIS TOOLS WITH THE GRID: THE PPDG CS-11 ACTIVITY

    International Nuclear Information System (INIS)

    Perl, Joseph

    2003-01-01

    For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's "remote access" technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to "Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services". The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed.

  15. SIMMER as a safety analysis tool

    International Nuclear Information System (INIS)

    Smith, L.L.; Bell, C.R.; Bohl, W.R.; Bott, T.F.; Dearing, J.F.; Luck, L.B.

    1982-01-01

    SIMMER has been used for numerous applications in fast reactor safety, encompassing both accident and experiment analysis. Recent analyses of transition-phase behavior in potential core disruptive accidents have integrated SIMMER testing with the accident analysis. Results of both the accident analysis and the verification effort are presented as part of a comprehensive safety analysis program.

  16. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  17. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  18. Structural parameter identifiability analysis for dynamic reaction networks

    DEFF Research Database (Denmark)

    Davidescu, Florin Paul; Jørgensen, Sten Bay

    2008-01-01

    This contribution addresses the structural parameter identifiability problem for the typical case of reaction network models, where for a given set of measured variables it is desirable to investigate which parameters may be estimated prior to spending computational effort on the actual estimation. The proposed analysis is performed in two phases. The first phase determines the structurally identifiable reaction rates based on reaction network stoichiometry. The second phase assesses the structural parameter identifiability of the specific kinetic rate expressions using a generating series expansion method based on Lie derivatives. The proposed systematic two-phase methodology is illustrated on a mass action based model for an enzymatically catalyzed reaction pathway network where only a limited set of variables is measured. The methodology clearly pinpoints the structurally identifiable parameters.
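
    As a much-simplified illustration of structural identifiability (a rank test on symbolic output sensitivities, not the paper's generating-series machinery), consider a single Michaelis-Menten rate with the reaction rate measured as the output:

        # Identifiability sketch: full rank of the output-sensitivity matrix
        # suggests (Vmax, Km) are structurally identifiable from v(s).
        import sympy as sp

        s, Vmax, Km = sp.symbols("s Vmax Km", positive=True)
        rate = Vmax * s / (Km + s)                     # measured output v(s)

        # Sensitivities dv/dtheta, plus their derivative in s as an
        # independent second row of information.
        sens = sp.Matrix([[sp.diff(rate, Vmax), sp.diff(rate, Km)]])
        S = sp.Matrix.vstack(sens, sens.diff(s))
        print("sensitivity matrix rank:", S.rank())    # rank 2 -> identifiable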

  19. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) to find cost-competitive solutions. ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  20. Analysis of logging data from nuclear borehole tools

    International Nuclear Information System (INIS)

    Hovgaard, J.; Oelgaard, P.L.

    1989-12-01

    The processing procedure for logging data from a borehole of the Stenlille project of Dansk Naturgas A/S has been analysed. The tools considered in the analysis were an integral natural-gamma tool, a neutron porosity tool, a gamma density tool and a caliper tool. It is believed that in most cases the processing procedure used by the logging company in the interpretation of the raw data is fully understood. An exception is the epithermal part of the neutron porosity tool, where not all data needed for an interpretation were available. The analysis has shown that some parts of the interpretation procedure may not be consistent with the physical principles of the tools. (author)

  1. Structural analysis of ITER sub-assembly tools

    International Nuclear Information System (INIS)

    Nam, K.O.; Park, H.K.; Kim, D.J.; Ahn, H.J.; Lee, J.H.; Kim, K.K.; Im, K.; Shaw, R.

    2011-01-01

    The ITER Tokamak assembly tools are purpose-built tools for completing the ITER Tokamak machine, which includes the cryostat and the components contained therein. The sector sub-assembly tools described in this paper are the main assembly tools used to assemble the vacuum vessel, thermal shield and toroidal field coils into a complete 40° sector. The 40° sector sub-assembly tools comprise the sector sub-assembly tool itself, including the radial beam, the vacuum vessel supports and the mid-plane brace tools. These tools must have sufficient strength to transport and handle the heavy components of the ITER Tokamak machine, which weigh several hundred tons. They were therefore designed and analyzed to confirm both strength and structural stability, even under conservative assumptions. To verify the structural stability of the sector sub-assembly tools in terms of strength and deflection, the ANSYS code was used for linear static analysis. The results of the analysis show that these tools are designed with sufficient strength and stiffness. The conceptual designs of these tools are also briefly described in this paper.

  2. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    Lenzi, Bruno

    2012-01-01

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, covering a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis tool-kits (frameworks) whose goal is to aid physicists in performing their analyses while hiding the details of the ATHENA framework. (authors)

  3. Uniqueness plots: A simple graphical tool for identifying poor peak fits in X-ray photoelectron spectroscopy

    Science.gov (United States)

    Singh, Bhupinder; Diwan, Anubhav; Jain, Varun; Herrera-Gomez, Alberto; Terry, Jeff; Linford, Matthew R.

    2016-11-01

    Peak fitting is an essential part of X-ray photoelectron spectroscopy (XPS) narrow scan analysis, and the literature contains both good and bad examples of peak fitting. A common cause of poor peak fitting is the inclusion of too many fit parameters, often without a sound chemical and/or physical basis for them, and/or the failure to reasonably constrain them. Under these conditions, fit parameters are often correlated, and therefore lacking in statistical meaning. Here we introduce the uniqueness plot as a simple graphical tool for identifying bad peak fits in XPS, i.e., fit parameter correlation. These plots are widely used in spectroscopic ellipsometry. We illustrate uniqueness plots with two data sets: a C 1s narrow scan from ozone-treated carbon nanotube forests and an Si 2p narrow scan from an air-oxidized silicon wafer. For each fit, we consider different numbers of parameters and constraints on them. As expected, the uniqueness plots are parabolic when fewer fit parameters and/or more constraints are applied. However, they fan out and eventually become horizontal lines as more unconstrained parameters are included in the fits. Uniqueness plots are generated by plotting the chi-squared (χ²) value for a fit vs. a systematically varied value of a parameter in the fit. The Abbe criterion is also considered as a figure of merit for uniqueness plots in the Supporting Information. We recommend that uniqueness plots be used by XPS practitioners for identifying inappropriate peak fits.

  4. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
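
    For flavor, the kind of result a one-dimensional transmission line model yields for a rigidly backed channel is the classic cavity reactance, Z/ρc = -j cot(kL). The following is a minimal sketch with illustrative values, not the tool's implementation:

        # Normalized surface impedance of a rigid-backed channel of depth L.
        import numpy as np

        c = 343.0                                    # speed of sound, m/s
        L = 0.03                                     # channel depth, m
        freqs = np.array([500.0, 1000.0, 2000.0])    # Hz

        k = 2 * np.pi * freqs / c                    # wavenumber
        z_normalized = -1j / np.tan(k * L)           # -j*cot(kL), units of rho*c
        for f, z in zip(freqs, z_normalized):
            print(f"{f:6.0f} Hz: z/rho*c = {z.imag:+.3f}j")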

  5. Integration of multiple networks and pathways identifies cancer driver genes in pan-cancer analysis.

    Science.gov (United States)

    Cava, Claudia; Bertoli, Gloria; Colaprico, Antonio; Olsen, Catharina; Bontempi, Gianluca; Castiglioni, Isabella

    2018-01-06

    Modern high-throughput genomic technologies represent a comprehensive hallmark of molecular changes in pan-cancer studies. Although different cancer gene signatures have been revealed, the mechanism of tumourigenesis has yet to be completely understood. Pathways and networks are important tools to explain the role of genes in functional genomic studies. However, few methods consider the functional non-equal roles of genes in pathways and the complex gene-gene interactions in a network. We present a novel method in pan-cancer analysis that identifies de-regulated genes with a functional role by integrating pathway and network data. A pan-cancer analysis of 7158 tumour/normal samples from 16 cancer types identified 895 genes with a central role in pathways and de-regulated in cancer. Comparing our approach with 15 current tools that identify cancer driver genes, we found that 35.6% of the 895 genes identified by our method have been found as cancer driver genes with at least 2/15 tools. Finally, we applied a machine learning algorithm on 16 independent GEO cancer datasets to validate the diagnostic role of cancer driver genes for each cancer. We obtained a list of the top-ten cancer driver genes for each cancer considered in this study. Our analysis 1) confirmed that there are several known cancer driver genes in common among different types of cancer, 2) highlighted that cancer driver genes are able to regulate crucial pathways.

  6. Adapting the capacities and vulnerabilities approach: a gender analysis tool.

    Science.gov (United States)

    Birks, Lauren; Powell, Christopher; Hatfield, Jennifer

    2017-12-01

    Gender analysis methodology is increasingly being considered as essential to health research because 'women's social, economic and political status undermine their ability to protect and promote their own physical, emotional and mental health, including their effective use of health information and services' {World Health Organization [Gender Analysis in Health: a review of selected tools. 2003; www.who.int/gender/documents/en/Gender. pdf (20 February 2008, date last accessed)]}. By examining gendered roles, responsibilities and norms through the lens of gender analysis, we can develop an in-depth understanding of social power differentials, and be better able to address gender inequalities and inequities within institutions and between men and women. When conducting gender analysis, tools and frameworks may help to aid community engagement and to provide a framework to ensure that relevant gendered nuances are assessed. The capacities and vulnerabilities approach (CVA) is one such gender analysis framework that critically considers gender and its associated roles, responsibilities and power dynamics in a particular community and seeks to meet a social need of that particular community. Although the original intent of the CVA was to guide humanitarian intervention and disaster preparedness, we adapted this framework to a different context, which focuses on identifying and addressing emerging problems and social issues in a particular community or area that affect their specific needs, such as an infectious disease outbreak or difficulty accessing health information and resources. We provide an example of our CVA adaptation, which served to facilitate a better understanding of how health-related disparities affect Maasai women in a remote, resource-poor setting in Northern Tanzania. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Identifying clinical course patterns in SMS data using cluster analysis

    DEFF Research Database (Denmark)

    Kent, Peter; Kongsted, Alice

    2012-01-01

    BACKGROUND: Recently, there has been interest in using the short message service (SMS or text messaging) to gather frequent information on the clinical course of individual patients. One possible role for identifying clinical course patterns is to assist in exploring clinically important … showed that clinical course patterns can be identified by cluster analysis using all SMS time points as cluster variables. This method is simple, intuitive and does not require a high level of statistical skill. However, there are alternative ways of managing SMS data and many different methods
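
    One simple way to cluster such data is sketched below with scikit-learn's k-means, treating each weekly SMS score series as a feature vector (one variable per time point). The trajectories are simulated, and this is an illustration of the general approach rather than the paper's exact method; in practice, missing SMS responses must be handled before clustering.

        # Cluster simulated weekly pain-score trajectories into course patterns.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        weeks = np.arange(26)
        recovering = 8 * np.exp(-weeks / 6) + rng.standard_normal((40, 26))
        persistent = 7 + rng.standard_normal((40, 26))
        X = np.vstack([recovering, persistent]).clip(0, 10)

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print("cluster sizes:", np.bincount(labels))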

  8. Developing a tool for assessing competency in root cause analysis.

    Science.gov (United States)

    Gupta, Priyanka; Varkey, Prathibha

    2009-01-01

    Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in the same. A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination that was conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. They had a didactic session on RCA, followed by process mapping of the information flow at the project clinic, after which fellows conducted an actual RCA using the Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA regarding a scenario describing an adverse medication event and provided possible solutions to prevent such errors in the future. All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability for the global competency rating and checklist scoring were 0.96 and 0.85, respectively. Internal consistency (Cronbach's alpha) was 0.76. Six of the eight fellows found the difficulty level of the test to be optimal. Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.

  9. Use of rapid needs assessment as a tool to identify vaccination delays in Guatemala and Peru.

    Science.gov (United States)

    D'Ardenne, Katie K; Darrow, Juliana; Furniss, Anna; Chavez, Catia; Hernandez, Herminio; Berman, Stephen; Asturias, Edwin J

    2016-03-29

    To explore the use of rapid needs assessment (RNA) surveys to determine the prevalence of, and factors contributing to, delays in vaccination of children in two lower-middle-income countries (LMICs). Data from two RNA surveys performed as part of program improvement evaluations in Guatemala and Peru were used for this analysis. The primary endpoint was the timeliness of immunization, with delay defined as administration of vaccines beyond 28 days from the recommended age for DTwP-HepB-Hib (Penta) and measles-mumps-rubella (MMR) vaccines, as well as past age restrictions for rotavirus vaccine. Independent risk factors analyzed included child's gender, birth year, number of children in household, maternal age, maternal education, and food insecurity. Vaccine information was available for 811 children from the 838 households surveyed. A high rate of immunization delays was observed, with 75.6% of children in Guatemala and 57.8% of children in Peru delayed for the third dose of the Penta primary series. Factors associated with delayed vaccination in Guatemala included advanced maternal age and increased number of children in the household. In Peru, significant associations were birth year before 2009, lower maternal education level, and increased number of children in the household. RNA is a fast and effective method to identify timely vaccine coverage and to derive hypotheses about factors possibly associated with vaccination delay. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. RNA-ID, a Powerful Tool for Identifying and Characterizing Regulatory Sequences.

    Science.gov (United States)

    Brule, C E; Dean, K M; Grayhack, E J

    2016-01-01

    The identification and analysis of sequences that regulate gene expression is critical because regulated gene expression underlies biology. RNA-ID is an efficient and sensitive method to discover and investigate regulatory sequences in the yeast Saccharomyces cerevisiae, using fluorescence-based assays to detect green fluorescent protein (GFP) relative to a red fluorescent protein (RFP) control in individual cells. Putative regulatory sequences can be inserted either in-frame or upstream of a superfolder GFP fusion protein whose expression, like that of RFP, is driven by the bidirectional GAL1,10 promoter. In this chapter, we describe the methodology to identify and study cis-regulatory sequences in the RNA-ID system, explaining features and variations of the RNA-ID reporter, as well as some applications of this system. We describe in detail the methods to analyze a single regulatory sequence, from construction of a single GFP variant to assay of variants by flow cytometry, as well as modifications required to screen libraries of different strains simultaneously. We also describe subsequent analyses of regulatory sequences. © 2016 Elsevier Inc. All rights reserved.

  11. Web-based Tool Identifies and Quantifies Potential Cost Savings Measures at the Hanford Site

    International Nuclear Information System (INIS)

    Renevitz, Marisa J.; Peschong, Jon C.; Charboneau, Briant L.; Simpson, Brett C.

    2014-01-01

    The Technical Improvement system is an approachable web-based tool that is available to Hanford DOE staff, site contractors, and general support service contractors as part of the baseline optimization effort underway at the Hanford Site. Finding and implementing technical improvements are a large part of DOE's cost savings efforts. The Technical Improvement dashboard is a key tool for brainstorming and monitoring the progress of submitted baseline optimization ideas and potential cost/schedule efficiencies. The dashboard is accessible to users over the Hanford Local Area Network (HLAN) and provides a highly visual and straightforward status to management on the ideas provided, alleviating the need for resource-intensive weekly and monthly reviews.

  12. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  13. Latent cluster analysis of ALS phenotypes identifies prognostically differing groups.

    Directory of Open Access Journals (Sweden)

    Jeban Ganesalingam

    2009-09-01

    Amyotrophic lateral sclerosis (ALS) is a degenerative disease predominantly affecting motor neurons and manifesting as several different phenotypes. Whether these phenotypes correspond to different underlying disease processes is unknown. We used latent cluster analysis to identify groupings of clinical variables in an objective and unbiased way to improve phenotyping for clinical and research purposes. Latent class cluster analysis was applied to a large database consisting of 1467 records of people with ALS, using discrete variables which can be readily determined at the first clinic appointment. The model was tested for clinical relevance by survival analysis of the phenotypic groupings using the Kaplan-Meier method. The best model generated five distinct phenotypic classes that strongly predicted survival (p<0.0001). Eight variables were used for the latent class analysis, but a good estimate of the classification could be obtained using just two variables: site of first symptoms (bulbar or limb) and time from symptom onset to diagnosis (p<0.00001). The five phenotypic classes identified using latent cluster analysis can predict prognosis. They could be used to stratify patients recruited into clinical trials and to generate more homogeneous disease groups for genetic, proteomic and risk factor research.
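
    The validation step described above, survival analysis of the cluster assignments, can be sketched with the lifelines library. The durations, events and class labels below are synthetic illustrations, not study data:

```python
# Sketch: Kaplan-Meier curves per latent class plus a log-rank test across
# classes, analogous to the p<0.0001 survival comparison in the abstract.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.DataFrame({
    "survival_months": [12, 30, 48, 9, 60, 22, 35, 80],
    "event_observed":  [1, 1, 0, 1, 0, 1, 1, 0],
    "cluster":         [1, 2, 3, 1, 3, 2, 2, 3],
})

# One Kaplan-Meier fit per phenotypic class
for label, grp in df.groupby("cluster"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["survival_months"], event_observed=grp["event_observed"],
            label=f"class {label}")
    print(label, kmf.median_survival_time_)

# Log-rank test across all classes
result = multivariate_logrank_test(df["survival_months"], df["cluster"],
                                   df["event_observed"])
print("p-value:", result.p_value)
```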

  14. Identifying functions for ex-core neutron noise analysis

    International Nuclear Information System (INIS)

    Avila, J.M.; Oliveira, J.C.

    1987-01-01

    A method of performing the phase analysis of signals arising from neutron detectors placed in the periphery of a pressurized water reactor is proposed. It consists of the definition of several identifying functions, based on the phases of cross power spectral densities corresponding to four ex-core neutron detectors. Each of these functions enhances the appearance of different sources of noise. The method, applied to the ex-core neutron fluctuation analysis of a French PWR, proved to be very useful as it allows quick recognition of various patterns in the power spectral densities. (orig.) [de]

  15. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  16. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper understanding of why certain KPI targets are not met.
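
    A hedged sketch of the dependency-tree idea, using a scikit-learn decision tree in place of the framework's own mining step; the metric names and data are invented for illustration:

```python
# Sketch: fit a decision tree that explains a KPI violation flag from
# lower-level process and QoS metrics, then print the tree so analysts can
# drill down along each path of influential factors.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

data = pd.DataFrame({
    "service_response_ms": [120, 900, 150, 1100, 130, 950],
    "queue_length":        [2, 15, 3, 20, 1, 17],
    "retries":             [0, 3, 0, 4, 1, 2],
    "kpi_violated":        [0, 1, 0, 1, 0, 1],
})

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(data.drop(columns="kpi_violated"), data["kpi_violated"])

# Each root-to-leaf path is a chain of lower-level factors behind the KPI
print(export_text(tree, feature_names=list(data.columns[:-1])))
```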

  17. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.

  18. Ergonomics as aid tool to identify and to analyze factors that can affect the operational performance of nuclear power plants

    International Nuclear Information System (INIS)

    Luquetti Santos, I.J.A.; Carvalho, P.V.R.

    2005-01-01

    The study of ergonomics has evolved around the world as one of the keys to understanding human behavior in interaction with complex systems such as nuclear power plants, and to achieving the best match between the system and its users in the context of the task to be performed. Increasing research efforts have yielded a considerable body of knowledge concerning the design of workstations, workplaces, control rooms, human-system interfaces, user-interface interaction and organizational design to prevent worker discomfort and illness and also to improve productivity, product quality, ease of use and safety. Work ergonomics analysis consists of gathering a series of observations in order to better understand the work done and to propose changes and improvements in working conditions. Work ergonomics analysis covers both the correction of existing situations (safety, reliability and production problems) and the development of new work systems. Operator activity analysis provides a useful tool for this ergonomics approach. Operators are systematically observed in their real work environment (control room) or in simulators. The focus is on description of the distributed regulation mechanisms (in the sense that operators work in crews), both in nominal and degraded situations, observing how operators collectively regulate their work during an increase in workload or when confronted with situations where incidents or accidents occur. Audio and video recorders and field notes can be used to collect empirical data, conversations and interactions that occur naturally within the work environment. Our research develops an applied ergonomics methodology, based on field studies, that permits the identification and analysis of situations and factors that may affect the operational performance of nuclear power plants. Our contribution is related to the following technical topic: how best to learn from and share operational safety experience and manage changes during…

  19. Identifying the Learning Styles and Instructional Tool Preferences of Beginning Food Science and Human Nutrition Majors

    Science.gov (United States)

    Bohn, D. M.; Rasmussen, C. N.; Schmidt, S. J.

    2004-01-01

    Learning styles vary among individuals, and understanding which instructional tools certain learning styles prefer can be utilized to enhance student learning. Students in the introductory Food Science and Human Nutrition course (FSHN 101), taught at the Univ. of Illinois at Urbana-Champaign, were asked to complete Gregorc's Learning Style…

  20. Identifying Professional Development Needs in Mathematics: A Planning Tool for Grades 3-7. Second Edition

    Science.gov (United States)

    Taylor, Mary Jo; Dimino, Joseph A.; Gellar, Leanne Ketterlin; Koontz, Trish

    2010-01-01

    This document offers a planning tool for grades 3-7 that can be used by regional comprehensive centers, other technical assistance centers, and state departments of education to plan professional development for teachers. It is based on the "National Mathematics Advisory Panel Report" which was published in 2008. The panel synthesized its final…

  1. Identifying obstructive sleep apnea after stroke/TIA: evaluating four simple screening tools.

    Science.gov (United States)

    Boulos, Mark I; Wan, Anthony; Im, James; Elias, Sara; Frankul, Fadi; Atalla, Mina; Black, Sandra E; Basile, Vincenzo S; Sundaram, Arun; Hopyan, Julia J; Boyle, Karl; Gladstone, David J; Murray, Brian J; Swartz, Richard H

    2016-05-01

    Despite its high prevalence and unfavorable clinical consequences, obstructive sleep apnea (OSA) often remains underappreciated after cerebrovascular events. The purpose of our study was to evaluate the clinical utility of four simple paper-based screening tools for excluding OSA after stroke or transient ischemic attack (TIA). Sixty-nine inpatients and outpatients with stroke or TIA during the past 180 days completed the 4-Variable screening tool (4V), STOP-BAG questionnaire (ie, STOP-BANG questionnaire without the neck circumference measurement), Berlin questionnaire, and the Sleep Obstructive apnea score optimized for Stroke (SOS). They subsequently underwent objective testing using a portable sleep monitoring device. Cutoffs were selected to maximize sensitivity and exclude OSA (AHI ≥ 10) in ≥10% of the cohort. The mean age was 68.3 ± 14.2 years and 47.8% were male. Thirty-two patients (46.4%) were found to have OSA. Male sex, body mass index (BMI), and atrial fibrillation were independent predictors of OSA. Among the screening tools, the 4V had the greatest area under the curve (AUC) of 0.688 (p = 0.007); the sensitivity was 96.9% at the cutoff selected for ruling out OSA after stroke/TIA. Due to the atypical presentation of poststroke/TIA OSA, these tools are only moderately predictive; objective testing should still be used for OSA diagnosis in this population. Copyright © 2016 Elsevier B.V. All rights reserved.
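
    The cutoff-selection logic described (maximize sensitivity so a tool can rule out OSA) can be sketched with scikit-learn; the scores and labels below are synthetic stand-ins, not study data:

```python
# Sketch: compute a screening tool's AUC, then pick the lowest cutoff that
# reaches a target sensitivity, mirroring the rule-out strategy above.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

osa_present = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0])   # AHI >= 10 on testing
tool_score  = np.array([14, 12, 6, 7, 11, 5, 13, 9, 10, 4])  # hypothetical 4V-style scores

print("AUC:", roc_auc_score(osa_present, tool_score))

fpr, tpr, thresholds = roc_curve(osa_present, tool_score)
target_sensitivity = 0.95
ok = tpr >= target_sensitivity
print("cutoff:", thresholds[ok][0], "sensitivity:", tpr[ok][0])
```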

  2. Interuniversity Telecollaboration to Improve Academic Results and Identify Preferred Communication Tools

    Science.gov (United States)

    Jaime, Arturo; Dominguez, Cesar; Sanchez, Ana; Blanco, Jose Miguel

    2013-01-01

    Telecollaboration is defined as a collaborative activity that involves people from distant geographic locations working together through Internet tools and other resources. This technique has not been frequently used in learning experiences and has produced diverse academic results, as well as degrees of satisfaction. This paper describes a…

  3. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  4. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    Science.gov (United States)

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  5. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at 5 levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available for use to highlight major inefficiencies and risks that are present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper will highlight how PPA was developed and show the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  6. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  7. Reduced Clostridium difficile Tests and Laboratory-Identified Events With a Computerized Clinical Decision Support Tool and Financial Incentive.

    Science.gov (United States)

    Madden, Gregory R; German Mesner, Ian; Cox, Heather L; Mathers, Amy J; Lyman, Jason A; Sifri, Costi D; Enfield, Kyle B

    2018-06-01

    We hypothesized that a computerized clinical decision support tool for Clostridium difficile testing would reduce unnecessary inpatient tests, resulting in fewer laboratory-identified events. Census-adjusted interrupted time-series analyses demonstrated significant reductions following this intervention: 41% fewer tests and 31% fewer hospital-onset C. difficile infection laboratory-identified events. Infect Control Hosp Epidemiol 2018;39:737-740.
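
    A simplified sketch of an interrupted time-series (segmented regression) model of monthly test counts, assuming synthetic data and omitting the census adjustment used in the paper:

```python
# Sketch: level-and-trend segmented regression around an intervention month.
import numpy as np
import statsmodels.api as sm

months = np.arange(24)
post = (months >= 12).astype(float)                 # intervention at month 12
months_since = np.where(post == 1, months - 12, 0)  # post-intervention trend term
noise = np.random.default_rng(0).normal(0, 3, 24)
tests = 100 - 0.5 * months - 30 * post - 1.0 * months_since + noise

X = sm.add_constant(np.column_stack([months, post, months_since]))
fit = sm.OLS(tests, X).fit()
print(fit.params)  # intercept, baseline trend, level change, trend change
```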

  8. Recurrence time statistics: versatile tools for genomic DNA sequence analysis.

    Science.gov (United States)

    Cao, Yinhe; Tung, Wen-Wen; Gao, J B

    2004-01-01

    With the completion of the human and a few model organisms' genomes, and the genomes of many other organisms waiting to be sequenced, it has become increasingly important to develop faster computational tools which are capable of easily identifying the structures and extracting features from DNA sequences. One of the more important structures in a DNA sequence is repeat-related. Often they have to be masked before protein coding regions along a DNA sequence are to be identified or redundant expressed sequence tags (ESTs) are to be sequenced. Here we report a novel recurrence time based method for sequence analysis. The method can conveniently study all kinds of periodicity and exhaustively find all repeat-related features from a genomic DNA sequence. An efficient codon index is also derived from the recurrence time statistics, which has the salient features of being largely species-independent and working well on very short sequences. Efficient codon indices are key elements of successful gene finding algorithms, and are particularly useful for determining whether a suspected EST belongs to a coding or non-coding region. We illustrate the power of the method by studying the genomes of E. coli, the yeast S. cerevisiae, the nematode worm C. elegans, and the human, Homo sapiens. Computationally, our method is very efficient. It allows us to carry out analysis of genomes on the whole genomic scale on a PC.
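
    The core recurrence-time computation is simple to sketch in Python: record the gaps between successive occurrences of each k-mer; strongly peaked gap distributions flag periodicity and repeat-related structure. This is an illustrative reimplementation of the idea, not the authors' code:

```python
# Sketch: recurrence times (gaps between repeats) for every k-mer in a sequence.
from collections import defaultdict

def recurrence_times(seq: str, k: int = 3) -> dict[str, list[int]]:
    last_seen: dict[str, int] = {}
    gaps: dict[str, list[int]] = defaultdict(list)
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in last_seen:
            gaps[kmer].append(i - last_seen[kmer])
        last_seen[kmer] = i
    return gaps

seq = "ATGATGATGCCCATG"
for kmer, g in recurrence_times(seq).items():
    print(kmer, g)   # e.g. ATG recurs every 3 bases within the repeat run
```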

  9. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  10. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  11. Authoring Tool for Identifying Learning Styles, Using Self-Organizing Maps on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Ramón Zatarain Cabada

    2011-05-01

    This work explores a methodological proposal whose main objective is the identification of learning styles using a self-organizing map method designed to work, for the most part, on mobile devices. These maps can work in real time and without direct student interaction, which implies the absence of prior information. The result is an authoring tool for adaptive courses in Web 2.0 environments.

  12. Microbial source tracking: a tool for identifying sources of microbial contamination in the food chain.

    Science.gov (United States)

    Fu, Ling-Lin; Li, Jian-Rong

    2014-01-01

    The ability to trace fecal indicators and food-borne pathogens to the point of origin has major ramifications for the food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which currently is largely focused on determining sources of fecal contamination in waterways, is also providing the scientific community with tools for tracking both fecal bacteria and food-borne pathogen contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including use for regulatory compliance, pollution remediation, and risk assessment. These tools will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods in application to understanding the source and transmission of food-borne pathogens. Moreover, the future directions of MST research are also discussed.

  13. Consumer understanding of food labels: toward a generic tool for identifying the average consumer

    DEFF Research Database (Denmark)

    Sørensen, Henrik Selsøe; Holm, Lotte; Møgelvang-Hansen, Peter

    2013-01-01

    The ‘average consumer’ is referred to as a standard in regulatory contexts when attempts are made to benchmark how consumers are expected to reason while decoding food labels. An attempt is made to operationalize this hypothetical ‘average consumer’ by proposing a tool for measuring the level of informedness of an individual consumer against the national median at any time. Informedness, i.e. the individual consumer's ability to interpret correctly the meaning of the words and signs on a food label, is isolated as one essential dimension for dividing consumers into three groups: less-informed, informed… It is suggested that independent future studies of consumer behavior and decision making in relation to food products in different contexts could benefit from this type of benchmarking tool.

  14. Cluster analysis of spontaneous preterm birth phenotypes identifies potential associations among preterm birth mechanisms.

    Science.gov (United States)

    Esplin, M Sean; Manuck, Tracy A; Varner, Michael W; Christensen, Bryce; Biggio, Joseph; Bukowski, Radek; Parry, Samuel; Zhang, Heping; Huang, Hao; Andrews, William; Saade, George; Sadovsky, Yoel; Reddy, Uma M; Ilekis, John

    2015-09-01

    We sought to use an innovative tool that is based on common biologic pathways to identify specific phenotypes among women with spontaneous preterm birth (SPTB) to enhance investigators' ability to identify and to highlight common mechanisms and underlying genetic factors that are responsible for SPTB. We performed a secondary analysis of a prospective case-control multicenter study of SPTB. All cases delivered a preterm singleton at ≤34.0 weeks' gestation. Each woman was assessed for the presence of underlying SPTB causes. A hierarchic cluster analysis was used to identify groups of women with homogeneous phenotypic profiles. One of the phenotypic clusters was selected for candidate gene association analysis with the use of VEGAS software. One thousand twenty-eight women with SPTB were assigned phenotypes. Hierarchic clustering of the phenotypes revealed 5 major clusters. Cluster 1 (n = 445) was characterized by maternal stress; cluster 2 (n = 294) was characterized by premature membrane rupture; cluster 3 (n = 120) was characterized by familial factors, and cluster 4 (n = 63) was characterized by maternal comorbidities. Cluster 5 (n = 106) was multifactorial and characterized by infection (INF), decidual hemorrhage (DH), and placental dysfunction (PD). These 3 phenotypes were highly correlated by χ² analysis (PD and DH in particular), and candidate gene association analysis was performed for cluster 3 of SPTB. We identified 5 major clusters of SPTB based on a phenotype tool and hierarchical clustering. There was significant correlation between several of the phenotypes. The INS gene was associated with familial factors that were underlying SPTB. Copyright © 2015 Elsevier Inc. All rights reserved.
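
    The hierarchical clustering step can be sketched with SciPy on binary phenotype profiles; the flags and data below are invented for illustration, and the gene-level step (VEGAS) is omitted:

```python
# Sketch: agglomerative clustering of binary phenotype profiles with a
# Jaccard distance, then cutting the tree into a fixed number of clusters.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# rows = women; columns = phenotype flags
# (stress, membrane rupture, familial, comorbidity, infection)
profiles = np.array([
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 1, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 0, 0],
])

Z = linkage(pdist(profiles, metric="jaccard"), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 clusters
print(labels)
```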

  15. Development and Assessment of a Diagnostic Tool to Identify Organic Chemistry Students' Alternative Conceptions Related to Acid Strength

    Science.gov (United States)

    McClary, LaKeisha M.; Bretz, Stacey Lowery

    2012-01-01

    The central goal of this study was to create a new diagnostic tool to identify organic chemistry students' alternative conceptions related to acid strength. Twenty years of research on secondary and college students' conceptions about acids and bases has shown that these important concepts are difficult for students to apply to qualitative problem…

  16. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis, which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
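
    A conceptual sketch of the geometric core of PAEA: compute the principal angles between a subspace of differential-expression directions and the subspace spanned by a gene set's indicator vector; small angles suggest enrichment. This illustrates the idea only and is not the published implementation:

```python
# Sketch: principal angles between two subspaces via SciPy.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)
n_genes = 50

# Columns span each subspace (synthetic stand-ins for real data)
expression_directions = rng.normal(size=(n_genes, 3))
gene_set = np.zeros((n_genes, 1))
gene_set[:10, 0] = 1.0   # indicator vector of a hypothetical 10-gene set

angles = subspace_angles(expression_directions, gene_set)
print(np.rad2deg(angles))  # one angle per dimension of the smaller subspace
```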

  17. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected for inclusion in the proceedings.

  18. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  19. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A Python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  20. Analysis and specification tools in relation to the APSE

    Science.gov (United States)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few chosen specification and design tools, could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  1. Identify-Isolate-Inform: A Tool for Initial Detection and Management of Zika Virus Patients in the Emergency Department

    Directory of Open Access Journals (Sweden)

    Kristi L. Koenig

    2016-05-01

    First isolated in 1947 from a monkey in the Zika forest in Uganda, and from mosquitoes in the same forest the following year, Zika virus has gained international attention due to concerns for infection in pregnant women potentially causing fetal microcephaly. More than one million people have been infected since the appearance of the virus in Brazil in 2015. Approximately 80% of infected patients are asymptomatic. An association with microcephaly and other birth defects as well as Guillain-Barre Syndrome led to a World Health Organization declaration of Zika virus as a Public Health Emergency of International Concern in February 2016. Zika virus is a vector-borne disease transmitted primarily by the Aedes aegypti mosquito. Male to female sexual transmission has been reported and there is potential for transmission via blood transfusions. After an incubation period of 2-7 days, symptomatic patients develop rapid onset fever, maculopapular rash, arthralgia, and conjunctivitis, often associated with headache and myalgias. Emergency department (ED) personnel must be prepared to address concerns from patients presenting with symptoms consistent with acute Zika virus infection, especially those who are pregnant or planning travel to Zika-endemic regions, as well as those women planning to become pregnant and their partners. The identify-isolate-inform (3I) tool, originally conceived for initial detection and management of Ebola virus disease patients in the ED, and later adjusted for measles and Middle East Respiratory Syndrome, can be adapted for real-time use for any emerging infectious disease. This paper reports a modification of the 3I tool for initial detection and management of patients under investigation for Zika virus. Following an assessment of epidemiologic risk, including travel to countries with mosquitoes that transmit Zika virus, patients are further investigated if clinically indicated. If after a rapid evaluation, Zika or other…

  2. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to improve…

  3. Rice Transcriptome Analysis to Identify Possible Herbicide Quinclorac Detoxification Genes

    Directory of Open Access Journals (Sweden)

    Wenying Xu

    2015-09-01

    Quinclorac is a highly selective auxin-type herbicide that is widely used for the effective control of barnyard grass in paddy rice fields, improving the world's rice yield. A herbicide mode of action for quinclorac has been proposed, and hormone interactions affect quinclorac signaling. Because of its widespread use, quinclorac may be transported outside rice fields with drainage waters, leading to soil and water pollution and environmental health problems. In this study, we used the 57K Affymetrix rice whole-genome array to identify quinclorac signaling response genes and to study the molecular mechanisms of action and detoxification of quinclorac in rice plants. Overall, 637 probe sets were identified with differential expression levels under either 6 or 24 h of quinclorac treatment. Auxin-related genes such as GH3 and OsIAAs responded to quinclorac treatment. Gene Ontology analysis showed that detoxification-related gene families were significantly enriched, including cytochrome P450, GST, UGT, and ABC and drug transporter genes. Moreover, real-time RT-PCR analysis showed that top candidate P450 families such as CYP81, CYP709C and CYP72A were universally induced by different herbicides. Some Arabidopsis genes of the same P450 families were up-regulated under quinclorac treatment. We conducted a rice whole-genome GeneChip analysis and the first global identification of quinclorac response genes. This work may provide potential markers for detoxification of quinclorac and biomonitors of environmental chemical pollution.

  4. Applying genotoxicology tools to identify environmental stressors in support of river management

    CSIR Research Space (South Africa)

    Oberholster, Paul J

    2015-09-01

    Full Text Available Although bioassay approaches are useful for identifying chemicals of potential concern, they provide little understanding of the mechanisms of chemical toxicity. Without this understanding, it is difficult to address some of the key challenges...

  5. Obesogenic family types identified through latent profile analysis.

    Science.gov (United States)

    Martinson, Brian C; VazquezBenitez, Gabriela; Patnode, Carrie D; Hearst, Mary O; Sherwood, Nancy E; Parker, Emily D; Sirard, John; Pasch, Keryn E; Lytle, Leslie

    2011-10-01

    Obesity may cluster in families due to shared physical and social environments. This study aims to identify family typologies of obesity risk based on family environments. Using 2007-2008 data from 706 parent/youth dyads in Minnesota, we applied latent profile analysis and general linear models to evaluate associations between family typologies and body mass index (BMI) of youth and parents. Three typologies described most families, with 18.8% "Unenriched/Obesogenic," 16.9% "Risky Consumer," and 64.3% "Healthy Consumer/Salutogenic." After adjustment for demographic and socioeconomic factors, parent BMI and youth BMI Z-scores were higher in unenriched/obesogenic families (BMI difference = 2.7) than in the healthy consumer/salutogenic typology. In contrast, parent BMI and youth BMI Z-scores in risky consumer families were similar to those in the healthy consumer/salutogenic type. We can identify family types differing in obesity risks, with implications for public health interventions.
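
    Latent profile analysis is often approximated with a Gaussian mixture model; here is a sketch with synthetic family-environment measures and a stand-in BMI outcome (scikit-learn assumed; not the study's software or variables):

```python
# Sketch: fit 3 latent profiles to continuous environment measures, then
# compare an outcome (BMI) across the resulting typologies.
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
env = pd.DataFrame({
    "healthy_food_score": rng.normal(0, 1, 300),
    "screen_time_hours":  rng.normal(3, 1, 300),
    "activity_score":     rng.normal(0, 1, 300),
})

gmm = GaussianMixture(n_components=3, random_state=0).fit(env)
env["typology"] = gmm.predict(env)

bmi = pd.Series(rng.normal(25, 4, 300))  # stand-in outcome
print(bmi.groupby(env["typology"]).mean())
```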

  6. Proteogenomic Analysis Identifies a Novel Human SHANK3 Isoform

    Directory of Open Access Journals (Sweden)

    Fahad Benthani

    2015-05-01

    Mutations of the SHANK3 gene have been associated with autism spectrum disorder. Individuals harboring different SHANK3 mutations display considerable heterogeneity in their cognitive impairment, likely due to the high SHANK3 transcriptional diversity. In this study, we report a novel interaction between the Mutated in colorectal cancer (MCC) protein and a newly identified SHANK3 protein isoform in human colon cancer cells and mouse brain tissue. Hence, our proteogenomic analysis identifies a new human long isoform of the key synaptic protein SHANK3 that was not predicted by the human reference genome. Taken together, our findings describe a potential new role for MCC in neurons, a new human SHANK3 long isoform and, importantly, highlight the use of proteomic data towards the re-annotation of GC-rich genomic regions.

  7. Using the simplified case mix tool (sCMT) to identify cost in special care dental services to support commissioning.

    Science.gov (United States)

    Duane, B G; Freeman, R; Richards, D; Crosbie, S; Patel, P; White, S; Humphris, G

    2017-03-01

    To commission dental services for vulnerable (special care) patient groups effectively, consistently and fairly, an evidence base of the costs involved is needed. The simplified Case Mix Tool (sCMT) can assess treatment mode complexity for these patient groups. The aim was to determine whether the sCMT can be used to identify costs of service provision. Patients (n=495) attending the Sussex Community NHS Trust Special Care Dental Service for care were assessed using the sCMT. Data collected included sCMT score and costs (staffing, laboratory fees, etc.), as well as patient age, new-patient status and use of general anaesthetic/intravenous sedation. Statistical analysis (adjusted linear regression modelling) compared sCMT scores and costs; sensitivity analyses of the costings to age, new-patient status and sedation use were then undertaken. Regression tables were produced to present estimates of service costs. Costs increased with the sCMT total scale and single item values in a predictable manner in all analyses except for 'cooperation'. Costs increased with the use of IV sedation, with each rising level of the sCMT, and with complexity in every sCMT category except cooperation. Costs increased with increasing complexity of treatment mode as measured by sCMT scores. Measures such as the sCMT can provide predictions of the resource allocations required when commissioning special care dental services. Copyright© 2017 Dennis Barber Ltd.

  8. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during the last years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
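
    The Bayesian evidence can be illustrated on a one-dimensional grid (nested sampling, as used by Nested_fit, replaces this brute-force integral in higher dimensions). The data, prior ranges and noise level below are invented for illustration:

```python
# Sketch: evidence Z = integral of likelihood * prior over the parameter,
# here the mean of a Gaussian model, compared for two prior choices.
import numpy as np
from scipy.stats import norm

data = np.array([4.8, 5.1, 5.3, 4.9, 5.2])

def evidence_gaussian_mean(data, mu_grid, sigma=0.3):
    """Z under a uniform prior on mu over the grid's range."""
    prior = np.full(mu_grid.size, 1.0 / (mu_grid[-1] - mu_grid[0]))
    like = np.array([norm.pdf(data, loc=mu, scale=sigma).prod() for mu in mu_grid])
    return np.trapz(like * prior, mu_grid)

z_wide = evidence_gaussian_mean(data, np.linspace(0.0, 10.0, 2001))   # vague prior
z_tight = evidence_gaussian_mean(data, np.linspace(4.5, 5.5, 2001))   # informed prior
print("Bayes factor (tight vs wide):", z_tight / z_wide)
```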

  9. ResStock Analysis Tool | Buildings | NREL

    Science.gov (United States)

    ResStock is an analysis tool for evaluating energy and cost savings for U.S. homes. It supports large-scale residential energy analysis by combining large public and private data sources; this analysis uncovered $49 billion in potential annual utility bill savings through cost-effective energy efficiency.

  10. LEAP2000: tools for sustainable energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heaps, C.; Lazarus, M.; Raskin, P. [SEU-Boston, Boston, MA (USA)

    2000-09-01

    LEAP2000 is a collaborative initiative, led by the Boston Center for the Stockholm Environment Institute, to create a new suite of analytical software and databases for integrated energy-environment analysis. The LEAP2000 software and the Technology and Environmental Database (TED) are described. 5 refs., 5 figs.

  11. Spreadsheet as a tool of engineering analysis

    International Nuclear Information System (INIS)

    Becker, M.

    1985-01-01

    In engineering analysis, problems tend to be categorized into those that can be done by hand and those that require the computer for solution. The advent of personal computers, and in particular, the advent of spreadsheet software, blurs this distinction, creating an intermediate category of problems appropriate for use with interactive personal computing

  12. Parameter trajectory analysis to identify treatment effects of pharmacological interventions.

    Directory of Open Access Journals (Sweden)

    Christian A Tiemann

    The field of medical systems biology aims to advance understanding of molecular mechanisms that drive disease progression and to translate this knowledge into therapies to effectively treat diseases. A challenging task is the investigation of the long-term effects of a (pharmacological) treatment, to establish its applicability and to identify potential side effects. We present a new modeling approach, called Analysis of Dynamic Adaptations in Parameter Trajectories (ADAPT), to analyze the long-term effects of a pharmacological intervention. A concept of time-dependent evolution of model parameters is introduced to study the dynamics of molecular adaptations. The progression of these adaptations is predicted by identifying the necessary dynamic changes in the model parameters to describe the transition between experimental data obtained during different stages of the treatment. The trajectories provide insight into the affected underlying biological systems and identify the molecular events that should be studied in more detail to unravel the mechanistic basis of treatment outcome. Modulating effects caused by interactions with the proteome and transcriptome levels, which are often less well understood, can be captured by the time-dependent descriptions of the parameters. ADAPT was employed to identify metabolic adaptations induced upon pharmacological activation of the liver X receptor (LXR), a potential drug target to treat or prevent atherosclerosis. The trajectories were investigated to study the cascade of adaptations. This provided a counter-intuitive insight concerning the function of scavenger receptor class B1 (SR-B1), a receptor that facilitates the hepatic uptake of cholesterol. Although activation of LXR promotes cholesterol efflux and excretion, our computational analysis showed that the hepatic capacity to clear cholesterol was reduced upon prolonged treatment. This prediction was confirmed experimentally by immunoblotting measurements of SR-B1.
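
    A conceptual sketch of the ADAPT idea: let a parameter evolve over treatment time and simulate the model along that trajectory. The node values and the one-state model below are hypothetical; a full implementation would estimate the parameter trajectory from data at each treatment stage, with regularization on the parameter changes:

```python
# Sketch: a single clearance parameter k(t) interpolated between node values,
# driving a one-state turnover model d[x]/dt = production - k(t) * x.
import numpy as np
from scipy.integrate import solve_ivp

t_nodes = np.array([0.0, 7.0, 14.0, 21.0])   # days of treatment
k_nodes = np.array([1.0, 0.8, 0.6, 0.5])     # hypothetical declining clearance

def k_of_t(t):
    return np.interp(t, t_nodes, k_nodes)

def rhs(t, y, production=1.0):
    return production - k_of_t(t) * y

sol = solve_ivp(rhs, (0.0, 21.0), y0=[1.0], dense_output=True)
print(sol.sol(np.array([0.0, 7.0, 14.0, 21.0])).round(3))
```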

  13. Identifying Sources of Clinical Conflict: A Tool for Practice and Training in Bioethics Mediation.

    Science.gov (United States)

    Bergman, Edward J

    2015-01-01

    Bioethics mediators manage a wide range of clinical conflict emanating from diverse sources. Parties to clinical conflict are often not fully aware of, nor willing to express, the true nature and scope of their conflict. As such, a significant task of the bioethics mediator is to help define that conflict. The ability to assess and apply the tools necessary for an effective mediation process can be facilitated by each mediator's creation of a personal compendium of sources that generate clinical conflict, to provide an orientation for the successful management of complex dilemmatic cases. Copyright 2015 The Journal of Clinical Ethics. All rights reserved.

  14. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab(R)-based implementation is presented and special features are introduced, which were motivated by industrial users. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, analyze single and multiple fault scenarios and automatically generate parity relations for diagnosis of the system in normal and impaired conditions. The user interface and algorithmic details are presented.

  15. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede


  16. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    … This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers…

  17. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Project … overview of the process and the structure of the Logical Framework Matrix, or Logframe, derivable from it … system approach to managing the project.

  18. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  19. Bulk tank somatic cell counts analyzed by statistical process control tools to identify and monitor subclinical mastitis incidence.

    Science.gov (United States)

    Lukas, J M; Hawkins, D M; Kinsel, M L; Reneau, J K

    2005-11-01

    The objective of this study was to examine the relationship between monthly Dairy Herd Improvement (DHI) subclinical mastitis and new infection rate estimates and daily bulk tank somatic cell count (SCC) summarized by statistical process control tools. Dairy Herd Improvement Association test-day subclinical mastitis and new infection rate estimates, along with daily or every-other-day bulk tank SCC data, were collected for 12 mo of 2003 from 275 Upper Midwest dairy herds. Herds were divided into 5 herd production categories. A linear score [LNS = ln(BTSCC/100,000)/0.693147 + 3] was calculated for each individual bulk tank SCC. For both the raw SCC and the transformed data, the mean and sigma were calculated using the statistical quality control individual measurement and moving range chart procedure of Statistical Analysis System. One hundred eighty-three of the 275 herds from the study data set were then randomly selected, and the raw (method 1) and transformed (method 2) bulk tank SCC mean and sigma were used to develop models for predicting subclinical mastitis and new infection rate estimates. Herd production category was also included in all models as 5 dummy variables. Models were validated by calculating estimates of subclinical mastitis and new infection rates for the remaining 92 herds and plotting them against observed values of each of the dependent variables. Only herd production category and bulk tank SCC mean were significant and remained in the final models. High R2 values (0.83 and 0.81 for methods 1 and 2, respectively) indicated a strong correlation between the bulk tank SCC and a herd's subclinical mastitis prevalence. The standard errors of the estimate were 4.02 and 4.28% for methods 1 and 2, respectively, and decreased with increasing herd production. As a case study, Shewhart Individual Measurement Charts were plotted from the bulk tank SCC to identify shifts in mastitis incidence. Four of 5 charts examined signaled a change in bulk tank SCC before…
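
    Both computations named in the abstract, the linear score transform and a Shewhart individual measurement / moving range chart, are easy to sketch. The SCC series below is synthetic, and sigma is estimated from the mean moving range with the standard d2 = 1.128 constant for subgroups of two:

```python
# Sketch: linear score (log base 2, shifted) plus Shewhart I/MR control limits.
import numpy as np

# Synthetic daily bulk tank SCC values (cells/mL); not herd data from the study
scc = np.array([180, 210, 190, 240, 230, 400, 420, 410], dtype=float) * 1000

# LNS = ln(BTSCC/100,000)/0.693147 + 3, as defined in the abstract
lns = np.log(scc / 100_000) / 0.693147 + 3

mr = np.abs(np.diff(lns))          # moving ranges of consecutive scores
center = lns.mean()
sigma = mr.mean() / 1.128          # d2 constant for moving ranges of size 2
ucl, lcl = center + 3 * sigma, center - 3 * sigma

for day, value in enumerate(lns):
    flag = "signal" if value > ucl or value < lcl else ""
    print(f"day {day}: LNS = {value:.2f} {flag}")
```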

  20. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention preserving stochastic branching and parame…

  1. Physical Examination Tools Used to Identify Swollen and Tender Lower Limb Joints in Juvenile Idiopathic Arthritis: A Scoping Review.

    Science.gov (United States)

    Fellas, Antoni; Singh-Grewal, Davinder; Santos, Derek; Coda, Andrea

    2018-01-01

    Juvenile idiopathic arthritis (JIA) is the most common form of rheumatic disease in childhood and adolescents, affecting between 16 and 150 per 100,000 young persons below the age of 16. The lower limb is commonly affected in JIA, with joint swelling and tenderness often observed as a result of active synovitis. The objective of this scoping review is to identify existing physical examination (PE) tools for identifying and recording swollen and tender lower limb joints in children with JIA. Two reviewers individually screened the eligibility of titles and abstracts retrieved from the following online databases: MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials, and CINAHL. Studies that proposed and validated a comprehensive lower limb PE tool were included in this scoping review. After removal of duplicates, 1232 citations were retrieved, of which twelve were identified as potentially eligible. No studies met the set criteria for inclusion. Further research is needed in developing and validating specific PE tools for clinicians such as podiatrists and other allied health professionals involved in the management of pathological lower limb joints in children diagnosed with JIA. These lower limb PE tools may be useful in conjunction with existing disease activity scores to optimise screening of the lower extremity and to monitor the efficacy of targeted interventions.

  2. Theoretical analysis tools in building business competitiveness

    Directory of Open Access Journals (Sweden)

    Yeisson Diego Tamayo

    2015-12-01

    Due to the internationalization of markets arising from free trade agreements and the standardization process in Colombia, companies increasingly seek to satisfy the needs of their customers, so business management systems carry considerable weight in the phases of business development. In Colombia, management systems have boomed in the financial sector, so much so that there is a quality manual for the financial supervision of Colombia; at the microeconomic level, however, firms have not developed this topic, or at least there is no evidence of such development. It is therefore necessary to analyze business management models at the international level in order to identify which elements or strategies can be applied at each stage of business development, based on the measurement of indicator variables of management compliance in the Colombian context.

  3. Using lexical analysis to identify emotional distress in psychometric schizotypy.

    Science.gov (United States)

    Abplanalp, Samuel J; Buck, Benjamin; Gonzenbach, Virgilio; Janela, Carlos; Lysaker, Paul H; Minor, Kyle S

    2017-09-01

    Through the use of lexical analysis software, researchers have demonstrated a greater frequency of negative affect word use in those with schizophrenia and schizotypy compared to the general population. In addition, those with schizotypy endorse greater emotional distress than healthy controls. In this study, our aim was to expand on previous findings in schizotypy to determine whether negative affect word use could be linked to emotional distress. Schizotypy (n=33) and non-schizotypy groups (n=33) completed an open-ended, semi-structured interview, and negative affect word use was analyzed using a validated lexical analysis instrument. Emotional distress was assessed using subjective questionnaires of depression and psychological quality of life (QOL). When groups were compared, those with schizotypy used significantly more negative affect words, endorsed greater depression, and reported lower QOL. Within schizotypy, a trend-level association between depression and negative affect word use was observed; QOL and negative affect word use showed a significant inverse association. Our findings offer preliminary evidence of the potential effectiveness of lexical analysis as an objective, behavior-based method for identifying emotional distress throughout the schizophrenia-spectrum. Utilizing lexical analysis in schizotypy offers promise for providing researchers with an assessment capable of objectively detecting emotional distress. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
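
    A minimal sketch of lexical negative-affect scoring; the study used a validated instrument (LIWC-style software), so the tiny word list here is only an illustrative stand-in:

```python
# Sketch: count negative-affect lexicon hits per 100 words of transcript.
import re

NEGATIVE_AFFECT = {"sad", "afraid", "angry", "hopeless", "worried", "hurt"}

def negative_affect_rate(transcript: str) -> float:
    words = re.findall(r"[a-z']+", transcript.lower())
    hits = sum(1 for w in words if w in NEGATIVE_AFFECT)
    return 100.0 * hits / max(len(words), 1)

text = "I have been worried and sad lately, and I feel hopeless about work."
print(f"{negative_affect_rate(text):.1f} negative-affect words per 100 words")
```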

  4. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce…

  5. Cluster analysis of clinical data identifies fibromyalgia subgroups.

    Directory of Open Access Journals (Sweden)

    Elisa Docampo

    Full Text Available INTRODUCTION: Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. MATERIAL AND METHODS: 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. RESULTS: Variables clustered into three independent dimensions: "symptomatology", "comorbidities" and "clinical scales". Only the first two dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. CONCLUSIONS: We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment.
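
    To make the two-step procedure concrete (composite indexes per dimension, then clustering on the indexes), here is a minimal Python sketch. The data and dimension weights are synthetic stand-ins; the study derived its weights from a partitioning analysis of 48 clinical variables.

```python
# Sketch: composite indexes for two dimensions, then k-means into
# three subgroups. Data and weights are synthetic, for illustration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.random((100, 6))             # 100 patients, 6 clinical variables
w_sympt = np.array([0.5, 0.3, 0.2])  # weights for 3 "symptomatology" vars
w_comor = np.array([0.4, 0.4, 0.2])  # weights for 3 "comorbidities" vars

indexes = np.column_stack([X[:, :3] @ w_sympt, X[:, 3:] @ w_comor])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(indexes)
print(np.bincount(labels))           # patients per subgroup
```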

  6. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  7. Pattern check of TLD disc readings - an important tool to identify abnormal conditions in workplace

    International Nuclear Information System (INIS)

    Pradhan, S.M.; Sneha, C.; Bhattacharya, M.; Sahai, M.K.; Pradeep, Ratna; Datta, D.; Bhatnagar, Amit

    2016-01-01

    Personnel monitoring for external radiation using the CaSO4:Dy based TLD badge is well established in the Indian radiation protection program. The TLD badge enables evaluation of occupational dose based on the pattern and values of the three disc readings. Different patterns of disc readings are obtained depending on the type and energy of radiation in the workplace. A pattern not conforming to the radiation in the workplace, also called an improper pattern, can be a useful tool for investigation of any deviation or abnormality in the workplace or in monitoring practices. The paper presents different examples of improper patterns observed in monitoring that have helped to find deviations in the workplace or monitoring practices. Results of the experiments conducted to simulate some of the observed patterns are also presented in the paper

  8. Secret Shopping is an Effective Tool for Identifying Local Patterns in Library User Experience

    Directory of Open Access Journals (Sweden)

    Kelley Wadson

    2016-12-01

    Full Text Available A Review of: Boyce, C. M. (2015). Secret shopping as user experience assessment tool. Public Services Quarterly, 11(4), 237-253. doi:10.1080/15228959.2015.1084903 Objective – To assess library user experience (UX) at two entry-level service desks to determine the need for, and inform the aspects in which to improve, services and staff training. Design – Observational study using secret shopping. Setting – A small, private university in Illinois, United States of America. Subjects – Library employees, comprised primarily of student assistants; and 11 secret shoppers, comprised of 5 faculty members, 4 staff members, and 2 first-year students from the university.

  9. Preliminary Validation of a New Clinical Tool for Identifying Problem Video Game Playing

    Science.gov (United States)

    King, Daniel Luke; Delfabbro, Paul H.; Zajac, Ian T.

    2011-01-01

    Research has estimated that between 6 and 13% of individuals who play video games do so excessively. However, the methods and definitions used to identify "problem" video game players often vary considerably. This research presents preliminary validation data for a new measure of problematic video game play called the Problem Video Game…

  10. Survey of Poetry Reading Strategy as the Modern Tool to Identify Poetry Reading Strategies

    Science.gov (United States)

    Ebrahimi, Shirin Shafiei; Zainal, Zaidah

    2016-01-01

    This study examines common strategies that English as a Foreign language (EFL) students employ when reading English poetry. To identify the strategies, a survey was designed for data collection from TESL students. The result shows that students significantly tend to use the strategies that require their creativity to construct new ideas in the…

  11. The Palm-Heart Diameter: A Prospective Simple Screening Tool for Identifying Heart Enlargement

    Directory of Open Access Journals (Sweden)

    Adegbenro Omotuyi John Fakoya

    2017-11-01

    CONCLUSION: This study establishes the correlation between the palm and heart diameters. Since the heart tissue and the upper limb share a similar embryonic origin, namely the mesoderm, this study suggests that heart enlargement could be preliminarily identified by measuring the size of the hand.

  12. MALDI-TOF MS as a tool to identify foodborne yeasts and yeast-like fungi.

    Science.gov (United States)

    Quintilla, Raquel; Kolecka, Anna; Casaregola, Serge; Daniel, Heide M; Houbraken, Jos; Kostrzewa, Markus; Boekhout, Teun; Groenewald, Marizeth

    2018-02-02

    Since food spoilage by yeasts causes high economic losses, fast and accurate identification of yeasts associated with food and food-related products is important for the food industry. In this study the efficiency of matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) to identify food-related yeasts was evaluated. A CBS in-house MALDI-TOF MS database was created and later challenged with a blinded test set of 146 yeast strains obtained from food and food-related products. Ninety-eight percent of the strains were correctly identified with log score values >1.7. One strain, Mrakia frigida, gained a correct identification with a score value below 1.7. Ambiguous identifications were observed due to two incorrect reference mass spectra found in the commercial database BDAL v.4.0, namely Candida sake DSM 70763, which was re-identified as Candida oleophila, and Candida inconspicua DSM 70631, which was re-identified as Pichia membranifaciens. MALDI-TOF MS can distinguish between most of the species, but for some species complexes, such as the Kazachstania telluris and Mrakia frigida complexes, MALDI-TOF MS showed limited resolution and identification of sibling species was sometimes problematic. Despite this, we showed that MALDI-TOF MS is applicable for routine identification and validation of foodborne yeasts, but a further update of the commercial reference databases is needed. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Identifying compromised systems through correlation of suspicious traffic from malware behavioral analysis

    Science.gov (United States)

    Camilo, Ana E. F.; Grégio, André; Santos, Rafael D. C.

    2016-05-01

    Malware detection may be accomplished through the analysis of their infection behavior. To do so, dynamic analysis systems run malware samples and extract their operating system activities and network traffic. This traffic may represent malware accessing external systems, either to steal sensitive data from victims or to fetch other malicious artifacts (configuration files, additional modules, commands). In this work, we propose the use of visualization as a tool to identify compromised systems based on correlating malware communications in the form of graphs and finding isomorphisms between them. We produced graphs from over 6 thousand distinct network traffic files captured during malware execution and analyzed the existing relationships among malware samples and IP addresses.
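
    A toy version of the graph-matching step can be assembled with the networkx library; the sample names and IP addresses below are fabricated, and real traffic graphs would be far larger.

```python
# Sketch: build (sample -> contacted IP) graphs and test whether two
# samples share the same communication structure. Data are fabricated.
import networkx as nx

def traffic_graph(sample_id, contacted_ips):
    g = nx.DiGraph()
    for ip in contacted_ips:
        g.add_edge(sample_id, ip)
    return g

g1 = traffic_graph("mal_a", ["10.0.0.1", "10.0.0.2", "10.0.0.3"])
g2 = traffic_graph("mal_b", ["192.168.1.5", "192.168.1.7", "192.168.1.9"])

# Isomorphic structure (ignoring labels) hints at similar behavior.
print(nx.is_isomorphic(g1, g2))  # True: both are 3-pointed stars
```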

  14. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  15. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.

    2011-01-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function, like ketones and esters. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which will be detailed in the Introduction; it is worth noting in advance that these limitations mostly concern the period in which the results were published. (author)

  16. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle for their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type…
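
    As a flavor of what a (much simplified) flow-insensitive points-to analysis computes, the sketch below propagates allocation and copy constraints to a fixed point; a real JavaScript analysis must additionally model dynamic property accesses, closures, and incomplete programs.

```python
# Toy flow-insensitive points-to analysis: allocation facts plus
# copy constraints (dst = src), iterated to a fixed point.
def points_to(allocs, copies):
    pts = {v: set(objs) for v, objs in allocs.items()}
    changed = True
    while changed:
        changed = False
        for dst, src in copies:                  # dst = src
            add = pts.get(src, set()) - pts.setdefault(dst, set())
            if add:
                pts[dst] |= add
                changed = True
    return pts

# 'a = new Obj(); b = a; c = b;' in constraint form:
print(points_to({"a": ["obj1"]}, [("b", "a"), ("c", "b")]))
# {'a': {'obj1'}, 'b': {'obj1'}, 'c': {'obj1'}}
```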

  17. Validating the Modified Drug Adherence Work-Up (M-DRAW) Tool to Identify and Address Barriers to Medication Adherence.

    Science.gov (United States)

    Lee, Sun; Bae, Yuna H; Worley, Marcia; Law, Anandi

    2017-09-08

    Barriers to medication adherence stem from multiple factors. An effective and convenient tool is needed to identify these barriers so that clinicians can provide a tailored, patient-centered consultation with patients. The Modified Drug Adherence Work-up Tool (M-DRAW) was developed as a 13-item checklist questionnaire to identify barriers to medication adherence. The response scale was a 4-point Likert scale of frequency of occurrence (1 = never to 4 = often). The checklist was accompanied by a GUIDE that provided corresponding motivational interview-based intervention strategies for each identified barrier. The current pilot study examined the psychometric properties of the M-DRAW checklist (reliability, responsiveness and discriminant validity) in patients taking one or more prescription medication(s) for chronic conditions. A cross-sectional sample of 26 patients was recruited between December 2015 and March 2016 at an academic medical center pharmacy in Southern California. A priming question that assessed self-reported adherence was used to separate participants into the control group of 17 "adherers" (65.4%), and into the intervention group of nine "unintentional and intentional non-adherers" (34.6%). Comparable baseline characteristics were observed between the two groups. The M-DRAW checklist showed acceptable reliability (13 item; alpha = 0.74) for identifying factors and barriers leading to medication non-adherence. Discriminant validity of the tool and the priming question was established by the four-fold number of barriers to adherence identified within the self-selected intervention group compared to the control group (4.4 versus 1.2 barriers, p < 0.05). The current study did not investigate construct validity due to small sample size and challenges on follow-up with patients. Future testing of the tool will include construct validation.

  18. Validating the Modified Drug Adherence Work-Up (M-DRAW Tool to Identify and Address Barriers to Medication Adherence

    Directory of Open Access Journals (Sweden)

    Sun Lee

    2017-09-01

    Full Text Available Barriers to medication adherence stem from multiple factors. An effective and convenient tool is needed to identify these barriers so that clinicians can provide a tailored, patient-centered consultation with patients. The Modified Drug Adherence Work-up Tool (M-DRAW) was developed as a 13-item checklist questionnaire to identify barriers to medication adherence. The response scale was a 4-point Likert scale of frequency of occurrence (1 = never to 4 = often). The checklist was accompanied by a GUIDE that provided corresponding motivational interview-based intervention strategies for each identified barrier. The current pilot study examined the psychometric properties of the M-DRAW checklist (reliability, responsiveness and discriminant validity) in patients taking one or more prescription medication(s) for chronic conditions. A cross-sectional sample of 26 patients was recruited between December 2015 and March 2016 at an academic medical center pharmacy in Southern California. A priming question that assessed self-reported adherence was used to separate participants into the control group of 17 “adherers” (65.4%), and into the intervention group of nine “unintentional and intentional non-adherers” (34.6%). Comparable baseline characteristics were observed between the two groups. The M-DRAW checklist showed acceptable reliability (13 item; alpha = 0.74) for identifying factors and barriers leading to medication non-adherence. Discriminant validity of the tool and the priming question was established by the four-fold number of barriers to adherence identified within the self-selected intervention group compared to the control group (4.4 versus 1.2 barriers, p < 0.05). The current study did not investigate construct validity due to small sample size and challenges on follow-up with patients. Future testing of the tool will include construct validation.

  19. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  20. Systematic enrichment analysis of gene expression profiling studies identifies consensus pathways implicated in colorectal cancer development

    Directory of Open Access Journals (Sweden)

    Jesús Lascorz

    2011-01-01

    Full Text Available Background: A large number of gene expression profiling (GEP) studies on colorectal carcinogenesis have been performed but no reliable gene signature has been identified so far due to the lack of reproducibility in the reported genes. There is growing evidence that functionally related genes, rather than individual genes, contribute to the etiology of complex traits. We used, as a novel approach, pathway enrichment tools to define functionally related genes that are consistently up- or down-regulated in colorectal carcinogenesis. Materials and Methods: We started the analysis with 242 unique annotated genes that had been reported by any of three recent meta-analyses covering GEP studies on genes differentially expressed in carcinoma vs normal mucosa. Most of these genes (218, 91.9%) had been reported in at least three GEP studies. These 242 genes were submitted to bioinformatic analysis using a total of nine tools to detect enrichment of Gene Ontology (GO) categories or Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways. As a final consistency criterion the pathway categories had to be enriched by several tools to be taken into consideration. Results: Our pathway-based enrichment analysis identified the categories of ribosomal protein constituents, extracellular matrix receptor interaction, carbonic anhydrase isozymes, and a general category related to inflammation and cellular response as significantly and consistently overrepresented entities. Conclusions: We triaged the genes covered by the published GEP literature on colorectal carcinogenesis and subjected them to multiple enrichment tools in order to identify the consistently enriched gene categories. These turned out to have known functional relationships to cancer development and thus deserve further investigation.
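
    The statistical core of most enrichment tools is an over-representation test. A minimal hypergeometric version, with invented counts, might look like this:

```python
# One-pathway over-representation test via the hypergeometric
# distribution. All counts are illustrative, not the study's data.
from scipy.stats import hypergeom

N = 20000   # genes in the background
K = 150     # background genes annotated to the pathway
n = 242     # genes in the submitted list
k = 12      # submitted genes that fall in the pathway

# P(X >= k): chance of at least k pathway genes appearing by luck.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p = {p_value:.3g}")
```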

  1. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts.

    Science.gov (United States)

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches for extracting relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that informatics methods have potential utility for identifying medicinal plant knowledge from digitized resources, and highlight opportunities for improvement.

  2. Maximum covariance analysis to identify intraseasonal oscillations over tropical Brazil

    Science.gov (United States)

    Barreto, Naurinete J. C.; Mesquita, Michel d. S.; Mendes, David; Spyrides, Maria H. C.; Pedra, George U.; Lucio, Paulo S.

    2017-09-01

    A reliable prognosis of extreme precipitation events in the tropics is arguably challenging to obtain due to the interaction of meteorological systems at various time scales. A pivotal component of the global climate variability is the so-called intraseasonal oscillations, phenomena that occur between 20 and 100 days. The Madden-Julian Oscillation (MJO), which is directly related to the modulation of convective precipitation in the equatorial belt, is considered the primary oscillation in the tropical region. The aim of this study is to diagnose the connection between the MJO signal and the regional intraseasonal rainfall variability over tropical Brazil. This is achieved through the development of an index called Multivariate Intraseasonal Index for Tropical Brazil (MITB). This index is based on Maximum Covariance Analysis (MCA) applied to the filtered daily anomalies of rainfall data over tropical Brazil against a group of covariates consisting of: outgoing longwave radiation and the zonal component u of the wind at 850 and 200 hPa. The first two MCA modes, which were used to create the MITB_1 and MITB_2 indices, represent 65% and 16% of the explained variance, respectively. The combined multivariate index was able to satisfactorily represent the pattern of intraseasonal variability over tropical Brazil, showing that there are periods of activation and inhibition of precipitation connected with the pattern of MJO propagation. The MITB index could potentially be used as a diagnostic tool for intraseasonal forecasting.
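
    At its core, MCA is a singular value decomposition of the cross-covariance matrix between two anomaly fields; the leading singular vectors give the coupled modes. A bare-bones sketch with random stand-in data for the filtered rainfall and covariate fields:

```python
# Bare-bones maximum covariance analysis (MCA). The two fields are
# random stand-ins for filtered rainfall and OLR/wind anomalies.
import numpy as np

rng = np.random.default_rng(1)
t = 500                                # time steps
rain = rng.standard_normal((t, 40))    # rainfall anomalies, 40 grid points
covar = rng.standard_normal((t, 60))   # covariate anomalies, 60 grid points
rain -= rain.mean(axis=0)
covar -= covar.mean(axis=0)

C = rain.T @ covar / (t - 1)           # cross-covariance matrix
U, s, Vt = np.linalg.svd(C, full_matrices=False)

frac = s**2 / np.sum(s**2)             # squared-covariance fraction per mode
pc1 = rain @ U[:, 0]                   # expansion coefficients of mode 1
print(f"mode 1: {frac[0]:.1%} of squared covariance")
```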

  3. Structural health monitoring (vibration) as a tool for identifying structural alterations of the lumbar spine

    DEFF Research Database (Denmark)

    Kawchuk, Gregory N; Hartvigsen, Jan; Edgecombe, Tiffany

    2016-01-01

    Structural health monitoring (SHM) is an engineering technique used to identify mechanical abnormalities not readily apparent through other means. Recently, SHM has been adapted for use in biological systems, but its invasive nature limits its clinical application. As such, the purpose of this project was to determine if a non-invasive form of SHM could identify structural alterations in the spines of living human subjects. Lumbar spines of 10 twin pairs were visualized by magnetic resonance imaging then assessed by a blinded radiologist to determine whether twin pairs were structurally concordant or discordant. Vibration was then applied to each subject's spine and the resulting response recorded from sensors overlying lumbar spinous processes. The peak frequency, area under the curve and the root mean square were computed from the frequency response function of each sensor. Statistical…
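
    The three features named in the abstract can be illustrated on synthetic signals; the H1 estimator used below for the frequency response function is an assumption, since the summary does not say which estimator the authors used.

```python
# Peak frequency, area under the curve and RMS of an estimated
# frequency response function (FRF). Signals here are synthetic.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import csd, welch

fs = 1000.0
rng = np.random.default_rng(2)
excitation = rng.standard_normal(10_000)                 # applied vibration
response = np.convolve(excitation, np.exp(-np.arange(100) / 20.0), "same")

f, Pxy = csd(excitation, response, fs=fs, nperseg=1024)  # cross spectrum
_, Pxx = welch(excitation, fs=fs, nperseg=1024)          # input spectrum
frf = np.abs(Pxy / Pxx)                                  # H1 FRF magnitude

peak_freq = f[np.argmax(frf)]      # peak frequency
auc = trapezoid(frf, f)            # area under the FRF curve
rms = np.sqrt(np.mean(frf**2))     # root mean square
print(peak_freq, auc, rms)
```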

  4. Potential of isotope analysis (C, Cl) to identify dechlorination mechanisms

    Science.gov (United States)

    Cretnik, Stefan; Thoreson, Kristen; Bernstein, Anat; Ebert, Karin; Buchner, Daniel; Laskov, Christine; Haderlein, Stefan; Shouakar-Stash, Orfan; Kliegman, Sarah; McNeill, Kristopher; Elsner, Martin

    2013-04-01

    Chloroethenes are commonly used in industrial applications and detected as carcinogenic contaminants in the environment. Their dehalogenation is of environmental importance in remediation processes. However, a frequently encountered problem is the accumulation of toxic degradation products such as cis-dichloroethylene (cis-DCE) at contaminated sites. Several studies have addressed the reductive dehalogenation reactions using biotic and abiotic model systems, but a crucial question in this context has remained open: do environmental transformations occur by the same mechanism as in their corresponding in vitro model systems? The presented study shows the potential to close this research gap using the latest developments in compound-specific chlorine isotope analysis, which make it possible to routinely measure chlorine isotope fractionation of chloroethenes in environmental samples and complex reaction mixtures [1,2]. In particular, such chlorine isotope analysis enables the measurement of isotope fractionation for two elements (i.e., C and Cl) in chloroethenes. When isotope values of both elements are plotted against each other, different slopes reflect different underlying mechanisms and are remarkably insensitive towards masking. Our results suggest that different microbial strains (G. lovleyi strain SZ, D. hafniense Y51) and the isolated cofactor cobalamin employ similar mechanisms of reductive dechlorination of TCE. In contrast, evidence for a different mechanism was obtained with cobaloxime, cautioning its use as a model for biodegradation. The study shows the potential of the dual isotope approach as a tool to directly compare transformation mechanisms of environmental scenarios, biotic transformations, and their putative chemical lab-scale systems. Furthermore, it serves as an essential reference when using the dual isotope approach to assess the fate of chlorinated compounds in the environment.

  5. Using new tools to identify eggs of Engraulis anchoita (Clupeiformes, Engraulidae).

    Science.gov (United States)

    Favero, J M; Katsuragawa, M; Zani-Teixeira, M L; Turner, J T

    2014-12-26

    The efficiency of identifying eggs of Engraulis anchoita can be greatly improved by a method based on egg measurements obtained from photographs with the ImageJ programme and analysed by discriminant analysis using R software. © 2014 The Fisheries Society of the British Isles.
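
    The workflow (morphometric measurements per egg, classified by discriminant analysis) translates directly into code; in this sketch the measurements are invented and scikit-learn's LDA stands in for the R routine used by the authors.

```python
# Toy discriminant analysis on invented egg measurements
# (e.g., major/minor axis lengths exported from ImageJ).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[1.20, 0.55], [1.25, 0.57], [1.18, 0.54],   # E. anchoita
              [0.95, 0.60], [0.98, 0.62], [0.93, 0.59]])  # other species
y = ["E. anchoita"] * 3 + ["other"] * 3

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[1.22, 0.56]]))  # -> ['E. anchoita']
```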

  6. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also found to be invaluable at the design stage to influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effect on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermalhydraulic analysis and field data, fouling margins are calculated. Individual effects of primary- and secondary-side fouling are separated through analyses, which allow station operators to decide what type of maintenance activity to perform and when to perform the maintenance activity. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance…

  7. Identifying subgroups of patients using latent class analysis

    DEFF Research Database (Denmark)

    Nielsen, Anne Mølgaard; Kent, Peter; Hestbæk, Lise

    2017-01-01

    BACKGROUND: Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However, as LBP is a complex multi-domain condition, the optimal approach when using LCA is unknown. Therefore, this paper describes the exploration of two approaches to LCA that may help improve the identification of clinically relevant and interpretable LBP subgroups. METHODS: From 928 LBP patients consulting … of statistical performance measures, qualitative evaluation of clinical interpretability (face validity) and a subgroup membership comparison. RESULTS: For the single-stage LCA, a model solution with seven patient subgroups was preferred, and for the two-stage LCA, a nine patient subgroup model. Both approaches…

  8. Optical Whole-Genome Restriction Mapping as a Tool for Rapidly Distinguishing and Identifying Bacterial Contaminants in Clinical Samples

    Science.gov (United States)

    2015-08-01

    (Report documentation: dates covered Oct 2011 – Aug 2012.) …multiple bacteria could be uniquely identified within mixtures. In the first set of experiments, three unique organisms (Bacillus subtilis subsp. globigii, …) …be useful in monitoring nosocomial outbreaks in neonatal and intensive care wards, or even as an initial screen for antibiotic-resistant strains

  9. Cluster Analysis of Clinical Data Identifies Fibromyalgia Subgroups

    Science.gov (United States)

    Docampo, Elisa; Collado, Antonio; Escaramís, Geòrgia; Carbonell, Jordi; Rivera, Javier; Vidal, Javier; Alegre, José

    2013-01-01

    Introduction Fibromyalgia (FM) is mainly characterized by widespread pain and multiple accompanying symptoms, which hinder FM assessment and management. In order to reduce FM heterogeneity we classified clinical data into simplified dimensions that were used to define FM subgroups. Material and Methods 48 variables were evaluated in 1,446 Spanish FM cases fulfilling 1990 ACR FM criteria. A partitioning analysis was performed to find groups of variables similar to each other. Similarities between variables were identified and the variables were grouped into dimensions. This was performed in a subset of 559 patients, and cross-validated in the remaining 887 patients. For each sample and dimension, a composite index was obtained based on the weights of the variables included in the dimension. Finally, a clustering procedure was applied to the indexes, resulting in FM subgroups. Results Variables clustered into three independent dimensions: “symptomatology”, “comorbidities” and “clinical scales”. Only the two first dimensions were considered for the construction of FM subgroups. Resulting scores classified FM samples into three subgroups: low symptomatology and comorbidities (Cluster 1), high symptomatology and comorbidities (Cluster 2), and high symptomatology but low comorbidities (Cluster 3), showing differences in measures of disease severity. Conclusions We have identified three subgroups of FM samples in a large cohort of FM by clustering clinical data. Our analysis stresses the importance of family and personal history of FM comorbidities. Also, the resulting patient clusters could indicate different forms of the disease, relevant to future research, and might have an impact on clinical assessment. PMID:24098674

  10. Clinical trial regulation in Argentina: overview and analysis of regulatory framework, use of existing tools, and researchers' perspectives to identify potential barriers Reglamentación de ensayos clínicos en la Argentina: panorama y análisis del marco normativo, uso de los instrumentos existentes y perspectivas de los investigadores para identificar posibles obstáculos

    Directory of Open Access Journals (Sweden)

    Lauren White

    2011-11-01

    Full Text Available OBJECTIVE: To review and analyze the regulatory framework of clinical trial registration, use of existing tools (publicly accessible national/international registration databases), and users' perspectives to identify possible barriers to registration compliance by sponsors and researchers in Argentina. METHODS: Internationally registered trials recruiting patients in Argentina were found through clinicaltrials.gov and the International Clinical Trial Registration Platform (ICTRP) and compared with publicly available clinical trials registered through the National Administration of Drugs, Foods, and Medical Devices (ANMAT). A questionnaire addressing hypothesized attitudinal, knowledge-related, idiomatic, technical, economic, and regulatory barriers that could discourage or impede registration of clinical trials was developed, and semi-structured, in-depth interviews were conducted with a purposively selected sample of researchers (investigators, sponsors, and monitors) in Argentina. RESULTS: A response rate of 74.3% (n = 29) was achieved, and 27 interviews were ultimately used for analysis. Results suggested that the high proportion of foreign-sponsored or multinational trials (64.8% of all protocols approved by ANMAT from 1994-2006) may contribute to a communication gap between locally based investigators and foreign-based administrative officials. A lack of knowledge about available international registration tools and limited awareness of the importance of registration were also identified as limiting factors for local investigators and sponsors. CONCLUSIONS: To increase compliance and promote clinical trial registration in Argentina, national health authorities, sponsors, and local investigators could take the following steps: implement a grassroots educational campaign to improve clinical trial regulation, support local investigator-sponsor-initiated clinical trials, and/or encourage local and regional scientific journal compliance with…

  11. Social Network Analysis Identifies Key Participants in Conservation Development.

    Science.gov (United States)

    Farr, Cooper M; Reed, Sarah E; Pejchar, Liba

    2018-05-01

    Understanding patterns of participation in private lands conservation, which is often implemented voluntarily by individual citizens and private organizations, could improve its effectiveness at combating biodiversity loss. We used social network analysis (SNA) to examine participation in conservation development (CD), a private land conservation strategy that clusters houses in a small portion of a property while preserving the remaining land as protected open space. Using data from public records for six counties in Colorado, USA, we compared CD participation patterns among counties and identified actors that most often work with others to implement CDs. We found that social network characteristics differed among counties. The network density, or proportion of connections in the network, varied from less than 2% to nearly 15%, and was higher in counties with smaller populations and fewer CDs. Centralization, or the degree to which connections are held disproportionately by a few key actors, was not correlated strongly with any county characteristics. Network characteristics were not correlated with the prevalence of wildlife-friendly design features in CDs. The most highly connected actors were biological and geological consultants, surveyors, and engineers. Our work demonstrates a new application of SNA to land-use planning, in which CD network patterns are examined and key actors are identified. For better conservation outcomes of CD, we recommend using network patterns to guide strategies for outreach and information dissemination, and engaging with highly connected actor types to encourage widespread adoption of best practices for CD design and stewardship.
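
    The two network measures discussed, density and centralization, are straightforward to compute; the sketch below uses a fabricated actor network and assumes Freeman's degree centralization, since the summary does not state which variant was used.

```python
# Density and (Freeman) degree centralization of an actor network.
# The actors and ties below are fabricated for illustration.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("developer", "surveyor"), ("developer", "engineer"),
                  ("developer", "consultant"), ("surveyor", "engineer")])

density = nx.density(G)  # proportion of possible ties that exist

deg = dict(G.degree())
n, d_max = len(G), max(deg.values())
centralization = sum(d_max - d for d in deg.values()) / ((n - 1) * (n - 2))
print(f"density={density:.2f}, centralization={centralization:.2f}")
```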

  12. Global Proteome Analysis Identifies Active Immunoproteasome Subunits in Human Platelets*

    Science.gov (United States)

    Klockenbusch, Cordula; Walsh, Geraldine M.; Brown, Lyda M.; Hoffman, Michael D.; Ignatchenko, Vladimir; Kislinger, Thomas; Kast, Juergen

    2014-01-01

    The discovery of new functions for platelets, particularly in inflammation and immunity, has expanded the role of these anucleate cell fragments beyond their primary hemostatic function. Here, four in-depth human platelet proteomic data sets were generated to explore potential new functions for platelets based on their protein content and this led to the identification of 2559 high confidence proteins. During a more detailed analysis, consistently high expression of the proteasome was discovered, and the composition and function of this complex, whose role in platelets has not been thoroughly investigated, was examined. Data set mining resulted in identification of nearly all members of the 26S proteasome in one or more data sets, except the β5 subunit. However, β5i, a component of the immunoproteasome, was identified. Biochemical analyses confirmed the presence of all catalytically active subunits of the standard 20S proteasome and immunoproteasome in human platelets, including β5, which was predominantly found in its precursor form. It was demonstrated that these components were assembled into the proteasome complex and that standard proteasome as well as immunoproteasome subunits were constitutively active in platelets. These findings suggest potential new roles for platelets in the immune system. For example, the immunoproteasome may be involved in major histocompatibility complex I (MHC I) peptide generation, as the MHC I machinery was also identified in our data sets. PMID:25146974

  13. Global proteome analysis identifies active immunoproteasome subunits in human platelets.

    Science.gov (United States)

    Klockenbusch, Cordula; Walsh, Geraldine M; Brown, Lyda M; Hoffman, Michael D; Ignatchenko, Vladimir; Kislinger, Thomas; Kast, Juergen

    2014-12-01

    The discovery of new functions for platelets, particularly in inflammation and immunity, has expanded the role of these anucleate cell fragments beyond their primary hemostatic function. Here, four in-depth human platelet proteomic data sets were generated to explore potential new functions for platelets based on their protein content and this led to the identification of 2559 high confidence proteins. During a more detailed analysis, consistently high expression of the proteasome was discovered, and the composition and function of this complex, whose role in platelets has not been thoroughly investigated, was examined. Data set mining resulted in identification of nearly all members of the 26S proteasome in one or more data sets, except the β5 subunit. However, β5i, a component of the immunoproteasome, was identified. Biochemical analyses confirmed the presence of all catalytically active subunits of the standard 20S proteasome and immunoproteasome in human platelets, including β5, which was predominantly found in its precursor form. It was demonstrated that these components were assembled into the proteasome complex and that standard proteasome as well as immunoproteasome subunits were constitutively active in platelets. These findings suggest potential new roles for platelets in the immune system. For example, the immunoproteasome may be involved in major histocompatibility complex I (MHC I) peptide generation, as the MHC I machinery was also identified in our data sets. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  14. Network Analysis Tools: from biological networks to clusters and pathways.

    Science.gov (United States)

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.

  15. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirement. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis

  16. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396) Participatory tools used by both male and female extension personnel include resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, …

  17. Visual Indicators on Vaccine Boxes as Early Warning Tools to Identify Potential Freeze Damage.

    Science.gov (United States)

    Angoff, Ronald; Wood, Jillian; Chernock, Maria C; Tipping, Diane

    2015-07-01

    The aim of this study was to determine whether the use of visual freeze indicators on vaccines would assist health care providers in identifying vaccines that may have been exposed to potentially damaging temperatures. Twenty-seven sites in Connecticut involved in the Vaccine for Children Program participated. In addition to standard procedures, visual freeze indicators (FREEZEmarker® L; Temptime Corporation, Morris Plains, NJ) were affixed to each box of vaccine that required refrigeration but must not be frozen. Temperatures were monitored twice daily. During the 24 weeks, all 27 sites experienced triggered visual freeze indicator events in 40 of the 45 refrigerators. A total of 66 triggered freeze indicator events occurred in all 4 types of refrigerators used. Only 1 of the freeze events was identified by a temperature-monitoring device. Temperatures recorded on vaccine data logs before freeze indicator events were within the 35°F to 46°F (2°C to 8°C) range in all but 1 instance. A total of 46,954 doses of freeze-sensitive vaccine were stored at the time of a visual freeze indicator event. Triggered visual freeze indicators were found on boxes containing 6566 doses (14.0% of total doses). Of all doses stored, 14,323 doses (30.5%) were of highly freeze-sensitive vaccine; 1789 of these doses (12.5%) had triggered indicators on the boxes. Visual freeze indicators are useful in the early identification of freeze events involving vaccines. Consideration should be given to including these devices as a component of the temperature-monitoring system for vaccines.

  18. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
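
    One of the checks described, detection of unreachable code, reduces to a reachability walk over the program's control-flow graph. A toy sketch with invented basic-block names:

```python
# Unreachable-code detection as graph reachability from the entry
# block. Block names and edges are invented for illustration.
def reachable(cfg, entry):
    seen, stack = set(), [entry]
    while stack:
        block = stack.pop()
        if block not in seen:
            seen.add(block)
            stack.extend(cfg.get(block, []))
    return seen

cfg = {"start": ["loop"], "loop": ["loop", "exit"],
       "exit": [], "orphan": ["exit"]}   # 'orphan' is never jumped to
print(set(cfg) - reachable(cfg, "start"))  # {'orphan'}
```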

  19. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, a visualization of drawer masters and the core configuration is necessary for minimizing human error during input processing. For this purpose, visualization tools for ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools will be described and various visualization samples for both drawer masters and ZPPR-15 cores will be demonstrated. Visualization tools for drawer masters and a core configuration were successfully developed for the ZPPR-15 analysis. The visualization tools are expected to be useful for understanding ZPPR-15 experiments and finding deterministic models of ZPPR-15. It turned out that generating VTK files is handy, and the application of VTK files is powerful with the aid of the VisIt program

  20. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and designers.

  1. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  2. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure maximal equipment production efficiency. This paper, based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states and presents a TEA system model that analyses tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was successfully verified; it produced the parameter values used to measure equipment performance, along with suggestions for improvement.
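
    A minimal finite-state-machine sketch of the idea follows: a transition table validates state changes and accumulates time per state. The state names and allowed transitions are simplified stand-ins, not the SEMI-defined equipment states or the paper's actual rule table.

```python
# Toy tool-state FSM: legal transitions plus time-in-state accounting.
TRANSITIONS = {
    ("standby", "productive"),
    ("productive", "standby"),
    ("productive", "unscheduled_down"),
    ("unscheduled_down", "standby"),
}

def replay(events):
    """Accumulate seconds per state from (timestamp, new_state) events."""
    time_in = {}
    (t, state), *rest = events
    for t_next, nxt in rest:
        if (state, nxt) not in TRANSITIONS:
            raise ValueError(f"illegal transition {state} -> {nxt}")
        time_in[state] = time_in.get(state, 0) + (t_next - t)
        t, state = t_next, nxt
    return time_in

print(replay([(0, "standby"), (10, "productive"), (55, "standby")]))
# {'standby': 10, 'productive': 45}
```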

  3. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction…

  4. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  5. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  6. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  8. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
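
    As an illustration of how such REST services are typically called, the Python sketch below fetches a database entry over HTTP. It assumes the requests library; the dbfetch endpoint and its query parameters follow the URL pattern of the service described above but should be checked against the current EMBL-EBI documentation, and the accession P12345 is just a placeholder.

        import requests

        # Hypothetical dbfetch query: fetch a UniProtKB entry in FASTA format.
        # Endpoint and parameter names are assumptions; consult the EMBL-EBI
        # Web Services documentation for the authoritative interface.
        URL = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
        params = {"db": "uniprotkb", "id": "P12345", "format": "fasta", "style": "raw"}

        response = requests.get(URL, params=params, timeout=30)
        response.raise_for_status()
        print(response.text)  # FASTA-formatted entry data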

  9. A Systematic Review of Tools to Measure Respiratory Rate in Order to Identify Childhood Pneumonia.

    Science.gov (United States)

    Ginsburg, Amy Sarah; Lenahan, Jennifer L; Izadnegahdar, Rasa; Ansermino, J Mark

    2018-05-01

    Pneumonia is the leading infectious cause of death in children worldwide, with most deaths occurring in developing countries. Measuring respiratory rate is critical to the World Health Organization's guidelines for diagnosing childhood pneumonia in low-resource settings, yet it is difficult to accurately measure. We conducted a systematic review to landscape existing respiratory rate measurement technologies. We searched PubMed, Embase, and Compendex for studies published through September 2017 assessing the accuracy of respiratory rate measurement technologies in children. We identified 16 studies: 2 describing manual devices and 14 describing automated devices. Although both studies describing manual devices took place in low-resource settings, all studies describing automated devices were conducted in well-resourced settings. Direct comparison between studies was complicated by small sample size, absence of a consistent reference standard, and variations in comparison methodology. There is an urgent need for affordable and appropriate innovations that can reliably measure a child's respiratory rate in low-resource settings. Accelerating development or scale-up of these technologies could have the potential to advance childhood pneumonia diagnosis worldwide.

  10. Tools to identify linear combination of prognostic factors which maximizes area under receiver operator curve.

    Science.gov (United States)

    Todor, Nicolae; Todor, Irina; Săplăcan, Gavril

    2014-01-01

    The linear combination of variables is an attractive method in many medical analyses targeting a score to classify patients. In the case of ROC curves, the most popular problem is to identify the linear combination which maximizes the area under the curve (AUC). This problem is completely solved when normality assumptions are met. With no assumption of normality, search algorithms are avoided because it is accepted that AUC must be evaluated n^d times, where n is the number of distinct observations and d is the number of variables. For d = 2, using particularities of the AUC formula, we described an algorithm which lowered the number of evaluations of AUC from n^2 to n(n-1) + 1. For d > 2 our proposed solution is an approximate method that considers equidistant points on the unit sphere in R^d at which we evaluate AUC. The algorithms were applied to data from our lab to predict response to treatment by a set of molecular markers in cervical cancer patients. In order to evaluate the strength of our algorithms, a simulation was added. When normality assumptions are not met, the presented algorithms are feasible. For many variables the computation time increases but remains acceptable.
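
    A minimal sketch of the d = 2 search idea, assuming NumPy: every linear combination of two variables can be parameterized by an angle, and the AUC of the resulting score is the Mann-Whitney pair statistic. The uniform grid over angles below is an illustrative simplification, not the authors' exact enumeration of candidate directions.

        import numpy as np

        def auc(scores, labels):
            """Mann-Whitney estimate of AUC for binary labels (1 = case)."""
            pos = scores[labels == 1]
            neg = scores[labels == 0]
            # Fraction of (case, control) pairs ranked correctly; ties count 1/2.
            diff = pos[:, None] - neg[None, :]
            return (diff > 0).mean() + 0.5 * (diff == 0).mean()

        def best_combination_2d(x1, x2, labels, n_angles=360):
            """Search directions (cos t, sin t) and keep the AUC-maximizing one."""
            best_auc, best_w = 0.0, None
            for t in np.linspace(0, np.pi, n_angles, endpoint=False):
                w = np.array([np.cos(t), np.sin(t)])
                a = auc(w[0] * x1 + w[1] * x2, labels)
                a = max(a, 1 - a)  # flipping the sign of w flips the AUC
                if a > best_auc:
                    best_auc, best_w = a, w
            return best_auc, best_w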

  11. New Tools to Identify the Location of Seagrass Meadows: Marine Grazers as Habitat Indicators

    KAUST Repository

    Hays, Graeme C.

    2018-02-21

    Seagrasses are hugely valuable to human life, but the global extent of seagrass meadows remains unclear. As evidence of their value, a United Nations program exists (http://data.unep-wcmc.org/datasets/7) to try and assess their distribution and there has been a call from 122 scientists across 28 countries for more work to manage, protect and monitor seagrass meadows (http://www.bbc.com/news/science-environment-37606827). Emerging from the 12th International Seagrass Biology Workshop, held in October 2016, has been the view that grazing marine megafauna may play a useful role in helping to identify previously unknown seagrass habitats. Here we describe this concept, showing how detailed information on the distribution of both dugongs (Dugong dugon) and green sea turtles (Chelonia mydas) obtained, for example, by aerial surveys and satellite tracking, can reveal new information on the location of seagrass meadows. We show examples of how marine megaherbivores have been effective habitat indicators, revealing major, new, deep-water seagrass meadows and offering the potential for more informed estimates of seagrass extent in tropical and sub-tropical regions where current information is often lacking.

  12. Stream analysis, a practical tool for innovators and change agents

    NARCIS (Netherlands)

    Kastelein, A.

    1993-01-01

    To survive, organizations have to innovate and to change. Fundamental questions are: * Which strategies and tools could be applied successfully in the ever-changing environment? * Are the identified instruments effective in our business? * Do we need professional support? In more than a dozen projects

  13. Mid-upper arm circumference as a screening tool for identifying children with obesity: a 12-country study.

    Science.gov (United States)

    Chaput, J-P; Katzmarzyk, P T; Barnes, J D; Fogelholm, M; Hu, G; Kuriyan, R; Kurpad, A; Lambert, E V; Maher, C; Maia, J; Matsudo, V; Olds, T; Onywera, V; Sarmiento, O L; Standage, M; Tudor-Locke, C; Zhao, P; Tremblay, M S

    2017-12-01

    No studies have examined if mid-upper arm circumference (MUAC) can be an alternative screening tool for obesity in an international sample of children differing widely in levels of human development. Our aim is to determine whether MUAC could be used to identify obesity in children from 12 countries in five major geographic regions of the world. This observational, multinational cross-sectional study included 7337 children aged 9-11 years. Anthropometric measurements were objectively assessed, and obesity was defined according to the World Health Organization reference data. In the total sample, MUAC was strongly correlated with adiposity indicators in both boys and girls (r > 0.86, p < 0.001), and the ability of MUAC to identify obesity was high in both sexes and across study sites (overall area under the curve of 0.97, sensitivity of 95% and specificity of 90%). The MUAC cut-off value to identify obesity was ~25 cm for both boys and girls. In country-specific analyses, the cut-off value to identify obesity ranged from 23.2 cm (boys in South Africa) to 26.2 cm (girls in the UK). Results from this 12-country study suggest that MUAC is a simple and accurate measurement that may be used to identify obesity in children aged 9-11 years. MUAC may be a promising screening tool for obesity in resource-limited settings. © 2016 World Obesity Federation.

  14. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP) data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report the necessity to include in these tools statistical methods designed for pathway-based analysis with SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variation data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  15. Physics analysis tools for beauty physics in ATLAS

    International Nuclear Information System (INIS)

    Anastopoulos, C; Bouhova-Thacker, E; Catmore, J; Mora, L de; Dallison, S; Derue, F; Epp, B; Jussel, P; Kaczmarska, A; Radziewski, H v; Stahl, T; Reznicek, P

    2008-01-01

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association used to process simulation data in the software validations, which are an important part of the development of the physics analysis tools

  16. Risk analysis for dengue suitability in Africa using the ArcGIS predictive analysis tools (PA tools).

    Science.gov (United States)

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Waters, Nigel M

    2016-06-01

    Risk maps identifying suitable locations for infection transmission are important for public health planning. Data on dengue infection rates are not readily available in most places where the disease is known to occur. A newly available add-in to Esri's ArcGIS software package, the ArcGIS Predictive Analysis Toolset (PA Tools), was used to identify locations within Africa with environmental characteristics likely to be suitable for transmission of dengue virus. A more accurate, robust, and localized (1 km × 1 km) dengue risk map for Africa was created based on bioclimatic layers, elevation data, high-resolution population data, and other environmental factors that a search of the peer-reviewed literature showed to be associated with dengue risk. Variables related to temperature, precipitation, elevation, and population density were identified as good predictors of dengue suitability. Areas of high dengue suitability occur primarily within West Africa and parts of Central Africa and East Africa, but even in these regions the suitability is not homogenous. This risk mapping technique for an infection transmitted by Aedes mosquitoes draws on entomological, epidemiological, and geographic data. The method could be applied to other infectious diseases (such as Zika) in order to provide new insights for public health officials and others making decisions about where to increase disease surveillance activities and implement infection prevention and control efforts. The ability to map threats to human and animal health is important for tracking vectorborne and other emerging infectious diseases and modeling the likely impacts of climate change. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  18. Building a Natural Language Processing Tool to Identify Patients With High Clinical Suspicion for Kawasaki Disease from Emergency Department Notes.

    Science.gov (United States)

    Doan, Son; Maehara, Cleo K; Chaparro, Juan D; Lu, Sisi; Liu, Ruiling; Graham, Amanda; Berry, Erika; Hsu, Chun-Nan; Kanegaye, John T; Lloyd, David D; Ohno-Machado, Lucila; Burns, Jane C; Tremoulet, Adriana H

    2016-05-01

    Delayed diagnosis of Kawasaki disease (KD) may lead to serious cardiac complications. We sought to create and test the performance of a natural language processing (NLP) tool, the KD-NLP, in the identification of emergency department (ED) patients for whom the diagnosis of KD should be considered. We developed an NLP tool that recognizes the KD diagnostic criteria based on standard clinical terms and medical word usage using 22 pediatric ED notes augmented by Unified Medical Language System vocabulary. With high suspicion for KD defined as fever and three or more KD clinical signs, KD-NLP was applied to 253 ED notes from children ultimately diagnosed with either KD or another febrile illness. We evaluated KD-NLP performance against ED notes manually reviewed by clinicians and compared the results to a simple keyword search. KD-NLP identified high-suspicion patients with a sensitivity of 93.6% and specificity of 77.5% compared to notes manually reviewed by clinicians. The tool outperformed a simple keyword search (sensitivity = 41.0%; specificity = 76.3%). KD-NLP showed comparable performance to clinician manual chart review for identification of pediatric ED patients with a high suspicion for KD. This tool could be incorporated into the ED electronic health record system to alert providers to consider the diagnosis of KD. KD-NLP could serve as a model for decision support for other conditions in the ED. © 2016 by the Society for Academic Emergency Medicine.
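
    The high-suspicion rule itself is easy to state in code. A minimal sketch, assuming the NLP front end has already reduced a note to a set of extracted concepts; the concept names are illustrative placeholders, not KD-NLP's actual vocabulary.

        # Classic KD criteria; names are illustrative placeholders for the
        # concepts an NLP pipeline would extract from an ED note.
        KD_SIGNS = {"rash", "conjunctivitis", "oral_changes",
                    "extremity_changes", "cervical_lymphadenopathy"}

        def high_suspicion_for_kd(findings):
            """findings: set of concept strings extracted from the note."""
            has_fever = "fever" in findings
            n_signs = len(KD_SIGNS & findings)
            # High suspicion = fever plus three or more KD clinical signs.
            return has_fever and n_signs >= 3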

  19. Computational Biology Tools for Identifying Specific Ligand Binding Residues for Novel Agrochemical and Drug Design.

    Science.gov (United States)

    Neshich, Izabella Agostinho Pena; Nishimura, Leticia; de Moraes, Fabio Rogerio; Salim, Jose Augusto; Villalta-Romero, Fabian; Borro, Luiz; Yano, Inacio Henrique; Mazoni, Ivan; Tasic, Ljubica; Jardine, Jose Gilberto; Neshich, Goran

    2015-01-01

    The term "agrochemicals" is used in its generic form to represent a spectrum of pesticides, such as insecticides, fungicides or bactericides. They contain active components designed for optimized pest management and control, therefore allowing for economically sound and labor efficient agricultural production. A "drug" on the other side is a term that is used for compounds designed for controlling human diseases. Although drugs are subjected to much more severe testing and regulation procedures before reaching the market, they might contain exactly the same active ingredient as certain agrochemicals, what is the case described in present work, showing how a small chemical compound might be used to control pathogenicity of Gram negative bacteria Xylella fastidiosa which devastates citrus plantations, as well as for control of, for example, meningitis in humans. It is also clear that so far the production of new agrochemicals is not benefiting as much from the in silico new chemical compound identification/discovery as pharmaceutical production. Rational drug design crucially depends on detailed knowledge of structural information about the receptor (target protein) and the ligand (drug/agrochemical). The interaction between the two molecules is the subject of analysis that aims to understand relationship between structure and function, mainly deciphering some fundamental elements of the nanoenvironment where the interaction occurs. In this work we will emphasize the role of understanding nanoenvironmental factors that guide recognition and interaction of target protein and its function modifier, an agrochemical or a drug. The repertoire of nanoenvironment descriptors is used for two selected and specific cases we have approached in order to offer a technological solution for some very important problems that needs special attention in agriculture: elimination of pathogenicity of a bacterium which is attacking citrus plants and formulation of a new fungicide. Finally

  20. Behavioral metabolomics analysis identifies novel neurochemical signatures in methamphetamine sensitization

    Science.gov (United States)

    Adkins, Daniel E.; McClay, Joseph L.; Vunck, Sarah A.; Batman, Angela M.; Vann, Robert E.; Clark, Shaunna L.; Souza, Renan P.; Crowley, James J.; Sullivan, Patrick F.; van den Oord, Edwin J.C.G.; Beardsley, Patrick M.

    2014-01-01

    Behavioral sensitization has been widely studied in animal models and is theorized to reflect neural modifications associated with human psychostimulant addiction. While the mesolimbic dopaminergic pathway is known to play a role, the neurochemical mechanisms underlying behavioral sensitization remain incompletely understood. In the present study, we conducted the first metabolomics analysis to globally characterize neurochemical differences associated with behavioral sensitization. Methamphetamine-induced sensitization measures were generated by statistically modeling longitudinal activity data for eight inbred strains of mice. Subsequent to behavioral testing, nontargeted liquid and gas chromatography-mass spectrometry profiling was performed on 48 brain samples, yielding 301 metabolite levels per sample after quality control. Association testing between metabolite levels and three primary dimensions of behavioral sensitization (total distance, stereotypy and margin time) showed four robust, significant associations at a stringent metabolome-wide significance threshold (false discovery rate < 0.05). Results implicated homocarnosine, a dipeptide of GABA and histidine, in total distance sensitization, GABA metabolite 4-guanidinobutanoate and pantothenate in stereotypy sensitization, and myo-inositol in margin time sensitization. Secondary analyses indicated that these associations were independent of concurrent methamphetamine levels and, with the exception of the myo-inositol association, suggest a mechanism whereby strain-based genetic variation produces specific baseline neurochemical differences that substantially influence the magnitude of MA-induced sensitization. These findings demonstrate the utility of mouse metabolomics for identifying novel biomarkers, and developing more comprehensive neurochemical models, of psychostimulant sensitization. PMID:24034544

  1. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts through the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach which can capture the relationship between the design parameters and the product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
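
    A minimal sketch of a one-at-a-time version of such screening: perturb each design parameter around its baseline, re-evaluate the impact model, and rank parameters by the relative change in impact. The PCB impact function here is a hypothetical stand-in for a real LCA model, and the paper's own sensitivity formulation may differ.

        def sensitivity_ranking(impact, baseline, delta=0.10):
            """Relative change in impact per +10% change in each parameter."""
            base = impact(**baseline)
            scores = {}
            for name, value in baseline.items():
                perturbed = dict(baseline, **{name: value * (1 + delta)})
                scores[name] = abs(impact(**perturbed) - base) / base
            return sorted(scores.items(), key=lambda kv: -kv[1])

        # Hypothetical PCB impact model: CO2 roughly scales with panel area.
        co2 = lambda area, layers, thickness: 2.5 * area * layers + 0.1 * thickness
        print(sensitivity_ranking(co2, {"area": 1.0, "layers": 4, "thickness": 1.6}))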

  2. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the way products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regard to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout the four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  3. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  4. Multivariate data analysis as a fast tool in evaluation of solid state phenomena

    DEFF Research Database (Denmark)

    Jørgensen, Anna Cecilia; Miroshnyk, Inna; Karjalainen, Milja

    2006-01-01

    of information generated can be overwhelming and the need for more effective data analysis tools is well recognized. The aim of this study was to investigate the use of multivariate data analysis, in particular principal component analysis (PCA), for fast analysis of solid state information. The data sets ... the molecular level interpretation of the structural changes related to the loss of water, as well as interpretation of the phenomena related to the crystallization. The critical temperatures or critical time points were identified easily using principal component analysis. The variables (diffraction angles or wavenumbers) that changed could be identified by careful interpretation of the loadings plots. The PCA approach provides an effective tool for fast screening of solid state information.

  6. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Science.gov (United States)

    Heffron, Renee; Cohen, Craig R; Ngure, Kenneth; Bukusi, Elizabeth; Were, Edwin; Kiarie, James; Mugo, Nelly; Celum, Connie; Baeten, Jared M

    2015-01-01

    HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, woman's number of children living, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the probability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.
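
    A minimal sketch of how such an additive score is applied, assuming hypothetical point values; the published score assigns points to each of the five predictors and flags couples scoring 7 or higher, but the weights below are placeholders, not the validated ones.

        # Hypothetical point assignments -- NOT the published weights.
        def pregnancy_likelihood_score(age_under_25, few_children_living,
                                       short_partnership, condomless_sex_past_month,
                                       no_effective_contraception):
            points = 0
            points += 2 if age_under_25 else 0
            points += 2 if few_children_living else 0
            points += 1 if short_partnership else 0
            points += 2 if condomless_sex_past_month else 0
            points += 2 if no_effective_contraception else 0
            return points

        # Scores >= 7 predicted >17%/year pregnancy incidence in the study.
        def high_likelihood(score, threshold=7):
            return score >= threshold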

  7. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
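
    The wrapper pattern described above (rewrite a text input file with new design-variable values, run the code, scrape a response from the output) can be sketched generically in Python. Every file name, template marker, output label and the executable below are hypothetical; they stand in for NDARC- or CAMRAD II-style file interfaces.

        import re
        import subprocess
        from pathlib import Path

        def run_wrapped_code(design_vars, template="input.tmpl",
                             infile="case.in", outfile="case.out",
                             exe="./analysis_code"):
            # 1. Substitute design-variable values into a templated input file.
            text = Path(template).read_text()
            for name, value in design_vars.items():
                text = text.replace("{" + name + "}", str(value))
            Path(infile).write_text(text)

            # 2. Execute the (hypothetical) analysis code on the new input file.
            subprocess.run([exe, infile], check=True)

            # 3. Scrape a response value from the output, e.g. "GrossWeight = 1234.5".
            match = re.search(r"GrossWeight\s*=\s*([-+.\dEe]+)",
                              Path(outfile).read_text())
            return float(match.group(1))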

  8. Trabecular bone score as an assessment tool to identify the risk of osteoporosis in axial spondyloarthritis: a case-control study.

    Science.gov (United States)

    Kang, Kwi Young; Goo, Hye Yeon; Park, Sung-Hwan; Hong, Yeon Sik

    2018-03-01

    To compare the trabecular bone score (TBS) between patients with axial spondyloarthritis (axSpA) and matched normal controls and identify risk factors associated with a low TBS. TBS and BMD were assessed in the two groups (axSpA and control) using DXA. Osteoporosis risk factors and inflammatory markers were also assessed. Disease activity and radiographic progression in the sacroiliac joint and spine were evaluated in the axSpA group. Multivariate linear regression analysis was performed to identify risk factors associated with TBS. In the axSpA group, 248 subjects were enrolled; an equal number of age- and sex-matched subjects comprised the control group. The mean TBS was 1.43 (0.08) and 1.38 (0.12) in the control and axSpA groups, respectively (P < 0.001). These findings suggest that TBS may serve as an assessment tool to identify the risk of osteoporosis in patients with axSpA.

  9. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides the possibility to relate histopathological data with neuropsychological and clinical variables. This interactive visualization tool makes it possible to find unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists' productivity.

  10. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out whether they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and design...

  11. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment tends to cover potential candidate software and subsequently c...

  12. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  13. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  14. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  15. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  16. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models which may be applied as tools of analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. Then the study characterises the best known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  17. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis (ICA) in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic indexing framework. ... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.

  18. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics computing codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanical tools were compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanical codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue in LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated, in order to identify the reasons for differences in the results of different codes. The comparison of the tools is used for further improvements of the codes applied by the partners.
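
    A minimal Monte Carlo sketch, assuming NumPy, of the kind of computation such probabilistic codes perform: sample an uncertain initial crack depth and growth rate, march over operating years, and accumulate the probability that depth exceeds the wall thickness. The growth law and all parameter values are illustrative assumptions, not any benchmarked model.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, years = 100_000, 40
        wall = 10.0                                            # wall thickness, mm

        a = rng.lognormal(mean=0.0, sigma=0.3, size=n_samples)       # initial depth, mm
        growth = rng.lognormal(mean=-2.0, sigma=0.5, size=n_samples) # growth, mm/year

        failed = np.zeros(n_samples, dtype=bool)
        for year in range(1, years + 1):
            a = a + growth                 # simplistic stand-in for a growth law
            failed |= a >= wall            # a sample fails once depth reaches the wall
            print(year, failed.mean())     # cumulative failure probability by this year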

  19. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.

  20. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  1. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.

  2. Energy Zones Study: A Comprehensive Web-Based Mapping Tool to Identify and Analyze Clean Energy Zones in the Eastern Interconnection

    Energy Technology Data Exchange (ETDEWEB)

    Koritarov, V.; Kuiper, J.; Hlava, K.; Orr, A.; Rollins, K.; Brunner, D.; Green, H.; Makar, J.; Ayers, A.; Holm, M.; Simunich, K.; Wang, J.; Augustine, C.; Heimiller, D.; Hurlbut, D. J.; Milbrandt, A.; Schneider, T. R.; et al.

    2013-09-01

    This report describes the work conducted in support of the Eastern Interconnection States’ Planning Council (EISPC) Energy Zones Study and the development of the Energy Zones Mapping Tool performed by a team of experts from three National Laboratories. The multi-laboratory effort was led by Argonne National Laboratory (Argonne), in collaboration with the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). In June 2009, the U.S. Department of Energy (DOE) and the National Energy Technology Laboratory published Funding Opportunity Announcement FOA-0000068, which invited applications for interconnection-level analysis and planning. In December 2009, the Eastern Interconnection Planning Collaborative (EIPC) and the EISPC were selected as two award recipients for the Eastern Interconnection. Subsequently, in 2010, DOE issued Research Call RC-BM-2010 to DOE’s Federal Laboratories to provide research support and assistance to FOA-0000068 awardees on a variety of key subjects. Argonne was selected as the lead laboratory to provide support to EISPC in developing a methodology and a mapping tool for identifying potential clean energy zones in the Eastern Interconnection. In developing the EISPC Energy Zones Mapping Tool (EZ Mapping Tool), Argonne, NREL, and ORNL closely collaborated with the EISPC Energy Zones Work Group which coordinated the work on the Energy Zones Study. The main product of the Energy Zones Study is the EZ Mapping Tool, which is a web-based decision support system that allows users to locate areas with high suitability for clean power generation in the U.S. portion of the Eastern Interconnection. The mapping tool includes 9 clean (low- or no-carbon) energy resource categories and 29 types of clean energy technologies. The EZ Mapping Tool contains an extensive geographic information system database and allows the user to apply a flexible modeling approach for the identification and analysis of potential energy zones

  3. Risk Prioritization Tool to Identify the Public Health Risks of Wildlife Trade: The Case of Rodents from Latin America.

    Science.gov (United States)

    Bueno, I; Smith, K M; Sampedro, F; Machalaba, C C; Karesh, W B; Travis, D A

    2016-06-01

    Wildlife trade (both formal and informal) is a potential driver of disease introduction and emergence. Legislative proposals aim to prevent these risks by banning wildlife imports and creating 'white lists' of species that are cleared for importation. These approaches pose economic harm to the pet industry and place a substantial burden on importers and/or federal agencies to provide proof of low risk for importation of individual species. As a feasibility study, a risk prioritization tool was developed to rank the pathogens found in rodent species imported from Latin America into the United States according to their risk of zoonotic consequences in the United States. Four formally traded species and 16 zoonotic pathogens were identified. Risk scores were based on the likelihood of pathogen release and human exposure, and the severity of the disease (consequences). Based on the methodology applied, three pathogens (Mycobacterium microti, Giardia spp. and Francisella tularensis) in one species (Cavia porcellus) were ranked as highest concern. The goal of this study was to present a methodological approach by which preliminary management resources can be allocated to the identified high-concern pathogen-species combinations when warranted. This tool can be expanded to other taxa and geographic locations to inform policy surrounding the wildlife trade. © 2015 Blackwell Verlag GmbH.

  4. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  5. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
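
    One of the bounds such graph analysis yields, the critical-path lower bound on schedule length, can be sketched with a longest-path computation in topological order (Python's standard graphlib suffices). The example dataflow graph and node costs are illustrative.

        from graphlib import TopologicalSorter

        def critical_path_length(deps, cost):
            """deps: node -> set of predecessor nodes; cost: node -> execution time."""
            finish = {}
            # Visit nodes with all predecessors already scheduled.
            for node in TopologicalSorter(deps).static_order():
                start = max((finish[p] for p in deps.get(node, ())), default=0)
                finish[node] = start + cost[node]
            return max(finish.values())

        # Example dataflow graph: C depends on A and B, D depends on C.
        deps = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C"}}
        cost = {"A": 2, "B": 3, "C": 1, "D": 2}
        print(critical_path_length(deps, cost))  # -> 6, along B -> C -> D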

  6. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...... paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...

  7. An online database for plant image analysis software tools

    OpenAIRE

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-01-01

    Background: Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is...

  8. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
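
    The n-factor combinatorial idea can be illustrated for n = 2: for each pair of parameters, enumerate all value combinations while holding the remaining parameters at defaults, which covers every two-way interaction with fewer cases than the full cross product. This naive sketch is not the tool's actual covering-array generator.

        from itertools import combinations, product

        def two_factor_cases(params, defaults):
            """params: name -> list of candidate values; defaults: name -> value."""
            for p1, p2 in combinations(params, 2):
                for v1, v2 in product(params[p1], params[p2]):
                    case = dict(defaults)      # all other parameters at defaults
                    case[p1], case[p2] = v1, v2
                    yield case

        # Four illustrative parameters with three levels each.
        params = {p: [1, 2, 3] for p in ("a", "b", "c", "d")}
        defaults = {p: 1 for p in params}
        print(sum(1 for _ in two_factor_cases(params, defaults)))  # 54 vs 81 full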

  9. Integration of molecular biology tools for identifying promoters and genes abundantly expressed in flowers of Oncidium Gower Ramsey

    Directory of Open Access Journals (Sweden)

    Tung Shu-Yun

    2011-04-01

    Background: Orchids comprise one of the largest families of flowering plants and generate commercially important flowers. However, model plants, such as Arabidopsis thaliana, do not contain all plant genes, and agronomically and horticulturally important genera and species must be individually studied. Results: Several molecular biology tools were used to isolate flower-specific gene promoters from Oncidium 'Gower Ramsey' (Onc. GR). A cDNA library of reproductive tissues was used to construct a microarray in order to compare gene expression in flowers and leaves. Five genes were highly expressed in flower tissues, and the subcellular locations of the corresponding proteins were identified using lip transient transformation with fluorescent protein-fusion constructs. BAC clones of the 5 genes, together with 7 previously published flower- and reproductive growth-specific genes in Onc. GR, were identified for cloning of their promoter regions. Interestingly, 3 of the 5 novel flower-abundant genes were putative trypsin inhibitor (TI) genes (OnTI1, OnTI2 and OnTI3), which were tandemly duplicated in the same BAC clone. Their promoters were identified using transient GUS reporter gene transformation and stable A. thaliana transformation analyses. Conclusions: By combining cDNA microarray, BAC library, and bombardment assay techniques, we successfully identified flower-directed orchid genes and promoters.

  10. Identifying the new Influencers in the Internet Era: Social Media and Social Network Analysis

    Directory of Open Access Journals (Sweden)

    MIGUEL DEL FRESNO GARCÍA

    2016-01-01

    Social media influencers (SMIs) can be defined as a new type of independent actor who are able to shape audience attitudes through the use of social media channels, in competition and coexistence with professional media. Being able to accurately identify SMIs is critical no matter what is being transmitted in a social system. Social Network Analysis (SNA) has been recognized as a powerful tool for representing social network structures and information dissemination. SMIs can be identified by their high-ranking position in a network as the most important or central nodes. The results reveal the existence of three different typologies of SMIs: disseminator, engager and leader. This methodology permits the optimization of resources to create effective online communication strategies.
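
    A minimal sketch of the SNA step, assuming the networkx library and a toy interaction graph; ranking nodes by a centrality measure such as PageRank surfaces candidate influencers. The graph and the choice of measure are illustrative, not the paper's dataset or method.

        import networkx as nx

        # Toy directed graph: an edge u -> v means "u mentions/retweets v".
        g = nx.DiGraph([("ana", "hub"), ("bob", "hub"), ("cat", "hub"),
                        ("hub", "leader"), ("dan", "leader"), ("eve", "dan")])

        # PageRank is one common centrality proxy for influence.
        by_pagerank = sorted(nx.pagerank(g).items(), key=lambda kv: -kv[1])
        print(by_pagerank[:3])  # highest-ranked nodes = candidate influencers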

  11. Gastric Cancer Associated Genes Identified by an Integrative Analysis of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Bing Jiang

    2017-01-01

    Gastric cancer is one of the most severe complex diseases, with high morbidity and mortality in the world. The molecular mechanisms and risk factors for this disease are still not clear, owing to the cancer heterogeneity caused by different genetic and environmental factors. With more and more expression data accumulated nowadays, we can perform integrative analysis of these data to understand the complexity of gastric cancer and to identify consensus players for this heterogeneous cancer. In the present work, we screened the published gene expression data and analyzed them with an integrative tool, combined with pathway and gene ontology enrichment investigation. We identified several consensus differentially expressed genes, and these genes were further confirmed with literature mining; at last, two genes, that is, immunoglobulin J chain and C-X-C motif chemokine ligand 17, were screened as novel gastric cancer associated genes. Experimental validation is proposed to further confirm this finding.

  12. Analysis tools for discovering strong parity violation at hadron colliders

    International Nuclear Information System (INIS)

    Backovic, Mihailo; Ralston, John P.

    2011-01-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or 'azimuthal flow'. Analysis uses the representations of the orthogonal group O(2) and dihedral groups D_N necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single 'reaction plane'. Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of 'event-shape sorting' to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.
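
    For orientation, the conventional event-by-event azimuthal observables that this group-theoretic analysis generalizes can be written down directly; the sketch below computes flow Q-vectors and event-plane angles for toy data and is not the authors' method.

```python
# Event-by-event flow vector Q_n = sum_j exp(i * n * phi_j) and the
# harmonic-n "reaction plane" angle Psi_n = arg(Q_n) / n.
import numpy as np

rng = np.random.default_rng(0)
phi = rng.uniform(0, 2 * np.pi, size=500)   # toy azimuthal angles, one event

for n in (1, 2, 3):
    Qn = np.sum(np.exp(1j * n * phi))
    psi = np.angle(Qn) / n                  # collective angle for harmonic n
    vn = np.abs(Qn) / phi.size              # crude flow-coefficient estimate
    print(f"n={n}: |Q|/M = {vn:.3f}, Psi = {psi:+.3f} rad")
```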

  13. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  14. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, covering the functional definition, architecture and effectiveness of the DERAT, and presents the test results.

  15. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; it is therefore very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields a system reliability matrix, and the reliability of the network system can be deduced by integrating all the reliability indexes in this matrix. Using this method, we developed a reliability analysis tool for SpaceWire networks based on Visual C++, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. With this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool thus has a direct influence on both task division and topology selection in the SpaceWire network design phase.
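
    The series/parallel bookkeeping behind such task-based reliability estimates can be sketched as follows; the component reliabilities and path structure are invented, and the tool's actual matrix formulation is not reproduced here.

```python
# Reliability of a task path: series components must all work;
# redundant alternatives (parallel) need at least one to work.
from functools import reduce

def series(rs):
    return reduce(lambda a, b: a * b, rs, 1.0)

def parallel(rs):
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), rs, 1.0)

node, link, router = 0.999, 0.995, 0.990   # assumed per-component reliabilities

basic = series([node, link, router, link, node])                    # single string
redundant = series([node, parallel([series([link, router, link])] * 2), node])

print(f"basic path:     {basic:.6f}")
print(f"dual-redundant: {redundant:.6f}")   # higher, as the analysis found
```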

  16. Life-history strategies as a tool to identify conservation constraints: A case-study on ants in chalk grasslands

    NARCIS (Netherlands)

    Noordwijk, C.G.E.; Boer, P.; Mabelis, A.A.; Verberk, W.C.E.P.; Siepel, H.

    2012-01-01

    Species’ life-history traits underlie species–environment relationships. Therefore, analysis of species traits, combined into life-history strategies, can be used to identify key factors shaping the local species composition. This is demonstrated in a case-study on ants in chalk grasslands. We

  17. Spatial-temporal analysis of dengue deaths: identifying social vulnerabilities

    Directory of Open Access Journals (Sweden)

    Maria do Socorro da Silva

    Full Text Available Abstract: INTRODUCTION: Currently, dengue fever, chikungunya fever, and Zika virus represent serious public health issues in Brazil, despite efforts to control the vector, the Aedes aegypti mosquito. METHODS: This was a descriptive and ecological study of dengue deaths occurring from 2002 to 2013 in São Luís, Maranhão, Brazil. Geoprocessing software was used to draw maps, linking the geo-referenced deaths with urban/social data at census tract level. RESULTS: There were 74 deaths, concentrated in areas of social vulnerability. CONCLUSIONS: The use of geo-technology tools pointed to a concentration of dengue deaths in specific intra-urban areas.

  18. Spatial-temporal analysis of dengue deaths: identifying social vulnerabilities.

    Science.gov (United States)

    Silva, Maria do Socorro da; Branco, Maria Dos Remédios Freitas Carvalho; Aquino, José; Queiroz, Rejane Christine de Sousa; Bani, Emanuele; Moreira, Emnielle Pinto Borges; Medeiros, Maria Nilza Lima; Rodrigues, Zulimar Márita Ribeiro

    2017-01-01

    Currently, dengue fever, chikungunya fever, and Zika virus represent serious public health issues in Brazil, despite efforts to control the vector, the Aedes aegypti mosquito. This was a descriptive and ecological study of dengue deaths occurring from 2002 to 2013 in São Luís, Maranhão, Brazil. Geoprocessing software was used to draw maps, linking the geo-referenced deaths with urban/social data at census tract level. There were 74 deaths, concentrated in areas of social vulnerability. The use of geo-technology tools pointed to a concentration of dengue deaths in specific intra-urban areas.

  19. Status of CONRAD, a nuclear reaction analysis tool

    International Nuclear Information System (INIS)

    Saint Jean, C. de; Habert, B.; Litaize, O.; Noguere, G.; Suteau, C.

    2008-01-01

    The development of the software tool CONRAD was initiated at CEA/Cadarache to address various problems arising in the data analysis of nuclear reactions. The tool is characterized by its handling of uncertainties, from experimental values to covariance matrices for multi-group cross sections. An object-oriented design was chosen, allowing an easy interface with graphical tools for input/output data and providing a natural framework for innovative nuclear models (fission). The major achieved developments are: a data model for describing channels, nuclear reactions, nuclear models and processes, with interfaces to classical data formats; theoretical calculations for the resolved resonance range (Reich-Moore) and unresolved resonance range (Hauser-Feshbach, Gilbert-Cameron, ...), with nuclear model parameter adjustment on experimental data sets; and a Monte Carlo method based on conditional probabilities, developed to properly calculate covariance matrices. The on-going developments deal with the experimental data description (covariance matrices) and the graphical user interface. (authors)

  20. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
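
    The inverse step described above, minimizing squared differences between simulated and measured flows with regularization, can be illustrated with a toy forward model; the sketch below uses scipy's least_squares in place of MODFLOW and PEST, and every number in it is invented.

```python
# Estimate per-layer hydraulic conductivity K from a noisy borehole flow profile.
import numpy as np
from scipy.optimize import least_squares

true_k = np.array([1.0, 8.0, 1.2, 0.5])        # toy layer conductivities

def forward(k):
    """Toy forward model: flow in the borehole accumulates over layers."""
    return np.cumsum(k)

rng = np.random.default_rng(7)
measured = forward(true_k) + rng.normal(0, 0.05, true_k.size)

def residuals(log_k, weight=0.02):
    fit = forward(np.exp(log_k)) - measured
    smooth = weight * np.diff(log_k)            # regularization: limit K variance
    return np.concatenate([fit, smooth])

sol = least_squares(residuals, x0=np.zeros(true_k.size))
print("true K:     ", true_k)
print("estimated K:", np.round(np.exp(sol.x), 2))
```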

  1. Staff Performance Analysis: A Method for Identifying Brigade Staff Tasks

    National Research Council Canada - National Science Library

    Ford, Laura

    1997-01-01

    ... members of conventional mounted brigade staff. Initial analysis of performance requirements in existing documentation revealed that the performance specifications were not sufficiently detailed for brigade battle staffs...

  2. Development of a Simple Tool for Identifying Alcohol Use Disorder in Female Korean Drinkers from Previous Questionnaires.

    Science.gov (United States)

    Seo, Yu Ri; Kim, Jong Sung; Kim, Sung Soo; Yoon, Seok Joon; Suh, Won Yoon; Youn, Kwangmi

    2016-01-01

    This study aimed to develop a simple tool for identifying alcohol use disorders in female Korean drinkers from previous questionnaires. This research was conducted on 400 women who consumed at least one alcoholic drink during the past month and visited the health promotion center at Chungnam National University Hospital between June 2013 and May 2014. Drinking habits and alcohol use disorders were assessed by structured interviews using the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition diagnostic criteria. The subjects were also asked to answer the Alcohol Use Disorders Identification Test (AUDIT), AUDIT-Consumption, CAGE (Cut down, Annoyed, Guilty, Eye-opener), TWEAK (Tolerance, Worried, Eye-opener, Amnesia, Kut down), TACE (Tolerance, Annoyed, Cut down, Eye-opener), and NET (Normal drinker, Eye-opener, Tolerance) questionnaires. The area under the receiver operating characteristic curve (AUROC) of each question on alcohol use disorders was assessed. The combination of the two questions with the largest AUROC was then compared to the previous questionnaires. Among the 400 subjects, 58 (14.5%) were identified as having an alcohol use disorder. The two questions with the largest AUROC were question no. 7 in AUDIT, "How often during the last year have you had a feeling of guilt or remorse after drinking?" and question no. 5 in AUDIT, "How often during the past year have you failed to do what was normally expected from you because of drinking?" with AUROCs (95% confidence interval [CI]) of 0.886 (0.850-0.915) and 0.862 (0.824-0.894), respectively. The AUROC (95% CI) of the combination of the two questions was 0.958 (0.934-0.976), with no significant difference compared to the existing questionnaire with the largest AUROC (AUDIT). The above results suggest that the simple tool consisting of questions no. 5 and no. 7 in AUDIT is useful in identifying alcohol use disorders in Korean female drinkers.
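
    The AUROC comparison at the heart of this design can be reproduced on synthetic data; the sketch below invents Likert-style answers for the two AUDIT items (the clinical responses are not public) just to show the mechanics.

```python
# Compare single-item AUROCs with the AUROC of a two-item combination.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 400
disorder = rng.random(n) < 0.145                 # ~14.5% prevalence, as reported

# Toy 0-4 answers to AUDIT items 5 and 7, shifted upward for cases
q5 = np.clip(rng.poisson(0.3 + 1.5 * disorder), 0, 4)
q7 = np.clip(rng.poisson(0.3 + 1.8 * disorder), 0, 4)

print("AUROC q5 alone:     ", round(roc_auc_score(disorder, q5), 3))
print("AUROC q7 alone:     ", round(roc_auc_score(disorder, q7), 3))
print("AUROC q5 + q7 score:", round(roc_auc_score(disorder, q5 + q7), 3))
```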

  3. AN ANALYSIS OF THE CAUSES OF PRODUCT DEFECTS USING QUALITY MANAGEMENT TOOLS

    Directory of Open Access Journals (Sweden)

    Katarzyna MIDOR

    2014-10-01

    Full Text Available To maintain or strengthen its position in the market, a modern business needs to follow the principles of quality control in its actions. Especially important is the Zero Defects concept developed by Philip Crosby, which means flawless production. The concept consists of preventing the occurrence of defects and flaws in all production stages. To achieve that, we must, among other things, make use of quality management tools. This article presents an analysis of the reasons for the return of damaged or faulty goods in the automotive industry by means of quality management tools such as the Ishikawa diagram and Pareto analysis, which allow us to identify the causes of product defectiveness. Based on the results, preventive measures have been proposed. The actions presented in this article and the results of the analysis demonstrate the effectiveness of the aforementioned quality management tools.
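
    The Pareto step is mechanical enough to sketch directly; the defect categories and counts below are invented for illustration.

```python
# Rank defect causes and accumulate their share to expose the "vital few".
defects = {"scratched housing": 52, "loose connector": 31, "wrong label": 9,
           "missing screw": 5, "discoloration": 3}

total = sum(defects.values())
cumulative = 0.0
print(f"{'cause':20s} {'count':>5s} {'cum %':>7s}")
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * count / total
    flag = "  <- vital few" if cumulative <= 80.0 else ""
    print(f"{cause:20s} {count:5d} {cumulative:6.1f}%{flag}")
```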

  4. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way for new features in existing tools and for the development of new tools for specific applications, such as thermodynamic and economic optimization, prediction of remaining component lifetime, and fault diagnostics, resulting in improvements in plant performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts generated by the heat and mass balance programs can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed with tools such as condition monitoring systems and artificial neural networks. The increasing number of tools and their various designs and application areas make the choice of the most adequate tool for a given application difficult. In this thesis, the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are also briefly reviewed. This thesis also covers programming techniques and calculation methods for part-load calculations using local linearization, which has been implemented in an in-house heat and mass balance program developed by the author.

  5. Microscopy image segmentation tool: Robust image data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
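
    MIST itself is not open for inspection here, but the generic threshold-and-label pattern for ROI segmentation can be sketched with scipy; the synthetic image and fixed threshold below are assumptions, not MIST's algorithm.

```python
# Threshold a grayscale image and label connected regions of interest.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
image = rng.random((128, 128))          # noisy background
image[30:40, 30:40] += 2.0              # two synthetic bright ROIs
image[80:95, 60:70] += 2.0

mask = image > 1.5                      # fixed threshold (toy choice)
labels, n_rois = ndimage.label(mask)    # connected-component labeling
areas = ndimage.sum(mask, labels, index=range(1, n_rois + 1))

print(f"found {n_rois} ROIs with pixel areas {areas.astype(int).tolist()}")
```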

  6. Microscopy image segmentation tool: Robust image data analysis

    Science.gov (United States)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  7. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analyses. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  8. Structural identifiability analysis of a cardiovascular system model.

    Science.gov (United States)

    Pironet, Antoine; Dauby, Pierre C; Chase, J Geoffrey; Docherty, Paul D; Revie, James A; Desaive, Thomas

    2016-05-01

    The six-chamber cardiovascular system model of Burkhoff and Tyberg has been used in several theoretical and experimental studies. However, this cardiovascular system model (and others derived from it) is not identifiable from every output set. In this work, two such cases of structural non-identifiability are first presented. These cases occur when the model output set only contains a single type of information (pressure or volume). A specific output set is thus chosen, mixing pressure and volume information and containing only a limited number of clinically available measurements. Then, by manipulating the model equations involving these outputs, it is demonstrated that the six-chamber cardiovascular system model is structurally globally identifiable. A further simplification is made, assuming known cardiac valve resistances. Because of the poor practical identifiability of these four parameters, this assumption is usual. Under this hypothesis, the six-chamber cardiovascular system model is structurally identifiable from an even smaller dataset. As a consequence, parameter values computed from limited but well-chosen datasets are theoretically unique. This means that the parameter identification procedure can safely be performed on the model from such a well-chosen dataset. Thus, the model may be considered suitable for use in diagnosis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  9. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
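
    The vortex-shedding concern described above follows from the Strouhal relation f = St U / D; the sketch below checks, for assumed numbers, at which wind speeds the shedding frequency approaches a structural mode ("lock-in").

```python
# Where does vortex shedding approach a lowly-damped structural mode?
strouhal = 0.2       # typical St for a circular cylinder (assumed)
diameter_m = 5.0     # vehicle diameter (assumed)
mode_hz = 0.9        # first bending-mode frequency (assumed)

for wind_mps in (10.0, 20.0, 22.5, 30.0):
    shed_hz = strouhal * wind_mps / diameter_m    # f = St * U / D
    ratio = shed_hz / mode_hz
    note = "  <- near lock-in" if 0.8 < ratio < 1.2 else ""
    print(f"U = {wind_mps:4.1f} m/s -> f_shed = {shed_hz:.2f} Hz "
          f"(f/fn = {ratio:.2f}){note}")
```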

  10. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    Full Text Available This paper is focused on spaceborne Differential Interferometric SAR (DInSAR) for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from the standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI) approaches. These different procedures guarantee sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a kind of 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and COSMO-SkyMed sensors.
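
    The "1D along time" half of the 2+1D unwrapping idea can be illustrated with numpy's unwrap applied pixel-wise; the toy phases and X-band wavelength below are assumptions, and the 2D per-interferogram step is not shown.

```python
# Pixel-wise temporal phase unwrapping and conversion to line-of-sight motion.
import numpy as np

rng = np.random.default_rng(3)
n_pixels, n_epochs = 4, 30
true_phase = np.linspace(0, 6 * np.pi, n_epochs) * rng.random((n_pixels, 1))
wrapped = np.angle(np.exp(1j * true_phase))        # wrapped into (-pi, pi]

unwrapped = np.unwrap(wrapped, axis=1)             # 1D unwrapping along time

wavelength_m = 0.031                               # X-band (~3.1 cm), e.g. TerraSAR-X
los_m = unwrapped * wavelength_m / (4 * np.pi)     # phase -> LOS displacement
print("final LOS displacement per pixel (m):", np.round(los_m[:, -1], 4))
```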

  11. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...
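
    In the spirit of CPA, the basic quantity is an exceedance-conditioned probability; the sketch below estimates P(impairment | stressor > x) from synthetic samples, with the dose-response curve invented for illustration.

```python
# Empirical conditional probability of impairment given stressor exceedance.
import numpy as np

rng = np.random.default_rng(4)
stressor = rng.lognormal(mean=0.0, sigma=0.8, size=2000)            # toy exposure
impaired = rng.random(2000) < 1 / (1 + np.exp(-(stressor - 2.0)))   # toy response

for x in (0.5, 1.0, 2.0, 4.0):
    exceed = stressor > x
    print(f"P(impaired | stressor > {x:3.1f}) = {impaired[exceed].mean():.2f} "
          f"(n = {exceed.sum()})")
```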

  12. Gene expression analysis identifies global gene dosage sensitivity in cancer

    DEFF Research Database (Denmark)

    Fehrmann, Rudolf S. N.; Karjalainen, Juha M.; Krajewska, Malgorzata

    2015-01-01

    Many cancer-associated somatic copy number alterations (SCNAs) are known. Currently, one of the challenges is to identify the molecular downstream effects of these variants. Although several SCNAs are known to change gene expression levels, it is not clear whether each individual SCNA affects gen...

  13. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  14. COSMID: A Web-based Tool for Identifying and Validating CRISPR/Cas Off-target Sites

    Directory of Open Access Journals (Sweden)

    Thomas J Cradick

    2014-01-01

    Full Text Available Precise genome editing using engineered nucleases can significantly facilitate biological studies and disease treatment. In particular, clustered regularly interspaced short palindromic repeats (CRISPR with CRISPR-associated (Cas proteins are a potentially powerful tool for modifying a genome by targeted cleavage of DNA sequences complementary to designed guide strand RNAs. Although CRISPR/Cas systems can have on-target cleavage rates close to the transfection rates, they may also have relatively high off-target cleavage at similar genomic sites that contain one or more base pair mismatches, and insertions or deletions relative to the guide strand. We have developed a bioinformatics-based tool, COSMID (CRISPR Off-target Sites with Mismatches, Insertions, and Deletions that searches genomes for potential off-target sites (http://crispr.bme.gatech.edu. Based on the user-supplied guide strand and input parameters, COSMID identifies potential off-target sites with the specified number of mismatched bases and insertions or deletions when compared with the guide strand. For each site, amplification primers optimal for the chosen application are also given as output. This ranked-list of potential off-target sites assists the choice and evaluation of intended target sites, thus helping the design of CRISPR/Cas systems with minimal off-target effects, as well as the identification and quantification of CRISPR/Cas induced off-target cleavage in cells.

  15. Use of the Operon Structure of the C. elegans Genome as a Tool to Identify Functionally Related Proteins

    Directory of Open Access Journals (Sweden)

    Silvia Dossena

    2013-12-01

    Full Text Available One of the most pressing challenges in the post-genomic era is the identification and characterization of protein-protein interactions (PPIs), as these are essential in understanding the cellular physiology of health and disease. Experimental techniques suitable for characterizing PPIs (X-ray crystallography or nuclear magnetic resonance spectroscopy, among others) are usually laborious, time-consuming and often difficult to apply to membrane proteins, and therefore require accurate prediction of the candidate interacting partners. High-throughput experimental methods (yeast two-hybrid and affinity purification) succumb to the same shortcomings, and can also lead to high rates of false positive and negative results. Therefore, reliable tools for predicting PPIs are needed. The use of the operon structure in the genome of the eukaryote Caenorhabditis elegans is a valuable, though underutilized, tool for identifying physically or functionally interacting proteins. Based on the concept that genes organized in the same operon may encode physically or functionally related proteins, this algorithm is easy to apply and, importantly, yields a limited number of candidate partners of a given protein, allowing for focused experimental verification. Moreover, this approach can be successfully used to predict PPIs in the human system, including those of membrane proteins.

  16. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    Science.gov (United States)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  17. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; Garcia-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; Da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document

  18. Identifying barriers to patient acceptance of active surveillance: content analysis of online patient communications.

    Science.gov (United States)

    Mishra, Mark V; Bennett, Michele; Vincent, Armon; Lee, Olivia T; Lallas, Costas D; Trabulsi, Edouard J; Gomella, Leonard G; Dicker, Adam P; Showalter, Timothy N

    2013-01-01

    Qualitative research aimed at identifying patient acceptance of active surveillance (AS) has been identified as a public health research priority. The primary objective of this study was to determine if analysis of a large sample of anonymous internet conversations (ICs) could be utilized to identify unmet public needs regarding AS. English-language ICs regarding prostate cancer (PC) treatment with AS from 2002-12 were identified using a novel internet search methodology. Web spiders were developed to mine, aggregate, and analyze content from the World Wide Web for ICs centered on AS. Collection of ICs was not restricted to any specific geographic region of origin. Natural language processing (NLP) was used to evaluate content and perform a sentiment analysis. Conversations were scored as positive, negative, or neutral. A sentiment index (SI) was subsequently calculated according to the following formula to compare temporal trends in public sentiment towards AS: [(# Positive IC/# Total IC) - (# Negative IC/# Total IC)] x 100. A total of 464 ICs were identified. Sentiment increased from -13 to +2 over the study period. The increase in sentiment has been driven by increased patient emphasis on quality-of-life factors and endorsement of AS by national medical organizations. Unmet needs identified in these ICs include: a gap between quantitative data regarding long-term outcomes with AS vs. conventional treatments, desire for treatment information from an unbiased specialist, and absence of public role models managed with AS. This study demonstrates the potential utility of online patient communications to provide insight into patient preferences and decision-making. Based on our findings, we recommend that multidisciplinary clinics consider including an unbiased specialist to present treatment options and that future decision tools for AS include quantitative data regarding outcomes after AS.
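
    The sentiment index defined above is a one-liner; the counts in the usage example are invented so that the outputs match the reported start (-13) and end (+2) values.

```python
# SI = (positive share - negative share) * 100
def sentiment_index(n_pos, n_neg, n_neutral):
    total = n_pos + n_neg + n_neutral
    return 100.0 * (n_pos - n_neg) / total

print(sentiment_index(20, 33, 47))   # -13.0 (illustrative start-of-period tallies)
print(sentiment_index(21, 19, 60))   # +2.0  (illustrative end-of-period tallies)
```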

  19. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    Science.gov (United States)

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions.Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  20. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  1. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of database-design CASE technologies into the educational process requires institutions to incur significant costs for the purchase of software. A possible solution could be the use of free software peers. At the same time, this kind of substitution should be based on an equivalent representation of the functional characteristics and operating features of these programs. The purpose of the article is a review of free and non-profit CASE tools for database design, as well as their classification based on an analysis of functionality. When writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE tools for database design was made exclusively empirically, through direct work with the software products. Analysis of the tools' functionality allows two categories of CASE tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of exporting/importing data. CASE systems in the first category can be used to design and develop simple databases and manage data, as well as to administer database servers. A distinctive feature of the second category of CASE tools for database design (full-featured systems) is the presence of a visual designer, allowing the construction of a database model and the automatic creation of the database on the server based on this model. CASE systems in this category can be used for the design and development of databases of any structural complexity, as well as for database server administration. The article concluded that the

  2. Genomic analysis identifies masqueraders of full-term cerebral palsy.

    Science.gov (United States)

    Takezawa, Yusuke; Kikuchi, Atsuo; Haginoya, Kazuhiro; Niihori, Tetsuya; Numata-Uematsu, Yurika; Inui, Takehiko; Yamamura-Suzuki, Saeko; Miyabayashi, Takuya; Anzai, Mai; Suzuki-Muromoto, Sato; Okubo, Yukimune; Endo, Wakaba; Togashi, Noriko; Kobayashi, Yasuko; Onuma, Akira; Funayama, Ryo; Shirota, Matsuyuki; Nakayama, Keiko; Aoki, Yoko; Kure, Shigeo

    2018-05-01

    Cerebral palsy is a common, heterogeneous neurodevelopmental disorder that causes movement and postural disabilities. Recent studies have suggested genetic diseases can be misdiagnosed as cerebral palsy. We hypothesized that two simple criteria, that is, full-term birth and nonspecific brain MRI findings, are keys to extracting masqueraders among cerebral palsy cases, for the following reasons: (1) preterm infants are susceptible to multiple environmental factors and therefore demonstrate an increased risk of cerebral palsy and (2) brain MRI assessment is essential for excluding environmental causes and other particular disorders. A total of 107 patients, all full-term births, without specific findings on brain MRI were identified among 897 patients diagnosed with cerebral palsy who were followed at our center. DNA samples were available for 17 of the 107 cases for trio whole-exome sequencing and array comparative genomic hybridization. We prioritized variants in genes known to be relevant in neurodevelopmental diseases and evaluated their pathogenicity according to the American College of Medical Genetics guidelines. Pathogenic/likely pathogenic candidate variants were identified in 9 of 17 cases (52.9%) within eight genes: CTNNB1, CYP2U1, SPAST, GNAO1, CACNA1A, AMPD2, STXBP1, and SCN2A. Five identified variants had previously been reported. No pathogenic copy number variations were identified. The AMPD2 missense variant and the splice-site variants in CTNNB1 and AMPD2 were validated by in vitro functional experiments. The high rate of detecting causative genetic variants (52.9%) suggests that patients diagnosed with cerebral palsy in full-term births without specific MRI findings may include genetic diseases masquerading as cerebral palsy.

  3. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    Science.gov (United States)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  4. Association analysis identifies ZNF750 regulatory variants in psoriasis

    Directory of Open Access Journals (Sweden)

    Birnbaum Ramon Y

    2011-12-01

    Full Text Available Abstract Background Mutations in the ZNF750 promoter and coding regions have been previously associated with Mendelian forms of psoriasis and psoriasiform dermatitis. ZNF750 encodes a putative zinc finger transcription factor that is highly expressed in keratinocytes and represents a candidate psoriasis gene. Methods We examined whether ZNF750 variants were associated with psoriasis in a large case-control population. We sequenced the promoter and exon regions of ZNF750 in 716 Caucasian psoriasis cases and 397 Caucasian controls. Results We identified a total of 47 variants, including 38 rare variants, of which 35 were novel. Association testing identified two ZNF750 haplotypes associated with psoriasis. ZNF750 promoter and 5' UTR variants displayed a 35-55% reduction of ZNF750 promoter activity, consistent with the promoter activity reduction seen in a Mendelian psoriasis family with a ZNF750 promoter variant. However, the rare promoter and 5' UTR variants identified in this study did not strictly segregate with the psoriasis phenotype within families. Conclusions Two haplotypes of ZNF750 and rare 5' regulatory variants of ZNF750 were found to be associated with psoriasis. These rare 5' regulatory variants, though not causal, might serve as a genetic modifier of psoriasis.

  5. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using a micro end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating tool life, and optimizing the process. A numerical analysis and experimental method are presented to investigate the chatter stability of the micro end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation for micro end milling with straight-tooth and helical-tooth end mills is conducted, based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted, and the simulation results for the different cutter geometries are compared in depth. The simulation results have important significance for the actual milling process.

  6. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
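
    The "reduced form" idea, fitting one regression to many synthetic model runs so users get instant estimates, can be sketched as follows; the input variables, the stand-in for the CGE model, and all numbers are invented.

```python
# Fit a fast regression surrogate to synthetic "simulation" outputs.
import numpy as np

rng = np.random.default_rng(5)
n_runs = 500
duration = rng.uniform(1, 30, n_runs)       # threat duration, days (assumed)
severity = rng.uniform(0, 1, n_runs)        # normalized severity (assumed)
resilience = rng.uniform(0.3, 0.9, n_runs)  # resilience/behavioral factor (assumed)

# Stand-in for expensive CGE runs: a made-up loss function plus noise
loss = 2.0 * duration * severity * (1 - resilience) + rng.normal(0, 0.5, n_runs)

X = np.column_stack([np.ones(n_runs), duration, severity, resilience,
                     duration * severity])  # key explanatory variables
coef, *_ = np.linalg.lstsq(X, loss, rcond=None)

new_event = np.array([1.0, 10.0, 0.7, 0.5, 7.0])   # hypothetical threat profile
print("rapid loss estimate:", round(float(new_event @ coef), 2))
```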

  7. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    Science.gov (United States)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better target educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of a Marie Curie research project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community of Valtellina di Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  8. Using Factor Analysis to Identify Topic Preferences Within MBA Courses

    Directory of Open Access Journals (Sweden)

    Earl Chrysler

    2003-02-01

    Full Text Available This study demonstrates the role of a principal components factor analysis in conducting a gap analysis of the desired characteristics of business alumni. Typically, gap analyses merely compare the emphases that should be given to areas of inquiry with perceptions of actual emphases. As a result, the focus is upon depth of coverage. A neglected area in need of investigation is the breadth of topic dimensions and their differences between the normative (should offer) and the descriptive (actually offer). The implications of factor structures, as well as traditional gap analyses, are developed and discussed in the context of outcomes assessment.
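
    As a toy illustration of extracting topic dimensions from survey items, the sketch below runs a two-factor analysis on synthetic responses; the latent dimensions, item names and loadings are all invented, and sklearn's FactorAnalysis stands in for the paper's principal components approach.

```python
# Recover two latent "topic preference" dimensions from four survey items.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(6)
n = 300
quant = rng.normal(size=n)     # latent quantitative-emphasis dimension
soft = rng.normal(size=n)      # latent interpersonal-emphasis dimension

items = np.column_stack([
    quant + 0.3 * rng.normal(size=n),   # "statistics emphasis" item
    quant + 0.3 * rng.normal(size=n),   # "finance emphasis" item
    soft + 0.3 * rng.normal(size=n),    # "leadership emphasis" item
    soft + 0.3 * rng.normal(size=n),    # "communication emphasis" item
])

fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
print(np.round(fa.components_, 2))      # loadings split items into two factors
```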

  9. Immunoproteomic tools are used to identify masked allergens: Ole e 12, an allergenic isoflavone reductase from olive (Olea europaea) pollen.

    Science.gov (United States)

    Castro, Lourdes; Crespo, Jesús F; Rodríguez, Julia; Rodríguez, Rosalía; Villalba, Mayte

    2015-12-01

    Proteins performing important biochemical activities in olive tree (Olea europaea) pollen have been identified as allergens. One novel 37-kDa protein seems to be associated with the IgE-binding profile of a group of patients suffering from allergy to peach and olive pollen. Three previously described olive pollen allergens exhibit very similar molecular masses. Our objective was to identify this allergen using immunoproteomic approaches. After 2D electrophoresis and mass spectrometry, peptide sequences from several IgE-binding spots allowed this new allergen to be identified, and the corresponding gene to be cloned and sequenced. The allergen, named Ole e 12, is a polymorphic isoflavone reductase-like protein of 308 amino acids showing 80% and 74% identity with the birch and pear allergens, Bet v 6 and Pyr c 5, respectively. A prevalence of 33% in the selected population is in contrast to 4%-10% in groups of subjects suffering from pollinosis. The recombinant allergen was produced in Escherichia coli and characterised in depth. Immunoblotting and ELISA detection, as well as inhibition experiments, were performed with polyclonal antisera and allergic patients' sera. The recombinant allergen retains the IgE reactivity of its natural counterpart. Close structural and immunological relationships between members of this protein family were supported by their IgG recognition across plant species. In summary, Ole e 12 is a minor olive pollen allergen which gains relevance in patients allergic to peach with olive pollinosis. The proteomic approaches used to analyse this allergen provide useful tools to identify hidden allergens, relevant for several allergic populations, and thus complete allergenic panels. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Development and validation of a tool for identifying women with low bone mineral density and low-impact fractures: the São Paulo Osteoporosis Risk Index (SAPORI).

    Science.gov (United States)

    Pinheiro, M M; Reis Neto, E T; Machado, F S; Omura, F; Szejnfeld, J; Szejnfeld, V L

    2012-04-01

    The performance of the São Paulo Osteoporosis Risk Index (SAPORI) was tested in 1,915 women from the original cohort, the São Paulo Osteoporosis Study (SAPOS) (N = 4,332). This new tool was able to identify women with low bone density (spine and hip) and low-impact fracture, with areas under the receiver operating characteristic (ROC) curve of 0.831, 0.724, and 0.689, respectively. A number of studies have demonstrated the clinical relevance of risk factors for identifying individuals at risk of fracture (Fx) and osteoporosis (OP). SAPOS is an epidemiological study for the assessment of risk factors for Fx and low bone density in women from the community of the metropolitan area of São Paulo, Brazil. The aim of the present study was to develop and validate a tool for identifying women at higher risk for OP and low-impact Fx. A total of 4,332 pre-, peri-, and postmenopausal women were analyzed through a questionnaire addressing risk factors for OP and Fx. All of them underwent bone densitometry at the lumbar spine and proximal femur (DPX NT, GE-Lunar). Following the identification of the main risk factors for OP and Fx through multivariate and logistic regression, respectively, the SAPORI was designed and subsequently validated on a second cohort of 1,915 women from the metropolitan community of São Paulo. The performance of this tool was assessed through ROC analysis. The main significant risk factors associated with low bone density and low-impact Fx were low body weight, advanced age, Caucasian ethnicity, family history of hip Fx, current smoking, and chronic use of glucocorticosteroids. Hormonal replacement therapy and regular physical activity in the previous year played a protective role (p < 0.05). After the statistical adjustments, the SAPORI was able to identify women with low bone density (T-score ≤ -2 standard deviations) in the femur, with 91.4% sensitivity, 52% specificity, and an area under the ROC curve of 0.831 (p < 0.001). At the lumbar spine
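
    For orientation, a minimal sketch of the kind of ROC summary reported above, on simulated data; the score, outcome, and Youden-index cutoff below are illustrative assumptions, not SAPORI itself.

        # Hedged sketch: ROC analysis of a risk index against a binary outcome.
        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(1)
        low_bmd = rng.integers(0, 2, size=500)         # 1 = low bone density
        score = 1.2 * low_bmd + rng.normal(size=500)   # hypothetical risk index

        auc = roc_auc_score(low_bmd, score)
        fpr, tpr, thr = roc_curve(low_bmd, score)
        best = np.argmax(tpr - fpr)                    # Youden's J operating point
        print(f"AUC={auc:.3f}  sensitivity={tpr[best]:.2f}  "
              f"specificity={1 - fpr[best]:.2f}  cutoff={thr[best]:.2f}")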

  11. Using multidimensional topological data analysis to identify traits of hip osteoarthritis.

    Science.gov (United States)

    Rossi-deVries, Jasmine; Pedoia, Valentina; Samaan, Michael A; Ferguson, Adam R; Souza, Richard B; Majumdar, Sharmila

    2018-05-07

    Osteoarthritis (OA) is a multifaceted disease with many variables affecting diagnosis and progression. Topological data analysis (TDA) is a state-of-the-art big data analytics tool that can combine all variables into a multidimensional space, and it was used here to analyze imaging and gait data simultaneously. The purpose was to identify biochemical and biomechanical biomarkers able to classify different disease progression phenotypes in subjects with and without radiographic signs of hip OA. Longitudinal study for comparison of progressive and nonprogressive subjects. In all, 102 subjects with and without radiographic signs of hip osteoarthritis. 3T MRI, SPGR 3D MAPSS T1ρ/T2, intermediate-weighted fat-suppressed fast spin-echo (FSE). Multidimensional data analysis included cartilage composition, bone shape, Kellgren-Lawrence (KL) classification of osteoarthritis, scoring hip osteoarthritis with MRI (SHOMRI), and the hip disability and osteoarthritis outcome score (HOOS). Analysis was done using TDA, Kolmogorov-Smirnov (KS) testing, and Benjamini-Hochberg ranking of P-value results to correct for multiple comparisons. Subjects in the later stages of the disease had an increased SHOMRI score; analysis of this subgroup identified knee biomechanics as a significant discriminating variable, and analysis of an OA subgroup with femoroacetabular impingement (FAI) showed anterior labral tears to be the most significant marker (P = 0.0017) between those FAI subjects with and without OA symptoms. The data-driven analysis obtained with TDA proposes new phenotypes of these subjects that partially overlap with the radiographic-based classical disease status classification and also shows the potential for further examination of an early onset biomechanical intervention. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
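
    The statistical back-end named above (KS tests with Benjamini-Hochberg correction) can be sketched as follows; the group sizes, variable count, and effect size are simulated stand-ins for the TDA subgroups, not the study's data.

        # Hedged sketch: two-sample KS tests over many variables, FDR-corrected.
        import numpy as np
        from scipy.stats import ks_2samp
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(2)
        progressors = rng.normal(0.4, 1.0, size=(40, 30))   # 40 subjects x 30 variables
        stable = rng.normal(0.0, 1.0, size=(60, 30))

        pvals = [ks_2samp(progressors[:, j], stable[:, j]).pvalue for j in range(30)]
        reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print("variables passing FDR:", np.flatnonzero(reject))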

  12. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs arise. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  13. Proteomic analysis of cell lines to identify the irinotecan resistance ...

    Indian Academy of Sciences (India)

    MADHU

    was selected from the wild-type LoVo cell line by chronic exposure to irinotecan ... dose–effect curves of anticancer drugs were drawn on semilogarithm .... alcohol metabolites daunorubicinol (Forrest and Gonzalez. 2000; Mordente et al. ..... Chen L, Huang C and Wei Y 2007 Proteomic analysis of liver cancer cells treated ...

  14. Association analysis identifies 65 new breast cancer risk loci

    OpenAIRE

    Michailidou, Kyriaki; Lindström, Sara; Dennis, Joe; Beesley, Jonathan; Hui, Shirley; Kar, Siddhartha; Lemaçon, Audrey; Soucy, Penny; Glubb, Dylan; Rostamianfar, Asha; Bolla, Manjeet K; Wang, Qin; Tyrer, Jonathan; Dicks, Ed; Lee, Andrew

    2017-01-01

    Breast cancer risk is influenced by rare coding variants in susceptibility genes, such as BRCA1, and many common, mostly non-coding variants. However, much of the genetic contribution to breast cancer risk remains unknown. Here we report the results of a genome-wide association study of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry. We identified 65 new loci that are associated with overall breast cancer ri...

  15. ADVANCED AND RAPID DEVELOPMENT OF DYNAMIC ANALYSIS TOOLS FOR JAVA

    Directory of Open Access Journals (Sweden)

    Alex Villazón

    2012-01-01

    Full Text Available Low-level bytecode instrumentation techniques are widely used in many software-engineering tools for the Java Virtual Machine (JVM) that perform some form of dynamic program analysis, such as profilers or debuggers. While program manipulation at the bytecode level is very flexible, because the possible bytecode transformations are not restricted, tool development based on this technique is tedious and error-prone. As a promising alternative, the specification of bytecode instrumentation at a higher level using aspect-oriented programming (AOP) can reduce tool development time and cost. Unfortunately, prevailing AOP frameworks lack some features that are essential for certain dynamic analyses. In this article, we focus on three common shortcomings in AOP frameworks with respect to the development of aspect-based tools: (1) the lack of mechanisms for passing data between woven advices in local variables, (2) the support for user-defined static analyses at weaving time, and (3) the absence of pointcuts at the level of individual basic blocks of code. We propose @J, an annotation-based AOP language and weaver that integrates support for these three features. The benefits of the proposed features are illustrated with concrete examples.
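
    The first missing feature (passing data between woven entry and exit advices through a local variable) has a simple conceptual analogue in Python, sketched below with a decorator standing in for the weaver; this illustrates the idea only and is not the @J language or API.

        # Conceptual analogue: the "entry advice" stores a value in a local
        # variable that the "exit advice" later reads back.
        import functools
        import time

        def timing_aspect(func):
            @functools.wraps(func)
            def woven(*args, **kwargs):
                start = time.perf_counter()              # entry advice
                result = func(*args, **kwargs)
                elapsed = time.perf_counter() - start    # exit advice reuses `start`
                print(f"{func.__name__} took {elapsed * 1e3:.2f} ms")
                return result
            return woven

        @timing_aspect
        def busy(n):
            return sum(i * i for i in range(n))

        busy(100_000)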

  16. Integrating Stakeholder Preferences and GIS-Based Multicriteria Analysis to Identify Forest Landscape Restoration Priorities

    Directory of Open Access Journals (Sweden)

    David Uribe

    2014-02-01

    Full Text Available A pressing question that arises during the planning of an ecological restoration process is: where to restore first? Answering this question is a complex task; it requires a multidimensional approach to consider economic constrains and the preferences of stakeholders. Being the problem of spatial nature, it may be explored effectively through Multicriteria Decision Analysis (MCDA performed in a Geographical Information System (GIS environment. The proposed approach is based on the definition and weighting of multiple criteria for evaluating land suitability. An MCDA-based methodology was used to identify priority areas for Forest Landscape Restoration in the Upper Mixtec region, Oaxaca (Mexico, one of the most degraded areas of Latin America. Socioeconomic and environmental criteria were selected and evaluated. The opinions of four different stakeholder groups were considered: general public, academic, Non-governmental organizations (NGOs and governmental officers. The preferences of these groups were spatially modeled to identify their priorities. The final result was a map that identifies the most preferable sites for restoration, where resources and efforts should be concentrated. MCDA proved to be a very useful tool in collective planning, when alternative sites have to be identified and prioritized to guide the restoration work.
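
    The core of such a GIS-based MCDA is a weighted overlay of normalised criterion maps; a minimal sketch follows, in which the criteria, weights, and grids are invented for illustration (a real application would use stakeholder-elicited weights and actual rasters).

        # Hedged sketch: weighted-sum suitability map from criterion rasters.
        import numpy as np

        rng = np.random.default_rng(3)
        criteria = {                                  # normalised to [0, 1]
            "erosion_risk": rng.random((50, 50)),
            "proximity_to_roads": rng.random((50, 50)),
            "land_cover_suitability": rng.random((50, 50)),
        }
        weights = {"erosion_risk": 0.5, "proximity_to_roads": 0.2,
                   "land_cover_suitability": 0.3}     # sum to 1

        suitability = sum(w * criteria[k] for k, w in weights.items())
        priority = suitability >= np.quantile(suitability, 0.9)  # top 10% of cells
        print("priority cells:", int(priority.sum()))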

  17. Structural and practical identifiability analysis of S-system.

    Science.gov (United States)

    Zhan, Choujun; Li, Benjamin Yee Shing; Yeung, Lam Fat

    2015-12-01

    In the field of systems biology, biological reaction networks are usually modelled by ordinary differential equations. A sub-class, the S-system representation, is a widely used form of modelling. Existing S-system identification techniques assume that the system itself is always structurally identifiable. However, due to practical limitations, biological reaction networks are often only partially measured. In addition, the captured data only cover a limited trajectory, and can therefore only be considered a local snapshot of the system responses with respect to the complete set of state trajectories over the entire state space. Hence the estimated model can only reflect partial system dynamics and may not be unique. To improve the identification quality, the structural and practical identifiability of the S-system are studied. The S-system is shown to be identifiable under a set of assumptions. An application to the yeast fermentation pathway was then conducted. Two case studies were chosen: the first is based on a larger set of state trajectories and the second on a smaller one. By expanding the dataset to span a relatively larger state space, the uncertainty of the estimated system can be reduced. The results indicated that the initial concentration is related to practical identifiability.
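
    For readers unfamiliar with the representation, an S-system models each state as a difference of two power-law terms, dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij. The two-variable simulation below is a hedged sketch with invented parameters, not the yeast pathway model.

        # Minimal S-system simulation (illustrative parameters).
        import numpy as np
        from scipy.integrate import solve_ivp

        alpha = np.array([2.0, 1.5]); g = np.array([[0.0, -0.5], [0.6, 0.0]])
        beta  = np.array([1.0, 1.2]); h = np.array([[0.7, 0.0], [0.0, 0.4]])

        def s_system(t, x):
            power_terms = lambda e: np.prod(x ** e, axis=1)
            return alpha * power_terms(g) - beta * power_terms(h)

        sol = solve_ivp(s_system, (0.0, 20.0), [0.5, 0.5])
        print("state at t=20:", np.round(sol.y[:, -1], 3))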

  18. Co-expression analysis identifies CRC and AP1 the regulator of Arabidopsis fatty acid biosynthesis.

    Science.gov (United States)

    Han, Xinxin; Yin, Linlin; Xue, Hongwei

    2012-07-01

    Fatty acids (FAs) play crucial roles in signal transduction and plant development; however, the regulation of FA metabolism is still poorly understood. To study the relevant regulatory network, fifty-eight FA biosynthesis genes, including de novo synthases, desaturases and elongases, were selected as "guide genes" to construct the co-expression network. Calculating the correlation of all Arabidopsis thaliana (L.) genes with each guide gene using the Arabidopsis co-expression data mining tool (ACT) identified 797 candidate FA-correlated genes. Gene ontology (GO) analysis of these co-expressed genes showed they are tightly correlated to photosynthesis and carbohydrate metabolism, and function in many processes. Interestingly, 63 transcription factors (TFs) were identified as candidate FA biosynthesis regulators, and 8 TF families are enriched. Two TF genes, CRC and AP1, both correlating with 8 FA guide genes, were further characterized. Analyses of the ap1 and crc mutants showed an altered total FA composition of mature seeds. The contents of palmitoleic acid, stearic acid, arachidic acid and eicosadienoic acid are decreased, whereas that of oleic acid is increased, in ap1 and crc seeds, which is consistent with the qRT-PCR analysis revealing suppressed expression of the corresponding guide genes. In addition, yeast one-hybrid analysis and electrophoretic mobility shift assay (EMSA) revealed that CRC can bind to the promoter regions of KCS7 and KCS15, indicating that CRC may directly regulate FA biosynthesis. © 2012 Institute of Botany, Chinese Academy of Sciences.
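
    The guide-gene step reduces to correlating every gene's expression profile with each guide gene and keeping strong hits; a hedged sketch on simulated data follows (matrix shapes, guide rows, and the 0.8 cutoff are assumptions, and ACT itself applies further statistics).

        # Minimal co-expression screen around guide genes (illustrative data).
        import numpy as np

        rng = np.random.default_rng(4)
        expr = rng.normal(size=(5000, 60))     # genes x conditions
        guides = [10, 42, 99]                  # rows standing in for FA guide genes

        z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)
        for gi in guides:
            r = z @ z[gi] / expr.shape[1]      # Pearson r against the guide gene
            hits = np.flatnonzero((np.abs(r) >= 0.8) & (np.arange(r.size) != gi))
            print(f"guide {gi}: {hits.size} co-expressed candidates")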

  19. The role of records management as a tool to identify risks in the public sector in South Africa

    Directory of Open Access Journals (Sweden)

    Mpho Ngoepe

    2014-06-01

    Objectives: The study utilised the King III report on corporate governance in South Africa as a framework to investigate the role of records management in identifying risks in the public sector, with a view to entrenching the synergy between records management and risk management. Method: Quantitative data were collected through questionnaires distributed to records managers, risk managers and auditors in governmental bodies in South Africa. Provisions of the King III report guided the research objectives. Results: Even though the study established that there is a reciprocal relationship between risk identification and records management, most governmental bodies in South Africa lack a records management and risk-mitigating framework or strategy. Furthermore, records management did not feature in most governmental bodies’ risk registers. Most governmental bodies have established risk committees that do not include records management practitioners, and in most of them risk management resides within the internal audit function. Conclusion: The study concludes by arguing that a strong records management regime can be one of an organisation’s primary tools for identifying risks and implementing proper risk management. Therefore, records management should be integrated with risk management processes for organisations to benefit from the synergy.

  20. Development of a screening tool for detecting undernutrition and dietary inadequacy among rural elderly in Malaysia: simple indices to identify individuals at high risk.

    Science.gov (United States)

    Shahar, S; Dixon, R A; Earland, J

    1999-11-01

    Undernutrition and the consumption of poor diets are prevalent among elderly people in developing countries. Recognising the importance of early identification of individuals at high nutritional risk, this study aimed to develop a simple tool for screening. A cross-sectional study was conducted in 11 randomly selected villages among the 62 in Mersing District, Malaysia. Undernutrition was assessed using body mass index, plasma albumin and haemoglobin in 285 subjects. Dietary inadequacy (a count of nutrients falling below two-thirds of the Recommended Dietary Allowances) was examined for 337 subjects. Logistic regression analysis was performed to identify significant predictors of undernutrition and dietary inadequacy from social and health factors, and to derive appropriate indices based on these predictions. The multivariate predictors of undernutrition were 'no joint disease', 'smoker', 'no hypertension', 'depended on others for economic resource', 'respiratory disease', 'perceived weight loss' and 'chewing difficulty', with a joint sensitivity of 56% and specificity of 84%. The equivalent predictors of dietary inadequacy were 'unable to take public transport', 'loss of appetite', 'chewing difficulty', 'no regular fruit intake' and 'regularly taking less than three meals per day', with a joint sensitivity of 77% and specificity of 47%. These predictions, with minor modification to simplify operational use, led to the production of a simple screening tool. The tool can be used by public health professionals or community workers or leaders as a simple and rapid instrument to screen individuals at high risk of undernutrition and/or dietary inadequacy.
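
    Operationally, such a tool reduces to counting positive risk-factor flags and applying a cutoff whose sensitivity and specificity are then checked; the sketch below shows the arithmetic on simulated data (the seven flags, outcome, and >=3 cutoff are illustrative assumptions).

        # Hedged sketch: count-based screening index and its test properties.
        import numpy as np

        rng = np.random.default_rng(5)
        flags = rng.integers(0, 2, size=(300, 7))       # 7 binary risk factors
        outcome = rng.integers(0, 2, size=300)          # 1 = undernourished

        positive = flags.sum(axis=1) >= 3               # screening cutoff

        tp = np.sum(positive & (outcome == 1)); fn = np.sum(~positive & (outcome == 1))
        tn = np.sum(~positive & (outcome == 0)); fp = np.sum(positive & (outcome == 0))
        print(f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")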

  1. Association analysis identifies 65 new breast cancer risk loci

    Science.gov (United States)

    Lemaçon, Audrey; Soucy, Penny; Glubb, Dylan; Rostamianfar, Asha; Bolla, Manjeet K.; Wang, Qin; Tyrer, Jonathan; Dicks, Ed; Lee, Andrew; Wang, Zhaoming; Allen, Jamie; Keeman, Renske; Eilber, Ursula; French, Juliet D.; Chen, Xiao Qing; Fachal, Laura; McCue, Karen; McCart Reed, Amy E.; Ghoussaini, Maya; Carroll, Jason; Jiang, Xia; Finucane, Hilary; Adams, Marcia; Adank, Muriel A.; Ahsan, Habibul; Aittomäki, Kristiina; Anton-Culver, Hoda; Antonenkova, Natalia N.; Arndt, Volker; Aronson, Kristan J.; Arun, Banu; Auer, Paul L.; Bacot, François; Barrdahl, Myrto; Baynes, Caroline; Beckmann, Matthias W.; Behrens, Sabine; Benitez, Javier; Bermisheva, Marina; Bernstein, Leslie; Blomqvist, Carl; Bogdanova, Natalia V.; Bojesen, Stig E.; Bonanni, Bernardo; Børresen-Dale, Anne-Lise; Brand, Judith S.; Brauch, Hiltrud; Brennan, Paul; Brenner, Hermann; Brinton, Louise; Broberg, Per; Brock, Ian W.; Broeks, Annegien; Brooks-Wilson, Angela; Brucker, Sara Y.; Brüning, Thomas; Burwinkel, Barbara; Butterbach, Katja; Cai, Qiuyin; Cai, Hui; Caldés, Trinidad; Canzian, Federico; Carracedo, Angel; Carter, Brian D.; Castelao, Jose E.; Chan, Tsun L.; Cheng, Ting-Yuan David; Chia, Kee Seng; Choi, Ji-Yeob; Christiansen, Hans; Clarke, Christine L.; Collée, Margriet; Conroy, Don M.; Cordina-Duverger, Emilie; Cornelissen, Sten; Cox, David G; Cox, Angela; Cross, Simon S.; Cunningham, Julie M.; Czene, Kamila; Daly, Mary B.; Devilee, Peter; Doheny, Kimberly F.; Dörk, Thilo; dos-Santos-Silva, Isabel; Dumont, Martine; Durcan, Lorraine; Dwek, Miriam; Eccles, Diana M.; Ekici, Arif B.; Eliassen, A. Heather; Ellberg, Carolina; Elvira, Mingajeva; Engel, Christoph; Eriksson, Mikael; Fasching, Peter A.; Figueroa, Jonine; Flesch-Janys, Dieter; Fletcher, Olivia; Flyger, Henrik; Fritschi, Lin; Gaborieau, Valerie; Gabrielson, Marike; Gago-Dominguez, Manuela; Gao, Yu-Tang; Gapstur, Susan M.; García-Sáenz, José A.; Gaudet, Mia M.; Georgoulias, Vassilios; Giles, Graham G.; Glendon, Gord; Goldberg, Mark S.; Goldgar, David E.; González-Neira, Anna; Grenaker Alnæs, Grethe I.; Grip, Mervi; Gronwald, Jacek; Grundy, Anne; Guénel, Pascal; Haeberle, Lothar; Hahnen, Eric; Haiman, Christopher A.; Håkansson, Niclas; Hamann, Ute; Hamel, Nathalie; Hankinson, Susan; Harrington, Patricia; Hart, Steven N.; Hartikainen, Jaana M.; Hartman, Mikael; Hein, Alexander; Heyworth, Jane; Hicks, Belynda; Hillemanns, Peter; Ho, Dona N.; Hollestelle, Antoinette; Hooning, Maartje J.; Hoover, Robert N.; Hopper, John L.; Hou, Ming-Feng; Hsiung, Chia-Ni; Huang, Guanmengqian; Humphreys, Keith; Ishiguro, Junko; Ito, Hidemi; Iwasaki, Motoki; Iwata, Hiroji; Jakubowska, Anna; Janni, Wolfgang; John, Esther M.; Johnson, Nichola; Jones, Kristine; Jones, Michael; Jukkola-Vuorinen, Arja; Kaaks, Rudolf; Kabisch, Maria; Kaczmarek, Katarzyna; Kang, Daehee; Kasuga, Yoshio; Kerin, Michael J.; Khan, Sofia; Khusnutdinova, Elza; Kiiski, Johanna I.; Kim, Sung-Won; Knight, Julia A.; Kosma, Veli-Matti; Kristensen, Vessela N.; Krüger, Ute; Kwong, Ava; Lambrechts, Diether; Marchand, Loic Le; Lee, Eunjung; Lee, Min Hyuk; Lee, Jong Won; Lee, Chuen Neng; Lejbkowicz, Flavio; Li, Jingmei; Lilyquist, Jenna; Lindblom, Annika; Lissowska, Jolanta; Lo, Wing-Yee; Loibl, Sibylle; Long, Jirong; Lophatananon, Artitaya; Lubinski, Jan; Luccarini, Craig; Lux, Michael P.; Ma, Edmond S.K.; MacInnis, Robert J.; Maishman, Tom; Makalic, Enes; Malone, Kathleen E; Kostovska, Ivana Maleva; Mannermaa, Arto; Manoukian, Siranoush; Manson, JoAnn E.; Margolin, Sara; Mariapun, Shivaani; Martinez, Maria Elena; Matsuo, Keitaro; 
Mavroudis, Dimitrios; McKay, James; McLean, Catriona; Meijers-Heijboer, Hanne; Meindl, Alfons; Menéndez, Primitiva; Menon, Usha; Meyer, Jeffery; Miao, Hui; Miller, Nicola; Mohd Taib, Nur Aishah; Muir, Kenneth; Mulligan, Anna Marie; Mulot, Claire; Neuhausen, Susan L.; Nevanlinna, Heli; Neven, Patrick; Nielsen, Sune F.; Noh, Dong-Young; Nordestgaard, Børge G.; Norman, Aaron; Olopade, Olufunmilayo I.; Olson, Janet E.; Olsson, Håkan; Olswold, Curtis; Orr, Nick; Pankratz, V. Shane; Park, Sue K.; Park-Simon, Tjoung-Won; Lloyd, Rachel; Perez, Jose I.A.; Peterlongo, Paolo; Peto, Julian; Phillips, Kelly-Anne; Pinchev, Mila; Plaseska-Karanfilska, Dijana; Prentice, Ross; Presneau, Nadege; Prokofieva, Darya; Pugh, Elizabeth; Pylkäs, Katri; Rack, Brigitte; Radice, Paolo; Rahman, Nazneen; Rennert, Gadi; Rennert, Hedy S.; Rhenius, Valerie; Romero, Atocha; Romm, Jane; Ruddy, Kathryn J; Rüdiger, Thomas; Rudolph, Anja; Ruebner, Matthias; Rutgers, Emiel J. Th.; Saloustros, Emmanouil; Sandler, Dale P.; Sangrajrang, Suleeporn; Sawyer, Elinor J.; Schmidt, Daniel F.; Schmutzler, Rita K.; Schneeweiss, Andreas; Schoemaker, Minouk J.; Schumacher, Fredrick; Schürmann, Peter; Scott, Rodney J.; Scott, Christopher; Seal, Sheila; Seynaeve, Caroline; Shah, Mitul; Sharma, Priyanka; Shen, Chen-Yang; Sheng, Grace; Sherman, Mark E.; Shrubsole, Martha J.; Shu, Xiao-Ou; Smeets, Ann; Sohn, Christof; Southey, Melissa C.; Spinelli, John J.; Stegmaier, Christa; Stewart-Brown, Sarah; Stone, Jennifer; Stram, Daniel O.; Surowy, Harald; Swerdlow, Anthony; Tamimi, Rulla; Taylor, Jack A.; Tengström, Maria; Teo, Soo H.; Terry, Mary Beth; Tessier, Daniel C.; Thanasitthichai, Somchai; Thöne, Kathrin; Tollenaar, Rob A.E.M.; Tomlinson, Ian; Tong, Ling; Torres, Diana; Truong, Thérèse; Tseng, Chiu-chen; Tsugane, Shoichiro; Ulmer, Hans-Ulrich; Ursin, Giske; Untch, Michael; Vachon, Celine; van Asperen, Christi J.; Van Den Berg, David; van den Ouweland, Ans M.W.; van der Kolk, Lizet; van der Luijt, Rob B.; Vincent, Daniel; Vollenweider, Jason; Waisfisz, Quinten; Wang-Gohrke, Shan; Weinberg, Clarice R.; Wendt, Camilla; Whittemore, Alice S.; Wildiers, Hans; Willett, Walter; Winqvist, Robert; Wolk, Alicja; Wu, Anna H.; Xia, Lucy; Yamaji, Taiki; Yang, Xiaohong R.; Yip, Cheng Har; Yoo, Keun-Young; Yu, Jyh-Cherng; Zheng, Wei; Zheng, Ying; Zhu, Bin; Ziogas, Argyrios; Ziv, Elad; Lakhani, Sunil R.; Antoniou, Antonis C.; Droit, Arnaud; Andrulis, Irene L.; Amos, Christopher I.; Couch, Fergus J.; Pharoah, Paul D.P.; Chang-Claude, Jenny; Hall, Per; Hunter, David J.; Milne, Roger L.; García-Closas, Montserrat; Schmidt, Marjanka K.; Chanock, Stephen J.; Dunning, Alison M.; Edwards, Stacey L.; Bader, Gary D.; Chenevix-Trench, Georgia; Simard, Jacques; Kraft, Peter; Easton, Douglas F.

    2017-01-01

    Breast cancer risk is influenced by rare coding variants in susceptibility genes such as BRCA1 and many common, mainly non-coding variants. However, much of the genetic contribution to breast cancer risk remains unknown. We report results from a genome-wide association study (GWAS) of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry. We identified 65 new loci associated with overall breast cancer risk at P < 5 × 10⁻⁸. The majority of credible risk single-nucleotide polymorphisms (SNPs) in these loci fall in distal regulatory elements, and by integrating in silico data to predict target genes in breast cells at each locus, we demonstrate a strong overlap between candidate target genes and somatic driver genes in breast tumours. We also find that the heritability of breast cancer due to all SNPs in regulatory features was 2-5-fold enriched relative to the genome-wide average, with strong enrichment for particular transcription factor binding sites. These results provide further insight into genetic susceptibility to breast cancer and will improve the utility of genetic risk scores for individualized screening and prevention. PMID:29059683

  2. Association analysis identifies 65 new breast cancer risk loci

    DEFF Research Database (Denmark)

    Michailidou, Kyriaki; Lindström, Sara; Dennis, Joe

    2017-01-01

    Breast cancer risk is influenced by rare coding variants in susceptibility genes, such as BRCA1, and many common, mostly non-coding variants. However, much of the genetic contribution to breast cancer risk remains unknown. Here we report the results of a genome-wide association study of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry. We identified 65 new loci that are associated with overall breast cancer risk at P < 5 × 10⁻⁸. The majority of credible risk single-nucleotide polymorphisms in these loci fall in distal regulatory elements, and we also find that the heritability of breast cancer due to all single-nucleotide polymorphisms in regulatory features was 2-5-fold enriched relative to the genome-wide average, with strong enrichment for particular transcription factor binding sites. These results provide further insight into genetic susceptibility to breast cancer and will improve the use of genetic risk scores for individualized screening and prevention.

  3. [Analysis of researchers' implication in a research-intervention in the Stork Network: a tool for institutional analysis].

    Science.gov (United States)

    Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles

    2016-09-19

    This qualitative study takes institutional analysis as its theoretical and methodological frame of reference, with the objective of analyzing the researchers' implication during a research-intervention and the interferences this analysis caused. The study involved researchers from courses in medicine, nursing, and dentistry at two universities, and workers from a Regional Health Department, in following up the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in distinguishing actions proper to network management from those proper to research were among the interferences identified. The study concludes that implication analysis is a powerful tool for such studies.

  4. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating first-order sensitivity coefficients for chemical kinetics using sparse matrix technology is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
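
    The direct method amounts to integrating the model equation together with its sensitivity equations; a minimal one-reaction sketch (not the package's FORTRAN) is shown below for dx/dt = -k*x, whose sensitivity s = dx/dk obeys ds/dt = -k*s - x.

        # Coupled model + first-order sensitivity integration (illustrative).
        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5

        def rhs(t, y):
            x, s = y
            return [-k * x, -k * s - x]   # model equation and its sensitivity

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], t_eval=np.linspace(0, 10, 6))
        for t, x, s in zip(sol.t, sol.y[0], sol.y[1]):
            print(f"t={t:4.1f}  x={x:.4f}  dx/dk={s:+.4f}")   # analytic: s = -t*x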

  5. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  6. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  7. Implementing a screening tool for identifying patients at risk for hereditary breast and ovarian cancer: a statewide initiative.

    Science.gov (United States)

    Brannon Traxler, L; Martin, Monique L; Kerber, Alice S; Bellcross, Cecelia A; Crane, Barbara E; Green, Victoria; Matthews, Roland; Paris, Nancy M; Gabram, Sheryl G A

    2014-10-01

    The Georgia Breast Cancer Genomic Health Consortium is a partnership created with funding from the Centers for Disease Control and Prevention (CDC) to the Georgia Department of Public Health to reduce cancer disparities among high-risk minority women. The project addresses young women at increased risk for hereditary breast and ovarian cancer (HBOC) syndrome through outreach efforts. The consortium provides education and collects surveillance data using the breast cancer genetics referral screening tool (B-RST), available at www.BreastCancerGeneScreen.org. The HBOC educational protocol was presented to 73 staff in 6 public health centers. Staff used the tool during the collection of medical history. Further family history assessments and testing for mutations in the BRCA1/2 genes were facilitated if appropriate. Data were collected from November 2012 through December 2013, including 2,159 screened women. The majority of patients identified as black/African American and were 18-49 years old. Also, 6.0% (n = 130) had positive screens, and 60.9% (n = 67) of the 110 patients who agreed to be contacted provided a detailed family history. A total of 47 patients (42.7%) met National Comprehensive Cancer Network guidelines when family history was clarified. Fourteen (12.7%) underwent genetic testing; 1 patient was positive for a BRCA2 mutation, and 1 patient was found to carry a variant of uncertain significance. The introduction of genomics practice within public health departments has provided access to comprehensive cancer care for uninsured individuals. The successful implementation of the B-RST into public health centers demonstrates the opportunity for integration of HBOC screening into primary care practices.

  8. Probabilistic analysis for identifying the driving force of protein folding

    Science.gov (United States)

    Tokunaga, Yoshihiko; Yamamori, Yu; Matubayasi, Nobuyuki

    2018-03-01

    Toward identifying the driving force of protein folding, energetics was analyzed in water for Trp-cage (20 residues), protein G (56 residues), and ubiquitin (76 residues) at their native (folded) and heat-denatured (unfolded) states. All-atom molecular dynamics simulation was conducted, and the hydration effect was quantified by the solvation free energy. The free-energy calculation was done by employing the solution theory in the energy representation, and it was seen that the sum of the protein intramolecular (structural) energy and the solvation free energy is more favorable for a folded structure than for an unfolded one generated by heat. Probabilistic arguments were then developed to determine which of the electrostatic, van der Waals, and excluded-volume components of the interactions in the protein-water system governs the relative stabilities between the folded and unfolded structures. It was found that the electrostatic interaction does not correspond to the preference order of the two structures. The van der Waals and excluded-volume components were shown, on the other hand, to provide the right order of preference at probabilities of almost unity, and it is argued that a useful modeling of protein folding is possible on the basis of the excluded-volume effect.
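
    Schematically, the stability comparison described above can be written as the following inequality (a restatement for orientation, not the paper's notation), where E_intra is the protein intramolecular energy and Δμ_solv the solvation free energy:

        \left(E_{\mathrm{intra}} + \Delta\mu_{\mathrm{solv}}\right)_{\mathrm{folded}}
          \;<\;
        \left(E_{\mathrm{intra}} + \Delta\mu_{\mathrm{solv}}\right)_{\mathrm{unfolded}}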

  9. Association analysis identifies 65 new breast cancer risk loci.

    Science.gov (United States)

    Michailidou, Kyriaki; Lindström, Sara; Dennis, Joe; Beesley, Jonathan; Hui, Shirley; Kar, Siddhartha; Lemaçon, Audrey; Soucy, Penny; Glubb, Dylan; Rostamianfar, Asha; Bolla, Manjeet K; Wang, Qin; Tyrer, Jonathan; Dicks, Ed; Lee, Andrew; Wang, Zhaoming; Allen, Jamie; Keeman, Renske; Eilber, Ursula; French, Juliet D; Qing Chen, Xiao; Fachal, Laura; McCue, Karen; McCart Reed, Amy E; Ghoussaini, Maya; Carroll, Jason S; Jiang, Xia; Finucane, Hilary; Adams, Marcia; Adank, Muriel A; Ahsan, Habibul; Aittomäki, Kristiina; Anton-Culver, Hoda; Antonenkova, Natalia N; Arndt, Volker; Aronson, Kristan J; Arun, Banu; Auer, Paul L; Bacot, François; Barrdahl, Myrto; Baynes, Caroline; Beckmann, Matthias W; Behrens, Sabine; Benitez, Javier; Bermisheva, Marina; Bernstein, Leslie; Blomqvist, Carl; Bogdanova, Natalia V; Bojesen, Stig E; Bonanni, Bernardo; Børresen-Dale, Anne-Lise; Brand, Judith S; Brauch, Hiltrud; Brennan, Paul; Brenner, Hermann; Brinton, Louise; Broberg, Per; Brock, Ian W; Broeks, Annegien; Brooks-Wilson, Angela; Brucker, Sara Y; Brüning, Thomas; Burwinkel, Barbara; Butterbach, Katja; Cai, Qiuyin; Cai, Hui; Caldés, Trinidad; Canzian, Federico; Carracedo, Angel; Carter, Brian D; Castelao, Jose E; Chan, Tsun L; David Cheng, Ting-Yuan; Seng Chia, Kee; Choi, Ji-Yeob; Christiansen, Hans; Clarke, Christine L; Collée, Margriet; Conroy, Don M; Cordina-Duverger, Emilie; Cornelissen, Sten; Cox, David G; Cox, Angela; Cross, Simon S; Cunningham, Julie M; Czene, Kamila; Daly, Mary B; Devilee, Peter; Doheny, Kimberly F; Dörk, Thilo; Dos-Santos-Silva, Isabel; Dumont, Martine; Durcan, Lorraine; Dwek, Miriam; Eccles, Diana M; Ekici, Arif B; Eliassen, A Heather; Ellberg, Carolina; Elvira, Mingajeva; Engel, Christoph; Eriksson, Mikael; Fasching, Peter A; Figueroa, Jonine; Flesch-Janys, Dieter; Fletcher, Olivia; Flyger, Henrik; Fritschi, Lin; Gaborieau, Valerie; Gabrielson, Marike; Gago-Dominguez, Manuela; Gao, Yu-Tang; Gapstur, Susan M; García-Sáenz, José A; Gaudet, Mia M; Georgoulias, Vassilios; Giles, Graham G; Glendon, Gord; Goldberg, Mark S; Goldgar, David E; González-Neira, Anna; Grenaker Alnæs, Grethe I; Grip, Mervi; Gronwald, Jacek; Grundy, Anne; Guénel, Pascal; Haeberle, Lothar; Hahnen, Eric; Haiman, Christopher A; Håkansson, Niclas; Hamann, Ute; Hamel, Nathalie; Hankinson, Susan; Harrington, Patricia; Hart, Steven N; Hartikainen, Jaana M; Hartman, Mikael; Hein, Alexander; Heyworth, Jane; Hicks, Belynda; Hillemanns, Peter; Ho, Dona N; Hollestelle, Antoinette; Hooning, Maartje J; Hoover, Robert N; Hopper, John L; Hou, Ming-Feng; Hsiung, Chia-Ni; Huang, Guanmengqian; Humphreys, Keith; Ishiguro, Junko; Ito, Hidemi; Iwasaki, Motoki; Iwata, Hiroji; Jakubowska, Anna; Janni, Wolfgang; John, Esther M; Johnson, Nichola; Jones, Kristine; Jones, Michael; Jukkola-Vuorinen, Arja; Kaaks, Rudolf; Kabisch, Maria; Kaczmarek, Katarzyna; Kang, Daehee; Kasuga, Yoshio; Kerin, Michael J; Khan, Sofia; Khusnutdinova, Elza; Kiiski, Johanna I; Kim, Sung-Won; Knight, Julia A; Kosma, Veli-Matti; Kristensen, Vessela N; Krüger, Ute; Kwong, Ava; Lambrechts, Diether; Le Marchand, Loic; Lee, Eunjung; Lee, Min Hyuk; Lee, Jong Won; Neng Lee, Chuen; Lejbkowicz, Flavio; Li, Jingmei; Lilyquist, Jenna; Lindblom, Annika; Lissowska, Jolanta; Lo, Wing-Yee; Loibl, Sibylle; Long, Jirong; Lophatananon, Artitaya; Lubinski, Jan; Luccarini, Craig; Lux, Michael P; Ma, Edmond S K; MacInnis, Robert J; Maishman, Tom; Makalic, Enes; Malone, Kathleen E; Kostovska, Ivana Maleva; Mannermaa, Arto; Manoukian, Siranoush; Manson, JoAnn E; Margolin, Sara; 
Mariapun, Shivaani; Martinez, Maria Elena; Matsuo, Keitaro; Mavroudis, Dimitrios; McKay, James; McLean, Catriona; Meijers-Heijboer, Hanne; Meindl, Alfons; Menéndez, Primitiva; Menon, Usha; Meyer, Jeffery; Miao, Hui; Miller, Nicola; Taib, Nur Aishah Mohd; Muir, Kenneth; Mulligan, Anna Marie; Mulot, Claire; Neuhausen, Susan L; Nevanlinna, Heli; Neven, Patrick; Nielsen, Sune F; Noh, Dong-Young; Nordestgaard, Børge G; Norman, Aaron; Olopade, Olufunmilayo I; Olson, Janet E; Olsson, Håkan; Olswold, Curtis; Orr, Nick; Pankratz, V Shane; Park, Sue K; Park-Simon, Tjoung-Won; Lloyd, Rachel; Perez, Jose I A; Peterlongo, Paolo; Peto, Julian; Phillips, Kelly-Anne; Pinchev, Mila; Plaseska-Karanfilska, Dijana; Prentice, Ross; Presneau, Nadege; Prokofyeva, Darya; Pugh, Elizabeth; Pylkäs, Katri; Rack, Brigitte; Radice, Paolo; Rahman, Nazneen; Rennert, Gadi; Rennert, Hedy S; Rhenius, Valerie; Romero, Atocha; Romm, Jane; Ruddy, Kathryn J; Rüdiger, Thomas; Rudolph, Anja; Ruebner, Matthias; Rutgers, Emiel J T; Saloustros, Emmanouil; Sandler, Dale P; Sangrajrang, Suleeporn; Sawyer, Elinor J; Schmidt, Daniel F; Schmutzler, Rita K; Schneeweiss, Andreas; Schoemaker, Minouk J; Schumacher, Fredrick; Schürmann, Peter; Scott, Rodney J; Scott, Christopher; Seal, Sheila; Seynaeve, Caroline; Shah, Mitul; Sharma, Priyanka; Shen, Chen-Yang; Sheng, Grace; Sherman, Mark E; Shrubsole, Martha J; Shu, Xiao-Ou; Smeets, Ann; Sohn, Christof; Southey, Melissa C; Spinelli, John J; Stegmaier, Christa; Stewart-Brown, Sarah; Stone, Jennifer; Stram, Daniel O; Surowy, Harald; Swerdlow, Anthony; Tamimi, Rulla; Taylor, Jack A; Tengström, Maria; Teo, Soo H; Beth Terry, Mary; Tessier, Daniel C; Thanasitthichai, Somchai; Thöne, Kathrin; Tollenaar, Rob A E M; Tomlinson, Ian; Tong, Ling; Torres, Diana; Truong, Thérèse; Tseng, Chiu-Chen; Tsugane, Shoichiro; Ulmer, Hans-Ulrich; Ursin, Giske; Untch, Michael; Vachon, Celine; van Asperen, Christi J; Van Den Berg, David; van den Ouweland, Ans M W; van der Kolk, Lizet; van der Luijt, Rob B; Vincent, Daniel; Vollenweider, Jason; Waisfisz, Quinten; Wang-Gohrke, Shan; Weinberg, Clarice R; Wendt, Camilla; Whittemore, Alice S; Wildiers, Hans; Willett, Walter; Winqvist, Robert; Wolk, Alicja; Wu, Anna H; Xia, Lucy; Yamaji, Taiki; Yang, Xiaohong R; Har Yip, Cheng; Yoo, Keun-Young; Yu, Jyh-Cherng; Zheng, Wei; Zheng, Ying; Zhu, Bin; Ziogas, Argyrios; Ziv, Elad; Lakhani, Sunil R; Antoniou, Antonis C; Droit, Arnaud; Andrulis, Irene L; Amos, Christopher I; Couch, Fergus J; Pharoah, Paul D P; Chang-Claude, Jenny; Hall, Per; Hunter, David J; Milne, Roger L; García-Closas, Montserrat; Schmidt, Marjanka K; Chanock, Stephen J; Dunning, Alison M; Edwards, Stacey L; Bader, Gary D; Chenevix-Trench, Georgia; Simard, Jacques; Kraft, Peter; Easton, Douglas F

    2017-11-02

    Breast cancer risk is influenced by rare coding variants in susceptibility genes, such as BRCA1, and many common, mostly non-coding variants. However, much of the genetic contribution to breast cancer risk remains unknown. Here we report the results of a genome-wide association study of breast cancer in 122,977 cases and 105,974 controls of European ancestry and 14,068 cases and 13,104 controls of East Asian ancestry. We identified 65 new loci that are associated with overall breast cancer risk at P < 5 × 10⁻⁸. The majority of credible risk single-nucleotide polymorphisms in these loci fall in distal regulatory elements, and by integrating in silico data to predict target genes in breast cells at each locus, we demonstrate a strong overlap between candidate target genes and somatic driver genes in breast tumours. We also find that heritability of breast cancer due to all single-nucleotide polymorphisms in regulatory features was 2-5-fold enriched relative to the genome-wide average, with strong enrichment for particular transcription factor binding sites. These results provide further insight into genetic susceptibility to breast cancer and will improve the use of genetic risk scores for individualized screening and prevention.
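
    Each locus in such a study rests on a per-variant association test judged against the genome-wide threshold P < 5 × 10⁻⁸; the sketch below shows the shape of one such test on invented allele counts (the chi-square is a stand-in, and the paper's own testing pipeline is not reproduced here).

        # Hedged sketch: allele-count association test for a single SNP.
        import numpy as np
        from scipy.stats import chi2_contingency

        #                  ref alleles  alt alleles
        table = np.array([[120_000,     85_000],    # cases (invented counts)
                          [110_000,     68_000]])   # controls

        chi2, p, dof, _ = chi2_contingency(table)
        print(f"chi2={chi2:.1f}  P={p:.2e}  genome-wide significant: {p < 5e-8}")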

  10. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning using a normal distribution model. Design/methodology/approach: A series of experimental tests have been done to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at any time for h...
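
    Under the normal-distribution model named above, with tool life T ~ N(mu, sigma), the reliability at time t is the survival probability R(t) = P(T > t) = 1 - Phi((t - mu)/sigma); a tiny sketch with invented life parameters follows.

        # Hedged sketch: reliability of a cutting tool under a normal life model.
        from scipy.stats import norm

        mu, sigma = 42.0, 6.0                     # hypothetical tool life, minutes
        for t in (30, 40, 50):
            r = norm.sf(t, loc=mu, scale=sigma)   # survival function = reliability
            print(f"R({t} min) = {r:.3f}")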

  11. Validation of three tools for identifying painful new osteoporotic vertebral fractures in older Chinese men: bone mineral density, Osteoporosis Self-Assessment Tool for Asians, and fracture risk assessment tool.

    Science.gov (United States)

    Lin, JiSheng; Yang, Yong; Fei, Qi; Zhang, XiaoDong; Ma, Zhao; Wang, Qi; Li, JinJun; Li, Dong; Meng, Qian; Wang, BingQiang

    2016-01-01

    This cross-sectional study compared three tools for predicting painful new osteoporotic vertebral fractures (PNOVFs) in older Chinese men: bone mineral density (BMD), the Osteoporosis Self-Assessment Tool for Asians (OSTA), and the World Health Organization fracture risk assessment tool (FRAX) (without BMD). Men aged ≥50 years were assigned to a fracture group who had undergone percutaneous vertebroplasty (n=111) or a control group of healthy men (n=385). Fractures were verified on X-ray and magnetic resonance imaging. BMD T-scores were determined by dual energy X-ray absorptiometry. Diagnosis of osteoporosis was determined by a BMD T-score at least 2.5 standard deviations below the average for a young adult at peak bone density at the femoral neck, total hip, or L1-L4. Demographic and clinical risk factor data were self-reported through a questionnaire. BMD, OSTA, and FRAX scores were assessed for identifying PNOVFs via receiver-operating characteristic (ROC) curves. Optimal cutoff points, sensitivity, specificity, and areas under the ROC curves (AUCs) were determined. Between the men with fractures and the control group, there were significant differences in BMD T-scores (at femoral neck, total hip, and L1-L4), and OSTA and FRAX scores. In those with fractures, only 53.15% satisfied the criteria for osteoporosis. Compared to BMD or OSTA, the FRAX score had the best predictive value for PNOVFs: the AUC of the FRAX score (cutoff = 2.9%) was 0.738, and the sensitivity and specificity were 82% and 62%, respectively. FRAX may be a valuable tool for identifying PNOVFs in older Chinese men.

  12. Identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis by using the Delphi Technique

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.

    2018-02-01

    This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially part of a larger research exercise to identify the significance of NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is discussed in detail in this paper. A heterogeneous panel of 14 experts was created to determine the importance of NDCDB from the technical relevance standpoint. Three statements describing the relevant features of NDCDB for spatial analysis were established after three rounds of consensus building. They highlighted the NDCDB’s characteristics, such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of NDCDB for spatial analysis, practical applications of NDCDB for various analyses and purposes can be widely implemented.

  13. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    Science.gov (United States)

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  14. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.
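
    One standard quantitative descriptor used on such confocal height maps is the areal arithmetical mean height Sa, the mean absolute deviation of surface heights from the mean plane; the sketch below computes it on a simulated surface (a real analysis would load the instrument's height map and sample it as discussed above).

        # Hedged sketch: Sa texture parameter on a (simulated) height map.
        import numpy as np

        rng = np.random.default_rng(6)
        z = rng.normal(0.0, 0.35, size=(256, 256))   # heights in micrometres
        sa = np.mean(np.abs(z - z.mean()))           # Sa: mean absolute deviation
        print(f"Sa = {sa:.3f} um")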

  15. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
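
    As a tiny illustration of the parametric/non-parametric choice summarised above, the same two simulated samples can be compared with a t-test and with its rank-based counterpart (data invented).

        # t-test vs. Mann-Whitney U on the same samples.
        import numpy as np
        from scipy.stats import mannwhitneyu, ttest_ind

        rng = np.random.default_rng(7)
        a = rng.normal(5.0, 1.0, size=30)
        b = rng.normal(5.6, 1.0, size=30)

        print(f"t-test P       = {ttest_ind(a, b).pvalue:.4f}")
        print(f"Mann-Whitney P = {mannwhitneyu(a, b).pvalue:.4f}")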

  16. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
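
    The enrichment score underlying tools of this kind is typically a hypergeometric (one-sided Fisher) test of over-representation; a hedged sketch with invented set sizes follows (GOMA's module construction and ranking are not reproduced here).

        # Hypergeometric enrichment of a study set in one GO term's gene set.
        from scipy.stats import hypergeom

        N = 20_000   # genes in the population
        K = 400      # genes annotated to the GO term
        n = 150      # genes in the study set
        k = 12       # study genes annotated to the term

        p = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
        print(f"expected {n * K / N:.1f} by chance; enrichment P = {p:.3e}")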

  17. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: the process of extracting new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for analysis and exploitation of data stored in a multidimensional database.
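
    The OLAP-style multidimensional view described above can be mimicked with a pivot-table roll-up; the ledger rows below are invented for illustration.

        # Hedged sketch: OLAP-like cube over accounting records with pandas.
        import pandas as pd

        ledger = pd.DataFrame({
            "year":    [2010, 2010, 2010, 2011, 2011, 2011],
            "branch":  ["North", "South", "North", "North", "South", "South"],
            "account": ["sales", "sales", "costs", "sales", "costs", "sales"],
            "amount":  [120.0, 95.0, -40.0, 150.0, -55.0, 88.0],
        })

        cube = ledger.pivot_table(index="branch", columns=["year", "account"],
                                  values="amount", aggfunc="sum", fill_value=0.0)
        print(cube)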

  18. Identifying a preservation zone using multicriteria decision analysis

    Energy Technology Data Exchange (ETDEWEB)

    Farashi, A.; Naderi, M.; Parvian, N.

    2016-07-01

    Zoning of a protected area is an approach to partitioning a landscape into various land-use units. The management of these landscape units can reduce conflicts caused by human activities. Tandoreh National Park is one of the most biologically diverse protected areas in Iran. Although the area is generally designed to protect biodiversity, there are many conflicts between biodiversity conservation and human activities. For instance, the area is highly controversial and has been considered an impediment to local economic development, such as tourism, grazing, road construction, and cultivation. In order to reduce human conflicts with biodiversity conservation in Tandoreh National Park, safe zones need to be established and human activities need to be moved out of these zones. In this study we used a systematic methodology to integrate a participatory process with Geographic Information Systems (GIS) using a multi-criteria decision analysis (MCDA) technique to guide a zoning scheme for the Tandoreh National Park, Iran. Our results show that the northern and eastern parts of the Tandoreh National Park, which were close to rural areas and farmlands, were less desirable for selection as a preservation area. Rocky mountain areas were the most important and most degraded areas, and abandoned plains were the least important criterion for preservation in the area. Furthermore, the results reveal that the land properties were considered to be important for protection based on the obtained results. (Author)

  19. Analysis tools for discovering strong parity violation at hadron colliders

    Science.gov (United States)

    Backović, Mihailo; Ralston, John P.

    2011-07-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.
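
    For orientation, the azimuthal dependence at issue is conventionally expanded in a Fourier series (a schematic form; the paper's point is that the collective angles Ψ_n are inequivalent tensor observables, not a single reaction plane):

        \frac{dN}{d\phi} \;\propto\; 1 + 2\sum_{n\geq 1}\Big[v_n\cos n(\phi-\Psi_n) + a_n\sin n(\phi-\Psi_n)\Big]

    The cosine coefficients v_n are the familiar parity-even flow harmonics; sine terms change sign under the two-dimensional reflection and are of the parity-sensitive kind the classification above distinguishes.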

  20. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximum liquefaction in the plant under the constraints on the other parameters. The analysis results give a clear basis for deciding the various parameter values before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
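
    A back-of-envelope version of the optimisation target is the liquid yield of an ideal Linde-Hampson cycle, y = (h1 - h2)/(h1 - hf), computed from the enthalpy at the compressor inlet (h1), after isothermal compression (h2), and of the saturated liquid (hf); the kJ/kg values below are illustrative stand-ins, not HYSYS output.

        # Hedged sketch: Linde-Hampson liquid-yield energy balance.
        h1, h2, hf = 462.0, 430.0, 29.0   # hypothetical air enthalpies, kJ/kg

        y = (h1 - h2) / (h1 - hf)
        print(f"liquid yield y = {y:.3f}  ({y * 100:.1f}% of circulated gas)")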

  1. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximum liquefaction in the plant under the constraints on the other parameters. The analysis results give a clear basis for deciding the various parameter values before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  2. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an object-oriented software environment for data analysis in HENP experiments. A range of commercial and public-domain libraries is used to cover basic functionality; on top of these libraries, a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirement for a command-line-driven tool, the authors have chosen a scripting language (Python) as the front end for the data analysis tool. The loose coupling provided by the consistent use of (AIDA-compliant) abstract interfaces for each component, in combination with the use of shared libraries for their implementation, allows easy integration of existing libraries into modern scripting languages and thus rapid application development. This integration is simplified even further by using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the abstract interfaces almost one-to-one. The authors give an overview of the architecture and design choices and present the current status and future developments of the project.
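    The decoupling described here can be illustrated without any of the actual libraries: user scripts program against an abstract interface, and any concrete implementation (pure Python below; a SWIG-wrapped C++ shared library in Anaphe) can be substituted. All class and method names below are hypothetical, not the AIDA API:

    ```python
    # User code talks only to an abstract histogram interface, so the concrete
    # implementation can be swapped without touching the analysis script.
    from abc import ABC, abstractmethod

    class IHistogram1D(ABC):
        @abstractmethod
        def fill(self, x: float, weight: float = 1.0) -> None: ...
        @abstractmethod
        def entries(self) -> int: ...

    class PyHistogram1D(IHistogram1D):
        def __init__(self, nbins, lo, hi):
            self.nbins, self.lo, self.hi = nbins, lo, hi
            self.bins = [0.0] * nbins
            self._entries = 0
        def fill(self, x, weight=1.0):
            if self.lo <= x < self.hi:
                i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
                self.bins[i] += weight
            self._entries += 1
        def entries(self):
            return self._entries

    def analyse(hist: IHistogram1D, data):
        for x in data:          # the script never sees the concrete class
            hist.fill(x)
        return hist.entries()

    print(analyse(PyHistogram1D(10, 0.0, 1.0), [0.1, 0.5, 0.55, 0.9]))
    ```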

  3. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  4. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis informed by more perspectives. We are interested in whether software tools can be designed to support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design both in a laboratory study and in a study where it was situated with an analysis team. In both cases, the effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  5. Meconium microbiome analysis identifies bacteria correlated with premature birth.

    Directory of Open Access Journals (Sweden)

    Alexandria N Ardissone

    Preterm birth is the second leading cause of death in children under the age of five years worldwide, but the etiology of many cases remains enigmatic. The dogma that the fetus resides in a sterile environment is being challenged by recent findings, and the question has arisen whether microbes that colonize the fetus may be related to preterm birth. It has been posited that meconium reflects the in-utero microbial environment. In this study, correlations between fetal intestinal bacteria from meconium and gestational age were examined in order to suggest underlying mechanisms that may contribute to preterm birth. Meconium from 52 infants ranging in gestational age from 23 to 41 weeks was collected, the DNA extracted, and 16S rRNA analysis performed. The resulting microbial taxa were correlated with clinical variables and also compared to previous studies of amniotic fluid and other human microbiome niches. Increased detection of bacterial 16S rRNA was observed in the meconium of infants of <33 weeks gestational age. Approximately 61.1% of the reads sequenced were classified to genera that have been reported in amniotic fluid. Gestational age had the largest influence on microbial community structure (R = 0.161; p = 0.029), while mode of delivery (C-section versus vaginal delivery) had an effect as well (R = 0.100; p = 0.044). Enterobacter, Enterococcus, Lactobacillus, Photorhabdus, and Tannerella were negatively correlated with gestational age and have been reported to incite inflammatory responses, suggesting a causative role in premature birth. This provides the first evidence to support the hypothesis that the fetal intestinal microbiome derived from swallowed amniotic fluid may be involved in the inflammatory response that leads to premature birth.

  6. Single Molecule Cluster Analysis Identifies Signature Dynamic Conformations along the Splicing Pathway

    Science.gov (United States)

    Blanco, Mario R.; Martin, Joshua S.; Kahlscheuer, Matthew L.; Krishnan, Ramya; Abelson, John; Laederach, Alain; Walter, Nils G.

    2016-01-01

    The spliceosome is the dynamic RNA-protein machine responsible for faithfully splicing introns from precursor messenger RNAs (pre-mRNAs). Many of the dynamic processes required for the proper assembly, catalytic activation, and disassembly of the spliceosome as it acts on its pre-mRNA substrate remain poorly understood, a challenge that persists for many biomolecular machines. Here, we developed a fluorescence-based Single Molecule Cluster Analysis (SiMCAn) tool to dissect the manifold conformational dynamics of a pre-mRNA through the splicing cycle. By clustering common dynamic behaviors derived from selectively blocked splicing reactions, SiMCAn was able to identify signature conformations and dynamic behaviors of multiple ATP-dependent intermediates. In addition, it identified a conformation adopted late in splicing by a 3′ splice site mutant, invoking a mechanism for substrate proofreading. SiMCAn presents a novel framework for interpreting complex single molecule behaviors that should prove widely useful for the comprehensive analysis of a plethora of dynamic cellular machines. PMID:26414013

  7. Integrating text mining, data mining, and network analysis for identifying genetic breast cancer trends.

    Science.gov (United States)

    Jurca, Gabriela; Addam, Omar; Aksac, Alper; Gao, Shang; Özyer, Tansel; Demetrick, Douglas; Alhajj, Reda

    2016-04-26

    Breast cancer is a serious disease which affects many women and may lead to death. It has received considerable attention from the research community, and biomedical researchers aim to find genetic biomarkers indicative of the disease. Novel biomarkers can be elucidated from the existing literature; however, the vast number of scientific publications on breast cancer makes this a daunting task. This paper presents a framework which mines the existing literature for informative discoveries. It integrates text mining and social network analysis in order to identify new potential biomarkers for breast cancer. We used PubMed for testing. We investigated gene-gene interactions, as well as novel interactions such as gene-year, gene-country, and abstract-country, to find out how the discoveries varied over time and how overlapping or diverse the discoveries and the interests of research groups in different countries are. Interesting trends have been identified and discussed; for example, different genes are highlighted in relation to different countries, though the various genes were found to share functionality. Some text-analysis results have been validated against results from other tools that predict gene-gene relations and gene functions.
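    The novel interaction counts described (gene-year, gene-country) reduce to co-occurrence tallies over the retrieved records. A toy sketch with exact string matching standing in for a proper gene named-entity recogniser; the lexicon, records and field names are made up for illustration:

    ```python
    from collections import Counter

    GENES = {"BRCA1", "BRCA2", "TP53", "ERBB2"}   # toy lexicon (assumption)

    records = [  # stand-ins for PubMed records; field names are assumptions
        {"year": 2012, "country": "US", "abstract": "BRCA1 and TP53 were profiled ..."},
        {"year": 2014, "country": "UK", "abstract": "ERBB2 amplification cohort ..."},
        {"year": 2014, "country": "US", "abstract": "BRCA1 carriers showed ..."},
    ]

    gene_year = Counter()
    gene_country = Counter()
    for rec in records:
        found = {g for g in GENES if g in rec["abstract"]}
        for gene in found:
            gene_year[(gene, rec["year"])] += 1
            gene_country[(gene, rec["country"])] += 1

    for (gene, year), n in gene_year.most_common():
        print(f"{gene}\t{year}\t{n}")
    ```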

  8. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  9. Modeling energy technology choices. Which investment analysis tools are appropriate?

    International Nuclear Information System (INIS)

    Johnson, B.E.

    1994-01-01

    A variety of tools from modern investment theory appear to hold promise for unraveling observed energy technology investment behavior that often appears anomalous when analyzed using traditional investment analysis methods. This paper reviews the assumptions and important insights of the investment theories most commonly suggested as candidates for explaining the apparent "energy technology investment paradox". The applicability of each theory is considered in the light of important aspects of energy technology investment problems, such as sunk costs, uncertainty and imperfect information. The theories addressed include the capital asset pricing model, the arbitrage pricing theory, and the theory of irreversible investment. Enhanced net present value methods are also considered. (author)

  10. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.

  11. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set for each run: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  12. A design and performance analysis tool for superconducting RF systems

    International Nuclear Information System (INIS)

    Schilcher, T.; Simrock, S.N.; Merminga, L.; Wang, D.X.

    1997-01-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall-plug power efficiency. Typical examples are CEBAF at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper the authors describe a set of tools to be used with MATLAB and SIMULINK, which allow the user to analyze the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf-control-relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state-space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and TESLA linac rf systems using these tools are presented.
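    The cavity part of such a model can be sketched with a standard state-space toolbox. Below, a first-order envelope model of the cavity field with half-bandwidth and detuning as parameters, driven by an I/Q generator current; the numbers are illustrative, and the real toolset uses time-varying models in MATLAB/SIMULINK rather than this simplified scipy version:

    ```python
    import numpy as np
    from scipy.signal import StateSpace, lsim

    w12 = 2 * np.pi * 215.0   # rad/s, cavity half-bandwidth (illustrative)
    dw = 2 * np.pi * 50.0     # rad/s, static detuning (illustrative)

    # state x = [V_I, V_Q], the field envelope; dx/dt = A x + B u with u the
    # I/Q components of the generator drive
    A = np.array([[-w12, -dw], [dw, -w12]])
    B = np.array([[w12, 0.0], [0.0, w12]])
    C = np.eye(2)
    D = np.zeros((2, 2))
    cavity = StateSpace(A, B, C, D)

    t = np.linspace(0, 0.02, 2000)
    u = np.column_stack([np.ones_like(t), np.zeros_like(t)])  # step in I drive
    _, y, _ = lsim(cavity, u, t)
    print(y[-1])   # steady-state I/Q field, rotated and scaled by the detuning
    ```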

  13. Development of the Aboriginal Communication Assessment After Brain Injury (ACAABI): A screening tool for identifying acquired communication disorders in Aboriginal Australians.

    Science.gov (United States)

    Armstrong, Elizabeth M; Ciccone, Natalie; Hersh, Deborah; Katzenellebogen, Judith; Coffin, Juli; Thompson, Sandra; Flicker, Leon; Hayward, Colleen; Woods, Deborah; McAllister, Meaghan

    2017-06-01

    Acquired communication disorders (ACD), following stroke and traumatic brain injury, may not be correctly identified in Aboriginal Australians due to a lack of linguistically and culturally appropriate assessment tools. Within this paper we explore key issues that were considered in the development of the Aboriginal Communication Assessment After Brain Injury (ACAABI), a screening tool designed to assess the presence of ACD in Aboriginal populations. A literature review and consultation with key stakeholders were undertaken to explore the directions needed to develop a new tool, based on existing tools and recommendations for future developments. The literature searches revealed no existing screening tool for ACD in these populations, but identified tools in the areas of cognition and social-emotional wellbeing. The articles retrieved described details of the content and style of these tools, with recommendations for the development and administration of a new tool. The findings from the interviews and focus groups were consistent with the approach recommended in the literature. There is a need for a screening tool for ACD to be developed, but any tool must be informed by knowledge of Aboriginal language, culture and community input in order to be acceptable and valid.

  14. Lipid ratio as a suitable tool to identify individuals with MetS risk: A case- control study.

    Science.gov (United States)

    Abbasian, Maryam; Delvarianzadeh, Mehri; Ebrahimi, Hossein; Khosravi, Farideh

    2017-11-01

    This study aimed to compare serum lipid ratios in staff with and without metabolic syndrome (MetS) working at Shahroud University of Medical Sciences. This case-control study was conducted in 2015 on 499 personnel aged 30-60 years. ATP III criteria were used to diagnose patients with MetS. The data were analyzed using logistic regression and ROC curves. Mean lipid ratios were higher in individuals with MetS of both sexes than in those without. In addition, the mean levels of the lipid ratios increased significantly with increasing number of MetS components in both sexes. The TG/HDL-C ratio was the best marker for the diagnosis of MetS in both men and women; the cut-off point for TG/HDL-C was 2.86 in women and 4.03 in men. For each unit increase in TG/HDL-C, the risk of developing MetS increased 2.12 times. TG/HDL-C was found to be the best clinical marker for the diagnosis of MetS compared with the other lipid ratios, and is therefore recommended as a feasible tool to identify individuals at risk of MetS. Copyright © 2016 Diabetes India. Published by Elsevier Ltd. All rights reserved.
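    A cut-off of this kind is typically read off a ROC curve by maximising Youden's J. A sketch on synthetic data (the study's own data and exact procedure are not reproduced here):

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(0)
    n = 200
    mets = rng.integers(0, 2, n)                   # 1 = MetS by ATP III (toy)
    tg_hdl = rng.normal(2.0 + 1.5 * mets, 0.8, n)  # synthetic TG/HDL-C ratio

    fpr, tpr, thresholds = roc_curve(mets, tg_hdl)
    j = tpr - fpr                                  # Youden's J statistic
    best = thresholds[np.argmax(j)]
    print(f"optimal TG/HDL-C cut-off: {best:.2f}")
    ```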

  15. The nematode Caenorhabditis elegans as a tool to predict chemical activity on mammalian development and identify mechanisms influencing toxicological outcome.

    Science.gov (United States)

    Harlow, Philippa H; Perry, Simon J; Widdison, Stephanie; Daniels, Shannon; Bondo, Eddie; Lamberth, Clemens; Currie, Richard A; Flemming, Anthony J

    2016-03-18

    To determine whether a C. elegans bioassay could predict mammalian developmental activity, we selected diverse compounds known and known not to elicit such activity and measured their effect on C. elegans egg viability. 89% of compounds that reduced C. elegans egg viability also had mammalian developmental activity. Conversely, only 25% of compounds found not to reduce egg viability in C. elegans were also inactive in mammals. We conclude that the C. elegans egg viability assay is an accurate positive predictor, but an inaccurate negative predictor, of mammalian developmental activity. We then evaluated C. elegans as a tool to identify mechanisms affecting toxicological outcomes among related compounds. The difference in developmental activity of structurally related fungicides in C. elegans correlated with their rate of metabolism. Knockdown of the cytochrome P450s cyp-35A3 and cyp-35A4 increased the toxicity to C. elegans of the least developmentally active compounds to the level of the most developmentally active. This indicated that these P450s were involved in the greater rate of metabolism of the less toxic of these compounds. We conclude that C. elegans based approaches can predict mammalian developmental activity and can yield plausible hypotheses for factors affecting the biological potency of compounds in mammals.

  16. Hepatitis A Virus: Essential Knowledge and a Novel Identify-Isolate-Inform Tool for Frontline Healthcare Providers

    Directory of Open Access Journals (Sweden)

    Kristi L. Koenig

    2017-10-01

    Infection with hepatitis A virus (HAV) causes a highly contagious illness that can lead to serious morbidity and occasional mortality. Although the overall incidence of HAV has been declining since the introduction of the HAV vaccine, there have been an increasing number of outbreaks within the United States and elsewhere between 2016 and 2017. These outbreaks have had far-reaching consequences, with a large number of patients requiring hospitalization and several deaths. Accordingly, HAV is proving to present a renewed public health challenge. Through use of the “Identify-Isolate-Inform” tool as adapted for HAV, emergency physicians can become more familiar with the identification and management of patients presenting to the emergency department (ED) with exposure, infection, or risk of contracting disease. While it can be asymptomatic, HAV typically presents with a prodrome of fever, nausea/vomiting, and abdominal pain followed by jaundice. Healthcare providers should maintain strict standard precautions for all patients suspected of having HAV infection as well as contact precautions in special cases. Hand hygiene with soap and warm water should be emphasized, and affected patients should be counseled to avoid food preparation and close contact with vulnerable populations. Additionally, ED providers should offer post-exposure prophylaxis to exposed contacts and encourage vaccination as well as other preventive measures for at-risk individuals. ED personnel should inform local public health departments of any suspected case.

  17. How can we identify patients with delirium in the emergency department?: A review of available screening and diagnostic tools.

    Science.gov (United States)

    Tamune, Hidetaka; Yasugi, Daisuke

    2017-09-01

    Delirium is a widespread and serious but under-recognized problem. Increasing evidence argues that emergency health care providers need to assess the mental status of the patient as the "sixth vital sign". A simple, sensitive, time-efficient, and cost-effective tool is needed to identify delirium in patients in the emergency department (ED); however, a stand-alone measurement has not yet been established, despite previous studies, partly because the differential diagnosis of dementia and delirium superimposed on dementia (DSD) is too difficult to achieve using a single indicator. To fill this gap, multiple aspects of a case should be assessed, including inattention and arousal. For instance, we proposed the 100 countdown test as an effective means of detecting inattention. Further dedicated studies are warranted to shed light on the pathophysiology and better management of dementia, delirium and/or "altered mental status". We review herein the clinical questions and controversies concerning delirium in an ED setting. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results: The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion: The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  19. Structural health monitoring (vibration) as a tool for identifying structural alterations of the lumbar spine: a twin control study.

    Science.gov (United States)

    Kawchuk, Gregory N; Hartvigsen, Jan; Edgecombe, Tiffany; Prasad, Narasimha; van Dieen, Jaap H

    2016-03-11

    Structural health monitoring (SHM) is an engineering technique used to identify mechanical abnormalities not readily apparent through other means. Recently, SHM has been adapted for use in biological systems, but its invasive nature limits its clinical application. As such, the purpose of this project was to determine if a non-invasive form of SHM could identify structural alterations in the spines of living human subjects. The lumbar spines of 10 twin pairs were visualized by magnetic resonance imaging and then assessed by a blinded radiologist to determine whether the twin pairs were structurally concordant or discordant. Vibration was then applied to each subject's spine and the resulting response recorded from sensors overlying the lumbar spinous processes. The peak frequency, the area under the curve and the root mean square were computed from the frequency response function of each sensor. Statistical analysis demonstrated that in twins whose structural appearance was discordant, peak frequency was significantly different between twin pairs, while in concordant twins, no outcomes were significantly different. From these results, we conclude that structural changes within the spine can alter its vibration response. As such, further investigation of SHM to identify spinal abnormalities in larger human populations is warranted.
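    The three outcome measures named here are straightforward to compute from an estimated frequency response function (FRF). A sketch on synthetic signals, using an H1 estimator (cross-spectrum over input auto-spectrum) as one standard choice; the 45 Hz "resonance" is fabricated for illustration:

    ```python
    import numpy as np
    from scipy.signal import butter, csd, lfilter, welch

    fs = 1000.0                                   # Hz, sampling rate
    rng = np.random.default_rng(1)
    t = np.arange(0, 10, 1 / fs)
    excitation = rng.normal(size=t.size)          # applied vibration (white noise)

    # toy "spine" response: a resonance band near 45 Hz plus sensor noise
    b, a = butter(2, [40 / (fs / 2), 50 / (fs / 2)], btype="band")
    response = lfilter(b, a, excitation) + 0.01 * rng.normal(size=t.size)

    f, Pxy = csd(excitation, response, fs=fs, nperseg=1024)
    f, Pxx = welch(excitation, fs=fs, nperseg=1024)
    frf = np.abs(Pxy / Pxx)                       # H1 estimate of the FRF

    peak_frequency = f[np.argmax(frf)]
    area_under_curve = ((frf[:-1] + frf[1:]) / 2 * np.diff(f)).sum()
    root_mean_square = np.sqrt(np.mean(frf ** 2))
    print(peak_frequency, area_under_curve, root_mean_square)
    ```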

  20. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  1. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities, as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli and, on the other, by improving TFBS prediction in microbes. Finally, we highlight, through visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.
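    One of the simplest regularity analyses of this kind looks for a preferred spacing among co-regulated genes. A toy sketch: histogram the pairwise distances modulo a candidate period, so that a periodic layout concentrates mass near zero (positions and period are synthetic; GREAT:SCAN's statistics are more sophisticated):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    period = 33_000                                    # bp, planted spacing
    genes = np.sort(period * rng.integers(1, 120, 40)
                    + rng.integers(-2000, 2000, 40))   # jittered periodic sites

    d = np.abs(genes[:, None] - genes[None, :])
    d = d[np.triu_indices_from(d, k=1)]                # pairwise distances

    phase = d % period                                 # residual w.r.t. the period
    hist, _ = np.histogram(phase, bins=20, range=(0, period))
    print(hist)   # mass piles up in the first and last bins for a periodic layout
    ```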

  2. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array CGH data. CGHPRO is a stand-alone Java application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options facilitate also the definition of shortest regions of overlap and simplify the

  3. A new tool for accelerator system modeling and analysis

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system-level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of its modeling and analysis capabilities is illustrated.

  4. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
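    The principle behind such compilers can be shown in miniature: propagate a derivative alongside every value so that sensitivities emerge from an ordinary model evaluation. A sketch of forward-mode automatic differentiation with dual numbers, illustrating the idea rather than GRESS or EXAP themselves:

    ```python
    class Dual:
        """A value paired with its derivative w.r.t. one chosen input."""
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.der * o.val + self.val * o.der)  # product rule
        __rmul__ = __mul__

    def model(k):
        # toy response, e.g. a dose that scales with a transport coefficient
        return 3.0 * k * k + 2.0 * k + 1.0

    k = Dual(2.0, 1.0)       # seed derivative dk/dk = 1
    out = model(k)
    print(out.val, out.der)  # value 17.0 and sensitivity 6k + 2 = 14.0
    ```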

  5. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  6. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  7. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
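    The first stages of that measurement pipeline are easy to sketch. The following toy version thresholds the galaxy pixels, takes the intensity centroid as the centre, and records the polar angle of peak intensity at increasing radii; the drift of that angle with radius is the kind of slope used to quantify spirality (illustrative only, not Shamir's implementation):

    ```python
    import numpy as np

    yy, xx = np.mgrid[0:128, 0:128].astype(float)
    theta0 = np.arctan2(yy - 64.0, xx - 64.0)
    rad0 = np.hypot(yy - 64.0, xx - 64.0)
    # synthetic one-armed "spiral": brightness follows angle ~ 0.05 * radius
    wrap = ((theta0 - 0.05 * rad0 + np.pi) % (2 * np.pi)) - np.pi
    img = 10 + 100 * np.exp(-0.5 * (wrap / 0.3) ** 2) * np.exp(-rad0 / 40)

    mask = img > img.mean() + img.std()                 # "galaxy" pixels
    cy = (yy * img * mask).sum() / (img * mask).sum()   # intensity centroid
    cx = (xx * img * mask).sum() / (img * mask).sum()

    for r in range(5, 40, 5):
        ring = np.abs(np.hypot(yy - cy, xx - cx) - r) < 1.0
        ang = np.arctan2(yy - cy, xx - cx)[ring]
        peak = ang[np.argmax(img[ring])]
        print(f"r={r:2d}  peak angle={peak:+.2f} rad")  # drifts with radius
    ```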

  8. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  9. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we are looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth in the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Comparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher resolutions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  10. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big-data issues when analyzing performance on leadership-computing-class systems and to assist the HPC community in making the most effective use of these resources.

  11. XQCAT: eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion confidence levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  12. Simple Strategic Analysis Tools at SMEs in Ecuador

    Directory of Open Access Journals (Sweden)

    Diego H. Álvarez Peralta

    2015-06-01

    This article explores the possible applications of Strategic Analysis Tools (SAT) in SMEs located in emerging countries such as Ecuador, where there are no formal studies on the subject. It analyzes whether it is feasible to effectively apply a set of proposed tools to guide the mental maps of executives when decisions on strategy have to be made. Through an in-depth review of the state of the art regarding SAT, together with interviews with key participants such as chambers of commerce and executives of different firms, we show that applying these tools is feasible. This analysis is complemented with specialists' interviews to deepen our insights and obtain valid conclusions. Our conclusion is that SMEs can smoothly develop and apply an appropriate set of SAT when facing very relevant choices. However, some obstacles remain, connected with resources (such as people's abilities and technology), behavioral and cultural factors, and methodological processes. Once these barriers are removed, current approaches to strategic decision-making are likely to become even more effective. This is a qualitative investigation with a non-experimental, cross-sectional research design, as it relates to a specific moment in time.

  13. [Analysis of software for identifying spectral line of laser-induced breakdown spectroscopy based on LabVIEW].

    Science.gov (United States)

    Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang

    2012-03-01

    We introduce self-designed software for identifying LIBS spectral lines. Integrated with LabVIEW, the software can smooth spectra and pick peaks, employing second-difference and threshold methods. Characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral line identification and qualitative analysis of the basic composition of a sample. The software can analyze spectra handily and rapidly, and will be a useful tool for LIBS.
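    The peak-picking recipe named here (smoothing, second difference, threshold, database match) can be sketched as follows; the wavelengths, smoothing window, and two-line "database" are illustrative stand-ins for the NIST line list:

    ```python
    import numpy as np

    wl = np.linspace(390, 400, 2000)                     # nm
    spectrum = np.random.default_rng(4).normal(5, 0.3, wl.size)
    for line in (393.37, 396.85):                        # e.g. Ca II doublet
        spectrum += 50 * np.exp(-0.5 * ((wl - line) / 0.02) ** 2)

    kernel = np.ones(9) / 9.0
    smooth = np.convolve(spectrum, kernel, mode="same")  # moving-average smoothing
    d2 = np.gradient(np.gradient(smooth))                # second difference

    # strong negative curvature on top of a strong signal marks a peak
    candidates = np.where((d2 < -5 * d2.std())
                          & (smooth > smooth.mean() + 3 * smooth.std()))[0]

    NIST_LINES = {393.366: "Ca II", 396.847: "Ca II"}    # toy subset (assumption)
    for i in candidates:
        ref = min(NIST_LINES, key=lambda L: abs(L - wl[i]))
        if abs(ref - wl[i]) < 0.05:
            print(f"peak at {wl[i]:.3f} nm -> {NIST_LINES[ref]} {ref} nm")
    ```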

  14. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with the new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909 (p < 0.001). The modified BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  15. Dipeptide frequency/bias analysis identifies conserved sites of nonrandomness shared by cysteine-rich motifs.

    Science.gov (United States)

    Campion, S R; Ameen, A S; Lai, L; King, J M; Munzenmaier, T N

    2001-08-15

    This report describes the application of a simple computational tool, AAPAIR.TAB, for the systematic analysis of the cysteine-rich EGF, Sushi, and Laminin motif/sequence families at the two-amino acid level. Automated dipeptide frequency/bias analysis detects preferences in the distribution of amino acids in established protein families, by determining which "ordered dipeptides" occur most frequently in comprehensive motif-specific sequence data sets. Graphic display of the dipeptide frequency/bias data revealed family-specific preferences for certain dipeptides, but more importantly detected a shared preference for employment of the ordered dipeptides Gly-Tyr (GY) and Gly-Phe (GF) in all three protein families. The dipeptide Asn-Gly (NG) also exhibited high-frequency and bias in the EGF and Sushi motif families, whereas Asn-Thr (NT) was distinguished in the Laminin family. Evaluation of the distribution of dipeptides identified by frequency/bias analysis subsequently revealed the highly restricted localization of the G(F/Y) and N(G/T) sequence elements at two separate sites of extreme conservation in the consensus sequence of all three sequence families. The similar employment of the high-frequency/bias dipeptides in three distinct protein sequence families was further correlated with the concurrence of these shared molecular determinants at similar positions within the distinctive scaffolds of three structurally divergent, but similarly employed, motif modules.
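    The counting itself is straightforward to reproduce. A minimal sketch of dipeptide frequency/bias analysis on toy cysteine-rich sequences, with bias defined as observed dipeptide frequency over the frequency expected from single-residue composition (an assumption about the exact normalisation used by AAPAIR.TAB):

    ```python
    from collections import Counter

    seqs = ["CNGGYC", "CNGGFC", "CNTGYC"]   # toy cysteine-rich motifs

    aa = Counter()
    dipep = Counter()
    total_pairs = 0
    for s in seqs:
        aa.update(s)
        for i in range(len(s) - 1):
            dipep[s[i:i + 2]] += 1          # ordered two-residue word
            total_pairs += 1

    n_aa = sum(aa.values())
    for pair, obs in dipep.most_common():
        p_expected = (aa[pair[0]] / n_aa) * (aa[pair[1]] / n_aa)
        bias = (obs / total_pairs) / p_expected
        print(f"{pair}: observed {obs}, bias {bias:.2f}")
    ```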

  16. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis.

    Directory of Open Access Journals (Sweden)

    Kevin Till

    Prediction of adult performance from early-age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players were collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and a validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed that 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p < 0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, SVD analysis may be a useful tool for research and practice within talent identification.
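    A minimal sketch of the pipeline described, on synthetic data: centre the anthropometric/fitness matrix, orthogonalize it with an SVD, project players onto the leading component, and pick a classification threshold from the ROC curve (feature effects and names are assumptions):

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(5)
    n = 165
    pro = rng.integers(0, 2, n)        # 1 = future professional (toy labels)
    X = rng.normal(0, 1, (n, 6))       # e.g. height, mass, sprint times, ...
    X[:, 3] -= 0.8 * pro               # faster 60 m sprint times (assumption)
    X[:, 4] -= 0.6 * pro               # faster agility 505 times (assumption)

    Xc = X - X.mean(axis=0)            # centre, then orthogonalize via SVD
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    score = Xc @ Vt[0]                 # projection on the leading component
    if np.corrcoef(score, pro)[0, 1] < 0:
        score = -score                 # orient the component toward 'pro'

    fpr, tpr, thr = roc_curve(pro, score)
    j = np.argmax(tpr - fpr)           # Youden's J for the threshold choice
    print(f"threshold {thr[j]:.2f}: sensitivity {tpr[j]:.2f}, "
          f"specificity {1 - fpr[j]:.2f}")
    ```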

  17. Application of cluster analysis to geochemical compositional data for identifying ore-related geochemical anomalies

    Science.gov (United States)

    Zhou, Shuguang; Zhou, Kefa; Wang, Jinlin; Yang, Genfang; Wang, Shanshan

    2017-12-01

    Cluster analysis is a well-known technique that is used to analyze various types of data. In this study, cluster analysis is applied to geochemical data that describe 1444 stream sediment samples collected in northwestern Xinjiang with a sample spacing of approximately 2 km. Three algorithms (the hierarchical, k-means, and fuzzy c-means algorithms) and six data transformation methods (the z-score standardization, ZST; the logarithmic transformation, LT; the additive log-ratio transformation, ALT; the centered log-ratio transformation, CLT; the isometric log-ratio transformation, ILT; and no transformation, NT) are compared in terms of their effects on the cluster analysis of the geochemical compositional data. The study shows that, on the one hand, the ZST does not affect the results of column- or variable-based (R-type) cluster analysis, whereas the other methods, including the LT, the ALT, and the CLT, have substantial effects on the results. On the other hand, the results of the row- or observation-based (Q-type) cluster analysis obtained from the geochemical data after applying NT and the ZST are relatively poor. However, we derive some improved results from the geochemical data after applying the CLT, the ILT, the LT, and the ALT. Moreover, the k-means and fuzzy c-means clustering algorithms are more reliable than the hierarchical algorithm when they are used to cluster the geochemical data. We apply cluster analysis to the geochemical data to explore for Au deposits within the study area, and we obtain a good correlation between the results retrieved by combining the CLT or the ILT with the k-means or fuzzy c-means algorithms and the potential zones of Au mineralization. Therefore, we suggest that the combination of the CLT or the ILT with the k-means or fuzzy c-means algorithms is an effective tool to identify potential zones of mineralization from geochemical data.
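    The best-performing combination reported, a centred log-ratio (CLR) transform followed by k-means, is compact to sketch; the "concentrations" below are synthetic stand-ins for the stream-sediment measurements:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(6)
    X = rng.lognormal(mean=0.0, sigma=1.0, size=(1444, 10))  # ppm-like values
    X = X / X.sum(axis=1, keepdims=True)                     # close the composition

    clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)  # CLR transform

    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(clr)
    print(np.bincount(labels))   # cluster sizes; k = 4 is illustrative
    ```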

  18. ELECTRA© Launch and Re-Entry Safety Analysis Tool

    Science.gov (United States)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operations Act sets as the prime objective of the national technical regulations the protection of people, property, public health and the environment. Within this framework, independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient, state-of-the-art tools for evaluating risks. The development of the ELECTRA© tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA© project draws on the proven expertise of CNES technical centers in the fields of flight analysis and safety, spaceflight dynamics and spacecraft design. The ELECTRA© tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones while comparing them with relevant geographic data such as population density, urban areas, and shipping lanes, among others. The method chosen for ELECTRA© consists of two main steps: calculating the possible re-entry trajectories for each fragment after the spacecraft breaks up; and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation is weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA©'s development is now at the end of the validation phase, the last step before delivery to users. The validation has been performed in several ways: numerical verification of the risk formulation; benchmarking of the casualty area, fragment entry energy levels and housing-module protection levels; best practices in the space transportation industry concerning dependability evaluation; and a benchmarking process for
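    The risk quantity at the heart of such a tool is an expected casualty count: for each surviving fragment, sum over ground cells the impact probability times population density times the fragment's casualty area (weighted, for launch and active re-entry, by the failure probability along the trajectory). A toy sketch, with fragment list, grid and numbers all illustrative:

    ```python
    import numpy as np

    cells_density = np.array([0.0, 15.0, 800.0, 120.0])   # people per km^2

    fragments = [
        # (impact probability per ground cell, casualty area in km^2)
        (np.array([0.70, 0.20, 0.05, 0.05]), 2.0e-6),
        (np.array([0.50, 0.30, 0.10, 0.10]), 8.0e-6),
    ]

    expected_casualties = sum(
        (p_cell * cells_density * area).sum() for p_cell, area in fragments
    )
    print(f"E[casualties] = {expected_casualties:.2e}")
    ```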

  19. Development of the policy indicator checklist: a tool to identify and measure policies for calorie-dense foods and sugar-sweetened beverages across multiple settings.

    Science.gov (United States)

    Lee, Rebecca E; Hallett, Allen M; Parker, Nathan; Kudia, Ousswa; Kao, Dennis; Modelska, Maria; Rifai, Hanadi; O'Connor, Daniel P

    2015-05-01

    We developed the policy indicator checklist (PIC) to identify and measure policies for calorie-dense foods and sugar-sweetened beverages and to determine how policies are clustered across multiple settings. In 2012 and 2013 we used existing literature, policy documents, government recommendations, and instruments to identify key policies. We then developed the PIC to examine the policy environments across 3 settings (communities, schools, and early care and education centers) in 8 communities participating in the Childhood Obesity Research Demonstration Project. Principal components analysis revealed 5 components related to calorie-dense food policies and 4 components related to sugar-sweetened beverage policies. Communities with higher youth and racial/ethnic minority populations tended to have fewer and weaker policy environments concerning calorie-dense foods and healthy foods and beverages. The PIC was a helpful tool to identify policies that promote healthy food environments across multiple settings and to measure and compare the overall policy environments across communities. There is a need for improved coordination across settings, particularly in areas with a greater concentration of youths and racial/ethnic minority populations. Policies to support healthy eating are not equally distributed across communities, and disparities continue to exist in nutrition policies.

  20. Genome-wide analysis of regulatory proteases sequences identified through bioinformatics data mining in Taenia solium.

    Science.gov (United States)

    Yan, Hong-Bin; Lou, Zhong-Zi; Li, Li; Brindley, Paul J; Zheng, Yadong; Luo, Xuenong; Hou, Junling; Guo, Aijiang; Jia, Wan-Zhong; Cai, Xuepeng

    2014-06-04

    Cysticercosis remains a major neglected tropical disease of humanity in many regions, especially in sub-Saharan Africa, Central America and elsewhere. Owing to emerging drug resistance and the inability of current drugs to prevent re-infection, identification of novel vaccines and chemotherapeutic agents against Taenia solium and related helminth pathogens is a public health priority. The T. solium genome and the predicted proteome were reported recently, providing a wealth of information from which new interventional targets might be identified. In order to characterize and classify the entire repertoire of protease-encoding genes of T. solium, which play fundamental biological roles in all life processes, we analyzed the predicted proteins of this cestode through a combination of bioinformatics tools. Functional annotation was performed to yield insights into the signaling processes relevant to the complex developmental cycle of this tapeworm and to highlight a suite of the proteases as potential intervention targets. Within the genome of this helminth parasite, we identified 200 open reading frames encoding proteases from five clans, which correspond to 1.68% of the 11,902 protein-encoding genes predicted to be present in its genome. These proteases include calpains, cytosolic and mitochondrial signal peptidases, ubiquitylation-related proteins, and others. Many not only show significant similarity to proteases in the Conserved Domain Database but also have conserved active sites and catalytic domains. KEGG Automatic Annotation Server (KAAS) analysis indicated that ~60% of these proteases share strong sequence identities with proteins of the KEGG database, which are involved in human disease, metabolic pathways, genetic information processes, cellular processes, environmental information processes and organismal systems. Also, we identified signal peptides and transmembrane helices through comparative analysis with classes of important regulatory proteases

  1. Proteomics as a Tool to Identify New Targets Against Aspergillus and Scedosporium in the Context of Cystic Fibrosis.

    Science.gov (United States)

    Ramirez-Garcia, Andoni; Pellon, Aize; Buldain, Idoia; Antoran, Aitziber; Arbizu-Delgado, Aitana; Guruceaga, Xabier; Rementeria, Aitor; Hernando, Fernando L

    2018-02-01

    Cystic fibrosis (CF) is a genetic disorder that increases the risk of suffering microbial, including fungal, infections. In this paper, proteomics-based information was collated relating to secreted and cell wall proteins with potential medical applications from the most common filamentous fungi in CF, i.e., Aspergillus and Scedosporium/Lomentospora species. Among the Aspergillus fumigatus secreted allergens, β-1,3-endoglucanase, the alkaline protease 1 (Alp1/oryzin), Asp f 2, Asp f 13/15, chitinase, chitosanase, dipeptidyl-peptidase V (DppV), the metalloprotease Asp f 5, mitogillin/Asp f 1, and thioredoxin reductase receive a special mention. In addition, the antigens β-glucosidase 1, catalase, glucan endo-1,3-β-glucosidase EglC, β-1,3-glucanosyltransferases Gel1 and Gel2, and glutaminase A were also identified in secretomes of other Aspergillus species associated with CF: Aspergillus flavus, Aspergillus niger, Aspergillus nidulans, and Aspergillus terreus. Regarding cell wall proteins, cytochrome P450 and eEF-3 were proposed as diagnostic targets, and alkaline protease 2 (Alp2), Asp f 3 (putative peroxiredoxin pmp20), probable glycosidases Asp f 9/Crf1 and Crf2, GPI-anchored protein Ecm33, β-1,3-glucanosyltransferase Gel4, conidial hydrophobin Hyp1/RodA, and secreted aspartyl protease Pep2 as protective vaccines in A. fumigatus. On the other hand, for Scedosporium/Lomentospora species, the heat shock protein Hsp70 stands out as a relevant secreted and cell wall antigen. Additionally, the secreted aspartyl proteinase and an ortholog of Asp f 13, as well as the cell wall endo-1,3-β-D-glucosidase and 1,3-β-glucanosyl transferase, were also found to be significant proteins. In conclusion, proteins mentioned in this review may be promising candidates for developing innovative diagnostic and therapeutic tools for fungal infections in CF patients.

  2. Remote Sensing and GIS as Tools for Identifying Risk for Phreatomagmatic Eruptions in the Bishoftu Volcanic Field, Ethiopia

    Science.gov (United States)

    Pennington, H. G.; Graettinger, A.

    2017-12-01

    Bishoftu is a fast-growing town in the Oromia region of Ethiopia, located 47 km southeast of the nation's capital, Addis Ababa. It is situated atop a monogenetic basaltic volcanic field, called the Bishoftu Volcanic Field (BVF), which is composed of maar craters, scoria cones, lava flows, and rhyolite domes. Although not well dated, the morphology and archeological evidence have been used to infer a Holocene age, indicating that the community is exposed to continued volcanic risk. The presence of phreatomagmatic constructs in particular indicates that the hazards are not only vent-localized, but may have far-reaching impacts. Hazard mapping is an essential tool for evaluating and communicating risks. This study presents the results of GIS analyses of proximal and distal syn-eruptive hazards associated with phreatomagmatic eruptions in the BVF. A digitized infrastructure map based on a SPOT 6 satellite image is used to identify the areas at risk from eruption scenarios. Parameters such as wind direction, vent location, and explosion energy are varied for hazard simulations to quantify the area impacted by different eruption scenarios. Proximal syn-eruptive hazards include tephra fall, pyroclastic base surges, and ballistic bombs. Distal hazards include predominantly ash fall. Eruption scenarios are simulated using Eject and Plumeria models as well as similar case studies from other urban volcanic fields. Within 5 km of the volcanic field center, more than 30 km2 of residential and commercial/industrial infrastructure will be damaged by proximal syn-eruptive hazards, in addition to 34 km2 of agricultural land, 291 km of roads, more than 10 km of railway, an airport, and two health centers. Within 100 km of the volcanic field center, ash fall will affect 3946 km2 of agricultural land, 179 km2 of residential land, and 28 km2 of commercial/industrial land. Approximately 2700 km of roads and railways, 553 km of waterways, an airport, and 14 health centers are located
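    The proximal-hazard figures above come from overlaying eruption footprints on digitized land use. A minimal sketch of that overlay, assuming a projected coordinate system in metres and using invented geometries rather than the BVF map, is:

```python
# Intersect a 5 km buffer around the field centre with land-use polygons
# to tally exposed area. Geometries are illustrative only.
from shapely.geometry import Point, Polygon

vent = Point(0, 0)                # volcanic field centre (projected CRS, metres)
hazard_zone = vent.buffer(5_000)  # 5 km proximal hazard footprint

land_use = {
    "residential": Polygon([(1000, 1000), (6000, 1000), (6000, 4000), (1000, 4000)]),
    "agricultural": Polygon([(-8000, -2000), (-1000, -2000), (-1000, 3000), (-8000, 3000)]),
}
for name, poly in land_use.items():
    exposed = poly.intersection(hazard_zone).area / 1e6  # m^2 -> km^2
    print(f"{name}: {exposed:.1f} km2 inside the 5 km zone")
```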

  3. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pabian, Frank V [Los Alamos National Laboratory

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and on how commercial satellite imagery, together with newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with geospatial tools can significantly augment and enhance existing information-gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'Digital Virtual Globes' (e.g., Google Earth, Virtual Earth), which are far better than the simple 2-D plan-view line drawings previously used for visualization of known and suspected facilities of interest, and which can be critical to: (1) site familiarization and true geospatial context awareness; (2) pre-inspection planning; (3) onsite orientation and navigation; (4) post-inspection reporting; (5) site monitoring over time for changes; (6) verification of States' site declarations and input to State Evaluation reports; and (7) a common basis for discussions among all interested parties (Member States). Additionally, as open sources, such virtual globes can also provide a new, essentially free, means to conduct broad-area searches for undeclared nuclear sites and activities - whether alleged through open-source leads; identified on Internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or through other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant

  4. Utilizing a Photo-Analysis Software for Content Identifying Method (CIM)

    Directory of Open Access Journals (Sweden)

    Nejad Nasim Sahraei

    2015-01-01

    Content Identifying Methodology (CIM) was developed to measure public preferences in order to reveal the common characteristics of landscapes and aspects of underlying perceptions, including individuals' reactions to content and spatial configuration; it can therefore assist with the identification of factors that influence preference. For the analysis of landscape photographs through CIM, several studies have utilized image-analysis software, such as Adobe Photoshop, to identify the physical contents of scenes. This study evaluates the public's preferences for the aesthetic qualities of pedestrian bridges in urban areas through a photo-questionnaire survey, in which respondents evaluated images of pedestrian bridges in urban areas. Two groups of images were evaluated as the most and least preferred scenes, corresponding to the highest and lowest mean scores respectively. These two groups were analyzed by CIM and also evaluated on the basis of the respondents' descriptions of each group to reveal the pattern of preferences and the factors that may affect them. Digimizer software was employed to triangulate the two approaches and to determine the role of these factors in people's preferences. This study introduces useful software for image analysis that can measure the physical contents of scenes as well as their spatial organization. The findings reveal that Digimizer could be a useful tool in CIM approaches to preference studies that utilize photographs in place of the actual landscape in order to determine the most important factors in public preferences for pedestrian bridges in urban areas.

  5. ISAC - A tool for aeroservoelastic modeling and analysis. [Interaction of Structures, Aerodynamics, and Control

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood T.

    1993-01-01

    This paper discusses the capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrate some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.
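    The rational-function option mentioned above can be sketched for a single aerodynamic transfer term. Assuming fixed lag roots (the classical Roger form) and synthetic frequency-domain data, the coefficients follow from linear least squares; this is an illustration of the idea, not ISAC code.

```python
# Fit Q(ik) ~ A0 + A1*(ik) + A2*(ik)^2 + sum_j A_{2+j} * ik/(ik + b_j)
# by stacking real and imaginary parts into one real least-squares problem.
import numpy as np

k = np.linspace(0.05, 1.5, 30)           # reduced frequencies
s = 1j * k
Q = 1.0 + 0.8 * s + 0.3 * s / (s + 0.4)  # synthetic "measured" transfer data

b = np.array([0.2, 0.6])                 # assumed aerodynamic lag roots
cols = [np.ones_like(s), s, s**2] + [s / (s + bj) for bj in b]
M = np.column_stack(cols)

A = np.linalg.lstsq(
    np.vstack([M.real, M.imag]), np.concatenate([Q.real, Q.imag]), rcond=None
)[0]
print("fitted coefficients A0, A1, A2, lag terms:", np.round(A, 3))
```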

  6. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug the performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from the genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.

  7. Are your students ready for anatomy and physiology? Developing tools to identify students at risk for failure.

    Science.gov (United States)

    Gultice, Amy; Witham, Ann; Kallmeyer, Robert

    2015-06-01

    High failure rates in introductory college science courses, including anatomy and physiology, are common at institutions across the country, and determining the specific factors that contribute to this problem is challenging. To identify students at risk for failure in introductory physiology courses at our open-enrollment institution, an online pilot survey was administered to 200 biology students. The survey results revealed several predictive factors related to academic preparation and prompted a comprehensive analysis of college records of >2,000 biology students over a 5-yr period. Using these historical data, a model that was 91% successful in predicting student success in these courses was developed. The results of the present study support the use of surveys and similar models to identify at-risk students and to provide guidance in the development of evidence-based advising programs and pedagogies. This comprehensive approach may be a tangible step in improving student success for students from a wide variety of backgrounds in anatomy and physiology courses. Copyright © 2015 The American Physiological Society.
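    A model of the kind described above, predicting pass/fail from academic-preparation records, can be sketched with scikit-learn. The features and data below are hypothetical stand-ins for the study's actual variables.

```python
# Logistic regression on simulated academic-preparation features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 500
gpa = rng.uniform(1.5, 4.0, n)
prereq = rng.integers(0, 2, n)  # completed a prerequisite course?
passed = (0.9 * gpa + 0.8 * prereq + rng.normal(0, 0.6, n) > 3.0).astype(int)

X = np.column_stack([gpa, prereq])
X_tr, X_te, y_tr, y_te = train_test_split(X, passed, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```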

  8. The design and use of reliability data base with analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.
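    One of the stationarity questions raised above, whether a component's failure intensity is constant over time, can be answered with a Laplace trend test: under a homogeneous Poisson process the statistic below is approximately standard normal. The failure times are invented for illustration.

```python
# Laplace trend test on a component's failure times over a window (0, T].
import numpy as np
from scipy.stats import norm

t = np.array([120.0, 340.0, 450.0, 900.0, 1300.0, 1420.0])  # failure times (h)
T = 1500.0                                                  # observation window (h)

n = len(t)
U = (t.mean() - T / 2) / (T / np.sqrt(12 * n))
p = 2 * norm.sf(abs(U))  # two-sided p-value; large |U| suggests a trend
print(f"Laplace U = {U:.2f}, p = {p:.2f}")
```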

  9. The design and use of reliability data base with analysis tool

    International Nuclear Information System (INIS)

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.

    1996-06-01

    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user a greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, and the statistical methods used in analyzing this data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for the ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possible dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risk and clustering of failure-repair events. These ideas have been implemented in an analysis tool for grazing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs

  10. Gene expression patterns combined with network analysis identify hub genes associated with bladder cancer.

    Science.gov (United States)

    Bi, Dongbin; Ning, Hao; Liu, Shuai; Que, Xinxiang; Ding, Kejia

    2015-06-01

    To explore molecular mechanisms of bladder cancer (BC), network strategy was used to find biomarkers for early detection and diagnosis. The differentially expressed genes (DEGs) between bladder carcinoma patients and normal subjects were screened using empirical Bayes method of the linear models for microarray data package. Co-expression networks were constructed by differentially co-expressed genes and links. Regulatory impact factors (RIF) metric was used to identify critical transcription factors (TFs). The protein-protein interaction (PPI) networks were constructed by the Search Tool for the Retrieval of Interacting Genes/Proteins (STRING) and clusters were obtained through molecular complex detection (MCODE) algorithm. Centralities analyses for complex networks were performed based on degree, stress and betweenness. Enrichment analyses were performed based on Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. Co-expression networks and TFs (based on expression data of global DEGs and DEGs in different stages and grades) were identified. Hub genes of complex networks, such as UBE2C, ACTA2, FABP4, CKS2, FN1 and TOP2A, were also obtained according to analysis of degree. In gene enrichment analyses of global DEGs, cell adhesion, proteinaceous extracellular matrix and extracellular matrix structural constituent were top three GO terms. ECM-receptor interaction, focal adhesion, and cell cycle were significant pathways. Our results provide some potential underlying biomarkers of BC. However, further validation is required and deep studies are needed to elucidate the pathogenesis of BC. Copyright © 2015 Elsevier Ltd. All rights reserved.
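    The degree-based hub ranking used above is straightforward to reproduce with networkx on a toy interaction network; the gene names below are placeholders borrowed from the abstract, and the edges are invented.

```python
# Rank nodes of a toy PPI network by degree to identify hub genes.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("UBE2C", "CKS2"), ("UBE2C", "TOP2A"), ("UBE2C", "FN1"),
    ("FN1", "ACTA2"), ("FN1", "FABP4"), ("TOP2A", "CKS2"),
])
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:3]
print("top hubs by degree:", hubs)
```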

  11. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  12. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using their Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  13. Identifying Key Features, Cutting Edge Cloud Resources, and Artificial Intelligence Tools to Achieve User-Friendly Water Science in the Cloud

    Science.gov (United States)

    Pierce, S. A.

    2017-12-01

    Decision making for groundwater systems is becoming increasingly important, as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and augment the actual timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure that enables interactive computing and data analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify computing architecture, and increase the usability of and access to advanced compute environments for broader audiences. The result enables dexterous configurations, opening up opportunities for IWRM modelers to expand the reach of analyses, the number of case studies, and the quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions, paired with advanced computational resources, refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, and adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation will provide an overview of advanced computing services in the cloud using integrated groundwater management case

  14. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind, which means that it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
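    The concordancer at the heart of such a tool can be sketched as a keyword-in-context (KWIC) routine; the corpus below is a toy example, not TranslatorBank code.

```python
# Minimal keyword-in-context (KWIC) concordance over a tokenized corpus.
def kwic(tokens, keyword, window=3):
    for i, tok in enumerate(tokens):
        if tok.lower() == keyword.lower():
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            yield f"{left:>30} | {tok} | {right}"

corpus = ("the translator consulted the corpus before rendering the term "
          "corpus linguistics informs translation practice").split()
for line in kwic(corpus, "corpus"):
    print(line)
```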

  15. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven along with its user-programmable features can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system- level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  16. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-04-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind, which means that it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  17. Architecture of the global land acquisition system: applying the tools of network science to identify key vulnerabilities

    International Nuclear Information System (INIS)

    Seaquist, J W; Li Johansson, Emma; Nicholas, Kimberly A

    2014-01-01

    Global land acquisitions, often dubbed ‘land grabbing’, are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network is characterized by very few trading cliques and therefore by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (a disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling. (letter)
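    The two measures emphasised above, shortest-path connectivity and disassortative mixing, are quick to compute with networkx; the edges below are an invented import-export subset, not the actual land-deal data.

```python
# Betweenness centrality (shortest-path connectivity) and degree assortativity
# (negative values indicate disassortative mixing) on a toy trade network.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Ethiopia", "China"), ("Ethiopia", "UK"), ("Sudan", "China"),
    ("Madagascar", "UK"), ("Laos", "China"), ("UK", "US"),
])
bc = nx.betweenness_centrality(G)
print("betweenness:", {k: round(v, 2) for k, v in bc.items()})
print("degree assortativity:", round(nx.degree_assortativity_coefficient(G), 2))
```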

  18. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Background: Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format, with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results: atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but that this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion: atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools

  19. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format, with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but that this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
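    The expansion-and-clustering workflow can be sketched as follows; networkx's greedy modularity communities stand in for SCAN (which is not implemented here), and the seed genes and interactions are illustrative.

```python
# Expand a seed gene list through a toy PPI network, then suggest modules
# with a community-detection algorithm (a stand-in for SCAN).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

ppi = nx.Graph([
    ("TP53", "MDM2"), ("TP53", "ATM"), ("MDM2", "MDM4"),
    ("BRCA1", "BARD1"), ("BRCA1", "RAD51"), ("BARD1", "RAD51"),
])
seeds = {"TP53", "BRCA1"}

# Knowledge expansion: seeds plus their direct interaction partners.
expanded = set(seeds)
for s in seeds:
    expanded.update(ppi.neighbors(s))

modules = greedy_modularity_communities(ppi.subgraph(expanded))
for i, m in enumerate(modules, 1):
    print(f"module {i}: {sorted(m)}")
```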

  20. Abstract Interfaces for Data Analysis: Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...

  1. Means and tools to identify future public concerns as a basis for developing information and public involvement strategies

    International Nuclear Information System (INIS)

    Pages, J.P.

    1996-01-01

    Many studies have been devoted to the debate about nuclear power and nuclear waste. What are these studies exactly and what issues do they raise? What information do they provide to decision-makers? Can they help in formulating policy about communication and decision-making in the nuclear sphere? Today, the public is considered as a privileged partner with whom constructive dialog is possible. It is no longer simply a question of providing information, but of rethinking decision-making processes: is an active participation by the public in such processes desirable, and is it possible? This change in approach is of concern to social science researchers: do the bases underlying the studies carried out throughout the world need to be reviewed? Radical social change modifies the context in which decisions are taken and information and communication developed. A comprehensive and historical analysis of such change identifies elements which have to be taken into account in forward planning. The new methods the latter employ with regard to decision-making and development are the reflection of the questioning of science and expertise and the calls for greater environment protection and for a more democratic process on the part of the public. But having noted that the context is changing, how may the future be envisaged? A whole range of instruments is available to complete this comprehensive analysis: surveys of attitudes and opinions, monographs on actor interplay and press analysis. The analyses which are undertaken from studies to work in the field make it possible to identify a number of principles of action which, in turn, allow in theory to envisage possible strategies. But clearly, the application of studies to concrete situations raise problems: are statements made by the public to be taken literally? Must account be taken of claims? How are the values expressed by the public to be incorporated in the decisions taken? In fact, a project's future always depends on

  2. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which PET is an estimate. The FAT-PET MRA is a meta regression analysis
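    The FAT-PET regression itself is compact: regress the reported estimates on their standard errors with precision weights, so the intercept estimates the true effect (PET) and the slope tests funnel asymmetry (FAT). A sketch with synthetic estimates:

```python
# FAT-PET meta-regression: beta_i = PET + FAT * se_i + e_i, weighted by 1/se^2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
se = rng.uniform(0.05, 0.5, 40)            # standard errors of the estimates
beta = 0.3 + 0.8 * se + rng.normal(0, se)  # estimates with built-in asymmetry

X = sm.add_constant(se)
fit = sm.WLS(beta, X, weights=1 / se**2).fit()
print(f"PET (intercept) = {fit.params[0]:.2f}, FAT (slope) t = {fit.tvalues[1]:.2f}")
```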

  3. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The speed of the calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
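    The per-component quantities such a code tabulates follow standard fuel-cycle arithmetic. As a worked example (with illustrative assay values, not FAST's scenarios), natural uranium feed and separative work for one tonne of enriched product are:

```python
# Feed and SWU for an enrichment step, using the standard value function.
import math

def V(x):
    """Separative potential value function."""
    return (1 - 2 * x) * math.log((1 - x) / x)

xp, xf, xt = 0.045, 0.00711, 0.0025  # product, natural feed, tails assays
P = 1000.0                           # kg of enriched product

F = P * (xp - xt) / (xf - xt)        # natural uranium feed (kg)
T = F - P                            # tails (kg)
swu = P * V(xp) + T * V(xt) - F * V(xf)
print(f"feed = {F:.0f} kg U, separative work = {swu:.0f} kg-SWU")
```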

  4. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. Due to its specific structure, the UML sequence diagram is selected for a deeper analysis of element layout. The authors' research examines the ability of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  5. SINEBase: a database and tool for SINE analysis.

    Science.gov (United States)

    Vassetzky, Nikita S; Kramerov, Dmitri A

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis.

  6. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
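    The outlier-detection approach described, robust distances from a high-breakdown covariance estimate, can be sketched in Python with scikit-learn's minimum covariance determinant; this illustrates the idea rather than the R package itself, and the data are simulated.

```python
# Flag outliers via robust Mahalanobis distances from an MCD fit.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))
X[:10] += 6  # plant ten gross outliers

mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)                    # squared robust distances
cut = chi2.ppf(0.975, df=X.shape[1])       # chi-squared cutoff
print("flagged outliers:", int((d2 > cut).sum()))
```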

  7. Integrative genomic analysis identifies isoleucine and CodY as regulators of Listeria monocytogenes virulence.

    Directory of Open Access Journals (Sweden)

    Lior Lobel

    2012-09-01

    Intracellular bacterial pathogens are metabolically adapted to grow within mammalian cells. While these adaptations are fundamental to the ability to cause disease, we know little about the relationship between the pathogen's metabolism and virulence. Here we used an integrative Metabolic Analysis Tool that combines transcriptome data with genome-scale metabolic models to define the metabolic requirements of Listeria monocytogenes during infection. Twelve metabolic pathways were identified as differentially active during L. monocytogenes growth in macrophage cells. Intracellular replication requires de novo synthesis of histidine, arginine, purine, and branched-chain amino acids (BCAAs), as well as catabolism of L-rhamnose and glycerol. The importance of each metabolic pathway during infection was confirmed by the generation of gene knockout mutants in the respective pathways. Next, we investigated the association of these metabolic requirements with the regulation of L. monocytogenes virulence. Here we show that limiting BCAA concentrations, primarily isoleucine, results in robust induction of the master virulence activator gene, prfA, and the PrfA-regulated genes. This response was specific and required the nutrient-responsive regulator CodY, which is known to bind isoleucine. Further analysis demonstrated that CodY is involved in prfA regulation, playing a role in prfA activation under limiting conditions of BCAAs. This study provides evidence for an additional regulatory mechanism underlying L. monocytogenes virulence, placing CodY at the crossroads of metabolism and virulence.

  8. Integrated sequence analysis pipeline provides one-stop solution for identifying disease-causing mutations.

    Science.gov (United States)

    Hu, Hao; Wienker, Thomas F; Musante, Luciana; Kalscheuer, Vera M; Kahrizi, Kimia; Najmabadi, Hossein; Ropers, H Hilger

    2014-12-01

    Next-generation sequencing has greatly accelerated the search for disease-causing defects, but even for experts the data analysis can be a major challenge. To facilitate the data processing in a clinical setting, we have developed a novel medical resequencing analysis pipeline (MERAP). MERAP assesses the quality of sequencing, and has optimized capacity for calling variants, including single-nucleotide variants, insertions and deletions, copy-number variation, and other structural variants. MERAP identifies polymorphic and known causal variants by filtering against public domain databases, and flags nonsynonymous and splice-site changes. MERAP uses a logistic model to estimate the causal likelihood of a given missense variant. MERAP considers the relevant information such as phenotype and interaction with known disease-causing genes. MERAP compares favorably with GATK, one of the widely used tools, because of its higher sensitivity for detecting indels, its easy installation, and its economical use of computational resources. Upon testing more than 1,200 individuals with mutations in known and novel disease genes, MERAP proved highly reliable, as illustrated here for five families with disease-causing variants. We believe that the clinical implementation of MERAP will expedite the diagnostic process of many disease-causing defects. © 2014 WILEY PERIODICALS, INC.

  9. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
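    The interval estimation idea behind the toolbox can be sketched with two small linear programs per flux: given the stoichiometry Sv = 0 and interval measurements on some fluxes, minimize and maximize each remaining flux. The toy network below is illustrative, not PFA Toolbox code.

```python
# Bound an unmeasured flux given Sv = 0 and interval measurements.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1.0, -1.0, -1.0]])  # one metabolite: v1 -> v2 + v3
bounds = [(0.9, 1.1), (0, None), (0.2, 0.4)]  # v1, v3 measured as intervals

for sense in (1, -1):  # min and max of v2
    c = np.zeros(3)
    c[1] = sense
    res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
    label = "min" if sense == 1 else "max"
    print(f"{label} v2 = {sense * res.fun:.2f}")
```

Here v2 = v1 - v3, so the program recovers the interval [0.5, 0.9] implied by the measurements.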

  10. A new tool for risk analysis and assessment in petrochemical plants

    Directory of Open Access Journals (Sweden)

    El-Arkam Mechhoud

    2016-09-01

    The aim of our work was the implementation of a new automated tool dedicated to risk analysis and assessment in petrochemical plants, based on a combination of two analysis methods: HAZOP (HAZard and OPerability) and FMEA (Failure Mode and Effect Analysis). Assessment of accident scenarios is also considered. The principal advantage of the two analysis methods is to speed up hazard identification and risk assessment and to forecast the nature and impact of such accidents. Plant parameters are analyzed under a graphical interface to facilitate the exploitation of our developed approach. This automated analysis brings out the different deviations of the operating parameters of any system in the plant. Possible causes of these deviations, their consequences and preventive actions are identified. The result is risk minimization and dependability enhancement of the considered system.
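    The FMEA half of such a tool reduces, at its core, to ranking failure modes by risk priority number (RPN = severity x occurrence x detection); a sketch with invented entries:

```python
# Rank failure modes by RPN so the highest-risk deviations surface first.
failure_modes = [
    # (failure mode, severity, occurrence, detection) on 1-10 scales
    ("pump seal leak",        8, 4, 3),
    ("pressure sensor drift", 5, 6, 7),
    ("valve stuck closed",    9, 2, 4),
]
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:<24} RPN = {s * o * d}")
```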

  11. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    Science.gov (United States)

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.

  12. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on the perceived user satisfaction of web analytics tools. The empirical study was carried out on employees from 200 Croatian firms in either the IT or the marketing branch. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  13. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Coated tools are regularly used in today's metal cutting industry, because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improvements in cutting economics and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate decrease as the friction coefficient decreases. In the present study, stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  14. Identifying and prioritizing the tools/techniques of knowledge management based on the Asian Productivity Organization Model (APO) to use in hospitals.

    Science.gov (United States)

    Khajouei, Hamid; Khajouei, Reza

    2017-12-01

    Appropriate knowledge, correct information, and relevant data are vital in medical diagnosis and treatment systems. Knowledge Management (KM), through its tools/techniques, provides a pertinent framework for decision-making in healthcare systems. The objective of this study was to identify and prioritize the KM tools/techniques that apply to the hospital setting. This is a descriptive survey study. Data were collected using a researcher-made questionnaire that was developed based on experts' opinions to select the appropriate tools/techniques from the 26 tools/techniques of the Asian Productivity Organization (APO) model. Questions were categorized into five steps of KM (identifying, creating, storing, sharing, and applying the knowledge) according to this model. The study population consisted of middle and senior managers of hospitals and managing directors of the Vice-Chancellor for Curative Affairs at Kerman University of Medical Sciences in Kerman, Iran. The data were analyzed in SPSS v.19 using the one-sample t-test. Twelve out of 26 tools/techniques of the APO model were identified as the tools applicable in hospitals. "Knowledge café" and "APO knowledge management assessment tool", with respective means of 4.23 and 3.7, were the most and the least applicable tools in the knowledge identification step. "Mentor-mentee scheme", as well as "voice and Voice over Internet Protocol (VOIP)", with respective means of 4.20 and 3.52, were the most and the least applicable tools/techniques in the knowledge creation step. "Knowledge café" and "voice and VOIP", with respective means of 3.85 and 3.42, were the most and the least applicable tools/techniques in the knowledge storage step. "Peer assist" and "voice and VOIP", with respective means of 4.14 and 3.38, were the most and the least applicable tools/techniques in the knowledge sharing step. Finally, "knowledge worker competency plan" and "knowledge portal", with respective means of 4.38 and 3.85, were the most and the least applicable tools

  15. Metrics for Identifying Food Security Status and the Population with Potential to Benefit from Nutrition Interventions in the Lives Saved Tool (LiST).

    Science.gov (United States)

    Jackson, Bianca D; Walker, Neff; Heidkamp, Rebecca

    2017-11-01

    Background: The Lives Saved Tool (LiST) uses the poverty head-count ratio at $1.90/d as a proxy for food security to identify the percentage of the population with the potential to benefit from balanced energy supplementation and complementary feeding (CF) interventions, following the approach used for the Lancet's 2008 series on Maternal and Child Undernutrition. Because much work has since been done on the development of food security indicators, a re-evaluation of the use of this indicator was warranted. Objective: The aim was to re-evaluate the use of the poverty head-count ratio at $1.90/d as the food security proxy indicator in LiST. Methods: We carried out a desk review to identify available indicators of food security. We identified 3 such indicators and compared them by using scatterplots, Spearman's correlations, and Bland-Altman plot analysis, and we generated LiST projections to compare the modeled impact results obtained with the different indicators. Results: Many food security indicators are available, but only 3 additional indicators met the data availability requirements to be used as the food security indicator in LiST. As expected, the analyzed food security indicators were significantly positively correlated. Conclusions: LiST will continue to use the food security indicators that were used in the meta-analyses that produced the effect estimates. These are the poverty head-count ratio at $1.90/d for CF interventions and the prevalence of a low body mass index in women of reproductive age for balanced energy supplementation interventions. © 2017 American Society for Nutrition.
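
    A minimal sketch of the two comparison methods named in the abstract, Spearman correlation and Bland-Altman analysis, applied to synthetic data; the two hypothetical country-level indicators and the sample size are assumptions, not the study's data.

        import numpy as np
        from scipy.stats import spearmanr

        # Synthetic indicator values for 40 hypothetical countries.
        rng = np.random.default_rng(0)
        poverty = rng.uniform(0, 60, 40)            # % below $1.90/d (synthetic)
        other = poverty + rng.normal(0, 8, 40)      # alternative indicator (synthetic)

        # Spearman rank correlation between the two indicators.
        rho, p = spearmanr(poverty, other)
        print(f"Spearman rho = {rho:.2f} (P = {p:.3g})")

        # Bland-Altman statistics: bias and 95% limits of agreement.
        diff = other - poverty
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        print(f"bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")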

  16. Development of a PubMed Based Search Tool for Identifying Sex and Gender Specific Health Literature.

    Science.gov (United States)

    Song, Michael M; Simonsen, Cheryl K; Wilson, Joanna D; Jenkins, Marjorie R

    2016-02-01

    An effective literature search strategy is critical to achieving the aims of Sex and Gender Specific Health (SGSH): to understand sex and gender differences through research and to effectively incorporate the new knowledge into the clinical decision-making process to benefit both male and female patients. The goal of this project was to develop and validate an SGSH literature search tool that is readily and freely available to clinical researchers and practitioners. PubMed, a freely available search engine for the Medline database, was selected as the platform on which to build the SGSH literature search tool. Combinations of Medical Subject Heading terms, text words, and title words were evaluated for optimal specificity and sensitivity, and the resulting key sex and gender terms and limits were bundled into a search tool to facilitate PubMed SGSH literature searches. The tool was then validated against reference bases compiled for two disease states, diabetes and stroke, retrieving 50 of 94 (53.2%) stroke and 62 of 95 (65.3%) diabetes reference articles. By comparison, a general keyword search of stroke or diabetes combined with sex difference retrieved 33 of 94 (35.1%) stroke and 22 of 95 (23.2%) diabetes reference-base articles, with lower sensitivity and specificity for SGSH content. The Texas Tech University Health Sciences Center SGSH PubMed Search Tool thus provides higher sensitivity and specificity for sex and gender specific health literature, and will facilitate research, clinical decision-making, and guideline development relevant to SGSH.
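
    The validation arithmetic is a recall (sensitivity) computation: the fraction of a hand-compiled reference base that the search strategy retrieves. The sketch below reproduces the reported stroke figure (50 of 94, 53.2%) with hypothetical PMIDs; the actual identifiers come from the study's reference bases.

        # Gold-standard reference base and search output (hypothetical PMIDs).
        reference_base = {f"PMID{i}" for i in range(94)}
        retrieved = {f"PMID{i}" for i in range(50)} | {"PMID999"}

        # Recall = relevant articles retrieved / all relevant articles.
        hits = retrieved & reference_base
        recall = len(hits) / len(reference_base)
        print(f"retrieved {len(hits)} of {len(reference_base)} ({recall:.1%})")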

  17. MetaFIND: A feature analysis tool for metabolomics data

    Directory of Open Access Journals (Sweden)

    Cunningham Pádraig

    2008-11-01

    Full Text Available Abstract Background Metabolomics, or metabonomics, refers to the quantitative analysis of all metabolites present within a biological sample and is generally carried out using NMR spectroscopy or mass spectrometry. Such analysis produces a set of peaks, or features, indicative of the metabolic composition of the sample, which may be used as a basis for sample classification. Feature selection may be employed to improve classification accuracy or to aid model explanation by establishing a subset of class-discriminating features. Factors such as experimental noise, choice of technique, and threshold selection may adversely affect the set of selected features retrieved. Furthermore, the high dimensionality and multi-collinearity inherent in metabolomics data may exacerbate discrepancies between the set of features retrieved and that required to provide a complete explanation of metabolite signatures. Given these issues, the latter in particular, we present the MetaFIND application for 'post-feature selection' correlation analysis.
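
    A minimal sketch of the idea behind such post-feature-selection correlation analysis: given a samples-by-peaks feature matrix and a subset of peaks chosen by a feature selector, find additional peaks strongly correlated with the selected ones, since collinear peaks may belong to the same metabolite signature. The synthetic data, the selected indices, and the 0.8 threshold are assumptions for illustration, not MetaFIND's internals.

        import numpy as np

        # Synthetic feature matrix: 30 samples x 8 spectral peaks.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(30, 8))
        X[:, 5] = X[:, 2] + rng.normal(0, 0.1, 30)  # peak 5 shadows peak 2
        selected = [2, 4]                 # indices returned by a feature selector

        # Peak-by-peak Pearson correlation matrix (columns as variables).
        corr = np.corrcoef(X, rowvar=False)
        for s in selected:
            related = [j for j in range(X.shape[1])
                       if j != s and abs(corr[s, j]) > 0.8]
            print(f"peak {s}: strongly correlated peaks {related}")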