WorldWideScience

Sample records for model key features

  1. Iris recognition based on key image feature extraction.

    Science.gov (United States)

    Ren, X; Tian, Q; Zhang, J; Wu, S; Zeng, Y

    2008-01-01

    In iris recognition, feature extraction can be influenced by factors such as illumination and contrast, and thus the features extracted may be unreliable, which can cause a high rate of false results in iris pattern recognition. In order to obtain stable features, an algorithm was proposed in this paper to extract key features of a pattern from multiple images. The proposed algorithm built an iris feature template by extracting key features and performed iris identity enrolment. Simulation results showed that the selected key features have high recognition accuracy on the CASIA Iris Set, where both contrast and illumination variance exist.

  2. Key Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen 2003) developed as a result of ten years of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process.

  3. Key-Feature-Probleme zum Prüfen von prozeduralem Wissen: Ein Praxisleitfaden [Key Feature Problems for the assessment of procedural knowledge: a practical guide]

    Directory of Open Access Journals (Sweden)

    Kopp, Veronika

    2006-08-01

    Full Text Available [english] After assigning the different examination formats to the various levels of Miller's pyramid of knowledge, this paper provides a short presentation of the key feature approach, giving a definition and an example for clarification. Afterwards, a practical guide to writing key feature problems is given, consisting of the following steps: define the domain, choose a clinical situation, define the key features, develop a test case scenario, write questions, select a preferred response format, define the scoring key, and validate the content. Finally, we present the evaluation results of this practical guide, gathered during a key feature workshop. In sum, the participants were very pleased with it and reported having learned a great deal; the differences between their self-assessed knowledge of key features before and after the workshop were significant. The key feature approach is an innovative tool for assessing clinical decision-making skills, including in electronic examinations. Substituting the long-menu format for the write-in format allows automatic data analysis.

  4. Feature and Meta-Models in Clafer: Mixed, Specialized, and Coupled

    DEFF Research Database (Denmark)

    Bąk, Kacper; Czarnecki, Krzysztof; Wasowski, Andrzej

    2011-01-01

    constraints (such as mapping feature configurations to component configurations or model templates). Clafer also allows arranging models into multiple specialization and extension layers via constraints and inheritance. We identify four key mechanisms allowing a meta-modeling language to express feature...

  5. The feature-weighted receptive field: an interpretable encoding model for complex feature spaces.

    Science.gov (United States)

    St-Yves, Ghislain; Naselaris, Thomas

    2017-06-20

    We introduce the feature-weighted receptive field (fwRF), an encoding model designed to balance expressiveness, interpretability and scalability. The fwRF is organized around the notion of a feature map-a transformation of visual stimuli into visual features that preserves the topology of visual space (but not necessarily the native resolution of the stimulus). The key assumption of the fwRF model is that activity in each voxel encodes variation in a spatially localized region across multiple feature maps. This region is fixed for all feature maps; however, the contribution of each feature map to voxel activity is weighted. Thus, the model has two separable sets of parameters: "where" parameters that characterize the location and extent of pooling over visual features, and "what" parameters that characterize tuning to visual features. The "where" parameters are analogous to classical receptive fields, while "what" parameters are analogous to classical tuning functions. By treating these as separable parameters, the fwRF model complexity is independent of the resolution of the underlying feature maps. This makes it possible to estimate models with thousands of high-resolution feature maps from relatively small amounts of data. Once a fwRF model has been estimated from data, spatial pooling and feature tuning can be read-off directly with no (or very little) additional post-processing or in-silico experimentation. We describe an optimization algorithm for estimating fwRF models from data acquired during standard visual neuroimaging experiments. We then demonstrate the model's application to two distinct sets of features: Gabor wavelets and features supplied by a deep convolutional neural network. We show that when Gabor feature maps are used, the fwRF model recovers receptive fields and spatial frequency tuning functions consistent with known organizational principles of the visual cortex. We also show that a fwRF model can be used to regress entire deep
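    The separable "where"/"what" structure described above can be made concrete with a small numerical sketch: one voxel, a shared Gaussian pooling field over visual space, and one weight per feature map. All sizes, weights, and function names below are illustrative assumptions, not taken from the paper, and the feature maps are given a common resolution for simplicity.

```python
import numpy as np

def gaussian_pool(size, cx, cy, sigma):
    """'Where' parameters: a Gaussian pooling field over visual space,
    normalized to sum to 1. Shared by all feature maps."""
    ys, xs = np.mgrid[0:size, 0:size]
    g = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def fwrf_predict(feature_maps, cx, cy, sigma, weights):
    """Predict voxel activity: pool each feature map over the shared
    spatial receptive field, then take a weighted sum ('what' parameters)."""
    g = gaussian_pool(feature_maps[0].shape[0], cx, cy, sigma)
    return float(sum(w * np.sum(g * fmap)
                     for fmap, w in zip(feature_maps, weights)))

rng = np.random.default_rng(0)
maps = [rng.standard_normal((16, 16)) for _ in range(3)]  # hypothetical feature maps
activity = fwrf_predict(maps, cx=8.0, cy=8.0, sigma=2.0, weights=[0.5, -0.2, 1.0])
```

    Because the pooling field is shared across maps, adding a feature map adds only a single weight parameter, which is one way to see why the model's complexity stays independent of the resolution of the underlying feature maps.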

  6. Cemento-osseous dysplasia of the jaw bones: key radiographic features.

    Science.gov (United States)

    Alsufyani, N A; Lam, E W N

    2011-03-01

    The purpose of this study is to assess possible diagnostic differences between general dentists (GPs) and oral and maxillofacial radiologists (RGs) in the identification of pathognomonic radiographic features of cemento-osseous dysplasia (COD) and its interpretation. Using a systematic objective survey instrument, 3 RGs and 3 GPs reviewed 50 image sets of COD and similarly appearing entities (dense bone island, cementoblastoma, cemento-ossifying fibroma, fibrous dysplasia, complex odontoma and sclerosing osteitis). Participants were asked to identify the presence or absence of radiographic features and then to make an interpretation of the images. RGs identified a well-defined border (odds ratio (OR) 6.67, P < 0.05); radiolucent periphery (OR 8.28, P < 0.005); bilateral occurrence (OR 10.23, P < 0.01); mixed radiolucent/radiopaque internal structure (OR 10.53, P < 0.01); the absence of non-concentric bony expansion (OR 7.63, P < 0.05); and the association with anterior and posterior teeth (OR 4.43, P < 0.05) as key features of COD. Consequently, RGs were able to correctly interpret 79.3% of COD cases. In contrast, GPs identified the absence of root resorption (OR 4.52, P < 0.05) and the association with anterior and posterior teeth (OR 3.22, P = 0.005) as the only key features of COD and were able to correctly interpret 38.7% of COD cases. There are statistically significant differences between RGs and GPs in the identification and interpretation of the radiographic features associated with COD (P < 0.001). We conclude that COD is radiographically discernable from other similarly appearing entities only if the characteristic radiographic features are correctly identified and then correctly interpreted.
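    The odds ratios reported above quantify how much more likely one reviewer group was to identify a given radiographic feature than the other. As a reminder of the computation, here is a minimal sketch of an odds ratio from a 2x2 contingency table; the counts are hypothetical and not taken from the study.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
                      feature identified | not identified
        group 1 (RG)         a          |       b
        group 2 (GP)         c          |       d
    Returns (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

# Hypothetical counts for one radiographic feature (not the study's data):
# RG odds = 40/10 = 4.0, GP odds = 20/30 ~= 0.67
example_or = odds_ratio(40, 10, 20, 30)
```

    An odds ratio above 1 with a significant P value, as reported for the radiologists, indicates that group membership is associated with identifying the feature.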

  7. Research on Digital Product Modeling Key Technologies of Digital Manufacturing

    Institute of Scientific and Technical Information of China (English)

    DING Guoping; ZHOU Zude; HU Yefa; ZHAO Liang

    2006-01-01

    With the globalization and diversification of the market and the rapid development of information technology (IT) and artificial intelligence (AI), a digital revolution in manufacturing is under way. One of the key technologies in digital manufacturing is digital product modeling. This paper first analyzes the information and features of the digital product model at each stage of the product's whole lifecycle, then examines three critical technologies of digital modeling in digital manufacturing: product modeling, the standard for the exchange of product model data, and digital product data management. The significance of the digital product model in digital manufacturing is then summarized: the model integrates the primary features of each stage of the product's whole lifecycle on the basis of graphic features, applies STEP as the data exchange mechanism, and establishes a PDM system to manage the large volume of complicated and dynamic product data, thereby implementing the exchange, sharing and integration of digital product model data.

  8. Stego Keys Performance on Feature Based Coding Method in Text Domain

    Directory of Open Access Journals (Sweden)

    Din Roshidi

    2017-01-01

    Full Text Available A critical factor in the embedding process of any text steganography method is the key used, known as the stego key. This factor influences whether the embedding process succeeds in hiding a message from a third party or any adversary. One important aspect of the embedding process is the fitness performance of the stego key, for which three parameters have been identified: capacity ratio, embedded fitness ratio and saving space ratio. The better the capacity ratio, embedded fitness ratio and saving space ratio a stego key offers, the more message content can be hidden. The main objective of this paper is therefore to analyze the stego keys of three feature-based coding methods, namely CALP, VERT and QUAD, in terms of their capacity ratio, embedded fitness ratio and saving space ratio. It is found that the CALP method performs well compared to the VERT and QUAD methods.

  9. Feature inference with uncertain categorization: Re-assessing Anderson's rational model.

    Science.gov (United States)

    Konovalova, Elizaveta; Le Mens, Gaël

    2017-09-18

    A key function of categories is to support predictions about unobserved features of objects. At the same time, humans are often in situations where the categories of the objects they perceive are uncertain. In an influential paper, Anderson (Psychological Review, 98(3), 409-429, 1991) proposed a rational model for feature inferences with uncertain categorization. A crucial feature of this model is the conditional independence assumption: it assumes that the within-category feature correlation is zero. In prior research, this model has been found to provide a poor fit to participants' inferences, but that evidence is restricted to task environments inconsistent with the conditional independence assumption. Currently available evidence thus provides little information about how the model would fit participants' inferences in a setting with conditional independence. In four experiments based on a novel paradigm and one experiment based on an existing paradigm, we assess the performance of Anderson's model under conditional independence. We find that this model predicts participants' inferences better than two competing models: one assumes that inferences are based on just the most likely category; the other is insensitive to categories but sensitive to overall feature correlation. The performance of Anderson's model is evidence that inferences were influenced not only by the more likely category but also by the other candidate category. Our findings suggest that a version of Anderson's model that relaxes the conditional independence assumption will likely perform well in environments characterized by within-category feature correlation.
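    The contrast between Anderson's rational model and the single-category competitor can be made concrete with a toy calculation. The probabilities below are invented for illustration only.

```python
# P(category | observed features) for two candidate categories.
p_cat = {"A": 0.7, "B": 0.3}
# P(target feature | category). Under the conditional independence
# assumption these per-category probabilities can be used directly.
p_feat_given_cat = {"A": 0.9, "B": 0.2}

# Anderson's rational model: marginalize over all candidate categories.
p_anderson = sum(p_cat[k] * p_feat_given_cat[k] for k in p_cat)

# Competing model: condition on the single most likely category only.
best = max(p_cat, key=p_cat.get)
p_single = p_feat_given_cat[best]
```

    Whenever the two candidate categories disagree about the feature (here 0.9 vs. 0.2), the two models make different predictions, which is what allows the experiments to discriminate between them.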

  10. The Progressive BSSG Rat Model of Parkinson's: Recapitulating Multiple Key Features of the Human Disease.

    Directory of Open Access Journals (Sweden)

    Jackalina M Van Kampen

    Full Text Available The development of effective neuroprotective therapies for Parkinson's disease (PD) has been severely hindered by the notable lack of an appropriate animal model for preclinical screening. Indeed, most models currently available are either acute in nature or fail to recapitulate all characteristic features of the disease. Here, we present a novel progressive model of PD, with behavioural and cellular features that closely approximate those observed in patients. Chronic exposure to dietary phytosterol glucosides has been found to be neurotoxic. When fed to rats, β-sitosterol β-d-glucoside (BSSG) triggers the progressive development of parkinsonism, with clinical signs and histopathology beginning to appear following cessation of exposure to the neurotoxic insult and continuing to develop over several months. Here, we characterize the progressive nature of this model, its non-motor features, the anatomical spread of synucleinopathy, and response to levodopa administration. In Sprague Dawley rats, chronic BSSG feeding for 4 months triggered the progressive development of a parkinsonian phenotype and pathological events that evolved slowly over time, with neuronal loss beginning only after toxin exposure was terminated. At approximately 3 months following initiation of BSSG exposure, animals displayed the early emergence of an olfactory deficit, in the absence of significant dopaminergic nigral cell loss or locomotor deficits. Locomotor deficits developed gradually over time, initially appearing as locomotor asymmetry and developing into akinesia/bradykinesia, which was reversed by levodopa treatment. Late-stage cognitive impairment was observed in the form of spatial working memory deficits, as assessed by the radial arm maze. In addition to the progressive loss of TH+ cells in the substantia nigra, the appearance of proteinase K-resistant intracellular α-synuclein aggregates was also observed to develop progressively, appearing first in the

  11. Salient Key Features of Actual English Instructional Practices in Saudi Arabia

    Science.gov (United States)

    Al-Seghayer, Khalid

    2015-01-01

    This is a comprehensive review of the salient key features of the actual English instructional practices in Saudi Arabia. The goal of this work is to gain insights into the practices and pedagogic approaches to English as a foreign language (EFL) teaching currently employed in this country. In particular, we identify the following central features…

  12. Key Features of Political Advertising as an Independent Type of Advertising Communication

    Directory of Open Access Journals (Sweden)

    Svetlana Anatolyevna Chubay

    2015-09-01

    Full Text Available To obtain the most complete understanding of political advertising, the author characterizes the specific features identified by modern researchers. The problem of defining the notion of political advertising is studied in detail. The analysis of definitions available in the professional literature has allowed the author to identify a number of key features that characterize political advertising as an independent type of promotional activity. These features include: belonging to the forms of mass communication implemented through different communication channels; the presence of characteristics typical of any advertising as a form of mass communication (strategies and concepts for promoting programs and ideas); an integrated, audience-focused approach to the selection of communication channels and the means and methods of informing addressees; the formation of a psychological attitude toward voting; its image-based nature; and its manipulative potential. It is shown that influence is the primary function of political advertising and determines the key characteristics common to this type of advertising. Political advertising, reflecting the essence of the political platform of certain political forces, priming voters for their support, and forming and introducing into the mass consciousness a definite idea of the character of these political forces, creates the desired psychological attitude toward voting. The analysis also allowed the author to formulate an operational definition of political advertising that includes the features distinguishing it from other forms of political communication, such as political PR, with which it is traditionally conflated.

  13. A Co-modeling Method Based on Component Features for Mechatronic Devices in Aero-engines

    Science.gov (United States)

    Wang, Bin; Zhao, Haocen; Ye, Zhifeng

    2017-08-01

    Data-fused and user-friendly design of aero-engine accessories is required because of their structural complexity and stringent reliability requirements. This paper gives an overview of a typical aero-engine control system and the development process of the key mechatronic devices used. Several essential aspects of modeling and simulation in this process are investigated. Considering the limitations of a single theoretical model, a feature-based co-modeling methodology is suggested to satisfy the design requirements and accommodate the diversity of component sub-models for these devices. As an example, a stepper-motor-controlled Fuel Metering Unit (FMU) is modeled in terms of its components' physical features using two different software tools, and an interface is suggested to integrate the single-discipline models into a synthesized one. Performance simulation of this device using the co-model and parameter optimization for its key components are discussed. Comparison between delivery testing and the simulation shows that the co-model of the FMU has high accuracy and a clear advantage over a single model. Together with its interface compatible with the engine mathematical model, the feature-based co-modeling methodology proves to be an effective technical measure in the development process of the device.

  14. Some key features in the evolution of self psychology and psychoanalysis.

    Science.gov (United States)

    Fosshage, James L

    2009-04-01

    Psychoanalysis, as every science and its application, has continued to evolve over the past century, especially accelerating over the last 30 years. Self psychology has played a constitutive role in that evolution and has continued to change itself. These movements have been supported and augmented by a wide range of emergent research and theory, especially that of cognitive psychology, infant and attachment research, rapid eye movement and dream research, psychotherapy research, and neuroscience. I present schematically some of what I consider to be the key features of the evolution of self psychology and their interconnection with that of psychoanalysis at large, including the revolutionary paradigm changes, the new epistemology, listening/experiencing perspectives, from narcissism to the development of the self, the new organization model of transference, the new organization model of dreams, and the implicit and explicit dimensions of analytic work. I conclude with a focus on the radical ongoing extension of the analyst's participation in the analytic relationship, using, as an example, the co-creation of analytic love, and providing several brief clinical illustrations. The leading edge question guiding my discussion is "How does analytic change occur?"

  15. Extraction and representation of common feature from uncertain facial expressions with cloud model.

    Science.gov (United States)

    Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing

    2017-12-01

    Human facial expressions are key ingredients for conveying an individual's innate emotions in communication. However, the variation of facial expressions affects the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is established using cloud generators. With a forward cloud generator, facial expression images can be re-generated as many times as desired to visually represent the three extracted features, each of which plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression database. Three common features are extracted from seven facial expression images. Finally, conclusions and remarks are given.

  16. The key design features of the Indian advanced heavy water reactor

    International Nuclear Information System (INIS)

    Sinha, R.K.; Kakodkar, A.; Anand, A.K.; Venkat Raj, V.; Balakrishnan, K.

    1999-01-01

    The 235 MWe Indian Advanced Heavy Water Reactor (AHWR) is a vertical, pressure tube type, boiling light water cooled reactor. The three key specific features of design of the AHWR, having a large impact on its viability, safety and economics, relate to its reactor physics, coolant channel, and passive safety features. The reactor physics design is tuned for maximising use of thorium based fuel, and achieving a slightly negative void coefficient of reactivity. The fulfilment of these requirements has been possible through use of PuO2-ThO2 MOX and ThO2-U233O2 MOX in different pins of the same fuel cluster, and use of a heterogeneous moderator consisting of pyrolytic carbon and heavy water in 80%-20% volume ratio. The coolant channels of AHWR are designed for easy replaceability of pressure tubes, during normal maintenance shutdowns. The removal of the pressure tube along with the bottom end-fitting, using rolled joint detachment technology, can be done in AHWR coolant channels without disturbing the top end-fitting, tail pipe and feeder connections, and all other appendages of the coolant channel. The AHWR incorporates several passive safety features. These include core heat removal through natural circulation, direct injection of Emergency Core Coolant System (ECCS) water in fuel, passive systems for containment cooling and isolation, and availability of a large inventory of borated water in an overhead Gravity Driven Water Pool (GDWP) to facilitate sustenance of core decay heat removal, ECCS injection, and containment cooling for three days without invoking any active systems or operator action. Incorporation of these features has been done together with considerable design simplifications, and elimination of several reactor grade equipment items. A rigorous evaluation of the feasibility of the AHWR design concept has been completed. The economy enhancing aspects of its key design features are expected to compensate for the relative complexity of the thorium fuel cycle activities.

  17. Key Features of Electric Vehicle Diffusion and Its Impact on the Korean Power Market

    Directory of Open Access Journals (Sweden)

    Dongnyok Shim

    2018-06-01

    Full Text Available The market share of electric vehicles is growing and the interest in these vehicles is rapidly increasing in industrialized countries. In the light of these circumstances, this study provides an integrated policy-making package, which includes key features for electric vehicle diffusion and its impact on the Korean power market. This research is based on a quantitative analysis with the following steps: (1) it analyzes drivers’ preferences for electric or traditional internal combustion engine (ICE) vehicles with respect to key automobile attributes, and these key attributes indicate what policy makers should focus on; (2) it forecasts the achievable level of market share of electric vehicles in relation to improvements in their key attributes; and (3) it evaluates the impact of electric vehicle diffusion on the Korean power market based on an achievable level of market share with different charging demand profiles. Our results reveal the market share of electric vehicles can increase to around 40% of the total market share if the key features of electric vehicles reach a similar level to those of traditional vehicles. In this estimation, an increase in the power market’s system generation costs will reach around 10% of the cost in the baseline scenario, which differs slightly depending on charging demand profiles.

  18. Overall Design Features and Key Technology Development for KJRR

    Energy Technology Data Exchange (ETDEWEB)

    Park, C.; Lee, B. C.; Ryu, J. S.; Kim, Y. K. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The KJRR (Ki-Jang Research Reactor) project was launched in April 2012: 1) to build up advanced technology related to research reactors (RRs), 2) to achieve self-sufficiency in the supply of medical and industrial radioisotopes (RI), and 3) to enlarge the NTD silicon doping services for the growing power device industry. The major facilities to be built through the KJRR project are: • a 15 MW research reactor and reactor building; • a Radioisotope Production Facility (RIPF) and related R and D facility; • a Fission Mo Production Facility (FMPF) with LEU targets; • a Radio-waste Treatment Facility (RTF); • neutron irradiation facilities such as PTS and HTS. This paper describes the overall design features of the KJRR and the key technology development for RRs during the project. The overall design features of the KJRR and the RR technology under development are overviewed. The design of the KJRR will comply with Korean nuclear law, regulatory requirements and guidelines, as well as international standards and guidelines. The KJRR is expected to be put into operation in the middle of 2019.

  19. Key clinical features to identify girls with CDKL5 mutations.

    Science.gov (United States)

    Bahi-Buisson, Nadia; Nectoux, Juliette; Rosas-Vargas, Haydeé; Milh, Mathieu; Boddaert, Nathalie; Girard, Benoit; Cances, Claude; Ville, Dorothée; Afenjar, Alexandra; Rio, Marlène; Héron, Delphine; N'guyen Morel, Marie Ange; Arzimanoglou, Alexis; Philippe, Christophe; Jonveaux, Philippe; Chelly, Jamel; Bienvenu, Thierry

    2008-10-01

    Mutations in the human X-linked cyclin-dependent kinase-like 5 (CDKL5) gene have been shown to cause infantile spasms as well as a Rett syndrome (RTT)-like phenotype. To date, fewer than 25 different mutations have been reported, and there are still few data on the key clinical diagnostic criteria and on the natural history of CDKL5-associated encephalopathy. We screened the entire coding region of CDKL5 for mutations in 183 females with encephalopathy with early seizures by denaturing high performance liquid chromatography and direct sequencing, and we identified, in 20 unrelated girls, 18 different mutations including 7 novel mutations. These mutations were identified in eight patients with encephalopathy with RTT-like features, five with infantile spasms and seven with encephalopathy with refractory epilepsy. Early epilepsy with normal interictal EEG and severe hypotonia are the key clinical features identifying patients likely to have CDKL5 mutations. Our study also indicates that these patients clearly exhibit some RTT features, such as deceleration of head growth, stereotypies and hand apraxia, and that these RTT features become more evident in older and ambulatory patients. However, some RTT signs are clearly absent, such as the so-called RTT disease profile (a period of nearly normal development followed by regression with loss of acquired fine finger skills in early childhood and characteristic intensive eye communication) and the characteristic evolution of the RTT electroencephalogram. Interestingly, in addition to the overall stereotypical symptomatology (age of onset and evolution of the disease) resulting from CDKL5 mutations, atypical forms of CDKL5-related conditions have also been observed. Our data suggest that phenotypic heterogeneity does not correlate with the nature or the position of the mutations or with the pattern of X-chromosome inactivation, but most probably with the functional transcriptional and/or translational consequences of CDKL5 mutations.

  20. Multiple Information Fusion Face Recognition Using Key Feature Points

    Directory of Open Access Journals (Sweden)

    LIN Kezheng

    2017-06-01

    Full Text Available After years of face recognition research, effects such as illumination and noise have kept recognition rates relatively low, and 2D face recognition technology has not kept pace with the state of the art. Although 3D face recognition technology is developing step by step, it has higher complexity. To address this problem, based on the traditional depth-information positioning method and local feature analysis (LFA), this paper proposes an improved localization algorithm for key feature points of 3D faces and, on the basis of training samples obtained by complete clustering, further proposes a weighted-fusion global and local feature extraction algorithm. Comparison and analysis of experimental data from the FRGC and BU-3DFE face databases show that the method achieves higher robustness in 3D face recognition.

  1. Modelling energy demand of developing countries: Are the specific features adequately captured?

    International Nuclear Information System (INIS)

    Bhattacharyya, Subhes C.; Timilsina, Govinda R.

    2010-01-01

    This paper critically reviews existing energy demand forecasting methodologies highlighting the methodological diversities and developments over the past four decades in order to investigate whether the existing energy demand models are appropriate for capturing the specific features of developing countries. The study finds that two types of approaches, econometric and end-use accounting, are commonly used in the existing energy demand models. Although energy demand models have greatly evolved since the early seventies, key issues such as the poor-rich and urban-rural divides, traditional energy resources and differentiation between commercial and non-commercial energy commodities are often poorly reflected in these models. While the end-use energy accounting models with detailed sectoral representations produce more realistic projections as compared to the econometric models, they still suffer from huge data deficiencies especially in developing countries. Development and maintenance of more detailed energy databases, further development of models to better reflect developing country context and institutionalizing the modelling capacity in developing countries are the key requirements for energy demand modelling to deliver richer and more reliable input to policy formulation in developing countries.

  2. Modelling energy demand of developing countries: Are the specific features adequately captured?

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharyya, Subhes C. [CEPMLP, University of Dundee, Dundee DD1 4HN (United Kingdom); Timilsina, Govinda R. [Development Research Group, The World Bank, Washington DC (United States)

    2010-04-15

    This paper critically reviews existing energy demand forecasting methodologies highlighting the methodological diversities and developments over the past four decades in order to investigate whether the existing energy demand models are appropriate for capturing the specific features of developing countries. The study finds that two types of approaches, econometric and end-use accounting, are commonly used in the existing energy demand models. Although energy demand models have greatly evolved since the early seventies, key issues such as the poor-rich and urban-rural divides, traditional energy resources and differentiation between commercial and non-commercial energy commodities are often poorly reflected in these models. While the end-use energy accounting models with detailed sectoral representations produce more realistic projections as compared to the econometric models, they still suffer from huge data deficiencies especially in developing countries. Development and maintenance of more detailed energy databases, further development of models to better reflect developing country context and institutionalizing the modelling capacity in developing countries are the key requirements for energy demand modelling to deliver richer and more reliable input to policy formulation in developing countries. (author)
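The end-use accounting approach discussed above can be sketched as sector activity times energy intensity, each projected with its own growth rate. All sector names and figures below are hypothetical, purely for illustration:

```python
# Minimal end-use accounting projection: demand = sum over sectors of
# (activity level x energy intensity), each driven by its own growth rate.
def project_demand(sectors, years):
    """sectors: {name: (activity, intensity, activity_growth, intensity_growth)}"""
    out = {}
    for year in range(years + 1):
        total = 0.0
        for name, (act, inten, g_act, g_int) in sectors.items():
            total += act * (1 + g_act) ** year * inten * (1 + g_int) ** year
        out[year] = total
    return out

sectors = {
    "residential": (100.0, 0.5, 0.03, -0.01),  # hypothetical values
    "industry":    (200.0, 1.2, 0.04, -0.02),  # intensity falls, activity grows
}
demand = project_demand(sectors, 10)
```

A richer model would split the residential sector by urban/rural and commercial/non-commercial fuels, which is exactly the disaggregation the review finds lacking.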

  3. Innovations in individual feature history management - The significance of feature-based temporal model

    Science.gov (United States)

    Choi, J.; Seong, J.C.; Kim, B.; Usery, E.L.

    2008-01-01

    A feature relies on three dimensions (space, theme, and time) for its representation. Even though spatiotemporal models have been proposed, they have principally focused on the spatial changes of a feature. In this paper, a feature-based temporal model is proposed to represent changes of both space and theme independently. The proposed model modifies the ISO's temporal schema and adds a new explicit temporal relationship structure that stores temporal topological relationships among the ISO's temporal primitives of a feature, in order to keep track of feature history. The explicit temporal relationship can enhance query performance on feature history by removing topological comparisons during query processing. Further, a prototype system has been developed to test the proposed feature-based temporal model by querying land parcel history in Athens, Georgia. The result of a temporal query on individual feature history shows the efficiency of the explicit temporal relationship structure. © Springer Science+Business Media, LLC 2007.
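A minimal sketch of the explicit-temporal-relationship idea (not the authors' schema): each feature version stores its valid period plus a direct predecessor link, so a history query walks stored links instead of comparing time intervals. All class and field names are illustrative:

```python
from dataclasses import dataclass

# Hypothetical sketch: each feature version carries an ISO-19108-style valid
# period, and an explicit "predecessor" link records the temporal topology so
# history queries avoid recomputing interval comparisons.
@dataclass
class FeatureVersion:
    fid: str
    valid_from: int          # e.g. year; a real model would use timestamps
    valid_to: int
    predecessor: "FeatureVersion | None" = None

def history(version):
    """Walk explicit predecessor links instead of comparing intervals."""
    chain = []
    while version is not None:
        chain.append((version.fid, version.valid_from, version.valid_to))
        version = version.predecessor
    return list(reversed(chain))

# A land parcel split and renamed twice:
v1 = FeatureVersion("parcel-7", 1990, 2001)
v2 = FeatureVersion("parcel-7a", 2001, 2008, predecessor=v1)
v3 = FeatureVersion("parcel-7b", 2008, 2024, predecessor=v2)
```

The point of the explicit structure is visible in `history`: retrieval is a pointer walk, not a topological comparison over all versions.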

  4. Model plant Key Measurement Points

    International Nuclear Information System (INIS)

    Schneider, R.A.

    1984-01-01

    For IAEA safeguards, a Key Measurement Point is defined as the location where nuclear material appears in such a form that it may be measured to determine material flow or inventory. This presentation describes, in an introductory manner, the key measurement points and associated measurements for the model plant used in this training course.

  5. Safety analysis for key design features of KALIMER-600 design concept

    International Nuclear Information System (INIS)

    Lee, Yong-Bum; Kwon, Y. M.; Kim, E. K.; Suk, S. D.; Chang, W. P.; Joeng, H. Y.; Ha, K. S.; Heo, S.

    2005-03-01

    KAERI is developing the conceptual design of a Liquid Metal Reactor, KALIMER-600 (Korea Advanced LIquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER-600 addresses key issues for future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, key safety design features are described, and safety analysis results for typical ATWS accidents, containment design basis accidents, and flow blockages in the KALIMER design are presented. First, the basic approach to achieving the safety goal and the main design features of KALIMER-600 are introduced in Chapter 1, and the event categorization and acceptance criteria for the KALIMER-600 safety analysis are described in Chapter 2. In Chapter 3, results of inherent safety evaluations for the KALIMER-600 conceptual design are presented. The KALIMER-600 core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed using the SSC-K code to investigate the KALIMER-600 system response to the events. The objectives of Chapter 4 are to assess the response of the KALIMER-600 containment to the design basis accidents and to evaluate whether the consequences are acceptable with respect to structural integrity and exposure dose rate. In Chapter 5, the analysis of flow blockage for KALIMER-600 with the MATRA-LMR-FB code, which has been developed for internal flow blockage in an LMR subassembly, is described. Cases with blockages of 6, 24, and 54 subchannels are analyzed.

  6. Discrete Feature Model (DFM) User Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Geier, Joel (Clearwater Hardrock Consulting, Corvallis, OR (United States))

    2008-06-15

    This manual describes the Discrete-Feature Model (DFM) software package for modelling groundwater flow and solute transport in networks of discrete features. A discrete-feature conceptual model represents fractures and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which is usually treated as impermeable. This approximation may be valid for crystalline rocks such as granite or basalt, which have very low permeability if macroscopic fractures are excluded. A discrete feature is any entity that can conduct water and permit solute transport through bedrock, and can be reasonably represented as a piecewise-planar conductor. Examples of such entities may include individual natural fractures (joints or faults), fracture zones, and disturbed-zone features around tunnels (e.g. blasting-induced fractures or stress-concentration induced 'onion skin' fractures around underground openings). In a more abstract sense, the effectively discontinuous nature of pathways through fractured crystalline bedrock may be idealized as discrete, equivalent transmissive features that reproduce large-scale observations, even if the details of connective paths (and unconnected domains) are not precisely known. A discrete-feature model explicitly represents the fundamentally discontinuous and irregularly connected nature of such systems, by constraining flow and transport to occur only within such features and their intersections. Pathways for flow and solute transport in this conceptualization are a consequence not just of the boundary conditions and hydrologic properties (as with continuum models), but also of the irregularity of connections between conductive/transmissive features. The DFM software package described here is an extensible code for investigating problems of flow and transport in geological (natural or human-altered) systems that can be characterized effectively in terms of discrete features.

  7. Discrete Feature Model (DFM) User Documentation

    International Nuclear Information System (INIS)

    Geier, Joel

    2008-06-01

    This manual describes the Discrete-Feature Model (DFM) software package for modelling groundwater flow and solute transport in networks of discrete features. A discrete-feature conceptual model represents fractures and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which is usually treated as impermeable. This approximation may be valid for crystalline rocks such as granite or basalt, which have very low permeability if macroscopic fractures are excluded. A discrete feature is any entity that can conduct water and permit solute transport through bedrock, and can be reasonably represented as a piecewise-planar conductor. Examples of such entities may include individual natural fractures (joints or faults), fracture zones, and disturbed-zone features around tunnels (e.g. blasting-induced fractures or stress-concentration induced 'onion skin' fractures around underground openings). In a more abstract sense, the effectively discontinuous nature of pathways through fractured crystalline bedrock may be idealized as discrete, equivalent transmissive features that reproduce large-scale observations, even if the details of connective paths (and unconnected domains) are not precisely known. A discrete-feature model explicitly represents the fundamentally discontinuous and irregularly connected nature of such systems, by constraining flow and transport to occur only within such features and their intersections. Pathways for flow and solute transport in this conceptualization are a consequence not just of the boundary conditions and hydrologic properties (as with continuum models), but also of the irregularity of connections between conductive/transmissive features. The DFM software package described here is an extensible code for investigating problems of flow and transport in geological (natural or human-altered) systems that can be characterized effectively in terms of discrete features.
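On a network of discrete conductors, the steady-flow computation the manual describes reduces to a linear nodal balance. The sketch below (not the DFM code itself) solves hydraulic heads on a toy network with fixed-head boundary nodes; all conductance values are illustrative:

```python
import numpy as np

# Steady flow through a network of discrete conductors reduces to a nodal
# balance C h = 0, where C is a conductance Laplacian; fixed-head boundary
# nodes are eliminated, moving their contribution to the right-hand side.
def solve_heads(n_nodes, conductors, fixed):
    """conductors: [(i, j, conductance)]; fixed: {node: head}."""
    C = np.zeros((n_nodes, n_nodes))
    for i, j, c in conductors:
        C[i, i] += c; C[j, j] += c
        C[i, j] -= c; C[j, i] -= c
    free = [n for n in range(n_nodes) if n not in fixed]
    b = np.zeros(len(free))
    for k, n in enumerate(free):
        for m, h_fixed in fixed.items():
            b[k] -= C[n, m] * h_fixed   # move known heads to the RHS
    h = np.zeros(n_nodes)
    for m, h_fixed in fixed.items():
        h[m] = h_fixed
    h[free] = np.linalg.solve(C[np.ix_(free, free)], b)
    return h

# Three equal conductors in series between a head of 10 and a head of 0:
heads = solve_heads(4, [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)], {0: 10.0, 3: 0.0})
```

With equal conductances in series the head drops linearly, which gives a quick sanity check on the assembly.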

  8. Modeling multiple visual words assignment for bag-of-features based medical image retrieval

    KAUST Repository

    Wang, Jim Jing-Yan

    2012-01-01

    In this paper, we investigate bag-of-features based medical image retrieval methods, which represent an image as a collection of local features, such as image patches and key points with SIFT descriptors. To improve the bag-of-features method, we first model the assignment of a local descriptor as contribution functions, and then propose a new multiple assignment strategy. By assuming that a local feature can be reconstructed from its neighboring visual words in the vocabulary, we solve the reconstruction weights as a QP problem and then use the solved weights as contribution functions, which results in a new assignment method called the QP assignment. We carry out our experiments on the ImageCLEFmed datasets. Experiments' results show that our proposed method exceeds the performance of traditional solutions and works well for bag-of-features based medical image retrieval tasks.

  9. Modeling multiple visual words assignment for bag-of-features based medical image retrieval

    KAUST Repository

    Wang, Jim Jing-Yan; Almasri, Islam

    2012-01-01

    In this paper, we investigate bag-of-features based medical image retrieval methods, which represent an image as a collection of local features, such as image patches and key points with SIFT descriptors. To improve the bag-of-features method, we first model the assignment of a local descriptor as contribution functions, and then propose a new multiple assignment strategy. By assuming that a local feature can be reconstructed from its neighboring visual words in the vocabulary, we solve the reconstruction weights as a QP problem and then use the solved weights as contribution functions, which results in a new assignment method called the QP assignment. We carry out our experiments on the ImageCLEFmed datasets. Experiments' results show that our proposed method exceeds the performance of traditional solutions and works well for bag-of-features based medical image retrieval tasks.
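A rough sketch of the QP-assignment idea under stated assumptions: reconstruct a descriptor from its k nearest visual words with non-negative weights summing to one, and use the solved weights as soft assignments. SciPy's SLSQP solver stands in for a dedicated QP solver here; the data is synthetic, not the authors' setup:

```python
import numpy as np
from scipy.optimize import minimize

# Solve min ||x - w^T W||^2  s.t.  w >= 0, sum(w) = 1  -- a small QP per
# descriptor; the weights act as contribution functions over visual words.
def qp_assignment(x, W):
    k = W.shape[0]
    obj = lambda w: np.sum((x - w @ W) ** 2)
    cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
    res = minimize(obj, np.full(k, 1.0 / k), bounds=[(0, 1)] * k,
                   constraints=cons, method="SLSQP")
    return res.x

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 8))     # 5 neighboring visual words, 8-dim descriptors
x = 0.6 * W[0] + 0.4 * W[1]     # descriptor lying in their convex hull
w = qp_assignment(x, W)         # weights should concentrate on words 0 and 1
```

Because the descriptor was constructed as a convex combination of the first two words, the recovered weights should land near (0.6, 0.4, 0, 0, 0).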

  10. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    Science.gov (United States)

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. 
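The automatic-relevance-detection idea can be approximated with scikit-learn's anisotropic RBF kernel: one learned length-scale per input, with large length-scales flagging irrelevant descriptors. The data below is synthetic, not the paper's skin-permeability dataset, and the column labels are only suggestive:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# ARD-style relevance detection: fit a GP with one RBF length-scale per
# input dimension; the inverse length-scale serves as a relevance score.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(60, 3))         # e.g. log P, MP, H-bond donors
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]      # third column is irrelevant
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 1.0, 1.0]),
                               normalize_y=True).fit(X, y)
relevance = 1.0 / gpr.kernel_.length_scale   # higher = more relevant
```

After fitting, the irrelevant third input should receive a much larger length-scale, and hence a lower relevance score, than the first input.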

  11. Key-feature questions for assessment of clinical reasoning: a literature review.

    Science.gov (United States)

    Hrynchak, Patricia; Takahashi, Susan Glover; Nayer, Marla

    2014-09-01

    Key-feature questions (KFQs) have been developed to assess clinical reasoning skills. The purpose of this paper is to review the published evidence on the reliability and validity of KFQs to assess clinical reasoning. A literature review was conducted by searching MEDLINE (1946-2012) and EMBASE (1980-2012) via OVID and ERIC. The following search terms were used: key feature; question or test or tests or testing or tested or exam; assess or evaluation, and case-based or case-specific. Articles not in English were eliminated. The literature search resulted in 560 articles. Duplicates were eliminated, as were articles that were not relevant; nine articles that contained reliability or validity data remained. A review of the references and of citations of these articles resulted in an additional 12 articles to give a total of 21 for this review. Format, language and scoring of KFQ examinations have been studied and modified to maximise reliability. Internal consistency reliability has been reported as being between 0.49 and 0.95. Face and content validity have been shown to be moderate to high. Construct validity has been shown to be good using vector thinking processes and novice versus expert paradigms, and to discriminate between teaching methods. The very modest correlations between KFQ examinations and more general knowledge-based examinations point to differing roles for each. Importantly, the results of KFQ examinations have been shown to successfully predict future physician performance, including patient outcomes. Although it is inaccurate to conclude that any testing format is universally reliable or valid, published research supports the use of examinations using KFQs to assess clinical reasoning. The review identifies areas of further study, including all categories of evidence. Investigation into how examinations using KFQs integrate with other methods in a system of assessment is needed. © 2014 John Wiley & Sons Ltd.

  12. Identifying Key Features of Student Performance in Educational Video Games and Simulations through Cluster Analysis

    Science.gov (United States)

    Kerr, Deirdre; Chung, Gregory K. W. K.

    2012-01-01

    The assessment cycle of "evidence-centered design" (ECD) provides a framework for treating an educational video game or simulation as an assessment. One of the main steps in the assessment cycle of ECD is the identification of the key features of student performance. While this process is relatively simple for multiple choice tests, when…

  13. Soil fauna: key to new carbon models

    OpenAIRE

    Filser, Juliane; Faber, Jack H.; Tiunov, Alexei V.; Brussaard, Lijbert; Frouz, Jan; Deyn, Gerlinde; Uvarov, Alexei V.; Berg, Matty P.; Lavelle, Patrick; Loreau, Michel; Wall, Diana H.; Querner, Pascal; Eijsackers, Herman; Jiménez, Juan José

    2016-01-01

    Soil organic matter (SOM) is key to maintaining soil fertility, mitigating climate change, combatting land degradation, and conserving above- and below-ground biodiversity and associated soil processes and ecosystem services. In order to derive management options for maintaining these essential services provided by soils, policy makers depend on robust, predictive models identifying key drivers of SOM dynamics. Existing SOM models and suggested guidelines for future SOM modelling are defined ...

  14. Analysing Feature Model Changes using FMDiff

    NARCIS (Netherlands)

    Dintzner, N.J.R.; Van Deursen, A.; Pinzger, M.

    2015-01-01

    Evolving a large-scale, highly variable system is a challenging task. For such a system, evolution operations often require consistent updates to both the implementation and the feature model. In this context, the evolution of the feature model closely follows the evolution of the system.

  15. Orthognathic model surgery with LEGO key-spacer.

    Science.gov (United States)

    Tsang, Alfred Chee-Ching; Lee, Alfred Siu Hong; Li, Wai Keung

    2013-12-01

    A new technique of model surgery using LEGO plates as key-spacers is described. This technique requires less time to set up compared with the conventional plaster model method. It also retains the preoperative setup with the same set of models. Movement of the segments can be measured and examined in detail with LEGO key-spacers. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Object feature extraction and recognition model

    International Nuclear Information System (INIS)

    Wan Min; Xiang Rujian; Wan Yongxing

    2001-01-01

    The characteristics of objects, especially flying objects, are analyzed, including spectral, image, and motion characteristics, and feature extraction is performed. To improve the speed of object recognition, a feature database is used to simplify the data in the source database. The feature-vs.-object relationship maps are stored in the feature database. An object recognition model based on the feature database is presented, and the way to achieve object recognition is also explained

  17. A bank-fund projection framework with CGE features

    DEFF Research Database (Denmark)

    Jensen, Henning Tarp; Tarp, Finn

    2006-01-01

    In this paper, we present a SAM-based methodology for integrating standard CGE features with a macroeconomic World Bank–International Monetary Fund (IMF) modelling framework. The resulting macro–micro framework is based on optimising agents, but it retains key features from the macroeconomic model...

  18. Model Checking Feature Interactions

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Pedersen, Thomas

    2015-01-01

    This paper presents an offline approach to analyzing feature interactions in embedded systems. The approach consists of a systematic process to gather the necessary information about system components and their models. The model is first specified in terms of predicates, before being refined to timed automata. The consistency of the model is verified at different development stages, and the correct linkage between the predicates and their semantic model is checked. The approach is illustrated on a use case from home automation.

  19. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method…
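A toy genetic search over feature subsets, in the spirit of the method described: individuals are bitmasks, selection is elitist, and offspring come from single-point crossover plus mutation. The fitness function here is a made-up proxy rewarding a known "true" subset, purely so the sketch is self-contained; a real run would score a trained affective model instead:

```python
import random

TRUE = {0, 3, 5}                               # hypothetical relevant features
def fitness(mask):
    chosen = {i for i, b in enumerate(mask) if b}
    return len(chosen & TRUE) - 0.2 * len(chosen - TRUE)

def evolve(n_feats=8, pop=30, gens=40, seed=2):
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]       # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_feats)
            child = a[:cut] + b[cut:]          # single-point crossover
            if rng.random() < 0.2:
                child[rng.randrange(n_feats)] ^= 1   # bit-flip mutation
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

best = evolve()    # should converge on (or near) the TRUE subset
```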

  20. Bayesian latent feature modeling for modeling bipartite networks with overlapping groups

    DEFF Research Database (Denmark)

    Jørgensen, Philip H.; Mørup, Morten; Schmidt, Mikkel Nørgaard

    2016-01-01

    Bipartite networks are commonly modelled using latent class or latent feature models. Whereas the existing latent class models admit marginalization of parameters specifying the strength of interaction between groups, existing latent feature models do not admit analytical marginalization… by the notion of community structure such that the edge density within groups is higher than between groups. Our model further assumes that entities can have different propensities of generating links in one of the modes. The proposed framework is contrasted on both synthetic and real bipartite networks… feature representations in bipartite networks provide a new framework for accounting for structure in bipartite networks using binary latent feature representations, providing interpretable representations that characterize structure well, as quantified by link prediction.

  1. A study of key features of the RAE atmospheric turbulence model

    Science.gov (United States)

    Jewell, W. F.; Heffley, R. K.

    1978-01-01

    A complex atmospheric turbulence model for use in aircraft simulation is analyzed in terms of its temporal, spectral, and statistical characteristics. First, a direct comparison was made between cases of the RAE model and the more conventional Dryden turbulence model. Next, the control parameters of the RAE model were systematically varied and the effects noted. The RAE model was found to possess a high degree of flexibility in its characteristics, but the individual control parameters are cross-coupled in terms of their effect on various measures of intensity, bandwidth, and probability distribution.
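For comparison, the conventional Dryden longitudinal gust component can be generated by passing white noise through a first-order shaping filter with time constant L/V (scale length over airspeed). The RAE model's extra, cross-coupled controls are not reproduced in this sketch, and all parameter values are illustrative:

```python
import numpy as np

# First-order Dryden-style shaping filter, discretized as an AR(1) process.
# The gain g is chosen so the stationary standard deviation equals sigma.
def dryden_gust(n, dt, V=50.0, L=200.0, sigma=1.5, seed=3):
    tau = L / V                                 # filter time constant [s]
    a = np.exp(-dt / tau)                       # discrete pole
    g = sigma * np.sqrt(1 - a * a)              # keeps variance ~ sigma^2
    rng = np.random.default_rng(seed)
    u = np.zeros(n)
    for k in range(1, n):
        u[k] = a * u[k - 1] + g * rng.standard_normal()
    return u

gust = dryden_gust(100000, 0.02)   # 2000 s of gust velocity at 50 Hz
```

The sample mean should sit near zero and the sample standard deviation near the chosen sigma, which is the basic statistical check the paper applies to both models.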

  2. An Appraisal Model Based on a Synthetic Feature Selection Approach for Students’ Academic Achievement

    Directory of Open Access Journals (Sweden)

    Ching-Hsue Cheng

    2017-11-01

    Full Text Available Obtaining necessary information (and even extracting hidden messages) from existing big data, and then transforming it into knowledge, is an important skill. Data mining technology has received increased attention in various fields in recent years because it can be used to find historical patterns and employ machine learning to aid in decision-making. When we find unexpected rules or patterns in the data, they are likely to be of high value. This paper proposes a synthetic feature selection approach (SFSA), which is combined with a support vector machine (SVM) to extract patterns and find the key features that influence students’ academic achievement. For verifying the proposed model, two databases, namely “Student Profile” and “Tutorship Record”, were collected from an elementary school in Taiwan and concatenated into an integrated dataset based on students’ names as a research dataset. The results indicate the following: (1) the accuracy of the proposed feature selection approach is better than that of the Minimum-Redundancy-Maximum-Relevance (mRMR) approach; (2) the proposed model is better than the listed methods when the six least influential features have been deleted; and (3) the proposed model can enhance the accuracy and facilitate the interpretation of the pattern from a hybrid-type dataset of students’ academic achievement.
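The proposed SFSA is not spelled out here, but the general pattern of pairing feature selection with an SVM can be sketched with scikit-learn's stock tools on synthetic data; a filter-style SelectKBest stands in for the synthetic approach, and all dataset parameters are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for a student-achievement dataset: 20 features,
# only 5 of them informative.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           n_redundant=2, random_state=4)

# Baseline: SVM on all features vs. SVM on the top-8 ranked features.
full = cross_val_score(SVC(), X, y, cv=5).mean()
selected = cross_val_score(
    make_pipeline(SelectKBest(f_classif, k=8), SVC()), X, y, cv=5).mean()
```

Comparing `full` and `selected` mirrors the paper's experiment of deleting the least influential features and re-scoring the model.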

  3. A Standard Bank-Fund Projection Framework with CGE Features

    DEFF Research Database (Denmark)

    Jensen, Henning Tarp; Tarp, Finn

    2006-01-01

    In this paper, we present a SAM-based methodology for integrating standard CGE features with a macroeconomic World Bank–International Monetary Fund (IMF) modelling framework. The resulting macro–micro framework is based on optimising agents, but it retains key features from the macroeconomic model...

  4. A mouse model of alcoholic liver fibrosis-associated acute kidney injury identifies key molecular pathways

    International Nuclear Information System (INIS)

    Furuya, Shinji; Chappell, Grace A.; Iwata, Yasuhiro; Uehara, Takeki; Kato, Yuki; Kono, Hiroshi; Bataller, Ramon; Rusyn, Ivan

    2016-01-01

    Clinical data strongly indicate that acute kidney injury (AKI) is a critical complication in alcoholic hepatitis, an acute-on-chronic form of liver failure in patients with advanced alcoholic fibrosis. Development of targeted therapies for AKI in this setting is hampered by the lack of an animal model. To enable research into molecular drivers and novel therapies for fibrosis- and alcohol-associated AKI, we aimed to combine carbon tetrachloride (CCl4)-induced fibrosis with chronic intragastric alcohol feeding. Male C57BL/6J mice were administered a low dose of CCl4 (0.2 ml/kg, twice weekly for 6 weeks) followed by intragastric alcohol (up to 25 g/kg/day for 3 weeks) with continued CCl4. We observed that combined treatment with CCl4 and alcohol resulted in severe liver injury, more pronounced than with either treatment alone. Importantly, severe kidney injury was evident only in the combined treatment group. This mouse model reproduced distinct pathological features consistent with AKI in human alcoholic hepatitis. Transcriptomic analysis of kidneys revealed profound effects in the combined treatment group, with enrichment for damage-associated pathways such as apoptosis, inflammation, immune response and hypoxia. Interestingly, Havcr1 and Lcn2, biomarkers of AKI, were markedly up-regulated. Overall, this study established a novel mouse model of fibrosis- and alcohol-associated AKI and identified key mechanistic pathways. - Highlights: • Acute kidney injury (AKI) is a critical complication in alcoholic hepatitis • We developed a novel mouse model of fibrosis- and alcohol-associated AKI • This model reproduces key molecular and pathological features of human AKI • This animal model can help identify new targeted therapies for alcoholic hepatitis

  5. Feature-based component model for design of embedded systems

    Science.gov (United States)

    Zha, Xuan Fang; Sriram, Ram D.

    2004-11-01

    An embedded system is a hybrid of hardware and software, combining software's flexibility with hardware's real-time performance. Embedded systems can be considered assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and system-level design, simulation, and testing information. This paper proposes an approach to representing a feature-based model of an embedded system in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of prototypes whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.

  6. Discrete-Feature Model Implementation of SDM-Site Forsmark

    Energy Technology Data Exchange (ETDEWEB)

    Geier, Joel (Clearwater Hardrock Consulting, Corvallis, OR (United States))

    2010-03-15

    A discrete-feature model (DFM) was implemented for the Forsmark repository site based on the final site descriptive model from surface-based investigations. The discrete-feature conceptual model represents deformation zones, individual fractures, and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which, in the present study, is treated as impermeable. This approximation is reasonable for sites in crystalline rock, which has very low permeability apart from that which results from macroscopic fracturing. Models are constructed based on the geological and hydrogeological description of the sites and engineering designs. Hydraulic heads and flows through the network of water-conducting features are calculated by the finite-element method, and are used in turn to simulate migration of non-reacting solute by a particle-tracking method, in order to estimate the properties of pathways by which radionuclides could be released to the biosphere. Stochastic simulation is used to evaluate portions of the model that can only be characterized in statistical terms, since many water-conducting features within the model volume cannot be characterized deterministically. Chapter 2 describes the methodology by which discrete features are derived to represent water-conducting features around the hypothetical repository at Forsmark (including both natural features and features that result from the disturbance of excavation), and then assembled to produce a discrete-feature network model for numerical simulation of flow and transport. Chapter 3 describes how site-specific data and repository design are adapted to produce the discrete-feature model. Chapter 4 presents results of the calculations. These include utilization factors for deposition tunnels based on the emplacement criteria that have been set forth by the implementers, flow distributions to the deposition holes, and calculated properties of discharge paths.

  7. Discrete-Feature Model Implementation of SDM-Site Forsmark

    International Nuclear Information System (INIS)

    Geier, Joel

    2010-03-01

    A discrete-feature model (DFM) was implemented for the Forsmark repository site based on the final site descriptive model from surface-based investigations. The discrete-feature conceptual model represents deformation zones, individual fractures, and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which, in the present study, is treated as impermeable. This approximation is reasonable for sites in crystalline rock, which has very low permeability apart from that which results from macroscopic fracturing. Models are constructed based on the geological and hydrogeological description of the sites and engineering designs. Hydraulic heads and flows through the network of water-conducting features are calculated by the finite-element method, and are used in turn to simulate migration of non-reacting solute by a particle-tracking method, in order to estimate the properties of pathways by which radionuclides could be released to the biosphere. Stochastic simulation is used to evaluate portions of the model that can only be characterized in statistical terms, since many water-conducting features within the model volume cannot be characterized deterministically. Chapter 2 describes the methodology by which discrete features are derived to represent water-conducting features around the hypothetical repository at Forsmark (including both natural features and features that result from the disturbance of excavation), and then assembled to produce a discrete-feature network model for numerical simulation of flow and transport. Chapter 3 describes how site-specific data and repository design are adapted to produce the discrete-feature model. Chapter 4 presents results of the calculations. These include utilization factors for deposition tunnels based on the emplacement criteria that have been set forth by the implementers, flow distributions to the deposition holes, and calculated properties of discharge paths.

  8. The Key Lake project

    International Nuclear Information System (INIS)

    1991-01-01

Key Lake is located in the Athabasca sandstone basin, 640 kilometers north of Saskatoon, Saskatchewan, Canada. The three sources of ore at Key Lake contain 70,100 tonnes of uranium. Features of the Key Lake Project were described under the key headings: work force, mining, mill process, tailings storage, permanent camp, environmental features, worker health and safety, and economic benefits. Appendices covering the historical background, construction projects, comparisons of western world mines, mining statistics, the Northern Saskatchewan surface lease, and Key Lake development and regulatory agencies were included.

  9. Component Composition Using Feature Models

    DEFF Research Database (Denmark)

    Eichberg, Michael; Klose, Karl; Mitschke, Ralf

    2010-01-01

interface description languages. If this variability is relevant when selecting a matching component then human interaction is required to decide which components can be bound. We propose to use feature models for making this variability explicit and (re-)enabling automatic component binding. In our...... approach, feature models are one part of service specifications. This makes it possible to specify declaratively which service variant is provided by a component. By referring to a service's variation points, a component that requires a specific service can list the requirements on the desired variant. Using...... these specifications, a component environment can then determine if a binding of the components exists that satisfies all requirements. The prototypical environment Columbus demonstrates the feasibility of the approach....

  10. Annotation-based feature extraction from sets of SBML models.

    Science.gov (United States)

    Alm, Rebekka; Waltemath, Dagmar; Wolfien, Markus; Wolkenhauer, Olaf; Henkel, Ron

    2015-01-01

Model repositories such as BioModels Database provide computational models of biological systems for the scientific community. These models contain rich semantic annotations that link model entities to concepts in well-established bio-ontologies such as Gene Ontology. Consequently, thematically similar models are likely to share similar annotations. Based on this assumption, we argue that semantic annotations are a suitable tool to characterize sets of models. These characteristics improve model classification, allow the identification of additional features for model retrieval tasks, and enable the comparison of sets of models. In this paper we discuss four methods for annotation-based feature extraction from model sets. We tested all methods on sets of models in SBML format which were composed from BioModels Database. To characterize each of these sets, we analyzed and extracted concepts from three frequently used ontologies, namely Gene Ontology, ChEBI and SBO. We find that three of the four methods are suitable to determine characteristic features for arbitrary sets of models: The selected features vary depending on the underlying model set, and they are also specific to the chosen model set. We show that the identified features map to concepts that are higher up in the hierarchy of the ontologies than the concepts used for model annotations. Our analysis also reveals that the information content of concepts in ontologies and their usage for model annotation do not correlate. Annotation-based feature extraction enables the comparison of model sets, as opposed to existing methods for model-to-keyword comparison, or model-to-model comparison.
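A minimal sketch of annotation-based feature extraction: represent each model by its set of ontology term annotations, count term frequencies, and keep terms that are frequent in the target set but rare in a background set. The term IDs, thresholds, and scoring rule below are illustrative assumptions, not the methods evaluated in the paper (real models would be parsed from SBML `<annotation>` elements):

```python
from collections import Counter

# Hypothetical annotation sets, with URIs abbreviated to bare term IDs.
model_set = [
    {"GO:0006096", "CHEBI:15361", "SBO:0000176"},
    {"GO:0006096", "CHEBI:17234", "SBO:0000176"},
]
background = [
    {"GO:0007165", "SBO:0000177"},
    {"GO:0006412", "CHEBI:16541"},
]

def term_freq(models):
    # Fraction of models in which each term appears
    c = Counter()
    for annotations in models:
        c.update(annotations)
    return {t: n / len(models) for t, n in c.items()}

set_f, bg_f = term_freq(model_set), term_freq(background)
# A term counts as a characteristic feature if it is frequent in the
# set and rare in the background (one of several plausible rules).
features = {t for t, f in set_f.items()
            if f >= 0.5 and bg_f.get(t, 0) < 0.25}
print(sorted(features))
```

Comparing two model sets then reduces to comparing their feature sets, e.g. by Jaccard similarity.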

  11. A mouse model of alcoholic liver fibrosis-associated acute kidney injury identifies key molecular pathways

    Energy Technology Data Exchange (ETDEWEB)

Furuya, Shinji; Chappell, Grace A.; Iwata, Yasuhiro [Department of Veterinary Integrative Biosciences, Texas A&M University, College Station, TX (United States); Uehara, Takeki; Kato, Yuki [Laboratory of Veterinary Pathology, Osaka Prefecture University, Osaka (Japan); Kono, Hiroshi [First Department of Surgery, University of Yamanashi, Yamanashi (Japan); Bataller, Ramon [Division of Gastroenterology & Hepatology, Department of Medicine, University of North Carolina, Chapel Hill, NC (United States); Rusyn, Ivan, E-mail: irusyn@tamu.edu [Department of Veterinary Integrative Biosciences, Texas A&M University, College Station, TX (United States)]

    2016-11-01

Clinical data strongly indicate that acute kidney injury (AKI) is a critical complication in alcoholic hepatitis, an acute-on-chronic form of liver failure in patients with advanced alcoholic fibrosis. Development of targeted therapies for AKI in this setting is hampered by the lack of an animal model. To enable research into molecular drivers and novel therapies for fibrosis- and alcohol-associated AKI, we aimed to combine carbon tetrachloride (CCl₄)-induced fibrosis with chronic intra-gastric alcohol feeding. Male C57BL/6J mice were administered a low dose of CCl₄ (0.2 ml/kg, 2×/week for 6 weeks) followed by alcohol intragastrically (up to 25 g/kg/day for 3 weeks) and with continued CCl₄. We observed that combined treatment with CCl₄ and alcohol resulted in severe liver injury, more pronounced than using each treatment alone. Importantly, severe kidney injury was evident only in the combined treatment group. This mouse model reproduced distinct pathological features consistent with AKI in human alcoholic hepatitis. Transcriptomic analysis of kidneys revealed profound effects in the combined treatment group, with enrichment for damage-associated pathways, such as apoptosis, inflammation, immune-response and hypoxia. Interestingly, Havcr1 and Lcn2, biomarkers of AKI, were markedly up-regulated. Overall, this study established a novel mouse model of fibrosis- and alcohol-associated AKI and identified key mechanistic pathways. - Highlights: • Acute kidney injury (AKI) is a critical complication in alcoholic hepatitis • We developed a novel mouse model of fibrosis- and alcohol-associated AKI • This model reproduces key molecular and pathological features of human AKI • This animal model can help identify new targeted therapies for alcoholic hepatitis.

  12. Improving Latino Children's Early Language and Literacy Development: Key Features of Early Childhood Education within Family Literacy Programmes

    Science.gov (United States)

    Jung, Youngok; Zuniga, Stephen; Howes, Carollee; Jeon, Hyun-Joo; Parrish, Deborah; Quick, Heather; Manship, Karen; Hauser, Alison

    2016-01-01

    Noting the lack of research on how early childhood education (ECE) programmes within family literacy programmes influence Latino children's early language and literacy development, this study examined key features of ECE programmes, specifically teacher-child interactions and child engagement in language and literacy activities and how these…

  13. Cycling hypoxia: A key feature of the tumor microenvironment.

    Science.gov (United States)

    Michiels, Carine; Tellier, Céline; Feron, Olivier

    2016-08-01

A compelling body of evidence indicates that most human solid tumors contain hypoxic areas. Hypoxia is the consequence not only of the chaotic proliferation of cancer cells that places them at distance from the nearest capillary but also of the abnormal structure of the new vasculature network resulting in transient blood flow. Hence, two types of hypoxia are observed in tumors: chronic and cycling (intermittent) hypoxia. Most of the current work aims at understanding the role of chronic hypoxia in tumor growth, response to treatment and metastasis. Only recently, cycling hypoxia, with spatial and temporal fluctuations in oxygen levels, has emerged as another key feature of the tumor environment that triggers different responses in comparison to chronic hypoxia. Either type of hypoxia is associated with distinct effects not only in cancer cells but also in stromal cells. In particular, cycling hypoxia has been demonstrated to favor, to a higher extent than chronic hypoxia, angiogenesis, resistance to anti-cancer treatments, intratumoral inflammation and tumor metastasis. This review details these effects as well as the signaling pathways cycling hypoxia triggers to switch on specific transcriptomic programs. Understanding the signaling pathways through which cycling hypoxia induces these processes that support the development of an aggressive cancer could lead to the emergence of promising new cancer treatments. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Doubly sparse factor models for unifying feature transformation and feature selection

    International Nuclear Information System (INIS)

    Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato; Matsumoto, Narihisa; Sugase-Miyamoto, Yasuko

    2010-01-01

    A number of unsupervised learning methods for high-dimensional data are largely divided into two groups based on their procedures, i.e., (1) feature selection, which discards irrelevant dimensions of the data, and (2) feature transformation, which constructs new variables by transforming and mixing over all dimensions. We propose a method that both selects and transforms features in a common Bayesian inference procedure. Our method imposes a doubly automatic relevance determination (ARD) prior on the factor loading matrix. We propose a variational Bayesian inference for our model and demonstrate the performance of our method on both synthetic and real data.
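A hedged sketch of what a "doubly" automatic relevance determination (ARD) prior on a D × K factor loading matrix W might look like (the exact parameterization used in the paper may differ):

```latex
p(\mathbf{W}\mid\boldsymbol{\alpha},\boldsymbol{\beta})
  = \prod_{i=1}^{D}\prod_{j=1}^{K}
    \mathcal{N}\!\left(W_{ij}\,\middle|\,0,\ \alpha_i^{-1}\beta_j^{-1}\right)
```

Driving a row precision \(\alpha_i\) to infinity prunes input dimension \(i\) (feature selection), while driving a column precision \(\beta_j\) to infinity prunes factor \(j\) (feature transformation), which is how a single prior can serve both roles within one Bayesian inference procedure.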

  15. Doubly sparse factor models for unifying feature transformation and feature selection

    Energy Technology Data Exchange (ETDEWEB)

    Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato [ERATO, Okanoya Emotional Information Project, Japan Science Technology Agency, Saitama (Japan); Matsumoto, Narihisa; Sugase-Miyamoto, Yasuko, E-mail: okada@k.u-tokyo.ac.j [Human Technology Research Institute, National Institute of Advanced Industrial Science and Technology, Ibaraki (Japan)

    2010-06-01

    A number of unsupervised learning methods for high-dimensional data are largely divided into two groups based on their procedures, i.e., (1) feature selection, which discards irrelevant dimensions of the data, and (2) feature transformation, which constructs new variables by transforming and mixing over all dimensions. We propose a method that both selects and transforms features in a common Bayesian inference procedure. Our method imposes a doubly automatic relevance determination (ARD) prior on the factor loading matrix. We propose a variational Bayesian inference for our model and demonstrate the performance of our method on both synthetic and real data.

  16. Extracting Feature Model Changes from the Linux Kernel Using FMDiff

    NARCIS (Netherlands)

    Dintzner, N.J.R.; Van Deursen, A.; Pinzger, M.

    2014-01-01

    The Linux kernel feature model has been studied as an example of large scale evolving feature model and yet details of its evolution are not known. We present here a classification of feature changes occurring on the Linux kernel feature model, as well as a tool, FMDiff, designed to automatically

  17. Identifying Key Features of Effective Active Learning: The Effects of Writing and Peer Discussion

    Science.gov (United States)

    Pangle, Wiline M.; Wyatt, Kevin H.; Powell, Karli N.; Sherwood, Rachel E.

    2014-01-01

    We investigated some of the key features of effective active learning by comparing the outcomes of three different methods of implementing active-learning exercises in a majors introductory biology course. Students completed activities in one of three treatments: discussion, writing, and discussion + writing. Treatments were rotated weekly between three sections taught by three different instructors in a full factorial design. The data set was analyzed by generalized linear mixed-effect models with three independent variables: student aptitude, treatment, and instructor, and three dependent (assessment) variables: change in score on pre- and postactivity clicker questions, and coding scores on in-class writing and exam essays. All independent variables had significant effects on student performance for at least one of the dependent variables. Students with higher aptitude scored higher on all assessments. Student scores were higher on exam essay questions when the activity was implemented with a writing component compared with peer discussion only. There was a significant effect of instructor, with instructors showing different degrees of effectiveness with active-learning techniques. We suggest that individual writing should be implemented as part of active learning whenever possible and that instructors may need training and practice to become effective with active learning. PMID:25185230

  18. Towards the maturity model for feature oriented domain analysis

    Directory of Open Access Journals (Sweden)

    Muhammad Javed

    2014-09-01

Full Text Available Assessing the quality of a model has always been a challenge for researchers in academia and industry. The quality of a feature model is a prime factor because it is used in the development of products. A degraded feature model leads to the development of low-quality products. Few efforts have been made to improve the quality of feature models. This paper is an effort to present our ongoing work, i.e. the development of a FODA (Feature Oriented Domain Analysis) maturity model which will help to evaluate the quality of a given feature model. In this paper, we provide the quality levels along with their descriptions. The proposed model consists of four levels, starting from level 0 to level 3. Design of each level is based on the severity of errors, with severity decreasing from level 0 to level 3. We elaborate each level with the help of examples. We borrowed all examples from the material published by the research community of Software Product Lines (SPL) for the application of our framework.

  19. Secure image retrieval with multiple keys

    Science.gov (United States)

    Liang, Haihua; Zhang, Xinpeng; Wei, Qiuhan; Cheng, Hang

    2018-03-01

    This article proposes a secure image retrieval scheme under a multiuser scenario. In this scheme, the owner first encrypts and uploads images and their corresponding features to the cloud; then, the user submits the encrypted feature of the query image to the cloud; next, the cloud compares the encrypted features and returns encrypted images with similar content to the user. To find the nearest neighbor in the encrypted features, an encryption with multiple keys is proposed, in which the query feature of each user is encrypted by his/her own key. To improve the key security and space utilization, global optimization and Gaussian distribution are, respectively, employed to generate multiple keys. The experiments show that the proposed encryption can provide effective and secure image retrieval for each user and ensure confidentiality of the query feature of each user.
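One way to see why nearest-neighbour search over encrypted features is possible at all: a random orthogonal matrix used as a key preserves Euclidean distances, so the cloud can rank encrypted features without decrypting them. This single-key numpy sketch is a deliberate simplification of the paper's multi-key, globally optimized construction; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_key(dim, seed):
    # Random orthogonal matrix via QR decomposition: an illustrative
    # stand-in for the paper's optimized key generation.
    q, _ = np.linalg.qr(np.random.default_rng(seed).normal(size=(dim, dim)))
    return q

def encrypt(feature, key):
    # Orthogonal transform: ||K(a - b)|| == ||a - b||
    return key @ feature

owner_key = make_key(4, seed=1)
db = [rng.normal(size=4) for _ in range(5)]
enc_db = [encrypt(f, owner_key) for f in db]

query = db[2] + 0.01 * rng.normal(size=4)   # near-duplicate of item 2
enc_q = encrypt(query, owner_key)

# Distances are preserved under the orthogonal key, so the cloud can
# find the most similar image while seeing only ciphertext features.
dists = [np.linalg.norm(enc_q - e) for e in enc_db]
print(int(np.argmin(dists)))
```

In the multi-user scheme, each user's query would instead be encrypted under that user's own key, with the keys jointly constructed so that cross-key comparisons remain meaningful.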

  20. The SSB-positive/SSA-negative antibody profile is not associated with key phenotypic features of Sjögren's syndrome

    DEFF Research Database (Denmark)

    Baer, Alan N; McAdams DeMarco, Mara; Shiboski, Stephen C

    2015-01-01

    phenotypic features. Among SICCA participants classified with SS on the basis of the American-European Consensus Group or American College of Rheumatology criteria, only 2% required the anti-SSB-alone test result to meet these criteria. CONCLUSIONS: The presence of anti-SSB, without anti-SSA antibodies, had...... participants, 2061 (63%) had negative anti-SSA/anti-SSB, 1162 (35%) had anti-SSA with or without anti-SSB, and 74 (2%) anti-SSB alone. Key SS phenotypic features were more prevalent and had measures indicative of greater disease activity in those participants with anti-SSA, either alone or with anti-SSB, than...... in those with anti-SSB alone or negative SSA/SSB serology. These between-group differences were highly significant and not explained by confounding by age, race/ethnicity or gender. Participants with anti-SSB alone were comparable to those with negative SSA/SSB serology in their association with these key...

  1. Evidence on Features of a DSGE Business Cycle Model from Bayesian Model Averaging

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2012-01-01

The empirical support for features of a Dynamic Stochastic General Equilibrium model with two technology shocks is evaluated using Bayesian model averaging over vector autoregressions. The model features include equilibria, restrictions on long-run responses, a structural break of unknown

  2. Time to refine key climate policy models

    Science.gov (United States)

    Barron, Alexander R.

    2018-05-01

    Ambition regarding climate change at the national level is critical but is often calibrated with the projected costs — as estimated by a small suite of energy-economic models. Weaknesses in several key areas in these models will continue to distort policy design unless collectively addressed by a diversity of researchers.

  3. Topical video object discovery from key frames by modeling word co-occurrence prior.

    Science.gov (United States)

    Zhao, Gangqiang; Yuan, Junsong; Hua, Gang; Yang, Jiong

    2015-12-01

A topical video object refers to an object that is frequently highlighted in a video. It could be, e.g., the product logo and the leading actor/actress in a TV commercial. We propose a topic model that incorporates a word co-occurrence prior for efficient discovery of topical video objects from a set of key frames. Previous work using topic models, such as latent Dirichlet allocation (LDA), for video object discovery often takes a bag-of-visual-words representation, which ignores important co-occurrence information among the local features. We show that such data-driven co-occurrence information from bottom-up can conveniently be incorporated in LDA with a Gaussian Markov prior, which combines top-down probabilistic topic modeling with bottom-up priors in a unified model. Our experiments on challenging videos demonstrate that the proposed approach can discover different types of topical objects despite variations in scale, view-point, color and lighting changes, or even partial occlusions. The efficacy of the co-occurrence prior is clearly demonstrated when compared with topic models without such priors.

  4. On the Use of Memory Models in Audio Features

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2011-01-01

    Audio feature estimation is potentially improved by including higher- level models. One such model is the Short Term Memory (STM) model. A new paradigm of audio feature estimation is obtained by adding the influence of notes in the STM. These notes are identified when the perceptual spectral flux...

  5. Individual discriminative face recognition models based on subsets of features

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder; Gomez, David Delgado; Ersbøll, Bjarne Kjær

    2007-01-01

    The accuracy of data classification methods depends considerably on the data representation and on the selected features. In this work, the elastic net model selection is used to identify meaningful and important features in face recognition. Modelling the characteristics which distinguish one...... person from another using only subsets of features will both decrease the computational cost and increase the generalization capacity of the face recognition algorithm. Moreover, identifying which are the features that better discriminate between persons will also provide a deeper understanding...... of the face recognition problem. The elastic net model is able to select a subset of features with low computational effort compared to other state-of-the-art feature selection methods. Furthermore, the fact that the number of features usually is larger than the number of images in the data base makes feature...
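The elastic net combines an L1 penalty, which zeroes out irrelevant features, with an L2 penalty, which stabilizes correlated ones. A minimal coordinate-descent sketch on synthetic data, not the authors' implementation; the penalty weights, data, and selection threshold are arbitrary assumptions:

```python
import numpy as np

def elastic_net(X, y, lam=0.1, alpha=0.5, iters=200):
    """Minimal coordinate descent for
    min 0.5/n ||y - Xw||^2 + lam*(alpha*||w||_1 + 0.5*(1-alpha)*||w||^2).
    Illustrative only; production solvers add convergence checks."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # partial residual
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n + lam * (1 - alpha)
            # Soft-thresholding handles the L1 part
            w[j] = np.sign(rho) * max(abs(rho) - lam * alpha, 0) / z
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_w = np.zeros(10)
true_w[[0, 3]] = [2.0, -1.5]        # only two informative features
y = X @ true_w + 0.1 * rng.normal(size=100)

w = elastic_net(X, y)
selected = np.flatnonzero(np.abs(w) > 0.05)
print(selected)                      # the sparse, discriminative subset
```

For face recognition, the columns of X would be pixel or descriptor features and y a class indicator, so the selected subset both cuts computation and points at the facial regions that discriminate between persons.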

  6. Soil fauna: key to new carbon models

    Science.gov (United States)

    Filser, Juliane; Faber, Jack H.; Tiunov, Alexei V.; Brussaard, Lijbert; Frouz, Jan; De Deyn, Gerlinde; Uvarov, Alexei V.; Berg, Matty P.; Lavelle, Patrick; Loreau, Michel; Wall, Diana H.; Querner, Pascal; Eijsackers, Herman; José Jiménez, Juan

    2016-11-01

Soil organic matter (SOM) is key to maintaining soil fertility, mitigating climate change, combatting land degradation, and conserving above- and below-ground biodiversity and associated soil processes and ecosystem services. In order to derive management options for maintaining these essential services provided by soils, policy makers depend on robust, predictive models identifying key drivers of SOM dynamics. Existing SOM models and suggested guidelines for future SOM modelling are defined mostly in terms of plant residue quality and input and microbial decomposition, overlooking the significant regulation provided by soil fauna. The fauna controls almost any aspect of organic matter turnover, foremost by regulating the activity and functional composition of soil microorganisms and their physical-chemical connectivity with soil organic matter. We demonstrate a very strong impact of soil animals on carbon turnover, increasing or decreasing it by several dozen percent, sometimes even turning C sinks into C sources or vice versa. This is demonstrated not only for earthworms and other larger invertebrates but also for smaller fauna such as Collembola. We suggest that inclusion of soil animal activities (plant residue consumption and bioturbation altering the formation, depth, hydraulic properties and physical heterogeneity of soils) can fundamentally affect the predictive outcome of SOM models. Understanding direct and indirect impacts of soil fauna on nutrient availability, carbon sequestration, greenhouse gas emissions and plant growth is key to the understanding of SOM dynamics in the context of global carbon cycling models. We argue that explicit consideration of soil fauna is essential to make realistic modelling predictions on SOM dynamics and to detect expected non-linear responses of SOM dynamics to global change. We present a decision framework, to be further developed through the activities of KEYSOM, a European COST Action, for when mechanistic SOM models

  7. A Multiobjective Sparse Feature Learning Model for Deep Neural Networks.

    Science.gov (United States)

    Gong, Maoguo; Liu, Jia; Li, Hao; Cai, Qing; Su, Linzhi

    2015-12-01

    Hierarchical deep neural networks are currently popular learning models for imitating the hierarchical architecture of human brain. Single-layer feature extractors are the bricks to build deep networks. Sparse feature learning models are popular models that can learn useful representations. But most of those models need a user-defined constant to control the sparsity of representations. In this paper, we propose a multiobjective sparse feature learning model based on the autoencoder. The parameters of the model are learnt by optimizing two objectives, reconstruction error and the sparsity of hidden units simultaneously to find a reasonable compromise between them automatically. We design a multiobjective induced learning procedure for this model based on a multiobjective evolutionary algorithm. In the experiments, we demonstrate that the learning procedure is effective, and the proposed multiobjective model can learn useful sparse features.

  8. Features Extraction of Flotation Froth Images and BP Neural Network Soft-Sensor Model of Concentrate Grade Optimized by Shuffled Cuckoo Searching Algorithm

    Directory of Open Access Journals (Sweden)

    Jie-sheng Wang

    2014-01-01

Full Text Available To meet the forecasting target of key technology indicators in the flotation process, a BP neural network soft-sensor model based on feature extraction of flotation froth images and optimized by the shuffled cuckoo search algorithm is proposed. Based on digital image processing techniques, the color features in HSI color space, the visual features based on the gray-level co-occurrence matrix, and the shape characteristics based on the geometric theory of flotation froth images are extracted, respectively, as the input variables of the proposed soft-sensor model. Then the isometric mapping method is used to reduce the input dimension, the network size, and the learning time of the BP neural network. Finally, a shuffled cuckoo search algorithm is adopted to optimize the BP neural network soft-sensor model. Simulation results show that the model has better generalization results and prediction accuracy.
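Of the three feature families mentioned, the gray-level co-occurrence matrix (GLCM) is the easiest to sketch: count how often pairs of gray levels occur at a fixed offset, then derive texture statistics such as contrast and energy. The 4×4 "image" and the horizontal offset below are illustrative assumptions:

```python
import numpy as np

# Toy quantized image (4 gray levels); a froth image would be much
# larger and quantized from the camera's intensity channel.
img = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
])
levels = 4

# Count co-occurrences of gray-level pairs at horizontal offset (0, 1)
glcm = np.zeros((levels, levels), dtype=int)
for row in img:
    for a, b in zip(row[:-1], row[1:]):
        glcm[a, b] += 1

p = glcm / glcm.sum()                      # normalize to probabilities
contrast = sum((i - j) ** 2 * p[i, j]      # local intensity variation
               for i in range(levels) for j in range(levels))
energy = (p ** 2).sum()                    # textural uniformity
print(glcm)
```

Statistics like these, computed over several offsets and angles, form the "visual feature" part of the soft-sensor input vector alongside the color and shape features.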

  9. Physical model for the 2175 A interstellar extinction feature

    International Nuclear Information System (INIS)

    Hecht, J.H.

    1986-01-01

    Recent IUE observations have shown that the 2175 A interstellar extinction feature is constant in wavelength but varies in width. A model has been constructed to explain these results. It is proposed that the 2175 A feature will only be seen when there is extinction due to carbon grains which have lost their hydrogen. In particular, the feature is caused by a separate population of small (less than 50 A radius), hydrogen-free carbon grains. The variations in width would be due to differences in either their temperature, size distribution, or impurity content. All other carbon grains retain hydrogen, which causes the feature to be suppressed. If this model is correct, then it implies that the grains responsible for the unidentified IR emission features would not generally cause the 2175 A feature. 53 references

  10. Key West, Florida Tsunami Forecast Grids for MOST Model

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Key West, Florida Forecast Model Grids provides bathymetric data strictly for tsunami inundation modeling with the Method of Splitting Tsunami (MOST) model. MOST...

  11. Preliminary safety analysis for key design features of KALIMER with breakeven core

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Do Hee; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, Y. B.; Jeong, K. S

    2001-06-01

KAERI is currently developing the conceptual design of a Liquid Metal Reactor, KALIMER (Korea Advanced LIquid MEtal Reactor) under the Long-term Nuclear R and D Program. KALIMER addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, descriptions of safety design features and safety analyses results for selected ATWS accidents for the breakeven core KALIMER are presented. First, the basic approach to achieve the safety goal is introduced in Chapter 1, and the safety evaluation procedure for the KALIMER design is described in Chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In Chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. In Chapter 4, the design of the KALIMER containment dome and the results of its performance analyses are presented. The design of the existing containment and the KALIMER containment dome are compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced to investigate the core energetics behavior during HCDA in Chapter 5. Sensitivity analyses have been performed for the KALIMER core behavior during super-prompt critical excursions, using mathematical formulations developed in the framework of the Modified Bethe-Tait method. Work energy potential was then calculated based on the isentropic fuel expansion model.

  12. Model of key success factors for Business Intelligence implementation

    Directory of Open Access Journals (Sweden)

    Peter Mesaros

    2016-07-01

Full Text Available New progressive technologies have recorded growth in every area. Information and communication technologies facilitate the exchange of information and the management of everyday activities in enterprises. Specific modules (such as Business Intelligence) facilitate decision-making. Several studies have demonstrated the positive impact of Business Intelligence on decision-making. The first step is to put it in place in the enterprise. The implementation process is influenced by many factors. This article discusses the issue of key success factors affecting the successful implementation of Business Intelligence. The article describes the key success factors for successful implementation and use of Business Intelligence based on multiple studies. The main objective of this study is to verify the effects and dependence of selected factors and to propose a model of key success factors for successful implementation of Business Intelligence. The key success factors and the proposed model are studied in Slovak enterprises.

  13. A feature-based approach to modeling protein-DNA interactions.

    Directory of Open Access Journals (Sweden)

    Eilon Sharon

    Full Text Available Transcription factor (TF binding to its DNA target site is a fundamental regulatory interaction. The most common model used to represent TF binding specificities is a position specific scoring matrix (PSSM, which assumes independence between binding positions. However, in many cases, this simplifying assumption does not hold. Here, we present feature motif models (FMMs, a novel probabilistic method for modeling TF-DNA interactions, based on log-linear models. Our approach uses sequence features to represent TF binding specificities, where each feature may span multiple positions. We develop the mathematical formulation of our model and devise an algorithm for learning its structural features from binding site data. We also developed a discriminative motif finder, which discovers de novo FMMs that are enriched in target sets of sequences compared to background sets. We evaluate our approach on synthetic data and on the widely used TF chromatin immunoprecipitation (ChIP dataset of Harbison et al. We then apply our algorithm to high-throughput TF ChIP data from mouse and human, reveal sequence features that are present in the binding specificities of mouse and human TFs, and show that FMMs explain TF binding significantly better than PSSMs. Our FMM learning and motif finder software are available at http://genie.weizmann.ac.il/.

  14. Analysing the Linux kernel feature model changes using FMDiff

    NARCIS (Netherlands)

    Dintzner, N.J.R.; van Deursen, A.; Pinzger, M.

    Evolving a large scale, highly variable system is a challenging task. For such a system, evolution operations often require to update consistently both their implementation and its feature model. In this context, the evolution of the feature model closely follows the evolution of the system. The

  15. Analysing the Linux kernel feature model changes using FMDiff

    NARCIS (Netherlands)

    Dintzner, N.J.R.; Van Deursen, A.; Pinzger, M.

    2015-01-01

    Evolving a large scale, highly variable system is a challenging task. For such a system, evolution operations often require to update consistently both their implementation and its feature model. In this context, the evolution of the feature model closely follows the evolution of the system. The

  16. Predicting Key Events in the Popularity Evolution of Online Information.

    Science.gov (United States)

    Hu, Ying; Hu, Changjun; Fu, Shushen; Fang, Mingzhe; Xu, Wenwen

    2017-01-01

    The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.

  17. Predicting Key Events in the Popularity Evolution of Online Information.

    Directory of Open Access Journals (Sweden)

    Ying Hu

    Full Text Available The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.
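The first step of the abstract's recipe, smooth with a moving average and then locate the events, can be sketched as follows. The burst and fade thresholds are illustrative assumptions, not the paper's calibrated method:

```python
def moving_average(series, window=3):
    """Smooth popularity counts to damp the high variation noted above."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        out.append(sum(series[lo:i + 1]) / (i + 1 - lo))
    return out

def key_events(series, burst_ratio=2.0, fade_ratio=0.2):
    """Return (burst, peak, fade) indices on the smoothed series.
    burst: first step jumping past burst_ratio * previous value;
    peak: global maximum; fade: first point after the peak dropping
    below fade_ratio * peak value. Ratios are made-up thresholds."""
    s = moving_average(series)
    peak = max(range(len(s)), key=lambda i: s[i])
    burst = next((i for i in range(1, peak + 1)
                  if s[i - 1] > 0 and s[i] >= burst_ratio * s[i - 1]), 0)
    fade = next((i for i in range(peak + 1, len(s))
                 if s[i] <= fade_ratio * s[peak]), len(s) - 1)
    return burst, peak, fade

views = [1, 1, 2, 9, 30, 55, 40, 18, 6, 2, 1, 1]  # toy hourly view counts
print(key_events(views))
```

The actual paper predicts these events ahead of time with learned features; this sketch only identifies them after the fact, which is the labeling step such a predictor would be trained against.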

  18. Machine learning methods enable predictive modeling of antibody feature:function relationships in RV144 vaccinees.

    Science.gov (United States)

    Choi, Ickwon; Chung, Amy W; Suscovich, Todd J; Rerks-Ngarm, Supachai; Pitisuttithum, Punnee; Nitayaphan, Sorachai; Kaewkungwal, Jaranit; O'Connell, Robert J; Francis, Donald; Robb, Merlin L; Michael, Nelson L; Kim, Jerome H; Alter, Galit; Ackerman, Margaret E; Bailey-Kellogg, Chris

    2015-04-01

    The adaptive immune response to vaccination or infection can lead to the production of specific antibodies to neutralize the pathogen or recruit innate immune effector cells for help. The non-neutralizing role of antibodies in stimulating effector cell responses may have been a key mechanism of the protection observed in the RV144 HIV vaccine trial. In an extensive investigation of a rich set of data collected from RV144 vaccine recipients, we here employ machine learning methods to identify and model associations between antibody features (IgG subclass and antigen specificity) and effector function activities (antibody dependent cellular phagocytosis, cellular cytotoxicity, and cytokine release). We demonstrate via cross-validation that classification and regression approaches can effectively use the antibody features to robustly predict qualitative and quantitative functional outcomes. This integration of antibody feature and function data within a machine learning framework provides a new, objective approach to discovering and assessing multivariate immune correlates.

  19. Machine learning methods enable predictive modeling of antibody feature:function relationships in RV144 vaccinees.

    Directory of Open Access Journals (Sweden)

    Ickwon Choi

    2015-04-01

    Full Text Available The adaptive immune response to vaccination or infection can lead to the production of specific antibodies to neutralize the pathogen or recruit innate immune effector cells for help. The non-neutralizing role of antibodies in stimulating effector cell responses may have been a key mechanism of the protection observed in the RV144 HIV vaccine trial. In an extensive investigation of a rich set of data collected from RV144 vaccine recipients, we here employ machine learning methods to identify and model associations between antibody features (IgG subclass and antigen specificity and effector function activities (antibody dependent cellular phagocytosis, cellular cytotoxicity, and cytokine release. We demonstrate via cross-validation that classification and regression approaches can effectively use the antibody features to robustly predict qualitative and quantitative functional outcomes. This integration of antibody feature and function data within a machine learning framework provides a new, objective approach to discovering and assessing multivariate immune correlates.
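As a minimal stand-in for the cross-validation protocol described above, the sketch below generates synthetic "antibody feature" vectors with a binary "function" label and checks leave-one-out accuracy with a 1-nearest-neighbour classifier. The data, separation, and classifier are all invented; the real study used richer models on real RV144 measurements:

```python
import random

random.seed(0)

# Synthetic stand-in data: each sample has four feature measurements and a
# binary functional outcome (e.g. high/low phagocytosis activity).
def make_sample(label):
    base = 2.0 if label else 0.0
    return ([base + random.gauss(0, 0.5) for _ in range(4)], label)

data = [make_sample(i % 2 == 0) for i in range(40)]

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def loo_accuracy(data):
    """Leave-one-out cross-validation with a 1-nearest-neighbour classifier:
    train on the held-in samples, test on the held-out one, repeat."""
    correct = 0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        _, pred = min(rest, key=lambda s: dist(s[0], x))
        correct += (pred == y)
    return correct / len(data)

print(loo_accuracy(data))
```

The point of the protocol, as in the paper, is that accuracy is measured only on samples the classifier never saw during training.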

  20. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.

  1. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
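The propagation relationship described above is Σ_xyz = J Σ_uvd Jᵀ for the mapping from (pixel, pixel, disparity) to Cartesian coordinates. A sketch with an idealized stereo model and illustrative intrinsics (not calibrated Kinect parameters):

```python
def disparity_to_point(u, v, d, f=580.0, b=0.075, cx=320.0, cy=240.0):
    """Map (u, v, disparity) to Cartesian (x, y, z) for an idealized
    Kinect-like stereo model. Intrinsics here are illustrative guesses."""
    z = f * b / d
    return [(u - cx) * z / f, (v - cy) * z / f, z]

def jacobian(u, v, d, eps=1e-4):
    """Numerical Jacobian of the mapping at (u, v, d)."""
    p0 = disparity_to_point(u, v, d)
    cols = []
    for k, val in enumerate((u, v, d)):
        args = [u, v, d]
        args[k] = val + eps
        p1 = disparity_to_point(*args)
        cols.append([(a - b0) / eps for a, b0 in zip(p1, p0)])
    return [[cols[j][i] for j in range(3)] for i in range(3)]  # 3x3, row-major

def propagate(cov_uvd, u, v, d):
    """Sigma_xyz = J Sigma_uvd J^T: push image-space uncertainty into 3D."""
    J = jacobian(u, v, d)
    JS = [[sum(J[i][k] * cov_uvd[k][j] for k in range(3)) for j in range(3)]
          for i in range(3)]
    return [[sum(JS[i][k] * J[j][k] for k in range(3)) for j in range(3)]
            for i in range(3)]

cov_uvd = [[0.25, 0, 0], [0, 0.25, 0], [0, 0, 1.0]]  # pixel/disparity noise
cov_xyz = propagate(cov_uvd, 400.0, 300.0, 20.0)
print(cov_xyz[2][2])  # depth variance; far points are much noisier
```

Because z = fb/d, the depth variance scales like (fb/d²)², which is the familiar rapid growth of depth uncertainty with range that the paper's uncertainty ellipsoids exhibit.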

  2. Cluster regression model and level fluctuation features of Van Lake, Turkey

    Directory of Open Access Journals (Sweden)

    Z. Şen

    1999-02-01

    Full Text Available Lake water levels change under the influences of natural and/or anthropogenic environmental conditions. Among these influences are the climate change, greenhouse effects and ozone layer depletions which are reflected in the hydrological cycle features over the lake drainage basins. Lake levels are among the most significant hydrological variables that are influenced by different atmospheric and environmental conditions. Consequently, lake level time series in many parts of the world include nonstationarity components such as shifts in the mean value, apparent or hidden periodicities. On the other hand, many lake level modeling techniques have a stationarity assumption. The main purpose of this work is to develop a cluster regression model for dealing with nonstationarity especially in the form of shifting means. The basis of this model is the combination of transition probability and classical regression technique. Both parts of the model are applied to monthly level fluctuations of Lake Van in eastern Turkey. It is observed that the cluster regression procedure does preserve the statistical properties and the transitional probabilities that are indistinguishable from the original data.

    Key words. Hydrology (hydrologic budget; stochastic processes) · Meteorology and atmospheric dynamics (ocean-atmosphere interactions)

  3. Cluster regression model and level fluctuation features of Van Lake, Turkey

    Directory of Open Access Journals (Sweden)

    Z. Şen

    Full Text Available Lake water levels change under the influences of natural and/or anthropogenic environmental conditions. Among these influences are the climate change, greenhouse effects and ozone layer depletions which are reflected in the hydrological cycle features over the lake drainage basins. Lake levels are among the most significant hydrological variables that are influenced by different atmospheric and environmental conditions. Consequently, lake level time series in many parts of the world include nonstationarity components such as shifts in the mean value, apparent or hidden periodicities. On the other hand, many lake level modeling techniques have a stationarity assumption. The main purpose of this work is to develop a cluster regression model for dealing with nonstationarity especially in the form of shifting means. The basis of this model is the combination of transition probability and classical regression technique. Both parts of the model are applied to monthly level fluctuations of Lake Van in eastern Turkey. It is observed that the cluster regression procedure does preserve the statistical properties and the transitional probabilities that are indistinguishable from the original data.

    Key words. Hydrology (hydrologic budget; stochastic processes) · Meteorology and atmospheric dynamics (ocean-atmosphere interactions)
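The cluster regression model combines a transition-probability component with per-cluster regression; the transition-matrix half can be sketched on an invented sequence of monthly levels clustered into low/mid/high regimes (labels and counts are illustrative only):

```python
def transition_matrix(states, k):
    """Estimate Markov transition probabilities between k level clusters
    by counting consecutive-month cluster moves and normalizing rows."""
    counts = [[0] * k for _ in range(k)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return [[c / max(1, sum(row)) for c in row] for row in counts]

# Toy monthly levels clustered into low/mid/high (0/1/2) regimes,
# mimicking the shifting-means behaviour the abstract describes.
levels = [0, 0, 1, 1, 1, 2, 2, 1, 0, 0, 1, 2, 2, 2, 1]
P = transition_matrix(levels, 3)
print(P[1])  # where the level tends to go next from the "mid" cluster
```

In the full model, a classical regression fitted within each cluster supplies the level prediction, while the matrix above governs which cluster (mean regime) the next month is likely to fall in.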

  4. Virtual-optical information security system based on public key infrastructure

    Science.gov (United States)

    Peng, Xiang; Zhang, Peng; Cai, Lilong; Niu, Hanben

    2005-01-01

    A virtual-optical based encryption model with the aid of public key infrastructure (PKI) is presented in this paper. The proposed model employs a hybrid architecture in which our previously published encryption method based on virtual-optics scheme (VOS) can be used to encipher and decipher data while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). The whole information security model is run under the framework of the international standard ITU-T X.509 PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOS security approach has additional features like confidentiality, authentication, and integrity for the purpose of data encryption in a network environment. Numerical experiments demonstrate the effectiveness of the method. The security of the proposed model is briefly analyzed by examining some possible attacks from the viewpoint of cryptanalysis.
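The hybrid architecture, a symmetric cipher for the bulk data plus an asymmetric cipher for the session key, can be sketched as below. A keystream XOR stands in for the virtual-optics scheme, and the RSA primes are toy-sized; this illustrates only the structure and is in no way secure:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Hash-counter keystream standing in for the symmetric (VOS) stage."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def sym_encrypt(key, data):
    """XOR with the keystream; applying it twice with the same key decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Textbook RSA key pair from demo primes (far too small for real use).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

session_key = b"demo-session-key"
enc_key = [pow(b, e, n) for b in session_key]       # RSA-wrap the session key
ciphertext = sym_encrypt(session_key, b"secret payload")

# Receiver: unwrap the session key with d, then decrypt the payload.
rec_key = bytes(pow(c, d, n) for c in enc_key)
plaintext = sym_encrypt(rec_key, ciphertext)
print(plaintext)
```

The design rationale mirrors the paper's: the fast symmetric stage handles arbitrary-length data, while the expensive asymmetric stage protects only the short session key, letting the PKI distribute public keys and certify identities.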

  5. Key features of human episodic recollection in the cross-episode retrieval of rat hippocampus representations of space.

    Directory of Open Access Journals (Sweden)

    Eduard Kelemen

    2013-07-01

    Full Text Available Neurophysiological studies focus on memory retrieval as a reproduction of what was experienced and have established that neural discharge is replayed to express memory. However, cognitive psychology has established that recollection is not a verbatim replay of stored information. Recollection is constructive, the product of memory retrieval cues, the information stored in memory, and the subject's state of mind. We discovered key features of constructive recollection embedded in the rat CA1 ensemble discharge during an active avoidance task. Rats learned two task variants, one with the arena stable, the other with it rotating; each variant defined a distinct behavioral episode. During the rotating episode, the ensemble discharge of CA1 principal neurons was dynamically organized to concurrently represent space in two distinct codes. The code for spatial reference frame switched rapidly between representing the rat's current location in either the stationary spatial frame of the room or the rotating frame of the arena. The code for task variant switched less frequently between a representation of the current rotating episode and the stable episode from the rat's past. The characteristics and interplay of these two hippocampal codes revealed three key properties of constructive recollection. (1) Although the ensemble representations of the stable and rotating episodes were distinct, ensemble discharge during rotation occasionally resembled the stable condition, demonstrating cross-episode retrieval of the representation of the remote, stable episode. (2) This cross-episode retrieval at the level of the code for task variant was more likely when the rotating arena was about to match its orientation in the stable episode. (3) The likelihood of cross-episode retrieval was influenced by preretrieval information that was signaled at the level of the code for spatial reference frame. Thus key features of episodic recollection manifest in rat hippocampal

  6. Music Genre Classification using the multivariate AR feature integration model

    DEFF Research Database (Denmark)

    Ahrendt, Peter; Meng, Anders

    2005-01-01

    informative decisions about musical genre. For the MIREX music genre contest several authors derive long time features based either on statistical moments and/or temporal structure in the short time features. In our contribution we model a segment (1.2 s) of short time features (texture) using a multivariate...... autoregressive model. Other authors have applied simpler statistical models such as the mean-variance model, which also has been included in several of this years MIREX submissions, see e.g. Tzanetakis (2005); Burred (2005); Bergstra et al. (2005); Lidy and Rauber (2005)....

  7. Qualitative research methods: key features and insights gained from use in infection prevention research.

    Science.gov (United States)

    Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L

    2008-12-01

    Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices and identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques, such as interviews, to collect data and nonstatistical techniques to analyze it, provide detailed, diverse insights of individuals, useful quotes that bring a realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.

  8. Summary on several key techniques in 3D geological modeling.

    Science.gov (United States)

    Mei, Gang

    2014-01-01

    Several key techniques in 3D geological modeling including planar mesh generation, spatial interpolation, and surface intersection are summarized in this paper. Note that these techniques are generic and widely used in various applications but play a key role in 3D geological modeling. There are two essential procedures in 3D geological modeling: the first is the simulation of geological interfaces using geometric surfaces and the second is the building of geological objects by means of various geometric computations such as the intersection of surfaces. Discrete geometric surfaces that represent geological interfaces can be generated by creating planar meshes first and then spatially interpolating; those surfaces intersect and then form volumes that represent three-dimensional geological objects such as rock bodies. In this paper, the most commonly used algorithms of the key techniques in 3D geological modeling are summarized.
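Of the three techniques named above, spatial interpolation is the easiest to sketch. Below is inverse-distance weighting over scattered samples, one common (though not the only) interpolation choice when turning borehole-style points into a discrete geological interface; the sample coordinates are invented:

```python
def idw(points, x, y, power=2.0):
    """Inverse-distance-weighted interpolation of elevation at (x, y)
    from scattered (px, py, pz) samples."""
    num = den = 0.0
    for (px, py, pz) in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0:
            return pz  # exactly on a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * pz
        den += w
    return num / den

# Hypothetical borehole elevations at the corners of a 10 x 10 square.
samples = [(0, 0, 10.0), (10, 0, 12.0), (0, 10, 14.0), (10, 10, 16.0)]
print(idw(samples, 5, 5))   # centre of the square: 13.0 by symmetry
print(idw(samples, 0, 0))   # hits a sample exactly: 10.0
```

Interpolating elevations at the nodes of a planar mesh in this way yields the discrete surface; the subsequent surface-intersection step then bounds the rock volumes.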

  9. Practical Implementation of Various Public Key Infrastructure Models

    Directory of Open Access Journals (Sweden)

    Dmitriy Anatolievich Melnikov

    2016-03-01

    Full Text Available The paper offers a short comparative analysis of the contemporary models of public key infrastructure (PKI) and the issues arising in their real-world implementation. The Russian model of PKI is presented. Differences between the North American and West European models of PKI and the Russian model of PKI are described. The problems of creation and the main directions of further development and improvement of the Russian PKI, and its integration into the global trust environment, are defined.

  10. Safety Analysis for Key Design Features of KALIMER-600 Design Concept

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Kwon, Y. M.; Kim, E. K.; Suk, S. D.; Chang, W. P.; Jeong, H. Y.; Ha, K. S

    2007-02-15

    This report contains the safety analyses of the KALIMER-600 conceptual design, which KAERI has been developing under the Long-term Nuclear R and D Program. The analyses reflect the design developments during the second year of the 4th design phase of the program. The specific presentations are the key design features with the safety principles for achieving the safety objectives, the event categorization and safety criteria, and results of the safety analyses for the DBAs and ATWS events, the containment performance, and the channel blockages. The safety analyses for both the DBAs and ATWS events have been performed using SSC-K version 1.3, and the results have shown that the safety criteria for DBAs are fulfilled under conservative assumptions. The safety margins as well as the inherent safety have also been confirmed for the ATWS events. For the containment performance analysis, ORIGEN-2.1 and CONTAIN-LMR have been used. The structural integrity has been shown to be acceptable, and the evaluated exposure dose rate complies with 10 CFR 100 and PAG limits. The analysis results for flow blockages of 6-subchannels, 24-subchannels, and 54-subchannels with the MATRA-LMR-FB code have assured the integrity of the subassemblies.

  11. A Feature Fusion Based Forecasting Model for Financial Time Series

    Science.gov (United States)

    Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie

    2014-01-01

    Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables obtained by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in prediction than the other two similar models. PMID:24971455
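A drastically simplified version of this pipeline, with the ICA/CCA feature extraction replaced by two hand-picked fused features (previous close and a 3-day moving average) and the SVM replaced by ordinary least squares, shows the forecast-next-close structure on synthetic prices:

```python
def lstsq(X, y):
    """Solve the normal equations (X^T X) b = X^T y by Gaussian elimination
    with partial pivoting; a stand-in for the SVM regression stage."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * t for r, t in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    out = [0.0] * k
    for r in range(k - 1, -1, -1):
        out[r] = (b[r] - sum(A[r][c] * out[c] for c in range(r + 1, k))) / A[r][r]
    return out

# Synthetic closes: a trend plus a deterministic alternation, so an exact
# linear combination of the two features predicts the next close.
closes = [100 + 0.5 * t + (1 if t % 2 else -1) for t in range(30)]
X = [[1.0, closes[t - 1], sum(closes[t - 3:t]) / 3] for t in range(3, 29)]
y = [closes[t] for t in range(3, 29)]
beta = lstsq(X, y)
pred = beta[0] + beta[1] * closes[28] + beta[2] * sum(closes[26:29]) / 3
print(pred, closes[29])
```

Real market data admits no such exact fit; the sketch only exposes the fit-features-then-predict-tomorrow loop that the ICA/CCA/SVM pipeline implements with far stronger feature extraction.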

  12. BioModels: Content, Features, Functionality, and Use

    Science.gov (United States)

    Juty, N; Ali, R; Glont, M; Keating, S; Rodriguez, N; Swat, MJ; Wimalaratne, SM; Hermjakob, H; Le Novère, N; Laibe, C; Chelliah, V

    2015-01-01

    BioModels is a reference repository hosting mathematical models that describe the dynamic interactions of biological components at various scales. The resource provides access to over 1,200 models described in literature and over 140,000 models automatically generated from pathway resources. Most model components are cross-linked with external resources to facilitate interoperability. A large proportion of models are manually curated to ensure reproducibility of simulation results. This tutorial presents BioModels' content, features, functionality, and usage. PMID:26225232

  13. A Hierarchical Feature Extraction Model for Multi-Label Mechanical Patent Classification

    Directory of Open Access Journals (Sweden)

    Jie Hu

    2018-01-01

    Full Text Available Various studies have focused on feature extraction methods for automatic patent classification in recent years. However, most of these approaches are based on knowledge from experts in related domains. Here we propose a hierarchical feature extraction model (HFEM) for multi-label mechanical patent classification, which is able to capture both local features of phrases as well as global and temporal semantics. First, an n-gram feature extractor based on convolutional neural networks (CNNs) is designed to extract salient local lexical-level features. Next, a long dependency feature extraction model based on the bidirectional long–short-term memory (BiLSTM) neural network model is proposed to capture sequential correlations from higher-level sequence representations. Then the HFEM algorithm and its hierarchical feature extraction architecture are detailed. We establish the training, validation and test datasets, containing 72,532, 18,133, and 2679 mechanical patent documents, respectively, and then evaluate the performance of the HFEM. Finally, we compare the results of the proposed HFEM and three other single neural network models, namely CNN, long–short-term memory (LSTM), and BiLSTM. The experimental results indicate that our proposed HFEM outperforms the other compared models in both precision and recall.
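The CNN stage's job, extracting salient local n-gram features, can be sketched without a deep learning framework: slide an n-gram filter over token embeddings, apply ReLU, and max-pool over time. Embeddings and filter weights below are random stand-ins for learned parameters:

```python
import random

random.seed(1)

EMB = 4  # toy embedding width

def embed(tokens, _cache={}):
    """Deterministic pseudo-random embedding per word (stand-in for
    learned word vectors)."""
    vecs = []
    for w in tokens:
        if w not in _cache:
            rnd = random.Random(sum(ord(c) for c in w))
            _cache[w] = [rnd.uniform(-1, 1) for _ in range(EMB)]
        vecs.append(_cache[w])
    return vecs

def ngram_conv_features(tokens, filters, n=2):
    """Slide each n x EMB filter over the embeddings; the 0.0 floor acts
    as the ReLU, and taking the best window is the max-pool over time."""
    vecs = embed(tokens)
    feats = []
    for filt in filters:
        best = 0.0
        for i in range(len(vecs) - n + 1):
            act = sum(filt[j][k] * vecs[i + j][k]
                      for j in range(n) for k in range(EMB))
            best = max(best, act)
        feats.append(best)
    return feats

filters = [[[random.uniform(-1, 1) for _ in range(EMB)] for _ in range(2)]
           for _ in range(3)]
feats = ngram_conv_features("a gear assembly with a worm drive".split(), filters)
print(feats)
```

In the HFEM these pooled local features feed the BiLSTM stage, which models the longer-range dependencies a fixed-width n-gram window cannot see.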

  14. Key Elements of the User-Friendly, GFDL SKYHI General Circulation Model

    Directory of Open Access Journals (Sweden)

    Richard S. Hemler

    2000-01-01

    Full Text Available Over the past seven years, the portability of the GFDL SKYHI general circulation model has greatly increased. Modifications to the source code have allowed SKYHI to be run on the GFDL Cray Research PVP machines, the TMC CM-5 machine at Los Alamos National Laboratory, and more recently on the GFDL 40-processor Cray Research T3E system. At the same time, changes have been made to the model to make it more usable and flexible. Because of the reduction of the human resources available to manage and analyze scientific experiments, it is no longer acceptable to consider only the optimization of computer resources when producing a research code; one must also consider the availability and cost of the people necessary to maintain, modify and use the model as an investigative tool, and include these factors in defining the form of the model code. The new SKYHI model attempts to strike a balance between the optimization of the use of machine resources (CPU time, memory, disc) and the optimal use of human resources (ability to understand code, ability to modify code, ability to perturb code to do experiments, ability to run code on different platforms). Two of the key features that make the new SKYHI code more usable and flexible are the archiving package and the user variable block. The archiving package is used to manage the writing of all archive files, which contain data for later analysis. The model-supplied user variable block allows the easy inclusion of any new variables needed for particular experiments.

  15. Key enabling design features of the ITER HNB Duct Liner

    Energy Technology Data Exchange (ETDEWEB)

    Chuilon, Ben, E-mail: ben.chuilon@ccfe.ac.uk; Mistry, Sanjay; Andrews, Rodney; Verhoeven, Roel; Xue, Yongkuan

    2015-10-15

    Highlights: • Key engineering design details of the ITER HNB Duct Liner are presented. • A standardised CuCrZr water cooled panel that can be remotely handled is detailed. • Bolts are protected from beam power by means of a tungsten cap to radiate heat away. • Water connections placed coaxially are protected from beam power by a tungsten ring. • Explosion-bonded CuCrZr-316L panels result in a tenfold disruption torque reduction. - Abstract: The Duct Liner (DL) for the ITER Heating Neutral Beam (HNB) is a key component in the beam transport system. Duct Liners installed into equatorial ports 4 and 5 of the Vacuum Vessel (VV) will protect the port extension from power deposition due to re-ionisation and direct interception of the HNB. Furthermore, the DL contributes towards the shielding of the VV and superconducting coils from plasma photons and neutrons. The DL incorporates a 316L(N)-IG, deep-drilled and water cooled Neutron Shield (NS) whose internal walls are lined with actively cooled CuCrZr Duct Liner Modules (DLMs). These Remote Handling Class 2 and 3 panels provide protection from neutral beam power. This paper provides an overview of the preliminary design for the ITER HNB DL and focusses on critical features that ensure compatibility with: high heat flux requirements, remote maintenance procedures, and transient magnetic fields arising from major plasma disruptions. The power deposited on a single DLM can reach 300 kW with a peak power density of 2.4 MW/m². Feeding coolant to the DLMs is accomplished via welded connections to the internal coolant network of the NS. These are placed coaxially to allow for thermal expansion of the DLMs without the use of deformable connections. Critically, the remote maintenance of individual DLMs necessitates access to water connections and bolts from the beam facing surface, thus subjecting them to high heat flux loads. This design challenge will become more prevalent as fusion devices become more powerful

  16. Fabled IBM Tank nears launch without key features

    CERN Multimedia

    2003-01-01

    "IBM is preparing to roll out the TotalStorage SAN File System, the ballyhooed, renamed, much delayed Storage Tank the company's been working on for ages, although it now appears some of its key capabilities won't appear until next year in a later version" (1 page).

  17. Identifying key features of effective active learning: the effects of writing and peer discussion.

    Science.gov (United States)

    Linton, Debra L; Pangle, Wiline M; Wyatt, Kevin H; Powell, Karli N; Sherwood, Rachel E

    2014-01-01

    We investigated some of the key features of effective active learning by comparing the outcomes of three different methods of implementing active-learning exercises in a majors introductory biology course. Students completed activities in one of three treatments: discussion, writing, and discussion + writing. Treatments were rotated weekly between three sections taught by three different instructors in a full factorial design. The data set was analyzed by generalized linear mixed-effect models with three independent variables: student aptitude, treatment, and instructor, and three dependent (assessment) variables: change in score on pre- and postactivity clicker questions, and coding scores on in-class writing and exam essays. All independent variables had significant effects on student performance for at least one of the dependent variables. Students with higher aptitude scored higher on all assessments. Student scores were higher on exam essay questions when the activity was implemented with a writing component compared with peer discussion only. There was a significant effect of instructor, with instructors showing different degrees of effectiveness with active-learning techniques. We suggest that individual writing should be implemented as part of active learning whenever possible and that instructors may need training and practice to become effective with active learning. © 2014 D. L. Linton et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  18. Information security system based on virtual-optics imaging methodology and public key infrastructure

    Science.gov (United States)

    Peng, Xiang; Zhang, Peng; Cai, Lilong

    In this paper, we present a virtual-optical based information security system model with the aid of public-key-infrastructure (PKI) techniques. The proposed model employs a hybrid architecture in which our previously published encryption algorithm based on virtual-optics imaging methodology (VOIM) can be used to encipher and decipher data while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). For an asymmetric system, given an encryption key, it is computationally infeasible to determine the decryption key and vice versa. The whole information security model is run under the framework of PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOIM security approach has additional features like confidentiality, authentication, and integrity for the purpose of data encryption in a network environment.

  19. Enhancing Critical Infrastructure and Key Resources (CIKR) Level-0 Physical Process Security Using Field Device Distinct Native Attribute Features

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Liefer, Nathan C. [Wright-Patterson AFB, Dayton, OH (United States); Busho, Colin R. [Wright-Patterson AFB, Dayton, OH (United States); Temple, Michael A. [Wright-Patterson AFB, Dayton, OH (United States)

    2017-12-04

    Here, the need for improved Critical Infrastructure and Key Resource (CIKR) security is unquestioned, yet there has been minimal emphasis on Level-0 (PHY process) improvements. Wired Signal Distinct Native Attribute (WS-DNA) Fingerprinting is investigated here as a non-intrusive PHY-based security augmentation to support an envisioned layered security strategy. Results are based on experimental response collections from Highway Addressable Remote Transducer (HART) Differential Pressure Transmitter (DPT) devices from three manufacturers (Yokogawa, Honeywell, Endress+Hauser) installed in an automated process control system. Device discrimination is assessed using Time Domain (TD) and Slope-Based FSK (SB-FSK) fingerprints input to Multiple Discriminant Analysis, Maximum Likelihood (MDA/ML) and Random Forest (RndF) classifiers. For 12 different classes (two devices per manufacturer at two distinct set points), both classifiers performed reliably and achieved an arbitrary performance benchmark of average cross-class percent correct of %C > 90%. The least challenging cross-manufacturer results included near-perfect %C ≈ 100%, while the more challenging like-model (serial number) discrimination results included 90% < %C < 100%, with TD fingerprinting marginally outperforming SB-FSK fingerprinting; SB-FSK benefits from having less stringent response alignment and registration requirements. The RndF classifier was most beneficial and enabled reliable selection of dimensionally reduced fingerprint subsets that minimize data storage and computational requirements. The RndF-selected feature sets contained 15% of the full-dimensional feature sets and only suffered a worst-case %CΔ = 3% to 4% performance degradation.
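
As a rough illustration of such a fingerprint-classification pipeline (rank features, reduce to a 15% subset, then classify), the sketch below uses synthetic data, a Fisher-style feature score, and a nearest-centroid classifier in place of the MDA/ML and RndF classifiers; all values are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_per, n_feat = 4, 30, 40

# Synthetic stand-in for device fingerprints: each device class shifts the
# mean response of a few informative features, the rest is noise.
means = np.zeros((n_classes, n_feat))
means[:, :6] = rng.normal(0.0, 3.0, (n_classes, 6))
X = np.vstack([rng.normal(means[c], 1.0, (n_per, n_feat)) for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per)

# Rank features by a Fisher-style score: between-class over within-class variance
centroids = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
between = centroids.var(axis=0)
within = np.array([X[y == c].var(axis=0) for c in range(n_classes)]).mean(axis=0)
keep = np.argsort(between / within)[::-1][: max(1, int(0.15 * n_feat))]

# Nearest-centroid classification on the reduced fingerprint (a simple
# relative of the discriminant classifiers used in the paper)
dists = ((X[:, keep][:, None, :] - centroids[:, keep][None, :, :]) ** 2).sum(axis=-1)
pred = np.argmin(dists, axis=1)
pct_correct = float((pred == y).mean() * 100.0)
```

Even this crude score recovers the informative 15% of features, mirroring the paper's observation that a small selected subset retains most of the discrimination performance.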

  20. Hole Feature on Conical Face Recognition for Turning Part Model

    Science.gov (United States)

    Zubair, A. F.; Abu Mansor, M. S.

    2018-03-01

    Computer Aided Process Planning (CAPP) is the bridge between CAD and CAM, and pre-processing of the CAD data in the CAPP system is essential. For CNC turning parts, conical faces of the part model must be recognised in addition to cylindrical and planar faces. Because the sine-cosine structure of the cone radius differs from model to model, face identification in automatic feature recognition of the part model needs special attention. This paper focuses on hole features on conical faces that can be detected by the CAD solid modeller ACIS via the SAT file. Detection algorithms for face topology were generated and compared. The study shows different face setups for similar conical part models with different hole-type features. Three types of holes were compared, and the differences between merged and unmerged faces were studied.

  1. Application of the Value Optimization Model of Key Factors Based on DSEM

    Directory of Open Access Journals (Sweden)

    Chao Su

    2016-01-01

    The key factors of the damping solvent extraction method (DSEM) for the analysis of unbounded media are the size of the bounded domain, the artificial damping ratio, and the finite element mesh density. To control the simulation accuracy and computational efficiency of the soil-structure interaction analysis, this study establishes a value optimization model of the key factors, composed of the design variables, the objective function, and the constraint function system. The optimum values of the key factors are then obtained from the optimization model. Comparisons of the results obtained under different initial conditions show that the value optimization model is able to control the simulation accuracy and computational efficiency and to analyze practical unbounded medium-structure interaction.
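
The structure of such a value optimization model (design variables, an objective function, a constraint system) can be illustrated with a toy surrogate. The error and cost functions below are invented stand-ins, not the DSEM formulation.

```python
import itertools

# Hypothetical surrogate: simulation error falls with domain size and mesh
# density and is smallest near some damping ratio, while computational cost
# rises with size and density (a stand-in for the DSEM trade-off).
def error(size, damping, density):
    return 1.0 / (size * density) + abs(damping - 0.05)

def cost(size, density):
    return size * density  # proxy for computational effort

# Design variables: candidate values for the three key factors
sizes = [1.0, 2.0, 4.0, 8.0]        # bounded-domain size
dampings = [0.01, 0.05, 0.1]        # artificial damping ratio
densities = [4, 8, 16]              # finite element mesh density

best = None
for s, d, m in itertools.product(sizes, dampings, densities):
    if cost(s, m) > 64:             # constraint: bounded computational budget
        continue
    e = error(s, d, m)
    if best is None or e < best[0]:
        best = (e, s, d, m)
```

A grid search is the simplest way to make the "objective subject to constraints" structure concrete; a real application would use a proper constrained optimizer.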

  2. Neuroendocrine androgen action is a key extraovarian mediator in the development of polycystic ovary syndrome

    OpenAIRE

    Caldwell, Aimee S. L.; Edwards, Melissa C.; Desai, Reena; Jimenez, Mark; Gilchrist, Robert B.; Handelsman, David J.; Walters, Kirsty A.

    2017-01-01

    The cause of polycystic ovary syndrome (PCOS) is unknown, but androgen excess is a key feature. We combined a hyperandrogenized PCOS mouse model with global and tissue- and cell-specific androgen-resistant mouse lines to uncover the sites of androgen action that initiate PCOS. We demonstrate that direct androgen actions, particularly in neurons but less so in granulosa cells, are required for the development of key reproductive and metabolic PCOS features. These data highlight the previously ...

  3. Feature network models for proximity data : statistical inference, model selection, network representations and links with related models

    NARCIS (Netherlands)

    Frank, Laurence Emmanuelle

    2006-01-01

    Feature Network Models (FNM) are graphical structures that represent proximity data in a discrete space with the use of features. A statistical inference theory is introduced, based on the additivity properties of networks and the linear regression framework. Considering features as predictor

  4. DISTANCE AS KEY FACTOR IN MODELLING STUDENTS’ RECRUITMENT BY UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    SIMONA MĂLĂESCU

    2015-10-01

    In a previous paper, analysing the challenge of keeping up with current methodologies for the analysis and modelling of students' recruitment by universities in some ECE countries, which still do not register or develop the key data needed to take advantage of state-of-the-art knowledge in the domain, we promised to address the distance factor in future work owing to the extent of the topic. This paper fulfils that promise with a review of the literature dealing with modelling the geographical recruitment area of a university; combining distance with the proximate key factors previously reviewed completes the meta-analysis of the existing literature we started a year ago. Beyond the theoretical benefit, from a practical perspective the meta-analysis aims at synthesizing elements of good practice that can be applied to the local university system.

  5. Quantum key management

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
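
The Merkle hash tree at the heart of such signature schemes can be sketched briefly: leaves are (hypothetical) one-time public keys, and an authentication path of sibling hashes lets a verifier check any leaf against a single published root.

```python
from hashlib import sha256

def h(data: bytes) -> bytes:
    return sha256(data).digest()

def merkle_root(leaves):
    """Fold a list of leaves up to a single Merkle root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes needed to verify leaf `index` against the root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))  # (hash, is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

# Hypothetical one-time public keys as leaves (illustration only)
keys = [b"otk-%d" % i for i in range(8)]
root = merkle_root(keys)
ok = verify(keys[3], merkle_proof(keys, 3), root)
```

In a Merkle signature scheme each leaf would be the hash of a Winternitz one-time key pair, and each signature would include the corresponding authentication path.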

  6. Enhanced HMAX model with feedforward feature learning for multiclass categorization

    Directory of Open Access Journals (Sweden)

    Yinlin Li

    2015-10-01

    In recent years, the interdisciplinary research between neuroscience and computer vision has promoted the development in both fields. Many biologically inspired visual models are proposed, and among them, the Hierarchical Max-pooling model (HMAX) is a feedforward model mimicking the structures and functions of V1 to posterior inferotemporal (PIT) layer of the primate visual cortex, which could generate a series of position- and scale-invariant features. However, it could be improved with attention modulation and memory processing, which are two important properties of the primate visual cortex. Thus, in this paper, based on recent biological research on the primate visual cortex, we still mimic the first 100-150 milliseconds of visual cognition to enhance the HMAX model, which mainly focuses on the unsupervised feedforward feature learning process. The main modifications are as follows: (1) To mimic the attention modulation mechanism of V1 layer, a bottom-up saliency map is computed in the S1 layer of the HMAX model, which can support the initial feature extraction for memory processing; (2) To mimic the learning, clustering and short-term memory to long-term memory conversion abilities of V2 and IT, an unsupervised iterative clustering method is used to learn clusters with multiscale middle level patches, which are taken as long-term memory; (3) Inspired by the multiple feature encoding mode of the primate visual cortex, information including color, orientation, and spatial position are encoded in different layers of the HMAX model progressively. By adding a softmax layer at the top of the model, multiclass categorization experiments can be conducted, and the results on Caltech101 show that the enhanced model with a smaller memory size exhibits higher accuracy than the original HMAX model, and could also achieve better accuracy than other unsupervised feature learning methods in multiclass categorization task.

  7. Enhanced HMAX model with feedforward feature learning for multiclass categorization.

    Science.gov (United States)

    Li, Yinlin; Wu, Wei; Zhang, Bo; Li, Fengfu

    2015-01-01

    In recent years, the interdisciplinary research between neuroscience and computer vision has promoted the development in both fields. Many biologically inspired visual models are proposed, and among them, the Hierarchical Max-pooling model (HMAX) is a feedforward model mimicking the structures and functions of V1 to posterior inferotemporal (PIT) layer of the primate visual cortex, which could generate a series of position- and scale- invariant features. However, it could be improved with attention modulation and memory processing, which are two important properties of the primate visual cortex. Thus, in this paper, based on recent biological research on the primate visual cortex, we still mimic the first 100-150 ms of visual cognition to enhance the HMAX model, which mainly focuses on the unsupervised feedforward feature learning process. The main modifications are as follows: (1) To mimic the attention modulation mechanism of V1 layer, a bottom-up saliency map is computed in the S1 layer of the HMAX model, which can support the initial feature extraction for memory processing; (2) To mimic the learning, clustering and short-term memory to long-term memory conversion abilities of V2 and IT, an unsupervised iterative clustering method is used to learn clusters with multiscale middle level patches, which are taken as long-term memory; (3) Inspired by the multiple feature encoding mode of the primate visual cortex, information including color, orientation, and spatial position are encoded in different layers of the HMAX model progressively. By adding a softmax layer at the top of the model, multiclass categorization experiments can be conducted, and the results on Caltech101 show that the enhanced model with a smaller memory size exhibits higher accuracy than the original HMAX model, and could also achieve better accuracy than other unsupervised feature learning methods in multiclass categorization task.

  8. Operational Details of the Five Domains Model and Its Key Applications to the Assessment and Management of Animal Welfare

    Science.gov (United States)

    Mellor, David J.

    2017-01-01

    Simple Summary: The Five Domains Model is a focusing device to facilitate systematic, structured, comprehensive and coherent assessment of animal welfare; it is not a definition of animal welfare, nor is it intended to be an accurate representation of body structure and function. The purpose of each of the five domains is to draw attention to areas that are relevant to both animal welfare assessment and management. This paper begins by briefly describing the major features of the Model and the operational interactions between the five domains, and then it details seven interacting applications of the Model. These underlie its utility and increasing application to welfare assessment and management in diverse animal use sectors. Abstract: In accord with contemporary animal welfare science understanding, the Five Domains Model has a significant focus on subjective experiences, known as affects, which collectively contribute to an animal’s overall welfare state. Operationally, the focus of the Model is on the presence or absence of various internal physical/functional states and external circumstances that give rise to welfare-relevant negative and/or positive mental experiences, i.e., affects. The internal states and external circumstances of animals are evaluated systematically by referring to each of the first four domains of the Model, designated “Nutrition”, “Environment”, “Health” and “Behaviour”. Then affects, considered carefully and cautiously to be generated by factors in these domains, are accumulated into the fifth domain, designated “Mental State”. The scientific foundations of this operational procedure, published in detail elsewhere, are described briefly here, and then seven key ways the Model may be applied to the assessment and management of animal welfare are considered. These applications have the following beneficial objectives—they (1) specify key general foci for animal welfare management; (2) highlight the foundations of

  9. Characteristics of evolving models of care for arthritis: A key informant study

    Directory of Open Access Journals (Sweden)

    Veinot Paula

    2008-07-01

    Background: The burden of arthritis is increasing in the face of diminishing health human resources to deliver care. In response, innovative models of care delivery are developing to facilitate access to quality care. Most models have developed in response to local needs, with limited evaluation. The primary objective of this study is to (a) examine the range of models of care that deliver specialist services using a medical/surgical specialist and at least one other health care provider and (b) document the strengths and challenges of the identified models. A secondary objective is to identify key elements of best practice models of care for arthritis. Methods: Semi-structured interviews were conducted with a sample of key informants with expertise in arthritis from jurisdictions with primarily publicly funded health care systems. Qualitative data were analyzed using a constant comparative approach to identify common types of models of care, strengths and challenges of models, and key components of arthritis care. Results: Seventy-four key informants were interviewed from six countries. Five main types of models of care emerged. (1) Specialized arthritis programs deliver comprehensive, multidisciplinary team care for arthritis. Two models were identified using health care providers (e.g., nurses or physiotherapists) in expanded clinical roles: (2) triage of patients with musculoskeletal conditions to the appropriate services, including specialists; and (3) ongoing management in collaboration with a specialist. Two models promoting rural access were (4) rural consultation support and (5) telemedicine. Key informants described important components of models of care, including knowledgeable health professionals and patients. Conclusion: A range of models of care for arthritis have been developed. This classification can be used as a framework for discussing care delivery. 
Areas for development include integration of care across the continuum, including primary

  10. District-Wide Involvement: The Key to Successful School Improvement.

    Science.gov (United States)

    Mundell, Scott; Babich, George

    1989-01-01

    Describes the self-study process used by the Marana Unified School District to meet accreditation requirements with minimal expense, to emphasize curriculum development, and to improve the school. Considers the key feature of the cyclical review model to be the personal involvement of nearly every faculty member in the 10-school district. (DMM)

  11. Spatial age-length key modelling using continuation ratio logits

    DEFF Research Database (Denmark)

    Berg, Casper W.; Kristensen, Kasper

    2012-01-01

    A so-called age-length key (ALK) is then used to obtain the age distribution. Regional differences in ALKs are not uncommon, but stratification is often problematic due to a small number of samples. Here, we combine generalized additive modelling with continuation ratio logits to model the probability of age
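
The continuation-ratio construction can be made concrete: each logit gives the conditional probability of age a given age >= a, and the unconditional age distribution follows by multiplying through the survival terms. The coefficients below are invented for illustration, not fitted values.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def age_probs_from_cr(length, alphas, beta):
    """Unconditional age distribution from continuation-ratio logits.

    alphas[a] + beta * length is the logit of P(age == a | age >= a).
    The intercepts and slope here are hypothetical, not fitted ALK values.
    """
    probs, survive = [], 1.0
    for alpha in alphas:
        p_cond = logistic(alpha + beta * length)  # conditional probability
        probs.append(survive * p_cond)            # unconditional probability
        survive *= 1.0 - p_cond                   # P(age > a) so far
    probs.append(survive)                         # oldest group takes the rest
    return probs

# Age distribution for a fish of length 35 (arbitrary units, toy coefficients)
probs = age_probs_from_cr(length=35.0, alphas=[2.0, 1.0, 0.5], beta=-0.08)
```

By construction the probabilities sum to one, which is what makes continuation ratio logits convenient for building smooth, spatially varying ALKs.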

  12. Replication of surface features from a master model to an amorphous metallic article

    Science.gov (United States)

    Johnson, William L.; Bakke, Eric; Peker, Atakan

    1999-01-01

    The surface features of an article are replicated by preparing a master model having a preselected surface feature thereon which is to be replicated, and replicating the preselected surface feature of the master model. The replication is accomplished by providing a piece of a bulk-solidifying amorphous metallic alloy, contacting the piece of the bulk-solidifying amorphous metallic alloy to the surface of the master model at an elevated replication temperature to transfer a negative copy of the preselected surface feature of the master model to the piece, and separating the piece having the negative copy of the preselected surface feature from the master model.

  13. Key Questions in Building Defect Prediction Models in Practice

    Science.gov (United States)

    Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas

    The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.
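
A defect prediction model of the kind discussed can be sketched minimally as logistic regression on hypothetical module metrics (lines changed, historical defect count); the data, metrics and coefficients below are synthetic illustrations, not the studied project's setup.

```python
import math
import random

random.seed(1)

# Hypothetical module metrics: (lines changed, past defects) -> defect-prone?
data = [((random.gauss(300, 80), random.gauss(4.0, 1.0)), 1) for _ in range(40)] \
     + [((random.gauss(80, 40), random.gauss(0.5, 0.5)), 0) for _ in range(40)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Plain batch gradient descent on the logistic loss (size feature scaled)
w = [0.0, 0.0, 0.0]                       # bias, w_size, w_history
for _ in range(2000):
    g = [0.0, 0.0, 0.0]
    for (size, hist), label in data:
        x = (1.0, size / 100.0, hist)
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        for i in range(3):
            g[i] += (p - label) * x[i]
    w = [wi - 0.05 * gi / len(data) for wi, gi in zip(w, g)]

# Training-set accuracy of the fitted model
acc = sum(
    (sigmoid(w[0] + w[1] * s / 100.0 + w[2] * hst) > 0.5) == bool(lbl)
    for (s, hst), lbl in data
) / len(data)
```

In practice one would validate on a later release, as the paper does with seven consecutive versions, rather than score on the training data.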

  14. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  15. Bile Routing Modification Reproduces Key Features of Gastric Bypass in Rat.

    Science.gov (United States)

    Goncalves, Daisy; Barataud, Aude; De Vadder, Filipe; Vinera, Jennifer; Zitoun, Carine; Duchampt, Adeline; Mithieux, Gilles

    2015-12-01

    Objective: To evaluate the role of bile routing modification in the beneficial effects of gastric bypass surgery on glucose and energy metabolism. Background: Gastric bypass surgery (GBP) promotes early improvements in glucose and energy homeostasis in obese diabetic patients. A suggested mechanism associates a decrease in hepatic glucose production with an enhanced intestinal gluconeogenesis. Moreover, plasma bile acids are elevated after GBP, and bile acids are inhibitors of gluconeogenesis. Methods: In male Sprague-Dawley rats, we performed bile diversions from the bile duct to the mid-jejunum or the mid-ileum to match the modified bile delivery in the gut occurring in GBP. Body weight, food intake, glucose tolerance, insulin sensitivity, and food preference were analyzed. The expression of gluconeogenesis genes was evaluated in both the liver and the intestine. Results: Bile diversions mimicking GBP promote an increase in plasma bile acids and a marked improvement in glucose control. The modification of bile bioavailability is causal, because a bile acid sequestrant suppresses the beneficial effects of bile diversions on glucose control. In agreement with the inhibitory role of bile acids on gluconeogenesis, bile diversions promote a blunting in hepatic glucose production, whereas intestinal gluconeogenesis is increased in the gut segments devoid of bile. In rats fed a high-fat high-sucrose diet, bile diversions improve glucose control and dramatically decrease food intake because of an acquired disinterest in fatty food. Conclusions: This study shows that bile routing modification is a key mechanistic feature in the beneficial outcomes of GBP.

  16. Modeling crash injury severity by road feature to improve safety.

    Science.gov (United States)

    Penmetsa, Praveena; Pulugurtha, Srinivas S

    2018-01-02

    The objective of this research is 2-fold: to (a) model and identify critical road features (or locations) based on crash injury severity and compare it with crash frequency and (b) model and identify drivers who are more likely to contribute to crashes by road feature. Crash data from 2011 to 2013 were obtained from the Highway Safety Information System (HSIS) for the state of North Carolina. Twenty-three different road features were considered, analyzed, and compared with each other as well as no road feature. A multinomial logit (MNL) model was developed and odds ratios were estimated to investigate the effect of road features on crash injury severity. Among the many road features, underpass, end or beginning of a divided highway, and on-ramp terminal on crossroad are the top 3 critical road features. Intersection crashes are frequent but are not highly likely to result in severe injuries compared to critical road features. Roundabouts are least likely to result in both severe and moderate injuries. Female drivers are more likely to be involved in crashes at intersections (4-way and T) compared to male drivers. Adult drivers are more likely to be involved in crashes at underpasses. Older drivers are 1.6 times more likely to be involved in a crash at the end or beginning of a divided highway. The findings from this research help to identify critical road features that need to be given priority. As an example, additional advanced warning signs and providing enlarged or highly retroreflective signs that grab the attention of older drivers may help in making locations such as end or beginning of a divided highway much safer. Educating drivers about the necessary skill sets required at critical road features in addition to engineering solutions may further help them adopt safe driving behaviors on the road.
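
Odds ratios of the kind reported (for instance, older drivers being 1.6 times more likely to be involved in a crash at the end or beginning of a divided highway) are ratios of odds between an exposed and a reference group. The sketch below computes one from hypothetical crash counts, not from the HSIS data.

```python
def odds_ratio(exposed_cases, exposed_noncases, control_cases, control_noncases):
    """Odds ratio of severe injury for crashes at a given road feature
    versus crashes away from it. All counts here are hypothetical."""
    return (exposed_cases / exposed_noncases) / (control_cases / control_noncases)

# Hypothetical counts: severe vs. non-severe crashes at underpasses vs. elsewhere
or_underpass = odds_ratio(32, 180, 410, 5200)
```

A multinomial logit model generalizes this idea: each estimated coefficient, exponentiated, is an adjusted odds ratio that controls for the other road features and driver attributes simultaneously.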

  17. The Five Key Questions of Human Performance Modeling.

    Science.gov (United States)

    Wu, Changxu

    2018-01-01

    Via building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance, human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why we build models of human performance; 2) What the expectations of a good human performance model are; 3) What the procedures and requirements in building and verifying a human performance model are; 4) How we integrate a human performance model with system design; and 5) What the possible future directions of human performance modeling research are. Recent and classic HPM findings are addressed in the five questions to provide new thinking in HPM's motivations, expectations, procedures, system integration and future directions.

  18. A feature-based approach to modeling protein-protein interaction hot spots.

    Science.gov (United States)

    Cho, Kyu-il; Kim, Dongsup; Lee, Doheon

    2009-05-01

    Identifying features that effectively represent the energetic contribution of an individual interface residue to the interactions between proteins remains problematic. Here, we present several new features and show that they are more effective than conventional features. By combining the proposed features with conventional features, we develop a predictive model for interaction hot spots. Initially, 54 multifaceted features, composed of different levels of information including structure, sequence and molecular interaction information, are quantified. Then, to identify the best subset of features for predicting hot spots, feature selection is performed using a decision tree. Based on the selected features, a predictive model for hot spots is created using support vector machine (SVM) and tested on an independent test set. Our model shows better overall predictive accuracy than previous methods such as the alanine scanning methods Robetta and FOLDEF, and the knowledge-based method KFC. Subsequent analysis yields several findings about hot spots. As expected, hot spots have a larger relative surface area burial and are more hydrophobic than other residues. Unexpectedly, however, residue conservation displays a rather complicated tendency depending on the types of protein complexes, indicating that this feature is not good for identifying hot spots. Of the selected features, the weighted atomic packing density, relative surface area burial and weighted hydrophobicity are the top 3, with the weighted atomic packing density proving to be the most effective feature for predicting hot spots. Notably, we find that hot spots are closely related to π-related interactions, especially π···π interactions.
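
One of the top-ranked features, relative surface-area burial, is straightforward to compute: the fraction of a residue's solvent-accessible surface area (ASA) lost upon complex formation. The ASA values below are hypothetical, chosen only to contrast a deeply buried candidate hot spot with a peripheral residue.

```python
def relative_burial(asa_free: float, asa_complex: float) -> float:
    """Fraction of a residue's solvent-accessible surface area (in Å²)
    buried on complex formation. Input values here are hypothetical."""
    return (asa_free - asa_complex) / asa_free

hot_candidate = relative_burial(120.0, 15.0)   # deeply buried interface residue
periphery = relative_burial(110.0, 95.0)       # barely buried rim residue
```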

  19. A feature-based approach to modeling protein–protein interaction hot spots

    Science.gov (United States)

    Cho, Kyu-il; Kim, Dongsup; Lee, Doheon

    2009-01-01

    Identifying features that effectively represent the energetic contribution of an individual interface residue to the interactions between proteins remains problematic. Here, we present several new features and show that they are more effective than conventional features. By combining the proposed features with conventional features, we develop a predictive model for interaction hot spots. Initially, 54 multifaceted features, composed of different levels of information including structure, sequence and molecular interaction information, are quantified. Then, to identify the best subset of features for predicting hot spots, feature selection is performed using a decision tree. Based on the selected features, a predictive model for hot spots is created using support vector machine (SVM) and tested on an independent test set. Our model shows better overall predictive accuracy than previous methods such as the alanine scanning methods Robetta and FOLDEF, and the knowledge-based method KFC. Subsequent analysis yields several findings about hot spots. As expected, hot spots have a larger relative surface area burial and are more hydrophobic than other residues. Unexpectedly, however, residue conservation displays a rather complicated tendency depending on the types of protein complexes, indicating that this feature is not good for identifying hot spots. Of the selected features, the weighted atomic packing density, relative surface area burial and weighted hydrophobicity are the top 3, with the weighted atomic packing density proving to be the most effective feature for predicting hot spots. Notably, we find that hot spots are closely related to π–related interactions, especially π · · · π interactions. PMID:19273533

  20. The effective field theory of inflation models with sharp features

    International Nuclear Information System (INIS)

    Bartolo, Nicola; Cannone, Dario; Matarrese, Sabino

    2013-01-01

    We describe models of single-field inflation with small and sharp step features in the potential (and sound speed) of the inflaton field, in the context of the Effective Field Theory of Inflation. This approach allows us to study the effects of features in the power-spectrum and in the bispectrum of curvature perturbations, from a model-independent point of view, by parametrizing the features directly with modified "slow-roll" parameters. We can obtain a self-consistent power-spectrum, together with enhanced non-Gaussianity, which grows with a quantity β that parametrizes the sharpness of the step. With this treatment it is straightforward to generalize and include features in other coefficients of the effective action of the inflaton field fluctuations. Our conclusion in this case is that, excluding extrinsic curvature terms, the only interesting effects at the level of the bispectrum could arise from features in the first slow-roll parameter ε or in the speed of sound c_s. Finally, we derive an upper bound on the parameter β from the consistency of the perturbative expansion of the action for inflaton perturbations. This constraint can be used for an estimation of the signal-to-noise ratio, to show that the observable which is most sensitive to features is the power-spectrum. This conclusion would change if we consider the contemporary presence of a feature and a speed of sound c_s < 1, as, in such a case, contributions from an oscillating folded configuration can potentially make the bispectrum the leading observable for feature models

  1. Slim Battery Modelling Features

    Science.gov (United States)

    Borthomieu, Y.; Prevot, D.

    2011-10-01

    Saft has developed a life prediction model for VES and MPS cells and batteries. The Saft Li-ion Model (SLIM) is a macroscopic electrochemical model based on energy (global at cell level). Its main purpose is to predict battery performance over the life of GEO, MEO and LEO missions. The model is based on electrochemical characteristics such as energy, capacity, EMF, internal resistance and end-of-charge voltage. It applies fading and calendar laws to energy and internal impedance as functions of time, temperature and end-of-charge voltage. Based on the mission profile and the satellite power system characteristics, the model proposes various battery configurations. For each configuration, the model gives the battery performance using mission figures and profiles: power, duration, DOD, end-of-charge voltages, temperatures during eclipses and solstices, thermal dissipation and cell failures. For GEO/MEO missions, eclipse and solstice periods can include specific profiles such as plasmic propulsion fires and specific balancing operations. For LEO missions, the model is able to simulate high power peaks to predict radar pulses. Saft's main customers have been using the SLIM model, available in house, for two years. The purpose is to enable the satellite builder's power engineers to perform their own battery simulations during battery pre-dimensioning activities. The simulations can be shared with Saft engineers to refine the power system designs. The model has been correlated with existing life and calendar tests performed on all the VES and MPS cells. In comparison with life tests lasting more than 10 years, the accuracy of the model from a voltage point of view is within 10 mV at end of life. In addition, a comparison with in-orbit data has also been made. This paper will present the main features of the SLIM software and compare its outputs with real-life test results.
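
A calendar-fading law of the general kind such models use (capacity loss growing with time, temperature and end-of-charge voltage) can be sketched as follows. The square-root time law, the Arrhenius factor and every coefficient are illustrative assumptions, not Saft's proprietary SLIM formulation.

```python
import math

def capacity_fade(t_days, temp_c, eocv, c0=1.0):
    """Remaining relative capacity after calendar ageing. Illustration only:
    square-root-of-time fade, Arrhenius temperature acceleration, and an
    end-of-charge-voltage stress factor, all with made-up coefficients."""
    k_ref = 2e-4          # hypothetical fade rate at the reference temperature
    e_a, r = 50000.0, 8.314   # hypothetical activation energy (J/mol), gas constant
    t_ref = 298.15            # reference temperature: 25 °C in kelvin
    arrhenius = math.exp(-e_a / r * (1.0 / (temp_c + 273.15) - 1.0 / t_ref))
    v_stress = 1.0 + 2.0 * max(0.0, eocv - 4.0)   # penalise high EOCV
    return c0 * (1.0 - k_ref * arrhenius * v_stress * math.sqrt(t_days))

# Example: remaining capacity after a 15-year GEO mission at 20 °C, EOCV 3.95 V
c_geo_15y = capacity_fade(15 * 365.0, 20.0, 3.95)
```

The qualitative behaviour matches the abstract's description: fade accelerates with temperature and end-of-charge voltage, which is why the model takes both as inputs alongside the mission timeline.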

  2. A national-scale model of linear features improves predictions of farmland biodiversity.

    Science.gov (United States)

    Sullivan, Martin J P; Pearce-Higgins, James W; Newson, Stuart E; Scholefield, Paul; Brereton, Tom; Oliver, Tom H

    2017-12-01

Modelling species distribution and abundance is important for many conservation applications, but it is typically performed using relatively coarse-scale environmental variables such as the area of broad land-cover types. Fine-scale environmental data capturing the most biologically relevant variables have the potential to improve these models. For example, field studies have demonstrated the importance of linear features, such as hedgerows, for multiple taxa, but the absence of large-scale datasets of their extent prevents their inclusion in large-scale modelling studies. We assessed whether a novel spatial dataset mapping linear and woody-linear features across the UK improves the performance of abundance models of 18 bird and 24 butterfly species across 3723 and 1547 UK monitoring sites, respectively. Although improvements in explanatory power were small, the inclusion of linear features data significantly improved model predictive performance for many species. For some species, the importance of linear features depended on landscape context, with greater importance in agricultural areas. Synthesis and applications. This study demonstrates that a national-scale model of the extent and distribution of linear features improves predictions of farmland biodiversity. The ability to model spatial variability in the role of linear features such as hedgerows will be important in targeting agri-environment schemes to maximally deliver biodiversity benefits. Although this study focuses on farmland, data on the extent of different linear features are likely to improve species distribution and abundance models in a wide range of systems and also can potentially be used to assess habitat connectivity.
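The core test in this record is whether adding a fine-scale linear-feature covariate improves held-out predictive performance of an abundance model. A toy sketch of that comparison on synthetic data (the site counts, covariates and the log-linear fit are all illustrative stand-ins for the paper's Poisson abundance models):

```python
import numpy as np

# Synthetic site-level data: bird counts respond to a coarse land-cover
# covariate and to hedgerow (linear-feature) density.
rng = np.random.default_rng(0)
n = 600
woodland = rng.uniform(0, 1, n)
hedgerow = rng.uniform(0, 1, n)
counts = rng.poisson(np.exp(0.5 + 1.0 * woodland + 1.5 * hedgerow))

def heldout_rmse(covariates, y, split=300):
    # Log-linear model fit by least squares on log(y + 1):
    # a crude stand-in for the Poisson GLMs used in abundance modelling.
    A = np.column_stack([np.ones(n)] + covariates)
    beta, *_ = np.linalg.lstsq(A[:split], np.log1p(y[:split]), rcond=None)
    pred = np.expm1(A[split:] @ beta)
    return float(np.sqrt(np.mean((pred - y[split:]) ** 2)))

rmse_coarse = heldout_rmse([woodland], counts)            # land cover only
rmse_linear = heldout_rmse([woodland, hedgerow], counts)  # + linear features
print(rmse_coarse, rmse_linear)
```

When the counts truly depend on hedgerow density, the model that includes the linear-feature covariate predicts held-out sites with lower error, mirroring the paper's finding of improved predictive performance.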

  3. Automated prostate cancer detection via comprehensive multi-parametric magnetic resonance imaging texture feature models

    International Nuclear Information System (INIS)

    Khalvati, Farzad; Wong, Alexander; Haider, Masoom A.

    2015-01-01

    Prostate cancer is the most common form of cancer and the second leading cause of cancer death in North America. Auto-detection of prostate cancer can play a major role in early detection of prostate cancer, which has a significant impact on patient survival rates. While multi-parametric magnetic resonance imaging (MP-MRI) has shown promise in diagnosis of prostate cancer, the existing auto-detection algorithms do not take advantage of abundance of data available in MP-MRI to improve detection accuracy. The goal of this research was to design a radiomics-based auto-detection method for prostate cancer via utilizing MP-MRI data. In this work, we present new MP-MRI texture feature models for radiomics-driven detection of prostate cancer. In addition to commonly used non-invasive imaging sequences in conventional MP-MRI, namely T2-weighted MRI (T2w) and diffusion-weighted imaging (DWI), our proposed MP-MRI texture feature models incorporate computed high-b DWI (CHB-DWI) and a new diffusion imaging modality called correlated diffusion imaging (CDI). Moreover, the proposed texture feature models incorporate features from individual b-value images. A comprehensive set of texture features was calculated for both the conventional MP-MRI and new MP-MRI texture feature models. We performed feature selection analysis for each individual modality and then combined best features from each modality to construct the optimized texture feature models. The performance of the proposed MP-MRI texture feature models was evaluated via leave-one-patient-out cross-validation using a support vector machine (SVM) classifier trained on 40,975 cancerous and healthy tissue samples obtained from real clinical MP-MRI datasets. The proposed MP-MRI texture feature models outperformed the conventional model (i.e., T2w+DWI) with regard to cancer detection accuracy. Comprehensive texture feature models were developed for improved radiomics-driven detection of prostate cancer using MP-MRI. Using a
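The evaluation protocol named here, leave-one-patient-out cross-validation with an SVM, can be sketched on synthetic data (the clinical MP-MRI features and labels are not public, so the feature matrix below is an invented stand-in):

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for per-voxel texture features with a clean class boundary.
rng = np.random.default_rng(1)
n_patients, per_patient = 10, 40
X = rng.normal(size=(n_patients * per_patient, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # "cancerous" vs "healthy"
groups = np.repeat(np.arange(n_patients), per_patient)

# Leave-one-patient-out: each fold holds out every sample from one patient,
# so no patient contributes to both training and testing.
scores = cross_val_score(SVC(kernel="rbf"), X, y, groups=groups,
                         cv=LeaveOneGroupOut())
print(f"mean per-patient accuracy: {scores.mean():.2f}")
```

Grouping by patient matters because voxels from the same patient are correlated; ordinary k-fold splits would leak patient-specific texture into the test folds and overstate accuracy.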

  4. The idiopathic interstitial pneumonias: understanding key radiological features

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, S. [Department of Radiology, Churchill Hospital, Old Road, Oxford OX3 7LJ (United Kingdom); Benamore, R., E-mail: Rachel.Benamore@orh.nhs.u [Department of Radiology, Churchill Hospital, Old Road, Oxford OX3 7LJ (United Kingdom)

    2010-10-15

    Many radiologists find it challenging to distinguish between the different interstitial idiopathic pneumonias (IIPs). The British Thoracic Society guidelines on interstitial lung disease (2008) recommend the formation of multidisciplinary meetings, with diagnoses made by combined radiological, pathological, and clinical findings. This review focuses on understanding typical and atypical radiological features on high-resolution computed tomography between the different IIPs, to help the radiologist determine when a confident diagnosis can be made and how to deal with uncertainty.

  5. The idiopathic interstitial pneumonias: understanding key radiological features

    International Nuclear Information System (INIS)

    Dixon, S.; Benamore, R.

    2010-01-01

    Many radiologists find it challenging to distinguish between the different interstitial idiopathic pneumonias (IIPs). The British Thoracic Society guidelines on interstitial lung disease (2008) recommend the formation of multidisciplinary meetings, with diagnoses made by combined radiological, pathological, and clinical findings. This review focuses on understanding typical and atypical radiological features on high-resolution computed tomography between the different IIPs, to help the radiologist determine when a confident diagnosis can be made and how to deal with uncertainty.

  6. A Method for Model Checking Feature Interactions

    DEFF Research Database (Denmark)

    Pedersen, Thomas; Le Guilly, Thibaut; Ravn, Anders Peter

    2015-01-01

This paper presents a method to check for feature interactions in a system assembled from independently developed concurrent processes as found in many reactive systems. The method combines and refines existing definitions and adds a set of activities. The activities describe how to populate the definitions with models to ensure that all interactions are captured. The method is illustrated on a home automation example with model checking as analysis tool. In particular, the modelling formalism is timed automata and the analysis uses UPPAAL to find interactions.

  7. Numerical rigid plastic modelling of shear capacity of keyed joints

    DEFF Research Database (Denmark)

    Herfelt, Morten Andersen; Poulsen, Peter Noe; Hoang, Linh Cao

    2015-01-01

Keyed shear joints are currently designed using simple and conservative design formulas, yet these formulas do not take the local mechanisms in the concrete core of the joint into account. To investigate this phenomenon a rigid, perfectly plastic finite element model of keyed joints is used. The model is formulated for second-order conic optimisation as a lower bound problem, which yields a statically admissible stress field that satisfies the yield condition in every point. The dual solution to the problem can be interpreted as the collapse mode and will be used to analyse the properties...

  8. Projecting biodiversity and wood production in future forest landscapes: 15 key modeling considerations.

    Science.gov (United States)

    Felton, Adam; Ranius, Thomas; Roberge, Jean-Michel; Öhman, Karin; Lämås, Tomas; Hynynen, Jari; Juutinen, Artti; Mönkkönen, Mikko; Nilsson, Urban; Lundmark, Tomas; Nordin, Annika

    2017-07-15

    A variety of modeling approaches can be used to project the future development of forest systems, and help to assess the implications of different management alternatives for biodiversity and ecosystem services. This diversity of approaches does however present both an opportunity and an obstacle for those trying to decide which modeling technique to apply, and interpreting the management implications of model output. Furthermore, the breadth of issues relevant to addressing key questions related to forest ecology, conservation biology, silviculture, economics, requires insights stemming from a number of distinct scientific disciplines. As forest planners, conservation ecologists, ecological economists and silviculturalists, experienced with modeling trade-offs and synergies between biodiversity and wood biomass production, we identified fifteen key considerations relevant to assessing the pros and cons of alternative modeling approaches. Specifically we identified key considerations linked to study question formulation, modeling forest dynamics, forest processes, study landscapes, spatial and temporal aspects, and the key response metrics - biodiversity and wood biomass production, as well as dealing with trade-offs and uncertainties. We also provide illustrative examples from the modeling literature stemming from the key considerations assessed. We use our findings to reiterate the need for explicitly addressing and conveying the limitations and uncertainties of any modeling approach taken, and the need for interdisciplinary research efforts when addressing the conservation of biodiversity and sustainable use of environmental resources. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. A Novel DBN Feature Fusion Model for Cross-Corpus Speech Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Zou Cairong

    2016-01-01

Full Text Available Feature fusion from separate sources is a current technical difficulty of cross-corpus speech emotion recognition. The purpose of this paper is, based on Deep Belief Nets (DBN) in deep learning, to use the emotional information hidden in the speech spectrum diagram (spectrogram) as image features and to fuse them with the traditional emotion features. First, based on spectrogram analysis by the STB/Itti model, new spectrogram features are extracted from the color, the brightness, and the orientation, respectively; then two alternative DBN models fuse the traditional and the spectrogram features, which increases the scale of the feature subset and its ability to characterize emotion. In experiments on the ABC database and Chinese corpora, the new feature subset improves the cross-corpus recognition result by 8.8% compared with traditional speech emotion features. The proposed method provides a new idea for feature fusion in emotion recognition.

  10. A keyword spotting model using perceptually significant energy features

    Science.gov (United States)

    Umakanthan, Padmalochini

The task of a keyword recognition system is to detect the presence of certain words in a conversation based on the linguistic information present in human speech. Such keyword spotting systems have applications in homeland security, telephone surveillance and human-computer interfacing. The general procedure of a keyword spotting system involves feature generation and matching. In this work, a new set of features based on the psycho-acoustic masking nature of human speech is proposed. After developing these features, a time-aligned pattern matching process was implemented to locate the words in a set of unknown words. A word boundary detection technique based on frame classification using the nonlinear characteristics of speech is also addressed in this work. Validation of this keyword spotting model was done using the widely acclaimed cepstral features. The experimental results indicate the viability of using these perceptually significant features as an augmented feature set in keyword spotting.
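The "time-aligned pattern matching" step is typically some variant of dynamic time warping (DTW), which tolerates a keyword being spoken faster or slower than its template. A minimal sketch on scalar sequences (the paper's perceptual energy features would replace the scalar samples used here):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping cost between two 1-D feature sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)   # accumulated-cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

template = np.array([0.0, 1.0, 2.0, 1.0, 0.0])       # keyword template
match    = np.array([0.0, 1.0, 1.0, 2.0, 1.0, 0.0])  # time-stretched utterance
other    = np.array([2.0, 2.0, 2.0, 2.0, 2.0])
print(dtw_distance(template, match), dtw_distance(template, other))
```

The stretched utterance aligns to the template at zero cost despite its extra frame, while the unrelated sequence accumulates a large cost; thresholding that cost is the spotting decision.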

  11. Language Recognition Using Latent Dynamic Conditional Random Field Model with Phonological Features

    Directory of Open Access Journals (Sweden)

    Sirinoot Boonsuk

    2014-01-01

Full Text Available Spoken language recognition (SLR) has been of increasing interest in multilingual speech recognition for identifying the languages of speech utterances. Most existing SLR approaches apply statistical modeling techniques with acoustic and phonotactic features. Among the popular approaches, the acoustic approach has become of greater interest than others because it does not require any prior language-specific knowledge. Previous research on the acoustic approach has shown little interest in applying linguistic knowledge, which was only used as supplementary features, while the current state-of-the-art system assumes independence among features. This paper proposes an SLR system based on the latent-dynamic conditional random field (LDCRF) model using phonological features (PFs). We use PFs to represent acoustic characteristics and linguistic knowledge. The LDCRF model was employed to capture the dynamics of the PF sequences for language classification. Baseline systems were built to evaluate the features and methods, including Gaussian mixture model (GMM) based systems using PFs, a GMM using cepstral features, and the CRF model using PFs. Evaluated on the NIST LRE 2007 corpus, the proposed method showed an improvement over the baseline systems. Additionally, it showed a comparable result with the acoustic system based on i-vectors. This research demonstrates that utilizing PFs can enhance the performance.

  12. Electronic assessment of clinical reasoning in clerkships: A mixed-methods comparison of long-menu key-feature problems with context-rich single best answer questions

    NARCIS (Netherlands)

    Huwendiek, S.; Reichert, F.; Duncker, C.; Leng, B.A. De; Vleuten, C.P.M. van der; Muijtjens, A.M.; Bosse, H.M.; Haag, M.; Hoffmann, G.F.; Tonshoff, B.; Dolmans, D.

    2017-01-01

    BACKGROUND: It remains unclear which item format would best suit the assessment of clinical reasoning: context-rich single best answer questions (crSBAs) or key-feature problems (KFPs). This study compared KFPs and crSBAs with respect to students' acceptance, their educational impact, and

  13. Construction Method of the Topographical Features Model for Underwater Terrain Navigation

    Directory of Open Access Journals (Sweden)

    Wang Lihui

    2015-09-01

Full Text Available A terrain database is the reference basis for an autonomous underwater vehicle (AUV) to implement underwater terrain navigation (UTN), and an important part of building a topographical features model for UTN. To investigate the feasibility and correlation of a variety of terrain parameters as terrain navigation information metrics, this paper describes and analyzes the underwater terrain features and the calculation of topography parameters. It proposes a comprehensive evaluation method for terrain navigation information and constructs an underwater navigation information analysis model associated with topographic features. Simulation results show that underwater terrain features are directly or indirectly associated with UTN information, and directly affect the terrain matching capture probability and the positioning accuracy.
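The intuition behind "terrain navigation information" is that rough, variable bathymetry gives a matching algorithm more to lock onto than flat seabed. A sketch of a few generic topography parameters over a depth grid (the paper's exact parameter set is not given, so these metrics are illustrative):

```python
import numpy as np

def terrain_information(z, cell=10.0):
    """Illustrative topographic parameters for a bathymetry grid z (metres).

    Rougher, more variable terrain carries more navigation information:
    a flat grid scores zero on every metric.
    """
    dzy, dzx = np.gradient(z, cell)                 # finite-difference slopes
    slope_deg = np.degrees(np.arctan(np.hypot(dzx, dzy)))
    return {
        "std_depth": float(z.std()),                # depth standard deviation
        "relief": float(np.ptp(z)),                 # max depth - min depth
        "mean_slope_deg": float(slope_deg.mean()),  # average slope angle
    }

flat = np.zeros((50, 50))
rough = np.add.outer(np.linspace(0, 100, 50),
                     20 * np.sin(np.linspace(0, 6, 50)))
print(terrain_information(flat), terrain_information(rough))
```

A terrain-matching planner could threshold such metrics to decide where fixes are likely to capture, which is the kind of correlation between terrain features and matching capture probability the abstract reports.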

  14. Model-Based Learning of Local Image Features for Unsupervised Texture Segmentation

    Science.gov (United States)

    Kiechle, Martin; Storath, Martin; Weinmann, Andreas; Kleinsteuber, Martin

    2018-04-01

    Features that capture well the textural patterns of a certain class of images are crucial for the performance of texture segmentation methods. The manual selection of features or designing new ones can be a tedious task. Therefore, it is desirable to automatically adapt the features to a certain image or class of images. Typically, this requires a large set of training images with similar textures and ground truth segmentation. In this work, we propose a framework to learn features for texture segmentation when no such training data is available. The cost function for our learning process is constructed to match a commonly used segmentation model, the piecewise constant Mumford-Shah model. This means that the features are learned such that they provide an approximately piecewise constant feature image with a small jump set. Based on this idea, we develop a two-stage algorithm which first learns suitable convolutional features and then performs a segmentation. We note that the features can be learned from a small set of images, from a single image, or even from image patches. The proposed method achieves a competitive rank in the Prague texture segmentation benchmark, and it is effective for segmenting histological images.
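The learning objective described here asks for features that are approximately piecewise constant across textures with a small jump set. A 1-D toy version of the two-stage idea, using a hand-picked local standard-deviation filter as a stand-in for the learned convolutional features:

```python
import numpy as np

# Two 1-D "textures": a smooth region followed by a noisy one.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 0.1, 500), rng.normal(0.0, 1.0, 500)])

# Stage 1: a local-standard-deviation feature.  It is approximately
# piecewise constant (about 0.1 then about 1.0) with a single jump at the
# texture boundary -- exactly the property the learning objective rewards.
w = 51
windows = np.lib.stride_tricks.sliding_window_view(x, w)
feature = windows.std(axis=1)

# Stage 2: segment the (nearly piecewise constant) feature image.
labels = (feature > feature.mean()).astype(int)
print(labels[:400].mean(), labels[-400:].mean())
```

In the paper this filter is not hand-picked: the convolution kernels themselves are learned so that the resulting feature image best satisfies the piecewise-constant Mumford-Shah criterion, and the second stage is a proper segmentation rather than a global threshold.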

  15. Gravity Model for Topological Features on a Cylindrical Manifold

    Directory of Open Access Journals (Sweden)

    Bayak I.

    2008-04-01

Full Text Available A model aimed at understanding quantum gravity in terms of Birkhoff's approach is discussed. The geometry of this model is constructed by using a winding map of Minkowski space into an R3 × S1 cylinder. The basic field of this model is a field of unit vectors defined through the velocity field of a flow wrapping the cylinder. The degeneration of some parts of the flow into circles (topological features) results in inhomogeneities and gives rise to a scalar field, analogous to the gravitational field. The geometry and dynamics of this field are briefly discussed. We treat the intersections between the topological features and the observer's 3-space as matter particles and argue that these entities are likely to possess some quantum properties.

  16. Key features of the IPSL ocean atmosphere model and its sensitivity to atmospheric resolution

    Energy Technology Data Exchange (ETDEWEB)

    Marti, Olivier; Braconnot, P.; Bellier, J.; Brockmann, P.; Caubel, A.; Noblet, N. de; Friedlingstein, P.; Idelkadi, A.; Kageyama, M. [Unite Mixte CEA-CNRS-UVSQ, IPSL/LSCE, Gif-sur-Yvette Cedex (France); Dufresne, J.L.; Bony, S.; Codron, F.; Fairhead, L.; Grandpeix, J.Y.; Hourdin, F.; Musat, I. [Unite Mixte CNRS-Ecole Polytechnique-ENS-UPCM, IPSL/LMD, Paris Cedex 05 (France); Benshila, R.; Guilyardi, E.; Levy, C.; Madec, G.; Mignot, J.; Talandier, C. [unite mixte CNRS-IRD-UPMC, IPLS/LOCEAN, Paris Cedex 05 (France); Cadule, P.; Denvil, S.; Foujols, M.A. [Institut Pierre Simon Laplace des Sciences de l' Environnement (IPSL), Paris Cedex 05 (France); Fichefet, T.; Goosse, H. [Universite Catholique de Louvain, Institut d' Astronomie et de Geophysique Georges Lemaitre, Louvain-la-Neuve (Belgium); Krinner, G. [Unite mixte CNRS-UJF Grenoble, LGGE, BP96, Saint-Martin-d' Heres (France); Swingedouw, D. [CNRS/CERFACS, Toulouse (France)

    2010-01-15

    This paper presents the major characteristics of the Institut Pierre Simon Laplace (IPSL) coupled ocean-atmosphere general circulation model. The model components and the coupling methodology are described, as well as the main characteristics of the climatology and interannual variability. The model results of the standard version used for IPCC climate projections, and for intercomparison projects like the Paleoclimate Modeling Intercomparison Project (PMIP 2) are compared to those with a higher resolution in the atmosphere. A focus on the North Atlantic and on the tropics is used to address the impact of the atmosphere resolution on processes and feedbacks. In the North Atlantic, the resolution change leads to an improved representation of the storm-tracks and the North Atlantic oscillation. The better representation of the wind structure increases the northward salt transports, the deep-water formation and the Atlantic meridional overturning circulation. In the tropics, the ocean-atmosphere dynamical coupling, or Bjerknes feedback, improves with the resolution. The amplitude of ENSO (El Nino-Southern oscillation) consequently increases, as the damping processes are left unchanged. (orig.)

  17. Impacts of Changing Climatic Drivers and Land use features on Future Stormwater Runoff in the Northwest Florida Basin: A Large-Scale Hydrologic Modeling Assessment

    Science.gov (United States)

    Khan, M.; Abdul-Aziz, O. I.

    2017-12-01

    Potential changes in climatic drivers and land cover features can significantly influence the stormwater budget in the Northwest Florida Basin. We investigated the hydro-climatic and land use sensitivities of stormwater runoff by developing a large-scale process-based rainfall-runoff model for the large basin by using the EPA Storm Water Management Model (SWMM 5.1). Climatic and hydrologic variables, as well as land use/cover features were incorporated into the model to account for the key processes of coastal hydrology and its dynamic interactions with groundwater and sea levels. We calibrated and validated the model by historical daily streamflow observations during 2009-2012 at four major rivers in the basin. Downscaled climatic drivers (precipitation, temperature, solar radiation) projected by twenty GCMs-RCMs under CMIP5, along with the projected future land use/cover features were also incorporated into the model. The basin storm runoff was then simulated for the historical (2000s = 1976-2005) and two future periods (2050s = 2030-2059, and 2080s = 2070-2099). Comparative evaluation of the historical and future scenarios leads to important guidelines for stormwater management in Northwest Florida and similar regions under a changing climate and environment.

  18. Feature learning and change feature classification based on deep learning for ternary change detection in SAR images

    Science.gov (United States)

    Gong, Maoguo; Yang, Hailun; Zhang, Puzhao

    2017-07-01

Ternary change detection aims to detect changes and group them into positive and negative changes. It is of great significance in the joint interpretation of spatial-temporal synthetic aperture radar images. In this study, a sparse autoencoder, convolutional neural networks (CNN) and unsupervised clustering are combined to solve the ternary change detection problem without any supervision. Firstly, the sparse autoencoder is used to transform the log-ratio difference image into a suitable feature space for extracting key changes and suppressing outliers and noise. The learned features are then clustered into three classes, which are taken as the pseudo labels for training a CNN model as a change feature classifier. The reliable training samples for the CNN are selected from the feature maps learned by the sparse autoencoder with certain selection rules. Having training samples and the corresponding pseudo labels, the CNN model can be trained by back propagation with stochastic gradient descent. During its training procedure, the CNN is driven to learn the concept of change, and a more powerful model is established to distinguish different types of changes. Unlike traditional methods, the proposed framework integrates the merits of the sparse autoencoder and CNN to learn more robust difference representations and the concept of change for ternary change detection. Experimental results on real datasets validate the effectiveness and superiority of the proposed framework.
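The pipeline shape (unsupervised feature learning, 3-way clustering into pseudo-labels, then a supervised classifier trained on those labels) can be sketched with simple stand-ins: PCA in place of the sparse autoencoder and logistic regression in place of the CNN, on synthetic "difference image" pixels:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Synthetic pixels from three change classes: unchanged, positive, negative.
rng = np.random.default_rng(3)
centers = np.array([[0.0, 0.0], [3.0, 3.0], [-3.0, 3.0]])
pixels = np.vstack([c + rng.normal(0, 0.4, (100, 2)) for c in centers])

# Stage 1: unsupervised feature learning (PCA stands in for the autoencoder).
feats = PCA(n_components=2).fit_transform(pixels)

# Stage 2: cluster the features into three classes -> pseudo-labels.
pseudo = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats)

# Stage 3: train a supervised classifier on the pseudo-labels
# (logistic regression stands in for the CNN).
clf = LogisticRegression(max_iter=1000).fit(pixels, pseudo)
print(f"agreement with pseudo-labels: {clf.score(pixels, pseudo):.2f}")
```

The point of stage 3 in the paper is that the classifier generalizes beyond the clustered samples: trained only on reliable pseudo-labelled pixels, it can label the ambiguous ones the clustering step left out.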

  19. Features of Functioning the Integrated Building Thermal Model

    Directory of Open Access Journals (Sweden)

    Morozov Maxim N.

    2017-01-01

Full Text Available A model of the building heating system is designed, consisting of an energy source, a distributed automatic control system, elements of an individual heating unit, and the heating system itself. The Simulink application of the Matlab mathematical package is selected as the platform for the model. The specialized Simscape libraries, in aggregate with a wide range of Matlab mathematical tools, allow the “acausal” modeling concept to be applied. Implementing a “physical” representation of the object model improved the accuracy of the models. The principle of operation and functional features of the thermal model are described. Investigations of the building's cooling dynamics were carried out.

  20. Key Factors Influencing the Energy Absorption of Dual-Phase Steels: Multiscale Material Model Approach and Microstructural Optimization

    Science.gov (United States)

    Belgasam, Tarek M.; Zbib, Hussein M.

    2018-06-01

    The increase in use of dual-phase (DP) steel grades by vehicle manufacturers to enhance crash resistance and reduce body car weight requires the development of a clear understanding of the effect of various microstructural parameters on the energy absorption in these materials. Accordingly, DP steelmakers are interested in predicting the effect of various microscopic factors as well as optimizing microstructural properties for application in crash-relevant components of vehicle bodies. This study presents a microstructure-based approach using a multiscale material and structure model. In this approach, Digimat and LS-DYNA software were coupled and employed to provide a full micro-macro multiscale material model, which is then used to simulate tensile tests. Microstructures with varied ferrite grain sizes, martensite volume fractions, and carbon content in DP steels were studied. The impact of these microstructural features at different strain rates on energy absorption characteristics of DP steels is investigated numerically using an elasto-viscoplastic constitutive model. The model is implemented in a multiscale finite-element framework. A comprehensive statistical parametric study using response surface methodology is performed to determine the optimum microstructural features for a required tensile toughness at different strain rates. The simulation results are validated using experimental data found in the literature. The developed methodology proved to be effective for investigating the influence and interaction of key microscopic properties on the energy absorption characteristics of DP steels. Furthermore, it is shown that this method can be used to identify optimum microstructural conditions at different strain-rate conditions.
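Response surface methodology, used here to find optimum microstructural features, fits a second-order polynomial to the simulated responses and solves for its stationary point. A toy sketch with two coded factors and a planted optimum (the response function and coefficients are invented, not the paper's DP-steel data):

```python
import numpy as np

# Hypothetical toughness response over two coded factors:
# ferrite grain size d and martensite volume fraction f.
# The true optimum is planted at (d, f) = (0.2, -0.3).
rng = np.random.default_rng(4)
d = rng.uniform(-1, 1, 200)
f = rng.uniform(-1, 1, 200)
y = 5.0 - (d - 0.2) ** 2 - 2.0 * (f + 0.3) ** 2 + rng.normal(0, 0.01, 200)

# Full second-order response surface model fit by least squares.
A = np.column_stack([np.ones_like(d), d, f, d * f, d ** 2, f ** 2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero,
# i.e. solve the 2x2 system  [[2*b4, b3], [b3, 2*b5]] @ x = -[b1, b2].
H = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
d_opt, f_opt = np.linalg.solve(H, -np.array([b[1], b[2]]))
print(f"fitted optimum near d={d_opt:.2f}, f={f_opt:.2f}")
```

The same fit-then-solve step is repeated per strain-rate condition in the paper, giving the optimum microstructure for each loading regime.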

  1. Key processes and input parameters for environmental tritium models

    International Nuclear Information System (INIS)

    Bunnenberg, C.; Taschner, M.; Ogram, G.L.

    1994-01-01

    The primary objective of the work reported here is to define key processes and input parameters for mathematical models of environmental tritium behaviour adequate for use in safety analysis and licensing of fusion devices like NET and associated tritium handling facilities. (author). 45 refs., 3 figs

  2. Key processes and input parameters for environmental tritium models

    Energy Technology Data Exchange (ETDEWEB)

    Bunnenberg, C; Taschner, M [Niedersaechsisches Inst. fuer Radiooekologie, Hannover (Germany); Ogram, G L [Ontario Hydro, Toronto, ON (Canada)

    1994-12-31

    The primary objective of the work reported here is to define key processes and input parameters for mathematical models of environmental tritium behaviour adequate for use in safety analysis and licensing of fusion devices like NET and associated tritium handling facilities. (author). 45 refs., 3 figs.

  3. A product feature-based user-centric product search model

    OpenAIRE

    Ben Jabeur, Lamjed; Soulier, Laure; Tamine, Lynda; Mousset, Paul

    2016-01-01

    During the online shopping process, users would search for interesting products and quickly access those that fit with their needs among a long tail of similar or closely related products. Our contribution addresses head queries that are frequently submitted on e-commerce Web sites. Head queries usually target featured products with several variations, accessories, and complementary products. We present in this paper a product feature-based user-centric model for product search involving in a...

  4. Swallowing sound detection using hidden markov modeling of recurrence plot features

    International Nuclear Information System (INIS)

    Aboofazeli, Mohammad; Moussavi, Zahra

    2009-01-01

Automated detection of swallowing sounds in swallowing and breath sound recordings is of importance for monitoring purposes in which the recording durations are long. This paper presents a novel method for swallowing sound detection using hidden Markov modeling of recurrence plot features. Tracheal sound recordings of 15 healthy and nine dysphagic subjects were studied. The multidimensional state space trajectory of each signal was reconstructed using the Takens method of delays. The sequences of three recurrence plot features of the reconstructed trajectories (which have shown discriminating capability between swallowing and breath sounds) were modeled by three hidden Markov models. The Viterbi algorithm was used for swallowing sound detection. The results were validated manually by inspection of the simultaneously recorded airflow signal and spectrogram of the sounds, and also by auditory means. The experimental results suggested that the performance of the proposed method using hidden Markov modeling of recurrence plot features was superior to previous swallowing sound detection methods.
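The front end of this method, delay-embedding a signal and extracting a recurrence-plot feature, can be sketched directly. Recurrence rate is one standard recurrence-plot feature and stands in for the paper's three (which the abstract does not name):

```python
import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Takens method of delays: reconstruct a state-space trajectory."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def recurrence_rate(x, dim=3, tau=5, eps=0.2):
    """Fraction of trajectory point pairs closer than eps (one RP feature)."""
    Y = delay_embed(x, dim, tau)
    D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    return float((D < eps).mean())          # mean over the recurrence plot

t = np.linspace(0, 20 * np.pi, 800)
rr_tone = recurrence_rate(np.sin(t))        # structured, periodic signal
rr_noise = recurrence_rate(np.random.default_rng(5).normal(0, 1, 800))
print(rr_tone, rr_noise)
```

A structured signal revisits its state-space neighbourhoods and so scores a high recurrence rate, while noise does not; in the paper, sequences of such features are what the hidden Markov models are trained on.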

  5. Swallowing sound detection using hidden markov modeling of recurrence plot features

    Energy Technology Data Exchange (ETDEWEB)

    Aboofazeli, Mohammad [Faculty of Engineering, Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg, Manitoba, R3T 5V6 (Canada)], E-mail: umaboofa@cc.umanitoba.ca; Moussavi, Zahra [Faculty of Engineering, Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg, Manitoba, R3T 5V6 (Canada)], E-mail: mousavi@ee.umanitoba.ca

    2009-01-30

Automated detection of swallowing sounds in swallowing and breath sound recordings is of importance for monitoring purposes in which the recording durations are long. This paper presents a novel method for swallowing sound detection using hidden Markov modeling of recurrence plot features. Tracheal sound recordings of 15 healthy and nine dysphagic subjects were studied. The multidimensional state space trajectory of each signal was reconstructed using the Takens method of delays. The sequences of three recurrence plot features of the reconstructed trajectories (which have shown discriminating capability between swallowing and breath sounds) were modeled by three hidden Markov models. The Viterbi algorithm was used for swallowing sound detection. The results were validated manually by inspection of the simultaneously recorded airflow signal and spectrogram of the sounds, and also by auditory means. The experimental results suggested that the performance of the proposed method using hidden Markov modeling of recurrence plot features was superior to previous swallowing sound detection methods.

  6. A prototype feature system for feature retrieval using relationships

    Science.gov (United States)

    Choi, J.; Usery, E.L.

    2009-01-01

    Using a feature data model, geographic phenomena can be represented effectively by integrating space, theme, and time. This paper extends and implements a feature data model that supports query and visualization of geographic features using their non-spatial and temporal relationships. A prototype feature-oriented geographic information system (FOGIS) is then developed and storage of features named Feature Database is designed. Buildings from the U.S. Marine Corps Base, Camp Lejeune, North Carolina and subways in Chicago, Illinois are used to test the developed system. The results of the applications show the strength of the feature data model and the developed system 'FOGIS' when they utilize non-spatial and temporal relationships in order to retrieve and visualize individual features.
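A feature data model of the kind described, features carrying theme and time alongside geometry, and retrievable through named relationships rather than coordinates, can be sketched as a tiny in-memory store. The schema and relation names below are illustrative inventions; the real FOGIS design is not given in the abstract:

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A feature record integrating space, theme, and time (illustrative)."""
    fid: str
    theme: str                                   # e.g. "building", "subway"
    geometry: tuple                              # (x, y) placeholder
    valid_from: int                              # start year of validity
    related: dict = field(default_factory=dict)  # relation name -> feature ids

class FeatureDB:
    def __init__(self):
        self._store = {}

    def add(self, feature):
        self._store[feature.fid] = feature

    def by_relation(self, relation, target_fid):
        # Retrieval through a non-spatial relationship, not coordinates.
        return [f for f in self._store.values()
                if target_fid in f.related.get(relation, ())]

db = FeatureDB()
db.add(Feature("b1", "building", (10, 20), 1942))
db.add(Feature("s1", "subway", (11, 21), 1943, {"serves": ["b1"]}))
print([f.fid for f in db.by_relation("serves", "b1")])
```

The point of the design is that a query like "which features serve building b1" needs no spatial join at all, which is the strength the applications in this record demonstrate.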

  7. Feature Extraction

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Feature selection and reduction are key to robust multivariate analyses. In this talk I will focus on pros and cons of various variable selection methods and focus on those that are most relevant in the context of HEP.
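One common filter-style variable selection method of the kind such a talk would compare is univariate ranking by mutual information, which catches nonlinear dependence that a plain correlation filter misses. A small sketch on synthetic data where only two of ten variables are informative:

```python
import numpy as np
from functools import partial
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Toy multivariate dataset: of 10 variables, only 2 and 7 carry class signal.
rng = np.random.default_rng(6)
X = rng.normal(size=(400, 10))
y = (X[:, 2] + X[:, 7] > 0).astype(int)

score = partial(mutual_info_classif, random_state=0)  # deterministic MI estimate
sel = SelectKBest(score, k=3).fit(X, y)
picked = set(np.flatnonzero(sel.get_support()))
print(sorted(picked))
```

A known con of any univariate filter is that it scores variables one at a time, so it can miss variables that are informative only jointly; wrapper and embedded methods trade computing time for catching such interactions.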

  8. A Scenario-Based Protocol Checker for Public-Key Authentication Scheme

    Science.gov (United States)

    Saito, Takamichi

    Security protocols provide communication security for the Internet. One of their important features is authentication with key exchange, whose correctness is a requirement for the security of the whole communication. In this paper, we introduce three attack models realized as attack scenarios, and provide an authentication-protocol checker that applies attack scenarios based on these models. We also use it to check two popular security protocols: Secure SHell (SSH) and Secure Socket Layer/Transport Layer Security (SSL/TLS).

  9. Acquired apraxia of speech: features, accounts, and treatment.

    Science.gov (United States)

    Peach, Richard K

    2004-01-01

    The features of apraxia of speech (AOS) are presented with regard to both traditional and contemporary descriptions of the disorder. Models of speech processing, including the neurological bases for apraxia of speech, are discussed. Recent findings concerning subcortical contributions to apraxia of speech and the role of the insula are presented. The key features to differentially diagnose AOS from related speech syndromes are identified. Treatment implications derived from motor accounts of AOS are presented along with a summary of current approaches designed to treat the various subcomponents of the disorder. Finally, guidelines are provided for treating the AOS patient with coexisting aphasia.

  10. Hum-mPLoc 3.0: prediction enhancement of human protein subcellular localization through modeling the hidden correlations of gene ontology and functional domain features.

    Science.gov (United States)

    Zhou, Hang; Yang, Yang; Shen, Hong-Bin

    2017-03-15

    Protein subcellular localization prediction has been an important research topic in computational biology over the last decade. Various automatic methods have been proposed to predict locations for large-scale protein datasets, where statistical machine learning algorithms are widely used for model construction. A key step in these predictors is encoding the amino acid sequences into feature vectors. Many studies have shown that features extracted from biological domains, such as gene ontology and functional domains, can be very useful for improving the prediction accuracy. However, domain knowledge usually results in redundant features and high-dimensional feature spaces, which may degrade the performance of machine learning models. In this paper, we propose a new amino acid sequence-based human protein subcellular location prediction approach, Hum-mPLoc 3.0, which covers 12 human subcellular localizations. The sequences are represented by multi-view complementary features, i.e., context vocabulary annotation-based gene ontology (GO) terms, peptide-based functional domains, and residue-based statistical features. To systematically reflect the structural hierarchy of the domain knowledge bases, we propose a novel feature representation protocol denoted as HCM (Hidden Correlation Modeling), which creates more compact and discriminative feature vectors by modeling the hidden correlations between annotation terms. Experimental results on four benchmark datasets show that HCM improves prediction accuracy by 5-11% and F1 by 8-19% compared with conventional GO-based methods. A large-scale application of Hum-mPLoc 3.0 on the whole human proteome reveals protein co-localization preferences in the cell. Availability: www.csbio.sjtu.edu.cn/bioinf/Hum-mPLoc3/. Contact: hbshen@sjtu.edu.cn. Supplementary data are available at Bioinformatics online.
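    As a loose illustration of hierarchy-aware GO features (a much simplified stand-in for HCM, which models hidden correlations between annotation terms rather than mere ancestry), the sketch below propagates a protein's direct GO annotations up a toy parent hierarchy before building a binary feature vector. The terms and hierarchy are invented:

```python
# Toy GO DAG: term -> list of parent terms (invented, not real GO identifiers)
parents = {"GO:3": ["GO:2"], "GO:2": ["GO:1"], "GO:1": [], "GO:5": ["GO:1"]}

def propagate(terms):
    """Expand a protein's direct GO annotations with all ancestor terms,
    reflecting the structural hierarchy of the knowledge base."""
    seen = set()
    stack = list(terms)
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(parents.get(t, []))
    return seen

vocab = sorted(parents)

def feature_vector(terms):
    expanded = propagate(terms)
    return [1 if t in expanded else 0 for t in vocab]

print(feature_vector(["GO:3"]))  # → [1, 1, 1, 0]
```

    A real GO-based encoder would use the GO DAG itself and weight terms; this only shows why hierarchy makes otherwise distinct annotations share feature mass.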

  11. Light field morphing using 2D features.

    Science.gov (United States)

    Wang, Lifeng; Lin, Stephen; Lee, Seungyong; Guo, Baining; Shum, Heung-Yeung

    2005-01-01

    We present a 2D feature-based technique for morphing 3D objects represented by light fields. Existing light field morphing methods require the user to specify corresponding 3D feature elements to guide morph computation. Since slight errors in 3D specification can lead to significant morphing artifacts, we propose a scheme based on 2D feature elements that is less sensitive to imprecise marking of features. First, 2D features are specified by the user in a number of key views in the source and target light fields. Then the two light fields are warped view by view as guided by the corresponding 2D features. Finally, the two warped light fields are blended together to yield the desired light field morph. Two key issues in light field morphing are feature specification and warping of light field rays. For feature specification, we introduce a user interface for delineating 2D features in key views of a light field, which are automatically interpolated to other views. For ray warping, we describe a 2D technique that accounts for visibility changes and present a comparison to the ideal morphing of light fields. Light field morphing based on 2D features makes it simple to incorporate previous image morphing techniques such as nonuniform blending, as well as to morph between an image and a light field.

  12. Robustness of digitally modulated signal features against variation in HF noise model

    Directory of Open Access Journals (Sweden)

    Shoaib Mobien

    2011-01-01

    Full Text Available Abstract The high frequency (HF) band has both military and civilian uses. It can be used either as a primary or a backup communication link. Automatic modulation classification (AMC) is of utmost importance in this band for the purpose of communications monitoring, e.g., signal intelligence and spectrum management. A widely used method for AMC is based on pattern recognition (PR). Such a method has two main steps: feature extraction and classification. The first step is generally performed in the presence of channel noise. Recent studies show that HF noise can be modeled by Gaussian or bi-kappa distributions, depending on the time of day. Therefore, it is anticipated that a change in the noise model will have an impact on the feature extraction stage. In this article, we investigate the robustness of well-known digitally modulated signal features against variation in HF noise. Specifically, we consider temporal time domain (TTD) features, higher order cumulants (HOC), and wavelet-based features. In addition, we propose new features extracted from the constellation diagram and evaluate their robustness against the change in noise model. This study targets 2PSK, 4PSK, 8PSK, 16QAM, 32QAM, and 64QAM modulations, as they are commonly used in HF communications.
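    Higher order cumulants, one of the feature families examined, can be estimated directly from complex baseband symbols. The sketch below is an assumption-laden toy, not the authors' pipeline: it computes the normalized fourth-order cumulants C40 and C42 on noise-free constellations, where theory gives |C40| = 2 for 2PSK and |C40| = 1 for 4PSK:

```python
import math
import cmath
import random

random.seed(1)

def c40_c42(x):
    """Normalized fourth-order cumulants C40 and C42 of complex symbols."""
    n = len(x)
    m20 = sum(z * z for z in x) / n
    m21 = sum(abs(z) ** 2 for z in x) / n
    m40 = sum(z ** 4 for z in x) / n
    m42 = sum(abs(z) ** 4 for z in x) / n
    c40 = m40 - 3 * m20 ** 2
    c42 = m42 - abs(m20) ** 2 - 2 * m21 ** 2
    return c40 / m21 ** 2, c42 / m21 ** 2

# Noise-free unit-power constellations (2PSK and 4PSK)
bpsk = [complex(random.choice([-1, 1]), 0) for _ in range(4000)]
qpsk = [cmath.exp(1j * (math.pi / 2 * random.randrange(4) + math.pi / 4))
        for _ in range(4000)]

print(abs(c40_c42(bpsk)[0]))  # ≈ 2
print(abs(c40_c42(qpsk)[0]))  # ≈ 1
```

    Under channel noise these estimates scatter around the theoretical values, which is exactly the robustness question the article studies for Gaussian versus bi-kappa HF noise.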

  13. Efficient and robust model-to-image alignment using 3D scale-invariant features.

    Science.gov (United States)

    Toews, Matthew; Wells, William M

    2013-04-01

    This paper presents feature-based alignment (FBA), a general method for efficient and robust model-to-image alignment. Volumetric images, e.g. CT scans of the human body, are modeled probabilistically as a collage of 3D scale-invariant image features within a normalized reference space. Features are incorporated as a latent random variable and marginalized out in computing a maximum a posteriori alignment solution. The model is learned from features extracted in pre-aligned training images, then fit to features extracted from a new image to identify a globally optimal locally linear alignment solution. Novel techniques are presented for determining local feature orientation and efficiently encoding feature intensity in 3D. Experiments involving difficult magnetic resonance (MR) images of the human brain demonstrate FBA achieves alignment accuracy similar to widely-used registration methods, while requiring a fraction of the memory and computation resources and offering a more robust, globally optimal solution. Experiments on CT human body scans demonstrate FBA as an effective system for automatic human body alignment where other alignment methods break down.

  14. Textural features of dynamic contrast-enhanced MRI derived model-free and model-based parameter maps in glioma grading.

    Science.gov (United States)

    Xie, Tian; Chen, Xiao; Fang, Jingqin; Kang, Houyi; Xue, Wei; Tong, Haipeng; Cao, Peng; Wang, Sumei; Yang, Yizeng; Zhang, Weiguo

    2018-04-01

    Presurgical glioma grading by dynamic contrast-enhanced MRI (DCE-MRI) has unresolved issues. The aim of this study was to investigate the ability of textural features derived from pharmacokinetic model-based or model-free parameter maps of DCE-MRI to discriminate between different grades of gliomas, and their correlation with pathological indices. Study type: retrospective. Population: forty-two adults with brain gliomas. Field strength/sequence: 3.0T, including conventional anatomic sequences and DCE-MRI sequences (variable flip angle T1-weighted imaging and three-dimensional gradient echo volumetric imaging). Assessment: regions of interest were placed on the cross-sectional images with the maximal tumor lesion, and five commonly used textural features, including Energy, Entropy, Inertia, Correlation, and Inverse Difference Moment (IDM), were generated. All textural features of the model-free parameters (initial area under curve [IAUC], maximal signal intensity [Max SI], maximal up-slope [Max Slope]) could effectively differentiate between grade II (n = 15), grade III (n = 13), and grade IV (n = 14) gliomas. Two textural features, Entropy and IDM, of four DCE-MRI parameters, including Max SI, Max Slope (model-free parameters), vp (Extended Tofts), and vp (Patlak), could differentiate grade III and IV gliomas. No textural feature of any DCE-MRI parameter map could discriminate between subtypes of grade II and III gliomas. Some textural features showed relatively low inter-observer agreement. No significant correlation was found between microvascular density and textural features, whereas a moderate correlation was found between the cellular proliferation index and those features. Textural features of DCE-MRI parameter maps displayed a good ability in glioma grading. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:1099-1111.
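    The five textural features named above are standard statistics of a grey-level co-occurrence matrix (GLCM). A minimal sketch on a toy 4-level image (not DCE-MRI parameter maps), using one horizontal offset:

```python
import math
from collections import Counter

def glcm(img, dx=1, dy=0):
    """Normalized grey-level co-occurrence matrix for one pixel offset."""
    counts = Counter()
    rows, cols = len(img), len(img[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[(img[r][c], img[r2][c2])] += 1
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def texture_features(p):
    """Energy, Entropy, Inertia (contrast), IDM, Correlation of a GLCM p."""
    mu_i = sum(i * v for (i, j), v in p.items())
    mu_j = sum(j * v for (i, j), v in p.items())
    var_i = sum((i - mu_i) ** 2 * v for (i, j), v in p.items())
    var_j = sum((j - mu_j) ** 2 * v for (i, j), v in p.items())
    return {
        "energy":  sum(v * v for v in p.values()),
        "entropy": -sum(v * math.log2(v) for v in p.values()),
        "inertia": sum((i - j) ** 2 * v for (i, j), v in p.items()),
        "idm":     sum(v / (1 + (i - j) ** 2) for (i, j), v in p.items()),
        "correlation": sum((i - mu_i) * (j - mu_j) * v for (i, j), v in p.items())
                       / math.sqrt(var_i * var_j),
    }

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
feats = texture_features(glcm(img))
print({k: round(v, 3) for k, v in feats.items()})
```

    In practice these statistics are averaged over several offsets and directions; here a single offset keeps the arithmetic checkable by hand.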

  15. Toward a model for lexical access based on acoustic landmarks and distinctive features

    Science.gov (United States)

    Stevens, Kenneth N.

    2002-04-01

    This article describes a model in which the acoustic speech signal is processed to yield a discrete representation of the speech stream in terms of a sequence of segments, each of which is described by a set (or bundle) of binary distinctive features. These distinctive features specify the phonemic contrasts that are used in the language, such that a change in the value of a feature can potentially generate a new word. This model is a part of a more general model that derives a word sequence from this feature representation, the words being represented in a lexicon by sequences of feature bundles. The processing of the signal proceeds in three steps: (1) Detection of peaks, valleys, and discontinuities in particular frequency ranges of the signal leads to identification of acoustic landmarks. The type of landmark provides evidence for a subset of distinctive features called articulator-free features (e.g., [vowel], [consonant], [continuant]). (2) Acoustic parameters are derived from the signal near the landmarks to provide evidence for the actions of particular articulators, and acoustic cues are extracted by sampling selected attributes of these parameters in these regions. The selection of cues that are extracted depends on the type of landmark and on the environment in which it occurs. (3) The cues obtained in step (2) are combined, taking context into account, to provide estimates of "articulator-bound" features associated with each landmark (e.g., [lips], [high], [nasal]). These articulator-bound features, combined with the articulator-free features in (1), constitute the sequence of feature bundles that forms the output of the model. Examples of cues that are used, and justification for this selection, are given, as well as examples of the process of inferring the underlying features for a segment when there is variability in the signal due to enhancement gestures (recruited by a speaker to make a contrast more salient) or due to overlap of gestures from
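    Step (1), landmark detection, reduces at its simplest to locating peaks, valleys, and abrupt changes in band energies. A deliberately crude sketch (a plain local-extremum scan over an invented energy contour, not the model's actual detectors):

```python
def landmarks(signal, threshold=0.0):
    """Indices of local peaks (above threshold) and local valleys; a crude
    stand-in for the band-energy peak/valley/discontinuity detectors."""
    peaks, valleys = [], []
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] > signal[i + 1] and signal[i] > threshold:
            peaks.append(i)
        elif signal[i] < signal[i - 1] and signal[i] < signal[i + 1]:
            valleys.append(i)
    return peaks, valleys

# Invented band-energy contour over successive frames
energy = [0.1, 0.4, 0.9, 0.5, 0.2, 0.6, 1.0, 0.3]
print(landmarks(energy, threshold=0.5))  # → ([2, 6], [4])
```

    Real landmark detection operates per frequency band with smoothing and rate-of-rise criteria; this only makes the peak/valley vocabulary concrete.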

  16. What are the visual features underlying rapid object recognition?

    Directory of Open Access Journals (Sweden)

    Sébastien M Crouzet

    2011-11-01

    Full Text Available Research progress in machine vision has been very significant in recent years. Robust face detection and identification algorithms are already readily available to consumers, and modern computer vision algorithms for generic object recognition are now coping with the richness and complexity of natural visual scenes. Unlike early vision models of object recognition that emphasized the role of figure-ground segmentation and spatial information between parts, recent successful approaches are based on the computation of loose collections of image features without prior segmentation or any explicit encoding of spatial relations. While these models remain simplistic models of visual processing, they suggest that, in principle, bottom-up activation of a loose collection of image features could support the rapid recognition of natural object categories and provide an initial coarse visual representation before more complex visual routines and attentional mechanisms take place. Focusing on biologically-plausible computational models of (bottom-up) pre-attentive visual recognition, we review some of the key visual features that have been described in the literature. We discuss the consistency of these feature-based representations with classical theories from visual psychology and test their ability to account for human performance on a rapid object categorization task.

  17. TU-CD-BRB-01: Normal Lung CT Texture Features Improve Predictive Models for Radiation Pneumonitis

    International Nuclear Information System (INIS)

    Krafft, S; Briere, T; Court, L; Martel, M

    2015-01-01

    Purpose: Existing normal tissue complication probability (NTCP) models for radiation pneumonitis (RP) traditionally rely on dosimetric and clinical data but are limited in terms of performance and generalizability. Extraction of pre-treatment image features provides a potential new category of data that can improve NTCP models for RP. We consider quantitative measures of total lung CT intensity and texture in a framework for prediction of RP. Methods: Available clinical and dosimetric data was collected for 198 NSCLC patients treated with definitive radiotherapy. Intensity- and texture-based image features were extracted from the T50 phase of the 4D-CT acquired for treatment planning. A total of 3888 features (15 clinical, 175 dosimetric, and 3698 image features) were gathered and considered candidate predictors for modeling of RP grade≥3. A baseline logistic regression model with mean lung dose (MLD) was first considered. Additionally, a least absolute shrinkage and selection operator (LASSO) logistic regression was applied to the set of clinical and dosimetric features, and subsequently to the full set of clinical, dosimetric, and image features. Model performance was assessed by comparing area under the curve (AUC). Results: A simple logistic fit of MLD was an inadequate model of the data (AUC∼0.5). Including clinical and dosimetric parameters within the framework of the LASSO resulted in improved performance (AUC=0.648). Analysis of the full cohort of clinical, dosimetric, and image features provided further and significant improvement in model performance (AUC=0.727). Conclusions: To achieve significant gains in predictive modeling of RP, new categories of data should be considered in addition to clinical and dosimetric features. We have successfully incorporated CT image features into a framework for modeling RP and have demonstrated improved predictive performance. Validation and further investigation of CT image features in the context of RP NTCP
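    Model comparison in this abstract rests on the area under the ROC curve. A small sketch of rank-based AUC computation; the labels and predicted probabilities below are invented, not study data:

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic.
    labels: 1 = event (e.g. RP grade >= 3), 0 = no event."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y      = [0, 0, 0, 1, 1, 0, 1, 0]                    # hypothetical outcomes
p_mld  = [0.2, 0.3, 0.5, 0.4, 0.6, 0.5, 0.5, 0.1]    # hypothetical MLD-only model
p_full = [0.1, 0.2, 0.3, 0.7, 0.8, 0.4, 0.6, 0.2]    # hypothetical full model
print(auc(y, p_mld), auc(y, p_full))  # → 0.8 1.0
```

    An AUC near 0.5 (as the abstract reports for the MLD-only fit) means the model ranks cases no better than chance; the LASSO models' gains show up directly in this statistic.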

  18. Password-only authenticated three-party key exchange with provable security in the standard model.

    Science.gov (United States)

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon; Won, Dongho

    2014-01-01

    Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks.

  19. Password-Only Authenticated Three-Party Key Exchange with Provable Security in the Standard Model

    Directory of Open Access Journals (Sweden)

    Junghyun Nam

    2014-01-01

    Full Text Available Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks.

  20. A Labeling Model Based on the Region of Movability for Point-Feature Label Placement

    Directory of Open Access Journals (Sweden)

    Lin Li

    2016-09-01

    Full Text Available Automatic point-feature label placement (PFLP) is a fundamental task for map visualization. As the dominant solutions to the PFLP problem, fixed-position and slider models have been widely studied in previous research. However, the candidate labels generated with these models are set to certain fixed positions or a specified track line for sliding. Thus, the whole surrounding space of a point feature is not sufficiently used for labeling. Hence, this paper proposes a novel label model based on the region of movability, which comes from plane collision detection theory. The model defines a complete conflict-free search space for label placement. On the premise of no conflict with the point, line, and area features, the proposed model utilizes the surrounding zone of the point feature to generate candidate label positions. By combining it with a heuristic search method, the model achieves high-quality label placement. In addition, the flexibility of the proposed model enables placing arbitrarily shaped labels.
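    The region-of-movability idea can be sketched as follows: the label's anchor corner sweeps a rectangle around the point feature, and candidate positions whose label box collides with an obstacle are discarded. The geometry below is a simplified stand-in for the paper's model (axis-aligned boxes only, invented coordinates):

```python
def label_positions(px, py, w, h, obstacles, step=4):
    """Candidate lower-left corners for a w-by-h label whose box must touch
    point (px, py): the corner may lie anywhere in the rectangle
    [px - w, px] x [py - h, py], minus positions whose label box collides
    with an obstacle box given as (x, y, w, h)."""
    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    out = []
    for dx in range(0, w + 1, step):
        for dy in range(0, h + 1, step):
            box = (px - dx, py - dy, w, h)
            if not any(overlaps(box, ob) for ob in obstacles):
                out.append(box[:2])
    return out

# Point at (10, 10), an 8 x 4 label, and one obstacle wall to the left
obstacles = [(0, 0, 9, 20)]
print(label_positions(10, 10, 8, 4, obstacles))  # → [(10, 10), (10, 6)]
```

    The paper's model additionally keeps the point on the label boundary and handles line, area, and arbitrarily shaped features; the sketch only shows how collision detection prunes the continuous candidate space.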

  1. Compounding local invariant features and global deformable geometry for medical image registration.

    Directory of Open Access Journals (Sweden)

    Jianhua Zhang

    Full Text Available Using deformable models to register medical images can result in problems of initialization of deformable models and robustness and accuracy of matching of inter-subject anatomical variability. To tackle these problems, a novel model is proposed in this paper by compounding local invariant features and global deformable geometry. This model has four steps. First, a set of highly-repeatable and highly-robust local invariant features, called the Key Features Model (KFM), are extracted by an effective matching strategy. Second, local features can be matched more accurately through the KFM for the purpose of initializing a global deformable model. Third, the positional relationship between the KFM and the global deformable model can be used to precisely pinpoint all landmarks after initialization. And fourth, the final pose of the global deformable model is determined by an iterative process with a lower time cost. Through the practical experiments, the paper draws three important conclusions. First, it proves that the KFM can detect the matching feature points well. Second, the precision of landmark locations adjusted by the modeled relationship between the KFM and the global deformable model is greatly improved. Third, regarding fitting accuracy and efficiency, the practical experiments show that the proposed method improves fitting accuracy by 6-8% and reduces computational time by around 50% compared with state-of-the-art methods.

  2. An Investigation of Feature Models for Music Genre Classification using the Support Vector Classifier

    DEFF Research Database (Denmark)

    Meng, Anders; Shawe-Taylor, John

    2005-01-01

    In music genre classification the decision time is typically of the order of several seconds, however most automatic music genre classification systems focus on short time features derived from 10-50 ms. This work investigates two models for modelling short time features: the multivariate Gaussian model and the multivariate autoregressive model. Furthermore, it was investigated how these models can be integrated over a segment of short time features into a kernel such that a support vector machine can be applied. Two kernels with this property were considered: the convolution kernel and the product probability kernel. In order to examine the different methods, an 11 genre music setup was utilized. In this setup the Mel Frequency Cepstral Coefficients (MFCC) were used as short time features. The accuracy of the best performing model on this data set was 44% as compared to a human performance of 52%.
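    The product probability kernel mentioned above has a closed form for Gaussian densities. A sketch for univariate Gaussians fitted to two feature segments (exponent rho = 1; the "segments" below are invented scalars standing in for MFCC frames):

```python
import math

def gaussian_fit(xs):
    """Maximum-likelihood mean and variance of a 1-D sample."""
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, var

def product_probability_kernel(seg_a, seg_b):
    """K(a, b) = integral of p_a(x) * p_b(x) dx, where p_a and p_b are
    Gaussians fitted to each segment. For N(mu1, v1) and N(mu2, v2) this
    integral is N(mu1 - mu2; 0, v1 + v2), evaluated in closed form."""
    mu_a, va = gaussian_fit(seg_a)
    mu_b, vb = gaussian_fit(seg_b)
    v = va + vb
    return math.exp(-(mu_a - mu_b) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

seg1 = [0.10, 0.20, 0.15, 0.05]   # invented stand-ins for one MFCC dimension
seg2 = [0.12, 0.18, 0.16, 0.09]   # a segment similar to seg1
seg3 = [1.10, 1.30, 1.20, 1.25]   # a dissimilar segment
print(product_probability_kernel(seg1, seg2) > product_probability_kernel(seg1, seg3))
# → True: similar segments get a larger kernel value
```

    The paper's kernels operate on multivariate Gaussian and autoregressive models of MFCC vectors; the scalar case above only shows why the kernel value grows as the fitted densities overlap, which is what makes it usable inside an SVM.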

  3. Accessing key steps of human tumor progression in vivo by using an avian embryo model

    Science.gov (United States)

    Hagedorn, Martin; Javerzat, Sophie; Gilges, Delphine; Meyre, Aurélie; de Lafarge, Benjamin; Eichmann, Anne; Bikfalvi, Andreas

    2005-02-01

    Experimental in vivo tumor models are essential for comprehending the dynamic process of human cancer progression, identifying therapeutic targets, and evaluating antitumor drugs. However, current rodent models are limited by high costs, long experimental duration, variability, restricted accessibility to the tumor, and major ethical concerns. To avoid these shortcomings, we investigated whether tumor growth on the chick chorio-allantoic membrane after human glioblastoma cell grafting would replicate characteristics of the human disease. Avascular tumors consistently formed within 2 days, then progressed through vascular endothelial growth factor receptor 2-dependent angiogenesis, associated with hemorrhage, necrosis, and peritumoral edema. Blocking of vascular endothelial growth factor receptor 2 and platelet-derived growth factor receptor signaling pathways by using small-molecule receptor tyrosine kinase inhibitors abrogated tumor development. Gene regulation during the angiogenic switch was analyzed by oligonucleotide microarrays. Defined sample selection for gene profiling permitted identification of regulated genes whose functions are associated mainly with tumor vascularization and growth. Furthermore, expression of known tumor progression genes identified in the screen (IL-6 and cysteine-rich angiogenic inducer 61) as well as potential regulators (lumican and F-box-only 6) follow similar patterns in patient glioma. The model reliably simulates key features of human glioma growth in a few days and thus could considerably increase the speed and efficacy of research on human tumor progression and preclinical drug screening. Keywords: angiogenesis | animal model alternatives | glioblastoma

  4. The Catchment Feature Model: A Device for Multimodal Fusion and a Bridge between Signal and Sense

    Science.gov (United States)

    Quek, Francis

    2004-12-01

    The catchment feature model addresses two questions in the field of multimodal interaction: how we bridge video and audio processing with the realities of human multimodal communication, and how information from the different modes may be fused. We argue from a detailed literature review that gestural research has clustered around manipulative and semaphoric use of the hands, motivate the catchment feature model from psycholinguistic research, and present the model. In contrast to "whole gesture" recognition, the catchment feature model applies a feature decomposition approach that facilitates cross-modal fusion at the level of discourse planning and conceptualization. We present our experimental framework for catchment feature-based research, cite three concrete examples of catchment features, and propose new directions of multimodal research based on the model.

  5. Optimization of an individual re-identification modeling process using biometric features

    Energy Technology Data Exchange (ETDEWEB)

    Heredia-Langner, Alejandro; Amidan, Brett G.; Matzner, Shari; Jarman, Kristin H.

    2014-09-24

    We present results from the optimization of a re-identification process using two sets of biometric data obtained from the Civilian American and European Surface Anthropometry Resource Project (CAESAR) database. The datasets contain real measurements of features for 2378 individuals in a standing (43 features) and seated (16 features) position. A genetic algorithm (GA) was used to search a large combinatorial space where different features are available between the probe (seated) and gallery (standing) datasets. Results show that optimized model predictions obtained using less than half of the 43 gallery features and data from roughly 16% of the individuals available produce better re-identification rates than two other approaches that use all the information available.
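    A genetic algorithm for feature-subset search can be sketched in a few lines. The fitness function below is an invented stand-in (per-feature "usefulness" minus a cost per selected feature) for the re-identification accuracy actually optimized against the CAESAR probe/gallery data:

```python
import random

random.seed(0)

N_FEATURES = 12
# Invented per-feature usefulness scores; in the study, fitness would be
# the re-identification rate obtained with the selected gallery features.
value = [random.random() for _ in range(N_FEATURES)]
COST_PER_FEATURE = 0.35  # penalize large subsets, as fewer features sufficed

def fitness(mask):
    return sum(v for v, m in zip(value, mask) if m) - COST_PER_FEATURE * sum(mask)

def mutate(mask, rate=0.1):
    return tuple(1 - m if random.random() < rate else m for m in mask)

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)
    return a[:cut] + b[cut:]

# Elitist GA: keep the 10 best masks, refill the population with children
pop = [tuple(random.randint(0, 1) for _ in range(N_FEATURES)) for _ in range(30)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(20)]

best = max(pop, key=fitness)
print(best, round(fitness(best), 3))
```

    The real search space is harder (different features available in probe and gallery, and fitness requires running the re-identification model), but the chromosome-as-feature-mask encoding is the same idea.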

  6. Key engineering features of the ITER-FEAT magnet system and implications for the R and D programme

    International Nuclear Information System (INIS)

    Huguet, M.

    2001-01-01

    The magnet design of the new ITER-FEAT machine comprises 18 Toroidal Field (TF) coils, a Central Solenoid (CS), 6 Poloidal Field (PF) coils and Correction Coils (CCs). A key driver of this new design is the requirement to generate and control plasmas with a relatively high elongation (k95 = 1.7) and a relatively high triangularity (δ95 = 0.35). This has led to a design where the CS is vertically segmented and self-standing and the TF coils are wedged along their inboard legs. Another important design driver is to achieve a high operational reliability of the magnets, and this has resulted in several unconventional designs, and in particular, the use of conductors supported in radial plates for the winding pack of the TF coils. A key mechanical issue is the cyclic loading of the TF coil cases due to the out-of-plane loads which result from the interaction of the TF coil current and the poloidal field. These loads are resisted by a combination of shear keys and 'pre-compression' rings able to provide a centripetal preload at assembly. The fatigue life of the CS conductor jacket is another issue as it determines the CS performance in terms of the flux generation. Two jacket materials and designs are under study. Since 1993, the ITER magnet R and D programme has been focussed on the manufacture and testing of a CS and a TF model coil. During its testing, the CS model coil has successfully achieved all its performance targets in DC and AC operations. The manufacture of the TF model coil is complete. The manufacture of segments of the full scale TF coil case is another important and successful part of this programme and is near completion. New R and D effort is now being initiated to cover specific aspects of the ITER-FEAT design. (author)

  7. Features of CRISPR-Cas Regulation Key to Highly Efficient and Temporally-Specific crRNA Production

    Directory of Open Access Journals (Sweden)

    Andjela Rodic

    2017-11-01

    Full Text Available Bacterial immune systems, such as CRISPR-Cas or restriction-modification (R-M) systems, affect bacterial pathogenicity and antibiotic resistance by modulating horizontal gene flow. A model system for CRISPR-Cas regulation, the Type I-E system from Escherichia coli, is silent under standard laboratory conditions and experimentally observing the dynamics of CRISPR-Cas activation is challenging. Two characteristic features of CRISPR-Cas regulation in E. coli are cooperative transcription repression of cas gene and CRISPR array promoters, and fast non-specific degradation of full length CRISPR transcripts (pre-crRNA). In this work, we use computational modeling to understand how these features affect the system expression dynamics. Signaling which leads to CRISPR-Cas activation is currently unknown, so to bypass this step, we here propose a conceptual setup for cas expression activation, where cas genes are put under transcription control typical for a restriction-modification (R-M) system and then introduced into a cell. Known transcription regulation of an R-M system is used as a proxy for currently unknown CRISPR-Cas transcription control, as both systems are characterized by high cooperativity, which is likely related to similar dynamical constraints of their function. We find that the two characteristic CRISPR-Cas control features are responsible for its temporally-specific dynamical response, so that the system makes a steep (switch-like) transition from OFF to ON state with a time-delay controlled by the pre-crRNA degradation rate. We furthermore find that cooperative transcription regulation qualitatively leads to a cross-over to a regime where, at higher pre-crRNA processing rates, crRNA generation approaches the limit of an infinitely abrupt system induction. We propose that these dynamical properties are associated with rapid expression of CRISPR-Cas components and efficient protection of bacterial cells against foreign DNA. In terms of synthetic

  8. Towards Stable Adversarial Feature Learning for LiDAR based Loop Closure Detection

    OpenAIRE

    Xu, Lingyun; Yin, Peng; Luo, Haibo; Liu, Yunhui; Han, Jianda

    2017-01-01

    Stable feature extraction is key for the loop closure detection (LCD) task in the simultaneous localization and mapping (SLAM) framework. In our paper, feature extraction is performed by unsupervised learning based on generative adversarial networks (GANs). GANs are powerful generative models; however, GAN-based adversarial learning suffers from training instability. We find that the data-code joint distribution in the adversarial learning is a more complex manifold than in the...

  9. Ship Detection Based on Multiple Features in Random Forest Model for Hyperspectral Images

    Science.gov (United States)

    Li, N.; Ding, L.; Zhao, H.; Shi, J.; Wang, D.; Gong, X.

    2018-04-01

    A novel method for detecting ships, which aims to make full use of both the spatial and spectral information in hyperspectral images, is proposed. Firstly, a band with a high signal-to-noise ratio in the near-infrared or short-wave infrared spectrum is used to segment land and sea with the Otsu threshold segmentation method. Secondly, multiple features, including spectral and texture features, are extracted from the hyperspectral images: principal components analysis (PCA) is used to extract spectral features, and the Grey Level Co-occurrence Matrix (GLCM) is used to extract texture features. Finally, a Random Forest (RF) model is introduced to detect ships based on the extracted features. To illustrate the effectiveness of the method, we carry out experiments on EO-1 data, comparing a single feature against different multiple-feature combinations. Compared with the traditional single-feature method and a Support Vector Machine (SVM) model, the proposed method stably achieves ship detection under complex backgrounds and effectively improves the detection accuracy of ships.
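
    The pipeline described in this abstract, spectral features via PCA followed by a random-forest classifier, can be sketched on toy data. The array shapes, class statistics, and parameter values below are illustrative assumptions, not the authors' settings, and the GLCM texture features and Otsu sea/land step are omitted for brevity:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Synthetic "hyperspectral" pixels: 100 samples x 20 bands, two classes (sea, ship).
    sea = rng.normal(0.2, 0.05, size=(50, 20))
    ship = rng.normal(0.8, 0.05, size=(50, 20))
    X = np.vstack([sea, ship])
    y = np.array([0] * 50 + [1] * 50)

    # Spectral features via PCA (texture features such as GLCM would be stacked here too).
    pca = PCA(n_components=5)
    X_spec = pca.fit_transform(X)

    # Random forest on the extracted features.
    rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_spec, y)
    train_acc = rf.score(X_spec, y)
    ```

    In a real scene the rows of `X` would be pixels surviving the sea/land mask, and held-out data would be needed to measure detection accuracy rather than the training score used here.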

  10. The Catchment Feature Model: A Device for Multimodal Fusion and a Bridge between Signal and Sense

    Directory of Open Access Journals (Sweden)

    Francis Quek

    2004-09-01

    Full Text Available The catchment feature model addresses two questions in the field of multimodal interaction: how we bridge video and audio processing with the realities of human multimodal communication, and how information from the different modes may be fused. We argue from a detailed literature review that gestural research has clustered around manipulative and semaphoric use of the hands, motivate the catchment feature model from psycholinguistic research, and present the model. In contrast to “whole gesture” recognition, the catchment feature model applies a feature decomposition approach that facilitates cross-modal fusion at the level of discourse planning and conceptualization. We present our experimental framework for catchment feature-based research, cite three concrete examples of catchment features, and propose new directions of multimodal research based on the model.

  11. Key Clinical Features to Identify Girls with "CDKL5" Mutations

    Science.gov (United States)

    Bahi-Buisson, Nadia; Nectoux, Juliette; Rosas-Vargas, Haydee; Milh, Mathieu; Boddaert, Nathalie; Girard, Benoit; Cances, Claude; Ville, Dorothee; Afenjar, Alexandra; Rio, Marlene; Heron, Delphine; Morel, Marie Ange N'Guyen; Arzimanoglou, Alexis; Philippe, Christophe; Jonveaux, Philippe; Chelly, Jamel; Bienvenu, Thierry

    2008-01-01

    Mutations in the human X-linked cyclin-dependent kinase-like 5 ("CDKL5") gene have been shown to cause infantile spasms as well as Rett syndrome (RTT)-like phenotype. To date, less than 25 different mutations have been reported. So far, there are still little data on the key clinical diagnosis criteria and on the natural history of…

  12. Feature selection, statistical modeling and its applications to universal JPEG steganalyzer

    Energy Technology Data Exchange (ETDEWEB)

    Jalan, Jaikishan [Iowa State Univ., Ames, IA (United States)

    2009-01-01

    Steganalysis deals with identifying instances of a medium which carry a message for communication while concealing its existence. This research focuses on steganalysis of JPEG images because of their ubiquitous nature and low bandwidth requirements for storage and transmission. JPEG image steganalysis is generally addressed by representing an image with lower-dimensional features, such as statistical properties, and then training a classifier on the feature set to differentiate between an innocent and a stego image. Our approach is twofold. First, we propose a new feature reduction technique that applies the Mahalanobis distance to rank features for steganalysis; many successful steganalysis algorithms use a large number of features relative to the size of the training set and thus suffer from a "curse of dimensionality". We apply this technique to the state-of-the-art steganalyzer proposed by Tomás Pevný (54) to understand the feature space complexity and the effectiveness of the features for steganalysis. We show that, using our approach, reduced-feature steganalyzers can be obtained that perform as well as the original steganalyzer. Based on our experimental observations, we then propose a new modeling technique for steganalysis by developing a Partially Ordered Markov Model (POMM) (23) for JPEG images and using its properties to train a Support Vector Machine. The POMM generalizes the concept of local neighborhood directionality by using a partial order underlying the pixel locations. We show that the proposed steganalyzer outperforms a state-of-the-art steganalyzer by testing our approach on many different image databases, totaling 20000 images. Finally, we provide a software package with a Graphical User Interface, developed to make this research accessible to local state forensic departments.
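
    The Mahalanobis-distance feature ranking can be sketched in a simplified, univariate form: each feature is scored by the squared gap between the cover and stego class means, normalized by the pooled variance. This is an illustrative reading; the thesis itself may use the full covariance matrix, and the data below are invented:

    ```python
    import numpy as np

    def mahalanobis_rank(X_cover, X_stego):
        """Rank features by a per-feature (univariate) Mahalanobis-style distance
        between cover and stego feature distributions: (mean gap)^2 / pooled variance."""
        mu_c, mu_s = X_cover.mean(axis=0), X_stego.mean(axis=0)
        pooled_var = (X_cover.var(axis=0) + X_stego.var(axis=0)) / 2 + 1e-12
        score = (mu_c - mu_s) ** 2 / pooled_var
        return np.argsort(score)[::-1]  # most discriminative feature first

    rng = np.random.default_rng(0)
    X_cover = rng.normal(0.0, 1.0, size=(200, 4))
    X_stego = rng.normal(0.0, 1.0, size=(200, 4))
    X_stego[:, 2] += 3.0  # pretend embedding perturbs feature 2 only
    ranking = mahalanobis_rank(X_cover, X_stego)
    ```

    A reduced-feature steganalyzer would then keep only the top-ranked entries of `ranking` before training the classifier.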

  13. Data Field Modeling and Spectral-Spatial Feature Fusion for Hyperspectral Data Classification.

    Science.gov (United States)

    Liu, Da; Li, Jianxun

    2016-12-16

    Classification is a significant subject in hyperspectral remote sensing image processing. This study proposes a spectral-spatial feature fusion algorithm for the classification of hyperspectral images (HSI). Unlike existing spectral-spatial classification methods, the influences and interactions of the surroundings on each measured pixel were taken into consideration in this paper. Data field theory was employed as the mathematical realization of the field theory concept in physics, and both the spectral and spatial domains of HSI were considered as data fields. Therefore, the inherent dependency of interacting pixels was modeled. Using data field modeling, spatial and spectral features were transformed into a unified radiation form and further fused into a new feature by using a linear model. In contrast to the current spectral-spatial classification methods, which usually simply stack spectral and spatial features together, the proposed method builds the inner connection between the spectral and spatial features, and explores the hidden information that contributed to classification. Therefore, new information is included for classification. The final classification result was obtained using a random forest (RF) classifier. The proposed method was tested with the University of Pavia and Indian Pines, two well-known standard hyperspectral datasets. The experimental results demonstrate that the proposed method has higher classification accuracies than those obtained by the traditional approaches.
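
    Once the spectral and spatial descriptors are in a unified "radiation" form, the fusion step described above is a linear combination. The standardization step and the weight `alpha` below are assumptions added for illustration, not the paper's exact formulation:

    ```python
    import numpy as np

    def linear_fuse(spectral, spatial, alpha=0.5):
        """Fuse standardized spectral and spatial feature matrices with a linear model.
        alpha weights the spectral term; (1 - alpha) weights the spatial term."""
        z = lambda F: (F - F.mean(axis=0)) / (F.std(axis=0) + 1e-12)
        return alpha * z(spectral) + (1.0 - alpha) * z(spatial)

    rng = np.random.default_rng(0)
    spectral = rng.normal(size=(10, 5))   # e.g. per-pixel spectral radiation values
    spatial = rng.normal(size=(10, 5))    # e.g. per-pixel spatial radiation values
    fused = linear_fuse(spectral, spatial, alpha=0.6)
    ```

    The fused matrix would then feed the random forest classifier mentioned in the abstract.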

  14. Main modelling features of the ASTEC V2.1 major version

    International Nuclear Information System (INIS)

    Chatelard, P.; Belon, S.; Bosland, L.; Carénini, L.; Coindreau, O.; Cousin, F.; Marchetto, C.; Nowack, H.; Piar, L.; Chailan, L.

    2016-01-01

    Highlights: • Recent modelling improvements of the ASTEC European severe accident code are outlined. • Key new physical models now available in the ASTEC V2.1 major version are described. • ASTEC progress towards a multi-design reactor code is illustrated for BWR and PHWR. • ASTEC's strong link with the on-going EC CESAM FP7 project is emphasized. • Main remaining modelling issues (on which IRSN efforts are now focused) are given. - Abstract: A new major version of the European severe accident integral code ASTEC, developed by IRSN with some GRS support, was delivered in November 2015 to the ASTEC worldwide community. The main modelling features of this V2.1 version are summarised in this paper. In particular, the in-vessel coupling technique between the reactor coolant system thermal-hydraulics module and the core degradation module has been strongly re-engineered to remove some well-known weaknesses of the former V2.0 series. The V2.1 version also includes new core degradation models specifically addressing BWR and PHWR reactor types, as well as several other physical modelling improvements, notably on reflooding of severely damaged cores, Zircaloy oxidation under air atmosphere, corium coolability during corium-concrete interaction, and source term evaluation. Moreover, this V2.1 version constitutes the backbone of the CESAM FP7 project, whose final objective is to further improve ASTEC for use in severe accident management analysis of the Gen.II–III nuclear power plants presently under operation or foreseen in the near future in Europe. As part of this European project, IRSN efforts to continuously improve both code numerical robustness and computing performance at plant scale, as well as users' tools, are being intensified. Besides, ASTEC will continue capitalising the whole knowledge on severe accident phenomenology by progressively keeping physical models at the state of the art through regular feed-back from the interpretation of the current and...

  15. Spatial features register: toward standardization of spatial features

    Science.gov (United States)

    Cascio, Janette

    1994-01-01

    As the need to share spatial data increases, more than agreement on a common format is needed to ensure that the data is meaningful to both the importer and the exporter. Effective data transfer also requires common definitions of spatial features. To achieve this, part 2 of the Spatial Data Transfer Standard (SDTS) provides a model for a spatial features data content specification and a glossary of features and attributes that fit this model. The model provides a foundation for standardizing spatial features. The glossary now contains only a limited subset of hydrographic and topographic features. For it to be useful, terms and definitions must be included for other categories, such as base cartographic, bathymetric, cadastral, cultural and demographic, geodetic, geologic, ground transportation, international boundaries, soils, vegetation, water, and wetlands, and the set of hydrographic and topographic features must be expanded. This paper will review the philosophy of the SDTS part 2 and the current plans for creating a national spatial features register as one mechanism for maintaining part 2.

  16. Transverse beam splitting made operational: Key features of the multiturn extraction at the CERN Proton Synchrotron

    Directory of Open Access Journals (Sweden)

    A. Huschauer

    2017-06-01

    Full Text Available Following a successful commissioning period, the multiturn extraction (MTE) at the CERN Proton Synchrotron (PS) has been applied for the fixed-target physics programme at the Super Proton Synchrotron (SPS) since September 2015. This exceptional extraction technique was proposed to replace the long-serving continuous transfer (CT) extraction, which has the drawback of inducing high activation in the ring. MTE exploits the principles of nonlinear beam dynamics to perform loss-free beam splitting in the horizontal phase space. Over multiple turns, the resulting beamlets are then transferred to the downstream accelerator. The operational deployment of MTE was rendered possible by the full understanding and mitigation of different hardware limitations and by redesigning the extraction trajectories and nonlinear optics, which was required due to the installation of a dummy septum to reduce the activation of the magnetic extraction septum. This paper focuses on these key features, including the use of the transverse damper and the septum shadowing, which allowed a transition from the MTE study to a mature operational extraction scheme.

  17. On the relevance of spectral features for instrument classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Sigurdsson, Sigurdur; Hansen, Lars Kai

    2007-01-01

    Automatic knowledge extraction from music signals is a key component for most music organization and music information retrieval systems. In this paper, we consider the problem of instrument modelling and instrument classification from the rough audio data. Existing systems for automatic instrument...... classification operate normally on a relatively large number of features, from which those related to the spectrum of the audio signal are particularly relevant. In this paper, we confront two different models about the spectral characterization of musical instruments. The first assumes a constant envelope...

  18. Research on improving image recognition robustness by combining multiple features with associative memory

    Science.gov (United States)

    Guo, Dongwei; Wang, Zhe

    2018-05-01

    Convolutional neural networks (CNNs) achieve great success in computer vision: they can learn hierarchical representations from raw pixels and show outstanding performance in various image recognition tasks [1]. However, CNNs are easy to fool, in that it is possible to produce images totally unrecognizable to human eyes that CNNs believe with near certainty to be familiar objects [2]. In this paper, an associative memory model based on multiple features is proposed. Within this model, feature extraction and classification are carried out by a CNN, t-SNE and an exponential bidirectional associative memory neural network (EBAM). The geometric features extracted by the CNN and the digital features extracted by t-SNE are associated through the EBAM, so recognition robustness is ensured by a comprehensive assessment of the two features. With our model, we obtain an error rate of only 8% on fraudulent data. In systems that require a high safety factor, and in other key areas, strong robustness is extremely important: if image recognition robustness can be ensured, network security will be greatly improved and social production efficiency greatly enhanced.
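
    The association step can be illustrated with a classical Hebbian (outer-product) bidirectional associative memory. The bipolar patterns below are toy stand-ins for the "geometric" (CNN) and "digital" (t-SNE) feature codes, and the exponential variant (EBAM) used in the paper is not reproduced here:

    ```python
    import numpy as np

    # Toy hetero-associative memory: pair "geometric" codes (stand-in for CNN
    # features) with "digital" codes (stand-in for t-SNE features) via a Hebbian
    # outer-product weight matrix over bipolar (+1/-1) patterns.
    geo = np.array([[1, -1,  1, -1],
                    [1,  1, -1, -1]])   # mutually orthogonal geometric codes
    dig = np.array([[ 1,  1, 1],
                    [-1, -1, 1]])       # associated digital codes

    W = sum(np.outer(g, d) for g, d in zip(geo, dig))

    # Presenting a geometric code recalls its associated digital code.
    recalled = np.sign(geo[0] @ W)
    ```

    Because the stored geometric codes are orthogonal, recall is exact in this sketch; a practical EBAM adds an exponential weighting to increase storage capacity.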

  19. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    International Nuclear Information System (INIS)

    J. J. Tappen

    2003-01-01

    The purpose of this revision of ''Evaluation of the Applicability of Biosphere-Related Features, Events, and Processes (FEPs)'' (BSC 2001) is to document the screening analysis of biosphere-related primary FEPs, as identified in ''The Development of Information Catalogued in REV00 of the YMP FEP Database'' (Freeze et al. 2001), in accordance with the requirements of the final U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR Part 63. This database is referred to as the Yucca Mountain Project (YMP) FEP Database throughout this document. Those biosphere-related primary FEPs that are screened as applicable will be used to develop the conceptual model portion of the biosphere model, which will in turn be used to develop the mathematical model portion of the biosphere model. As part of this revision, any reference to the screening guidance or criteria provided either by Dyer (1999) or by the proposed NRC regulations at 64 FR 8640 has been removed. The title of this revision has been changed to more accurately reflect the purpose of the analyses. In addition, this revision will address Item Numbers 19, 20, 21, 25, and 26 from Attachment 2 of ''U.S. Nuclear Regulatory Commission/U.S. Department of Energy Technical Exchange and Management Meeting on Total System Performance Assessment and Integration (August 6 through 10, 2001)'' (Reamer 2001). This Scientific Analysis Report (SAR) does not support the current revision to the YMP FEP Database (Freeze et al. 2001). Subsequent to the release of the YMP FEP Database (Freeze et al. 2001), a series of reviews was conducted on both the FEP processes used to support Total System Performance Assessment for Site Recommendation and to develop the YMP FEP Database. In response to observations and comments from these reviews, particularly the NRC/DOE TSPA Technical Exchange in August 2001 (Reamer 2001), several Key Technical Issue (KTI) Agreements were developed. ''The Enhanced Plan for Features, Events and Processes

  20. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    J. J. Tappen

    2003-02-16

    The purpose of this revision of ''Evaluation of the Applicability of Biosphere-Related Features, Events, and Processes (FEPs)'' (BSC 2001) is to document the screening analysis of biosphere-related primary FEPs, as identified in ''The Development of Information Catalogued in REV00 of the YMP FEP Database'' (Freeze et al. 2001), in accordance with the requirements of the final U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR Part 63. This database is referred to as the Yucca Mountain Project (YMP) FEP Database throughout this document. Those biosphere-related primary FEPs that are screened as applicable will be used to develop the conceptual model portion of the biosphere model, which will in turn be used to develop the mathematical model portion of the biosphere model. As part of this revision, any reference to the screening guidance or criteria provided either by Dyer (1999) or by the proposed NRC regulations at 64 FR 8640 has been removed. The title of this revision has been changed to more accurately reflect the purpose of the analyses. In addition, this revision will address Item Numbers 19, 20, 21, 25, and 26 from Attachment 2 of ''U.S. Nuclear Regulatory Commission/U.S. Department of Energy Technical Exchange and Management Meeting on Total System Performance Assessment and Integration (August 6 through 10, 2001)'' (Reamer 2001). This Scientific Analysis Report (SAR) does not support the current revision to the YMP FEP Database (Freeze et al. 2001). Subsequent to the release of the YMP FEP Database (Freeze et al. 2001), a series of reviews was conducted on both the FEP processes used to support Total System Performance Assessment for Site Recommendation and to develop the YMP FEP Database. In response to observations and comments from these reviews, particularly the NRC/DOE TSPA Technical Exchange in August 2001 (Reamer 2001), several Key Technical Issue (KTI) Agreements were developed

  1. Unravelling Some of the Key Transformations in the Hydrothermal Liquefaction of Lignin.

    Science.gov (United States)

    Lui, Matthew Y; Chan, Bun; Yuen, Alexander K L; Masters, Anthony F; Montoya, Alejandro; Maschmeyer, Thomas

    2017-05-22

    Using both experimental and computational methods, focusing on intermediates and model compounds, some of the main features of the reaction mechanisms that operate during the hydrothermal processing of lignin were elucidated. Key reaction pathways and their connection to different structural features of lignin were proposed. Under neutral conditions, subcritical water was demonstrated to act as a bifunctional acid/base catalyst for the dissection of lignin structures. In a complex web of mutually dependent interactions, guaiacyl units within lignin were shown to significantly affect overall lignin reactivity. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Correlation between clinical and histological features in a pig model of choroidal neovascularization

    DEFF Research Database (Denmark)

    Lassota, Nathan; Kiilgaard, Jens Folke; Prause, Jan Ulrik

    2006-01-01

    To analyse the histological changes in the retina and the choroid in a pig model of choroidal neovascularization (CNV) and to correlate these findings with fundus photographic and fluorescein angiographic features.

  3. Improving permafrost distribution modelling using feature selection algorithms

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2016-04-01

    The availability of an increasing number of spatial data on the occurrence of mountain permafrost allows the employment of machine learning (ML) classification algorithms for modelling the distribution of the phenomenon. One of the major problems when dealing with high-dimensional datasets is the number of input features (variables) involved. Applying ML classification algorithms to this large number of variables leads to the risk of overfitting, with the consequence of poor generalization/prediction. For this reason, applying feature selection (FS) techniques helps simplify the set of factors required and improves knowledge of the adopted features and their relation with the studied phenomenon. Moreover, removing irrelevant or redundant variables from the dataset effectively improves the quality of the ML prediction. This research deals with a comparative analysis of permafrost distribution models supported by FS variable importance assessment. The input dataset (dimension = 20-25, 10 m spatial resolution) was constructed using landcover maps, climate data and DEM-derived variables (altitude, aspect, slope, terrain curvature, solar radiation, etc.). It was completed with permafrost evidence (geophysical and thermal data and rock glacier inventories) that serves as permafrost training data. The FS algorithms used indicate which variables appear less statistically important for permafrost presence/absence. Three different algorithms were compared: Information Gain (IG), Correlation-based Feature Selection (CFS) and Random Forest (RF). IG is a filter technique that evaluates the worth of a predictor by measuring the information gain with respect to permafrost presence/absence. Conversely, CFS evaluates the worth of a subset of predictors by considering the individual predictive ability of each variable along with the degree of redundancy between them. Finally, RF is a ML algorithm that performs FS as part of its...
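
    Of the three algorithms, the Information Gain filter is the simplest to sketch; here the mutual information between each predictor and the presence/absence label plays the role of the IG score. The toy "permafrost" data and the altitude threshold are invented for illustration:

    ```python
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    rng = np.random.default_rng(1)

    # Toy dataset: altitude is informative, three noise features are not.
    altitude = rng.uniform(1500.0, 3500.0, size=500)
    noise = rng.normal(size=(500, 3))
    X = np.column_stack([altitude, noise])
    y = (altitude > 2500.0).astype(int)  # permafrost presence above a threshold altitude

    # Mutual information between each feature and the class label (an IG-style filter score).
    ig_scores = mutual_info_classif(X, y, random_state=0)
    ```

    Ranking features by `ig_scores` and dropping the low scorers is exactly the filter-style FS step the abstract describes; CFS and RF importance would score subsets and ensembles instead of single features.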

  4. Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies......This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen, 2003) developed as a result of ten years...... of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process....

  5. Observations and models of star formation in the tidal features of interacting galaxies

    International Nuclear Information System (INIS)

    Wallin, J.F.; Schombert, J.M.; Struck-Marcell, C.

    1990-01-01

    Multi-color surface photometry (BVri) is presented for the tidal features in a sample of interacting galaxies. Large color variations are found between the morphological components and within the individual components. The blue colors in the primary and the tidal features are most dramatic in B-V, and not in V-i, indicating that star formation, rather than metallicity or age, dominates the colors. Color variation between components is larger in systems shortly after the interaction begins and diminishes to a very low level in systems which have merged. Photometric models for interacting systems are presented which suggest that a weak burst of star formation in the tidal features could cause the observed color distributions. Dynamical models indicate that compression occurs during the development of tidal features, causing an increase in the local density by a factor of between 1.5 and 5. Assuming this density increase can be related to the star formation rate by a Schmidt law, the density increases observed in the dynamical models may be responsible for the variations in color seen in some of the interacting systems. Limitations of the dynamical models are also discussed

  6. Key Features of Academic Detailing: Development of an Expert Consensus Using the Delphi Method.

    Science.gov (United States)

    Yeh, James S; Van Hoof, Thomas J; Fischer, Michael A

    2016-02-01

    Academic detailing is an outreach education technique that combines the direct social marketing traditionally used by pharmaceutical representatives with unbiased content summarizing the best evidence for a given clinical issue. Academic detailing is conducted with clinicians to encourage evidence-based practice in order to improve the quality of care and patient outcomes. The adoption of academic detailing has increased substantially since the original studies in the 1980s. However, the lack of standard agreement on its implementation makes the evaluation of academic detailing outcomes challenging. To identify consensus on the key elements of academic detailing among a group of experts with varying experiences in academic detailing. This study is based on an online survey of 20 experts with experience in academic detailing. We used the Delphi process, an iterative and systematic method of developing consensus within a group. We conducted 3 rounds of online surveys, which addressed 72 individual items derived from a previous literature review of 5 features of academic detailing, including (1) content, (2) communication process, (3) clinicians targeted, (4) change agents delivering intervention, and (5) context for intervention. Nonrespondents were removed from later rounds of the surveys. For most questions, a 4-point ordinal scale was used for responses. We defined consensus agreement as 70% of respondents for a single rating category or 80% for dichotomized ratings. The overall survey response rate was 95% (54 of 57 surveys) and nearly 92% consensus agreement on the survey items (66 of 72 items) by the end of the Delphi exercise. The experts' responses suggested that (1) focused clinician education offering support for clinical decision-making is a key component of academic detailing, (2) detailing messages need to be tailored and provide feasible strategies and solutions to challenging cases, and (3) academic detailers need to develop specific skill sets
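
    The consensus rules quoted above (70% of respondents for a single rating category, or 80% after dichotomizing) can be written down directly. The split of the 4-point scale into 1-2 versus 3-4 for dichotomization is an assumption of this sketch:

    ```python
    def consensus_reached(ratings, single_cut=0.70, dichotomous_cut=0.80):
        """Apply the study's consensus rules to one survey item.
        ratings: list of responses on a 4-point ordinal scale (1-4)."""
        n = len(ratings)
        # Rule 1: 70% of respondents choose the same single rating category.
        top_single = max(ratings.count(r) for r in set(ratings)) / n
        # Rule 2: 80% agreement after dichotomizing (assumed split: 1-2 vs 3-4).
        agree = sum(1 for r in ratings if r >= 3) / n
        dichotomous = max(agree, 1.0 - agree)
        return top_single >= single_cut or dichotomous >= dichotomous_cut

    item_ok = consensus_reached([4, 4, 4, 3, 3, 3, 3, 2, 4, 4])
    ```

    Items failing both rules would be carried into the next Delphi round for re-rating.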

  7. Feature Set Evaluation for Offline Handwriting Recognition Systems: Application to the Recurrent Neural Network Model.

    Science.gov (United States)

    Chherawala, Youssouf; Roy, Partha Pratim; Cheriet, Mohamed

    2016-12-01

    The performance of handwriting recognition systems is dependent on the features extracted from the word image. A large body of features exists in the literature, but no method has yet been proposed to identify the most promising of these, other than a straightforward comparison based on the recognition rate. In this paper, we propose a framework for feature set evaluation based on a collaborative setting. We use a weighted vote combination of recurrent neural network (RNN) classifiers, each trained with a particular feature set. This combination is modeled in a probabilistic framework as a mixture model and two methods for weight estimation are described. The main contribution of this paper is to quantify the importance of feature sets through the combination weights, which reflect their strength and complementarity. We chose the RNN classifier because of its state-of-the-art performance. Also, we provide the first feature set benchmark for this classifier. We evaluated several feature sets on the IFN/ENIT and RIMES databases of Arabic and Latin script, respectively. The resulting combination model is competitive with state-of-the-art systems.
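
    The weighted vote over per-feature-set classifiers amounts to a convex mixture of their posterior distributions. The weights below are illustrative; in the paper they are estimated from data and quantify each feature set's strength:

    ```python
    import numpy as np

    def weighted_vote(posteriors, weights):
        """Combine per-feature-set classifier posteriors with normalized mixture weights."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()  # normalize so the combination stays a distribution
        return sum(wi * np.asarray(p, dtype=float) for wi, p in zip(w, posteriors))

    # Two classifiers (e.g. RNNs trained on different feature sets) voting on one word image:
    p_combined = weighted_vote([[0.9, 0.1], [0.5, 0.5]], weights=[3, 1])
    ```

    A large learned weight marks a strong or complementary feature set, which is exactly the quantity the paper uses to rank feature sets.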

  8. Review of research in feature based design

    NARCIS (Netherlands)

    Salomons, O.W.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1993-01-01

    Research in feature-based design is reviewed. Feature-based design is regarded as a key factor towards CAD/CAPP integration from a process planning point of view. From a design point of view, feature-based design offers possibilities for supporting the design process better than current CAD systems

  9. Discriminative topological features reveal biological network mechanisms

    Directory of Open Access Journals (Sweden)

    Levovitz Chaya

    2004-11-01

    Full Text Available Abstract Background Recent genomic and bioinformatic advances have motivated the development of numerous network models intended to describe graphs of biological, technological, and sociological origin. In most cases the success of a model has been evaluated by how well it reproduces a few key features of the real-world data, such as degree distributions, mean geodesic lengths, and clustering coefficients. Often pairs of models can reproduce these features with indistinguishable fidelity despite being generated by vastly different mechanisms. In such cases, these few target features are insufficient to distinguish which of the different models best describes real-world networks of interest; moreover, it is not clear a priori that any of the presently existing algorithms for network generation offers a predictive description of the networks inspiring them. Results We present a method to assess systematically which of a set of proposed network generation algorithms gives the most accurate description of a given biological network. To derive discriminative classifiers, we construct a mapping from the set of all graphs to a high-dimensional (in principle infinite-dimensional) "word space". This map defines an input space for classification schemes which allow us to state unambiguously which models are most descriptive of a given network of interest. Our training sets include networks generated from 17 models either drawn from the literature or introduced in this work. We show that different duplication-mutation schemes best describe the E. coli genetic network, the S. cerevisiae protein interaction network, and the C. elegans neuronal network, out of a set of network models including a linear preferential attachment model and a small-world model. Conclusions Our method is a first step towards systematizing network models and assessing their predictability, and we anticipate its usefulness for a number of communities.

  10. Hierarchical Feature Extraction With Local Neural Response for Image Recognition.

    Science.gov (United States)

    Li, Hong; Wei, Yantao; Li, Luoqing; Chen, C L P

    2013-04-01

    In this paper, a hierarchical feature extraction method is proposed for image recognition. The key idea of the proposed method is to extract an effective feature, called local neural response (LNR), of the input image with nontrivial discrimination and invariance properties by alternating between local coding and maximum pooling operation. The local coding, which is carried out on the locally linear manifold, can extract the salient feature of image patches and leads to a sparse measure matrix on which maximum pooling is carried out. The maximum pooling operation builds the translation invariance into the model. We also show that other invariant properties, such as rotation and scaling, can be induced by the proposed model. In addition, a template selection algorithm is presented to reduce computational complexity and to improve the discrimination ability of the LNR. Experimental results show that our method is robust to local distortion and clutter compared with state-of-the-art algorithms.
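
    The alternation of local coding and maximum pooling described above can be sketched in a few lines. Cosine-similarity coding stands in for the paper's locally linear coding, and the patch and template shapes are invented; the point is that pooling over patch positions yields a feature invariant to where a patch occurs:

    ```python
    import numpy as np

    def local_neural_response(patches, templates):
        """Toy LNR layer: code each patch against templates (normalized similarity),
        then max-pool over patch positions for translation invariance."""
        P = patches / (np.linalg.norm(patches, axis=1, keepdims=True) + 1e-12)
        T = templates / (np.linalg.norm(templates, axis=1, keepdims=True) + 1e-12)
        codes = P @ T.T              # measure matrix: one code per (patch, template) pair
        return codes.max(axis=0)     # pooling over positions -> one value per template

    rng = np.random.default_rng(0)
    patches = rng.normal(size=(6, 16))    # image patches (positions x patch dimension)
    templates = rng.normal(size=(4, 16))  # selected templates
    feature = local_neural_response(patches, templates)
    ```

    Reordering the patches (i.e. translating their positions) leaves the pooled feature unchanged, which is the invariance property the paper builds on; the template selection algorithm would shrink the `templates` set.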

  11. Backbending feature of rotational spectra in the generalized variable-moment-of-inertia model and its equivalence with the Harris model

    International Nuclear Information System (INIS)

    Mantri, A.N.

    1975-01-01

    The equivalence of Harris model equations with those of the generalized variable-moment-of-inertia (GVMI) model given by Das et al. is examined in the light of the backbending feature of the rotational states. It is shown that this feature is absent in the Harris model taken to any order. The GVMI model equations are found to be consistent and in one-to-one correspondence with an expansion of the square of the angular velocity in terms of a polynomial in the moment of inertia rather than with the Harris expansion, and may give a backbending feature in some cases depending on the relative values of the parameters appearing in the potential energy term.

  12. IBM model M keyboard

    CERN Multimedia

    1985-01-01

    In 1985, the IBM Model M keyboard was created. This timeless classic was a hit. IBM came out with several variants of the Model M. They had the Space Saver 104-key version, which is the one most seen today, and many international versions of it as well. The second type, and the rarest, is the 122-key Model M, which has 24 extra keys at the very top and was dubbed the “programmer's keyboard”. IBM manufactured these keyboards until 1991. The Model M features “caps” over the actual keys that can be taken off one at a time for cleaning, or to be replaced with colored keys or keys of another language; this was a very cost-effective way of shipping the keyboards internationally.

  13. The Impact Of Website Design Features On Behavioral Intentions

    Directory of Open Access Journals (Sweden)

    Chun-Chin Chiu

    2015-08-01

    Full Text Available The design of a website interface plays an important role in online purchasing, and customers are more likely to visit and buy from better-designed websites. However, previous studies have not provided consistent information about the features a website should provide. Based on Hausman and Siekpe's (2009) comprehensive model, this study aims to empirically verify whether the model can be applied in e-service markets to predict and explain website users' behavioral intentions (trade intentions and revisit intentions). Based on the data from a survey of 303 Internet users, the results indicate that computer factors and human factors (the key website design features) are significantly related to website users' experiences (perceived usefulness, perceived entertainment value, and perceived informativeness), which in turn significantly affect the intermediary outcomes of attitude toward the site, and ultimately influence users' behavioral intentions.

  14. Constructing and validating readability models: the method of integrating multilevel linguistic features with machine learning.

    Science.gov (United States)

    Sung, Yao-Ting; Chen, Ju-Ling; Cha, Ji-Her; Tseng, Hou-Chiang; Chang, Tao-Hsing; Chang, Kuo-En

    2015-06-01

    Multilevel linguistic features have been proposed for discourse analysis, but there have been few applications of multilevel linguistic features to readability models and also few validations of such models. Most traditional readability formulae are based on generalized linear models (GLMs; e.g., discriminant analysis and multiple regression), but these models have to comply with certain statistical assumptions about data properties and include all of the data in formulae construction without pruning the outliers in advance. The use of such readability formulae tends to produce a low text classification accuracy, while using a support vector machine (SVM) in machine learning can enhance the classification outcome. The present study constructed readability models by integrating multilevel linguistic features with SVM, which is more appropriate for text classification. Taking the Chinese language as an example, this study developed 31 linguistic features as the predicting variables at the word, semantic, syntax, and cohesion levels, with grade levels of texts as the criterion variable. The study compared four types of readability models by integrating unilevel and multilevel linguistic features with GLMs and an SVM. The results indicate that adopting a multilevel approach in readability analysis provides a better representation of the complexities of both texts and the reading comprehension process.
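
    For flavor, a toy extractor for a few word- and sentence-level features of the kind such readability models consume (the paper's 31 Chinese-language features across word, semantic, syntax, and cohesion levels are not reproduced here) could be:

```python
import re

def readability_features(text):
    """Toy multilevel feature extractor (word and sentence level only).
    Real readability models would feed features like these, plus
    semantic and cohesion measures, to an SVM or GLM."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    return {
        "n_words": len(words),
        "avg_word_len": sum(map(len, words)) / len(words),
        "avg_sentence_len": len(words) / len(sentences),
        "type_token_ratio": len({w.lower() for w in words}) / len(words),
    }

f = readability_features("The cat sat. The cat ran away quickly!")
print(f["n_words"], f["avg_sentence_len"])  # 8 words, 4.0 words/sentence
```

    With grade level as the criterion variable, vectors of such features become the input rows of the classifier's training matrix.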

  15. RELAP5-3D Code Includes ATHENA Features and Models

    International Nuclear Information System (INIS)

    Riemke, Richard A.; Davis, Cliff B.; Schultz, Richard R.

    2006-01-01

    Version 2.3 of the RELAP5-3D computer program includes all features and models previously available only in the ATHENA version of the code. These include the addition of new working fluids (i.e., ammonia, blood, carbon dioxide, glycerol, helium, hydrogen, lead-bismuth, lithium, lithium-lead, nitrogen, potassium, sodium, and sodium-potassium) and a magnetohydrodynamic model that expands the capability of the code to model many more thermal-hydraulic systems. In addition to the new working fluids along with the standard working fluid water, one or more noncondensable gases (e.g., air, argon, carbon dioxide, carbon monoxide, helium, hydrogen, krypton, nitrogen, oxygen, SF6, xenon) can be specified as part of the vapor/gas phase of the working fluid. These noncondensable gases were in previous versions of RELAP5-3D. Recently four molten salts have been added as working fluids to RELAP5-3D Version 2.4, which has had limited release. These molten salts will be in RELAP5-3D Version 2.5, which will have a general release like RELAP5-3D Version 2.3. Applications that use these new features and models are discussed in this paper. (authors)

  16. Keys to the House: Unlocking Residential Savings With Program Models for Home Energy Upgrades

    Energy Technology Data Exchange (ETDEWEB)

    Grevatt, Jim [Energy Futures Group (United States); Hoffman, Ian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hoffmeyer, Dale [US Department of Energy, Washington, DC (United States)

    2017-07-05

    After more than 40 years of effort, energy efficiency program administrators and associated contractors still find it challenging to penetrate the home retrofit market, especially at levels commensurate with state and federal goals for energy savings and emissions reductions. Residential retrofit programs further have not coalesced around a reliably successful model. They still vary in design, implementation and performance, and they remain among the more difficult and costly options for acquiring savings in the residential sector. If programs are to contribute fully to meeting resource and policy objectives, administrators need to understand what program elements are key to acquiring residential savings as cost effectively as possible. To that end, the U.S. Department of Energy (DOE) sponsored a comprehensive review and analysis of home energy upgrade programs with proven track records, focusing on those with robustly verified savings and constituting good examples for replication. The study team reviewed evaluations for the period 2010 to 2014 for 134 programs that are funded by customers of investor-owned utilities. All are programs that promote multi-measure retrofits or major system upgrades. We paid particular attention to useful design and implementation features, costs, and savings for nearly 30 programs with rigorous evaluations of performance. This meta-analysis describes program models and implementation strategies for (1) direct install retrofits; (2) heating, ventilating and air-conditioning (HVAC) replacement and early retirement; and (3) comprehensive, whole-home retrofits. We analyze costs and impacts of these program models, in terms of both energy savings and emissions avoided. These program models can be useful guides as states consider expanding their strategies for acquiring energy savings as a resource and for emissions reductions. We also discuss the challenges of using evaluations to create program models that can be confidently applied in

  17. Hidden discriminative features extraction for supervised high-order time series modeling.

    Science.gov (United States)

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

    In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named as Tensor Discriminative Feature Extraction (TDFE). TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) reduces dimensionality while robustly mining the underlying discriminative features, ii) results in effective interpretable features that lead to an improved classification and visualization, and iii) reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue issue at each alternation step. Two real third-order tensor-structures of time series datasets (an epilepsy electroencephalogram (EEG) that is modeled as channel×frequency bin×time frame and a microarray data that is modeled as gene×sample×time) were used for the evaluation of the TDFE. The experiment results corroborate the advantages of the proposed method with averages of 98.26% and 89.63% for the classification accuracies of the epilepsy dataset and the microarray dataset, respectively. These performance averages represent an improvement on those of the matrix-based algorithms and recent tensor-based, discriminant-decomposition approaches; this is especially the case considering the small number of samples that are used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.
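
    The between-class/within-class scatter criterion at the heart of TDFE can be illustrated on ordinary vector data with a generalized eigenvalue solve; this hedged sketch omits the Tucker structure and the per-mode alternation of the actual method:

```python
import numpy as np

def fisher_subspace(X, y, k=1, reg=1e-3):
    """Sketch of the scatter-ratio step behind TDFE: maximize
    between-class scatter over within-class scatter by solving a
    generalized eigenvalue problem. Here the data are ordinary
    vectors; TDFE applies the same criterion per tensor mode."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)  # between-class scatter
        Sw += (Xc - mc).T @ (Xc - mc)               # within-class scatter
    Sw += reg * np.eye(d)                           # regularize for stability
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:k]]                  # top-k discriminative axes

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 3)) + [5, 0, 0],
               rng.normal(0, 1, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
W = fisher_subspace(X, y, k=1)
print(W.shape)  # (3, 1): one axis, dominated by the separating coordinate
```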

  18. Benefit salience and consumers' selective attention to product features

    OpenAIRE

    Ratneshwar, S; Warlop, Luk; Mick, DG; Seeger, G

    1997-01-01

    Although attention is a key construct in models of marketing communication and consumer choice, its selective nature has rarely been examined in common time-pressured conditions. We focus on the role of benefit salience, that is, the readiness with which particular benefits are brought to mind by consumers in relation to a given product category. Study I demonstrated that when product feature information was presented rapidly, individuals for whom the benefit of personalised customer service ...

  19. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine...... with a multivariate analysis technique from RGB pictures. The color information is also transformed to hue, saturation and intensity components. Both sets of image features are combined with traditional process measurements to obtain an inferential model by partial least squares (PLS) regression. A dynamic PLS model...... oxides (NOx) emission of cement kilns. On-site tests demonstrate improved performance over soft sensors based on conventional process measurements only....
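
    A minimal single-block PLS1 regression of the kind used to relate fused features to a quality variable might be sketched as follows; the toy data are invented, and a real soft sensor would use a dynamic, validated PLS implementation:

```python
import numpy as np

def pls1_fit(X, y, n_comp=2):
    """Minimal NIPALS-style PLS1: extract score/loading pairs, deflate,
    and fold the components back into regression coefficients."""
    xm, ym = X.mean(axis=0), y.mean()
    Xr, yr = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr
        w = w / np.linalg.norm(w)   # weight from X-y covariance
        t = Xr @ w                  # score
        p = Xr.T @ t / (t @ t)      # X loading
        q = (yr @ t) / (t @ t)      # regression of y on the score
        Xr = Xr - np.outer(t, p)    # deflate X
        yr = yr - q * t             # deflate y
        W.append(w); P.append(p); Q.append(q)
    Wm, Pm = np.array(W).T, np.array(P).T
    B = Wm @ np.linalg.solve(Pm.T @ Wm, np.array(Q))  # coefficients in X space
    return lambda Xn: (Xn - xm) @ B + ym

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))                 # fused image + process features
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.01 * rng.normal(size=50)
predict = pls1_fit(X, y, n_comp=2)
r2 = 1 - ((predict(X) - y) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(round(r2, 3))
```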

  20. Heuristic algorithms for feature selection under Bayesian models with block-diagonal covariance structure.

    Science.gov (United States)

    Foroughi Pour, Ali; Dalton, Lori A

    2018-03-21

    Many bioinformatics studies aim to identify markers, or features, that can be used to discriminate between distinct groups. In problems where strong individual markers are not available, or where interactions between gene products are of primary interest, it may be necessary to consider combinations of features as a marker family. To this end, recent work proposes a hierarchical Bayesian framework for feature selection that places a prior on the set of features we wish to select and on the label-conditioned feature distribution. While an analytical posterior under Gaussian models with block covariance structures is available, the optimal feature selection algorithm for this model remains intractable since it requires evaluating the posterior over the space of all possible covariance block structures and feature-block assignments. To address this computational barrier, in prior work we proposed a simple suboptimal algorithm, 2MNC-Robust, with robust performance across the space of block structures. Here, we present three new heuristic feature selection algorithms. The proposed algorithms outperform 2MNC-Robust and many other popular feature selection algorithms on synthetic data. In addition, enrichment analysis on real breast cancer, colon cancer, and Leukemia data indicates they also output many of the genes and pathways linked to the cancers under study. Bayesian feature selection is a promising framework for small-sample high-dimensional data, in particular biomarker discovery applications. When applied to cancer data, these algorithms output many genes already shown to be involved in cancer as well as potentially new biomarkers. Furthermore, one of the proposed algorithms, SPM, outputs blocks of heavily correlated genes, particularly useful for studying gene interactions and gene networks.

  1. Model-independent phenotyping of C. elegans locomotion using scale-invariant feature transform.

    Directory of Open Access Journals (Sweden)

    Yelena Koren

    Full Text Available To uncover the genetic basis of behavioral traits in the model organism C. elegans, a common strategy is to study locomotion defects in mutants. Despite efforts to introduce (semi-)automated phenotyping strategies, current methods overwhelmingly depend on worm-specific features that must be hand-crafted and as such are not generalizable for phenotyping motility in other animal models. Hence, there is an ongoing need for robust algorithms that can automatically analyze and classify motility phenotypes quantitatively. To this end, we have developed a fully-automated approach to characterize C. elegans phenotypes that does not require the definition of nematode-specific features. Rather, we make use of the popular computer vision Scale-Invariant Feature Transform (SIFT), from which we construct histograms of commonly-observed SIFT features to represent nematode motility. We first evaluated our method on a synthetic dataset simulating a range of nematode crawling gaits. Next, we evaluated our algorithm on two distinct datasets of crawling C. elegans with mutants affecting neuromuscular structure and function. Not only is our algorithm able to detect differences between strains, but the results also capture similarities in locomotory phenotypes that lead to clustering consistent with expectations based on genetic relationships. Our proposed approach generalizes directly and should be applicable to other animal models. Such applicability holds promise for computational ethology as more groups collect high-resolution image data of animal behavior.
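
    The histogram-of-features representation the authors build from SIFT descriptors can be sketched with a toy codebook; the two-dimensional descriptors and codewords below are illustrative stand-ins for real 128-dimensional SIFT output and a learned vocabulary:

```python
import numpy as np

def bag_of_features(descriptors, codebook):
    """Build a normalized histogram of codeword occurrences: assign
    each descriptor to its nearest codeword, count, and normalize.
    SIFT extraction and codebook learning are assumed done elsewhere."""
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)                    # nearest codeword per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
descriptors = np.array([[0.1, -0.2], [9.8, 10.1], [10.2, 9.9], [0.3, 0.1]])
print(bag_of_features(descriptors, codebook))  # [0.5 0.5]
```

    Histograms computed per video can then be compared or clustered without any worm-specific feature engineering.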

  2. Overall feature of EAST operation space by using simple Core-SOL-Divertor model

    International Nuclear Information System (INIS)

    Hiwatari, R.; Hatayama, A.; Zhu, S.; Takizuka, T.; Tomita, Y.

    2005-01-01

    We have developed a simple Core-SOL-Divertor (C-S-D) model to investigate qualitatively the overall features of the operational space for the integrated core and edge plasma. To construct the simple C-S-D model, a simple core plasma model of ITER physics guidelines and a two-point SOL-divertor model are used. The simple C-S-D model is applied to the study of the EAST operational space with lower hybrid current drive experiments under various trade-offs among the basic plasma parameters. Effective methods for extending the operation space are also presented. As shown by this study for the EAST operation space, it is evident that the C-S-D model is a useful tool to understand qualitatively the overall features of the plasma operation space. (author)

  3. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation.

    Science.gov (United States)

    Mourad, Raphaël; Cuvier, Olivier

    2016-05-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment test and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1.
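
    The multiple logistic regression at the core of the method can be sketched on synthetic data; here the coefficient signs play the role of identifying positive versus negative drivers of domain borders (the data and "features" are invented for illustration):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain multiple logistic regression by gradient ascent; in the
    paper X would hold genomic-feature signals at candidate loci and
    y would mark domain borders, so positive weights flag positive
    drivers and negative weights flag drivers that counteract borders."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)       # log-likelihood gradient
    return w

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
# feature 0 promotes borders, feature 1 opposes them
p = 1.0 / (1.0 + np.exp(-(2 * X[:, 0] - 2 * X[:, 1])))
y = (rng.uniform(size=200) < p).astype(float)
w = fit_logistic(X, y)
print(w[1] > 0 and w[2] < 0)  # True: signs recover driver direction
```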

  4. Prediction models for solitary pulmonary nodules based on curvelet textural features and clinical parameters.

    Science.gov (United States)

    Wang, Jing-Jing; Wu, Hai-Feng; Sun, Tao; Li, Xia; Wang, Wei; Tao, Li-Xin; Huo, Da; Lv, Ping-Xin; He, Wen; Guo, Xiu-Hua

    2013-01-01

    Lung cancer, one of the leading causes of cancer-related deaths, usually appears as solitary pulmonary nodules (SPNs) which are hard to diagnose using the naked eye. In this paper, curvelet-based textural features and clinical parameters are used with three prediction models [a multilevel model, a least absolute shrinkage and selection operator (LASSO) regression method, and a support vector machine (SVM)] to improve the diagnosis of benign and malignant SPNs. Dimensionality reduction of the original curvelet-based textural features was achieved using principal component analysis. In addition, non-conditional logistical regression was used to find clinical predictors among demographic parameters and morphological features. The results showed that, combined with 11 clinical predictors, the accuracy rates using 12 principal components were higher than those using the original curvelet-based textural features. To evaluate the models, 10-fold cross validation and back substitution were applied. The results obtained, respectively, were 0.8549 and 0.9221 for the LASSO method, 0.9443 and 0.9831 for SVM, and 0.8722 and 0.9722 for the multilevel model. All in all, it was found that using curvelet-based textural features after dimensionality reduction and using clinical predictors, the highest accuracy rate was achieved with SVM. The method may be used as an auxiliary tool to differentiate between benign and malignant SPNs in CT images.
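
    The PCA step used to compress the curvelet texture features before model fitting can be sketched in a few lines, with random low-rank data standing in for the actual curvelet features:

```python
import numpy as np

def pca_reduce(X, k):
    """Reduce feature dimensionality with PCA via the SVD of the
    centered feature matrix; returns the k-component scores and the
    per-component explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return Xc @ Vt[:k].T, explained[:k]

rng = np.random.default_rng(4)
latent = rng.normal(size=(100, 2))                       # 2 true factors
X = latent @ rng.normal(size=(2, 10)) + 0.01 * rng.normal(size=(100, 10))
Z, ev = pca_reduce(X, 2)
print(Z.shape)  # (100, 2); two components capture nearly all variance
```

    The reduced scores Z would then be combined with clinical predictors as inputs to the LASSO, SVM, or multilevel model.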

  5. Application of a three-feature dispersed-barrier hardening model to neutron-irradiated Fe-Cr model alloys

    Science.gov (United States)

    Bergner, F.; Pareige, C.; Hernández-Mayoral, M.; Malerba, L.; Heintze, C.

    2014-05-01

    An attempt is made to quantify the contributions of different types of defect-solute clusters to the total irradiation-induced yield stress increase in neutron-irradiated (300 °C, 0.6 dpa), industrial-purity Fe-Cr model alloys (target Cr contents of 2.5, 5, 9 and 12 at.% Cr). Former work based on the application of transmission electron microscopy, atom probe tomography, and small-angle neutron scattering revealed the formation of dislocation loops, NiSiPCr-enriched clusters and α′-phase particles, which act as obstacles to dislocation glide. The values of the dimensionless obstacle strength are estimated in the framework of a three-feature dispersed-barrier hardening model. Special attention is paid to the effect of measuring errors, experimental details and model details on the estimates. The three families of obstacles and the hardening model are well capable of reproducing the observed yield stress increase as a function of Cr content, suggesting that the nanostructural features identified experimentally are the main, if not the only, causes of irradiation hardening in these model alloys.
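
    A hedged numerical sketch of the dispersed-barrier superposition: each obstacle family contributes a yield-stress increment of the form Δσ = M·α·μ·b·√(N·d), and family terms are combined linearly or in root-sum-square. The obstacle strengths, densities, and sizes below are illustrative placeholders, not the paper's fitted values:

```python
import math

# Taylor factor, shear modulus (Pa), Burgers vector (m); typical bcc-Fe values
M, mu, b = 3.06, 83e9, 0.248e-9

def dbh_term(alpha, N, d):
    """Dispersed-barrier hardening of one obstacle family:
    dsigma = M * alpha * mu * b * sqrt(N * d), with obstacle strength
    alpha, number density N (m^-3), and obstacle diameter d (m)."""
    return M * alpha * mu * b * math.sqrt(N * d)

# Illustrative families: loops, NiSiPCr clusters, alpha'-phase particles
families = [(0.33, 3e22, 5e-9), (0.10, 5e23, 2e-9), (0.05, 1e24, 2e-9)]
terms = [dbh_term(*f) for f in families]
total_linear = sum(terms)                           # linear superposition
total_rss = math.sqrt(sum(t * t for t in terms))    # root-sum-square
print([round(t / 1e6, 1) for t in terms], round(total_rss / 1e6, 1))  # MPa
```

    The choice between linear and root-sum-square superposition is itself a modeling decision of the kind the paper's sensitivity discussion addresses.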

  6. Improving model predictions for RNA interference activities that use support vector machine regression by combining and filtering features

    Directory of Open Access Journals (Sweden)

    Peek Andrew S

    2007-06-01

    Full Text Available Abstract Background RNA interference (RNAi) is a naturally occurring phenomenon that results in the suppression of a target RNA sequence utilizing a variety of possible methods and pathways. To dissect the factors that result in effective siRNA sequences, a regression kernel Support Vector Machine (SVM) approach was used to quantitatively model RNA interference activities. Results Eight overall feature mapping methods were compared in their abilities to build SVM regression models that predict published siRNA activities. The primary factors in predictive SVM models are position-specific nucleotide compositions. The secondary factors are position-independent sequence motifs (N-grams) and guide strand to passenger strand sequence thermodynamics. Finally, the factors that are least contributory but are still predictive of efficacy are measures of intramolecular guide strand secondary structure and target strand secondary structure. Of these, the site of the 5'-most base of the guide strand is the most informative. Conclusion The capacity of specific feature mapping methods and their ability to build predictive models of RNAi activity suggests a relative biological importance of these features. Some feature mapping methods are more informative in building predictive models, and overall t-test filtering provides a method to remove some noisy features or make comparisons among datasets. Together, these features can yield predictive SVM regression models with increased predictive accuracy between predicted and observed activities, both within datasets by cross validation and between independently collected RNAi activity datasets. Feature filtering to remove features should be approached carefully, in that it is possible to reduce feature set size without substantially degrading the predictive models, but the features retained in the candidate models become increasingly distinct. Software to perform feature prediction and SVM training and testing on nucleic acid
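
    The t-test filtering mentioned in the conclusions can be sketched per feature with a Welch t statistic on synthetic data; the paper's actual features are sequence-derived mappings, which this toy example does not attempt:

```python
import numpy as np

def t_filter(X, y, t_min=2.0):
    """Per-feature Welch t statistic between active (y=1) and inactive
    (y=0) examples; features with |t| below t_min are dropped. A
    simplified stand-in for the paper's t-test filtering step."""
    A, B = X[y == 1], X[y == 0]
    t = (A.mean(0) - B.mean(0)) / np.sqrt(A.var(0, ddof=1) / len(A)
                                          + B.var(0, ddof=1) / len(B))
    return np.abs(t) >= t_min

rng = np.random.default_rng(5)
y = np.array([1] * 30 + [0] * 30)
X = rng.normal(size=(60, 4))
X[:30, 0] += 3.0                      # only feature 0 is informative
keep = t_filter(X, y)
print(keep)  # feature 0 is kept; pure-noise features usually are not
```

    As the abstract cautions, such filtering trims the feature set but changes which features the downstream SVM can draw on, so it should be applied with care.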

  7. Plutonium in the environment: key factors related to impact assessment in case of an accidental atmospheric release

    Energy Technology Data Exchange (ETDEWEB)

    Guetat, P. [CEA Valduc, 21 - Is-sur-Tille (France); Moulin, V.; Reiller, P. [CEA Saclay, 91 (FR)] (and others)

    2009-07-01

    This paper deals with plutonium and key factors related to impact assessment. It is based on recent work performed by CEA which summarizes the main features of plutonium behaviour from sources inside installations to the environment and man, and reports current knowledge on the different parameters used in models for environmental and radiological impact assessment. These key factors are illustrated through a case study based on an accidental atmospheric release of Pu in a nuclear facility. (orig.)

  8. Feature Fusion Based Audio-Visual Speaker Identification Using Hidden Markov Model under Different Lighting Variations

    Directory of Open Access Journals (Sweden)

    Md. Rabiul Islam

    2014-01-01

    Full Text Available The aim of the paper is to propose a feature fusion based Audio-Visual Speaker Identification (AVSI) system with varied conditions of illumination environments. Among the different fusion strategies, feature level fusion has been used for the proposed AVSI system, where a Hidden Markov Model (HMM) is used for learning and classification. Since the feature set contains richer information about the raw biometric data than any other levels, integration at the feature level is expected to provide better authentication results. In this paper, both Mel Frequency Cepstral Coefficients (MFCCs) and Linear Prediction Cepstral Coefficients (LPCCs) are combined to get the audio feature vectors, and Active Shape Model (ASM) based appearance and shape facial features are concatenated to take the visual feature vectors. These combined audio and visual features are used for the feature fusion. To reduce the dimension of the audio and visual feature vectors, the Principal Component Analysis (PCA) method is used. The VALID audio-visual database is used to measure the performance of the proposed system, where four different illumination levels of lighting conditions are considered. Experimental results focus on the significance of the proposed audio-visual speaker identification system with various combinations of audio and visual features.
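
    Feature-level fusion as described, concatenating the per-frame audio and visual vectors and reducing the joint dimension with PCA before HMM training, can be sketched as follows; random arrays stand in for MFCC+LPCC and ASM features, and the HMM itself is not shown:

```python
import numpy as np

def fuse_features(audio_feats, visual_feats, k):
    """Feature-level fusion: concatenate per-frame audio and visual
    vectors into one joint vector, then PCA-reduce to k dimensions."""
    F = np.hstack([audio_feats, visual_feats])   # joint feature vector
    Fc = F - F.mean(axis=0)
    U, s, Vt = np.linalg.svd(Fc, full_matrices=False)
    return Fc @ Vt[:k].T                         # k-dimensional fused frames

rng = np.random.default_rng(6)
audio = rng.normal(size=(40, 12))    # stand-in for MFCC+LPCC frames
visual = rng.normal(size=(40, 6))    # stand-in for ASM shape features
Z = fuse_features(audio, visual, k=5)
print(Z.shape)  # (40, 5)
```

    The reduced frame sequence Z is what a per-speaker HMM would then be trained on.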

  9. Key determinants of managing the marketing asset of global companies

    Directory of Open Access Journals (Sweden)

    Tatyana Tsygankova

    2016-12-01

    Full Text Available As a result of organizing and summarizing the key concepts of the evolution of the marketing tools of global companies, the authors determined the role of marketing assets in the system of modern marketing management (as a dialectically higher stage of development of the analyzed tools), which will allow overcoming the antagonistic contradiction of the “P- and C-vectors” of their development. The article identified the optimal set of key elements of the system of marketing assets, which are the brand, customer loyalty, reputation, network cooperation, marketing strategy, internal marketing, the marketing information system and marketing innovation. Through correlation and regression analysis of the impact of each system element on the performance of global companies, the model of the "marketing asset octagon" was built as an integrative management tool. In constructing this model, the authors also identified the most profitable marketing assets, investment in which and the development of competencies in the efficient management of which will bring the highest profit to the company. On the basis of summarizing the regional and branch features of managing the disparate elements of the marketing assets of global companies, the key regional and sectoral priorities for the formation, development and improvement of existing concepts of international marketing management were identified, particularly in terms of building an author's integrative octagon model.

  10. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and, second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics...
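
    One conceivable way to turn consistency of flow direction into a number (a hypothetical metric for illustration, not one of the paper's three) is the share of edges pointing in the model's dominant direction, given node coordinates from the layout:

```python
def flow_consistency(positions, edges):
    """Share of edges whose dominant direction matches the most common
    direction in the model; 1.0 means perfectly consistent flow.
    Positions map node names to (x, y) layout coordinates."""
    counts = {"right": 0, "down": 0, "left": 0, "up": 0}
    for a, b in edges:
        dx = positions[b][0] - positions[a][0]
        dy = positions[b][1] - positions[a][1]
        if abs(dx) >= abs(dy):
            counts["right" if dx >= 0 else "left"] += 1
        else:
            counts["down" if dy >= 0 else "up"] += 1
    return max(counts.values()) / len(edges)

pos = {"start": (0, 0), "task": (2, 0), "check": (4, 0), "end": (4, 2)}
edges = [("start", "task"), ("task", "check"), ("check", "end")]
print(flow_consistency(pos, edges))  # 2/3: two of three edges point right
```

    Even this simple definition already shows the kind of design choices (tie-breaking, diagonal edges, weighting) that make a precise metric nontrivial.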

  11. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  12. Energy Demand Modeling Methodology of Key State Transitions of Turning Processes

    Directory of Open Access Journals (Sweden)

    Shun Jia

    2017-04-01

    Full Text Available Energy demand modeling of machining processes is the foundation of energy optimization. Energy demand of machining state transition is integral to the energy requirements of the machining process. However, research focus on energy modeling of state transition is scarce. To fill this gap, an energy demand modeling methodology of key state transitions of the turning process is proposed. The establishment of an energy demand model of state transition could improve the accuracy of the energy model of the machining process, which also provides an accurate model and reliable data for energy optimization of the machining process. Finally, case studies were conducted on a CK6153i CNC lathe, the results demonstrating that predictive accuracy with the proposed method is generally above 90% for the state transition cases.

  13. Identifying key radiogenomic associations between DCE-MRI and micro-RNA expressions for breast cancer

    Science.gov (United States)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir; Helvie, Mark A.; Kim, Renaid

    2017-03-01

    Understanding the key radiogenomic associations for breast cancer between DCE-MRI and micro-RNA expressions is the foundation for the discovery of radiomic features as biomarkers for assessing tumor progression and prognosis. We conducted a study to analyze the radiogenomic associations for breast cancer using the TCGA-TCIA data set. The core idea that tumor etiology is a function of the behavior of miRNAs is used to build the regression models. The associations based on regression are analyzed for three study outcomes: diagnosis, prognosis, and treatment. The diagnosis group consists of miRNAs associated with clinicopathologic features of breast cancer and significant aberration of expression in breast cancer patients. The prognosis group consists of miRNAs which are closely associated with tumor suppression and regulation of cell proliferation and differentiation. The treatment group consists of miRNAs that contribute significantly to the regulation of metastasis thereby having the potential to be part of therapeutic mechanisms. As a first step, important miRNA expressions were identified and their ability to classify the clinical phenotypes based on the study outcomes was evaluated using the area under the ROC curve (AUC) as a figure-of-merit. The key mapping between the selected miRNAs and radiomic features were determined using least absolute shrinkage and selection operator (LASSO) regression analysis within a two-loop leave-one-out cross-validation strategy. These key associations indicated a number of radiomic features from DCE-MRI to be potential biomarkers for the three study outcomes.
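
    The LASSO selection step can be sketched with plain coordinate descent on synthetic data; in the paper it runs inside a two-loop leave-one-out cross-validation over miRNA expressions and radiomic features, which this toy example omits:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent LASSO for (1/2n)||y - Xw||^2 + lam*||w||_1.
    Assumes roughly standardized columns of X."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]          # partial residual
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft-threshold
    return w

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 6))                      # stand-in predictor matrix
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=100)
w = lasso_cd(X, y, lam=0.1)
print(np.nonzero(np.abs(w) > 0.5)[0])  # features 0 and 3 are selected
```

    The sparsity induced by the L1 penalty is what yields a small set of key miRNA-radiomic associations rather than a dense mapping.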

  14. Description of the Lewin Natural Gas Model

    International Nuclear Information System (INIS)

    Kuuskraa, V.; Godec, M.

    1989-01-01

    This paper provides a brief description of the Lewin Natural Gas Model, shows how this model differs in key features from the other models participating in EMF-9, and describes how the different modeling scenarios analyzed in EMF-9 were implemented in the Lewin model. This background helps explain the key results that have been gained from applying the Lewin model to the EMF scenarios

  15. Grotoco@SLAM: Second Language Acquisition Modeling with Simple Features, Learners and Task-wise Models

    DEFF Research Database (Denmark)

    Klerke, Sigrid; Martínez Alonso, Héctor; Plank, Barbara

    2018-01-01

    We present our submission to the 2018 Duolingo Shared Task on Second Language Acquisition Modeling (SLAM). We focus on evaluating a range of features for the task, including user-derived measures, while examining how far we can get with a simple linear classifier. Our analysis reveals that errors...

  16. Key features for more successful place-based sustainability research on social-ecological systems: a Programme on Ecosystem Change and Society (PECS) perspective

    Directory of Open Access Journals (Sweden)

    Patricia Balvanera

    2017-03-01

    Full Text Available The emerging discipline of sustainability science is focused explicitly on the dynamic interactions between nature and society and is committed to research that spans multiple scales and can support transitions toward greater sustainability. Because a growing body of place-based social-ecological sustainability research (PBSESR) has emerged in recent decades, there is a growing need to understand better how to maximize the effectiveness of this work. The Programme on Ecosystem Change and Society (PECS) provides a unique opportunity for synthesizing insights gained from this research community on key features that may contribute to the relative success of PBSESR. We surveyed the leaders of PECS-affiliated projects using a combination of open, closed, and semistructured questions to identify which features of a research project are perceived to contribute to successful research design and implementation. We assessed six types of research features: problem orientation, research team, and contextual, conceptual, methodological, and evaluative features. We examined the desirable and undesirable aspects of each feature, the enabling factors and obstacles associated with project implementation, and asked respondents to assess the performance of their own projects in relation to these features. Responses were obtained from 25 projects working in 42 social-ecological study cases within 25 countries. Factors that contribute to the overall success of PBSESR included: explicitly addressing integrated social-ecological systems; a focus on solution- and transformation-oriented research; adaptation of studies to their local context; trusted, long-term, and frequent engagement with stakeholders and partners; and an early definition of the purpose and scope of research. Factors that hindered the success of PBSESR included: the complexities inherent to social-ecological systems, the imposition of particular epistemologies and methods on the wider research group

  17. Key features and progress of the KSTAR tokamak engineering

    International Nuclear Information System (INIS)

    Bak, J.S.; Choi, C.H.; Oh, Y.K.

    2003-01-01

    Substantial progress in the KSTAR tokamak engineering has been made on the major tokamak structures, superconducting magnets, in-vessel components, diagnostic system, heating system, and power supplies. The engineering design has been elaborated to the extent necessary to allow a realistic assessment of its feasibility, performance, and cost. Prototype fabrication has been carried out to establish reliable fabrication technologies and to confirm the validity of the analyses employed for the KSTAR design. The experimental building was completed, with beneficial occupancy for machine assembly, in September 2002. The construction of special utilities such as the cryo-plant, de-ionized water-cooling system, and main power station will begin upon completion of the building construction. The commissioning, construction, fabrication, and assembly of the whole facility will continue through the end of 2005. This paper describes the main design features and engineering progress of the KSTAR tokamak, and elaborates on the work currently underway. (author)

  18. Local and regional energy companies offering energy services: Key activities and implications for the business model

    International Nuclear Information System (INIS)

    Kindström, Daniel; Ottosson, Mikael

    2016-01-01

    Highlights: • Many companies providing energy services are experiencing difficulties. • This research identifies key activities for the provision of energy services. • Findings are aggregated to the business-model level providing managerial insights. • This research identifies two different business model innovation paths. • Energy companies may need to renew parts of, or the entire, business model. - Abstract: Energy services play a key role in increasing energy efficiency in the industry. The key actors in these services are the local and regional energy companies that are increasingly implementing energy services as part of their market offering and developing service portfolios. Although expectations for energy services have been high, progress has so far been limited, and many companies offering energy services, including energy companies, are experiencing difficulties in implementing energy services and providing them to the market. Overall, this research examines what is needed for local and regional energy companies to successfully implement energy services (and consequently provide them to the market). In doing this, a two-stage process is used: first, we identify key activities for the successful implementation of energy services, and second, we aggregate the findings to the business model level. This research demonstrates that to succeed in implementing energy services, an energy company may need to renew parts or all of its existing product-based business model, formulate a new business model, or develop coexisting multiple business models. By discussing two distinct business model innovation processes, this research demonstrates that there can be different paths to success.

  19. Advanced social features in a recommendation system for process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.; Abramowicz, W.

    2009-01-01

    Social software is known to stimulate the exchange and sharing of information among peers. This paper describes how an existing system that supports process builders in completing a business process can be enhanced with various social features. In that way, it is easier for process modelers to become

  20. Prediction of interface residue based on the features of residue interaction network.

    Science.gov (United States)

    Jiao, Xiong; Ranganathan, Shoba

    2017-11-07

    Protein-protein interaction plays a crucial role in cellular biological processes. Interface prediction can improve our understanding of the molecular mechanisms of the related processes and functions. In this work, we propose a classification method to recognize interface residues based on the features of a weighted residue interaction network. The random forest algorithm is used for the prediction, with 16 network parameters and the B-factor serving as the elements of the input feature vector. Compared with other similar work, the method is feasible and effective. The relative importance of these features is also analyzed to identify the key features for the prediction, and the biological meaning of the important features is explained. The results of this work can be used for related work on structure-function relationship analysis via a residue interaction network model. Copyright © 2017 Elsevier Ltd. All rights reserved.
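    A minimal sketch of the setup described above: a random forest over a 17-dimensional feature vector (16 network parameters plus the B-factor), with feature importances used to single out key features. The data are synthetic, and the idea that one particular network parameter drives interface membership is purely an assumption for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_residues, n_features = 500, 17                   # 16 network params + B-factor
X = rng.normal(size=(n_residues, n_features))
# Pretend feature 0 (say, a weighted-degree parameter) drives interface membership.
y = (X[:, 0] + 0.3 * rng.normal(size=n_residues) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
ranking = np.argsort(clf.feature_importances_)[::-1]
print("most important feature index:", ranking[0])
```

    On real data the `feature_importances_` ranking is what supports the paper's analysis of which network parameter is the key feature.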

  1. A 3D Printing Model Watermarking Algorithm Based on 3D Slicing and Feature Points

    Directory of Open Access Journals (Sweden)

    Giao N. Pham

    2018-02-01

    Full Text Available With the increase of three-dimensional (3D) printing applications in many areas of life, a large amount of 3D printing data is copied, shared, and used several times without any permission from the original providers. Therefore, copyright protection and ownership identification for 3D printing data in communications or commercial transactions are practical issues. This paper presents a novel watermarking algorithm for 3D printing models based on embedding watermark data into the feature points of a 3D printing model. Feature points are determined and computed by the 3D slicing process along the Z axis of a 3D printing model. The watermark data is embedded into a feature point of a 3D printing model by changing the vector length of the feature point in OXY space based on the reference length. The x and y coordinates of the feature point are then changed according to the changed vector length into which the watermark has been embedded. Experimental results verified that the proposed algorithm is invisible and robust to geometric attacks, such as rotation, scaling, and translation. The proposed algorithm provides a better method than the conventional works, and its accuracy is much higher than that of previous methods.
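    The embedding idea, writing a bit into a feature point by nudging its vector length in the OXY plane relative to a reference length, can be sketched with a simple parity quantizer. The quantization scheme, reference length, and step size below are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def embed_bit(point, bit, ref_len=1.0, step=0.05):
    """Re-quantize the (x, y) vector length of a feature point to encode one bit."""
    x, y, z = point
    r = np.hypot(x, y)                       # vector length in the OXY plane
    k = int(np.floor((r - ref_len) / step))  # quantization cell index
    if k % 2 != bit:                         # move to an adjacent cell of right parity
        k += 1
    r_new = ref_len + (k + 0.5) * step       # cell midpoint, robust to round-off
    scale = r_new / r
    return (x * scale, y * scale, z)         # z untouched: slicing is along Z

def extract_bit(point, ref_len=1.0, step=0.05):
    r = np.hypot(point[0], point[1])
    return int(np.floor((r - ref_len) / step)) % 2

p = (3.0, 4.0, 1.2)                          # |xy| = 5.0
pw = embed_bit(p, 1)
print(extract_bit(pw))                       # -> 1
```

    Because only the ratio between the feature-point length and the reference length matters, this style of embedding survives uniform scaling, which is consistent with the robustness claims above.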

  2. ClinicalKey: a point-of-care search engine.

    Science.gov (United States)

    Vardell, Emily

    2013-01-01

    ClinicalKey is a new point-of-care resource for health care professionals. Through controlled vocabulary, ClinicalKey offers a cross section of resources on diseases and procedures, from journals to e-books and practice guidelines to patient education. A sample search was conducted to demonstrate the features of the database, and a comparison with similar tools is presented.

  3. Phospholipase A₂: the key to reversing long-term memory impairment in a gastropod model of aging.

    Science.gov (United States)

    Watson, Shawn N; Wright, Natasha; Hermann, Petra M; Wildering, Willem C

    2013-02-01

    Memory failure associated with changes in neuronal circuit functions rather than cell death is a common feature of normal aging in diverse animal species. The (neuro)biological foundations of this phenomenon are not well understood, although oxidative stress, particularly in the guise of lipid peroxidation, is suspected to play a key role. Using an invertebrate model system of age-associated memory impairment that supports direct correlation between behavioral deficits and changes in the underlying neural substrate, we show that inhibition of phospholipase A₂ (PLA₂) abolishes both long-term memory (LTM) and neural defects observed in senescent subjects and subjects exposed to experimental oxidative stress. Using a combination of behavioral assessments and electrophysiological techniques, we provide evidence for a close link between lipid peroxidation, provocation of phospholipase A₂-dependent free fatty acid release, decline of neuronal excitability, and age-related long-term memory impairments. This supports the view that these processes suspend rather than irreversibly extinguish the aging nervous system's intrinsic capacity for plasticity. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Patch layout generation by detecting feature networks

    KAUST Repository

    Cao, Yuanhao

    2015-02-01

    The patch layout of 3D surfaces reveals their high-level geometric and topological structure. In this paper, we study patch layout computation by detecting and enclosing feature loops on surfaces. We present a hybrid framework which combines several key ingredients, including feature detection, feature filtering, feature curve extension, patch subdivision, and boundary smoothing. Our framework is able to compute patch layouts through concave features, as previous approaches do, but is also able to generate good layouts through smooth regions. We demonstrate the effectiveness of our framework by comparing with state-of-the-art methods.

  5. Feature coding for image representation and recognition

    CERN Document Server

    Huang, Yongzhen

    2015-01-01

    This brief presents a comprehensive introduction to feature coding, which serves as a key module of the typical object recognition pipeline. The text offers a rich blend of theory and practice while reflecting recent developments in feature coding, covering the following five aspects: (1) Review the state-of-the-art, analyzing the motivations and mathematical representations of various feature coding methods; (2) Explore how various feature coding algorithms have evolved over the years; (3) Summarize the main characteristics of typical feature coding algorithms and categorize them accordingly; (4) D

  6. KeyPathwayMiner 4.0

    DEFF Research Database (Denmark)

    Alcaraz, Nicolas; Pauling, Josch; Batra, Richa

    2014-01-01

    release of KeyPathwayMiner (version 4.0) that is not limited to analyses of single omics data sets, e.g. gene expression, but is able to directly combine several different omics data types. Version 4.0 can further integrate existing knowledge by adding a search bias towards sub-networks that contain...... (avoid) genes provided in a positive (negative) list. Finally the new release now also provides a set of novel visualization features and has been implemented as an app for the standard bioinformatics network analysis tool: Cytoscape. CONCLUSION: With KeyPathwayMiner 4.0, we publish a Cytoscape app...

  7. Neuroticism in Young Women with Fibromyalgia Links to Key Clinical Features

    Directory of Open Access Journals (Sweden)

    Katrina Malin

    2012-01-01

    Full Text Available Objective. We examined personality traits in young women with FM, in order to seek associations with key psychological processes and clinical symptoms. Methods. Twenty-seven women with FM and 29 age-matched female healthy controls (HC) completed a series of questionnaires examining FM symptoms, personality, and psychological variables. Results. Significant differences between characteristic FM symptoms (sleep, pain, fatigue, and confusion) as well as the psychological variables of depression, anxiety, and stress were found between FM and HC (P<0.001). Neuroticism was the only subscale of the Big Five Inventory that showed a significant difference between the FM group and the HC group (P<0.05). Within the FM group, there was a significant association between the level of neuroticism and each of pain, sleep, fatigue, confusion, depression, anxiety, and stress (P<0.05–0.01). The association between the level of neuroticism and the level of stress was the strongest of all variables tested (P<0.001). Conclusion. The personality trait of neuroticism significantly associates with the key FM characteristics of pain, sleep, fatigue, and confusion as well as the common comorbidities of depression, anxiety, and stress. Personality appears to be an important modulator of FM clinical symptoms.

  8. Parametric Human Body Reconstruction Based on Sparse Key Points.

    Science.gov (United States)

    Cheng, Ke-Li; Tong, Ruo-Feng; Tang, Min; Qian, Jing-Ye; Sarkis, Michel

    2016-11-01

    We propose an automatic parametric human body reconstruction algorithm which can efficiently construct a model using a single Kinect sensor. A user needs to stand still in front of the sensor for a couple of seconds to measure the range data. The user's body shape and pose will then be automatically constructed in several seconds. Traditional methods optimize dense correspondences between range data and meshes. In contrast, our proposed scheme relies on sparse key points for the reconstruction. It employs regression to find the corresponding key points between the scanned range data and some annotated training data. We design two kinds of feature descriptors as well as corresponding regression stages to make the regression robust and accurate. Our scheme follows with dense refinement where a pre-factorization method is applied to improve the computational efficiency. Compared with other methods, our scheme achieves similar reconstruction accuracy but significantly reduces runtime.

  9. E-referral Solutions: Successful Experiences, Key Features and Challenges- a Systematic Review.

    Science.gov (United States)

    Naseriasl, Mansour; Adham, Davoud; Janati, Ali

    2015-06-01

    Around the world, health systems constantly face increasing pressures arising from many factors, such as an ageing population and patients' and providers' demands for equipment and services. In order to respond to these challenges and reduce health systems' transaction costs, referral solutions are considered a key factor. This study was carried out to identify referral solutions that have had success. Relevant studies were identified using the keywords referral, consultation, referral system, referral model, referral project, electronic referral, electronic booking, health system, healthcare, health service, and medical care. These searches were conducted using the PubMed, ProQuest, Google Scholar, Scopus, Emerald, Web of Knowledge, Springer, Science direct, Mosby's index, SID, Medlib and Iran Doc databases. 4306 initial articles were obtained and refined step by step; finally, 27 articles met the inclusion criteria. We identified seventeen e-referral systems developed in the UK, Norway, Finland, the Netherlands, Denmark, Scotland, New Zealand, Canada, Australia, and the U.S. Implemented solutions had varying degrees of success, such as improved access to specialist care, reduced wait times, improved timeliness and quality of referral communication, accurate health information transfer, and integration of health centers and services. Each referral solution has both positive and changeable aspects that should be addressed according to sociotechnical conditions. These solutions are mainly formed in a small and localized manner.

  10. Tetranectin Knockout Mice Develop Features of Parkinson Disease

    Directory of Open Access Journals (Sweden)

    Er-song Wang

    2014-07-01

    Full Text Available Background/Aims: Aggregation of insoluble α-synuclein to form Lewy bodies (LBs) may contribute to the selective loss of midbrain dopaminergic neurons in Parkinson disease (PD). Lack of robust animal models has impeded elucidation of the molecular mechanisms of LB formation and other critical aspects of PD pathogenesis. Methods: We established a mouse model with targeted deletion of the plasminogen-binding protein tetranectin (TN) gene (TN-/-) and measured the behavioral and histopathological features of PD. Results: Aged (15- to 20-month-old) TN-/- mice displayed motor deficits resembling PD symptoms, including limb rigidity and both slower ambulation (bradykinesia) and reduced rearing activity in the open field. In addition, these mice exhibited more numerous α-synuclein-positive LB-like inclusions within the substantia nigra pars compacta (SNc) and reduced numbers of SNc dopaminergic neurons than age-matched wild type (WT) mice. These pathological changes were also accompanied by loss of dopamine terminals in the dorsal striatum. Conclusion: The TN-/- mouse exhibits several key features of PD and so may be a valuable model for studying LB formation and testing candidate neuroprotective therapies for PD and other synucleinopathies.

  11. Features and development of Coot

    International Nuclear Information System (INIS)

    Emsley, P.; Lohkamp, B.; Scott, W. G.; Cowtan, K.

    2010-01-01

    Coot is a molecular-graphics program designed to assist in the building of protein and other macromolecular models. The current state of development and available features are presented. Coot is a molecular-graphics application for model building and validation of biological macromolecules. The program displays electron-density maps and atomic models and allows model manipulations such as idealization, real-space refinement, manual rotation/translation, rigid-body fitting, ligand search, solvation, mutations, rotamers and Ramachandran idealization. Furthermore, tools are provided for model validation as well as interfaces to external programs for refinement, validation and graphics. The software is designed to be easy to learn for novice users, which is achieved by ensuring that tools for common tasks are ‘discoverable’ through familiar user-interface elements (menus and toolbars) or by intuitive behaviour (mouse controls). Recent developments have focused on providing tools for expert users, with customisable key bindings, extensions and an extensive scripting interface. The software is under rapid development, but has already achieved very widespread use within the crystallographic community. The current state of the software is presented, with a description of the facilities available and of some of the underlying methods employed

  12. Research on Radar Micro-Doppler Feature Parameter Estimation of Propeller Aircraft

    Science.gov (United States)

    He, Zhihua; Tao, Feixiang; Duan, Jia; Luo, Jingsheng

    2018-01-01

    The micro-motion modulation effect of rotating propellers on the radar echo can be a stable feature for aircraft target recognition. Thus, micro-Doppler feature parameter estimation is key to accurate target recognition. In this paper, the radar echo of rotating propellers is modelled and simulated. Based on this, the distribution characteristics of the micro-motion modulation energy in the time, frequency, and time-frequency domains are analyzed. The micro-motion modulation energy produced by the scattering points of rotating propellers is accumulated using the Inverse Radon (I-Radon) transform, which can be used to accomplish the estimation of the micro-modulation parameters. Finally, the proposed parameter estimation method is shown to be effective on measured data. The micro-motion parameters of aircraft can be used as features for radar target recognition.
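    The rotating-scatterer echo referred to above can be simulated in a few lines. The sinusoidal phase model is the standard micro-Doppler form for a scatterer on a rotating blade, but all parameter values here are illustrative, and the period estimator below is a simple autocorrelation sketch, not the paper's I-Radon accumulation.

```python
import numpy as np

fs = 8000                           # sampling rate (Hz), illustrative
t = np.arange(fs) / fs              # 1 s of slow time
f_rot = 20.0                        # propeller rotation frequency (Hz), assumed
a = 30.0                            # modulation index ~ 4*pi*R/lambda, illustrative
s = np.exp(1j * a * np.cos(2 * np.pi * f_rot * t))  # echo of one rotating scatterer

# The echo is periodic with the rotation period, so the first strong
# autocorrelation peak reveals 1/f_rot.
n = len(s)
acf = np.array([np.abs(np.vdot(s[:n - lag], s[lag:])) / (n - lag)
                for lag in range(1, fs // 4)])
lag_hat = 1 + int(np.flatnonzero(acf > 0.99 * acf.max())[0])
print("estimated rotation frequency:", fs / lag_hat, "Hz")
```

    In a real system the rotation rate and blade count extracted this way (or via the I-Radon transform) become the recognition features.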

  13. Feature-Based and String-Based Models for Predicting RNA-Protein Interaction

    Directory of Open Access Journals (Sweden)

    Donald Adjeroh

    2018-03-01

    Full Text Available In this work, we study two approaches for the problem of RNA-Protein Interaction (RPI). In the first approach, we use a feature-based technique, combining features extracted from both sequences and secondary structures. The feature-based approach enhanced the prediction accuracy, as it included much more of the available information about the RNA-protein pairs. In the second approach, we apply search algorithms and data structures to extract effective string patterns for prediction of RPI, using both sequence information (protein and RNA sequences) and structure information (protein and RNA secondary structures). This led to different string-based models for predicting interacting RNA-protein pairs. We show results that demonstrate the effectiveness of the proposed approaches, including comparative results against leading state-of-the-art methods.

  14. Pattern Classification Using an Olfactory Model with PCA Feature Selection in Electronic Noses: Study and Application

    Directory of Open Access Journals (Sweden)

    Junbao Zheng

    2012-03-01

    Full Text Available Biologically-inspired models and algorithms are considered promising sensor array signal processing methods for electronic noses. Feature selection is one of the most important issues for developing robust pattern recognition models in machine learning. This paper describes an investigation into the classification performance of a bionic olfactory model as the dimensions of the input feature vector (outer factor) and its parallel channels (inner factor) increase. The principal component analysis technique was applied for feature selection and dimension reduction. Two data sets, of three classes of wine derived from different cultivars and five classes of green tea derived from five different provinces of China, were used for the experiments. In the former case the results showed that the average correct classification rate increased as more principal components were put into the feature vector. In the latter case the results showed that sufficient parallel channels should be reserved in the model to avoid pattern space crowding. We concluded that 6~8 channels of the model, with principal component feature vector values of at least 90% cumulative variance, are adequate for a classification task of 3~5 pattern classes, considering the trade-off between time consumption and classification rate.
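    The 90%-cumulative-variance selection rule used above is directly expressible with scikit-learn's PCA, which accepts a fractional `n_components`. The sensor data below are synthetic, with three hypothetical underlying odor factors standing in for real electronic-nose measurements.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_samples, n_sensors = 120, 16
latent = rng.normal(size=(n_samples, 3))           # 3 underlying odor factors
mixing = rng.normal(size=(3, n_sensors))
X = latent @ mixing + 0.05 * rng.normal(size=(n_samples, n_sensors))

# A float n_components asks for the smallest number of components whose
# cumulative explained variance reaches that fraction.
pca = PCA(n_components=0.90)
Z = pca.fit_transform(X)
print(Z.shape[1], "components keep",
      round(pca.explained_variance_ratio_.sum(), 3), "of the variance")
```

    The reduced vectors `Z` would then feed the olfactory model's parallel channels.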

  15. Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor

    Science.gov (United States)

    Ponce Wong, Ruben D.; Hellman, Randall B.; Santos, Veronica J.

    2014-06-01

    Haptic perception remains a grand challenge for artificial hands. Dexterous manipulators could be enhanced by "haptic intelligence" that enables identification of objects and their features via touch alone. Haptic perception of local shape would be useful when vision is obstructed or when proprioceptive feedback is inadequate, as observed in this study. In this work, a robot hand outfitted with a deformable, bladder-type, multimodal tactile sensor was used to replay four human-inspired haptic "exploratory procedures" on fingertip-sized geometric features. The geometric features varied by type (bump, pit), curvature (planar, conical, spherical), and footprint dimension (1.25 - 20 mm). Tactile signals generated by active fingertip motions were used to extract key parameters for use as inputs to supervised learning models. A support vector classifier estimated order of curvature while support vector regression models estimated footprint dimension once curvature had been estimated. A distal-proximal stroke (along the long axis of the finger) enabled estimation of order of curvature with an accuracy of 97%. Best-performing, curvature-specific, support vector regression models yielded R2 values of at least 0.95. While a radial-ulnar stroke (along the short axis of the finger) was most helpful for estimating feature type and size for planar features, a rolling motion was most helpful for conical and spherical features. The ability to haptically perceive local shape could be used to advance robot autonomy and provide haptic feedback to human teleoperators of devices ranging from bomb defusal robots to neuroprostheses.
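    The two-stage pipeline described above, a support vector classifier for order of curvature followed by a curvature-specific support vector regressor for footprint dimension, can be sketched as below. The tactile features and their dependence on curvature and size are hypothetical stand-ins for the real sensor signals.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(3)
n = 300
curvature = rng.integers(0, 3, size=n)        # 0=planar, 1=conical, 2=spherical
size_mm = rng.uniform(1.25, 20.0, size=n)     # footprint dimension, as in the study
# Hypothetical tactile features derived from the stroke signals.
X = np.column_stack([curvature + 0.1 * rng.normal(size=n),
                     size_mm * (1 + 0.2 * curvature) + 0.3 * rng.normal(size=n)])

clf = make_pipeline(StandardScaler(), SVC()).fit(X, curvature)
regs = {c: make_pipeline(StandardScaler(), SVR(C=100.0))
            .fit(X[curvature == c], size_mm[curvature == c])
        for c in range(3)}

x_new = X[:1]
c_hat = int(clf.predict(x_new)[0])            # stage 1: order of curvature
d_hat = float(regs[c_hat].predict(x_new)[0])  # stage 2: footprint dimension
print(f"class {c_hat}, footprint ~ {d_hat:.1f} mm")
```

    Routing the regression through the predicted curvature class mirrors the paper's design choice of curvature-specific regression models.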

  16. Mining key elements for severe convection prediction based on CNN

    Science.gov (United States)

    Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng

    2017-04-01

    Severe convective weather is a kind of weather disaster accompanied by heavy rainfall, gusty wind, hail, etc. Along with recent developments in remote sensing and numerical modeling, high-volume and long-term observational and modeling data have accumulated, capturing massive severe convective events over particular areas and time periods. With these high-volume and high-variety weather data, most existing studies and methods carry out analyses of dynamical laws, cause analysis, potential-rule study, and prediction enhancement by utilizing the governing equations from fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on a convolutional neural network (CNN). It aims to identify the key areas and key elements from huge amounts of historical weather data, including conventional measurements, weather radar, and satellite data, as well as numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method can help human forecasters in their decision-making on operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. In this paper, it first utilizes computer vision technology to complete the data preprocessing of the meteorological variables. Then, it utilizes information such as radar maps and expert knowledge to annotate all images automatically. Finally, using the CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns, features, etc.), identify the key areas of those critical weather elements, and help forecasters quickly screen out the key elements from huge amounts of observation data given current weather conditions. 
Based on the rich weather measurement and model data (up to 10 years) over Fujian province in China, where the severe convective weathers are very active during the summer months, experimental tests are conducted with

  17. Fast Localization in Large-Scale Environments Using Supervised Indexing of Binary Features.

    Science.gov (United States)

    Youji Feng; Lixin Fan; Yihong Wu

    2016-01-01

    The essence of image-based localization lies in matching 2D key points in the query image to 3D points in the database. State-of-the-art methods mostly employ sophisticated key point detectors and feature descriptors, e.g., Difference of Gaussian (DoG) and Scale Invariant Feature Transform (SIFT), to ensure robust matching. While a high registration rate is attained, registration speed is impeded by the expensive key point detection and descriptor extraction. In this paper, we propose to use efficient key point detectors along with binary feature descriptors, since the extraction of such binary features is extremely fast. The naive usage of binary features, however, does not lend itself to a significant speedup of localization, since existing indexing approaches, such as hierarchical clustering trees and locality sensitive hashing, are not efficient enough at indexing binary features, and matching binary features turns out to be much slower than matching SIFT features. To overcome this, we propose a much more efficient indexing approach for approximate nearest neighbor search of binary features. This approach resorts to randomized trees that are constructed in a supervised training process by exploiting the label information derived from the fact that multiple features correspond to a common 3D point. In the tree construction process, node tests are selected such that trees have uniform leaf sizes and low error rates, which are two desired properties for efficient approximate nearest neighbor search. To further improve the search efficiency, a probabilistic priority search strategy is adopted. Apart from the label information, this strategy also uses the non-binary pixel intensity differences available from descriptor extraction. By using the proposed indexing approach, matching binary features is no longer much slower but slightly faster than matching SIFT features. 
Consequently, the overall localization speed is significantly improved due to the much faster key
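    Part of why binary descriptors are attractive is visible in how cheaply they are compared: Hamming distance is an XOR followed by a popcount. The sketch below vectorizes this with NumPy over a synthetic database of 256-bit descriptors packed into bytes; it illustrates brute-force matching, not the randomized-tree index proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
db = rng.integers(0, 256, size=(10000, 32), dtype=np.uint8)  # 10k 256-bit descriptors

query = db[1234].copy()
query[0] ^= 0b00000111              # corrupt 3 bits, as if re-detected with noise

# Hamming distance to the whole database: XOR then popcount, vectorized.
dists = np.unpackbits(db ^ query, axis=1).sum(axis=1)
best = int(np.argmin(dists))
print(best, int(dists[best]))       # -> 1234 3
```

    The supervised tree index in the paper exists to avoid exactly this linear scan while keeping the cheap per-comparison cost.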

  18. A neural network model of semantic memory linking feature-based object representation and words.

    Science.gov (United States)

    Cuppini, C; Magosso, E; Ursino, M

    2009-06-01

    Recent theories in cognitive neuroscience suggest that semantic memory is a distributed process, which involves many cortical areas and is based on a multimodal representation of objects. The aim of this work is to extend a previous model of object representation to realize a semantic memory, in which sensory-motor representations of objects are linked with words. The model assumes that each object is described as a collection of features, coded in different cortical areas via a topological organization. Features in different objects are segmented via gamma-band synchronization of neural oscillators. The feature areas are further connected with a lexical area, devoted to the representation of words. Synapses among the feature areas, and among the lexical area and the feature areas are trained via a time-dependent Hebbian rule, during a period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from acoustic inputs), can correctly associate objects with words and segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process, whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories, and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).
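    The training idea above, strengthening synapses between feature units and a lexical unit when object features and the corresponding word are presented together, can be sketched with a plain rate-based Hebbian rule. This is a deliberate simplification: the model's rule is time-dependent and its feature codes overlap, whereas the toy codes here are disjoint so that retrieval is unambiguous.

```python
import numpy as np

n_feat, n_words = 20, 5
# Toy, disjoint feature codes: object w "owns" features 4w..4w+3.
objects = np.zeros((n_words, n_feat))
for w in range(n_words):
    objects[w, 4 * w:4 * w + 4] = 1.0

W = np.zeros((n_words, n_feat))            # lexical <- feature synapses
eta = 0.5
for epoch in range(20):                    # joint presentation of object + word
    for w in range(n_words):
        x = objects[w]                     # feature-area activity
        y = np.zeros(n_words)
        y[w] = 1.0                         # lexical-area activity
        W += eta * np.outer(y, x)          # Hebbian co-activation update

# Retrieval from incomplete information: drop two features, the word still wins.
partial = objects[2].copy()
partial[np.flatnonzero(partial)[:2]] = 0.0
print("retrieved word:", int(np.argmax(W @ partial)))   # -> retrieved word: 2
```

    The robustness to the dropped features is the toy analogue of the network's ability to segment and name objects from incomplete sensory input.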

  19. Impact of SLA assimilation in the Sicily Channel Regional Model: model skills and mesoscale features

    Directory of Open Access Journals (Sweden)

    A. Olita

    2012-07-01

    Full Text Available The impact of the assimilation of MyOcean sea level anomalies along-track data on the analyses of the Sicily Channel Regional Model was studied. The numerical model has a resolution of 1/32° and is capable of reproducing mesoscale and sub-mesoscale features. The impact of the SLA assimilation is studied by comparing a simulation (SIM), which does not assimilate data, with an analysis (AN) assimilating SLA along-track multi-mission data produced in the framework of the MyOcean project. The quality of the analysis was evaluated by computing the RMSE of the misfits between the analysis background and observations (sea level before assimilation). A qualitative evaluation of the ability of the analyses to reproduce mesoscale structures is accomplished by comparing model results with ocean colour and SST satellite data, which are able to detect such features on the ocean surface. CTD profiles allowed us to evaluate the impact of the SLA assimilation along the water column. We found a significant improvement for the AN solution in terms of SLA RMSE with respect to SIM (the average RMSE of the AN SLA misfits over 2 years is about 0.5 cm smaller than for SIM). Comparison with CTD data shows a questionable improvement produced by the assimilation process in terms of vertical features: AN is better in temperature, while for salinity it is worse than SIM at the surface. This suggests that a better a-priori description of the vertical error covariances would be desirable. The qualitative comparison of the simulation and analyses with synoptic satellite independent data proves that SLA assimilation makes it possible to correctly reproduce some dynamical features (above all, the circulation in the Ionian portion of the domain) and mesoscale structures otherwise misplaced or neglected by SIM. Such mesoscale changes also imply that the eddy momentum fluxes (i.e. Reynolds stresses) show major changes in the Ionian area. 
Changes in Reynolds stresses reflect a different pumping of eastward momentum from the eddy to
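The model-skill metric used above, the RMSE of the misfits between observations and the analysis background, can be sketched as follows; the function name and sample values are hypothetical:

```python
import numpy as np

def misfit_rmse(observed_sla, background_sla):
    """RMSE of the misfits between along-track SLA observations and the
    model background (sea level before assimilation), skipping data gaps."""
    misfit = np.asarray(observed_sla, float) - np.asarray(background_sla, float)
    valid = ~np.isnan(misfit)            # along-track records often have gaps
    return float(np.sqrt(np.mean(misfit[valid] ** 2)))

# Values in metres; a ~0.005 m (0.5 cm) RMSE difference between the AN and
# SIM misfits is the size of improvement reported above.
obs = np.array([0.10, 0.12, np.nan, 0.08])
sim = np.array([0.13, 0.08, 0.10, 0.11])
skill = misfit_rmse(obs, sim)
```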

  20. Integrated water flow model and modflow-farm process: A comparison of theory, approaches, and features of two integrated hydrologic models

    Science.gov (United States)

    Dogrul, Emin C.; Schmid, Wolfgang; Hanson, Randall T.; Kadir, Tariq; Chung, Francis

    2016-01-01

    Effective modeling of conjunctive use of surface and subsurface water resources requires simulation of land use-based root zone and surface flow processes as well as groundwater flows, streamflows, and their interactions. Recently, two computer models developed for this purpose, the Integrated Water Flow Model (IWFM) from the California Department of Water Resources and the MODFLOW with Farm Process (MF-FMP) from the US Geological Survey, have been applied to complex basins such as the Central Valley of California. As both IWFM and MF-FMP are publicly available for download and can be applied to other basins, there is a need to objectively compare the main approaches and features used in both models. This paper compares the concepts, as well as the method and simulation features of each hydrologic model pertaining to groundwater, surface water, and landscape processes. The comparison is focused on the integrated simulation of water demand and supply, water use, and the flow between coupled hydrologic processes. The differences in the capabilities and features of these two models could affect the outcome and types of water resource problems that can be simulated.

  1. A preclinical orthotopic model for glioblastoma recapitulates key features of human tumors and demonstrates sensitivity to a combination of MEK and PI3K pathway inhibitors.

    Science.gov (United States)

    El Meskini, Rajaa; Iacovelli, Anthony J; Kulaga, Alan; Gumprecht, Michelle; Martin, Philip L; Baran, Maureen; Householder, Deborah B; Van Dyke, Terry; Weaver Ohler, Zoë

    2015-01-01

    Current therapies for glioblastoma multiforme (GBM), the highest grade malignant brain tumor, are mostly ineffective, and better preclinical model systems are needed to increase the successful translation of drug discovery efforts into the clinic. Previous work describes a genetically engineered mouse (GEM) model that contains perturbations in the most frequently dysregulated networks in GBM (driven by RB, KRAS and/or PI3K signaling and PTEN) that induce development of Grade IV astrocytoma with properties of the human disease. Here, we developed and characterized an orthotopic mouse model derived from the GEM that retains the features of the GEM model in an immunocompetent background; however, this model is also tractable and efficient for preclinical evaluation of candidate therapeutic regimens. Orthotopic brain tumors are highly proliferative, invasive and vascular, and express histology markers characteristic of human GBM. Primary tumor cells were examined for sensitivity to chemotherapeutics and targeted drugs. PI3K and MAPK pathway inhibitors, when used as single agents, inhibited cell proliferation but did not result in significant apoptosis. However, in combination, these inhibitors resulted in a substantial increase in cell death. Moreover, these findings translated into the in vivo orthotopic model: PI3K or MAPK inhibitor treatment regimens resulted in incomplete pathway suppression and feedback loops, whereas dual treatment delayed tumor growth through increased apoptosis and decreased tumor cell proliferation. Analysis of downstream pathway components revealed a cooperative effect on target downregulation. These concordant results, together with the morphologic similarities to the human GBM disease characteristics of the model, validate it as a new platform for the evaluation of GBM treatment. © 2015. Published by The Company of Biologists Ltd.

  2. A preclinical orthotopic model for glioblastoma recapitulates key features of human tumors and demonstrates sensitivity to a combination of MEK and PI3K pathway inhibitors

    Directory of Open Access Journals (Sweden)

    Rajaa El Meskini

    2015-01-01

    Full Text Available Current therapies for glioblastoma multiforme (GBM), the highest grade malignant brain tumor, are mostly ineffective, and better preclinical model systems are needed to increase the successful translation of drug discovery efforts into the clinic. Previous work describes a genetically engineered mouse (GEM) model that contains perturbations in the most frequently dysregulated networks in GBM (driven by RB, KRAS and/or PI3K signaling and PTEN) that induce development of Grade IV astrocytoma with properties of the human disease. Here, we developed and characterized an orthotopic mouse model derived from the GEM that retains the features of the GEM model in an immunocompetent background; however, this model is also tractable and efficient for preclinical evaluation of candidate therapeutic regimens. Orthotopic brain tumors are highly proliferative, invasive and vascular, and express histology markers characteristic of human GBM. Primary tumor cells were examined for sensitivity to chemotherapeutics and targeted drugs. PI3K and MAPK pathway inhibitors, when used as single agents, inhibited cell proliferation but did not result in significant apoptosis. However, in combination, these inhibitors resulted in a substantial increase in cell death. Moreover, these findings translated into the in vivo orthotopic model: PI3K or MAPK inhibitor treatment regimens resulted in incomplete pathway suppression and feedback loops, whereas dual treatment delayed tumor growth through increased apoptosis and decreased tumor cell proliferation. Analysis of downstream pathway components revealed a cooperative effect on target downregulation. These concordant results, together with the morphologic similarities to the human GBM disease characteristics of the model, validate it as a new platform for the evaluation of GBM treatment.

  3. Augmented distinctive features with color and scale invariance

    Science.gov (United States)

    Liu, Yan; Lu, Xiaoqing; Qin, Yeyang; Tang, Zhi; Xu, Jianbo

    2013-03-01

    For objects with the same texture but different colors, it is difficult to discriminate between them with the traditional scale invariant feature transform (SIFT) descriptor, because it is designed for grayscale images only. It is therefore important to ensure a high probability that the key points used form correctly matched pairs. In addition, evenly distributed key points are much more desirable than overly dense, clustered key points for image matching and other applications. In this paper, we address these two problems. First, we propose a color and scale invariant method to extract more evenly distributed key points, relying on invariance to illumination intensity while remaining sensitive to variations in object reflectance. Second, we reduce the accumulated error in each key point's canonical direction by dispersing each pixel's gradient direction relative to the current key point. Finally, we build the descriptors on a Gaussian pyramid and match the key points with our enhanced two-way matching rules. Experiments are performed on the Amsterdam Library of Object Images dataset and on some manually synthesized images. The results show that the extracted key points are more evenly distributed and more numerous than those of SIFT. The feature descriptors can discriminate well between images that differ only in color but share the same content and texture.
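The two-way matching idea can be illustrated with a minimal mutual nearest-neighbour check; this is a generic sketch, not the authors' exact enhanced matching rules:

```python
import numpy as np

def mutual_matches(desc_a, desc_b):
    """Keep pair (i, j) only if i's nearest neighbour in B is j AND
    j's nearest neighbour in A is i (a two-way consistency check)."""
    # Pairwise Euclidean distances between the two descriptor sets.
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    a_to_b = d.argmin(axis=1)   # best match in B for each descriptor in A
    b_to_a = d.argmin(axis=0)   # best match in A for each descriptor in B
    return [(i, int(j)) for i, j in enumerate(a_to_b) if b_to_a[j] == i]

# Descriptor 1 of set A has no true counterpart in B, so the two-way
# check discards it even though it has a one-way nearest neighbour.
a = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
b = np.array([[0.1, 0.0], [4.9, 5.0]])
matches = mutual_matches(a, b)
```

The mutual check trades recall for precision: one-way nearest-neighbour matching would also pair the unmatched descriptor, producing a false correspondence.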

  4. Deep Convolutional Neural Networks Outperform Feature-Based But Not Categorical Models in Explaining Object Similarity Judgments

    Science.gov (United States)

    Jozwik, Kamila M.; Kriegeskorte, Nikolaus; Storrs, Katherine R.; Mur, Marieke

    2017-01-01

    Recent advances in Deep convolutional Neural Networks (DNNs) have enabled unprecedentedly accurate computational models of brain representations, and present an exciting opportunity to model diverse cognitive functions. State-of-the-art DNNs achieve human-level performance on object categorisation, but it is unclear how well they capture human behavior on complex cognitive tasks. Recent reports suggest that DNNs can explain significant variance in one such task, judging object similarity. Here, we extend these findings by replicating them for a rich set of object images, comparing performance across layers within two DNNs of different depths, and examining how the DNNs’ performance compares to that of non-computational “conceptual” models. Human observers performed similarity judgments for a set of 92 images of real-world objects. Representations of the same images were obtained in each of the layers of two DNNs of different depths (8-layer AlexNet and 16-layer VGG-16). To create conceptual models, other human observers generated visual-feature labels (e.g., “eye”) and category labels (e.g., “animal”) for the same image set. Feature labels were divided into parts, colors, textures and contours, while category labels were divided into subordinate, basic, and superordinate categories. We fitted models derived from the features, categories, and from each layer of each DNN to the similarity judgments, using representational similarity analysis to evaluate model performance. In both DNNs, similarity within the last layer explains most of the explainable variance in human similarity judgments. The last layer outperforms almost all feature-based models. Late and mid-level layers outperform some but not all feature-based models. Importantly, categorical models predict similarity judgments significantly better than any DNN layer. Our results provide further evidence for commonalities between DNNs and brain representations. Models derived from visual features
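Representational similarity analysis, used above to fit models to the similarity judgments, compares representational dissimilarity matrices (RDMs); a minimal sketch on synthetic data (not the 92-image set) might look like this, with all names and sizes illustrative:

```python
import numpy as np

def rdm(activations):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the response patterns of every pair of images."""
    return 1.0 - np.corrcoef(activations)

def upper_triangle(m):
    """The off-diagonal entries RSA actually compares."""
    return m[np.triu_indices_from(m, k=1)]

def spearman(x, y):
    """Rank correlation between two flattened RDMs (no tie handling,
    which is fine for continuous dissimilarities)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return float(np.corrcoef(rank(x), rank(y))[0, 1])

rng = np.random.default_rng(0)
layer_acts = rng.normal(size=(6, 20))       # 6 images x 20 model units
# Stand-in for human similarity judgments: a noisy copy of the model RDM.
judged_rdm = rdm(layer_acts + 0.1 * rng.normal(size=(6, 20)))
score = spearman(upper_triangle(rdm(layer_acts)), upper_triangle(judged_rdm))
```

A model layer (or a feature/category model) whose RDM rank-correlates more strongly with the judged RDM explains the similarity judgments better.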

  5. Nonmarket economic user values of the Florida Keys/Key West

    Science.gov (United States)

    Vernon R. Leeworthy; J. Michael Bowker

    1997-01-01

    This report provides estimates of the nonmarket economic user values for recreating visitors to the Florida Keys/Key West who participated in natural resource-based activities. Results from estimated travel cost models are presented, including visitors' responses to prices and estimated per person-trip user values. Annual user values are also calculated and presented...

  6. Research and Application of Hybrid Forecasting Model Based on an Optimal Feature Selection System—A Case Study on Electrical Load Forecasting

    Directory of Open Access Journals (Sweden)

    Yunxuan Dong

    2017-04-01

    Full Text Available The process of modernizing the smart grid markedly increases the complexity and uncertainty in the scheduling and operation of power systems, and, in order to develop a more reliable, flexible, efficient and resilient grid, electrical load forecasting is not only of key importance but also a difficult and challenging task. In this paper, a short-term electrical load forecasting model, with a unit for feature learning named the Pyramid System and recurrent neural networks, has been developed; it can effectively promote the stability and security of the power grid. Nine types of methods for feature learning are compared in this work to select the best one for the learning target, and two criteria have been employed to evaluate the accuracy of the prediction intervals. Furthermore, an electrical load forecasting method based on recurrent neural networks has been formed to capture the relational structure of the historical data; specifically, the proposed techniques are applied to electrical load forecasting using data collected from New South Wales, Australia. The simulation results show that the proposed hybrid models can not only satisfactorily approximate the actual values but can also serve as effective tools in the planning of smart grids.

  7. Cascaded ensemble of convolutional neural networks and handcrafted features for mitosis detection

    Science.gov (United States)

    Wang, Haibo; Cruz-Roa, Angel; Basavanhally, Ajay; Gilmore, Hannah; Shih, Natalie; Feldman, Mike; Tomaszewski, John; Gonzalez, Fabio; Madabhushi, Anant

    2014-03-01

    Breast cancer (BCa) grading plays an important role in predicting disease aggressiveness and patient outcome. A key component of BCa grade is mitotic count, which involves quantifying the number of cells in the process of dividing (i.e. undergoing mitosis) at a specific point in time. Currently mitosis counting is done manually by a pathologist looking at multiple high power fields on a glass slide under a microscope, an extremely laborious and time consuming process. The development of computerized systems for automated detection of mitotic nuclei, while highly desirable, is confounded by the highly variable shape and appearance of mitoses. Existing methods use either handcrafted features that capture certain morphological, statistical or textural attributes of mitoses or features learned with convolutional neural networks (CNN). While handcrafted features are inspired by the domain and the particular application, the data-driven CNN models tend to be domain agnostic and attempt to learn additional feature bases that cannot be represented through any of the handcrafted features. On the other hand, CNN is computationally more complex and needs a large number of labeled training instances. Since handcrafted features attempt to model domain pertinent attributes and CNN approaches are largely unsupervised feature generation methods, there is an appeal to attempting to combine these two distinct classes of feature generation strategies to create an integrated set of attributes that can potentially outperform either class of feature extraction strategies individually. In this paper, we present a cascaded approach for mitosis detection that intelligently combines a CNN model and handcrafted features (morphology, color and texture features). By employing a light CNN model, the proposed approach is far less demanding computationally, and the cascaded strategy of combining handcrafted features and CNN-derived features enables the possibility of maximizing performance by

  8. What are the key drivers of MAC curves? A partial-equilibrium modelling approach for the UK

    International Nuclear Information System (INIS)

    Kesicki, Fabian

    2013-01-01

    Marginal abatement cost (MAC) curves are widely used for the assessment of costs related to CO2 emissions reduction in environmental economics, as well as in domestic and international climate policy. Several meta-analyses and model comparisons have previously been performed that aim to identify the causes of the wide range of MAC curves. Most of these concentrate on general equilibrium models with a focus on aspects such as specific model type and technology learning, while other important aspects remain almost unconsidered, including the availability of abatement technologies and the level of discount rates. This paper addresses the influence of several key parameters on MAC curves for the United Kingdom for the year 2030. A technology-rich energy system model, UK MARKAL, is used to derive the MAC curves. The results of this study show that MAC curves are robust even to extreme fossil fuel price changes, while uncertainty around the choice of the discount rate, the availability of key abatement technologies and the demand level were singled out as the most important influencing factors. By using a different model type and studying a wider range of influencing factors, this paper contributes to the debate on the sensitivity of MAC curves. - Highlights: ► A partial-equilibrium model is employed to test key sensitivities of MAC curves. ► MAC curves are found to be robust to wide-ranging changes in fossil fuel prices. ► The most influential factors are the discount rate and the availability of key technologies. ► Further important uncertainty in MAC curves is related to demand changes

  9. Some Key Features and Possible Origin of the Metamorphic Rock-Hosted Gold Mineralization in Buru Island, Indonesia

    Directory of Open Access Journals (Sweden)

    Arifudin Idrus

    2014-07-01

    Full Text Available DOI: 10.17014/ijog.v1i1.172. This paper discusses the characteristics of some key features of the primary Buru gold deposit as a tool for a better understanding of the deposit's genesis. Currently, about 105,000 artisanal and small-scale gold miners (ASGM) are operating in two main localities, i.e. Gogorea and Gunung Botak, by digging pits/shafts following the orientation of gold-bearing quartz veins. The gold extraction uses mercury (amalgamation) and cyanide processing. The field study identifies two types/generations of quartz veins, namely (1) early quartz veins, which are segmented, sigmoidal, discontinuous, and parallel to the foliation of the host rock; these veins lack sulfides, are weakly mineralized, crystalline, relatively clear, and may be poor in gold; and (2) quartz veins occurring within a ‘mineralized zone’ about 100 m in width and ~1,000 m in length. The gold mineralization is strongly overprinted by an argillic alteration zone. The mineralization-alteration zone is probably parallel to the mica schist foliation and strongly controlled by N-S- or NE-SW-trending structures. The gold-bearing quartz veins are characterized by banded textures, particularly colloform banding following the host rock foliation and sulphide banding, as well as brecciated and rare bladed-like textures. The alteration types consist of propylitic (chlorite, calcite, sericite), argillic, and carbonation, the latter represented by graphite banding and carbon flakes. The ore mineralization is characterized by pyrite, native gold, pyrrhotite, and arsenopyrite. Cinnabar, stibnite, chalcopyrite, galena, and sphalerite are rare or may be absent. In general, sulphide minerals are rare (<3%). Fifteen rock samples were collected in the Wamsaid area for geochemical assaying of Au, Ag, As, Sb, Hg, Cu, Pb, and Zn. Eleven of the fifteen samples yielded more than 1.00 g/t Au, six of them in excess of 3.00 g/t Au. It can be noted that all high-grade samples originally were, or contain, limonitic materials, which suggest

  10. Feature extraction for face recognition via Active Shape Model (ASM) and Active Appearance Model (AAM)

    Science.gov (United States)

    Iqtait, M.; Mohamad, F. S.; Mamat, M.

    2018-03-01

    Biometrics is a pattern recognition approach used for the automatic recognition of persons based on the characteristics and features of an individual. Face recognition with a high recognition rate is still a challenging task, usually accomplished in three phases: face detection, feature extraction, and expression classification. Precise and robust localization of feature points is a complicated and difficult issue in face recognition. Cootes proposed the Multi-Resolution Active Shape Model (ASM) algorithm, which can extract a specified shape accurately and efficiently. Furthermore, as an improvement on ASM, the Active Appearance Model (AAM) algorithm was proposed, which extracts both the shape and texture of a specified object simultaneously. In this paper we give more details about the two algorithms and present the results of experiments testing their performance on one dataset of faces. We found that ASM is faster and achieves more accurate feature point localization than AAM, but AAM achieves a better match to the texture.

  11. Information verification cryptosystem using one-time keys based on double random phase encoding and public-key cryptography

    Science.gov (United States)

    Zhao, Tieyu; Ran, Qiwen; Yuan, Lin; Chi, Yingying; Ma, Jing

    2016-08-01

    A novel image encryption system based on double random phase encoding (DRPE) and RSA public-key algorithm is proposed. The main characteristic of the system is that each encryption process produces a new decryption key (even for the same plaintext), thus the encryption system conforms to the feature of the one-time pad (OTP) cryptography. The other characteristic of the system is the use of fingerprint key. Only with the rightful authorization will the true decryption be obtained, otherwise the decryption will result in noisy images. So the proposed system can be used to determine whether the ciphertext is falsified by attackers. In addition, the system conforms to the basic agreement of asymmetric cryptosystem (ACS) due to the combination with the RSA public-key algorithm. The simulation results show that the encryption scheme has high robustness against the existing attacks.
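The DRPE stage of such a system is straightforward to sketch with FFTs; this toy version covers only the two phase masks and omits the RSA public-key and fingerprint-key layers described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(5)

def drpe_encrypt(img, m1, m2):
    """Double random phase encoding: one random phase mask in the input
    plane, a second one in the Fourier plane."""
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def drpe_decrypt(cipher, m1, m2):
    """Invert both masks (unit-modulus, so the conjugate is the inverse);
    with a wrong key the output stays noise-like."""
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(m2)) * np.conj(m1)

img = rng.random((8, 8))
m1 = np.exp(2j * np.pi * rng.random((8, 8)))   # input-plane mask
m2 = np.exp(2j * np.pi * rng.random((8, 8)))   # Fourier-plane, per-session key
cipher = drpe_encrypt(img, m1, m2)
recovered = drpe_decrypt(cipher, m1, m2).real
```

Regenerating the Fourier-plane mask for every encryption is what gives the one-time-pad flavour noted in the abstract: the same plaintext yields a different ciphertext and a different decryption key each session.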

  12. A model of biological neuron with terminal chaos and quantum-like features

    International Nuclear Information System (INIS)

    Conte, Elio; Pierri, GianPaolo; Federici, Antonio; Mendolicchio, Leonardo; Zbilut, Joseph P.

    2006-01-01

    A model of biological neuron is proposed combining terminal dynamics with quantum-like mechanical features, assuming the spin to be an important entity in neurodynamics, and, in particular, in synaptic transmission

  13. Feature selection model based on clustering and ranking in pipeline for microarray data

    Directory of Open Access Journals (Sweden)

    Barnali Sahu

    2017-01-01

    Full Text Available Most of the available feature selection techniques in the literature are classifier-bound: a group of features is tied to the performance of a specific classifier, as in the wrapper and hybrid approaches. Our objective in this study is to select a set of generic features not tied to any classifier, based on the proposed framework. This framework uses attribute clustering and feature ranking techniques in a pipeline in order to remove redundant features. On each uncovered cluster, the signal-to-noise ratio, t-statistics and significance analysis of microarrays are independently applied to select the top-ranked features. Both filter and evolutionary wrapper approaches have been considered for feature selection, and the data set with the selected features is given to an ensemble of predefined, statistically different classifiers. The class labels of the test data are determined using a majority voting technique. Moreover, with the aforesaid objectives, this paper focuses on obtaining a stable result out of various classification models. Further, a comparative analysis has been performed to study the classification accuracy and computational time of the current approach and evolutionary wrapper techniques. It gives a better insight into the features while further enhancing the classification accuracy with less computational time.
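Two of the pipeline's building blocks, signal-to-noise-ratio ranking and majority voting over an ensemble, can be sketched as follows; the data and the SNR definition (the standard two-class form) are illustrative assumptions:

```python
import numpy as np

def snr_scores(X, y):
    """Signal-to-noise ratio of each feature for a two-class problem:
    |mean_0 - mean_1| / (std_0 + std_1)."""
    a, b = X[y == 0], X[y == 1]
    return np.abs(a.mean(axis=0) - b.mean(axis=0)) / (
        a.std(axis=0) + b.std(axis=0) + 1e-12)

def majority_vote(predictions):
    """Combine class labels from an ensemble of classifiers (one row each)."""
    predictions = np.asarray(predictions)
    return np.array([np.bincount(col).argmax() for col in predictions.T])

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))                # 40 samples x 5 "genes"
y = np.repeat([0, 1], 20)
X[y == 1, 2] += 3.0                         # plant one informative feature
top = snr_scores(X, y).argsort()[::-1][:2]  # indices of the top-ranked features
votes = majority_vote([[0, 1, 1], [0, 1, 0], [1, 1, 0]])
```

In the paper's pipeline the ranking runs per cluster, which keeps correlated (redundant) features from dominating the top of a single global list.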

  14. Decontaminate feature for tracking: adaptive tracking via evolutionary feature subset

    Science.gov (United States)

    Liu, Qiaoyuan; Wang, Yuru; Yin, Minghao; Ren, Jinchang; Li, Ruizhi

    2017-11-01

    Although various visual tracking algorithms have been proposed in the last 2-3 decades, it remains a challenging problem for effective tracking with fast motion, deformation, occlusion, etc. Under complex tracking conditions, most tracking models are not discriminative and adaptive enough. When the combined feature vectors are inputted to the visual models, this may lead to redundancy causing low efficiency and ambiguity causing poor performance. An effective tracking algorithm is proposed to decontaminate features for each video sequence adaptively, where the visual modeling is treated as an optimization problem from the perspective of evolution. Every feature vector is compared to a biological individual and then decontaminated via classical evolutionary algorithms. With the optimized subsets of features, the "curse of dimensionality" has been avoided while the accuracy of the visual model has been improved. The proposed algorithm has been tested on several publicly available datasets with various tracking challenges and benchmarked with a number of state-of-the-art approaches. The comprehensive experiments have demonstrated the efficacy of the proposed methodology.

  15. Music genre classification via likelihood fusion from multiple feature models

    Science.gov (United States)

    Shiu, Yu; Kuo, C.-C. J.

    2005-01-01

    Music genre provides an efficient way to index songs in a music database, and can be used as an effective means to retrieve music of a similar type, i.e. content-based music retrieval. A new two-stage scheme for music genre classification is proposed in this work. At the first stage, we examine a couple of different features, construct their corresponding parametric models (e.g. GMM and HMM) and compute their likelihood functions to yield soft classification results. In particular, the timbre, rhythm and temporal variation features are considered. Then, at the second stage, these soft classification results are integrated to produce a hard decision for final music genre classification. Experimental results are given to demonstrate the performance of the proposed scheme.
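The two-stage scheme, soft per-feature likelihoods fused into a hard decision, can be sketched with toy diagonal Gaussians standing in for the GMM/HMM feature models; all names, weights, and data below are illustrative:

```python
import numpy as np

class GaussianModel:
    """Toy per-genre model for one feature type (stand-in for a GMM/HMM)."""
    def fit(self, X):
        self.mu, self.sigma = X.mean(axis=0), X.std(axis=0) + 1e-6
        return self

    def loglik(self, x):
        # Diagonal-Gaussian log-likelihood (constant terms dropped).
        z = (x - self.mu) / self.sigma
        return float(np.sum(-0.5 * z ** 2 - np.log(self.sigma)))

def fuse_and_classify(models, features, weights):
    """Stage 2: fuse the soft per-feature log-likelihoods into a hard label."""
    scores = {
        genre: sum(w * fm[name].loglik(features[name])
                   for name, w in weights.items())
        for genre, fm in models.items()
    }
    return max(scores, key=scores.get)

rng = np.random.default_rng(2)
train = {
    "rock": {"timbre": rng.normal(0, 1, (50, 4)), "rhythm": rng.normal(0, 1, (50, 3))},
    "jazz": {"timbre": rng.normal(4, 1, (50, 4)), "rhythm": rng.normal(4, 1, (50, 3))},
}
models = {g: {name: GaussianModel().fit(X) for name, X in feats.items()}
          for g, feats in train.items()}
label = fuse_and_classify(
    models,
    {"timbre": np.full(4, 4.0), "rhythm": np.full(3, 4.0)},
    {"timbre": 0.6, "rhythm": 0.4},
)
```

Working with weighted sums of log-likelihoods keeps each feature stream's model independent, so a weak stream only dilutes, rather than vetoes, the final decision.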

  16. Key thrusts in next generation CANDU. Annex 10

    International Nuclear Information System (INIS)

    Shalaby, B.A.; Torgerson, D.F.; Duffey, R.B.

    2002-01-01

    Current electricity markets and the competitiveness of other generation options such as CCGT have influenced the directions of future nuclear generation. The next generation CANDU has used its key characteristics as the basis to leap frog into a new design featuring improved economics, enhanced passive safety, enhanced operability and demonstrated fuel cycle flexibility. Many enabling technologies spinning of current CANDU design features are used in the next generation design. Some of these technologies have been developed in support of existing plants and near term designs while others will need to be developed and tested. This paper will discuss the key principles driving the next generation CANDU design and the fuel cycle flexibility of the CANDU system which provide synergism with the PWR fuel cycle. (author)

  17. Simultaneous Channel and Feature Selection of Fused EEG Features Based on Sparse Group Lasso

    Directory of Open Access Journals (Sweden)

    Jin-Jia Wang

    2015-01-01

    Full Text Available Feature extraction and classification of EEG signals are core parts of brain computer interfaces (BCIs). Due to the high dimension of the EEG feature vector, an effective feature selection algorithm has become an integral part of research studies. In this paper, we present a new method based on a wrapped Sparse Group Lasso for channel and feature selection of fused EEG signals. The high-dimensional fused features are first obtained; they include the power spectrum, time-domain statistics, AR model, and wavelet coefficient features extracted from the preprocessed EEG signals. The wrapped channel and feature selection method is then applied, which uses a logistic regression model with a Sparse Group Lasso penalty function. The model is fitted on the training data, and parameter estimates are obtained by modified blockwise coordinate descent and coordinate gradient descent methods. The best parameters and feature subset are selected using 10-fold cross-validation. Finally, the test data is classified using the trained model. Compared with existing channel and feature selection methods, results show that the proposed method is more suitable, more stable, and faster for high-dimensional feature fusion. It can simultaneously achieve channel and feature selection with a lower error rate. The test accuracy on data from the international BCI Competition IV reached 84.72%.
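The Sparse Group Lasso penalty drives whole channels to zero while also sparsifying features inside surviving channels; its proximal step, the core operation inside blockwise coordinate descent, can be sketched as follows (a generic sketch, not the paper's implementation):

```python
import numpy as np

def sparse_group_prox(w, groups, lam_group, lam_l1, step=1.0):
    """Proximal step for the Sparse Group Lasso penalty
    lam_group * sum_g ||w_g||_2  +  lam_l1 * ||w||_1.

    groups: one index array per EEG channel's block of features.
    """
    # l1 part: element-wise soft-thresholding inside every block.
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam_l1, 0.0)
    out = np.zeros_like(w)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > step * lam_group:      # channel survives, block is shrunk
            out[g] = (1.0 - step * lam_group / norm) * w[g]
        # else: the whole channel's block is zeroed (channel selection)
    return out

w = np.array([0.05, -0.02, 0.9, -1.2])       # two channels, two features each
groups = [np.array([0, 1]), np.array([2, 3])]
shrunk = sparse_group_prox(w, groups, lam_group=0.3, lam_l1=0.05)
```

Here the first channel's weak block is eliminated entirely, while the second channel is kept with its weights shrunk, which is exactly the simultaneous channel-and-feature selection behaviour described above.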

  18. Jointly Feature Learning and Selection for Robust Tracking via a Gating Mechanism.

    Directory of Open Access Journals (Sweden)

    Bineng Zhong

    Full Text Available To achieve effective visual tracking, a robust feature representation composed of two separate components (i.e., feature learning and selection) for an object is one of the key issues. Typically, a common assumption used in visual tracking is that the raw video sequences are clean, while real-world data contains significant noise and irrelevant patterns. Consequently, the learned features may not all be relevant, and may be noisy. To address this problem, we propose a novel visual tracking method via a point-wise gated convolutional deep network (CPGDN) that jointly performs feature learning and feature selection in a unified framework. The proposed method performs dynamic feature selection on raw features through a gating mechanism. Therefore, the proposed method can adaptively focus on the task-relevant patterns (i.e., a target object), while ignoring the task-irrelevant patterns (i.e., the surrounding background of a target object). Specifically, inspired by transfer learning, we first pre-train an object appearance model offline to learn generic image features and then transfer rich feature hierarchies from the offline pre-trained CPGDN into online tracking. In online tracking, the pre-trained CPGDN model is fine-tuned to adapt to the specific objects being tracked. Finally, to alleviate the tracker drifting problem, inspired by the observation that a visual target should be an object rather than not, we combine an edge box-based object proposal method to further improve the tracking accuracy. Extensive evaluation on the widely used CVPR2013 tracking benchmark validates the robustness and effectiveness of the proposed method.

  19. Multi-scale salient feature extraction on mesh models

    KAUST Repository

    Yang, Yongliang; Shen, ChaoHui

    2012-01-01

    We present a new method of extracting multi-scale salient features on meshes, based on robust estimation of curvature at multiple scales. The correspondence between a salient feature and the scale of interest can be established straightforwardly: detailed features appear at small scales, while features carrying more global shape information show up at large scales. We demonstrate that this multi-scale description of features accords with human perception and can further be used for several applications, such as feature classification and viewpoint selection. Experiments show that our method, as a multi-scale analysis tool, is very helpful for studying 3D shapes. © 2012 Springer-Verlag.

  20. Multiscale Feature Model for Terrain Data Based on Adaptive Spatial Neighborhood

    Directory of Open Access Journals (Sweden)

    Huijie Zhang

    2013-01-01

    Full Text Available Multiresolution hierarchy based on features (FMRH) has been applied in the field of terrain modeling and has obtained significant results in real engineering. However, it is difficult to schedule multiresolution data in FMRH from external memory. This paper proposes a new multiscale feature model and related strategies to cluster spatial data blocks and to solve the scheduling problems of FMRH using spatial neighborhoods. In the model, nodes with similar error in different layers are placed in one cluster. On this basis, a space index algorithm for each cluster, guided by a Hilbert curve, is proposed. It ensures that multiresolution terrain data can be loaded without traversing the whole FMRH; therefore, the efficiency of data scheduling is improved. Moreover, a spatial closeness theorem for clusters is put forward and proved. It guarantees that the union of data blocks composes a whole terrain without any data loss. Finally, experiments have been carried out on many large-scale data sets, and the results demonstrate that the scheduling time is shortened and the efficiency of I/O operations is clearly improved, which is important in real engineering.
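
    The Hilbert-curve-guided space index above relies on the curve's locality-preserving mapping from 2D block coordinates to a 1D order. A minimal sketch of that mapping (the standard xy-to-index bit algorithm; the function name and 4x4 block layout are illustrative, not the paper's implementation):

```python
def hilbert_index(order, x, y):
    """Map 2D block coordinates (x, y) on a (2**order x 2**order) grid to
    the block's position along a Hilbert curve (standard bit algorithm)."""
    d = 0
    s = 2 ** (order - 1)
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate the quadrant
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Sorting data blocks by this index yields a locality-preserving load order.
blocks = sorted(((x, y) for x in range(4) for y in range(4)),
                key=lambda b: hilbert_index(2, *b))
```

    Blocks adjacent in this order are mostly adjacent in space as well, which is what lets a scheduler load a cluster's blocks without traversing the whole hierarchy.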

  1. Identification of key residues for protein conformational transition using elastic network model.

    Science.gov (United States)

    Su, Ji Guo; Xu, Xian Jin; Li, Chun Hua; Chen, Wei Zu; Wang, Cun Xin

    2011-11-07

    Proteins usually undergo conformational transitions between structurally disparate states to fulfill their functions. The large-scale allosteric conformational transitions are believed to involve some key residues that mediate the conformational movements between different regions of the protein. In the present work, a thermodynamic method based on the elastic network model is proposed to predict the key residues involved in protein conformational transitions. In our method, the key functional sites are identified as the residues whose perturbations most strongly influence the free energy difference between the protein states before and after the transition. Two proteins, the nucleotide binding domain of the heat shock protein 70 and human/rat DNA polymerase β, are used as case studies to identify the critical residues responsible for their open-closed conformational transitions. The results show that the functionally important residues are mainly located in the following regions for these two proteins: (1) the bridging point at the interface between the subdomains that control the opening and closure of the binding cleft; (2) the hinge region between different subdomains, which mediates the cooperative motions between the corresponding subdomains; and (3) the substrate binding sites. The similarity in the positions of the key residues for these two proteins may indicate a common mechanism in their conformational transitions.
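
    The elastic network machinery that such perturbation analyses build on can be sketched compactly. Below is a minimal Gaussian network model in Python: it assembles the Kirchhoff (connectivity) matrix from C-alpha coordinates and reads per-residue mean-square fluctuations off its pseudo-inverse. The cutoff value and function names are illustrative, not the authors' exact protocol:

```python
import numpy as np

def gnm_fluctuations(coords, cutoff=7.0):
    """Gaussian network model sketch: build the Kirchhoff matrix from
    C-alpha coordinates (contacts within `cutoff` angstroms) and return
    per-residue mean-square fluctuations from its pseudo-inverse."""
    coords = np.asarray(coords, dtype=float)
    dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    kirchhoff = -(dist < cutoff).astype(float)           # -1 for contacting pairs
    np.fill_diagonal(kirchhoff, 0.0)
    np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))  # degree on the diagonal
    return np.diag(np.linalg.pinv(kirchhoff))            # ~ <dR_i^2> up to a constant
```

    A perturbation scheme of the kind described above would then rescale individual spring constants and track how a state-dependent quantity responds.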

  2. Modeling HAZ hardness and weld features with BPN technology

    International Nuclear Information System (INIS)

    Morinishi, S.; Bibby, M.J.; Chan, B.

    2000-01-01

    A BPN (back-propagation network) system for predicting HAZ (heat-affected zone) hardnesses and GMAW (gas metal arc welding) weld features (size and shape) is described in this presentation. Among other things, issues of network structure, training and testing data selection, software efficiency and user interface are discussed. The system is evaluated by comparing network output with experimentally measured test data in the first instance, and thereafter with regression methods available for this purpose. The potential of the web for exchanging weld process data and for accessing models generated with this system is addressed. In this regard, the software has been made available on the Cambridge University 'steel' and 'neural' websites. In addition, Java-coded software has recently been produced to provide web flexibility and accessibility. Beyond this, the possibility of offering an on-line 'server' training service, arranged to capture user data (user identification, measured welding parameters and features) and trained models for the use of the entire welding community, is described. While such an exchange is attractive, there are several difficulties in designing such a system. Server software design, computing resources, database and communications considerations are some of the issues that must be addressed before a server-centered training and database system becomes reality. (author)

  3. Nine key principles to guide youth mental health: development of service models in New South Wales.

    Science.gov (United States)

    Howe, Deborah; Batchelor, Samantha; Coates, Dominiek; Cashman, Emma

    2014-05-01

    Historically, the Australian health system has failed to meet the needs of young people with mental health problems and mental illness. In 2006, New South Wales (NSW) Health allocated considerable funds to the reform agenda of mental health services in NSW to address this inadequacy. Children and Young People's Mental Health (CYPMH), a service that provides mental health care for young people aged 12-24 years, with moderate to severe mental health problems, was chosen to establish a prototype Youth Mental Health (YMH) Service Model for NSW. This paper describes nine key principles developed by CYPMH to guide the development of YMH Service Models in NSW. A literature review, numerous stakeholder consultations and consideration of clinical best practice were utilized to inform the development of the key principles. Subsequent to their development, the nine key principles were formally endorsed by the Mental Health Program Council to ensure consistency and monitor the progress of YMH services across NSW. As a result, between 2008 and 2012 YMH Services across NSW regularly reported on their activities against each of the nine key principles demonstrating how each principle was addressed within their service. The nine key principles provide mental health services a framework for how to reorient services to accommodate YMH and provide a high-quality model of care. [Corrections added on 29 November 2013, after first online publication: The last two sentences of the Results section have been replaced with "As a result, between 2008 and 2012 YMH Services across NSW regularly reported on their activities against each of the nine key principles demonstrating how each principle was addressed within their service."]. © 2013 Wiley Publishing Asia Pty Ltd.

  4. Feature selection for splice site prediction: A new method using EDA-based feature ranking

    Directory of Open Access Journals (Sweden)

    Rouzé Pierre

    2004-05-01

    Full Text Available Abstract Background The identification of relevant biological features in large and complex datasets is an important step towards gaining insight into the processes underlying the data. Other advantages of feature selection include the ability of the classification system to attain good or even better solutions using a restricted subset of features, and faster classification. Thus, robust methods for fast feature selection are of key importance in extracting knowledge from complex biological data. Results In this paper we present a novel method for feature subset selection applied to splice site prediction, based on estimation of distribution algorithms, a more general framework than genetic algorithms. From the estimated distribution of the algorithm, a feature ranking is derived. Afterwards, this ranking is used to iteratively discard features. We apply this technique to the problem of splice site prediction, and show how it can be used to gain insight into the underlying biological process of splicing. Conclusion We show that this technique proves to be more robust than the traditional use of estimation of distribution algorithms for feature selection: instead of returning a single best subset of features (as they normally do), this method provides a dynamical view of the feature selection process, like the traditional sequential wrapper methods. However, the method is faster than the traditional techniques, and scales better to datasets described by a large number of features.
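
    The EDA-based ranking idea, estimating marginal inclusion probabilities of features over generations and reading the final probabilities as a ranking, can be sketched with a UMDA-style loop. The subset fitness below (mean absolute correlation with the target) is a deliberately simple stand-in for the classifier-based fitness a real splice-site predictor would use; all names and defaults are illustrative:

```python
import numpy as np

def umda_feature_ranking(X, y, pop=40, gens=15, seed=0):
    """UMDA-style EDA feature ranking: evolve a product distribution over
    feature-subset bitmasks and read the final marginal inclusion
    probabilities as the ranking (higher = more relevant)."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    # Stand-in fitness ingredient: per-feature |correlation| with y.
    corr = np.abs(np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n)]))
    p = np.full(n, 0.5)                       # marginal inclusion probabilities
    for _ in range(gens):
        masks = rng.random((pop, n)) < p      # sample candidate subsets
        masks[~masks.any(axis=1)] = True      # no empty subsets
        fitness = np.array([corr[m].mean() for m in masks])
        elite = masks[np.argsort(fitness)[-pop // 2:]]   # keep the best half
        p = 0.5 * p + 0.5 * elite.mean(axis=0)           # smoothed re-estimate
        p = p.clip(0.05, 0.95)                           # keep exploration alive
    return np.argsort(p)[::-1]                # feature indices, best first
```

    Iteratively discarding the lowest-ranked features, as the paper describes, then amounts to walking this ordering from the back.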

  5. The consensus in the two-feature two-state one-dimensional Axelrod model revisited

    International Nuclear Information System (INIS)

    Biral, Elias J P; Tilles, Paulo F C; Fontanari, José F

    2015-01-01

    The Axelrod model for the dissemination of culture exhibits a rich spatial distribution of cultural domains, which depends on the values of the two model parameters: F, the number of cultural features, and q, the common number of states each feature can assume. In the one-dimensional model with F = q = 2, which is closely related to the constrained voter model, Monte Carlo simulations indicate the existence of multicultural absorbing configurations in which at least one macroscopic domain coexists with a multitude of microscopic ones in the thermodynamic limit. However, rigorous analytical results for the infinite system starting from the configuration where all cultures are equally likely show convergence to only monocultural or consensus configurations. Here we show that this disagreement is due simply to the order in which the time-asymptotic limit and the thermodynamic limit are taken in the simulations. In addition, we show how the consensus-only result can be derived using Monte Carlo simulations of finite chains. (paper)

  6. The consensus in the two-feature two-state one-dimensional Axelrod model revisited

    Science.gov (United States)

    Biral, Elias J. P.; Tilles, Paulo F. C.; Fontanari, José F.

    2015-04-01

    The Axelrod model for the dissemination of culture exhibits a rich spatial distribution of cultural domains, which depends on the values of the two model parameters: F, the number of cultural features, and q, the common number of states each feature can assume. In the one-dimensional model with F = q = 2, which is closely related to the constrained voter model, Monte Carlo simulations indicate the existence of multicultural absorbing configurations in which at least one macroscopic domain coexists with a multitude of microscopic ones in the thermodynamic limit. However, rigorous analytical results for the infinite system starting from the configuration where all cultures are equally likely show convergence to only monocultural or consensus configurations. Here we show that this disagreement is due simply to the order in which the time-asymptotic limit and the thermodynamic limit are taken in the simulations. In addition, we show how the consensus-only result can be derived using Monte Carlo simulations of finite chains.
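
    A finite-chain Monte Carlo simulation of this model is short to write down. The sketch below implements the standard Axelrod update rule for general F and q (defaults F = q = 2); the chain length, step count and open boundaries are illustrative choices, and convergence checks and domain statistics are left out:

```python
import random

def axelrod_1d(n=50, F=2, q=2, steps=200000, seed=1):
    """Monte Carlo sketch of the 1D Axelrod model: a random agent interacts
    with a random neighbour with probability equal to their cultural
    overlap (shared features / F) and, if so, copies one differing
    feature from that neighbour."""
    rng = random.Random(seed)
    sites = [[rng.randrange(q) for _ in range(F)] for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        j = i + rng.choice((-1, 1))
        if not 0 <= j < n:
            continue                      # open boundary
        shared = sum(a == b for a, b in zip(sites[i], sites[j]))
        if 0 < shared < F and rng.random() < shared / F:
            k = rng.choice([f for f in range(F)
                            if sites[i][f] != sites[j][f]])
            sites[i][k] = sites[j][k]
    return sites
```

    Measuring the number and sizes of cultural domains as a function of n in such runs is the kind of finite-chain experiment from which the consensus-only result can be extracted.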

  7. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    Science.gov (United States)

    Zheng, Ran; Yao, Chuanwei; Jin, Hai; Zhu, Lei; Zhang, Qin; Deng, Wei

    2015-01-01

    Surveillance video service (SVS) is one of the most important services provided in a smart city. For SVS to be fully utilized, it is important to design efficient surveillance video analysis techniques. Key frame extraction is a simple yet effective technique to achieve this goal. In surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract them accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a particularly salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time. It is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently than several other methods.

  8. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    Directory of Open Access Journals (Sweden)

    Ran Zheng

    Full Text Available Surveillance video service (SVS) is one of the most important services provided in a smart city. For SVS to be fully utilized, it is important to design efficient surveillance video analysis techniques. Key frame extraction is a simple yet effective technique to achieve this goal. In surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract them accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a particularly salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time. It is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently than several other methods.
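
    The pipeline described in this record, motion feature, smoothing, local maxima, can be sketched on the CPU in a few lines. Mean absolute frame difference stands in for the paper's motion feature, and the GPU offloading is not reproduced:

```python
import numpy as np

def key_frames_by_motion(frames, smooth=3):
    """Select key frames as local maxima of a smoothed per-frame motion
    signal (mean absolute difference between consecutive grayscale
    frames); returns key frame indices and the motion signal."""
    frames = np.asarray(frames, dtype=float)
    motion = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    kernel = np.ones(smooth) / smooth
    motion = np.convolve(motion, kernel, mode='same')   # noise smoothing
    keys = [i + 1 for i in range(1, len(motion) - 1)
            if motion[i] > motion[i - 1] and motion[i] >= motion[i + 1]]
    return keys, motion
```

    The offset `i + 1` maps each transition back to the frame that introduced the change.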

  9. Short-Term Solar Irradiance Forecasting Model Based on Artificial Neural Network Using Statistical Feature Parameters

    Directory of Open Access Journals (Sweden)

    Hongshan Zhao

    2012-05-01

    Full Text Available Short-term solar irradiance forecasting (STSIF) is of great significance for the optimal operation and power prediction of grid-connected photovoltaic (PV) plants. However, STSIF is very complex to handle due to the random and nonlinear characteristics of solar irradiance under changeable weather conditions. Artificial Neural Networks (ANNs) are suitable for STSIF modeling and many research works on this topic have been presented, but the conciseness and robustness of the existing models still need to be improved. After discussing the relation between weather variations and irradiance, the characteristics of the statistical feature parameters of irradiance under different weather conditions are figured out. A novel ANN model using statistical feature parameters (ANN-SFP) for STSIF is proposed in this paper. The input vector is reconstructed with several statistical feature parameters of irradiance and the ambient temperature. Thus, sufficient information can be effectively extracted from relatively few inputs and the model complexity is reduced. The model structure is determined by cross-validation (CV), and the Levenberg-Marquardt algorithm (LMA) is used for network training. Simulations are carried out to validate and compare the proposed model with the conventional ANN model using historical data series (ANN-HDS), and the results indicate that the forecast accuracy is clearly improved under variable weather conditions.
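
    The statistical-feature-parameter idea, condensing a window of irradiance samples into a few statistics before feeding an ANN, can be sketched as follows. The paper's exact parameter set is not reproduced; these four statistics are illustrative stand-ins:

```python
import numpy as np

def irradiance_features(window):
    """Condense one window of irradiance samples into a small statistical
    feature vector: mean, standard deviation, range, and skewness."""
    w = np.asarray(window, dtype=float)
    mean, std = w.mean(), w.std()
    value_range = w.max() - w.min()
    # Third standardized moment as a shape descriptor (0 for flat windows).
    skew = ((w - mean) ** 3).mean() / (std ** 3 if std else 1.0)
    return np.array([mean, std, value_range, skew])
```

    Concatenating such vectors for irradiance and ambient temperature gives the kind of compact input the ANN-SFP model is built on.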

  10. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    International Nuclear Information System (INIS)

    Wasiolek, M. A.

    2003-01-01

    The purpose of this report is to document the evaluation of biosphere features, events, and processes (FEPs) that relate to the license application (LA) process as required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The evaluation determines whether specific biosphere-related FEPs should be included or excluded from consideration in the Total System Performance Assessment (TSPA). This analysis documents the technical basis for screening decisions as required at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. For FEPs that are included in the TSPA, this analysis provides a TSPA disposition, which summarizes how the FEP has been included and addressed in the TSPA model, and cites the analysis reports and model reports that provide the technical basis and description of its disposition. For FEPs that are excluded from the TSPA, this analysis report provides a screening argument, which identifies the basis for the screening decision (i.e., low probability, low consequence, or by regulation) and discusses the technical basis that supports that decision. In cases where a FEP covers multiple technical areas and is shared with other FEP analysis reports, this analysis may provide only a partial technical basis for the screening of the FEP. The full technical basis for these shared FEPs is addressed collectively by all FEP analysis reports that cover technical disciplines sharing a FEP. FEPs must be included in the TSPA unless they can be excluded by low probability, low consequence, or regulation. A FEP can be excluded from the TSPA by low probability per 10 CFR 63.114(d) [DIRS 156605], by showing that it has less than one chance in 10,000 of occurring over 10,000 years (or an approximately equivalent annualized probability of 10^-8). A FEP can be excluded from the TSPA by low consequence per 10 CFR 63.114 (e or f) [DIRS 156605], by showing that omitting the FEP would not significantly change the magnitude and

  11. Research on Degeneration Model of Neural Network for Deep Groove Ball Bearing Based on Feature Fusion

    Directory of Open Access Journals (Sweden)

    Lijun Zhang

    2018-02-01

    Full Text Available Aiming at the pitting fault of deep groove ball bearings during service, this paper uses the vibration signals of five different states of a deep groove ball bearing, extracts the relevant features, and then uses a neural network to model the degradation for identifying and classifying the fault type. By comparing the effects of training samples of different sizes through performance indexes such as accuracy and convergence speed, it is shown that an increase in sample size can improve the performance of the model. Based on the polynomial fitting principle and the Pearson correlation coefficient, fusion features based on the skewness index are proposed, and the performance improvement of the model after incorporating the fusion features is also validated. A comparison of the performance of the support vector machine (SVM) model and the neural network model on this dataset is given. The research shows that neural networks have more potential for complex and high-volume datasets.

  12. Scaling up spike-and-slab models for unsupervised feature learning.

    Science.gov (United States)

    Goodfellow, Ian J; Courville, Aaron; Bengio, Yoshua

    2013-08-01

    We describe the use of two spike-and-slab models for modeling real-valued data, with an emphasis on their applications to object recognition. The first model, which we call spike-and-slab sparse coding (S3C), is a preexisting model for which we introduce a faster approximate inference algorithm. We introduce a deep variant of S3C, which we call the partially directed deep Boltzmann machine (PD-DBM) and extend our S3C inference algorithm for use on this model. We describe learning procedures for each. We demonstrate that our inference procedure for S3C enables scaling the model to unprecedented large problem sizes, and demonstrate that using S3C as a feature extractor results in very good object recognition performance, particularly when the number of labeled examples is low. We show that the PD-DBM generates better samples than its shallow counterpart, and that unlike DBMs or DBNs, the PD-DBM may be trained successfully without greedy layerwise training.

  13. Intelligent Fault Diagnosis of HVCB with Feature Space Optimization-Based Random Forest.

    Science.gov (United States)

    Ma, Suliang; Chen, Mingxuan; Wu, Jianwen; Wang, Yuhao; Jia, Bowen; Jiang, Yuan

    2018-04-16

    Mechanical faults of high-voltage circuit breakers (HVCBs) inevitably occur over long-term operation, so extracting fault features and identifying the fault type have become key issues for ensuring the security and reliability of the power supply. Based on wavelet packet decomposition technology and the random forest algorithm, an effective identification system was developed in this paper. First, to address the incomplete description given by Shannon entropy, the wavelet packet time-frequency energy rate (WTFER) was adopted as the input vector for the classifier model in the feature selection procedure. Then, a random forest classifier was used to diagnose the HVCB fault, assess the importance of the feature variables and optimize the feature space. Finally, the approach was verified on actual HVCB vibration signals covering six typical fault classes. The comparative experimental results show that the classification accuracy of the proposed method reached 93.33% with the original feature space and up to 95.56% with the optimized input feature vector of the classifier. This indicates that the feature optimization procedure is successful, and that the proposed diagnosis algorithm has higher efficiency and robustness than traditional methods.
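
    The energy-rate idea behind WTFER can be illustrated with a hand-rolled Haar wavelet-packet split; this is a simplified stand-in for the paper's features (its wavelet, decomposition depth and normalization may differ):

```python
import numpy as np

def haar_packet_energy_rates(signal, levels=2):
    """Split a signal into 2**levels wavelet-packet bands with the Haar
    filter pair and return each band's share of the total energy."""
    bands = [np.asarray(signal, dtype=float)]
    for _ in range(levels):
        nxt = []
        for b in bands:
            if len(b) % 2:                               # pad odd-length bands
                b = np.append(b, b[-1])
            nxt.append((b[0::2] + b[1::2]) / np.sqrt(2))  # approximation
            nxt.append((b[0::2] - b[1::2]) / np.sqrt(2))  # detail
        bands = nxt
    energies = np.array([np.sum(b ** 2) for b in bands])
    return energies / energies.sum()
```

    The resulting rate vector can be fed to any classifier, e.g. a random forest whose feature importances then drive the kind of feature-space optimization described above.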

  14. Multilevel binomial logistic prediction model for malignant pulmonary nodules based on texture features of CT image

    International Nuclear Information System (INIS)

    Wang Huan; Guo Xiuhua; Jia Zhongwei; Li Hongkai; Liang Zhigang; Li Kuncheng; He Qian

    2010-01-01

    Purpose: To introduce multilevel binomial logistic prediction model-based computer-aided diagnostic (CAD) method of small solitary pulmonary nodules (SPNs) diagnosis by combining patient and image characteristics by textural features of CT image. Materials and methods: Describe fourteen gray level co-occurrence matrix textural features obtained from 2171 benign and malignant small solitary pulmonary nodules, which belongs to 185 patients. Multilevel binomial logistic model is applied to gain these initial insights. Results: Five texture features, including Inertia, Entropy, Correlation, Difference-mean, Sum-Entropy, and age of patients own aggregating character on patient-level, which are statistically different (P < 0.05) between benign and malignant small solitary pulmonary nodules. Conclusion: Some gray level co-occurrence matrix textural features are efficiently descriptive features of CT image of small solitary pulmonary nodules, which can profit diagnosis of earlier period lung cancer if combined patient-level characteristics to some extent.

  15. Detection and quantification of flow consistency in business process models.

    Science.gov (United States)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to study it systematically, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user, and second, to show how such features can be quantified into computational metrics which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is most effective in predicting the human perception of this feature. Moreover, two other automatic evaluations, describing the performance and the computational capabilities of our metrics, are reported as well.
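
    One way to make such a metric concrete is to measure the share of sequence flows that point in the model's dominant drawing direction. The sketch below illustrates this general idea and is not one of the paper's three metrics:

```python
from collections import Counter

def flow_consistency(edges, positions):
    """Share of edges whose dominant axis direction matches the most common
    direction in the model; edges are (src, dst) pairs, and positions maps
    node -> (x, y) with y growing downwards, as in typical diagram tools."""
    def direction(src, dst):
        (x1, y1), (x2, y2) = positions[src], positions[dst]
        dx, dy = x2 - x1, y2 - y1
        if abs(dx) >= abs(dy):
            return 'right' if dx >= 0 else 'left'
        return 'down' if dy > 0 else 'up'
    dirs = [direction(*e) for e in edges]
    return Counter(dirs).most_common(1)[0][1] / len(dirs)
```

    A value of 1.0 means every flow follows one direction; lower values flag layouts that mix directions, which is exactly the property the human-perception study evaluates.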

  16. Securing information using optically generated biometric keys

    Science.gov (United States)

    Verma, Gaurav; Sinha, Aloka

    2016-11-01

    In this paper, we present a new technique to obtain biometric keys by using the fingerprint of a person for an optical image encryption system. The key generation scheme uses the fingerprint biometric information in terms of the amplitude mask (AM) and the phase mask (PM) of the reconstructed fingerprint image that is implemented using the digital holographic technique. Statistical tests have been conducted to check the randomness of the fingerprint PM key that enables its usage as an image encryption key. To explore the utility of the generated biometric keys, an optical image encryption system has been further demonstrated based on the phase retrieval algorithm and the double random phase encoding scheme in which keys for the encryption are used as the AM and the PM key. The advantage associated with the proposed scheme is that the biometric keys’ retrieval requires the simultaneous presence of the fingerprint hologram and the correct knowledge of the reconstruction parameters at the decryption stage, which not only verifies the authenticity of the person but also protects the valuable fingerprint biometric features of the keys. Numerical results are carried out to prove the feasibility and the effectiveness of the proposed encryption system.
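
    The double random phase encoding step mentioned above has a compact numerical form: one random phase mask applied in the spatial domain and a second in the Fourier domain. A NumPy sketch, with generic random masks standing in for the fingerprint-derived AM and PM keys:

```python
import numpy as np

def drpe_encrypt(img, key1, key2):
    """Double random phase encoding: multiply by a phase mask in the
    spatial domain, then by a second one in the Fourier domain."""
    m1 = np.exp(2j * np.pi * key1)
    m2 = np.exp(2j * np.pi * key2)
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def drpe_decrypt(cipher, key1, key2):
    """Invert DRPE; both correct phase keys are required."""
    m1 = np.exp(2j * np.pi * key1)
    m2 = np.exp(2j * np.pi * key2)
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(m2)) / m1
```

    Decryption with a wrong second key leaves noise, which is the property that ties the recovered image to the correct biometric keys.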

  17. A Novel Medical Freehand Sketch 3D Model Retrieval Method by Dimensionality Reduction and Feature Vector Transformation

    Directory of Open Access Journals (Sweden)

    Zhang Jing

    2016-01-01

    Full Text Available To assist physicians to quickly find the required 3D model in a mass of medical models, we propose a novel retrieval method, called DRFVT, which combines the characteristics of the dimensionality reduction (DR) and feature vector transformation (FVT) methods. The DR method reduces the dimensionality of the feature vector; only the top M low-frequency Discrete Fourier Transform coefficients are retained. The FVT method transforms the original feature vector and generates a new feature vector to solve the problem of noise sensitivity. The experimental results demonstrate that the DRFVT method achieves more effective and efficient retrieval results than other proposed methods.
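
    The DR step, retaining only the top M low-frequency DFT coefficients of a descriptor, can be sketched in a few lines. The function name and the use of coefficient magnitudes are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def reduce_descriptor(feature_vec, m=8):
    """Keep only the top-M low-frequency DFT coefficients of a shape
    descriptor: coarse structure survives, high-frequency noise is
    discarded."""
    spectrum = np.fft.rfft(np.asarray(feature_vec, dtype=float))
    return np.abs(spectrum[:m])        # magnitudes of the low frequencies
```

    Retrieval then compares these short M-dimensional vectors instead of the full descriptors, which is where the efficiency gain comes from.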

  18. Handwriting: Feature Correlation Analysis for Biometric Hashes

    Science.gov (United States)

    Vielhauer, Claus; Steinmetz, Ralf

    2004-12-01

    In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  19. Handwriting: Feature Correlation Analysis for Biometric Hashes

    Directory of Open Access Journals (Sweden)

    Ralf Steinmetz

    2004-04-01

    Full Text Available In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different sizes. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  20. Safety assessment on the key equipment of coal gasification based on SDG-HAZOP method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, B.; Xu, X.; Ma, X.; Wu, C. [Beijing University of Chemical Technology, Beijing (China)

    2008-07-15

    An example of the coal gasification process was introduced after an explanation of the graphical representation method known as the signed directed graph (SDG). The systematic modeling procedure was also introduced. First, the key variables of the whole system were selected. Then the relationship equations were listed. Finally, the SDG-HAZOP (hazard and operability) model was derived after attaching the abnormal causes and adverse consequences. In order to obtain a credible SDG model, it was checked by technicians in the factory. Based on computer-aided analysis, this model can express almost all the dangerous features of the gasification process. It can also reveal the mechanisms of danger propagation, which may effectively help safety engineers to identify potential hazards. 15 refs., 4 figs., 1 tab.

  1. Choosing preclinical study models of diabetic retinopathy: key problems for consideration

    Science.gov (United States)

    Mi, Xue-Song; Yuan, Ti-Fei; Ding, Yong; Zhong, Jing-Xiang; So, Kwok-Fai

    2014-01-01

    Diabetic retinopathy (DR) is the most common complication of diabetes mellitus in the eye. Although the clinical treatment of DR has already developed to a relatively high level, there are still many urgent problems that need to be investigated in clinical and basic science. Currently, many in vivo animal models and in vitro culture systems have been applied to solve these problems, and many approaches have been used to establish different DR models. However, to date, no single study model can clearly and exactly mimic the developmental process of human DR. Choosing a suitable model is important, not only for achieving research goals smoothly, but also for matching the different experimental purposes of a study. In this review, key problems to consider in choosing study models of DR are discussed. These problems relate to clinical relevance, the different approaches for establishing models, and the choice of different animal species as well as of specific in vitro culture systems. Attending to these considerations will deepen the understanding of current study models and optimize experimental design toward the final goal of preventing DR. PMID:25429204

  2. Simulation on scattering features of biological tissue based on generated refractive-index model

    International Nuclear Information System (INIS)

    Wang Baoyong; Ding Zhihua

    2011-01-01

    Important information on the morphology of biological tissue can be deduced from elastic scattering spectra, and their analysis is based on a known refractive-index model of the tissue. In this paper, a new numerical refractive-index model is put forward, and its scattering properties are studied intensively. Spectral decomposition [1] is a widely used method to generate random media in geology, but it has never been used in biology. Biological tissue differs from geological media as a random medium: the autocorrelation function describes almost all features in geology, but biological tissue is not as random; its structure is regular in the sense of fractal geometry [2], and a fractal dimension can be used to describe its regularity under randomness. First, scattering theories for such fractal media are reviewed. Second, the detailed generation process of the refractive index is presented. Finally, the scattering features are simulated in the FDTD (Finite-Difference Time-Domain) Solutions software. From the simulation results, we find that the autocorrelation length and the fractal dimension control the scattering features of biological tissue.
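
    A common way to realize such a spectral-decomposition model numerically is to filter white noise with a power-law spectrum, whose exponent acts as the fractal-dimension control. A 2D sketch under that assumption (parameter names and default values are illustrative, not the paper's):

```python
import numpy as np

def fractal_index_field(n=64, beta=3.0, mean_index=1.4, amplitude=0.02, seed=0):
    """Generate a 2D refractive-index field by spectral decomposition:
    filter white noise with a power-law spectrum ~ k**(-beta/2); the
    exponent beta plays the role of the fractal-dimension control."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.hypot(kx, ky)
    k[0, 0] = 1.0                                 # avoid division by zero at DC
    field = np.fft.ifft2(np.fft.fft2(noise) * k ** (-beta / 2)).real
    field = (field - field.mean()) / field.std()  # normalize the fluctuations
    return mean_index + amplitude * field
```

    A field like this can then be handed to an FDTD solver as the scatterer, with beta swept to study how the fractal control changes the scattering features.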

  3. Key Recovery Attacks on Recent Authenticated Ciphers

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Dobraunig, Christoph; Eichlseder, Maria

    2014-01-01

    In this paper, we cryptanalyze three authenticated ciphers: AVALANCHE, Calico, and RBS. While the former two are contestants in the ongoing international CAESAR competition for authenticated encryption schemes, the latter has recently been proposed for lightweight applications such as RFID systems...... and wireless networks. All these schemes use well-established and secure components such as the AES, Grain-like NFSRs, ChaCha and SipHash as their building blocks. However, we discover key recovery attacks for all three designs, featuring square-root complexities. Using a key collision technique, we can...

  4. Experimental demonstration of subcarrier multiplexed quantum key distribution system.

    Science.gov (United States)

    Mora, José; Ruiz-Alba, Antonio; Amaya, Waldimar; Martínez, Alfonso; García-Muñoz, Víctor; Calvo, David; Capmany, José

    2012-06-01

    We provide, to our knowledge, the first experimental demonstration of the feasibility of sending several parallel keys by exploiting the technique of subcarrier multiplexing (SCM) widely employed in microwave photonics. This approach brings several advantages, such as high spectral efficiency compatible with current secure key rates, the sharing of the faint optical pulse by all the quantum multiplexed channels (reducing system complexity), and the possibility of upgrading with wavelength division multiplexing in a two-tier scheme to increase the number of parallel keys. Two independent quantum SCM channels, each featuring a sifted key rate of 10 kb/s over a link with a quantum bit error rate below 2%, are reported.

  5. A Featured-Based Strategy for Stereovision Matching in Sensors with Fish-Eye Lenses for Forest Environments

    Science.gov (United States)

    Herrera, Pedro Javier; Pajares, Gonzalo; Guijarro, Maria; Ruz, José J.; Cruz, Jesús M.; Montes, Fernando

    2009-01-01

    This paper describes a novel feature-based stereovision matching process based on a pair of omnidirectional images of forest stands acquired with a stereovision sensor equipped with fish-eye lenses. The stereo analysis problem consists of the following steps: image acquisition, camera modelling, feature extraction, image matching and depth determination. Once the depths of significant points on the trees are obtained, the growing stock volume can be estimated by considering the geometrical camera modelling, which is the final goal. The key steps are feature extraction and image matching, and this paper is devoted solely to these two. In the first stage, a segmentation process extracts the trunks, which are the regions used as features, and each feature is identified through a set of attributes useful for matching. In the second stage, the features are matched by applying four well-known matching constraints: epipolar, similarity, ordering and uniqueness. The combination of the segmentation and matching processes for this specific kind of sensor makes the main contribution of the paper. The method is tested with satisfactory results and compared against the human expert criterion. PMID:22303134
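The interplay of the similarity and uniqueness constraints can be illustrated with a small numerical sketch: score candidate pairs by the distance between attribute vectors, then assign greedily so that each feature is matched at most once. The attribute vectors and the greedy best-first strategy below are invented for illustration; the paper's full method also applies the epipolar and ordering constraints.

```python
import numpy as np

# Illustrative trunk-region attribute vectors (area, mean intensity, width);
# values are invented for the sketch, not taken from real sensor data
left = np.array([[120, 0.62, 14], [300, 0.48, 22], [210, 0.55, 18]], float)
right = np.array([[305, 0.47, 23], [118, 0.60, 14], [215, 0.56, 17]], float)

# Similarity constraint: distance between per-axis-normalized attribute vectors
scale = left.max(axis=0)
d = np.linalg.norm(left[:, None] / scale - right[None, :] / scale, axis=2)

# Uniqueness constraint: each right-image feature is matched at most once
matches, used = {}, set()
for i in np.argsort(d.min(axis=1)):          # most confident left features first
    for j in np.argsort(d[i]):
        if j not in used:
            matches[int(i)] = int(j)
            used.add(int(j))
            break
```

On this toy data the three trunks pair up by attribute similarity while the uniqueness constraint prevents two left features from claiming the same right feature.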

  6. Backup key generation model for one-time password security protocol

    Science.gov (United States)

    Jeyanthi, N.; Kundu, Sourav

    2017-11-01

    The use of one-time passwords (OTP) has ushered new life into the authentication protocols used by the software industry. The OTP introduced a second layer of security on top of traditional username-password authentication, coining the term two-factor authentication. One drawback of this protocol is the possible unavailability of the hardware token at the time of authentication. This paper proposes a simple backup key model that can be associated with a real-world application's user database and that allows a user to circumvent the second authentication stage in the event of unavailability of the hardware token.
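A minimal sketch of such a backup-key table follows. The function names, the pool size, and the choice of storing SHA-256 hashes of single-use keys are assumptions for illustration, not the paper's design.

```python
import hashlib
import secrets

def generate_backup_keys(n=5):
    """Issue n single-use backup keys; the server stores only their hashes."""
    keys = [secrets.token_hex(8) for _ in range(n)]
    stored_hashes = {hashlib.sha256(k.encode()).hexdigest() for k in keys}
    return keys, stored_hashes    # keys go to the user, hashes to the database

def redeem_backup_key(stored_hashes, candidate):
    """Accept a backup key at most once, invalidating it on use."""
    h = hashlib.sha256(candidate.encode()).hexdigest()
    if h in stored_hashes:
        stored_hashes.discard(h)  # single-use: remove after redemption
        return True
    return False
```

A real deployment would additionally rate-limit redemption attempts and prompt the user to re-enroll a token once the backup pool is exhausted.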

  7. A Web-Based Data Collection Platform for Multisite Randomized Behavioral Intervention Trials: Development, Key Software Features, and Results of a User Survey.

    Science.gov (United States)

    Modi, Riddhi A; Mugavero, Michael J; Amico, Rivet K; Keruly, Jeanne; Quinlivan, Evelyn Byrd; Crane, Heidi M; Guzman, Alfredo; Zinski, Anne; Montue, Solange; Roytburd, Katya; Church, Anna; Willig, James H

    2017-06-16

    Meticulous tracking of study data must begin early in the study recruitment phase and must account for regulatory compliance, minimize missing data, and provide high information integrity and/or reduction of errors. In behavioral intervention trials, participants typically complete several study procedures at different time points. Among HIV-infected patients, behavioral interventions can favorably affect health outcomes. In order to empower newly diagnosed HIV positive individuals to learn skills to enhance retention in HIV care, we developed the behavioral health intervention Integrating ENGagement and Adherence Goals upon Entry (iENGAGE) funded by the National Institute of Allergy and Infectious Diseases (NIAID), where we deployed an in-clinic behavioral health intervention in 4 urban HIV outpatient clinics in the United States. To scale our intervention strategy homogenously across sites, we developed software that would function as a behavioral sciences research platform. This manuscript aimed to: (1) describe the design and implementation of a Web-based software application to facilitate deployment of a multisite behavioral science intervention; and (2) report on results of a survey to capture end-user perspectives of the impact of this platform on the conduct of a behavioral intervention trial. In order to support the implementation of the NIAID-funded trial iENGAGE, we developed software to deploy a 4-site behavioral intervention for new clinic patients with HIV/AIDS. We integrated the study coordinator into the informatics team to participate in the software development process. Here, we report the key software features and the results of the 25-item survey to evaluate user perspectives on research and intervention activities specific to the iENGAGE trial (N=13). The key features addressed are study enrollment, participant randomization, real-time data collection, facilitation of longitudinal workflow, reporting, and reusability. We found 100% user

  8. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    Science.gov (United States)

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km² mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) on 6-hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the

  9. Fault feature extraction method based on local mean decomposition Shannon entropy and improved kernel principal component analysis model

    Directory of Open Access Journals (Sweden)

    Jinlu Sheng

    2016-07-01

    Full Text Available To effectively extract the typical features of a bearing, a new method relating local mean decomposition (LMD) Shannon entropy to an improved kernel principal component analysis (KPCA) model is proposed. First, features are extracted with a time-frequency domain method, local mean decomposition, and Shannon entropy is used to process the separated product functions, yielding the original features. Because the extracted features still contain superfluous information, a nonlinear multi-feature processing technique, kernel principal component analysis, is introduced to fuse them; the KPCA is improved by a weight factor. The extracted features are then input into a Morlet wavelet kernel support vector machine to obtain a classification model of the bearing running state, by which the running state is identified. Both test and field cases are analyzed.
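The Shannon-entropy step (turning each separated product function into a scalar feature) can be sketched as below. The histogram-based estimate and the bin count are assumptions for illustration, since the abstract does not specify the discretization; the toy "tone" and "noise" signals stand in for ordered and disordered product functions.

```python
import numpy as np

def shannon_entropy(x, bins=32):
    """Histogram-based Shannon entropy (bits) of a signal's amplitude distribution."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

t = np.linspace(0.0, 1.0, 4096)
pf_tone = np.sin(2.0 * np.pi * 50.0 * t)                        # ordered component
pf_noise = np.random.default_rng(0).uniform(-1.0, 1.0, t.size)  # disordered component
```

A disordered component spreads its amplitudes across many bins and so scores a higher entropy than an ordered one, which is what makes the entropy useful as a condition-monitoring feature.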

  10. Habitat features and predictive habitat modeling for the Colorado chipmunk in southern New Mexico

    Science.gov (United States)

    Rivieccio, M.; Thompson, B.C.; Gould, W.R.; Boykin, K.G.

    2003-01-01

    Two subspecies of Colorado chipmunk (state threatened and federal species of concern) occur in southern New Mexico: Tamias quadrivittatus australis in the Organ Mountains and T. q. oscuraensis in the Oscura Mountains. We developed a GIS model of potentially suitable habitat based on vegetation and elevation features, evaluated site classifications of the GIS model, and determined vegetation and terrain features associated with chipmunk occurrence. We compared GIS model classifications with actual vegetation and elevation features measured at 37 sites. At 60 sites we measured 18 habitat variables regarding slope, aspect, tree species, shrub species, and ground cover. We used logistic regression to analyze habitat variables associated with chipmunk presence/absence. All (100%) 37 sample sites (28 predicted suitable, 9 predicted unsuitable) were classified correctly by the GIS model regarding elevation and vegetation. For 28 sites predicted suitable by the GIS model, 18 sites (64%) appeared visually suitable based on habitat variables selected from logistic regression analyses, of which 10 sites (36%) were specifically predicted as suitable habitat via logistic regression. We detected chipmunks at 70% of sites deemed suitable via the logistic regression models. Shrub cover, tree density, plant proximity, presence of logs, and presence of rock outcrop were retained in the logistic model for the Oscura Mountains; litter, shrub cover, and grass cover were retained in the logistic model for the Organ Mountains. Evaluation of predictive models illustrates the need for multi-stage analyses to best judge performance. Microhabitat analyses indicate prospective needs for different management strategies between the subspecies. Sensitivities of each population of the Colorado chipmunk to natural and prescribed fire suggest that partial burnings of areas inhabited by Colorado chipmunks in southern New Mexico may be beneficial. These partial burnings may later help avoid a fire
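The presence/absence analysis described can be sketched with a small logistic-regression fit. The synthetic survey below (shrub cover and litter cover driving occurrence probability) is invented for illustration and is not the study's field data; the study retained more variables and used separate models per mountain range.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic presence/absence survey: occurrence probability rises with
# shrub cover and litter cover (coefficients are illustrative assumptions)
n = 200
shrub = rng.uniform(0.0, 1.0, n)
litter = rng.uniform(0.0, 1.0, n)
true_logit = -2.0 + 4.0 * shrub + 1.0 * litter
present = rng.uniform(0.0, 1.0, n) < 1.0 / (1.0 + np.exp(-true_logit))

# Logistic regression fit by gradient ascent on the log-likelihood
X = np.column_stack([np.ones(n), shrub, litter])
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (present - p) / n

accuracy = np.mean(((X @ w) > 0) == present)   # in-sample presence/absence accuracy
```

The sign and size of the fitted coefficients indicate which habitat variables are associated with occurrence, which is how variables such as shrub cover are retained in or dropped from the model.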

  11. Analyzing surface features on icy satellites using a new two-layer analogue model

    Science.gov (United States)

    Morales, K. M.; Leonard, E. J.; Pappalardo, R. T.; Yin, A.

    2017-12-01

    The appearance of similar surface morphologies across many icy satellites suggests potentially unified formation mechanisms. Constraining the processes that shape the surfaces of these icy worlds is fundamental to understanding their rheology and thermal evolution—factors that have implications for potential habitability. Analogue models have proven useful for investigating and quantifying surface structure formation on Earth, but have only been sparsely applied to icy bodies. In this study, we employ an innovative two-layer analogue model that simulates a warm, ductile ice layer overlain by brittle surface ice on satellites such as Europa and Enceladus. The top, brittle layer is composed of fine-grained sand while the ductile, lower viscosity layer is made of putty. These materials were chosen because they scale up reasonably to the conditions on Europa and Enceladus. Using this analogue model, we investigate the role of the ductile layer in forming contractional structures (e.g. folds) that would compensate for the over-abundance of extensional features observed on icy satellites. We do this by simulating different compressional scenarios in the analogue model and analyzing whether the resulting features resemble those on icy bodies. If the resulting structures are similar, then the model can be used to quantify the deformation by calculating strain. These values can then be scaled up to Europa or Enceladus and used to quantify the observed surface morphologies and the amount of extensional strain accommodated by certain features. This presentation will focus on the resulting surface morphologies and the calculated strain values from several analogue experiments. The methods and findings from this work can then be expanded and used to study other icy bodies, such as Triton, Miranda, Ariel, and Pluto.

  12. Key factors regulating the mass delivery of macromolecules to model cell membranes

    DEFF Research Database (Denmark)

    Campbell, Richard A.; Watkins, Erik B.; Jagalski, Vivien

    2014-01-01

    We show that both gravity and electrostatics are key factors regulating interactions between model cell membranes and self-assembled liquid crystalline aggregates of dendrimers and phospholipids. The system is a proxy for the trafficking of reservoirs of therapeutic drugs to cell membranes for slow...... of the aggregates to activate endocytosis pathways on specific cell types is discussed in the context of targeted drug delivery applications....

  13. A Modified Feature Selection and Artificial Neural Network-Based Day-Ahead Load Forecasting Model for a Smart Grid

    Directory of Open Access Journals (Sweden)

    Ashfaq Ahmad

    2015-12-01

    Full Text Available In the operation of a smart grid (SG), day-ahead load forecasting (DLF) is an important task. The SG can enhance the management of its conventional and renewable resources with a more accurate DLF model. However, DLF model development is highly challenging due to the non-linear characteristics of load time series in SGs. DLF models do exist in the literature; however, these models trade off between execution time and forecast accuracy. The newly proposed DLF model predicts the load of the next day accurately within a reasonable execution time. Our proposed model consists of three modules: the data preparation module, the feature selection module and the forecast module. The first module makes the historical load curve compatible with the feature selection module. The second module removes redundant and irrelevant features from the input data. The third module, which consists of an artificial neural network (ANN), predicts future load on the basis of the selected features. Moreover, the forecast module uses a sigmoid function for activation and a multi-variate auto-regressive model for weight updating during the training process. Simulations are conducted in MATLAB to validate the performance of the newly proposed DLF model in terms of accuracy and execution time. Results show that the proposed modified feature selection and modified ANN (mFS + mANN)-based model for SGs is able to capture the non-linearities in the historical load curve with 97.11% accuracy. Moreover, this accuracy is achieved at the cost of a reasonable execution time: the average execution time of the existing FS + ANN-based model is decreased by 38.50%.
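The pipeline of "select a few informative lag features, then train a sigmoid-activated ANN on them" can be sketched end-to-end on a synthetic load curve. Everything below (the toy load series, the two hand-picked lags, the tiny one-hidden-layer network) is an illustrative stand-in, not the paper's mFS module or its auto-regressive weight-update rule.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic hourly load curve (illustrative, not smart-grid data)
t = np.arange(400)
load = 0.5 + 0.3 * np.sin(2.0 * np.pi * t / 24.0) + 0.02 * rng.standard_normal(t.size)

# "Feature selection": keep two informative lags (24 h and 1 h), drop the rest
X = np.column_stack([load[:-24], load[23:-1]])   # lag-24 and lag-1
y = load[24:]
Xc = X - X.mean(axis=0)                          # centred inputs

# One hidden layer with sigmoid activation, trained by plain gradient descent
W1 = 0.5 * rng.standard_normal((2, 8))
b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal(8)
b2 = y.mean()
lr = 0.2
for _ in range(8000):
    h = sigmoid(Xc @ W1 + b1)                    # hidden activations
    err = h @ W2 + b2 - y                        # forecast error
    gh = np.outer(err, W2) * h * (1.0 - h)       # backpropagated error
    W2 -= lr * h.T @ err / len(y)
    b2 -= lr * err.mean()
    W1 -= lr * Xc.T @ gh / len(y)
    b1 -= lr * gh.mean(axis=0)

mae = np.abs(sigmoid(Xc @ W1 + b1) @ W2 + b2 - y).mean()
```

Dropping uninformative lags before training is what keeps the network small, which is the execution-time side of the accuracy/time trade-off the abstract discusses.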

  14. Independent screening for single-index hazard rate models with ultrahigh dimensional features

    DEFF Research Database (Denmark)

    Gorst-Rasmussen, Anders; Scheike, Thomas

    2013-01-01

    can be viewed as the natural survival equivalent of correlation screening. We state conditions under which the method admits the sure screening property within a class of single-index hazard rate models with ultrahigh dimensional features and describe the generally detrimental effect of censoring...

  15. Evaluation of Features, Events, and Processes (FEP) for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-10-09

    The purpose of this report is to document the evaluation of biosphere features, events, and processes (FEPs) that relate to the license application (LA) process as required by the U.S. Nuclear Regulatory Commission (NRC) regulations at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. The evaluation determines whether specific biosphere-related FEPs should be included or excluded from consideration in the Total System Performance Assessment (TSPA). This analysis documents the technical basis for screening decisions as required at 10 CFR 63.114 (d, e, and f) [DIRS 156605]. For FEPs that are included in the TSPA, this analysis provides a TSPA disposition, which summarizes how the FEP has been included and addressed in the TSPA model, and cites the analysis reports and model reports that provide the technical basis and description of its disposition. For FEPs that are excluded from the TSPA, this analysis report provides a screening argument, which identifies the basis for the screening decision (i.e., low probability, low consequence, or by regulation) and discusses the technical basis that supports that decision. In cases where a FEP covers multiple technical areas and is shared with other FEP analysis reports, this analysis may provide only a partial technical basis for the screening of the FEP. The full technical basis for these shared FEPs is addressed collectively by all FEP analysis reports that cover technical disciplines sharing a FEP. FEPs must be included in the TSPA unless they can be excluded by low probability, low consequence, or regulation. A FEP can be excluded from the TSPA by low probability per 10 CFR 63.114(d) [DIRS 156605], by showing that it has less than one chance in 10,000 of occurring over 10,000 years (or an approximately equivalent annualized probability of 10⁻⁸).
A FEP can be excluded from the TSPA by low consequence per 10 CFR 63.114 (e or f) [DIRS 156605], by showing that omitting the FEP would not significantly change the magnitude and

  16. Modeling Pathologic Response of Esophageal Cancer to Chemoradiation Therapy Using Spatial-Temporal 18F-FDG PET Features, Clinical Parameters, and Demographics

    International Nuclear Information System (INIS)

    Zhang, Hao; Tan, Shan; Chen, Wengen; Kligerman, Seth; Kim, Grace; D'Souza, Warren D.; Suntharalingam, Mohan; Lu, Wei

    2014-01-01

    Purpose: To construct predictive models using comprehensive tumor features for the evaluation of tumor response to neoadjuvant chemoradiation therapy (CRT) in patients with esophageal cancer. Methods and Materials: This study included 20 patients who underwent trimodality therapy (CRT + surgery) and underwent ¹⁸F-fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) both before and after CRT. Four groups of tumor features were examined: (1) conventional PET/CT response measures (eg, maximum standardized uptake value [SUVmax], tumor diameter); (2) clinical parameters (eg, TNM stage, histology) and demographics; (3) spatial-temporal PET features, which characterize tumor SUV intensity distribution, spatial patterns, geometry, and associated changes resulting from CRT; and (4) all features combined. An optimal feature set was identified with recursive feature selection and cross-validations. Support vector machine (SVM) and logistic regression (LR) models were constructed for prediction of pathologic tumor response to CRT, cross-validations being used to avoid model overfitting. Prediction accuracy was assessed by area under the receiver operating characteristic curve (AUC), and precision was evaluated by confidence intervals (CIs) of AUC. Results: When applied to the 4 groups of tumor features, the LR model achieved AUCs (95% CI) of 0.57 (0.10), 0.73 (0.07), 0.90 (0.06), and 0.90 (0.06). The SVM model achieved AUCs (95% CI) of 0.56 (0.07), 0.60 (0.06), 0.94 (0.02), and 1.00 (no misclassifications). With the use of spatial-temporal PET features combined with conventional PET/CT measures and clinical parameters, the SVM model achieved very high accuracy (AUC 1.00) and precision (no misclassifications)—results that were significantly better than when conventional PET/CT measures or clinical parameters and demographics alone were used.
For groups with many tumor features (groups 3 and 4), the SVM model achieved significantly higher accuracy than
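The AUC metric used throughout this record can be computed directly from classifier scores via the standard rank-sum (Mann-Whitney) identity; the sketch below is a generic implementation of that identity, not the study's code.

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    labels = np.asarray(labels)
    scores = np.asarray(scores, dtype=float)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # fraction of (positive, negative) pairs ranked correctly; ties count half
    greater = (pos[:, None] > neg[None, :]).sum()
    equal = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * equal) / (len(pos) * len(neg))
```

An AUC of 1.00, as reported for the SVM on feature groups 3 and 4, means every responder was scored above every non-responder.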

  17. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    International Nuclear Information System (INIS)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias; Zhang, Jie

    2017-01-01

    Highlights: • An ensemble model is developed to produce both deterministic and probabilistic wind forecasts. • A deep feature selection framework is developed to optimally determine the inputs to the forecasting methodology. • The developed ensemble methodology has improved the forecasting accuracy by up to 30%. - Abstract: With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and to generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks, its effectiveness being evaluated on 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves the forecasting accuracy by up to 30%.
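The two-layer (stacked) structure can be illustrated on a synthetic series: train distinct base forecasters, then fit a blender on their held-out predictions. The toy wind series, the two base models, and the data splits below are assumptions for illustration; the paper uses multiple machine learning algorithms in both layers plus a deep feature selection step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy wind-speed-like series (illustrative, not the Surface Radiation data):
# mean level + diurnal cycle + noise
t = np.arange(600)
wind = 6.0 + 2.0 * np.sin(2.0 * np.pi * t / 24.0) + rng.normal(0.0, 0.4, t.size)
X = np.column_stack([wind[:-1], np.sin(2.0 * np.pi * t[1:] / 24.0)])  # lag-1, cycle
y = wind[1:]

train, blend, test = slice(0, 300), slice(300, 450), slice(450, None)

def fit_linear(F, target):
    """Least-squares linear model; returns a prediction function."""
    A = np.column_stack([F, np.ones(len(F))])
    w, *_ = np.linalg.lstsq(A, target, rcond=None)
    return lambda Z: np.column_stack([Z, np.ones(len(Z))]) @ w

# Layer 1: two deliberately different base forecasters
m1 = fit_linear(X[train, :1], y[train])      # persistence-based model
m2 = fit_linear(X[train, 1:], y[train])      # diurnal-cycle model

# Layer 2: blend the base forecasts, trained on held-out data
P_blend = np.column_stack([m1(X[blend, :1]), m2(X[blend, 1:])])
blender = fit_linear(P_blend, y[blend])

P_test = np.column_stack([m1(X[test, :1]), m2(X[test, 1:])])
rmse_blend = np.sqrt(np.mean((blender(P_test) - y[test]) ** 2))
rmse_persist = np.sqrt(np.mean((m1(X[test, :1]) - y[test]) ** 2))
```

Fitting the blender on a split the base models never saw is what lets the second layer weight each model by its genuine out-of-sample skill rather than its training fit.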

  18. Modeling key processes causing climate change and variability

    Energy Technology Data Exchange (ETDEWEB)

    Henriksson, S.

    2013-09-01

    Greenhouse gas warming, internal climate variability and aerosol climate effects are studied, and the importance of understanding these key processes and being able to separate their influences on the climate is discussed. The aerosol-climate model ECHAM5-HAM and the COSMOS millennium model, consisting of atmospheric, ocean, carbon cycle and land-use models, are applied and the results compared to measurements. The topics in focus are climate sensitivity, quasiperiodic variability with a period of 50-80 years and variability at other timescales, climate effects due to aerosols over India, and climate effects of northern hemisphere mid- and high-latitude volcanic eruptions. The main findings of this work are (1) pointing out the remaining challenges in reducing climate sensitivity uncertainty from observational evidence, (2) estimates for the amplitude of a 50-80 year quasiperiodic oscillation in global mean temperature ranging from 0.03 K to 0.17 K and for its phase progression, as well as the synchronising effect of external forcing, (3) identifying a power-law shape S(f) ∝ f^(−α) for the spectrum of global mean temperature, with α ≈ 0.8 between multidecadal and El Niño timescales and a smaller exponent in modelled climate without external forcing, (4) separating aerosol properties and climate effects in India by season and location, (5) the more efficient dispersion of secondary sulfate aerosols than primary carbonaceous aerosols in the simulations, (6) an increase in monsoon rainfall in northern India due to aerosol light absorption and a probably larger decrease due to aerosol dimming effects, and (7) an estimate of mean maximum cooling of 0.19 K due to larger northern hemisphere mid- and high-latitude volcanic eruptions. The results could be applied or useful in better isolating the human-caused climate change signal, in studying the processes further and in more detail, in decadal climate prediction, in model evaluation and in emission policy
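The reported power-law spectrum S(f) ∝ f^(−α) can be illustrated with a short numerical sketch that synthesizes a series with a prescribed spectral slope and recovers α from a least-squares fit to the log-log periodogram. This is a generic technique, not the study's analysis code, and the series length and α value are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def power_law_series(n, alpha, rng):
    """Series whose power spectrum follows S(f) ∝ f^(-alpha) (random phases)."""
    f = np.fft.rfftfreq(n)[1:]
    amp = f ** (-alpha / 2.0)                  # amplitude ∝ sqrt(S(f))
    phase = rng.uniform(0.0, 2.0 * np.pi, f.size)
    spec = amp * np.exp(1j * phase)
    spec[-1] = amp[-1]                         # Nyquist bin must be real
    return np.fft.irfft(np.concatenate([[0.0], spec]), n)

def spectral_slope(x):
    """Exponent alpha from a least-squares fit to the log-log periodogram."""
    f = np.fft.rfftfreq(len(x))[1:]
    S = np.abs(np.fft.rfft(x)[1:]) ** 2
    slope, _ = np.polyfit(np.log(f), np.log(S), 1)
    return -slope

x = power_law_series(4096, 0.8, rng)
```

On real temperature series the periodogram is noisy, so the fit is typically restricted to the frequency band of interest (here, between multidecadal and El Niño timescales).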

  19. SIMPL Systems, or: Can We Design Cryptographic Hardware without Secret Key Information?

    Science.gov (United States)

    Rührmair, Ulrich

    This paper discusses a new cryptographic primitive termed SIMPL system. Roughly speaking, a SIMPL system is a special type of Physical Unclonable Function (PUF) which possesses a binary description that allows its (slow) public simulation and prediction. Besides this public-key-like functionality, SIMPL systems have another advantage: No secret information is, or needs to be, contained in SIMPL systems in order to enable cryptographic protocols - neither in the form of a standard binary key, nor as secret information hidden in random, analog features, as is the case for PUFs. The cryptographic security of SIMPLs instead rests on (i) a physical assumption on their unclonability, and (ii) a computational assumption regarding the complexity of simulating their output. This novel property makes SIMPL systems potentially immune against many known hardware and software attacks, including malware, side channel, invasive, or modeling attacks.

  20. Finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Features step-by-step examples based on actual data and connects fundamental mathematical modeling skills and decision making concepts to everyday applicability Featuring key linear programming, matrix, and probability concepts, Finite Mathematics: Models and Applications emphasizes cross-disciplinary applications that relate mathematics to everyday life. The book provides a unique combination of practical mathematical applications to illustrate the wide use of mathematics in fields ranging from business, economics, finance, management, operations research, and the life and social sciences.

  1. Abelian 2-form gauge theory: special features

    International Nuclear Information System (INIS)

    Malik, R P

    2003-01-01

    It is shown that the four (3 + 1)-dimensional (4D) free Abelian 2-form gauge theory provides an example of (i) a class of field theoretical models for the Hodge theory, and (ii) a possible candidate for the quasi-topological field theory (q-TFT). Despite many striking similarities with some of the key topological features of the two (1 + 1)-dimensional (2D) free Abelian (and self-interacting non-Abelian) gauge theories, it turns out that the 4D free Abelian 2-form gauge theory is not an exact TFT. To corroborate this conclusion, some of the key issues are discussed. In particular, it is shown that the (anti-)BRST and (anti-)co-BRST invariant quantities of the 4D 2-form Abelian gauge theory obey recursion relations that are reminiscent of the exact TFTs, but the Lagrangian density of this theory cannot be expressed as the sum of (anti-)BRST and (anti-)co-BRST exact quantities, as is the case with the topological 2D free Abelian (and self-interacting non-Abelian) gauge theories

  2. The 'OMITRON' and 'MODEL OMITRON' proposed experiments

    International Nuclear Information System (INIS)

    Sestero, A.

    1997-12-01

    In the present paper the main features of the proposed OMITRON and MODEL OMITRON high-field tokamaks are illustrated. Of the two, OMITRON is an ambitious experiment aimed at attaining plasma burning conditions. Its key physics issues are discussed, and a comparison is carried out with the corresponding physics features of ignition experiments such as IGNITOR and ITER. The chief asset, and chief challenge, of both OMITRON and MODEL OMITRON is the conspicuous 20 Tesla toroidal field on the plasma axis. The advanced engineering features that permit such toroidal magnet performance are discussed in convenient depth and detail. As for the small, propaedeutic device MODEL OMITRON, among its goals one must rank the purpose of testing in vivo key engineering issues that are vital for the larger and more expensive parent device. Beyond that, however, as indicated by scoping studies performed ad hoc, the smaller machine is found capable of a number of quite interesting physics investigations in its own right

  3. Bit-Oriented Quantum Public-Key Cryptosystem Based on Bell States

    Science.gov (United States)

    Wu, WanQing; Cai, QingYu; Zhang, HuanGuo; Liang, XiaoYan

    2018-06-01

    Quantum public key encryption systems provide information confidentiality using quantum mechanics. This paper presents a quantum public key cryptosystem (QPKC) based on the Bell states. By Holevo's theorem, the presented scheme provides security of the secret key through one-wayness during the QPKC, and the QPKC scheme is information-theoretically secure under chosen plaintext attack (CPA). Finally, some important features of the presented QPKC scheme are compared with those of other QPKC schemes.

  4. Modelling Creativity: Identifying Key Components through a Corpus-Based Approach.

    Science.gov (United States)

    Jordanous, Anna; Keller, Bill

    2016-01-01

    Creativity is a complex, multi-faceted concept encompassing a variety of related aspects, abilities, properties and behaviours. If we wish to study creativity scientifically, then a tractable and well-articulated model of creativity is required. Such a model would be of great value to researchers investigating the nature of creativity and in particular, those concerned with the evaluation of creative practice. This paper describes a unique approach to developing a suitable model of how creative behaviour emerges that is based on the words people use to describe the concept. Using techniques from the field of statistical natural language processing, we identify a collection of fourteen key components of creativity through an analysis of a corpus of academic papers on the topic. Words are identified which appear significantly often in connection with discussions of the concept. Using a measure of lexical similarity to help cluster these words, a number of distinct themes emerge, which collectively contribute to a comprehensive and multi-perspective model of creativity. The components provide an ontology of creativity: a set of building blocks which can be used to model creative practice in a variety of domains. The components have been employed in two case studies to evaluate the creativity of computational systems and have proven useful in articulating achievements of this work and directions for further research.
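The clustering step (grouping words that occur significantly often in discussions of creativity by their lexical similarity) can be sketched generically. The tiny word-by-document count matrix, the cosine-similarity measure, and the greedy threshold clustering below are invented for illustration and bear no relation to the paper's corpus or its actual similarity measure.

```python
import numpy as np

# Illustrative word-by-document counts (invented, not the paper's corpus)
words = ["novel", "original", "value", "useful", "skill"]
counts = np.array([
    [4, 3, 0, 1, 0],   # per-document frequencies for "novel"
    [3, 4, 1, 0, 0],   # "original"
    [0, 1, 5, 4, 1],   # "value"
    [1, 0, 4, 5, 1],   # "useful"
    [0, 0, 1, 1, 3],   # "skill"
], dtype=float)

# Cosine similarity between word count vectors
norms = np.linalg.norm(counts, axis=1, keepdims=True)
sim = (counts / norms) @ (counts / norms).T

# Greedy clustering: group words whose similarity exceeds a threshold
clusters = []
unassigned = set(range(len(words)))
while unassigned:
    seed = min(unassigned)
    group = {j for j in unassigned if sim[seed, j] > 0.7}
    clusters.append(sorted(words[j] for j in group))
    unassigned -= group
```

Each resulting cluster of co-occurring words is a candidate "component" of the concept, which is then named and interpreted by hand.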

  5. Cadmium-induced immune abnormality is a key pathogenic event in human and rat models of preeclampsia.

    Science.gov (United States)

    Zhang, Qiong; Huang, Yinping; Zhang, Keke; Huang, Yanjun; Yan, Yan; Wang, Fan; Wu, Jie; Wang, Xiao; Xu, Zhangye; Chen, Yongtao; Cheng, Xue; Li, Yong; Jiao, Jinyu; Ye, Duyun

    2016-11-01

    With increased industrial development, cadmium is an increasingly important environmental pollutant. Studies have identified various adverse effects of cadmium on human beings. However, the relationships between cadmium pollution and the pathogenesis of preeclampsia remain elusive. The objective of this study is to explore the effects of cadmium on the immune system of preeclamptic patients and rats. The results showed that the cadmium levels in the peripheral blood of preeclamptic patients were significantly higher than those observed in normal pregnancy. Based on this finding, a novel rat model of preeclampsia was established by the intraperitoneal administration of cadmium chloride (CdCl2) (0.125 mg of Cd/kg body weight) on gestational days 9-14. Key features of preeclampsia, including hypertension, proteinuria, placental abnormalities and small foetal size, appeared in pregnant rats after the administration of low-dose CdCl2. Cadmium increased immunoglobulin production, mainly angiotensin II type 1-receptor-agonistic autoantibodies (AT1-AA), by increasing the expression of activation-induced cytosine deaminase (AID) in B cells. AID is critical for the maturation of antibody and autoantibody responses. In addition, AT1-AA, which emerged recently as a potential pathogenic contributor to preeclampsia, was responsible for the deposition of complement component 5 (C5) in the kidneys of pregnant rats via angiotensin II type 1 receptor (AT1R) activation. C5a is a fragment of C5 that is released during C5 activation. Selectively interfering with C5a signalling with a complement C5a receptor-specific antagonist significantly attenuated hypertension and proteinuria in Cd-injected pregnant rats. Our results suggest that cadmium induces immune abnormalities that may be a key pathogenic contributor to preeclampsia and provide new insights into treatment strategies for preeclampsia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Key performance indicators in hospital based on balanced scorecard model

    Directory of Open Access Journals (Sweden)

    Hamed Rahimi

    2017-01-01

    Full Text Available Introduction: Performance measurement is attracting increasing attention all over the world. Nowadays, in many organizations, irrespective of their type or size, performance evaluation is the main concern and a key issue for top administrators. The purpose of this study is to develop suitable key performance indicators (KPIs) for hospital performance evaluation based on the balanced scorecard (BSC). Method: This is a mixed-method study. In order to identify the hospital performance indicators (HPIs), first the related literature was reviewed and then an experts' panel and the Delphi method were used. In this study, two rounds were needed to reach the desired level of consensus. The experts rated the importance of the indicators on a five-point Likert scale. In the consensus calculation, the consensus percentage was computed by classifying the values 1-3 as not important (0) and 4-5 as important (1). The simple additive weighting technique was used to rank the indicators and select the hospital KPIs. The data were analyzed with Excel 2010. Results: About 218 indicators were obtained from a review of the selected literature. Through an internal expert panel, 77 indicators were selected. Finally, 22 were selected as hospital KPIs: ten indicators in the internal process perspective, and 5, 4, and 3 indicators in the finance, learning and growth, and customer perspectives, respectively. Conclusion: This model can be a useful tool for evaluating and comparing the performance of hospitals. However, the model is flexible and can be adjusted according to differences among target hospitals. This study can be beneficial for hospital administrators and can help them change their perspective on performance evaluation.
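    The two-stage procedure described here (Delphi consensus screening followed by simple additive weighting) can be sketched roughly as follows. The indicator names, ratings, and the 60% consensus cut-off are hypothetical, not values from the study.

    ```python
    # Sketch of the two steps: Delphi consensus screening, then a simple
    # additive weighting (SAW) style ranking. All data is hypothetical.

    def consensus(ratings):
        """Share of experts rating an indicator as important (4-5 -> 1, 1-3 -> 0)."""
        votes = [1 if r >= 4 else 0 for r in ratings]
        return sum(votes) / len(votes)

    def saw_scores(ratings_by_indicator):
        """Rank indicators by their mean rating, normalised by the best mean
        (benefit-criterion normalisation, as in SAW)."""
        means = {k: sum(v) / len(v) for k, v in ratings_by_indicator.items()}
        top = max(means.values())
        return sorted(((means[k] / top, k) for k in means), reverse=True)

    panel = {
        "bed occupancy rate":     [5, 4, 4, 5, 3],
        "average length of stay": [4, 4, 5, 4, 4],
        "patient satisfaction":   [3, 2, 4, 3, 3],
    }
    # Keep only indicators that reach the (assumed) consensus threshold.
    kept = {k: v for k, v in panel.items() if consensus(v) >= 0.6}
    ranking = saw_scores(kept)
    ```

    In a full SAW application each indicator would be scored against several weighted criteria; the single-criterion version above just shows the normalise-and-rank mechanics.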

  7. Confirming the key role of Ar+ ion bombardment in the growth feature of nanostructured carbon materials by PECVD

    Science.gov (United States)

    Liu, Yulin; Lin, Jinghuang; Jia, Henan; Chen, Shulin; Qi, Junlei; Qu, Chaoqun; Cao, Jian; Feng, Jicai; Fei, Weidong

    2017-11-01

    In order to confirm the key role of Ar+ ion bombardment in the growth feature of nanostructured carbon materials (NCMs), here we report a novel strategy to create different Ar+ ion states in situ in plasma enhanced chemical vapor deposition (PECVD) by separating the catalyst film from the substrate. Different bombardment environments on either side of the catalyst film were created simultaneously to achieve multi-layered structural NCMs. Results showed that Ar+ ion bombardment is crucial and complex for the growth of NCMs. Firstly, Ar+ ion bombardment has both positive and negative effects on carbon nanotubes (CNTs). On one hand, Ar+ ions can break up the graphitic structure of CNTs and suppress thin CNT nucleation and growth. On the other hand, Ar+ ion bombardment can remove redundant carbon layers on the surface of large catalyst particles, which is essential for thick CNTs. As a result, the diameter of the CNTs depends on the Ar+ ion state. As for vertically oriented few-layer graphene (VFG), Ar+ ions are essential and can even convert the CNTs into VFG. Therefore, by combining with the catalyst separation method, specific or multi-layered structural NCMs can be obtained by PECVD simply by changing the intensity of Ar+ ion bombardment, and these special NCMs are promising in many fields.

  9. Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems

    Science.gov (United States)

    Igaki, Hiroshi; Nakamura, Masahide

    This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.

  10. Remote health monitoring: predicting outcome success based on contextual features for cardiovascular disease.

    Science.gov (United States)

    Alshurafa, Nabil; Eastwood, Jo-Ann; Pourhomayoun, Mohammad; Liu, Jason J; Sarrafzadeh, Majid

    2014-01-01

    Current studies have produced a plethora of remote health monitoring (RHM) systems designed to enhance the care of patients with chronic diseases. Many RHM systems are designed to improve patient risk factors for cardiovascular disease, including physiological parameters such as body mass index (BMI) and waist circumference, and lipid profiles such as low density lipoprotein (LDL) and high density lipoprotein (HDL). There are several patient characteristics that could be determining factors for a patient's RHM outcome success, but these characteristics have been largely unidentified. In this paper, we analyze results from an RHM system deployed in a six month Women's Heart Health study of 90 patients, and apply advanced feature selection and machine learning algorithms to identify patients' key baseline contextual features and build effective prediction models that help determine RHM outcome success. We introduce Wanda-CVD, a smartphone-based RHM system designed to help participants with cardiovascular disease risk factors by motivating participants through wireless coaching using feedback and prompts as social support. We analyze key contextual features that secure positive patient outcomes in both physiological parameters and lipid profiles. Results from the Women's Heart Health study show that health threat of heart disease, quality of life, family history, stress factors, social support, and anxiety at baseline all help predict patient RHM outcome success.
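    A minimal sketch of the general idea of ranking baseline contextual features by predictive value. The feature names, values, and the crude filter-style score below are invented for illustration; they are not the study's data or its feature-selection algorithm.

    ```python
    # Rank baseline features by how strongly they separate successful from
    # unsuccessful outcomes. Data and scoring rule are illustrative only.

    def mean(xs):
        return sum(xs) / len(xs)

    def separation_score(values, outcomes):
        """Absolute difference of feature means between outcome groups,
        scaled by the feature's range (a crude filter-style relevance score)."""
        pos = [v for v, o in zip(values, outcomes) if o == 1]
        neg = [v for v, o in zip(values, outcomes) if o == 0]
        spread = max(values) - min(values) or 1.0
        return abs(mean(pos) - mean(neg)) / spread

    # One value per participant; 1 = outcome success (hypothetical).
    outcomes = [1, 1, 0, 1, 0, 0]
    features = {
        "social_support": [8, 9, 3, 7, 4, 2],
        "anxiety":        [2, 3, 8, 4, 7, 9],
        "shoe_size":      [7, 6, 7, 6, 7, 6],   # deliberately uninformative
    }
    ranked = sorted(features,
                    key=lambda f: separation_score(features[f], outcomes),
                    reverse=True)
    ```

    A real pipeline would combine such filter scores with a wrapper search and cross-validated prediction models, but the ranking step above captures the core idea.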

  11. Unsupervised Feature Subset Selection

    DEFF Research Database (Denmark)

    Søndberg-Madsen, Nicolaj; Thomsen, C.; Pena, Jose

    2003-01-01

    This paper studies filter and hybrid filter-wrapper feature subset selection for unsupervised learning (data clustering). We constrain the search for the best feature subset by scoring the dependence of every feature on the rest of the features, conjecturing that these scores discriminate some irrelevant features. We report experimental results on artificial and real data for unsupervised learning of naive Bayes models. Both the filter and hybrid approaches perform satisfactorily.
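    The filter idea described above (scoring the dependence of every feature on the rest) might be sketched with a simple correlation-based score. The data and the 0.5 threshold are synthetic assumptions, not the paper's exact scoring function.

    ```python
    # Score each feature by its strongest absolute correlation with any
    # other feature; weakly dependent features become removal candidates.

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs) ** 0.5
        vy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (vx * vy) if vx and vy else 0.0

    def dependence_scores(columns):
        scores = {}
        for name, col in columns.items():
            others = [pearson(col, c) for n, c in columns.items() if n != name]
            scores[name] = max(abs(r) for r in others)
        return scores

    data = {
        "f1": [1, 2, 3, 4, 5, 6],
        "f2": [2, 4, 6, 8, 10, 12],   # strongly dependent on f1
        "noise": [5, 1, 4, 2, 6, 3],  # nearly independent of the rest
    }
    scores = dependence_scores(data)
    selected = [f for f, s in scores.items() if s > 0.5]
    ```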

  12. Brain Transcriptome Profiles in Mouse Model Simulating Features of Post-traumatic Stress Disorder

    Science.gov (United States)

    2015-02-28

    analyses of DEGs suggested possible roles in anxiety-related behavioral responses, synaptic plasticity, neurogenesis, inflammation, obesity... Behavioral evaluation of mouse model: We established [29] a rodent model manifesting PTSD-like behavioral features. We believe that, because the stressor... The hippocampus (HC) and medial prefrontal cortex (MPFC) play primary roles in fear learning and memory, and thus may contribute to the behavioral...

  13. Learning to Automatically Detect Features for Mobile Robots Using Second-Order Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Olivier Aycard

    2004-12-01

    Full Text Available In this paper, we propose a new method based on Hidden Markov Models to interpret temporal sequences of sensor data from mobile robots to automatically detect features. Hidden Markov Models have been used for a long time in pattern recognition, especially in speech recognition. Their main advantages over other methods (such as neural networks) are their ability to model noisy temporal signals of variable length. We show in this paper that this approach is well suited for interpretation of temporal sequences of mobile-robot sensor data. We present two distinct experiments and results: the first one in an indoor environment where a mobile robot learns to detect features like open doors or T-intersections, the second one in an outdoor environment where a different mobile robot has to identify situations like climbing a hill or crossing a rock.
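    The classification principle can be illustrated with a tiny first-order discrete HMM (the paper uses second-order models): an observation sequence is scored under one model per feature class via the forward algorithm, and the likeliest model wins. All states, observations, and probabilities below are invented.

    ```python
    # Score a sensor-observation sequence under competing HMMs and pick
    # the likeliest feature class. First-order toy; probabilities invented.

    def forward_likelihood(obs, init, trans, emit):
        """P(obs | model) computed by the forward algorithm."""
        n = len(init)
        alpha = [init[s] * emit[s][obs[0]] for s in range(n)]
        for o in obs[1:]:
            alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                     for s in range(n)]
        return sum(alpha)

    # Two hypothetical feature models over observations 0 = "wall", 1 = "gap".
    door_model = dict(init=[0.5, 0.5],
                      trans=[[0.6, 0.4], [0.4, 0.6]],
                      emit=[[0.2, 0.8], [0.8, 0.2]])
    corridor_model = dict(init=[0.5, 0.5],
                          trans=[[0.9, 0.1], [0.1, 0.9]],
                          emit=[[0.9, 0.1], [0.9, 0.1]])

    obs = [0, 1, 0, 1, 1]  # alternating wall/gap readings
    models = {"door": door_model, "corridor": corridor_model}
    best = max(models, key=lambda m: forward_likelihood(obs, **models[m]))
    ```

    A second-order HMM would condition transitions on the two previous states, but the score-and-compare structure is the same.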

  14. Integrated Phoneme Subspace Method for Speech Feature Extraction

    Directory of Open Access Journals (Sweden)

    Park Hyunsin

    2009-01-01

    Full Text Available Speech feature extraction has been a key focus in robust speech recognition research. In this work, we discuss data-driven linear feature transformations applied to feature vectors in the logarithmic mel-frequency filter bank domain. Transformations are based on principal component analysis (PCA), independent component analysis (ICA), and linear discriminant analysis (LDA). Furthermore, this paper introduces a new feature extraction technique that collects the correlation information among phoneme subspaces and reconstructs the feature space for representing phonemic information efficiently. The proposed speech feature vector is generated by projecting an observed vector onto an integrated phoneme subspace (IPS) based on PCA or ICA. The performance of the new feature was evaluated for isolated word speech recognition. The proposed method provided higher recognition accuracy than conventional methods in clean and reverberant environments.
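    The projection step can be sketched in miniature: centre the filter-bank-style vectors and project them onto the leading principal axis, here found by plain power iteration. The input frames are synthetic stand-ins, not real log mel filter-bank data.

    ```python
    # PCA-style transform: find the top principal axis of a set of vectors
    # by power iteration, then project each vector onto it. Toy data.

    def top_principal_axis(rows, iters=200):
        d = len(rows[0])
        means = [sum(r[j] for r in rows) / len(rows) for j in range(d)]
        centred = [[r[j] - means[j] for j in range(d)] for r in rows]
        # Scatter matrix (unnormalised covariance; the direction is the same).
        cov = [[sum(r[i] * r[j] for r in centred) for j in range(d)]
               for i in range(d)]
        v = [1.0] * d
        for _ in range(iters):
            w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
            norm = sum(x * x for x in w) ** 0.5
            v = [x / norm for x in w]
        return means, v

    def project(frame, means, axis):
        return sum((x - m) * a for x, m, a in zip(frame, means, axis))

    frames = [[1.0, 2.0, 0.1], [2.0, 4.1, 0.0], [3.0, 6.0, 0.2], [4.0, 8.1, 0.1]]
    means, axis = top_principal_axis(frames)
    coords = [project(f, means, axis) for f in frames]
    ```

    Real feature pipelines keep several leading components (and ICA/LDA use different criteria), but the centre-then-project mechanics are as above.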

  15. Key on demand (KoD) for software-defined optical networks secured by quantum key distribution (QKD).

    Science.gov (United States)

    Cao, Yuan; Zhao, Yongli; Colman-Meixner, Carlos; Yu, Xiaosong; Zhang, Jie

    2017-10-30

    Software-defined optical networking (SDON) will become the next generation optical network architecture. However, the optical layer and control layer of SDON are vulnerable to cyberattacks. Data encryption is an effective method to minimize the negative effects of cyberattacks, but secure key exchange is its major challenge, which can be addressed by the quantum key distribution (QKD) technique. Hence, in this paper we discuss the integration of QKD with WDM optical networks to secure the SDON architecture by introducing a novel key on demand (KoD) scheme which is enabled by a novel routing, wavelength and key assignment (RWKA) algorithm. The QKD over SDON with KoD model follows two steps to provide security: i) construction of quantum key pools (QKPs) for securing the control channels (CChs) and data channels (DChs); ii) the KoD scheme uses the RWKA algorithm to allocate and update secret keys for different security requirements. To test our model, we define a security probability index which measures the security gain in CChs and DChs. Simulation results indicate that the security performance of CChs and DChs can be enhanced by provisioning sufficient secret keys in QKPs and performing key updating considering potential cyberattacks. Also, KoD is beneficial to achieve a positive balance between security requirements and key resource usage.
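    The key-on-demand idea might be sketched as follows, with a pool buffering QKD-generated keys and channels refreshing their keys on demand at different periods. The pool contents, update periods, and class names are invented for illustration and do not reflect the paper's RWKA algorithm.

    ```python
    # Toy key-on-demand model: a quantum key pool (QKP) buffers secret
    # keys; channels draw a fresh key from it at their own update period.

    from collections import deque

    class QuantumKeyPool:
        def __init__(self, keys):
            self.pool = deque(keys)

        def key_on_demand(self):
            """Allocate one secret key, or None if the pool is exhausted."""
            return self.pool.popleft() if self.pool else None

        def refill(self, keys):
            self.pool.extend(keys)

    class Channel:
        def __init__(self, name, update_period, pool):
            self.name, self.period, self.pool = name, update_period, pool
            self.key = pool.key_on_demand()

        def tick(self, t):
            """Refresh the channel key every `period` time slots."""
            if self.key is None or t % self.period == 0:
                self.key = self.pool.key_on_demand() or self.key

    pool = QuantumKeyPool([f"k{i}" for i in range(8)])
    cch = Channel("control", update_period=2, pool=pool)  # tighter security
    dch = Channel("data", update_period=4, pool=pool)
    for t in range(1, 5):
        cch.tick(t)
        dch.tick(t)
    ```

    The higher-security control channel consumes keys faster, which is the trade-off between security requirements and key resource usage mentioned above.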

  16. Predictive features of persistent activity emergence in regular spiking and intrinsic bursting model neurons.

    Directory of Open Access Journals (Sweden)

    Kyriaki Sidiropoulou

    Full Text Available Proper functioning of working memory involves the expression of stimulus-selective persistent activity in pyramidal neurons of the prefrontal cortex (PFC), which refers to neural activity that persists for seconds beyond the end of the stimulus. The mechanisms which PFC pyramidal neurons use to discriminate between preferred vs. neutral inputs at the cellular level are largely unknown. Moreover, the presence of pyramidal cell subtypes with different firing patterns, such as regular spiking and intrinsic bursting, raises the question as to what their distinct role might be in persistent firing in the PFC. Here, we use a compartmental modeling approach to search for discriminatory features in the properties of incoming stimuli to a PFC pyramidal neuron and/or its response that signal which of these stimuli will result in persistent activity emergence. Furthermore, we use our modeling approach to study cell-type specific differences in persistent activity properties, via implementing a regular spiking (RS) and an intrinsic bursting (IB) model neuron. We identify synaptic location within the basal dendrites as a feature of stimulus selectivity. Specifically, persistent activity-inducing stimuli consist of activated synapses that are located more distally from the soma compared to non-inducing stimuli, in both model cells. In addition, the action potential (AP) latency and the first few inter-spike intervals of the neuronal response can be used to reliably detect inducing vs. non-inducing inputs, suggesting a potential mechanism by which downstream neurons can rapidly decode the upcoming emergence of persistent activity. While the two model neurons did not differ in the coding features of persistent activity emergence, the properties of persistent activity, such as the firing pattern and the duration of temporally-restricted persistent activity, were distinct. Collectively, our results pinpoint specific features of the neuronal response to a given

  17. Competences and knowledge: Key-factors in the smart city of the future

    Directory of Open Access Journals (Sweden)

    Saverio Salerno

    2014-12-01

    Full Text Available The effective and modern management of competence development, which represents a distinguishing key-factor in future Smart Cities, cannot be limited to Learning Management exclusively, but must rather include aspects pertaining to Human Capital and Performance Management in a holistic vision that encompasses not only the sphere of operations but also the tactical and strategic levels. In particular, organizations need solutions that integrate Learning Management, Performance Management, and Human Resource Management (HRM). We propose an approach that considers competences as key-factors in the management and valorization of Human Capital, making use of a socio-constructivist learning model based on the explicit (ontological) modeling of domain competences as well as a learner- and didactic-oriented approach. Unlike most current solutions, which are far from the proposed vision and concentrate on specific functionalities rather than on the processes as a whole, the solution offered by MOMA, a spin-off of the Research Group of the University of Salerno led by Prof. Salerno, is here presented as a demonstrative case of the proposed methodology and approach. A distinctive feature of our proposal, supported by the MOMA solution, is the adoption of semantic technologies, which for instance allow for the discovery of unpredictable paths linking resources in the Knowledge Graph. Finally, we discuss how this framework can be applied in the context of the Smart Cities of the future, taking advantage of the features, enabled especially by semantics, of researching, creating, combining, delivering and using in a creative manner the resources of superior quality offered by Smart Cities.

  18. Valuing snorkeling visits to the Florida Keys with stated and revealed preference models

    Science.gov (United States)

    Timothy Park; J. Michael Bowker; Vernon R. Leeworthy

    2002-01-01

    Coastal coral reefs, especially in the Florida Keys, are declining at a disturbing rate. Marine ecologists and reef scientists have emphasized the importance of establishing nonmarket values of coral reefs to assess the cost effectiveness of coral reef management and remediation programs. The purpose of this paper is to develop a travel cost--contingent valuation model...

  19. Multimodal Feature Learning for Video Captioning

    Directory of Open Access Journals (Sweden)

    Sujin Lee

    2018-01-01

    Full Text Available Video captioning refers to the task of generating a natural language sentence that explains the content of the input video clips. This study proposes a deep neural network model for effective video captioning. Apart from visual features, the proposed model additionally learns semantic features that describe the video content effectively. In our model, visual features of the input video are extracted using convolutional neural networks such as C3D and ResNet, while semantic features are obtained using recurrent neural networks such as LSTM. In addition, our model includes an attention-based caption generation network to generate the correct natural language captions based on the multimodal video feature sequences. Various experiments, conducted with the two large benchmark datasets, Microsoft Video Description (MSVD) and Microsoft Research Video-to-Text (MSR-VTT), demonstrate the performance of the proposed model.

  20. Genomic Feature Models

    DEFF Research Database (Denmark)

    Sørensen, Peter; Edwards, Stefan McKinnon; Rohde, Palle Duun

    Whole-genome sequences and multiple trait phenotypes from large numbers of individuals will soon be available in many populations. Well established statistical modeling approaches enable the genetic analyses of complex trait phenotypes while accounting for a variety of additive and non-additive genetic mechanisms. These modeling approaches have proven to be highly useful to determine population genetic parameters as well as prediction of genetic risk or value. We present a series of statistical modelling approaches that use prior biological information for evaluating the collective action of genetic variants in genomic features (such as genomic regions and gene ontologies) that provide better model fit and increase predictive ability of the statistical model for this trait.

  1. Quantum key distribution without alternative measurements

    CERN Document Server

    Cabello, A

    2000-01-01

    Entanglement swapping between Einstein-Podolsky-Rosen (EPR) pairs can be used to generate the same sequence of random bits in two remote places. A quantum key distribution protocol based on this idea is described. The scheme exhibits the following features. (a) It does not require that Alice and Bob choose between alternative measurements, thereby improving the rate of generated bits per transmitted qubit. (b) It allows Alice and Bob to generate a key of arbitrary length using a single quantum system (three EPR pairs), instead of a long sequence of them. (c) Detecting Eve requires the comparison of fewer bits. (d) Entanglement is an essential ingredient. The scheme assumes reliable measurements of the Bell operator. (20 refs).

  2. A touch-probe path generation method through similarity analysis between the feature vectors in new and old models

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Hye Sung; Lee, Jin Won; Yang, Jeong Sam [Dept. of Industrial Engineering, Ajou University, Suwon (Korea, Republic of)

    2016-10-15

    The On-machine measurement (OMM), which measures a work piece during or after the machining process in the machining center, has the advantage of measuring the work piece directly within the work space without moving it. However, the path generation procedure used to determine the measuring sequence and variables for the complex features of a target work piece has the limitation of requiring time-consuming tasks to generate the measuring points and mostly relies on the proficiency of the on-site engineer. In this study, we propose a touch-probe path generation method using similarity analysis between the feature vectors of three-dimensional (3-D) shapes for the OMM. For the similarity analysis between a new 3-D model and existing 3-D models, we extracted the feature vectors from models that can describe the characteristics of a geometric shape model; then, we applied those feature vectors to a geometric histogram that displays a probability distribution obtained by the similarity analysis algorithm. In addition, we developed a computer-aided inspection planning system that corrects non-applied measuring points that are caused by minute geometry differences between the two models and generates the final touch-probe path.

  3. Formal Analysis of Key Integrity in PKCS#11

    Science.gov (United States)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.
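    An abstract sketch (not actual PKCS#11 calls) of the trusted key mechanism described here: trusted keys cannot be overwritten, and sensitive operations accept trusted keys only, so a key-replacement attempt fails loudly instead of being silently honoured.

    ```python
    # Toy token model of a trusted-key policy. Class and method names are
    # invented for illustration; this is not the PKCS#11 API.

    class Token:
        def __init__(self):
            self.keys = {}       # handle -> key material
            self.trusted = set()

        def store(self, handle, key, trusted=False):
            if handle in self.trusted:
                raise PermissionError("trusted keys cannot be overwritten")
            self.keys[handle] = key
            if trusted:
                self.trusted.add(handle)

        def wrap(self, payload, handle):
            # Sensitive operations require a trusted, non-tampered key.
            if handle not in self.trusted:
                raise PermissionError("sensitive operation needs a trusted key")
            return f"wrap({payload}, {self.keys[handle]})"

    tok = Token()
    tok.store("k-master", "GOODKEY", trusted=True)

    attack_blocked = False
    try:
        tok.store("k-master", "EVILKEY")  # attacker attempts key replacement
    except PermissionError:
        attack_blocked = True
    ```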

  4. Coding of visual object features and feature conjunctions in the human brain.

    Science.gov (United States)

    Martinovic, Jasna; Gruber, Thomas; Müller, Matthias M

    2008-01-01

    Object recognition is achieved through neural mechanisms reliant on the activity of distributed coordinated neural assemblies. In the initial steps of this process, an object's features are thought to be coded very rapidly in distinct neural assemblies. These features play different functional roles in the recognition process--while colour facilitates recognition, additional contours and edges delay it. Here, we selectively varied the amount and role of object features in an entry-level categorization paradigm and related them to the electrical activity of the human brain. We found that early synchronizations (approx. 100 ms) increased quantitatively when more image features had to be coded, without reflecting their qualitative contribution to the recognition process. Later activity (approx. 200-400 ms) was modulated by the representational role of object features. These findings demonstrate that although early synchronizations may be sufficient for relatively crude discrimination of objects in visual scenes, they cannot support entry-level categorization. This was subserved by later processes of object model selection, which utilized the representational value of object features such as colour or edges to select the appropriate model and achieve identification.

  5. The mechanisms of feature inheritance as predicted by a systems-level model of visual attention and decision making.

    Science.gov (United States)

    Hamker, Fred H

    2008-07-15

    Feature inheritance provides evidence that properties of an invisible target stimulus can be attached to a following mask. We apply a systems-level model of attention and decision making to explore the influence of memory and feedback connections in feature inheritance. We find that the presence of feedback loops alone is sufficient to account for feature inheritance. Although our simulations do not cover all experimental variations and focus only on the general principle, our result appears of specific interest since the model was designed for a completely different purpose than to explain feature inheritance. We suggest that feedback is an important property in visual perception and provide a description of its mechanism and its role in perception.

  6. FISHRENT; Bio-economic simulation and optimisation model

    NARCIS (Netherlands)

    Salz, P.; Buisman, F.C.; Soma, K.; Frost, H.; Accadia, P.; Prellezo, R.

    2011-01-01

    Key findings: The FISHRENT model is a major step forward in bio-economic modelling, combining features that have not been fully integrated in earlier models: 1- Incorporation of any number of species (or stocks) and/or fleets 2- Integration of simulation and optimisation over a period of 25 years 3-

  7. Primordial power spectrum features and consequences

    Science.gov (United States)

    Goswami, G.

    2014-03-01

    The present Cosmic Microwave Background (CMB) temperature and polarization anisotropy data is consistent with not only a power law scalar primordial power spectrum (PPS) with a small running but also with the scalar PPS having very sharp features. This has motivated inflationary models with such sharp features. Recently, even the possibility of having nulls in the power spectrum (at certain scales) has been considered. The existence of these nulls has been shown in linear perturbation theory. What shall be the effect of higher order corrections on such nulls? Inspired by this question, we have attempted to calculate quantum radiative corrections to the Fourier transform of the 2-point function in a toy field theory and address the issue of how these corrections to the power spectrum behave in models in which the tree-level power spectrum has a sharp dip (but not a null). In particular, we have considered the possibility of the relative enhancement of radiative corrections in a model in which the tree-level spectrum goes through a dip in power at a certain scale. The mode functions of the field (whose power spectrum is to be evaluated) are chosen such that they undergo the kind of dynamics that leads to a sharp dip in the tree level power spectrum. Next, we have considered the situation in which this field has quartic self interactions, and found one loop correction in a suitably chosen renormalization scheme. Thus, we have attempted to answer the following key question in the context of this toy model (which is as important in the realistic case): In the chosen renormalization scheme, can quantum radiative corrections be enhanced relative to tree-level power spectrum at scales, at which sharp dips appear in the tree-level spectrum?

  8. Advancing Affect Modeling via Preference Learning and Unsupervised Feature Extraction

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez

    ...difficulties, ordinal reports such as rankings and ratings can yield more reliable affect annotations than alternative tools. This thesis explores preference learning methods to automatically learn computational models from ordinal annotations of affect. In particular, an extensive collection of training strategies (error functions and training algorithms) for artificial neural networks is examined across synthetic and psycho-physiological datasets, and compared against support vector machines and Cohen's method. Results reveal the best training strategies for neural networks and suggest their superiority over the other examined methods. The second challenge addressed in this thesis refers to the extraction of relevant information from physiological modalities. Deep learning is proposed as an automatic approach to extract input features for models of affect from physiological signals. Experiments...

  9. Preliminary safety analysis for key design features of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

    KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R and D program. In this report, descriptions of the KALIMER safety design features and safety analysis results for selected ATWS accidents are presented. First, the basic approach to achieve the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome are compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced in chapter 5 to investigate the core kinetics and hydraulic behavior during an HCDA. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  10. Selecting a climate model subset to optimise key ensemble properties

    Directory of Open Access Journals (Sweden)

    N. Herger

    2018-02-01

    Full Text Available End users studying impacts and risks caused by human-induced climate change are often presented with large multi-model ensembles of climate projections whose composition and size are arbitrarily determined. An efficient and versatile method that finds a subset which maintains certain key properties from the full ensemble is needed, but very little work has been done in this area. Therefore, users typically make their own somewhat subjective subset choices and commonly use the equally weighted model mean as a best estimate. However, different climate model simulations cannot necessarily be regarded as independent estimates due to the presence of duplicated code and shared development history. Here, we present an efficient and flexible tool that makes better use of the ensemble as a whole by finding a subset with improved mean performance compared to the multi-model mean while at the same time maintaining the spread and addressing the problem of model interdependence. Out-of-sample skill and reliability are demonstrated using model-as-truth experiments. This approach is illustrated with one set of optimisation criteria but we also highlight the flexibility of cost functions, depending on the focus of different users. The technique is useful for a range of applications that, for example, minimise present-day bias to obtain an accurate ensemble mean, reduce dependence in ensemble spread, maximise future spread, ensure good performance of individual models in an ensemble, reduce the ensemble size while maintaining important ensemble characteristics, or optimise several of these at the same time. As in any calibration exercise, the final ensemble is sensitive to the metric, observational product, and pre-processing steps used.
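
    The subset-selection idea described above can be sketched with a toy cost function: minimise the bias of the subset mean against an observation while penalising loss of ensemble spread. This is an illustrative simplification (scalar "models", invented numbers, exhaustive search), not the optimisation tool the paper presents.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 "models", each reduced to a scalar climatology,
# plus one observed value; all numbers are illustrative only.
models = rng.normal(loc=1.0, scale=0.5, size=8)
obs = 0.9
full_spread = models.std()

def cost(subset, alpha=1.0):
    """Bias of the subset mean plus a penalty for losing ensemble spread."""
    sub = models[list(subset)]
    bias = abs(sub.mean() - obs)
    spread_term = abs(sub.std() - full_spread)
    return bias + alpha * spread_term

# Exhaustive search over all subsets of size k (feasible for small
# ensembles; larger problems need a dedicated optimiser as in the paper).
k = 4
best = min(combinations(range(len(models)), k), key=cost)
print("selected members:", best, "cost:", round(cost(best), 4))
```

    Changing `alpha`, or adding terms for model interdependence, mirrors the paper's point that the final subset is sensitive to the chosen cost function.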

  11. Selecting a climate model subset to optimise key ensemble properties

    Science.gov (United States)

    Herger, Nadja; Abramowitz, Gab; Knutti, Reto; Angélil, Oliver; Lehmann, Karsten; Sanderson, Benjamin M.

    2018-02-01

    End users studying impacts and risks caused by human-induced climate change are often presented with large multi-model ensembles of climate projections whose composition and size are arbitrarily determined. An efficient and versatile method that finds a subset which maintains certain key properties from the full ensemble is needed, but very little work has been done in this area. Therefore, users typically make their own somewhat subjective subset choices and commonly use the equally weighted model mean as a best estimate. However, different climate model simulations cannot necessarily be regarded as independent estimates due to the presence of duplicated code and shared development history. Here, we present an efficient and flexible tool that makes better use of the ensemble as a whole by finding a subset with improved mean performance compared to the multi-model mean while at the same time maintaining the spread and addressing the problem of model interdependence. Out-of-sample skill and reliability are demonstrated using model-as-truth experiments. This approach is illustrated with one set of optimisation criteria but we also highlight the flexibility of cost functions, depending on the focus of different users. The technique is useful for a range of applications that, for example, minimise present-day bias to obtain an accurate ensemble mean, reduce dependence in ensemble spread, maximise future spread, ensure good performance of individual models in an ensemble, reduce the ensemble size while maintaining important ensemble characteristics, or optimise several of these at the same time. As in any calibration exercise, the final ensemble is sensitive to the metric, observational product, and pre-processing steps used.

  12. Fixed versus mixed RSA: Explaining visual representations by fixed and mixed feature sets from shallow and deep computational models.

    Science.gov (United States)

    Khaligh-Razavi, Seyed-Mahdi; Henriksson, Linda; Kay, Kendrick; Kriegeskorte, Nikolaus

    2017-02-01

    Studies of the primate visual system have begun to test a wide range of complex computational object-vision models. Realistic models have many parameters, which in practice cannot be fitted using the limited amounts of brain-activity data typically available. Task performance optimization (e.g. using backpropagation to train neural networks) provides major constraints for fitting parameters and discovering nonlinear representational features appropriate for the task (e.g. object classification). Model representations can be compared to brain representations in terms of the representational dissimilarities they predict for an image set. This method, called representational similarity analysis (RSA), enables us to test the representational feature space as is (fixed RSA) or to fit a linear transformation that mixes the nonlinear model features so as to best explain a cortical area's representational space (mixed RSA). Like voxel/population-receptive-field modelling, mixed RSA uses a training set (different stimuli) to fit one weight per model feature and response channel (voxels here), so as to best predict the response profile across images for each response channel. We analysed response patterns elicited by natural images, which were measured with functional magnetic resonance imaging (fMRI). We found that early visual areas were best accounted for by shallow models, such as a Gabor wavelet pyramid (GWP). The GWP model performed similarly with and without mixing, suggesting that the original features already approximated the representational space, obviating the need for mixing. However, a higher ventral-stream visual representation (lateral occipital region) was best explained by the higher layers of a deep convolutional network and mixing of its feature set was essential for this model to explain the representation. We suspect that mixing was essential because the convolutional network had been trained to discriminate a set of 1000 categories, whose frequencies

  13. Different developmental trajectories across feature types support a dynamic field model of visual working memory development.

    Science.gov (United States)

    Simmering, Vanessa R; Miller, Hilary E; Bohache, Kevin

    2015-05-01

    Research on visual working memory has focused on characterizing the nature of capacity limits as "slots" or "resources" based almost exclusively on adults' performance with little consideration for developmental change. Here we argue that understanding how visual working memory develops can shed new light onto the nature of representations. We present an alternative model, the Dynamic Field Theory (DFT), which can capture effects that have been previously attributed either to "slot" or "resource" explanations. The DFT includes a specific developmental mechanism to account for improvements in both resolution and capacity of visual working memory throughout childhood. Here we show how development in the DFT can account for different capacity estimates across feature types (i.e., color and shape). The current paper tests this account by comparing children's (3, 5, and 7 years of age) performance across different feature types. Results showed that capacity for colors increased faster over development than capacity for shapes. A second experiment confirmed this difference across feature types within subjects, but also showed that the difference can be attenuated by testing memory for less familiar colors. Model simulations demonstrate how developmental changes in connectivity within the model-purportedly arising through experience-can capture differences across feature types.

  14. A Key Factor of the DCF Model Coherency

    Directory of Open Access Journals (Sweden)

    Piotr Adamczyk

    2017-04-01

    Full Text Available Aim/purpose - The aim of this paper is to provide economically justified evidence that the business value calculated by income valuation methods is the same regardless of the type of cash flow used in the valuation algorithm. Design/methodology/approach - The evidence was obtained using free cash flow to equity (FCFE), debt (FCFD) and firm (FCFF). The article draws attention to the FCFF method's particular popularity in income valuation, based on analysts' practice. It gives an overview of various approaches to determining the capital structure in the formula for WACC, both in practice and in theory. Finally, it examines an empirical example with the authors' own derivations and postulates. Findings - The conclusion drawn from the analysis is that the key to the reconciliation process, and thus to DCF model coherency, is to apply the appropriate method of capital structure estimation when calculating the weighted average cost of capital (WACC). This capital structure is henceforth referred to as 'income weights'. Research implications/limitations - It should be noted that the obtained agreement of valuation results does not imply that income valuation becomes an objective way of determining business value; it remains subjective. Originality/value/contribution - According to the presented approach, the DCF model's subjectivism is limited to the forecasts. The rest is an algorithm which, based on the principles of mathematics, should be used in the same way in every situation.
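
    The coherency claim can be checked numerically in the simplest setting, a constant perpetuity: when the WACC weights are the market values implied by the equity-side valuation, the FCFE and FCFF routes give identical firm values. All figures below are illustrative assumptions, not taken from the paper.

```python
# Perpetuity sketch of DCF coherency: equity-side (FCFE) and entity-side
# (FCFF) valuations agree when WACC uses market-value ("income") weights.
fcff = 100.0   # free cash flow to firm, constant perpetuity
kd = 0.06      # pre-tax cost of debt
ke = 0.12      # cost of equity
tax = 0.25
debt = 300.0   # market value of debt (assumed constant)

# Equity route: FCFE = FCFF minus after-tax interest; discount at ke.
fcfe = fcff - kd * (1 - tax) * debt
equity = fcfe / ke
value_equity_route = equity + debt

# Entity route: discount FCFF at a WACC built from the same market values.
v = value_equity_route
wacc = (equity / v) * ke + (debt / v) * kd * (1 - tax)
value_entity_route = fcff / wacc

print(round(value_equity_route, 2), round(value_entity_route, 2))
```

    With book-value or target weights in the WACC instead, the two routes generally diverge, which is the incoherency the paper's 'income weights' resolve.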

  15. Business models of sharing economy companies : exploring features responsible for sharing economy companies’ internationalization

    OpenAIRE

    Kosintceva, Aleksandra

    2016-01-01

    This paper is dedicated to the sharing economy business models and their features responsible for internationalization. The study proposes derived definitions for the concepts of “sharing economy” and “business model” and first generic sharing economy business models typology. The typology was created through the qualitative analysis of secondary data on twenty sharing economy companies from nine different industries. The outlined categories of sharing economy business models a...

  16. Research on oral test modeling based on multi-feature fusion

    Science.gov (United States)

    Shi, Yuliang; Tao, Yiyue; Lei, Jun

    2018-04-01

    In this paper, the spectrum of the speech signal is taken as the input for feature extraction. The strength of PCNN in image segmentation and related processing is used to process the speech spectrogram and extract features, exploring a new method that combines speech signal processing with image processing. In addition to the spectrogram features, MFCC-based spectral features are extracted and fused with them to further improve the accuracy of spoken-language recognition. Because the fused input features are complex and discriminative, a Support Vector Machine (SVM) is used to construct the classifier, and the extracted test voice features are then compared with standard voice features to detect whether the speech meets the spoken standard. Experiments show that extracting features from spectrograms using PCNN is feasible, and that fusing image features with spectral features improves detection accuracy.
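
    The fusion step described above amounts to concatenating the two feature sets before classification. A minimal sketch with scikit-learn, using synthetic stand-ins for both the PCNN-derived spectrogram features and the MFCC statistics (the data, dimensions, and class separation are invented for illustration):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Stand-in features: rows are utterances; the first block imitates
# spectrogram-derived (PCNN) features, the second MFCC statistics.
# Two well-separated synthetic classes, purely for illustration.
n = 40
pcnn_feats = np.vstack([rng.normal(0, 1, (n, 5)), rng.normal(4, 1, (n, 5))])
mfcc_feats = np.vstack([rng.normal(0, 1, (n, 13)), rng.normal(4, 1, (n, 13))])
labels = np.array([0] * n + [1] * n)

# Fusion by simple concatenation, then an SVM classifier.
fused = np.hstack([pcnn_feats, mfcc_feats])
clf = SVC(kernel="rbf").fit(fused, labels)
acc = clf.score(fused, labels)
print("training accuracy:", acc)
```

    In practice the two feature blocks would need scaling to comparable ranges before concatenation, since SVM kernels are sensitive to feature magnitudes.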

  17. Key aspects of stratospheric tracer modeling using assimilated winds

    Directory of Open Access Journals (Sweden)

    B. Bregman

    2006-01-01

    Full Text Available This study describes key aspects of global chemistry-transport models and their impact on stratospheric tracer transport. We concentrate on global models that use assimilated winds from numerical weather predictions, but the results also apply to tracer transport in general circulation models. We examined grid resolution, numerical diffusion, air parcel dispersion, the wind or mass flux update frequency, and time interpolation. The evaluation is performed with assimilated meteorology from the "operational analyses or operational data" (OD) of the European Centre for Medium-Range Weather Forecasts (ECMWF). We also show the effect of the mass flux update frequency using the ECMWF 40-year re-analyses (ERA40). We applied the three-dimensional chemistry-transport Tracer Model version 5 (TM5) and a trajectory model and performed several diagnoses focusing on different transport regimes. Covering different time and spatial scales, we examined (1) polar vortex dynamics during the Arctic winter, (2) the large-scale stratospheric meridional circulation, and (3) air parcel dispersion in the tropical lower stratosphere. Tracer distributions inside the Arctic polar vortex show considerably worse agreement with observations when the model grid resolution in the polar region is reduced to avoid numerical instability. The results are sensitive to the diffusivity of the advection. Nevertheless, the use of a computationally cheaper but diffusive advection scheme is feasible for tracer transport when the horizontal grid resolution is equal to or smaller than 1 degree. The use of time-interpolated winds improves the tracer distributions, particularly in the middle and upper stratosphere. Considerable improvement is found both in the large-scale tracer distribution and in the polar regions when the update frequency of the assimilated winds is increased from 6 to 3 h. It considerably reduces the vertical dispersion of air parcels in the tropical lower stratosphere. Strong

  18. The Assessment of Patient Clinical Outcome: Advantages, Models, Features of an Ideal Model

    Directory of Open Access Journals (Sweden)

    Mou’ath Hourani

    2016-06-01

    Full Text Available Background: The assessment of patient clinical outcome focuses on measuring various aspects of the health status of a patient who is under healthcare intervention. Patient clinical outcome assessment is a very significant process in the clinical field, as it allows health care professionals to better understand the effectiveness of their health care programs and thus to enhance health care quality in general. It is thus vital that a high-quality, informative review of current issues regarding the assessment of patient clinical outcome be conducted. Aims & Objectives: This paper 1) summarizes the advantages of the assessment of patient clinical outcome; 2) reviews some of the existing patient clinical outcome assessment models, namely simulation, Markov, Bayesian belief networks, Bayesian statistics and conventional statistics, and Kaplan-Meier analysis models; and 3) demonstrates the desired features that should be fulfilled by a well-established ideal patient clinical outcome assessment model. Material & Methods: An integrative review of the literature was performed using Google Scholar to explore the field of patient clinical outcome assessment. Conclusion: This paper will directly support researchers, clinicians and health care professionals in their understanding of developments in the domain of the assessment of patient clinical outcome, thus enabling them to propose ideal assessment models.

  20. A key for the identification of the tintinnoinea of the mediterranean sea

    International Nuclear Information System (INIS)

    Rampi, L.; Zattera, A.

    1982-01-01

    A key for the identification of Tintinnoinea is presented. The key's main application will be in the fields of production studies, species succession, and alteration of the species composition caused by pollution of various origins. Each species described is accompanied by an appropriate drawing, either original or taken from a published source, arranged in 45 plates. The key consists of two parts: one is a key to genera, and the other a key to the species of Mediterranean pelagic Tintinnoinea. The key is preceded by an introduction to the general morphology of Tintinnoinea and their principal morphological features. This introduction is accompanied by 1 plate

  1. Computational intelligence models to predict porosity of tablets using minimum features

    Directory of Open Access Journals (Sweden)

    Khalid MH

    2017-01-01

    behavior when presented with a challenging external validation data set (best achieved by symbolic regression: NRMSE = 3%). Symbolic regression demonstrates the transition from the black-box modeling paradigm to more transparent predictive models. Predictive performance and feature selection behavior of CI models hint at the most important variables within this factor space. Keywords: computational intelligence, artificial neural network, symbolic regression, feature selection, die compaction, porosity

  2. Obscene Video Recognition Using Fuzzy SVM and New Sets of Features

    Directory of Open Access Journals (Sweden)

    Alireza Behrad

    2013-02-01

    Full Text Available In this paper, a novel approach for identifying normal and obscene videos is proposed. In order to classify different episodes of a video independently and avoid the need to process all frames, key frames are first extracted and skin regions are detected for groups of video frames starting with the key frames. In the second step, three different kinds of features are extracted for each episode of the video: 1) structural features based on single-frame information, 2) features based on the spatiotemporal volume, and 3) motion-based features. The PCA-LDA method is then applied to reduce the size of the structural features and select the more distinctive ones. In the final step, a fuzzy or Weighted Support Vector Machine (WSVM) classifier is used to identify video episodes. A multilayer Kohonen network is also employed as an initial clustering algorithm to better separate the extracted features into the two classes of videos. Features based on motion and periodicity characteristics increase the efficiency of the proposed algorithm for videos with bad illumination and skin colour variation. The proposed method is evaluated using 1100 videos under different environmental and illumination conditions. The experimental results show a correct recognition rate of 94.2% for the proposed algorithm.
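
    The PCA-LDA reduction mentioned above is a standard two-stage projection: PCA discards redundant directions, then LDA finds the most class-discriminative axis. A minimal scikit-learn sketch on synthetic stand-in data (dimensions and class separation are invented):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)

# Synthetic stand-in for high-dimensional structural features of video
# episodes in two classes (illustrative only).
n, dim = 50, 60
X = np.vstack([rng.normal(0, 1, (n, dim)), rng.normal(1.5, 1, (n, dim))])
y = np.array([0] * n + [1] * n)

# PCA first removes redundant directions, then LDA projects the reduced
# features onto the single most class-discriminative axis.
X_pca = PCA(n_components=10).fit_transform(X)
lda = LinearDiscriminantAnalysis(n_components=1)
X_lda = lda.fit_transform(X_pca, y)
print(X_lda.shape, "LDA training accuracy:", lda.score(X_pca, y))
```

    With two classes, LDA can produce at most one discriminant axis, which is why the episode features end up as scalars before the SVM stage.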

  3. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    Science.gov (United States)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that with all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association with either PFS or OS in the group of patients not receiving maintenance bevacizumab. Therefore, this study demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.

  4. A sea-land segmentation algorithm based on multi-feature fusion for a large-field remote sensing image

    Science.gov (United States)

    Li, Jing; Xie, Weixin; Pei, Jihong

    2018-03-01

    Sea-land segmentation is one of the key technologies of sea target detection in remote sensing images. At present, existing algorithms suffer from low accuracy, low universality and poor automation. This paper puts forward a sea-land segmentation algorithm based on multi-feature fusion for large-field remote sensing images, with island removal. Firstly, the coastline data are extracted and all land areas are labeled using the geographic information in the large-field remote sensing image. Secondly, three features (local entropy, local texture and local gradient mean) are extracted in the sea-land border area and combined into a 3D feature vector. A multi-Gaussian model is then adopted to describe the 3D feature vectors of the sea background at the edge of the coastline. Based on this multi-Gaussian sea background model, the sea pixels and land pixels near the coastline are classified more precisely. Finally, the coarse segmentation result and the fine segmentation result are fused to obtain the accurate sea-land segmentation. Subjective visual comparison and analysis of the experimental results shows that the proposed method has high segmentation accuracy, wide applicability and strong anti-disturbance ability.
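
    The background-model step can be sketched with a single-Gaussian simplification of the multi-Gaussian model: fit a 3D Gaussian to sea-background feature vectors, then label pixels by thresholding their Mahalanobis distance. All feature values and the training sample are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch: model the sea background near the coastline as a
# 3D Gaussian over (local entropy, local texture, local gradient mean),
# then flag pixels whose feature vector is too unlikely as land.
sea_train = rng.multivariate_normal([0.2, 0.1, 0.05], np.eye(3) * 0.01, 500)

mu = sea_train.mean(axis=0)
cov = np.cov(sea_train, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis_sq(x):
    """Squared Mahalanobis distance to the fitted sea background."""
    d = x - mu
    return d @ cov_inv @ d

# Classify two example feature vectors: one sea-like, one land-like.
sea_pixel = np.array([0.21, 0.11, 0.05])
land_pixel = np.array([0.9, 0.8, 0.6])
threshold = 16.27  # chi-square 99.9% quantile for 3 degrees of freedom
for name, px in [("sea", sea_pixel), ("land", land_pixel)]:
    label = "sea" if mahalanobis_sq(px) < threshold else "land"
    print(name, "->", label)
```

    A true multi-Gaussian (mixture) background, as in the paper, would replace the single mean/covariance with a fitted mixture and threshold the mixture likelihood instead.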

  5. Soil fauna: key to new carbon models

    NARCIS (Netherlands)

    Filser, Juliane; Faber, J.H.; Tiunov, Alexei V.; Brussaard, L.; Frouz, J.; Deyn, de G.B.; Uvarov, Alexei V.; Berg, Matty P.; Lavelle, Patrick; Loreau, M.; Wall, D.H.; Querner, Pascal; Eijsackers, Herman; Jimenez, Juan Jose

    2016-01-01

    Soil organic matter (SOM) is key to maintaining soil fertility, mitigating climate change, combatting land degradation, and conserving above- and below-ground biodiversity and associated soil processes and ecosystem services. In order to derive management options for maintaining these essential

  6. Security for Key Management Interfaces

    OpenAIRE

    Kremer , Steve; Steel , Graham; Warinschi , Bogdan

    2011-01-01

    International audience; We propose a much-needed formal definition of security for cryptographic key management APIs. The advantages of our definition are that it is general, intuitive, and applicable to security proofs in both symbolic and computational models of cryptography. Our definition relies on an idealized API which allows only the most essential functions for generating, exporting and importing keys, and takes into account dynamic corruption of keys. Based on this we can define the ...

  7. Ceramic coatings: A phenomenological modeling for damping behavior related to microstructural features

    International Nuclear Information System (INIS)

    Tassini, N.; Patsias, S.; Lambrinou, K.

    2006-01-01

    Recent research has shown that both stiffness and damping of ceramic coatings exhibit different non-linearities. These properties strongly depend on the microstructure, which is characterized by heterogeneous sets of elastic elements with mesoscopic sizes and shapes, as in non-linear mesoscopic elastic materials. To predict the damping properties of this class of materials, we have implemented a phenomenological model that characterizes their elastic properties. The model is capable of reproducing the basic features of the observed damping behavior for zirconia coatings prepared by air plasma spraying and electron-beam physical-vapor-deposition

  8. Discovering highly informative feature set over high dimensions

    KAUST Repository

    Zhang, Chongsheng; Masseglia, Florent; Zhang, Xiangliang

    2012-01-01

    For many textual collections, the number of features is often overly large. These features can be very redundant; it is therefore desirable to have a small, succinct, yet highly informative collection of features that describes the key characteristics of a dataset. Information theory is one such tool for us to obtain this feature collection. With this paper, we mainly contribute to the improvement of efficiency for the process of selecting the most informative feature set over high-dimensional unlabeled data. We propose a heuristic theory for informative feature set selection from high dimensional data. Moreover, we design data structures that enable us to compute the entropies of the candidate feature sets efficiently. We also develop a simple pruning strategy that eliminates the hopeless candidates at each forward selection step. We test our method through experiments on real-world data sets, showing that our proposal is very efficient. © 2012 IEEE.
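
    The core quantity here, the joint entropy of a candidate feature set, and a greedy forward-selection loop can be sketched in a few lines. This is a simplified stand-in for the paper's data structures and pruning strategy, on a toy dataset with one deliberately redundant feature:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Toy binary dataset: 6 candidate features over 200 rows; feature 5
# duplicates feature 0, so a good selector should avoid the redundancy.
X = rng.integers(0, 2, size=(200, 5))
X = np.hstack([X, X[:, :1]])  # column 5 == column 0

def joint_entropy(cols):
    """Empirical joint entropy (in bits) of the selected feature columns."""
    rows = [tuple(r) for r in X[:, list(cols)]]
    n = len(rows)
    counts = {}
    for r in rows:
        counts[r] = counts.get(r, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Greedy forward selection: at each step add the feature that raises the
# joint entropy the most. A duplicate of an already-selected feature adds
# zero entropy, so it is never picked while informative features remain.
selected, remaining = [], list(range(X.shape[1]))
for _ in range(3):
    best = max(remaining, key=lambda f: joint_entropy(selected + [f]))
    selected.append(best)
    remaining.remove(best)
print("selected features:", selected)
```

    The paper's contribution is making exactly this kind of entropy evaluation cheap at scale and pruning hopeless candidates early; the loop above recomputes everything naively.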

  9. Discovering highly informative feature set over high dimensions

    KAUST Repository

    Zhang, Chongsheng

    2012-11-01

    For many textual collections, the number of features is often overly large. These features can be very redundant; it is therefore desirable to have a small, succinct, yet highly informative collection of features that describes the key characteristics of a dataset. Information theory is one such tool for us to obtain this feature collection. With this paper, we mainly contribute to the improvement of efficiency for the process of selecting the most informative feature set over high-dimensional unlabeled data. We propose a heuristic theory for informative feature set selection from high dimensional data. Moreover, we design data structures that enable us to compute the entropies of the candidate feature sets efficiently. We also develop a simple pruning strategy that eliminates the hopeless candidates at each forward selection step. We test our method through experiments on real-world data sets, showing that our proposal is very efficient. © 2012 IEEE.

  10. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  11. Key features of intertidal food webs that support migratory shorebirds.

    Directory of Open Access Journals (Sweden)

    Blanche Saint-Béat

    Full Text Available The migratory shorebirds of the East Atlantic flyway land in huge numbers during a migratory stopover or wintering on the French Atlantic coast. The Brouage bare mudflat (Marennes-Oléron Bay, NE Atlantic) is one of the major stopover sites in France. The particular structure and function of a food web affects the efficiency of carbon transfer. The structure and functioning of the Brouage food web is crucial for the conservation of species landing within this area because it provides sufficient food, which allows shorebirds to reach the north of Europe where they nest. The aim of this study was to describe and understand which food web characteristics support the nutritional needs of birds. Two food-web models were constructed, based on in situ measurements that were made in February 2008 (in the presence of birds) and July 2008 (in the absence of birds). To complete the models, allometric relationships and additional data from the literature were used. The missing flow values of the food web models were estimated by Monte Carlo Markov Chain - Linear Inverse Modelling. The flow solutions obtained were used to calculate the ecological network analysis indices, which estimate the emergent properties of the functioning of a food web. The total activities of the Brouage ecosystem in February and July are significantly different. The specialisation of the trophic links within the ecosystem does not appear to differ between the two models. In spite of a large export of carbon from the primary producers and detritus in winter, the higher recycling leads to a similar retention of carbon for the two seasons. It can be concluded that in February, the higher activity of the ecosystem, coupled with higher cycling and a mean internal organization, ensures sufficient feeding of the migratory shorebirds.

  12. Predictive model identifies key network regulators of cardiomyocyte mechano-signaling.

    Directory of Open Access Journals (Sweden)

    Philip M Tan

    2017-11-01

    Full Text Available Mechanical strain is a potent stimulus for growth and remodeling in cells. Although many pathways have been implicated in stretch-induced remodeling, the control structures by which signals from distinct mechano-sensors are integrated to modulate hypertrophy and gene expression in cardiomyocytes remain unclear. Here, we constructed and validated a predictive computational model of the cardiac mechano-signaling network in order to elucidate the mechanisms underlying signal integration. The model identifies calcium, actin, Ras, Raf1, PI3K, and JAK as key regulators of cardiac mechano-signaling and characterizes crosstalk logic imparting differential control of transcription by AT1R, integrins, and calcium channels. We find that while these regulators maintain mostly independent control over distinct groups of transcription factors, synergy between multiple pathways is necessary to activate all the transcription factors necessary for gene transcription and hypertrophy. We also identify a PKG-dependent mechanism by which valsartan/sacubitril, a combination drug recently approved for treating heart failure, inhibits stretch-induced hypertrophy, and predict further efficacious pairs of drug targets in the network through a network-wide combinatorial search.

  13. A model of how features of construction projects influence accident occurrence

    OpenAIRE

    Manu, P.

    2017-01-01

    This book chapter in "Valuing People in Construction" (edited by Emuze, F. and Smallwood, J.) presents a study which sought empirical verification of a model of how construction project features (CPFs) influence accident occurrence. A qualitative strategy, in particular phenomenology, involving a range of in-depth interviews with practitioners was used and the findings were subsequently validated using a credibility check involving a survey. Altogether, the findings of the interviews and cred...

  14. Recurrence predictive models for patients with hepatocellular carcinoma after radiofrequency ablation using support vector machines with feature selection methods.

    Science.gov (United States)

    Liang, Ja-Der; Ping, Xiao-Ou; Tseng, Yi-Ju; Huang, Guan-Tarn; Lai, Feipei; Yang, Pei-Ming

    2014-12-01

    Recurrence of hepatocellular carcinoma (HCC) is an important issue despite effective treatments with tumor eradication. Identification of patients who are at high risk for recurrence may provide more efficacious screening and detection of tumor recurrence. The aim of this study was to develop recurrence predictive models for HCC patients who received radiofrequency ablation (RFA) treatment. From January 2007 to December 2009, 83 newly diagnosed HCC patients receiving RFA as their first treatment were enrolled. Five feature selection methods, including genetic algorithm (GA), simulated annealing (SA), random forests (RF) and two hybrid methods (GA+RF and SA+RF), were utilized for selecting an important subset of features from a total of 16 clinical features. These feature selection methods were combined with support vector machines (SVM) to develop predictive models with better performance. Five-fold cross-validation was used to train and test the SVM models. The developed SVM-based predictive models with hybrid feature selection methods and 5-fold cross-validation achieved average sensitivity, specificity, accuracy, positive predictive value, negative predictive value, and area under the ROC curve of 67%, 86%, 82%, 69%, 90%, and 0.69, respectively. The SVM-derived predictive model can flag patients at high risk of recurrence, who should be closely followed up after complete RFA treatment. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
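A hedged sketch of the record's pipeline: wrapper-style feature selection over 16 clinical features, scored by an SVM with 5-fold cross-validation. A plain random search stands in for the genetic/simulated-annealing selectors, and the data are synthetic, not the 83-patient HCC cohort.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=83, n_features=16, n_informative=5,
                           random_state=0)
rng = np.random.default_rng(0)

def cv_auc(mask):
    # mean 5-fold cross-validated ROC AUC for the masked feature subset
    return cross_val_score(SVC(kernel="rbf"), X[:, mask], y,
                           cv=5, scoring="roc_auc").mean()

best_mask = np.ones(16, dtype=bool)
best_auc = cv_auc(best_mask)               # baseline: all 16 features
for _ in range(50):                        # random search over subsets
    mask = rng.random(16) < 0.5
    if not mask.any():
        continue
    score = cv_auc(mask)
    if score > best_auc:
        best_mask, best_auc = mask, score
```

A GA or SA selector would replace the random draws with guided proposals, but the evaluation step (cross-validated SVM score per subset) is the same.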

  15. Modeling the diffusion of scientific publications

    NARCIS (Netherlands)

    D. Fok (Dennis); Ph.H.B.F. Franses (Philip Hans)

    2005-01-01

    This paper illustrates that salient features of a panel of time series of annual citations can be captured by a Bass type diffusion model. We put forward an extended version of this diffusion model, where we consider the relation between key characteristics of the diffusion process and
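A minimal sketch of fitting a Bass-type diffusion curve to an annual citation series, with p (innovation), q (imitation) and m (eventual total citations) as parameters. The series here is synthetic; the paper's extended panel model adds structure on top of this core.

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, p, q, m):
    # cumulative adoptions F(t) under the standard Bass model
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

t = np.arange(1, 21, dtype=float)                   # 20 years
rng = np.random.default_rng(1)
obs = bass_cumulative(t, 0.02, 0.4, 500.0) + rng.normal(0.0, 5.0, t.size)

(p_hat, q_hat, m_hat), _ = curve_fit(
    bass_cumulative, t, obs, p0=(0.01, 0.3, 400.0),
    bounds=([1e-4, 1e-4, 1.0], [1.0, 2.0, 2000.0]))
```

With a reasonably long series the three parameters are well identified; short, early-stage series typically need the panel structure the paper exploits.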

  16. Hypertension Is a Key Feature of the Metabolic Syndrome in Subjects Aging with HIV

    DEFF Research Database (Denmark)

    Martin-Iguacel, Raquel; Negredo, Eugènia; Peck, Robert

    2016-01-01

    to predispose to these metabolic complications and to the excess risk of CVD observed in the HIV population. The metabolic syndrome (MS) represents a clustering of RF for CVD that includes abdominal obesity, hypertension, dyslipidemia and insulin resistance. Hypertension is a prevalent feature of the MS in HIV...

  17. Identifying tier one key suppliers.

    Science.gov (United States)

    Wicks, Steve

    2013-01-01

    In today's global marketplace, businesses are becoming increasingly reliant on suppliers for the provision of key processes, activities, products and services in support of their strategic business goals. The result is that now, more than ever, the failure of a key supplier has potential to damage reputation, productivity, compliance and financial performance seriously. Yet despite this, there is no recognised standard or guidance for identifying a tier one key supplier base and, up to now, there has been little or no research on how to do so effectively. This paper outlines the key findings of a BCI-sponsored research project to investigate good practice in identifying tier one key suppliers, and suggests a scalable framework process model and risk matrix tool to help businesses effectively identify their tier one key supplier base.

  18. Prediction of hot spots in protein interfaces using a random forest model with hybrid features.

    Science.gov (United States)

    Wang, Lin; Liu, Zhi-Ping; Zhang, Xiang-Sun; Chen, Luonan

    2012-03-01

    Prediction of hot spots in protein interfaces provides crucial information for the research on protein-protein interaction and drug design. Existing machine learning methods generally judge whether a given residue is likely to be a hot spot by extracting features only from the target residue. However, hot spots usually form a small cluster of residues which are tightly packed together at the center of protein interface. With this in mind, we present a novel method to extract hybrid features which incorporate a wide range of information of the target residue and its spatially neighboring residues, i.e. the nearest contact residue in the other face (mirror-contact residue) and the nearest contact residue in the same face (intra-contact residue). We provide a novel random forest (RF) model to effectively integrate these hybrid features for predicting hot spots in protein interfaces. Our method can achieve accuracy (ACC) of 82.4% and Matthews correlation coefficient (MCC) of 0.482 on the Alanine Scanning Energetics Database, and ACC of 77.6% and MCC of 0.429 on the Binding Interface Database. In a comparison study, the performance of our RF model exceeds that of other existing methods, such as Robetta, FOLDEF, KFC, KFC2, MINERVA and HotPoint. Of our hybrid features, three physicochemical features of target residues (mass, polarizability and isoelectric point), the relative side-chain accessible surface area and the average depth index of mirror-contact residues are found to be the main discriminative features in hot spots prediction. We also confirm that hot spots tend to form large contact surface areas between two interacting proteins. Source data and code are available at: http://www.aporc.org/doc/wiki/HotSpot.
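A sketch of the record's approach: a random forest classifies interface residues as hot spots or not and is scored by accuracy and the Matthews correlation coefficient. Synthetic features replace the physicochemical and contact-residue descriptors used in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, matthews_corrcoef
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=12, n_informative=6,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
pred = clf.predict(Xte)
acc = accuracy_score(yte, pred)
mcc = matthews_corrcoef(yte, pred)
# impurity-based importances suggest the main discriminative features,
# analogous to the paper's ranking of its hybrid features
top3 = np.argsort(clf.feature_importances_)[::-1][:3]
```

MCC is the more informative score here because hot-spot datasets are typically class-imbalanced, where raw accuracy can mislead.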

  19. Temporal Feature Integration for Music Organisation

    DEFF Research Database (Denmark)

    Meng, Anders

    2006-01-01

    This Ph.D. thesis focuses on temporal feature integration for music organisation. Temporal feature integration is the process of combining all the feature vectors of a given time-frame into a single new feature vector in order to capture relevant information in the frame. Several existing methods for handling sequences of features are formulated in the temporal feature integration framework. Two datasets for music genre classification have been considered as valid test-beds for music organisation, and human evaluations of these have been obtained to assess the subjectivity of the datasets. A ranking approach is proposed for ranking the short-time features at larger time-scales according to their discriminative power in a music genre classification task. The multivariate AR (MAR) model has been proposed for temporal feature integration. It effectively models local dynamical structure...
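A minimal sketch of multivariate-autoregressive (MAR) temporal feature integration: fit a VAR(1) model to a frame of short-time feature vectors and use the mean plus the flattened coefficient matrix as the integrated feature. The short-time features here are synthetic stand-ins for, e.g., MFCC frames.

```python
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(size=(50, 4))            # 50 short-time feature vectors

X, Y = frames[:-1], frames[1:]               # predict each vector from the previous one
A, *_ = np.linalg.lstsq(X, Y, rcond=None)    # VAR(1) coefficient matrix (4 x 4)
mar_feature = np.r_[frames.mean(axis=0), A.ravel()]
```

The coefficient matrix captures the local temporal dynamics and cross-dimension dependencies that simple mean-variance integration discards.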

  20. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features

    Energy Technology Data Exchange (ETDEWEB)

    Grimm, Lars J., E-mail: Lars.grimm@duke.edu; Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie [Department of Radiology, Duke University Medical Center, Box 3808, Durham, North Carolina 27710 (United States); Kuzmiak, Cherie M. [Department of Radiology, University of North Carolina School of Medicine, 2006 Old Clinic, CB No. 7510, Chapel Hill, North Carolina 27599 (United States); Mazurowski, Maciej A. [Duke University Medical Center, Box 2731 Medical Center, Durham, North Carolina 27710 (United States)

    2014-03-15

    Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral medial lateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739, 95% Confidence Interval: 0.543–0.680, p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.
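A per-reader error model in the spirit of the record: a logistic regression maps one-hot encoded BI-RADS-style case features to the probability that a given trainee misses the mass, scored by AUC. The case features, miss rates and trainee behaviour are all synthetic assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 300
shape = rng.choice(["round", "oval", "irregular"], size=n)
margin = rng.choice(["circumscribed", "obscured", "spiculated"], size=n)
X = OneHotEncoder().fit_transform(np.c_[shape, margin])

# toy assumption: this trainee misses obscured-margin masses more often
y = rng.random(n) < np.where(margin == "obscured", 0.6, 0.2)

model = LogisticRegression().fit(X, y)      # one model per trainee
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
```

Fitting one such model per trainee, as the study does, yields a per-reader AUC profile that could steer individualized teaching cases.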

  1. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features.

    Science.gov (United States)

    Grimm, Lars J; Ghate, Sujata V; Yoon, Sora C; Kuzmiak, Cherie M; Kim, Connie; Mazurowski, Maciej A

    2014-03-01

    The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral medial lateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using area under the receiver operating characteristic curve (AUC). Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502-0.739, 95% Confidence Interval: 0.543-0.680, p < 0.002). Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  2. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features

    International Nuclear Information System (INIS)

    Grimm, Lars J.; Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

    2014-01-01

    Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral medial lateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739, 95% Confidence Interval: 0.543–0.680, p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  3. Key Update Assistant for Resource-Constrained Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    developed a push-button solution - powered by stochastic model checking - that network designers can easily benefit from, and it paves the way for consumers to set up key update related security parameters. Key Update Assistant, as we named it, runs necessary model checking operations and determines...

  4. Image feature extraction based on the camouflage effectiveness evaluation

    Science.gov (United States)

    Yuan, Xin; Lv, Xuliang; Li, Ling; Wang, Xinzhu; Zhang, Zhi

    2018-04-01

    The key step in camouflage effectiveness evaluation is selecting effective evaluation indexes that account for human visual physiological and psychological features. Building on previous comprehensive camouflage evaluation methods, this paper selects suitable indexes informed by image quality awareness and optimizes them against human subjective perception, thereby improving the theory of index extraction.

  5. An Analysis of Audio Features to Develop a Human Activity Recognition Model Using Genetic Algorithms, Random Forests, and Neural Networks

    Directory of Open Access Journals (Sweden)

    Carlos E. Galván-Tejada

    2016-01-01

    Full Text Available This work presents a human activity recognition (HAR) model based on audio features. The use of sound as an information source for HAR models represents a challenge because sound wave analyses generate very large amounts of data. However, feature selection techniques may reduce the amount of data required to represent an audio signal sample. Some of the audio features that were analyzed include Mel-frequency cepstral coefficients (MFCC). Although MFCC are commonly used in voice and instrument recognition, their utility within HAR models is yet to be confirmed, and this work validates their usefulness. Additionally, statistical features were extracted from the audio samples to generate the proposed HAR model. The amount of information needed to build a HAR model directly impacts the accuracy of the model. This problem was also tackled in the present work; our results indicate that the proposed HAR model can recognize a human activity with an accuracy of 85%. This means that minimum computational costs are needed, thus allowing portable devices to identify human activities using audio as an information source.
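Toy statistical audio features of the kind the record describes: per-window energy, zero-crossing rate and spectral centroid, computed with NumPy only (no full MFCC pipeline). The signals, sample rate and window size are invented for illustration.

```python
import numpy as np

def audio_features(x, sr, win=1024):
    feats = []
    for i in range(0, len(x) - win + 1, win):
        w = x[i:i + win]
        energy = float(np.mean(w ** 2))
        # fraction of adjacent samples whose sign differs
        zcr = float(np.mean(np.abs(np.diff(np.sign(w))) > 0))
        spec = np.abs(np.fft.rfft(w))
        freqs = np.fft.rfftfreq(win, 1.0 / sr)
        centroid = float(np.sum(freqs * spec) / np.sum(spec))
        feats.append((energy, zcr, centroid))
    return np.array(feats)

sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440.0 * t)               # a pure 440 Hz tone
noise = np.random.default_rng(0).normal(size=sr)   # broadband noise
f_tone, f_noise = audio_features(tone, sr), audio_features(noise, sr)
```

Even these three features separate tonal from broadband sounds (noise has a much higher zero-crossing rate and centroid), which is the kind of discrimination a HAR classifier then exploits.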

  6. Phase information of time-frequency transforms as a key feature for classification of atrial fibrillation episodes

    International Nuclear Information System (INIS)

    Ortigosa, Nuria; Fernández, Carmen; Galbis, Antonio; Cano, Óscar

    2015-01-01

    Patients suffering from atrial fibrillation can be classified into different subtypes, according to the temporal pattern of the arrhythmia and its recurrence. Nowadays, clinicians cannot differentiate a priori between the different subtypes, and patient classification is done afterwards, when its clinical course is available. In this paper we present a comparison of classification performances when differentiating paroxysmal and persistent atrial fibrillation episodes by means of support vector machines. We analyze short surface electrocardiogram recordings by extracting modulus and phase features from several time-frequency transforms: short-time Fourier transform, Wigner–Ville, Choi–Williams, Stockwell transform, and general Fourier-family transform. Overall, accuracy higher than 81% is obtained when classifying phase information features of real test ECGs from a heterogeneous cohort of patients (in terms of progression of the arrhythmia and antiarrhythmic treatment) recorded in a tertiary center. Therefore, phase features can facilitate the clinicians’ choice of the most appropriate treatment for each patient by means of a non-invasive technique (the surface ECG). (paper)
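The record's key idea is using the phase, not just the modulus, of time-frequency transforms as classification features. A minimal version with SciPy's short-time Fourier transform, on a synthetic two-component signal standing in for a surface ECG:

```python
import numpy as np
from scipy.signal import stft

fs = 500                                  # assumed sampling rate, Hz
t = np.arange(2 * fs) / fs
sig = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 12 * t)

f, seg_t, Z = stft(sig, fs=fs, nperseg=128)
modulus = np.abs(Z)                       # the usual magnitude features
phase = np.angle(Z)                       # the phase features the record highlights
features = np.r_[modulus.ravel(), phase.ravel()]   # flat vector for an SVM
```

The flattened modulus-plus-phase vector would then be fed to a support vector machine, as in the study; the other transforms it compares (Wigner–Ville, Choi–Williams, Stockwell) yield analogous complex coefficients.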

  7. Biased ART: a neural architecture that shifts attention toward previously disregarded features following an incorrect prediction.

    Science.gov (United States)

    Carpenter, Gail A; Gaddam, Sai Chaitanya

    2010-04-01

    Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One example, which predicts movie genres and is based, in part, on the Netflix Prize database, was developed for this project. Both first principles and consistent performance improvements on all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/. Copyright 2009 Elsevier Ltd. All rights reserved.

  8. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part I—Model Development

    Science.gov (United States)

    Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario

    2016-01-01

    The development of an error compensation model for coordinate measuring machines (CMMs) and its integration into feature measurement is presented. CMMs are widespread and dependable instruments in industry and laboratories for dimensional measurement. From the tip probe sensor to the machine display, there is a complex transformation of probed point coordinates through the geometrical feature model that makes the assessment of accuracy and uncertainty measurement results difficult. Therefore, error compensation is not standardized, conversely to other simpler instruments. Detailed coordinate error compensation models are generally based on CMM as a rigid-body and it requires a detailed mapping of the CMM’s behavior. In this paper a new model type of error compensation is proposed. It evaluates the error from the vectorial composition of length error by axis and its integration into the geometrical measurement model. The non-explained variability by the model is incorporated into the uncertainty budget. Model parameters are analyzed and linked to the geometrical errors and uncertainty of CMM response. Next, the outstanding measurement models of flatness, angle, and roundness are developed. The proposed models are useful for measurement improvement with easy integration into CMM signal processing, in particular in industrial environments where built-in solutions are sought. A battery of implementation tests are presented in Part II, where the experimental endorsement of the model is included. PMID:27690052
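A hedged sketch of the vectorial length-error composition idea: assumed per-axis relative scale errors of a CMM correct each probed point before the geometric feature (here, a point-to-point distance) is evaluated. The error values and the multiplicative correction form are invented for illustration, not taken from the paper.

```python
import numpy as np

# assumed relative scale errors of the x, y, z axes (toy values)
scale_err = np.array([1e-5, -2e-5, 5e-6])

def compensate(p):
    """Correct a probed point for per-axis relative scale errors."""
    return np.asarray(p, dtype=float) * (1.0 - scale_err)

p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([100.0, 200.0, 50.0])
raw_dist = np.linalg.norm(p2 - p1)                       # uncompensated
comp_dist = np.linalg.norm(compensate(p2) - compensate(p1))
```

In the paper's scheme the residual variability the model cannot explain is then carried into the uncertainty budget rather than corrected.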

  9. Integration of Error Compensation of Coordinate Measuring Machines into Feature Measurement: Part I—Model Development

    Directory of Open Access Journals (Sweden)

    Roque Calvo

    2016-09-01

    Full Text Available The development of an error compensation model for coordinate measuring machines (CMMs) and its integration into feature measurement is presented. CMMs are widespread and dependable instruments in industry and laboratories for dimensional measurement. From the tip probe sensor to the machine display, there is a complex transformation of probed point coordinates through the geometrical feature model that makes the assessment of accuracy and uncertainty measurement results difficult. Therefore, error compensation is not standardized, conversely to other simpler instruments. Detailed coordinate error compensation models are generally based on the CMM as a rigid body, which requires a detailed mapping of the CMM’s behavior. In this paper a new model type of error compensation is proposed. It evaluates the error from the vectorial composition of length error by axis and its integration into the geometrical measurement model. The non-explained variability by the model is incorporated into the uncertainty budget. Model parameters are analyzed and linked to the geometrical errors and uncertainty of CMM response. Next, the outstanding measurement models of flatness, angle, and roundness are developed. The proposed models are useful for measurement improvement with easy integration into CMM signal processing, in particular in industrial environments where built-in solutions are sought. A battery of implementation tests are presented in Part II, where the experimental endorsement of the model is included.

  10. Temporal feature integration for music genre classification

    DEFF Research Database (Denmark)

    Meng, Anders; Ahrendt, Peter; Larsen, Jan

    2007-01-01

    , but they capture neither the temporal dynamics nor dependencies among the individual feature dimensions. Here, a multivariate autoregressive feature model is proposed to solve this problem for music genre classification. This model gives two different feature sets, the diagonal autoregressive (DAR) and multivariate autoregressive (MAR) features, which are compared against the baseline mean-variance as well as two other temporal feature integration techniques. Reproducibility in performance ranking of temporal feature integration methods was demonstrated using two data sets with five and eleven music genres...

  11. Including product features in process redesign

    DEFF Research Database (Denmark)

    Hvam, Lars; Hauksdóttir, Dagný; Mortensen, Niels Henrik

    2017-01-01

    This article suggests a visual modelling method for integrating models of product features with business process models for redesigning the business processes involving specifications of customer-tailored products and services. The current methods for redesigning these types of business processes do not take into account how the product features are applied throughout the process, which makes it difficult to obtain a comprehensive understanding of the activities in the processes and to generate significant improvements. The suggested approach models the product family using the so-called product variant master and the business process modelling notation for modelling the process flow. The product model is combined with the process map by identifying features used in each step of the process flow. Additionally, based on the information absorbed from the integrated model, the value stream...

  12. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    Science.gov (United States)

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Toroid field coil shear key installation study, DOE task No. 22

    International Nuclear Information System (INIS)

    Jones, C.E.; Meier, R.W.; Yuen, J.L.

    1995-01-01

    Concepts for fitting and installation of the scissor keys, triangular keys, and truss keys in the ITER Toroidal Field (TF) Coil Assembly were developed and evaluated. In addition, the process of remote removal and replacement of a failed TF coil was considered. Two concepts were addressed: central solenoid installed last (Naka Option 1) and central solenoid installed first (Naka Option 2). In addition, a third concept was developed which utilized the favorable features of both concepts. A time line for installation was estimated for the Naka Option 1 concept

  14. Feature-level domain adaptation

    DEFF Research Database (Denmark)

    Kouw, Wouter M.; Van Der Maaten, Laurens J P; Krijthe, Jesse H.

    2016-01-01

    Domain adaptation is the supervised learning setting in which the training and test data are sampled from different distributions: training data is sampled from a source domain, whilst test data is sampled from a target domain. This paper proposes and studies an approach, called feature-level domain adaptation (flda), that models the dependence between the two domains by means of a feature-level transfer model that is trained to describe the transfer from source to target domain. Subsequently, we train a domain-adapted classifier by minimizing the expected loss under the resulting transfer... modeled via a dropout distribution, which allows the classifier to adapt to differences in the marginal probability of features in the source and the target domain. Our experiments on several real-world problems show that flda performs on par with state-of-the-art domain-adaptation techniques.

  15. Controls on the spatial variability of key soil properties: comparing field data with a mechanistic soilscape evolution model

    Science.gov (United States)

    Vanwalleghem, T.; Román, A.; Giraldez, J. V.

    2016-12-01

    There is a need for better understanding the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Especially soil carbon pools in semi-arid, mountainous areas are highly uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. Spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates only explained 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain variability in the soil carbon stocks in this complex terrain under natural vegetation. This is attributed to a high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.

  16. A Public-key based Information Management Model for Mobile Agents

    OpenAIRE

    Rodriguez, Diego; Sobrado, Igor

    2000-01-01

    Mobile code based computing requires development of protection schemes that allow digital signature and encryption of data collected by the agents in untrusted hosts. These algorithms could not rely on carrying encryption keys if these keys could be stolen or used to counterfeit data by hostile hosts and agents. As a consequence, both information and keys must be protected in a way that only authorized hosts, that is the host that provides information and the server that has sent the mobile a...

  17. At-line monitoring of key parameters of nisin fermentation by near infrared spectroscopy, chemometric modeling and model improvement.

    Science.gov (United States)

    Guo, Wei-Liang; Du, Yi-Ping; Zhou, Yong-Can; Yang, Shuang; Lu, Jia-Hui; Zhao, Hong-Yu; Wang, Yao; Teng, Li-Rong

    2012-03-01

    An analytical procedure has been developed for at-line (fast off-line) monitoring of 4 key parameters, including nisin titer (NT), the concentration of reducing sugars, cell concentration and pH, during a nisin fermentation process. This procedure is based on near infrared (NIR) spectroscopy and Partial Least Squares (PLS). Samples without any preprocessing were collected at intervals of 1 h during fifteen batches of fermentation. These fermentation processes were carried out in 3 different 5 l fermentors at various conditions. NIR spectra of the samples were collected in 10 min. PLS was then used to model the relationship between the NIR spectra and the key parameters, which were determined by reference methods. Monte Carlo Partial Least Squares (MCPLS) was applied to identify outliers and to select the most efficacious spectral preprocessing methods, wavelengths and number of latent variables (nLV). The optimum models for determining NT, concentration of reducing sugars, cell concentration and pH were then established. The correlation coefficients of the calibration set (Rc) were 0.8255, 0.9000, 0.9883 and 0.9581, respectively. These results demonstrate that this method can be successfully applied to at-line monitoring of NT, concentration of reducing sugars, cell concentration and pH during nisin fermentation processes.

  18. Key Issues in Modeling of Complex 3D Structures from Video Sequences

    Directory of Open Access Journals (Sweden)

    Shengyong Chen

    2012-01-01

    Full Text Available Construction of three-dimensional structures from video sequences has wide applications in intelligent video analysis. This paper summarizes the key issues of the theory and surveys recent advances in the state of the art. Reconstruction of a scene object from video sequences often relies on the principle of structure from motion with an uncalibrated camera. This paper lists the typical strategies and summarizes the typical solutions and algorithms for modeling complex three-dimensional structures. Difficult open problems are also suggested for further study.

  19. Toward Designing a Quantum Key Distribution Network Simulation Model

    OpenAIRE

    Miralem Mehic; Peppino Fazio; Miroslav Voznak; Erik Chromy

    2016-01-01

    As research in quantum key distribution network technologies grows larger and more complex, the need for highly accurate and scalable simulation technologies becomes important to assess the practical feasibility and foresee difficulties in the practical implementation of theoretical achievements. In this paper, we describe the design of a simplified simulation environment for a quantum key distribution network with multiple links and nodes. In this simulation environment, we analyzed several ...

  20. Feature-based Alignment of Volumetric Multi-modal Images

    Science.gov (United States)

    Toews, Matthew; Zöllei, Lilla; Wells, William M.

    2014-01-01

    This paper proposes a method for aligning image volumes acquired from different imaging modalities (e.g. MR, CT) based on 3D scale-invariant image features. A novel method for encoding invariant feature geometry and appearance is developed, based on the assumption of locally linear intensity relationships, providing a solution to the poor repeatability of feature detection across image modalities. The encoding method is incorporated into a probabilistic feature-based model for multi-modal image alignment. The model parameters are estimated via a group-wise alignment algorithm that iteratively alternates between estimating a feature-based model from feature data and realigning feature data to the model, converging to a stable alignment solution with few pre-processing or pre-alignment requirements. The resulting model can be used to align multi-modal image data with the benefits of invariant feature correspondence: globally optimal solutions, high efficiency and low memory usage. The method is tested on the difficult RIRE data set of CT, T1, T2, PD and MP-RAGE brain images of subjects exhibiting significant inter-subject variability due to pathology. PMID:24683955
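Once invariant features have been matched across two volumes, a rigid transform can be estimated from the corresponding feature locations in closed form. A minimal sketch of that alignment step using the Kabsch algorithm (this illustrates only the geometric estimation; the paper's probabilistic group-wise model is not reproduced, and the data are synthetic):

```python
# Sketch: rigid 3D alignment from matched feature locations via SVD (Kabsch).
import numpy as np

def kabsch(P, Q):
    """Return R, t minimizing ||(P @ R.T + t) - Q|| over rigid transforms."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

rng = np.random.default_rng(2)
P = rng.normal(size=(20, 3))             # feature locations in volume A
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])   # same features in volume B
R, t = kabsch(P, Q)
print("max residual:", np.abs(P @ R.T + t - Q).max())
```

With noiseless correspondences the recovered transform is exact up to floating-point error; in practice a robust estimator would sit on top of this to reject mismatches.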

  1. Modeling sports highlights using a time-series clustering framework and model interpretation

    Science.gov (United States)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
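The MDL criterion used above to size the mixture is closely related to BIC. A minimal sketch of fitting a mixture to "highlight-class" features and selecting the number of components, with BIC standing in for MDL (the two-dimensional feature values here are invented, not real audio features):

```python
# Sketch: choosing the number of mixture components by an MDL-like criterion.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Pretend two sub-populations: audience cheering vs. excited commentary
cheer = rng.normal(loc=[-2.0, 0.5], scale=0.4, size=(300, 2))
speech = rng.normal(loc=[1.5, -1.0], scale=0.5, size=(300, 2))
X = np.vstack([cheer, speech])

best_k, best_bic = None, np.inf
for k in range(1, 6):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bic = gmm.bic(X)
    if bic < best_bic:
        best_k, best_bic = k, bic
print("components selected by BIC:", best_k)
```

On well-separated sub-populations the criterion recovers the two components, echoing the paper's finding that the "highlight" class decomposes into cheering and excited speech.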

  2. Memory for surface features of unfamiliar melodies: independent effects of changes in pitch and tempo.

    Science.gov (United States)

    Schellenberg, E Glenn; Stalinski, Stephanie M; Marks, Bradley M

    2014-01-01

    A melody's identity is determined by relations between consecutive tones in terms of pitch and duration, whereas surface features (i.e., pitch level or key, tempo, and timbre) are irrelevant. Although surface features of highly familiar recordings are encoded into memory, little is known about listeners' mental representations of melodies heard once or twice. It is also unknown whether musical pitch is represented additively or interactively with temporal information. In two experiments, listeners heard unfamiliar melodies twice in an initial exposure phase. In a subsequent test phase, they heard the same (old) melodies interspersed with new melodies. Some of the old melodies were shifted in key, tempo, or key and tempo. Listeners' task was to rate how well they recognized each melody from the exposure phase while ignoring changes in key and tempo. Recognition ratings were higher for old melodies that stayed the same compared to those that were shifted in key or tempo, and detrimental effects of key and tempo changes were additive in between-subjects (Experiment 1) and within-subjects (Experiment 2) designs. The results confirm that surface features are remembered for melodies heard only twice. They also imply that key and tempo are processed and stored independently.

  3. Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies (Final Report)

    Science.gov (United States)

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physi...

  4. Simplicity: the key to improved safety, performance and economics

    International Nuclear Information System (INIS)

    McCandless, R.J.; Redding, J.R.

    1989-01-01

    In General Electric's Simplified Boiling Water Reactor (SBWR) design every feature, every system, every piece of equipment must justify its existence - or it must go. Each must perform a needed function in the simplest way because simplification is the key to high performance and competitive economics. The SBWR has the potential to become a safe, economical and environmentally sound energy source for the 1990s, GE believes. The distinctive features of the reactor are described. It is illustrated on a wall chart which also gives its main specifications

  5. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    OpenAIRE

    K. Kitamura; N. Kochi; S. Kaneko

    2012-01-01

    In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature based registration procedure. Our proposed method does not re...

  6. Attentional Selection of Feature Conjunctions Is Accomplished by Parallel and Independent Selection of Single Features.

    Science.gov (United States)

    Andersen, Søren K; Müller, Matthias M; Hillyard, Steven A

    2015-07-08

    features separately. This result is key to understanding attentional selection in complex (natural) scenes, where relevant stimuli are likely to be defined by a combination of stimulus features.

  7. Dataset of coded handwriting features for use in statistical modelling

    Directory of Open Access Journals (Sweden)

    Anna Agius

    2018-02-01

    Full Text Available The data presented here relate to the article titled “Using handwriting to infer a writer's country of origin for forensic intelligence purposes” (Agius et al., 2017 [1]). The article reports original writer, spatial and construction characteristic data for thirty-seven English Australian writers (in this study, English writers were Australians who had learnt to write in New South Wales (NSW)) and thirty-seven Vietnamese writers. All of these characteristics were coded and recorded in Microsoft Excel 2013 (version 15.31). The construction characteristics were extracted from seven characters only: ‘g’, ‘h’, ‘th’, ‘M’, ‘0’, ‘7’ and ‘9’. The coded format of the writer, spatial and construction characteristics is made available in this Data in Brief to allow others to perform statistical analyses and modelling to investigate whether there is a relationship between handwriting features and the nationality of the writer, whether the two nationalities can be differentiated, and to employ mathematical techniques capable of characterising the extracted features from each participant.

  8. FEATURE DESCRIPTOR BY CONVOLUTION AND POOLING AUTOENCODERS

    Directory of Open Access Journals (Sweden)

    L. Chen

    2015-03-01

    Full Text Available In this paper we present several descriptors for feature-based matching based on autoencoders, and we evaluate the performance of these descriptors. In a training phase, we learn autoencoders from image patches extracted in local windows surrounding key points determined by the Difference of Gaussian extractor. In the matching phase, we construct key point descriptors based on the learned autoencoders, and we use these descriptors as the basis for local keypoint descriptor matching. Three types of descriptors based on autoencoders are presented. To evaluate the performance of these descriptors, recall and 1-precision curves are generated for different kinds of transformations, e.g. zoom and rotation, viewpoint change, using a standard benchmark data set. We compare the performance of these descriptors with the one achieved for SIFT. Early results presented in this paper show that, whereas SIFT in general performs better than the new descriptors, the descriptors based on autoencoders show some potential for feature based matching.
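A toy version of the idea above: train a small autoencoder on flattened patches and use the hidden-layer activations as a keypoint descriptor. Everything here (patch data, architecture, training schedule) is illustrative only, not the paper's networks:

```python
# Sketch: a single-hidden-layer autoencoder; hidden activations = descriptor.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic low-rank "patches": 256 flattened 8x8 patches
latent = rng.normal(size=(256, 8))
basis = rng.normal(size=(8, 64))
patches = 0.2 * latent @ basis + 0.1 * rng.normal(size=(256, 64))

n_hidden = 16
W1 = rng.normal(scale=0.1, size=(64, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, 64)); b2 = np.zeros(64)

def forward(X):
    H = np.tanh(X @ W1 + b1)       # encoder: hidden code = descriptor
    return H, H @ W2 + b2          # decoder: reconstruction

_, rec0 = forward(patches)
err0 = np.mean((rec0 - patches) ** 2)
lr = 0.01
for _ in range(200):               # plain batch gradient descent
    H, rec = forward(patches)
    G = 2 * (rec - patches) / patches.shape[0]
    gW2, gb2 = H.T @ G, G.sum(axis=0)
    GH = (G @ W2.T) * (1 - H ** 2)
    gW1, gb1 = patches.T @ GH, GH.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

descriptor, rec = forward(patches)
err1 = np.mean((rec - patches) ** 2)
print(f"reconstruction MSE before/after training: {err0:.3f} / {err1:.3f}")
```

Matching would then compare these 16-dimensional codes (e.g. by Euclidean distance), in place of the 128-dimensional SIFT descriptor the paper benchmarks against.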

  9. Electricity market price spike analysis by a hybrid data model and feature selection technique

    International Nuclear Information System (INIS)

    Amjady, Nima; Keynia, Farshid

    2010-01-01

    In a competitive electricity market, energy price forecasting is an important activity for both suppliers and consumers. For this reason, many techniques have been proposed in recent years to predict electricity market prices. However, the electricity price is a complex volatile signal with many spikes. Most electricity price forecasting techniques focus on normal price prediction, while price spike forecasting is a different and more complex prediction process. Price spike forecasting has two main aspects: prediction of price spike occurrence and of its value. In this paper, a novel technique for price spike occurrence prediction is presented, composed of a new hybrid data model, a novel feature selection technique and an efficient forecast engine. The hybrid data model includes both wavelet and time domain variables as well as calendar indicators, comprising a large candidate input set. The set is refined by the proposed feature selection technique, which evaluates both relevancy and redundancy of the candidate inputs. The forecast engine is a probabilistic neural network, which is fed with the candidate inputs selected by the feature selection technique and predicts price spike occurrence. The efficiency of the whole proposed method for price spike occurrence forecasting is evaluated by means of real data from the Queensland and PJM electricity markets. (author)
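A relevancy/redundancy filter of the kind described above can be sketched as a greedy mRMR-style rule: pick the most relevant feature, then add features whose relevance to the label exceeds their redundancy with what is already selected. The criterion and data below are illustrative; the paper's exact measure may differ:

```python
# Sketch: greedy relevancy-minus-redundancy feature selection.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(4)
n = 600
f0 = rng.normal(size=n)
f1 = rng.normal(size=n)
y = (f0 + f1 > 0).astype(int)            # spike / no-spike label
X = np.column_stack([f0, f1,
                     f0 + 0.01 * rng.normal(size=n),   # redundant copy of f0
                     rng.normal(size=(n, 3))])         # noise features

def binned(col):
    return np.digitize(col, np.quantile(col, [0.25, 0.5, 0.75]))

relevance = mutual_info_classif(X, y, random_state=0)
selected = [int(np.argmax(relevance))]
while len(selected) < 2:
    scores = []
    for j in range(X.shape[1]):
        if j in selected:
            scores.append(-np.inf)
            continue
        # redundancy: mean mutual information with already-selected features
        red = np.mean([mutual_info_score(binned(X[:, j]), binned(X[:, s]))
                       for s in selected])
        scores.append(relevance[j] - red)
    selected.append(int(np.argmax(scores)))
print("selected feature indices:", sorted(selected))
```

The near-duplicate of the first informative feature is rejected for redundancy, while the second independent informative feature is kept.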

  10. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    Science.gov (United States)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct mechanical vibration and acoustic frequency spectra of a data-driven industrial process parameter model based on selective fusion multi-condition samples and multi-source features. Multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. Genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.

  11. A Novel Real-Time Reference Key Frame Scan Matching Method

    Directory of Open Access Journals (Sweden)

    Haytham Mohamed

    2017-05-01

    Full Text Available Unmanned aerial vehicles represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by simultaneous localization and mapping, using either local or global approaches. Both approaches suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan matching technique comprising feature-to-feature and point-to-point approaches. The algorithm aims to mitigate error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. The algorithm falls back on the iterative closest point algorithm when linear features are lacking, as is typical in unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with those of various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results with very short computational times, indicating its potential for use in real-time systems.
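The point-to-point fallback mentioned above is classic ICP: alternate nearest-neighbour association with a closed-form rigid update. A minimal 2D sketch (brute-force association, synthetic scans; the feature-to-feature component and key-frame switching logic are not reproduced):

```python
# Sketch: point-to-point ICP in 2D with a closed-form (Kabsch) update.
import numpy as np

def icp_2d(src, dst, iters=30):
    """Estimate R, t aligning src onto dst."""
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        # nearest-neighbour association (brute force)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # closed-form rigid update
        cs, cm = cur.mean(0), matched.mean(0)
        H = (cur - cs).T @ (matched - cm)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        dR = Vt.T @ D @ U.T
        dt = cm - dR @ cs
        cur = cur @ dR.T + dt
        R, t = dR @ R, dR @ t + dt      # accumulate the transform
    return R, t

rng = np.random.default_rng(5)
scan = rng.uniform(-5, 5, size=(60, 2))
a = 0.03                                 # small inter-frame rotation
R_true = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
ref = scan @ R_true.T + np.array([0.1, -0.05])
R, t = icp_2d(scan, ref)
print("rotation error:", np.abs(R - R_true).max())
```

For small inter-frame motion the iteration converges to the true transform; the accumulated error the abstract warns about arises when many such pairwise estimates are chained, which is what the key-frame scheme is designed to limit.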

  12. Urban Area Extent Extraction in Spaceborne HR and VHR Data Using Multi-Resolution Features

    Directory of Open Access Journals (Sweden)

    Gianni Cristian Iannelli

    2014-09-01

    Full Text Available Detection of urban area extents by means of remotely sensed data is a difficult task, especially because of the multiple, diverse definitions of what an “urban area” is. The models of urban areas listed in technical literature are based on the combination of spectral information with spatial patterns, possibly at different spatial resolutions. Starting from the same data set, “urban area” extraction may thus lead to multiple outputs. If this is done in a well-structured framework, however, this may be considered as an advantage rather than an issue. This paper proposes a novel framework for urban area extent extraction from multispectral Earth Observation (EO data. The key is to compute and combine spectral and multi-scale spatial features. By selecting the most adequate features, and combining them with proper logical rules, the approach allows matching multiple urban area models. Experimental results for different locations in Brazil and Kenya using High-Resolution (HR data prove the usefulness and flexibility of the framework.
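The framework above combines a spectral cue with spatial texture at several scales through logical rules. A toy sketch of that combination on a synthetic image (the data, thresholds and the rule itself are invented for illustration, not the paper's models):

```python
# Sketch: spectral cue + multi-scale texture + a logical rule -> urban mask.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(6)
img = np.zeros((64, 64))
img[20:44, 20:44] = rng.uniform(0.4, 1.0, size=(24, 24))  # "built-up" block
img += 0.05 * rng.normal(size=img.shape)                  # smooth background

def local_variance(x, size):
    m = uniform_filter(x, size)
    return uniform_filter(x * x, size) - m * m

# spectral cue: brightness; spatial cue: texture at two window sizes
bright = img > 0.3
tex_fine = local_variance(img, 3) > 0.01
tex_coarse = local_variance(img, 9) > 0.01
urban = bright & (tex_fine | tex_coarse)   # one possible logical rule

inside = urban[24:40, 24:40].mean()
outside = urban[:16, :16].mean()
print(f"urban fraction inside block: {inside:.2f}, outside: {outside:.2f}")
```

Swapping the rule (e.g. requiring coarse texture only) yields a different "urban area" model from the same features, which is exactly the flexibility the framework advertises.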

  13. Ecological Understanding 2: Transformation--A Key to Ecological Understanding.

    Science.gov (United States)

    Carlsson, Britta

    2002-01-01

    Describes the structure and general features of the phenomenon of ecological understanding. Presents qualitatively different ways of experiencing cycling of matter and the flow of energy in the context of ecosystems. The idea of transformation is key to the development of ecological understanding. (Contains 17 references.) (Author/YDS)

  14. SITE-94. Discrete-feature modelling of the Aespoe site: 2. Development of the integrated site-scale model

    International Nuclear Information System (INIS)

    Geier, J.E.

    1996-12-01

    A 3-dimensional, discrete-feature hydrological model is developed. The model integrates structural and hydrologic data for the Aespoe site, on scales ranging from semi regional fracture zones to individual fractures in the vicinity of the nuclear waste canisters. Hydrologic properties of the large-scale structures are initially estimated from cross-hole hydrologic test data, and automatically calibrated by numerical simulation of network flow, and comparison with undisturbed heads and observed drawdown in selected cross-hole tests. The calibrated model is combined with a separately derived fracture network model, to yield the integrated model. This model is partly validated by simulation of transient responses to a long-term pumping test and a convergent tracer test, based on the LPT2 experiment at Aespoe. The integrated model predicts that discharge from the SITE-94 repository is predominantly via fracture zones along the eastern shore of Aespoe. Similar discharge loci are produced by numerous model variants that explore uncertainty with regard to effective semi regional boundary conditions, hydrologic properties of the site-scale structures, and alternative structural/hydrological interpretations. 32 refs

  15. SITE-94. Discrete-feature modelling of the Aespoe site: 2. Development of the integrated site-scale model

    Energy Technology Data Exchange (ETDEWEB)

    Geier, J.E. [Golder Associates AB, Uppsala (Sweden)

    1996-12-01

    A 3-dimensional, discrete-feature hydrological model is developed. The model integrates structural and hydrologic data for the Aespoe site, on scales ranging from semi regional fracture zones to individual fractures in the vicinity of the nuclear waste canisters. Hydrologic properties of the large-scale structures are initially estimated from cross-hole hydrologic test data, and automatically calibrated by numerical simulation of network flow, and comparison with undisturbed heads and observed drawdown in selected cross-hole tests. The calibrated model is combined with a separately derived fracture network model, to yield the integrated model. This model is partly validated by simulation of transient responses to a long-term pumping test and a convergent tracer test, based on the LPT2 experiment at Aespoe. The integrated model predicts that discharge from the SITE-94 repository is predominantly via fracture zones along the eastern shore of Aespoe. Similar discharge loci are produced by numerous model variants that explore uncertainty with regard to effective semi regional boundary conditions, hydrologic properties of the site-scale structures, and alternative structural/hydrological interpretations. 32 refs.

  16. Markerless client-server augmented reality system with natural features

    Science.gov (United States)

    Ning, Shuangning; Sang, Xinzhu; Chen, Duo

    2017-10-01

    A markerless client-server augmented reality system is presented. In this research, the more extensive and mature virtual reality head-mounted display is adopted to assist the implementation of augmented reality. The viewer is presented with an image in front of their eyes through the head-mounted display. The front-facing camera captures video signals into the workstation, and the generated virtual scene is merged with the outside-world information received from the camera. The integrated video is then sent to the helmet display system. The distinguishing feature and novelty is the realization of augmented reality with natural features instead of markers, which addresses the limitations of markers: they are restricted to black-and-white patterns, are inapplicable under many environmental conditions, and in particular fail when partially occluded. Further, 3D stereoscopic perception of the virtual animation model is achieved. The high-speed and stable socket native communication method is adopted for transmission of the key video stream data, which reduces the computational burden of the system.

  17. Detection of Vandalism in Wikipedia using Metadata Features – Implementation in Simple English and Albanian sections

    Directory of Open Access Journals (Sweden)

    Arsim Susuri

    2017-03-01

    Full Text Available In this paper, we evaluate a list of classifiers for the detection of vandalism, focusing on metadata features. Our work is focused on two low-resource data sets (Simple English and Albanian) from Wikipedia. The aim of this research is to show that this form of vandalism detection applied to one data set (language) can be extended to another data set (language). Article-view data sets in Wikipedia have rarely been used for the purpose of detecting vandalism. We show the benefits of combining the article-view data set with features from the article-revision data set with the aim of improving the detection of vandalism. The key advantage of metadata features is that they are language independent and simple to extract, because they require minimal processing. This paper shows that applying vandalism models across low-resource languages is possible, and that vandalism can be detected through view patterns of articles.
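The appeal of metadata features is that a classifier over them needs no language-specific processing. A sketch of that setup with an off-the-shelf classifier; the feature names, coefficients and labels below are all invented for illustration:

```python
# Sketch: vandalism detection from language-independent edit metadata.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 1000
# hypothetical metadata: anonymous edit?, comment length, size change, hour
anon = rng.integers(0, 2, n)
comment_len = rng.poisson(20, n)
size_delta = rng.normal(0, 200, n)
hour = rng.integers(0, 24, n)
# synthetic ground truth: vandalism more likely for anonymous edits
# with short comments and large deletions
logit = 2.0 * anon - 0.08 * comment_len - 0.006 * size_delta - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([anon, comment_len, size_delta, hour])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated ROC AUC: {auc:.2f}")
```

Because none of these inputs depend on the article text, the same trained model can in principle be applied to another language edition, which is the transfer the paper tests.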

  18. Bayesian quantile regression-based partially linear mixed-effects joint models for longitudinal data with multiple features.

    Science.gov (United States)

    Zhang, Hanze; Huang, Yangxin; Wang, Wei; Chen, Henian; Langland-Orban, Barbara

    2017-01-01

    In longitudinal AIDS studies, it is of interest to investigate the relationship between HIV viral load and CD4 cell counts, as well as the complicated time effect. Most common models for analyzing such complex longitudinal data are based on mean regression, which fails to provide efficient estimates due to outliers and/or heavy tails. Quantile regression-based partially linear mixed-effects models, a special case of semiparametric models enjoying the benefits of both parametric and nonparametric models, have the flexibility to monitor the viral dynamics nonparametrically and detect the varying CD4 effects parametrically at different quantiles of viral load. Meanwhile, it is critical to consider various data features of repeated measurements, including left-censoring due to a limit of detection, covariate measurement error, and asymmetric distribution. In this research, we first establish a Bayesian joint model that accounts for all these data features simultaneously in the framework of quantile regression-based partially linear mixed-effects models. The proposed models are applied to analyze the Multicenter AIDS Cohort Study (MACS) data. Simulation studies are also conducted to assess the performance of the proposed methods under different scenarios.
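The basic building block of any quantile-regression model like the one above is the check (pinball) loss, whose minimizer over a constant is the empirical quantile. A small numerical illustration (this shows only the loss, not the paper's full Bayesian joint model):

```python
# Sketch: the check loss rho_tau(u) = u * (tau - 1{u < 0}) and its minimizer.
import numpy as np

def check_loss(residual, tau):
    return residual * (tau - (residual < 0))

rng = np.random.default_rng(8)
y = rng.exponential(scale=2.0, size=5000)     # skewed, like viral loads
tau = 0.9
grid = np.linspace(0.0, 10.0, 2001)
losses = [check_loss(y - c, tau).sum() for c in grid]
c_star = grid[int(np.argmin(losses))]
print(f"check-loss minimizer: {c_star:.2f}, "
      f"empirical 0.9 quantile: {np.quantile(y, 0.9):.2f}")
```

Heavy tails and skew shift the mean but not the robustness of this estimate, which is why quantile regression is preferred over mean regression for such data.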

  19. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Based on the statistical results of simulating the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logical structures of CPN models, is reliable, and could serve as the basis of structured analysis for discrete event systems. An example of a radar model characterizes the application of this approach, and the results are trustworthy.
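The core idea, ranking event paths by statistics gathered from simulation, can be sketched on a toy event graph. The graph, event names and probabilities below are invented; the CPN/PESG machinery itself is not reproduced:

```python
# Sketch: enumerate event paths of a toy DAG and rank them by simulated
# frequency, echoing the "key event path from simulation statistics" idea.
import random

edges = {"start": ["detect", "idle"],
         "detect": ["track", "lose"],
         "track": ["engage"],
         "idle": [], "lose": [], "engage": []}
probs = {("start", "detect"): 0.8, ("start", "idle"): 0.2,
         ("detect", "track"): 0.7, ("detect", "lose"): 0.3,
         ("track", "engage"): 1.0}

def paths(node):
    """Enumerate all root-to-leaf event paths of the DAG."""
    if not edges[node]:
        return [[node]]
    return [[node] + p for nxt in edges[node] for p in paths(nxt)]

random.seed(9)
counts = {tuple(p): 0 for p in paths("start")}
for _ in range(10000):                     # simulate event sequences
    node, walk = "start", ["start"]
    while edges[node]:
        r, acc = random.random(), 0.0
        for nxt in edges[node]:
            acc += probs[(node, nxt)]
            if r < acc:
                node = nxt
                break
        else:                              # float-rounding fallback
            node = edges[node][-1]
        walk.append(node)
    counts[tuple(walk)] += 1
key_path = max(counts, key=counts.get)
print("key event path:", " -> ".join(key_path))
```

The most frequent path is the one a sensitivity analysis would flag as "key"; perturbing individual transition probabilities and re-simulating would quantify each path's sensitivity.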

  20. An object-oriented feature-based design system face-based detection of feature interactions

    International Nuclear Information System (INIS)

    Ariffin Abdul Razak

    1999-01-01

    This paper presents an object-oriented, feature-based design system which supports the integration of design and manufacture by ensuring that part descriptions fully account for any feature interactions. Manufacturing information is extracted from the feature descriptions in the form of volumes and Tool Access Directions, TADs. When features interact, both volumes and TADs are updated. This methodology has been demonstrated by developing a prototype system in which ACIS attributes are used to record feature information within the data structure of the solid model. The system is implemented in the C++ programming language and embedded in a menu-driven X-windows user interface to the ACIS 3D Toolkit. (author)

  1. Protein single-model quality assessment by feature-based probability density functions.

    Science.gov (United States)

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method-Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob.
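The idea behind Qprob, scoring a model by how plausible its feature values are under error distributions learned against true quality scores, can be sketched with a single feature and a Gaussian density. The feature, noise level and Gaussian form are assumptions for illustration, not Qprob's actual feature set:

```python
# Sketch: feature-error density scoring in the spirit of Qprob.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)
# training: feature values paired with true GDT-TS-like quality scores
n = 2000
quality = rng.uniform(0.2, 0.9, n)
feat = quality + 0.05 * rng.normal(size=n)    # a feature tracking quality

# fit the density of the feature's error against the true score
err = feat - quality
mu, sigma = err.mean(), err.std()

def density_score(feature_value, candidate_quality):
    """Density of observing this feature if candidate_quality were true."""
    return norm.pdf(feature_value - candidate_quality, mu, sigma)

# the feature of a model whose true quality is 0.7 supports 0.70 over 0.30
f = 0.7 + 0.05 * rng.normal()
print(density_score(f, 0.70) > density_score(f, 0.30))
```

With several features, the per-feature densities would be combined (e.g. by summing log-densities) into a single quality estimate for the model.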

  2. A review of features in Internet consumer health decision-support tools.

    Science.gov (United States)

    Schwitzer, Gary

    2002-01-01

    Over the past decade, health care consumers have begun to benefit from new Web-based communications tools to guide decision making on treatments and tests. Using today's online tools, consumers who have Internet connections can: watch and listen to videos of physicians; watch and hear the stories of other consumers who have faced the same decisions; join an online social support network; receive estimates of their own chances of experiencing various outcomes; and do it all at home. To review currently-available Internet consumer health decision-support tools. Five Web sites offering consumer health decision-support tools are analyzed for their use of 4 key Web-enabled features: the presentation of outcomes probability data tailored to the individual user; the use of videotaped patient interviews in the final product to convey the experiences of people who have faced similar diagnoses in the past; the ability to interact with others in a social support network; and the accessibility of the tool to any health care consumers with an Internet connection. None of the 5 Web sites delivers all 4 target features to all Web users. The reasons for these variations in the use of key Web functionality--features that make the Web distinctive--are not immediately clear. Consumers trying to make health care decisions may benefit from current Web-based decision-support tools. But, variations in Web developers' use of 4 key Web-enabled features leaves the online decision-support experience less than what it could be. Key research questions are identified that could help in the development of new hybrid patient decision-support tools.

  3. Ontology patterns for complex topographic feature types

    Science.gov (United States)

    Varanka, Dalia E.

    2011-01-01

    Complex feature types are defined as integrated relations between basic features for a shared meaning or concept. The shared semantic concept is difficult to define in commonly used geographic information systems (GIS) and remote sensing technologies. The role of spatial relations between complex feature parts was recognized in early GIS literature, but had limited representation in the feature or coverage data models of GIS. Spatial relations are more explicitly specified in semantic technology. In this paper, semantics for topographic feature ontology design patterns (ODP) are developed as data models for the representation of complex features. In the context of topographic processes, component assemblages are supported by resource systems and are found on local landscapes. The topographic ontology is organized across six thematic modules that can account for basic feature types, resource systems, and landscape types. Types of complex feature attributes include location, generative processes and physical description. Node/edge networks model standard spatial relations and relations specific to topographic science to represent complex features. To demonstrate these concepts, data from The National Map of the U. S. Geological Survey was converted and assembled into ODP.

  4. Feature-based tolerancing for intelligent inspection process definition

    International Nuclear Information System (INIS)

    Brown, C.W.

    1993-07-01

    This paper describes a feature-based tolerancing capability that complements a geometric solid model with an explicit representation of conventional and geometric tolerances. This capability is focused on supporting an intelligent inspection process definition system. The feature-based tolerance model's benefits include advancing complete product definition initiatives (e.g., STEP -- Standard for Exchange of Product model data), supplying computer-integrated manufacturing applications (e.g., generative process planning and automated part programming) with product definition information, and assisting in the solution of measurement performance issues. A feature-based tolerance information model was developed based upon the notion of a feature's toleranceable aspects, and describes an object-oriented scheme for representing and relating tolerance features, tolerances, and datum reference frames. For easy incorporation, the tolerance feature entities are interconnected with STEP solid model entities. This schema will explicitly represent the tolerance specification for mechanical products, support advanced dimensional measurement applications, and assist in tolerance-related methods divergence issues
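An object-oriented scheme relating tolerance features, tolerances and datum reference frames, as described above, can be sketched with a few classes. Class and attribute names here are invented for illustration; the paper's actual schema lives in ACIS/STEP entities:

```python
# Sketch: a minimal object model for feature-based tolerancing.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DatumReferenceFrame:
    datums: List[str]                    # e.g. ["A", "B", "C"]

@dataclass
class Tolerance:
    kind: str                            # "position", "flatness", ...
    value: float                         # tolerance zone size (mm)
    frame: Optional[DatumReferenceFrame] = None

@dataclass
class ToleranceFeature:
    name: str                            # a toleranceable aspect of a feature
    tolerances: List[Tolerance] = field(default_factory=list)

    def tightest(self) -> Tolerance:
        """The controlling tolerance an inspection plan must verify."""
        return min(self.tolerances, key=lambda t: t.value)

drf = DatumReferenceFrame(datums=["A", "B"])
hole = ToleranceFeature("hole_axis", [
    Tolerance("position", 0.10, drf),
    Tolerance("perpendicularity", 0.05, drf),
])
print("tightest control:", hole.tightest().kind)
```

An inspection process planner would walk such objects to decide which measurements each feature requires and against which datum reference frame.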

  5. A Quantitative Feasibility Study on Potential Safety Improvement Effects of Advanced Safety Features in APR-1400 when Applied to OPR-1000

    Energy Technology Data Exchange (ETDEWEB)

    Ualikhan Zhiyenbayev [KAIST, Daejeon (Korea, Republic of)]; Chung, Dae Wook [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)]

    2015-10-15

    The Advanced Power Reactor 1400 (APR-1400) adopts several advanced safety features compared to its predecessor, the Optimized Power Reactor 1000 (OPR-1000), including an additional Emergency Diesel Generator (EDG), increased battery capacity, an in-containment refueling water storage tank (IRWST), and so on. Considering the remarkable safety benefits of these features and the design similarities between APR-1400 and OPR-1000, it is feasible to apply key advanced safety features of APR-1400 to OPR-1000 to enhance its safety. This study tests the feasibility of such applications using Probabilistic Safety Assessment (PSA). Three of the advanced safety features are selected: 1. providing an additional EDG; 2. increasing the capacity of Class 1E batteries; 3. placing the Refueling Water Storage Tank (RWST) inside containment, i.e., changing from RWST to IRWST. The selected safety features are incorporated into the OPR-1000 PSA model using the Advanced Information Management System (AIMS) for PSA, and CDFs are re-evaluated for each application and for the combination of the three applications. Based on the current results, it is concluded that these three key advanced safety features of APR-1400 can be effectively applied to OPR-1000, resulting in considerable safety improvement. In aggregate, the three advanced safety features, an additional EDG, increased battery capacity, and IRWST, can reduce the CDF of OPR-1000 by more than 15% when applied together.

  6. A Quantitative Feasibility Study on Potential Safety Improvement Effects of Advanced Safety Features in APR-1400 when Applied to OPR-1000

    International Nuclear Information System (INIS)

    Ualikhan Zhiyenbayev; Chung, Dae Wook

    2015-01-01

    The Advanced Power Reactor 1400 (APR-1400) adopts several advanced safety features compared to its predecessor, the Optimized Power Reactor 1000 (OPR-1000), including an additional Emergency Diesel Generator (EDG), increased battery capacity, an in-containment refueling water storage tank (IRWST), and so on. Considering the remarkable safety benefits of these features and the design similarities between APR-1400 and OPR-1000, it is feasible to apply key advanced safety features of APR-1400 to OPR-1000 to enhance its safety. This study tests the feasibility of such applications using Probabilistic Safety Assessment (PSA). Three of the advanced safety features are selected: 1. providing an additional EDG; 2. increasing the capacity of Class 1E batteries; 3. placing the Refueling Water Storage Tank (RWST) inside containment, i.e., changing from RWST to IRWST. The selected safety features are incorporated into the OPR-1000 PSA model using the Advanced Information Management System (AIMS) for PSA, and CDFs are re-evaluated for each application and for the combination of the three applications. Based on the current results, it is concluded that these three key advanced safety features of APR-1400 can be effectively applied to OPR-1000, resulting in considerable safety improvement. In aggregate, the three advanced safety features, an additional EDG, increased battery capacity, and IRWST, can reduce the CDF of OPR-1000 by more than 15% when applied together.

  7. Is the fluid mosaic (and the accompanying raft hypothesis) a suitable model to describe fundamental features of biological membranes? What may be missing?

    Directory of Open Access Journals (Sweden)

    Luis Alberto Bagatolli

    2013-11-01

    The structure, dynamics, and stability of lipid bilayers are controlled by thermodynamic forces, leading to overall tensionless membranes with a distinct lateral organization and a conspicuous lateral pressure profile. Bilayers are also subject to built-in curvature-stress instabilities that may be released locally or globally in terms of morphological changes leading to the formation of non-lamellar and curved structures. A key controller of the bilayer’s propensity to form curved structures is the average molecular shape of the different lipid molecules. Via the curvature stress, molecular shape mediates a coupling to membrane-protein function and provides a set of physical mechanisms for the formation of lipid domains and laterally differentiated regions in the plane of the membrane. Unfortunately, these relevant physical features of membranes are often ignored in the most popular models for biological membranes. Results from a number of experimental and theoretical studies emphasize the significance of these fundamental physical properties and call for a refinement of the fluid mosaic model (and the accompanying raft hypothesis).

  8. Models for solid oxide fuel cell systems exploitation of models hierarchy for industrial design of control and diagnosis strategies

    CERN Document Server

    Marra, Dario; Polverino, Pierpaolo; Sorrentino, Marco

    2016-01-01

    This book presents methodologies for optimal design of control and diagnosis strategies for Solid Oxide Fuel Cell systems. A key feature of the methodologies presented is the exploitation of modelling tools that balance accuracy and computational burden.

  9. Key Feature of the Catalytic Cycle of TNF-α Converting Enzyme Involves Communication Between Distal Protein Sites and the Enzyme Catalytic Core

    International Nuclear Information System (INIS)

    Solomon, A.; Akabayov, B.; Frenkel, A.; Millas, M.; Sagi, I.

    2007-01-01

    Despite their key roles in many normal and pathological processes, the molecular details by which zinc-dependent proteases hydrolyze their physiological substrates remain elusive. Advanced theoretical analyses have suggested reaction models for which there is limited and controversial experimental evidence. Here we report the structure, chemistry and lifetime of transient metal-protein reaction intermediates evolving during the substrate turnover reaction of a metalloproteinase, the tumor necrosis factor-α converting enzyme (TACE). TACE controls multiple signal transduction pathways through the proteolytic release of the extracellular domain of a host of membrane-bound factors and receptors. Using stopped-flow x-ray spectroscopy methods together with transient kinetic analyses, we demonstrate that TACE's catalytic zinc ion undergoes dynamic charge transitions before substrate binding to the metal ion. This indicates previously undescribed communication pathways between distal protein sites and the enzyme catalytic core. The observed charge transitions are synchronized with distinct phases in the reaction kinetics and with changes in metal coordination chemistry mediated by the binding of the peptide substrate to the catalytic metal ion and product release. Here we report key local charge transitions critical for proteolysis, as well as long-sought evidence for the proposed reaction model of peptide hydrolysis. This study provides a general approach for gaining critical insights into the molecular basis of substrate recognition and turnover by zinc metalloproteinases that may be used for drug design.

  10. Featuring Multiple Local Optima to Assist the User in the Interpretation of Induced Bayesian Network Models

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Pena, Jose; Kocka, Tomas

    2004-01-01

    We propose a method to assist the user in the interpretation of the best Bayesian network model induced from data. The method consists of extracting relevant features from the model (e.g. edges, directed paths and Markov blankets) and then assessing the confidence in them by studying multiple...

  11. Features that contribute to the usefulness of low-fidelity models for surgical skills training

    DEFF Research Database (Denmark)

    Langebæk, Rikke; Berendt, Mette; Pedersen, Lene Tanggaard

    2012-01-01

    of models were developed to be used in a basic surgical skills course for veterinary students. The models were low fidelity, having limited resemblance to real animals. The aim of the present study was to describe the students' learning experience with the models and to report their perception...... of the usefulness of the models in applying the trained skills to live animal surgery. One hundred and forty-six veterinary fourth-year students evaluated the models on a four-point Likert scale. Of these, 26 additionally participated in individual semistructured interviews. The survey results showed that 75 per...... educational tools in preparation for live animal surgery. However, there are specific features to take into account when developing models in order for students to perceive them as useful....

  12. Antimicrobial Nanoplexes meet Model Bacterial Membranes: the key role of Cardiolipin

    Science.gov (United States)

    Marín-Menéndez, Alejandro; Montis, Costanza; Díaz-Calvo, Teresa; Carta, Davide; Hatzixanthis, Kostas; Morris, Christopher J.; McArthur, Michael; Berti, Debora

    2017-01-01

    Antimicrobial resistance to traditional antibiotics is a crucial challenge of medical research. Oligonucleotide therapeutics, such as antisense or Transcription Factor Decoys (TFDs), have the potential to circumvent current resistance mechanisms by acting on novel targets. However, their full translation into clinical application requires efficient delivery strategies and a fundamental comprehension of their interaction with target bacterial cells. To address these points, we employed a novel cationic bolaamphiphile that binds TFDs with high affinity to form self-assembled complexes (nanoplexes). Confocal microscopy revealed that nanoplexes efficiently transfect bacterial cells, consistent with their biological efficacy in animal models. To understand the factors affecting the delivery process, liposomes with varying compositions, taken as model synthetic bilayers, were challenged with nanoplexes and investigated with scattering and fluorescence techniques. Thanks to the combination of results on bacteria and synthetic membrane models, we demonstrate for the first time that the prokaryotic-enriched anionic lipid cardiolipin (CL) plays a key role in the delivery of TFDs to bacteria. Moreover, we can hypothesize an overall TFD delivery mechanism, in which bacterial membrane reorganization with a permeability increase and release of the TFD from the nanoplexes are the main factors. These results will be of great benefit in boosting the development of oligonucleotide-based antimicrobials of superior efficacy.

  13. Predicting Spatial Distribution of Key Honeybee Pests in Kenya Using Remotely Sensed and Bioclimatic Variables: Key Honeybee Pests Distribution Models

    Directory of Open Access Journals (Sweden)

    David M. Makori

    2017-02-01

    Beekeeping is indispensable to global food production. It is an alternate income source, especially in rural underdeveloped African settlements, and an important forest conservation incentive. However, dwindling honeybee colonies around the world are attributed to pests and diseases whose spatial distribution and influences are not well established. In this study, we used remotely sensed data to improve the reliability of pest ecological niche (EN) models to attain reliable pest distribution maps. Occurrence data on four pests (Aethina tumida, Galleria mellonella, Oplostomus haroldi and Varroa destructor) were collected from apiaries within four main agro-ecological regions responsible for over 80% of Kenya’s beekeeping. Africlim bioclimatic and derived normalized difference vegetation index (NDVI) variables were used to model their ecological niches using Maximum Entropy (MaxEnt). Combined precipitation variables had a high positive logit influence on the performance of all remotely sensed and biotic models. Remotely sensed vegetation variables had a substantial effect on the model, contributing up to 40.8% for G. mellonella, and regions with high rainfall seasonality were predicted to be high-risk areas. Projections (to 2055) indicated that, with the current climate change trend, these regions will experience increased honeybee pest risk. We conclude that honeybee pests can be modelled using bioclimatic data and remotely sensed variables in MaxEnt. Although the bioclimatic data were most relevant in all model results, incorporating vegetation seasonality variables to improve mapping of the ‘actual’ habitat of key honeybee pests and to identify risk and containment zones needs to be further investigated.

  14. Crowding with conjunctions of simple features.

    Science.gov (United States)

    Põder, Endel; Wagemans, Johan

    2007-11-20

    Several recent studies have related crowding with the feature integration stage in visual processing. In order to understand the mechanisms involved in this stage, it is important to use stimuli that have several features to integrate, and these features should be clearly defined and measurable. In this study, Gabor patches were used as target and distractor stimuli. The stimuli differed in three dimensions: spatial frequency, orientation, and color. A group of 3, 5, or 7 objects was presented briefly at 4 deg eccentricity of the visual field. The observers' task was to identify the object located in the center of the group. A strong effect of the number of distractors was observed, consistent with various spatial pooling models. The analysis of incorrect responses revealed that these were a mix of feature errors and mislocalizations of the target object. Feature errors were not purely random, but biased by the features of distractors. We propose a simple feature integration model that predicts most of the observed regularities.

  15. [Elucidation of key genes in sex determination in genetics teaching].

    Science.gov (United States)

    Li, Meng; He, Zhumei

    2014-06-01

    Sex is an important and complex feature of organisms, controlled by both genetic and environmental factors. The genetic factors, i.e., genes, are vital in sex determination. However, not all the related genes play the same roles; some key genes play a vital role in sex determination and differentiation. With the development of modern genetics, great progress has been made on the key genes in sex determination. In this review, we summarize the mechanism of sex determination and the strategy of how to study the key genes in sex determination. This will help readers better understand the mechanism of sex determination in the teaching of genetics.

  16. The relationship between the key elements of Donabedian's conceptual model within the field of assistive technology

    DEFF Research Database (Denmark)

    Sund, Terje; Iwarsson, Susanne; Brandt, Åse

    2015-01-01

    Previous research has suggested that there is a relationship between the three key components of Donabedian's conceptual model for quality assessments: structure of care, process, and outcome of care. That is, structure predicted both process and outcome of care, and better processes predict better...

  17. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    Science.gov (United States)

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
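    The TOPMODEL features integrated here rest on two well-known relations: the topographic wetness index and an exponential baseflow store. A minimal sketch of those two relations follows; the parameter values are illustrative, not the values calibrated for the East River basin.

```python
import math

def topographic_index(upslope_area_m2, contour_length_m, slope_tan):
    """TOPMODEL topographic wetness index ln(a / tan(beta)),
    where a is the upslope contributing area per unit contour length."""
    a = upslope_area_m2 / contour_length_m
    return math.log(a / slope_tan)

def baseflow(q0, mean_deficit, m):
    """Exponential saturated-zone store: Qb = Q0 * exp(-S_mean / m)."""
    return q0 * math.exp(-mean_deficit / m)

ti = topographic_index(5000.0, 10.0, 0.1)       # wet, gently sloping cell
qb = baseflow(q0=2.0, mean_deficit=0.05, m=0.02)  # baseflow decays with storage deficit
```

Cells with large upslope area and gentle slope get a high index (more prone to saturation), which is how topography enters the remodeled surface runoff and baseflow processes.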

  18. Bilinear modeling of EMG signals to extract user-independent features for multiuser myoelectric interface.

    Science.gov (United States)

    Matsubara, Takamitsu; Morimoto, Jun

    2013-08-01

    In this study, we propose a multiuser myoelectric interface that can easily adapt to novel users. When a user performs different motions (e.g., grasping and pinching), different electromyography (EMG) signals are measured. When different users perform the same motion (e.g., grasping), different EMG signals are also measured. Therefore, designing a myoelectric interface that can be used by multiple users to perform multiple motions is difficult. To cope with this problem, we propose for EMG signals a bilinear model that is composed of two linear factors: 1) user dependent and 2) motion dependent. By decomposing the EMG signals into these two factors, the extracted motion-dependent factors can be used as user-independent features. We can construct a motion classifier on the extracted feature space to develop the multiuser interface. For novel users, the proposed adaptation method estimates the user-dependent factor through only a few interactions. The bilinear EMG model with the estimated user-dependent factor can extract the user-independent features from the novel user's data. We applied our proposed method to a recognition task of five hand gestures for robotic hand control using four-channel EMG signals measured from subjects' forearms. Our method resulted in 73% accuracy, which was statistically significantly different from the accuracy of standard non-multiuser interfaces, as the result of a two-sample t-test at a significance level of 1%.
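    The bilinear decomposition described above can be sketched as a noiseless least-squares reconstruction: a user-dependent mixing matrix applied to shared motion-dependent factors. This is an illustrative sketch, not the authors' estimation procedure, and all dimensions and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
channels, factor_dim, samples = 4, 3, 50

# Motion-dependent factors, assumed shared across users
# (these play the role of the user-independent features)
S = rng.standard_normal((factor_dim, samples))

# A novel user corresponds to an unknown user-dependent mixing matrix
A_novel = rng.standard_normal((channels, factor_dim))
Y_novel = A_novel @ S          # EMG observed from the novel user (noiseless toy data)

# Adaptation: estimate the user-dependent factor by least squares
# from a short calibration recording with known motion factors
A_hat = Y_novel @ np.linalg.pinv(S)

# Extract user-independent features from the novel user's EMG
features = np.linalg.pinv(A_hat) @ Y_novel   # recovers S in this noiseless setting
```

A motion classifier trained on `S`-space features for one group of users can then, in principle, be reused for the novel user.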

  19. Shielding voices: The modulation of binding processes between voice features and response features by task representations.

    Science.gov (United States)

    Bogon, Johanna; Eisenbarth, Hedwig; Landgraf, Steffen; Dreisbach, Gesine

    2017-09-01

    Vocal events offer not only semantic-linguistic content but also information about the identity and the emotional-motivational state of the speaker. Furthermore, most vocal events have implications for our actions and therefore include action-related features. But the relevance and irrelevance of vocal features varies from task to task. The present study investigates binding processes for perceptual and action-related features of spoken words and their modulation by the task representation of the listener. Participants reacted with two response keys to eight different words spoken by a male or a female voice (Experiment 1) or spoken by an angry or neutral male voice (Experiment 2). There were two instruction conditions: half of participants learned eight stimulus-response mappings by rote (SR), and half of participants applied a binary task rule (TR). In both experiments, SR instructed participants showed clear evidence for binding processes between voice and response features indicated by an interaction between the irrelevant voice feature and the response. By contrast, as indicated by a three-way interaction with instruction, no such binding was found in the TR instructed group. These results are suggestive of binding and shielding as two adaptive mechanisms that ensure successful communication and action in a dynamic social environment.

  20. Toward Designing a Quantum Key Distribution Network Simulation Model

    Directory of Open Access Journals (Sweden)

    Miralem Mehic

    2016-01-01

    As research in quantum key distribution network technologies grows larger and more complex, the need for highly accurate and scalable simulation technologies becomes important to assess the practical feasibility of, and foresee difficulties in, the practical implementation of theoretical achievements. In this paper, we describe the design of a simplified simulation environment for a quantum key distribution network with multiple links and nodes. In this simulation environment, we analyzed several routing protocols in terms of the number of routing packets sent, goodput, and Packet Delivery Ratio of the data traffic flow, using the NS-3 simulator.
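    The comparison metrics named above are computed from simple simulation counters; a minimal sketch (the counter values here are made up, and this is not NS-3 code):

```python
def packet_delivery_ratio(received: int, sent: int) -> float:
    """PDR: fraction of data packets that reached their destination."""
    return received / sent if sent else 0.0

def goodput(payload_bytes_delivered: int, duration_s: float) -> float:
    """Goodput: application payload delivered per second (bytes/s),
    excluding routing and other control overhead."""
    return payload_bytes_delivered / duration_s

pdr = packet_delivery_ratio(950, 1000)   # 0.95
gp = goodput(1_200_000, 60.0)            # 20000.0 bytes/s
```

Routing overhead would be tallied separately as the count of routing packets sent, the third metric in the study.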

  1. Key features of wave energy.

    Science.gov (United States)

    Rainey, R C T

    2012-01-28

    For a weak point source or dipole, or a small body operating as either, we show that the power from a wave energy converter (WEC) is the product of the particle velocity in the waves and the wave force (suitably defined). There is thus a strong analogy with a wind or tidal turbine, where the power is the product of the fluid velocity through the turbine and the force on it. As a first approximation, the cost of a structure is controlled by the force it has to carry, which governs its strength, and the distance it has to be carried, which governs its size. Thus, WECs are at a disadvantage compared with wind and tidal turbines because the fluid velocities are lower, and hence the forces are higher. On the other hand, the distances involved are lower. As with turbines, the implication is also that a WEC must make the most of its force-carrying ability: ideally, to carry its maximum force all the time, the '100% sweating WEC'. It must be able to limit the wave force on it in larger waves, ultimately becoming near-transparent to them in the survival condition, just like a turbine in extreme conditions, which can stop and feather its blades. A turbine of any force rating can achieve its maximum force in low wind speeds, if its diameter is sufficiently large. This is not possible with a simple monopole or dipole WEC, however, because of the 'nλ/2π' capture width limits. To achieve reasonable 'sweating' in typical wave climates, the force is limited to about 1 MN for a monopole device, or 2 MN for a dipole. The conclusion is that the future of wave energy is in devices that are not simple monopoles or dipoles, but multi-body devices or other shapes equivalent to arrays.
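    The two relations at the core of this argument, power as the product of wave force and particle velocity, and the monopole capture-width limit λ/2π (n = 1), can be written down directly. The numbers below are illustrative, not taken from the paper.

```python
import math

def capture_width_monopole(wavelength_m: float) -> float:
    """Maximum capture width of a monopole (point-absorber) WEC: lambda / (2*pi)."""
    return wavelength_m / (2 * math.pi)

def absorbed_power(force_N: float, velocity_ms: float) -> float:
    """Power as the product of the wave force (suitably defined)
    and the particle velocity in the waves."""
    return force_N * velocity_ms

w = capture_width_monopole(100.0)   # ~15.9 m for a 100 m wave
p = absorbed_power(1.0e6, 1.0)      # 1 MN force at 1 m/s -> 1.0e6 W (1 MW)
```

The analogy to a turbine is visible in `absorbed_power`: the same force-times-velocity product, but with lower fluid velocities and therefore higher forces for a given power.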

  2. Thermodynamic model of social influence on two-dimensional square lattice: Case for two features

    Science.gov (United States)

    Genzor, Jozef; Bužek, Vladimír; Gendiar, Andrej

    2015-02-01

    We propose a thermodynamic multi-state spin model in order to describe the equilibrium behavior of a society. Our model is inspired by the Axelrod model used in social network studies. In the language of statistical mechanics, we analyze phase transitions of our model, in which the spin interaction J is interpreted as mutual communication among individuals forming a society. Thermal fluctuations introduce a noise T into the communication, which suppresses long-range correlations. Below a certain phase transition point Tt, large-scale clusters of individuals who share a specific dominant property are formed. The measure of the cluster sizes is an order parameter after spontaneous symmetry breaking. By means of the Corner Transfer Matrix Renormalization Group algorithm, we treat our model in the thermodynamic limit and classify the phase transitions with respect to inherent degrees of freedom. Each individual possesses f = 2 independent features, and each feature can assume one of q traits (e.g. interests); hence, each individual is described by q² degrees of freedom. A single first-order phase transition is detected in our model if q > 2, whereas two distinct continuous phase transitions are found if q = 2 only. Evaluating the free energy, order parameters, specific heat, and the entanglement von Neumann entropy, we classify the phase transitions Tt(q) in detail. The permanent existence of the ordered phase (large-scale cluster formation with a non-zero order parameter) is conjectured below a non-zero transition point Tt(q) ≈ 0.5 in the asymptotic regime q → ∞.
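    The model's basic ingredients, each lattice site carrying f = 2 features with q traits each, a coupling J rewarding shared features, and thermal noise T, can be explored with a toy Metropolis simulation. This sketch uses arbitrary lattice size, temperature, and step counts, and it replaces the paper's corner transfer matrix treatment with plain Monte Carlo.

```python
import math
import random

random.seed(1)
f, q, L, T, J = 2, 2, 8, 0.3, 1.0   # features, traits, lattice size, noise, coupling
lattice = [[tuple(random.randrange(q) for _ in range(f))
            for _ in range(L)] for _ in range(L)]

def local_energy(i, j, state):
    """-J per feature shared with each of the four nearest neighbours."""
    e = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = lattice[(i + di) % L][(j + dj) % L]
        e -= J * sum(a == b for a, b in zip(state, nb))
    return e

for _ in range(20000):               # Metropolis updates
    i, j = random.randrange(L), random.randrange(L)
    new = tuple(random.randrange(q) for _ in range(f))
    dE = local_energy(i, j, new) - local_energy(i, j, lattice[i][j])
    if dE <= 0 or random.random() < math.exp(-dE / T):
        lattice[i][j] = new

def mean_shared():
    """Order-parameter proxy: average features shared per bond
    (expected 1.0 for a random q = 2 configuration, 2.0 when fully ordered)."""
    total = bonds = 0
    for i in range(L):
        for j in range(L):
            for di, dj in ((1, 0), (0, 1)):
                nb = lattice[(i + di) % L][(j + dj) % L]
                total += sum(a == b for a, b in zip(lattice[i][j], nb))
                bonds += 1
    return total / bonds

m = mean_shared()   # ordering at low T raises this above the random value of 1.0
```

At T well below the transition the shared-feature fraction grows, mimicking the large-scale cluster formation described in the abstract.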

  3. ANALYSIS OF THE KEY ACTIVITIES OF THE LIFE CYCLE OF KNOWLEDGE MANAGEMENT IN THE UNIVERSITY AND DEVELOPMENT OF THE CONCEPTUAL ARCHITECTURE OF THE KNOWLEDGE MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Eugene N. Tcheremsina

    2013-01-01

    This article analyzes the key activities of the knowledge-management life cycle in terms of the features of knowledge management in higher education. Based on the analysis, we propose a model of the conceptual architecture of a university's virtual knowledge space. The proposed model is the basis for the development of the kernel of an intercollegiate virtual knowledge space based on cloud technology.

  4. Solutions manual to accompany finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    A solutions manual to accompany Finite Mathematics: Models and Applications. In order to emphasize the main concepts of each chapter, Finite Mathematics: Models and Applications features plentiful pedagogical elements throughout, such as special exercises, end notes, hints, select solutions, biographies of key mathematicians, boxed key principles, a glossary of important terms and topics, and an overview of the use of technology. The book encourages the modeling of linear programs and their solutions and uses common computer software programs such as LINDO. In addition to extensive chapters on pr

  5. CONSTRUCTION OF MECHANICAL MODEL OF THE DIESEL-TRAIN DTKR-2 CAR AND ITS FEATURES

    Directory of Open Access Journals (Sweden)

    A. Y. Kuzyshyn

    2017-12-01

    Purpose. The article aims to construct a mechanical model of the diesel train DTKr-2 car of the Kryukivsk Railway Car Building Works based on an analysis of the undercarriage construction. This model will be used in the study of the dynamic properties of the vehicle. When constructing the model, the design features and loading methods should be represented as fully as possible. Methodology. When constructing the mechanical model of the diesel train DTKr-2 car, the pneumatic spring, which is the main element of the central spring suspension, was modeled using a Kelvin-Voigt node comprising an elastic and a viscous element. Hydraulic shock absorbers, used in both the central and the axle-box spring suspension, were modeled as viscous elements. During the research, the rigidity of the pneumatic spring associated with the change in its effective area under deformation was assumed to be zero. Findings. The article analyzes the undercarriage design of the diesel train DTKr-2 car. Mathematical models of its main units were presented; in particular, for the central spring suspension, a model of the pneumatic spring. Taking into account the design features of the diesel train DTKr-2 undercarriage, its mechanical model was developed, which will be used in future studies of dynamic properties. Originality. For the first time, a mechanical model of the diesel train DTKr-2 car was developed that takes into account the interaction of the individual elements of its design. It was proposed to model the pneumatic spring as a Kelvin-Voigt node, which includes elastic and viscous elements arranged in parallel. Practical value. On the basis of the proposed mechanical model, a system of ordinary differential equations of the car undercarriage movement of the diesel train DTKr-2 (a mathematical model) will be compiled. This model is further planned to be used when studying the dynamic interaction of the diesel train car undercarriage wheel

  6. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...... series forecasting models....

  7. Adaptive Correlation Model for Visual Tracking Using Keypoints Matching and Deep Convolutional Feature

    Directory of Open Access Journals (Sweden)

    Yuankun Li

    2018-02-01

    Although correlation filter (CF)-based visual tracking algorithms have achieved appealing results, there are still some problems to be solved. When the target object goes through long-term occlusions or scale variation, the correlation model used in existing CF-based algorithms will inevitably learn some non-target or partial-target information. In order to avoid model contamination and enhance the adaptability of model updating, we introduce a keypoints-matching strategy and adjust the model learning rate dynamically according to the matching score. Moreover, the proposed approach extracts convolutional features from a deep convolutional neural network (DCNN) to accurately estimate the position and scale of the target. Experimental results demonstrate that the proposed tracker has achieved satisfactory performance in a wide range of challenging tracking scenarios.
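    The adaptive update can be sketched as a match-score-gated learning rate applied to the linear-interpolation model update commonly used by CF trackers. The threshold and rate values below are illustrative, not the paper's.

```python
import numpy as np

def adaptive_rate(match_score: float, base_rate: float = 0.02) -> float:
    """Scale the CF model learning rate by the keypoint-matching score in [0, 1].
    A very low score suggests occlusion, so the model update is frozen.
    Threshold and base rate are hypothetical."""
    if match_score < 0.2:       # likely occluded: do not contaminate the model
        return 0.0
    return base_rate * match_score

def update_model(model: np.ndarray, new_estimate: np.ndarray, rate: float) -> np.ndarray:
    """Standard linear-interpolation update used by CF trackers."""
    return (1.0 - rate) * model + rate * new_estimate

model = np.zeros(4)
obs = np.ones(4)
model = update_model(model, obs, adaptive_rate(0.9))   # confident match: learn
model = update_model(model, obs, adaptive_rate(0.1))   # occlusion: model frozen
```

Gating the rate this way keeps occluder appearance out of the filter while still letting confident frames refresh it.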

  8. Key Process Uncertainties in Soil Carbon Dynamics: Comparing Multiple Model Structures and Observational Meta-analysis

    Science.gov (United States)

    Sulman, B. N.; Moore, J.; Averill, C.; Abramoff, R. Z.; Bradford, M.; Classen, A. T.; Hartman, M. D.; Kivlin, S. N.; Luo, Y.; Mayes, M. A.; Morrison, E. W.; Riley, W. J.; Salazar, A.; Schimel, J.; Sridhar, B.; Tang, J.; Wang, G.; Wieder, W. R.

    2016-12-01

    Soil carbon (C) dynamics are crucial to understanding and predicting C cycle responses to global change and soil C modeling is a key tool for understanding these dynamics. While first order model structures have historically dominated this area, a recent proliferation of alternative model structures representing different assumptions about microbial activity and mineral protection is providing new opportunities to explore process uncertainties related to soil C dynamics. We conducted idealized simulations of soil C responses to warming and litter addition using models from five research groups that incorporated different sets of assumptions about processes governing soil C decomposition and stabilization. We conducted a meta-analysis of published warming and C addition experiments for comparison with simulations. Assumptions related to mineral protection and microbial dynamics drove strong differences among models. In response to C additions, some models predicted long-term C accumulation while others predicted transient increases that were counteracted by accelerating decomposition. In experimental manipulations, doubling litter addition did not change soil C stocks in studies spanning as long as two decades. This result agreed with simulations from models with strong microbial growth responses and limited mineral sorption capacity. In observations, warming initially drove soil C loss via increased CO2 production, but in some studies soil C rebounded and increased over decadal time scales. In contrast, all models predicted sustained C losses under warming. The disagreement with experimental results could be explained by physiological or community-level acclimation, or by warming-related changes in plant growth. In addition to the role of microbial activity, assumptions related to mineral sorption and protected C played a key role in driving long-term model responses. 
In general, simulations were similar in their initial responses to perturbations but diverged over longer time scales.

  9. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security

    Directory of Open Access Journals (Sweden)

    Naif Alsharabi

    2008-12-01

    Full Text Available This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralized key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct, indirect, and hybrid connections. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that NRFP is very efficient in computation, communication, and storage, and that it is also effective in defending against many sophisticated attacks.
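    The local broadcast authentication in this record relies on one-way key chains. As a rough illustration only (a generic hash-chain construction; the abstract does not specify NRFP's actual primitives, so SHA-256 and the chain length here are assumptions):

```python
import hashlib

def make_key_chain(seed: bytes, n: int) -> list[bytes]:
    """Build a one-way key chain by hashing n times; element 0 is the
    public anchor H^n(seed), later elements are disclosed over time."""
    chain = [seed]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]

def verify_disclosed_key(anchor: bytes, disclosed: bytes, max_steps: int) -> bool:
    """A receiver re-hashes a disclosed key; if it reaches the trusted
    anchor within max_steps steps, the key belongs to the chain."""
    k = disclosed
    for _ in range(max_steps):
        k = hashlib.sha256(k).digest()
        if k == anchor:
            return True
    return False

chain = make_key_chain(b"node-secret", 5)
anchor = chain[0]
# Disclosing chain[3] later: receivers verify it against the anchor.
```

    Because hashing is one-way, keys disclosed in reverse chain order authenticate broadcasts without revealing future keys, which is what lets intermediate nodes keep processing traffic in-network.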

  10. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security

    Science.gov (United States)

    Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif

    2008-01-01

    This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralized key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct, indirect, and hybrid connections. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that NRFP is very efficient in computation, communication, and storage, and that it is also effective in defending against many sophisticated attacks. PMID:27873963

  11. A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security.

    Science.gov (United States)

    Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif

    2008-12-04

    This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralized key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct, indirect, and hybrid connections. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that NRFP is very efficient in computation, communication, and storage, and that it is also effective in defending against many sophisticated attacks.

  12. A framework for treating DSM-5 alternative model for personality disorder features.

    Science.gov (United States)

    Hopwood, Christopher J

    2018-04-15

    Despite its demonstrated empirical superiority over the DSM-5 Section 2 categorical model of personality disorders for organizing the features of personality pathology, limitations remain with regard to the translation of the DSM-5 Section 3 alternative model of personality disorders (AMPD) to clinical practice. The goal of this paper is to outline a general and preliminary framework for approaching treatment from the perspective of the AMPD. Specific techniques are discussed for the assessment and treatment of both Criterion A personality dysfunction and Criterion B maladaptive traits. A concise and step-by-step model is presented for clinical decision making with the AMPD, in the hopes of offering clinicians a framework for treating personality pathology and promoting further research on the clinical utility of the AMPD. Copyright © 2018 John Wiley & Sons, Ltd.

  13. Dependency Parsing with Transformed Feature

    Directory of Open Access Journals (Sweden)

    Fuxiang Wu

    2017-01-01

    Full Text Available Dependency parsing is an important subtask of natural language processing. In this paper, we propose an embedding-feature transforming method for graph-based parsing, called transform-based parsing, which directly exploits the inner similarity of features to extract information from all feature strings, including un-indexed ones, and thereby alleviates the feature-sparsity problem. The model maps the extracted features to transformed features by applying a feature weight matrix consisting of similarities between feature strings. Since this matrix is usually rank-deficient because of similar feature strings, it can weaken the strength of the constraints. However, it is proven that duplicate transformed features do not degrade the optimization algorithm (the margin-infused relaxed algorithm), and the problem can be further alleviated by reducing the number of nearest transformed features per feature. In addition, to further improve parsing accuracy, a fusion parser is introduced that integrates transformed and original features. Our experiments verify that both the transform-based and fusion parsers improve parsing accuracy compared with the corresponding feature-based parser.
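    The core idea of this record, passing raw feature weights through a matrix of string similarities so that rare feature strings borrow strength from similar ones, can be sketched as follows (difflib's ratio stands in for the paper's unspecified similarity measure, and the feature strings are invented):

```python
import numpy as np
from difflib import SequenceMatcher

def similarity_matrix(feature_strings):
    """W[i, j] = similarity between feature strings i and j."""
    n = len(feature_strings)
    W = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            W[i, j] = W[j, i] = SequenceMatcher(
                None, feature_strings[i], feature_strings[j]).ratio()
    return W

feats_str = ["head=NN|dep=VB", "head=NN|dep=VBD", "head=JJ|dep=NN"]
raw = np.array([1.0, 0.0, 0.5])   # learned weights, one per feature string
W = similarity_matrix(feats_str)
transformed = W @ raw             # the un-indexed string 1 now scores > 0
```

    Note how the second feature string, which carried no weight of its own, receives a nonzero transformed weight purely from its similarity to the first.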

  14. Novel personalized pathway-based metabolomics models reveal key metabolic pathways for breast cancer diagnosis

    DEFF Research Database (Denmark)

    Huang, Sijia; Chong, Nicole; Lewis, Nathan

    2016-01-01

    diagnosis. We applied this method to predict breast cancer occurrence, in combination with correlation feature selection (CFS) and classification methods. Results: The resulting all-stage and early-stage diagnosis models are highly accurate in two sets of testing blood samples, with average AUCs (Area Under.......993. Moreover, important metabolic pathways, such as taurine and hypotaurine metabolism and the alanine, aspartate, and glutamate pathway, are revealed as critical biological pathways for early diagnosis of breast cancer. Conclusions: We have successfully developed a new type of pathway-based model to study...... metabolomics data for disease diagnosis. Applying this method to blood-based breast cancer metabolomics data, we have discovered crucial metabolic pathway signatures for breast cancer diagnosis, especially early diagnosis. Further, this modeling approach may be generalized to other omics data types for disease...

  15. How Task Features Impact Evidence from Assessments Embedded in Simulations and Games

    Science.gov (United States)

    Almond, Russell G.; Kim, Yoon Jeon; Velasquez, Gertrudes; Shute, Valerie J.

    2014-01-01

    One of the key ideas of evidence-centered assessment design (ECD) is that task features can be deliberately manipulated to change the psychometric properties of items. ECD identifies a number of roles that task-feature variables can play, including determining the focus of evidence, guiding form creation, determining item difficulty and…

  16. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites with singular value decomposition (SVD) of the inferred motif activities, to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities was downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
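    The MARA-plus-SVD step described here amounts to factoring a motif-by-sample activity matrix into independent regulatory programs. A minimal numerical sketch on synthetic data (the real analysis first infers motif activities from predicted binding sites, which is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic motif-activity matrix: 20 TF motifs x 8 liver samples.
activities = rng.normal(size=(20, 8))
activities -= activities.mean(axis=1, keepdims=True)   # center per motif

U, s, Vt = np.linalg.svd(activities, full_matrices=False)
# Columns of U: independent regulatory programs over motifs.
# Rows of Vt: each program's profile across the sample series.
explained = s**2 / np.sum(s**2)    # variance share per program
```

    Ranking programs by `explained` is what lets the authors disentangle which regulators dominate which pathway of tumor promotion.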

  17. Investigation of attenuation correction in SPECT using textural features, Monte Carlo simulations, and computational anthropomorphic models.

    Science.gov (United States)

    Spirou, Spiridon V; Papadimitroulas, Panagiotis; Liakou, Paraskevi; Georgoulias, Panagiotis; Loudos, George

    2015-09-01

    To present and evaluate a new methodology to investigate the effect of attenuation correction (AC) in single-photon emission computed tomography (SPECT) using textural features analysis, Monte Carlo techniques, and a computational anthropomorphic model. The GATE Monte Carlo toolkit was used to simulate SPECT experiments using the XCAT computational anthropomorphic model, filled with a realistic biodistribution of (99m)Tc-N-DBODC. The simulated gamma camera was the Siemens ECAM Dual-Head, equipped with a parallel hole lead collimator, with an image resolution of 3.54 × 3.54 mm(2). Thirty-six equispaced camera positions, spanning a full 360° arc, were simulated. Projections were calculated after applying a ± 20% energy window or after eliminating all scattered photons. The activity of the radioisotope was reconstructed using the MLEM algorithm. Photon attenuation was accounted for by calculating the radiological pathlength in a perpendicular line from the center of each voxel to the gamma camera. Twenty-two textural features were calculated on each slice, with and without AC, using 16 and 64 gray levels. A mask was used to identify only those pixels that belonged to each organ. Twelve of the 22 features showed almost no dependence on AC, irrespective of the organ involved. In both the heart and the liver, the mean and SD were the features most affected by AC. In the liver, six features were affected by AC only on some slices. Depending on the slice, skewness decreased by 22-34% with AC, kurtosis by 35-50%, long-run emphasis mean by 71-91%, and long-run emphasis range by 62-95%. In contrast, gray-level non-uniformity mean increased by 78-218% compared with the value without AC and run percentage mean by 51-159%. These results were not affected by the number of gray levels (16 vs. 64) or the data used for reconstruction: with the energy window or without scattered photons. The mean and SD were the main features affected by AC. In the heart, no other feature was affected.
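    Of the 22 features in this record, the first-order ones are simple to reproduce; this sketch computes mean, SD, skewness, and kurtosis over a masked organ region on synthetic data (run-length features such as long-run emphasis need a gray-level run-length matrix and are omitted):

```python
import numpy as np

def first_order_features(slice_img, mask):
    """First-order textural features over the voxels inside one organ mask."""
    v = slice_img[mask].astype(float)
    mu, sd = v.mean(), v.std()
    return {
        "mean": mu,
        "sd": sd,
        "skewness": np.mean((v - mu) ** 3) / sd**3,
        "kurtosis": np.mean((v - mu) ** 4) / sd**4,   # non-excess form
    }

rng = np.random.default_rng(1)
img = rng.gamma(shape=2.0, scale=10.0, size=(64, 64))  # synthetic slice
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True                              # toy organ mask
organ_feats = first_order_features(img, mask)
```

    Comparing such dictionaries computed on reconstructions with and without AC is exactly the per-slice comparison the study performs.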

  18. Discriminative phenomenological features of scale invariant models for electroweak symmetry breaking

    Directory of Open Access Journals (Sweden)

    Katsuya Hashino

    2016-01-01

    Full Text Available Classical scale invariance (CSI) may be one of the solutions for the hierarchy problem. Realistic models for electroweak symmetry breaking based on CSI require extended scalar sectors without mass terms, and the electroweak symmetry is broken dynamically at the quantum level by the Coleman–Weinberg mechanism. We discuss discriminative features of these models. First, using the experimental value of the mass of the discovered Higgs boson h(125), we obtain an upper bound on the mass of the lightest additional scalar boson (≃543 GeV), which does not depend on its isospin and hypercharge. Second, a discriminative prediction on the Higgs-photon–photon coupling is given as a function of the number of charged scalar bosons, by which we can narrow down possible models using current and future data for the di-photon decay of h(125). Finally, for the triple Higgs boson coupling a large deviation (∼+70%) from the SM prediction is universally predicted, which is independent of masses, quantum numbers and even the number of additional scalars. These models based on CSI can be well tested at LHC Run II and at future lepton colliders.

  19. Histological image classification using biologically interpretable shape-based features

    International Nuclear Information System (INIS)

    Kothari, Sonal; Phan, John H; Young, Andrew N; Wang, May D

    2013-01-01

    Automatic cancer diagnostic systems based on histological image classification are important for improving therapeutic decisions. Previous studies propose textural and morphological features for such systems. These features capture patterns in histological images that are useful for both cancer grading and subtyping. However, because many of these features lack a clear biological interpretation, pathologists may be reluctant to adopt these features for clinical diagnosis. We examine the utility of biologically interpretable shape-based features for classification of histological renal tumor images. Using Fourier shape descriptors, we extract shape-based features that capture the distribution of stain-enhanced cellular and tissue structures in each image and evaluate these features using a multi-class prediction model. We compare the predictive performance of the shape-based diagnostic model to that of traditional models, i.e., using textural, morphological and topological features. The shape-based model, with an average accuracy of 77%, outperforms or complements traditional models. We identify the most informative shapes for each renal tumor subtype from the top-selected features. Results suggest that these shapes are not only accurate diagnostic features, but also correlate with known biological characteristics of renal tumors. Shape-based analysis of histological renal tumor images accurately classifies disease subtypes and reveals biologically insightful discriminatory features. This method for shape-based analysis can be extended to other histological datasets to aid pathologists in diagnostic and therapeutic decisions

  20. A general procedure to generate models for urban environmental-noise pollution using feature selection and machine learning methods.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P

    2015-02-01

    The prediction of environmental noise in urban environments requires the solution of a complex and non-linear problem, since there are complex relationships among the multitude of variables involved in the characterization and modelling of environmental noise and environmental-noise magnitudes. Moreover, the inclusion of the great spatial heterogeneity characteristic of urban environments seems to be essential in order to achieve an accurate environmental-noise prediction in cities. This problem is addressed in this paper, where a procedure based on feature-selection techniques and machine-learning regression methods is proposed and applied to this environmental problem. Three machine-learning regression methods, which are considered very robust in solving non-linear problems, are used to estimate the energy-equivalent sound-pressure level descriptor (LAeq). These three methods are: (i) multilayer perceptron (MLP), (ii) sequential minimal optimisation (SMO), and (iii) Gaussian processes for regression (GPR). In addition, because of the high number of input variables involved in environmental-noise modelling and estimation in urban environments, which make LAeq prediction models quite complex and costly in terms of time and resources for application to real situations, three different techniques are used to approach feature selection or data reduction. The feature-selection techniques used are: (i) correlation-based feature-subset selection (CFS), (ii) wrapper for feature-subset selection (WFS), and the data reduction technique is principal-component analysis (PCA). The subsequent analysis leads to a proposal of different schemes, depending on the needs regarding data collection and accuracy. The use of WFS as the feature-selection technique with the implementation of SMO or GPR as regression algorithm provides the best LAeq estimation (R(2)=0.94 and mean absolute error (MAE)=1.14-1.16 dB(A)). Copyright © 2014 Elsevier B.V. All rights reserved.
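    The scheme in this record, data reduction followed by a nonlinear regressor, can be mimicked end to end on synthetic data; here PCA via SVD does the reduction and a ridge regression stands in for the paper's GPR/SMO regressors (all sizes, names, and coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-in for the urban-noise data: 200 street segments,
# 12 descriptors (traffic flow, speed, street width, ...), target LAeq.
X = rng.normal(size=(200, 12))
y = 65 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=1.0, size=200)

# PCA data reduction, keeping 5 principal components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T

# Ridge regression as a cheap stand-in for GPR/SMO.
A = np.hstack([Z, np.ones((len(Z), 1))])
w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(A.shape[1]), A.T @ y)
mae = np.mean(np.abs(A @ w - y))   # cf. the paper's MAE of ~1.15 dB(A)
```

    Swapping the PCA step for a wrapper or CFS feature selector, and the ridge step for a Gaussian process, reproduces the paper's other schemes.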

  1. Clinical features and management of hereditary spastic paraplegia

    Directory of Open Access Journals (Sweden)

    Ingrid Faber

    2014-03-01

    Full Text Available Hereditary spastic paraplegia (HSP) is a group of genetically-determined disorders characterized by progressive spasticity and weakness of the lower limbs. An apparently sporadic case of adult-onset spastic paraplegia is a frequent clinical problem, and a significant proportion of cases are likely to be of genetic origin. HSP is clinically divided into pure and complicated forms. The latter present with a wide range of additional neurological and systemic features. To date, up to 60 genetic subtypes have been reported. All modes of monogenic inheritance have been described: autosomal dominant, autosomal recessive, X-linked and mitochondrial traits. Recent advances point to abnormal axonal transport as a key mechanism leading to the degeneration of the long motor neuron axons in the central nervous system in HSP. In this review we aim to address recent advances in the field, placing emphasis on key diagnostic features that will help practicing neurologists to identify and manage these conditions.

  2. Oncology Modeling for Fun and Profit! Key Steps for Busy Analysts in Health Technology Assessment.

    Science.gov (United States)

    Beca, Jaclyn; Husereau, Don; Chan, Kelvin K W; Hawkins, Neil; Hoch, Jeffrey S

    2018-01-01

    In evaluating new oncology medicines, two common modeling approaches are state transition (e.g., Markov and semi-Markov) and partitioned survival. Partitioned survival models have become more prominent in oncology health technology assessment processes in recent years. Our experience in conducting and evaluating models for economic evaluation has highlighted many important and practical pitfalls. As there is little guidance available on best practices for those who wish to conduct them, we provide guidance in the form of 'Key steps for busy analysts,' who may have very little time and require highly favorable results. Our guidance highlights the continued need for rigorous conduct and transparent reporting of economic evaluations regardless of the modeling approach taken, and the importance of modeling that better reflects reality, which includes better approaches to considering plausibility, estimating relative treatment effects, dealing with post-progression effects, and appropriate characterization of the uncertainty from modeling itself.
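    The partitioned-survival approach mentioned in this record derives state occupancy directly from two survival curves rather than from transition probabilities, which is precisely why it is quick for busy analysts. A toy sketch with exponential curves (the rates are invented):

```python
import numpy as np

t = np.linspace(0, 60, 61)        # months
pfs = np.exp(-0.05 * t)           # progression-free survival curve
os_ = np.exp(-0.02 * t)           # overall survival curve (flatter)

# Partitioned survival: state occupancy is read off the curves directly.
progression_free = pfs
progressed = os_ - pfs            # alive but post-progression
dead = 1 - os_

# Undiscounted expected survival via the trapezoid rule (dt = 1 month).
life_years = np.sum((os_[1:] + os_[:-1]) / 2) / 12.0
```

    The subtraction step is also the method's weakness flagged in the guidance: nothing in the model forces `progressed` to stay non-negative once the two curves come from separately extrapolated fits.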

  3. Multiple Paths to Mathematics Practice in Al-Kashi's Key to Arithmetic

    Science.gov (United States)

    Taani, Osama

    2014-01-01

    In this paper, I discuss one of the most distinguishing features of Jamshid al-Kashi's pedagogy from his Key to Arithmetic, a well-known Arabic mathematics textbook from the fifteenth century. This feature is the multiple paths that he includes to find a desired result. In the first section light is shed on al-Kashi's life and his contributions to mathematics and astronomy. Section 2 starts with a brief discussion of the contents and pedagogy of the Key to Arithmetic. Al-Kashi's multiple approaches are discussed through four different examples of his versatility in presenting a topic from multiple perspectives. These examples are multiple definitions, multiple algorithms, multiple formulas, and multiple methods for solving word problems. Section 3 is devoted to some benefits that can be gained by implementing al-Kashi's multiple paths approach in modern curricula. For this discussion, examples from two teaching modules taken from the Key to Arithmetic and implemented in Pre-Calculus and mathematics courses for preservice teachers are discussed. Also, the conclusions are supported by some aspects of these modules. This paper is an attempt to help mathematics educators explore more benefits from reading from original sources.

  4. Distinguishing obsessive features and worries: the role of thought-action fusion.

    Science.gov (United States)

    Coles, M E; Mennin, D S; Heimberg, R G

    2001-08-01

    Obsessions are a key feature of obsessive-compulsive disorder (OCD), and chronic worry is the cardinal feature of generalized anxiety disorder (GAD). However, these two cognitive processes are conceptually very similar, and there is a need to determine how they differ. Recent studies have attempted to identify cognitive processes that may be differentially related to obsessive features and worry. In the current study we proposed that (1) obsessive features and worry could be differentiated and that (2) a measure of the cognitive process thought-action fusion would distinguish between obsessive features and worry, being strongly related to obsessive features after controlling for the effects of worry. These hypotheses were supported in a sample of 173 undergraduate students. Thought-action fusion may be a valuable construct in differentiating between obsessive features and worry.

  5. Elysium region, mars: Tests of lithospheric loading models for the formation of tectonic features

    International Nuclear Information System (INIS)

    Hall, J.L.; Solomon, S.C.; Head, J.W.

    1986-01-01

    The second largest volcanic province on Mars lies in the Elysium region. Like the larger Tharsis province, Elysium is marked by a topographic rise and a broad free air gravity anomaly and also exhibits a complex assortment of tectonic and volcanic features. We test the hypothesis that the tectonic features in the Elysium region are the product of stresses produced by loading of the Martian lithosphere. We consider loading at three different scales: local loading by individual volcanoes, regional loading of the lithosphere from above or below, and quasi-global loading by Tharsis. A comparison of flexural stresses with lithospheric strength and with the inferred maximum depth of faulting confirms that concentric graben around Elysium Mons can be explained as resulting from local flexure of an elastic lithosphere about 50 km thick in response to the volcano load. Volcanic loading on a regional scale, however, leads to predicted stresses inconsistent with all observed tectonic features, suggesting that loading by widespread emplacement of thick plains deposits was not an important factor in the tectonic evolution of the Elysium region. A number of linear extensional features oriented generally NW-SE may have been the result of flexural uplift of the lithosphere on the scale of the Elysium rise. The global stress field associated with the support of the Tharsis rise appears to have influenced the development of many of the tectonic features in the Elysium region, including Cerberus Rupes and the systems of ridges in eastern and western Elysium. The comparisons of stress models for Elysium with the preserved tectonic features support a succession of stress fields operating at different times in the region
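    The concentric-graben test in this record rests on thin-elastic-plate flexure. For reference, the standard textbook relations (not taken from the abstract) linking elastic thickness to the flexural length scale are

    D = \frac{E T_e^3}{12(1 - \nu^2)}, \qquad \alpha = \left( \frac{4D}{\Delta\rho\, g} \right)^{1/4}

    where D is the flexural rigidity, E Young's modulus, T_e the elastic thickness (about 50 km here), \nu Poisson's ratio, \Delta\rho the density contrast across the plate, and g the surface gravity; the radial distance at which flexural arching favors graben formation scales with \alpha.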

  6. Choosing preclinical study models of diabetic retinopathy: key problems for consideration

    Directory of Open Access Journals (Sweden)

    Mi XS

    2014-11-01

    Full Text Available Xue-Song Mi,1,2 Ti-Fei Yuan,3,4 Yong Ding,1 Jing-Xiang Zhong,1 Kwok-Fai So4,5 1Department of Ophthalmology, First Affiliated Hospital of Jinan University, Guangzhou, Guangdong, People’s Republic of China; 2Department of Anatomy, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Hong Kong, People’s Republic of China; 3School of Psychology, Nanjing Normal University, Nanjing, People’s Republic of China; 4Department of Ophthalmology, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Hong Kong; 5Guangdong-Hongkong-Macau Institute of Central Nervous System, Jinan University, Guangzhou, People’s Republic of China Abstract: Diabetic retinopathy (DR) is the most common complication of diabetes mellitus in the eye. Although the clinical treatment of DR has already developed to a relatively high level, many urgent problems still need to be investigated in clinical and basic science. Currently, many in vivo animal models and in vitro culture systems have been applied to solve these problems, and many approaches have been used to establish different DR models. However, to date there is no single study model that can clearly and exactly mimic the developmental process of human DR. Choosing a suitable model is important, not only for achieving research goals smoothly, but also for better matching the different experimental aims of a study. In this review, key problems for consideration in choosing study models of DR are discussed. These problems relate to clinical relevance, different approaches for establishing models, and the choice of different animal species as well as specific in vitro culture systems. Attending to these considerations will deepen understanding of current study models and optimize experimental design toward the final goal of preventing DR. Keywords: animal model, in vitro culture, ex vivo culture, neurovascular dysfunction

  7. Research on Methods for Discovering and Selecting Cloud Infrastructure Services Based on Feature Modeling

    Directory of Open Access Journals (Sweden)

    Huamin Zhu

    2016-01-01

    Full Text Available Nowadays, more and more cloud infrastructure service providers are providing large numbers of service instances which are a combination of diversified resources, such as computing, storage, and network. However, for cloud infrastructure services, the lack of a description standard and the inadequate research on systematic discovery and selection methods have made it difficult for users to discover and choose services. First, considering the highly configurable properties of a cloud infrastructure service, the feature model method is used to describe such a service. Second, based on this description, a systematic discovery and selection method for cloud infrastructure services is proposed. The automatic analysis techniques of the feature model are introduced to verify the model’s validity and to perform the matching of the service and demand models. Finally, we determine the critical decision metrics and their corresponding measurement methods for cloud infrastructure services, where the subjective and objective weighting results are combined to determine the weights of the decision metrics. The best matching instances from various providers are then ranked by their comprehensive evaluations. Experimental results show that the proposed methods can effectively improve the accuracy and efficiency of cloud infrastructure service discovery and selection.
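    In its simplest form, matching a service instance against a demand feature model reduces to checking required and excluded features against a configuration. A toy sketch (feature names are invented; real feature-model analysis also handles cross-tree constraints, which this omits):

```python
def satisfies(instance: set, required: set, excluded: set) -> bool:
    """A service instance matches a demand model when it contains every
    required feature and none of the excluded ones."""
    return required <= instance and not (excluded & instance)

instance = {"compute.vcpu4", "storage.ssd", "network.10gbps"}
ok = satisfies(instance, required={"storage.ssd"}, excluded={"storage.hdd"})
```

    Instances that pass this filter would then be ranked by the weighted decision metrics described in the abstract.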

  8. Using probabilistic model as feature descriptor on a smartphone device for autonomous navigation of unmanned ground vehicles

    Science.gov (United States)

    Desai, Alok; Lee, Dah-Jye

    2013-12-01

    There has been significant research on the development of feature descriptors in the past few years. Most of them do not emphasize real-time applications. This paper presents the development of an affine invariant feature descriptor for low-resource applications such as UAVs and UGVs that are equipped with an embedded system with a small microprocessor, a field programmable gate array (FPGA), or a smartphone device. UAVs and UGVs have proven suitable for many promising applications such as unknown environment exploration and search and rescue operations. These applications require on-board image processing for obstacle detection, avoidance, and navigation. All these real-time vision applications require a camera to grab images and match features using a feature descriptor. A good feature descriptor will uniquely describe a feature point, thus allowing it to be correctly identified and matched with its corresponding feature point in another image. Few feature description algorithms are available for a resource-limited system. They either require too much of the device's resources or simplify the algorithm so much that performance suffers. This research is aimed at meeting the needs of these systems without sacrificing accuracy. This paper introduces a new feature descriptor called PRObabilistic model (PRO) for UGV navigation applications. It is a compact and efficient binary descriptor that is hardware-friendly and easy to implement.
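    A binary descriptor like the one in this record is matched by Hamming distance, which is what makes it cheap on embedded hardware. A generic sketch (the descriptor bits here are random; PRO's actual sampling pattern is not given in the abstract):

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two binary descriptors packed as uint8."""
    return int(np.unpackbits(a ^ b).sum())

def match(desc1, desc2):
    """Brute-force nearest-neighbour matching between descriptor sets."""
    return [(i, int(np.argmin([hamming(d, e) for e in desc2])))
            for i, d in enumerate(desc1)]

rng = np.random.default_rng(3)
desc_a = rng.integers(0, 256, size=(4, 32), dtype=np.uint8)   # 256-bit each
flips = (rng.random((4, 32)) < 0.02).astype(np.uint8)          # mild noise
desc_b = desc_a ^ flips      # "second image" view of the same feature points
matches = match(desc_a, desc_b)
```

    On an FPGA the same comparison is a XOR followed by a popcount, one cycle per word, which is the hardware-friendliness the abstract alludes to.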

  9. ClinicalKey 2.0: Upgrades in a Point-of-Care Search Engine.

    Science.gov (United States)

    Huslig, Mary Ann; Vardell, Emily

    2015-01-01

    ClinicalKey 2.0, launched September 23, 2014, offers a mobile-friendly design with a search history feature for targeting point-of-care resources for health care professionals. Browsing is improved with searchable, filterable listings of sources highlighting new resources. ClinicalKey 2.0 improvements include more than 1,400 new Topic Pages for quick access to point-of-care content. A sample search details some of the upgrades and content options.

  10. Feature scale modeling for etching and deposition processes in semiconductor manufacturing

    International Nuclear Information System (INIS)

    Pyka, W.

    2000-04-01

    Concerning the modeling of ballistic-transport-determined low-pressure processes, the equations for the calculation of local etching and deposition rates have been revised. New extensions, such as the full relation between angular and radial target emission characteristics and the particle distributions resulting at different positions on the wafer, have been added, and results from reactor-scale simulations have been linked to the feature-scale profile evolution. Moreover, a fitting model has been implemented which reduces the number of parameters for particle distributions, scattering mechanisms, and angular-dependent surface interactions. Concerning diffusion-determined high-pressure CVD processes, a continuum transport and reaction model has for the first time been implemented in three dimensions. It comprises a flexible interface for the formulation of the involved process chemistry and derives the local deposition rate from a finite element diffusion calculation carried out on the three-dimensional mesh of the gas domain above the feature. For each time-step of the deposition simulation the mesh is automatically generated as counterpart to the surface of the three-dimensional structure evolving with time. The CVD model has also been coupled with equipment simulations. (author)

  11. Feature displacement interpolation

    DEFF Research Database (Denmark)

    Nielsen, Mads; Andresen, Per Rønsholt

    1998-01-01

Given a sparse set of feature matches, we want to compute an interpolated dense displacement map. The application may be stereo disparity computation, flow computation, or non-rigid medical registration. Also estimation of missing image data may be phrased in this framework. Since the features often are very sparse, the interpolation model becomes crucial. We show that a maximum likelihood estimation based on the covariance properties (Kriging) shows properties more expedient than methods such as Gaussian interpolation or Tikhonov regularizations, also including scale-selection. The computational complexities are identical. We apply the maximum likelihood interpolation to growth analysis of the mandibular bone. Here, the features used are the crest-lines of the object surface.
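The Kriging-style interpolation the abstract favors can be illustrated in a few lines: a covariance model over the sample locations yields weights that reproduce the sparse observations exactly while interpolating in between. The sketch below is 1-D with an assumed exponential covariance and is not the authors' implementation:

```python
import numpy as np

def kriging_interpolate(x_obs, y_obs, x_new, length_scale=1.0):
    """Simple-kriging-style interpolator with an assumed exponential
    covariance model (the paper estimates covariances from the data)."""
    def cov(a, b):
        return np.exp(-np.abs(a[:, None] - b[None, :]) / length_scale)
    K = cov(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))  # jitter for stability
    weights = np.linalg.solve(K, cov(x_obs, x_new))      # kriging weights
    return weights.T @ y_obs

# Sparse "feature matches" along one axis and their displacements.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 1.0, 0.5, 2.0])
y_hat = kriging_interpolate(x, y, x)  # exact at the observed sites
```

Because the same covariance matrix is used for fitting and prediction, the interpolant passes through every observation, which is the behavior the paper exploits for sparse feature matches.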

  12. Predicting establishment of non-native fishes in Greece: identifying key features

    Directory of Open Access Journals (Sweden)

    Christos Gkenas

    2015-11-01

Full Text Available Non-native fishes are known to cause economic damage to human society and are considered a major driver of biodiversity loss in freshwater ecosystems. The growing concern about these impacts has driven an investigation of the biological traits that facilitate the establishment of non-native fish. However, an ill-chosen statistical model can lead researchers to ambiguous conclusions. Here, we present a comprehensive comparison of traditional and alternative statistical methods for predicting fish invasions, using logistic regression, classification trees, multiple correspondence analysis and random forest analysis to determine the characteristics of non-native fishes that succeeded or failed to establish in the Hellenic Peninsula. We defined fifteen categorical predictor variables with biological relevance and measures of human interest. Our study showed that accuracy differed according to the model and the number of factors considered. Among all the models tested, random forest and logistic regression performed best, although all approaches predicted non-native fish establishment with moderate to excellent results. Detailed evaluation among the models corresponded with differences in variable importance, with three biological variables (parental care, distance from nearest native source and maximum size and two variables of human interest (prior invasion success and propagule pressure being important in predicting establishment. The statistical methods presented have a high predictive power and can be used as a risk assessment tool to prevent future freshwater fish invasions in this region with an imperiled fish fauna.
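As a rough illustration of the logistic-regression side of such an analysis, the sketch below fits a logistic model by gradient ascent to synthetic establishment data with a single hypothetical predictor standing in for propagule pressure; the data and settings are invented, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: one "propagule pressure" score per introduced species;
# establishment (y=1) is more likely when the score is high.
n = 200
x = rng.normal(size=n)
y = (x + 0.3 * rng.normal(size=n) > 0).astype(float)

X = np.column_stack([np.ones(n), x])   # intercept + predictor
w = np.zeros(2)
for _ in range(500):                   # plain gradient ascent on the log-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted establishment probability
    w += 0.1 * X.T @ (y - p) / n

accuracy = np.mean(((X @ w) > 0) == (y > 0.5))
```

A random-forest comparison, as in the paper, would train an ensemble on the same predictors and compare held-out accuracy; the logistic fit above is only the simplest member of the model family the authors evaluated.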

  13. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    Directory of Open Access Journals (Sweden)

    Li Yao

    2016-01-01

Full Text Available Both static features and motion features have shown promising performance in the human activity recognition task. However, the information included in these features is insufficient for complex human activities. In this paper, we propose extracting the relational information of static features and motion features for human activity recognition. The videos are represented by a classical Bag-of-Words (BoW) model, which has proven useful in many works. To get a compact and discriminative codebook of small dimension, we employ a divisive algorithm based on KL-divergence to reconstruct the codebook. After that, to further capture strong relational information, we construct a bipartite graph to model the relationship between words of different feature sets. Then we use a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector with strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm’s projective function. We test our work on several datasets and obtain very promising results.
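The KL-divergence-based codebook compression can be illustrated with a toy version: each codebook word is represented by its class-conditional distribution, and words with similar distributions are merged into the same cluster. This is a simplified k-means-style sketch with KL as the distortion measure, not the authors' divisive algorithm:

```python
import numpy as np

def kl(p, q):
    """KL divergence between discrete distributions (with smoothing)."""
    p = p + 1e-12; q = q + 1e-12
    return float(np.sum(p * np.log(p / q)))

def cluster_words(word_dists, k, iters=20, seed=0):
    """Group codebook words whose class-conditional distributions are
    similar in KL -- a rough sketch of KL-based codebook compaction."""
    rng = np.random.default_rng(seed)
    centroids = word_dists[rng.choice(len(word_dists), k, replace=False)]
    for _ in range(iters):
        labels = np.array([np.argmin([kl(w, c) for c in centroids])
                           for w in word_dists])
        for j in range(k):
            if np.any(labels == j):
                m = word_dists[labels == j].mean(axis=0)
                centroids[j] = m / m.sum()   # renormalize the centroid
    return labels

# Four words with distributions over 2 activity classes: two obvious groups.
words = np.array([[0.9, 0.1], [0.85, 0.15], [0.1, 0.9], [0.15, 0.85]])
labels = cluster_words(words, k=2)
```

In the paper the merged words become the nodes of a bipartite graph between static and motion vocabularies; the clustering step above only shows the distribution-similarity criterion.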

  14. Feature: Post Traumatic Stress Disorder (PTSD): NIH Research to Results

    Science.gov (United States)

    ... PTSD: NIH Research to Results Past Issues / Winter 2009 ... be a key to a better understanding of PTSD and early identification of those at risk. Early ...

  15. Features of microscopic pedestrian movement in a panic situation based on cellular automata model

    Science.gov (United States)

    Ibrahim, Najihah; Hassan, Fadratul Hafinaz

    2017-10-01

    Pedestrian movement is one subset of crowd management under the simulation objective. During a panic situation, pedestrians usually create microscopic movements that lead towards self-organization. During self-organization, behavioral and physical factors cause a mass effect on pedestrian movement. The basic CA model creates a movement path for each pedestrian over a time step. However, because of these factors, the CA model needs enhancements to establish a realistic simulation state. Hence, this concept paper discusses enhanced features of the CA model for microscopic pedestrian movement during a panic situation, towards a better pedestrian simulation.
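A basic CA pedestrian update of the kind the paper builds on can be shown in a deliberately minimal form: a 1-D corridor where each pedestrian advances one cell per time step toward the exit unless the target cell is occupied. The paper's model is 2-D and adds panic-related behavioral factors, which are omitted here:

```python
# Minimal 1-D cellular-automaton sketch of pedestrian movement.
# cells[i] is 1 if occupied; the exit lies beyond index 0.
def step(cells):
    new = cells[:]
    if new[0] == 1:
        new[0] = 0                     # pedestrian at the exit leaves
    for i in range(1, len(new)):       # sequential (left-to-right) update
        if new[i] == 1 and new[i - 1] == 0:
            new[i - 1], new[i] = 1, 0  # advance if the next cell is free
    return new

corridor = [0, 1, 1, 0, 1]
steps = 0
while any(corridor):
    corridor = step(corridor)
    steps += 1
```

The sequential update lets a pedestrian move into a cell vacated in the same sweep, producing the convective flow typical of simple CA evacuation models; parallel update rules with conflict resolution are the usual alternative.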

  16. An Empirical Study of Wrappers for Feature Subset Selection based on a Parallel Genetic Algorithm: The Multi-Wrapper Model

    KAUST Repository

    Soufan, Othman

    2012-09-01

    Feature selection is the first task of any learning approach applied in major fields such as biomedicine, bioinformatics, robotics, natural language processing and social networking. In the feature subset selection problem, a search methodology with a proper criterion seeks to find the best subset of features describing the data (relevance) and achieving better performance (optimality). Wrapper approaches are feature selection methods that are wrapped around a classification algorithm and use a performance measure to select the best subset of features. We analyze the proper design of the objective function for the wrapper approach and highlight an objective based on several classification algorithms. We compare the wrapper approaches to different feature selection methods based on distance and information-based criteria. Significant improvement in performance, computational time, and selection of minimally sized feature subsets is achieved by combining different objectives for the wrapper model. In addition, considering various classification methods in the feature selection process could lead to a global solution of desirable characteristics.
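The wrapper idea can be sketched without the genetic-algorithm search: a greedy forward wrapper repeatedly adds the feature whose inclusion most improves the wrapped classifier's held-out accuracy. The nearest-centroid classifier and synthetic data below are stand-ins; the thesis wraps richer classifiers and a parallel GA search:

```python
import numpy as np

def holdout_accuracy(X, y, feats):
    """Score a feature subset with a nearest-centroid classifier
    (the 'wrapped' learner in this sketch)."""
    half = len(y) // 2
    Xtr, Xte, ytr, yte = X[:half][:, feats], X[half:][:, feats], y[:half], y[half:]
    c0, c1 = Xtr[ytr == 0].mean(axis=0), Xtr[ytr == 1].mean(axis=0)
    pred = (np.linalg.norm(Xte - c1, axis=1) <
            np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return np.mean(pred == yte)

def forward_wrapper(X, y, k):
    """Greedy forward selection: add the feature that best improves accuracy."""
    selected = []
    for _ in range(k):
        rest = [f for f in range(X.shape[1]) if f not in selected]
        scores = [holdout_accuracy(X, y, selected + [f]) for f in rest]
        selected.append(rest[int(np.argmax(scores))])
    return selected

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 300)
X = rng.normal(size=(300, 5))
X[:, 2] += 2.0 * y              # only feature 2 carries the class signal
chosen = forward_wrapper(X, y, k=1)
```

A GA-based wrapper explores subsets globally instead of greedily, but the objective function — classifier performance on the candidate subset — is the same.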

  17. Architectural Building of a Public Key Infrastructure for an Integrated Information Space

    Directory of Open Access Journals (Sweden)

    Vadim Ivanovich Korolev

    2015-10-01

    Full Text Available The article considers the use of public-key cryptographic systems to provide information security and to implement digital signatures. It analyzes trust models for the formation of certificates and their use, and describes the relationship between the trust model and the public key infrastructure architecture. It concludes with options for building a public key infrastructure for an integrated information space.

  18. Does Your Terrestrial Model Capture Key Arctic-Boreal Relationships?: Functional Benchmarks in the ABoVE Model Benchmarking System

    Science.gov (United States)

    Stofferahn, E.; Fisher, J. B.; Hayes, D. J.; Schwalm, C. R.; Huntzinger, D. N.; Hantson, W.

    2017-12-01

    The Arctic-Boreal Region (ABR) is a major source of uncertainties for terrestrial biosphere model (TBM) simulations. These uncertainties are precipitated by a lack of observational data from the region, affecting the parameterizations of cold environment processes in the models. Addressing these uncertainties requires a coordinated effort of data collection and integration of the following key indicators of the ABR ecosystem: disturbance, vegetation / ecosystem structure and function, carbon pools and biogeochemistry, permafrost, and hydrology. We are continuing to develop the model-data integration framework for NASA's Arctic Boreal Vulnerability Experiment (ABoVE), wherein data collection is driven by matching observations and model outputs to the ABoVE indicators via the ABoVE Grid and Projection. The data are used as reference datasets for a benchmarking system which evaluates TBM performance with respect to ABR processes. The benchmarking system utilizes two types of performance metrics to identify model strengths and weaknesses: standard metrics, based on the International Land Model Benchmarking (ILaMB) system, which relate a single observed variable to a single model output variable, and functional benchmarks, wherein the relationship of one variable to one or more variables (e.g. the dependence of vegetation structure on snow cover, the dependence of active layer thickness (ALT) on air temperature and snow cover) is ascertained in both observations and model outputs. This in turn provides guidance to model development teams for reducing uncertainties in TBM simulations of the ABR.

  19. Finite element modeling of small-scale tapered wood-laminated composite poles with biomimicry features

    Science.gov (United States)

    Cheng Piao; Todd F. Shupe; R.C. Tang; Chung Y. Hse

    2008-01-01

    Tapered composite poles with biomimicry features as in bamboo are a new generation of wood laminated composite poles that may some day be considered as an alternative to solid wood poles that are widely used in the transmission and telecommunication fields. Five finite element models were developed with ANSYS to predict and assess the performance of five types of...

  20. Comparison of the Features of EPUB E-Book and SCORM E-Learning Content Model

    Science.gov (United States)

    Chang, Hsuan-Pu; Hung, Jason C.

    2018-01-01

    E-books nowadays have greatly evolved in their presentation and functions; however, their features for education need to be investigated, because people who are accustomed to printed books may consider and approach e-books in the same way as printed ones. Therefore, the authors compared the EPUB e-book content model with the SCORM…

  1. Key structural features of nonsteroidal ligands for binding and activation of the androgen receptor.

    Science.gov (United States)

    Yin, Donghua; He, Yali; Perera, Minoli A; Hong, Seoung Soo; Marhefka, Craig; Stourman, Nina; Kirkovsky, Leonid; Miller, Duane D; Dalton, James T

    2003-01-01

    The purposes of the present studies were to examine the androgen receptor (AR) binding ability and in vitro functional activity of multiple series of nonsteroidal compounds derived from known antiandrogen pharmacophores and to investigate the structure-activity relationships (SARs) of these nonsteroidal compounds. The AR binding properties of sixty-five nonsteroidal compounds were assessed by a radioligand competitive binding assay with the use of cytosolic AR prepared from rat prostates. The AR agonist and antagonist activities of high-affinity ligands were determined by the ability of the ligand to regulate AR-mediated transcriptional activation in cultured CV-1 cells, using a cotransfection assay. Nonsteroidal compounds with diverse structural features demonstrated a wide range of binding affinity for the AR. Ten compounds, mainly from the bicalutamide-related series, showed a binding affinity superior to the structural pharmacophore from which they were derived. Several SARs regarding nonsteroidal AR binding were revealed from the binding data, including stereoisomeric conformation, steric effect, and electronic effect. The functional activity of high-affinity ligands ranged from antagonist to full agonist for the AR. Several structural features were found to be determinative of agonist and antagonist activities. The nonsteroidal AR agonists identified from the present studies provided a pool of candidates for further development of selective androgen receptor modulators (SARMs) for androgen therapy. Also, these studies uncovered or confirmed numerous important SARs governing AR binding and functional properties by nonsteroidal molecules, which would be valuable in the future structural optimization of SARMs.

  2. Investigation of the blockchain systems’ scalability features using the agent based modelling

    OpenAIRE

    Šulnius, Aleksas

    2017-01-01

    Investigation of the BlockChain Systems’ Scalability Features using Agent Based Modelling. BlockChain is currently in the spotlight of the entire FinTech industry. The technology is being called revolutionary, groundbreaking, disruptive and even the WEB 3.0. On the other hand, it is widely agreed that BlockChain is in its early stages of development. In its current state, BlockChain is in a position similar to that of the Internet in the early nineties. In order for this technology to gain m...

  3. Finding the Key Periods for Assimilating HJ-1A/B CCD Data and the WOFOST Model to Evaluate Heavy Metal Stress in Rice.

    Science.gov (United States)

    Zhao, Shuang; Qian, Xu; Liu, Xiangnan; Xu, Zhao

    2018-04-17

    Accurately monitoring heavy metal stress in crops is vital for food security and agricultural production. The assimilation of remote sensing images into the World Food Studies (WOFOST) model provides an efficient way to solve this problem. In this study, we aimed to investigate the key periods of the assimilation framework for continuous monitoring of heavy metal stress in rice. The Harris algorithm was used on the leaf area index (LAI) curves to select the key period for an optimized assimilation. To obtain accurate LAI values, the measured dry weight of rice roots (WRT), which has been proven to be the most stress-sensitive indicator of heavy metal stress, was incorporated into the improved WOFOST model. Finally, the key periods, which contain four dominant time points, were used to select remote sensing images for the RS-WOFOST model for continuous monitoring of heavy metal stress. Compared with the key period containing all the available remote sensing images, the results showed that the optimal key period can significantly improve the time efficiency of the assimilation framework by shortening the model operation time by more than 50%, while maintaining its accuracy. This result is highly significant when monitoring heavy metals in rice on a large scale. Furthermore, it can also offer a reference for the timing of field measurements in monitoring heavy metal stress in rice.
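Selecting dominant time points on an LAI curve can be approximated with a 1-D curvature criterion: the places where the seasonal curve bends most sharply are natural candidates for assimilation dates. The logistic green-up curve below is invented, and discrete curvature is used as a simple 1-D stand-in for the Harris-based selection the abstract describes:

```python
import numpy as np

# Hypothetical LAI curve over a growing season (logistic green-up).
t = np.arange(0, 120)
lai = 5.0 / (1.0 + np.exp(-0.12 * (t - 60)))

# Curvature of y(t): |y''| / (1 + y'^2)^(3/2), via finite differences.
d1 = np.gradient(lai, t)
d2 = np.gradient(d1, t)
curvature = np.abs(d2) / (1.0 + d1 ** 2) ** 1.5

# Candidate key time points = local curvature maxima, which flank the
# inflection of the green-up curve.
is_peak = (curvature[1:-1] > curvature[:-2]) & (curvature[1:-1] > curvature[2:])
key_days = t[1:-1][is_peak]
```

For a logistic curve the detector returns the two "shoulder" dates on either side of the inflection; a real LAI trajectory with senescence would yield additional points, closer to the four dominant time points reported in the study.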

  4. Tissue Feature-Based and Segmented Deformable Image Registration for Improved Modeling of Shear Movement of Lungs

    International Nuclear Information System (INIS)

    Xie Yaoqin; Chao Ming; Xing Lei

    2009-01-01

    Purpose: To report a tissue feature-based image registration strategy with explicit inclusion of the differential motions of thoracic structures. Methods and Materials: The proposed technique started with auto-identification of a number of corresponding points with distinct tissue features. The tissue feature points were found by using the scale-invariant feature transform method. The control point pairs were then sorted into different 'colors' according to the organs in which they resided and used to model the involved organs individually. A thin-plate spline method was used to register a structure characterized by the control points with a given 'color.' The proposed technique was applied to study a digital phantom case and 3 lung and 3 liver cancer patients. Results: For the phantom case, a comparison with the conventional thin-plate spline method showed that the registration accuracy was markedly improved when the differential motions of the lung and chest wall were taken into account. On average, the registration error and standard deviation of the 15 points against the known ground truth were reduced from 3.0 to 0.5 mm and from 1.5 to 0.2 mm, respectively, when the new method was used. A similar level of improvement was achieved for the clinical cases. Conclusion: The results of our study have shown that the segmented deformable approach provides a natural and logical solution to model the discontinuous organ motions and greatly improves the accuracy and robustness of deformable registration.
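The per-"color" registration step rests on thin-plate spline interpolation through the control points of one organ. The sketch below fits a 2-D TPS exactly through a handful of control points; the points and values are invented, and the paper applies one such spline per segmented structure:

```python
import numpy as np

def tps_fit(points, values):
    """Fit a 2-D thin-plate spline f with f(points[i]) = values[i].
    Kernel U(r) = r^2 log r; affine part a0 + a1*x + a2*y."""
    def U(r):
        with np.errstate(divide="ignore", invalid="ignore"):
            out = r * r * np.log(r)
        return np.where(r > 0, out, 0.0)   # U(0) = 0 by continuity
    n = len(points)
    K = U(np.linalg.norm(points[:, None] - points[None, :], axis=2))
    P = np.hstack([np.ones((n, 1)), points])
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    b = np.concatenate([values, np.zeros(3)])
    coeff = np.linalg.solve(A, b)
    w, a = coeff[:n], coeff[n:]
    def f(q):
        return U(np.linalg.norm(q - points, axis=1)) @ w + a[0] + a[1:] @ q
    return f

# Hypothetical control points (e.g., one displacement component per point).
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.2]])
vals = np.array([0.0, 1.0, 1.0, 2.0, 0.3])
f = tps_fit(pts, vals)
```

Fitting separate splines per organ "color", as the paper does, lets the interpolated displacement field be discontinuous across organ boundaries, which a single global spline cannot represent.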

  5. Deep Restricted Kernel Machines Using Conjugate Feature Duality.

    Science.gov (United States)

    Suykens, Johan A K

    2017-08-01

    The aim of this letter is to propose a theory of deep restricted kernel machines offering new foundations for deep learning with kernel machines. From the viewpoint of deep learning, it is partially related to restricted Boltzmann machines, which are characterized by visible and hidden units in a bipartite graph without hidden-to-hidden connections and deep learning extensions as deep belief networks and deep Boltzmann machines. From the viewpoint of kernel machines, it includes least squares support vector machines for classification and regression, kernel principal component analysis (PCA), matrix singular value decomposition, and Parzen-type models. A key element is to first characterize these kernel machines in terms of so-called conjugate feature duality, yielding a representation with visible and hidden units. It is shown how this is related to the energy form in restricted Boltzmann machines, with continuous variables in a nonprobabilistic setting. In this new framework of so-called restricted kernel machine (RKM) representations, the dual variables correspond to hidden features. Deep RKMs are obtained by coupling the RKMs. The method is illustrated for a deep RKM consisting of three levels, with a least squares support vector machine regression level and two kernel PCA levels. In its primal form, deep feedforward neural networks can also be trained within this framework.
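One of the building blocks named in the abstract, LS-SVM regression, has a compact dual form: solve one linear system in the kernel matrix. The sketch below implements that standalone ingredient (with an assumed RBF kernel and toy data), not the RKM coupling itself:

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Least-squares SVM regression in the dual:
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))          # RBF kernel
    n = len(y)
    K = kernel(X, X)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: kernel(Xq, X) @ alpha + b

X = np.linspace(0, 3, 30)[:, None]
y = np.sin(2 * X[:, 0])
predict = lssvm_fit(X, y, gamma=100.0)
err = np.max(np.abs(predict(X) - y))
```

In the RKM view the dual variables `alpha` play the role of hidden features; stacking such levels (e.g., kernel PCA layers feeding an LS-SVM layer) gives the deep architecture described in the letter.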

  6. Exploring Secondary Students' Epistemological Features Depending on the Evaluation Levels of the Group Model on Blood Circulation

    Science.gov (United States)

    Lee, Shinyoung; Kim, Heui-Baik

    2014-01-01

    The purpose of this study is to identify the epistemological features and model qualities depending on model evaluation levels and to explore the reasoning process behind high-level evaluation through small group interaction about blood circulation. Nine groups of three to four students in the eighth grade participated in the modeling practice.…

  7. Day-ahead deregulated electricity market price forecasting using neural network input featured by DCT

    International Nuclear Information System (INIS)

    Anbazhagan, S.; Kumarappan, N.

    2014-01-01

    Highlights: • We present a DCT input featured FFNN model for forecasting in the Spanish market. • The key factors impacting electricity price forecasting are historical prices. • The past 42 days were trained and the next 7 days were forecasted. • The proposed approach has a simple and better NN structure. • The DCT-FFNN model is effective and needs less computation time than recent models. - Abstract: In a deregulated market, a number of factors determine the electricity price, which displays perplexing and erratic fluctuations. Both power producers and consumers need a single, compact and robust price forecasting tool in order to maximize their profits and utilities. To capture this volatile electricity price, a one-dimensional discrete cosine transform (DCT) input featured feed-forward neural network (DCT-FFNN) is modeled. The proposed FFNN is a single compact and robust architecture (without hybridizing various hard and soft computing models). It is shown that the DCT-FFNN model comes close to the state of the art and can be achieved with less computation time. The proposed DCT-FFNN approach is compared with 17 other recent approaches to estimate the market clearing prices of mainland Spain. Finally, the price forecasting is also applied to the electricity market of New York in the year 2010, which shows the effectiveness of the proposed DCT-FFNN approach.
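The "DCT input feature" idea is that a few low-order DCT coefficients summarize the slow structure of a price series, giving the FFNN a compact input. The sketch below writes out an orthonormal DCT-II explicitly (equivalent to `scipy.fft.dct(x, norm='ortho')`) on an invented weekly price series; the network itself is omitted:

```python
import numpy as np

def dct2(x):
    """Orthonormal DCT-II, written out explicitly."""
    n = len(x)
    k = np.arange(n)[:, None]
    basis = np.cos(np.pi * (2 * np.arange(n)[None, :] + 1) * k / (2 * n))
    scale = np.where(k == 0, np.sqrt(1 / n), np.sqrt(2 / n))
    return (scale * basis) @ x

# Hypothetical hourly price series: a daily cycle plus noise.
rng = np.random.default_rng(0)
hours = np.arange(168)                        # one week
prices = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, 168)

coeffs = dct2(prices)
features = coeffs[:16]                        # compact FFNN input vector

# Energy compaction: the first 16 coefficients hold most of the signal.
energy_ratio = np.sum(features ** 2) / np.sum(coeffs ** 2)
```

Feeding `features` instead of the raw 168-sample window is what keeps the network small; the transform is orthonormal, so no information is lost beyond the truncation.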

  8. Secret-key expansion from covert communication

    Science.gov (United States)

    Arrazola, Juan Miguel; Amiri, Ryan

    2018-02-01

    Covert communication allows the transmission of messages in such a way that it is not possible for adversaries to detect that the communication is occurring. This provides protection in situations where knowledge that two parties are talking to each other may be incriminating to them. In this work, we study how covert communication can be used for a different purpose: secret key expansion. First, we show that any message transmitted in a secure covert protocol is also secret and therefore unknown to an adversary. We then propose a covert communication protocol where the amount of key consumed in the protocol is smaller than the transmitted key, thus leading to secure secret key expansion. We derive precise conditions for secret key expansion to occur, showing that it is possible when there are sufficiently low levels of noise for a given security level. We conclude by examining how secret key expansion from covert communication can be performed in a computational security model.

  9. Stargardt disease: clinical features, molecular genetics, animal models and therapeutic options

    Science.gov (United States)

    Tanna, Preena; Strauss, Rupert W; Fujinami, Kaoru; Michaelides, Michel

    2017-01-01

    Stargardt disease (STGD1; MIM 248200) is the most prevalent inherited macular dystrophy and is associated with disease-causing sequence variants in the gene ABCA4. Significant advances have been made over the last 10 years in our understanding of both the clinical and molecular features of STGD1, and also the underlying pathophysiology, which has culminated in ongoing and planned human clinical trials of novel therapies. The aims of this review are to describe the detailed phenotypic and genotypic characteristics of the disease, conventional and novel imaging findings, current knowledge of animal models and pathogenesis, and the multiple avenues of intervention being explored. PMID:27491360

  10. Grouted Connections with Shear Keys

    DEFF Research Database (Denmark)

    Pedersen, Ronnie; Jørgensen, M. B.; Damkilde, Lars

    2012-01-01

    This paper presents a finite element model in the software package ABAQUS in which a reliable analysis of grouted pile-to-sleeve connections with shear keys is the particular purpose. The model is calibrated to experimental results and a consistent set of input parameters is estimated so that different structural problems can be reproduced successfully.

  11. TU-C-12A-09: Modeling Pathologic Response of Locally Advanced Esophageal Cancer to Chemo-Radiotherapy Using Quantitative PET/CT Features, Clinical Parameters and Demographics

    International Nuclear Information System (INIS)

    Zhang, H; Chen, W; Kligerman, S; D’Souza, W; Suntharalingam, M; Lu, W; Tan, S; Kim, G

    2014-01-01

    Purpose: To develop predictive models using quantitative PET/CT features for the evaluation of tumor response to neoadjuvant chemo-radiotherapy (CRT) in patients with locally advanced esophageal cancer. Methods: This study included 20 patients who underwent tri-modality therapy (CRT + surgery) and had 18F-FDG PET/CT scans before initiation of CRT and 4-6 weeks after completion of CRT but prior to surgery. Four groups of tumor features were examined: (1) conventional PET/CT response measures (SUVmax, tumor diameter, etc.); (2) clinical parameters (TNM stage, histology, etc.) and demographics; (3) spatial-temporal PET features, which characterize tumor SUV intensity distribution, spatial patterns, geometry, and associated changes resulting from CRT; and (4) all features combined. An optimal feature set was identified with recursive feature selection and cross-validations. Support vector machine (SVM) and logistic regression (LR) models were constructed for prediction of pathologic tumor response to CRT, using cross-validations to avoid model over-fitting. Prediction accuracy was assessed via area under the receiver operating characteristic curve (AUC), and precision was evaluated via confidence intervals (CIs) of AUC. Results: When applied to the 4 groups of tumor features, the LR model achieved AUCs (95% CI) of 0.57 (0.10), 0.73 (0.07), 0.90 (0.06), and 0.90 (0.06). The SVM model achieved AUCs (95% CI) of 0.56 (0.07), 0.60 (0.06), 0.94 (0.02), and 1.00 (no misclassifications). Using spatial-temporal PET features combined with conventional PET/CT measures and clinical parameters, the SVM model achieved very high accuracy (AUC 1.00) and precision (no misclassifications), significantly better than using conventional PET/CT measures or clinical parameters and demographics alone. For groups with a large number of tumor features (groups 3 and 4), the SVM model achieved significantly higher accuracy than the LR model.
Conclusion: The SVM model using all features including
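The AUC figures quoted above are rank statistics: the probability that a randomly chosen responder scores higher than a randomly chosen non-responder. A minimal sketch, with invented scores for two hypothetical models on the same patients:

```python
# Rank-based AUC (the Wilcoxon / Mann-Whitney statistic) for scalar scores.
def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical response scores from two models on the same 10 patients
# (1 = pathologic responder). Not data from the study.
labels     = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
svm_scores = [0.9, 0.8, 0.85, 0.7, 0.3, 0.2, 0.1, 0.4, 0.25, 0.15]
lr_scores  = [0.9, 0.4, 0.85, 0.7, 0.3, 0.6, 0.1, 0.5, 0.25, 0.15]
auc_svm, auc_lr = auc(svm_scores, labels), auc(lr_scores, labels)
```

An AUC of 1.0 means every responder outranks every non-responder ("no misclassifications" at some threshold), which is how the paper's perfect SVM result should be read.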

  12. Using different classification models in wheat grading utilizing visual features

    Science.gov (United States)

    Basati, Zahra; Rasekh, Mansour; Abbaspour-Gilandeh, Yousef

    2018-04-01

    Wheat is one of the most important strategic crops in Iran and in the world. The major component that distinguishes wheat from other grains is the gluten section. In Iran, sunn pest is one of the most important factors influencing the characteristics of wheat gluten and removing it from a balanced state. The existence of bug-damaged grains in wheat will reduce the quality and price of the product. In addition, damaged grains reduce the enrichment of wheat and the quality of bread products. In this study, after preprocessing and segmentation of images, 25 features including 9 colour features, 10 morphological features, and 6 textural statistical features were extracted so as to classify healthy and bug-damaged wheat grains of the Azar cultivar at four levels of moisture content (9, 11.5, 14 and 16.5% w.b.) and two lighting colours (yellow light, and the composition of yellow and white lights). Using feature selection methods in the WEKA software and the CfsSubsetEval evaluator, 11 features were chosen as inputs of artificial neural network, decision tree and discriminant analysis classifiers. The results showed that the decision tree with the J48 algorithm had the highest classification accuracy of 90.20%. This was followed by the artificial neural network classifier with the topology of 11-19-2 and the discriminant analysis classifier, at 87.46 and 81.81%, respectively.

  13. A comparative study of sequence- and structure-based features of small RNAs and other RNAs of bacteria.

    Science.gov (United States)

    Barik, Amita; Das, Santasabuj

    2018-01-02

    Small RNAs (sRNAs) in bacteria have emerged as key players in transcriptional and post-transcriptional regulation of gene expression. Here, we present a statistical analysis of different sequence- and structure-related features of bacterial sRNAs to identify the descriptors that could discriminate sRNAs from other bacterial RNAs. We investigated a comprehensive and heterogeneous collection of 816 sRNAs, identified by northern blotting across 33 bacterial species and compared their various features with other classes of bacterial RNAs, such as tRNAs, rRNAs and mRNAs. We observed that sRNAs differed significantly from the rest with respect to G+C composition, normalized minimum free energy of folding, motif frequency and several RNA-folding parameters like base-pairing propensity, Shannon entropy and base-pair distance. Based on the selected features, we developed a predictive model using Random Forests (RF) method to classify the above four classes of RNAs. Our model displayed an overall predictive accuracy of 89.5%. These findings would help to differentiate bacterial sRNAs from other RNAs and further promote prediction of novel sRNAs in different bacterial species.
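Two of the simplest descriptors compared in the study, G+C composition and a Shannon entropy, can be computed directly from a sequence. The sketch below uses base-composition entropy on an invented sRNA-like string; the paper's entropy is computed from RNA folding (base-pair probabilities), so this is only the sequence-level analogue:

```python
import math
from collections import Counter

def gc_content(seq):
    """Fraction of G and C bases in an RNA sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def shannon_entropy(seq):
    """Base-composition Shannon entropy in bits."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

srna = "GCGCAUUGCGGAUCCGAUGC"   # hypothetical 20-nt sequence
gc = gc_content(srna)
h = shannon_entropy(srna)
```

In the study such per-sequence descriptors (together with folding-based ones like minimum free energy and base-pair distance) form the feature vector handed to the Random Forests classifier.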

  14. Noiseless Steganography The Key to Covert Communications

    CERN Document Server

    Desoky, Abdelrahman

    2012-01-01

    Among the features that make Noiseless Steganography: The Key to Covert Communications a first of its kind: The first to comprehensively cover Linguistic Steganography The first to comprehensively cover Graph Steganography The first to comprehensively cover Game Steganography Although the goal of steganography is to prevent adversaries from suspecting the existence of covert communications, most books on the subject present outdated steganography approaches that are detectable by human and/or machine examinations. These approaches often fail because they camouflage data as a detectable noise b

  15. Interplay of multiple synaptic plasticity features in filamentary memristive devices for neuromorphic computing

    Science.gov (United States)

    La Barbera, Selina; Vincent, Adrien F.; Vuillaume, Dominique; Querlioz, Damien; Alibart, Fabien

    2016-12-01

    Bio-inspired computing represents today a major challenge at different levels, ranging from material science for the design of innovative devices and circuits to computer science for the understanding of the key features required for processing of natural data. In this paper, we propose a detailed analysis of resistive switching dynamics in electrochemical metallization cells for synaptic plasticity implementation. We show how filament stability, associated with the Joule effect during switching, can be used to emulate key synaptic features such as the short-term to long-term plasticity transition and spike-timing-dependent plasticity. Furthermore, an interplay between these different synaptic features is demonstrated for object motion detection in a spike-based neuromorphic circuit. System-level simulation presents robust learning and promising synaptic operation, paving the way to complex bio-inspired computing systems composed of innovative memory devices.

  16. A food recognition system for diabetic patients based on an optimized bag-of-features model.

    Science.gov (United States)

    Anthimopoulos, Marios M; Gianola, Lauro; Scarnato, Luca; Diem, Peter; Mougiakakou, Stavroula G

    2014-07-01

    Computer vision-based food recognition could be used to estimate a meal's carbohydrate content for diabetic patients. This study proposes a methodology for automatic food recognition, based on the bag-of-features (BoF) model. An extensive technical investigation was conducted for the identification and optimization of the best performing components involved in the BoF architecture, as well as the estimation of the corresponding parameters. For the design and evaluation of the prototype system, a visual dataset with nearly 5000 food images was created and organized into 11 classes. The optimized system computes dense local features, using the scale-invariant feature transform on the HSV color space, builds a visual dictionary of 10000 visual words by using the hierarchical k-means clustering and finally classifies the food images with a linear support vector machine classifier. The system achieved classification accuracy of the order of 78%, thus proving the feasibility of the proposed approach in a very challenging image dataset.
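The BoF pipeline the abstract describes — cluster local descriptors into visual words, then represent each image as a normalized word histogram — can be shown end to end on toy data. This sketch uses flat k-means on invented 2-D "descriptors" (the paper uses hierarchical k-means on dense SIFT descriptors and 10,000 words):

```python
import numpy as np

def kmeans(X, k, iters=25):
    """Plain k-means codebook construction (initial centers are simply
    spread across the data; k-means++ would be the robust choice)."""
    centers = X[:: max(1, len(X) // k)][:k].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def bof_histogram(descriptors, centers):
    """Quantize descriptors to their nearest visual word and normalize."""
    d = np.linalg.norm(descriptors[:, None] - centers[None, :], axis=2)
    words = d.argmin(axis=1)
    hist = np.bincount(words, minlength=len(centers)).astype(float)
    return hist / hist.sum()        # BoF vector fed to the linear SVM

rng = np.random.default_rng(1)
# Toy "local descriptors" drawn around 3 visual-word prototypes.
protos = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
descs = np.vstack([p + 0.1 * rng.normal(size=(40, 2)) for p in protos])
centers = kmeans(descs, k=3)
hist = bof_histogram(descs, centers)
```

In the full system one such histogram per food image becomes the input to the linear SVM; the 78% accuracy quoted refers to that classifier on the 11-class food dataset.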

  17. Benchmarking Organisational Capability using The 20 Keys

    Directory of Open Access Journals (Sweden)

    Dino Petrarolo

    2012-01-01

    Full Text Available Organisations have over the years implemented many improvement initiatives, many of which were applied individually with no real, lasting improvement. Approaches such as quality control, team activities, setup reduction and many more seldom changed the fundamental constitution or capability of an organisation. Leading companies in the world have come to realise that an integrated approach is required which focuses on improving more than one factor at the same time - by recognising the importance of synergy between different improvement efforts and the need for commitment at all levels of the company to achieve total system-wide improvement.

    The 20 Keys approach offers a way to look at the strength of organisations and to systematically improve it, one step at a time, by focusing on 20 different but interrelated aspects. One feature of the approach is the benchmarking system, which forms the main focus of this paper. The benchmarking system is introduced as an important part of the 20 Keys philosophy in measuring organisational strength. Benchmarking results from selected South African companies are provided, as well as one company's results achieved through the adoption of the 20 Keys philosophy.

  18. Functional validation of candidate genes detected by genomic feature models

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Østergaard, Solveig; Kristensen, Torsten Nygaard

    2018-01-01

    Understanding the genetic underpinnings of complex traits requires knowledge of the genetic variants that contribute to phenotypic variability. Reliable statistical approaches are needed to obtain such knowledge. In genome-wide association studies, variants are tested for association with trait ... to investigate locomotor activity, and applied genomic feature prediction models to identify gene ontology (GO) categories predictive of this phenotype. Next, we applied the covariance association test to partition the genomic variance of the predictive GO terms to the genes within these terms. We then functionally assessed whether the identified candidate genes affected locomotor activity by reducing gene expression using RNA interference. In five of the seven candidate genes tested, reduced gene expression altered the phenotype. The ranking of genes within the predictive GO term was highly correlated ...

  19. Representing Microbial Dormancy in Soil Decomposition Models Improves Model Performance and Reveals Key Ecosystem Controls on Microbial Activity

    Science.gov (United States)

    He, Y.; Yang, J.; Zhuang, Q.; Wang, G.; Liu, Y.

    2014-12-01

    Climate feedbacks from soils can result from environmental change and the subsequent responses of plant and microbial communities and nutrient cycling. Explicit consideration of microbial life-history traits and strategies may be necessary to predict climate feedbacks due to microbial physiology and community changes and their associated effects on carbon cycling. In this study, we developed an explicit microbial-enzyme decomposition model and examined model performance with and without representation of dormancy at six temperate forest sites across different forest types, with observed soil efflux records spanning 4 to 10 years. We then extrapolated the model to all temperate forests in the Northern Hemisphere (25-50°N) to investigate spatial controls on microbial and soil C dynamics. Both models captured the observed soil heterotrophic respiration (RH), yet the no-dormancy model consistently exhibited a larger seasonal amplitude and overestimated microbial biomass. Spatially, the total RH from temperate forests amounts to 6.88 Pg C/yr based on the dormancy model and 7.99 Pg C/yr based on the no-dormancy model. The no-dormancy model also notably overestimated the ratio of microbial biomass to SOC. Spatial correlation analysis revealed key controls of the soil C:N ratio on the active proportion of microbial biomass, whereas local dormancy is primarily controlled by soil moisture and temperature, indicating scale-dependent environmental and biotic controls on microbial and SOC dynamics. These developments should provide essential support for modeling future soil carbon dynamics and enhance collaboration between empirical soil experiments and modeling, in the sense that more microbial physiological measurements are needed to better constrain and evaluate the models.

  20. A Hybrid Feature Model and Deep-Learning-Based Bearing Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Muhammad Sohaib

    2017-12-01

    Full Text Available Bearing fault diagnosis is imperative for the maintenance, reliability, and durability of rotary machines. It can reduce economic losses by eliminating unexpected downtime in industry due to failure of rotary machines. Though widely investigated in the past couple of decades, continued advancement is still desirable to improve upon existing fault diagnosis techniques. Vibration acceleration signals collected from machine bearings exhibit nonstationary behavior due to variable working conditions and multiple fault severities. In the current work, a two-layered bearing fault diagnosis scheme is proposed for the identification of fault pattern and crack size for a given fault type. A hybrid feature pool is used in combination with sparse stacked autoencoder (SAE)-based deep neural networks (DNNs) to perform effective diagnosis of bearing faults of multiple severities. The hybrid feature pool can extract more discriminating information from the raw vibration signals, to overcome the nonstationary behavior of the signals caused by multiple crack sizes. More discriminating information helps the subsequent classifier to effectively classify data into the respective classes. The results indicate that the proposed scheme provides satisfactory performance in diagnosing bearing defects of multiple severities. Moreover, the results also demonstrate that the proposed model outperforms other state-of-the-art algorithms, i.e., support vector machines (SVMs) and backpropagation neural networks (BPNNs).

  2. Plasma and process characterization of high power magnetron physical vapor deposition with integrated plasma equipment--feature profile model

    International Nuclear Information System (INIS)

    Zhang Da; Stout, Phillip J.; Ventzek, Peter L.G.

    2003-01-01

    High power magnetron physical vapor deposition (HPM-PVD) has recently emerged for metal deposition into deep submicron features in state-of-the-art integrated circuit fabrication. However, the plasma characteristics and process mechanism are not well known. An integrated plasma equipment-feature profile modeling infrastructure has therefore been developed for HPM-PVD, and it has been applied to simulating copper seed deposition with an Ar background gas for damascene metallization. The equipment-scale model is based on the hybrid plasma equipment model [M. Grapperhaus et al., J. Appl. Phys. 83, 35 (1998); J. Lu and M. J. Kushner, ibid., 89, 878 (2001)], which couples a three-dimensional Monte Carlo sputtering module within a two-dimensional fluid model. The plasma kinetics of thermalized, athermal, and ionized metals and the contributions of these species to feature deposition are resolved. A Monte Carlo technique is used to derive the angular distribution of athermal metals. Simulations show that in typical HPM-PVD processing, Ar+ is the dominant ionized species driving sputtering. Athermal metal neutrals are the dominant deposition precursors due to the operation at high target power and low pressure. The angular distribution of athermals is off axis and more focused than that of thermal neutrals. These athermal characteristics favor sufficient and uniform deposition on the sidewall of the feature, which is the critical area in small-feature filling. In addition, athermals lead to thick bottom coverage. An appreciable fraction (∼10%) of the metals incident on the wafer are ionized. The ionized metals also contribute to bottom deposition in the absence of sputtering. We have studied the impact of process and equipment parameters on HPM-PVD. Simulations show that target power impacts both plasma ionization and target sputtering. The Ar+ ion density increases nearly linearly with target power, different from the behavior of typical ionized PVD processing. The

  3. Feature-opinion pair identification of product reviews in Chinese: a domain ontology modeling method

    Science.gov (United States)

    Yin, Pei; Wang, Hongwei; Guo, Kaiqiang

    2013-03-01

    With the emergence of the new economy based on social media, a great amount of consumer feedback on particular products is conveyed through widely spread online product reviews, making opinion mining a growing interest for both academia and industry. Based on the characteristic modes of expression in Chinese, this research proposes an ontology-based linguistic model to identify the basic appraisal expression in Chinese product reviews: the "feature-opinion pair" (FOP). The product-oriented domain ontology is first constructed automatically, then algorithms to identify FOPs are designed by mapping product features and opinions to the conceptual space of the domain ontology, and finally comparative experiments are conducted to evaluate the model. Experimental results indicate that the proposed approach is efficient and obtains more accurate results than state-of-the-art algorithms. Furthermore, through identifying and analyzing FOPs, unstructured product reviews are converted into structured and machine-readable expressions, which provides valuable information for business applications. This paper contributes to related research in opinion mining by developing a solid foundation for further fine-grained sentiment analysis and proposing a general way to automate ontology construction.
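
    The core idea of feature-opinion pair identification via an ontology lookup can be illustrated with a minimal sketch. The mini ontology, opinion lexicon, and `extract_fops` helper below are invented for illustration and are far simpler than the paper's automatic domain-ontology construction:

```python
# Hypothetical mini ontology: surface words mapped to product-feature concepts,
# plus a tiny opinion lexicon with polarity labels.
FEATURE_ONTOLOGY = {"screen": "Display", "display": "Display",
                    "battery": "Battery", "charging": "Battery"}
OPINION_LEXICON = {"bright": "+", "sharp": "+", "weak": "-", "dim": "-"}

def extract_fops(tokens, window=2):
    """Pair each feature mention with the nearest opinion word within a window."""
    pairs = []
    for i, tok in enumerate(tokens):
        concept = FEATURE_ONTOLOGY.get(tok)
        if concept is None:
            continue
        best = None
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            opinion = OPINION_LEXICON.get(tokens[j])
            if opinion and (best is None or abs(j - i) < abs(best - i)):
                best = j
        if best is not None:
            pairs.append((concept, tokens[best], OPINION_LEXICON[tokens[best]]))
    return pairs

review = "the screen is bright and the battery is weak".split()
print(extract_fops(review))  # → [('Display', 'bright', '+'), ('Battery', 'weak', '-')]
```

    A real system would replace the dictionaries with an automatically built ontology and use dependency structure rather than a fixed token window.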

  4. Assessing impact of changes in human resources features on enterprise activities: simulation model

    Directory of Open Access Journals (Sweden)

    Kalmykova Svetlana

    2017-01-01

    Full Text Available The need to create human resources development (HRD) programs is shown, taking into account the impact of these programs on organizational effectiveness. The stages of developing tools and HRD programs on the basis of cognitive modelling are disclosed; these stages will help assess the impact of HR practices on the key indicators of organizational activity at the design stage. A method for the pre-selection of HR practices for employees' professional development is presented.

  5. Key-value store with internal key-value storage interface

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Ting, Dennis P. J.; Tzelnic, Percy; Gupta, Uday; Grider, Gary; Bonnie, David J.

    2018-01-16

    A key-value store is provided having one or more key-value storage interfaces. A key-value store on at least one compute node comprises a memory for storing a plurality of key-value pairs; and an abstract storage interface comprising a software interface module that communicates with at least one persistent storage device providing a key-value interface for persistent storage of one or more of the plurality of key-value pairs, wherein the software interface module provides the one or more key-value pairs to the at least one persistent storage device in a key-value format. The abstract storage interface optionally processes one or more batch operations on the plurality of key-value pairs. A distributed embodiment for a partitioned key-value store is also provided.
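
    The abstract storage interface described above can be sketched as a thin front end over pluggable backends, with simple batched puts. All class and method names here are hypothetical illustrations, not the patent's actual code:

```python
class KeyValueBackend:
    """Abstract persistent-store interface: concrete backends override these."""
    def put(self, key, value):
        raise NotImplementedError
    def get(self, key):
        raise NotImplementedError

class InMemoryBackend(KeyValueBackend):
    """Toy backend standing in for a persistent device with a key-value interface."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

class KeyValueStore:
    """Front-end store that forwards pairs, in key-value format, to a
    pluggable backend, and processes simple batch operations."""
    def __init__(self, backend):
        self._backend = backend
    def put(self, key, value):
        self._backend.put(key, value)
    def get(self, key):
        return self._backend.get(key)
    def put_batch(self, pairs):
        for k, v in pairs:
            self._backend.put(k, v)

store = KeyValueStore(InMemoryBackend())
store.put_batch([("a", 1), ("b", 2)])
print(store.get("a"), store.get("b"))  # → 1 2
```

    Swapping `InMemoryBackend` for a disk- or device-backed class is the point of the abstraction: the front end never changes.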

  6. Online feature selection with streaming features.

    Science.gov (United States)

    Wu, Xindong; Yu, Kui; Ding, Wei; Wang, Hao; Zhu, Xingquan

    2013-05-01

    We propose a new online feature selection framework for applications with streaming features where the knowledge of the full feature space is unknown in advance. We define streaming features as features that flow in one by one over time whereas the number of training examples remains fixed. This is in contrast with traditional online learning methods that only deal with sequentially added observations, with little attention being paid to streaming features. The critical challenges for Online Streaming Feature Selection (OSFS) include 1) the continuous growth of feature volumes over time, 2) a large feature space, possibly of unknown or infinite size, and 3) the unavailability of the entire feature set before learning starts. In the paper, we present a novel Online Streaming Feature Selection method to select strongly relevant and nonredundant features on the fly. An efficient Fast-OSFS algorithm is proposed to improve feature selection performance. The proposed algorithms are evaluated extensively on high-dimensional datasets and also with a real-world case study on impact crater detection. Experimental results demonstrate that the algorithms achieve better compactness and higher prediction accuracy than existing streaming feature selection algorithms.
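
    The accept/discard decision at the heart of streaming feature selection can be illustrated with a correlation-based toy: keep an arriving feature only if it is relevant to the target and not redundant with the features kept so far. The thresholds and the `stream_select` helper are assumptions for illustration, not the OSFS or Fast-OSFS algorithms themselves:

```python
def pearson(x, y):
    """Plain Pearson correlation; returns 0 for constant inputs."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def stream_select(feature_stream, target, rel=0.5, red=0.95):
    """Process features one by one as they arrive: discard weakly relevant
    ones, then discard those near-redundant with an already selected feature."""
    selected = {}
    for name, values in feature_stream:
        if abs(pearson(values, target)) < rel:
            continue                      # weakly relevant: discard
        if any(abs(pearson(values, s)) > red for s in selected.values()):
            continue                      # redundant with a kept feature
        selected[name] = values
    return list(selected)

y = [0, 1, 0, 1, 0, 1]
stream = [("f1", [0, 1, 0, 1, 0, 1]),     # relevant: kept
          ("f2", [0, 1, 0, 1, 0, 1]),     # duplicate of f1: redundant
          ("f3", [5, 5, 5, 5, 5, 5])]     # constant: irrelevant
print(stream_select(stream, y))  # → ['f1']
```

    The real algorithms use conditional-independence tests rather than pairwise correlation, but the one-pass structure is the same.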

  7. Machinery running state identification based on discriminant semi-supervised local tangent space alignment for feature fusion and extraction

    International Nuclear Information System (INIS)

    Su, Zuqiang; Xiao, Hong; Zhang, Yi; Tang, Baoping; Jiang, Yonghua

    2017-01-01

    Extraction of sensitive features is a challenging but key task in data-driven machinery running state identification. Aimed at solving this problem, a method for machinery running state identification that applies discriminant semi-supervised local tangent space alignment (DSS-LTSA) for feature fusion and extraction is proposed. Firstly, in order to extract more distinct features, the vibration signals are decomposed by wavelet packet decomposition (WPD), and a mixed-domain feature set consisting of statistical features, autoregressive (AR) model coefficients, instantaneous amplitude Shannon entropy and the WPD energy spectrum is extracted to comprehensively characterize the properties of the machinery running states. Then, the mixed-domain feature set is inputted into DSS-LTSA for feature fusion and extraction to eliminate redundant information and interference noise. The proposed DSS-LTSA can extract intrinsic structure information of both labeled and unlabeled state samples, and as a result the over-fitting problem of supervised manifold learning and the blindness problem of unsupervised manifold learning are overcome. Simultaneously, class discrimination information is integrated within the dimension-reduction process in a semi-supervised manner to improve the sensitivity of the extracted fusion features. Lastly, the extracted fusion features are inputted into a pattern recognition algorithm to achieve running state identification. The effectiveness of the proposed method is verified by a running state identification case in a gearbox, and the results confirm the improved accuracy of the running state identification.
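
    A toy version of such a mixed-domain feature vector (statistical features, an AR coefficient, Shannon entropy of the energy distribution) might look as follows. The exact feature list and names are illustrative assumptions, and the WPD step is omitted:

```python
import math

def mixed_domain_features(signal):
    """Toy mixed-domain feature vector: mean, RMS, crest factor (peak/RMS),
    an AR(1) coefficient via Yule-Walker, and Shannon entropy of the
    normalized sample-energy distribution."""
    n = len(signal)
    mean = sum(signal) / n
    rms = (sum(x * x for x in signal) / n) ** 0.5
    crest = max(abs(x) for x in signal) / rms if rms else 0.0
    centered = [x - mean for x in signal]
    r0 = sum(c * c for c in centered)
    r1 = sum(centered[i] * centered[i + 1] for i in range(n - 1))
    ar1 = r1 / r0 if r0 else 0.0          # AR(1): a1 = r1 / r0
    energy = [x * x for x in signal]
    total = sum(energy) or 1.0
    probs = [e / total for e in energy if e > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return [mean, rms, crest, ar1, entropy]

feats = mixed_domain_features([0.0, 1.0, 0.0, -1.0] * 8)
print([round(f, 3) for f in feats])  # → [0.0, 0.707, 1.414, 0.0, 2.773]
```

    In the method above, such vectors would then be passed to DSS-LTSA for fusion and dimension reduction rather than used directly.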

  8. Complex Topographic Feature Ontology Patterns

    Science.gov (United States)

    Varanka, Dalia E.; Jerris, Thomas J.

    2015-01-01

    Semantic ontologies are examined as effective data models for the representation of complex topographic feature types. Complex feature types are viewed as integrated relations between basic features for a basic purpose. In the context of topographic science, such component assemblages are supported by resource systems and found on the local landscape. Ontologies are organized within six thematic modules of a domain ontology called Topography that includes within its sphere basic feature types, resource systems, and landscape types. Context is constructed not only as a spatial and temporal setting, but a setting also based on environmental processes. Types of spatial relations that exist between components include location, generative processes, and description. An example is offered in a complex feature type ‘mine.’ The identification and extraction of complex feature types are an area for future research.

  9. Understanding Legacy Features with Featureous

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    ...Java programs called Featureous that addresses this issue. Featureous allows a programmer to easily establish feature-code traceability links and to analyze their characteristics using a number of visualizations. Featureous is an extension to the NetBeans IDE, and can itself be extended by third...

  10. Habitat Restoration as a Key Conservation Lever for Woodland Caribou: A review of restoration programs and key learnings from Alberta

    Directory of Open Access Journals (Sweden)

    Paula Bentham

    2015-12-01

    Full Text Available The Recovery Strategy for the Woodland Caribou (Rangifer tarandus caribou), Boreal Population, in Canada (EC, 2012) identifies coordinated actions to reclaim woodland caribou habitat as a key step to meeting current and future caribou population objectives. Actions include restoring industrial landscape features such as roads, seismic lines, pipelines, cut-lines, and cleared areas in an effort to reduce landscape fragmentation and the changes in caribou population dynamics associated with changing predator-prey dynamics in highly fragmented landscapes. Reliance on habitat restoration as a recovery action within the federal recovery strategy is high, considering all Alberta populations have less than 65% undisturbed habitat, which is identified in the recovery strategy as a threshold providing a 60% chance that a local population will be self-sustaining. Alberta’s Provincial Woodland Caribou Policy also identifies habitat restoration as a critical component of long-term caribou habitat management. We review and discuss the history of caribou habitat restoration programs in Alberta and present outcomes and highlights of a caribou habitat restoration workshop attended by over 80 representatives from oil and gas, forestry, provincial and federal regulators, academia and consulting who have worked on restoration programs. Restoration initiatives in Alberta began in 2001 and have generally focused on construction methods, revegetation treatments, access control programs, and limiting plant species favourable to alternate prey. Specific treatments include tree planting initiatives, coarse woody debris management along linear features, and efforts for multi-company and multi-stakeholder coordinated habitat restoration on caribou range. Lessons learned from these programs have been incorporated into large scale habitat restoration projects near Grande Prairie, Cold Lake, and Fort McMurray. A key outcome of our review is the opportunity to provide a

  11. The role of emotion in musical improvisation: an analysis of structural features.

    Science.gov (United States)

    McPherson, Malinda J; Lopez-Gonzalez, Monica; Rankin, Summer K; Limb, Charles J

    2014-01-01

    One of the primary functions of music is to convey emotion, yet how music accomplishes this task remains unclear. For example, simple correlations between mode (major vs. minor) and emotion (happy vs. sad) do not adequately explain the enormous range, subtlety or complexity of musically induced emotions. In this study, we examined the structural features of unconstrained musical improvisations generated by jazz pianists in response to emotional cues. We hypothesized that musicians would not utilize any universal rules to convey emotions, but would instead combine heterogeneous musical elements together in order to depict positive and negative emotions. Our findings demonstrate a lack of simple correspondence between emotions and musical features of spontaneous musical improvisation. While improvisations in response to positive emotional cues were more likely to be in major keys, have faster tempos, faster key press velocities and more staccato notes when compared to negative improvisations, there was a wide distribution for each emotion with components that directly violated these primary associations. The finding that musicians often combine disparate features together in order to convey emotion during improvisation suggests that structural diversity may be an essential feature of the ability of music to express a wide range of emotion.

  12. Specific features of goal setting in road traffic safety

    Science.gov (United States)

    Kolesov, V. I.; Danilov, O. F.; Petrov, A. I.

    2017-10-01

    Road traffic safety (RTS) management is inherently a branch of cybernetics and therefore requires clear formalization of the task. The paper aims to identify the specific features of goal setting in RTS management under the system approach. The paper presents the results of cybernetic modeling of the cause-and-effect mechanism of a road traffic accident (RTA); here, the mechanism itself is viewed as a complex system. The designed management goal function focuses on minimizing the difficulty of achieving the target goal. Optimization of the target goal has been performed using the Lagrange principle. The resulting algorithms have passed software testing. The key role of the obtained solution in tactical and strategic RTS management is considered. The dynamics of the management effectiveness indicator has been analyzed based on ten years of statistics for Russia.

  13. Climatic features of the Red Sea from a regional assimilative model

    KAUST Repository

    Viswanadhapalli, Yesubabu

    2016-08-16

    The Advanced Research version of Weather Research and Forecasting (WRF-ARW) model was used to generate a downscaled, 10-km resolution regional climate dataset over the Red Sea and adjacent region. The model simulations are performed based on two, two-way nested domains of 30- and 10-km resolutions assimilating all conventional observations using a cyclic three-dimensional variational approach over an initial 12-h period. The improved initial conditions are then used to generate regional climate products for the following 24 h. We combined the resulting daily 24-h datasets to construct a 15-year Red Sea atmospheric downscaled product from 2000 to 2014. This 15-year downscaled dataset is evaluated via comparisons with various in situ and gridded datasets. Our analysis indicates that the assimilated model successfully reproduced the spatial and temporal variability of temperature, wind, rainfall, relative humidity and sea level pressure over the Red Sea region. The model also efficiently simulated the seasonal and monthly variability of wind patterns, the Red Sea Convergence Zone and associated rainfall. Our results suggest that dynamical downscaling and assimilation of available observations improve the representation of regional atmospheric features over the Red Sea compared to global analysis data from the National Centers for Environmental Prediction. We use the dataset to describe the atmospheric climatic conditions over the Red Sea region. © 2016 Royal Meteorological Society.

  14. A Survey of Public Key Infrastructure-Based Security for Mobile Communication Systems

    Directory of Open Access Journals (Sweden)

    Mohammed Ramadan

    2016-08-01

    Full Text Available Mobile communication security techniques are employed to guard the communication between network entities. Mobile cellular communication systems have become one of the most important communication systems in recent times and are used by millions of people around the world. Since the 1990s, considerable efforts have been made to improve both the communication and security features of mobile communications systems. However, these improvements divide the mobile communications field into different generations according to the communication and security techniques, such as the A3, A5 and A8 algorithms for the 2G-GSM cellular system, and the 3G authentication and key agreement (AKA), evolved packet system authentication and key agreement (EPS-AKA), and long term evolution authentication and key agreement (LTE-AKA) algorithms for 3rd Generation Partnership Project (3GPP) systems. Furthermore, these generations have many vulnerabilities, and considerable security work is needed to solve such problems. Some of this work lies in the field of public key cryptography (PKC), which requires a high computational cost and greater network flexibility. As such, the public key infrastructure (PKI) is more compatible with the modern generations due to their superior communication features. This paper surveys the latest proposed works on the security of GSM, CDMA, and LTE cellular systems using PKI. Firstly, we present the security issues for each generation of mobile communication systems; then we study and analyze the latest proposed schemes and give some comparisons. Finally, we introduce some new directions for future work. This paper classifies mobile communication security schemes according to the techniques used for each cellular system and covers some of the PKI-based security techniques such as authentication, key agreement, and privacy preservation.
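
    The challenge-response core common to AKA-style authentication and key agreement can be sketched with a pre-shared key and HMACs. This is a loose illustration of the idea only, not the 3GPP AKA algorithm set (no SQN, AUTN, or MILENAGE functions), and every name below is invented:

```python
import hashlib
import hmac
import secrets

SHARED_KEY = b"pre-provisioned subscriber key"   # toy stand-in for the SIM key

def network_challenge():
    """Network side: issue a fresh random challenge."""
    return secrets.token_bytes(16)

def respond(key, challenge):
    """Subscriber side: prove key possession and derive a session key."""
    res = hmac.new(key, challenge + b"|auth", hashlib.sha256).digest()
    session = hmac.new(key, challenge + b"|session", hashlib.sha256).digest()
    return res, session

def verify(key, challenge, res):
    """Network side: recompute the response and compare in constant time."""
    expected, _ = respond(key, challenge)
    return hmac.compare_digest(expected, res)

rand = network_challenge()
res, session_key = respond(SHARED_KEY, rand)
print(verify(SHARED_KEY, rand, res))       # → True
print(verify(b"wrong key", rand, res))     # → False
```

    The PKI-based schemes the survey covers replace the pre-shared symmetric key with certificates and public-key operations, at the computational cost the abstract mentions.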

  15. Superpixel-Based Feature for Aerial Image Scene Recognition

    Directory of Open Access Journals (Sweden)

    Hongguang Li

    2018-01-01

    Full Text Available Image scene recognition is a core technology for many aerial remote sensing applications. Different landforms are inputted as different scenes in aerial imaging, and all landform information is regarded as valuable for aerial image scene recognition. However, the conventional features of the Bag-of-Words model are designed using local points or other related information and thus are unable to fully describe landform areas. This limitation cannot be ignored when the aim is to ensure accurate aerial scene recognition. A novel superpixel-based feature is proposed in this study to characterize aerial image scenes. Then, based on the proposed feature, a scene recognition method using the Bag-of-Words model for aerial imaging is designed. The proposed superpixel-based feature utilizes landform information and establishes a pipeline from top-task superpixel extraction of landforms to bottom-task expression of feature vectors. This characterization technique comprises the following steps: simple linear iterative clustering based superpixel segmentation, adaptive filter bank construction, Lie group-based feature quantification, and visual saliency model-based feature weighting. Experiments on image scene recognition are carried out using real image data captured by an unmanned aerial vehicle (UAV). The recognition accuracy of the proposed superpixel-based feature is 95.1%, which is higher than those of scene recognition algorithms based on other local features.

  16. Chirp investigation in EMLs towards frequency shift keying modulation

    DEFF Research Database (Denmark)

    Iglesias Olmedo, Miguel; Vegas Olmos, Juan José; Westergren, Urban

    2014-01-01

    This paper presents chirp modeling and experimental results that support our vision of enabling frequency shift keying (FSK) by exploiting the chirp effect in externally modulated lasers (EMLs).
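
    As a side illustration of binary FSK itself (not of the chirp-based EML technique the abstract proposes), a toy modulator and correlation demodulator with two tone frequencies might look like this; all constants are arbitrary assumptions:

```python
import math

F0, F1, RATE, SPB = 2.0, 4.0, 100, 100   # toy tone freqs (Hz), sample rate, samples/bit

def fsk_modulate(bits):
    """Binary FSK: each bit selects one of two tone frequencies."""
    samples = []
    for i, b in enumerate(bits):
        f = F1 if b else F0
        for n in range(SPB):
            t = (i * SPB + n) / RATE
            samples.append(math.sin(2 * math.pi * f * t))
    return samples

def fsk_demodulate(samples):
    """Per bit slot, pick the frequency whose matched tone correlates best."""
    bits = []
    for i in range(len(samples) // SPB):
        chunk = samples[i * SPB:(i + 1) * SPB]
        score = {}
        for b, f in ((0, F0), (1, F1)):
            c = s = 0.0
            for n, x in enumerate(chunk):
                t = (i * SPB + n) / RATE
                c += x * math.cos(2 * math.pi * f * t)
                s += x * math.sin(2 * math.pi * f * t)
            score[b] = c * c + s * s       # energy at frequency f
        bits.append(max(score, key=score.get))
    return bits

msg = [1, 0, 1, 1, 0]
print(fsk_demodulate(fsk_modulate(msg)) == msg)  # → True
```

    In the EML approach, the two "frequencies" would instead come from the laser's adiabatic chirp under direct modulation rather than from a digital oscillator.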

  17. SUPPLY CHAIN MANAGEMENT – KEY FACTORS

    Directory of Open Access Journals (Sweden)

    Magdalena Daniela DINU

    2014-06-01

    Full Text Available This paper examines Supply Chain Management through its key factors. Briefly, where Supply Chain Management is treated as a strategic part of a company, maintaining both control and influence throughout the entire supply chain is a key factor critical to success. On the other hand, finding the right partner to manage the non-strategic supply chains is another key factor. Defining the most important key factors within Supply Chain Management requires a deep understanding of the supply chain's components, procedures, workflows and processes, and of the importance of Supply Chain Management in maximizing a company's value. The SCOR model, which provides solid information for measuring performance and identifying priorities within Supply Chain Management, will help us understand the key factors by analyzing its elements: Plan, Source, Make, Deliver, Return, Enable. These elements cover all the challenging areas from the first to the third tier of Supply Chain Management.

  18. A comprehensive analysis of earthquake damage patterns using high dimensional model representation feature selection

    Science.gov (United States)

    Taşkin Kaya, Gülşen

    2013-10-01

    Recently, earthquake damage assessment using satellite images has been a very popular research direction. Especially with the availability of very high resolution (VHR) satellite images, quite detailed damage maps at the building scale have been produced, and various studies have been conducted in the literature. As the spatial resolution of satellite images increases, distinguishing damage patterns becomes more difficult, especially when only spectral information is used during classification. In order to overcome this difficulty, textural information needs to be involved in the classification to improve the visual quality and reliability of the damage map. There are many kinds of textural information that can be derived from VHR satellite images, depending on the algorithm used. However, extracting and evaluating textural information is generally a time-consuming process, especially for the large areas affected by an earthquake, due to the size of VHR images. Therefore, in order to provide a quick damage map, the most useful features describing damage patterns need to be known in advance, as well as the redundant features. In this study, a very high resolution satellite image acquired after the Bam, Iran earthquake was used to identify the earthquake damage. Not only spectral but also textural information was used during the classification. For textural information, second-order Haralick features were extracted from the panchromatic image for the area of interest using gray-level co-occurrence matrices with different window sizes and directions. In addition to using spatial features in classification, the most useful features representing the damage characteristics were selected with a novel feature selection method based on high dimensional model representation (HDMR), which gives the sensitivity of each feature during classification. The method called HDMR was recently proposed as an efficient tool to capture the input
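
    The gray-level co-occurrence matrix and one Haralick feature (contrast) can be computed directly. The 4-level toy "images" below are invented for illustration; a smooth patch yields zero contrast while a checkered one does not:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray-level co-occurrence counts for one pixel offset (dx, dy)."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1
    return m

def contrast(m):
    """Haralick contrast: sum of p(i, j) * (i - j)^2 over the normalized GLCM."""
    total = sum(sum(row) for row in m) or 1
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m)) for j in range(len(m))) / total

smooth = [[0, 0, 0], [0, 0, 0], [1, 1, 1]]   # horizontally uniform rows
edgy   = [[0, 3, 0], [3, 0, 3], [0, 3, 0]]   # alternating gray levels
print(contrast(glcm(smooth)), contrast(glcm(edgy)))  # → 0.0 9.0
```

    Real Haralick features average such statistics over several offsets and directions, as the abstract describes for multiple window sizes.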

  19. TU-D-207B-01: A Prediction Model for Distinguishing Radiation Necrosis From Tumor Progression After Gamma Knife Radiosurgery Based On Radiomics Features From MR Images

    International Nuclear Information System (INIS)

    Zhang, Z; Ho, A; Wang, X; Brown, P; Guha-Thakurta, N; Ferguson, S; Fave, X; Zhang, L; Mackin, D; Court, L; Li, J; Yang, J

    2016-01-01

    Purpose: To develop and validate a prediction model using radiomics features extracted from MR images to distinguish radiation necrosis from tumor progression for brain metastases treated with Gamma knife radiosurgery. Methods: The images used to develop the model were T1 post-contrast MR scans from 71 patients who had had pathologic confirmation of necrosis or progression; 1 lesion was identified per patient (17 necrosis and 54 progression). Radiomics features were extracted from 2 images at 2 time points per patient, both obtained prior to resection. Each lesion was manually contoured on each image, and 282 radiomics features were calculated for each lesion. The correlation for each radiomics feature between two time points was calculated within each group to identify a subset of features with distinct values between two groups. The delta of this subset of radiomics features, characterizing changes from the earlier time to the later one, was included as a covariate to build a prediction model using support vector machines with a cubic polynomial kernel function. The model was evaluated with a 10-fold cross-validation. Results: Forty radiomics features were selected based on consistent correlation values of approximately 0 for the necrosis group and >0.2 for the progression group. In performing the 10-fold cross-validation, we narrowed this number down to 11 delta radiomics features for the model. This 11-delta-feature model showed an overall prediction accuracy of 83.1%, with a true positive rate of 58.8% in predicting necrosis and 90.7% for predicting tumor progression. The area under the curve for the prediction model was 0.79. Conclusion: These delta radiomics features extracted from MR scans showed potential for distinguishing radiation necrosis from tumor progression. 
This tool may be a useful, noninvasive means of determining the status of an enlarging lesion after radiosurgery, aiding decision-making regarding surgical resection versus conservative medical

  20. TU-D-207B-01: A Prediction Model for Distinguishing Radiation Necrosis From Tumor Progression After Gamma Knife Radiosurgery Based On Radiomics Features From MR Images

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Z [Central South University Xiangya Hospital, Changsha, Hunan (China); MD Anderson Cancer Center, Houston, TX (United States); Ho, A [University of Houston, Houston, TX (United States); Wang, X; Brown, P; Guha-Thakurta, N; Ferguson, S; Fave, X; Zhang, L; Mackin, D; Court, L; Li, J; Yang, J [MD Anderson Cancer Center, Houston, TX (United States)

    2016-06-15

    Purpose: To develop and validate a prediction model using radiomics features extracted from MR images to distinguish radiation necrosis from tumor progression for brain metastases treated with Gamma knife radiosurgery. Methods: The images used to develop the model were T1 post-contrast MR scans from 71 patients who had had pathologic confirmation of necrosis or progression; 1 lesion was identified per patient (17 necrosis and 54 progression). Radiomics features were extracted from 2 images at 2 time points per patient, both obtained prior to resection. Each lesion was manually contoured on each image, and 282 radiomics features were calculated for each lesion. The correlation for each radiomics feature between two time points was calculated within each group to identify a subset of features with distinct values between two groups. The delta of this subset of radiomics features, characterizing changes from the earlier time to the later one, was included as a covariate to build a prediction model using support vector machines with a cubic polynomial kernel function. The model was evaluated with a 10-fold cross-validation. Results: Forty radiomics features were selected based on consistent correlation values of approximately 0 for the necrosis group and >0.2 for the progression group. In performing the 10-fold cross-validation, we narrowed this number down to 11 delta radiomics features for the model. This 11-delta-feature model showed an overall prediction accuracy of 83.1%, with a true positive rate of 58.8% in predicting necrosis and 90.7% for predicting tumor progression. The area under the curve for the prediction model was 0.79. Conclusion: These delta radiomics features extracted from MR scans showed potential for distinguishing radiation necrosis from tumor progression. 
This tool may be a useful, noninvasive means of determining the status of an enlarging lesion after radiosurgery, aiding decision-making regarding surgical resection versus conservative medical