WorldWideScience

Sample records for robust oracle machines

  1. Oracle Exalytics: Engineered for Speed-of-Thought Analytics

    Directory of Open Access Journals (Sweden)

    Gabriela GLIGOR

    2011-12-01

    One of the biggest product announcements at the 2011 Oracle OpenWorld user conference was the Oracle Exalytics In-Memory Machine, the latest addition to the "Exa"-branded suite of Oracle-Sun engineered software-hardware systems. Analytics is all about gaining insight from data for better decision making. However, the vision of delivering fast, interactive, insightful analytics has remained elusive for most organizations. Most enterprise IT organizations continue to struggle to deliver actionable analytics due to time-sensitive, sprawling requirements and ever-tightening budgets. The issue is further exacerbated by the fact that most enterprise analytics solutions require dealing with a number of hardware, software, storage and networking vendors, and precious resources are wasted integrating the hardware and software components to deliver a complete analytical solution. The Oracle Exalytics Business Intelligence Machine is the world’s first engineered system specifically designed to deliver high-performance analysis, modeling and planning. Built using industry-standard hardware, market-leading business intelligence software and in-memory database technology, Oracle Exalytics is an optimized system that delivers answers to your business questions with unmatched speed, intelligence, simplicity and manageability.

  2. Data warehousing with Oracle

    Science.gov (United States)

    Shahzad, Muhammad A.

    1999-02-01

    With the emergence of data warehousing, decision support systems have evolved considerably. At the core of these warehousing systems lies a good database management system. The database server used for data warehousing is responsible for providing robust data management, scalability, high-performance query processing and integration with other servers. Oracle, a pioneer among warehousing servers, provides a wide range of features for facilitating data warehousing. This paper reviews the concept of data warehousing and, lastly, the features of Oracle servers for implementing a data warehouse.

  3. Effects of a random noisy oracle on search algorithm complexity

    International Nuclear Information System (INIS)

    Shenvi, Neil; Brown, Kenneth R.; Whaley, K. Birgitta

    2003-01-01

    Grover's algorithm provides a quadratic speed-up over classical algorithms for unstructured database or library searches. This paper examines the robustness of Grover's search algorithm to a random phase error in the oracle and analyzes the complexity of the search process as a function of the scaling of the oracle error with database or library size. Both the discrete- and continuous-time implementations of the search algorithm are investigated. It is shown that unless the oracle phase error scales as O(N^(-1/4)), neither the discrete- nor the continuous-time implementation of Grover's algorithm is scalably robust to this error in the absence of error correction.
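
The effect of a noisy phase oracle can be probed numerically. The sketch below is an illustration of the idea, not the paper's analysis: it simulates the discrete-time Grover iteration on a classical state vector, where the oracle applies a phase of π + δ to the marked item, with δ drawn from a zero-mean Gaussian of width `phase_sigma` (both names are this sketch's own):

```python
import numpy as np

def grover_success_prob(n_items, target, phase_sigma=0.0, seed=0):
    """Simulate Grover search with a (possibly noisy) phase oracle.

    Instead of flipping the target amplitude by exactly -1 = e^{i*pi},
    the noisy oracle applies e^{i*(pi + delta)} with delta ~ N(0, phase_sigma),
    modeling a random phase error in the oracle.
    """
    rng = np.random.default_rng(seed)
    state = np.full(n_items, 1 / np.sqrt(n_items), dtype=complex)  # uniform start
    n_iter = int(round(np.pi / 4 * np.sqrt(n_items)))              # optimal count
    for _ in range(n_iter):
        delta = rng.normal(0.0, phase_sigma) if phase_sigma > 0 else 0.0
        state[target] *= np.exp(1j * (np.pi + delta))   # oracle: noisy phase flip
        mean = state.mean()
        state = 2 * mean - state                        # diffusion: inversion about mean
    return abs(state[target]) ** 2
```

With no noise and N = 64, the success probability after the optimal number of iterations is close to 1; averaging over many noise draws with a fixed per-call error width shows the degradation the abstract quantifies.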

  4. Oracle Forms migration to Oracle ADF

    OpenAIRE

    Komule, Vanda

    2017-01-01

    Oracle Forms technology has a long and successful history, and Oracle Forms installations are widespread around the world. However, modern business requirements have changed considerably, and the Oracle Forms software is approaching the natural end of its life cycle. Today many organizations want to transform their Oracle Forms applications into fully web-based solutions. For this purpose, Oracle Corporation offers its own solution: the Oracle ADF framework, based on Java EE technologies. The author has...

  5. Oracle database 12c release 2 in-memory tips and techniques for maximum performance

    CERN Document Server

    Banerjee, Joyjeet

    2017-01-01

    This Oracle Press guide shows, step-by-step, how to optimize database performance and cut transaction processing time using Oracle Database 12c Release 2 In-Memory. Oracle Database 12c Release 2 In-Memory: Tips and Techniques for Maximum Performance features hands-on instructions, best practices, and expert tips from an Oracle enterprise architect. You will learn how to deploy the software, use In-Memory Advisor, build queries, and interoperate with Oracle RAC and Multitenant. A complete chapter of case studies illustrates real-world applications. • Configure Oracle Database 12c and construct In-Memory enabled databases • Edit and control In-Memory options from the graphical interface • Implement In-Memory with Oracle Real Application Clusters • Use the In-Memory Advisor to determine what objects to keep In-Memory • Optimize In-Memory queries using groups, expressions, and aggregations • Maximize performance using Oracle Exadata Database Machine and In-Memory option • Use Swingbench to create d...

  6. Oracle Corporation 1-2

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    During these sessions, Tom Kyte of Oracle Corporation will cover the following topics: the tools Tom uses; the top 5 things done wrong over and over again; building test cases; Oracle 10g "cool features". Speaker Bio: Tom Kyte is the Vice President, Core Technologies for Oracle Government, Education and Healthcare. Before starting at Oracle, Tom Kyte worked as a systems integrator building large-scale, heterogeneous databases and applications, mostly for military and government customers. He spends a great deal of his time working with the Oracle database and, more specifically, working with people who are working with the Oracle database. Tom Kyte is the Tom behind the AskTom web site (http://asktom.oracle.com/), answering people's questions about the Oracle database and its tools. He is also the author of the AskTom column in Oracle Magazine (http://www.oracle.com/technology/oramag/oracle/current.html), and the author of Expert One-on-One Oracle (Apress, 2003), Beginning Oracle Programming (Wrox Press, 2...

  7. Oracle Training Survey

    CERN Multimedia

    Oracle Team

    2002-01-01

    With the aim of identifying the most suitable courses for the Oracle community at CERN, we have prepared a web questionnaire, and you are kindly invited to take a moment and fill it in. The questionnaire is available now and until 14 June 2002, here. This questionnaire covers two areas: courses on Oracle topics (given by an external company) and Oracle Tutorials (given by the Oracle team). As you may know, a series of Oracle Tutorials was held last year, and we would like to know if it would be useful for you to have a second run of them. You can access the Oracle Tutorials here. Thank you very much for your cooperation. Oracle Team Contact Name: Montse Collados Polidura (IT/DB)

  8. Oracle announces increased uptake of Oracle9i Application Server

    CERN Multimedia

    2002-01-01

    Oracle Europe this week announced that, increasingly, companies in the region are selecting the Oracle9i Application Server (Oracle9iAS) to develop and deploy web-based business applications. CERN is one of its customers (1/2 page).

  9. DC Algorithm for Extended Robust Support Vector Machine.

    Science.gov (United States)

    Fujiwara, Shuhei; Takeda, Akiko; Kanamori, Takafumi

    2017-05-01

    Nonconvex variants of support vector machines (SVMs) have been developed for various purposes. For example, robust SVMs attain robustness to outliers by using a nonconvex loss function, while the extended ν-SVM (Eν-SVM) extends the range of the hyperparameter by introducing a nonconvex constraint. Here, we consider the extended robust support vector machine (ER-SVM), a robust variant of Eν-SVM. ER-SVM combines two types of nonconvexity from robust SVMs and Eν-SVM. Because of the two nonconvexities, the existing algorithm we proposed needs to be divided into two parts depending on whether the hyperparameter value is in the extended range or not. The algorithm also heuristically solves the nonconvex problem in the extended range. In this letter, we propose a new, efficient algorithm for ER-SVM. The algorithm deals with two types of nonconvexity while never entailing more computations than either Eν-SVM or robust SVM, and it finds a critical point of ER-SVM. Furthermore, we show that ER-SVM includes the existing robust SVMs as special cases. Numerical experiments confirm the effectiveness of integrating the two nonconvexities.
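
The role of a nonconvex loss in robust SVMs can be illustrated with the ramp (truncated hinge) loss, a standard example that admits a difference-of-convex (DC) decomposition into two convex hinge-like pieces. This is a generic illustration of the DC idea, not the authors' ER-SVM algorithm; the names and the threshold `s` are this sketch's own:

```python
import numpy as np

def hinge(z):
    """Convex hinge loss max(0, 1 - z); unbounded as z -> -infinity."""
    return np.maximum(0.0, 1.0 - z)

def ramp_dc(z, s=-1.0):
    """Ramp loss written as a difference of two convex functions:

        ramp_s(z) = max(0, 1 - z) - max(0, s - z),   with s < 1.

    The loss is capped at 1 - s, so a single outlier's influence is bounded,
    which is the source of the robustness to outliers.
    """
    return np.maximum(0.0, 1.0 - z) - np.maximum(0.0, s - z)

z = np.linspace(-5.0, 3.0, 17)
# The DC form agrees with the direct "truncate the hinge at 1 - s" definition:
assert np.allclose(ramp_dc(z), np.minimum(1.0 - (-1.0), hinge(z)))
```

DC algorithms (such as CCCP) then linearize the concave part, -max(0, s - z), at the current iterate and solve one convex subproblem per step.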

  10. Applying and extending Oracle Spatial

    CERN Document Server

    Greener, Simon Gerard; Ravada, Siva

    2013-01-01

    This book is an advanced practical guide to applying and extending Oracle Spatial. It is for existing users of Oracle and Oracle Spatial who have, at a minimum, basic operational experience of using Oracle or an equivalent database. Advanced skills are not required.

  11. Oracle 12c for dummies

    CERN Document Server

    Ruel, Chris

    2013-01-01

    Demystifying the power of the Oracle 12c database The Oracle database is the industry-leading relational database management system (RDBMS), used by everyone from small companies to the world's largest enterprises for their most critical business and analytical processing. Oracle 12c includes industry-leading enhancements to enable cloud computing and empowers users to manage both Big Data and traditional data structures faster and cheaper than ever before. Oracle 12c For Dummies is the perfect guide for a novice database administrator or an Oracle DBA who is new to Oracle 12c. The book covers what

  12. Comparison of quantum oracles

    International Nuclear Information System (INIS)

    Kashefi, Elham; Banaszek, Konrad; Kent, Adrian; Vedral, Vlatko

    2002-01-01

    A standard quantum oracle S_f for a general function f: Z_N → Z_N is defined to act on two input states and return two outputs, with inputs |i⟩ and |j⟩ (i, j ∈ Z_N) returning outputs |i⟩ and |j + f(i)⟩. However, if f is known to be a one-to-one function, a simpler oracle, M_f, which returns |f(i)⟩ given |i⟩, can also be defined. We consider the relative strengths of these oracles. We define a simple promise problem that minimal quantum oracles can solve exponentially faster than classical oracles, via an algorithm that cannot be naively adapted to standard quantum oracles. We show that S_f can be constructed by invoking M_f and (M_f)^(-1) once each, while Θ(√N) invocations of S_f and/or (S_f)^(-1) are required to construct M_f.
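
The distinction between the two oracle types can be made concrete on computational basis states, where both oracles act as classical reversible maps. A small sketch (a basis-state illustration only, ignoring superpositions; the function names are this sketch's own) shows why S_f exists for any f while M_f requires f to be one-to-one:

```python
def standard_oracle(f, N):
    """S_f: (i, j) -> (i, (j + f(i)) mod N); reversible for any f."""
    return {(i, j): (i, (j + f(i)) % N) for i in range(N) for j in range(N)}

def minimal_oracle(f, N):
    """M_f: i -> f(i); well defined as a reversible map only if f is a bijection."""
    if len({f(i) for i in range(N)}) != N:
        raise ValueError("minimal oracle requires a one-to-one f")
    return {i: f(i) for i in range(N)}

N = 4
f = lambda i: (3 * i + 1) % N   # a bijection on Z_4
g = lambda i: 0                 # not one-to-one

S_f = standard_oracle(f, N)
assert len(set(S_f.values())) == N * N   # S_f permutes the N^2 basis states
S_g = standard_oracle(g, N)              # still a permutation, even for constant g
assert len(set(S_g.values())) == N * N
M_f = minimal_oracle(f, N)               # fine: f is invertible
```

`minimal_oracle(g, N)` raises, reflecting that only one-to-one functions admit a minimal oracle.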

  13. Questioning ORACLE: An Assessment of ORACLE's Analysis of Teachers' Questions and [A Comment on "Questioning ORACLE"].

    Science.gov (United States)

    Scarth, John; And Others

    1986-01-01

    Analysis of teachers' questions, part of the ORACLE (Observation Research and Classroom Learning Evaluation) project research, is examined in detail. Scarth and Hammersley argue that the rules ORACLE uses for identifying different types of questions involve levels of ambiguity and inference that threaten reliability and validity of the study's…

  14. Oracle PL/SQL Programming

    CERN Document Server

    Feuerstein, Steven

    2009-01-01

    This book is the definitive reference on PL/SQL, considered throughout the database community to be the best Oracle programming book available. Like its predecessors, this fifth edition of Oracle PL/SQL Programming covers language fundamentals, advanced coding techniques, and best practices for using Oracle's powerful procedural language. Thoroughly updated for Oracle Database 11g Release 2, this edition reveals new PL/SQL features and provides extensive code samples, ranging from simple examples to complex and complete applications, in the book and on the companion website. This indispensab

  15. Expert Oracle GoldenGate

    CERN Document Server

    Prusinski, Ben; Chung, Richard

    2011-01-01

    Expert Oracle GoldenGate is a hands-on guide to creating and managing complex data replication environments using the latest in database replication technology from Oracle. GoldenGate is the future in replication technology from Oracle, and aims to be best-of-breed. GoldenGate supports homogeneous replication between Oracle databases. It supports heterogeneous replication involving other brands such as Microsoft SQL Server and IBM DB2 Universal Server. GoldenGate is high-speed, bidirectional, highly-parallelized, and makes only a light impact on the performance of databases involved in replica

  16. Physical database design using Oracle

    CERN Document Server

    Burleson, Donald K

    2004-01-01

    Introduction to Oracle Physical Design: Preface; Relational Databases and Physical Design; Systems Analysis and Physical Database Design; Introduction to Logical Database Design; Entity/Relation Modeling; Bridging between Logical and Physical Models; Physical Design Requirements Validation. Physical Entity Design for Oracle: Data Relationships and Physical Design; Massive De-Normalization: STAR Schema Design; Designing Class Hierarchies; Materialized Views and De-Normalization; Referential Integrity; Conclusion. Oracle Hardware Design: Planning the Server Environment; Designing the Network Infrastructure for Oracle; Oracle Netw

  17. Oracle SQL tuning with Oracle SQLTXPLAIN

    CERN Document Server

    Charalambides, Stelios

    2013-01-01

    Oracle SQL Tuning with SQLTXPLAIN is a practical guide to SQL tuning the way Oracle's own experts do it, using a freely downloadable tool called SQLTXPLAIN. Using this simple tool you'll learn how to tune even the most complex SQL, and you'll learn to do it quickly, without the huge learning curve usually associated with tuning as a whole.  Firmly based in real world problems, this book helps you reclaim system resources and avoid the most common bottleneck in overall performance, badly tuned SQL.  You'll learn how the optimizer works, how to take advantage of its latest features, and when it'

  18. Ground-water conditions between Oracle and Oracle Junction, Pinal County, Arizona

    Science.gov (United States)

    Heindl, L.A.

    1955-01-01

    The development of the San Manuel copper prospect has greatly increased traffic along State Highway 77. Considerable interest in commercial possibilities along that road has resulted in a request by the Arizona State Land Department for information about the ground-water conditions between Oracle and Oracle Junction. This request came too late for information to be included in a recently completed memorandum report on the occurrence of ground water in the vicinity of Oracle, released in February 1955. These data are presented as a supplement to that report to minimize duplication of statements about the general geologic and hydrologic conditions. The necessary well data and sample descriptions that were not included in the Oracle report are shown in tables 3 and 4. The area discussed in this supplement comprises parts of Tps. 9 and 10 S., Rs. 13, 14, and 15 E., and includes about 90 square miles (fig. 3). The eastern portion overlaps part of the area covered by the earlier report.

  19. Unambiguous discrimination among oracle operators

    International Nuclear Information System (INIS)

    Chefles, Anthony; Kitagawa, Akira; Takeoka, Masahiro; Sasaki, Masahide; Twamley, Jason

    2007-01-01

    We address the problem of unambiguous discrimination among oracle operators. The general theory of unambiguous discrimination among unitary operators is extended with this application in mind. We prove that entanglement with an ancilla cannot assist any discrimination strategy for commuting unitary operators. We also obtain a simple, practical test for the unambiguous distinguishability of an arbitrary set of unitary operators on a given system. Using this result, we prove that the unambiguous distinguishability criterion is the same for both standard and minimal oracle operators. We then show that, except in certain trivial cases, unambiguous discrimination among all standard oracle operators corresponding to integer functions with fixed domain and range is impossible. However, we find that it is possible to unambiguously discriminate among the Grover oracle operators corresponding to an arbitrarily large unsorted database. The unambiguous distinguishability of standard oracle operators corresponding to totally indistinguishable functions, which possess a strong form of classical indistinguishability, is analysed. We prove that these operators are not unambiguously distinguishable for any finite set of totally indistinguishable functions on a Boolean domain and with arbitrary fixed range. Sets of such functions on a larger domain can have unambiguously distinguishable standard oracle operators, and we provide a complete analysis of the simplest case, that of four functions. We also examine the possibility of unambiguous oracle operator discrimination with multiple parallel calls and investigate an intriguing unitary superoperator transformation between standard and entanglement-assisted minimal oracle operators

  20. Boltzmann Oracle for Combinatorial Systems

    OpenAIRE

    Pivoteau , Carine; Salvy , Bruno; Soria , Michèle

    2008-01-01

    Boltzmann random generation applies to well-defined systems of recursive combinatorial equations. It relies on oracles giving values of the enumeration generating series inside their disk of convergence. We show that the combinatorial systems translate into numerical iteration schemes that provide such oracles. In particular, we give a fast oracle based on Newton iteration.

  1. Graph reconstruction with a betweenness oracle

    DEFF Research Database (Denmark)

    Abrahamsen, Mikkel; Bodwin, Greg; Rotenberg, Eva

    2016-01-01

    Graph reconstruction algorithms seek to learn a hidden graph by repeatedly querying a blackbox oracle for information about the graph structure. Perhaps the most well studied and applied version of the problem uses a distance oracle, which can report the shortest path distance between any pair of nodes. We introduce and study the betweenness oracle, where bet(a, m, z) is true iff m lies on a shortest path between a and z. This oracle is strictly weaker than a distance oracle, in the sense that a betweenness query can be simulated by a constant number of distance queries, but not vice versa...
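
The simulation direction is easy to state: on an unweighted graph, m lies on some shortest a-z path iff d(a, m) + d(m, z) = d(a, z), so three distance queries suffice. A minimal sketch using BFS as the distance oracle (names are this sketch's own):

```python
from collections import deque

def bfs_dist(adj, src):
    """Single-source shortest-path distances in an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def bet(adj, a, m, z):
    """Betweenness query simulated by three distance-oracle lookups:
    bet(a, m, z) iff d(a, m) + d(m, z) == d(a, z)."""
    da, dm = bfs_dist(adj, a), bfs_dist(adj, m)
    return da[m] + dm[z] == da[z]

# Path graph 0 - 1 - 2 - 3: node 2 is between 0 and 3, node 3 is not between 0 and 1.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
assert bet(path, 0, 2, 3) is True
assert bet(path, 0, 3, 1) is False
```

The converse direction (recovering exact distances from betweenness answers alone) is what makes the betweenness oracle strictly weaker, per the abstract.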

  2. Study guide for 1Z0-071 Oracle Database 12c SQL : Oracle Certification Prep

    CERN Document Server

    Morris, Matthew

    2016-01-01

    This Study Guide is targeted at IT professionals who are working towards becoming an Oracle Database 12c SQL Certified Associate. The book provides information covering all of the exam topics for the Oracle certification exam: "1Z0-071: Oracle Database 12c SQL". The books in the Oracle Certification Prep series are built in lockstep with the test topics provided by Oracle Education's certification program. Each book is intended to provide the information that will be tested, in a clean and concise format. The guides introduce the subject you'll be tested on, follow that with the information you'll need to know for it, and then move on to the next topic. They contain no drills or unrealistic self-tests to bump the page count without adding value. The series is intended to provide a concentrated source of exam information that is compact enough to be read through multiple times. This series is ideal for experienced Oracle professionals who are familiar with the topic being tested, but want a means to rapidly re...

  3. Oracle Apex reporting tips and tricks

    CERN Document Server

    Bara, George

    2013-01-01

    Take advantage of all the exciting reporting features of Oracle Application Express 4.2. Designed for a hands-on approach, this book contains in-depth practical guidelines from George Bara, a well-known Oracle Apex expert and blogger. From Classic to Interactive Reports, Web Services and PDF Printing, "Oracle Apex Reporting Tips & Tricks" is a must-have for all database developers who want to make the most out of the Oracle Apex reporting engine.

  4. Oracle Data Dictionary Pocket Reference

    CERN Document Server

    Kreines, David

    2003-01-01

    If you work with Oracle, then you don't need to be told that the data dictionary is large and complex, and grows larger with each new Oracle release. It's one of the basic elements of the Oracle database you interact with regularly, but the sheer number of tables and views makes it difficult to remember which view you need, much less the name of the specific column. Want to make it simpler? The Oracle Data Dictionary Pocket Reference puts all the information you need right at your fingertips. Its handy and compact format lets you locate the table and view you need effortlessly without stoppin

  5. Evaluation of Oracle Big Data Integration Tools

    OpenAIRE

    Urhan, Harun; Baranowski, Zbigniew

    2015-01-01

    The project's objective is to evaluate Oracle's Big Data Integration Tools. The project covers two of Oracle's tools: Oracle Data Integrator Application Adapters for Hadoop, used to load data from Oracle Database into Hadoop, and Oracle SQL Connectors for HDFS, used to query data stored on a Hadoop file system with SQL statements executed on an Oracle Database.

  6. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability. With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  7. Oracle ADF Faces cookbook

    CERN Document Server

    Gawish, Amr

    2014-01-01

    This is a cookbook that covers more than 80 different recipes to teach you about different aspects of Oracle ADF Faces. It follows a practical approach and covers how to build your components for reuse in different applications. This book will also help you in tuning the performance of your ADF Faces application. If you are an ADF developer who wants to harness the power of Oracle ADF Faces to create exceptional user interfaces and reactive applications, this book will provide you with the recipes needed to do just that. You will not need to be familiar with Oracle ADF Faces, but you should be

  8. Oracle and PLSQL Recipes

    CERN Document Server

    Juneau, Josh

    2010-01-01

    Oracle PL/SQL Recipes is your go to book for PL/SQL programming solutions. It takes a task-oriented approach to PL/SQL programming that lets you quickly look up a specific task and see the pattern for a solution. Then it's as simple as modifying the pattern for your specific application and implementing it. And you're done and home for dinner. Oracle PL/SQL Recipes is another in Apress' ongoing series of recipe books aimed at Oracle practitioners. The recipe format is ideal for the busy professional who just needs to get the job done. * Covers the most common PL/SQL programming problems * Pres

  9. Oracle PL/SQL Language Pocket Reference

    CERN Document Server

    Feuerstein, Steven; Dawes, Chip

    2007-01-01

    The fourth edition of this popular pocket guide provides quick-reference information that will help you use Oracle's PL/SQL language, including the newest Oracle Database 11g features. A companion to Steven Feuerstein and Bill Pribyl's bestselling Oracle PL/SQL Programming, this concise guide boils down the most vital PL/SQL information into an accessible summary

  10. The CIO's guide to Oracle products and solutions

    CERN Document Server

    Keyes, Jessica

    2014-01-01

    From operating systems to the cloud, Oracle's products and services are everywhere, and it has the market share to prove it. Given the sheer diversity of the Oracle product line, and the level of complexity of integration, management can be quite a daunting task. The CIO's Guide to Oracle Products and Solutions is the go-to guide for all things Oracle. It provides management-level guidance on how to successfully navigate and manage the full range of Oracle products. The book presents management best practices and user/developer lessons learned in the use of Oracle products and services. Supplyi

  11. PS3-21: Extracting Utilization Data from Clarity into VDW Using Oracle and SAS

    Science.gov (United States)

    Chimmula, Srivardhan

    2013-01-01

    Background/Aims The purpose of the presentation is to demonstrate how we use SAS and Oracle to load the VDW_Utilization, VDW_DX, and VDW_PX tables from Clarity at the Kaiser Permanente Northern California (KPNC) Division of Research (DOR) site. Methods DOR uses the best of Oracle PL/SQL and SAS capabilities in building Extract, Transform and Load (ETL) processes. These processes extract patient encounter, diagnosis, and procedure data from Teradata-based Clarity. The data is then transformed to fit HMORN’s VDW definitions of the tables. This data is then loaded into the Oracle-based VDW table on DOR’s research database, and finally a copy of the table is also created as a SAS dataset. Results DOR builds robust and efficient ETL processes that refresh the VDW Utilization table on a monthly basis, processing millions of records/observations. The ETL processes have the capability to identify daily changes in Clarity and update the VDW tables on a daily basis. Conclusions KPNC DOR combines the best of both the Oracle and SAS worlds to build ETL processes that load the data into VDW Utilization tables efficiently.

  12. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...

  13. The PEP-II/BaBar Project-Wide Database using World Wide Web and Oracle*Case

    International Nuclear Information System (INIS)

    Chan, A.; Crane, G.; MacGregor, I.; Meyer, S.

    1995-12-01

    The PEP-II/BaBar Project Database is a tool for monitoring the technical and documentation aspects of the accelerator and detector construction. It holds the PEP-II/BaBar design specifications, fabrication and installation data in one integrated system. Key pieces of the database include the machine parameter list, components fabrication and calibration data, survey and alignment data, property control, CAD drawings, publications and documentation. This central Oracle database on a UNIX server is built using Oracle*Case tools. Users at the collaborating laboratories mainly access the data using World Wide Web (WWW). The Project Database is being extended to link to legacy databases required for the operations phase

  14. Integrating Oracle Human Resources with Other Modules

    Science.gov (United States)

    Sparks, Karl; Shope, Shawn

    1998-01-01

    One of the most challenging aspects of implementing an enterprise-wide business system is achieving integration of the different modules to the satisfaction of diverse customers. The Jet Propulsion Laboratory's (JPL) implementation of the Oracle application suite demonstrates the need to coordinate Oracle Human Resources Management System (HRMS) decisions across the Oracle modules.

  15. Oracle internals tips, tricks, and techniques for DBAs

    CERN Document Server

    Burleson, Donald K

    2001-01-01

    If you are a typical Oracle professional, you don't have the luxury of time to keep up with new technology and read all the new manuals to understand each new feature of the latest release from Oracle. You need a comprehensive source of information and in-depth tips and techniques for using the new technology. You need Oracle Internals: Tips, Tricks, and Techniques for DBAs.Oracle has evolved from a simple relational database into one of the most complex e-commerce platforms ever devised. It's not enough for you to understand just the Oracle database. You must also understand the components of

  16. Oracle database 12c the complete reference

    CERN Document Server

    Bryla, Bob

    2014-01-01

    Maintain a scalable, highly available enterprise platform and reduce complexity by leveraging the powerful new tools and cloud enhancements of Oracle Database 12c. This authoritative Oracle Press guide offers complete coverage of installation, configuration, tuning, and administration. Find out how to build and populate Oracle databases, perform effective queries, design applications, and secure your enterprise data

  17. Oracle BAM 11gR1 Handbook

    CERN Document Server

    Wang, Pete

    2012-01-01

    "Oracle BAM 11gR1 Handbook" is a practical best practices tutorial focused entirely on Oracle Business Activity Monitoring. An intermediate-to-advanced guide, step-by-step instructions and an accompanying demo project will help SOA report developers through application development and producing dashboards and reports. If you are a developer/report developer or SOA Architect who wants to learn valuable Oracle BAM best practices for monitoring your operations in real time, then "Oracle BAM 11gR1 Handbook" is for you. Administrators will also find the book useful. You should already be comfortabl

  18. Database server Oracle8i; DB saver Oracle8i

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Oracle8i is a DBMS (DataBase Management System) with functions augmented for use with a DWH (Data Ware House). Important augmented functions include hash partitioning and composite partitioning, which enable high-speed retrieval and management of large data sets, and a materialized view function that enables high-speed data aggregation. Usability is also improved through functions for automatic standby, online index (re)building, and instance recovery. Although Oracle8i cites coordinated development and operation with Java as an important new feature, this remains unsupported at the current stage. (translated by NEDO)

  19. Oracle and storage IOs, explanations and experience at CERN

    International Nuclear Information System (INIS)

    Grancher, Eric

    2010-01-01

    The Oracle database system is used extensively in the High Energy Physics community. Critical to the efficient running of these databases is the storage subsystem, and over the years Oracle has introduced new ways to access and manage this storage, e.g. ASM (Oracle database version 10.1), Direct NFS (Oracle database version 11.1), and Exadata (Oracle database version 11.1). This paper presents CERN's experience over the past few years with the different storage access and management features, and gives a comparison of each functionality. Also compared are the different solutions used at CERN, and the Tier 1 sites for storing Oracle databases.

  20. The data archive system and preliminary applications of Hefei light source based on oracle

    International Nuclear Information System (INIS)

    Qiang Jie; Liu Gongfa

    2013-01-01

    The data archive system is a key part of the HLS control system; it is used for archiving beam parameters, real-time equipment data, alarm records, etc. The Archive Engine of the RDB Channel Archiver collects data from the Input/Output Controllers of the HLS control system and stores it in an Oracle database. Web pages based on JSP have been developed for data inquiry and statistics, making data processing and analysis more efficient for operators and researchers. Thus the system can meet the requirements of HLS machine operation and machine research. (authors)

  1. Oracle Goldengate 11g complete cookbook

    CERN Document Server

    Gupta, Ankur

    2013-01-01

    Oracle Goldengate 11g Complete Cookbook follows the Cookbook style. Each recipe provides step-by-step instructions with various examples and scripts. This book provides the necessary information to successfully complete most of the possible administration tasks. Oracle Goldengate 11g Complete Cookbook is aimed at Database Administrators, Architects, and Middleware Administrators who are keen to know more about Oracle Goldengate. Whether you are handling Goldengate environments on a day-to-day basis, or using it just for migration, this book provides the necessary information required to succeed...

  2. Oracle SOA Governance 11g implementation

    CERN Document Server

    Weir, Luis Augusto

    2013-01-01

    This book is a practical tutorial, with lots of step-by-step instructions for achieving SOA Governance by implementing the component Oracle products. This book is written for SOA architects and project managers who want to learn how to implement Oracle SOA Governance.

  3. DATABASE OF MIGRATION AND REPLICATION WITH ORACLE GOLDEN GATE

    Directory of Open Access Journals (Sweden)

    Suharjito Suharjito

    2014-10-01

    Full Text Available The main goal of this research is to analyze and design a database migration and replication configuration at PT Metro Batavia. The research methodologies used are data collection, analysis, and model design. Data collection was conducted through library research and a direct survey at the company. The analysis covered the hangar system, the migration and replication processes, and the existing problems. The design method produced a prototype of the migration process, implemented with Oracle SQL Developer, and of the replication process, implemented with Oracle GoldenGate. The result of this research is a prototype configuration of the migration and replication processes using Oracle GoldenGate, which can produce two sets of identical data for the purpose of backup and recovery, together with a simple tool intended to support active-active or active-passive replication. The conclusion of this research is that migration of the MySQL database to the Oracle database could not be performed with Oracle GoldenGate, because GoldenGate still has a bug related to the binary log, so the migration was carried out with Oracle SQL Developer. However, bi-directional replication between Oracle databases using Oracle GoldenGate can guarantee data availability and reduce the workload of the primary database.

  4. A strategy for quantum algorithm design assisted by machine learning

    International Nuclear Information System (INIS)

    Bang, Jeongho; Lee, Jinhyoung; Ryu, Junghee; Yoo, Seokwon; Pawłowski, Marcin

    2014-01-01

    We propose a method for quantum algorithm design assisted by machine learning. The method uses a quantum–classical hybrid simulator, where a ‘quantum student’ is being taught by a ‘classical teacher’. In other words, in our method, the learning system is supposed to evolve into a quantum algorithm for a given problem, assisted by a classical main-feedback system. Our method is applicable for designing quantum oracle-based algorithms. We chose, as a case study, an oracle decision problem, called a Deutsch–Jozsa problem. We showed by using Monte Carlo simulations that our simulator can faithfully learn a quantum algorithm for solving the problem for a given oracle. Remarkably, the learning time is proportional to the square root of the total number of parameters, rather than showing the exponential dependence found in the classical machine learning-based method. (paper)
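    The Deutsch-Jozsa oracle decision problem mentioned in this abstract has a compact classical simulation: after the Hadamard-oracle-Hadamard sequence, the amplitude of the all-zeros state is ±1 for a constant oracle and exactly 0 for a balanced one. The sketch below illustrates only that decision principle, not the learning method of the paper; function names are invented for illustration.

```python
from itertools import product

def deutsch_jozsa_zero_amplitude(f, n):
    """Amplitude of |0...0> after H^n, an oracle phase flip, then H^n again.
    It equals +/-1 for a constant oracle and 0 for a balanced one."""
    total = sum((-1) ** f(x) for x in product((0, 1), repeat=n))
    return total / 2 ** n

constant = lambda x: 1       # f(x) = 1 on every input
balanced = lambda x: x[0]    # f(x) = 1 on exactly half of the inputs

print(deutsch_jozsa_zero_amplitude(constant, 3))  # -1.0 (|amplitude| = 1)
print(deutsch_jozsa_zero_amplitude(balanced, 3))  # 0.0
```

A single "quantum" evaluation of this amplitude decides the promise problem, whereas a deterministic classical algorithm may need 2^(n-1)+1 oracle queries.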

  5. A strategy for quantum algorithm design assisted by machine learning

    Science.gov (United States)

    Bang, Jeongho; Ryu, Junghee; Yoo, Seokwon; Pawłowski, Marcin; Lee, Jinhyoung

    2014-07-01

    We propose a method for quantum algorithm design assisted by machine learning. The method uses a quantum-classical hybrid simulator, where a ‘quantum student’ is being taught by a ‘classical teacher’. In other words, in our method, the learning system is supposed to evolve into a quantum algorithm for a given problem, assisted by a classical main-feedback system. Our method is applicable for designing quantum oracle-based algorithms. We chose, as a case study, an oracle decision problem, called a Deutsch-Jozsa problem. We showed by using Monte Carlo simulations that our simulator can faithfully learn a quantum algorithm for solving the problem for a given oracle. Remarkably, the learning time is proportional to the square root of the total number of parameters, rather than showing the exponential dependence found in the classical machine learning-based method.

  6. Oracle PL/SQL programming

    CERN Document Server

    Feuerstein, Steven

    2014-01-01

    Considered the best Oracle PL/SQL programming guide by the Oracle community, this definitive guide is precisely what you need to make the most of Oracle’s powerful procedural language. The sixth edition describes the features and capabilities of PL/SQL up through Oracle Database 12c Release 1. Hundreds of thousands of PL/SQL developers have benefited from this book over the last twenty years; this edition continues that tradition. With extensive code examples and a lively sense of humor, this book explains language fundamentals, explores advanced coding techniques, and offers best practices to help you solve real-world problems. * Get PL/SQL programs up and running quickly, with clear instructions for executing, tracing, testing, debugging, and managing code * Understand new 12.1 features, including the ACCESSIBLE_BY clause, WITH FUNCTION and UDF pragma, BEQUEATH CURRENT_USER for views, and new conditional compilation directives * Take advantage of extensive code samples, from easy-to-follow examples to reu...

  7. Pro Oracle database 11g RAC on Linux

    CERN Document Server

    Shaw, Steve

    2010-01-01

    Pro Oracle Database 11g RAC on Linux provides full-life-cycle guidance on implementing Oracle Real Application Clusters in a Linux environment. Real Application Clusters, commonly abbreviated as RAC, is Oracle's industry-leading architecture for scalable and fault-tolerant databases. RAC allows you to scale up and down by simply adding and subtracting inexpensive Linux servers. Redundancy provided by those multiple, inexpensive servers is the basis for the failover and other fault-tolerance features that RAC provides. Written by authors well-known for their talent with RAC, Pro Oracle Database...

  8. Oracle-based online robust optimization via online learning

    NARCIS (Netherlands)

    Ben-Tal, A.; Hazan, E.; Koren, T.; Shie, M.

    2015-01-01

    Robust optimization is a common optimization framework under uncertainty when problem parameters are unknown, but it is known that they belong to some given uncertainty set. In the robust optimization framework, a min-max problem is solved wherein a solution is evaluated according to its performance
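    The min-max setting described above can be illustrated with a toy pessimization-oracle loop; this is a simplified sketch, not the online-learning algorithm of the paper, and all names and constants are invented for illustration. At each step an oracle returns the worst-case scenario for the current decision, and the decision variable takes a gradient step against that scenario.

```python
def robust_minimize(x0, scenarios, lr=0.05, steps=400):
    """Toy oracle-based min-max for min_x max_a (x - a)^2 over a finite
    uncertainty set: a pessimization oracle picks the worst scenario,
    then x takes a gradient step against it."""
    x = x0
    for _ in range(steps):
        worst = max(scenarios, key=lambda a: (x - a) ** 2)  # oracle call
        x -= lr * 2 * (x - worst)          # gradient of (x - worst)^2
        x = min(max(x, 0.0), 1.0)          # project back onto [0, 1]
    return x

# With scenarios {0, 1} the robust optimum is x = 0.5 (worst case 0.25);
# the iterate settles into a small neighborhood of it.
x_star = robust_minimize(0.9, [0.0, 1.0])
print(x_star)
```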

  9. Oracle Solaris 11 advanced administration cookbook

    CERN Document Server

    Borges, Alexandre

    2014-01-01

    If you are a Solaris administrator who wants to learn more about administering an Oracle Solaris system and want to go a level higher in utilizing the advanced features of Oracle Solaris, then this book is for you. A working knowledge of Solaris Administration is assumed.

  10. Oracle support provides a range of new tutorials

    CERN Multimedia

    2012-01-01

    The IT DB is pleased to announce a new series of Oracle tutorials, with the proposed schedule. Note that these tutorials will take place in the Filtration Plant (Building 222) and that no registration is required.   4 June (Monday) 09:00 Oracle Architecture, Przemyslaw Adam Radowiecki The objective is to go through Oracle database physical and logical structures, highlighting the consequences of some of Oracle's internal design choices for developers of database applications. The presentation defines Oracle-related basic terms and illustrates them based on the database architecture. The following topics will be discussed: • Database with its physical and logical structures (tablespace, segment, extent, block, database user, schema, user's quota) • Single instance (significant memory structures: buffer cache, shared pool) • Real Application Cluster (RAC) • Connecting to the database (TNS, database service) • SQL statement processing (h...

  11. Oracle PL/SQL Language Pocket Reference

    CERN Document Server

    Feuerstein, Steven; Dawes, Chip

    2004-01-01

    While it's good to have a book with all the answers--like your trusty copy of Oracle PL/SQL Programming-- how often do you need all the answers? More likely, you just need a reminder, a quick answer to a problem you're up against. For these times, nothing's handier than the new edition of the Oracle PL/SQL Language Pocket Reference by PL/SQL experts Steven Feuerstein, Bill Pribyl, and Chip Dawes. Newly updated for Oracle10g, this little book is always at the ready for the quick problem solving you need. The 3rd edition of this popular mini-reference boils down the most vital information from...

  12. Oracle SQL Tuning pocket Reference

    CERN Document Server

    Gurry, Mark

    2002-01-01

    One of the most important challenges faced by Oracle database administrators and Oracle developers is the need to tune SQL statements so that they execute efficiently. Poorly tuned SQL statements are one of the leading causes of substandard database performance and poor response time. SQL statements that perform poorly result in frustration for users, and can even prevent a company from serving its customers in a timely manner

  13. How to Configurate Oracle Enterprise Manager on Windows 2000 Server

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Oracle Enterprise Manager is a system management tool, which provides an integrated solution for centrally managing your heterogeneous environment Servers. Enterprise Manager combines a graphical Console, Oracle Management Servers, Oracle Intelligent Agents, common services, and tools to provide an integrated, comprehensive systems management platform for managing Oracle products, and is comprised of components such as Data...

  14. STRONG ORACLE OPTIMALITY OF FOLDED CONCAVE PENALIZED ESTIMATION.

    Science.gov (United States)

    Fan, Jianqing; Xue, Lingzhou; Zou, Hui

    2014-06-01

    Folded concave penalization methods have been shown to enjoy the strong oracle property for high-dimensional sparse estimation. However, a folded concave penalization problem usually has multiple local solutions and the oracle property is established only for one of the unknown local solutions. A challenging fundamental issue still remains that it is not clear whether the local optimum computed by a given optimization algorithm possesses those nice theoretical properties. To close this important theoretical gap in over a decade, we provide a unified theory to show explicitly how to obtain the oracle solution via the local linear approximation algorithm. For a folded concave penalized estimation problem, we show that as long as the problem is localizable and the oracle estimator is well behaved, we can obtain the oracle estimator by using the one-step local linear approximation. In addition, once the oracle estimator is obtained, the local linear approximation algorithm converges, namely it produces the same estimator in the next iteration. The general theory is demonstrated by using four classical sparse estimation problems, i.e., sparse linear regression, sparse logistic regression, sparse precision matrix estimation and sparse quantile regression.
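    In the special case of an orthonormal design, the one-step local linear approximation described above reduces each coordinate to soft-thresholding the least-squares estimate with a weight given by the SCAD penalty derivative evaluated at an initial estimator. The pure-Python sketch below is an illustrative reduction under that assumption, not the authors' general implementation; variable names are invented.

```python
def scad_derivative(t, lam, a=3.7):
    """Derivative p'_lam(|t|) of the SCAD penalty of Fan and Li."""
    t = abs(t)
    if t <= lam:
        return lam
    return max(a * lam - t, 0.0) / (a - 1)

def soft_threshold(z, w):
    """Shrink z toward zero by w; exactly zero inside [-w, w]."""
    if abs(z) <= w:
        return 0.0
    return z - w if z > 0 else z + w

def one_step_lla(beta_init, z, lam):
    """One LLA step (orthonormal design): adaptive soft-thresholding of
    least-squares estimates z, weighted by the initial estimate beta_init."""
    return [soft_threshold(zj, scad_derivative(bj, lam))
            for bj, zj in zip(beta_init, z)]

# A large initial coefficient gets weight 0, so it is left unshrunk
# (the oracle behavior); a tiny one keeps the full penalty and is zeroed.
print(one_step_lla([5.0, 0.1], [5.2, 0.3], lam=1.0))  # [5.2, 0.0]
```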

  15. A robust PSSs design using PSO in a multi-machine environment

    Energy Technology Data Exchange (ETDEWEB)

    Shayeghi, H., E-mail: hshayeghi@gmail.co [Technical Engineering Department, University of Mohaghegh Ardabili, Ardabil (Iran, Islamic Republic of); Shayanfar, H.A. [Center of Excellence for Power Automation and Operation, Electrical Engineering Department, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of); Safari, A.; Aghmasheh, R. [Technical Engineering Department, Zanjan University, Zanjan (Iran, Islamic Republic of)

    2010-04-15

    In this paper, multi-objective design of multi-machine power system stabilizers (PSSs) using particle swarm optimization (PSO) is proposed. The potential of the proposed approach for optimal setting of the widely used conventional lead-lag PSSs has been investigated. The stabilizers are tuned to simultaneously shift the lightly damped and undamped electromechanical modes of all machines to a prescribed zone in the s-plane. The PSS parameter tuning problem is converted to an optimization problem with an eigenvalue-based multi-objective function comprising the damping factor and the damping ratio of the lightly damped electromechanical modes, which is solved by a PSO algorithm with a strong ability to find the most optimistic results. The robustness of the proposed PSO-based PSSs (PSOPSS) is verified on a multi-machine power system under different operating conditions and disturbances. The results of the proposed PSOPSS are compared with the genetic algorithm based tuned PSS and classical PSSs through eigenvalue analysis, nonlinear time-domain simulation and some performance indices to illustrate its robust performance for a wide range of loading conditions.
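    The PSO mechanics behind such tuning can be sketched in a few lines of pure Python. The eigenvalue-based objective of the paper is replaced here by a simple sphere function as a stand-in, and all parameter values are illustrative, not the ones used in the study.

```python
import random

def pso(f, dim, n_particles=30, iters=200, seed=1,
        w=0.7, c1=1.5, c2=1.5, bound=5.0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: in the paper this would be the eigenvalue-based
# multi-objective function over the PSS parameters.
sphere = lambda x: sum(xi * xi for xi in x)
best, best_val = pso(sphere, dim=4)
print(best_val)  # shrinks toward 0 as the swarm converges
```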

  16. ATLAS database application enhancements using Oracle 11g

    International Nuclear Information System (INIS)

    Dimitrov, G; Canali, L; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, condition data replication to remote sites. Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided in production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the hosted 260 database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN were upgraded to the newest Oracle version at the time: Oracle 11g Release 2. Oracle 11g comes with several key improvements compared to previous database engine versions. In this work we present our evaluation of the most relevant new features of Oracle 11g of interest for ATLAS applications and use cases. Notably we report on the performance and scalability enhancements obtained in production since the Oracle 11g deployment during Q1 2012, and we outline plans for future work in this area.

  17. Oracle accrual plans from requirements to implementation

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, Christine K [Los Alamos National Laboratory

    2009-01-01

    Implementing any new business software can be an intimidating prospect and this paper is intended to offer some insight in to how to approach this challenge with some fundamental rules for success. Los Alamos National Laboratory (LANL) had undergone an original ERP implementation of HRMS, Oracle Advanced Benefits, Worker Self Service, Manager Self Service, Project Accounting, Financials and PO, and recently completed a project to implement Oracle Payroll, Time and Labor and Accrual Plans. This paper will describe some of the important lessons that can be applied to any implementation as a whole, and then specifically how this knowledge was applied to the design and deployment of Oracle Accrual Plans for LANL. Finally, detail on the functionality available in Oracle Accrual Plans will be described, as well as the detailed setups that were utilized at LANL.

  18. Oracle GoldenGate 12c implementer's guide

    CERN Document Server

    Jeffries, John P

    2015-01-01

    The book is aimed at Oracle database administrators, project managers, and solution architects who wish to extend their knowledge of GoldenGate. The reader is assumed to be familiar with Oracle databases. No knowledge of GoldenGate is required.

  19. Oracle Data Guard 11gR2 administration beginner's guide

    CERN Document Server

    Baransel, Emre

    2013-01-01

    Using real-world examples and hands-on tasks, Oracle Data Guard 11gR2 Administration Beginner's Guide will give you a solid foundation in Oracle Data Guard. It has been designed to teach you everything you need to know to successfully create and operate Data Guard environments with maximum flexibility, compatibility, and effectiveness. If you are an Oracle database administrator who wants to configure and administer Data Guard configurations, then "Oracle Data Guard 11gR2 Administration Beginner's Guide" is for you. With a basic understanding of Oracle database administration, you'll be able...

  20. Organization of Risk Analysis Codes for Living Evaluations (ORACLE)

    International Nuclear Information System (INIS)

    Batt, D.L.; MacDonald, P.E.; Sattison, M.B.; Vesely, E.

    1987-01-01

    ORACLE (Organization of Risk Analysis Codes for Living Evaluations) is an integration concept for using risk-based information in United States Nuclear Regulatory Commission (USNRC) applications. Portions of ORACLE are being developed at the Idaho National Engineering Laboratory for the USNRC. The ORACLE concept consists of related databases, software, user interfaces, processes, and quality control checks allowing a wide variety of regulatory problems and activities to be addressed using current, updated PRA information. The ORACLE concept provides for smooth transitions between one code and the next without pre- or post-processing. (orig.)

  1. ROBUSTNESS OF A FACE-RECOGNITION TECHNIQUE BASED ON SUPPORT VECTOR MACHINES

    OpenAIRE

    Prashanth Harshangi; Koshy George

    2010-01-01

    The ever-increasing requirements of security concerns have placed a greater demand for face recognition surveillance systems. However, most current face recognition techniques are not quite robust with respect to factors such as variable illumination, facial expression and detail, and noise in images. In this paper, we demonstrate that face recognition using support vector machines is sufficiently robust to different kinds of noise, does not require image pre-processing, and can be used with...

  2. Space-efficient path-reporting approximate distance oracles

    DEFF Research Database (Denmark)

    Elkin, Michael; Neiman, Ofer; Wulff-Nilsen, Christian

    2016-01-01

    We consider approximate path-reporting distance oracles, distance labeling and labeled routing with extremely low space requirements, for general undirected graphs. For distance oracles, we show how to break the n log n space bound of Thorup and Zwick if approximate paths rather than distances need...
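    A path-reporting distance oracle can be illustrated with a simple landmark scheme. This is purely for intuition and carries far weaker guarantees than the Thorup-Zwick construction discussed in the paper: precompute BFS trees from a few landmarks, answer a u-v query with the best landmark's triangle-inequality bound, and stitch the two tree paths together. All names are invented, and the graph is assumed connected and unweighted.

```python
from collections import deque

def bfs_tree(adj, src):
    """Distances and parent pointers of a BFS tree rooted at src."""
    dist, parent = {src: 0}, {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v], parent[v] = dist[u] + 1, u
                q.append(v)
    return dist, parent

class LandmarkOracle:
    """Answer u-v queries with an upper-bound distance and an actual
    path routed through the best landmark."""
    def __init__(self, adj, landmarks):
        self.trees = {m: bfs_tree(adj, m) for m in landmarks}

    def query(self, u, v):
        best = min(self.trees,
                   key=lambda m: self.trees[m][0][u] + self.trees[m][0][v])
        dist, parent = self.trees[best]
        up = []                      # walk u up to the landmark
        x = u
        while x is not None:
            up.append(x)
            x = parent[x]
        down = []                    # walk v up to the landmark
        x = v
        while x is not None:
            down.append(x)
            x = parent[x]
        # up ends at the landmark; reverse down and drop its duplicate copy.
        return dist[u] + dist[v], up + down[::-1][1:]

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}  # a path graph
oracle = LandmarkOracle(adj, landmarks=[0])
print(oracle.query(0, 3))  # (3, [0, 1, 2, 3]) -- exact when u is a landmark
```

The reported distance is only an upper bound (at most twice the true distance plus the landmark error), which is the trade-off such space-efficient oracles formalize.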

  3. Developing web applications with Oracle ADF essentials

    CERN Document Server

    Vesterli, Sten E

    2013-01-01

    Developing Web Applications with Oracle ADF Essentials covers the basics of Oracle ADF and then works through more complex topics such as debugging and logging features and JAAS Security in JDeveloper as the reader gains more skills. This book will follow a tutorial approach, using a practical example, with the content and tasks getting harder throughout. "Developing Web Applications with Oracle ADF Essentials" is for you if you want to build modern, user-friendly web applications for all kinds of data gathering, analysis, and presentations. You do not need to know any advanced HTML or JavaScript...

  4. Oracle ADF enterprise application development made simple

    CERN Document Server

    Vesterli, Sten E

    2014-01-01

    This book is written in an easy-to-understand style, following an enterprise development process through all the phases of development and deployment. Concepts are illustrated with real-world examples and the methods used are explained step-by-step. This book is for Oracle developers looking to start using Oracle's latest development tool and J2EE developers looking for a more productive way to build modern web applications. This book will guide you through the creation of a successful enterprise application with Oracle ADF 12c, and therefore it assumes you have basic knowledge of Java, JDeveloper...

  5. AC machine control : robust and sensorless control by parameter independency

    Energy Technology Data Exchange (ETDEWEB)

    Samuelsen, Dag Andreas Hals

    2009-06-15

    In this thesis it is first presented how robust control can be used to give AC motor drive systems competitive dynamic performance under parameter variations. These variations are common to all AC machines and result from temperature changes in the machine and imperfect machine models. This robust control is, however, dependent on sensor operation in the sense that the rotor position is needed in the control loop. Elimination of this sensor dependence has been for many years, and still is, a main research area of AC machine control systems. An integrated PWM modulator and sampler unit has been developed and tested. The sampler unit is able to give current and voltage measurements with a reduced noise component. It is further used to give the true derivative of currents and voltages in the machine and the power converter, both as an average over a PWM period and as separate values for each state of the power converter. In this way, it can measure the currents as well as their derivatives at the start and at the end of a single power inverter state. This gives a large degree of freedom in parameter and state identification during uninterrupted operation of the induction machine. The special measurement scheme achieved three main goals: by avoiding the time frame where the transistors commutate and the noise in the current measurement is large, filtering of the current measurement is no longer needed; the true derivative of the current in the machine can be measured with far fewer noise components; and this was extended to give any separate derivative in all three switching states of the power converter. Using the computational resources of the FPGA, more advanced information was supplied to the control system in order to facilitate sensorless operation with low computational demands on the DSP. As shown in the papers, this extra information was first used to estimate some of the states of the machine, in some or all of the...

  6. Study on managing EPICS database using ORACLE

    International Nuclear Information System (INIS)

    Liu Shu; Wang Chunhong; Zhao Jijiu

    2007-01-01

    EPICS is used as a development toolkit of the BEPCII control system. The core of EPICS is a distributed database residing in front-end machines. The distributed database is usually created by tools such as VDCT or a text editor on the host, then loaded to front-end target IOCs through the network. In the BEPCII control system there are about 20,000 signals, distributed over more than 20 IOCs. All the databases are developed by device control engineers using VDCT or a text editor; there are no uniform tools providing transparent management. The paper first presents the current status of EPICS database management in many labs. Second, it studies the EPICS database and the interface between ORACLE and the EPICS database. Finally, it introduces the software development and its application in the BEPCII control system. (authors)

  7. Oracle Database 12c backup and recovery survival guide

    CERN Document Server

    Alvarez, Francisco Munoz

    2013-01-01

    The book follows a tutorial-based approach, covering all the best practices for backup and recovery. The book starts by introducing readers to the world of backup and recovery, then moves on to teach them the new features offered by Oracle 12c. The book is full of useful tips and best practices that are essential for any DBA to perform backup and recovery operations in an organization. This book is designed for Oracle DBAs and system administrators; the reader should have basic working experience of administering Oracle databases.

  8. Oracle Application Express 5 for beginners a practical guide to rapidly develop data-centric web applications accessible from desktop, laptops, tablets, and smartphones

    CERN Document Server

    2015-01-01

    Oracle Application Express has taken another big leap towards becoming a true next generation RAD tool. It has entered into its fifth version to build robust web applications. One of the most significant features in this release is a new page designer that helps developers create and edit page elements within a single page design view, which enormously maximizes developer productivity. Without involving the audience too much into the boring bits, this full-color edition adopts an inspiring approach that helps beginners practically evaluate almost every feature of Oracle Application Express, including all features new to version 5. The most convincing way to explore a technology is to apply it to a real world problem. In this book, you’ll develop a sales application that demonstrates almost every feature to practically expose the anatomy of Oracle Application Express 5. The short list below presents some main topics of Oracle APEX covered in this book: Rapid web application development for desktops, la...

  9. CERN pushes the envelope with Oracle9i database

    CERN Multimedia

    2001-01-01

    Oracle Corp. today announced that unique capabilities in Oracle9i Database are helping CERN, the European Organization for Nuclear Research in Geneva. The LHC project will generate petabytes of data - an amount well beyond the capability of any relational database technology today. CERN is developing a new route in data management and analysis using Oracle9i Real Application Cluster technology.

  10. Oracle APEX 4.2 reporting

    CERN Document Server

    Pathak, Vishal

    2013-01-01

    Oracle APEX 4.2 Reporting is a practical tutorial for intermediate to advanced use, with plenty of step-by-step instructions and business scenarios for understanding and implementing the ins and outs of making reports. "Oracle APEX 4.2 Reporting" is for you if you design or develop advanced solutions in APEX or wish to know about the advanced features of APEX. If you wish to have a 360-degree view of reporting technologies or work in a complex heterogeneous enterprise, this is a must-have.

  11. Oracle application express 5.1 basics and beyond a practical guide to rapidly develop data-centric web applications accessible from desktop, laptops, tablets, and smartphones

    CERN Document Server

    2017-01-01

    You will find stuff about workspace, application, page, and so on in every APEX book. But this book is unique because the information it contains is not available anywhere else! Unlike other books, it adopts a stimulating approach to reveal almost every feature necessary for the beginners of Oracle APEX and also takes them beyond the basics. As a technology enthusiast I write on a variety of new technologies, but writing books on Oracle Application Express is my passion. The blood pumping comments I get from my readers on Amazon (and in my inbox) are the main forces that motivate me to write a book whenever a new version of Oracle APEX is launched. This is my fifth book on Oracle APEX (and the best so far) written after discovering the latest 5.1 version. As usual, I’m sharing my personal learning experience through this book to expose this unique rapid web application development platform. In Oracle Application Express you can build robust web applications. The new version is launched with some more prol...

  12. Oracle Data Integrator 11g cookbook

    CERN Document Server

    Dupupet, Christophe; Gray, Denis; Testut, Julien

    2013-01-01

    Written as a practical Cookbook, the recipes in this essential guide will help you make the most out of Oracle Data Integrator 11g. This book is meant for people who already possess a basic understanding of Oracle Data Integrator and want to take it to the next level by learning how to better leverage advanced ODI features and functionality as they continue to develop and manage their data integration projects.

  13. Oracle Application Express 4 Recipes

    CERN Document Server

    Zehoo, Edmund

    2011-01-01

    Oracle Application Express 4 Recipes provides an example-based approach to learning Application Express - the ground-breaking, rapid application development platform included with every Oracle Database license. The recipes format is ideal for the quick-study who just wants a good example or two to kick start their thinking and get pointed in the right direction. The recipes cover the gamut of Application Express development. Author and Application Express expert Edmund Zehoo shows how to create data entry screens, visualize data in the form of reports and charts, implement validation and back-

  14. Oracle Wiener filtering of a Gaussian signal

    NARCIS (Netherlands)

    Babenko, A.; Belitser, E.

    2011-01-01

    We study the problem of filtering a Gaussian process whose trajectories, in some sense, have an unknown smoothness β0 from the white noise of small intensity ε. If we knew the parameter β0, we would use the Wiener filter which has the meaning of oracle. Our goal is now to mimic the oracle, i.e.,

  15. Oracle Wiener filtering of a Gaussian signal

    NARCIS (Netherlands)

    Babenko, A.; Belitser, E.N.

    2011-01-01

    We study the problem of filtering a Gaussian process whose trajectories, in some sense, have an unknown smoothness β0 from the white noise of small intensity ε. If we knew the parameter β0, we would use the Wiener filter which has the meaning of oracle. Our goal is now to mimic the oracle, i.e.,

  16. Improvement of the Oracle setup and database design at the Heidelberg ion therapy center

    International Nuclear Information System (INIS)

    Hoeppner, K.; Haberer, T.; Mosthaf, J.M.; Peters, A.; Thomas, M.; Welde, A.; Froehlich, G.; Juelicher, S.; Schaa, V. R.W.; Schiebel, W.; Steinmetz, S.

    2012-01-01

    The HIT (Heidelberg Ion Therapy) center is an accelerator facility for cancer therapy using both carbon ions and protons, located at the university hospital in Heidelberg. It provides three therapy treatment rooms: two with fixed beam exit (both in clinical use), and a unique gantry with a rotating beam head, currently under commissioning. The backbone of the proprietary accelerator control system consists of an Oracle database running on a Windows server, storing and delivering data of beam cycles, error logging, measured values, and the device parameters and beam settings for about 100,000 combinations of energy, beam size and particle rate used in treatment plans. Since going operational, we found some performance problems with the current database setup. Thus, we started an analysis that focused on the following topics: hardware resources of the database server, configuration of the Oracle instance, and a review of the database design that underwent several changes since its original design. The analysis revealed issues in all of these areas. The outdated server will be replaced by a state-of-the-art machine soon. We will present improvements of the Oracle configuration, the optimization of SQL statements, and the performance tuning of the database design by adding new indexes, which proved directly visible in accelerator operation, while data integrity was improved by additional foreign key constraints. (authors)
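    The two tuning measures named in this abstract, adding an index and adding a foreign key constraint, can be sketched generically. The snippet below uses Python's built-in SQLite module as a stand-in for Oracle, and the table and column names are invented for illustration only.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
con.execute("CREATE TABLE beam_setting (id INTEGER PRIMARY KEY, energy REAL)")
con.execute("""CREATE TABLE cycle (
                 id INTEGER PRIMARY KEY,
                 setting_id INTEGER REFERENCES beam_setting(id))""")
con.execute("INSERT INTO beam_setting VALUES (1, 430.1)")
con.execute("INSERT INTO cycle VALUES (1, 1)")

# A new index turns full scans over the lookup column into index seeks.
con.execute("CREATE INDEX idx_cycle_setting ON cycle(setting_id)")
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM cycle WHERE setting_id = 1").fetchall()
print(plan)  # the plan now reports a search using idx_cycle_setting

# The foreign key protects integrity: orphan rows are rejected.
try:
    con.execute("INSERT INTO cycle VALUES (2, 999)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```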

  17. Getting started with Oracle WebLogic Server 12c developer's guide

    CERN Document Server

    Nunes, Fabio Mazanatti

    2013-01-01

    Getting Started with Oracle WebLogic Server 12c is a fast-paced and feature-packed book, designed to get you working with Java EE 6, JDK 7 and Oracle WebLogic Server 12c straight away, so you can start developing your own applications. Getting Started with Oracle WebLogic Server 12c: Developer's Guide is written for developers who are just getting started with Java EE, or who have some experience with it, and who want to learn how to develop for and use Oracle WebLogic Server. Getting Started with Oracle WebLogic Server 12c: Developer's Guide also provides a great overview of the updated features of the 12c release

  19. Expert Oracle RAC 12c

    CERN Document Server

    Shamsudeen, Riyaj; Yu, Kai; Farooq, Tariq

    2013-01-01

    Expert Oracle RAC 12c is a hands-on book helping you understand and implement Oracle Real Application Clusters (RAC), and to reduce the total-cost-of-ownership (TCO) of a RAC database. As a seasoned professional, you are probably aware of the importance of understanding the technical details behind the RAC stack. This book provides deep understanding of RAC concepts and implementation details that you can apply toward your day-to-day operational practices. You'll be guided in troubleshooting and avoiding trouble in your installation. Successful RAC operation hinges upon a fast-performing network

  20. Machine learning for inverse lithography: using stochastic gradient descent for robust photomask synthesis

    International Nuclear Information System (INIS)

    Jia, Ningning; Lam, Edmund Y

    2010-01-01

    Inverse lithography technology (ILT) synthesizes photomasks by solving an inverse imaging problem through optimization of an appropriate functional. Much effort on ILT is dedicated to deriving superior masks at a nominal process condition. However, the lower k_1 factor causes the mask to be more sensitive to process variations. Robustness to major process variations, such as focus and dose variations, is desired. In this paper, we consider the focus variation as a stochastic variable, and treat the mask design as a machine learning problem. The stochastic gradient descent approach, which is a useful tool in machine learning, is adopted to train the mask design. Compared with previous work, simulation shows that the proposed algorithm is effective in producing robust masks.
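
    As a sketch of the idea (with a made-up one-parameter imaging model, not the paper's lithography simulator), stochastic gradient descent over a random focus sample per iteration looks like this:

```python
import random

# Toy illustration (not the paper's actual model): the mask is a single
# scalar parameter m, and the printed intensity depends on a random focus
# variation f through an assumed blur factor (1 - 0.1 * f**2).
# We minimize the expected squared error against a target intensity by
# stochastic gradient descent, sampling a fresh focus value per step.

def sgd_robust_mask(target=1.0, steps=5000, lr=0.05, seed=0):
    rng = random.Random(seed)
    m = 0.0
    for _ in range(steps):
        f = rng.gauss(0.0, 1.0)       # stochastic focus variation
        blur = 1.0 - 0.1 * f * f      # assumed simplistic imaging model
        err = m * blur - target       # residual at this focus sample
        grad = 2.0 * err * blur       # d/dm of (m*blur - target)**2
        m -= lr * grad
    return m

m = sgd_robust_mask()
# m settles near target * E[blur] / E[blur^2], i.e. roughly 1.08 here,
# rather than the value 1.0 that is optimal at nominal focus f = 0.
```

    Each step draws a fresh focus value, so the minimizer balances the squared error across the focus distribution instead of optimizing only at the nominal condition.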

  1. New series of ORACLE tutorials, March-June 2006

    CERN Multimedia

    Catherine Delamare

    2006-01-01

    The IT DES Oracle Support team is pleased to announce the new series of Oracle tutorials with the proposed schedule: Thursday 30 March - Design - Arash Khodabandeh Thursday 20 April - SQL I - Eva Dafonte Perez Thursday 27 April - SQL II - Lucia Moreno Lopez Thursday 4 May - Architecture - Montse Collados Thursday 11 May - Tuning - Michal Kwiatek Thursday 1 June - PL/SQL I - Eva Dafonte Perez Thursday 8 June - PL/SQL II - Nilo Segura Thursday 15 June - Oracle Tools and Bindings with languages - Eric Grancher, Nilo Segura These tutorials will take place in the IT Auditorium (bldg. 31/3-004) starting at 10:00. The average duration will be 1 hour plus time for questions. There is no need to register in advance. You can access the previous 2002-2003 sessions at http://it-des.web.cern.ch/IT-DES/DIS/oracle/tutorials.html If you need more information, please contact Catherine.Delamare@cern.ch

  2. On the robustness of bucket brigade quantum RAM

    Science.gov (United States)

    Arunachalam, Srinivasan; Gheorghiu, Vlad; Jochym-O'Connor, Tomas; Mosca, Michele; Varshinee Srinivasan, Priyaa

    2015-12-01

    We study the robustness of the bucket brigade quantum random access memory model introduced by Giovannetti et al (2008 Phys. Rev. Lett. 100 160501). Due to a result of Regev and Schiff (ICALP ’08 733), we show that for a class of error models the error rate per gate in the bucket brigade quantum memory has to be of order o(2^(-n/2)) (where N = 2^n is the size of the memory) whenever the memory is used as an oracle for the quantum searching problem. We conjecture that this is the case for any realistic error model that will be encountered in practice, and that for algorithms with super-polynomially many oracle queries the error rate must be super-polynomially small, which further motivates the need for quantum error correction. By contrast, for algorithms such as matrix inversion Harrow et al (2009 Phys. Rev. Lett. 103 150502) or quantum machine learning Rebentrost et al (2014 Phys. Rev. Lett. 113 130503) that only require a polynomial number of queries, the error rate only needs to be polynomially small and quantum error correction may not be required. We introduce a circuit model for the quantum bucket brigade architecture and argue that quantum error correction for the circuit causes the quantum bucket brigade architecture to lose its primary advantage of a small number of ‘active’ gates, since all components have to be actively error corrected.

  3. Non-Mechanism in Quantum Oracle Computing

    OpenAIRE

    Castagnoli, Giuseppe

    1999-01-01

    A typical oracle problem is finding which software program is installed on a computer, by running the computer and testing its input-output behaviour. The program is randomly chosen from a set of programs known to the problem solver. As is well known, some oracle problems are solved more efficiently by using quantum algorithms; this naturally implies changing the computer to quantum, while the choice of the software program remains sharp. In order to highlight the non-mechanistic origin of this ...

  4. Machine learning meliorates computing and robustness in discrete combinatorial optimization problems.

    Directory of Open Access Journals (Sweden)

    Fushing Hsieh

    2016-11-01

    Discrete combinatorial optimization problems in the real world are typically defined via an ensemble of potentially high-dimensional measurements pertaining to all subjects of a system under study. We point out that such a data ensemble in fact embeds the system's information content, which is not directly used in defining the combinatorial optimization problems. Can machine learning algorithms extract this information content and make combinatorial optimization tasks more efficient? Would such algorithmic computations bring new perspectives into this classic topic of Applied Mathematics and Theoretical Computer Science? We show that the answers to both questions are positive. One key reason is permutation invariance: the data ensemble of subjects' measurement vectors is permutation invariant when it is represented through a subject-vs-measurement matrix. An unsupervised machine learning algorithm, called Data Mechanics (DM), is applied to find optimal permutations on the row and column axes such that the permuted matrix reveals coupled deterministic and stochastic structures as the system's information content. The deterministic structures are shown to facilitate a geometry-based divide-and-conquer scheme that helps the optimization task, while the stochastic structures are used to generate an ensemble of mimicries retaining the deterministic structures, which then reveal the robustness of the original optimal solution. Two simulated systems, the Assignment problem and the Traveling Salesman problem, are considered. Beyond demonstrating computational advantages and intrinsic robustness in the two systems, we propose brand new robust optimal solutions. We believe such robust versions of optimal solutions are potentially more realistic and practical in real-world settings.

  5. Use of ORACLE in a scientific environment

    International Nuclear Information System (INIS)

    Carey, R.W.; Auerbach, J.M.; Lerche, R.A.; Demartini, B.J.

    1983-01-01

    This paper discusses the use of ORACLE at the Fusion Experiments Analysis Facility (FEAF) for the laser program of the Lawrence Livermore National Laboratory. The mission of this VAX based computing facility is to help laser program scientists and engineers develop their understanding of inertial confinement fusion target behavior. We have incorporated the ORACLE DBMS as a major part of an integrated data management and analysis environment for accomplishing this task. We discuss our use of ORACLE through all phases of data processing, from raw digital forms to final physics summary data. Applications include: an information management tool for maintaining large amounts of one- and two-dimensional data, a configuration management tool for experiment setup information, and a data analysis tool for maintaining calibration and sensor response data

  6. Expert Oracle Application Express Plug-Ins

    CERN Document Server

    D'Souza, Martin Giffy

    2011-01-01

    Expert Oracle Application Express Plugins is your "go to" book on the groundbreaking plugin architecture introduced in Oracle Application Express 4.0. Using the new APEX functionality, you can create well-packaged, documented, reusable components and reliably leverage your coding investments across many applications. Components you create can define new item and region types, specify validation processes, and present dynamic actions to client applications. You can design innovative and colorful ways to display information, such as displaying the temperature using an image of a thermometer, or

  7. OCA Oracle Database 11g database administration I : a real-world certification guide

    CERN Document Server

    Ries, Steve

    2013-01-01

    Developed as a practical book, "Oracle Database 11g Administration I Certification Guide" will show you all you need to know to effectively excel at being an Oracle DBA, for both examinations and the real world. This book is for anyone who needs the essential skills to become an Oracle DBA, pass the Oracle Database Administration I exam, and use those skills in the real world to manage secure, high performance, and highly available Oracle databases.

  8. Oracle E-Business Suite Financials R12 A Functionality Guide

    CERN Document Server

    Iyer, Mohan

    2012-01-01

    This is a step-by-step functional guide to get you started easily with Oracle EBS Financials. If you are an Oracle E-Business Suite Financial consultant or an administrator looking to get a quick review of the capabilities of Oracle E-Business Suite and improve your use of the system's functionality, then this is the best guide for you. This book assumes that you have a fundamental knowledge of EBS Suite.

  9. Nonlinear decentralized robust governor control for hydroturbine-generator sets in multi-machine power systems

    Energy Technology Data Exchange (ETDEWEB)

    Qiang Lu; Yusong Sun; Yuanzhang Sun [Tsinghua University, Beijing (China). Dept. of Electrical Engineering; Felix F Wu; Yixin Ni [University of Hong Kong (China). Dept. of Electrical and Electronic Engineering; Yokoyama, Akihiko [University of Tokyo (Japan). Dept. of Electrical Engineering; Goto, Masuo; Konishi, Hiroo [Hitachi Ltd., Tokyo (Japan). Power System Div.

    2004-06-01

    A novel nonlinear decentralized robust governor control for hydroturbine-generator sets in multi-machine power systems is suggested in this paper. The nonelastic water hammer effect and disturbances are considered in the modeling. The advanced differential geometry theory, nonlinear robust control theory and the dynamic feedback method are combined to solve the problem. The nonlinear decentralized robust control law for the speed governor of hydroturbine-generators has been derived. The input signals to the proposed controller are all local measurements and independent of the system parameters. The derived control law guarantees the integrated system stability with disturbance attenuation, which is significant for real power system application. Computer tests on an 8-machine, 36-bus power system show clearly the effectiveness of the new control strategy in transient stability enhancement and disturbance attenuation. The computer test results based on the suggested controller compare favorably with those based on conventional linear governor control. (author)

  10. QUERY RESPONSE TIME COMPARISON NOSQLDB MONGODB WITH SQLDB ORACLE

    Directory of Open Access Journals (Sweden)

    Humasak T. A. Simanjuntak

    2015-01-01

    Data storage today comes in two kinds: relational databases and non-relational databases. These two kinds of DBMS (Database Management System) differ in various aspects such as query execution performance, scalability, reliability and data storage structure. This study aims to compare the performance of Oracle, as a relational database, and MongoDB, as a non-relational database, in processing structured data. An experiment was conducted to compare the performance of the two DBMSs for insert, select, update and delete operations, using both simple and complex queries on the Northwind database. To achieve the goal of the experiment, 18 queries were executed, consisting of 2 insert queries, 10 select queries, 2 update queries and 2 delete queries. The queries were executed through a .Net application built as an intermediary between the user and the database. The experiment was performed on tables with and without relations in Oracle, and on embedded and non-embedded documents in MongoDB. The response time of each query execution was compared using statistical methods. The experiment shows that query response times for select, insert and update operations are faster in MongoDB than in Oracle: MongoDB is 64.8% faster for select queries, 72.8% faster for insert queries and 33.9% faster for update queries. For delete queries, Oracle is 96.8% faster than MongoDB on tables with relations, but MongoDB is 83.8% faster than Oracle on tables without relations. For complex queries, Map Reduce in MongoDB is 97.6% slower than the equivalent complex query with aggregate functions in Oracle.
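
    To illustrate the structural difference the experiment probes (hypothetical toy data, not the study's Northwind workload or its .Net harness), the same lookup can be run against a relational shape and an embedded-document shape:

```python
# Relational shape: two "tables" linked by a foreign key, as in Oracle.
customers = [{"id": 1, "name": "ACME"}]
orders = [{"id": 10, "customer_id": 1, "total": 99.0}]

def orders_for_customer_relational(name):
    ids = {c["id"] for c in customers if c["name"] == name}
    return [o for o in orders if o["customer_id"] in ids]  # join step

# Embedded shape: orders nested inside the customer document, as in MongoDB.
customers_embedded = [
    {"id": 1, "name": "ACME", "orders": [{"id": 10, "total": 99.0}]}
]

def orders_for_customer_embedded(name):
    for c in customers_embedded:
        if c["name"] == name:
            return c["orders"]  # no join needed
    return []

assert orders_for_customer_relational("ACME")[0]["total"] == 99.0
assert orders_for_customer_embedded("ACME")[0]["total"] == 99.0
```

    The embedded version answers reads without a join, while the relational version must match keys across tables; this structural difference is one reason the two systems trade wins across the measured operations.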

  11. Get the best out of Oracle 12.2 partitioning features

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    With the newest Oracle database version 12.2, DB administrators and developers are interested in what new partitioning features are built into the DB engine, what benefits in terms of performance and maintenance they bring, and what the recommendations are for their future usage. About the speaker: Thomas Teske works at Oracle Switzerland as Business Development Manager for technology. He gained his experience while working with all releases since Oracle 5. During his 20+ years with O...

  12. Oracle and storage IOs, explanations and experience at CERN

    CERN Document Server

    Grancher, E

    2010-01-01

    The Oracle database system is used extensively in the High Energy Physics community. Critical to the efficient running of these databases is the storage subsystem, and over the years Oracle has introduced new ways to access and manage this storage, e.g. ASM (10.1), Direct NFS (11.1), and Exadata (11.1). This paper presents our experience over the past few years with the different storage access and management features, and gives a comparison of each functionality. Also compared are the different solutions used at CERN, and the Tier 1 sites for storing Oracle databases.

  13. Oracle PLSQL Programming A Developer's Workbook

    CERN Document Server

    Feuerstein, Steven

    2008-01-01

    However excellent they are, most computer books are inherently passive--readers simply take in text without having any opportunity to react to it. The Oracle PL/SQL Developer's Workbook is a different kind of animal! It's designed to engage you actively, to get you solving programming problems immediately, and to help you apply what you've learned about PL/SQL--and in the process deepen your knowledge of the language. By tackling the exercises in this workbook, you'll find yourself moving more rapidly along the learning curve to join the growing ranks of PL/SQL experts. The Oracle PL/SQL

  14. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    Science.gov (United States)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the currently available encryption algorithms in the Oracle database, as well as a proposed upgraded encryption algorithm, which consists of 4 steps. In conclusion, we present an analysis of the password encryption of Oracle Database.

  15. Lower bound for the oracle projection posterior convergence rate

    NARCIS (Netherlands)

    Babenko, A.; Belitser, E.N.

    2011-01-01

    In Babenko and Belitser (2010), a new notion of the posterior concentration rate is proposed, the so-called oracle risk rate: the best possible rate over an appropriately chosen family of estimators, which is a local quantity (as compared, e.g., with global minimax rates). The program of oracle

  16. Oracle JDeveloper 11gR2 Cookbook

    CERN Document Server

    Haralabidis, Nick

    2012-01-01

    "Oracle JDeveloper 11gR2 Cookbook" is a practical cookbook which goes beyond the basics with immediately applicable recipes for building ADF applications at an intermediate-to-advanced level. If you are a Java EE developer who wants to go beyond the basics of building ADF applications with Oracle JDeveloper 11gR2 and get hands-on with practical recipes, this book is for you. You should be comfortable with general Java development principles, the JDeveloper IDE, and ADF basics

  17. Integration of Oracle and Hadoop: Hybrid Databases Affordable at Scale

    Science.gov (United States)

    Canali, L.; Baranowski, Z.; Kothuri, P.

    2017-10-01

    This work reports on the activities aimed at integrating Oracle and Hadoop technologies for the use cases of CERN database services, and in particular on the development of solutions for offloading data and queries from Oracle databases into Hadoop-based systems. The goal and interest of this investigation is to increase the scalability and optimize the cost/performance footprint for some of our largest Oracle databases. These concepts have been applied, among others, to build offline copies of CERN accelerator controls and logging databases. The tested solution allows reports to be run on the controls data offloaded into Hadoop without affecting the critical production database, providing both performance benefits and cost reduction for the underlying infrastructure. Other use cases discussed include building hybrid database solutions with Oracle and Hadoop, offering the combined advantages of a mature relational database system with a scalable analytics engine.

  18. Oracle SOA Suite 11g performance cookbook

    CERN Document Server

    Brasier, Matthew; Wright, Nicholas

    2013-01-01

    This is a Cookbook with interesting, hands-on recipes, giving detailed descriptions and lots of practical walkthroughs for boosting the performance of your Oracle SOA Suite.This book is for Oracle SOA Suite 11g administrators, developers, and architects who want to understand how they can maximise the performance of their SOA Suite infrastructure. The recipes contain easy to follow step-by-step instructions and include many helpful and practical tips. It is suitable for anyone with basic operating system and application server administration experience.

  19. Expert Oracle database architecture Oracle database programming 9i, 10g, and 11g : Techniques and solution

    CERN Document Server

    Kyte, Thomas

    2010-01-01

    Now in its second edition, this best-selling book by Tom Kyte of Ask Tom fame continues to bring you some of the best thinking on how to apply Oracle Database to produce scalable applications that perform well and deliver correct results. Tom has a simple philosophy: you can treat Oracle as a black box and just stick data into it or you can understand how it works and exploit it as a powerful computing environment. If you choose the latter, then you'll find that there are few information management problems that you cannot solve quickly and elegantly. This fully revised second edition covers t

  20. OrChem - An open source chemistry search engine for Oracle(R).

    Science.gov (United States)

    Rijnbeek, Mark; Steinbeck, Christoph

    2009-10-22

    Registration, indexing and searching of chemical structures in relational databases is one of the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times in the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net.

  1. Oracle9i database gains traction among leading European customers

    CERN Multimedia

    2002-01-01

    The Oracle Corporation software group, today announced that more than 600 customers worldwide have purchased Oracle9i Real Application Clusters. Some of the European customers include Austrian Railways, BACS, Bavarian Police Force, CERN, e-Spatial, Navision a/s and the North Rhine-Westphalia Police Force.

  2. Robust Matching Pursuit Extreme Learning Machines

    Directory of Open Access Journals (Sweden)

    Zejian Yuan

    2018-01-01

    Extreme learning machine (ELM) is a popular learning algorithm for single hidden layer feedforward networks (SLFNs). It was originally proposed with inspiration from biological learning, and has attracted massive attention due to its adaptability to various tasks, fast learning ability and efficient computational cost. As an effective sparse representation method, the orthogonal matching pursuit (OMP) method can be embedded into ELM to overcome the singularity problem and improve stability. Usually OMP recovers a sparse vector by minimizing a least squares (LS) loss, which is efficient for Gaussian distributed data but may suffer performance deterioration in the presence of non-Gaussian data. To address this problem, a robust matching pursuit method based on a novel kernel risk-sensitive loss (in short, KRSLMP) is first proposed in this paper. The KRSLMP is then applied to ELM to solve for the sparse output weight vector, and the new method, named KRSLMP-ELM, is developed for SLFN learning. Experimental results on synthetic and real-world data sets confirm the effectiveness and superiority of the proposed method.
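
    As a minimal sketch of the greedy selection at the heart of OMP (pure Python; an orthonormal dictionary is assumed so the least-squares coefficient reduces to an inner product, whereas a general implementation re-solves least squares over all selected atoms each iteration):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def omp(y, atoms, sparsity):
    """Greedily pick `sparsity` atoms that best explain y (orthonormal atoms)."""
    residual = list(y)
    coeffs = {}
    for _ in range(sparsity):
        # select the atom most correlated with the current residual
        best = max(range(len(atoms)), key=lambda j: abs(dot(residual, atoms[j])))
        c = dot(residual, atoms[best])  # LS coefficient in the orthonormal case
        coeffs[best] = coeffs.get(best, 0.0) + c
        # subtract the explained component from the residual
        residual = [r - c * a for r, a in zip(residual, atoms[best])]
    return coeffs

# y = 2*e0 - 3*e2 in R^4, with the standard basis as the dictionary
atoms = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
y = [2.0, 0.0, -3.0, 0.0]
print(omp(y, atoms, 2))  # → {2: -3.0, 0: 2.0}
```

    The KRSLMP variant described above replaces the least-squares residual criterion with a kernel risk-sensitive loss, which downweights non-Gaussian outliers during this selection.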

  3. Oracle Application Integration Architecture (AIA) Foundation Pack 11gR1 Essentials

    CERN Document Server

    Ganesarethinam, Hariharan V

    2012-01-01

    This book is written in simple, easy to understand format with lots of screenshots and step-by-step explanations. If you are a Business Analyst, Integration Architect or Developer, working in Oracle applications integration, looking forward to understanding Oracle AIA fundamentals and development practice, then this is the best guide for you. This book assumes that you have a fundamental knowledge of Oracle SOA Suite and its components.

  4. A cognitive network for oracle bone characters related to animals

    Science.gov (United States)

    Dress, Andreas; Grünewald, Stefan; Zeng, Zhenbing

    2016-01-01

    In this paper, we present an analysis of oracle bone characters for animals from a “cognitive” point of view. After some general remarks on oracle-bone characters presented in Sec. 1 and a short outline of the paper in Sec. 2, we collect various oracle-bone characters for animals from published resources in Sec. 3. In the next section, we begin analyzing a group of 60 ancient animal characters from www.zdic.net, a highly acclaimed internet dictionary of Chinese characters that is strictly based on historical sources, and introduce five categories of specific features regarding their (graphical) structure that will be used in Sec. 5 to associate corresponding feature vectors to these characters. In Sec. 6, these feature vectors will be used to investigate their dissimilarity in terms of a family of parameterized distance measures. And in the last section, we apply the SplitsTree method as encoded in the NeighborNet algorithms to construct a corresponding family of dissimilarity-based networks with the intention of elucidating how the ancient Chinese might have perceived the “animal world” in the late bronze age and to demonstrate that these pictographs reflect an intuitive understanding of this world and its inherent structure that predates its classification in the oldest surviving Chinese encyclopedia from approximately the third century BC, the Er Ya, as well as similar classification systems in the West by one to two millennia. We also present an English dictionary of 70 oracle bone characters for animals in Appendix A. In Appendix B, we list various variants of animal characters that were published in the Jia Gu Wen Bian (cf. 甲骨文编, A Complete Collection of Oracle Bone Characters, edited by the Institute of Archaeology of the Chinese Academy of Social Sciences, published by the Zhonghua Book Company in 1965). We recall the frequencies of the 521 most frequent oracle bone characters in Appendix C as reported in [T. Chen, Yin-Shang Jiaguwen Zixing

  5. Oracle Hyperion Interactive Reporting 11 Expert Guide

    CERN Document Server

    Cody, Edward J

    2011-01-01

    This book is written in a simple, easy to understand format with screenshots, code samples, and step-by-step explanations that will guide you through the advanced techniques used by the experts. If you are an Oracle Hyperion Interactive reporting user or developer looking to become an expert in the product, then this book is for you. You will require a basic knowledge of Interactive Reporting, as this book starts with a brief overview and then dives into advanced techniques, functions, and best practices. Beginner users should consult The Business Analyst's Guide to Oracle Hyperion Interactive

  6. Approximate distance oracles for planar graphs with improved query time-space tradeoff

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    2016-01-01

    We consider approximate distance oracles for edge-weighted n-vertex undirected planar graphs. Given fixed ϵ > 0, we present a (1 + ϵ)-approximate distance oracle with O(n (log log n)^2) space and O((log log n)^3) query time. This improves the previous best product of query time and space of the oracles of Thorup (FOCS 2001, J. ACM 2004) and Klein (SODA 2002) from O(n log n) to O(n (log log n)^5).

  7. Active learning for noisy oracle via density power divergence.

    Science.gov (United States)

    Sogawa, Yasuhiro; Ueno, Tsuyoshi; Kawahara, Yoshinobu; Washio, Takashi

    2013-10-01

    The accuracy of active learning is critically influenced by the existence of noisy labels given by a noisy oracle. In this paper, we propose a novel pool-based active learning framework through robust measures based on density power divergence. By minimizing density power divergence, such as β-divergence and γ-divergence, one can estimate the model accurately even under the existence of noisy labels within data. Accordingly, we develop query selecting measures for pool-based active learning using these divergences. In addition, we propose an evaluation scheme for these measures based on asymptotic statistical analyses, which enables us to perform active learning by evaluating an estimation error directly. Experiments with benchmark datasets and real-world image datasets show that our active learning scheme performs better than several baseline methods. Copyright © 2013 Elsevier Ltd. All rights reserved.
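
    For reference, the density power divergence family of Basu et al., to which the β-divergence mentioned above belongs (the notation here is an assumption, not taken from this abstract), is

        d_β(g, f) = ∫ { f(x)^(1+β) − (1 + 1/β) g(x) f(x)^β + (1/β) g(x)^(1+β) } dx,  β > 0,

    which approaches the Kullback-Leibler divergence as β → 0; larger β downweights observations that are unlikely under the model, which is the source of the robustness to noisy labels.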

  8. Developing of database on nuclear power engineering and purchase of ORACLE system

    International Nuclear Information System (INIS)

    Liu Renkang

    1996-01-01

    This paper presents a perspective on the development of a database for nuclear power engineering and the performance of the ORACLE database management system. ORACLE is a practical database system worth purchasing

  9. Uses of ORACLE in the Nova Laser Control System

    International Nuclear Information System (INIS)

    McGuigan, D.L.

    1983-01-01

    The Nova Laser System is a large-scale fusion experiment being constructed at the Lawrence Livermore National Laboratory. Modern control system technology is required to efficiently manage the thousands of devices needed to operate the system. In order to reduce the requirements on the operations staff, much of the system is being automated. This requires a significant knowledge base including frequently used system configurations and device parameters. We will be using ORACLE to provide this information to the control system. To insure the control-system integrity, ORACLE will be used to maintain information about the control-system software. This information will be used to document the system as well as help track down problems. ORACLE will also be used to maintain data on the system performance. This data will be analyzed to optimize the laser performance and point out when maintenance is required

  10. ROBUSTNESS AND PREDICTION ACCURACY OF MACHINE LEARNING FOR OBJECTIVE VISUAL QUALITY ASSESSMENT

    OpenAIRE

    Hines, Andrew; Kendrick, Paul; Barri, Adriaan; Narwaria, Manish; Redi, Judith A.

    2014-01-01

    Machine Learning (ML) is a powerful tool to support the development of objective visual quality assessment metrics, serving as a substitute model for the perceptual mechanisms acting in visual quality appreciation. Nevertheless, the reliability of ML-based techniques within objective quality assessment metrics is often questioned. In this study, the robustness of ML in supporting objective quality assessment is investigated, specifically when the feature set adopted for prediction is suboptim...

  11. Boat And Shore Oracle Data Tables

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Oracle Tables To Provide Boat and Shore Data which contains the object of this system is to provide an inventory of vessels that answer two fundamental questions:...

  12. On Combining Language Models: Oracle Approach

    National Research Council Canada - National Science Library

    Hacioglu, Kadri; Ward, Wayne

    2001-01-01

    In this paper, we address the problem of combining several language models (LMs). We find that simple interpolation methods, like log-linear and linear interpolation, improve the performance but fall short of the performance of an oracle...

  13. An evaluation of Oracle for persistent data storage and analysis of LHC physics data

    International Nuclear Information System (INIS)

    Grancher, E.; Marczukajtis, M.

    2001-01-01

    CERN's IT/DB group is currently exploring the possibility of using Oracle to store LHC physics data. This paper presents preliminary results from this work, concentrating on two aspects: the storage of RAW data and the analysis of TAG data. The RAW data part of the study discusses the throughput that one can achieve with the Oracle database system, the options for storing the data, and an estimate of the associated overheads. The TAG data analysis focuses on the use of new and extended indexing features of Oracle to perform efficient cuts on the data. The tests were performed with Oracle 8.1.7.

  14. Automated Oracle database testing

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Ensuring database stability and steady performance in the modern world of agile computing is a major challenge. Changes at any level of the computing infrastructure (OS parameters and packages, kernel versions, database parameters and patches, or even schema changes) can potentially harm production services. This presentation shows how automatic and regular testing of Oracle databases can be achieved in such an agile environment.

  15. Oracle, a novel PDZ-LIM domain protein expressed in heart and skeletal muscle.

    Science.gov (United States)

    Passier, R; Richardson, J A; Olson, E N

    2000-04-01

    In order to identify novel genes enriched in adult heart, we performed a subtractive hybridization for genes expressed in mouse heart but not in skeletal muscle. We identified two alternative splicing variants of a novel PDZ-LIM domain protein, which we named Oracle. Both variants contain a PDZ domain at the amino-terminus and three LIM domains at the carboxy-terminus. Highest homology of Oracle was found with the human and rat enigma proteins in the PDZ domain (62 and 61%, respectively) and in the LIM domains (60 and 69%, respectively). By Northern hybridization analysis, we showed that expression is highest in adult mouse heart, low in skeletal muscle and undetectable in other adult mouse tissues. In situ hybridization in mouse embryos confirmed and extended these data by showing high expression of Oracle mRNA in atrial and ventricular myocardial cells from E8.5. From E9.5 low expression of Oracle mRNA was detectable in myotomes. These data suggest a role for Oracle in the early development and function of heart and skeletal muscle.

  16. The mass remote sensing image data management based on Oracle InterMedia

    Science.gov (United States)

    Zhao, Xi'an; Shi, Shaowei

    2013-07-01

    With the development of remote sensing technology, ever larger volumes of image data are being acquired, and how to apply and manage these mass image data safely and efficiently has become an urgent problem to be solved. According to the methods and characteristics of mass remote sensing image data management and application, this paper puts forward a new method that uses the Oracle Call Interface and Oracle InterMedia to store the image data, and then uses this component to realize the system's function modules. Finally, it successfully uses VC and the Oracle InterMedia component to realize image data storage and management.

  17. Understanding Oracle Clinical

    CERN Document Server

    Johnson, Joan

    2007-01-01

    This Short Cut is written to assist you, an Oracle Clinical Developer, with many of the tasks and decisions you may encounter on an occasional basis. These tasks involve study setup and maintenance, account maintenance, handling discrepancies, preparing data sets for analysis, batch-loading data, altering system-level settings and defining standard processes. A working knowledge of screen setup and procedure coding is assumed. Remote Data Capture (RDC), which moves data entry from the CRO or pharmaceutical company to the sites, is fast becoming the preferred way to gather and clean data for

  18. Random Oracles in a Quantum World

    NARCIS (Netherlands)

    D. Boneh; O. Dagdelen; M. Fischlin; D. Lehmann; C. Schaffner (Christian); M. Zhandry

    2012-01-01

    The interest in post-quantum cryptography - classical systems that remain secure in the presence of a quantum adversary - has generated elegant proposals for new cryptosystems. Some of these systems are set in the random oracle model and are proven secure relative to adversaries that

  19. Expert Oracle Exadata

    CERN Document Server

    Johnson, Randy

    2011-01-01

    Throughout history, advances in technology have come in spurts. A single great idea can often spur rapid change as the idea takes hold and is propagated, often in totally unexpected directions. Exadata embodies such a change in how we think about and manage relational databases. The key change lies in the concept of offloading SQL processing to the storage layer. That concept is a huge win, and its implementation in the form of Exadata is truly a game changer. Expert Oracle Exadata will give you a look under the covers at how the combination of hardware and software that comprise Exadata actua

  20. Quantum functional oracles

    International Nuclear Information System (INIS)

    Kim, Jinsoo; Lee, Soojoon; Chi, Dong Pyo

    2002-01-01

    The limitation on the size of quantum computers makes it important to reuse qubits for auxiliary registers even though they are entangled with others and are occupied by other computational processes. We construct a quantum algorithm that performs the functional phase rotation, which is the generalized form of the conventional conditional phase transforms, using the functional evaluation oracle. The constructed algorithm works without any a priori knowledge of the state of an auxiliary register at the beginning and it recovers the initial state of an auxiliary register at the end. This provides ample scope to choose qubits for auxiliary registers at will. (author)

  1. Oracle support provides a range of new tutorials

    CERN Multimedia

    2013-01-01

    CERN IT-DB Group is pleased to announce a new series of Oracle tutorials, with the proposed schedule:
    - Tuesday 23 April: Introduction to Oracle & Tools (30-7-018 - Kjell Johnsen Auditorium)
    - Tuesday 30 April: Database Design & Security (30-7-018 - Kjell Johnsen Auditorium)
    - Wednesday 8 May: SQL (40-S2-C01 - Salle Curie)
    - Tuesday 21 May: PL/SQL (30-7-018 - Kjell Johnsen Auditorium)
    - Monday 27 May: Troubleshooting Performance (40-S2-C01 - Salle Curie)
    - Wednesday 5 June: Troubleshooting Performance - Case Studies (40-S2-C01 - Salle Curie)
    There is no need to register in advance. For more information, see the Indico agenda.

  2. An Oracle(c) database for the AMS experiment

    International Nuclear Information System (INIS)

    Boschini, M.; Gervasi, M.; Grandi, D.; Rancoita, P.G.; Trombetta, L.; Usoskin, I.G.

    1999-01-01

    We present the hardware and software technologies implemented for the AMS Milano Data Center. The goal of the AMS Milano Data Center is to provide the data collected during the STS-91 Space Shuttle flight to users, and to provide a User Interface to manage the data properly. Data are stored in a database that provides high-level query and retrieval features, backed by a magneto-optical jukebox. We describe the use of proprietary software (Oracle(c)) as well as custom-written software to enhance access performance. In particular, we underscore the use of the Oracle Call Interfaces as a powerful tool to interface the database and the operating system in a natural way.

  3. Oracle BI Publisher 11g A Practical Guide to Enterprise Reporting

    CERN Document Server

    Bozdoc, Daniela

    2011-01-01

    This is a practical guide with step-by-step instructions for enhancing your application of Oracle BI Publisher 11g for enterprise reporting. If you are an Oracle BI Publisher 11g end user, be it a report developer, business analyst or consultant, this book is for you. You should have good knowledge of general reporting practices and XML/XSL programming, though experience of using BI/XML Publisher is not essential.

  4. Getting started with Oracle SOA B2B Integration a hands-on tutorial

    CERN Document Server

    Bhatia, Krishnaprem; Perlovsky, Alan

    2013-01-01

    This hands on tutorial gives you the best possible start you could hope for with Oracle B2B. Learn using real life scenarios and examples to give you a solid footing of B2B.This book is for B2B architects, consultants and developers who would like to design and develop B2B integrations using Oracle B2B. This book assumes no prior knowledge of Oracle B2B and explains all concepts from scratch using illustrations, real world examples and step-by-step instructions. The book covers enough depth and details to be useful for both beginner and advanced B2B users.

  5. ORACLE INEQUALITIES FOR THE LASSO IN THE COX MODEL.

    Science.gov (United States)

    Huang, Jian; Sun, Tingni; Ying, Zhiliang; Yu, Yi; Zhang, Cun-Hui

    2013-06-01

    We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox proportional hazards regression models where the number of time-dependent covariates can be larger than the sample size. We establish oracle inequalities based on natural extensions of the compatibility and cone invertibility factors of the Hessian matrix at the true regression coefficients. Similar results based on an extension of the restricted eigenvalue can also be proved by our method. However, the presented oracle inequalities are sharper since the compatibility and cone invertibility factors are always greater than the corresponding restricted eigenvalue. In the Cox regression model, the Hessian matrix is based on time-dependent covariates in censored risk sets, so that the compatibility and cone invertibility factors, and the restricted eigenvalue as well, are random variables even when they are evaluated for the Hessian at the true regression coefficients. Under mild conditions, we prove that these quantities are bounded from below by positive constants for time-dependent covariates, including cases where the number of covariates is of greater order than the sample size. Consequently, the compatibility and cone invertibility factors can be treated as positive constants in our oracle inequalities.
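    Oracle inequalities of this kind typically bound the estimation error in terms of the sparsity level, the regularization parameter and a curvature factor. A generic Lasso-style form (illustrative only; the paper's Cox-model statements use the compatibility and cone invertibility factors of the partial-likelihood Hessian in place of the constant below) reads:

```latex
% Generic compatibility-factor oracle inequality for the Lasso
% (illustrative form; constants and curvature factors differ in the Cox setting).
\[
\|\hat{\beta} - \beta^{*}\|_{1} \;\le\; \frac{C\, s\, \lambda}{\kappa^{2}},
\qquad
\lambda \asymp \sqrt{\frac{\log p}{n}},
\]
where $s$ is the sparsity of the true coefficient vector $\beta^{*}$,
$p$ the number of covariates, $n$ the sample size,
and $\kappa$ the compatibility factor.
```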

  6. Oracle Inequalities for High Dimensional Vector Autoregressions

    DEFF Research Database (Denmark)

    Callot, Laurent; Kock, Anders Bredahl

    This paper establishes non-asymptotic oracle inequalities for the prediction error and estimation accuracy of the LASSO in stationary vector autoregressive models. These inequalities are used to establish consistency of the LASSO even when the number of parameters is of a much larger order...

  7. Sparse PCA with Oracle Property.

    Science.gov (United States)

    Gu, Quanquan; Wang, Zhaoran; Liu, Han

    In this paper, we study the estimation of the k-dimensional sparse principal subspace of a covariance matrix Σ in the high-dimensional setting. We aim to recover the oracle principal subspace solution, i.e., the principal subspace estimator obtained assuming the true support is known a priori. To this end, we propose a family of estimators based on the semidefinite relaxation of sparse PCA with novel regularizations. In particular, under a weak assumption on the magnitude of the population projection matrix, one estimator within this family exactly recovers the true support with high probability, has exact rank k, and attains a [Formula: see text] statistical rate of convergence with s being the subspace sparsity level and n the sample size. Compared to existing support recovery results for sparse PCA, our approach does not hinge on the spiked covariance model or the limited correlation condition. As a complement to the first estimator that enjoys the oracle property, we prove that another estimator within the family achieves a sharper statistical rate of convergence than the standard semidefinite relaxation of sparse PCA, even when the previous assumption on the magnitude of the projection matrix is violated. We validate the theoretical results by numerical experiments on synthetic datasets.

  8. Practical tuning for Oracle

    International Nuclear Information System (INIS)

    Kwon, Sun Yong

    2005-02-01

    This book deals with tuning for Oracle applications and consists of twenty-two chapters. Its contents are: what is tuning?, the tuning procedure, collecting performance data using Statspack, collecting performance data in real time, disk I/O dispersion, index architecture, partitions and IOTs, optimization of the clustering factor, the optimizer, analysis of execution plans, index selection, index tuning, parallel processing architecture, DML, analytic functions, join methods, join types, application analysis, lock architecture, SGA architecture, wait events, and segment tuning.

  9. On Tichý’s Attempt to Explicate Sense in Terms of Turing Machines

    Czech Academy of Sciences Publication Activity Database

    Materna, Pavel

    2018-01-01

    Roč. 25, č. 1 (2018), s. 41-52 ISSN 1335-0668 R&D Projects: GA ČR(CZ) GA17-15645S Institutional support: RVO:67985955 Keywords : Oracle * possible worlds * procedure * sense * Turing machine Subject RIV: AA - Philosophy ; Religion OBOR OECD: Philosophy, History and Philosophy of science and technology

  10. Oracle SOA BPEL PM 11g R1 a hands-on tutorial

    CERN Document Server

    Saraswathi, Ravi

    2013-01-01

    This hands-on, example-driven guide is a practical getting-started tutorial with plenty of step-by-step instructions for beginner to intermediate level readers working with BPEL PM in Oracle SOA Suite. Written for SOA developers, administrators, architects, and engineers who want to get started with Oracle BPEL PM 11g. No previous experience with BPEL PM is required, but an understanding of SOA and web services is assumed.

  11. Oracle Inequalities for Convex Loss Functions with Non-Linear Targets

    DEFF Research Database (Denmark)

    Caner, Mehmet; Kock, Anders Bredahl

    This paper considers penalized empirical loss minimization of convex loss functions with unknown non-linear target functions. Using the elastic net penalty we establish a finite sample oracle inequality which bounds the loss of our estimator from above with high probability. If the unknown target...... of the same order as that of the oracle. If the target is linear we give sufficient conditions for consistency of the estimated parameter vector. Next, we briefly discuss how a thresholded version of our estimator can be used to perform consistent variable selection. We give two examples of loss functions...

  12. Perfectly Secure Oblivious RAM without Random Oracles

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Meldgaard, Sigurd Torkel; Nielsen, Jesper Buus

    2011-01-01

    We present an algorithm for implementing a secure oblivious RAM where the access pattern is perfectly hidden in the information theoretic sense, without assuming that the CPU has access to a random oracle. In addition we prove a lower bound on the amount of randomness needed for implementing...

  13. A cognitive network for oracle-bone characters related to animals

    Science.gov (United States)

    Dress, Andreas; Grünewald, Stefan; Zeng, Zhenbing

    This paper is dedicated to HAO Bailin on the occasion of his eightieth birthday, the great scholar and very good friend who never tired of introducing us to the wonderful and complex intricacies of Chinese culture and history. In this paper, we present an analysis of oracle-bone characters for animals from a 'cognitive' point of view. After some general remarks on oracle-bone characters presented in Section 1 and a short outline of the paper in Section 2, we collect various oracle-bone characters for animals from published resources in Section 3. In the next section, we begin analysing a group of 60 ancient animal characters from www.zdic.net, a highly acclaimed internet dictionary of Chinese characters that is strictly based on historical sources, and introduce five categories of specific features regarding their (graphical) structure that will be used in Section 5 to associate corresponding feature vectors to these characters. In Section 6, these feature vectors will be used to investigate their dissimilarity in terms of a family of parameterised distance measures. And in the last section, we apply the SplitsTree method as encoded in the NeighbourNet algorithms to construct a corresponding family of dissimilarity-based networks with the intention of elucidating how the ancient Chinese might have perceived the 'animal world' in the late bronze age, and to demonstrate that these pictographs reflect an intuitive understanding of this world and its inherent structure that predates its classification in the oldest surviving Chinese encyclopedia from approximately the 3rd century BC, the ErYa, as well as similar classification systems in the West, by one to two millennia. We also present an English dictionary of 70 oracle-bone characters for animals in Appendix 1. In Appendix 2, we list various variants of animal characters that were published in the Jia Gu Wen Bian (cf. A Complete Collection of Oracle Bone Characters, edited by the Institute of Archaeology of the Chinese

  14. PERANGKAT BANTU UNTUK OPTIMASI QUERY PADA ORACLE DENGAN RESTRUKTURISASI SQL

    Directory of Open Access Journals (Sweden)

    Darlis Heru Murti

    2006-07-01

    Full Text Available A query is the part of the SQL (Structured Query Language) programming language that reads data in a DBMS (Database Management System), including Oracle [3]. In Oracle, query execution proceeds in three stages: Parse, Execute and Fetch. Before the execute stage runs, Oracle first builds an execution plan that serves as the scenario for execution. Several factors influence query performance, among them the access path (how data is read from a table) and the join operation (how data from two tables is combined). Obtaining a query with optimal performance therefore requires careful consideration of these factors. Query optimization is a way to obtain a query with performance as close to optimal as possible, especially from the standpoint of execution time. There are many query optimization methods, but in this research the authors built an application that optimizes queries by restructuring the SQL statement. In this method, the object analysed is the structure of the clauses that make up a query. The application has one input and five kinds of output. The input is a query, while the five outputs are the optimized query, improvement suggestions, suggestions for new indexes, the execution plan and statistical data. The application works in four stages: decomposing the query into subqueries, parsing the query clause by clause, determining the access path and join operation, and restructuring the query. A series of trials conducted by the authors showed that the application fulfils the goal of this research: obtaining queries with optimal performance. Keywords: Query, SQL, DBMS, Oracle, Parsing, Execute, Fetch, Execution Plan, Access Path, Join Operation, SQL Statement Restructuring.

  15. Robust iterative learning contouring controller with disturbance observer for machine tool feed drives.

    Science.gov (United States)

    Simba, Kenneth Renny; Bui, Ba Dinh; Msukwa, Mathew Renny; Uchiyama, Naoki

    2018-04-01

    In feed drive systems, particularly machine tools, the contour error is more significant than the individual axial tracking errors from the viewpoint of enhancing precision in manufacturing and production systems. The contour error must be within the permissible tolerance of given products. In machining complex or sharp-corner products, large contour errors occur mainly owing to discontinuous trajectories and the existence of nonlinear uncertainties. Therefore, it is indispensable to design robust controllers that can enhance the tracking ability of feed drive systems. In this study, an iterative learning contouring controller consisting of a classical Proportional-Derivative (PD) controller and a disturbance observer is proposed. The proposed controller was evaluated experimentally by using a typical sharp-corner trajectory, and its performance was compared with that of conventional controllers. The results revealed that the maximum contour error can be reduced by about 37% on average. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Mocking in Oracle PL/SQL

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Testing is not very popular in database development, so there are no common approaches to testing software written in the database. Surprisingly, one of the oldest DBMSs still lacks an appropriate testing approach for its PL/SQL programs. SQL Developer's built-in test "framework" is far from excellent; in particular, it does not cover mocking, which is an inherent part of testing for any larger system under development. This talk briefly introduces Oracle's Edition-Based Redefinition as a mechanism for mocking.
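    The mocking idea itself, substituting a dependency with a test double that returns canned values, is language-agnostic. As an analogy (not Oracle PL/SQL, and not Edition-Based Redefinition), here is a sketch using Python's standard unittest.mock; the price-lookup names are hypothetical:

```python
from unittest.mock import Mock

# Hypothetical "production" code whose dependency (a price lookup that
# would normally hit the database) is passed in as a callable.
def total_price(lookup, items):
    """Sum the prices returned by the lookup callable."""
    return sum(lookup(item) for item in items)

# In a test, replace the real lookup with a mock returning canned
# values, so no database connection is needed.
mock_lookup = Mock(side_effect=lambda item: {"a": 2, "b": 3}[item])

assert total_price(mock_lookup, ["a", "b", "a"]) == 7   # 2 + 3 + 2
assert mock_lookup.call_count == 3                      # called once per item
```

    Edition-Based Redefinition achieves the analogous substitution inside the database: the test edition can expose a stand-in version of a PL/SQL unit without touching the production objects.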

  17. Robust estimation for partially linear models with large-dimensional covariates.

    Science.gov (United States)

    Zhu, LiPing; Li, RunZe; Cui, HengJian

    2013-10-01

    We are concerned with robust estimation procedures to estimate the parameters in partially linear models with large-dimensional covariates. To enhance the interpretability, we suggest implementing a nonconcave regularization method in the robust estimation procedure to select important covariates from the linear component. We establish the consistency for both the linear and the nonlinear components when the covariate dimension diverges at the rate of [Formula: see text], where n is the sample size. We show that the robust estimate of the linear component performs asymptotically as well as its oracle counterpart which assumes the baseline function and the unimportant covariates were known a priori. With a consistent estimator of the linear component, we estimate the nonparametric component by a robust local linear regression. It is proved that the robust estimate of the nonlinear component performs asymptotically as well as if the linear component were known in advance. Comprehensive simulation studies are carried out and an application is presented to examine the finite-sample performance of the proposed procedures.

  18. Robust Visual Knowledge Transfer via Extreme Learning Machine Based Domain Adaptation.

    Science.gov (United States)

    Zhang, Lei; Zhang, David

    2016-08-10

    We address the problem of visual knowledge adaptation by leveraging labeled patterns from the source domain and a very limited number of labeled instances in the target domain to learn a robust classifier for visual categorization. This paper proposes a new extreme learning machine based cross-domain network learning framework, called Extreme Learning Machine (ELM) based Domain Adaptation (EDA). It allows us to learn a category transformation and an ELM classifier with random projection by minimizing the norm of the network output weights and the learning error simultaneously. The unlabeled target data, as useful knowledge, is also integrated as a fidelity term to guarantee stability during cross-domain learning. It minimizes the matching error between the learned classifier and a base classifier, such that many existing classifiers can be readily incorporated as base classifiers. The network output weights can not only be determined analytically, but are also transferable. Additionally, a manifold regularization with a Laplacian graph is incorporated, which is beneficial to semi-supervised learning. We also propose a multi-view extension of the model, referred to as MvEDA. Experiments on benchmark visual datasets for video event recognition and object recognition demonstrate that our EDA methods outperform existing cross-domain learning methods.
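    The ELM core of such frameworks, a random hidden layer whose output weights are determined analytically, can be sketched in a few lines of NumPy. This is a minimal single-domain ELM classifier only; the toy data, layer size and ridge parameter are illustrative assumptions, and EDA's domain-adaptation terms (category transformation, fidelity term, manifold regularizer) are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: class 0 clustered near (-1, -1), class 1 near (1, 1).
X = np.vstack([rng.normal(-1, 0.3, (50, 2)), rng.normal(1, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
T = np.eye(2)[y]                        # one-hot targets

# Random projection: hidden weights are drawn once and never trained.
L = 40                                  # number of hidden neurons (illustrative)
W = rng.normal(size=(2, L))
b = rng.normal(size=L)
H = np.tanh(X @ W + b)                  # hidden-layer output matrix

# Output weights solved analytically via ridge-regularized least squares.
lam = 1e-2
beta = np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ T)

pred = np.argmax(H @ beta, axis=1)
accuracy = (pred == y).mean()
assert accuracy > 0.9
```

    The analytic solve for `beta` is what makes the output weights cheap to compute; in EDA these weights are additionally constrained so that they transfer across domains.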

  19. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.

  20. Lecture 9: Oracle Databases at CERN

    CERN Multimedia

    CERN. Geneva; Limper, Maaike

    2013-01-01

    She participated in the analysis of the first LHC data in a variety of ways: she worked on the construction of the ATLAS silicon tracker, wrote new data reconstruction software and developed some of the databases that store information on the ATLAS data-taking conditions. As of January 2012, Maaike joined the CERN IT Databases group as a CERN openlab Fellow funded by Oracle to help investigate the possib...

  1. ATLAS database application enhancements using Oracle 11g

    CERN Document Server

    Dimitrov, G; The ATLAS collaboration; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at LHC relies on databases for detector online data-taking, storage and retrieval of configurations, calibrations and alignments, post data-taking analysis, file management over the grid, job submission and management, condition data replication to remote sites. Oracle Relational Database Management System (RDBMS) has been addressing the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided in production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the hosted 260 database schemas (for the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN have...

  2. Quantum random oracle model for quantum digital signature

    Science.gov (United States)

    Shang, Tao; Lei, Qi; Liu, Jianwei

    2016-10-01

    The goal of this work is to provide a general security analysis tool, namely, the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way function. QRO is used to model quantum one-way function and different queries to QRO are used to model quantum attacks. A typical application of quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of the experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of more quantum cryptographic protocols based on the quantum one-way function.

  3. 75 FR 82381 - Oracle Energy Services, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Science.gov (United States)

    2010-12-30

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-2436-000] Oracle Energy Services, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket... proceeding of Oracle Energy Services, LLC's application for market-based rate authority, with an accompanying...

  4. "Oracle E-Business Suite" Migration Between Different Versions

    OpenAIRE

    Dansone, Linda

    2008-01-01

    The master's thesis analyses migration between different versions of the enterprise resource planning system "Oracle E-Business Suite". In the course of the work, recommendations for "Oracle E-Business Suite" migration are surveyed, the standard functionality and architecture of Oracle E-Business Suite are reviewed, a company's experience in migrating this system is analysed, suggestions and recommendations for improving this process are identified, risk management and the COBIT standard are reviewed, risks are analysed, and an analysis of requests is performed prior to the migration...

  5. Rapidly exploring structural and dynamic properties of signaling networks using PathwayOracle

    Directory of Open Access Journals (Sweden)

    Ram Prahlad T

    2008-08-01

    Full Text Available Abstract
    Background: In systems biology the experimentalist is presented with a selection of software for analyzing dynamic properties of signaling networks. These tools either assume that the network is in steady state or require highly parameterized models of the network of interest. For biologists interested in assessing how a signal propagates through a network under specific conditions, the first class of methods does not provide sufficiently detailed results and the second class requires models which may not be easily and accurately constructed. A tool that is able to characterize the dynamics of a signaling network using an unparameterized model of the network would allow biologists to quickly obtain insights into a signaling network's behavior.
    Results: We introduce PathwayOracle, an integrated suite of software tools for computationally inferring and analyzing structural and dynamic properties of a signaling network. The feature which differentiates PathwayOracle from other tools is a method that can predict the response of a signaling network to various experimental conditions and stimuli using only the connectivity of the signaling network. Thus signaling models are relatively easy to build. The method allows for tracking signal flow in a network and comparison of signal flows under different experimental conditions. In addition, PathwayOracle includes tools for the enumeration and visualization of coherent and incoherent signaling paths between proteins, and for experimental analysis: loading and superimposing experimental data, such as microarray intensities, on the network model.
    Conclusion: PathwayOracle provides an integrated environment in which both structural and dynamic analysis of a signaling network can be quickly conducted and visualized alongside experimental results. By using the signaling network connectivity, analyses and predictions can be performed quickly using relatively easily constructed signaling network models.
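    The connectivity-only prediction idea can be illustrated with a toy sign-propagation sketch. The network, the propagation rule and all names below are assumptions for illustration, not PathwayOracle's actual algorithm:

```python
import numpy as np

# Signed adjacency: A[i, j] = +1 if node i activates node j,
# -1 if it inhibits it, 0 if no edge. A toy 4-node cascade.
A = np.array([
    [0, 1, 0, 0],    # 0 activates 1
    [0, 0, -1, 0],   # 1 inhibits 2
    [0, 0, 0, 1],    # 2 activates 3
    [0, 0, 0, 0],
])

def propagate(A, source, steps):
    """Net signed influence reaching each node within `steps` hops."""
    s = np.zeros(A.shape[0])
    s[source] = 1.0
    total = s.copy()
    for _ in range(steps):
        s = A.T @ s          # push each node's sign along its outgoing edges
        total += s
    return np.sign(total)

# Stimulating node 0: node 1 goes up, node 2 down (inhibited by 1),
# and node 3 down (activated by the down-regulated node 2).
print(propagate(A, source=0, steps=3))   # → [ 1.  1. -1. -1.]
```

    The point of the analogy is that no kinetic parameters are needed: the qualitative response follows from the signed connectivity alone, which is what makes such models cheap to build.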

  6. Continuous-Variable Quantum Computation of Oracle Decision Problems

    Science.gov (United States)

    Adcock, Mark R. A.

    Quantum information processing is appealing due to its ability to solve certain problems quantitatively faster than classical information processing. Most quantum algorithms have been studied in discretely parameterized systems, but many quantum systems are continuously parameterized. The field of quantum optics in particular has sophisticated techniques for manipulating continuously parameterized quantum states of light, but the lack of a code-state formalism has hindered the study of quantum algorithms in these systems. To address this situation, a code-state formalism for the solution of oracle decision problems in continuously-parameterized quantum systems is developed. In the infinite-dimensional case, we study continuous-variable quantum algorithms for the solution of the Deutsch–Jozsa oracle decision problem implemented within a single harmonic oscillator. Orthogonal states are used as the computational bases, and we show that, contrary to a previous claim in the literature, this implementation of quantum information processing has limitations due to a position-momentum trade-off of the Fourier transform. We further demonstrate that orthogonal encoding bases are not unique, and using the coherent states of the harmonic oscillator as the computational bases, our formalism enables quantifying

  7. Composite Differential Evolution with Modified Oracle Penalty Method for Constrained Optimization Problems

    Directory of Open Access Journals (Sweden)

    Minggang Dong

    2014-01-01

    Full Text Available Motivated by recent advancements in differential evolution and constraint-handling methods, this paper presents a novel modified oracle penalty function-based composite differential evolution (MOCoDE) for constrained optimization problems (COPs). More specifically, the original oracle penalty function approach is modified so as to satisfy the optimization criterion of COPs; then the modified oracle penalty function is incorporated into composite DE. Furthermore, in order to solve more complex COPs with discrete, integer, or binary variables, a discrete variable handling technique is introduced into MOCoDE to solve complex COPs with mixed variables. The method is assessed on eleven constrained optimization benchmark functions and seven well-studied real-life engineering problems. Experimental results demonstrate that MOCoDE achieves competitive performance with respect to some other state-of-the-art approaches in constrained optimization evolutionary algorithms. Moreover, the strengths of the proposed method include few parameters and ease of implementation, rendering it applicable to real-life problems. Therefore, MOCoDE can be an efficient alternative for solving constrained optimization problems.
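The penalty-plus-DE structure described in the abstract can be sketched in a few lines. This is a minimal illustration using a generic quadratic penalty, not the modified oracle penalty of MOCoDE, and the toy problem, population size, and DE constants are all hypothetical choices:

```python
import numpy as np

# Sketch: differential evolution (DE/rand/1/bin) with a simple additive
# penalty for constraint handling. NOT the modified oracle penalty of MOCoDE;
# it only illustrates the general structure such methods share.
# Toy problem (hypothetical): minimize x0^2 + x1^2 subject to x0 + x1 >= 1.
rng = np.random.default_rng(0)

def objective(x):
    return x[0] ** 2 + x[1] ** 2

def violation(x):
    # Constraint x0 + x1 >= 1 rewritten as g(x) = 1 - x0 - x1 <= 0.
    return max(0.0, 1.0 - x[0] - x[1])

def penalized(x, rho=1e3):
    return objective(x) + rho * violation(x) ** 2

lo, hi, n, dim = -2.0, 2.0, 30, 2
pop = rng.uniform(lo, hi, (n, dim))
for _ in range(300):
    for i in range(n):
        a, b, c = pop[rng.choice([j for j in range(n) if j != i], 3, replace=False)]
        trial = np.clip(a + 0.7 * (b - c), lo, hi)   # mutation
        mask = rng.random(dim) < 0.9                 # binomial crossover
        trial = np.where(mask, trial, pop[i])
        if penalized(trial) < penalized(pop[i]):     # greedy selection
            pop[i] = trial

best = min(pop, key=penalized)   # converges near (0.5, 0.5) on this toy problem
```

The quadratic penalty drives the population toward feasibility while the DE operators search the objective landscape; the oracle penalty method replaces the fixed penalty weight with a self-adapting construction.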

  8. Witnet: A Decentralized Oracle Network Protocol

    OpenAIRE

    de Pedro, Adán Sánchez; Levi, Daniele; Cuende, Luis Iván

    2017-01-01

    Witnet is a decentralized oracle network (DON) that connects smart contracts to the outer world. Generally speaking, it allows any piece of software to retrieve the contents published at any web address at a certain point in time, with complete and verifiable proof of its integrity and without blindly trusting any third party. Witnet runs on a blockchain with a native protocol token (called Wit), which miners (called witnesses) earn by retrieving, attesting and delivering web contents for clien...

  9. C-A1-03: Considerations in the Design and Use of an Oracle-based Virtual Data Warehouse

    Science.gov (United States)

    Bredfeldt, Christine; McFarland, Lela

    2011-01-01

    Background/Aims The amount of clinical data available for research is growing exponentially. As it grows, increasing the efficiency of both data storage and data access becomes critical. Relational database management systems (rDBMS) such as Oracle are ideal solutions for managing longitudinal clinical data because they support large-scale data storage and highly efficient data retrieval. In addition, they can greatly simplify the management of large data warehouses, including security management and regular data refreshes. However, the HMORN Virtual Data Warehouse (VDW) was originally designed based on SAS datasets, and this design choice has a number of implications for both the design and use of an Oracle-based VDW. From a design standpoint, VDW tables are designed as flat SAS datasets, which do not take full advantage of Oracle indexing capabilities. From a data retrieval standpoint, standard VDW SAS scripts do not take advantage of SAS pass-through SQL capabilities to enable Oracle to perform the processing required to narrow datasets to the population of interest. Methods Beginning in 2009, the research department at Kaiser Permanente in the Mid-Atlantic States (KPMA) has developed an Oracle-based VDW according to the HMORN v3 specifications. In order to take advantage of the strengths of relational databases, KPMA introduced an interface layer to the VDW data, using views to provide access to standardized VDW variables. In addition, KPMA has developed SAS programs that provide access to SQL pass-through processing for first-pass data extraction into SAS VDW datasets for processing by standard VDW scripts. Results We discuss both the design and performance considerations specific to the KPMA Oracle-based VDW. We benchmarked performance of the Oracle-based VDW using both standard VDW scripts and an initial pre-processing layer to evaluate speed and accuracy of data return. Conclusions Adapting the VDW for deployment in an Oracle environment required minor
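The pass-through idea described above — letting the database engine narrow the data to the population of interest before the analysis layer sees it — can be illustrated with Python's sqlite3 as a stand-in for the Oracle/SAS combination; the table and column names below are hypothetical, not actual VDW schema:

```python
import sqlite3

# Sketch of "pass-through" style extraction: the WHERE clause executes inside
# the database engine, so only matching rows cross the interface to the
# client. SQLite stands in for Oracle here; table/columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vdw_enrollment (mrn TEXT, enr_start TEXT, enr_end TEXT)")
conn.executemany(
    "INSERT INTO vdw_enrollment VALUES (?, ?, ?)",
    [("A1", "2009-01-01", "2009-12-31"),
     ("A2", "2008-01-01", "2008-06-30"),
     ("A3", "2009-03-01", "2010-02-28")],
)

# First-pass restriction happens server-side, the analogue of SAS
# pass-through SQL narrowing a dataset before standard VDW scripts run.
cohort = conn.execute(
    "SELECT mrn FROM vdw_enrollment WHERE enr_start >= '2009-01-01'"
).fetchall()
```

The alternative — fetching the full table and filtering client-side — is what flat SAS dataset processing effectively does, and is where the performance gap arises.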

  10. Multi-objective optimization model of CNC machining to minimize processing time and environmental impact

    Science.gov (United States)

    Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad

    2017-11-01

    Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by the machining parameters. Modern technology can be applied through CNC machining; turning is one of the processes that can be performed on a CNC machine. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to optimize the machining parameters to minimize both the processing time and the environmental impact. This research developed a multi-objective optimization model to minimize the processing time and environmental impact of a CNC turning process, yielding optimal values of the decision variables cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of Eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
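A weighted-sum scalarization of the two objectives can be sketched as follows. The machining-time expression is the standard turning formula t = πDL/(1000·v·f); the environmental-impact function, all constants, and the weights are hypothetical placeholders, and a plain grid search stands in for the OptQuest solver used in the paper:

```python
import math

# Sketch of a weighted-sum scalarization over cutting speed v and feed rate f.
# machining_time uses the standard turning formula; env_impact and every
# constant below are invented for illustration only.
D, L = 50.0, 120.0        # workpiece diameter and length in mm (assumed)
w_time, w_env = 0.5, 0.5  # objective weights (assumed)

def machining_time(v, f):  # v in m/min, f in mm/rev
    return math.pi * D * L / (1000.0 * v * f)

def env_impact(v, f):      # hypothetical: impact grows with cutting speed
    return 0.01 * v ** 1.5 / f ** 0.2

best = None
for i in range(101):                        # cutting speed 60..300 m/min
    v = 60.0 + (300.0 - 60.0) * i / 100.0
    for j in range(101):                    # feed rate 0.05..0.5 mm/rev
        f = 0.05 + (0.5 - 0.05) * j / 100.0
        score = w_time * machining_time(v, f) + w_env * env_impact(v, f)
        if best is None or score < best[0]:
            best = (score, v, f)

score, v_opt, f_opt = best
```

Varying the weights traces out different compromise solutions, which is one simple way to explore the trade-off between the two objectives.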

  11. An ORACLE Chronicle: A Decade of Classroom Research.

    Science.gov (United States)

    Galton, Maurice

    1987-01-01

    This article describes Project ORACLE, research carried out at the University of Leicester beginning in 1975, concerning (1) a longitudinal process-product study of teaching and learning in elementary schools; and (2) a study which concentrated on collaborative group work in the same classrooms. Results and implications are discussed.

  12. FISH Oracle: a web server for flexible visualization of DNA copy number data in a genomic context.

    Science.gov (United States)

    Mader, Malte; Simon, Ronald; Steinbiss, Sascha; Kurtz, Stefan

    2011-07-28

    The rapidly growing amount of array CGH data requires improved visualization software supporting the process of identifying candidate cancer genes. Optimally, such software should work across multiple microarray platforms, should be able to cope with data from different sources and should be easy to operate. We have developed the web-based software FISH Oracle to visualize data from multiple array CGH experiments in a genomic context. Its fast visualization engine and advanced web and database technology support highly interactive use. FISH Oracle comes with a convenient data import mechanism, powerful search options for genomic elements (e.g. gene names or karyobands), quick navigation and zooming into interesting regions, and mechanisms to export the visualization into different high quality formats. These features make the software especially suitable for the needs of life scientists. FISH Oracle offers a fast and easy to use visualization tool for array CGH and SNP array data. It allows for the identification of genomic regions representing minimal common changes based on data from one or more experiments. FISH Oracle will be instrumental to identify candidate onco- and tumor-suppressor genes based on the frequency and genomic position of DNA copy number changes. The FISH Oracle application and an installed demo web server are available at http://www.zbh.uni-hamburg.de/fishoracle.

  13. Oracle APEX 4.2 cookbook

    CERN Document Server

    Van Zoest, Michel

    2013-01-01

    As a Cookbook, this book enables you to create APEX web applications and to implement features with immediately usable recipes that unleash the powerful functionality of Oracle APEX 4.2. Each recipe is presented as a separate, standalone entity and the reading of other, prior recipes is not required. It can be seen as a reference and a practical guide to APEX development. This book is aimed both at developers new to the APEX environment and at intermediate developers. More advanced developers will also gain from the information at hand. If you are new to APEX you will find recipes to start develo

  14. Rancang Bangun Sistem Informasi Asistensi ORACLE Berbasis Web Di Prodi Sistem Informasi UNIKOM

    Directory of Open Access Journals (Sweden)

    diana effendi

    2016-12-01

    Full Text Available The learning process in a computer laboratory cannot be separated from the quantity and specifications of the computers, nor from the availability of qualified lecturers. The learning process in the Oracle Laboratory of the Information Systems Major cannot be held optimally because of the number of students in the laboratory and the limited time for study; not all students' problems can be addressed during lab sessions. This problem is addressed by appointing laboratory assistants. However, laboratory assistants have been selected through a conventional process, which cannot guarantee qualified assistants. In addition, lab assistants are scheduled manually, resulting in conflicts with the assistants' own class schedules and in the need to recreate schedules and reprint the BAP (Berita Acara Perkuliahan). The certificate of assistance, granted at the end of the semester, is also produced manually and can be issued repeatedly. Therefore, an Oracle Laboratory Assistance Information System (SIASLORA) needs to be built that can process assistant candidate selection, schedule Oracle lab assistance, and print the BAP (Berita Acara Perkuliahan) and certificates of assistance quickly and accurately. SIASLORA was developed using the prototype development method and a structured system approach based on context diagrams and data flow diagrams. The programming language is PHP (Hypertext Preprocessor) and the database management system (DBMS) is MySQL. SIASLORA is expected to overcome the problems described above. Keywords — Oracle laboratory, computerization, information system, SIASLORA.

  15. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    Science.gov (United States)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices on how to achieve compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures as well as guidelines for the use of auditing features.

  16. Robust Machine Learning Variable Importance Analyses of Medical Conditions for Health Care Spending.

    Science.gov (United States)

    Rose, Sherri

    2018-03-11

    To propose nonparametric double robust machine learning in variable importance analyses of medical conditions for health spending. 2011-2012 Truven MarketScan database. I evaluate how much more, on average, commercially insured enrollees with each of 26 of the most prevalent medical conditions cost per year after controlling for demographics and other medical conditions. This is accomplished within the nonparametric targeted learning framework, which incorporates ensemble machine learning. Previous literature studying the impact of medical conditions on health care spending has almost exclusively focused on parametric risk adjustment; thus, I compare my approach to parametric regression. My results demonstrate that multiple sclerosis, congestive heart failure, severe cancers, major depression and bipolar disorders, and chronic hepatitis are the most costly medical conditions on average per individual. These findings differed from those obtained using parametric regression. The literature may be underestimating the spending contributions of several medical conditions, which is a potentially critical oversight. If current methods are not capturing the true incremental effect of medical conditions, undesirable incentives related to care may remain. Further work is needed to directly study these issues in the context of federal formulas. © Health Research and Educational Trust.
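For contrast, the parametric risk-adjustment baseline that the paper argues against can be sketched as a simple regression on synthetic data. This is not the double robust targeted learning estimator, and the data-generating numbers below are invented:

```python
import numpy as np

# Naive parametric sketch of "incremental spending for a medical condition":
# regress annual spending on a condition indicator plus covariates and read
# off the coefficient. This is the parametric baseline the paper critiques,
# not the targeted learning estimator; the data are synthetic.
rng = np.random.default_rng(1)
n = 5000
age = rng.uniform(20, 64, n)
has_cond = rng.integers(0, 2, n).astype(float)
# True incremental cost of the (hypothetical) condition is $3000/year.
spending = 500.0 + 40.0 * age + 3000.0 * has_cond + rng.normal(0, 200, n)

X = np.column_stack([np.ones(n), age, has_cond])
beta, *_ = np.linalg.lstsq(X, spending, rcond=None)
incremental_cost = beta[2]   # estimated extra annual spending for the condition
```

The parametric estimate is only as good as the linear specification; the paper's point is that ensemble machine learning within the targeted learning framework relaxes exactly this assumption.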

  17. Real-time PCR Machine System Modeling and a Systematic Approach for the Robust Design of a Real-time PCR-on-a-Chip System

    Directory of Open Access Journals (Sweden)

    Da-Sheng Lee

    2010-01-01

    Full Text Available Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.

  18. Single Sign-On: um estudo de caso em banco de dados Oracle

    Directory of Open Access Journals (Sweden)

    Cássio Tavares Brito

    2012-11-01

    Full Text Available It is well known that the solutions provided by Information Technology (IT) bring measurable benefits both to core business areas and to supporting areas. With this technological evolution, Relational Database Management Systems (RDBMS) have gained significant capabilities in security, persistence, processing, and data storage. Together, these capabilities enable Database Administrators (DBAs) to create, organize, and maintain the various information bases of many organizations as effectively as possible. In this context, the Directory Service in Oracle databases contributes to information security best practices and, integrated with existing directory services such as Active Directory (Microsoft), eDirectory (Novell), and OpenLDAP, broadens the range of service interoperability. As a result, user identities are provisioned automatically for the main account-management functions, such as the creation, update, deactivation, and removal of accounts in the respective databases, immediately and with full transparency. This work presents a study of the Oracle Internet Directory component, which provisions a single corporate-network login and password synchronized with the Directory Service of the Oracle database. When this component is registered in Oracle databases, it reduces the cost of the constant password changes required of the respective database administrators. It thus lays the groundwork for implementing Single Sign-On (a single point of entry) in Oracle databases following the best practices of access management and information security.

  19. Surface Estimation, Variable Selection, and the Nonparametric Oracle Property.

    Science.gov (United States)

    Storlie, Curtis B; Bondell, Howard D; Reich, Brian J; Zhang, Hao Helen

    2011-04-01

    Variable selection for multivariate nonparametric regression is an important, yet challenging, problem due, in part, to the infinite dimensionality of the function space. An ideal selection procedure should be automatic, stable, easy to use, and have desirable asymptotic properties. In particular, we define a selection procedure to be nonparametric oracle (np-oracle) if it consistently selects the correct subset of predictors and at the same time estimates the smooth surface at the optimal nonparametric rate, as the sample size goes to infinity. In this paper, we propose a model selection procedure for nonparametric models, and explore the conditions under which the new method enjoys the aforementioned properties. Developed in the framework of smoothing spline ANOVA, our estimator is obtained via solving a regularization problem with a novel adaptive penalty on the sum of functional component norms. Theoretical properties of the new estimator are established. Additionally, numerous simulated and real examples further demonstrate that the new approach substantially outperforms other existing methods in the finite sample setting.

  20. Oracle NoSQL databáze

    OpenAIRE

    Chlomek, Lukáš

    2015-01-01

    This thesis deals with the theme of NoSQL databases. The theoretical part describes the reasons for the creation of this database trend, its basic properties, and the most widely used NoSQL data models. The practical part introduces the reader to one representative of the NoSQL key-value data model, Oracle NoSQL. This is followed by a sample of working with records in this database, using a purpose-built test program. The work ends with a short practical demonstration of manipulating tables in the reposi...

  1. Robust Least-Squares Support Vector Machine With Minimization of Mean and Variance of Modeling Error.

    Science.gov (United States)

    Lu, Xinjiang; Liu, Wenbo; Zhou, Chuang; Huang, Minghui

    2017-06-13

    The least-squares support vector machine (LS-SVM) is a popular data-driven modeling method and has been successfully applied to a wide range of applications. However, it has some disadvantages, including being ineffective at handling non-Gaussian noise as well as being sensitive to outliers. In this paper, a robust LS-SVM method is proposed and is shown to have more reliable performance when modeling a nonlinear system under conditions where Gaussian or non-Gaussian noise is present. The construction of a new objective function allows for a reduction of the mean of the modeling error as well as the minimization of its variance, and it does not constrain the mean of the modeling error to zero. This differs from the traditional LS-SVM, which uses a worst-case scenario approach in order to minimize the modeling error and constrains the mean of the modeling error to zero. In doing so, the proposed method takes the modeling error distribution information into consideration and is thus less conservative and more robust with regard to random noise. A solving method is then developed in order to determine the optimal parameters for the proposed robust LS-SVM. An additional analysis indicates that the proposed LS-SVM gives a smaller weight to a large-error training sample and a larger weight to a small-error training sample, and is thus more robust than the traditional LS-SVM. The effectiveness of the proposed robust LS-SVM is demonstrated using both artificial and real-life cases.
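For reference, the traditional LS-SVM regression that the proposed method modifies reduces to solving a single linear system in its dual variables. The sketch below implements that standard form (not the robust variant proposed in the paper), with illustrative hyperparameters:

```python
import numpy as np

# Standard LS-SVM regression (the baseline the paper builds on, not the
# robust variant). The dual solution solves one linear system:
#   [ 0      1^T        ] [ b     ]   [ 0 ]
#   [ 1  K + I/gamma    ] [ alpha ] = [ y ]
def rbf_kernel(A, B, sigma=0.2):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=1e4, sigma=0.2):
    n = len(X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]            # bias b, dual weights alpha

def lssvm_predict(Xtr, alpha, b, Xnew, sigma=0.2):
    return rbf_kernel(Xnew, Xtr, sigma) @ alpha + b

# Toy fit: noise-free sine curve (hyperparameters chosen for illustration).
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, alpha, b, X)
```

Because all training points enter the squared-error term equally, outliers pull this estimator strongly; the paper's contribution is a reweighted objective that tempers exactly that behavior.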

  2. OCA/OCP Oracle database 11g all-in-one exam guide exams 1Z0-051, 1Z0-052, 1Z0-053

    CERN Document Server

    Watson, John

    2010-01-01

    A Fully Integrated Study System for OCA Exams 1Z0-051 and 1Z0-052, and OCP Exam 1Z0-053 Prepare for the Oracle Certified Associate Administration I and SQL Fundamentals I exams and the Oracle Certified Professional Administration II exam with help from this exclusive Oracle Press guide. In each chapter, you'll find challenging exercises, practice questions, and a two-minute drill to highlight what you've learned. This authoritative guide will help you pass the test and serve as your essential on-the-job reference. Get complete coverage of all objectives for exams 1Z0-051, 1Z0-052, and 1Z0-053, including: Instance management Networking and storage Security SQL Oracle Recovery Manager and Oracle Flashback Oracle Automatic Storage Management Resource manager Oracle Scheduler Automatic workload repository Performance tuning And more On the CD-ROM: Three full practice exams Detailed answers and explanations Score report performance assessment tool Complete electronic book Three bonus exams available with free onli...

  3. Single product lot-sizing on unrelated parallel machines with non-decreasing processing times

    Science.gov (United States)

    Eremeev, A.; Kovalyov, M.; Kuznetsov, P.

    2018-01-01

    We consider a problem in which at least a given quantity of a single product has to be partitioned into lots, and lots have to be assigned to unrelated parallel machines for processing. In one version of the problem, the maximum machine completion time should be minimized; in another version, the sum of machine completion times is to be minimized. Machine-dependent lower and upper bounds on the lot size are given. The product is either assumed to be continuously divisible or discrete. The processing time of each machine is defined by an increasing function of the lot volume, given as an oracle. Setup times and costs are assumed to be negligibly small, and therefore, they are not considered. We derive optimal polynomial time algorithms for several special cases of the problem. An NP-hard case is shown to admit a fully polynomial time approximation scheme. An application of the problem to energy-efficient processor scheduling is considered.
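The oracle model of the processing-time functions suggests a simple bisection scheme for the makespan version: this is a sketch under the assumption of a continuously divisible product with no lot-size bounds, using illustrative linear oracles (the paper's algorithms and complexity analysis are more refined than this):

```python
# Sketch: minimize the maximum completion time using only oracle access to
# each machine's increasing processing-time function p_i(volume).
# Feasibility for a target makespan T: invert each oracle by bisection to
# find the largest lot the machine can finish by T, and check that the lots
# cover the required quantity Q. The linear oracles below are illustrative.
def max_volume(p, T, v_hi, iters=60):
    lo, hi = 0.0, v_hi
    for _ in range(iters):              # bisection on the lot volume
        mid = (lo + hi) / 2
        if p(mid) <= T:
            lo = mid
        else:
            hi = mid
    return lo

def min_makespan(oracles, Q, v_hi=1e6, iters=60):
    lo, hi = 0.0, max(p(v_hi) for p in oracles)
    for _ in range(iters):              # bisection on the makespan
        T = (lo + hi) / 2
        if sum(max_volume(p, T, v_hi) for p in oracles) >= Q:
            hi = T
        else:
            lo = T
    return hi

# Two machines with linear oracles p1(v) = v, p2(v) = 2v and Q = 3:
# both machines finish simultaneously at T = Q / (1/1 + 1/2) = 2.
T = min_makespan([lambda v: v, lambda v: 2.0 * v], 3.0)
```

The monotonicity of each p_i is what makes both bisections valid; machine-dependent lower/upper lot-size bounds would clip the inner inversion.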

  4. Aerosol-Radiation-Cloud Interactions in the South-East Atlantic: Results from the ORACLES-2016 Deployment and a First Look at ORACLES-2017 and Beyond

    Science.gov (United States)

    Redemann, Jens; Wood, R.; Zuidema, P.

    2018-01-01

    Seasonal biomass burning (BB) in Southern Africa during the Southern hemisphere spring produces almost a third of the Earth's BB aerosol particles. These particles are lofted into the mid-troposphere and transported westward over the South-East (SE) Atlantic, where they interact with one of the three semi-permanent subtropical stratocumulus (Sc) cloud decks in the world. These interactions include adjustments to aerosol-induced solar heating and microphysical effects. The representation of these interactions in climate models remains highly uncertain, because of the scarcity of observational constraints on both the aerosol and cloud properties and on the governing physical processes. The first deployment of the NASA P-3 and ER-2 aircraft in the ORACLES (ObseRvations of Aerosols Above Clouds and Their IntEractionS) project in August/September of 2016 has started to fill this observational gap by providing an unprecedented look at the SE Atlantic cloud-aerosol system. We provide an overview of the first deployment, highlighting aerosol absorptive and cloud-nucleating properties, their vertical distribution relative to clouds, the locations and degree of aerosol mixing into clouds, cloud changes in response to such mixing, and cloud top stability relationships to the aerosol. We also expect to describe preliminary results of the second ORACLES deployment from São Tomé and Príncipe in August 2017. We will make an initial assessment of the differences and similarities of the BB plume and cloud properties as observed from a deployment site near the plume's northern edge. We will conclude with an outlook for the third ORACLES deployment in October 2018.

  5. Introducing ORACLE: Library Processing in a Multi-User Environment.

    Science.gov (United States)

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  6. C#: Connecting a Mobile Application to Oracle Server via Web Services

    Directory of Open Access Journals (Sweden)

    Daniela Ilea

    2008-01-01

    Full Text Available This article is focused on mobile development using Visual Studio 2005, web services and their connection to Oracle Server, aiming to help programmers build simple and useful mobile applications.

  7. The Delphic oracle: a multidisciplinary defense of the gaseous vent theory.

    Science.gov (United States)

    Spiller, Henry A; Hale, John R; De Boer, Jelle Z

    2002-01-01

    Ancient historical references consistently describe an intoxicating gas, produced by a cavern in the ground, as the source of the power at the oracle of Delphi. These ancient writings are supported by a series of associated geological findings. Chemical analysis of the spring waters and travertine deposits at the site show these gases to be the light hydrocarbon gases methane, ethane, and ethylene. The effects of inhaling ethylene, a major anesthetic gas in the mid-20th century, are similar to those described in the ancient writings. We believe the probable cause of the trancelike state of the Priestess (the Pythia) at the oracle of Delphi during her mantic sessions was produced by inhaling ethylene gas or a mixture of ethylene and ethane from a naturally occurring vent of geological origin.

  8. FISH Oracle 2: a web server for integrative visualization of genomic data in cancer research.

    Science.gov (United States)

    Mader, Malte; Simon, Ronald; Kurtz, Stefan

    2014-03-31

    A comprehensive view on all relevant genomic data is instrumental for understanding the complex patterns of molecular alterations typically found in cancer cells. One of the most effective ways to rapidly obtain an overview of genomic alterations in large amounts of genomic data is the integrative visualization of genomic events. We developed FISH Oracle 2, a web server for the interactive visualization of different kinds of downstream processed genomics data typically available in cancer research. A powerful search interface and a fast visualization engine provide a highly interactive visualization for such data. High quality image export enables the life scientist to easily communicate their results. Comprehensive data administration makes it possible to keep track of the available data sets. We applied FISH Oracle 2 to published data and found evidence that, in colorectal cancer cells, the gene TTC28 may be inactivated in two different ways, a fact that has not been published before. The interactive nature of FISH Oracle 2 and the possibility to store, select and visualize large amounts of downstream processed data support life scientists in generating hypotheses. The export of high quality images supports explanatory data visualization, simplifying the communication of new biological findings. A FISH Oracle 2 demo server and the software are available at http://www.zbh.uni-hamburg.de/fishoracle.

  9. MIGRATION OF ORACLE HR DATABASE

    CERN Multimedia

    ais.support@cern.ch

    2001-01-01

    Restricted services from 3 to 7 November 2001 Due to the migration of the Oracle HR application to the Web, some services which rely on the application's availability may be disturbed from Friday 2 November at 17:30 until Thursday 8 November at 08:30. Amongst those services: HR Division: records office, recruitment, claims and benefits. FI Division: personnel accounting, advances and claims. ST Division: registration office (access cards). SPL Division: external firm staff records. EP Division: users' office. Experiments' secretariats: PIE, Greybook. Divisional secretariats: externals, internal addresses. All information concerning this migration is available at: http://ais.cern.ch We apologize for any inconvenience and thank you in advance for your understanding.

  10. Efficient and robust pupil size and blink estimation from near-field video sequences for human-machine interaction.

    Science.gov (United States)

    Chen, Siyuan; Epps, Julien

    2014-12-01

    Monitoring pupil and blink dynamics has applications in cognitive load measurement during human-machine interaction. However, accurate, efficient, and robust pupil size and blink estimation pose significant challenges to the efficacy of real-time applications due to the variability of eye images; hence, to date, such methods require manual intervention for fine-tuning of parameters. In this paper, a novel self-tuning threshold method, which is applicable to any infrared-illuminated eye images without a tuning parameter, is proposed for segmenting the pupil from the background images recorded by a low-cost webcam placed near the eye. A convex hull and a dual-ellipse fitting method are also proposed to select pupil boundary points and to detect the eyelid occlusion state. Experimental results on a realistic video dataset show that the measurement accuracy using the proposed methods is higher than that of widely used manually tuned parameter methods or fixed parameter methods. Importantly, it demonstrates convenience and robustness for an accurate and fast estimate of eye activity in the presence of variations due to different users, task types, load, and environments. Cognitive load measurement in human-machine interaction can benefit from this computationally efficient implementation without requiring a threshold calibration beforehand. Thus, one can envisage a mini IR camera embedded in a lightweight glasses frame, like Google Glass, for convenient applications of real-time adaptive aiding and task management in the future.
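As a simplified illustration of the threshold-based first stage, the sketch below segments a synthetic dark-pupil image with a fixed global Otsu threshold. The paper's self-tuning threshold, convex-hull boundary selection, and dual-ellipse fitting are not reproduced here; the image and noise parameters are invented:

```python
import numpy as np

# Sketch: threshold segmentation of a synthetic IR-style eye image — a dark
# pupil disc on a brighter background — using a global Otsu threshold. Only
# illustrates the first stage; the paper's method is self-tuning and adds
# convex-hull and dual-ellipse steps.
h = w = 200
yy, xx = np.mgrid[0:h, 0:w]
pupil = (xx - 100) ** 2 + (yy - 100) ** 2 <= 30 ** 2    # true pupil mask
img = np.where(pupil, 40, 180).astype(float)             # dark disc, bright bg
rng = np.random.default_rng(0)
img = np.clip(img + rng.normal(0, 10, img.shape), 0, 255).astype(np.uint8)

def otsu_threshold(image):
    # Maximize between-class variance over all 8-bit thresholds.
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:t] * np.arange(t)).sum() / w0
        m1 = (hist[t:] * np.arange(t, 256)).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

t = otsu_threshold(img)
est_area = int((img < t).sum())   # pupil size estimate in pixels
```

On real eye images, eyelids, eyelashes, and corneal reflections break such a fixed global threshold, which is what motivates the self-tuning approach.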

  11. OCA Oracle Database SQL exam guide (exam 1Z0-071) complete exam preparation

    CERN Document Server

    O'Hearn, Steve

    2017-01-01

    This thoroughly revised Oracle Press guide offers 100% coverage of all objectives on the latest version of the Oracle Database SQL Exam. Ideal both as a study guide and on-the-job reference, OCA Oracle Database SQL Exam Guide (Exam 1Z0-071) features detailed explanations, examples, practice questions, and chapter summaries. “Certification Objectives,” “Exam Watch,” and “On the Job” sections reinforce salient points throughout. You will gain access to two complete practice exams that match the tone, tenor, and format of the live test. Get complete coverage of every topic on Exam 1Z0-071, including: • DDL and SQL SELECT statements • Manipulating, restricting, and sorting data • Single-row and group functions • Displaying data from multiple tables • Subqueries • Schema objects • Set operators • Grouping related data • Report creation • Data dictionary views • Large data sets • Hierarchical retrieval • Regular expression support • User access control The electronic content includes: • Two full practi...

  12. OCA Oracle Database SQL exam guide (exam 1Z0-071) : complete exam preparation

    CERN Document Server

    O'Hearn, Steve

    2017-01-01

    This thoroughly revised Oracle Press guide offers 100% coverage of all objectives on the latest version of the Oracle Database SQL Exam. Ideal both as a study guide and on-the-job reference, OCA Oracle Database SQL Exam Guide (Exam 1Z0-071) features detailed explanations, examples, practice questions, and chapter summaries. “Certification Objectives,” “Exam Watch,” and “On the Job” sections reinforce salient points throughout. You will gain access to two complete practice exams that match the tone, tenor, and format of the live test. Get complete coverage of every topic on Exam 1Z0-071, including: • DDL and SQL SELECT statements • Manipulating, restricting, and sorting data • Single-row and group functions • Displaying data from multiple tables • Subqueries • Schema objects • Set operators • Grouping related data • Report creation • Data dictionary views • Large data sets • Hierarchical retrieval • Regular expression support • User access control The electronic includes: • Two full practi...

  13. Analysis, Design and Implementation of a Web Database With Oracle 8I

    National Research Council Canada - National Science Library

    Demiryurek, Ugur

    2001-01-01

    ....O served as the OS environment. From the technical aspect, Database Management Systems, Web-Database Architectures, Server Extension Programs, Oracle8i as well as several other software and hardware...

  14. OCA Oracle Database 12c administrator certified associate study guide : exams 1Z0-061 and 1Z0-062

    CERN Document Server

    Thomas, Biju

    2014-01-01

    An all-in-one study guide prepares you for the updated Oracle Certified Associate certification It's been nearly six years since Oracle updated its cornerstone database software, making the demand for a comprehensive study guide for the OCA 12c certification a top priority. This resource answers that demand. Packed with invaluable insight, chapter review questions, bonus practice exams, hundreds of electronic flashcards, and a searchable glossary of terms, this study guide prepares you for the challenging Oracle certification exams. Provides you with a solid understanding of restricting and s

  15. Leveling a Simple Manufacturing Process Using Microsoft Project and Oracle Primavera

    Directory of Open Access Journals (Sweden)

    Ilie Margareta

    2017-01-01

    Full Text Available The main objective of the present paper is to put forward two software tools, Microsoft Project 2013 and Oracle Primavera P6 Professional, used for management process planning, especially for industrial projects. A second goal is to compare some features of the two applications and to highlight aspects that make them easier to use. The presentation and comparison considered four small processes with a total of 38 activities. The two applications are used for the leveling of seven labor resources defined for the achievement of the activities. The results of the analyses point out that, even though the two tools have the same purpose, Microsoft Project can be used for processes of simple to average complexity and size, while Oracle Primavera has more specialized capabilities (libraries and specific project plans) and more options for modeling the actions needed for better leveling.

  16. ORACLS- OPTIMAL REGULATOR ALGORITHMS FOR THE CONTROL OF LINEAR SYSTEMS (CDC VERSION)

    Science.gov (United States)

    Armstrong, E. S.

    1994-01-01

    This control theory design package, called Optimal Regulator Algorithms for the Control of Linear Systems (ORACLS), was developed to aid in the design of controllers and optimal filters for systems which can be modeled by linear, time-invariant differential and difference equations. Optimal linear quadratic regulator theory, currently referred to as the Linear-Quadratic-Gaussian (LQG) problem, has become the most widely accepted method of determining optimal control policy. Within this theory, the infinite duration time-invariant problems, which lead to constant gain feedback control laws and constant Kalman-Bucy filter gains for reconstruction of the system state, exhibit high tractability and potential ease of implementation. A variety of new and efficient methods in the field of numerical linear algebra have been combined into the ORACLS program, which provides for the solution to time-invariant continuous or discrete LQG problems. The ORACLS package is particularly attractive to the control system designer because it provides a rigorous tool for dealing with multi-input and multi-output dynamic systems in both continuous and discrete form. The ORACLS programming system is a collection of subroutines which can be used to formulate, manipulate, and solve various LQG design problems. The ORACLS program is constructed in a manner which permits the user to maintain considerable flexibility at each operational state. This flexibility is accomplished by providing primary operations, analysis of linear time-invariant systems, and control synthesis based on LQG methodology. The input-output routines handle the reading and writing of numerical matrices, printing heading information, and accumulating output information. The basic vector-matrix operations include addition, subtraction, multiplication, equation, norm construction, tracing, transposition, scaling, juxtaposition, and construction of null and identity matrices. The analysis routines provide for the following
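ORACLS itself is a Fortran package and is not reproduced here. As a hedged sketch of the constant-gain LQR machinery the abstract describes, the following iterates the discrete-time Riccati recursion to a fixed point and checks that the resulting constant feedback gain stabilizes a toy double-integrator plant. The plant, weights, and all names are illustrative assumptions, not ORACLS routines.

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Infinite-horizon discrete-time LQR: iterate the Riccati recursion
    P <- Q + A'PA - A'PB (R + B'PB)^-1 B'PA to a fixed point, then
    return the constant gain K = (R + B'PB)^-1 B'PA."""
    P = Q.copy()
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P_next = Q + A.T @ P @ (A - B @ K)   # algebraically the same update
        if np.allclose(P_next, P, atol=1e-12):
            P = P_next
            break
        P = P_next
    return K, P

# Toy plant: a discretized double integrator with sample time dt = 0.1
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
Q = np.eye(2)                 # state weighting
R = np.array([[1.0]])         # control weighting

K, P = dlqr(A, B, Q, R)
closed_loop = A - B @ K       # constant-gain feedback u = -K x
eigs = np.abs(np.linalg.eigvals(closed_loop))
```

All closed-loop eigenvalue magnitudes fall inside the unit circle, i.e. the constant-gain law is stabilizing, which is the property the LQG design routines in ORACLS deliver.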

  17. Implementation of Oracle Spatial for Mapping Food Security and Food Insecurity in Brebes Regency

    Directory of Open Access Journals (Sweden)

    Agusta Praba Ristadi Pinem

    2016-11-01

    Full Text Available Food insecurity is a common problem in some areas and must be taken into consideration because of its impact on human life. Food insecurity refers to the inability of people in local communities to secure adequate food. Food insecurity maps can be used to identify areas that tend to be insecure. Based on the Central Java Food Security and Vulnerability Atlas (FSVA) of 2010, Brebes has held a food-secure status, which distinguishes it from all other districts and cities in Central Java province. This anomaly in food security status in Brebes is the main concern of this research, which implements Oracle Spatial to map food security and vulnerability. Oracle Spatial combines the spatial data and food data to display information about the food security status of each district in Brebes.

  18. Discourse analysis: Conversational analysis of the internal conversation in Oracle Corporation Malaysia

    Directory of Open Access Journals (Sweden)

    Marwa Marwa

    2017-07-01

    Full Text Available This study highlights the internal conversation which takes place in Oracle Corporation Malaysia. Through the study, it will be shown how conversational analysis is used to analyze the transcription of a telephone conversation between Oracle staff members. The analysis of the transcriptions applies a few basic concepts of conversational analysis: turn-taking organization and the adjacency pair. The objective of the study is to find out how internal conversations take place by focusing on the conversation itself, that is, the conversational structures spontaneously produced by people during talk, ranging from turn-taking strategies to how topics are introduced, conversation closings, and so on. By looking in detail at such talk, we can gain a detailed understanding of how the staff see themselves in relation to the company that influences their daily lives. Keywords: conversational analysis, turn-taking, adjacency pairs

  19. Incorporating Oracle on-line space management with long-term archival technology

    Science.gov (United States)

    Moran, Steven M.; Zak, Victor J.

    1996-01-01

    The storage requirements of today's organizations are exploding. As computers continue to escalate in processing power, applications grow in complexity and data files grow in size and in number. As a result, organizations are forced to procure more and more megabytes of storage space. This paper focuses on how to expand the storage capacity of a Very Large Database (VLDB) cost-effectively within an Oracle7 data warehouse system by integrating long-term archival storage sub-systems with traditional magnetic media. The Oracle architecture described in this paper was based on an actual proof of concept for a customer looking to store archived data on optical disks yet still have access to this data without user intervention. The customer had a requirement to maintain 10 years' worth of data on-line. Data less than a year old still had the potential to be updated and thus will reside on conventional magnetic disks. Data older than a year will be considered archived and will be placed on optical disks. The ability to archive data to optical disk and still have access to that data provides the system with a means to retain large amounts of readily accessible data while significantly reducing the cost of total system storage. Therefore, the cost benefits of archival storage devices can be incorporated into the Oracle storage medium and I/O subsystem without losing any of the functionality of transaction processing, while at the same time providing an organization access to all their data.

  20. Test oracle automation for V&V of an autonomous spacecraft's planner

    Science.gov (United States)

    Feather, M. S.; Smith, B.

    2001-01-01

    We built automation to assist the software testing efforts associated with the Remote Agent experiment. In particular, our focus was upon introducing test oracles into the testing of the planning and scheduling system component. This summary is intended to provide an overview of the work.

  1. An Object-Relational Ifc Storage Model Based on Oracle Database

    Science.gov (United States)

    Li, Hang; Liu, Hua; Liu, Yong; Wang, Yuan

    2016-06-01

    As building models get increasingly complicated, the level of collaboration across professionals attracts more attention in the architecture, engineering and construction (AEC) industry. In order to adapt to this change, buildingSMART developed the Industry Foundation Classes (IFC) to facilitate interoperability between software platforms. However, IFC data are currently shared in the form of text files, which has drawbacks. In this paper, considering the object-based inheritance hierarchy of IFC and the storage features of different database management systems (DBMS), we propose a novel object-relational storage model that uses an Oracle database to store IFC data. Firstly, we establish the mapping rules between data types in the IFC specification and the Oracle database. Secondly, we design the IFC database according to the relationships among IFC entities. Thirdly, we parse the IFC file and extract the IFC data. And lastly, we store the IFC data into the corresponding tables in the IFC database. In the experiment, three different building models are selected to demonstrate the effectiveness of our storage model. The comparison of experimental statistics proves that IFC data are lossless during data exchange.

  2. Huldah's oracle: The origin of the Chronicler's typical style?

    Directory of Open Access Journals (Sweden)

    Louis C. Jonker

    2012-02-01

    Full Text Available Scholars of Chronicles normally emphasise that the Chronicler used typical words and phrases in those parts that belong to his Sondergut. Amongst these are phrases like 'to humble yourself', 'to seek Yahweh', and 'not to forsake Yahweh'. The writer's typical changes to the burial notices of the royal narratives also belong in this category. Something which is often overlooked, however, is that many of these features already occur in the narrative about Huldah's oracle (2 Chr 34:19–28), which was taken over with only minor changes from the Deuteronomistic version (2 Ki 22:11–20). My paper investigates whether or not the Huldah oracle could have served as a theological paradigm according to which the Chronicler developed his own unique style. If so, the investigation will prompt me to revisit the issue of how continuity and discontinuity with the older historiographical tradition characterise the identity negotiation process that we witness in this literature.

  3. Oracle Efficient Variable Selection in Random and Fixed Effects Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    This paper generalizes the results for the Bridge estimator of Huang et al. (2008) to linear random and fixed effects panel data models which are allowed to grow in both dimensions. In particular we show that the Bridge estimator is oracle efficient. It can correctly distinguish between relevant and irrelevant variables, and the asymptotic distribution of the estimators of the coefficients of the relevant variables is the same as if only these had been included in the model, i.e. as if an oracle had revealed the true model prior to estimation. In the case of more explanatory variables than observations, we prove that the Marginal Bridge estimator can asymptotically correctly distinguish between relevant and irrelevant explanatory variables. We do this without restricting the dependence between covariates and without assuming sub-Gaussianity of the error terms, thereby generalizing the results...
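The Bridge estimator itself is not reproduced here. As a rough illustration of the penalized variable selection the abstract discusses, the sketch below runs plain LASSO coordinate descent on synthetic cross-sectional data and recovers exactly the relevant variables; the penalty value, data, and names are assumptions, and the Bridge and adaptive penalties of the paper differ from the L1 penalty used here.

```python
import numpy as np

def lasso_cd(X, y, lam, iters=200):
    """Plain LASSO by cyclic coordinate descent with soft-thresholding.
    (Illustrates penalized variable selection; the Bridge estimator of
    the paper uses a different penalty.)"""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:3] = [3.0, -2.0, 1.5]                  # only 3 relevant variables
y = X @ true_beta + 0.1 * rng.normal(size=n)

beta_hat = lasso_cd(X, y, lam=20.0)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-6)
```

With well-separated signal and noise, the estimator zeroes out the seven irrelevant coefficients and keeps the three relevant ones, which is the sparsity-pattern side of the oracle property.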

  4. ORACLS- OPTIMAL REGULATOR ALGORITHMS FOR THE CONTROL OF LINEAR SYSTEMS (DEC VAX VERSION)

    Science.gov (United States)

    Frisch, H.

    1994-01-01

    This control theory design package, called Optimal Regulator Algorithms for the Control of Linear Systems (ORACLS), was developed to aid in the design of controllers and optimal filters for systems which can be modeled by linear, time-invariant differential and difference equations. Optimal linear quadratic regulator theory, currently referred to as the Linear-Quadratic-Gaussian (LQG) problem, has become the most widely accepted method of determining optimal control policy. Within this theory, the infinite duration time-invariant problems, which lead to constant gain feedback control laws and constant Kalman-Bucy filter gains for reconstruction of the system state, exhibit high tractability and potential ease of implementation. A variety of new and efficient methods in the field of numerical linear algebra have been combined into the ORACLS program, which provides for the solution to time-invariant continuous or discrete LQG problems. The ORACLS package is particularly attractive to the control system designer because it provides a rigorous tool for dealing with multi-input and multi-output dynamic systems in both continuous and discrete form. The ORACLS programming system is a collection of subroutines which can be used to formulate, manipulate, and solve various LQG design problems. The ORACLS program is constructed in a manner which permits the user to maintain considerable flexibility at each operational state. This flexibility is accomplished by providing primary operations, analysis of linear time-invariant systems, and control synthesis based on LQG methodology. The input-output routines handle the reading and writing of numerical matrices, printing heading information, and accumulating output information. The basic vector-matrix operations include addition, subtraction, multiplication, equation, norm construction, tracing, transposition, scaling, juxtaposition, and construction of null and identity matrices. The analysis routines provide for the following

  5. Variable-Speed, Robust Synchronous Reluctance Machine Drive Systems

    DEFF Research Database (Denmark)

    Wang, Dong

    The synchronous reluctance machine drive is attracting more and more interest from industry, since it can provide higher system energy efficiency than traditional inverter-fed induction machine drive systems at similar production cost. It is considered a good candidate for super...... is recommended. In recent years, there has been an increasing trend to replace the electrolytic capacitor in the frequency converter with a film capacitor, which has a longer expected service lifetime and no explosion risk. Furthermore, it is possible to achieve a compact converter design by using a film capacitor, since...

  6. Robust total energy demand estimation with a hybrid Variable Neighborhood Search – Extreme Learning Machine algorithm

    International Nuclear Information System (INIS)

    Sánchez-Oro, J.; Duarte, A.; Salcedo-Sanz, S.

    2016-01-01

    Highlights: • The total energy demand in Spain is estimated with a Variable Neighborhood algorithm. • Socio-economic variables are used, and one year ahead prediction horizon is considered. • Improvement of the prediction with an Extreme Learning Machine network is considered. • Experiments are carried out in real data for the case of Spain. - Abstract: Energy demand prediction is an important problem whose solution is evaluated by policy makers in order to take key decisions affecting the economy of a country. A number of previous approaches to improve the quality of this estimation have been proposed in the last decade, the majority of them applying different machine learning techniques. In this paper, the performance of a robust hybrid approach, composed of a Variable Neighborhood Search algorithm and a new class of neural network called Extreme Learning Machine, is discussed. The Variable Neighborhood Search algorithm is focused on obtaining the most relevant features among the set of initial ones, by including an exponential prediction model. While previous approaches consider that the number of macroeconomic variables used for prediction is a parameter of the algorithm (i.e., it is fixed a priori), the proposed Variable Neighborhood Search method optimizes both: the number of variables and the best ones. After this first step of feature selection, an Extreme Learning Machine network is applied to obtain the final energy demand prediction. Experiments in a real case of energy demand estimation in Spain show the excellent performance of the proposed approach. In particular, the whole method obtains an estimation of the energy demand with an error lower than 2%, even when considering the crisis years, which are a real challenge.
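The Variable Neighborhood Search wrapper is beyond a short sketch, but the Extreme Learning Machine component the abstract pairs it with has a simple closed form: fix random hidden-layer weights, then solve the output weights by least squares, with no backpropagation. A minimal version on toy data (the network size, toy target, and all names are assumptions, not the paper's energy-demand model):

```python
import numpy as np

def elm_fit(X, y, n_hidden=60, seed=0):
    """Extreme Learning Machine: random tanh hidden layer, output
    weights solved in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                     # random feature map
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return (W, b, beta)

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy regression: learn y = sin(x) on [-3, 3] from noisy samples
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=300)

model = elm_fit(X, y, n_hidden=60)
X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
err = np.max(np.abs(elm_predict(model, X_test) - np.sin(X_test[:, 0])))
```

Because training is a single linear solve, retraining inside an outer search loop (as the VNS feature selection requires) is cheap.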

  7. Advanced Machine learning Algorithm Application for Rotating Machine Health Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Kanemoto, Shigeru; Watanabe, Masaya [The University of Aizu, Aizuwakamatsu (Japan); Yusa, Noritaka [Tohoku University, Sendai (Japan)

    2014-08-15

    The present paper evaluates the applicability of conventional sound analysis techniques and modern machine learning algorithms to rotating machine health monitoring. These techniques include the support vector machine, deep learning neural network, etc. The inner ring defect and misalignment anomaly sound data measured by a rotating machine mockup test facility are used to verify the above algorithms. Although we cannot find a remarkable difference in anomaly discrimination performance, some methods give us very interesting eigen patterns corresponding to the normal and abnormal states. These results will be useful for future, more sensitive and robust anomaly monitoring technology.

  8. Advanced Machine learning Algorithm Application for Rotating Machine Health Monitoring

    International Nuclear Information System (INIS)

    Kanemoto, Shigeru; Watanabe, Masaya; Yusa, Noritaka

    2014-01-01

    The present paper evaluates the applicability of conventional sound analysis techniques and modern machine learning algorithms to rotating machine health monitoring. These techniques include the support vector machine, deep learning neural network, etc. The inner ring defect and misalignment anomaly sound data measured by a rotating machine mockup test facility are used to verify the above algorithms. Although we cannot find a remarkable difference in anomaly discrimination performance, some methods give us very interesting eigen patterns corresponding to the normal and abnormal states. These results will be useful for future, more sensitive and robust anomaly monitoring technology.

  9. Robust anti-synchronization of uncertain chaotic systems based on multiple-kernel least squares support vector machine modeling

    International Nuclear Information System (INIS)

    Chen Qiang; Ren Xuemei; Na Jing

    2011-01-01

    Highlights: • Model uncertainty of the system is approximated by multiple-kernel LSSVM. • Approximation errors and disturbances are compensated in the controller design. • Asymptotical anti-synchronization is achieved with model uncertainty and disturbances. - Abstract: In this paper, we propose a robust anti-synchronization scheme based on multiple-kernel least squares support vector machine (MK-LSSVM) modeling for two uncertain chaotic systems. The multiple-kernel regression, which is a linear combination of basic kernels, is designed to approximate system uncertainties by constructing a multiple-kernel Lagrangian function and computing the corresponding regression parameters. Then, a robust feedback control based on MK-LSSVM modeling is presented and an improved update law is employed to estimate the unknown bound of the approximation error. The proposed control scheme can guarantee the asymptotic convergence of the anti-synchronization errors in the presence of system uncertainties and external disturbances. Numerical examples are provided to show the effectiveness of the proposed method.

  10. Oracle Service Bus 11g Development Cookbook

    CERN Document Server

    Schmutz, Guido; van Zoggel, Jan

    2012-01-01

    This cookbook is full of immediately useable recipes showing you how to develop service and message-oriented (integration) applications on the Oracle Service Bus. In addition to its cookbook style, which ensures the solutions are presented in a clear step-by-step manner, the explanations go into great detail, which makes it good learning material for everyone who has experience in OSB and wants to improve. Most of the recipes are designed in such a way that each recipe is presented as a separate, standalone entity and reading of prior recipes is not required. The finished solution of each reci

  11. Robust Parallel Machine Scheduling Problem with Uncertainties and Sequence-Dependent Setup Time

    Directory of Open Access Journals (Sweden)

    Hongtao Hu

    2016-01-01

    Full Text Available A parallel machine scheduling problem in plastic production is studied in this paper. In this problem, the processing times and arrival times are uncertain but lie in their respective intervals. In addition, each job must be processed together with a mold, while jobs which belong to one family can share the same mold. Therefore, time for changing the mold is required between two consecutive jobs that belong to different families, which is known as sequence-dependent setup time. This paper aims to identify a robust schedule by the min–max regret criterion. It is proved that the scenario incurring the maximal regret for each feasible solution lies in a finite set of extreme scenarios. A mixed integer linear programming formulation and an exact algorithm are proposed to solve the problem. Moreover, a modified artificial bee colony algorithm is developed to solve large-scale problems. The performance of the presented algorithm is evaluated through extensive computational experiments and the results show that the proposed algorithm surpasses the exact method in terms of objective value and computational time.
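A heavily stripped-down sketch of the min-max regret idea: one machine, no molds or setups, interval processing times, total completion time objective. It evaluates the worst-case regret of a sequence by enumerating the extreme scenarios (each processing time at its lower or upper bound), mirroring the paper's result that the maximal regret is attained at an extreme scenario, and then picks the sequence with the smallest worst-case regret. The instance data are made up.

```python
import itertools

def total_completion_time(order, p):
    """Sum of job completion times for a fixed sequence."""
    t, total = 0, 0
    for j in order:
        t += p[j]
        total += t
    return total

def optimal_cost(p):
    """SPT (shortest processing time first) is optimal for this objective."""
    order = sorted(range(len(p)), key=lambda j: p[j])
    return total_completion_time(order, p)

def max_regret(order, lo, hi):
    """Worst-case regret of a fixed sequence over the interval box,
    checked on the 2^n extreme scenarios."""
    worst = 0
    for scenario in itertools.product(*zip(lo, hi)):
        regret = total_completion_time(order, scenario) - optimal_cost(scenario)
        worst = max(worst, regret)
    return worst

lo = [2, 4, 1, 6]            # lower processing-time bounds
hi = [5, 6, 9, 7]            # upper processing-time bounds
best_order = min(itertools.permutations(range(4)),
                 key=lambda o: max_regret(o, lo, hi))
best_regret = max_regret(best_order, lo, hi)
```

This brute force is only viable for tiny instances; the paper's MILP, exact algorithm, and bee colony heuristic exist precisely because both the scenario set and the sequence set explode combinatorially.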

  12. Oracle Enterprise Manager 12c command-line interface

    CERN Document Server

    Pot'Vin, Kellyn; Smith, Ray

    2014-01-01

    Oracle Enterprise Manager 12c Command-Line Interface shows how to use Enterprise Manager's powerful scripting language to automate your database administration work and save time by scripting routine tasks, and then executing those scripts across collections of databases and instances in your environment. This book is chock full of ready-made scripting examples contributed by the authors and leading members of the community. For example, you'll find scripts and examples of commands to: Remove an Enterprise Manager agent and its related targetsQuickly create administrator accounts that are ful

  13. Efficient representation of DNA data for pattern recognition using failure factor oracles

    NARCIS (Netherlands)

    Cleophas, Loek; Kourie, Derrick G.; Watson, Bruce W.

    2013-01-01

    In indexing of and pattern matching on DNA sequences, representing all factors of a sequence is important. One efficient, compact representation is the factor oracle (FO). At the same time, any classical deterministic finite automata (DFA) can be transformed to a so-called failure one (FDFA), which
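The factor oracle mentioned above has a compact online construction (Allauzen, Crochemore and Raffinot). A sketch that builds the automaton over a toy DNA-free string and checks the defining property, that every factor of the sequence is accepted, with at most 2n-1 transitions:

```python
def factor_oracle(s):
    """Online factor oracle construction: |s|+1 states, and an automaton
    that accepts at least every factor of s."""
    n = len(s)
    trans = [dict() for _ in range(n + 1)]   # trans[state][char] -> state
    supply = [0] * (n + 1)                   # supply (failure-like) links
    supply[0] = -1
    for i, c in enumerate(s):
        trans[i][c] = i + 1                  # internal transition
        k = supply[i]
        while k > -1 and c not in trans[k]:
            trans[k][c] = i + 1              # external transition
            k = supply[k]
        supply[i + 1] = trans[k][c] if k > -1 else 0
    return trans

def accepts(trans, word):
    """Follow transitions from the initial state; every state is final."""
    state = 0
    for c in word:
        if c not in trans[state]:
            return False
        state = trans[state][c]
    return True

oracle = factor_oracle("abbbaab")
```

The oracle may also accept some non-factors; that over-acceptance is the price of its linear size, which is why it is attractive for indexing long DNA sequences.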

  14. Characters Feature Extraction Based on Neat Oracle Bone Rubbings

    OpenAIRE

    Lei Guo

    2013-01-01

    In order to recognize characters on neat oracle bone rubbings, a new mesh point feature extraction algorithm was put forward in this paper by researching and improving the existing coarse mesh feature extraction algorithm and the point feature extraction algorithm. Some improvements of this algorithm were as follows: point features were introduced into the coarse mesh feature, the absolute address was converted to a relative address, and point features have been changed grid and positio...
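The paper's exact mesh-plus-point feature is only summarized above. As a hedged sketch of the underlying coarse mesh feature, the following splits a binary character image into grid cells and uses per-cell foreground density as the feature vector; the grid size and the toy diagonal-stroke image are assumptions.

```python
import numpy as np

def mesh_features(binary, grid=(4, 4)):
    """Coarse mesh feature: split a binary character image into a grid
    of cells and return the foreground-pixel density of each cell."""
    h, w = binary.shape
    gh, gw = grid
    feats = np.zeros(gh * gw)
    for r in range(gh):
        for c in range(gw):
            cell = binary[r * h // gh:(r + 1) * h // gh,
                          c * w // gw:(c + 1) * w // gw]
            feats[r * gw + c] = cell.mean()   # stroke density in this cell
    return feats

# Toy "character": a diagonal stroke on a 16x16 canvas
img = np.zeros((16, 16), dtype=float)
for i in range(16):
    img[i, i] = 1.0
f = mesh_features(img, grid=(4, 4))
```

Only the four diagonal cells receive nonzero density, so the 16-dimensional vector captures the stroke layout while being insensitive to small pixel-level noise.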

  15. Typologically robust statistical machine translation : Understanding and exploiting differences and similarities between languages in machine translation

    NARCIS (Netherlands)

    Daiber, J.

    2018-01-01

    Machine translation systems often incorporate modeling assumptions motivated by properties of the language pairs they initially target. When such systems are applied to language families with considerably different properties, translation quality can deteriorate. Phrase-based machine translation

  16. On the Oracle Property of the Adaptive LASSO in Stationary and Nonstationary Autoregressions

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl

    We show that the Adaptive LASSO is oracle efficient in stationary and non-stationary autoregressions. This means that it estimates parameters consistently, selects the correct sparsity pattern, and estimates the coefficients belonging to the relevant variables at the same asymptotic efficiency...

  17. An Oracle-based co-training framework for writer identification in offline handwriting

    Science.gov (United States)

    Porwal, Utkarsh; Rajan, Sreeranga; Govindaraju, Venu

    2012-01-01

    State-of-the-art techniques for writer identification have centered primarily on enhancing the performance of the system. Machine learning algorithms have been used extensively to improve the accuracy of such systems, assuming a sufficient amount of data is available for training. Little attention has been paid to the prospect of harnessing the information tapped in a large amount of un-annotated data. This paper focuses on a co-training based framework that can be used for iterative labeling of the unlabeled data set, exploiting the independence between the multiple views (features) of the data. This paradigm relaxes the assumption of the sufficiency of the available data and tries to generate labeled data from the unlabeled data set while improving the accuracy of the system. However, the performance of a co-training based framework depends on the effectiveness of the algorithm used for selecting the data points to be added to the labeled set. We propose an Oracle-based approach for data selection that learns the patterns in the score distribution of classes for labeled data points and then predicts the labels (writers) of the unlabeled data points. This selection method statistically learns the class distribution and predicts the most probable class, unlike traditional selection algorithms based on heuristic approaches. We conducted experiments on the publicly available IAM dataset and illustrate the efficacy of the proposed approach.
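The Oracle-based selection model itself is not reproduced here. Below is a bare-bones sketch of the co-training loop it plugs into: two classifiers trained on independent views take turns labeling the unlabeled points they are most confident about. Nearest-centroid classifiers and a distance margin stand in for the paper's features and score-distribution model; all data are synthetic.

```python
import numpy as np

def centroid_fit(X, y):
    """Nearest-centroid classifier: one mean vector per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def centroid_predict(model, X):
    classes = sorted(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    pred = np.array([classes[i] for i in d.argmin(axis=0)])
    margin = np.sort(d, axis=0)[1] - np.sort(d, axis=0)[0]  # confidence
    return pred, margin

def co_train(X1, X2, y, labeled, rounds=5, per_round=10):
    """Each round, each view labels its most confident unlabeled points."""
    labels = y.astype(float).copy()
    labels[~labeled] = np.nan
    known = labeled.copy()
    for _ in range(rounds):
        for X in (X1, X2):
            model = centroid_fit(X[known], labels[known].astype(int))
            pred, margin = centroid_predict(model, X)
            cand = np.flatnonzero(~known)
            pick = cand[np.argsort(-margin[cand])[:per_round]]
            labels[pick] = pred[pick]
            known[pick] = True
    return labels, known

rng = np.random.default_rng(0)
n = 200
y = np.arange(n) % 2                               # two balanced classes
X1 = rng.normal(size=(n, 2)) + 3.0 * y[:, None]    # view 1
X2 = rng.normal(size=(n, 2)) - 3.0 * y[:, None]    # view 2, independent noise
labeled = np.zeros(n, dtype=bool)
labeled[:10] = True                                # only 10 seed labels

labels, known = co_train(X1, X2, y, labeled)
acc = (labels[known] == y[known]).mean()
```

Starting from ten labels, the loop labels a hundred more points with high accuracy; the paper's contribution is precisely a smarter, learned replacement for the margin-based pick.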

  18. Demonstration of quantum advantage in machine learning

    Science.gov (United States)

    Ristè, Diego; da Silva, Marcus P.; Ryan, Colm A.; Cross, Andrew W.; Córcoles, Antonio D.; Smolin, John A.; Gambetta, Jay M.; Chow, Jerry M.; Johnson, Blake R.

    2017-04-01

    The main promise of quantum computing is to efficiently solve certain problems that are prohibitively expensive for a classical computer. Most problems with a proven quantum advantage involve the repeated use of a black box, or oracle, whose structure encodes the solution. One measure of the algorithmic performance is the query complexity, i.e., the scaling of the number of oracle calls needed to find the solution with a given probability. Few-qubit demonstrations of quantum algorithms, such as Deutsch-Jozsa and Grover, have been implemented across diverse physical systems such as nuclear magnetic resonance, trapped ions, optical systems, and superconducting circuits. However, at the small scale, these problems can already be solved classically with a few oracle queries, limiting the obtained advantage. Here we solve an oracle-based problem, known as learning parity with noise, on a five-qubit superconducting processor. Executing classical and quantum algorithms using the same oracle, we observe a large gap in query count in favor of quantum processing. We find that this gap grows by orders of magnitude as a function of the error rates and the problem size. This result demonstrates that, while complex fault-tolerant architectures will be required for universal quantum computing, a significant quantum advantage already emerges in existing noisy systems.
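The classical side of such query-count comparisons is easy to simulate. For a noise-free parity oracle f(x) = a·x mod 2, a classical learner needs one query per bit of the hidden string a (the quantum algorithm needs a single query; the noisy learning-parity setting studied in the paper is harder still). A counting sketch, with an illustrative secret string:

```python
import numpy as np

def make_parity_oracle(a):
    """Black-box oracle for f(x) = a . x mod 2 that counts its queries."""
    calls = {"n": 0}
    def f(x):
        calls["n"] += 1
        return int(np.dot(a, x) % 2)
    return f, calls

def classical_learn_parity(f, n):
    """Query the oracle on each basis vector e_i, since f(e_i) = a_i.
    This takes exactly n classical queries, one per bit."""
    a = np.zeros(n, dtype=int)
    for i in range(n):
        e = np.zeros(n, dtype=int)
        e[i] = 1
        a[i] = f(e)
    return a

n = 8
secret = np.array([1, 0, 1, 1, 0, 0, 1, 0])
f, calls = make_parity_oracle(secret)
recovered = classical_learn_parity(f, n)
```

The linear-in-n query count is the classical baseline against which the single-query quantum procedure, and its noisy-oracle generalization in the paper, is measured.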

  19. On Notions of Security for Deterministic Encryption, and Efficient Constructions Without Random Oracles

    NARCIS (Netherlands)

    S. Boldyreva; S. Fehr (Serge); A. O'Neill; D. Wagner

    2008-01-01

    The study of deterministic public-key encryption was initiated by Bellare et al. (CRYPTO ’07), who provided the “strongest possible” notion of security for this primitive (called PRIV) and constructions in the random oracle (RO) model. We focus on constructing efficient deterministic

  20. OCA Oracle Database 11g SQL Fundamentals I

    CERN Document Server

    Ries, Steve

    2011-01-01

    This book is packed with real-world examples. Each major certification topic is covered in a separate chapter, which helps make the concepts easier to understand. At the end of each chapter, you will find a variety of practice questions to strengthen and test your learning. You will get a feel for the actual SQL Fundamentals I exam by solving practice papers modeled on it. This book is for anyone who needs the essential skills to pass the Oracle Database SQL Fundamentals I exam and use those skills in daily life as an SQL developer or database administrator.

  1. OrChem - An open source chemistry search engine for Oracle®

    Science.gov (United States)

    2009-01-01

    Background Registration, indexing and searching of chemical structures in relational databases is one of the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Results Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times in the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. Availability OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net. PMID:20298521
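OrChem's internals are not shown here, but the similarity search it describes conventionally ranks compounds by Tanimoto similarity over fingerprint bit sets against a cut-off. A toy sketch; the compound IDs and fingerprints are made up, and a real engine would operate on database-indexed bit vectors rather than Python sets.

```python
def tanimoto(fp1, fp2):
    """Tanimoto (Jaccard) similarity of two fingerprints given as sets
    of on-bit positions: |A & B| / |A | B|."""
    a, b = set(fp1), set(fp2)
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def similarity_search(query, db, cutoff=0.7):
    """Return (compound_id, score) pairs at or above the cut-off,
    best match first."""
    hits = [(cid, tanimoto(query, fp)) for cid, fp in db.items()]
    return sorted([h for h in hits if h[1] >= cutoff], key=lambda t: -t[1])

# Hypothetical fingerprint database: on-bit positions per compound
db = {
    "mol_a": {1, 4, 9, 16, 25},
    "mol_b": {1, 4, 9, 16, 36},
    "mol_c": {2, 3, 5, 7},
}
query = {1, 4, 9, 16, 25}
hits = similarity_search(query, db, cutoff=0.5)
```

Raising the cut-off shrinks the candidate set, which is how such engines keep response times in the order of seconds on multi-million-compound tables.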

  2. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between the measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Thanks to the database, the presented method is also capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
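The candidate-competition idea can be sketched in a few lines: generate candidates both from a direct fit and from a historical reference database, then keep whichever best matches the measured curve. A toy Python version, using a constant least-squares model as a stand-in for the iterative fit (all names and the model family are illustrative, not the paper's implementation):

```python
def sse(a, b):
    """Sum of squared errors between two time-activity curves."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def fit_mean(curve):
    # Stand-in for iterative fitting: the least-squares constant model.
    m = sum(curve) / len(curve)
    return [m] * len(curve)

def candidate_competition(measured, reference_db):
    """Let the direct fit compete with historical reference curves;
    the candidate with the smallest residual wins."""
    candidates = [fit_mean(measured)] + list(reference_db)
    return min(candidates, key=lambda c: sse(measured, c))
```

When the measured curve resembles a historical one, the reference wins; otherwise the direct fit does, which is the balance between historical data and the unseen target curve described above.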

  3. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between the measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Thanks to the database, the presented method is also capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  4. The development of ORACLe: a measure of an organisation's capacity to engage in evidence-informed health policy.

    Science.gov (United States)

    Makkar, Steve R; Turner, Tari; Williamson, Anna; Louviere, Jordan; Redman, Sally; Haynes, Abby; Green, Sally; Brennan, Sue

    2016-01-14

    Evidence-informed policymaking is more likely if organisations have cultures that promote research use and invest in resources that facilitate staff engagement with research. Measures of organisations' research use culture and capacity are needed to assess current capacity, identify opportunities for improvement, and examine the impact of capacity-building interventions. The aim of the current study was to develop a comprehensive system to measure and score organisations' capacity to engage with and use research in policymaking, which we entitled ORACLe (Organisational Research Access, Culture, and Leadership). We used a multifaceted approach to develop ORACLe. Firstly, we reviewed the available literature to identify key domains of organisational tools and systems that may facilitate research use by staff. We interviewed senior health policymakers to verify the relevance and applicability of these domains. This information was used to generate an interview schedule that focused on seven key domains of organisational capacity. The interview was pilot-tested within four Australian policy agencies. A discrete choice experiment (DCE) was then undertaken using an expert sample to establish the relative importance of these domains. This data was used to produce a scoring system for ORACLe. The ORACLe interview was developed, comprised of 23 questions addressing seven domains of organisational capacity and tools that support research use, including (1) documented processes for policymaking; (2) leadership training; (3) staff training; (4) research resources (e.g. database access); and systems to (5) generate new research, (6) undertake evaluations, and (7) strengthen relationships with researchers. From the DCE data, a conditional logit model was estimated to calculate total scores that took into account the relative importance of the seven domains. The model indicated that our expert sample placed the greatest importance on domains (2), (3) and (4). We utilised
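The scoring step described above can be illustrated as a weighted sum over the seven domains. The weights below are invented for illustration only (the study derives real weights from a conditional logit model fitted to discrete choice experiment data, with domains 2, 3 and 4 weighted most heavily):

```python
# Hypothetical weights: domains (2) leadership training, (3) staff training
# and (4) research resources carry the greatest importance, mirroring the
# DCE finding; the exact values are illustrative, not from the paper.
WEIGHTS = {1: 0.10, 2: 0.20, 3: 0.20, 4: 0.20, 5: 0.10, 6: 0.10, 7: 0.10}

def oracle_score(domain_scores):
    """Weighted total ORACLe score from per-domain scores on a common scale."""
    assert set(domain_scores) == set(WEIGHTS), "all seven domains required"
    return sum(WEIGHTS[d] * s for d, s in domain_scores.items())
```

Because the weights sum to one, the total score stays on the same scale as the per-domain scores.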

  5. A robust hybrid model integrating enhanced inputs based extreme learning machine with PLSR (PLSR-EIELM) and its application to intelligent measurement.

    Science.gov (United States)

    He, Yan-Lin; Geng, Zhi-Qiang; Xu, Yuan; Zhu, Qun-Xiong

    2015-09-01

In this paper, a robust hybrid model integrating an enhanced inputs based extreme learning machine with partial least square regression (PLSR-EIELM) is proposed. The PLSR-EIELM model overcomes two main flaws of the extreme learning machine (ELM), i.e. the intractable problem of determining the optimal number of hidden layer neurons and the over-fitting phenomenon. First, a traditional extreme learning machine (ELM) is selected. Second, the weights between the input layer and the hidden layer are randomly assigned, and the nonlinear transformation of the independent variables is obtained from the output of the hidden layer neurons. Notably, the original input variables are regarded as enhanced inputs; the enhanced inputs and the nonlinear transformed variables are then tied together as the whole set of independent variables. In this way, PLSR can be carried out to identify the PLS components not only from the nonlinear transformed variables but also from the original input variables, which removes the correlation among the independent variables and the expected outputs. Finally, the optimal relationship model between the independent variables and the expected outputs is obtained using PLSR. Thus, the PLSR-EIELM model is developed. The PLSR-EIELM model then served as an intelligent measurement tool for the key variables of the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. The experimental results show that the predictive accuracy of PLSR-EIELM is stable, indicating that PLSR-EIELM is robust. Moreover, compared with ELM, PLSR, hierarchical ELM (HELM), and PLSR-ELM, PLSR-EIELM achieves much smaller relative prediction errors in these two applications. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
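The enhanced-input construction can be sketched as: draw random input-to-hidden weights, pass the inputs through a tanh hidden layer, and concatenate the original inputs with the hidden outputs before the PLSR stage. A minimal Python illustration (dimensions, seed, and the tanh activation are assumptions for the sketch):

```python
import math
import random

def enhance_inputs(x, n_hidden=4, seed=0):
    """Tie the original inputs and ELM-style random nonlinear features
    together as one enhanced input vector for a downstream PLSR stage."""
    rng = random.Random(seed)
    # Randomly assigned input-to-hidden weights and biases, as in an ELM.
    w = [[rng.uniform(-1, 1) for _ in x] for _ in range(n_hidden)]
    b = [rng.uniform(-1, 1) for _ in range(n_hidden)]
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + bi)
              for row, bi in zip(w, b)]
    return list(x) + hidden  # original inputs + nonlinear transformed variables
```

The regression stage then sees both the raw variables and their random nonlinear transforms, which is what lets PLS components draw on either.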

  6. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian networks and hidden Markov models are introduced as examples of widely used data-driven classification/modeling strategies.
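As a concrete taste of the hidden Markov models the chapter introduces, the forward algorithm computes the likelihood of an observation sequence by propagating state probabilities one observation at a time. A minimal Python version with a two-state toy model (all parameters invented for illustration):

```python
def hmm_forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: likelihood of an observation sequence under an HMM."""
    # Initialize with start probabilities weighted by the first emission.
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        # Propagate: sum over predecessor states, then weight by emission.
        alpha = {s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

# Two-state toy model (parameters are illustrative, not from the chapter).
STATES = ("calm", "alert")
START = {"calm": 0.6, "alert": 0.4}
TRANS = {"calm": {"calm": 0.7, "alert": 0.3},
         "alert": {"calm": 0.4, "alert": 0.6}}
EMIT = {"calm": {"x": 0.5, "y": 0.5},
        "alert": {"x": 0.1, "y": 0.9}}
```

A quick sanity check: the likelihoods of all length-1 sequences sum to one.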

  7. Detecting epileptic seizure with different feature extracting strategies using robust machine learning classification techniques by applying advance parameter optimization approach.

    Science.gov (United States)

    Hussain, Lal

    2018-06-01

Epilepsy is a neurological disorder caused by abnormal excitability of neurons in the brain. Brain activity is monitored through the electroencephalogram (EEG) of patients suffering from seizures in order to detect epileptic seizures, and the performance of EEG-based epilepsy detection depends on the feature extraction strategy. In this research, we extracted features using varying strategies based on time- and frequency-domain characteristics, nonlinear measures, wavelet-based entropy and a few statistical features. A deeper study was then undertaken using machine learning classifiers while considering multiple factors. The support vector machine kernels were evaluated based on multiclass kernel and box constraint level. Likewise, for K-nearest neighbors (KNN), we computed different distance metrics, neighbor weights and numbers of neighbors. Similarly, for decision trees we tuned the parameters based on maximum splits and split criteria, and ensemble classifiers were evaluated based on different ensemble methods and learning rates. For training/testing, tenfold cross-validation was employed and performance was evaluated in the form of TPR, NPR, PPV, accuracy and AUC. The support vector machine linear kernel and KNN with the city block distance metric gave the overall highest accuracy of 99.5%, higher than with the default parameters for these classifiers. Moreover, the highest separation (AUC = 0.9991, 0.9990) was obtained at different kernel scales using SVM. Additionally, K-nearest neighbors with inverse squared distance weight gave higher performance at different numbers of neighbors. Finally, to distinguish postictal heart rate oscillations from epileptic ictal subjects, the highest performance of 100% was obtained using different machine learning classifiers.
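The hyperparameter tuning described above (SVM kernels and box constraints, KNN distance metrics and neighbor counts) amounts to an exhaustive grid search scored by cross-validated accuracy. A minimal Python sketch; the toy scoring function stands in for a real cross-validation loop and its return values are invented:

```python
from itertools import product

def grid_search(param_grid, evaluate):
    """Score every hyperparameter combination and keep the best one."""
    names = list(param_grid)
    best = None
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = evaluate(params)
        if best is None or score > best[1]:
            best = (params, score)
    return best

# Toy stand-in for cross-validated accuracy; a real evaluate() would train
# and score a classifier (e.g. KNN) for each parameter setting.
def toy_accuracy(params):
    return params["neighbors"] + (1 if params["metric"] == "cityblock" else 0)

best_params, best_score = grid_search(
    {"neighbors": [1, 3, 5], "metric": ["euclidean", "cityblock"]},
    toy_accuracy)
```

This is the same pattern whether the grid holds SVM kernel scales, tree split criteria, or ensemble learning rates.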

  8. Computer vision and machine learning for robust phenotyping in genome-wide studies.

    Science.gov (United States)

    Zhang, Jiaoping; Naik, Hsiang Sing; Assefa, Teshale; Sarkar, Soumik; Reddy, R V Chowda; Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh K

    2017-03-08

Traditional evaluation of crop biotic and abiotic stresses is time-consuming and labor-intensive, limiting the ability to dissect the genetic basis of quantitative traits. A machine learning (ML)-enabled image-phenotyping pipeline for genetic studies of the abiotic stress iron deficiency chlorosis (IDC) of soybean is reported. IDC classification and severity for an association panel of 461 diverse plant-introduction accessions were evaluated using an end-to-end phenotyping workflow. The workflow consisted of a multi-stage procedure including: (1) optimized protocols for consistent image capture across plant canopies, (2) canopy identification and registration from cluttered backgrounds, (3) extraction of domain-expert-informed features from the processed images to accurately represent IDC expression, and (4) supervised ML-based classifiers that linked the automatically extracted features with expert-rating-equivalent IDC scores. ML-generated phenotypic data were subsequently utilized for the genome-wide association study and genomic prediction. The results illustrate the reliability and advantage of the ML-enabled image-phenotyping pipeline by identifying a previously reported locus and a novel locus harboring a gene homolog involved in iron acquisition. This study demonstrates a promising path for integrating the phenotyping pipeline into genomic prediction, and provides a systematic framework enabling robust and quicker phenotyping through ground-based systems.

  9. S/HIC: Robust Identification of Soft and Hard Sweeps Using Machine Learning.

    Directory of Open Access Journals (Sweden)

    Daniel R Schrider

    2016-03-01

Detecting the targets of adaptive natural selection from whole genome sequencing data is a central problem for population genetics. However, to date most methods have shown sub-optimal performance under realistic demographic scenarios. Moreover, over the past decade there has been a renewed interest in determining the importance of selection from standing variation in adaptation of natural populations, yet very few methods for inferring this model of adaptation at the genome scale have been introduced. Here we introduce a new method, S/HIC, which uses supervised machine learning to precisely infer the location of both hard and soft selective sweeps. We show that S/HIC has unrivaled accuracy for detecting sweeps under demographic histories that are relevant to human populations, and distinguishing sweeps from linked as well as neutrally evolving regions. Moreover, we show that S/HIC is uniquely robust among its competitors to model misspecification. Thus, even if the true demographic model of a population differs catastrophically from that specified by the user, S/HIC still retains impressive discriminatory power. Finally, we apply S/HIC to the case of resequencing data from human chromosome 18 in a European population sample, and demonstrate that we can reliably recover selective sweeps that have been identified earlier using less specific and sensitive methods.

  10. Comparison of Cloud backup performance and costs in Oracle database

    OpenAIRE

    Aljaž Zrnec; Dejan Lavbič

    2011-01-01

    Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Oracle database u...

  11. Has publication of the results of the ORACLE Children Study changed practice in the UK?

    Science.gov (United States)

    Kenyon, S; Pike, K; Jones, D; Brocklehurst, P; Marlow, N; Salt, A; Taylor, D

    2010-10-01

To investigate whether publication of the results of the ORACLE Children's Study, a 7-year follow-up of the ORACLE trial, changed practice with regard to the routine prescription of antibiotics to women with preterm rupture of membranes or spontaneous preterm labour (intact membranes). A comparative questionnaire survey of clinical practice in November 2007 (before publication) and March 2009 (after publication). Lead obstetricians for labour wards of all maternity units in the UK. Self-administered questionnaires requested information about the routine prescription of antibiotics to women with either preterm rupture of membranes or spontaneous preterm labour (intact membranes). Change in practice for prescription of antibiotics. The response rate was 166/214 (78%) in 2007 and 158/209 (76%) in 2009. In total, 120 maternity units responded on both occasions. For women with preterm rupture of membranes, 162/166 (98%) maternity units in 2007 and 151/158 (96%) in 2009 reported that they prescribed antibiotics, with the majority using erythromycin (98%). For women with spontaneous preterm labour (intact membranes), 35/166 (21%) maternity units in 2007 and 25/158 (16%) in 2009 reported that they routinely prescribed antibiotics. The findings from units that responded on both occasions are similar. There has been little change in the reported prescription of antibiotics to women with either preterm rupture of membranes or spontaneous preterm labour following publication of the ORACLE Children's Study. This suggests that current practice may require updated guidance.

  12. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences.

    Science.gov (United States)

    Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.
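Regular-expression searching over sequence data, of the kind the article describes running inside the database, can be illustrated in plain Python. The motif below is the classic N-glycosylation sequon (N, then anything but P, then S or T, then anything but P), used here purely as an example pattern:

```python
import re

# N-glycosylation sequon: N, not-P, S or T, not-P.
MOTIF = re.compile(r"N[^P][ST][^P]")

def find_motifs(sequence):
    """Return (position, match) pairs for every non-overlapping motif hit."""
    return [(m.start(), m.group()) for m in MOTIF.finditer(sequence)]
```

In Oracle the equivalent pattern would be handed to the database's regular-expression search functions so the scan happens next to the stored sequences instead of in client code.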

  13. The Oracle of Delphi Encore!

    CERN Multimedia

    2000-01-01

After the great success of the performance last winter the company Mimescope and CERN stage again The Oracle of Delphi! A performance with acrobatics, mime and music played in the heart of the CERN laboratory. Kept in suspense between dream and reality inside the fertile imagination of the brilliant English physicist Paul Dirac, the audience dives into the fascinating world of antimatter. The performance will be staged in the PS South Hall, close to LEAR, where the first atoms of antihydrogen were produced. After each performance there will be a short tour around the hall. From 11 January 2001 to 04 February 2001 every Thursday (EXCEPT 1st February!), Friday and Saturday at 20.00 h, and Sundays at 18.00 h. Purchase of tickets from Forum Meyrin, open Monday to Saturday from 14.00h to 18.00h, tel: 022/989.34.34, or through the Billetel system tel: 0901.553.901 (1.50 CHF per minute). On the days of the event last minute tickets can still be purchased at the Reception desk of bldg 33 on CERN site. Please arrive ...

  14. The Oracle of Delphi Encore!

    CERN Multimedia

    2001-01-01

After the great success of the performance last winter the company Mimescope and CERN stage again The Oracle of Delphi! A performance with acrobatics, mime and music played in the heart of the CERN laboratory. Kept in suspense between dream and reality inside the fertile imagination of the brilliant English physicist Paul Dirac, the audience dives into the fascinating world of antimatter. The performance will be staged in the PS South Hall, close to LEAR, where the first atoms of antihydrogen were produced. After each performance there will be a short tour around the hall. - From 11 January 2001 to 04 February 2001 every Thursday (EXCEPT 1st February!), Friday and Saturday at 20.00 h, and Sundays at 18.00 h. - Purchase of tickets from Forum Meyrin, open Monday to Saturday from 14.00h to 18.00h, tel: 022/989.34.34, or through the Billetel system tel: 0901.553.901 (1.50 CHF per minute). On the days of the event last minute tickets can still be purchased at the Reception desk of bldg 33 on CERN site. Please arrive ...

  15. Robust Machine Learning-Based Correction on Automatic Segmentation of the Cerebellum and Brainstem.

    Science.gov (United States)

    Wang, Jun Yi; Ngo, Michael M; Hessl, David; Hagerman, Randi J; Rivera, Susan M

    2016-01-01

Automated segmentation is a useful method for studying large brain structures such as the cerebellum and brainstem. However, automated segmentation may lead to inaccuracy and/or undesirable boundary. The goal of the present study was to investigate whether SegAdapter, a machine learning-based method, is useful for automatically correcting large segmentation errors and disagreement in anatomical definition. We further assessed the robustness of the method in handling size of training set, differences in head coil usage, and amount of brain atrophy. High resolution T1-weighted images were acquired from 30 healthy controls scanned with either an 8-channel or 32-channel head coil. Ten patients, who suffered from brain atrophy because of fragile X-associated tremor/ataxia syndrome, were scanned using the 32-channel head coil. The initial segmentations of the cerebellum and brainstem were generated automatically using Freesurfer. Subsequently, Freesurfer's segmentations were both manually corrected to serve as the gold standard and automatically corrected by SegAdapter. Using only 5 scans in the training set, spatial overlap with manual segmentation in Dice coefficient improved significantly from 0.956 (for Freesurfer segmentation) to 0.978 (for SegAdapter-corrected segmentation) for the cerebellum and from 0.821 to 0.954 for the brainstem. Reducing the training set size to 2 scans only decreased the Dice coefficient ≤0.002 for the cerebellum and ≤0.005 for the brainstem compared to the use of training set size of 5 scans in corrective learning. The method was also robust in handling differences between the training set and the test set in head coil usage and the amount of brain atrophy, which reduced spatial overlap only by <0.01. These results suggest that the combination of automated segmentation and corrective learning provides a valuable method for accurate and efficient segmentation of the cerebellum and brainstem, particularly in large-scale neuroimaging studies, and potentially for segmenting other neural regions as well.

  16. Oracle Database 11gR2 Performance Tuning Cookbook

    CERN Document Server

    Fiorillo, Ciro

    2012-01-01

    In this book you will find both examples and theoretical concepts covered. Every recipe is based on a script/procedure explained step-by-step, with screenshots, while theoretical concepts are explained in the context of the recipe, to explain why a solution performs better than another. This book is aimed at software developers, software and data architects, and DBAs who are using or are planning to use the Oracle Database, who have some experience and want to solve performance problems faster and in a rigorous way. If you are an architect who wants to design better applications, a DBA who is

  17. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    Science.gov (United States)

    Sylto, R.

    1984-01-01

Characteristics required for NASA scientific database management applications are listed, as well as performance testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests varying database designs and various database management system parameters are valuable to applications choosing packages and critical to designing effective databases. An application's productivity increases with the use of a database management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.

  18. Comparative study of the influence of harmonic voltage distortion on the efficiency of induction machines versus line start permanent magnet machines

    OpenAIRE

    Debruyne, Colin; Derammelaere, Stijn; Desmet, Jan; Vandevelde, Lieven

    2012-01-01

    Induction machines have nearly reached their maximal efficiency. In order to further increase efficiency, the use of permanent magnets in combination with the robust design of the induction machine is being extensively researched. These so-called line start permanent magnet machines have an increased efficiency under sine wave conditions with respect to standard induction machines; however, the efficiency of these machines is less researched under distorted voltage conditions. This paper compares...

  19. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.

  20. MRC ORACLE Children Study. Long term outcomes following prescription of antibiotics to pregnant women with either spontaneous preterm labour or preterm rupture of the membranes

    Directory of Open Access Journals (Sweden)

    Salt Alison

    2008-04-01

    Background: The Medical Research Council (MRC) ORACLE trial evaluated the use of co-amoxiclav 375 mg and/or erythromycin 250 mg in women presenting with preterm rupture of membranes (PROM) ORACLE I or in spontaneous preterm labour (SPL) ORACLE II using a factorial design. The results showed that for women with a singleton baby with PROM, the prescription of erythromycin is associated with improvements in short-term neonatal outcomes; although co-amoxiclav is associated with prolongation of pregnancy, a significantly higher rate of neonatal necrotising enterocolitis was found in these babies. Prescription of erythromycin is now established practice for women with PROM. For women with SPL, antibiotics demonstrated no improvements in short-term neonatal outcomes and are not a recommended treatment. There is evidence that both these conditions are associated with subclinical infection, so perinatal antibiotic administration may reduce the risk of later disabilities, including cerebral palsy, although the risk may be increased through exposure to inflammatory cytokines; assessment of longer-term functional and educational outcomes is therefore appropriate. Methods: The MRC ORACLE Children's Study will follow up UK children at age 7 years born to 4809 women with PROM and 4266 women with SPL enrolled in the earlier ORACLE trials. We will use a parental questionnaire including validated tools to assess disability and behaviour. We will collect the frequency of specific medical conditions: cerebral palsy, epilepsy, respiratory illness including asthma, diabetes, admission to hospital in the last year and other diseases, as reported by parents. National standard test results will be collected to assess educational attainment at Key Stage 1 for children in England. Discussion: This study is designed to investigate whether or not peripartum antibiotics improve health and disability for children at 7 years of age. Trial registration: The ORACLE Trial and Children

  1. MRC ORACLE Children Study. Long term outcomes following prescription of antibiotics to pregnant women with either spontaneous preterm labour or preterm rupture of the membranes.

    Science.gov (United States)

    Kenyon, Sara; Brocklehurst, Peter; Jones, David; Marlow, Neil; Salt, Alison; Taylor, David

    2008-04-24

    The Medical Research Council (MRC) ORACLE trial evaluated the use of co-amoxiclav 375 mg and/or erythromycin 250 mg in women presenting with preterm rupture of membranes (PROM) ORACLE I or in spontaneous preterm labour (SPL) ORACLE II using a factorial design. The results showed that for women with a singleton baby with PROM, the prescription of erythromycin is associated with improvements in short-term neonatal outcomes; although co-amoxiclav is associated with prolongation of pregnancy, a significantly higher rate of neonatal necrotising enterocolitis was found in these babies. Prescription of erythromycin is now established practice for women with PROM. For women with SPL, antibiotics demonstrated no improvements in short-term neonatal outcomes and are not a recommended treatment. There is evidence that both these conditions are associated with subclinical infection, so perinatal antibiotic administration may reduce the risk of later disabilities, including cerebral palsy, although the risk may be increased through exposure to inflammatory cytokines; assessment of longer-term functional and educational outcomes is therefore appropriate. The MRC ORACLE Children's Study will follow up UK children at age 7 years born to 4809 women with PROM and 4266 women with SPL enrolled in the earlier ORACLE trials. We will use a parental questionnaire including validated tools to assess disability and behaviour. We will collect the frequency of specific medical conditions: cerebral palsy, epilepsy, respiratory illness including asthma, diabetes, admission to hospital in the last year and other diseases, as reported by parents. National standard test results will be collected to assess educational attainment at Key Stage 1 for children in England. This study is designed to investigate whether or not peripartum antibiotics improve health and disability for children at 7 years of age. The ORACLE Trial and Children Study is registered in the Current Controlled Trials registry, ISRCTN 52995660.

  2. ORACL program file for acquisition, storage and analysis of data in radiation measurement and nondestructive measurement of nuclear material, vol. 2

    International Nuclear Information System (INIS)

    Yagi, Hideyuki; Takeuchi, Norio; Gotoh, Hiroshi

    1976-09-01

    The file contains 79 programs for radiation measurement and nondestructive measurement of nuclear material written in conversational language ORACL associated with the GAMMA-III system of ORTEC Incorporated. It deals with data transfers between disk/core/MCA/magnetic tape, edition of data in disks, calculation of the peak area, calculation of mean and standard deviation, reference to gamma-ray data files, accounting, calendar, etc. It also has a support system for micro-computer development. Usages of the built-in functions of ORACL are presented. (auth.)

  3. Augmenting Oracle Text with the UMLS for enhanced searching of free-text medical reports.

    Science.gov (United States)

    Ding, Jing; Erdal, Selnur; Dhaval, Rakesh; Kamal, Jyoti

    2007-10-11

    The intrinsic complexity of free-text medical reports imposes great challenges for information retrieval systems. We have developed a prototype search engine for retrieving clinical reports that leverages the powerful indexing and querying capabilities of Oracle Text, and the rich biomedical domain knowledge and semantic structures that are captured in the UMLS Metathesaurus.
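The UMLS-augmented search idea can be sketched as query expansion: map a query term to its synonym set before matching, so a report can be retrieved under any of its terminological variants. The dictionary below is a tiny illustrative stand-in for the UMLS Metathesaurus, and the matching is plain substring search rather than Oracle Text indexing:

```python
# Toy synonym table; in the prototype these mappings come from the UMLS
# Metathesaurus, not a hand-written dictionary.
SYNONYMS = {
    "heart attack": {"heart attack", "myocardial infarction"},
    "high blood pressure": {"high blood pressure", "hypertension"},
}

def expand_query(term):
    """Return the set of search terms for a query term."""
    return SYNONYMS.get(term.lower(), {term.lower()})

def search(term, reports):
    """Retrieve every report mentioning the term or one of its synonyms."""
    terms = expand_query(term)
    return [r for r in reports if any(t in r.lower() for t in terms)]
```

The same expansion step can feed an Oracle Text query instead of a Python loop, which is where the prototype's indexing performance comes from.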

  4. Searches on star graphs and equivalent oracle problems

    International Nuclear Information System (INIS)

    Lee, Jaehak; Lee, Hai-Woong; Hillery, Mark

    2011-01-01

    We examine a search on a graph among a number of different kinds of objects (vertices), one of which we want to find. In a standard graph search, all of the vertices are the same, except for one, the marked vertex, and that is the one we wish to find. We examine the case in which the unmarked vertices can be of different types, so the background against which the search is done is not uniform. We find that the search can still be successful, but the probability of success is lower than in the uniform background case, and that probability decreases with the number of types of unmarked vertices. We also show how the graph searches can be rephrased as equivalent oracle problems.

  5. Remote filter handling machine for Sizewell B

    International Nuclear Information System (INIS)

    Barker, D.

    1993-01-01

    Two Filter Handling Machines (FHM) have been supplied to Nuclear Electric plc for use at Sizewell B Power Station. These machines have been designed and built following ALARP principles, with the functional objective of removing radioactive filter cartridges from a filter housing and replacing them with clean filter cartridges. The machine is operated either by prompting each distinct task via an industrial computer or by prompting a full cycle in automatic mode. The design of the machine features many aspects demonstrating ALARP while keeping the machine simple, robust and easy to maintain. (author)

  6. Test of Oracle JSON support in the view of CMS JSON data

    OpenAIRE

    Baveja, Sartaj Singh; Dziedziniewicz-Wojcik, Katarzyna Maria; Kuznetsov, Valentin

    2016-01-01

    Abstract Oracle has introduced native support for Javascript Object Notation (JSON) data in its 12c release with relational database features, including transactions, indexing, declarative querying and views. The requirements for the CMS WMArchive project, whose goal is to reliably store its Workflow and Data Management framework job report (FWJR) documents, include storing deep nested JSON structures, running queries over them and aggregating data in an effective way. The objective of th...

  7. Robust Machine Learning-Based Correction on Automatic Segmentation of the Cerebellum and Brainstem.

    Directory of Open Access Journals (Sweden)

    Jun Yi Wang

    Full Text Available Automated segmentation is a useful method for studying large brain structures such as the cerebellum and brainstem. However, automated segmentation may produce inaccurate and/or undesirable boundaries. The goal of the present study was to investigate whether SegAdapter, a machine learning-based method, is useful for automatically correcting large segmentation errors and disagreements in anatomical definition. We further assessed the robustness of the method in handling the size of the training set, differences in head coil usage, and the amount of brain atrophy. High resolution T1-weighted images were acquired from 30 healthy controls scanned with either an 8-channel or 32-channel head coil. Ten patients, who suffered from brain atrophy because of fragile X-associated tremor/ataxia syndrome, were scanned using the 32-channel head coil. The initial segmentations of the cerebellum and brainstem were generated automatically using Freesurfer. Subsequently, Freesurfer's segmentations were both manually corrected to serve as the gold standard and automatically corrected by SegAdapter. Using only 5 scans in the training set, spatial overlap with manual segmentation in Dice coefficient improved significantly from 0.956 (for Freesurfer segmentation) to 0.978 (for SegAdapter-corrected segmentation) for the cerebellum and from 0.821 to 0.954 for the brainstem. Reducing the training set size to 2 scans decreased the Dice coefficient by only ≤0.002 for the cerebellum and ≤0.005 for the brainstem compared to a training set size of 5 scans in corrective learning. The method was also robust in handling differences between the training set and the test set in head coil usage and the amount of brain atrophy, which reduced spatial overlap only by <0.01. These results suggest that the combination of automated segmentation and corrective learning provides a valuable method for accurate and efficient segmentation of the cerebellum and brainstem, particularly in large
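The overlap metric reported above can be computed directly; a self-contained sketch with toy voxel sets (not the study's data):

```python
def dice_coefficient(a, b):
    """Dice overlap between two binary segmentations given as sets of
    voxel coordinates: 2|A ∩ B| / (|A| + |B|)."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# toy single-slice "segmentations": 100 voxels vs a 90-voxel correction
auto   = {(x, y, 0) for x in range(10) for y in range(10)}
manual = {(x, y, 0) for x in range(1, 10) for y in range(10)}
print(round(dice_coefficient(auto, manual), 3))  # -> 0.947
```

A Dice value of 1.0 means perfect agreement; the study's jump from 0.821 to 0.954 for the brainstem is a large gain on this scale, since errors concentrate at structure boundaries.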

  8. Big data mining: In-database Oracle data mining over hadoop

    Science.gov (United States)

    Kovacheva, Zlatinka; Naydenova, Ina; Kaloyanova, Kalinka; Markov, Krasimir

    2017-07-01

    Big Data challenges many aspects of storing, processing and managing data, as well as analyzing and using data for business purposes. Applying data mining methods to Big Data is another challenge because of the huge data volumes, the variety of information, and the dynamics of the sources. Various applications have been developed in this area, but their successful usage depends on understanding many specific parameters. In this paper we present several opportunities for using data mining techniques provided by the analytical engine of the Oracle RDBMS over data stored in the Hadoop Distributed File System (HDFS). Some experimental results are given and discussed.

  9. Oracle WebLogic Server 12c advanced administration cookbook

    CERN Document Server

    Iwazaki, Dalton

    2013-01-01

    Using real life problems and simple solutions this book will make any issue seem small. WebLogic Server books can be a bit dry but Dalton keeps the tone light and ensures no matter how complex the problem you always feel like you have someone right there with you helping you along.This book is ideal for those who know the basics of WebLogic but want to dive deeper and get to grips with more advanced topics. So if you are a datacenter operator, system administrator or even a Java developer this book could be exactly what you are looking for to take you one step further with Oracle WebLogic Serv

  10. Impact of Spot Size and Spacing on the Quality of Robustly Optimized Intensity Modulated Proton Therapy Plans for Lung Cancer.

    Science.gov (United States)

    Liu, Chenbin; Schild, Steven E; Chang, Joe Y; Liao, Zhongxing; Korte, Shawn; Shen, Jiajian; Ding, Xiaoning; Hu, Yanle; Kang, Yixiu; Keole, Sameer R; Sio, Terence T; Wong, William W; Sahoo, Narayan; Bues, Martin; Liu, Wei

    2018-06-01

    To investigate how spot size and spacing affect plan quality, robustness, and interplay effects of robustly optimized intensity modulated proton therapy (IMPT) for lung cancer. Two robustly optimized IMPT plans were created for 10 lung cancer patients: first by a large-spot machine with in-air energy-dependent large spot size at isocenter (σ: 6-15 mm) and spacing (1.3 σ), and second by a small-spot machine with in-air energy-dependent small spot size (σ: 2-6 mm) and spacing (5 mm). Both plans were generated by optimizing radiation dose to the internal target volume on averaged 4-dimensional computed tomography scans using an in-house-developed IMPT planning system. The dose-volume histogram band method was used to evaluate plan robustness. Dose evaluation software was developed to model time-dependent spot delivery to incorporate interplay effects with randomized starting phases for each field per fraction. Patient anatomy voxels were mapped phase-to-phase via deformable image registration, and doses were scored using in-house-developed software. Dose-volume histogram indices, including internal target volume dose coverage, homogeneity, and organs at risk (OARs) sparing, were compared using the Wilcoxon signed-rank test. Compared with the large-spot machine, the small-spot machine resulted in significantly lower heart and esophagus mean doses, with comparable target dose coverage, homogeneity, and protection of other OARs. Plan robustness was comparable for targets and most OARs. With interplay effects considered, significantly lower heart and esophagus mean doses with comparable target dose coverage and homogeneity were observed using smaller spots. Robust optimization with a small-spot machine significantly improves heart and esophagus sparing, with comparable plan robustness and interplay effects compared with robust optimization with a large-spot machine. A small-spot machine uses a larger number of spots to cover the same tumors compared with a large

  11. Octree-based indexing for 3D pointclouds within an Oracle Spatial DBMS

    Science.gov (United States)

    Schön, Bianca; Mosa, Abu Saleh Mohammad; Laefer, Debra F.; Bertolotto, Michela

    2013-02-01

    A large proportion of today's digital datasets have a spatial component, the effective storage and management of which poses particular challenges, especially with light detection and ranging (LiDAR), where datasets of even small geographic areas may contain several hundred million points. While in the last decade 2.5-dimensional data were prevalent, true 3-dimensional data are increasingly commonplace via LiDAR. Such data have gained particular popularity for urban applications including the generation of city-scale maps, baseline data for disaster management, and utility planning. Additionally, LiDAR is commonly used for flood plain identification, coastal-erosion tracking, and forest biomass mapping. Despite growing data availability, current spatial information systems do not provide suitable full support for the data's true 3D nature. Consequently, one system is needed to store the data and another for its processing, thereby necessitating format transformations. The work presented herein aims at a more cost-effective way of managing 3D LiDAR data that allows for storage and manipulation within a single system by enabling a new index within existing spatial database management technology. An implementation of an octree index for 3D LiDAR data atop Oracle Spatial 11g is presented, along with an evaluation showing up to an eight-fold improvement compared to the native Oracle R-tree index.
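The core of an octree index is a locational key that refines a bounding cube level by level; a minimal sketch of the key computation (not the paper's implementation):

```python
def octree_key(point, bounds, depth):
    """Locational key of a 3D point in a fixed-depth octree: at each level,
    pick the octant (0-7) by comparing against the current cell midpoint,
    then descend into that octant."""
    (x0, y0, z0), (x1, y1, z1) = bounds
    digits = []
    for _ in range(depth):
        mx, my, mz = (x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2
        octant = ((point[0] >= mx) << 2) | ((point[1] >= my) << 1) | (point[2] >= mz)
        digits.append(octant)
        x0, x1 = (mx, x1) if point[0] >= mx else (x0, mx)
        y0, y1 = (my, y1) if point[1] >= my else (y0, my)
        z0, z1 = (mz, z1) if point[2] >= mz else (z0, mz)
    return digits

cube = ((0.0, 0.0, 0.0), (8.0, 8.0, 8.0))
print(octree_key((1.0, 1.0, 1.0), cube, 2))  # -> [0, 0]
print(octree_key((7.0, 1.0, 7.0), cube, 2))  # -> [5, 5]
```

Points that are close in space share long key prefixes, so storing the key in an indexed column lets range scans retrieve spatially coherent blocks of points, which is the property such an index exploits inside a relational DBMS.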

  12. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  13. Transformation of the UML class model to Oracle9i® under the MDA directive: a case study

    Directory of Open Access Journals (Sweden)

    FERNANDO ARANGO

    2006-01-01

    Full Text Available Model Driven Architecture (MDA) is the OMG's refinement proposal aimed at the automatic generation of code from platform-independent UML models of systems. This work presents a methodology for transforming the UML Class Model into a UML model dependent on the Oracle9i® platform, following the basic guidelines of this architecture and using UML as the modeling language throughout every step of the transformation. First, the rules for transforming the UML Class Model into the object-relational model supported by Oracle9i® are compiled in Spanish and adapted at the metamodel level, which required the development of a simplified metamodel of the Oracle9i® platform. This set of rules is made automatable by expressing it in a logical formalism that can easily be executed by a CASE tool supporting a formal language. Finally, the formalized refinement rules are applied to the Class Model of a practical case study, yielding as a result a UML model that is an instance of the Oracle9i® platform metamodel. The aspects of the Class Model emphasized in the transformation are the invariants and attribute-derivation rules defined in the formal language OCL, as well as the association, composition and generalization relationships between classes.

  14. Making Peer-Assisted Content Distribution Robust to Collusion Using Bandwidth Puzzles

    Science.gov (United States)

    Reiter, Michael K.; Sekar, Vyas; Spensky, Chad; Zhang, Zhenghao

    Many peer-assisted content-distribution systems reward a peer based on the amount of data that this peer serves to others. However, validating that a peer did so is, to our knowledge, an open problem; e.g., a group of colluding attackers can earn rewards by claiming to have served content to one another, when they have not. We propose a puzzle mechanism to make contribution-aware peer-assisted content distribution robust to such collusion. Our construction ties solving the puzzle to possession of specific content and, by issuing puzzle challenges simultaneously to all parties claiming to have that content, our mechanism prevents one content-holder from solving many others' puzzles. We prove (in the random oracle model) the security of our scheme, describe our integration of bandwidth puzzles into a media streaming system, and demonstrate the resulting attack resilience via simulations.
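A stripped-down sketch of the puzzle idea (omitting the timing constraints, simultaneous challenges, and the construction details that give the real scheme its proven security):

```python
import hashlib
import os
import random

def make_puzzle(content, k, seed=None):
    """Verifier side: challenge k random byte offsets of the content.
    A prover who actually stores the content can hash those bytes quickly;
    a colluder who never downloaded it cannot answer within the deadline."""
    rng = random.Random(seed)
    offsets = sorted(rng.sample(range(len(content)), k))
    answer = hashlib.sha256(bytes(content[i] for i in offsets)).hexdigest()
    return offsets, answer

def solve_puzzle(content, offsets):
    """Prover side: hash the challenged bytes of the locally held content."""
    return hashlib.sha256(bytes(content[i] for i in offsets)).hexdigest()

content = os.urandom(1 << 16)          # a 64 KiB block a peer claims to hold
offsets, expected = make_puzzle(content, k=32, seed=7)
print(solve_puzzle(content, offsets) == expected)  # -> True
```

The paper's construction additionally issues challenges to all claimants at once and bounds the response time, so one content-holder cannot solve many colluders' puzzles before the deadline.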

  15. War and religion: Lucian, the oracle of Alexander in Abonuteichos, and the military defeats of Sedatius Severianus against the Parthians, and Marcus Aurelius against Cuadi and Marcomanni

    Directory of Open Access Journals (Sweden)

    Sabino PEREA YÉBENES

    2013-03-01

    Full Text Available Taking as a starting point the opuscule by Lucian of Samosata entitled Alexander or The False Prophet, we call attention to some autophone oracles issued by this controversial oracular shrine in Abonuteichos, with particular attention to the oracle given to the imperial legate Sedatius Severianus in the war against the Parthians (Luc. Alex. 27), and the so-called «oracle of the two lions» (Luc. Alex. 48), requested by the emperor Marcus Aurelius shortly before the beginning of the military campaign against the barbarian Cuadi and Marcomanni. The rise of divinatory practices and the popularization of «holy men» in this period are symptoms of spiritual changes in the religious beliefs of the age, rather than signs of crisis. In the case studies analyzed here, these changes also carry over to the political field: the wars on the frontiers announced far-reaching structural changes, with increasingly systematic and effective attacks by the barbarians who harassed Roman power across the natural barrier of the Danube. We relate the «oracle of the lions» to scene XII of the Aurelian Column in Rome (destroyed, but preserved in a seventeenth-century drawing).

  16. Impact analysis and development of good auditing practices in Oracle 11g databases

    OpenAIRE

    Delgado Picazo, Mario

    2009-01-01

    This project consists of an analysis and study of the different types of auditing in Oracle 11g, with the objective of examining the characteristics of the available tools as well as their feasibility. Through experimentation, the behaviour and performance of the different auditing modes that Oracle 11g offers as a database engine will be studied, producing results and comparisons that will be used to prepare a manual of good practices about ...

  17. Oracle as a tool for monitoring data management in French nuclear power plants

    International Nuclear Information System (INIS)

    Joussellin, A.; Tarteret, P.; Gal, A.

    1996-05-01

    On-line monitoring of the main components of the French nuclear power plants is performed using an integrated system called PSAD (Poste de Surveillance et d'Aide au Diagnostic). In real-time, physical measurement data are continuously acquired, computed and stored in an ORACLE database. All measurement data are dated and represent a wide range of physical variables (temperatures, vibrations, acoustic waves,...). Then, millions of measurements are available to the operator for diagnostic. (author)

  18. Oracle as a tool for monitoring data management in French nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Joussellin, A.; Tarteret, P.; Gal, A.

    1996-05-01

    On-line monitoring of the main components of the French nuclear power plants is performed using an integrated system called PSAD (Poste de Surveillance et d'Aide au Diagnostic). In real-time, physical measurement data are continuously acquired, computed and stored in an ORACLE database. All measurement data are dated and represent a wide range of physical variables (temperatures, vibrations, acoustic waves,...). Then, millions of measurements are available to the operator for diagnostic. (author).

  19. Robust recognition of degraded machine-printed characters using complementary similarity measure and error-correction learning

    Science.gov (United States)

    Hagita, Norihiro; Sawaki, Minako

    1995-03-01

    Most conventional methods in character recognition extract geometrical features such as stroke direction, connectivity of strokes, etc., and compare them with reference patterns in a stored dictionary. Unfortunately, geometrical features are easily degraded by blurs, stains and the graphical background designs used in Japanese newspaper headlines. This noise must be removed before recognition commences, but no preprocessing method is completely accurate. This paper proposes a method for recognizing degraded characters and characters printed on graphical background designs. This method is based on the binary image feature method and uses binary images as features. A new similarity measure, called the complementary similarity measure, is used as a discriminant function. It compares the similarity and dissimilarity of binary patterns with reference dictionary patterns. Experiments are conducted using the standard character database ETL-2, which consists of machine-printed Kanji, Hiragana, Katakana, alphanumeric, and special characters. The results show that this method is much more robust against noise than the conventional geometrical feature method. It also achieves high recognition rates of over 92% for characters with textured foregrounds, over 98% for characters with textured backgrounds, over 98% for outline fonts, and over 99% for reverse contrast characters.
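A sketch of a complementary similarity measure for binary patterns, assuming a commonly cited form of the formula; the paper's exact normalization may differ:

```python
def complementary_similarity(f, t):
    """Similarity between a binary input pattern f and a binary reference
    pattern t. Counts a (both 1), d (both 0) reward agreement of foreground
    AND background; mismatches b, c are penalized via the a*d - b*c term."""
    a = sum(fi and ti for fi, ti in zip(f, t))               # both 1
    b = sum(fi and not ti for fi, ti in zip(f, t))           # input 1, ref 0
    c = sum((not fi) and ti for fi, ti in zip(f, t))         # input 0, ref 1
    d = sum((not fi) and (not ti) for fi, ti in zip(f, t))   # both 0
    return (a * d - b * c) / (((a + c) * (b + d)) ** 0.5)

ref = [1, 1, 1, 0, 0, 0, 0, 0]
print(complementary_similarity(ref, ref))                       # identical: positive
print(complementary_similarity([0, 0, 0, 1, 1, 1, 1, 1], ref))  # inverted: negative
```

Because the background term d contributes symmetrically with the foreground term a, the measure stays informative when strokes are partially obscured by textured backgrounds, which is the degradation the paper targets.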

  20. Oracle Efficient Estimation and Forecasting with the Adaptive LASSO and the Adaptive Group LASSO in Vector Autoregressions

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Callot, Laurent

    We show that the adaptive Lasso (aLasso) and the adaptive group Lasso (agLasso) are oracle efficient in stationary vector autoregressions where the number of parameters per equation is smaller than the number of observations. In particular, this means that the parameters are estimated consistently...

  1. Robust transient stabilisation problem for a synchronous generator in a power network

    Science.gov (United States)

    Verrelli, C. M.; Damm, G.

    2010-04-01

    The robust transient stabilisation problem (with stability proof) of a synchronous generator in an uncertain power network with transfer conductances is rigorously formulated and solved. The generator angular speed and electrical power are required to be kept close, when mechanical and electrical perturbations occur, to the synchronous speed and mechanical input power, respectively, while the generator terminal voltage is to be regulated, when perturbations are removed, to its pre-fault reference constant value. A robust adaptive nonlinear feedback control algorithm is designed on the basis of a third-order model of the synchronous machine: only two system parameters (synchronous machine damping and inertia constants) along with upper and lower bounds on the remaining uncertain ones are supposed to be known. The conditions to be satisfied by the remote network dynamics for guaranteeing ℒ2 and ℒ∞ robustness and asymptotic relative speed and voltage regulation to zero are weaker than those required by the single machine-infinite bus approximation: dynamic interactions between the local deviations of the generator states from the corresponding equilibrium values and the remote generators states are allowed.
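As a toy stand-in for the machine dynamics discussed above (this is the classical second-order swing equation, not the paper's third-order model or its controller), positive damping returns a perturbed speed deviation to zero:

```python
import math

def simulate_swing(delta0, omega0, H=4.0, D=2.0, Pm=0.8, Pmax=1.0,
                   dt=0.001, steps=40000):
    """Forward-Euler integration of the classical swing equation:
        d(delta)/dt = omega
        2H d(omega)/dt = Pm - Pmax*sin(delta) - D*omega
    delta is the rotor angle, omega the speed deviation from synchronous.
    With D > 0 a small perturbation decays back to equilibrium."""
    delta, omega = delta0, omega0
    for _ in range(steps):
        ddelta = omega
        domega = (Pm - Pmax * math.sin(delta) - D * omega) / (2 * H)
        delta += dt * ddelta
        omega += dt * domega
    return delta, omega

eq = math.asin(0.8)                      # equilibrium: Pm = Pmax * sin(delta)
delta, omega = simulate_swing(eq, 0.1)   # perturb the speed deviation
print(abs(omega) < 0.01)                 # -> True: the perturbation has decayed
```

All constants here (H, D, Pm, Pmax) are illustrative per-unit values chosen so the perturbation stays inside the stability region; they are not taken from the paper.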

  2. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability.

    Science.gov (United States)

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj

    2012-03-20

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.
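To see why a nonlinear method can separate data that no linear method can, here is a tiny kernelized classifier on the XOR problem (a kernel perceptron standing in for the SVM's kernel trick; this is not the paper's pipeline or data):

```python
import math

def rbf(u, v, gamma=2.0):
    """Gaussian (RBF) kernel, the same family of kernels commonly used in SVMs."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def kernel_perceptron(X, y, epochs=20):
    """Dual-form perceptron: keep a coefficient per training point and
    update it on every mistake. Decisions are sums of kernel evaluations,
    so the boundary can be nonlinear in the input space."""
    alpha = [0.0] * len(X)
    for _ in range(epochs):
        for i, xi in enumerate(X):
            f = sum(a * yj * rbf(xj, xi) for a, yj, xj in zip(alpha, y, X))
            if y[i] * f <= 0:
                alpha[i] += 1.0
    return lambda x: 1 if sum(a * yj * rbf(xj, x)
                              for a, yj, xj in zip(alpha, y, X)) > 0 else -1

# XOR labels: no linear boundary classifies all four points correctly,
# but the RBF-kernel classifier does
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, 1, 1, -1]
clf = kernel_perceptron(X, y)
print([clf(x) for x in X] == y)  # -> True
```

A real SVM adds margin maximization and regularization on top of this kernel machinery, which is what gives the robustness to spectral variability reported in the record.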

  3. Designing the Search Service for Enterprise Portal based on Oracle Universal Content Management

    Science.gov (United States)

    Bauer, K. S.; Kuznetsov, D. Y.; Pominov, A. D.

    2017-01-01

    An enterprise portal is an important part of an organization's informational and innovative space. The portal provides collaboration between employees and the organization. This article gives valuable background on enterprise portals and related technologies. The paper presents the integration of Oracle WebCenter Portal and the UCM Server in detail. The focus is on tools for the enterprise portal, and on the Search Service in particular. The paper also presents several UML diagrams describing the use cases for the Search Service and the main components of this application.

  4. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  5. Financial signal processing and machine learning

    CERN Document Server

    Kulkarni, Sanjeev R.; Malioutov, Dmitry M.

    2016-01-01

    The modern financial industry has been required to deal with large and diverse portfolios in a variety of asset classes often with limited market data available. Financial Signal Processing and Machine Learning unifies a number of recent advances made in signal processing and machine learning for the design and management of investment portfolios and financial engineering. This book bridges the gap between these disciplines, offering the latest information on key topics including characterizing statistical dependence and correlation in high dimensions, constructing effective and robust risk measures, and their use in portfolio optimization and rebalancing. The book focuses on signal processing approaches to model return, momentum, and mean reversion, addressing theoretical and implementation aspects. It highlights the connections between portfolio theory, sparse learning and compressed sensing, sparse eigen-portfolios, robust optimization, non-Gaussian data-driven risk measures, graphical models, causal analy...

  6. IQC-based robust stability analysis for LPV control of doubly-fed induction generators

    NARCIS (Netherlands)

    Tien, H. N.; Scherer, C. W.; Scherpen, J. M. A.

    2008-01-01

    Parameters of electrical machines are usually varying with time in a smooth way due to changing operating conditions, such as variations in the machine temperature and/or the magnetic saturation. This paper is concerned with robust stability analysis of controlled Doubly-Fed Induction Generators

  7. On robust parameter estimation in brain-computer interfacing

    Science.gov (United States)

    Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert

    2017-12-01

    Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.
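The simplest robust location estimator illustrates the point being made above (a trimmed mean; the paper's minimum-divergence estimators are considerably more sophisticated and structure-aware):

```python
def trimmed_mean(xs, trim=0.2):
    """Robust location estimate: drop the trim fraction of the smallest and
    largest samples before averaging, so a few gross outliers (eye blinks,
    loose electrodes, ...) cannot drag the estimate away."""
    xs = sorted(xs)
    k = int(len(xs) * trim)
    core = xs[k:len(xs) - k] if k else xs
    return sum(core) / len(core)

clean = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98, 1.03]
noisy = clean + [50.0, -40.0]          # two gross outliers
plain = sum(noisy) / len(noisy)
robust = trimmed_mean(noisy, trim=0.2)
# the plain mean is pulled well away from 1.0; the trimmed mean stays near it
print(round(plain, 2), round(robust, 2))
```

The hierarchical view in the paper goes further: in BCI data, entire trials can be outlying, so robustness must be applied at the trial level as well as at the sample level.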

  8. A robust regression based on weighted LSSVM and penalized trimmed squares

    International Nuclear Information System (INIS)

    Liu, Jianyong; Wang, Yong; Fu, Chengqun; Guo, Jie; Yu, Qin

    2016-01-01

    Least squares support vector machine (LS-SVM) for nonlinear regression is sensitive to outliers in the field of machine learning. Weighted LS-SVM (WLS-SVM) overcomes this drawback by adding a weight to each training sample. However, as the number of outliers increases, the accuracy of WLS-SVM may decrease. In order to improve the robustness of WLS-SVM, a new robust regression method based on WLS-SVM and penalized trimmed squares (WLSSVM–PTS) has been proposed. The algorithm comprises three main stages. First, the initial parameters are obtained by least trimmed squares. Then, the significant outliers are identified and eliminated by the Fast-PTS algorithm. Finally, the remaining samples, containing few outliers, are estimated by WLS-SVM. The statistical tests of experimental results carried out on numerical datasets and real-world datasets show that the proposed WLSSVM–PTS is significantly more robust than LS-SVM, WLS-SVM and LSSVM–LTS.
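A toy version of the trimmed-squares idea used in the first stage (one slope parameter and exhaustive subset search; the Fast-PTS algorithm in the paper is far more efficient):

```python
from itertools import combinations

def lts_slope(points, h):
    """Least trimmed squares for y = slope * x: fit on every h-subset and
    keep the fit whose h smallest squared residuals sum to the minimum.
    Exhaustive search, so only suitable for tiny datasets."""
    best_slope, best_score = None, float("inf")
    for subset in combinations(points, h):
        sx2 = sum(x * x for x, _ in subset)
        slope = sum(x * y for x, y in subset) / sx2
        resid = sorted((y - slope * x) ** 2 for x, y in points)
        score = sum(resid[:h])            # trimmed: ignore the worst residuals
        if score < best_score:
            best_slope, best_score = slope, score
    return best_slope

# seven points exactly on y = 2x, plus two gross outliers
pts = [(x, 2.0 * x) for x in range(1, 8)] + [(4, 30.0), (6, -20.0)]
ls = sum(x * y for x, y in pts) / sum(x * x for x, _ in pts)  # ordinary LS
print(round(lts_slope(pts, h=7), 2))  # -> 2.0 (LS is biased away from 2.0)
```

Because the objective only counts the h best-fitting points, the two outliers carry no weight at all, which is why the trimmed estimate recovers the true slope while ordinary least squares does not.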

  9. On Robust Information Extraction from High-Dimensional Data

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2014-01-01

    Roč. 9, č. 1 (2014), s. 131-144 ISSN 1452-4864 Grant - others:GA ČR(CZ) GA13-01930S Institutional support: RVO:67985807 Keywords : data mining * high-dimensional data * robust econometrics * outliers * machine learning Subject RIV: IN - Informatics, Computer Science

  10. Comparison of Cloud backup performance and costs in Oracle database

    Directory of Open Access Journals (Sweden)

    Aljaž Zrnec

    2011-06-01

    Full Text Available Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing, a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on the performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in an Oracle database using the Amazon S3 and EC2 cloud services. The cost estimation was performed on the basis of the prices published on the Amazon S3 and Amazon EC2 sites.

  11. Analysis and minimization of Torque Ripple for variable Flux reluctance machines

    NARCIS (Netherlands)

    Bao, J.; Gysen, B.L.J.; Boynov, K.; Paulides, J.J.H.; Lomonova, E.A.

    2017-01-01

    Variable flux reluctance machines (VFRMs) are permanent-magnet-free three-phase machines and are promising candidates for applications requiring low cost and robustness. This paper studies the torque ripple and minimization methods for 12-stator VFRMs. Starting with the analysis of harmonics in the

  12. Application of Artificial Intelligence Techniques for the Control of the Asynchronous Machine

    Directory of Open Access Journals (Sweden)

    F. Khammar

    2016-01-01

    Full Text Available The induction machine has enjoyed growing success over the past two decades, gradually replacing DC and synchronous machines in many industrial applications. This paper is devoted to the study of advanced methods applied to the control of the asynchronous machine in order to obtain a high-performance control system. While the criteria of response time, overshoot, and static error can be met by conventional control techniques, the criterion of robustness remains a challenge for researchers. This criterion can be satisfied only by applying advanced control techniques. After the mathematical modeling of the asynchronous machine, control strategies based on rotor flux orientation are defined. The results of the different simulation tests highlight the robustness properties of the proposed algorithms and allow the different control strategies to be compared.

  13. TCSC robust damping controller design based on particle swarm optimization for a multi-machine power system

    Energy Technology Data Exchange (ETDEWEB)

    Shayeghi, H., E-mail: hshayeghi@gmail.co [Technical Engineering Department, University of Mohaghegh Ardabili, Ardabil (Iran, Islamic Republic of); Shayanfar, H.A. [Center of Excellence for Power System Automation and Operation, Electrical Engineering Department, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of); Jalilzadeh, S.; Safari, A. [Technical Engineering Department, Zanjan University, Zanjan (Iran, Islamic Republic of)

    2010-10-15

    In this paper, a new approach based on the particle swarm optimization (PSO) technique is proposed to tune the parameters of the thyristor controlled series capacitor (TCSC) power oscillation damping controller. The design problem of the damping controller is converted into an optimization problem with a time-domain-based objective function, which is solved by PSO, a technique with a strong ability to locate near-optimal solutions. To ensure the robustness of the proposed stabilizers, the design process takes a wide range of operating conditions into account. The performance of the newly designed controller is evaluated in a four-machine power system subjected to different types of disturbances, in comparison with a genetic algorithm based damping controller. The effectiveness of the proposed controller is demonstrated through nonlinear time-domain simulation and studies of several performance indices. Analysis of the results reveals that the PSO-tuned TCSC damping controller using the proposed fitness function has an excellent capability for damping power system inter-area oscillations and greatly enhances the dynamic stability of power systems. Moreover, it is superior to the genetic algorithm based damping controller.
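The tuning loop described above can be illustrated with a generic PSO minimizing a toy time-domain cost. The objective below is a hypothetical ITAE-style cost on a synthetic damped response, not the paper's multi-machine power system model, and the gain bounds are arbitrary:

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)]                          # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2,) + x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                         # keep gains in bounds
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)]
    return g, pbest_f.min()

def objective(k):
    """Hypothetical time-weighted absolute error of a damped oscillation;
    damping grows with the controller gain k[0]."""
    t = np.linspace(0, 10, 500)
    y = np.exp(-k[0] * t) * np.cos(2 * np.pi * t)
    return np.sum(t * np.abs(y)) * (t[1] - t[0])

best_k, best_f = pso(objective, bounds=np.array([[0.1, 5.0]]))
```

The same loop applies to the real design problem once `objective` is replaced by a simulation of the closed-loop system over the chosen operating conditions.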

  14. Dynamic thermal analysis of machines in running state

    CERN Document Server

    Wang, Lihui

    2014-01-01

    With the increasing complexity and dynamism in today’s machine design and development, more precise, robust and practical approaches and systems are needed to support machine design. Existing design methods treat the targeted machine as stationary. Analysis and simulation are mostly performed at the component level. Although there are some computer-aided engineering tools capable of motion analysis, vibration simulation etc., the machine itself is in the dry-run state. For effective machine design, understanding its thermal behaviours is crucial in achieving the desired performance in real situations. Dynamic Thermal Analysis of Machines in Running State presents a set of innovative solutions to dynamic thermal analysis of machines when they are put under actual working conditions. The objective is to better understand the thermal behaviours of a machine in real situations while still at the design stage. The book has two major sections, with the first section presenting a broad-based review of the key areas of ...

  15. The ORACLE Children Study: educational outcomes at 11 years of age following antenatal prescription of erythromycin or co-amoxiclav.

    Science.gov (United States)

    Marlow, Neil; Bower, Hannah; Jones, David; Brocklehurst, Peter; Kenyon, Sara; Pike, Katie; Taylor, David; Salt, Alison

    2017-03-01

    Antibiotics used for women in spontaneous preterm labour without overt infection, in contrast to those with preterm rupture of membranes, are associated with altered functional outcomes in their children. From the National Pupil Database, we used Key Stage 2 scores, national test scores in school year 6 at 11 years of age, to explore the hypothesis that erythromycin and co-amoxiclav were associated with poorer educational outcomes within the ORACLE Children Study. Anonymised scores for 97% of surviving children born to mothers recruited to ORACLE and resident in England were analysed against treatment group, adjusting for key available socio-demographic potential confounders. No association with crude or with adjusted scores for English, mathematics or science was observed by maternal antibiotic group in either women with preterm rupture of membranes or spontaneous preterm labour with intact membranes. While the proportion of children with special educational needs was similar in each group (range 31.6-34.4%), it was higher than the national rate of 19%. Despite evidence that antibiotics are associated with increased functional impairment at 7 years, educational test scores and special needs at 11 years of age show no differences between trial groups. ISRCTN Number 52995660 (original ORACLE trial number). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  16. Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning.

    Science.gov (United States)

    Gorban, A N; Mirkes, E M; Zinovyev, A

    2016-12-01

    Most machine learning approaches have stemmed from the principle of minimizing the mean squared distance, which rests on computationally efficient quadratic optimization methods. However, when faced with high-dimensional and noisy data, quadratic error functionals demonstrate many weaknesses, including high sensitivity to contaminating factors and the curse of dimensionality. Therefore, many recent applications in machine learning have exploited properties of non-quadratic error functionals based on the L1 norm or even sub-linear potentials corresponding to quasinorms Lp (0 < p < 1). The paper develops piece-wise quadratic approximations (PQSQ potentials) of such arbitrary error functionals, based on the application of min-plus algebra. The approach can be applied in most existing machine learning methods, including methods of data approximation and regularized and sparse regression, leading to an improvement in the computational cost/accuracy trade-off. We demonstrate that on synthetic and real-life datasets PQSQ-based machine learning methods achieve orders of magnitude faster computational performance than the corresponding state-of-the-art methods, with similar or better approximation accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Investigation on Oracle GoldenGate Veridata for Data Consistency in WLCG Distributed Database Environment

    OpenAIRE

    Asko, Anti; Lobato Pardavila, Lorena

    2014-01-01

    In a distributed database environment, data divergence can be a serious problem: if it is not discovered and correctly identified, incorrect data can lead to poor decision making, service errors and operational errors. Oracle GoldenGate Veridata is a product to compare two sets of data and identify and report on data that is out of synchronization. IT DB is providing a replication service between databases at CERN and other computer centers worldwide as a par...

  18. Robust non-wetting PTFE surfaces by femtosecond laser machining.

    Science.gov (United States)

    Liang, Fang; Lehr, Jorge; Danielczak, Lisa; Leask, Richard; Kietzig, Anne-Marie

    2014-08-08

    Nature shows many examples of surfaces with extraordinary wettability, which can often be associated with particular air-trapping surface patterns. Here, robust non-wetting surfaces have been created by femtosecond laser ablation of polytetrafluoroethylene (PTFE). The laser-created surface structure resembles a forest of entangled fibers, which support structural superhydrophobicity even when the surface chemistry is changed by gold coating. SEM analysis showed that the degree of entanglement of hairs and the depth of the forest pattern correlates positively with accumulated laser fluence and can thus be influenced by altering various laser process parameters. The resulting fibrous surfaces exhibit a tremendous decrease in wettability compared to smooth PTFE surfaces; droplets impacting the virgin or gold coated PTFE forest do not wet the surface but bounce off. Exploratory bioadhesion experiments showed that the surfaces are truly air-trapping and do not support cell adhesion. Therewith, the created surfaces successfully mimic biological surfaces such as insect wings with robust anti-wetting behavior and potential for antiadhesive applications. In addition, the fabrication can be carried out in one process step, and our results clearly show the insensitivity of the resulting non-wetting behavior to variations in the process parameters, both of which make it a strong candidate for industrial applications.

  19. Robust Non-Wetting PTFE Surfaces by Femtosecond Laser Machining

    Directory of Open Access Journals (Sweden)

    Fang Liang

    2014-08-01

    Full Text Available Nature shows many examples of surfaces with extraordinary wettability, which can often be associated with particular air-trapping surface patterns. Here, robust non-wetting surfaces have been created by femtosecond laser ablation of polytetrafluoroethylene (PTFE). The laser-created surface structure resembles a forest of entangled fibers, which support structural superhydrophobicity even when the surface chemistry is changed by gold coating. SEM analysis showed that the degree of entanglement of hairs and the depth of the forest pattern correlates positively with accumulated laser fluence and can thus be influenced by altering various laser process parameters. The resulting fibrous surfaces exhibit a tremendous decrease in wettability compared to smooth PTFE surfaces; droplets impacting the virgin or gold coated PTFE forest do not wet the surface but bounce off. Exploratory bioadhesion experiments showed that the surfaces are truly air-trapping and do not support cell adhesion. Therewith, the created surfaces successfully mimic biological surfaces such as insect wings with robust anti-wetting behavior and potential for antiadhesive applications. In addition, the fabrication can be carried out in one process step, and our results clearly show the insensitivity of the resulting non-wetting behavior to variations in the process parameters, both of which make it a strong candidate for industrial applications.

  20. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  1. Fine-Grained Forward-Secure Signature Schemes without Random Oracles

    DEFF Research Database (Denmark)

    Camenisch, Jan; Koprowski, Maciej

    2006-01-01

    We propose the concept of fine-grained forward-secure signature schemes. Such signature schemes not only provide nonrepudiation w.r.t. past time periods the way ordinary forward-secure signature schemes do but, in addition, allow the signer to specify which signatures of the current time period remain valid when revoking the public key. This is an important advantage if the signer produces many signatures per time period, as otherwise the signer would have to re-issue those signatures (and possibly re-negotiate the respective messages) with a new key. Apart from a formal model for fine-grained forward-secure signature schemes, we present practical schemes and prove them secure under the strong RSA assumption only, i.e., we do not resort to the random oracle model to prove security. As a side-result, we provide an ordinary forward-secure scheme whose key-update time is significantly smaller than ...

  2. Production of biofuels and biochemicals: in need of an ORACLE.

    Science.gov (United States)

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

    The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.

  3. Model predictive control of hybrid systems : stability and robustness

    NARCIS (Netherlands)

    Lazar, M.

    2006-01-01

    This thesis considers the stabilization and the robust stabilization of certain classes of hybrid systems using model predictive control. Hybrid systems represent a broad class of dynamical systems in which discrete behavior (usually described by a finite state machine) and continuous behavior

  4. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso.

    Science.gov (United States)

    Kong, Shengchun; Nan, Bin

    2014-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive non-asymptotic oracle inequalities for the lasso penalized Cox regression, using pointwise arguments to tackle the difficulties caused by the lack of iid Lipschitz losses.
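The lasso penalization underlying this analysis can be illustrated on the simpler squared loss; the coordinate-descent sketch below stands in for the Cox partial-likelihood setting the paper actually studies, and the data are synthetic:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the building block of lasso estimators."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for lasso on squared loss (illustration only --
    the paper replaces this loss with the negative log partial likelihood)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            beta[j] = soft_threshold(X[:, j] @ r, n * lam) / col_sq[j]
    return beta

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
true_beta = np.zeros(10)
true_beta[:3] = [2.0, -1.5, 1.0]                   # sparse truth
y = X @ true_beta + 0.1 * rng.standard_normal(200)
beta_hat = lasso_cd(X, y, lam=0.1)
```

The oracle inequalities in the paper bound how far such a penalized estimate can be from the sparse truth in finite samples.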

  5. Development of the LEP high level control system using ORACLE as an online database

    International Nuclear Information System (INIS)

    Bailey, R.; Belk, A.; Collier, P.; Lamont, M.; De Rijk, G.; Tarrant, M.

    1994-01-01

    A complete rewrite of the high level application software for the control of LEP has been carried out. ORACLE was evaluated and subsequently used as the on-line database in the implementation of the system. All control information and settings are stored on this database. This paper describes the project development cycle, the method used, the use of CASE and the project management used by the team. The performance of the system and the database, and their impact on LEP performance, are discussed. ((orig.))

  6. Řešení BI a DWH v prostředí Oracle (BI and DWH Solutions on the Oracle Platform)

    OpenAIRE

    Podbraný, Petr

    2009-01-01

    The main objective of the thesis is to propose a complex business intelligence and data warehouse architecture on the Oracle platform respecting current best practice and recommendations. Part of the thesis is focused on selecting the right components for data warehousing and business intelligence, as well as on evaluating their suitability within the proposed business intelligence architecture and incorporating these components into the complete architecture. My thesis ...

  7. A Review of Design Optimization Methods for Electrical Machines

    Directory of Open Access Journals (Sweden)

    Gang Lei

    2017-11-01

    Full Text Available Electrical machines are the hearts of many appliances, industrial equipment and systems. In the context of global sustainability, they must fulfill various requirements, not only physically and technologically but also environmentally. Therefore, their design optimization process becomes more and more complex as more engineering disciplines/domains and constraints are involved, such as electromagnetics, structural mechanics and heat transfer. This paper aims to present a review of the design optimization methods for electrical machines, including design analysis methods and models, optimization models, algorithms and methods/strategies. Several efficient optimization methods/strategies are highlighted with comments, including surrogate-model based and multi-level optimization methods. In addition, two promising and challenging topics in both academic and industrial communities are discussed, and two novel optimization methods are introduced for advanced design optimization of electrical machines. First, a system-level design optimization method is introduced for the development of advanced electric drive systems. Second, a robust design optimization method based on the design for six-sigma technique is introduced for high-quality manufacturing of electrical machines in production. Meanwhile, a proposal is presented for the development of a robust design optimization service based on industrial big data and cloud computing services. Finally, five future directions are proposed, including smart design optimization method for future intelligent design and production of electrical machines.

  8. PERANCANGAN DAN PEMBUATAN APLIKASI ERD GENERATOR NOTASI ORM DARI SKRIP BASIS DATA ORACLE BERBASIS J2EE

    Directory of Open Access Journals (Sweden)

    Leo Willyanto Santoso

    2004-01-01

    Full Text Available Database processing is needed by many institutions and companies. A database not only speeds up information retrieval, it also improves service to customers. For companies, this advantage can increase competitiveness, which is why many companies that use manual processing are turning to databases. Accordingly, database reverse engineering has become a necessity for database developers to understand the structure of a database. Commonly, this structure is modeled in some notation as an Entity Relationship Diagram (ERD). The graphical visualization of database structure in an ERD can use many notations, making it easier to understand. One approach that is easy to understand is the Object Role Modeling (ORM) diagram. By reverse engineering the mapping process to the relational schema of a database, ERD generation from an Oracle data definition language script can be done. For more flexibility, this application is web based, built with the Servlets technology provided by the Java 2 SDK.

  9. [The historical materials of stomatology in the oracle bone inscriptions of the Yin-Shang Dynasties].

    Science.gov (United States)

    Li, Xiaojun; Zhu, Lang

    2015-07-01

    Some oracle bone inscriptions of the Yin-Shang Dynasties were related to the stomatology, including special terms of diseases of the mouth, tongue and teeth which were classified, and proper nouns of some special diseases. Moreover, witch doctors' exploration for the causes of oral diseases, the observation on different stages of oral diseases, and the records of oral disease treatment were also involved. All of these reflected the sprouting stage of stomatology in the Yin-Shang Dynasties in ancient China.

  10. Robust Optimization Approach for the Design of a Dynamic Cell Formation Considering Labor Utilization: Bi-objective Mathematical Model

    Directory of Open Access Journals (Sweden)

    Hiwa Farughi

    2016-05-01

    Full Text Available In this paper, robust optimization of a bi-objective mathematical model for a dynamic cell formation problem considering labor utilization with uncertain data is carried out. The robust approach is used to reduce the effects of fluctuations of the uncertain parameters with regard to all possible future scenarios. In this research, cost parameters of the cell formation and demand fluctuations are subject to uncertainty, and a mixed-integer programming (MIP) model is developed to formulate the related robust dynamic cell formation problem. The problem is then transformed into a bi-objective linear one. The first objective function seeks to minimize the relevant costs of the problem, including machine procurement and relocation costs, machine variable cost, inter-cell and intra-cell movement costs, overtime cost, labor shifting cost between cells, machine maintenance cost, and part inventory holding cost. The second objective function seeks to minimize total man-hour deviations between cells, that is, the labor utilization of the model.

  11. Robust free-space optical communication for indoor information environment

    Science.gov (United States)

    Nakada, Toyohisa; Itoh, Hideo; Kunifuji, Susumu; Nakashima, Hideyuki

    2003-10-01

    The purpose of our study is to establish robust communication, while preserving security and privacy, between a handheld communicator and the surrounding information environment. From the viewpoint of low power consumption, we have been developing a reflectivity-modulating communication module composed of a liquid crystal light modulator and a corner-reflecting mirror sheet. We installed a corner-reflecting sheet instead of a light-scattering sheet in a handheld videogame machine with a reflection-type liquid crystal display screen. An infrared (IR) LED illuminator attached next to the IR camera of a base station illuminates the whole room, and the terminals send their data to the base station by switching the reflected IR beam ON and OFF. The intensity of reflected light differs with the position and direction of the terminal, and sometimes the intensity of the OFF signal under one condition is brighter than that of the ON signal under another. To improve the communication quality, machine learning techniques are a possible solution. In this paper, we compare various machine learning techniques for the purpose of free-space optical communication, and propose a new algorithm that improves the robustness of the data link.

  12. Cloud Condensation Nuclei Measurements During the First Year of the ORACLES Study

    Science.gov (United States)

    Kacarab, M.; Howell, S. G.; Wood, R.; Redemann, J.; Nenes, A.

    2016-12-01

    Aerosols have significant impacts on air quality and climate. Their ability to scatter and absorb radiation and to act as cloud condensation nuclei (CCN) plays a very important role in the global climate. Biomass burning organic aerosol (BBOA) can drastically elevate the concentration of CCN in clouds, but the response in droplet number may be strongly suppressed (or even reversed) owing to low supersaturations that may develop from the strong competition of water vapor (Bougiatioti et al. 2016). Understanding and constraining the magnitude of droplet response to biomass burning plumes is an important component of the aerosol-cloud interaction problem. The southeastern Atlantic (SEA) cloud deck provides a unique opportunity to study these cloud-BBOA interactions for marine stratocumulus, as it is overlain by a large, optically thick biomass burning aerosol plume from Southern Africa during the burning season. The interaction between these biomass burning aerosols and the SEA cloud deck is being investigated in the NASA ObseRvations of Aerosols above Clouds and their intEractionS (ORACLES) study. The CCN activity of aerosol around the SEA cloud deck and associated biomass burning plume was evaluated during the first year of the ORACLES study with direct measurements of CCN concentration, aerosol size distribution and composition onboard the NASA P-3 aircraft during August and September of 2016. Here we present analysis of the observed CCN activity of the BBOA aerosol in and around the SEA cloud deck and its relationship to aerosol size, chemical composition, and plume mixing and aging. We also evaluate the predicted and observed droplet number sensitivity to the aerosol fluctuations and quantify, using the data, the drivers of droplet number variability (vertical velocity or aerosol properties) as a function of biomass burning plume characteristics.

  13. A double-sided linear primary permanent magnet vernier machine.

    Science.gov (United States)

    Du, Yi; Zou, Chunhua; Liu, Xianxing

    2015-01-01

    The purpose of this paper is to present a new double-sided linear primary permanent magnet (PM) vernier (DSLPPMV) machine, which can offer high thrust force, low detent force, and improved power factor. Both the PMs and windings of the proposed machine are on the short translator, while the long stator is designed as a double-sided simple iron core with salient teeth, so that it is very robust for transmitting high thrust force. The key features of this new machine are the introduction of the double stator and the elimination of the translator yoke, so that the inductance and the volume of the machine can be reduced. Hence, the proposed machine offers improved power factor and thrust force density. The electromagnetic performance of the proposed machine is analyzed, including flux, no-load EMF, thrust force density, and inductance. Using finite element analysis, the characteristics and performance of the proposed machine are assessed.

  14. Spatial extreme learning machines: An application on prediction of disease counts.

    Science.gov (United States)

    Prates, Marcos O

    2018-01-01

    Extreme learning machines have gained a lot of attention from the machine learning community because of their interesting properties and computational advantages. With the growth in data collection nowadays, many sources of data have missing information, making statistical analysis harder or unfeasible. In this paper, we present a new model, coined the spatial extreme learning machine, that combines spatial modeling with extreme learning machines, keeping the nice properties of both methodologies and making it very flexible and robust. As explained throughout the text, spatial extreme learning machines have many advantages in comparison with traditional extreme learning machines. Through a simulation study and a real data analysis, we show how the spatial extreme learning machine can be used to improve imputation of missing data and uncertainty prediction estimation.
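The extreme learning machine at the core of this model is simple to sketch: hidden-layer weights are drawn at random and only the output layer is fit by least squares. The toy regression below omits the spatial random-effects component the paper adds:

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Minimal extreme learning machine (illustrative; no spatial component)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                            # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)      # only the output layer is fit
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: recover a noisy sine curve.
rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(300, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(300)
W, b, beta = elm_fit(X, y)
y_hat = elm_predict(X, W, b, beta)
```

Because only a linear least-squares problem is solved, training is a single matrix factorization, which is the computational advantage the abstract refers to.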

  15. Oracle estimation of parametric models under boundary constraints.

    Science.gov (United States)

    Wong, Kin Yau; Goldberg, Yair; Fine, Jason P

    2016-12-01

    In many classical estimation problems, the parameter space has a boundary. In most cases, the standard asymptotic properties of the estimator do not hold when some of the underlying true parameters lie on the boundary. However, without knowledge of the true parameter values, confidence intervals constructed assuming that the parameters lie in the interior are generally over-conservative. A penalized estimation method is proposed in this article to address this issue. An adaptive lasso procedure is employed to shrink the parameters to the boundary, yielding oracle inference that adapts to whether or not the true parameters are on the boundary. When the true parameters are on the boundary, the inference is equivalent to that which would be achieved with a priori knowledge of the boundary; otherwise, the inference is equivalent to that obtained in the interior of the parameter space. The method is demonstrated under two practical scenarios, namely the frailty survival model and linear regression with order-restricted parameters. Simulation studies and real data analyses show that the method performs well with realistic sample sizes and exhibits certain advantages over standard methods. © 2016, The International Biometric Society.

  16. Asynchronous machines. Direct torque control; Machines asynchrones. Commande par controle direct de couple

    Energy Technology Data Exchange (ETDEWEB)

    Fornel, B. de [Institut National Polytechnique, 31 - Toulouse (France)

    2006-05-15

    The asynchronous machine, with its low cost and robustness, is today the most widely used motor in variable-speed drives. However, its main drawback is that the same current generates both the magnetic flux and the torque, so any torque variation creates a flux variation. Such a coupling gives the asynchronous machine a nonlinear behaviour which makes its control much more complex. The direct self control (DSC) method was developed to improve on the low efficiency of the scalar control method, specifically for railway drive applications. The direct torque control (DTC) method is derived from the DSC method but corresponds to other types of applications. The DSC and DTC algorithms for asynchronous motors are presented in this article: 1 - direct control of the stator flux (DSC): principle, flux control, torque control, switching frequency of the inverter, speed estimation; 2 - direct torque control (DTC): principle, electromagnetic torque derivative, signal shapes and switching frequency, some results, DTC speed variator without speed sensor, DTC application to multi-machine multi-converter systems; 3 - conclusion. (J.S.)
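In its classic switching-table form, the DTC principle described above reduces to two hysteresis comparators and a sector-indexed voltage-vector lookup. The sketch below uses the well-known Takahashi-style table; the hysteresis band widths are illustrative values, not the article's:

```python
def dtc_select_vector(sector, flux_err, torque_err, d_flux=0.01, d_torque=0.05):
    """One DTC decision step (sketch). sector: 1..6 around the stator flux angle;
    active inverter voltage vectors are numbered 1..6, 0 denotes a zero vector.
    Band widths d_flux and d_torque are illustrative, not from the article."""
    phi = 1 if flux_err > d_flux else 0          # 1: stator flux must increase
    if torque_err > d_torque:
        tau = 1                                  # torque must increase
    elif torque_err < -d_torque:
        tau = -1                                 # torque must decrease
    else:
        return 0                                 # within band: apply a zero vector
    if phi == 1:
        step = 1 if tau == 1 else -1             # V_{k+1} raises torque, V_{k-1} lowers it
    else:
        step = 2 if tau == 1 else -2             # V_{k+2} / V_{k-2} while weakening flux
    return (sector - 1 + step) % 6 + 1
```

At each control period the estimated flux and torque errors are fed to this selector, and the chosen vector is applied until the next sample, which is what makes the switching frequency variable.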

  17. ORACLE: an adjusted cross-section and covariance library for fast-reactor analysis

    International Nuclear Information System (INIS)

    Yeivin, Y.; Marable, J.H.; Weisbin, C.R.; Wagschal, J.J.

    1980-01-01

    Benchmark integral-experiment values from six fast critical-reactor assemblies and two standard neutron fields are combined with corresponding calculations using group cross sections based on ENDF/B-V in a least-squares data adjustment using evaluated covariances from ENDF/B-V and supporting covariance evaluations. Purpose is to produce an adjusted cross-section and covariance library which is based on well-documented data and methods and which is suitable for fast-reactor design. By use of such a library, data- and methods-related biases of calculated performance parameters should be reduced and uncertainties of the calculated values minimized. Consistency of the extensive data base is analyzed using the chi-square test. This adjusted library ORACLE will be available shortly

  18. An approach in building a chemical compound search engine in oracle database.

    Science.gov (United States)

    Wang, H; Volarath, P; Harrison, R

    2005-01-01

    Searching for and identifying chemical compounds are important processes in drug design and in chemistry research. An efficient search engine involves a close coupling of the search algorithm and the database implementation. The database must process chemical structures, which demands approaches to represent, store, and retrieve structures in a database system. In this paper, a general database framework for a chemical compound search engine in the Oracle database is described. The framework is devoted to eliminating data-type constraints for potential search algorithms, which is a crucial step toward building a domain-specific query language on top of SQL. A search engine implementation based on the database framework is also demonstrated. The convenience of the implementation emphasizes the efficiency and simplicity of the framework.
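One common way compound search engines gain efficiency, shown here only as a hedged illustration (the paper does not specify its representation), is a two-stage query: a cheap fingerprint prescreen followed by exact matching on the survivors. The element-count "fingerprint" and all names below are deliberately toy choices:

```python
from collections import Counter

def fingerprint(atoms):
    """Toy fingerprint: multiset of element symbols in a structure."""
    return Counter(atoms)

def may_contain(candidate_fp, query_fp):
    """Prescreen: a candidate can only contain the query fragment if it has
    at least as many of every element (necessary, not sufficient)."""
    return all(candidate_fp[e] >= n for e, n in query_fp.items())

# Hypothetical in-memory 'table' of compounds.
db = {
    "ethanol": fingerprint(["C", "C", "O", "H", "H", "H", "H", "H", "H"]),
    "methane": fingerprint(["C", "H", "H", "H", "H"]),
    "water":   fingerprint(["O", "H", "H"]),
}
query = fingerprint(["C", "O"])   # fragment with one carbon and one oxygen
hits = [name for name, fp in db.items() if may_contain(fp, query)]
```

In a real engine the prescreen would run inside the database (e.g. via indexed bit fingerprints) and the surviving candidates would go to an exact subgraph matcher.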

  19. Conveyor belt nuclear weighing machine

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    In many industries the flow of materials on conveyor belts must be measured and controlled. Electromechanical weighing devices have high accuracy but are complicated and expensive to install and maintain. For many applications the nuclear weighing machine has sufficient accuracy but is considerably simpler, cheaper and more robust and is easier to maintain. The rating and performance of a gamma-ray balance on the market are detailed. (P.G.R.)

  20. A robust embedded vision system feasible white balance algorithm

    Science.gov (United States)

    Wang, Yuan; Yu, Feihong

    2018-01-01

    White balance is a very important part of the color image processing pipeline. To meet the need for efficiency and accuracy in an embedded machine vision processing system, an efficient and robust white balance algorithm combining several classical ones is proposed. The proposed algorithm has three main parts. Firstly, to guarantee higher efficiency, an initial parameter calculated from the statistics of the R, G and B components of the raw data is used to initialize the subsequent iterative method. After that, a bilinear interpolation algorithm is utilized to implement the demosaicing procedure. Finally, an adaptive step-adjustment scheme is introduced to ensure the controllability and robustness of the algorithm. To verify the proposed algorithm's performance on an embedded vision system, a smart camera based on the IMX6 DualLite, IMX291 and XC6130 was designed. Extensive experiments on a large number of images under different color temperatures and exposure conditions illustrate that the proposed white balance algorithm avoids color deviation effectively, achieves a good balance between efficiency and quality, and is suitable for embedded machine vision processing systems.
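A minimal sketch of the idea, assuming a gray-world initial parameter and a shrinking multiplicative step; the constants and the convergence criterion are illustrative, not the paper's:

```python
import numpy as np

def white_balance(img, iters=50, step=0.5, tol=1e-4):
    """Gray-world initialization plus an adaptive-step iterative refinement
    that pulls the R and B channel means toward the G mean."""
    img = img.astype(np.float64)
    # Initial per-channel gains from raw statistics (gray-world assumption).
    gains = np.array([img[..., 1].mean() / img[..., c].mean() for c in range(3)])
    for _ in range(iters):
        bal = img * gains
        means = bal.reshape(-1, 3).mean(axis=0)
        err = means[1] - means             # deviation of each channel from green
        if np.abs(err).max() < tol:
            break                          # balanced: stop iterating
        gains *= 1.0 + step * err / means[1]   # multiplicative update
        step *= 0.9                        # adaptive step shrink for stability
    return np.clip(img * gains, 0, 255)

# A reddish constant image should come out neutral (R = G = B).
img = np.full((4, 4, 3), [200.0, 100.0, 50.0])
out = white_balance(img)
```

Demosaicing is omitted; on real raw data it would sit between the statistics step and the iteration, as in the abstract's pipeline.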

  1. Managing virtual machines with Vac and Vcycle

    Science.gov (United States)

    McNab, A.; Love, P.; MacMahon, E.

    2015-12-01

    We compare the Vac and Vcycle virtual machine lifecycle managers and our experiences in providing production job execution services for ATLAS, CMS, LHCb, and the GridPP VO at sites in the UK, France and at CERN. In both the Vac and Vcycle systems, the virtual machines are created outside of the experiment's job submission and pilot framework. In the case of Vac, a daemon runs on each physical host and manages a pool of virtual machines on that host, and a peer-to-peer UDP protocol is used to achieve the desired target shares between experiments across the site. In the case of Vcycle, a daemon manages a pool of virtual machines on an Infrastructure-as-a-Service cloud system such as OpenStack, and has within itself enough information to create the types of virtual machines needed to achieve the desired target shares. Both systems allow unused shares of one experiment to be temporarily taken up by other experiments with work to be done. The virtual machine lifecycle is managed with a minimum of information, gathered from the virtual machine creation mechanism (such as libvirt or OpenStack) and using the proposed Machine/Job Features API from WLCG. We demonstrate that the same virtual machine designs can be used to run production jobs on Vac and Vcycle/OpenStack sites for ATLAS, CMS, LHCb, and GridPP, and that these technologies allow sites to be operated in a reliable and robust way.

  2. Haemodialysis at home: review of current dialysis machines.

    Science.gov (United States)

    Haroon, Sabrina; Davenport, Andrew

    2018-04-26

    Only a minority of patients with chronic kidney disease treated by hemodialysis are currently treated at home. Until relatively recently, the only type of hemodialysis machine available for these patients was a slightly smaller version of the standard machines used for in-center dialysis treatments. Areas covered: There is now an alternative generation of dialysis machines specifically designed for home hemodialysis. The home dialysis patient wants a smaller machine which is intuitive to use, easy to troubleshoot, robust and reliable, quick to set up and put away, and requires minimal waste disposal. The machines designed for home dialysis have some similarities, in terms of touch-screen patient interfaces and the use of pre-prepared cartridges to speed up setting up the machine. On the other hand, they differ in whether they use slower or standard dialysate flows, prepare batches of dialysis fluid, require separate water purification equipment or have it integrated, or use pre-prepared sterile bags of dialysis fluid. Expert commentary: Dialysis machine complexity is one of the hurdles reducing the number of patients opting for home hemodialysis, and the introduction of the newer generation of dialysis machines designed for ease of use will hopefully increase the number of patients opting for home hemodialysis.

  3. 17th Floor: A pedagogical oracle from/with Audre Lorde.

    Science.gov (United States)

    Gumbs, Alexis Pauline

    2017-10-02

    In 1974, warrior poet mother Audre Lorde published the poem "Blackstudies," a freeform dream villanelle about her complicated experience as a Black lesbian feminist English professor at the City University of New York during the dynamic period when students rose up in protest. The university granted open admissions, and cultural nationalists who taught at City University worked to create a Black Studies program. In the poem, she describes her vantage point at this particular historical and pedagogical moment from the seventeenth floor within a dreamscape where she navigates the stereotypes, silences, and urgencies that shaped her experience as an educator. 17th Floor is a poetic oracle that contextualizes the ongoing work of "Blackstudies" (the poem and the practice), and for this reason, it should be activated as a resource for current Black and Brown lesbian educators and everyone who brings complexity and nuance to their teaching settings, their students, each other, and the world more broadly.

  4. Sequential data access with Oracle and Hadoop: a performance comparison

    International Nuclear Information System (INIS)

    Baranowski, Zbigniew; Canali, Luca; Grancher, Eric

    2014-01-01

    The Hadoop framework has proven to be an effective and popular approach for dealing with 'Big Data' and, thanks to its scaling ability and optimised storage access, Hadoop Distributed File System-based projects such as MapReduce or HBase are seen as candidates to replace traditional relational database management systems whenever scalable speed of data processing is a priority. But do these projects deliver in practice? Does migrating to Hadoop's 'shared nothing' architecture really improve data access throughput? And, if so, at what cost? The authors answer these questions, addressing cost/performance as well as raw performance, based on a performance comparison between an Oracle-based relational database and Hadoop's distributed solutions like MapReduce or HBase for sequential data access. A key feature of our approach is the use of an unbiased data model, as certain data models can significantly favour one of the technologies tested.

  5. Designing a nuclear data base prototype using Oracle and Prolog

    International Nuclear Information System (INIS)

    Paviotti-Corcuera, R.; Ford, C.E.; Perez, R.B.

    1988-11-01

    An ever-increasing demand exists for easily accessible nuclear data base systems. The purpose of this work is to analyze the feasibility of using artificial intelligence methods as tools to provide the necessary functionality to extract information from nuclear data files in a user-friendly manner. For the prototype in this work, a sample of data has been used that can later be enlarged to a complete evaluated nuclear data base. To implement this prototype, two approaches have been followed: a conventional approach using the commercially available Oracle relational data base management system, and an artificial intelligence approach using the Prolog programming language. This prototype work shows the feasibility of applying artificial intelligence methods to data bases and represents a first step toward the development of intelligent nuclear data base systems. The characteristics of the query languages of the two approaches make the second preferable from a user's point of view. 23 refs., 7 tabs

  6. Use of the ORACLE DBMS in determining the response of complex scientific instrumentation

    International Nuclear Information System (INIS)

    Auerbach, J.M.; DeMartini, B.J.; McCauley, E.W.

    1984-01-01

    In the Laser Fusion Program at Lawrence Livermore National Laboratory, a single laser fusion experiment lasts only a billionth of a second, but in this time high-speed instrumentation collects data that, when digitized, creates a data bank of several megabytes. This first level of data must be processed in several stages to put it in a form useful for interpretation of the experiments. One stage involves unfolding the source characteristics from the data and the response of the instrument. This requires calculating the response of the instrument from the characteristics of each of its components. It is in this calculation that the ORACLE DBMS has become an invaluable tool for the manipulation and archiving of the component data.

  7. Robust PID based power system stabiliser: Design and real-time implementation

    Energy Technology Data Exchange (ETDEWEB)

    Bevrani, Hassan [Department of Electrical and Computer Eng., University of Kurdistan, Sanandaj (Iran, Islamic Republic of); Hiyama, Takashi [Department of Electrical and Computer Eng., Kumamoto University, Kumamoto (Japan); Bevrani, Hossein [Department of Statistics, University of Tabriz, Tabriz (Iran, Islamic Republic of)

    2011-02-15

    This paper addresses a new robust control strategy for the synthesis of robust proportional-integral-derivative (PID) based power system stabilisers (PSS). The PID-based PSS design problem is reduced to finding an optimal gain vector via an H{infinity} static output feedback control (H{infinity}-SOF) technique, and the solution is easily carried out using an iterative linear matrix inequality algorithm developed by the authors. To illustrate the developed approach, a real-time experiment has been performed for a longitudinal four-machine infinite-bus system using the Analog Power System Simulator at the Research Laboratory of the Kyushu Electric Power Company. The results of the proposed control strategy are compared with full-order H{infinity} and conventional PSS designs. The robust PSS is shown to maintain robust performance and to properly minimise the effect of disturbances. (author)

  8. Optimization on robot arm machining by using genetic algorithms

    Science.gov (United States)

    Liu, Tung-Kuan; Chen, Chiu-Hung; Tsai, Shang-En

    2007-12-01

    In this study, an optimization problem for robot arm machining is formulated and solved using genetic algorithms (GAs). The proposed approach adopts a direct kinematics model and utilizes the GA's global search ability to find the optimum solution. The direct kinematics equations of the robot arm are formulated and can be used to compute the end-effector coordinates. Based on these, the objective of optimum machining along a set of points can be evaluated evolutionarily from the distance between the machining points and the end-effector positions. Besides, a 3D CAD application, CATIA, is used to build 3D models of the robot arm, work-pieces and their components. A simulated experiment in CATIA is first used to verify the computation results, and practical control of the robot arm through the RS232 port is also performed. From the results, this approach proves robust and suitable for most machining needs when robot arms are adopted as the machining tools.
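The approach can be illustrated with a tiny GA on a two-link planar arm: direct kinematics gives the end-effector position, and fitness is the distance to a machining point. The link lengths, GA operators, and all settings below are assumptions for illustration, not the paper's setup:

```python
import math, random

random.seed(0)
L1, L2 = 1.0, 1.0            # assumed link lengths
TARGET = (1.2, 0.8)          # one machining point to reach

def end_effector(t1, t2):
    """Direct kinematics of a two-link planar arm."""
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def fitness(ind):
    """Distance between end-effector and the machining point (minimize)."""
    x, y = end_effector(*ind)
    return math.hypot(x - TARGET[0], y - TARGET[1])

def evolve(pop_size=40, gens=60):
    pop = [(random.uniform(-math.pi, math.pi), random.uniform(-math.pi, math.pi))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]               # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            t1 = random.choice([a[0], b[0]]) + random.gauss(0, 0.05)  # crossover
            t2 = random.choice([a[1], b[1]]) + random.gauss(0, 0.05)  # + mutation
            children.append((t1, t2))
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
```

Extending this to a full toolpath would sum the fitness over the whole set of machining points, as the abstract describes.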

  9. Machine Learning and Conflict Prediction: A Use Case

    Directory of Open Access Journals (Sweden)

    Chris Perry

    2013-10-01

    Full Text Available For at least the last two decades, the international community in general, and the United Nations specifically, have attempted to develop robust, accurate and effective conflict early warning systems for conflict prevention. One potential and promising component of integrated early warning systems lies in the field of machine learning. This paper aims to give conflict analysts a basic understanding of machine learning methodology and to test the feasibility and added value of such an approach. The paper finds that the selection of appropriate machine learning methodologies can offer substantial improvements in accuracy and performance. It also finds that, even at this early stage of testing machine learning on conflict prediction, full models offer more predictive power than simply using a prior outbreak of violence as the leading indicator of current violence. This suggests that a refined data selection methodology combined with strategic use of machine learning algorithms could indeed offer a significant addition to the early warning toolkit. Finally, the paper suggests a number of steps moving forward to improve upon this initial test methodology.
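The paper's key comparison, a learned model versus the "prior outbreak" baseline, can be reproduced on synthetic data. Everything below (the data-generating process, the extra covariate, and the training loop) is invented for illustration, not drawn from the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
prior = rng.integers(0, 2, n)            # prior outbreak of violence (0/1)
covar = rng.normal(size=n)               # e.g. a hypothetical stress indicator
logit = 2.0 * prior + 2.0 * covar - 1.0  # true risk depends on both
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Baseline: predict current violence = prior violence.
baseline_acc = (prior == y).mean()

# Tiny logistic regression on both features, trained by gradient descent.
X = np.column_stack([np.ones(n), prior, covar])
w = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n
model_acc = ((X @ w > 0).astype(int) == y).mean()
```

Whenever conflict risk depends on more than the lagged outcome, the full model's accuracy exceeds the baseline's, which is the paper's central finding in miniature.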

  10. Managing IaaS and DBaaS clouds with Oracle Enterprise Manager Cloud Control 12c

    CERN Document Server

    Antani, Ved

    2013-01-01

    This book is a step-by-step tutorial filled with practical examples which show readers how to configure and manage IaaS and DBaaS with Oracle Enterprise Manager. If you are a cloud administrator or a user of self-service provisioning systems offered by Enterprise Manager, this book is ideal for you. It will also help administrators who want to understand the chargeback mechanism offered by Enterprise Manager. An understanding of the basic building blocks of cloud computing, such as networking, virtualization and storage, is needed by those interested in this book.

  11. An immunohistochemical and fluorescence in situ hybridization-based comparison between the Oracle HER2 Bond Immunohistochemical System, Dako HercepTest, and Vysis PathVysion HER2 FISH using both commercially validated and modified ASCO/CAP and United Kingdom HER2 IHC scoring guidelines.

    LENUS (Irish Health Repository)

    O'Grady, Anthony

    2010-12-01

    Immunohistochemistry (IHC) is used as the frontline assay to determine HER2 status in invasive breast cancer patients. The aim of the study was to compare the performance of the Leica Oracle HER2 Bond IHC System (Oracle) with the currently most widely accepted Dako HercepTest (HercepTest), using both commercially validated and modified ASCO/CAP and UK HER2 IHC scoring guidelines. A total of 445 breast cancer samples from 3 international clinical HER2 referral centers were stained with the 2 test systems and scored in a blinded fashion by experienced pathologists. The overall agreement between the 2 tests in a 3×3 (negative, equivocal and positive) analysis shows a concordance of 86.7% and 86.3%, respectively, when analyzed using commercially validated and modified ASCO/CAP and UK HER2 IHC scoring guidelines. There is good concordance between the Oracle and the HercepTest. The advantages of a complete, fully automated test such as the Oracle include standardization of key analytical factors and improved turnaround time. The implementation of the modified ASCO/CAP and UK HER2 IHC scoring guidelines has minimal effect on either assay's interpretation, showing that Oracle can be used as a methodology for accurately determining HER2 IHC status in formalin-fixed, paraffin-embedded breast cancer tissue.

  12. An immunohistochemical and fluorescence in situ hybridization-based comparison between the Oracle HER2 Bond Immunohistochemical System, Dako HercepTest, and Vysis PathVysion HER2 FISH using both commercially validated and modified ASCO/CAP and United Kingdom HER2 IHC scoring guidelines.

    Science.gov (United States)

    O'Grady, Anthony; Allen, David; Happerfield, Lisa; Johnson, Nicola; Provenzano, Elena; Pinder, Sarah E; Tee, Lilian; Gu, Mai; Kay, Elaine W

    2010-12-01

    Immunohistochemistry (IHC) is used as the frontline assay to determine HER2 status in invasive breast cancer patients. The aim of the study was to compare the performance of the Leica Oracle HER2 Bond IHC System (Oracle) with the currently most widely accepted Dako HercepTest (HercepTest), using both commercially validated and modified ASCO/CAP and UK HER2 IHC scoring guidelines. A total of 445 breast cancer samples from 3 international clinical HER2 referral centers were stained with the 2 test systems and scored in a blinded fashion by experienced pathologists. The overall agreement between the 2 tests in a 3×3 (negative, equivocal and positive) analysis shows a concordance of 86.7% and 86.3%, respectively, when analyzed using commercially validated and modified ASCO/CAP and UK HER2 IHC scoring guidelines. There is good concordance between the Oracle and the HercepTest. The advantages of a complete, fully automated test such as the Oracle include standardization of key analytical factors and improved turnaround time. The implementation of the modified ASCO/CAP and UK HER2 IHC scoring guidelines has minimal effect on either assay's interpretation, showing that Oracle can be used as a methodology for accurately determining HER2 IHC status in formalin-fixed, paraffin-embedded breast cancer tissue.

  13. Control processes and machine protection on ASDEX Upgrade

    International Nuclear Information System (INIS)

    Raupp, G.; Treutterer, W.; Mertens, V.; Neu, G.; Sips, A.; Zasche, D.; Zehetbauer, Th.

    2007-01-01

    Safe operation of ASDEX Upgrade is guaranteed by a conventional hierarchy of simple and robust hard-wired systems for personnel and machine protection, featuring standardized switch-off procedures. Machine protection and the handling of off-normal events are further enhanced, and peak and lifetime stress minimized, through the plasma control system. Based on a real-time process model supporting safety-critical applications with data quality tagging, process self-monitoring, watchdog monitoring and alarm propagation, processes detect complex and critical failures and reliably perform case-sensitive countermeasures. Intelligent real-time failure handling is done with hardware or software redundancy and performance degradation, or by modification of reference values to continue or terminate discharges with reduced machine stress. Examples implemented so far on ASDEX Upgrade are given, such as recovery from measurement failures, switch-over of redundant actuators, handling of actuator limitations, detection of plasma instabilities, plasma-state-dependent soft landing, and handling of failed switch-off procedures through breakers disconnecting the machine from the grid.

  14. A defect-driven diagnostic method for machine tool spindles.

    Science.gov (United States)

    Vogl, Gregory W; Donmez, M Alkan

    2015-01-01

    Simple vibration-based metrics are, in many cases, insufficient to diagnose machine tool spindle condition. These metrics couple defect-based motion with spindle dynamics; diagnostics should be defect-driven. A new method and spindle condition estimation device (SCED) were developed to acquire data and to separate system dynamics from defect geometry. Based on this method, a spindle condition metric relying only on defect geometry is proposed. Application of the SCED on various milling and turning spindles shows that the new approach is robust for diagnosing the machine tool spindle condition.

  15. Acoustic monitoring of rotating machine by advanced signal processing technology

    International Nuclear Information System (INIS)

    Kanemoto, Shigeru

    2010-01-01

    Acoustic data measured remotely by hand-held microphones are investigated for monitoring and diagnosing the integrity of rotating machines in nuclear power plants. The plant operator's patrol monitoring is one of the important activities for condition monitoring. However, remotely measured sound presents some difficulties for precise diagnosis or quantitative judgment of rotating-machine anomalies, since the measurement sensitivity differs between measurements and is also lower than that of an attached sensor. Hence, in the present study, several advanced signal processing methods are examined and compared in order to find the optimum anomaly monitoring technology from the viewpoints of both sensitivity and robustness of performance. The pre-processed signal feature patterns are reduced to a two-dimensional space for visualization by using standard principal component analysis (PCA) or kernel-based PCA. The normal state is then classified by using a probabilistic neural network (PNN) or support vector data description (SVDD). Using a mockup test facility of a rotating machine, it is shown that an appropriate combination of the above algorithms gives sensitive and robust anomaly monitoring performance. (author)
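The processing chain, dimensionality reduction followed by one-class classification of the normal state, can be sketched as follows. A simple distance threshold stands in for PNN/SVDD, and the feature data are synthetic, so this is a structural illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, (200, 8))   # feature patterns of the healthy state
faulty = rng.normal(0.0, 4.0, (20, 8))    # anomaly: strongly increased variance

# PCA: project onto the top-2 principal directions of the normal data.
mean = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mean, full_matrices=False)
W = Vt[:2].T                               # 8-D features -> 2-D visualization space

def score(x):
    """Distance from the normal-state center in the 2-D PCA space."""
    return np.linalg.norm((x - mean) @ W, axis=-1)

threshold = np.percentile(score(normal), 99)   # accept ~99% of normal patterns
flags = score(faulty) > threshold              # anomaly decisions
```

Replacing the percentile threshold with a PNN or SVDD fitted to the projected normal patterns gives the combinations the abstract actually compares.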

  16. Robustness and prediction accuracy of machine learning for objective visual quality assessment

    OpenAIRE

    HINES, ANDREW

    2014-01-01

    PUBLISHED Lisbon, Portugal Machine Learning (ML) is a powerful tool to support the development of objective visual quality assessment metrics, serving as a substitute model for the perceptual mechanisms acting in visual quality appreciation. Nevertheless, the reliability of ML-based techniques within objective quality assessment metrics is often questioned. In this study, the robustness of ML in supporting objective quality assessment is investigated, specific...

  17. High Frequency Voltage Injection Methods and Observer Design for Initial Position Detection of Permanent Magnet Synchronous Machines

    DEFF Research Database (Denmark)

    Jin, Xinhai; Ni, Ronggang; Chen, Wei

    2018-01-01

    The information of the initial rotor position is essential for smooth start-up and robust control of Permanent Magnet Synchronous Machines (PMSMs). RoTating Voltage Injection (RTVI) methods in the stationary reference frame have been commonly adopted to detect the initial rotor position at standstill...

  18. ORACLS: A system for linear-quadratic-Gaussian control law design

    Science.gov (United States)

    Armstrong, E. S.

    1978-01-01

    A modern control theory design package (ORACLS) for constructing controllers and optimal filters for systems modeled by linear time-invariant differential or difference equations is described. Numerical linear-algebra procedures are used to implement the linear-quadratic-Gaussian (LQG) methodology of modern control theory. Algorithms are included for computing eigensystems of real matrices, the relative stability of a matrix, factored forms for nonnegative definite matrices, the solutions and least squares approximations to the solutions of certain linear matrix algebraic equations, the controllability properties of a linear time-invariant system, and the steady state covariance matrix of an open-loop stable system forced by white noise. Subroutines are provided for solving both the continuous and discrete optimal linear regulator problems with noise free measurements and the sampled-data optimal linear regulator problem. For measurement noise, duality theory and the optimal regulator algorithms are used to solve the continuous and discrete Kalman-Bucy filter problems. Subroutines are also included which give control laws causing the output of a system to track the output of a prescribed model.
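The kind of regulator algebra ORACLS automates can be illustrated by solving a discrete optimal linear regulator problem via fixed-point iteration on the Riccati map. The double-integrator model and weights below are invented for the sketch:

```python
import numpy as np

# Toy sampled-data double integrator: x = [position, velocity], u = force.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)            # state weighting
R = np.array([[1.0]])    # control weighting

# Value iteration on the discrete algebraic Riccati equation.
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal gain
    P = Q + A.T @ P @ (A - B @ K)                        # Riccati update

# The regulator u = -K x should stabilize the plant:
eigs = np.linalg.eigvals(A - B @ K)
```

ORACLS wraps exactly this kind of computation (plus the dual Kalman-Bucy filter problem) behind library subroutines; dedicated solvers would be used in practice instead of naive iteration.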

  19. Securing a robust electrical discharge drilling process by means of flow rate control

    Science.gov (United States)

    Risto, Matthias; Munz, Markus; Haas, Ruediger; Abdolahi, Ali

    2017-10-01

    This paper deals with increasing the process robustness of drilling cemented carbide by electrical discharge machining (EDM). A demand for high repeatability of the resulting diameter is equivalent to a demand for high robustness of the EDM drilling process. Analyses were performed to investigate the process robustness (the standard deviation of the borehole diameter) when drilling cemented carbide. The investigation showed that the dielectric flow rate changes over the drilling process: the flow rate decreased as the tool electrode became shorter, owing to uneven wear of the tool electrode's cross section. Using a controlled flow rate during the drilling process led to a reduced standard deviation of the borehole diameter, and thus to higher process robustness, when drilling cemented carbide.
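The effect of flow-rate control can be mimicked with a toy simulation: a PI controller holds the dielectric flow rate at a setpoint while electrode wear slowly raises the flow resistance. The plant model and gains are invented, not taken from the paper:

```python
def simulate(setpoint=10.0, steps=400, kp=0.4, ki=0.8, dt=0.05):
    """PI control of dielectric flow against slowly rising flow resistance."""
    pump, integ = 0.0, 0.0
    history = []
    for k in range(steps):
        resistance = 1.0 + 0.001 * k      # wear gradually restricts the channel
        flow = pump / resistance          # static plant: flow = pressure/resistance
        err = setpoint - flow
        integ += err * dt                 # integral action removes steady offset
        pump = kp * err + ki * integ      # PI law sets the pump pressure
        history.append(flow)
    return history

flow = simulate()
```

Without the integral term the flow would sag as resistance rises, which is the uncontrolled behaviour the paper links to diameter scatter; with it, the flow tracks the setpoint despite wear.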

  20. Outcomes at 7 years for babies who developed neonatal necrotising enterocolitis: the ORACLE Children Study.

    Science.gov (United States)

    Pike, Katie; Brocklehurst, Peter; Jones, David; Kenyon, Sarah; Salt, Alison; Taylor, David; Marlow, Neil

    2012-09-01

    Within the ORACLE Children Study cohort, the authors evaluated the long-term consequences of a diagnosis of confirmed or suspected neonatal necrotising enterocolitis (NEC) at the age of 7 years. Outcomes were assessed using a parental questionnaire, including the Health Utilities Index (HUI-3) to assess functional impairment, and specific medical and behavioural outcomes. Educational outcomes for children in England were explored using national standardised tests. Multiple logistic regression was used to explore independent associates of NEC within the cohort. The authors obtained data for 119 (77%) of 157 children following proven or suspected NEC and compared their outcomes with those of the remaining 6496 children. NEC was associated with an increased risk of neonatal death (OR 14.6 (95% CI 10.4 to 20.6)). At 7 years, NEC conferred an increased risk of all grades of impairment. Adjusting for confounders, risks persisted for any HUI-3-defined functional impairment (adjusted OR 1.55 (1.05, 2.29)), particularly mild impairment (adjusted OR 1.61 (1.03, 2.53)), both in all NEC children and in those with proven NEC, and these appeared to be independent. No behavioural or educational associations were confirmed. Following NEC, children were more likely to suffer bowel problems than non-NEC children (adjusted OR 3.96 (2.06, 7.61)). The ORACLE Children Study provided the opportunity for the largest evaluation of school-age outcomes following neonatal NEC and demonstrates significant long-term consequences both for gut function (presence of stoma, admission for bowel problems and continuing medical care for gut-related problems) and for motor, sensory and cognitive outcomes as measured using the HUI-3.

  1. QFT Framework for Robust Tuning of Power System Stabilizers

    DEFF Research Database (Denmark)

    Alavi, Seyyed Mohammad Mahdi; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper discusses the use of conventional quantitative feedback design for Power System Stabilisers (PSS). An appropriate control structure that is directly applicable to PSS is described. Two desired performances are also proposed in order to achieve an overall improvement in damping and robustness. The efficiency of the proposed method is demonstrated on a Single Machine Infinite Bus (SMIB) power system with a level of uncertainty.

  2. Classification of large-sized hyperspectral imagery using fast machine learning algorithms

    Science.gov (United States)

    Xia, Junshi; Yokoya, Naoto; Iwasaki, Akira

    2017-07-01

    We present a framework of fast machine learning algorithms for the classification of large-sized hyperspectral images, from a theoretical to a practical viewpoint. In particular, we assess the performance of random forest (RF), rotation forest (RoF), and extreme learning machine (ELM), as well as ensembles of RF and ELM. These classifiers are applied to two large-sized hyperspectral images and compared with support vector machines. To give a quantitative analysis, we focus on comparing these methods when working with high input dimensions and a limited/sufficient training set. Moreover, other important issues such as the computational cost and robustness against noise are also discussed.

  3. Developing robust arsenic awareness prediction models using machine learning algorithms.

    Science.gov (United States)

    Singh, Sushant K; Taylor, Robert W; Rahman, Mohammad Mahmudur; Pradhan, Biswajeet

    2018-04-01

    Arsenic awareness plays a vital role in ensuring the sustainability of arsenic mitigation technologies. Thus far, however, few studies have dealt with the sustainability of such technologies and its associated socioeconomic dimensions. As a result, arsenic awareness prediction has not yet been fully conceptualized. Accordingly, this study evaluated arsenic awareness among arsenic-affected communities in rural India, using a structured questionnaire to record socioeconomic, demographic, and other sociobehavioral factors with an eye to assessing their association with and influence on arsenic awareness. First, a logistic regression model was applied and its results compared with those produced by six state-of-the-art machine-learning algorithms (Support Vector Machine [SVM], Kernel-SVM, Decision Tree [DT], k-Nearest Neighbor [k-NN], Naïve Bayes [NB], and Random Forests [RF]) as measured by their accuracy at predicting arsenic awareness. Most (63%) of the surveyed population was found to be arsenic-aware. Significant arsenic awareness predictors were divided into three types: (1) socioeconomic factors: caste, education level, and occupation; (2) water and sanitation behavior factors: number of family members involved in water collection, distance traveled and time spent for water collection, places for defecation, and materials used for handwashing after defecation; and (3) social capital and trust factors: presence of anganwadi and people's trust in other community members, NGOs, and private agencies. Moreover, individuals with larger social networks contributed positively to arsenic awareness in the communities. Results indicated that both the SVM and the RF algorithms outperformed the others at overall prediction of arsenic awareness, a nonlinear classification problem. Lower-caste, less educated, and unemployed members of the population were found to be the most vulnerable, requiring immediate arsenic mitigation. To this end, local social institutions and NGOs could play a

  4. Robust structural design against self-excited vibrations

    CERN Document Server

    Spelsberg-Korspeter, Gottfried

    2013-01-01

    This book studies methods for the robust design of rotors against self-excited vibrations. The occurrence of self-excited vibrations in engineering applications is often unwanted and in many cases difficult to model. For complex systems such as machines with many components and mechanical contacts, it is important to have design guidelines so that functionality is robust against small imperfections. This book discusses the question of how to design a structure such that unwanted self-excited vibrations do not occur. It shows theoretically and practically that the old design rule of avoiding multiple eigenvalues points in the right direction, and optimizes structures accordingly. This extends results for the well-known flutter problem, in which equations of motion with constant coefficients occur, to the case of linear conservative systems with arbitrary time-periodic perturbations.

  5. Robust photometric stereo using structural light sources

    Science.gov (United States)

    Han, Tian-Qi; Cheng, Yue; Shen, Hui-Liang; Du, Xin

    2014-05-01

    We propose a robust photometric stereo method by using structural arrangement of light sources. In the arrangement, light sources are positioned on a planar grid and form a set of collinear combinations. The shadow pixels are detected by adaptive thresholding. The specular highlight and diffuse pixels are distinguished according to their intensity deviations of the collinear combinations, thanks to the special arrangement of light sources. The highlight detection problem is cast as a pattern classification problem and is solved using support vector machine classifiers. Considering the possible misclassification of highlight pixels, the ℓ1 regularization is further employed in normal map estimation. Experimental results on both synthetic and real-world scenes verify that the proposed method can robustly recover the surface normal maps in the case of heavy specular reflection and outperforms the state-of-the-art techniques.
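    The core of any photometric stereo pipeline, including the one above once shadow and highlight pixels have been discarded, is a per-pixel least-squares solve of the Lambertian model I = albedo · (L n). A minimal sketch (the light directions and surface normal below are synthetic; the paper's structural light arrangement, SVM highlight classifier, and ℓ1 step are omitted):

    ```python
    import numpy as np

    # Lambertian photometric stereo for one pixel: I = albedo * (L @ n)
    L = np.array([[0.0, 0.0, 1.0],     # one light direction per row
                  [0.7, 0.0, 0.7],
                  [0.0, 0.7, 0.7],
                  [-0.7, 0.0, 0.7]])
    n_true = np.array([0.3, 0.2, 0.93])
    n_true /= np.linalg.norm(n_true)
    albedo = 0.8
    I = albedo * np.clip(L @ n_true, 0.0, None)  # observed intensities

    g, *_ = np.linalg.lstsq(L, I, rcond=None)    # g = albedo * normal
    rho = np.linalg.norm(g)                      # recovered albedo
    n_est = g / rho                              # recovered unit normal
    ```

    With more than three unshadowed, non-specular lights the system is overdetermined, which is what makes the robust pixel selection in the paper pay off.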

  6. Handling machine breakdown for dynamic scheduling by a colony of cognitive agents in a holonic manufacturing framework

    Directory of Open Access Journals (Sweden)

    T. K. Jana

    2015-09-01

    There is an ever-increasing need to provide quick yet improved solutions to dynamic scheduling, with better responsiveness and simple coordination mechanisms that adapt to changing environments. In this endeavor, a cognitive-agent-based approach is proposed to deal with machine failure. A Multi Agent based Holonic Adaptive Scheduling (MAHoAS) architecture is developed to frame the schedule by explicit communication between the product holons and the resource holons, in association with the integrated process planning and scheduling (IPPS) holon, under normal conditions. In the event of a resource breakdown, cooperation is sought by implicit communication. Inspired by the cognitive behavior of human beings, a cognitive decision-making scheme is proposed that reallocates the incomplete task to another resource in the most optimized manner and tries to expedite processing in view of the machine failure. A metamorphic algorithm is developed and implemented in Oracle 9i to identify the best candidate resource for task re-allocation. The integrated approach to process planning and scheduling, realized under a Multi Agent System (MAS) framework, facilitates dynamic scheduling with improved performance under such situations. The responsiveness of resources having cognitive capabilities helps to overcome the adverse consequences of resource failure in a better way.

  7. Comparative Study of the Algebraic Programming Languages SQL 2005 and Oracle

    OpenAIRE

    García Díaz, Bertila Liduvina

    2013-01-01

    Comparative study of the algebraic programming languages SQL 2005 and ORACLE. The objective of this research is to carry out a practical comparative study of both algebraic languages and to go deeper into a topic belonging to the Database course under my responsibility. To collect the data for this study, a database was created in each of the languages and tested at the level of the data definition language (DDL), the data manipulation language (DML), and the lang…

  8. Efficient classical simulation of the Deutsch-Jozsa and Simon's algorithms

    Science.gov (United States)

    Johansson, Niklas; Larsson, Jan-Åke

    2017-09-01

    A long-standing aim of quantum information research is to understand what gives quantum computers their advantage. This requires separating problems that need genuinely quantum resources from those for which classical resources are enough. Two examples of quantum speed-up are the Deutsch-Jozsa and Simon's problem, both efficiently solvable on a quantum Turing machine, and both believed to lack efficient classical solutions. Here we present a framework that can simulate both quantum algorithms efficiently, solving the Deutsch-Jozsa problem with probability 1 using only one oracle query, and Simon's problem using linearly many oracle queries, just as expected of an ideal quantum computer. The presented simulation framework is in turn efficiently simulatable in a classical probabilistic Turing machine. This shows that the Deutsch-Jozsa and Simon's problem do not require any genuinely quantum resources, and that the quantum algorithms show no speed-up when compared with their corresponding classical simulation. Finally, this gives insight into what properties are needed in the two algorithms and calls for further study of oracle separation between quantum and classical computation.
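    For context, the Deutsch-Jozsa problem itself is easy to state classically: an oracle f on n bits is promised to be either constant or balanced, a deterministic classical algorithm needs up to 2^(n-1)+1 queries in the worst case, and the quantum algorithm needs one. A small sketch of the classical query bound (the toy oracles here only illustrate the problem; they are not the paper's simulation framework):

    ```python
    from itertools import product

    def classify(oracle, n):
        """Deterministic classical test for the Deutsch-Jozsa promise problem:
        stop as soon as two outputs differ (balanced), or after 2**(n-1) + 1
        identical answers, which proves the function constant."""
        queries, seen = 0, set()
        for x in product([0, 1], repeat=n):
            seen.add(oracle(x))
            queries += 1
            if len(seen) == 2:
                return "balanced", queries
            if queries == 2 ** (n - 1) + 1:
                return "constant", queries
        return "constant", queries

    def constant_f(x):
        return 0

    def balanced_f(x):
        return x[-1]   # outputs 1 on exactly half of all inputs
    ```

    The paper's claim is that this gap is not evidence of quantum resources, since its simulation also reaches the single-query behavior.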

  9. A probabilistic model for robust localization based on a binaural auditory front-end

    NARCIS (Netherlands)

    May, T.; Par, van de S.L.J.D.E.; Kohlrausch, A.G.

    2011-01-01

    Although extensive research has been done in the field of machine-based localization, the degrading effects of reverberation and of the presence of multiple sources on localization performance have remained a major problem. Motivated by the ability of the human auditory system to robustly analyze complex…

  10. A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment

    Science.gov (United States)

    Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong

    Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition. LS-SVM has excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) and the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained while solving high-dimensional, large-scale database problems. A U.S. commercial credit card database is used to test the efficiency of our method, and the results proved satisfactory.
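    What makes LS-SVM cheap is that training reduces to one linear system instead of a quadratic program. A minimal sketch in the function-estimation form (toy data; the paper's two-layer KPCA + linear-programming variant is not reproduced here):

    ```python
    import numpy as np

    def rbf(A, B, gamma_k=1.0):
        """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma_k * d2)

    def lssvm_train(X, y, gamma=10.0):
        """Solve the LS-SVM system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf(X, X) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]               # bias b, coefficients alpha

    def lssvm_predict(Xtr, b, alpha, Xte):
        return np.sign(rbf(Xte, Xtr) @ alpha + b)

    # Toy "applicant" data: two features each, labels +1 (good) / -1 (bad)
    X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([-1.0, -1.0, 1.0, 1.0])
    b, alpha = lssvm_train(X, y)
    ```

    Because every training point gets a nonzero alpha, plain LS-SVM loses SVM's sparseness, which is exactly what the paper's first KPCA layer is meant to restore.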

  11. Simulation-driven machine learning: Bearing fault classification

    Science.gov (United States)

    Sobie, Cameron; Freitas, Carina; Nicolai, Mike

    2018-01-01

    Increasing the accuracy of mechanical fault detection has the potential to improve system safety and economic performance by minimizing scheduled maintenance and the probability of unexpected system failure. Advances in computational performance have enabled the application of machine learning algorithms across numerous applications including condition monitoring and failure detection. Past applications of machine learning to physical failure have relied explicitly on historical data, which limits the feasibility of this approach to in-service components with extended service histories. Furthermore, recorded failure data is often only valid for the specific circumstances and components for which it was collected. This work directly addresses these challenges for roller bearings with race faults by generating training data using information gained from high resolution simulations of roller bearing dynamics, which is used to train machine learning algorithms that are then validated against four experimental datasets. Several different machine learning methodologies are compared starting from well-established statistical feature-based methods to convolutional neural networks, and a novel application of dynamic time warping (DTW) to bearing fault classification is proposed as a robust, parameter free method for race fault detection.
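    DTW itself is a short dynamic program: it aligns two sequences by allowing local stretches and compressions in time and sums the pointwise costs along the best alignment. A generic sketch (not the authors' implementation):

    ```python
    def dtw(a, b):
        """Dynamic-time-warping distance between two 1-D sequences, O(len(a)*len(b))."""
        INF = float("inf")
        n, m = len(a), len(b)
        D = [[INF] * (m + 1) for _ in range(n + 1)]
        D[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i][j] = cost + min(D[i - 1][j],      # stretch a
                                     D[i][j - 1],      # stretch b
                                     D[i - 1][j - 1])  # advance both
        return D[n][m]
    ```

    For fault classification, a recorded vibration snippet would be assigned the label of the nearest reference signature under this distance, which is what makes the method parameter-free.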

  12. Matrix Multiplication Algorithm Selection with Support Vector Machines

    Science.gov (United States)

    2015-05-01

    STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA, and ASPIRE Lab industrial sponsors and affiliates Intel, Google, …, Nokia, NVIDIA, and Oracle. Any opinions, findings, conclusions, or recommendations in this paper are solely those of the authors and do not neces…

  13. Integrated Storage and Management of Vector and Raster Data Based on Oracle Database

    Directory of Open Access Journals (Sweden)

    WU Zheng

    2017-05-01

    At present, there are many problems in the storage and management of multi-source heterogeneous spatial data, such as difficult data transfer, the lack of unified storage, and low efficiency. By combining relational database and spatial data engine technology, this paper proposes an approach for the integrated storage and management of vector and raster data on the basis of Oracle. The approach first establishes an integrated storage model for vector and raster data and optimizes the retrieval mechanism, then designs a framework for seamless data transfer, and finally realizes the unified storage and efficient management of multi-source heterogeneous data. A comparison of experimental results with the leading comparable software ArcSDE proves that the proposed approach has higher data transfer performance and better query retrieval efficiency.

  14. Novel Breast Imaging and Machine Learning: Predicting Breast Lesion Malignancy at Cone-Beam CT Using Machine Learning Techniques.

    Science.gov (United States)

    Uhlig, Johannes; Uhlig, Annemarie; Kunze, Meike; Beissbarth, Tim; Fischer, Uwe; Lotz, Joachim; Wienbeck, Susanne

    2018-05-24

    The purpose of this study is to evaluate the diagnostic performance of machine learning techniques for malignancy prediction at breast cone-beam CT (CBCT) and to compare them to human readers. Five machine learning techniques, including random forests, back propagation neural networks (BPN), extreme learning machines, support vector machines, and K-nearest neighbors, were used to train diagnostic models on a clinical breast CBCT dataset, with internal validation by repeated 10-fold cross-validation. Two independent blinded human readers with profound experience in breast imaging and breast CBCT analyzed the same CBCT dataset. Diagnostic performance was compared using AUC, sensitivity, and specificity. The clinical dataset comprised 35 patients (American College of Radiology density type C and D breasts) with 81 suspicious breast lesions examined with contrast-enhanced breast CBCT. Forty-five lesions were histopathologically proven to be malignant. Among the machine learning techniques, BPNs provided the best diagnostic performance, with an AUC of 0.91, sensitivity of 0.85, and specificity of 0.82. The diagnostic performance of the human readers was an AUC of 0.84, sensitivity of 0.89, and specificity of 0.72 for reader 1, and an AUC of 0.72, sensitivity of 0.71, and specificity of 0.67 for reader 2. AUC was significantly higher for BPN when compared with both reader 1 (p = 0.01) and reader 2 (p …). Machine learning techniques provide a high and robust diagnostic performance in the prediction of malignancy in breast lesions identified at CBCT. BPNs showed the best diagnostic performance, surpassing human readers in terms of AUC and specificity.

  15. Development of Information Systems Based on PHP and MySQL, and on Java and Oracle

    OpenAIRE

    Couto, Francisco M; Santos, Emanuel

    2011-01-01

    This manual aims to support the development of information systems based on the programming languages PHP and Java, using the relational database management systems MySQL and Oracle, respectively. The document introduces the basic concepts through examples which, although tested on the computing infrastructure of the Departamento de Informática at FCUL, can easily be adapted to any other infrastructure that has…

  16. A ORACLE-based system for data collection, storage and analysis of main equipment load factors in NPPs and TPPs

    International Nuclear Information System (INIS)

    Ivanova, L.

    1993-01-01

    This database was developed by the National Electricity Company, Sofia (BG), as an aid to supervision, analysis, and administrative decision making in a variety of operational situations in NPPs and TPPs. As major indicators of equipment condition, the following primary data are stored: steam or electricity production per month; operation hours per month; equipment stand-by outages; planned outages; unplanned permitted maintenance outages; unplanned emergency maintenance outages; and number of outages of the unit per month. These data cover the period from the commissioning of the corresponding equipment to the present, i.e. about 32 years. The data up to 1990 are annual, and for the last three years monthly. Based on these primary data, the following quantities are calculated: average capacity; average load factors; operation time factors, both total and accounting for planned and permitted unplanned outages; and unpermitted outage factors, both total and accounting for planned and permitted outages. All factors are calculated on the user's request for a chosen time period, by summing up the major indicators (production, operation hours, and various outages) for the given period. The system operates on an IBM 4341 under VM/SP and the ORACLE V.5 database. Input is entered directly from the TPPs and NPPs over telex lines from PCs, which also operate as telex machines, into the mainframe of Energokibernetika Ltd. The data are available to all authorised users from local terminals or PCs connected to the computer by synchronous or asynchronous lines. A system for data transmission to remote users over switched telephone lines has also been developed. (R. Ts.)
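    The derived quantities in this record reduce to simple ratios of the stored primary data. A sketch of the two central ones (the unit size and monthly figures below are hypothetical, not from the Bulgarian database):

    ```python
    def load_factor(energy_mwh, capacity_mw, period_hours):
        """Average load factor: actual output over maximum possible output."""
        return energy_mwh / (capacity_mw * period_hours)

    def operation_time_factor(operating_hours, period_hours):
        """Fraction of the period the unit was actually running."""
        return operating_hours / period_hours

    # Hypothetical 440 MW unit over a 720-hour month
    lf = load_factor(energy_mwh=237_600.0, capacity_mw=440.0, period_hours=720.0)
    otf = operation_time_factor(operating_hours=648.0, period_hours=720.0)
    ```

    Summing production, hours, and outages over a user-chosen window before dividing gives the per-period factors the system reports.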

  17. Medical Statistics – Mathematics or Oracle? Farewell Lecture

    Directory of Open Access Journals (Sweden)

    Gaus, Wilhelm

    2005-06-01

    Certainty is rare in medicine. This is a direct consequence of the individuality of each and every human being, and it is the reason why we need medical statistics. However, statistics have their pitfalls, too. Fig. 1 shows the suicide rate peaking in youth, while in Fig. 2 the rate is highest in midlife and in Fig. 3 in old age. Which of these contradictory messages is right? After an introduction to the principles of statistical testing, this lecture examines the probability with which statistical test results are correct. For this purpose, the level of significance and the power of the test are compared with the sensitivity and specificity of a diagnostic procedure. The probability of obtaining correct statistical test results is the same as that for the positive and negative correctness of a diagnostic procedure, and therefore depends on prevalence. The focus then shifts to the problem of multiple statistical testing. The lecture demonstrates that in any data set of reasonable size at least one test result will prove significant, even if the data set is produced by a random number generator. It is therefore extremely important that a hypothesis be generated independently of the data used to test it. These considerations help us understand the gradation of "lame excuses, lies and statistics" and the difference between pure truth and the full truth. Finally, two historical oracles are cited.
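    The lecture's warning about multiple testing has a one-line quantitative core: across k independent tests at level alpha, the chance of at least one spurious significant result is 1 − (1 − alpha)^k. A sketch, with the standard Bonferroni correction alongside:

    ```python
    def familywise_error(alpha, k):
        """P(at least one false positive) across k independent tests at level alpha."""
        return 1.0 - (1.0 - alpha) ** k

    def bonferroni(alpha, k):
        """Per-test level that keeps the familywise error rate at or below alpha."""
        return alpha / k
    ```

    At alpha = 0.05, ten independent tests on pure noise already yield a spurious "significant" finding about 40% of the time, which is the lecture's point about testing hypotheses on the same data that generated them.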

  18. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining

    Directory of Open Access Journals (Sweden)

    Qiaokang Liang

    2016-11-01

    Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems capable of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  19. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.

    Science.gov (United States)

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-11-16

    Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.

  20. Childhood outcomes after prescription of antibiotics to pregnant women with preterm rupture of the membranes: 7-year follow-up of the ORACLE I trial.

    Science.gov (United States)

    Kenyon, S; Pike, K; Jones, D R; Brocklehurst, P; Marlow, N; Salt, A; Taylor, D J

    2008-10-11

    The ORACLE I trial compared the use of erythromycin and/or amoxicillin-clavulanate (co-amoxiclav) with that of placebo for women with preterm rupture of the membranes without overt signs of clinical infection, by use of a factorial randomised design. The aim of the present study--the ORACLE Children Study I--was to determine the long-term effects on children of these interventions. We assessed children at age 7 years born to the 4148 women who had completed the ORACLE I trial and who were eligible for follow-up with a structured parental questionnaire to assess the child's health status. Functional impairment was defined as the presence of any level of functional impairment (severe, moderate, or mild) derived from the mark III Multi-Attribute Health Status classification system. Educational outcomes were assessed with national curriculum test results for children resident in England. Outcome was determined for 3298 (75%) eligible children. There was no difference in the proportion of children with any functional impairment after prescription of erythromycin, with or without co-amoxiclav, compared with those born to mothers who received no erythromycin (594 [38.3%] of 1551 children vs 655 [40.4%] of 1620; odds ratio 0.91, 95% CI 0.79-1.05) or after prescription of co-amoxiclav, with or without erythromycin, compared with those born to mothers who received no co-amoxiclav (645 [40.6%] of 1587 vs 604 [38.1%] of 1584; 1.11, 0.96-1.28). Neither antibiotic had a significant effect on the overall level of behavioural difficulties experienced, on specific medical conditions, or on the proportions of children achieving each level in reading, writing, or mathematics at key stage one. The prescription of antibiotics for women with preterm rupture of the membranes seems to have little effect on the health of children at 7 years of age. UK Medical Research Council.
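    The effect estimates quoted above can be reproduced from the reported counts with a standard odds ratio and Wald confidence interval (a generic textbook calculation; the trial's own analysis may have used a different method):

    ```python
    import math

    def odds_ratio_ci(a, n1, c, n2, z=1.96):
        """Odds ratio for a/n1 events vs c/n2 events, with a Wald 95% CI."""
        b, d = n1 - a, n2 - c
        or_ = (a / b) / (c / d)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Counts reported in the abstract: 594/1551 children with any functional
    # impairment after erythromycin versus 655/1620 without it.
    or_, lo, hi = odds_ratio_ci(594, 1551, 655, 1620)
    ```

    This reproduces the abstract's odds ratio of 0.91 with 95% CI 0.79-1.05, a confidence interval spanning 1 and hence no evidence of an effect.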

  1. Micro-machined resonator oscillator

    Science.gov (United States)

    Koehler, Dale R.; Sniegowski, Jeffry J.; Bivens, Hugh M.; Wessendorf, Kurt O.

    1994-01-01

    A micro-miniature resonator-oscillator is disclosed. Due to the miniaturization of the resonator-oscillator, oscillation frequencies of one MHz and higher are utilized. A thickness-mode quartz resonator is housed in a micro-machined silicon package and operated as a "telemetered sensor beacon", that is, a digital, self-powered, remote, parameter-measuring transmitter in the FM band. The resonator design uses trapped-energy principles and temperature-dependence methodology through crystal orientation control, with operation in the 20-100 MHz range. High-volume batch-processing manufacturing is utilized, with package and resonator assembly at the wafer level. Unique design features include squeeze-film damping for robust vibration and shock performance, capacitive coupling through micro-machined diaphragms allowing resonator excitation at the package exterior, circuit integration, and extremely small (0.1 in. square) dimensions. A family of micro-miniature sensor beacons is also disclosed, with widespread applications as bio-medical sensors, vehicle status monitors, and high-volume animal identification and health sensors. The sensor family allows measurement of temperature, chemicals, acceleration, and pressure. A microphone and clock realization is also available.

  2. "Wrath Will Drip in the Plains of Macedonia" : Expectations of Nero's Return in the Egyptian Sibylline Oracles (Book 5), 2 Thessalonians, and Ancient Historical Writings

    NARCIS (Netherlands)

    van Kooten, G.H.; Hilhorst, A.; van Kooten, G.H.

    2005-01-01

    George H. van Kooten, “‘Wrath Will Drip in the Plains of Macedonia’: Expectations of Nero’s Return in the Egyptian Sibylline Oracles (Book 5), 2 Thessalonians, and Ancient Historical Writings,” in The Wisdom of Egypt: Jewish, Early Christian, and Gnostic Essays in Honour of Gerard P. Luttikhuizen

  3. Parallel Solution of Robust Nonlinear Model Predictive Control Problems in Batch Crystallization

    Directory of Open Access Journals (Sweden)

    Yankai Cao

    2016-06-01

    Representing the uncertainties with a set of scenarios, the optimization problem resulting from a robust nonlinear model predictive control (NMPC) strategy at each sampling instance can be viewed as a large-scale stochastic program. This paper solves these optimization problems using the parallel Schur complement method developed to solve stochastic programs on distributed and shared memory machines. The control strategy is illustrated with a case study of a multidimensional unseeded batch crystallization process. For this application, a robust NMPC based on min–max optimization guarantees satisfaction of all state and input constraints for a set of uncertainty realizations, and also provides better robust performance compared with open-loop optimal control, nominal NMPC, and robust NMPC minimizing the expected performance at each sampling instance. The performance of robust NMPC can be improved by generating optimization scenarios using Bayesian inference. With the efficient parallel solver, the solution time of one optimization problem is reduced from 6.7 min to 0.5 min, allowing for real-time application.
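    The min-max idea at the heart of this robust NMPC can be shown in miniature: for each candidate input, evaluate the cost under every scenario and keep the input whose worst case is smallest. A toy one-step sketch (the gain set, setpoint, and grid search are invented; the paper solves a full nonlinear program instead):

    ```python
    def minmax_control(candidates, scenarios, cost):
        """Choose the input whose worst-case cost across scenarios is smallest."""
        return min(candidates, key=lambda u: max(cost(u, s) for s in scenarios))

    setpoint = 1.0
    scenarios = [0.8, 1.0, 1.2]                       # possible plant gains
    cost = lambda u, g: (setpoint - g * u) ** 2       # one-step tracking error
    u_robust = minmax_control([i / 100 for i in range(201)], scenarios, cost)
    ```

    Here the robust choice balances the two extreme gains; in a real NMPC the scenarios would parameterize one large coupled program whose structure the Schur complement method exploits.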

  4. Routine educational outcome measures in health studies: Key Stage 1 in the ORACLE Children Study follow-up of randomised trial cohorts.

    Science.gov (United States)

    Jones, David R; Pike, Katie; Kenyon, Sara; Pike, Laura; Henderson, Brian; Brocklehurst, Peter; Marlow, Neil; Salt, Alison; Taylor, David J

    2011-01-01

    Statutory educational attainment measures are rarely used as health study outcomes, but Key Stage 1 (KS1) data formed secondary outcomes in the long-term follow-up, to age 7 years, of the ORACLE II trial of antibiotic use in preterm babies. This paper describes the approach, compares different approaches to the analysis of the KS1 data, and compares use of summary KS1 (level) data with use of individual question scores. The cohort comprised 3394 children born to women in the ORACLE Children Study and resident in England at age 7. Educational achievement, measured by national end-of-KS1 data, was analysed using Poisson regression modelling and anchoring of the KS1 data against external standards. KS1 summary level data were obtained for 3239 (95%) eligible children; raw individual question scores were obtained for 1899 (54%). Use of individual question scores where available did not change the conclusion of no evidence of treatment effects based on summary KS1 outcome data. When accessible for medical research purposes, routinely collected educational outcome data may offer the advantages of low cost and standardised definitions. Here, summary scores led to similar conclusions to raw (individual question) scores, and so are an attractive and cost-effective alternative.

  5. Vibration Sensor Monitoring of Nickel-Titanium Alloy Turning for Machinability Evaluation

    Directory of Open Access Journals (Sweden)

    Tiziana Segreto

    2017-12-01

    Nickel-Titanium (Ni-Ti) alloys are very difficult-to-machine materials causing notable manufacturing problems due to their unique mechanical properties, including superelasticity, high ductility, and severe strain-hardening. In this framework, the aim of this paper is to assess the machinability of Ni-Ti alloys with reference to turning processes in order to realize a reliable and robust in-process identification of machinability conditions. An on-line sensor monitoring procedure based on the acquisition of vibration signals was implemented during the experimental turning tests. The detected vibration sensorial data were processed through an advanced signal processing method in the time-frequency domain based on the wavelet packet transform (WPT). The extracted sensorial features were used to construct WPT pattern feature vectors to send as input to suitably configured neural networks (NNs) for cognitive pattern recognition in order to evaluate the correlation between input sensorial information and output machinability conditions.

  6. Vibration Sensor Monitoring of Nickel-Titanium Alloy Turning for Machinability Evaluation.

    Science.gov (United States)

    Segreto, Tiziana; Caggiano, Alessandra; Karam, Sara; Teti, Roberto

    2017-12-12

    Nickel-Titanium (Ni-Ti) alloys are very difficult-to-machine materials causing notable manufacturing problems due to their unique mechanical properties, including superelasticity, high ductility, and severe strain-hardening. In this framework, the aim of this paper is to assess the machinability of Ni-Ti alloys with reference to turning processes in order to realize a reliable and robust in-process identification of machinability conditions. An on-line sensor monitoring procedure based on the acquisition of vibration signals was implemented during the experimental turning tests. The detected vibration sensorial data were processed through an advanced signal processing method in time-frequency domain based on wavelet packet transform (WPT). The extracted sensorial features were used to construct WPT pattern feature vectors to send as input to suitably configured neural networks (NNs) for cognitive pattern recognition in order to evaluate the correlation between input sensorial information and output machinability conditions.
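    The feature-extraction step can be illustrated with a hand-rolled Haar transform: each stage splits the running approximation into an approximation and a detail band, and the energy of each band becomes one feature. This is a simplified sketch of plain wavelet analysis, assuming Haar filters; a full wavelet-packet tree as used in the paper would also split the detail bands:

    ```python
    import math

    def haar_step(signal):
        """One Haar level: orthonormal approximation and detail coefficients."""
        s = math.sqrt(2.0)
        approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
        return approx, detail

    def band_energies(signal, levels=2):
        """Energy per sub-band; Haar is orthonormal, so total energy is preserved."""
        energies, approx = [], list(signal)
        for _ in range(levels):
            approx, detail = haar_step(approx)
            energies.append(sum(d * d for d in detail))
        energies.append(sum(a * a for a in approx))
        return energies

    sig = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
    feats = band_energies(sig)
    ```

    Feature vectors of per-band energies like `feats` are what would then be fed to the configured neural networks.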

  7. Robust facial landmark detection based on initializing multiple poses

    Directory of Open Access Journals (Sweden)

    Xin Chai

    2016-10-01

    For robot systems, robust facial landmark detection is the first and critical step for face-based human identification and facial expression recognition. In recent years, cascaded-regression-based methods have achieved excellent performance in facial landmark detection. Nevertheless, they still have certain weaknesses, such as high sensitivity to initialization. To address this problem, regression based on multiple initializations is established in a unified model; face shapes are then estimated independently from these initializations. With a ranking strategy, the best estimate is selected as the final output. Moreover, a face shape model based on restricted Boltzmann machines is built as a constraint to improve the robustness of ranking. Experiments on three challenging datasets demonstrate the effectiveness of the proposed facial landmark detection method against state-of-the-art methods.

  8. Robust Nonlinear Control with Compensation Operator for a Peltier System

    Directory of Open Access Journals (Sweden)

    Sheng-Jun Wen

    2014-01-01

    Robust nonlinear control with a compensation operator is presented for a Peltier-actuated system, where the compensation operator is designed using a predictive model of heat radiation. For the Peltier system, heat radiation is related to the fourth power of temperature, so it affects the system markedly when the temperature is high and the temperature difference between the system and its environment is large. A new nonlinear model including heat radiation is set up for the system according to thermal conduction laws. To ensure robust stability of the nonlinear system, operator-based robust right coprime factorization design is considered. A compensation operator based on a predictive model is also proposed to cancel the effect of heat radiation, where the predictive model is built using a radial-basis-kernel support vector machine (SVM) method. Finally, simulation results are given to show the effectiveness of the proposed scheme.
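    The fourth-power dependence mentioned above is the Stefan-Boltzmann law, which is why radiation only matters at high temperature and large temperature difference. A sketch of the radiation term such a predictive model must capture (the area and emissivity values are illustrative, not from the paper):

    ```python
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def radiation_power(T, T_env, area, emissivity=0.9):
        """Net power radiated to the environment; grows with the 4th power of T."""
        return emissivity * SIGMA * area * (T ** 4 - T_env ** 4)
    ```

    At T = T_env the term vanishes, while heating a 0.01 m² face from 300 K to 400 K raises its radiated power from zero to roughly 9 W, which is why a conduction-only linear model degrades at high temperature.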

  9. Robust Backlash Estimation for Industrial Drive-Train Systems—Theory and Validation

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios; Blanke, Mogens; Niemann, Hans Henrik

    2018-01-01

    Backlash compensation is used in modern machine-tool controls to ensure high-accuracy positioning. When wear of a machine causes the deadzone width to increase, high-accuracy control may be maintained if the deadzone is accurately estimated. Deadzone estimation is also an important parameter to indica… …state-of-the-art Siemens equipment. The experiments validate the theory and show that expected performance and robustness to parameter uncertainties are both achieved.

  10. Characterization of wood dust emission from hand-held woodworking machines.

    Science.gov (United States)

    Keller, F-X; Chata, F

    2018-01-01

    This article focuses on the prevention of exposure to wood dust when operating electrical hand-held sawing and sanding machines. A laboratory methodology was developed to measure the dust concentration around machines during operation. The main objective was to characterize circular saws and sanders, with the aim of classifying the different power tools tested in terms of dust emission (high dust emitter vs. low dust emitter). A test set-up was developed and is described, and a measurement methodology was determined for each of the two operations studied. The robustness of the experimental results is discussed and shows good consistency. The impact of the air-flow extraction rate was assessed, and the pressure loss of the system was established for each machine. For the circular saws, three of the nine machines tested could be classified in the low-dust-emitter group. The mean concentration values measured range from 0.64 to 0.98 mg/m³ for the low-emitter group and from 2.55 to 4.37 mg/m³ for the high-emitter group. From the concentration measurements, a machine classification is possible (one for sanding machines and one for sawing machines), and a ratio of 1 to 7 is obtained when comparing the results. This classification will be helpful when professionals must choose power tools that perform well in terms of dust emission.

  11. Catching errors with patient-specific pretreatment machine log file analysis.

    Science.gov (United States)

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis, clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The machine multileaf collimator (MLC) log files are called Dynalog files by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files, which are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analysis QA checks had been performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  12. The effect of prepartum antibiotics on the type of neonatal bacteraemia: insights from the MRC ORACLE trials.

    Science.gov (United States)

    Gilbert, R E; Pike, K; Kenyon, S L; Tarnow-Mordi, W; Taylor, D J

    2005-06-01

    We analysed the type of bacteraemia before discharge from Neonatal Intensive Care Units in babies born to women randomised to the MRC ORACLE Trials. There was no evidence for an effect of oral antibiotics given prior to delivery on bacteraemia due to Gram-negative bacteria or enterococci, but Group B streptococcal (GBS) bacteraemia was significantly reduced in babies born to women with preterm prelabour rupture of the membranes (1.58% to 0.55%; relative risk 0.34; 95% CI: 0.17-0.70). There was no detectable effect in women in spontaneous preterm labour with intact membranes, as the risk of GBS bacteraemia in their babies was very small regardless of treatment.
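
The reported relative risk can be checked arithmetically; a minimal sketch (the trial's raw counts are not given here, so only the ratio of the two percentages is reproduced, which rounds slightly differently from the published 0.34):

```python
def relative_risk(risk_treated, risk_control):
    """Point estimate of relative risk from two absolute risks."""
    return risk_treated / risk_control

# Percentages cancel in the ratio, so they can be used directly.
rr = relative_risk(0.55, 1.58)
print(round(rr, 2))  # close to the published 0.34; raw trial counts would match exactly
```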

  13. Chord Recognition Based on Temporal Correlation Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Zhongyang Rao

    2016-05-01

    Full Text Available In this paper, we propose a method called temporal correlation support vector machine (TCSVM) for automatic major-minor chord recognition in audio music. We first use robust principal component analysis to separate the singing voice from the music, to reduce the influence of the singing voice, and consider the temporal correlations of the chord features. Using robust principal component analysis, we expect the low-rank component of the spectrogram matrix to contain the musical accompaniment and the sparse component to contain the vocal signals. Then, we extract a new logarithmic pitch class profile (LPCP) feature called enhanced LPCP from the low-rank part. To exploit the temporal correlation among the LPCP features of chords, we propose an improved support vector machine algorithm called TCSVM. We perform this study using the MIREX’09 (Music Information Retrieval Evaluation eXchange) Audio Chord Estimation dataset. Furthermore, we conduct comprehensive experiments using different pitch class profile feature vectors to examine the performance of TCSVM. The results of our method are comparable to those of the state-of-the-art methods that entered the MIREX in 2013 and 2014 for the MIREX’09 Audio Chord Estimation task dataset.
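
The low-rank/sparse separation described above can be sketched with a generic Principal Component Pursuit solver (inexact augmented Lagrange multiplier method); this is a textbook implementation, not the authors' code, and the parameter choices follow common defaults:

```python
import numpy as np

def rpca_pcp(M, max_iter=500, tol=1e-7):
    """Principal Component Pursuit via the inexact augmented Lagrange
    multiplier method: decompose M into L (low-rank) + S (sparse)."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))               # standard sparsity weight
    norm_M = np.linalg.norm(M)
    sigma1 = np.linalg.norm(M, 2)                # largest singular value
    Y = M / max(sigma1, np.abs(M).max() / lam)   # dual variable init
    mu, mu_bar, rho = 1.25 / sigma1, 1.25 / sigma1 * 1e7, 1.5
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # low-rank update: singular value thresholding
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # sparse update: elementwise soft thresholding
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Z = M - L - S                            # residual
        Y = Y + mu * Z
        mu = min(mu * rho, mu_bar)
        if np.linalg.norm(Z) / norm_M < tol:
            break
    return L, S
```

On a magnitude spectrogram, L would be expected to hold the accompaniment and S the vocal component; the solver itself works for any low-rank-plus-sparse matrix.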

  14. Friction-resilient position control for machine tools—Adaptive and sliding-mode methods compared

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios; Blanke, Mogens; Niemann, Hans Henrik

    2018-01-01

    Robust trajectory tracking and increasing demand for high-accuracy tool positioning have motivated research in advanced control design for machine tools. State-of-the-art industry solutions employ cascades of Proportional (P) and Proportional-Integral (PI) controllers for closed-loop servo contro...

  15. Robust control design for the plasma horizontal position control on J-TEXT Tokamak

    International Nuclear Information System (INIS)

    Yu, W.Z.; Chen, Z.P.; Zhuang, G.; Wang, Z.J.

    2013-01-01

    It is extremely important for a tokamak to control the plasma position during routine discharges. However, the model of the plasma in a tokamak usually contains considerable uncertainty, such as structured uncertainties and unmodeled dynamics. Compared with the traditional PID control approach, robust control theory is more suitable for handling this problem. In this paper, we propose an H∞ robust control scheme to control the horizontal position of the plasma during the flat-top phase of the discharge on the Joint Texas Experimental Tokamak (J-TEXT). First, the model of our plant for plasma horizontal position control is obtained from the position equilibrium equations. Then the H∞ robust control framework is used to synthesize the controller. Based on this, an H∞ controller is designed to minimize the regulation/tracking error. Finally, a comparison study is conducted between the optimized H∞ robust controller and the traditional PID controller in simulations. The simulation results of the H∞ robust controller show a significant improvement in performance with respect to that obtained with the traditional PID controller, which is currently used on our machine.

  16. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes grinding components accuracy and metrology shear stress in cutting cutting temperature and analysis chatter They also address non-traditional machining, such as: electrical discharge machining electrochemical machining laser and electron beam machining A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechani cal engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  17. Representative Vector Machines: A Unified Framework for Classical Classifiers.

    Science.gov (United States)

    Gui, Jie; Liu, Tongliang; Tao, Dacheng; Sun, Zhenan; Tan, Tieniu

    2016-08-01

    Classifier design is a fundamental problem in pattern recognition. A variety of pattern classification methods such as the nearest neighbor (NN) classifier, support vector machine (SVM), and sparse representation-based classification (SRC) have been proposed in the literature. These typical and widely used classifiers were originally developed from different theory or application motivations and they are conventionally treated as independent and specific solutions for pattern classification. This paper proposes a novel pattern classification framework, namely, representative vector machines (or RVMs for short). The basic idea of RVMs is to assign the class label of a test example according to its nearest representative vector. The contributions of RVMs are twofold. On one hand, the proposed RVMs establish a unified framework of classical classifiers because NN, SVM, and SRC can be interpreted as the special cases of RVMs with different definitions of representative vectors. Thus, the underlying relationship among a number of classical classifiers is revealed for better understanding of pattern classification. On the other hand, novel and advanced classifiers are inspired in the framework of RVMs. For example, a robust pattern classification method called discriminant vector machine (DVM) is motivated from RVMs. Given a test example, DVM first finds its k -NNs and then performs classification based on the robust M-estimator and manifold regularization. Extensive experimental evaluations on a variety of visual recognition tasks such as face recognition (Yale and face recognition grand challenge databases), object categorization (Caltech-101 dataset), and action recognition (Action Similarity LAbeliNg) demonstrate the advantages of DVM over other classifiers.
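
The "nearest representative vector" rule at the heart of RVMs can be illustrated in a few lines; the class means used as representative vectors here are one simple choice for illustration, not necessarily the paper's:

```python
import numpy as np

def nearest_representative(x, reps, labels):
    """Assign the label of the nearest representative vector (the RVM core idea).
    With one representative per training point this reduces to 1-NN."""
    d = np.linalg.norm(reps - x, axis=1)
    return labels[np.argmin(d)]

# Two toy classes; class means serve as the representative vectors.
X0 = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3]])
X1 = np.array([[2.0, 2.0], [2.1, 1.9], [1.8, 2.2]])
reps = np.array([X0.mean(axis=0), X1.mean(axis=0)])
labels = np.array([0, 1])
print(nearest_representative(np.array([0.3, 0.2]), reps, labels))  # -> 0
```

Swapping in other definitions of `reps` (support vectors, sparse-coding atoms, robustly weighted neighbourhood means as in DVM) recovers the other special cases the paper describes.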

  18. Analytic robust stability analysis of SVD orbit feedback

    CERN Document Server

    Pfingstner, Jürgen

    2012-01-01

    Orbit feedback controllers are indispensable for the operation of modern particle accelerators. Many such controllers are based on the decoupling of the inputs and outputs of the system to be controlled with the help of the singular value decomposition (SVD controller). It is crucial to verify the stability of SVD controllers, also in the presence of mismatches between the used accelerator model and the real machine (robust stability problem). In this paper, analytical criteria for guaranteed stability margins of SVD orbit feedback systems for three different types of model mismatches are presented: scaling errors of actuators and BPMs (beam position monitors) and additive errors of the orbit response matrix. For the derivation of these criteria, techniques from robust control theory have been used, e.g the small gain theorem. The obtained criteria can be easily applied directly to other SVD orbit feedback systems. As an example, the criteria were applied to the orbit feedback system of the Compact Linear ...
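
The SVD-based decoupling such controllers rely on can be sketched as follows; the response matrix here is random and purely illustrative, and the comment states only a generic small-gain-style idea, not the paper's exact criteria:

```python
import numpy as np

# Hypothetical orbit response matrix R: BPM readings per unit corrector kick.
rng = np.random.default_rng(1)
R = rng.standard_normal((6, 4))          # 6 BPMs, 4 correctors (toy sizes)

# SVD controller: build the pseudo-inverse from the decomposition of R.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

y = rng.standard_normal(6)               # measured orbit deviation
u = -R_pinv @ y                          # corrector settings (unit gain)
residual = y + R @ u                     # orbit left after one correction

# The component of y in the correctors' range is removed entirely; robust
# stability questions arise when the real machine's response differs from R
# (scaling errors of actuators/BPMs, additive response-matrix errors).
print(np.linalg.norm(residual) < np.linalg.norm(y))
```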

  19. Robustness evaluation of cutting tool maintenance planning for soft ground tunneling projects

    Directory of Open Access Journals (Sweden)

    Alena Conrads

    2018-03-01

    Full Text Available Tunnel boring machines require extensive maintenance and inspection effort to provide high availability. The cutting tools of the cutting wheel must be changed promptly upon reaching a critical condition. While one possible maintenance strategy is to change tools only when it is absolutely necessary, tools can also be changed preventively to avoid further damage. Such different maintenance strategies influence the maintenance duration and the overall project performance. However, determining the downtime related to a particular maintenance strategy is still a challenging task. This paper analyses how robustly a maintenance strategy achieves the planned project performance, considering uncertainties in the wear behaviour of the cutting tools. A simulation-based analysis is presented, implementing an empirical wear prediction model. Different maintenance planning strategies are compared by performing a parameter variation study including Monte-Carlo simulations. The maintenance costs are calculated and evaluated with respect to their robustness. Finally, an improved and robust maintenance strategy is determined. Keywords: Mechanized tunneling, Maintenance, Wear of cutting tools, Process simulation, Robustness, Uncertainty modeling
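
A toy version of such a Monte-Carlo strategy comparison might look like this; all wear rates, thresholds and downtime figures are hypothetical and not taken from the paper:

```python
import random

def simulate_project(threshold, n_runs=2000, seed=7):
    """Toy Monte-Carlo comparison of cutting-tool change strategies.
    Wear grows by a random increment per advance step; the wear level at
    which a tool actually fails is itself uncertain. A failure-driven
    change costs more downtime than a planned one. All figures are
    hypothetical."""
    random.seed(seed)
    total_downtime = 0.0
    for _ in range(n_runs):
        wear, fail_at = 0.0, random.uniform(0.85, 1.05)
        while wear < min(threshold, fail_at):   # advance until change trigger
            wear += random.uniform(0.05, 0.15)
        total_downtime += 8.0 if wear >= fail_at else 3.0
    return total_downtime / n_runs

reactive = simulate_project(threshold=10.0)   # change only at failure
preventive = simulate_project(threshold=0.8)  # change earlier, preventively
print(preventive < reactive)
```

Sweeping `threshold` over a grid is the parameter variation study in miniature: the downtime distribution, not just its mean, indicates how robust each strategy is to the wear uncertainty.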

  20. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region......-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods...
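
Of the four approaches, local maxima search is the simplest to sketch; a naive version (not the evaluated implementation) might be:

```python
import numpy as np

def local_maxima(signal, min_height=0.0, half_window=2):
    """Naive local-maxima peak caller: a point is a peak if it is the
    strict maximum of its neighbourhood and exceeds min_height."""
    peaks = []
    for i in range(half_window, len(signal) - half_window):
        window = signal[i - half_window : i + half_window + 1]
        if (signal[i] >= min_height and signal[i] == window.max()
                and np.sum(window == signal[i]) == 1):
            peaks.append(i)
    return peaks

x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x)                              # two lobes above min_height
print(local_maxima(y, min_height=0.5))     # indices of the two crests
```

Real IMS spectra are noisy, which is why the paper compares this baseline against watershed, region-merging and model-based callers.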

  1. Relevance vector machine technique for the inverse scattering problem

    International Nuclear Information System (INIS)

    Wang Fang-Fang; Zhang Ye-Rong

    2012-01-01

    A novel method based on the relevance vector machine (RVM) for the inverse scattering problem is presented in this paper. The nonlinearity and the ill-posedness inherent in this problem are simultaneously considered. The nonlinearity is embodied in the relation between the scattered field and the target property, which can be obtained through the RVM training process. Besides, rather than utilizing regularization, the ill-posed nature of the inversion is naturally accounted for because the RVM can produce a probabilistic output. Simulation results reveal that the proposed RVM-based approach can provide comparable performance in terms of accuracy, convergence, robustness, and generalization, and improved performance in terms of sparsity, in comparison with the support vector machine (SVM) based approach. (general)

  2. Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines

    Science.gov (United States)

    Neftci, Emre O.; Pedroni, Bruno U.; Joshi, Siddharth; Al-Shedivat, Maruan; Cauwenberghs, Gert

    2016-01-01

    Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised learning. Similar to the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks, but where stochasticity is induced by a random mask over the connections. Synaptic stochasticity plays the dual role of an efficient mechanism for sampling, and a regularizer during learning akin to DropConnect. A local synaptic plasticity rule implementing an event-driven form of contrastive divergence enables the learning of generative models in an on-line fashion. S2Ms perform equally well using discrete-timed artificial units (as in Hopfield networks) or continuous-timed leaky integrate-and-fire neurons. The learned representations are remarkably sparse and robust to reductions in bit precision and synapse pruning: removal of more than 75% of the weakest connections followed by cursory re-learning causes a negligible performance loss on benchmark classification tasks. The spiking neuron-based S2Ms outperform existing spike-based unsupervised learners, while potentially offering substantial advantages in terms of power and complexity, and are thus promising models for on-line learning in brain-inspired hardware. PMID:27445650

  3. Machine printed text and handwriting identification in noisy document images.

    Science.gov (United States)

    Zheng, Yefeng; Li, Huiping; Doermann, David

    2004-03-01

    In this paper, we address the problem of the identification of text in noisy document images. We are especially focused on segmenting and discriminating between handwriting and machine printed text because: 1) handwriting in a document often indicates corrections, additions, or other supplemental information that should be treated differently from the main content and 2) the segmentation and recognition techniques required for machine printed and handwritten text are significantly different. A novel aspect of our approach is that we treat noise as a separate class and model noise based on selected features. Trained Fisher classifiers are used to identify machine printed text and handwriting from noise, and we further exploit context to refine the classification. A Markov Random Field (MRF) based approach is used to model the geometrical structure of the printed text, handwriting, and noise to rectify misclassifications. Experimental results show that our approach is robust and can significantly improve page segmentation in noisy document collections.

  4. Development and management of the installation drawings for the LHC machine

    CERN Document Server

    Corso, Jean Pierre

    2005-01-01

    Filling the 27 km of the narrow LEP tunnel with several thousand pieces of equipment is the challenge posed by the LHC to CERN's technicians and engineers. The LEP had initiated this concept with a tool named LEGO; the LHC has gone a step further with DMU (Digital Mock-Up), taking advantage of the significant evolution of computing tools (programming languages and software). Used as an interface between an Oracle® database gathering all the information on each piece of equipment and the corresponding standard 3D models produced with Euclid®, DMU has since broadened its scope and now makes it possible to automatically generate all the installation drawings for the 514 half-cells of the Machine. Future changes to the LHC configuration will lead us to modify or move the installed equipment and to insert new items. These will be entered into the database before being rendered via DMU in the form of ...

  5. Evolution of the architecture of the ATLAS Metadata Interface (AMI)

    Science.gov (United States)

    Odier, J.; Aidel, O.; Albrand, S.; Fulachier, J.; Lambert, F.

    2015-12-01

    The ATLAS Metadata Interface (AMI) is now a mature application. Over the years, the number of users and the number of provided functions has dramatically increased. It is necessary to adapt the hardware infrastructure in a seamless way so that the quality of service remains high. We describe the evolution of AMI since its beginning, from being served by a single MySQL backend database server to the current state, with a cluster of virtual machines at the French Tier-1, an Oracle database at Lyon with complementary replication to the Oracle DB at CERN, and an AMI back-up server.

  6. Machine rates for selected forest harvesting machines

    Science.gov (United States)

    R.W. Brinker; J. Kinard; Robert Rummer; B. Lanford

    2002-01-01

    Very little new literature has been published on the subject of machine rates and machine cost analysis since 1989 when the Alabama Agricultural Experiment Station Circular 296, Machine Rates for Selected Forest Harvesting Machines, was originally published. Many machines discussed in the original publication have undergone substantial changes in various aspects, not...
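
A generic machine-rate calculation in the spirit of that publication combines ownership and operating costs per scheduled machine hour (SMH); the formula structure below follows the standard textbook method and every default figure is hypothetical:

```python
def machine_rate(price, salvage_frac=0.2, life_yr=5, smh_per_yr=2000,
                 interest=0.08, ins_tax=0.04, utilization=0.65,
                 fuel_lube_hr=18.0, repair_frac=0.9, labor_hr=22.0):
    """Hourly machine rate ($/SMH) from ownership, operating and labor
    costs; all default figures are hypothetical illustrations."""
    salvage = price * salvage_frac
    # average yearly investment (AVI) and straight-line depreciation
    avi = (price - salvage) * (life_yr + 1) / (2 * life_yr) + salvage
    depreciation = (price - salvage) / life_yr
    ownership_yr = depreciation + (interest + ins_tax) * avi
    # repairs as a fraction of depreciation; fuel/lube only in productive hours
    operating_hr = fuel_lube_hr * utilization + repair_frac * depreciation / smh_per_yr
    return ownership_yr / smh_per_yr + operating_hr + labor_hr

print(round(machine_rate(250000), 2))   # ownership + operating + labor, $/SMH
```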

  7. CIEQUI: An oracle database for information management in the analytical chemistry unit of CIEMAT

    International Nuclear Information System (INIS)

    Rucandio, M.I.; Roca, M.

    1997-01-01

    An in-house software product named CIEQUI has been developed at CIEMAT, with purpose-written programs, as a laboratory information management system (LIMS). It is built upon a relational database from ORACLE, with the supported languages SQL, PL/SQL, SQL*Plus, and DEC BASIS, and with the tools SQL*Loader, SQL*Forms and SQL*Menu. Its internal organization and functional structure are schematically represented, and the advantages and disadvantages of a tailored management system are described. Although it is difficult to unify the analysis criteria in an R&D organization such as CIEMAT, because of the wide variety of sample types and determinations involved, our system provides remarkable advantages. CIEQUI reflects the complexity of the laboratories it serves. It is a system easily accessible to all that helps us in many tasks concerning the organization and management of the analytical service provided through the different laboratories of the CIEMAT Analytical Chemistry Unit. (Author)

  8. A bittersweet story: the true nature of the laurel of the Oracle of Delphi.

    Science.gov (United States)

    Harissis, Haralampos V

    2014-01-01

    It is known from ancient sources that "laurel," identified with sweet bay, was used at the ancient Greek oracle of Delphi. The Pythia, the priestess who spoke the prophecies, purportedly used laurel as a means to inspire her divine frenzy. However, the clinical symptoms of the Pythia, as described in ancient sources, cannot be attributed to the use of sweet bay, which is harmless. A review of contemporary toxicological literature indicates that it is oleander that causes symptoms similar to those of the Pythia, while a closer examination of ancient literary texts indicates that oleander was often included under the generic term laurel. It is therefore likely that it was oleander, not sweet bay, that the Pythia used before the oracular procedure. This explanation could also shed light on other ancient accounts regarding the alleged spirit and chasm of Delphi, accounts that have been the subject of intense debate and interdisciplinary research for the last hundred years.

  9. Meta-algorithmics patterns for robust, low cost, high quality systems

    CERN Document Server

    Simske, Steven J

    2013-01-01

    The confluence of cloud computing, parallelism and advanced machine intelligence approaches has created a world in which the optimum knowledge system will usually be architected from the combination of two or more knowledge-generating systems. There is a need, then, to provide a reusable, broadly-applicable set of design patterns to empower the intelligent system architect to take advantage of this opportunity. This book explains how to design and build intelligent systems that are optimized for changing system requirements (adaptability), optimized for changing system input (robustness), an

  10. Robust subspace estimation using low-rank optimization theory and applications

    CERN Document Server

    Oreifej, Omar

    2014-01-01

    Various fundamental applications in computer vision and machine learning require finding the basis of a certain subspace. Examples of such applications include face detection, motion estimation, and activity recognition. An increasing interest has been recently placed on this area as a result of significant advances in the mathematics of matrix rank optimization. Interestingly, robust subspace estimation can be posed as a low-rank optimization problem, which can be solved efficiently using techniques such as the method of Augmented Lagrange Multiplier. In this book,?the authors?discuss fundame

  11. Face Recognition in Humans and Machines

    Science.gov (United States)

    O'Toole, Alice; Tistarelli, Massimo

    The study of human face recognition by psychologists and neuroscientists has run parallel to the development of automatic face recognition technologies by computer scientists and engineers. In both cases, there are analogous steps of data acquisition, image processing, and the formation of representations that can support the complex and diverse tasks we accomplish with faces. These processes can be understood and compared in the context of their neural and computational implementations. In this chapter, we present the essential elements of face recognition by humans and machines, taking a perspective that spans psychological, neural, and computational approaches. From the human side, we overview the methods and techniques used in the neurobiology of face recognition, the underlying neural architecture of the system, the role of visual attention, and the nature of the representations that emerge. From the computational side, we discuss face recognition technologies and the strategies they use to overcome challenges to robust operation over viewing parameters. Finally, we conclude the chapter with a look at some recent studies that compare human and machine performance at face recognition.

  12. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  13. Assessing Tolerance-Based Robust Short-Term Load Forecasting in Buildings

    Directory of Open Access Journals (Sweden)

    Juan Prieto

    2013-04-01

    Full Text Available Short-term load forecasting (STLF) in buildings differs from its broader counterpart in that the load to be predicted does not seem to be stationary, seasonal and regular but, on the contrary, may be subject to sudden changes and variations in its consumption behaviour. Classical STLF methods do not react fast enough to these perturbations (i.e., they are not robust), and the literature on building STLF has not yet explored this area. Here, we evaluate a well-known post-processing method (Learning Window Reinitialization) applied to two broadly-used STLF algorithms (Autoregressive Model and Support Vector Machines) in buildings to check their adaptability and robustness. We have tested the proposed method with real-world data and our results state that this methodology is especially suited for buildings with non-regular consumption profiles, as classical STLF methods are enough to model regular-profiled ones.
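
The window-reinitialization idea can be sketched with a simple AR(1) forecaster; this is an illustrative reconstruction, not the evaluated implementation:

```python
import numpy as np

def ar1_forecast(history):
    """One-step AR(1) forecast fitted by least squares on the history window."""
    x, y = history[:-1], history[1:]
    a = np.dot(x, y) / np.dot(x, x)
    return a * history[-1]

def forecast_with_reinit(series, window=48, err_limit=2.0):
    """Rolling forecasts; a large error suggests a regime change, so the
    training window is re-initialized (shrunk) to re-learn quickly."""
    preds, start = [], 0
    for t in range(window, len(series)):
        hist = series[max(start, t - window):t]
        p = ar1_forecast(hist)
        if abs(p - series[t]) > err_limit:   # sudden change detected
            start = t - 4                    # keep only the newest samples
        preds.append(p)
    return np.array(preds)
```

On a load series with a step change, the plain rolling model keeps mispredicting for a whole window length, while the reinitialized one recovers within a few samples.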

  14. Robust tissue classification for reproducible wound assessment in telemedicine environments

    Science.gov (United States)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple hand-held digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting condition, viewpoint, and camera changes, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3%, against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.

  15. Multi-machine power system stabilizers design using chaotic optimization algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Shayeghi, H., E-mail: hshayeghi@gmail.co [Technical Engineering Department, University of Mohaghegh Ardabili, Ardabil (Iran, Islamic Republic of); Shayanfar, H.A. [Center of Excellence for Power System Automation and Operation, Electrical Engineering Department, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of); Jalilzadeh, S.; Safari, A. [Technical Engineering Department, Zanjan University, Zanjan (Iran, Islamic Republic of)

    2010-07-15

    In this paper, a multiobjective design of multi-machine power system stabilizers (PSSs) using a chaotic optimization algorithm (COA) is proposed. Chaotic optimization algorithms, which have the features of easy implementation, short execution time and robust mechanisms for escaping from local optima, are a promising tool for engineering applications. The PSS parameter tuning problem is converted to an optimization problem, which is solved by a chaotic optimization algorithm based on the Lozi map. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed approach generates chaos using Lozi-map chaotic sequences, which increases the convergence rate and the resulting precision. Two different objective functions are proposed in this study for the PSS design problem. The first objective function is eigenvalue-based, comprising the damping factor and the damping ratio of the lightly damped electromechanical modes, while the second is a time-domain-based multi-objective function. The robustness of the proposed COA-based PSSs (COAPSS) is verified on a multi-machine power system under different operating conditions and disturbances. The results of the proposed COAPSS are demonstrated through eigenvalue analysis, nonlinear time-domain simulation and several performance indices. In addition, the potential and superiority of the proposed method over the classical approach and a genetic algorithm are demonstrated.
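
The Lozi map that drives the chaotic sequences is simple to reproduce; the parameter values below are the classic chaotic choice, not necessarily the authors':

```python
def lozi_sequence(n, a=1.7, b=0.5, x0=0.1, y0=0.1):
    """Generate a chaotic sequence from the Lozi map:
        x_{k+1} = 1 - a*|x_k| + y_k,   y_{k+1} = b * x_k
    The (a, b) defaults are the classic chaotic parameter choice."""
    x, y, seq = x0, y0, []
    for _ in range(n):
        x, y = 1.0 - a * abs(x) + y, b * x
        seq.append(x)
    return seq

s = lozi_sequence(1000)
print(min(s), max(s))   # iterates stay bounded on the chaotic attractor
```

In a COA, such a bounded, ergodic sequence replaces the pseudo-random draws of a conventional stochastic search when exploring the PSS parameter space.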

  16. [A new machinability test machine and the machinability of composite resins for core built-up].

    Science.gov (United States)

    Iwasaki, N

    2001-06-01

    A new machinability test machine especially for dental materials was devised. The purpose of this study was to evaluate the effects of grinding conditions on the machinability of core built-up resins using this machine, and to confirm the relationship between machinability and other properties of composite resins. The experimental machinability test machine consisted of a dental air-turbine handpiece, a control weight unit, a driving unit for the stage fixing the test specimen, and so on. Machinability was evaluated as the change in volume after grinding with a diamond point. Five kinds of core built-up resins and human teeth were used in this study. The machinability of these composite resins increased with increasing load during grinding, and decreased with repeated grinding. There was no obvious correlation between machinability and Vickers hardness; however, a negative correlation was observed between machinability and scratch width.

  17. Training Restricted Boltzmann Machines

    DEFF Research Database (Denmark)

    Fischer, Asja

    relies on sampling based approximations of the log-likelihood gradient. I will present an empirical and theoretical analysis of the bias of these approximations and show that the approximation error can lead to a distortion of the learning process. The bias decreases with increasing mixing rate......Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can also be interpreted as stochastic neural networks. Training RBMs is known to be challenging. Computing the likelihood of the model parameters or its gradient is in general computationally intensive. Thus, training...... of the applied sampling procedure and I will introduce a transition operator that leads to faster mixing. Finally, a different parametrisation of RBMs will be discussed that leads to better learning results and more robustness against changes in the data representation....

  18. The reflection of evolving bearing faults in the stator current's extended park vector approach for induction machines

    Science.gov (United States)

    Corne, Bram; Vervisch, Bram; Derammelaere, Stijn; Knockaert, Jos; Desmet, Jan

    2018-07-01

    Stator current analysis has the potential of becoming the most cost-effective condition monitoring technology for electric rotating machinery. Since both electrical and mechanical faults are detected by inexpensive and robust current sensors, measuring current is advantageous over other techniques such as vibration, acoustic or temperature analysis. However, this technology is struggling to break into the market of condition monitoring, as the electrical interpretation of mechanical machine problems is highly complicated. Recently, the authors built a test-rig which facilitates the emulation of several representative mechanical faults on an 11 kW induction machine with high accuracy and reproducibility. Operating this test-rig, the stator current of the induction machine under test can be analyzed while mechanical faults are emulated. Furthermore, while emulating, the fault severity can be manipulated adaptively under controllable environmental conditions. This creates the opportunity to examine the relation between the magnitude of the well-known current fault components and the corresponding fault severity. This paper presents the emulation of evolving bearing faults and their reflection in the Extended Park Vector Approach for the 11 kW induction machine under test. The results confirm the strong relation between the bearing faults and the stator current fault components in both identification and fault severity. Conclusively, stator current analysis increases reliability in its application as a complete, robust, on-line condition monitoring technology.
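
The Park vector underlying the Extended Park Vector Approach is easy to compute from the three phase currents; for a healthy balanced machine its modulus is constant, and the EPVA analyses the spectrum of that modulus for fault-related ripple (the 50 Hz supply and unit amplitudes here are assumed):

```python
import numpy as np

# Three balanced stator phase currents (toy signals, 50 Hz assumed).
t = np.linspace(0, 0.1, 5000)
w = 2 * np.pi * 50
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)

# Park (Clarke) transformation to the d-q plane.
i_d = np.sqrt(2 / 3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
i_q = (ib - ic) / np.sqrt(2)

# EPVA inspects the spectrum of this modulus; a healthy balanced machine
# gives a circle in the d-q plane, i.e. a constant modulus with no ripple.
modulus = np.hypot(i_d, i_q)
print(modulus.std())   # ~0 for the balanced case
```

A bearing fault modulates the currents, so spectral lines appear in `modulus` at the characteristic fault frequencies, which is what the approach tracks as the fault evolves.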

  19. Multiple-Swarm Ensembles: Improving the Predictive Power and Robustness of Predictive Models and Its Use in Computational Biology.

    Science.gov (United States)

    Alves, Pedro; Liu, Shuang; Wang, Daifeng; Gerstein, Mark

    2018-01-01

    Machine learning is an integral part of computational biology, and has already shown its use in various applications, such as prognostic tests. In the last few years in the non-biological machine learning community, ensembling techniques have shown their power in data mining competitions such as the Netflix challenge; however, such methods have not found wide use in computational biology. In this work, we endeavor to show how ensembling techniques can be applied to practical problems, including problems in the field of bioinformatics, and how they often outperform other machine learning techniques in both predictive power and robustness. Furthermore, we develop a methodology of ensembling, Multi-Swarm Ensemble (MSWE) by using multiple particle swarm optimizations and demonstrate its ability to further enhance the performance of ensembles.

  20. An intelligent man-machine system for future nuclear power plants

    International Nuclear Information System (INIS)

    Takizawa, Yoji; Hattori, Yoshiaki; Itoh, Juichiro; Fukumoto, Akira

    1994-01-01

    The objective of the development of an intelligent man-machine system for future nuclear power plants is enhancement of operational reliability by applying recent advances in cognitive science, artificial intelligence, and computer technologies. To realize this objective, the intelligent man-machine system, aiming to support the knowledge-based decision-making process in an operator's supervisory plant control tasks, consists of three main functions, i.e., a cognitive model-based advisor, a robust automatic sequence controller, and an ecological interface. These three functions have been integrated into a console-type nuclear power plant monitoring and control system as a validation test bed. The validation tests, in which experienced operator crews participated, were carried out in 1991 and 1992. The test results show the usefulness of the support functions and the validity of the system design approach.

  1. A machine vision system for the calibration of digital thermometers

    International Nuclear Information System (INIS)

    Vázquez-Fernández, Esteban; Dacal-Nieto, Angel; González-Jorge, Higinio; Alvarez-Valado, Victor; Martín, Fernando; Formella, Arno

    2009-01-01

    Automation is a key point in many industrial tasks such as calibration and metrology. In this context, machine vision has been shown to be a useful tool for automation support, especially when there is no other option available. A system for the calibration of portable measurement devices has been developed. The system uses machine vision to obtain the numerical values shown by displays. A new approach based on human perception of digits, which works in parallel with other more classical classifiers, has been created. The results show the benefits of the system in terms of its usability and robustness, obtaining a success rate higher than 99% in display recognition. The system saves time and effort, and offers the possibility of scheduling calibration tasks without excessive attention by the laboratory technicians.

  2. Capacitive Sensing for Contact-less Proximity Detection in Industrial Marble Machines

    Directory of Open Access Journals (Sweden)

    Sergio Saponara

    2010-02-01

    The paper presents the design and experimental characterization of capacitive sensors, plus the relevant front-end acquisition circuitry, for process control in industrial marble machines. The newly developed sensing system allows detecting, in real time and without any contact, the presence of stone samples under the abrasive/cutting heads in an industrial machine. The obtained detection signal is needed as feedback to improve the automatic control of the polishing/cutting process in the marble industry. Different types of sensors are proposed, whose performances are assessed through experimental test campaigns considering real industrial working conditions. Compared to state-of-the-art sensors, the proposed solutions allow for reliable detection while being of low complexity and robust to harsh environmental conditions.

  3. Sensorless AC electric motor control robust advanced design techniques and applications

    CERN Document Server

    Glumineau, Alain

    2015-01-01

    This monograph shows the reader how to avoid the burdens of sensor cost, reduced internal physical space, and system complexity in the control of AC motors. Many applications fields—electric vehicles, wind- and wave-energy converters and robotics, among them—will benefit. Sensorless AC Electric Motor Control describes the elimination of physical sensors and their replacement with observers, i.e., software sensors. Robustness is introduced to overcome problems associated with the unavoidable imperfection of knowledge of machine parameters—resistance, inertia, and so on—encountered in real systems. The details of a large number of speed- and/or position-sensorless ideas for different types of permanent-magnet synchronous motors and induction motors are presented along with several novel observer designs for electrical machines. Control strategies are developed using high-order, sliding-mode and quasi-continuous-sliding-mode techniques and two types of observer–controller schemes based on backstepping ...

  4. The Machine within the Machine

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Although Virtual Machines are widespread across CERN, you probably won't have heard of them unless you work for an experiment. Virtual machines - known as VMs - allow you to create a separate machine within your own, allowing you to run Linux on your Mac, or Windows on your Linux - whatever combination you need.   Using a CERN Virtual Machine, Linux analysis software runs on a MacBook. When it comes to LHC data, one of the primary issues collaborations face is the diversity of computing environments among collaborators spread across the world. What if an institute cannot run the analysis software because they use different operating systems? "That's where the CernVM project comes in," says Gerardo Ganis, PH-SFT staff member and leader of the CernVM project. "We were able to respond to experimentalists' concerns by providing a virtual machine package that could be used to run experiment software. This way, no matter what hardware they have ...

  5. Damping of Low Frequency Oscillation in Power System using Robust Control of Superconductor Flywheel Energy Storage System

    International Nuclear Information System (INIS)

    Lee, Jung Pil; Kim, Han Gun

    2012-01-01

    In this paper, a robust superconductor flywheel energy storage system (SFESS) controller using H∞ control theory was designed to damp low frequency oscillation of a power system. The main advantage of the controller is that uncertainties of the power system can be included at the stage of controller design. Both disturbance attenuation and robust stability for the power system were treated simultaneously by solving a mixed sensitivity problem. The robust stability and the performance for uncertainties of the power system were represented by frequency-weighted transfer functions. To verify the control performance of the proposed SFESS controller using H∞ control, the closed-loop eigenvalues and the damping ratio in the dominant oscillation mode of the power system were analyzed, and nonlinear simulation of a one-machine infinite-bus system was performed under disturbance for various operating conditions. The results showed that the proposed SFESS controller was more robust than a conventional power system stabilizer (PSS).

  6. Materials and optimized designs for human-machine interfaces via epidermal electronics.

    Science.gov (United States)

    Jeong, Jae-Woong; Yeo, Woon-Hong; Akhtar, Aadeel; Norton, James J S; Kwack, Young-Jin; Li, Shuo; Jung, Sung-Young; Su, Yewang; Lee, Woosik; Xia, Jing; Cheng, Huanyu; Huang, Yonggang; Choi, Woon-Seop; Bretl, Timothy; Rogers, John A

    2013-12-17

    Thin, soft, and elastic electronics with physical properties well matched to the epidermis can be conformally and robustly integrated with the skin. Materials and optimized designs for such devices are presented for surface electromyography (sEMG). The findings enable sEMG from wide ranging areas of the body. The measurements have quality sufficient for advanced forms of human-machine interface. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Evolution of the Architecture of the ATLAS Metadata Interface (AMI)

    CERN Document Server

    Odier, Jerome; The ATLAS collaboration; Fulachier, Jerome; Lambert, Fabian

    2015-01-01

    The ATLAS Metadata Interface (AMI) is now a mature application. Over the years, the number of users and the number of provided functions has dramatically increased. It is necessary to adapt the hardware infrastructure in a seamless way so that the quality of service remains high. We describe the evolution from the beginning of the application's life, using one server with a MySQL backend database, to the current state, in which a cluster of virtual machines on the French Tier 1 cloud at Lyon, an Oracle database also at Lyon with replication to Oracle at CERN, and a back-up server are used.

  8. Min st-cut oracle for planar graphs with near-linear preprocessing time

    DEFF Research Database (Denmark)

    Borradaile, Glencora; Sankowski, Piotr; Wulff-Nilsen, Christian

    2010-01-01

    For an undirected n-vertex planar graph G with non-negative edge-weights, we consider the following type of query: given two vertices s and t in G, what is the weight of a min st-cut in G? We show how to answer such queries in constant time with O(n log^5 n) preprocessing time and O(n log n) space. We use a Gomory-Hu tree to represent all the pairwise min st-cuts implicitly. Previously, no subquadratic time algorithm was known for this problem. Our oracle can be extended to report the min st-cuts in time proportional to their size. Since all-pairs min st-cut and the minimum cycle basis are dual problems in planar graphs, we also obtain an implicit representation of a minimum cycle basis in O(n log^5 n) time and O(n log n) space and an explicit representation with additional O(C) time and space where C is the size of the basis. To obtain our results, we require that shortest paths be unique...
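
    The Gomory-Hu tree mentioned in the abstract represents all pairwise min st-cuts implicitly: the weight of a min st-cut in the original graph equals the minimum edge weight on the unique s-t path in the tree. A naive O(n)-per-query version of that lookup (the paper's contribution is answering it in constant time after near-linear preprocessing) might look like the sketch below; the function name and edge-list format are invented for illustration.

```python
from collections import defaultdict, deque

def min_st_cut_query(tree_edges, s, t):
    """Answer a min st-cut query using a Gomory-Hu tree.
    tree_edges: list of (u, v, weight) edges of the tree. The min st-cut
    weight equals the lightest edge on the unique s-t tree path."""
    adj = defaultdict(list)
    for u, v, w in tree_edges:
        adj[u].append((v, w))
        adj[v].append((u, w))
    # BFS from s, recording the lightest edge seen on the path to each vertex.
    lightest = {s: float("inf")}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v, w in adj[u]:
            if v not in lightest:
                lightest[v] = min(lightest[u], w)
                queue.append(v)
    return lightest[t]
```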

  9. Automatic microseismic event picking via unsupervised machine learning

    Science.gov (United States)

    Chen, Yangkang

    2018-01-01

    Effective and efficient arrival picking plays an important role in microseismic and earthquake data processing and imaging. Widely used short-term-average/long-term-average ratio (STA/LTA) based arrival picking algorithms suffer from sensitivity to moderate-to-strong random ambient noise. To make the state-of-the-art arrival picking approaches effective, microseismic data need first to be pre-processed, for example, by removing a sufficient amount of noise, and second to be analysed by arrival pickers. To conquer the noise issue in arrival picking for weak microseismic or earthquake events, I leverage machine learning techniques to help recognize seismic waveforms in microseismic or earthquake data. Because supervised machine learning algorithms depend on large volumes of well-designed training data, I utilize an unsupervised machine learning algorithm to cluster the time samples into two groups, that is, waveform points and non-waveform points. The fuzzy clustering algorithm has been demonstrated to be effective for this purpose. A group of synthetic, real microseismic and earthquake data sets with different levels of complexity show that the proposed method is much more robust than the state-of-the-art STA/LTA method in picking microseismic events, even in the case of moderately strong background noise.
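
    The STA/LTA baseline the method is compared against computes, at each sample, the ratio of a short-term average to a long-term average of signal energy; a pick is declared when the ratio crosses a threshold. A minimal sketch (window lengths and threshold are illustrative, not taken from the paper):

```python
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=5.0):
    """Classic STA/LTA picking ratio on a 1-D trace.
    sta_win and lta_win are window lengths in seconds; a pick is usually
    declared where the ratio first crosses a threshold (e.g. 3 to 5)."""
    nsta = int(sta_win * fs)
    nlta = int(lta_win * fs)
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.cumsum(energy)
    ratio = np.zeros(len(energy))
    for i in range(nlta, len(energy)):
        sta = (csum[i] - csum[i - nsta]) / nsta   # short-term average
        lta = (csum[i] - csum[i - nlta]) / nlta   # long-term average
        ratio[i] = sta / lta if lta > 0 else 0.0
    return ratio
```

    On weak events the energy contrast driving this ratio shrinks, which is exactly the sensitivity to noise the abstract criticises.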

  10. Effect of Machining Velocity in Nanoscale Machining Operations

    International Nuclear Information System (INIS)

    Islam, Sumaiya; Khondoker, Noman; Ibrahim, Raafat

    2015-01-01

    The aim of this study is to investigate the generated forces and deformations of single crystal Cu with (100), (110) and (111) crystallographic orientations in nanoscale machining operations. A nanoindenter equipped with a nanoscratching attachment was used for machining operations and in-situ observation of a nanoscale groove. As a machining parameter, the machining velocity was varied to measure the normal and cutting forces. At a fixed machining velocity, different levels of normal and cutting forces were generated due to the different crystallographic orientations of the specimens. Moreover, after the machining operation the percentage of elastic recovery was measured, and it was found that both elastic and plastic deformations were responsible for producing a nanoscale groove within the range of machining velocities from 250 to 1000 nm/s. (paper)

  11. Earth Science Project Office (ESPO) Field Experiences During ORACLES, ATom, KORUS and POSIDON

    Science.gov (United States)

    Salazar, Vidal; Zavaleta, Jhony

    2017-01-01

    Very often, scientific field campaigns entail years of planning and incur substantial cost, especially if they involve the operation of large research aircraft in remote locations. Deploying and operating these aircraft even for short periods of time poses challenges that, if not addressed properly, can have significant negative consequences and potentially jeopardize the success of a scientific campaign. Challenges vary from country to country and range from safety, health, and security risks to differences in cultural and social norms. Our presentation will focus on sharing experiences from the field campaigns conducted by ESPO in 2016: ORACLES, ATom, KORUS and POSIDON. We will focus on the best practices, lessons learned, and the international relations and coordination aspects of the country-specific experiences. This presentation will be part of the 2nd International Conference on Airborne Research for the Environment (ICARE 2017), which will focus on "Developing the infrastructure to meet future scientific challenges". This unique conference and gathering of facility support experts will not only allow for dissemination and sharing of knowledge but also promote collaboration and networking among groups that support scientific research using airborne platforms around the globe.

  12. Estimation of the Dynamic States of Synchronous Machines Using an Extended Particle Filter

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Ning; Meng, Da; Lu, Shuai

    2013-11-11

    In this paper, an extended particle filter (PF) is proposed to estimate the dynamic states of a synchronous machine using phasor measurement unit (PMU) data. A PF propagates the mean and covariance of states via Monte Carlo simulation, is easy to implement, and can be directly applied to a non-linear system with non-Gaussian noise. The extended PF modifies a basic PF to improve robustness. Using Monte Carlo simulations with practical noise and model uncertainty considerations, the extended PF’s performance is evaluated and compared with the basic PF and an extended Kalman filter (EKF). The extended PF results showed high accuracy and robustness against measurement and model noise.
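
    The basic (bootstrap) particle filter that the extended PF builds on can be sketched as below: propagate particles through the dynamics, weight them by the measurement likelihood, resample. This is a generic scalar illustration, not the synchronous-machine model from the paper, and the robustness modifications of the extended PF are omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_pf(y, f, h, q_std, r_std, n_particles=500, x0_std=1.0):
    """Bootstrap particle filter for a scalar state: propagate particles
    through the (possibly nonlinear) dynamics f, weight each particle by the
    Gaussian likelihood of the measurement through h, then resample.
    This is the basic PF that the extended PF modifies for robustness."""
    particles = rng.normal(0.0, x0_std, n_particles)
    estimates = []
    for yk in y:
        # Propagate through the dynamics plus process noise.
        particles = f(particles) + rng.normal(0.0, q_std, n_particles)
        # Importance weights from the measurement likelihood.
        w = np.exp(-0.5 * ((yk - h(particles)) / r_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
        # Multinomial resampling.
        particles = particles[rng.choice(n_particles, n_particles, p=w)]
    return np.array(estimates)
```

    Because the particles are propagated through f and h directly, no linearisation is needed, which is the advantage over the EKF noted in the abstract.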

  13. Developing an efficient decision support system for non-traditional machine selection: an application of MOORA and MOOSRA

    Directory of Open Access Journals (Sweden)

    Asis Sarkar

    2015-01-01

    The purpose of this paper is to find an efficient decision support method for non-traditional machine selection. It analyzes potential non-traditional machine selection attributes with a relatively new MCDM approach based on the MOORA and MOOSRA methods. The MOORA and MOOSRA methods are adopted to tackle the subjective evaluation of information collected from an expert group. An example case study is shown here for better understanding of the selection module, which can be effectively applied to any other decision-making scenario. The method is not only computationally very simple, easily comprehensible, and robust, but can also accommodate numerous subjective attributes. The rankings are expected to provide good guidance to the managers of an organization in selecting a feasible non-traditional machine. It shall also provide good insight for non-traditional machine manufacturers, who might encourage research work concerning non-traditional machine selection.
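
    The MOORA ratio system underlying such a selection module can be sketched as: vector-normalise each criterion column, then score each alternative as the sum of its beneficial criteria minus the sum of its cost criteria. A minimal illustration; the attribute data in the test are invented, not from the case study.

```python
import numpy as np

def moora_rank(matrix, beneficial):
    """MOORA ratio system: vector-normalise each criterion column, then
    score each alternative as the sum of its beneficial (to-maximise)
    criteria minus the sum of its cost (to-minimise) criteria.
    Returns the scores and the alternative indices ranked best-to-worst."""
    x = np.asarray(matrix, dtype=float)
    norm = x / np.sqrt((x ** 2).sum(axis=0))     # column-wise vector norm
    sign = np.where(beneficial, 1.0, -1.0)
    scores = (norm * sign).sum(axis=1)
    return scores, np.argsort(-scores)
```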

  14. Simultaneous feature selection and classification via Minimax Probability Machine

    Directory of Open Access Journals (Sweden)

    Liming Yang

    2010-12-01

    This paper presents a novel method for simultaneous feature selection and classification by incorporating a robust L1-norm into the objective function of the Minimax Probability Machine (MPM). A fractional programming framework is derived by using a bound on the misclassification error involving the mean and covariance of the data. Furthermore, the problems are solved by the Quadratic Interpolation method. Experiments show that our methods can select fewer features to improve the generalization compared to MPM, which illustrates the effectiveness of the proposed algorithms.

  15. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, much smaller than the original signal, to the cloud, so that the overall system spectrum resource allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, it further strengthens system security by reducing the communication burden on the network.

  16. Design of Parameter Independent, High Performance Sensorless Controllers for Permanent Magnet Synchronous Machines

    DEFF Research Database (Denmark)

    Xie, Ge

    The Permanent Magnet Synchronous Machine (PMSM) has become an attractive candidate for various industrial applications due to its high efficiency and torque density. In the PMSM drive system, simple and robust control methods play an important role in achieving satisfactory drive performance... For reducing the cost and increasing the reliability of the drive system, eliminating the mechanical sensor brings many advantages to the PMSM drive system. Therefore, sensorless control was developed and has been increasingly used in different PMSM drive systems in the last 20 years. However, machine... The transient fluctuation of the estimated rotor position error is around 20 degrees with a step load torque change from 0% to 100% of the rated torque. The position error in steady state is within ±2 electrical degrees in the best case. The proposed method may also be used for, e.g., online machine parameter...

  17. Passivity-Based Control of Electric Machines

    Energy Technology Data Exchange (ETDEWEB)

    Nicklasson, P.J.

    1996-12-31

    This doctoral thesis presents new results on the design and analysis of controllers for a class of electric machines. Nonlinear controllers are derived from a Lagrangian model representation using passivity techniques, and previous results on induction motors are improved and extended to Blondel-Park transformable machines. The relation to conventional techniques is discussed, and it is shown that the formalism introduced in this work facilitates analysis of conventional methods, so that open questions concerning these methods may be resolved. In addition, the thesis contains the following improvements of previously published results on the control of induction motors: (1) Improvement of a passivity-based speed/position controller, (2) Extension of passivity-based (observer-less and observer-based) controllers from regulation to tracking of rotor flux norm, (3) An extension of the classical indirect FOC (Field-Oriented Control) scheme to also include global rotor flux norm tracking, instead of only torque tracking and rotor flux norm regulation. The design is illustrated experimentally by applying the proposed control schemes to a squirrel-cage induction motor. The results show that the proposed methods have advantages over previous designs with respect to controller tuning, performance and robustness. 145 refs., 21 figs.

  18. Oracle BPM Suite 11g Developer's cookbook

    CERN Document Server

    Acharya, Vivek

    2012-01-01

    This book is written in a simple, easy-to-understand format with lots of screenshots and step-by-step explanations. If you are a BPM developer looking to develop robust BPM solutions without impediments, then this is the best guide for you. This book assumes that you have a fundamental knowledge of BPM.

  19. Non-negative Tensor Factorization for Robust Exploratory Big-Data Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Alexandrov, Boian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vesselinov, Velimir Valentinov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Djidjev, Hristo Nikolov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-17

    Currently, large multidimensional datasets are being accumulated in almost every field. Data are: (1) collected by distributed sensor networks in real-time all over the globe, (2) produced by large-scale experimental measurements or engineering activities, (3) generated by high-performance simulations, and (4) gathered by electronic communications and social-network activities, etc. Simultaneous analysis of these ultra-large heterogeneous multidimensional datasets is often critical for scientific discoveries, decision-making, emergency response, and national and global security. The importance of such analyses mandates the development of the next generation of robust machine learning (ML) methods and tools for big-data exploratory analysis.

  20. Pileup Mitigation with Machine Learning (PUMML)

    Science.gov (United States)

    Komiske, Patrick T.; Metodiev, Eric M.; Nachman, Benjamin; Schwartz, Matthew D.

    2017-12-01

    Pileup involves the contamination of the energy distribution arising from the primary collision of interest (leading vertex) by radiation from soft collisions (pileup). We develop a new technique for removing this contamination using machine learning and convolutional neural networks. The network takes as input the energy distribution of charged leading vertex particles, charged pileup particles, and all neutral particles, and outputs the energy distribution of particles coming from the leading vertex alone. The PUMML algorithm performs remarkably well at eliminating pileup distortion on a wide range of simple and complex jet observables. We test the robustness of the algorithm in a number of ways and discuss how the network can be trained directly on data.

  1. Prediction of Machine Tool Condition Using Support Vector Machine

    International Nuclear Information System (INIS)

    Wang Peigong; Meng Qingfeng; Zhao Jian; Li Junjie; Wang Xiufeng

    2011-01-01

    Condition monitoring and prediction for CNC machine tools are investigated in this paper. Considering that condition data samples for CNC machine tools are often few in number, a condition prediction method for CNC machine tools based on support vector machines (SVMs) is proposed, and one-step and multi-step condition prediction models are constructed. The support vector machine prediction models are used to predict the trends in working condition of a certain type of CNC worm wheel and gear grinding machine by applying sequence data of the vibration signal collected during machine processing. The relationship between different eigenvalues of the CNC vibration signal and machining quality is also discussed. The test results show that the trend of the vibration signal peak-to-peak value in the surface normal direction is most relevant to the trend of the surface roughness value. In trend prediction of working condition, the support vector machine has higher prediction accuracy in both short-term ('one-step') and long-term (multi-step) prediction compared to the autoregressive (AR) model and the RBF neural network. Experimental results show that it is feasible to apply support vector machines to CNC machine tool condition prediction.

  2. Asynchronous machine rotor speed estimation using a tabulated numerical approach

    Science.gov (United States)

    Nguyen, Huu Phuc; De Miras, Jérôme; Charara, Ali; Eltabach, Mario; Bonnet, Stéphane

    2017-12-01

    This paper proposes a new method to estimate the rotor speed of the asynchronous machine by looking at the estimation problem as a nonlinear optimal control problem. The behavior of the nonlinear plant model is approximated off-line as a prediction map using a numerical one-step time discretization obtained from simulations. At each time-step, the speed of the induction machine is selected satisfying the dynamic fitting problem between the plant output and the predicted output, leading the system to adopt its dynamical behavior. Thanks to the limitation of the prediction horizon to a single time-step, the execution time of the algorithm can be completely bounded. It can thus easily be implemented and embedded into a real-time system to observe the speed of the real induction motor. Simulation results show the performance and robustness of the proposed estimator.

  3. Environmentally Friendly Machining

    CERN Document Server

    Dixit, U S; Davim, J Paulo

    2012-01-01

    Environment-Friendly Machining provides an in-depth overview of environmentally-friendly machining processes, covering numerous different types of machining in order to identify which practice is the most environmentally sustainable. The book discusses three systems at length: machining with minimal cutting fluid, air-cooled machining and dry machining. Also covered is a way to conserve energy during machining processes, along with useful data and detailed descriptions for developing and utilizing the most efficient modern machining tools. Researchers and engineers looking for sustainable machining solutions will find Environment-Friendly Machining to be a useful volume.

  4. Childhood outcomes after prescription of antibiotics to pregnant women with spontaneous preterm labour: 7-year follow-up of the ORACLE II trial.

    Science.gov (United States)

    Kenyon, S; Pike, K; Jones, D R; Brocklehurst, P; Marlow, N; Salt, A; Taylor, D J

    2008-10-11

    The ORACLE II trial compared the use of erythromycin and/or amoxicillin-clavulanate (co-amoxiclav) with that of placebo for women in spontaneous preterm labour and intact membranes, without overt signs of clinical infection, by use of a factorial randomised design. The aim of the present study--the ORACLE Children Study II--was to determine the long-term effects on children after exposure to antibiotics in this clinical situation. We assessed children at age 7 years born to the 4221 women who had completed the ORACLE II study and who were eligible for follow-up with a structured parental questionnaire to assess the child's health status. Functional impairment was defined as the presence of any level of functional impairment (severe, moderate, or mild) derived from the mark III Multi-Attribute Health Status classification system. Educational outcomes were assessed with national curriculum test results for children resident in England. Outcome was determined for 3196 (71%) eligible children. Overall, a greater proportion of children whose mothers had been prescribed erythromycin, with or without co-amoxiclav, had any functional impairment than did those whose mothers had received no erythromycin (658 [42.3%] of 1554 children vs 574 [38.3%] of 1498; odds ratio 1.18, 95% CI 1.02-1.37). Co-amoxiclav (with or without erythromycin) had no effect on the proportion of children with any functional impairment, compared with receipt of no co-amoxiclav (624 [40.7%] of 1523 vs 608 [40.0%] of 1520; 1.03, 0.89-1.19). No effects were seen with either antibiotic on the number of deaths, other medical conditions, behavioural patterns, or educational attainment. However, more children whose mothers had received erythromycin or co-amoxiclav developed cerebral palsy than did those born to mothers who received no erythromycin or no co-amoxiclav, respectively (erythromycin: 53 [3.3%] of 1611 vs 27 [1.7%] of 1562, 1.93, 1.21-3.09; co-amoxiclav: 50 [3.2%] of 1587 vs 30 [1.9%] of 1586, 1

  5. PERANCANGAN PROTOTYPE APLIKASI KNOWLEDGE MANAGEMENT PADA DIVISI MANAGEMENT AUTOMATION INFORMATION UNTUK MENDUKUNG ORACLE FINANCIAL PADA ORANG TUA GROUP

    Directory of Open Access Journals (Sweden)

    Gema Gema

    2010-10-01

    The purpose of this project is to design a knowledge management application as a medium to document knowledge and to facilitate a knowledge-sharing culture in the Oracle Financial subdivision of Orang Tua Group. The researcher uses the first seven steps of the method defined by Tiwana in building the knowledge management application prototype. The prototype's modules consist of a Wiki page, document library, discussion board, blog, picture library, knowledge base, help desk, frequently asked questions, and surveys. Using the knowledge base, users obtain knowledge about business processes, how to use the application, or how to resolve particular cases. The knowledge management prototype design as a whole fulfills users' needs in sharing knowledge, but still requires continuous improvement for maximal usage. Keywords: prototype design, knowledge management application, knowledge

  6. Robust design optimization using the price of robustness, robust least squares and regularization methods

    Science.gov (United States)

    Bukhari, Hassan J.

    2017-12-01

    In this paper, a framework for robust optimization of mechanical design problems and process systems with parametric uncertainty is presented using three different approaches. Robust optimization problems are formulated so that the optimal solution is robust, meaning it is minimally sensitive to any perturbations in parameters. The first method uses the price-of-robustness approach, which assumes the uncertain parameters to be symmetric and bounded; the robustness of the design can be controlled by limiting the number of parameters that can perturb. The second method uses robust least squares to determine the optimal parameters when the data itself, rather than the parameters, is subject to perturbations. The last method manages uncertainty by restricting the perturbation on parameters to improve sensitivity, similarly to Tikhonov regularization. The methods are implemented on two sets of problems, one linear and the other non-linear. The approach is compared with a prior method based on multiple Monte Carlo simulation runs, and the comparison shows that the approach presented in this paper results in better performance.
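
    The third, regularization-style approach can be illustrated with Tikhonov-regularised least squares, where a penalty on the solution norm trades optimality for reduced sensitivity to perturbations. This is a minimal sketch of the general idea, not the paper's exact formulation.

```python
import numpy as np

def regularized_ls(A, b, lam):
    """Tikhonov-regularised least squares:
        minimise ||A x - b||^2 + lam * ||x||^2,
    solved in closed form as x = (A^T A + lam I)^-1 A^T b.
    lam = 0 recovers ordinary least squares; larger lam trades optimality
    for reduced sensitivity to perturbations in A and b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```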

  7. Virtual screening by a new Clustering-based Weighted Similarity Extreme Learning Machine approach.

    Science.gov (United States)

    Pasupa, Kitsuchart; Kudisthalert, Wasu

    2018-01-01

    Machine learning techniques are becoming popular in virtual screening tasks. One powerful machine learning algorithm is the Extreme Learning Machine (ELM), which has been applied to many applications and has recently been applied to virtual screening. We propose the Weighted Similarity ELM (WS-ELM), which is based on a single-layer feed-forward neural network in conjunction with 16 different similarity coefficients as activation functions in the hidden layer. It is known that the performance of the conventional ELM is not robust due to random weight selection in the hidden layer. Thus, we propose a Clustering-based WS-ELM (CWS-ELM) that deterministically assigns weights by utilising clustering algorithms, i.e. k-means clustering and support vector clustering. The experiments were conducted on one of the most challenging datasets, the Maximum Unbiased Validation dataset, which contains 17 activity classes carefully selected from PubChem. The proposed algorithms were then compared with other machine learning techniques such as support vector machines, random forests, and similarity searching. The results show that CWS-ELM in conjunction with support vector clustering yields the best performance when utilised together with the Sokal/Sneath(1) coefficient. Furthermore, the ECFP_6 fingerprint presents the best results in our framework compared to the other types of fingerprints, namely ECFP_4, FCFP_4, and FCFP_6.
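
    The baseline the record builds on, a conventional ELM, is small enough to sketch: random, untrained hidden weights and a single least-squares solve for the output layer. The toy two-class data, sigmoid activation, and layer sizes below are illustrative assumptions; the paper's similarity-coefficient activations and clustering-based weight assignment are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)
n, d, hidden = 200, 2, 40

# Two Gaussian blobs as a toy binary classification task
X = np.vstack([rng.normal(-1, 0.5, (n // 2, d)),
               rng.normal(+1, 0.5, (n // 2, d))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

# Random hidden layer (fixed, never trained) with sigmoid activation
W = rng.normal(size=(d, hidden))
b = rng.normal(size=hidden)
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Output weights: a single least-squares solve (the ELM training step)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = np.sign(H @ beta)
accuracy = (pred == y).mean()
print(accuracy)
```

    The non-robustness the paper addresses is visible here: rerunning with a different seed for W and b changes the learned model.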

  8. Research on Three-dimensional Motion History Image Model and Extreme Learning Machine for Human Body Movement Trajectory Recognition

    Directory of Open Access Journals (Sweden)

    Zheng Chang

    2015-01-01

    Full Text Available Based on traditional machine vision recognition technology and traditional artificial neural networks for body movement trajectories, this paper identifies the shortcomings of the traditional recognition technology. By combining the invariant moments of the three-dimensional motion history image (computed as the eigenvector of body movements) and the extreme learning machine (constructed as the classification artificial neural network of body movements), the paper applies the method to machine vision of the body movement trajectory. In detail, the paper gives a detailed introduction to the algorithm and realization scheme of body movement trajectory recognition based on the three-dimensional motion history image and the extreme learning machine. Finally, by comparing the results of the recognition experiments, it verifies that this method of body movement trajectory recognition achieves a more accurate recognition rate and better robustness.
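
    The core data structure above, a motion history image, follows a simple update rule: pixels where motion is detected are set to a maximum timestamp tau, and all other pixels decay by one per frame. A minimal 2D sketch (the paper's three-dimensional extension and invariant-moment features are not reproduced):

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau):
    """One MHI time step. motion_mask is a boolean array of detected motion."""
    decayed = np.maximum(mhi - 1, 0)      # older motion fades
    return np.where(motion_mask, tau, decayed)

tau = 5
mhi = np.zeros((4, 4))
# Simulate a blob moving one column to the right each frame
for t in range(3):
    mask = np.zeros((4, 4), dtype=bool)
    mask[1:3, t] = True
    mhi = update_mhi(mhi, mask, tau)

print(mhi)
# The most recent motion (column 2) holds tau = 5; older columns have
# decayed to 4 and 3, encoding the direction of movement in one image.
```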

  9. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    Science.gov (United States)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.
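
    The simplest of the four evaluated approaches, local maxima search, amounts to flagging points that exceed both neighbours and a noise threshold. A 1D illustration on a synthetic spectrum (the cited IMS tools operate on 2D retention-time/drift-time maps; the signal and threshold here are made up):

```python
import numpy as np

def local_maxima(signal, threshold):
    """Indices of strict local maxima above a noise threshold."""
    s = np.asarray(signal, dtype=float)
    peaks = []
    for i in range(1, len(s) - 1):
        if s[i] > threshold and s[i] > s[i - 1] and s[i] > s[i + 1]:
            peaks.append(i)
    return peaks

x = np.linspace(0, 1, 200)
spectrum = (np.exp(-((x - 0.3) / 0.02) ** 2)          # peak at x = 0.3
            + 0.6 * np.exp(-((x - 0.7) / 0.02) ** 2))  # peak at x = 0.7

peaks = local_maxima(spectrum, threshold=0.1)
print([round(x[i], 2) for i in peaks])  # → [0.3, 0.7]
```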

  10. Machine learning search for variable stars

    Science.gov (United States)

    Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis

    2018-04-01

    Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
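
    The classical hypothesis-testing baseline the record describes can be made concrete with one of the simplest variability indices: the reduced chi-squared of a light curve against the constant-brightness model. A numpy sketch on synthetic photometry (magnitudes, errors, and the variability signal below are invented for illustration):

```python
import numpy as np

def reduced_chi2(mag, err):
    """Reduced chi-squared against the best-fit constant-brightness model."""
    mag = np.asarray(mag, dtype=float)
    err = np.asarray(err, dtype=float)
    w = 1.0 / err ** 2
    mean = np.sum(w * mag) / np.sum(w)   # weighted mean = best constant fit
    chi2 = np.sum(((mag - mean) / err) ** 2)
    return chi2 / (len(mag) - 1)

rng = np.random.default_rng(1)
err = np.full(100, 0.05)
constant = 15.0 + rng.normal(0, 0.05, 100)   # measurement noise only
variable = (15.0 + 0.3 * np.sin(np.linspace(0, 20, 100))
            + rng.normal(0, 0.05, 100))      # sinusoidal variable

print(reduced_chi2(constant, err))   # close to 1: consistent with constant
print(reduced_chi2(variable, err))   # much larger than 1: variable
```

    As the abstract notes, this test breaks down when the errors are underestimated or systematics are present, which is what motivates the machine-learning classifiers built on many such indices.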

  11. A new automated assessment method for contrast-detail images by applying support vector machine and its robustness to nonlinear image processing.

    Science.gov (United States)

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo

    2013-09-01

    The automated contrast-detail (C-D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction methods. Therefore, we have devised a new automated C-D analysis method by applying a support vector machine (SVM), and tested its robustness to nonlinear image processing. We acquired the CDRAD (a commercially available C-D test object) images at a tube voltage of 120 kV and a milliampere-second product (mAs) of 0.5-5.0. A partial-diffusion-equation-based technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced images. We also compared the performance of our SVM method with the CDRAD Analyser algorithm. The mean C-D diagrams (i.e., plots of the mean of the smallest visible hole diameter vs. hole depth) obtained from our SVM method agreed well with those averaged across the six human observers for both original and noise-reduced CDRAD images, whereas the mean C-D diagrams from the CDRAD Analyser algorithm disagreed with those from the human observers for both original and noise-reduced CDRAD images. In conclusion, our proposed SVM method for C-D analysis will work well for images processed with the nonlinear noise reduction method as well as for the original radiographic images.

  12. Hybrid machining processes perspectives on machining and finishing

    CERN Document Server

    Gupta, Kapil; Laubscher, R F

    2016-01-01

    This book describes various hybrid machining and finishing processes. It gives a critical review of past work on these processes, as well as current trends and research directions. For each hybrid machining process presented, the authors list the method of material removal, machining system, process variables and applications. This book provides a deep understanding of the need, application and mechanism of hybrid machining processes.

  13. An Expectation-Maximization Method for Calibrating Synchronous Machine Models

    Energy Technology Data Exchange (ETDEWEB)

    Meng, Da; Zhou, Ning; Lu, Shuai; Lin, Guang

    2013-07-21

    The accuracy of a power system dynamic model is essential to its secure and efficient operation. Lower confidence in model accuracy usually leads to conservative operation and lower asset usage. To improve model accuracy, this paper proposes an expectation-maximization (EM) method to calibrate the synchronous machine model using phasor measurement unit (PMU) data. First, an extended Kalman filter (EKF) is applied to estimate the dynamic states using measurement data. Then, the parameters are calculated based on the estimated states using the maximum likelihood estimation (MLE) method. The EM method iterates over the preceding two steps to improve estimation accuracy. The proposed EM method's performance is evaluated using a single-machine infinite bus system and compared with a method where both states and parameters are estimated using an EKF. Sensitivity studies of the parameter calibration using the EM method are also presented to show the robustness of the proposed method for different levels of measurement noise and initial parameter uncertainty.
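
    The paper's calibration alternates an E-step (EKF state estimation) with an M-step (MLE parameter update). The same expectation-maximization pattern is easiest to see on a textbook problem; the sketch below runs EM for the means of a two-component 1D Gaussian mixture (known unit variances, equal weights) and is a generic illustration, not the paper's power-system model.

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy data: two well-separated components at 0 and 5
data = np.hstack([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])

mu = np.array([-1.0, 1.0])   # deliberately poor initial guesses
for _ in range(50):
    # E-step: posterior responsibility of component 1 for each point
    d0 = np.exp(-0.5 * (data - mu[0]) ** 2)
    d1 = np.exp(-0.5 * (data - mu[1]) ** 2)
    r1 = d1 / (d0 + d1)
    # M-step: maximum-likelihood update of the means, weighted by r
    mu = np.array([np.sum((1 - r1) * data) / np.sum(1 - r1),
                   np.sum(r1 * data) / np.sum(r1)])

print(mu)  # means recovered near 0 and 5 despite the bad start
```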

  14. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    Science.gov (United States)

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found the following phenotype robustness criterion for synthetic gene networks: if intrinsic robustness + genetic robustness + environmental robustness ≤ network robustness, then phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental
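
    The phenotype robustness criterion stated in the abstract can be written as a simple predicate; the numeric values used below are hypothetical, chosen only to illustrate the inequality.

```python
def phenotype_robust(intrinsic, genetic, environmental, network):
    """True iff intrinsic + genetic + environmental robustness demands
    do not exceed the available network robustness."""
    return intrinsic + genetic + environmental <= network

# Hypothetical values: total demand 0.6 fits within network robustness 0.7
print(phenotype_robust(0.2, 0.3, 0.1, 0.7))  # True: phenotype maintained
# Hypothetical values: total demand 0.9 exceeds network robustness 0.7
print(phenotype_robust(0.4, 0.3, 0.2, 0.7))  # False: criterion violated
```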

  15. Context-dependent adaptation improves robustness of myoelectric control for upper-limb prostheses

    Science.gov (United States)

    Patel, Gauravkumar K.; Hahne, Janne M.; Castellini, Claudio; Farina, Dario; Dosen, Strahinja

    2017-10-01

    Objective. Dexterous upper-limb prostheses are available today to restore grasping, but an effective and reliable feed-forward control is still missing. The aim of this work was to improve the robustness and reliability of myoelectric control by using context information from sensors embedded within the prosthesis. Approach. We developed a context-driven myoelectric control scheme (cxMYO) that incorporates the inference of context information from proprioception (inertial measurement unit) and exteroception (force and grip aperture) sensors to modulate the outputs of myoelectric control. Further, a realistic evaluation of the cxMYO was performed online in able-bodied subjects using three functional tasks, during which the cxMYO was compared to a purely machine-learning-based myoelectric control (MYO). Main results. The results demonstrated that utilizing context information decreased the number of unwanted commands, improving the performance (success rate and dropped objects) in all three functional tasks. Specifically, the median number of objects dropped per round with cxMYO was zero in all three tasks and a significant increase in the number of successful transfers was seen in two out of three functional tasks. Additionally, the subjects reported better user experience. Significance. This is the first online evaluation of a method integrating information from multiple on-board prosthesis sensors to modulate the output of a machine-learning-based myoelectric controller. The proposed scheme is general and presents a simple, non-invasive and cost-effective approach for improving the robustness of myoelectric control.

  16. The efficacy of cladribine tablets in CIS patients retrospectively assigned the diagnosis of MS using modern criteria: Results from the ORACLE-MS study.

    Science.gov (United States)

    Freedman, Mark S; Leist, Thomas P; Comi, Giancarlo; Cree, Bruce Ac; Coyle, Patricia K; Hartung, Hans-Peter; Vermersch, Patrick; Damian, Doris; Dangond, Fernando

    2017-01-01

    Multiple sclerosis (MS) diagnostic criteria have changed since the ORACLE-MS study was conducted; 223 of 616 patients (36.2%) would have met the diagnosis of MS rather than clinically isolated syndrome (CIS) using the newer criteria. The objective of this paper is to assess the effect of cladribine tablets in patients with a first clinical demyelinating attack fulfilling the newer criteria (McDonald 2010) for MS vs CIS. A post hoc analysis was conducted for subgroups of patients retrospectively classified as fulfilling or not fulfilling the newer criteria at the first clinical demyelinating attack. Cladribine tablets 3.5 mg/kg (n = 68) reduced the risk of next attack or three-month confirmed Expanded Disability Status Scale (EDSS) worsening by 74% vs placebo (n = 72); p = 0.0009 in patients meeting the newer criteria for MS at baseline. Cladribine tablets 5.25 mg/kg (n = 83) reduced the risk of next attack or three-month confirmed EDSS worsening by 37%, but nominal significance was not reached (p = 0.14). In patients who were still CIS after applying the newer criteria, cladribine tablets 3.5 mg/kg (n = 138) reduced the risk of conversion to clinically definite multiple sclerosis (CDMS) by 63% vs placebo (n = 134); p = 0.0003. Cladribine tablets 5.25 mg/kg (n = 121) reduced the risk of conversion by 75% vs placebo (n = 134). ORACLE-MS study (NCT00725985).

  17. Roll-to-Roll Manufacturing of Robust Superhydrophobic Coating on Metallic Engineering Materials.

    Science.gov (United States)

    Dong, Shuliang; Wang, Zhenlong; Wang, Yukui; Bai, Xuelin; Fu, Yong Qing; Guo, Bin; Tan, Chaoliang; Zhang, Jia; Hu, PingAn

    2018-01-17

    Creating a robust superhydrophobic surface on conventional engineering materials at mass-production scale is of great importance for self-cleaning, anti-icing, nonwetting surfaces and low flow resistance in industrial applications. Herein, we report a roll-to-roll strategy to create durable and robust superhydrophobic surfaces with designed micro-/nanoscale hierarchical structures on many conventional engineering materials by combining electrical discharge machining and coating of carbon nanoparticles, followed by oil penetration and drying. The treated surface shows good superhydrophobic properties, with a static water contact angle of 170 ± 2° and a slide angle of 3 ± 1°. The treated surface also exhibits good resilience and maintains its performance after being tested in various harsh conditions, including water flushing for several days, sand abrasion, scratching with sandpapers, and corrosive solutions. Significantly, the superhydrophobic surfaces also retain highly efficient self-cleaning properties even after oil contamination during applications.

  18. Machine learning in autistic spectrum disorder behavioral research: A review and ways forward.

    Science.gov (United States)

    Thabtah, Fadi

    2018-02-13

    Autistic Spectrum Disorder (ASD) is a mental disorder that retards the acquisition of linguistic, communication, cognitive, and social skills and abilities. Despite being diagnosed with ASD, some individuals exhibit outstanding scholastic, non-academic, and artistic capabilities, cases that pose a challenging task for scientists to explain. In the last few years, ASD has been investigated by social and computational intelligence scientists utilizing advanced technologies such as machine learning to improve diagnostic timing, precision, and quality. Machine learning is a multidisciplinary research topic that employs intelligent techniques to discover useful concealed patterns, which are utilized in prediction to improve decision making. Machine learning techniques such as support vector machines, decision trees, logistic regression, and others have been applied to datasets related to autism in order to construct predictive models. These models claim to enhance the ability of clinicians to provide robust diagnoses and prognoses of ASD. However, studies concerning the use of machine learning in ASD diagnosis and treatment suffer from conceptual, implementation, and data issues, such as the way diagnostic codes are used, the type of feature selection employed, the evaluation measures chosen, and class imbalances in data, among others. A more serious claim in recent studies is the development of a new method for ASD diagnosis based on machine learning. This article critically analyses these recent investigative studies on autism, not only articulating the aforementioned issues but also recommending paths forward to enhance machine learning use in ASD with respect to conceptualization, implementation, and data. Future studies concerning machine learning in autism research would benefit greatly from such proposals.

  19. Classification of Strawberry Fruit Shape by Machine Learning

    Science.gov (United States)

    Ishikawa, T.; Hayashi, A.; Nagamatsu, S.; Kyutoku, Y.; Dan, I.; Wada, T.; Oku, K.; Saeki, Y.; Uto, T.; Tanabata, T.; Isobe, S.; Kochi, N.

    2018-05-01

    Shape is one of the most important traits of agricultural products due to its relationships with the quality, quantity, and value of the products. For strawberries, the nine types of fruit shape were defined and classified by humans based on the sampler patterns of the nine types. In this study, we tested the classification of strawberry shapes by machine learning in order to increase the accuracy of the classification, and we introduce the concept of computerization into this field. Four types of descriptors were extracted from the digital images of strawberries: (1) the Measured Values (MVs) including the length of the contour line, the area, the fruit length and width, and the fruit width/length ratio; (2) the Ellipse Similarity Index (ESI); (3) Elliptic Fourier Descriptors (EFDs), and (4) Chain Code Subtraction (CCS). We used these descriptors for the classification test along with the random forest approach, and eight of the nine shape types were classified with combinations of MVs + CCS + EFDs. CCS is a descriptor that adds human knowledge to the chain codes, and it showed higher robustness in classification than the other descriptors. Our results suggest machine learning's high ability to classify fruit shapes accurately. We will attempt to increase the classification accuracy and apply the machine learning methods to other plant species.

  20. Application of higher order spectral features and support vector machines for bearing faults classification.

    Science.gov (United States)

    Saidi, Lotfi; Ben Ali, Jaouher; Fnaiech, Farhat

    2015-01-01

    Timely and accurate condition monitoring and fault diagnosis of rolling element bearings are very important to ensure the reliability of rotating machinery. This paper presents a novel pattern classification approach for bearing diagnostics, which combines higher order spectra analysis features and a support vector machine classifier. The use of non-linear features motivated by the higher order spectra has been reported to be a promising approach to analyze the non-linear and non-Gaussian characteristics of mechanical vibration signals. The vibration bi-spectrum (third order spectrum) patterns are extracted as the feature vectors representing different bearing faults. The extracted bi-spectrum features are subjected to principal component analysis for dimensionality reduction. These principal components were fed to a support vector machine to distinguish four kinds of bearing faults covering different levels of severity for each fault type, which were measured in the experimental test bench running under different working conditions. In order to find the optimal parameters for the multi-class support vector machine model, a grid-search method in combination with 10-fold cross-validation has been used. Based on the correct classification of bearing patterns in the test set, in each fold the performance measures are computed. The average of these performance measures is computed to report the overall performance of the support vector machine classifier. In addition, in fault detection problems, the performance of a detection algorithm usually depends on the trade-off between robustness and sensitivity. The sensitivity and robustness of the proposed method are explored by running a series of experiments. A receiver operating characteristic (ROC) curve made the results more convincing. The results indicated that the proposed method can reliably identify different fault patterns of rolling element bearings based on vibration signals. Copyright © 2014 ISA
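
    The model-selection step described above, a hyper-parameter grid search scored by k-fold cross-validation, can be sketched generically. For a self-contained example, a trivial shrunken nearest-centroid classifier stands in for the paper's multi-class SVM, and the grid, data, and shrinkage parameter are invented for illustration; only the grid-search/cross-validation skeleton mirrors the paper's setup.

```python
import numpy as np

def kfold_indices(n, k, rng):
    """Shuffle 0..n-1 and split into k disjoint folds."""
    return np.array_split(rng.permutation(n), k)

def nearest_centroid_acc(X_tr, y_tr, X_te, y_te, shrink):
    # 'shrink' pulls class centroids toward the global mean (the tuned knob)
    g = X_tr.mean(axis=0)
    cents = {c: (1 - shrink) * X_tr[y_tr == c].mean(axis=0) + shrink * g
             for c in np.unique(y_tr)}
    labels = np.array(sorted(cents))
    D = np.stack([np.linalg.norm(X_te - cents[c], axis=1) for c in labels])
    pred = labels[np.argmin(D, axis=0)]
    return (pred == y_te).mean()

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (60, 4)), rng.normal(3, 1, (60, 4))])
y = np.repeat([0, 1], 60)

k = 10
best = None
for shrink in [0.0, 0.2, 0.5, 0.9]:          # the hyper-parameter grid
    folds = kfold_indices(len(y), k, rng)
    accs = []
    for i in range(k):                        # k-fold cross-validation
        te = folds[i]
        tr = np.hstack([folds[j] for j in range(k) if j != i])
        accs.append(nearest_centroid_acc(X[tr], y[tr], X[te], y[te], shrink))
    score = np.mean(accs)                     # average fold performance
    if best is None or score > best[1]:
        best = (shrink, score)

print(best)   # (selected parameter, its mean cross-validated accuracy)
```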

  1. The Berkeley Out-of-Order Machine (BOOM): An Industry-Competitive, Synthesizable, Parameterized RISC-V Processor

    Science.gov (United States)

    2015-06-13

    12-2-0016, the Center for Future Architecture Research, a member of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA...and ASPIRE Lab industrial sponsors and affiliates Intel, Google, Huawei, Nokia, NVIDIA, Oracle, and Samsung. Any opinions, findings, conclusions

  2. Refinamiento del diagrama de clases UML a Oracle®9I en ATOM3

    Directory of Open Access Journals (Sweden)

    CARLOS M. ZAPATA

    2007-01-01

    Full Text Available The OMG defines refinement as the process of transforming a platform-independent model into a platform-specific model. Conventional CASE tools have had problems with this kind of transformation, owing to the static definition of the models they include, the difficulty of defining transformation rules, and their poor performance in code generation. MetaCASE tools have emerged with new capabilities for improving refinement in the context of model-to-model transformation. This article presents an AToM3 implementation of a refinement that transforms a platform-independent UML class diagram into a UML class diagram specific to the Oracle® 9i platform. In addition, the use of this kind of refinement is illustrated with a case study.

  3. Investigating the tool marks on oracle bones inscriptions from the Yinxu site (ca., 1319-1046 BC), Henan province, China.

    Science.gov (United States)

    Zhao, Xiaolong; Tang, Jigen; Gu, Zhou; Shi, Jilong; Yang, Yimin; Wang, Changsui

    2016-09-01

    Oracle Bone Inscriptions in the Shang dynasty (1600-1046 BC) are the earliest well-developed writing forms of the Chinese character system, and their carving techniques have not been studied by tool marks analysis with microscopy. In this study, a digital microscope with three-dimensional surface reconstruction based on extended depth of focus technology was used to investigate tool marks on the surface of four pieces of oracle bones excavated at the eastern area of Huayuanzhuang, Yinxu site (ca. 1319-1046 BC), the last capital of the Shang dynasty, Henan province, China. The results show that there were two procedures to carve the characters on the analyzed tortoise shells. The first procedure was direct carving. The second was "outlining design," which means to engrave a formal character after engraving a draft with a pointed tool. Most of the strokes developed by an engraver do not overlap the smaller draft, which implies that the outlining design would be a sound way to avoid errors such as wrong and missing characters. The strokes of these characters have different shapes at the two ends and variations in the width and depth of the grooves. Moreover, the bottom of the grooves is always rugged. Thus, the use of rotary wheel-cutting tools could be ruled out. In most cases, the starting points of the strokes are round or flat while the finishing points are always pointed. Moreover, the strokes should have been engraved from top to bottom. When vertical or horizontal strokes had been engraved, the shell would be turned about 90 degrees to engrave the crossed strokes from top to bottom. There was no preferred order for engraving vertical or horizontal strokes. Since both sides of the grooves of the characters are neat and there are no unorganized tool marks, it is suggested that some sharp tools were used for engraving characters on the shells. Microsc. Res. Tech. 79:827-832, 2016. © 2016 Wiley Periodicals, Inc.

  4. Perceptual Robust Design

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard

    The research presented in this PhD thesis has focused on a perceptual approach to robust design. The results of the research and the original contribution to knowledge are a preliminary framework for understanding, positioning, and applying perceptual robust design. Product quality is a topic...... been presented. Therefore, this study set out to contribute to the understanding and application of perceptual robust design. To achieve this, a state-of-the-art and current practice review was performed. From the review two main research problems were identified. Firstly, a lack of tools...... for perceptual robustness was found to overlap with the optimum for functional robustness and at most approximately 2.2% out of the 14.74% could be ascribed solely to the perceptual robustness optimisation. In conclusion, the thesis has offered a new perspective on robust design by merging robust design...

  5. Comparación entre Oracle BPM y JBPM en la optimización de un proceso de admisiones

    Directory of Open Access Journals (Sweden)

    Jorge Leonardo Camargo Cuervo

    2013-12-01

    Full Text Available This paper presents the procedure followed to evaluate and compare two Business Process Management suites, Oracle BPM and JBPM. The procedure was based on weighting and grading the characteristics Implementation, Integration, Performance, Scalability, and Documentation of each suite for the case of automating the Admissions process of the Academic Registry Office of the Universidad Pedagógica y Tecnológica de Colombia, whose complexity and cross-cutting reach across the entire university marked it as the most appropriate process for the project. To achieve the objective of this work, the SCRUM methodology was used, which enables agile and effective development.
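
    The weighting-and-grading evaluation described in the record reduces to a weighted-sum decision matrix. A minimal sketch follows; the weights and per-characteristic grades are made-up placeholders, not the study's actual data or conclusions.

```python
# Hypothetical weights per characteristic (sum to 1.0)
criteria = {
    "implementation": 0.25, "integration": 0.20, "performance": 0.25,
    "scalability": 0.15, "documentation": 0.15,
}
# Hypothetical 1-5 grades for each suite
grades = {
    "Oracle BPM": {"implementation": 4, "integration": 4, "performance": 5,
                   "scalability": 4, "documentation": 3},
    "JBPM":       {"implementation": 3, "integration": 4, "performance": 3,
                   "scalability": 3, "documentation": 4},
}

# Weighted sum per suite ranks the alternatives
scores = {suite: sum(criteria[c] * g[c] for c in criteria)
          for suite, g in grades.items()}
print(scores)
```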

  6. A Novel Bearing Fault Diagnosis Method Based on Gaussian Restricted Boltzmann Machine

    Directory of Open Access Journals (Sweden)

    Xiao-hui He

    2016-01-01

    Full Text Available To realize bearing fault diagnosis effectively, this paper presents a novel bearing fault diagnosis method based on the Gaussian restricted Boltzmann machine (Gaussian RBM). Vibration signals are first resampled to the same equivalent speed. Subsequently, the envelope spectra of the resampled data are used directly as the feature vectors to represent the fault types of the bearing. Finally, in order to deal with the high-dimensional feature vectors based on the envelope spectrum, a classifier model based on the Gaussian RBM is applied. The Gaussian RBM has the ability to provide a closed-form representation of the distribution underlying the training data, and it is very convenient for modeling high-dimensional real-valued data. Experiments on 10 different data sets verify the performance of the proposed method. The superiority of the Gaussian RBM classifier is also confirmed by comparison with other classifiers, such as the extreme learning machine, support vector machine, and deep belief network. The robustness of the proposed method is also studied in this paper. It can be concluded that the proposed method can realize bearing fault diagnosis accurately and effectively.
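
    The feature-extraction step above, the envelope spectrum, can be sketched in its simplest FFT form: take the analytic signal via the Hilbert transform, then the spectrum of its magnitude, where a bearing fault shows up as the modulation frequency. The signal parameters below are illustrative, not from the paper.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT-based Hilbert transform (even length)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:n // 2] = 2          # double positive frequencies
    h[n // 2] = 1            # Nyquist bin (n assumed even)
    return np.fft.ifft(X * h)

fs = 2000
t = np.arange(0, 1, 1 / fs)
# 200 Hz carrier amplitude-modulated at 13 Hz (a stand-in "fault frequency")
x = (1 + 0.8 * np.cos(2 * np.pi * 13 * t)) * np.sin(2 * np.pi * 200 * t)

env = np.abs(analytic_signal(x))                 # envelope of the signal
spec = np.abs(np.fft.rfft(env - env.mean()))     # envelope spectrum
freqs = np.fft.rfftfreq(len(env), 1 / fs)
print(freqs[np.argmax(spec)])                    # → 13.0 (the modulation)
```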

  7. The continuous-variable Deutsch–Jozsa algorithm using realistic quantum systems

    International Nuclear Information System (INIS)

    Wagner, Rob C; Kendon, Viv M

    2012-01-01

    This paper is a study of the continuous-variable Deutsch–Jozsa algorithm. First, we review an existing version of the algorithm for qunat states (Pati and Braunstein 2002 arXiv:0207108v1), and then, we present a realistic version of the Deutsch–Jozsa algorithm for continuous variables, which can be implemented in a physical quantum system given the appropriate oracle. Under these conditions, we have a probabilistic algorithm for deciding the function with a very high success rate with a single call to the oracle. Finally, we look at the effects of errors in both of these continuous-variable algorithms and how they affect the chances of success. We find that the algorithm is generally robust for errors in initialization and the oracle, but less so for errors in the measurement apparatus and the Fourier transform. This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to ‘Coherent states: mathematical and physical aspects’. (paper)
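
    For reference, the discrete-variable original that the paper's continuous-variable algorithm generalizes can be simulated directly: with a perfect phase oracle, measuring all zeros on the input register certifies f constant, and any other outcome certifies f balanced. A numpy statevector sketch of the standard qubit Deutsch–Jozsa algorithm (not the continuous-variable version studied above):

```python
import numpy as np

def deutsch_jozsa_zero_prob(f_values):
    """Probability of the all-zeros outcome for oracle f: {0,1}^n -> {0,1}.

    After the first Hadamard layer the state is uniform; the phase oracle
    multiplies each amplitude by (-1)^f(x); the final Hadamard layer maps
    the |0...0> amplitude to the mean of those phases.
    """
    n = int(np.log2(len(f_values)))
    amps = (-1.0) ** np.asarray(f_values) / 2 ** n
    amp_zero = amps.sum()
    return abs(amp_zero) ** 2

n = 3
constant_f = [0] * 2 ** n                 # constant function
balanced_f = [0, 1] * (2 ** (n - 1))      # balanced function

print(deutsch_jozsa_zero_prob(constant_f))  # 1.0: certainly constant
print(deutsch_jozsa_zero_prob(balanced_f))  # 0.0: certainly balanced
```

    With an ideal oracle a single query decides the function with certainty, which is the property whose robustness to errors the paper analyses.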

  8. Machine terms dictionary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1979-04-15

    This book gives descriptions of machine terms, covering machine design, drawing, machining methods, machine tools, machine materials, automobiles, measurement and control, electricity, basics of electronics, information technology, quality assurance, AutoCAD and FA terms, and important formulas of mechanical engineering.

  9. Exact analytical modeling of magnetic vector potential in surface inset permanent magnet DC machines considering magnet segmentation

    Science.gov (United States)

    Jabbari, Ali

    2018-01-01

    Surface inset permanent magnet DC machines can be used as an alternative in automation systems due to their high efficiency and robustness. Magnet segmentation is a common technique for mitigating pulsating torque components in permanent magnet machines. An accurate computation of the air-gap magnetic field distribution is necessary to calculate machine performance. An exact analytical method for magnetic vector potential calculation in surface inset permanent magnet machines considering magnet segmentation is proposed in this paper. The analytical method is based on the resolution of the Laplace and Poisson equations, as well as Maxwell's equations, in polar coordinates using the sub-domain method. One of the main contributions of the paper is to derive an expression for the magnetic vector potential in the segmented PM region by using hyperbolic functions. The developed method is applied to the performance computation of two prototype surface inset segmented magnet motors under open-circuit and on-load conditions. The results of these models are validated through the FEM.
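
    In each annular sub-domain the method solves the governing field equations for the axial vector potential in polar coordinates; generically (a sketch of the standard equations used in sub-domain analyses, not the paper's exact boundary-value problem):

```latex
% Air-gap (current-free) region: Laplace's equation for A_z
\frac{\partial^{2} A_z}{\partial r^{2}}
  + \frac{1}{r}\frac{\partial A_z}{\partial r}
  + \frac{1}{r^{2}}\frac{\partial^{2} A_z}{\partial \theta^{2}} = 0

% Magnet region: Poisson's equation driven by the magnetization
% M = (M_r, M_\theta)
\frac{\partial^{2} A_z}{\partial r^{2}}
  + \frac{1}{r}\frac{\partial A_z}{\partial r}
  + \frac{1}{r^{2}}\frac{\partial^{2} A_z}{\partial \theta^{2}}
  = -\frac{\mu_{0}}{r}\left( M_{\theta} - \frac{\partial M_{r}}{\partial \theta} \right)
```

    The general solution in each sub-domain is a Fourier series in θ whose radial coefficients are powers of r; the paper's contribution is expressing the segmented PM region's solution with hyperbolic functions instead.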

  10. Some relations between quantum Turing machines and Turing machines

    OpenAIRE

    Sicard, Andrés; Vélez, Mario

    1999-01-01

    For quantum Turing machines we present three elements: their components, their time evolution operator and their local transition function. The components are related to the components of deterministic Turing machines, the time evolution operator is related to the evolution of reversible Turing machines, and the local transition function is related to the transition function of probabilistic and reversible Turing machines.

  11. Robust equivalent consumption-based controllers for a dual-mode diesel parallel HEV

    International Nuclear Information System (INIS)

    Finesso, Roberto; Spessa, Ezio; Venditti, Mattia

    2016-01-01

    Highlights: • Non-plug-in dual-mode parallel hybrid architecture. • Cross-validation machine learning for robust equivalent consumption-based controllers. • Optimal control strategy based on fuel consumption, NOx and battery aging. • Impact of different equivalent consumption definitions on HEV performance. • Correlation between vehicle braking energy and SOC variation in the traction stages. - Abstract: New equivalent consumption minimization strategy (ECMS) tools have been developed and applied to identify the optimal control strategy of a dual-mode parallel hybrid electric vehicle equipped with a compression-ignition engine. In this architecture, the electric machine is coupled to the engine through either a single-speed gearbox (torque coupling) or a planetary gear set (speed coupling). One of the main novelties of the present study concerns the definition of the instantaneous equivalent consumption (EC) function, which takes into account not only fuel consumption (FC) and the energy flow through the electric components, but also NO_x emissions, battery aging, and the battery SOC. The EC function has been trained using a cross-validation machine-learning technique, based on a genetic algorithm, where the training data set has been selected in order to maximize performance over a testing data set. The adoption of this technique, in conjunction with the new definition of EC, has led to the identification of very robust controllers, which provide accurate control for different driving scenarios, even when the EC function is not specifically trained on the same missions over which it is tested. To this aim, a data set of fifty driving cycles and six user-defined missions, which cover a total distance of 70–100 km, has been considered as a training driving set. The ECMS controllers can be implemented in a vehicle control unit, and their performance has proved to be close to that of a dynamic programming tool, which has been used here as a benchmark.
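
    The core ECMS idea, choosing at each instant the torque split that minimizes an instantaneous equivalent consumption, can be sketched as below. The fuel, NOx and battery maps here are hypothetical convex placeholders, not the paper's calibrated models, and the fixed motor power coefficient is an assumption:

```python
import numpy as np

# Hypothetical maps for illustration only (not the paper's calibrated models)
def fuel_rate(T_engine):            # g/s, convex in engine torque
    return 0.02 * T_engine + 1e-4 * T_engine ** 2

def nox_rate(T_engine):             # mg/s
    return 0.05 * T_engine

def ecms_split(T_request, s_eq=2.5e-4, w_nox=0.01):
    """Pick the engine/motor torque split minimizing the instantaneous
    equivalent consumption: fuel + s_eq * battery power + weighted NOx."""
    candidates = np.linspace(0.0, T_request, 101)    # engine torque candidates
    best, best_cost = None, np.inf
    for T_e in candidates:
        T_m = T_request - T_e                        # motor supplies the rest
        P_batt = T_m * 150.0                         # W, at a fixed speed (assumed)
        cost = fuel_rate(T_e) + s_eq * P_batt + w_nox * nox_rate(T_e)
        if cost < best_cost:
            best, best_cost = (T_e, T_m), cost
    return best

T_e, T_m = ecms_split(100.0)
print(round(T_e, 1), round(T_m, 1))   # an interior split: both sources are used
```

    The paper's contribution is, in effect, learning the weights of such a cost function (the equivalence factor and penalty terms) by cross-validation over a set of driving missions, rather than fixing them a priori.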

  12. Support vector machine in machine condition monitoring and fault diagnosis

    Science.gov (United States)

    Widodo, Achmad; Yang, Bo-Suk

    2007-08-01

    Recently, the issue of machine condition monitoring and fault diagnosis as a part of maintenance systems has become global due to the potential advantages to be gained from reduced maintenance costs, improved productivity and increased machine availability. This paper presents a survey of machine condition monitoring and fault diagnosis using the support vector machine (SVM). It attempts to summarize and review the recent research and developments of SVM in machine condition monitoring and diagnosis. Numerous methods have been developed based on intelligent systems such as artificial neural networks, fuzzy expert systems, case-based reasoning, random forests, etc. However, the use of SVM for machine condition monitoring and fault diagnosis is still rare. SVM has excellent generalization performance, so it can produce high classification accuracy for machine condition monitoring and diagnosis. Up to 2006, the use of SVM in machine condition monitoring and fault diagnosis tended to develop toward expertise-oriented and problem-oriented domains. Finally, the ability to continually adapt and obtain novel ideas for machine condition monitoring and fault diagnosis using SVM will be future work.
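
    A minimal sketch of the kind of classification task surveyed here: a linear soft-margin SVM trained with Pegasos sub-gradient descent on synthetic two-class "vibration feature" data. The feature names, class centers and training scheme are illustrative assumptions; in practice a library SVM (e.g. with an RBF kernel) would be used:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D condition-monitoring features (e.g. RMS and kurtosis of a
# vibration signal); illustrative data, not from a real machine
healthy = rng.normal([1.0, 3.0], 0.3, size=(100, 2))   # class -1
faulty = rng.normal([2.5, 6.0], 0.3, size=(100, 2))    # class +1
X = np.vstack([healthy, faulty])
y = np.array([-1] * 100 + [1] * 100)

# Linear soft-margin SVM via Pegasos sub-gradient descent
Xa = np.hstack([X, np.ones((len(X), 1))])   # constant column absorbs the bias
w, lam = np.zeros(3), 0.01
for t in range(1, 5001):
    i = rng.integers(len(y))
    eta = 1.0 / (lam * t)
    if y[i] * (Xa[i] @ w) < 1:              # margin violated: hinge-loss step
        w = (1 - eta * lam) * w + eta * y[i] * Xa[i]
    else:                                   # only the regularizer shrinks w
        w = (1 - eta * lam) * w

print((np.sign(Xa @ w) == y).mean())        # training accuracy on the toy data
```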

  13. A Review on the Faults of Electric Machines Used in Electric Ships

    OpenAIRE

    Dionysios V. Spyropoulos; Epaminondas D. Mitronikas

    2013-01-01

    Electric propulsion systems are today widely applied in modern ships, including transport ships and warships. The ship of the future will be fully electric, and not only its propulsion system but also all the other services will depend on electric power. The robust and reliable operation of the ship’s power system is essential. In this work, a review on the mechanical and electrical faults of electric machines that are used in electric ships is presented.

  14. ANN-PSO Integrated Optimization Methodology for Intelligent Control of MMC Machining

    Science.gov (United States)

    Chandrasekaran, Muthumari; Tamang, Santosh

    2017-08-01

    Metal Matrix Composites (MMC) show improved properties in comparison with non-reinforced alloys and have found increased application in the automotive and aerospace industries. The selection of optimum machining parameters to produce components of the desired surface roughness is of great concern considering the quality and economy of the manufacturing process. In this study, a surface roughness prediction model for turning Al-SiCp MMC is developed using an Artificial Neural Network (ANN). Three turning parameters, viz. spindle speed (N), feed rate (f) and depth of cut (d), were considered as input neurons and surface roughness was the output neuron. An ANN architecture of 3-5-1 is found to be optimum, and the model predicts with an average percentage error of 7.72%. The Particle Swarm Optimization (PSO) technique is used for optimizing the parameters to minimize machining time. The innovative aspect of this work is the development of an integrated ANN-PSO optimization method for intelligent control of the MMC machining process applicable to manufacturing industries. The robustness of the method shows its superiority for obtaining optimum cutting parameters satisfying the desired surface roughness. The method has better convergence capability with a minimum number of iterations.
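
    The PSO half of the ANN-PSO scheme can be sketched as a generic particle swarm minimizer. The objective below is a hypothetical smooth machining-time surrogate over speed/feed bounds, standing in for the trained ANN; all parameter values are assumptions, not the paper's:

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (a sketch of the technique,
    not the paper's ANN-PSO implementation)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                  # keep particles in bounds
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()        # global best so far
    return g, f(g)

# Hypothetical machining-time surrogate, minimized over (speed, feed) bounds
time_model = lambda p: (p[0] - 1200) ** 2 / 1e6 + (p[1] - 0.15) ** 2 * 100
best, val = pso(time_model, [(500, 2000), (0.05, 0.4)])
print(best)   # converges near speed = 1200, feed = 0.15
```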

  15. Upgrade of laser and electron beam welding database

    CERN Document Server

    Furman, Magdalena

    2014-01-01

    The main purpose of this project was to fix existing issues and update the existing database holding the parameters of the laser-beam and electron-beam welding machines. Moreover, the database had to be extended to hold the data for the new machines that recently arrived at the workshop. As a solution, the database was migrated to the Oracle framework, and a new user interface (using APEX) was designed and implemented with integration with the CERN web services (EDMS, Phonebook, JMT, CDD and EDH).

  16. A new automated assessment method for contrast–detail images by applying support vector machine and its robustness to nonlinear image processing

    International Nuclear Information System (INIS)

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kumiharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo

    2013-01-01

    The automated contrast–detail (C–D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction methods. Therefore, we have devised a new automated C–D analysis method by applying a support vector machine (SVM), and tested its robustness to nonlinear image processing. We acquired images of the CDRAD phantom (a commercially available C–D test object) at a tube voltage of 120 kV and a milliampere-second product (mAs) of 0.5–5.0. A diffusion-equation-based technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced images. We also compared the performance of our SVM method with the CDRAD Analyser algorithm. The mean C–D diagrams (i.e., plots of the mean smallest visible hole diameter vs. hole depth) obtained from our SVM method agreed well with those averaged across the six human observers for both original and noise-reduced CDRAD images, whereas the mean C–D diagrams from the CDRAD Analyser algorithm disagreed with those from the human observers for both original and noise-reduced CDRAD images. In conclusion, our proposed SVM method for C–D analysis will work well for images processed with the nonlinear noise reduction method as well as for the original radiographic images.

  17. An Extreme Learning Machine Based on the Mixed Kernel Function of Triangular Kernel and Generalized Hermite Dirichlet Kernel

    Directory of Open Access Journals (Sweden)

    Senyue Zhang

    2016-01-01

    Full Text Available Because the kernel function of an extreme learning machine (ELM) is strongly correlated with its performance, a novel extreme learning machine based on a generalized triangular Hermitian kernel function is proposed in this paper. First, the generalized triangular Hermitian kernel function is constructed as the product of the triangular kernel and the generalized Hermite Dirichlet kernel, and the proposed kernel function is proved to be a valid kernel function for the extreme learning machine. Then, the learning methodology of the extreme learning machine based on the proposed kernel function is presented. The biggest advantage of the proposed kernel is that its kernel parameter values are chosen only from the natural numbers, which greatly shortens the computational time of parameter optimization and retains more of the sample data's structural information. Experiments were performed on a number of binary classification, multiclassification, and regression datasets from the UCI benchmark repository. The experimental results demonstrate that the robustness and generalization performance of the proposed method outperform those of other extreme learning machines with different kernels. Furthermore, the learning speed of the proposed method is faster than that of support vector machine (SVM) methods.
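
    A kernel ELM reduces training to a single regularized linear solve for the output weights. The sketch below uses a plain RBF kernel as a stand-in; the paper's mixed triangular/Hermite Dirichlet kernel would replace the `rbf` function, and all data here is synthetic:

```python
import numpy as np

def kernel_elm_fit(K, T, C=1.0):
    """Closed-form kernel ELM output weights: beta = (I/C + K)^-1 T."""
    n = K.shape[0]
    return np.linalg.solve(np.eye(n) / C + K, T)

def rbf(X, Y, gamma=1.0):
    """Plain RBF kernel matrix (illustrative stand-in for the mixed kernel)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (80, 1))
t = np.sin(X[:, 0])                                  # regression target
beta = kernel_elm_fit(rbf(X, X), t, C=100.0)         # one linear solve, no iteration
Xtest = np.linspace(-3, 3, 50)[:, None]
pred = rbf(Xtest, X) @ beta
print(np.abs(pred - np.sin(Xtest[:, 0])).max())      # small error on the sin(x) fit
```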

  18. Simulations of Quantum Turing Machines by Quantum Multi-Stack Machines

    OpenAIRE

    Qiu, Daowen

    2005-01-01

    As is well known, in classical computation, Turing machines, circuits, multi-stack machines, and multi-counter machines are equivalent, that is, they can simulate each other in polynomial time. In quantum computation, Yao [11] first proved that for any quantum Turing machine $M$, there exists a quantum Boolean circuit $(n,t)$-simulating $M$, where $n$ denotes the length of input strings, and $t$ is the number of move steps before the machine stops. However, the simulations of quantum Turing machines...

  19. Noise-robust speech recognition through auditory feature detection and spike sequence decoding.

    Science.gov (United States)

    Schafer, Phillip B; Jin, Dezhe Z

    2014-03-01

    Speech recognition in noisy conditions is a major challenge for computer systems, but the human brain performs it routinely and accurately. Automatic speech recognition (ASR) systems that are inspired by neuroscience can potentially bridge the performance gap between humans and machines. We present a system for noise-robust isolated word recognition that works by decoding sequences of spikes from a population of simulated auditory feature-detecting neurons. Each neuron is trained to respond selectively to a brief spectrotemporal pattern, or feature, drawn from the simulated auditory nerve response to speech. The neural population conveys the time-dependent structure of a sound by its sequence of spikes. We compare two methods for decoding the spike sequences--one using a hidden Markov model-based recognizer, the other using a novel template-based recognition scheme. In the latter case, words are recognized by comparing their spike sequences to template sequences obtained from clean training data, using a similarity measure based on the length of the longest common sub-sequence. Using isolated spoken digits from the AURORA-2 database, we show that our combined system outperforms a state-of-the-art robust speech recognizer at low signal-to-noise ratios. Both the spike-based encoding scheme and the template-based decoding offer gains in noise robustness over traditional speech recognition methods. Our system highlights potential advantages of spike-based acoustic coding and provides a biologically motivated framework for robust ASR development.
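
    The template-matching step, comparing spike sequences by the length of their longest common subsequence (LCS), can be sketched with the standard dynamic-programming recurrence; the spike-label sequences below are hypothetical:

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1          # extend a common subsequence
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[m][n]

def similarity(a, b):
    """Normalized LCS similarity between two spike-label sequences."""
    return lcs_length(a, b) / max(len(a), len(b))

# Hypothetical neuron-label sequences: a word template vs. a noisy utterance
template = [3, 1, 4, 1, 5, 9, 2, 6]
noisy = [3, 1, 4, 7, 1, 5, 2, 6]   # one spurious spike, one dropped spike
print(lcs_length(template, noisy), similarity(template, noisy))   # 7 0.875
```

    A word would then be recognized by taking the template with the highest such similarity to the test sequence.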

  20. On the applicability of brain reading for predictive human-machine interfaces in robotics.

    Science.gov (United States)

    Kirchner, Elsa Andrea; Kim, Su Kyoung; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Krell, Mario Michael; Tabie, Marc; Fahle, Manfred

    2013-01-01

    The ability of today's robots to autonomously support humans in their daily activities is still limited. To improve this, predictive human-machine interfaces (HMIs) can be applied to better support future interaction between human and machine. To infer upcoming context-based behavior relevant brain states of the human have to be detected. This is achieved by brain reading (BR), a passive approach for single trial EEG analysis that makes use of supervised machine learning (ML) methods. In this work we propose that BR is able to detect concrete states of the interacting human. To support this, we show that BR detects patterns in the electroencephalogram (EEG) that can be related to event-related activity in the EEG like the P300, which are indicators of concrete states or brain processes like target recognition processes. Further, we improve the robustness and applicability of BR in application-oriented scenarios by identifying and combining most relevant training data for single trial classification and by applying classifier transfer. We show that training and testing, i.e., application of the classifier, can be carried out on different classes, if the samples of both classes miss a relevant pattern. Classifier transfer is important for the usage of BR in application scenarios, where only small amounts of training examples are available. Finally, we demonstrate a dual BR application in an experimental setup that requires similar behavior as performed during the teleoperation of a robotic arm. Here, target recognition processes and movement preparation processes are detected simultaneously. In summary, our findings contribute to the development of robust and stable predictive HMIs that enable the simultaneous support of different interaction behaviors.

  1. On the applicability of brain reading for predictive human-machine interfaces in robotics.

    Directory of Open Access Journals (Sweden)

    Elsa Andrea Kirchner

    Full Text Available The ability of today's robots to autonomously support humans in their daily activities is still limited. To improve this, predictive human-machine interfaces (HMIs) can be applied to better support future interaction between human and machine. To infer upcoming context-based behavior relevant brain states of the human have to be detected. This is achieved by brain reading (BR), a passive approach for single trial EEG analysis that makes use of supervised machine learning (ML) methods. In this work we propose that BR is able to detect concrete states of the interacting human. To support this, we show that BR detects patterns in the electroencephalogram (EEG) that can be related to event-related activity in the EEG like the P300, which are indicators of concrete states or brain processes like target recognition processes. Further, we improve the robustness and applicability of BR in application-oriented scenarios by identifying and combining most relevant training data for single trial classification and by applying classifier transfer. We show that training and testing, i.e., application of the classifier, can be carried out on different classes, if the samples of both classes miss a relevant pattern. Classifier transfer is important for the usage of BR in application scenarios, where only small amounts of training examples are available. Finally, we demonstrate a dual BR application in an experimental setup that requires similar behavior as performed during the teleoperation of a robotic arm. Here, target recognition processes and movement preparation processes are detected simultaneously. In summary, our findings contribute to the development of robust and stable predictive HMIs that enable the simultaneous support of different interaction behaviors.

  2. Machine Learning-Empowered Biometric Methods for Biomedicine Applications

    Directory of Open Access Journals (Sweden)

    Qingxue Zhang

    2017-07-01

    Full Text Available Nowadays, pervasive computing technologies are paving a promising way for advanced smart health applications. However, a key impediment faced by wide deployment of these assistive smart devices is the increasing privacy and security issue, such as how to protect access to sensitive patient data in the health record. Focusing on this challenge, biometrics are attracting intense attention in terms of effective user identification to enable confidential health applications. In this paper, we take special interest in two bio-potential-based biometric modalities, electrocardiogram (ECG) and electroencephalogram (EEG), considering that they are both unique to individuals, and more reliable than token-based (identity card) and knowledge-based (username/password) methods. After extracting effective features in multiple domains from ECG/EEG signals, several advanced machine learning algorithms are introduced to perform the user identification task, including Neural Network, K-nearest Neighbor, Bagging, Random Forest and AdaBoost. Experimental results on two public ECG and EEG datasets show that ECG is a more robust biometric modality than EEG, leveraging a higher signal-to-noise ratio and more distinguishable morphological patterns. Among the different machine learning classifiers, the random forest greatly outperforms the others, with an identification rate as high as 98%. This study is expected to demonstrate that a properly selected biometric modality empowered by an effective machine learner has great potential to enable confidential biomedicine applications in the era of smart digital health.

  3. Feature Selection Methods for Robust Decoding of Finger Movements in a Non-human Primate

    Science.gov (United States)

    Padmanaban, Subash; Baker, Justin; Greger, Bradley

    2018-01-01

    Objective: The performance of machine learning algorithms used for neural decoding of dexterous tasks may be impeded due to problems arising when dealing with high-dimensional data. The objective of feature selection algorithms is to choose a near-optimal subset of features from the original feature space to improve the performance of the decoding algorithm. The aim of our study was to compare the effects of four feature selection techniques, Wilcoxon signed-rank test, Relative Importance, Principal Component Analysis (PCA), and Mutual Information Maximization on SVM classification performance for a dexterous decoding task. Approach: A nonhuman primate (NHP) was trained to perform small coordinated movements—similar to typing. An array of microelectrodes was implanted in the hand area of the motor cortex of the NHP and used to record action potentials (AP) during finger movements. A Support Vector Machine (SVM) was used to classify which finger movement the NHP was making based upon AP firing rates. We used the SVM classification to examine the functional parameters of (i) robustness to simulated failure and (ii) longevity of classification. We also compared the effect of using isolated-neuron and multi-unit firing rates as the feature vector supplied to the SVM. Main results: The average decoding accuracy for multi-unit features and single-unit features using Mutual Information Maximization (MIM) across 47 sessions was 96.74 ± 3.5% and 97.65 ± 3.36% respectively. The reduction in decoding accuracy between using 100% of the features and 10% of features based on MIM was 45.56% (from 93.7 to 51.09%) and 4.75% (from 95.32 to 90.79%) for multi-unit and single-unit features respectively. MIM had best performance compared to other feature selection methods. Significance: These results suggest improved decoding performance can be achieved by using optimally selected features. The results based on clinically relevant performance metrics also suggest that the decoding
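
    The best-performing selection criterion above, Mutual Information Maximization, ranks features by their mutual information with the class labels. A histogram-based sketch (synthetic "firing rate" features standing in for the recorded units; the binning and data are illustrative assumptions):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """MI between a continuous feature x and discrete labels y
    (simple histogram estimate)."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    mi = 0.0
    for xv in np.unique(xd):
        for yv in np.unique(y):
            pxy = np.mean((xd == xv) & (y == yv))
            px, py = np.mean(xd == xv), np.mean(y == yv)
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * py))   # sum p(x,y) log p(x,y)/(p(x)p(y))
    return mi

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 500)                    # two "finger movement" classes
informative = y + rng.normal(0, 0.4, 500)      # firing rate tracking the class
noise = rng.normal(0, 1, 500)                  # irrelevant unit
scores = [mutual_information(f, y) for f in (informative, noise)]
print(scores[0] > scores[1])                   # MIM ranks the informative unit first
```

    MIM-style selection then keeps the top-k features by this score before training the SVM.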

  4. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration conditions.

  5. Robust multivariate analysis

    CERN Document Server

    J Olive, David

    2017-01-01

    This text presents methods that are robust to the assumption of a multivariate normal distribution or robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large-sample theory is given. The text develops some of the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...

  6. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12368 injected planets against 27496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.

  7. Evolution of the ATLAS Metadata Interface (AMI)

    CERN Document Server

    Odier, Jerome; The ATLAS collaboration; Fulachier, Jerome; Lambert, Fabian

    2015-01-01

    The ATLAS Metadata Interface (AMI) can be considered a mature application because it has existed for at least 10 years. Over the years, the number of users and the number of functions provided for these users have increased. It has been necessary to adapt the hardware infrastructure in a seamless way so that the Quality of Service remains high. We describe the evolution of the application from the initial version, using a single server with a MySQL backend database, to the current state, where we use a cluster of Virtual Machines on the French Tier 1 Cloud at Lyon, an ORACLE database backend also at Lyon, with replication to CERN using ORACLE Streams behind a back-up server.

  8. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electric machines, ranging from generators to motors, the motor as a power source of the machine tool, and electrical devices for machine tools such as main-circuit switches, automatic machines, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, covering the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions according to the fault diagnosis and the diagnostic method with voltage and resistance measurement by tester.

  9. Prediction of Hydrocarbon Reservoirs Permeability Using Support Vector Machine

    Directory of Open Access Journals (Sweden)

    R. Gholami

    2012-01-01

    Full Text Available Permeability is a key parameter associated with the characterization of any hydrocarbon reservoir. In fact, it is not possible to obtain accurate solutions to many petroleum engineering problems without an accurate permeability value. The conventional methods for permeability determination are core analysis and well test techniques. These methods are very expensive and time consuming. Therefore, attempts have usually been made to use artificial neural networks to identify the relationship between well log data and core permeability. In this context, recent works on artificial intelligence techniques have led to the introduction of a robust machine learning methodology called the support vector machine (SVM). This paper aims to utilize the SVM for predicting the permeability of three gas wells in the Southern Pars field. The results of the SVM showed that the correlation coefficient between core and predicted permeability is 0.97 for the testing dataset. Comparing the result of the SVM with that of a general regression neural network (GRNN) revealed that the SVM approach is faster and more accurate than the GRNN in the prediction of hydrocarbon reservoir permeability.

  10. Machine medical ethics

    CERN Document Server

    Pontier, Matthijs

    2015-01-01

    The essays in this book, written by researchers from both humanities and sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine in medical settings. Medical machines are in close proximity with human beings, and getting closer: with patients who are in vulnerable states of health, who have disabilities of various kinds, with the very young or very old, and with medical professionals. In such contexts, machines are undertaking important medical tasks that require emotional sensitivity, knowledge of medical codes, human dignity, and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines to be modeled? Is a capacity for e...

  11. Humanizing machines: Anthropomorphization of slot machines increases gambling.

    Science.gov (United States)

    Riva, Paolo; Sacchi, Simona; Brambilla, Marco

    2015-12-01

    Do people gamble more on slot machines if they think that they are playing against humanlike minds rather than mathematical algorithms? Research has shown that people have a strong cognitive tendency to imbue humanlike mental states to nonhuman entities (i.e., anthropomorphism). The present research tested whether anthropomorphizing slot machines would increase gambling. Four studies manipulated slot machine anthropomorphization and found that exposing people to an anthropomorphized description of a slot machine increased gambling behavior and reduced gambling outcomes. Such findings emerged using tasks that focused on gambling behavior (Studies 1 to 3) as well as in experimental paradigms that included gambling outcomes (Studies 2 to 4). We found that gambling outcomes decrease because participants primed with the anthropomorphic slot machine gambled more (Study 4). Furthermore, we found that high-arousal positive emotions (e.g., feeling excited) played a role in the effect of anthropomorphism on gambling behavior (Studies 3 and 4). Our research indicates that the psychological process of gambling-machine anthropomorphism can be advantageous for the gaming industry; however, this may come at great expense for gamblers' (and their families') economic resources and psychological well-being.

  12. Robustness of Structures

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Vrouwenvelder, A.C.W.M.; Sørensen, John Dalsgaard

    2011-01-01

    In 2005, the Joint Committee on Structural Safety (JCSS) together with Working Commission (WC) 1 of the International Association of Bridge and Structural Engineering (IABSE) organized a workshop on robustness of structures. Two important decisions resulted from this workshop, namely the development of a joint European project on structural robustness under the COST (European Cooperation in Science and Technology) programme and the decision to develop a more elaborate document on structural robustness in collaboration between experts from the JCSS and the IABSE. Accordingly, a project titled ‘COST TU0601: Robustness of Structures’ was initiated in February 2007, aiming to provide a platform for exchanging and promoting research in the area of structural robustness and to provide a basic framework, together with methods, strategies and guidelines enhancing robustness of structures...

  13. Precise on-machine extraction of the surface normal vector using an eddy current sensor array

    International Nuclear Information System (INIS)

    Wang, Yongqing; Lian, Meng; Liu, Haibo; Ying, Yangwei; Sheng, Xianjun

    2016-01-01

    To satisfy the requirements of on-machine measurement of the surface normal during complex surface manufacturing, a highly robust normal vector extraction method using an Eddy current (EC) displacement sensor array is developed, the output of which is almost unaffected by surface brightness, machining coolant and environmental noise. A precise normal vector extraction model based on a triangular-distributed EC sensor array is first established. Calibration of the effects of object surface inclination and coupling interference on measurement results, and the relative position of EC sensors, is involved. A novel apparatus employing three EC sensors and a force transducer was designed, which can be easily integrated into the computer numerical control (CNC) machine tool spindle and/or robot terminal execution. Finally, to test the validity and practicability of the proposed method, typical experiments were conducted with specified testing pieces using the developed approach and system, such as an inclined plane and cylindrical and spherical surfaces. (paper)
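The geometry behind a triangular three-sensor array can be sketched in a few lines: each eddy current sensor reports a standoff distance along the probe axis, and the plane through the three measured surface points yields the normal. The layout coordinates and readings below are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical triangular sensor layout (mm): three eddy current sensors
# at the vertices of an equilateral triangle in the probe plane.
sensor_xy = np.array([[0.0, 10.0], [-8.66, -5.0], [8.66, -5.0]])

def surface_normal(distances):
    """Estimate the unit surface normal from three standoff distances.

    Each sensor measures the gap to the surface along the probe axis (z),
    so the surface passes through (x_i, y_i, -d_i). The normal of the
    plane through those three points is the cross product of two edges.
    """
    p = np.column_stack([sensor_xy, -np.asarray(distances, float)])
    n = np.cross(p[1] - p[0], p[2] - p[0])
    n /= np.linalg.norm(n)
    return n if n[2] > 0 else -n  # orient the normal toward the probe

# Equal gaps -> surface parallel to the probe plane, normal along +z.
print(surface_normal([5.0, 5.0, 5.0]))  # → [0. 0. 1.]
```

Calibration in the paper corrects the raw sensor readings before this geometric step; here the distances are taken as already calibrated.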

  14. Precise on-machine extraction of the surface normal vector using an eddy current sensor array

    Science.gov (United States)

    Wang, Yongqing; Lian, Meng; Liu, Haibo; Ying, Yangwei; Sheng, Xianjun

    2016-11-01

    To satisfy the requirements of on-machine measurement of the surface normal during complex surface manufacturing, a highly robust normal vector extraction method using an Eddy current (EC) displacement sensor array is developed, the output of which is almost unaffected by surface brightness, machining coolant and environmental noise. A precise normal vector extraction model based on a triangular-distributed EC sensor array is first established. Calibration of the effects of object surface inclination and coupling interference on measurement results, and the relative position of EC sensors, is involved. A novel apparatus employing three EC sensors and a force transducer was designed, which can be easily integrated into the computer numerical control (CNC) machine tool spindle and/or robot terminal execution. Finally, to test the validity and practicability of the proposed method, typical experiments were conducted with specified testing pieces using the developed approach and system, such as an inclined plane and cylindrical and spherical surfaces.

  15. Code-expanded radio access protocol for machine-to-machine communications

    DEFF Research Database (Denmark)

    Thomsen, Henning; Kiilerich Pratas, Nuno; Stefanovic, Cedomir

    2013-01-01

    The random access methods used for support of machine-to-machine, also referred to as Machine-Type Communications, in current cellular standards are derivatives of traditional framed slotted ALOHA and therefore do not support high user loads efficiently. We propose an approach that is motivated by the random access method employed in LTE, which significantly increases the amount of contention resources without increasing the system resources, such as contention subframes and preambles. This is accomplished by a logical, rather than physical, extension of the access method in which the available system ... subframes and orthogonal preambles, the amount of available contention resources is drastically increased, enabling the massive support of Machine-Type Communication users that is beyond the reach of current systems.
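The "logical extension" idea can be illustrated with a toy count of contention resources. The formula below is a plausible illustration (codewords formed by choosing a preamble, or staying idle, in each of K subframes), not necessarily the exact scheme of the paper; the figure of 54 contention preambles is an LTE-like assumption:

```python
# Illustrative counting (assumed, not the paper's exact formula): with M
# orthogonal preambles and K contention subframes, a conventional scheme
# offers M single-preamble choices per subframe, i.e. M * K in total.
# A code-expanded scheme lets a user signal a *codeword*: one preamble
# (or idle) in each of the K subframes, giving (M + 1)**K - 1 non-empty
# codewords from the same physical resources.

def conventional_resources(M, K):
    return M * K

def code_expanded_resources(M, K):
    return (M + 1) ** K - 1

for K in (1, 2, 3):
    print(K, conventional_resources(54, K), code_expanded_resources(54, K))
```

Even K = 2 already multiplies the contention space by more than an order of magnitude, which is the point of the logical (combinatorial) expansion.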

  16. A naive Bayes model for robust remaining useful life prediction of lithium-ion battery

    International Nuclear Information System (INIS)

    Ng, Selina S.Y.; Xing, Yinjiao; Tsui, Kwok L.

    2014-01-01

    Highlights: • Robustness of RUL predictions for lithium-ion batteries is analyzed quantitatively. • RUL predictions of the same battery over cycle life are evaluated. • RUL predictions of batteries over different operating conditions are evaluated. • Naive Bayes (NB) is proposed for predictions under constant discharge environments. • Its robustness and accuracy are compared with that of support vector machine (SVM). - Abstract: Online state-of-health (SoH) estimation and remaining useful life (RUL) prediction is a critical problem in battery health management. This paper studies the modeling of battery degradation under different usage conditions and ambient temperatures, which is seldom considered in the literature. Li-ion battery RUL prediction under constant operating conditions at different values of ambient temperature and discharge current are considered. A naive Bayes (NB) model is proposed for RUL prediction of batteries under different operating conditions. It is shown in this analysis that under constant discharge environments, the RUL of Li-ion batteries can be predicted with the NB method, irrespective of the exact values of the operating conditions. The case study shows that the NB generates stable and competitive prediction performance over that of the support vector machine (SVM). This also suggests that, while it is well known that the environmental conditions have a big impact on the degradation trend, it is the changes in operating conditions of a Li-ion battery over cycle life that make Li-ion battery degradation and RUL prediction even more difficult.
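A minimal Gaussian naive Bayes classifier shows the kind of model the paper builds on; the per-cycle battery features below (normalized capacity, internal resistance) are synthetic stand-ins, not measured data:

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian naive Bayes: features assumed independent per class."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.logprior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # log p(c|x) is proportional to log p(c) + sum_j log N(x_j; mu_cj, var_cj)
        ll = -0.5 * (np.log(2 * np.pi * self.var)[None, :, :]
                     + (X[:, None, :] - self.mu[None, :, :]) ** 2
                     / self.var[None, :, :]).sum(axis=-1)
        return self.classes[np.argmax(self.logprior + ll, axis=1)]

# Synthetic "healthy" vs "near end-of-life" cycles (invented distributions).
rng = np.random.default_rng(3)
healthy = rng.normal([0.95, 1.0], 0.02, size=(100, 2))
aged = rng.normal([0.78, 1.4], 0.02, size=(100, 2))
X = np.vstack([healthy, aged])
y = np.array([0] * 100 + [1] * 100)
model = GaussianNB().fit(X, y)
print((model.predict(X) == y).mean())
```

The independence assumption is what keeps the model stable across operating conditions: each feature contributes a separate one-dimensional likelihood.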

  17. A predictive model of chemical flooding for enhanced oil recovery purposes: Application of least square support vector machine

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Ahmadi

    2016-06-01

    Full Text Available Applying chemical flooding in petroleum reservoirs has become an interesting subject of recent research. Development strategies for the aforementioned method are more robust and precise when they consider both economic (net present value (NPV)) and technical (recovery factor (RF)) points of view. In the present study, considerable effort is made to propose a predictive model for specifying the efficiency of chemical flooding in oil reservoirs. To reach this goal, the type of support vector machine method evolved by Suykens and Vandewalle, the least squares support vector machine (LSSVM), was employed. Also, highly precise chemical flooding data banks reported in previous works were employed to test and validate the proposed model. According to the mean square error (MSE), correlation coefficient and average absolute relative deviation, the suggested LSSVM model has acceptable reliability, integrity and robustness. Thus, the proposed intelligent model can be considered as an alternative model to monitor the efficiency of chemical flooding in oil reservoirs when the required experimental data are not available or accessible.
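The LSSVM of Suykens and Vandewalle replaces the SVM quadratic program with a single linear system, which is easy to sketch in NumPy. The toy response surface below is a stand-in; no reservoir data are used:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """LSSVM regression (after Suykens & Vandewalle): instead of a QP,
    solve the linear KKT system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf(X_new, X_train, sigma) @ alpha + b

# Toy stand-in for an efficiency response curve (not chemical-flooding data).
X = np.linspace(0.0, 3.0, 40)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, b, alpha, X)
print(round(float(np.abs(pred - y).max()), 3))
```

The regularization parameter gamma trades training fit against smoothness; the training residual of an LSSVM is exactly alpha_i / gamma at each sample.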

  18. Non-linear HVAC computations using least square support vector machines

    International Nuclear Information System (INIS)

    Kumar, Mahendra; Kar, I.N.

    2009-01-01

    This paper aims to demonstrate application of least square support vector machines (LS-SVM) to model two complex heating, ventilating and air-conditioning (HVAC) relationships. The two applications considered are the estimation of the predicted mean vote (PMV) for thermal comfort and the generation of psychrometric chart. LS-SVM has the potential for quick, exact representations and also possesses a structure that facilitates hardware implementation. The results show very good agreement between function values computed from conventional model and LS-SVM model in real time. The robustness of LS-SVM models against input noises has also been analyzed.

  19. Phase transitions in restricted Boltzmann machines with generic priors

    Science.gov (United States)

    Barra, Adriano; Genovese, Giuseppe; Sollich, Peter; Tantari, Daniele

    2017-10-01

    We study generalized restricted Boltzmann machines with generic priors for units and weights, interpolating between Boolean and Gaussian variables. We present a complete analysis of the replica symmetric phase diagram of these systems, which can be regarded as generalized Hopfield models. We underline the role of the retrieval phase for both inference and learning processes and we show that retrieval is robust for a large class of weight and unit priors, beyond the standard Hopfield scenario. Furthermore, we show how the paramagnetic phase boundary is directly related to the optimal size of the training set necessary for good generalization in a teacher-student scenario of unsupervised learning.

  20. Simple machines

    CERN Document Server

    Graybill, George

    2007-01-01

    Just how simple are simple machines? With our ready-to-use resource, they are simple to teach and easy to learn! Chock full of information and activities, we begin with a look at force, motion and work, and examples of simple machines in daily life are given. With this background, we move on to different kinds of simple machines including: Levers, Inclined Planes, Wedges, Screws, Pulleys, and Wheels and Axles. An exploration of some compound machines follows, such as the can opener. Our resource is a real time-saver as all the reading passages and student activities are provided. Presented in s

  1. Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets.

    Directory of Open Access Journals (Sweden)

    Igor Shuryak

    Full Text Available The ecological effects of accidental or malicious radioactive contamination are insufficiently understood because of the hazards and difficulties associated with conducting studies in radioactively-polluted areas. Data sets from severely contaminated locations can therefore be small. Moreover, many potentially important factors, such as soil concentrations of toxic chemicals, pH, and temperature, can be correlated with radiation levels and with each other. In such situations, commonly-used statistical techniques like generalized linear models (GLMs) may not be able to provide useful information about how radiation and/or these other variables affect the outcome (e.g. abundance of the studied organisms). Ensemble machine learning methods such as random forests offer powerful alternatives. We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA). We show that the proposed
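Technique (1), synthetic noise benchmarks, can be sketched without any ML library by scoring predictors against shuffled copies of themselves. The radioecological variables below are simulated, and plain absolute correlation stands in for a random-forest importance score:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
radiation = rng.normal(size=n)
ph = rng.normal(size=n)                       # irrelevant here by construction
abundance = 2.0 * radiation + rng.normal(scale=0.5, size=n)

# Technique (1): append synthetic noise variables -- shuffled copies of the
# real predictors -- as benchmarks for the importance scores.
predictors = {"radiation": radiation, "pH": ph,
              "noise_1": rng.permutation(radiation),
              "noise_2": rng.permutation(ph)}

scores = {k: abs(np.corrcoef(v, abundance)[0, 1]) for k, v in predictors.items()}
noise_ceiling = max(scores["noise_1"], scores["noise_2"])
for name in ("radiation", "pH"):
    verdict = ("above noise benchmark" if scores[name] > noise_ceiling
               else "indistinguishable from noise")
    print(f"{name}: score={scores[name]:.2f} ({verdict})")
```

A predictor whose score does not clear the noise ceiling is treated as uninformative, which is exactly the benchmark role the synthetic variables play in the paper's random-forest analyses.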

  2. Information extraction from dynamic PS-InSAR time series using machine learning

    Science.gov (United States)

    van de Kerkhof, B.; Pankratius, V.; Chang, L.; van Swol, R.; Hanssen, R. F.

    2017-12-01

    Due to the increasing number of SAR satellites, with shorter repeat intervals and higher resolutions, SAR data volumes are exploding. Time series analyses of SAR data, i.e. Persistent Scatterer (PS) InSAR, enable the deformation monitoring of the built environment at an unprecedented scale, with hundreds of scatterers per km2, updated weekly. Potential hazards, e.g. due to failure of aging infrastructure, can be detected at an early stage. Yet, this requires the operational data processing of billions of measurement points, over hundreds of epochs, updating this data set dynamically as new data come in, and testing whether points (start to) behave in an anomalous way. Moreover, the quality of PS-InSAR measurements is ambiguous and heterogeneous, which will yield false positives and false negatives. Such analyses are numerically challenging. Here we extract relevant information from PS-InSAR time series using machine learning algorithms. We cluster (group together) time series with similar behaviour, even though they may not be spatially close, such that the results can be used for further analysis. First we reduce the dimensionality of the dataset in order to be able to cluster the data, since applying clustering techniques on high dimensional datasets often result in unsatisfying results. Our approach is to apply t-distributed Stochastic Neighbor Embedding (t-SNE), a machine learning algorithm for dimensionality reduction of high-dimensional data to a 2D or 3D map, and cluster this result using Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The results show that we are able to detect and cluster time series with similar behaviour, which is the starting point for more extensive analysis into the underlying driving mechanisms. The results of the methods are compared to conventional hypothesis testing as well as a Self-Organising Map (SOM) approach. Hypothesis testing is robust and takes the stochastic nature of the observations into account
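The clustering stage can be sketched with a minimal pure-NumPy DBSCAN; the 2-D points below stand in for an already-computed t-SNE embedding of the PS-InSAR time series:

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN: grow clusters from core points; label -1 marks noise."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(len(X))]
    labels = np.full(len(X), -1)
    cluster = 0
    for i in range(len(X)):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                          # already assigned, or not a core
        stack = [i]
        while stack:                          # flood-fill the density-reachable set
            j = stack.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster
            if len(neighbors[j]) >= min_pts:  # only core points expand the cluster
                stack.extend(neighbors[j])
        cluster += 1
    return labels

# Two tight groups of embedded time series plus one anomalous scatterer.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1],
              [10.0, 10.0], [10.1, 10.0], [10.0, 10.1], [10.1, 10.1],
              [50.0, 50.0]])
labels = dbscan(X, eps=0.5, min_pts=3)
print(labels)
```

Points labeled -1 are exactly the "anomalously behaving" series that warrant further inspection, while each nonnegative label groups series with similar temporal behaviour.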

  3. Machine performance assessment and enhancement for a hexapod machine

    Energy Technology Data Exchange (ETDEWEB)

    Mou, J.I. [Arizona State Univ., Tempe, AZ (United States); King, C. [Sandia National Labs., Livermore, CA (United States). Integrated Manufacturing Systems Center

    1998-03-19

    The focus of this study is to develop a sensor fused process modeling and control methodology to model, assess, and then enhance the performance of a hexapod machine for precision product realization. Deterministic modeling technique was used to derive models for machine performance assessment and enhancement. Sensor fusion methodology was adopted to identify the parameters of the derived models. Empirical models and computational algorithms were also derived and implemented to model, assess, and then enhance the machine performance. The developed sensor fusion algorithms can be implemented on a PC-based open architecture controller to receive information from various sensors, assess the status of the process, determine the proper action, and deliver the command to actuators for task execution. This will enhance a hexapod machine's capability to produce workpieces within the imposed dimensional tolerances.

  4. MO-FG-CAMPUS-TeP3-04: Deliverable Robust Optimization in IMPT Using Quadratic Objective Function

    Energy Technology Data Exchange (ETDEWEB)

    Shan, J; Liu, W; Bues, M; Schild, S [Mayo Clinic Arizona, Phoenix, AZ (United States)

    2016-06-15

    Purpose: To find and evaluate the way of applying deliverable MU constraints into robust spot intensity optimization in Intensity-Modulated Proton Therapy (IMPT) to prevent plan quality and robustness from degrading due to machine deliverable MU-constraints. Methods: Currently, the influence of the deliverable MU-constraints is retrospectively evaluated by post-processing immediately following optimization. In this study, we propose a new method based on the quasi-Newton-like L-BFGS-B algorithm with which we turn deliverable MU-constraints on and off alternately during optimization. Seven patients with two different machine settings (small and large spot size) were planned with both conventional and new methods. For each patient, three kinds of plans were generated — conventional non-deliverable plan (plan A), conventional deliverable plan with post-processing (plan B), and new deliverable plan (plan C). We performed this study with both realistic (small) and artificial (large) deliverable MU-constraints. Results: With small minimum MU-constraints considered, the new method achieved a slightly better plan quality than the conventional method (D95% CTV normalized to the prescription dose: 0.994[0.992∼0.996] (Plan C) vs 0.992[0.986∼0.996] (Plan B)). With large minimum MU-constraints considered, results show that the new method maintains plan quality while plan quality from the conventional method is degraded greatly (D95% CTV normalized to the prescription dose: 0.987[0.978∼0.994] (Plan C) vs 0.797[0.641∼1.000] (Plan B)). Meanwhile, the plan robustness of these two methods' results is comparable. (For all 7 patients, CTV DVH band gap at D95% normalized to the prescription dose: 0.015[0.005∼0.043] (Plan C) vs 0.012[0.006∼0.038] (Plan B) with small MU-constraints and 0.019[0.009∼0.039] (Plan C) vs 0.030[0.015∼0.041] (Plan B) with large MU-constraints.) Conclusion: A positive correlation has been found between plan quality degeneration and magnitude of

  5. MO-FG-CAMPUS-TeP3-04: Deliverable Robust Optimization in IMPT Using Quadratic Objective Function

    International Nuclear Information System (INIS)

    Shan, J; Liu, W; Bues, M; Schild, S

    2016-01-01

    Purpose: To find and evaluate the way of applying deliverable MU constraints into robust spot intensity optimization in Intensity-Modulated Proton Therapy (IMPT) to prevent plan quality and robustness from degrading due to machine deliverable MU-constraints. Methods: Currently, the influence of the deliverable MU-constraints is retrospectively evaluated by post-processing immediately following optimization. In this study, we propose a new method based on the quasi-Newton-like L-BFGS-B algorithm with which we turn deliverable MU-constraints on and off alternately during optimization. Seven patients with two different machine settings (small and large spot size) were planned with both conventional and new methods. For each patient, three kinds of plans were generated — conventional non-deliverable plan (plan A), conventional deliverable plan with post-processing (plan B), and new deliverable plan (plan C). We performed this study with both realistic (small) and artificial (large) deliverable MU-constraints. Results: With small minimum MU-constraints considered, the new method achieved a slightly better plan quality than the conventional method (D95% CTV normalized to the prescription dose: 0.994[0.992∼0.996] (Plan C) vs 0.992[0.986∼0.996] (Plan B)). With large minimum MU-constraints considered, results show that the new method maintains plan quality while plan quality from the conventional method is degraded greatly (D95% CTV normalized to the prescription dose: 0.987[0.978∼0.994] (Plan C) vs 0.797[0.641∼1.000] (Plan B)). Meanwhile, the plan robustness of these two methods' results is comparable. (For all 7 patients, CTV DVH band gap at D95% normalized to the prescription dose: 0.015[0.005∼0.043] (Plan C) vs 0.012[0.006∼0.038] (Plan B) with small MU-constraints and 0.019[0.009∼0.039] (Plan C) vs 0.030[0.015∼0.041] (Plan B) with large MU-constraints.) Conclusion: A positive correlation has been found between plan quality degeneration and magnitude of

  6. Real Time Robot Soccer Game Event Detection Using Finite State Machines with Multiple Fuzzy Logic Probability Evaluators

    Directory of Open Access Journals (Sweden)

    Elmer P. Dadios

    2009-01-01

    Full Text Available This paper presents a new algorithm for real time event detection using Finite State Machines with multiple Fuzzy Logic Probability Evaluators (FLPEs). A machine referee for a robot soccer game is developed and is used as the platform to test the proposed algorithm. A novel technique to detect collisions and other events in a microrobot soccer game under inaccurate and insufficient information is presented. The robots' collision is used to determine goalkeeper charging and goal score events which are crucial for the machine referee's decisions. The Main State Machine (MSM) handles the schedule of event activation. The FLPE calculates the probabilities of the true occurrence of the events. Final decisions about the occurrences of events are evaluated and compared through threshold crisp probability values. The outputs of FLPEs can be combined to calculate the probability of an event composed of subevents. Using multiple fuzzy logic systems, the FLPE utilizes a minimal number of rules and can be tuned individually. Experimental results show the accuracy and robustness of the proposed algorithm.
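A collision FLPE feeding a small state machine might look like the sketch below; the membership functions, thresholds, and sensor quantities are invented for illustration and are not the paper's actual rule base:

```python
def collision_probability(gap_mm, delta_speed):
    """Hypothetical FLPE: fuzzy memberships for 'robots are close' and
    'speed changed abruptly', AND-combined with min into a pseudo-probability."""
    near = max(0.0, 1.0 - gap_mm / 50.0)                    # 1 when touching
    jolt = min(1.0, max(0.0, (delta_speed - 5.0) / 25.0))   # ramps up past 5
    return min(near, jolt)

class EventFSM:
    """Sketch of the main state machine (MSM): fires a collision event when
    the evaluator's probability crosses a crisp threshold, then rearms."""
    def __init__(self, threshold=0.6):
        self.state = "MONITORING"
        self.threshold = threshold
        self.events = []

    def step(self, gap_mm, delta_speed):
        p = collision_probability(gap_mm, delta_speed)
        if self.state == "MONITORING" and p > self.threshold:
            self.state = "COLLISION"
            self.events.append(round(p, 2))
        elif self.state == "COLLISION" and p < self.threshold / 2:
            self.state = "MONITORING"       # robots separated; rearm
        return self.state

fsm = EventFSM()
print(fsm.step(40.0, 2.0))   # far apart, gentle motion → MONITORING
print(fsm.step(2.0, 28.0))   # touching, abrupt speed change → COLLISION
```

Composite events (e.g. goalkeeper charging) would combine several such FLPE outputs before the crisp threshold comparison, as the abstract describes.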

  7. Robust Fully Distributed Minibatch Gradient Descent with Privacy Preservation

    Directory of Open Access Journals (Sweden)

    Gábor Danner

    2018-01-01

    Full Text Available Privacy and security are among the highest priorities in data mining approaches over data collected from mobile devices. Fully distributed machine learning is a promising direction in this context. However, it is a hard problem to design protocols that are efficient yet provide sufficient levels of privacy and security. In fully distributed environments, secure multiparty computation (MPC) is often applied to solve these problems. However, in our dynamic and unreliable application domain, known MPC algorithms are not scalable or not robust enough. We propose a light-weight protocol to quickly and securely compute the sum query over a subset of participants assuming a semihonest adversary. During the computation the participants learn no individual values. We apply this protocol to efficiently calculate the sum of gradients as part of a fully distributed minibatch stochastic gradient descent algorithm. The protocol achieves scalability and robustness by exploiting the fact that in this application domain a “quick and dirty” sum computation is acceptable. We utilize the Paillier homomorphic cryptosystem as part of our solution combined with extreme lossy gradient compression to make the cost of the cryptographic algorithms affordable. We demonstrate both theoretically and experimentally, based on churn statistics from a real smartphone trace, that the protocol is indeed practically viable.
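The additive homomorphism of Paillier, the property the sum query relies on, can be demonstrated with a toy keypair. The tiny primes are for illustration only (real deployments use 2048-bit moduli or larger), and the snippet assumes Python 3.9+ for `math.lcm` and the modular-inverse form of `pow`:

```python
import math
import random

# Toy Paillier keypair: n = p*q, generator g = n + 1.
p, q = 1009, 1013
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)              # valid because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n; plaintext = L(c^lambda mod n^2) * mu mod n.
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Additive homomorphism: multiplying ciphertexts sums the plaintexts, so an
# aggregator can total the participants' (gradient) shares without seeing them.
shares = [12, 7, 30, 1]
total_ct = 1
for s in shares:
    total_ct = (total_ct * encrypt(s)) % n2
print(decrypt(total_ct))  # → 50
```

In the protocol, each participant contributes an encrypted share and only the final aggregate is ever decrypted, which is why no individual value is learned.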

  8. Minimum Time Trajectory Optimization of CNC Machining with Tracking Error Constraints

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2014-01-01

    Full Text Available An off-line optimization approach for high-precision minimum-time feedrate in CNC machining is proposed. Besides the commonly considered velocity, acceleration, and jerk constraints, a dynamic performance constraint for each servo drive is also considered in this optimization problem, to improve tracking precision along the optimized feedrate trajectory. Tracking error is used to indicate the servo dynamic performance of each axis. By using variable substitution, the tracking-error-constrained minimum-time trajectory planning problem is formulated as a nonlinear path-constrained optimal control problem. The bang-bang structure of the constraints along the optimal trajectory is proved in this paper; a novel constraint handling method is then proposed to realize a convex-optimization-based solution of the nonlinear constrained optimal control problem. A simple ellipse feedrate planning test is presented to demonstrate the effectiveness of the approach. The practicability and robustness of the trajectory generated by the proposed approach are then demonstrated by a butterfly contour machining example.
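The bang-bang character of a minimum-time feedrate can be illustrated with the classic forward/backward sweep over a discretized path (a simpler stand-in for the paper's convex formulation); the path and axis limits below are made up:

```python
import numpy as np

def minimum_time_profile(ds, v_max, a_max, v_start=0.0, v_end=0.0):
    """Bang-bang feedrate along a discretized path: two sweeps enforce the
    acceleration limit (v'^2 = v^2 + 2*a*ds) forward and backward, and the
    pointwise minimum with v_max yields the minimum-time velocity profile."""
    n = len(ds) + 1
    v = np.full(n, float(v_max))
    v[0], v[-1] = v_start, v_end
    for i in range(n - 1):                    # forward sweep: acceleration limit
        v[i + 1] = min(v[i + 1], np.sqrt(v[i] ** 2 + 2.0 * a_max * ds[i]))
    for i in range(n - 2, -1, -1):            # backward sweep: braking limit
        v[i] = min(v[i], np.sqrt(v[i + 1] ** 2 + 2.0 * a_max * ds[i]))
    return v

# 1 mm segments along a 100 mm path; hypothetical feed and acceleration caps.
v = minimum_time_profile(np.full(100, 1.0), v_max=5.0, a_max=2.0)
print(round(float(v.max()), 2))  # → 5.0
```

At every point the result rides one of the active limits (full acceleration, full braking, or the feed cap), which is exactly the bang-bang structure the paper proves; a tracking-error constraint would add a further per-axis cap on the profile.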

  9. Automatic Detection of P and S Phases by Support Vector Machine

    Science.gov (United States)

    Jiang, Y.; Ning, J.; Bao, T.

    2017-12-01

    Many methods in seismology rely on accurately picked phases, so a well-performing automatic phase-picking program will assure the applicability of these methods. Earlier research mostly focused on finding characteristics that distinguish phases from noise, with limited success. We have developed a new method, based mainly on support vector machines, to detect P and S phases. We first input waveform pieces into the support vector machine, then employ it to work out a hyperplane that divides the space into two parts: noise and phase. We further use the same method to find a hyperplane that separates the phase space into P and S parts, based on the cross-correlation matrix of the three components. To further improve phase-detection performance, we also employ array data. Finally, we show that the overall performance of our method is robust on both synthetic and real data.
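A linear SVM separating "noise" from "phase" feature vectors can be sketched with the Pegasos sub-gradient method; the two window features and their distributions below are hypothetical, not the paper's actual attributes:

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=100, seed=0):
    """Linear SVM via Pegasos stochastic sub-gradient descent (a simple
    stand-in for the paper's SVM training). Labels y must be in {-1, +1};
    the bias is folded in through an appended constant feature."""
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            t += 1
            eta = 1.0 / (lam * t)
            hinge_active = y[i] * (w @ Xb[i]) < 1.0   # sample inside the margin
            w *= (1.0 - eta * lam)                    # shrink (regularization)
            if hinge_active:
                w += eta * y[i] * Xb[i]               # hinge-loss sub-gradient
    return w

# Hypothetical two-feature waveform windows: (amplitude ratio, energy).
rng = np.random.default_rng(42)
phase = rng.normal([3.0, 3.0], 0.5, size=(100, 2))    # windows containing a phase
noise = rng.normal([0.0, 0.0], 0.5, size=(100, 2))    # background-noise windows
X = np.vstack([phase, noise])
y = np.array([1] * 100 + [-1] * 100)
w = pegasos_svm(X, y)
pred = np.sign(np.hstack([X, np.ones((200, 1))]) @ w)
print((pred == y).mean())
```

The returned w defines the separating hyperplane the abstract describes; the P-vs-S stage would repeat the same training on cross-correlation features of the three components.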

  10. Superconducting rotating machines

    International Nuclear Information System (INIS)

    Smith, J.L. Jr.; Kirtley, J.L. Jr.; Thullen, P.

    1975-01-01

    The opportunities and limitations of the applications of superconductors in rotating electric machines are given. The relevant properties of superconductors and the fundamental requirements for rotating electric machines are discussed. The current state-of-the-art of superconducting machines is reviewed. Key problems, future developments and the long range potential of superconducting machines are assessed

  11. Automated detection of microaneurysms using robust blob descriptors

    Science.gov (United States)

    Adal, K.; Ali, S.; Sidibé, D.; Karnowski, T.; Chaum, E.; Mériaudeau, F.

    2013-03-01

    Microaneurysms (MAs) are among the first signs of diabetic retinopathy (DR) that can be seen as round dark-red structures in digital color fundus photographs of retina. In recent years, automated computer-aided detection and diagnosis (CAD) of MAs has attracted many researchers due to its low-cost and versatile nature. In this paper, the MA detection problem is modeled as finding interest points from a given image and several interest point descriptors are introduced and integrated with machine learning techniques to detect MAs. The proposed approach starts by applying a novel fundus image contrast enhancement technique using Singular Value Decomposition (SVD) of fundus images. Then, Hessian-based candidate selection algorithm is applied to extract image regions which are more likely to be MAs. For each candidate region, robust low-level blob descriptors such as Speeded Up Robust Features (SURF) and Intensity Normalized Radon Transform are extracted to characterize candidate MA regions. The combined features are then classified using SVM which has been trained using ten manually annotated training images. The performance of the overall system is evaluated on Retinopathy Online Challenge (ROC) competition database. Preliminary results show the competitiveness of the proposed candidate selection techniques against state-of-the art methods as well as the promising future for the proposed descriptors to be used in the localization of MAs from fundus images.

  12. Sustainable machining

    CERN Document Server

    2017-01-01

    This book provides an overview on current sustainable machining. Its chapters cover the concept in economic, social and environmental dimensions. It provides the reader with proper ways to handle several pollutants produced during the machining process. The book is useful on both undergraduate and postgraduate levels and it is of interest to all those working with manufacturing and machining technology.

  13. Exploring a potential energy surface by machine learning for characterizing atomic transport

    Science.gov (United States)

    Kanamori, Kenta; Toyoura, Kazuaki; Honda, Junya; Hattori, Kazuki; Seko, Atsuto; Karasuyama, Masayuki; Shitara, Kazuki; Shiga, Motoki; Kuwabara, Akihide; Takeuchi, Ichiro

    2018-03-01

    We propose a machine-learning method for evaluating the potential barrier governing atomic transport based on the preferential selection of dominant points for atomic transport. The proposed method generates numerous random samples of the entire potential energy surface (PES) from a probabilistic Gaussian process model of the PES, which enables defining the likelihood of the dominant points. The robustness and efficiency of the method are demonstrated on a dozen model cases for proton diffusion in oxides, in comparison with a conventional nudge elastic band method.
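The core idea, a probabilistic Gaussian process surrogate of the PES whose random samples define a likelihood over the barrier, can be sketched in one dimension; the geometries, energies, and kernel length scale below are invented:

```python
import numpy as np

def rbf(a, b, ell=0.4):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_posterior(x_obs, e_obs, x_grid, noise=1e-6):
    """GP posterior over a 1-D potential energy curve given a few evaluated
    geometries (a toy stand-in for the paper's probabilistic PES model)."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(x_grid, x_obs)
    mean = Ks @ np.linalg.solve(K, e_obs)
    cov = rbf(x_grid, x_grid) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, (cov + cov.T) / 2.0          # symmetrize before sampling

# Made-up energies (eV) at three geometries along a migration path.
x_obs = np.array([0.0, 0.5, 1.0])
e_obs = np.array([0.0, 1.0, 0.1])
xg = np.linspace(0.0, 1.0, 101)
mean, cov = gp_posterior(x_obs, e_obs, xg)

# Numerous random PES samples define a distribution over the barrier height.
rng = np.random.default_rng(1)
samples = rng.multivariate_normal(mean, cov + 1e-9 * np.eye(len(xg)),
                                  size=200, check_valid="ignore")
barriers = samples.max(axis=1)
print(round(float(barriers.mean()), 2))
```

In the paper's scheme, the geometry whose energy most reduces the spread of this barrier distribution would be the next "dominant point" to evaluate.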

  14. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    International Nuclear Information System (INIS)

    McGowan, S E; Albertini, F; Lomax, A J; Thomas, S J

    2015-01-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. Plan robustness of 16 skull base IMPT plans to systematic range and random set-up errors have been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining some metrics used to define protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve the plan robustness was analysed. Using the ebDD it was found range errors had a smaller effect on dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient that may have benefited from a treatment of greater individuality. A new beam arrangement showed to be preferential when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to determine plans that, although delivering a dosimetrically adequate dose distribution, have resulted in sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve the plan robustness to set-up and range uncertainties. (paper)
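The error-bar dose distribution (ebDD) concept can be sketched as a per-voxel spread across error-scenario dose recomputations; the scenario doses and the 5% metric below are illustrative assumptions, not the paper's protocol values:

```python
import numpy as np

# Hypothetical doses (Gy) for one plan recomputed under error scenarios:
# rows = scenarios (nominal, +/- range error, set-up shifts), cols = voxels.
rng = np.random.default_rng(7)
nominal = np.clip(rng.normal(54.0, 1.0, size=500), 0.0, None)
scenarios = nominal + rng.normal(0.0, 0.8, size=(8, 500))

# Error-bar dose distribution: per-voxel spread across all scenarios.
eb = scenarios.max(axis=0) - scenarios.min(axis=0)

# A simple robustness metric in the spirit of the paper: fraction of voxels
# whose error bar exceeds 5% of the prescription dose.
prescription = 54.0
frac_not_robust = np.mean(eb > 0.05 * prescription)
print(f"median error bar: {np.median(eb):.2f} Gy; "
      f"voxels over 5% threshold: {frac_not_robust:.0%}")
```

Collecting such metrics over many historical plans is what builds the site-specific database: a new plan whose ebDD statistics fall outside the database norms is flagged as sub-optimally robust.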

  15. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    Science.gov (United States)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The plan robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to construct protocols that aid plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to range errors, whereas the target was more robust to set-up errors. A database was created to give planners robustness aims for these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed the identification of a specific patient who may have benefited from a more individualised treatment. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to identify plans that, although delivering a dosimetrically adequate dose distribution, exhibit sub-optimal robustness to these uncertainties. For such cases, the use of different beam start conditions may improve plan robustness to set-up and range uncertainties.
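    The error-bar dose distribution described above can be pictured, per voxel, as the spread of dose across recomputed error scenarios. The sketch below is an illustration of that idea only, not the authors' implementation; the array names and the half-width definition of the error bar are assumptions.

```python
import numpy as np

def error_bar_dose(nominal, perturbed):
    """Voxel-wise dose spread across error scenarios (a simple ebDD proxy).

    nominal  : (V,) nominal dose per voxel
    perturbed: (S, V) doses recomputed under S range/set-up error scenarios
    Returns the per-voxel half-width of the dose band, i.e. the 'error bar'.
    """
    lo = np.minimum(perturbed.min(axis=0), nominal)
    hi = np.maximum(perturbed.max(axis=0), nominal)
    return (hi - lo) / 2.0

def error_bar_volume_histogram(ebdd, bins=50):
    """Fraction of voxels whose error bar exceeds each threshold (an EVH curve)."""
    thresholds = np.linspace(0.0, ebdd.max(), bins)
    volume = np.array([(ebdd >= t).mean() for t in thresholds])
    return thresholds, volume
```

    A planner could then compare EVH curves of competing beam arrangements for the same structure: the curve that falls off faster is the more robust plan.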

  16. An intelligent human-machine system based on an ecological interface design concept

    International Nuclear Information System (INIS)

    Naito, N.

    1995-01-01

    It seems both necessary and promising to develop an intelligent human-machine system, considering the objective of the human-machine system and recent advances in cognitive engineering and artificial intelligence, together with the ever-increasing importance of human factors issues in nuclear power plant operation and maintenance. Such a system should support human operators in their knowledge-based behaviour and allow them to cope with unanticipated abnormal events, including recovery from erroneous human actions. A top-down design approach has been adopted based on cognitive work analysis, and (1) an ecological interface, (2) a cognitive model-based advisor and (3) a robust automatic sequence controller have been established. These functions have been integrated into an experimental control room. A validation test was carried out with the participation of experienced operators and engineers. The results showed the usefulness of this system in supporting the operator's supervisory plant control tasks. ((orig.))

  17. Advanced Electrical Machines and Machine-Based Systems for Electric and Hybrid Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Cheng

    2015-09-01

    Full Text Available The paper presents a number of advanced solutions on electric machines and machine-based systems for the powertrain of electric vehicles (EVs. Two types of systems are considered, namely the drive systems designated for EV propulsion and the power split devices utilized in the popular series-parallel hybrid electric vehicle architecture. After reviewing the main requirements for the electric drive systems, the paper illustrates advanced electric machine topologies, including a stator permanent magnet (stator-PM motor, a hybrid-excitation motor, a flux memory motor and a redundant motor structure. Then, it illustrates advanced electric drive systems, such as the magnetic-geared in-wheel drive and the integrated starter generator (ISG. Finally, three machine-based implementations of the power split devices are expounded, built up around the dual-rotor PM machine, the dual-stator PM brushless machine and the magnetic-geared dual-rotor machine. As a conclusion, the development trends in the field of electric machines and machine-based systems for EVs are summarized.

  18. Asynchronized synchronous machines

    CERN Document Server

    Botvinnik, M M

    1964-01-01

    Asynchronized Synchronous Machines focuses on the theoretical research on asynchronized synchronous (AS) machines, which are "hybrids" of synchronous and induction machines that can operate with slip. Topics covered in this book include the initial equations; vector diagram of an AS machine; regulation in cases of deviation from the law of full compensation; parameters of the excitation system; and schematic diagram of an excitation regulator. The possible applications of AS machines and its calculations in certain cases are also discussed. This publication is beneficial for students and indiv

  19. Machine Shop Lathes.

    Science.gov (United States)

    Dunn, James

    This guide, the second in a series of five machine shop curriculum manuals, was designed for use in machine shop courses in Oklahoma. The purpose of the manual is to equip students with basic knowledge and skills that will enable them to enter the machine trade at the machine-operator level. The curriculum is designed so that it can be used in…

  20. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    Science.gov (United States)

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2017-06-14

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.
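    One machine-learning diagnostic the review discusses, variable importance, can be illustrated with a model-agnostic permutation test: shuffle one predictor and measure how much accuracy drops. The sketch below uses a minimal hand-rolled logistic model; all names are hypothetical and this is not the authors' pipeline.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Minimal logistic regression by gradient descent (illustrative only)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
        w -= lr * (X.T @ (p - y)) / len(y)        # gradient of the log-loss
        b -= lr * (p - y).mean()
    return w, b

def accuracy(w, b, X, y):
    return (((X @ w + b) > 0).astype(float) == y).mean()

def permutation_importance(w, b, X, y, rng):
    """Drop in accuracy when each feature is shuffled: larger drop = more important."""
    base = accuracy(w, b, X, y)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        rng.shuffle(Xp[:, j])                     # destroy feature j only
        scores.append(base - accuracy(w, b, Xp, y))
    return np.array(scores)
```

    The same permutation idea applies unchanged to black-box models, which is why it is a common companion to the methods the review surveys.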

  1. Design Comparison of Inner and Outer Rotor of Permanent Magnet Flux Switching Machine for Electric Bicycle Application

    Science.gov (United States)

    Jusoh, L. I.; Sulaiman, E.; Bahrim, F. S.; Kumar, R.

    2017-08-01

    Recent advancements have led to the development of flux switching machines (FSMs) with flux sources within the stator. The advantage of being a single-piece machine with a robust rotor structure makes the FSM an excellent choice for high-speed applications. There are three categories of FSM, namely, the permanent magnet (PM) FSM, the field excitation (FE) FSM, and the hybrid excitation (HE) FSM. The PMFSM and the FEFSM have PMs and field excitation coils (FECs), respectively, as their key flux sources, while, as the name suggests, the HEFSM combines PMs and FECs as flux sources. The PMFSM is a simple and cheap machine with the ability to control variable flux, which makes it suitable for an electric bicycle. Thus, this paper presents a design comparison between an inner-rotor and an outer-rotor single-phase permanent magnet flux switching machine with 8S-10P, designed specifically for an electric bicycle. The performance of the machines was validated using 2D-FEA. In conclusion, the outer-rotor design produces torque approximately 54.2% higher than that of the inner-rotor PMFSM. The comprehensive analysis of both designs shows that their output performance is lower than that of comparable SRM and IPMSM machines, but also that the design performance could be increased by using a deterministic optimization method.

  2. Real-time PCR Machine System Modeling and a Systematic Approach for the Robust Design of a Real-time PCR-on-a-Chip System

    OpenAIRE

    Lee, Da-Sheng

    2010-01-01

    Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DN...

  3. Methods for robustness programming

    NARCIS (Netherlands)

    Olieman, N.J.

    2008-01-01

    Robustness of an object is defined as the probability that the object will have properties as required. Robustness Programming (RP) is a mathematical approach for robustness estimation and robustness optimisation. An example, in the context of designing a food product, is finding the best composition

  4. Machining of Machine Elements Made of Polymer Composite Materials

    Science.gov (United States)

    Baurova, N. I.; Makarov, K. A.

    2017-12-01

    The machining of the machine elements that are made of polymer composite materials (PCMs) or are repaired using them is considered. Turning, milling, and drilling are shown to be most widely used among all methods of cutting PCMs. Cutting conditions for the machining of PCMs are presented. The factors that most strongly affect the roughness parameters and the accuracy of cutting PCMs are considered.

  5. Robustness of Structural Systems

    DEFF Research Database (Denmark)

    Canisius, T.D.G.; Sørensen, John Dalsgaard; Baker, J.W.

    2007-01-01

    The importance of robustness as a property of structural systems has been recognised following several structural failures, such as that at Ronan Point in 1968, where the consequences were deemed unacceptable relative to the initiating damage. A variety of research efforts in the past decades have attempted to quantify aspects of robustness such as redundancy and identify design principles that can improve robustness. This paper outlines the progress of recent work by the Joint Committee on Structural Safety (JCSS) to develop comprehensive guidance on assessing and providing robustness in structural systems. Guidance is provided regarding the assessment of robustness in a framework that considers potential hazards to the system, vulnerability of system components, and failure consequences. Several proposed methods for quantifying robustness are reviewed, and guidelines for robust design...

  6. The PEP-II project-wide database

    International Nuclear Information System (INIS)

    Chan, A.; Calish, S.; Crane, G.; MacGregor, I.; Meyer, S.; Wong, J.

    1995-05-01

    The PEP-II Project Database is a tool for monitoring the technical and documentation aspects of this accelerator's construction. It holds the PEP-II design specifications, fabrication and installation data in one integrated system. Key pieces of the database include the machine parameter list, magnet and vacuum fabrication data, CAD drawings, publications and documentation, survey and alignment data, and property control. The database can be extended to contain information required for the operations phase of the accelerator and detector. Features such as viewing CAD drawing graphics from the database will be implemented in the future. This central Oracle database on a UNIX server is built using Oracle CASE tools. Users at the three collaborating laboratories (SLAC, LBL, LLNL) can access the data remotely, using various desktop computer platforms and graphical interfaces.
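    The machine parameter list described above can be pictured as a single relational table. The sketch below uses Python's built-in sqlite3 in place of Oracle, purely to stay self-contained; the table layout, column names, and the dipole-count row are assumptions for illustration (the 9.0/3.1 GeV ring energies are PEP-II's published design values).

```python
import sqlite3

# A miniature stand-in for the project-wide machine parameter list.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE machine_parameter (
        name      TEXT PRIMARY KEY,   -- e.g. 'HER_energy'
        value     REAL NOT NULL,
        units     TEXT,
        subsystem TEXT                -- magnet, vacuum, rings, ...
    )""")
conn.executemany(
    "INSERT INTO machine_parameter VALUES (?, ?, ?, ?)",
    [("HER_energy", 9.0, "GeV", "rings"),       # high-energy (electron) ring
     ("LER_energy", 3.1, "GeV", "rings"),       # low-energy (positron) ring
     ("dipole_count", 192.0, None, "magnet")])  # hypothetical entry

# The kind of query a remote collaborator might run against the parameter list.
rows = conn.execute(
    "SELECT name, value FROM machine_parameter"
    " WHERE subsystem = 'rings' ORDER BY name").fetchall()
```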

  7. Robustness in laying hens

    NARCIS (Netherlands)

    Star, L.

    2008-01-01

    The aim of the project ‘The genetics of robustness in laying hens’ was to investigate nature and regulation of robustness in laying hens under sub-optimal conditions and the possibility to increase robustness by using animal breeding without loss of production. At the start of the project, a robust

  8. Extreme Learning Machine and Moving Least Square Regression Based Solar Panel Vision Inspection

    Directory of Open Access Journals (Sweden)

    Heng Liu

    2017-01-01

    Full Text Available In recent years, learning based machine intelligence has attracted a lot of attention across science and engineering. Particularly in the field of automatic industry inspection, machine learning based vision inspection plays an increasingly important role in defect identification and feature extraction. Through learning from image samples, many features of industry objects, such as shapes, positions, and orientation angles, can be obtained and then utilized to determine whether there is a defect or not. However, robustness and speed are not easily achieved in such inspection approaches. In this work, for solar panel vision inspection, we present an extreme learning machine (ELM and moving least square regression based approach to identify solder joint defects and detect the panel position. Firstly, histogram peaks distribution (HPD and fractional calculus are applied for image preprocessing. Then ELM-based identification of defective solder joints is discussed in detail. Finally, the moving least square regression (MLSR algorithm is introduced for solar panel position determination. Experimental results and comparisons show that the proposed ELM and MLSR based inspection method is efficient not only in detection accuracy but also in processing speed.
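    The core of an extreme learning machine is concrete enough to sketch: hidden-layer weights are drawn at random and never trained, and only the output weights are solved in closed form by least squares. This is a generic ELM sketch, not the paper's inspection system; the function names and the tanh activation are assumptions.

```python
import numpy as np

def elm_train(X, y, hidden=64, rng=None):
    """Extreme learning machine: random hidden layer + least-squares output weights."""
    if rng is None:
        rng = np.random.default_rng(0)
    W = rng.normal(size=(X.shape[1], hidden))   # random input weights (never trained)
    b = rng.normal(size=hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                # closed-form output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta
```

    The absence of iterative training is what gives ELM-based inspection its speed advantage: a single pseudo-inverse replaces the whole backpropagation loop.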

  9. Mechanical Design for Robustness of the LHC Collimators

    CERN Document Server

    Bertarelli, Alessandro; Assmann, R W; Calatroni, Sergio; Dallocchio, Alessandro; Kurtyka, Tadeusz; Mayer, Manfred; Perret, Roger; Redaelli, Stefano; Robert-Demolaize, Guillaume

    2005-01-01

    The functional specification of the LHC Collimators requires, for the start-up of the machine and the initial luminosity runs (Phase 1), a collimation system with maximum robustness against abnormal beam operating conditions. The most severe cases to be considered in the mechanical design are the asynchronous beam dump at 7 TeV and the 450 GeV injection error. To ensure that the collimator jaws survive such accident scenarios, low-Z materials were chosen, driving the design towards graphite or carbon/carbon composites. Furthermore, in-depth thermo-mechanical simulations, both static and dynamic, were necessary. This paper presents the results of the numerical analyses performed for the 450 GeV accident case, along with the experimental results of tests conducted on a collimator prototype in the CERN TT40 transfer line, impacted by a 450 GeV beam of 3.1·10¹³ protons.

  10. Improving Machining Accuracy of CNC Machines with Innovative Design Methods

    Science.gov (United States)

    Yemelyanov, N. V.; Yemelyanova, I. V.; Zubenko, V. L.

    2018-03-01

    The article considers achieving machining accuracy in CNC machines by applying innovative methods to the modelling and design of machining systems, drives and machine processes. The topological method of analysis involves visualizing the system as matrices of block graphs with a varying degree of detail between the upper and lower hierarchy levels. This approach combines the advantages of graph theory and the efficiency of decomposition methods; it also has the visual clarity inherent in both topological models and structural matrices, as well as the resiliency of linear algebra as part of the matrix-based research. The focus of the study is on the design of automated machine workstations, systems, machines and units, which can be broken into interrelated parts and presented as algebraic, topological and set-theoretical models. Every model can be transformed into a model of another type and, as a result, can be interpreted as a system of linear and non-linear equations whose solutions determine the system parameters. This paper analyses the dynamic parameters of the 1716PF4 machine at the design and exploitation stages. Having researched the impact of the system dynamics on component quality, the authors have developed a range of practical recommendations which have enabled a considerable reduction in the amplitude of relative motion, the exclusion of some resonance zones within the spindle speed range of 0–6000 min⁻¹, and improved machining accuracy.

  11. Machinability of nickel based alloys using electrical discharge machining process

    Science.gov (United States)

    Khan, M. Adam; Gokul, A. K.; Bharani Dharan, M. P.; Jeevakarthikeyan, R. V. S.; Uthayakumar, M.; Thirumalai Kumaran, S.; Duraiselvam, M.

    2018-04-01

    High temperature materials such as nickel based alloys and austenitic steels are frequently used for manufacturing critical aero engine turbine components. Literature on the conventional and unconventional machining of steels has been abundant over the past three decades. However, machining studies on superalloys remain a challenging task due to their inherent properties, which make these materials difficult to cut with conventional processes. This research therefore focuses on an unconventional machining process for nickel alloys. Inconel 718 and Monel 400 are the two candidate materials used for the electrical discharge machining (EDM) process. The investigation involves preparing a blind hole using a copper electrode of 6 mm diameter. Electrical parameters are varied to produce the plasma spark for the diffusion process, and the machining time is held constant so that the experimental results for both materials can be compared. The influence of the process parameters on the tool wear mechanism and material removal is considered in the proposed experimental design. During machining, the tool is prone to discharging more material due to the production of a high-energy plasma spark and the eddy current effect. The surface morphology of the machined surface was observed with a high-resolution FE-SEM; fused electrode material was found as spherical clumps over the machined surface. Surface roughness was also measured from the surface profile using a profilometer. It is confirmed that there is no deviation and that the precise roundness of the drilled hole is maintained.

  12. The achievements of the Z-machine; Les exploits de la Z-machine

    Energy Technology Data Exchange (ETDEWEB)

    Larousserie, D

    2008-03-15

    The ZR-machine, which represents the latest generation of Z-pinch machines, has recently begun preliminary testing before its full commissioning in Albuquerque (USA). During its tests the machine has operated well with electrical currents whose intensities of 26 million amperes are already twice as high as the operating current of the previous Z-machine. In 2006 the Z-machine reached temperatures of 2 billion kelvin, while 100 million kelvin would be sufficient to ignite thermonuclear fusion. The concept of Z-pinch machines was in fact imagined in the fifties, but the technological breakthrough that allowed this recent success, and the rebirth of the Z-machine, was the replacement of gas by an array of metal wires through which the electrical current flows, vaporizing the wires and creating an imploding plasma. It is not well understood why Z-pinch machines generate far more radiation than theoretically expected. (A.C.)

  13. Quantum machine learning.

    Science.gov (United States)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  14. Machine protection systems

    CERN Document Server

    Macpherson, A L

    2010-01-01

    A summary of the Machine Protection System of the LHC is given, with particular attention given to the outstanding issues to be addressed, rather than the successes of the machine protection system from the 2009 run. In particular, the issues of Safe Machine Parameter system, collimation and beam cleaning, the beam dump system and abort gap cleaning, injection and dump protection, and the overall machine protection program for the upcoming run are summarised.

  15. FY1995 distributed control of man-machine cooperative multi agent systems; 1995 nendo ningen kyochogata multi agent kikai system no jiritsu seigyo

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    In the near future, distributed autonomous systems will be practical in many situations, e.g., interactive production systems, hazardous environments, nursing homes, and individual houses. The agents which constitute such a distributed system must not harm human beings and should operate economically. In this project, man-machine cooperative multi-agent systems are studied from many perspectives, and basic design technology and basic control techniques are developed by establishing fundamental theories and by constructing experimental systems. Theoretical and experimental studies are conducted in the following sub-projects: (1) distributed cooperative control in multi-agent type actuation systems; (2) control of non-holonomic systems; (3) man-machine cooperative systems; (4) robot systems learning human skills; (5) robust force control of constrained systems. Across these sub-projects, the cooperative nature of machine agent systems and human beings, the interference between artificial multi-agents and the environment, the emergence of new functions in the coordination of the multi-agents and the environment, robust force control with respect to the environment, control methods for non-holonomic systems, and robot systems that can mimic and learn human skills were studied. In each sub-project, specific problems were highlighted and solutions were given based on the construction of experimental systems. (NEDO)

  16. Preliminary Test of Upgraded Conventional Milling Machine into PC Based CNC Milling Machine

    International Nuclear Information System (INIS)

    Abdul Hafid

    2008-01-01

    CNC (Computerized Numerical Control) milling machines present a challenge for innovation in the field of machining. To achieve machining quality equivalent to that of a CNC milling machine, a conventional milling machine was upgraded into a PC-based CNC milling machine, with both mechanical and instrumentation changes. As control replacements, servo drives and proximity sensors were used. A computer program was constructed to give instructions to the milling machine. The program structure consists of a GUI model and a ladder diagram, implemented on a programming system called RTX software. The results of the upgrade are the computer program and the CNC instruction job; this is a first step, and the work will be continued. By improving the performance of the milling machine, the user can work more optimally and safely, with a reduced risk of accidents. (author)

  17. Machine Control System of Steady State Superconducting Tokamak-1

    Energy Technology Data Exchange (ETDEWEB)

    Masand, Harish, E-mail: harish@ipr.res.in; Kumar, Aveg; Bhandarkar, M.; Mahajan, K.; Gulati, H.; Dhongde, J.; Patel, K.; Chudasma, H.; Pradhan, S.

    2016-11-15

    Highlights: • Central Control System. • SST-1. • Machine Control System. - Abstract: The Central Control System (CCS) of the Steady State Superconducting Tokamak-1 (SST-1) controls and monitors around 25 plant and experiment subsystems of SST-1 located remotely from the Central Control room. The Machine Control System (MCS) is a supervisory system that sits at the top of the CCS hierarchy and implements the CCS state diagram. MCS ensures the software interlock between the SST-1 subsystems and the CCS, so that a subsystem communication failure or local error does not prevent execution of the MCS and, in turn, CCS operation. MCS also periodically monitors each subsystem's status and vital process parameters throughout the campaign. It also provides the platform for the Central Control operator to visualize and remotely exchange operational and experimental configuration parameters with the subsystems. MCS remains operational 24 × 7 from the commencement to the termination of an SST-1 campaign. The developed MCS has performed robustly and flawlessly during all SST-1 campaigns carried out so far. This paper describes various aspects of the development of the MCS.
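    The supervisory pattern described above, a state diagram plus periodic polling in which a failed subsystem is flagged but never fatal, can be sketched generically. The states, transitions, and probe interface below are hypothetical, not SST-1's actual ones.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    CONFIGURED = auto()
    RUNNING = auto()
    TERMINATED = auto()

# Allowed transitions of the hypothetical supervisory state diagram.
TRANSITIONS = {
    State.IDLE: {State.CONFIGURED},
    State.CONFIGURED: {State.RUNNING, State.IDLE},
    State.RUNNING: {State.TERMINATED},
    State.TERMINATED: set(),
}

class Supervisor:
    def __init__(self, subsystems):
        self.state = State.IDLE
        self.subsystems = subsystems   # name -> callable returning a status dict

    def transition(self, target):
        """Enforce the state diagram: reject any transition it does not allow."""
        if target not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {target}")
        self.state = target

    def poll(self):
        """Poll every subsystem; a failing probe is flagged, never fatal."""
        report = {}
        for name, probe in self.subsystems.items():
            try:
                report[name] = probe()
            except Exception as exc:   # communication failure or local error
                report[name] = {"ok": False, "error": str(exc)}
        return report
```

    The key design point, mirrored from the abstract, is that `poll` catches per-subsystem failures so the supervisor loop itself keeps running.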

  18. Study of on-machine error identification and compensation methods for micro machine tools

    International Nuclear Information System (INIS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-01-01

    Micro machining plays an important role in the manufacturing of miniature products which are made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by re-installment of the workpiece, the measurement and compensation should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very tiny, the measurement method should be non-contact. By integrating image reconstruction, camera pixel correction, coordinate transformation, an error identification algorithm, and a trajectory auto-correction method, a vision-based error measurement and compensation method was developed in this study that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results
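    The error-identification step, mapping actual cutting points to the theoretical contour and correcting the NC targets, can be sketched as a nearest-point computation. This is a simplified stand-in for the paper's moving matching-window method; the array names and the mirror-compensation rule are assumptions.

```python
import numpy as np

def contour_errors(actual, theoretical):
    """Match each actual cutting point (N, 2) to its nearest theoretical
    contour point (M, 2); return matched indices and error distances."""
    d = np.linalg.norm(actual[:, None, :] - theoretical[None, :, :], axis=2)
    idx = d.argmin(axis=1)
    return idx, d[np.arange(len(actual)), idx]

def compensated_commands(nominal, actual, theoretical):
    """Shift each NC target opposite to its measured error (mirror compensation).
    Assumes rows of `nominal` and `actual` refer to the same cutting points."""
    idx, _ = contour_errors(actual, theoretical)
    error = actual - theoretical[idx]   # measured deviation per cutting point
    return nominal - error              # pre-distorted targets for the next run
```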

  19. Machine Fault Detection Based on Filter Bank Similarity Features Using Acoustic and Vibration Analysis

    Directory of Open Access Journals (Sweden)

    Mauricio Holguín-Londoño

    2016-01-01

    Full Text Available Vibration and acoustic analysis actively support the nondestructive and noninvasive fault diagnostics of rotating machines at early stages. Nonetheless, the acoustic signal is less used because of its vulnerability to external interferences, hindering an efficient and robust analysis for condition monitoring (CM. This paper presents a novel methodology to characterize different failure signatures from rotating machines using either acoustic or vibration signals. Firstly, the signal is decomposed into several narrow-band spectral components by applying different filter bank methods such as empirical mode decomposition, wavelet packet transform, and Fourier-based filtering. Secondly, a feature set is built using a proposed similarity measure termed the cumulative spectral density index, used to estimate the mutual statistical dependence between each bandwidth-limited component and the raw signal. Finally, a classification scheme is carried out to distinguish the different types of faults. The methodology is tested in two laboratory experiments, including turbine blade degradation and rolling element bearing faults. The robustness of our approach is validated by contaminating the signal with several levels of additive white Gaussian noise, obtaining high-performance outcomes that make vibration, acoustic, and vibroacoustic measurements comparable across different applications. As a result, the proposed fault detection based on filter bank similarity features is a promising methodology to implement in the CM of rotating machinery, even using measurements with a low signal-to-noise ratio.
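    The feature-extraction stage can be sketched generically: split the signal into bands with an FFT filter bank, then score each band's statistical dependence on the raw signal. The similarity measure below, correlating cumulative spectral densities, is only a stand-in for the paper's cumulative spectral density index; all names and the brick-wall band split are assumptions.

```python
import numpy as np

def band_components(x, n_bands):
    """Split a signal into narrow-band components with an FFT brick-wall filter bank.
    The components partition the spectrum, so they sum back to the original signal."""
    X = np.fft.rfft(x)
    edges = np.linspace(0, len(X), n_bands + 1, dtype=int)
    comps = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        Xb = np.zeros_like(X)
        Xb[lo:hi] = X[lo:hi]
        comps.append(np.fft.irfft(Xb, n=len(x)))
    return comps

def similarity_feature(comp, x):
    """Correlation between the normalized cumulative spectral densities of a band
    component and the raw signal. Skip bands with zero energy before calling."""
    def csd(s):
        p = np.abs(np.fft.rfft(s)) ** 2
        c = np.cumsum(p)
        return c / c[-1]
    return float(np.corrcoef(csd(comp), csd(x))[0, 1])
```

    A classifier would then be trained on the per-band similarity features rather than on the raw waveform, which is what makes the scheme robust to broadband noise.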

  20. Super-delta: a new differential gene expression analysis procedure with robust data normalization.

    Science.gov (United States)

    Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing

    2017-12-21

    Normalization is an important data preparation step in gene expression analyses, designed to remove various systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which inevitably introduces some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived based on asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power with tighter control of the type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is substantial overlap among the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigation of the relatively small differences showed that the pathways identified by super-delta have better connections to breast cancer than those of the other methods.
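
The core idea described above — a global normalization made robust to DEGs, followed by per-gene t-tests — can be sketched as follows. This is an illustrative toy version, not the authors' super-delta implementation: the trimmed-mean shift estimate, the data dimensions, and the noise model are all invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy expression matrix: 200 genes x 10 samples (5 vs. 5), log2 scale,
# with an additive per-sample technical shift and 10 true DEGs.
n_genes, n1, n2 = 200, 5, 5
data = rng.normal(8.0, 1.0, size=(n_genes, n1 + n2))
data += rng.normal(0.0, 0.5, size=n1 + n2)   # per-sample technical noise
data[:10, n1:] += 2.0                        # differentially expressed genes

# Robust global normalization: estimate each sample's shift with a trimmed
# mean over genes, so DEGs and outliers contribute less to the estimate.
shift = stats.trim_mean(data, 0.1, axis=0)
normed = data - shift

# Per-gene two-sample t-tests on the normalized data.
tstat, pval = stats.ttest_ind(normed[:, :n1], normed[:, n1:], axis=1)
n_hits = int(np.sum(pval < 0.01))
```

The trimming fraction (here 0.1) controls how strongly large DEGs are excluded from the shift estimate; the actual super-delta procedure pairs its robust normalization with a specially derived modified t-test rather than the plain one used here.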

  1. National Machine Guarding Program: Part 1. Machine safeguarding practices in small metal fabrication businesses.

    Science.gov (United States)

    Parker, David L; Yamin, Samuel C; Brosseau, Lisa M; Xi, Min; Gordon, Robert; Most, Ivan G; Stanley, Rodney

    2015-11-01

Metal fabrication workers experience high rates of traumatic occupational injuries. Machine operators in particular face high risks, often stemming from the absence or improper use of machine safeguarding or the failure to implement lockout procedures. The National Machine Guarding Program (NMGP) was a translational research initiative implemented in conjunction with two workers' compensation insurers. Insurance safety consultants trained in machine guarding used standardized checklists to conduct a baseline inspection of machine-related hazards in 221 businesses. Safeguards at the point of operation were missing or inadequate on 33% of machines. Safeguards for other mechanical hazards were missing on 28% of machines. Older machines were both widely used and less likely than newer machines to be properly guarded. Lockout/tagout procedures were posted at only 9% of machine workstations. The NMGP demonstrates a need for improvement in many aspects of machine safety and lockout in small metal fabrication businesses. © 2015 The Authors. American Journal of Industrial Medicine published by Wiley Periodicals, Inc.

  2. Machine learning for outcome prediction of acute ischemic stroke post intra-arterial therapy.

    Directory of Open Access Journals (Sweden)

    Hamed Asadi

    datasets, likely further improving prediction. Finally, we propose that a robust machine learning system can potentially optimise the selection process for endovascular versus medical treatment in the management of acute stroke.

  3. Non-conventional electrical machines

    CERN Document Server

    Rezzoug, Abderrezak

    2013-01-01

The developments of electrical machines are due to the convergence of material progress, improved calculation tools, and new feeding sources. Among the many recent machines, the authors have chosen, in this first book, to relate the progress in slow speed machines, high speed machines, and superconducting machines. The first part of the book is dedicated to materials and an overview of magnetism, mechanics, and heat transfer.

  4. Robust analysis of trends in noisy tokamak confinement data using geodesic least squares regression

    Energy Technology Data Exchange (ETDEWEB)

    Verdoolaege, G., E-mail: geert.verdoolaege@ugent.be [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Laboratory for Plasma Physics, Royal Military Academy, B-1000 Brussels (Belgium); Shabbir, A. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Max Planck Institute for Plasma Physics, Boltzmannstr. 2, 85748 Garching (Germany); Hornung, G. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium)

    2016-11-15

Regression analysis is a very common activity in fusion science for unveiling trends and parametric dependencies, but it can be a difficult matter. We have recently developed the method of geodesic least squares (GLS) regression, which is able to handle errors in all variables, is robust against data outliers and uncertainty in the regression model, and can be used with arbitrary distribution models and regression functions. We report here on first results of applying GLS to the estimation of the multi-machine scaling law for the energy confinement time in tokamaks, demonstrating improved consistency of the GLS results compared to standard least squares.
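
For context, the conventional baseline that GLS is compared against is an ordinary least squares fit of a log-linear power-law scaling. The sketch below fits such a scaling on synthetic data; the variable names, exponents, and noise model are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "multi-machine" data following tau_E = C * I^a * P^b with
# multiplicative noise (all values and exponents are made up).
n = 100
I = rng.uniform(0.5, 5.0, n)       # plasma current (MA), synthetic
P = rng.uniform(1.0, 20.0, n)      # heating power (MW), synthetic
tau = 0.05 * I**0.9 * P**-0.6 * rng.lognormal(0.0, 0.1, n)

# Standard log-linear least squares: log tau = log C + a log I + b log P.
X = np.column_stack([np.ones(n), np.log(I), np.log(P)])
coef, *_ = np.linalg.lstsq(X, np.log(tau), rcond=None)
logC, a, b = coef
```

Unlike this baseline, GLS accounts for errors in the regressors themselves and is robust to outliers; the point of the sketch is only to show the standard fit that the paper improves upon.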

  5. Characterization of Machine Variability and Progressive Heat Treatment in Selective Laser Melting of Inconel 718

    Science.gov (United States)

    Prater, Tracie; Tilson, Will; Jones, Zack

    2015-01-01

The absence of an economy of scale in spaceflight hardware makes additive manufacturing an immensely attractive option for propulsion components. As additive manufacturing techniques are increasingly adopted by government and industry to produce propulsion hardware in human-rated systems, significant development efforts are needed to establish these methods as reliable alternatives to conventional subtractive manufacturing. One of the critical challenges facing powder bed fusion techniques in this application is variability between machines used to perform builds. Even with implementation of robust process controls, it is possible for two machines operating at identical parameters with equivalent base materials to produce specimens with slightly different material properties. The machine variability study presented here evaluates 60 specimens of identical geometry built using the same parameters. 30 samples were produced on machine 1 (M1) and the other 30 samples were built on machine 2 (M2). Each of the 30-sample sets was further subdivided into three subsets (with 10 specimens in each subset) to assess the effect of progressive heat treatment on machine variability. The three categories for post-processing were: stress relief, stress relief followed by hot isostatic press (HIP), and stress relief followed by HIP followed by heat treatment per AMS 5664. Each specimen (a round, smooth tensile) was mechanically tested per ASTM E8. Two formal statistical techniques, hypothesis testing for equivalency of means and one-way analysis of variance (ANOVA), were applied to characterize the impact of machine variability and heat treatment on five material properties: tensile stress, yield stress, modulus of elasticity, fracture elongation, and reduction of area. This work represents the type of development effort that is critical as NASA, academia, and the industrial base work collaboratively to establish a path to certification for additively manufactured parts. 
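
The two statistical techniques named above can be illustrated on synthetic tensile data. All numbers below (means, scatter, sample sizes, subset labels) are invented for the example and do not reproduce the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic ultimate tensile strength (MPa), 10 specimens per subset:
# three progressive heat treatments on machine 1, mirrored on machine 2
# with a small machine-to-machine perturbation.
m1 = {"SR": rng.normal(1000, 20, 10),
      "SR+HIP": rng.normal(1150, 20, 10),
      "SR+HIP+HT": rng.normal(1350, 20, 10)}
m2 = {k: v + rng.normal(0, 5, 10) for k, v in m1.items()}

# One-way ANOVA: does heat treatment affect strength on machine 1?
f_stat, p_heat = stats.f_oneway(*m1.values())

# Two-sample t-test: machine-to-machine difference within one condition.
t_stat, p_mach = stats.ttest_ind(m1["SR"], m2["SR"])
```

A small `p_heat` indicates a heat-treatment effect, while a large `p_mach` is consistent with (but does not prove) machine equivalence; formal equivalence testing, as used in the study, additionally requires specifying an equivalence margin.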

  6. CrossTalk. The Journal of Defense Software Engineering. Volume 14, Number 5, May 2001

    Science.gov (United States)

    2001-05-01

superb leadership, and adequate funding to fuel the program engine. A good start requires disciplined requirements generation and plotting the right course...multimedia products. • Self-contained, closed products. • Desk-top and portable computers. • Information documentation and support. The entire regulation...machines prefer certain e-mail clients. The Oracle sales force lives on laptops and prefers different e-mail clients. Some employees telecommute and

  7. Development of a Big Data Application Architecture for Navy Manpower, Personnel, Training, and Education

    Science.gov (United States)

    2016-03-01

science IT information technology JBOD just a bunch of disks JDBC java database connectivity xviii JPME Joint Professional Military Education JSO...Joint Service Officer JVM java virtual machine MPP massively parallel processing MPTE Manpower, Personnel, Training, and Education NAVMAC Navy...27 external database, whether it is MySQL, Oracle, DB2, or SQL Server (Teller, 2015). Connectors optimize the data transfer by obtaining metadata

  8. Electrical machines & drives

    CERN Document Server

    Hammond, P

    1985-01-01

    Containing approximately 200 problems (100 worked), the text covers a wide range of topics concerning electrical machines, placing particular emphasis upon electrical-machine drive applications. The theory is concisely reviewed and focuses on features common to all machine types. The problems are arranged in order of increasing levels of complexity and discussions of the solutions are included where appropriate to illustrate the engineering implications. This second edition includes an important new chapter on mathematical and computer simulation of machine systems and revised discussions o

  9. DNA-based machines.

    Science.gov (United States)

    Wang, Fuan; Willner, Bilha; Willner, Itamar

    2014-01-01

The base sequence in nucleic acids encodes substantial structural and functional information into the biopolymer. This encoded information provides the basis for the tailoring and assembly of DNA machines. A DNA machine is defined as a molecular device that exhibits the following fundamental features. (1) It performs a fuel-driven mechanical process that mimics macroscopic machines. (2) The mechanical process requires an energy input, "fuel." (3) The mechanical operation is accompanied by an energy consumption process that leads to "waste products." (4) The cyclic operation of the DNA devices involves the use of "fuel" and "anti-fuel" ingredients. A variety of DNA-based machines are described, including the construction of "tweezers," "walkers," "robots," "cranes," "transporters," "springs," "gears," and interlocked cyclic DNA structures acting as reconfigurable catenanes, rotaxanes, and rotors. Different "fuels", such as nucleic acid strands, pH (H⁺/OH⁻), metal ions, and light, are used to trigger the mechanical functions of the DNA devices. The operation of the devices in solution and on surfaces is described, and a variety of optical, electrical, and photoelectrochemical methods to follow the operations of the DNA machines are presented. We further address the possible applications of DNA machines and the future perspectives of molecular DNA devices. These include the application of DNA machines as functional structures for the construction of logic gates and computing, for the programmed organization of metallic nanoparticle structures and the control of plasmonic properties, and for controlling chemical transformations by DNA machines. We further discuss the future applications of DNA machines for intracellular sensing, controlling intracellular metabolic pathways, and the use of the functional nanostructures for drug delivery and medical applications.

  10. Machine translation

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, M

    1982-04-01

    Each language has its own structure. In translating one language into another one, language attributes and grammatical interpretation must be defined in an unambiguous form. In order to parse a sentence, it is necessary to recognize its structure. A so-called context-free grammar can help in this respect for machine translation and machine-aided translation. Problems to be solved in studying machine translation are taken up in the paper, which discusses subjects for semantics and for syntactic analysis and translation software. 14 references.
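
A context-free grammar in Chomsky normal form can be recognized with the classic CYK algorithm, a standard building block for the kind of syntactic analysis described above. The grammar and lexicon below are a hypothetical toy, invented for illustration:

```python
# Binary CNF rules: (left child, right child) -> parent nonterminal.
grammar = {
    ("NP", "VP"): "S",
    ("Det", "N"): "NP",
    ("V", "NP"): "VP",
}
lexicon = {"the": "Det", "dog": "N", "cat": "N", "sees": "V"}

def cyk_recognize(words):
    """Return True if the word sequence derives the start symbol S."""
    n = len(words)
    # table[i][j] holds the nonterminals deriving words[i : i + j + 1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][0].add(lexicon[w])
    for span in range(2, n + 1):          # length of the substring
        for i in range(n - span + 1):     # start of the substring
            for k in range(1, span):      # split point
                for b in table[i][k - 1]:
                    for c in table[i + k][span - k - 1]:
                        parent = grammar.get((b, c))
                        if parent:
                            table[i][span - 1].add(parent)
    return "S" in table[0][n - 1]

ok = cyk_recognize("the dog sees the cat".split())  # True
```

Real machine translation systems attach semantic attributes to such parse structures, as the paper discusses; this sketch covers only the recognition step.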

  11. Robust Growth Determinants

    OpenAIRE

    Doppelhofer, Gernot; Weeks, Melvyn

    2011-01-01

This paper investigates the robustness of determinants of economic growth in the presence of model uncertainty, parameter heterogeneity and outliers. The robust model averaging approach introduced in the paper uses a flexible and parsimonious mixture modeling that allows for fat-tailed errors compared to the normal benchmark case. Applying robust model averaging to growth determinants, the paper finds that eight out of eighteen variables found to be significantly related to economic growth ...

  12. Induction machine handbook

    CERN Document Server

    Boldea, Ion

    2002-01-01

    Often called the workhorse of industry, the advent of power electronics and advances in digital control are transforming the induction motor into the racehorse of industrial motion control. Now, the classic texts on induction machines are nearly three decades old, while more recent books on electric motors lack the necessary depth and detail on induction machines.The Induction Machine Handbook fills industry's long-standing need for a comprehensive treatise embracing the many intricate facets of induction machine analysis and design. Moving gradually from simple to complex and from standard to

  13. Chaotic Boltzmann machines

    Science.gov (United States)

    Suzuki, Hideyuki; Imura, Jun-ichi; Horio, Yoshihiko; Aihara, Kazuyuki

    2013-01-01

    The chaotic Boltzmann machine proposed in this paper is a chaotic pseudo-billiard system that works as a Boltzmann machine. Chaotic Boltzmann machines are shown numerically to have computing abilities comparable to conventional (stochastic) Boltzmann machines. Since no randomness is required, efficient hardware implementation is expected. Moreover, the ferromagnetic phase transition of the Ising model is shown to be characterised by the largest Lyapunov exponent of the proposed system. In general, a method to relate probabilistic models to nonlinear dynamics by derandomising Gibbs sampling is presented. PMID:23558425
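
For comparison, a conventional (stochastic) Boltzmann machine updates its binary units by Gibbs sampling, which is the randomness the chaotic variant derandomizes. A minimal sketch, with arbitrary toy weights and network size:

```python
import numpy as np

rng = np.random.default_rng(3)

# Small fully connected Boltzmann machine: symmetric weights, no self-loops.
n = 8
W = rng.normal(0.0, 0.5, (n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(0.0, 0.1, n)

def gibbs_sample(steps=5000, T=1.0):
    """Sequential Gibbs sampling of binary unit states at temperature T."""
    s = rng.integers(0, 2, n).astype(float)
    for _ in range(steps):
        i = rng.integers(n)
        # Conditional probability that unit i is on, given the other units.
        p_on = 1.0 / (1.0 + np.exp(-(W[i] @ s + b[i]) / T))
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s

state = gibbs_sample()
```

The chaotic Boltzmann machine of the paper replaces the random acceptance step with deterministic pseudo-billiard dynamics, which is why no random number generator is needed in hardware.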

  14. Rotating electrical machines

    CERN Document Server

    Le Doeuff, René

    2013-01-01

    In this book a general matrix-based approach to modeling electrical machines is promulgated. The model uses instantaneous quantities for key variables and enables the user to easily take into account associations between rotating machines and static converters (such as in variable speed drives).   General equations of electromechanical energy conversion are established early in the treatment of the topic and then applied to synchronous, induction and DC machines. The primary characteristics of these machines are established for steady state behavior as well as for variable speed scenarios. I
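
Matrix-based modeling with instantaneous quantities typically rests on reference-frame transformations such as the Park (abc to dq0) transformation, sketched below in its amplitude-invariant form. The 50 Hz balanced currents are a toy example, not taken from the book:

```python
import numpy as np

def park(theta):
    """abc -> dq0 transformation matrix (amplitude-invariant form)."""
    c = 2.0 / 3.0
    return c * np.array([
        [np.cos(theta), np.cos(theta - 2*np.pi/3), np.cos(theta + 2*np.pi/3)],
        [-np.sin(theta), -np.sin(theta - 2*np.pi/3), -np.sin(theta + 2*np.pi/3)],
        [0.5, 0.5, 0.5],
    ])

# Balanced three-phase currents map to constant d-q values when the
# rotating reference frame is aligned with phase a.
t = np.linspace(0.0, 0.04, 200)
omega = 2 * np.pi * 50
i_abc = np.array([np.cos(omega * t),
                  np.cos(omega * t - 2*np.pi/3),
                  np.cos(omega * t + 2*np.pi/3)])
i_dq0 = np.array([park(omega * ti) @ i_abc[:, j]
                  for j, ti in enumerate(t)]).T
```

Constant d-q components are precisely what makes the matrix formulation convenient for combining machines with static converters in variable speed drives.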

  15. Robust SMES controller design for stabilization of inter-area oscillation considering coil size and system uncertainties

    International Nuclear Information System (INIS)

    Ngamroo, Issarachai

    2010-01-01

It is well known that superconducting magnetic energy storage (SMES) is able to quickly exchange active and reactive power with the power system. The SMES is expected to be the smart storage device for power system stabilization. Although the stabilizing effect of SMES is significant, SMES is quite costly. In particular, the size of the superconducting magnetic coil, which is the essence of the SMES, must be carefully selected. On the other hand, various generation and load changes, unpredictable network structure, etc., cause system uncertainties. A SMES power controller designed without considering such uncertainties may not tolerate them and may lose its stabilizing effect. To overcome these problems, this paper proposes a new design of a robust SMES controller that takes coil size and system uncertainties into account. The structure of the active and reactive power controllers is a 1st-order lead-lag compensator. With no need for an exact mathematical representation, system uncertainties are modeled by an inverse input multiplicative perturbation. The optimization problem for the control parameters is formulated without the difficult trade-off between damping performance and robustness. Particle swarm optimization is used to solve for the optimal parameters at each coil size automatically. Based on the normalized integral square error index and the consideration of the coil current constraint, the robust SMES with the smallest coil size that still provides a satisfactory stabilizing effect can be achieved. Simulation studies in a two-area four-machine interconnected power system show the superior robustness of the proposed robust SMES with the smallest coil size under various operating conditions over a non-robust SMES with a large coil size.

  16. Robust SMES controller design for stabilization of inter-area oscillation considering coil size and system uncertainties

    Science.gov (United States)

    Ngamroo, Issarachai

    2010-12-01

It is well known that superconducting magnetic energy storage (SMES) is able to quickly exchange active and reactive power with the power system. The SMES is expected to be the smart storage device for power system stabilization. Although the stabilizing effect of SMES is significant, SMES is quite costly. In particular, the size of the superconducting magnetic coil, which is the essence of the SMES, must be carefully selected. On the other hand, various generation and load changes, unpredictable network structure, etc., cause system uncertainties. A SMES power controller designed without considering such uncertainties may not tolerate them and may lose its stabilizing effect. To overcome these problems, this paper proposes a new design of a robust SMES controller that takes coil size and system uncertainties into account. The structure of the active and reactive power controllers is a 1st-order lead-lag compensator. With no need for an exact mathematical representation, system uncertainties are modeled by an inverse input multiplicative perturbation. The optimization problem for the control parameters is formulated without the difficult trade-off between damping performance and robustness. Particle swarm optimization is used to solve for the optimal parameters at each coil size automatically. Based on the normalized integral square error index and the consideration of the coil current constraint, the robust SMES with the smallest coil size that still provides a satisfactory stabilizing effect can be achieved. Simulation studies in a two-area four-machine interconnected power system show the superior robustness of the proposed robust SMES with the smallest coil size under various operating conditions over a non-robust SMES with a large coil size.
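
Particle swarm optimization of controller gains against an integral-square-error-style index can be sketched as below. The quadratic stand-in objective replaces the power-system simulation that would supply the real index, and all coefficients are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy objective standing in for the normalized ISE index: minimum at
# gains (2.0, -1.0). In the paper, evaluating this would require a
# time-domain simulation of the power system with the SMES controller.
def ise(g):
    return (g[0] - 2.0) ** 2 + (g[1] + 1.0) ** 2

# Minimal particle swarm optimization over two lead-lag gains.
n_particles, n_iter, dim = 20, 100, 2
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([ise(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([ise(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()
```

Repeating this search for each candidate coil size, subject to the coil current constraint, yields the smallest coil that still meets the damping criterion, as described in the abstract.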

  17. Your Sewing Machine.

    Science.gov (United States)

    Peacock, Marion E.

    The programed instruction manual is designed to aid the student in learning the parts, uses, and operation of the sewing machine. Drawings of sewing machine parts are presented, and space is provided for the student's written responses. Following an introductory section identifying sewing machine parts, the manual deals with each part and its…

  18. Machine Learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  19. Machine learning algorithms for the creation of clinical healthcare enterprise systems

    Science.gov (United States)

    Mandal, Indrajit

    2017-10-01

Clinical recommender systems are increasingly becoming popular for improving modern healthcare systems. Enterprise systems are persuasively used for creating effective nurse care plans to provide nurse training, clinical recommendations and clinical quality control. A novel design of a reliable clinical recommender system based on a multiple classifier system (MCS) is implemented. A hybrid machine learning (ML) ensemble based on the random subspace method and random forest is presented. The performance accuracy and robustness of the proposed enterprise architecture are quantitatively estimated to be above 99% and 97%, respectively (above 95% confidence interval). The study then extends to experimental analysis of the clinical recommender system with respect to a noisy data environment. The ranking of items in the nurse care plan is demonstrated using machine learning algorithms (MLAs) to overcome the drawback of the traditional association rule method. The promising experimental results are compared against state-of-the-art approaches to highlight the advancement in recommendation technology. The proposed recommender system is experimentally validated using five benchmark clinical datasets to reinforce the research findings.
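
A random-subspace ensemble in the spirit of the one described — each member trained on a bootstrap sample restricted to a random subset of features, combined by majority vote — can be sketched with depth-1 trees. The data and all parameters below are synthetic placeholders, not the paper's clinical datasets or its MCS design:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy binary dataset standing in for clinical records: 200 samples,
# 10 features, label driven by two informative features.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)

def fit_stump(Xs, ys):
    """Best single-feature threshold rule (a depth-1 decision tree)."""
    best, best_acc = None, 0.0
    for f in range(Xs.shape[1]):
        for t in np.percentile(Xs[:, f], [25, 50, 75]):
            for sign in (1, -1):
                pred = (sign * (Xs[:, f] - t) > 0).astype(int)
                acc = float((pred == ys).mean())
                if acc > best_acc:
                    best, best_acc = (f, t, sign), acc
    return best

# Random-subspace ensemble: each stump sees a bootstrap sample restricted
# to a random half of the features; predictions are majority-voted.
members = []
for _ in range(25):
    feats = rng.choice(10, size=5, replace=False)
    rows = rng.integers(0, len(X), len(X))
    f, t, sign = fit_stump(X[np.ix_(rows, feats)], y[rows])
    members.append((feats[f], t, sign))

votes = np.mean([(s * (X[:, f] - t) > 0).astype(int)
                 for f, t, s in members], axis=0)
train_acc = float(((votes > 0.5).astype(int) == y).mean())
```

The random forest used in the paper follows the same diversify-and-vote principle, but with full decision trees and per-split feature sampling rather than the per-member subspaces shown here.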

  20. Monitoring of laser material processing using machine integrated low-coherence interferometry

    Science.gov (United States)

    Kunze, Rouwen; König, Niels; Schmitt, Robert

    2017-06-01

Laser material processing has become an indispensable tool in modern production. With the availability of high power pico- and femtosecond laser sources, laser material processing is advancing into applications which demand the highest accuracies, such as laser micro milling or laser drilling. In order to enable narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed workpiece is essential for achieving a robust manufacturing process. Low-coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements in machine coordinates and subsequent process control and quality assurance are possible. First products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.
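
In spectral-domain low-coherence interferometry, a reflector at optical path difference z imprints fringes cos(2kz) on the spectrum, so a Fourier transform over wavenumber localizes the depth. A minimal numerical sketch, with all parameters (wavenumber range, depth, fringe contrast) invented for illustration:

```python
import numpy as np

# Spectral interferogram sampled uniformly in wavenumber k (1/m):
# a single reflector at depth z imprints fringes cos(2*k*z).
n = 2048
k = np.linspace(5.0e6, 6.0e6, n)
z_true = 120e-6                      # 120 micrometres
signal = 1.0 + 0.5 * np.cos(2.0 * k * z_true)

# FFT over k: a fringe frequency of f cycles per unit wavenumber
# corresponds to a depth z = pi * f.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
dk = k[1] - k[0]
z_axis = np.fft.rfftfreq(n, d=dk) * np.pi
z_est = z_axis[np.argmax(spectrum)]
```

The depth resolution of this estimate is set by the spanned wavenumber range (here roughly pi divided by the span, a few micrometres), which is why broadband sources are used for high-accuracy depth monitoring.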