WorldWideScience

Sample records for digital universe high-dimensional

  1. Astronomy in the Digital Universe

    Science.gov (United States)

    Haisch, Bernard M.; Lindblom, J.; Terzian, Y.

    2006-12-01

The Digital Universe is an Internet project whose mission is to provide free, accurate, unbiased information covering all aspects of human knowledge, and to inspire humans to learn, make use of, and expand this knowledge. It is planned to be a decades-long effort, inspired by the Encyclopedia Galactica concept popularized by Carl Sagan, and is being developed by the non-profit Digital Universe Foundation. A worldwide network of experts is responsible for selecting content featured within the Digital Universe. The first publicly available content is the Encyclopedia of Earth, a Boston University project headed by Prof. Cutler Cleveland, which will be part of the Earth Portal. The second major content area will be an analogous Encyclopedia of the Cosmos, to be part of the Cosmos Portal. It is anticipated that this will evolve into a major resource for astronomy education. Authors and topic editors are now being recruited for the Encyclopedia of the Cosmos.

  2. Construction of high-dimensional universal quantum logic gates using a Λ system coupled with a whispering-gallery-mode microresonator.

    Science.gov (United States)

    He, Ling Yan; Wang, Tie-Jun; Wang, Chuan

    2016-07-11

High-dimensional quantum systems provide a higher quantum channel capacity, which has potential applications in quantum information processing. However, high-dimensional universal quantum logic gates are difficult to achieve directly with only high-dimensional interactions between two quantum systems, and a large number of two-dimensional gates is required to build even a small high-dimensional quantum circuit. In this paper, we propose a scheme to implement a general controlled-flip (CF) gate in which a high-dimensional single photon serves as the target qudit and stationary qubits serve as the control logic qudit, by employing a three-level Λ-type system coupled with a whispering-gallery-mode microresonator. In our scheme, the required number of interactions between the photon and the solid-state system is greatly reduced compared with the traditional method, which decomposes the high-dimensional Hilbert space into two-dimensional subspaces, and the scheme can be realized experimentally on a shorter time scale. Moreover, we discuss the performance and feasibility of our hybrid CF gate, concluding that it can be easily extended to the 2n-dimensional case and is feasible with current technology.
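The controlled-flip gate described here generalizes CNOT to a qudit target: when the control is in |1⟩, the target undergoes a cyclic shift. A minimal numerical illustration of the gate's matrix action (not the photonic implementation itself):

```python
import numpy as np

def controlled_flip(d):
    """Controlled-flip (CF) gate on a 2-level control and a d-level target:
    when the control is |1>, apply the generalized Pauli-X (cyclic shift)
    X_d|k> = |(k+1) mod d> to the target qudit."""
    X_d = np.roll(np.eye(d), 1, axis=0)   # cyclic shift matrix
    P0 = np.diag([1.0, 0.0])              # |0><0| on the control
    P1 = np.diag([0.0, 1.0])              # |1><1| on the control
    return np.kron(P0, np.eye(d)) + np.kron(P1, X_d)

CF = controlled_flip(3)
assert np.allclose(CF @ CF.conj().T, np.eye(6))   # the gate is unitary
# Control |1>, target |2>  ->  control |1>, target |0>  (for d = 3)
out = CF @ np.kron([0, 1], [0, 0, 1])
```

For d = 2 this reduces to the ordinary CNOT, which is one way to see why a single high-dimensional CF interaction can replace many two-dimensional gates.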

  3. Digital Downsides: Exploring University Students' Negative Engagements with Digital Technology

    Science.gov (United States)

    Selwyn, Neil

    2016-01-01

    Digital technologies are now an integral feature of university study. As such, academic research has tended to concentrate on the potential of digital technologies to support, extend and even "enhance" student learning. This paper, in contrast, explores the rather more messy realities of students' engagements with digital technology. In…

  4. High dimensional entanglement

    CSIR Research Space (South Africa)

McLaren, M.

    2012-07-01

Full Text Available High dimensional entanglement M. McLAREN1,2, F.S. ROUX1 & A. FORBES1,2,3 1. CSIR National Laser Centre, PO Box 395, Pretoria 0001 2. School of Physics, University of Stellenbosch, Private Bag X1, 7602, Matieland 3. School of Physics, University of KwaZulu...

  5. Digital Technologies as Education Innovation at Universities

    Science.gov (United States)

    Kryukov, Vladimir; Gorin, Alexey

    2017-01-01

    This paper analyses the use of digital technology-based education innovations in higher education. It demonstrated that extensive implementation of digital technologies in universities is the main factor conditioning the acceleration of innovative changes in educational processes, while digital technologies themselves become one of the key…

  6. MILLION BOOK UNIVERSAL DIGITAL LIBRARY PROJECTS: INDIA

    OpenAIRE

    Waghmode, S. S.

    2009-01-01

Digital Library of India is a digital library of books, which is free-to-read, searchable, predominantly in Indian languages, and available to everyone over the Internet. Very soon this portal is expected to provide a gateway to Indian digital libraries in Science, Arts, Culture, Music, Movies, Traditional Medicine, Palm Leaves and many more. The project is a collaboration between the Indian Institute of Science, Bangalore, Indian universities and Carnegie Mellon University under MILLION BOOK UNIVE...

  7. University Libraries and Digital Learning Environments

    OpenAIRE

    2011-01-01

    University libraries around the world have embraced the possibilities of the digital learning environment, facilitating its use and proactively seeking to develop the provision of electronic resources and services. The digital environment offers opportunities and challenges for librarians in all aspects of their work – in information literacy, virtual reference, institutional repositories, e-learning, managing digital resources and social media. The authors in this timely book are leading exp...

  8. Developing Digital Technologies for Undergraduate University Mathematics

    DEFF Research Database (Denmark)

    Triantafyllou, Eva; Timcenko, Olga

    2013-01-01

    Our research effort presented in this paper relates with developing digital tools for mathematics education at undergraduate university level. It focuses specifically on studies where mathematics is not a core subject but it is very important in order to cope with core subjects. For our design, we...... requirements for the development of digital tools that support mathematics teaching and learning at university level....... during lectures and exercise time. During these observations we were able to investigate how the applets were used in practice but also to get insight in the challenges that the students face during mathematics learning. These findings together with student feedback inspire the next round of design...

  9. Clustering high dimensional data

    DEFF Research Database (Denmark)

    Assent, Ira

    2012-01-01

High-dimensional data, i.e., data described by a large number of attributes, pose specific challenges to clustering. The so-called 'curse of dimensionality', coined originally to describe the general increase in complexity of various computational problems as dimensionality increases, is known to render traditional clustering algorithms ineffective. The curse of dimensionality, among other effects, means that with an increasing number of dimensions, a loss of meaningful differentiation between similar and dissimilar objects is observed. As high-dimensional objects appear almost alike, new approaches for clustering are required. Consequently, recent research has focused on developing techniques and clustering algorithms specifically for high-dimensional data. Still, open research issues remain. Clustering is a data mining task devoted to the automatic grouping of data based on mutual similarity. Each cluster...
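The loss of differentiation the abstract describes can be observed directly: for uniform random data, the relative gap between a query point's farthest and nearest neighbour shrinks as dimensionality grows. A small illustrative sketch (data and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_contrast(dim, n_points=200):
    """Relative contrast (d_max - d_min) / d_min between the farthest and
    nearest neighbour of a random query point, for uniform random data."""
    X = rng.random((n_points, dim))
    q = rng.random(dim)
    d = np.linalg.norm(X - q, axis=1)
    return (d.max() - d.min()) / d.min()

for dim in (2, 10, 100, 1000):
    print(dim, round(distance_contrast(dim), 3))  # contrast shrinks with dim
```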

  10. Modeling High-Dimensional Multichannel Brain Signals

    KAUST Repository

    Hu, Lechuan; Fortin, Norbert J.; Ombao, Hernando

    2017-01-01

    aspects: first, there are major statistical and computational challenges for modeling and analyzing high-dimensional multichannel brain signals; second, there is no set of universally agreed measures for characterizing connectivity. To model multichannel

  11. Digital reading practices of university students

    Directory of Open Access Journals (Sweden)

    Karen Shirley LÓPEZ GIL

    2016-06-01

Full Text Available This paper presents results of research on digital reading. The main objective of the research was to analyze the on-screen reading practices of university students and how these practices are guided by professors and institutions of higher education. The research design was mixed and the study was descriptive and cross-sectional. Data were collected through a questionnaire, document analysis and a discussion group. IBM SPSS v.22 was used for the statistical treatment of the data and Atlas.ti 7.0 for the content analysis of the qualitative information. The study showed that students usually read on screens, although many of their reading practices have recreational purposes. Students have trouble finding reliable information on the Internet for academic purposes and frequently consult secondary sources. When texts are on screens, students generally scan the information and surf from one document to another along hyperlinks. The boundaries between academic and leisure activities are not well defined; multitasking appears frequently. Students indicate that they receive little guidance from their professors or university. These findings show that students constantly face digital reading, but their practices do not always allow them to achieve their academic purposes, so it is necessary to strengthen the support offered to them, mainly from the language classroom.

  12. Digitizing Villanova University's Eclipsing Binary Card Catalogue

    Science.gov (United States)

    Guzman, Giannina; Dalton, Briana; Conroy, Kyle; Prsa, Andrej

    2018-01-01

Villanova University’s Department of Astrophysics and Planetary Science has years of hand-written archival data on Eclipsing Binaries at its disposal. This card catalog began at Princeton in the 1930s with notable contributions from scientists such as Henry Norris Russell. During World War II, the archive was moved to the University of Pennsylvania, which was one of the world centers for Eclipsing Binary research; consequently, the contributions to the catalog during this time were immense. It was then moved to the University of Florida at Gainesville before being accepted by Villanova in the 1990s. The catalog has been kept in storage since then. The objective of this project is to digitize this archive and create a fully functional online catalog that contains the information available on the cards, along with scans of the actual cards. Our group has built a database using a Python-powered infrastructure to contain the collected data. The team also built a prototype web-based searchable interface as a front-end to the catalog. Following the data-entry process, information such as the Right Ascension and Declination will be run against SIMBAD, and any differences between values will be noted as part of the catalog. Information published online from the card catalog, and even discrepancies in the information for a star, could be a catalyst for new studies of these Eclipsing Binaries. Once completed, the database-driven interface will be made available to astronomers worldwide. The group will also compile, from the database, a list of referenced articles that have yet to be found online in order to pursue their digitization. This list will comprise references on the cards that were found neither on ADS nor elsewhere online during the data-entry process. Integrating these references into online queries such as ADS will be an ongoing process that will contribute to and further facilitate studies of Eclipsing Binaries.
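The coordinate cross-check described above could be sketched as follows. The star name, coordinates and tolerance here are hypothetical, and a real pipeline would fetch the reference positions from SIMBAD rather than a hard-coded dictionary:

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in arcseconds between two sky positions
    given in decimal degrees (atan2 form, numerically stable)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    dra = ra2 - ra1
    num = math.hypot(
        math.cos(dec2) * math.sin(dra),
        math.cos(dec1) * math.sin(dec2)
        - math.sin(dec1) * math.cos(dec2) * math.cos(dra),
    )
    den = (math.sin(dec1) * math.sin(dec2)
           + math.cos(dec1) * math.cos(dec2) * math.cos(dra))
    return math.degrees(math.atan2(num, den)) * 3600.0

def flag_discrepancies(card, reference, tol_arcsec=5.0):
    """Return names whose card-catalogue position differs from the
    reference position by more than tol_arcsec."""
    return [name for name, (ra, dec) in card.items()
            if angular_separation(ra, dec, *reference[name]) > tol_arcsec]

card = {"RT And": (353.0296, 53.0217)}        # hypothetical card entry
reference = {"RT And": (353.0296, 53.0244)}   # ~10 arcsec away in Dec
print(flag_discrepancies(card, reference))
```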

  13. Digital Natives Revisited: Developing Digital Wisdom in the Modern University

    Science.gov (United States)

    Harris, David

    2012-01-01

    The seminal work of Prensky on "digital natives" and "digital wisdom" is used to launch a broader discussion on the relations between electronic communication, higher education, and popular and elite culture. Prensky's critics commonly contrast his polarisations and generational divisions with a more complex picture of types of engagement with…

  14. A Study of Digital Communications between Universities and Students

    Science.gov (United States)

    Drake, Perry D.

    2017-01-01

    This study examined the digital and social media communication practices of nine urban universities including UMSL and compared those to known corporate best practices. The purpose of this study was to (1) research how these universities are using social/digital communications to engage with students and prospective students; (2) compare the…

  15. Different conceptions of digital university in Ibero-America

    Directory of Open Access Journals (Sweden)

    Jesús Salinas Ibáñez

    2018-03-01

Full Text Available Recently, the term digital university has become a buzzword in different contexts in Spanish-speaking countries. However, no clear definition of the concept is available to help specify its implementation. This study therefore analyzed the different existing conceptions of the digital university in Ibero-America, via a systematic literature review and an analysis of the semantic field through network analysis. This enabled us to identify what is understood by a digital university from the perspective of different Ibero-American universities. Spanish keywords derived from the thesaurus, including "digital university", were used to search diverse catalogs and databases, and documents of different kinds published between 2007 and 2017 were found. The literature review was based on the abstracts of the documents, and we created categories with diverse topics that represent how Ibero-America understands the digital university and which restrictions it has to deal with. The discussion integrates those categories and topics into the digital university models previously identified. In conclusion, a summary of the concept of the digital university is presented, together with some remarks on making the conception a reality in practice.

  16. hdm: High-dimensional metrics

    OpenAIRE

    Chernozhukov, Victor; Hansen, Christian; Spindler, Martin

    2016-01-01

In this article the package High-dimensional Metrics (hdm) is introduced. It is a collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e...
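hdm itself is an R package; the inference it provides for a target coefficient is based on the post-double-selection idea (select controls with a lasso for the outcome and again for the target regressor, then refit by unpenalized least squares). A rough Python sketch of that idea, with simulated data and arbitrary tuning values, not hdm's actual implementation:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
n, p = 200, 50
X = rng.standard_normal((n, p))                  # many potential controls
d = X[:, 0] + 0.5 * rng.standard_normal(n)       # target regressor
y = 2.0 * d + X[:, 0] + rng.standard_normal(n)   # true effect of d is 2.0

def lasso_support(target, alpha=0.1):
    """Indices of controls with nonzero lasso coefficients."""
    return set(np.flatnonzero(Lasso(alpha=alpha).fit(X, target).coef_))

# Union of controls selected for the outcome and for the target regressor,
# then an unpenalized refit to read off the effect of d.
keep = sorted(lasso_support(y) | lasso_support(d))
effect = LinearRegression().fit(np.column_stack([d, X[:, keep]]), y).coef_[0]
print(round(effect, 2))  # close to the true value 2.0
```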

  17. Mining High-Dimensional Data

    Science.gov (United States)

    Wang, Wei; Yang, Jiong

With the rapid growth of computational biology and e-commerce applications, high-dimensional data have become very common. Thus, mining high-dimensional data is an urgent problem of great practical importance. However, there are some unique challenges in mining data of high dimensions, including (1) the curse of dimensionality and, more crucially, (2) the meaningfulness of the similarity measure in high-dimensional space. In this chapter, we present several state-of-the-art techniques for analyzing high-dimensional data, e.g., frequent pattern mining, clustering, and classification. We will discuss how these methods deal with the challenges of high dimensionality.
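Frequent pattern mining, one of the techniques surveyed, can be illustrated with a minimal Apriori-style sketch (the basket data are made up for the example):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Apriori-style frequent pattern mining: grow candidate itemsets one
    level at a time, keeping only those contained in at least min_support
    transactions (anti-monotonicity prunes supersets of rare sets)."""
    transactions = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    frequent = {}
    level = [frozenset([i]) for i in {i for t in transactions for i in t}]
    while level:
        level = [s for s in level if support(s) >= min_support]
        frequent.update({s: support(s) for s in level})
        # Join frequent k-itemsets into candidate (k+1)-itemsets
        level = list({a | b for a, b in combinations(level, 2)
                      if len(a | b) == len(a) + 1})
    return frequent

baskets = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
freq = frequent_itemsets(baskets, min_support=3)
```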

  18. The Worldly Space: The Digital University in Network Time

    Science.gov (United States)

    Hassan, Robert

    2017-01-01

    This article considers the effect of information technology upon teaching, learning and research in the "digital university". In less than a generation the university has become a business like any other. It does so in the determining context of neoliberal globalisation and the computer revolution. The university develops through what we…

  19. An Appraisal of Digital Reference Services in Nigerian University ...

    African Journals Online (AJOL)

This study examined digital reference services in Nigerian university libraries. A descriptive research design of the ex-post facto type was adopted in the study. The instrument used in collecting data for this study was the questionnaire. One hundred and twenty respondents were randomly selected from six Nigerian universities ...

  20. Awareness and use of digital assistants by University of Ilorin ...

    African Journals Online (AJOL)

Awareness and use of digital assistants by University of Ilorin postgraduate students. ... agents that help people with communication, information and time management. ... the use of DAs by postgraduate students of the University of Ilorin, Nigeria.

  1. Are Digital Natives a Myth or Reality? University Students' Use of Digital Technologies

    Science.gov (United States)

    Margaryan, Anoush; Littlejohn, Allison; Vojt, Gabrielle

    2011-01-01

    This study investigated the extent and nature of university students' use of digital technologies for learning and socialising. The findings show that students use a limited range of mainly established technologies. Use of collaborative knowledge creation tools, virtual worlds, and social networking sites was low. "Digital natives" and students of…

  2. The Digital Divide among University Freshmen

    Science.gov (United States)

    Ricoy, Carmen; Feliz, Tiberio; Couto, Maria Joao

    2013-01-01

    Use of new technologies in university training is an ongoing reality today. However, the inequalities that exist among university students are the source of an important problem. Such inequalities need to be detected and analyzed and therefore a study of college freshmen can be very valuable. This qualitative study intends to analyze the digital…

  3. Eight-Channel Digital Signal Processor and Universal Trigger Module

    Science.gov (United States)

    Skulski, Wojtek; Wolfs, Frank

    2003-04-01

    A 10-bit, 8-channel, 40 megasamples per second digital signal processor and waveform digitizer DDC-8 (nicknamed Universal Trigger Module) is presented. The digitizer features 8 analog inputs, 1 analog output for a reconstructed analog waveform, 16 NIM logic inputs, 8 NIM logic outputs, and a pool of 16 TTL logic lines which can be individually configured as either inputs or outputs. The first application of this device is to enhance the present trigger electronics for PHOBOS at RHIC. The status of the development and the first results are presented. Possible applications of the new device are discussed. Supported by the NSF grant PHY-0072204.

  4. The Digitally Disadvantaged: Access to Digital Communication Technologies among First Year Students at a Rural South African University

    Science.gov (United States)

    Oyedemi, Toks; Mogano, Saki

    2018-01-01

    Considering the importance of digital skills in university education, this article reports on a study which examined access to technology among first year students at a rural South African university. The study focused on the digital readiness of students prior to their admission to the university, since many universities provide access to…

  5. Ghosts in the Machine: Incarcerated Students and the Digital University

    Science.gov (United States)

    Hopkins, Susan

    2015-01-01

    Providing higher education to offenders in custody has become an increasingly complex business in the age of digital learning. Most Australian prisoners still have no direct access to the internet and relatively unreliable access to information technology. As incarceration is now a business, prisons, like universities, are increasingly subject to…

  6. Digitization Of Federal University Libraries In Nigeria: A Doyen ...

    African Journals Online (AJOL)

The Federal Government of Nigeria's proposal to digitize its university libraries is a step in the right direction. The proposal, amongst others, is intended to enhance intellectual and manpower development as well as halt and reverse the falling standard of education in the country. The study discovered that for effective ...

  7. High-dimensional covariance estimation with high-dimensional data

    CERN Document Server

    Pourahmadi, Mohsen

    2013-01-01

    Methods for estimating sparse and large covariance matrices Covariance and correlation matrices play fundamental roles in every aspect of the analysis of multivariate data collected from a variety of fields including business and economics, health care, engineering, and environmental and physical sciences. High-Dimensional Covariance Estimation provides accessible and comprehensive coverage of the classical and modern approaches for estimating covariance matrices as well as their applications to the rapidly developing areas lying at the intersection of statistics and mac
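One of the simplest estimators covered in this literature is entrywise thresholding of the sample covariance matrix. A minimal sketch, with simulated data and an arbitrary threshold level:

```python
import numpy as np

def threshold_covariance(X, tau):
    """Sparse covariance estimate: soft-threshold every off-diagonal entry
    of the sample covariance at level tau, leaving variances untouched."""
    S = np.cov(X, rowvar=False)                        # rows = observations
    T = np.sign(S) * np.maximum(np.abs(S) - tau, 0.0)  # soft thresholding
    np.fill_diagonal(T, np.diag(S))
    return T

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))       # truly uncorrelated variables
S_hat = threshold_covariance(X, tau=0.3)
```

With uncorrelated data the sampling noise in the off-diagonal entries falls below the threshold, so the estimate comes out diagonal, which is the sparsity the method is designed to recover.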

  8. Rhizomes and plateaus: A study of digital communities of practice in University College English Teaching

    DEFF Research Database (Denmark)

    Kjærgaard, Thomas

    2017-01-01

Rhizomes and plateaus: A study of digital communities of practice in University College English Teaching

  9. The Information Seeking Behavior of Digital Native and Digital Immigrant Students of Bogor Agricultural University

    Directory of Open Access Journals (Sweden)

    Janti Gristinawati Sujana

    2018-02-01

Full Text Available Technological expansion and the changing ways individuals gain access to information have deeply impacted the structure of libraries, physically as well as conceptually. A new generation of digital services platforms for libraries is emerging, designed to provide a more comprehensive approach to the management of, and access to, all formats of library materials. Despite the modernization of libraries and their adaptation to the digital age, the library still holds a critical role within the community in serving its users, continuing to be a beacon of information sharing, learning, and entertainment even amidst tight fiscal times. As the library of one of the leading universities in Indonesia, Bogor Agricultural University Library must find solutions to new challenges, overhaul many of its entrenched business processes, and foster systems that engage students. This study examined the information seeking behavior of the digital native and digital immigrant students of Bogor Agricultural University, in order to remind the library that its users have changed and to recommend new services the library should offer. The similarities and differences in the information seeking of these two groups of students are discussed.

  10. Digitally Programmable High-Q Voltage Mode Universal Filter

    Directory of Open Access Journals (Sweden)

    D. Singh

    2013-12-01

Full Text Available A new low-voltage low-power CMOS current feedback amplifier (CFA) is presented in this paper. This is used to realize a novel digitally programmable CFA (DPCFA) using transistor arrays and MOS switches. The proposed realizations allow nearly rail-to-rail swing at all ports. A class-AB output stage ensures low power dissipation and high current drive capability. The proposed CFA/DPCFA operates at a supply voltage of ±0.75 V and exhibits a bandwidth better than 95 MHz. An application of the DPCFA to realize a novel voltage-mode high-Q digitally programmable universal filter (UF) is given. The performance of all the proposed circuits is verified by PSPICE simulation using TSMC 0.25 μm technology parameters.

  11. Investigating the Digital Addiction Level of the University Students According to Their Purposes for Using Digital Tools

    Science.gov (United States)

    Kesici, Ahmet; Tunç, Nazenin Fidan

    2018-01-01

    This study was carried out to investigate the digital addiction (DA) level of the university students according to their purposes for using digital tools. 527 students studying at the faculties of education of Erzincan, Dicle, and Siirt Universities participated this study in which general survey model was used. A form was used to reveal for which…

  12. University students’ self-regulated learning using digital technologies

    Directory of Open Access Journals (Sweden)

    Carmen Yot-Domínguez

    2017-11-01

Full Text Available Analysing the process by which students, whether at university or not, manage and facilitate their own learning has been a recurrent educational research problem. Recently, the question has arisen of how the development of strategies during this process could be facilitated by technologies. In an effort to know whether university students really use digital technologies to plan, organize and facilitate their own learning, we posed three research questions. Which technologies do university students use to self-regulate their learning? What self-regulated learning strategies do they develop using technologies? What profiles can be identified among students based on their use of self-regulation strategies with technology? To answer these questions, the "Survey of Self-regulated Learning with Technology at the University" was designed. Information was collected with this survey from a sample of 711 students from various universities in the region of Andalusia (Spain). The results indicate that university students, even when they are frequent users of digital technology, tend not to use these technologies to regulate their own learning process. Of all the technologies analysed, Internet information search and instant communication tools are used continually. In turn, the most generalised self-regulated learning strategies are those relating to social support. Nevertheless, students differ in their use and its frequency: there are groups of students who do make use of self-regulation strategies when learning with technologies. In this regard, two distinct groups of students with differentiated self-regulation levels have been identified.

  13. Modeling high dimensional multichannel brain signals

    KAUST Repository

    Hu, Lechuan

    2017-03-27

In this paper, our goal is to model functional and effective (directional) connectivity in a network of multichannel brain physiological signals (e.g., electroencephalograms, local field potentials). The primary challenges here are twofold: first, there are major statistical and computational difficulties in modeling and analyzing high dimensional multichannel brain signals; second, there is no set of universally agreed measures for characterizing connectivity. To model multichannel brain signals, our approach is to fit a vector autoregressive (VAR) model with sufficiently high order so that complex lead-lag temporal dynamics between the channels can be accurately characterized. However, such a model contains a large number of parameters. Thus, we estimate the high dimensional VAR parameter space by our proposed hybrid LASSLE method (LASSO+LSE), which imposes regularization in the first step (to control sparsity) and constrained least squares estimation in the second step (to improve the bias and mean-squared error of the estimator). Then, to characterize connectivity between channels in a brain network, we use various measures but put an emphasis on partial directed coherence (PDC) in order to capture directional connectivity between channels. PDC is a directed frequency-specific measure that explains the extent to which the present oscillatory activity in a sender channel influences the future oscillatory activity in a specific receiver channel relative to all possible receivers in the network. Using the proposed modeling approach, we obtained some insights into learning in a rat engaged in a non-spatial memory task.
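The hybrid two-stage idea can be sketched on a toy VAR(1): a lasso picks the sparsity pattern of the transition matrix, then unpenalized least squares refits the retained coefficients. This is a rough illustration of the LASSO+LSE structure, not the paper's exact estimator, and the tuning values are arbitrary:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)

# Simulate a small sparse VAR(1): X_t = A X_{t-1} + noise
p, T = 8, 400
A = 0.5 * np.eye(p)                    # sparse ground-truth transition matrix
X = np.zeros((T, p))
for t in range(1, T):
    X[t] = A @ X[t - 1] + rng.standard_normal(p)

Y, Z = X[1:], X[:-1]                   # responses and lagged design

def lassle(Y, Z, alpha=0.05):
    """Two-stage estimate per channel: a lasso selects the support,
    then unpenalized least squares refits the retained coefficients."""
    A_hat = np.zeros((Y.shape[1], Z.shape[1]))
    for j in range(Y.shape[1]):
        support = np.flatnonzero(Lasso(alpha=alpha).fit(Z, Y[:, j]).coef_)
        if support.size:
            ls = LinearRegression(fit_intercept=False)
            A_hat[j, support] = ls.fit(Z[:, support], Y[:, j]).coef_
    return A_hat

A_hat = lassle(Y, Z)   # diagonal entries recovered near 0.5
```

The refit step is what reduces the bias of the lasso estimates: the selected coefficients are no longer shrunk toward zero.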

  15. The ‘universal library’ returns in digital form

    Directory of Open Access Journals (Sweden)

    Andy White

    2009-04-01

Full Text Available This paper begins with two contemporary technological developments, both of which present a serious challenge to the dominance in literature and learning of the book in its codex form. While their earlier manifestations were not commercially successful, the most recent e-book readers (portable technological contraptions that have the capacity to store thousands of books that can be read electronically) have been praised not only for their functionality but also for their aesthetic appeal. A related development has been the growth of large-scale digital libraries, the most prominent of which is the Google Books Library Project, launched in 2004 and now committed to digitising around 15 million volumes, or 4.5 billion pages, over the following six years from some of the world's leading academic libraries. The purpose of this paper is to explore these developments within the context of ancient and Enlightenment ideas about the 'universal library', which assert that the construction of such an institution is the most effective way of promoting universal knowledge. Rather than employing a kind of technological determinism that renders these technologies merely points along an inexorable continuum of progress, it will be argued that they are the latest manifestation of an idea that long pre-dates digital technology. Over two millennia ago, the Ptolemies attempted to collect the entire corpus of literature in the Greek language as well as significant works in other languages. Many have argued that the institution that held these huge collections, the library at Alexandria, was effectively the world's first universal library. Even though this library was eventually destroyed, the idea of universalism survived and flourished again during the European Enlightenment, through Diderot's Encyclopédie project and the construction of national libraries and archives. Latterly, the creation of the World Wide Web is conceived of by some as the

  16. Exploring the Digital Universe with Europe's Astrophysical Virtual Observatory

    Science.gov (United States)

    2001-12-01

    N° 73-2001 - Paris, 5 December 2001 The aim of AVO is to give astronomers instant access to the vast databanks now being built up by the world's observatories and forming what is in effect a "digital sky". Using AVO astronomers will be able, for example, to retrieve the elusive traces of the passage of an asteroid as it passes the Earth and so predict its future path and perhaps warn of a possible impact. When a giant star comes to the end of its life in a cataclysmic explosion called a supernova, they will be able to access the digital sky and pinpoint the star shortly before it exploded, adding invaluable data to the study of the evolution of stars. Modern observatories observe the sky continuously and data accumulates remorselessly in the digital archives. The growth rate is impressive and many hundreds of terabytes of data -corresponding to many thousands of billions of pixels - are already available to scientists. The real sky is being digitally reconstructed in the databanks. The volume and complexity of data and information available to astronomers are overwhelming. Hence the problem of how astronomers can possibly manage, distribute and analyse this great wealth of data. The Astrophysical Virtual Observatory will enable them to meet the challenge and "put the Universe online". AVO is a three-year project, funded by the European Commission under its Research and Technological Development (RTD) scheme, to design and implement a virtual observatory for the European astronomical community. The Commission has awarded a contract valued at EUR 4m for the project, starting on 15 November. AVO will provide software tools to enable astronomers to access the multi-wavelength data archives over the Internet and so give them the capability to resolve fundamental questions about the Universe by probing the digital sky. Equivalent searches of the "real" sky would, in comparison, both be prohibitively costly and take far too long. 
Towards a Global Virtual Observatory The

  17. High-Speed Universal Frequency-to-Digital Converter for Quasi-Digital Sensors and Transducers

    Directory of Open Access Journals (Sweden)

    Sergey Y. Yurish

    2007-06-01

A new fast, accurate universal integrated frequency-to-digital converter (UFDC-1M-16) is described in the article. It is based on the novel patented modified method of the dependent count and has a non-redundant conversion time from 6.25 µs to 6.25 ms for relative errors of 1 to 0.001 % respectively, comparable with the conversion times of successive-approximation and Σ-Δ ADCs. The IC can work with different sensors, transducers and encoders that have frequency, period, duty-cycle, PWM, phase-shift, pulse-number, etc. outputs.
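The patented method of the dependent count is not spelled out in this abstract. As a hedged illustration of the general idea it exploits - trading conversion time for a programmable relative error, independently of the input frequency - here is a classic reciprocal-counting sketch (our simplification, not Yurish's method; the 16 MHz reference clock and the function name are assumptions):

```python
# Simplified reciprocal (period) counting -- a software sketch of how a
# frequency-to-digital converter trades conversion time for accuracy.
# NOTE: this is NOT the patented "method of the dependent count" used by
# the UFDC-1M-16; it is a classic reciprocal-counting illustration. The
# gate is synchronized with the input signal, so the +/-1-count
# quantization error applies to the reference clock: the relative error
# is about 1/N_ref regardless of the input frequency.

F_REF = 16_000_000  # assumed 16 MHz reference clock (hypothetical figure)

def measure_frequency(f_in: float, relative_error: float) -> tuple[float, float]:
    """Return (estimated frequency in Hz, conversion time in seconds)."""
    n_ref_target = int(1.0 / relative_error)      # reference counts needed
    # The gate must span a whole number of input periods.
    n_in = max(1, round(n_ref_target * f_in / F_REF))
    gate_time = n_in / f_in                       # exact input periods
    n_ref = round(gate_time * F_REF)              # counted reference pulses
    f_est = n_in * F_REF / n_ref                  # f = N_in / (N_ref / F_ref)
    return f_est, gate_time
```

For a 1 kHz input and 0.1 % error this yields a 1 ms conversion; for a 1 MHz input and 0.01 % error, 625 µs. Conversion time scales with the requested accuracy rather than with the measured frequency, which is the property the dependent-count method exploits.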

  18. The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Directory of Open Access Journals (Sweden)

    Meyer Z. Pesenson

    2010-01-01

Recent and forthcoming advances in instrumentation, and giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data sets create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have been addressed by other disciplines such as applied mathematics, statistics and machine learning, and have been utilized by other sciences such as space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments that are vital to the analysis and visualization of complex datasets and images. One of the goals of this paper is to help bridge the gap between applied mathematics and artificial intelligence on the one side and astronomy on the other.

  19. Universal Service in a Broader Perspective: The European Digital Divide

    Directory of Open Access Journals (Sweden)

    Maria Concepcion GARCIA-JIMENEZ

    2009-01-01

Ensuring universal service is a top objective in many countries so that all citizens can have access to basic communications services. Although ICT equipment in households and its usage by individuals are essential prerequisites for benefiting from ICTs, the situation in the European Union is far from uniform. This article describes the development of the European information society using the values reached by the member states on a set of indicators selected for measuring this progress in households. Two tools are used to provide a broader perspective on the digital divide: a composite index and cluster analysis. Finally, a study is provided of which variables are relevant for interpreting the situation presented.

  20. High-Dimensional Metrics in R

    OpenAIRE

    Chernozhukov, Victor; Hansen, Chris; Spindler, Martin

    2016-01-01

    The package High-dimensional Metrics (\\Rpackage{hdm}) is an evolving collection of statistical methods for estimation and quantification of uncertainty in high-dimensional approximately sparse models. It focuses on providing confidence intervals and significance testing for (possibly many) low-dimensional subcomponents of the high-dimensional parameter vector. Efficient estimators and uniformly valid confidence intervals for regression coefficients on target variables (e.g., treatment or poli...
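hdm itself is an R package. As a hedged, language-consistent illustration of the kind of inference it targets (valid confidence statements for a low-dimensional treatment coefficient inside a high-dimensional sparse model), here is a numpy-only sketch of post-double-selection; this is a simplified stand-in, not the package's algorithm, and the minimal coordinate-descent lasso and the penalty level `lam` are our assumptions:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent lasso for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

def double_selection(y, d, X, lam=0.1):
    """Post-double-selection: keep controls that predict the outcome y OR
    the treatment d, then run plain OLS of y on d plus the selected union."""
    keep_y = np.flatnonzero(lasso_cd(X, y, lam))
    keep_d = np.flatnonzero(lasso_cd(X, d, lam))
    selected = np.union1d(keep_y, keep_d).astype(int)
    Z = np.column_stack([np.ones(len(y)), d, X[:, selected]])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coef[1]  # estimated treatment effect
```

Selecting on both equations is what protects the treatment coefficient against omitted-variable bias from imperfect lasso selection, which is the key idea behind the uniformly valid intervals the package provides.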

  1. Exploring the Digital Universe with Europe's Astrophysical Virtual Observatory

    Science.gov (United States)

    2001-12-01

Vast Databanks at the Astronomers' Fingertips. Summary: A new European initiative called the Astrophysical Virtual Observatory (AVO) is being launched to provide astronomers with a breathtaking potential for new discoveries. It will enable them to seamlessly combine the data from both ground- and space-based telescopes which are making observations of the Universe across the whole range of wavelengths - from high-energy gamma rays through the ultraviolet and visible to the infrared and radio. The aim of the Astrophysical Virtual Observatory (AVO) project, which started on 15 November 2001, is to allow astronomers instant access to the vast databanks now being built up by the world's observatories and which are forming what is, in effect, a "digital sky". Using the AVO, astronomers will, for example, be able to retrieve the elusive traces of the passage of an asteroid as it passes near the Earth and so enable them to predict its future path and perhaps warn of a possible impact. When a giant star comes to the end of its life in a cataclysmic explosion called a supernova, they will be able to access the digital sky and pinpoint the star shortly before it exploded, so adding invaluable data to the study of the evolution of stars. Background information on the Astrophysical Virtual Observatory is available in the Appendix. ESO PR Photo 34a/01 shows an artist's impression of the Astrophysical Virtual Observatory. The rapidly accumulating database: Modern observatories observe the sky continuously and data accumulates remorselessly in the digital archives. The growth rate is impressive and many hundreds of terabytes of data - corresponding to many thousands of billions of pixels - are already available to scientists.
The real sky is being

  2. Predicting Digital Informal Learning: An Empirical Study among Chinese University Students

    Science.gov (United States)

    He, Tao; Zhu, Chang; Questier, Frederik

    2018-01-01

Although the adoption of digital technology has gained considerable attention in higher education, current research mainly focuses on implementation in formal learning contexts. The factors that influence students' digital informal learning remain unclear and under-investigated. To better understand university students' digital informal…

  3. Application of Digital Cybersecurity Approaches to University Management--VFU SMART STUDENT

    Science.gov (United States)

    Nedyalkova, Anna; Bakardjieva, Teodora; Nedyalkov, Krasimir

    2016-01-01

This paper suggests digital approaches to university management. Digital transformation requires leadership that can maintain and balance competing interests from faculty, administrators, students and others. The team of Varna Free University designed a flexible solution, VFU SMART STUDENT, aiming at lower operating costs and better…

  4. A digital platform for university education on geomorphosites

    Science.gov (United States)

    Coratza, Paola; Reynard, Emmanuel; Cayla, Nathalie; Comanescu, Laura; Darbellay, Lucie; Giusti, Christian; Grecu, Florina; Pereira, Paulo

    2016-04-01

The working group on geomorphosites of the International Association of Geomorphologists (IAG) has been active since 2001 and has developed research activities on issues related to the geomorphological heritage (geomorphosites) (Reynard and Coratza, 2013). In parallel to the research activities, several intensive courses for Ph.D. and Master students have been organized since 2006 in various universities (Lausanne, Lesvos, Minho, Savoie, Beni Mellal) and a textbook for students was edited in 2009 (Reynard et al., 2009). The platform INTERGEO is designed to disseminate knowledge on the geomorphological heritage, in particular in universities of developing countries where access to scientific papers and textbooks is not easy. It aims at improving students' autonomy by reducing frontal teaching and increasing autonomous learning, as well as promoting international interactions between students interested in geomorphosite topics. The course, developed with the Learning Management System Moodle, is completely free to access. It is divided into four parts: (1) Generalities - definitions, links with heritage and landscape studies, active geomorphosites, the IAG working group; (2) Methods - selection and assessment, mapping issues, geomorphosite visualization, technical and digital tools in geomorphosite studies; (3) Conservation and promotion - examples of geomorphosite studies related to geoconservation, geoparks, protected areas, World Heritage Sites, geotourism and interpretation, and natural hazards; (4) Examples - cultural, karstic, coastal, mountainous, fluvial, volcanic and anthropogenic geomorphosites. Each chapter contains a short description, a list of references, selected publications, and other educational material, e.g. videos, virtual fieldtrips, etc. In particular, several videos present concepts and examples in a very dynamic way. The content of the course will evolve and will be completed in the future. Most of the content is

  5. Modeling High-Dimensional Multichannel Brain Signals

    KAUST Repository

    Hu, Lechuan

    2017-12-12

    Our goal is to model and measure functional and effective (directional) connectivity in multichannel brain physiological signals (e.g., electroencephalograms, local field potentials). The difficulties from analyzing these data mainly come from two aspects: first, there are major statistical and computational challenges for modeling and analyzing high-dimensional multichannel brain signals; second, there is no set of universally agreed measures for characterizing connectivity. To model multichannel brain signals, our approach is to fit a vector autoregressive (VAR) model with potentially high lag order so that complex lead-lag temporal dynamics between the channels can be captured. Estimates of the VAR model will be obtained by our proposed hybrid LASSLE (LASSO + LSE) method which combines regularization (to control for sparsity) and least squares estimation (to improve bias and mean-squared error). Then we employ some measures of connectivity but put an emphasis on partial directed coherence (PDC) which can capture the directional connectivity between channels. PDC is a frequency-specific measure that explains the extent to which the present oscillatory activity in a sender channel influences the future oscillatory activity in a specific receiver channel relative to all possible receivers in the network. The proposed modeling approach provided key insights into potential functional relationships among simultaneously recorded sites during performance of a complex memory task. Specifically, this novel method was successful in quantifying patterns of effective connectivity across electrode locations, and in capturing how these patterns varied across trial epochs and trial types.
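The partial directed coherence described in this abstract can be computed directly from fitted VAR coefficient matrices. A minimal numpy sketch of the standard PDC definition (an illustration, not the authors' code; the function name `pdc` is ours):

```python
import numpy as np

def pdc(A_list, freqs):
    """Partial directed coherence for a fitted VAR model.

    A_list : lag coefficient matrices A_1..A_K, each (p x p)
    freqs  : normalized frequencies in [0, 0.5]
    Returns an array of shape (len(freqs), p, p) whose [f, i, j] entry
    measures the directed influence of channel j on channel i at
    frequency f, normalized so each column of squares sums to one.
    """
    p = A_list[0].shape[0]
    out = np.zeros((len(freqs), p, p))
    for fi, f in enumerate(freqs):
        # Frequency response: A(f) = I - sum_k A_k * exp(-i 2 pi f k)
        Af = np.eye(p, dtype=complex)
        for k, Ak in enumerate(A_list, start=1):
            Af = Af - Ak * np.exp(-2j * np.pi * f * k)
        col_norm = np.sqrt((np.abs(Af) ** 2).sum(axis=0))
        out[fi] = np.abs(Af) / col_norm
    return out
```

Because of the column normalization, PDC expresses the outflow from a sender channel relative to all receivers, matching the interpretation given in the abstract; a zero coefficient from channel j to channel i in every lag matrix yields exactly zero PDC for that direction.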

  6. DIGITAL BROADCASTING and INTERACTIVE TELEVISION in DISTANCE EDUCATION: Digital And Interactive Television Infrastructure Proposol for Anadolu University Open Education Faculty

    Directory of Open Access Journals (Sweden)

    Reha Recep ERGUL

    2007-01-01

Rapid changes and improvements in communication and information technologies, beginning in the middle of the 20th century and continuing today, require new methods, structures and arrangements in the production and distribution of information. Television can present complex or difficult-to-comprehend concepts, subjects and experimental studies to learners from different points of view, supported by 2D or 3D graphics and animations with audio-visual stimulators. As television technology moves from analog to digital and towards digital-interactive, broadcasting in Turkey has also begun to convert in this direction. Therefore, the television broadcast infrastructure of Anadolu University Open Education Faculty needs to be replaced with a digital and interactive one. This study covers the basic concepts of digital and interactive broadcasting and recent developments. Furthermore, it discusses why and how a digital television broadcasting infrastructure should be established.

  7. Get the Digital Edge: linking students’ attitudes towards digital literacy and employability at the University of Westminster

    Directory of Open Access Journals (Sweden)

    Emma Woods

    2013-12-01

The University of Westminster is located in London and is celebrating 175 years as an educational institution this year. A key part of the University's vision is "building the next generation of highly employable global citizens to shape the future" (University of Westminster, 2013). This vision inspired us to look at the digital literacy skills our students need in order to be highly employable. In Spring 2012, the Information Services department at the University of Westminster secured Jisc funding to run a one-year project exploring students' attitudes towards digital literacy and its relationship to employability. The work is being carried out by a project board and a delivery group, which include members of staff from across the University who have an active interest in this area. We named the project "DigitISE" (Digital Information Skills for Employability) and colleagues involved with the project take turns writing for its blog http://blog.westminster.ac.uk/jisc-employability/blog/ A questionnaire was circulated to find out about students' attitudes towards digital literacy, and this was followed up by focus groups. The headline findings from the survey are that 87.6% of students love digital technology and 81.5% believe themselves to be digitally literate. Attitudes vary significantly between subject areas. For example, regarding the statement that the digital literacy skills needed in their courses get more complex as students progress, students from the Business School agreed significantly more than did students from the School of Law, Social Sciences, Humanities and Languages or Architecture. The focus groups supported the questionnaire findings and highlighted that students are largely unaware of the training already available to them. Ideas on how to market future training more effectively will therefore be an important outcome of this work.
A further focus group

  8. Global Digital Revolution and Africa: Transforming Nigerian Universities to World Class Institutions

    Science.gov (United States)

    Isah, Emmanuel Aileonokhuoya; Ayeni, A. O.

    2010-01-01

This study examined the global digital revolution and the transformation of Nigerian universities. The study reviewed university developments worldwide in line with what obtains in Nigeria, and highlighted the several challenges facing Nigerian universities, including poor funding, poor personnel and poor exposure to global…

  9. Student Communication and Study Habits of First-Year University Students in the Digital Era

    Science.gov (United States)

    Gallardo-Echenique, Eliana; Bullen, Mark; Marqués-Molías, Luis

    2016-01-01

This paper reports on research into the study habits of university students, their use of digital technologies and how they communicate with each other and with their professors. We conclude that most students feel comfortable with digital technologies and that they use social media for connecting and interacting with friends rather than for academic…

  10. Digital assessment in higher education: Promoting universal usability through requirements specification and universal design quality (UD-Q) reviews

    OpenAIRE

    Begnum, Miriam E. Nes; Foss-Pedersen, Rikke Julie

    2017-01-01

    Statistics show there is a clear relationship between higher education and employment in Norway, especially for people with disabilities. The use of digital assessment solutions is increasing in Norwegian higher education. The overall goal of this study is therefore to highlight the potential for improvement of current practices related to universal design, both for providers of digital assessment solutions and for higher education institutions. Based on a case study of practices in Norwegian...

  11. High dimensional neurocomputing growth, appraisal and applications

    CERN Document Server

    Tripathi, Bipin Kumar

    2015-01-01

The book presents a coherent understanding of computational intelligence from the perspective of what is known as "intelligent computing" with high-dimensional parameters. It critically discusses the central issues of high-dimensional neurocomputing, such as the quantitative representation of signals, extending the dimensionality of neurons, supervised and unsupervised learning, and the design of higher-order neurons. The strong point of the book is its clarity and the ability of the underlying theory to unify our understanding of high-dimensional computing where conventional methods fail. Plenty of application-oriented problems are presented for evaluating, monitoring and maintaining the stability of adaptive learning machines. The author has taken care to cover the breadth and depth of the subject, both qualitatively and quantitatively. The book is intended to enlighten the scientific community, ranging from advanced undergraduates to engineers, scientists and seasoned researchers in computational intelligenc...

  12. Asymptotically Honest Confidence Regions for High Dimensional

    DEFF Research Database (Denmark)

    Caner, Mehmet; Kock, Anders Bredahl

    While variable selection and oracle inequalities for the estimation and prediction error have received considerable attention in the literature on high-dimensional models, very little work has been done in the area of testing and construction of confidence bands in high-dimensional models. However...... develop an oracle inequality for the conservative Lasso only assuming the existence of a certain number of moments. This is done by means of the Marcinkiewicz-Zygmund inequality which in our context provides sharper bounds than Nemirovski's inequality. As opposed to van de Geer et al. (2014) we allow...

  13. Virtualization of Universities Digital Media and the Organization of Higher Education Institutions

    CERN Document Server

    Pfeffer, Thomas

    2012-01-01

    The purpose of this volume is to shape conceptual tools to understand the impact of new information and communication technologies (ICTs) on the organization of universities. Traditional research-based universities, the most typical representatives of the higher education system, find themselves challenged by the speed and the wide range of technical innovations, but also by a vast array of implicit assumptions and explicit promises associated with the distribution of digital media.  The author observes that as universities increasingly use digital media (computers and the Internet) to accomplish their tasks, a transformation takes place in an evolutionary rather than in a revolutionary way.  Using the University of Klagenfurt as an in-depth case study, he explores such dynamic issues as how digital media affect the practice of research, the preservation and dissemination of knowledge (for example, through publishing and archiving), and delivery of education at universities.  More broadly, he considers iss...

  14. Italian University Students and Digital Technologies: Some Results from a Field Research

    Science.gov (United States)

    Ferri, Paolo; Cavalli, Nicola; Costa, Elisabetta; Mangiatordi, Andrea; Mizzella, Stefano; Pozzali, Andrea; Scenini, Francesca

Developments in information and communication technologies have raised the issue of how an intergenerational digital divide can arise between "digital natives" and "digital immigrants". This can in turn have important consequences for the organization of educational systems. In this paper we present the results of research performed during 2008 to study how university students in Italy make use of digital technologies. The methodology was based on a mix of quantitative and qualitative approaches. A survey was conducted on a sample of 1186 students of the University of Milan-Bicocca, based on a questionnaire administered through the intranet of the university. A series of focus groups and in-depth interviews with students, parents and new media experts was furthermore performed. The results are consistent with the presence of a strong intergenerational divide. The implications of the results for the future organization of educational systems are discussed in the paper.

  15. Shifting gears higher - digital slides in graduate education - 4 years experience at Semmelweis University

    Directory of Open Access Journals (Sweden)

    Molnár Béla

    2010-11-01

Background: The spread of whole-slide imaging (digital slide) systems in pathology as an innovative technique seems to be unstoppable. The successful introduction of digital slides in education has played a crucial role in reaching this level of acceptance; practically speaking, there is no university institute where digital materials are not built into pathology education. At the 1st Department of Pathology and Experimental Cancer Research, Semmelweis University, optical microscopes have been replaced, and for four years only digital slides have been used in education. The aim of this paper is to summarize our experience with the installation of a fully digitized histology lab for graduate education. Methods: We installed a digital histology lab with 40 PCs and two slide servers - one for internal use and one with external internet access. We digitized hundreds of slides, and after 4 years we use a set of 126 slides during the pathology course. A student satisfaction questionnaire and a tutor satisfaction questionnaire were designed, both to be completed voluntarily, to obtain feedback from the users. The page-load statistics of the external slide server were evaluated. Results: The digital histology lab served ~900 students and ~1600 hours of histology practice. The questionnaires revealed high satisfaction with digital slides. The results also emphasize the importance of the tutors' attitude towards digital microscopy as a factor influencing the students' satisfaction. The constantly growing number of page downloads from the external server confirms this satisfaction and the acceptance of digital slides. Conclusions: We are confident, and have shown, that digital slides have numerous advantages over optical slides and are more suitable for education.

  16. An Evaluation of the Informedia Digital Video Library System at the Open University.

    Science.gov (United States)

    Kukulska-Hulme, Agnes; Van der Zwan, Robert; DiPaolo, Terry; Evers, Vanessa; Clarke, Sarah

    1999-01-01

    Reports on an Open University evaluation study of the Informedia Digital Video Library System developed at Carnegie Mellon University (CMU). Findings indicate that there is definite potential for using the system, provided that certain modifications can be made. Results also confirm findings of the Informedia team at CMU that the content of video…

  17. Writing Programs as Distributed Networks: A Materialist Approach to University-Community Digital Media Literacy

    Science.gov (United States)

    Comstock, Michelle

    2006-01-01

    This article addresses how community-university digital media literacy projects are redefining literacy, literate practices, and institutions. Using Actor-Network Theory (ANT), which emphasizes the organizing process itself, I analyze the shifting definitions of literacy within one particular university-community collaboration. My analysis…

  18. Doing the Right Thing: One University's Approach to Digital Accessibility

    Science.gov (United States)

    Sieben-Schneider, Jill A.; Hamilton-Brodie, Valerie A.

    2016-01-01

    This article describes the approach employed by one university to address a complaint filed by students with disabilities with the Department of Justice (DOJ) regarding the inaccessibility of information and communication technology (ICT). Prior to the DOJ complaint, the university did not have a process in place to address ICT accessibility.…

  19. High-dimensional quantum cloning and applications to quantum hacking.

    Science.gov (United States)

    Bouchard, Frédéric; Fickler, Robert; Boyd, Robert W; Karimi, Ebrahim

    2017-02-01

    Attempts at cloning a quantum system result in the introduction of imperfections in the state of the copies. This is a consequence of the no-cloning theorem, which is a fundamental law of quantum physics and the backbone of security for quantum communications. Although perfect copies are prohibited, a quantum state may be copied with maximal accuracy via various optimal cloning schemes. Optimal quantum cloning, which lies at the border of the physical limit imposed by the no-signaling theorem and the Heisenberg uncertainty principle, has been experimentally realized for low-dimensional photonic states. However, an increase in the dimensionality of quantum systems is greatly beneficial to quantum computation and communication protocols. Nonetheless, no experimental demonstration of optimal cloning machines has hitherto been shown for high-dimensional quantum systems. We perform optimal cloning of high-dimensional photonic states by means of the symmetrization method. We show the universality of our technique by conducting cloning of numerous arbitrary input states and fully characterize our cloning machine by performing quantum state tomography on cloned photons. In addition, a cloning attack on a Bennett and Brassard (BB84) quantum key distribution protocol is experimentally demonstrated to reveal the robustness of high-dimensional states in quantum cryptography.
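The dimensional advantage described above can be made quantitative. As a hedged aside (this is a standard result from the optimal-cloning literature, not a figure taken from this abstract), the optimal universal symmetric 1 → 2 cloning fidelity in dimension d is F(d) = 1/2 + 1/(d + 1):

```python
def optimal_cloning_fidelity(d: int) -> float:
    """Optimal universal symmetric 1 -> 2 cloning fidelity in dimension d.

    Standard result: F(d) = 1/2 + 1/(d + 1). For qubits (d = 2) this is
    5/6, the Buzek-Hillery value; as d grows, F falls towards 1/2.
    """
    return 0.5 + 1.0 / (d + 1)
```

Because the best achievable clone fidelity decreases with dimension, an eavesdropper's copies become noisier for high-dimensional states, which is the intuition behind the robustness in quantum cryptography reported here.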

  20. Usability of PDF based Digital Textbooks to the Physically Disabled University Student.

    Science.gov (United States)

    Oku, Hidehisa; Matsubara, Kayoko; Booka, Masayuki

    2015-01-01

Digital textbooks are expected to provide multimedia information that print textbooks cannot handle. An original digital textbook can be fabricated relatively easily using EPUB or DAISY. However, print textbooks are employed in most university lectures. It is therefore considered necessary to convert the content of a print textbook into a digital textbook simply and in a short time. In this paper, a digital textbook using PDF files of the print textbook is suggested as a simple and practical way to provide an alternative textbook for physically disabled university students who have difficulty handling the print textbook. The usability of the suggested method was then evaluated experimentally in terms of workload. The results of the experiment indicate that a digital textbook fabricated by the suggested method as an alternative to the print textbook has the potential to reduce workload for physically disabled university students. In addition, a digital textbook with a larger LCD display requires less workload than the print textbook, while there is not much difference in workload between a print book smaller than the print textbook and the digital book made from it.

  1. A Digital Library Example in the Digital Age: İstanbul Bilgi University Library and e-Resources

    Directory of Open Access Journals (Sweden)

    Banu Elçi

    2015-06-01

This article describes the ongoing transformation of the traditional library and service concept into the library and service concept of the digital age, taking İstanbul Bilgi University Library and e-Resources as an instance that pioneers the integration of technological advances and digital applications into the library field. In this sense it covers the services, developments, applications and projects provided by the Bilgi Libraries. The article also accounts for a number of works that integrated the digital resources, applications and social network interactions of the internet and the web of the digital age, and how they were adapted to the library area. In this context, it describes the evolution of a different approach to libraries, and enhancements diverging from customary and adopted library service concepts, illustrated through examples from İstanbul Bilgi University Library and e-Resources.

  2. High-dimensional quantum channel estimation using classical light

    CSIR Research Space (South Africa)

    Mabena, Chemist M

    2017-11-01

PHYSICAL REVIEW A 96, 053860... (2017) High-dimensional quantum channel estimation using classical light. Chemist M. Mabena, CSIR National Laser Centre, P.O. Box 395, Pretoria 0001, South Africa and School of Physics, University of the Witwatersrand, Johannesburg 2000, South...

  3. The convergence of the aspects of digital inclusion: experience in the domain of a university

    Directory of Open Access Journals (Sweden)

    Barbara Coelho Neves

    2008-01-01

This article presents the results of a case study, applied with a survey, analyzing aspects of digital inclusion among users of the Tabuleiro Digital public Internet access point of the College of Education of the Federal University of Bahia - FACED/UFBA. The main objective of the study was to investigate the type of digital inclusion promoted by this project, in accordance with the perspective of the current literature. The specific objectives indicate the characteristics of the users' profiles, the types of applications used, and the purposes of Internet use, compared with levels of education and social status. In this way, the information society and digital inclusion are contextualized, and the inclusion aspects of the Tabuleiro are demonstrated and discussed. Finally, the analysis showed that the type of digital inclusion promoted by the project is induced (technical).

  4. Attitudes towards digital gap among university students, and its relationship with educational progress and socioeconomic status

    Directory of Open Access Journals (Sweden)

    Z Derikvandi

    2017-03-01

    Full Text Available Introduction: A digital gap may exist at national scale, among organizations, and among other groups of society, since it is indicative of inequality in information technology and communication. This study investigates attitudes towards the digital gap among students, and their relationship with educational progress and socioeconomic status (SES), among university students at Alborz University of Medical Sciences. Methods: This was a cross-sectional analytic study. Students were randomly selected using a multistage cluster method. The data-collection tools were the Davis (1989) questionnaire on attitudes towards the internet and a researcher-made questionnaire. The formal validity of the questionnaires was confirmed by a panel of experts, and Cronbach's alpha coefficients were calculated. Pearson coefficients were calculated and the independent t-test was used for analyzing the data. Results: The analysis of the data indicates a meaningful relationship between the attitude towards the digital gap and both educational progress and SES of the students. Furthermore, there was a difference between the attitudes of males (48.7) and females (46.5) towards the digital gap (p=0.01). Conclusion: Attitudes towards the digital gap exist among university students. Interventions are needed to close the digital gaps among students.

  5. GuidosToolbox: universal digital image object analysis

    Science.gov (United States)

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  6. The digitization of the Wundt estate at Leipzig University.

    Science.gov (United States)

    Meyer, Till; Mädebach, Andreas; Schröger, Erich

    2017-08-01

    Wilhelm M. Wundt (1832-1920) was one of the most important German scholars of the 19th and early 20th centuries and famously founded the first institute for experimental psychology in Leipzig in 1879. Wundt's institute established a teaching and research facility that attracted a large number of students from all over the world and contributed greatly to the development of modern psychology. Until now, the relatively poor indexing and documentation, as well as the difficulty in accessing the Wundt estate, have prevented a widespread and comprehensive investigation and consideration of these documents. The digitization project described in this article has rectified these problems and will hopefully provide a valuable source for students and researchers interested in Wundt's work. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Exploring Digital Health Use and Opinions of University Students: Field Survey Study.

    Science.gov (United States)

    Montagni, Ilaria; Cariou, Tanguy; Feuillet, Tiphaine; Langlois, Emmanuel; Tzourio, Christophe

    2018-03-15

    During university, students face some potentially serious health risks, and their lifestyle can have a direct effect on health and health behaviors later in life. Concurrently, university students are digital natives with easy access to the internet and new technologies. Digital health interventions offer promising new opportunities for health promotion, disease prevention, and care in this specific population. Describing the current use of and opinions on digital health among university students can inform future digital health strategies and interventions within university settings. The aim of this exploratory study was to report on university students' use and opinions regarding information and communication technologies for health and well-being, taking into account sociodemographic and self-rated general and mental health correlates. This field survey was conducted from March to April 2017. An informed consent form and a paper questionnaire were given to students aged 18 to 24 years in 4 university campuses in Bordeaux, France. The survey was formulated in 3 sections: (1) sociodemographic characteristics and self-rated general and mental health, (2) information about the use of digital health, and (3) opinions about digital health. Data were analyzed using descriptive statistics and tests of independence. Females accounted for 59.8% (303/507) of the students who completed the questionnaire. Concerning digital health use, 34.9% (174/498) had at least 1 health app, mostly for physical activity (49.4%, 86/174) and general health monitoring (41.4%, 72/174), but only 3.9% (20/507) of students had a wearable device. Almost all (94.8%, 450/476) had searched for Web-based health-related information at least once in the last 12 months. The most sought health-related topics were nutrition (68.1%, 324/476); pain and illnesses (64.5%, 307/476); and stress, anxiety, or depression (51.1%, 243/476). Although Wikipedia (79.7%, 357/448) and general health websites (349/448, 77

  8. Exploring Digital Health Use and Opinions of University Students: Field Survey Study

    Science.gov (United States)

    Cariou, Tanguy; Feuillet, Tiphaine; Langlois, Emmanuel; Tzourio, Christophe

    2018-01-01

    Background During university, students face some potentially serious health risks, and their lifestyle can have a direct effect on health and health behaviors later in life. Concurrently, university students are digital natives with easy access to the internet and new technologies. Digital health interventions offer promising new opportunities for health promotion, disease prevention, and care in this specific population. Describing the current use of and opinions on digital health among university students can inform future digital health strategies and interventions within university settings. Objective The aim of this exploratory study was to report on university students’ use and opinions regarding information and communication technologies for health and well-being, taking into account sociodemographic and self-rated general and mental health correlates. Methods This field survey was conducted from March to April 2017. An informed consent form and a paper questionnaire were given to students aged 18 to 24 years in 4 university campuses in Bordeaux, France. The survey was formulated in 3 sections: (1) sociodemographic characteristics and self-rated general and mental health, (2) information about the use of digital health, and (3) opinions about digital health. Data were analyzed using descriptive statistics and tests of independence. Results Females accounted for 59.8% (303/507) of the students who completed the questionnaire. Concerning digital health use, 34.9% (174/498) had at least 1 health app, mostly for physical activity (49.4%, 86/174) and general health monitoring (41.4%, 72/174), but only 3.9% (20/507) of students had a wearable device. Almost all (94.8%, 450/476) had searched for Web-based health-related information at least once in the last 12 months. The most sought health-related topics were nutrition (68.1%, 324/476); pain and illnesses (64.5%, 307/476); and stress, anxiety, or depression (51.1%, 243/476). Although Wikipedia (79.7%, 357/448) and

  9. The Usage Analysis of Databases at Ankara University Digital Library

    Directory of Open Access Journals (Sweden)

    Sacit Arslantekin

    2006-12-01

    Full Text Available Developments in information and communication technologies have changed and improved the diversity of resources and services in libraries, and these changes continue to develop rapidly throughout the world. In our country, remarkable developments in this field, especially in university and special libraries, are worth consideration. In order to benefit from existing and forthcoming developments in the field of electronic libraries, the databases used by clients should be well demonstrated and followed closely. Ensuring wide use of electronic databases increases the productivity of scientific and social information, which is the ultimate goal. The article first points out electronic resources management and the effect of consortia developments in the field, and then evaluates the results of a survey of faculty members at Ankara University on their use of electronic library databases.

  10. Emerging digital technologies come into the University: AR and VR

    Directory of Open Access Journals (Sweden)

    Julio Cabero Almenara

    2018-02-01

    Full Text Available A set of emerging technologies has entered university education in the last decade as never before. Many of these technologies can be considered disruptive, since they are transforming and improving training scenarios. Among the technologies gaining the greatest momentum and importance are augmented reality and virtual reality, as evidenced by various Horizon and EduTrends reports. Although research on the integration of these technologies is at an early stage, the studies carried out so far demonstrate the great benefits they bring to the teaching-learning process. Among these contributions we can highlight improvements in student motivation, satisfaction and performance. In this article we cover the current state of the university regarding ICT integration, focusing on the technologies that have burst in with the greatest force in recent times. In addition, we focus on two technologies, augmented reality and virtual reality, that are having the greatest impact in recent years, showing evidence and discussing their implementation in the university environment.

  11. Differences in basic digital competences between male and female university students of Social Sciences in Spain

    Directory of Open Access Journals (Sweden)

    Esteban Vázquez-Cano

    2017-11-01

    Full Text Available Abstract This article analyses the differences in basic digital competences between male and female university students on Social Education, Social Work and Pedagogy courses. The study of gender differences in university students’ acquisition of digital competence has considerable didactic and strategic consequences for the development of these skills. The study was carried out at two public universities in Spain (UNED – the National Distance-Learning University – and the Universidad Pablo de Olavide) on a sample of 923 students, who responded to a questionnaire entitled “University Students’ Basic Digital Competences 2.0” (COBADI – registered at the Spanish Patent and Trademark Office). The research applied a quantitative methodology based on a Bayesian approach, using a multinomial joint distribution as the prior distribution. The use of Bayes factors also offers advantages over frequentist p-values, such as providing information on the alternative hypothesis and evidence that does not depend on the sample size used. The results show that men have greater perceived competence in digital cartography and in developing online presentations, whereas women prefer to request personal tutorials to resolve doubts about technology and have greater perceived competence in corporate emailing. Regarding “interpersonal competences in the use of ICT at university”, we observed that female students opted for personal sessions with tutors in greater numbers than male students did.
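The advantage the abstract claims for Bayes factors can be illustrated with a minimal sketch. The counts below are hypothetical, and the conjugate Beta-binomial model is a simplification standing in for the study's multinomial prior; nothing here reproduces the actual COBADI analysis.

```python
from math import lgamma, exp

def log_beta(a, b):
    # log of the Beta function via log-gamma
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_marginal(successes, n, a=1.0, b=1.0):
    # Beta-binomial marginal likelihood of the data under a Beta(a, b) prior
    return log_beta(a + successes, b + n - successes) - log_beta(a, b)

# Hypothetical counts: men vs. women reporting a given digital competence
men_yes, men_n = 70, 100
wom_yes, wom_n = 50, 100

# BF10 = P(data | two separate proportions) / P(data | one shared proportion)
log_bf = (log_marginal(men_yes, men_n) + log_marginal(wom_yes, wom_n)
          - log_marginal(men_yes + wom_yes, men_n + wom_n))
bf10 = exp(log_bf)
print(round(bf10, 1))   # > 1: the data favour a gender difference
```

Unlike a p-value, the resulting number quantifies evidence *for* the alternative hypothesis directly, and it can also fall below 1 to express evidence for the null.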

  12. La historia del periodismo en el universo digital / The history of journalism in the digital universe

    Directory of Open Access Journals (Sweden)

    Lorena Romero Domínguez

    2011-09-01

    Full Text Available Abstract: Digitisation is an absolute necessity today for public institutions seeking to compete with providers of information on the network and to maintain their status as privileged instances in the provision of knowledge to societies. Cultural institutions thus feel compelled to digitise their collections, and to consider this technology a panacea for their problems in managing and disseminating their documentary wealth, given the continued revaluation of sources that can be used in the fields of teaching, learning, research and documentation, and in the public management of data catalogues. This paper aims to move into the field of study of the digitisation of a traditional discipline, the history of communication and journalism, its immersion in the digital arena and, more

  13. Introduction to high-dimensional statistics

    CERN Document Server

    Giraud, Christophe

    2015-01-01

    Ever-greater computing technologies have given rise to an exponentially growing volume of data. Today massive data sets (with potentially thousands of variables) play an important role in almost every branch of modern human activity, including networks, finance, and genetics. However, analyzing such data has presented a challenge for statisticians and data analysts and has required the development of new statistical methods capable of separating the signal from the noise.Introduction to High-Dimensional Statistics is a concise guide to state-of-the-art models, techniques, and approaches for ha

  14. Estimating High-Dimensional Time Series Models

    DEFF Research Database (Denmark)

    Medeiros, Marcelo C.; Mendes, Eduardo F.

    We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume that both the number of covariates in the model and the number of candidate variables can increase with the number of observations, and that the number of candidate variables is possibly larger than the number of observations. We show the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency), and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. A simulation study shows
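The two-stage adaptive LASSO the abstract studies can be sketched as follows. This is a generic illustration, not the authors' implementation: the ridge-based first-stage weights, the tuning values, and the toy regression data are all assumptions chosen for demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=300):
    # Plain LASSO by cyclic coordinate descent on (1/2n)||y - Xb||^2 + lam*||b||_1
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]        # partial residual excluding j
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

def adaptive_lasso(X, y, lam=0.1, gamma=1.0, ridge=1e-2):
    # Stage 1: a ridge fit supplies the data-driven penalty weights w_j
    n, p = X.shape
    b_init = np.linalg.solve(X.T @ X + ridge * n * np.eye(p), X.T @ y)
    w = 1.0 / (np.abs(b_init) ** gamma + 1e-12)
    # Stage 2: ordinary LASSO on rescaled columns, then undo the scaling
    return lasso_cd(X / w, y, lam) / w

# Toy sparse regression: only 3 of 50 candidate variables matter
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[[0, 1, 2]] = [2.0, -1.5, 1.0]
y = X @ beta + 0.5 * rng.standard_normal(n)

b_hat = adaptive_lasso(X, y)
selected = np.flatnonzero(np.abs(b_hat) > 0.1)
print(selected)   # the relevant variables are recovered
```

The weights make the penalty heavy on variables whose first-stage estimate is near zero and light on the rest, which is what gives the adaLASSO its oracle-type selection behaviour.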

  15. High dimensional classifiers in the imbalanced case

    DEFF Research Database (Denmark)

    Bak, Britta Anker; Jensen, Jens Ledet

    We consider the binary classification problem in the imbalanced case where the number of samples from the two groups differ. The classification problem is considered in the high dimensional case where the number of variables is much larger than the number of samples, and where the imbalance leads to a bias in the classification. A theoretical analysis of the independence classifier reveals the origin of the bias, and based on this we suggest two new classifiers that can handle any imbalance ratio. The analytical results are supplemented by a simulation study, where the suggested classifiers in some
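The bias the abstract describes can be demonstrated with a toy simulation. The diagonal ("independence") rule below and the null-data setup are illustrative assumptions; the paper's two corrected classifiers are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n1, n2 = 1000, 10, 40   # many variables, few samples, imbalanced groups

def independence_classifier(X1, X2, x):
    # Diagonal ("independence") rule: ignores correlations between variables
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    s2 = 0.5 * (X1.var(axis=0, ddof=1) + X2.var(axis=0, ddof=1))
    score = ((x - 0.5 * (m1 + m2)) * (m1 - m2) / s2).sum()
    return 1 if score > 0 else 2

# Null experiment: both groups and the test point come from the SAME
# distribution, so an unbiased rule should pick group 1 about half the time.
trials, hits = 200, 0
for _ in range(trials):
    X1 = rng.standard_normal((n1, p))
    X2 = rng.standard_normal((n2, p))
    x = rng.standard_normal(p)
    if independence_classifier(X1, X2, x) == 1:
        hits += 1
print(hits / trials)   # far below 0.5: almost everything goes to the larger group
```

The noise in the smaller group's mean estimate inflates its squared norm by roughly p/n1 versus p/n2 for the larger group, which shifts the decision boundary and systematically favours the group with more samples.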

  16. Topology of high-dimensional manifolds

    Energy Technology Data Exchange (ETDEWEB)

    Farrell, F T [State University of New York, Binghamton (United States); Goettsche, L [Abdus Salam ICTP, Trieste (Italy); Lueck, W [Westfaelische Wilhelms-Universitaet Muenster, Muenster (Germany)

    2002-08-15

    The School on High-Dimensional Manifold Topology took place at the Abdus Salam ICTP, Trieste, from 21 May 2001 to 8 June 2001. The focus of the school was on the classification of manifolds and related aspects of K-theory, geometry, and operator theory. The topics covered included: surgery theory, algebraic K- and L-theory, controlled topology, homology manifolds, exotic aspherical manifolds, homeomorphism and diffeomorphism groups, and scalar curvature. The school consisted of two weeks of lecture courses and one week of conference. This two-part lecture notes volume contains the notes of most of the lecture courses.

  17. 3D Adaptive Virtual Exhibit for the University of Denver Digital Collections

    Directory of Open Access Journals (Sweden)

    Shea-Tinn Yeh

    2015-07-01

    Full Text Available While the gaming industry has taken the world by storm with its three-dimensional (3D user interfaces, current digital collection exhibits presented by museums, historical societies, and libraries are still limited to a two-dimensional (2D interface display. Why can’t digital collections take advantage of this 3D interface advancement? The prototype discussed in this paper presents to the visitor a 3D virtual exhibit containing a set of digital objects from the University of Denver Libraries’ digital image collections, giving visitors an immersive experience when viewing the collections. In particular, the interface is adaptive to the visitor’s browsing behaviors and alters the selection and display of the objects throughout the exhibit to encourage serendipitous discovery. Social media features were also integrated to allow visitors to share items of interest and to create a sense of virtual community.

  18. Exploring the Use of Interactive Digital Storytelling Video: Promoting Student Engagement and Learning in a University Hybrid Course

    Science.gov (United States)

    Shelton, Catharyn C.; Warren, Annie E.; Archambault, Leanna M.

    2016-01-01

    This study explores interactive digital storytelling in a university hybrid course. Digital stories leverage imagery and narrative-based content to explore concepts, while appealing to millennials. When digital storytelling is used as the main source of course content, tensions arise regarding how to engage and support student learning while…

  19. Digital phonocardiographic experiments and signal processing in multidisciplinary fields of university education

    International Nuclear Information System (INIS)

    Nagy, Tamás; Vadai, Gergely; Gingl, Zoltán

    2017-01-01

    Modern measurement of physical signals is based on the use of sensors, electronic signal conditioning, analog-to-digital conversion and digital signal processing carried out by dedicated software. The same signal chain is used in many devices such as home appliances, automotive electronics, medical instruments, and smartphones. Teaching the theoretical, experimental, and signal processing background must be an essential part of improving the standard of higher education, and it fits well to the increasingly multidisciplinary nature of physics and engineering too. In this paper, we show how digital phonocardiography can be used in university education as a universal, highly scalable, exciting, and inspiring laboratory practice and as a demonstration at various levels and complexity. We have developed open-source software templates in modern programming languages to support immediate use and to serve as a basis of further modifications using personal computers, tablets, and smartphones. (paper)
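The sensor-to-software chain described above (sensing, conditioning, analog-to-digital conversion, digital processing) can be sketched end-to-end. Everything below is a hypothetical classroom example, not the authors' open-source templates: the synthetic heart-sound model, the envelope window, and the detection thresholds are all assumptions.

```python
import numpy as np

fs = 1000                        # "ADC" sampling rate, Hz
t = np.arange(0, 5, 1 / fs)

# Synthetic phonocardiogram: one S1-like burst per second (60 bpm),
# a 50 Hz tone under a Gaussian envelope, plus sensor noise
rng = np.random.default_rng(1)
pcg = 0.05 * rng.standard_normal(t.size)
for beat in np.arange(0.5, 5.0, 1.0):
    burst = np.exp(-((t - beat) ** 2) / (2 * 0.02 ** 2))
    pcg += burst * np.sin(2 * np.pi * 50 * t)

# Digital processing stage: rectify and smooth to extract the envelope
window = int(0.05 * fs)
envelope = np.convolve(np.abs(pcg), np.ones(window) / window, mode="same")

# Threshold plus a refractory period to locate the heart sounds
threshold = 0.5 * envelope.max()
refractory = int(0.3 * fs)
beats, last = [], -refractory
for i, v in enumerate(envelope):
    if v > threshold and i - last >= refractory:
        beats.append(i)
        last = i

rate_bpm = 60 * fs * (len(beats) - 1) / (beats[-1] - beats[0])
print(len(beats), round(rate_bpm))
```

The same skeleton scales from a first-year demonstration (plot the envelope) to an advanced exercise (replace the moving average with a proper band-pass filter, or run it on real microphone input), which is the kind of scalability the abstract emphasizes.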

  20. Digital phonocardiographic experiments and signal processing in multidisciplinary fields of university education

    Science.gov (United States)

    Nagy, Tamás; Vadai, Gergely; Gingl, Zoltán

    2017-09-01

    Modern measurement of physical signals is based on the use of sensors, electronic signal conditioning, analog-to-digital conversion and digital signal processing carried out by dedicated software. The same signal chain is used in many devices such as home appliances, automotive electronics, medical instruments, and smartphones. Teaching the theoretical, experimental, and signal processing background must be an essential part of improving the standard of higher education, and it fits well to the increasingly multidisciplinary nature of physics and engineering too. In this paper, we show how digital phonocardiography can be used in university education as a universal, highly scalable, exciting, and inspiring laboratory practice and as a demonstration at various levels and complexity. We have developed open-source software templates in modern programming languages to support immediate use and to serve as a basis of further modifications using personal computers, tablets, and smartphones.

  1. Perceptions of Library Staff Regarding Challenges of Developing Digital Libraries: The Case of an Iranian University

    Science.gov (United States)

    Mohsenzadeh, Faranak; Isfandyari-Moghaddam, Alireza

    2011-01-01

    Purpose: The present research aims to identify the difficulties and obstacles for developing digital libraries in the seven regional branches of Islamic Azad University (IAU), Iran, and to study the status of librarians' skills and education programmes at these institutions. Design/methodology/approach: The 40 individuals working in the regional…

  2. Michigan State University Extension Educators' Perceptions of the Use of Digital Technology in Their Work

    Science.gov (United States)

    Wells, Elizabeth Chase

    2009-01-01

    This research study examined Michigan State University Extension educators' perceptions of the use of digital technology in their work. It used a mixed method of research which included a mailed survey and interviews of selected respondents. A census survey using Dillman's Total Design method was sent to 290 field staff of Michigan State…

  3. Understanding University Students' Thoughts and Practices about Digital Citizenship: A Mixed Methods Study

    Science.gov (United States)

    Kara, Nuri

    2018-01-01

    The purpose of this study was to investigate university students' thoughts and practices concerning digital citizenship. An explanatory mixed methods design was used, and it involved collecting qualitative data after a quantitative phase in order to follow up on the quantitative data in more depth. In the first quantitative phase of the study, a…

  4. Developing digital technologies for university mathematics by applying participatory design methods

    DEFF Research Database (Denmark)

    Triantafyllou, Eva; Timcenko, Olga

    2013-01-01

    This paper presents our research efforts to develop digital technologies for undergraduate university mathematics. We employ participatory design methods in order to involve teachers and students in the design of such technologies. The results of the first round of our design are included...

  5. Student Digital Piracy in the Florida State University System: An Exploratory Study on Its Infrastructural Effects

    Science.gov (United States)

    Reiss, Jeffrey

    2010-01-01

    Digital piracy is a problem that may never disappear from society. Through readily available resources such as those found in a university, students will always have access to illegal goods. While piracy is a global phenomenon, an institution's resources combined with the typical college student's lack of funds make it more lucrative. Students…

  6. DIGITAL TRANSFORMATION OF UNIVERSITY EDUCATION IN UKRAINE: TRAJECTORIES OF DEVELOPMENT IN THE CONDITIONS OF NEW TECHNOLOGICAL AND ECONOMIC ORDER

    OpenAIRE

    Oleg Ye. Kaminskyi; Yulia O. Yereshko; Sergii O. Kyrychenko

    2018-01-01

    The article substantiates the role of the digital transformation of higher education in Ukraine in the era of the fourth industrial revolution. It demonstrates the need to develop a strategy for the digital transformation of university education, as well as to form new information and communication competencies. According to the authors, the strategy for the digital transformation of the university education system has to include the modernization of corporate IT architecture management, w...

  7. Desigualdad digital en la universidad: usos de Internet en Ecuador Digital Divide in Universities: Internet Use in Ecuadorian Universities

    Directory of Open Access Journals (Sweden)

    Juan Carlos Torres Díaz

    2011-10-01

    Full Text Available New technologies have transformed higher education, driving changes that the university community has assimilated in various ways. As a consequence, students show different forms and levels of use of the resources the Internet offers, creating subtle divides in the university population. This study characterizes these divides; specifically, it analyzes the effect of the student-income variable on the uses and intensity of use of Internet tools and resources. To do so, students were classified using factor analysis, complemented by cluster analysis to obtain user profiles; these profiles were contrasted with discriminant analysis and, finally, chi-squared tests were applied to verify the relationship between income level and user profiles. Three profiles with different levels of use of Internet tools and resources were identified, and the effect of income level on the formation of these profiles was statistically confirmed. The study concludes that income level mostly affects the variables defining access possibilities; gender behaves in a special way, since, although the highest profile has twice the proportion of men, women perform better overall.

  8. Evaluation and development of digital competence in future primary school teachers at the University of Murcia

    Directory of Open Access Journals (Sweden)

    Isabel Gutiérrez Porlán

    2016-01-01

    Full Text Available This paper presents the findings of a study carried out in the academic year 2014-2015 at the faculty of Education of the University of Murcia with first year degree students in Primary Education studying Research and ICT. The study started with the application of the DIGCOM questionnaire to analyze the digital competences of 134 students. The questionnaire served as an initial task to help students reflect on their digital competences. The subject was developed around tasks which adopted a transversal approach and used the nature of the contents itself to direct and improve students’ digital competencies. Finally, the initial questionnaire was reformulated and run in order to ascertain the students’ self-perception of their improvement in these competencies through the tasks they had performed. Below we present the tasks carried out, the organization of each subject and the most relevant data regarding the self-perception of digital competencies of the future primary school teachers enrolled at the University of Murcia. The data reveal, on the one hand, that the students participating consider themselves to be competent in the most basic aspects of digital competencies and, on the other, their perception that the work done in the subject has helped them quite a lot in improving their competencies.

  9. Clustering high dimensional data using RIA

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, Nazrina [School of Quantitative Sciences, College of Arts and Sciences, Universiti Utara Malaysia, 06010 Sintok, Kedah (Malaysia)

    2015-05-15

    Clustering may simply represent a convenient method for organizing a large data set so that it can easily be understood and information can efficiently be retrieved. However, identifying clusters in high-dimensional data sets is a difficult task because of the curse of dimensionality. Another challenge in clustering is that some traditional dissimilarity functions cannot capture the pattern dissimilarity among objects. In this article, we use an alternative dissimilarity measurement called the Robust Influence Angle (RIA) in the partitioning method. RIA is developed using the eigenstructure of the covariance matrix and robust principal component scores. We observe that it can obtain clusters easily and hence avoids the curse of dimensionality. It also manages to cluster large data sets with mixed numeric and categorical values.
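The abstract does not give RIA's formula, so the sketch below only illustrates the general idea it describes: an angle-based dissimilarity computed on principal component scores (derived from the eigenstructure of the covariance matrix), used to partition high-dimensional data. The robustification step, the medoid choice, and all parameters are assumptions for demonstration.

```python
import numpy as np

def pc_scores(X, k=2):
    # Project onto the leading eigenvectors of the sample covariance matrix
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return Xc @ vecs[:, ::-1][:, :k]       # eigh sorts ascending; reverse

def angle_dissimilarity(a, b):
    # Angle between score vectors: insensitive to vector magnitude, which
    # distorts Euclidean distance as the dimension grows
    cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Two well-separated groups in 100 dimensions
rng = np.random.default_rng(2)
p = 100
mu = np.ones(p)
X = np.vstack([rng.standard_normal((20, p)) + 2 * mu,
               rng.standard_normal((20, p)) - 2 * mu])

S = pc_scores(X, k=2)
n = len(S)
D = np.array([[angle_dissimilarity(S[i], S[j]) for j in range(n)]
              for i in range(n)])

# Crude partitioning: take the farthest pair as medoids, assign by dissimilarity
i0, j0 = np.unravel_index(D.argmax(), D.shape)
labels = (D[:, i0] > D[:, j0]).astype(int)
print(labels)   # the two 20-point groups receive distinct labels
```

Working in the low-dimensional score space is what sidesteps the curse of dimensionality: the partition is driven by directions of variation rather than by raw distances in 100 dimensions.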

  10. Digital Entrepreneurships and Business Models Canvas: Applied Research for Communication University Students

    Directory of Open Access Journals (Sweden)

    Jorge Montalvo-Castro

    2016-07-01

    Full Text Available The digital economy requires business models different from those of the physical world; therefore, they should be studied from a particular perspective. In this research a variety of canvas formats, or business model canvases, are compared and analyzed. The entrepreneurial intention of Communication students from the University of Lima is also studied. The methodology included a survey among students and an educational experience within the subject of Advertising Creativity. The main result of this research is a proposed canvas for designing digital entrepreneurship ventures in Communication, whether commercial or social.

  11. Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology.

    Science.gov (United States)

    Rohde, Gustavo K; Ozolek, John A; Parwani, Anil V; Pantanowitz, Liron

    2014-01-01

    Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications of this novel technology have evolved in the last decade, including education, research and clinical applications. This publication highlights a collection of abstracts, each corresponding to a talk given at Carnegie Mellon University's (CMU) Bioimaging Day 2014, co-sponsored by the Biomedical Engineering and Lane Center for Computational Biology Departments at CMU. Topics related specifically to digital pathology are presented in this collection of abstracts, including digital workflow implementation, imaging and artifacts, storage demands, and automated image analysis algorithms.

  12. READING HABITS IN DIGITAL ERA: A RESEARCH ON THE STUDENTS IN BORNEO UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Firima Zona Tanjung

    2017-10-01

    Full Text Available This research aims to explore the current reading habits of university students. Moreover, it aims to determine the effects of the widespread use of the internet and other digital resources on reading habits, and to offer some recommendations for improving students’ reading habits in the digital era. The research design was a descriptive survey. The instrument was a questionnaire based on Akarsu and Dariyemez (2014) and Chauhan and Lal (2012). The participants were 320 students studying in six majors in the Faculty of Teachers Training and Education at Borneo University, selected through cluster random sampling. The questionnaire involved six categories, namely demographic information, frequency of items read, contents of online reading, online activities, content first clicked when online, and techniques to develop reading habits. All research data were analyzed using the SPSS Statistics 22 program.

  13. Universal Michelson Gires-Tournois interferometer optical interleaver based on digital signal processing.

    Science.gov (United States)

    Zhang, Juan; Yang, Xiaowei

    2010-03-01

    Optical interleavers based on the Michelson Gires-Tournois interferometer (MGTI) with arbitrarily cascaded reflectors, giving symmetrical or asymmetrical periodic frequency responses with arbitrary duty cycles, are defined as universal MGTI optical interleavers (UMGTIOI). They can significantly enhance the flexibility and applicability of optical networks. A novel and simple method based on digital signal processing is proposed for the design of the UMGTIOI, and several kinds of design examples are given to confirm the effectiveness of the method.

  14. Digital Labour in the University: Understanding the Transformations of Academic Work in the UK

    Directory of Open Access Journals (Sweden)

    Jamie Woodcock

    2018-01-01

    Full Text Available Universities have been the site of a variety of shifts and transformations over the previous few decades. The composition of both students and academics is changing (to a lesser or greater extent), along with the ways in which teaching and research are supported, conducted, and delivered. The effects of neoliberalism, privatisation, precarious employment, debt, and digitalisation have been highlighted as important factors in understanding these changes. However, the ways in which these tendencies are expressed in universities – both in specific and general ways – remain fragmented and under-analysed. In particular, the role of academic labour processes, increasingly mediated through digital technology, remains in the background. There is a risk of viewing these transformations as abstracted, far removed from the day-to-day activities of academic labour on which universities rely. This article therefore connects the broader changes in funding, organisation, and digital technology to the labour processes of academics. Rather than seeking a return to a romanticised pre-neoliberal university, it explores the possibilities of resistance and alternatives to the university as it is now.

  15. The Universe of Digital Sky Surveys : Meeting to Honour the 70th Birthday of Massimo Capaccioli

    CERN Document Server

    Longo, Giuseppe; Marconi, Marcella; Paolillo, Maurizio; Iodice, Enrichetta

    2016-01-01

    These are the proceedings of a meeting held in honour of Massimo Capaccioli on the occasion of his 70th birthday. The conference aimed at summarizing the results of the main current and past digital sky survey projects and at discussing how these can be used to inspire ongoing projects and to better plan future ones. Over the last decades, digital sky surveys performed with dedicated telescopes and finely tuned wide-field cameras have revolutionized astronomy. They have become the main tool for investigating the nearby and distant universe, providing new insights into galaxy structure and assembly across time, the dark components of the universe, and the history of our own galaxy. They have also opened the time domain, leading to a new understanding of transient phenomena in the universe. By providing public access to top-quality data, digital surveys have also changed the everyday practice of astronomers, who have become less dependent on direct access to large observing ...

  16. THE COLORIMETRY WITH A HIGH DIMENSIONAL RESOLUTION

    Directory of Open Access Journals (Sweden)

    I. E. Zuikov

    2013-01-01

    Full Text Available A method for measuring the photometric and colorimetric characteristics of objects using digital cameras is described. Using initial samples, realized either as reference control points on a non-radiant object or as primary sources on a radiant object, provides metrological traceability, builds a conditional scale in each colour channel, and expands the dynamic range of measurements, increasing the reliability and accuracy of the results.

  17. Digital Literacy Skills Among Librarians In University Libraries In The 21st Century In Edo And Delta States Nigeria

    OpenAIRE

    Emiri; Ogochukwu T.

    2015-01-01

    Abstract Libraries all over the world have been faced with evolving technological advancement, globalization, and the digitization of information. These have led to library automation and digital and virtual libraries. This paper discusses contemporary digital literacy skills (DLS) among librarians in university libraries in the 21st century in Edo and Delta States of Southern Nigeria. The study was guided by six objectives and research questions and one hypothesis. The design of the study is descri...

  18. Digitization

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2014-01-01

    what a concept of digital media might add to the understanding of processes of mediatization and what the concept of mediatization might add to the understanding of digital media. It is argued that digital media open an array of new trajectories in human communication, trajectories which were...

  19. High-dimensional quantum cryptography with twisted light

    International Nuclear Information System (INIS)

    Mirhosseini, Mohammad; Magaña-Loaiza, Omar S; O’Sullivan, Malcolm N; Rodenburg, Brandon; Malik, Mehul; Boyd, Robert W; Lavery, Martin P J; Padgett, Miles J; Gauthier, Daniel J

    2015-01-01

    Quantum key distribution (QKD) systems often rely on polarization of light for encoding, thus limiting the amount of information that can be sent per photon and placing tight bounds on the error rates that such a system can tolerate. Here we describe a proof-of-principle experiment that indicates the feasibility of high-dimensional QKD based on the transverse structure of the light field allowing for the transfer of more than 1 bit per photon. Our implementation uses the orbital angular momentum (OAM) of photons and the corresponding mutually unbiased basis of angular position (ANG). Our experiment uses a digital micro-mirror device for the rapid generation of OAM and ANG modes at 4 kHz, and a mode sorter capable of sorting single photons based on their OAM and ANG content with a separation efficiency of 93%. Through the use of a seven-dimensional alphabet encoded in the OAM and ANG bases, we achieve a channel capacity of 2.05 bits per sifted photon. Our experiment demonstrates that, in addition to having an increased information capacity, multilevel QKD systems based on spatial-mode encoding can be more resilient against intercept-resend eavesdropping attacks. (paper)
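
    The capacity figures above follow from a simple bound: a d-dimensional (qudit) alphabet can carry at most log2(d) bits per photon, so the seven-dimensional OAM/ANG alphabet is bounded by log2(7) ≈ 2.81 bits, against which the reported 2.05 bits per sifted photon can be compared. A minimal sketch of the bound:

    ```python
    import math

    def max_bits_per_photon(d: int) -> float:
        """Noiseless upper bound on information per photon for a d-dimensional alphabet."""
        return math.log2(d)

    # Polarization encoding (d = 2) is limited to 1 bit per photon; the
    # seven-dimensional alphabet allows up to log2(7) bits, with channel
    # errors reducing the achieved rate to 2.05 bits per sifted photon.
    for d in (2, 7):
        print(d, round(max_bits_per_photon(d), 3))
    ```

    The gap between 2.81 and 2.05 bits reflects sifting losses and error correction overhead, not a limit of the encoding itself.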

  20. Digital Scholarly Publishing and Archiving Services by Academic Libraries: Case Study of the University of Patras

    Directory of Open Access Journals (Sweden)

    Panos Georgiou

    2010-09-01

    Full Text Available During the last years, dramatic changes in the electronic publishing landscape have created new roles and changed traditional ones. Some libraries have capitalised on their experience and knowledge in information technology and electronic publishing to undertake such activities, while at the same time spearheading the campaign for Open Access within academic communities. The Library & Information Centre (LIC) of the University of Patras (UoP), Greece, has been playing an active role in promoting Open Access (OA) in Greece. Since 2007, the LIC has been experimenting with OA publishing practices and tools within the framework of various R&D projects. Two of the major results of these efforts are the ‘Pasithee’ e-publishing platform and the ‘Dexamene’ digital archive for Greek scholarly journals. Both platforms are based on OJS (Open Journal Systems) e-publishing software and were appropriately modified to meet the LIC’s publishing and archiving requirements respectively. Currently two journals are hosted on each platform, all four from the Humanities, and the LIC is negotiating with more publishers and editorial teams to host their journals. In this article we focus on: - technical and managerial key issues of the development and operation phases, - services and procedures, - the business model, - technological, procedural and legal issues and problems that were encountered when working together with publishers, editors and authors, and - future plans for improving and upgrading our e-publishing services into an integrated institutional platform to cover all kinds of publications and data types (monographs, conference proceedings, teaching material, bulletins, magazines etc.). The article concludes with a succinct presentation of the Directory of Greek Digital Resources, a pilot infrastructure developed by the LIC which indexes and presents digital publishing initiatives in Greece and aims to

  1. Digital Divide in Sub-Saharan African Universities: Recommendations and Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Barry, Boubakar; /Assoc. Afr. Univ.; Chukwuma, Victor; /Olabisi Onabanjo U.; Petitdidier, Monique; /CEPT, Velizy; Cottrell, Les; /SLAC; Bartons, Charles; /Australian Natl. U., RSES

    2009-12-17

    The Digital Divide prevents Africa from taking advantage of new information technologies. One of the most urgent priorities is to bring the Internet in African universities, research centers, and learning centers up to the level of other regions of the world. eGY-Africa and the Sharing Knowledge Foundation are two bottom-up initiatives by scientists to secure better cyber-infrastructure and Internet facilities in Africa. Recommendations by the scientific communities involved are being formulated at national, regional, and international levels. Internet capabilities are well documented at the country level overall, but this is not the case at the university level. The snapshot of Internet status in universities in 17 African countries, obtained by a questionnaire survey, is consistent with measures of Internet penetration in the corresponding countries. Monitoring of Internet performance has been proposed to these African universities to provide an information base for arguing the need to improve coverage for Africa. A pilot program is recommended that will start scientific collaboration between western Africa and Europe using ICT. The program will lay the foundations for the arrival of new technologies like Grids.

  2. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications. Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-sample approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  3. Hierarchical low-rank approximation for high dimensional approximation

    KAUST Repository

    Nouy, Anthony

    2016-01-01

    Tensor methods are among the most prominent tools for the numerical solution of high-dimensional problems where functions of multiple variables have to be approximated. Such high-dimensional approximation problems naturally arise in stochastic analysis and uncertainty quantification. In many practical situations, the approximation of high-dimensional functions is made computationally tractable by using rank-structured approximations. In this talk, we present algorithms for approximation in hierarchical tensor format using statistical methods. Sparse representations in a given tensor format are obtained with adaptive or convex relaxation methods, with parameters selected using cross-validation methods.
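
    The hierarchical tensor formats discussed in this talk generalize the simplest rank-structured approximation: the truncated SVD of a matrix. A minimal NumPy sketch of that base case (not the hierarchical format itself, which organizes such factorizations along a dimension tree):

    ```python
    import numpy as np

    def low_rank_approx(A: np.ndarray, r: int) -> np.ndarray:
        """Best rank-r approximation of A in the Frobenius norm (Eckart-Young)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return (U[:, :r] * s[:r]) @ Vt[:r, :]

    rng = np.random.default_rng(0)
    # A 50x40 matrix that is exactly rank 2: storing its factors costs
    # 2*(50+40) numbers instead of 2000, and the rank-2 truncation is exact.
    A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
    err = np.linalg.norm(A - low_rank_approx(A, 2)) / np.linalg.norm(A)
    print(err)  # numerically zero for an exactly rank-2 matrix
    ```

    The same compression idea, applied recursively to matricizations of a tensor, is what makes high-dimensional functions tractable in hierarchical formats.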

  5. Explorations on High Dimensional Landscapes: Spin Glasses and Deep Learning

    Science.gov (United States)

    Sagun, Levent

    This thesis deals with understanding the structure of high-dimensional and non-convex energy landscapes. In particular, its focus is on the optimization of two classes of functions: homogeneous polynomials and loss functions that arise in machine learning. In the first part, the notion of complexity of a smooth, real-valued function is studied through its critical points. Existing theoretical results predict that certain random functions that are defined on high dimensional domains have a narrow band of values whose pre-image contains the bulk of its critical points. This section provides empirical evidence for convergence of gradient descent to local minima whose energies are near the predicted threshold justifying the existing asymptotic theory. Moreover, it is empirically shown that a similar phenomenon may hold for deep learning loss functions. Furthermore, there is a comparative analysis of gradient descent and its stochastic version showing that in high dimensional regimes the latter is a mere speedup. The next study focuses on the halting time of an algorithm at a given stopping condition. Given an algorithm, the normalized fluctuations of the halting time follow a distribution that remains unchanged even when the input data is sampled from a new distribution. Two qualitative classes are observed: a Gumbel-like distribution that appears in Google searches, human decision times, and spin glasses and a Gaussian-like distribution that appears in conjugate gradient method, deep learning with MNIST and random input data. Following the universality phenomenon, the Hessian of the loss functions of deep learning is studied. The spectrum is seen to be composed of two parts, the bulk which is concentrated around zero, and the edges which are scattered away from zero. Empirical evidence is presented for the bulk indicating how over-parametrized the system is, and for the edges that depend on the input data. Furthermore, an algorithm is proposed such that it would
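
    The thesis's observation that stochastic gradient descent is "a mere speedup" of gradient descent in high-dimensional regimes can be illustrated on a toy noiseless least-squares landscape; the dimensions, learning rate, and batch size below are illustrative choices, not the thesis's experimental setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy noiseless least-squares landscape L(w) = mean((X w - y)^2), comparing
    # full-batch gradient descent with its stochastic (mini-batch) version.
    d, n = 10, 200
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true

    def grad(w, idx):
        """Mean-squared-error gradient evaluated on the rows in idx."""
        Xi, yi = X[idx], y[idx]
        return 2.0 * Xi.T @ (Xi @ w - yi) / len(idx)

    w_gd, w_sgd = np.zeros(d), np.zeros(d)
    for _ in range(1000):
        w_gd = w_gd - 0.05 * grad(w_gd, np.arange(n))               # full gradient
        w_sgd = w_sgd - 0.05 * grad(w_sgd, rng.integers(0, n, 32))  # mini-batch
    # Both reach the same minimizer; SGD trades exact gradients for cheaper steps.
    print(np.linalg.norm(w_gd - w_true), np.linalg.norm(w_sgd - w_true))
    ```

    Because the problem is noiseless, the stochastic gradients vanish at the minimizer too, so both methods converge to the same point; the mini-batch version simply costs 32 rows per step instead of 200.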

  6. DIGITAL

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Digital Flood Insurance Rate Map (DFIRM) Database depicts flood risk information and supporting data used to develop the risk data. The primary risk...

  7. The Role of Entrepreneurial Knowledge and Skills in Developing Digital Entrepreneurial Intentions in Public Universities in Hamedan Province

    Directory of Open Access Journals (Sweden)

    Ahmad yaghoubi Farani

    2016-06-01

    Full Text Available The main purpose of this study was to extend the Theory of Planned Behavior (TPB) to more comprehensively explain the formation of students’ digital entrepreneurial intentions. In particular, the extended TPB incorporates two critical constructs, namely entrepreneurial knowledge and skills, into the original TPB model. Data were collected from 150 computer science students from four public universities in Hamedan province. The results of regression analysis showed a significant relationship between motivational factors such as attitudes, subjective norms, and perceived behavioral control and digital entrepreneurial intentions, with perceived behavioral control playing the strongest role in determining digital entrepreneurial intentions. Furthermore, the results illustrated that entrepreneurial knowledge and skills relate significantly to digital entrepreneurial intentions. Based on the knowledge gained in this study, recommendations were offered for developing entrepreneurial culture, knowledge, and skills in order to promote digital entrepreneurship.
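
    As an illustration of the kind of regression analysis the abstract reports, the sketch below fits an ordinary least-squares model on synthetic data; the predictor names and coefficient values are hypothetical stand-ins (chosen so that perceived behavioral control dominates, as the study found), not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 150  # same sample size as the study; the data here are synthetic

    # Hypothetical standardized predictors from the extended TPB model:
    # attitude, subjective norm, perceived behavioral control (PBC),
    # entrepreneurial knowledge, entrepreneurial skills.
    X = rng.standard_normal((n, 5))
    beta_true = np.array([0.30, 0.20, 0.50, 0.25, 0.20])  # PBC strongest
    y = X @ beta_true + 0.3 * rng.standard_normal(n)

    # Ordinary least squares with an intercept column.
    Xi = np.column_stack([np.ones(n), X])
    beta_hat, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    print(np.round(beta_hat[1:], 2))  # the PBC coefficient comes out largest
    ```

    With standardized predictors, comparing the fitted coefficients is what lets a study rank the relative influence of each construct on intention.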

  8. Digital curation and online resources: digital scanning of surgical tools at the royal college of physicians and surgeons of Glasgow for an open university learning resource.

    Science.gov (United States)

    Earley, Kirsty; Livingstone, Daniel; Rea, Paul M

    2017-01-01

    Collection preservation is essential for the cultural status of any city. However, presenting a collection publicly risks damage. Recently this drawback has been overcome by digital curation. Described here is a method of digitisation using photogrammetry and virtual reality software. Items were selected from the Royal College of Physicians and Surgeons of Glasgow archives, and implemented into an online learning module for the Open University. Images were processed via Agisoft Photoscan, Autodesk Memento, and Garden Gnome Object 2VR. Although problems arose due to specularity, 2VR digital models were developed for online viewing. Future research must minimise the difficulty of digitising specular objects.

  9. DIGITAL TRANSFORMATION OF UNIVERSITY EDUCATION IN UKRAINE: TRAJECTORIES OF DEVELOPMENT IN THE CONDITIONS OF NEW TECHNOLOGICAL AND ECONOMIC ORDER

    Directory of Open Access Journals (Sweden)

    Oleg Ye. Kaminskyi

    2018-04-01

    Full Text Available The article substantiates the role of the digital transformation of higher education in Ukraine in the era of the fourth industrial revolution. It demonstrates the need to develop a strategy for the digital transformation of university education, as well as to form new information and communication competencies. According to the authors, the strategy for the digital transformation of the university education system has to include the modernization of corporate IT architecture management, implemented as a cloud-based platform. The authors analysed the main possible directions of the transformation of educational services and the accompanying business processes. The use of blockchain technology to build the educational content management module is proposed. The integration of the educational content management modules of different Ukrainian universities should become the basis for creating a global cloud-based platform for higher education.

  10. Do Gender and Age Affect the Level of Digital Competence? A Study with University Students

    Directory of Open Access Journals (Sweden)

    Marcos CABEZAS GONZÁLEZ

    2017-12-01

    Full Text Available The characteristics of Information and Communication Technologies (ICT) and their implementation at the global level have led to significant changes in different areas, especially institutional ones. This article presents the results of a research study whose purpose was to determine the level of digital competence of university education students and to verify whether the variables of gender and age have any influence on it. A quantitative methodology was used, with a non-experimental, descriptive and inferential method, and a digital questionnaire was employed as the instrument for collecting information on the dimensions of knowledge of, management of, and attitudes towards ICT. The data were analysed by comparing means, using non-parametric tests. The results show that the students in the sample evaluated their knowledge of ICT concepts negatively, considered their management of devices, tools and services positive, and showed a very positive attitude toward technology. Regarding the variables studied, significant differences were found in favour of men in relation to knowledge and management, and in favour of older subjects with regard to attitude.

  11. First year university student engagement using digital curation and career goal setting

    Directory of Open Access Journals (Sweden)

    Amy Antonio

    2015-10-01

    Full Text Available The engagement of students is one of the most pressing issues facing higher education in the 21st century. Around the world, participation rates in tertiary education are on the rise, and one of the key challenges facing educators is finding ways to engage these students. We present the results of a project that assesses the impact of an engagement strategy in which a cohort of students entering their first year of university (1) establish and maintain a clear goal of their ideal future career and (2) make use of a web-based digital curation tool to research and present their findings. The results demonstrate the effectiveness of the strategy, which could arguably be applied to a broad range of disciplines given that the majority of students today are technologically literate.

  12. The long-term preservation of the digital heritage: the case of universities institutional repositories

    Directory of Open Access Journals (Sweden)

    Luciana Duranti

    2010-03-01

    Full Text Available The article addresses issues related to the long-term preservation of the content of digital archives. Archival material requires special attention to aspects such as trustworthiness, legal value, moral and legal rights, and privacy. The need to ensure the accessibility and integrity of digital data is, however, a problem that cuts across all fields of computerization and is closely tied to factors such as frequent duplication and the correct choice of metadata. Through an analysis of the case study of cIRcle, the institutional digital repository of the University of British Columbia (UBC), the paper illustrates problems, risks, and solutions useful in managing a digital archive, showing that the experience of archivists can help in developing systems for information repositories that are not strictly archival.

  13. University Mentoring with the Support of a Digital Pen and Hypermedia Resources

    Directory of Open Access Journals (Sweden)

    Manuel Francisco Aguilar Tamayo

    2014-12-01

    Full Text Available This paper systematizes face-to-face tutoring experience and the use of technologies to make audio and written recordings of tutorial sessions with undergraduate, master’s and doctoral students available online. One hundred and three tutorial sessions with 26 students and a single tutor are analyzed; the sessions were recorded by means of a digital pen that recorded sound and writing synchronously. By means of this analysis students’ issues and problems were identified and a model for connecting teaching strategy and the production of hypermedia resources is presented. We contend that it is important to create learning resources to accompany the educational process of tutees. In conclusion the study presents a model for organizing and supporting university level tutoring.

  14. The influence of institutional measures and technological proficiency on university teaching through digital platforms

    Directory of Open Access Journals (Sweden)

    Tirado, Ramón

    2012-06-01

    Full Text Available The objective of this study is to empirically test the theoretical model that explains the influence of primary and secondary factors on the integration of digital platforms in university teaching. A sample of 495 teachers from universities in Andalusia completed an online questionnaire that analysed the functions of usage, the digital materials used, the didactic and technological competence of the teaching staff, the support measures adopted by the institutions and the effect on teaching of platform use. Prior factor analysis and the application of the Amos program enabled us to develop a structural equation model to corroborate the indirect influence of the support measures and institutional recognition on teachers in their use of the platforms, and the direct influence of the teachers’ technological proficiency.

  15. University energy management improvement on basis of standards and digital technologies

    Directory of Open Access Journals (Sweden)

    Novikova Olga

    2018-01-01

    Full Text Available Nowadays, to implement an energy management system it is important not only to fulfill legal requirements but also to follow the recommendations of international and national management standards. The purpose of this article is to develop a concept and methodology for optimizing and improving the energy management system (EMS) of universities, implementing legal requirements and the recommendations of international and national management standards with the help of digital technologies. The research used systematic analysis, a complex approach, logical sampling, and analogy. It is shown that this process should follow a process-based approach, in accordance with the quality management standard ISO 9001 and the energy management standard ISO 50001. The authors developed the structure of a basic energy management standard, "Guidelines for the energy management system". It is shown that involving senior technical students in the EMS improvement project allows them to expand their competencies in new techniques and technologies. The cloud service Bitrix24 was chosen for IT support of the project. During the study, a list of characteristics was used as the basis for creating a query to the technology department of the university, and DBMS Microsoft Access was chosen for its creation. In addition, the possible results of initiating a single database containing all the information needed for the accounting and control of energy supply were listed, and the possibility of implementing an automated energy management system and its results were considered. The actions described in this research can be implemented at any university, extending energy management to universities worldwide.

  16. The movable digital planetarium of the Cruzeiro do Sul University as a disseminating agent of astronomy

    Science.gov (United States)

    Voelzke, Marcos Rincon

    2012-10-01

    The Movable Digital Planetarium of the Cruzeiro do Sul University has been working to publicize and popularize astronomy, in particular among students and teachers of primary (EF) and secondary (EM) education in municipal and state schools of the city of São Paulo, but also for the general public at large. The aim of this paper is to present the activities already undertaken by this planetarium. In 2010, several presentations were recorded: for the School Cruzeiro do Sul, in São Miguel Paulista, serving 161 children in the EF; the Eighth Symposium on Education, Cruzeiro do Sul University, 75 students; the NGO Educational Project Capuano, Anália Franco, 30 adults; the Fair Student Guide in Shopping Center Norte, 455 people; the NGO Association for Charitable Paulista, Burgo Paulista, 70 children; the Workshop of Advanced Computing and Informatics, Cruzeiro do Sul University, 37 students; and the Day of Social Responsibility, Social Work in Don Bosco, Itaquera, 133 people. In 2011 the presentations took place during the XIII Regional Meeting of Astronomy Education at Cruzeiro do Sul University, serving 112 teachers; the College Cruzeiro do Sul, São Miguel Paulista, 356 children of the EF; the College Brasilia of São Paulo, Anália Franco, 102 children in the EF; and the Scout Group Caramuru, São Paulo, 104 children. The methodology applied in all presentations consisted of the exhibition of two videos about astronomy followed by a discussion of the issues presented. Surveys have shown great interest among the majority of participants in learning more about the subject, which clearly demonstrates the importance of non-formal settings for the teaching of astronomy.

  17. THE ROLE OF DIGITAL MARKETING IN UNIVERSITY SPORT: AN OVERVIEW STUDY OF HIGHER EDUCATION INSTITUTION IN CROATIA

    Directory of Open Access Journals (Sweden)

    Antun Biloš

    2016-12-01

    Full Text Available The importance of student sport activities within the structure of academic development is arguably significant. However, university sport is one of the elements of academic development that is not represented adequately as a research subject on a global scale in both scientific and professional environments alike. Along with the global growth of university level education based on the rise of student mobility across countries and continents, and the strong global ICT development, a new perspective on university sport can be observed and several implications analyzed. The focus of this paper is set on the communication capabilities of the internet as a digital medium that can be used as a means of fostering student sport and related activities while taking into account the characteristics and behavioral components of the student population. The primary research was conducted on a sample of students of Josip Juraj Strossmayer University of Osijek. The research provided several interesting implications on student behavior regarding the general information collection and consumption, as well as information about student sport activities on the university level. The paper provides a brief sport marketing literature review and suggests several important guidelines for further research. The assumption that the internet is a key element in the marketing potential of student sport was confirmed. Comparative analysis of digital marketing activities of benchmark universities has been conducted in order to determine suggestions on creating and/or improving digital marketing tools such as web site, social network presence and mobile application for reaching marketing potential of university sport.

  18. Digital Literacy Skills Among Librarians In University Libraries In The 21st Century In Edo And Delta States Nigeria

    Directory of Open Access Journals (Sweden)

    Emiri

    2015-08-01

    Full Text Available Abstract Libraries all over the world have been faced with evolving technological advancement, globalization, and the digitization of information. These have led to library automation and digital and virtual libraries. This paper discusses contemporary digital literacy skills (DLS) among librarians in university libraries in the 21st century in Edo and Delta States of Southern Nigeria. The study was guided by six objectives and research questions and one hypothesis. The design of the study is a descriptive survey, and the population consists of all librarians from university libraries in the aforementioned states in Nigeria. The instrument used to generate data was the questionnaire, and the data generated were analyzed using simple percentages and frequency counts for the research questions and SPSS version 14.0 for the hypothesis. The findings show that electronic mailing, social networking, use of PDAs, mobile phones, and internet surfing are the major DLS amongst librarians. It was also discovered that librarians acquired DLS through colleagues’ assistance, trial and error, IT programmes, and formal education, while librarians’ level of use of DLS is low, amongst other findings. The researcher recommends that the management of university libraries provide training for librarians to help update their knowledge in the application of digital skills, and that digital skill competence be given more attention during the recruitment of librarians, amongst others.

  19. Harnessing high-dimensional hyperentanglement through a biphoton frequency comb

    Science.gov (United States)

    Xie, Zhenda; Zhong, Tian; Shrestha, Sajan; Xu, Xinan; Liang, Junlin; Gong, Yan-Xiao; Bienfang, Joshua C.; Restelli, Alessandro; Shapiro, Jeffrey H.; Wong, Franco N. C.; Wei Wong, Chee

    2015-08-01

    Quantum entanglement is a fundamental resource for secure information processing and communications, and hyperentanglement or high-dimensional entanglement has been separately proposed for its high data capacity and error resilience. The continuous-variable nature of the energy-time entanglement makes it an ideal candidate for efficient high-dimensional coding with minimal limitations. Here, we demonstrate the first simultaneous high-dimensional hyperentanglement using a biphoton frequency comb to harness the full potential in both the energy and time domain. Long-postulated Hong-Ou-Mandel quantum revival is exhibited, with up to 19 time-bins and 96.5% visibilities. We further witness the high-dimensional energy-time entanglement through Franson revivals, observed periodically at integer time-bins, with 97.8% visibility. This qudit state is observed to simultaneously violate the generalized Bell inequality by up to 10.95 standard deviations while observing recurrent Clauser-Horne-Shimony-Holt S-parameters up to 2.76. Our biphoton frequency comb provides a platform for photon-efficient quantum communications towards the ultimate channel capacity through energy-time-polarization high-dimensional encoding.

  20. Digital Divide in the Utilization of Information and Communication Technology (ICT) in Counsellor Education in Nigerian Universities

    Science.gov (United States)

    Eyo, Mfon

    2014-01-01

    This study investigated digital divide in the utilization of Information and Communication Technology (ICT) in counsellor education in Nigerian universities. It had two research questions and two hypotheses tested at 0.05 level of significance. It adopted a survey design and used ICT Utilization Questionnaire (IUQ) in gathering data from the…

  1. Analysing spatially extended high-dimensional dynamics by recurrence plots

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Kurths, Jürgen [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Humboldt Universität zu Berlin, Institut für Physik (Germany); Nizhny Novgorod State University, Department of Control Theory, Nizhny Novgorod (Russian Federation); Foerster, Saskia [GFZ German Research Centre for Geosciences, Section 1.4 Remote Sensing, Telegrafenberg, 14473 Potsdam (Germany)

    2015-05-08

    Recurrence plot based measures of complexity are powerful tools for characterizing complex dynamics. In this letter we show the potential of selected recurrence plot measures for the investigation of even high-dimensional dynamics. We apply this method to spatially extended chaos, such as derived from the Lorenz96 model, and show that the recurrence plot based measures can qualitatively characterize typical dynamical properties such as chaotic or periodic dynamics. Moreover, we demonstrate its power by analysing satellite image time series of vegetation cover with contrasting dynamics as a spatially extended and potentially high-dimensional example from the real world. - Highlights: • We use recurrence plots for analysing spatially extended dynamics. • We investigate the high-dimensional chaos of the Lorenz96 model. • The approach distinguishes different spatio-temporal dynamics. • We use the method for studying vegetation cover time series.
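As a minimal illustration of the technique (a sketch, not the authors' implementation), a recurrence plot is just a thresholded pairwise-distance matrix of a trajectory, and the recurrence rate, one of the simplest recurrence quantification measures, is the fraction of recurrent pairs. The function names and the toy sine trajectory are ours:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 iff ||x_i - x_j|| < eps."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]          # treat a scalar series as 1-D state vectors
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent point pairs, a basic RQA measure."""
    return R.mean()

# A noisy-free sine is nearly periodic, so recurrences cluster along
# diagonal lines of the plot and the recurrence rate is moderate.
t = np.linspace(0, 8 * np.pi, 200)
R = recurrence_matrix(np.sin(t), eps=0.2)
rr = recurrence_rate(R)
```

For periodic dynamics the plot shows unbroken diagonal lines; for chaotic dynamics the diagonals fragment, which is what diagonal-line-based measures quantify.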

  2. Editorial volume 3 - issue 1: Wishes and hopes for the digital university.

    Directory of Open Access Journals (Sweden)

    Yngve Nordkvelle

    2007-12-01

    issue. They suggest that letting university teachers study online would be a valuable exercise before letting them organize and run online learning themselves. Their paper reveals how teachers reflect on being students themselves when they learn how to study online. This is in a profound way an essential step in making colleagues critical about e-learning. Kristen Snyder coins the term “digital culture” as a key term in understanding the “information age”. She proposes that technology in human communication is a part of the communication act and therefore a part of the process of creating meaning. Her aim is to develop an awareness of the implications for behavior, norms and values, and of how meaning making is integral to understanding the digital culture. She addresses in many respects the concerns voiced by Douglas Kellner (above). In developing a digital culture within the university, we can already trace significant differences between student cohorts. A general feeling is that mature students are less confident with ICT and its associated soft- and hardware than the youngest students arriving directly from upper secondary education. In a journal addressing lifelong learning, Håvard Skaar's contribution suggests it is interesting to understand children's learning processes in the area of ICT from an early stage. The article focuses on how boys and girls express themselves differently when using multimedia. Finally, Gunilla Jedeskog, who has followed the implementation of ICT in Swedish schools for more than two decades, analyses the policy documents that guided this development. She addresses ownership of the process, and how it was interpreted. She finds that implementation was anything but a streamlined process, in many ways similar to the process in higher education, and that it changed focus over time.

  3. High-dimensional model estimation and model selection

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I will review concepts and algorithms from high-dimensional statistics for linear model estimation and model selection. I will particularly focus on the so-called p>>n setting where the number of variables p is much larger than the number of samples n. I will focus mostly on regularized statistical estimators that produce sparse models. Important examples include the LASSO and its matrix extension, the Graphical LASSO, and more recent non-convex methods such as the TREX. I will show the applicability of these estimators in a diverse range of scientific applications, such as sparse interaction graph recovery and high-dimensional classification and regression problems in genomics.
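As a hedged sketch of the LASSO in the p >> n regime described above (the toy data and function names are ours, not from the lecture), cyclic coordinate descent with soft-thresholding recovers a sparse coefficient vector:

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO via cyclic coordinate descent:
    minimize (1/2n)*||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]          # partial residual
            b[j] = soft_threshold(X[:, j] @ r / n, lam) / col_sq[j]
    return b

# Toy p >> n problem: 100 variables, 50 samples, only 3 true signals.
rng = np.random.default_rng(0)
n, p = 50, 100
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = 5.0
y = X @ beta + 0.5 * rng.standard_normal(n)
b_hat = lasso_cd(X, y, lam=0.2)
support = set(np.argsort(-np.abs(b_hat))[:3])
```

The L1 penalty drives most coefficients exactly to zero, which is the model-selection behavior the lecture refers to; the Graphical LASSO and TREX refine the same idea.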

  4. University digital libraries in Spain and TIC as paradigms of the information needs for teachers and students

    Directory of Open Access Journals (Sweden)

    Carlos Oliva Marañón

    2012-11-01

    Full Text Available University digital libraries have improved markedly in recent years, allowing easy retrieval of information in different media. The objectives of this research are to verify the suitability of online catalogs for meeting the information needs of teachers and students in the area of Library and Information Science by evaluating a sample of 23 university digital libraries, and to propose the necessary improvements. The results verify the suitability of online catalogs for solving the information needs of teachers, researchers, and students, with the university libraries of Barcelona, Granada, and Sevilla being the most relevant in the area of documentation, and confirm the professionalism of librarians in heeding the needs of users. Among other improvements, the study proposes training teachers and students in the use of electronic resources, and creating online help to improve Web user interfaces so that information can be retrieved quickly and efficiently.

  5. Using High-Dimensional Image Models to Perform Highly Undetectable Steganography

    Science.gov (United States)

    Pevný, Tomáš; Filler, Tomáš; Bas, Patrick

    This paper presents a complete methodology for designing practical and highly undetectable stegosystems for real digital media. The main design principle is to minimize a suitably defined distortion by means of an efficient coding algorithm. The distortion is defined as a weighted difference of extended state-of-the-art feature vectors already used in steganalysis. This allows us to "preserve" the model used by the steganalyst and thus remain undetectable even for large payloads. The framework can be efficiently implemented even when the dimensionality of the feature set used by the embedder is larger than 10^7. The high-dimensional model is necessary to avoid known security weaknesses. Although high-dimensional models can be a problem in steganalysis, we explain why they are acceptable in steganography. As an example, we introduce HUGO, a new embedding algorithm for spatial-domain digital images, and we contrast its performance with LSB matching. On the BOWS2 image database, and in contrast with LSB matching, HUGO allows the embedder to hide a 7× longer message at the same level of security.
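HUGO itself is driven by a high-dimensional feature model; the baseline it is contrasted with, LSB matching, is much simpler and can be sketched as ±1 embedding (a hedged illustration, with names and toy data ours): when a pixel's least significant bit disagrees with the message bit, the pixel is randomly incremented or decremented rather than having its LSB flipped, which avoids the pairs-of-values artifact of plain LSB replacement.

```python
import numpy as np

def lsb_match_embed(pixels, bits, rng):
    """±1 (LSB matching) embedding into a flat array of 8-bit pixels."""
    out = pixels.astype(np.int16).copy()
    for i, bit in enumerate(bits):
        if out[i] % 2 != bit:
            step = rng.choice((-1, 1))
            # keep values inside the valid 8-bit range
            if out[i] == 0:
                step = 1
            elif out[i] == 255:
                step = -1
            out[i] += step
    return out.astype(np.uint8)

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=64, dtype=np.uint8)
msg = rng.integers(0, 2, size=16)
stego = lsb_match_embed(cover, msg, rng)
# the receiver reads the message straight off the LSBs
extracted = stego[:16] % 2
```

No pixel changes by more than 1, yet the parity channel carries one bit per used pixel; HUGO's contribution is choosing *where* and *how* to make such changes so that steganalysis features are preserved.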

  6. Supporting Dynamic Quantization for High-Dimensional Data Analytics.

    Science.gov (United States)

    Guzun, Gheorghi; Canahuate, Guadalupe

    2017-05-01

    Similarity searches are at the heart of exploratory data analysis tasks. Distance metrics are typically used to characterize the similarity between data objects represented as feature vectors. However, when the dimensionality of the data increases and the number of features is large, traditional distance metrics fail to distinguish between the closest and furthest data points. Localized distance functions have been proposed as an alternative to traditional distance metrics. These functions only consider dimensions close to the query to compute the distance/similarity. Furthermore, in order to enable interactive exploration of high-dimensional data, indexing support for ad-hoc queries is needed. In this work we set out to investigate whether bit-sliced indices can be used for exploratory analytics such as similarity searches and data clustering over high-dimensional big data. We also propose a novel dynamic quantization called Query-dependent Equi-Depth (QED) quantization and show its effectiveness in characterizing high-dimensional similarity. When applying QED we observe improvements in kNN classification accuracy over traditional distance functions.
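QED makes the bin boundaries query-dependent; as a static building block only (a sketch under our own naming, not the paper's algorithm), plain equi-depth quantization places bin edges at quantiles so each bin receives roughly the same number of points, unlike equal-width binning:

```python
import numpy as np

def equi_depth_edges(values, k):
    """Equi-depth (equal-frequency) bin edges for k bins."""
    return np.quantile(values, np.linspace(0.0, 1.0, k + 1))

def quantize(values, edges):
    """Map each value to its bin index in [0, k-1]."""
    idx = np.searchsorted(edges, values, side="right") - 1
    return np.clip(idx, 0, len(edges) - 2)

# Skewed data: equal-width bins would pile everything into a few bins,
# equi-depth bins stay balanced.
x = np.random.default_rng(0).exponential(size=1000)
edges = equi_depth_edges(x, 8)
codes = quantize(x, edges)
counts = np.bincount(codes, minlength=8)
```

Balanced bins are what make bit-sliced indices over the quantized codes effective for ad-hoc similarity queries.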

  7. A hybridized K-means clustering approach for high dimensional ...

    African Journals Online (AJOL)

    International Journal of Engineering, Science and Technology ... Due to incredible growth of high dimensional dataset, conventional data base querying methods are inadequate to extract useful information, so researchers nowadays ... Recently cluster analysis is a popularly used data analysis method in number of areas.
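The AJOL snippet does not describe the hybridization itself; for reference, the baseline Lloyd's k-means that such hybrid schemes typically start from can be sketched as follows (toy data and names ours):

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's k-means: alternate nearest-center assignment
    and center recomputation."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # assign every point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        # recompute centers; keep the old center if a cluster empties
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated blobs are recovered as two clusters.
rng = np.random.default_rng(1)
blob_a = rng.normal(0.0, 0.5, size=(20, 2))
blob_b = rng.normal(10.0, 0.5, size=(20, 2))
X = np.vstack([blob_a, blob_b])
labels, centers = kmeans(X, k=2)
```

In high dimensions the Euclidean assignment step degrades (distances concentrate), which is precisely the motivation for hybridizing or preprocessing before clustering.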

  8. On Robust Information Extraction from High-Dimensional Data

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2014-01-01

    Roč. 9, č. 1 (2014), s. 131-144 ISSN 1452-4864 Grant - others:GA ČR(CZ) GA13-01930S Institutional support: RVO:67985807 Keywords : data mining * high-dimensional data * robust econometrics * outliers * machine learning Subject RIV: IN - Informatics, Computer Science

  9. Inference in High-dimensional Dynamic Panel Data Models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Tang, Haihan

    We establish oracle inequalities for a version of the Lasso in high-dimensional fixed effects dynamic panel data models. The inequalities are valid for the coefficients of the dynamic and exogenous regressors. Separate oracle inequalities are derived for the fixed effects. Next, we show how one can...

  10. Pricing High-Dimensional American Options Using Local Consistency Conditions

    NARCIS (Netherlands)

    Berridge, S.J.; Schumacher, J.M.

    2004-01-01

    We investigate a new method for pricing high-dimensional American options. The method is of finite difference type but is also related to Monte Carlo techniques in that it involves a representative sampling of the underlying variables. An approximating Markov chain is built using this sampling and

  11. Irregular grid methods for pricing high-dimensional American options

    NARCIS (Netherlands)

    Berridge, S.J.

    2004-01-01

    This thesis proposes and studies numerical methods for pricing high-dimensional American options; important examples being basket options, Bermudan swaptions and real options. Four new methods are presented and analysed, both in terms of their application to various test problems, and in terms of

  12. Hospital information systems: experience at the fully digitized Seoul National University Bundang Hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Hwang, Hee; Jheon, Sanghoon

    2016-08-01

    The different levels of health information technology (IT) adoption and its integration into hospital workflow can affect how fully the benefits of health IT are realized. We aim to share our experiences on the journey to successful adoption of health IT over 13 years at a tertiary university hospital in South Korea. An integrated system of comprehensive applications for direct care, support care, and smart care has been implemented with the latest IT and a rich user information platform, achieving a fully digitized hospital. User experience design methodology, barcode and radio-frequency identification (RFID) technologies, smartphone and mobile technologies, and data analytics were integrated into hospital workflow. Applications for user-centered electronic medical records (EMR) and clinical decision support (CDS), closed-loop medication administration (CLMA), mobile EMR and a dashboard system for care coordination, a clinical data warehouse (CDW) system, and patient engagement solutions were designed and developed to improve quality of care, work efficiency, and patient safety. We believe that comprehensive electronic health record systems and patient-centered smart hospital applications will go a long way in ensuring seamless patient care and experience.

  13. A basis of common approach to the development of universal steganalysis methods for digital images

    Directory of Open Access Journals (Sweden)

    Alla А. Kobozeva

    2014-12-01

    Full Text Available In this paper a new common approach to steganalysis of digital images is developed. New properties of the formal parameters defining an image are identified, theoretically grounded, and practically tested. For the first time, characteristics of the mutual disposition of the left and right singular vectors corresponding to the largest singular value of an image matrix (or matrix block) and of the vector composed of the singular values obtained from the singular value decomposition of that matrix are obtained. It is shown that for the majority of the blocks of an original image (regardless of the storage format — lossy or lossless), the angle between the left (right) singular vector and the vector composed of the singular values is determined by the angle between the n-optimal vector and the standard basis of the space of the corresponding dimension. It is shown that this property is violated for the same formal parameters in a disturbed image. This is an indicator of integrity violation, in particular of steganographic transformation, and it can be used to develop new universal steganalysis methods and algorithms whose efficiency does not depend on the specifics of the steganographic algorithm used to embed the additional information.
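The kind of feature described can be sketched in a few lines of numpy (a hedged illustration with our own names, not the paper's exact construction): take the SVD of a square block and measure the angle between the first left/right singular vector and the vector of singular values.

```python
import numpy as np

def svd_angle_features(block):
    """Angles (degrees) between the first left and right singular
    vectors of a square block and its singular-value vector."""
    U, s, Vt = np.linalg.svd(block.astype(float))
    def angle(a, b):
        # abs() removes the sign ambiguity of singular vectors
        c = abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
    return angle(U[:, 0], s), angle(Vt[0], s)

# Compare a cover block with a ±1-perturbed ("stego-like") version.
rng = np.random.default_rng(0)
cover_block = rng.integers(0, 256, size=(8, 8))
stego_block = np.clip(cover_block + rng.integers(-1, 2, size=(8, 8)), 0, 255)
cover_angles = svd_angle_features(cover_block)
stego_angles = svd_angle_features(stego_block)
```

A steganalysis detector built on this idea would flag blocks whose angles drift from the regularity observed in undisturbed images.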

  14. EPS-LASSO: Test for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits.

    Science.gov (United States)

    Xu, Chao; Fang, Jian; Shen, Hui; Wang, Yu-Ping; Deng, Hong-Wen

    2018-01-25

    Extreme phenotype sampling (EPS) is a broadly used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in extreme phenotypic samples, EPS can boost the association power compared to random sampling. Most existing statistical methods for EPS examine the genetic factors individually, although many quantitative traits have multiple genetic factors underlying their variation. It is desirable to model the joint effects of genetic factors, which may increase the power and identify novel quantitative trait loci under EPS. The joint analysis of genetic data in high-dimensional situations requires specialized techniques, e.g., the least absolute shrinkage and selection operator (LASSO). Although there is extensive research and application related to LASSO, the statistical inference and testing for the sparse model under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with a hypothesis test for high-dimensional regression under EPS based on a decorrelated score function. Comprehensive simulation shows that EPS-LASSO outperforms existing methods, with stable type I error and FDR control. EPS-LASSO provides consistent power in both low- and high-dimensional situations compared with other methods designed for high-dimensional settings. The power of EPS-LASSO is close to that of other low-dimensional methods when the causal effect sizes are small, and is superior when the effects are large. Applying EPS-LASSO to a transcriptome-wide gene expression study for obesity reveals 10 significant body-mass-index-associated genes. Our results indicate that EPS-LASSO is an effective method for EPS data analysis that can account for correlated predictors. The source code is available at https://github.com/xu1912/EPSLASSO. Contact: hdeng2@tulane.edu. Supplementary data are available at Bioinformatics online.

  15. Digital teaching file. Concept, implementation, and experiences in a university setting

    International Nuclear Information System (INIS)

    Trumm, C.; Wirth, S.; Treitl, M.; Lucke, A.; Kuettner, B.; Pander, E.; Clevert, D.-A.; Glaser, C.; Reiser, M.; Dugas, M.

    2005-01-01

    Film-based teaching files require a substantial investment of human, logistic, and financial resources. The combination of computer and network technology facilitates the workflow integration of distributing radiologic teaching cases within an institution (intranet) or via the World Wide Web (Internet). A digital teaching file (DTF) should include the following basic functions: image import from different sources and in different formats, editing of imported images, uniform case classification, quality control (peer review), controlled access for different user groups (in-house and external), and an efficient retrieval strategy. The portable network graphics (PNG) image format is especially suitable for DTFs because of several features: pixel support, 2D interlacing, gamma correction, and lossless compression. The American College of Radiology (ACR) "Index for Radiological Diagnoses" is hierarchically organized and thus an ideal classification system for a DTF. Computer-based training (CBT) in radiology is described in numerous publications, ranging from supplements to traditional learning methods to certified education via the Internet. The attractiveness of a CBT application can be increased by integrating graphical and interactive elements, but this makes workflow integration of daily case input more difficult. Our DTF was built with established Internet instruments and integrated into a heterogeneous PACS/RIS environment. It facilitates quick transfer (DICOM Send) of selected images to the DTF at the time of interpretation, and access to the DTF application at any time, anywhere within the university hospital intranet, using a standard web browser. A DTF is a small but important building block in an institutional strategy of knowledge management. (orig.)

  16. A University Library Creates a Digital Repository for Documenting and Disseminating Community Engagement

    Science.gov (United States)

    Miller, William A.; Billings, Marilyn

    2012-01-01

    Digital repositories are new tools for documenting the accumulated scholarly work produced at academic institutions and disseminating that material broadly via the internet. Digital repositories support all file types and can be adapted to meet the custom design specifications of individual institutions. A section for community engagement…

  17. High Dimensional Modulation and MIMO Techniques for Access Networks

    DEFF Research Database (Denmark)

    Binti Othman, Maisara

    Exploration of advanced modulation formats and multiplexing techniques for next generation optical access networks are of interest as promising solutions for delivering multiple services to end-users. This thesis addresses this from two different angles: high dimensionality carrierless...... the capacity per wavelength of the femto-cell network. Bit rate up to 1.59 Gbps with fiber-wireless transmission over 1 m air distance is demonstrated. The results presented in this thesis demonstrate the feasibility of high dimensionality CAP in increasing the number of dimensions and their potentially......) optical access network. 2 X 2 MIMO RoF employing orthogonal frequency division multiplexing (OFDM) with 5.6 GHz RoF signaling over all-vertical cavity surface emitting lasers (VCSEL) WDM passive optical networks (PONs). We have employed polarization division multiplexing (PDM) to further increase...

  18. HSM: Heterogeneous Subspace Mining in High Dimensional Data

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Seidl, Thomas

    2009-01-01

    Heterogeneous data, i.e. data with both categorical and continuous values, is common in many databases. However, most data mining algorithms assume either continuous or categorical attributes, but not both. In high dimensional data, phenomena due to the "curse of dimensionality" pose additional...... challenges. Usually, due to locally varying relevance of attributes, patterns do not show across the full set of attributes. In this paper we propose HSM, which defines a new pattern model for heterogeneous high dimensional data. It allows data mining in arbitrary subsets of the attributes that are relevant...... for the respective patterns. Based on this model we propose an efficient algorithm, which is aware of the heterogeneity of the attributes. We extend an indexing structure for continuous attributes such that HSM indexing adapts to different attribute types. In our experiments we show that HSM efficiently mines...

  19. Analysis of chaos in high-dimensional wind power system.

    Science.gov (United States)

    Wang, Cong; Zhang, Hongli; Fan, Wenhui; Ma, Ping

    2018-01-01

    A comprehensive analysis of chaos in a high-dimensional wind power system is performed in this study. A high-dimensional wind power system is more complex than most power systems. An 11-dimensional wind power system proposed by Huang, which has not been analyzed in previous studies, is investigated. When the system is affected by external disturbances, including single-parameter and periodic disturbances, or when its parameters change, the chaotic dynamics of the wind power system are analyzed and the chaotic parameter ranges are obtained. The existence of chaos is confirmed by calculating and analyzing the Lyapunov exponents of all state variables and the state variable sequence diagram. Theoretical analysis and numerical simulations show that chaos will occur in the wind power system when parameter variations and external disturbances change to a certain degree.
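The wind power model itself is 11-dimensional, but the chaos criterion used, a positive largest Lyapunov exponent, can be illustrated self-containedly on the one-dimensional logistic map (a sketch with our own names and parameter choices, not the paper's system):

```python
import numpy as np

def logistic_lyapunov(r, x0=0.3, n=50000, burn=1000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1-2x)|.
    A positive value indicates chaos."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += np.log(abs(r * (1 - 2 * x)) + 1e-300)
        x = r * x * (1 - x)
    return acc / n

lam_chaotic = logistic_lyapunov(4.0)   # chaotic regime (analytically ln 2)
lam_periodic = logistic_lyapunov(3.2)  # stable period-2 orbit, negative
```

For the 11-dimensional system the same idea applies along each state direction, which is why the authors examine the full Lyapunov spectrum rather than a single exponent.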

  20. HIGH DIMENSIONAL COVARIANCE MATRIX ESTIMATION IN APPROXIMATE FACTOR MODELS.

    Science.gov (United States)

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2011-01-01

    The variance-covariance matrix plays a central role in the inferential theories of high-dimensional factor models in finance and economics. Popular regularization methods that directly exploit sparsity are not applicable to many financial problems. Classical methods of estimating covariance matrices are based on strict factor models, assuming independent idiosyncratic components. This assumption, however, is restrictive in practical applications. By assuming a sparse error covariance matrix, we allow the presence of cross-sectional correlation even after the common factors are taken out, which enables us to combine the merits of both methods. We estimate the sparse covariance using the adaptive thresholding technique of Cai and Liu (2011), taking into account the fact that direct observations of the idiosyncratic components are unavailable. The impact of high dimensionality on covariance matrix estimation based on the factor structure is then studied.
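A minimal sketch of entrywise thresholding (a simplified universal-threshold cousin of the adaptive rule cited above, not the paper's estimator, and with names of our choosing): soft-threshold the off-diagonal entries of the sample covariance while leaving the variances intact.

```python
import numpy as np

def soft_threshold_cov(X, tau):
    """Sparse covariance estimate: soft-threshold off-diagonal entries
    of the sample covariance; diagonal variances are preserved."""
    S = np.cov(X, rowvar=False)
    T = np.sign(S) * np.maximum(np.abs(S) - tau, 0.0)
    np.fill_diagonal(T, np.diag(S))
    return T

# Toy data with a diagonal true covariance: thresholding removes the
# spurious off-diagonal sampling noise of order 1/sqrt(n).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))
S_hat = soft_threshold_cov(X, tau=0.2)
off_diag = S_hat - np.diag(np.diag(S_hat))
```

In the factor-model setting the same thresholding is applied to the residual covariance after the common factors have been removed, which is what allows weak cross-sectional correlation to survive.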

  1. High-dimensional data in economics and their (robust) analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 12, č. 1 (2017), s. 171-183 ISSN 1452-4864 R&D Projects: GA ČR GA17-07384S Institutional support: RVO:67985556 Keywords : econometrics * high-dimensional data * dimensionality reduction * linear regression * classification analysis * robustness Subject RIV: BA - General Mathematics OBOR OECD: Business and management http://library.utia.cas.cz/separaty/2017/SI/kalina-0474076.pdf

  2. High-dimensional Data in Economics and their (Robust) Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 12, č. 1 (2017), s. 171-183 ISSN 1452-4864 R&D Projects: GA ČR GA17-07384S Grant - others:GA ČR(CZ) GA13-01930S Institutional support: RVO:67985807 Keywords : econometrics * high-dimensional data * dimensionality reduction * linear regression * classification analysis * robustness Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability

  3. Quantifying high dimensional entanglement with two mutually unbiased bases

    Directory of Open Access Journals (Sweden)

    Paul Erker

    2017-07-01

    Full Text Available We derive a framework for quantifying entanglement in multipartite and high-dimensional systems using only correlations in two unbiased bases. We further develop such bounds for cases where the second basis is not characterized beyond being unbiased, thus enabling entanglement quantification with minimal assumptions. Finally, we show that our method is feasible to implement experimentally with readily available equipment and even conservative estimates of physical parameters.
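A quick numerical check of what "mutually unbiased" means may help here (a sketch with our own function names): two orthonormal bases of a d-dimensional space are mutually unbiased when every cross-overlap satisfies |⟨b_i|c_j⟩|² = 1/d. The discrete Fourier basis is the standard example unbiased to the computational basis.

```python
import numpy as np

def is_mutually_unbiased(B1, B2, tol=1e-10):
    """Columns of B1, B2 are orthonormal bases of C^d; the bases are
    mutually unbiased iff all |<b1_i, b2_j>|^2 equal 1/d."""
    d = B1.shape[0]
    overlaps = np.abs(B1.conj().T @ B2) ** 2
    return np.allclose(overlaps, 1.0 / d, atol=tol)

d = 5
computational = np.eye(d)
k = np.arange(d)
fourier = np.exp(2j * np.pi * np.outer(k, k) / d) / np.sqrt(d)
```

Measuring correlations in just such a pair of bases is all the entanglement bounds above require; no further characterization of the second basis is needed.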

  4. High dimensional model representation method for fuzzy structural dynamics

    Science.gov (United States)

    Adhikari, S.; Chowdhury, R.; Friswell, M. I.

    2011-03-01

    Uncertainty propagation in multi-parameter complex structures poses significant computational challenges. This paper investigates the possibility of using the High Dimensional Model Representation (HDMR) approach when uncertain system parameters are modeled using fuzzy variables. In particular, the application of HDMR is proposed for fuzzy finite element analysis of linear dynamical systems. The HDMR expansion is an efficient formulation for high-dimensional mapping in complex systems if the higher-order variable correlations are weak, thereby permitting the input-output relationship to be captured by low-order terms. The computational effort to determine the expansion functions using the α-cut method scales polynomially with the number of variables rather than exponentially. This logic is based on the fundamental assumption underlying the HDMR representation that only low-order correlations among the input variables are likely to have significant impacts upon the outputs of most high-dimensional complex systems. The proposed method is first illustrated for multi-parameter nonlinear mathematical test functions with fuzzy variables. The method is then integrated with a commercial finite element software package (ADINA). Modal analysis of a simplified aircraft wing with fuzzy parameters is used to illustrate the generality of the proposed approach. In the numerical examples, triangular membership functions have been used and the results have been validated against direct Monte Carlo simulations. It is shown that with the proposed HDMR approach, the number of finite element function calls can be reduced without significantly compromising accuracy.
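The low-order-correlation assumption can be made concrete with a first-order cut-HDMR surrogate (a generic sketch, not the paper's fuzzy finite element implementation; names are ours): anchor the expansion at a cut point c and sum the one-variable corrections. For a purely additive function the first-order expansion is exact.

```python
import numpy as np

def hdmr_first_order(f, c):
    """First-order cut-HDMR surrogate anchored at c:
    f(x) ≈ f0 + sum_i [ f(c with i-th entry set to x_i) - f0 ]."""
    c = np.asarray(c, dtype=float)
    f0 = f(c)
    def surrogate(x):
        total = f0
        for i in range(len(c)):
            xi = c.copy()
            xi[i] = x[i]
            total += f(xi) - f0
        return total
    return surrogate

# An additive test function: the first-order surrogate reproduces it exactly,
# using only 1 + n evaluations of f per query point.
f = lambda x: x[0] ** 2 + 3 * np.sin(x[1]) - x[2]
c = np.zeros(3)
s = hdmr_first_order(f, c)
x = np.array([0.5, 1.0, -2.0])
```

In the fuzzy setting the same construction is evaluated at each α-cut, which is why the cost grows polynomially in the number of fuzzy variables instead of exponentially.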

  5. High Dimensional Classification Using Features Annealed Independence Rules.

    Science.gov (United States)

    Fan, Jianqing; Fan, Yingying

    2008-01-01

    Classification using high-dimensional features arises frequently in many contemporary statistical studies, such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification is still poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra, and they propose the independence rule to overcome the problem. We first demonstrate that even for the independence classification rule, classification using all the features can be as bad as random guessing, due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. Thus, it is paramount to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently, the threshold value of the test statistics, is proposed based on an upper bound of the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
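The two-sample t-statistic screening at the core of FAIR can be sketched as follows (toy data and names ours): compute a per-feature t-statistic between the two classes and keep only the highest-ranked features, discarding the noise dimensions that would otherwise accumulate.

```python
import numpy as np

def two_sample_t(X, y):
    """Per-feature two-sample t-statistics (Welch form), the ranking
    criterion behind Features Annealed Independence Rules."""
    a, b = X[y == 0], X[y == 1]
    na, nb = len(a), len(b)
    num = a.mean(axis=0) - b.mean(axis=0)
    den = np.sqrt(a.var(axis=0, ddof=1) / na + b.var(axis=0, ddof=1) / nb)
    return num / den

# Toy p >> n data: only the first 5 of 500 features carry class signal.
rng = np.random.default_rng(0)
n, p, k = 40, 500, 5
y = np.repeat([0, 1], n // 2)
X = rng.standard_normal((n, p))
X[y == 1, :k] += 3.0
t_stats = two_sample_t(X, y)
top = np.argsort(-np.abs(t_stats))[:k]
```

A classifier then applies the independence rule only on the selected features; the "annealed" part of FAIR is choosing how many features to keep from an error bound rather than by cross-validation.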

  6. Tribus digitales en las aulas universitarias Digital Tribes in the University Classrooms

    Directory of Open Access Journals (Sweden)

    Andrés Palacios Picos

    2010-03-01

    believe, and that they give rise to many different situations that characterize university teaching and learning practice. Using the ratings of three Likert questionnaires on information processing and communication in three different environments (Moodle, Tuenti and the classroom itself), and applying multivariate techniques (factor analysis and cluster analysis), we have found four clusters: pro-ICT students, anti-ICT students, listless students and neutral students. The presence of such segments of students allows us to conclude that, although computers are nowadays taken for granted in Higher Education classrooms, we are perhaps overestimating both the real impact of ICT on teaching and students' digital competencies, and that this false perception of reality benefits technology vendors but not methodological and pedagogic innovation, which can only be achieved through the necessary reflection on educational matters from educational principles.

  7. Enhanced spectral resolution by high-dimensional NMR using the filter diagonalization method and "hidden" dimensions.

    Science.gov (United States)

    Meng, Xi; Nguyen, Bao D; Ridge, Clark; Shaka, A J

    2009-01-01

    High-dimensional (HD) NMR spectra have poorer digital resolution than low-dimensional (LD) spectra, for a fixed amount of experiment time. This has led to "reduced-dimensionality" strategies, in which several LD projections of the HD NMR spectrum are acquired, each with higher digital resolution; an approximate HD spectrum is then inferred by some means. We propose a strategy that moves in the opposite direction, by adding more time dimensions to increase the information content of the data set, even if only a very sparse time grid is used in each dimension. The full HD time-domain data can be analyzed by the filter diagonalization method (FDM), yielding very narrow resonances along all of the frequency axes, even those with sparse sampling. Integrating over the added dimensions of HD FDM NMR spectra reconstitutes LD spectra with enhanced resolution, often more quickly than direct acquisition of the LD spectrum with a larger number of grid points in each of the fewer dimensions. If the extra-dimensions do not appear in the final spectrum, and are used solely to boost information content, we propose the moniker hidden-dimension NMR. This work shows that HD peaks have unmistakable frequency signatures that can be detected as single HD objects by an appropriate algorithm, even though their patterns would be tricky for a human operator to visualize or recognize, and even if digital resolution in an HD FT spectrum is very coarse compared with natural line widths.

  8. Hawking radiation of a high-dimensional rotating black hole

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Ren; Zhang, Lichun; Li, Huaifan; Wu, Yueqin [Shanxi Datong University, Institute of Theoretical Physics, Department of Physics, Datong (China)

    2010-01-15

    We extend the classical Damour-Ruffini method and discuss the Hawking radiation spectrum of a high-dimensional rotating black hole using a tortoise coordinate transformation defined by taking the back-reaction of the radiation on the spacetime into consideration. Under the condition that energy and angular momentum are conserved, and taking the self-gravitation action into account, we derive Hawking radiation spectra that satisfy the unitarity principle of quantum mechanics. It is shown that the process by which the black hole radiates particles with energy {omega} is a continuous tunneling process. We provide a theoretical basis for further study of the physical mechanism of black-hole radiation. (orig.)

  9. On spectral distribution of high dimensional covariation matrices

    DEFF Research Database (Denmark)

    Heinrich, Claudio; Podolskij, Mark

    In this paper we present the asymptotic theory for spectral distributions of high dimensional covariation matrices of Brownian diffusions. More specifically, we consider N-dimensional Itô integrals with time varying matrix-valued integrands. We observe n equidistant high frequency data points...... of the underlying Brownian diffusion and we assume that N/n -> c in (0,oo). We show that under a certain mixed spectral moment condition the spectral distribution of the empirical covariation matrix converges in distribution almost surely. Our proof relies on the method of moments and applications of graph theory....

  10. The additive hazards model with high-dimensional regressors

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas

    2009-01-01

    This paper considers estimation and prediction in the Aalen additive hazards model in the case where the covariate vector is high-dimensional such as gene expression measurements. Some form of dimension reduction of the covariate space is needed to obtain useful statistical analyses. We study...... model. A standard PLS algorithm can also be constructed, but it turns out that the resulting predictor can only be related to the original covariates via time-dependent coefficients. The methods are applied to a breast cancer data set with gene expression recordings and to the well known primary biliary...

  11. Data analysis in high-dimensional sparse spaces

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    classification techniques for high-dimensional problems are presented: Sparse discriminant analysis, sparse mixture discriminant analysis and orthogonality constrained support vector machines. The first two introduce sparseness to the well-known linear and mixture discriminant analyses and thereby provide low...... are applied to classifications of fish species, ear canal impressions used in the hearing aid industry, microbiological fungi species, and various cancerous tissues and healthy tissues. In addition, novel applications of sparse regressions (also called the elastic net) to the medical, concrete, and food...

  12. Why we need to find time for digital humanities: presenting a new partnership model at the University of Sussex

    Directory of Open Access Journals (Sweden)

    Jane Harvell

    2017-11-01

    Full Text Available Recognizing that academic libraries should develop and nurture strong, mutually beneficial relationships with researchers in digital humanities, the authors believe it is strategically important to invest time and resources exploring ideas and partnering with academic colleagues on projects. This approach can provide many unforeseen benefits to both the Library service and to the workforce. The article is based on our experience as Core Associates of the Sussex Humanities Lab at the University of Sussex. It outlines the impact this collaboration has had, including influencing working practices and culture within the Library, involvement in research bids, informing the development of new services, and addressing library questions using digital humanities methods. Most importantly, it exemplifies a new model of the librarian as equal partner in the research process.

  13. Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology

    Directory of Open Access Journals (Sweden)

    Gustavo K Rohde

    2014-01-01

    Full Text Available Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications around this novel technology have evolved in the last decade, including education, research and clinical applications. This publication highlights a collection of abstracts, each corresponding to a talk given at Carnegie Mellon University's (CMU) Bioimaging Day 2014, co-sponsored by the Biomedical Engineering and Lane Center for Computational Biology Departments at CMU. Topics related specifically to digital pathology are presented in this collection of abstracts, including digital workflow implementation, imaging and artifacts, storage demands, and automated image analysis algorithms.

  14. Adding Value to the University of Oklahoma Libraries History of Science Collections through Digital Enhancement

    Directory of Open Access Journals (Sweden)

    Maura Valentino

    2014-03-01

    Full Text Available Much of the focus of digital collections has been and continues to be on rare and unique materials, including monographs. A monograph may be made even rarer and more valuable by virtue of handwritten marginalia. Using technology to enhance scans of unique books and make previously unreadable marginalia readable increases the value of a digital object to researchers. This article describes a case study of enhancing the marginalia in a rare book by Copernicus.

  15. Scalable Nearest Neighbor Algorithms for High Dimensional Data.

    Science.gov (United States)

    Muja, Marius; Lowe, David G

    2014-11-01

    For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching.
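
    The randomized k-d forest mentioned above builds on the classical k-d tree. As an illustrative sketch only (plain Python, not the FLANN implementation; FLANN's randomized forests and priority search k-means trees add approximation and parallelism on top of this basic idea, and all function names here are ours):

    ```python
    import math
    import random

    def dist(a, b):
        """Euclidean distance between two equal-length point tuples."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def build_kdtree(points, depth=0):
        """Recursively build a k-d tree, splitting on axes in round-robin order."""
        if not points:
            return None
        axis = depth % len(points[0])
        points = sorted(points, key=lambda p: p[axis])
        mid = len(points) // 2
        return {"point": points[mid], "axis": axis,
                "left": build_kdtree(points[:mid], depth + 1),
                "right": build_kdtree(points[mid + 1:], depth + 1)}

    def nearest(node, target, best=None):
        """Exact nearest-neighbor search with branch pruning."""
        if node is None:
            return best
        point, axis = node["point"], node["axis"]
        if best is None or dist(point, target) < dist(best, target):
            best = point
        diff = target[axis] - point[axis]
        near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
        best = nearest(near, target, best)
        # Visit the far subtree only if the splitting plane could hide a closer point.
        if abs(diff) < dist(best, target):
            best = nearest(far, target, best)
        return best
    ```

    In high dimensions the pruning test fires rarely and this exact search degrades toward brute force, which is precisely why FLANN resorts to approximate search.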

  16. Manifold learning to interpret JET high-dimensional operational space

    International Nuclear Information System (INIS)

    Cannas, B; Fanni, A; Pau, A; Sias, G; Murari, A

    2013-01-01

    In this paper, the problem of visualization and exploration of JET high-dimensional operational space is considered. The data come from plasma discharges selected from JET campaigns from C15 (year 2005) up to C27 (year 2009). The aim is to learn the possible manifold structure embedded in the data and to create some representations of the plasma parameters on low-dimensional maps, which are understandable and which preserve the essential properties owned by the original data. A crucial issue for the design of such mappings is the quality of the dataset. This paper reports the details of the criteria used to properly select suitable signals downloaded from JET databases in order to obtain a dataset of reliable observations. Moreover, a statistical analysis is performed to recognize the presence of outliers. Finally data reduction, based on clustering methods, is performed to select a limited and representative number of samples for the operational space mapping. The high-dimensional operational space of JET is mapped using a widely used manifold learning method, the self-organizing maps. The results are compared with other data visualization methods. The obtained maps can be used to identify characteristic regions of the plasma scenario, making it possible to discriminate between regions with a high risk of disruption and those with a low risk. (paper)
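
    The core update rule of a self-organizing map is compact. A minimal sketch for scalar inputs and a 1-D map only (the JET signals are of course multidimensional, and production work would use a library implementation; the function name and hyperparameters below are illustrative):

    ```python
    import math
    import random

    def train_som_1d(data, n_units=8, epochs=200, lr0=0.5, radius0=3.0, seed=0):
        """Train a minimal 1-D self-organizing map on scalar data.

        Each sample pulls its best-matching unit (BMU) and, with a Gaussian
        falloff, the BMU's map neighbors; learning rate and neighborhood
        radius both decay over the epochs.
        """
        rng = random.Random(seed)
        lo, hi = min(data), max(data)
        weights = [rng.uniform(lo, hi) for _ in range(n_units)]
        for t in range(epochs):
            frac = t / epochs
            lr = lr0 * (1.0 - frac)                    # decaying learning rate
            radius = max(radius0 * (1.0 - frac), 0.5)  # shrinking neighborhood
            for x in data:
                bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
                for i in range(n_units):
                    d = abs(i - bmu)
                    if d <= radius:
                        h = math.exp(-(d * d) / (2 * radius * radius))
                        weights[i] += lr * h * (x - weights[i])
        return weights
    ```

    After training, nearby map units hold nearby codebook values, which is what makes the 2-D variant usable as a low-dimensional chart of a high-dimensional operational space.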

  17. Elucidating high-dimensional cancer hallmark annotation via enriched ontology.

    Science.gov (United States)

    Yan, Shankai; Wong, Ka-Chun

    2017-09-01

    Cancer hallmark annotation is a promising technique that could discover novel knowledge about cancer from the biomedical literature. The automated annotation of cancer hallmarks could reveal relevant cancer transformation processes in the literature or extract the articles that correspond to the cancer hallmark of interest. It acts as a complementary approach that can retrieve knowledge from massive text information, advancing numerous focused studies in cancer research. Nonetheless, the high-dimensional nature of cancer hallmark annotation imposes a unique challenge. To address the curse of dimensionality, we compared multiple cancer hallmark annotation methods on 1580 PubMed abstracts. Based on the insights, a novel approach, UDT-RF, which makes use of ontological features is proposed. It expands the feature space via the Medical Subject Headings (MeSH) ontology graph and utilizes novel feature selections for elucidating the high-dimensional cancer hallmark annotation space. To demonstrate its effectiveness, state-of-the-art methods are compared and evaluated by a multitude of performance metrics, revealing the full performance spectrum on the full set of cancer hallmarks. Several case studies are conducted, demonstrating how the proposed approach could reveal novel insights into cancers. https://github.com/cskyan/chmannot. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. High-Dimensional Adaptive Particle Swarm Optimization on Heterogeneous Systems

    International Nuclear Information System (INIS)

    Wachowiak, M P; Sarlo, B B; Foster, A E Lambe

    2014-01-01

    Much work has recently been reported in parallel GPU-based particle swarm optimization (PSO). Motivated by the encouraging results of these investigations, while also recognizing the limitations of GPU-based methods for big problems using a large amount of data, this paper explores the efficacy of employing other types of parallel hardware for PSO. Most commodity systems feature a variety of architectures whose high-performance capabilities can be exploited. In this paper, high-dimensional problems and those that employ a large amount of external data are explored within the context of heterogeneous systems. Large problems are decomposed into constituent components, and analyses are undertaken of which components would benefit from multi-core or GPU parallelism. The current study therefore provides another demonstration that "supercomputing on a budget" is possible when subtasks of large problems are run on hardware most suited to these tasks. Experimental results show that large speedups can be achieved on high-dimensional, data-intensive problems. Cost functions must first be analysed for parallelization opportunities, and assigned hardware based on the particular task.

  19. High-dimensional single-cell cancer biology.

    Science.gov (United States)

    Irish, Jonathan M; Doxie, Deon B

    2014-01-01

    Cancer cells are distinguished from each other and from healthy cells by features that drive clonal evolution and therapy resistance. New advances in high-dimensional flow cytometry make it possible to systematically measure mechanisms of tumor initiation, progression, and therapy resistance on millions of cells from human tumors. Here we describe flow cytometry techniques that enable a "single-cell" view of cancer. High-dimensional techniques like mass cytometry enable multiplexed single-cell analysis of cell identity, clinical biomarkers, signaling network phospho-proteins, transcription factors, and functional readouts of proliferation, cell cycle status, and apoptosis. This capability pairs well with a signaling profiles approach that dissects mechanism by systematically perturbing and measuring many nodes in a signaling network. Single-cell approaches enable study of cellular heterogeneity of primary tissues and turn cell subsets into experimental controls or opportunities for new discovery. Rare populations of stem cells or therapy-resistant cancer cells can be identified and compared to other types of cells within the same sample. In the long term, these techniques will enable tracking of minimal residual disease (MRD) and disease progression. By better understanding biological systems that control development and cell-cell interactions in healthy and diseased contexts, we can learn to program cells to become therapeutic agents or target malignant signaling events to specifically kill cancer cells. Single-cell approaches that provide deep insight into cell signaling and fate decisions will be critical to optimizing the next generation of cancer treatments combining targeted approaches and immunotherapy.

  20. Evaluating the Level of Organizational Readiness for Digital Preservation (A Case Study of the Flinders Academic Commons Repository Unit of Flinders University Library (FACFUL), Adelaide, South Australia)

    Directory of Open Access Journals (Sweden)

    Rattahpinnusa Haresariu Handisa

    2017-12-01

    Full Text Available This study aims to identify the level of readiness for digital preservation of the repository unit at the Flinders Academic Commons of Flinders University Library (FACFUL), and to identify the factors that influence the organization's readiness level. Three aspects were examined: infrastructure readiness, technological readiness, and the resources required for digital preservation. The study used a case-study method with intensity sampling. Data were collected with the Cornell University Survey of Institutional Readiness Checklist, through an interview with Ms. Liz Walkley Hall, the librarian responsible for the FACFUL digital repository unit. Supporting information was obtained through a literature study based on the Flinders University Library website. The collected data were analyzed descriptively using the organizational readiness indicators of the Cornell University model. The results show that the repository unit at FACFUL is not yet ready to carry out digital preservation: the organizational readiness of Flinders University Library for digital preservation is at the lowest level, Acknowledgement, at which the library is still developing awareness of the importance of digital preservation. The factors behind this low level of readiness are the absence of a statement on the importance of digital preservation in the collection development policy, limited funding, and a shortage of staff competent in digital preservation. This study recommends that Flinders University Library conduct a feasibility study for digital preservation; one business model suited to the library's situation is the community-based Meta Archive Model (MAM) for digital preservation.

  1. Digital preservation of cultural and scientific heritage: involving university students to raise awareness of its importance

    Directory of Open Access Journals (Sweden)

    Paula Redweik

    2017-05-01

    Full Text Available Cultural heritage is a relevant issue in contemporary society. While its preservation is a challenge, its dissemination can contribute to an economic balance between costs and benefits. Scientific heritage can be considered a special domain of cultural heritage, not yet sought by mass tourism but worth preserving as the roots of today's knowledge. Considering that university students of engineering and computer science traditionally do not address cultural or scientific heritage issues in their syllabus, and that they constitute a layer of young citizens who will come to be influential in the future of society, an effort was undertaken to focus on this theme in disciplines of different courses, allying the learning of technical skills with the natural interest of younger people in 3D and animation for the benefit of heritage. The goal was to raise this particular group's awareness of the importance of maintaining heritage, in particular in a virtual way, both for documentation and for divulging its existence. Raising funds for buildings' restoration, attracting the public to visit buildings and collections outside the usual tourism routes, contributing to revenue generation, or allowing virtual visits to inaccessible items, complementing physical visits on site, were the general aims of the proposed projects. A survey was undertaken among the participating students to evaluate how the projects influenced their attitude towards heritage. The obtained feedback was very positive: 76% agreed that the project alerted them to the importance of preserving historical and cultural heritage, while 72% considered it interesting that the topic of digital cultural heritage was used for the assessments of

  2. Digital subtraction angiography (DSA) in a universal radiodiagnostic room with a novel multi-pulse high-frequency generator

    International Nuclear Information System (INIS)

    Ellegast, H.H.; Kloss, R.; Mayr, H.; Ammann, E.; Kuehnel, W.; Siemens A.G., Erlangen

    1985-01-01

    The application of digital subtraction angiography in a universal radiodiagnostic room can be implemented rapidly and reliably. The number of examinations could be increased without negative effects on the conventional operation of the room. With optimal radiation hygiene and a high degree of operational safety, the multipulse high-frequency generator with its automatic DSA parameter system guarantees reproducibly good image quality, equalling that of a dedicated DSA facility. The examination room thus constitutes an economical solution, also for small hospitals without a dedicated angiography room. (orig.) [de

  3. Investigating the Learning Challenges Presented by Digital Technologies to the College of Education in Kuwait University

    Science.gov (United States)

    Aldhafeeri, Fayiz; Male, Trevor

    2016-01-01

    There is now widespread recognition that digital technologies, particularly portable hand held devices capable of Internet connection, present opportunities and challenges to the way in which student learning is organized in schools, colleges and institutions of higher education in the 21st Century. Traxler, "Journal of the Research Centre…

  4. Open Access, Open Source and Digital Libraries: A Current Trend in University Libraries around the World

    Science.gov (United States)

    Krishnamurthy, M.

    2008-01-01

    Purpose: The purpose of this paper is to describe the open access and open source movement in the digital library world. Design/methodology/approach: A review of key developments in the open access and open source movement is provided. Findings: Open source software and open access to research findings are of great use to scholars in developing…

  5. Effects of Digital Story on Academic Achievement, Learning Motivation and Retention among University Students

    Science.gov (United States)

    Aktas, Elif; Yurt, Serap Uzuner

    2017-01-01

    The aim of this study was to determine the effect of the learning environment where digital stories are used as a learning material on the motivation, academic success, retention, and students' opinions. The study was carried out with mixed method which is a combination of quantitative and qualitative research approach. The study was implemented…

  6. The Effect of Digital Publishing on Technical Services in University Libraries

    Science.gov (United States)

    Hunter, Ben

    2013-01-01

    The past decade has brought enormous changes in scholarly communication, leading many libraries to undertake large-scale digital publishing initiatives. However, no study has investigated how technical services departments are changing to support these new services. Using change management as a theoretical framework, the investigator uses content…

  7. Investigating the ICT Use and Needs of "Digital Natives" in Learning English at a Taiwanese University

    Science.gov (United States)

    Ko, Chao-Jung; Thang, Siew Ming; Ou, Shu-chen

    2014-01-01

    This article reports key findings of a study which investigated the use of technology by 569 "digital natives" students for English Language learning and recreational purposes. Their views on the applicability of technological tools such as Facebook, blogging and Skype for English Language teaching and learning were also investigated.…

  8. Class prediction for high-dimensional class-imbalanced data

    Directory of Open Access Journals (Sweden)

    Lusa Lara

    2010-10-01

    Full Text Available Abstract Background The goal of class prediction studies is to develop rules to accurately predict the class membership of new samples. The rules are derived using the values of the variables available for each subject: the main characteristic of high-dimensional data is that the number of variables greatly exceeds the number of samples. Frequently the classifiers are developed using class-imbalanced data, i.e., data sets where the number of samples in each class is not equal. Standard classification methods used on class-imbalanced data often produce classifiers that do not accurately predict the minority class; the prediction is biased towards the majority class. In this paper we investigate if the high-dimensionality poses additional challenges when dealing with class-imbalanced prediction. We evaluate the performance of six types of classifiers on class-imbalanced data, using simulated data and a publicly available data set from a breast cancer gene-expression microarray study. We also investigate the effectiveness of some strategies that are available to overcome the effect of class imbalance. Results Our results show that the evaluated classifiers are highly sensitive to class imbalance and that variable selection introduces an additional bias towards classification into the majority class. Most new samples are assigned to the majority class from the training set, unless the difference between the classes is very large. As a consequence, the class-specific predictive accuracies differ considerably. When the class imbalance is not too severe, down-sizing and asymmetric bagging embedding variable selection work well, while over-sampling does not. Variable normalization can further worsen the performance of the classifiers. Conclusions Our results show that matching the prevalence of the classes in training and test set does not guarantee good performance of classifiers and that the problems related to classification with class
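
    Among the strategies evaluated above, down-sizing (random under-sampling of the majority class) is the simplest to state. A hedged sketch, not the authors' code; the function name `downsize` is ours:

    ```python
    import random
    from collections import Counter

    def downsize(X, y, seed=0):
        """Randomly under-sample every class down to the minority-class size.

        Returns a balanced (X, y) pair; the majority class loses samples,
        which trades training-set size for an unbiased class prior.
        """
        rng = random.Random(seed)
        by_class = {}
        for xi, yi in zip(X, y):
            by_class.setdefault(yi, []).append(xi)
        n_min = min(len(rows) for rows in by_class.values())
        Xb, yb = [], []
        for label, rows in by_class.items():
            for xi in rng.sample(rows, n_min):
                Xb.append(xi)
                yb.append(label)
        return Xb, yb
    ```

    In the high-dimensional setting the abstract describes, down-sizing is typically combined with variable selection repeated inside each resample (asymmetric bagging), so that the selection bias toward the majority class is also averaged out.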

  9. High-Dimensional Quantum Information Processing with Linear Optics

    Science.gov (United States)

    Fitzpatrick, Casey A.

    Quantum information processing (QIP) is an interdisciplinary field concerned with the development of computers and information processing systems that utilize quantum mechanical properties of nature to carry out their function. QIP systems have become vastly more practical since the turn of the century. Today, QIP applications span imaging, cryptographic security, computation, and simulation (quantum systems that mimic other quantum systems). Many important strategies improve quantum versions of classical information system hardware, such as single photon detectors and quantum repeaters. Another more abstract strategy engineers high-dimensional quantum state spaces, so that each successful event carries more information than traditional two-level systems allow. Photonic states in particular bring the added advantages of weak environmental coupling and data transmission near the speed of light, allowing for simpler control and lower system design complexity. In this dissertation, numerous novel, scalable designs for practical high-dimensional linear-optical QIP systems are presented. First, a correlated photon imaging scheme using orbital angular momentum (OAM) states to detect rotational symmetries in objects using measurements, as well as building images out of those interactions is reported. Then, a statistical detection method using chains of OAM superpositions distributed according to the Fibonacci sequence is established and expanded upon. It is shown that the approach gives rise to schemes for sorting, detecting, and generating the recursively defined high-dimensional states on which some quantum cryptographic protocols depend. Finally, an ongoing study based on a generalization of the standard optical multiport for applications in quantum computation and simulation is reported upon. The architecture allows photons to reverse momentum inside the device. This in turn enables realistic implementation of controllable linear-optical scattering vertices for

  10. High-dimensional change-point estimation: Combining filtering with convex optimization

    OpenAIRE

    Soh, Yong Sheng; Chandrasekaran, Venkat

    2017-01-01

    We consider change-point estimation in a sequence of high-dimensional signals given noisy observations. Classical approaches to this problem such as the filtered derivative method are useful for sequences of scalar-valued signals, but they have undesirable scaling behavior in the high-dimensional setting. However, many high-dimensional signals encountered in practice frequently possess latent low-dimensional structure. Motivated by this observation, we propose a technique for high-dimensional...

  11. Comprehension of texts in Digital Format versus Printed Texts and Self-Regulated Learning in University Students

    Directory of Open Access Journals (Sweden)

    Paula Gabriela Flores-Carrasco

    2016-12-01

    Full Text Available This article aims (1) to describe the levels of self-regulation and reading comprehension of scientific expository texts; (2) to establish the relationship between self-regulation and reading comprehension; and (3) to compare comprehension performance when the printed medium (paper) or the digital medium (computer) is used. A quasi-experimental, quantitative, descriptive and correlational design was implemented. The sample comprised 55 university students from four Education degree programmes, in their 1st and 3rd years of study at a regional university of the Council of Rectors of Chilean Universities. Three measuring instruments were used: a questionnaire on self-regulated learning and two comprehension tests based on Parodi's (2005) assessment model. The implementation took place in two consecutive stages: first the self-regulation questionnaire, then the reading comprehension tests in both media. With the data obtained, statistical tests of variance, one-way ANOVA, Pearson's correlation, and means comparisons with the Brunner-Munzel and Mann-Whitney U tests were calculated. In conclusion, and contrary to the initial expectation, university students showed an adequate level of self-regulation but low reading comprehension in both media, with scores slightly lower in the digital medium. In both media, performance is inversely related to the complexity of the questions. Between the 1st and 3rd years there is no increase in either self-regulation or reading comprehension, with the exception of the Primary General Education programme specializing in Language and History. There is a strong relationship between reading comprehension in the printed medium and self-regulation (ARATEX). The medium does not affect reading comprehension, but the individual reading skills of the subjects do: a competent reader will perform similarly in both media.

  12. Variance inflation in high dimensional Support Vector Machines

    DEFF Research Database (Denmark)

    Abrahamsen, Trine Julie; Hansen, Lars Kai

    2013-01-01

    Many important machine learning models, supervised and unsupervised, are based on simple Euclidean distance or orthogonal projection in a high dimensional feature space. When estimating such models from small training sets we face the problem that the span of the training data set input vectors...... the case of Support Vector Machines (SVMs) and we propose a non-parametric scheme to restore proper generalizability. We illustrate the algorithm and its ability to restore performance on a wide range of benchmark data sets....... follow a different probability law with less variance. While the problem and basic means to reconstruct and deflate are well understood in unsupervised learning, the case of supervised learning is less well understood. We here investigate the effect of variance inflation in supervised learning including...

  13. Applying recursive numerical integration techniques for solving high dimensional integrals

    International Nuclear Information System (INIS)

    Ammon, Andreas; Genz, Alan; Hartung, Tobias; Jansen, Karl; Volmer, Julia; Leoevey, Hernan

    2016-11-01

    The error scaling for Markov chain Monte Carlo (MCMC) techniques with N samples behaves like 1/√N. This scaling makes it often very time-intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.
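
    The idea of applying a 1-D quadrature rule iteratively over dimensions can be illustrated with a composite Simpson rule standing in for the Gaussian rules used in the paper. Note the simplification: the naive iterated rule below costs n^d function evaluations, whereas RNI exploits the factorized structure of the Boltzmann weight to avoid that blow-up; all names here are ours:

    ```python
    import math

    def simpson_rule(n, a, b):
        """Nodes and weights of the composite Simpson rule on [a, b]; n odd."""
        h = (b - a) / (n - 1)
        nodes = [a + i * h for i in range(n)]
        weights = [h / 3 * (1 if i in (0, n - 1) else (4 if i % 2 == 1 else 2))
                   for i in range(n)]
        return nodes, weights

    def integrate_recursive(f, dims, n=9, a=0.0, b=1.0, point=()):
        """Apply the 1-D rule recursively, one dimension at a time."""
        nodes, weights = simpson_rule(n, a, b)
        if dims == 1:
            return sum(w * f(point + (x,)) for w, x in zip(weights, nodes))
        return sum(w * integrate_recursive(f, dims - 1, n, a, b, point + (x,))
                   for w, x in zip(weights, nodes))
    ```

    For smooth integrands the per-dimension error decays like h^4 (and much faster for Gaussian rules), which is the source of the improved scaling over the 1/√N of MCMC.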

  14. High-dimensional cluster analysis with the Masked EM Algorithm

    Science.gov (United States)

    Kadir, Shabnam N.; Goodman, Dan F. M.; Harris, Kenneth D.

    2014-01-01

    Cluster analysis faces two problems in high dimensions: first, the “curse of dimensionality” that can lead to overfitting and poor generalization performance; and second, the sheer time taken for conventional algorithms to process large amounts of high-dimensional data. We describe a solution to these problems, designed for the application of “spike sorting” for next-generation high channel-count neural probes. In this problem, only a small subset of features provide information about the cluster membership of any one data vector, but this informative feature subset is not the same for all data points, rendering classical feature selection ineffective. We introduce a “Masked EM” algorithm that allows accurate and time-efficient clustering of up to millions of points in thousands of dimensions. We demonstrate its applicability to synthetic data, and to real-world high-channel-count spike sorting data. PMID:25149694
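
    The Masked EM algorithm builds on classical EM for Gaussian mixtures by attaching per-point feature masks. As a reference point only, here is plain (unmasked) EM for a two-component 1-D mixture; initialization from the data extremes is our simplification, not part of the paper:

    ```python
    import math
    import random

    def em_gmm_1d(data, iters=50):
        """EM for a two-component 1-D Gaussian mixture.

        E-step: compute responsibilities of each component for each point.
        M-step: re-estimate means, standard deviations and mixing weights.
        """
        mu = [min(data), max(data)]   # simple deterministic initialization
        sigma = [1.0, 1.0]
        pi = [0.5, 0.5]
        for _ in range(iters):
            resp = []
            for x in data:
                p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * sigma[k] ** 2))
                     / (sigma[k] * math.sqrt(2 * math.pi)) for k in range(2)]
                s = sum(p)
                resp.append([pk / s for pk in p])
            for k in range(2):
                nk = sum(r[k] for r in resp)
                mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
                sigma[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2
                                         for r, x in zip(resp, data)) / nk) or 1e-6
                pi[k] = nk / len(data)
        return mu, sigma, pi
    ```

    The Masked EM variant restricts, for each point, which coordinates enter the likelihood, which is what makes the scheme tractable when the informative feature subset differs from point to point.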

  15. Network Reconstruction From High-Dimensional Ordinary Differential Equations.

    Science.gov (United States)

    Chen, Shizhe; Shojaie, Ali; Witten, Daniela M

    2017-01-01

    We consider the task of learning a dynamical system from high-dimensional time-course data. For instance, we might wish to estimate a gene regulatory network from gene expression data measured at discrete time points. We model the dynamical system nonparametrically as a system of additive ordinary differential equations. Most existing methods for parameter estimation in ordinary differential equations estimate the derivatives from noisy observations. This is known to be challenging and inefficient. We propose a novel approach that does not involve derivative estimation. We show that the proposed method can consistently recover the true network structure even in high dimensions, and we demonstrate empirical improvement over competing approaches. Supplementary materials for this article are available online.

  16. Quantum correlation of high dimensional system in a dephasing environment

    Science.gov (United States)

    Ji, Yinghua; Ke, Qiang; Hu, Juju

    2018-05-01

    For a high dimensional spin-S system embedded in a dephasing environment, we theoretically analyze the time evolutions of quantum correlation and entanglement via the Frobenius norm and negativity. The quantum correlation dynamics can be considered as a function of the decoherence parameters, including the ratio between the system oscillator frequency ω0 and the reservoir cutoff frequency ωc, and the environment temperature. It is shown that the quantum correlation can not only measure the nonclassical correlation of the considered system, but also exhibits better robustness against dissipation. In addition, the decoherence presents non-Markovian features and the quantum correlation freezing phenomenon. The former is much weaker than that in the sub-Ohmic or Ohmic thermal reservoir environment.

  17. Evaluating Clustering in Subspace Projections of High Dimensional Data

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Günnemann, Stephan; Assent, Ira

    2009-01-01

    Clustering high dimensional data is an emerging research field. Subspace clustering or projected clustering group similar objects in subspaces, i.e. projections, of the full space. In the past decade, several clustering paradigms have been developed in parallel, without thorough evaluation and comparison between these paradigms on a common basis. Conclusive evaluation and comparison is challenged by three major issues. First, there is no ground truth that describes the "true" clusters in real world data. Second, a large variety of evaluation measures have been used that reflect different aspects of the clustering result. Finally, in typical publications authors have limited their analysis to their favored paradigm only, while paying other paradigms little or no attention. In this paper, we take a systematic approach to evaluate the major paradigms in a common framework. We study representative clustering…

  18. Applying recursive numerical integration techniques for solving high dimensional integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [IVU Traffic Technologies AG, Berlin (Germany); Genz, Alan [Washington State Univ., Pullman, WA (United States). Dept. of Mathematics; Hartung, Tobias [King' s College, London (United Kingdom). Dept. of Mathematics; Jansen, Karl; Volmer, Julia [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leoevey, Hernan [Humboldt Univ. Berlin (Germany). Inst. fuer Mathematik

    2016-11-15

    The error of Markov-Chain Monte Carlo (MCMC) techniques with N samples scales like 1/√(N). This scaling often makes it very time consuming to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the error of the RNI technique decreases at least exponentially in the number of integration points m.
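    The recursive idea can be sketched for a chain of nearest-neighbour couplings, where the d-dimensional integral collapses into repeated applications of a one-dimensional Gauss rule (a "transfer matrix"). The toy integrand below is invented for illustration and is not the paper's lattice setup:

```python
import numpy as np

def rni_chain(d, m):
    """Integrate prod_{i=1}^{d-1} exp(cos(x_i - x_{i+1})) over [0, 2*pi]^d by
    recursive numerical integration: an m-point Gauss-Legendre rule is applied
    one coordinate at a time via a transfer matrix."""
    t, w = np.polynomial.legendre.leggauss(m)
    x = np.pi * (t + 1.0)          # nodes mapped from [-1, 1] to [0, 2*pi]
    w = np.pi * w                  # weights rescaled accordingly
    # K[a, b] = w_b * exp(cos(x_a - x_b)): one quadrature over a neighbour
    K = np.exp(np.cos(x[:, None] - x[None, :])) * w[None, :]
    v = np.ones(m)
    for _ in range(d - 1):         # integrate out x_d, x_{d-1}, ..., x_2
        v = K @ v
    return w @ v                   # final integration over x_1

# Integrating the coordinates out one at a time gives the closed form
# (2*pi) * (2*pi*I0(1))**(d-1), with I0 the modified Bessel function.
d, m = 8, 32
approx = rni_chain(d, m)
exact = (2 * np.pi) * (2 * np.pi * np.i0(1.0)) ** (d - 1)
```

    With only m = 32 points per dimension the result already matches the exact value to many digits, far beyond what MCMC could achieve with a comparable number of evaluations.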

  19. Reduced order surrogate modelling (ROSM) of high dimensional deterministic simulations

    Science.gov (United States)

    Mitry, Mina

    Often, computationally expensive engineering simulations can prohibit the engineering design process. As a result, designers may turn to a less computationally demanding approximate, or surrogate, model to facilitate their design process. However, owing to the curse of dimensionality, classical surrogate models become too computationally expensive for high dimensional data. To address this limitation of classical methods, we develop linear and non-linear Reduced Order Surrogate Modelling (ROSM) techniques. Two algorithms are presented, which are based on a combination of linear/kernel principal component analysis and radial basis functions. These algorithms are applied to subsonic and transonic aerodynamic data, as well as a model for a chemical spill in a channel. The results of this thesis show that ROSM can provide a significant computational benefit over classical surrogate modelling, sometimes at the expense of a minor loss in accuracy.
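    A minimal linear-ROSM sketch in the spirit described: PCA of simulation snapshots, then Gaussian radial-basis-function interpolation of the reduced coordinates over the parameter space. The toy "simulation" and every constant below are invented for illustration:

```python
import numpy as np

# Toy stand-in for an expensive simulation: a field on 200 grid points whose
# parameter dependence is exactly rank three, so a few PCA modes suffice.
grid = np.linspace(0.0, 1.0, 200)
def simulate(p):
    return p * np.sin(np.pi * grid) + p**2 * grid + np.sin(p) * np.cos(3 * grid)

params = np.linspace(0.5, 2.0, 25)                 # training designs
snapshots = np.array([simulate(p) for p in params])

# Step 1: linear dimension reduction (PCA via SVD of centred snapshots).
mean = snapshots.mean(axis=0)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
k = 3                                              # retained modes
coeffs = (snapshots - mean) @ Vt[:k].T             # reduced coordinates

# Step 2: Gaussian RBF fit of each reduced coordinate over parameter space
# (least squares for numerical robustness of the flat-kernel system).
eps = 3.0
Phi = np.exp(-(eps * (params[:, None] - params[None, :])) ** 2)
W, *_ = np.linalg.lstsq(Phi, coeffs, rcond=None)

def surrogate(p):
    phi = np.exp(-(eps * (p - params)) ** 2)
    return mean + (phi @ W) @ Vt[:k]

p_new = 1.23                                       # unseen design point
rel_err = (np.linalg.norm(surrogate(p_new) - simulate(p_new))
           / np.linalg.norm(simulate(p_new)))
```

    The surrogate evaluates in microseconds regardless of how expensive `simulate` is, which is the computational benefit the abstract refers to.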

  20. Asymptotics of empirical eigenstructure for high dimensional spiked covariance.

    Science.gov (United States)

    Wang, Weichen; Fan, Jianqing

    2017-06-01

    We derive the asymptotic distributions of the spiked eigenvalues and eigenvectors under a generalized and unified asymptotic regime, which takes into account the magnitude of spiked eigenvalues, sample size, and dimensionality. This regime allows high dimensionality and diverging eigenvalues and provides new insights into the roles that the leading eigenvalues, sample size, and dimensionality play in principal component analysis. Our results are a natural extension of those in Paul (2007) to a more general setting and solve the rates of convergence problems in Shen et al. (2013). They also reveal the biases of estimating leading eigenvalues and eigenvectors by using principal component analysis, and lead to a new covariance estimator for the approximate factor model, called shrinkage principal orthogonal complement thresholding (S-POET), that corrects the biases. Our results are successfully applied to outstanding problems in estimation of risks of large portfolios and false discovery proportions for dependent test statistics and are illustrated by simulation studies.
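    The upward bias of leading sample eigenvalues when dimensionality is comparable to sample size is easy to check numerically. A quick illustration of the phenomenon (not the paper's S-POET estimator), using the classical Baik-Ben Arous-Péché-style prediction for a single spike:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 400          # dimension twice the sample size
spike = 5.0              # one spiked population eigenvalue; the rest are 1
evals = np.ones(p)
evals[0] = spike
X = rng.standard_normal((n, p)) * np.sqrt(evals)   # independent coordinates
S = (X.T @ X) / n                                  # sample covariance
lam_hat = np.linalg.eigvalsh(S)[-1]                # top empirical eigenvalue

# Phase-transition prediction: lam_hat -> spike * (1 + gamma / (spike - 1)),
# with aspect ratio gamma = p / n, so here 5 * 1.5 = 7.5 rather than 5.
gamma = p / n
predicted = spike * (1 + gamma / (spike - 1))
```

    The top sample eigenvalue lands near 7.5, substantially above the true spike of 5, which is exactly the bias the abstract says PCA-based estimates must correct for.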

  1. Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology

    OpenAIRE

    Gustavo K Rohde; John A Ozolek; Anil V Parwani; Liron Pantanowitz

    2014-01-01

    Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications around this novel technology have evolved in the last decade, including education, research and clinical applications. This publication highlights a collection o...

  2. Low Phase Noise Universal Microwave Oscillator for Analog and Digital Devices, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An inherently rugged Universal Oscillator (UO) is needed to enable a superior class of configurable communications for NASA applications. The requirements are a low...

  3. The Digital Archiving of Endangered Language Oral Traditions: Kaipuleohone at the University of Hawai‘i and C’ek’aedi Hwnax in Alaska

    Directory of Open Access Journals (Sweden)

    Andrea L. Berez

    2013-10-01

    Full Text Available This essay compares and contrasts two small-scale digital endangered language archives with regard to their relevance for oral tradition research. The first is a university-based archive curated at the University of Hawai‘i, which is designed to house endangered language materials arising from the fieldwork of university researchers. The second is an indigenously-administered archive in rural Alaska that serves the language maintenance needs of the Ahtna Athabaskan Alaska Native community.

  4. Digital Image Correlation of Concrete Slab at University of Tennessee, Knoxville

    Energy Technology Data Exchange (ETDEWEB)

    Mahadevan, Sankaran [Idaho National Lab. (INL), Idaho Falls, ID (United States); Agarwal, Vivek [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pham, Binh T. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kyle, Neal [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Some degradation mechanisms of concrete manifest themselves via swelling or other shape deformation of the concrete. Specifically, degradation of concrete structures damaged by ASR is viewed as one of the dominant factors impacting the structural integrity of aging nuclear power plants. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements, to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. A number of nondestructive examination techniques (i.e., thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy, and vibro-acoustic modulation) are used to detect the damage caused by ASR. DIC techniques have been increasing in popularity, especially in micro- and nano-scale mechanical testing applications, due to their relative ease of implementation and use. Advances in computer technology and digital cameras have helped this method move forward. To ensure the best outcome of the DIC system, important factors in the experiment are identified. They include standoff distance, speckle size, speckle pattern, and durable paint. These optimal experimental options are selected based on a thorough investigation. The resulting DIC deformation map indicates that this technique can be used to generate data related to degradation assessment of concrete structures damaged by the impact of ASR.
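    The core DIC matching step — finding the displacement of a speckle subset by maximizing a zero-normalised cross-correlation — can be sketched on synthetic data. This is an integer-pixel search only; real DIC systems add sub-pixel interpolation and strain computation:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic speckle pattern and a copy shifted by a known displacement.
ref = rng.random((64, 64))
dy, dx = 3, 5
img = np.roll(np.roll(ref, dy, axis=0), dx, axis=1)

# Reference subset centred in the interior of the image.
y0, x0, h = 20, 20, 16
subset = ref[y0:y0 + h, x0:x0 + h]

def zncc(a, b):
    # Zero-normalised cross-correlation between two equal-size subsets.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Exhaustive integer-pixel search for the best-matching subset location.
best, best_pos = -2.0, None
for v in range(-8, 9):
    for u in range(-8, 9):
        cand = img[y0 + v:y0 + v + h, x0 + u:x0 + u + h]
        c = zncc(subset, cand)
        if c > best:
            best, best_pos = c, (v, u)
```

    The search recovers the imposed (3, 5) pixel displacement; a field of such subset displacements is what produces the deformation map mentioned in the abstract.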

  5. Digital Image Correlation of Concrete Slab at University of Tennessee, Knoxville

    International Nuclear Information System (INIS)

    Mahadevan, Sankaran; Agarwal, Vivek; Pham, Binh T.; Kyle, Neal

    2016-01-01

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Some degradation mechanisms of concrete manifest themselves via swelling or other shape deformation of the concrete. Specifically, degradation of concrete structures damaged by ASR is viewed as one of the dominant factors impacting the structural integrity of aging nuclear power plants. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements, to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. A number of nondestructive examination techniques (i.e., thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy, and vibro-acoustic modulation) are used to detect the damage caused by ASR. DIC techniques have been increasing in popularity, especially in micro- and nano-scale mechanical testing applications, due to their relative ease of implementation and use. Advances in computer technology and digital cameras have helped this method move forward. To ensure the best outcome of the DIC system, important factors in the experiment are identified. They include standoff distance, speckle size, speckle pattern, and durable paint. These optimal experimental options are selected based on a thorough investigation. The resulting DIC deformation map indicates that this technique can be used to generate data related to degradation assessment of concrete structures damaged by the impact of ASR.

  6. A qualitative numerical study of high dimensional dynamical systems

    Science.gov (United States)

    Albers, David James

    Since Poincare, the father of modern mathematical dynamical systems, much effort has been exerted to achieve a qualitative understanding of the physical world via a qualitative understanding of the functions we use to model the physical world. In this thesis, we construct a numerical framework suitable for a qualitative, statistical study of dynamical systems using the space of artificial neural networks. We analyze the dynamics along intervals in parameter space, separating the set of neural networks into roughly four regions: the fixed point to the first bifurcation; the route to chaos; the chaotic region; and a transition region between chaos and finite-state neural networks. The study is primarily with respect to high-dimensional dynamical systems. We make the following general conclusions as the dimension of the dynamical system is increased: the probability of the first bifurcation being of type Neimark-Sacker is greater than ninety percent; the most probable route to chaos is via a cascade of bifurcations of high-period periodic orbits, quasi-periodic orbits, and 2-tori; there exists an interval of parameter space such that hyperbolicity is violated on a countable, Lebesgue measure 0, "increasingly dense" subset; chaos is much more likely to persist with respect to parameter perturbation in the chaotic region of parameter space as the dimension is increased; moreover, as the number of positive Lyapunov exponents is increased, the likelihood that any significant portion of these positive exponents can be perturbed away decreases with increasing dimension. The maximum Kaplan-Yorke dimension and the maximum number of positive Lyapunov exponents increase linearly with dimension. The probability of a dynamical system being chaotic increases exponentially with dimension. The results with respect to the first bifurcation and the route to chaos comment on previous results of Newhouse, Ruelle, Takens, Broer, Chenciner, and Iooss. 
Moreover, results regarding the high-dimensional
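    The kind of measurement underlying such conclusions — the largest Lyapunov exponent of a network map, estimated by propagating a tangent vector through the Jacobian — can be sketched briefly. The tanh network and the gain below are invented for illustration; at this gain the exponent typically comes out positive, i.e. chaotic:

```python
import numpy as np

rng = np.random.default_rng(8)
d = 32                       # state dimension
g = 4.0                      # coupling gain; large gain favours chaos
W = rng.standard_normal((d, d)) * g / np.sqrt(d)

def step(x):
    # One iteration of the network map x -> tanh(W x)
    return np.tanh(W @ x)

# Largest Lyapunov exponent: evolve a tangent vector with the Jacobian
# J(x) = diag(1 - tanh(W x)^2) @ W and renormalise, averaging log growth.
x = rng.standard_normal(d) * 0.1
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
for _ in range(500):         # discard the transient
    x = step(x)
lam, T = 0.0, 2000
for _ in range(T):
    J = (1 - np.tanh(W @ x) ** 2)[:, None] * W   # Jacobian at the current state
    x = step(x)
    v = J @ v
    nv = np.linalg.norm(v)
    lam += np.log(nv)
    v /= nv
lam /= T
```

    A positive `lam` certifies exponential separation of nearby trajectories; repeating this over many random networks and gains is the statistical sweep the thesis describes.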

  7. Teaching and Learning about Universal Human Rights and International Humanitarian Law: Digital Resources and Global Expectations

    Science.gov (United States)

    Blanchard, Rosemary Ann

    2013-01-01

    Today's education for civic engagement requires a global dimension. To live responsibly in their own communities, young people need to situate their personal and local interests in the context of their global interconnections. Bridging the personal, local, and global begins with an awareness of the universal aspirations for dignity and human…

  8. RODERIC, University of Valencia's Digital Repository for Education, Research and Culture

    Directory of Open Access Journals (Sweden)

    Mª Francisca Abad García

    2009-12-01

    Full Text Available This article presents the main features of RODERIC, the acronym designating the open access repository of the Universitat de València, which stands for Repositori d'Objectes Digitals per al Ensenyament la Recerca i la Cultura, thereby alluding to the types of content that will be disseminated through it while also paying homage to Pope Roderic Borgia, who in 1501 granted the papal bull that allowed the creation of the Universitat de València. The essential aspects of the open access movement, on which the development of this type of infrastructure is based, are also introduced.

  9. Progress in high-dimensional percolation and random graphs

    CERN Document Server

    Heydenreich, Markus

    2017-01-01

    This text presents an engaging exposition of the active field of high-dimensional percolation that will likely provide an impetus for future work. With over 90 exercises designed to enhance the reader's understanding of the material, as well as many open problems, the book is aimed at graduate students and researchers who wish to enter the world of this rich topic. The text may also be useful in advanced courses and seminars, as well as for reference and individual study. Part I, consisting of 3 chapters, presents a general introduction to percolation, stating the main results, defining the central objects, and proving its main properties. No prior knowledge of percolation is assumed. Part II, consisting of Chapters 4–9, discusses mean-field critical behavior by describing the two main techniques used, namely, differential inequalities and the lace expansion. In Parts I and II, all results are proved, making this the first self-contained text discussing high-dimensional percolation. Part III, consist...

  10. Effects of dependence in high-dimensional multiple testing problems

    Directory of Open Access Journals (Sweden)

    van de Wiel Mark A

    2008-02-01

    Full Text Available Abstract Background We consider effects of dependence among variables of high-dimensional data in multiple hypothesis testing problems, in particular the False Discovery Rate (FDR) control procedures. Recent simulation studies consider only simple correlation structures among variables, which is hardly inspired by real data features. Our aim is to systematically study effects of several network features like sparsity and correlation strength by imposing dependence structures among variables using random correlation matrices. Results We study the robustness against dependence of several FDR procedures that are popular in microarray studies, such as the Benjamini-Hochberg FDR, Storey's q-value, SAM and resampling based FDR procedures. False Non-discovery Rates and estimates of the number of null hypotheses are computed from those methods and compared. Our simulation study shows that methods such as SAM and the q-value do not adequately control the FDR to the level claimed under dependence conditions. On the other hand, the adaptive Benjamini-Hochberg procedure seems to be most robust while remaining conservative. Finally, the estimates of the number of true null hypotheses under various dependence conditions are variable. Conclusion We discuss a new method for efficient guided simulation of dependent data, which satisfy imposed network constraints as conditional independence structures. Our simulation set-up allows for a structural study of the effect of dependencies on multiple testing criteria and is useful for testing a potentially new method on π0 or FDR estimation in a dependency context.
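    For reference, the (non-adaptive) Benjamini-Hochberg step-up procedure these comparisons build on takes only a few lines (a textbook sketch, not the paper's simulation code):

```python
import numpy as np

def bh_reject(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean rejection mask.
    Rejects the hypotheses with the k smallest p-values, where k is the
    largest i such that p_(i) <= alpha * i / m."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest index meeting the step-up rule
        reject[order[:k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
mask = bh_reject(pvals, alpha=0.05)
```

    Under independence this controls the FDR at level alpha; the abstract's point is precisely that this guarantee can erode under realistic dependence structures.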

  11. Inference for High-dimensional Differential Correlation Matrices.

    Science.gov (United States)

    Cai, T Tony; Zhang, Anru

    2016-01-01

    Motivated by differential co-expression analysis in genomics, we consider in this paper estimation and testing of high-dimensional differential correlation matrices. An adaptive thresholding procedure is introduced and theoretical guarantees are given. Minimax rate of convergence is established and the proposed estimator is shown to be adaptively rate-optimal over collections of paired correlation matrices with approximately sparse differences. Simulation results show that the procedure significantly outperforms two other natural methods that are based on separate estimation of the individual correlation matrices. The procedure is also illustrated through an analysis of a breast cancer dataset, which provides evidence at the gene co-expression level that several genes, of which a subset has been previously verified, are associated with the breast cancer. Hypothesis testing on the differential correlation matrices is also considered. A test, which is particularly well suited for testing against sparse alternatives, is introduced. In addition, other related problems, including estimation of a single sparse correlation matrix, estimation of the differential covariance matrices, and estimation of the differential cross-correlation matrices, are also discussed.
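    The estimation setting can be sketched with a universal (rather than the paper's adaptive, entry-dependent) threshold applied to the difference of two sample correlation matrices. All sizes and the threshold constant below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 30, 500
# Two populations whose correlation matrices differ in one off-diagonal pair.
R1 = np.eye(p)
R2 = np.eye(p)
R2[0, 1] = R2[1, 0] = 0.6
X1 = rng.multivariate_normal(np.zeros(p), R1, size=n)
X2 = rng.multivariate_normal(np.zeros(p), R2, size=n)

# Naive differential correlation estimate, then entrywise thresholding
# (a simplified stand-in for the paper's adaptive thresholds).
D_hat = np.corrcoef(X1, rowvar=False) - np.corrcoef(X2, rowvar=False)
lam = 3.0 * np.sqrt(np.log(p) / n)
D_thr = np.where(np.abs(D_hat) > lam, D_hat, 0.0)
```

    Thresholding kills the noise-level entries while retaining the one truly differential pair, which is the sparsity-adaptive behaviour the theory formalises.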

  12. Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression

    Science.gov (United States)

    Ndiaye, Eugene; Fercoq, Olivier; Gramfort, Alexandre; Leclère, Vincent; Salmon, Joseph

    2017-10-01

    In high dimensional settings, sparse structures are crucial for efficiency, both in terms of memory, computation and performance. It is customary to consider an ℓ1 penalty to enforce sparsity in such scenarios. Sparsity enforcing methods, the Lasso being a canonical example, are popular candidates to address high dimension. For efficiency, they rely on tuning a parameter trading data fitting versus sparsity. For the Lasso theory to hold, this tuning parameter should be proportional to the noise level, yet the latter is often unknown in practice. A possible remedy is to jointly optimize over the regression parameter as well as over the noise level. This has been considered under several names in the literature: Scaled-Lasso, Square-root Lasso, and Concomitant Lasso estimation, for instance, and could be of interest for uncertainty quantification. In this work, after illustrating numerical difficulties for the Concomitant Lasso formulation, we propose a modification we coined Smoothed Concomitant Lasso, aimed at increasing numerical stability. We propose an efficient and accurate solver leading to a computational cost no more expensive than the one for the Lasso. We leverage standard ingredients behind the success of fast Lasso solvers: a coordinate descent algorithm, combined with safe screening rules to achieve speed efficiency, by eliminating early irrelevant features.
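    The joint (coefficients, noise level) idea can be sketched by alternating a coordinate-descent Lasso step with a residual-based noise update — a plain scaled-Lasso-style loop, without the smoothing or safe screening rules the paper adds. The data and constants are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, sigma_true = 100, 50, 0.5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + sigma_true * rng.standard_normal(n)

def soft(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, sweeps=200):
    # Coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    resid = y.copy()
    for _ in range(sweeps):
        for j in range(p):
            rho = X[:, j] @ resid / n + col_sq[j] * beta[j]
            new = soft(rho, lam) / col_sq[j]
            resid += X[:, j] * (beta[j] - new)
            beta[j] = new
    return beta

# Alternate the Lasso step with a noise-level update: the tuning parameter
# is tied to the current sigma estimate, so it adapts to the unknown noise.
sigma, lam0 = 1.0, np.sqrt(2 * np.log(p) / n)
for _ in range(10):
    beta = lasso_cd(X, y, lam0 * sigma)
    sigma = np.linalg.norm(y - X @ beta) / np.sqrt(n)
```

    The loop recovers the true support and a noise estimate close to the true sigma without that value ever being supplied, which is the practical appeal of the concomitant formulation.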

  13. Bayesian Subset Modeling for High-Dimensional Generalized Linear Models

    KAUST Repository

    Liang, Faming

    2013-06-01

    This article presents a new prior setting for high-dimensional generalized linear models, which leads to a Bayesian subset regression (BSR) with the maximum a posteriori model approximately equivalent to the minimum extended Bayesian information criterion model. The consistency of the resulting posterior is established under mild conditions. Further, a variable screening procedure is proposed based on the marginal inclusion probability, which shares the same properties of sure screening and consistency with the existing sure independence screening (SIS) and iterative sure independence screening (ISIS) procedures. However, since the proposed procedure makes use of joint information from all predictors, it generally outperforms SIS and ISIS in real applications. This article also makes extensive comparisons of BSR with the popular penalized likelihood methods, including Lasso, elastic net, SIS, and ISIS. The numerical results indicate that BSR can generally outperform the penalized likelihood methods. The models selected by BSR tend to be sparser and, more importantly, of higher prediction ability. In addition, the performance of the penalized likelihood methods tends to deteriorate as the number of predictors increases, while this is not significant for BSR. Supplementary materials for this article are available online. © 2013 American Statistical Association.

  14. The literary uses of high-dimensional space

    Directory of Open Access Journals (Sweden)

    Ted Underwood

    2015-12-01

    Full Text Available Debates over “Big Data” shed more heat than light in the humanities, because the term ascribes new importance to statistical methods without explaining how those methods have changed. What we badly need instead is a conversation about the substantive innovations that have made statistical modeling useful for disciplines where, in the past, it truly wasn’t. These innovations are partly technical, but more fundamentally expressed in what Leo Breiman calls a new “culture” of statistical modeling. Where 20th-century methods often required humanists to squeeze our unstructured texts, sounds, or images into some special-purpose data model, new methods can handle unstructured evidence more directly by modeling it in a high-dimensional space. This opens a range of research opportunities that humanists have barely begun to discuss. To date, topic modeling has received most attention, but in the long run, supervised predictive models may be even more important. I sketch their potential by describing how Jordan Sellers and I have begun to model poetic distinction in the long 19th century—revealing an arc of gradual change much longer than received literary histories would lead us to expect.

  15. Turning microscopy in the medical curriculum digital: Experiences from the faculty of health and medical sciences at University of Copenhagen

    Directory of Open Access Journals (Sweden)

    Ben Vainer

    2017-01-01

    Full Text Available Familiarity with the structure and composition of normal tissue and an understanding of the changes that occur during disease is pivotal to the study of the human body. For decades, microscope slides have been central to teaching pathology in medical courses and related subjects at the University of Copenhagen. Students had to learn how to use a microscope and envisage three-dimensional processes that occur in the body from two-dimensional glass slides. Here, we describe how a PathXL virtual microscopy system for teaching pathology and histology at the Faculty has recently been implemented, from an administrative, an economic, and a teaching perspective. This fully automatic digital microscopy system has been received positively by both teachers and students, and a decision was made to convert all courses involving microscopy to the virtual microscopy format. As a result, conventional analog microscopy will be phased out from the fall of 2016.

  16. Digital identifiers as permanent unique registers for researchers in the university context

    Directory of Open Access Journals (Sweden)

    Luisa F. Acosta-Ortega

    2016-09-01

    Full Text Available The growth of the Internet and the web allows wide access to an ever greater store of information sources in thousands of journals and publications, networks of an almost unlimited number of people and computers, and unprecedented opportunities for learning and research. That makes the correct identification and retrieval of a researcher's scientific production very difficult. For that reason, in recent years different organizations have attempted to create a permanent unique register for authors, which makes it possible to identify their articles wherever they are published, regardless of variations in the author's name, the publishing and processing practices of databases, and differing bibliographic description styles. ORCID (Open Researcher and Contributor ID) is the identifier with the greatest possibility of becoming universal, helping Latin American universities achieve visibility and positioning in the present international context.

  17. Measurement Invariance of the Digital Natives Assessment Scale across Gender in a Sample of Turkish University Students

    Science.gov (United States)

    Ursavas, Ömer Faruk; Kabakçi Yurdakul, Isil; Türk, Mesut; Mcilroy, David

    2016-01-01

    With reference to the digital natives debate, there is a gap in the literature concerning digital natives' characteristics. To fill this gap, the Digital Natives Assessment Scale was developed to measure students' assessment of the degree to which they perceived themselves to possess the attributes of digital natives. The scale was developed within the Turkish language…

  18. High-dimensional statistical inference: From vector to matrix

    Science.gov (United States)

    Zhang, Anru

    Statistical inference for sparse signals or low-rank matrices in high-dimensional settings is of significant interest in a range of contemporary applications. It has attracted significant recent attention in many fields including statistics, applied mathematics and electrical engineering. In this thesis, we consider several problems including sparse signal recovery (compressed sensing under restricted isometry) and low-rank matrix recovery (matrix recovery via rank-one projections and structured matrix completion). The first part of the thesis discusses compressed sensing and affine rank minimization in both noiseless and noisy cases and establishes sharp restricted isometry conditions for sparse signal and low-rank matrix recovery. The analysis relies on a key technical tool which represents points in a polytope by convex combinations of sparse vectors. The technique is elementary while leading to sharp results. It is shown that, in compressed sensing, for any ε > 0 the conditions δ_k^A < 1/3 + ε, δ_k^A + θ_{k,k}^A < 1 + ε, or δ_{tk}^A < √((t-1)/t) + ε are not sufficient to guarantee the exact recovery of all k-sparse signals for large k. A similar result also holds for matrix recovery. In addition, the conditions δ_k^A < 1/3, δ_k^A + θ_{k,k}^A < 1, δ_{tk}^A < √((t-1)/t) and δ_r^M < 1/3, δ_r^M + θ_{r,r}^M < 1, δ_{tr}^M < √((t-1)/t) are also shown to be sufficient, respectively, for stable recovery of approximately sparse signals and low-rank matrices in the noisy case. For the second part of the thesis, we introduce a rank-one projection model for low-rank matrix recovery and propose a constrained nuclear norm minimization method for stable recovery of low-rank matrices in the noisy case. The procedure is adaptive to the rank and robust against small perturbations. Both upper and lower bounds for the estimation accuracy under the Frobenius norm loss are obtained. The proposed estimator is shown to be rate-optimal under certain conditions. The
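    Sparse recovery under such restricted-isometry-style random designs can be demonstrated with a basic iterative soft-thresholding (ISTA) solver for the ℓ1-regularised least-squares problem. This is a generic illustration of the recovery problem the thesis studies, unrelated to its proof technique; the dimensions are invented:

```python
import numpy as np

rng = np.random.default_rng(6)
n, p, k = 80, 256, 5
A = rng.standard_normal((n, p)) / np.sqrt(n)   # random sensing matrix (columns ~ unit norm)
x_true = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=k)
y = A @ x_true                                  # noiseless measurements, n << p

# ISTA for min 0.5 * ||A x - y||^2 + lam * ||x||_1
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the gradient
x = np.zeros(p)
for _ in range(3000):
    z = x - step * (A.T @ (A @ x - y))          # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold

err = np.linalg.norm(x - x_true)
```

    With n = 80 measurements of a 5-sparse vector in dimension 256, the ℓ1 solver recovers the signal and its support, the behaviour the restricted isometry conditions certify.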

  19. Genuinely high-dimensional nonlocality optimized by complementary measurements

    International Nuclear Information System (INIS)

    Lim, James; Ryu, Junghee; Yoo, Seokwon; Lee, Changhyoup; Bang, Jeongho; Lee, Jinhyoung

    2010-01-01

    Qubits exhibit extreme nonlocality when their state is maximally entangled and this is observed by mutually unbiased local measurements. This criterion does not hold for the Bell inequalities of high-dimensional systems (qudits), recently proposed by Collins-Gisin-Linden-Massar-Popescu and Son-Lee-Kim. Taking an alternative approach, called the quantum-to-classical approach, we derive a series of Bell inequalities for qudits that satisfy the criterion as for the qubits. In the derivation each d-dimensional subsystem is assumed to be measured by one of d possible measurements with d being a prime integer. By applying to two qubits (d=2), we find that a derived inequality is reduced to the Clauser-Horne-Shimony-Holt inequality when the degree of nonlocality is optimized over all the possible states and local observables. Further applying to two and three qutrits (d=3), we find Bell inequalities that are violated for the three-dimensionally entangled states but are not violated by any two-dimensionally entangled states. In other words, the inequalities discriminate three-dimensional (3D) entanglement from two-dimensional (2D) entanglement and in this sense they are genuinely 3D. In addition, for the two qutrits we give a quantitative description of the relations among the three degrees of complementarity, entanglement and nonlocality. It is shown that the degree of complementarity jumps abruptly to very close to its maximum as nonlocality starts appearing. These characteristics imply that complementarity plays a more significant role in the present inequality compared with the previously proposed inequality.
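    The d = 2 base case — mutually unbiased measurements on a maximally entangled pair maximising the Clauser-Horne-Shimony-Holt value — takes only a few lines to verify numerically (a standard textbook check, not the paper's qudit construction):

```python
import numpy as np

# Pauli observables
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Maximally entangled singlet state |psi> = (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(A, B):
    # Correlation <psi| A (x) B |psi>
    return float(np.real(psi.conj() @ np.kron(A, B) @ psi))

A0, A1 = sz, sx                    # Alice: two mutually unbiased settings
B0 = (sz + sx) / np.sqrt(2)        # Bob: optimal rotated settings
B1 = (sz - sx) / np.sqrt(2)

S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
```

    |S| equals the Tsirelson bound 2√2, beating the classical bound of 2; the paper's quantum-to-classical construction generalises exactly this extremal configuration to d settings on qudits.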

  20. Approximation of High-Dimensional Rank One Tensors

    KAUST Repository

    Bachmayr, Markus

    2013-11-12

    Many real world problems are high-dimensional in that their solution is a function which depends on many variables or parameters. This presents a computational challenge since traditional numerical techniques are built on model classes for functions based solely on smoothness. It is known that the approximation of smoothness classes of functions suffers from the so-called 'curse of dimensionality'. Avoiding this curse requires new model classes for real world functions that match applications. This has led to the introduction of notions such as sparsity, variable reduction, and reduced modeling. One theme that is particularly common is to assume a tensor structure for the target function. This paper investigates how well a rank one function f(x_1,…,x_d) = f_1(x_1)⋯f_d(x_d), defined on Ω = [0,1]^d, can be captured through point queries. It is shown that such a rank one function with component functions f_j in W_∞^r([0,1]) can be captured (in L_∞) to accuracy O(C(d,r)N^(−r)) from N well-chosen point evaluations. The constant C(d,r) scales like d^(dr). The queries in our algorithms have two ingredients, a set of points built on the results from discrepancy theory and a second adaptive set of queries dependent on the information drawn from the first set. Under the assumption that a point z ∈ Ω with nonvanishing f(z) is known, the accuracy improves to O(dN^(−r)). © 2013 Springer Science+Business Media New York.

  1. Statistical mechanics of complex neural systems and high dimensional data

    International Nuclear Information System (INIS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-01-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)

  2. Quality and efficiency in high dimensional Nearest neighbor search

    KAUST Repository

    Tao, Yufei; Yi, Ke; Sheng, Cheng; Kalnis, Panos

    2009-01-01

    Nearest neighbor (NN) search in high dimensional space is an important problem in many applications. Ideally, a practical solution (i) should be implementable in a relational database, and (ii) its query cost should grow sub-linearly with the dataset size, regardless of the data and query distributions. Despite the bulk of NN literature, no solution fulfills both requirements, except locality sensitive hashing (LSH). The existing LSH implementations are either rigorous or adhoc. Rigorous-LSH ensures good quality of query results, but requires expensive space and query cost. Although adhoc-LSH is more efficient, it abandons quality control, i.e., the neighbor it outputs can be arbitrarily bad. As a result, currently no method is able to ensure both quality and efficiency simultaneously in practice. Motivated by this, we propose a new access method called the locality sensitive B-tree (LSB-tree) that enables fast high-dimensional NN search with excellent quality. The combination of several LSB-trees leads to a structure called the LSB-forest that ensures the same result quality as rigorous-LSH, but reduces its space and query cost dramatically. The LSB-forest also outperforms adhoc-LSH, even though the latter has no quality guarantee. Besides its appealing theoretical properties, the LSB-tree itself also serves as an effective index that consumes linear space, and supports efficient updates. Our extensive experiments confirm that the LSB-tree is faster than (i) the state of the art of exact NN search by two orders of magnitude, and (ii) the best (linear-space) method of approximate retrieval by an order of magnitude, and at the same time, returns neighbors with much better quality. © 2009 ACM.
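The LSH idea underlying the LSB-tree can be sketched in a few lines. The following is a generic sign-random-projection LSH index, not the LSB-tree itself: each table hashes a point to a bit pattern of hyperplane signs, so near-duplicates usually collide in at least one table, and only the colliding candidates are scanned exactly. All parameters are illustrative assumptions.

```python
import numpy as np

class HyperplaneLSH:
    """Minimal sign-random-projection LSH index (an illustrative sketch,
    not the LSB-tree of the paper)."""

    def __init__(self, dim, n_bits=8, n_tables=10, seed=0):
        r = np.random.default_rng(seed)
        self.planes = [r.standard_normal((n_bits, dim)) for _ in range(n_tables)]
        self.tables = [dict() for _ in range(n_tables)]

    def _keys(self, x):
        # One bit per hyperplane: which side of the plane x falls on
        return [tuple((P @ x > 0).astype(int)) for P in self.planes]

    def index(self, data):
        self.data = data
        for i, x in enumerate(data):
            for tbl, key in zip(self.tables, self._keys(x)):
                tbl.setdefault(key, []).append(i)

    def query(self, q):
        # Candidates = union of q's buckets across tables; then exact scan
        cand = set()
        for tbl, key in zip(self.tables, self._keys(q)):
            cand.update(tbl.get(key, []))
        cand = list(cand) if cand else list(range(len(self.data)))
        d = np.linalg.norm(self.data[cand] - q, axis=1)
        return cand[int(np.argmin(d))]

rng = np.random.default_rng(2)
data = rng.standard_normal((1000, 32))
lsh = HyperplaneLSH(dim=32)
lsh.index(data)
q = data[123] + 1e-3 * rng.standard_normal(32)  # near-duplicate query
print(lsh.query(q))  # index of the planted near-duplicate (123, with high probability)
```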

  3. Approximation of High-Dimensional Rank One Tensors

    KAUST Repository

    Bachmayr, Markus; Dahmen, Wolfgang; DeVore, Ronald; Grasedyck, Lars

    2013-01-01

    Many real world problems are high-dimensional in that their solution is a function which depends on many variables or parameters. This presents a computational challenge since traditional numerical techniques are built on model classes for functions based solely on smoothness. It is known that the approximation of smoothness classes of functions suffers from the so-called 'curse of dimensionality'. Avoiding this curse requires new model classes for real world functions that match applications. This has led to the introduction of notions such as sparsity, variable reduction, and reduced modeling. One theme that is particularly common is to assume a tensor structure for the target function. This paper investigates how well a rank one function f(x_1,…,x_d) = f_1(x_1)⋯f_d(x_d), defined on Ω = [0,1]^d, can be captured through point queries. It is shown that such a rank one function with component functions f_j in W_∞^r([0,1]) can be captured (in L_∞) to accuracy O(C(d,r)N^(−r)) from N well-chosen point evaluations. The constant C(d,r) scales like d^(dr). The queries in our algorithms have two ingredients, a set of points built on the results from discrepancy theory and a second adaptive set of queries dependent on the information drawn from the first set. Under the assumption that a point z ∈ Ω with nonvanishing f(z) is known, the accuracy improves to O(dN^(−r)). © 2013 Springer Science+Business Media New York.
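The final remark above, that knowing a point z with f(z) ≠ 0 improves matters, rests on a simple identity: querying f along the d axis-parallel lines through z gives g_j(t) = f_j(t)·f(z)/f_j(z_j), so the product over j of g_j(x_j) equals f(x)·f(z)^(d−1). A sketch with hypothetical smooth component functions (the specific f_j below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5  # number of variables

# A rank-one target f(x) = f_1(x_1) * ... * f_d(x_d) on [0,1]^d
# (hypothetical positive component functions, for illustration only)
coeffs = rng.uniform(0.5, 1.5, size=d)
def f(x):
    return np.prod([np.cos(c * xi) + 1.5 for c, xi in zip(coeffs, x)])

# Assume a point z with f(z) != 0 is known (here: midpoint of the cube)
z = np.full(d, 0.5)
fz = f(z)

def f_hat(x):
    """Reconstruct f(x) from d axis-parallel 'line queries' through z:
    prod_j f(z_1,..,x_j,..,z_d) = f(x) * f(z)^(d-1)."""
    val = 1.0
    for j in range(d):
        zj = z.copy()
        zj[j] = x[j]     # query f along the j-th line through z
        val *= f(zj)
    return val / fz ** (d - 1)

x = rng.uniform(0, 1, size=d)
print(abs(f_hat(x) - f(x)))  # algebraically exact, so error is only rounding
```

Sampling each of the d lines at N points and applying a univariate method of order r to each factor is what yields the improved O(dN^(−r)) accuracy.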

  4. Low-cost digital image processing at the University of Oklahoma

    Science.gov (United States)

    Harrington, J. A., Jr.

    1981-01-01

    Computer assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and is dependent upon initial preprocessing of a LANDSAT computer compatible tape using software developed for an IBM 370/158 computer. In-house generated preprocessing algorithms permit students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and for image analysis using either of the two approaches to low-cost LANDSAT data processing are described.

  5. Final report on a pilot academic e-books project at Keio University Libraries : Potential for the scholarly use of digitized academic books

    Science.gov (United States)

    Shimada, Takashi

    This article reports on the results and significance of a pilot academic e-books project carried out at the Keio University Libraries from fiscal 2010 to 2012 to assess the viability of a new model in which the libraries provide all campuses with access to Japanese academic books digitized jointly with academic publishers and cooperative firms. It focuses on the experimental use of digitized books, highlighting the students’ attitudes and expectations towards e-books as found from surveys. Some major findings include the following. Users have a strong demand for digitized readings that are lookup-oriented rather than learning-oriented, with greater value placed on the functionalities of federated full-text searching, reading on a screen, and accessing the desired chapter directly from the table of contents. They also want an online space in which to manage different forms of digitized learning resources. Based on the results of the experiment, we investigated the potential of e-books and a new type of textbook as educational infrastructure. Japan’s university libraries need to engage actively in the mass digitization of academic books to adapt to the changes in the ways research, study and teaching are conducted. We plan to start a joint experiment with other university libraries to develop a practical model for the use of e-books.

  6. Sloan Digital Sky Survey IV: Mapping the Milky Way, Nearby Galaxies, and the Distant Universe

    Science.gov (United States)

    Blanton, Michael R.; Bershady, Matthew A.; Abolfathi, Bela; Albareti, Franco D.; Allende Prieto, Carlos; Almeida, Andres; Alonso-García, Javier; Anders, Friedrich; Anderson, Scott F.; Andrews, Brett; Aquino-Ortíz, Erik; Aragón-Salamanca, Alfonso; Argudo-Fernández, Maria; Armengaud, Eric; Aubourg, Eric; Avila-Reese, Vladimir; Badenes, Carles; Bailey, Stephen; Barger, Kathleen A.; Barrera-Ballesteros, Jorge; Bartosz, Curtis; Bates, Dominic; Baumgarten, Falk; Bautista, Julian; Beaton, Rachael; Beers, Timothy C.; Belfiore, Francesco; Bender, Chad F.; Berlind, Andreas A.; Bernardi, Mariangela; Beutler, Florian; Bird, Jonathan C.; Bizyaev, Dmitry; Blanc, Guillermo A.; Blomqvist, Michael; Bolton, Adam S.; Boquien, Médéric; Borissova, Jura; van den Bosch, Remco; Bovy, Jo; Brandt, William N.; Brinkmann, Jonathan; Brownstein, Joel R.; Bundy, Kevin; Burgasser, Adam J.; Burtin, Etienne; Busca, Nicolás G.; Cappellari, Michele; Delgado Carigi, Maria Leticia; Carlberg, Joleen K.; Carnero Rosell, Aurelio; Carrera, Ricardo; Chanover, Nancy J.; Cherinka, Brian; Cheung, Edmond; Gómez Maqueo Chew, Yilen; Chiappini, Cristina; Doohyun Choi, Peter; Chojnowski, Drew; Chuang, Chia-Hsun; Chung, Haeun; Cirolini, Rafael Fernando; Clerc, Nicolas; Cohen, Roger E.; Comparat, Johan; da Costa, Luiz; Cousinou, Marie-Claude; Covey, Kevin; Crane, Jeffrey D.; Croft, Rupert A. C.; Cruz-Gonzalez, Irene; Garrido Cuadra, Daniel; Cunha, Katia; Damke, Guillermo J.; Darling, Jeremy; Davies, Roger; Dawson, Kyle; de la Macorra, Axel; Dell'Agli, Flavia; De Lee, Nathan; Delubac, Timothée; Di Mille, Francesco; Diamond-Stanic, Aleks; Cano-Díaz, Mariana; Donor, John; Downes, Juan José; Drory, Niv; du Mas des Bourboux, Hélion; Duckworth, Christopher J.; Dwelly, Tom; Dyer, Jamie; Ebelke, Garrett; Eigenbrot, Arthur D.; Eisenstein, Daniel J.; Emsellem, Eric; Eracleous, Mike; Escoffier, Stephanie; Evans, Michael L.; Fan, Xiaohui; Fernández-Alvar, Emma; Fernandez-Trincado, J. 
G.; Feuillet, Diane K.; Finoguenov, Alexis; Fleming, Scott W.; Font-Ribera, Andreu; Fredrickson, Alexander; Freischlad, Gordon; Frinchaboy, Peter M.; Fuentes, Carla E.; Galbany, Lluís; Garcia-Dias, R.; García-Hernández, D. A.; Gaulme, Patrick; Geisler, Doug; Gelfand, Joseph D.; Gil-Marín, Héctor; Gillespie, Bruce A.; Goddard, Daniel; Gonzalez-Perez, Violeta; Grabowski, Kathleen; Green, Paul J.; Grier, Catherine J.; Gunn, James E.; Guo, Hong; Guy, Julien; Hagen, Alex; Hahn, ChangHoon; Hall, Matthew; Harding, Paul; Hasselquist, Sten; Hawley, Suzanne L.; Hearty, Fred; Gonzalez Hernández, Jonay I.; Ho, Shirley; Hogg, David W.; Holley-Bockelmann, Kelly; Holtzman, Jon A.; Holzer, Parker H.; Huehnerhoff, Joseph; Hutchinson, Timothy A.; Hwang, Ho Seong; Ibarra-Medel, Héctor J.; da Silva Ilha, Gabriele; Ivans, Inese I.; Ivory, KeShawn; Jackson, Kelly; Jensen, Trey W.; Johnson, Jennifer A.; Jones, Amy; Jönsson, Henrik; Jullo, Eric; Kamble, Vikrant; Kinemuchi, Karen; Kirkby, David; Kitaura, Francisco-Shu; Klaene, Mark; Knapp, Gillian R.; Kneib, Jean-Paul; Kollmeier, Juna A.; Lacerna, Ivan; Lane, Richard R.; Lang, Dustin; Law, David R.; Lazarz, Daniel; Lee, Youngbae; Le Goff, Jean-Marc; Liang, Fu-Heng; Li, Cheng; Li, Hongyu; Lian, Jianhui; Lima, Marcos; Lin, Lihwai; Lin, Yen-Ting; Bertran de Lis, Sara; Liu, Chao; de Icaza Lizaola, Miguel Angel C.; Long, Dan; Lucatello, Sara; Lundgren, Britt; MacDonald, Nicholas K.; Deconto Machado, Alice; MacLeod, Chelsea L.; Mahadevan, Suvrath; Geimba Maia, Marcio Antonio; Maiolino, Roberto; Majewski, Steven R.; Malanushenko, Elena; Malanushenko, Viktor; Manchado, Arturo; Mao, Shude; Maraston, Claudia; Marques-Chaves, Rui; Masseron, Thomas; Masters, Karen L.; McBride, Cameron K.; McDermid, Richard M.; McGrath, Brianne; McGreer, Ian D.; Medina Peña, Nicolás; Melendez, Matthew; Merloni, Andrea; Merrifield, Michael R.; Meszaros, Szabolcs; Meza, Andres; Minchev, Ivan; Minniti, Dante; Miyaji, Takamitsu; More, Surhud; Mulchaey, John; 
Müller-Sánchez, Francisco; Muna, Demitri; Munoz, Ricardo R.; Myers, Adam D.; Nair, Preethi; Nandra, Kirpal; Correa do Nascimento, Janaina; Negrete, Alenka; Ness, Melissa; Newman, Jeffrey A.; Nichol, Robert C.; Nidever, David L.; Nitschelm, Christian; Ntelis, Pierros; O'Connell, Julia E.; Oelkers, Ryan J.; Oravetz, Audrey; Oravetz, Daniel; Pace, Zach; Padilla, Nelson; Palanque-Delabrouille, Nathalie; Alonso Palicio, Pedro; Pan, Kaike; Parejko, John K.; Parikh, Taniya; Pâris, Isabelle; Park, Changbom; Patten, Alim Y.; Peirani, Sebastien; Pellejero-Ibanez, Marcos; Penny, Samantha; Percival, Will J.; Perez-Fournon, Ismael; Petitjean, Patrick; Pieri, Matthew M.; Pinsonneault, Marc; Pisani, Alice; Poleski, Radosław; Prada, Francisco; Prakash, Abhishek; Queiroz, Anna Bárbara de Andrade; Raddick, M. Jordan; Raichoor, Anand; Barboza Rembold, Sandro; Richstein, Hannah; Riffel, Rogemar A.; Riffel, Rogério; Rix, Hans-Walter; Robin, Annie C.; Rockosi, Constance M.; Rodríguez-Torres, Sergio; Roman-Lopes, A.; Román-Zúñiga, Carlos; Rosado, Margarita; Ross, Ashley J.; Rossi, Graziano; Ruan, John; Ruggeri, Rossana; Rykoff, Eli S.; Salazar-Albornoz, Salvador; Salvato, Mara; Sánchez, Ariel G.; Aguado, D. S.; Sánchez-Gallego, José R.; Santana, Felipe A.; Santiago, Basílio Xavier; Sayres, Conor; Schiavon, Ricardo P.; da Silva Schimoia, Jaderson; Schlafly, Edward F.; Schlegel, David J.; Schneider, Donald P.; Schultheis, Mathias; Schuster, William J.; Schwope, Axel; Seo, Hee-Jong; Shao, Zhengyi; Shen, Shiyin; Shetrone, Matthew; Shull, Michael; Simon, Joshua D.; Skinner, Danielle; Skrutskie, M. 
F.; Slosar, Anže; Smith, Verne V.; Sobeck, Jennifer S.; Sobreira, Flavia; Somers, Garrett; Souto, Diogo; Stark, David V.; Stassun, Keivan; Stauffer, Fritz; Steinmetz, Matthias; Storchi-Bergmann, Thaisa; Streblyanska, Alina; Stringfellow, Guy S.; Suárez, Genaro; Sun, Jing; Suzuki, Nao; Szigeti, Laszlo; Taghizadeh-Popp, Manuchehr; Tang, Baitian; Tao, Charling; Tayar, Jamie; Tembe, Mita; Teske, Johanna; Thakar, Aniruddha R.; Thomas, Daniel; Thompson, Benjamin A.; Tinker, Jeremy L.; Tissera, Patricia; Tojeiro, Rita; Hernandez Toledo, Hector; de la Torre, Sylvain; Tremonti, Christy; Troup, Nicholas W.; Valenzuela, Octavio; Martinez Valpuesta, Inma; Vargas-González, Jaime; Vargas-Magaña, Mariana; Vazquez, Jose Alberto; Villanova, Sandro; Vivek, M.; Vogt, Nicole; Wake, David; Walterbos, Rene; Wang, Yuting; Weaver, Benjamin Alan; Weijmans, Anne-Marie; Weinberg, David H.; Westfall, Kyle B.; Whelan, David G.; Wild, Vivienne; Wilson, John; Wood-Vasey, W. M.; Wylezalek, Dominika; Xiao, Ting; Yan, Renbin; Yang, Meng; Ybarra, Jason E.; Yèche, Christophe; Zakamska, Nadia; Zamora, Olga; Zarrouk, Pauline; Zasowski, Gail; Zhang, Kai; Zhao, Gong-Bo; Zheng, Zheng; Zheng, Zheng; Zhou, Xu; Zhou, Zhi-Min; Zhu, Guangtun B.; Zoccali, Manuela; Zou, Hu

    2017-07-01

    We describe the Sloan Digital Sky Survey IV (SDSS-IV), a project encompassing three major spectroscopic programs. The Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2) is observing hundreds of thousands of Milky Way stars at high resolution and high signal-to-noise ratios in the near-infrared. The Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey is obtaining spatially resolved spectroscopy for thousands of nearby galaxies (median z ∼ 0.03). The extended Baryon Oscillation Spectroscopic Survey (eBOSS) is mapping the galaxy, quasar, and neutral gas distributions between z ∼ 0.6 and 3.5 to constrain cosmology using baryon acoustic oscillations, redshift space distortions, and the shape of the power spectrum. Within eBOSS, we are conducting two major subprograms: the SPectroscopic IDentification of eROSITA Sources (SPIDERS), investigating X-ray AGNs and galaxies in X-ray clusters, and the Time Domain Spectroscopic Survey (TDSS), obtaining spectra of variable sources. All programs use the 2.5 m Sloan Foundation Telescope at the Apache Point Observatory; observations there began in Summer 2014. APOGEE-2 also operates a second near-infrared spectrograph at the 2.5 m du Pont Telescope at Las Campanas Observatory, with observations beginning in early 2017. Observations at both facilities are scheduled to continue through 2020. In keeping with previous SDSS policy, SDSS-IV provides regularly scheduled public data releases; the first one, Data Release 13, was made available in 2016 July.

  7. Testing the Andrews Framework of Strategy Formulation and Implementation: Case Study of the University of Cape Coast Digital Library in Ghana

    Directory of Open Access Journals (Sweden)

    Nesba Yaa Anima Adzobu

    2014-12-01

    This paper investigates how the strategy formulation and implementation processes used by the University of Cape Coast (UCC) in building its digital collections compare with the Andrews strategic formulation and implementation theoretical framework. A theory-testing case study methodology was used. The data collection instruments were key informant interviews and document reviews. During the formulation phase, two aspects (resources and the aspirations of senior management) were emergent. During the implementation phase, five aspects (achieving results, processes and behaviour, standards, motivation, personal) were emergent. All other elements of building the UCC digital collections were planned during both the formulation and implementation phases. Although the emphasis on students and learning is laudable and apt, there seems to be a lack of focus on research support beyond digital collection building, despite the fact that research excellence is one of the UCC’s key priorities. Opportunities exist for improving feedback mechanisms between the users, digital library staff and university management, and for including social media tools in the digital library project. Since only the experience of a single institution of higher learning is considered, it cannot be definitively stated that strategy formulation and implementation will be similar in every institutional context. However, the results provide a basis for academic digital libraries to draw lessons from this case. In African public universities, there is little earlier research on strategy formulation and implementation in digital library management. Strategy formulation and implementation is a critical issue for higher education academic libraries, especially in developing countries like Ghana, due to limited financial resources and the rapid change in the information environment over the last several decades.

  8. Journalism and Mass Communication Students at Historically Black Colleges and Universities and Predominantly White Institutions: Saying Goodbye to the Digital Divide

    Science.gov (United States)

    Crawford, Jerry, II

    2013-01-01

    The digital divide has been described as the distance or gap in access to information based on race, ethnicity, income, education and geographical location. This study examined how freshmen and first-semester journalism and mass communications students at five Historically Black Colleges and Universities [HBCUs] have been able to bridge the…

  9. Social Media Contribution to the Promotion of Digital Citizenship among Female Students at Imam Mohammed bin Saud Islamic University in Riyadh

    Science.gov (United States)

    Alharbi, Wafa Owaydhah; Alturki, Khaled Ibrahim

    2018-01-01

    The study aimed to identify the degree to which social media contribute to reinforcing the concept of digital citizenship from the viewpoint of female students at Imam Mohammed bin Saud Islamic University in Riyadh. To achieve its objectives, the study attempted to answer the following two questions: To which extent does SnapChat…

  10. Do hospital physicians really want to go digital? Acceptance of a picture archiving and communication system in a university hospital

    International Nuclear Information System (INIS)

    Duyck, P.; Pynoo, B.; Devolder, P.; Voet, T.; Adang, L.; Vercruysse, J.

    2008-01-01

    Purpose: radiology departments are making the transition from analog film to digital images by means of PACS (Picture Archiving and Communication System). It is critical for the hospital that its physicians adopt and accept the new digital work method regarding radiological information. The aim of this study is to investigate hospital physicians' acceptance of PACS using questionnaires pre- and post-implementation and to identify the main influencing factors. Materials and methods: the study was conducted in a 1169-bed university hospital. The UTAUT (Unified Theory of Acceptance and Use of Technology) questionnaire was administered at two times: one month pre-implementation (T1) and 1.5 years post-implementation (T2) of PACS, targeting all hospital physicians with the exception of radiologists. The UTAUT scales (Behavioral Intention BI; Facilitating Conditions FC; Effort Expectancy EE; Performance Expectancy PE; Anxiety ANX; Social Influence SI; System Use USE; Attitude toward technology ATT; Self-Efficacy SE) were used to assess questions regarding: (a) PACS' usefulness, (b) PACS' ease of learning/using, (c) PACS support availability, (d) the perceived pressure to use PACS, (e) physicians' attitude towards PACS and (f) physicians' intention to use and actual use of PACS. Results: at T1, scale ratings were positive toward the PACS implementation. The ratings on all scales, with the exception of self-efficacy, improved at T2. Regression analysis revealed that the key factor for intention to use PACS at T1 was the usefulness of PACS, while the availability and awareness of support was its most important predictor at T2. Overall, PE was the best predictor of BI, but all four UTAUT determinants (PE, FC, EE and SI) were salient for its prediction. Variance explained in BI ranged from 31 to 37%, while variance explained in USE was very low (3%). (orig.)

  11. Sloan Digital Sky Survey III photometric quasar clustering: probing the initial conditions of the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Shirley; Agarwal, Nishant; Lyons, Richard; Disbrow, Ashley; O' Connell, Ross [McWilliams Center for Cosmology, Department of Physics, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States); Myers, Adam D. [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Seo, Hee-Jong; Schlegel, David; Ross, Nicholas P. [Lawrence Berkeley National Laboratory, 1 Cyclotron Rd, Berkeley, CA 94702 (United States); Ross, Ashley [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth, PO1 3FX (United Kingdom); Hirata, Christopher; Huff, Eric; Weinberg, David [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Padmanabhan, Nikhil [Department of Physics and Astronomy, Yale University, New Haven, CT 06520 (United States); Slosar, Anže [Brookhaven National Laboratory, Bldg. 510, Upton NY 11375 (United States); Strauss, Michael; Bahcall, Neta [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Schneider, Donald P. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Brinkmann, J. [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349-0059 (United States); Palanque-Delabrouille, Nathalie, E-mail: shirleyh@andrew.cmu.edu [CEA, Centre de Saclay, Irfu/SPP, F-91191 Gif-sur-Yvette (France); and others

    2015-05-01

    The Sloan Digital Sky Survey has surveyed 14,555 square degrees of the sky, and delivered over a trillion pixels of imaging data. We present the large-scale clustering of 1.6 million quasars between z = 0.5 and z = 2.5 that have been classified from this imaging, representing the highest density of quasars ever studied for clustering measurements. This data set spans ∼11,000 square degrees and probes a volume of 80 h⁻³ Gpc³. In principle, such a large volume and medium density of tracers should facilitate high-precision cosmological constraints. We measure the angular clustering of photometrically classified quasars using an optimal quadratic estimator in four redshift slices with an accuracy of ∼25% over a bin width of δℓ ∼ 10−15 on scales corresponding to matter-radiation equality and larger (ℓ ∼ 2−30). Observational systematics can strongly bias clustering measurements on large scales, which can mimic cosmologically relevant signals such as deviations from Gaussianity in the spectrum of primordial perturbations. We account for systematics by applying a new method recently proposed by Agarwal et al. (2014) to the clustering of photometrically classified quasars. We carefully apply our methodology to mitigate known observational systematics and further remove angular bins that are contaminated by unknown systematics. Combining quasar data with the photometric luminous red galaxy (LRG) sample of Ross et al. (2011) and Ho et al. (2012), and marginalizing over all bias and shot noise-like parameters, we obtain a constraint on local primordial non-Gaussianity of f_NL = −113 ± 154 (1σ error). We next assume that the bias of quasar and galaxy distributions can be obtained independently from quasar/galaxy-CMB lensing cross-correlation measurements (such as those in Sherwin et al. (2013)). This can be facilitated by spectroscopic observations of the sources, enabling the redshift distribution to be

  12. Matrix correlations for high-dimensional data: The modified RV-coefficient

    NARCIS (Netherlands)

    Smilde, A.K.; Kiers, H.A.L.; Bijlsma, S.; Rubingh, C.M.; Erk, M.J. van

    2009-01-01

    Motivation: Modern functional genomics generates high-dimensional datasets. It is often convenient to have a single simple number characterizing the relationship between pairs of such high-dimensional datasets in a comprehensive way. Matrix correlations are such numbers and are appealing since they
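A common formulation of the modified RV-coefficient removes the diagonals of the sample cross-product matrices before taking the matrix correlation, which corrects the upward bias of the plain RV in high dimensions. A minimal sketch following that idea (details of the published estimator may differ):

```python
import numpy as np

def modified_rv(X, Y):
    """Modified RV-coefficient between two data sets on the same samples
    (a sketch: column-center, form cross-product matrices, zero their
    diagonals, then take the matrix correlation)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    XX = X @ X.T                     # n x n cross-product matrices
    YY = Y @ Y.T
    np.fill_diagonal(XX, 0.0)        # removing the diagonals corrects the
    np.fill_diagonal(YY, 0.0)        # high-dimensional bias of the plain RV
    return np.trace(XX @ YY) / np.sqrt(np.trace(XX @ XX) * np.trace(YY @ YY))

rng = np.random.default_rng(3)
X = rng.standard_normal((20, 500))   # 20 samples, 500 variables (p >> n)
print(modified_rv(X, X))             # identical data sets give 1.0
```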

  13. A model for a PC-based, universal-format, multimedia digitization system: moving beyond the scanner.

    Science.gov (United States)

    McEachen, James C; Cusack, Thomas J; McEachen, John C

    2003-08-01

    Digitizing images for use in case presentations based on hardcopy films, slides, photographs, negatives, books, and videos can present a challenging task. Scanners and digital cameras have become standard tools of the trade. Unfortunately, use of these devices to digitize multiple images in many different media formats can be a time-consuming and in some cases unachievable process. The authors' goal was to create a PC-based solution for digitizing multiple media formats in a timely fashion while maintaining adequate image presentation quality. The authors' PC-based solution makes use of off-the-shelf hardware, including a digital document camera (DDC), a VHS video player, and a video-editing kit. With the assistance of five staff radiologists, the authors examined the quality of multiple image types digitized with this equipment. The authors also quantified the speed of digitization of various types of media using the DDC and video-editing kit. With regard to image quality, the five staff radiologists rated the digitized angiography, CT, and MR images as adequate to excellent for use in teaching files and case presentations. With regard to digitized plain films, the average rating was adequate. As for performance, the authors achieved a 68% improvement in the time required to digitize hardcopy films using the DDC instead of a professional quality scanner. The PC-based solution provides a means for digitizing multiple images from many different types of media in a timely fashion while maintaining adequate image presentation quality.

  14. On the Brain Basis of Digital Daze in Millennial Minds: Rejoinder to "Digital Technology and Student Cognitive Development: The Neuroscience of the University Classroom"

    Science.gov (United States)

    Brown, Timothy T.

    2016-01-01

    In this issue, Cavanaugh, Giapponi, and Golden (2016) have discussed the new prominent role of digital devices in the lives of students; the possible impact of these widely-used technologies on developing, learning minds; and the relevance of new cognitive neuroscience research and technologies for better understanding the potential effects of…

  15. Self-dissimilarity as a High Dimensional Complexity Measure

    Science.gov (United States)

    Wolpert, David H.; Macready, William

    2005-01-01

    For many systems characterized as "complex" the patterns exhibited on different scales differ markedly from one another. For example the biomass distribution in a human body "looks very different" depending on the scale at which one examines it. Conversely, the patterns at different scales in "simple" systems (e.g., gases, mountains, crystals) vary little from one scale to another. Accordingly, the degrees of self-dissimilarity between the patterns of a system at various scales constitute a complexity "signature" of that system. Here we present a novel quantification of self-dissimilarity. This signature can, if desired, incorporate a novel information-theoretic measure of the distance between probability distributions that we derive here. Whatever distance measure is chosen, our quantification of self-dissimilarity can be measured for many kinds of real-world data. This allows comparisons of the complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital computer vs. species densities in a rain-forest vs. capital density in an economy, etc.). Moreover, in contrast to many other suggested complexity measures, evaluating the self-dissimilarity of a system does not require one to already have a model of the system. These facts may allow self-dissimilarity signatures to be used as the underlying observational variables of an eventual overarching theory relating all complex systems. To illustrate self-dissimilarity we present several numerical experiments. In particular, we show that the underlying structure of the logistic map is picked out by the self-dissimilarity signature of time series produced by that map.
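A toy version of the idea can be computed directly: compare the value distribution of a signal with that of a coarse-grained (block-averaged) copy, using some divergence between the two histograms. The sketch below uses Jensen-Shannon divergence as a generic stand-in for the information-theoretic distance derived in the paper, applied to a logistic-map time series as in the authors' example; scales, bin counts and the divergence choice are all illustrative assumptions.

```python
import numpy as np

def coarse_grain(x, s):
    """Block-average a series at scale s."""
    n = len(x) // s
    return x[: n * s].reshape(n, s).mean(axis=1)

def self_dissimilarity(x, s1=1, s2=4, bins=16):
    """Toy self-dissimilarity: Jensen-Shannon divergence between the value
    histograms of the series at two scales (an illustrative stand-in for
    the distance measure in the paper)."""
    lo, hi = x.min(), x.max()
    p, _ = np.histogram(coarse_grain(x, s1), bins=bins, range=(lo, hi))
    q, _ = np.histogram(coarse_grain(x, s2), bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0          # m > 0 wherever a > 0, so the ratio is safe
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Fully chaotic logistic map x_{t+1} = 4 x_t (1 - x_t)
x = np.empty(4096)
x[0] = 0.3
for t in range(1, 4096):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

score = self_dissimilarity(x)
print(score)  # nonzero: the map's distribution changes under coarse-graining
```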

  16. Digital delicacies

    OpenAIRE

    Holley, Rose

    2001-01-01

    This presentation outlines the purpose and work of the newly appointed Digital Projects Librarian at the University of Auckland. It gives a brief overview of what digitisation is, the benefits, the stages of a digitisation project and also samples of interesting international digitisation projects and new University of Auckland Library Digitisation projects.

  17. Enhanced spectral resolution by high-dimensional NMR using the filter diagonalization method and “hidden” dimensions

    Science.gov (United States)

    Meng, Xi; Nguyen, Bao D.; Ridge, Clark; Shaka, A. J.

    2009-01-01

    High-dimensional (HD) NMR spectra have poorer digital resolution than low-dimensional (LD) spectra, for a fixed amount of experiment time. This has led to “reduced-dimensionality” strategies, in which several LD projections of the HD NMR spectrum are acquired, each with higher digital resolution; an approximate HD spectrum is then inferred by some means. We propose a strategy that moves in the opposite direction, by adding more time dimensions to increase the information content of the data set, even if only a very sparse time grid is used in each dimension. The full HD time-domain data can be analyzed by the Filter Diagonalization Method (FDM), yielding very narrow resonances along all of the frequency axes, even those with sparse sampling. Integrating over the added dimensions of HD FDM NMR spectra reconstitutes LD spectra with enhanced resolution, often more quickly than direct acquisition of the LD spectrum with a larger number of grid points in each of the fewer dimensions. If the extra dimensions do not appear in the final spectrum, and are used solely to boost information content, we propose the moniker hidden-dimension NMR. This work shows that HD peaks have unmistakable frequency signatures that can be detected as single HD objects by an appropriate algorithm, even though their patterns would be tricky for a human operator to visualize or recognize, and even if digital resolution in an HD FT spectrum is very coarse compared with natural line widths. PMID:18926747

  18. Robust and sparse correlation matrix estimation for the analysis of high-dimensional genomics data.

    Science.gov (United States)

    Serra, Angela; Coretto, Pietro; Fratello, Michele; Tagliaferri, Roberto; Stegle, Oliver

    2018-02-15

    Microarray technology can be used to study the expression of thousands of genes across a number of different experimental conditions, usually hundreds. The underlying principle is that genes sharing similar expression patterns across different samples can be part of the same co-expression system, or they may share the same biological functions. Groups of genes are usually identified based on cluster analysis. Clustering methods rely on the similarity matrix between genes. A common choice to measure similarity is to compute the sample correlation matrix. Dimensionality reduction is another popular data analysis task which is also based on covariance/correlation matrix estimates. Unfortunately, covariance/correlation matrix estimation suffers from the intrinsic noise present in high-dimensional data. Sources of noise are: sampling variations, the presence of outlying sample units, and the fact that in most cases the number of units is much smaller than the number of genes. In this paper, we propose a robust correlation matrix estimator that is regularized based on adaptive thresholding. The resulting method jointly tames the effects of high dimensionality and data contamination. Computations are easy to implement and do not require hand tuning. Both simulated and real data are analyzed. A Monte Carlo experiment shows that the proposed method achieves remarkable performance. Our correlation metric is more robust to outliers than the existing alternatives in two gene expression datasets. It is also shown how the regularization allows one to automatically detect and filter spurious correlations. The same regularization is also extended to other less robust correlation measures. Finally, we apply the ARACNE algorithm to the SyNTreN gene expression data. Sensitivity and specificity of the reconstructed network are compared with the gold standard. We show that ARACNE performs better when it takes the proposed correlation matrix estimator as input. The R
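    The thresholding idea can be illustrated with a toy sketch. The code below is not the paper's adaptive, robust estimator; it hard-thresholds an ordinary Pearson correlation matrix at a fixed, user-chosen cutoff `tau` (an assumption), just to show how small, likely spurious correlations get filtered out.

```python
import math

def pearson(x, y):
    """Sample Pearson correlation of two equal-length, non-constant vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def thresholded_corr(columns, tau):
    """Correlation matrix with entries |r| < tau set to 0 (diagonal kept)."""
    p = len(columns)
    r = [[1.0] * p for _ in range(p)]
    for i in range(p):
        for j in range(i + 1, p):
            c = pearson(columns[i], columns[j])
            r[i][j] = r[j][i] = c if abs(c) >= tau else 0.0
    return r
```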

  19. SEPHIROT: Scenario for Universe-Creation AUTOMATICALLY from Digits On-Average Euler-Bernoulli-Kummer-Riemann-Newcomb-Poincare-Weyl-Benford-Kac-Raimi-Hill-Antonoff-Siegel ``Digit-Physics'' Logarithm-Law: ``It's a Jack-in-the-Box Universe'': EMET/TRUTH!!!

    Science.gov (United States)

    Siegel, Edward Carl-Ludwig; Young, Frederic; Wignall, Janis

    2013-04-01

    SEPHIROT: Siegel[http://fqxi.org/community/forum/topic/1553]: Ten-[0->9]-Digits; Average Log-Law SCALE-Invariance; Utter-Simplicity: ``Complexity'' (vs. ``Complicatedness''); Zipf-law/Hyperbolicity/ Inevitability SCENARIO AUTOMATICALLY CREATES & EVOLVES a UNIVERSE: inflation, a big-bang, bosons(E)->Mellin-(c2)-transform->fermions(m), hidden-dark-energy(HDE), hidden-dark-matter (HDM), cosmic-microwave-background(CMB), supersymmetry(SUSY), PURPOSELY NO: theories,models,mechanisms,processes, parameters,assumptions,WHATSOEVER: It's a ``Jack-in-the-Box'' Universe!!!: ONLY VIA: Newcomb [Am.J.Math.4(1),39(1881)]QUANTUM-discovery!!!-Benford-Siegel-Antonoff[AMS.Joint-Mtg.(02)-Abs.#973-60-124!!!] inversion to ONLY BEQS with d=0 BEC: ``Digit-Physics''!; Log fixed-point invariance(s): [base=units=SCALE] of digits classic (not classical!) average [CAUSING] log statistical-correlations =log(1+1/d), with physics-crucial d=0 BEC singularity/pole, permits SEPHIROT!!!: ``digits are quanta are bosons because bosons are and always were digits!!!'': Digits = Bosons with d=0 BEC(!!!) & expansion to Zipf-law Hyperbolicity INEVITABILITY CMB!

  20. Mitigating the Insider Threat Using High-Dimensional Search and Modeling

    National Research Council Canada - National Science Library

    Van Den Berg, Eric; Uphadyaya, Shambhu; Ngo, Phi H; Muthukrishnan, Muthu; Palan, Rajago

    2006-01-01

    In this project a system was built aimed at mitigating insider attacks centered around a high-dimensional search engine for correlating the large number of monitoring streams necessary for detecting insider attacks...

  1. Approximating high-dimensional dynamics by barycentric coordinates with linear programming

    Energy Technology Data Exchange (ETDEWEB)

    Hirata, Yoshito, E-mail: yoshito@sat.t.u-tokyo.ac.jp; Aihara, Kazuyuki; Suzuki, Hideyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan); Department of Mathematical Informatics, The University of Tokyo, Bunkyo-ku, Tokyo 113-8656 (Japan); CREST, JST, 4-1-8 Honcho, Kawaguchi, Saitama 332-0012 (Japan); Shiro, Masanori [Department of Mathematical Informatics, The University of Tokyo, Bunkyo-ku, Tokyo 113-8656 (Japan); Mathematical Neuroinformatics Group, Advanced Industrial Science and Technology, Tsukuba, Ibaraki 305-8568 (Japan); Takahashi, Nozomu; Mas, Paloma [Center for Research in Agricultural Genomics (CRAG), Consorci CSIC-IRTA-UAB-UB, Barcelona 08193 (Spain)

    2015-01-15

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability to model and predict them accurately. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit for the relatively short high-dimensional time series typically observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends barycentric coordinates to high-dimensional phase space by employing linear programming and allowing for approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the widely used radial basis function model. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, to comprehensively understanding complex biological data.

  2. Approximating high-dimensional dynamics by barycentric coordinates with linear programming

    International Nuclear Information System (INIS)

    Hirata, Yoshito; Aihara, Kazuyuki; Suzuki, Hideyuki; Shiro, Masanori; Takahashi, Nozomu; Mas, Paloma

    2015-01-01

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability to model and predict them accurately. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit for the relatively short high-dimensional time series typically observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends barycentric coordinates to high-dimensional phase space by employing linear programming and allowing for approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the widely used radial basis function model. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, to comprehensively understanding complex biological data.

  3. Approximating high-dimensional dynamics by barycentric coordinates with linear programming.

    Science.gov (United States)

    Hirata, Yoshito; Shiro, Masanori; Takahashi, Nozomu; Aihara, Kazuyuki; Suzuki, Hideyuki; Mas, Paloma

    2015-01-01

    The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability to model and predict them accurately. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit for the relatively short high-dimensional time series typically observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends barycentric coordinates to high-dimensional phase space by employing linear programming and allowing for approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the widely used radial basis function model. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, to comprehensively understanding complex biological data.
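    The building block these records extend is the classical barycentric representation: a point is written as a weighted combination of reference points whose weights sum to one. For a simplex of d+1 vertices in d dimensions this reduces to a linear solve; the papers' contribution, not shown here, is selecting reference points and tolerating approximation errors via linear programming. The pure-Python sketch below is ours.

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def barycentric(vertices, point):
    """Weights w with sum(w) = 1 and sum(w_i * v_i) = point, for d+1 vertices in d-D."""
    d = len(point)
    # One row per coordinate, plus the row enforcing that the weights sum to 1.
    a = [[vertices[j][i] for j in range(d + 1)] for i in range(d)]
    a.append([1.0] * (d + 1))
    return solve(a, list(point) + [1.0])
```

    For a point inside the simplex all weights come out non-negative; the LP extension replaces this exact solve with an optimization that also handles more reference points than d+1.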

  4. Efficient and accurate nearest neighbor and closest pair search in high-dimensional space

    KAUST Repository

    Tao, Yufei; Yi, Ke; Sheng, Cheng; Kalnis, Panos

    2010-01-01

    Nearest Neighbor (NN) search in high-dimensional space is an important problem in many applications. From the database perspective, a good solution needs to have two properties: (i) it can be easily incorporated in a relational database, and (ii

  5. Compressively Characterizing High-Dimensional Entangled States with Complementary, Random Filtering

    Science.gov (United States)

    2016-06-30

    Schneeloch, Daniel J. Lum, and John C. Howell, Department of Physics and Astronomy, University of Rochester, 500 Wilson Boulevard, Rochester, New...halves of the SLM, respectively. The signal and idler fields are routed to separate digital micromirror devices (DMDs) via a 500-mm lens and a 50:50 beam...detectors. The solver we use for Eq. (6) is TVAL3 [50]. The full measurement and reconstruction recipe we follow is similar to that described in Ref

  6. Classrooms as ‘safe houses’? The ethical and emotional implications of digital storytelling in a university writing classroom

    OpenAIRE

    Kristian D Stewart

    2017-01-01

    This paper reports the findings of a digital storytelling praxis within a higher education classroom located outside of Metro Detroit in the United States. Drawing on Zembylas’s (2006, 2008) scholarship on emotion in the production of knowledge and the teacher’s role, adjacent to literature surrounding personal writing and safe houses for learning, an investigation of student perceptions of digital storytelling within a writing classroom took place during the 2016 and 2017 academic years. Dat...

  7. Classrooms as ‘safe houses’? The ethical and emotional implications of digital storytelling in a university writing classroom

    Directory of Open Access Journals (Sweden)

    Kristian D Stewart

    2017-06-01

    Full Text Available This paper reports the findings of a digital storytelling praxis within a higher education classroom located outside of Metro Detroit in the United States. Drawing on Zembylas’s (2006, 2008) scholarship on emotion in the production of knowledge and the teacher’s role, adjacent to literature surrounding personal writing and safe houses for learning, an investigation of student perceptions of digital storytelling within a writing classroom took place during the 2016 and 2017 academic years. Data highlight the students’ interest in the emotionally driven course content digital storytelling encourages, as it taught students how to insert genre conventions into their own writing. Digital storytelling, according to the students, also supplied a means for students to develop relationships with their peers, as many students felt isolated on this largely commuter campus. Students additionally viewed the curriculum as promoting ‘real world’ skills they could transfer outside of the classroom and into their lives. However, to craft digital stories, data revealed how students turned toward sharing personal (and/or traumatic) narratives. This can be problematic in terms of emotional safety if students are made to feel they must leverage emotions for grades and are then forced to broadcast their digital stories in a public forum. To lessen these concerns, strategies for implementing digital storytelling into the curriculum are provided. Lastly, the author concludes that educating students within a Trump presidency requires a different pedagogical approach. Assignments such as digital storytelling that merge the scholarly and the personal, alongside nurturing empathy, open dialogue, and building relationships, might offer a direction forward.

  8. Using digital game-based technologies in a system of studying russian as aforeign language in modern university

    Directory of Open Access Journals (Sweden)

    A. V. Matokhina

    2017-01-01

    Full Text Available According to the State Educational Standard of the Russian Federation, the main objectives of teaching Russian as a foreign language are training communication and independent working skills, learning neutral and scientific styles of speech, motivating students to study at Russian universities, preparing them to pass certification and qualification examinations, adapting them to life in Russia, etc. One of the promising trends in teaching foreign languages is the use of educational computer games. By now, digital game-based technologies for studying Russian as a foreign language have been implemented in a number of desktop and mobile applications; however, they are all intended for teaching the Russian language as a discipline and are not focused on adapting international students who have come to a new language environment. In this article, a learning game is presented for studying Russian as a foreign language that immerses the user in a virtual language environment across different life situations. The game includes seven levels; each level consists of several sections devoted to a specific real-life situation, with a set of assignments of increasing complexity for writing or translating words, phrases, or sentences. For each type of assignment a template with empty text fields is used; special functions are implemented for importing files with the corresponding data and displaying them on screen. This approach allows the same template to be used several times for the same type of assignment, or different files to be loaded into the assignment text fields, depending on the number of the player’s attempts. The database of tasks, level scripts, and graphical content for each section are developed. Each level is matched with a game character who accompanies the player and helps him complete the assignments. The player can choose a character and any section of the level. The assignments are stored in a coded format, for uploading files with data matched to

  9. From Ambiguities to Insights: Query-based Comparisons of High-Dimensional Data

    Science.gov (United States)

    Kowalski, Jeanne; Talbot, Conover; Tsai, Hua L.; Prasad, Nijaguna; Umbricht, Christopher; Zeiger, Martha A.

    2007-11-01

    Genomic technologies will revolutionize drug discovery and development; that much is universally agreed upon. The high dimension of data from such technologies has challenged available data analytic methods; that much is apparent. To date, large-scale data repositories have not been utilized in ways that permit their wealth of information to be efficiently processed for knowledge, presumably due in large part to inadequate analytical tools to address numerous comparisons of high-dimensional data. In candidate gene discovery, expression comparisons are often made between two features (e.g., cancerous versus normal), such that the enumeration of outcomes is manageable. With multiple features, the setting becomes more complex, in terms of comparing expression levels of tens of thousands of transcripts across hundreds of features. In this case, the number of outcomes, while enumerable, becomes rapidly large and unmanageable, and scientific inquiries become more abstract, such as "which one of these (compounds, stimuli, etc.) is not like the others?" We develop analytical tools that promote more extensive, efficient, and rigorous utilization of the public data resources generated by the massive support of genomic studies. Our work innovates by enabling access to such metadata with logically formulated scientific inquiries that define, compare, and integrate query-comparison pair relations for analysis. We demonstrate our computational tool's potential to address an outstanding biomedical informatics issue of identifying reliable molecular markers in thyroid cancer. Our proposed query-based comparison (QBC) facilitates access to and efficient utilization of metadata through logically formed inquiries expressed as query-based comparisons, by organizing and comparing results from biotechnologies to address applications in biomedicine.

  10. Influence of the combination of digital and traditional marketing on corporate sales behavior. Case: digital music, Universal Music Ecuador

    OpenAIRE

    Díaz Zárate, Paola Cristina

    2014-01-01

    Sales of the physical music format have declined in Ecuador due to a lack of piracy control and high retail prices, leading to the closure of record stores and the strengthening of illegal distribution. The launch of legal digital music stores in the country represents an opportunity for record companies to drive digital music consumption, with iTunes standing out as the platform with the greatest reach and penetration, against which no company h...

  11. Engineering two-photon high-dimensional states through quantum interference

    Science.gov (United States)

    Zhang, Yingwen; Roux, Filippus S.; Konrad, Thomas; Agnew, Megan; Leach, Jonathan; Forbes, Andrew

    2016-01-01

    Many protocols in quantum science, for example, linear optical quantum computing, require access to large-scale entangled quantum states. Such systems can be realized through many-particle qubits, but this approach often suffers from scalability problems. An alternative strategy is to consider a smaller number of particles that exist in high-dimensional states. The spatial modes of light are one such candidate that provides access to high-dimensional quantum states, and thus they increase the storage and processing potential of quantum information systems. We demonstrate the controlled engineering of two-photon high-dimensional states entangled in their orbital angular momentum through Hong-Ou-Mandel interference. We prepare a large range of high-dimensional entangled states and implement precise quantum state filtering. We characterize the full quantum state before and after the filter, and are thus able to determine that only the antisymmetric component of the initial state remains. This work paves the way for high-dimensional processing and communication of multiphoton quantum states, for example, in teleportation beyond qubits. PMID:26933685

  12. A Comparison of Methods for Estimating the Determinant of High-Dimensional Covariance Matrix

    KAUST Repository

    Hu, Zongliang

    2017-09-27

    The determinant of the covariance matrix for high-dimensional data plays an important role in statistical inference and decision making. It has many real applications, including statistical tests and information theory. Due to the statistical and computational challenges of high dimensionality, little work has been proposed in the literature for estimating the determinant of a high-dimensional covariance matrix. In this paper, we estimate the determinant of the covariance matrix using some recent proposals for estimating the high-dimensional covariance matrix. Specifically, we consider a total of eight covariance matrix estimation methods for comparison. Through extensive simulation studies, we explore and summarize some interesting comparison results among all compared methods. We also provide practical guidelines based on the sample size, the dimension, and the correlation of the data set for estimating the determinant of a high-dimensional covariance matrix. Finally, from the perspective of the loss function, the comparison study in this paper may also serve as a proxy to assess the performance of covariance matrix estimation.

  13. A Comparison of Methods for Estimating the Determinant of High-Dimensional Covariance Matrix.

    Science.gov (United States)

    Hu, Zongliang; Dong, Kai; Dai, Wenlin; Tong, Tiejun

    2017-09-21

    The determinant of the covariance matrix for high-dimensional data plays an important role in statistical inference and decision making. It has many real applications, including statistical tests and information theory. Due to the statistical and computational challenges of high dimensionality, little work has been proposed in the literature for estimating the determinant of a high-dimensional covariance matrix. In this paper, we estimate the determinant of the covariance matrix using some recent proposals for estimating the high-dimensional covariance matrix. Specifically, we consider a total of eight covariance matrix estimation methods for comparison. Through extensive simulation studies, we explore and summarize some interesting comparison results among all compared methods. We also provide practical guidelines based on the sample size, the dimension, and the correlation of the data set for estimating the determinant of a high-dimensional covariance matrix. Finally, from the perspective of the loss function, the comparison study in this paper may also serve as a proxy to assess the performance of covariance matrix estimation.

  14. A Comparison of Methods for Estimating the Determinant of High-Dimensional Covariance Matrix

    KAUST Repository

    Hu, Zongliang; Dong, Kai; Dai, Wenlin; Tong, Tiejun

    2017-01-01

    The determinant of the covariance matrix for high-dimensional data plays an important role in statistical inference and decision making. It has many real applications, including statistical tests and information theory. Due to the statistical and computational challenges of high dimensionality, little work has been proposed in the literature for estimating the determinant of a high-dimensional covariance matrix. In this paper, we estimate the determinant of the covariance matrix using some recent proposals for estimating the high-dimensional covariance matrix. Specifically, we consider a total of eight covariance matrix estimation methods for comparison. Through extensive simulation studies, we explore and summarize some interesting comparison results among all compared methods. We also provide practical guidelines based on the sample size, the dimension, and the correlation of the data set for estimating the determinant of a high-dimensional covariance matrix. Finally, from the perspective of the loss function, the comparison study in this paper may also serve as a proxy to assess the performance of covariance matrix estimation.
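    To make the target quantity concrete: one of the simpler routes such comparisons cover is to regularize the sample covariance by linear shrinkage toward a scaled identity and then read off the log-determinant from a Cholesky factorization. The sketch below is illustrative, not any specific estimator from these papers; the fixed shrinkage weight `alpha` is an assumption (data-driven choices exist).

```python
import math

def shrink(cov, alpha):
    """Linear shrinkage toward mu*I, where mu is the average diagonal entry."""
    p = len(cov)
    mu = sum(cov[i][i] for i in range(p)) / p
    return [[(1 - alpha) * cov[i][j] + (alpha * mu if i == j else 0.0)
             for j in range(p)] for i in range(p)]

def log_det(cov):
    """Log-determinant of an SPD matrix via Cholesky: log det = 2 * sum(log L_ii)."""
    p = len(cov)
    L = [[0.0] * p for _ in range(p)]
    for i in range(p):
        for j in range(i + 1):
            s = cov[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(s) if i == j else s / L[j][j]
    return 2.0 * sum(math.log(L[i][i]) for i in range(p))
```

    Working with the log-determinant rather than the determinant itself avoids the overflow/underflow that plagues high dimensions.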

  15. A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Hongchao Song

    2017-01-01

    Full Text Available Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute neighborhood distances between observations and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples become similar, so each sample may appear to be an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble of k-nearest neighbor graph (K-NNG) based anomaly detectors. Benefiting from its ability for nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset, so as to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by combining all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves detection accuracy and reduces computational complexity.
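    The k-NN side of such a hybrid can be sketched briefly (the deep autoencoder stage is omitted here): each detector scores a point by its mean distance to its k nearest neighbours within a random subsample of the data, and the ensemble averages the detectors' scores. Function names and parameter values below are arbitrary illustrations, not the paper's settings.

```python
import random

def knn_score(point, sample, k):
    """Anomaly score: mean Euclidean distance to the k nearest points in `sample`."""
    dists = sorted(sum((a - b) ** 2 for a, b in zip(point, s)) ** 0.5
                   for s in sample)
    return sum(dists[:k]) / k

def ensemble_score(point, data, k=3, detectors=5, frac=0.6, seed=0):
    """Average k-NN score over several randomly sampled subsets of the data."""
    rng = random.Random(seed)
    m = max(k, int(frac * len(data)))  # subsample size, at least k
    return sum(knn_score(point, rng.sample(data, m), k)
               for _ in range(detectors)) / detectors
```

    A point far from the nominal cluster receives a much larger score than one inside it, which is the property a downstream threshold exploits.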

  16. Model-based Clustering of High-Dimensional Data in Astrophysics

    Science.gov (United States)

    Bouveyron, C.

    2016-05-01

    The nature of data in Astrophysics has changed, as in other scientific fields, in the past decades due to the increase of measurement capabilities. As a consequence, data are nowadays frequently high-dimensional and available in bulk or as streams. Model-based techniques for clustering are popular tools, renowned for their probabilistic foundations and their flexibility. However, classical model-based techniques behave disappointingly in high-dimensional spaces, mainly due to their dramatic over-parametrization. Recent developments in model-based classification overcome these drawbacks and allow high-dimensional data to be classified efficiently, even in the "small n / large p" situation. This work presents a comprehensive review of these recent approaches, including regularization-based techniques, parsimonious modeling, subspace classification methods, and classification methods based on variable selection. The use of these model-based methods is also illustrated on real-world classification problems in Astrophysics using R packages.

  17. High-dimensional orbital angular momentum entanglement concentration based on Laguerre–Gaussian mode selection

    International Nuclear Information System (INIS)

    Zhang, Wuhong; Su, Ming; Wu, Ziwen; Lu, Meng; Huang, Bingwei; Chen, Lixiang

    2013-01-01

    Twisted photons enable the definition of a Hilbert space beyond two dimensions by orbital angular momentum (OAM) eigenstates. Here we propose a feasible entanglement concentration experiment to enhance the quality of high-dimensional entanglement shared by twisted photon pairs. Our approach starts from the full characterization of the entangled spiral bandwidth and is then based on the careful selection of the Laguerre–Gaussian (LG) modes with specific radial and azimuthal indices p and ℓ. In particular, we demonstrate the possibility of high-dimensional entanglement concentration residing in an OAM subspace of up to 21 dimensions. By means of LabVIEW simulations with spatial light modulators, we show that the Shannon dimensionality could be employed to quantify the quality of the present concentration. Our scheme holds promise for quantum information applications defined in high-dimensional Hilbert space. (letter)

  18. Detection of Subtle Context-Dependent Model Inaccuracies in High-Dimensional Robot Domains.

    Science.gov (United States)

    Mendoza, Juan Pablo; Simmons, Reid; Veloso, Manuela

    2016-12-01

    Autonomous robots often rely on models of their sensing and actions for intelligent decision making. However, when operating in unconstrained environments, the complexity of the world makes it infeasible to create models that are accurate in every situation. This article addresses the problem of using potentially large and high-dimensional sets of robot execution data to detect situations in which a robot model is inaccurate; that is, detecting context-dependent model inaccuracies in a high-dimensional context space. To find inaccuracies tractably, the robot conducts an informed search through low-dimensional projections of execution data to find parametric Regions of Inaccurate Modeling (RIMs). Empirical evidence from two robot domains shows that this approach significantly enhances the detection power of existing RIM-detection algorithms in high-dimensional spaces.

  19. Linear stability theory as an early warning sign for transitions in high dimensional complex systems

    International Nuclear Information System (INIS)

    Piovani, Duccio; Grujić, Jelena; Jensen, Henrik Jeldtoft

    2016-01-01

    We analyse in detail a new approach to the monitoring and forecasting of the onset of transitions in high-dimensional complex systems by application to the Tangled Nature model of evolutionary ecology and high-dimensional replicator systems with a stochastic element. A high-dimensional stability matrix is derived in the mean-field approximation to the stochastic dynamics. This allows us to determine the stability spectrum about the observed quasi-stable configurations. From the overlap of the instantaneous configuration vector of the full stochastic system with the eigenvectors of the unstable directions of the deterministic mean-field approximation, we are able to construct a good early-warning indicator of the transitions occurring intermittently. (paper)
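    In the simplest setting, the early-warning idea reduces to tracking the largest real part of the stability matrix's eigenvalues: while it stays negative the quasi-stable configuration persists, and as it approaches zero a transition becomes imminent. A minimal two-dimensional sketch of that indicator (the model's actual mean-field matrix is high-dimensional, so this is purely illustrative):

```python
import cmath

def eig2(m):
    """Eigenvalues of a 2x2 stability matrix from its trace and determinant."""
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def early_warning(m):
    """Largest real part of the spectrum; crossing zero signals a transition."""
    return max(ev.real for ev in eig2(m))
```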

  20. Interface between path and orbital angular momentum entanglement for high-dimensional photonic quantum information.

    Science.gov (United States)

    Fickler, Robert; Lapkiewicz, Radek; Huber, Marcus; Lavery, Martin P J; Padgett, Miles J; Zeilinger, Anton

    2014-07-30

    Photonics has become a mature field of quantum information science, where integrated optical circuits offer a way to scale the complexity of the set-up as well as the dimensionality of the quantum state. On photonic chips, paths are the natural way to encode information. To distribute those high-dimensional quantum states over large distances, transverse spatial modes, like orbital angular momentum possessing Laguerre Gauss modes, are favourable as flying information carriers. Here we demonstrate a quantum interface between these two vibrant photonic fields. We create three-dimensional path entanglement between two photons in a nonlinear crystal and use a mode sorter as the quantum interface to transfer the entanglement to the orbital angular momentum degree of freedom. Thus our results show a flexible way to create high-dimensional spatial mode entanglement. Moreover, they pave the way to implement broad complex quantum networks where high-dimensionally entangled states could be distributed over distant photonic chips.

  1. Universe

    CERN Document Server

    2009-01-01

    The Universe, is one book in the Britannica Illustrated Science Library Series that is correlated to the science curriculum in grades 5-8. The Britannica Illustrated Science Library is a visually compelling set that covers earth science, life science, and physical science in 16 volumes.  Created for ages 10 and up, each volume provides an overview on a subject and thoroughly explains it through detailed and powerful graphics-more than 1,000 per volume-that turn complex subjects into information that students can grasp.  Each volume contains a glossary with full definitions for vocabulary help and an index.

  2. Bringing the Digital World to Students: Partnering with the University Communications Office to Provide Social Media Experiential Learning Projects

    Science.gov (United States)

    Childers, Courtney C.; Levenshus, Abbey B.

    2016-01-01

    The Accrediting Council on Education in Journalism and Mass Communications recognizes the importance of a curriculum that prepares students "to apply current tools and technologies appropriate for the communications professions in which they work, and to understand the digital world" (ACEJMC, n.d.). Infusing experiential learning into…

  3. Scalable Clustering of High-Dimensional Data Technique Using SPCM with Ant Colony Optimization Intelligence

    Directory of Open Access Journals (Sweden)

    Thenmozhi Srinivasan

    2015-01-01

    Full Text Available Techniques for clustering high-dimensional data are emerging in response to the challenges posed by noisy, poor-quality data. This paper develops a method that clusters data using similarity-based PCM (SPCM) with ant colony optimization intelligence, which is effective in clustering nonspatial data without requiring the user to supply the number of clusters. The PCM is made similarity-based by combining it with the mountain method. Although this clustering is already efficient, it is further optimized using an ant colony algorithm with swarm intelligence. A scalable clustering technique is thus obtained, and the evaluation results are verified on synthetic datasets.

  4. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn [School of Information Science and Technology, ShanghaiTech University, Shanghai 200031 (China); Lin, Guang, E-mail: guanglin@purdue.edu [Department of Mathematics & School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.
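The core ANOVA idea of decomposing a high-dimensional input dependence into unions of low-dimensional terms can be illustrated with a toy anchored-ANOVA expansion (the anchor point and test function are assumptions for illustration, not the paper's reduced-basis implementation):

```python
import numpy as np

# Anchored-ANOVA sketch: expand f into a constant, two univariate terms, and a
# remainder. For an additive f the first-order expansion is already exact.
f = lambda y1, y2: np.exp(y1) + np.sin(y2)     # assumed additive test function
c = np.array([0.5, 0.5])                       # assumed anchor point

f0 = f(*c)                                     # zeroth-order (constant) term
f1 = lambda y1: f(y1, c[1]) - f0               # first-order term in y1
f2 = lambda y2: f(c[0], y2) - f0               # first-order term in y2

y = np.random.default_rng(4).uniform(0, 1, (100, 2))
approx = f0 + f1(y[:, 0]) + f2(y[:, 1])
err = np.abs(f(y[:, 0], y[:, 1]) - approx).max()
```

In general the remainder collects higher-order interaction terms; adaptive ANOVA truncation aims to keep those small, which is what reduces the effective dimension before the reduced-basis stage.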

  5. The validation and assessment of machine learning: a game of prediction from high-dimensional data

    DEFF Research Database (Denmark)

    Pers, Tune Hannes; Albrechtsen, A; Holst, C

    2009-01-01

    In applied statistics, tools from machine learning are popular for analyzing complex and high-dimensional data. However, few theoretical results are available that could guide to the appropriate machine learning tool in a new application. Initial development of an overall strategy thus often...... the ideas, the game is applied to data from the Nugenob Study where the aim is to predict the fat oxidation capacity based on conventional factors and high-dimensional metabolomics data. Three players have chosen to use support vector machines, LASSO, and random forests, respectively....

  6. Digital radiography

    International Nuclear Information System (INIS)

    Rath, M.; Lissner, J.; Rienmueller, R.; Haendle, J.; Siemens A.G., Erlangen

    1984-01-01

    Using a prototype of an electronic, universal examination unit equipped with a special X-ray TV installation, spotfilm exposures and digital angiographies with high spatial resolution and wide-range contrast could be made in the clinic for the first time. With transvenous contrast medium injection, the clinical results of digital angiography show excellent image quality in the region of the carotids and renal arteries as well as the arteries of the extremities. The electronic series exposures have an image quality almost comparable to the quality obtained with cutfilm changers in conventional angiography. There are certain limitations due to the input field of the 25 cm X-ray image intensifier used. In respect of the digital angiography imaging technique, the electronic universal unit is fully suitable for clinical application. (orig.) [de

  7. Nativos digitales y nuevas tecnologias: implantación en la universidad / Digital natives and new technologies: implementation in the university

    Directory of Open Access Journals (Sweden)

    Patricia Nuñes Gómez

    2011-04-01

    Full Text Available This article is part of the line of research that the gruposocmedia group (www.gruposocmedia.es) has been developing for several years on digital natives and their relationship with new technologies. The research investigates how young people construct social reality through open digital services and content. The article describes the implementation of the knowledge gained from the group's research in practical classes at the university during the 2009-2010 academic year, transferring the findings to teaching at the School of Information Sciences of the Universidad Complutense.

  8. EDUCATING THE PEOPLE AS A DIGITAL PHOTOGRAPHER AND CAMERA OPERATOR VIA OPEN EDUCATION SYSTEM STUDIES FROM TURKEY: Anadolu University Open Education Faculty Case

    Directory of Open Access Journals (Sweden)

    Huseyin ERYILMAZ

    2010-04-01

    Full Text Available Today, photography and the visual arts are very important in modern life. For mass communication in particular, visual images and the visual arts carry great weight: people in modern societies need knowledge about visual materials such as photographs, cartoons, drawings and typography. In short, people need education in visual literacy. Most people today own a digital still or video camera, but it is not possible to provide visual literacy education to all of them through the classic school system, so camera users need another teaching medium for learning to use their cameras effectively. Many turn to websites as an information source, but, as is well known, not every website provides correct information; there are many mistakes and much false information online. For these reasons, Anadolu University Open Education Faculty started a new programme in 2009 to educate people as digital photographers and camera operators, and this programme has considerable importance as a case study. The language of photography and digital technology is English, and not all camera users understand English; thanks to this programme, many camera users, and especially people working as operators in studios, will learn about photography, digital technology and camera systems, as well as about composition and the history of the visual image. The programme is therefore especially important for developing countries. This paper discusses the subject.

  9. An irregular grid approach for pricing high-dimensional American options

    NARCIS (Netherlands)

    Berridge, S.J.; Schumacher, J.M.

    2008-01-01

    We propose and test a new method for pricing American options in a high-dimensional setting. The method is centered around the approximation of the associated complementarity problem on an irregular grid. We approximate the partial differential operator on this grid by appealing to the SDE

  10. Can We Train Machine Learning Methods to Outperform the High-dimensional Propensity Score Algorithm?

    Science.gov (United States)

    Karim, Mohammad Ehsanul; Pang, Menglan; Platt, Robert W

    2018-03-01

    The use of retrospective health care claims datasets is frequently criticized for the lack of complete information on potential confounders. Utilizing patients' health status-related information from claims datasets as surrogates or proxies for mismeasured and unobserved confounders, the high-dimensional propensity score algorithm enables us to reduce bias. Using a previously published cohort study of post-myocardial infarction statin use (1998-2012), we compare the performance of the algorithm with a number of popular machine learning approaches for confounder selection in high-dimensional covariate spaces: random forest, least absolute shrinkage and selection operator, and elastic net. Our results suggest that, when the data analysis is done with epidemiologic principles in mind, machine learning methods perform as well as the high-dimensional propensity score algorithm. Using a plasmode framework that mimicked the empirical data, we also showed that a hybrid of machine learning and high-dimensional propensity score algorithms generally performs slightly better than both in terms of mean squared error, when a bias-based analysis is used.
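The prioritisation step behind such confounder-selection comparisons can be sketched on simulated claims-style data (a simplification: the ranking below uses only the treatment association, whereas the actual high-dimensional propensity score algorithm scores proxies with the Bross bias multiplier, combining treatment and outcome associations):

```python
import numpy as np

# Simplified hdPS-style covariate prioritisation on simulated binary proxies:
# rank claims-code covariates by their apparent association with treatment.
rng = np.random.default_rng(1)
n, p = 1000, 200
X = rng.binomial(1, 0.3, size=(n, p))          # binary claims-code proxies
treat = rng.binomial(1, 0.2 + 0.3 * X[:, 0])   # treatment depends on proxy 0

p1 = X[treat == 1].mean(axis=0)                # P(code present | treated)
p0 = X[treat == 0].mean(axis=0)                # P(code present | untreated)
rr = (p1 + 1e-9) / (p0 + 1e-9)                 # apparent relative risk per code
score = np.abs(np.log(rr))                     # strength of association
top = np.argsort(score)[::-1][:20]             # top-20 covariates for the PS model
```

The machine learning alternatives compared in the study (lasso, elastic net, random forest) replace this univariate ranking with joint, regularised selection over the same proxy space.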

  11. Reconstruction of high-dimensional states entangled in orbital angular momentum using mutually unbiased measurements

    CSIR Research Space (South Africa)

    Giovannini, D

    2013-06-01

    Full Text Available QELS_Fundamental Science, San Jose, California, United States, 9-14 June 2013. Reconstruction of High-Dimensional States Entangled in Orbital Angular Momentum Using Mutually Unbiased Measurements: D. Giovannini, J. Romero, J. Leach, et al.

  12. Global communication schemes for the numerical solution of high-dimensional PDEs

    DEFF Research Database (Denmark)

    Hupp, Philipp; Heene, Mario; Jacob, Riko

    2016-01-01

    The numerical treatment of high-dimensional partial differential equations is among the most compute-hungry problems and in urgent need for current and future high-performance computing (HPC) systems. It is thus also facing the grand challenges of exascale computing such as the requirement...

  13. High-Dimensional Intrinsic Interpolation Using Gaussian Process Regression and Diffusion Maps

    International Nuclear Information System (INIS)

    Thimmisetty, Charanraj A.; Ghanem, Roger G.; White, Joshua A.; Chen, Xiao

    2017-01-01

    This article considers the challenging task of estimating geologic properties of interest using a suite of proxy measurements. The current work recasts this task as a manifold learning problem. In the process, it introduces a novel regression procedure for intrinsic variables constrained onto a manifold embedded in an ambient space. The procedure is meant to sharpen high-dimensional interpolation by inferring non-linear correlations from the data being interpolated. The proposed approach augments manifold learning procedures with a Gaussian process regression. It first identifies, using diffusion maps, a low-dimensional manifold embedded in an ambient high-dimensional space associated with the data. It relies on the diffusion distance associated with this construction to define a distance function with which the data model is equipped. This distance metric function is then used to compute the correlation structure of a Gaussian process that describes the statistical dependence of quantities of interest in the high-dimensional ambient space. The proposed method is applicable to arbitrarily high-dimensional data sets. Here, it is applied to subsurface characterization using a suite of well log measurements. The predictions obtained in original, principal component, and diffusion space are compared using both qualitative and quantitative metrics. Considerable improvement in the prediction of the geological structural properties is observed with the proposed method.
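The first stage of the approach, identifying a low-dimensional manifold with diffusion maps, can be sketched as follows (the kernel bandwidth and synthetic data are illustrative assumptions; the article's subsurface application and the Gaussian-process regression stage are omitted):

```python
import numpy as np

# Minimal diffusion-map sketch: embed noisy high-dimensional samples of a
# one-dimensional manifold (a circle) into leading diffusion coordinates.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 2 * np.pi, 200))
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(200, 2))
X = np.hstack([X, np.zeros((200, 18))])        # pad to an ambient 20-D space

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / 0.1)                          # Gaussian affinity (assumed bandwidth)
P = K / K.sum(axis=1, keepdims=True)           # row-stochastic diffusion operator
evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)
psi = evecs.real[:, order[1:3]]                # first nontrivial diffusion coordinates
```

The diffusion distance computed from such coordinates is what the article then uses as the metric for the Gaussian-process correlation structure.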

  14. Finding and Visualizing Relevant Subspaces for Clustering High-Dimensional Astronomical Data Using Connected Morphological Operators

    NARCIS (Netherlands)

    Ferdosi, Bilkis J.; Buddelmeijer, Hugo; Trager, Scott; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    2010-01-01

    Data sets in astronomy are growing to enormous sizes. Modern astronomical surveys provide not only image data but also catalogues of millions of objects (stars, galaxies), each object with hundreds of associated parameters. Exploration of this very high-dimensional data space poses a huge challenge.

  15. High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm

    Science.gov (United States)

    Cai, Li

    2010-01-01

    A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…

  16. Estimating the effect of a variable in a high-dimensional regression model

    DEFF Research Database (Denmark)

    Jensen, Peter Sandholt; Wurtz, Allan

    assume that the effect is identified in a high-dimensional linear model specified by unconditional moment restrictions. We consider  properties of the following methods, which rely on lowdimensional models to infer the effect: Extreme bounds analysis, the minimum t-statistic over models, Sala...

  17. Multi-Scale Factor Analysis of High-Dimensional Brain Signals

    KAUST Repository

    Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain

    2017-01-01

    In this paper, we develop an approach to modeling high-dimensional networks with a large number of nodes arranged in a hierarchical and modular structure. We propose a novel multi-scale factor analysis (MSFA) model which partitions the massive

  18. Spectrally-Corrected Estimation for High-Dimensional Markowitz Mean-Variance Optimization

    NARCIS (Netherlands)

    Z. Bai (Zhidong); H. Li (Hua); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2016-01-01

    This paper considers the portfolio problem for high dimensional data when the dimension and size are both large. We analyze the traditional Markowitz mean-variance (MV) portfolio by large dimension matrix theory, and find the spectral distribution of the sample covariance is the main

  19. Using Localised Quadratic Functions on an Irregular Grid for Pricing High-Dimensional American Options

    NARCIS (Netherlands)

    Berridge, S.J.; Schumacher, J.M.

    2004-01-01

    We propose a method for pricing high-dimensional American options on an irregular grid; the method involves using quadratic functions to approximate the local effect of the Black-Scholes operator. Once such an approximation is known, one can solve the pricing problem by time stepping in an explicit

  20. Multigrid for high dimensional elliptic partial differential equations on non-equidistant grids

    NARCIS (Netherlands)

    bin Zubair, H.; Oosterlee, C.E.; Wienands, R.

    2006-01-01

    This work presents techniques, theory and numbers for multigrid in a general d-dimensional setting. The main focus is the multigrid convergence for high-dimensional partial differential equations (PDEs). As a model problem we have chosen the anisotropic diffusion equation, on a unit hypercube. We

  1. An Irregular Grid Approach for Pricing High-Dimensional American Options

    NARCIS (Netherlands)

    Berridge, S.J.; Schumacher, J.M.

    2004-01-01

    We propose and test a new method for pricing American options in a high-dimensional setting. The method is centred around the approximation of the associated complementarity problem on an irregular grid. We approximate the partial differential operator on this grid by appealing to the SDE

  2. Pricing and hedging high-dimensional American options : an irregular grid approach

    NARCIS (Netherlands)

    Berridge, S.; Schumacher, H.

    2002-01-01

    We propose and test a new method for pricing American options in a high dimensional setting. The method is centred around the approximation of the associated variational inequality on an irregular grid. We approximate the partial differential operator on this grid by appealing to the SDE

  3. THE ROLE OF DIGITAL MARKETING IN UNIVERSITY SPORT: AN OVERVIEW STUDY OF HIGHER EDUCATION INSTITUTION IN CROATIA

    OpenAIRE

    Biloš, Antun; Galić, Tvrtko

    2016-01-01

    The importance of student sport activities within the structure of academic development is arguably significant. However, university sport is one of the elements of academic development that is not represented adequately as a research subject on a global scale in both scientific and professional environments alike. Along with the global growth of university level education based on the rise of student mobility across countries and continents, and the strong global ICT development, a new persp...

  4. Effective Communication to Aid Collaboration for Digital Collections: A Case Study at Florida Gulf Coast University Library

    Science.gov (United States)

    VandeBurgt, Melissa Minds; Rivera, Kaleena

    2016-01-01

    Effective communication is one of the most important resources for successful outreach efforts. This article addresses the benefits that can emerge from successful communication as well as the negative effects that may stem from ineffective communication. A case study of Florida Gulf Coast University Archives, Special Collections, & Digital…

  5. Rhodes University

    African Journals Online (AJOL)

    Samridhi Sharma

    2013-10-29


  6. Bit-Table Based Biclustering and Frequent Closed Itemset Mining in High-Dimensional Binary Data

    Directory of Open Access Journals (Sweden)

    András Király

    2014-01-01

    Full Text Available During the last decade various algorithms have been developed and proposed for discovering overlapping clusters in high-dimensional data. The two most prominent application fields in this research, proposed independently, are frequent itemset mining (developed for market basket data and biclustering (applied to gene expression data analysis. The common limitation of both methodologies is their limited applicability to very large binary data sets. In this paper we propose a novel and efficient method to find both frequent closed itemsets and biclusters in high-dimensional binary data. The method is based on simple but very powerful matrix and vector multiplication approaches that ensure that all patterns can be discovered in a fast manner. The proposed algorithm has been implemented in the commonly used MATLAB environment and is freely available to researchers.
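The bit-vector core of such methods can be sketched in a few lines (toy data; the cited algorithm adds closedness checks, the biclustering extension, and its MATLAB specifics):

```python
import numpy as np

# Bit-table idea: represent each item (column) as a Boolean vector over
# transactions; the support of an itemset is the popcount of the AND of its
# columns, so candidate patterns are screened with fast vector operations.
B = np.array([[1, 1, 0, 1],
              [1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 0]], dtype=bool)      # 5 transactions x 4 items

def support(itemset):
    """Number of transactions containing every item in `itemset`."""
    return int(np.logical_and.reduce(B[:, itemset], axis=1).sum())

s12 = support([0, 1])    # transactions containing both item 0 and item 1
```

Here `s12` is 3, since items 0 and 1 co-occur in transactions 0, 1 and 3; frequent-itemset mining repeats this screening over candidate itemsets, keeping those above a support threshold.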

  7. Characterization of discontinuities in high-dimensional stochastic problems on adaptive sparse grids

    International Nuclear Information System (INIS)

    Jakeman, John D.; Archibald, Richard; Xiu Dongbin

    2011-01-01

    In this paper we present a set of efficient algorithms for detection and identification of discontinuities in high dimensional space. The method is based on extension of polynomial annihilation for discontinuity detection in low dimensions. Compared to the earlier work, the present method offers significant improvements for high dimensional problems. The core of the algorithms relies on adaptive refinement of sparse grids. It is demonstrated that in the commonly encountered cases where a discontinuity resides on a small subset of the dimensions, the present method becomes 'optimal', in the sense that the total number of points required for function evaluations depends linearly on the dimensionality of the space. The details of the algorithms will be presented and various numerical examples are utilized to demonstrate the efficacy of the method.
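The annihilation idea behind the method can be seen in one dimension (a deliberately simplified sketch; the paper's algorithms operate on adaptive sparse grids in high dimensions): finite differences annihilate smooth low-degree behaviour, so a large residual flags a jump.

```python
import numpy as np

# Toy 1-D discontinuity detection: second differences annihilate linear trends,
# so the jump location stands out in the difference stencil.
x = np.linspace(0, 1, 101)
f = np.where(x < 0.5, np.sin(2 * np.pi * x), 2 + np.sin(2 * np.pi * x))

d2 = np.abs(f[2:] - 2 * f[1:-1] + f[:-2])      # second-difference residual
jump_idx = int(np.argmax(d2)) + 1              # grid index of the flagged cell
```

In higher dimensions the same annihilation stencils are applied along candidate directions, and the sparse grid is refined adaptively only near the flagged cells.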

  8. Non-intrusive low-rank separated approximation of high-dimensional stochastic models

    KAUST Repository

    Doostan, Alireza; Validi, AbdoulAhad; Iaccarino, Gianluca

    2013-01-01

    This work proposes a sampling-based (non-intrusive) approach within the context of low-rank separated representations to tackle the issue of curse-of-dimensionality associated with the solution of models, e.g., PDEs/ODEs, with high-dimensional random inputs. Under some conditions discussed in detail, the number of random realizations of the solution, required for a successful approximation, grows linearly with respect to the number of random inputs. The construction of the separated representation is achieved via a regularized alternating least-squares regression, together with an error indicator to estimate model parameters. The computational complexity of such a construction is quadratic in the number of random inputs. The performance of the method is investigated through its application to three numerical examples including two ODE problems with high-dimensional random inputs. © 2013 Elsevier B.V.
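The alternating-least-squares construction of a separated representation can be illustrated on a toy rank-1 example (the tensor-grid sampling and test function are assumptions for illustration; the paper uses regularised regression on random samples of many inputs):

```python
import numpy as np

# ALS sketch: fit a separated (rank-1) representation u(y1) * v(y2) to samples
# of f(y1, y2) = exp(y1 + y2), which is exactly separable as exp(y1) * exp(y2).
y = np.linspace(0, 1, 50)
F = np.exp(y[:, None] + y[None, :])            # f sampled on a tensor grid

u = np.ones(50)
v = np.ones(50)
for _ in range(20):                            # alternating least-squares updates
    u = F @ v / (v @ v)                        # best u for fixed v
    v = F.T @ u / (u @ u)                      # best v for fixed u

err = np.abs(F - np.outer(u, v)).max()         # separated-representation error
```

For genuinely high-dimensional inputs the same alternation runs over one factor per input dimension, which is what keeps the sample and computational cost polynomial in the number of random inputs.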

  9. Non-intrusive low-rank separated approximation of high-dimensional stochastic models

    KAUST Repository

    Doostan, Alireza

    2013-08-01

    This work proposes a sampling-based (non-intrusive) approach within the context of low-rank separated representations to tackle the issue of curse-of-dimensionality associated with the solution of models, e.g., PDEs/ODEs, with high-dimensional random inputs. Under some conditions discussed in detail, the number of random realizations of the solution, required for a successful approximation, grows linearly with respect to the number of random inputs. The construction of the separated representation is achieved via a regularized alternating least-squares regression, together with an error indicator to estimate model parameters. The computational complexity of such a construction is quadratic in the number of random inputs. The performance of the method is investigated through its application to three numerical examples including two ODE problems with high-dimensional random inputs. © 2013 Elsevier B.V.

  10. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  11. Single cell proteomics in biomedicine: High-dimensional data acquisition, visualization, and analysis.

    Science.gov (United States)

    Su, Yapeng; Shi, Qihui; Wei, Wei

    2017-02-01

    New insights into cellular heterogeneity over the last decade have provoked the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features, and limitations of the analytical methods with the designated biological questions they seek to answer will be discussed. Particular attention will be given to those information theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. A Shell Multi-dimensional Hierarchical Cubing Approach for High-Dimensional Cube

    Science.gov (United States)

    Zou, Shuzhi; Zhao, Li; Hu, Kongfa

    The pre-computation of data cubes is critical for improving the response time of OLAP systems and accelerating data mining tasks in large data warehouses. However, as the sizes of data warehouses grow, the time it takes to perform this pre-computation becomes a significant performance bottleneck. In a high dimensional data warehouse, it might not be practical to build all these cuboids and their indices. In this paper, we propose a shell multi-dimensional hierarchical cubing algorithm, based on an extension of the previous minimal cubing approach. This method partitions the high-dimensional data cube into low-dimensional hierarchical cubes. Experimental results show that the proposed method is significantly more efficient than other existing cubing methods.

  13. Minimax Rate-optimal Estimation of High-dimensional Covariance Matrices with Incomplete Data.

    Science.gov (United States)

    Cai, T Tony; Zhang, Anru

    2016-09-01

    Missing data occur frequently in a wide range of applications. In this paper, we consider estimation of high-dimensional covariance matrices in the presence of missing observations under a general missing completely at random model in the sense that the missingness is not dependent on the values of the data. Based on incomplete data, estimators for bandable and sparse covariance matrices are proposed and their theoretical and numerical properties are investigated. Minimax rates of convergence are established under the spectral norm loss and the proposed estimators are shown to be rate-optimal under mild regularity conditions. Simulation studies demonstrate that the estimators perform well numerically. The methods are also illustrated through an application to data from four ovarian cancer studies. The key technical tools developed in this paper are of independent interest and potentially useful for a range of related problems in high-dimensional statistical inference with missing data.
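A minimal sketch of covariance estimation from incomplete data under the missing-completely-at-random model described above (the entrywise normalisation by pairwise observation counts is a standard building block; the paper's banding/thresholding steps and constants are omitted, and means are taken as zero for brevity):

```python
import numpy as np

# Pairwise-complete covariance estimate: each entry (j, k) is averaged only
# over samples where both coordinates were observed.
rng = np.random.default_rng(3)
n, p = 500, 10
Z = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)
mask = rng.random((n, p)) < 0.8                # each entry observed w.p. 0.8 (MCAR)
X = np.where(mask, Z, np.nan)                  # incomplete data matrix

obs = mask.astype(float)
counts = obs.T @ obs                           # pairwise observation counts
X0 = np.nan_to_num(X)                          # zero-fill so missing terms drop out
S = (X0.T @ X0) / counts                       # entrywise-normalised covariance
```

For bandable or sparse targets, the estimators studied in the paper then band or threshold such a generalized sample covariance to attain the minimax rates under the spectral norm.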

  14. Minimax Rate-optimal Estimation of High-dimensional Covariance Matrices with Incomplete Data*

    Science.gov (United States)

    Cai, T. Tony; Zhang, Anru

    2016-01-01

    Missing data occur frequently in a wide range of applications. In this paper, we consider estimation of high-dimensional covariance matrices in the presence of missing observations under a general missing completely at random model in the sense that the missingness is not dependent on the values of the data. Based on incomplete data, estimators for bandable and sparse covariance matrices are proposed and their theoretical and numerical properties are investigated. Minimax rates of convergence are established under the spectral norm loss and the proposed estimators are shown to be rate-optimal under mild regularity conditions. Simulation studies demonstrate that the estimators perform well numerically. The methods are also illustrated through an application to data from four ovarian cancer studies. The key technical tools developed in this paper are of independent interest and potentially useful for a range of related problems in high-dimensional statistical inference with missing data. PMID:27777471

  15. Distribution of high-dimensional entanglement via an intra-city free-space link.

    Science.gov (United States)

    Steinlechner, Fabian; Ecker, Sebastian; Fink, Matthias; Liu, Bo; Bavaresco, Jessica; Huber, Marcus; Scheidl, Thomas; Ursin, Rupert

    2017-07-24

    Quantum entanglement is a fundamental resource in quantum information processing and its distribution between distant parties is a key challenge in quantum communications. Increasing the dimensionality of entanglement has been shown to improve robustness and channel capacities in secure quantum communications. Here we report on the distribution of genuine high-dimensional entanglement via a 1.2-km-long free-space link across Vienna. We exploit hyperentanglement, that is, simultaneous entanglement in polarization and energy-time bases, to encode quantum information, and observe high-visibility interference for successive correlation measurements in each degree of freedom. These visibilities impose lower bounds on entanglement in each subspace individually and certify four-dimensional entanglement for the hyperentangled system. The high-fidelity transmission of high-dimensional entanglement under real-world atmospheric link conditions represents an important step towards long-distance quantum communications with more complex quantum systems and the implementation of advanced quantum experiments with satellite links.

  16. An Unbiased Distance-based Outlier Detection Approach for High-dimensional Data

    DEFF Research Database (Denmark)

    Nguyen, Hoang Vu; Gopalkrishnan, Vivekanand; Assent, Ira

    2011-01-01

    than a global property. Different from existing approaches, it is not grid-based and dimensionality unbiased. Thus, its performance is impervious to grid resolution as well as the curse of dimensionality. In addition, our approach ranks the outliers, allowing users to select the number of desired...... outliers, thus mitigating the issue of high false alarm rate. Extensive empirical studies on real datasets show that our approach efficiently and effectively detects outliers, even in high-dimensional spaces....

  17. Controlling chaos in low and high dimensional systems with periodic parametric perturbations

    International Nuclear Information System (INIS)

    Mirus, K.A.; Sprott, J.C.

    1998-06-01

    The effect of applying a periodic perturbation to an accessible parameter of various chaotic systems is examined. Numerical results indicate that perturbation frequencies near the natural frequencies of the unstable periodic orbits of the chaotic systems can result in limit cycles for relatively small perturbations. Such perturbations can also control or significantly reduce the dimension of high-dimensional systems. Initial application to the control of fluctuations in a prototypical magnetic fusion plasma device will be reviewed
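The kind of numerical experiment described above can be sketched with the logistic map (illustrative parameter values, not the paper's systems, and no particular control outcome is asserted):

```python
import numpy as np

# Exploratory sketch: drive the logistic map's parameter with a small period-2
# perturbation and compare the spread of late-time orbits with and without the
# drive. Scanning the amplitude and period this way is a simple probe for
# perturbation-induced limit cycles near unstable-periodic-orbit frequencies.
def late_orbit(eps, n_steps=4000, keep=200):
    x = 0.4
    xs = []
    for i in range(n_steps):
        r = 3.9 + eps * (-1) ** i            # period-2 parametric perturbation
        x = r * x * (1.0 - x)
        xs.append(x)
    return np.array(xs[-keep:])

spread_free = np.ptp(late_orbit(0.0))        # spread of the unperturbed attractor
spread_driven = np.ptp(late_orbit(0.1))      # spread with the period-2 drive
```

A collapse of the late-time spread onto a few values as the perturbation amplitude or period is varied would indicate a limit cycle, which is the signature the paper looks for numerically.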

  18. A Comparison of Machine Learning Methods in a High-Dimensional Classification Problem

    OpenAIRE

    Zekić-Sušac, Marijana; Pfeifer, Sanja; Šarlija, Nataša

    2014-01-01

    Background: Large-dimensional data modelling often relies on variable reduction methods in the pre-processing and in the post-processing stage. However, such a reduction usually provides less information and yields a lower accuracy of the model. Objectives: The aim of this paper is to assess the high-dimensional classification problem of recognizing entrepreneurial intentions of students by machine learning methods. Methods/Approach: Four methods were tested: artificial neural networks, CART ...

  19. GAMLSS for high-dimensional data – a flexible approach based on boosting

    OpenAIRE

    Mayr, Andreas; Fenske, Nora; Hofner, Benjamin; Kneib, Thomas; Schmid, Matthias

    2010-01-01

    Generalized additive models for location, scale and shape (GAMLSS) are a popular semi-parametric modelling approach that, in contrast to conventional GAMs, regress not only the expected mean but every distribution parameter (e.g. location, scale and shape) to a set of covariates. Current fitting procedures for GAMLSS are infeasible for high-dimensional data setups and require variable selection based on (potentially problematic) information criteria. The present work describes a boosting algo...

  20. Preface [HD3-2015: International meeting on high-dimensional data-driven science

    International Nuclear Information System (INIS)

    2016-01-01

A never-ending series of innovations in measurement technology and evolutions in information and communication technologies have led to the ongoing generation and accumulation of large quantities of high-dimensional data every day. While detailed data-centric approaches have been pursued in respective research fields, situations have been encountered where the same mathematical framework of high-dimensional data analysis can be found in a wide variety of seemingly unrelated research fields, such as estimation on the basis of undersampled Fourier transform in nuclear magnetic resonance spectroscopy in chemistry, in magnetic resonance imaging in medicine, and in astronomical interferometry in astronomy. In such situations, bringing diverse viewpoints together therefore becomes a driving force for the creation of innovative developments in various different research fields. This meeting focuses on “Sparse Modeling” (SpM) as a methodology for creation of innovative developments through the incorporation of a wide variety of viewpoints in various research fields. The objective of this meeting is to offer a forum where researchers with interest in SpM can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies for High-Dimensional Data-Driven science (HD³). The meeting was held in Kyoto from 14-17 December 2015. We are pleased to publish 22 papers contributed by invited speakers in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of High-Dimensional Data-Driven science.

  1. Reinforcement learning on slow features of high-dimensional input streams.

    Directory of Open Access Journals (Sweden)

    Robert Legenstein

    Full Text Available Humans and animals are able to learn complex behaviors based on a massive stream of sensory information from different modalities. Early animal studies have identified learning mechanisms that are based on reward and punishment such that animals tend to avoid actions that lead to punishment whereas rewarded actions are reinforced. However, most algorithms for reward-based learning are only applicable if the dimensionality of the state-space is sufficiently small or its structure is sufficiently simple. Therefore, the question arises how the problem of learning on high-dimensional data is solved in the brain. In this article, we propose a biologically plausible generic two-stage learning system that can directly be applied to raw high-dimensional input streams. The system is composed of a hierarchical slow feature analysis (SFA network for preprocessing and a simple neural network on top that is trained based on rewards. We demonstrate by computer simulations that this generic architecture is able to learn quite demanding reinforcement learning tasks on high-dimensional visual input streams in a time that is comparable to the time needed when an explicit highly informative low-dimensional state-space representation is given instead of the high-dimensional visual input. The learning speed of the proposed architecture in a task similar to the Morris water maze task is comparable to that found in experimental studies with rats. This study thus supports the hypothesis that slowness learning is one important unsupervised learning principle utilized in the brain to form efficient state representations for behavioral learning.
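
The preprocessing stage rests on slow feature analysis (SFA). A minimal linear SFA can be sketched in a few lines: whiten the signal, then keep the directions whose temporal derivative has the least variance. The toy signal below is an assumption for illustration; the paper uses a hierarchical nonlinear SFA network on visual input streams.

```python
import numpy as np

def slow_features(X, n_out=1):
    """Linear slow feature analysis on a multivariate time series X (T x d):
    extract the most slowly varying unit-variance linear projections."""
    Xc = X - X.mean(0)
    # Whiten the signal so all projections have unit variance.
    evals, evecs = np.linalg.eigh(np.cov(Xc.T))
    W = evecs / np.sqrt(evals)            # whitening matrix (d x d)
    Z = Xc @ W
    # Slowest directions minimize the variance of the time derivative.
    dZ = np.diff(Z, axis=0)
    dvals, dvecs = np.linalg.eigh(np.cov(dZ.T))
    return Z @ dvecs[:, :n_out]           # slowest feature first

# A slow sine hidden among much faster oscillations.
t = np.linspace(0, 4 * np.pi, 500)
X = np.column_stack([np.sin(t) + 0.1 * np.sin(40 * t),
                     np.sin(37 * t), np.sin(23 * t)])
s = slow_features(X, n_out=1)[:, 0]
```

On this toy mixture the slowest extracted feature essentially recovers the slow sine, which is the kind of compact state representation the reinforcement learner is then trained on.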

  2. Dissecting high-dimensional phenotypes with bayesian sparse factor analysis of genetic covariance matrices.

    Science.gov (United States)

    Runcie, Daniel E; Mukherjee, Sayan

    2013-07-01

    Quantitative genetic studies that model complex, multivariate phenotypes are important for both evolutionary prediction and artificial selection. For example, changes in gene expression can provide insight into developmental and physiological mechanisms that link genotype and phenotype. However, classical analytical techniques are poorly suited to quantitative genetic studies of gene expression where the number of traits assayed per individual can reach many thousand. Here, we derive a Bayesian genetic sparse factor model for estimating the genetic covariance matrix (G-matrix) of high-dimensional traits, such as gene expression, in a mixed-effects model. The key idea of our model is that we need consider only G-matrices that are biologically plausible. An organism's entire phenotype is the result of processes that are modular and have limited complexity. This implies that the G-matrix will be highly structured. In particular, we assume that a limited number of intermediate traits (or factors, e.g., variations in development or physiology) control the variation in the high-dimensional phenotype, and that each of these intermediate traits is sparse - affecting only a few observed traits. The advantages of this approach are twofold. First, sparse factors are interpretable and provide biological insight into mechanisms underlying the genetic architecture. Second, enforcing sparsity helps prevent sampling errors from swamping out the true signal in high-dimensional data. We demonstrate the advantages of our model on simulated data and in an analysis of a published Drosophila melanogaster gene expression data set.
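
The modelling idea, a limited number of sparse factors generating a highly structured G-matrix, can be written as G = Lambda Lambda^T with a sparse loading matrix Lambda. The generative sketch below uses illustrative sizes and omits the authors' Bayesian sampler entirely.

```python
import numpy as np

rng = np.random.default_rng(2)
p, k, n = 200, 3, 100              # traits, sparse factors, individuals

# Sparse loading matrix: each factor affects only a few observed traits.
Lam = np.zeros((p, k))
for j in range(k):
    idx = rng.choice(p, size=5, replace=False)
    Lam[idx, j] = rng.normal(size=5)

F = rng.normal(size=(n, k))        # latent factor scores per individual
Y = F @ Lam.T + 0.1 * rng.normal(size=(n, p))   # observed phenotypes

G = Lam @ Lam.T                    # implied, highly structured covariance
```

Because only k = 3 sparse factors act, the implied covariance is low-rank and interpretable, which is exactly the structural prior the paper exploits.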

  3. Hypergraph-based anomaly detection of high-dimensional co-occurrences.

    Science.gov (United States)

    Silva, Jorge; Willett, Rebecca

    2009-03-01

    This paper addresses the problem of detecting anomalous multivariate co-occurrences using a limited number of unlabeled training observations. A novel method based on using a hypergraph representation of the data is proposed to deal with this very high-dimensional problem. Hypergraphs constitute an important extension of graphs which allow edges to connect more than two vertices simultaneously. A variational Expectation-Maximization algorithm for detecting anomalies directly on the hypergraph domain without any feature selection or dimensionality reduction is presented. The resulting estimate can be used to calculate a measure of anomalousness based on the False Discovery Rate. The algorithm has O(np) computational complexity, where n is the number of training observations and p is the number of potential participants in each co-occurrence event. This efficiency makes the method ideally suited for very high-dimensional settings, and requires no tuning, bandwidth or regularization parameters. The proposed approach is validated on both high-dimensional synthetic data and the Enron email database, where p > 75,000, and it is shown that it can outperform other state-of-the-art methods.

  4. High-Dimensional Function Approximation With Neural Networks for Large Volumes of Data.

    Science.gov (United States)

    Andras, Peter

    2018-02-01

Approximation of high-dimensional functions is a challenge for neural networks due to the curse of dimensionality. Often the data for which the approximated function is defined resides on a low-dimensional manifold, and in principle the approximation of the function over this manifold should improve the approximation performance. It has been shown that projecting the data manifold into a lower dimensional space, followed by the neural network approximation of the function over this space, provides a more precise approximation of the function than the approximation of the function with neural networks in the original data space. However, if the data volume is very large, the projection into the low-dimensional space has to be based on a limited sample of the data. Here, we investigate the nature of the approximation error of neural networks trained over the projection space. We show that such neural networks should have better approximation performance than neural networks trained on high-dimensional data even if the projection is based on a relatively sparse sample of the data manifold. We also find that it is preferable to use a uniformly distributed sparse sample of the data for the purpose of the generation of the low-dimensional projection. We illustrate these results considering the practical neural network approximation of a set of functions defined on high-dimensional data, including real-world data as well.
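
The pipeline, project from a sparse uniform sample and then approximate in the low-dimensional space, can be sketched numerically. The linear manifold, the SVD-based projection, and the 1-nearest-neighbour stand-in for the neural network approximator are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# High-dimensional data lying on a 2-D manifold embedded in 50-D.
n, D, d = 2000, 50, 2
U = rng.normal(size=(D, d))
latent = rng.uniform(-1, 1, size=(n, d))
X = latent @ U.T
y = np.sin(latent[:, 0]) + latent[:, 1] ** 2     # function on the manifold

# Build the projection from a sparse uniform sample only (100 of 2000).
sample = rng.choice(n, size=100, replace=False)
Xs = X[sample] - X[sample].mean(0)
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
P = Vt[:d].T                                     # D x d projection

Z = (X - X[sample].mean(0)) @ P                  # project everything

# Cheap stand-in approximator: 1-nearest-neighbour in projected space.
test_idx = np.arange(0, n, 10)
train_idx = np.setdiff1d(np.arange(n), test_idx)
dists = ((Z[test_idx, None, :] - Z[None, train_idx, :]) ** 2).sum(-1)
pred = y[train_idx][dists.argmin(1)]
err = np.abs(pred - y[test_idx]).mean()
```

Even though the projection was fitted on only 5% of the data, approximation in the projected 2-D space remains accurate, mirroring the paper's sparse-sample argument.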

  5. Visibility of Open Acces Repositories of Digital University Libraries: A Case Study of the EU Visegrád Group

    Directory of Open Access Journals (Sweden)

    Erzsebet Dani

    2014-04-01

Full Text Available For scientific research institutions, as well as for individual scientists, the degree of accessibility of an institution's and a researcher's scientific output is of growing significance in the internet age: it is vitally important to know about, and to have easy access to, the research conducted in different fields and institutions and its results. In this study I survey the present situation concerning the homepages of leading universities of the so-called Visegrád Group inside the European Union and the extent to which this situation serves, or fails to serve, the philosophy of open access. My aim is twofold. (1) I consider whether the scientific-knowledge repositories built by universities are accessible or not, and how easy or difficult it is to access them, provided that those repositories exist at all; in spite of the fact that the Berlin Declaration is generally adopted in principle, the homepages of a good number of the surveyed Visegrád Group universities or libraries do not make their research databases easily accessible or accessible at all, or make them accessible in the given national language only. (2) …

  6. Personal Learning Environments: A proposal to develop digital competences and information in university teaching of Law in Colombia

    Directory of Open Access Journals (Sweden)

    Marcos CABEZAS GONZÁLEZ

    2017-12-01

Full Text Available The Personal Learning Environment (PLE) is one of the most interesting concepts to have arisen among teachers and software engineers in recent years, and it will have an impact in the coming years on all levels and modalities of education. The PLE is a product of several factors, including a social web embodied in tools and free-access services based on open-source technology. But a PLE is not a kind of software or platform; it is a new view of how to use technologies for learning, both in initial training and in lifelong learning. This article addresses teaching in law schools in Colombia and the absence of proposals for juridical practice that develop the professional competencies linked to non-formal and everyday learning and to personal learning environments. Our goal is to create a PLE proposal supported by Web 2.0 technologies and oriented towards encouraging lifelong learning that develops digital and informational competences in the practice of law. We strongly believe that a PLE model will help students acquire the knowledge, abilities, and experiences that allow their personal and professional development within the frame of a lifelong learning programme, contributing to goals and opportunities in an information and communication society that is in constant evolution.

  7. Development of a universal measure of quadrupedal forelimb-hindlimb coordination using digital motion capture and computerised analysis

    Directory of Open Access Journals (Sweden)

    Jeffery Nick D

    2007-09-01

    Full Text Available Abstract Background Clinical spinal cord injury in domestic dogs provides a model population in which to test the efficacy of putative therapeutic interventions for human spinal cord injury. To achieve this potential a robust method of functional analysis is required so that statistical comparison of numerical data derived from treated and control animals can be achieved. Results In this study we describe the use of digital motion capture equipment combined with mathematical analysis to derive a simple quantitative parameter – 'the mean diagonal coupling interval' – to describe coordination between forelimb and hindlimb movement. In normal dogs this parameter is independent of size, conformation, speed of walking or gait pattern. We show here that mean diagonal coupling interval is highly sensitive to alterations in forelimb-hindlimb coordination in dogs that have suffered spinal cord injury, and can be accurately quantified, but is unaffected by orthopaedic perturbations of gait. Conclusion Mean diagonal coupling interval is an easily derived, highly robust measurement that provides an ideal method to compare the functional effect of therapeutic interventions after spinal cord injury in quadrupeds.
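
The parameter can be sketched numerically if we assume it is the lag between each forelimb footfall and the paired diagonal hindlimb footfall, normalised by stride duration. This is a hypothetical simplification for illustration; the paper derives the measure from digital motion capture of walking dogs.

```python
import numpy as np

def mean_diagonal_coupling_interval(forelimb_falls, hindlimb_falls, stride):
    """Mean normalised lag between each forelimb footfall and the paired
    diagonal hindlimb footfall, as a fraction of the stride cycle."""
    lags = (np.asarray(hindlimb_falls) - np.asarray(forelimb_falls)) / stride
    return float(np.mean(lags % 1.0))

# Synthetic regular gait: 0.5 s stride, diagonal hindlimb lands 0.05 s late.
lf = np.arange(10) * 0.5
rh = lf + 0.05
mdci = mean_diagonal_coupling_interval(lf, rh, stride=0.5)
```

Because the lag is expressed as a fraction of the stride cycle, the measure is independent of walking speed, which is the property the paper emphasises.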

  8. Digital broadcasting

    International Nuclear Information System (INIS)

    Park, Ji Hyeong

    1999-06-01

This book contains twelve chapters dealing with the digitization of broadcasting: digitization of the broadcast signal, such as digital open; digitization of video and sound signals; digitization of broadcasting equipment such as DTPP and digital VTR; digitization of transmission equipment such as digital STL, digital FPU, and digital SNG; digitization of transmission, covering digital TV and radio transmission; the necessity and advantages of digital broadcasting systems; digital broadcasting systems in Korea and abroad; an outline of digital broadcasting, the advantages of digital TV, the ripple effects of digital broadcasting, and considerations for digital broadcasting; terrestrial digital broadcasting standards (DVB-T in Europe, DTV in the U.S.A., and ISDB-T in Japan); HDTV broadcasting; satellite broadcasting; digital TV broadcasting in Korea; digital radio broadcasting; and new broadcasting services.

  9. La influencia del género en la cultura digital del estudiantado universitario ICT uses between university students

    Directory of Open Access Journals (Sweden)

    Iolanda García

    2012-11-01

Full Text Available The main goal of this paper is to analyse gender differences in the use of ICT among university students. We present the results of a survey of 1042 students from five universities about their uses and perceptions of ICT in everyday life and in their academic training. The results show gender differences both in the use of technology and in its perception.

  10. Constructing "Authentic" Science: Results from a University/High School Collaboration Integrating Digital Storytelling and Social Networking

    Science.gov (United States)

    Olitsky, Stacy; Becker, Elizabeth A.; Jayo, Ignacio; Vinogradov, Philip; Montcalmo, Joseph

    2018-02-01

    This study explores the implications of a redesign of a college course that entailed a new partnership between a college neuroscience classroom and a high school. In this course, the college students engaged in original research projects which included conducting brain surgery and behavioural tests on rats. They used digital storytelling and social networking to communicate with high school students and were visited by the students during the semester. The aims of the redesign were to align the course with science conducted in the field and to provide opportunities to disseminate scientific knowledge through emerging technologies. This study investigates the impact of these innovations on the college and high school students' perceptions of authentic science, including their relationship with science-centred communities. We found that these collaborative tools increased college students' perceptions that authentic science entailed communication with the general public, in addition to supporting prior perceptions of the importance of conducting experiments and presenting results to experts. In addition, the view of science as high-status knowledge was attenuated as students integrated non-formal communication practices into presentations, showing the backstage process of learning, incorporating music and youth discourse styles, and displaying emotional engagement. An impact of these hybrid presentation approaches was an increase in the high school students' perceptions of the accessibility of laboratory science. We discuss how the use of technologies that are familiar to youth, such as iPads, social networking sites, and multimedia presentations, has the potential to prioritize students' voices and promote a more inclusive view of science.

  11. On-chip generation of high-dimensional entangled quantum states and their coherent control.

    Science.gov (United States)

    Kues, Michael; Reimer, Christian; Roztocki, Piotr; Cortés, Luis Romero; Sciara, Stefania; Wetzel, Benjamin; Zhang, Yanbing; Cino, Alfonso; Chu, Sai T; Little, Brent E; Moss, David J; Caspani, Lucia; Azaña, José; Morandotti, Roberto

    2017-06-28

    Optical quantum states based on entangled photons are essential for solving questions in fundamental physics and are at the heart of quantum information science. Specifically, the realization of high-dimensional states (D-level quantum systems, that is, qudits, with D > 2) and their control are necessary for fundamental investigations of quantum mechanics, for increasing the sensitivity of quantum imaging schemes, for improving the robustness and key rate of quantum communication protocols, for enabling a richer variety of quantum simulations, and for achieving more efficient and error-tolerant quantum computation. Integrated photonics has recently become a leading platform for the compact, cost-efficient, and stable generation and processing of non-classical optical states. However, so far, integrated entangled quantum sources have been limited to qubits (D = 2). Here we demonstrate on-chip generation of entangled qudit states, where the photons are created in a coherent superposition of multiple high-purity frequency modes. In particular, we confirm the realization of a quantum system with at least one hundred dimensions, formed by two entangled qudits with D = 10. Furthermore, using state-of-the-art, yet off-the-shelf telecommunications components, we introduce a coherent manipulation platform with which to control frequency-entangled states, capable of performing deterministic high-dimensional gate operations. We validate this platform by measuring Bell inequality violations and performing quantum state tomography. Our work enables the generation and processing of high-dimensional quantum states in a single spatial mode.

  12. Covariance Method of the Tunneling Radiation from High Dimensional Rotating Black Holes

    Science.gov (United States)

    Li, Hui-Ling; Han, Yi-Wen; Chen, Shuai-Ru; Ding, Cong

    2018-04-01

In this paper, the Angheben-Nadalini-Vanzo-Zerbini (ANVZ) covariance method is used to study tunneling radiation from the Kerr-Gödel black hole and the Myers-Perry black hole with two independent angular momenta. By solving the Hamilton-Jacobi equation and separating the variables, the radial equation of motion of a tunneling particle is obtained. Using the near-horizon approximation and the proper spatial distance, we calculate the tunneling rate and the temperature of the Hawking radiation. Thus, the ANVZ covariance method is extended to the study of tunneling radiation from high-dimensional black holes.

  13. Efficient and accurate nearest neighbor and closest pair search in high-dimensional space

    KAUST Repository

    Tao, Yufei

    2010-07-01

Nearest Neighbor (NN) search in high-dimensional space is an important problem in many applications. From the database perspective, a good solution needs to have two properties: (i) it can be easily incorporated in a relational database, and (ii) its query cost should increase sublinearly with the dataset size, regardless of the data and query distributions. Locality-Sensitive Hashing (LSH) is a well-known methodology fulfilling both requirements, but its current implementations either incur expensive space and query cost, or abandon its theoretical guarantee on the quality of query results. Motivated by this, we improve LSH by proposing an access method called the Locality-Sensitive B-tree (LSB-tree) to enable fast, accurate, high-dimensional NN search in relational databases. The combination of several LSB-trees forms a LSB-forest that has strong quality guarantees, but improves dramatically the efficiency of the previous LSH implementation having the same guarantees. In practice, the LSB-tree itself is also an effective index which consumes linear space, supports efficient updates, and provides accurate query results. In our experiments, the LSB-tree was faster than: (i) iDistance (a famous technique for exact NN search) by two orders of magnitude, and (ii) MedRank (a recent approximate method with nontrivial quality guarantees) by one order of magnitude, and meanwhile returned much better results. As a second step, we extend our LSB technique to solve another classic problem, called Closest Pair (CP) search, in high-dimensional space. The long-term challenge for this problem has been to achieve subquadratic running time at very high dimensionalities, a challenge that most existing solutions fail to meet. We show that, using a LSB-forest, CP search can be accomplished in (worst-case) time significantly lower than the quadratic complexity, yet still ensuring very good quality. In practice, accurate answers can be found using just two LSB-trees, thus giving a substantial …
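
The locality-sensitive hashing principle that the LSB-tree builds on can be sketched with the classic p-stable random-projection scheme. This is a plain hash-table version with illustrative parameters, not the B-tree organisation or the quality guarantees of the paper.

```python
import numpy as np

class E2LSH:
    """Minimal locality-sensitive hashing for Euclidean NN search:
    nearby points collide in at least one of several hash tables."""

    def __init__(self, dim, n_tables=8, n_bits=6, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=(n_tables, n_bits, dim))   # projections
        self.b = rng.uniform(0, w, size=(n_tables, n_bits)) # offsets
        self.w = w                                          # bucket width
        self.tables = [dict() for _ in range(n_tables)]

    def _keys(self, x):
        h = np.floor((self.a @ x + self.b) / self.w).astype(int)
        return [tuple(row) for row in h]

    def index(self, X):
        self.X = X
        for i, x in enumerate(X):
            for table, key in zip(self.tables, self._keys(x)):
                table.setdefault(key, []).append(i)

    def query(self, q):
        cand = set()
        for table, key in zip(self.tables, self._keys(q)):
            cand.update(table.get(key, []))
        if not cand:
            return None
        cand = np.fromiter(cand, int)
        d = np.linalg.norm(self.X[cand] - q, axis=1)
        return int(cand[d.argmin()])

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 20))
lsh = E2LSH(dim=20)
lsh.index(X)
```

Only the few points sharing a bucket with the query are scanned, which is how query cost stays sublinear in the dataset size.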

  14. High-dimensional quantum key distribution based on multicore fiber using silicon photonic integrated circuits

    DEFF Research Database (Denmark)

    Ding, Yunhong; Bacco, Davide; Dalgaard, Kjeld

    2017-01-01

… is intrinsically limited to 1 bit/photon. Here we propose and experimentally demonstrate, for the first time, a high-dimensional quantum key distribution protocol based on space division multiplexing in multicore fiber using silicon photonic integrated lightwave circuits. We successfully realized three mutually … -dimensional quantum states, and enables breaking the information efficiency limit of traditional quantum key distribution protocols. In addition, the silicon photonic circuits used in our work integrate variable optical attenuators, highly efficient multicore fiber couplers, and Mach-Zehnder interferometers, enabling …

  15. High-dimensional chaos from self-sustained collisions of solitons

    Energy Technology Data Exchange (ETDEWEB)

    Yildirim, O. Ozgur, E-mail: donhee@seas.harvard.edu, E-mail: oozgury@gmail.com [Cavium, Inc., 600 Nickerson Rd., Marlborough, Massachusetts 01752 (United States); Ham, Donhee, E-mail: donhee@seas.harvard.edu, E-mail: oozgury@gmail.com [Harvard University, 33 Oxford St., Cambridge, Massachusetts 02138 (United States)

    2014-06-16

    We experimentally demonstrate chaos generation based on collisions of electrical solitons on a nonlinear transmission line. The nonlinear line creates solitons, and an amplifier connected to it provides gain to these solitons for their self-excitation and self-sustenance. Critically, the amplifier also provides a mechanism to enable and intensify collisions among solitons. These collisional interactions are of intrinsically nonlinear nature, modulating the phase and amplitude of solitons, thus causing chaos. This chaos generated by the exploitation of the nonlinear wave phenomena is inherently high-dimensional, which we also demonstrate.

  16. Inferring biological tasks using Pareto analysis of high-dimensional data.

    Science.gov (United States)

    Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri

    2015-03-01

    We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.

  17. A novel algorithm of artificial immune system for high-dimensional function numerical optimization

    Institute of Scientific and Technical Information of China (English)

    DU Haifeng; GONG Maoguo; JIAO Licheng; LIU Ruochen

    2005-01-01

Based on clonal selection theory and immune memory theory, a novel artificial immune system algorithm, the immune memory clonal programming algorithm (IMCPA), is put forward. Using Markov chain theory, it is proved that IMCPA is convergent. Compared with other evolutionary programming algorithms (such as the breeder genetic algorithm), IMCPA is shown to be an evolutionary strategy capable of solving complex machine learning tasks, such as high-dimensional function optimization; it maintains the diversity of the population, avoids prematurity to some extent, and has a higher convergence speed.
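
The clonal-selection ingredients (clone the best antibodies, hypermutate the clones, reselect) can be sketched as a generic CLONALG-style optimizer. This is not IMCPA itself, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def clonal_selection(f, dim, pop=20, clones=5, gens=200, seed=0):
    """Generic clonal-selection minimizer: clone the best half of the
    population, mutate clones with a rank-dependent step size
    (better antibodies mutate less), and keep the overall best."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(-5, 5, size=(pop, dim))
    for _ in range(gens):
        fit = np.apply_along_axis(f, 1, P)
        P = P[np.argsort(fit)]                       # best first
        C = np.repeat(P[:pop // 2], clones, axis=0)  # clone best half
        ranks = np.repeat(np.arange(1, pop // 2 + 1), clones)
        C += rng.normal(scale=0.5 * ranks[:, None] / pop, size=C.shape)
        allp = np.vstack([P, C])                     # parents + mutants
        allfit = np.apply_along_axis(f, 1, allp)
        P = allp[np.argsort(allfit)][:pop]           # elitist reselection
    return P[0], f(P[0])

best, val = clonal_selection(lambda x: float((x ** 2).sum()), dim=5)
```

On the 5-dimensional sphere function the population converges close to the optimum; the rank-scaled mutation is what preserves diversity while refining the best antibodies.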

  18. Computing and visualizing time-varying merge trees for high-dimensional data

    Energy Technology Data Exchange (ETDEWEB)

    Oesterling, Patrick [Univ. of Leipzig (Germany); Heine, Christian [Univ. of Kaiserslautern (Germany); Weber, Gunther H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Morozov, Dmitry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Scheuermann, Gerik [Univ. of Leipzig (Germany)

    2017-06-03

    We introduce a new method that identifies and tracks features in arbitrary dimensions using the merge tree -- a structure for identifying topological features based on thresholding in scalar fields. This method analyzes the evolution of features of the function by tracking changes in the merge tree and relates features by matching subtrees between consecutive time steps. Using the time-varying merge tree, we present a structural visualization of the changing function that illustrates both features and their temporal evolution. We demonstrate the utility of our approach by applying it to temporal cluster analysis of high-dimensional point clouds.
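
The thresholding-and-merging step at the heart of merge tree construction can be sketched as a union-find sweep over a scalar field on a graph: process vertices from high to low value and record when superlevel-set components merge. This single-time-step sketch on toy values omits the paper's tracking of trees across time steps.

```python
import numpy as np

def merge_events(values, edges):
    """Sweep vertices from high to low function value, union-find the
    superlevel-set components, and return (component_birth_value,
    merge_value) pairs -- the elementary events of a merge tree."""
    parent, birth, events = {}, {}, []

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]    # path compression
            v = parent[v]
        return v

    adj = {i: [] for i in range(len(values))}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    for v in np.argsort(values)[::-1]:       # descending threshold sweep
        parent[v] = v
        birth[v] = values[v]
        for u in adj[v]:
            if u in parent:                  # neighbor already above threshold
                ru, rv = find(u), find(v)
                if ru != rv:
                    # The younger component dies at the current value.
                    events.append((min(birth[ru], birth[rv]), values[v]))
                    parent[ru] = rv
                    birth[rv] = max(birth[ru], birth[rv])
    return events

# A path graph with two local maxima (values 3 and 5).
events = merge_events([1, 3, 2, 5, 4], [(0, 1), (1, 2), (2, 3), (3, 4)])
```

A connected graph with n vertices always produces n - 1 merge events; matching such events between consecutive time steps is what the time-varying method adds.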

  19. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso.

    Science.gov (United States)

    Kong, Shengchun; Nan, Bin

    2014-01-01

We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses.

  20. High-dimensional data: p >> n in mathematical statistics and bio-medical applications

    OpenAIRE

    Van De Geer, Sara A.; Van Houwelingen, Hans C.

    2004-01-01

The workshop 'High-dimensional data: p >> n in mathematical statistics and bio-medical applications' was held at the Lorentz Center in Leiden from 9 to 20 September 2002. This special issue of Bernoulli contains a selection of papers presented at that workshop. The introduction of high-throughput micro-array technology to measure gene-expression levels and the publication of the pioneering paper by Golub et al. (1999) has brought to life a whole new branch of data analysis under the name of...

  1. Ghosts in high dimensional non-linear dynamical systems: The example of the hypercycle

    International Nuclear Information System (INIS)

    Sardanyes, Josep

    2009-01-01

Ghost-induced delayed transitions are analyzed in high-dimensional non-linear dynamical systems by means of the hypercycle model. The hypercycle is a network of catalytically-coupled self-replicating RNA-like macromolecules, and has been suggested to be involved in the transition from non-living to living matter in the context of earlier prebiotic evolution. It is demonstrated that, in the vicinity of the saddle-node bifurcation for symmetric hypercycles, the persistence time before extinction, T_ε, tends to infinity as n → ∞ (where n is the number of units of the hypercycle), thus suggesting that the increase in the number of hypercycle units involves a longer resilient time before extinction because of the ghost. Furthermore, by means of numerical analysis the dynamics of three large hypercycle networks is also studied, focusing on their extinction dynamics associated with the ghosts. Such networks allow us to explore the properties of the ghosts living in high-dimensional phase space with n = 5, n = 10 and n = 15 dimensions. These hypercyclic networks, in agreement with other works, are shown to exhibit self-maintained oscillations governed by stable limit cycles. The bifurcation scenarios for these hypercycles are analyzed, as well as the effect of the phase space dimensionality on the delayed transition phenomena and on the scaling properties of the ghosts near the bifurcation threshold.
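
The ghost phenomenon has a well-known one-dimensional normal form, x' = mu + x^2 just past the saddle-node bifurcation, whose passage time through the bottleneck scales as mu^(-1/2). A quick numerical check of that scaling (a one-dimensional illustration only; the hypercycle ghosts live in n-dimensional phase space):

```python
def passage_time(mu, dt=0.01, x0=-1.0):
    """Time for x' = mu + x**2 (forward Euler) to travel from x0
    through the ghost region near x = 0 up to x = 1."""
    x, t = x0, 0.0
    while x < 1.0:
        x += dt * (mu + x * x)
        t += dt
    return t

# Quadrupling mu should roughly halve the passage time (T ~ mu**-0.5).
ratio = passage_time(1e-4) / passage_time(4e-4)
```

The trajectory spends almost all of its time crawling past x = 0, where the vanished fixed points leave their "ghost"; this bottleneck is the delayed transition the paper studies.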

  2. High-dimensional free-space optical communications based on orbital angular momentum coding

    Science.gov (United States)

    Zou, Li; Gu, Xiaofan; Wang, Le

    2018-03-01

In this paper, we propose a high-dimensional free-space optical communication scheme using orbital angular momentum (OAM) coding. In the scheme, the transmitter encodes N bits of information by using a spatial light modulator to convert a Gaussian beam into a superposition of N OAM modes and a Gaussian mode; the receiver decodes the information through an OAM mode analyser, which consists of a Mach-Zehnder interferometer with a rotating Dove prism, a photoelectric detector, and a computer carrying out the fast Fourier transform. The scheme realizes high-dimensional free-space optical communication and decodes the information quickly and accurately. We have verified the feasibility of the scheme by exploiting 8 (4) OAM modes and a Gaussian mode to implement a 256-ary (16-ary) coded free-space optical communication link transmitting a 256-gray-scale (16-gray-scale) picture. The results show that zero bit error rate performance has been achieved.
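
The coding idea admits a compact numerical illustration: each set bit contributes one OAM mode exp(i*l*phi) on top of a Gaussian (l = 0) reference, and the azimuthal Fourier spectrum recovers the bit pattern. This is an idealised sketch that ignores noise and the interferometric hardware of the actual analyser.

```python
import numpy as np

def encode(bits):
    """Superpose OAM modes exp(i*l*phi), one mode per set bit
    (bit k maps to l = k + 1), plus an l = 0 Gaussian reference,
    sampled on an azimuthal ring."""
    phi = np.linspace(0, 2 * np.pi, 256, endpoint=False)
    field = np.ones_like(phi, dtype=complex)       # l = 0 reference
    for k, b in enumerate(bits):
        if b:
            field += np.exp(1j * (k + 1) * phi)
    return field

def decode(field, n_bits):
    """Recover the bit pattern from the azimuthal Fourier spectrum:
    bin l carries unit weight exactly when mode l is present."""
    spec = np.abs(np.fft.fft(field)) / field.size
    return [int(spec[k + 1] > 0.5) for k in range(n_bits)]

bits = [1, 0, 1, 1, 0, 0, 1, 0]                    # one 8-bit symbol
recovered = decode(encode(bits), 8)
```

With 8 OAM modes, each transmitted symbol carries one of 256 values, which is the 256-ary coding used in the experiment.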

  3. Bayesian Multiresolution Variable Selection for Ultra-High Dimensional Neuroimaging Data.

    Science.gov (United States)

    Zhao, Yize; Kang, Jian; Long, Qi

    2018-01-01

    Ultra-high dimensional variable selection has become increasingly important in the analysis of neuroimaging data. For example, in the Autism Brain Imaging Data Exchange (ABIDE) study, neuroscientists are interested in identifying important biomarkers for early detection of the autism spectrum disorder (ASD) using high resolution brain images that include hundreds of thousands of voxels. However, most existing methods are not feasible for solving this problem due to their extensive computational costs. In this work, we propose a novel multiresolution variable selection procedure under a Bayesian probit regression framework. It recursively uses posterior samples for coarser-scale variable selection to guide the posterior inference on finer-scale variable selection, leading to very efficient Markov chain Monte Carlo (MCMC) algorithms. The proposed algorithms are computationally feasible for ultra-high dimensional data. Also, our model incorporates two levels of structural information into variable selection using Ising priors: the spatial dependence between voxels and the functional connectivity between anatomical brain regions. Applied to the resting state functional magnetic resonance imaging (R-fMRI) data in the ABIDE study, our methods identify voxel-level imaging biomarkers highly predictive of the ASD, which are biologically meaningful and interpretable. Extensive simulations also show that our methods achieve better performance in variable selection compared to existing methods.

  4. Energy Efficient MAC Scheme for Wireless Sensor Networks with High-Dimensional Data Aggregate

    Directory of Open Access Journals (Sweden)

    Seokhoon Kim

    2015-01-01

    This paper presents a novel and sustainable medium access control (MAC) scheme for wireless sensor network (WSN) systems that process high-dimensional aggregated data. Based on a preamble signal and buffer-threshold analysis, it maximizes the energy efficiency of wireless sensor devices, which have limited energy resources. The proposed group management MAC (GM-MAC) approach not only sets the buffer threshold value of a sensor device to be reciprocal to the preamble signal but also assigns a transmittable group value to each sensor device by using the preamble signal of the sink node. The primary difference from previous approaches is that existing state-of-the-art schemes use a duty cycle and sleep mode to reduce the energy consumption of individual sensor devices, whereas the proposed scheme employs group management of sensor devices to maximize the overall energy efficiency of the whole WSN by minimizing the energy consumption of sensor devices located near the sink node. Performance evaluations show that the proposed scheme outperforms previous schemes in terms of active time of sensor devices, transmission delay, control overhead, and energy consumption. Therefore, the proposed scheme is suitable for sensor devices in a variety of wireless sensor networking environments with high-dimensional data aggregation.

  5. Selecting Optimal Feature Set in High-Dimensional Data by Swarm Search

    Directory of Open Access Journals (Sweden)

    Simon Fong

    2013-01-01

    Selecting the right set of features from data of high dimensionality for inducing an accurate classification model is a tough computational challenge. It is almost an NP-hard problem, as the combinations of features escalate exponentially as the number of features increases. Unfortunately, in data mining, as well as in other engineering applications and bioinformatics, some data are described by a long array of features. Many feature subset selection algorithms have been proposed in the past, but not all of them are effective. Since exhaustively trying every possible combination of features by brute force is prohibitively slow, stochastic optimization may be a solution. In this paper, we propose a new feature selection scheme called Swarm Search to find an optimal feature set by using metaheuristics. The advantage of Swarm Search is its flexibility in integrating any classifier into its fitness function and plugging in any metaheuristic algorithm to facilitate heuristic search. Simulation experiments are carried out by testing Swarm Search on several high-dimensional datasets, with different classification algorithms and various metaheuristic algorithms. The comparative results show that Swarm Search is able to attain relatively low error rates in classification without shrinking the size of the feature subset to its minimum.
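
    The wrapper idea, a classifier inside the fitness function and a metaheuristic exploring the space of feature subsets, can be sketched as follows. The synthetic data, the nearest-centroid classifier, and the simple stochastic bit-flip search are all illustrative stand-ins for the paper's classifiers and swarm algorithms:

```python
import random

random.seed(0)

def make_data(n=60, d=10):
    """Synthetic data: features 0 and 1 are informative, the rest are noise."""
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [random.gauss(0, 1) for _ in range(d)]
        row[0] += 3 * label
        row[1] -= 3 * label
        X.append(row)
        y.append(label)
    return X, y

def fitness(mask, X, y):
    """Resubstitution accuracy of a nearest-centroid classifier restricted
    to the features selected by mask (stand-in for any wrapped classifier)."""
    feats = [j for j, b in enumerate(mask) if b]
    if not feats:
        return 0.0
    cent = {}
    for c in set(y):
        idx = [i for i in range(len(X)) if y[i] == c]
        cent[c] = [sum(X[i][j] for i in idx) / len(idx) for j in feats]
    correct = 0
    for i, row in enumerate(X):
        pred = min(cent, key=lambda c: sum((row[j] - cent[c][k]) ** 2
                                           for k, j in enumerate(feats)))
        correct += pred == y[i]
    return correct / len(X)

X, y = make_data()
mask = [random.random() < 0.5 for _ in range(10)]
best = fitness(mask, X, y)
for _ in range(200):                  # stochastic search (metaheuristic stand-in)
    j = random.randrange(10)
    mask[j] = not mask[j]             # propose flipping one feature in/out
    f = fitness(mask, X, y)
    if f >= best:
        best = f                      # keep improving (or equal) moves
    else:
        mask[j] = not mask[j]         # revert worsening moves
print(best, [j for j, b in enumerate(mask) if b])
```

    Swapping in a different classifier only changes `fitness`, and swapping in a different metaheuristic only changes the search loop, which is exactly the flexibility the abstract emphasizes.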

  6. The validation and assessment of machine learning: a game of prediction from high-dimensional data.

    Directory of Open Access Journals (Sweden)

    Tune H Pers

    In applied statistics, tools from machine learning are popular for analyzing complex and high-dimensional data. However, few theoretical results are available that could guide the choice of the appropriate machine learning tool in a new application. Initial development of an overall strategy thus often implies that multiple methods are tested and compared on the same set of data. This is particularly difficult in situations that are prone to over-fitting, where the number of subjects is low compared to the number of potential predictors. The article presents a game which provides some grounds for conducting a fair model comparison. Each player selects a modeling strategy for predicting individual response from potential predictors. A strictly proper scoring rule, bootstrap cross-validation, and a set of rules are used to make the results obtained with different strategies comparable. To illustrate the ideas, the game is applied to data from the Nugenob Study, where the aim is to predict fat oxidation capacity based on conventional factors and high-dimensional metabolomics data. Three players chose to use support vector machines, LASSO, and random forests, respectively.

  7. Similarity-dissimilarity plot for visualization of high dimensional data in biomedical pattern classification.

    Science.gov (United States)

    Arif, Muhammad

    2012-06-01

    In pattern classification problems, feature extraction is an important step. The quality of features in discriminating different classes plays an important role. In real life, pattern classification may require a high-dimensional feature space, and it is impossible to visualize the feature space if its dimension is greater than four. In this paper, we propose a similarity-dissimilarity plot which can project a high-dimensional space onto a two-dimensional space while retaining the important characteristics required to assess the discrimination quality of the features. The similarity-dissimilarity plot can reveal information about the amount of overlap between the features of different classes. Separable data points of different classes are also visible on the plot and can be classified correctly using an appropriate classifier; hence, approximate classification accuracy can be predicted. Moreover, it is possible to determine with which class the misclassified data points will be confused by the classifier. Outlier data points can also be located on the similarity-dissimilarity plot. Various examples of synthetic data are used to highlight important characteristics of the proposed plot, and some real-life examples from biomedical data are used for the analysis. The proposed plot is independent of the number of dimensions of the feature space.
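
    One plausible reading of the construction (an assumption, not the paper's exact definition) places each point at the pair (distance to its nearest same-class neighbour, distance to its nearest other-class neighbour), so that separable points fall on the side of the diagonal where dissimilarity exceeds similarity:

```python
import math
import random

random.seed(1)

def sim_dissim_coords(X, y):
    """Place each point at (similarity, dissimilarity): the distances to its
    nearest same-class and nearest other-class neighbours, respectively."""
    coords = []
    for i, xi in enumerate(X):
        same = diff = math.inf
        for j, xj in enumerate(X):
            if i == j:
                continue
            d = math.dist(xi, xj)
            if y[i] == y[j]:
                same = min(same, d)
            else:
                diff = min(diff, d)
        coords.append((same, diff))
    return coords

# Two well-separated Gaussian blobs in a 5-dimensional feature space.
X = [[random.gauss(3 * (i % 2), 0.5) for _ in range(5)] for i in range(40)]
y = [i % 2 for i in range(40)]
coords = sim_dissim_coords(X, y)
separable = sum(diff > same for same, diff in coords)
print(f"{separable} of {len(X)} points lie on the separable side of the diagonal")
```

    Feeding the two coordinates to any 2-D scatter plot then gives the visual overlap assessment described above, regardless of the original dimensionality.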

  8. High-dimensional quantum key distribution with the entangled single-photon-added coherent state

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yang [Zhengzhou Information Science and Technology Institute, Zhengzhou, 450001 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Bao, Wan-Su, E-mail: 2010thzz@sina.com [Zhengzhou Information Science and Technology Institute, Zhengzhou, 450001 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Bao, Hai-Ze; Zhou, Chun; Jiang, Mu-Sheng; Li, Hong-Wei [Zhengzhou Information Science and Technology Institute, Zhengzhou, 450001 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China)

    2017-04-25

    High-dimensional quantum key distribution (HD-QKD) can generate more secure bits per detection event, so that it can achieve long-distance key distribution with a high secret key capacity. In this Letter, we present a decoy-state HD-QKD scheme with the entangled single-photon-added coherent state (ESPACS) source. We present two tight formulas to estimate the single-photon fraction of postselected events and Eve's Holevo information, and derive lower bounds on the secret key capacity and the secret key rate of our protocol. We also present a finite-key analysis for our protocol using the Chernoff bound. Our numerical results show that our protocol using one decoy state can outperform the previous HD-QKD protocol based on spontaneous parametric down-conversion (SPDC) using two decoy states. Moreover, when considering finite resources, the advantage is more obvious. - Highlights: • Implement the single-photon-added coherent state source into high-dimensional quantum key distribution. • Enhance both the secret key capacity and the secret key rate compared with previous schemes. • Show an excellent performance in view of statistical fluctuations.

  9. A Feature Subset Selection Method Based On High-Dimensional Mutual Information

    Directory of Open Access Journals (Sweden)

    Chee Keong Kwoh

    2011-04-01

    Feature selection is an important step in building accurate classifiers and provides a better understanding of the data sets. In this paper, we propose a feature subset selection method based on high-dimensional mutual information. We also propose to use the entropy of the class attribute as a criterion to determine the appropriate subset of features when building classifiers. We prove that if the mutual information between a feature set X and the class attribute Y equals the entropy of Y, then X is a Markov blanket of Y. We show that in some cases it is infeasible to approximate the high-dimensional mutual information with algebraic combinations of pairwise mutual information in any form. In addition, an exhaustive search of all combinations of features is a prerequisite for finding the optimal feature subsets for classifying these kinds of data sets. We show that our approach outperforms existing filter feature subset selection methods on most of the 24 selected benchmark data sets.
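
    The two key quantities, the joint mutual information I(X_set; Y) and the class entropy H(Y), can be estimated from empirical frequencies. The classic XOR example below illustrates why pairwise approximations can fail: neither feature alone carries information about the class, yet jointly they determine it, and the high-dimensional mutual information reaches H(Y) exactly as in the Markov blanket criterion above:

```python
import math
from collections import Counter

def entropy(seq):
    """Empirical Shannon entropy (bits) of a sequence of hashable outcomes."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

def mutual_info(cols, y):
    """Empirical high-dimensional mutual information I(X_set; Y) =
    H(X_set) + H(Y) - H(X_set, Y), estimated from joint frequencies."""
    xs = list(zip(*cols))                      # joint feature vectors
    return entropy(xs) + entropy(y) - entropy(list(zip(xs, y)))

# XOR: pairwise MI is blind, joint MI saturates the class entropy.
x1, x2 = [0, 0, 1, 1], [0, 1, 0, 1]
y = [a ^ b for a, b in zip(x1, x2)]
print(mutual_info([x1], y))        # 0.0  (each feature alone is useless)
print(mutual_info([x1, x2], y))    # 1.0 == entropy(y): {x1, x2} is a Markov blanket
```

    In practice the joint estimate becomes unreliable as the feature set grows, which is why the exhaustive search the abstract mentions is so costly.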

  10. Reducing the Complexity of Genetic Fuzzy Classifiers in Highly-Dimensional Classification Problems

    Directory of Open Access Journals (Sweden)

    DimitrisG. Stavrakoudis

    2012-04-01

    This paper introduces the Fast Iterative Rule-based Linguistic Classifier (FaIRLiC), a Genetic Fuzzy Rule-Based Classification System (GFRBCS) which aims at reducing the structural complexity of the resulting rule base, as well as the computational requirements of its learning algorithm, especially when dealing with high-dimensional feature spaces. The proposed methodology follows the principles of the iterative rule learning (IRL) approach, whereby a rule extraction algorithm (REA) is invoked in an iterative fashion, producing one fuzzy rule at a time. The REA is performed in two successive steps: the first selects the relevant features of the currently extracted rule, whereas the second decides the antecedent part of the fuzzy rule, using the previously selected subset of features. The performance of the classifier is finally optimized through a genetic tuning post-processing stage. Comparative results in a hyperspectral remote sensing classification task as well as in 12 real-world classification datasets indicate the effectiveness of the proposed methodology in generating high-performing and compact fuzzy rule-based classifiers, even for very high-dimensional feature spaces.

  11. Compound Structure-Independent Activity Prediction in High-Dimensional Target Space.

    Science.gov (United States)

    Balfer, Jenny; Hu, Ye; Bajorath, Jürgen

    2014-08-01

    Profiling of compound libraries against arrays of targets has become an important approach in pharmaceutical research. The prediction of multi-target compound activities also represents an attractive task for machine learning with potential for drug discovery applications. Herein, we have explored activity prediction in high-dimensional target space. Different types of models were derived to predict multi-target activities. The models included naïve Bayesian (NB) and support vector machine (SVM) classifiers based upon compound structure information and NB models derived on the basis of activity profiles, without considering compound structure. Because the latter approach can be applied to incomplete training data and principally depends on the feature independence assumption, SVM modeling was not applicable in this case. Furthermore, iterative hybrid NB models making use of both activity profiles and compound structure information were built. In high-dimensional target space, NB models utilizing activity profile data were found to yield more accurate activity predictions than structure-based NB and SVM models or hybrid models. An in-depth analysis of activity profile-based models revealed the presence of correlation effects across different targets and rationalized prediction accuracy. Taken together, the results indicate that activity profile information can be effectively used to predict the activity of test compounds against novel targets. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Quantum secret sharing based on modulated high-dimensional time-bin entanglement

    International Nuclear Information System (INIS)

    Takesue, Hiroki; Inoue, Kyo

    2006-01-01

    We propose a scheme for quantum secret sharing (QSS) that uses modulated high-dimensional time-bin entanglement. By modulating the relative phase randomly by {0,π}, a sender with the entanglement source can randomly change the sign of the correlation of the measurement outcomes obtained by two distant recipients. The two recipients must cooperate if they are to obtain the sign of the correlation, which is used as a secret key. We show that our scheme is secure against intercept-and-resend (IR) and beam-splitting attacks by an outside eavesdropper thanks to the nonorthogonality of high-dimensional time-bin entangled states. We also show that a cheating attempt based on an IR attack by one of the recipients can be detected by changing the dimension of the time-bin entanglement randomly and inserting two 'vacant' slots between the packets; cheating attempts are then revealed by monitoring the count rate in the vacant slots. The proposed scheme has better experimental feasibility than previously proposed entanglement-based QSS schemes.

  13. Similarity measurement method of high-dimensional data based on normalized net lattice subspace

    Institute of Scientific and Technical Information of China (English)

    Li Wenfa; Wang Gongming; Li Ke; Huang Su

    2017-01-01

    The performance of conventional similarity measurement methods is seriously affected by the curse of dimensionality of high-dimensional data. The reason is that the data differences contributed by sparse and noisy dimensions occupy a large proportion of the similarity, so that any two points appear almost equally dissimilar. A similarity measurement method for high-dimensional data based on a normalized net lattice subspace is proposed. The data range of each dimension is divided into several intervals, and the components in different dimensions are mapped onto the corresponding intervals. Only components in the same or adjacent intervals are used to calculate the similarity. To validate this method, three data types are used and seven common similarity measurement methods are compared. The experimental results indicate that the relative difference of the method increases with the dimensionality and is approximately two or three orders of magnitude higher than that of the conventional methods. In addition, the similarity range of this method in different dimensions is [0, 1], which is fit for similarity analysis after dimensionality reduction.
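
    The interval-gating idea can be sketched as follows. The per-dimension contribution used here (1 minus the normalized distance) is an illustrative choice, not the paper's exact formula; what matters is that a dimension only contributes when the two components land in the same or adjacent intervals:

```python
def lattice_similarity(a, b, lows, highs, k=10):
    """Similarity in [0, 1]: each dimension's range [low, high] is split into k
    equal intervals, and a dimension contributes only when the two components
    fall into the same or adjacent intervals."""
    score = 0.0
    for x, y, lo, hi in zip(a, b, lows, highs):
        ix = min(int((x - lo) / (hi - lo) * k), k - 1)   # interval index of x
        iy = min(int((y - lo) / (hi - lo) * k), k - 1)   # interval index of y
        if abs(ix - iy) <= 1:                            # same or adjacent interval
            score += 1 - abs(x - y) / (hi - lo)
    return score / len(a)

# Close components count; the distant third dimension is ignored entirely,
# instead of swamping the similarity as in a plain Euclidean measure.
print(lattice_similarity([0.12, 0.47, 0.05], [0.15, 0.52, 0.93], [0, 0, 0], [1, 1, 1]))
```

    Gating out far-apart dimensions is what keeps sparse and noisy dimensions from dominating the measure in high-dimensional spaces.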

  14. The cross-validated AUC for MCP-logistic regression with high-dimensional data.

    Science.gov (United States)

    Jiang, Dingfeng; Huang, Jian; Zhang, Ying

    2013-10-01

    We propose a cross-validated area under the receiver operating characteristic (ROC) curve (CV-AUC) criterion for tuning parameter selection for penalized methods in sparse, high-dimensional logistic regression models. We use this criterion in combination with the minimax concave penalty (MCP) method for variable selection. The CV-AUC criterion is specifically designed to optimize the classification performance for binary outcome data. To implement the proposed approach, we derive an efficient coordinate descent algorithm to compute the MCP-logistic regression solution surface. Simulation studies are conducted to evaluate the finite sample performance of the proposed method and to compare it with existing methods, including the Akaike information criterion (AIC), Bayesian information criterion (BIC) and Extended BIC (EBIC). The model selected based on the CV-AUC criterion tends to have a larger predictive AUC and smaller classification error than those with tuning parameters selected using the AIC, BIC or EBIC. We illustrate the application of the MCP-logistic regression with the CV-AUC criterion on three microarray datasets from studies that attempt to identify genes related to cancers. Our simulation studies and data examples demonstrate that the CV-AUC is an attractive method for tuning parameter selection for penalized methods in high-dimensional logistic regression models.
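
    The AUC itself can be computed without plotting the ROC curve, via the Wilcoxon-Mann-Whitney pair-counting identity:

```python
def auc(scores, labels):
    """AUC as the Wilcoxon-Mann-Whitney statistic: the fraction of
    (positive, negative) pairs that the classifier ranks correctly,
    counting ties as half."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))   # 1.0: perfect ranking
print(auc([0.4, 0.7, 0.6, 0.2], [0, 1, 0, 1]))   # 0.5: one positive ranks below both negatives
```

    In the CV-AUC criterion, this quantity is averaged over the cross-validation folds for each candidate value of the MCP tuning parameter, and the value maximizing the average is selected.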

  15. An Improved Ensemble Learning Method for Classifying High-Dimensional and Imbalanced Biomedicine Data.

    Science.gov (United States)

    Yu, Hualong; Ni, Jun

    2014-01-01

    Training classifiers on skewed data is a technically challenging task, and it becomes more difficult when the data is simultaneously high-dimensional. In the biomedical field, skewed data often appear. In this study, we address this problem by combining the asymmetric bagging ensemble classifier (asBagging) presented in previous work with an improved random subspace (RS) generation strategy called feature subspace (FSS). Specifically, FSS is a novel method to promote the balance between accuracy and diversity of the base classifiers in asBagging. In view of the strong generalization capability of the support vector machine (SVM), we adopt it as the base classifier. Extensive experiments on four benchmark biomedical data sets indicate that the proposed ensemble learning method outperforms many baseline approaches in terms of the Accuracy, F-measure, G-mean and AUC evaluation criteria; thus, it can be regarded as an effective and efficient tool for dealing with high-dimensional and imbalanced biomedical data.

  16. Compact Representation of High-Dimensional Feature Vectors for Large-Scale Image Recognition and Retrieval.

    Science.gov (United States)

    Zhang, Yu; Wu, Jianxin; Cai, Jianfei

    2016-05-01

    In large-scale visual recognition and image retrieval tasks, feature vectors such as the Fisher vector (FV) or the vector of locally aggregated descriptors (VLAD) have achieved state-of-the-art results. However, the combination of large numbers of examples and high-dimensional vectors necessitates dimensionality reduction in order to bring storage and CPU costs into a reasonable range. In spite of the popularity of various feature compression methods, this paper shows that feature (dimension) selection is a better choice for high-dimensional FV/VLAD than feature (dimension) compression methods, e.g., product quantization. We show that strong correlation among the feature dimensions in the FV and the VLAD may not exist, which makes feature selection a natural choice. We also show that many dimensions in FV/VLAD are noise: discarding them via feature selection is better than compressing them together with the useful dimensions using feature compression methods. To choose features, we propose an efficient importance sorting algorithm considering both the supervised and unsupervised cases, for visual recognition and image retrieval, respectively. Combined with 1-bit quantization, feature selection achieves both higher accuracy and lower computational cost than feature compression methods, such as product quantization, on the FV and VLAD image representations.

  17. High-dimensional quantum key distribution with the entangled single-photon-added coherent state

    International Nuclear Information System (INIS)

    Wang, Yang; Bao, Wan-Su; Bao, Hai-Ze; Zhou, Chun; Jiang, Mu-Sheng; Li, Hong-Wei

    2017-01-01

    High-dimensional quantum key distribution (HD-QKD) can generate more secure bits per detection event, so that it can achieve long-distance key distribution with a high secret key capacity. In this Letter, we present a decoy-state HD-QKD scheme with the entangled single-photon-added coherent state (ESPACS) source. We present two tight formulas to estimate the single-photon fraction of postselected events and Eve's Holevo information, and derive lower bounds on the secret key capacity and the secret key rate of our protocol. We also present a finite-key analysis for our protocol using the Chernoff bound. Our numerical results show that our protocol using one decoy state can outperform the previous HD-QKD protocol based on spontaneous parametric down-conversion (SPDC) using two decoy states. Moreover, when considering finite resources, the advantage is more obvious. - Highlights: • Implement the single-photon-added coherent state source into high-dimensional quantum key distribution. • Enhance both the secret key capacity and the secret key rate compared with previous schemes. • Show an excellent performance in view of statistical fluctuations.

  18. High-Dimensional Single-Photon Quantum Gates: Concepts and Experiments.

    Science.gov (United States)

    Babazadeh, Amin; Erhard, Manuel; Wang, Feiran; Malik, Mehul; Nouroozi, Rahman; Krenn, Mario; Zeilinger, Anton

    2017-11-03

    Transformations on quantum states form a basic building block of every quantum information system. From photonic polarization to two-level atoms, complete sets of quantum gates for a variety of qubit systems are well known. For multilevel quantum systems beyond qubits, the situation is more challenging. The orbital angular momentum modes of photons comprise one such high-dimensional system for which generation and measurement techniques are well studied. However, arbitrary transformations for such quantum states are not known. Here we experimentally demonstrate a four-dimensional generalization of the Pauli X gate and all of its integer powers on single photons carrying orbital angular momentum. Together with the well-known Z gate, this forms the first complete set of high-dimensional quantum gates implemented experimentally. The concept of the X gate is based on independent access to quantum states with different parities and can thus be generalized to other photonic degrees of freedom and potentially also to other quantum systems.
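
    The d-dimensional X gate generalizes the qubit bit flip to a cyclic shift of the computational basis, X|j⟩ = |j + 1 mod d⟩; as a matrix it is a cyclic permutation, so its integer powers X, X², ..., X^d = I exhaust the set realized in the experiment. A numerical sketch of this algebra (not of the optical implementation):

```python
# d-dimensional generalization of the Pauli X gate: a cyclic shift of the
# computational basis, X|j> = |j + 1 mod d>, written as a permutation matrix.
def x_gate(d):
    return [[1 if i == (j + 1) % d else 0 for j in range(d)] for i in range(d)]

def apply(gate, state):
    """Matrix-vector product: act with the gate on a vector of mode amplitudes."""
    return [sum(g_ij * a_j for g_ij, a_j in zip(row, state)) for row in gate]

d = 4
state = [1, 0, 0, 0]          # |0> in a 4-dimensional (e.g. OAM) basis
for _ in range(d):            # apply X four times: X^d is the identity
    state = apply(x_gate(d), state)
assert state == [1, 0, 0, 0]
```

    Together with the diagonal phase gate Z|j⟩ = ω^j |j⟩ (ω = e^{2πi/d}), products of powers of X and Z span the generalized Pauli group referred to in the abstract.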

  19. TESTING HIGH-DIMENSIONAL COVARIANCE MATRICES, WITH APPLICATION TO DETECTING SCHIZOPHRENIA RISK GENES.

    Science.gov (United States)

    Zhu, Lingxue; Lei, Jing; Devlin, Bernie; Roeder, Kathryn

    2017-09-01

    Scientists routinely compare gene expression levels in cases versus controls in part to determine genes associated with a disease. Similarly, detecting case-control differences in co-expression among genes can be critical to understanding complex human diseases; however, statistical methods have been limited by the high-dimensional nature of this problem. In this paper, we construct a sparse-Leading-Eigenvalue-Driven (sLED) test for comparing two high-dimensional covariance matrices. By focusing on the spectrum of the differential matrix, sLED provides a novel perspective that accommodates what we assume to be common, namely sparse and weak signals in gene expression data, and it is closely related to Sparse Principal Component Analysis. We prove that sLED achieves full power asymptotically under mild assumptions, and simulation studies verify that it outperforms other existing procedures under many biologically plausible scenarios. Applying sLED to the largest gene-expression dataset obtained from post-mortem brain tissue from Schizophrenia patients and controls, we provide a novel list of genes implicated in Schizophrenia and reveal intriguing patterns in gene co-expression change for Schizophrenia subjects. We also illustrate that sLED can be generalized to compare other gene-gene "relationship" matrices that are of practical interest, such as the weighted adjacency matrices.

  20. Latent class models for joint analysis of disease prevalence and high-dimensional semicontinuous biomarker data.

    Science.gov (United States)

    Zhang, Bo; Chen, Zhen; Albert, Paul S

    2012-01-01

    High-dimensional biomarker data are often collected in epidemiological studies when assessing the association between biomarkers and human disease is of interest. We develop a latent class modeling approach for joint analysis of high-dimensional semicontinuous biomarker data and a binary disease outcome. To model the relationship between complex biomarker expression patterns and disease risk, we use latent risk classes to link the 2 modeling components. We characterize complex biomarker-specific differences through biomarker-specific random effects, so that different biomarkers can have different baseline (low-risk) values as well as different between-class differences. The proposed approach also accommodates data features that are common in environmental toxicology and other biomarker exposure data, including a large number of biomarkers, numerous zero values, and complex mean-variance relationship in the biomarkers levels. A Monte Carlo EM (MCEM) algorithm is proposed for parameter estimation. Both the MCEM algorithm and model selection procedures are shown to work well in simulations and applications. In applying the proposed approach to an epidemiological study that examined the relationship between environmental polychlorinated biphenyl (PCB) exposure and the risk of endometriosis, we identified a highly significant overall effect of PCB concentrations on the risk of endometriosis.

  1. Generalized reduced rank latent factor regression for high dimensional tensor fields, and neuroimaging-genetic applications.

    Science.gov (United States)

    Tao, Chenyang; Nichols, Thomas E; Hua, Xue; Ching, Christopher R K; Rolls, Edmund T; Thompson, Paul M; Feng, Jianfeng

    2017-01-01

    We propose a generalized reduced rank latent factor regression model (GRRLF) for the analysis of tensor field responses and high-dimensional covariates. The model is motivated by the need in imaging-genetic studies to identify genetic variants associated with brain imaging phenotypes, often in the form of high-dimensional tensor fields. GRRLF identifies the effective dimensionality of the data from its structure, and then jointly performs dimension reduction of the covariates, dynamic identification of latent factors, and nonparametric estimation of both covariate and latent response fields. After accounting for the latent and covariate effects, GRRLF performs a nonparametric test on the remaining factor of interest. GRRLF provides a better factorization of the signals compared with common solutions, and is less susceptible to overfitting because it exploits the effective dimensionality. The generality and flexibility of GRRLF also allow various statistical models to be handled in a unified framework, and solutions can be computed efficiently. Within the field of neuroimaging, it improves the sensitivity for weak signals and is a promising alternative to existing approaches. The operation of the framework is demonstrated with both synthetic datasets and a real-world neuroimaging example in which the effects of a set of genes on the structure of the brain were measured at the voxel level, and the results compared favorably with those from existing approaches. Copyright © 2016. Published by Elsevier Inc.

  2. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative. PMID:19588106

  3. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  4. Innovation Rather than Improvement: A Solvable High-Dimensional Model Highlights the Limitations of Scalar Fitness

    Science.gov (United States)

    Tikhonov, Mikhail; Monasson, Remi

    2018-01-01

    Much of our understanding of ecological and evolutionary mechanisms derives from analysis of low-dimensional models: with few interacting species, or few axes defining "fitness". It is not always clear to what extent the intuition derived from low-dimensional models applies to the complex, high-dimensional reality. For instance, most naturally occurring microbial communities are strikingly diverse, harboring a large number of coexisting species, each of which contributes to shaping the environment of others. Understanding the eco-evolutionary interplay in these systems is an important challenge, and an exciting new domain for statistical physics. Recent work identified a promising new platform for investigating highly diverse ecosystems, based on the classic resource competition model of MacArthur. Here, we describe how the same analytical framework can be used to study evolutionary questions. Our analysis illustrates how, at high dimension, the intuition promoted by a one-dimensional (scalar) notion of fitness can become misleading. Specifically, while the low-dimensional picture emphasizes organism cost or efficiency, we exhibit a regime where cost becomes irrelevant for survival, and link this observation to generic properties of high-dimensional geometry.

  5. A New Ensemble Method with Feature Space Partitioning for High-Dimensional Data Classification

    Directory of Open Access Journals (Sweden)

    Yongjun Piao

    2015-01-01

Ensemble data mining methods, also known as classifier combination, are often used to improve the performance of classification. Various classifier combination methods such as bagging, boosting, and random forest have been devised and have received considerable attention in the past. However, data dimensionality is increasing rapidly, and these methods are not directly applicable to high-dimensional datasets. In this paper, we propose an ensemble method for classification of high-dimensional data, with each classifier constructed from a different set of features determined by partitioning of redundant features. In our method, the redundancy of features is considered to divide the original feature space. Then, each generated feature subset is trained by a support vector machine, and the results of each classifier are combined by majority voting. The efficiency and effectiveness of our method are demonstrated through comparisons with other ensemble techniques, and the results show that our method outperforms other methods.
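The pipeline this record describes (partition features by redundancy, train one SVM per subset, combine by majority vote) can be sketched minimally as below. The correlation cutoff, synthetic data, and greedy grouping are illustrative assumptions, not the paper's actual redundancy criterion:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic data: 120 samples, 30 features; the first 6 carry class signal.
n, d = 120, 30
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d))
X[:, :6] += y[:, None] * 1.5  # informative block

# Partition the feature space by correlation-based redundancy:
# greedily group features whose pairwise |correlation| exceeds a cutoff.
corr = np.abs(np.corrcoef(X, rowvar=False))
unassigned = list(range(d))
subsets = []
while unassigned:
    seed = unassigned.pop(0)
    group = [seed] + [j for j in unassigned if corr[seed, j] > 0.3]
    unassigned = [j for j in unassigned if j not in group]
    subsets.append(group)

# Train one SVM per feature subset; combine predictions by majority vote.
train_idx, test_idx = np.arange(0, 80), np.arange(80, n)
votes = []
for g in subsets:
    clf = SVC(kernel="linear").fit(X[np.ix_(train_idx, g)], y[train_idx])
    votes.append(clf.predict(X[np.ix_(test_idx, g)]))
majority = (np.mean(votes, axis=0) > 0.5).astype(int)
accuracy = np.mean(majority == y[test_idx])
```

Every feature lands in exactly one subset, so the ensemble sees the whole feature space without any classifier seeing redundant copies of the same signal.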

  6. AucPR: An AUC-based approach using penalized regression for disease prediction with high-dimensional omics data

    OpenAIRE

    Yu, Wenbao; Park, Taesung

    2014-01-01

    Motivation It is common to get an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works based on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of AUC, with no work using a parametric AUC-based approach, for high-dimensional data. Results We propose an AUC-based approach u...
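For reference alongside this record, the empirical AUC that such approaches optimize is the rank-based Mann-Whitney statistic: the probability that a random positive case scores above a random negative case (ties counting one half). A minimal sketch:

```python
import numpy as np

def empirical_auc(scores, labels):
    """Empirical AUC: fraction of (positive, negative) pairs where the
    positive scores higher, with ties counted as 1/2. Equivalent to the
    normalized Mann-Whitney U statistic."""
    s = np.asarray(scores, dtype=float)
    y = np.asarray(labels)
    pos, neg = s[y == 1], s[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

auc = empirical_auc([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1])  # 3 of 4 pairs correct
```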

  7. Digital disruption "syndromes".

    Science.gov (United States)

    Sullivan, Clair; Staib, Andrew

    2017-05-18

    The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia's largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and

  8. High dimensional biological data retrieval optimization with NoSQL technology

    Science.gov (United States)

    2014-01-01

    Background High-throughput transcriptomic data generated by microarray experiments is the most abundant and frequently stored kind of data currently used in translational medicine studies. Although microarray data is supported in data warehouses such as tranSMART, when querying relational databases for hundreds of different patient gene expression records queries are slow due to poor performance. Non-relational data models, such as the key-value model implemented in NoSQL databases, hold promise to be more performant solutions. Our motivation is to improve the performance of the tranSMART data warehouse with a view to supporting Next Generation Sequencing data. Results In this paper we introduce a new data model better suited for high-dimensional data storage and querying, optimized for database scalability and performance. We have designed a key-value pair data model to support faster queries over large-scale microarray data and implemented the model using HBase, an implementation of Google's BigTable storage system. An experimental performance comparison was carried out against the traditional relational data model implemented in both MySQL Cluster and MongoDB, using a large publicly available transcriptomic data set taken from NCBI GEO concerning Multiple Myeloma. Our new key-value data model implemented on HBase exhibits an average 5.24-fold increase in high-dimensional biological data query performance compared to the relational model implemented on MySQL Cluster, and an average 6.47-fold increase on query performance on MongoDB. Conclusions The performance evaluation found that the new key-value data model, in particular its implementation in HBase, outperforms the relational model currently implemented in tranSMART. We propose that NoSQL technology holds great promise for large-scale data management, in particular for high-dimensional biological data such as that demonstrated in the performance evaluation described in this paper. 
We aim to use this new data
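The key-value layout described above can be illustrated with a tiny in-memory sketch: a composite row key maps to an expression value, so one patient's profile is retrieved by a key-prefix scan rather than a relational join. The key scheme and names below are illustrative assumptions, not tranSMART's or HBase's actual schema:

```python
# In-memory stand-in for an HBase-style key-value table.
store = {}

def put(trial, patient, probe, value):
    """Composite row key '<trial>|<patient>|<probe>' -> expression value."""
    store[f"{trial}|{patient}|{probe}"] = value

def scan_patient(trial, patient):
    """Emulate an HBase prefix scan: fetch all probes for one patient."""
    prefix = f"{trial}|{patient}|"
    return {k[len(prefix):]: v for k, v in store.items() if k.startswith(prefix)}

put("GSE24080", "P001", "probe_A", 7.2)
put("GSE24080", "P001", "probe_B", 5.9)
put("GSE24080", "P002", "probe_A", 6.4)
profile = scan_patient("GSE24080", "P001")
```

In a real HBase table the lexicographic ordering of row keys makes this prefix scan a contiguous disk read, which is where the query speedup over row-per-measurement relational storage comes from.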

  9. Exploring high dimensional data with Butterfly: a novel classification algorithm based on discrete dynamical systems.

    Science.gov (United States)

    Geraci, Joseph; Dharsee, Moyez; Nuin, Paulo; Haslehurst, Alexandria; Koti, Madhuri; Feilotter, Harriet E; Evans, Ken

    2014-03-01

We introduce a novel method for visualizing high dimensional data via a discrete dynamical system. This method provides a 2D representation of the relationship between subjects according to a set of variables without geometric projections, transformed axes or principal components. The algorithm exploits a memory-type mechanism inherent in a certain class of discrete dynamical systems collectively referred to as the chaos game that are closely related to iterative function systems. The goal of the algorithm was to create a human readable representation of high dimensional patient data that was capable of detecting unrevealed subclusters of patients from within anticipated classifications. This provides a mechanism to further pursue a more personalized exploration of pathology when used with medical data. For clustering and classification protocols, the dynamical system portion of the algorithm is designed to come after some feature selection filter and before some model evaluation (e.g. clustering accuracy) protocol. In the version given here, a univariate feature selection step is performed (in practice more complex feature selection methods are used), a discrete dynamical system is driven by this reduced set of variables (which results in a set of 2D cluster models), these models are evaluated for their accuracy (according to a user-defined binary classification) and finally a visual representation of the top classification models are returned. Thus, in addition to the visualization component, this methodology can be used for both supervised and unsupervised machine learning as the top performing models are returned in the protocol we describe here. Butterfly, the algorithm we introduce and provide working code for, uses a discrete dynamical system to classify high dimensional data and provide a 2D representation of the relationship between subjects. We report results on three datasets (two in the article; one in the appendix) including a public lung cancer
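To make the chaos-game idea concrete, here is a generic sketch (a hypothetical simplification, not Butterfly's actual algorithm): a subject's binarized feature vector drives an iterated function system that repeatedly moves a point halfway toward one of four square corners, yielding a 2D signature per subject without any axis projection:

```python
import numpy as np

CORNERS = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])

def chaos_game_point(bits):
    """Map a binary feature sequence to a 2D point: pairs of bits select
    a corner, and the point moves halfway toward it each step."""
    point = np.array([0.5, 0.5])
    trail = []
    it = iter(bits)
    for b0, b1 in zip(it, it):          # consume bits two at a time
        corner = CORNERS[2 * b0 + b1]
        point = (point + corner) / 2.0  # contraction toward the corner
        trail.append(point.copy())
    return point, np.array(trail)

rng = np.random.default_rng(1)
features = rng.normal(size=40)          # one subject's 40 variables
bits = (features > 0).astype(int)       # crude binarization
final, trail = chaos_game_point(bits)
```

The "memory" the abstract mentions comes from the contraction: the final point encodes the whole bit history, with recent bits dominating, so subjects with similar feature patterns land near each other.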

  10. A comprehensive analysis of earthquake damage patterns using high dimensional model representation feature selection

    Science.gov (United States)

    Taşkin Kaya, Gülşen

    2013-10-01

Recently, earthquake damage assessment using satellite images has been a very popular ongoing research direction. Especially with the availability of very high resolution (VHR) satellite images, quite detailed damage maps at the building scale have been produced, and various studies have been conducted in the literature. As the spatial resolution of satellite images increases, distinguishing damage patterns becomes more difficult, especially when only spectral information is used during classification. In order to overcome this difficulty, textural information needs to be incorporated into the classification to improve the visual quality and reliability of the damage map. Many kinds of textural information can be derived from VHR satellite images depending on the algorithm used. However, extraction and evaluation of textural information is generally a time-consuming process, especially for large earthquake-affected areas, due to the size of the VHR image. Therefore, in order to provide a quick damage map, the most useful features describing damage patterns, as well as the redundant features, need to be known in advance. In this study, a very high resolution satellite image acquired after the Bam, Iran, earthquake was used to identify the earthquake damage. Not only spectral information but also textural information was used during the classification. For textural information, second-order Haralick features were extracted from the panchromatic image for the area of interest using the gray level co-occurrence matrix with different window sizes and directions. In addition to using spatial features in classification, the most useful features representing the damage characteristics were selected with a novel feature selection method based on high dimensional model representation (HDMR), which gives the sensitivity of each feature during classification. The method called HDMR was recently proposed as an efficient tool to capture the input
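The gray level co-occurrence matrix underlying the Haralick features mentioned above can be sketched in a few lines. This is a generic single-displacement GLCM with one example feature (contrast), not the study's full multi-window, multi-direction extraction:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for one displacement (dx, dy),
    normalized to a joint probability table. Assumes dx, dy >= 0."""
    h, w = img.shape
    M = np.zeros((levels, levels))
    for i in range(h - dy):
        for j in range(w - dx):
            M[img[i, j], img[i + dy, j + dx]] += 1
    return M / M.sum()

def haralick_contrast(P):
    """Haralick contrast: sum over (i - j)^2 * P(i, j)."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, dx=1, dy=0, levels=4)   # horizontal neighbor pairs
contrast = haralick_contrast(P)       # 7/12 for this image
```

For this 4x4 image there are 12 horizontal pairs; seven units of squared gray-level difference give a contrast of 7/12.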

  11. High dimensional biological data retrieval optimization with NoSQL technology.

    Science.gov (United States)

    Wang, Shicai; Pandis, Ioannis; Wu, Chao; He, Sijin; Johnson, David; Emam, Ibrahim; Guitton, Florian; Guo, Yike

    2014-01-01

    High-throughput transcriptomic data generated by microarray experiments is the most abundant and frequently stored kind of data currently used in translational medicine studies. Although microarray data is supported in data warehouses such as tranSMART, when querying relational databases for hundreds of different patient gene expression records queries are slow due to poor performance. Non-relational data models, such as the key-value model implemented in NoSQL databases, hold promise to be more performant solutions. Our motivation is to improve the performance of the tranSMART data warehouse with a view to supporting Next Generation Sequencing data. In this paper we introduce a new data model better suited for high-dimensional data storage and querying, optimized for database scalability and performance. We have designed a key-value pair data model to support faster queries over large-scale microarray data and implemented the model using HBase, an implementation of Google's BigTable storage system. An experimental performance comparison was carried out against the traditional relational data model implemented in both MySQL Cluster and MongoDB, using a large publicly available transcriptomic data set taken from NCBI GEO concerning Multiple Myeloma. Our new key-value data model implemented on HBase exhibits an average 5.24-fold increase in high-dimensional biological data query performance compared to the relational model implemented on MySQL Cluster, and an average 6.47-fold increase on query performance on MongoDB. The performance evaluation found that the new key-value data model, in particular its implementation in HBase, outperforms the relational model currently implemented in tranSMART. We propose that NoSQL technology holds great promise for large-scale data management, in particular for high-dimensional biological data such as that demonstrated in the performance evaluation described in this paper. 
We aim to use this new data model as a basis for migrating

  12. MATHEMATICAL-Universe-Hypothesis(MUH) BECOME SCENARIO(MUS)!!! (NOT YET A THEORY) VIA 10-DIGITS[ 0 --> 9] SEPHIROT CREATION AUTOMATICALLY from DIGITS AVERAGED-PROBABILITY Newcomb-Benford LOG-Law; UTTER-SIMPLICITY!!!: It's a Jack-in-the-Box Universe: Accidental?/Purposeful?; EMET/TRUTH!!!

    Science.gov (United States)

    Siegel, Edward Carl-Ludwig

    2015-04-01

    Siegel(2012) 10-DIGITS[0 --> 9] AVERAGE PROBABILITY LOG-Law SCALE-INVARIANCE UTTER-SIMPLICITY: Kabbala SEPHIROT SCENARIO AUTOMATICALLY CREATES a UNIVERSE: (1) a big-bang[bosons(BEQS) created from Newcomb[Am.J.Math.4(1),39(1881;THE discovery of the QUANTUM!!!)-Poincare[Calcul des Probabilites,313(12)]-Weyl[Goett.Nach.(14);Math.Ann.77,313(16)] DIGITS AVERAGE STATISTICS LOG-Law[ = log(1 +1/d) = log([d +1]/d)] algebraic-inversion, (2)[initial (at first space-time point created) c = ∞ elongating to timelike-pencil spreading into finite-c light-cone] hidden-dark-energy (HDE)[forming at every-spacetime-point], (3) inflation[logarithm algebraic-inversion-to exponential], (4) hidden[in Siegel(87) ``COMPLEX quantum-statistics in (Nottale-Linde)FRACTAL-dimensions'' expansion around unit-circle/roots-of-unity]-dark-matter(HDM), (4)null massless bosons(E) --> Mellin-(light-speed squared)-transform/Englert-Higgs ``mechanism'' -->(timelike) massive fermions(m), (5) cosmic-microwave-background (CMB)[power-spectrum] Zipf-law HYPERBOLICITY, (6) supersymmetry(SUSY) [projective-geometry conic-sections/conics merging in R/ C projective-plane point at ∞]. UTTER-SIMPLICITY!!!
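Whatever one makes of the record above, the Newcomb-Benford first-digit law it invokes is itself standard and simple: the probability of leading digit d is log10(1 + 1/d) = log10((d + 1)/d), and these nine probabilities telescope to exactly 1:

```python
import math

# Newcomb-Benford first-digit law: P(d) = log10(1 + 1/d) for d = 1..9.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
total = sum(benford.values())  # telescopes to log10(10) = 1 exactly
```

Digit 1 carries about 30.1% of the mass and digit 9 only about 4.6%, which is the scale-invariance property the record alludes to.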

  13. Penalized estimation for competing risks regression with applications to high-dimensional covariates

    DEFF Research Database (Denmark)

    Ambrogi, Federico; Scheike, Thomas H.

    2016-01-01

... Research 19: (1), 29-51), the research regarding competing risks is less developed (Binder and others, 2009. Boosting for high-dimensional time-to-event data with competing risks. Bioinformatics 25: (7), 890-896). The aim of this work is to consider how to do penalized regression in the presence of competing events. The direct binomial regression model of Scheike and others (2008. Predicting cumulative incidence probability by direct binomial regression. Biometrika 95: (1), 205-220) is reformulated in a penalized framework to possibly fit a sparse regression model. The developed approach is easily implementable using existing high-performance software to do penalized regression. Results from simulation studies are presented together with an application to genomic data when the endpoint is progression-free survival. An R function is provided to perform regularized competing risks regression according ...

  14. Gene masking - a technique to improve accuracy for cancer classification with high dimensionality in microarray data.

    Science.gov (United States)

    Saini, Harsh; Lal, Sunil Pranit; Naidu, Vimal Vikash; Pickering, Vincel Wince; Singh, Gurmeet; Tsunoda, Tatsuhiko; Sharma, Alok

    2016-12-05

    High dimensional feature space generally degrades classification in several applications. In this paper, we propose a strategy called gene masking, in which non-contributing dimensions are heuristically removed from the data to improve classification accuracy. Gene masking is implemented via a binary encoded genetic algorithm that can be integrated seamlessly with classifiers during the training phase of classification to perform feature selection. It can also be used to discriminate between features that contribute most to the classification, thereby, allowing researchers to isolate features that may have special significance. This technique was applied on publicly available datasets whereby it substantially reduced the number of features used for classification while maintaining high accuracies. The proposed technique can be extremely useful in feature selection as it heuristically removes non-contributing features to improve the performance of classifiers.
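The binary-encoded genetic algorithm described above can be sketched generically: a population of 0/1 masks over the genes, a classifier-in-the-loop fitness, tournament selection, uniform crossover, and bit-flip mutation. The toy data, nearest-centroid fitness, and all parameter values below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 60 samples, 20 "genes"; only genes 0-2 separate the classes.
n, d = 60, 20
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d))
X[:, :3] += y[:, None] * 2.0

def fitness(mask):
    """Nearest-centroid training accuracy on the unmasked genes
    (a stand-in for the paper's classifier-in-the-loop evaluation)."""
    if mask.sum() == 0:
        return 0.0
    Xm = X[:, mask.astype(bool)]
    c0, c1 = Xm[y == 0].mean(axis=0), Xm[y == 1].mean(axis=0)
    pred = (np.linalg.norm(Xm - c1, axis=1) <
            np.linalg.norm(Xm - c0, axis=1)).astype(int)
    return np.mean(pred == y)

# Generational GA over binary masks.
pop = rng.integers(0, 2, size=(30, d))
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    children = []
    for _ in range(len(pop)):
        i, j = rng.integers(0, len(pop), size=2)   # tournament pick, parent a
        a = pop[i] if scores[i] >= scores[j] else pop[j]
        i, j = rng.integers(0, len(pop), size=2)   # tournament pick, parent b
        b = pop[i] if scores[i] >= scores[j] else pop[j]
        cross = rng.integers(0, 2, size=d).astype(bool)  # uniform crossover
        child = np.where(cross, a, b)
        flip = rng.random(d) < 0.05                       # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = max(pop, key=fitness)
best_score = fitness(best)
```

The surviving mask is the "gene mask": features whose bits stay set across the best individuals are the ones contributing most to classification.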

  15. Entanglement dynamics of high-dimensional bipartite field states inside the cavities in dissipative environments

    Energy Technology Data Exchange (ETDEWEB)

    Tahira, Rabia; Ikram, Manzoor; Zubairy, M Suhail [Centre for Quantum Physics, COMSATS Institute of Information Technology, Islamabad (Pakistan); Bougouffa, Smail [Department of Physics, Faculty of Science, Taibah University, PO Box 30002, Madinah (Saudi Arabia)

    2010-02-14

    We investigate the phenomenon of sudden death of entanglement in a high-dimensional bipartite system subjected to dissipative environments with an arbitrary initial pure entangled state between two fields in the cavities. We find that in a vacuum reservoir, the presence of the state where one or more than one (two) photons in each cavity are present is a necessary condition for the sudden death of entanglement. Otherwise entanglement remains for infinite time and decays asymptotically with the decay of individual qubits. For pure two-qubit entangled states in a thermal environment, we observe that sudden death of entanglement always occurs. The sudden death time of the entangled states is related to the number of photons in the cavities, the temperature of the reservoir and the initial preparation of the entangled states.

  16. Entanglement dynamics of high-dimensional bipartite field states inside the cavities in dissipative environments

    International Nuclear Information System (INIS)

    Tahira, Rabia; Ikram, Manzoor; Zubairy, M Suhail; Bougouffa, Smail

    2010-01-01

    We investigate the phenomenon of sudden death of entanglement in a high-dimensional bipartite system subjected to dissipative environments with an arbitrary initial pure entangled state between two fields in the cavities. We find that in a vacuum reservoir, the presence of the state where one or more than one (two) photons in each cavity are present is a necessary condition for the sudden death of entanglement. Otherwise entanglement remains for infinite time and decays asymptotically with the decay of individual qubits. For pure two-qubit entangled states in a thermal environment, we observe that sudden death of entanglement always occurs. The sudden death time of the entangled states is related to the number of photons in the cavities, the temperature of the reservoir and the initial preparation of the entangled states.

  17. Time–energy high-dimensional one-side device-independent quantum key distribution

    International Nuclear Information System (INIS)

    Bao Hai-Ze; Bao Wan-Su; Wang Yang; Chen Rui-Ke; Ma Hong-Xin; Zhou Chun; Li Hong-Wei

    2017-01-01

Compared with full device-independent quantum key distribution (DI-QKD), one-side device-independent QKD (1sDI-QKD) needs fewer requirements, which is much easier to meet. In this paper, by applying recently developed novel time–energy entropic uncertainty relations, we present a time–energy high-dimensional one-side device-independent quantum key distribution (HD-QKD) and provide the security proof against coherent attacks. Besides, we connect the security with the quantum steering. By numerical simulation, we obtain the secret key rate for Alice’s different detection efficiencies. The results show that our protocol can perform much better than the original 1sDI-QKD. Furthermore, we clarify the relation among the secret key rate, Alice’s detection efficiency, and the dispersion coefficient. Finally, we simply analyze its performance in the optical fiber channel. (paper)

  18. A Cure for Variance Inflation in High Dimensional Kernel Principal Component Analysis

    DEFF Research Database (Denmark)

    Abrahamsen, Trine Julie; Hansen, Lars Kai

    2011-01-01

Small sample high-dimensional principal component analysis (PCA) suffers from variance inflation and lack of generalizability. It has earlier been pointed out that a simple leave-one-out variance renormalization scheme can cure the problem. In this paper we generalize the cure in two directions: first, we propose a computationally less intensive approximate leave-one-out estimator; secondly, we show that variance inflation is also present in kernel principal component analysis (kPCA) and we provide a non-parametric renormalization scheme which can quite efficiently restore generalizability in kPCA. As for PCA, our analysis also suggests a simplified approximate expression.

  19. Inference for feature selection using the Lasso with high-dimensional data

    DEFF Research Database (Denmark)

    Brink-Jensen, Kasper; Ekstrøm, Claus Thorn

    2014-01-01

Penalized regression models such as the Lasso have proved useful for variable selection in many fields, especially for situations with high-dimensional data where the number of predictors far exceeds the number of observations. These methods identify and rank variables of importance but do not generally provide any inference for the selected variables. Thus, the variables selected might be the "most important" but need not be significant. We propose a significance test for the selection found by the Lasso. We introduce a procedure that computes inference and p-values for features chosen by the Lasso. This method rephrases the null hypothesis and uses a randomization approach which ensures that the error rate is controlled even for small samples. We demonstrate the ability of the algorithm to compute p-values of the expected magnitude with simulated data using a multitude of scenarios ...
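A randomization approach of the general flavor described above can be sketched as follows. This is a simplified permutation stand-in for the paper's procedure (the toy data, the fixed alpha, and the permutation null are all assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)

# Toy high-dimensional regression: 50 samples, 100 predictors,
# only predictor 0 has a real effect.
n, d = 50, 100
X = rng.normal(size=(n, d))
y = 2.0 * X[:, 0] + rng.normal(size=n)

lasso = Lasso(alpha=0.2).fit(X, y)
selected = np.flatnonzero(lasso.coef_)

def permutation_pvalue(j, n_perm=50):
    """Randomization p-value for feature j: compare its |coefficient|
    to the null distribution from refitting on permuted responses."""
    observed = abs(lasso.coef_[j])
    null = []
    for _ in range(n_perm):
        y_perm = rng.permutation(y)
        null.append(abs(Lasso(alpha=0.2).fit(X, y_perm).coef_[j]))
    # +1 in numerator and denominator keeps the p-value valid (never 0).
    return (1 + sum(v >= observed for v in null)) / (1 + n_perm)

p0 = permutation_pvalue(0)
```

Permuting the response breaks any real association while preserving the design, so a coefficient that survives selection under permutation as often as the observed one is not significant.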

  20. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang; Tong, Tiejun; Genton, Marc G.

    2017-01-01

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
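A statistic of the shape described, a summation of log-transformed squared t-statistics, can be sketched for the one-sample case. The per-variable form n*log(1 + t^2/(n-1)) is the exact normal-mean likelihood ratio statistic; summing it across variables under a diagonal covariance assumption is a sketch consistent with the abstract, not necessarily the paper's exact statistic:

```python
import numpy as np

rng = np.random.default_rng(4)

def diagonal_lrt_stat(X):
    """One-sample test statistic under a diagonal covariance assumption:
    per variable, the normal-mean LRT gives n * log(1 + t^2 / (n - 1));
    the global statistic sums these log-transformed squared t-statistics."""
    n, p = X.shape
    t = X.mean(axis=0) / (X.std(axis=0, ddof=1) / np.sqrt(n))
    return float(np.sum(n * np.log(1.0 + t**2 / (n - 1))))

X_null = rng.normal(size=(20, 100))   # true mean zero: small statistic
X_alt = X_null + 1.0                  # all means shifted: large statistic
stat_null = diagonal_lrt_stat(X_null)
stat_alt = diagonal_lrt_stat(X_alt)
```

The log transform damps the influence of any single extreme t-statistic, which is one of the "interesting characteristics" the abstract contrasts with a direct sum of squared t-statistics.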

  1. High-dimensional atom localization via spontaneously generated coherence in a microwave-driven atomic system.

    Science.gov (United States)

    Wang, Zhiping; Chen, Jinyu; Yu, Benli

    2017-02-20

    We investigate the two-dimensional (2D) and three-dimensional (3D) atom localization behaviors via spontaneously generated coherence in a microwave-driven four-level atomic system. Owing to the space-dependent atom-field interaction, it is found that the detecting probability and precision of 2D and 3D atom localization behaviors can be significantly improved via adjusting the system parameters, the phase, amplitude, and initial population distribution. Interestingly, the atom can be localized in volumes that are substantially smaller than a cubic optical wavelength. Our scheme opens a promising way to achieve high-precision and high-efficiency atom localization, which provides some potential applications in high-dimensional atom nanolithography.

  2. Characterization of differentially expressed genes using high-dimensional co-expression networks

    DEFF Research Database (Denmark)

    Coelho Goncalves de Abreu, Gabriel; Labouriau, Rodrigo S.

    2010-01-01

We present a technique to characterize differentially expressed genes in terms of their position in a high-dimensional co-expression network. The set-up of Gaussian graphical models is used to construct representations of the co-expression network in such a way that redundancy and the propagation ... that allow effective inference in problems with a high degree of complexity (e.g. several thousands of genes) and a small number of observations (e.g. 10-100), as typically occurs in high-throughput gene expression studies. Taking advantage of the internal structure of decomposable graphical models, we construct a compact representation of the co-expression network that allows us to identify the regions with a high concentration of differentially expressed genes. It is argued that differentially expressed genes located in highly interconnected regions of the co-expression network are less informative than ...

  3. An adaptive ANOVA-based PCKF for high-dimensional nonlinear inverse modeling

    Science.gov (United States)

    Li, Weixuan; Lin, Guang; Zhang, Dongxiao

    2014-02-01

The probabilistic collocation-based Kalman filter (PCKF) is a recently developed approach for solving inverse problems. It resembles the ensemble Kalman filter (EnKF) in every aspect, except that it represents and propagates model uncertainty by polynomial chaos expansion (PCE) instead of an ensemble of model realizations. Previous studies have shown PCKF is a more efficient alternative to EnKF for many data assimilation problems. However, the accuracy and efficiency of PCKF depend on an appropriate truncation of the PCE series. Having more polynomial chaos basis functions in the expansion helps to capture uncertainty more accurately but increases computational cost. Selection of basis functions is particularly important for high-dimensional stochastic problems because the number of polynomial chaos basis functions required to represent model uncertainty grows dramatically as the number of input parameters (random dimensions) increases. In classic PCKF algorithms, the PCE basis functions are pre-set based on users' experience. Also, for sequential data assimilation problems, the basis functions kept in the PCE expression remain unchanged across Kalman filter loops, which could limit the accuracy and computational efficiency of classic PCKF algorithms. To address this issue, we present a new algorithm that adaptively selects PCE basis functions for different problems and automatically adjusts the number of basis functions in different Kalman filter loops. The algorithm is based on adaptive functional ANOVA (analysis of variance) decomposition, which approximates a high-dimensional function with the summation of a set of low-dimensional functions. Thus, instead of expanding the original model into PCE, we implement the PCE expansion on these low-dimensional functions, which is much less costly. We also propose a new adaptive criterion for ANOVA that is more suited for solving inverse problems. The new algorithm was tested with different examples and demonstrated
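The functional ANOVA idea the abstract relies on, approximating a high-dimensional function by a constant plus a sum of low-dimensional component functions, can be sketched at first order with a cut-HDMR-style construction. The anchor point and toy model are illustrative assumptions; for an additive model the first-order approximation is exact:

```python
import numpy as np

def f(x):
    """Toy 'model': additive, so the first-order decomposition is exact."""
    return 1.0 + x[0] ** 2 + np.sin(x[1]) + 0.5 * x[2]

anchor = np.zeros(3)      # cut point through which components are defined
f0 = f(anchor)            # zeroth-order term

def component(i, xi):
    """First-order component f_i(x_i): vary one input, others at anchor."""
    x = anchor.copy()
    x[i] = xi
    return f(x) - f0

def hdmr1(x):
    """First-order ANOVA/HDMR approximation: f0 + sum_i f_i(x_i)."""
    return f0 + sum(component(i, x[i]) for i in range(len(x)))

x_test = np.array([0.3, -1.2, 2.0])
err = abs(hdmr1(x_test) - f(x_test))
```

Each component is one-dimensional, so building a polynomial chaos expansion of each f_i needs far fewer basis functions than expanding the full f, which is the cost saving the abstract describes.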

  4. Kernel based methods for accelerated failure time model with ultra-high dimensional data

    Directory of Open Access Journals (Sweden)

    Jiang Feng

    2010-12-01

Abstract Background Most genomic data have ultra-high dimensions with more than 10,000 genes (probes). Regularization methods with L1 and Lp penalties have been extensively studied in survival analysis with high-dimensional genomic data. However, when the sample size n ≪ m (the number of genes), directly identifying a small subset of genes from ultra-high (m > 10,000) dimensional data is time-consuming and not computationally efficient. In current microarray analysis, common practice is to select a couple of thousand (or hundred) genes using univariate analysis or statistical tests, and then apply a LASSO-type penalty to further reduce the number of disease-associated genes. This two-step procedure may introduce bias and inaccuracy and lead us to miss biologically important genes. Results The accelerated failure time (AFT) model is a linear regression model and a useful alternative to the Cox model for survival analysis. In this paper, we propose a nonlinear kernel based AFT model and an efficient variable selection method with adaptive kernel ridge regression. Our proposed variable selection method is based on the kernel matrix and dual problem with a much smaller n × n matrix. It is very efficient when the number of unknown variables (genes) is much larger than the number of samples. Moreover, the primal variables are explicitly updated and the sparsity in the solution is exploited. Conclusions Our proposed methods can simultaneously identify survival-associated prognostic factors and predict survival outcomes with ultra-high dimensional genomic data. We have demonstrated the performance of our methods with both simulation and real data. The proposed method performs well with limited computation.
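The computational point made above, that the dual problem involves only an n x n matrix regardless of the number of genes, can be shown with plain kernel ridge regression (a generic sketch, not the paper's AFT-specific estimator; data and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# n << d: 40 samples, 5000 "genes"; only 3 genes carry signal.
n, d = 40, 5000
X = rng.normal(size=(n, d))
beta = np.zeros(d)
beta[:3] = 1.0
y = X @ beta + 0.1 * rng.normal(size=n)

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

gamma, lam = 1.0 / d, 1.0
K = rbf_kernel(X, X, gamma)                       # only n x n, never d x d
alpha = np.linalg.solve(K + lam * np.eye(n), y)   # dual coefficients
y_hat = K @ alpha                                 # fitted values
```

The linear solve costs O(n^3) with n = 40, independent of the 5000 input dimensions; the feature dimension enters only through the one-time kernel evaluation.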

  5. Joint Adaptive Mean-Variance Regularization and Variance Stabilization of High Dimensional Data.

    Science.gov (United States)

    Dazard, Jean-Eudes; Rao, J Sunil

    2012-07-01

The paper addresses a common problem in the analysis of high-dimensional high-throughput "omics" data, which is parameter estimation across multiple variables in a set of data where the number of variables is much larger than the sample size. Among the problems posed by this type of data are that variable-specific estimators of variances are not reliable and variable-wise test statistics have low power, both due to a lack of degrees of freedom. In addition, it has been observed in this type of data that the variance increases as a function of the mean. We introduce a non-parametric adaptive regularization procedure that is innovative in that: (i) it employs a novel "similarity statistic"-based clustering technique to generate local-pooled or regularized shrinkage estimators of population parameters; (ii) the regularization is done jointly on population moments, benefiting from C. Stein's result on inadmissibility, which implies that the usual sample variance estimator is improved by a shrinkage estimator using information contained in the sample mean. From these joint regularized shrinkage estimators, we derive regularized t-like statistics and show in simulation studies that they offer more statistical power in hypothesis testing than their standard sample counterparts, or regular common value-shrinkage estimators, or when the information contained in the sample mean is simply ignored. Finally, we show that these estimators feature interesting properties of variance stabilization and normalization that can be used for preprocessing high-dimensional multivariate data. The method is available as an R package, called 'MVR' ('Mean-Variance Regularization'), downloadable from the CRAN website.
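The motivating idea, that with tiny samples per-variable variance estimates are noisy and should be shrunk toward a pooled value before forming t-like statistics, can be sketched with a generic James-Stein-style convex shrinkage. This is not the MVR package's clustering-based estimator; the fixed weight below is an assumption (it is chosen adaptively in real procedures):

```python
import numpy as np

rng = np.random.default_rng(6)

# 4 samples, 1000 variables with heterogeneous true variances.
n, p = 4, 1000
X = rng.normal(size=(n, p)) * rng.uniform(0.5, 2.0, size=p)

s2 = X.var(axis=0, ddof=1)            # raw per-variable variances (noisy)
s2_pooled = s2.mean()                 # common pooled value
w = 0.5                               # shrinkage weight (fixed for the sketch)
s2_shrunk = w * s2_pooled + (1 - w) * s2

t_raw = X.mean(axis=0) / np.sqrt(s2 / n)        # unstable with n = 4
t_reg = X.mean(axis=0) / np.sqrt(s2_shrunk / n) # regularized t-like statistic
```

Shrinking toward the pooled value provably reduces the spread of the variance estimates, which is what tames the spuriously tiny denominators that inflate raw t-statistics at n = 4.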

  6. Traces of humanity: Echoes of social and cultural experience in physical objects and digital surrogates in the University of Victoria Libraries

    Directory of Open Access Journals (Sweden)

    Robbyn Gordon Lanning

    2016-12-01

    The relationships between primary source materials and their digital surrogates warrant consideration of how different materials translate into digitized forms. Physical primary source materials found in library special collections and archives, and their digital surrogates, challenge the viewer to consider what these objects communicate through their materiality or lack thereof. For example, how does a clay tablet represent itself digitally, as compared to a parchment manuscript or a paper accounts book? What qualities, stories or narratives do these resources communicate in their original forms, as digital surrogates, or when engaged with together, and how do these differ? How do both physical and digital resources serve as archival objects with the ability to reflect our social and cultural experiences—and indeed our humanity—back to us? As more library and museum resources are digitized and opened to researchers, such questions must be addressed, because the use and reuse of digital surrogates becomes increasingly complex as digital scholarship evolves.

  7. Smart Universities: Education's Digital Future

    NARCIS (Netherlands)

    Stracke, Christian M.; Shanks, Michael; Tveiten, Oddgeir

    2018-01-01

    Institutions of learning at all levels are challenged by a fast and accelerating pace of change in the development of communications technology. Conferences around the world address the issue and research journals in a wide range of scholarly fields are placing the challenge of understanding

  9. University digital repositories and authors.

    Directory of Open Access Journals (Sweden)

    Alice Keefer

    2008-02-01

    The Open Access movement offers two strategies for making scientific information available without economic, technical or legal obstacles: the publication of articles in OA journals, and the deposit by authors of their works in stable institutional or discipline-based repositories. This article explores the implementation of the second “route” on the part of authors, because it is the strategy that offers the greatest possibility of attaining OA in the short term. However, it does require repositories to exert great effort in informing authors of the advantages of self-archiving and of the procedures for depositing their work, and even in helping them to do so through services and promotional activities.

  10. Ultrasensitive Single Fluorescence-Labeled Probe-Mediated Single Universal Primer-Multiplex-Droplet Digital Polymerase Chain Reaction for High-Throughput Genetically Modified Organism Screening.

    Science.gov (United States)

    Niu, Chenqi; Xu, Yuancong; Zhang, Chao; Zhu, Pengyu; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2018-05-01

    As genetically modified (GM) technology develops and genetically modified organisms (GMOs) become more available, GMOs face increasing regulation and pressure to adhere to strict labeling guidelines. A singleplex detection method cannot perform the high-throughput analysis necessary for optimal GMO detection. Combining the advantages of multiplex detection and droplet digital polymerase chain reaction (ddPCR), a single universal primer-multiplex-ddPCR (SUP-M-ddPCR) strategy is proposed for accurate broad-spectrum screening and quantification. The SUP increases the efficiency of the primers in PCR and plays an important role in establishing a high-throughput, multiplex detection method. Emerging ddPCR technology has been used for accurate quantification of nucleic acid molecules without a standard curve. Using maize as a reference point, four heterologous sequences (35S, NOS, NPTII, and PAT) were selected to evaluate the feasibility and applicability of this strategy. Notably, these four genes cover more than 93% of transgenic maize lines and serve as preliminary screening sequences. All screening probes were labeled with FAM fluorescence, which allows the signals from samples with and without GMO content to be easily differentiated. This fiveplex screening method is a new development in GMO screening. Using an optimal amplification assay, the specificity, limit of detection (LOD), and limit of quantitation (LOQ) were validated. The LOD and LOQ of this GMO screening method were 0.1% and 0.01%, respectively, with a relative standard deviation (RSD) < 25%. This method could serve as an important tool for the detection of GM maize in different processed, commercially available products. Further, this screening method could be applied to other fields that require reliable and sensitive detection of DNA targets.
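    The "quantification without a standard curve" in ddPCR rests on a standard Poisson argument, sketched here with hypothetical droplet counts (the counts are illustrative, not from the paper):

```python
import math

def copies_per_droplet(positive, total):
    # ddPCR endpoint counting: a droplet reads "positive" if it received at
    # least one target copy, so under Poisson loading of copies into droplets
    #   P(positive) = 1 - exp(-lam)   =>   lam = -ln(1 - positive/total)
    frac = positive / total
    return -math.log(1.0 - frac)

lam = copies_per_droplet(3000, 20000)  # hypothetical run: 15% positive droplets
print(lam)                             # mean target copies per droplet
```

    Multiplying `lam` by the droplet count and dividing by the analyzed volume gives an absolute concentration, which is why no calibration curve is needed.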

  11. Technology's Latest Wave: Colleges and Universities Are Increasingly Exploring the Academic Use of Digital Mobile Devices-But Lack of Money Sometimes Stands in the Way

    Science.gov (United States)

    Galuszka, Peter

    2005-01-01

    Using mobile digital devices--iPods, personal digital assistants (PDAs), Tablet PCs or advanced cell phones--is becoming a big campus trend. Their advantages include convenience and the ability to hear lectures or course-related music just about anywhere. PDAs such as Palm Pilots and BlackBerrys, iPods such as Apple's, and Tablet PCs, including…

  12. Digital techniques in simulation, communication and control. Proceedings of the IMACS European meeting held at University of Patras, Patras, Greece, July 9-12, 1984

    Energy Technology Data Exchange (ETDEWEB)

    Tzafestas, S G

    1985-01-01

    The book contains 90 papers classified into the following five parts: Modelling and simulation; Digital signal processing and 2-D system design; Information and communication systems; Control systems; and Applications (robotics, industrial and miscellaneous applications). The volume reflects the state of the art of the field of digital techniques. (Auth.).

  13. Using digital badges in South Africa informing the validation of a multi-channel open badge system at a German university

    CSIR Research Space (South Africa)

    Niehaus, E

    2017-05-01

    … strategy, remains an area to explore. One of the most crucial points of concern currently with digital badging is how to validate or credit a competence or skill set, and what value this validation will carry for the individual. The Mozilla digital badge...

  14. Elearning and digital publishing

    CERN Document Server

    Ching, Hsianghoo Steve; Mc Naught, Carmel

    2006-01-01

    ""ELearning and Digital Publishing"" will occupy a unique niche in the literature accessed by library and publishing specialists, and by university teachers and planners. It examines the interfaces between the work done by four groups of university staff who have been in the past quite separate from, or only marginally related to, each other - library staff, university teachers, university policy makers, and staff who work in university publishing presses. All four groups are directly and intimately connected with the main functions of universities - the creation, management and dissemination

  15. A Near-linear Time Approximation Algorithm for Angle-based Outlier Detection in High-dimensional Data

    DEFF Research Database (Denmark)

    Pham, Ninh Dang; Pagh, Rasmus

    2012-01-01

    Outlier mining in d-dimensional point sets is a fundamental and well studied data mining task due to its variety of applications. Most such applications arise in high-dimensional domains. A bottleneck of existing approaches is that implicit or explicit assessments on concepts of distance or nearest neighbor are deteriorated in high-dimensional data. Following up on the work of Kriegel et al. (KDD '08), we investigate the use of the angle-based outlier factor in mining high-dimensional outliers. While their algorithm runs in cubic time (with a quadratic time heuristic), we propose a novel random projection-based technique that is able to estimate the angle-based outlier factor for all data points in time near-linear in the size of the data. Also, our approach is suitable to be performed in a parallel environment to achieve a parallel speedup. We introduce a theoretical analysis of the quality...
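    The quantity being approximated, the angle-based outlier factor (ABOF) of Kriegel et al., can be computed naively; this sketch is the cubic-time baseline the abstract improves on, shown on synthetic data with a planted outlier:

```python
import numpy as np

def abof(X, i):
    # Naive angle-based outlier factor: the variance, over all pairs (B, C)
    # of other points, of the angle at point A, weighted by the inverse
    # product of squared distances.  A *low* ABOF flags a likely outlier,
    # because from a far-away point all other points lie in a narrow cone.
    A = X[i]
    others = np.delete(X, i, axis=0)
    vals = []
    m = len(others)
    for j in range(m):
        for k in range(j + 1, m):
            ab, ac = others[j] - A, others[k] - A
            vals.append(ab @ ac / ((ab @ ab) * (ac @ ac)))
    return float(np.var(vals))

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(size=(30, 10)), np.full((1, 10), 8.0)])  # row 30 planted
scores = np.array([abof(X, i) for i in range(len(X))])
print(scores.argmin())   # the planted point has the smallest ABOF
```

    The double loop over pairs is what makes the exact method cubic overall; the paper's contribution is estimating these variances with random projections in near-linear time.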

  16. Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data.

    Science.gov (United States)

    Becker, Natalia; Toedt, Grischa; Lichter, Peter; Benner, Axel

    2011-05-09

    Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods with a wide range of scientific applications, the SVM does not include automatic feature selection, and therefore a number of feature selection procedures have been developed. Regularisation approaches extend the SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm, which, in comparison to a fixed grid search, finds a global optimal solution more rapidly and more precisely. Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that Elastic SCAD SVM outperformed LASSO (L1) and SCAD SVMs. Moreover, Elastic SCAD SVM provided sparser classifiers, in terms of the median number of features selected, than Elastic Net SVM, and often predicted better than Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above to four publicly available breast cancer data sets. Elastic SCAD SVM was the only method providing robust classifiers in both sparse and non-sparse situations. The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. We were the first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions for the optimization of tuning parameters.
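    For reference, the SCAD penalty of Fan and Li, which Elastic SCAD combines with a ridge term, can be written down directly. The values `lam = 0.5` and the conventional `a = 3.7` are illustrative:

```python
import numpy as np

def scad(theta, lam, a=3.7):
    # SCAD penalty: L1-like near zero (sparsity), a quadratic blend in the
    # middle, and constant beyond a*lam, so large coefficients are not shrunk.
    t = np.abs(theta)
    small = lam * t
    mid = -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1))
    big = (a + 1) * lam**2 / 2
    return np.where(t <= lam, small, np.where(t <= a * lam, mid, big))

theta = np.linspace(-3, 3, 601)
p = scad(theta, lam=0.5)
print(float(p[300]), float(p[-1]))   # zero at theta = 0; flat tail at (a+1)*lam^2/2
```

    Elastic SCAD simply adds lambda2 * theta^2 to this penalty, which restores strong convexity and avoids the sparsity limitations mentioned above for non-sparse problems.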

  17. Case study : The California Digital Library

    OpenAIRE

    Ober, John

    2002-01-01

    The California Digital Library was founded in 1997 as a digital “co-library” of the ten University of California campuses. It responds to the crisis in scholarly communication and to the opportunity presented by digital technologies and the Web, and is charged to create a comprehensive system for the management of digital scholarly information.

  18. Prediction-Oriented Marker Selection (PROMISE): With Application to High-Dimensional Regression.

    Science.gov (United States)

    Kim, Soyeon; Baladandayuthapani, Veerabhadran; Lee, J Jack

    2017-06-01

    In personalized medicine, biomarkers are used to select therapies with the highest likelihood of success based on an individual patient's biomarker/genomic profile. Two goals are to choose important biomarkers that accurately predict treatment outcomes and to cull unimportant biomarkers to reduce the cost of biological and clinical verifications. These goals are challenging due to the high dimensionality of genomic data. Variable selection methods based on penalized regression (e.g., the lasso and elastic net) have yielded promising results. However, selecting the right amount of penalization is critical to simultaneously achieving these two goals. Standard approaches based on cross-validation (CV) typically provide high prediction accuracy with high true positive rates but at the cost of too many false positives. Alternatively, stability selection (SS) controls the number of false positives, but at the cost of yielding too few true positives. To circumvent these issues, we propose prediction-oriented marker selection (PROMISE), which combines SS with CV to gain the advantages of both methods. Our application of PROMISE with the lasso and elastic net in data analysis shows that, compared to CV, PROMISE produces sparse solutions, few false positives, and small type I + type II error, and maintains good prediction accuracy, with a marginal decrease in the true positive rates. Compared to SS, PROMISE offers better prediction accuracy and true positive rates. In summary, PROMISE can be applied in many fields to select regularization parameters when the goals are to minimize false positives and maximize prediction accuracy.
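    The stability-selection ingredient, selection frequencies over random half-samples, can be sketched generically. Note this uses simple top-k marginal screening as the base selector rather than the paper's lasso/elastic net, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 100, 200, 10               # k = features kept by the base selector
X = rng.normal(size=(n, p))
y = 3 * X[:, 0] + 3 * X[:, 1] + rng.normal(size=n)   # two true features

B = 50
freq = np.zeros(p)
for _ in range(B):
    sub = rng.choice(n, n // 2, replace=False)        # random half-sample
    Xs, ys = X[sub], y[sub]
    corr = np.abs((Xs - Xs.mean(0)).T @ (ys - ys.mean()))
    freq[np.argsort(corr)[-k:]] += 1                  # top-k marginal screen
freq /= B

print(freq[:2], float(freq[2:].max()))   # true features are selected almost always
```

    Features with high selection frequency across subsamples are the "stable" ones; PROMISE then uses cross-validated prediction accuracy to pick the regularization level at which this stable set is formed.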

  19. Diagonal Likelihood Ratio Test for Equality of Mean Vectors in High-Dimensional Data

    KAUST Repository

    Hu, Zongliang

    2017-10-27

    We propose a likelihood ratio test framework for testing normal mean vectors in high-dimensional data under two common scenarios: the one-sample test and the two-sample test with equal covariance matrices. We derive the test statistics under the assumption that the covariance matrices follow a diagonal matrix structure. In comparison with the diagonal Hotelling's tests, our proposed test statistics display some interesting characteristics. In particular, they are a summation of the log-transformed squared t-statistics rather than a direct summation of those components. More importantly, to derive the asymptotic normality of our test statistics under the null and local alternative hypotheses, we do not require the assumption that the covariance matrix follows a diagonal matrix structure. As a consequence, our proposed test methods are very flexible and can be widely applied in practice. Finally, simulation studies and a real data analysis are also conducted to demonstrate the advantages of our likelihood ratio test method.
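    A statistic of the described form, a summation of log-transformed squared t-statistics, might look as follows in the one-sample case; the exact scaling inside the logarithm is an assumption for illustration, not taken from the paper:

```python
import numpy as np

def diag_lrt_stat(X, mu0=0.0):
    # Sum of log-transformed squared t-statistics across coordinates,
    # in the spirit of the abstract (scaling chosen for illustration).
    n = X.shape[0]
    t = np.sqrt(n) * (X.mean(0) - mu0) / X.std(0, ddof=1)
    return float(np.sum(np.log1p(t**2 / (n - 1))))

rng = np.random.default_rng(4)
null = diag_lrt_stat(rng.normal(size=(50, 300)))           # mean is truly mu0
alt = diag_lrt_stat(rng.normal(loc=0.5, size=(50, 300)))   # shifted mean
print(null, alt)   # the shifted sample inflates the statistic
```

    Taking logs before summing dampens the influence of any single extreme coordinate, which is one reason this form behaves differently from a direct sum of squared t-statistics.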

  20. Biomarker identification and effect estimation on schizophrenia –a high dimensional data analysis

    Directory of Open Access Journals (Sweden)

    Yuanzhang eLi

    2015-05-01

    Biomarkers have been examined in schizophrenia research for decades. Schizophrenia patients face elevated medical morbidity and mortality rates, as well as personal and societal costs. The identification of biomarkers and alleles, which often have a small effect individually, may help to develop new diagnostic tests for early identification and treatment. Currently, there is no commonly accepted statistical approach to identify predictive biomarkers from high dimensional data. We used the space Decomposition-Gradient-Regression (DGR) method to select biomarkers that are associated with the risk of schizophrenia. Then, we used the gradient scores, generated from the selected biomarkers, as the prediction factor in regression to estimate their effects. We also used an alternative approach, classification and regression trees (CART), to compare with the biomarkers selected by DGR, and found that about 70% of the selected biomarkers were the same. However, the advantage of DGR is that it can evaluate the individual effect of each biomarker within their combined effect. In a DGR analysis of serum specimens of US military service members with a diagnosis of schizophrenia from 1992 to 2005 and their controls, Alpha-1-Antitrypsin (AAT), Interleukin-6 receptor (IL-6r) and Connective Tissue Growth Factor (CTGF) were selected to identify schizophrenia for males; Alpha-1-Antitrypsin (AAT), Apolipoprotein B (Apo B) and Sortilin were selected for females. If these findings from military subjects are replicated by other studies, they suggest the possibility of a novel biomarker panel as an adjunct to earlier diagnosis and initiation of treatment.

  1. A sparse grid based method for generative dimensionality reduction of high-dimensional data

    Science.gov (United States)

    Bohn, Bastian; Garcke, Jochen; Griebel, Michael

    2016-03-01

    Generative dimensionality reduction methods play an important role in machine learning applications because they construct an explicit mapping from a low-dimensional space to the high-dimensional data space. We discuss a general framework to describe generative dimensionality reduction methods, where the main focus lies on a regularized principal manifold learning variant. Since most generative dimensionality reduction algorithms exploit the representer theorem for reproducing kernel Hilbert spaces, their computational costs grow at least quadratically in the number n of data points. Instead, we introduce a grid-based discretization approach which automatically scales just linearly in n. To circumvent the curse of dimensionality of full tensor product grids, we use the concept of sparse grids. Furthermore, in real-world applications, some embedding directions are usually more important than others, and it is reasonable to refine the underlying discretization space only in these directions. To this end, we employ a dimension-adaptive algorithm which is based on the ANOVA (analysis of variance) decomposition of a function. In particular, the reconstruction error is used to measure the quality of an embedding. As an application, we study large simulation data from an engineering application in the automotive industry (car crash simulation).

  2. Technical Report: Toward a Scalable Algorithm to Compute High-Dimensional Integrals of Arbitrary Functions

    International Nuclear Information System (INIS)

    Snyder, Abigail C.; Jiao, Yu

    2010-01-01

    Neutron experiments at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) frequently generate large amounts of data (on the order of 10⁶-10¹² data points). Hence, traditional data analysis tools run on a single CPU take too long to be practical, and scientists are unable to efficiently analyze all the data generated by experiments. Our goal is to develop a scalable algorithm to efficiently compute high-dimensional integrals of arbitrary functions. This algorithm can then be used to evaluate the four-dimensional integrals that arise as part of modeling intensity from the experiments at the SNS. Here, three different one-dimensional numerical integration solvers from the GNU Scientific Library were modified and implemented to solve four-dimensional integrals. The results of these solvers on a final integrand provided by scientists at the SNS can be compared to the results of other methods, such as quasi-Monte Carlo methods, computing the same integral. A parallelized version of the most efficient method would give scientists the opportunity to analyze all experimental data more effectively.
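    A toy example shows why nesting 1-D rules into a tensor product does not scale: the midpoint rule on [0,1]^4 already needs m^4 evaluations. The separable integrand below is a hypothetical stand-in, not the SNS intensity model:

```python
import numpy as np

def midpoint_grid_integral(f, m=8, d=4):
    # Tensor-product midpoint rule on [0,1]^d: m**d function evaluations,
    # so the cost explodes with dimension d -- the motivation for scalable
    # and quasi-Monte Carlo alternatives.
    pts = (np.arange(m) + 0.5) / m                   # 1-D midpoints
    grids = np.meshgrid(*([pts] * d), indexing="ij")  # full d-D grid
    vals = f(*grids)
    return float(vals.sum() / m**d)

f = lambda x1, x2, x3, x4: x1 * x2 * x3 * x4   # exact integral = (1/2)**4
est = midpoint_grid_integral(f)
print(est)   # 0.0625
```

    Already at m = 8 per axis this is 4,096 evaluations for d = 4; a Monte Carlo or quasi-Monte Carlo estimate needs a sample budget independent of d, which is why those methods appear as the comparison point above.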

  3. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.

  4. Construction of high-dimensional neural network potentials using environment-dependent atom pairs.

    Science.gov (United States)

    Jose, K V Jovan; Artrith, Nongnuch; Behler, Jörg

    2012-05-21

    An accurate determination of the potential energy is the crucial step in computer simulations of chemical processes, but using electronic structure methods on-the-fly in molecular dynamics (MD) is computationally too demanding for many systems. Constructing more efficient interatomic potentials becomes intricate with increasing dimensionality of the potential-energy surface (PES), and for numerous systems the accuracy that can be achieved is still not satisfying and far from the reliability of first-principles calculations. Feed-forward neural networks (NNs) have a very flexible functional form, and in recent years they have been shown to be an accurate tool to construct efficient PESs. High-dimensional NN potentials based on environment-dependent atomic energy contributions have been presented for a number of materials. Still, these potentials may be improved by a more detailed structural description, e.g., in the form of atom pairs, which directly reflect the atomic interactions and take the chemical environment into account. We present an implementation of an NN method based on atom pairs, and its accuracy and performance are compared to the atom-based NN approach using two very different systems, the methanol molecule and metallic copper. We find that both types of NN potentials provide an excellent description of both PESs, with the pair-based method yielding a slightly higher accuracy, making it a competitive alternative for addressing complex systems in MD simulations.

  5. Two-Sample Tests for High-Dimensional Linear Regression with an Application to Detecting Interactions.

    Science.gov (United States)

    Xia, Yin; Cai, Tianxi; Cai, T Tony

    2018-01-01

    Motivated by applications in genomics, we consider in this paper global and multiple testing for the comparisons of two high-dimensional linear regression models. A procedure for testing the equality of the two regression vectors globally is proposed and shown to be particularly powerful against sparse alternatives. We then introduce a multiple testing procedure for identifying unequal coordinates while controlling the false discovery rate and false discovery proportion. Theoretical justifications are provided to guarantee the validity of the proposed tests and optimality results are established under sparsity assumptions on the regression coefficients. The proposed testing procedures are easy to implement. Numerical properties of the procedures are investigated through simulation and data analysis. The results show that the proposed tests maintain the desired error rates under the null and have good power under the alternative at moderate sample sizes. The procedures are applied to the Framingham Offspring study to investigate the interactions between smoking and cardiovascular related genetic mutations important for an inflammation marker.

  6. Individual-based models for adaptive diversification in high-dimensional phenotype spaces.

    Science.gov (United States)

    Ispolatov, Iaroslav; Madhok, Vaibhav; Doebeli, Michael

    2016-02-07

    Most theories of evolutionary diversification are based on equilibrium assumptions: they are either based on optimality arguments involving static fitness landscapes, or they assume that populations first evolve to an equilibrium state before diversification occurs, as exemplified by the concept of evolutionary branching points in adaptive dynamics theory. Recent results indicate that adaptive dynamics may often not converge to equilibrium points and instead generate complicated trajectories if evolution takes place in high-dimensional phenotype spaces. Even though some analytical results on diversification in complex phenotype spaces are available, to study this problem in general we need to reconstruct individual-based models from the adaptive dynamics generating the non-equilibrium dynamics. Here we first provide a method to construct individual-based models such that they faithfully reproduce the given adaptive dynamics attractor without diversification. We then show that a propensity to diversify can be introduced by adding Gaussian competition terms that generate frequency dependence while still preserving the same adaptive dynamics. For sufficiently strong competition, the disruptive selection generated by frequency-dependence overcomes the directional evolution along the selection gradient and leads to diversification in phenotypic directions that are orthogonal to the selection gradient. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. A Comparison of Machine Learning Methods in a High-Dimensional Classification Problem

    Directory of Open Access Journals (Sweden)

    Zekić-Sušac Marijana

    2014-09-01

    Background: Large-dimensional data modelling often relies on variable reduction methods in the pre-processing and post-processing stages. However, such reduction usually provides less information and yields lower model accuracy. Objectives: The aim of this paper is to assess the high-dimensional classification problem of recognizing entrepreneurial intentions of students by machine learning methods. Methods/Approach: Four methods were tested on the same dataset: artificial neural networks, CART classification trees, support vector machines, and k-nearest neighbour, in order to compare their efficiency in terms of classification accuracy. The performance of each method was compared on ten subsamples in a 10-fold cross-validation procedure, computing the sensitivity and specificity of each model. Results: The artificial neural network model based on a multilayer perceptron yielded a higher classification rate than the models produced by the other methods. A pairwise t-test showed a statistically significant difference between the artificial neural network and the k-nearest neighbour model, while the differences among the other methods were not statistically significant. Conclusions: The tested machine learning methods are able to learn fast and achieve high classification accuracy. However, further improvement can be achieved by testing a few additional methodological refinements in machine learning methods.
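    The evaluation protocol, 10-fold cross-validation of a classifier, can be sketched with the simplest of the four compared methods (k-nearest neighbour) on synthetic two-class data; the data and parameters are illustrative, not the paper's student dataset:

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=5):
    # Brute-force k-nearest-neighbour majority vote (binary labels 0/1).
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(axis=-1)
    nn = np.argsort(d2, axis=1)[:, :k]
    return (ytr[nn].mean(axis=1) > 0.5).astype(int)

def cv10_accuracy(X, y, k=5, seed=0):
    # 10-fold cross-validation: hold out each fold in turn, train on the rest.
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), 10)
    accs = []
    for f in folds:
        mask = np.ones(len(y), bool)
        mask[f] = False
        accs.append(np.mean(knn_predict(X[mask], y[mask], X[f], k) == y[f]))
    return float(np.mean(accs))

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (60, 8)), rng.normal(3, 1, (60, 8))])
y = np.array([0] * 60 + [1] * 60)
acc = cv10_accuracy(X, y)
print(acc)   # well-separated classes give near-perfect accuracy
```

    The per-fold accuracies (and analogously sensitivity/specificity) are exactly the quantities a pairwise t-test would compare across methods.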

  8. Exploration of High-Dimensional Scalar Function for Nuclear Reactor Safety Analysis and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer; Michael Pernice; Robert Nourgaliev

    2013-05-01

    The next generation of methodologies for nuclear reactor Probabilistic Risk Assessment (PRA) explicitly accounts for the time element in modeling the probabilistic system evolution and uses numerical simulation tools to account for possible dependencies between failure events. The Monte-Carlo (MC) and the Dynamic Event Tree (DET) approaches belong to this new class of dynamic PRA methodologies. A challenge of dynamic PRA algorithms is the large amount of data they produce, which may be difficult to visualize and analyze in order to extract useful information. We present a software tool designed to address these challenges. We model a large-scale nuclear simulation dataset as a high-dimensional scalar function defined over a discrete sample of the domain. First, we provide structural analysis of such a function at multiple scales and provide insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis for users, where we help users differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. Our analysis is performed by exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations, and providing interactive visual interfaces to facilitate such explorations. We provide a user’s guide to our software tool by highlighting its analysis and visualization capabilities, along with a use case involving a dataset from a nuclear reactor safety simulation.

  9. High-dimensional neural network potentials for solvation: The case of protonated water clusters in helium

    Science.gov (United States)

    Schran, Christoph; Uhl, Felix; Behler, Jörg; Marx, Dominik

    2018-03-01

    The design of accurate helium-solute interaction potentials for the simulation of chemically complex molecules solvated in superfluid helium has long been a cumbersome task due to the rather weak but strongly anisotropic nature of the interactions. We show that this challenge can be met by using a combination of an effective pair potential for the He-He interactions and a flexible high-dimensional neural network potential (NNP) for describing the complex interaction between helium and the solute in a pairwise additive manner. This approach yields an excellent agreement with a mean absolute deviation as small as 0.04 kJ mol⁻¹ for the interaction energy between helium and both hydronium and Zundel cations compared with coupled cluster reference calculations with an energetically converged basis set. The construction and improvement of the potential can be performed in a highly automated way, which opens the door for applications to a variety of reactive molecules to study the effect of solvation on the solute as well as the solute-induced structuring of the solvent. Furthermore, we show that this NNP approach yields very convincing agreement with the coupled cluster reference for properties like many-body spatial and radial distribution functions. This holds for the microsolvation of the protonated water monomer and dimer by a few helium atoms up to their solvation in bulk helium as obtained from path integral simulations at about 1 K.

  10. Multi-Scale Factor Analysis of High-Dimensional Brain Signals

    KAUST Repository

    Ting, Chee-Ming

    2017-05-18

    In this paper, we develop an approach to modeling high-dimensional networks with a large number of nodes arranged in a hierarchical and modular structure. We propose a novel multi-scale factor analysis (MSFA) model which partitions the massive spatio-temporal data defined over the complex networks into a finite set of regional clusters. To achieve further dimension reduction, we represent the signals in each cluster by a small number of latent factors. The correlation matrix for all nodes in the network is approximated by lower-dimensional sub-structures derived from the cluster-specific factors. To estimate regional connectivity between numerous nodes (within each cluster), we apply principal components analysis (PCA) to produce factors which are derived as the optimal reconstruction of the observed signals under the squared loss. Then, we estimate global connectivity (between clusters or sub-networks) based on the factors across regions using the RV-coefficient as the cross-dependence measure. This gives a reliable and computationally efficient multi-scale analysis of both regional and global dependencies of the large networks. The proposed novel approach is applied to estimate brain connectivity networks using functional magnetic resonance imaging (fMRI) data. Results on resting-state fMRI reveal interesting modular and hierarchical organization of human brain networks during rest.
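    The RV-coefficient used as the cross-dependence measure between cluster-level factors has a compact closed form; a NumPy sketch on synthetic factor matrices (the data are illustrative):

```python
import numpy as np

def rv_coefficient(X, Y):
    # RV coefficient: a matrix correlation between two sets of variables
    # observed on the same n samples.  Equals 1 when the two configurations
    # coincide, and is bounded between 0 and 1.
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    Sxy = Xc.T @ Yc
    Sxx = Xc.T @ Xc
    Syy = Yc.T @ Yc
    return float(np.trace(Sxy @ Sxy.T)
                 / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy)))

rng = np.random.default_rng(6)
F1 = rng.normal(size=(100, 3))                                    # factors, region 1
F2 = F1 @ rng.normal(size=(3, 4)) + 0.1 * rng.normal(size=(100, 4))  # related region
r_self, r_cross = rv_coefficient(F1, F1), rv_coefficient(F1, F2)
print(r_self, r_cross)
```

    Unlike an ordinary correlation, the RV coefficient compares whole factor blocks of possibly different widths, which is what makes it suitable for between-cluster dependence.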

  11. Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Jiang, Jiancheng; Tong, Xin

    We propose a high dimensional classification method that involves nonparametric feature augmentation. Knowing that marginal density ratios are the most powerful univariate classifiers, we use the ratio estimates to transform the original feature measurements. Subsequently, penalized logistic regression is invoked, taking as input the newly transformed or augmented features. This procedure trains models equipped with local complexity and global simplicity, thereby avoiding the curse of dimensionality while creating a flexible nonlinear decision boundary. The resulting method is called Feature Augmentation via Nonparametrics and Selection (FANS). We motivate FANS by generalizing the Naive Bayes model, writing the log ratio of joint densities as a linear combination of those of marginal densities. It is related to generalized additive models, but has better interpretability and computability. Risk bounds are developed for FANS. In numerical analysis, FANS is compared with competing methods, so as to provide a guideline on its best application domain. Real data analysis demonstrates that FANS performs very competitively on benchmark email spam and gene expression data sets. Moreover, FANS is implemented by an extremely fast algorithm through parallel computing.
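    A minimal sketch of the two FANS steps, with Gaussian marginal density estimates standing in for the paper's nonparametric ones and plain L2-penalized gradient descent standing in for the paper's penalized logistic fit (all data and parameters below are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-class data: class 1 is shifted in the first 2 of 20 features.
n, p = 400, 20
y = rng.integers(0, 2, n)
X = rng.standard_normal((n, p))
X[y == 1, :2] += 1.5

def gaussian_ratio_transform(X, y):
    """Per-feature log density ratio log g1(x)/g0(x); Gaussian marginals
    are a stand-in for the paper's nonparametric density estimates."""
    Z = np.empty_like(X)
    for j in range(X.shape[1]):
        m0, s0 = X[y == 0, j].mean(), X[y == 0, j].std() + 1e-9
        m1, s1 = X[y == 1, j].mean(), X[y == 1, j].std() + 1e-9
        x = X[:, j]
        Z[:, j] = (np.log(s0 / s1)
                   - (x - m1) ** 2 / (2 * s1 ** 2)
                   + (x - m0) ** 2 / (2 * s0 ** 2))
    return Z

def logistic_l2(Z, y, lam=1.0, lr=0.1, steps=500):
    """L2-penalized logistic regression via gradient descent (FANS proper
    uses sparsity-inducing penalties so that features are also selected)."""
    w = np.zeros(Z.shape[1])
    for _ in range(steps):
        pr = 1 / (1 + np.exp(-Z @ w))
        w -= lr * ((Z.T @ (pr - y)) + lam * w) / len(y)
    return w

Z = gaussian_ratio_transform(X, y)   # local complexity: nonlinear per-feature map
w = logistic_l2(Z, y)                # global simplicity: linear model on Z
acc = ((1 / (1 + np.exp(-Z @ w)) > 0.5) == y).mean()  # training accuracy
```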

  12. Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity.

    Science.gov (United States)

    Chang, Jinyuan; Zheng, Chao; Zhou, Wen-Xin; Zhou, Wen

    2017-12-01

    In this article, we study the problem of testing the mean vectors of high dimensional data in both one-sample and two-sample cases. The proposed testing procedures employ maximum-type statistics and parametric bootstrap techniques to compute the critical values. Unlike existing tests that rely heavily on structural conditions on the unknown covariance matrices, the proposed tests allow general covariance structures of the data and therefore enjoy a wide scope of applicability in practice. To enhance the powers of the tests against sparse alternatives, we further propose two-step procedures with a preliminary feature screening step. Theoretical properties of the proposed tests are investigated. Through extensive numerical experiments on synthetic data sets and a human acute lymphoblastic leukemia gene expression data set, we illustrate the performance of the new tests and how they may provide assistance in detecting disease-associated gene sets. The proposed methods have been implemented in the R package HDtest and are available on CRAN. © 2017, The International Biometric Society.
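    The core idea, a maximum-type statistic whose critical value comes from a multiplier (parametric) bootstrap with no structural assumption on the covariance, can be sketched in the one-sample case. The simulated data are invented and the HDtest package's actual implementation differs in detail:

```python
import numpy as np

rng = np.random.default_rng(2)

def max_type_test(X, n_boot=500):
    """One-sample test of H0: mu = 0 using the maximum studentized mean;
    critical values come from a Gaussian multiplier bootstrap, so no
    structural assumption on the covariance is needed."""
    n, p = X.shape
    xbar, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
    stat = np.max(np.abs(np.sqrt(n) * xbar / sd))
    Xc = X - xbar
    boots = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.standard_normal(n)              # Gaussian multipliers
        boots[b] = np.max(np.abs(e @ Xc) / (np.sqrt(n) * sd))
    return stat, (boots >= stat).mean()         # statistic, bootstrap p-value

X0 = rng.standard_normal((100, 50))             # null data: mu = 0
_, p_null = max_type_test(X0)
X1 = X0.copy()
X1[:, 0] += 0.8                                 # one sparse mean shift
_, p_alt = max_type_test(X1)                    # tiny p-value: shift detected
```

The max statistic is naturally powerful against sparse alternatives such as the single shifted coordinate above, which is the regime the paper's screening step further targets.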

  13. Multi-SOM: an Algorithm for High-Dimensional, Small Size Datasets

    Directory of Open Access Journals (Sweden)

    Shen Lu

    2013-04-01

    Full Text Available Since it takes time to do experiments in bioinformatics, biological datasets are sometimes small but of high dimensionality. From probability theory, in order to discover knowledge from a set of data, we have to have a sufficient number of samples; otherwise, the error bounds can become too large to be useful. For the SOM (Self-Organizing Map) algorithm, the initial map is based on the training data. In order to avoid the bias caused by insufficient training data, in this paper we present an algorithm called Multi-SOM. Multi-SOM builds a number of small self-organizing maps instead of just one big map. Bayesian decision theory is used to make the final decision among similar neurons on different maps. In this way, we can better ensure a truly random initial weight vector set, the map size becomes less of a consideration, and errors tend to average out. In our experiments on microarray datasets, which are highly dense data composed of genetics-related information, the precision of Multi-SOMs is 10.58% greater than that of SOMs, and its recall is 11.07% greater. Thus, the Multi-SOMs algorithm is practical.
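    A toy version of the idea: several small SOMs trained on the same data, with a plain majority vote across maps standing in for the paper's Bayesian decision step. All sizes, data, and hyper-parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

def train_som(X, grid=3, iters=400):
    """Minimal SOM: a grid x grid map trained with a shrinking
    neighborhood and learning rate."""
    W = rng.standard_normal((grid * grid, X.shape[1])) * 0.1
    coords = np.array([(i, j) for i in range(grid) for j in range(grid)], dtype=float)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        bmu = int(np.argmin(((W - x) ** 2).sum(axis=1)))
        sigma = 1.5 * (1 - t / iters) + 0.3
        h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        W += 0.5 * (1 - t / iters) * h[:, None] * (x - W)
    return W

def bmu_indices(W, X):
    return np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2), axis=1)

def neuron_labels(W, X, y):
    """Label each neuron with the majority class of its assigned samples."""
    bmus = bmu_indices(W, X)
    lab = np.full(len(W), 0.5)          # empty neurons stay neutral
    for k in range(len(W)):
        hits = y[bmus == k]
        if len(hits):
            lab[k] = hits.mean().round()
    return lab

# Small, well-separated two-class dataset (5 maps vote on each point).
X = rng.standard_normal((60, 5))
y = (np.arange(60) % 2).astype(float)
X[y == 1] += 2.0
models = [train_som(X) for _ in range(5)]
labels = [neuron_labels(W, X, y) for W in models]
votes = np.mean([lab[bmu_indices(W, X)] for W, lab in zip(models, labels)], axis=0)
acc = ((votes > 0.5) == (y == 1)).mean()
```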

  14. Digital asset ecosystems rethinking crowds and cloud

    CERN Document Server

    Blanke, Tobias

    2014-01-01

    Digital asset management is undergoing a fundamental transformation. Near universal availability of high-quality web-based assets makes it important to pay attention to the new world of digital ecosystems and what it means for managing, using and publishing digital assets. The Ecosystem of Digital Assets reflects on these developments and what the emerging 'web of things' could mean for digital assets. The book is structured into three parts, each covering an important aspect of digital assets. Part one introduces the emerging ecosystems of digital assets. Part two examines digital asset manag

  15. Review of Gaspar Aguilar, «La comedia segunda de Los agravios perdonados», ed. C. George Peale, Santa Barbara, University of California, Publications of «eHumanista», 2016, 128 pp., digital edition

    Directory of Open Access Journals (Sweden)

    Ariel Núñez Sepúlveda

    2017-11-01

    Full Text Available Review of Gaspar Aguilar, La comedia segunda de Los agravios perdonados, ed. C. George Peale, Santa Barbara, University of California, Publications of eHumanista, 2016, 128 pp., digital edition

  16. The Global University Press

    Science.gov (United States)

    Dougherty, Peter J.

    2012-01-01

    The modern world's understanding of the American university press has long been shaped by university-press books. American university-press books are good international advertisements for the universities whose logos grace their spines. The growth of transnational scholarship and the expansion of digital communications networks are converging in ways…

  17. HDclassif : An R Package for Model-Based Clustering and Discriminant Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Laurent Berge

    2012-01-01

    Full Text Available This paper presents the R package HDclassif, which is devoted to the clustering and discriminant analysis of high-dimensional data. The classification methods proposed in the package result from a new parametrization of the Gaussian mixture model which combines the ideas of dimension reduction and model constraints on the covariance matrices. The supervised classification method using this parametrization is called high dimensional discriminant analysis (HDDA). In a similar manner, the associated clustering method is called high dimensional data clustering (HDDC) and uses the expectation-maximization algorithm for inference. In order to correctly fit the data, both methods estimate the specific subspace and the intrinsic dimension of the groups. Due to the constraints on the covariance matrices, the number of parameters to estimate is significantly lower than in other model-based methods, which allows the methods to be stable and efficient in high dimensions. Two introductory examples illustrated with R code allow the user to discover the hdda and hddc functions. Experiments on simulated and real datasets also compare HDDC and HDDA with existing classification methods on high-dimensional datasets. HDclassif is free software distributed under the General Public License, as part of the R software project.

  18. Estimate of the largest Lyapunov characteristic exponent of a high dimensional atmospheric global circulation model: a sensitivity analysis

    International Nuclear Information System (INIS)

    Guerrieri, A.

    2009-01-01

    In this report the largest Lyapunov characteristic exponent of a high dimensional atmospheric global circulation model of intermediate complexity has been estimated numerically. A sensitivity analysis has been carried out by varying the equator-to-pole temperature difference, the space resolution and the value of some parameters employed by the model. Chaotic and non-chaotic regimes of circulation have been found. [it]
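    The report's estimate cannot be reproduced without the circulation model itself, but the standard two-trajectory (Benettin) recipe for the largest Lyapunov exponent can be illustrated on the Lorenz system, whose largest exponent is known to be about 0.9:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4(f, s, dt):
    k1 = f(s)
    k2 = f(s + dt / 2 * k1)
    k3 = f(s + dt / 2 * k2)
    k4 = f(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def largest_lyapunov(steps=20000, dt=0.01, d0=1e-8):
    """Benettin's method: evolve a reference and a slightly perturbed
    trajectory, renormalising their separation at every step and averaging
    the logarithmic growth rate."""
    a = np.array([1.0, 1.0, 20.0])
    for _ in range(1000):                 # discard the transient
        a = rk4(lorenz, a, dt)
    b = a + np.array([d0, 0.0, 0.0])
    total = 0.0
    for _ in range(steps):
        a = rk4(lorenz, a, dt)
        b = rk4(lorenz, b, dt)
        d = np.linalg.norm(b - a)
        total += np.log(d / d0)
        b = a + (b - a) * (d0 / d)        # renormalise the separation
    return total / (steps * dt)

lam = largest_lyapunov()                  # ~0.9 for the classic parameters
```

A positive estimate signals a chaotic regime; the sensitivity analysis in the report amounts to repeating such an estimate while varying the model's forcing and resolution.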

  19. Gaussian processes with built-in dimensionality reduction: Applications to high-dimensional uncertainty propagation

    International Nuclear Information System (INIS)

    Tripathy, Rohit; Bilionis, Ilias; Gonzalez, Marcial

    2016-01-01

    Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data.
To train the
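    For contrast with the paper's gradient-free construction, the classic gradient-based active subspace it improves upon can be sketched in a few lines. The response below is a toy function chosen so the true subspace is known; in practice simulator gradients are rarely available, which is exactly the paper's motivation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy response f(x) = sin(w . x): a one-dimensional active subspace along w.
d = 10
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
grad_f = lambda x: np.cos(w @ x) * w       # gradient of f

# Classic AS: eigendecompose C = E[grad f grad f^T] from Monte Carlo samples.
G = np.array([grad_f(rng.standard_normal(d)) for _ in range(500)])
C = G.T @ G / len(G)
eigvals, eigvecs = np.linalg.eigh(C)       # ascending eigenvalues
active_dir = eigvecs[:, -1]                # dominant eigenvector spans the AS
alignment = abs(active_dir @ w)            # ~1: the active direction is recovered
```

A sharp drop in the eigenvalue spectrum of C indicates the dimension of the active subspace; the paper's GP approach instead learns the projection matrix as a covariance hyper-parameter, needing only (possibly noisy) function values.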

  20. Gaussian processes with built-in dimensionality reduction: Applications to high-dimensional uncertainty propagation

    Science.gov (United States)

    Tripathy, Rohit; Bilionis, Ilias; Gonzalez, Marcial

    2016-09-01

    Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data.
To train the

  1. Gaussian processes with built-in dimensionality reduction: Applications to high-dimensional uncertainty propagation

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, Rohit, E-mail: rtripath@purdue.edu; Bilionis, Ilias, E-mail: ibilion@purdue.edu; Gonzalez, Marcial, E-mail: marcial-gonzalez@purdue.edu

    2016-09-15

    Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data.
To train the

  2. Creative Digital Media Practices

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth

    The presentation reviews the interplay of dialogic (Bakhtin, 1981) and multimodal theories on media production practices, with attention to visual communication (Kress and van Leeuwen, 2001, 2006). This theoretical approach aids in reflecting on digital media practices as novel (new) sign systems (..., 2006, 2009), machinima (machine + cinema + anime, real-time animation captured in games etc.), and the digital storytelling movement. A dialogic perspective on the diversity of digital media practices opens up for understanding the complex evolution of language on socio-historical, cultural ... develop a collaborative digital storytelling showcase for their own digital stories about Roskilde University. This course is intended to bring up reflections on the wider phenomenon of contemporary media practices, such as: YouTube, DIY (do-it-yourself) filmmaking or homemade, garage cinema (Jenkins ...

  3. Probabilistic numerical methods for high-dimensional stochastic control and valuation problems on electricity markets

    International Nuclear Information System (INIS)

    Langrene, Nicolas

    2014-01-01

    This thesis deals with the numerical solution of general stochastic control problems, with notable applications to electricity markets. We first propose a structural model for the price of electricity, allowing for price spikes well above the marginal fuel price under strained market conditions. This model allows one to price and partially hedge electricity derivatives, using fuel forwards as hedging instruments. Then, we propose an algorithm, which combines Monte-Carlo simulations with local basis regressions, to solve general optimal switching problems. A comprehensive rate of convergence of the method is provided. Moreover, we manage to make the algorithm parsimonious in memory (and hence suitable for high dimensional problems) by generalizing to this framework a memory reduction method that avoids the storage of the sample paths. We illustrate this on the problem of investments in new power plants (our structural power price model allowing the new plants to impact the price of electricity). Finally, we study more general stochastic control problems (the control can be continuous and impact the drift and volatility of the state process), the solutions of which belong to the class of fully nonlinear Hamilton-Jacobi-Bellman equations, and can be handled via constrained Backward Stochastic Differential Equations, for which we develop a backward algorithm based on control randomization and parametric optimizations. A rate of convergence between the constrained BSDE and its discrete version is provided, as well as an estimate of the optimal control. This algorithm is then applied to the problem of super replication of options under uncertain volatilities (and correlations). (author)
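    The Monte-Carlo-plus-regression idea is easiest to see in its classic special case, Longstaff-Schwartz regression for a Bermudan put: a single stopping decision rather than the thesis' general switching problem, with a global polynomial basis standing in for local basis regressions. All market parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Bermudan put priced by regression Monte Carlo (Longstaff-Schwartz).
S0, K, r, sig, T, steps, paths = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20000
dt = T / steps
incr = (r - sig ** 2 / 2) * dt + sig * np.sqrt(dt) * rng.standard_normal((steps, paths))
S = S0 * np.exp(np.cumsum(incr, axis=0))      # simulated price paths

cash = np.maximum(K - S[-1], 0.0)             # exercise value at maturity
for t in range(steps - 2, -1, -1):
    cash *= np.exp(-r * dt)                   # discount one step back
    itm = K - S[t] > 0                        # regress on in-the-money paths only
    if itm.sum() > 10:
        A = np.vander(S[t, itm] / K, 4)       # cubic polynomial basis
        cont = A @ np.linalg.lstsq(A, cash[itm], rcond=None)[0]
        ex = (K - S[t, itm]) > cont           # exercise beats continuation value
        idx = np.where(itm)[0][ex]
        cash[idx] = K - S[t, idx]
price = np.exp(-r * dt) * cash.mean()
```

The backward regression loop is exactly what becomes memory-hungry in high dimensions, which is what the thesis' memory reduction method (avoiding storage of the sample paths) addresses.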

  4. Evaluation of a new high-dimensional miRNA profiling platform

    Directory of Open Access Journals (Sweden)

    Lamblin Anne-Francoise

    2009-08-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are a class of approximately 22 nucleotide long, widely expressed RNA molecules that play important regulatory roles in eukaryotes. To investigate miRNA function, it is essential that methods to quantify their expression levels be available. Methods We evaluated a new miRNA profiling platform that utilizes Illumina's existing robust DASL chemistry as the basis for the assay. Using total RNA from five colon cancer patients and four cell lines, we evaluated the reproducibility of miRNA expression levels across replicates and with varying amounts of input RNA. The beta test version comprised 735 miRNA targets of Illumina's miRNA profiling application. Results Reproducibility between sample replicates within a plate was good (Spearman's correlation 0.91 to 0.98), as was the plate-to-plate reproducibility for replicates run on different days (Spearman's correlation 0.84 to 0.98). To determine whether quality data could be obtained from a broad range of input RNA, data obtained from amounts ranging from 25 ng to 800 ng were compared to those obtained at 200 ng. No effect across the range of RNA input was observed. Conclusion These results indicate that very small amounts of starting material are sufficient to allow sensitive miRNA profiling using the Illumina miRNA high-dimensional platform. Nonlinear biases were observed between replicates, indicating the need for abundance-dependent normalization. Overall, the performance characteristics of the Illumina miRNA profiling system were excellent.
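    Spearman's correlation, the reproducibility measure quoted above, is simply the Pearson correlation of the ranks, which makes it robust to the nonlinear biases the study observed. A dependency-free sketch (the expression values below are invented):

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with ties assigned average ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1                      # extend over a run of ties
            avg = (i + j) / 2 + 1           # average rank for the run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Two mock replicate expression profiles with the same rank order.
rep1 = [5.1, 3.2, 8.8, 1.0, 7.4, 2.2]
rep2 = [5.0, 3.5, 9.1, 1.1, 7.0, 2.0]
rho = spearman(rep1, rep2)
```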

  5. Multivariate linear regression of high-dimensional fMRI data with multiple target variables.

    Science.gov (United States)

    Valente, Giancarlo; Castellanos, Agustin Lage; Vanacore, Gianluca; Formisano, Elia

    2014-05-01

    Multivariate regression is increasingly used to study the relation between fMRI spatial activation patterns and experimental stimuli or behavioral ratings. With linear models, informative brain locations are identified by mapping the model coefficients. This is a central aspect in neuroimaging, as it provides the sought-after link between the activity of neuronal populations and subject's perception, cognition or behavior. Here, we show that mapping of informative brain locations using multivariate linear regression (MLR) may lead to incorrect conclusions and interpretations. MLR algorithms for high dimensional data are designed to deal with targets (stimuli or behavioral ratings, in fMRI) separately, and the predictive map of a model integrates information deriving from both neural activity patterns and experimental design. Not accounting explicitly for the presence of other targets whose associated activity spatially overlaps with the one of interest may lead to predictive maps of troublesome interpretation. We propose a new model that can correctly identify the spatial patterns associated with a target while achieving good generalization. For each target, the training is based on an augmented dataset, which includes all remaining targets. The estimation on such datasets produces both maps and interaction coefficients, which are then used to generalize. The proposed formulation is independent of the regression algorithm employed. We validate this model on simulated fMRI data and on a publicly available dataset. Results indicate that our method achieves high spatial sensitivity and good generalization and that it helps disentangle specific neural effects from interaction with predictive maps associated with other targets. Copyright © 2013 Wiley Periodicals, Inc.

  6. Uncertainty quantification in transcranial magnetic stimulation via high-dimensional model representation.

    Science.gov (United States)

    Gomez, Luis J; Yücel, Abdulkadir C; Hernandez-Garcia, Luis; Taylor, Stephan F; Michielssen, Eric

    2015-01-01

    A computational framework for uncertainty quantification in transcranial magnetic stimulation (TMS) is presented. The framework leverages high-dimensional model representations (HDMRs), which approximate observables (i.e., quantities of interest such as electric (E) fields induced inside targeted cortical regions) via series of iteratively constructed component functions involving only the most significant random variables (i.e., parameters that characterize the uncertainty in a TMS setup such as the position and orientation of TMS coils, as well as the size, shape, and conductivity of the head tissue). The component functions of HDMR expansions are approximated via a multielement probabilistic collocation (ME-PC) method. While approximating each component function, a quasi-static finite-difference simulator is used to compute observables at integration/collocation points dictated by the ME-PC method. The proposed framework requires far fewer simulations than traditional Monte Carlo methods for providing highly accurate statistical information (e.g., the mean and standard deviation) about the observables. The efficiency and accuracy of the proposed framework are demonstrated via its application to the statistical characterization of E-fields generated by TMS inside cortical regions of an MRI-derived realistic head model. Numerical results show that while uncertainties in tissue conductivities have negligible effects on TMS operation, variations in coil position/orientation and brain size significantly affect the induced E-fields. Our numerical results have several implications for the use of TMS during depression therapy: 1) uncertainty in the coil position and orientation may reduce the response rates of patients; 2) practitioners should favor targets on the crest of a gyrus to obtain maximal stimulation; and 3) an increasing scalp-to-cortex distance reduces the magnitude of E-fields on the surface and inside the cortex.
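    A first-order cut-HDMR expansion, the simplest member of the family of representations the framework builds adaptively (the actual framework adds higher-order component functions only for the significant random variables), can be sketched as follows; the observable f below is a toy stand-in for the simulated E-field:

```python
import math

def f(x):
    """Toy observable with a weak x0-x2 interaction."""
    return x[0] + 2 * x[1] ** 2 + 0.1 * x[0] * x[2] + math.sin(x[3])

def hdmr1(f, c, x):
    """First-order cut-HDMR: f(x) ~ f(c) + sum_i [f(c with x_i set) - f(c)].
    Exact whenever the response has no interactions between inputs."""
    fc = f(c)
    total = fc
    for i in range(len(c)):
        xi = list(c)
        xi[i] = x[i]                  # vary one input at a time along the cut
        total += f(xi) - fc
    return total

c = [0.0, 0.0, 0.0, 0.0]              # cut centre (nominal parameter values)
x = [0.3, -0.5, 0.8, 1.0]
approx, exact = hdmr1(f, c, x), f(x)
err = abs(approx - exact)             # only the small x0-x2 interaction is missed
```

Each component function needs only one-dimensional sweeps of the simulator, which is why HDMR-based UQ requires far fewer runs than brute-force Monte Carlo over all inputs jointly.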

  7. An adaptive optimal ensemble classifier via bagging and rank aggregation with applications to high dimensional data

    Directory of Open Access Journals (Sweden)

    Datta Susmita

    2010-08-01

    Full Text Available Abstract Background Generally speaking, different classifiers tend to work well for certain types of data and, conversely, it is usually not known a priori which algorithm will be optimal in any given classification application. In addition, for most classification problems, selecting the best performing classification algorithm amongst a number of competing algorithms is a difficult task for various reasons. For example, the order of performance may depend on the performance measure employed for such a comparison. In this work, we present a novel adaptive ensemble classifier constructed by combining bagging and rank aggregation that is capable of adaptively changing its performance depending on the type of data that is being classified. The attractive feature of the proposed classifier is its multi-objective nature: the classification results can be simultaneously optimized with respect to several performance measures, for example, accuracy, sensitivity and specificity. We also show that our somewhat complex strategy has better predictive performance, as judged on test samples, than a more naive approach that attempts to directly identify the optimal classifier based on the training data performances of the individual classifiers. Results We illustrate the proposed method with two simulated and two real-data examples. In all cases, the ensemble classifier performs at the level of the best individual classifier in the ensemble, or better. Conclusions For complex high-dimensional datasets resulting from present day high-throughput experiments, it may be wise to consider a number of classification algorithms combined with dimension reduction techniques rather than a fixed standard algorithm set a priori.
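    Aggregating ranks over several performance measures can be sketched with a plain Borda count; the paper uses a weighted rank aggregation, and the classifier names and scores below are invented for illustration:

```python
# Hypothetical performance of three candidate classifiers on three measures.
scores = {
    "svm":      {"accuracy": 0.91, "sensitivity": 0.85, "specificity": 0.95},
    "rf":       {"accuracy": 0.93, "sensitivity": 0.90, "specificity": 0.92},
    "logistic": {"accuracy": 0.88, "sensitivity": 0.92, "specificity": 0.84},
}

# Borda count: each measure ranks the classifiers (0 = best);
# the consensus winner minimises the summed ranks across measures.
measures = ["accuracy", "sensitivity", "specificity"]
borda = {clf: 0 for clf in scores}
for m in measures:
    ranked = sorted(scores, key=lambda c: scores[c][m], reverse=True)
    for rank, clf in enumerate(ranked):
        borda[clf] += rank
best = min(borda, key=borda.get)
```

Here no classifier wins on every measure, yet the aggregation still yields a single multi-objective choice, which is the point of the paper's ensemble construction.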

  8. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Science.gov (United States)

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher

  9. AucPR: an AUC-based approach using penalized regression for disease prediction with high-dimensional omics data.

    Science.gov (United States)

    Yu, Wenbao; Park, Taesung

    2014-01-01

    It is common to seek an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works based on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of the AUC, with no work using a parametric AUC-based approach for high-dimensional data. We propose an AUC-based approach using penalized regression (AucPR), which is a parametric method for obtaining a linear combination of markers that maximizes the AUC. To obtain the AUC maximizer in a high-dimensional context, we transform a classical parametric AUC maximizer, which is used in a low-dimensional context, into a regression framework and thus apply the penalized regression approach directly. Two kinds of penalization, lasso and elastic net, are considered. The parametric approach avoids some of the difficulties of a conventional non-parametric AUC-based approach, such as the lack of an appropriate concave objective function and the need for a prudent choice of the smoothing parameter. We apply the proposed AucPR to gene selection and classification using four real microarray and synthetic data sets. Through numerical studies, AucPR is shown to perform better than penalized logistic regression and the non-parametric AUC-based method, in the sense of AUC and sensitivity for a given specificity, particularly when there are many correlated genes. We propose a powerful, parametric and easily implementable linear classifier, AucPR, for gene selection and disease prediction with high-dimensional data. AucPR is recommended for its good prediction performance. Besides gene expression microarray data, AucPR can be applied to other types of high-dimensional omics data, such as miRNA and protein data.
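    The quantity being maximized has a simple empirical form, the Mann-Whitney statistic: the fraction of (case, control) pairs that the marker combination orders correctly. A dependency-free sketch with invented scores and labels:

```python
def auc(scores, labels):
    """Empirical AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ordered correctly; ties count as 1/2."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical combined marker scores and disease labels.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,    0,   1,   0]
a = auc(scores, labels)
```

The pairwise-indicator form above is non-smooth in the coefficients, which is why non-parametric approaches must smooth it; AucPR sidesteps this by recasting a parametric AUC maximizer as a penalized regression.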

  10. Development of Higher Education in Albania: The Case of the Public University Libraries in Efforts to Build Digital and Electronic Services for the Academic Community

    Directory of Open Access Journals (Sweden)

    Erena Haska

    2013-01-01

    This paper offers conclusions that contribute to: (a) a national information and communication technology policy for university libraries; and (b) the creation of an integrated system for the management and transmission of knowledge at the national level for all Albanian university libraries.

  11. The Digital Marmor Parium

    OpenAIRE

    Berti, Monica

    2017-01-01

    The Digital Marmor Parium is a project of the Alexander von Humboldt Chair of Digital Humanities at the University of Leipzig (http://www.dh.uni-leipzig.de/wo/dmp). The aim of this work is to produce a new digital edition of the so-called Marmor Parium (Parian Marble), which is a Hellenistic chronicle on a marble slab coming from the Greek island of Paros. The importance of the document is due to the fact that it preserves a Greek chronology (1581/80-299/98 BC) with a list of kings and archon...

  12. The establishment phases of ETD program in a brand new University

    KAUST Repository

    Baessa, Mohamed A.

    2014-07-24

    For many universities, ETD is a combination of digitized documents and digitally born documents. But at brand new universities like King Abdullah University of Science and Technology (KAUST), or more specifically for libraries that were born digital, est

  13. Integrating high dimensional bi-directional parsing models for gene mention tagging.

    Science.gov (United States)

    Hsu, Chun-Nan; Chang, Yu-Ming; Kuo, Cheng-Ju; Lin, Yu-Shi; Huang, Han-Shen; Chung, I-Fang

    2008-07-01

    Tagging gene and gene product mentions in scientific text is an important initial step of literature mining. In this article, we describe in detail our gene mention tagger that participated in the BioCreative 2 challenge and analyze what contributes to its good performance. Our tagger is based on the conditional random fields model (CRF), the most prevalent method for the gene mention tagging task in BioCreative 2. Our tagger is interesting because it accomplished the highest F-scores among CRF-based methods and the second highest overall. Moreover, we obtained our results mostly by applying open source packages, making it easy to duplicate our results. We first describe in detail how we developed our CRF-based tagger. We designed a very high dimensional feature set that includes most of the information that may be relevant. We trained bi-directional CRF models with the same set of features, one applying forward parsing and the other backward, and integrated the two models based on the output scores and dictionary filtering. One of the most prominent factors that contributes to the good performance of our tagger is the integration of an additional backward parsing model. However, from the definition of CRF, it appears that a CRF model is symmetric and bi-directional parsing models should produce the same results. We show that, due to different feature settings, a CRF model can be asymmetric, and the feature setting for our tagger in BioCreative 2 not only produces different results but also gives backward parsing models a slight but consistent advantage over the forward parsing model. To fully explore the potential of integrating bi-directional parsing models, we applied different asymmetric feature settings to generate many bi-directional parsing models and integrated them based on the output scores. Experimental results show that this integrated model can achieve an even higher F-score solely based on the training corpus for gene mention tagging. Data sets, programs and an on-line service of our gene

  14. Greedy algorithms for high-dimensional non-symmetric linear problems***

    Directory of Open Access Journals (Sweden)

    Cancès E.

    2013-12-01

    Full Text Available In this article, we present a family of numerical approaches to solve high-dimensional linear non-symmetric problems. The principle of these methods is to approximate a function which depends on a large number of variates by a sum of tensor product functions, each term of which is iteratively computed via a greedy algorithm. There exists a good theoretical framework for these methods in the case of (linear and nonlinear) symmetric elliptic problems. However, the convergence results are no longer valid as soon as the problems under consideration are not symmetric. We present here a review of the main algorithms proposed in the literature to circumvent this difficulty, together with some new approaches. The theoretical convergence results and the practical implementation of these algorithms are discussed. Their behavior is illustrated through some numerical examples.
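    The greedy principle, adding one rank-one tensor-product term at a time fitted to the current residual, can be illustrated in the simplest two-variable (matrix) case. The sketch below uses alternating least squares for each term and is only a caricature of the methods reviewed in the article; the test function and term count are arbitrary.

```python
import numpy as np

def greedy_rank_one(F, n_terms=5, n_power=50):
    """Greedily approximate matrix F (a bivariate function on a grid)
    by a sum of tensor products r_k (x) s_k.  Each term is the best
    rank-one fit to the current residual, found by alternating least
    squares (equivalent to a power iteration on the residual)."""
    R = F.copy()
    terms = []
    for _ in range(n_terms):
        s = np.random.default_rng(0).standard_normal(F.shape[1])
        for _ in range(n_power):
            r = R @ s / (s @ s)      # best r given s
            s = R.T @ r / (r @ r)    # best s given r
        terms.append((r, s))
        R = R - np.outer(r, s)       # deflate the residual
    return terms, R

x = np.linspace(0, 1, 40)
F = np.exp(-np.subtract.outer(x, x) ** 2)   # smooth kernel: nearly low rank
terms, residual = greedy_rank_one(F, n_terms=5)
rel_err = np.linalg.norm(residual) / np.linalg.norm(F)
```

    For a smooth symmetric kernel like this, a handful of greedy terms already gives a small relative residual; the article's point is that this behavior is much harder to guarantee for non-symmetric problems.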

  15. Digital radiography

    International Nuclear Information System (INIS)

    Brody, W.R.

    1984-01-01

    Digital Radiography begins with an orderly introduction to the fundamental concepts of digital imaging. The entire X-ray digital imaging system is described, from an overall characterization of image quality to the specific components required for a digital radiographic system. Because subtraction is central to digital radiographic systems, the author details the use of various subtraction methods for image enhancement. Complex concepts are illustrated with numerous examples and presented in terms that can readily be understood by physicians without an advanced mathematics background. The second part of the book discusses implementations and applications of digital imaging systems based on area and scanned detector technologies. This section includes thorough coverage of digital fluoroscopy, scanned projection radiography, and film-based digital imaging systems, and features a state-of-the-art synopsis of the applications of digital subtraction angiography. The book concludes with a timely assessment of anticipated technological advances.

  16. Digital Culture and Digital Library

    Directory of Open Access Journals (Sweden)

    Yalçın Yalçınkaya

    2016-12-01

    Full Text Available In this study, digital culture and the digital library, which have a vital connection with each other, are examined together. The content of the research consists of the interaction of the concepts of culture, information, digital culture, intellectual technologies, and the digital library. The study is an introductory work on the integrity of digital culture and digital library theories, and its purpose is to emphasize the relation between the two at the intersection of the subjects examined. The perspective of the study is based on examining the literature and on analytical evaluation of both fields (digital culture and the digital library). Within this context, the methodology of the study is essentially descriptive, transmitting and synthesizing findings distributed across the field of research. According to the findings, digital culture is an inclusive term that describes the effects of intellectual technologies in the field of information and communication. Information becomes a form of energy, and its spectrum expands through digital culture. In this context, the digital library appears as a new living space in a new environment. In essence, the digital library is information-oriented; it has intellectual technology support and a digital platform; it is in a digital format; it combines information resources and tools in relationship, communication, and cooperation through connectedness; and it is the dynamic face of digital culture, independent of time and space. The study concludes that digital libraries are active and effective in the formation of global knowledge and/or mass wisdom in the process of digital culture.

  17. Enhanced, targeted sampling of high-dimensional free-energy landscapes using variationally enhanced sampling, with an application to chignolin.

    Science.gov (United States)

    Shaffer, Patrick; Valsson, Omar; Parrinello, Michele

    2016-02-02

    The capabilities of molecular simulations have been greatly extended by a number of widely used enhanced sampling methods that facilitate escaping from metastable states and crossing large barriers. Despite these developments, many problems remain out of reach for these methods, which has led to a vigorous effort in this area. One of the most important unsolved problems is sampling high-dimensional free-energy landscapes and systems that are not easily described by a small number of collective variables. In this work we demonstrate a new way to compute free-energy landscapes of high dimensionality based on the previously introduced variationally enhanced sampling, and we apply it to the miniprotein chignolin.
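    The variational idea behind the method can be caricatured in one dimension: a bias V(s), expanded in a small basis, is optimized by gradient descent on a convex functional so that the biased distribution matches a chosen target (here uniform). Everything below, the double-well potential, the Gaussian basis, the step size, and quadrature in place of molecular sampling, is an illustrative simplification, not the method as applied to chignolin.

```python
import numpy as np

s = np.linspace(-2.0, 2.0, 401)           # collective-variable grid
ds = s[1] - s[0]
beta = 1.0
U = (s**2 - 1.0)**2                       # toy double-well free energy

centers = np.linspace(-1.5, 1.5, 7)       # Gaussian basis for the bias V(s)
basis = np.exp(-0.5 * ((s[None, :] - centers[:, None]) / 0.5)**2)

window = np.abs(s) <= 1.5
p_tgt = window / (window.sum() * ds)      # uniform target distribution

a = np.zeros(len(centers))                # variational coefficients of V
for _ in range(4000):
    V = a @ basis
    w = np.exp(-beta * (U + V))
    p_b = w / (w.sum() * ds)              # biased Boltzmann distribution
    grad = basis @ (p_tgt - p_b) * ds     # gradient of the convex functional
    a -= 0.5 * grad                       # gradient-descent update

core = np.abs(s) <= 1.2
flat_before = np.std(U[core])
flat_after = np.std((U + a @ basis)[core])   # U + V should be nearly flat
```

    At the optimum the biased distribution reproduces the target, i.e. the sum U + V is approximately constant over the targeted region, which is exactly the flattening that lets a simulation cross barriers.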

  19. Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks.

    Science.gov (United States)

    Vlachas, Pantelis R; Byeon, Wonmin; Wan, Zhong Y; Sapsis, Themistoklis P; Koumoutsakos, Petros

    2018-05-01

    We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto-Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM-LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.
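    As a flavor of this data-driven setup, one can generate Lorenz-96 trajectories and fit a one-step forecaster to them. Purely for illustration, a plain linear (ridge) surrogate stands in for the LSTM; the system size, forcing, time step, and ridge parameter are arbitrary choices, not the paper's.

```python
import numpy as np

def lorenz96_step(x, F=8.0, dt=0.01):
    """One RK4 step of Lorenz-96: dx_i/dt = (x_{i+1}-x_{i-2})x_{i-1} - x_i + F."""
    def f(x):
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    k1 = f(x); k2 = f(x + 0.5*dt*k1)
    k3 = f(x + 0.5*dt*k2); k4 = f(x + dt*k3)
    return x + dt/6 * (k1 + 2*k2 + 2*k3 + k4)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
for _ in range(1000):                 # spin up onto the attractor
    x = lorenz96_step(x)

traj = np.empty((5000, 8))            # training trajectory
for t in range(5000):
    traj[t] = x
    x = lorenz96_step(x)

# linear one-step surrogate X_{t+1} ~ X_t W, fitted by ridge least squares
A, B = traj[:-1], traj[1:]
W = np.linalg.solve(A.T @ A + 1e-6 * np.eye(8), A.T @ B)
mse = np.mean((A @ W - B)**2)
persistence_mse = np.mean((A - B)**2)  # trivial "no change" baseline
```

    Even this linear surrogate beats the persistence baseline one step ahead; the paper's contribution is that LSTMs remain accurate over much longer horizons and in the reduced-order space of far larger systems.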

  20. Vacuum expectation values of high-dimensional operators and their contributions to the Bjorken and Ellis-Jaffe sum rules

    International Nuclear Information System (INIS)

    Oganesian, A.G.

    1998-01-01

    A method is proposed for estimating unknown vacuum expectation values of high-dimensional operators. The method is based on the idea that the factorization hypothesis is self-consistent. Results are obtained for all vacuum expectation values of dimension-7 operators, and some estimates for dimension-10 operators are presented as well. The resulting values are used to compute corrections of higher dimensions to the Bjorken and Ellis-Jaffe sum rules.

  1. TripAdvisor^{N-D}: A Tourism-Inspired High-Dimensional Space Exploration Framework with Overview and Detail.

    Science.gov (United States)

    Nam, Julia EunJu; Mueller, Klaus

    2013-02-01

    Gaining a true appreciation of high-dimensional space remains difficult since all of the existing high-dimensional space exploration techniques serialize the space travel in some way. This is not so foreign to us since we, when traveling, also experience the world in a serial fashion. But we typically have access to a map to help with positioning, orientation, navigation, and trip planning. Here, we propose a multivariate data exploration tool that compares high-dimensional space navigation with a sightseeing trip. It decomposes this activity into five major tasks: 1) Identify the sights: use a map to identify the sights of interest and their location; 2) Plan the trip: connect the sights of interest along a specifiable path; 3) Go on the trip: travel along the route; 4) Hop off the bus: experience the location, look around, zoom into detail; and 5) Orient and localize: regain bearings in the map. We describe intuitive and interactive tools for all of these tasks, both global navigation within the map and local exploration of the data distributions. For the latter, we describe a polygonal touchpad interface which enables users to smoothly tilt the projection plane in high-dimensional space to produce multivariate scatterplots that best convey the data relationships under investigation. Motion parallax and illustrative motion trails aid in the perception of these transient patterns. We describe the use of our system within two applications: 1) the exploratory discovery of data configurations that best fit a personal preference in the presence of tradeoffs and 2) interactive cluster analysis via cluster sculpting in N-D.
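    The plane-tilting interaction can be sketched linear-algebraically: the 2-D projection basis is rotated toward a third high-dimensional direction, which continuously morphs the resulting scatterplot. The data set, dimensions, and tilt angle below are arbitrary stand-ins, not the paper's interface.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 10))       # hypothetical 10-D data set

# start from an axis-aligned projection plane spanned by e0 and e1
u = np.eye(10)[0]
v = np.eye(10)[1]
w = np.eye(10)[2]                        # direction to tilt toward

def tilt(v, w, theta):
    """Rotate v toward w by angle theta inside the v-w plane,
    mimicking the 'touchpad tilt' of the projection plane."""
    return np.cos(theta) * v + np.sin(theta) * w

v_t = tilt(v, w, np.pi / 6)
P = np.stack([u, v_t])                   # 2 x 10 projection basis
Y = X @ P.T                              # 2-D scatterplot coordinates
```

    Because u is orthogonal to both v and w, the tilted basis stays orthonormal, so Y is always a proper orthogonal projection; animating theta produces the motion-parallax effect the abstract mentions.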

  2. Joint High-Dimensional Bayesian Variable and Covariance Selection with an Application to eQTL Analysis

    KAUST Repository

    Bhadra, Anindya

    2013-04-22

    We describe a Bayesian technique to (a) perform a sparse joint selection of significant predictor variables and significant inverse covariance matrix elements of the response variables in a high-dimensional linear Gaussian sparse seemingly unrelated regression (SSUR) setting and (b) perform an association analysis between the high-dimensional sets of predictors and responses in such a setting. To search the high-dimensional model space, where both the number of predictors and the number of possibly correlated responses can be larger than the sample size, we demonstrate that a marginalization-based collapsed Gibbs sampler, in combination with spike and slab type of priors, offers a computationally feasible and efficient solution. As an example, we apply our method to an expression quantitative trait loci (eQTL) analysis on publicly available single nucleotide polymorphism (SNP) and gene expression data for humans where the primary interest lies in finding the significant associations between the sets of SNPs and possibly correlated genetic transcripts. Our method also allows for inference on the sparse interaction network of the transcripts (response variables) after accounting for the effect of the SNPs (predictor variables). We exploit properties of Gaussian graphical models to make statements concerning conditional independence of the responses. Our method compares favorably to existing Bayesian approaches developed for this purpose. © 2013, The International Biometric Society.
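    A minimal sketch of the spike-and-slab idea is shown below for single-response linear regression with a collapsed per-coordinate Gibbs update. This is far simpler than the paper's SSUR sampler (no covariance selection, one response, low dimension); all priors, sizes, and signals are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:2] = [2.0, -3.0]   # only 2 active predictors
y = X @ beta_true + rng.standard_normal(n)

sigma2, tau2, w = 1.0, 10.0, 0.2    # noise var, slab var, prior inclusion prob
beta = np.zeros(p)
gamma = np.zeros(p, bool)
incl = np.zeros(p)                  # posterior inclusion frequencies

for it in range(2000):
    for j in range(p):
        r = y - X @ beta + X[:, j] * beta[j]      # residual excluding x_j
        xx = X[:, j] @ X[:, j]
        v = 1.0 / (xx / sigma2 + 1.0 / tau2)      # slab posterior variance
        m = v * (X[:, j] @ r) / sigma2            # slab posterior mean
        # slab-vs-spike Bayes factor: sqrt(v/tau2) * exp(m^2 / 2v)
        log_odds = np.log(w / (1 - w)) + 0.5*np.log(v / tau2) + 0.5*m*m/v
        gamma[j] = rng.random() < 1.0 / (1.0 + np.exp(-log_odds))
        beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
    if it >= 1000:                                # discard burn-in
        incl += gamma
incl /= 1000
```

    The inclusion frequencies recover the two active predictors with probability near one while the null predictors stay near the prior; the paper extends this machinery to many correlated responses plus selection of the inverse covariance.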

  3. Miniature JPL Universal Instrument Bus

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a Universal Digital Processor Bus architecture using state of the art commercial packaging technologies. This work will transition commercial advanced- yet...

  4. Digital mammography; Mamografia digital

    Energy Technology Data Exchange (ETDEWEB)

    Chevalier, M.; Torres, R.

    2010-07-01

    Mammography represents one of the most demanding radiographic applications, simultaneously requiring excellent contrast sensitivity, high spatial resolution, and wide dynamic range. Film/screen is the most widely used image receptor in mammography due to its high spatial resolution and contrast. The limitations of film/screen systems are their narrow latitude, their structural noise, and the fact that the film is at the same time the medium for image acquisition, storage and presentation. Several digital detectors made with different technologies can overcome these difficulties. Here, these technologies as well as their main advantages and disadvantages are analyzed. Their impact on mammography examinations, mainly on breast screening programs, is also discussed. (Author).

  5. Apprendere con il digital storytelling

    Directory of Open Access Journals (Sweden)

    Corrado Petrucco

    2009-01-01

    Full Text Available A discussion of the digital storytelling approach in terms of the support that this teaching strategy can deliver on the cognitive and emotional levels. The authors report a laboratory experience with Digital Storytelling carried out in a university course in Education.

  6. Scaled Data from Digital Ionograms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — All the data contributed come from 1957 through 1990. They have been digitized, reformatted, converted to universal time (the software also can display the data in...

  7. Photography and digital media: the blogger’s universe in the creative construction of tourist destinations Fotografia e mídia digital: o universo blogueiro na construção criativa de destinos turísticos

    Directory of Open Access Journals (Sweden)

    Silvia Oliveira

    2007-01-01

    Full Text Available Analyses the discourse conducted in travel blogs through texts and pictures. It assesses the influence of this digital medium in constructing the reality of the tourist image. It shows how scenario construction is achieved through daily publication in blogs. It seeks to point out discursive and thematic strategies concerning the portrayal of tourism through photography. It discusses the movement of ideas within the blogosphere.

  8. Digital Tectonics

    DEFF Research Database (Denmark)

    Christiansen, Karl; Borup, Ruben; Søndergaard, Asbjørn

    2014-01-01

    Digital Tectonics treats the architectonical possibilities in digital generation of form and production. The publication is the first volume of a series, in which aspects of the strategic focus areas of the Aarhus School of Architecture will be disseminated.

  9. Digital squares

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Kim, Chul E

    1988-01-01

    Digital squares are defined and their geometric properties characterized. A linear time algorithm is presented that considers a convex digital region and determines whether or not it is a digital square. The algorithm also determines the range of the values of the parameter set of its preimages. The analysis involves transforming the boundary of a digital region into the parameter space of slope and y-intercept.

  10. Digital skrivedidaktik

    DEFF Research Database (Denmark)

    Digital skrivedidaktik (digital writing didactics) consists of two parts. The first part presents theory on writing competence and digital writing. Digital writing is characterized by texts being written on a computer and with digital tools, which changes the traditional practices, products, and processes of writing. What is digital... on the student's writing process) and Blog Writing (which strengthens students in using blogs in teaching).

  11. Digital radiology. Clinical experience

    Energy Technology Data Exchange (ETDEWEB)

    Stacul, F; Smathers, R L

    1985-01-01

    The authors report the experience gained at Stanford University (USA) with a digital radiography system which allows the digitization of film and of images collected with photostimulable phosphors. The phosphor is essentially an intensifying screen in which a latent image is stored after exposure to X-rays and is extracted by laser scanning. The images collected with digitized film and with the phosphor (chest, breast, bone) have been analyzed. Digitized film offers potential diagnostic advantages over conventional film because of contrast manipulation and many other processing options. The possibility of recovering the information in overexposed films appears very attractive. The photostimulable phosphors allow good quality images to be obtained, with a considerable reduction of dose and costs. These plates offer the possibility, in the near future, of replacing conventional screen-film systems.

  12. Digital Citizenship

    Science.gov (United States)

    Isman, Aytekin; Canan Gungoren, Ozlem

    2014-01-01

    The era in which we live is known and referred to as the digital age. In this age, technology changes and develops rapidly. In light of these technological advances in the 21st century, schools have the responsibility of training "digital citizens" as well as good citizens. Digital citizens must have extensive skills, knowledge, Internet and …

  13. Digital subtraktion

    DEFF Research Database (Denmark)

    Mussmann, Bo Redder

    2004-01-01

    Digital subtraction is a method for removing unwanted information from an X-ray image. The subtraction technique is primarily used in angiography, where only the vessel itself is of interest. In everyday usage, digital subtraction is therefore synonymous with DSA or DVI, i.e. Digital Subtraction Angiography or Digital Vascular Imaging, two X-ray manufacturers' names for the same technique. Digital subtraction requires special software, and the equipment must be able to expose in series.
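    In code, the subtraction is simply a pixel-wise difference between a pre-contrast mask image and a contrast image. The synthetic anatomy, vessel, and noise levels below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)

# hypothetical "anatomy" background (bone/soft tissue) common to both images
yy, xx = np.mgrid[0:64, 0:64]
anatomy = 100.0 + 40.0 * np.exp(-((xx - 32)**2 + (yy - 32)**2) / 400.0)

# contrast-filled vessel: a bright vertical stripe
vessel = np.zeros(shape)
vessel[:, 30:34] = 60.0

mask_img     = anatomy + rng.normal(0, 2.0, shape)   # before contrast injection
contrast_img = anatomy + vessel + rng.normal(0, 2.0, shape)

dsa = contrast_img - mask_img        # anatomy cancels, vessel remains
```

    The static anatomy cancels in the difference image while the vessel stands out, which is why a subtracted series shows only the contrast-filled vasculature (at the cost of doubled noise and sensitivity to patient motion between exposures).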

  14. Digital preservation

    CERN Document Server

    Deegan, Marilyn

    2013-01-01

    Digital preservation is an issue of huge importance to the library and information profession right now. With the widescale adoption of the internet and the rise of the world wide web, the world has been overwhelmed by digital information. Digital data is being produced on a massive scale by individuals and institutions: some of it is born, lives and dies only in digital form, and it is the potential death of this data, with its impact on the preservation of culture, that is the concern of this book. So how can information professionals try to remedy this? Digital preservation is a complex iss

  15. Digital Natives or Digital Tribes?

    Science.gov (United States)

    Watson, Ian Robert

    2013-01-01

    This research builds upon the discourse surrounding digital natives. A literature review of the digital native phenomenon was undertaken and found that researchers are beginning to identify digital natives not as one cohesive group but as individuals influenced by other factors. Primary research by means of a questionnaire survey of technologies…

  16. An integrated service digital network (ISDN)-based international telecommunication between Samsung Medical Center and Hokkaido University using telecommunication helped radiotherapy planning and information system (THERAPIS).

    Science.gov (United States)

    Huh, S J; Shirato, H; Hashimoto, S; Shimizu, S; Kim, D Y; Ahn, Y C; Choi, D; Miyasaka, K; Mizuno, J

    2000-07-01

    This study introduces the integrated service digital network (ISDN)-based international teleradiotherapy system (THERAPIS) in radiation oncology between hospitals in Seoul, South Korea and in Sapporo, Japan. THERAPIS has the following functions: (1) exchange of patient's image data, (2) real-time teleconference, and (3) communication of the treatment planning, dose calculation and distribution, and of portal verification images between the remote hospitals. Our preliminary results of applications on eight patients demonstrated that the international telecommunication using THERAPIS was clinically useful and satisfactory with sufficient bandwidth for the transfer of patient data for clinical use in radiation oncology.

  17. An integrated service digital network (ISDN)-based international telecommunication between Samsung Medical Center and Hokkaido University using telecommunication helped radiotherapy planning and information system (THERAPIS)

    International Nuclear Information System (INIS)

    Huh, S.J.; Kim, D.Y.; Ahn, Y.C.; Choi, D.; Shirato, H.; Hashimoto, S.; Shimizu, S.; Miyasaka, K.; Mizuno, J.

    2000-01-01

    This study introduces the integrated service digital network (ISDN)-based international teleradiotherapy system (THERAPIS) in radiation oncology between hospitals in Seoul, South Korea and in Sapporo, Japan. THERAPIS has the following functions: (1) exchange of patient's image data, (2) real-time teleconference, and (3) communication of the treatment planning, dose calculation and distribution, and of portal verification images between the remote hospitals. Our preliminary results of applications on eight patients demonstrated that the international telecommunication using THERAPIS was clinically useful and satisfactory with sufficient bandwidth for the transfer of patient data for clinical use in radiation oncology. (author)

  18. An angle-based subspace anomaly detection approach to high-dimensional data: With an application to industrial fault detection

    International Nuclear Information System (INIS)

    Zhang, Liangwei; Lin, Jing; Karim, Ramin

    2015-01-01

    The accuracy of traditional anomaly detection techniques implemented on full-dimensional spaces degrades significantly as dimensionality increases, thereby hampering many real-world applications. This work proposes an approach to selecting a meaningful feature subspace and conducting anomaly detection in the corresponding subspace projection. The aim is to maintain detection accuracy in high-dimensional circumstances. The suggested approach assesses the angle between all pairs of two lines for one specific anomaly candidate: the first line is connected by the relevant data point and the center of its adjacent points; the other line is one of the axis-parallel lines. Those dimensions which have a relatively small angle with the first line are then chosen to constitute the axis-parallel subspace for the candidate. Next, a normalized Mahalanobis distance is introduced to measure the local outlier-ness of an object in the subspace projection. To comprehensively compare the proposed algorithm with several existing anomaly detection techniques, we constructed artificial datasets with various high-dimensional settings and found the algorithm displayed superior accuracy. A further experiment on an industrial dataset demonstrated the applicability of the proposed algorithm in fault detection tasks and highlighted another of its merits, namely, to provide preliminary interpretation of abnormality through feature ordering in relevant subspaces. - Highlights: • An anomaly detection approach for high-dimensional reliability data is proposed. • The approach selects relevant subspaces by assessing vectorial angles. • The novel ABSAD approach displays superior accuracy over other alternatives. • Numerical illustration confirms its efficacy in fault detection applications
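    A stripped-down sketch of the idea follows. This is our reading of the abstract, not the authors' implementation: the neighbourhood size, subspace size, and synthetic data are arbitrary.

```python
import numpy as np

def absad_score(X, idx, k=10, n_dims=2):
    """Angle-based subspace scoring sketch: for candidate X[idx], take the
    line from the point to the centre of its k nearest neighbours, keep
    the n_dims axes most aligned with that line (smallest angle), and
    score the point by a normalised Mahalanobis distance there."""
    d = np.linalg.norm(X - X[idx], axis=1)
    nn = np.argsort(d)[1:k+1]                 # k nearest neighbours
    centre = X[nn].mean(axis=0)
    line = X[idx] - centre
    # |cos(angle with axis e_j)| is |line_j| / ||line||
    align = np.abs(line) / (np.linalg.norm(line) + 1e-12)
    dims = np.argsort(align)[-n_dims:]        # most aligned axes
    sub = X[nn][:, dims]
    mu = sub.mean(axis=0)
    cov = np.cov(sub.T) + 1e-6 * np.eye(n_dims)
    diff = X[idx][dims] - mu
    m = diff @ np.linalg.solve(cov, diff)
    return np.sqrt(m) / n_dims                # normalise by subspace size

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50)) * 0.1      # tight cluster in 50-D
X[0, 3] = 5.0                                 # outlier in dimension 3 only

scores = np.array([absad_score(X, i) for i in range(len(X))])
```

    The point that deviates in a single dimension, which a full-dimensional distance would largely wash out in 50-D, receives by far the highest score because the angle test isolates exactly the relevant axis.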

  19. AN EFFECTIVE MULTI-CLUSTERING ANONYMIZATION APPROACH USING DISCRETE COMPONENT TASK FOR NON-BINARY HIGH DIMENSIONAL DATA SPACES

    Directory of Open Access Journals (Sweden)

    L.V. Arun Shalin

    2016-01-01

    Full Text Available Clustering is a process of grouping elements together, designed in such a way that the elements assigned to a cluster are more similar to each other than to the remaining data points. Certain difficulties in clustering high-dimensional data are ubiquitous and abundant. Previous work on anonymization methods for high-dimensional data spaces failed to address the problem of dimensionality reduction when non-binary databases are included. In this work we study methods of dimensionality reduction for non-binary databases. Analyzing the behavior of dimensionality reduction for non-binary databases yields a performance improvement with the help of tag-based features. An effective multi-clustering anonymization approach called Discrete Component Task Specific Multi-Clustering (DCTSM) is presented for dimensionality reduction on non-binary databases. We first present an analysis of the attributes in the non-binary database, and cluster projection identifies the sparseness degree of the dimensions. Additionally, with the quantum distribution on the multi-cluster dimension, a solution for attribute relevancy and redundancy in non-binary data spaces is provided, resulting in performance improvement on the basis of tag-based features. Multi-clustering tag-based feature reduction extracts individual features, which are correspondingly replaced by equivalent feature clusters (i.e. tag clusters). During training, the DCTSM approach uses multi-clusters instead of individual tag features; during decoding, individual features are replaced by the corresponding multi-clusters. To measure the effectiveness of the method, experiments are conducted on an existing anonymization method for high-dimensional data spaces and compared with the DCTSM approach using the Statlog German Credit Data Set. Improved tag feature extraction and minimum error rate compared to conventional anonymization

  20. Stable long-time semiclassical description of zero-point energy in high-dimensional molecular systems.

    Science.gov (United States)

    Garashchuk, Sophya; Rassolov, Vitaly A

    2008-07-14

    Semiclassical implementation of the quantum trajectory formalism [J. Chem. Phys. 120, 1181 (2004)] is further developed to give a stable long-time description of zero-point energy in anharmonic systems of high dimensionality. The method is based on a numerically cheap linearized quantum force approach; stabilizing terms compensating for the linearization errors are added into the time-evolution equations for the classical and nonclassical components of the momentum operator. The wave function normalization and energy are rigorously conserved. Numerical tests are performed for model systems of up to 40 degrees of freedom.

  1. Conjugate-Gradient Neural Networks in Classification of Multisource and Very-High-Dimensional Remote Sensing Data

    Science.gov (United States)

    Benediktsson, J. A.; Swain, P. H.; Ersoy, O. K.

    1993-01-01

    Application of neural networks to classification of remote sensing data is discussed. Conventional two-layer backpropagation is found to give good results in classification of remote sensing data but is not efficient in training. A more efficient variant, based on conjugate-gradient optimization, is used for classification of multisource remote sensing and geographic data and very-high-dimensional data. The conjugate-gradient neural networks give excellent performance in classification of multisource data, but do not compare as well with statistical methods in classification of very-high-dimensional data.
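    The conjugate-gradient variant can be illustrated on a toy two-class problem: nonlinear (Polak-Ribière) conjugate gradient with a backtracking line search trains a logistic classifier. The data and hyperparameters are invented, and a full multilayer network is omitted for brevity; the update structure is what matters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.vstack([rng.normal(-1, 1, (n//2, 2)), rng.normal(1, 1, (n//2, 2))])
y = np.r_[np.zeros(n//2), np.ones(n//2)]
Xb = np.c_[X, np.ones(n)]                       # add bias column

def loss_grad(w):
    """cross-entropy loss and gradient of a logistic classifier"""
    p = 1.0 / (1.0 + np.exp(-(Xb @ w)))
    loss = -np.mean(y*np.log(p + 1e-12) + (1-y)*np.log(1-p + 1e-12))
    grad = Xb.T @ (p - y) / n
    return loss, grad

def backtrack(w, d, loss, grad, t=1.0):
    """simple Armijo backtracking line search along direction d"""
    while loss_grad(w + t*d)[0] > loss + 1e-4 * t * (grad @ d):
        t *= 0.5
        if t < 1e-10:
            break
    return t

w = np.zeros(3)
loss, g = loss_grad(w)
d = -g
for _ in range(50):                             # Polak-Ribiere nonlinear CG
    t = backtrack(w, d, loss, g)
    w = w + t*d
    loss, g_new = loss_grad(w)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta*d                         # conjugate search direction
    g = g_new

acc = np.mean((Xb @ w > 0) == (y == 1))
```

    Reusing curvature information through the beta-weighted previous direction is what makes this converge in far fewer epochs than plain gradient-descent backpropagation, which is the efficiency gain the abstract refers to.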

  2. Mewujudkan Sekolah atau Kampus Digital

    Directory of Open Access Journals (Sweden)

    M. Eka Mahmud

    2011-06-01

    Full Text Available The information age, supported by the power of information and communication technology (ICT), has a great influence on everyday life, such as on ways of working and managing organizations (including educational institutions) as well as on perceptions of the outside world. Thus, interactions between individuals, organizations, communities and countries can be carried out unimpeded by space and time and are integrated into the global communications network. Managing a digital-based school or university means managing changes in culture, values, and ways of communicating and interacting within the educational institution. The strategic steps toward a digital school or university are the establishment of a body that works to accelerate digital school/university programs and to ease the technical and operational issues related to the digital world, so that all operational difficulties become easy to resolve.

  3. Digital mammography

    International Nuclear Information System (INIS)

    Bick, Ulrich; Diekmann, Felix

    2010-01-01

    This state-of-the-art reference book provides in-depth coverage of all aspects of digital mammography, including detector technology, image processing, computer-aided diagnosis, soft-copy reading, digital workflow, and PACS. Specific advantages and disadvantages of digital mammography in comparison to screen-film mammography are thoroughly discussed. By including authors from both North America and Europe, the book is able to outline variations in the use, acceptance, and quality assurance of digital mammography between the different countries and screening programs. Advanced imaging techniques and future developments such as contrast mammography and digital breast tomosynthesis are also covered in detail. All of the chapters are written by internationally recognized experts and contain numerous high-quality illustrations. This book will be of great interest both to clinicians who already use or are transitioning to digital mammography and to basic scientists working in the field. (orig.)

  4. Digital Insights

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark

    This dissertation forwards the theory of digital consumer-response as a perspective to examine how digital media practices influence consumers’ response to advertising. Digital consumer-response is a development of advertising theory that encompasses how consumers employ their knowledge and practices with digital media when they meet and interpret advertising. Through studies of advertising response on YouTube and experiments with consumers’ response to digitally manipulated images, the dissertation shows how digital media practices facilitate polysemic and socially embedded advertising response. By incorporating media as both channel, frame, and apparatus for advertising response, the dissertation brings into attention that more aspects than the text-reader relationship influence ad response. Finally, the dissertation proposes the assemblage approach for exploring big data in consumer culture research.

  5. Linguistics and the digital humanities

    DEFF Research Database (Denmark)

    Jensen, Kim Ebensgaard

    2014-01-01

    Corpus linguistics has been closely intertwined with digital technology since the introduction of university computer mainframes in the 1960s. Making use of both digitized data in the form of the language corpus and computational methods of analysis involving concordancers and statistics software, corpus linguistics arguably has a place in the digital humanities. Still, it remains obscure and figures only sporadically in the literature on the digital humanities. This article provides an overview of the main principles of corpus linguistics and the role of computer technology in relation to data and method, and also offers a bird's-eye view of the history of corpus linguistics with a focus on its intimate relationship with digital technology and how digital technology has impacted the very core of corpus linguistics and shaped the identity of the corpus linguist. Ultimately, the article is oriented…

  6. Digital Signage

    OpenAIRE

    Fischer, Karl Peter

    2011-01-01

    Digital Signage for in-store advertising at gas stations/retail stores in Germany: a field study. Digital signage networks provide a novel means of advertising, with the advantage of easily changeable and highly customizable animated content. Despite the potential and increasing use of these media, empirical research is scarce. In a field study at 8 gas stations (with integrated convenience stores) we studied the effect of digital signage advertising on sales for different products and servi...

  7. Fast digital recorders of signal shaping

    International Nuclear Information System (INIS)

    Meleshko, E.A.

    1997-01-01

    The methodology of fast digital recording of pulse signals using fast analog-to-digital converters is considered. The components of digital recorders, sampling-and-storage devices and operational memory units, are described. Main attention is paid to parallel analog-to-digital converters, which make it possible to bring conversion frequencies up to several gigahertz. Parallel-sequential analog-to-digital converters, which combine high speed with increased accuracy, are also considered. Concrete examples of the design of universal and specialized digital signal recorders used in experimental physics are presented. 44 refs., 12 figs

  8. Architectural management in the digital arena : proceedings of the CIB-W096 conference Vienna 2011, Vienna University of Technology, Austria, 13-14 October 2011

    NARCIS (Netherlands)

    Otter, den A.F.H.J.; Emmitt, S.; Achammer, Ch.

    2011-01-01

    Research into architectural design management is led by the CIB’s working committee W096 Architectural Management. CIB-W096 was officially established in 1993, following a conference on ‘Architectural Management’ at the University of Nottingham in the UK. Since this time the commission has been

  9. Sports Digitalization

    DEFF Research Database (Denmark)

    Xiao, Xiao; Hedman, Jonas; Tan, Felix Ter Chian

    2017-01-01

    … evolution, as digital technologies are increasingly entrenched in a wide range of sporting activities and for applications beyond mere performance enhancement. Despite such trends, research on sports digitalization in the IS discipline is surprisingly still nascent. This paper aims at establishing a discourse on sports digitalization within the discipline. Toward this, we first provide an understanding of the institutional characteristics of the sports industry, establishing its theoretical importance and relevance in our discipline; second, we reveal the latest trends of digitalization in the sports…

  10. Digital printing

    Science.gov (United States)

    Sobotka, Werner K.

    1997-02-01

    Digital printing is described as a tool that could replace conventional printing machines completely. This goal has not yet been reached by any of the digital printing technologies described in the paper: productivity and costs remain the main unresolved parameters, while quality in digital printing is no longer a problem. Digital printing is defined as the transfer of digital data directly onto the paper surface. This step can be carried out directly or with the use of an intermediate image carrier. Keywords in digital printing are: computer-to-press; erasable image carrier; image carrier with memory. Digital printing is also the logical development of the new digital era, as pointed out in Nicholas Negroponte's book 'Being Digital', and the answer to networking and Internet technologies. Creating images, text and color in one country and publishing the data in another country or continent is the main advantage; printing on demand is another big advantage; and, last but not least, personalization. The biggest disadvantage is the cost of coping with this new world of prepress technology. Therefore the very optimistic growth rates predicted for the next few years are unrealistic: the development of completely new markets is too slow and the replacement of old markets too limited.

  11. Review of DigitalSignage.com

    OpenAIRE

    Clifford Richmond; Matthew Daley

    2014-01-01

    Digital signage has been used in the commercial sector for decades. As display and networking technologies become more advanced and less expensive, it is surprisingly easy to implement a digital signage program at a minimal cost. In the fall of 2011, the University of Florida (UF), Health Sciences Center Library (HSCL) initiated the use of digital signage inside and outside its Gainesville, Florida facility. This article details UF HSCL's use and evaluation of DigitalSignage.com signage so...

  12. DSH: Special Issue 'Digital Humanities 2014'

    OpenAIRE

    Terras Melissa; Clivaz Claire; Verhoeven Deb; Kaplan Frédéric

    2015-01-01

    The special issue presents selected proceedings from Digital Humanities 2014. Digital Humanities is the annual conference organized and sponsored by the Alliance of Digital Humanities Organizations (ADHO); between 7 and 12 July 2014, over 700 attendees gathered for DH2014 at the University of Lausanne and the École polytechnique fédérale de Lausanne, Switzerland (http://dh2014.org/). To date this remains the largest ever meeting of Digital Humanities scholars worldwide. Traditionally the…

  13. High-Dimensional Additive Hazards Regression for Oral Squamous Cell Carcinoma Using Microarray Data: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Omid Hamidi

    2014-01-01

    Full Text Available Microarray technology results in high-dimensional, low-sample-size data sets. Fitting sparse models is therefore essential, because only a small number of influential genes can reliably be identified. A number of variable selection approaches have been proposed for high-dimensional time-to-event data based on the Cox proportional hazards model where censoring is present. The present study applied three sparse variable selection techniques, the Lasso, smoothly clipped absolute deviation, and the smooth integration of counting and absolute deviation, to gene expression survival time data using the additive risk model, which is adopted when the absolute effects of multiple predictors on the hazard function are of interest. The performance of these techniques was evaluated by time-dependent ROC curves and bootstrap .632+ prediction error curves. The genes selected by all methods were highly significant (P<0.001). The Lasso showed the maximum median area under the ROC curve over time (0.95), and smoothly clipped absolute deviation showed the lowest prediction error (0.105). The genes selected by all methods improved the prediction of the purely clinical model, indicating the valuable information contained in the microarray features. It was concluded that the approaches used can satisfactorily predict survival based on selected gene expression measurements.
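    The Lasso selection principle referred to above can be illustrated with a minimal sketch: cyclic coordinate descent with soft-thresholding on a synthetic linear-regression toy problem. This is not the additive risk model of the study; the data, dimensions and penalty level (`lam`) are invented for illustration, and pure Python is used to keep the sketch self-contained.

```python
import random

def soft_threshold(z, t):
    # Soft-thresholding operator at the heart of Lasso coordinate descent.
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding feature j.
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            norm = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / norm
    return beta

random.seed(0)
n, p = 60, 10
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
true_beta = [3.0, -2.0] + [0.0] * (p - 2)   # only 2 informative "genes"
y = [sum(b * x for b, x in zip(true_beta, row)) + random.gauss(0, 0.1)
     for row in X]
beta = lasso_cd(X, y, lam=0.5)
selected = [j for j, b in enumerate(beta) if abs(b) > 1e-6]
print(selected)
```

    The penalty zeroes out the uninformative coefficients, so `selected` recovers the informative features (with the usual Lasso shrinkage of their magnitudes), which is the behaviour the abstract relies on for gene selection.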

  14. TSAR: a program for automatic resonance assignment using 2D cross-sections of high dimensionality, high-resolution spectra

    Energy Technology Data Exchange (ETDEWEB)

    Zawadzka-Kazimierczuk, Anna; Kozminski, Wiktor [University of Warsaw, Faculty of Chemistry (Poland); Billeter, Martin, E-mail: martin.billeter@chem.gu.se [University of Gothenburg, Biophysics Group, Department of Chemistry and Molecular Biology (Sweden)

    2012-09-15

    While NMR studies of proteins typically aim at structure, dynamics or interactions, resonance assignment represents in almost all cases the initial step of the analysis. With increasing complexity of the NMR spectra, for example due to a decreasing extent of ordered structure, this task often becomes both difficult and time-consuming, and the recording of high-dimensional data with high resolution may be essential. Random sampling of the evolution time space, combined with sparse multidimensional Fourier transform (SMFT), allows for efficient recording of very high dimensional spectra (≥4 dimensions) while maintaining high resolution. However, the nature of these data demands automation of the assignment process. Here we present the program TSAR (Tool for SMFT-based Assignment of Resonances), which exploits all advantages of SMFT input. Moreover, its flexibility allows it to process data from any type of experiment that provides sequential connectivities. The algorithm was tested on several protein samples, including a disordered 81-residue fragment of the δ subunit of RNA polymerase from Bacillus subtilis containing various repetitive sequences. For our test examples, TSAR achieves a high percentage of assigned residues without any erroneous assignments.

  15. Network-based regularization for high dimensional SNP data in the case-control study of Type 2 diabetes.

    Science.gov (United States)

    Ren, Jie; He, Tao; Li, Ye; Liu, Sai; Du, Yinhao; Jiang, Yu; Wu, Cen

    2017-05-16

    Over the past decades, the prevalence of type 2 diabetes mellitus (T2D) has been steadily increasing around the world. Despite large efforts devoted to better understanding the genetic basis of the disease, the identified susceptibility loci can only account for a small portion of the T2D heritability. Some of the existing approaches proposed for the high-dimensional genetic data from T2D case-control studies are limited by analyzing only a small number of SNPs at a time from a large pool, by ignoring the correlations among SNPs, and by adopting inefficient selection techniques. We propose a network-constrained regularization method to select important SNPs by taking linkage disequilibrium into account. To accommodate the case-control study, an iteratively reweighted least squares algorithm has been developed within the coordinate descent framework, where optimization of the regularized logistic loss function is performed with respect to one parameter at a time, cycling iteratively through all the parameters until convergence. In this article, a novel approach is developed to identify important SNPs more effectively by incorporating the interconnections among them in the regularized selection. A coordinate descent based iteratively reweighted least squares (IRLS) algorithm has been proposed. Both the simulation study and the analysis of the Nurses' Health Study, a case-control study of type 2 diabetes with high-dimensional SNP measurements, demonstrate the advantage of the network-based approach over the competing alternatives.
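    The regularized selection strategy described above can be sketched under simplifying assumptions: a quadratic network penalty beta^T L beta built from a graph Laplacian (standing in here for the linkage-disequilibrium network), optimized by coordinate-wise Newton updates in the spirit of IRLS within coordinate descent. The sparsity-inducing part of the penalty is omitted, and all data, variable names and parameter values are invented for illustration.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def network_logistic(X, y, L, lam, n_sweeps=100):
    """Coordinate-wise Newton (IRLS-style) updates for logistic loss
    plus a quadratic network penalty lam * beta^T L beta, where L is a
    graph Laplacian linking correlated features (toy stand-in for LD)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_sweeps):
        for j in range(p):
            eta = [sum(beta[k] * X[i][k] for k in range(p)) for i in range(n)]
            mu = [sigmoid(e) for e in eta]
            Lb = sum(L[j][k] * beta[k] for k in range(p))
            grad = sum(X[i][j] * (mu[i] - y[i]) for i in range(n)) / n + 2 * lam * Lb
            hess = (sum(X[i][j] ** 2 * mu[i] * (1 - mu[i]) for i in range(n)) / n
                    + 2 * lam * L[j][j] + 1e-8)
            beta[j] -= grad / hess   # one-coordinate Newton step
    return beta

random.seed(1)
n = 200
# Features 0 and 1 are strongly correlated (linked) and both causal.
X = []
for _ in range(n):
    z = random.gauss(0, 1)
    X.append([z + random.gauss(0, 0.3), z + random.gauss(0, 0.3),
              random.gauss(0, 1), random.gauss(0, 1)])
y = [1 if sigmoid(1.5 * (row[0] + row[1])) > random.random() else 0 for row in X]
# Laplacian of a graph connecting features 0 and 1 only.
Lap = [[1, -1, 0, 0], [-1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
beta = network_logistic(X, y, Lap, lam=0.1)
print([round(b, 2) for b in beta])
```

    The Laplacian term penalizes differences between linked coefficients, so the two correlated causal features receive similar, stable estimates instead of the erratic ones an unpenalized fit would give under strong collinearity.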

  16. Applications of hybrid and digital computation methods in aerospace-related sciences and engineering. [problem solving methods at the University of Houston

    Science.gov (United States)

    Huang, C. J.; Motard, R. L.

    1978-01-01

    The computing equipment in the engineering systems simulation laboratory of the University of Houston Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research in psychology and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. Research supports programs in acoustics, energy technology, systems engineering, and environment management as well as aerospace engineering.

  17. Digital Audiobooks

    DEFF Research Database (Denmark)

    Have, Iben; Pedersen, Birgitte Stougaard

    Audiobooks are rapidly gaining popularity with widely accessible digital downloading and streaming services. The paper frames how the digital audiobook expands and changes the target groups for book publications and how, as an everyday activity, it creates new reading experiences, places…

  18. Digital TMI

    Science.gov (United States)

    Rios, Joseph

    2012-01-01

    Presenting the current status of the Digital TMI project to visiting members of the FAA Command Center. Digital TMI is an effort to store national-level traffic management initiatives in a standards-compliant manner. Work is funded by the FAA.

  19. Digital Social Science Lab

    DEFF Research Database (Denmark)

    Svendsen, Michael; Lauersen, Christian Ulrich

    2015-01-01

    At the Faculty Library of Social Sciences (part of Copenhagen University Library) we are currently working intensely towards the establishment of a Digital Social Science Lab (DSSL). The purpose of the lab is to connect research, education and learning processes with the use of digital tools at the Faculty of Social Sciences. DSSL will host and facilitate an 80 m2 mobile and intelligent study and learning environment with a focus on academic events, teaching and collaboration. Besides the physical settings, DSSL has two primary functions: 1. To put relevant social scientific software and hardware at the disposal of students and staff at the Faculty of Social Sciences, along with instruction and teaching in the different types of software, e.g. Stata, Nvivo, Atlas.ti, R Studio, Zotero and GIS software. 2. To facilitate academic events focusing on the use of digital tools and analytic software…

  20. Digital displacements

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2014-01-01

    In recent years digital reforms have been introduced in the municipal landscape of Denmark. The reforms address the interaction between citizen and local authority. The aim is that by 2015 at least 80 per cent of all correspondence between citizens and public authorities will be transmitted through a digital interface. However, the transformation of citizen services from traditional face-to-face interaction to digital self-service gives rise to new practices; some citizens need support to be able to manage self-service through digital tools. A mixture of support and teaching, named co-service, is a new task in public administration, where street-level bureaucrats assist citizens in using the new digital solutions. The paper is based on a case study conducted primarily in a citizen service centre in Copenhagen, Denmark. Based on ethnography, the paper gives an empirical account of the ongoing…

  1. Digitized mammograms

    International Nuclear Information System (INIS)

    Bruneton, J.N.; Balu-Maestro, C.; Rogopoulos, A.; Chauvel, C.; Geoffray, A.

    1988-01-01

    Two observers conducted a blind evaluation of 100 mammography files, including 47 malignant cases. Films were read both before and after image digitization at 50 μm and 100 μm with the FilmDRSII. Digitization permitted better analysis of the normal anatomic structures and moderately improved diagnostic sensitivity. Searches for microcalcifications before and after digitization at 100 μm and 50 μm showed better analysis of anatomic structures after digitization (especially for solitary microcalcifications). The diagnostic benefit, with discovery of clustered microcalcifications, was more limited (one case at 100 μm, nine cases at 50 μm). Recognition of microcalcifications was clearly improved in dense breasts, which can benefit from reinterpretation after digitization at 50 μm rather than 100 μm

  2. High dimensional ICA analysis detects within-network functional connectivity damage of default mode and sensory motor networks in Alzheimer's disease

    Directory of Open Access Journals (Sweden)

    Ottavia eDipasquale

    2015-02-01

    Full Text Available High dimensional independent component analysis (ICA), compared to low dimensional ICA, allows performing a detailed parcellation of the resting state networks. The purpose of this study was to give further insight into functional connectivity (FC) in Alzheimer’s disease (AD) using high dimensional ICA. For this reason, we performed both low and high dimensional ICA analyses of resting state fMRI (rfMRI) data of 20 healthy controls and 21 AD patients, focusing on the primarily altered default mode network (DMN) and exploring the sensory motor network (SMN). As expected, results obtained at low dimensionality were in line with previous literature. Moreover, high dimensional results allowed us to observe both the presence of within-network disconnections and FC damage confined to some of the resting state sub-networks. Due to the higher sensitivity of the high dimensional ICA analysis, our results suggest that high-dimensional decomposition into sub-networks is very promising to better localize FC alterations in AD and that FC damage is not confined to the default mode network.

  3. Exploring the Lyapunov instability properties of high-dimensional atmospheric and climate models

    Science.gov (United States)

    De Cruz, Lesley; Schubert, Sebastian; Demaeyer, Jonathan; Lucarini, Valerio; Vannitsem, Stéphane

    2018-05-01

    The stability properties of intermediate-order climate models are investigated by computing their Lyapunov exponents (LEs). The two models considered are PUMA (Portable University Model of the Atmosphere), a primitive-equation simple general circulation model, and MAOOAM (Modular Arbitrary-Order Ocean-Atmosphere Model), a quasi-geostrophic coupled ocean-atmosphere model on a β-plane. We wish to investigate the effect of the different levels of filtering on the instabilities and dynamics of the atmospheric flows. Moreover, we assess the impact of the oceanic coupling, the dissipation scheme, and the resolution on the spectra of LEs. The PUMA Lyapunov spectrum is computed for two different values of the meridional temperature gradient defining the Newtonian forcing to the temperature field. The increase in the gradient gives rise to a higher baroclinicity and stronger instabilities, corresponding to a larger dimension of the unstable manifold and a larger first LE. The Kaplan-Yorke dimension of the attractor increases as well. The convergence rate of the rate function for the large deviation law of the finite-time Lyapunov exponents (FTLEs) is fast for all exponents, which can be interpreted as resulting from the absence of a clear-cut atmospheric timescale separation in such a model. The MAOOAM spectra show that the dominant atmospheric instability is correctly represented even at low resolutions. However, the dynamics of the central manifold, which is mostly associated with the ocean dynamics, is not fully resolved because of its associated long timescales, even at intermediate orders. As expected, increasing the mechanical atmosphere-ocean coupling coefficient or introducing a turbulent diffusion parametrisation reduces the Kaplan-Yorke dimension and Kolmogorov-Sinai entropy. In all considered configurations, we are not yet in the regime in which one can robustly define large deviation laws describing the statistics of the FTLEs. This
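    As a toy illustration of how a leading Lyapunov exponent is estimated for such models, the sketch below applies the two-trajectory Benettin renormalization method to the Lorenz-96 system, a standard low-order stand-in (not PUMA or MAOOAM). The forcing F = 8, the dimension N = 40, and the step sizes and integration lengths are illustrative choices, not values from the paper.

```python
import math, random

def l96_rhs(x, F=8.0):
    # Lorenz-96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    N = len(x)
    return [(x[(i + 1) % N] - x[(i - 2) % N]) * x[(i - 1) % N] - x[i] + F
            for i in range(N)]

def rk4_step(x, dt):
    # Classical fourth-order Runge-Kutta step.
    k1 = l96_rhs(x)
    k2 = l96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)])
    k3 = l96_rhs([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)])
    k4 = l96_rhs([xi + dt * ki for xi, ki in zip(x, k3)])
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

random.seed(42)
N, dt, d0 = 40, 0.01, 1e-8
x = [8.0 + 0.01 * random.gauss(0, 1) for _ in range(N)]
for _ in range(2000):          # discard transient, land on the attractor
    x = rk4_step(x, dt)
y = x[:]                       # shadow trajectory, perturbed by d0
y[0] += d0
log_growth, steps = 0.0, 5000
for _ in range(steps):
    x = rk4_step(x, dt)
    y = rk4_step(y, dt)
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    log_growth += math.log(d / d0)
    # Renormalize the separation back to d0 along its current direction.
    y = [a + d0 * (b - a) / d for a, b in zip(x, y)]
lam1 = log_growth / (steps * dt)
print(round(lam1, 2))
```

    Averaging the per-step stretching of an infinitesimal separation yields the largest LE; the full Lyapunov spectrum of the paper additionally requires evolving a set of tangent vectors with repeated QR (Gram-Schmidt) re-orthonormalization.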

  4. Comparative evaluation of image quality of various digital radiology systems and quality control and optimization of radiation protection at the Amiens university hospital

    International Nuclear Information System (INIS)

    Foro, Saturnin Didace L.

    2004-01-01

    This study was centered on two axes: the first is the comparative evaluation of image quality of various digital radiology systems; the second is quality control and the optimization of radiation protection. The publication of Directive 97/43 Euratom of the Council of June 30, 1997 and, subsequently, the decree of February 12, 2004 established a strict legal framework, which the University Hospital of Amiens is committed to respecting through the installation of an operational dosimetry system and the application of the February 12, 2004 decree. (author) [fr

  5. TH-CD-207A-07: Prediction of High Dimensional State Subject to Respiratory Motion: A Manifold Learning Approach

    International Nuclear Information System (INIS)

    Liu, W; Sawant, A; Ruan, D

    2016-01-01

    Purpose: The development of high dimensional imaging systems (e.g. volumetric MRI, CBCT, photogrammetry systems) in image-guided radiotherapy provides important pathways to the ultimate goal of real-time volumetric/surface motion monitoring. This study aims to develop a prediction method for the high dimensional state subject to respiratory motion. Compared to conventional linear dimension reduction based approaches, our method utilizes manifold learning to construct a descriptive feature submanifold, where more efficient and accurate prediction can be performed. Methods: We developed a prediction framework for the high-dimensional state subject to respiratory motion. The proposed method performs dimension reduction in a nonlinear setting to permit more descriptive features than its linear counterparts (e.g., classic PCA). Specifically, kernel PCA is used to construct a proper low-dimensional feature manifold, where low-dimensional prediction is performed. A fixed-point iterative pre-image estimation method is applied subsequently to recover the predicted value in the original state space. We evaluated and compared the proposed method with a PCA-based method on 200 level-set surfaces reconstructed from surface point clouds captured by the VisionRT system. The prediction accuracy was evaluated with respect to root-mean-squared error (RMSE) for both 200 ms and 600 ms lookahead lengths. Results: The proposed method outperformed the PCA-based approach with statistically higher prediction accuracy. In a one-dimensional feature subspace, our method achieved mean prediction accuracies of 0.86 mm and 0.89 mm for 200 ms and 600 ms lookahead lengths respectively, compared to 0.95 mm and 1.04 mm for the PCA-based method. Paired t-tests further demonstrated the statistical significance of the superiority of our method, with p-values of 6.33e-3 and 5.78e-5, respectively. Conclusion: The proposed approach benefits from the descriptiveness of a nonlinear manifold and the prediction
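    The linear PCA-based baseline that the abstract compares against can be sketched as follows: project each high-dimensional frame onto the leading principal component, extrapolate the one-dimensional score over the lookahead, and reconstruct the full state. The synthetic "surfaces", the sinusoidal breathing trace and all parameters below are invented; the comparison is against simple persistence, not against the paper's kernel PCA method.

```python
import math, random

random.seed(3)
D, T, dt = 50, 400, 0.05          # state dimension, frames, frame spacing (s)
mode = [math.sin(2 * math.pi * k / D) for k in range(D)]   # fixed spatial mode
# Synthetic breathing trace: 4 s period plus small per-frame noise.
states = []
for t in range(T):
    a = math.sin(2 * math.pi * t * dt / 4.0)
    states.append([a * m + random.gauss(0, 0.01) for m in mode])

# Leading principal component via power iteration on the sample covariance.
mean = [sum(s[k] for s in states) / T for k in range(D)]
X = [[s[k] - mean[k] for k in range(D)] for s in states]
v = [random.gauss(0, 1) for _ in range(D)]
for _ in range(50):
    proj = [sum(x[k] * v[k] for k in range(D)) for x in X]   # X v
    w = [sum(proj[i] * X[i][k] for i in range(T)) / T for k in range(D)]
    nrm = math.sqrt(sum(c * c for c in w))
    v = [c / nrm for c in w]
scores = [sum(x[k] * v[k] for k in range(D)) for x in X]

h = 4   # 4 frames at 0.05 s each, roughly a 200 ms lookahead
err_pred = err_pers = 0.0
cnt = 0
for t in range(1, T - h):
    # Predict the 1-D score by linear extrapolation, then reconstruct.
    s_pred = scores[t] + (scores[t] - scores[t - 1]) * h
    recon = [mean[k] + s_pred * v[k] for k in range(D)]
    truth = states[t + h]
    err_pred += sum((r - z) ** 2 for r, z in zip(recon, truth))
    err_pers += sum((a - z) ** 2 for a, z in zip(states[t], truth))
    cnt += D
rmse_pred = math.sqrt(err_pred / cnt)
rmse_pers = math.sqrt(err_pers / cnt)
print(round(rmse_pred, 3), round(rmse_pers, 3))
```

    Even this crude linear pipeline beats persistence on a smooth quasi-periodic signal; the paper's point is that replacing the linear projection with kernel PCA (plus pre-image recovery) captures nonlinear state variation that PCA misses.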

  6. Digital Ethics/Going Digital.

    Science.gov (United States)

    Wilson, Bradley

    1996-01-01

    Finds that the recent National Press Photographers Association code of ethics can serve as a model for any photography staff. Discusses how digital imaging is becoming commonplace in classrooms, due to decreasing costs and easier software. Explains digital terminology. Concludes that time saved in the darkroom and at the printer is now spent on…

  7. Digital radiography

    International Nuclear Information System (INIS)

    Coulomb, M.; Dal Soglio, S.; Pittet-Barbier, L.; Ranchoup, Y.; Thony, F.; Ferretti, G.; Robert, F.

    1992-01-01

    Digital projection radiography may replace conventional radiography some day, provided it can meet several requirements: equal or better diagnostic effectiveness than screen-film systems; reasonable image cost; and a real improvement in the productivity of the departments of imaging. All digital radiographic systems include an X-ray source, an image acquisition and formatting sub-system, a display and manipulation sub-system, an archiving sub-system and a laser editing system, preferably shared by other sources of digital images. Three digitization processes are available: digitization of the radiographic film, digital fluorography, and phospholuminescent detectors with memory. The advantages of digital fluorography are appealing: real-time image acquisition and suppression of cassettes; but its disadvantages are far from negligible: it cannot be applied to bedside radiography, the field of examination is limited, and the wide-field spatial resolution is poor. Phospholuminescent detectors with memory have great advantages: they can be used for bedside radiographs and on all common radiographic systems, and their spatial resolution is satisfactory; but their current disadvantages are considerable. These two systems have common properties, making up the entire philosophy of digital radiology, and specific features that must guide our choice according to the application. Digital fluorography is best applied in pediatric radiology. However, evaluation studies have shown that it is applicable, with sufficient quality, to many indications of general radiology in which fluoroscopic control and fast acquisition of the images are essential; the time gained on the examination may be considerable, as well as the savings on film. Detectors with memory are required for bedside radiographs, in osteoarticular and thoracic radiology, in all cases of traumatic emergency, and in resuscitation and intensive care departments

  8. Becoming digital

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2015-01-01

    An ethnographic account of how digital reforms are implemented in practice shows how street-level bureaucrats' classic tasks, such as specialized casework, are being reconfigured into educational tasks that promote the idea of “becoming digital”. In the paper, the author argues that the work of “becoming digital”… Originality/value: The study contributes to ethnographic research in public administration by combining two separate subfields, e-government and street-level bureaucracy, to discern recent transformations in public service delivery. In the digital era, tasks, control and equality are distributed in ways…

  9. Digital Humanities

    DEFF Research Database (Denmark)

    Brügger, Niels

    2016-01-01

    Digital humanities is an umbrella term for theories, methodologies, and practices related to humanities scholarship that use the digital computer as an integrated and essential part of its research and teaching activities. The computer can be used for establishing, finding, collecting, and preserving material to study, as an object of study in its own right, as an analytical tool, or for collaborating, and for disseminating results. The term "digital humanities" was coined around 2001, and gained currency within academia in the following years. However, computers had been used within…

  10. Digital Snaps

    DEFF Research Database (Denmark)

    Sandbye, Mette; Larsen, Jonas

    … Distance as the New Punctum / Mikko Villi -- pt. II. FAMILY ALBUMS IN TRANSITION -- ch. 4. How Digital Technologies Do Family Snaps, Only Better / Gillian Rose -- ch. 5. Friendship Photography: Memory, Mobility and Social Networking / Joanne Garde-Hansen -- ch. 6. Play, Process and Materiality in Japanese… -- ch. 9. Retouch Yourself: The Pleasures and Politics of Digital Cosmetic Surgery / Tanya Sheehan -- ch. 10. Virtual Selves: Art and Digital Autobiography / Louise Wolthers -- ch. 11. Mobile-Media Photography: New Modes of Engagement / Michael Shanks and Connie Svabo.

  11. Digital electronics

    CERN Document Server

    Morris, John

    2013-01-01

    An essential companion to John C. Morris's 'Analogue Electronics', this clear and accessible text is designed for electronics students, teachers and enthusiasts who already have a basic understanding of electronics and who wish to develop their knowledge of digital techniques and applications. Employing a discovery-based approach, the author covers fundamental theory before going on to develop an appreciation of logic networks, integrated circuit applications and analogue-digital conversion. A section on digital fault finding and useful IC data sheets completes the…

  12. Digital Leadership

    DEFF Research Database (Denmark)

    Zupancic, Tadeja; Verbeke, Johan; Achten, Henri

    2016-01-01

    Leadership is an important quality in organisations. Leadership is needed to introduce change and innovation. In our opinion, in architectural and design practices the role of leadership has not yet been sufficiently studied, especially when it comes to the role of digital tools and media. With this paper we intend to initiate a discussion in the eCAADe community to reflect on and develop ideas in order to develop digital leadership skills amongst the membership. This paper introduces some important aspects which may be valuable to look into when developing digital leadership skills.

  13. Digital radiography

    International Nuclear Information System (INIS)

    Zani, M.L.

    2002-01-01

    X-ray radiography is a very common technique used to check the homogeneity of a material or the inside of a mechanical part. Generally, the radiation that goes through the material being checked produces an image on a sensitized film. This method takes time because the film needs to be developed; digital radiography no longer has this inconvenience. In digital radiography the film is replaced by digital data, which can be processed as any computer file. This new technique is promising, but its main drawback is that its resolution is not yet as good as that of film radiography. (A.C.)

  14. Digital radiography

    International Nuclear Information System (INIS)

    Kusano, Shoichi

    1993-01-01

    Firstly, from an historic point of view, fundamental concepts on digital imaging were reviewed to provide a foundation for discussion of digital radiography. Secondly, this review summarized the results of ongoing research in computed radiography that replaces the conventional film-screen system with a photo-stimulable phosphor plate; and thirdly, image quality, radiation protection, and image processing techniques were discussed with emphasis on picture archiving and communication system environment as our final goal. Finally, future expansion of digital radiography was described based on the present utilization of computed tomography at the National Defense Medical College Hospital. (author) 60 refs

  15. Thermodynamics of noncommutative high-dimensional AdS black holes with non-Gaussian smeared matter distributions

    Energy Technology Data Exchange (ETDEWEB)

    Miao, Yan-Gang [Nankai University, School of Physics, Tianjin (China); Chinese Academy of Sciences, State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, P.O. Box 2735, Beijing (China); CERN, PH-TH Division, Geneva 23 (Switzerland); Xu, Zhen-Ming [Nankai University, School of Physics, Tianjin (China)

    2016-04-15

    Considering non-Gaussian smeared matter distributions, we investigate the thermodynamic behaviors of the noncommutative high-dimensional Schwarzschild-Tangherlini anti-de Sitter black hole, and we obtain the condition for the existence of extreme black holes. We indicate that the Gaussian smeared matter distribution, which is a special case of non-Gaussian smeared matter distributions, is not applicable for the six- and higher-dimensional black holes due to the hoop conjecture. In particular, the phase transition is analyzed in detail. Moreover, we point out that the Maxwell equal area law holds for the noncommutative black hole whose Hawking temperature is within a specific range, but fails for one whose Hawking temperature is beyond this range. (orig.)

  16. Thermodynamics of noncommutative high-dimensional AdS black holes with non-Gaussian smeared matter distributions

    CERN Document Server

    Miao, Yan-Gang

    2016-01-01

    Considering non-Gaussian smeared matter distributions, we investigate thermodynamic behaviors of the noncommutative high-dimensional Schwarzschild-Tangherlini anti-de Sitter black hole, and obtain the condition for the existence of extreme black holes. We indicate that the Gaussian smeared matter distribution, which is a special case of non-Gaussian smeared matter distributions, is not applicable for the 6- and higher-dimensional black holes due to the hoop conjecture. In particular, the phase transition is analyzed in detail. Moreover, we point out that the Maxwell equal area law holds for the noncommutative black hole whose Hawking temperature is within a specific range, but fails when the Hawking temperature is beyond this range.

  17. Estimation of the local response to a forcing in a high dimensional system using the fluctuation-dissipation theorem

    Directory of Open Access Journals (Sweden)

    F. C. Cooper

    2013-04-01

    Full Text Available The fluctuation-dissipation theorem (FDT) has been proposed as a method of calculating the response of the Earth's atmosphere to a forcing. For this problem the high dimensionality of the relevant data sets makes truncation necessary. Here we propose a method of truncation based upon the assumption that the response to a localised forcing is spatially localised, as an alternative to the standard method of choosing a number of the leading empirical orthogonal functions. For systems where this assumption holds, the response to any sufficiently small non-localised forcing may be estimated using a set of truncations that are chosen algorithmically. We test our algorithm using 36- and 72-variable versions of a stochastic Lorenz 95 system of ordinary differential equations. We find that, for long integrations, the bias in the response estimated by the FDT is reduced from ~75% of the true response to ~30%.
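The FDT response estimate underlying this record can be illustrated on a toy linear system (an assumption for illustration, not the authors' stochastic Lorenz 95 setup): the response operator is estimated as a sum of lag covariances times the inverse of the lag-zero covariance, and compared against the exact response of the system to a constant forcing.

```python
import numpy as np

# Toy stand-in: a stable linear stochastic system x_{t+1} = A x_t + noise,
# whose exact steady-state response to a constant forcing df is (I - A)^{-1} df.
rng = np.random.default_rng(0)
n, T, max_lag = 3, 200_000, 60
A = np.array([[0.8, 0.1, 0.0],
              [0.0, 0.7, 0.1],
              [0.0, 0.0, 0.6]])

X = np.empty((T, n))
x = np.zeros(n)
for t in range(T):
    x = A @ x + rng.standard_normal(n)
    X[t] = x
X -= X.mean(axis=0)

# FDT estimate of the response operator: L = sum_tau C(tau) C(0)^{-1}
C0_inv = np.linalg.inv(X.T @ X / T)
L = np.zeros((n, n))
for tau in range(max_lag):
    C_tau = X[tau:].T @ X[:T - tau] / (T - tau)   # lag-tau covariance
    L += C_tau @ C0_inv

L_true = np.linalg.inv(np.eye(n) - A)   # exact response operator
```

For a linear Gaussian system the FDT is exact, so `L` converges to `(I - A)^{-1}` as the integration length grows; the truncation question the paper addresses arises because, in high dimensions, the lag-zero covariance is too ill-conditioned to invert without a reduced basis.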

  18. Efficient computation of k-Nearest Neighbour Graphs for large high-dimensional data sets on GPU clusters.

    Directory of Open Access Journals (Sweden)

    Ali Dashti

    Full Text Available This paper presents an implementation of the brute-force exact k-Nearest Neighbor Graph (k-NNG) construction for ultra-large high-dimensional data clouds. The proposed method uses Graphics Processing Units (GPUs) and is scalable with multiple levels of parallelism (between nodes of a cluster, between different GPUs on a single node, and within a GPU). The method is applicable to homogeneous computing clusters with a varying number of nodes and GPUs per node. We achieve a 6-fold speedup in data processing as compared with an optimized method running on a cluster of CPUs, and bring a hitherto impossible k-NNG generation for a dataset of twenty million images with 15k dimensionality into the realm of practical possibility.
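The GPU kernels themselves are not reproduced in this record; as a hypothetical CPU sketch, the chunked brute-force pattern such implementations parallelize looks like the following, where each query chunk is what would map onto a GPU thread block:

```python
import numpy as np

def knn_graph(data, k, chunk=1024):
    """Exact k-NNG by chunked brute force (a CPU sketch of the GPU pattern)."""
    n = data.shape[0]
    sq = np.einsum('ij,ij->i', data, data)          # squared norms of all points
    idx = np.empty((n, k), dtype=np.int64)
    for start in range(0, n, chunk):
        q = data[start:start + chunk]
        m = q.shape[0]
        # squared Euclidean distances: query chunk vs all points
        d2 = sq[start:start + m, None] + sq[None, :] - 2.0 * (q @ data.T)
        d2[np.arange(m), np.arange(start, start + m)] = np.inf  # exclude self
        # k smallest per row, then order them by distance
        part = np.argpartition(d2, k, axis=1)[:, :k]
        order = np.take_along_axis(d2, part, axis=1).argsort(axis=1)
        idx[start:start + m] = np.take_along_axis(part, order, axis=1)
    return idx
```

The chunking keeps the distance matrix at `chunk x n` instead of `n x n`, which is the same memory discipline a GPU version needs to fit in device memory.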

  19. Clustering high-dimensional mixed data to uncover sub-phenotypes: joint analysis of phenotypic and genotypic data.

    Science.gov (United States)

    McParland, D; Phillips, C M; Brennan, L; Roche, H M; Gormley, I C

    2017-12-10

    The LIPGENE-SU.VI.MAX study, like many others, recorded high-dimensional continuous phenotypic data and categorical genotypic data. LIPGENE-SU.VI.MAX focuses on the need to account for both phenotypic and genetic factors when studying the metabolic syndrome (MetS), a complex disorder that can lead to higher risk of type 2 diabetes and cardiovascular disease. Interest lies in clustering the LIPGENE-SU.VI.MAX participants into homogeneous groups or sub-phenotypes, by jointly considering their phenotypic and genotypic data, and in determining which variables are discriminatory. A novel latent variable model that elegantly accommodates high dimensional, mixed data is developed to cluster LIPGENE-SU.VI.MAX participants using a Bayesian finite mixture model. A computationally efficient variable selection algorithm is incorporated, estimation is via a Gibbs sampling algorithm and an approximate BIC-MCMC criterion is developed to select the optimal model. Two clusters or sub-phenotypes ('healthy' and 'at risk') are uncovered. A small subset of variables is deemed discriminatory, which notably includes phenotypic and genotypic variables, highlighting the need to jointly consider both factors. Further, 7 years after the LIPGENE-SU.VI.MAX data were collected, participants underwent further analysis to diagnose presence or absence of the MetS. The two uncovered sub-phenotypes strongly correspond to the 7-year follow-up disease classification, highlighting the role of phenotypic and genotypic factors in the MetS and emphasising the potential utility of the clustering approach in early screening. Additionally, the ability of the proposed approach to define the uncertainty in sub-phenotype membership at the participant level is synonymous with the concepts of precision medicine and nutrition. Copyright © 2017 John Wiley & Sons, Ltd.
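The model in this record is far richer than can be shown here, but the Gibbs-sampling machinery it relies on can be illustrated on a deliberately tiny analogue: a one-dimensional Gaussian mixture with known unit variance (the priors and initialization below are illustrative assumptions, not the paper's model).

```python
import numpy as np

def gibbs_gmm(x, K=2, iters=300, seed=1):
    """Gibbs sampler for a K-component Gaussian mixture with unit variance.
    (A toy analogue of a Bayesian finite mixture; the paper's model additionally
    handles high-dimensional mixed phenotype/genotype data and variable selection.)"""
    rng = np.random.default_rng(seed)
    n = x.size
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))   # spread-out initial means
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        # 1) sample assignments z_i given (mu, pi)
        logp = np.log(pi) - 0.5 * (x[:, None] - mu[None, :]) ** 2
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = (p.cumsum(axis=1) > rng.random((n, 1))).argmax(axis=1)
        # 2) sample mu_k given z, with a N(0, 10^2) prior on each mean
        counts = np.bincount(z, minlength=K)
        for k in range(K):
            prec = counts[k] + 0.01                 # posterior precision
            mean = x[z == k].sum() / prec           # posterior mean
            mu[k] = mean + rng.standard_normal() / np.sqrt(prec)
        # 3) sample weights pi given z, Dirichlet(1, ..., 1) prior
        pi = rng.dirichlet(1.0 + counts)
    return np.sort(mu)                              # sort to undo label switching

# Demo: two well-separated components at -3 and +3
data_rng = np.random.default_rng(42)
x = np.concatenate([data_rng.normal(-3.0, 1.0, 300),
                    data_rng.normal(3.0, 1.0, 300)])
means = gibbs_gmm(x)
```

Each sweep conditionally samples one block of parameters given all the others, which is exactly the structure that lets the paper's richer model (mixed data, variable selection indicators) be estimated one block at a time.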

  20. A Fast Exact k-Nearest Neighbors Algorithm for High Dimensional Search Using k-Means Clustering and Triangle Inequality.

    Science.gov (United States)

    Wang, Xueyi

    2012-02-08

    The k-nearest neighbors (k-NN) algorithm is a widely used machine learning method that finds nearest neighbors of a test object in a feature space. We present a new exact k-NN algorithm called kMkNN (k-Means for k-Nearest Neighbors) that uses the k-means clustering and the triangle inequality to accelerate the searching for nearest neighbors in a high dimensional space. The kMkNN algorithm has two stages. In the buildup stage, instead of using complex tree structures such as metric trees, kd-trees, or ball-trees, kMkNN uses a simple k-means clustering method to preprocess the training dataset. In the searching stage, given a query object, kMkNN finds nearest training objects starting from the nearest cluster to the query object and uses the triangle inequality to reduce the distance calculations. Experiments show that the performance of kMkNN is surprisingly good compared to the traditional k-NN algorithm and tree-based k-NN algorithms such as kd-trees and ball-trees. On a collection of 20 datasets with up to 10^6 records and 10^4 dimensions, kMkNN shows a 2- to 80-fold reduction of distance calculations and a 2- to 60-fold speedup over the traditional k-NN algorithm for 16 datasets. Furthermore, kMkNN performs significantly better than a kd-tree based k-NN algorithm for all datasets and performs better than a ball-tree based k-NN algorithm for most datasets. The results show that kMkNN is effective for searching nearest neighbors in high dimensional spaces.
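As a rough illustration of the searching-stage idea (a simplified sketch, not the authors' implementation), the triangle inequality gives the lower bound d(q, x) >= |d(q, c) - d(x, c)| for a point x in a cluster with center c, so precomputed center distances let many candidates be rejected without computing d(q, x):

```python
import numpy as np
from heapq import heappush, heappushpop

def kmknn_query(q, data, centers, labels, k):
    """Exact k-NN search in the spirit of kMkNN: cluster members are scanned in
    order of their stored distance to the cluster center, and the triangle
    inequality prunes distance computations."""
    d_qc = np.linalg.norm(centers - q, axis=1)
    heap = []  # max-heap via negated distances; holds the k best so far
    for c in np.argsort(d_qc):                      # nearest cluster first
        members = np.where(labels == c)[0]
        d_xc = np.linalg.norm(data[members] - centers[c], axis=1)
        for j in np.argsort(d_xc):                  # members by center distance
            kth = -heap[0][0] if len(heap) == k else np.inf
            if d_xc[j] - d_qc[c] >= kth:
                break        # all remaining members have a lower bound >= kth
            if abs(d_qc[c] - d_xc[j]) >= kth:
                continue     # lower bound already rules this member out
            d = np.linalg.norm(data[members[j]] - q)
            if len(heap) < k:
                heappush(heap, (-d, members[j]))
            elif d < kth:
                heappushpop(heap, (-d, members[j]))
    return sorted(int(i) for _, i in heap)
```

Because every skipped candidate is excluded by a valid lower bound, the result is identical to brute force; only the number of full distance evaluations changes.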

  1. Review: Michael Kerres & Reinhard Keil-Slawik (Eds.) (2005). Hochschulen im digitalen Zeitalter: Innovationspotenziale und Strukturwandel [Universities in the Digital Age: Opportunities for Innovation and Structural Change]

    Directory of Open Access Journals (Sweden)

    Sabine Hoidn

    2008-01-01

    Full Text Available In this anthology, international experts from universities and corporations share their experiences in facilitating the use of information and communication technologies in higher education, focusing on unfolding potentials and possibilities as well as identifying the general conditions necessary. Thirty-five experts address current and future challenges, pointing out some of the issues higher education institutions must tackle when it comes to e-learning, while presenting a large variety of projects and (so far) successful practices. Due to limited resources, increasing competition, and the requirements of a global knowledge society, institutions of (higher) learning have to focus more on the development and implementation of e-learning strategies, strategic partnerships, integrated information management services, innovative learning and teaching concepts, continuing education, and quality development. In this book, researchers and practitioners alike will find interesting ideas and insights regarding these topics. URN: urn:nbn:de:0114-fqs0801584

  2. Evaluation Methodologies for Information Management Systems; Building Digital Tobacco Industry Document Libraries at the University of California, San Francisco Library/Center for Knowledge Management; Experiments with the IFLA Functional Requirements for Bibliographic Records (FRBR); Coming to Term: Designing the Texas Email Repository Model.

    Science.gov (United States)

    Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia

    2002-01-01

    Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…

  3. Digital fabrication

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 3) issue of the Nexus Network Journal features seven original papers dedicated to the theme “Digital Fabrication”. Digital fabrication is changing architecture in fundamental ways in every phase, from concept to artifact. Projects growing out of research in digital fabrication are dependent on software that is entirely surface-oriented in its underlying mathematics. Decisions made during design, prototyping, fabrication and assembly rely on codes, scripts, parameters, operating systems and software, creating the need for teams with multidisciplinary expertise and different skills, from IT to architecture, design, material engineering, and mathematics, among others. The papers grew out of a Lisbon symposium hosted by the ISCTE-Instituto Universitario de Lisboa entitled “Digital Fabrication – A State of the Art”. The issue is completed with four other research papers which address different mathematical instruments applied to architecture, including geometric tracing system...

  4. Digital Relationships

    DEFF Research Database (Denmark)

    Ledborg Hansen, Richard

    ...-rich information and highly interesting communication are sky-high and rising. With a continuous increase in digitized communication follows a decrease in face-to-face encounters, and our ability to engage in inter-personal relationships is suffering for it (Davis, 2013). The behavior described in this paper ... (...-Jones, 2011) for increases in effectiveness and efficiency we indiscriminately embrace digital communication and digitized information dissemination with enthusiasm, at the risk of ignoring the potentially dark side of technology. However, technology also holds a promise for better understanding precisely ... for the same reasons: that the growing amount of digitized communication “out there” represents data waiting to be sifted, analyzed and decoded. In this paper “Facebook behavior” refers to a particular behavior characterized by presenting your self and representations of selected self in the hope of getting...

  5. Digital Discretion

    DEFF Research Database (Denmark)

    Busch, Peter Andre; Zinner Henriksen, Helle

    2018-01-01

    This study reviews 44 peer-reviewed articles on digital discretion published in the period from 1998 to January 2017. Street-level bureaucrats have traditionally had a wide ability to exercise discretion, stirring debate since they can add their personal footprint on public policies. Digital discretion is suggested to reduce this footprint by influencing or replacing their discretionary practices using ICT. What is less researched is whether digital discretion can cause changes in public policy outcomes, and under what conditions such changes can occur. Using the concept of public service values, we suggest that digital discretion can strengthen ethical and democratic values but weaken professional and relational values. Furthermore, we conclude that contextual factors such as considerations made by policy makers on the macro-level and the degree of professionalization of street-level bureaucrats...

  6. Do hospital physicians really want to go digital? Acceptance of a picture archiving and communication system in a university hospital; Moechten Krankenhausaerzte wirklich auf digitale Systeme umsteigen? Die Akzeptanz gegenueber einem Bildarchivierungs- und Uebermittlungssystem in einer Universitaetsklinik

    Energy Technology Data Exchange (ETDEWEB)

    Duyck, P.; Pynoo, B.; Devolder, P.; Voet, T.; Adang, L.; Vercruysse, J. [Radiologie und medizinische Bildgebung, Universitaetsklinik Gent (Belgium)

    2008-07-15

    Purpose: radiology departments are making the transition from analog film to digital images by means of PACS (Picture Archiving and Communication System). It is critical for the hospital that its physicians adopt and accept the new digital work method regarding radiological information. The aim of this study is to investigate hospital physicians' acceptance of PACS using questionnaires pre- and post-implementation and to identify the main influencing factors. Materials and methods: the study was conducted in a 1169-bed university hospital. The UTAUT (Unified Theory of Acceptance and Use of Technology) questionnaire was administered at two time points: one month pre-implementation (T1) and 1.5 years post-implementation (T2) of PACS, targeting all hospital physicians with the exception of radiologists. The UTAUT scales (Behavioral Intention BI; Facilitating Conditions FC; Effort Expectancy EE; Performance Expectancy PE; Anxiety ANX; Social Influence SI; System Use USE; Attitude toward technology ATT; Self-Efficacy SE) were used to assess questions regarding: (a) PACS' usefulness, (b) PACS' ease of learning/using, (c) PACS support availability, (d) the perceived pressure to use PACS, (e) physicians' attitude towards PACS and (f) physicians' intention to use and actual use of PACS. Results: at T1, scale ratings were positive toward the PACS implementation. The ratings on all scales, with the exception of self-efficacy, improved at T2. Regression analysis revealed that the key factor for intention to use PACS at T1 was the usefulness of PACS, while the availability and awareness of support was its most important predictor at T2. Overall, PE was the best predictor of BI, but all four UTAUT determinants (PE, FC, EE and SI) were salient for its prediction. Variance explained in BI ranged from 31 to 37%, while variance explained in USE was very low (3%). (orig.)

  7. Digital Collections, Digital Libraries & the Digitization of Cultural Heritage Information.

    Science.gov (United States)

    Lynch, Clifford

    2002-01-01

    Discusses digital collections and digital libraries. Topics include broadband availability; digital rights protection; content, both non-profit and commercial; digitization of cultural content; sustainability; metadata harvesting protocol; infrastructure; authorship; linking multiple resources; data mining; digitization of reference works;…

  8. Review of DigitalSignage.com

    Directory of Open Access Journals (Sweden)

    Clifford Richmond

    2014-04-01

    Full Text Available Digital signage has been used in the commercial sector for decades. As display and networking technologies become more advanced and less expensive, it is surprisingly easy to implement a digital signage program at a minimal cost. In the fall of 2011, the University of Florida (UF) Health Sciences Center Library (HSCL) initiated the use of digital signage inside and outside its Gainesville, Florida facility. This article details UF HSCL's use and evaluation of DigitalSignage.com signage software to organize and display its digital content.

  9. digital natives and digital immigrants

    OpenAIRE

    Cardina, Bruno; Francisco, Jerónimo; Reis, Pedro; trad. Silva, Fátima

    2011-01-01

    This article focuses on the generational gaps in school learning. Initially, we have tried to provide the framework in relation to the term digital native in order to understand the key aspects of the generation born after the advent and the global use of the Internet. They were found to be “multitasking” people, linked to technology and connectivity, as opposed to digital immigrants, born in an earlier period and seeking to adapt to the technological world. We also present some r...

  10. The establishment phases of ETD program in a brand new University

    KAUST Repository

    Baessa, Mohamed A.; Vijayakumar, J.K.

    2014-01-01

    For many Universities, ETD is a combination of digitized documents and digitally born documents. But at brand new Universities like King Abdullah University of Science and Technology (KAUST), or more specifically for the libraries that are born

  11. Accessible Geoscience - Digital Fieldwork

    Science.gov (United States)

    Meara, Rhian

    2017-04-01

    Accessible Geoscience is a developing field of pedagogic research aimed at widening participation in Geography, Earth and Environmental Science (GEES) subjects. These subjects are often less commonly associated with disabilities, ethnic minorities, low income socio-economic groups and females. While advancements and improvements have been made in the inclusivity of these subject areas in recent years, access and participation of disabled students remains low. While universities are legally obligated to provide reasonable adjustments to ensure accessibility, the assumed incompatibility of GEES subjects and disability often deters students from applying to study these courses at a university level. Instead of making reasonable adjustments if and when they are needed, universities should be aiming to develop teaching materials, spaces and opportunities which are accessible to all, which in turn will allow all groups to participate in the GEES subjects. With this in mind, the Swansea Geography Department wish to enhance the accessibility of our undergraduate degree by developing digital field work opportunities. In the first instance, we intend to digitise three afternoon excursions which are run as part of a 1st year undergraduate module. Each of the field trips will be digitized into English- and Welsh-medium formats. In addition, each field trip will be digitized into British Sign Language (BSL) to allow for accessibility for D/deaf and hard of hearing students. Subtitles will also be made available in each version. While the main focus of this work is to provide accessible fieldwork opportunities for students with disabilities, this work also has additional benefits. Students within the Geography Department will be able to revisit the field trips, to revise and complete associated coursework. The use of digitized field work should not replace opportunities for real field work, but its use by the full cohort of students will begin to "normalize" accessible field

  12. The relationship between second-to-fourth digit (2D:4D) ratios and problematic and pathological Internet use among Turkish university students.

    Science.gov (United States)

    Canan, Fatih; Karaca, Servet; Düzgün, Melike; Erdem, Ayşe Merve; Karaçaylı, Esranur; Topan, Nur Begüm; Lee, Sang-Kyu; Zhai, Zu Wei; Kuloğlu, Murat; Potenza, Marc N

    2017-03-01

    Background and aims: The ratio of the second and fourth fingers (2D:4D ratio) is a sexually dimorphic trait, with men tending to have lower values than women. This ratio has been related to prenatal testosterone concentrations and addictive behaviors including problematic video-gaming. We aimed to investigate the possible association between 2D:4D ratios and Internet addiction and whether such a relationship would be independent of impulsivity. Methods: A total of 652 university students (369 women, 283 men), aged 17-27 years, were enrolled in the study. Problematic and pathological Internet use (PPIU) was assessed using the Internet Addiction Test (IAT). The participants also completed the Barratt Impulsiveness Scale (version 11; BIS-11) and had their 2D:4D ratios measured. Results: 2D:4D ratios were not significantly different in women with PPIU and in those with adaptive Internet use (AIU). Men with PPIU exhibited lower 2D:4D ratios on both hands when compared with those with AIU. Correlation analysis revealed that 2D:4D ratios on both hands were negatively correlated with IAT scores among men, but not among women. The multiple linear regression analysis revealed that age, duration of weekly Internet use, impulsiveness, and 2D:4D ratios on the right hand were independently associated with IAT scores among men, and impulsivity did not mediate the relationship between 2D:4D ratios and PPIU. Conclusions: For men, 2D:4D ratios on the right hand were inversely correlated with Internet addiction severity even after controlling for individual differences in impulsivity. These findings suggest that high prenatal testosterone levels may contribute to the occurrence of PPIU among men.

  13. Chronopolis Digital Preservation Network

    Directory of Open Access Journals (Sweden)

    David Minor

    2010-07-01

    Full Text Available The Chronopolis Digital Preservation Initiative, one of the Library of Congress’ latest efforts to collect and preserve at-risk digital information, has completed its first year of service as a multi-member partnership to meet the archival needs of a wide range of domains. Chronopolis is a digital preservation data grid framework developed by the San Diego Supercomputer Center (SDSC) at UC San Diego, the UC San Diego Libraries (UCSDL), and their partners at the National Center for Atmospheric Research (NCAR) in Colorado and the University of Maryland's Institute for Advanced Computer Studies (UMIACS). Chronopolis addresses a critical problem by providing a comprehensive model for the cyberinfrastructure of collection management, in which preserved intellectual capital is easily accessible, and research results, education material, and new knowledge can be incorporated smoothly over the long term. Integrating digital library, data grid, and persistent archive technologies, Chronopolis has created trusted environments that span academic institutions and research projects, with the goal of long-term digital preservation. A key goal of the Chronopolis project is to provide cross-domain collection sharing for long-term preservation. Using existing high-speed educational and research networks and mass-scale storage infrastructure investments, the partnership is leveraging the data storage capabilities at SDSC, NCAR, and UMIACS to provide a preservation data grid that emphasizes heterogeneous and highly redundant data storage systems. In this paper we will explore the major themes within Chronopolis, including: (a) the philosophy and theory behind a nationally federated data grid for preservation; (b) the core tools and technologies used in Chronopolis; (c) the metadata schema being developed within Chronopolis for all of the data elements; (d) lessons learned from the first year of the project; and (e) next steps in digital preservation using Chronopolis: how we

  14. Pedagogical Digital Competence--Between Values, Knowledge and Skills

    Science.gov (United States)

    From, Jorgen

    2017-01-01

    The fact that the education provided by universities and university colleges is becoming ever more digitalized has resulted in new challenges for university teachers in providing high-quality teaching and adapting to the needs of changing student populations. Digitalization has increasingly introduced a new dimension in teachers' pedagogical…

  15. Digital evidence

    Directory of Open Access Journals (Sweden)

    Lukić Tatjana

    2012-01-01

    Full Text Available Although the computer makes human activities faster and easier, innovating and creating new forms of work and other kinds of activities, it has also influenced criminal activity. The development of information technology directly affects the development of computer forensics, without which one cannot even imagine discovering and proving computer offences and apprehending the perpetrators. Information technology and computer forensics allow us to detect and prove crimes committed by computer and capture the perpetrators. Computer forensics is a type of forensics which can be defined as a process of collecting, preserving, analyzing and presenting digital evidence in court proceedings. Bearing in mind that combating crime in which computers appear as an asset or object of the offense requires knowledge of digital evidence as well as specific rules and procedures, the author in this article specifically addresses the issues of digital evidence, forensic (computer) investigation, specific rules and procedures for detecting, fixing and collecting digital evidence, and the use of this type of evidence in criminal proceedings. The author also deals with international standards regarding digital evidence and cyber-space investigation.

  16. Digital watermark

    Directory of Open Access Journals (Sweden)

    Jasna Maver

    2000-01-01

    Full Text Available The huge amount of multimedia contents available on the World-Wide-Web is beginning to raise the question of their protection. Digital watermarking is a technique which can serve various purposes, including intellectual property protection, authentication and integrity verification, as well as visible or invisible content labelling of multimedia content. Due to the diversity of digital watermarking applicability, there are many different techniques, which can be categorised according to different criteria. A digital watermark can be categorised as visible or invisible and as robust or fragile. In contrast to the visible watermark where a visible pattern or image is embedded into the original image, the invisible watermark does not change the visual appearance of the image. The existence of such a watermark can be determined only through a watermark extraction or detection algorithm. The robust watermark is used for copyright protection, while the fragile watermark is designed for authentication and integrity verification of multimedia content. A watermark must be detectable or extractable to be useful. In some watermarking schemes, a watermark can be extracted in its exact form, in other cases, we can detect only whether a specific given watermarking signal is present in an image. Digital libraries, through which cultural institutions will make multimedia contents available, should support a wide range of service models for intellectual property protection, where digital watermarking may play an important role.
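As a minimal concrete example of the invisible/fragile category described above (a toy sketch, not one of the schemes surveyed), a watermark can be embedded in the least-significant bits of pixel values: it is invisible because no pixel changes by more than 1, and fragile because almost any subsequent processing destroys it.

```python
import numpy as np

def embed_lsb(image, bits):
    """Embed a bit sequence into the least-significant bits of the first pixels."""
    flat = image.reshape(-1).copy()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # clear LSB, set to bit
    return flat.reshape(image.shape)

def extract_lsb(image, n_bits):
    """Recover the first n_bits least-significant bits."""
    return image.reshape(-1)[:n_bits] & 1

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)    # toy cover image
mark = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)    # toy watermark bits
marked = embed_lsb(cover, mark)
```

A robust watermark for copyright protection would instead spread the signal over perceptually significant coefficients (e.g. in a transform domain) precisely so that it survives such processing.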

  17. Digital Creativity

    DEFF Research Database (Denmark)

    Petersson Brooks, Eva; Brooks, Anthony Lewis

    2014-01-01

    This paper reports on a study exploring the outcomes from children’s play with technology in early childhood learning practices. The paper addresses questions related to how digital technology can foster creativity in early childhood learning environments. It consists of an analysis of children’s interaction with the KidSmart furniture, focusing on digital creativity potentials and play values suggested by the technology. The study applied a qualitative approach and included 125 children (aged three to five), 10 pedagogues, and two librarians. The results suggest that educators should sensitively consider intervening when children are interacting with technology, and rather put emphasis on the integration of the technology into the environment and the curriculum in order to shape playful structures for children’s digital creativity.

  18. Universal service policy in Vietnam

    DEFF Research Database (Denmark)

    Do Manh, Thai; Falch, Morten; Von Salakpi, Simeon

    2016-01-01

    Universal service provision is key to bridging the digital divide. This paper provides an empirical examination of the Vietnamese universal service policy introduced in 2015 for implementation up to 2020. Using the framework of King et al. (1994), the paper analyses the universal service policy in Vietnam...

  19. Using Digital Games to Learn Mathematics – What students think?

    OpenAIRE

    Su Ting Yong; Ian Harrison; Peter Gates

    2016-01-01

    The aim of this study was to explore how university foundation students perceive the use of digital games in learning mathematics. Data was collected using an online questionnaire and 209 foundation university students participated in this study.  The questionnaire was used to explore students’ gaming experience and students’ attitude towards mathematics learning with digital games.  It was found that most of the university foundation students liked to play different types of digital games.  ...

  20. Digital photogrammetry

    CERN Document Server

    Egels, Yves

    2003-01-01

    Photogrammetry is primarily the use of photography for surveying, and is used for the production of maps from aerial photographs. Along with remote sensing, it represents the primary means of generating data for Geographic Information Systems (GIS). As technology develops, it is becoming easier to gain access to it. The cost of digital photogrammetric workstations is falling quickly, and these new tools are therefore becoming accessible to more and more users. Digital Photogrammetry is particularly useful as a text for graduate students in geomatics and is also suitable for people with a good basic scientific knowledge who need to understand photogrammetry and who wish to use the book as a reference.

  1. Digital Marketing

    OpenAIRE

    Jerry Wind; Vijay Mahajan

    2002-01-01

    The digital revolution has shaken marketing to its core with consumers being offered greater price transparency and often even the chance to dictate the price. What does pricing mean in a world in which customers propose their own prices (as at priceline.com) or buyers and sellers haggle independently in auctions (as at e-Bay)? The most significant changes in the digital marketing show the emergence of 'cyber consumers', the cyber business-to-business world and the changing reality of an incr...

  2. Digital "X"

    DEFF Research Database (Denmark)

    Baiyere, Abayomi; Grover, Varun; Gupta, Alok

    2017-01-01

    Interest in using “digital” before existing research concepts seems to be on the rise in the IS field. This panel is positioned to explore what value lies in labelling our research as digital “x” as opposed to the well-established IT “x” (where “x” can be strategy, infrastructure, innovation, artifa...... between this stream of research and existing research. Central among the expected output of the panel is the advancement of suggestions for future research and the critical pitfalls to avoid in doing so....

  3. Digital Radiography

    Science.gov (United States)

    1986-01-01

    System One, a digital radiography system, incorporates a reusable image medium (RIM) which retains an image. No film is needed; the RIM is read with a laser scanner, and the information is used to produce a digital image on an image processor. The image is stored on an optical disc. The system allows the radiologist to "dial away" unwanted images and to compare views on three screens. It is compatible with existing equipment and cost-efficient. It was commercialized by a Stanford researcher from energy-selective technology developed under a NASA grant.

  4. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s
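The smoothing operation mentioned in this description, a nonrecursive (FIR) filter in Hamming's terminology, can be illustrated with a hypothetical moving-average example (an illustration of the general idea, not an example from the book):

```python
import numpy as np

def moving_average(signal, width=3):
    """Smooth a signal with a length-`width` FIR moving-average filter."""
    kernel = np.full(width, 1.0 / width)        # equal filter coefficients
    return np.convolve(signal, kernel, mode='valid')

# A ramp with additive oscillation: smoothing keeps the trend, damps the noise.
t = np.arange(100, dtype=float)
noisy = t + np.sin(7.0 * t)
smooth = moving_average(noisy, width=5)
```

Differentiating, integrating, and band-separating filters follow the same pattern with different coefficient sequences; the design of those sequences is the subject of the book.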

  5. Digital voltmeter

    International Nuclear Information System (INIS)

    Yohannes Kamadi; Soekarno.

    1976-01-01

    An electrical voltage-measuring instrument with a digital display has been built. The instrument uses a four-digit display with single-polarity measurement and an integrating system. Pulses from the oscillator are counted and converted into a staircase voltage, which is compared to the voltage being measured. When balance is achieved, a pulse appears at the comparator circuit. This pulse triggers a univibrator circuit. The univibrator output serves as the signal to stop the counting, and when the reading time T expires, the counting system is reset. (authors)
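The staircase-comparison principle described above can be sketched in a few lines. This is an illustrative model only: the step size and counter range are made up, not taken from the instrument, and the analog input is modeled in integer millivolts for clarity.

```python
# Illustrative model of a staircase (ramp-compare) converter: oscillator
# pulses are counted, each pulse raises the staircase voltage one step,
# and counting stops once the staircase reaches the input voltage.

def staircase_adc(v_in_mv, step_mv=1, max_counts=9999):
    """Return the pulse count at which the staircase reaches the input."""
    count = 0
    while count * step_mv < v_in_mv and count < max_counts:
        count += 1          # one oscillator pulse counted
    return count            # shown on the four-digit display

print(staircase_adc(1234))  # a 1.234 V input with 1 mV steps reads 1234
```

With a four-digit display the counter saturates at 9999, which is why the loop also bounds the count.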

  6. Digital communication

    CERN Document Server

    Das, Apurba

    2010-01-01

    ""Digital Communications"" presents the theory and application of the philosophy of Digital Communication systems in a unique but lucid form. This book inserts equal importance to the theory and application aspect of the subject whereby the authors selected a wide class of problems. The Salient features of the book are: the foundation of Fourier series, Transform and wavelets are introduces in a unique way but in lucid language; the application area is rich and resemblance to the present trend of research, as we are attached with those areas professionally; a CD is included which contains code

  7. Digital literacies

    CERN Document Server

    Hockly, Nicky; Pegrum, Mark

    2014-01-01

    Dramatic shifts in our communication landscape have made it crucial for language teaching to go beyond print literacy and encompass the digital literacies which are increasingly central to learners' personal, social, educational and professional lives. By situating these digital literacies within a clear theoretical framework, this book provides educators and students alike with not just the background for a deeper understanding of these key 21st-century skills, but also the rationale for integrating these skills into classroom practice. This is the first methodology book to address not jus

  8. SkyFACT: high-dimensional modeling of gamma-ray emission with adaptive templates and penalized likelihoods

    Energy Technology Data Exchange (ETDEWEB)

    Storm, Emma; Weniger, Christoph [GRAPPA, Institute of Physics, University of Amsterdam, Science Park 904, 1090 GL Amsterdam (Netherlands); Calore, Francesca, E-mail: e.m.storm@uva.nl, E-mail: c.weniger@uva.nl, E-mail: francesca.calore@lapth.cnrs.fr [LAPTh, CNRS, 9 Chemin de Bellevue, BP-110, Annecy-le-Vieux, 74941, Annecy Cedex (France)

    2017-08-01

    We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10^5) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data as well as on gamma-ray emission from the inner Galaxy, |ℓ| < 90° and |b| < 20°, as observed by the Fermi Large Area Telescope. We finally define a simple reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as basis for future studies of diffuse emission in and outside the Galactic disk.
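The core computation in this record, a penalized Poisson likelihood minimized with L-BFGS-B under nonnegativity bounds, can be sketched with SciPy. The templates, data, and quadratic penalty below are synthetic stand-ins, not SkyFACT's actual model or regularization functions.

```python
# Sketch: fit nonnegative template normalizations theta so that
# mu = T @ theta matches Poisson counts k, by minimizing the negative
# Poisson log-likelihood plus a simple quadratic penalty with L-BFGS-B.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
npix, ntmpl = 2000, 3
T = rng.uniform(0.5, 2.0, size=(npix, ntmpl))   # synthetic spatial templates
theta_true = np.array([3.0, 1.0, 0.5])
k = rng.poisson(T @ theta_true)                 # synthetic observed counts

lam = 1e-3                                      # illustrative penalty strength

def objective(theta):
    mu = T @ theta
    # negative Poisson log-likelihood (dropping the constant k! term)
    nll = np.sum(mu - k * np.log(mu + 1e-12))
    return nll + lam * np.sum((theta - 1.0) ** 2)

res = minimize(objective, x0=np.ones(ntmpl), method="L-BFGS-B",
               bounds=[(1e-8, None)] * ntmpl)
print(res.x)   # recovered normalizations, close to theta_true
```

In the real analysis the parameter vector has over 10^5 entries and the penalties follow the maximum entropy method; the bounded quasi-Newton structure is the same.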

  9. High-dimensional structured light coding/decoding for free-space optical communications free of obstructions.

    Science.gov (United States)

    Du, Jing; Wang, Jian

    2015-11-01

    Bessel beams carrying orbital angular momentum (OAM) with helical phase fronts exp(ilφ) (l = 0, ±1, ±2, …), where φ is the azimuthal angle and l corresponds to the topological number, are orthogonal with each other. This feature of Bessel beams provides a new dimension to code/decode data information on the OAM state of light, and the theoretical infinity of topological number enables possible high-dimensional structured light coding/decoding for free-space optical communications. Moreover, Bessel beams are nondiffracting beams having the ability to recover by themselves in the face of obstructions, which is important for free-space optical communications relying on line-of-sight operation. By utilizing the OAM and nondiffracting characteristics of Bessel beams, we experimentally demonstrate 12 m distance obstruction-free optical m-ary coding/decoding using visible Bessel beams in a free-space optical communication system. We also study the bit error rate (BER) performance of hexadecimal and 32-ary coding/decoding based on Bessel beams with different topological numbers. After receiving 500 symbols at the receiver side, a zero BER of hexadecimal coding/decoding is observed when the obstruction is placed along the propagation path of light.

  10. CyTOF workflow: differential discovery in high-throughput high-dimensional cytometry datasets [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Malgorzata Nowicka

    2017-05-01

    High-dimensional mass and flow cytometry (HDCyto) experiments have become a method of choice for high-throughput interrogation and characterization of cell populations. Here, we present an R-based pipeline for differential analyses of HDCyto data, largely based on Bioconductor packages. We computationally define cell populations using FlowSOM clustering, and facilitate an optional but reproducible strategy for manual merging of algorithm-generated clusters. Our workflow offers different analysis paths, including association of cell type abundance with a phenotype or changes in signaling markers within specific subpopulations, or differential analyses of aggregated signals. Importantly, the differential analyses we show are based on regression frameworks where the HDCyto data is the response; thus, we are able to model arbitrary experimental designs, such as those with batch effects, paired designs and so on. In particular, we apply generalized linear mixed models to analyses of cell population abundance or cell-population-specific analyses of signaling markers, allowing overdispersion in cell count or aggregated signals across samples to be appropriately modeled. To support the formal statistical analyses, we encourage exploratory data analysis at every step, including quality control (e.g. multi-dimensional scaling plots), reporting of clustering results (dimensionality reduction, heatmaps with dendrograms), and differential analyses (e.g. plots of aggregated signals).
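The first steps of this workflow, clustering cells by marker expression and tabulating per-sample cluster abundances, can be sketched in Python with a tiny k-means standing in for FlowSOM. The actual pipeline is R/Bioconductor based; everything below (data, cluster count, sample labels) is synthetic and only shows the shape of the computation.

```python
# Cluster synthetic "cells" (rows) by marker expression (columns), then
# build the samples-by-clusters abundance table used downstream for
# differential-abundance modeling.
import numpy as np

rng = np.random.default_rng(1)
cells = np.vstack([rng.normal(0, 1, (150, 5)),     # population A
                   rng.normal(4, 1, (150, 5))])    # population B
sample_id = rng.integers(0, 3, size=300)           # which of 3 samples

def kmeans(X, k, iters=20):
    # farthest-point initialization keeps this toy example stable
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

labels = kmeans(cells, k=2)
abund = np.zeros((3, 2), dtype=int)                # samples x clusters
for s, c in zip(sample_id, labels):
    abund[s, c] += 1
print(abund)
```

In the published workflow, a table like `abund` becomes the response of a (generalized linear mixed) regression model, which is what allows batch effects and paired designs to be handled.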

  11. Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points

    Science.gov (United States)

    Regis, Rommel G.

    2014-02-01

    This article develops two new algorithms for constrained expensive black-box optimization that use radial basis function surrogates for the objective and constraint functions. These algorithms are called COBRA and Extended ConstrLMSRBF and, unlike previous surrogate-based approaches, they can be used for high-dimensional problems where all initial points are infeasible. They both follow a two-phase approach where the first phase finds a feasible point while the second phase improves this feasible point. COBRA and Extended ConstrLMSRBF are compared with alternative methods on 20 test problems and on the MOPTA08 benchmark automotive problem (D.R. Jones, Presented at MOPTA 2008), which has 124 decision variables and 68 black-box inequality constraints. The alternatives include a sequential penalty derivative-free algorithm, a direct search method with kriging surrogates, and two multistart methods. Numerical results show that COBRA algorithms are competitive with Extended ConstrLMSRBF and they generally outperform the alternatives on the MOPTA08 problem and most of the test problems.
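The surrogate idea behind COBRA-type methods can be illustrated in a stripped-down form: fit an RBF model to all evaluated points, then choose the next expensive evaluation by minimizing the surrogate over cheap candidates. The constraint handling and two-phase feasibility logic of the actual algorithms are omitted, and the black-box function below is a stand-in.

```python
# Surrogate-guided optimization sketch: an RBF interpolant replaces the
# expensive objective when choosing where to evaluate next.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_f(x):                     # stand-in black-box objective
    return ((x - 0.3) ** 2).sum(axis=-1)

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(10, 4))    # initial design, 4 variables
y = expensive_f(X)

for _ in range(30):                     # surrogate-guided iterations
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
    cand = rng.uniform(-1, 1, size=(500, 4))
    x_next = cand[surrogate(cand).argmin()]   # cheap surrogate search
    X = np.vstack([X, x_next])                # one expensive evaluation
    y = np.append(y, expensive_f(x_next))

print(y.min())                          # best objective value found
```

The point of the design is budget: the surrogate is queried thousands of times while the expensive function is called only 40 times in total here.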

  12. Transforming high-dimensional potential energy surfaces into sum-of-products form using Monte Carlo methods

    Science.gov (United States)

    Schröder, Markus; Meyer, Hans-Dieter

    2017-08-01

    We propose a Monte Carlo method, "Monte Carlo Potfit," for transforming high-dimensional potential energy surfaces evaluated on discrete grid points into a sum-of-products form, more precisely into a Tucker form. To this end we use a variational ansatz in which we replace numerically exact integrals with Monte Carlo integrals. This largely reduces the numerical cost by avoiding the evaluation of the potential on all grid points and allows a treatment of surfaces up to 15-18 degrees of freedom. We furthermore show that the error made with this ansatz can be controlled and vanishes in certain limits. We present calculations on the potential of HFCO to demonstrate the features of the algorithm. To demonstrate the power of the method, we transformed a 15D potential of the protonated water dimer (Zundel cation) in a sum-of-products form and calculated the ground and lowest 26 vibrationally excited states of the Zundel cation with the multi-configuration time-dependent Hartree method.
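What a "sum-of-products (Tucker) form" of a gridded potential looks like can be shown with a plain truncated HOSVD, used here instead of the Monte Carlo variational fit the record describes: the grid tensor V is compressed into a small core contracted with one basis matrix per coordinate. The toy potential and ranks are illustrative only.

```python
# Tucker (sum-of-products) compression of a 3D gridded "potential"
# via truncated HOSVD: one orthogonal basis per mode, plus a core tensor.
import numpy as np

n, r = 24, 6                                   # grid points, Tucker rank
g = np.linspace(-1, 1, n)
X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
V = np.exp(-(X**2 + Y**2 + Z**2)) + 0.1 * X * Y * Z   # toy 3D potential

U = []                                         # basis per coordinate
for mode in range(3):
    M = np.moveaxis(V, mode, 0).reshape(n, -1)         # mode unfolding
    U.append(np.linalg.svd(M, full_matrices=False)[0][:, :r])

# core tensor: project V onto the retained bases, then reconstruct
core = np.einsum("ijk,ia,jb,kc->abc", V, U[0], U[1], U[2])
V_fit = np.einsum("abc,ia,jb,kc->ijk", core, U[0], U[1], U[2])

err = np.abs(V - V_fit).max()
print(err)     # small: the Tucker form reproduces V accurately
```

The storage drops from n^3 grid values to r^3 + 3nr numbers; the Monte Carlo method in the record achieves the same form without ever evaluating the potential on the full grid.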

  13. Modeling genome-wide dynamic regulatory network in mouse lungs with influenza infection using high-dimensional ordinary differential equations.

    Science.gov (United States)

    Wu, Shuang; Liu, Zhi-Ping; Qiu, Xing; Wu, Hulin

    2014-01-01

    The immune response to viral infection is regulated by an intricate network of many genes and their products. The reverse engineering of gene regulatory networks (GRNs) using mathematical models from time course gene expression data collected after influenza infection is key to our understanding of the mechanisms involved in controlling influenza infection within a host. A five-step pipeline: detection of temporally differentially expressed genes, clustering genes into co-expressed modules, identification of network structure, parameter estimate refinement, and functional enrichment analysis, is developed for reconstructing high-dimensional dynamic GRNs from genome-wide time course gene expression data. Applying the pipeline to the time course gene expression data from influenza-infected mouse lungs, we have identified 20 distinct temporal expression patterns in the differentially expressed genes and constructed a module-based dynamic network using a linear ODE model. Both intra-module and inter-module annotations and regulatory relationships of our inferred network show some interesting findings and are highly consistent with existing knowledge about the immune response in mice after influenza infection. The proposed method is a computationally efficient, data-driven pipeline bridging experimental data, mathematical modeling, and statistical analysis. The application to the influenza infection data elucidates the potentials of our pipeline in providing valuable insights into systematic modeling of complicated biological processes.

  14. Big Data Challenges of High-Dimensional Continuous-Time Mean-Variance Portfolio Selection and a Remedy.

    Science.gov (United States)

    Chiu, Mei Choi; Pun, Chi Seng; Wong, Hoi Ying

    2017-08-01

    Investors interested in the global financial market must analyze financial securities internationally. Making an optimal global investment decision involves processing a huge amount of data for a high-dimensional portfolio. This article investigates the big data challenges of two mean-variance optimal portfolios: continuous-time precommitment and constant-rebalancing strategies. We show that both optimized portfolios implemented with the traditional sample estimates converge to the worst performing portfolio when the portfolio size becomes large. The crux of the problem is the estimation error accumulated from the huge dimension of stock data. We then propose a linear programming optimal (LPO) portfolio framework, which applies a constrained ℓ1 minimization to the theoretical optimal control to mitigate the risk associated with the dimensionality issue. The resulting portfolio becomes a sparse portfolio that selects stocks with a data-driven procedure and hence offers a stable mean-variance portfolio in practice. When the number of observations becomes large, the LPO portfolio converges to the oracle optimal portfolio, which is free of estimation error, even though the number of stocks grows faster than the number of observations. Our numerical and empirical studies demonstrate the superiority of the proposed approach. © 2017 Society for Risk Analysis.
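The flavor of constrained ℓ1 minimization can be sketched as a linear program: find the smallest-ℓ1 weight vector w with Sigma @ w close to the expected-return vector mu. This Dantzig-selector-style formulation, and all inputs below, are illustrative assumptions, not the paper's exact LPO construction.

```python
# Sparse portfolio sketch: minimize ||w||_1 subject to
# ||Sigma @ w - mu||_inf <= delta, written as an LP via w = u - v.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
p = 8
B = rng.normal(size=(p, p))
Sigma = B @ B.T + p * np.eye(p)            # toy covariance matrix
mu = rng.normal(size=p)                    # toy expected returns
delta = 0.05                               # infinity-norm tolerance

# variables z = [u; v] with u, v >= 0; objective sum(u)+sum(v) = ||w||_1
c = np.ones(2 * p)
A_ub = np.block([[Sigma, -Sigma], [-Sigma, Sigma]])
b_ub = np.concatenate([mu + delta, delta - mu])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))

w = res.x[:p] - res.x[p:]
print(np.abs(Sigma @ w - mu).max())        # within the delta tolerance
```

Because the objective rewards small ℓ1 norm, many entries of w end up (near) zero, which is the data-driven stock selection the abstract refers to.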

  15. Low-storage implicit/explicit Runge-Kutta schemes for the simulation of stiff high-dimensional ODE systems

    Science.gov (United States)

    Cavaglieri, Daniele; Bewley, Thomas

    2015-04-01

    Implicit/explicit (IMEX) Runge-Kutta (RK) schemes are effective for time-marching ODE systems with both stiff and nonstiff terms on the RHS; such schemes implement an (often A-stable or better) implicit RK scheme for the stiff part of the ODE, which is often linear, and, simultaneously, a (more convenient) explicit RK scheme for the nonstiff part of the ODE, which is often nonlinear. Low-storage RK schemes are especially effective for time-marching high-dimensional ODE discretizations of PDE systems on modern (cache-based) computational hardware, in which memory management is often the most significant computational bottleneck. In this paper, we develop and characterize eight new low-storage implicit/explicit RK schemes which have higher accuracy and better stability properties than the only low-storage implicit/explicit RK scheme available previously, the venerable second-order Crank-Nicolson/Runge-Kutta-Wray (CN/RKW3) algorithm that has dominated the DNS/LES literature for the last 25 years, while requiring similar storage (two, three, or four registers of length N) and comparable floating-point operations per timestep.
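A first-order illustration of the IMEX splitting described above: the stiff linear term L u is advanced implicitly (backward Euler) while the nonstiff nonlinear term N(u) is advanced explicitly, so no nonlinear solve is needed. The schemes in the paper are higher-order, low-storage RK refinements of this same idea; the scalar problem below is a made-up example.

```python
# IMEX Euler on u' = L*u + sin(u): implicit on the stiff linear part,
# explicit on the nonstiff nonlinear part.
import math

L = -1000.0                          # stiff linear coefficient

def N(u):                            # nonstiff nonlinear term
    return math.sin(u)

def imex_euler_step(u, dt):
    # u_new = u + dt*N(u) + dt*L*u_new  =>  solve the linear part exactly
    return (u + dt * N(u)) / (1.0 - dt * L)

u, dt = 1.0, 0.01                    # dt is 5x the explicit limit 2/|L|
for _ in range(1000):
    u = imex_euler_step(u, dt)
print(u)                             # decays toward the fixed point at 0
```

A fully explicit Euler step at this dt would amplify the stiff mode by a factor |1 + dt*L| = 9 per step and blow up; treating only the linear term implicitly buys stability at the cost of one linear solve per stage.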

  16. Library performance measurement in the digital age

    OpenAIRE

    Conyers, A.; Payne, Philip

    2011-01-01

    Book synopsis: University libraries around the world have embraced the possibilities of the digital learning environment, facilitating its use and proactively seeking to develop the provision of electronic resources and services. The digital environment offers opportunities and challenges for librarians in all aspects of their work - in information literacy, virtual reference, institutional repositories, e-learning, managing digital resources and social media. The authors in this timely book ...

  17. Digital data monitoring display and logging

    International Nuclear Information System (INIS)

    Ficaro, E.P.; Wehe, D.K.

    1987-01-01

    A digital data acquisition system for monitoring plant variables has been designed and implemented at the University of Michigan's Ford Nuclear Reactor (FNR), a 2-megawatt, open-pool research reactor. The digital data provided by this system are useful for closed-loop control, real-time experimental calculations, advanced simulation-as-knowledge techniques, improved operator training, and expert system applications. The purpose of this paper is to discuss the transition to the digital data world and the anticipated applications and benefits.

  18. Digital Forensics

    Science.gov (United States)

    Harron, Jason; Langdon, John; Gonzalez, Jennifer; Cater, Scott

    2017-01-01

    The term forensic science may evoke thoughts of blood-spatter analysis, DNA testing, and identifying molds, spores, and larvae. A growing part of this field, however, is that of digital forensics, involving techniques with clear connections to math and physics. This article describes a five-part project involving smartphones and the investigation…

  19. Digital Disruption

    DEFF Research Database (Denmark)

    Rosenstand, Claus Andreas Foss

    the digital domain beyond the level that characterizes the current debate, new knowledge about digital disruption is presented. As something new, Clayton Christensen's theory of disruptive innovation is laid out with a particular focus on small organizations' potential for exponential growth. In particular, ...... the relationship between disruption and the ever-accelerating digital development is unfolded in the contours of a new theory of digital disruption. The book's subtitle, "threatening and fascinating changes", points to the need for a nuanced debate about digital disruption, in contrast to the tone struck in...... further calls a "disruption council". In fact, such a council was written into the 2016 government platform of the VLK government. Disruption of organizations is not a new phenomenon, but the speed at which it happens is ever accelerating. The cause is the global megatrend of digitalization. And that is why digital in particular...

  20. Digital books.

    Science.gov (United States)

    Wink, Diane M

    2011-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes digital books.